
GPT-3: From text to coding

In September 2020, the Guardian asked a machine to write an op-ed of around 500 words. The input fed to it, an introduction plus instructions, totalled only 73 words. The output was eight distinct op-ed pieces, each advancing a different argument.

These op-ed pieces were written by the third-generation Generative Pre-trained Transformer, or GPT-3, a language model that uses machine learning (ML), a subfield of artificial intelligence, to produce human-like text.

“Editing GPT-3’s op-ed was no different to editing a human op-ed. We cut lines and paragraphs, and rearranged the order of them in some places. Overall, it took less time to edit than many human op-eds,” wrote the Guardian’s editor in his note accompanying the published op-ed.

Developed by OpenAI, a San Francisco-based AI research and deployment company, GPT-3 is a deep-learning language model trained on internet data to generate almost any type of human-like text on demand, including stories, blogs, articles and even poetry.

A powerful breed of technology

GPT-3 has over 175 billion ML parameters and was trained for months on a large corpus of text data, making it, at its release, the largest neural network ever produced, with powerful text and computer-code generation capabilities. Yes, GPT-3 can write code as well, from instructions given in plain English. But since GPT-3 is not open source, it is available only through an API.
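To make the API point concrete, here is a minimal sketch of how a GPT-3 completion request could be assembled in Python. The endpoint path, field names and model name reflect OpenAI's publicly documented completions API, but treat them, and the helper function itself, as an illustrative assumption rather than official client code.

```python
# Hypothetical helper that assembles a GPT-3 completion request.
# Endpoint and field names are assumptions based on OpenAI's public API docs.

def build_completion_request(prompt, api_key, model="text-davinci-002", max_tokens=100):
    """Return the URL, headers, and JSON payload for a text-completion call."""
    url = "https://api.openai.com/v1/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",   # your secret API key
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,          # which GPT-3 model variant to use
        "prompt": prompt,        # the text the model should continue
        "max_tokens": max_tokens,  # cap on the length of the completion
    }
    return url, headers, payload

url, headers, payload = build_completion_request(
    "Write a short poem about coding.", api_key="YOUR_API_KEY"
)
```

The actual HTTP call (with a library such as `requests`) would then POST `payload` to `url` with those headers; the key point is that all access goes through this hosted endpoint rather than a downloadable model.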

OpenAI Codex is a descendant of GPT-3. It was released by OpenAI through its API in private beta in August 2021. According to OpenAI, “Proficient in more than a dozen programming languages, Codex can now interpret simple commands in natural language and execute them on the user’s behalf—making it possible to build a natural language interface to existing applications.”

“OpenAI Codex is most capable in Python, but it is also proficient in over a dozen languages including JavaScript, Go, Perl, PHP, Ruby, Swift and TypeScript, and even Shell.” Like GPT-3, Codex has been trained on billions of lines of public code, including code in public GitHub repositories, which is what lets it translate plain English into working code.
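To illustrate the kind of translation Codex performs, here is a natural-language instruction written as a Python comment, followed by the sort of function a model like Codex might produce. The function below is hand-written for illustration, not actual Codex output.

```python
# Instruction given in plain English, as a code comment:
# "Return the even numbers in a list, squared."

def even_squares(numbers):
    """Square every even number in `numbers` and return the results in order."""
    return [n * n for n in numbers if n % 2 == 0]

print(even_squares([1, 2, 3, 4, 5]))  # -> [4, 16]
```

The model reads the English description the same way it reads any other text and continues it with code that matches the intent — the interface is just natural language.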

Coding made simple

With GPT-3 you can generate code quickly, even if you don’t know how to code. To generate code, GPT-3 requires a sample trigger known as a “prompt”. All you need to do is show the system a few examples, or prompts, and supervise; it does the rest, and fast. It is like asking a coder friend to write exactly the code you want.
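A prompt is just text, so "showing a few examples" amounts to concatenating worked instruction/code pairs followed by the new request, leaving the model to continue from there. The helper below is a hypothetical sketch of that few-shot prompt format, not an official OpenAI utility.

```python
# Hypothetical few-shot prompt builder: worked examples first, new request last.

def build_few_shot_prompt(examples, request):
    """Join (instruction, code) pairs into one prompt string ending with the new request."""
    parts = []
    for instruction, code in examples:
        # Each example: the English instruction as a comment, then its code.
        parts.append(f"# {instruction}\n{code}")
    # The final instruction has no code; the model is expected to supply it.
    parts.append(f"# {request}\n")
    return "\n\n".join(parts)

examples = [
    ("Print hello world", 'print("Hello, world!")'),
    ("Add two numbers", "def add(a, b):\n    return a + b"),
]
prompt = build_few_shot_prompt(examples, "Reverse a string")
print(prompt)
```

Sending this prompt to the model invites it to complete the pattern, i.e., to write the code for the last, unanswered instruction.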

This isn’t the end; it is just the beginning. Large language models like GPT-3 are improving constantly, with new models being trained on amounts of data far larger than GPT-3 was trained on. Many non-programmers dream of coding, and with technologies like GPT-3 those dreams may well come true. Welcome to the new world of dreaming and coding!
