
OpenAI’s GPT-3: The Artificial Intelligence Creating All the Buzz

Table of contents

  • What is GPT-3?
  • How does GPT-3 work?
  • How is it different from its predecessor GPT-2?
  • Why is GPT-3 such a big deal?
  • What do the early adopters have to say about it?
  • A new breakthrough for artificial intelligence

“This is mind blowing. With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you. W H A T.” This tweet by Sharif Shameem about an experiment he did with GPT-3 left thousands in the technology community astonished, and for all the obvious reasons. How is it possible for an AI to write complex computer code from a request in simple English, despite never having been trained to write code in the first place, or even to comprehend English?

What is GPT-3?

The third generation of OpenAI’s Generative Pretrained Transformer, GPT-3, is a general-purpose language algorithm that uses machine learning to translate text, answer questions and predictively write text. It analyzes a sequence of words, text and other data, then elaborates on those examples to produce entirely original output in the form of an article or an image.

How does GPT-3 work?

“By ingesting terabytes and terabytes of data to understand the underlying patterns in how humans communicate,” as Sharif Shameem puts it. GPT-3 processes an enormous bank of English text using extremely powerful computer models called neural nets to identify patterns and determine its own rules of how language functions. GPT-3 possesses 175 billion learning parameters that enable it to perform almost any task it is assigned, making it far larger than the previous largest language model, Microsoft Corp.’s Turing-NLG algorithm, which has 17 billion learning parameters.
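GPT-3’s actual architecture is vastly more sophisticated, but the core idea of learning statistical patterns from text in order to predict what comes next can be illustrated with a toy bigram model. This sketch is purely illustrative and is not how GPT-3 is implemented:

```python
from collections import Counter, defaultdict

# A toy corpus standing in for GPT-3's terabytes of training text.
corpus = (
    "the model reads text . the model learns patterns . "
    "the model predicts the next word ."
).split()

# Count, for each word, which words tend to follow it.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "model" follows "the" most often in this corpus
```

Scale this idea up from word-pair counts to billions of learned parameters over terabytes of text, and you get the flavor of what a large language model does.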

How is it different from its predecessor GPT-2?

In February 2019, OpenAI published their findings and results on their unsupervised language model, GPT-2, which was trained on 40 GB of text and was capable of predicting the next word in a sequence. GPT-2, a transformer-based language model that applies self-attention, allowed researchers to generate very convincing and coherent texts. The system, a general-purpose language algorithm, used machine learning to translate text, answer questions and predictively write text. However, it created a controversy because of its ability to create extremely realistic and coherent “fake news” articles based on something as simple as an opening sentence, which led OpenAI to initially withhold the full model from the public.
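The self-attention mechanism that GPT-2 (and GPT-3) builds on lets every word weigh every other word in the sequence when computing its representation. A minimal sketch of scaled dot-product attention in pure Python, assuming each vector serves as its own query, key and value (real transformers derive those through learned projections and use many attention heads):

```python
import math

def softmax(scores):
    """Normalize raw scores into weights that sum to 1."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(vectors):
    """Scaled dot-product self-attention over a list of word vectors."""
    d = len(vectors[0])
    output = []
    for query in vectors:
        # Similarity of this word to every word (including itself).
        scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
                  for key in vectors]
        weights = softmax(scores)
        # Blend all vectors according to the attention weights.
        output.append([sum(w * v[i] for w, v in zip(weights, vectors))
                       for i in range(d)])
    return output

# Three toy 2-dimensional "word vectors".
out = self_attention([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(out)
```

Each output vector is a weighted blend of all input vectors, which is how attention lets context from anywhere in the sequence influence each word’s representation.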

Why is GPT-3 such a big deal?

As people still wonder why GPT-3 is so hyped, the answer is simple: it is the largest language model trained yet. Its 175 billion learning parameters are 10 times more than any previous non-sparse language model. For all tasks, GPT-3 is applied without any gradient updates or fine-tuning; it only requires few-shot demonstrations provided via textual interaction with the model. This giant breakthrough for deep learning and natural language processing has enabled GPT-3 to accomplish all of the following and much more:

  • Answer trivia puzzles correctly
  • Predict the last word of sentences by recognizing the context of the paragraph
  • Select the best ending out of many for a story
  • Translate common languages, which was initially difficult for GPT-2
  • Apply reasoning involving common sense
  • Perform five-digit arithmetic accurately
  • Write news articles from a title with human-like essence
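The few-shot demonstrations mentioned above are nothing more than worked examples written into the prompt text itself; the model infers the task from the pattern without any retraining. A sketch of how such a prompt might be assembled (the translation task and example pairs here are illustrative, not taken from the paper):

```python
def build_few_shot_prompt(examples, query):
    """Assemble a few-shot prompt: worked examples, then the new query.

    The model is never fine-tuned; it picks up the task purely from
    the pattern established by the demonstrations.
    """
    lines = [f"English: {en}\nFrench: {fr}" for en, fr in examples]
    lines.append(f"English: {query}\nFrench:")
    return "\n\n".join(lines)

demos = [
    ("cheese", "fromage"),
    ("good morning", "bonjour"),
]
prompt = build_few_shot_prompt(demos, "thank you")
print(prompt)
```

The model’s job is simply to continue this text after the final “French:”, which is what makes the same interface work for translation, arithmetic, trivia and the other tasks listed above.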

The research paper on GPT-3, titled “Language Models are Few-Shot Learners,” highlights the results of testing GPT-3 on the tasks mentioned above against fine-tuned state-of-the-art models. In most of the tests, GPT-3 performed better than those models even in zero-shot configurations.

Reason enough to create a buzz, isn’t it?

What do the early adopters have to say about it?

Soon after publishing the GPT-3 research, OpenAI gave select members of the public access to the model via an API. Since then, numerous samples of text generated by GPT-3 have circulated widely on social media, leading to the hype we are seeing currently.

Founders Fund Principal Delian Asparouhov shared an excellent example of GPT-3 in which he fed the algorithm half of an investment memo he had posted on his company website. He then gave GPT-3 half of an essay on how to run effective board meetings. In both cases, GPT-3 generated coherent new paragraphs of text that followed the earlier formatting so closely as to be almost indistinguishable from the original text.

In another example, GPT-3 showcased its capability to write convincingly on almost any topic, including itself. Manuel Araoz, CTO of Zeppelin Solutions GmbH, used GPT-3 to create a complicated article about a faux experiment on the Bitcointalk forum, using a basic prompt as a guideline. The article, “OpenAI’s GPT-3 may be the biggest thing since bitcoin,” details how GPT-3 deceived forum members into believing that its comments were genuine and human-written. Beyond that, Araoz also tested GPT-3 in many other ways, using it to make complex texts easier to understand, write poems in Spanish in the style of Borges, compose music in ABC notation and much more.

A new breakthrough for artificial intelligence

OpenAI’s mission is to ensure that artificial general intelligence (AGI), meaning AI that outperforms humans at most economically valuable work, benefits all of humanity. GPT-3 is a major leap toward that goal, exhibiting remarkably human-like language abilities through machine learning and natural language processing. This is backed by the experiments of early testers, who have been left astounded by the results. We can only wonder what the next generation of these models will be capable of achieving.

Currently, the GPT-3 general-purpose natural language processing model is available in private beta, and OpenAI is providing access to its API by invitation only. There’s still a long waiting list for the paid version, which is expected to be released in the next two months.

The post OpenAI’s GPT-3: The Artificial Intelligence Creating All the Buzz appeared first on [x]cube LABS.


