GPT-3

What is GPT-3?

Introduced in 2017, Transformers with Self-Attention mechanisms have been revolutionizing Natural Language Processing (NLP) and the handling of text data ever since.

Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model that uses Deep Learning to produce human-like text. It was introduced in May 2020 by OpenAI.

How Does GPT-3 Work?

GPT-3 is a Deep Neural Network language model that estimates the probability that a given sentence would occur in real-world text.
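To make this concrete, the sketch below shows what "scoring a sentence" means for an autoregressive language model. GPT-3's own weights are not publicly available, so the example uses the much smaller GPT-2 through the open-source Hugging Face transformers library as a stand-in; the helper function and test sentences are illustrative only.

```python
# A minimal sketch of sentence scoring with an autoregressive language model.
# GPT-3's weights are not public, so the much smaller GPT-2 stands in here.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def sentence_log_probability(sentence: str) -> float:
    """Sum of log P(word_i | word_1 .. word_i-1) over the sentence."""
    input_ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        # With labels == input_ids the model returns the average
        # next-token cross-entropy, i.e. the negative mean log-probability.
        loss = model(input_ids, labels=input_ids).loss
    return -loss.item() * (input_ids.size(1) - 1)

print(sentence_log_probability("The cat sat on the mat."))
print(sentence_log_probability("Mat the on sat cat the."))  # should score lower
```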

In "OpenAI's new GPT-3 language explained", Dale Markowitz describes how GPT-3 is trained on unlabelled text from Common Crawl and Wikipedia: words are randomly removed, and the model learns to fill the gaps using only the neighbouring words as context. The architecture of the model is based on the Transformer Deep Neural Network.

The model is vast: with 175 billion parameters, it is the biggest language model created to date. This vast size is largely responsible for GPT-3 appearing intelligent and, at times, sounding like a real person.

An advantage GPT-3 has over other Transformer language models is that it can carry out a specific task, such as translating between languages or authoring a poem or an article, without task-specific tuning and with fewer than 10 training examples. This is due to its vast size. One reason Natural Language Machine Learning professionals are so excited about GPT-3 is that other language models may require a huge number of examples in order to, for example, translate from German to English.

GPT-3 is thus able to perform custom language tasks without the vast amounts of training data that other models require.
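To illustrate, a few-shot German-to-English prompt might look like the sketch below. It assumes OpenAI's API as offered at GPT-3's launch (the legacy Completion endpoint); the engine name, prompt layout and parameters are assumptions for demonstration, not an official recipe. The model picks up the task pattern from just two examples.

```python
# A hedged sketch of few-shot prompting against the OpenAI API
# (legacy Completion endpoint; engine name and prompt layout are assumptions).
import openai

openai.api_key = "YOUR_API_KEY"

prompt = (
    "Translate German to English.\n"
    "German: Guten Morgen.\nEnglish: Good morning.\n"
    "German: Wie geht es dir?\nEnglish: How are you?\n"
    "German: Das Wetter ist heute schön.\nEnglish:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=20,
    temperature=0.0,   # deterministic output suits translation
    stop="\n",         # stop at the end of the translated line
)
print(response.choices[0].text.strip())
```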

Transformers with Self-Attention mechanisms were introduced in 2017 by a team at Google led by Vaswani, in a paper entitled "Attention is All You Need". The paper caused a lot of surprise within the NLP research community, especially because a model using no convolution or recurrence was able to outperform Google's existing Sequence-to-Sequence neural machine translation models.
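At the heart of that paper is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V, in which every token weighs every other token to build its representation. The minimal NumPy sketch below shows the mechanism in isolation; the toy inputs and sizes are illustrative only.

```python
# A minimal NumPy sketch of the scaled dot-product attention from
# "Attention is All You Need": Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                        # weighted average of the values

# Toy example: 3 tokens, embedding size 4 (random values for illustration).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(x, x, x))
```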

GPT-3 for content creation

GPT-3 will be interesting not only for language translation but especially for content creation: GPT-3 models are able to create content from short briefs or single sentences. Given the promising results so far, the question is whether content creation and writing will be the next area impacted by AI, and how this will change the dynamics of the language and content industry.
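By way of illustration, turning a short brief into copy could look like the sketch below, again assuming the legacy Completion endpoint described above; the brief, engine name and parameters are purely illustrative.

```python
# Sketch of content creation from a short brief (same assumed legacy
# Completion endpoint as above; all values are illustrative).
import openai

openai.api_key = "YOUR_API_KEY"

brief = (
    "Write a short, friendly blog introduction based on this brief:\n"
    "Topic: why small businesses should translate their websites.\n"
    "Tone: helpful, practical.\n\nIntroduction:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=brief,
    max_tokens=150,
    temperature=0.7,  # some creativity is desirable for content writing
)
print(response.choices[0].text.strip())
```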

GPT-3 at Lyngual

At Lyngual, it is our mission to bring the latest and best technology to our community, so we are closely following the progress of GPT-3 for both language translation and content creation. As soon as it is commercially viable, we will bring GPT-3-based systems to our platform.

Andreas Jacobi

CEO

