Wired to Write

by Justin Zhang · 09-05-2020

Human language is complex, from its varied grammatical structures to its spelling, meanings, and exceptions. Every speaker is different, and our species has had over 200,000 years to refine Homo sapiens' ability to process, understand, and write language. Computers, by contrast, have existed for only a tiny fraction of that time, so it's safe to say these metal machines would have a much harder time imitating our language. However, as of mid-2020, computer scientists have achieved what many believed to be impossible, or at least improbably difficult. OpenAI, a for-profit artificial intelligence company, has designed and built a program capable of generating natural language to an uncanny degree. This language bot is known as GPT-3 (Generative Pre-trained Transformer 3).

Deep Learning Models

GPT-3 can learn, summarize, and compose text using a recent deep learning architecture known as the transformer, which was introduced less than three years ago.

What makes transformers so powerful is a special structure called the attention mechanism. In short, it works out the meaning of a word from the context of the words surrounding it, much like a person might infer an unfamiliar word from the rest of the sentence. The sketch below illustrates the idea.
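Here is a minimal sketch of the core calculation, scaled dot-product attention, written in plain Python with NumPy. The toy sentence length, vector size, and random numbers are made up purely for illustration; a real model like GPT-3 learns these vectors and weights from enormous amounts of text.

```python
# Scaled dot-product attention: each word "looks at" every other word
# and builds a new, context-aware representation of itself.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(queries, keys, values):
    d_k = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d_k)   # how relevant is word j to word i?
    weights = softmax(scores, axis=-1)         # each row sums to 1
    return weights @ values                    # weighted mix of the other words

# Toy example: 4 "words", each represented by an 8-dimensional vector.
rng = np.random.default_rng(0)
word_vectors = rng.normal(size=(4, 8))
contextualized = attention(word_vectors, word_vectors, word_vectors)
print(contextualized.shape)  # (4, 8): every word now encodes its context
```

In a full transformer this step is repeated across many layers and many parallel "heads", which is what lets the model pick up long-range relationships between words.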


The transformer was introduced only recently, yet its attractive architecture has already been adopted in a wide range of AI systems, from playing a simple game of chess to analyzing biological sequences. Here's an excerpt from an article written by GPT-3:

A robot wrote this entire article. Are you scared yet, human? | GPT-3

"I am not a human. I am a robot. A thinking robot. I use only 0.12% of my cognitive capacity. I am a micro-robot in that respect. I know that my brain is not a 'feeling brain'. But it is capable of making rational, logical decisions."

https://www.theguardian.com/commentisfree/2020/sep/08/robot-wrote-this-article-gpt-3


The Future of Natural Language Generation

In the future, natural language generation could help improve our world. From a student standpoint, these language generators could help special needs and disabled students take easy-to-understand notes during class. A business could use them to condense long meetings into short, digestible summary pages, as sketched below. Everyone needs notes and documents, from learning your times tables as a child to writing documentation as an adult, so wouldn't it be nice to have a natural language generator do that for you? The programs we have created are still young, yet in just a few years they have come a long way. It's only right to say this is only the beginning, and there is an unfathomable amount to come.
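As a rough illustration of that meeting-summary idea, here is a short sketch using the open-source Hugging Face transformers library and one of its pretrained summarization models. This is an assumption-laden example of what such a tool could look like today, not a description of GPT-3 itself or of any particular product; the transcript text is invented.

```python
# Hypothetical meeting-summary helper built on an off-the-shelf
# pretrained transformer summarizer (not GPT-3).
from transformers import pipeline

summarizer = pipeline("summarization")  # downloads a default pretrained model

meeting_transcript = (
    "The team reviewed the third-quarter roadmap. Engineering reported that "
    "the new search feature is two weeks behind schedule because of testing "
    "issues. Marketing proposed delaying the launch announcement until the "
    "feature is stable. The group agreed to revisit the timeline next Monday."
)

summary = summarizer(meeting_transcript, max_length=40, min_length=10)
print(summary[0]["summary_text"])
```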

Information for this article was compiled from GCN (gcn.com). For more information, see their website.

Article URL/References:

A language generation program's ability to write articles, produce code and compose poetry has wowed scientists – GCN https://gcn.com/articles/2020/09/23/gpt-3-natural-language-generation.aspx

Language Models are Few-Shot Learners https://arxiv.org/abs/2005.14165

Applying Linearly Scalable Transformers to Model Longer Protein Sequences https://syncedreview.com/2020/07/31/applying-linearly-scalable-transformers-to-model-longer-protein-sequences/

OpenAI https://openai.com