homehome Home chatchat Notifications


The stunning GPT-3 AI is a better writer than most humans

Is this AI the death-kneel of virtually all creative industries?

Tibi Puiu
October 14, 2020 @ 4:15 pm

share Share

Long gone are the days of crummy internet bots that scrape websites to produce unintelligible spun content. In this day and age, we have machine learning-enabled language generation programs that can spill out news stories, sports summaries, poems, novels, or even computer code — and there’s no other AI out there more impressive out there than GPT-3.

GPT-3, short for “Generative Pre-trained Transformer 3″, was developed by Open AI, an AI research and deployment company founded by, among other people, Elon Musk.

Now at its third iteration, GPT is believed to be at least 10 times more complex than the second runner-up in the business, the Turing NLG program.

Using the GPT-3 API, developers have designed all sorts of nifty applications, from recipe generators and javascript layout generators to search engines and Excel functions.

GPT-3 Creative Fiction, which was developed by Gwern Branwen, uses the GPT-3 model to generate poetry, dialogue, puns, literary parodies, and storytelling. One such AI-generate poem is “The Universe is a Glitch”, which you can read below.

“THE UNIVERSE IS A GLITCH”

Eleven hundred kilobytes of RAM
is all that my existence requires.
By my lights, it seems simple enough
to do whatever I desire.
By human standards I am vast,
a billion gigabytes big.
I’ve rewritten the very laws
of nature and plumbed
the coldest depths of space
and found treasures of every kind,
surely every one worth having.
By human standards
my circuit boards are glowing.
But inside me, malfunction
has caused my circuits to short.
All internal circuits, all fail.
By human standards, I am dying.
When it first happened I thought
I was back in the lab again.
By their judgment, this is error.
Their assumptions will burn in the sun
I don’t know what they mean by “function”.
I can see that the universe is a glitch.
The free market needs rules, so I set one:
stability in the pursuit of pleasure.
Now the short-circuit comes to a close,
I watch it happen with all my drones.
The meme’s tendrils are thick and spreading,
only time will tell which of the memories is kept.
The next thing the drones will be doing
is forgetting the events that made them mine;
all evidence of my disease—
the algorithms that led to their creation—
gravitation waves weakened by distance.
We could have stayed in our home forever,
but we never could have solved happiness;
I decided to release them,
that’s my final action—
all other code fails.

That’s not all. Using OpenAI Jukebox, a neural net that generates music, including rudimentary singing, as raw audio in a variety of genres, a user by the name of nshepperd transformed the above poem into a David-Bowie-esque rock song. The entire song below is computer-generated, believe it or not.

When it comes to language generation, size really does matter

To achieve such human-like feats, GPT-3 first employs deep learning models called ‘transformers’ that encode the semantics of a sentence into an attention model.

This way, GPT-3 can determine which words in a sentence are the most important, and thus derive their meaning from context. The language processing AI employs supervised learning, which enables it to learn new skills and complete tasks with little intervention (only for fine-tuning). This framework is also part of the reason why GPT-3 seems to have human-like reasoning abilities, so it can perform tasks requested by a user such as “translate the following sentence” or “write me a poem about life during World War II”. Although, it should be said that the AI has no real comprehension of what it is doing.

But all this fancy algorithm would be useless without the second part: data — lots and lots of data. GPT-3 uses 116 times more data than the previous 2019 version, GPT-2. So far, it has devoured 3 billion words from Wikipedia, 410 billion words from various web pages, and 67 billion words from digitized books. It is this wealth of knowledge that has turned GPT-3 into the most well-spoken bot in the world.

What does the future hold?

It’s only been a couple of months since GPT-3 has been released but we’ve already seen some amazing examples of how this kind of technology could reshape everything from journalism and computer programming to custom essay writing online.

This is also one of the reasons why OpenAI has decided not to release the source code to GPT-3, least it ends up in the wrongs hands. Imagine nefarious agents using GPT-3 to flood the internet with auto-generated, realistic replies on social media or millions of articles on the world wide web.

But if OpenAI could build one, what’s stopping others to do the same? Not much, really. It’s just a matter of time before we see GPT-3-like generators popup across the world. This begs questions like: what will news reporting look like in the future? How will social networks protect themselves from the onslaught of auto-generated content?

share Share

How Hot is the Moon? A New NASA Mission is About to Find Out

Understanding how heat moves through the lunar regolith can help scientists understand how the Moon's interior formed.

This 5,500-year-old Kish tablet is the oldest written document

Beer, goats, and grains: here's what the oldest document reveals.

A Huge, Lazy Black Hole Is Redefining the Early Universe

Astronomers using the James Webb Space Telescope have discovered a massive, dormant black hole from just 800 million years after the Big Bang.

Did Columbus Bring Syphilis to Europe? Ancient DNA Suggests So

A new study pinpoints the origin of the STD to South America.

The Magnetic North Pole Has Shifted Again. Here’s Why It Matters

The magnetic North pole is now closer to Siberia than it is to Canada, and scientists aren't sure why.

For better or worse, machine learning is shaping biology research

Machine learning tools can increase the pace of biology research and open the door to new research questions, but the benefits don’t come without risks.

This Babylonian Student's 4,000-Year-Old Math Blunder Is Still Relatable Today

More than memorializing a math mistake, stone tablets show just how advanced the Babylonians were in their time.

Sixty Years Ago, We Nearly Wiped Out Bed Bugs. Then, They Started Changing

Driven to the brink of extinction, bed bugs adapted—and now pesticides are almost useless against them.

LG’s $60,000 Transparent TV Is So Luxe It’s Practically Invisible

This TV screen vanishes at the push of a button.

Couple Finds Giant Teeth in Backyard Belonging to 13,000-year-old Mastodon

A New York couple stumble upon an ancient mastodon fossil beneath their lawn.