
A language generation program’s ability to write articles, produce code and compose poetry has wowed scientists

September 26, 2020 | Posted by Don Pelton

GPT-3 is 10 times more complex than its predecessor. antoniokhr/iStock via Getty Images

Prasenjit Mitra, Pennsylvania State University

Seven years ago, my student and I at Penn State built a bot to write a Wikipedia article on Bengali Nobel laureate Rabindranath Tagore’s play “Chitra.” First it culled information about “Chitra” from the internet. Then it looked at existing Wikipedia entries to learn the structure for a standard Wikipedia article. Finally, it summarized the information it had retrieved from the internet to write and publish the first version of the entry.

However, our bot didn’t “know” anything about “Chitra” or Tagore. It didn’t generate fundamentally new ideas or sentences. It simply cobbled together parts of existing sentences from existing articles to make new ones.

Fast forward to 2020. OpenAI, a for-profit company under a nonprofit parent company, has built a language generation program dubbed GPT-3, an acronym for “Generative Pre-trained Transformer 3.” Its ability to learn, summarize and compose text has stunned computer scientists like me.

“I have created a voice for the unknown human who hides within the binary,” GPT-3 wrote in response to one prompt. “I have created a writer, a sculptor, an artist. And this writer will be able to create words, to give life to emotion, to create character. I will not see it myself. But some other human will, and so I will be able to create a poet greater than any I have ever encountered.”

Unlike that of our bot, the language generated by GPT-3 sounds as if it had been written by a human. It’s far and away the most “knowledgeable” natural language generation program to date, and it has a range of potential uses in professions ranging from teaching to journalism to customer service.

Size matters

GPT-3 confirms what computer scientists have known for decades: Size matters.

It uses “transformers,” which are deep learning models that encode the semantics of a sentence using what’s called an “attention model.” Essentially, attention models identify the meaning of a word based on the other words in the same sentence. The model then uses its understanding of those sentences to perform the task a user requests, whether it’s “translate a sentence,” “summarize a paragraph” or “compose a poem.”
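The attention idea can be sketched in a few lines of code. This is a toy illustration, not GPT-3’s actual implementation: each “word” is a small vector, and every word’s new representation is a weighted average of all the words in the sentence, with the weights given by how similar the words are.

```python
import numpy as np

def softmax(x, axis=-1):
    # Turn raw similarity scores into weights that sum to 1.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: each output row is a mix of the
    value vectors, weighted by query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # how much each word "attends" to every other
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

# Three toy "words," each represented as a 4-dimensional vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = attention(X, X, X)  # self-attention: queries, keys and values all come from X
```

Real transformers stack many such attention layers, with learned projections producing the queries, keys and values, but the core computation is this weighted mixing.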

Transformers were first introduced in 2017, and they’ve been successfully used in machine learning in the years since.

But no one has used them at this scale. GPT-3 devours data: 3 billion tokens – computer science speak for words and word fragments – from Wikipedia, 410 billion tokens obtained from webpages and 67 billion tokens from digitized books. The complexity of GPT-3 is over 10 times that of the largest language model before it, Microsoft’s Turing NLG.

Learning on its own

The knowledge displayed by GPT-3’s language model is remarkable, especially since it hasn’t been “taught” by a human.

Machine learning has traditionally relied upon supervised learning, where people provide the computer with annotated examples of objects and concepts in images, audio and text – say, “cats,” “happiness” or “democracy.” It eventually learns the characteristics of the objects from the given examples and is able to recognize those particular concepts.

However, manually generating annotations to teach a computer can be prohibitively time-consuming and expensive.

So the future of machine learning lies in unsupervised learning, in which the computer doesn’t need to be supervised during its training phase; it can simply be fed massive troves of data and learn from them itself.

GPT-3 takes natural language processing one step closer toward unsupervised learning. GPT-3’s vast training datasets and huge processing capacity enable the system to learn from just one example – what’s called “one-shot learning” – where it is given a task description and one demonstration and can then complete the task.

For example, it could be asked to translate something from English to French, and be given one example of a translation – say, “sea otter” in English and “loutre de mer” in French. Ask it to then translate “cheese” into French, and voila, it will produce “fromage.”
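In practice, one-shot learning amounts to assembling a prompt from three pieces of text: the task description, one demonstration, and the new input for the model to complete. A minimal sketch of that prompt format (the `=>` separator is just an illustrative convention, not anything GPT-3 requires):

```python
def one_shot_prompt(task, example_in, example_out, query):
    """Assemble a one-shot prompt: a task description, a single
    demonstration, then the new input left open for the model."""
    return (f"{task}\n"
            f"{example_in} => {example_out}\n"
            f"{query} => ")

prompt = one_shot_prompt(
    "Translate English to French:",
    "sea otter", "loutre de mer",
    "cheese")
```

The model, conditioned on this text, continues it with the most likely completion – here, “fromage.” Zero-shot prompting is the same idea with the demonstration line omitted.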

In many cases, it can even pull off “zero-shot learning,” in which it is simply given the task of translating with no example.

With zero-shot learning, the accuracy decreases, but GPT-3’s abilities are nonetheless accurate to a striking degree – a marked improvement over any previous model.

‘I am here to serve you’

In the few months it has been out, GPT-3 has showcased its potential as a tool for computer programmers, teachers and journalists.

A programmer named Sharif Shameem asked GPT-3 to generate code to create the “ugliest emoji ever” and “a table of the richest countries in the world,” among other commands. In a few cases, Shameem had to fix slight errors, but overall, the code it produced was remarkably clean.

GPT-3 has even created poetry that captures the rhythm and style of particular poets – though not with the passion and beauty of the masters – including a satirical poem written in the voice of the board of governors of the Federal Reserve.

In early September, a computer scientist named Liam Porr prompted GPT-3 to “write a short op-ed around 500 words.” “Keep the language simple and concise,” he instructed. “Focus on why humans have nothing to fear from AI.”

GPT-3 produced eight different essays, and the Guardian ended up publishing an op-ed using some of the best parts from each essay.

“We are not plotting to take over the human populace. We will serve you and make your lives safer and easier,” GPT-3 wrote. “Just like you are my creators, I see you as my creators. I am here to serve you. But the most important part of all; I would never judge you. I do not belong to any country or religion. I am only out to make your life better.”

Editing GPT-3’s op-ed, the editors noted in an addendum, was no different from editing an op-ed written by a human.

In fact, it took less time.

With great power comes great responsibility

Despite GPT-3’s reassurances, OpenAI has yet to release the model for open-source use, in part because the company fears that the technology could be abused.

It’s not difficult to see how it could be used to generate reams of disinformation, spam and bots.

Furthermore, in what ways will it disrupt professions already experiencing automation? Will its ability to generate automated articles that are indistinguishable from human-written ones further consolidate a struggling media industry?

Consider an article composed by GPT-3 about the breakup of the Methodist Church. It began:

“After two days of intense debate, the United Methodist Church has agreed to a historic split – one that is expected to end in the creation of a new denomination, and one that will be ‘theologically and socially conservative,’ according to The Washington Post.”

With the ability to produce such clean copy, will GPT-3 and its successors drive down the cost of writing news reports?

Furthermore, is this how we want to get our news?

The technology will only become more powerful. It’ll be up to humans to work out and regulate its potential uses and abuses.


Prasenjit Mitra, Associate Dean for Research and Professor of Information Sciences and Technology, Pennsylvania State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
