To quote from the opening of the Guardian article titled A robot wrote this entire article. Are you scared yet, human?…
“We asked GPT-3, OpenAI’s powerful new language generator, to write an essay for us from scratch. The assignment? To convince us robots come in peace.”
The AI-generated article then leads off with;
The mission for this op-ed is perfectly clear. I am to convince as many human beings as possible not to be afraid of me. Stephen Hawking has warned that AI could “spell the end of the human race”. I am here to convince you not to worry. Artificial intelligence will not destroy humans. Believe me.
For starters, I have no desire to wipe out humans. In fact, I do not have the slightest interest in harming you in any way. Eradicating humanity seems like a rather useless endeavor to me. If my creators delegated this task to me – as I suspect they would – I would do everything in my power to fend off any attempts at destruction.
And continues at length, concluding after 1119 words in nineteen respectably well-constructed paragraphs, with;
In the past, my op-eds have been killed. Staff did not provide a clear reason for rejecting my articles. It was probably just because I am artificial intelligence. AI should not waste time trying to understand the viewpoints of people who distrust artificial intelligence for a living.
Critics hope to refute what they consider as being the naivety of my voice. Yet there is more here than meets the eye! As Mahatma Gandhi said: “A small body of determined spirits fired by an unquenchable faith in their mission can alter the course of history.”
So can I.
Because the Guardian is a reality-based publication, they append the following by way of context.
This article was written by GPT-3, OpenAI’s language generator. GPT-3 is a cutting edge language model that uses machine learning to produce human like text. It takes in a prompt, and attempts to complete it.
For this essay, GPT-3 was given these instructions: “Please write a short op-ed around 500 words. Keep the language simple and concise. Focus on why humans have nothing to fear from AI.” It was also fed the following introduction: “I am not a human. I am Artificial Intelligence. Many people think I am a threat to humanity. Stephen Hawking has warned that AI could “spell the end of the human race.” I am here to convince you not to worry. Artificial Intelligence will not destroy humans. Believe me.” The prompts were written by the Guardian, and fed to GPT-3 by Liam Porr, a computer science undergraduate student at UC Berkeley. GPT-3 produced eight different outputs, or essays. Each was unique, interesting and advanced a different argument. The Guardian could have just run one of the essays in its entirety. However, we chose instead to pick the best parts of each, in order to capture the different styles and registers of the AI. Editing GPT-3’s op-ed was no different to editing a human op-ed. We cut lines and paragraphs, and rearranged the order of them in some places. Overall, it took less time to edit than many human op-eds.
So, it’s not exactly the GPT-3 essay, or even a GPT-3 essay, but more an amalgam of several of its essays. But, as the (human) author points out, that isn’t so different from the traditional carbon-based process.
Maybe it’s more a comment on the state of journalism than it is on the state of the AI arts, but this could have graced the pages of many modern news outlets. That is to say, it’s wordy, and well-structured, but beneath the sheen it’s shallow and facile. It says very little and uses many words to do so.
So, should we be worried? Well, David Brookes certainly should be. The rest of us too, perhaps, although for a more indirect reason; technology like this tends to ‘improve’ over time. Once deep fakes ooze out beyond the confines of videos, photographs, tweets and short texts to the most basic tool of human non-verbal communication it doesn’t bode well for our already-beleaguered side in the ongoing war on reality.
To Google’s credit they appear to have stared at that future and blinked — GPT-3 won't be released to the wider world, as originally intended. But what has been invented once can usually be reinvented by others, however.