I have largely ignored all of the debate over ChatGPT (“Generative Pre-trained Transformer”), the artificial intelligence program used to generate prose. I have largely ignored it because I am not ready to come to grips with the fact that a phone app might be able to write a better essay than I can. But if Washington Post columnist Karen Attiah is correct, it may be time to come to grips with all of this, or at least offer some thoughtful analysis and/or critique.
Here is a taste of her piece, “For writers, AI is like a performance-enhancing steroid”:
A few weeks ago, I had dinner with some friends. One of them pulled out his phone and asked me if I thought I had written more than 200 pieces or so at this stage of my career. “Probably,” I said. “Why?”
“You know, the way things are going, artificial intelligence is going to be able to write articles and books just by analyzing your writing style.” I was initially incredulous, maybe a bit dismissive. Then my friend went to a program on his phone and put in a query, asking it to write a 1,000-word Christmas story in the style of Charles Dickens. To my amazement, the program started churning out paragraphs within seconds — and not just a jumble of random words. The grafs had sentence variation, color, plot development.
He did another query: Think of a title for a children’s holiday book aimed at a Black audience. Within seconds, a list populated — and the winner was something to the effect of “Zahra’s Big Holiday Surprise.” I was stunned.
My friend smiled. “The future is already here,” he said. “You might as well get ahead of it as a writer.”
I thought of that convo as the news came down this week that the popular electronics site CNET has been using AI to write full articles. Frank Landymore at The Byte documents how eagle-eyed marketer Gael Breton figured out that CNET had quietly published more than 70 articles using AI since November, under the author name “CNET Money Staff.” Clicking on the author’s note reveals the truth:
“This article was assisted by an AI engine and reviewed, fact-checked and edited by our editorial staff.”
I told my friends that AI felt like an intellectual steroid. Writers can spend years reading the works of other writers and, over time, integrate those writers’ styles into their own work. But a bot can do it in under a minute. What does this mean for book authors? Will using AI come to be seen as “cheating”? Will writers start proclaiming they are “natural” writers, with no AI use in their work, akin to bodybuilders who choose not to use performance-enhancing drugs?
I’m not as widely known or read as Dickens (yet, at least! #ManifestingItIntoExistence), but does this mean that at some point, someone could program a bot to write exactly like me? As with music, writing in the English language follows certain rules to make it pleasing and memorable. If people can engineer pop music to make it as appealing as possible to the wide world, why wouldn’t it be the same for writing?
I know it’s a scary thing to think about, especially as colleagues in my field are facing layoffs and belt-tightening measures. But my friend is right: Journalists and regulators need to get with the program and think seriously about how AI can help … or hurt.
Read the entire piece here.