I've had many conversations with marketing copywriters over the past day or so about the significance of GPT-3, OpenAI's new language model (available via a commercial API, not as open source).
Grand claims have been made for the software. It can write poetry, business memos, even news articles. Shouldn't content marketers and copywriters be anxious about its impact on our careers?
But the 'problem' with GPT-3 is that although its intelligence is built on massive volumes of data, it lacks a soul. It has no sense of purpose or empathy. It can't connect with the fundamental 'human needs' that underpin great storytelling.
Or as the Wired article below puts it:
"GPT-3 often spews contradictions or nonsense, because its statistical word-stringing is not guided by any intent or a coherent understanding of reality."
So GPT-3 might make a good brainstorming partner at the outset, but you still need a 'human-in-the-loop' to craft the output into meaningful content.
Stop worrying, start learning
The advice I give to marketing clients and trainees is to find out how AI can boost your content career rather than destroy it.
How? You can see analogies in other industries such as journalism. The reporters getting ahead are those who augment their core human skills with AI tools.
These machine learning models usually take on the data-driven drudgery: sifting through thousands of documents (Panama Papers) or writing simple programs that turn data into news (sports results, weather reports, financial updates).
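To make the "data into news" idea concrete, here is a minimal sketch of the template-based approach newsrooms use for sports results. The function name and fixture data are my own invention for illustration, not any particular newsroom's system:

```python
def match_report(home: str, away: str, home_goals: int, away_goals: int) -> str:
    """Turn a structured match result into a one-line news sentence."""
    if home_goals > away_goals:
        return f"{home} beat {away} {home_goals}-{away_goals}."
    if home_goals < away_goals:
        return f"{away} won {away_goals}-{home_goals} away at {home}."
    return f"{home} and {away} drew {home_goals}-{away_goals}."

# Feed in structured data, get readable copy out.
print(match_report("Leeds", "Hull", 2, 1))  # Leeds beat Hull 2-1.
```

The point is that this kind of automation handles the repetitive, data-driven copy, leaving the reporter free for analysis and storytelling.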
In marketing, AI is slowly nudging its way into the project planning process. It can look back at previous campaigns, work out their strengths and weaknesses, then package up some basic recommendations to play with: headlines, length of copy, illustrations and so on.
But it's still just interpreting data. A creative campaign, with all the headlines, blogs, social posts and captions threaded together in a coherent narrative? That's still a long way away.
So don't worry about GPT-3. But do start harnessing the AI tools that will give your content marketing a boost. Doing nothing is not an option. The time to start learning is now!
The Wired piece continues: "It doesn't have any internal model of the world, or any world, and so it can't do reasoning that would require such a model," says Melanie Mitchell, a professor at the Santa Fe Institute and author of Artificial Intelligence: A Guide for Thinking Humans. In her experiments, GPT-3 struggles with questions that involve reasoning by analogy, but generates fun horoscopes.