Why Nick Cave is wrong about human creativity and generative AI

First of all, I don’t think that Nick Cave is entirely wrong. Setting aside the fact that ChatGPT is just one of many publicly available LLMs, and that using it as a stand-in for all of generative AI is like saying “AOL” when you mean “the internet,” he does make a fair point that using generative AI as a replacement for basic human creativity is wrong.

What he doesn’t understand is that using AI this way is also counterproductive. He blithely assumes that it takes no skill or effort whatsoever to use these AI tools: that all one has to do is tell ChatGPT what to write, and it will magically produce something that is, if not great, then at least publishable. But as someone who has written several AI-assisted novels and short stories, I can assure you that it takes real effort to produce something more than merely passable. Indeed, with longer works like novels, our current AI models are incapable of producing even passable work without considerable human intervention.

This is why I call it AI-assisted writing, as opposed to AI writing. When you do it right, the AI tools don’t replace your inner human creativity, but augment and enhance it, making things possible that were either impossible before or that required a prohibitive degree of struggle. Writing with AI is still a form of creativity, though it might not look exactly like previous forms. But isn’t that also true of writing on a computer vs. writing longhand? Does it take any less creativity to write a novel in Microsoft Word than it does to write it on parchment with a fountain pen?

Granted, the technological leap from word processor to generative AI is much more profound and fundamental than the leap from pen and paper to typewriter, or from typewriter to MS Word. Speaking from experience, I can say that writing a novel with ChatGPT or Sudowrite feels a lot more like directing a play with an amateur (and very stupid) actor than it feels like wrestling with the empty page, at least in the early generative stages. But it’s still, fundamentally, a creative act—and that’s the main thing that Nick Cave misses in his rant. Anyone can ask ChatGPT to write them a novel, just like anyone can bang their hands on a piano or strum their fingers across the strings of a guitar. But to produce something good—that requires effort.

However, there is an even deeper level on which Nick Cave is wrong here, and that is the unspoken assumption that the difficulty of creating something is what gives it value. It’s the same principle that Karl Marx expounded in his labor theory of value: that the economic value of a good or service is determined by the amount of labor required to produce it. Cave simply applies that logic to creative and artistic value instead of economic value. That’s just wrong.

Do we love J.R.R. Tolkien’s Lord of the Rings because it took him more than a decade to write and represents the greatest product of his life’s work? Obviously not. Otherwise, every amateur writer who’s been polishing and repolishing the same unfinished novel for the last twenty years would necessarily be the next Tolkien, never mind that their book reads more like The Eye of Argon than The Fellowship of the Ring.

So if it’s not the creative struggle or the amount of human effort that ultimately gives art its value, what does? The same thing that gives a product or service its economic value: the utility that it provides to the person who consumes it. In other words, the thing that gives art its value is the goodness, truth, and beauty that it brings into the lives of those who receive it.

This is especially true of writing, which is perhaps the most collaborative of all the arts. Without a reader to read it, a book is nothing more than processed and flattened wood pulp full of meaningless squiggles (even less than that for an ebook). When I read a book, I care not a whit for how much work it took for the author to come up with it. Same with the music I listen to, or the games that I play. What I care about is how it makes me think, feel, or experience the world.

And if it’s possible to bring more goodness, truth, and beauty into the world by using generative AI, so what? If it’s easier than writing a novel the old way, does that somehow make it “cheating”? If the answer is yes, please tell me why you don’t churn your own butter, or hunt your own food, or chop your own wood and burn it to heat your house, because relying on modern technology for those things is “cheating” in exactly the same way. I also hope all the books in your personal library are hand-copied, illuminated manuscripts, because the printing press is far more of a “cheat” than generative AI, as the last few hundred years of history clearly show.

Nick Cave is wrong. ChatGPT is not the most “fiendish” thing “eat[ing] away at [our] creative spirit.” Our humanity is far more resilient and antifragile than he gives it credit for. Those who try to replace human creativity with AI will fail, not because of artists like Cave who stubbornly resist the “temptation” to use these tools, but because of those who embrace the new technology with an open mind and discover that our humanity is not a liability but our greatest asset. Ironically, that is the very premise Cave rejects with his fearmongering about our fundamental replaceability.

By Joe Vasicek

Joe Vasicek is the author of more than twenty science fiction books, including the Star Wanderers and Sons of the Starfarers series. As a young man, he studied Arabic and traveled across the Middle East and the Caucasus. He claims Utah as his home.
