They squashed a generation of readers… on purpose

This is one of the scariest and most horrifying videos I’ve seen on YouTube in a while. The first time I watched it was outrage-inducing enough, but then I watched it a second time, and wow. Our education system isn’t broken; it’s functioning exactly as the elites intended. And that is why so many millennials and zoomers can’t read, can’t write… and honestly, can’t even think.

This video is well worth an hour of your time. Possibly two.

Fantasy from A to Z: U is for Unicorns

If you were expecting a post on unicorns or other mythical beasts, I hate to disappoint you again, but that’s not what this is going to be. Instead, I want to write a bit about that most mythical of all human creatures: the full-time fiction writer.

Okay, perhaps we’re not that mythical. After all, Brandon Sanderson estimates that of all his students over the years, perhaps as many as 10% of the ones who set out to become full-time writers actually make that dream into a reality. I sometimes wonder: would Brandon count me as one of those 10%? Should he? The answer to that is… complicated. 

One of the first questions I get whenever I tell people that I’m a writer is “oh, wow—how is that working out for you?” Which is really a roundabout way of asking how much money I make, and whether I’ve been able to turn it into a full-time career. I am not (yet) a major bestselling author, and the closest thing I’ve had to a breakout thus far has been my (now unpublished) Star Wanderers novella series, which managed (mostly by accident) to hit the algorithms correctly back when a permafree first-in-series with lots of direct sequels was the best path to success. Then the publishing landscape changed, the algorithms shifted to favor pay-to-play advertising, and my books got left behind.

I will admit that if it weren’t for my wife’s income, I wouldn’t be able to pursue writing full-time. As a family, we’re following a path very similar to my Scandinavian ancestors, where the wife tends the farm while the husband goes off a-viking. In other words, my wife has the stable, traditional career that provides our family with some degree of security, while I have the more risky career that has the potential to catapult us into transformative levels of wealth and prosperity. We’re doing just fine, but it does sometimes feel like my Viking ship has yet to land ashore.

Because here’s the thing: something like 90% of the money in book publishing (after the booksellers and publishers and other middlemen take their often-exorbitant cuts) goes to less than 1% of the writers who actually make any money (and something like 30% of Kindle books never sell a single copy). 

For every Brandon Sanderson, there are thousands—perhaps hundreds of thousands—of published authors who write on nights and weekends while holding down a day job to pay the bills. My writing contributes enough to the family budget to justify pursuing it, but if I were still single, I would need at least a part-time job.

Indie publishing has created a lot of opportunity for authors to make a career out of their writing, and there are many successful indies who are making a decent living at it. At the same time, indie publishing has also massively exploded the number of books that are published, so the proportion of full-time to still-aspiring authors is probably about the same (and may have actually tilted the other way). 

In recent years, it has very much turned into a zero-sum pay-to-play game, especially with advertising. From what I can tell, most authors lose money on advertising, and most of those who are making money are spending upwards of $10,000 each month to make $11,000. The elite few who learn how to successfully game the algorithms to blow up their books often put their writing on the backburner to launch their own companies or provide publishing services, leveraging their expertise to make a lot more than they otherwise would.

The algorithms are changing books in some very strange ways. If J.R.R. Tolkien or Roger Zelazny or Robert E. Howard were writing today, would they be able to make it in the current publishing environment? 

Howard’s Conan stories would either have to be a lot sexier, or else would have to include the sort of tables and character stats you find in LitRPG. His covers would also be a lot more anime, and show a ridiculous amount of cleavage (which he actually might not have had a problem with, judging from some of the old Weird Tales covers). 

Zelazny’s Chronicles of Amber would all be far too short to make it in Kindle Unlimited—in that game, you need super long books that max out on page reads, in order to maximize advertising ROI so that you can outbid your competitors. And if you aren’t winning the pay-to-play advertising game, your KU books will sink like rocks. Also, Zelazny took way too much time between books. Gotta work on that rapid release strategy, Roger.

As for Tolkien… hoo boy, there’s an author who did everything wrong. Decades and decades spent polishing his magnum opus, with a short prequel novel that falls squarely in the children’s category (totally different genre) as the only other fantasy book published in his lifetime. I suppose he could have serialized Lord of the Rings, except nothing really happens in episode 1: A Long-Expected Party. Certainly not anything that would adequately foreshadow all the dark and epic battles to come. Perhaps if he had followed a first-in-series permafree strategy, and just given away Fellowship of the Ring for free… and then made The Hobbit his reader magnet for signing up for his email list… maybe that could have worked? After all, there’s always BookBub…

I jest, of course. Each of these authors’ books became classics, not because of their marketing strategy, but because they hit the cultural zeitgeist in exactly the right way. But is it possible for an author to do that today without also getting a boost from the algorithms? Or do the algorithms have more power to shape our culture than anything else? Those are disturbing questions, and I honestly do not know the answer.

And then there’s the question of AI, which is massively disrupting all of the creative fields. In the interest of full disclosure, I am actually quite sanguine about generative AI, and have already been working to incorporate it into my creative process. I’m not a fan of AI slop, but I don’t feel particularly threatened by it. I decided a long time ago that if AI ever became good enough to write an entertaining book, it still would never be able to write a Joe Vasicek book. That’s insulated me from most of the doom porn out there.

Right now, there is a HUGE fight happening between authors like me who are embracing AI, and authors who treat it all as anathema, and have vowed to never use any sort of AI in any of their books (except Grammarly, of course, because… reasons. And Microsoft Word. And…) Frankly, it reminds me of the big debate between indie and traditionally published authors, back before self-publishing had lost its stigma. The biggest difference is that the level of online outrage has been ramped up to 11, mostly as a result of the social media algorithms (which weren’t as robust or as powerful back in the early 2010s). I suspect that we will ultimately settle on a “hybrid” approach, much like we did with publishing, but the sheer level of vitriol has made me wonder about that. 

On the reader end of things, though, it seems like most readers don’t really care if a book was written with or without AI assistance, so long as it’s actually a good book. Which means that there is a real opportunity for authors who 1) know how to tell great stories, 2) have already found and honed their voice, and 3) know how to strike the right balance between the AI and the human elements. 

Which describes my own position almost perfectly. Over the last fifteen years, I’ve read, written, and published enough books that I have a pretty good handle on what makes a great story. I’ve also honed my voice well enough that I can write in it quite comfortably. And as for the balance between AI and human writing, I’ve been working hard on that since ChatGPT burst onto the scene in 2022. Half a dozen books and about a million words later, I’ve learned quite a lot about how to best strike that balance.

Will AI replace authors entirely, making this particular unicorn extinct? I don’t think so. But AI may radically change our concept of what “books,” or “writers,” or “writing” really are. As I said above, even the best AI will never be able to write a Joe Vasicek book. Only I can do that. Whether or not that’s worth something is up to the readers to decide.

The dangers of relying too much on AI

I saw this really interesting video last week, and it made me think: am I relying too much on AI?

In my personal life, this probably isn’t an issue. I do occasionally ask ChatGPT to make me a recipe, or to advise me on a particular topic, but I always run its answers through a gut check, and if something doesn’t pass, I assume it’s hallucinating. If it gives me something that I can quickly and easily verify, I always do that… and half of the time, it turns out to be a hallucination to some degree. So yeah, I don’t rely on it nearly as much in my personal life as some of the characters in this video do.

What about blogging? Don’t be too scandalized, but with my new blogging schedule, I have experimented a bit with using ChatGPT to write some of these blog posts. It’s not like I’ve been copy-pasting everything straight from the chatbot, but I have relied on it a little more heavily than I do in my own writing.

After trying that a couple of times, though, I decided to cut that out and write all of these blog posts by hand. Why? Because I felt like it was creating too much distance between myself and the people who read this blog, and the purpose for writing this blog is to foster a human connection. So it kind of defeats the purpose to rely on a chatbot to generate most of the content I post here. For that reason, I plan to keep writing all these blog posts entirely myself, with only minimal AI input.

So what about my fiction? This is where things get a little tricky. While I totally agree that simply copy-pasting from AI is a piss-poor way to write a book, I do think that AI can be a very useful tool in writing and crafting a novel, provided that you understand the limitations of the AI and don’t rely on it too much. But how much is too much? That is the question.

The biggest way that AI has helped to enhance my own writing is in giving me a bird’s-eye view of the story as I generate a “crappy first draft.” This bird’s-eye view allows me to see and fix major story issues before they metastasize and give me writer’s block, which is what tends to happen when I write these drafts out entirely by hand. When I’m focused on the page, I tend to lose sight of the forest for the trees, so I don’t notice that there’s a problem with the story until I’m several chapters in and find that I just can’t write.

This happened with basically every project I wrote on my own, and it’s the main reason why it took me anywhere from six to eighteen months (or longer) to write even a short novel before I started using AI. Since I began incorporating AI into my writing process, however, this problem has basically gone away, and I no longer experience this form of writer’s block at all.

However, while I do rely on AI to help me craft my “crappy first draft,” that isn’t the draft that I publish. Once the AI draft is as good as I can make it, I go through it scene by scene and rewrite the entire book in my own words. The purpose of this step is to make the story truly my own. I still keep the AI draft open on another screen and refer to it as I write, but I don’t do any copy-pasting. It’s all written out by hand.

Is this enough, though? Or do I need to add more steps to make sure that I’m not relying too much on AI, and thus losing my own voice? Recently, I’ve been spending a lot more time on the AI draft, generating multiple iterations and combining the best parts to (hopefully) boost the quality. I’ve also been doing a revision pass over the AI draft, tweaking it to smooth over some common AI-isms and (hopefully) adding a bit of my own voice before I move on to the human draft and rewrite the whole thing to make sure it’s all in my voice.

But while this might be enough to keep the book in my own words, is it enough to keep my own writing skills from atrophying? Or do I need to occasionally pick up a WIP that is 100% human writing, with no AI at all, just to make sure I don’t lose those skills? That is the question I’m currently pondering. Perhaps this is exactly the sort of thing short stories are good for. Perhaps I should go back to writing short stories again, just as a way to keep my writing skills sharp.

If I were starting out right now as a new writer, I would definitely avoid writing with AI until I’d written enough to find my own voice. I would also make sure to write at least one novel 100% without AI assistance, just for the experience, and to prove to myself that I could do it. Otherwise, I think there would be a very real danger of becoming over-reliant on AI to write my books, and thus losing my own unique voice, so that none of the books I write would ever truly become my own.

Anyhow, those are some of my current thoughts on the subject. What do you think of this problem?

Fantasy from A to Z: F is for Female

We live in a time of deepening division—not just between political parties or social classes, but between the sexes as well. Of course, men and women have always been different, but those differences have grown increasingly stark in recent years, even as it becomes more politically incorrect to say so.

Across the Western world, men are drifting one way, women another. In politics, men are turning more conservative, while women—especially young, unmarried women—are growing more liberal. We can see this gap not only in US voting patterns, but in voting patterns across the world. In matters of faith, men are turning toward traditional, even ancient forms of religious expression: high liturgy, orthodoxy, duty, and structure. Meanwhile, women are leaving organized religion altogether in record numbers. Some are embracing a kind of therapeutic spirituality—mindfulness, astrology, crystals—but many are simply checking out.

It’s not hard to see this growing rift playing out in other areas of life: marriage, dating, education, employment. But it’s also playing out in fantasy literature, not just among readers, but also among writers and publishers.

Instead of sharing a common ground, men and women are building parallel worlds. Many male readers are flocking to grimdark, with its blood-soaked realism and morally gray protagonists, or to litRPG, which merges game mechanics with fantasy worldbuilding in a system-focused power fantasy. Meanwhile, women are turning increasingly to romantasy, a subgenre that often verges on outright pornography and has virtually no appeal to men.

A lot of this is downstream from the gender divide in publishing. Traditional publishing—especially in the U.S.—has become overwhelmingly female, especially in the editorial departments. Some of that is demographic; some of it is cultural. But the result is that the gatekeepers of traditional fantasy publishing are mostly women. Their tastes, sensibilities, and values shape what gets acquired, marketed, and celebrated.

This divide wouldn’t be so troubling if it were merely about preferences or taste. But it runs deeper than that. Increasingly it seems that men and women no longer understand each other—or worse, no longer even try to. And when even our fiction reflects that fracture, it becomes that much harder to bridge the growing divide.

That’s what makes the current state of fantasy so toxic. Not because romantasy or grimdark are inherently bad—every subgenre has its place—but because they have become echo chambers that silo the sexes off from each other.

Men and women were not made to live in separate worlds. We need each other—not just to perpetuate the species, but to challenge, balance, and refine one another. I know this from personal experience. Without my wife, I’d be a lesser man. She often drives me crazy (to be fair, I return the favor), but we have each grown so much since marrying each other that I think I would hardly recognize the man I once was. Together, we are far more than the sum of our two parts.

Our stories should reflect that truth. We don’t need more genre ghettos. We need shared myths. Stories where masculine and feminine virtues don’t clash with each other, but come together in harmony.

That’s what I’m hoping to accomplish with my epic fantasy series, The Soulbound King. When building out the fantasy world, I deliberately designed the magic system so that latent magical powers can only be unlocked through marriage—the “soulbond”—between a man and a woman. I did that largely in response to the growing gender divide, because I wanted to write a story that shows how men and women can overcome it. Hopefully it works.

Fantasy, perhaps more than any other genre, gives us the space to reimagine what’s possible. It allows us to explore not just what the world is, but what it could be. And right now, what the world needs is for the young men and women of the rising generation to come together and reinvent the world.

Great interview about conservatives and art

This isn’t exactly a long-form podcast episode, but it is a really great interview on a subject that is near and dear to my own heart. Andrew Klavan is a fantastic author—I’m currently reading his latest Cameron Winter novel, A Woman Underground, and loving it—and of the Daily Wire hosts, he gives the most interesting commentary and insights on cultural issues (which isn’t surprising, given his age and Hollywood background).

AI-Assisted Writing: Why Write a Novel with AI?

One of the things about AI-assisted writing that has really surprised me is how resistant other writers can be to the very idea of using AI in their creative writing process. Here in Utah Valley, there’s a large enough writing community that we occasionally get together for an informal meetup over lunch, and every time I’ve brought up the subject, I could almost see the fists come up. At one of our local writing conventions, Writer’s Cantina, I was on a panel about AI-assisted writing… and there were maybe four people in the audience (and one of them was my wife!)

It’s a shame, because I really do think that generative AI is going to transform the way we write everything, from emails and reports to blog posts, long-form essays, and yes, even fiction. It’s only a matter of time. AI is gradually being worked into the apps and programs we all use to write, and as people become more comfortable with it in other aspects of their lives, they’re going to start using it to write fiction—and that’s okay! Almost all of the resistance is based on ignorance and fear, not a clear-eyed understanding of how these AI tools actually work.

As someone who remembers the days when “self-published” was very much a dirty word—in fact, many people considered it the kiss of death to ever having a professional writing career—it very much feels like we’re repeating the whole tradpub vs. indie wars of the early 2010s, just over the issue of AI-assisted writing. The biggest difference is that the internet is 10x more toxic than it used to be, probably because of how polarized and partisan our world has become in general. But other than that, it’s like we’re only a year or two away from an AI-assisted author having a massive breakout and proving that you can write with AI and be a success, the way Amanda Hocking proved that you can self-publish and be a success.

So why should writers consider writing a novel with AI? One of the things I hear a lot from other writers is “I enjoy the writing process too much to ever consider using an AI to help me write.” Which is fine, I guess, if you’re writing just for yourself and maybe your own family. But if writing is your career, or something you hope to turn into a career, why wouldn’t you want to experiment with AI-assisted writing? After all, if you refuse to even experiment with it, how can you possibly know that it won’t improve your process in some way? And if it can improve your process and give you a competitive edge, isn’t that reason enough to try?

Here is what I’ve found after a year and a half of experimenting with AI-assisted writing:

In the old days, it would take me anywhere from six months to several years to write a novel. Now, I can write a novel-length work in about 1-3 months.

Before, I would hit a patch of writer’s block in the middle of almost every project, leading to weeks (and sometimes months) of agonizing frustration and crippling self-doubt. Now, because of AI, I can step back far enough to see the whole forest instead of just the trees, identify all of the major story issues before they become creative blocks—and generate a rough draft in about a week.

Before, whenever I would come up with a great new story idea while in the middle of another project, I would have to suppress my enthusiasm for that idea or risk having it derail everything I was working on. Now, I finish my projects fast enough that that generally isn’t a problem—and even if it is, it only takes a day or two of chasing that idea to satisfy the creative itch, and either trunk the idea entirely or turn it into a new project to work on later.

Before, my biggest limitation was my ability to turn ideas into words. Now, with a few clicks, I can generate all the words I could possibly need, and the biggest limiting factor is my ability to stay true to my own creative vision.

It’s a completely different paradigm, with a totally new skillset and a very long (and at times somewhat steep) learning curve—and that’s probably the real reason why most writers are so reluctant to experiment with it. But is it really worth it? The only way to find out is to make the leap. For the last 18 months, I’ve been making that leap, and even though I have yet to feel like I’ve mastered AI-assisted writing, I’ve already seen enough to believe that it is.

Why Nick Cave is wrong about human creativity and generative AI

First of all, I don’t think that Nick Cave is entirely wrong. Leaving aside the fact that ChatGPT is just one of the many LLMs that are publicly available, and that using it as a stand-in for all of generative AI is like saying “AOL” when you mean “the internet,” he does make a fair point: using generative AI as a replacement for basic human creativity is wrong.

What he doesn’t understand is that using AI this way is also counterproductive. He blithely assumes that it takes no skill or effort whatsoever to use these AI tools—that all one has to do is tell ChatGPT what to write, and it will magically produce something if not great, then at least publishable. But as someone who has written several AI-assisted novels and short stories, I can assure you that it takes real effort to produce something more than merely passable. Indeed, with longer works like novels, our current AI models are incapable of producing even passable work without considerable human intervention.

This is why I call it AI-assisted writing, as opposed to AI writing. When you do it right, the AI tools don’t replace your inner human creativity, but augment and enhance it, making things possible that were either impossible before, or that required a prohibitive degree of struggle. Writing with AI is still a form of creativity, though it might not look exactly like previous forms. But isn’t that also true of writing on a computer vs. writing longhand? Does it take any less creativity to write a novel on Microsoft Word than it does to write it on parchment with a fountain pen?

Granted, the technological leap from word processor to generative AI is much more profound and fundamental than the leap from pen and paper to typewriter, or from typewriter to MS Word. Speaking from experience, I can say that writing a novel with ChatGPT or Sudowrite feels a lot more like directing a play with an amateur (and very stupid) actor than it feels like wrestling with the empty page, at least in the early generative stages. But it’s still, fundamentally, a creative act—and that’s the main thing that Nick Cave misses in his rant. Anyone can ask ChatGPT to write them a novel, just like anyone can bang their hands on a piano or strum their fingers across the strings of a guitar. But to produce something good—that requires effort.

However, there is an even deeper level on which Nick Cave is wrong here, and that is the unspoken assumption that the difficulty of creating something is what gives it value. It’s the same principle that Karl Marx expounded in his labor theory of value: that the economic value of a good or service is determined by the amount of labor required to produce it. Applied to art, the claim is that a work’s creative and artistic value is determined by the amount of struggle that went into it. That’s just wrong.

Do we love J.R.R. Tolkien’s Lord of the Rings because it took him several decades to write, and because it represents the greatest product of his life’s work? Obviously not—otherwise, every amateur writer who’s been polishing and repolishing the same unfinished novel for the last twenty years must necessarily be the next Tolkien, never mind that their book reads more like The Eye of Argon than The Fellowship of the Ring.

So if it’s not the creative struggle or the amount of human effort that ultimately gives art its value, what does? The same thing that gives a product or service its economic value: the utility that it provides to the person who consumes it. In other words, the thing that gives art its value is the goodness, truth, and beauty that it brings into the lives of those who receive it.

This is especially true of writing, which is perhaps the most collaborative of all the arts. Without a reader to read it, a book is nothing more than processed and flattened wood pulp full of meaningless squiggles (even less than that for an ebook). When I read a book, I care not a whit for how much work it took for the author to come up with it. Same with the music I listen to, or the games that I play. What I care about is how it makes me think, feel, or experience the world.

And if it’s possible to bring more goodness, truth, and beauty into the world by using generative AI, so what? If it’s easier than writing a novel the old way, does that somehow mean it’s “cheating”? If the answer to that question is yes, please tell me why you don’t churn your own butter, or hunt your own food, or chop your own wood and burn it to heat your house—because all of those applications of modern technology are “cheating” in exactly the same way. Also, I hope all the books in your personal library are handmade, illuminated manuscripts, because the printing press is far more of a “cheat” than generative AI, as the last few hundred years of history clearly shows.

Nick Cave is wrong. ChatGPT is not the most “fiendish” thing “eat[ing] away at [our] creative spirit.” Our humanity is far more resilient and anti-fragile than he gives it credit for. Those who try to replace human creativity with AI will fail, not because of artists like Cave who stubbornly resist the “temptation” to use these tools, but because of those who embrace the new technology with an open mind, and discover that our humanity is not a liability but our greatest asset—a premise that Cave ironically rejects with his fearmongering about our fundamental replaceability.

More early thoughts on AI-assisted writing

It’s become something of a cliche that true writers write because they can’t not write, but as with so many other stereotypes and cliches, there’s a kernel of truth in it. I’ve been writing on and off since the 8th grade, and even during periods of my life when I wasn’t able to focus on writing, the writing itch would still come for me, and I would have to sit down and sketch out something, even if I never did anything with it.

Over the years, I’ve gotten pretty good at telling whether I’ve done enough to satisfy that creative urge that drives me to write, and whether that urge has been left unsatisfied. Yesterday, I realized that something felt off because that creative urge had not been satisfied—which is strange, because after only two weeks of working on this AI-assisted novel project, I’m already halfway done with the first draft. Indeed, yesterday I “wrote” (if that’s the right word for it) a little over 4.5k AI-assisted words.

Now, this should go without saying, but the point of writing professionally is not how good the creative process makes you feel, but how well and how quickly you produce a quality finished product. But I do think there’s a correlation between the two: that you are more likely to create a quality finished work the better your creative process satisfies your creative urges.

In the last three months of writing with AI, I’ve had some writing days that were better than almost any I’ve had in years. I’ve also had some very meh days, which is to be expected… but what isn’t so expected is this feeling of being creatively unsatisfied, which I usually don’t feel unless it’s been two or three weeks since I’ve done any writing. Something weird is going on.

How much of this feeling of creative dissatisfaction is due to the fact that I’m outsourcing a significant portion of the creative work to an AI, and how much of it is simply to be expected from trying to master a new and unfamiliar skill, which has kept me from satisfying that urge in the same way as I have in the past? At this point, it’s difficult to say. Probably a little of both.

Having worked on this for a while now, though, I think that the writers (and other creative types) who are going to succeed the most with AI-assisted creative work are the ones who figure out how to integrate the human element of their process with the AI element of their process, such that each one complements and enhances the other. Right now, everyone’s talking about how AI will replace us, but that’s really the wrong way to think about it if you want to learn how to master these tools.

I suspect that the way to master AI-assisted writing is not to try to get the AI to “do the hard stuff,” or replace some aspect of the creative process, but to integrate it within your creative process such that it enhances and magnifies your own, very human efforts. For that reason, I’m changing the way that I count my daily words so that I no longer make a distinction between words that I “write” myself, and words that the AI “writes” or generates, because the AI can’t generate words unless I give it enough to work with. Often, that means that I write a little, then generate a little, then tweak what the AI generates and write a little more. When the process is working well, it’s very difficult to say which parts were purely AI “written,” and which parts were purely human “written.”

But it’s still going to take a while to figure out exactly how to integrate AI into my writing process. As I continue to do that, I’m going to pay close attention to how it satisfies—or fails to satisfy—my creative urge to write, not because that is the end goal, but because I suspect that if my creative urge is not being satisfied, the AI-assisted stuff that I’m producing probably isn’t very good. It may not be very good even if my creative urges are satisfied, but if something about the process is missing, then something about the final product probably is missing as well.

Anyhow, those are some more of my random thoughts as I continue to experiment with AI-assisted writing. I was hoping to finish the rough/AI draft of The Riches of Xulthar before the end of the month, but I’m almost out of AI words for this billing cycle, so I’ll probably move on to the “humanizing” phase for what I’ve already written, which is where I retype the AI-generated material in order to pass it through what I like to call “the human filter.” Hopefully that helps to give the story a little more of my personal voice and style, so that it doesn’t read like something that could have been AI generated by anyone. But I’m still working out and experimenting with that part of the process, just like all the others.

Some early thoughts on AI-assisted writing

I remember the early days of indie publishing. Back in 2011, when self-publishing was still a dirty word (and Kindle Unlimited wasn’t yet a thing), there were a LOT of opinions about “indie vs. tradpub,” most of them heated, and some of the arguments I witnessed at conventions like the 2011 Worldcon in Reno very nearly came to blows—which is to say nothing of the online discourse.

And yet, if I could go back to those times, I would tell my younger self not to spend so much time as a keyboard warrior on KBoards, because guess how much all that sturm und drang ultimately mattered? Not much. Instead, I would have told my younger self to focus on things like learning how to market myself and my books, learning how to build and run a business, and learning how to, for lack of a better word, network effectively, because the longer I stay in this business the more I realize that community is the thing that really drives everything.

(Though I still would have told him to delete his social media. That’s not community—it’s just a string of toxic echo chambers spreading toxic mind viruses that keep us all in the matrix.)

Anyhow, I say all of that by way of preface to my thoughts on AI-assisted writing, because I see many of the same things playing out right now. There are a lot of opinions floating around out there about AI, most of them heated, some of them nearly coming to blows. And yet, I can’t help but get the sense that most of the noise right now isn’t going to matter much after the changes shake out. Therefore, my working assumption at this point is that most of the arguing is just a waste of time. The stronger the opinion, the less of a difference it will probably make.

For example, back in 2011, the groupthink consensus over at SFWA was that indie publishing was terrible, indie writers were going to destroy literature, and the publishing world needed gatekeepers in the form of agents and editors. Five years later, they amended their membership requirements to allow indies to join. Fast forward to today, and the groupthink consensus over at SFWA is that AI-assisted writing is tantamount to plagiarism, that AI is going to destroy literature, and that the publishing world needs to do everything it can to preserve the human element of writing. Are you beginning to see a pattern?

For the last two months, I’ve been experimenting with some of these AI writing tools. I’m still in the very early stages of integrating AI into my writing process, but I really do think that in the not too distant future, almost everyone who writes will use some form of generative AI in their process. After all, how many people still write with manual typewriters? A few, but not many.

And think of how much computers and word processors changed how we think about writing, compared to the old days when everything was done by typewriter and everyone knew what a SASE was. For one thing, revising was much more of a pain, since it required retyping everything—no small thing for a long book. Therefore, books and stories were probably much shorter, and writers much better at producing clean first drafts. Also, the revision draft was a genuine draft—no cycling through yesterday’s work to get yourself into the zone.

And how did the typewriter change how we thought about writing, back when it replaced longhand, or whatever the heck came before? It probably enabled skilled typists to write a lot faster—but that meant a lot less ruminating over the text. Was that why 19th century writing was so much more flowery and verbose: because the writing process involved spending more time with the language, and therefore immersing yourself more fully in it? Was the typewriter one of the major drivers of the pulp era, enabling fast writers like Hubbard, Howard, and Lovecraft to produce vastly more content for vastly more markets and publishers?

I really do think we’re on the cusp of a revolution in how we think about writing, and I suspect that generative AI is going to be at the heart of it. And it may turn out that the writers who cut their teeth in the previous era, when every word had to come from your own mind, will have an outsized advantage in the new one, because we will be strong in areas where writers who come of age with generative AI will be weak.

Or it may be that most of us old-timers will cling to the old ways and never learn the new, excusing our willful ignorance in the name of “preserving our humanity.” It may be that we have so much to unlearn in order to make it in this new era that most of us won’t be able to make the transition.

What do you think, ChatGPT?

As an AI language model, I don’t have personal opinions or beliefs, but I can provide some analysis and insights based on the information you’ve shared.

It’s interesting to observe parallels between the early days of indie publishing and the current discussions surrounding AI-assisted writing. When new technologies or methodologies emerge, it’s common to see a range of opinions, often polarized, as people try to understand and adapt to the changes they bring. Over time, these opinions may evolve as the benefits and limitations of the new approaches become clearer.

AI-assisted writing tools are indeed gaining popularity, and it’s likely that they will become more integrated into the writing process for many authors in the future. Just as word processors changed the way we approach writing, AI tools have the potential to reshape the writing landscape, offering new possibilities and efficiencies.

However, it’s important to note that AI is a tool, and its impact on the writing process will depend on how it is utilized. While some fear that AI will lead to a loss of creativity or human touch, others see it as a valuable aid that can enhance productivity and offer new creative avenues. Ultimately, the extent to which writers embrace AI-assisted writing will vary, and different authors will find their own balance between leveraging AI tools and preserving their preferred writing methods.

Experience and skill acquired through traditional writing processes are likely to remain valuable, as they provide a foundation for storytelling and an understanding of narrative structure and language. Writers who have honed their craft in the pre-AI era may indeed have advantages in certain areas. However, it’s also important to acknowledge that new generations of writers growing up with AI tools may develop unique skills and approaches that complement the capabilities of these technologies.

The transition to any new era brings challenges and opportunities. It’s up to individual writers to explore and adapt to the changes that AI-assisted writing offers, determining how best to integrate these tools into their creative processes while staying true to their own artistic vision. The writing community, as you mentioned, plays a crucial role in sharing insights, experiences, and support as this evolution takes place.

In the end, the impact of AI on writing will be shaped by how writers choose to embrace and utilize the technology, and the value of community, collaboration, and individual creativity will continue to be important factors in the literary world.

What aspects of AI-assisted writing should we be focusing more on right now?

We should be focusing on exploring the capabilities and limitations of AI-assisted writing, understanding how it can be integrated into the writing process, and developing ethical standards and guidelines for its use. Additionally, we should be considering the potential impact of AI on the writing industry, such as changes to the job market and the possibility of increased accessibility to writing and publishing for marginalized voices.

Why I’m not worried about AI replacing writers

So machine learning artificial intelligence has really been blowing up this past month, probably because of ChatGPT and all of the fascinating things that people are doing with it. I’ve been getting into it myself, using it to help write or improve my book descriptions, and also experimenting with it for writing stories.

At this point, any original fiction that ChatGPT writes is about the same quality as something written by an overly eager six-year-old (minus the grammar and spelling errors), but I can see how that could change in the future, especially with a large language model that’s trained on, say, Project Gutenberg, or the complete bibliographies of a couple of hundred major SF&F writers. The technology isn’t quite there yet, but in a few years it could be.

But apparently, that hasn’t stopped hordes of amateur writers and/or warrior forum types from using ChatGPT to spam the major magazines with AI-written stories. In fact, Clarkesworld recently closed to submissions because they were getting flooded with “stories written, co-written, or assisted by AI.” Neil Clarke wrote an interesting blog post on this problem, saying that this is a major growing problem for all of the magazines and that they will probably have to change the way they do business to deal with it.

So will AI eventually become so good that it replaces writers altogether? I don’t think so, and here’s why.

Replacement vs. collaboration

The gap between an AI that can do 100% of what a fiction writer can do and an AI that can do 90% is actually much wider than the gap between an AI that can do 90% and an AI that can only do 50%. That’s because both the 90%-effective AI and the 50%-effective AI require collaboration with a human in order to do the job. Neither of them can fully replace the human, though a human-AI team may be able to do the work of many humans working alone.

If we ever get to the point where AI replaces storytellers completely, we have much bigger problems than a few out-of-work science fiction writers. Storytelling lies at the heart of what it means to be human: we call ourselves “Homo sapiens,” but we really should call ourselves “Homo narrans,” since story is how we make sense of everything in our world. If an AI can replace that, then we as a species have become obsolete.

But I don’t think we’re going to ever reach that point. My wife is currently getting a PhD in computer science—specifically in machine learning and language models—and she believes that there is an inherent tradeoff between intelligences that can specialize well, and intelligences that can generalize well. AIs are master specialists, but humans are master generalists. If we ever build an AI that’s a master generalist, we may find that it’s actually much less intelligent than an average human, because of the tradeoff.

But all of that is purely speculative at this point. Right now, we really only have AIs that can do about 20% of what a fiction writer can do. In the coming years, we may ramp that up to 50% or even 90%, but anything less than 100% is not going to fully replace me.

Tools, force multipliers, and the nature of writing

However, that doesn’t mean that the thing we currently call “writing” isn’t going to change in some pretty dramatic ways, much as the internal combustion engine dramatically changed the thing we call “driving.” And with these changes, we may very well get to the point where the market just can’t support as many professional writers, and the vast majority of us have to find other lines of work.

Conversely, it may actually expand the market for “reading” and create new demand for “writers,” as “reading” becomes more interactive and “writing” turns into an AI-mediated collaboration with the “reader.” Kind of like a Choose Your Own Adventure that writes itself, based on the parameters set by the “writer.”

I have no idea, but the possibilities are fascinating, and the writers who are sure to lose are the ones who fail to confront the fact that their whole world is about to change—indeed, is already changing.

I think what it’s going to come down to is who owns the tools: not just who can use them, but who can modify them, personalize them, and use them to create original work. If copyright law decrees that the person who owns the AI also owns anything created with the AI’s assistance, that is going to be a major buzzkill… unless we get to the point where everyone can have their own personalized AI, which would be pretty cool. It would also solve a lot of the problems emerging from all of the super-woke filters that are getting slapped on ChatGPT.

Personally, I’m looking forward to the day where I can use an AI model to write fifty novels across a dozen pen names in a single year. What an incredible force multiplier that would be! But only if those novels are “mine,” whatever we determine that means.

So really, instead of arguing about whether AI will replace authors, what we really ought to be talking about are the aspects of writing and storytelling that drive us to create in the first place, and how those aspects can translate into a world where the nature of “writing” looks radically different than it does right now.

The “but I already know how it ends” problem

There is one problem that is unique to the written word, and it’s something that every writer has to confront when making the leap from amateur to professional (or even just from an amateur who dabbles in prose to an amateur who finishes what they start). The problem can best be summed up by this:

Why should I bother writing this story if I already know how it ends?

Unlike visual media such as TV, movies, video games, or illustrations, the art of the written word exists 100% in the reader’s head. These things we call “words” are really just symbols that convey thought from one mind to another, and have zero meaning outside of the head of the person reading. If you don’t believe me, try picking up a classic novel written in a foreign language that you don’t understand, and see how well you enjoy it.

But when we read, we like to be surprised on some level. There is something about the novelty of the story that appeals to us—indeed, that’s why we call them “novels.” The trouble is that the very act of creating a novel kills the novelty of it. At some point, you know how it’s going to end, and after that point the act of writing becomes a chore—or rather, it can be, unless you find something else about the process that fulfills you.

Some professional writers deliberately put off that moment for as long as they can, never figuring out their ending until it comes as a surprise, even to them. Others look for fulfillment in something else, like the artfulness of their prose, or the dramatic suspense built up by their use of language. Still others just plow ahead, accepting this loss of novelty as a cost of doing business.

But however they choose to deal with it, every writer has to confront this problem in some manner before they make the leap from amateur to professional. And this is perhaps the biggest reason why I’m not too worried about AI replacing me as an author: because even an AI model that can do 90% of what I do will still require its human collaborator to address this problem.

Fanfiction and derivative works

Of course, the amateur vs. professional problem will affect some genres more than others: “write me a romance just like ____ where the male love interest has black hair instead and works in my office” is going to be just fine for a romance novel addict who just wants their happily-ever-after without any uncomfortable surprises. But we already have this: it’s called fanfiction.

Which is not to say that all fanfiction is formulaic and predictable. But the things that set fanfiction apart from original fiction are the things that make it a derivative work: things like characters and settings that are already well-established, or a rehashing of storylines that were created by someone else.

This is an area where I think AI shows the most promise, and will turn out to be the most disruptive: not in creating original works, but in creating derivative works. Imagine if you could plug a novel into ChatGPT and tell it to rewrite the ending so that the girl ends up with your favorite character, or your favorite villain wins in the end. ChatGPT can’t do that very well right now, but I don’t think we’re far from building a large language model that can—especially if it’s trained on actual books, instead of online content.

What I foresee is a world where AI blurs the line between fanfiction and original fiction so much that it becomes normal to read a bunch of these derivative works after you’ve read the original. Indeed, it may become a game to see who can make the most popular derivative work, and the popularity of some of them may very well exceed the popularity of the original.

Or it might become normal to run everything you read through an AI filter that removes offensive language, or the sex scenes that you were going to skip anyway (or conversely, an AI filter that adds offensive language and sex scenes). Taken to an extreme, this could lead to some really dystopian outcomes that further divide our already polarized world. We’ll have to see how it shakes out.

But all of this derivative content is only possible if there’s original content to derive it from. And while AI may lower the barrier to entry somewhat for creating original content (or not, since there really aren’t any barriers to entry right now, aside from the time and practice it takes to become proficient at your craft), the problem of “but I already know how it ends” will keep most dabblers and amateurs in the realm of creating derivative works, not original ones.

The act of “writing” and “reading” may change dramatically based on the force-multiplying effect of these tools. We may even get to a point where “writing” and “reading” bear little resemblance to what most of us understand them to be today. But unless our very humanity becomes obsolete, I’m confident that I will still be able to carve out a place for myself as a writer.