Generative AIs are bullshit generators. But for truly epic bullshit, we still need fiction authors.
I recently attended a debate with sci-fi and fantasy writers on the use of AI in fiction writing. I’ve rarely heard so much nonsense. AIs may hallucinate, but for genuinely outlandish bullshit, it seems we still need fiction authors.
A recent scientific article described generative AIs as “bullshit generators.” Large language models (LLMs) are not truth-seeking entities that occasionally drift into fantasy. Instead, their entire purpose is to create works that *seem* human-made, regardless of reality. Without any inherent sense of truth, AIs churn out text, images, and videos designed to appeal to users. They are essentially bullshitting their way to approval, much like politicians, public speakers, and, yes, fiction writers.
Mention AI around creatives nowadays, and you’ll be met with a storm of objections—from accusations that LLMs are stealing the work of hardworking artists to doomsday warnings that AI will end creative jobs—and possibly humanity itself.
Take NaNoWriMo (National Novel Writing Month) as an example. Last week, the organization faced massive backlash after releasing a statement that failed to explicitly condemn the use of AI in their annual writing challenge. NaNoWriMo suggested that opposing AI might overlook “classist and ableist issues,” which sparked outrage among writers and participants. Board members resigned, a sponsor withdrew, and critics accused the organization of betraying its mission to promote human creativity.
Frankly, I’ve never understood the NaNoWriMo concept. Encouraging people to churn out a 50,000-word novel in one month by writing nearly 1,700 words a day? The odds of that producing a readable book are about as good as an architect laying a few thousand bricks a day and hoping to deliver a livable home. If the goal is simply to hit a daily word count, you might as well let AI take over. ChatGPT could spit out ten thousand unreadable novels during NaNoWriMo. But I digress.
Concerns about AI in creative industries are growing. Consider a popular tweet by author Joanna Maciejewska, who wrote that she wanted AI to do her laundry and dishes so she could do art and writing, not AI to do her art and writing so she could do the laundry and dishes. The irony, of course, is that AIs don’t do anything unless prompted. If an AI is doing Maciejewska’s creative work, it’s because she gave it instructions. Otherwise, the machine sits idle, twiddling its digital thumbs. ChatGPT only does what it’s told. The tweet was a bit silly, if you ask me.
The real point Maciejewska and others are trying to make is that AIs are increasingly taking over creative tasks. But let’s be honest—if a machine can produce work of the same quality as yours, maybe your work wasn’t that special to begin with. You might want to consider a career where you can truly outperform the algorithms.
Generative AI, as it stands today, tends to elevate the lowest performers. It helps those struggling to improve their work just enough to pass as “acceptable.” However, it doesn’t yet elevate those who are already excellent at what they do.
Imagine that. Suddenly, with generative AI, the masses can produce work that’s no longer terrible—just mediocre or moderately good. And the gap between excellence and mediocrity is shrinking fast. This narrowing divide terrifies elitists who fear their work is being replaced by a flood of average, AI-generated content. Well, if that’s the case, maybe their great art wasn’t as valuable as they once thought.
Despite the onslaught of AI-generated mediocrity, authors and other creatives can still do great work. In fact, we could do more if we delegated the tasks we dislike to AIs, letting them compensate for some of our weaknesses. We don’t have to excel at every aspect of our jobs. AIs can also give us feedback that helps us improve even further at the parts of the work we do enjoy. There’s no reason to do the laundry and dishes and hand the entire creative process over to AI, unless you were never that good at your creative job in the first place.
Recently, I joined a discussion with writers who claimed that ChatGPT holds the copyright on everything it generates. This is utter nonsense. It would make no sense for OpenAI to retain copyright over generated text. Why would they? Their business model is based on subscriptions, not royalties. Claiming rights over generated output would completely undermine their primary revenue stream.
The same writers subsequently made another outrageous claim: ChatGPT steals and stores copies of all creative works it encounters. Again, this makes no sense at all. LLMs don’t retain full copies of content. The value of LLMs lies in storing statistical information about patterns, not verbatim copies of the Internet. The data centers are already expensive enough without keeping duplicates of everything the AIs encounter.
ChatGPT doesn’t own the work it generates, and it doesn’t store verbatim copies of what you create. When you use a tool, you need to understand what it does. Generative AIs are bullshit generators. We can learn to use them for that. But for truly epic bullshit, we still need fiction authors—they’re clearly still the masters.
P.S. For Glitches of Gods, I could only use ChatGPT for feedback on my prose. I couldn’t use it for input on earlier drafts, because LLMs weren’t available at the time.