We Don’t Use AI (and You Shouldn’t Either)

November 26, 2024 | Daphne Strasert

AI tools are everywhere. “Create your book cover using Midjourney!”, “Ask ChatGPT to analyze your story structure!”, “Write your first draft using Scribblr!”. Writers are bombarded with opportunities to cut corners in the creative process. I’m here to tell you: Don’t. Do. It.

But first, a disclaimer

The AI that I am talking about in this blog post is Generative AI. This is different from Assistive AI, which has been around for a long time and is essential in getting books to market. Assistive AIs are things like spell check, talk-to-text, and autocomplete—things that have limited scope and clearly defined uses. They do not create content.

What is AI?

Artificial Intelligence—AI for short—is a computer model that learns from existing data and then completes a task based on the data it was fed. So, a program fed 100 million documents with clearly flagged grammatical errors will be able to identify those errors in a new document that it has never seen before. These programs are not given a set of rules (no one has taught them the rules of grammar); they simply pick up on what those rules may be from the data they were given. This is an important detail: AI DOES NOT KNOW WHAT IT IS LOOKING AT OR WHAT IT IS CREATING. It will become important later when we talk about why AI isn’t very good at generating content.
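
To make that concrete, here is a toy sketch in Python (purely illustrative: the example sentences and the looks_wrong helper are invented for this post, and no real AI system is this simple). Notice that no grammar rule is ever written down; the program only counts patterns in examples a human has already labeled as wrong or correct:

```python
# Toy illustration only -- not how any real AI product is built, and the
# example sentences are invented. No grammar rule is programmed in anywhere;
# the program just counts which two-word sequences show up in writing that
# humans flagged as wrong versus writing flagged as correct.
from collections import Counter

flagged = ["he go to the store", "she walk home", "they was late"]
correct = ["he goes to the store", "she walks home", "they were late"]

def bigrams(sentence):
    words = sentence.split()
    return list(zip(words, words[1:]))

bad_counts = Counter(b for s in flagged for b in bigrams(s))
good_counts = Counter(b for s in correct for b in bigrams(s))

def looks_wrong(sentence):
    # A word pair seen only in flagged examples is treated as a likely error.
    return [b for b in bigrams(sentence)
            if bad_counts[b] > 0 and good_counts[b] == 0]

print(looks_wrong("he go home"))  # [('he', 'go')] -- pattern matched, no rule "known"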

Practical Reasons to Avoid AI

We as authors are trying our best to get our stories out into the world and, let’s face it, writing a novel is no easy task. Why not use tools to do it for us? Why not feed our ideas into an AI novel generator and have it spit out the first draft? There are lots of reasons not to do this, but let’s start with the more concrete ones that get right to the bottom line.

AI-produced content isn’t good

Remember when I said that AI doesn’t know what it’s doing? That becomes abundantly clear when you start looking at the content AI text generators produce. AI may create grammatically correct sentences (sometimes), but it has no concept of point of view, characterization, themes, or plot. In essence, it doesn’t know what makes up a good story. AI has been known to simply drop characters halfway through the story, or create completely new settings for no reason, or change a character’s appearance or backstory without explanation. AI puts words together based on the probability of those words appearing together in the data it trained on. That can create nonsensical results. 
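
To see why probability alone can produce nonsense, here is a minimal, purely illustrative sketch (a toy word-pair model invented for this post, nowhere near the scale of a real system) that chooses each next word based only on which words followed it in a tiny training text:

```python
# Toy illustration only -- a miniature "language model" that picks each next
# word based solely on which words followed it in a tiny made-up training
# text. It has no idea what a character, a plot, or even a sentence is.
import random
from collections import defaultdict

training_text = (
    "the detective opened the door the door creaked the detective drew "
    "her gun the room was empty the gun was cold"
)

# Record which words were observed to follow each word.
follows = defaultdict(list)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current].append(nxt)

def generate(start, length=12):
    word, output = start, [start]
    for _ in range(length):
        options = follows.get(word)
        if not options:
            break
        word = random.choice(options)  # likelihood matches how often it followed
        output.append(word)
    return " ".join(output)

print(generate("the"))
# e.g. "the door the detective drew her gun the room was empty the gun"
```

Every pair of neighboring words in the output appeared somewhere in the training text, which is why the result can look like language while meaning nothing.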

But also, you don’t know what data (stories) the AI model was trained on. AI scrapes from millions of novels without regard to quality, and many models were trained on fanfiction, which isn’t vetted in any way before being included. This means that the model is of dubious quality to begin with. And AI can’t create anything greater than what it started with. AI will never produce groundbreaking or visionary content. It simply can’t. It cannot synthesize tropes, themes, or genre clichés and turn them on their heads. That creative spark only comes from a writer.

“But Daphne,” you may say, “I can take that terrible first draft and edit it into something good.” 

I mean, I guess you could, but the amount of work that would take is immense, perhaps greater than the effort of writing the draft yourself in the first place. And you wouldn’t even be able to publish it, because…

AI-produced content can’t be copyrighted

That’s right. In order for something to be considered for copyright, it must be produced by a human being. According to the United States Copyright Office, human authorship is a “bedrock requirement” of copyright. Even if you edit an AI-generated novel, you still couldn’t claim it as your own work.

Because it’s not. 

Think about it: all you created was the prompt that you gave to the AI model. The model itself produced the text. So, should the model hold the copyright? Or the person who created the model? Or the people who wrote the stories that were fed into the model as training data?

Copyright doesn’t extend to the ideas inside your head, or even to words that you speak out loud. It only applies to words on the page that you put there yourself. And without copyright protection, you have no exclusive right to sell the work. Anyone could sell it.

(If you want to know more about copyright, check out my webinar on intellectual property.)

Ethical Reasons to Avoid AI

But there is more to the decision about using generative AI than just the money-making aspect. After all, writing is a form of art—one of the highest forms of human expression. If we view it solely through the lens of personal gain, we’re missing the point.

AI is trained on stolen works

AI models are trained on vast datasets scraped from the internet, most of it gathered without the permission of the authors whose work is used. In fact, AI models rely on using copyrighted works without paying for them. OpenAI, now facing a copyright lawsuit from the New York Times, has admitted, “It would be impossible to train today’s leading AI models without using copyrighted materials.” And, given the sheer number of works needed to build those training datasets, AI companies simply can’t afford to pay the works’ creators and remain profitable.

Artists deserve to be paid for their work. That’s why the editors at Tomeworks encourage you to always seek out paying markets for your stories. Your creativity is valuable and you deserve to be paid for what you produce. AI models that take authors’ works and imitate them without compensation are stealing those creative efforts.

Embracing AI could push real writers out of the market

Because AI works cannot be copyrighted, the traditional publishing industry hasn’t embraced them… yet. A number of publishing companies are already making moves to incorporate AI into their business models—from cover design to marketing copy. 

But let’s not kid ourselves: publishing is a money game, and if publishers could get quality (or even passable) novels from AI and retain the rights to them, they would absolutely go that route instead of paying writers.

It’s not hard to imagine a world where that is the case: where Harlequin pumps out legions of AI-made romances (the genre has a well-established formula and a voracious readership that would make it a likely target); where a model trained on the many works of Stephen King keeps producing books “in his style” that his publisher prints under his name, even long after he’s dead; where the time, effort, and expense of working with real authors simply isn’t worth it when AI could churn out hundreds of books in a fraction of the time. Even if the books aren’t good, a market flooded with AI lowers the bar for everything. (For more imaginings of the future of AI, check out this blog post by Sean Morrissey Carroll.)

That’s why it’s important that we take a stand for ourselves and our craft. Don’t use AI, and don’t normalize using it. Stand in solidarity with artists and don’t use AI-generated book covers. Advocate for safeguards against AI content. And keep writing.
