Let’s get one thing straight right off the bat: It is unacceptable to prompt ChatGPT to write a novel and then pass that off as your own original work. One would hope this is obvious, but just in case, it’s worth stating, unequivocally, that you cannot take shortcuts in the writing and then call the finished piece yours. Even if the model in question hadn’t been trained on stolen work from other authors—which isn’t the case in reality, but let’s consider it just for the sake of argument—if you didn’t write the book, well, you didn’t write the book.
That being said, people are using these new AI tools in all kinds of ways. What about talking to a chatbot to work through ideas as if you were talking to a writing coach? What about teaching AI about your characters and then using it to test different choices your characters might make or reactions they might have? Everyone uses spell-check, and most word processors offer grammar suggestions too, so isn’t that basically using a form of AI?
It’s hard to know what’s acceptable and what falls into a gray area. And among professional writers, ChatGPT can be a sore subject, which makes it even harder to discuss.
What Can ChatGPT Do?
No matter which tool you’re using, whether ChatGPT or whatever tomorrow’s buzzy tech company calls its product, these tools are not like an android in a science-fiction movie. What we call AI in casual speech is actually a large language model (LLM). IBM defines LLMs as “a category of deep learning models trained on immense amounts of data, making them capable of understanding and generating natural language and other types of content to perform a wide range of tasks.”
The words “trained on immense amounts of data” are key—ChatGPT is like the predictive text on your phone, suggesting what word you’ll want next based on your patterns. So when you feed your novel to a bot and ask it for feedback, it isn’t making actual artistic judgments based on your content; it’s writing text based on the patterns it reads in the data it’s been trained on.
Does It Actually Meet Your Needs?
With this in mind, consider the example of using ChatGPT to develop ideas for plots. When the bot says, “yes, that is a good idea for a book” or “no, you need something better,” it’s simply giving you the predicted text.
Now you might argue that predictive text is helpful because it gives you a general idea of how your pitch might be received by the world. But as a writer, is that actually what you need? Isn’t it better to think about how your plot idea gives you opportunities for different kinds of characters, all of whom will have conflicting interests and individual goals?
If you teach AI to talk like your character, that might seem cool; you could ask the bot what it might think, feel, and do in any given situation and talk to it as if the character has come to life. But your characters are defined by the choices and feelings you, the author, give to them. And if you’re really doing your job as a writer, those traits will change as your characters are shaped by the events in your plot. If you read a book where a character’s traits are bullet points that could be plugged into ChatGPT, well, that story is probably going to be pretty boring.
When you’re setting out to write your book, you’d do better to pull away from the flat, generalized kinds of feedback you’d get from a bot that’s pulling from who knows what data.
What About the Boring Stuff?
Say you wrote your book with pen and paper, not using so much as basic spell-check, all with your own mind. When you’re finished, you’ll still need to do the less interesting work of writing marketing copy, pitching to agents, or working with an agent to pitch to publishers.
Couldn’t you just let ChatGPT write that awful query letter? You might already use AI tools at your job to draft emails and intercompany communications, so can’t independent authors use them to draft all the text that would otherwise be handled by a publishing company’s marketing department?
Well, no. Again, it comes back to the fundamental question of who wrote that query letter, that email, that back cover copy. When you’re reaching out to agents, publishers, and potential readers, you are communicating with them personally. Farming out that communication labor, even the boring parts, breaks that social contract.
And even for things like marketing copy, where people are less likely to care if it’s AI-generated text, we’ve all seen how bland and stilted AI text can sound. Is that really the impression you want to give people who are deciding if they want to read your book?
The Work Is the Work
Presumably, people who want to write actually want to write. And, presumably, writers respect the ability of the written word to connect person to person, not person to LLM.
Even if you don’t have the same negative reaction to ChatGPT as some of your peers, it’s still best practice to keep it out of your writing life.
Chelsea Ennen is a writer living in Brooklyn with her husband and her dog. When not writing or reading, she is a fiber and textile artist who sews, knits, crochets, weaves, and spins.