AI v. Mortality
Should you possess as your first name a goldilocks name, not ubiquitous yet not entirely obscure, not dated-sounding yet not so new-sounding as to clearly be an emerging trend, you are almost guaranteed sooner or later to learn that your name has been impressed into service as the brand name of some tech service or other. Cora is now a tampon company. Casper, a mattress company. Marcus? Some project of Goldman Sachs. And please spare a thought for the Alexas and Siris of the earth, whose lovely names have been commandeered by two of the biggest and most powerful corporations ever to exist, to stand in for their automated voice assistants. Hey Siri, we say now, rudely addressing our devices. Hey Alexa.
I guess I’m lucky that Jasper—after decades of service mainly as a high-volume dog name but also as my own—has only now been pressed into service as a tech product name, and it’s not for a voice assistant! That’s the good news. The bad? Jasper, the company, sells an AI copywriting tool that aims to put incarnated writers, like yours truly, out of business.
This, I will admit, has piqued my curiosity. Is an “AI writer” really going to take my job? Or will it at least change how I carry that job out, exponentially upping my productive capacity? To put it the way an AI might, it’s made me curious how to 10x my content production.
That’s a screenshot from an actual email this company, Jasper, sent me after I tried to use their tool. (When I created an account, they demanded a credit card; the fine print said that after a 5-day trial they would charge me $480; I declined.) I think this email captures rather perfectly the ridiculousness of what today passes for AI—and why I feel basically sanguine about attempts to use computers to replace me and other writers.
Here’s the thing: AI writing tools promise to help us create more written output faster—to “10x your content production.” But that’s a solution in search of a problem. The problem with writing isn’t that there isn’t enough of it. Hahahaha, no, that is not the problem. The problem with writing, obviously, is that we only live on this earth for a limited number of days, we only get to read so much, and we don’t want to feel our time has been wasted. Unfortunately for the AI automators, so far all but a thin sliver of AI writing—and that largely limited to explicitly AI art projects like this beautiful essay by Vauhini Vara—is just not worth your or anyone else’s time.
The trouble is, AI writing lacks care. I mean this in all senses of the word. On the small scale, the AI isn’t careful, and its writing is often full of falsehoods and other errors. But in the bigger sense, too: the AI doesn’t care about what it’s writing. It’s not invested, emotionally or intellectually. It’s just riffing. As Robin Sloan writes in a recent newsletter:
The thing to know about the AI language models, OpenAI’s GPT-3 and its cousins, is that they are fundamentally bullshitters. The bullshit has gotten better and better, but at the core … well, there’s nothing at the core.
The AI, lacking a soul, is profoundly uninterested. It doesn’t care, so it simply generates what it’s been programmed to think you’ll want to hear. There is no insight. There is no curiosity. At least not on the part of the machine.
More than anything, raw AI prose reminds me of the way that in the NBA 2K video games you can play against the computer—but you can also set up the computer to play against itself. Back and forth, the computer will slowly simulate an entire game on its own. And in many ways it might be similar to watching an actual, live basketball game. For myself, I’m a big basketball fan—I’m literally writing this essay after watching an entire basketball game on TV—but, man, you couldn’t pay me enough to more than glance at a simulated basketball game. It turns out, a big part of why I watch these games is to experience things that cannot be simulated: competition, creativity, human fallibility, and the possibility of the unexpected.
I would venture that many, even most, people who read books for pleasure do so not just for plot but for the same human elements that draw us to basketball. Certainly these are among the primary driving forces behind highbrow fiction, essays, and poetry.
The writers who do seem to have a use for tools like jasper.ai are, not coincidentally, the same ones who are incentivized to value quantity over quality. Josh Dzieza’s fantastic piece in The Verge, “The Great Fiction of AI,” is about just this question: how will increasingly powerful AI tools be used by writers? Understandably, he focuses on a genre novelist, Jennifer Lepp, who writes cozy paranormal mystery novels published directly to Amazon’s Kindle marketplace. She ends up turning to AI because of the insane pace she must keep in order to make enough money to survive: 6 novels per year to start with, and then, as Amazon’s service becomes ever more crammed with other pulpy novels, 10 per year. (This seems like a particularly degraded version of a “dream job.”) The AI does help Lepp somewhat in her unceasing toil cranking out these novels—but only because readers in this genre and on this platform have a seemingly bottomless appetite for fairly repetitive plots and stories.
Eventually, though, even Lepp finds that the AI is writing stories without soul. She takes back the reins and uses the AI program (in her case Sudowrite) only for the parts of her books she doesn’t care about. Here’s a passage from the piece describing that:
“Like I know we’re going into the lobby, and I know that this lobby is a secret paranormal fish hospital for naiads, but I don’t particularly care what that looks like other than that there’s two big fish tanks with tons of fish and it’s high-end,” she explained. So she tells it that, and it gives her 150 words about crystal chandeliers, gold etching, and marble. “My time is better spent on the important aspects of the mystery and the story than sitting there for 10 minutes trying to come up with the description of the lobby.”
For myself, I struggle to imagine simply not caring about a description. If I don’t care about it enough to actually write it, then why would it be in the novel at all? Isn’t it the height of rudeness to expect a reader to care enough to read something that I literally didn’t care enough to write?
All this is not to say I’m uninterested in these tools; I do find AI writing at least a little bit interesting. But I just can’t bring myself to believe that it will ever fundamentally change the way I write. Nor do I believe that it will crowd out the market for good writing, the way many of its boosters seem to think it will. At most, it may flood the market with half-baked, soulless crap that doesn’t respect readers’ time. Perhaps I’m simply not cynical enough, but I think such a future would have the central effect of raising, not lowering, the premium on truly great, thoughtful writing.
Last night I was reading Mary Gaitskill’s review of Blonde, the novelization of the life of Marilyn Monroe that Joyce Carol Oates published in 2000. (I found it because Gaitskill sent out a PDF of it in a newsletter panning the film adaptation.) In the review, Gaitskill ultimately ends up advocating for the book as a powerful exploration of psyche, sex, and the entrancing figure of Marilyn. But she takes a winding road to get there. And she’s not afraid to be crass. (Heads up: the passage discusses sexual violence.) Here are the first two paragraphs of her review:
Get back to me when an AI writes something as sharp, funny, and full of idiosyncratic but ultimately moral insight as that. Till then, I think we writers will still have jobs.
Here’s the real trouble: how in the hell do we learn to write on the level of Mary Gaitskill? That’s something neither Jasper has yet achieved.