I didn't want to tell anyone I "wrote" a book using AI.
For obvious reasons. It’s a cheat.
And yet, the whole experiment would not have been effective had I labored in secret with my chatbot pal, released the thing into the wild, and then sat back and watched it generate, like, 1 sale.
The trick to writing about Artificial Intelligence, to understanding AI, is that you have to actually use AI.
When I started writing this newsletter – the first personal writing I’d done in at least 15 years – I decided that the only way through was complete transparency. If I was going to find any universality in what I wanted to talk about, I would have to be unerringly honest.
So, I told everyone.
And two things happened.
One: some friends were disturbed. One friend said, "Your AI book isn't terrible. It's upsetting."
Another sent me a video about AI's gross electricity and water use.
Two: other friends, from my tech world, were encouraged, asking questions, wanting to get on a call, wanting to know "How was it? Was it effective?"
Clearly I was on to something, but it wasn’t clear, to me or anyone else, what that something was.
I've worked in industry-disrupting, controversial technologies before - Napster, BitTorrent, Rhapsody, hell, even Batter Blaster - so I knew the reactions would not always be positive.
I’ve worked in media, and the intersection of media and technology.
My friend Andrew Anker talks about this intersection in a very enlightening interview from Frontline a few years ago.
In thinking about media as the Old Guard and the Vanguard (and now, probably, the Circus), you start to see how differently certain eras view things whose meaning we believe we understand - like “news” and “journalism” and “publishing.”
They view them very, very differently. As Andrew says, we use the same words but mean different things. And if you add Generative AI to the mix, the picture begins to get even more kaleidoscopic, in the psychedelic sense. In other words, really fucking confusing.
Like, what is “writing”?
Clearly, the words on the page are mine. I wrote or rewrote all of the narrative bits. I conceptualized the whole thing. I likely wrote more of it than anyone would think.
My friend Jack Boulware said he noticed how soulless and banal the book “hype” blurb read. I had to admit that that was the part I actually entirely wrote.
This wasn’t typing in a prompt and then ChatGPT spit out an entire manuscript that I edited. This was more like asking an assistant to do some research, putting the notes in an organized format, and then drafting an original over the top.
Which is exactly what I did for a real, well-known author on a major literary biography about 30 years ago.
Except I didn’t use a 24-year-old intern. I used ChatGPT.
To be extremely clear, the bulk of the words are mine. The structure was worked out with help from ChatGPT. The lists and ideation and charts - ChatGPT. Some of the bad ideas in it - me. Some of the good ideas in it - ChatGPT.
Some of the more dystopian ideas, toward the end of the book, lean a lot heavier on the AI than on me. I didn’t know I’d write about how to restart society and create a tribunal.
I thought I was going to be writing about how to turn pee into potable water.
But my curiosity was piqued, in a Fallout / apocalyptic science fiction sort of way.
Did I actually create something?
Not really.
I hybridized existing knowledge into a statistically likely set of organized ideas and then put a coat of polish on it.
And yet, isn’t that kind of the same thing as creating? Or at least a mashup?
Am I treading into copyright law rather than existential ideas about art?
Is a great cover song art?
Or, to be more precise, is a shitty cover song art?
I have incorporated AI into my daily workflow, but so have you, if you create a Genmoji on your iPhone, play a video game, or listen to an algorithmically generated playlist.
This was in my inbox this morning: “Farhad Manjoo, a former New York Times and Wall Street Journal columnist, reveals his AI-enhanced writing workflow, from research to finding the perfect metaphor, and how these tools have transformed his creative process without replacing his unique voice.”
Sure, a query on ChatGPT consumes 10x more power than a Google search. But watching a YouTube video for 1 minute dwarfs them all, by a factor of 20-50x.
I get it - hey, the earth is a blasted, hollowed-out, dystopian hellscape, but I got my grocery list done a lot faster.
I am reminded of the superhero who could perform amazing feats, but every time he did, he inched minutes closer to his own death.
On the whole, was this successful? No.
I didn’t even finish the experiment. I didn’t practically market the book. I stopped in my tracks.
Fear? Discomfort?
A little of both, but I also got distracted by some real world stuff that might be more consistent, stable, and less experimental.
My comfort level with my current career situation is, seemingly, directly related to the quality of my creative output.
Creative limitations can inspire great art, but panic does the opposite.
It’s like desperation. They smell it on you.
The muse thinks, Don’t try so hard, you’re overdoing it.
If you don’t put in the time, what you get out is unearned.
I have to say, I knew this going in.
But I wanted to see if there was a possibility that I could tap into something greater than what I could do alone, in the short amount of time it took me.
My better instincts always trip me up, though. Rather than release the book under my own name, I chose a semi-cute pseudonym that only I *got*.
Robert Neville is the name of the main character in Richard Matheson’s I Am Legend, aka The Omega Man, aka The Last Man on Earth.
I couched anything I wrote about the experience in careful context, rather than just putting it out and seeing the results.
It turns out, if you tell people your book isn't really a book and it kind of sucks, people don't buy it.
My idea had been to use the Minimum Viable Product model on publishing, and see if interest could be measured with a minimum of time and investment.
As someone who has thrown a lot of spaghetti at the wall, and who is on the other side of 50, I feel like I need to find out fast whether what I’m working on is worthwhile.
This is not to say I don’t believe in myself.
Writing cannot be done successfully with “a minimum of time and investment.” Even if it’s high-concept nonfiction.
But unearned value in creative enterprise has no value at all.
The most successful newsletters I’ve written, as determined by the number of new readers added and the revenue generated, were the pieces that were hardest to write. About grief and loss.
Yet, you can’t make a creative career out of just writing about despair and death.
Not to mention, it would get old fast.
And sometimes, I feel like I’m just getting old fast.
Thanks for reading Are You Experienced. This thing is written by Nick Tangborn, entirely. You can reach me at nicholas (at) areyouexperienced.co as always.