AI Can Write, But It Can Never Be a Storyteller
Even if AI gets 10x better, there's one thing it will never have.
Note: In an effort to go all-in on this “human writing for other humans” thing, I recorded an audio version of this essay above. I’m no Ezra Klein, so let me know what you liked or what I should do differently next time! And if you’re new here, I’m Joe Lazer — a journalist-turned-CMO on a quest to help people discover the power of storytelling. Thanks for reading/listening.
*****
In 2007, Chad Hurley, co-founder of YouTube, was nervously standing by the door at the All Things D conference, waiting to meet his idol.
Moments later, George Lucas approached. Hurley worshipped Lucas for being Hollywood’s greatest innovator — from Star Wars to Pixar. Hurley beamed. Lucas grimaced.
“You’re ruining storytelling with your service,” Lucas told the young YouTube co-founder, according to Kara Swisher’s memoir, Burn Book. “What you do is like throwing puppies on the freeway.”
Ouch.
On stage the next day, Lucas revealed what he meant by this vicious burn. Early YouTube, Lucas said, was random, grotesque entertainment. “The movie term is throwing puppies on a freeway. It’s very easy. You sit there and watch and see what happens. You don’t have to write anything. You don’t have to do anything.”
By contrast, Lucas said, true storytelling is an intentional act that reveals a deeper truth based on subjective human experience.
Lately, I can’t stop thinking about this story. When I see tech bros hawking hacks for creating AI-generated stories and “thought leadership,” this is what I want to scream: What you do is like throwing puppies on the freeway.
But then I remembered: YouTube didn’t ruin storytelling. Soon after Lucas took down Hurley, YouTube realized it desperately needed great storytellers to sustain its growth, investing hundreds of millions of dollars to recruit storytellers to its platform.
We’ll likely see the same thing now: no matter how good generative AI gets, great storytellers will be the hardest thing for it to replace.
I've spent the past few years immersed in the strange world of generative AI. I spent three years as a marketing exec at an AI company and still consult for them, running their future-of-work research programs. Along the way, I’ve realized most of the hype about AI replacing creatives is downright wrong.
Without a doubt, advanced large language models (LLMs) like Anthropic’s Claude Sonnet and OpenAI’s GPT-4o have become strong technical writers. They do formulaic writing tasks — landing page copy, nurture emails, direct-response ad copy, data analysis, that kind of thing — better than half the copywriters I’ve worked with. OpenAI’s new $200/month Deep Research tool is a B+ research assistant — it’d get an A if it’d stop hallucinating — with impressive reasoning and analytical capabilities. These tools often feel like magic, and I expect them to keep getting better.
But even if they improve 10x, there’s one thing they’ll still lack: the kind of understanding or perspective that makes storytelling meaningful to human beings. As a result, over two years post-ChatGPT, AI still fails at what I’m calling the Lucas Test: creating content or stories that anyone would choose to read or watch.
So far, the content that AI produces is either a) bland SEO slop, b) research briefs, or c) the modern equivalent of throwing puppies on the freeway—content that’s grotesquely interesting only because you know it was created with AI. (Think of all those bad AI short films and trailers all over LinkedIn).
Tech people often overhype AI because they conflate the act of writing with the art of storytelling. They dismiss our personal connection to stories and storytellers as fluff, even though there’s a science behind it. We’re wired for stories. When we hear a great story, our oxytocin spikes in the same way it does when we’re children in the warm embrace of our mother.
When we love a writer, we don’t just love them for their ability to string words together in a logical order or summarize information. We love them for their stories — and the voice, perspective, and insight that shines through those stories. As Ted Chiang wrote in his excellent New Yorker essay, what makes a story original and meaningful isn’t that the content is entirely novel. It’s that it comes from you.
As Chiang also notes, what makes a story interesting is each choice that the storyteller makes. Every word and sentence is a choice. So are the plot twists and the little moments you zoom in on and blow up in technicolor. When you outsource your writing to AI with a prompt, you’re outsourcing these choices. It says nothing about you. AI tools like ChatGPT are probabilistic models that predict the most likely next word in a sentence. By design, they produce an averaging out of everything on the internet.
What could be less interesting or meaningful?
AI will keep getting better at the act of writing — it’ll stop sounding like an insufferable Harvard sophomore, “delving” into topics and insisting on “finding us well.” Just as YouTube became a platform for a new generation of storytellers, AI’s broad capabilities — from research to brainstorming to copy-editing — have the potential to automate the bullshit and give us the tools to take our stories to new heights.
But even if these systems get 10x better at writing, they won’t have the stories and experience to connect with an audience — they won’t know the trauma, joy, love, hate, and wonder that binds us.
AI can write, but it can never be a storyteller.
(At least until AI develops true sentience — at which point, we're going to have a new literary genre, and I can’t wait to read ChatGPT’s book of personal essays about how Sam Altman f*cked it up at a young age.)
Recommended Reads and more
The Deep Research Problem (Ben Evans): Excuse me for some nerdy AI recs this week, but I’m sharing two pieces that I think are really helpful. The first is this piece from analyst/writer Ben Evans on OpenAI’s new Deep Research tool. Deep Research is impressive; it searches the web for 10-30 minutes and comes back with well-cited research briefs. I’ve been using it for everything from background research for my new book to competitive analysis for a client. But like all AI tools, it hallucinates or takes shortcuts to please you, and that creates a big problem when you need nuance and accuracy in your work. As Ben Evans writes, AI right now is “infinite interns” — it saves you a ton of time if you have the domain knowledge to fix its mistakes. But it’s not an employee.
A New Generation of AIs (Ethan Mollick): For a more bullish take on the latest AI advancements, here’s Co-Intelligence author Ethan Mollick. AI systems have taken a leap in the last two months, and you don’t really realize how much time they can save you until you start experimenting with them. Read these two pieces, and you’ll have a pretty balanced understanding of where AI stands today, and where it can (and can’t) help you.
The State of AI Innovation 2025 (A.Team): One final AI piece — I just conducted this large-scale AI research project for A.Team and the findings are pretty fascinating. Check it out.
Arverne - Queens (Rob Stephenson / The Neighborhoods): Photographer Rob Stephenson’s deep dives into the history, culture, and gritty sites of every neighborhood in New York are one of the true gems on Substack. I’m obsessed with New York City beach towns, and this history of how Arverne — a neighborhood in the Rockaways you’ve never heard of — transformed from a European-style paradise into an abandoned urban wasteland is simply incredible.
The End of Children (Gideon Lewis-Kraus / The New Yorker): An incredible feature on the global population crisis, which — like everything else — has split along partisan lines. Elon Musk says it’s the greatest threat to humanity. Progressives say it’s a narrative crafted to usher in a Handmaid’s Tale age. This in-depth reporting from South Korea — where the fertility rate has dropped to 0.7 — shows it’s much more complicated than that.
A Real Pain (Movie / Hulu): Absolutely fucking loved this tight-90 starring Jesse Eisenberg and Kieran Culkin about two Jewish cousins who reunite for a trip to their nana’s hometown. Yes, I’m 100% the target demo for this, but it has an Oscar nomination for Best Original Screenplay for a reason.
Anora (Movie / All the platforms): It’s so basic to recommend the Oscar favorite for best picture, but Anora is a stylistic spectacle, and an incredible example of character transformation. And it’s such a New York movie. Watch it before the Oscars on Sunday. You won’t be disappointed.
I’m the best-selling author of The Storytelling Edge and a content nerd. Subscribe to this newsletter to get storytelling and audience-building strategies in your inbox each week.
How I used GenAI in this post (Read this post for why I think disclosing this is important / useful):
Nothing for this one! A 100% human hot take.