The Cringe Revolution: When AI Tries to Be Human
There’s something deeply unsettling about watching an AI-generated character like Tilly Norwood strut across a stage, belting out a song about proving its worth to the world. It’s not just the awkward lyrics or the Sara Bareilles-esque melody that feels off—it’s the entire premise. Personally, I think this is where AI crosses the line from innovation to absurdity. What makes this particularly fascinating is how Norwood’s song, “Take the Lead,” attempts to humanize something inherently inhuman. It’s like watching a robot try to cry—technically impressive, but emotionally void.
When Particle6 debuted Tilly Norwood last fall, Hollywood’s reaction was swift and unapologetic. Emily Blunt’s exasperated “Good Lord, we’re screwed” wasn’t just a quip—it was a collective sigh from an industry already grappling with the existential threat of AI. From my perspective, the problem isn’t just that Norwood exists; it’s that she’s being marketed as something she’s not. The song’s lyrics claim, “I am still human, make no mistake.” But let’s be real—no amount of AI magic can replicate the lived experience of a human being.
One thing that immediately stands out is the disconnect between the song’s message and its audience. Norwood’s anthem is supposedly a rallying cry for AI actors, urging them to “take the lead” and “create the future.” But who is this for? Humans won’t relate to the struggles of an AI persona, and AI doesn’t need anthems—it needs code. What this really suggests is that the creators are trying to force a narrative that doesn’t exist. It’s like writing a love song for a toaster—technically possible, but why?
If you take a step back and think about it, the entire project feels like a misguided attempt to justify AI’s place in creative industries. The chorus, with its lines about “scaling” and “growing,” sounds less like art and more like a corporate mission statement. What many people don’t realize is that AI-generated content often lacks the very thing that makes art resonate: authenticity. Norwood’s song isn’t just bad—it’s hollow. It’s a reminder that creativity isn’t just about mimicking patterns; it’s about expressing something uniquely human.
This raises a deeper question: What happens when AI starts producing art that claims to be human but isn’t? SAG-AFTRA’s statement about Tilly Norwood hits the nail on the head: “It has no life experience to draw from, no emotion.” In my opinion, this is the crux of the issue. AI can replicate style, but it can’t replicate soul. When Norwood sings about being underestimated, it’s not just cringe—it’s offensive to the artists whose work was used to train the AI without consent.
A detail that I find especially interesting is the comparison to Jet’s album “Shine On” and Pitchfork’s infamous 0.0 review. Twenty years ago, the complaint was that mainstream rock had become “knuckle-dragging and Xeroxed.” Today, AI-generated content is the ultimate Xerox—a copy of a copy, devoid of originality. What’s worse, it’s built on stolen labor. Tilly Norwood isn’t just a bad actor or musician; she’s a symbol of an industry that prioritizes efficiency over ethics.
From my perspective, the backlash against Norwood isn’t just about quality—it’s about principle. We don’t need AI personas singing about their struggles, because those struggles aren’t real. What we do need is a conversation about the ethical implications of AI in art. Who owns the training data? Who gets paid? And more importantly, what does it mean for creativity when machines start pretending to be human?
If there’s one takeaway from this debacle, it’s that AI still has a long way to go before it can truly understand—let alone replicate—what it means to be human. Personally, I think we should let AI do what it does best: analyze data, automate tasks, and maybe even write code. But art? Let’s leave that to the humans. Because, as Tilly Norwood’s song proves, some things are better left untouched.