Hollywood is kicking off one of its first large-scale, coordinated fights against generative AI, and it’s doing it with a blunt, meme-ready slogan: “Stealing Isn’t Innovation.” At its core, this is less a niche industry squabble and more a high-stakes argument about who gets to profit from human creativity in the age of AI — Big Tech or the people who actually make the stuff everyone loves.
The campaign is being driven by the Human Artistry Campaign, a coalition that’s quietly grown into a serious power bloc made up of unions, rights groups and trade bodies spanning film, TV, music, news, sports and publishing. Its new push, branded “Stealing Isn’t Innovation,” officially launches with a broadside against tech companies that have trained generative AI models on copyrighted scripts, songs, books, images and performances without asking permission or paying for any of it. The group’s message is deliberately simple: if your AI business model depends on vacuuming up copyrighted work without consent, that’s not disruption, it’s theft.
To make that point land beyond policy nerds and copyright lawyers, the campaign has recruited what is essentially an A‑list cast for an industry-wide protest. Scarlett Johansson, Cate Blanchett, Joseph Gordon-Levitt, Jennifer Hudson, Kristen Bell, Olivia Munn, Sean Astin and “Breaking Bad” creator Vince Gilligan are among the actors and filmmakers putting their names on the effort. Musicians range from Cyndi Lauper, LeAnn Rimes and Questlove to bands like R.E.M., MGMT, OK Go and OneRepublic, while authors including George Saunders, Jodi Picoult, Roxane Gay and Jonathan Franzen have also signed on. In total, more than 700 creators are backing the campaign at launch, with organizers saying that number is already climbing.
This isn’t just a Change.org petition with some famous signatures, either. The New York Times — which is itself suing OpenAI and Microsoft over alleged misuse of its journalism to train AI models — has thrown its weight behind “Stealing Isn’t Innovation” with a coordinated ad campaign across print, digital and social. The Times’ publisher A.G. Sulzberger framed it as a fight against “systematic theft” by AI firms that have scraped news, books, music and more to build commercial products without consent. The Human Artistry Campaign’s own messaging leans into the same idea, warning that unlicensed data-mining is not just a legal gray area but “massive and unprecedented theft” that could gut the creative middle class.
If that language sounds familiar, it’s because Hollywood has been building to this moment for a while. During the 2023 writers’ and actors’ strikes, generative AI went from a futuristic talking point to a red-line bargaining issue, with the Writers Guild of America ultimately securing contract language that bars studios from using AI to write or rewrite scripts and prevents AI-generated text from being treated as “source material” that could undercut a writer’s credit and pay. SAG-AFTRA followed with hard-fought protections for performers’ likenesses and voices, after early deals revealed how easily a “background” scan could turn into a studio owning a digital double for life. Those negotiations framed AI as a labor issue; “Stealing Isn’t Innovation” is the next evolution, targeting the upstream pipeline: the training data itself.
What the campaign wants, put simply, is to flip the current default. Today, most big AI companies operate on an opt‑out model: they assume they can train on your work unless you jump through hoops to say no, often using fragmented or barely public processes. The Human Artistry Campaign is pushing for the opposite — an opt‑in, licensing-first system where companies must explicitly secure rights before ingesting creative works into training data. That includes concrete asks: legally enforceable licensing frameworks, the right for artists and rights holders to refuse being used as training material, and stronger enforcement against deepfakes and AI impersonations that muddy the waters between real and synthetic content.
Underneath the rhetoric, there’s a cold economic argument. The campaign frames unlicensed AI training as a direct attack on one of the U.S.’s most successful export industries: entertainment and media. The creative economy supports millions of jobs — not just household-name stars, but staff writers, session musicians, makeup artists, animators, journalists and countless below‑the‑line workers whose livelihoods depend on continuing demand for new, original work. The fear is that if AI companies can saturate the market with AI-generated “content” trained on existing material — and do it at scale and near-zero marginal cost — it strips away both the financial incentive and the negotiating power for human creators.
There’s also a culture war angle baked into the messaging. The Human Artistry Campaign repeatedly calls out “AI slop” — a term that’s become shorthand for low-effort, algorithmically generated junk flooding feeds and recommendation systems. Creators worry that the same platforms that already amplify engagement bait will happily serve up slightly‑remixed, barely‑original AI music, video and writing trained on their work, pushing their actual output further down the algorithmic stack. When they talk about defending “original thought and expression,” it’s partly about money, but it’s also about not letting the culture be defined by statistically plausible mashups of what already exists.
Of course, Hollywood’s relationship with AI is not purely adversarial. The industry is simultaneously experimenting with what sanctioned, paid‑for AI partnerships look like. Disney is the most visible example: in December, the company signed a three-year deal reportedly worth around $1 billion with OpenAI, aimed at bringing some of its iconic characters into the video-generation platform Sora. That deal came after a wave of anger when Sora 2.0 was shown generating video featuring recognizable characters from franchises like “Bob’s Burgers,” “Grand Theft Auto” and “SpongeBob SquarePants,” even though rights holders had not signed off. By inking a licensing agreement, Disney arguably legitimized OpenAI’s tech — while also signaling that if you want to play with its IP going forward, you’re going to pay.
That “two tracks at once” dynamic is important: on one side, artists and unions are trying to put hard legal and ethical rails around AI’s use of existing works; on the other, studios and conglomerates are cutting deals that could normalize AI use as long as the check clears. The Human Artistry Campaign leans into this tension but doesn’t reject AI outright. Its line is that there is a “better path” where AI can develop rapidly, but only when companies license content, share revenue, and collaborate with the people whose work powers their models. That pitch is designed to appeal to lawmakers and regulators as much as the public: this isn’t about banning AI, it’s about insisting that the tech sector follow the same copyright rules everyone else already lives under.
What makes “Stealing Isn’t Innovation” feel like a turning point is the way it connects dots across industries that don’t always move in sync. In the past two years, authors have sued over AI-generated “shadow libraries” trained on pirated books; visual artists have taken on image generators; news organizations from the Times to local publishers have started pushing back on AI firms lifting their archives. Now, those fights are being bundled into a single narrative that casts generative AI as a kind of industrial-scale copy machine, and creators as the ones being copied without being asked. The campaign’s website and social content are clearly built to be shareable, with posters, slogans and calls to action that can be screenshotted and circulated beyond trade press and policy circles.
For everyday viewers and listeners, the stakes might not be obvious yet; your favorite show still drops on time, your playlists still update, your feeds still scroll. But the people behind those stories are sending a pretty unambiguous warning: if AI development keeps leaning on “ask forgiveness, not permission” when it comes to training data, the pipeline of new, weird, risky human-made art gets thinner. One of the campaign’s bluntest lines sums it up: taking creators’ work without consent “is not innovation. It is not progress. It is theft — plain and simple.”
In other words, Hollywood’s AI fight is shifting from the picket line to the narrative battlefield. Legislatures, regulators and courts will ultimately decide how far AI firms can go in scraping the world’s culture to feed their models, and whether artists have any practical way to opt out. “Stealing Isn’t Innovation” is an attempt to set the terms of that debate early — not just in dense policy filings, but in the kind of clear, emotional language that tends to win when history looks back.