Japan’s government has formally asked OpenAI to stop letting its AI tools “borrow” from Japanese creators — and it did so with unusually direct language. In a press conference this month, officials said manga, anime and game characters are “irreplaceable treasures” and urged OpenAI to prevent conduct that could amount to copyright infringement after a flood of AI-generated videos that closely mimic those styles appeared on the company’s Sora app.
What happened
OpenAI released Sora 2 — a text-to-video product that can generate short, high-quality clips — and users quickly began prompting it to produce anime-style content that looked a lot like well-known Japanese properties. That virality turned into a political problem: ministers and lawmakers in Tokyo say the platform is creating unauthorized likenesses and stylistic copies of works that are core to Japan’s culture and export economy. The Cabinet Office has now put its request on record: please stop infringing on Japanese IP.
Who said what
Minoru Kiuchi — the minister who handles intellectual property and Japan’s “Cool Japan” efforts — framed the complaint in cultural terms, calling manga and anime “irreplaceable treasures” and saying Japan expects companies to respect creators’ rights. Digital minister Masaaki Taira and other officials signaled Tokyo is ready to press for stronger safeguards and, if necessary, use tools in the country’s new AI legal framework to enforce them. Those remarks transformed a social-media flare-up into a formal governmental intervention.
Why this matters
There are three overlapping reasons this feels different from past AI-copyright rows:
- Cultural stakes. Anime and manga aren’t just entertainment in Japan; they fuel exports, tourism and a vast ecosystem of studios, freelance creators and small publishers. When a global AI company appears to mimic those outputs at scale, it triggers both economic and symbolic anxieties.
- Politics + new rules. Japan has recently updated parts of its AI governance regime. Officials have hinted they could lean on those tools if voluntary fixes don’t work — turning what might have been a takedown request into a test case for national AI policy.
- Precedent. If Tokyo presses successfully for upstream changes to how models are trained or how outputs are filtered, other countries with major cultural industries could follow, reshaping how image-and-video models operate worldwide.
How OpenAI ended up in the hot seat
Sora’s rapid rise made it a magnet for style-driven prompts: users asked for “Ghibli-like” scenes, iconic character likenesses, or anime aesthetics, and Sora produced convincing results. OpenAI’s leadership has acknowledged the influence of Japanese creators on the company’s work, but acknowledgment hasn’t satisfied either creators or regulators. The company also initially took a partial “opt-out” approach with rights-holders on Sora — a policy that drew criticism and was later abandoned — leaving questions about how comprehensive its protections actually were.
Separately, the broader debate about whether a studio’s style is legally protectable is messy. Legal experts have long pointed out that while exact character likenesses can be copyrighted, mimicking a style is harder to litigate — which is why governments and trade groups are increasingly interested in regulatory fixes rather than relying on suits alone. Business Insider’s reporting on earlier disputes (for example, around Studio Ghibli–style outputs) shows how difficult the courtroom route can be for rights-holders.
What Tokyo is asking OpenAI to do
Japan’s request is twofold: a polite but firm demand that OpenAI stop conduct that would constitute copyright infringement, and an insistence that the company consult with rights-holders or implement technical protections so that Sora doesn’t routinely output infringing material. Officials also suggested Japan should “take the lead” in shaping international AI norms — in other words, Tokyo wants both remediation and rulemaking.
What could happen next?
- Voluntary fixes. OpenAI could enhance content filters, extend opt-in/opt-out controls to Japanese rights-holders, or experiment with licensing or revenue-sharing deals. Those are lower-friction options that companies tend to try first.
- Regulatory pressure. If voluntary measures fail, Japan could press enforcement under its AI Promotion Act and related IP laws — a move that would raise the political cost for OpenAI and set a global example.
- Industry pushback and legal filings. Studios, publishers and talent agencies could coordinate legal or commercial countermeasures — though, as earlier stylized disputes show, legal wins are not guaranteed.
Why creators are nervous — and what they want
Artists and studios want clarity and control. They’re not just asking to block AI tools; they want transparent processes for opting out, fair compensation if their work is used, and technical defenses that prevent easy, large-scale copying of character likenesses and signature aesthetics. For many creators, this is about survival: AI-generated knockoffs can undercut commissions, merchandising, and a career built on a recognizable visual voice.
Japan’s message was polite but unmistakable: respect our creators or face political consequences. Whether that pressure changes OpenAI’s behavior — and whether Tokyo can turn a national request into a durable global standard — will be one of the defining copyright fights in the age of generative AI.
