Two Stories the Industry Got Wrong
The past few months have given us two major AI-in-film stories that generated enormous noise and mostly shallow analysis. First: OpenAI's Sora faced backlash from artists, producers, and industry unions who argued the tool was trained on copyrighted footage without consent. Second: Disney announced an expanded AI partnership that was immediately framed as either "Disney is killing jobs" or "Disney is ruining cinema," depending on which corner of the internet you were reading.
Both stories were covered as if they were simple. Neither is. And the noise around both has obscured a more interesting question: where are AI creative tools actually delivering genuine value — and what does that tell us about where the technology is headed?
Sora: The Criticism Is Right, But It Misses the Larger Point
The training data concerns around Sora are legitimate. The AI creative tool industry has a real, unresolved problem with consent and compensation for the creative work used to train these models. I'm not going to pretend that's a small issue — it affects creators I know personally, and the current legal and ethical frameworks aren't adequate.
But the criticism often conflates two separate questions: how the model was trained, and what the model is capable of. The first question is about ethics and policy. The second is about creative utility. Both matter, but treating them as the same conversation leads to bad analysis.
Sora, as a technical achievement, is remarkable. The temporal coherence — its ability to maintain consistent subjects, lighting, and physics across generated video — is a genuine step forward from anything that existed 18 months ago. For specific use cases such as previsualization, concept development, motion reference, and background generation, it delivers real value to working filmmakers regardless of where you stand on its training methodology.
Disney's AI Partnership: What's Actually Happening
Disney's AI push isn't about replacing storytellers. Anyone who's spent time in a studio environment knows that the bottleneck in content production isn't creative talent — it's production cost and production speed. Disney has more stories to tell than it can produce. AI tools that accelerate pre-production, reduce VFX costs for non-hero shots, and allow smaller teams to produce reference-quality previsualization directly address that bottleneck.
The jobs at risk inside a studio like Disney are the same jobs at risk everywhere else in production: highly technical, execution-focused roles that involve applying established techniques to defined tasks. The jobs that are safe are the ones that require judgment: directors, writers, production designers, and — critically — the producers who understand how to manage the increasingly complex integration of AI and human creative work.
What These Tools Are Actually Getting Right
Setting aside the controversy, here's where AI creative tools are delivering genuine, measurable value in 2026:
- Previsualization at production scale. What used to require a specialized previs team now happens in real-time conversations between directors and AI tools. Shot blocking, camera movement, basic lighting decisions — all of this can be explored before a single crew member is booked.
- B-roll generation for editorial. For documentary and editorial projects, AI-generated supplementary footage that complements but doesn't replace primary material is already standard practice in the edit rooms of major outlets.
- VFX efficiency for non-hero shots. The establishing shot of a city. The background crowd in a period piece. The sky replacement in a scene where the weather was wrong. These are expensive but invisible elements that AI handles competently at a fraction of traditional VFX cost.
- Audio post acceleration. AI-assisted dialogue cleanup, music sync, and sound design routing have compressed post-audio timelines significantly for projects without complex mixing requirements.
The Conversation We Should Be Having
Instead of arguing about whether AI tools are good or bad for the film industry, the more useful conversation is about governance. Who owns the work? Who gets compensated when their creative work trains a model? What's the appropriate level of disclosure when AI-assisted production is used? How do unions and guilds adapt their frameworks?
These are policy questions, not creative questions. And they're more important than the weekly debates about whether any specific AI video tool is impressive or disappointing. The technology will continue to improve regardless. The governance conversation can't wait for the technology to settle.
The filmmakers and studios who engage seriously with both questions — the creative applications and the governance implications — will be the ones who shape how this plays out.
What I Would Tell a Producer Asking What to Do Today
If you run a small production company, the answer is not to wait for the policy questions to settle. Use the tools that have legitimate licensing in place now. Document your creative process. Keep a record of what is human-made and what is AI-assisted in every project — not because anyone is asking yet, but because they will.
The studios that treat governance as a problem to figure out later will lose talent. The ones that build it into the workflow now will be the ones cast and crew want to work with.
Adopt the tools, but do not skip the paperwork. It is a boring answer that turns out to be the difference between a sustainable operation and a deposition.