Nº 060 · AI · 7 min read · March 28, 2026

Adobe Just Let 30 AI Models Into Firefly. Here's the One Feature That Actually Matters.


The Wrong Headline

Adobe announced this week that Firefly now integrates over 30 third-party AI models — Google Veo 3.1, Runway Gen-4.5, and Kling 2.5 Turbo among them. Every tech outlet covered it as "Adobe adds more AI models."

That's the wrong headline.

The story is Custom Models: Adobe expanded access to Firefly Custom Models, which lets you train a reusable model on your own images and visual style. For independent creators and small production companies, this is the feature that changes the economics. Let me explain why.

What Custom Models Actually Do

Brand consistency has been the hardest problem in AI-assisted production. You can generate beautiful images. You cannot reliably generate images that look like they belong to the same visual universe — same lighting logic, same color temperature, same character design, same product representation — unless you're manually prompting for every detail, every time.

Custom Models changes this. You feed Adobe a set of reference images — your brand's approved visual library, your product photos, your established color palette — and it trains a model that encodes your aesthetic DNA. Every subsequent generation inherits that DNA without you having to re-specify it in the prompt.

For a brand with an established visual identity, this means: consistency at scale. The 50 pieces of content per month that would otherwise require a creative director reviewing every output can now start from a model that already knows what "on-brand" looks like.

The Production Case for Independent Creators

The feature was previously available only to enterprise accounts. Expanding access is what makes this story relevant to anyone who isn't a Fortune 500 brand team.

Here's the practical scenario: I've been working with AI generation tools for a production company that needs consistent brand imagery across 40-50 deliverables per month. The current process involves detailed prompt templates, a style guide document, and manual review at every stage. It works. It's slow.

With Custom Models, the workflow becomes: build the model once from approved reference images, generate at volume, review exceptions rather than every output. That's a fundamentally different labor equation. The creative director's time goes toward what requires creative judgment, not what can be pattern-matched.
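That review-by-exception loop can be sketched in a few lines. This is a hypothetical illustration, not Adobe's API: `generate_image` and `on_brand_score` are placeholder names standing in for whatever generation endpoint and brand-consistency check your pipeline actually uses.

```python
# Sketch of a review-by-exception pipeline. `generate_image` and
# `on_brand_score` are hypothetical placeholders, not real Firefly calls.

def generate_image(prompt: str) -> str:
    """Stand-in for a call to a custom-model generation endpoint."""
    return f"render of: {prompt}"

def on_brand_score(image: str) -> float:
    """Stand-in for an automated brand-consistency check (0.0 to 1.0)."""
    return 0.9 if "logo" not in image else 0.5

def batch_generate(prompts, threshold=0.8):
    """Generate at volume; route only low-scoring outputs to human review."""
    approved, needs_review = [], []
    for prompt in prompts:
        image = generate_image(prompt)
        # Only outputs below the threshold reach the creative director.
        if on_brand_score(image) >= threshold:
            approved.append(image)
        else:
            needs_review.append(image)
    return approved, needs_review

approved, flagged = batch_generate(["hero banner", "product logo shot"])
```

The structural point is the routing, not the scoring: the creative director's queue shrinks from every output to only the flagged exceptions.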

The 30 Models Question

On the model catalog expansion: having Google Veo 3.1, Runway Gen-4.5, and Kling 2.5 Turbo inside Firefly is genuinely useful — not because of any single model's capabilities, but because of unified billing and workflow. Right now, serious creative teams are juggling accounts, APIs, and billing relationships across five or six different platforms. Firefly as a hub for multiple models simplifies the operational side.

That said, integration quality will determine whether this is useful or just a checkbox. A model that works well through its native API sometimes degrades when accessed through a third-party wrapper. I'll run comparative tests when I have more time with the new integrations.

What This Means for the Independent Creator Economy

I've argued consistently on this site that the most significant AI story for independent creators isn't the headline capabilities — it's the infrastructure maturation. Tools moving from "impressive demo" to "reliable workflow component."

Adobe's Firefly expansion is infrastructure maturation. Not a capability breakthrough. A reliability and accessibility upgrade. Custom Models moving from enterprise-only to broader access is exactly the kind of shift that changes what small teams can produce without enterprise budgets.

The creative director skill — knowing what good looks like, recognizing when an output is off-brand, making judgment calls that require taste — remains the scarce resource. The tools are becoming easier to direct well. That's a good problem to have if you have the taste to direct them.

How I Would Actually Use Custom Models on a Real Account

For a brand client at Pichorra, the first thing I would feed the model is not the brand book. It is the rejected work. The dozen frames the creative director shot down in the last campaign for being "off." That negative training set teaches the model the line a brand draws between on and off, which is more useful than the official approved-asset library, because the official library is what survives. The rejects are where the actual taste lives.

Then I would feed the approved set on top. The model trained on both the floor and the ceiling produces output that holds up in client review better than a model trained only on the ceiling.
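In practice that means making both sets explicit before anything is uploaded. A minimal curation sketch, assuming a local asset library with `approved/` and `rejected/` folders — the directory names and labels are my own convention, not an Adobe training format (and since Custom Models takes reference images, the rejected set may end up informing curation rather than being uploaded directly):

```python
from pathlib import Path

# Hypothetical curation step: enumerate both the approved library (the
# ceiling) and the rejected frames (the floor) as labeled examples.
# Folder names are assumptions about your own asset library.

def collect_training_set(library: Path) -> list[dict]:
    examples = []
    for label in ("approved", "rejected"):
        for img in sorted((library / label).glob("*.png")):
            examples.append({"path": str(img), "label": label})
    return examples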

The Honest Concern

Custom Models trained on your visual library encode your brand's aesthetic into Adobe's infrastructure. Read the data agreement before uploading. Different tier of subscription, different data handling. For a brand with proprietary visual IP, this is a contract question, not a creative one. Get the legal team in the loop before you train.

Source: Adobe Blog — Firefly expands video and image creation with new AI capabilities and Custom Models
