Why I Can’t Stay Silent Anymore
The story of how Adobe used nearly 12,000 of my images to train their AI — and why I'm fighting for creators' rights.
I Was Never Supposed to Be the One Talking
First, let me start by saying: I am not a content creator in the way you might think. I’m not someone who wakes up and thinks about going viral. I’m a behind-the-scenes guy. Always have been.
I built a platform called Diversity Photos. It’s a stock photography collection — nearly 100,000 images — specifically created to represent communities that the stock photo industry has historically overlooked. Black families at the dinner table. Latino professionals in a boardroom. Asian elders at a park. Muslim women at work. The everyday moments that exist in the real world but somehow didn’t exist in the visual content industry.
This was curated, intentional work. Not scraped from the internet. Not mass-produced. Every image was created with explicit consent and a focus on the representation of marginalized groups. That’s what made it valuable. That’s what made it different.
In April 2018, I signed a Stock Contributor Agreement with Adobe. The deal was simple and the deal was fair: I provide images to Adobe Stock, Adobe distributes them to end users, and we share the revenue. That’s it. A revenue share model. Two parties making money together.
The agreement called the license they needed exactly what it was: “License We Need to Distribute Your Work to Our End Users” and “License We Need to Promote Your Work.” That’s the language. That’s the context. That’s the purpose of the whole arrangement.
There was no section called “License We Need to Use Your Work to Train Our AI Models for Free and Cut You Out of Any Resulting Revenue.”
Because that was never the deal.
The Moment Everything Changed
Around June 2023, I became aware that Adobe had used my images to train their AI model called Firefly. If you’re not familiar, Adobe Firefly is a generative AI tool — you type in a prompt like “an angry purple tiger” and it creates an image for you. It was trained on content from Adobe Stock. My content. Your content, if you’re a contributor.
Think about what that means for a second. I licensed images to Adobe so they could sell them to customers. Instead, Adobe fed those images into a machine that now creates competing images. The AI outputs serve the exact same purpose as my original work. Except now, a customer doesn’t need to license my photo — they can just generate something similar for the cost of a subscription.
My reaction wasn’t rage. It was confusion. And then it was purpose.
I reached out to Adobe immediately. I came in calm. I actually asked for a partnership. I said: let’s figure out a fair arrangement for content already used in training and for content going forward. That’s a reasonable ask. That’s a business conversation between two parties who are supposed to be making money together.
Adobe’s response, months later in October 2023, was essentially: we have the right to do this under your agreement. And if you disagree, you should remove your content from Adobe Stock.
Read that again. They used my images. Trained an AI that competes with me. And when I raised my hand, they said leave if you don’t like it.
As if removing my content from Adobe Stock would somehow un-train their AI. As if you can take your flour back out of a baked cake.
What They Offered vs. What They Took
Before I even got to arbitration, Adobe offered me a “bonus” of about $1,173 for the use of my content in AI training. They framed it as generosity. They explicitly said they weren’t even required to offer it under the agreement.
$1,173 for 11,855 images used to train a generative AI product that Adobe now sells as part of its core business.
That’s about ten cents per image.
I said no.
When I retained legal counsel and we sent a formal demand letter, Adobe’s law firm — one of the largest firms in the world — responded with a settlement offer of $5,000.
$5,000 divided by 11,855 images = roughly 42 cents per image.
42 cents for an image that was intentionally created, curated, and licensed — now permanently embedded inside a billion-dollar AI product. That’s what your life’s work is worth to a company like Adobe. Not because that’s its actual value, but because they’re betting you don’t know any better. They’re betting you can’t afford to fight. They’re betting you’ll take the money and go away.
It’s like finding a gold mine in someone’s backyard, handing them $100, and walking off with a billion dollars worth of resources because they didn’t know what they had.
Some creators fell for it. And I don’t blame them. If you don’t know what your content is worth in the context of AI training data, how would you know to say no? That’s part of the strategy.
The Three-Letter Word That Changed Everything
Here’s what Adobe is really arguing — and I need you to understand this because it affects every single person who has ever uploaded content to any platform.
The contract I signed gave Adobe a license to use my images for “developing new features and services to promote my work.” That language was there so Adobe could do normal business things — improve their platform, create better search functionality, develop new tools for the stock photo marketplace. Standard stuff.
Adobe’s argument? The word “new” means they can do anything new. Anything. Including something that didn’t exist when I signed the contract. Including AI training. Including building a tool that directly competes with the very content I licensed to them.
Think about that logic. Because the word “new” is in the contract, any new thing Adobe decides to do with your content is supposedly covered. Five years from now, if they want to use your images to train humanoid robots? New feature. If they want to beam your photos onto billboards from satellites? New service. If they want to sell your content directly to your competitors? New offering.
The word “new” became a blank check. And every creator who has a similar clause in their agreement should be concerned.
Why I’m Telling This Story Now
AI is changing everything. Every single thing. The way we create, the way we consume, the way we earn a living. And right now, in this moment, the rules are being written. Not by creators. Not by lawmakers who understand the technology. The rules are being written by the companies building the AI — and they’re writing them in their favor.
People entrusted platforms like Adobe with their content. There’s an expectation — a duty of care — that comes with that trust. When someone gives you their creative work under agreed-upon terms, you don’t get to just rewrite the deal in your head and pretend they consented.
I fought this battle in arbitration. I spent money I didn’t plan to spend. I hired experts and lawyers. I experienced things I never expected — like a hostile process server showing up at my home while my 4-year-old was sleeping on my shoulder. My heart was pounding. I wanted to protect my family and fight for my rights at the same time. That’s the reality of standing up to a corporation. It comes to your front door.
But I’m still here. And I’m still talking.
What This Series Is About
For the next 52 weeks, I’m going to share everything that I can. The documents. The strategies used against me. The contract clauses you need to look for. The alternatives to many products. The real costs of fighting a billion-dollar company. The emotional toll. The spiritual foundation that kept me going.
I paid for experts so you don’t have to. I lived through the arbitration process so you can learn from it. I experienced every tactic they used — the delays, the procedural battles, the motions to end the case before I could present evidence — so that if this ever happens to you, you won’t be walking in blind.
This is not just about Adobe. This is about every platform that has your content and a vague contract with the word “new” somewhere in it. This is about the future of creative ownership. This is about whether the people who make the content that trains the AI have any say in what happens next.
If you’re a creator — a photographer, a writer, a musician, a filmmaker, a designer, a person who makes things and puts them into the world — this story is yours too.
Follow along. Share it. Talk about it. Because the only thing that can change the rules is enough people knowing what the rules actually are.
This is Week 1 of 52
Photo by Nicole Carter