Consent to One Thing Is Not Consent to Everything
Why we hold five-year-olds to a higher standard than billion-dollar tech companies.
Watch any kindergarten class for five minutes and you’ll hear a teacher say something like this: “Did you ask before you took that? You need to ask. And if they say no, that means no.”
We teach our kids that consent is specific. You ask before you borrow someone’s toy. You ask again next time. You don’t assume that because they let you borrow it yesterday, you can take it home today. And if they say stop, you stop. That’s not a complicated idea. Five-year-olds get it.
So how is it that we hold children to a higher consent standard than some of the largest technology companies in the world?
The Dinner Invitation
Think about it in the physical world for a second.
If someone agrees to let you into their home for dinner, that doesn’t mean you can go through their drawers. It doesn’t mean you can copy their house key. It doesn’t mean you can come back whenever you want and help yourself to whatever’s in the fridge.
You were invited for dinner. That’s the scope of the consent. Anything beyond that requires a new conversation.
We all understand this intuitively. Nobody would argue otherwise in the physical world. But in the digital world, companies are acting like a single “I agree” gives them the keys to your entire life. Your photos. Your voice. Your creative work. Your data. All of it. For anything they want. Forever.
Consent to one thing is not consent to everything.
How I Learned This the Hard Way
I invited Adobe to dinner. That’s essentially what happened.
I consented to let Adobe distribute my images to end users through Adobe Stock. That was the agreement. Revenue share. They distribute, I provide content, we both earn. The license sections were called “License We Need to Distribute Your Work to Our End Users” and “License We Need to Promote Your Work.” That was the dinner invitation. Clear scope. Clear purpose.
But Adobe didn’t just stay for dinner. They went through my drawers. They took my images and used them to train Firefly and Sensei — their AI models. They copied the key — embedding my work permanently into systems I never agreed to. And they argued they could come back whenever they wanted, for whatever they wanted, because the agreement mentioned the word “new.”
I consented to distribution. I did not consent to AI training.
That’s like inviting someone to dinner and having a judge say that because you opened the door, they were entitled to move in.
Consent Is a Spectrum, Not a Switch
That experience changed how I think about consent in technology. And it’s one of the driving principles behind what I’m building with Destined AI.
The way consent works in most technology products right now is binary. It’s a switch. You either agree to everything or you use nothing. There’s no middle ground. There’s no nuance. There’s no conversation.
But consent isn’t a switch. It’s a spectrum. And it has properties that every five-year-old already understands.
Consent should be specific. When I said yes to distribution, that meant distribution. It did not mean AI training, data scraping, model building, or any other use that wasn’t part of the original agreement. Consent to one thing is not consent to everything.
Consent should be informed. You can’t consent to something you don’t know about. When I signed that agreement in 2018, generative AI as we know it didn’t exist as a commercial product. I could not have consented to a use I couldn’t have imagined. And Adobe knew that. They didn’t ask because they knew what the answer would be.
Consent should be revocable. If I change my mind, I should be able to withdraw my consent. But once your content has been used to train an AI model, it’s embedded. You can’t un-train a model.
Consent should be ongoing. Just because I said yes in 2018 doesn’t mean I said yes to everything that comes after. Technology evolves. Uses change. If a company wants to do something fundamentally new with your content, they should have to come back and ask again. Just like a kid has to ask to borrow the toy again tomorrow.
From Diversity Photos to Destined AI
I built Diversity Photos with consent at the center. Every person in those images gave their explicit permission to be photographed. Every image was created with intention. The collection was curated, not scraped. That’s what made it valuable — and that’s exactly what was disrespected when Adobe used it for AI training without a clear conversation.
Now I’m building Destined AI with the same principle. The consent problem in technology isn’t just a legal issue. It’s a design issue. It’s an architecture issue. The way systems are built right now, consent is an afterthought — a checkbox on a form, a paragraph buried in a terms of service nobody reads, a single click that supposedly covers everything a company might do for the rest of time.
That’s not consent. That’s a loophole designed to look like consent.
I believe technology should be built so that consent is specific, informed, revocable, and ongoing. Not because it’s idealistic, but because it’s the only standard that actually works. We already know it works — we teach it to our children. Now we need to build it into our systems.
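The four properties above can be encoded directly in a system's data model rather than left to a terms-of-service checkbox. The sketch below is purely illustrative, not a description of any real product (including Destined AI): every name in it is hypothetical. It shows one way consent could be stored per purpose, with a plain-language description, an expiry that forces a renewed ask, and a revocation switch, with the default always being "no."

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical sketch only: all class and field names here are
# illustrative assumptions, not an existing API.

@dataclass
class ConsentGrant:
    purpose: str          # specific: one grant covers exactly one use
    description: str      # informed: plain-language explanation shown at grant time
    granted_at: datetime
    expires_at: datetime  # ongoing: grants lapse and must be asked for again
    revoked: bool = False # revocable: can be withdrawn at any time

class ConsentLedger:
    def __init__(self, renewal_period: timedelta = timedelta(days=365)):
        self.renewal_period = renewal_period
        self.grants: dict[str, ConsentGrant] = {}

    def grant(self, purpose: str, description: str, now: datetime) -> None:
        # Each purpose needs its own explicit grant;
        # "distribution" never implies "ai_training".
        self.grants[purpose] = ConsentGrant(
            purpose, description, now, now + self.renewal_period
        )

    def revoke(self, purpose: str) -> None:
        if purpose in self.grants:
            self.grants[purpose].revoked = True

    def is_permitted(self, purpose: str, now: datetime) -> bool:
        grant = self.grants.get(purpose)
        # Default deny: absence of a grant means no consent.
        if grant is None or grant.revoked:
            return False
        return now <= grant.expires_at
```

Under this model, a grant made in 2018 for "distribution" returns `False` when checked against "ai_training", and returns `False` again once the renewal period passes, so a new use or a later date requires a new conversation by construction.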
The Question We Should All Be Asking
Here’s what I want you to think about.
Right now, companies are using your content, your data, your voice, your images, and your creative work to build the most powerful technology in human history. And most of them are doing it based on a single click you made on a terms of service you didn’t read, for uses you couldn’t have imagined, with no mechanism for you to take it back.
If your five-year-old did that at school — took something without asking, used it for something they weren’t given permission for, and then refused to give it back — you’d correct them. You’d sit them down and explain why that’s not okay.
We need to have that same conversation with the companies building AI. Because the standard can’t be lower for a corporation than it is for a child.
We would never accept this in the physical world. We should not accept it in the digital one.