<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Ethical Layer]]></title><description><![CDATA[Ethics. Privacy. Data. Opportunity. Atlanta made. ]]></description><link>https://layer.geraldcarter.co</link><image><url>https://substackcdn.com/image/fetch/$s_!iv9r!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f2a3dde-eafc-486f-bc10-a71f0214be5b_635x635.png</url><title>The Ethical Layer</title><link>https://layer.geraldcarter.co</link></image><generator>Substack</generator><lastBuildDate>Thu, 16 Apr 2026 22:55:20 GMT</lastBuildDate><atom:link href="https://layer.geraldcarter.co/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Gerald Carter]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[gerald360@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[gerald360@substack.com]]></itunes:email><itunes:name><![CDATA[Gerald Carter]]></itunes:name></itunes:owner><itunes:author><![CDATA[Gerald Carter]]></itunes:author><googleplay:owner><![CDATA[gerald360@substack.com]]></googleplay:owner><googleplay:email><![CDATA[gerald360@substack.com]]></googleplay:email><googleplay:author><![CDATA[Gerald Carter]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Consent Isn’t Just an Ethical Safeguard - It’s a Data Quality Strategy.]]></title><description><![CDATA[20,000 People Said Yes. So Why Are Companies Afraid to Ask?]]></description><link>https://layer.geraldcarter.co/p/consent-isnt-just-an-ethical-safeguard</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/consent-isnt-just-an-ethical-safeguard</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Tue, 14 Apr 2026 14:54:45 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!wHnV!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe80088bd-9415-42ab-8f19-00c57c834214_1314x550.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>There&#8217;s a story the tech industry tells itself about consent. &#8220;If you ask people for explicit permission to use their data, they&#8217;ll say no. So you have to be creative about how you get it. Bury it in terms. Use broad language. Make the product so useful that people click &#8220;I agree&#8221; without thinking about what they&#8217;re agreeing to.&#8221;</p><p>I know this because an investor told me to do exactly that. Build a product that collects voice data for free. Train my own models. That&#8217;s the moat.</p><p>I didn&#8217;t take that advice. 
Instead, I asked.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;47284bc4-9b6b-48fc-92b6-e60aee061553&quot;,&quot;duration&quot;:null}"></div><p class="button-wrapper"><a class="button primary" href="https://layer.geraldcarter.co/subscribe?"><span>Subscribe now</span></a></p><p>With Destined AI, we asked people directly: will you contribute your voice data to help improve AI? No burying it in terms of service. No vague &#8220;new features&#8221; clauses. A clear ask with a clear purpose.</p><p><strong>Over 20,000 people said yes.</strong></p><p>So the excuse doesn&#8217;t hold up. People will consent when you&#8217;re honest about what you&#8217;re building and why. The problem was never that people don&#8217;t want to participate. The problem is that companies don&#8217;t want to be accountable for what they do with that participation.</p><h2>What Happens When You Skip Consent</h2><p>Here&#8217;s the part nobody talks about. Skipping consent doesn&#8217;t just create a legal problem or an ethical problem. It creates a product problem.</p><p>I&#8217;m from the South, and most voice AI systems simply don&#8217;t understand our voices. It&#8217;s a familiar frustration, whether it&#8217;s personal assistants misunderstanding everyday instructions or something more serious in healthcare, like ambient scribes missing critical details.</p><p>I decided to personally test every major speech-to-text platform so others don&#8217;t have to navigate the same trial and error. I ran a subset of data through eight of the top models, including OpenAI, AssemblyAI, Deepgram, Amazon Transcribe, Google Speech, Azure Speech, and the open-source Voxtral model.</p><p>The results:</p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!wHnV!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe80088bd-9415-42ab-8f19-00c57c834214_1314x550.png" width="1314" height="550" alt="Chart of word error rates across speech-to-text models"><figcaption class="image-caption">Lower is better. Word error rates for each model, and how they correlate with demographics. More detailed evaluations are available.</figcaption></figure></div><p>Word error rates ranged from 2.9% to 33.4%. Most of the models failed badly on women from New Orleans. Not edge cases. Not unusual accents. Just people talking the way they actually talk.</p>
<p>That&#8217;s what happens when you build AI on data that was scraped, not given. You end up with models trained on the voices that were easiest to collect, not the voices that need to be heard. The data is biased because the collection was biased. And the collection was biased because nobody asked.</p><p><strong>Consent isn&#8217;t just an ethical safeguard. It&#8217;s a data quality strategy.</strong></p><h2>Why Companies Really Skip Consent</h2><p>So if people will say yes when you ask honestly, why do companies skip consent?</p><p>From the hundreds of people I&#8217;ve spoken with, there are three main reasons.</p><p><strong>First, they don&#8217;t ask so you don&#8217;t get the chance to say no.</strong> If you never ask the question, you never get a no. You just take. And then you argue later that the terms covered it. That&#8217;s exactly what happened with Adobe: the agreement said &#8220;distribute to end users&#8221; and &#8220;promote your work.&#8221; They used it for AI training. When I challenged it, they argued the language was broad enough to cover it. They never asked, because asking creates a record. Not asking keeps things vague.</p><p><strong>Second, they think you will say no.</strong> And honestly, some people would. But as we proved with 20,000 contributors, most people are willing to participate when you&#8217;re transparent about what you&#8217;re doing and why. The industry assumption that people will refuse is based on the industry&#8217;s own behavior: people don&#8217;t trust you because you&#8217;ve given them reasons not to. Fix the trust, and the consent follows.</p><p><strong>Third, they think you&#8217;ll ask for money.</strong> And that eats into their profits. If you ask a creator for explicit permission to use their work for AI training, the creator might say &#8220;sure, but what&#8217;s the compensation?&#8221; And suddenly the &#8220;free&#8221; data isn&#8217;t free anymore. The whole business model of scraping and training depends on the data being costless. Consent introduces a price. And companies don&#8217;t want to pay it.</p><p><strong>It&#8217;s never been about what&#8217;s possible. It&#8217;s about what&#8217;s profitable.</strong></p><h2>The Idea Graveyard</h2><p>One of my biggest fears is ending up in what I call the idea graveyard. It&#8217;s a place where dreams were never pursued. Passions were never followed. Ideas that could have changed things just&#8230; stayed ideas.</p><p>I think about this a lot when it comes to what we&#8217;re building. The easy path would have been to take the investor&#8217;s advice. Collect data quietly. Build models fast. Move on. That&#8217;s the path most companies take, because it&#8217;s faster and cheaper and nobody asks questions until it&#8217;s too late.</p><p>But the version of Destined AI that cuts corners on consent? That idea deserves to stay in the graveyard. The version that empowers community, the one that asks people, gets explicit permission, and builds technology good enough to actually understand a woman from New Orleans or a rural Southerner: that&#8217;s the one worth building. Even if it&#8217;s harder. Even if it&#8217;s slower.</p><h2>Consent Is the Foundation</h2><p>The promise of AI means nothing if it fails to work reliably in real-world scenarios. Our 20,000 contributors proved that explicit consent works. They said yes because we asked honestly. Imagine what&#8217;s possible if more companies did the same.</p>
<p>More reliable AI. More breakthroughs. More advancements.</p><div><hr></div><p><strong>Companies don&#8217;t skip consent because people won&#8217;t agree. They skip it so you can&#8217;t say no, because they think you&#8217;ll say no, or because they think you&#8217;ll ask for money. But building AI that doesn&#8217;t work for the people it should serve? That&#8217;s more expensive than any of it.</strong></p><p class="button-wrapper"><a class="button primary" href="https://layer.geraldcarter.co/p/consent-isnt-just-an-ethical-safeguard?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p>]]></content:encoded></item><item><title><![CDATA[How AI is Changing the Platform and Creator Relationship]]></title><description><![CDATA[The &#8220;Blank Check&#8221;: What Adobe&#8217;s Contract Argument Really Means for Every Creator.]]></description><link>https://layer.geraldcarter.co/p/how-ai-is-changing-the-platform-and</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/how-ai-is-changing-the-platform-and</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Tue, 07 Apr 2026 14:08:00 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!wLH5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83f96991-88b5-4bee-a9ad-1c84ddf90738_863x575.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!wLH5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F83f96991-88b5-4bee-a9ad-1c84ddf90738_863x575.jpeg" width="863" height="575" alt=""></figure></div><p>I&#8217;ve been in a legal battle since 2023 fighting for creators&#8217; rights, and I really want to help people understand what Adobe is actually arguing here, because it affects every single person who has ever uploaded content to any platform.</p><p>Not just photographers. Not just Adobe Stock contributors. Everyone.
If you&#8217;ve ever uploaded a photo, a video, a voice recording, a design, a document &#8212; to any platform with terms of service &#8212; what I&#8217;m about to explain could apply to you.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;0ea63e54-46ed-4a71-9218-cc53fe4328d0&quot;,&quot;duration&quot;:null}"></div><p class="button-wrapper"><a class="button primary" href="https://layer.geraldcarter.co/p/how-ai-is-changing-the-platform-and?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p><h2>What the Contract Actually Said</h2><p>The contract I signed gave Adobe a license to use my images for &#8220;developing new features and services [to promote my work].&#8221; The full context of the agreement made it clear what that meant. The sections were literally labeled &#8220;License We Need to Distribute Your Work to Our End Users&#8221; and &#8220;License We Need to Promote Your Work.&#8221;</p><p>That language was there so Adobe could do normal business things. Improve the platform. Create better search functionality. Develop new tools for the stock photo marketplace. Standard stuff that you&#8217;d expect a technology company to do with a content library it&#8217;s distributing on your behalf.</p><p>There was no section called &#8220;License We Need to Use Your Work to Train Our AI Models for Free and Cut You Out of Any Resulting Revenue.&#8221; Because that was never the deal. The deal was a revenue share. I provide content. Adobe distributes it to end users. We both make money. That was the overarching benefit of the bargain for both parties.</p><h2>What Adobe Is Actually Arguing</h2><p>Adobe&#8217;s argument? The word &#8220;new&#8221; means they can do anything new. Anything. Including something that didn&#8217;t exist when I signed the contract. Including AI training. Including building a tool that directly competes with the very content I licensed to them.</p><p>Think about that logic for a second.</p><p>Because the word &#8220;new&#8221; is in the contract, any new thing Adobe decides to do with your content is supposedly covered. It doesn&#8217;t matter that AI training wasn&#8217;t mentioned. It doesn&#8217;t matter that generative AI didn&#8217;t exist as a commercial product when I signed. It doesn&#8217;t matter that the purpose of the license was clearly about distribution and promotion. The word &#8220;new&#8221; apparently overrides all of that context.</p><p>Five years from now, if they want to use your images to train humanoid robots that look like you? New feature. If they want to beam your photos onto billboards from satellites? New service. If they want to sell your content directly to your competitors without compensation?
New offering.</p><p><strong>The word &#8220;new&#8221; became a blank check.</strong></p><h2>Why This Isn&#8217;t Just My Problem</h2><p>Here&#8217;s where this gets bigger than Adobe and bigger than me.</p><p>Every creator who has a similar clause in their agreement &#8212; and most do, because these are standard, non-negotiable, click-through contracts &#8212; should be very concerned.</p><p>Go look at the terms of service for whatever platform you upload content to. Look for language like &#8220;new features,&#8221; &#8220;new services,&#8221; &#8220;new products,&#8221; &#8220;improve our offerings.&#8221; You&#8217;ll find something like it. Almost every platform has some version of this clause because it&#8217;s meant to give them operational flexibility. And that&#8217;s fine when it&#8217;s being used for what it was intended for.</p><p>But the moment a company argues that &#8220;new features&#8221; includes training generative AI models on your work &#8212; models that then compete with you, replace the need for your content, and generate revenue you&#8217;ll never see &#8212; that operational flexibility becomes something else entirely. It becomes a blank check written against your creative output for any future technology that hasn&#8217;t been invented yet.</p><p><strong>And you&#8217;ve already signed it.</strong></p><h2>You Didn&#8217;t Negotiate This. Nobody Did.</h2><p>One thing I want people to understand about these contracts: you don&#8217;t get to negotiate them. I didn&#8217;t sit in a room with Adobe&#8217;s lawyers and agree to specific terms. I clicked a button. The same button every contributor clicks. The same kind of button you click when you sign up for any platform.</p><p>These are contracts of adhesion &#8212; take it or leave it. The platform writes the terms. You either accept them as-is or you don&#8217;t use the platform. There&#8217;s no counteroffer. There&#8217;s no redlining. There&#8217;s no conversation.</p><p>And now those same non-negotiable terms are being used to justify using your creative work for AI training &#8212; something that didn&#8217;t exist when the terms were written. The creators who signed those agreements never could have imagined this use. And the platforms know it.</p><p>That&#8217;s the part that gets me. It&#8217;s not like Adobe came to contributors and said &#8220;hey, we want to use your images for AI training, are you in?&#8221; That would have been the honest approach. It would also have been the sounder legal strategy. Instead, they kept it vague, relied on broad language, and then argued after the fact that the language covered it all along.</p><p><strong>They didn&#8217;t ask because they knew what the answer would be.</strong></p><h2>Why We&#8217;re Fighting to Vacate This Ruling</h2><p>That&#8217;s why we&#8217;re fighting to have this ruling vacated. Not just for Diversity Photos, but because if this interpretation stands, it sets a precedent that guts creator rights across the board.</p><p>No contract from the pre-AI era should be interpreted as a blank check for AI training. If a company wants to use your content for AI, they should have to say so explicitly. They should have to get your informed consent. And they should have to compensate you. That&#8217;s not radical. That&#8217;s basic contract law &#8212; both parties should understand what they&#8217;re agreeing to.</p><p>This is not over. We&#8217;re in court now on the petition to vacate.
And regardless of what happens with my specific case, I want every creator to understand the principle at stake: the word &#8220;new&#8221; in your contract should not mean unlimited.</p><h2>What You Can Do Right Now</h2><p>Go read the terms of service for every platform where you upload creative work. Search for &#8220;new features,&#8221; &#8220;new services,&#8221; &#8220;improve,&#8221; &#8220;develop.&#8221; See what language they&#8217;re using, and think about what a company could argue that language covers if they decided to train AI on your content tomorrow.</p>
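<p>If reading legalese end to end feels impossible, even a few lines of code can point you to the clauses worth a closer look. Here is a minimal Python sketch; the filename and the phrase list are illustrative placeholders, not a definitive checklist.</p><pre><code># Flag broad-rights phrases in a saved copy of a platform's terms.
# "terms.txt" and the phrase list are illustrative placeholders;
# paste in the actual terms and the clauses you care about.

import re

RED_FLAGS = [
    "new features",
    "new services",
    "new products",
    "improve our offerings",
    "develop",
    "machine learning",
    "train",
]

with open("terms.txt", encoding="utf-8") as f:
    text = f.read()

for phrase in RED_FLAGS:
    # Print each occurrence with a little surrounding context.
    for match in re.finditer(re.escape(phrase), text, flags=re.IGNORECASE):
        start = max(match.start() - 60, 0)
        end = min(match.end() + 60, len(text))
        snippet = " ".join(text[start:end].split())
        print(f"[{phrase}] ...{snippet}...")
</code></pre><p>Anything it flags isn&#8217;t automatically sinister. It&#8217;s simply where to start reading.</p>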
<p>If the platform has an arbitration clause, check if there&#8217;s an opt-out window. Most give you 30 days from when you sign up or from when the terms were last updated. That&#8217;s your window. Don&#8217;t miss it.</p><p>And talk about this. Share this post. The more creators who understand what&#8217;s happening, the harder it becomes for companies to use vague language as a blank check. They rely on people not reading the terms. They rely on people not understanding the implications. That advantage disappears when people start paying attention.</p><p>The word &#8220;new&#8221; should mean innovation. It should mean better tools, better experiences, better platforms for creators. It should not mean unlimited access to your life&#8217;s work for any purpose a corporation can imagine.</p><p><strong>That&#8217;s not what any of us agreed to. And that&#8217;s why this fight matters.</strong></p><p class="button-wrapper"><a class="button primary" href="https://layer.geraldcarter.co/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item><item><title><![CDATA[Their Terms Say They Don’t Use Your Data. An Investor Told Me Otherwise.]]></title><description><![CDATA[Do companies use your likeness hoping you will never find out?]]></description><link>https://layer.geraldcarter.co/p/their-terms-say-they-dont-use-your</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/their-terms-say-they-dont-use-your</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Tue, 31 Mar 2026 14:27:49 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!iv9r!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f2a3dde-eafc-486f-bc10-a71f0214be5b_635x635.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>When I was talking to investors about Destined AI, I had a conversation that changed how I think about privacy.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;9f57b807-7915-42e3-b428-238c27c60a7f&quot;,&quot;duration&quot;:null}"></div><p>One investor in particular told me that I should build a product to collect people&#8217;s voice data for free and then train my own models with it. He said that would be my moat. That&#8217;s the word he used. Moat. Your competitive advantage. And his version of a competitive advantage was getting people&#8217;s voice data without them fully understanding what it would be used for.</p><p>He used a voice journaling app as the example. An app people speak into to journal &#8212; talking about their day, their feelings, their thoughts. Personal stuff. Intimate stuff. And according to this investor, that company took the user data and built their voice AI models from it.</p><p><strong>But here&#8217;s the thing. If you go and look at that app in the App Store or read their terms, it says they do not use your data.</strong></p><p>Let that sit for a second.</p><h2>Self-Reported Means Self-Policed</h2><p>Most people don&#8217;t know this, but the privacy labels you see on apps in the Apple App Store are self-reported. The developer fills out a form telling Apple what data they collect and how they use it. Apple publishes it. That&#8217;s it.</p><p>Apple is not going through the company&#8217;s code to verify it. Apple is not auditing their servers. Apple is not checking whether the data that flows through the app matches what the label says. The label is what the company told Apple. And Apple took them at their word. It&#8217;s the honor system.</p><p>So when you see that green checkmark or that &#8220;no data collected&#8221; badge and you feel safe &#8212; understand that you&#8217;re trusting the company&#8217;s honesty. Not Apple&#8217;s verification. There is no verification.</p><p>The same principle applies to terms of service across the board. When a company writes &#8220;we do not sell your data&#8221; or &#8220;we do not use your data for training,&#8221; that&#8217;s a claim. It&#8217;s not a fact. And there is very little infrastructure in place to hold them to it. Especially in the US, where data privacy laws are still a patchwork compared to jurisdictions that give people stronger rights.</p><h2>Voice, Video, and Image Data Are the New Gold</h2><p>That conversation made me think about what kinds of data are most valuable right now in the AI race. And the answer is obvious when you look at where the models are going. Text data has been scraped from the entire internet. Everyone has it. The next frontier is voice, video, and image data. That&#8217;s where the moats are being built.</p><p>Think about the apps on your phone right now that have access to your voice. Voice assistants. Voice journaling apps. Language learning apps. Voice memos. Telehealth platforms. Meeting transcription tools. Any app that listens to you has the raw material to build or improve a voice AI model.</p><p>Now think about the ones that have your images. Photo editing apps. Social media. Cloud storage. Stock photo platforms. Family album apps. They all have your visual data. And we&#8217;ve already seen what can happen with that &#8212; my 11,855 images trained Adobe&#8217;s AI. The terms didn&#8217;t say that&#8217;s what the license was for. But they did it anyway.</p><p>Video is the same story. Any platform where you upload video content has footage that could be used to train models for everything from facial recognition to motion capture to video generation AI.</p><p>This isn&#8217;t speculation. This is the business model that investor was recommending to me. Collect the data through a &#8220;useful&#8221; product. Train models on it. That&#8217;s the moat.</p><h2>I Saw This Firsthand</h2><p>I&#8217;m not guessing about whether companies honor their terms. I lived it.</p><p>Adobe&#8217;s agreement with me was for distributing my images to end users and promoting my work. The sections of the contract were literally labeled &#8220;License We Need to Distribute Your Work to Our End Users&#8221; and &#8220;License We Need to Promote Your Work.&#8221; That&#8217;s the context. That&#8217;s the purpose.</p><p>And then they used that same license to train Firefly and Sensei.
They argued the phrase &#8220;new features and services&#8221; covered anything in the universe. On their own AI Ethics page, they talked about respecting creators&#8217; &#8220;choice and control.&#8221; Meanwhile, behind the scenes, the content was already in the training pipeline.</p><p><strong>Terms that said one thing. A company that did another.</strong></p><p>So when someone tells me an app&#8217;s privacy label says &#8220;no data collected,&#8221; I hear it. I just don&#8217;t automatically believe it. Not because every company is lying. But because I&#8217;ve seen what happens when they are, and there&#8217;s almost nothing in place to catch them.</p><h2>How I Think About It Now</h2><p>I&#8217;m not saying delete every app off your phone. I&#8217;m saying be reasonably cautious. Especially with voice, video, and image data. That&#8217;s the data companies are building their futures on right now. And they&#8217;re getting it from you, often through products that feel harmless.</p><p>Here&#8217;s what I do. Before I give an app access to my microphone, my camera, or my photo library, I ask myself a few questions. What does this company actually do with this data? Is there a business model that depends on my data being the product? Do their terms match their marketing? And is anyone holding them accountable if they don&#8217;t?</p><p>Usually the answer to that last question is no. Nobody is holding them accountable. The App Store label is self-reported. The terms of service are written by their lawyers. The privacy policy is designed to protect them, not you. And enforcement in the US is basically nonexistent unless you can afford to take them to arbitration or court &#8212; which most people can&#8217;t.</p><p>That doesn&#8217;t mean every company is lying. There are good ones. And when I find them, I support them. That&#8217;s the &#8220;buy&#8221; in my buy, build, or bypass framework.</p><p>But for the rest? I&#8217;m cautious. I give the minimum data necessary. I use the privacy controls available. And for things that really matter &#8212; my family&#8217;s photos, my creative work, my voice &#8212; I think carefully about who gets access and whether I trust them with something I can never fully take back.</p><h2>What I&#8217;d Tell You</h2><p>Next time you download an app and it says &#8220;no data collected&#8221; in the App Store &#8212; know that&#8217;s what the company told Apple, not what Apple confirmed. It might be true. It might not be. There&#8217;s no way for you to verify it from the outside.</p><p>If the app asks for access to your microphone, your photos, or your camera &#8212; think about what they could build with that data. Think about the investor who told me to collect voice data for free and turn it into a moat. That&#8217;s the thinking that drives this industry. Your data isn&#8217;t a byproduct of the product. In many cases, your data is the product.</p><p><strong>And if a company&#8217;s terms and their behavior ever contradict each other &#8212; trust the behavior. Every time.</strong></p><p>I learned that lesson with Adobe. 
I&#8217;m sharing it so you don&#8217;t have to learn it the same way.</p><p class="button-wrapper"><a class="button primary" href="https://layer.geraldcarter.co/subscribe?"><span>Subscribe now</span></a></p><div><hr></div><p><em><strong>&#8220;If a company&#8217;s terms and their behavior ever contradict each other &#8212; trust the behavior.&#8221;</strong></em></p>]]></content:encoded></item><item><title><![CDATA[Consent to One Thing Is Not Consent to Everything]]></title><description><![CDATA[Why we hold five-year-olds to a higher standard than billion-dollar tech companies.]]></description><link>https://layer.geraldcarter.co/p/consent-to-one-thing-is-not-consent</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/consent-to-one-thing-is-not-consent</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Tue, 24 Mar 2026 13:25:50 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!iv9r!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f2a3dde-eafc-486f-bc10-a71f0214be5b_635x635.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;d1a53eaa-d756-44e7-8c83-695bb778e58b&quot;,&quot;duration&quot;:null}"></div><div><hr></div><p>Watch any kindergarten class for five minutes and you&#8217;ll hear a teacher say something like this: &#8220;Did you ask before you took that? You need to ask. And if they say no, that means no.&#8221;</p><p>We teach our kids that consent is specific. You ask before you borrow someone&#8217;s toy. You ask again next time. You don&#8217;t assume that because they let you borrow it yesterday, you can take it home today. And if they say stop, you stop. That&#8217;s not a complicated idea. Five-year-olds get it.</p><p>So how is it that we hold children to a higher consent standard than some of the largest technology companies in the world?</p><h2>The Dinner Invitation</h2><p>Think about it in the physical world for a second.</p><p>If someone agrees to let you into their home for dinner, that doesn&#8217;t mean you can go through their drawers. It doesn&#8217;t mean you can copy their house key. It doesn&#8217;t mean you can come back whenever you want and help yourself to whatever&#8217;s in the fridge.</p><p>You were invited for dinner. That&#8217;s the scope of the consent.
Anything beyond that requires a new conversation.</p><p>We all understand this intuitively. Nobody would argue otherwise in the physical world. But in the digital world, companies are acting like a single &#8220;I agree&#8221; gives them the keys to your entire life. Your photos. Your voice. Your creative work. Your data. All of it. For anything they want. Forever.</p><p><em><strong>Consent to one thing is not consent to everything.</strong></em></p><h2>How I Learned This the Hard Way</h2><p>I invited Adobe to dinner. That&#8217;s essentially what happened.</p><p>I consented to let Adobe distribute my images to end users through Adobe Stock. That was the agreement. Revenue share. They distribute, I provide content, we both earn. The license sections were called &#8220;License We Need to Distribute Your Work to Our End Users&#8221; and &#8220;License We Need to Promote Your Work.&#8221; That was the dinner invitation. Clear scope. Clear purpose.</p><p>But Adobe didn&#8217;t just stay for dinner. They went through my drawers. They took my images and used them to train Firefly and Sensei &#8212; their AI models. They copied the key &#8212; embedding my work permanently into systems I never agreed to. And they argued they could come back whenever they wanted, for whatever they wanted, because the agreement mentioned the word &#8220;new.&#8221;</p><p>I consented to distribution. I did not consent to AI training.</p><p><strong>That&#8217;s like inviting someone to dinner and having a judge rule that because you opened the door, they were entitled to move in.</strong></p><h2>Consent Is a Spectrum, Not a Switch</h2><p>That experience changed how I think about consent in technology. And it&#8217;s one of the driving principles behind what I&#8217;m building with Destined AI.</p><p>The way consent works in most technology products right now is binary. It&#8217;s a switch. You either agree to everything or you use nothing. There&#8217;s no middle ground. There&#8217;s no nuance. There&#8217;s no conversation.</p><p>But consent isn&#8217;t a switch. It&#8217;s a spectrum. And it has properties that every five-year-old already understands.</p><p><strong>Consent should be specific.</strong> When I said yes to distribution, that meant distribution. It did not mean AI training, data scraping, model building, or any other use that wasn&#8217;t part of the original agreement. Consent to one thing is not consent to everything.</p><p><strong>Consent should be informed.</strong> You can&#8217;t consent to something you don&#8217;t know about. When I signed that agreement in 2018, generative AI as we know it didn&#8217;t exist as a commercial product. I could not have consented to a use I couldn&#8217;t have imagined. And Adobe knew that. They didn&#8217;t ask because they knew what the answer would be.</p><p><strong>Consent should be revocable.</strong> If I change my mind, I should be able to withdraw my consent. But once your content has been used to train an AI model, it&#8217;s embedded. You can&#8217;t un-train a model.</p><p><strong>Consent should be ongoing.</strong> Just because I said yes in 2018 doesn&#8217;t mean I said yes to everything that comes after. Technology evolves. Uses change. If a company wants to do something fundamentally new with your content, they should have to come back and ask again. Just like a kid has to ask to borrow the toy again tomorrow.</p>
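<p>Those four properties are concrete enough to build. As one illustration, here is a minimal Python sketch of consent as a first-class record rather than a one-time checkbox. It&#8217;s hypothetical; it isn&#8217;t how Destined AI or any particular platform implements consent.</p><pre><code># Hypothetical sketch: consent as a first-class record, not a checkbox.
# Each grant names a specific purpose, records exactly what the person
# was told, can be revoked, and expires so it has to be renewed.

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional


@dataclass
class ConsentGrant:
    subject_id: str           # who is consenting
    purpose: str              # specific: "distribution", not "anything new"
    disclosure: str           # informed: the exact explanation shown
    granted_at: datetime
    expires_at: datetime      # ongoing: consent must be renewed
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        # revocable: withdrawing is always possible
        self.revoked_at = datetime.now(timezone.utc)

    def allows(self, purpose: str) -> bool:
        now = datetime.now(timezone.utc)
        active = self.revoked_at is None and self.expires_at >= now
        # specific: an exact purpose match, never "new covers everything"
        return active and purpose == self.purpose


grant = ConsentGrant(
    subject_id="contributor-42",
    purpose="distribution",
    disclosure="We will distribute your images to end users for sale.",
    granted_at=datetime.now(timezone.utc),
    expires_at=datetime.now(timezone.utc) + timedelta(days=365),
)

print(grant.allows("distribution"))  # True
print(grant.allows("ai_training"))   # False: a new use needs a new ask
</code></pre><p>The design choice that matters: a grant names one purpose and expires. A new use, or a new year, requires a new ask.</p>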
<h2>From Diversity Photos to Destined AI</h2><p>I built Diversity Photos with consent at the center. Every person in those images gave their explicit permission to be photographed. Every image was created with intention. The collection was curated, not scraped. That&#8217;s what made it valuable &#8212; and that&#8217;s exactly what was disrespected when Adobe used it for AI training without a clear conversation.</p><p>Now I&#8217;m building Destined AI with the same principle. The consent problem in technology isn&#8217;t just a legal issue. It&#8217;s a design issue. It&#8217;s an architecture issue. The way systems are built right now, consent is an afterthought &#8212; a checkbox on a form, a paragraph buried in terms of service nobody reads, a single click that supposedly covers everything a company might do for the rest of time.</p><p><strong>That&#8217;s not consent. That&#8217;s a loophole designed to look like consent.</strong></p><p>I believe technology should be built so that consent is specific, informed, revocable, and ongoing. Not because it&#8217;s idealistic, but because it&#8217;s the only standard that actually works. We already know it works &#8212; we teach it to our children. Now we need to build it into our systems.</p><h2>The Question We Should All Be Asking</h2><p>Here&#8217;s what I want you to think about.</p><p>Right now, companies are using your content, your data, your voice, your images, and your creative work to build the most powerful technology in human history. And most of them are doing it based on a single click you made on a terms of service you didn&#8217;t read, for uses you couldn&#8217;t have imagined, with no mechanism for you to take it back.</p><p>If your five-year-old did that at school &#8212; took something without asking, used it for something they weren&#8217;t given permission for, and then refused to give it back &#8212; you&#8217;d correct them. You&#8217;d sit them down and explain why that&#8217;s not okay.</p><p>We need to have that same conversation with the companies building AI. Because the standard can&#8217;t be lower for a corporation than it is for a child.</p><p>We would never accept this in the physical world. We should not accept it in the digital one.</p>]]></content:encoded></item><item><title><![CDATA[What If a Company Creates a Robot That Looks Like You — Without Your Permission?]]></title><description><![CDATA[It sounds like science fiction. But the legal framework that would allow it? That&#8217;s already here.]]></description><link>https://layer.geraldcarter.co/p/what-if-a-company-creates-a-robot</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/what-if-a-company-creates-a-robot</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Wed, 18 Mar 2026 15:46:15 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/191378139/b1991f6ef71e287ddcd3058153fc0742.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p>It sounds like science fiction. But the legal framework that would allow it? That&#8217;s already here - or at least being argued.</p><p>Right now, companies like Adobe are arguing that because you uploaded content to their platforms, you gave them the rights to your likeness, your creative work, and your life&#8217;s output - to do <em>anything</em> they want with it. Today, that means training AI models that can reproduce your work. Tomorrow? Where exactly does it stop?</p><p>What happens when a company decides it wants to create a robot sex worker that looks like you - and claims your terms of service gave them permission?</p><p><strong>This Isn&#8217;t Hypothetical. I Lived It.</strong></p><p>In 2018, my company Diversity Photos was <em>recruited</em> by Adobe. They came to us specifically because of our unique catalog of diverse content. Then they took our work and used it to train their AI models - without telling us, without asking, and without paying us a dime.</p><p>When I found out and reached out to Adobe, I was ignored. Repeatedly. When they finally responded, they told me I had already given them permission. Their argument? Buried in the Stock Contributor Agreement was language granting Adobe a license for &#8220;developing new features and services.&#8221; According to Adobe, that phrase - written in 2018, before generative AI even existed as a commercial product - covered training AI models on our work forever, royalty-free.</p><p><strong>The &#8220;New Features and Services&#8221; Trap</strong></p><p>Here&#8217;s what should scare every creator reading this: the phrase &#8220;new features and services [to promote your work]&#8221; is everywhere. It&#8217;s in Adobe&#8217;s terms and likely in many other platform terms - virtually every platform where you host content.</p><p>These companies wrote those terms years ago, long before generative AI. But now they&#8217;re using that language as a blank check. The arbitrator in our case literally looked up the dictionary definition of &#8220;new&#8221; - meaning &#8220;having recently come into existence&#8221; - and concluded that AI training qualified because it was, by definition, new.</p><p>Think about what that means. Under that interpretation, <em>any</em> future technology qualifies. AI-generated clones of your face? New feature. Synthetic voices trained on your speech? New service. A physical robot built using your likeness? If it&#8217;s new, it&#8217;s covered.</p><p>Companies are ignoring all other terms within the contract and ONLY highlighting the term &#8220;new&#8221; as the catch-all for anything. And that&#8217;s the point.</p><p><strong>Why I Need Your Help</strong></p><p>I&#8217;m not sharing this story just to vent. I&#8217;m sharing it because this isn&#8217;t just about me. Every creator on every platform is sitting on a ticking time bomb hidden in their terms of service.</p><p>So here&#8217;s what I&#8217;m asking:</p><p><strong>Go to every platform where you have content.</strong> Adobe. Getty. YouTube. Instagram. Google.
Wherever you create.</p><p><strong>Find the terms of service.</strong> Look for language about &#8220;new features,&#8221; &#8220;new services,&#8221; &#8220;developing products,&#8221; or anything that gives the platform broad rights to your content beyond its original purpose.</p><p><strong>Screenshot it.</strong> Document what you find.</p><p><strong>Share it with me. Tag me.</strong> Let&#8217;s build the receipts together.</p><p>Because right now, these companies are banking on the fact that nobody reads the fine print. They&#8217;re betting that creators will keep uploading, keep contributing, keep feeding the machine - without ever realizing what they signed away.</p><p>The law hasn&#8217;t caught up to the technology. The courts haven&#8217;t drawn clear lines. And the arbitration system is designed to favor the companies that can afford the best lawyers and the highest fees.</p><p>But if enough of us shine a light on what&#8217;s actually in these agreements - if we can show the world exactly how these platforms claim ownership over our creative lives - we can start to change the conversation.</p><p>They took my work. They trained their AI on images of real people - our friends, kids, and community - without consent. And when I tried to fight back, they used the legal system to try and stop me.</p><p>I&#8217;m still fighting. But I can&#8217;t do it alone.</p><p><strong>Let&#8217;s create the receipts. Together.</strong></p>]]></content:encoded></item><item><title><![CDATA[Buy, Build, or Bypass: How I Handle Data and Privacy Online]]></title><description><![CDATA[Your data is yours. A high level framework of how I think about data and privacy online.]]></description><link>https://layer.geraldcarter.co/p/buy-build-or-bypass-how-i-handle</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/buy-build-or-bypass-how-i-handle</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Mon, 16 Mar 2026 15:53:29 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/191139969/97dd5cd2987fad1a242c0e2b2157db16.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p><strong>How do you approach data and privacy online?</strong></p><p>Most people don&#8217;t have an answer to that question. Not because they don&#8217;t care, but because nobody ever gave them a framework for thinking about it. You either trust the app or you don&#8217;t, and if you don&#8217;t, you use it anyway because what else are you going to do.</p><p>I used to think that way too. And then I watched Adobe take my content and claim the contract allowed it. I watched a family album app tell me they didn&#8217;t have to honor my data rights. And I realized: if you don&#8217;t have a system for protecting yourself, companies will make that decision for you. And they will not decide in your favor.</p><p>So now I have a system. It&#8217;s simple. Three options for every product and service I use: buy, build, or bypass.</p><h2>Buy</h2><p>If a product is good, and the company shows they truly care about privacy, and their terms don&#8217;t contradict what they say &#8212; I&#8217;ll buy it. I&#8217;ll support that.</p><p>To be clear - I&#8217;m not anti-technology. I&#8217;m not anti-business. I&#8217;m not the person who thinks every company is out to get you. Some companies do it right. They have clear terms. They tell you what they collect and why. Their privacy policy matches their marketing. They give you real controls over your data. 
When those things line up, I&#8217;m happy to be a paying customer.</p><p>The key is the contradiction test. A company can say &#8220;we respect your privacy&#8221; all day long. But if their terms of service say they can use your data for &#8220;developing new features and services&#8221; &#8212; sound familiar? &#8212; then what they say and what they do are two different things. I look at the terms. Not the marketing. The terms tell you the truth.</p><p>When the terms match the promises, that&#8217;s a company worth supporting with your money.</p><h2>Build</h2><p>If a company doesn&#8217;t respect data privacy and I want the product, I look at whether I can build it myself.</p><p>A real example: I used to use a family album app. It was convenient. I liked the interface. My family was on it. But at some point I started thinking about where all those photos actually live and what the company can do with them. So I submitted a data request &#8212; I wanted copies of my data and the option to delete it.</p><p><strong>Their response: &#8220;It appears that you are residing in the US, so GDPR law is not applicable.&#8221;</strong></p><p>Read that again. I asked for my own data. Photos of my family. My children. And they said because I&#8217;m in the US, they don&#8217;t have to give it to me.</p><p>That told me everything I needed to know. Not just about that app, but about the gap between what companies are legally required to do and what they should do. GDPR gives European users strong data rights. But if you&#8217;re in the US, many companies treat your data like it&#8217;s theirs.</p><p>So I built my own. It&#8217;s called MemoryNest &#8212; memorynest.app. Privacy-focused family photo sharing. Because my family&#8217;s photos are not someone else&#8217;s data. That&#8217;s not a product pitch. That&#8217;s a principle.</p><p>Now, not everyone can build their own app. I get that. But the point isn&#8217;t that everyone should become a developer. The point is that when a company shows you they don&#8217;t respect your data, you should take that seriously. And if you have the ability to build a better alternative, even if it&#8217;s just for yourself and your family, there&#8217;s real power in that.</p><h2>Bypass</h2><p>And then there&#8217;s the third option. If a product isn&#8217;t worth building and the company doesn&#8217;t respect your privacy &#8212; but you still want to use the service for some reason &#8212; bypass.</p><p>What does that look like? Disposable email addresses. Disposable information. Even disposable payment cards. So they can never build a true data profile of you.</p><p>Most apps and services ask you for your real name, your real email, your real payment info. And most people hand all of it over without thinking about it. But you don&#8217;t have to. There are services that let you create disposable email addresses that forward to your real inbox. There are virtual card services that generate one-time-use card numbers so a company never gets your actual payment info. And you can use whatever name and information you want for accounts that don&#8217;t require legal verification.</p><p>The idea isn&#8217;t to be paranoid. The idea is to be intentional. If a company hasn&#8217;t earned your real data, why give it to them? If they get breached, it&#8217;s a disposable email, not your primary one. If they sell your data, it&#8217;s a profile that doesn&#8217;t connect to anything real. If they refuse to delete your information, there was nothing real to delete in the first place.</p><p><strong>You control the relationship. Not them.</strong></p>
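<p>To make the bypass concrete: if your email provider supports plus-addressing (Gmail does, and several others do; check yours), you can hand every service its own traceable alias without signing up for anything new. A small Python sketch, with the caveat that dedicated alias services go further and let you shut an alias off entirely:</p><pre><code># Generate a per-service email alias using plus-addressing.
# Assumes your provider delivers "you+anything@example.com" to
# "you@example.com" (Gmail does; check yours). If an alias starts
# getting spam, you know exactly which service leaked or sold it,
# and you can filter that alias straight to the trash.

def service_alias(base_email: str, service: str) -> str:
    user, domain = base_email.split("@", 1)
    # Keep the tag predictable so you can recognize it later.
    tag = "".join(ch for ch in service.lower() if ch.isalnum())
    return f"{user}+{tag}@{domain}"

print(service_alias("you@example.com", "Photo App"))    # you+photoapp@example.com
print(service_alias("you@example.com", "NewsSite.io"))  # you+newssiteio@example.com
</code></pre><p>The same logic applies to virtual cards: one number per merchant, so a breach burns the alias, not the account.</p>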
<p>The idea isn&#8217;t to be paranoid. The idea is to be intentional. If a company hasn&#8217;t earned your real data, why give it to them? If they get breached, it&#8217;s a disposable email, not your primary one. If they sell your data, it&#8217;s a profile that doesn&#8217;t connect to anything real. If they refuse to delete your information, there was nothing real to delete in the first place.</p><p><strong>You control the relationship. Not them.</strong></p><h2>Why I Think About This the Way I Do</h2><p>I think people underestimate how much of their life is sitting in systems they don&#8217;t control. Your photos. Your messages. Your purchase history. Your location data. Your creative work. All of it living on someone else&#8217;s servers, governed by someone else&#8217;s terms, accessible to someone else&#8217;s AI models.</p><p>I learned this the hard way with Adobe. I gave them 11,855 images under a licensing agreement I thought was straightforward. They used them to train AI. When I challenged it, they argued the contract allowed it because of the word &#8220;new.&#8221;</p><p>I read terms before I sign up. I think about what data I&#8217;m giving and what they could do with it. I ask myself: does this company deserve my real information? And if the answer is no, I have a plan. Buy, build, or bypass.</p><p>It&#8217;s not about living in fear. It&#8217;s about being aware. Most people sleepwalk through sign-ups and downloads and &#8220;I agree&#8221; buttons. And then one day they find out their family photos can&#8217;t be deleted. Or their creative work trained an AI. Or their data was sold to a broker. By then it&#8217;s too late.</p><p>The framework is simple. The hard part is remembering to use it.</p><h2>If You Want to Start</h2><p>You don&#8217;t have to overhaul your entire digital life this week. But here are a few things you can do right now.</p><p>Pick one app you use daily and go read their privacy policy. Not the marketing page. The actual terms. See what they say about data sharing, data retention, and what happens when you want to delete your account. You might be surprised.</p><p>Set up a disposable email service. There are options out there &#8212; some free, some paid &#8212; that let you create aliases that forward to your real inbox. Next time an app asks for your email, give them an alias instead of the real thing.</p><p>Look into virtual card services. Some banks and fintech companies let you generate one-time-use card numbers. That way, if a company gets breached or starts charging you unexpectedly, your actual card is never exposed.</p><p>And for the stuff that really matters to you &#8212; family photos, creative work, personal memories &#8212; think about where it lives and who controls it. If the answer isn&#8217;t you, that&#8217;s worth changing.</p>]]></content:encoded></item><item><title><![CDATA[I Haven&#8217;t Used Photoshop in Years. Here&#8217;s What I Use Instead.]]></title><description><![CDATA[I get asked this a lot lately - here is the truth about how I do image editing.]]></description><link>https://layer.geraldcarter.co/p/i-havent-used-photoshop-in-years</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/i-havent-used-photoshop-in-years</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Wed, 11 Mar 2026 14:24:11 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/190621222/98d4de4567fb468c09cba0e43427fc73.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p>Do you use Adobe for image editing?</p><p>I get asked this a lot, especially since people know about my situation with Adobe. And the answer might surprise you &#8212; not because it&#8217;s complicated, but because it&#8217;s simple.</p><p>I haven&#8217;t used Adobe Photoshop or Lightroom in years - and I&#8217;m better for it.
My images are still being used in ads. And I&#8217;m spending less money.</p><p>Here&#8217;s what I actually use and why.</p><h2>Capture One</h2><p>For my main editing, I use Capture One.</p><p>There are a few reasons. First, the features &#8212; it has more than enough packed into it. RAW processing, color grading, tethered shooting, layer-based adjustments. Everything I need for professional work is there. I&#8217;m not missing anything from Photoshop or Lightroom.</p><p>Second, and this is the big one for me &#8212; they offer a one-off license. You can buy it once and own it. No subscription required. Now, if you want the latest updates, the subscription is still available. But you have the <em>choice</em>. That word matters. Adobe took choice away from creators a long time ago. Capture One still gives it to you.</p><p>The perpetual license is around $299. The subscription runs about $179 a year if you go that route. Either way, you&#8217;re looking at less than what Adobe charges for the Photography plan over the same period &#8212; and you actually own something at the end of it.</p><p>For me personally, I have everything I need with the license I have. I&#8217;m not chasing updates. I&#8217;m focused on the work.</p><h2>Affinity Photo</h2><p>I do like Affinity Photo on iPad. It&#8217;s a solid editor and it feels natural on the tablet for quick work.</p><p>Now, Canva acquired the company behind Affinity, and the new version is completely free. The core editing tools &#8212; photo editing, graphic design, page layout &#8212; all free. The only paid add-ons are AI features that require a Canva premium plan.</p><p>So if you&#8217;re someone who&#8217;s been on Photoshop purely because you didn&#8217;t want to pay for another app, that reason doesn&#8217;t exist anymore. Affinity supports layers, masks, adjustment layers, RAW development, and it can open PSD files. It&#8217;s worth checking out again.</p><p>I&#8217;ll be transparent &#8212; Canva is a tech company with its own AI ambitions; I&#8217;ll share my experiences with them in a future post. But using Affinity doesn&#8217;t require you to upload content to a stock marketplace that feeds an AI training pipeline. That&#8217;s a meaningful difference from the Adobe ecosystem.</p><h2>The Biggest Secret</h2><p>But honestly, the real answer to &#8220;what replaced Photoshop for you&#8221; isn&#8217;t another app. It&#8217;s how I shoot.</p><p><strong>I focus on getting my images perfect in-camera so that no edits are needed.</strong></p><p>Lighting. Composition. Exposure. White balance. If you put the time in before you press the shutter, the image is done when it comes out of the camera. I&#8217;m not spending hours in post. I&#8217;m not color correcting things that should have been lit correctly. I&#8217;m not cropping because I should have moved three feet to the left.</p><p>I see my images used for ads all the time. And guess what &#8212; there were no edits. Straight out of camera. That&#8217;s not a brag. That&#8217;s the result of putting the work in before the capture, not after.</p><p>The photography software industry has conditioned people to think that editing is where the magic happens. And for some types of work, it is. But for a lot of photographers, heavy editing is a fix for not getting it right in the first place. I&#8217;d rather invest that time at the point of creation.</p><p>And when your editing needs are minimal, your software needs are minimal. That changes everything.</p><h2>The Side-by-Side</h2><p>For people who want the numbers, here&#8217;s how they compare.</p>
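<p>Before the chart, a quick way to run the numbers yourself. The Capture One figures are the ones above. The Adobe figure is an assumption on my part, since the Photography plan price varies by tier and has changed over time, so plug in whatever your plan actually costs.</p><pre><code># Rough three-year cost sketch. Capture One figures are from this post;
# the Adobe figure is an assumed $20/month Photography plan tier.
YEARS = 3

capture_one_perpetual = 299               # one-off license
capture_one_subscription = 179 * YEARS    # about $179 per year
adobe_photography_plan = 20 * 12 * YEARS  # assumption: $20 per month

print(f"Capture One perpetual:    ${capture_one_perpetual}")
print(f"Capture One subscription: ${capture_one_subscription}")
print(f"Adobe Photography plan:   ${adobe_photography_plan}")
</code></pre><p>With those assumptions, even the subscription route comes out ahead, and the perpetual license isn&#8217;t close.</p>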
<div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!QfXK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a7eb48a-5515-4421-bad8-67e5086089f9_1966x1044.png"><img src="https://substackcdn.com/image/fetch/$s_!QfXK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a7eb48a-5515-4421-bad8-67e5086089f9_1966x1044.png" width="1456" height="773" alt=""></a></figure></div><h2>Why This Matters Beyond the Software</h2><p>If you&#8217;ve been following me, you know the bigger picture. Adobe used nearly 12,000 of my images to train their AI without my consent. They argued vague contract language gave them the right. They offered 42 cents per image to settle.</p><p>So instead of funding a company that takes creator content without consent, you can switch to practical alternatives. And they&#8217;re easier on your bank account.</p><p>That&#8217;s really what this comes down to. You have options. They&#8217;re good options. Some of them are free. And every dollar you redirect away from Adobe is a dollar that isn&#8217;t funding the system that exploits creators.</p><p>You don&#8217;t have to do it all at once. Download Affinity this week. It&#8217;s free. Open one of your projects in it and see how it feels. If you&#8217;re a professional photographer, try the Capture One trial on a real shoot. And on your next session, challenge yourself to get images that need zero editing. Focus on the light. Focus on the composition. See what happens.</p><p>The tools are there. The information is here.
</p>]]></content:encoded></item><item><title><![CDATA[AI is changing everything, including me.]]></title><description><![CDATA[I used to be a behind-the-scenes guy, but I must speak up.]]></description><link>https://layer.geraldcarter.co/p/ai-is-changing-everything-including</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/ai-is-changing-everything-including</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Tue, 10 Mar 2026 13:11:47 GMT</pubDate><enclosure url="https://api.substack.com/feed/podcast/190503950/21823ea8220d7fd2ceb2ee7f3e6bf734.mp3" length="0" type="audio/mpeg"/><content:encoded><![CDATA[<p></p>]]></content:encoded></item><item><title><![CDATA[Why I Can&#8217;t Stay Silent Anymore]]></title><description><![CDATA[The story of how Adobe used 12K of my images to train their AI &#8212; and why I'm fighting for creators' rights.]]></description><link>https://layer.geraldcarter.co/p/why-i-cant-stay-silent-anymore</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/why-i-cant-stay-silent-anymore</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Mon, 02 Mar 2026 15:17:34 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!PREz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82991a28-602b-4488-b080-34eaeb2cec4a_800x1200.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h2>I Was Never Supposed to Be the One Talking</h2><p>First, let me start by saying: I am not a content creator in the way you might think. I&#8217;m not someone who wakes up and thinks about going viral. I&#8217;m a behind-the-scenes guy. Always have been.</p><p>I built the platform called Diversity Photos. It&#8217;s a stock photography collection &#8212; nearly 100,000 images &#8212; specifically created to represent communities that the stock photo industry has historically overlooked. Black families at the dinner table. Latino professionals in a boardroom. Asian elders at a park. Muslim women at work. The everyday moments that exist in the real world but somehow didn&#8217;t exist in the visual content industry.</p><div class="subscription-widget-wrap-editor"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Layer! Subscribe for free to receive new posts and support my work.</p></div></div></div><p>This was curated, intentional work. Not scraped from the internet. Not mass-produced. Every image was created with explicit consent and a focus on the representation of marginalized groups. That&#8217;s what made it valuable. That&#8217;s what made it different.</p><p>In April 2018, I signed a Stock Contributor Agreement with Adobe. The deal was simple and the deal was fair: I provide images to Adobe Stock, Adobe distributes them to end users, and we share the revenue. That&#8217;s it. A revenue share model.
Two parties making money together.</p><p>The agreement called the license they needed exactly what it was: &#8220;License We Need to Distribute Your Work to Our End Users&#8221; and &#8220;License We Need to Promote Your Work.&#8221; That&#8217;s the language. That&#8217;s the context. That&#8217;s the purpose of the whole arrangement.</p><p>There was no section called &#8220;License We Need to Use Your Work to Train Our AI Models for Free and Cut You Out of Any Resulting Revenue.&#8221;</p><p><strong>Because that was never the deal.</strong></p><h2>The Moment Everything Changed</h2><p>Around June 2023, I became aware that Adobe had used my images to train their AI model called Firefly. If you&#8217;re not familiar, Adobe Firefly is a generative AI tool &#8212; you type in a prompt like &#8220;an angry purple tiger&#8221; and it creates an image for you. It was trained on content from Adobe Stock. My content. Your content, if you&#8217;re a contributor.</p><p>Think about what that means for a second. I licensed images to Adobe so they could sell them to customers. Instead, Adobe fed those images into a machine that now creates competing images. The AI outputs serve the exact same purpose as my original work. Except now, a customer doesn&#8217;t need to license my photo &#8212; they can just generate something similar for the cost of a subscription.</p><p>My reaction wasn&#8217;t rage. It was confusion. And then it was purpose.</p><p>I reached out to Adobe immediately. I came in calm. I actually asked for a partnership. I said: let&#8217;s figure out a fair arrangement for content already used in training and for content going forward. That&#8217;s a reasonable ask. That&#8217;s a business conversation between two parties who are supposed to be making money together.</p><p>Adobe&#8217;s response, months later in October 2023, was essentially: we have the right to do this under your agreement. And if you disagree, you should remove your content from Adobe Stock.</p><p>Read that again. They used my images. Trained an AI that competes with me. And when I raised my hand, they said <em>leave if you don&#8217;t like it</em>.</p><p><strong>As if removing my content from Adobe Stock would somehow un-train their AI. As if you can take your flour back out of a baked cake.</strong></p><h2>What They Offered vs. What They Took</h2><p>Before I even got to arbitration, Adobe offered me a &#8220;bonus&#8221; of about $1,173 for the use of my content in AI training. They framed it as generosity. They explicitly said they weren&#8217;t even required to offer it under the agreement.</p><p>$1,173 for 11,855 images used to train a generative AI product that Adobe now sells as part of its core business.</p><p><strong>That&#8217;s about ten cents per image.</strong></p><p>I said no.</p><p>When I retained legal counsel and we sent a formal demand letter, Adobe&#8217;s law firm &#8212; one of the largest firms in the world &#8212; responded with a settlement offer of $5,000.</p><p><strong>$5,000 divided by 11,855 images = roughly 42 cents per image.</strong></p><p>42 cents for an image that was intentionally created, curated, and licensed &#8212; now permanently embedded inside a billion-dollar AI product. That&#8217;s what your life&#8217;s work is worth to a company like Adobe. Not because that&#8217;s its actual value, but because they&#8217;re betting you don&#8217;t know any better. They&#8217;re betting you can&#8217;t afford to fight. 
They&#8217;re betting you&#8217;ll take the money and go away.</p><p>It&#8217;s like finding a gold mine in someone&#8217;s backyard, handing them $100, and walking off with a billion dollars&#8217; worth of resources because they didn&#8217;t know what they had.</p><p>Some creators fell for it. And I don&#8217;t blame them. If you don&#8217;t know what your content is worth in the context of AI training data, how would you know to say no? That&#8217;s part of the strategy.</p><h2>The Three-Letter Word That Changed Everything</h2><p>Here&#8217;s what Adobe is really arguing &#8212; and I need you to understand this because it affects every single person who has ever uploaded content to any platform.</p><p>The contract I signed gave Adobe a license to use my images for &#8220;developing new features and services to promote my work.&#8221; That language was there so Adobe could do normal business things &#8212; improve their platform, create better search functionality, develop new tools for the stock photo marketplace. Standard stuff.</p><p>Adobe&#8217;s argument? The word &#8220;new&#8221; means they can do anything new. Anything. Including something that didn&#8217;t exist when I signed the contract. Including AI training. Including building a tool that directly competes with the very content I licensed to them.</p><p>Think about that logic. Because the word &#8220;new&#8221; is in the contract, any new thing Adobe decides to do with your content is supposedly covered. Five years from now, if they want to use your images to train humanoid robots? New feature. If they want to beam your photos onto billboards from satellites? New service. If they want to sell your content directly to your competitors? New offering.</p><p><strong>The word &#8220;new&#8221; became a blank check. And every creator who has a similar clause in their agreement should be concerned.</strong></p><h2>Why I&#8217;m Telling This Story Now</h2><p>AI is changing everything. Every single thing. The way we create, the way we consume, the way we earn a living. And right now, in this moment, the rules are being written. Not by creators. Not by lawmakers who understand the technology. The rules are being written by the companies building the AI &#8212; and they&#8217;re writing them in their favor.</p><p>People entrusted platforms like Adobe with their content. There&#8217;s an expectation &#8212; a duty of care &#8212; that comes with that trust. When someone gives you their creative work under agreed-upon terms, you don&#8217;t get to just rewrite the deal in your head and pretend they consented.</p><p>I fought this battle in arbitration. I spent money I didn&#8217;t plan to spend. I hired experts and lawyers. I experienced things I never expected &#8212; like a hostile process server showing up at my home while my 4-year-old was sleeping on my shoulder. My heart was pounding. I wanted to protect my family and fight for my rights at the same time. That&#8217;s the reality of standing up to a corporation. It comes to your front door.</p><p><strong>But I&#8217;m still here. And I&#8217;m still talking.</strong></p><h2>What This Series Is About</h2><p>For the next 52 weeks, I&#8217;m going to share everything that I can. The documents. The strategies used against me. The contract clauses you need to look for. The alternatives to many products. The real costs of fighting a billion-dollar company. The emotional toll. The spiritual foundation that kept me going.</p><p>I paid for experts so you don&#8217;t have to.
I lived through the arbitration process so you can learn from it. I experienced every tactic they used &#8212; the delays, the procedural battles, the motions to end the case before I could present evidence &#8212; so that if this ever happens to you, you won&#8217;t be walking in blind.</p><p>This is not just about Adobe. This is about every platform that has your content and a vague contract with the word &#8220;new&#8221; somewhere in it. This is about the future of creative ownership. This is about whether the people who make the content that trains the AI have any say in what happens next.</p><p>If you&#8217;re a creator &#8212; a photographer, a writer, a musician, a filmmaker, a designer, a person who makes things and puts them into the world &#8212; this story is yours too.</p><p><strong>Follow along. Share it. Talk about it. Because the only thing that can change the rules is enough people knowing what the rules actually are.</strong></p><p><strong>This is Week 1 of 52.</strong></p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!PREz!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82991a28-602b-4488-b080-34eaeb2cec4a_800x1200.png"><img src="https://substackcdn.com/image/fetch/$s_!PREz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82991a28-602b-4488-b080-34eaeb2cec4a_800x1200.png" width="238" height="357" alt=""></a></figure></div>
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/82991a28-602b-4488-b080-34eaeb2cec4a_800x1200.png&quot;,&quot;srcNoWatermark&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6870e1b0-a85c-4312-957d-9f359c95b3bf_800x1200.jpeg&quot;,&quot;fullscreen&quot;:false,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:1200,&quot;width&quot;:800,&quot;resizeWidth&quot;:238,&quot;bytes&quot;:371373,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://layer.geraldcarter.co/i/189658492?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6870e1b0-a85c-4312-957d-9f359c95b3bf_800x1200.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:&quot;center&quot;,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!PREz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82991a28-602b-4488-b080-34eaeb2cec4a_800x1200.png 424w, https://substackcdn.com/image/fetch/$s_!PREz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82991a28-602b-4488-b080-34eaeb2cec4a_800x1200.png 848w, https://substackcdn.com/image/fetch/$s_!PREz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82991a28-602b-4488-b080-34eaeb2cec4a_800x1200.png 1272w, https://substackcdn.com/image/fetch/$s_!PREz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F82991a28-602b-4488-b080-34eaeb2cec4a_800x1200.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Photo by Nicole Carter</p><p> </p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://layer.geraldcarter.co/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" 
data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Ethical Layer! Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Coming soon]]></title><description><![CDATA[This is The Ethical Layer.]]></description><link>https://layer.geraldcarter.co/p/coming-soon</link><guid isPermaLink="false">https://layer.geraldcarter.co/p/coming-soon</guid><dc:creator><![CDATA[Gerald Carter]]></dc:creator><pubDate>Fri, 30 Jan 2026 15:21:21 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!iv9r!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3f2a3dde-eafc-486f-bc10-a71f0214be5b_635x635.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>This is The Ethical Layer.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://layer.geraldcarter.co/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://layer.geraldcarter.co/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item></channel></rss>