Copy Rights and Wrongs

Who owns the output of AI—the machines or the creators?

DALL-E/Every illustration.

ICYMI: I’m teaching a new course called How to Write With AI. You’ll learn how to do the best writing of your life and how AI can help you achieve your goals, faster. It’s a four-week course running from September 19 through October 24 that includes hands-on workshops with tools like ChatGPT, Claude, Spiral, and Lex. Discounted early bird registration ends next week. Check out the course website for complete details.


To risk sounding like a zitty 14-year-old who just discovered Atlas Shrugged rather than a 32-year-old man gleefully planning to vote blue this fall, sometimes I can’t help but think that all taxation is theft.

It is a sad fact of the world that governments are constantly coming up with novel ways to carve off their pound of asset flesh from us. What is particularly galling is that all the taxes I pay over my lifetime will probably fund, like, one quarter of a Raytheon missile that blows up some cave in another country that is supposedly filled with terrorists but is more likely occupied by goats that are about to be judiciously kabobified by my tax dollars. If the government had just left me and my (hilariously unlucrative) newsletter earnings alone, I would’ve spent said monies on happy things like puppy adoption fees, pepperoni pizza on which the spicy meat cups are crunchy, and footbaths.

You may or may not agree with me on taxes, but I’m using them to make a very, very roundabout point about copyright protection laws, and what happens when large-scale government interventions, awash with noble purpose, sometimes end up failing to address the problems they set out to solve. 

Like taxes, copyright law has a worthy goal. Taxes are for investments in common goods like roads, libraries, and Boston Celtics championship parades. Yet in practice, American tax policy has often proved itself to be a boondoggle, captured by corporate interests and riddled with waste. Copyright, similarly, is for protecting the people in our society who make art by granting them the exclusive ability to monetize their work. Too often, though, copyright law has functioned to shore up the interests of mega-corporations while failing to facilitate meaningful revenue capture for actual artists. In both cases, the contrast between the noble ideals and the bureaucratic malarkey is stark.

In the generative AI space, the difference matters, because copyright is shaping up to be the de facto legal framework for regulating this technology. Just this week, a group of authors filed a class-action lawsuit against Anthropic alleging that the company included their books in the Claude model’s training data. Last week, the journalists at 404 Media revealed that Nvidia had scraped YouTube and Netflix to train its own video models, in obvious violation of both platforms’ terms of service. 

There will be more stories like these in the years to come. Every AI company I can think of is scraping the internet’s content. And while many of the major startups are striking partnerships with media and entertainment companies—such as the deal Condé Nast and OpenAI announced yesterday—these are the exception, not the rule. The vast majority of training data is scraped without consent. 

Is this yet another case of powerful tech startups taking advantage of creatives, or are these technologies simply tools that enable creative people to make wonderful things? Even if we believe the latter to be true, are these innovations only possible because they bend the rules we have around intellectual property?

These questions matter both for individual creators and for the entire technology sector. Silicon Valley has bet the farm on this bubble, and copyright law could be the needle that pops it. Before we go any further down this road, however, it’s important to get clear on what we’re talking about when we talk about copyright and AI. When someone says that “generative AI violates IP” or “OpenAI stole my content without compensation,” they are asserting bigger ideas about the origins of creativity. 

Where does an idea come from?

Large language models (LLMs) get their smarts by training on vast datasets. They don’t memorize complete pieces of content, but rather break text down into smaller units called “tokens,” which are often word fragments. Then, when you give a model a prompt like “Write me an essay,” it uses statistical calculations to predict the most likely next token in a sequence. 
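To make that “predict the next token” idea concrete, here’s a toy sketch in Python. This is emphatically not how a real LLM works (no neural network, no subword tokenizer, a nine-word corpus), but a simple bigram model shows the same mechanic at a miniature scale: count which tokens follow which, then emit the statistically most likely successor.

```python
from collections import Counter, defaultdict

# Toy corpus, split into whole-word "tokens" for simplicity.
corpus = "the cat sat on the mat the cat ran".split()

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(token):
    """Return the most frequent next token seen in training, or None."""
    counts = follows[token]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" twice, "mat" once → cat
```

An LLM does this with billions of learned parameters instead of a frequency table, and over subword fragments instead of whole words, but the core loop is the same: given what came before, predict what comes next.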

To me, this is analogous to the human process of writing. You ingest a large amount of copyrighted data into your subconscious, then split those ideas up in a million little ways to produce something new. The issue is that asking a machine to engage in a similar process on our behalf is not something that copyright law was ever designed to adjudicate.

There’s another reason why conversations about AI and copyright tend to be so fraught. When we ask whether a generative AI company is violating copyright law, we can be referring to a number of different things:

  1. Training data: Was copyrighted material included in the training data?
  2. Machine: Once a model is trained, is the software that is produced through that training run owned by the AI company, or by the owners of any copyrighted materials in the data set?
  3. Output: When you prompt the model with a question, who owns the copyright of the answer it spits out? Who is liable if the content that is generated violates another corporation’s copyright?
  4. Published work: If a publisher distributes a piece of AI-generated content, are they responsible for compensating any of the people in the previous buckets? What does copyright ownership look like if generative AI is used to create part of a distributed work, rather than all of it? 

No one knows the answer to any of these questions. There are multiple lawsuits ongoing against the AI companies, but they are all in various lower courts in the U.S. None of them have reached the Supreme Court, so it’ll be many years before we have a definitive answer. In the meantime, I would expect these companies to build as fast as possible. Rather than wait around for the courts to figure it out, it makes more strategic sense to build world-changing technology now and apologize later. Ex-Google CEO Eric Schmidt recently voiced this sentiment in a talk with Stanford MBAs, suggesting that startups should steal IP and have lawyers “clean up the mess” for them after the fact. 

When we are discussing copyright law, we are talking about a centuries-old bundle of social constructs and beliefs—one that arguably dates all the way back to the ancient Greeks, who were known to argue over whether a student should publish the notes they took at their master’s feet. But it is equally true that social standards can change on a whim. And after talking with several AI model company founders, it is clear to me that they are betting on the hunch that if they make technology that is impressive, powerful, and beloved enough, societal expectations will follow.

Still, I think the real reason creators are upset about generative AI is much more basic: money. 

Profit pools and power

Since George Washington signed the first piece of copyright legislation into law in 1790, copyright law has moved from its stated goal of protecting the creator to protecting the rights holder. It's a subtle difference, but one that Harvard Law grads delight in. Rights can be sold, chopped up, securitized, and traded. While monetization is an activity adjacent to creation, it’s typically the reason we worry about copyright in the first place, and copyright laws have been designed almost entirely with monetization in mind. 

Perhaps because of that, scholars have argued that these laws have failed in their intent. Musicians typically receive only around 10 percent of the revenues they generate through record sales. Authors get 8 to 15 percent of net sales on their books. Pick your industry, and I can give you napkin math on how most of the revenue creatives generate through their art gets captured by rights holders (aka assholes in suits). This may be good or bad, depending on the percentage of your closet that is currently filled with Brooks Brothers. 
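The napkin math is brutal in its simplicity. With hypothetical numbers (a $15 album and the roughly 10 percent royalty rate cited above; real deals vary widely), the split looks like this:

```python
# Illustrative napkin math with assumed figures, not real contract terms.
album_price = 15.00     # hypothetical retail price, USD
artist_royalty = 0.10   # ~10% of revenue, per the ranges above

artist_share = album_price * artist_royalty
rights_holder_share = album_price - artist_share

print(f"Artist: ${artist_share:.2f}, rights holders: ${rights_holder_share:.2f}")
# Artist: $1.50, rights holders: $13.50
```

Swap in an 8 to 15 percent book royalty and the picture barely changes: the overwhelming majority of each sale flows to whoever holds the rights, not whoever made the thing.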

But crucially, the internet does not favor rights holders. It favors distribution winners. In 2024, it is Meta and YouTube that capture value, not creatives. Online, copyright mostly matters in the sense that it can empower rights holders to force the attention aggregators to take down material. And since this is typically a retroactive action—and one that only benefits rights holders with sufficient scale—it does not meaningfully shift the profit pool. 

Generative AI doesn’t change the monetization paradigm for creators, but it does promise to make distribution more competitive. By increasing the volume and quality of content being uploaded to these platforms, it threatens to boost the power of attention aggregators, while making it harder for individual creators to cut through the noise. 

Even creative talent attempting to monetize their work outside of the attention economy proper have cause to fear. Entertainment companies that would typically purchase the rights to their work—such as a Hollywood production company looking to license songs for a movie—may decide to use AI to approximate the sound of an artist’s music instead. So even if the legal discussions around this tech tend to revolve around the idea of AI "copying" an artist’s work, what artists are really worried about is creative industry power players using these tools as leverage to pay them even less than they already do. It’ll have a deflationary effect on creative wages, at least outside of the less than 1 percent of creators who have brand recognition. 

Artists and publishers are not dumb. They recognize what is about to happen. And as I see it, all of these arguments about copyright are the howling of the damned, even if they take the form of well-considered legal takes. The outputs of generative AI tools are substitutable goods, meaning corporations and users will choose whichever good is cheaper if both have similar utility. My guess is that if the creative industries were doing better financially, the noise around generative AI would be significantly less audible. 

We need to grow the pie, not cut more precise slices

Again, copyright laws are social constructs, not immutable laws of physics. They’re a compensation mechanism that governments design to incentivize the creation of new goods. The issue I have with discussions of copyright is that they often involve trying to litigate the most divine part of ourselves: our desire to create. Saying “this creation is protected by copyright” is both an attempt to set market dynamics and a statement about where human inventiveness comes from. Stronger copyright protections may protect existing rights holders, but they would likely have a net negative effect on the generation of new ideas.

Copyright laws were designed for a world of human creators. Now they are struggling to adapt to an era where machines can generate content at unprecedented speeds and volumes. The challenge lies not only in determining who owns AI-generated works, but also in understanding how this shift affects the entire creative ecosystem.

The concerns raised by creators and rights holders are valid. There's a genuine fear that AI could further undermine the value of human-created work, or allow corporations to benefit from artists’ IP without compensation. Still, we must also recognize that AI, like any tool, has the potential to enhance human creativity rather than replace it.

Moving forward, we need a nuanced approach that balances protection for creators with the need for innovation. This might involve rethinking how we compensate creators—an outcome that could potentially require us to move away from traditional copyright models, and toward systems that recognize inspiration and influence as much as they do direct copying.

We have an opportunity to shape this conversation in a way that fosters creativity and rewards innovation, not just lawyers. In the end, the goal should be to harness the power of AI to expand the creative pie instead of arguing about how to slice it. 


Evan Armstrong is the lead writer for Every, where he writes the Napkin Math column. You can follow him on X at @itsurboyevan and on LinkedIn, and Every on X at @every and on LinkedIn.



Comments
Georgia Patrick 23 days ago

Evan... Keep on top of this and keep writing. Keep digging until you remove the layers of detritus produced by technology companies so you get to truth. Copyright rules intend to keep humans creating. AI is not a human. Full stop. If humans were like chickens or geese, think about what happens when you kill them all and have no more eggs. AI cannot give us eggs. Ever. It's about money with great disregard for the long game. Nobody is paying attention to the throughline on this. The human lives, creates, gets compensation for creation, continues to create, and then dies. At death, you get no more eggs. No more writing, art, music, or moral courage. At death, is AI at the funeral? Is AI taking care of the survivors and the estate? Let's start with intentions on all of this: What does the human intend when creating? What does the technology company intend in the manufacturing process that takes what a human creates and turns it into profits for them?

@raokrishna1 24 days ago

“Still, I think the real reason creators are upset about generative AI is much more basic: money.”

So, what exactly are tech companies and the VCs funding them all about? All the big tech companies (and small) will fight tooth and nail and file patents. They will spend billions to acquire them.

But when it comes to ponying up money to creators they claim the “larger good.”

I am not saying the system is perfect. But this will only squeeze creators even further.

@jerry_delacruz.arcnow 22 days ago

Beautifully written and a joy to read, Evan. Your points are persuasive and I agree that the pie needs to grow larger to make room for what is inevitably coming. The shape and size of what is coming is blurred like a race car whizzing by. When the dust settles, perhaps we can ask our AI LLMs how to adjudicate that which each of us creates. I love the commenter’s point about how AI can never make eggs regardless of how intelligent it gets. However, I see a future where eggs will be treasured by society just as virtual eggs will be treasured.
