Your Work Is Training AI. A New Proposal Says You Should Get Paid.
Generative AI is drafting text, making images, and summarizing news. It's learning from the vast library of human creativity online, including yours. As these systems become more integrated into our lives, a critical question arises: who should profit?
The current copyright system wasn't built for a world where machines learn from and compete with human artists and writers on this scale. The legal ground is shaky, and the stakes for creators are high.
The Core Problem: Your Work, Their Profit
The biggest AI models are trained on millions of books, articles, songs, and artworks scraped from the web. Your work is likely in there. Many creators are now suing AI companies, arguing that using their copyrighted material for commercial training without permission is illegal.
But the courts are undecided. Some judges are skeptical of the AI companies' claims, while others suggest that training an AI is like a human reading a book. Frank Pasquale, a law professor at Cornell, notes, "The ongoing legal uncertainty here creates problems for both copyright owners and technologists."
The impact is real. Artists see their unique styles mimicked in seconds. Journalists watch as chatbots summarize their work, stealing traffic from their publications. Professionals in design, coding, and marketing worry that their own past work is being used to automate their jobs away.
A Solution: "Learnright"
A new paper co-authored by Pasquale, Thomas W. Malone (MIT), and Andrew Ting (George Washington University) outlines a legal innovation called "learnright."
The concept is simple: create a new intellectual property right that gives you control over whether your work can be used for AI training. It's a right to license your work specifically for machine learning.
This isn't just about law; it's about ethics. Society benefits when people are incentivized to create. Tech companies protect their own IP, so it's only fair that the creators powering their models get the same consideration. Flourishing creative communities are built on respect and attribution, not just extraction.
How It Would Work
Learnright wouldn't replace copyright. It would add a seventh exclusive right to the six you already have. Just as there are special protections for digital audio, there would be a new protection for submitting a work to an AI training process.
Under this system, AI companies would need to license the right to learn from specific datasets. They already do this with some news archives and stock photo libraries. Market negotiations would set fair rates, and collective licensing groups, like those in the music industry, could simplify the process.
While some worry this could slow innovation, the authors argue the opposite. AI models fed only their own output can degrade over time in a process called "model collapse." Without a constant supply of fresh, human-generated art and writing, AI's progress could stall.
A Fairer Path Forward
Lawmakers are paying attention to generative AI. This proposal offers a clear middle ground that doesn't ban training but also doesn't leave creators uncompensated.
As Pasquale puts it, "AI firms richly compensate their own management and employees... But the copyrighted works used as training data are also at the foundation of AI innovation. So it's time to ensure its creators are compensated as well."
Learnright would be a significant step in recognizing the value of your work in the age of AI. You can read more about the proposal in the Journal of Technology and Intellectual Property.