Why AI Licensing Won’t Save Creative Jobs—and What Will Actually Help Artists Thrive
A US court is weighing for the first time whether training AI models on copyrighted works qualifies as fair use, amid fears for creators' jobs. Licensing requirements won't stop automation; alternatives such as output regulation, collective bargaining, and antitrust enforcement may actually help.

Perspective: AI Training, the Licensing Mirage, and Effective Alternatives to Support Creative Workers
On May 1, a US court considered for the first time whether developers' use of copyrighted works to train AI models falls under “fair use” and complies with copyright law. The case, Kadrey v. Meta, involves a group of authors suing Meta for using their works without permission. This is just one of over forty similar copyright cases pending against generative AI developers in US courts.
The debate often centers on job protection. Media companies and some creators fear AI-generated works (images, books, screenplays) could threaten creators' livelihoods, especially when AI is trained on their original content. They argue that licensing AI training data would protect jobs. However, requiring licenses for training data won't stop automation or meaningfully support creators, and it may even worsen the problem. Instead of hitting a copyright dead end, we need alternatives that genuinely address concerns about automation, corporate power, and a shifting economy.
The Jobs Debate: Courtroom and Culture
Fears of job loss featured prominently in the Kadrey hearing. Judge Vince Chhabria expressed skepticism about the evidence of harm to the specific works involved, calling some fears speculative. Another judge in a similar case described the idea that AI would replace all human work as “baloney.” Even so, Chhabria warned against relying solely on past legal precedents, noting that AI can flood the market with competing products and disrupt creators' markets in unprecedented ways.
Historical parallels are worth noting. When recorded music emerged, composer John Philip Sousa warned it would “ruin the artistic development of music.” Decades later, Jack Valenti likened the arrival of video recorders to a “great tidal wave” threatening the film industry’s financial security. Yet, these technologies ultimately expanded creativity and opportunities.
Today, concerns about AI could similarly stifle new creativity if treated too rigidly. A surge of novel AI-generated works aligns with copyright’s goal of encouraging creativity for public benefit. But as Chhabria cautioned, this might be a “highly unusual case” demanding fresh thinking.
Will Licensing Protect Creators from Automation?
Does requiring licenses for training data help creators keep their jobs? Not really. If AI models perform well with licensed data, companies will still use AI to replace human labor for cost savings. Licensing won’t stop AI deployment or job loss; it just adds a paperwork step.
What about revenue from licensing? High-performing AI models train on billions of data points. For example, Stable Diffusion trained on over 2 billion images. Even if all licensing revenue went directly to artists, individual payments would be negligible—mere cents per work. In practice, middlemen dominate licensing markets, benefiting large tech and media companies rather than individual creators.
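To see why per-work payouts would be tiny, here is a rough back-of-envelope sketch in Python. The training-set size follows Stable Diffusion's reported figure of roughly 2 billion images; the licensing pool amounts are hypothetical assumptions chosen purely for illustration, not figures from any actual deal.

```python
# Back-of-envelope: what each work would earn if a licensing pool were split
# evenly across a Stable Diffusion-scale training set. Pool sizes below are
# hypothetical assumptions for illustration only.

TRAINING_WORKS = 2_000_000_000  # ~2 billion images in Stable Diffusion's training data

for pool_usd in (50_000_000, 200_000_000, 1_000_000_000):  # hypothetical pools
    per_work = pool_usd / TRAINING_WORKS
    print(f"${pool_usd:,} pool -> ${per_work:.2f} per work")

# Prints:
# $50,000,000 pool -> $0.03 per work
# $200,000,000 pool -> $0.10 per work
# $1,000,000,000 pool -> $0.50 per work
```

Even the most generous of these assumed pools yields about fifty cents per work, and in practice intermediaries would take a cut before anything reached individual artists.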
Licensing can also hurt competition. Copyright protects specific expression, not ideas or facts, yet a licensing requirement for training would effectively let rightsholders charge for access to the uncopyrightable ideas and facts inside their works. Recipes, for example, aren't copyrightable, so granting cookbook publishers monopoly-like control over them through licensing would serve neither the public interest nor creators. The “flood” of AI content may still arrive, but the profits will largely flow to industry giants, not artists.
Artist Mat Dryhurst sums it up: “Everyone is just going to have to get used to coexisting in a world of infinite kinds of good media that used to be challenging to make.”
Better Paths Forward for Creators
The flood of AI-generated content may benefit the public, but creators’ job concerns remain real. If licensing won’t solve this, what can?
- Regulating AI Outputs
Some laws focus on protecting artists' names, images, and voices from unauthorized AI-generated commercial uses. These laws don't restrict AI training broadly but limit specific outputs that could compete with the original artists.
- Ensuring Workers Share AI Gains
Collective bargaining has proven effective. For example, the 2023 Hollywood strikes secured agreements preventing pay cuts for human writers when AI tools are used. This approach safeguards pay and roles without slowing AI innovation.
- Limiting Market Concentration
Most artists lack union representation, making a competitive marketplace crucial. Antitrust actions, like blocking major publisher mergers, protect authors from lower advances caused by market consolidation. Tackling advertising monopolies held by giants like Google and Facebook can also benefit journalists and creators.
- Redistributing AI's Economic Benefits
Taxing AI-generated outputs and using those funds to support artists, venues, and cultural programs can help. Proposals for narrow, output-focused taxes aim to aid creators whose work is substituted by AI.
These paths deserve more attention and collaboration across stakeholders who often disagree on AI training and licensing issues. Building consensus on these approaches can better support creative workers facing automation challenges.