Nick Clegg Warns Artist Consent Requirement Could “Kill” UK AI Industry
Nick Clegg, former UK deputy prime minister and ex-Meta executive, has argued that requiring AI companies to get permission from artists before using their work to train models would effectively “kill” the UK’s AI industry overnight.
Speaking at an event for his new book, Clegg acknowledged that creators should have the option to opt out of having their work used for AI training. However, he stressed that asking for consent beforehand is impractical given the enormous amount of data these models require.
“These systems train on vast amounts of data,” Clegg said. “I just don’t see how you go around asking everyone first.” He warned that if the UK enforced such rules unilaterally, it would put local AI companies at a severe disadvantage compared to those in countries without such restrictions.
The Debate Over AI Transparency and Artist Rights
This debate comes amid UK parliamentary discussions on legislation aimed at increasing transparency about how AI models use copyrighted work. The proposed amendment to the Data (Use and Access) Bill would require tech companies to disclose which copyrighted materials were used to train AI.
The amendment has attracted support from a wide range of creatives, including Paul McCartney, Dua Lipa, Elton John, and Andrew Lloyd Webber. Hundreds of artists have signed an open letter advocating for clearer insight into AI training processes.
Proponents such as film producer and director Beeban Kidron, who introduced the amendment, argue that transparency is necessary to enforce copyright law effectively. They believe that requiring AI companies to disclose the content they use would discourage unauthorized use of creative works.
Parliament’s Response and What’s Next
Despite this support, MPs rejected the amendment. Technology Secretary Peter Kyle emphasized the need for both the AI and creative sectors to thrive together for the sake of Britain’s economy.
Kidron and others remain committed to pushing for transparency. With the Data (Use and Access) Bill returning to the House of Lords soon, the discussion on how to balance AI innovation with artist rights is far from over.
What Creatives Should Know
For creatives, this ongoing debate highlights the tension between protecting intellectual property and enabling AI development. While figures like Clegg accept that artists should be able to opt out, they question whether requiring consent in advance is workable given the scale of data these models need.
Understanding these discussions can help creatives engage more effectively with policymakers and tech companies. Staying informed about AI training practices and legal changes ensures that artists can advocate for fair treatment without stalling technological progress.
If you want to explore how AI tools are shaping creative work and learn practical skills to stay ahead, check out Complete AI Training’s courses for creative professionals.