Hollywood Takes Legal Action Against AI Image Generators
This week marked a significant escalation in the clash between major entertainment studios and AI companies. Disney and Universal filed a comprehensive copyright lawsuit against Midjourney, a leading AI image generator, while Getty Images' long-running case against Stability AI went to trial in London.
Disney and Universal vs. Midjourney
On June 12, Disney and Universal filed a 110-page complaint in federal court in Los Angeles, accusing Midjourney of “systematic, ongoing, and willful” copyright infringement. The studios claim Midjourney trained its AI on countless copyrighted images without permission, enabling users to generate unauthorized reproductions of iconic characters such as Darth Vader, Shrek, and Homer Simpson simply by entering text prompts.
The complaint includes detailed comparisons between original film stills and AI-generated images, describing these AI tools as a “bottomless pit of plagiarism.” Disney’s chief legal officer, Horacio Gutierrez, emphasized that piracy remains piracy, regardless of whether an AI is involved. NBCUniversal's general counsel, Kim Harris, reinforced the message, stressing the need to protect artists and substantial investments in content creation.
Midjourney, which reportedly earned $300 million last year and serves over 20 million users, has not responded to the lawsuit. Previously, its founder described the AI’s training process as “just a big scrape of the internet,” admitting they “weren’t picky.”
Legal Questions at the Heart of the Case
Experts say the lawsuit will largely turn on whether Midjourney’s outputs are “transformative” enough to qualify as fair use. Another key issue is whether scraping copyrighted content without explicit consent crosses a legal boundary.
IP attorney Dustin Taylor noted, “The similarity is so strong there. If the courts agree, this could set a precedent that fundamentally changes how AI companies train their models.”
Other AI Companies Feel the Pressure
While Stable Diffusion’s prominence has declined, Midjourney remains a top player, known for generating ultra-realistic cinematic images. Many other tools offer similar image-generation capabilities, including Runway, Ideogram, Freepik, ChatGPT, Grok, and Leonardo (owned by Canva).
These lawsuits are expected to curb the use of unlicensed data in AI training and set a new operating standard. Some companies are adapting by using fully licensed content or by indemnifying their users. Google’s Veo, Adobe, Moonvalley’s Marey (trained solely on licensed content), and smaller outfits like Invisible Universe and Toonstar (which train their own models without external data) represent this emerging class. Runway, for example, has a licensing deal with Lionsgate to use its intellectual property legally.
Getty Images Takes on Stability AI in London
Across the Atlantic, Getty Images is pressing its case against Stability AI, accusing the company of scraping millions of copyrighted photos to train Stable Diffusion. Getty CEO Craig Peters highlighted the financial and ethical stakes: “AI companies argue that paying for access to creative works would kill innovation by raising costs, but taking copyrighted work without permission or compensation is really stealing.”
Stability AI acknowledges using some Getty images but claims its practices fall under “fair use” and insists its AI models do not directly reproduce original works. A spokesperson stated that artists using Stability AI’s tools create new works that build on collective human knowledge, which the company argues is protected by fair use and freedom of expression. Getty has also filed a similar lawsuit against Stability AI in the United States.
Industry Leaders Push for Licensed AI Content
Bryn Mooser of Asteria Studios, an AI-focused company, partnered with Moonvalley to develop Marey, an AI video generator trained only on licensed content. Mooser supports the studios’ position: “There’s no question to me that the studios are right. AI models must have consent.”
Disney and Universal are seeking statutory damages of up to $150,000 per infringed work. However, their main goal is to challenge the fair use defense and establish clear legal and ethical boundaries for AI’s use of copyrighted material moving forward.
What This Means for AI and Creative Industries
This legal battle could reshape how AI companies train their models and handle copyrighted content. For developers and IT professionals working with AI, the outcome may affect data sourcing, model training practices, and compliance requirements, so staying informed about these cases is crucial.
For those interested in expanding their AI knowledge responsibly, exploring courses on ethical AI use and copyright compliance can be valuable. Resources such as Complete AI Training offer relevant courses to help navigate these challenges.