Stability AI wins landmark UK High Court ruling against Getty, a blow to copyright owners

UK court backs Stability AI, saying Stable Diffusion isn't an infringing copy if it doesn't store or reproduce works. But some watermark-related trademark claims landed.

Published on: Nov 05, 2025

High Court backs Stability AI over Getty Images: practical takeaways for Legal, PR and Communications

The High Court in London has ruled in favor of Stability AI in a closely watched case on whether training AI models on copyrighted images without permission amounts to infringement. The court found that Getty Images' core UK copyright claims could not proceed and that Stable Diffusion is not an "infringing copy" where the model does not store or reproduce copyright works.

The judge, Mrs Justice Joanna Smith, said the balance between creative industries and AI is of "very real societal importance," but the decision turned on narrower grounds after parts of Getty's case were withdrawn. Notably, some trademark claims related to Getty watermarks succeeded, while the passing off claim was not determined.

James Cameron, the Oscar-winning filmmaker behind Avatar, sits on Stability AI's board, underscoring the cultural and commercial stakes. The dispute arrives as the UK government weighs new rules on AI and copyright, including a possible text and data mining exception with opt-outs for rights holders.

The ruling at a glance

  • Jurisdiction: Getty dropped its main copyright claim because there was no evidence the model training occurred in the UK.
  • Model status: The court held that "an AI model such as Stable Diffusion which does not store or reproduce any copyright works (and has never done so) is not an 'infringing copy'."
  • Trademarks: Some trademark infringement claims succeeded where AI-generated images included Getty watermarks.
  • Passing off: The judge declined to rule on this claim.
  • Facts acknowledged: There was evidence that Getty Images' pictures were used for training, and Stability's tool generates images from text prompts. That did not, on its own, decide copyright liability here.

Why this matters for Legal, PR and Communications

  • Copyright risk is narrowing in the UK, but not disappearing. One senior lawyer warned the decision exposes limits in the UK's "secondary copyright" protection for creators.
  • Trademark risk is live. Watermarks and logos surfacing in generated outputs can trigger brand and legal exposure. This is a communication and reputational issue, not just a legal one.
  • Jurisdiction is strategic. Where model training takes place can determine what claims stick. Expect more forum fights and venue shopping.
  • Transparency pressure is rising. Getty called for stronger transparency rules; expect regulators to push for disclosures on datasets, filters, and watermark handling.

What Legal teams should do now

  • Map model provenance and training locations. Capture where, how, and by whom models used by your organization were trained and fine-tuned.
  • Tighten vendor contracts. Require: no storage or reproduction of third-party works; watermark/logo filtering; prompt takedowns; log retention; cooperation with investigations; indemnities for IP claims.
  • Enforce trademark hygiene. Mandate pre-release checks for watermarks and known marks. Deploy automated detectors and human review for high-visibility assets; a minimal detector sketch follows this list.
  • Define incident response. A watermark surfacing is both legal and reputational. Pre-approve a removal process, notification triggers, and legal hold procedures.
  • Clarify acceptable use. Document when AI-generated images are allowed, what approvals are needed, and which datasets or models are off-limits.
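
The automated check referenced above does not need to be elaborate to be useful. Below is a minimal sketch, assuming OpenCV (opencv-python) is installed and that you maintain a folder of reference watermark and logo crops; the file paths, the 0.8 threshold, and the flag_watermarks helper are illustrative assumptions, not a vetted production detector.

```python
# Minimal pre-release watermark/logo check (sketch, not a vetted detector).
# Assumes a "templates/" folder of reference watermark crops saved as PNGs.
from pathlib import Path

import cv2  # pip install opencv-python


def flag_watermarks(image_path: str, template_dir: str = "templates",
                    threshold: float = 0.8) -> list[str]:
    """Return names of reference marks whose best match score exceeds the threshold."""
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if image is None:
        raise ValueError(f"Could not read image: {image_path}")

    hits = []
    for template_path in Path(template_dir).glob("*.png"):
        template = cv2.imread(str(template_path), cv2.IMREAD_GRAYSCALE)
        # Skip unreadable templates or ones larger than the image being checked.
        if template is None or template.shape[0] > image.shape[0] or template.shape[1] > image.shape[1]:
            continue
        scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
        if scores.max() >= threshold:
            hits.append(template_path.stem)
    return hits


if __name__ == "__main__":
    flagged = flag_watermarks("generated/campaign_hero.png")  # hypothetical asset path
    if flagged:
        print("Escalate to human review; possible marks:", flagged)
```

Template matching of this kind only catches near-exact reuse of known marks at roughly the original scale; distorted, partial, or regenerated marks would need perceptual hashing or a trained classifier, and any hit should route to human review rather than automatic rejection.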

What PR and Communications teams should do

  • Build a quick-response playbook. If a watermark appears in published content: pull the asset, log the prompt and output, swap with a cleared version, and escalate to Legal.
  • Add AI checks to brand safety reviews. Include watermark/logo scans and source declarations in pre-flight checklists.
  • Craft clear messaging. Prepare holding statements that emphasize prompt correction, respect for creators, and collaboration with rights holders.
  • Train creators and agencies. Ensure everyone using generative tools knows the rules on prompts, review steps, and escalation paths.

Policy watch: copyright and AI in the UK

The government has flagged uncertainty over copyright as a drag on growth for both AI and creative sectors. It is exploring a text and data mining exception with an opt-out for rights holders, an approach that would formalize access while preserving control.

For background on current UK copyright exceptions and potential changes, see the UK IPO's guidance on exceptions to copyright.

Signals from both sides

  • Getty Images: expressed concern over the difficulty of protecting creative works without stronger transparency rules and indicated it will continue to pursue action in another venue.
  • Stability AI: welcomed the ruling, noting that most copyright claims were voluntarily dismissed during trial and that the court resolved the remaining issues.

Bottom line

For now, in the UK, a model that does not store or reproduce copyrighted works is unlikely to be treated as an "infringing copy." But trademarks remain a real exposure point, and policy is still in flux. Treat watermark and logo controls as mandatory, and prepare legal and comms teams to move fast when issues surface.
