Australia shuts door on AI text and data mining exception, backing creators and upping pressure on the UK

Australia rejects a text and data mining exception, locking in a permission-first approach for AI training. That means agencies need licences, provenance, and solid indemnities.

Categorized in: AI News, Government
Published on: Oct 28, 2025

Australia Rejects Text and Data Mining Exception for AI: What Government Teams Need to Do Next

Australia has ruled out a new copyright exception for text and data mining that would have allowed AI firms to train models on protected works without permission. Attorney-General Michelle Rowland said, "We are making it very clear that we will not be entertaining a text and data mining exception." The government will review copyright in the context of AI, but not by weakening licensing requirements.

Why this matters for policy and procurement

This position locks in a permission-first approach. Any AI system trained on copyrighted material will need licences, or it will carry legal and financial risk. That matters for pilots, procurements, and ongoing vendor relationships across departments and agencies.

Expect stronger expectations on disclosure, data provenance, and indemnities from AI suppliers. If a model's training sources are unknown or disputed, the risk sits with the buyer unless contracts make it clear otherwise.

Global context: UK, EU, US

The UK is still consulting, balancing a growing AI sector with demands from creative industries. The EU already provides a text and data mining exception with an opt-out for rightsholders, which means licences are still required if owners say no. In the US, multiple lawsuits create uncertainty, pushing some AI companies toward licensing to avoid outsized damages if they lose later.

Creative sector pressure and transparency

The music industry and wider creative sectors want clear licensing and visibility on how their works are used. Rowland emphasized transparency and attribution, stating that without rules forcing disclosure, creators can't see how their content is used or get fairly paid.

The Australian Recording Industry Association welcomed the stance. CEO Annabelle Herd said major companies have used copyrighted works "without permission, without payment" to train large AI systems, and are now seeking permission after the fact. She added there's no evidence it's hard to license, calling the push for an exception "a bit of a try-on".

Market dynamics: licensing as strategy

As lawsuits expand and risks grow, more AI firms may shift from arguing they don't need licences to striking deals. Once they've paid for access, they have a strong incentive to oppose any new exceptions that would give rivals a free ride. That could reshape industry alliances and strengthen the case for permission-first norms.

Action checklist for Australian government teams

  • Require suppliers to disclose training data sources, licensing status, and opt-out compliance.
  • Bake in warranties, audit rights, and indemnities covering copyright claims and data provenance.
  • Prefer models trained on licensed, public domain, or synthetic data where feasible.
  • Document how vendors respect rightsholder opt-outs and any geographic restrictions on data use.
  • Assess litigation exposure in the US and EU if tools are globally trained or deployed.
  • Plan budgets for licences where AI outputs depend on protected content.
  • Support transparency and attribution standards across procurement frameworks.

Key quotes

Michelle Rowland: "We are making it very clear that we will not be entertaining a text and data mining exception."

Michelle Rowland: "If we don't have attribution and transparency regulations for these corporations, then how can you know where and how your content's being used anyway and how can you get fairly paid for it?"

Annabelle Herd (ARIA): Companies have used copyrighted works "without permission, without payment" to train massive AI systems, and "there is no evidence" licensing is difficult, making the exception push "a bit of a try-on".

Upskilling your team

If your unit is building internal capability to assess AI tools and contracts, role-based training can help standardize reviews and reduce risk. See government-relevant AI course options for policy, legal, procurement, and data teams.

