Australia rules out weakening copyright for AI training as creatives push for stronger guardrails
Australia has drawn a clear line: there will be no broad copyright carve-out for AI training. Attorney-General Michelle Rowland said the tech industry and creative sector need to come together, but developers won't get a free pass to mine work without permission or payment.
The message is simple. AI offers big opportunities for the economy, but Australian creators need to benefit from the value their work generates.
What the government said
The government has no plans to weaken copyright protections in relation to AI. A reference group will meet over the next two days to consider fair and legal pathways for using copyrighted material in AI development.
That means the policy direction is collaboration and licensing, not blanket exemptions. The goal: growth for AI while protecting creative livelihoods.
Consultation: who's at the table
Writers, musicians, and other creatives will be consulted alongside tech companies. The brief is to explore practical options that keep innovation alive and keep infringement in check.
For context on current policy settings, see the Attorney-General's Department copyright guidance.
What tech companies want
Several tech firms have pushed for a broad text and data mining (TDM) exception. That would let AI developers train on creators' work for free and without permission.
The government's stance indicates that approach is off the table, at least for now. Licensing and consent are the expected path.
Why creatives are pushing back
Australian Society of Authors CEO Lucy Hayward described the way some AI systems have been built as "the greatest act of copyright theft in history." In her view, a broad TDM exemption would have handed developers a free pass and legitimised what has already occurred.
Creators want clear rules, enforceable consent, and fair payment mechanisms. Anything less shifts value away from the people who make the work.
What this means for your strategy
- For creative businesses: Audit your catalogues and licensing terms. Specify AI training permissions in contracts. Track where your works appear and prepare evidence trails for takedown or licensing negotiations.
- For tech teams: Build with provenance in mind. Use licensed, public domain, or appropriately permitted datasets. Budget for rights clearance, and document source data and permissions (see the sketch after this list).
- For legal and policy teams: Prepare input for consultation. Evaluate collective licensing, opt-out/opt-in mechanisms, and technical standards for attribution, auditing, and consent logging.
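One practical way to start on the provenance and consent-logging points above is a simple training-data manifest that records where each source came from, under what licence, and on what consent basis. The sketch below is illustrative only: the field names, file format, and the publisher agreement reference are assumptions, not an established standard or any regulator's requirement.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json


@dataclass
class SourceRecord:
    """One entry in a training-data provenance manifest (illustrative fields)."""
    source_id: str       # internal identifier for the work or dataset
    origin: str          # where it came from, e.g. a URL, vendor, or contract reference
    license: str         # licence under which the material was obtained
    consent_basis: str   # e.g. "explicit licence", "public domain"
    obtained_at: str     # ISO 8601 timestamp of acquisition
    notes: str = ""      # clearance details, renewal dates, etc.


def record_source(manifest_path: str, record: SourceRecord) -> None:
    """Append a provenance record to a JSON Lines manifest file."""
    with open(manifest_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")


if __name__ == "__main__":
    # Hypothetical example entry; the agreement number is invented for illustration.
    record_source(
        "training_manifest.jsonl",
        SourceRecord(
            source_id="novel-0042",
            origin="Licensed via publisher agreement #A-17 (hypothetical)",
            license="Commercial AI-training licence",
            consent_basis="explicit licence",
            obtained_at=datetime.now(timezone.utc).isoformat(),
            notes="Cleared for model training; renewal due 2026.",
        ),
    )
```

An append-only manifest like this doubles as an audit trail: legal teams can query it during consultation or licensing negotiations, and engineering teams can gate dataset ingestion on the presence of a valid record.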
Key takeaways
- No blanket TDM exemption: developers cannot rely on free, permissionless training on copyrighted works.
- Consultation is active: government, creatives, and tech will shape practical licensing paths.
- Expect consent and compensation to anchor future policy and commercial deals.
- Start building compliance and provenance processes now to reduce legal and reputational risk.
Want a quick primer on TDM concepts that often surface in these debates? See this overview from WIPO.
If you're updating team skills for AI projects under stricter copyright settings, explore role-based options here: AI courses by job.