AI, copyright, and Australia's voice: a government playbook
Australia is entering a new digital phase, and the stakes are public. News content is being scraped to train AI models without permission or payment. The warning is simple: weaken copyright, and Australian voices, local stories, and civic accountability all shrink.
Michael Miller has urged government to draw a hard line. He is calling for a social license for big tech, enforcement with teeth, and zero tolerance for "permissionless" use of creative work.
The core policy questions
- Should Australia create a text and data mining exception that lets AI firms use copyrighted material without permission? He says no.
- Should big tech meet a social license to operate, with standards tied to market access and penalties for breaches? He says yes.
- Should parliament activate the News Media Bargaining Incentive and deliver the promised News Media Assistance Program for small publishers? He says move now.
- Should platforms face the same scrutiny as other critical services when their systems cause harm? He argues they should.
Why this matters to government
Local media closures erode democratic oversight. The Public Interest Journalism Initiative (PIJI) reports more than 160 newsroom closures in five years, with direct impacts on councils, courts, and community life. That trend compounds if AI firms can extract value from journalism without licensing or compensation.
Changing copyright settings will reset who controls use, terms, and payment for Australian work. An "opt-out" posture flips the burden onto creators and small publishers who lack the resources to police their rights. An "opt-in" posture keeps consent and compensation intact.
Specific actions to consider
- Hold the line on copyright: no blanket text and data mining exception for training AI on copyrighted content.
- Legislate and commence the News Media Bargaining Incentive; publish timelines and compliance expectations.
- Deliver the News Media Assistance Program (News MAP) to small and regional publishers with simple, fast distribution.
- Define a social license for large platforms and AI providers: safety-by-default, legal data sourcing, verified complaint handling, independent audits, and criminal penalties for serious breaches.
- Mandate consent-based, paid licensing for training datasets used by foundation and large language models that serve Australians.
- Require AI suppliers in government procurement to attest to lawful data provenance and transparent model documentation.
- Establish algorithmic risk reporting for products with material child-safety, mental-health, or civic-harm risks.
- Back enforcement with resources for regulators and clear remedies for creators.
Avoiding a repeat of past mistakes
Australia lived through the first "big steal" when platforms built scale on the free use of others' work. The cost was borne by local newsrooms and communities. Without safeguards, AI training on unlicensed news content will accelerate that pattern.
The principle is straightforward: permission and payment first. Anything less rewards extraction and penalises those doing the reporting that keeps institutions honest.
What to watch next
- Government's position on a text and data mining exception and any copyright amendments.
- Progress on the News Media Bargaining Incentive and News MAP delivery to small publishers.
- Design choices inside a social license: standards, audits, penalties, and a clear path to compliance.
Helpful references
- Attorney-General's Department: Copyright and AI
- Public Interest Journalism Initiative: Newsroom Mapping Project