No Consent, No Compensation: Why Creators Oppose Australia's AI Data Mining Exception
Australia is considering a text and data mining (TDM) exception that would let AI developers use art without consent; creatives are pushing back. They demand transparent datasets, consent, credit, compensation, and real enforcement.

AI training shouldn't erase artists' rights: what creatives need to know now
The Federal Government's recent economic reform roundtable ended with talk of a compensation model for AI training data. That sounds hopeful, until you read the Productivity Commission's interim report, which proposes a new text and data mining exception that would let AI developers use copyrighted work without permission or payment.
NAVA and thousands across the arts sector oppose this. Australia's Copyright Act already protects artists' work. Using art for publication, reproduction, or AI training requires consent and a fee. Any workaround that normalises scraping without consent cuts against the law and the basic premise that creative labour has value.
What the change would mean
A TDM exception would give AI companies legal cover to ingest artworks at scale: no permission, no credit, and no upfront compensation. It shifts the burden onto individual artists to hunt for infringements across opaque platforms and complex systems. That's not workable for most people, and it rewards extraction over ethics.
What artists are saying (NAVA survey)
- 80%+ believe AI threatens their income, practice, and moral rights.
- 73% support a compensation scheme when work is used to train AI.
- Many report their work and personal data have already been scraped without consent.
- Major barriers to detecting use: lack of transparency, legal complexity, and power imbalances.
- Common themes: scraping without permission is exploitation; styles and identities are being mimicked or monetised; impersonation is on the rise; many now feel unsafe sharing work online.
How creatives are actually using AI
- 46% use generative AI in their practice, 41% do not, and 13% may in the future.
- Top uses: editing/grammar (49%), drafting text (49%), grant writing/admin (36%), research and development (~40%), brainstorming (34%).
- Visual outputs remain limited: 22% generate sketches/reference images; only 6% produce final artworks with AI.
Artists aren't rejecting technology. They're asking for lawful, transparent, and fair use: consent first, and compensation where due.
Beyond copyright: culture, ethics, environment
Generative systems are flooding markets with faster, cheaper outputs that sideline slower, process-led, and experimental practices. Many feel pressure to adopt tools to meet unrealistic productivity expectations. Environmental costs also matter: AI infrastructure consumes significant water and energy, and artists are calling for full transparency on community impacts.
What good policy looks like
- Transparent training datasets that can be searched by artists.
- Meaningful consent processes, with clear opt-out and penalties for infringement.
- Attribution and compensation pathways when work informs AI outputs.
- Stronger enforcement of existing copyright, plus stand-alone AI legislation that upholds artists' rights.
- Education for creators and inclusion of artists in decision-making.
- Protection of Indigenous Cultural and Intellectual Property (ICIP).
Tools and actions you can take now
- Use defensive tools when appropriate. For example, Nightshade helps disrupt unauthorised training on your images.
- Update contracts and licensing terms to address AI explicitly: consent, usage scope, attribution, and fees.
- Keep clean records of your work (dates, versions, proofs); a simple way to do this is sketched after this list. It strengthens your position if disputes arise.
- Audit platform settings and policies. Where available, set "do not train" preferences and review T&Cs before uploading.
- Join sector advocacy and share evidence of scraping or impersonation to help build enforcement cases.
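For the record-keeping point above, here is a minimal sketch of one way to do it, assuming your works sit in a local folder and that a hash-and-timestamp manifest is an acceptable form of proof for your purposes. The folder and file names are placeholders, not tools named in this article.

```python
# provenance_manifest.py -- a minimal record-keeping sketch.
# Assumptions (not from the article): works live in a local folder, and a JSON
# manifest of file hashes, sizes, and timestamps is a useful form of evidence.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

WORKS_DIR = Path("my_artworks")              # hypothetical folder of finished works
MANIFEST = Path("provenance_manifest.json")  # hypothetical output file

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest() -> list[dict]:
    """Record name, size, hash, and a UTC timestamp for every file in the folder."""
    entries = []
    for path in sorted(WORKS_DIR.rglob("*")):
        if path.is_file():
            entries.append({
                "file": str(path),
                "bytes": path.stat().st_size,
                "sha256": sha256_of(path),
                "recorded_at": datetime.now(timezone.utc).isoformat(),
            })
    return entries

if __name__ == "__main__":
    MANIFEST.write_text(json.dumps(build_manifest(), indent=2))
    print(f"Wrote {MANIFEST}. Keep a copy somewhere you can't silently edit.")
```

A manifest like this shows that a specific version of a work existed in your hands on a given date; storing a copy with an independent third party (for example, emailing it to yourself or lodging it with a registry service) makes that evidence harder to dispute.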
Policy in motion
The Productivity Commission's proposal is a pivotal moment for Australian creatives. Any shift toward exceptions that normalise scraping without consent undermines the Copyright Act and artists' livelihoods. Track updates and make submissions via the Productivity Commission.
The bottom line
Creators don't owe their life's work to machine training by default. Consent, credit, and compensation must be the baseline. Strengthen the laws we have. Build clear AI rules that protect artists. And keep AI use honest: useful in practice, fair in principle.
If you're exploring AI for your visual practice, choose your tools with intent and keep control of your rights. For curated resources on creative AI tools, see this guide: AI tools for generative art.