First Nations creatives warn of AI "double colonisation" as works used to train models without consent
First Nations writers and publishers are sounding the alarm: without strong guardrails, AI training risks becoming a second wave of extraction - taking stories, culture, and livelihoods without consent or payment.
At the centre is Indigenous Cultural and Intellectual Property (ICIP) - knowledge, stories, and expressions that carry community and personal significance. The concern is simple: if AI firms can train on this material without permission, creators lose control over how their culture is used and how they get paid.
What happened
Gunai author Kirli Saunders says Meta used her award-winning book "Bindi" to train its AI model after sourcing it from the shadow library LibGen - without permission or payment. "It was really frustrating having a large tech company steal my work and provide no apology or explanation," she said.
According to Magabala Books chief executive Lilly Brown, 37 First Nations titles from the Indigenous publisher appeared in the training mix. "This kind of history of First Nations knowledge and stories being stolen by non-Aboriginal people and corporations is nothing new," she said. Brown and Saunders want ICIP protections embedded into AI rules, warning of "double colonisation" if their work is taken again.
The policy fight that affects your income
Australia's Productivity Commission has floated a "text and data mining" exception to the Copyright Act that could let companies train AI on copyrighted works without the usual permissions. Creative and media sectors pushed back, with Arts Law Centre CEO Louise Buckingham calling it a "get out of jail free card" for big tech. "If their works may be used without permission, without compensation, the whole system through which they make a living breaks down."
The government says it won't water down copyright and will consider impacts on creative and news sectors. The Productivity Commission is due to hand down its final report in December.
Why copyright alone isn't saving you
Experts say the law hasn't kept pace. Professor Kimberlee Weatherall notes Australia's Copyright Act hasn't been meaningfully updated for the digital environment in about twenty years, and enforcing rights against overseas tech firms is difficult in practice.
AI entrepreneur Dave Lemphers argues for transparency and penalties: companies should prove where their training data comes from, and creators should be able to set royalty terms. "You could easily build in commission structures and royalty structures into your technology. They just choose not to because there's a loophole."
What creatives can do now
- Audit your catalogue: list titles, formats, and where they live online. Tighten access to full files. Remove unauthorised uploads and pursue takedowns on mirror sites like LibGen whenever you find them.
- Update your contracts: add explicit AI clauses for publishers, clients, and platforms - no training without written permission, clear usage boundaries, and royalty terms for any machine use.
- Publish a rights statement: make your AI policy public on your site and in metadata (e.g., "No AI training or dataset inclusion"). Consistency helps during disputes.
- Block known AI crawlers on your website: use robots.txt to disallow agents like GPTBot and CCBot. It won't stop scraping everywhere, but it creates a record of your intent and shuts down compliant bots.
- Watermark and track: use subtle watermarking or traceable excerpts for digital samples. Share full texts only via controlled channels.
- Collective action: coordinate with your publisher, peak bodies, and legal services such as the Arts Law Centre of Australia for template clauses, dispute support, and advocacy.
- Document everything: keep dated records of infringements, correspondence, distribution logs, and contract terms. Evidence matters if policy or litigation opens a path to compensation.
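For the crawler-blocking step above, a minimal robots.txt sketch might look like the following. The user-agent names shown (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended for Google's AI training opt-out) are real, documented tokens at the time of writing, but vendors add and rename crawlers, so check each company's current documentation before relying on this list. Compliance is also voluntary: robots.txt only stops crawlers that choose to honour it.

```text
# robots.txt - place at the root of your site (e.g. example.com/robots.txt)
# Blocks known AI training crawlers; verify current user-agent names
# against each vendor's documentation, as they change over time.

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Even where a scraper ignores the file, having it in place creates a dated, public record that you denied permission, which can matter in later disputes.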
What to watch next
All eyes are on the Productivity Commission's final report, due in December. Key signals: any endorsement of a text and data mining exception, mandated transparency for training datasets, and penalties for non-compliance. You can track updates via the Productivity Commission's website.
Meta and the Tech Council of Australia were approached for comment but did not respond.
Skill up to protect your work - and choose where you participate
Understanding how AI models are built helps you set boundaries, negotiate smarter contracts, and decide when licensing makes sense on your terms. If you want structured guidance by job type, explore curated options here: Complete AI Training - Courses by Job.