People Over Prompts: Saving Australia's Creative Future

Unchecked AI risks erasing human creativity, jobs, and Australia's cultural value in months. Creators call for consent, fair pay, labels, and audits to keep humans in the loop.

Published on: Sep 12, 2025

The Real Cost Of AI? A World Without Human Creativity

What happens to imagination in the age of AI?

The speed, scale, and scope of generative AI are unlike anything we've dealt with before. Without guardrails, careers, industries, and Australia's cultural value could be erased in months, not years.

I've fought for inclusivity my entire career. I never expected to fight for humans to be included in making art and imagery. It sounds absurd, but it's happening daily.

For two years I've spoken with AI companies worldwide, urging them to partner with human talent instead of replacing it. No luck. One executive told me: "Why would we, when we can make them in MidJourney and retain the total profit?"

So I built an Ethical AI platform. Not to replace people, but to work with them and protect them. The catch: I don't want to have to release it. I'd rather live in a market where creators can work and be paid fairly without needing a defensive shield.

What's at stake for creatives

It became obvious how easily humans can be replaced across the board, not just in creative fields. Our society runs on a social contract: every sector needs the others to survive. Break that, and the ripple hits everyone.

I'm a third-generation member of Australia's creative community. I've seen the passion up close. Most didn't chase money; they chased meaningful work. That's the difference. And that's exactly what's under threat.

Australia's creative economy adds over $122 billion a year. Factor in the gig economy and support services, and it's much bigger. We're the teams behind every image, campaign, film, artwork, and story you see. If AI can hollow us out, every sector is next.

Why this wave is different

Past shifts (steam, electricity, the internet) took decades. AI spreads globally in days. Humans can't keep pace with product cycles that move at server speed.

The numbers are brutal. The World Economic Forum projects major job losses by 2030. In Australia, unemployment is creeping up, full-time roles are sliding, and wages are flat. Friends are losing work right now because companies are swapping people for avatars, scripts, and AI assets.

Forrester expects Australia's workforce could shrink by 11% by 2030. Leaders from Anthropic to former Google X executives are sounding the alarm about entry-level white-collar jobs. When the builders warn you, pay attention.

People want protections

Australians support stronger rules: fair compensation, clear labels on AI content, and actual consent. People don't want to be tricked. They want choice and respect.

Government action isn't about blocking innovation. It's about setting boundaries so creators and consumers aren't crushed. The EU is already moving faster on this with its AI Act. See the overview.

The economic math doesn't add up

We're told AI will add a $240 billion "dividend" over a decade. Meanwhile, free access to our work could strip $20-70 billion every single year from jobs, tax revenue, and cultural industries. That's a long-term promise versus an immediate, compounding loss.
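To make that comparison concrete, here is a rough back-of-envelope sketch in Python, assuming the cited figures and an even spread of the dividend across ten years:

    # Rough back-of-envelope comparison of the figures cited above.
    # Assumptions: the $240bn "dividend" accrues evenly over ten years,
    # and the annual hit to creative work sits in the cited $20-70bn range.
    DIVIDEND_TOTAL_BN = 240               # promised gain over a decade
    YEARS = 10
    LOSS_LOW_BN, LOSS_HIGH_BN = 20, 70    # estimated loss per year

    annual_dividend = DIVIDEND_TOTAL_BN / YEARS   # about $24bn a year
    decade_loss_low = LOSS_LOW_BN * YEARS         # $200bn over the decade
    decade_loss_high = LOSS_HIGH_BN * YEARS       # $700bn over the decade

    print(f"Annualised dividend: ${annual_dividend:.0f}bn per year")
    print(f"Projected loss:      ${LOSS_LOW_BN}-{LOSS_HIGH_BN}bn per year "
          f"(${decade_loss_low}-{decade_loss_high}bn over the decade)")
    # Even at the low end, the annual loss roughly cancels the annualised gain;
    # at the high end, it dwarfs it.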

The mental health cost is worse. People need purpose to function. We saw during COVID how fast things unravel when life loses meaning. Scale that across millions, and the social fabric frays.

All this to enrich (mostly) foreign tech firms. That's not innovation. That's a breach of the social contract.

What creatives can do now

  • Contracts: Add no-training, no-scraping, and no-synthetic-derivative clauses. Require consent, credit, and compensation for any AI use of your work or likeness.
  • Provenance: Use content credentials or similar methods to flag authorship and edits. Make authenticity traceable across platforms (a minimal metadata sketch follows this list).
  • Licensing: Price your IP. Stop giving assets away for "exposure." Offer paid licenses with clear usage limits.
  • Distribution: Build direct channels (email, memberships, communities). Reduce dependency on platforms that devalue human work.
  • Collective action: Join your guild/union/association. Push for standards on consent, compensation, and labelling.
  • Upskill with intent: Learn AI so you can direct it; don't let it direct you. Keep humans in the loop on taste, story, and ethics. If you need structured options, explore role-based AI training here: Courses by job.
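On the provenance point above, here is a minimal sketch of stamping authorship metadata into an image with Pillow. It is not a full content-credentials (C2PA) implementation, and the file names and author details are placeholders; a production workflow would attach signed credentials through dedicated tooling.

    # Minimal sketch: embed basic authorship metadata in an image's EXIF.
    # Placeholder names and paths; not a substitute for signed content credentials.
    from PIL import Image

    img = Image.open("artwork.jpg")            # hypothetical source file
    exif = img.getexif()
    exif[0x013B] = "Jane Example"              # Artist
    exif[0x8298] = "(c) 2025 Jane Example"     # Copyright
    exif[0x010E] = "Original human-made work; no AI training permitted"  # ImageDescription
    img.save("artwork_credited.jpg", exif=exif)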

What to ask policymakers for

  • Consent by default: No scraping or training on our data without opt-in and payment.
  • Compensation: Pay creators for training data, likeness, voice, and ongoing model use.
  • Labelling: Mandatory and visible labels on AI-generated and AI-edited content.
  • Audit and safety: Independent audits for bias, safety, and provenance. Penalties for abuse.
  • Creator funds: Redirect a share of AI profits to support jobs, training, and cultural work.

Have your say

A full submission has been lodged with the Productivity Commission, backed by global experts. A free eBook breaks down the issues in plain language. Now add your voice.

  • Make your own submission before 15 Sept. Even one line counts: "I do not consent to my data being given away."
  • Share this with your network. The more creators speak, the harder we are to ignore.

Our creativity, culture, and livelihoods are not free fuel for big tech.