Creators to Parliament: Stop foreign AI from stealing Canadian culture

Canadian creators urge Parliament to stop foreign AI companies from scraping their work and to enforce copyright. Witnesses call for dataset transparency, deepfake rules, and no expansion of the TDM exception.

Published on: Oct 17, 2025

Parliament urged to rein in foreign AI companies scraping Canadian culture

AI firms are ingesting music, art, and writing without consent, credit, or pay. Five leaders from Canada's creative sector told Parliament that this is already draining income from working artists and confusing the market with lookalike outputs.

If you write, compose, design, or film, this isn't abstract policy talk. It's your IP, your reputation, and your paycheck.

What witnesses told Parliament

OCAD University's Kelly Wilhelm said AI has already disrupted the value chains that creatives depend on. Her message: set simple, clear, harmonized standards that protect rights while still enabling responsible innovation.

Music leaders described the scale of the problem. "Nearly every song ever written by a Canadian songwriter has already been scraped and is already stolen by these AI companies without consent, credit or compensation," said Margaret McGuffin, CEO of Music Publishers Canada.

The legal fault line: copyright and TDM

Canada's Copyright Act allows text and data mining (TDM) for non-commercial purposes by organizations that already have lawful access. For-profit AI developers aren't covered. The catch: most major AI companies are outside Canada, making enforcement slow and costly.

Witnesses urged MPs to reject any push to expand the TDM exception. "The TDM exception would not facilitate growth in either the creative or technology sector… it would certainly deprive creators of the economic benefits of their works," said Jennifer Brown, CEO of SOCAN.

For context, see the Copyright Act and the Heritage Committee.

Transparency and deepfakes

Creators can't enforce rights if they can't see how their work is used. Witnesses called for mandatory labeling of AI-generated content and for AI firms to track and publicly disclose training datasets.

"If we're going to unlock human consciousness with AI, shouldn't it be able to write a bibliography?" said Patrick Rogers, CEO of Music Canada. He also urged action against deepfakes that mimic real artists: "What's illegal on paper should be illegal online. Putting your words in my mouth is not free speech."

Cultural sovereignty is at stake

"AI is fundamentally a homogenizing tool," said OCAD U's Wilhelm. Models trained on massive datasets tend to reproduce the most common patterns, pushing minority languages and stories to the margins.

Quebec's ArtIA warned that francophone, Indigenous, and other minority communities risk being diluted or ignored. Marc-Olivier Ducharme called AI "the next frontier of technological colonization" and proposed "cultural data trusts" - community-governed datasets that preserve cultural data and set rules for access and use.

What creatives can do now

  • Assert your rights: register works, add clear copyright notices, and keep timestamped source files and stems (see the hashing sketch after this list).
  • Update contracts: add "no AI training" clauses in licenses and commissions; scrutinize TDM and data-use terms.
  • Watermark and label: use content credentials and visible statements on AI use in your releases and portfolios.
  • Monitor and act: set up alerts for your name/titles, document suspected clones, and file takedowns promptly.
  • Band together: engage with SOCAN, Music Publishers Canada, unions, and collectives to scale enforcement.
  • Skill up, ethically: learn AI tools that respect IP so you can produce faster without giving away your catalog. Explore curated options by job at Complete AI Training.
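
If you want a lightweight way to "keep receipts," as the first and fourth bullets suggest, a short script can record cryptographic hashes of your source files along with a UTC timestamp. The sketch below is illustrative only, not legal advice; the folder name, manifest filename, and helper names are placeholders to adapt to your own workflow.

```python
# Minimal sketch (illustrative only): record SHA-256 hashes of your source
# files and stems with a UTC timestamp, so you have dated evidence of what
# you created. Folder and manifest names below are placeholders.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def hash_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def build_manifest(source_dir: str, manifest_path: str = "manifest.json") -> None:
    """Hash every file under source_dir and write a timestamped JSON manifest."""
    recorded_at = datetime.now(timezone.utc).isoformat()
    entries = [
        {"file": str(p), "sha256": hash_file(p), "recorded_at": recorded_at}
        for p in sorted(Path(source_dir).rglob("*"))
        if p.is_file()
    ]
    Path(manifest_path).write_text(json.dumps(entries, indent=2))


if __name__ == "__main__":
    build_manifest("stems")  # e.g. a folder of stems, drafts, or project files
```

A hash manifest on its own does not prove authorship, but committing it to version control or emailing it to yourself gives the timestamp outside corroboration, which helps when you document suspected clones or file a takedown.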

What policymakers were asked to do

  • Affirm that unlicensed AI training on copyrighted works is infringement under current law.
  • Keep the TDM exception non-commercial and resist expansion to for-profit AI developers.
  • Require labeling of AI-generated content, and mandate dataset tracking, disclosure, and auditability.
  • Crack down on deepfakes that impersonate artists and public figures.
  • Fund culturally responsible AI infrastructure, including cultural data trusts with community governance.
  • Resource enforcement to address foreign firms and lower the cost of action for creators.

Why this matters to your career

Unlicensed training lets lookalike outputs substitute for your work and flood the feed with imitations. Without transparency and enforcement, attribution and royalties vanish, and minority voices get drowned out.

Clear rules won't block innovation. They set the floor so creators can build on it - and get paid.

What's next

The House of Commons Standing Committee on Canadian Heritage will hold three more hearings on Oct. 22, 27, and 29. Expect heavy debate on TDM, labeling, dataset disclosure, and funding for cultural data infrastructure.

Stay alert, tighten your contracts, and keep receipts. The outcomes here will shape whether AI works for creators - or feeds on them.

