Ottawa signals fresh online harms bill, with deepfake protections split across files
The federal government is preparing new online harms legislation. Artificial Intelligence Minister Evan Solomon said Culture Minister Marc Miller will bring the bill forward, separate from a privacy bill that Solomon plans to table and from a justice bill, introduced last year, that addresses certain deepfakes.
Miller's office offered no further detail, but noted the government intends to act quickly to better protect Canadians, especially children, and that platforms have a role in reducing online harm.
What's on the table
- New online harms bill: Coming from Canadian Heritage, with scope and enforcement details to be announced.
- Privacy reform: Solomon said his bill could include a right to delete deepfakes, part of an effort to modernize privacy protections.
- Justice measures: A justice bill tabled last year includes provisions to criminalize certain non-consensual deepfakes. Experts say many images spreading on X may fall outside its current reach; the government has not said whether it will amend the bill.
- Potential youth measures: Reporting this week suggests the forthcoming harms bill could include a social media ban for children under 14; Miller did not confirm this.
Why this is moving now
Sexualized deepfakes generated by Elon Musk's Grok have spread on X in recent weeks, prompting a global backlash. Solomon said the government is working on a "suite of protections" to restore trust and shield Canadians from the worst online abuses.
Context and history
In 2024, the government introduced the Online Harms Act (Bill C-63), which would have created a new regulator and required 24-hour takedowns for content that sexually exploits a child and for intimate content shared without consent, including deepfakes. It did not become law.
Under Prime Minister Mark Carney, the government signalled it would not reintroduce that bill in the same form. Instead, it would address pieces of online harm through multiple instruments: heritage, privacy, and justice.
What public servants should prepare for
- Cross-file alignment: Expect interaction between Canadian Heritage (regulatory obligations), ISED/Privacy (individual rights such as deletion), and Justice (criminal enforcement). Plan for interdepartmental coordination.
- Platform obligations: Possible takedown timelines, reporting duties, and risk-mitigation requirements for social platforms and hosting services. Assess your program areas that intersect with platform compliance, audits, or penalties.
- Rights of individuals: A potential right to delete deepfakes will raise process, verification, and redress questions. Map out workflows for identity validation and secure removal requests.
- Youth protections: If age-based access limits are proposed, anticipate standards for age assurance, data minimization, and oversight. Consider procurement clauses and guidance for departments engaging with social platforms.
- Operational readiness: Begin outlining policy, guidance, and public communications for rapid takedown expectations, complaint handling, and evidence preservation for law enforcement; a minimal illustrative sketch of such an intake record follows this list.
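The items above are policy work rather than engineering, but a concrete data model can make the workflow questions easier to discuss. The sketch below is purely illustrative, in Python: it assumes a hypothetical intake record (`RemovalRequest`) with fields for identity verification, a 24-hour service deadline modelled on the takedown window in the 2024 bill, and basic evidence-preservation metadata. None of the field names, statuses, or timelines come from any bill, department, or existing system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from enum import Enum
from typing import Optional
import uuid

# Hypothetical verification states for an intake workflow; not drawn from any bill or system.
class VerificationStatus(Enum):
    PENDING = "pending"      # requester identity not yet validated
    VERIFIED = "verified"    # requester identity validated
    REJECTED = "rejected"    # validation failed; route to a redress process

@dataclass
class RemovalRequest:
    """Illustrative record for a deepfake/intimate-content removal request."""
    content_url: str
    requester_contact: str                  # stored minimally, reflecting data minimization
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    request_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    verification: VerificationStatus = VerificationStatus.PENDING
    evidence_hash: Optional[str] = None     # hash of preserved evidence for law enforcement
    resolved_at: Optional[datetime] = None

    def takedown_deadline(self, window_hours: int = 24) -> datetime:
        """Deadline if a 24-hour takedown window, as in the 2024 bill, were to apply."""
        return self.received_at + timedelta(hours=window_hours)

    def is_overdue(self, now: Optional[datetime] = None) -> bool:
        """True if the request is unresolved past its deadline."""
        now = now or datetime.now(timezone.utc)
        return self.resolved_at is None and now > self.takedown_deadline()

# Example: a request received now is not yet overdue.
req = RemovalRequest(content_url="https://example.com/post/123",
                     requester_contact="complainant@example.org")
print(req.takedown_deadline().isoformat(), req.is_overdue())
```

Storing only a hash of preserved evidence rather than the content itself reflects the data-minimization point above; whether that would satisfy law-enforcement needs is exactly the kind of question worth mapping out now.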
Open questions to watch
- Which entity will regulate and how will it coordinate with privacy and criminal enforcement?
- What content categories will trigger 24-hour removal, and how will appeals be handled?
- How will the law address cross-border platforms and hosting outside Canada?
- Will the justice bill be updated to cover the full range of non-consensual deepfakes spreading on X?
- If there's a youth social media ban, what age assurance methods will be permitted and how will privacy be protected?
For background
- Parliament: Bill C-63 (2024), Online Harms Act (archived status)
- Canadian Heritage: Online safety initiatives
Skills and capacity
Policy teams working on online safety, privacy, and platform regulation will need deeper literacy in AI-generated media, detection limits, and redress workflows.