Deepfakes in South Africa: Strong Laws, Weak Enforcement
Deepfakes warp consent and weaponise identity, driving scams, false endorsements, and abuse. South African law already covers this conduct; enforcement lags behind: slow courts, slow platform responses, and slow unmasking of perpetrators.

Deepfakes and South African law: remedies exist, but enforcement lags
Deepfakes are AI-generated forgeries of faces, voices and likeness. They distort consent, weaponise identity, and erode trust. The legal frameworks to respond exist in South Africa. The bottleneck is enforcement.
What deepfakes look like
- Text: fake receipts, IDs, and documents.
- Photo: face/body swaps used for memes or impersonation.
- Audio: voice cloning via text-to-speech, often targeting public figures.
- Video: face and motion transfer; common in non-consensual pornography and false endorsements.
Why they matter to legal teams
- Deception: fabricated media that looks authentic.
- Harm enablement: reputational damage, financial scams, and misrepresentation.
- Low barrier to abuse: anonymous publishing at scale.
Recent South African incidents
High-profile cases show the threat is not theoretical. Leanne Manas's image was used to push fake endorsements on Facebook and TikTok. A deepfake of Elon Musk induced South Africans to invest in a fraudulent scheme. In 2025, Professor Salim Abdool Karim was impersonated in a video spreading anti-vaccination claims while promoting counterfeit medicine.
The legal toolkit: what already applies
- Cybercrimes Act 19 of 2020: criminalises disclosure of intimate images without consent and other malicious communications. The text of the Act is published by the Department of Justice.
- Electoral Act 73 of 1998: bans publishing false information intended to influence elections.
- Films and Publications Act 65 of 1996 (as amended): prohibits online distribution of private sexual photographs and films intended to cause harm.
- Protection of Personal Information Act (POPIA): limits unlawful processing of personal information; enables data subject requests for correction/deletion.
Common-law remedies that fit deepfakes
Personality rights protect privacy and identity. Unauthorised use of name, likeness, or voice can amount to an iniuria (violation of personality rights), false endorsement, or passing off.
- Identity appropriation: The Supreme Court of Appeal confirmed protection against exploiting another's identity without consent in Grütter v Lombard.
- Exploitation of minors' images: a surfing magazine's use of a 12-year-old's photo as a pin-up, without consent, led to an award of damages and costs.
- False endorsement: Basetsana Kumalo succeeded against a business that used her shopping photos in an advert without permission; the court found her identity and privacy infringed.
These principles map cleanly onto deepfakes: false endorsements, election disinformation, and non-consensual pornography all trigger liability.
Where enforcement breaks
- Access to justice: court backlogs make litigation expensive and slow; pro bono support is thin.
- Platform reach: courts can assert jurisdiction, but serving orders abroad and compelling compliance is costly and slow; takedowns often arrive after the damage.
- Anonymity shields perpetrators: fake profiles, burner devices, and limited SAPS capacity stall tracing.
- Data disclosure delays: platforms often drag their feet on unmasking accounts.
Playbook for legal teams: act fast, preserve leverage
Immediate steps (first 24-48 hours)
- Preserve evidence: capture URLs, handles, timestamps; take screen recordings showing context and metadata; hash files; note date/time and chain of custody.
- Issue takedown demands: use platform tools and send legal notices citing the Cybercrimes Act, POPIA, Films and Publications Act, and defamation/iniuria. Ask for expedited removal and retention of logs.
- Criminal report: open a case for malicious communications (e.g., intimate image disclosure) under the Cybercrimes Act; request a section 205 subpoena via the prosecutor to secure subscriber data.
- POPIA action: send a section 24 correction/deletion request to the responsible party (POPIA's term for the entity determining the processing); lodge a complaint with the Information Regulator if it does not comply.
- Protection order (where harassment present): apply under the Protection from Harassment Act; courts can direct service providers to disclose information identifying the originator.
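The evidence-preservation step above can be sketched as a small script: hash each captured file and append a timestamped chain-of-custody record. The field names, log file, and `custodian` label here are illustrative assumptions, not a prescribed forensic standard; real matters should follow your forensic provider's procedures.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def preserve(path: str, source_url: str, custodian: str) -> dict:
    """Compute a SHA-256 hash of a captured file and log a custody record."""
    data = Path(path).read_bytes()
    record = {
        "file": path,
        "sha256": hashlib.sha256(data).hexdigest(),  # fingerprint of the exact bytes captured
        "source_url": source_url,                    # where the content was found
        "captured_at_utc": datetime.now(timezone.utc).isoformat(),
        "custodian": custodian,                      # who made the capture
    }
    # Append-only log: later tampering with the file is detectable
    # because its hash will no longer match this entry.
    with open("custody_log.jsonl", "a", encoding="utf-8") as log:
        log.write(json.dumps(record) + "\n")
    return record
```

Keeping the log append-only (one JSON object per line) means each capture is independently verifiable later by re-hashing the file and comparing against the recorded digest.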
Civil remedies
- Urgent interdict: High Court relief against the perpetrator and, where justified, platform entities; include "John Doe" respondents if identities are unknown.
- False endorsement/passing off: where the deepfake implies commercial support; claim damages and corrective statements.
- Privacy and identity claims: iniuria-based damages; seek delivery up/destruction of infringing material.
- Preservation and disclosure orders: request retention of data and orders compelling platforms to disclose account information; in criminal matters, leverage section 205 processes for speed.
Cross-border service and evidence
- Service abroad: use the Hague Service Convention channels; budget for delays.
- Evidence abroad: letters of request under the Hague Evidence Convention; align with platform policies for law-enforcement disclosures to avoid duplication.
What to ask platforms for
- Immediate takedown and geo-blocking.
- Preservation of logs, IPs, and device fingerprints.
- Disclosure of subscriber and payment info tied to the account.
- Confirmation of any paid amplification or ad spend linked to the content.
Policy fixes that would move the needle
- Capacity building: fund specialised digital-forensics units; formal training for SAPS and prosecutors in media authentication and trace requests.
- Platform accountability: statutory service-level timelines for takedowns and data disclosures; penalties for non-compliance; binding local points of contact.
- Proactive signals: require deepfake tools and platforms to embed content provenance and watermarking (e.g., C2PA Content Credentials) to aid tracing and user warnings.
- Clear user redress: fast-track procedures for false endorsements and intimate-image abuse; template orders and standard forms at Magistrates' Courts.
- Codes under POPIA: sectoral codes obliging platforms to process identity data lawfully and respond promptly to deletion requests tied to synthetic media abuse.
Compliance checklist for counsel
- Map facts to offences and delicts (Cybercrimes Act, POPIA, Films and Publications Act, defamation/iniuria).
- Run parallel tracks: criminal complaint, POPIA complaint, and civil interdict.
- Lock down evidence early; maintain an audit trail.
- Target distribution: search engines, mirrors, reposts, and link aggregators.
- Prepare affidavits from media-forensics experts to support urgency and authenticity analysis.
Bottom line
South African law already prohibits deepfake abuses of identity, privacy, and reputation. The gap is speed: slow courts, delayed platform responses, and hard-to-trace perpetrators. Close that gap with fast evidence preservation, parallel legal tracks, and policy reform that compels timely platform action.
For the statutory text of malicious communications offences, see the Cybercrimes Act (Act 19 of 2020).