Is This My Face? An Actress Confronts Her AI Double as the Law Lags Behind

Performer Briony Monroe says an AI actress looks eerily like her; Particle 6 denies it. Unions urge tighter consent and contracts as voice models raise fresh alarms.

Published on: Nov 26, 2025

"I believe my face was used to create an AI actress" - what creatives need to know

Performer Briony Monroe says she was stunned when she saw an AI-generated actress named "Tilly Norwood" who looked uncannily like her. One medieval-themed image in particular stopped her cold: the face, the vibe, the presence all felt near-identical.

She told STV News the resemblance was so strong she had to sit down. Friends and colleagues saw it too.

The AI lookalike: Tilly Norwood

"Tilly Norwood," made by Particle 6, is a computer-generated actress that can be restyled and dropped into different scenes. From clip to clip, the face shifts slightly. But for Briony, one image crossed a line.

"That picture is the one that I feel has a striking resemblance to myself," she said. "The actress changes in every single clip and picture that you see it in."

What the company says

Particle 6 denies using Briony's likeness. A spokesperson said: "Briony Monroe's likeness, image, voice, or personal data have not been used in any way to create Tilly Norwood. Tilly was developed entirely from scratch using original creative design. We do not and will not use any performer's likeness without explicit consent and fair compensation."

Briony, now supported by her union, Equity, says the situation highlights how exposed performers are as AI tools get better at mimicking human identity.

Voice-over warning signs: the ScotRail case

Equity is also backing voice-over artist Gayanne Potter. She believes an AI voice model called "Iona," supplied by Readspeaker and used by ScotRail, was built from her recordings.

Gayanne says she thought her initial work with the company would support accessibility. Years later, a producer flagged the "Iona" model online and raised concerns it could be used for any kind of voice work. She wants the voice removed. Readspeaker says it has a contract with her and has "comprehensively addressed her concerns." ScotRail continues to work with the firm.

"You have a right to say no," Gayanne said. "It shouldn't be that people can take things and use things without your consent."

The legal gap

Lawyers say the rules haven't kept up. Barbara Neilan of Jamieson Law put it plainly: if an AI tool generates an output, who owns it - the user, the machine, or no one? Copyright wasn't built for this, and that's creating friction for performers whose identities are their livelihoods.

The UK Government is considering new measures next year. Unions like Equity are pushing now for explicit protections over image, voice, and likeness.

Practical steps for creatives

If your face, voice, or style is your income, act like it's IP. Here's a quick, workable checklist:

  • Contracts: Add clear "no training, no cloning" clauses. Specify that your image, voice, and data can't be used to build models, now or later.
  • Scope and usage: Define where, how long, and in what formats your work can be used. Ban sublicensing without your written approval.
  • Audit rights: Include the right to request logs of where your assets are stored or deployed, plus removal on request.
  • Compensation triggers: If synthetic versions are created or used, set rates and consent steps in advance.
  • Watermarks and fingerprints: Use subtle visual watermarks or audio markers on portfolio samples to discourage scraping.
  • Proactive monitoring: Run reverse image searches and spot-check synthetic voice marketplaces. Document dates, links, and evidence.
  • Union and legal support: If something looks off, contact your union and a lawyer early. Speed matters.
  • Platform settings: Review terms on marketplaces, casting sites, and social platforms. Opt out of data training where possible.
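
The "proactive monitoring" step above hinges on keeping evidence you can later rely on: a dated record of where a suspect image or voice model appeared, plus a cryptographic hash proving a saved copy hasn't been altered. Here is a minimal sketch in Python using only the standard library; the function name `log_evidence` and the `evidence_log.jsonl` file are illustrative choices, not a specific tool.

```python
import datetime
import hashlib
import json

def log_evidence(url, note, media_bytes=b"", log_path="evidence_log.jsonl"):
    """Append one timestamped evidence entry to a JSON Lines file.

    If you saved a copy of the suspect image/audio, pass its raw bytes so a
    SHA-256 fingerprint is recorded; the hash lets you later demonstrate the
    saved file is unchanged since the date of the entry.
    """
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "url": url,
        "note": note,
        "sha256": hashlib.sha256(media_bytes).hexdigest() if media_bytes else None,
    }
    # One JSON object per line keeps the log append-only and easy to diff.
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

# Example: record a listing you found on a synthetic-voice marketplace.
entry = log_evidence(
    "https://example.com/marketplace/listing",
    "Voice model resembling my demo reel",
    media_bytes=b"saved audio sample bytes",
)
```

An append-only text log like this is deliberately low-tech: it costs nothing, survives tool changes, and gives your union or lawyer a dated trail to work from.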

For a deeper look at current guidance, see Equity's resources on AI and performers' rights and the UK IPO's overview of AI and intellectual property.

Where this is heading

Expect more disputes like Briony's and Gayanne's. Expect tighter contracts, more standardized consent flows, and more work for producers who want to stay compliant.

If you're building a career on your image or voice, the play is simple: get your paperwork in order, keep receipts, and make consent the default.

Your identity is your asset. Guard it like one.

