Organisations turn employees into AI replicas as digital twin use grows

Companies are building AI replicas of employees to preserve knowledge and fill skills gaps - but most lack employment contracts covering consent, compensation, or what happens to the twin when staff leave.

Published on: Apr 13, 2026

Digital doppelgangers are entering the workplace. HR leaders need to act now.

Organisations are creating AI replicas of their employees to retain knowledge, boost productivity, and solve skills shortages. Gartner listed digital doppelgangers as one of its Future of Work Trends for 2026. But the technology raises urgent questions about consent, compensation, identity fraud, and whether these tools actually augment workers or replace them.

The concept isn't new. LinkedIn co-founder Reid Hoffman published a video in April 2024 showing himself in a Q&A with his AI twin. What's changed is the technology's sophistication. Digital doppelgangers now replicate not just what someone produces (their notes, emails, reports) but their behaviour, tone, and patterns of thinking.

Knowledge transfer at scale

Peterborough City Council made headlines last summer by converting Geraldine Jinks, a 35-year employee, into an AI chatbot called Hey Geraldine. The council used her written and recorded interactions to train a language model that captured her knowledge, approach, and helpfulness. Early testing showed occupational therapists saved 15 minutes per conversation.

This points to a real HR problem: how do you retain expertise from high-performing, long-serving staff before they retire? Digital twins offer a way to preserve that knowledge without overburdening the expert themselves.

The appeal is obvious in an ageing workforce. Organisations face a skills gap as experienced employees approach retirement. A digital twin of an expert can mentor colleagues, answer routine questions, and ensure institutional knowledge doesn't walk out the door.

The technology also works across languages. Swiss bank UBS deployed cloned versions of its analysts to share with clients, citing time savings and client demand for video content. Payment provider Klarna and Zoom have used digital doubles for announcements. One CEO built an avatar to handle routine meetings.

The efficiency question

But does this actually give people back time? Jon Dawson, Chief People Officer at hospitality company Lore Group, says no. HR leaders at an AI summit he attended reported that while AI sped up outputs, it made people work "faster, harder and actually spend less time thinking."

The risk is clear: digital twins become a way to extract more work, not less.

How they're built

Digital doppelgangers take different forms. Hey Geraldine is a chatbot trained on Geraldine's data. Other approaches combine AI avatars with language models. UBS analysts visited a studio where AI video creator Synthesia captured their voice and likeness, then used a language model to generate scripts from their reports.

Many organisations build them in-house with IT teams. Before hiring external consultancies, assess what problem you're actually trying to solve and whether existing tools can do it.

The initial build is critical. Datnexa, which worked with Peterborough Council, emphasises that dumping content into an AI system doesn't work. You need a contextual layer that captures job-specific examples and nuance. Weekly feedback huddles with team members built trust and drove adoption.
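The "contextual layer" idea can be illustrated with a minimal sketch: instead of dumping raw documents into a model, each piece of expert guidance is stored alongside a job-specific example and an audience, retrieved by simple keyword overlap, and only then wrapped into a prompt for a language model. Everything here is hypothetical: the names (`KnowledgeSnippet`, `retrieve`, `build_prompt`), the sample guidance, and the keyword-overlap retrieval are illustrative assumptions, not details of how Datnexa or Hey Geraldine actually work.

```python
# Illustrative sketch of a "contextual layer" over an expert's knowledge.
# Snippets carry guidance plus a worked example and an audience, so the
# language model receives context, not just raw content. All data and
# function names are invented for illustration.
import re
from dataclasses import dataclass

@dataclass
class KnowledgeSnippet:
    topic: str
    guidance: str     # the expert's actual advice
    example: str      # a concrete, job-specific example
    applies_to: str   # who this guidance is relevant to

SNIPPETS = [
    KnowledgeSnippet(
        topic="stairlift referrals",
        guidance="Check the stair width and power socket location first.",
        example="A referral stalled because the landing had no power socket.",
        applies_to="occupational therapists",
    ),
    KnowledgeSnippet(
        topic="grab rails",
        guidance="Confirm the wall material before recommending fixings.",
        example="Plasterboard walls need specialist anchors.",
        applies_to="occupational therapists",
    ),
]

def _words(text: str) -> set[str]:
    """Lowercase word tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question: str, snippets: list[KnowledgeSnippet]) -> KnowledgeSnippet:
    """Pick the snippet whose topic and guidance share the most words with the question."""
    q = _words(question)
    return max(snippets, key=lambda s: len(q & _words(s.topic + " " + s.guidance)))

def build_prompt(question: str, snippet: KnowledgeSnippet) -> str:
    """Wrap the retrieved snippet in role and example context for a language model."""
    return (
        f"You answer as an experienced adviser for {snippet.applies_to}.\n"
        f"Guidance: {snippet.guidance}\n"
        f"Worked example: {snippet.example}\n"
        f"Question: {question}"
    )

question = "What should I check for a stairlift?"
print(build_prompt(question, retrieve(question, SNIPPETS)))
```

A real build would swap the keyword overlap for embedding-based retrieval, but the design point survives the simplification: the structure around each snippet (audience, example, nuance) is what makes the twin's answers usable, not the volume of raw content.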

Data protection and identity fraud

The data used to train a digital twin is, as one expert put it, "gold dust." HR professionals should keep digital twins behind internal firewalls, not public-facing. The data must be "verified, validated and tested," and nothing should change until you're certain the answers are correct.

Identity fraud is a real threat. In 2024, WPP CEO Mark Read was targeted by an elaborate deepfake scam using AI voice cloning. The attack failed, but it showed what's possible.

Catrin Gaston-Penny, HR director at Cell and Gene Therapy Catapult, says the risk makes her "uncomfortable" about twinning individuals. To mitigate it, work closely with your IT and cybersecurity teams. Implement transparent data collection policies, robust privacy controls, and proper ethical AI review processes.

Built-in bias is another concern. If the person being replicated has unconscious biases, those could transfer to the digital twin, creating discrimination risks in HR processes like recruitment.

The employment contract gap

Most organisations lack clear employment agreements covering digital twins. Key questions need answers:

  • How will the employee be compensated for training their digital twin?
  • What happens to the doppelganger if they leave?
  • Do you have explicit written permission to use their image and voice?
  • Will the tool support their role or create extra work?

Gemma O'Connor, head of HR advisory at BrightHR, points out that employees will want to know whether this supports them or replaces them. The obvious risk, as one employment lawyer noted, is that workers won't be needed anymore, echoing outsourcing patterns where UK staff train cheaper replacements abroad.

If you use a worker's image after they leave, legal exposure increases. California's Assembly Bill 2602 gives actors and artists control over their digital likeness. The UK has no equivalent yet, though Equity, a creative workers' union, has called on the government to establish automatic "personality rights" over voices, faces, and bodies.

The Data Use and Access Act 2025 updated UK data protection law and requires human oversight of automated decision-making. If you operate in the EU, the EU AI Act carries "quite heavy penalties" for data protection breaches. HR must work with legal, risk, and compliance teams before rolling out this technology.

When digital twins make sense

The best use cases are narrow and specific. Hey Geraldine works because it's "a scalpel, not a Swiss Army knife": focused on one job. It won't work for roles requiring judgment calls, nuance, or handling situations without clear-cut answers.

Gemma O'Connor is sceptical whether a digital twin could manage the breadth of HR queries, where answers aren't linear. But she sees potential for specific tasks: customer complaints, FAQs, routine inquiries. The catch: you need allocated resources for human oversight.

The human risk

Business transformation expert Allister Frost uses a low-key digital doppelganger to assist with tasks. But he won't send it to deliver talks. "Absolutely not, because that's not me," he said. "You need to really ensure that as the technology gets smarter, the humans don't get smaller."

Catrin Gaston-Penny agrees: "The human touch is still important. People don't have to be everywhere all the time, but showing up in person helps."

Jon Dawson has seen digital twins used in recruitment to interview candidates. He questions what a candidate would think: "Would you feel that the organisation really values you as an individual?"

The technology is moving fast. Synthesia's senior strategic advisor Kevin Alster expects the next decade to shift from "static, one-way content to interactive, conversational experiences powered by AI agents." But Dawson takes a long-term view: why invest resources now when the technology could change significantly in three years?

What HR needs to do now

Gartner places digital twins near the beginning of its Hype Cycle, meaning the technology is still maturing. If you're considering it, start with these steps:

  • Define the specific problem you're solving. Avoid broad, vague use cases.
  • Secure explicit written consent from the employee, covering compensation, data use, and post-employment scenarios.
  • Work with legal, compliance, and IT teams to understand your obligations under data protection and employment law.
  • Build context into the training data, not just raw content.
  • Test thoroughly before launch. Gather feedback from actual users.
  • Assign clear human oversight; consider a data protection officer to review content and policies regularly.
  • Communicate transparently with staff about why the tool exists and how it will be used.

Nic Elliott, HR director at law firm Actons Solicitors, says the technology feels like "quite a minefield to navigate well." That's accurate. But with clear governance, explicit consent, and genuine focus on augmenting rather than replacing people, digital doppelgangers can solve real workforce problems.

The question isn't whether to adopt the technology. It's whether you're ready to do it responsibly.

