Virginia content creator warns followers after AI deepfake uses her image to sell life insurance

A Virginia YouTuber found her face used in AI-generated videos selling life insurance she has never promoted. The U.S. has no comprehensive law against deepfake fraud, leaving creators and consumers exposed.

Categorized in: AI News, Insurance
Published on: Apr 02, 2026

Deepfake videos used to fraudulently sell life insurance without creator's consent

A Virginia content creator discovered her face and likeness were used in AI-generated videos promoting life insurance, a product she has never sold and knows nothing about.

Karen Flowers runs Karen of Curl House, a YouTube channel with more than 90,000 subscribers that focuses on hair care tutorials. Her cousin sent her a video showing Flowers' image paired with a different voice pitching life insurance.

"I post videos on everything relevant to hair, and I know my image," Flowers said. "So when I saw this, I knew immediately it was a deepfake."

Deepfakes are digitally altered videos that can convincingly replicate a person's appearance while changing their voice or words. Viewers unfamiliar with Flowers' actual content could easily believe she endorses the insurance product.

Scammers exploit AI to target consumers

The incident reflects a broader threat. Cybersecurity experts warn that scammers use AI to search the internet for personal information about targets, then craft customized phishing emails, text messages, and fake voice calls designed to extract money or sensitive data.

Alex Nette, founder of Hive Systems, said AI allows bad actors to "understand who you are, where you work, maybe where you live" before launching targeted attacks.

The U.S. currently lacks comprehensive legislation to regulate AI-generated deepfakes, leaving creators and consumers vulnerable to fraud.

What this means for insurance professionals

For insurance workers, deepfake scams present two distinct risks. Fraudsters may impersonate licensed agents or company representatives to solicit customers. At the same time, an agent's own image or likeness could be misused without consent to sell competing products.


