Albanian Actress Sues Government After Her Face Becomes Its AI Minister

Albanian actress Anila Bisha sues the government over AI 'minister' Diella using her face and voice. She seeks €1M as a court weighs a ban; the state denies wrongdoing.

Categorized in: AI News, Government
Published on: Feb 14, 2026


Anila Bisha, a well-known Albanian theater and film actress, has sued the government over the use of her face and voice for "Diella," a virtual minister built with AI. She says she consented only to an online assistant for helping citizens with documents, not to being portrayed as a minister. Her legal claim seeks 1 million euros in compensation. The government denies the allegations and says it will defend its position in court.

Diella was unveiled last year by Prime Minister Edi Rama as a tool to oversee public procurement and fight corruption. The court is expected to consider a ban on any further use of Bisha's likeness. Albania is being cited as the first country to introduce a minister powered by AI. The outcome could influence how public institutions use real people's images and voices in digital projects.

Why this matters for public officials

AI avatars in government services are no longer hypothetical. This case tests consent, purpose limits, and public trust in one shot. The lesson is simple: if a person's likeness is involved, ambiguity is risk.

Key legal questions this case may shape

  • Scope of consent: Does permission for a general "assistant" cover a high-profile role like "minister"?
  • Likeness and voice rights: What counts as a match, and what clearance is required for image and voice cloning?
  • Purpose limitation: Can the state repurpose a likeness beyond the exact use initially agreed?
  • Reputational harm: How are harassment and unwanted attention weighed in damages?
  • Vendor responsibility: If a contractor built the avatar, who bears liability and defense costs?
  • Public transparency: What disclosures are needed so citizens know who and what they're interacting with?

What departments should do now

  • Inventory all projects using human faces or voices (live, pilot, or planned). Shut down anything without explicit, written consent.
  • Use consent forms that specify use case, role/title, channels, duration, revocation terms, and geographic scope.
  • Get separate approvals for titles that imply authority (e.g., "minister," "director," "inspector"). Avoid role inflation.
  • Secure voice rights independently from image rights. Many licenses don't cover synthetic voice reproduction.
  • Run a data protection impact assessment for any system using a real person's likeness or voice.
  • Route all likeness-based deployments through legal, the data protection officer, and the communications team before launch.
  • Add procurement clauses: warranties of non-infringement, likeness/voice clearances, audit rights, and indemnities.
  • Disclose clearly to users that they're interacting with AI. Include contact paths to a human official.
  • Watermark visuals and embed provenance signals where possible. Keep logs of prompts, outputs, and model versions (a data-shape sketch follows this list).
  • Stand up a takedown and revocation process that can replace assets within 48-72 hours across all channels.
  • Prepare a comms plan for harassment incidents tied to digital personas and provide support to affected individuals.
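For teams turning these items into a working process, the sketch below shows one way to structure a consent record and an asset log so that scope, voice rights, model versions, and revocation references live in one auditable place. It is a minimal illustration in Python; all field names, the scope check, and the example values are assumptions for demonstration, not a legal template.

```python
# Minimal, illustrative sketch only: field names, the scope check, and the
# example values are assumptions for demonstration, not a legal template.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json


@dataclass
class LikenessConsent:
    """Written consent scope for use of a real person's face or voice."""
    subject_name: str
    use_case: str                  # e.g. "online assistant for citizen documents"
    role_or_title: str             # the consented title, e.g. "assistant"
    channels: list                 # e.g. ["web portal"]
    covers_voice_cloning: bool     # voice rights should be secured separately
    valid_until: str               # ISO date; renew or retire before expiry
    revocable: bool = True
    geographic_scope: str = "national"


@dataclass
class AvatarAssetLog:
    """One entry per generated asset, to support audits and fast takedown."""
    asset_id: str
    model_version: str
    prompt: str
    output_uri: str
    consent_ref: str               # links the asset back to the signed consent
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )


def within_consented_scope(consent: LikenessConsent, proposed_title: str) -> bool:
    """Guard against role inflation: the deployed title must match the consented one."""
    return proposed_title.strip().lower() == consent.role_or_title.strip().lower()


if __name__ == "__main__":
    consent = LikenessConsent(
        subject_name="<person>",
        use_case="online assistant for citizen documents",
        role_or_title="assistant",
        channels=["web portal"],
        covers_voice_cloning=False,
        valid_until="2026-12-31",
    )
    # A "minister" deployment fails the check and should go back to legal review.
    print(within_consented_scope(consent, "minister"))  # False

    log_entry = AvatarAssetLog(
        asset_id="avatar-0001",
        model_version="model-v1",
        prompt="greeting clip for document portal",
        output_uri="s3://bucket/avatar-0001.mp4",
        consent_ref="consent-2025-017",
    )
    print(json.dumps(asdict(log_entry), indent=2))
```

Keeping the consent reference on every logged asset is what makes a 48-72 hour takedown realistic: revoke the consent record, query the log, and you have the full list of assets to replace.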

Policy checkpoints to lock in

  • Approval gates for any use of real-person likeness or voice (pilot and production).
  • Naming standards that avoid titles implying elected or appointed authority unless legally authorized.
  • Plain-language notices for citizens: purpose, data handling, and escalation options.
  • Retention limits for training data and generated assets; defined offboarding and deletion steps.
  • Restrictions on cross-border processing and model hosting, aligned with national policy.
  • SLAs for complaints, consent withdrawal, and media inquiries.

If the court orders a ban

Be ready to pause the avatar, swap assets to a neutral or synthetic face with full clearance, and update all creative and channels. Have budget and contracts set up for fast replacement. Keep essential services running with a human fallback while changes go live.
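One lightweight way to keep essential services running through such a pause is a kill switch that routes citizens to a human contact whenever the avatar is disabled. The sketch below is illustrative only; the configuration fields and contact address are hypothetical.

```python
# Illustrative sketch with hypothetical names: a simple kill switch that
# pauses the avatar and routes citizens to a human official instead.
from dataclasses import dataclass


@dataclass
class AvatarConfig:
    avatar_enabled: bool = True
    fallback_contact: str = "citizen-services@example.gov"


def handle_request(cfg: AvatarConfig, question: str) -> str:
    """Answer via the avatar, or degrade gracefully to a human fallback."""
    if not cfg.avatar_enabled:
        # Court-ordered ban or consent withdrawal: keep the service reachable.
        return (
            "The virtual assistant is temporarily unavailable. "
            f"Please contact {cfg.fallback_contact} for help with: {question}"
        )
    return f"[avatar responds] {question}"


if __name__ == "__main__":
    cfg = AvatarConfig()
    print(handle_request(cfg, "How do I renew my ID card?"))

    cfg.avatar_enabled = False  # flip the switch when a ban takes effect
    print(handle_request(cfg, "How do I renew my ID card?"))
```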

The bigger signal

AI can support public service, but consent, purpose limits, and transparency need to be non-negotiable. This case is a clear reminder: treat likeness like any other sensitive asset. Define it, document it, and respect its boundaries.

