Poland urges EU action on AI-made TikTok Polexit videos amid Russian disinformation claims

Poland asked the EU to probe TikTok over AI 'Polexit' videos targeting young voters. Officials allege a coordinated push that may breach the DSA.

Published on: Jan 01, 2026

Poland asks EU to act on AI-generated TikTok push for "Polexit"

Poland has asked the European Commission to open proceedings against TikTok after a wave of AI-generated videos urged young Poles to back "Polexit" and attacked the pro-EU government. Officials say the content shows signs of an organised influence effort and point to Russian-style syntax in some scripts.

Deputy digital affairs minister Dariusz Standerski sent a formal request to European Commissioner Henna Virkkunen, arguing the campaign threatens public order, information security and election integrity. The government says TikTok failed to moderate AI content and did not provide clear transparency on its origin, which would put it at odds with the EU's Digital Services Act (DSA).

What happened

A TikTok channel published AI-generated videos featuring young women wearing Polish symbols and speaking directly to young voters. Some clips promoted leaving the EU; others attacked Prime Minister Donald Tusk's government. The account's profile carried an anti-EU slogan linked to radical-right leader Grzegorz Braun, a Polexit advocate.

Standerski said the surge in such videos in recent days points to a coordinated campaign. Government spokesman Adam Szłapka added there was "no doubt" it was Russian disinformation, citing phrasing characteristic of Russian syntax in some voiceovers.

The legal move: use of the DSA

In his letter, Standerski asked the Commission to initiate DSA proceedings against TikTok, arguing the platform lacks adequate moderation for AI content and fails to provide effective transparency on source and generation methods. He says that undermines core DSA objectives on disinformation prevention and user protection.

The DSA, in force since 2022, sets accountability rules for large platforms, including risk assessments, mitigation, and transparency. Earlier this month, social network X was fined €120 million for non-compliance. For context on the law, see the European Commission's overview of the DSA: Digital Services Act.

Platform response so far

The TikTok channel in question was removed after numerous user complaints, according to local media. Investigators note the account had existed since May 2023, previously posting English-language content unrelated to Poland, before rebranding on 13 December 2025 under a Polish name and pivoting to Polexit content.

Political backdrop

Two recent polls show 25% of Poles now support leaving the EU, a significant increase, though most still prefer to remain. Support for Braun has risen; he finished fourth in this year's presidential race and his Confederation of the Polish Crown (KKP) is gaining attention.

Why this matters for government, IT, and development teams

  • Treat AI-synthetic media as a standing risk. Build playbooks for content provenance, rapid response, and public advisories. Make "prebunking" part of comms (explain tactics before they land).
  • Push for provenance at scale. Adopt and demand standards like C2PA content credentials. Require visible labels on AI-generated media and machine-readable metadata.
  • Upgrade detection. Monitor for sudden account pivots (topic, language, audience), identical scripts across avatars, voiceover artifacts, and mismatched lip-sync. Flag Russian-syntax patterns as one signal, not proof.
  • Rate-limit suspect growth. Throttle bulk uploads, recycled audio, and mass reposting. Pair device fingerprinting with behavior signals to catch coordinated networks.
  • Audit recommender systems. Under the DSA, very large platforms must assess systemic risks. Log how synthetic content spreads, and cut amplification when integrity risks are confirmed.
  • Preserve evidence. Maintain cryptographic hashes, timestamps, and moderation notes for regulator requests and independent audits.
  • Empower users. Provide one-click reporting for "synthetic political content," show traceable label history, and surface verified sources in search results.
  • Policy teams: prepare for DSA inquiries. Document risk assessments, mitigation steps, and transparency reports. Expect requests for detailed technical measures.
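One of the detection signals above, identical scripts recycled across avatar accounts, can be approximated with a simple fingerprinting pass. The sketch below is illustrative only: the `(account_id, script)` data model is an assumption, not TikTok's API, and real pipelines would fingerprint transcripts obtained via speech-to-text and use fuzzier matching than exact hashes.

```python
import hashlib
import re
from collections import defaultdict

def normalize(script: str) -> str:
    # Lowercase and collapse punctuation/whitespace so trivial edits
    # (casing, spacing) don't hide an otherwise identical script.
    return re.sub(r"[\W_]+", " ", script.lower()).strip()

def find_shared_scripts(posts):
    """Group account IDs by a fingerprint of their video script.

    `posts` is an iterable of (account_id, script) pairs -- a
    hypothetical input format for illustration.
    """
    groups = defaultdict(set)
    for account_id, script in posts:
        digest = hashlib.sha256(normalize(script).encode()).hexdigest()
        groups[digest].add(account_id)
    # Only fingerprints reused by multiple accounts are suspicious.
    return {d: accts for d, accts in groups.items() if len(accts) > 1}

posts = [
    ("acct_a", "Poland must leave the EU now!"),
    ("acct_b", "poland must leave the eu  now"),
    ("acct_c", "Unrelated cooking video script."),
]
shared = find_shared_scripts(posts)
```

Exact-hash grouping like this catches only copy-paste reuse; in practice teams pair it with near-duplicate techniques (e.g. shingling or embedding similarity) and the behavioral signals listed above.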

What to watch next

  • Whether the European Commission opens a formal DSA case against TikTok and requests interim measures.
  • Any platform-wide updates from TikTok on synthetic media labeling and provenance checks. Current rules are here: TikTok Community Guidelines.
  • Cross-platform spillover of the content, reuse of the AI assets, and coordinated accounts repeating the same scripts.
  • Domestic investigations into funding, network links, and third-party services used to generate and seed the videos.


Bottom line: Poland is pressing the EU to enforce the DSA on TikTok over an AI-driven disinformation push. Expect closer scrutiny of synthetic media labeling, provenance, and recommender systems, and prepare your playbooks accordingly.

