Hungary Implements EU AI Act Framework: Broad Jurisdiction, New Watchdogs, Tough Fines

Hungary has set up national enforcement for the EU AI Act, covering any AI system output used in Hungary. Legal teams should map exposure, classify systems, prepare for oversight, and expect tough fines.

Published on: Nov 07, 2025

Hungary sets up national enforcement for the EU AI Act: what legal teams need to know

Hungary has moved fast. Two instruments, Act LXXV of 2025 and Government Decree No. 344/2025 (X.31), create the institutional framework to enforce the EU AI Act at national level. The message is straightforward: companies operating in Hungary should keep building and proving EU AI Act compliance.

The scope is wide. Any use of "AI system output" in Hungary can trigger jurisdiction, no matter where the system was marketed or deployed, or where the provider sits. That brings downstream users firmly into view, not just developers and distributors.

Authorities and roles

  • Notifying Authority (Article 28): The National Accreditation Authority (Nemzeti Akkreditációs Hatóság).
  • AI Market Surveillance Authority (Article 70): The Minister of Enterprise Development (vállalkozásfejlesztési miniszter).
  • Default competence: These two authorities handle all implementation tasks unless another law assigns a specific matter elsewhere.

Market surveillance: key tasks

  • AI regulatory sandbox: Set up and operated by the Market Surveillance Authority.
  • Single point of contact: Performs the duties under Article 70 of the EU AI Act.
  • Business-to-business oversight: Handles market surveillance where the client is not a consumer and the matter concerns assessment of an AI system.
  • Enforcement tools: Orders to stop unlawful conduct, measures to restore compliance, administrative agreements in writing, and fines up to the EU AI Act maximums.
  • Real-world testing: Oversight for high-risk AI systems tested in real-world conditions.

Sector interplay and special cases

If an AI system is marketed or used in a sector overseen by a sectoral market surveillance authority, the AI Market Surveillance Authority acts as a special expert authority. The sectoral authority must obtain its opinion before deciding on EU AI Act compliance.

There is still uncertainty for matters outside the AI Market Surveillance Authority's remit and not captured by a sectoral market surveillance procedure, for example issues that look more like data protection cases. Expect guidance or practice to fill these gaps.

Financial services

For high-risk AI systems used by financial institutions, where the system directly provides financial services, the Hungarian National Bank (Magyar Nemzeti Bank) takes the role of market surveillance authority.

Hungarian Artificial Intelligence Council

The new framework creates an advisory body, the Hungarian Artificial Intelligence Council. It has no decision-making powers but carries weight across policy and practice.

  • Guides national AI strategy and policy.
  • Focuses on fundamental rights and protection of vulnerable groups.
  • Issues recommendations on EU AI Act implementation, including market surveillance.
  • Supports AI skills initiatives and participates in sandbox operations.
  • Coordinates AI research and public sector use.
  • Monitors social and economic impacts.
  • Can request relevant data from authorities and organisations for regulatory purposes (subject to legal safeguards).

What this means for in-house legal and compliance

  • Map exposure to "AI system output" in Hungary: Track where outputs influence decisions, workflows, or customer interactions, even if the model or provider sits abroad.
  • Classify systems: Identify prohibited, high-risk, and limited-risk use cases, and document your rationale and evidence (a minimal inventory sketch follows this list).
  • Tighten supplier contracts: Bake in EU AI Act obligations, audit rights, incident reporting, and decommissioning/rollback clauses.
  • Prepare for real-world testing: Set protocols, risk controls, and logging for high-risk pilots subject to oversight.
  • Sector checks: If you operate in a regulated sector, including financial services, confirm which authority leads and where expert opinions are mandatory.
  • Incident and complaint handling: Align internal reporting lines with the single point of contact setup. Keep records that are easy to produce.
  • Enforcement readiness: Assume on-site or desk reviews. Keep technical documentation, data governance records, and conformity assessments inspection-ready.
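
The exposure-mapping and classification duties above lend themselves to a simple internal register. The sketch below is an illustration only, assuming a Python-based inventory: the record structure, field names, example system, and file paths are hypothetical and are not prescribed by the Hungarian instruments or the EU AI Act.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class RiskClass(Enum):
    """Working risk labels mirroring the classification step in the checklist."""
    PROHIBITED = "prohibited"
    HIGH_RISK = "high-risk"
    LIMITED_RISK = "limited-risk"
    MINIMAL_RISK = "minimal-risk"


@dataclass
class AISystemRecord:
    """One entry in a hypothetical internal AI system inventory (illustrative fields only)."""
    name: str
    provider: str                          # vendor or internal team
    risk_class: RiskClass
    output_used_in_hungary: bool           # the jurisdictional trigger described above
    sector: Optional[str] = None           # e.g. "financial services" puts the MNB in the lead
    rationale: str = ""                    # documented reasoning behind the classification
    evidence: List[str] = field(default_factory=list)  # links to assessments, contracts, logs


# Hypothetical example: a third-country chatbot whose output reaches Hungarian customers.
register = [
    AISystemRecord(
        name="customer-support-chatbot",
        provider="ExampleVendor Ltd.",
        risk_class=RiskClass.LIMITED_RISK,
        output_used_in_hungary=True,
        rationale="Transparency duties apply; no high-risk use case identified.",
        evidence=["contracts/examplevendor-msa.pdf", "assessments/chatbot-risk-memo.docx"],
    ),
]

# Quick filter: systems whose output lands in Hungary and may fall under Hungarian surveillance.
in_scope = [record for record in register if record.output_used_in_hungary]
print(f"{len(in_scope)} system(s) in scope for Hungarian market surveillance review")
```

However it is kept, the point of such a register is that the classification, its rationale, and the supporting evidence are recorded in one place and can be produced quickly during a desk or on-site review.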

Jurisdictional reach and risk

The trigger is the use of AI system output in Hungary. This pulls in global providers, distributors, and users whose AI outputs land inside the country. Fines can reach the EU AI Act maximums, so early remediation and documented controls matter.

Open items to watch

  • How data protection cases interact with AI market surveillance where competence overlaps or is unclear.
  • Practical guidelines for sectoral coordination and timelines for obtaining expert opinions.
  • Sandbox entry criteria, scope, and supervisory expectations.

Skills and readiness

If your team is building an internal training path for AI governance, model risk, or sector-specific use, plan it alongside your compliance roadmap and documentation build. Training that tracks risk class, documentation duties, and sector rules will save time when supervision starts.

