Netherlands pilots GPT-NL, a homegrown, GDPR-compliant AI with open-source roots

Four Dutch agencies and TNO will pilot GPT-NL, a locally built model backed by €13.5M to meet GDPR and cut dependence on foreign AI. For now, access is limited to organizations.

Published on: Feb 27, 2026

GPT-NL enters government pilots: what public agencies need to know

Four Dutch government agencies and TNO will pilot GPT-NL, a homegrown language model built by SURF, the Netherlands Forensic Institute (NFI), and TNO. NFI and TNO plan to use it internally as well.

The project is positioned as open and secure AI built around European and Dutch values. Backed by a €13.5 million investment from the Ministry of Economic Affairs, development started in 2023 and is now moving into real-world testing.

Access is limited to organizations and will not be opened to private users.

Why this matters for the public sector

The goal is to reduce dependence on foreign AI technologies while meeting local policy expectations on privacy, oversight, and accountability. TNO says GPT-NL could be the first major language model to demonstrably comply with the General Data Protection Regulation (GDPR).

The Association of Dutch Municipalities (VNG) is supporting the effort and helping shape governance frameworks for data quality and usability. Significant portions of the source code are expected to be released as open source; public datasets will carry open licenses. Model weights will be accessible under a controlled license. Copyrighted material will be referenced via metadata.

What's in the pilot

  • Participants: four government agencies and TNO.
  • Origins: developed by SURF, NFI, and TNO; funded by the Ministry of Economic Affairs.
  • Data partnerships: Dutch media groups, including ANP, will provide archives to help train the model; if commercialized, media organizations would be compensated.
  • Access model: organizational access only, with governance guardrails under development.

Immediate steps for agencies considering GPT-NL

  • Pin down use cases with measurable outcomes: drafting policy memos, summarizing case files, search over internal knowledge bases, multilingual support.
  • Run a DPIA early. Map data categories, legal bases, retention, cross-border risk, and human review points.
  • Set human-in-the-loop rules. Define what must be checked by staff and set accuracy thresholds before output reaches the public.
  • Protect inputs. Keep personal and sensitive data out of prompts unless covered by a formal data processing agreement and technical controls (masking, redaction, role-based access).
  • Plan evaluation. Build test sets for Dutch policy and legal language; check for bias across regions, dialects, and demographic groups.
  • Lock down logging. Decide how prompts/outputs are stored, who can view them, and how long they are retained to meet audit and public-records obligations.
  • Procurement terms. Specify IP rights, model/version identifiers, uptime, support, incident reporting, and exit plans (data portability and deletion).
  • Train your teams. Provide guidance on safe prompting, citation, and when to defer to experts.
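The input-protection step above can be sketched as a simple pre-prompt filter that masks obvious personal-data patterns before text reaches a model. This is an illustrative assumption, not part of any official GPT-NL tooling: the patterns below are examples only (an email regex, a rough Dutch phone-number shape, and bare nine-digit numbers such as BSN-length IDs) and a real deployment would pair such masking with data processing agreements and role-based access controls.

```python
import re

# Illustrative pre-prompt filter: masks common personal-data patterns
# before a prompt is sent to a model. Patterns are examples, not
# exhaustive, and not taken from any official GPT-NL tooling.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"(?:\+31|0)\d(?:[ -]?\d){8}"),   # rough Dutch phone shape
    "NUMBER_ID": re.compile(r"\b\d{9}\b"),                # e.g. BSN-length numbers
}

def redact(text: str) -> str:
    """Replace each pattern match with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Contact j.jansen@example.nl or 123456789 about case 42."
print(redact(prompt))
# prints: Contact [EMAIL] or [NUMBER_ID] about case 42.
```

Regex masking like this catches only predictable formats; free-text names and addresses need dedicated redaction tooling and human review, which is why the checklist pairs technical controls with formal agreements.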

Governance checklist to put in place now

  • Clear accountability: product owner, data protection lead, and risk owner named.
  • Policy guardrails: approved use cases, banned use cases, and escalation paths.
  • Testing discipline: red-team scenarios, fact-checking workflows, and continuous evaluation metrics.
  • Transparency: document model versions, datasets (where permissible), and known limitations.
  • Public-facing use: accessibility standards, plain-language outputs, and complaint handling.

Where this fits in the national agenda

GPT-NL sits within the Netherlands' broader generative AI strategy, emphasizing transparency, fairness, and responsible deployment. The pilot phase will show whether a locally developed model can meet operational needs across ministries and municipalities while honoring European privacy rules.

Getting started

If your department is evaluating participation, start with use-case scoping, a DPIA, and a pilot plan that measures accuracy, time saved, and compliance outcomes. That groundwork will make adoption faster once access opens up beyond the initial cohort.

