Litig's Legal AI Transparency Charter Sets a New Bar for Trust, with a Kitemark to Prove It

Litig launches an AI Transparency Charter for legal tech, stressing clear claims, testing, and accountability with a kitemark. Vendors and firms can sign; it's voluntary.

Categorized in: AI News, Legal
Published on: Oct 23, 2025

Litig Calls on Legal Industry to Sign AI Transparency Charter

Litig, a UK-based legal tech group, has launched an AI Transparency Charter to help the market treat AI with more maturity and honesty. Signatories receive a kitemark badge to signal commitment to a shared quality code focused on clarity, testing, and accountability.

This move follows well-publicized accuracy issues across legal AI. The premise is simple: most legal tools now include AI, and buyers deserve clean, evidence-backed claims about what those tools can and can't do.

A compass, not a scoreboard

Litig's approach avoids fixed performance benchmarks that age fast. Instead, it pushes continuous improvement, with transparent methods and shared signals on accuracy, testing, and limits. It's about making better choices with clear information, not chasing a single magic number.

Core commitments of the Charter

  • Transparency: Plain-spoken disclosure of how AI is used in legal products and services.
  • Accuracy & Testing: Claims supported by evidence, test data, and methods.
  • Bias & Ethics: Active steps to identify and reduce bias and other risks.
  • Use Cases & Limitations: Clear lines on where AI works well and where it shouldn't be relied on.
  • Environmental Impact: Track and reduce compute, carbon, and resource use.
  • Regulation & Standards: Consistency with industry standards and the EU AI Act.

What comes with it

  • Litig AI Product Transparency Statement: A standardized template (inspired by "model cards") covering tech details, data, use cases, testing, and safeguards; a rough sketch of what such a statement might look like in structured form follows this list.
  • Litig AI Use Case Frameworks: Practical templates for firms and vendors to define scenarios, document workflows, evaluate value, and spell out risk controls.
  • Glossary and Reference Packs: Common terms plus links to benchmarks, evaluations, due diligence questions, and regulation.
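Litig's template is a document rather than a machine-readable schema, so the sketch below is purely illustrative: the dataclass, its field names, and the example product are assumptions drawn from the "model cards" comparison above, not the actual Statement format.

```python
from dataclasses import dataclass, field, asdict
import json

# Illustrative only: these fields are assumptions based on the
# model-card analogy, not Litig's published template.
@dataclass
class ProductTransparencyStatement:
    product_name: str
    vendor: str
    underlying_models: list[str]      # e.g. base LLMs or other AI components
    training_data_summary: str        # plain-language description of data sources
    intended_use_cases: list[str]
    known_limitations: list[str]
    testing_methodology: str          # how accuracy claims were evaluated
    human_review_required: bool
    bias_mitigation_steps: list[str] = field(default_factory=list)

    def to_json(self) -> str:
        """Serialize for sharing alongside a procurement response."""
        return json.dumps(asdict(self), indent=2)

if __name__ == "__main__":
    stmt = ProductTransparencyStatement(
        product_name="ExampleDraftAssist",   # hypothetical product
        vendor="Example Legal Tech Ltd",      # hypothetical vendor
        underlying_models=["third-party LLM (vendor-hosted)"],
        training_data_summary="Public case law plus licensed commentary; no client data.",
        intended_use_cases=["first-draft clause suggestions"],
        known_limitations=["may miss jurisdiction-specific requirements"],
        testing_methodology="Blind review of 200 sampled outputs by qualified lawyers.",
        human_review_required=True,
        bias_mitigation_steps=["quarterly output audits across practice areas"],
    )
    print(stmt.to_json())
```

Even in this toy form, the value is the forcing function: every field a vendor leaves vague becomes a visible gap in the procurement record.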

Who should sign

AI vendors are invited to sign. Law firms that sell AI-driven products or deliver AI-generated outputs to clients are in scope as well. Signatories get a kitemark image and will be listed on Litig's site.

Enforcement is not active at launch. This is a voluntary code. The working group will review issues raised and iterate as adoption grows.

Practical steps for legal teams

  • Add the Charter to procurement and panel reviews: Ask vendors to confirm adherence and provide a Product Transparency Statement.
  • Require test evidence: Ask for datasets, evaluation methods, error types, and real examples that match your matters.
  • Document limits: For each use case, state where human review is required, what the tool can miss, and known failure modes.
  • Bias checks: Request bias testing approaches, redress strategies, and escalation paths.
  • Incident handling: Define how hallucinations, data issues, or model changes are reported and fixed.
  • Regulatory fit: Map use cases to the EU AI Act risk categories and maintain a record of controls (see the sketch after this list).
  • Footprint tracking: Ask vendors for compute and carbon disclosures where available; set thresholds for internal use.
  • Training: Brief lawyers and legal ops on safe use, limits, and review standards for AI outputs.
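A minimal sketch of the risk-register idea from the regulatory-fit step, assuming the EU AI Act's four broad risk tiers (unacceptable, high, limited, minimal). Which tier a given legal use case falls into is a legal judgment, so the example use case, its assigned tier, and the `UseCaseRecord` structure are all hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

# The EU AI Act's four broad risk tiers. Mapping a use case to a tier
# is a legal judgment; the assignment below is illustrative, not advice.
class AIActRiskTier(Enum):
    UNACCEPTABLE = "unacceptable"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"

@dataclass
class UseCaseRecord:
    use_case: str
    risk_tier: AIActRiskTier
    controls: list[str]   # the documented safeguards for this use case

# A one-entry register; in practice each AI use case gets a record.
register = [
    UseCaseRecord(
        use_case="AI-assisted contract clause drafting",  # hypothetical
        risk_tier=AIActRiskTier.LIMITED,                  # assumed tier
        controls=[
            "mandatory lawyer review before client delivery",
            "vendor Product Transparency Statement on file",
        ],
    ),
]

for record in register:
    print(f"{record.use_case}: {record.risk_tier.value} risk, "
          f"{len(record.controls)} documented controls")
```

Keeping even a lightweight register like this makes the later contract and audit conversations concrete: every use case has a named tier and a named set of controls.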

What CMS's John Craske said

  • Kitemark: Yes. Signatories receive a badge and a link, and Litig will list organizations that sign.
  • Monitoring: No formal audits for now; the Charter is voluntary. The group may respond to issues and will keep enforcement options under review.
  • Law firms: Yes. If firms provide legal AI products to clients, they're expected to sign and apply the same standards.

How to use this in contracts

  • Ask suppliers to append their Product Transparency Statement and notify you before material model or data changes.
  • Require disclosure of test suites, evaluation cadence, and error rates over time.
  • Include responsibilities for human review, bias mitigation, and incident response (with timeframes).
  • Reference applicable regulatory duties and who carries which obligation for each use case.

Where to start

Review the Charter, pick one or two pilot use cases, and publish your own internal transparency notes. Treat them as living documents and improve them as your testing matures.

Litig AI Transparency Charter | EU AI Act (Official text)

If your team needs structured upskilling on AI governance and practical workflows, see curated options by role: AI courses by job.

Dates to note: Legal Innovators 2025

  • Legal Innovators UK: Nov 4 (Law Firm Day), Nov 5 (In-house Day), Nov 6 (Litigation Day)
  • Legal Innovators New York: Nov 19-20

Both events are organized by the Cosmonauts team. If you plan to attend, reach out to them to get involved.

