Clio's $1B vLex Bet: Why Jack Newton Says Generic AI Fails Lawyers

Clio CEO Jack Newton says generic LLMs trained on the open web fuel bad citations; legal AI needs real legal data. He's backing vLex's Vincent for grounded answers and expanded capacity.

Published on: Oct 21, 2025

Clio's CEO: Generic LLMs miss the mark for legal work

At ClioCon in Boston, Clio founder and CEO Jack Newton put a clear stake in the ground: the biggest problem with many AI tools in law is imprecise data. That, he argued, is why lawyers keep seeing hallucinated citations and shaky answers.

"We've all seen the headlines of AI making mistakes, hallucinating cases that never existed," Newton told the audience. "Lawyers being sanctioned for doing something that, to be fair, we've been trained to do over the last 20 years, which is to trust the answer that a computer gives us."

Newton's critique focused on general-purpose large language models. "All of the foundational large language models - those from companies like OpenAI and Anthropic - are powerful but they're general purpose, they're generic, they're trained on the open web, not on real legal data," he said. In his view, performance in legal work "depends entirely on the quality behind its answers."

Clio's enterprise play: Vincent for large firms and corporate legal

Newton announced that Vincent, the AI legal assistant created by vLex, will be the second product offered under Clio's new enterprise division for large law firms and in-house teams. Clio acquired vLex this summer in a reported US$1 billion deal, said to be the largest in legal tech to date.

The pitch: Vincent is trained on vLex's legal database, which expanded to more than one billion legal documents from 180 countries after vLex acquired Fastcase in 2023. "Unlike generic AI that is trained on data available on the open web, Vincent is legal AI that understands the law," Newton said. "When AI is grounded in legal data, real cases and real decisions, it doesn't just produce more accurate answers - it gains new capabilities."

Why this matters now

Less than two years ago, a British Columbia Supreme Court decision addressed the use of AI-hallucinated case law in proceedings. Since then, Canadian courts and tribunals have issued more than 30 decisions involving known uses of fabricated or misrepresented case law, according to a case tracker from an HEC Paris researcher. Many lawyers have pulled back on AI as a result.

Newton sees a different path. If AI is fed authoritative legal sources and wired into firm workflows, he argues it can expand capacity: "four times the clients," "four times the matters," and more revenue - while also narrowing the access to justice gap. He closed with a reassurance: "AI is not here to replace legal professionals. AI is here to amplify your impact."

Practical checklist for evaluating legal AI

  • Ask about data provenance. What primary law, secondary sources, and jurisdictions are in the corpus? How often is it updated?
  • Require verifiable citations. The tool should surface linked authorities by default and flag answers lacking support.
  • Test on your matters. Run representative briefs, memos, and research prompts. Track accuracy, recall, and time saved.
  • Define a human review policy. Specify use in drafts, research, and intake. Set rules for disclosure in filings where required.
  • Clarify confidentiality. Where is data stored? Is client content used for model training? What audit logs are available?
  • Negotiate safeguards. Include service levels, change notices for model updates, error remediation, and indemnities where feasible.
  • Integrate with your sources. Ensure the system can search and cite the primary law and subscriptions your lawyers rely on.
  • Train your team. Create short playbooks for prompts, verification, and handoffs between AI output and attorney review.

Bottom line for legal leaders

General-purpose models can be useful, but without access to authoritative legal data they are prone to hallucinated citations and unsupported answers. If you adopt AI, make provenance, citations, and verification non-negotiable - and pair the tech with clear policies and training.

If you're building firm-wide AI literacy and workflows, explore curated options for legal and compliance roles here: AI courses by job and a roundup of the latest AI courses.

