Law Firm's Repeated AI Errors Prompt Citation Verification Tool
Gordon Rees has now faced three separate incidents of AI-generated hallucinations in legal filings. The firm filed a brief with fabricated citations in October, received a reprimand in December for citations that didn't support its claims, and weeks later faced another court filing that flagged misleading or incorrect citations.
The pattern reflects a broader problem. Researcher Damien Charlotin has catalogued over 1,000 legal cases involving AI hallucinations. Lawyers increasingly rely on AI tools for research and drafting, and the errors follow them into their filings.
A New Tool to Catch Hallucinations Before Courts Do
BriefCatch, a legal research platform, launched RealityCheck at Legalweek. The tool verifies that citations actually exist, that quoted language appears in the cited opinions, and that those opinions actually support the propositions they're cited for.
RealityCheck uses two layers of verification. First, it runs deterministic checks against authoritative legal databases, validating reporter volumes, court identifiers, and case names without any AI involvement. Then it uses AI-assisted analysis to evaluate whether quoted language matches the source material and supports its intended use.
Each citation receives a visual label: Green-Verified, Yellow-Caution, or Red-Incorrect. The tool explains its assessment for the reviewer.
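To make the first, deterministic layer concrete, here is a minimal sketch of how a citation checker might parse a reporter citation and map it to the three labels described above. The regex pattern, the `KNOWN_CASES` dictionary, and the `check_citation` function are all hypothetical illustrations; RealityCheck's actual backend, databases, and logic are not public.

```python
import re

# Hypothetical mini-database keyed by (volume, reporter, page).
# The real tool queries authoritative legal databases, not a dict.
KNOWN_CASES = {
    ("410", "U.S.", "113"): "Roe v. Wade",
    ("347", "U.S.", "483"): "Brown v. Board of Education",
}

# Assumed citation shape: "<volume> <reporter> <page>", e.g. "410 U.S. 113"
CITATION_RE = re.compile(r"(\d+)\s+(U\.S\.|F\.3d|F\.Supp\.2d)\s+(\d+)")

def check_citation(text: str) -> str:
    """Deterministic first pass: does the citation parse, and does the
    volume/reporter/page triple exist in the reference database?"""
    match = CITATION_RE.search(text)
    if not match:
        return "Red-Incorrect"      # not even a well-formed citation
    key = match.groups()
    if key not in KNOWN_CASES:
        return "Red-Incorrect"      # well-formed, but no such case exists
    # Caution if the case name in the brief doesn't match the database;
    # a second, AI-assisted layer would then compare quoted language.
    name = KNOWN_CASES[key]
    return "Green-Verified" if name.lower() in text.lower() else "Yellow-Caution"
```

A fabricated citation fails the database lookup outright (Red), while a real citation attached to the wrong case name gets flagged for human review (Yellow), mirroring the green/yellow/red labels the article describes.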
When tested against the original Gordon Rees brief from October, which the firm acknowledged contained hallucinations, RealityCheck identified the same problems that opposing counsel had found.
The Real Pressure: Courts Will Use This
BriefCatch is making RealityCheck available to federal and state court clients. That matters because once courts begin running filed briefs through the tool, the calculus changes for every litigator.
The question shifts from whether to verify citations to whether you want the court to find errors before you do. That's pressure applied directly to the entire legal market.
The underlying problem remains. Lawyers blame legal AI research tools for introducing errors into briefs, but that's like blaming a vending machine for not serving dinner. The tools are only as careful as the people using them. The difference now is that sloppiness has a deadline: discovery before filing.
For more on how AI affects legal work, see AI for Legal Professionals and the AI Learning Path for Paralegals.