New Mexico courts grapple with AI-generated fake citations in legal filings
Federal and state courts in New Mexico have detected artificial intelligence hallucinations in at least seven lawsuits since 2023, with judges issuing sanctions and warnings as attorneys and self-represented litigants file documents containing fabricated case law.
In one case, a pro se litigant seeking disability discrimination damages requested "$355.69 quintillion" in sanctions, a figure Senior U.S. District Judge Judith Herrera called "quite simply ludicrous." The judge found AI hallucinations throughout his filings and ordered him to pay $8,640 in sanctions.
An attorney in another case filed a pleading citing six nonexistent cases, all likely generated by ChatGPT or similar AI tools. U.S. Magistrate Judge Damian Martínez of Las Cruces fined the attorney $1,500, required bar disciplinary reporting, and ordered a one-hour legal ethics course on AI use.
How AI creates false information
AI hallucinations occur when language models generate fabricated sources that sound plausible. Judge Martínez explained that AI systems learn patterns from training data; when that data is incomplete or flawed, the models make inaccurate predictions and confidently produce citations that never existed.
The problem extends beyond New Mexico. A federal judge in Colorado ordered two attorneys representing MyPillow CEO Mike Lindell to pay $3,000 each after they filed a defamation brief containing more than two dozen errors and fabricated cases.
Self-represented litigants at higher risk
State District Judge John P. Sugg of Carrizozo said self-represented litigants especially rely on AI without verifying citations or statutes. "I'll read a motion that they filed, and I can't find anything that they've cited to," Sugg said. "We've had it with attorneys, too."
Judges say wasted time chasing nonexistent citations strains limited judicial resources. Chief U.S. District Judge William Johnson noted in 2023 that such deception harms opposing parties, courts, and the legal system's reputation.
New safeguards emerging
Judges aren't banning AI; they're demanding accuracy. Sugg imposed a court order requiring attorneys and litigants who use generative AI to draft or edit documents to disclose that use at the top of filings.
Filers must also certify that AI-generated language was checked for accuracy using legal databases or human review. The New Mexico Supreme Court is developing formal policy guidance on AI use in state courts.
"AI is a good tool," Sugg said. "It just needs to be something that we're careful using."
For legal professionals navigating these risks, understanding how generative AI and LLMs work is essential to recognizing hallucinations. Resources on AI for legal practice can help attorneys and staff use these tools responsibly.