AI hallucinations in legal filings rise in Oregon as lawyers and self-represented litigants misuse chatbots

Two Oregon lawyers were fined $110,000 for filing court documents containing fake cases invented by AI. Nationally, nearly 900 such filings have been identified, with penalties ranging from fines to potential disbarment.

Published on: May 11, 2026

Oregon lawyers face fines for filing AI-fabricated cases

A federal judge fined two Oregon lawyers $110,000 for submitting legal documents filled with cases and citations invented by artificial intelligence. The penalty underscores a growing problem: lawyers across the country are filing briefs containing AI hallucinations, false information generated by tools like ChatGPT and Claude.

Oregon has identified approximately five court filings with AI-generated fabrications. Nationally, nearly 900 such filings have been identified, according to Ankur Doshi, general counsel of the Oregon State Bar.

The scope of the problem

An AI hallucination occurs when a generative AI program produces inaccurate or misleading information, sometimes inventing facts entirely. In legal filings, this means fictitious case law, false quotes, and invented statements of law.

The problem extends beyond lawyers. Self-represented litigants are using AI to draft their own court documents without understanding the tool's limitations. These individuals face potential sanctions, including fines, for submitting fabricated material.

Why lawyers continue using AI

Despite the risks, many attorneys integrate AI into their workflow because it saves time and improves efficiency. AI can help review and draft documents quickly, but only if humans verify the work.

"The key element within utilizing AI well is that there has to be that human element who checks the work," Doshi said. Many lawyers, he added, don't fully understand how AI functions or why it generates false information.

Discipline depends on transparency

The Oregon State Bar's response to AI misuse hinges on whether lawyers disclose their use of the tool and how they resolve the problem. A lawyer who tells the court about AI use may receive minimal consequences. Those who hide it face fines, suspension, or disbarment.

The bar requires lawyers to be competent with any tools they use to provide legal services, including understanding that AI can produce false information.

Broader consequences

AI fabrications waste court resources. Opposing counsel must spend extra time reviewing potentially false cases. Over time, Doshi warned, these errors could damage the legal system's integrity.

"It's fabricated statements of law," he said. "You have entire arguments that have no basis in law, which strikes directly at our precedent-based system."


