Georgia prosecutor admits submitting AI-generated brief with fabricated case citations
A Clayton County prosecutor submitted a legal brief containing at least five nonexistent case citations, later admitting to using artificial intelligence to draft the filing. District Attorney Tasha Mosley apologized to the Supreme Court of Georgia. The prosecutor now faces possible discipline and referral to the State Bar.
The incident exposes how quickly AI has moved into courtroom work, and how little oversight exists to catch errors before they reach judges.
AI adoption in courts is already widespread
The Clayton County case is not isolated. Legal tech expert Cat Casey said roughly 60% of judges are using AI in some capacity, though that figure may be inflated. The actual number remains unclear because adoption is happening quietly.
"It's been a quiet, rolling thunder," Casey said in an interview about the shift.
Judges use AI for legal research, drafting motions, and case analysis. Prosecutors and defense attorneys rely on it for similar tasks. The State Bar of Georgia acknowledges the trend and has issued guidance requiring lawyers to verify all AI-generated work and maintain professional standards.
High-volume courts create pressure for faster decisions
Atlanta's court system handles unusually large caseloads, sometimes called "rocket dockets." When volume is high, judges face strong incentives to adopt tools that promise efficiency.
"When you have that volume, the likelihood a judge is going to want to use AI to be more efficient is pretty high," Casey said.
That pressure means Georgia courts may embed AI deeper and faster than other jurisdictions.
Speed versus accuracy remains unresolved
Supporters argue AI could reduce legal costs and accelerate case resolution. If the cost of a legal task drops from $20,000 to $5,000, more people can afford representation.
Critics worry about the tradeoffs. Bias in training data could reinforce existing disparities. Overreliance on automation could weaken human judgment. And errors, like fabricated citations, could damage public trust in the courts.
Lawyers remain legally responsible for AI output
The law is clear: if an attorney submits AI-generated work without verification, the attorney is responsible. Professional ethics rules require lawyers to supervise all tools they use, including AI.
"If I'm an attorney and I submit something AI-generated without checking it, I'm responsible," Casey said.
That accountability doesn't change because the technology is new. Lawyers must verify information, flag errors, and ensure client confidentiality when using AI systems.
Transparency will determine public trust
Courts and law firms need to disclose how AI is being used in specific cases. Judges and attorneys should explain where human judgment ends and automated analysis begins. And systems must exist to catch and correct mistakes before they reach the record.
Without transparency, the Clayton County incident may be the first of many.
For legal professionals, the immediate task is clear: treat AI as a tool that requires the same verification and oversight as any other resource. The technology is not replacing lawyers. But as Georgia courts show, it can cause real damage when used carelessly.