Los Angeles Superior Court tests AI tool to help judges manage growing caseloads

Los Angeles Superior Court is piloting an AI tool called Learned Hand to help judges handle administrative tasks as civil case filings jumped 49% in one year. The tool summarizes filings and drafts rulings but does not make judicial decisions.

Categorized in: AI News, Legal
Published on: Mar 23, 2026

Los Angeles Courts Test AI Tool to Manage Rising Caseloads

The Los Angeles Superior Court is pilot testing an AI system called Learned Hand to help judges manage administrative work in civil cases. The tool summarizes filings, organizes evidence, and generates draft rulings, work that currently consumes time judges could spend on legal analysis.

Court backlogs are worsening. Filings in civil cases rose 49 percent in the past year, from 4,100 to 6,400 cases, according to a February 2026 report by law firm Fisher Phillips. AI itself is partly responsible: the technology makes it cheaper and faster to produce legal documents, which increases the volume courts must process.

How Learned Hand Works

A small group of judicial officers in Los Angeles now have access to the system. Learned Hand, founded in 2024 and named after a federal judge, was built specifically for courts rather than adapted from general-purpose AI tools.

The system does not make judicial decisions. Shlomo Klapper, the company's founder and CEO, said the goal is to eliminate "drudge work" (organizing case materials and surfacing key facts) so judges can focus on judgment and legal reasoning.

"With this partnership, we are carefully evaluating emerging technologies to determine how they may support judicial officers in working more efficiently and effectively," Presiding Judge Sergio C. Tapia II said. "While this tool may enhance the way judicial officers review and engage with case files, it will not replace the sanctity, independence, and impartiality of judicial decision-making."

The Verification Problem

The harder engineering challenge is not generating text but verifying it. Klapper said most of Learned Hand's computational cost goes to checking outputs against source materials and legal authorities, not to writing them.

AI systems have already caused problems in real court cases. In 2023, lawyers for Prakazrel "Pras" Michel, a founding member of the Fugees, used an AI tool that generated closing arguments containing frivolous claims and missed weaknesses in the government's case. That same year, a federal judge ordered lawyers to provide printed copies of cases they cited after the court could not verify them.

Learned Hand reduces hallucination risk by limiting its source material to a defined set of legal documents rather than drawing from the open internet. The system also breaks tasks into steps and assigns each to a model designed for that specific function, which reduces the risk of AI echoing biases from training data.
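Learned Hand's internals are not public, but the closed-corpus approach described above can be illustrated with a minimal sketch: a grounding check that flags any quoted passage in a draft that does not appear verbatim in a fixed set of source documents. The document names and text here are hypothetical, not taken from the actual system.

```python
# Minimal sketch of a closed-corpus grounding check: draft quotes may only
# pass verification if they appear verbatim in a fixed set of case documents.
# All document names and contents below are invented for illustration.

CORPUS = {
    "motion.pdf": "Plaintiff filed the complaint on March 3, 2025.",
    "answer.pdf": "Defendant denies each allegation in paragraphs 1 through 9.",
}

def verify_quotes(draft_quotes, corpus):
    """Split quotes into (verified, unverified) by exact substring match
    against the closed corpus, rather than trusting model output."""
    verified, unverified = [], []
    for quote in draft_quotes:
        if any(quote in text for text in corpus.values()):
            verified.append(quote)
        else:
            unverified.append(quote)
    return verified, unverified

quotes = [
    "Plaintiff filed the complaint on March 3, 2025.",
    "Defendant admits liability.",  # appears in no source document
]
ok, flagged = verify_quotes(quotes, CORPUS)
```

A production system would use far more robust matching (citations, paraphrase detection, legal-authority lookup), but the design principle is the same: anything the model asserts must be traceable to a document in the permitted set, and everything else is surfaced for human review.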

Design for Judges, Not Tech Experts

The interface requires no technical training. Klapper described it as "point and click": judges do not need to write prompts or understand how the underlying model works.

Klapper said judges should verify all AI outputs rather than accepting them as reliable. "Don't trust, verify," he said. "They shouldn't trust anything. It has to show its worth."

The Los Angeles pilot will show whether courts can safely use AI to handle the administrative burden that keeps judges from their core work. Early results from this program could influence how other court systems approach similar tools.


