Faster justice, more lawsuits: Brazil's AI courtroom loop
AI is speeding up Brazil's courts, cutting backlogs while fueling more lawsuits. The test now: keeping the gains with human oversight, verified citations, and checks so speed serves justice.

AI is helping judges close cases quickly - and lawyers open them just as fast
Brazil is running one of the largest justice-system AI experiments on earth. With 76 million lawsuits in the pipeline and costs near $30 billion a year (1.6% of GDP), courts have launched more than 140 AI projects to triage, summarize, and draft. It's working - and it's also fueling more litigation.
Judges are clearing cases faster, while lawyers file faster. "We note that the use of AI, in the end, rather than diminishing litigation, is increasing it," said Rodrigo Badaró, a councilor who monitors AI at Brazil's judiciary. The question isn't whether AI adds speed. It's whether that speed serves justice.
Brazil's high-volume experiment
Brazil's Supreme Court receives around 80,000 new cases each year - a staggering load compared to fewer than 100 cases heard annually by the U.S. Supreme Court. That top-level burden is only a fraction of the nationwide queue.
To cope, courts built tools to find precedents, cluster filings, draft opinions, and flag repeat litigants, according to the National Council of Justice. Early results show faster document handling and shorter timelines across proceedings.
AI on the bench
At the Supreme Court, clerks use MarIA - a generative AI assistant built on Google's Gemini and OpenAI's ChatGPT - to draft reports that humans review and refine. "It's easier to adjust what the AI produced than start from scratch," said law clerk Arianne Vasconcelos.
Backlogs at the Court have fallen to their lowest since 1992, according to internal productivity reports. Nationwide, judges closed 75% more cases last year than in 2020. AI accelerates the routine. Human judgment still decides the hard calls.
AI at the bar
More than half of Brazil's attorneys use generative AI daily. They filed over 39 million new lawsuits last year - a 46% jump since 2020. Drafts that took minutes now take seconds, but every output still needs a lawyer's review.
Large firms use tools like Harvey to compare expert reports, spot inconsistencies, and review filings. Users report hours saved each week, with the caveat that sources must be verified. "Without the references, I can't check if it's making things up," said lawyer Thiago Sombra.
The risk: speed without justice
AI sometimes fabricates cases and citations. Researchers have tracked more than 350 instances of hallucinated filings worldwide, with Brazilian courts imposing fines in multiple cases this year. The United Nations has warned governments against "techno-solutionism" without clear safeguards.
There's also a deeper issue: the law isn't fully standardized. "I look at reality, I put it in a box, I formalize it, I create a product, and I'm efficient," said researcher André Fernandes. But family law, contracts, and successions hinge on context, fairness, and equity - factors that don't fit neatly into templates.
What this means for government leaders
AI will keep increasing throughput - and, paradoxically, demand. The task for policymakers is to capture the efficiency while protecting due process, equity, and public trust. Treat AI as assistive, not autonomous.
- Require human sign-off. Mandate source citations for any AI-assisted filing or draft. Enforce audit logs for prompts, outputs, and edits (a minimal record sketch follows this list).
- Set procurement standards. Demand vendor transparency on training data, evaluation results, and data handling. Require red-teaming for bias and hallucinations.
- Pilot where risk is lower. Start with classification, deduplication, and template drafting. Measure cycle times, error rates, and appeal outcomes before scaling.
- Protect confidentiality. Use secure, on-prem or vetted environments. Prohibit input of sensitive data into consumer chatbots.
- Invest in people. Train judges, clerks, and public defenders to review AI outputs and spot failure modes. For structured upskilling paths, see Complete AI Training by job.
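As a rough illustration of the audit-log requirement above, here is a minimal sketch of the kind of record a court system might keep for each AI-assisted draft. The AIDraftAuditRecord structure and its field names are hypothetical, chosen for the example rather than drawn from any Brazilian court system.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import hashlib
import json

@dataclass
class AIDraftAuditRecord:
    """Hypothetical audit entry for one AI-assisted draft (illustrative only)."""
    case_id: str          # docket number the draft belongs to
    model_name: str       # which model produced the text
    prompt: str           # what the clerk or lawyer asked for
    ai_output: str        # raw model output, before human edits
    final_text: str       # text after human review and sign-off
    reviewer_id: str      # person responsible for the final version
    cited_sources: list = field(default_factory=list)  # citations the reviewer verified
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def fingerprint(self) -> str:
        """Content hash so later tampering with the record is detectable."""
        payload = json.dumps(asdict(self), sort_keys=True).encode("utf-8")
        return hashlib.sha256(payload).hexdigest()

# Example: a clerk logs a draft before filing it (all values are placeholders).
record = AIDraftAuditRecord(
    case_id="0001234-56.2025.0.00.0000",
    model_name="example-llm-v1",
    prompt="Summarize the appellant's arguments on pension indexing.",
    ai_output="Draft summary as produced by the model...",
    final_text="Summary as revised and approved by the clerk...",
    reviewer_id="clerk-042",
    cited_sources=["Placeholder precedent reference"],
)
print(record.fingerprint())
```

Whatever the exact schema, the point is that every AI-assisted document leaves a reviewable trail: who prompted, what the model produced, what a human changed, and which sources were checked before sign-off.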
Operational metrics to track
- Average time to disposition by case type
- Appeal and reversal rates on AI-assisted drafts
- Hallucination incidents per 10,000 filings and resulting sanctions (see the computation sketch after this list)
- Monthly filings per attorney and closures per judge
- Backlog size and cost-to-serve as a share of GDP
- Access-to-justice indicators (pro se outcomes, time to first hearing)
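As a rough illustration, here is a minimal sketch of how two of these metrics could be computed from case-management exports. The input format and field names (case_type, days_open) are assumptions made for the example, not any court system's actual schema.

```python
from statistics import mean

def hallucination_rate_per_10k(incidents: int, filings: int) -> float:
    """Hallucination incidents normalized per 10,000 filings."""
    if filings == 0:
        return 0.0
    return incidents / filings * 10_000

def average_days_to_disposition(cases: list[dict]) -> dict[str, float]:
    """Average time to disposition, grouped by case type.

    Each case dict is assumed to carry 'case_type' and 'days_open'
    (filing date to final decision); the schema is illustrative.
    """
    by_type: dict[str, list[int]] = {}
    for case in cases:
        by_type.setdefault(case["case_type"], []).append(case["days_open"])
    return {case_type: float(mean(days)) for case_type, days in by_type.items()}

# Example with made-up numbers, not real Brazilian court statistics.
print(hallucination_rate_per_10k(incidents=14, filings=390_000))  # roughly 0.36 per 10,000
sample_cases = [
    {"case_type": "labor", "days_open": 320},
    {"case_type": "labor", "days_open": 280},
    {"case_type": "tax", "days_open": 910},
]
print(average_days_to_disposition(sample_cases))
```

Tracked monthly and broken out by court and case type, numbers like these show whether AI-assisted speed is improving outcomes or simply moving the bottleneck to appeals.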
Inside the workflow: what's changing
Clerks use AI to summarize, organize, and draft, then apply legal reasoning to finalize. Lawyers use AI to scan contracts, compare expert opinions, and spot gaps - followed by human validation and strategy. Independent practitioners lean on general models for rewrites and brainstorming, while avoiding sensitive data and checking every citation against court systems.
The repetitive work is shifting to machines. The responsibility stays with people.
Bottom line
AI is helping Brazil close cases faster - and open even more. That loop of faster closings and faster filings can still serve the public if standards, oversight, and training keep pace. The goal isn't speed alone; it's fair, consistent, and timely justice at scale.