Sotomayor tells law students to master AI, but the profession should consider the cost
Supreme Court Justice Sonia Sotomayor told law students at the University of Alabama School of Law this week that they should not graduate without learning to use AI as a tool. She called AI systems the "new revolution" in legal work, comparable to the arrival of computers in the latter half of the 20th century.
Sotomayor acknowledged the risk built into these systems. "AI is a sophisticated human," she said. "All of its input is input from human beings. And because it is that, it has the potential to perpetuate the very best in us and the very worst in us." That makes AI particularly dangerous when applied to judging human situations and complex legal problems.
Her warning raises a harder question: what happens to the legal profession when AI becomes mandatory rather than optional?
The hidden costs of automation
The immediate concern is obvious: AI systems produce errors that reflect human bias and incomplete training data. But the longer-term effects are less visible and potentially more damaging to how lawyers develop as professionals.
Document editing offers a concrete example. Traditionally, junior lawyers submit drafts to partners or supervising attorneys, who return them marked up in red. The process is uncomfortable. It also builds writing ability and creates mentoring relationships that bind teams together.
Once AI enters the workflow, that dynamic collapses. Partners will tell junior lawyers to run drafts through the firm's AI tool multiple times before submission. The human feedback loop disappears, training diminishes, and alienation grows.
Firms are already responding by hiring experienced lawyers laterally rather than training junior staff from scratch. That strategy works in the short term. It is unclear whether it sustains a profession.
Questions without answers
Other unknowns loom larger. Lawyers, especially in large firms, already work extreme hours and often sleep four hours or less per night. How does exhaustion change the way someone interacts with an AI system that produces plausible-sounding errors? What are the neurological effects of regular reliance on these tools?
The profession has no infrastructure to measure these impacts. No one is tracking whether AI adoption correlates with burnout, depression, or cognitive changes among practitioners. The advice to "master AI" assumes mastery is possible and beneficial, and both assumptions deserve scrutiny.
Lawyers who choose not to use AI will face pressure. Some firms may see abstention as a liability; others may simply outpace their non-AI peers. That creates a de facto mandate, regardless of what leadership says.
What comes next
Justice Sotomayor is right that lawyers need to understand AI. Understanding and mandatory adoption are different things. The legal profession should ask harder questions before treating AI as inevitable infrastructure.
For those adopting AI tools, the burden falls on individuals to maintain critical thinking skills. For those declining to use them, the burden may fall on their firm's tolerance for different approaches. Either way, the choice is narrowing.