When Using Artificial Intelligence Goes Wrong: Judge Slams Lawyers for Legal Bungle
A Gauteng High Court acting judge recently faced a startling issue: two non-existent legal citations appeared in a case before him. This triggered a serious warning about relying too heavily on artificial intelligence (AI) for legal research.
Acting Judge DJ Smit was preparing a judgment in a case involving Northbound Processing, which sought to compel the South African Diamond and Precious Metals Regulator to release a refining licence. While drafting, he found that two cases cited in Northbound’s heads of argument did not exist.
AI "Hallucinations" Confirmed in Court
Upon questioning, the advocate responsible for the citations admitted that the errors likely resulted from what is known as AI "hallucinations." The lawyer explained that he used an online tool called “Legal Genius,” which claimed to be trained exclusively on South African legal judgments and legislation.
The lawyer accepted full responsibility but stressed there had been no intention to mislead the court. The senior advocate who argued the matter apologized on behalf of Northbound’s legal team, explaining that he had relied on trusted colleagues and had conducted only a brief sense-check before the heads of argument were filed.
Time Pressure and AI Limitations
Time constraints were cited as a reason for the oversight, as the matter was an urgent application. However, the presence of fabricated citations raises concerns about the reliability of AI tools when used under pressure.
Judge Smit referenced a recent English King’s Bench Division judgment warning about the risks of AI in legal research. The judge highlighted that AI can produce completely inaccurate information and even cite sources that don't exist, which threatens the integrity of justice and public trust in the legal system.
Consequences and Professional Responsibility
Despite the apologies, Judge Smit emphasized that negligence—even if unintentional—can have serious consequences. The conduct of the involved legal practitioners has been referred to the Legal Practice Council for further investigation.
This case serves as a cautionary tale for legal professionals using AI tools. While such technology can aid research, it is vital to verify every citation and confirm its accuracy before submission to the court. Key takeaways:
- Always cross-check AI-generated legal references against official records.
- Recognize that AI tools may produce "hallucinations," or false information.
- Maintain rigorous review processes, especially under tight deadlines.
For legal professionals interested in responsible AI use and training on AI tools that can support legal work, resources are available at Complete AI Training.