AI is Not (Yet) a Lawyer
The recent launch of a new ChatGPT model grabbed headlines, not because software upgrades usually make news, but because AI is changing how we work—especially in legal practice. Employment lawyers are already seeing AI’s impact on communication speed in disputes.
At Constantine Law, a recent case involved a client alleging a breach of a restrictive covenant by a former employee. We drafted and sent a detailed letter of claim, outlining specific contractual breaches and demanding signed undertakings to avoid injunctive action.
Typically, the recipient would take three to four days to respond after consulting their lawyers. This time, however, the reply came within 90 minutes. Given the complexity, it’s unlikely a lawyer could have been instructed, briefed, and prepared a considered response so quickly.
Despite the former employee's claim of having “taken legal advice,” the response had all the hallmarks of AI-generated content. Although it appeared professional and was produced fast, it misrepresented key points, ignored vital facts, and contained inaccuracies. This flawed reply increased litigation risk for the former employee, giving us a clear advantage had the case proceeded.
Why AI Can Mislead in Legal Contexts
A Stanford University study found that general AI tools "hallucinate," or generate false information, up to 80% of the time on legal queries. Even legal-specific AI models make mistakes in at least one in six queries. Often only legal professionals can spot these errors; clients don't know what they don't know.
AI tools require precise, fact-specific input to improve accuracy. However, sharing personal data to refine AI responses risks violating GDPR rules if individuals can be identified directly or indirectly.
AI as a Tool, Not a Substitute
AI isn’t the problem; it’s a tool legal professionals are learning to use. Many top law firms are investing heavily in AI systems. A recent LexisNexis survey shows the number of legal professionals regularly using AI has doubled in the past year, with only 15% having no plans to adopt it.
However, AI should assist, not replace lawyers. Earlier this year, the Solicitors Regulation Authority approved Garfield.AI, a law firm operating solely through AI. Its services are limited to providing letters and debt recovery support for claims up to £10,000, and only for claimants. Defending claims still requires human legal input, highlighting AI’s current limits.
Risks of “Self-Lawyering” with AI
The danger lies in untrained individuals relying on AI to “self-lawyer.” Just as Google isn’t a doctor, AI isn’t (yet) a lawyer. Buyer beware.
AI can generate initial, generic responses, but the challenge is adapting those to the specific case. This applies to legal letters and contracts alike. People who grasp the facts, nuances, and strategic consequences remain essential. AI might be quick, but it still risks missing the mark.
For legal professionals interested in effectively integrating AI tools, exploring targeted training can be valuable. Resources like Complete AI Training’s courses for legal professionals offer practical guidance on using AI responsibly and efficiently in the legal field.