Indian courts warn lawyers that AI citation errors constitute professional misconduct

Indian courts have restricted AI use in legal practice, citing fabricated citations and poor accuracy as grounds for professional misconduct. Lawyers bear full accountability for every citation and argument, regardless of whether AI generated it.

Categorized in: AI News, Legal
Published on: Apr 15, 2026

Indian Courts Restrict AI Use in Legal Practice, Demand Human Accountability

India's Supreme Court and multiple High Courts have pushed back against unchecked use of AI in litigation, flagging fabricated citations and poor accuracy as professional misconduct. The courts have made clear that accountability rests with the lawyer, not the tool.

The Punjab and Haryana High Court cautioned judges against using AI for judgments or legal research. The Gujarat High Court limited AI use to administrative tasks only. Legal experts including Sanjeev K Kapoor and CV Raghu said responsibility for every citation and argument remains with the human lawyer.

What Lawyers Need to Do

Treat generative AI and LLM systems as assistive tools, not authoritative sources. Common use cases in litigation include drafting petitions, summarizing precedents, and conducting legal research.

Each use case introduces distinct risks. AI systems hallucinate citations, truncate precedents, and misinterpret statutes. Verification must be explicit, reproducible, and auditable.

Implement these controls:

  • Link every citation to its source
  • Use conservative temperature settings or deterministic inference where available
  • Require human-in-the-loop review before filing
  • Maintain versioned evidence logs for all AI outputs
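The logging and citation-linkage controls above can be sketched in code. The snippet below is a minimal, hypothetical example of a versioned evidence record for an AI-generated draft; the field names, record structure, and model name are illustrative assumptions, not a mandated format.

```python
import hashlib
from datetime import datetime, timezone

def log_ai_output(output_text, citations, model, temperature, reviewer=None):
    """Create an auditable record of an AI-generated draft.

    Hypothetical sketch: field names and structure are illustrative.
    Every citation entry carries a source link and starts unverified,
    forcing an explicit human verification step before filing.
    """
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model": model,
        "temperature": temperature,  # keep low or 0 for more deterministic output
        "output_sha256": hashlib.sha256(output_text.encode()).hexdigest(),
        "citations": [
            {"cite": c["cite"], "source_url": c["source_url"], "verified": False}
            for c in citations
        ],
        "human_reviewed": reviewer is not None,
        "reviewer": reviewer,
    }

entry = log_ai_output(
    "Draft paragraph citing a precedent ...",
    [{"cite": "Example v. Example (1973)",
      "source_url": "https://example.org/judgments/1973/example"}],
    model="example-llm",
    temperature=0.0,
)
```

A record like this makes the verification step explicit: a filing checklist can refuse any draft whose citation entries are still marked `verified: False` or whose `human_reviewed` flag is unset.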

Disclose AI assistance clearly. Document your verification process. Keep audit trails for citations and factual claims.

What Courts and Regulators Are Watching

Expect formal guidelines from bar councils and court rules requiring disclosure of AI assistance. Demand will grow for verification tools that certify citation provenance and trace AI-generated content back to its sources.

Law firms should update malpractice protocols now. Add mandatory verification steps. Tag AI outputs with traceable metadata so auditors can follow the chain of custody.
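One way to make AI outputs traceable for auditors is to chain each tagged record to the previous one by hash, so gaps or after-the-fact edits become detectable. The sketch below is a hypothetical illustration of this idea; the record format and tool names are assumptions, not an established standard.

```python
import hashlib
import json

def tag_output(text, metadata, prev_hash="0" * 64):
    """Attach traceable metadata to an AI output and chain the record
    to the previous one, hash-chain style, for chain-of-custody audits.

    Illustrative sketch only: real deployments would also need secure
    storage and access controls around these records.
    """
    payload = json.dumps(
        {
            "text_sha256": hashlib.sha256(text.encode()).hexdigest(),
            "metadata": metadata,
            "prev_hash": prev_hash,  # links this record to its predecessor
        },
        sort_keys=True,
    )
    return {"payload": payload, "hash": hashlib.sha256(payload.encode()).hexdigest()}

r1 = tag_output("AI draft v1", {"tool": "example-llm", "task": "summarize precedent"})
r2 = tag_output("AI draft v2", {"tool": "example-llm", "task": "revise draft"},
                prev_hash=r1["hash"])
```

Because each record embeds the previous record's hash, an auditor can replay the chain from the first entry and confirm no intermediate AI output was dropped or altered.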

The Haryana Real Estate Regulatory Authority used AI for a market overview in a compensation ruling, showing both utility and risk. This signals courts will tolerate AI for narrow, well-defined tasks, but not for adjudicative judgment.


