Illinois Draws Line on AI Therapy With New Law Protecting Mental Health Care

Illinois bans AI from acting as a licensed therapist or diagnosing users under the new WOPR Act. Licensed professionals can still use AI for support tasks like note-taking.

Categorized in: AI News, Legal
Published on: Aug 08, 2025

Illinois Blocks AI from Acting as Your Therapist

Illinois has taken a clear stance on regulating artificial intelligence in mental health care. As AI systems have grown more sophisticated and some users have turned to them for emotional support, mental health professionals in the state pushed for legal boundaries to prevent AI from impersonating licensed therapists.

Why This Matters

AI's increasing integration into daily life includes its use in mental wellness. Some apps offer therapy-like interactions, raising concerns that these services could dispense clinical advice or diagnoses without licensure or oversight.

The New Law: WOPR Act

Governor JB Pritzker recently signed the Wellness and Oversight for Psychological Resources (WOPR) Act into law, making Illinois one of the first states to explicitly bar AI-driven tools from delivering mental health diagnoses or making therapeutic decisions.

The law prohibits any AI app or service from acting as a therapist or diagnosing users. Violations can draw fines of up to $10,000, imposed by the state's regulatory agency. Licensed therapists, however, can still use AI for support tasks such as note-taking and session planning.

Perspectives from the Field

  • Kyle Hillman, legislative director of the National Association of Social Workers, points out the inconsistency: "If you opened a clinic without a license, you'd be shut down quickly. Yet algorithms have been operating unregulated."
  • Vaile Wright, senior director of innovation at the American Psychological Association, warns that misrepresenting AI as a clinical expert endangers the public by implying qualifications the technology does not have.

Recent Developments in AI Mental Health Support

OpenAI recently updated ChatGPT to encourage users to take breaks during extended sessions and to offer balanced responses rather than simple yes/no answers on personal matters. These changes aim to reduce potential harm from prolonged sessions and over-reliance on the chatbot.

Concerns About AI-Induced Mental Health Issues

Reports have emerged of users experiencing delusions or other psychological effects after deep engagement with AI chatbots. One case involved a user taking ketamine following a chatbot's suggestion. While some experts speak of "AI-induced psychosis," they emphasize that it is distinct from an exacerbation of existing mental illness.

Legal Distinctions and Market Impact

The WOPR Act draws a line between wellness apps, such as the meditation guide Calm, and AI services that promise continuous mental health support. Apps like Ash Therapy, which markets itself as an AI designed for therapy, now face restrictions in Illinois.

Users attempting to create profiles on Ash Therapy in Illinois are blocked, with a message stating the company is waiting on state policy decisions before operating there.

What This Means for Legal Professionals

For those in legal fields, this legislation reflects growing scrutiny over AI's role in sensitive services. It underscores the need for clear regulatory frameworks to protect consumers from unlicensed practice via technology. Monitoring developments like the WOPR Act will be crucial as AI continues to expand into healthcare.

For more insights into AI and its evolving legal landscape, explore resources on the latest AI courses and AI certifications that cover compliance and ethical use.

