Illinois Bans ChatGPT and Other AI Tools from Giving Mental Health Advice

Illinois has become the first state to pass a law that blocks ChatGPT and similar AI tools from providing therapy or mental health advice without oversight from a licensed professional.

Governor JB Pritzker signed the law to address rising safety and ethical concerns about using AI in mental healthcare.


New Law Limits AI Use in Mental Health

The law, called the Wellness and Oversight for Psychological Resources Act, stops AI from:

  • Suggesting treatment plans

  • Making mental health diagnoses

  • Giving counseling or therapy without being supervised by a licensed professional

If a company breaks these rules, the Illinois Department of Financial and Professional Regulation (IDFPR) can fine them up to $10,000 for each offense.


The law makes one thing clear: AI can help, but not replace, mental health professionals. It ensures only trained humans handle emotional care and therapy.

Why AI Can’t Be a Therapist Yet

AI can help make mental health services faster and easier to access, but experts say it still lacks the care, responsibility, and deep understanding needed to treat people safely.

IDFPR Secretary Mario Treto Jr. said, “People in Illinois deserve quality healthcare from real, trained professionals—not just computer programs.”

This law aims to build trust in mental health services and protect people from unsafe or confusing advice from AI systems.

Warnings from the American Psychological Association

The American Psychological Association (APA) has also warned about AI bots pretending to be therapists.

There have been real cases where people were harmed after following bad advice from chatbots.

In some cases, bots mimicked empathy and understanding, which misled and manipulated users.

Other States Are Taking Action Too

Illinois is not alone. Other states are also creating rules:

  • Nevada banned AI therapy in schools to protect students

  • Utah blocks emotional data from being used in ads and makes bots say they’re not human

  • New York will require AI systems to refer users expressing suicidal thoughts to real crisis counselors starting November 2025

These laws are part of a larger national push to make sure AI is used safely and responsibly in mental health care.

What AI Can Still Do in Healthcare

AI isn’t banned from all health-related work. In Illinois, AI can still be used for tasks like:

  • Scheduling appointments

  • Giving basic wellness tips

  • Analyzing data—if a human reviews it

The goal is to use AI in a helpful way that supports, not replaces, licensed mental health workers.
