Evidence Is Growing That People Shouldn't Use Unregulated AI Chatbots For Therapy
The American Psychological Association issues a health advisory against using popular AI chatbots for mental health treatment
Mental health professionals and researchers are adding to the mountain of evidence that using unregulated AI chatbots for therapy can be harmful and should largely be avoided.
This past weekend, the Wall Street Journal reported on research showing “systemic failures” in how AI chatbots respond to psychiatric disorders. The American Psychological Association also recently issued a health advisory recommending that AI and “wellness applications” not be used for psychotherapy or psychological treatment. Even the pro-tech outlet Bitcoin World published an article on lawsuits filed against OpenAI alleging that ChatGPT played a role in the deaths of four users who died by suicide, and that other users had developed “life-threatening delusions.”
“Our mental health care system is overwhelming, expensive and difficult to access.”
Whether readers are taking heed of these warning signs is another matter. After all, the problems that lead people to turn to AI chatbots for therapy are still around: a lack of access…