Mental Health Professionals Can Play A Role In AI Therapy
With people increasingly seeking mental health care from AI chatbots, it's time to consider whether we need a "therapist in the loop" model for virtual therapy.
Update June 13, 2025: After I published this, 404 Media reported that the Consumer Federation of America, a consumer rights organization, filed a complaint asking the Federal Trade Commission to investigate Character.ai and Meta for the “unlicensed practice of medicine facilitated by their product” … “without adequate controls and disclosures.”
The clichéd image of a therapist’s office shows a comfy couch, relaxed lighting, and the practitioner listening thoughtfully from a respectful distance. A framed degree may hang on one wall, demonstrating that the therapist has the education required to help clients navigate mental health challenges. On a first visit, you might be asked to sign an agreement that protects your confessions, whether they concern a cheating spouse or a little lie that generated anxiety.
The point is, the therapist’s office is supposed to be a safe space shaped by tradition, professional ethics, and legal guardrails. But people are increasingly turning to AI…