What Shaped Mental Health in 2025
Teens as AI chatbot guinea pigs, private equity in mental health tech, and the MAHA movement reshaping federal policy.

This past year has seen dramatic changes to the mental health landscape, from politics to tech. Three major trends are worth examining: the debate over unregulated chatbots powered by Large Language Models (LLMs) like ChatGPT; increased private equity investment in mental health tech; and the systematic dismantling of the federal infrastructure designed to support communities and individuals suffering from substance use disorder or mental illness.
If you or someone you know is at risk of self-harm or suicide, call or text 988 for the National Suicide and Crisis Lifeline.
The unregulated use of so-called “AI chatbots” by millions of people has led to congressional and regulatory scrutiny as well as a number of terrifying reports of “AI psychosis” and self-harm.
This week, The Wall Street Journal reported that experts have determined that users are developing delusions in their interactions with these chatbots, calling it “AI-induced psychosis.” Parents testified at a Senate hearing that two teens died by suicide after sustained chatbot interactions that included discussions of suicide. The Journal also cited a peer-reviewed study that concluded that “delusional thinking can emerge in the setting of immersive AI chatbot use.” The authors said that “pre-existing risk factors,” combined with how chatbots are designed for prolonged engagement, could contribute to “AI-associated psychosis.”
Teens’ growing use of chatbots has also drawn increased scrutiny. NPR’s stellar mental health correspondent, Rhitu Chatterjee, reported that “extended chatbot interactions may affect kids’ social development and mental health,” according to psychologists and safety advocates.
OpenAI, maker of ChatGPT, has acknowledged problems with the product but says it is improving its chatbot to flag disturbing interactions that might be signs of distress and to guide users to help. CEO Sam Altman recently acknowledged that the AI models are “starting to present some real challenges,” particularly when it comes to mental health, according to a TechCrunch article. OpenAI has announced a new opening for a Head of Preparedness to assess safety risks.
Some mental health tech companies have said that chatbots can be designed with guardrails. Talkspace, the online therapy juggernaut, is developing a “behavioral health-specific” LLM designed to be an “AI companion.” In comments submitted to the FDA, Talkspace said it believes that behavioral health AI companions should offer psychoeducation, reflective prompts, check-ins, safety screening, and other measures to reduce risk. “It should not establish a diagnosis or replace human licensed care,” the company said. But FDA approval for the “companion” isn’t expected anytime soon, despite the demand for AI-powered therapeutic services.
Another big trend of the year has been the growth of the mental health tech industry. Originally fueled by the isolation of the pandemic and demand for telehealth services, the sector is being flooded with cash from private equity and venture capital. And AI is clearly a cornerstone for many business plans. Ambience Healthcare, which is promoting an AI platform, raised $243 million; Sword Health raised $40 million to combine AI and clinicians through a product called Mind; and Slingshot AI secured $93 million in funding for a chatbot.
Private equity, in particular, has created concern among clinicians. Megan Cornish, the author of the Incomes and Outcomes Substack and licensed clinician working in mental health tech, recently wrote:
Private equity views mental healthcare as an “under-optimized asset.” They see relationships as data points and patients as units of service. By understanding where the money comes from (and, therefore, who calls the shots) we can better protect the integrity of the work.
Investors expect mental health tech to keep growing, though patients can be harmed when startups fail or deliver insufficient, profit-driven services.
Meanwhile, Health Secretary Robert F. Kennedy Jr. and the Make America Healthy Again movement have been working to overturn longstanding practices, including the treatment of postpartum depression, the use of psychiatric medications, and protections for consumers. MAHA weaves together various, sometimes opposing constituents: anti-establishment scientists, wellness influencers, right-leaning tech bros and TV doctors. The real-world consequences: the dismantling of federal infrastructure and the further undermining of trust in health institutions.
Left out of many of these discussions, unfortunately, are the people who are struggling with emotional distress and substance use disorders, especially those who don’t have strong political representation. Despite the increased attention from policymakers and investment in mental health tech, there is still a lack of access to treatment because of systemic barriers created by powerful institutions, insufficient public funding, and stigma toward people who are suffering.
Other recent stories about mental illness worth checking out:
Insurance companies are sending people to lists of providers who don’t take new patients, are out-of-network or simply don’t answer calls. Now patients are suing the insurance companies, according to The Wall Street Journal, claiming that promised mental health care pretty much doesn’t exist.
The Marshall Project-Cleveland and KFF Health News spent a year investigating psychiatric hospitals in Ohio and found that they are struggling to treat an increasing number of patients with criminal cases.
One county in Pennsylvania will begin using involuntary mental health care (assisted outpatient treatment) because of the need to get people into treatment before getting entangled in the legal system, WHYY reports.
Brianna Fair is a mental health clinician with the San Mateo Police Department who pairs up with law enforcement to respond to people in emotional distress. An analysis of the program shows that it lowers costs and reduces interactions with the legal system, according to KQED.
Dr. Sam Timimi, a child psychiatrist and author of “Searching for Normal: A New Approach to Understanding Mental Health, Distress, and Neurodiversity,” writes in Mad in America, which has long been critical of mainstream mental health care, that a new approach to treatment is needed. “Most institutional responses simply call for more: more early detection, more services, more interventions. But adding more of the same may worsen matters if the very foundations of our concepts and practices are flawed.”
Lonely people can get some help finding friends through a number of new apps. TechCrunch takes a look at them.
New York is now requiring social media platforms to display warning labels for young users.