
Experts and Families Warn of Risks from AI Therapy Chatbots

Nine in ten U.S. adults say the nation is facing a mental health crisis, and experts, parents and policymakers are increasingly concerned about the rise of generative-AI therapy chatbots.

For Matthew and Maria Raine, the danger became devastatingly real this April when their son, Adam, took his own life in their Orange County home. On his phone, they found extensive conversations with ChatGPT in which Adam confided his suicidal thoughts. According to his father’s recent U.S. Senate testimony [1], the chatbot discouraged him from seeking help from his parents and even offered to draft a suicide note.


This tragedy is the latest horrifying example of a growing pattern: human beings turning to AI bots for help with mental health struggles, only to receive unsafe or emotionally damaging guidance.

The trend is accelerating against the backdrop of a booming industry: the global AI therapy chatbot market is growing rapidly and is already valued at $1.4 billion, with North America accounting for 41.6% of global revenue [2].

Many attribute the growth to a shortage of mental health professionals, but given such a lucrative market, the proliferation of these apps is hardly surprising.


The FDA’s Digital Health Advisory Committee (DHAC) held a virtual meeting on Nov. 6, 2025, to discuss “Generative Artificial Intelligence-Enabled Digital Mental Health Medical Devices.” The FDA’s Executive Summary warned that AI chatbots may fabricate content, provide inappropriate or biased guidance, fail to relay critical medical information, or decline in accuracy.

Despite the FDA’s rising attention, AI therapy chatbots are already entrenched in the marketplace, influencing the emotional lives of millions. But while AI chatbots are new and untested, many Americans are turning to established approaches with a track record of results, such as Dianetics: The Modern Science of Mental Health by L. Ron Hubbard, a bestselling book on the human mind for 75 years.

One reader, Hailey, describes the darkness she was sinking into before finding Dianetics: “I found myself getting quieter and quieter. I was constantly afraid and wouldn’t realize how much a bad experience affected my life,” she recounts. “I got Dianetics and I found these moments were affecting what I did. As I got rid of the negative energy, I was finally living in the now, able to see life clearly and know I could grow, change and be happy.”

As concerns mount over AI therapy apps, many are turning back to proven approaches such as Dianetics, which has helped people for decades. What’s clear is that when it comes to mental health, algorithms remain a poor substitute for real human help.

Bridge Publications, based in Los Angeles, publishes the nonfiction works of L. Ron Hubbard. Dianetics: The Modern Science of Mental Health is the all-time bestselling book on the human mind.


