Why Teens Are Turning to AI for Mental Health Support
- TWH

A November 2025 study published in JAMA revealed a growing trend every parent and caregiver should pay attention to: teens and young adults are increasingly turning to artificial intelligence for emotional and mental health support. (1)
The study reported that 13% of U.S. youth now use AI tools to talk about their mental health. Among young adults aged 18 to 21, that number climbs to 22%, and two-thirds of those users seek AI support at least once a month.
Dr. Eric Arzubi, a board-certified child and adolescent psychiatrist based in the United States, shared his thoughts on the study on his LinkedIn page and summarized the finding bluntly:
“5.4 million American youth are using AI for mental health advice, and we’re the ones who forced them into it.”
He added,
“Our children couldn’t wait any longer for us to fix a broken system. So they found their own solution. The question isn’t whether youth should use AI for mental health—they already are. The question is whether we’ll finally address the access crisis that drove them there in the first place.”
According to the study, nearly 40% of teens with depression receive no mental health support at all. AI tools offer what the mental health system often fails to deliver: immediacy, accessibility, and affordability. When traditional therapy feels out of reach due to long waitlists, limited providers, or high costs, AI becomes an available listener, not just for youth and teens but for adults as well.
For many youth and teens, using AI isn’t about replacing human connection; it’s about survival in a system that feels unresponsive to their mental health needs. They needed help, and they looked for it where they could find it: through their technology.
Dr. Arzubi also noted that the study found Black youth rated AI tools significantly less helpful, suggesting that the social and cultural inequities already seen in traditional healthcare may now be repeating themselves in digital spaces.
While AI can provide temporary comfort, it also comes with serious limitations that parents, youth, and teens need to understand:
- No confidentiality: Unlike real therapists, AI tools don’t offer doctor–patient privilege. Anything shared could be accessed under a legal order.
- Data privacy: Many AI chatbots collect and use user data to improve their systems, or even for targeted advertising.
- Inaccurate or harmful advice: AI tools can produce responses that sound caring but lack empathy, nuance, or clinical context. In some cases, chatbots have delivered advice that could make a user’s situation worse.
Our takeaway after reading this study: don’t shame young people for turning to AI; rather, let’s understand why they felt they had to. When systems fail, people turn to whoever, or whatever, will listen. This shift should be both a wake-up call and a source of concern.
As parents and caregivers, the focus should be on rebuilding trust by ensuring real, human access to care. That means:
- Advocating for better funding and access to youth mental health services.
- Having open, nonjudgmental conversations with teens about why they might be turning to AI.
- Helping them understand the difference between supportive technology and professional help.
Teens aren’t turning to AI because they prefer talking to machines; they’re doing it because the people and systems designed to support them aren’t responding quickly enough. To rebuild their trust, we need to match their urgency with real action: accessible care, timely support, and genuine human connection. Without that, even more young people will continue seeking comfort from AI-based therapy apps.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
Reference: