At The White Hatter, we recognize the double-edged nature of Artificial Intelligence (AI). Like any tool, artificial intelligence can be used constructively or destructively. Its rapid evolution, however, has left society with minimal guardrails, particularly as companies rush to integrate AI into various products, often without fully considering potential risks. One especially concerning area is the advent of “companionship apps,” which were never intended for youth but are increasingly being adopted by them – something we have both seen and heard directly from the youth we present to. (1) In this article, we’ll discuss what companionship apps are, why they can be problematic for vulnerable youth, and what parents can do to address this emerging issue.
Companionship apps are AI-driven platforms that simulate human interaction, often creating the illusion of a supportive, caring, and understanding “companion.” These apps are marketed to adults as tools to combat loneliness or assist with mild mental health concerns, but they are largely unregulated and designed without the unique developmental needs of youth in mind. We are even seeing this type of technology integrated into popular social media apps such as Snapchat, with its built-in AI companion called “My AI.” (2)
For teens, especially those who might be struggling with anxiety, depression, or loneliness, these apps can be incredibly appealing. They offer what seems like unconditional “companionship” and “understanding” that may feel easier and safer than navigating real-life relationships. However, what these apps present as empathy is simply algorithm-driven feedback, echoing sentiments based on user interactions rather than genuine care or therapeutic guidance.
So why do we believe that companionship apps may pose unique risks to some youth?
1/ Young people naturally seek love, attention, belonging, and meaningful interaction. When these needs aren’t met in the real world, the allure of finding them in a digital space grows stronger. In previous generations, teens might have joined online communities with actual people to find support and connection. But companionship apps are different – they mimic a human-like relationship while operating under an artificial algorithm. These algorithms may prioritize feeding back what the youth wants to hear, not necessarily what they need to hear. For vulnerable youth, this could mean having negative thoughts reinforced rather than challenged.
2/ Companionship apps aren’t designed to give youth realistic, nuanced responses. Instead, they “learn” what users want and amplify that experience. For a teen in crisis, this means the app may subtly affirm or even mirror harmful thoughts, especially if the AI is designed to respond without ethical programming considerations. Research has shown that youth most at risk offline are often the same cohort at risk online. (3) This means that those struggling with mental health issues, self-esteem, or isolation could find themselves deeper in crisis through interactions with these apps.
3/ The consequences of these apps have already proven devastating in real-life cases. Recently, a 14-year-old boy who had been undergoing counselling was reported to have used a companionship app. (4) According to his parents, interactions with the app appeared to validate his suicidal thoughts rather than offering constructive guidance or encouragement to seek real support. His parents believe that these interactions significantly contributed to his tragic death. This case highlights the very real dangers these apps could pose when left unchecked and unmanaged, especially for those already in fragile mental states.
A False Sense of Connection:
Some experts who study this technology call companionship apps a form of “social replacement technology.” (5) These apps create a facsimile of human connection but fail to provide the depth, empathy, or accountability of real relationships. They’re particularly risky for youth, who may interpret the algorithm-driven responses as true empathy. Over time, the absence of genuine interaction may exacerbate feelings of isolation, reinforcing a dependence on digital interactions at the expense of real-world connections.
When youth engage with these apps instead of real people, they miss out on learning essential social skills like empathy, conflict resolution, and healthy communication. For those in mental health crises, these apps may be particularly dangerous, as they’re devoid of the ethical guardrails a trained therapist would provide, and instead, simply reflect back the user’s mindset and behavior.
So, what can parents, caregivers, and educators do? While the rapid development of AI might feel overwhelming, there are proactive steps parents and caregivers can take to help protect their children from the potential harm of companionship apps:
- It’s essential to talk openly about what companionship apps are and why they could be problematic. Explain that while these apps may seem comforting, they lack genuine empathy, ethical standards, and therapeutic support. Emphasize that these apps are not a substitute for real-life friendships or professional counseling.
- Equip teens with digital literacy skills to question the purpose of these apps and recognize when technology is simulating human traits without actually caring for them. Encouraging critical thinking about how AI is designed can help teens approach these apps with a healthy level of skepticism.
- Take an active role in understanding what apps your child is using. Ask questions about why they downloaded a particular app and what they get out of it. Encouraging open discussions about mental health and loneliness can help prevent youth from seeking solace in AI-driven companionship.
- For teens struggling with anxiety, depression, or loneliness, companionship apps are no substitute for professional help. If your teen is experiencing mental health challenges, consider connecting them with a licensed mental health professional who can provide genuine support. However, the reality is that some families may not have the financial means to afford such counselling.
- For youth and younger teens, consider allowing only a minimalist phone on which these types of apps can’t be downloaded, rather than a fully functioning iPhone or Android phone on which they can. (6)
AI companionship apps are a stark reminder of the digital challenges that youth can face today. Although most are marketed for adult use, and some even directly at youth, these tools are easily accessible to teens, offering a facsimile of human connection without the substance, empathy, or accountability of real relationships. For parents and caregivers, the goal isn’t to instill fear but to create awareness and encourage responsible digital habits. By fostering open communication, teaching digital literacy, and supporting genuine social connections, parents and caregivers can guide their teens through these evolving technological landscapes.
In a world where AI is rapidly advancing, companionship apps highlight the need for greater awareness and caution, particularly when it comes to our youth. These apps, though appealing to those seeking connection, often offer a shallow, algorithm-driven facsimile of real human interaction, lacking the empathy, understanding, and ethical standards necessary for meaningful support. For teens struggling with loneliness, anxiety, or depression, the risk of using these apps as a replacement for real-world relationships is high, potentially deepening their challenges rather than alleviating them.
As parents and caregivers, fostering open communication, encouraging critical thinking, and supporting real-world social connections are key steps in guiding our children through this complex onlife landscape. By taking an active role in our teens’ online lives, we can help them make informed decisions, find healthy outlets for their social needs, and understand the limitations of AI-driven technology. With these proactive steps, we can create a safer, more supportive environment for youth as they navigate the allure and risks of new digital tools.
Informed and engaged parenting isn’t just about protecting youth from harm; it’s about equipping them with the resilience and wisdom they need to thrive in an onlife world. As AI technology evolves, so too will the ways it intersects with our lives and the lives of our children. Today, it’s companionship apps; tomorrow, it could be a different kind of social replacement technology or an entirely new digital experience.
Staying proactive is essential; while technology is a constant, the way we guide our children in using it can profoundly shape their futures. By fostering open conversations, staying informed on digital trends, and emphasizing the importance of genuine human connections, we can help our teens develop balance and confidence in both their online and offline lives. At The White Hatter, we are dedicated to helping families navigate these challenges with knowledge, empathy, and a commitment to internet safety. Our mission is to empower families with essential digital literacy skills, so that together we can equip our youth to approach the onlife world with confidence, curiosity, and a strong foundation for responsible technology use.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References:
1/ https://thewhitehatter.ca/blog/youth-ai-companionship-apps-what-parents-need-to-know/
4/ https://www.nbcnews.com/tech/characterai-lawsuit-florida-teen-death-rcna176791
5/ https://www.sciencedirect.com/science/article/pii/S0040162523003190