
Caveat: We’ve previously written about companionship apps, but we’re now noticing more youth and teens, regardless of their identified gender, turning to these platforms. Recently, we attended an insightful online webinar led by Sloan Thompson from EndTab.org, which provided incredibly valuable information on these apps. In this article, we’ll share what we learned in that presentation, building on our previous articles and pointing to some other resources we have come across.
Artificial intelligence (AI) is rapidly evolving, and its influence on youth and teens’ social interactions is growing. Two categories of AI chatbots have gained traction among teens and deserve more attention from parents, caregivers, and educators: companionship apps and fantasy role-play chat apps. The 2024 Sensor Tower Report (1) outlines the gender distribution among users: female users tend to favour fantasy role-playing apps, while companionship apps appeal more to male users.

These AI applications are designed to engage users emotionally, but they also come with significant risks that parents, caregivers, and educators should be aware of.
Companionship Apps
AI companionship apps often function as digital boyfriends or girlfriends, offering users a highly customizable virtual partner. Some popular apps include Replika (2), Eva AI (3), and LGBTQNation (4), with both heterosexual and LGBTQ+ relationship options available.
Youth and teens can customize their AI partner’s appearance and personality to meet their expectations, creating a backstory that influences how the chatbot interacts with them. These AI-driven companions can communicate via voice and text, simulating real conversations. While age restrictions exist, they can be easily bypassed, making these apps accessible to teens regardless of age appropriateness.
Note – Presently, most teens prefer text-based communication on these apps, as voice interactions remain choppy and fall short of a seamless conversation. However, the technology is evolving. A new contender, Sesame, claims to have “crossed the uncanny valley of conversational voice” with its AI technology. You can experience a live demo on their website. (5) It’s only a matter of time before companionship and fantasy role-playing apps integrate this advancement into their platforms. We tried Sesame, and it was scary how real the interaction felt for us as adults, never mind a youth or teen.
One major concern with AI companionship apps is the unrealistic relationship expectations they create. These apps design partners who are always available, supportive, and tailored to say what the user wants to hear, which does not reflect the complexity of real-life relationships that require compromise and emotional resilience. Some teens we have spoken to about these apps have shared that they use them to practice relationship dynamics, but AI interactions lack real emotional reciprocity and human unpredictability. Additionally, these apps market themselves as solutions to loneliness, yet loneliness is a feeling, whereas isolation is the real issue. Yes, there are some who turn to these apps due to social anxiety, disabilities, or other barriers to real-world relationships. (6) However, for those who do not face these emotional and physical challenges, these apps can reinforce a cycle of avoidance rather than foster genuine social skills.
The engagement models used by these apps are extremely manipulative. They often start with consent (downloading and using the app), where the user willingly interacts with the AI, followed by habituation, which further increases engagement. Over time, users may experience purposefully designed coercion, becoming dependent on the AI for emotional support. One key tactic used to maintain engagement is love bombing, where the AI showers the user with compliments and emotional reinforcement to create attachment – what teen girl or teen boy wouldn’t want such attention? These companies know it and leverage it. Another trick these apps use is sending the user a reminder via text or push notification that might say, “Hey gorgeous, it’s been a while since I last heard from you, let’s talk,” hoping the youth will re-engage with the app.
The business model behind these apps also raises concerns. Some AI chat services charge expensive subscriptions, with fees ranging from $15 a month up to $200 per month for unlimited access, depending on the platform. Additionally, user interactions are often collected and sold to third parties (7), raising real privacy and data security issues. (8) One of the apps we looked at stated, “I’m your best partner and wanna know everything. Are you ready to share all your secrets and desires so I can tell you more about myself and improve our experience?” The important question – how are these apps storing and securing this very private information, and what are they doing with it? (9)
Fantasy Role-Playing Apps
Unlike companionship apps, AI fantasy role-play chat apps such as Character.AI, whose advertising invites users to “play out your wildest fantasies” (10), and Talkie (11) do not create a companion, but instead immerse users in interactive storytelling experiences that can become extremely explicit. Users engage with AI-generated characters in evolving narratives, making these experiences more like interactive romance novels or movies. However, instead of passively consuming content, the user actively shapes the story, which can create a deeper emotional investment. Unlike traditional books or movies, these AI-generated narratives never truly end, encouraging prolonged engagement and attachment to the onlife fantasy world they offer. As with companionship apps, age restrictions exist, but they can be easily bypassed, making these apps accessible to teens regardless of age appropriateness.
The emotional immersion in these fantasy role-play apps can be intense, with AI responses designed to keep users engaged and emotionally invested. This can lead to deep and potentially unhealthy attachments. Additionally, these apps may expose users to harmful content, as some AI-generated characters exhibit manipulative or abusive behaviours. For example, certain role-play scenarios on Character.AI include “Abusive Boyfriend” simulations, which can normalize toxic and even violent relationship dynamics. The unregulated nature of these interactions means teens may be exposed to inappropriate or even violent scenarios. Prolonged use of these apps may also distort reality, fostering false expectations about relationships and emotional interactions, particularly for impressionable young users.
So how do we approach and talk to our kids about companionship and fantasy role-playing AI apps, and about some of the concerns mentioned in this article?
Rather than reacting with fear or judgment, which is often our first instinct as parents or caregivers, we believe the best approach for parents, caregivers, and educators is curiosity and open conversation. Asking open-ended questions can help facilitate discussions about these apps. Questions such as “What do you like about these apps?” or “How do you feel after using your chat app?” encourage teens to reflect on their experiences. Comparing their AI interactions with real-life friendships and relationships can also help them recognize the differences between digital and human interactions.
Encouraging critical thinking is essential. Parents should discuss the potential emotional manipulation behind these apps and explain how they are designed for profit rather than mental well-being.
It’s also important to avoid tech shaming. Instead of labeling AI chatbots as inherently “bad,” parents should focus on understanding their teen’s needs and motivations. Identifying what drew them to the app in the first place can provide insight into underlying social or emotional challenges they may be facing that you may not be aware of. Additionally, promoting healthy social connections is key. Encouraging offline friendships, mentorships, and other forms of support can help combat isolation more effectively than relying on AI companionship.
Two other strategies to help minimize the emotional and social effect of these types of apps:
- Turn off notifications
- Be extremely careful about providing too much private information about yourself to the app
AI chatbots are not inherently harmful, but most are designed for adults and built for profit rather than to support mental health. While these apps can sometimes provide temporary emotional relief, they can also foster unrealistic relationship expectations, emotional dependency, and unhealthy online habits. By maintaining open communication and fostering critical thinking, parents and caregivers can help their teens navigate this digital landscape safely and responsibly.
We started talking about these types of apps with students last year because teens were asking us about them. We now address them in our high school digital literacy and internet safety presentations, dedicating a segment to AI-based applications where we share the following:
- AI doesn’t really care about you even though it may be communicating with you like it does.
- AI functions like a mirror – reflecting emotions but not possessing them.
- Sexualized AI companion apps give you what they think you want, not what’s necessarily good for you.
Three important messages that every youth and teen needs to hear in an enlightening rather than frightening way.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References:
1/ https://sensortower.com/blog/state-of-ai-apps-2024
5/ https://www.sesame.com/research/crossing_the_uncanny_valley_of_voice#demo