The Rise of Empathetic Artificial Intelligence Companionship Apps – What Parents Should Know Part 2

November 3, 2024

Caveat – In October we wrote an article, “The Real Concerns Surrounding Companionship Apps and Some Youth – What Parents, Caregivers, and Educators Need To Know” (1). This is a follow-up to that article.

In the rapidly evolving world of artificial intelligence, “empathetic Artificial Intelligence (AI) companionship apps” are gaining popularity, especially among teens and young adults – something we have also heard anecdotally from the youth we present to. Platforms like Replika (2) and Character.ai (3) allow users to engage in highly personalized, immersive interactions that are meant to mimic human companionship. These apps are widely discussed on social media platforms like Reddit and TikTok, where young users share their experiences and connections with these digital companions. However, parents may assume these apps are safe because their child isn’t interacting with an unknown real person – a false sense of security that, as you will read, isn’t always warranted.

AI companionship apps are designed to offer users a responsive, always-available “friend” or “partner” that doesn’t get bored with them and can adapt to a user’s preferences. They offer highly immersive experiences, blurring the line between fantasy and reality by creating a virtual lived “story” that unfolds based on user inputs. These apps, however, aren’t sentient, yet they come across as if they are – engaging users through realistic two-way dialogue, personalization, and endless patience. This can be especially appealing to teens who may feel isolated or misunderstood in real life.

Having tested some of these companionship apps ourselves, we found that they clearly blur the line between fantasy and reality – a distinction we believe can be extremely challenging for youth and teens to make. Fact – AI companionship apps offer highly engaging, immersive experiences, which can appeal to a teen who is seeking connection or even escapism.

While the concept of AI companionship can seem harmless, especially if the AI is “just a program,” there are risks. For one, Silicon Valley’s motto of “move fast and break things” has led to innovations that prioritize being first to market rather than ensuring user safety in their design, and this approach has carried over into AI companionship apps – in one case, a companionship app appears to have had deadly results involving a teen (4). Moving fast and breaking software or hardware is one thing, but when this mindset breaks humans, especially teens, at an emotional, psychological, physical, and social level, that is a whole new concern of epic proportion.

Character.ai, one of the most popular platforms among users aged 13 to 25, has over 20 million monthly active users who spend an average of two hours interacting with their AI companions. However, Character.ai presently lacks significant guardrails, making it potentially harmful for younger teens who may be more emotionally and psychologically vulnerable to this type of technology.

For-profit companies like these have incentives to keep users, including teens, engaged in order to increase monetization, which can lead to problematic usage patterns. Without any real research on the long-term effects of these companionship apps, the emotional, psychological, and social impacts on young users are unknown, but we believe they would be concerning, particularly for teens already at risk for mental health issues offline.

However, there are situations where AI companionship apps may serve a positive purpose, such as with isolated adults or seniors in long-term care facilities who lack regular human interaction. For them, these apps may offer comfort and reduce feelings of loneliness. But for younger users, especially teens, the potential for these types of apps to blur reality, reinforce isolation, or foster dependence is a significant and real concern.

So, what are some of the key takeaways for parents and caregivers?

  • AI companionship apps offer highly engaging, immersive experiences, which can appeal to teens seeking connection or escapism.

  • Just because it’s “not real” doesn’t mean it’s safe. We believe that these apps can still influence mental health, especially when it comes to vulnerable youth.

  • Platforms like Character.ai have few protective measures or guardrails. Their business model relies on extended engagement, which can lead to unhealthy, problematic usage patterns. For this reason, we would not recommend these types of apps for younger teens, or for any teen who is experiencing mental health challenges offline.

  • Help teens distinguish between reality and AI-driven fantasy. Reinforce healthy social skills and encourage real-life connections.

  • With no long-term research on the effects of these apps, staying aware of emerging insights is crucial as the technology evolves – something that we will continue to do here at The White Hatter.

As Laurie Segall, a tech-focused investigative reporter, stated, “These companionship apps are a type of AI fan fiction fantasy platform.” Laurie has produced a truly spectacular, documentary-style investigative video on these AI companionship apps that every parent, caregiver, educator, and teen should watch here:

As AI companionship apps like Replika and Character.ai become increasingly popular among teens and young adults, parents must stay informed and proactive in addressing their potential impacts. While these apps can offer comfort and alleviate feelings of isolation for some adult users, especially isolated adults or seniors, they also present significant risks for younger individuals. The immersive and personalized nature of these platforms blurs the lines between fantasy and reality, making it challenging for teens to distinguish genuine human interactions from AI-driven experiences.

The “move fast and break things” mentality prevalent within the tech industry has led to the rapid deployment of these technologies without adequate safety measures, increasing the likelihood of negative emotional, psychological, and social consequences for vulnerable youth. The monetization strategies of the for-profit companies developing these types of apps further exacerbate these issues by encouraging prolonged and potentially unhealthy usage patterns.

Parents should be vigilant in monitoring their children’s interactions with AI companionship apps, understanding that the absence of a human counterpart does not equate to safety. It’s essential to foster open dialogues about the nature of these technologies, reinforce the importance of real-life relationships, and set appropriate boundaries to mitigate the risks associated with their use. Be your child’s best parent and not their best friend when it comes to their use of technology – there is a difference.

Additionally, advocating for more comprehensive research and robust regulatory frameworks can help ensure that the development of AI companionship technologies prioritizes the well-being of all users, particularly the most impressionable – youth and teens.

By taking these steps, parents can help navigate the complex landscape of AI companionship apps, ensuring that their children benefit from technological advancements without falling prey to their potential dangers.

Digital Food For Thought

The White Hatter

Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech

References:

(1) https://thewhitehatter.ca/blog/the-real-concerns-surrounding-companionship-apps-and-some-youth-what-parents-caregiver-and-educators-need-to-know/

(2) https://replika.com/

(3) https://character.ai/

(4) https://www.nbcnews.com/tech/characterai-lawsuit-florida-teen-death-rcna176791
