
Digital Afterlife and Mental Health AI Apps - What Parents, Caregivers, & Educators Need to Know!

  • Writer: The White Hatter
  • Jan 29
  • 5 min read

Updated: May 14


CAVEAT: This is a follow-up article to the one we posted earlier this month, titled “How Artificial Intelligence Is Changing The Landscape of Online Sexual Exploitation: What Parents Need to Know” (1).

Last week, during a FaceTime call with Wayne Denner, a friend and respected digital literacy and internet safety advocate from Ireland, we discussed emerging concerns for youth in their use of technology and the internet. Wayne asked for our perspective on the key issues that could impact young people in 2025. We shared that the increasing use of technology to radicalize youth has become a pressing concern, a topic we’ve previously written about (2). However, we now also believe that artificial intelligence (AI) and its application to app design is another concern that parents, caregivers, educators, and legislators should be aware of when it comes to our kids’ emotional, psychological, physical, and social well-being.

In today's onlife world, countless industries are heavily regulated to ensure safety and ethical standards. Airplane construction follows rigorous protocols. Food production and sales are subject to safety inspections. Builders and contractors must adhere to building codes to ensure safe housing. Even hairdressers are licensed professionals. These regulations exist not to stifle creativity but to protect individuals and communities.

So why should the tech and social media industries be any different? 

Critics often argue that regulating the tech sector could stifle innovation and put countries like Canada and the U.S. at a competitive disadvantage, especially against nations like China, which are pushing technological boundaries at breakneck speeds. To us, this sentiment mirrors the fears of a "nuclear arms race," where falling behind could mean losing geopolitical influence. However, ignoring the need for safety and ethical oversight in the race to develop new technologies could lead to significant harm, especially for vulnerable populations like children and teens.

The global competition and rivalry in artificial intelligence is at full throttle. China’s DeepSeek AI (3) and the U.S.’s ChatGPT (4) are prime examples of this race, with some likening the development of DeepSeek to a Chinese “Sputnik moment,” a pivotal juncture that could redefine global technological leadership. Once again, this sounds eerily similar to what we heard during the nuclear arms race of the past.

While these advancements are impressive, the profit-driven, fast-paced nature of AI development often prioritizes speed over safety. Without a framework of "safety by design," the consequences could be catastrophic, particularly in areas where AI intersects with youth safety and mental health.

We have written about the concerns surrounding AI deepfakes and companionship apps (5), but two other rapidly growing areas of AI development concern us: #1 digital afterlife technologies and #2 mental health applications. Both also need to be on the radar of parents, caregivers, educators, and regulators.

Digital Afterlife Technology: Grief, Ghost, and Legacy Apps

These AI-driven tools, such as Project December, allow users to simulate conversations with loved ones who have passed away, creating a "digital afterlife.” (6) While this concept might provide solace for informed adults who willingly engage with these apps, what happens when grieving and vulnerable youth turn to such tools?

Young people often process grief differently from adults. Their brains are still developing, and the emotional intensity of losing someone they love can lead to deep vulnerability. Turning to a digital simulation of the deceased could create confusion about reality, prevent healthy grieving, or even foster dependency on an artificial entity that cannot truly provide closure.

For example, imagine a teenager who loses a parent and begins using a digital afterlife app to maintain a virtual connection. Instead of working through their grief with the help of friends, family, or a counsellor, they might isolate themselves, relying on a simulation that reinforces their pain rather than helping them heal.

We highly recommend watching the documentary “Eternal You” (7) for a deeper understanding of this important issue.

Mental Health AI (Therabots)

AI-driven mental health apps, sometimes called "therabots," are marketed as accessible, stigma-free alternatives to traditional therapy. While these apps may seem like an innovative way to support mental health, they come with significant risks, especially for youth experiencing a crisis. (8)

A teen struggling with depression or anxiety might turn to a therabot because they feel too ashamed to seek help from a trusted adult or professional. But what if the AI provides generic or inappropriate responses that exacerbate the issue? Worse yet, what if the teen starts depending on the bot, avoiding real-life interventions that could genuinely help?

These apps lack the human empathy and nuanced understanding that a trained therapist brings to the table. For instance, a suicidal teen might input their feelings into a mental health app and receive a response that doesn’t adequately address the gravity of the situation, potentially leading to tragic outcomes.

The risks associated with AI technologies like deepfakes, companionship apps, digital afterlife apps, and therabots underscore the need for robust regulation. While innovation in these areas is undoubtedly valuable, we cannot allow profit-driven motives to overshadow safety considerations. However, given what has happened in the U.S. election and the rise of the tech-based oligarchy that we are seeing right before our very eyes (9), we believe that profit-driven motives will continue to trump safety-by-design initiatives.

Regulating the tech industry doesn’t have to mean stifling creativity. Just as aviation regulations didn’t stop humanity from reaching the skies or flying to the moon, thoughtful oversight in tech can pave the way for safe, ethical innovation.

As AI continues to evolve, it’s critical to ask: Are we prioritizing safety and humanity in our pursuit of progress? Or are we hurtling toward a future where the well-being of our most vulnerable population, our kids, is sacrificed for the sake of speed and profit? Sadly, we think it’s the latter, which is why parenting around the use and integration of technology is becoming an even bigger issue.

The rapid evolution of technology and artificial intelligence presents both incredible opportunities and significant risks, particularly for children and teens. While advancements like digital afterlife apps and AI-driven mental health tools offer intriguing possibilities, they also highlight the urgent need for safety-first, well-tested, and evidence-based approaches in the tech industry. These technologies, when left unchecked, could exacerbate vulnerabilities in youth, disrupt healthy emotional development, and place undue reliance on artificial solutions where human empathy is essential.

As parents, caregivers, educators, and legislators, it’s critical to remain informed and vigilant. Regulation of the tech industry, much like the safeguards in aviation, food production, and other industries, is not about stifling creativity; it’s about ensuring that innovation serves humanity rather than endangering it. Without robust oversight, the pursuit of profit and global competition will continue to outweigh ethical considerations, leaving our most vulnerable populations exposed to preventable harm.

The question we must all consider is not whether these technologies should exist, but how they can be designed and deployed responsibly. By advocating for safety by design and fostering open conversations with our children about technology’s potential risks and benefits, we can better prepare them to navigate this rapidly changing digital landscape. At the heart of it all is the need to prioritize humanity and well-being over speed and profit, because the stakes couldn’t be higher.

Digital Food For Thought

The White Hatter

Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech

References:
