
As many of you who follow our work already know, we strongly believe that parents and caregivers play a vital role in keeping their children safer online. Through communication, supervision, and education, caregivers can help guide young people toward healthier digital habits and protect them from the darker corners of the internet. (1)
But there’s a group of kids we often forget about: those who don’t have that level of guidance or protection.
What about youth living in group homes? The runaways couch-surfing through unstable environments? Or even those living in homes where a parent or caregiver is physically present but emotionally or mentally checked out?
For these youth and teens, there may be no one helping them set boundaries, no one teaching them how to identify a scam, spot grooming behaviour, or understand the permanence of a post. It’s these very youth, those who fall through the cracks of adult protection, who are often at the greatest risk online.
That’s why conversations around “Duty of Care” legislation aimed at Big Tech matter so much. (2)
At their core, Duty of Care laws seek to place legal responsibility on tech companies to design and manage their platforms with the safety of children and vulnerable users in mind. It’s a proactive shift: rather than simply responding after harm has occurred, companies would be expected to build safety into their products from the start.
This could mean:
- Limiting algorithmic exposure to harmful content
- Designing safer default privacy settings for minors
- Flagging suspicious behaviour more effectively
- Preventing anonymous accounts from directly messaging children
- Enforcing stronger age verification and moderation tools (admittedly, this one is easier said than done)
The aim is to create digital environments where children can explore, learn, and socialize without being exploited or harmed in the process. Having said this, it’s a Nirvana Fallacy (3) to expect that legislation in and of itself will protect our kids 100% of the time. (4)(5)
While parents and caregivers are the first line of defence for many children, the unfortunate reality is that not all kids have that defence. Some are navigating digital spaces alone, and the internet doesn’t wait for them to grow up or catch up; it comes at them fast, and sometimes dangerously.
In our work, we’ve seen firsthand how predators target the vulnerable. They know which kids are isolated, which ones respond to attention, and which ones have no adult in their corner. (6) These predators aren’t just lurking in the shadows anymore; they’re using smart tools, AI-generated personas, and sophisticated grooming tactics.
For a 13-year-old without any adult guidance, that’s not just overwhelming; it can be life-altering.
When Big Tech companies aren’t held accountable for the purely profit-driven environments they create, they leave these kids to fend for themselves in digital spaces that were never built with their protection in mind.
This isn’t an argument against parenting playing a key role. Quite the opposite. We still believe that involved, informed parenting is one of the most powerful protective factors for a child online. (7) However, we must also acknowledge that many youth and teens are left out of this support system. And we can’t keep building an internet that assumes someone is always looking out for them.
This is why Duty of Care legislation is essential. Right now, there are few legal or financial incentives for Big Tech to do more to protect kids online. This needs to change.
It’s about system-wide responsibility. It’s about ensuring tech companies play a role in safety, not just parents, caregivers, or schools. And it’s about recognizing that children’s rights to be safer online should not depend on their home life circumstances.
If we really care about keeping all kids safe online, not just the ones who are lucky enough to have engaged parents and caregivers, then it’s time we support a model that shares the responsibility. Tech companies have the resources, the data, and the influence to make significant changes, but there are no legal or financial consequences, and no incentives, compelling these companies to do more to protect kids online.
The onlife world is not an equal playing field. While some youth and teens benefit from the watchful eye of engaged parents and caregivers who guide them through their online lives, many others are left to navigate this complex landscape entirely alone. Foster youth, runaways, and those living in homes marked by neglect or indifference: these are the kids who slip through society’s safety nets. They are among the most digitally vulnerable, yet they are the ones least likely to have someone watching out for their safety online. This is precisely why Duty of Care legislation is not just helpful, it’s essential.
We cannot continue to rely solely on families and schools to bear the full weight of online safety when so many children don’t have those supports. Instead, we need a system-wide approach where responsibility is shared and where tech companies are no longer allowed to prioritize profit over protection, especially when vulnerable children are involved.
Duty of Care legislation holds Big Tech accountable to a higher standard. It ensures platforms are designed with child safety at their core, not as an afterthought. It creates meaningful incentives for companies to act through enforceable standards, clearer expectations, and real consequences for inaction.
This is not about villainizing technology. We believe in the positive power of the onlife world when it is used responsibly with intent. But believing in that power also means demanding that the companies who shape it do so with intention, ethics, and a commitment to protecting every child, and not just those with strong parental oversight.
No child should have to earn their right to safety online by virtue of having good parents or caregivers.
If we want to create a digital world where all kids, regardless of their background, can explore, connect, and grow without fear of exploitation, then we must call for change at the structural level. Because when tech companies are required to care, we all benefit. And more importantly, so do the youth and teens who need it most.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References:
4/ https://thewhitehatter.ca/blog/you-cant-childproof-the-internet-through-legislation/
6/ https://thewhitehatter.ca/online-sexual-predation-and-exploitation/