Don’t Age Gate Youth In Canada, Instead Regulate and Legislate Big Tech Design
- The White Hatter

- Apr 12

Caveat - Our call to action is simple but important. Take a moment to share this article with your Federal Member of Parliament. As this issue moves closer to formal debate, it is critical that the conversation is informed by thoughtful, evidence based perspectives, not just headlines or quick solutions. Policy decisions in this space will have long-term impacts on youth, families, and the broader online environment. Those decisions should be shaped by a clear understanding of both the risks and the realities. Elected officials need to hear from the communities they represent, including parents, caregivers, and educators who are navigating this space every day. By sharing informed viewpoints, we help ensure that the discussion goes beyond surface level fixes and moves toward meaningful, effective approaches that prioritize both the safety and the rights of youth and teens in today’s onlife world.
This week, at the Liberal Party of Canada policy convention in Montreal, delegates debated and ultimately passed a resolution supporting age restrictions for social media and AI chatbots (1). It’s important to pause here, because there is a common misunderstanding about what this actually means. A resolution at a party convention is not law, and it does not obligate the federal government to act. Rather, it reflects the priorities and direction that party members would like their leadership to consider moving forward.
That distinction matters, especially in a conversation that is already emotionally charged. When headlines move faster than the details, it becomes easy for Canadians to assume that sweeping legislative change is imminent. In reality, what we are seeing is the beginning of a policy conversation, not the end of one. That conversation deserves a closer look, particularly when it comes to how public opinion is being interpreted and presented.
Since this announcement, many who support age based restrictions here in Canada have pointed to polling data from the Angus Reid Institute to reinforce their position (2). At first glance, the headline number is compelling. Roughly 75 percent of Canadians surveyed expressed support for a full ban on social media use for youth under the age of 16. Taken in isolation, that statistic appears to signal strong national consensus.
However, when we move beyond the headline and examine the full dataset, a more nuanced picture begins to emerge. In that same March 2026 polling, more than 70 percent of respondents indicated that responsibility for managing a youth or teen’s access to social media should rest primarily with parents, not government. Only about 20 percent supported direct government regulation of youth access.
This is where the conversation becomes more complex, and where important context is often left out. Canadians are not rejecting the idea of limits; in fact, many clearly support them in principle. However, there is far less agreement on who should be responsible for enforcing those limits. What we are seeing is not contradiction, but tension between concern and control, between protection and autonomy.
That tension reflects something deeper about Canadian values. There is a recognition that the digital environment can present both real challenges and real opportunities for youth and teens. At the same time, there is a strong belief in parental and caregiver responsibility and a degree of caution when it comes to expanding government involvement in family life. These positions can exist at the same time, but they do not always translate neatly into policy.
For parents and caregivers, this matters: it reinforces the idea that legislation alone is not a complete solution. Even if age based restrictions were implemented, the role of the parent or caregiver does not disappear. In many ways, it becomes even more important. Guidance, relationship, and ongoing conversations about digital life remain central, regardless of what policy decisions are made at the national level.
We also want to be clear about something we fully agree on: youth and teens deserve safe online spaces. They also deserve to have their childhoods respected and protected. However, age gating legislation does not meaningfully create safer environments. By contrast, legislation that targets the design of these platforms, the very features that shape behaviour and experience, has far greater potential to reduce harm. There is also an important balance to strike: preserving childhood should never come at the expense of a young person’s rights under the Canadian Charter of Rights and Freedoms, including their right to access information, connect with others, and participate in the digital world in developmentally appropriate ways.
Another argument we are hearing more frequently is that age gating legislation helps establish a social norm, giving parents and caregivers more authority to say “no”. On the surface, that can feel like a helpful shift. Many parents are looking for support in setting boundaries, and the idea that legislation could reinforce those boundaries is appealing.
However, when we look more closely, there are important limitations to that thinking. Rules that rely heavily on external authority often do little to build internal understanding. If a youth or teen’s only reason for not using a platform is because it is “against the law,” that boundary can quickly lose its influence when they find a workaround or simply age out of the restriction. If a young person is 15 today and turns 16 tomorrow, the realities and concerns tied to social media don’t suddenly shift overnight. The risks, pressures, and design features remain exactly the same. Age alone doesn’t change the environment, and it doesn’t automatically change a youth or teen’s ability to navigate it.
What we consistently see is that youth and teens are highly capable of navigating around technical barriers. VPNs, shared accounts, older peers, and secondary devices can all undermine the effectiveness of age gating. When that happens, the authority parents and caregivers were hoping to lean on becomes far less reliable.
There is also a difference between compliance and resilience. A rule may stop a behaviour temporarily, but it does not necessarily prepare a youth or teen to navigate that behaviour in the future. The goal is not just to delay access, it’s to equip youth with the skills they need when access inevitably comes. Without building digital literacy, critical thinking, and self-regulation, we risk postponing the learning process rather than supporting it. When youth and teens do enter these spaces, often with more independence and less oversight, they may be less prepared to make informed decisions.
The relationship between parent and child also plays a critical role here. When authority is rooted primarily in legislation, it can shift the dynamic toward enforcement rather than guidance. Youth and teens are far more likely to engage in open, honest conversations when they understand the reasoning behind a boundary. Authority that is built through connection, explanation, and trust tends to be far more durable than authority that is borrowed from policy.
As mentioned earlier, it is also important to recognize that age gating does not change the environment itself. The design features that drive many of the concerns parents have, such as algorithmic amplification, endless scrolling, and attention driven notifications, remain unchanged regardless of a user’s age. Turning 16 does not suddenly make these features less influential. If anything, they continue to operate in the same way, shaping behaviour and engagement over time. So while age based legislation may create the appearance of a new norm, it does not improve the conditions youth and teens are eventually stepping into.
Social norms themselves are more complex than legislation alone. They are shaped by peer culture, family values, and lived experience. Even in countries that have introduced age based restrictions, early observations suggest that peer dynamics continue to play a significant role. If a youth or teen believes that “everyone else is on it,” a legal restriction does not always carry the weight we might expect.
None of this is to suggest that parents and caregivers should not set boundaries. In fact, it highlights just how important that role continues to be. But rather than relying on legislation to create authority, a more sustainable approach is to build it through ongoing conversation, clear expectations, and a strong relationship. That kind of authority does not disappear when a child turns a certain age or finds a workaround; it stays with them as they begin to navigate the digital world more independently. As we like to say here at the White Hatter, “Be your child’s best parent and not their best friend when it comes to their use of technology; there’s a difference.”
This is why we continue to emphasize a different path forward. Rather than focusing primarily on age gating youth and teens, we should be directing our attention toward how these platforms are designed. The challenges many youth and teens experience online are not simply a function of access. They are shaped by environments that are intentionally engineered to capture attention and maximize engagement.
Features such as infinite scrolling, autoplay, push notifications, and algorithmic amplification are not accidental. They are core components of a business model that prioritizes time on platform. These design choices do not suddenly become less influential when a user turns 16. If anything, they remain just as powerful, if not more so, as teens gain greater independence online.
This is why legislation that targets design standards, transparency, and accountability has the potential to be far more effective. When we shift the focus from restricting access to improving environments, we move toward solutions that benefit all users, not just those below an arbitrary age threshold. It also addresses a practical reality. Age gating is not difficult to bypass, which makes enforcement inconsistent at best.
As this national conversation continues, it is also important to recognize that not all voices entering the discussion are neutral. There are individuals and organizations, including high-profile influencers and academics, who bring their own perspectives, priorities, and in some cases, agendas. That does not mean their contributions lack value, but it does mean they should be considered thoughtfully, especially when the solutions being proposed are simple answers to very complex issues.
Looking internationally, we can already see that age based restrictions are far from a settled solution. Countries that have moved in this direction are still working through questions of enforcement, effectiveness, and unintended consequences. Early indicators suggest that while such policies may signal intent, their real world impact is far less clear. Many youth and teens continue to access platforms despite restrictions, and the underlying design features that drive problematic use remain unchanged.
What this all points to is the need for a more balanced, evidence informed approach. One that acknowledges risk without overstating it, one that supports parents and caregivers without replacing them, and one that holds technology companies accountable not just for who is using their platforms, but for how those platforms are built.
At the end of the day, this is not just about keeping young people off platforms; it’s about ensuring that when they are on them, the environment they are stepping into is safer, more transparent, and designed with their well-being in mind.
So what should that kind of legislation look like, and why does it matter? We have explored this in more detail in the following two articles:
Again, please share this article with your Member of Parliament, and every parent and caregiver that you know!
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References: