
“Wicked Problems” and the Online Safety of Youth: Why Digital Literacy Demands Integrated Solutions Rather Than a Simplistic Banning Approach

April 15, 2025

A note before we begin – We’d like to thank Chris McKenna for introducing us to the concept of “Wicked Problems.” Although we may differ in our approaches to digital literacy and internet safety, we both deeply care about our kids and share more similarities than differences. After diving deep into the idea of wicked problems, we quickly realized how many of its principles apply to youth and teen digital literacy and internet safety. That inspired us to write this article, so here’s our take. Enjoy!

If you’re a parent or caregiver trying to keep your child or teen safer online, you’ve probably found yourself overwhelmed by the ever-changing world of technology, the internet, social media, apps, privacy settings, and online risks. Just when you think you’ve got a handle on things, a new trend, platform, tech product, or threat pops up. It can feel like playing whack-a-mole, and at times it is.

In the evolving landscape of digital technology, keeping youth and teens safer online has become a significant concern for parents, educators, presenters, policymakers, and society at large. However, many efforts to address online safety and digital literacy fall short not because of a lack of concern or commitment, but because we are facing what is known in systems thinking as a “wicked problem.”

The term “wicked problem” was introduced by design theorists Horst Rittel and Melvin Webber in 1973. A wicked problem isn’t necessarily evil, although some would like you to think it is; it’s “wicked” in the sense that it is complex, interconnected, and notoriously difficult to define, let alone solve. Wicked problems have no single, clear solution. They involve incomplete or contradictory information and ever-changing variables, and they resist simple cause-and-effect thinking.

What Makes a Problem “Wicked”?

In today’s onlife world (a term that recognizes the seamless integration of the online and offline lives of youth and teens), our kids are constantly navigating shifting digital landscapes. What worked for your youth or teen last year may no longer apply today. The risks aren’t just about strangers and explicit content; they also involve mental health, social pressure, misinformation, habituation design, data exploitation, and privacy challenges. Today, with AI-generated deepfake content becoming more realistic and accessible, the urgency to educate and empower youth, rather than simply restrict them, becomes even more apparent.

When we apply the wicked problem framework to youth and teen digital literacy and online safety, we begin to see how simplistic solutions such as banning devices, delaying access, or enforcing surveillance-heavy controls miss the mark. They fail to account for the dynamic, diverse, and deeply personal nature of how youth use technology. Here’s what we believe to be some of the characteristics of a wicked problem as they apply specifically to youth and teen digital literacy and internet safety.

No Clear Definition

What does it mean to be “safer” online? For some, it means protection from strangers and explicit content. For others, it includes mental health protection, data privacy, digital reputation management, freedom from habituating algorithmic patterns, or all of the above. The goals vary widely depending on cultural values, faith, age groups, and technology use, making the problem nearly impossible to define in one sentence.

To understand how the concept of “wicked problems” applies to youth and teen online safety and digital literacy, it’s crucial to recognize that these challenges are complex, ever-changing, and deeply interconnected with social, psychological, emotional, and technological dynamics. A wicked problem in this context is not just difficult; it is nearly impossible to solve in the definitive, one-size-fits-all way that some are promoting, such as banning technology or preventing access to the internet until a certain age.

Online safety and digital literacy among young people cannot be neatly defined because the issues involved, such as cyberbullying, sextortion, misinformation, screen use, and algorithmic influence, are constantly evolving. As new platforms emerge and digital behaviours shift, so too does the nature of the risks and how we need to respond. The problem itself doesn’t have a fixed shape; it’s fluid and entangled with broader societal issues like mental health, education, parenting norms, and the lack of tech industry ethics.

This is why we reject “one-size-fits-all” approaches. Digital safety must be framed within the unique context of each youth’s life. For one teen, gaming may offer critical social connection. For another, it may become a source of toxic stress. The same tool can provide value or harm depending on how it’s used, emphasizing why the concept of screen value, not screen time, is a more useful measure for parents and educators.

No Finish Line

Even if you implement all the latest parental controls, moderation tools, and educational programs, the nature of the internet, and youth behaviour, will change tomorrow. New apps emerge. New threats develop. New behaviours trend. There is no finish line to cross, no moment where we can say “problem solved.”

There’s also no clear endpoint to this work. We can’t declare the problem “solved” because any attempt to address one part of the issue, such as age gating, improving privacy settings, or teaching youth critical thinking, immediately alters the landscape. Solutions often create new questions. For instance, restricting screen use might reduce exposure to harmful content but could also unintentionally limit access to beneficial online communities or learning opportunities.

This underscores the need for adaptive digital parenting. The job is never done, but that doesn’t mean we’re failing. We’re simply navigating a moving target. It’s not about arriving at “complete safety,” but about continually helping youth build resilience, critical thinking, and agency.

Solutions Are Judged, Not Proven

Blocking access to technology or certain apps may reduce exposure to risks, but it may also infringe on a teen’s autonomy or harm their ability to socialize. Teaching digital literacy in schools is widely praised but hard to measure in terms of long-term effectiveness. There’s no universally accepted way to evaluate success, only stakeholder judgments of what’s “good enough.”

Solutions in this space are not black and white. What works for one teen might not work for another. Some parents, educators, and policymakers might see tighter content moderation as helpful, while others view it as censorship. These perspectives can be shaped by values, culture, faith, and personal experience. So, when evaluating solutions, we’re often choosing between “better or worse” or “more or less desirable,” rather than “right or wrong.”

Some solutions may feel good emotionally, like surveillance apps or GPS trackers, but may backfire by fostering distrust, reducing youth agency, or increasing risk. This is why we caution: Just because you can, doesn’t mean you should. We encourage parents to prioritize building relationships over relying solely on tech-based controls.

No Immediate or Final Test of Success

Even the most thoughtful online safety program might not show results for years. A youth or teen who seems unaffected by risky online behaviour today may experience consequences later in life when heading to college or joining the workforce. We often only see the real test of our digital education strategies after the damage, or success, has been done.

Adding to the complexity, we rarely know right away whether a particular intervention, like banning technology, introducing a new digital literacy curriculum, or applying a new parental control tool, has worked. Often, the effects only surface over time, which is why longitudinal research is so important, even though it is time-intensive. A youth or teen may initially resist media literacy education, yet the long-term benefits of increased critical thinking might not be fully apparent for years; this is what the social sciences are trying to measure, but again, that takes time.

Teaching youth to think critically about social media, AI, sextortion, or digital manipulation is not a one-time lesson; it’s a lifelong process. You may not see the benefits today, but the seeds planted through open dialogue and media literacy will grow over time, something we have seen empirically time and time again.

It’s also why we push back against reactionary moral panics that demand immediate action without solid evidence-based research. Good digital literacy outcomes may take years to manifest, but that doesn’t make them any less vital.

One-Shot Operations

Encouraging teens to share their experiences online or post content can backfire if they’re targeted by bullies or online predators. Telling a parent to monitor a child’s device may permanently damage trust. Once you hit that send button, it’s hard to walk back what you just posted.

Even more challenging is that most solutions are “one-shot operations.” Once a policy is implemented or a conversation is had with a teen, it can’t easily be taken back. Banning access to phones and the internet until a certain age might offer short-term protection, but could also lead to social isolation or delay the development of digital resilience and agency.

Conversations about sexting, privacy, or digital ethics are not “once and done.” Nor are rules about device use. Every policy, conversation, or intervention leaves a digital or emotional footprint. Parents must approach these topics with care, flexibility, and curiosity, knowing that mistakes can’t always be undone.

This is where the “redirect and pave the way” philosophy that we promote becomes crucial. Rather than delay tech use out of fear, we believe parents and caregivers must proactively guide kids through it, before they make critical mistakes alone.

No List of Solutions

Should schools teach coding or even use tech in the classroom? Should parents be the primary digital mentors? Should social media companies bear the full weight of responsibility? Should governments legislate age limits? The “solutions” vary wildly and are deeply rooted in ideological and political values. There’s no clear path forward. However, could a convergence of all the above be an important starting point?

All of these may be part of the answer, or none of them, depending on the context. What’s needed is a convergent mindset, where families, schools, researchers, and industry collaborate, not compete, for solutions.

There’s no comprehensive list of answers to choose from. Parents, caregivers, educators, and youth themselves often have to get creative in finding what works for them. Strategies must be flexible and adaptable, ranging from collaborative family tech agreements, to peer-led education or school-based digital wellness initiatives, to the right tech at the right time. It’s all about “principles stay the same, diverse in their application.”

Every Situation Is Unique

A 13-year-old girl on TikTok in rural Canada faces different risks and realities than a 17-year-old boy gaming online in downtown Toronto. Socio-economic status, cultural background, neurodiversity, and family dynamics all play a role in how digital experiences unfold, making a one-size-fits-all approach ineffective.

The risks and opportunities of technology depend on the individual. A neurodiverse teen using YouTube to learn social skills has vastly different needs than a socially anxious youth navigating Snapchat.

Equity matters. Not all families have access to the same tools, bandwidth, or support. This is why community-based solutions and inclusive digital literacy programs are so vital.

Each situation is also unique. A digital issue that affects a youth or teen in an urban school might look very different from one facing a youth in a rural community. As we stated earlier, cultural differences, faith, access to resources, family dynamics, and even local legislation all influence how the problem presents and how it can be addressed.

It’s a Symptom of Other Problems

Youth and teen online risk is not just a tech issue; it reflects broader concerns like the decline of mental health supports, parenting gaps, educational inequities, social media regulation, and even systemic economic pressures. Trying to solve “online safety” in isolation is like putting a Band-Aid on a leaking dam.

Wicked problems in digital safety for youth and teens are often symptoms of even bigger systemic issues, such as inequality, lack of mental health support, or corporate prioritization of engagement for financial gain over well-being. Trying to “solve” online safety in isolation misses the broader context that shapes young people’s digital experiences.

Online risk isn’t just about phones or apps; it’s often a mirror reflecting broader issues: lack of mental health services, inadequate digital education, parental overwhelm, and exploitative tech design. Trying to “fix” youth online behaviour in isolation is like treating symptoms while ignoring the root cause.

We must acknowledge how capitalism, algorithmic manipulation, and data monetization influence the onlife environments our youth inhabit. The burden shouldn’t fall only on youth or their parents; yes, both play a huge role, but tech companies and governments must be held accountable too.

Different Definitions = Different Solutions

Some parents see tech as a privilege; others see it as a necessity. Some educators focus on cyberbullying; others worry about disinformation. Some policymakers want age verification; others worry about surveillance and privacy rights. Because everyone defines the problem differently, they advocate for different and sometimes opposing solutions.

How we define the problem greatly influences the solution. If the issue is framed as “youth and teens are irresponsible with tech,” the response might be more restrictive. But if the problem is seen as “youth and teens lack access to digital literacy education,” the solution becomes more empowering and preventative. Different stakeholders, such as parents, schools, digital literacy and internet safety advocates, tech companies, and policymakers, often see the problem through very different lenses, which can lead to tension or inaction.

Some frame tech as dangerous; others see it as essential. Some view youth as vulnerable; others see them as capable. These opposing beliefs shape drastically different solutions, ranging from total bans to fully open access.

Instead of choosing sides, we advocate for a nuanced middle ground: one that respects developmental stages, recognizes youth agency, and embraces a “principles stay the same, diverse in their application” approach.

No Room for Error

A single failure to protect a child online can result in serious harm such as exploitation, bullying, mental health crises, or even suicide. As a result, there is immense pressure on parents, schools, presenters, and tech companies to get it “right” even when the path forward is unclear.

The stakes are high. Exploitation, bullying, and mental health challenges are very real. But fear-driven approaches can also cause harm, especially when they shut down conversations, stigmatize normal adolescent behaviour, or delay important education.

This is why we urge parents and caregivers to embrace the reality that mistakes will happen, and that it’s our response to those mistakes, not the prevention of them, that often determines long-term outcomes. Youth and teens don’t need perfection, they need support, understanding, and consistent guidance.

All of this is compounded by the pressure placed on those who are trying to lead in this space. Presenters, educators, youth workers, and policymakers are expected to make high-stakes decisions with incomplete information. The cost of getting it wrong, whether it’s a missed warning sign of online exploitation or a punitive policy that harms a teen’s development, can be significant.

In short, youth digital literacy and online safety are the very definition of a “wicked” problem. Navigating it requires humility, creativity, collaboration, and a willingness to adapt over time.

Wicked problems aren’t unsolvable; they’re just unsolvable in traditional ways. They demand a different, asymmetrical mindset. Instead of looking for the “right” answer, we must focus on adaptive, iterative approaches that prioritize collaboration, critical thinking, and resilience. Again, a “principles stay the same, diverse in their application” approach.

In the context of youth and teen digital literacy and internet safety, one of the most important shifts we need to make is moving from a mindset of protection to one of empowerment. While it’s tempting to want to shield young people from every online risk, the reality is that complete protection is neither possible nor practical. Instead, we must focus on equipping youth with the critical thinking skills needed to navigate digital spaces with confidence. This includes the ability to recognize manipulation, resist harmful behaviours like online harassment or misinformation, and recover from inevitable missteps or setbacks. Empowerment places the tools for safety and resilience in the hands of youth themselves.

Digital literacy initiatives should also be dynamic and responsive, supported by consistent feedback loops. Programs and strategies should evolve based on the lived experiences of students, parents, and educators. What worked well last year may not be as effective today due to changes in technology, platforms, or youth culture. And that’s okay. By treating digital literacy as a continuous conversation rather than a one-time lesson, we make space for growth and relevance.

Digital literacy and internet safety should be promoted as a lifelong skill, not just a one-off topic to check off a curriculum list. Much like reading and math, digital literacy must be reinforced year after year, adapted across platforms and devices, and supported by families, schools, and community organizations alike. Whether it’s recognizing clickbait, decoding influencer marketing, or understanding privacy settings, youth need repeated, age-appropriate opportunities to build and deepen their digital competencies.

Just as importantly, we adults need to stop designing digital literacy efforts solely for youth and teens and start creating them with youth. Too often, adult assumptions guide decisions about what kids need online. But teens are the true experts in their own digital lives. Their voices, insights, and experiences must be included in the design and implementation of programs, policies, and tools intended to support them. When we engage youth as co-creators rather than passive recipients, we build more effective and relevant solutions.

Balancing freedom with accountability is another critical element of this work. We must resist the urge to fall into panic-driven policies that restrict access without addressing root issues. At the same time, we need to hold tech platforms accountable for unethical design choices, such as exploitative algorithms or deceptive user interfaces that manipulate behaviour. Creating safer digital spaces means holding powerful actors to account without undermining the autonomy and rights of young users.

It’s essential to acknowledge the trade-offs involved in every onlife decision. Surveillance tools, for example, may seem to increase safety but can also erode a young person’s sense of trust and privacy. Expanding digital education programs can lead to better outcomes but requires sustainable investment and resources. By being transparent about these trade-offs, we foster trust and make more informed, balanced choices that reflect both the risks and the rewards of growing up in a connected world.

In the world of youth and teen digital literacy and internet safety, we’re not searching for a silver bullet; we’re learning to navigate a maze. Wicked problems don’t end, they evolve. But through collective effort, informed dialogue, and humility, we can make things better, even if we can’t make them perfect.

When we treat digital literacy and internet safety as a wicked problem rather than a simple one, we move from reaction to reflection, from blame to collaboration, and from fear to informed action. And that’s the kind of thinking our kids deserve.

Digital literacy and internet safety for youth and teens isn’t a box you check; it’s a journey you take with your child. It’s messy, it’s complicated, and it won’t ever be perfect. But by understanding it as a “wicked problem,” we can let go of unrealistic expectations and focus on what really helps: connection, communication, education, adaptability, and empathy.

As parents, caregivers, and educators, our goal shouldn’t be to raise kids who never make mistakes online. Instead, it should be to raise kids who understand how to reduce risks, manage challenges when they arise, and know what to do when they inevitably face an online dilemma, whether by accident or by choice!

Digital Food For Thought

The White Hatter

Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
