Caveat – We’d like to thank Chris McKenna for introducing us to the concept of “Wicked Problems.” Although we may differ in our approaches to digital literacy and internet safety, we both deeply care about our kids and hold more similarities than differences. After diving deep into the idea of Wicked Problems, we quickly realized how many of its principles apply to youth and teen digital literacy and internet safety. That inspired us to write this article, so here’s our take. Enjoy!

If you’re a parent or caregiver trying to keep your child or teen safer online, you’ve probably found yourself overwhelmed by the ever-changing world of technology, the internet, social media, apps, privacy settings, and online risks. Just when you think you’ve got a handle on things, a new trend, platform, tech product, or threat pops up. It can feel like playing whack-a-mole, and at times it is.

In the evolving landscape of digital technology, keeping youth and teens safer online has become a significant concern for parents, educators, presenters, policymakers, and society at large. However, many efforts to address online safety and digital literacy fall short, not because of a lack of concern or commitment, but because we are facing what is known in systems thinking as a “wicked problem.”

The term “wicked problem” was introduced by design theorists Horst Rittel and Melvin Webber in 1973. A wicked problem isn’t necessarily evil, although some would like you to think it is; it’s wicked in the sense that it is complex, interconnected, and notoriously difficult to define, let alone solve. Wicked problems do not have a single, clear solution. They involve incomplete or contradictory information and ever-changing variables, and they resist simple cause-and-effect thinking.

What Makes a Problem “Wicked”?

In today’s onlife world (a term that recognizes the seamless integration of the online and offline lives of youth and teens), our kids are constantly navigating shifting digital landscapes.
What worked for your youth or teen last year may no longer apply today. The risks aren’t just about strangers and explicit content; they also involve mental health, social pressure, misinformation, habituation design, data exploitation, and privacy challenges. Today, with AI-generated deepfake content becoming more realistic and accessible, the urgency to educate and empower youth, rather than simply restrict them, becomes even more apparent.

When we apply the wicked problem framework to youth and teen digital literacy and online safety, we begin to see how simplistic solutions, such as banning devices, delaying access, or enforcing surveillance-heavy controls, miss the mark. They fail to account for the dynamic, diverse, and deeply personal nature of how youth use technology. Here’s what we believe to be some of the characteristics of a wicked problem specific to digital literacy and internet safety when it comes to youth and teens.

No Clear Definition

What does it mean to be “safer” online? For some, it means protection from strangers and explicit content. For others, it includes mental health protection, data privacy, digital reputation management, freedom from habituating algorithmic patterns, or all of the above. The goals vary widely depending on cultural values, faith, age groups, and technology use, making the problem nearly impossible to define in one sentence.

To understand how the concept of “wicked problems” applies to youth and teen online safety and digital literacy, it’s crucial to recognize that these challenges are complex, ever-changing, and deeply interconnected with social, psychological, emotional, and technological dynamics. A wicked problem in this context is not just difficult; it is nearly impossible to solve in the definitive, one-size-fits-all way that “some” are promoting, such as banning technology or preventing access to the internet until a certain age.
Online safety and digital literacy among young people cannot be neatly defined because the issues involved, such as cyberbullying, sextortion, misinformation, screen use, and algorithmic influence, are constantly evolving. As new platforms emerge and digital behaviours shift, so too does the nature of the risks and how we need to respond. The problem itself doesn’t have a fixed shape; it’s fluid and entangled with broader societal issues like mental health, education, parenting norms, and the lack of tech industry ethics.

This is why we reject “one-size-fits-all” approaches. Digital safety must be framed within the unique context of each youth’s life. For one teen, gaming may offer critical social connection. For another, it may become a source of toxic stress. The same tool can provide value or harm depending on how it’s used, emphasizing why the concept of screen value, not screen time, is a more useful measure for parents and educators.

No Finish Line Rule

Even if you implement all the latest parental controls, moderation tools, and educational programs, the nature of the internet, and of youth behaviour, will change tomorrow. New apps emerge. New threats develop. New behaviours trend. There is no finish line to cross, no moment when we can say “problem solved.”

There’s also no clear endpoint to this work. We can’t declare the problem “solved” because any attempt to address one part of the issue, such as age gating, improving privacy settings, or teaching youth critical thinking, immediately alters the landscape. Solutions often create new questions. For instance, restricting screen use might reduce exposure to harmful content but could also unintentionally limit access to beneficial online communities or learning opportunities. This underscores the need for adaptive digital parenting. The job is never done, but that doesn’t mean we’re failing. We’re simply navigating a moving target.
It’s not about arriving at “complete safety,” but about continually helping youth build resilience, critical thinking, and agency.

Solutions Are Judged, Not Proven

Blocking access to technology or certain apps may reduce exposure to risks, but it may also infringe on a teen’s autonomy or harm their ability to socialize. Teaching digital literacy in schools is widely praised but hard to measure in terms of long-term effectiveness. There’s no universally accepted way to evaluate success, only stakeholder judgments of what’s “good enough.” Solutions in this space are not black and white. What works for...