The battle between protecting our children from online child exploitation and safeguarding privacy rights can often feel like an unending tug-of-war. On one side, we have the urgent need to address the creation and spread of child sexual abuse material (CSAM). On the other, privacy advocates emphasize the importance of preserving user rights in an increasingly digital world. However, emerging technologies suggest that it’s possible to achieve both goals—keeping children safe while maintaining privacy.
Recently, an insightful article by John Tanagho titled “Child Protection at Our Fingertips: What Governments and Corporations Can Do” (1) highlighted how technology can effectively combat CSAM without compromising user privacy. This revelation is a game changer for governments, corporations, and parents alike, showcasing a path where safety and privacy don’t have to be mutually exclusive.
Some tech companies, including Apple (2), Instagram, and WhatsApp (owned by Meta) (3), are already leveraging on-device machine learning to "partially" tackle harmful online content (though, we would argue, these measures don't go far enough). These systems work by detecting suspicious material, such as nudity or harmful links, directly on a user's device. Importantly, this process doesn't involve transferring data to the company's servers or breaching user privacy. Features like these are evidence that child safety doesn't have to come at the cost of personal security.
How It Works:
- Scanning Attachments and Messages: These tools scan for harmful content, such as explicit images or suspicious links, before they are sent or received.
- On-Device Operation: All analysis happens locally on the device, meaning the tech company itself never sees the data.
- Compatibility with Encryption: These tools can function even in end-to-end encrypted environments, preserving both security and privacy.
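To make the on-device approach more concrete, here is a minimal, hypothetical sketch in Python. It is not the actual code used by Apple, Instagram, or WhatsApp; the classifier, threshold, and message flow are illustrative assumptions. The point it demonstrates is that the sensitive-content check runs entirely on the device, and neither the image nor the scan result ever leaves it.

```python
# Illustrative sketch only: a hypothetical on-device safety check.
# The classifier, threshold, and message flow are assumptions for clarity,
# not the real implementation used by any specific platform.

from dataclasses import dataclass

NUDITY_THRESHOLD = 0.85  # hypothetical confidence cut-off


@dataclass
class ScanResult:
    flagged: bool
    reason: str


def run_local_classifier(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML model (e.g. a small model bundled with
    the app). It returns a confidence score and never touches the network."""
    # Placeholder logic so the sketch runs; a real model would analyse pixels.
    return 0.1 if len(image_bytes) < 1000 else 0.9


def scan_attachment(image_bytes: bytes) -> ScanResult:
    """Scan an outgoing or incoming attachment entirely on the device."""
    score = run_local_classifier(image_bytes)
    if score >= NUDITY_THRESHOLD:
        return ScanResult(flagged=True, reason="possible explicit image")
    return ScanResult(flagged=False, reason="no issue detected")


def send_message(image_bytes: bytes, encrypt_and_send) -> bool:
    """Check locally first; the plaintext image and the scan result stay on
    the device, so end-to-end encryption is preserved."""
    result = scan_attachment(image_bytes)
    if result.flagged:
        # In a real app this would blur the image and warn the user,
        # letting them choose whether to proceed.
        print(f"Warning shown on device: {result.reason}")
        return False
    encrypt_and_send(image_bytes)
    return True


if __name__ == "__main__":
    send_message(b"\x00" * 5000,
                 encrypt_and_send=lambda data: print("sent (encrypted)"))
```

The design choice worth noticing is that the check happens before encryption and the outcome stays on the handset, which is why this kind of safeguard can coexist with end-to-end encrypted messaging rather than requiring a backdoor into it.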
This type of proactive technology demonstrates that corporations can prevent harm to children while simultaneously safeguarding user privacy. It’s an innovative approach that governments and policymakers should explore to address the escalating issue of CSAM and online exploitation.
Governments can learn from these tech companies by adopting policies and regulations that encourage the use of similar technologies across digital platforms. By doing so, they can create a unified front against child exploitation without eroding citizens’ rights to privacy. For example:
- Promoting On-Device Solutions – Governments can incentivize companies to implement on-device machine learning to detect CSAM and other harmful content.
- Standardizing Technology Use – Policymakers can establish guidelines ensuring that platforms adopt privacy-preserving safety measures.
- Encouraging Transparency – Companies should clearly communicate to users how these tools work to build trust.
The perceived conflict between child protection and privacy rights is not as insurmountable as it once seemed. Advances in on-device machine learning and privacy-preserving technologies have demonstrated that it is possible to safeguard children from online exploitation without compromising user privacy.
Governments and policymakers must encourage, through legislation, the adoption of technologies that strike this critical balance. By promoting on-device solutions, standardizing their use across platforms, and fostering transparency, we can build a unified approach to combating CSAM and online exploitation.
The choice between protecting children and preserving privacy is no longer a binary one. With collaboration between technology companies, governments, and communities, we can achieve a safer digital future that honours both security and individual rights.
Ultimately, this path forward requires more than just technological innovation – it demands collective commitment. Parents, educators, corporations, and lawmakers all have a role to play in shaping a digital landscape where children are safe to explore and learn without fear of exploitation. Open dialogue, shared responsibility, and proactive measures are the keys to fostering this balance.
As we move ahead, we must remember that neither child safety nor privacy should ever be considered optional. Both are fundamental rights that deserve our unwavering attention and effort. By leveraging technology thoughtfully and supporting policies that prioritize both goals, we can overcome the false dichotomy that has long divided these critical issues.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References:
(3) https://en-gb.facebook.com/help/instagram/503437025160040