Blog

We Can Identify Travel-Related Content From Screenshots, So Why Not Child Sexual Abuse Material (CSAM)?

March 31, 2025

Artificial intelligence is increasingly being integrated into consumer technology, often with the goal of improving user experience and optimizing services. However, the same AI-driven capabilities that allow companies like Google to analyze screenshots for travel data could also be used for a far more critical purpose: identifying and preventing the spread of child sexual abuse material (CSAM).

The ability to scan and analyze images stored on a user’s device is not new. AI-powered systems, such as Google’s new Gemini, are already being developed to recognize travel-related content from screenshots. (1) Why not allow this same technology to be adapted to detect known CSAM, helping prevent its circulation and flagging illegal content before it reaches online platforms?

This would represent a significant advancement in the fight against child exploitation. If AI can recognize landmarks in travel photos, it could also be trained to compare images against known CSAM databases maintained by organizations like the National Center for Missing and Exploited Children (NCMEC) – such technology does exist. By leveraging on-device scanning, tech companies could block such content before it is uploaded, shared, or even downloaded.

Apple had once proposed a similar initiative. In 2021, the company announced a controversial CSAM detection system that would have scanned users’ iCloud Photos for known CSAM images using a technology called NeuralHash. (2) The system was designed to compare images to a database of known CSAM without revealing other private user data. However, following widespread privacy backlash, Apple ultimately abandoned the plan in 2023, citing concerns about potential government misuse and the risk of a slippery slope toward mass surveillance. (3)
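
To make this mechanism concrete, below is a minimal sketch of hash-matching against a list of known images. It uses the open-source pHash algorithm (via the Python imagehash library) purely as a stand-in for proprietary systems such as NeuralHash or PhotoDNA, and a hypothetical local file of known hashes in place of a real database maintained by an organization like NCMEC; it illustrates the general technique, not Apple’s actual implementation.

```python
# Illustrative sketch only. The pHash algorithm (via the open-source "imagehash"
# library) stands in for proprietary perceptual hashes such as NeuralHash or
# PhotoDNA, and "known_hashes.txt" is a hypothetical local file standing in for
# a real database of hashes maintained by an organization like NCMEC.
from PIL import Image
import imagehash

# Hypothetical tolerance: how many differing hash bits still count as a match.
HAMMING_THRESHOLD = 4


def load_known_hashes(path: str) -> list[imagehash.ImageHash]:
    """Load previously computed hashes of known images (one hex string per line)."""
    with open(path) as f:
        return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]


def matches_known_database(image_path: str, known_hashes: list[imagehash.ImageHash]) -> bool:
    """Hash the image locally and compare it against the known-hash list.

    Only the result of the comparison leaves this function; the image itself
    is never uploaded or shown to a human at this stage.
    """
    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= HAMMING_THRESHOLD for known in known_hashes)


if __name__ == "__main__":
    known = load_known_hashes("known_hashes.txt")  # hypothetical local hash list
    if matches_known_database("screenshot.png", known):
        print("Match against a known entry; queue for human review.")
    else:
        print("No match; nothing is flagged or reported.")
```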

Despite this, Apple’s ability to deploy such a system remains. The company has the technology but has chosen not to implement it, leaving a gap in proactive CSAM detection that could potentially prevent further exploitation.

The opposition to Apple’s CSAM detection was largely driven by privacy advocates who feared that scanning personal photos, even with safeguards, could set a dangerous precedent. Critics argued that authoritarian governments could pressure companies to expand scanning capabilities beyond CSAM to other forms of content deemed undesirable, such as political dissent or LGBTQ+ materials in restrictive regions – a reasonable concern in our opinion.

While these concerns are valid, they also highlight the broader ethical dilemma: At what point does the need to protect children outweigh the potential risks to privacy?

If Google’s AI can analyze screenshots for travel metadata for the company’s financial benefit, isn’t it reasonable to ask why similar technology isn’t being widely used to combat CSAM on behalf of its survivors? The challenge lies in striking a balance between user privacy and the urgent need to protect children from exploitation.

The fight against CSAM is a moral and legal imperative. While concerns about privacy are valid, they should not outweigh the necessity of preventing the spread and creation of CSAM. Technology companies like Google and Apple possess the capability to scan for this content, and they should be permitted, if not required, to do so, provided that strong privacy safeguards are in place.

CSAM is not just another category of illicit content; it represents the ongoing exploitation of children. Every image and video is evidence of a crime, and allowing such material to circulate perpetuates harm. Tech companies already have a duty to report known CSAM when they encounter it, and proactive scanning helps identify and remove content before further damage is done.

Legal frameworks such as the U.S. requirement to report to the National Center for Missing and Exploited Children (NCMEC) recognize that companies play a crucial role in CSAM detection. By scanning for CSAM, companies are not engaging in unnecessary surveillance but fulfilling an ethical and legal responsibility.

The primary opposition to CSAM scanning revolves around potential privacy violations. However, this concern can be mitigated through well-designed safeguards:

  • Use hash-matching of known images: as mentioned earlier, Apple’s now-paused CSAM detection system proposed using hash-matching technology to identify known CSAM without exposing users’ private content to human reviewers. If implemented correctly, such a system would only flag images that match known CSAM databases, not personal photos.

  • Allow for independent oversight: to prevent misuse, CSAM scanning programs should be subject to third-party audits, ensuring transparency and accountability.

  • Create clear legislative limits: companies should be legally bound to use these tools exclusively for CSAM detection, with severe penalties for misuse or expansion beyond their original intent.

  • Provide an appeal process: if an image is flagged, users should have a way to appeal before any action is taken against their accounts. (A sketch of how these safeguards could fit together follows below.)
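
Below is a hypothetical sketch, using assumed names and numbers, of how these safeguards might fit together in code: every match is written to an audit log that third parties could review, nothing happens to an account until a match threshold is crossed, and flagged accounts enter a human review and appeal queue rather than facing automatic action.

```python
# Hypothetical sketch of the safeguards above. All names, numbers, and policies
# here are illustrative assumptions, not any company's actual implementation.
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative match threshold; Apple's 2021 proposal similarly required
# multiple matches before any human review could occur.
MATCH_THRESHOLD = 30


@dataclass
class Account:
    account_id: str
    match_count: int = 0
    flagged: bool = False


@dataclass
class SafeguardedScanner:
    audit_log: list[str] = field(default_factory=list)     # reviewable by third-party auditors
    review_queue: list[str] = field(default_factory=list)  # human review and appeal, not automatic action

    def record_match(self, account: Account) -> None:
        """Called only when an on-device hash comparison matches the known-CSAM list."""
        account.match_count += 1
        self.audit_log.append(
            f"{datetime.now(timezone.utc).isoformat()} hash match recorded for {account.account_id}"
        )
        # Nothing happens to the account until the threshold is crossed,
        # so a single false positive cannot trigger enforcement on its own.
        if account.match_count >= MATCH_THRESHOLD and not account.flagged:
            account.flagged = True
            # Flagged accounts go to human review with an appeal window,
            # never straight to suspension or a report.
            self.review_queue.append(account.account_id)
```

The point of this design is that detection, oversight, and enforcement are deliberately kept separate, which is what the oversight, legislative, and appeal safeguards listed above are meant to guarantee.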

Failing to act enables perpetrators to continue sharing CSAM without obstruction. Privacy rights are important, but they should not be used as a shield for criminal activity. We already accept certain privacy trade-offs for security reasons, such as airport screenings and financial fraud detection. CSAM scanning is another necessary measure to protect the most vulnerable members of society.

AI has the power to revolutionize digital safety, but its implementation must be carefully considered. While privacy concerns are legitimate, companies should not abandon tools that could save children from harm. The question is no longer whether we have the technology to detect CSAM, but whether we have the will to use it responsibly.

Digital Food For Thought

The White Hatter

Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech

References:

1/ https://www.theverge.com/news/637137/google-maps-screenshot-searchhotels-travel-features#comments 

2/ https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf 

3/ https://ctlj.colorado.edu/wp-content/uploads/2023/09/Rudin-FINAL-08.02.2023.pdf 

