
New Sextortion Tactic Targeting Teens (August 2024 Update)

August 26, 2024

In July, we posted an important ALERT regarding a new sextortion tactic targeting teens. Unfortunately, this weekend, we encountered yet another case where this same technique was used. This serves as a crucial reminder of why it’s vital for youth to keep their social media accounts private, carefully consider who they allow as friends or followers, and rethink the types of photos they post online.

Because education about sextortion is working, we've seen a growing trend where teens are targeted not because they shared explicit photos, but because offenders, often organized crime groups, are using AI deepfake apps to manipulate innocent images. These apps can take fully clothed photos, usually screen-captured from public social media profiles like Instagram, and alter them to create fake nude images. The offenders then use these manipulated photos to extort money or explicit content from the teen, threatening to share the fake nudes with the teen's parents or publicly if the teen doesn't comply.

In July, we assisted a family in which the teen pointed to the original picture on their social media account to prove they had never sent a nude. We confirmed that the image had been screen-captured and manipulated without their knowledge.

This weekend’s situation once again highlights the importance of teaching youth to keep their social media profiles private and to be cautious about who they connect with online. However, it’s equally important to understand that even these precautions may not fully protect against such incidents.

Digital literacy today must include teaching youth to be mindful of their online self-presentation in order to reduce their vulnerability to AI-generated deepfake nudes. Simple strategies, like altering the angle of selfies and opting for side profiles rather than straight-on shots, can make it harder for AI to manipulate photos. Additionally, consider avoiding posts that feature swimwear or underwear, as these images are particularly susceptible to being altered, which is exactly what happened in the case this weekend.

It’s crucial for parents and caregivers to recognize that the nude images being used in these scams may not have been created by their child. In one case we handled, the parent initially didn’t believe the image was AI-generated because they weren’t aware such technology existed, and they blamed the child for sending such an intimate image. This underscores the need for all of us to stay informed about these emerging threats.

Remember, knowledge, along with the understanding and application of that knowledge, is power. Sharing this information and ensuring our children understand it can make a significant difference in protecting them from these evolving sextortion risks.

Digital Food for Thought,
The White Hatter

