Caveat – The term “Self-Generated Child Sexual Abuse Material” is used in three different contexts in the literature we were able to locate: #1) material created by offenders using technology such as artificial intelligence, #2) teens who sell or trade intimate images of themselves online or offline, and #3) youth who consensually take and share intimate images or videos of themselves within a private, consensual teen relationship, either online or offline. In this article, we will be focusing on the third context.
As the onlife world continues to evolve, so do the challenges related to the sharing of intimate images, particularly among youth and older teenagers. Increasingly, consensually shared intimate images between teens are being labeled as “self-generated child sexual abuse material,” especially by law enforcement or those closely affiliated with them. While this may be done with good intentions, it ultimately does more harm than good in educating teens about sexting and the non-consensual distribution of intimate images, and here’s why.
According to the Supreme Court of Canada (R. v. Sharpe, 2001), intimate images shared between teens in private, consensual, and non-exploitative relationships are NOT illegal in Canada. This legal precedent, often called the “private use exception,” was established to protect young people by ensuring that the consensual sharing of intimate images in appropriate contexts is not classified as child sexual abuse material (CSAM). The law draws a clear distinction between consensual, private exchanges within a relationship and exploitative actions, focusing on protecting youth from harm rather than criminalizing consensual behavior.
Labeling consensual teen intimate images as “self-generated child sexual abuse material” risks creating confusion and misperceptions among youth, leading them to believe their behavior is criminal, despite the law in Canada stating otherwise. This overreach can negatively impact teens involved in consensual, non-exploitative relationships, particularly if the image is later shared without consent.
Attaching a label like “self-generated child sexual abuse material” to consensual image exchanges fosters fear and shame among teens. Those who are victims of non-consensual distribution may fear being labeled as criminals themselves, which can discourage them from seeking help if their images are misused. This fear of stigma or arrest can prevent them from reporting the incident, leaving them vulnerable to further exploitation, blackmail, and emotional distress.
For instance, if a teen shares an intimate image consensually with a partner, and that partner later violates their trust by distributing the image, the teen may hesitate to involve law enforcement. The perception that their consensually shared image is considered “self-generated child sexual abuse material” can cause them to avoid reporting the violation, allowing the individual who shared the image to go unpunished. This is a real concern that we have heard from both teens and parents who have connected with us for help.
Context matters! It’s crucial to differentiate between consensual sharing of intimate images and situations of exploitation. When an intimate image is shared consensually within a trusted relationship, it remains a private act, not child sexual abuse material. However, when trust is broken and the image is shared or exploited without consent, it becomes an issue of exploitation and criminality for the individual who distributed the image—not the sender.
In Canada, when someone exploits an intimate image of a person under 18 through grooming or non-consensual distribution, it is that person who is possessing and distributing child sexual abuse material (CSAM). The teen who initially shared the image did not knowingly create or distribute “self-generated child sexual abuse material”; the criminal act occurs when the image is shared without consent. This distinction is crucial in educating teens about sexting.
EXCEPTION: If a teen under 18 knowingly sells intimate images of themselves for profit—commonly known as sugaring—this would be classified as distribution or trafficking of self-generated child sexual abuse material under Canadian law.
Calling all consensually shared intimate images “self-generated child sexual abuse material” hinders effective sexting education. Treating consensual behavior as criminal creates unnecessary fear and shame, making it harder to have open discussions and deterring teens from seeking help when their trust is violated. Educating teens on consent, trust, and legal protections encourages better understanding and increases the likelihood that they will seek help from parents or law enforcement if their intimate images are shared without their consent.
Parents, educators, and policymakers should focus on fostering open, honest discussions about the risks of sharing intimate images and the legal resources available to those who are harmed. Teens need to understand that if their intimate image is shared without consent, they are not at fault and have every right to seek help. When trust is broken, the person who violated that trust is the one committing the illegal act—not the teen who consensually shared the image.
The goal is not to create fear, but to equip young people with the tools and knowledge to navigate digital relationships safely and responsibly. Referring to consensually shared intimate images as “self-generated child sexual abuse material” is a counterproductive label that promotes fear rather than education, ultimately discouraging teens from seeking help when they need it most.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Feelings, Enlighten Not Frighten, Know Tech Not No Tech
Postscript:
We do agree that the phrase “Self-Generated Child Sexual Abuse Material” fits well when it is used to describe child sexual abuse material created by offenders using technology such as artificial intelligence, or when a teen is selling or trading nudes of themselves with others.