Recently, a Toronto Star article titled “A boy created AI-generated porn with the faces of girls he knew. Why Toronto police said he didn’t break the law” (see attached image) has sparked significant discussion among parents and educators. Many of our followers have reached out to us for clarification and perspective on the incident, where police chose not to proceed with charges against a teen boy who used a deepfake app to create AI-generated nudes of girls from his school. (1) Here’s a breakdown of the case based on the facts presented in the article.
- A group of teen girls approached the police after discovering that a male peer had used a deepfake app to create nude images of them based on photos they had shared on social media.
- Another teen, who had access to the suspect’s phone, uncovered the images while looking for something else and recorded video evidence of them.
- Following a thorough investigation, police found no evidence that the suspect had shared the AI-generated images with anyone. The images appeared to have been created for his private use and remained on his phone.
After consulting with Crown Counsel, police opted not to lay charges. They ensured, however, that all the AI-generated images were deleted. This decision, while troubling to many, was rooted in existing Canadian law that we should all be aware of:
#1/ Deepfake Nudes and the Criminal Code
- Currently, deepfake nudes are not explicitly defined under the Criminal Code of Canada.
- If the AI-generated images depicted individuals under 18, they could legally fall under the definition of child pornography or, more accurately, Child Sexual Abuse Material (CSAM).
- Sharing or distributing such images without consent could result in charges of distributing CSAM and/or the non-consensual distribution of an intimate image.
#2/ The “Private Use Exception”
In the 2001 Supreme Court case R. v. Sharpe, [2001] 1 S.C.R. 45, 2001 SCC 2, the court established a “private use exception” for the possession of CSAM. This exception applies to:
- Self-created expressive material: Content created solely by the accused.
- Private recordings of lawful sexual activity: Visuals created by or depicting the accused, provided the content remains private and lawful.
In this case, since there was no evidence that the teen suspect had shared the images, the situation likely fell under this exception. The Crown therefore appears to have concluded there was no reasonable prospect of conviction, resulting in no charges being laid.
While this legal outcome aligns with existing laws and case law, it understandably feels inadequate to the victims and their families. The psychological harm and violation of privacy experienced by the girls are profound, even if the images were not distributed.
What Could Have Happened If Images Were Shared
To be clear, if the suspect had distributed the AI-generated images, charges under the Criminal Code would likely have included:
- Distribution of Child Pornography (CSAM)
- The non-consensual distribution of an intimate image
- Indecent Communications
- Defamatory Libel
In fact, we are aware of several cases where such charges have been laid, and we are awaiting the outcomes of those cases. Why is this important? Because we have read postings where Canadian parents, and even some social media safety advocates, believe that there are no legal consequences for AI-generated deepfake nudes in Canada. This is simply not the case.
In British Columbia, victims in similar cases could pursue civil action under the Intimate Images Protection Act, which explicitly includes provisions for AI-generated deepfake nudes. Unfortunately, Ontario lacks comparable legislation, leaving victims with fewer legal remedies.
This case highlights the urgent need for updates to the Criminal Code to address the unique challenges posed by AI-generated intimate images – especially when it comes to how such material is legally defined. As technology evolves, so must our legal frameworks to ensure justice and protection for victims of such violations.
For parents, caregivers, and educators, this case underscores the importance of conversations with teens about the ethical and legal implications of their actions online – especially when it comes to the use of these deepfake nude apps. It also serves as a reminder that while current laws may not fully address emerging technologies, advocating for legal reform can help close these gaps and provide better protections for vulnerable individuals.
The emotional and ethical weight of this case cannot be ignored. It’s a call to action for both legal reform and proactive education to ensure such incidents are prevented in the future, or if an incident does occur, there are defined legal consequences to such actions.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References:
1/ https://thewhitehatter.ca/deepnudes-undressing-ai-generated-intimate-image-abuse-material/