
In an era where children spend an increasing amount of time online, ensuring their safety in the onlife world has become an understandable concern for parents, caregivers, and policymakers alike. Recently, Canada introduced a new Online Harms Bill aimed at addressing online harms and protecting vulnerable users, especially children (1). As parents and caregivers, it’s essential to understand the key points of this Bill, and what it means for our children’s online experiences.
First and foremost, it's important to dispel misconceptions about the Bill. Unlike its earlier iteration, which was widely perceived as a censorship attempt, the current version, in our opinion, takes a more balanced approach. Rather than seeking to eliminate harmful content entirely, it focuses on the more realistic goal of minimizing the risk of exposure, acknowledging the complexities of moderating online platforms. The Bill also recognizes that internet and social media companies have a role and responsibility, specific to their own due diligence, to minimize risk to youth, and that this is not just a parenting or policing issue.
The Bill focuses narrowly on seven distinct categories of content that endanger children online, aiming to compel large tech corporations and social media platforms to behave responsibly. These harms are:
- Intimate content communicated without consent
- Content that sexually victimizes or re-victimizes a child
- Content that induces self-harm
- Content used for bullying
- Content that promotes hatred
- Content that incites violence
- Content that incites violent extremism or terrorism
Furthermore, the Bill places legal requirements and responsibilities on large tech companies and social media platforms to address these issues through four main duties, emphasizing accountability and proactive measures to safeguard users:
- Duty to act responsibly
- Duty to protect children (to be determined in regulations that have yet to be made public)
- Duty to make certain content inaccessible
- Duty to provide access to inventories and electronic data
One notable aspect is the protection of private or encrypted communications, such as those on messaging apps, ensuring that the Bill's scope is limited to public-facing content only, thus protecting "private" speech. Additionally, the appointment of an Ombudsman to advocate for victims and users marks a significant step forward in digital safety initiatives, a first of its kind in the world. It will be interesting to see what the process will be for a Canadian citizen to report a violation, and how long it will take to initiate the 24-hour takedown timeframe this Bill directs for Child Sexual Abuse Material and non-consensually distributed intimate images. When it comes to legislation, "the devil will be in the detail."
However, there are valid apprehensions regarding specific elements of the Bill. For instance, the term "humiliation" in the context of bullying lacks clarity and will require further specification. There are also notable omissions: the Bill contains no provisions addressing synthetic deepfakes portraying "real" individuals, and "semi-nude" content is not explicitly mentioned; nor were these recommended as amendments to the Criminal Code of Canada. As Canadian internet and privacy lawyer David Fraser stated in an X posting, "it's a huge missed opportunity to come up with a proper, Charter-compliant definition of synthetic deepfakes that depict actual people, and add it to the Criminal Code." We agree! Also, some of the thresholds to initiate a takedown of a deepfake or intimate image are lower than those we see in the Criminal Code (beyond a reasonable doubt) or even in a civil process (balance of probabilities), which could invite a court challenge. We therefore believe these aspects, particularly the definition of deepfakes, require clarification and more precise definitions integrated into the Bill to enhance its effectiveness.
We also hold the view that establishing measures for checks, balances, and oversight is crucial to curb potential overreach and guard against misuse of the broad regulatory authority granted to the new "Digital Safety Commissioner" proposed in this Bill, which some legal experts see as the formation of a regulatory body akin to the CRTC.
The broad scope of authority bestowed upon this new regulatory body, especially when it comes to search and seizure, coupled with the potential eagerness of its adjudicators in enforcing the legislation, necessitates robust legal and public oversight. This is crucial to prevent the encroachment or mission creep of bureaucratic tendencies that could inadvertently lead to censorship, a use of power that diverges from the Bill’s intended purpose in our opinion.
Enforcing the Bill poses another challenge, particularly for companies based outside of Canada, in Russia for example. Jurisdictional limitations and international legal complexities can make it very difficult to hold such entities accountable.
While acknowledging these challenges, it's important to recognize that Bill C-63 marks a positive stride in bolstering digital safety for children from a regulatory standpoint. However, there's a sentiment that maintaining a sharper focus on child safety, rather than adding changes to the Human Rights Act, would have been preferable, as this inclusion may render the legislation more susceptible to criticism, something we are already seeing. Nonetheless, it remains imperative to stay vigilant and actively engage in the legislative process to address any concerns and refine its provisions. Seeking insights from experts in Canadian tech and privacy law, such as David Fraser, Emily Laidlaw, and Michael Geist, can be instrumental in enhancing the Bill's effectiveness and legal clarity as it progresses through the parliamentary journey to become law.
We do believe that the Online Harms Bill C-63 is a commendable first step, but it's just the beginning of a broader conversation and a much longer legislative process; we believe we are at least a year out, if not longer, before it receives final reading and is passed into law. There is also a chance that, if there is a change in government later this year, the Bill could be halted.
As parents and caregivers, it's crucial to stay informed and actively advocate for our children's online safety. Collaboration with policymakers, legal experts, internet platforms, and fellow parents can enhance the onlife environment for future generations. However, parents need to recognize that while legislation plays a role, it isn't a panacea or silver bullet for mitigating the online risks faced by youth. For instance, Sections 6 & 7 of Bill C-63, which exclude private messaging and do not require proactive content searches, may not fully address issues like sexting and sextortion, which often occur in private or direct messages. We believe that involved parenting remains pivotal in safeguarding our kids in today's onlife world. (2)
A YouTube Live video we did on Bill C-63
Digital Food For Thought
The White Hatter
Reference: