
Rinse and repeat: Ongoing failure of social media giants to prioritize child safety documented by latest Australian tech regulator report


Written by the Executive Director of the Canadian Centre for Child Protection (C3P)

From the vantage point of a child protection agency, it is impossible to grasp the full scope of safety failures on the social media platforms that our kids and teens use every day.

Many factors make it difficult to unpack, not the least of which is the technology industry’s lack of transparency about their content moderation and safety design practices.

This is why the latest report from Australia’s office of the eSafety Commissioner, which scrutinizes the child protection practices and resource investments of five major technology companies — Google, TikTok, Twitch, Discord, and X (formerly Twitter) — provides crucial information needed to hold these companies accountable. The results reinforce that the self-regulatory model simply does not work and is seriously harming children and survivors.

By applying its legislative powers to request detailed information from these platforms, eSafety has documented many concerning findings.

Some key findings include:

  • Discord and Twitch — both of which rely partly on community content moderation — are not automatically notified when volunteer moderators find child sexual abuse material (CSAM). Self-appointed moderators, among others, can also set up dedicated channels for the exploitation and abuse of children;
  • Companies are not making use of widely available URL block lists covering sites devoted to known CSAM;
  • Platforms are opting to forgo widely available tools that block and prevent users from uploading CSAM on certain services; and,
  • Discord, a platform designed in part for livestreaming, is not deploying harm detection tools on livestreamed content, citing “prohibitively expensive” costs.

What’s more, the eSafety Commissioner has now fined X just over US$380,000 for non-compliance, while Google has been put on notice for providing insufficient information. This continues to underscore the general defiance the technology industry demonstrates when duly elected governments impose basic and reasonable expectations on it.

These are just a few examples of the woefully inadequate patchwork of safety mechanisms and policies captured in eSafety’s second instalment of its world-leading Basic Online Safety Expectations reports. These findings also build upon the information revealed in its inaugural report in 2022.

These findings are sadly consistent with C3P’s experience supporting survivors of online abuse and operating Project Arachnid, an international tool for disrupting the distribution of known CSAM and harmful or abusive material involving children online. Numerous C3P research reports highlight the ongoing failures of technology companies to prioritize child safety.

On behalf of those working on the front line of combatting online exploitation, of parents, and, most importantly, of survivors — we are grateful for the office of the eSafety Commissioner’s ongoing pursuit of accountability. More than ever, we need governments to urgently act and protect their citizens online — particularly those most vulnerable to sexual abuse and exploitation.

-30-

About the Canadian Centre for Child Protection: The Canadian Centre for Child Protection (C3P) is a national charity dedicated to the personal safety of all children. The organization’s goal is to reduce the sexual abuse and exploitation of children through programs, services, and resources for Canadian families, educators, child-serving organizations, law enforcement, and other parties. C3P also operates Cybertip.ca, Canada’s tipline to report child sexual abuse and exploitation on the internet, and Project Arachnid, a web platform designed to detect known images of child sexual abuse material (CSAM) on the clear and dark web and issue removal notices to industry. As of October 12, 2023, Project Arachnid has processed over 161 billion images and issued over 35 million takedown notices to more than 1,400 electronic service providers in over 100 countries.
