
For the longest time there was nothing to be done. It’s on the internet, a lawless black hole of code. Now there is something to combat CSAM, and that solution is Project Arachnid.

— A member of the Phoenix 11

Project Arachnid: Online availability of child sexual abuse material

There is an entire chain of electronic service providers, image boards, file-hosting providers, and other entities through which child sexual abuse material (CSAM) is made accessible online. Using data from Project Arachnid, C3P authored a first-of-its-kind report that offers a never-before-seen look into the availability of CSAM online and how these companies respond to removal notices from Project Arachnid.

The findings show that expecting industry to voluntarily invest in resources to prevent the availability of CSAM has been an ineffective strategy, and they point to a need for governments at a global scale to impose meaningful regulation that prioritizes the protection and privacy of children and survivors.

Read the Report

Key Findings

Over three years, Project Arachnid detected and verified more than 5.4 million images.

Project Arachnid has achieved a median content removal time of 24 hours, but 10% of CSAM for which a removal notice was issued took more than seven weeks to be removed.

Images depicting older teens take significantly longer to remove than images of younger victims. Teens are also more likely than younger victims to have their images re-uploaded.

Project Arachnid detects suspected CSAM far faster than human analysts can review it, creating a backlog of more than 32.8 million suspect images that have yet to be assessed by C3P.

  • The vast majority of CSAM detected by Project Arachnid is not physically hosted on the dark web. However, the dark web does act as the main conduit for directing individuals on where to find it on the clear web.
  • Close to half of all media detections (48%) are linked to a file-hosting service operated by one French telecommunications company — Free.fr.
  • The rate at which Project Arachnid detects suspect media far outpaces the human resources available to assess the content. As of the writing of this report, C3P is facing a backlog of more than 32.8 million suspect media that have yet to be assessed.

Recommendations

Rooted in C3P’s extensive experience in issuing removal notices, the following recommendations are intended to assist policymakers in developing effective regulatory frameworks to combat CSAM and harmful-abusive content online:

  1. Enact and impose a duty of care, along with financial penalties for non-compliance or failure to fulfill a required duty of care.
  2. Impose certain legal/contractual obligations in the terms of service for electronic service providers and their downstream customers.
  3. Require automated, proactive content detection for platforms with user-generated content.
  4. Set standards for content that may not be criminal, but remains severely harmful-abusive to minors.
  5. Mandate human content moderation standards.
  6. Set requirements for proof of subject or participant consent and uploader verification.
  7. Establish platform design standards that reduce risk and promote safety.
  8. Establish standards for user-reporting mechanisms and content removal obligations.
