For the longest time there was nothing to be done. The material was on the internet, a lawless black hole. Now there is a tool to combat CSAM, and that tool is Project Arachnid.
Project Arachnid: Online availability of child sexual abuse material
There is an entire chain of electronic service providers, image boards, file hosts, and other entities through which child sexual abuse material (CSAM) is made accessible online. Using data from Project Arachnid, C3P authored a first-of-its-kind report offering a never-before-seen look into the availability of CSAM online and how these companies respond to removal notices issued by Project Arachnid.
The findings show that expecting industry to voluntarily invest in resources to prevent the availability of CSAM has been an ineffective strategy, and they point to a need for governments worldwide to impose meaningful regulation that prioritizes the protection and privacy of children and survivors.
Read the Report
Over three years, Project Arachnid detected and verified more than 5.4 million images.
Project Arachnid has achieved a median content removal time of 24 hours, but 10% of CSAM that was issued a removal notice took over seven weeks to be removed.
Images depicting older teens take significantly longer to remove than images with younger victims. Teens are also more likely to have their images re-uploaded than younger victims.
Project Arachnid detects suspected CSAM far faster than human analysts can review it, creating a backlog of more than 32.8 million suspect images that have yet to be assessed by C3P.
- The vast majority of CSAM detected by Project Arachnid is not physically hosted on the dark web. However, the dark web does act as the main conduit for directing individuals on where to find it on the clear web.
- Close to half of all media detections (48%) are linked to a file-hosting service operated by one French telecommunications company — Free.fr.
- The rate at which Project Arachnid detects suspect media far outpaces the human resources available to assess the content. As of the writing of this report, C3P is facing a backlog of more than 32.8 million suspect media that have yet to be assessed.
Rooted in C3P’s extensive experience in issuing removal notices, the following set of recommendations is intended to assist policy makers in developing effective regulatory frameworks to combat CSAM and harmful-abusive content online:
- Enact and impose a duty of care, along with financial penalties for non-compliance or failure to fulfill a required duty of care.
- Impose certain legal/contractual obligations in the terms of service for electronic service providers and their downstream customers.
- Require automated, proactive content detection for platforms with user-generated content.
- Set standards for content that may not be criminal, but remains severely harmful-abusive to minors.
- Mandate human content moderation standards.
- Set requirements for proof of subject or participant consent and uploader verification.
- Establish platform design standards that reduce risk and promote safety.
- Establish standards for user-reporting mechanisms and content removal obligations.