
STATEMENT FROM THE PHOENIX 11: META PRIORITIZES PROFIT OVER CHILDREN


For Immediate Release

The Phoenix 11 is devastated by Meta’s decision to move forward with the full implementation of end-to-end encryption (E2EE) on their platforms without the necessary safeguards in place to continue meeting their current child sexual abuse material (CSAM) reporting and removal obligations.

We have been advocating against E2EE without these safeguards for over five years, which means that Meta has had plenty of time to develop them. They have simply chosen not to.

The Phoenix 11 has repeatedly warned of the horrific and unacceptable consequences this decision would have on survivors and current child victims. Child safety organizations, law enforcement agencies, researchers, analysts, and academics around the world have all stepped up over the years to provide more studies, data, reports, and statistics than anyone would ever need to reaffirm what we know. Implementing E2EE without technology in place to address child sexual abuse, exploitation, and imagery on Meta’s platforms will stop professionals from saving children in real time. It will aid in the continued proliferation of child sexual abuse crime scenes, including ours, which will now be shared faster and further than we can begin to imagine.

Who is going to tell these children waiting to be rescued that no one is looking for them? Who is going to tell survivors that they are going to live with being revictimized every single day? We think it should be those in charge at Meta. If Meta’s leadership is going to decide to make this fight harder for us, we invite them to tell us directly. We are looking forward to the conversation.

The executives at Meta have decided that it is easier for them to turn a blind eye to the rampant problem of child sexual abuse and exploitation on their platforms than to continue to make attempts to fight against it. They will no longer be able to report what they cannot see. They believe the problem will be out of sight and therefore out of mind.

The Phoenix 11 wishes to remind Meta and Big Tech that we are not going anywhere. We have no choice because we are stuck on these platforms, frozen in time during the most horrific moments of torture and abuse perpetrated against us. As long as we continue to suffer alongside children and babies, and as long as companies allow these crimes to happen on their technology, we will continue to demand justice for these children, and for us. We will show up to this fight for as long as it takes.

We call on governments and the public to stand with us and act with urgency. Thousands of victims are identified every year in online investigations that begin with these reports. Someone’s daughters, sons, sisters, brothers, and friends will be directly harmed because of this decision. Meta is going to mask crime scenes of child sexual abuse on behalf of online perpetrators. Their platforms are going to be a safe haven for pedophiles.

Stand with us and demand that Meta prioritize children over profit.

Signed,

A group of 11 survivors of child sexual abuse who have banded together to challenge inadequate responses to the prevalence of child sexual abuse images on the internet

-30-

About the Phoenix 11: The Phoenix 11 is a group of survivors whose child sexual abuse was recorded, and in the majority of cases, distributed online. This group has banded together as a powerful force to challenge the inadequate responses to the prevalence of child sexual abuse images online.

About the Canadian Centre for Child Protection: The Canadian Centre for Child Protection (C3P) is a national charity dedicated to the personal safety of all children. The organization’s goal is to reduce the sexual abuse and exploitation of children through programs, services, and resources for Canadian families, educators, child-serving organizations, law enforcement, and other parties. C3P also operates Cybertip.ca, Canada’s national tipline to report child sexual abuse and exploitation on the internet, and Project Arachnid, a web platform designed to detect known images of child sexual abuse material (CSAM) on the clear and dark web and issue removal notices to industry.
