Facebook admission shows child safety not a top priority, NSPCC says

Facebook’s revelation that it expects reports of child sexual abuse to drop once its controversial encryption plans go ahead has been branded a “remarkable admission” by the NSPCC.

The charity said the acknowledgement from the social media giant demonstrates that children’s safety is not a top priority.

Proposals to extend end-to-end encryption to the firm’s private messaging services, Facebook Messenger and Instagram, have led to fears that paedophiles distributing child sexual exploitation material could find it easier to evade police detection.

Facebook – which already end-to-end encrypts messages via WhatsApp – has repeatedly defended the move as a step to better protect people’s privacy.

Asked by MPs on Wednesday whether reporting of child sexual abuse would decline, Monika Bickert, the company’s vice president of global policy management, said: “I do expect those numbers will go down.”

Yvette Cooper, chairwoman of the Home Affairs Committee, responded: “Why is Facebook trying to introduce something that will put more children at risk, that will make it harder for law enforcement to rescue vulnerable children, why are you doing this?”

Ms Bickert said the company is “mindful of all the different types of abuse online”, describing it as a “pretty complicated area”.

“I don’t think there is a very clear answer on how to keep people safe most of the time,” she told the Committee.

“In the UK, adults who were surveyed have said that the crimes online that are most concerning to them are data, data loss and hacking, that sort of crime.”

She added: “I think as we get more aggressive at just prohibiting people from having access to the service in the first place that should also drive those numbers down.”
Facebook argues that encryption is the industry standard and that, unlike other end-to-end encrypted messaging providers, it has committed to working with law enforcement.

The NSPCC has long campaigned against the social network’s plan.

“These comments are a remarkable admission of where Facebook’s priorities lie, and it’s abundantly clear child safety is not at the top of the list,” said Andy Burrows, head of child safety online policy for the charity.

“Facebook has the opportunity to set the industry standard by moving forwards with end-to-end encryption when children’s safety will not be compromised, rather than pushing ahead at the first opportunity.

“Their seemingly cavalier approach to child safety underlines precisely why we need a legal duty of care on tech firms that repeatedly fail to protect their young users.

“It’s important the Online Safety Bill gives the regulator the power and agility to hold firms accountable if their platforms and design choices put children at risk.”

During the committee session it was revealed that about 250,000 WhatsApp accounts are banned every month for participating in groups sharing child exploitation imagery, based solely on user reports and detection from unencrypted parts of the app, such as the group profile image, name and description.