NSPCC tells PM to get on with social media laws as grooming crimes pass 10,000


The Prime Minister has been urged by the NSPCC to get on with regulation of social networks, as the number of online grooming crimes topped 10,000.

Through freedom of information requests, the charity found that 10,119 offences of sexual communication with a child were recorded by police in England and Wales in the two and a half years since a law came into force making it illegal to send sexual messages to children.

Boris Johnson has been asked to commit publicly to delivering, within 18 months, an online harms bill setting out a duty of care that requires tech firms to make their platforms safer for children.

Almost a quarter (23%) of the incidents took place in the six months up to October last year – well above the one-fifth that an even spread across the 30-month period would produce – suggesting the rate of offending is accelerating.

The means of communication was recorded in 5,784 incidents; of these, 3,203 (55%) took place on a Facebook-owned app – the main Facebook social network, Messenger, Instagram or WhatsApp – according to data covering April 2017 to October 2019.

Meanwhile, 1,060 (18%) were recorded for Snapchat.

In the remaining 4,335 offences, the means of communication was not recorded.

The charity warned there could be an even sharper increase this year, with both children and adults spending more time online at home during the coronavirus lockdown.

NSPCC chief executive Peter Wanless held talks with Mr Johnson last week, in which he highlighted how the pandemic had created a perfect storm for abusers and asked that there be no unnecessary delay to legislation.


“Child abuse is an inconvenient truth for tech bosses who have failed to make their sites safe and enabled offenders to use them as a playground in which to groom our kids,” said Mr Wanless.

“Last week the Prime Minister signalled to me his determination to stand up to Silicon Valley and make the UK the world leader in online safety.

“He can do this by committing to an online harms bill that puts a legal duty of care on big tech to proactively identify and manage safety risks.

“Now is the time to get regulation done and create a watchdog with the teeth to hold tech directors criminally accountable if their platforms allow children to come to serious but avoidable harm.”

Susie Hargreaves, chief executive of the Internet Watch Foundation – which is responsible for finding and removing online child sexual abuse material – welcomed the push, saying: “The length of time it is taking is leading to uncertainty for us all, which stalls progress.”

A Facebook spokesman said: “There is no place for grooming or child exploitation on our platforms and we use technology to proactively find and quickly remove it.

“We have a content and security team of over 35,000 people investigating reports from our community and working to keep our platforms safe.

“Our teams also work closely with child protection experts and law enforcement, reporting content directly to specialists such as CEOP (Child Exploitation and Online Protection Command) and NCMEC (National Center for Missing & Exploited Children).”
