Fine social networks and hold bosses accountable for harmful content – NSPCC
Social networks such as Facebook should face large fines if they fail to remove harmful content that puts children at risk, under proposals outlined by the NSPCC.
The child protection charity wants to see tough duty of care laws imposed on social networks, with fines of up to 20 million euros (£17.5 million), or 4% of the company’s annual global turnover – whichever is higher.
In the case of the biggest social network, Facebook, that would mean a fine of up to £1.7 billion, based on 4% of its latest reported turnover of £43.3 billion.
Named directors should also be held personally liable for upholding a legal duty of care to children, with a breach barring them from holding other directorial roles, the NSPCC sets out in its regulation proposals.
In addition, it wants social networks to be forced to disclose any information a regulator requires, to proactively report when their sites put children at risk, and to risk-assess new products or services.
The proposals come as a new survey commissioned by the charity shows that nine out of 10 parents of children aged 11 to 18 support calls to regulate social networks.
“Today we stand at an absolutely crucial moment in a long battle to protect children from harm, abuse and grooming online,” said Peter Wanless, chief executive of the NSPCC.
“Over the last decade, self-regulation has been tried and found wanting time and time again. Thirteen voluntary codes of conduct have each launched with warm words and good intentions. None have designed essential child protection properly into the online world. Instead, children have continued to face an unacceptable level of risk right across social networks, with devastating consequences.
“In the offline world, from toys to playgrounds, we take child safety regulations for granted. Yet online, the most basic child protection remains an optional extra.”
The charity was joined by Ruth Moss, whose daughter Sophie Parkinson had been looking at harmful content and chatting to older men online before taking her own life in 2014 at the age of 13.
“I stumbled into a whole new area that I never thought as a parent that I’d have to be dealing with,” Ms Moss explained.
“The internet is so ubiquitous these days – in cafes, on buses, on trains – that it becomes very difficult to police children and adolescents as a parent.
“And I’ve often heard people say, ‘But it’s the parent’s responsibility to keep their children safe online’, and yes it absolutely is, parents need to do as much as they can, but my message today is parents cannot do that on their own because the internet is too ubiquitous and it’s too difficult to control, it’s become a giant.
“It has some hugely beneficial things for children but it comes at a cost – and it came at a cost to our family.”
The proposals come amid intense pressure on social networks to take action in safeguarding vulnerable people, also highlighted by the case of Molly Russell, who took her life aged 14.
Her family found material relating to depression and suicide when they looked at her Instagram account after her death.
Last week, Instagram announced it would ban graphic images of self-harm from its platform, following talks with Health Secretary Matt Hancock.