Government to appoint Ofcom as online safety regulator
The Government is to appoint broadcasting regulator Ofcom as a new internet watchdog, with the ability to fine social media companies that do not protect users from harmful content.
Culture Secretary Baroness Nicky Morgan and Home Secretary Priti Patel said Ofcom’s existing position as a regulator made it suitable to enforce rules to keep the internet safe.
The decision was published as part of an initial response to a consultation on the Government’s Online Harms White Paper, which was released last year and called for a statutory duty of care requiring internet companies to protect users from potentially harmful content.
Those proposals suggested allowing the regulator to issue fines against platforms and websites it judges to have failed to protect users from seeing harmful videos such as those depicting violence or child abuse.
The Government’s response said the regulator would be responsible for ensuring online companies have systems and processes in place to fulfil their duty of care and keep people using their platforms safe.
Baroness Morgan said: “With Ofcom at the helm of a proportionate and strong regulatory regime, we have an incredible opportunity to lead the world in building a thriving digital economy, driven by groundbreaking technology, that is trusted by and protects everyone in the UK.
“We will give the regulator the powers it needs to lead the fight for an internet that remains vibrant and open but with the protections, accountability and transparency people deserve.”
Ofcom’s interim chief executive, Jonathan Oxley, said: “We share the Government’s ambition to keep people safe online and welcome that it is minded to appoint Ofcom as the online harms regulator.
“We will work with the Government to help ensure that regulation provides effective protection for people online and, if appointed, will consider what voluntary steps can be taken in advance of legislation.”
The regulator has also announced the appointment of a new chief executive, senior civil servant Dame Melanie Dawes, as part of its preparation for a new, wider role.
The Government’s response says platforms will need to ensure that illegal content is removed quickly and minimise the risk of it appearing, with particularly strong action needed on terrorist content and online child sexual abuse.
It also says future legislation will protect freedom of expression by not targeting or punishing individuals who access content which is legal, but may be offensive.
Instead, the proposals suggest that “companies will be required to explicitly state what content and behaviour is acceptable on their sites in clear and accessible terms and conditions, and enforce these effectively, consistently and transparently”.
The Government said the proposed legislation will only apply to companies that allow the sharing of user-generated content – such as images, videos and comments.
Internet giants including Facebook, YouTube and Instagram are seen as the main targets of the proposed new rules, which are part of government plans to make the UK the “safest place in the world to be online”.
Responding to the proposals, Facebook’s head of UK public policy, Rebecca Stimson, said: “Keeping people safe online is something we take extremely seriously. We have clear rules about what is and isn’t allowed on our platforms and are investing billions in safety.
“Over the last few years we’ve tripled the size of our safety and security team to 35,000 and built artificial intelligence technology to proactively find and remove harmful content.
“While we recognise we have more to do, our regular transparency reports show we are removing more and more harmful content before anyone sees it and reports it to us.
“We look forward to carrying on the discussion with Government, Parliament and the rest of the industry as this process continues.”