Tech firms have 12 months to ensure their platforms adhere to new child privacy protection measures.
The Age Appropriate Design Code sets out 15 standards that companies must build into any online service used by children, making the data protection of young people a priority from the design stage.
These services range from apps and connected toys to social media sites and online games, and even educational websites and streaming services.
Organisations that fail to follow the code after the transition period ends on September 2, 2021 could face enforcement action from the data regulator, the Information Commissioner's Office (ICO), including compulsory audits, orders to stop processing and fines of up to 4% of global turnover.
Under the rules, privacy settings must be set to high by default, and nudge techniques should not be used to encourage children to weaken them.
Location settings that allow the world to see where a child is should also be switched off by default.
Data collection and sharing should be minimised, and profiling that can be used to serve children targeted content should also be switched off by default.
“A generation from now we will all be astonished that there was ever a time when there wasn’t specific regulation to protect kids online,” said Elizabeth Denham, Information Commissioner.
“This code makes clear that kids are not like adults online, and their data needs greater protections.
“We want children to be online, learning and playing and experiencing the world, but with the right protections in place.
“We do understand that companies, particularly small businesses, will need support to comply with the code, and that’s why we have taken the decision to give businesses a year to prepare, and why we’re offering help and support.”
Andy Burrows, head of child safety online policy at the NSPCC, said the move will force tech firms to “take online harms seriously” so there can be “no more excuses for putting children at risk”.
“For the first time, high-risk social networks will have a legal duty to assess their sites for sexual abuse risks and no longer serve up harmful self-harm and suicide content to children,” he said.
“The Government must also press ahead with its Online Harms Bill to go hand in hand with the Code.
“This must be enforced by an independent regulator with the teeth it needs to hold platforms to account for safety failings.”