Online child safety plans: The code of practice

A raft of proposed standards to keep internet companies in check over child safety has been published by the information watchdog.

The code of practice for internet services comprises 16 standards of “age-appropriate design”, which are:

1. Best interests of the child

This should be the primary consideration for developers of online services likely to be accessed by youngsters.

2. Age-appropriate application

Firms need to consider the age range of their audience and the needs of children of different ages.

They will have to apply the code to all users unless they have robust age-verification mechanisms to distinguish adults from children.

3. Transparency

“Small print” that explains privacy information and community standards needs to be concise, prominent and in clear language suited to the age of the child.

4. Detrimental use of data

Firms must not use children’s personal data in ways that have been shown to be detrimental to their well-being or go against industry standards and government advice.


5. Policies and community standards

Companies will have to uphold their own published terms, policies and community standards.

6. Default settings

Settings must be “high privacy” by default – unless firms can demonstrate a “compelling reason” for a different default setting.

7. Data minimisation

Firms should only collect and retain the minimum amount of personal data they need to provide the elements of their service in which children are “actively and knowingly engaged”.

8. Data sharing

Children’s data should not be shared unless firms can demonstrate a compelling reason to do so that takes account of children’s best interests.


9. Geolocation

Technology that links a young user’s physical location with their online data should be disabled by default, unless there is a compelling reason not to. It should also be obvious when geolocation is active.

As a further measure, options which make a child’s location visible to others must default back to off at the end of each session.

10. Parental controls

Children need to be aware of any controls that allow a parent to monitor them.

If an online service allows a parent or carer to monitor a child’s online activity or track their location, it must provide an obvious sign to the child when they are being monitored.


11. Profiling

By default, functions that direct content to users based on their profile should be switched off.

Profiling should only be allowed if measures are in place to protect children – particularly from feeding them content “detrimental to their health or well-being”.

12. Nudge techniques

Services should not use design features that nudge children into engaging with them in a certain way – for example, “likes” on Facebook and Instagram or “streaks” on Snapchat.


13. Connected toys and devices

Such gadgets should comply with the code.

14. Online tools

Prominent and accessible tools should be in place to help children exercise their data protection rights and report concerns.

15. Data protection impact assessments

Firms need to appraise and mitigate risks specifically with children in mind.

16. Governance and accountability

Firms need to be able to show the watchdog that all staff involved in the design and development of online services likely to be accessed by children comply with the code.
