TikTok allowing under-13s to keep accounts, evidence suggests

TikTok faces questions over safeguards for child users after a Guardian investigation found that moderators were being told to allow under-13s to stay on the platform if they claimed their parents were overseeing their accounts.

In one example seen by the Guardian, a user who declared in their account bio that they were 12 – below TikTok’s minimum age of 13 – was allowed to stay on the platform because the profile stated the account was managed by their parents.

The internal communication, sent in the autumn, involved a quality analyst – the person responsible for queries about moderating video queues – who was asked by a moderator whether the user’s account should be banned.

The quality analyst advised that if the account bio said it was managed by parents, moderators could allow the account to stay on the platform. The message was sent to a group chat of more than 70 moderators, who are responsible for reviewing content mostly from Europe, the Middle East and Africa.

It has also been alleged that moderators have been told in meetings that if a parent is visible in the background of a seemingly underage user’s video, or if the bio says the account is managed by a parent, the account can stay on the platform.

Suspected cases of underage account holders are sent to an “underage” queue for further moderation. Moderators have two options: to ban, which would mean the removal of the account, or to approve, allowing the account to stay on the platform.

A staff member at TikTok said they believed it was “incredibly easy to avoid getting banned for being underage. Once a kid learns that this works, they will tell their friends.”

TikTok said it was false to claim that children under 13 were allowed on the platform if they stated in their bio that the account was managed by an adult.

A spokesperson said: “These allegations about TikTok’s policies are wrong or based on misunderstandings, while the Guardian has not given us enough information about their other claims to investigate. Our community guidelines apply equally to all content on TikTok and we do not allow under-13s on our platform.”

TikTok states on its website that it is “deeply committed to ensuring that TikTok is a safe and positive experience for people under the age of 18”. It adds: “This starts by being old enough to use TikTok. You must be 13 years and older to have an account.” TikTok says all users have to pass through a compulsory age gate to sign up for an account and that between April and June this year alone it removed more than 18m suspected underage accounts globally.

TikTok has had run-ins with regulators over its management of under-18s’ accounts. In September the Irish data watchdog fined it €345m (£296m) for breaking EU data law in its handling of children’s accounts, including failing to shield underage users’ content from public view.

In April the UK data regulator fined TikTok £12.7m for allegedly misusing the data of children under the age of 13. The Information Commissioner’s Office said TikTok did not take enough action to ensure children under 13 were not using the app and that it used their data without the consent of parents.

TikTok does not refer to a parental supervision waiver in its community guidelines.

The Guardian has been investigating TikTok amid continuing concern about how it moderates its more than 1 billion users worldwide and has seen internal communications that are likely to raise fresh questions about how the app is policed.

Evidence seen by the Guardian suggests that some potentially underage accounts have received internal tags that would give them preferential treatment.

In one case, a “top creator” tag was attached to an account of a child who appeared to be underage. The child posts videos about getting ready for school and their after-school routines.

The child also indicated in their bio that their account was overseen by parents.

The user had “dm for collabs” in their bio and the hashtag “tiktokdontbanme” on one of their videos. TikTok’s community guidelines state that users must be at least 16 to use direct messages.

There was no clear evidence of parental supervision of the account, and it had fewer than 2,000 followers.

In another case, a child who appeared to be under 13 also had a “top creator” label beside their name. This child had more than 16,000 followers on the platform.

The “top creator” tag appears on accounts that some moderators have been asked to treat more leniently.

TikTok community guidelines state that users must be 13 years and older to have an account. In the US there is a separate under-13s TikTok experience, with additional safety protections and a dedicated privacy policy. TikTok also says in its guidelines: “If we learn someone is below the minimum age on TikTok, we will ban that account.”

TikTok’s approach to age limits is covered in the UK by the children’s code, which is designed to protect children’s data online. The code states that processing the personal data of a child is lawful if they are at least 13 years old. Below that age, parental consent is needed for processing of a child’s data.

The code states that services under its remit must “take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this code to child users”.

The architect of the code, the crossbench peer Beeban Kidron, told the Guardian she was “horrified” to hear that it appeared “possible for an underage child to remain on a service once the service is alerted to the fact that the user is 12”.

She added: “I believe the sector’s design choices and moderation choices are profit-driven. Those choices can put children at risk of harm.”

TikTok is also regulated in the UK by Ofcom, the communications watchdog, under its video-sharing platform rules, which are being folded into the Online Safety Act.

In the newly introduced act, tech platforms are required to set out in their terms of service – to which all users sign up – the measures they use to prevent underage access, and to apply those terms consistently.

Lorna Woods, a professor of internet law at the University of Essex, said: “Under the Online Safety Act, platforms have obligations to enforce their terms of service and they have to do so consistently. Quite apart from the provisions in the act on age verification, if the platform has a rule saying under-13s aren’t allowed on the platform then that should be applied consistently.”

Children in the EU are protected by the bloc’s Digital Services Act, which states that major platforms such as TikTok must put in place measures, such as parental controls or age verification, that protect children from harmful content. The act also implies that platforms should have a high degree of certainty about a user’s age, because it prohibits tech firms from using under-18s’ data in order to show them targeted adverts.

TikTok says it has more than 6,000 moderators in Europe who apply the platform’s community guidelines “equally to all content”.
