Ofcom has launched an investigation into whether TikTok provided inaccurate information about its parental controls.
Although it was generally complimentary about the steps the firm is taking, Ofcom said it had “reason to believe” that TikTok had given “inaccurate” information about its family pairing system.
The feature allows adults to link their accounts with their children’s to control settings such as screen time limits.
Ofcom will now investigate whether the firm “failed to comply with its duties” by not responding properly to its request for information.
TikTok blamed a technical issue, saying it had made Ofcom aware and would provide the necessary data.
A spokesman said the platform enforced a minimum age requirement of 13, and that Ofcom’s report showed it puts “extensive effort and resources” into “finding and removing” those who do not meet it.
The report comes two years after Ofcom issued guidelines for video-sharing apps on how to protect younger users from encountering harmful content.
Ofcom’s report, published on Thursday, found that TikTok, Snapchat and Twitch all met the requirements it set out two years ago.
None of them allow registrations from children under 13, and all use various methods, including AI and human moderators, to identify and remove accounts that potentially belong to underage users.
For TikTok, this was around 1% of its monthly active UK user base.
All three platforms also rate and label content to ensure it is age appropriate.
But while Snapchat and TikTok require users to have an account to access most content, Twitch is completely open.
As for parental controls, Ofcom said Snapchat and TikTok offer them, whereas Twitch’s terms and conditions require parents to monitor their children in real time.
Ofcom said that while it was satisfied that steps were being taken to protect young users, they “can still sometimes be harmed while using these platforms”.