TikTok Should’ve Known Better: $16M Fine for Misusing Kids’ Data 

TikTok breached UK data protection law by allowing users under the age of 13 to access the platform. According to the Information Commissioner’s Office, an estimated 1.4 million UK children under 13 were using the platform between May 2018 and July 2020.
This violation of children’s privacy online should serve as an example to other countries to take the initiative in regulating the space for the better.

Reading Time: 3 minutes


Illustration: Lenka Tomašević

Why was TikTok fined?   

In line with national law on children’s privacy online, the Information Commissioner’s Office (ICO), the UK’s data protection regulator, has issued a £12.7 million fine (around $15.7M) to TikTok Information Technologies UK Limited and TikTok Inc (TikTok) for a number of breaches of data protection law, including failing to use children’s personal data lawfully.

According to the ICO’s official statement, the video-sharing platform didn’t do enough to protect children’s privacy, and the regulator believes it “should’ve known better”.


There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws.
As a consequence, an estimated one million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.
TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.

John Edwards, UK Information Commissioner 

In the UK, children’s online privacy is regulated under the UK General Data Protection Regulation (UK GDPR). The ICO found that TikTok breached this law by:

  1. Providing its services to UK children under the age of 13 and processing their personal data without consent or authorization from their parents or carers; 
  2. Failing to provide proper information to people using the platform about how their data is collected, used, and shared in a way that is easy to understand. Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it; and 
  3. Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner. 

In other words, organizations must obtain parental consent before collecting and using the information of children under the age of 13. So even though TikTok was aware that children under the age limit were using the app, as some of its senior employees had warned, the company didn’t do enough to identify this group of users and remove them from the platform.

At the beginning of April 2023, the fine was set at £12.7 million, less than half of the £27 million fine the ICO had originally planned.

Is TikTok recommending harmful content to minors?   

The platform has also been accused of recommending harmful content every 39 seconds, according to a report published by the Center for Countering Digital Hate (CCDH).


The new study had researchers set up TikTok accounts posing as 13-year-old users interested in content about body image and mental health. It found that within as few as 2.6 minutes after joining the app, TikTok's algorithm recommended suicidal content. The report showed that eating disorder content was recommended within as few as 8 minutes.

CCDH Report 

Not only that, but the research found as many as 56 TikTok hashtags hosting eating disorder videos with over 13.2 billion views. 

“TikTok is able to recognize user vulnerability and seeks to exploit it. It’s part of what makes TikTok’s algorithms so insidious; the app is constantly testing the psychology of our children and adapting to keep them online,” said Imran Ahmed, CEO of the CCDH.

Lawmakers worldwide are taking action, according to The New York Times:

  • Utah passed a law to prohibit social media platforms like TikTok and Instagram from allowing minors in the state to have accounts without parental consent. 
  • California passed a law that would require social media, video games and other apps to turn on the highest privacy settings — and turn off potentially risky features like friend-finders — by default for minors. 

In addition to that, as ABC News reports, a bipartisan group of senators recently introduced legislation aiming to prohibit all children under the age of 13 from using social media. 

And finally, following the recent school shooting in Serbia, the Health Minister proposed on national television a nationwide ban on social media for at least a month.

What is the Children’s Code?   

Following this huge violation by TikTok, the regulator has published the Children’s Code, a much-needed standard aimed at all apps, websites, and other platforms that children access.

This statutory code of practice sets out 15 standards that should be followed to ensure the safest possible online experience for children.

Namely, the official statement notes that 1 in 5 internet users in the UK are children, yet they are using an internet that was not designed for them.

This is the first code of its kind but definitely not the last one, considering that the US and Europe are also heading in this direction. 

A journalist by day and a podcaster by night. She's not writing to impress but to be understood.
