TikTok hit with £12.7m fine for unlawful use of children’s data

ICO investigation finds that video platform failed to prevent more than one million underage users signing up

TikTok has been hit with a multimillion-pound fine for unlawful use of children’s data in the UK.

The Information Commissioner’s Office announced today that it has imposed a £12.7m penalty on the video-sharing platform.

The fine comes just days after ministers in both Westminster and Edinburgh imposed a ban on the use of the app on government devices and networks. The measures were taken following consultation with the National Cyber Security Centre.

The ICO’s punishment follows an investigation by the regulator, which found that, as of 2020, up to 1.4 million UK children under 13 were using TikTok.

Processing the personal information of these children without the consent of their parents or guardians was in breach of UK data-protection laws, the ICO said, and also contravened the social-media firm’s own rules.

TikTok did not take sufficient steps to check who was using its services and ensure that underage users were found and removed, according to the regulator – including a failure to “respond adequately” after concerns about children using the platform were raised with senior employees. The ICO’s investigation also found the video app did not provide enough information on how users’ personal data was being used.

Information commissioner John Edwards said: “There are laws in place to make sure our children are as safe in the digital world as they are in the physical world. TikTok did not abide by those laws. As a consequence, an estimated one million under-13s were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means that their data may have been used to track them and profile them, potentially delivering harmful, inappropriate content at their very next scroll.”

The fine imposed by the regulator is less than half the £27m penalty it originally intended, announced last year. It was reduced following “representations from TikTok”, after which the regulator chose not to include in its considerations potential legal breaches related to special-category data.

Since wrapping up its investigation into the video platform, the regulator has published a Children’s Code – a set of 15 statutory guidelines for providers of online services that are likely to be used by children.

Edwards added: “TikTok should have known better. TikTok should have done better. Our £12.7m fine reflects the serious impact their failures may have had. They did not do enough to check who was using their platform or take sufficient action to remove the underage children that were using their platform.”

A spokesperson for TikTok said: “TikTok is a platform for users aged 13 and over. We invest heavily to help keep under-13s off the platform and our 40,000 strong safety team works around the clock to help keep the platform safe for our community. While we disagree with the ICO’s decision, which relates to May 2018 – July 2020, we are pleased that the fine announced today has been reduced to under half the amount proposed last year. We will continue to review the decision and are considering next steps.”

Sam Trendall

