At the start of this year, we predicted that regulators would continue to focus their enforcement efforts on Big Tech firms.
Now, the ICO has fined TikTok £12.7 million for failing to prevent children under 13 from using its platform, in breach of its own terms and conditions. Under GDPR, children's personal data merits special protection because children are particularly vulnerable, given their relative lack of power and life experience. The ICO was especially concerned that TikTok may have been tracking and profiling children, which could have put them at risk of seeing harmful content in their feeds.
Chinese-owned TikTok is having a somewhat torrid time at the moment, with the UK government recently requiring it to be removed from all government-issued mobile phones due to security concerns. Laws in China allow the Chinese government to access personal data processed by Chinese organisations where it is relevant to national security. The EU and US have implemented similar bans – sending a message to private citizens that they too should consider whether it is safe to continue using TikTok. There is no evidence that TikTok has handed over data, but we know from the Snowden revelations in the US that government surveillance agencies can find social media usage interesting, hence the concern.
The Italian data protection regulator has also been flexing its muscles recently, requiring Replika to prevent children from accessing the more adult services available via its chatbot (it chose to terminate those capabilities for all users) and banning ChatGPT over its data collection practices. There is a lot of conversation about AI at the moment – including the systems used to power chatbots and recommendation engines. These systems have the potential to shape our future, but they are new and risky technologies. It is good to see regulators taking clear steps to guide the development of these tools and ensure that people are protected.