UK watchdog investigates TikTok, Reddit and Imgur over child data privacy
ICO assessing whether platforms comply with data protection laws
The Information Commissioner’s Office (ICO) has launched investigations into TikTok, Reddit and Imgur over concerns about how they handle children’s personal data and whether their systems sufficiently protect young users from harmful content.
The ICO announced the probes on Monday, stating it would assess whether the platforms comply with data protection law when processing personal information belonging to users aged 13 to 17. A key focus is TikTok’s recommendation algorithms and how they use teenagers’ data to serve content, which could expose them to inappropriate or harmful material.
John Edwards, the UK Information Commissioner, highlighted the responsibility of social media companies to protect young users, stating: “My message is simple. If social media and video sharing platforms want to benefit from operating in the UK they must comply with data protection law. The responsibility to keep children safe online lies firmly at the door of the companies offering these services and my office is steadfast in its commitment to hold them to account.”
The ICO will also examine Reddit and Imgur, focusing on their age assurance measures and how they manage data collected from underage users.
While the investigations are ongoing, the ICO has not yet determined whether any laws have been broken. If evidence of non-compliance is found, the affected companies will be given an opportunity to respond before any final ruling is made.
Children’s access to social media
The move comes amid growing concerns over children's access to social media. An ICO survey found that 42% of British parents felt they had little or no control over the data collected about their children by social media and video-sharing platforms. A quarter of respondents said they had stopped their children from using certain platforms due to privacy concerns.
The scrutiny of TikTok follows a £12.7 million fine imposed by the ICO in 2023 for failing to adequately identify and remove children under 13 from its platform, which allowed as many as 1.4 million underage UK users to access the service in 2020 despite its own policies.
In a statement, TikTok said: “Our recommender systems are designed and operate under strict and comprehensive measures that protect the privacy and safety of teens, including industry-leading safety features and robust restrictions on the content allowed in teens’ feeds.”
The UK is not alone in examining the impact of social media on children. Australia passed legislation in November 2024 banning under-16s from social media platforms, while discussions around similar measures in the UK remain ongoing, though such a ban is not currently part of government policy.
TikTok users based in the EU can switch off the recommendation algorithm, and under-17s there are automatically opted out of personalised ads based on their online activity. Child safety campaigners have long argued that children in the UK deserve the same protections, and that the UK has been left with a more toxic version of the platform.
As digital independence among younger users increases, regulators are paying closer attention to how platforms process and use their data. The outcome of the ICO’s investigations could have significant implications for the future of data privacy regulations governing social media platforms operating in the UK.