Tech firms must 'tame' algorithms as part of new child safety code

40 steps set out to ensure child internet use is ‘safer by design’

Social media firms have been warned that they will be told to tame the algorithms that recommend harmful content to children, as one of a raft of measures making up Ofcom's new Children's Safety Codes of Practice.

The draft codes emphasise the importance of age checks, the filtering and downranking of content such as violence, and the complete removal of content relating to suicide, self-harm and pornography for under-18s. The fundamental design and operating choices of platforms must become safer.

Put simply, the social media giants have to re-engineer their algorithms to stop them serving up harmful content to children.
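To make the distinction between filtering and downranking concrete, here is a minimal sketch of how a recommendation pipeline might treat a child's account. The labels, penalty weights and function names are illustrative assumptions for this article, not anything specified in Ofcom's draft codes.

```python
# Illustrative only: a toy ranking pass for an account flagged as under 18.
BLOCKED_LABELS = {"suicide", "self_harm", "pornography"}  # removed entirely
DOWNRANK_PENALTIES = {"violence": 0.2}                    # demoted, not removed

def rank_for_child_account(candidates):
    """candidates: iterable of (item_id, score, labels), where labels is a set."""
    ranked = []
    for item_id, score, labels in candidates:
        if labels & BLOCKED_LABELS:
            continue  # filter: never shown to a child account
        for label, penalty in DOWNRANK_PENALTIES.items():
            if label in labels:
                score *= penalty  # downrank: pushed far down the feed
        ranked.append((item_id, score))
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)

feed = rank_for_child_account([
    ("v1", 0.90, {"sports"}),
    ("v2", 0.80, {"violence"}),
    ("v3", 0.95, {"self_harm"}),
])
print(feed)  # v3 is removed outright; v2 drops below v1
```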

At the end of 2022, the Centre for Countering Digital Hate (CCDH) published research focused on TikTok. It showed that accounts registered as 13-year-olds (TikTok's minimum user age) which paused for a few seconds on videos about body image and mental health (not opening them, simply pausing mid-scroll) were recommended content on eating disorders and suicide within minutes.

In contrast to the government's stated ambition to make the UK the safest place in the world to be a child online, British children are currently less protected than those in mainland Europe. EU users can now switch off algorithmically curated feeds under the EU Digital Services Act; UK users cannot. EU users between the ages of 13 and 17 are also now opted out by default of personalised ads based on their online activities, rather than having to choose to opt out. British teenagers are not similarly protected.

Ofcom's latest code is still in draft and open for feedback until 17 July. Enforcement isn't expected until spring/summer 2025.

The regulator has consulted widely to get to this stage. It has received input from over 15,000 young people about their online experiences, and has invited input from parents, carers, teachers, childcare professionals and, crucially, the technology sector itself.

It is clear that the regulator is trying to sidestep one of the big concerns about the Online Safety Act: the potential impact of child-protection measures on the privacy of adults.

The Act mandates that social media companies assess whether children are likely to use their service. This involves completing a ‘Children's Access Assessment', which Ofcom has published in draft form today. If the outcome of this exercise is that children are likely to access the service, then a second ‘Children's Risk Assessment' must be completed. This has also been published today and, combined with the draft Harms Guidance, provides the material that social media companies need to mitigate the risks their young users are exposed to.
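The two-stage structure can be expressed as a simple decision flow. The sketch below is an assumption-laden illustration of that flow; the fields, criteria and wording are invented for clarity and are not Ofcom's actual assessment criteria.

```python
# Hypothetical sketch of the Act's two-stage assessment flow.
from dataclasses import dataclass, field

@dataclass
class AccessAssessment:
    children_can_access: bool   # e.g. no effective age gate in place
    appeals_to_children: bool   # content or features attractive to under-18s

    def children_likely_to_access(self) -> bool:
        return self.children_can_access and self.appeals_to_children

@dataclass
class RiskAssessment:
    identified_harms: list = field(default_factory=list)
    mitigations: list = field(default_factory=list)

def run_assessments(access: AccessAssessment):
    # Stage one: Children's Access Assessment.
    if not access.children_likely_to_access():
        return None  # stage two is not required
    # Stage two: Children's Risk Assessment (entries here are illustrative).
    risk = RiskAssessment()
    risk.identified_harms.append("algorithmic recommendation of harmful content")
    risk.mitigations.append("filter or downrank flagged content for child accounts")
    return risk

result = run_assessments(AccessAssessment(children_can_access=True,
                                          appeals_to_children=True))
print(result)
```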

Age verification is also going to have to be far more rigorous. Ticking a box to say you're over 13 isn't going to cut it anymore. Ofcom states that age verification "must be technically accurate, robust, reliable, and fair." The regulator provides examples of age assurance methods such as photo-ID matching, facial age estimation, and reusable digital identity services.
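In practice, this means self-declaration is rejected outright while accredited methods gate the experience a user receives. The sketch below is a hypothetical illustration: the method names mirror the examples Ofcom gives, but the interfaces and confidence threshold are assumptions, not anything from the draft code.

```python
# Hypothetical age-assurance gate: self-declaration is rejected outright.
ACCEPTED_METHODS = {"photo_id_match", "facial_age_estimation", "digital_identity"}

def passes_age_check(method: str, estimated_age: int, confidence: float) -> bool:
    """Return True only for an accredited method with a confident adult result."""
    if method == "self_declaration":
        return False  # ticking a box no longer counts
    if method not in ACCEPTED_METHODS:
        raise ValueError(f"unrecognised age assurance method: {method}")
    return estimated_age >= 18 and confidence >= 0.9  # threshold is an assumption

# A minor, or a low-confidence result, routes the user to the
# age-appropriate experience rather than the adult one.
is_adult = passes_age_check("facial_age_estimation", estimated_age=16, confidence=0.95)
print("adult experience" if is_adult else "child-safe experience")
```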

Proposals will be published later this year on the use of ‘tech notices'. This is the mechanism by which platforms will be required to use accredited technologies to identify specific types of illegal content, whether that content is communicated "publicly or privately." This is likely to prove more controversial than the current draft guidance, given the arguments around the use of encryption.

Dame Melanie Dawes, Ofcom's Chief Executive, said:

"In line with new online safety laws, our proposed Codes firmly place the responsibility for keeping children safer on tech firms. They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that's right for their age.

"Our measures – which go way beyond current industry standards – will deliver a step-change in online safety for children in the UK. Once they are in force we won't hesitate to use our full range of enforcement powers to hold platforms to account. That's a promise we make to children and parents today."