Ofcom finalises its child safety rules
All online services in scope and accessible in the UK must comply by 25th July
Ofcom, the UK’s telecoms and broadcasting regulator, has finalised its child safety requirements that sites and apps must introduce by 25th July to comply with the Online Safety Act.
“Subject to our Codes completing the Parliamentary process as expected, from 25 July providers will have to apply the safety measures set out in our Codes (or take alternative action to meet their duties) to mitigate the risks that their services pose to children,” the watchdog says in a blog post, setting out 40 measures that sites will need to address.
The headline requirements of the Protection of Children Codes are that any site hosting content including pornography, self-harm or suicide-related material, or information about eating disorders must have age assurance systems in place to prevent children from accessing it.
Sites will also be required to filter harmful content from children’s feeds and prevent recommender systems from promoting such material.
They must show they can respond quickly to concerns, take down harmful content when requested, and give children more control over their online experience. And all services in scope must have a named person accountable for children’s safety.
Fines for non-compliance could reach £18 million or 10% of global revenue, and Ofcom will have the power to prevent offending sites or apps from being available in the UK.
The regulator says it will shortly publish a consultation on additional measures covering child sexual abuse material (CSAM), AI and age assurance.
Ofcom chief executive Melanie Dawes said in a statement: “These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content.”
Technology secretary Peter Kyle said: “The children’s safety codes should be a watershed moment – turning the tide on toxic experiences on these platforms – with the largest social media companies now having to prioritise children’s safety by law.”
The move was welcomed by Lina Ghazal, head of regulatory and public affairs at age assurance company Verifymy: “Ofcom’s new regulations also mean under-18s should no longer encounter pornography or harmful material like suicide content, self-harm or eating disorder content on their phones and laptops. Platforms must provide stronger moderation and have a tighter grip on any algorithms used to recommend content.”
However, Ian Russell, father of Molly Russell who took her own life after viewing self-harm material, said the provisions lack ambition. “Instead of moving fast to fix things, the painful reality is that Ofcom’s measures will fail to prevent more young deaths like my daughter Molly’s,” he said. “Ofcom’s risk-averse approach is a bitter pill for bereaved parents to swallow. Their overly cautious codes put the bottom line of reckless tech companies ahead of tackling preventable harm.”
Hollie Dance, the mother of teenager Archie Battersbee, who died after attempting a prank linked to a social media trend, described the measures as a “softly-softly” approach and “gaslighting”. “Why are we tip-toeing around these huge platforms over children’s safety? The platforms routinely allege they do not allow harmful content,” she told Sky News.
The scope of the measures, which includes “all services, even the smallest,” has worried small site owners who fear being caught in a legislative dragnet aimed at social media companies and had hoped to be exempted. However, in its latest guidance Ofcom says: “In recommending measures, the Act requires us to ensure regulation is proportionate. We recognise that the size, capabilities, and risks of services differ widely and we have taken this into account in our impact assessments.”
Technology lawyer Neil Brown advised small site owners to wait and see: “If you are running a small, low risk online service, within the scope of the UK’s Online Safety Act, do not panic about what is likely to be a cascade of news about Ofcom’s latest code of practice,” he said in a post on Mastodon. “You may need to do something, or you may not, but there’s no reason to panic.”