Call to exempt small sites from the Online Safety Act
Several sites have already closed owing to ‘disproportionate’ legislative burden
Small, low-risk sites should be out of scope so long as providers have a reasonable belief that children will not come across harmful material on them, say campaigners.
Last month Ofcom published its latest guidance on illegal content under the Online Safety Act (OSA). Sites must have effective safety measures in place by 17th March 2025 at the latest.
According to the OSA, services “that are neither large nor multi-risk” must nevertheless take measures to respond to “complaints of a relevant kind”: complaints about illegal content, about user content being taken down and about warnings issued to users. In other words, they must have a content moderation policy in place.
User-to-user services need to conduct a thorough risk assessment to identify potential online harms, review their terms of service, provide for reporting and complaints procedures and possibly implement age assurance measures.
That sounds fair enough, until you consider that many sites that host user-generated content are non-commercial, self-funded and run part-time by hobbyists and small groups of enthusiasts.
Tech lawyer Neil Brown of decoded.legal maintains a list of sites that have already closed owing to the burden of compliance. They include a cycling forum, local debating sites, Mastodon instances and a gaming site.
The risk of harmful material popping up on these sites and being viewed by children is not zero, of course, but it’s very low. One of the recently closed sites, LFGSS (London Fixed Gear Single Speed), was a long-running cycling forum that was run at a loss. The site owner complained that compliance was a “disproportionate burden for small sites like ours, and the personal liability is too high.” They spoke of the very real possibility of disgruntled users weaponising the Act by deliberately posting illegal material.
One problem is a lack of clarity with respect to some of the Act’s key definitions. “The OSA is not written in an accessible, intelligible way,” said Brown in a blog post. “Basic concepts, such as ‘user’, and ‘significant number of UK users’, are unclear.”
Also unclear is what constitutes pornographic content.
Digital rights organisation the Open Rights Group says the OSA, as it is currently set out, is a “threat to net plurality”. The group is urging the secretary of state to review the categorisation of sites under the law.
“There are over 450 million WordPress blogs, for example, many of which allow user comments: these face liability and sanctions, under the OSA,” wrote CEO Jim Killock. “Many will ignore the OSA, others will shut the UK out.”
Ofcom is doing its best to provide guidance to small site owners, said Brown, but it’s hampered by stretched resources and the OSA’s unclear terms.
"Small, low-risk, services are unnecessarily caught up in the OSA and their providers face disproportionate obligations and risks,” he told Computing.
“The fix is simple: remove them from the OSA's scope based on the provider's reasonable belief that they are of no or low risk, while empowering Ofcom to intervene when it is necessary and proportionate for it to do so. This doesn't affect small but high-risk services, which remain in scope."
An Ofcom spokesperson told Computing: “We know there's a diverse range of services in scope of the UK’s new online safety laws. The actions sites and apps will need to take will depend on their size and level of risk. We’re providing support to online service providers of all sizes to make it easier for them to understand – and comply with – their responsibilities.”
The spokesperson flagged resources including a Regulation Checker to help firms check whether they’re covered by the OSA, and a digital toolkit to help businesses comply with the new rules.