The Online Safety Bill is doomed to fail

The draft legislation suffers massively from mission creep, internal contradictions and a dire lack of clear definitions

On the face of it, the Online Safety Bill (OSB) should be in for an easy ride. It enjoys cross-party support in Parliament and is promoted by prominent children's charities, and according to a recent YouGov poll, 81% of UK adults want senior tech managers to be appointed and held legally responsible for stopping children being harmed by social media.

But however well-intentioned, the OSB, which is currently heading to the Lords, will almost certainly fail in its stated objective of protecting children from online harms. Instead, it may usher in measures that will be detrimental to privacy, innovation and freedom.

Since its inception in 2019, the OSB has crossed the desks of four prime ministers and four culture secretaries. Inevitably, over that time numerous hobby horses have found their way into the draft legislation, which suffers massively from mission creep, internal contradictions and a dire lack of clear definitions.

No-one would argue against the bill's stated aims, but policing content on the web is no easy task. Freedom of expression and a right to privacy need to be balanced against possible harms. It's a job for precision engineering, not a sledgehammer.

Definitional deficit

The draft bill is full of vague terms. Amazingly, there is no clear definition of what "material harmful to children" actually is. That, it seems, will be for government ministers to decide, and for a (presumably mightily beefed-up) Ofcom to police once the bill has become law. The lack of oversight is worrying, to say the least.

Ah, but we'll be socking it to Silicon Valley! Companies that fail to live up to their policies to take down illegal content may be fined up to 10% of global revenue, a hefty whack for a Facebook. And this week, at the request of several MPs, a provision was introduced to jail tech company executives found to be "deliberately" exposing children to harmful content, or who "connive" in ignoring regulatory warnings. So far so scary, but executives who "act in good faith" have nothing to fear, according to current Culture Secretary Michelle Donelan. That will be music to the ears of Big Tech legal teams, who should have no problem arguing their way around it.

So who will be caught in the crossfire in the battle between grandstanding MPs and the lawyered-up tech giants? Wikipedia and other collaborative ventures, for a start.

All platforms, large and small, rich and poor, are in scope of the OSB. There are classifications, but the lines between them are not clear. The EU's Digital Services Act differentiates between centralised content moderation carried out by employees and the decentralised moderation model used by Wikipedia; the OSB does not.

The Wikimedia Foundation's vice-president of global advocacy, Rebecca MacKinnon, told the BBC the law will affect not only big corporations with their professional content moderators, but also public-interest websites such as Wikipedia, which is moderated by volunteers.

Small groups or individuals running a Minecraft server might also fear the heavy hand of the law, as could volunteers administering a Mastodon instance, hobbyists running a video-sharing server or a blockchain node, code-sharing sites like GitHub and developers of privacy-enhancing software: the potential list is almost endless. The bill applies to almost any site with UK users, not just Twitter or Facebook.

The OSB does not explain how such actors should protect themselves against users who upload harmful material, nor what compliance is expected to cost. What is likely to be pocket change for the big platforms could be make-or-break for the admin of a small server, even before any legal fees are considered.

Mission creep

"Think of the children" has often, historically, been a useful cover for pushing controversial measures, and the OSB is no exception.

It has certainly reinvigorated the government's long-running battle against encryption. MPs purport to be worried about end-to-end encrypted (E2EE) messaging apps like WhatsApp (unless, of course, they are being used by ministers to conduct government business off the record, in which case they're apparently fine). The OSB contains clauses that would erode end-to-end encryption in the private messaging services that have been drawn into the debate.

People could share illegal material over E2EE apps like Signal or ProtonMail; they undoubtedly do. But then again, they could encrypt the material and send it by Gmail or Dropbox if they wanted to, and undoubtedly do that too. Banning or backdooring something that keeps citizens' information secure is not the answer and will inevitably have negative consequences. There is no such thing as a backdoor only for the good guys, as technologists have been explaining for decades; besides, privacy is a fundamental human right.
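To see why banning particular apps achieves little, consider how trivially encryption can be applied before material ever touches a transport channel. Here is a purely illustrative sketch in Python using the widely available cryptography package (the payload and workflow are hypothetical, not drawn from any real case):

```python
# Illustrative only: strong encryption is freely available to anyone,
# regardless of which messaging apps are banned or backdoored.
from cryptography.fernet import Fernet  # pip install cryptography

# Generate a symmetric key (shared with the recipient out-of-band).
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt an arbitrary payload before sending it over ANY channel:
# Gmail, Dropbox, a forum attachment... the transport is irrelevant.
plaintext = b"any file or message"
ciphertext = cipher.encrypt(plaintext)

# Only the key holder can recover the original content.
assert cipher.decrypt(ciphertext) == plaintext
```

A few lines of free, open source code are all it takes; legislation aimed at specific E2EE services simply moves the encryption step one layer up.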

"In a democratic society, we need to accept and defend privacy, even though there will be some negative externalities, because the alternative, which is no privacy, is worse," said Proton CEO Andy Yen in an interview with Wired. Yen sees the OSB, and similar legislation in Europe, as "a Trojan horse; they really want to [break encryption] for other purposes". It's easy to see why he might think so.

More mission creep pops up in the latest draft. The OSB will now make it illegal for sites to show people crossing the Channel in small boats in a "positive light". What's that got to do with child safety? Nothing. It's yet another hobby horse, this one ridden by Tory MP Natalie Elphicke. Quite aside from the ethics of such a ban, what does a "positive light" mean in this context? Presumably whatever a minister decides it should mean.

Free speech for me, not for thee

With such arbitrary provisions, the government is open to the accusation that it is using the OSB to stifle views it doesn't like: that a measure designed to protect children is instead being used to control the political narrative.

Something like that will likely happen anyway, given the vague way harmful material is defined. Rather than risking jail and heavy fines for getting it wrong, platforms will simply remove borderline material, shunt it down listings where it won't be seen (shadow-banning), or apply automated filters.

"Fear of a jail sentence will lead to over-moderation where content that is lawful is removed. It portends the use of upload filters - where an algorithm sweeps in and censors content before it has been posted," said Monica Horten, policy manager for freedom of expression at Open Rights Group.

Interestingly, the press is out of scope of the OSB, which raises other questions: what about user-generated content defined as harmful appearing in comments below articles, or material posted by newspapers on Twitter or Facebook?

The government removed the previous "legal but harmful" clause after free speech concerns were voiced by MPs who needed to be kept onside. Horse-trading over, it will be interesting to see if the same voices are raised over the current provisions, which might conceivably lead to satire, edgy humour or simply views the government doesn't care for being taken down.

How will the tech giants react?

Let's face it, Mark Zuckerberg isn't going to do time in Wormwood Scrubs; that would require an extradition ruling the US would be extremely unlikely to agree to. Maybe the law will require some underling to reside in the UK (Nick Clegg perhaps?) to take the rap when Meta breaches its own rules. But the company's many lawyers would make mincemeat of the "good faith" clause.

If push came to shove, Meta or Musk might reluctantly decide that the world is a big place and that a small island of 70 million people is not worth spending too much time on. They might implement some catch-all, UK-only age verification and upload-filter software and then, who knows? Perhaps seed the press with articles on VPNs and how to get around the blocks.

That's not likely, but the bill does risk discouraging investment in the UK tech sector. The threat of governance by ministerial whim, attacks on encryption and possible prison time for start-up execs will make other pastures seem all the greener.

Ironically, the OSB could strengthen the tech giants at the expense of the little guys.

Next steps

Anyway, it may never come to that. The bill can still look forward to plenty of back and forth between the Lords and Commons, and unless considerably tightened up, whatever emerges will face numerous legal challenges, leading to backtracking and fudge. But if it passes even somewhat intact, the power it grants to ministers to decide what's harmful, what's positive and who is acting in good faith, or not, is deeply concerning.

Policing global content at scale is a hard problem. It's understandable that people want to see high-handed big tech platforms punished when they do wrong, and of course everyone wants to prevent children from seeing and sharing material that is harmful. It's a shame that politicians, rather than focusing on the real problem, have chosen to co-opt the OSB for other ends, which could make it unworkable. Rip it up and start again.