Online Safety Bill should exclude encryption scanning, says policy paper

The UK Government has never trusted end-to-end encryption

The cost would outweigh any benefits, researchers argue

A new policy paper, presented at the Conservative party conference, advises against a provision in the Online Safety Bill that would require tech firms to monitor encrypted messages.

The UK's Online Safety Bill is currently making sluggish progress through Parliament. It has become a focus of global debate, with commentators arguing about whether organisations like Google, Twitter, and Meta should be required to proactively search for and delete harmful content on their platforms.

Tech firms say it is not technically possible to scan apps using encrypted messaging - like WhatsApp or Facebook Messenger - for illegal content without jeopardising network security as a whole. They contend that encryption technology keeps platform operators and others from accessing user communications.

However, the UK Government insists on giving telecoms regulator Ofcom the power to require tech firms to do just that.

The policy paper [pdf] - by Ross Anderson, Professor of Security Engineering at the Universities of Cambridge and Edinburgh, and Sam Gilbert, Bennett Institute Research Affiliate - suggests that the so-called 'last resort' powers in the draft bill should be dropped.

Anderson and Gilbert say it would be "unworkable" to scan communications for harmful content using artificial intelligence (AI) technology, since it would generate too many false positives.

Around 10 billion text messages are sent each day in Europe alone. That could generate as many as 1 billion false alarms, the researchers write.

"Europe's 1.6 million police officers would [each] have to scan 625 of them every day. Such a system would be simply unworkable."
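The researchers' figures are straightforward to check; note that the 10% false-positive rate below is implied by their numbers rather than stated in this article:

```python
# Arithmetic behind the Anderson/Gilbert estimate, as reported above.
messages_per_day = 10_000_000_000  # ~10 billion texts sent daily in Europe
false_alarms = 1_000_000_000       # researchers' upper bound on false positives
police_officers = 1_600_000        # Europe's police officers

implied_fp_rate = false_alarms / messages_per_day     # implied error rate: 0.1 (10%)
alarms_per_officer = false_alarms / police_officers   # daily workload per officer: 625.0

print(f"Implied false-positive rate: {implied_fp_rate:.0%}")
print(f"Alarms per officer per day: {alarms_per_officer:.0f}")
```

Even a classifier wrong only one time in ten would, at this message volume, bury every police officer in Europe under hundreds of alerts a day, which is the basis of the paper's "unworkable" conclusion.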

The paper argues that client-side scanning technology, as proposed in the Online Safety Bill, is technically inefficient and impractical as a way of minimising violent online extremism and child sexual abuse material, and that it undermines fundamental freedoms.

They say that while the Online Safety Bill is right to require a duty of care from technology and social media firms, the cost of some of its suggested remedies would outweigh any benefits.

Will Cathcart, Meta's Head of WhatsApp, warned UK ministers last month that attempts in the Online Safety Bill to weaken encryption would endanger the security of the Government's own communications and encourage authoritarian regimes to adopt such measures.

He told the Financial Times that there are other ways to protect minors using WhatsApp, without having to give up the fundamental security technology that keeps its more than 2 billion users secure.

Prime Minister Liz Truss said last month that her new administration would press ahead with the law, with some changes to focus on child protection.

The new policy paper by Anderson and Gilbert also suggests including gaming service providers in the Bill's purview.

"Gaming platforms expose children to the same risks of abuse as social media, as well as to financial harms," they say.

The Government has a long-standing distrust of end-to-end encryption, which prevents anyone except the sender and recipient from reading messages. Popular platforms like WhatsApp, Telegram and Facebook Messenger all use the technology.

Earlier this year, the Government hired M&C Saatchi to rally citizens against end-to-end encryption - a move that the Information Commissioner's Office (ICO), which supports encryption, criticised just a few days later.