How Ofcom is transforming to regulate the online space

Penny Horwood
Image: Chloe Colliver, Ofcom
Chloe Colliver, Head of Industry Developments at Ofcom, shares some of the more technology-focused developments at the regulator

In her keynote address to the Computing IT Leaders Summit, Chloe Colliver, Head of Industry Developments at Ofcom, set out how technology is central to its role in regulating the digital realm, and why the industry will be crucial in determining outcomes.

Colliver set out how the Online Safety Bill, which will become law as soon as it receives royal assent, will widen Ofcom's role and give the regulator more power. Ofcom will become responsible for enforcing the bill's requirements, assessing the risk of harm on online platforms, and working with companies to put proportionate systems and processes in place.

Types of harm 

There will be two main categories of harm: illegal content, and content that is likely to be harmful to children.

"The government has given us robust powers to try to enforce this regime," said Colliver. "This includes information gathering powers, and we'll also be able to impose a range of sanctions on companies that don't comply with their responsibilities."

Ofcom will publish its first consultation on its new approach to regulating online safety as soon as the bill receives royal assent, and is keen for stakeholders and experts in this space to provide feedback on its proposed approach to platform regulation.

Colliver showcased some of Ofcom's new data and analytical tools by way of sharing some rather sobering statistics.

68% of internet users aged between 13 and 17 had encountered at least one online harm in the last four weeks

Harms are classified into three groups. Content harms include the images, videos and other material that can be encountered when browsing, served up by algorithms or shared by others; 51% had encountered these. Contact harms had been experienced by 57%. These are more targeted harms, such as trolling, abuse, bullying or unwanted sexual messages. Less frequently encountered were commercial harms - content or behaviour that puts the user at risk of financial harm, a category which includes the commercialised collection of personal data. These had been experienced by 28%.

These are interesting top line findings, but Colliver explained that Ofcom needed to explore further, particularly when it comes to finding out what children are up to online. Technology - particularly automation - will be crucial. 

"There are significant technical and ethical challenges understanding what harms children might encounter on the internet. We've had to innovate quickly to build our evidence base in this area. Some of these new methods are proving particularly valuable. One example is using sock puppet accounts to try to mirror a child's experience on the platforms. We've been piloting that and will potentially scale up using a semi-automated approach."  

One of Ofcom's biggest challenges has been learning what internet platforms and services people are using.

"Finding robust ways to measure user activity within a specific geographic location like the UK - at scale - without using a lot of new technology is no mean feat," said Colliver.

The regulator has worked with Ipsos to find out how much time UK users spend on the big-name services, because these clearly have a dominant reach and a corresponding responsibility toward users. An interesting finding, however, was that users spent almost two hours daily on the "long tail" of small platforms. Responsibility for protecting children doesn't end with the likes of Meta and Alphabet. This will inform best practice and guidance over the years to come.

The need for tech talent 

Because of the need to measure outcomes over the long term, Ofcom has built a behavioural insights team.

"These newer members of the Ofcom team are busy designing experiments to test how different design features in platforms can affect whether users meaningfully use safety measures on platforms," said Colliver.

An example of the team's work can be found in a report detailing the effectiveness of different interfaces and options for reporting harmful content to the platform on which users encounter it.

In addition to the expertise Ofcom has had to recruit in areas such as law and social policy more generally, it is also recruiting serious numbers of data engineers, data scientists and analysts, ML and AI specialists, statisticians and product managers.

"It's our Data Innovation Hub that provides the real core capability and leadership across Ofcom for this kind of work," said Colliver. "My team uses a lot of that capability and data to generate insights and inform quality, evidence-based decisions. The hub's job is to ensure the correct tools, skills and capabilities are in place for people to realise those insights from the vast amounts of data that will be available to us."

The next step for Ofcom in terms of its duties is a focus on illegal harms, a consultation for which is due shortly. The second phase will be a child safety consultation, and the third a greater focus on transparency and user empowerment more generally.
