Preparing for the tricky task of regulating online safety

Interview with Chloe Colliver, Head of Industry Developments - Online Safety, Ofcom

Penny Horwood

Chloe Colliver, Head of Industry Developments - Online Safety at Ofcom, will be speaking at the IT Leaders Summit in October about overcoming the challenges of regulating online platforms. Ofcom will play a critical role in this regulation when the Online Safety Bill becomes law, and this keynote will deliver important insights into policymaking.

In the meantime, Computing asked Chloe for some background on her own journey into this fascinating role, and some of the hurdles that her team and Ofcom face in the ongoing task of trying to make the online world safer.

Computing: Can you share some background about your role at Ofcom?

I joined Ofcom in January 2022 to help set up the teams and the capabilities that Ofcom needs for its new Online Safety duties. My focus is on making sure that we have the best possible evidence base for our regulation of the online industry. The scale and variety of evidence we will need is huge: the online sector is diverse, quick-moving and complex. My job is to help pull together the people, skills, tools and approaches that Ofcom will use to understand this industry and the individual companies within it.

I've spent the last year and a half recruiting people from a wide range of research, technical and industry backgrounds. As we've grown, we've been conducting more and more research to understand the risks people face online, as well as the systems and processes that tech companies have in place to keep them safe. We've been mapping the industry that we're set to regulate: its size, the key business models that drive its growth and the evolving products and apps on offer. There's a lot to do, and we haven't even started regulating yet!

What brought you into a tech policy focused role?

Since my days in academia, I've always been interested in how new technologies can be exploited to cause harm. Before I arrived at Ofcom, I led a team of 30 analysts around the world studying terrorism and disinformation threats online. When we started that work in 2016, social media was still poorly researched and therefore little understood. But extremist groups and hostile foreign states were already weaponising it to persuade, radicalise and harass at scale.

At the charity I worked at, we had to quickly find new ways to collect and analyse data so we could monitor these types of threat online. That meant building new partnerships with computer scientists to give us the tools to access and analyse social media data at massive scale, while ensuring the accuracy and nuance needed for such sensitive topics. We were able to push the boundaries of natural language processing and machine learning technologies to help us map these complicated online communities: it revealed a new kind of use case for these technological innovations.

How has Ofcom found recruiting the tech and data skills it requires?  

One of the key challenges for us has been recruiting for roles that have never existed before. Many of the new roles combined a number of different skills and experiences, so we had to work closely with the hiring community to work out what the key skills for each position were and what could be developed once a person was in role. We then worked with hiring managers and our recruitment team to understand where these skills could come from: which other roles in the market were comparable in terms of transferable skills, and how could we tap into them? We started out doing this on a role-by-role basis, which (after some successful hiring) we were able to extend into a team-by-team approach. This meant we could develop a framework for targeting particular areas of the talent market that could provide the transferable skills we needed.

We were also acutely aware that we couldn't compete with the salaries, bonuses and long-term incentives of some parts of the tech market, so the messaging of our mission, values and other employer-brand pillars was critical to gaining initial interest from the sectors of the talent market we were actively targeting. We also saw a lot of internal movement at Ofcom in the first year of setting up the new Online Safety teams, with 65% of roles being filled internally. The internal mobility generated as a result of our new duties has been fantastic for both our colleagues and us as an organisation.

What's your biggest challenge in your role?

The biggest challenge for me and my team at the moment is prioritisation: the online world is so varied and vast, as is the range of harms and risks that we need to be concerned with. My job as a team leader at Ofcom is in no small part to help set clear priorities that will help a new and growing team to know how their work contributes to our overall strategy for impact.

Anyone who has tried to read the Online Safety Bill will know it's not exactly a short and simple legal document. As I've brought new technical and subject-matter experts into my team, I've had to give them a clear focus for their skills and work while also allowing them to soak in the breadth of activity going on across Online Safety at Ofcom. That means ensuring the right connections are in place between our more technical colleagues and the policy experts focusing on specific harms or platforms. I see a huge part of my role right now as a translator and connector between people and teams.

What do you consider to be Ofcom's biggest tech policy related challenge?

That's a difficult question: we haven't even started regulating the online sector yet and we already face a number of significant challenges in understanding the huge, complex online industry, its users and the safety measures that are available.

Because we've already touched on the challenges of fast and effective recruitment above, I'll pull out another key area that we think about a lot: children. A lot of the Online Safety Bill focuses on protecting children online from both illegal and potentially harmful content. The legal, ethical and technical challenges of understanding what children do online, what services they use and what risks they face are pretty significant.

Beyond that, testing what kinds of interventions and safety measures might best protect children is fraught with its own challenges. We've done a lot of work already to overcome these obstacles, but there will always be challenges in doing this kind of work. Ofcom holds itself to the highest standards possible in terms of its evidence base, and that means we're having to come up with novel approaches to collecting and analysing data for sensitive topics like children's online experiences. Thankfully the brilliant data, research, legal and policy teams that we have on board are working away on these complicated issues and already making great progress.

Chloe Colliver will be delivering her keynote, "Turning Tech into Policy: Tackling the new challenges of regulating online platforms", on 5th October at the IT Leaders Summit 2023. The ITLS is a two-day event bringing together the most senior and influential voices from IT leaders in the UK at Down Hall in Essex. Register today.


