Meta Trusted Partner Program failing to fulfil core remit

Media non-profit Internews claims Meta is not acting on reports of dangerous content posted on Facebook and Instagram, including death threats and incitements to violence.

Media non-profit organisation Internews has published a report containing serious criticisms of Meta's Trusted Partner program. The program comprises 465 civil society and human rights groups around the world, and is supposed to give them a priority channel for alerting Meta to harmful and dangerous content posted on Facebook and Instagram, so that it can be identified and removed as quickly as possible.

But the report claims that some organisations face long delays when reporting this content - the same delays that regular users of the platforms experience when making similar reports. Response times are slow and inconsistent, and in some cases Meta has failed to react at all, even to the most dangerous and time-sensitive content, such as calls for and threats of imminent harm.

The report is based on dozens of interviews with some of Meta's closest partners. It found that many of the most severe operational failures of the Trusted Partner program appear to relate directly to a lack of resourcing, and that recent swingeing job cuts at Meta are likely to be exacerbating the problem.

The report identified a significant disparity of service between Ukraine and many other countries experiencing armed conflict, displacement and disinformation. Trusted Partners in Ukraine can expect a response to reports within 72 hours, while reports relating to the Tigray war in Ethiopia have been ignored for months.

Another disturbing finding is that many Trusted Partners choose to supplement or bypass the official Trusted Partner channel, either by communicating directly with personal contacts at Meta or by copying those contacts into official reports to ensure they are read. Partners who can leverage such contacts receive better responses, which indicates that the program is not functioning as it should.

The review was originally set up as a collaboration with Meta, but the company withdrew in 2022, claiming that "the reporting issues of the small sample of Trusted Partners who contributed to the report do not, in our view, represent a full or accurate picture of the program."

However, Meta declined requests from Internews to provide information on average response times or internal targets.

Rafiq Copeland, Platform Accountability Advisor at Internews and author of the report, said:

"Trusted flagger programs are vital to user safety, but Meta's partners are deeply frustrated with how the program has been run. As Meta launches Threads to be a ‘public square for communities,' can we trust the company to operate it responsibly? This research suggests more investment is needed to ensure Meta's platforms are safe for users."

This latest report builds on concerns about the program that have accumulated over several years. Frances Haugen's testimony to the US Congress in 2021 exposed how Meta (then Facebook) tiers countries according to the content moderation resources afforded to them. The tiering criteria are not transparent, but documents Haugen put into the public domain indicated that much of the global south is classified as Tier 3, with minimal content moderation resources attached.