Facebook experimenting with user participation in its moderation rules
Rising misinformation on climate change and other issues has led Meta to conduct tests with 'deliberative democracy'
Meta is reportedly running experiments to determine whether Facebook users can be included in discussions about complex policy questions, such as how to handle the large volume of posts containing false information on its platforms, and whether those discussions can produce practical solutions.
One of the main moderation challenges for social networks such as Facebook and Twitter is false information, whether deliberate disinformation or misinformation spread out of ignorance or misunderstanding.
Seriously inaccurate information circulates on a variety of subjects, including the Covid-19 vaccine and climate change, the latter of which is especially challenging to address.
An investigation by the environmental organisation Stop Funding Heat found 45,000 posts that downplayed or denied the climate crisis, The Guardian reported last year.
From February to April, the platform brought together three groups of users (250 in total, from five countries) to consider the question, 'What should Meta do about problematic climate information on Facebook?'
Meta was particularly interested in learning what ordinary users would want from moderation if they were given the right information on the issue.
Meta tasked the Behavioural Insights Team (BIT), a UK-based policy consultancy, with inviting the 250 Facebook users to take part in the policy-making process.
According to BIT, 'problematic information' is content that is not necessarily false but presents viewpoints that may be misleading, low quality or incomplete, and that could lead readers to incorrect conclusions.
Over the course of two weekends, Facebook users in the three groups were brought together online to learn about platform regulations and climate issues. They were given access to Facebook employees as well as independent experts on both speech and climate issues.
Facebook presented users with a number of possible approaches to problematic climate information, and the users discussed and voted on their preferred outcomes.
The results of the exercise are being analysed by Facebook's policy teams and have not been made public.
However, in a post-activity survey, 80% of participants stated that Facebook users like them need to have a voice in policy creation.
Meta says it intends to keep using this strategy.
"We don ' t believe that we should be making so many of these decisions on our own," Brent Harris, vice president of governance at the company, told the Verge.
"You ' ve heard us repeat that, and we mean it," he added.
BIT refers to the process as "deliberative democracy", saying it adds "depth and legitimacy" to policy decisions.
In an effort to improve and scale user deliberation, BIT says it intends to keep working with Meta.
" Together, we look forward to developing governance mechanisms that allow social media users to meaningfully influence the design, content and regulation of the platforms that shape their lives. In doing so, we also hope to blaze a trail for governance innovation at social media platforms and institutions around the world," the firm noted.