Facebook hampered misinformation research with incomplete data

The error could have held the researchers back by months or even years

Facebook provided flawed and incomplete data to misinformation researchers on how users interact with posts and links on its platform, according to the New York Times.

The publication says it has seen an email that Facebook sent to researchers to apologise for the 'inconvenience [it] may have caused'. The company told the researchers that it was working to fix the issue, which could take weeks due to the huge volume of data that needs to be processed.

Facebook has been sharing data with researchers for the past three years so that they could study the spread of misinformation on its platform. In doing so, the company emphasised complete transparency and access to all user interactions.

However, Facebook has now told researchers that the data it shared included interactions for only about half of its users in the US, not all of them as the company had previously claimed.

Moreover, the US users who were included were those who engaged with political pages often enough to make their political leanings clear, skewing the sample.

According to the NYT report, some researchers questioned whether the mistake was the result of negligence or a deliberate attempt to undermine their research.

Many complained that they had lost months of work due to Facebook's error.

The flaw was first spotted by Fabio Giglietto, a researcher at Italy's University of Urbino, who compared the data provided to the researchers with another set the company released last month.

The two data sets didn't match up, and other researchers later found similar inaccuracies.

"It's a great demonstration that even a little transparency can provide amazing results," Giglietto said of the series of events leading to his discovery.

Cody Buntain, an assistant professor and social media researcher at the New Jersey Institute of Technology, told the NYT that such incidents undermine the "trust researchers may have in Facebook".

Buntain is part of a group, dubbed Social Science One, whose members had access to Facebook's user activity data.

"A lot of concern was initially voiced about whether we should trust that Facebook was giving Social Science One researchers good data," Buntain said.

"Now we know that we shouldn't have trusted Facebook so much and should have demanded more effort to show validity in the data."

A Facebook spokesperson told the NYT that the mistake was the result of a "technical error." They added that the firm had "proactively told impacted partners" that it was working to resolve the issue.

This is not the first time the work of misinformation researchers has been disrupted by Facebook's actions.

Last month, Engadget reported that the platform had disabled the accounts associated with the NYU Ad Observatory project, which used a browser extension to collect data on political ads.

Laura Edelson, the project's lead researcher, said Facebook was interfering with her team because its work exposed problems on the platform.

Also last month, a study by climate-focused thinktank InfluenceMap accused the company of allowing big oil companies to use the platform to spread fossil fuel propaganda. The study concluded that fossil fuel firms and lobby groups ran Facebook advertisements promoting oil and gas as part of the solution to climate change, rather than part of the problem, in an effort to delay the phase-out of fossil fuels.

The analysis found that more than 6,700 Facebook ads last year promoted claims that natural gas is a green or low-carbon fuel, despite research by the Intergovernmental Panel on Climate Change stating otherwise.