DCMS committee slams Facebook over political adverts, disinformation and obfuscation
'Serious failings in the company's operations' resulted in data manipulation, misinformation and disinformation, say MPs
An interim report by the Digital, Culture, Media and Sport (DCMS) select committee investigating fake news and the use of data in misinformation and disinformation is harshly critical of the role of social media companies, in particular Facebook.
In the 89-page report Disinformation and ‘fake news', the committee accuses Facebook of "hampering our efforts to get information" about its operations throughout the Cambridge Analytica scandal. Facebook last week suspended another third-party app company, Crimson Hexagon, which has connections with both the US government and organisations linked to the Kremlin, and the report notes that, despite the company's protestations, little has changed.
"Over the past month, Facebook has been investing in adverts globally, proclaiming the fact that ‘Fake accounts are not our friends'. Yet the serious failings in the company's operations that resulted in data manipulation, resulting in misinformation and disinformation, have occurred again."
Facebook has approximately the same number of users as there are followers of Christianity (2 billion), while the number of YouTube users matches the entire global population of Muslims (1.8 billion), the report notes. Huge numbers of people can be ‘micro-targeted' via these and other platforms with paid-for messaging in a way that is completely non-transparent.
Moreover, the quasi-monopoly powers of this small number of private companies have been consolidated through acquisitions including Facebook's purchase of WhatsApp and Alphabet owning both Google and YouTube, the report adds.
For a long time social media companies have argued that they are platforms rather than content providers or publishers, so should not be subject to the same laws that govern traditional media, including with respect to political advertising. The committee says this should change.
"The word ‘platform' suggests that these companies act in a passive way, posting information they receive, and not themselves influencing what we see, or what we do not see. However, this is a misleading term; tech companies do control what we see, by their very business model."
It continues: "Within social media, there is little or no regulation. Hugely important and influential subjects that affect us—political opinions, mental health, advertising, data privacy—are being raised, directly or indirectly, in these tech spaces. People's behaviour is being modified and changed as a result of social media companies. There is currently no sign of this stopping."
While it would undoubtedly hurt their bottom line, experience shows that tech companies can be made to take more responsibility for the content on their platforms. The situation changed in Germany after the country introduced tighter laws, with large fines available for the non-removal of hate speech.
"As a result of this law, one in six of Facebook's moderators now works in Germany, which is practical evidence that legislation can work."
The committee recommends a new category be formulated somewhere between a ‘platform' and a ‘publisher' which tightens the liabilities of tech companies.
It is also highly critical of Facebook's rollout of its Free Basics internet service in Burma, noting it "severely limits the information available to users, making Facebook virtually the only source of information online for the majority of people in Burma," and blaming it for the spread of hate speech against the Rohingya minority, many of whom have been killed or forced to leave the country. Again, the company has failed to take sufficient responsibility, the report says.
Among the recommendations for changes in the way social media firms are governed are the following:
- Beefed-up powers for the ICO, including hiring more trained data scientists.
- Changes in electoral law to require all electronic campaigning to have easily accessible digital imprint requirements, including information on the publishing organisation and who is legally responsible for the spending.
- The availability of much bigger fines for transgressions of electoral law than the current £20,000 maximum, as recommended by the Electoral Commission.
- An investigation into the advertising supply chain, with the possible involvement of the Advertising Standards Authority. The government has already moved to launch the Cairncross Review in this area.
- Facebook and other social media companies to have a duty to publish and to follow transparent rules, including being liable under defamation laws.
- Tech companies should be audited, including the algorithms that allow micro-targeting.
- A professional global Code of Ethics, to be developed by tech companies in collaboration with this and other governments, and with academics.
- An investigation into the monopoly powers of some tech firms.
- Addressing the issue of shell companies buying political advertising on digital platforms.
- A tax on tech companies to finance a comprehensive educational framework for digital literacy. "Digital literacy should be the fourth pillar of education, alongside reading, writing and maths."
The interim report also covers Russian influence in the Brexit referendum campaigns, including Leave.EU, and the role of SCL and its subsidiaries Cambridge Analytica and AggregateIQ in that referendum. The final report is due later this year.