Myanmar blocks Facebook as Parler CEO Matze claims he was fired over desire to control extremism

Arguments over the power and responsibility of social media companies are coming to a head

Myanmar has blocked Facebook in the country following a military coup which has seen civilian leader Aung San Suu Kyi and members of her party placed under arrest.

The coup has led to widespread protests in the country, and blocking Facebook is designed to prevent protesters from communicating.

Far from being just a social media platform in Myanmar, Facebook is also a major provider of internet services in the country, with Messenger the main channel of communication. The military has said Facebook services will be cut off at least until Sunday, according to the WSJ.

The company has a chequered history in Myanmar, being widely blamed for allowing disinformation to be propagated about the Rohingya Muslim minority, leading to large-scale massacres and a refugee crisis in 2018.

While Facebook admitted its part in the genocide, saying it hadn't done enough to prevent the spread of extremist content, it has been treading a fine line, wanting to be seen as a platform rather than a publisher, since being classed as a publisher would, in many jurisdictions including the US, make it legally responsible for content published on its services.

However, this balance is starting to change, particularly after the Capitol violence which saw former president Trump banned.

Facebook users have frequently said they are concerned about the rise of hate speech and negativity on the platform and want the company to intervene. A survey by Drive Research in the run-up to the US election last year questioned 562 Facebook users across the United States about how misinformation and hate speech have impacted their experience on the social media platform, with 75 per cent saying they wanted the company to step in and clean things up.

Participants said they were unhappy about the rise of disinformation on Facebook: 69 per cent complained about 'political debates', 64 per cent mentioned hate speech and 63 per cent did not welcome posts concerning social movements, all of which gave them a negative perception of the platform.

One in three survey respondents said the social media platform negatively impacts their mental health and half said the platform had become less enjoyable to use.

The extent to which moderators should step in is a growing question for all social platforms. John Matze, the former CEO of right-wing social media app Parler, claimed this week that he was fired by the company over his disagreement with board members over content moderation policies for the app.

In an interview with NPR, Matze said that he wanted to control right-wing extremism on the app, but faced resistance from the Parler board and mega-donor Rebekah Mercer.

Parler is currently reliant on Russian hosting services. Last month, Amazon removed Parler from its Amazon Web Services (AWS) cloud hosting platform over its failure to act quickly enough against violent content.

The move came after a group of Amazon employees called on the firm to take tough action against Parler following the Capitol building violence.

Amazon's decision followed similar steps from Apple and Google, which suspended Parler from their respective app stores over its lax approach to content moderation.

Before its removal from AWS, Parler was seen as a haven for people censored by other social media platforms. Thanks to its hands-off approach to policing user content, the platform became an ideal spot for Donald Trump supporters to post messages celebrating violence, encouraging "patriots" to march on Washington, DC, with weapons.

Parler, at that time, described the actions against the platform as "a coordinated attack by the tech giants to kill competition in the marketplace."

In an interview with Fox News, Apple CEO Tim Cook said that Parler could return to Apple's App Store if it commits to properly moderating posts on the platform.