EU opens disinformation probe against Meta as elections loom

Comes as Meta closes key disinformation tracking tool

The European Commission has launched a formal investigation into Meta, the parent company of Facebook and Instagram, for potentially violating the bloc's new online content regulations.

The EU suspects Meta of failing to adequately tackle disinformation and misleading advertising, particularly with key EU elections approaching in June.

The investigation hinges on potential breaches of the Digital Services Act (DSA), a set of EU regulations aiming for safer online environments.

It comes amidst heightened concerns about external actors like Russia, China and Iran using social media to spread misinformation and influence the upcoming EU elections in June.

The EU will assess whether Meta effectively combats disinformation and political manipulation on its platforms. This includes investigating if the company's advertising tools were exploited by "malicious actors" to spread pro-Russian propaganda.

A specific area of focus appears to be a recently exposed Russian influence operation ("Doppelganger") that mimicked legitimate media outlets, attempting to sway the elections in Moscow's favour.

While Meta claims to have blocked tens of thousands of links associated with this campaign, the EU remains unconvinced.

Additionally, concerns exist regarding anti-establishment parties in the EU potentially using social media to spread their own brand of disinformation.

The EU will also examine Meta's transparency regarding its political content moderation practices. This includes concerns about the lack of clarity around demoting political content and the effectiveness of user reporting mechanisms for illegal content.

The EU has also expressed concerns about Meta's decision to retire CrowdTangle, a tool used to track disinformation, without offering a suitable replacement. While Meta claims a new Content Library is in development, the EU seeks a more immediate solution.

"This Commission has created means to protect European citizens from targeted disinformation and manipulation by third countries," European Commission President Ursula von der Leyen said.

"If we suspect a violation of the rules, we act. This is true at all times, but especially in times of democratic elections. Big digital platforms must live up to their obligations to put enough resources into this and today's decision shows that we are serious about compliance."

The DSA mandates stricter content moderation practices for "Big Tech" companies operating within the EU. This includes removing illegal content and taking stronger action against disinformation campaigns.

Failure to comply with the DSA can result in fines of up to 6% of a company's global annual turnover or, in severe cases, a platform ban.

Other major online platforms like Amazon, Snapchat, TikTok and YouTube are also subject to the DSA's regulations.

Meta has five working days to respond to the EU's concerns and outline corrective actions.

The company maintains it has well-established risk mitigation processes.

"We have a well-established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details of this work," a spokesperson said.

In December, the European Commission initiated its first investigation under the DSA, targeting social media giant X, formerly known as Twitter. At the time, the Commission said the probe would focus on suspected breaches of obligations, particularly related to posts following Hamas' attacks on Israel.