EU Investigates Facebook’s and Instagram’s Handling of Disinformation Ahead of Elections
The European Commission has initiated an investigation into Meta Platforms, the parent company of Facebook and Instagram, over alleged failures to curb disinformation and deceptive advertising in the lead-up to the European Parliament elections. The move follows concerns regarding potential sources of disinformation both within and outside the EU.
According to a report by Reuters, EU tech regulators have raised alarms over the proliferation of misleading information — not only from external actors like Russia, China and Iran, but also from political parties and organizations within the EU. These concerns have prompted the European Commission to take action amid preparations for the upcoming elections scheduled for June 6-9.
The investigation is rooted in suspicions that Meta Platforms may be in breach of EU online content rules, particularly the Digital Services Act (DSA), which came into effect last year. Under the DSA, major tech companies are obligated to take more robust measures to combat illegal and harmful content on their platforms, with potential fines reaching up to 6% of their global annual turnover.
One focal point of the probe will be the activities of a Russia-based influence operation network known as Doppelganger, which Meta exposed in 2022. According to people familiar with the matter cited by Reuters, the EU investigation aims to assess Meta's compliance with DSA obligations, particularly regarding the dissemination of deceptive advertisements, disinformation campaigns and coordinated inauthentic behavior within the EU.
Commenting on the investigation, Margrethe Vestager, the EU's digital chief, said: "We suspect that Meta's moderation is insufficient, that it lacks transparency of advertisements and content moderation procedures."
Meta Platforms, which has more than 250 million monthly active users in the European Union, defended its approach to risk mitigation, asserting that it has a well-established process for identifying and addressing risks on its platforms. A spokesperson said the company looks forward to cooperating with the European Commission and providing further details of its risk-mitigation efforts.
The Commission’s investigation signals a concerted effort to ensure that tech giants like Meta comply with EU regulations aimed at safeguarding the integrity of elections and combating the spread of misinformation and deceptive advertising online.
Source: Reuters