The European Commission has initiated an investigation into social media giant Meta over allegations of breaching the bloc’s online content rules.
The probe, announced on Tuesday, raises concerns regarding Meta’s moderation practices and its handling of deceptive advertising and disinformation on platforms like Facebook and Instagram.
The investigation will also scrutinize Meta’s transparency regarding its moderation of political content and accounts.
EU digital chief Margrethe Vestager emphasized the inadequacy of Meta’s moderation efforts, stating, “We suspect that Meta’s moderation is insufficient, that it lacks transparency of advertisements and content moderation procedures.”
The EU Commission’s investigation revolves around four primary concerns:
- Inadequate oversight and moderation of advertisements.
- Insufficient transparency regarding the demotion of political content and accounts.
- Limited accessibility for journalists and civil society researchers to real-time data and monitoring tools for political content during elections.
- Absence of clear and user-friendly methods for reporting illegal content.
The investigation follows the implementation of the Digital Services Act (DSA) last year, which mandates stricter measures for combating illegal and harmful content online.
As the EU gears up for elections in June, worries about disinformation campaigns, particularly from countries like Russia, China, and Iran, have intensified.
Last month, the discovery of a suspected Russian-sponsored network attempting to influence the upcoming EU vote underscored the importance of tackling disinformation ahead of the elections.
Meta, in response to the investigation, defended its risk mitigation processes, asserting, “We have a well-established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”
Concerns have also been raised over Meta’s decision to discontinue its disinformation tracking tool, CrowdTangle, without a suitable alternative in place.
The regulatory scrutiny comes amid a broader effort to hold tech giants accountable for their content moderation practices. Meta, along with 22 other major online platforms, faces potential fines of up to 6% of global turnover for non-compliance with the DSA.
Meta has been given a five-day window to address the EU’s concerns and detail the remedial actions it has taken.