European Union Escalates Battle for Child Online Safety, Accuses Meta of Failing to Protect Minors

BRUSSELS – The European Union has launched formal proceedings against Meta Platforms Inc., the parent company of Facebook and Instagram, accusing the tech giant of systemic failures in protecting minors on its platforms. The landmark action, initiated under the bloc's stringent Digital Services Act (DSA), alleges that Meta has not adequately prevented children under the age of 13 from accessing its social media services and has neglected to implement robust measures to remove existing underage accounts. The European Commission's move underscores a growing international resolve to hold powerful online platforms accountable for the well-being of their youngest users, with potential fines reaching billions of dollars should Meta be found in violation.
Deep Dive into the Accusations: A Breach of Trust and Regulation
The European Commission’s formal proceedings, which commenced on May 16, 2024, are rooted in a comprehensive analysis of Meta’s risk assessment reports, responses to official information requests, and extensive public and internal reports. The investigation highlights several critical areas where Meta is purportedly falling short of its obligations under the DSA. A primary concern revolves around Meta's age-assurance and verification methods, which the EU deems ineffective in preventing children younger than 13 from creating accounts on Facebook and Instagram. Despite Meta's own terms of service setting a minimum age of 13, the Commission suggests that users can easily circumvent these rules by simply entering a false birth date, often without further checks to confirm their age.
Beyond initial access, the accusations extend to Meta's purported failure to identify and remove underage accounts once they have been created. The EU expresses particular apprehension about the platforms' interfaces and algorithmic systems, which it believes may exploit the vulnerabilities and inexperience of minors. There are concerns that these systems can stimulate behavioral addictions and create "rabbit-hole effects," drawing young users deeper into potentially harmful content or prolonged engagement. Furthermore, the Commission has criticized Meta's risk assessment practices as "incomplete and arbitrary," asserting that the company has seemingly disregarded readily available scientific evidence on the heightened vulnerability of younger children to the harms associated with such services. The ease with which underage users can be reported, and Meta's follow-up on such reports, have also been flagged as insufficient.
The Digital Services Act: A New Era of Accountability for Tech Giants
The formal proceedings against Meta represent a significant test of the Digital Services Act, a sweeping piece of EU legislation designed to compel large tech companies to take greater responsibility for the content and safety on their platforms. The DSA came into full effect in 2024 for very large online platforms (VLOPs), establishing strict rules aimed at cleaning up the online environment and safeguarding internet users, particularly minors. Under Article 28(1) of the DSA, platforms are required to implement appropriate and proportionate measures to ensure a high level of safety, privacy, and security for minors, and to prevent children below the applicable national minimum age from accessing their services.
The gravity of these accusations is underscored by the potential penalties. If Meta is found to be in violation of the DSA, it could face fines of up to 6% of its worldwide annual revenue. Given Meta's reported revenue of $164.5 billion in 2024, such a fine could run into the billions of dollars, with one estimate placing a potential penalty at $9.9 billion. For serious and repeated violations, the DSA even allows for a platform ban, though such a measure would be an extreme last resort. The case also sets a precedent: it is the first time the specific charge of a platform-level failure to prevent underage access has been leveled at a mainstream social media company under the DSA, a charge previously reserved for adult content sites.
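As a rough illustration, the $9.9 billion estimate cited above is consistent with applying the DSA's 6% ceiling to Meta's reported 2024 revenue. A minimal sketch of that arithmetic (illustrative only; the Commission would set any actual fine through its own procedure):

```python
# Rough check of the fine ceiling discussed above (illustrative only).
meta_revenue_2024 = 164.5e9   # Meta's reported 2024 revenue, in USD
dsa_fine_ceiling = 0.06       # DSA maximum: 6% of worldwide annual revenue

max_fine = meta_revenue_2024 * dsa_fine_ceiling
print(f"Maximum DSA fine: ${max_fine / 1e9:.2f} billion")  # ≈ $9.87 billion
```

The result, roughly $9.87 billion, lines up with the ~$9.9 billion figure reported as a worst-case exposure.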
Broader Industry Scrutiny and a Changing Digital Landscape
This action against Meta is not an isolated incident but rather part of a broader, intensified effort by European regulators to address child safety on online platforms. The EU has initiated similar investigations into other prominent platforms, including Snapchat, YouTube, and TikTok, probing their age verification mechanisms and their efforts to prevent minors from accessing harmful content. Thierry Breton, the EU's internal market commissioner, has publicly voiced concerns that Meta has not done enough to meet its DSA obligations, particularly in mitigating the negative effects on the physical and mental health of young Europeans.
The growing regulatory pressure in Europe mirrors a global movement towards greater scrutiny of social media's impact on youth. Across the continent, governments are actively debating and considering new legislative measures, with some nations like Spain, France, and the UK exploring potential bans or significant age restrictions for social media access for children under 16. These legislative discussions reflect a mounting societal concern over what has been described as a "tsunami of big tech flooding" people's homes, impacting the development and well-being of younger generations. Furthermore, Meta has faced prior criticism from the EU for allegedly blocking researchers from accessing data crucial for studying content that reaches children, potentially hindering independent analysis of its platforms' effects.
The Road Ahead: Meta's Response and the Future of Digital Regulation
With the formal proceedings now underway, Meta has the opportunity to respond to the European Commission's preliminary findings and propose remedial actions. This crucial phase will allow the company to present its defense, challenge the accusations, and outline any measures it plans to implement to enhance child protection on its platforms. The Commission will then evaluate Meta's response before issuing a final decision, a process for which no specific deadline has been set.
The outcome of this investigation carries significant implications, not only for Meta but for the entire digital industry. A ruling against Meta could set a powerful precedent, reinforcing the DSA's authority and potentially catalyzing a wave of stricter enforcement actions across other platforms. It would send a clear message that self-regulation alone is no longer sufficient and that tech companies must proactively implement robust safeguards to protect their youngest users. This ongoing battle between regulators and tech giants marks a pivotal moment in shaping the future of online safety, underscoring the European Union’s commitment to ensuring a safer digital environment for children across the bloc.