AI Regulation

EU Rules Meta Fails to Keep Kids Under 13 Off FB, IG

Brussels is crying foul. A preliminary ruling says the social media giant is flat-out failing to keep kids under 13 off Facebook and Instagram, potentially opening the door to massive fines.

[Image: European Commission building facade with digital data streams overlaid]

Key Takeaways

  • The European Commission issued a preliminary decision stating Meta is breaching the Digital Services Act by failing to prevent children under 13 from using Facebook and Instagram.
  • Meta's current age verification methods are deemed inadequate, allowing minors to easily enter false birthdates to access the platforms.
  • If Meta fails to remedy the breaches, it risks fines of up to six percent of its global annual turnover, potentially amounting to $12 billion.

So, what was the big, dramatic expectation here? After years of Meta promising the moon and stars about protecting its youngest users, everyone figured the EU was finally going to get serious about enforcing its own rules. Turns out, Brussels is indeed getting serious. And it’s not pretty for Menlo Park.

This isn’t just a slap on the wrist; it’s the latest salvo in a war of attrition over who’s really in control of the internet’s sprawling digital playgrounds. The European Commission’s preliminary decision — a document that usually precedes a hefty fine or a stern lecture — states Meta is flat-out breaching the Digital Services Act (DSA). The crime? Apparently, not doing enough to keep kids under the age of 13 off Facebook and Instagram. You know, the age they themselves say is the minimum.

The sheer audacity of it, right? Minors can apparently just punch in a fake birthday and poof – they’re in. No effective checks, no real verification, just a digital nod and a wink. Henna Virkkunen, an EU tech policy leader, basically said it herself: Meta’s own rules say these services aren’t for the under-13 crowd, yet they’re doing “very little” to stop them. It’s like setting a speed limit and then ignoring anyone who goes twice as fast.

Are the Reporting Tools Even Working?

And the tools they do offer for reporting underage users? The Commission calls them “difficult to use and not effective.” Even when a child is reported, there’s often no follow-up. This whole situation reeks of a company that’s optimized for engagement and ad revenue above all else, with child safety treated as an afterthought, a PR hurdle to be managed rather than a core ethical imperative.

Meta’s own risk assessment for protecting minors? The EU’s calling it “incomplete and arbitrary.” They’ve got evidence suggesting that a shocking 10-12 percent of children under 13 are already lurking on Facebook and Instagram. This isn’t some fringe issue; it’s a significant chunk of their user base, potentially exposed to harms they’re too young to process. The Commission also points out that Meta seems to have conveniently “disregarded readily available scientific evidence” about how vulnerable younger children are to the addictive nature of these platforms. An ongoing investigation into whether these platforms cause “behavioral addictions in children” is still simmering.

“Meta’s own general conditions indicate their services are not intended for minors under 13,” EU tech policy leader Henna Virkkunen said in a statement. “Yet, our preliminary findings show that Instagram and Facebook are doing very little to prevent children below this age from accessing their services.”

Now, Meta has a chance to fix this. They can update their risk assessment and, you know, actually implement some decent age verification. If they don’t, the fines could be astronomical – up to six percent of their global annual turnover. We’re talking potentially $12 billion, a sum that should make even the most jaded Silicon Valley executive sweat.

Meta’s response, predictably, is a mix of defensive posturing and vague promises. “We’re clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age,” they stated. Translation: We say we’re doing it, and we think we’re doing it. They promise “additional measures rolling out soon.” We’ll see. Historically, these promises from Meta tend to be about as solid as a digital ghost.

My take? This is the inevitable consequence of a system that prioritizes growth and data above all else. For 20 years, I’ve watched these platforms push boundaries, apologize for the fallout, and then do it again. The EU is finally saying, ‘Enough is enough.’ It’s not just about fines; it’s about forcing a fundamental re-evaluation of who these platforms are designed for and who they’re actively failing to protect.

This isn’t just a European problem. The global implications are huge, and it forces the question: when will platforms like Meta truly internalize the responsibility that comes with connecting billions of people, especially the most vulnerable among us? Or will they continue to play the regulatory game, making just enough changes to avoid the biggest penalties, while the underlying issues fester?

Who Is Actually Making Money Here?

Let’s be blunt: Meta makes money when people are on its platforms, scrolling, clicking, and consuming ads. Children, unfortunately, are a prime demographic for early adoption and long-term engagement. The cost of strong age verification, the potential loss of younger users and their future ad revenue — these are real business considerations that weigh against child safety. This ruling forces Meta to confront the uncomfortable truth that their business model, at least as currently implemented, clashes directly with the imperative to shield young eyes.


Frequently Asked Questions

What does the European Commission’s preliminary decision mean for Meta?
It means Meta is accused of breaking the Digital Services Act by not adequately preventing children under 13 from using Facebook and Instagram. This could lead to significant fines if Meta doesn’t comply.

Will this affect users outside of Europe?
The Digital Services Act is a European regulation, but its principles and the precedent it sets can influence how other regions approach platform accountability for child safety.

Is Meta being fined already?
No, this is a preliminary decision. Meta has an opportunity to respond and propose remedies before a final ruling and any potential fines are issued.

Written by
Legal AI Beat Editorial Team

Curated insights, explainers, and analysis from the editorial team.



Originally reported by The Verge - Policy
