Ursula von der Leyen pushes EU social media age limits for kids

Summary
– The EU’s age-verification app is technically complete, and the Commission is developing bloc-wide rules on minimum social-media ages.
– France, Spain, and several other EU member states are already moving ahead with national laws banning under-15s or under-16s from social media.
– The Digital Services Act already requires large platforms to assess risks to children, with ongoing enforcement actions against Meta, TikTok, X, and Snap.
– The Commission has instructed Apple and Google to integrate the age-verification system at the operating-system level.
– A formal Commission proposal for a single EU rule on minimum social-media ages is expected before the autumn break.
European Commission President Ursula von der Leyen announced Tuesday that the EU’s age-verification app is technically ready for deployment, signaling that bloc-wide minimum social media ages are the next frontier in child online protection. Speaking to MEPs in Strasbourg, she confirmed the Commission is developing a unified approach, though the exact age threshold remains under expert consultation.
The push comes as a wave of national legislation outpaces EU-level action. France banned under-15s from social media in January 2026, framing the measure as a public-health response. Spain has proposed a threshold of 16, while Austria, Denmark, and Slovenia are drafting rules set at 14, 15, and 15 respectively. Italy and Ireland are also exploring restrictions in a similar age range.
Von der Leyen stressed the need for a single EU standard rather than a patchwork of 27 national laws. “Children should be protected in the same way wherever they live in our Union,” she said. The European Parliament has called for a uniform 16-year minimum, but the Commission is first seeking expert input to determine the appropriate threshold.
The age-verification system itself was built by the Commission’s digital-identity team, using zero-knowledge cryptographic techniques that confirm a user meets an age requirement without revealing their exact age or identity. The app is ready for member states to implement at their own pace.
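The privacy model described above can be illustrated with a toy sketch: a trusted issuer checks a user's real birth date once and emits only a signed boolean claim ("meets the age threshold"), so the platform verifying it never sees the birth date or identity. This is a simplified illustration, not the Commission's actual protocol; the function names are hypothetical, and a real deployment would use asymmetric signatures and zero-knowledge proofs rather than a shared HMAC key.

```python
import hmac
import hashlib
import json

# Hypothetical issuer key; a real system would use asymmetric signing keys.
ISSUER_KEY = b"demo-secret"

def issue_attestation(birth_year: int, threshold_age: int, current_year: int) -> dict:
    # The trusted issuer sees the real birth date, but emits only a boolean claim.
    claim = {
        "age_over": threshold_age,
        "result": current_year - birth_year >= threshold_age,
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_verify(attestation: dict) -> bool:
    # The platform checks the signature and the boolean claim only;
    # no birth date or identity ever reaches it.
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(attestation["sig"], expected) and attestation["claim"]["result"]

# A 14-year-old fails a 16+ check; a 21-year-old passes it.
print(platform_verify(issue_attestation(birth_year=2012, threshold_age=16, current_year=2026)))  # False
print(platform_verify(issue_attestation(birth_year=2005, threshold_age=16, current_year=2026)))  # True
```

The point of the design is the narrow interface: whatever threshold the EU settles on, the platform's check reduces to verifying one signed yes/no claim.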
This initiative sits within the broader Digital Services Act (DSA) enforcement cycle. Meta, TikTok, X, and Snap are under active Commission investigation for how they handle minors, with potential findings expected within the next year. The Commission has also instructed Apple and Google to integrate the age-verification system at the operating-system level.
Industry reaction has been cautious. Platform operators argue that hard age limits could push minors toward unmoderated or non-EU services, and that verifying age at scale poses significant technical challenges. Child-safety groups, however, urge faster action, citing rising self-reported harm and worsening mental-health indicators among adolescents who use social media heavily.
Privacy advocates warn that even zero-knowledge systems could drift toward identity verification over time. The Commission insists the system is designed to prevent that, though full governance and audit details have not yet been published.
A formal Commission proposal is expected before the autumn break. In the meantime, national laws already enacted will remain in force, and member-state regulators will continue enforcing existing DSA obligations regardless of the new age-related framework’s timeline.
(Source: The Next Web)