EU Rules Meta Isn’t Doing Enough to Protect Kids on Facebook, Instagram

Summary
– Meta is in preliminary breach of the EU’s Digital Services Act for failing to prevent children under 13 from using Facebook and Instagram, according to the European Commission.
– Minors can bypass Meta’s age limit by entering a false birth date during sign-up, with no effective age verification controls in place.
– The Commission found Meta’s tools for reporting underage users difficult to use, and flagged reports rarely lead to the child’s account being removed.
– Meta’s risk assessment for protecting minors is described as “incomplete and arbitrary,” ignoring EU evidence that 10-12% of children under 13 use the platforms.
– Meta risks fines of up to 6% of global annual turnover (potentially $12 billion) if it fails to implement robust age verification and risk assessment updates.
The European Commission has formally accused Meta of violating the Digital Services Act (DSA) by failing to adequately shield children under 13 from accessing Facebook and Instagram. This preliminary ruling, announced on Wednesday, caps a nearly two-year investigation into the tech giant’s safety practices.
According to the Commission, Meta lacks sufficient safeguards to prevent underage users from creating accounts. A glaring example is that minors can simply enter a false birth date during sign-up to bypass the 13-year-old minimum age set in Meta’s own terms. The platform has no effective controls to verify a user’s real age, leaving the door wide open for young children.
“Meta’s own general conditions indicate their services are not intended for minors under 13,” said Henna Virkkunen, the EU’s tech policy leader. “Yet, our preliminary findings show that Instagram and Facebook are doing very little to prevent children below this age from accessing their services.”
The Commission also found that the tools available for reporting underage users are “difficult to use and not effective.” Even when a child is flagged, there is often no follow-up to actually remove them from the platform. This failure directly contradicts the DSA’s requirement that platforms must “diligently identify and mitigate the risks” of under-13s using their services.
The EU’s announcement further criticizes Meta’s own risk assessment for protecting minors as “incomplete and arbitrary.” The Commission argues that this assessment ignores “large bodies of evidence from all over the European Union” showing that roughly 10 to 12 percent of children under 13 are already active on Facebook or Instagram.
“Moreover, Meta seems to have disregarded readily available scientific evidence indicating that younger children are more vulnerable to potential harms caused by services like Facebook and Instagram,” the Commission added. A separate investigation into whether these platforms cause “behavioral addictions in children” is still ongoing.
Meta now has a chance to address these breaches. The Commission is demanding that Instagram and Facebook update their risk assessment methodology and implement more robust age verification tools. If Meta fails to comply and receives a final non-compliance ruling, it could face fines of up to six percent of its global annual turnover. Given Meta’s reported revenue of $201 billion in 2025, that penalty could reach as high as $12 billion.
In a statement to The Guardian, Meta pushed back against the preliminary findings. “We’re clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age,” the company said. “We continue to invest in technologies to find and remove underage users and will have more to share next week about additional measures rolling out soon.”
(Source: The Verge)
