
EU Accuses Instagram and Facebook of Breaking Content Rules

Summary

– The European Commission's preliminary finding says Facebook and Instagram violate Europe's Digital Services Act in how they handle illegal content, moderation appeals, and transparency.
– Both Meta and TikTok were also found to breach DSA transparency obligations in the preliminary decision.
– Meta's platforms use confusing obstacles and deceptive "dark patterns" that hinder reporting of illegal content and challenges to moderation decisions.
– Both companies employ burdensome procedures that prevent researchers from accessing public data on their platforms.
– Meta and TikTok face potential fines of up to 6% of global annual revenue and can challenge the findings before a final ruling.

European regulators have formally accused Meta’s Facebook and Instagram platforms of violating the bloc’s Digital Services Act through problematic content moderation systems and insufficient transparency. The preliminary finding from the European Commission identifies significant compliance failures in how these social media giants handle illegal material and provide researcher access to public data.

The Commission specifically highlighted Meta’s implementation of what it describes as “confusing” systems that make it unnecessarily difficult for users to report prohibited content or appeal moderation decisions. Investigators determined the platforms appear to employ deceptive interface designs known as dark patterns, which can obstruct the reporting process for serious violations including terrorist propaganda and child exploitation material.

Beyond user-facing issues, both Meta and TikTok face criticism for establishing what officials characterize as burdensome procedures that effectively prevent legitimate researchers from accessing publicly available data. These transparency shortcomings represent another area where the companies allegedly fall short of DSA requirements mandating greater accountability from large online platforms.

Should the preliminary findings be confirmed as official rulings after further investigation, both companies face fines of up to six percent of their global annual revenue, a penalty that reflects the seriousness of the alleged violations.

Before the Commission issues its final determination, Meta and TikTok retain the opportunity to either contest the preliminary findings or implement corrective measures addressing the identified concerns. The outcome of this regulatory process could establish important precedents for how major tech platforms must structure their content moderation and transparency systems within European markets.

(Source: The Verge)

Topics

Digital Services Act, platform regulation, content moderation, tech companies, illegal content, transparency obligations, dark patterns, European Commission, regulatory fines, user rights