
EU Demands TikTok End Addictive Design

Originally published on: February 7, 2026
Summary

– The European Commission has warned TikTok that its endlessly scrolling feeds may violate the EU’s Digital Services Act.
– Regulators found TikTok failed to adequately assess and mitigate risks from addictive design features that could harm users’ wellbeing.
– This case is a major test of the DSA, which requires large platforms to identify and curb systemic risks from their products.
– The Commission stated that TikTok’s design rewards users for continued scrolling, shifting their brains into “autopilot mode.”
– If the provisional findings are confirmed, TikTok could face a fine of up to 6% of its global turnover.

European regulators have issued a formal warning to TikTok, stating that its platform design may violate the bloc’s landmark online content rules. The preliminary findings focus on the app’s potential to foster addictive user behavior, marking a significant step in the enforcement of the Digital Services Act (DSA). This action underscores a broader push to hold major digital platforms accountable for the societal impacts of their products.

The European Commission’s investigation concluded that TikTok’s parent company, ByteDance, has not sufficiently evaluated or addressed the risks associated with features that could damage the physical and mental health of its users. Officials singled out the platform’s endlessly scrolling video feeds, which they argue are engineered to keep people engaged for extended periods. The Commission described a system that constantly “rewards” users with new content, potentially triggering a compulsive need to continue scrolling and placing the brain into what regulators termed an “autopilot mode.”

This case represents one of the most advanced probes under the DSA, a comprehensive set of regulations that mandates large online platforms to identify and mitigate systemic risks. The law specifically requires these companies to conduct thorough risk assessments and implement measures to protect users, with a special emphasis on safeguarding minors and other vulnerable groups.

EU officials expressed particular concern about the impact on young people. “Social media addiction can have detrimental effects on the developing minds of children and teens,” stated Henna Virkkunen, the European Commissioner for the internal market. “In Europe, we enforce our legislation to protect our children and our citizens online.” The findings suggest that TikTok’s current safeguards and age verification processes may be inadequate to prevent these harms.

Should the Commission’s provisional conclusions be confirmed after TikTok exercises its right to defend its practices, the company could face substantial financial penalties. Fines for non-compliance can reach up to 6 percent of a firm’s global annual revenue, a sum that would amount to billions of dollars for a platform of TikTok’s scale. The company now has an opportunity to review the findings and make changes to its service before regulators make a final determination.

This move by Brussels signals a more aggressive regulatory stance toward the design choices of social media giants, challenging the core engagement models that have driven their growth. It places the potential for psychological harm and addictive design at the center of digital governance, setting a precedent that could influence how platforms operate worldwide.

(Source: Ars Technica)

Topics

Digital Services Act, social media addiction, TikTok regulation, addictive design, child protection, platform accountability, EU enforcement, content moderation, mental wellbeing, regulatory compliance