EU Regulators Target TikTok’s Infinite Scroll, Autoplay Features Under Digital Services Act
The European Commission issued preliminary findings on February 6 stating that TikTok’s design features breach the Digital Services Act (DSA), potentially requiring the platform to modify core functionality or face fines of up to 6% of its worldwide annual revenue.
The Commission identified TikTok’s infinite scroll, autoplay, push notifications, and personalized recommendation system as violations of DSA requirements. Regulators stated that the platform failed to adequately assess how these features could harm the physical and mental well-being of users, including minors and vulnerable adults.
“For example, by constantly ‘rewarding’ users with new content, certain design features of TikTok fuel the urge to keep scrolling and shift the brain of users into ‘autopilot mode’,” the Commission stated in its press release. “Scientific research shows that this may lead to compulsive behaviour and reduce users’ self-control.”
The Commission’s investigation found TikTok disregarded indicators of compulsive app use, including the time minors spend on the platform at night and the frequency of app openings.
Inadequate Risk Mitigation Measures
The Commission determined TikTok’s current measures, particularly screen-time management and parental control tools, do not effectively reduce risks from the platform’s design.
According to the preliminary findings, the platform’s time-management tools are easy to dismiss and introduce limited friction, while parental controls may prove ineffective because they demand additional time and skills from parents to set up.
“At this stage, the Commission considers that TikTok needs to change the basic design of its service,” the press release reads. “For instance, by disabling key addictive features such as ‘infinite scroll’ over time, implementing effective ‘screen time breaks’, including during the night, and adapting its recommender system.”
Investigation Methodology and Next Steps
The Commission based its preliminary views on analysis of TikTok’s risk assessment reports, internal data and documents, responses to multiple information requests, review of scientific research on the topic, and interviews with experts in behavioral addiction and other fields.
TikTok can now examine documents in the Commission’s investigation files and respond in writing to the preliminary findings. The European Board for Digital Services will be consulted in parallel.
A TikTok spokesperson told the Financial Times: “The Commission’s preliminary findings present a categorically false and entirely meritless depiction of our platform. We will take whatever steps are necessary to challenge these findings through every means available to us.”
If the Commission confirms its findings, it may issue a non-compliance decision triggering fines proportional to the nature, gravity, recurrence, and duration of the infringement.
Investigation Context
The preliminary findings are part of formal proceedings launched on February 19, 2024, investigating TikTok’s DSA compliance. The investigation covers the platform’s recommender systems, age verification processes, and privacy protections for minors.
The Commission adopted preliminary findings on data access for researchers in October 2025 and closed an advertising transparency investigation through binding commitments in December 2025.
“Social media addiction can have detrimental effects on the developing minds of children and teens,” said Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy. “The Digital Services Act makes platforms responsible for the effects they can have on their users. In Europe, we enforce our legislation to protect our children and our citizens online.”
Regulatory Context
The development follows TikTok’s deployment of enhanced age-detection technology across Europe in recent weeks. The system analyzes profile information, videos, and behavioral signals to identify accounts that may belong to users under 13. TikTok reports removing approximately 6 million underage accounts globally each month through existing verification measures.
The platform also generated €31 billion in economic value across the EU in 2025 through business advertising, according to independent research by Public First.
Multiple countries are implementing stricter social media restrictions. Australia banned social media for users under 16 in December 2025, resulting in more than 4.7 million account removals across platforms. UK Prime Minister Keir Starmer indicated openness to similar restrictions, while Denmark proposed banning social media for users under 15. Spain recently proposed blocking social media apps from users under 16.
The European Parliament is pursuing age limit requirements for social media platforms across member states.
