The European Commission found that TikTok’s design features, including infinite scrolling, encouraged compulsive use and failed to adequately protect users, particularly children and young people.
This is according to preliminary findings published by the European Commission on Friday as part of an ongoing investigation under the EU’s Digital Services Act.
Under the DSA, the commission can order the company to change how the app works or impose fines of up to 6% of owner ByteDance’s global annual revenue.
The findings come amid growing global scrutiny of social media platforms over the impact of excessive screen time and addictive design on young users.
What the commission said
According to the European Commission, TikTok relies heavily on addictive design features such as infinite scrolling to continuously serve users new content, putting their minds on what regulators call “autopilot.”
The commission said these features encourage compulsive behaviors, such as repeatedly opening the app or scrolling for long periods, exposing users to risks that the platform does not sufficiently mitigate.
“Social media addiction can have a negative impact on the mental development of children and teenagers,” said Henna Virkkunen, European Commission Executive Vice-President for Tech Sovereignty, Security and Democracy.
She added that the Digital Services Act holds platforms accountable for the impact they have on their users, and stressed that the EU enforces its laws to protect children and citizens online.
Back story
The investigation into TikTok was launched in 2024 to assess whether the platform complies with the Digital Services Act. The Digital Services Act imposes obligations on large online platforms to manage systemic risk, protect users and ensure transparency.
As part of the investigation, regulators examined TikTok’s internal risk assessments, company data, and scientific research on behavioral addictions. The commission said the findings reflect growing concerns among regulators around the world about whether social media companies are doing enough to limit designs that are addictive, especially to minors.
The commission previously issued warnings to TikTok and Meta in October for making it difficult for researchers to access data on public platforms, which may also violate the DSA.
Further insights
Regulators also expressed concerns about TikTok’s Daily Screen Time feature, which allows users to set usage limits and receive alerts when those limits are reached. A one-hour-per-day limit applies automatically to users aged 13 to 17, but the commission said the feature is ineffective because the alerts are easy to dismiss.
The commission also criticized TikTok’s parental control system, known as Family Pairing, which allows parents to manage screen time, receive activity reports and restrict certain content. The regulator said these controls are not fully effective because they require additional time and technical effort from parents to set up and administer.
Based on its assessment, the commission concluded that TikTok must change the basic design of its service to comply with the DSA. Suggested changes include disabling infinite scrolling, introducing more effective screen time breaks and adjusting how videos are recommended to users, among others.
TikTok rejected the commission’s preliminary findings, calling them “categorically false and completely worthless.” The company said it would use all means possible to contest the findings. TikTok insisted there is no one-size-fits-all approach to regulating screen time and said it offers multiple tools to help users manage their usage.
What you need to know
Meta’s platforms are among the most heavily fined in Europe for data protection violations. Facebook, Instagram, WhatsApp and others have paid billions of euros in fines under the EU’s General Data Protection Regulation for mishandling user data, including on issues related to children’s privacy, according to a report.
Wider platform oversight
The European Union has expanded its technology oversight beyond TikTok, designating services like WhatsApp as very large online platforms under the Digital Services Act, subjecting them to heightened requirements on risk management, illegal content and user protection.