EU says TikTok’s “addictive design” may break the Digital Services Act — and wants core features changed

Europe’s tech regulators have fired a clear warning shot at TikTok: the European Commission says it has preliminarily found the app in breach of the Digital Services Act (DSA) because of design choices that can push users—especially minors and vulnerable adults—into compulsive, “autopilot” scrolling.

This isn’t a debate about one bad video or one toxic trend. It’s about the architecture of the platform.

What the EU is targeting

The Commission’s preliminary view focuses on the very features that make TikTok feel frictionless:

  • Infinite scroll
  • Autoplay
  • Push notifications
  • A highly personalized recommender system

In the EU’s framing, these aren’t neutral conveniences. They are “engagement engines” that can continuously reward the brain with novelty—making it harder to stop.

The core allegation: TikTok didn’t assess the risks properly

Under the DSA, large platforms are expected to identify and evaluate systemic risks—and show that they understand how their product choices affect users.

The Commission says TikTok’s risk assessment fell short, particularly around harm to physical and mental wellbeing. The concern is that constant reward cycles can encourage compulsive behavior and weaken self-control, especially for people who are more vulnerable to addictive patterns.

What stands out is the Commission’s emphasis on internal indicators of overuse, such as:

  • how much time minors spend on the app at night
  • how frequently users open the app
  • other signals that suggest compulsive engagement patterns

The point is blunt: regulators believe the signs were visible—and should have been taken seriously.

The second allegation: TikTok’s “fixes” don’t really fix it

TikTok already has time-management tools and parental controls. The Commission’s preliminary view is that these measures don’t effectively reduce risk, because:

  • time controls can be easy to dismiss and create too little friction
  • parental controls can be burdensome, requiring extra time and skills that many families don’t realistically have

In other words: optional settings aren’t enough if the default product is designed to pull users back in.

What the EU wants next: redesign, not tweaks

The Commission’s message is that TikTok may need to change “the basic design” of the service. The examples it raises are significant:

  • disabling key addictive features (like infinite scroll) over time
  • adding effective “screen time breaks,” including at night
  • adapting the recommender system so it doesn’t relentlessly optimize for maximum engagement

This is a shift from “remove harmful content” to “stop shipping harmful mechanics.”

Why this matters beyond TikTok

If the EU follows through, the precedent won’t stay contained to one app.

Nearly every major social platform uses the same core playbook:

  • frictionless feeds
  • algorithmic personalization
  • autoplay and notification hooks
  • streaks, badges, and other “return triggers”

A regulatory finding that treats these mechanics as a systemic risk under the DSA could accelerate a broader European push toward design accountability—where platforms must prove they’ve engineered for user wellbeing, not just attention extraction.

What “preliminary” means (and what happens next)

A preliminary finding is not a final verdict. It signals that regulators believe they have enough evidence to allege a breach, and the company typically has the chance to respond before any final decision.

But it does put TikTok on a clock:

  • respond to the Commission’s concerns
  • demonstrate stronger risk assessment and mitigation
  • or prepare for enforcement outcomes

Under the DSA, confirmed violations can lead to serious penalties, including major fines and binding orders to change product behavior.

The bigger takeaway

This is the EU saying: the product itself is the problem.

TikTok didn’t get targeted because it’s popular. It got targeted because regulators believe its engagement design can systematically steer users—especially kids—toward compulsive use, while the safeguards are too easy to bypass.

If Europe holds the line, the next era of platform regulation won’t be about “take down this post.”

It’ll be about: stop building the slot machine.
