Facebook, Instagram and TikTok preliminarily found in breach of DSA obligations on moderation and data access
The European Commission has issued a preliminary decision finding that Facebook, Instagram, and TikTok breached the Digital Services Act (DSA) in the areas of illegal content handling, user moderation workflows, and transparency. Meta’s platforms are cited for “confusing” user pathways to report illegal content and to contest moderation decisions; the Commission points to deceptive interface designs that can impede the reporting and removal of material such as child sexual abuse content and terrorist propaganda. Meta and TikTok are also flagged for failing to meet data access obligations for vetted researchers.
The findings highlight concerns over dark patterns that may deter users from reporting unlawful content or misdirect them when they attempt to do so. Such design practices, if confirmed, would undermine core DSA requirements for user-friendly notice-and-action mechanisms and structured redress processes. The Commission’s preliminary assessment suggests systemic friction, rather than isolated defects, in the platforms’ reporting tools and appeals infrastructure.
Transparency shortcomings extend to research access: both companies are said to impose burdensome procedures that limit qualified researchers’ access to public data. Under Article 40 of the DSA, Very Large Online Platforms must provide meaningful, timely access to publicly available data for research serving the public interest, subject to safeguards. The Commission’s assessment indicates non-compliance in how these platforms operationalize researcher access.
Potential sanctions are substantial: each company faces fines of up to 6 percent of its annual worldwide turnover, pending the final decision. Meta and TikTok may challenge the findings or implement corrective measures before the Commission issues a binding ruling. If the findings are confirmed, the case would reinforce the DSA’s enforcement posture on illegal content workflows, user redress, and research transparency across major platforms.