TikTok faces EU fines over ad transparency and researcher access
EU regulators are intensifying scrutiny of TikTok under the Digital Services Act (DSA) and related transparency obligations, with preliminary findings that the platform failed to provide a sufficiently auditable advertisement repository and imposed onerous conditions on researcher access to public data. The European Commission’s ongoing investigation could result in significant fines and mandated operational changes, mirroring enforcement actions already taken against other very large online platforms. TikTok and Meta have both been criticized for inadequate transparency tools that impede regulatory oversight and independent research.
TikTok has countered that enforcement lacks consistency across platforms and that obligations should be calibrated to risk rather than to size thresholds. The company advocates extending DSA-style duties to more online services, including measures to assess and mitigate risks from persuasive design features, while avoiding blanket bans on specific design elements. TikTok further proposes a central EU enforcement authority to coordinate cross-border cases, set strategic priorities, and streamline interaction between DSA and GDPR regulators, aiming to reduce fragmentation and improve compliance clarity.
Enforcement momentum remains strong. In May, TikTok was fined €530 million by the Irish Data Protection Commission for GDPR violations, including inadequate safeguards for user data transferred to China. Comparable large-scale penalties have been imposed on Meta, reflecting EU regulators’ view that sanctions must be proportionate to the scale and societal impact of major platforms if they are to secure effective compliance with evolving EU rules.
Critics argue that the cumulative effect of EU digital enforcement resembles a de facto digital services tax on large platforms, channeling substantial funds into EU jurisdictions. Industry responses include lobbying efforts and calls for regulatory reform. Meanwhile, the Commission continues to expand its inquiries, including examining social media’s effects on well-being, signaling that transparency, data governance, and consumer protection will remain high-priority enforcement areas.