EU Challenges TikTok’s Addictive Social Media Practices
The European Commission is using the Digital Services Act to challenge TikTok’s addictive design, signaling stricter EU scrutiny of platform architecture and mental health risks.
TikTok is expanding age‑detection and moderation tools across Europe as regulators and governments push for stronger safeguards to keep under‑13 users off social media.
Ireland’s media regulator, Coimisiún na Meán, is investigating TikTok and LinkedIn for possible DSA breaches over opaque, non-anonymous mechanisms for reporting suspected child sexual abuse material.
The European Parliament urges an EU-wide default ban on social media for under‑16s, targeting addictive design and dark patterns, while pressing to strengthen child protection beyond the DSA.
The Commission issues non-binding DSA Guidelines setting a benchmark for proportionate, by-design measures to ensure minors’ privacy, safety, and security, including age assurance more robust than self-declaration.
Denmark plans to ban social media for under‑15s, allow access from 13 with parental consent, and align measures with DSA-compliant national age limits amid EU calls for youth protections.
Denmark urges the EU’s December simplification package to include the AI Act and DSA, aiming to cut reporting burdens while advancing child protection and deepfake safeguards.
EU countries are adopting varied national rules to restrict minors’ social media access, leveraging new Digital Services Act guidelines and advanced age verification technologies.
The Commission’s new DSA guidelines outline measures for platforms to protect minors online, focusing on privacy, safety, age assurance, and risk-based compliance.
The European Commission will allow EU countries to set their own social media age limits under the DSA, with flexible age verification methods to reduce regulatory fragmentation.