TikTok Strengthens Age Verification Across Europe
TikTok is expanding age‑detection and moderation tools across Europe as regulators and governments push for stronger safeguards to keep under‑13 users off social media.
The European Commission has issued non-binding DSA guidelines setting a benchmark for proportionate, by-design measures to protect minors’ privacy, safety, and security, including robust age assurance that goes beyond self-declaration.
Denmark plans to ban social media for under‑15s, allow access from 13 with parental consent, and align measures with DSA-compliant national age limits amid EU calls for youth protections.
Denmark is urging that the EU’s December simplification package cover the AI Act and DSA, aiming to cut reporting burdens while advancing child protection and deepfake safeguards.
EU countries are adopting varied national rules to restrict minors’ social media access, leveraging new Digital Services Act guidelines and advanced age verification technologies.
The European Commission has issued new guidelines and an age-verification app prototype to enhance online safety and privacy for minors under the Digital Services Act.
The Commission’s new DSA guidelines outline measures for platforms to protect minors online, focusing on privacy, safety, age assurance, and risk-based compliance.
The European Commission will allow EU countries to set their own social media age limits under the DSA, with flexible age verification methods to reduce regulatory fragmentation.
Denmark is leading an EU push for stricter online child protection, including a possible ban on social media for under-15s and stronger age verification measures.
France is moving to classify certain social media platforms as pornographic sites, subjecting them to strict age checks under new rules, despite complex challenges under EU digital law.