Commission presents digital safety guidelines and age verification app
The European Commission has issued new guidelines and an age-verification app prototype to enhance online safety and privacy for minors under the Digital Services Act.
The Commission’s new DSA guidelines outline measures for platforms to protect minors online, focusing on privacy, safety, age assurance, and risk-based compliance.
Poland has formally requested the European Commission to investigate X’s AI chatbot Grok for potential Digital Services Act violations after reports of antisemitic and defamatory content.
The European Commission has reaffirmed that its digital regulations are non-negotiable with the U.S., emphasizing enforcement based on European values and ongoing investigations under the DSA.
The European Commission will allow EU countries to set their own social media age limits under the DSA, with flexible age verification methods to reduce regulatory fragmentation.
Denmark is leading an EU push for stricter online child protection, including a possible ban on social media for under-15s and stronger age verification measures.
France is moving to classify certain social media platforms as porn sites, which would subject them to strict age-check requirements under new rules, despite the legal complexities this raises under EU digital law.
The European Commission has preliminarily found TikTok in breach of the DSA over insufficient ad transparency; the platform could face a fine of up to 6 percent of its global annual revenue.
The European Commission is investigating major adult platforms for DSA breaches on minors’ protection, while Member States target smaller sites and advance EU-wide age verification solutions.
Spain, Greece, and France are pressing for an EU-wide age verification system and minimum age for social media, aiming to better protect minors online.