Snapchat Faces DSA Probe Over Child Protection Failures
The European Commission has opened formal proceedings against Snapchat under the Digital Services Act (DSA), focusing on whether the platform adequately protects minors. The investigation examines Snapchat’s risk management measures, particularly regarding children’s exposure to harmful content, illegal activities, and privacy risks.
A central concern is Snapchat’s age verification system. Regulators question whether reliance on self-declared age is sufficient to prevent access by children under 13 and to ensure appropriate safeguards for teenagers. The Commission is also assessing whether Snapchat has effective measures to prevent grooming and the recruitment of minors into criminal activities, noting the risk that adults may impersonate younger users.
The probe further scrutinizes Snapchat’s default account settings. Investigators suspect that privacy protections for minors may be too weak, with features such as automatic friend suggestions and default notifications potentially exposing children to unwanted contact. The lack of clear guidance on safety settings during account creation is also under review.
Finally, the Commission is examining Snapchat’s content moderation and reporting tools. This includes concerns about the spread of illegal or age-restricted goods such as drugs, alcohol, and vapes, and whether reporting mechanisms are user-friendly and transparent. The probe follows earlier action by the Dutch regulator; if breaches are confirmed, the Commission may impose fines, adopt interim measures, or accept binding commitments from Snapchat.