Irish media regulator probes TikTok and LinkedIn over DSA child protection duties
Coimisiún na Meán has opened a formal investigation into TikTok and LinkedIn over suspected non-compliance with the Digital Services Act (DSA), focusing on their mechanisms for reporting suspected child sexual abuse material (CSAM). The regulator is examining whether these platforms allow users to report CSAM anonymously and whether the reporting tools are easy to access and genuinely user-friendly, as Article 16 DSA requires. The concerns stem from indications that, in practice, users may struggle to locate or use these mechanisms effectively.
The investigation follows a broader supervisory sweep launched in September 2024 into several very large online platforms, including YouTube, X, Meta and Pinterest, to assess their compliance with Article 16 DSA. That review, informed by Coimisiún na Meán’s contact center, user complaints and information shared by other European regulators, identified potential non-compliance and triggered targeted follow-up. Some providers have already adjusted their reporting tools in response, while supervisory engagement with others continues.
The regulator has indicated that TikTok’s and LinkedIn’s interfaces may incorporate “dark patterns” in their illegal-content reporting flows. In particular, Coimisiún na Meán is concerned that design choices could mislead users into believing they are submitting illegal-content notices under the DSA when they are merely flagging breaches of the platform’s terms and conditions, or could frustrate or deter users from reporting at all. If confirmed, such practices could undermine the effectiveness of illegal-content reporting systems, in breach of Articles 16 and 25 DSA, and weaken users’ rights under the DSA framework.
Both TikTok and LinkedIn state that they are committed to compliance and are cooperating with the Irish regulator. Coimisiún na Meán, however, has signaled that it has already requested additional information from several other providers on their reporting workflows and has not ruled out further regulatory action to secure DSA compliance. In parallel, the authority recently opened a separate investigation into X’s compliance with DSA appeal rights, underscoring an increasingly assertive enforcement posture on user-facing procedural safeguards across EU-based platforms.