EU Investigates Meta Over Child Protection Concerns
European Union regulators have opened a formal investigation into Facebook and Instagram over child-protection concerns, the European Commission announced Thursday. The investigation focuses on whether the algorithms used by both platforms exploit the vulnerabilities and inexperience of minors, potentially fostering addictive behaviors. The Commission has also raised concerns about a “rabbit-hole” effect, in which recommendation systems may steer minors toward increasingly disturbing content.
The investigation also examines potential privacy issues and the possibility of minors accessing inappropriate content. Thierry Breton, the European Commissioner for Internal Market, expressed doubts about Meta’s compliance with the Digital Services Act (DSA). Both Facebook and Instagram are classified as very large online platforms (VLOPs) under the DSA, requiring them to adhere to the strictest regulations or face potential sanctions.
Regulators aim to evaluate Meta’s efforts to mitigate risks to the physical and mental well-being of children and to ensure the protection of their rights. The assessment will determine whether Meta has met its DSA obligations to provide a high level of privacy, safety, and security for minors, particularly regarding default privacy settings and the design of its recommender systems. Non-compliance could result in fines of up to 6% of Meta’s global annual revenue, which would amount to billions of euros.
Meta is already the subject of a separate DSA investigation into deceptive advertising and political content on Facebook and Instagram, opened ahead of the European Parliament elections in June. The outcome of these investigations could significantly affect Meta’s operations within the EU.