Google Refuses to Add Fact-Checking Despite New EU Law
Google has informed the European Union that it will not incorporate fact-checking into its search results or YouTube videos, nor will it use fact-checks to rank or remove content, despite the requirements of new EU legislation. In a letter to Renate Nikolay, Deputy Director General at the European Commission, Google’s Global Affairs President, Kent Walker, stated that integrating fact-checking as required by the Commission’s Disinformation Code of Practice is neither appropriate nor effective for Google’s services. Google intends to withdraw from all fact-checking commitments under the Code before it becomes a Digital Services Act (DSA) Code of Conduct.
The EU’s Code of Practice on Disinformation, first introduced in 2018 and strengthened in 2022, sets out voluntary commitments for tech companies, including fact-checking. The Code predates the EU’s Digital Services Act (DSA), which entered into force in late 2022. The European Commission has been in discussions with tech companies to convert these voluntary measures into an official code of conduct under the DSA, but Google has reiterated its stance against adopting the fact-checking measures.
Google emphasizes its current content moderation strategies instead, highlighting what it describes as successful moderation during last year’s global elections. The company has introduced a feature on YouTube that lets users add contextual notes to videos, similar to X’s Community Notes. Google says it will continue to invest in its current moderation practices, including giving users additional context through features like SynthID watermarking and AI disclosures on YouTube.
Google’s decision comes amid a broader shift in the tech industry over platforms’ role in fact-checking and content moderation. Meta recently announced it would end its fact-checking program in favor of community notes, citing concerns about censorship. The trend marks a move away from policing misinformation and has raised concerns about digital safety, particularly for young users.