AI Act Standards Face Delays as Deadlines Approach
Delays in EU AI Act technical standards mean companies may face uncertainty until at least 2026 as standardization bodies work to reach consensus on standards that enable compliance.
More than half of EU Member States have missed the deadline to implement the NIS2 cybersecurity directive, leaving critical sectors exposed to heightened cyber risks.
MEPs urge the European Commission to adopt a strict definition of open source AI in the AI Act, excluding models with restrictive licensing like Meta’s Llama.
The EU is reviewing its AI Act to simplify compliance for businesses, balancing industry demands for flexibility with the law’s original goal to ensure accountability and mitigate AI risks.
The EU prioritizes compliance over punishment in enforcing the Digital Markets Act, with fines for Apple and Meta signaling its commitment to regulating Big Tech effectively.
The ProtectEU strategy strengthens the EU’s internal security framework with enhanced law enforcement tools, cybersecurity measures, and international cooperation to address evolving threats.
The EU strengthens enforcement of digital laws, prepares AI and election integrity measures, and emphasizes sovereignty, innovation, and fair rules amidst geopolitical challenges.
The European Commission is reviewing Meta’s risk assessment for its AI chat tool to ensure compliance with the Digital Services Act amid ongoing regulatory scrutiny in the EU.
The EU Commission has adopted rules to establish a scientific panel of AI experts to assist in the enforcement and governance of the AI Act.
The EU Commission plans a 2025 impact assessment of the AI Act within a digital package, focusing on simplifying rules for SMEs while addressing cybersecurity and data-sharing regulations.