AI Act Standards Face Delays as Deadlines Approach
The development of technical standards to support compliance with the EU’s Artificial Intelligence Act is running significantly behind schedule, according to CEN-CENELEC, the main European standardization bodies. The European Commission had asked these organizations to deliver the standards by August 2025, so that companies could demonstrate that their AI products and services are safe, trustworthy, and compliant with the new rules. Current project timelines, however, indicate that completion will slip into 2026.
CEN-CENELEC, which comprises 34 national standardization bodies, emphasized that the extended timeline is needed to ensure the standards accurately reflect the state of the art and command consensus among European stakeholders. Even once the initial drafts are completed this year, they must still undergo mandatory editing, assessment by the Commission, and several rounds of consultation and voting, a process expected to occupy much of 2025 and part of 2026.
To address these delays, CEN-CENELEC said it is taking extraordinary measures to streamline the development process, in close cooperation with the Commission’s AI Office. Even so, the lag in standardization raises concerns for companies seeking certainty and clear pathways to compliance under the AI Act.
The AI Act, which targets high-risk AI applications, entered into force in August 2024 and will apply in full by 2027. By August 2025, member states must designate national regulators to oversee domestic compliance in coordination with the Commission’s AI Office. The ongoing delay in standardization underscores the time pressure on both regulators and companies as the final implementation deadlines approach.