What Would the “Digital Omnibus” Change?
The EU’s digital legal framework is extensive and layered: GDPR and ePrivacy govern personal data and communications; the Data Act, Data Governance Act and Open Data Directive regulate data access and reuse; NIS2, DORA and eIDAS drive security and trust; DMA and DSA govern platforms and markets. The Commission’s proposed “Digital Omnibus for the digital acquis” (leaked version here) aims to simplify this landscape, consolidate overlapping rules, and streamline compliance. It also proposes targeted changes to core privacy rules. Here is a clear, legally accurate walkthrough of what the package would do, how it fits into the EU’s digital rulebook, and where the risks lie.
A single backbone for data rules in the Data Act
- Consolidation: The proposal would repeal the Data Governance Act (DGA), the Open Data Directive (ODD) and the Free Flow of Non‑Personal Data Regulation (FFDR), merging their relevant provisions into the Data Act. The result is one main instrument for “Europe’s data economy.”
- Public sector data reuse: The Data Act would incorporate a unified regime for:
- Open government data (formerly ODD): principles, formats (machine-readable, APIs, bulk download), charging rules (free or marginal cost, with defined exceptions), standard licences, and high‑value datasets via implementing acts.
- Reuse of certain categories of protected data (formerly DGA Chapter II): conditions to access data protected by commercial/statistical confidentiality, IP and personal data, secure processing environments, single information points, and competent bodies to assist public sector holders.
- New discretion for very large enterprises/gatekeepers: public bodies could set higher fees or special conditions for reuse by very large companies and DMA gatekeepers, based on objective criteria and proportionality, while extending incentives to SMEs and small mid‑caps (SMCs).
- Data intermediation and data altruism: The DGA’s notification regimes would become voluntary EU labels in the Data Act:
- Data intermediation services: a trust label with simplified obligations (functional, not legal separation from other services; neutrality rules; international transfer safeguards), national registration feeding into a public EU register and common logo.
- Data altruism organisations: a trust label for entities that collect/process data for objectives of general interest, with transparency and security duties, an EU register and common logo/QR.
- Free flow of data: The prohibition of unjustified localisation requirements for non‑personal data would be retained within the Data Act (FFDR repealed).
Calibrations to the Data Act’s cloud switching and trade secrets
- Cloud/service switching: A lighter regime for custom‑made services (outside IaaS) and for providers that are SMEs/SMCs, under contracts signed on or before 12 September 2025. These providers may include early-termination penalties in fixed-term contracts. The withdrawal of switching and egress charges remains the core aim; contrary clauses are void.
- Trade secrets: Data holders could refuse sharing where they demonstrate a high risk of unlawful acquisition, use or disclosure of trade secrets by entities in third countries whose protection is weaker or not equivalent, or by EU entities under such entities’ control. Refusal requires a written justification and notice to a competent authority.
- Smart contracts: The Data Act’s essential requirements for smart contracts executing data sharing would be deleted; the Commission may adopt standards via implementing acts to mitigate uncertainty.
GDPR amendments (targeted but material)
- Personal data definition: Clarifies that information is not personal for a given entity if that entity does not have “means reasonably likely” to be used to identify the person. A downstream recipient who can identify the person would treat the same information as personal.
- Special categories of data (Article 9): Enhanced protection applies where data directly reveal, in relation to a specific person, sensitive traits (racial/ethnic origin, political opinions, religious/philosophical beliefs, trade union membership, health status, sex life or sexual orientation). Genetic and biometric protections remain. Two derogations are added:
- Biometric verification: permitted if necessary to confirm identity and under sole control of the data subject (e.g., local storage or subject-held key).
- Residual special category data in AI development/operation: permitted with strict organisational/technical measures to avoid collecting such data, remove them when identified, or effectively protect them where removal would require disproportionate effort.
- AI training legal basis: Article 88c would clarify that legitimate interests can support processing of personal data for the development/operation of AI systems/models, subject to safeguards (minimisation, enhanced transparency, unconditional right to object, respecting technical signals, protection against regurgitation/data leakage).
- Information duties (Article 13): Derogation where there is a clear, circumscribed controller–data subject relationship, the activity is not data‑intensive, and the data subject can reasonably be assumed to already have core information—unless the controller shares with other recipients, transfers to a third country, performs automated decision‑making, or the processing is likely to result in high risk.
- Automated decisions (Article 22): Clarifies that a solely automated decision “necessary” for a contract is allowed even if a human could also take the decision; other safeguards remain.
- Breach notification (Article 33): Threshold aligned to “high risk” for data subjects; deadline extended to 96 hours; mandatory use of the single-entry point once operational. EDPB to propose a common template for notifications; Commission to adopt via implementing acts.
- DPIAs (Article 35): EU-level harmonised lists of processing operations requiring/not requiring DPIAs; common template and methodology; prepared by EDPB and adopted by Commission via implementing acts, reviewed at least every three years.
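The revised breach regime above reduces to two operational parameters: a “high risk” gate and a 96‑hour clock. A minimal sketch, assuming the proposal’s threshold and deadline survive trilogue unchanged (the function name and record shape are illustrative, not from any official template):

```python
from datetime import datetime, timedelta, timezone

# Proposed Article 33: notify the supervisory authority only where the breach
# is likely to result in a *high risk* to data subjects, within 96 hours of
# becoming aware of it (assumption: final text keeps both parameters).
NOTIFICATION_WINDOW = timedelta(hours=96)

def breach_notification_deadline(became_aware_at: datetime, high_risk: bool):
    """Return the notification deadline, or None if no duty is triggered."""
    if not high_risk:
        return None  # below the proposed threshold: document internally instead
    return became_aware_at + NOTIFICATION_WINDOW

aware = datetime(2026, 3, 2, 9, 0, tzinfo=timezone.utc)
print(breach_notification_deadline(aware, high_risk=True))
# 2026-03-06 09:00:00+00:00
```

The point of the sketch is that the risk assessment, not the clock, becomes the compliance-critical step: a wrong “not high risk” call silently removes the deadline.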
Cookies, ePrivacy and machine‑readable privacy signals
- GDPR control for terminal equipment personal data: Processing of personal data on/from terminal equipment (cookies and similar) would be governed solely by GDPR, aligning lawful grounds and simplifying the dual regime.
- Machine‑readable signals: Once standards exist, controllers would be obliged (after a grace period) to respect automated signals of consent refusal and objections to direct marketing transmitted via browsers, operating systems or other interfaces. The Commission could mandate browser/OS support if market uptake is insufficient. Media service providers would be exempt from honouring these signals.
- ePrivacy Article 4 repeal: The security and breach obligations for electronic communications providers would be removed to avoid overlap with NIS2 and GDPR.
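The Omnibus leaves the actual signal standards to be defined later. Global Privacy Control’s `Sec-GPC` request header is one existing signal of this kind, so a server-side sketch might look like the following; the header semantics and the decision to treat the signal as decisive are assumptions for illustration, not the proposal’s wording:

```python
# Sketch of server-side handling of a machine-readable refusal signal.
# "Sec-GPC: 1" (Global Privacy Control) is used as a stand-in for whatever
# standard the Commission eventually recognises.

def consent_refused(headers: dict) -> bool:
    """True if the request carries an automated consent-refusal signal."""
    return headers.get("Sec-GPC", "").strip() == "1"

def may_set_tracking_cookie(headers: dict) -> bool:
    # Once the obligation applies, a transmitted refusal signal must be
    # honoured; how it interacts with site-level consent banners is left to
    # the eventual standards, so this sketch treats the signal as decisive.
    return not consent_refused(headers)
```

For example, `may_set_tracking_cookie({"Sec-GPC": "1"})` returns `False`, so a consent management platform would suppress non-essential cookies for that request without showing a banner.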
“Report once, share many”: single entry point for incident reporting
- ENISA would build and operate a secure single-entry point. Entities submit one notification; relevant authorities under NIS2, GDPR, DORA, eIDAS and the Digital Identity Regulation receive the required information for their instrument. The tool would support retrieval, updates, and future onboarding of other sectors (CER, energy, aviation). Contingency arrangements apply if the portal is unavailable.
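The fan-out logic of “report once, share many” can be sketched as a routing table from legal instruments to competent authorities. The instrument names mirror the proposal; the authority labels, record shape and function are hypothetical, since the ENISA tool’s interface does not yet exist:

```python
# Illustrative routing: one submission, distributed to every authority whose
# instrument applies to the notifying entity (authority labels are generic
# placeholders, not the actual recipients under national law).
AUTHORITIES = {
    "NIS2": "national CSIRT / NIS2 authority",
    "GDPR": "data protection authority",
    "DORA": "financial supervisor",
    "eIDAS": "trust-service supervisory body",
}

def route_notification(incident: dict) -> dict:
    """Map a single notification to each applicable authority."""
    return {
        instrument: AUTHORITIES[instrument]
        for instrument in incident["applicable_instruments"]
        if instrument in AUTHORITIES
    }

incident = {
    "summary": "ransomware affecting customer records",
    "applicable_instruments": ["NIS2", "GDPR"],
}
print(route_notification(incident))
# {'NIS2': 'national CSIRT / NIS2 authority', 'GDPR': 'data protection authority'}
```

The design question the proposal answers centrally is who maintains this mapping: under the single-entry point, the entity submits once and ENISA’s tool, not the entity, resolves the recipients per instrument.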
Platform law clean-up
- The Platform‑to‑Business Regulation (P2B) would be repealed, with transitional clauses to maintain cross‑references until amended (at the latest by 31 December 2031), on the basis that DSA/DMA now cover its objectives.
Governance and standards
- European Data Innovation Board (EDIB): Formalised as an expert group chaired by the Commission, bringing together Member State authorities, EDPB/EDPS, ENISA and SME representation to support consistent enforcement, coordinate data policy, and advise on standardisation, delegated/implementing acts, and interoperable frameworks for common European data spaces.
Legal and policy context: benefits and risks
- Benefits: Simplification reduces fragmentation and overlapping compliance (one data instrument; a “report once, share many” portal; harmonised DPIA/breach templates; clear cloud switching transition rules; retention of free flow of non‑personal data). Public sector data reuse rules become clearer and more uniform. Trade secret defence against risky disclosures strengthens legal certainty.
- Risks to privacy:
- Narrowing the special categories scope to data that “directly reveal” sensitive traits may reduce enhanced protections for inferences (e.g., sensitive profiles inferred from behaviour), even though Articles 5 and 6 still apply.
- The “means reasonably likely” test could be interpreted to shrink what individual controllers treat as “personal”—useful for pseudonymous IDs—but onward transfers remain sensitive.
- AI training under legitimate interests would need robust safeguards to prevent regurgitation, leakage and unfairness; governance needs to be precise, auditable and enforceable.
- Exempting media providers from honouring machine‑readable signals raises concerns about user control in news environments.
- Civil society organisations (EDRi, ICCL, noyb) argue several changes amount to deregulation and lack impact assessments under Better Regulation; the Charter (Articles 7 and 8) must be respected.
- Legislative prudence: Several GDPR/ePrivacy elements are not mere technical simplifications. They should be debated fully, with evidence and clear safeguards. Harmonisation (DPIAs, breaches, signals) is welcome; substantive changes to definitions and special categories need scrutiny to avoid unintended weakening of rights.
What organisations should do now
- Track the legislative process; provide feedback on operational impacts and rights risks.
- Audit data classification: document your “means reasonably likely” assessment; review onward transfer implications.
- Review models using inferred sensitive data; re‑assess fairness, transparency and harm mitigation in line with Articles 5 and 6.
- Prepare AI governance for legitimate interests: data minimisation, special category purging/controls, regurgitation/leakage protections, transparency notices, opt‑out workflows, model monitoring.
- Update breach processes to “high risk” threshold and 96‑hour deadline; plan migration to the single-entry point and adopt the EU templates once available.
- Revisit cookies and CMPs: design for machine‑readable signals; liaise with browser/OS vendors; if you are a media provider, assess the exemption and its accountability implications.
- For public bodies/re‑users: map the unified Data Act regime; set licence and fee policies (including gatekeeper conditions); build secure processing environments; prepare single information points and competent body workflows.