The European Commission wants to reform the General Data Protection Regulation (GDPR) in order to ease the burden on companies with fewer than 750 employees in particular. Following the initial proposal of 21 May 2025, a further draft has now been published. We explain what companies can expect as a result of the GDPR amendment.
In a nutshell
- On 21 May 2025, the EU Commission presented a reform proposal. The focus is on making things easier for SMEs and so-called small-mid-cap companies.
- At the beginning of November 2025, another draft was leaked that was not actually supposed to be published until 19 November 2025. The changes in the then officially published draft on 19 November 2025 are comprehensive, but less drastic than initially feared.
- Some of the proposals for amending the GDPR concern key definitions, which is likely to lead to renewed legal uncertainty.
GDPR amendments in the Omnibus IV package
The European Commission’s plans to amend the GDPR were recently announced. In addition to the GDPR, the so-called Omnibus IV package also amends the AI Regulation (AI Act), the Data Act and the e-Privacy Directive. The latter is the basis in EU law for cookie banners on websites. According to the published draft, the regulations on this topic are to become part of the GDPR.
Below, we explain the planned changes to the GDPR. As with the European Commission’s first official draft, the question arises as to whether the reform will really bring relief or lead to the erosion of fundamental rights.
Background: What does omnibus mean?
Omnibus IV is part of a series of proposed changes by the European Commission to reduce bureaucracy in the European Union. The aim is to ease the burden on small and medium-sized enterprises in particular. While Omnibus IV is intended to make changes in the area of digitalisation in particular, the other Omnibus packages aim to simplify sustainability (especially due diligence obligations), EU investments, the common agricultural policy, defence readiness and chemicals.
Planned changes to the GDPR
The draft amendment, which was first leaked in early November 2025, goes so far beyond the initial proposals for reforming the GDPR that Politico ran the headline “Brussels knives privacy to feed the AI boom”.
The originally planned abolition of the obligation to keep a record of processing activities (Art. 30 GDPR) for small businesses, together with greater consideration of SMEs in codes of conduct and certifications (Art. 40 and 42 GDPR), gave way to a comprehensive amendment of the GDPR. The final draft, presented on 19 November 2025, has, however, dropped some of the most heavily criticised changes.
Caution! The paperwork that many stakeholders find burdensome will not be eliminated. Documentation requirements will only be reduced in a few areas and will be accompanied by new, vague legal terms that will require clarification by the courts.
Many of these legal terms will require assessments which – you guessed it – will lead to new paperwork.
Specifically, the proposed GDPR reform provides for the following changes:
A subjective concept of personal data
Amendment
First, the Commission proposes a change to the definition of personal data itself (Art. 4 No. 1 GDPR). This will shift from an objective to a subjective view. In other words, it will no longer depend on the data itself whether a person can be identified with it. Rather, it will depend on the ability of a company or any other entity to identify a person with the information at its disposal.
The draft thus aims to codify a recent ruling by the ECJ (C-413/23). In sum, the court ruled that personal data originally received by a recipient in pseudonymised form, which the recipient can no longer link to a data subject, is no longer personal data for the recipient.
However, the draft goes significantly further. According to the draft, information should not be considered personal data for an entity if the entity cannot identify the person to whom the information relates. In doing so, the means that the entity can reasonably be expected to use must be taken into account. Nor does the information become personal data for an entity merely because a subsequent recipient, such as a processor or, notably, an AI system, is able to link or re-link the information to the data subject.
Our assessment
Criticism can be found, on the one hand, in the significance for the persons concerned. It will be difficult for them to find out whether information is currently personal data for the respective entity or not. This has a particular impact on the ability to effectively assert the rights of data subjects, such as the right to access.
The changes also pose difficult challenges for companies. The ECJ ruling has already led to confusion in practice as to who is required to fulfil information obligations and to what extent, or whether, for example, a processing agreement (Art. 28(3) GDPR) is still necessary.
In addition, companies now have to discuss what means of identifying a person can reasonably be expected of them. This is a decision that can only be made on a case-by-case basis, as a reasonably expected means depends not only on the size and resources of the company in question, but also on the type of information and access to additional information that can be used to identify the data subject.
Example: In practice, this is particularly relevant when IDs are used instead of identifying features such as real names. Individual actors, for example in the field of online advertising, use IDs that can be assigned to a person without knowing their name. However, with sufficient effort, this information can be compiled so that a natural person can be identified from the ID profiles. The decisive factor is whether such an approach can reasonably be expected from the company in question at the time.
The new Art. 41a GDPR potentially provides guidance for classification. The Commission has the option of adopting implementing acts that specify the means and criteria for determining when information is no longer personal data for certain entities as a result of pseudonymisation. To this end, the state of the art must be taken into account and specific criteria and categories for analysing possible re-identification must be developed. The extent to which a potential regulation will alleviate the problem raised cannot be predicted at this stage.
In theory, companies may also have the option of having certain identifying features such as real names, addresses or email addresses processed separately from the rest of the information by a service provider, thereby avoiding the scope of the GDPR and the associated obligations.
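How such a separation might work can be illustrated with a minimal sketch. Everything here is an illustrative assumption on our part (the function, the field names, the HMAC-based token): the identifying attributes are replaced by a keyed pseudonym, and the lookup table linking pseudonyms back to identities would be held only by the separate service provider, not by the entity working with the records.

```python
import hashlib
import hmac
import secrets

# Key held only by the mapping provider; without it, the entity holding
# the de-identified records cannot re-derive the pseudonyms.
SECRET_KEY = secrets.token_bytes(32)

def pseudonymise(record: dict, id_fields: tuple = ("name", "email", "address")):
    """Split a record into a de-identified payload and an identity mapping."""
    identity = {k: record[k] for k in id_fields if k in record}
    payload = {k: v for k, v in record.items() if k not in id_fields}
    token = hmac.new(SECRET_KEY,
                     repr(sorted(identity.items())).encode(),
                     hashlib.sha256).hexdigest()[:16]
    payload["pseudonym"] = token
    # The mapping {token: identity} stays with the separate provider.
    return payload, {token: identity}
```

Whether such a setup actually takes the de-identified records outside the scope of the GDPR would, under the draft, depend on the means of re-identification the entity can reasonably be expected to use.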
Special categories of personal data
Amendment
The initially comprehensive amendment to Art. 9(1) GDPR is no longer planned. Originally, only personal data that directly reveals sensitive information about the data subject was to be considered a special category.
Among other things, the easing of verification using biometric data remains: Art. 9(2)(l) GDPR provides for an exception that allows biometric data to be processed if it is under the sole control of the data subject.
Our assessment
The Commission’s decision to abandon this limitation of Art. 9 GDPR is to be welcomed, even though the uncertainties for companies arising from the ECJ’s OTC ruling (C-21/23) remain unresolved. Limiting the existence of special categories of personal data to direct disclosure would have excluded from the protection of Art. 9(1) GDPR data from which, for example, a person’s state of health or political orientation can merely be inferred, as is the case with online advertising.
Artificial intelligence
Amendment
The EU Commission’s draft plans to make the processing of special categories of personal data easier for providers and operators of artificial intelligence.
A new exception for artificial intelligence is to be added to Art. 9(2) GDPR – namely Art. 9(2)(k) GDPR. Sensitive data may then be processed in connection with the operation or development of AI systems, provided that further requirements are met. First, appropriate technical and organisational measures must be in place to prevent the processing of special categories of personal data as far as possible. If such personal data is nevertheless found, it must be removed. However, if this involves a disproportionate effort on the part of the controller, they must prevent results from being produced using this data or the data from being disclosed.
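To make the draft’s remove-or-block mechanism concrete, here is a deliberately naive sketch. The keyword list and the “removal feasible” switch are our own illustrative assumptions, not part of the draft; real technical and organisational measures would be far more sophisticated:

```python
import re

# Hypothetical stand-in for a detector of special-category data.
SENSITIVE_TERMS = ["diabetes", "hiv", "trade union"]
PATTERN = re.compile("|".join(map(re.escape, SENSITIVE_TERMS)), re.IGNORECASE)

def screen_training_text(text: str, removal_feasible: bool = True):
    """Apply the draft's logic: remove sensitive data where feasible,
    otherwise keep it but block outputs/disclosure based on it."""
    if not PATTERN.search(text):
        return text, "keep"
    if removal_feasible:
        return PATTERN.sub("[REDACTED]", text), "removed"
    # Removal would be a disproportionate effort for the controller:
    # the data stays, but results using it must be suppressed.
    return text, "block_output"
```

In practice the detection step, rather than the decision logic, is the hard part: inferring whether free text reveals health data or political opinions cannot be reduced to a keyword list.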
For the definition of AI systems, however, the very broad definition in Art. 3(1) AI Act is used.
The existing exceptions in Art. 9(2) GDPR are extended through the proposed insertion of Art. 4a into the AI Act, which allows the processing of special categories of personal data for bias detection and correction in connection with high-risk AI systems.
Furthermore, the planned Art. 88c GDPR expressly creates the possibility of processing personal data in connection with artificial intelligence on the basis of a legitimate interest (Art. 6(1)(f) GDPR). However, no further details are provided on the necessity or the necessary balancing of interests. Particularly in the case of artificial intelligence, it is difficult for data subjects to assess the consequences of the various processing steps. This would still need to be taken into account in a complete balancing of interests.
Abuse of rights in relation to data subjects' rights
Amendment
An abuse of rights in the assertion of access requests under Art. 15 GDPR is currently rarely assumed. The Commission wants to counteract this. To this end, the draft clarifies that the controller may refuse a request, or charge the costs to the data subject, if the latter “abuses” the rights conferred by the GDPR for purposes other than the protection of their data.
The burden of proof remains with the controller, although under the amendment plausible grounds for assuming an excessive request will suffice.
Our assessment
This change is problematic where access requests are asserted for several purposes, not all of which serve the protection of the data subject’s data. Access requests often pursue purposes that go beyond the protection of one’s own data simply because of their connection to the processing itself: they are made to employers or social media providers, for example, in order to understand how one’s own data is handled, to secure evidence that would otherwise be much harder to obtain, or to assert legal claims. Which of these purposes still serve the protection of one’s own data, and at what threshold “abuse” occurs, can lead to lengthy legal disputes under the proposed amendment as well, burdening companies but above all the individuals concerned.
Information obligations
Amendment
An exception to the requirement for a privacy policy is to be made if
- the data has been collected in a clearly defined relationship between the controller and the data subject,
- the activity is not data-intensive, and
- it can be assumed that the data subject is already aware of the relevant information from Art. 13(1)(a) and (c) GDPR.
The exception cannot be used if the controller transfers the data to other recipients or to a third country, makes automated decisions or if the processing may result in a high risk to the rights and freedoms of the data subject.
Our assessment
It is difficult to determine which processing operations are affected by the exception, as various terms are not defined or explained in further detail. For example, the exact meaning of “clearly defined relationship” and “not data-intensive” cannot be predicted. As to the third condition, it can generally only be assumed that the data subject is already aware of the necessary information if it has been provided by the controller; this applies in particular to the legal basis. The main focus is on everyday transactions, such as those with tradespeople, and the processing of personal data in sports clubs.
Due to the extensive exceptions, the amendment is not expected to provide much relief for companies. In addition, in accordance with the accountability principle set out in Art. 5(2) GDPR, the existence of the exception should require documentation, at least in borderline cases.
Notification of data breaches
Amendment
Under the draft, notification of data breaches to the supervisory authority (Art. 33 GDPR) would only be required where the breach is likely to result in a high risk to the rights and freedoms of the data subject. This significantly raises the threshold for notification and aligns it with the criterion for notifying data subjects under Art. 34(1) GDPR. At the same time, the reporting deadline is extended from 72 to 96 hours.
In addition, notifications are to be made possible via a “single entry point” (SEP) that is yet to be created, as is also provided for notifications under NIS-2 and DORA.
Our assessment
The change is likely to lead to a significant reduction in the number of data protection breaches reported to supervisory authorities. This will relieve the burden on both companies and supervisory authorities, which have previously been required to report and process data protection breaches unless the breach is unlikely to result in a risk to data subjects.
Data protection impact assessment
Amendment
For data protection impact assessments (Art. 35 GDPR), there is to be a central list from the Commission specifying which processing operations require such an assessment. This list replaces the individual lists previously maintained by the respective national supervisory authorities.
Cookie banners
Amendment
According to the Commission’s proposals, provisions on personal data in terminal equipment are to be moved from Art. 5(3) ePrivacy Directive to the GDPR. Requirements relating to non-personal data in terminal equipment will remain in the ePrivacy Directive.
Our assessment
While both regulations would coexist and their requirements for the design of cookie banners would have to be observed, there are significant differences between the requirements of the new Art. 88a GDPR and Art. 5(3) of the ePrivacy Directive.
Firstly, Art. 88a GDPR covers not only the storage of information on the user’s terminal equipment or access to such information already stored on the terminal equipment, but also the processing of personal data “on or from terminal equipment”. The wording is so broad that it could also apply to processing within applications on a terminal device or any personal data that is transferred from the user’s terminal device to a service provider’s server.
However, the intended regulation also introduces new exceptions. In future, it would be easier to process personal data for the purpose of measuring reach or ensuring the security of the service offered. These exceptions are in addition to the familiar exceptions of an explicit request by the data subject and the transmission of a message via a public telecommunications network.
Consent and refusal to process personal data under these rules should be possible by means of a single-click solution. The selection made should remain valid for a period of time yet to be determined, without repeated requests by the controller.
Consent itself should become easier for internet users. To this end, it should be possible to give consent automatically and in machine-readable form, and websites must be able to process this automated consent. Furthermore, the proposed Art. 88b(6) GDPR obliges web browser providers that are not SMEs to create technical means for users to give or refuse consent automatically.
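Since no technical standard for this automated consent signal exists yet, the following is a purely hypothetical sketch of how a website’s server might one day interpret such a signal; the header name and its values are invented for illustration:

```python
# Hypothetical: the draft Art. 88b GDPR does not specify a transport
# mechanism. We assume here a browser-sent HTTP header "Consent-Signal".

def consent_from_headers(headers: dict):
    """Interpret a machine-readable consent signal sent by the browser.

    Returns True (consent), False (refusal) or None (no signal present:
    the website would fall back to asking the user directly, e.g. via a
    one-click banner whose choice is then remembered for some period).
    """
    signal = headers.get("Consent-Signal")  # invented header name
    if signal == "granted":
        return True
    if signal == "refused":
        return False
    return None
```

The interesting open questions are exactly those the sketch glosses over: how granular the signal can be (per purpose? per controller?) and how a website proves it honoured a refusal.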
Outlook for GDPR reform
The question posed at the outset about relief or erosion cannot be answered clearly. While some changes may lead to relief through reduced documentation requirements in current practice, changes to fundamental definitions, i.e. the DNA of the GDPR, and to artificial intelligence in particular will lead to an unpredictable decline in the protection of data subjects.
There will be both relief and erosion. The only question is how much of each we will get.
The fundamental changes to the GDPR will lead to legal uncertainty in addition to a decline in the protection of data subjects. The apparent relief for companies will be associated with legal uncertainties, at least in the initial phase, as various new legal terms will be introduced that require clarification. As things stand at present, the changes will therefore require in-depth legal analysis when they come into force and, not least, a great deal of case law for final clarification.
It is to be hoped that the current Commission draft will lose some of its weaknesses in the further legislative process and create clear guidelines that actually reduce the burden on small and medium-sized enterprises without eroding the fundamental right to protection of personal data under Art. 8 of the Charter of Fundamental Rights.
