Privacy by Design and Privacy by Default

For many companies, data protection is still an afterthought, addressed only once everything else is in place: the move to the cloud, the expensive development of an app, or the elaborately programmed logistics system. Many companies accordingly struggle to comply with the requirements of Art. 25 of the General Data Protection Regulation (GDPR). When IT technologies and software are used, this Article obliges companies to verify that they were developed and are used in accordance with data protection standards (Privacy by Design). In addition, suitable default settings must be in place to ensure compliance with the data protection principles of Art. 5 GDPR (Privacy by Default).

A breach of Privacy by Design and Privacy by Default can result in heavy fines. Yet data protection-compliant IT can be of great benefit to both companies and data subjects.

Privacy by Design and Privacy by Default as a solution to modern data protection problems

Personal data, according to a popular saying, is the gold of the 21st century. On closer inspection, however, the comparison is misleading: unlike gold, personal data may neither be accumulated nor used at will. Nor is data transferable like gold: it always remains a personal characteristic of the person concerned. Under the right to informational self-determination, this person should always remain the master of their data.

Yet how can this demand be realised in a world where personal data is, in fact, far more fungible than gold? In a world in which data subjects tick consent declarations for the processing of their data without a second thought, because they have neither the time nor the inclination to deal with the extensively described risks (e.g. disclosure to third parties, data transfers to an unsafe third country)?

The requirements of Art. 25 GDPR can provide a solution here, namely:

  • Privacy by Design, the data protection-compliant design and development of IT systems, as well as
  • Privacy by Default, data protection-friendly default settings.

Roughly speaking, a technology should always provide certain functionalities that protect its users, and these protective options should then also be preset.

An app should be designed in such a way that, by default, it processes only the data required for its basic functionality, while offering the user the ability to adjust the settings themselves (Privacy by Design). In addition, only the minimum permissions required for the basic functions are preset on installation or delivery (Privacy by Default).

All other functions, which would require further data of the data subject, must be actively enabled by the user, should he or she wish to do so. Before each individual activation, the user would have to be informed in a few sentences about the benefit of the function, which recipients would be authorised to access the data, and how long the respective data would be stored.

Provided the user can return to the standard configuration of the app at any time, changes or additions to the purposes of individual optional functions could be implemented without hesitation. In this way, the data subject would always remain the master of his or her data: he or she would know at all times exactly which app function requires which data for which purpose, and could activate or deactivate individual functions accordingly.
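To make this concrete, here is a minimal sketch in Python (all feature names and fields are hypothetical, not taken from any real app) of how such privacy-by-default settings could be modelled: every optional feature starts disabled, and each feature carries the disclosures shown to the user before activation.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Feature:
    """An optional app function plus the information shown before activation."""
    name: str
    purpose: str                       # why the additional data is needed
    data_categories: tuple[str, ...]   # which personal data the feature processes
    recipients: tuple[str, ...]        # who is authorised to access the data
    storage_period: str                # how long the data is kept

    def disclosure(self) -> str:
        """Short notice displayed to the user before each individual activation."""
        return (f"'{self.name}' processes {', '.join(self.data_categories)} "
                f"to {self.purpose}. Recipients: {', '.join(self.recipients)}. "
                f"Storage period: {self.storage_period}.")

@dataclass
class PrivacySettings:
    """Privacy by Default: no optional feature is enabled until the user opts in."""
    enabled: set[str] = field(default_factory=set)

    def activate(self, feature: Feature, user_consented: bool) -> None:
        if user_consented:             # only after the disclosure has been shown
            self.enabled.add(feature.name)

    def reset_to_default(self) -> None:
        """The user can return to the standard configuration at any time."""
        self.enabled.clear()

# Hypothetical example: the linked birthday calendar from the example below
calendar = Feature(
    name="birthday calendar",
    purpose="notify the employer of upcoming birthdays",
    data_categories=("date of birth",),
    recipients=("employer",),
    storage_period="until deactivation of the feature",
)

settings = PrivacySettings()         # default configuration: nothing enabled
print(calendar.disclosure())         # inform the user first ...
settings.activate(calendar, user_consented=True)  # ... then enable on opt-in
settings.reset_to_default()
```

The design choice worth noting is that the disclosure is part of the feature definition itself, so the app cannot enable a function without having a purpose, recipient list and storage period to show.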

Example: Compliance in the company through Privacy by Design and Privacy by Default

Privacy by Design and by Default can be particularly useful in the context of employment. If, for example, the app in question is to be used in the company and the standard configuration corresponds to what the employer may contractually expect from employees (e.g. use of the app for communication purposes), it can be assumed that an employee's decision to consent to additional functions (e.g. activation of a linked birthday calendar and thus notification of the employer) is voluntary. The employee then consents to the processing of his or her date of birth, but not merely because the app would otherwise be unusable and refusal would risk consequences from the employer.

The employer, on the other hand, need not fear having obtained consent that does not conform with the law, since the voluntary nature and the purpose of any processing beyond the basic function are indicated when the app is introduced.

The example shows how the data subject's right to informational self-determination can be preserved technically through the use of privacy-by-design systems. This creates a win-win situation: for the data subject, because privacy-enhancing technologies increase the willingness to accept the processing, and for the company, because it enjoys greater legal certainty.

The data protection principles of Privacy by Design and Privacy by Default

For the development, acquisition and use of data protection-compliant IT, the European Union General Data Protection Regulation (GDPR) specifies data protection principles that have long been discussed in the literature under the keywords “Privacy by Design and Privacy by Default”. The three most essential principles are:

  1. Transparency of data processing and possibility of direct control by the data subject;
  2. Use of procedures that meet technical security standards;
  3. Data protection-compliant default settings. This includes in particular the implementation of the principles of data minimisation and storage limitation.

Another principle, not necessarily rooted in data protection law, is the ease of use of a system, including the granting and withdrawal of authorisations, which is intended in particular to prevent discrimination against older, younger and disabled people.

Depending on which privacy-by-design model in the literature one wishes to follow, further principles are added, such as the limited validity of certificates or secure authentication procedures. These, however, can also be subsumed under the principles already mentioned.

Ultimately, it makes sense to understand privacy by design and privacy by default holistically. It is about taking all appropriate technical and organisational measures (TOM) necessary to protect the rights and freedoms of natural persons in relation to the processing of personal data. For this purpose, data controllers must first define strategies that meet the requirements of Privacy by Design or Privacy by Default.

Privacy by Design thus provides an answer to the question of how to guarantee the data protection principles from Art. 5 of the GDPR. Only when processing is carried out in accordance with these principles are data controllers able to comply with all obligations under the GDPR.

Transparency principle in the GDPR

The principle of transparency in data protection law is already embodied in numerous obligations and rights of the GDPR. In addition to the standardised principle of transparency in Art. 5 (1) (a) GDPR, the right of access in Art. 15 GDPR deserves emphasis. Under this provision, the data subject must on request be provided with information on all circumstances relevant to the processing.

Since the right of access also includes the right to a copy, it can become an expensive and time-consuming burden for data controllers in practice. Before introducing a data processing system, it should therefore be verified to what extent the system allows automated exports of data subjects' data. In particular, it should be ensured that exports are possible at the level of the individual data subject and not only for an entire database.
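As a rough illustration, a per-subject export could look like the following sketch in Python (the table names and the 'subject_id' column are hypothetical; a real system must map every processing activity that stores personal data):

```python
import json
import sqlite3

def export_subject_data(db_path: str, subject_id: str) -> str:
    """Collect all records linked to one data subject (copy under Art. 15 GDPR).

    Assumes a hypothetical schema in which each listed table has a
    'subject_id' column linking its rows to the data subject.
    """
    tables = ["profiles", "orders", "consent_log"]  # hypothetical tables
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row
    export = {}
    for table in tables:
        rows = conn.execute(
            f"SELECT * FROM {table} WHERE subject_id = ?", (subject_id,)
        ).fetchall()
        export[table] = [dict(row) for row in rows]
    conn.close()
    return json.dumps(export, indent=2, default=str)

# Usage: hand the machine-readable copy to the data subject on request
# print(export_subject_data("crm.db", "subject-123"))
```

The essential point is the WHERE clause: the export is filtered to a single data subject rather than dumping the database as a whole.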

Principle of data security in the GDPR

In addition to the transparency principle, the use of procedures that meet technical security standards is also more or less explicitly regulated in current data protection law. Numerous articles of the GDPR stipulate that appropriate technical measures must be used in addition to organisational measures; the requirements of Art. 24 and Art. 32 GDPR are particularly noteworthy. In practice, however, these provisions prove problematic time and again, as controllers using IT systems they have not developed themselves have hardly any room for manoeuvre to adapt them, let alone to retrofit encryption or multi-factor authentication.

The GDPR therefore requires controllers in many places to ensure appropriate technical and organisational measures,

“taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, […] both at the time of the determination of the means for processing and at the time of the processing itself” (Art. 25 (1) GDPR).

Furthermore, according to the fourth sentence of recital 78 of the Regulation,

“producers of the products, services and applications should be encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfil their data protection obligations”.

Even though the text of the Regulation itself imposes no obligation on manufacturers, the controllers' duty to verify that they use data protection-compliant systems can, according to the Regulation, at least indirectly have an effect on the manufacturers of IT systems.

Principle of data protection-compliant default settings in the GDPR

Art. 25 (2) GDPR requires controllers to ensure, through appropriate system settings, that by default only personal data which are necessary for the respective specific processing purpose are processed. Specifically, this applies to the amount of data collected, the extent of their processing, the storage period and their accessibility. A person who wishes to participate in a social network, for example, must not be forced to fill his or her profile in advance with data that are not necessary at all for contractual participation in that network.

Furthermore, Art. 25 GDPR requires that once a profile has been created, it should not be made accessible to “an indeterminate number of natural persons” without the intervention of a natural person (e.g. the data subject or a network account manager). Here, the fundamental idea of privacy by design becomes clear, according to which the data subject (or a person appointed by him or her) should decide on the scope and nature of the processing of his or her data.
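A minimal sketch in Python (the field and class names are hypothetical) of a profile whose defaults follow this idea: only the data required for participation is collected, and the profile stays invisible until a natural person decides otherwise.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class Visibility(Enum):
    PRIVATE = "private"    # default: not accessible to an indeterminate
    CONTACTS = "contacts"  # number of persons without human intervention
    PUBLIC = "public"

@dataclass
class Profile:
    """Hypothetical social-network profile with data protection-friendly defaults."""
    username: str                        # required for contractual participation
    email: str                           # required to operate the account
    date_of_birth: Optional[str] = None  # optional: only if the user adds it
    interests: list = field(default_factory=list)  # optional
    visibility: Visibility = Visibility.PRIVATE    # Privacy by Default

    def publish(self, confirmed_by_user: bool) -> None:
        """Widening accessibility requires the intervention of a natural person."""
        if confirmed_by_user:
            self.visibility = Visibility.PUBLIC

profile = Profile(username="jdoe", email="jdoe@example.org")
assert profile.visibility is Visibility.PRIVATE  # nothing exposed by default
profile.publish(confirmed_by_user=True)          # explicit decision by the user
```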

Technology that violates data protection can be severely sanctioned

If a controller relies on insecure IT, or on IT that is problematic under data protection law, despite the existence of data protection-compliant alternatives, it must expect a fine of up to EUR 10,000,000 (or, in the case of an undertaking, up to 2% of its total worldwide annual turnover, if that amount is higher) pursuant to Art. 83 (4) GDPR. When selecting systems, data controllers should therefore pay particular attention to ensuring that they process as little personal data as possible, at least by default, or that they specifically limit the meaningfulness of the processed data by using pseudonymisation technology.
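One way to limit the meaningfulness of processed data is pseudonymisation as defined in Art. 4 (5) GDPR. A minimal sketch in Python: direct identifiers are replaced by keyed HMAC values, so records remain linkable for analysis, while re-identification requires the separately stored key.

```python
import hmac
import hashlib

# The key is the "additional information" of Art. 4 (5) GDPR and must be
# stored separately from the pseudonymised records, with access controls.
PSEUDONYMISATION_KEY = b"store-me-in-a-separate-key-management-system"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed, deterministic pseudonym.

    Deterministic: the same person always maps to the same pseudonym,
    so records stay linkable for analysis without revealing who it is.
    """
    return hmac.new(
        PSEUDONYMISATION_KEY, identifier.encode("utf-8"), hashlib.sha256
    ).hexdigest()

record = {"user": "jane.doe@example.org", "page_viewed": "/pricing"}
record["user"] = pseudonymise(record["user"])  # analysis-ready, name removed
print(record)
```

Deterministic keyed hashing is only one option; random mapping tables or tokenisation achieve the same effect, as long as the additional information needed for re-identification is kept separately.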

If, on the other hand, a data protection breach occurs despite Privacy by Design, the supervisory authority must take the use of data protection-friendly IT into account when deciding whether to impose a fine and in what amount, pursuant to Art. 83 (2) (d) GDPR.

Does Privacy by Design inhibit or promote the economy?

The economic value of personal data is not measured in the informative value of an individual data point

At first glance, Privacy by Design and Privacy by Default only seem to inhibit the monetisation of personal data that many companies are striving for. The economic value of personal data, one might think, lies in accumulating as much intrinsically meaningful data about a person as possible. Yet given that uncontrolled and unlimited accumulation was never permissible and can be severely sanctioned under the General Data Protection Regulation, a strategic approach to the use of data processing technology is indispensable.

The economic value of personal data is not measured by the informative value of the individual data point. It is usually the contextual connection of individual data to a specific person that makes the data valuable. If you know how a person behaves in certain situations, or what preferences he or she shows, you can infer that person's behaviour in similar situations, or similar preferences. In many constellations (e.g. selection procedures, profile information in a network), static, self-reported written information about a person is therefore often less valuable for getting to know that person in practice than the data revealed by his or her behaviour.

Relying on supposedly meaningful static data, such as a person's name, to infer cultural preferences, for example, increasingly leads to erroneous conclusions in an ever more globalised world. Pseudonymisation therefore only rarely stands in the way of economically effective personality analyses, which large online advertising networks have long since begun to use; for these, the application of Privacy by Design is indispensable. Synthetic data, i.e. artificially generated data sets derived from real data, will also play an increasing role in Big Data.

Recognise privacy-friendly technology as a competitive advantage

Companies also need not fear the shift of control from the company to the data subject, which is the aim of the privacy-by-design approach. Numerous analytics apps (e.g. in the health sector) make it clear that data subjects today have a very strong interest in voluntarily making their data available for analytics and marketing purposes. The more transparent the process is for the data subject, the greater their willingness to forgo adblockers and the like. In addition, numerous companies have already recognised data protection-friendly technology as an effective marketing strategy. Seals and certificates, however, should never replace the controller's own examination of applications or software before use: data controllers are often misled by claims of supposed data protection conformity.

Conclusion: Involving data subjects for greater economic and legal certainty

Privacy by Design and Privacy by Default are of immense importance for the future of data protection law. Because the legislator failed to address certain obligations to the international manufacturers of data processing technologies in the GDPR itself, Privacy by Design indirectly shifts this requirement onto the data controllers using these technologies in Germany and throughout Europe. Unfortunately, practice to date shows that this market pressure materialises only in isolated cases and not across the board.

Contrary to widespread expectations, companies should rely on data protection-compliant IT not only for legal reasons, but also out of genuine economic interest. For where the data subject is sufficiently informed about the use of his or her data and can actively control it, acceptance and a willingness to consent can be expected. Non-transparent, one-sided data processing without the participation of the data subject, by contrast, reinforces the population's growing mistrust of the processing of their data.

The seemingly unresolvable conflict between the right to informational self-determination and the model of paying with one's own data appears diametrically opposed to the principle of data protection-friendly default settings. In the long term, this conflict will probably only be resolved through pressure on the large, data-rich manufacturers: either through the market by means of privacy-by-design requirements, or by means of direct sanctions. The former requires far more extensive legal enforcement, which currently still founders on the limited resources of the competent public bodies.

Nevertheless, a violation of the requirement to use IT that is as data protection-compliant as possible can be severely sanctioned. For this reason alone, companies should choose the more data protection-friendly solution in case of doubt. This presupposes, however, that alternative solutions are compared in the first place. It is therefore important to exert influence on the data processing process as early as possible, starting with the selection of the technology, i.e. before acquisition. A mandatory, standardised catalogue of requirements for introducing new technology in the company is advisable, as sketched below.
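Such a catalogue could be as simple as a structured checklist that every candidate system must pass before acquisition. A minimal sketch in Python, with hypothetical criteria derived from the principles discussed above:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Requirement:
    question: str
    mandatory: bool = True

# Hypothetical catalogue; a real one must reflect the company's processing
CATALOGUE = [
    Requirement("Does the default configuration process only data needed for the basic functions?"),
    Requirement("Can the data subject view, activate and deactivate optional functions?"),
    Requirement("Is a per-subject, machine-readable data export available (Art. 15 GDPR)?"),
    Requirement("Does the system support encryption and multi-factor authentication?"),
    Requirement("Can storage periods be configured and enforced automatically?"),
    Requirement("Does the vendor offer pseudonymisation features?", mandatory=False),
]

def evaluate(answers: dict) -> bool:
    """A system passes only if every mandatory requirement is answered 'yes'."""
    return all(answers.get(r.question, False) for r in CATALOGUE if r.mandatory)

# Usage: fill in the answers during vendor selection, before acquisition
# passed = evaluate({r.question: True for r in CATALOGUE})
```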

The German original of this article was published by our partner activeMind AG.
