DigitalEU.net

Independent views on digitalisation in Europe

Digital reality: Individual privacy in Europe has never been as deeply compromised – a status review from 2020

Personal privacy in Europe is truly at stake in 2020. What if our governments and the EU institutions are missing the point?

By Hartmut Seibel, Brussels – 23/10/2020

In light of the current corona crisis, it is abundantly clear that the digital space is becoming ever more crucial for almost all aspects of a European citizen's daily life. It is therefore vital to look at what the European Commission put forward earlier this year in this area and, in particular, to review the actual situation around digital privacy in Europe.

The “European strategy for data”

A key document setting out the European Commission's vision for Europe's digital future is the Communication on "A European strategy for data", released in February 2020 (the 'Communication'). This document illustrates why a closer status review of the current situation around individual privacy rights in Europe is warranted. Two other communications issued on the same date and in the same context – on artificial intelligence and on "Shaping Europe's digital future" – have understandably attracted wider public attention. The European Commission's views on a European data strategy as expressed in February 2020 nonetheless deserve closer scrutiny, and they should be considered against the current reality in the area of digital privacy.

The stakes for Europe and the rest of the world are gigantic

The stakes in the digital industry are undoubtedly extremely high – gigantic, in fact – for a wide range of stakeholders in Europe, as the Commission's Communication outlines. The document rightly explains that nothing less than Europe's technological future is at stake. There is consensus that data is a vital part of the lifeblood of economic development. For illustration: the volume of data produced in the world is growing exponentially, from 33 zettabytes in 2018 to an expected 175 zettabytes in 2025, according to the estimates at the European Commission's disposal.
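For a sense of scale, the two figures cited above imply an annual growth rate of roughly 27%. The following minimal sketch (in Python, purely as a back-of-the-envelope illustration using only the Commission's two estimates) shows the calculation:

    # Implied compound annual growth rate (CAGR) of global data volume,
    # based on the two estimates cited in the Commission's Communication:
    # 33 zettabytes in 2018 and a projected 175 zettabytes in 2025.
    start_zb, end_zb = 33, 175
    years = 2025 - 2018

    cagr = (end_zb / start_zb) ** (1 / years) - 1
    print(f"Implied annual growth rate: {cagr:.1%}")  # roughly 27% per year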

The digital space is increasingly reaching into our homes, our health, our mobility and many other personal aspects of our daily lives. Those of us who are concerned about what this explosion of global data volume implies may wonder more and more what these spectacular developments mean for our personal privacy. Are we still in control of the technology and of our data? Or is the technology – or possibly some players behind it – increasingly controlling us?

In this context it is noted that the Communication speaks about the objective of preserving high privacy, security, safety and ethical standards. But do such statements mean that our data privacy is actually respected and safeguarded in practice? It is admittedly not easy to see through the complex and fast-moving jungle of developments in the digital world, but given our rapidly increasing use of digital technology it is worth checking whether our privacy is truly protected. Let us first glance at what the European Commission had to say in February 2020.

The GDPR – a ‘solid framework for digital trust’?

Interestingly, the European Commission refers in its Communication to the General Data Protection Regulation (GDPR), claiming that on the basis of the GDPR the European Union has created "a solid framework for digital trust" for personal data. Unfortunately, a reality check in the European Union shows a different picture. At this point in time, 'digital trust' may be a notion that convinces only those who have not been able to observe more closely what is actually going on in the digital space in Europe.

The European Commission's "solid framework for digital trust", which supposedly rests on the effectiveness of the GDPR, unfortunately appears to contain some crucial shortcomings.

One of the most critical goes right to the core of the GDPR: it concerns the GDPR's concept of consent and the question of the lawfulness of processing personal data. Several open questions relate to the interpretation and the practical enforceability of the relevant legal provisions of the GDPR.

How effective is the GDPR?

Practical problems arise frequently in the daily life of the average citizen. One type of scenario may well be familiar to many of us: we are regularly confronted with situations in which we have little choice but to provide consent – consent that is probably not "freely given" as the GDPR requires – because of our specific individual circumstances and the lack of alternative digital offerings, simply in order to use, or to continue using, a given software solution, application or device.

Very often, solution providers, application developers or hardware suppliers find ways to all but oblige the consumer to provide her or his consent. They seem to have the upper hand in many situations, and we frequently get 'strong-armed' into consenting to the installation of a solution or application (and grudgingly accept the imposed licensing conditions) even if we do not really agree with its terms. But can one speak of "freely given consent" in such circumstances? Even the interested user would find it difficult to answer this question.

Abusive terms and conditions imposed by technology providers

The average consumer in the EU is frequently confronted with abusive terms and conditions imposed by technology providers, hardware suppliers or social media platforms with significant market power, which are easily able to extract some form of consent from the consumer. From a legal perspective it is often unclear whether such consent can be considered valid under the requirements of the GDPR or whether, alternatively, the provider's conduct could be challenged under the separate set of European competition rules, on the ground that the hardware provider or social media platform was only able to obtain the consent as a result of its market power in the relevant market for such digital offerings.

Interested readers may refer, for illustration, to the German proceeding of the Bundeskartellamt (Federal Cartel Office) versus Facebook, which was addressed – from a German competition law perspective – by the German Federal Court of Justice ("Bundesgerichtshof" or "BGH") in a ruling of 23 June 2020.

In that case, Facebook unilaterally decided to integrate all personal data gathered separately by its group undertakings (Instagram, WhatsApp, Messenger) and obtained from other third-party sources into one single Facebook database. Its millions of users had no opportunity to object to the relevant terms and conditions: users could only use Facebook's social network if they agreed to terms of service stipulating that Facebook may also collect data outside the Facebook website, elsewhere on the internet or in smartphone apps, and assign these data to the respective Facebook user account. The BGH, in an interim relief decision, confirmed the Bundeskartellamt's prohibition ordering Facebook to stop integrating personal data gathered from such external sources into one central database.

Similar questions should naturally arise in other EU countries and, more importantly, on a Union-wide basis, given that Facebook's market position is very similar in most countries in Europe. It is therefore surprising that the European Commission has not yet decided to pursue this matter in parallel at the European level. One concrete conclusion from such scenarios could be that provisions of the GDPR such as Article 7 (on "Conditions for consent") may need to be clarified or expanded, or that specific guidance should be added in the context of the application of the competition rules.

Any interested observer exposed to the digital space these days should furthermore be able to form at least a rough idea that vast amounts of personal data are being siphoned off daily, inter alia, from the smartphones in use in Europe (for example location data, device operation data, various usage data sent by apps to their providers, data linked to personal profiling, etc.), without the user even being asked for her or his consent. Unfortunately, the smartphone user has little or no means of fully understanding exactly what data is being taken from the phone on a daily basis and which actor is responsible for such abuse.

It is fairly clear that there is a gigantic and rather opaque – but widespread – practice of siphoning off personal data from most handheld and stationary devices in the digital space, a practice that is extremely lucrative for the tech giants and their partners (so-called third parties) on a global basis. An additional complexity of the current situation is that it is difficult for the user of a digital device to understand which party should be held responsible for these massive data-gathering practices: is it the hardware supplier or the software provider that siphons off, for example, the personal location data? Which actor should we report to the relevant data protection authorities in these circumstances?

It is entirely unclear how the EU’s so-called ‘solid framework for digital trust’ would protect the consumer against these questionable practices that have nonetheless become standard in Europe nowadays. The current overall situation where consumers are being tracked, inter alia, when using their web browser or searching the internet via their PC or when using their smart phone would not appear to inspire much trust in the use of the digital space as far as personal privacy is concerned.

It is recognised that the GDPR, adopted in April 2016 and applicable since May 2018, is still a relatively recent binding, Union-wide measure. Nonetheless, an increasing number of citizens, NGOs, think-tanks and other experts are starting to realise that there are significant problems with some key provisions of the GDPR, but also with the actual implementation and enforcement of this measure.

It is very disappointing, however, that the European Commission, in its review and communication from June 2020 – two years after the GDPR became applicable – did not reach any conclusion about an urgent need for the GDPR to be revised. Another aspect to be addressed is the level of the fines imposed for non-compliance with the GDPR, which has so far been visibly insufficient to deter some of the non-complying tech giants. The fines imposed on tech companies to date amount to little more than rounding errors compared to these giants' overall revenues and do not seem to encourage compliance with the GDPR.
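A purely illustrative back-of-the-envelope comparison makes this 'rounding error' point concrete. The fine and turnover figures in the following sketch are hypothetical placeholders chosen for order of magnitude only, not actual enforcement data; only the Article 83(5) ceiling (the higher of EUR 20 million or 4% of worldwide annual turnover) is taken from the GDPR itself:

    # Illustrative comparison of a GDPR fine with a large tech company's turnover.
    # Both input figures are hypothetical placeholders, not actual enforcement data.
    fine_eur = 50_000_000                  # hypothetical fine of EUR 50 million
    annual_turnover_eur = 150_000_000_000  # hypothetical worldwide annual turnover of EUR 150 billion

    share = fine_eur / annual_turnover_eur
    print(f"Fine as a share of annual turnover: {share:.4%}")  # about 0.03%

    # For comparison, the theoretical ceiling under Article 83(5) GDPR is the higher of
    # EUR 20 million or 4% of total worldwide annual turnover of the preceding year.
    ceiling_eur = max(20_000_000, 0.04 * annual_turnover_eur)
    print(f"Theoretical maximum fine under Article 83(5): EUR {ceiling_eur:,.0f}")

The gap between the two printed figures illustrates the point: a substantial ceiling exists on paper, but fines anywhere near it have so far not been imposed on the largest players.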

Practical problems with enforcing the GDPR

Another important aspect is that the relevant national authorities (the so-called DPAs) are hopelessly overloaded and understaffed for the challenge of enforcing this measure. In particular, the authorities in Ireland and in Luxembourg, which are responsible for supervising some of the more important tech giants active in Europe, do not seem to have sufficient resources to confront the armies of lawyers, consultants and lobbyists that the tech giants can assign to deal with any complaints against them. It illustrates the issue well that a well-known video-app operation ("TikTok") is now seeking to establish its European headquarters in Ireland.

The inadequacy of the resources of the various data protection authorities is not made transparent to the wider public by the decision-makers responsible for enforcing the GDPR at national level. This status quo leads to a regrettable situation in which, so it seems, a great number of law-abiding players in Europe are doing their best to comply with the GDPR (despite the numerous questions of interpretation to be dealt with), whereas other players pretend to comply with it while in reality being busy undermining its objectives. And there may well be a third camp of players who do not even pretend to comply.

Consequently, more and more expert voices in this area conclude that the GDPR puts many European players which seek to be GDPR-compliant at a serious disadvantage compared with some of the tech giants, particularly from outside Europe, that are hardly affected by it. The latter are entirely undeterred, visibly thriving and expanding their dubious data-gathering practices in the current situation. This is unacceptable: it creates an un-level playing field and appears to give the tech giants from outside the EU an advantage over smaller players, an outcome that visibly runs counter to the objectives of the GDPR and the EU's digital agenda.

What the Communication does not say:

About ‘digital diplomacy’

It is worthwhile to recall those aspects on which the Communication remains silent in the data privacy context. Although the document from February 2020 mentions that the European Commission has engaged in "digital diplomacy", recognising 13 countries as providing an adequate level of protection for personal data, it stays silent about the fact that the so-called EU-U.S. privacy shield framework, in place since 2016, was challenged in court in 2018 in the "Schrems II" proceeding.

Interested observers may remember that the previous arrangement, the EU-U.S. safe harbour agreement applicable since 2000, had already been challenged and was invalidated in 2015 by the European Court of Justice in the "Schrems I" proceeding. It therefore does not come as a big surprise to some that in the more recent "Schrems II" case the EU-U.S. privacy shield framework was, like its predecessor, invalidated by the European Court of Justice in July 2020.

Following the invalidation of the EU-U.S. privacy shield framework, it would have been useful and timely for the European Commission to provide a set of questions and answers to citizens and to the industry concerned. The European Commissioner for Justice reportedly told Members of the European Parliament in early September 2020 that the Commission is working on updating the standard contractual clauses, which were not invalidated by the European Court of Justice, but that there would be no "quick fix" for the now invalidated privacy shield framework, given the complex politics of the current EU-U.S. dialogue. The European Commission should nonetheless consider delivering such a Q&A to the numerous parties directly affected by the invalidation.

Under the invalidated EU-U.S. privacy shield framework, the relevant technology provider was able to 'self-certify' its compliance with the framework, which had been developed by the US Department of Commerce in consultation with the European Commission in order to meet the European "adequacy" standard for privacy protection. But could we ever trust such a mechanism, based on self-certification by data controllers managed in places outside the European Union where the personal privacy of EU citizens is not considered a priority and where the intelligence and national security services are able to override any standards agreed with the EU?

It is however worthwhile to note that in the equally relevant ‘cloud services’ context the European Commission admits in its Communication that in light of third country legislation – such as the U.S. CLOUD Act – “there is uncertainty about compliance of cloud service providers with important EU rules and standards, for example on data protection.”

Finally, with respect to the European Commission's digital diplomacy activity, it is important to identify the countries that are part of this initiative and benefit from it. It is far from certain that all 13 of these non-EU countries can truly be considered politically stable and able to demonstrate effective checks and balances ensuring appropriate safeguards against the misuse of personal data of EU citizens shared with actors established there. Furthermore, the Commission decisions granting "adequacy" status to those 13 countries date back many years and should probably be urgently revisited in light of the inevitable political evolutions in those places. Again, it is not clear how this overall situation should inspire the European consumer to trust the current digital framework in the EU.

About the legislative gap in electronic communications

Another important aspect in the context of digital privacy is the lack of legislative follow-up at EU level in the area of the secrecy of electronic communications (the so-called ePrivacy legislation). The Communication does not mention the stalemate that continues at EU Council level in this area. It is nothing less than a scandal that the draft ePrivacy Regulation, finally proposed by the European Commission in 2017, has still not been approved by the EU Member States in the Council. It is widely recognised that the measure currently in place, the directive on privacy and electronic communications from 2002, has been out of date for many years. The consequence of this legislative gap is that numerous types of electronic communications are currently not subject to secrecy protection. Many e-mails and message types may consequently be scanned or read by the provider or by third parties these days, with only a remote risk of sanctions. The Communication unfortunately remains silent about this important gap and the lack of legislative follow-up by the EU Member States.

The concept of data sharing in a common European data space

Finally, in terms of data security, the Communication seems to ignore the risk that a 'European data space', within which data of all types – personal, non-personal and so-called industrial data – would be shared as widely as possible inside Europe, could easily be flawed or undermined. Such an initiative would very likely attract significant interest from powerful global players, who would find their way in through the weakest data-sharing hubs in one or another of the 27 EU countries and would thus be able to 'benefit', through the back door, from such a common European data space. Such weaknesses and 'leaks' could easily backfire and undermine the very concept of a European data space.

European institutions continue to do too little too late

Overall, it is a telling illustration of the political landscape and the strong lobbying influences in Brussels that it took the European legislator about four years from the Commission's 2012 proposal – and, overall, far too long after the first Data Protection Directive of 1995 – to finally adopt the GDPR in 2016. This measure, with some necessary improvements, should have been in place at least ten years earlier, given the technological developments of the decade after 2000. Meanwhile, too many business models of the global tech giants, but also of many other, less visible actors in the digital area (including many almost invisible so-called 'third parties'), have relied for too long on the possibility of processing personal data in a way that is probably not compatible with the GDPR. The stakes for them had become far too big by 2016, the year the GDPR was adopted, and they are now hardly capable of back-pedalling and moving on to other business models.

Furthermore, in the closely related context of e-commerce, it is noteworthy that the European Commission announced earlier this year that it will propose a so-called digital services act package before the end of 2020. Experts will recall that the EU's current legal framework for digital services, the e-commerce directive, was adopted 20 years ago and badly needs to be updated to reflect the rapid digital transformation of the last two decades. Unfortunately, based on a review of the currently available documents, it does not appear at this moment that concerns about the risk of abusive practices in the area of data privacy will be addressed as part of this comprehensive legislative package.

Is data privacy for individuals in the digital world a true priority?

The attentive reader of the above-mentioned Communication, as well as of related documents from the EU institutions, will find that these documents speak in many instances about the need to make the fundamental right of individuals to personal privacy a priority in the overall context of the European strategy for data and in accordance with what are referred to as 'European values'. The latest communication from the European Commission, from June 2020, reviewing the situation after two years of application of the GDPR, speaks of a "human-centric approach to technology" as an important component and a compass for the use of technology. Needless to say, these objectives and priorities are to be welcomed.

However, the careful observer of the actual situation around personal privacy must conclude that there is much room for improvement, both at European and at national level. It is, after all, not a luxury but a cornerstone of the European Charter of Fundamental Rights (see Articles 7 and 8) that our institutions and our governments must do their utmost to protect the privacy of the European citizen. These fundamental rights are not disposable and cannot, under any circumstances, be traded away for industrial or commercial interests.

Our genuine 'trust' in the digital environment should be considered vital for the success of the data strategy for Europe. Considering the track record outlined above, there could be doubts – in a post-Facebook/Cambridge Analytica world – as to whether such trust is justified. How many more data leaks, data losses, abuses, court decisions, political commitments and declarations will be needed before digital privacy is taken seriously in Europe?

[The various issues raised in this article from October 2020 will be addressed in more detail in dedicated follow-up articles in the course of 2021.]