GDPR: New Hurdles for Companies but More Power to People!

The European Union’s General Data Protection Regulation (GDPR) is finally here, coming into full force today, 25 May 2018, for all organizations (‘data controllers’) processing the personal data of natural persons (‘data subjects’) in the EU. The regulation is a big deal because it will have a significant effect not just on controllers based in the EU, but also on those operating outside its borders. Any processing of EU personal data will be subject to the regulation, regardless of where the controller is located or headquartered.

In particular, this will affect the operations of many ICT organizations involved in consumer-facing digital services: mobile network operators, internet service providers, digital content providers, media platforms, social networks, ecommerce and online retailers, and communication service providers, among many others. To date, they have operated internationally with relative ease, serving end users while often applying national privacy and data protection laws only minimally. In large part, this is because those laws have been disparate, even within the EU.

The GDPR’s predecessor was a mere directive (95/46/EC, the Data Protection Directive). A regulation differs from a directive in that the text of a regulation applies directly in national law, whereas a directive requires member states to craft their own legal instruments based on the text. This means that national laws based on directives can vary in scope and meaning, whereas a regulation reads exactly the same across all member states. The replacement of the Data Protection Directive with the GDPR will harmonise data protection and privacy laws across the EU. As such, controllers will be facing the EU as a bloc, rather than being able to play off national disparities when legally challenging privacy regulation.

Many controllers will see this as a blow to tech innovation and expansion. However, the GDPR is less concerned with the barriers it may put in the way of tech development than with the nefarious effects of unbridled collection and processing of its subjects’ data. While the memory of the European wartime and post-WWII eras may have faded somewhat from modern minds (notably the mass data collection and surveillance carried out by entities such as the Stasi, the StB and the Securitate), the core aim in the creation of the EU, its institutions and its regulations has always been to keep the peace and ensure the fundamental rights and freedoms of all its subjects, including those relating to privacy and data protection. While Europe is currently enjoying the longest period of peace in its history to date, concerns about adequately protecting those fundamental rights and freedoms have grown in parallel with the democratization and commoditization of ICTs.

While governments and their related entities have remained dedicated to the protection of such rights (if not always competent in how they go about it), the new threat, as the EU sees it, comes from the private sector and the relative ease with which it has leveraged digital technologies to collect, process and monetize personal data. With the development and application of online tracking, automation, big data analytics, cloud storage services, artificial intelligence and machine learning, ever more data can be aggregated, and precise profiles can be built of individuals’ behavior and activity, both online and offline. Such data is increasingly valuable: a lucrative commodity that fuels billion-dollar business models and industries.

The collection and processing of data is not the issue in itself, and far be it from the EU to ban all such activity; what is at stake is the capacity of individuals to weigh in on how this data, which is inherently theirs, is ultimately used. The GDPR is focused on making controllers more transparent, pushing them to provide better information and, ultimately, to give data subjects some degree of control through explicit consent, should those subjects wish to exercise it.

In the larger picture, most data subjects will continue to opt in to digital services, but the GDPR aims to make controllers accountable for their processing practices. This of course entails changing many embedded practices, and controllers are having to invest time and resources in becoming compliant. Undoubtedly, the GDPR will limit many monetization opportunities and present some hefty barriers to the development of certain technologies. And needless to say, the financial cost of non-compliance will sting: fines can reach €20 million or 4% of annual worldwide turnover, whichever is higher.

For many controllers, the GDPR will be unwelcome. It imposes obstacles to how they use technology that they will now need to address if they want to continue operating in the EU. There are too many to cover in any meaningful way in this blog, but a few are worth highlighting.

One of interest is how the GDPR will affect machine learning. Notably, the debate centres on Article 22, on automated individual decision-making, including profiling. In short, the article states that decisions based solely on automated processing, including the profiling used by many machine learning systems (e.g. for recommendation, advertising, social networks, rating and assessment), are not allowed where they significantly affect the data subject, unless the subject has given explicit consent or another narrow exception applies. The hotly debated issue is whether the article confers a legal right to an explanation of how decisions are made by machine learning algorithms, or simply a less stringent right to information.
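To make the consent question concrete, here is a minimal sketch of what gating a fully automated decision behind explicit consent might look like in code. It assumes a hypothetical consent record per data subject and an invented purpose label ("automated_credit_decision"); it is one possible reading of Article 22, not a prescribed compliance mechanism.

```python
from dataclasses import dataclass


@dataclass
class DataSubject:
    user_id: str
    consents: set[str]  # purposes the subject has explicitly consented to (hypothetical structure)


def score_applicant(features: dict) -> float:
    """Stand-in for an automated model producing a decision with significant effects."""
    return 0.5 * features.get("income", 0) / 100_000 + 0.5 * features.get("tenure_years", 0) / 10


def decide(subject: DataSubject, features: dict) -> str:
    # Only run solely automated decision-making if the subject has explicitly
    # consented to this specific purpose; otherwise route to a human reviewer.
    if "automated_credit_decision" in subject.consents:
        return "approved" if score_applicant(features) >= 0.6 else "declined"
    return "referred_to_human_review"


if __name__ == "__main__":
    alice = DataSubject("alice", consents={"automated_credit_decision"})
    bob = DataSubject("bob", consents=set())
    print(decide(alice, {"income": 80_000, "tenure_years": 6}))  # automated path
    print(decide(bob, {"income": 80_000, "tenure_years": 6}))    # human review path
```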

The article is set to radically change the way many companies use machine learning to provide services to EU data subjects. Designers and deployers of machine learning algorithms will need to figure out how to explain potentially complex algorithmic functions to data subjects. This effort is compounded by the fact that it is sometimes unknown why a model arrived at a specific decision. Further, some may not want to explain their models due to concerns over the protection of intellectual property rights, trade secrets and other confidential or sensitive business data. Designers will need to find a way to explain black-box models that satisfies the GDPR while addressing IPR and other confidentiality concerns.
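As a rough illustration of one such explanation technique, the sketch below computes permutation feature importance for an otherwise opaque model: each feature is shuffled in turn and the drop in accuracy indicates how much the model relied on it. This is only one of many possible approaches, is not mandated by the GDPR, and uses synthetic data purely for demonstration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Synthetic data standing in for personal data used in profiling.
X, y = make_classification(n_samples=500, n_features=6, n_informative=3, random_state=0)
feature_names = [f"feature_{i}" for i in range(X.shape[1])]

# A "black-box" model whose internal logic is hard to communicate directly.
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Shuffle each feature and measure how much the model's score drops:
# a large drop suggests the feature mattered to the decisions.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)

for name, importance in sorted(zip(feature_names, result.importances_mean),
                               key=lambda pair: pair[1], reverse=True):
    print(f"{name}: {importance:.3f}")
```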

Beyond that, controllers will need to work out how to incorporate the other fundamental principles enshrined in the GDPR, notably lawfulness, fairness, purpose limitation, data minimization, storage limitation, and integrity and confidentiality. The fairness principle will need to be balanced against algorithmic bias, in order to prevent arbitrary discriminatory treatment of individuals and to avoid relying on the special categories of personal data referenced in Article 9. This protection against discrimination seems to go against the very essence of algorithmic profiling, which is inherently about differentiating between individuals.
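One very simple way to start probing such bias is to compare the rate of favourable automated decisions across groups, as in the sketch below. The group labels, random data and the 10% threshold are all illustrative assumptions; real fairness auditing involves many more metrics and far more context.

```python
import numpy as np


def positive_rate(decisions: np.ndarray, group_mask: np.ndarray) -> float:
    """Share of favourable (1) decisions within the selected group."""
    return float(decisions[group_mask].mean())


rng = np.random.default_rng(0)
decisions = rng.integers(0, 2, size=1000)              # 1 = favourable outcome (synthetic)
group_a = rng.integers(0, 2, size=1000).astype(bool)   # hypothetical group membership

rate_a = positive_rate(decisions, group_a)
rate_b = positive_rate(decisions, ~group_a)
disparity = abs(rate_a - rate_b)

print(f"group A rate: {rate_a:.3f}, group B rate: {rate_b:.3f}, disparity: {disparity:.3f}")
if disparity > 0.1:  # arbitrary illustrative threshold
    print("Warning: decision rates differ notably between groups; review the model.")
```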

Purpose limitation is another principle that may have a significant adverse effect on the use of machine learning. The principle limits the use of personal data to the purposes for which it was originally collected. As such, reusing that data for other algorithms will not be permitted without again obtaining the data subject’s explicit consent for the new purpose.
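In practice, this might mean tagging stored records with the purposes the subject consented to and refusing reuse until fresh consent is recorded. The sketch below illustrates that idea; the field names, purpose labels and exception type are hypothetical, not part of any standard API.

```python
class PurposeNotConsented(Exception):
    """Raised when data is about to be reused for a purpose without recorded consent."""


record = {
    "user_id": "u-123",
    "email": "user@example.com",
    "consented_purposes": {"order_fulfilment"},  # purposes consented to at collection time
}


def use_for(record: dict, purpose: str) -> dict:
    if purpose not in record["consented_purposes"]:
        raise PurposeNotConsented(
            f"No consent recorded for purpose '{purpose}'; obtain explicit consent first."
        )
    return record


def grant_consent(record: dict, purpose: str) -> None:
    record["consented_purposes"].add(purpose)


use_for(record, "order_fulfilment")              # permitted: the original purpose
try:
    use_for(record, "recommendation_model")      # new purpose: blocked until re-consent
except PurposeNotConsented as exc:
    print(exc)
grant_consent(record, "recommendation_model")    # subject explicitly consents to the new purpose
use_for(record, "recommendation_model")          # now permitted
```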

Further, the data minimization principle will also pose a challenge. Machine learning algorithms generally perform better the more data they can train on. However, data minimization requires that the data used be adequate, relevant and limited to what is necessary for achieving the intended purpose. Designers will have to decide what amount of data is ‘adequate’ while remaining relevant and limited to purpose; this will certainly be a difficult balance to strike.
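One rough way to reason about that balance is to keep adding the most informative features only while they still improve accuracy meaningfully, as sketched below. The synthetic data, model choice and 1% improvement threshold are all illustrative assumptions, not a GDPR-prescribed procedure.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic data with 12 candidate features, only a few of which are informative.
X, y = make_classification(n_samples=600, n_features=12, n_informative=4, random_state=0)

best_score, chosen_k = 0.0, 0
for k in range(1, X.shape[1] + 1):
    # Keep only the k statistically most informative features.
    X_k = SelectKBest(f_classif, k=k).fit_transform(X, y)
    score = cross_val_score(LogisticRegression(max_iter=1000), X_k, y, cv=5).mean()
    if score > best_score + 0.01:   # retain extra data only if it clearly helps
        best_score, chosen_k = score, k

print(f"Retaining {chosen_k} of {X.shape[1]} features at ~{best_score:.2%} accuracy")
```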

Certainly, the advent of the GDPR will have a significant impact worldwide. The EU’s clear position is that the advances brought about by digitization, and its associated technological breakthroughs, should not be left unchecked, as they could easily be used to the detriment of data subjects. The protection of individuals’ rights and freedoms is paramount, even when pitted against arguments about restraining competitiveness and free-market advances. Notably, the fact that the GDPR enshrines the principle of data protection by design and by default will force radical changes in the way many digital services are offered in the EU, and globally. For data subjects, however, the GDPR will be beneficial, providing better understanding of and control over how their data is used, which this author wholeheartedly supports.