The General Data Protection Regulation (GDPR) is a new data protection regulation that comes into effect across the EU on 25th May 2018. The GDPR applies to all companies that process the personal data of individuals in the EU, regardless of where those companies are based. It replaces Directive 95/46/EC, commonly referred to as the Data Protection Directive, which dates back to the 1990s.
The purpose of the GDPR is to standardise data protection within the EU and to give individuals more control over how their personal data is used. It also provides clarity for businesses by harmonising data protection law across the EU, and it introduces much tougher penalties for organisations found to be in breach of its provisions. The GDPR was four years in the making and was adopted by the EU Parliament in April 2016, although it does not come into force until May 2018.
While parts of the GDPR are complex, one way of framing the legislation for ease of understanding is in terms of six key data protection principles underpinning it. These principles for processing personal data are contained in Article 5 of the Regulation.
- Lawful, Fair and Transparent Processing
- Purpose Limitation
- Data Minimisation
- Accurate and Up-to-Date Processing
- Limitation of Storage
- Security and Accountability
The new legislation will ensure the following rights for individuals:
- The right to be informed
- The right of access
- The right of rectification
- The right of erasure
- The right to restrict processing
- The right to data portability
- The right to object
- Rights in relation to Automated Decision-Making and Profiling
With regard to the burgeoning use of data mining and machine learning, one of the aspects of the GDPR that has prompted much debate relates to right number eight above and whether it provides a ‘right to explanation’ for those affected by the decisions of automated decision-making models.
Right to an Explanation
Whether we realise it or not, algorithms increasingly influence how we live our lives. From small things like Amazon recommendations to weightier matters such as access to finance, education and employment, decisions are increasingly being made by computers trained on large data sets.
Technological advances in computing mean that approaches to machine learning that were less feasible in the past, such as neural networks and deep learning, are now being applied with success. The architecture of a neural network is loosely based on that of neurons in the brain. Essentially, it consists of an input layer, one or more hidden layers and an output layer, organised as a set of interconnected nodes in a weighted graph, as in the diagram below. Each node transforms its inputs by applying an activation function to a weighted sum of them, and the weights are adjusted as the model is trained.
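The forward pass just described can be sketched in a few lines of Python with NumPy. This is a minimal illustration, not a trained model: the layer sizes, random weights and sigmoid activation are all assumptions chosen for brevity.

```python
import numpy as np

def sigmoid(z):
    # Activation function: squashes each weighted sum into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, w_hidden, b_hidden, w_out, b_out):
    # Input layer -> hidden layer: apply the activation to weighted sums.
    hidden = sigmoid(x @ w_hidden + b_hidden)
    # Hidden layer -> output layer: a single score between 0 and 1.
    return sigmoid(hidden @ w_out + b_out)

rng = np.random.default_rng(0)
x = rng.normal(size=3)              # one data subject with 3 input features
w_hidden = rng.normal(size=(3, 4))  # weights: input layer -> 4 hidden nodes
b_hidden = np.zeros(4)
w_out = rng.normal(size=4)          # weights: hidden layer -> 1 output node
b_out = 0.0

print(forward(x, w_hidden, b_hidden, w_out, b_out))
```

Training would repeatedly adjust `w_hidden` and `w_out` to reduce prediction error; the forward computation itself stays the same.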
A right to explanation might pose a difficulty for data scientists because, although algorithms like neural networks allow us to accurately model complex non-linear relationships, they operate like black boxes. The algorithm provides little information about how it arrived at any individual decision, so providing an explanation may not be feasible.
But does the GDPR contain a right to explanation? There are differing views on this ranging from the opinion that it contains no right to explanation (Wachter, Mittelstadt & Floridi, 2017) to the assertion that individuals affected by automated processing are entitled to an explanation (UK Information Commissioner, 2017).
The relevant parts of the GDPR are Articles 13–15 and Article 22. Both Articles 13 and 15 state that in the case of automated decision-making the data controller must make the data subject aware, at the point of data collection, of
the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.
While Article 22.1 states
The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
So far, so simple, right? Well, maybe not.
Firstly, Article 22.2 states that 22.1 above does not apply where the decision is necessary for entering into or performing a contract, where the decision is authorised by EU or Member State law, or where the subject has given his or her explicit consent.
Secondly, one needs to consider what an explanation might mean and when it would be given.
An explanation can be given in terms of the general functionality of the system used to make the decision or in terms of the specifics of how a decision was arrived at in the case of a particular data subject. The first of these is likely easier to provide than the second.
An explanation can also be given either before or after decision-making has taken place. But note that if it is given before decision-making has taken place, the explanation can really only be in general terms of model functionality.
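The distinction between the two kinds of explanation can be made concrete with a toy scoring model. The feature names and weights here are purely illustrative; the point is that one function needs no subject data at all, while the other can only run once a particular subject's data has been processed.

```python
# Illustrative weights of a toy linear scoring model.
WEIGHTS = {"income": 0.5, "years_employed": 0.3, "debt_ratio": -0.8}

def general_explanation():
    # System-level explanation, available *before* any decision is made:
    # describes model functionality, e.g. features ranked by influence.
    return sorted(WEIGHTS, key=lambda f: abs(WEIGHTS[f]), reverse=True)

def specific_explanation(subject):
    # Subject-level explanation, only available *after* the subject's data
    # has been processed: each feature's contribution to this decision.
    return {f: WEIGHTS[f] * subject[f] for f in WEIGHTS}

print(general_explanation())
print(specific_explanation({"income": 1.2, "years_employed": 0.5, "debt_ratio": 0.9}))
```

A pre-decision explanation is necessarily of the first kind; only after processing can the second, subject-centric kind be produced.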
Thirdly, the language used is not always completely clear. For example, a person has the right to an explanation in the case of automated decision-making, but it is not completely clear exactly what 'automated' means. Would a minimal amount of human involvement negate the right to explanation?
In a good article, Andrew Burt considers the right to explanation in the GDPR in the context of the accompanying Recitals, which are essentially a guide to the legislation. He concludes that in the case of automated decision-making the GDPR provides that a data subject is entitled to enough information about the automated process to make an informed choice about whether or not to opt in. This would indicate that the right to the second type of explanation mentioned above (the subject-centric one) is more aspirational than operational in the GDPR. Nonetheless, given the lack of clarity in the legislation itself, it may ultimately be left to the courts to decide how the right to explanation in the GDPR should be implemented.