Despite the recent adoption of the EU General Data Protection Regulation (EU GDPR), CANDID researchers found efforts in Europe to protect fundamental rights in ICT environments to be notably complex, with important shortcomings.
For the extended peer review process of CANDID Module 2 (on Risks, Rights and Engineering), two new procedures introduced by the GDPR were considered: Data Protection Impact Assessments (DPIAs) and Data Protection by Design and by Default (DPbD).
Researchers in the Social Sciences and Humanities, coordinated by Vrije Universiteit Brussel, developed insights into these procedures and, following the June 2017 Bergen consortium meeting, are looking closely at how organisations implement the GDPR provisions on Data Protection Impact Assessments and Data Protection by Design and by Default in their day-to-day operations.
Conclusions will only be available after full analysis of the data, but according to a spokesperson of the Interdisciplinary Research Group on Law, Science, Technology & Society at Vrije Universiteit Brussel, “The findings from the peer consultation point at the difficulties for most actors directly and indirectly involved in DPIA and DPbD processes, in understanding how to deal with the legal specificities and nuances inherent to fundamental rights protection”.
“The extended peer review exercise also raised issues about standardisation and governance, which are described by many peers as topical right now for considerations of GDPR implementation.”
New procedures for the protection of fundamental rights and freedoms in ICT environments
The EU General Data Protection Regulation (GDPR) is intended to protect fundamental rights and freedoms of individuals in contexts of technological development, and also to help achieve policy objectives linked to the digital single market.
The GDPR is thus designed to enable, rather than hinder, the movement of personal data within the EU internal market.
The Regulation introduces two main procedures in order to protect personal data:
- Data Protection Impact Assessments (DPIAs): operators and authorities that collect and use ("process") personal data are required to implement such assessment procedures any time their processing operations are likely to result in a high risk to the rights and freedoms of natural persons. This is a requirement under Article 35 of the Regulation.
- Data Protection by Design and by Default (DPbD): requires identifying ways of engineering and integrating safeguards for personal data into technology and its settings. This procedure is introduced by Article 25 of the Regulation.
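To make the "by default" idea concrete, the sketch below illustrates one common reading of Article 25: privacy-relevant settings start at their most protective values, and optional data fields are dropped unless the user has explicitly opted in. This is a minimal, hypothetical illustration only; the class and field names are invented for this example and are not drawn from the Regulation or from any CANDID material.

```python
from dataclasses import dataclass

# Hypothetical sketch of "data protection by default":
# every privacy-relevant setting starts at its most protective value,
# so data sharing requires an explicit opt-in rather than an opt-out.

@dataclass
class PrivacySettings:
    share_location: bool = False         # off unless the user opts in
    share_analytics: bool = False        # off unless the user opts in
    profile_visibility: str = "private"  # most restrictive visibility by default
    retention_days: int = 30             # shortest retention period by default

def collect_event(event: dict, settings: PrivacySettings) -> dict:
    """Data minimisation sketch: keep only the fields the settings permit."""
    allowed = {"event_type", "timestamp"}  # fields needed for the core service
    if settings.share_location:
        allowed.add("location")
    if settings.share_analytics:
        allowed.add("device_id")
    return {k: v for k, v in event.items() if k in allowed}

# With default settings, optional fields are stripped before storage:
defaults = PrivacySettings()
event = {"event_type": "login", "timestamp": "2017-09-01T10:00:00Z",
         "location": "Brussels", "device_id": "abc123"}
print(collect_event(event, defaults))
# → {'event_type': 'login', 'timestamp': '2017-09-01T10:00:00Z'}
```

The point of the sketch is the direction of the defaults: a user who does nothing ends up with the least data collected, and each additional disclosure is a deliberate configuration change rather than something to be switched off.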
Evaluating risks to rights: lessons from social studies on risk and technology assessment
Data Protection Impact Assessments (DPIAs) typically require that salient ‘risks’ to rights are identified at the earliest possible stages in the development of technologies. They also require the identification of measures to prevent potential infringements of privacy and other fundamental rights.
Likewise, the legal provisions for DPIAs in the EU General Data Protection Regulation introduce a risk-based approach to the protection of rights. Furthermore, the European Commission has supported the consolidation of two methodologies for the assessment of the fundamental rights to privacy and personal data protection that also include risk-to-rights evaluations: the DPIA Framework for RFID Technologies and the DPIA Template for Smart Grids.
In this respect, the CANDID Module 2 researchers highlighted that lessons can be learnt from influential studies on technology assessment which have already warned against centring the success of these assessments on the very notion of risk. Notions of risk, especially those elaborated by experts in closed circles, have often been found to be quite narrowly defined and tend to limit accommodation for public concerns.
In relation to this point, several peers who took part in the extended peer review reported major difficulties in defining the scope of protection of fundamental rights and freedoms, such as, for instance, the fundamental right to privacy that should be protected against risks. A few of them, especially in the engineering and business domains, admitted they have trouble understanding what should be protected rigorously enough to be translated into formal definitions, and they lamented a resulting lack of certainty for business activities.
A similar issue was thought to apply to associations with trans-individual values of privacy (e.g. effects on social groups and categories of people) affected by smart and surveillance technologies.
The views of rights holders in Data Protection Impact Assessments and Data Protection by Design and by Default
To prevent risks from being defined too narrowly, the Module 2 researchers suggest that Data Protection Impact Assessments should include the views of a broad range of actors, such as activists and individual rights holders. Such actors can often be overlooked on the grounds that risks have already been dealt with by specialist risk assessors and design processes. This is a concern voiced especially by actors in civil society organisations, but also by some representatives of Data Protection Authorities and by researchers in the fields of value sensitive design and user interaction design.
Multidisciplinary cooperation in Data Protection Impact Assessments and Design: what happens when engineers and software designers interact with legal and other practitioners?
A related aspect that the research considered was how to foster collaboration between professionals within organisations, such as risk managers, product designers and ICT developers, and other, wider groups of practitioners, such as human rights lawyers, ethicists and social scientists, in the actual ‘operationalisation’ of Data Protection Impact Assessments and Data Protection by Design and by Default.
This was thought essential for broadening how notions such as risks, rights and protections are defined in evaluation processes, as well as for aligning different working methods and vocabularies across disciplines.
The extended peer review tentatively shows that an interdisciplinary perspective favours a broad understanding of rights, and that such a perspective should be solicited in the course of data protection assessments and design exercises. However, the peer consultation also shows that multidisciplinary cooperation requires an authentic willingness for close collaboration and openness to other disciplines: media scholars understanding technical affordances, engineers willing to see beyond technology, lawyers grasping social behaviour, and others besides.
Current state of play of CANDID Module 2: a special focus on Data Protection by Design
The Module researchers are currently delving further into the findings.
An area of major interest is Data Protection by Design and by Default, where many open questions remain with respect to its actual operationalisation. There is so far little experience with the notion and application of data protection by design and by default. Various understandings exist around this notion, as well as suggestions on how best to make it work in concrete terms.
This is encouraging, as it shows the debate is lively and there is willingness and potential to improve current practices and the state of the art. At the same time, it also shows the difficulty of interpreting and dealing with the protection of fundamental rights and freedoms “in very practical terms” when the mission is that of “building” legally relevant safeguards into ICT technologies and systems.
Research developed in the context of Module 2 reveals that the various ways or styles of designing data protection and privacy principles into technology are often centred on organisational strategies and leave out key actors such as legal practitioners and publics.
The 26 September CANDID workshop in Barcelona will be an opportunity to further develop the discussion with peers on this and other topics of major importance and cross-module relevance.