Department of Public Health, University of Oxford, Richards Building, Old Road Campus, Headington, Oxford OX3 7LF
Stergios is a DPhil candidate in Socio-Legal Studies and a member of St Cross College, supervised by Professors Jane Kaye and Bettina Lange. His research, funded by the ESRC and the Onassis Foundation, proposes a theoretical model for understanding discretionary decision-making within the public sector. More specifically, working at the intersection of administrative justice and regulatory theory, it assesses stark differences in the behaviour of public bodies that are responsible for managing large, so-called administrative datasets in the United Kingdom.
Stergios is currently (2018-19) serving as a Graduate Teaching Assistant in Human Rights Law. He has completed studies in socio-legal research [MSt (dist.)] and law (MJur) at the University of Oxford, and in law at the Aristotle University of Thessaloniki, Greece [LLB & LLM in Criminal Law (both dist.)].
- DOI: https://doi.org/10.21552/edpl/2019/1/8

  The EU General Data Protection Regulation (‘GDPR’) seeks to balance the public interest in research with the privacy rights of individuals, in particular through the research exemptions and safeguards set out in Article 89. While this affords Member States limited opportunities to modify the application of the GDPR at a national level, including for data processing that is necessary for the performance of a task carried out in the public interest, national approaches must conform with Article 89 safeguards where appropriate. One development of interest to the research community in the UK is a statutory power for public authorities to disclose administrative data for research under the Digital Economy Act 2017 (DEA). This article uses the DEA as a case study for analysis of the GDPR provisions governing processing of data for research purposes, including de-identification, and draws on human rights norms and jurisprudence to interpret the broad requirement for ‘appropriate safeguards’ for the ‘rights and freedoms of the data subject’ under Article 89. This analysis is important for data controllers seeking to meet their obligations under the UK framework and for those in other EU Member States considering the development of similar national provisions for data processing for research purposes.

- DOI: https://doi.org/10.23889/ijpds.v4i1.1093

  The UK government funded the Administrative Data Research Network (ADRN) with the explicit aim of making administrative data available for research; however, the legal framework for doing so is complex and the basis for disclosing these data to third-party researchers is not straightforward. This paper critically analyses the legal framework for public authorities to disclose administrative data for the purposes of research, which will change significantly with the introduction of the General Data Protection Regulation (GDPR) and the UK Digital Economy Act 2017 (DEA). Our practical assessment of the new statutory power under the DEA for public authorities to disclose non-identifiable administrative data for research purposes highlights the challenges that may remain for researchers requiring access to linked administrative, health and adult social care data. Our review, in this paper, of the existing regimes for research using linked data is therefore necessary and useful for researchers, public authorities and data protection advisers.

- DOI: https://doi.org/10.1016/j.clsr.2018.01.002

  There has naturally been a good deal of discussion of the forthcoming General Data Protection Regulation. One issue of interest to all data controllers, and of particular concern for researchers, is whether the GDPR expands the scope of personal data through the introduction of the term ‘pseudonymisation’ in Article 4(5). If all data which have been ‘pseudonymised’ in the conventional sense of the word (e.g. key-coded) are to be treated as personal data, this would have serious implications for research. Administrative data research, which is carried out on data routinely collected and held by public authorities, would be particularly affected, as the sharing of de-identified data could constitute the unconsented disclosure of identifiable information. Instead, however, we argue that the definition of pseudonymisation in Article 4(5) GDPR will not expand the category of personal data, and that there is no intention that it should do so. The definition of pseudonymisation under the GDPR is not intended to determine whether data are personal data; indeed, it is clear that all data falling within this definition are personal data. Rather, it is Recital 26 and its requirement of a ‘means reasonably likely to be used’ which remains the relevant test as to whether data are personal. This leaves open the possibility that data which have been ‘pseudonymised’ in the conventional sense of key-coding can still be rendered anonymous. There may also be circumstances in which data which have undergone pseudonymisation within one organisation could be anonymous for a third party. We explain how, with reference to the data environment factors as set out in the UK Anonymisation Network's Anonymisation Decision-Making Framework.

- A number of claims have been made for the Data Protection Bill, as it serves a number of purposes: modernisation, ensuring data flows post-Brexit, and exercising derogations under the GDPR to create a more ‘nationalised’ law. This comment discusses them and evaluates the progress of the Bill.