Stergios is a DPhil candidate in Socio-Legal Studies and a member of St Cross College, supervised by Professors Jane Kaye and Bettina Lange. His research, funded by the ESRC and the Onassis Foundation, theorises data sharing regulation in the British public sector, particularly with regard to disclosing Government-owned administrative data for research purposes. Working at the intersections of data sharing law, administrative justice and regulatory theory, this work assesses stark differences in the behaviour of public-sector data custodians in the United Kingdom.
He has published in the fields of socio-legal studies and regulation theory in such journals as the Journal of Social Welfare & Family Law and The Modern Law Review, as well as on the interplay between data protection and human rights law in such journals as the European Data Protection Law Review, the Computer Law & Security Review and International Data Privacy Law.
Stergios has served as a Graduate Teaching Assistant in Human Rights Law at the University of Oxford. He completed an MSt in Socio-Legal Research (with distinction) and an MJur at the University of Oxford, and an LLB and an LLM in Criminal Law (both with distinction) at the Aristotle University of Thessaloniki, Greece.
Will Brexit diminish digital rights protection in the UK, or are domestic institutions better placed to deliver such protection unencumbered by the oversight of EU institutions? This article scrutinises the validity of conflicting arguments about the future of human rights protection in the UK by reference to a paradigmatically ‘European’ digital right, the right to be forgotten (RTBF). Having considered the interplay between the multiple layers of UK law that an RTBF claim involves, the article argues that some legal implications of Brexit will have a graver impact on digital rights protection than others. In respect of EU law no longer being supreme in the UK, the analysis offered here calls for more nuance in critical arguments about losing fundamental protections when it comes to the RTBF. Brexit, however, will erode the protection of the RTBF in the longer term as a result of the loss of EU law’s direct effect. The scope of the ‘British RTBF’ will gradually develop more narrowly than in EU Member States due to fundamental differences between the UK and European conceptions of privacy. The central place of the data subject’s ‘reasonable expectations’ within the UK privacy conception, it is argued, sits at odds with social realities related to the RTBF and thus raises significant risks for the robust protection of the right in the future.

DOI: 10.21552/edpl/2019/1/8

The EU General Data Protection Regulation (‘GDPR’) seeks to balance the public interest in research with the privacy rights of individuals, in particular through research exemptions and safeguards set out in Article 89. While this affords Member States limited opportunities to modify the application of the GDPR at a national level, including for data processing that is necessary for the performance of a task carried out in the public interest, national approaches must conform with Article 89 safeguards where appropriate.
One development of interest to the research community in the UK is a statutory power for public authorities to disclose administrative data for research under the Digital Economy Act 2017 (DEA). This article uses the DEA as a case study for analysis of the GDPR provisions governing the processing of data for research purposes—including de-identification—and draws on human rights norms and jurisprudence to interpret the broad requirement for ‘appropriate safeguards’ for the ‘rights and freedoms of the data subject’ under Article 89. This analysis is important for data controllers seeking to meet their obligations under the UK framework and for those in other EU Member States considering the development of similar national provisions for data processing for research purposes.

DOI: 10.1080/09649069.2019.1663024

In Nobody’s Law (2018), Marc Hertogh introduced the notion of legal alienation as part of a ‘secular’ approach to legal consciousness, i.e. one that does not assume law’s hegemonic power in everyday life. This approach has been criticised, with critics suggesting that it does not refute claims about law’s hegemonic power but partly explains resistance to it. I argue here that critical discussion of legal alienation is hampered by the employment of different definitions of the ‘legal’ in legal consciousness studies: from legality as an ongoing social structure to positive/State law. Using the example of administrative justice studies, I demonstrate that this definitional divergence results in confusion about the role of law as a variable in legal consciousness research designs. Is the law what is to be explained, or does the law explain another outcome?
In the interest of achieving meaningful and clear analytical constructions of the ‘legal’ in this context, I argue that legal consciousness and legal alienation should be conceived as embodying two different conceptions: ‘identification/non-identification’ and ‘relevance/irrelevance to behaviours, processes or outcomes’. Acknowledging this conceptual distinction is integral to bridging the theoretical and methodological divide between social-scientific and legal approaches to legal consciousness research.

DOI: 10.1093/idpl/ipz010

The General Data Protection Regulation (GDPR) includes a new power for Member States to pass exemptions for the purpose of ‘academic expression’. This may appear to provide greater freedom to researchers working under the new EU data protection regime. Using the UK as a case study, however, it is evident that even a full exercise of the academic derogation is likely to be limited by the GDPR’s requirement of necessity, and by privacy rights wherever they are engaged. Ultimately, the GDPR provisions applicable to universities as public authorities are likely to have a greater impact on academic data processing in public institutions; a shift which is not conducive to greater freedom in research data processing.

DOI: 10.23889/ijpds.v4i1.1093

The UK government funded the Administrative Data Research Network (ADRN) with the explicit aim of making administrative data available for research; however, the legal framework for doing so is complex, and the basis for disclosing these data to third-party researchers is not straightforward. This paper critically analyses the legal framework for public authorities to disclose administrative data for the purposes of research, which will change significantly with the introduction of the General Data Protection Regulation (GDPR) and the UK Digital Economy Act 2017 (DEA).
Our practical assessment of the new statutory power under the DEA for public authorities to disclose non-identifiable administrative data for research purposes highlights the challenges that may remain for researchers requiring access to linked administrative, health and adult social care data. Our review of the existing regimes for research using linked data is therefore necessary and useful for researchers, public authorities and data protection advisers.

DOI: 10.1016/j.clsr.2018.01.002

There has naturally been a good deal of discussion of the forthcoming General Data Protection Regulation. One issue of interest to all data controllers, and of particular concern for researchers, is whether the GDPR expands the scope of personal data through the introduction of the term ‘pseudonymisation’ in Article 4(5). If all data which have been ‘pseudonymised’ in the conventional sense of the word (e.g. key-coded) are to be treated as personal data, this would have serious implications for research. Administrative data research, which is carried out on data routinely collected and held by public authorities, would be particularly affected, as the sharing of de-identified data could constitute the unconsented disclosure of identifiable information. We argue, however, that the definition of pseudonymisation in Article 4(5) GDPR will not expand the category of personal data, and that there is no intention that it should do so. The definition of pseudonymisation under the GDPR is not intended to determine whether data are personal data; indeed, it is clear that all data falling within this definition are personal data. Rather, it is Recital 26 and its requirement of a ‘means reasonably likely to be used’ which remains the relevant test as to whether data are personal. This leaves open the possibility that data which have been ‘pseudonymised’ in the conventional sense of key-coding can still be rendered anonymous.
There may also be circumstances in which data which have undergone pseudonymisation within one organisation could be anonymous for a third party. We explain how, with reference to the data environment factors set out in the UK Anonymisation Network's Anonymisation Decision-Making Framework.

A number of claims have been made for the Data Protection Bill, as it serves several purposes: modernisation, ensuring data flows post-Brexit, and exercising derogations under the GDPR to create a more ‘nationalised’ law. This comment discusses these claims and evaluates the progress of the Bill.