Addressing Discrimination in Algorithmic Decision-Making

Event date: 17 November 2020
Event time: 12:30 - 13:45
Audience: Anyone
Venue: Zoom Webinar
Speaker(s): Sandra Wachter

Notes & Changes

Register here. Please note that this event will be recorded, with the exception of any live audience questions.

Fairness and discrimination in algorithmic systems are globally recognised as topics of critical importance. To date, most work has started from an American regulatory perspective defined by the notions of ‘disparate treatment’ and ‘disparate impact’. European legal notions of discrimination are not, however, equivalent. In this talk, Sandra will examine EU law and the jurisprudence of the European Court of Justice concerning non-discrimination. She will identify a critical incompatibility between European notions of discrimination and existing work on algorithmic and automated fairness. Algorithms are not similar to human decision-makers: they operate at speeds, scales, and levels of complexity that defy human understanding; they group and act upon classes of people that do not resemble historically protected groups; and they do so without potential victims ever being aware of the scope and effects of the decision-making. As a result, individuals may never know they have been disadvantaged and thus lack a starting point from which to raise a claim. A clear gap exists between statistical measures of fairness and the context-sensitive, often intuitive and ambiguous discrimination metrics and evidential requirements historically used by the Court.

The talk will focus on three contributions. First, she will review the evidential requirements for bringing a claim under EU non-discrimination law. Because algorithmic and human discrimination differ in nature, the EU’s current requirements are not fit to be automated. Second, she will show that automating fairness or non-discrimination in Europe may be impossible, because the law does not provide a static or homogeneous framework. Finally, she will propose a statistical test as a baseline for identifying and assessing potential cases of algorithmic discrimination in Europe; a simplified illustration of this kind of baseline statistic is sketched below. Adoption of this statistical test will help push forward academic and policy debates around scalable solutions for fairness and non-discrimination in automated systems in Europe.
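To make the idea of a statistical baseline concrete, here is a minimal, hypothetical sketch of one simple disparity measure: the gap in positive-outcome rates between a protected group and everyone else. The function name, the data, and the choice of statistic are all illustrative assumptions; the abstract does not specify the actual test the talk will propose.

```python
# A minimal, hypothetical sketch: simple demographic disparity, i.e. the
# difference in positive-outcome rates between a protected group and all
# other groups. This is NOT the specific test proposed in the talk, which
# the abstract does not detail; it only illustrates the kind of baseline
# statistic such a test could build on.

def demographic_disparity(outcomes, groups, protected):
    """Positive-outcome rate of the protected group minus that of all
    other groups: 0 means parity, negative means relative disadvantage."""
    prot = [o for o, g in zip(outcomes, groups) if g == protected]
    rest = [o for o, g in zip(outcomes, groups) if g != protected]
    rate = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return rate(prot) - rate(rest)

# Hypothetical loan-approval decisions: 1 = approved, 0 = denied.
outcomes = [1, 0, 0, 1, 1, 1, 0, 1]
groups = ["A", "A", "A", "B", "B", "B", "B", "B"]
print(demographic_disparity(outcomes, groups, protected="A"))
# Prints roughly -0.47: group "A" is approved markedly less often than "B".
```

As the abstract itself stresses, any such summary statistic would still need to be reconciled with the Court’s contextual, case-by-case approach to evidence; the sketch shows only the mechanical starting point.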

Professor Sandra Wachter is an Associate Professor and Senior Research Fellow in the Law and Ethics of AI, Big Data, and Robotics, as well as Internet Regulation, at the Oxford Internet Institute, University of Oxford. She specialises in technology, IP, data protection, and non-discrimination law, as well as European, international, (online) human rights, and medical law. Her current research focuses on the legal and ethical implications of AI, Big Data, and robotics, as well as profiling, inferential analytics, explainable AI, algorithmic bias, diversity and fairness, governmental surveillance, predictive policing, and human rights online. At the OII, she also coordinates the Governance of Emerging Technologies (GET) Research Programme, which investigates legal, ethical, and technical aspects of AI, machine learning, and other emerging technologies. Wachter is also a Fellow at the Alan Turing Institute in London, a Fellow of the World Economic Forum’s Global Futures Council on Values, Ethics and Innovation, a Faculty Associate at the Berkman Klein Center for Internet & Society at Harvard University, an Academic Affiliate at the Bonavero Institute of Human Rights at Oxford’s Law Faculty, a member of the European Commission’s Expert Group on Autonomous Cars, and a member of the Law Committee of the IEEE. Before joining the OII, Wachter studied at the University of Oxford and the Law Faculty of the University of Vienna, and worked at the Royal Academy of Engineering and the Austrian Ministry of Health.

Professor Martin Scheinin will be the respondent. He is a British Academy Global Professor at the Bonavero Institute of Human Rights, University of Oxford. He joins from the European University Institute, where he has been Professor of International Law and Human Rights since 2008. He is the author of numerous books and articles on international and European human rights law, international courts and tribunals, the law of treaties, and comparative constitutional law. He led the EU-funded research project SURVEILLE (2012-2015), which developed a multidisciplinary methodology for the holistic assessment of the security benefit, cost efficiency, moral harm, and human rights intrusion of a wide range of surveillance technologies, including in the context of the threat of terrorism. Besides his academic expertise, he brings to the Bonavero Institute long experience in the practice of human rights law, having served on the United Nations Human Rights Committee (1997-2004), as UN Special Rapporteur on human rights and counter-terrorism (2005-2011), and as a member of the Scientific Committee of the EU Fundamental Rights Agency (since 2018).

The discussion will be chaired by Oliver Butler. He is currently the Fellow in Law at Wadham College, covering a period of leave taken by Dr Tarun Khaitan, the College’s Tutorial Fellow in Law. Oliver studied law as an undergraduate at Emmanuel College, Cambridge, before completing his BCL at Lincoln College, Oxford, and his LLM at Harvard Law School. He was called to the Bar in 2013 and worked at the Law Commission as a research assistant before returning to Emmanuel for a PhD on information law. His research at the Bonavero Institute of Human Rights examines justifications for regulating public authorities differently from private actors in relation to privacy, confidentiality, and data protection.

Suggested Readings:

This event forms part of the repertoire of International Events Around the Moot for participants, judges, and coaches of the Price Media Law Moot Court Competition 2020/21. Please remember to register in order to access the event.
