Blackstone's Ratio and Black Boxes: How matters of justice are hidden in the configuration of algorithms

Event date
25 February 2020
Event time
12:30 - 13:45
Venue
Bonavero Institute of Human Rights - Gilly Leventis Meeting Room
Speaker(s)
Dr Reuben Binns

The steady march of algorithmic decision-making in the public sector is plain to see, with technologies like 'predictive analytics', 'machine learning', and 'artificial intelligence' recently procured for use in areas like health and social care, education, and tax. What are these technologies, and what implications might they have for the lawfulness of government decision-making? While it is common to see algorithms positively reported as having high accuracy, it is more useful to consider the ratio between different kinds of error. There are always trade-offs to be made between false positives (how many cases predicted ‘fraudulent’ were actually legitimate) and false negatives (how many cases predicted ‘legitimate’ were actually fraudulent), echoing Blackstone's ratio: ‘better that ten guilty persons escape, than that one innocent suffer’. The distribution of false positive and false negative errors may also differ between groups in ways which entrench structural inequalities and oppression based on race, gender, disability, or other protected characteristics. This could happen even if the system doesn’t explicitly include the characteristic as a feature, as many seemingly benign features (or combinations of them) may correlate with it. This talk will explain how these considerations are typically encoded in machine learning systems being deployed today.
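To make the false-positive/false-negative trade-off concrete, here is a minimal, purely illustrative Python sketch (not taken from the talk). It applies different decision thresholds to a toy fraud classifier and reports the resulting false positive and false negative rates, both overall and for two hypothetical groups; all names, data, and thresholds are invented for illustration.

```python
# Illustrative sketch: how a decision threshold trades off false positives
# against false negatives, and how error rates can differ between groups.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: model scores, true outcomes (1 = fraudulent), and a group label.
scores = rng.uniform(0, 1, size=1000)
labels = (rng.uniform(0, 1, size=1000) < scores).astype(int)  # toy ground truth
group = rng.integers(0, 2, size=1000)                         # two hypothetical groups

def error_rates(scores, labels, threshold):
    """False positive rate and false negative rate at a given threshold."""
    predicted = (scores >= threshold).astype(int)
    fp = np.sum((predicted == 1) & (labels == 0))  # flagged as fraud, actually legitimate
    fn = np.sum((predicted == 0) & (labels == 1))  # passed as legitimate, actually fraud
    fpr = fp / max(np.sum(labels == 0), 1)
    fnr = fn / max(np.sum(labels == 1), 1)
    return fpr, fnr

for threshold in (0.3, 0.5, 0.7):
    fpr, fnr = error_rates(scores, labels, threshold)
    print(f"threshold={threshold}: FPR={fpr:.2f}, FNR={fnr:.2f}")
    # The same threshold can produce different error rates for different groups.
    for g in (0, 1):
        mask = group == g
        g_fpr, g_fnr = error_rates(scores[mask], labels[mask], threshold)
        print(f"  group {g}: FPR={g_fpr:.2f}, FNR={g_fnr:.2f}")
```

Raising the threshold reduces false positives at the cost of more false negatives, and vice versa; the chosen operating point is effectively a policy decision about which error matters more, in the spirit of Blackstone's ratio.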

An audio recording of this event is available on SoundCloud.


In December 2018, Reuben began a two-year Research Fellowship at the Information Commissioner's Office, addressing AI/ML and data protection. He also continues to research and teach part-time at the Department of Computer Science at the University of Oxford, where he is currently funded by the EPSRC's PETRAS Internet of Things Research Hub project 'Respectful Things in Private Spaces'. Reuben's research interests include technical, legal and ethical aspects of privacy, machine learning, and decentralised systems. From 2011 to 2015 he undertook a PhD in Web Science at the University of Southampton. His recent work has focused on two strands: human factors of third-party tracking on web, mobile and Internet-of-Things devices; and transparency, fairness and accountability in profiling and machine learning.

Most of his recent papers can be found on Scholar.

