Predictive algorithms are transforming decision-making in justice systems across several Western and non-Western jurisdictions. The algorithms are ostensibly neutral and non-discriminatory. Yet their data configurations and inherent design can recast the systemic and structural problems that disproportionately affect certain groups as predictors of risk. Criminological scholarship, particularly the nascent field of digital or computational criminology, has paid insufficient attention to these dynamics.
This paper addresses that dearth of criminological insight by setting out three main conduits of algorithmic bias and proposing remedial measures.
Dr Pamela Ugwudike is an Associate Professor of Criminology at the University of Southampton. She is particularly interested in interdisciplinary studies of the ethics and social implications of AI technologies applied in justice systems to predict crime risks and determine levels of criminal justice intervention.
Her research projects explore how problems relating to data provenance and mining shape algorithmic outputs (predictions) that inform understandings of risk and risk-focused penal governance. More broadly, she is interested in the ways in which AI technologies are experienced and are transforming knowledge about key aspects of social life, including crime and justice. Dr Ugwudike’s research projects generally reflect her interest in interdisciplinarity and bring together researchers from criminology, data science, mathematics, and software engineering.