Evidence reveals risk assessment algorithms show bias against Hispanic defendants

8th August 2019

Automated risk assessment has become increasingly popular in the criminal justice system, but a new study – published in the American Criminal Law Review – assessed the accuracy, validity and predictive ability of one risk assessment tool and found evidence of algorithmic unfairness against Hispanic defendants.

Risk assessment can be an objective way to reduce rates of imprisonment without jeopardising public safety, and criminal justice officials are increasingly reliant on algorithmic processing to inform decisions on managing offenders according to their risk profiles. However, there is alarming evidence that risk algorithms can be biased against minority groups.

Dr Melissa Hamilton, Reader in Law and Criminal Justice at the University of Surrey, used a large dataset of pre-trial defendants who were scored on COMPAS – a widely used algorithmic risk assessment tool – soon after their arrests, to evaluate the tool's impact on Hispanic defendants specifically.

Dr Hamilton said: “There is a misconception that algorithmic risk assessment tools developed using big data automatically represent a transparent, consistent and logical method for classifying offenders. My research suggests that risk tools can deliver unequal results for minority groups if they fail to account for cultural differences. Bias can occur when risk tools are normed largely on one group – for example, White samples – because they then produce inaccurate predictions for other groups.

“Cumulative evidence showed that COMPAS consistently exhibited unfair and biased algorithmic results for those of Hispanic ethnicity, with statistics demonstrating both differential validity and differential predictive ability. The tool fails to predict actual outcomes accurately, overpredicting the risk of reoffending for Hispanic pre-trial defendants.”
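To make these two notions concrete, the sketch below illustrates how differential validity and differential prediction might be checked for a decile-style risk score. It uses simulated data with hypothetical column names (ethnicity, score, reoffended); it is not the study's own analysis or the COMPAS data.

```python
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical dataset: a COMPAS-style decile score (1-10), a binary
# reoffence outcome and an ethnicity label. None of this is real data.
n = 5_000
df = pd.DataFrame({
    "ethnicity": rng.choice(["White", "Hispanic"], size=n),
    "score": rng.integers(1, 11, size=n),
})

# Simulate an outcome that tracks the score more weakly for one group,
# mimicking the kind of disparity the study describes.
slope = np.where(df["ethnicity"] == "White", 0.08, 0.04)
df["reoffended"] = rng.random(n) < 0.05 + slope * df["score"]

for group, sub in df.groupby("ethnicity"):
    # Differential validity: does the score rank reoffenders above
    # non-reoffenders equally well in each group? (AUC)
    auc = roc_auc_score(sub["reoffended"], sub["score"])

    # Differential prediction: among defendants the tool labels high
    # risk (deciles 8-10), what share actually reoffended in each group?
    high_risk = sub[sub["score"] >= 8]
    print(f"{group:9s} AUC={auc:.3f}  "
          f"observed rate among high-risk={high_risk['reoffended'].mean():.3f}")
```

In a check of this kind, a markedly lower observed reoffence rate among high-risk defendants in one group would indicate that the tool overpredicts risk for that group – the pattern the study reports for Hispanic pre-trial defendants.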

Despite impressive advances in behavioural science, big data and statistical modelling, justice officials should be aware that greater care is needed to ensure that proper validation studies are conducted before an algorithmic risk tool is used, to confirm that it is fair for its intended population and subpopulations.


Hamilton, Melissa (2019). ‘The Biased Algorithm: Evidence of Disparate Impact on Hispanics’, American Criminal Law Review 56(4), Georgetown University Law Center.