Imagine this: you’ve been convicted of a non-violent crime, say petty theft. Your legal team decides the best course of action is to take a plea deal. On the day of your sentencing, the judge rejects the deal and doubles your sentence. Why? An algorithm says you are at high risk of committing a violent crime in the future…
You may be reading this and thinking: that can’t possibly be real. But it is an all-too-real scenario because of the COMPAS algorithm.
COMPAS, an acronym for Correctional Offender Management Profiling for Alternative Sanctions, is a case management and decision support tool used by U.S. courts to assess the likelihood of a defendant becoming a repeat offender.
The problem with COMPAS? As a ProPublica report states, “Only 20 percent of the people predicted to commit violent crimes actually went on to do so.” ProPublica also concluded that the algorithm was twice as likely to falsely flag black defendants as future criminals as it was to falsely flag white defendants. And therein lies the problem: the algorithm’s training data is inherently biased by years of human bias in the courtroom.
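To make those two numbers concrete, here is a minimal sketch in Python of the metrics at play. The dataset is tiny and entirely made up for illustration; it is not ProPublica’s data, and the field names are hypothetical.

```python
# Illustrative sketch only: tiny made-up dataset, NOT ProPublica's actual data.
# Each record: (predicted_high_risk, actually_reoffended, group)
records = [
    (True,  False, "A"), (True,  True,  "A"), (True,  False, "A"),
    (False, False, "A"), (True,  False, "B"), (False, False, "B"),
    (False, True,  "B"), (False, False, "B"),
]

# Precision: of those flagged as high risk, how many actually reoffended?
# (ProPublica's "only 20 percent" figure is a statement about this kind of rate.)
flagged = [r for r in records if r[0]]
precision = sum(r[1] for r in flagged) / len(flagged)
print(f"precision of 'high risk' flag: {precision:.0%}")

# False positive rate per group: of those who did NOT reoffend,
# how many were still flagged as high risk? A large gap between groups
# is the kind of disparity ProPublica reported.
for group in ("A", "B"):
    non_reoffenders = [r for r in records if r[2] == group and not r[1]]
    fpr = sum(r[0] for r in non_reoffenders) / len(non_reoffenders)
    print(f"group {group} false positive rate: {fpr:.0%}")
```

On this toy data the flag is right only 25 percent of the time, and group A’s false positive rate is double group B’s; that is the shape of the disparity being described, not a reproduction of the real figures.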
COMPAS is not only racially biased; it also shows bias along age and gender lines. An independent study by researchers at Cornell University and Microsoft found that, because most of the training data for COMPAS came from male offenders, the model does not capture differences between male and female recidivism patterns as well as it could. The researchers even built a separate model aimed specifically at recidivism risk prediction for women.
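A first sanity check on any training set is simply to measure how the groups are represented. A minimal sketch, again on hypothetical records rather than the real COMPAS training data:

```python
from collections import Counter

# Hypothetical training records: (features, reoffended, gender).
# These are invented for illustration, not the real COMPAS training data.
training_data = [
    ({"age": 24, "priors": 3}, True,  "male"),
    ({"age": 31, "priors": 0}, False, "male"),
    ({"age": 27, "priors": 1}, True,  "male"),
    ({"age": 45, "priors": 2}, False, "female"),
]

counts = Counter(gender for _, _, gender in training_data)
total = sum(counts.values())
for gender, n in counts.items():
    print(f"{gender}: {n} records ({n / total:.0%} of training set)")
# A heavily skewed split like this is one reason the researchers trained a
# separate model for predicting recidivism risk among women.
```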
But why would COMPAS separate the data based solely on gender when it has also been shown to have racial bias? And why are judicial systems still turning to private, for-profit companies whose algorithms are known to encode racial, age, and gender bias?
Turning to these types of algorithms has long-standing implications for human life and our judicial system. Defendants sentenced in the early days of algorithmic adoption should not be test subjects or guinea pigs for faulty and biased algorithms. As artificial intelligence becomes more mainstream, understanding the data sets and training methodologies is key to understanding the results. How is data bias affecting your daily life?
For more information on COMPAS, see ProPublica’s full report.