New Study Finds That a Supposedly Unbiased Computer Algorithm Makes Racially Biased Recommendations

Posted on : February 6, 2018

Did you know that judges don’t always make sentencing recommendations based on their experience alone? There are guidelines in place setting out a range of sentencing options available to judges, who are responsible for weighing all the facts of a case, including any aggravating factors, to determine the appropriate sentence. In many cases, judges also rely on software designed to give them insight after they input factors about the crime and the defendant. That software, however, may be biased.

COMPAS, a software tool that many judges use to inform their sentencing decisions, may not be as unbiased as it is intended to be, according to a recent study. Researchers analyzed the software and found that black defendants were classified as higher risk while white defendants were classified as lower risk, which affected the sentencing recommendations.

Even though explicit information about a defendant’s race was not included in the data, the outcome suggests that more lenient rehabilitation recommendations are typically associated with white defendants, while more rigorous programs and longer sentences are typically associated with black defendants who pose the same risk of committing a crime in the future. Statistically, the software’s results were indistinguishable from judgment calls made by human volunteers who were recruited at random over the internet. According to the researchers, the COMPAS algorithm may be influenced by arrest-rate data, and arrest rates in some counties and cities are skewed.
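To make that finding concrete, here is a minimal sketch, with hypothetical column names and not the study’s actual code, of how an audit can compare the tool’s error rates across racial groups even though race is never given to the model:

```python
import pandas as pd

def error_rates_by_group(df: pd.DataFrame) -> pd.DataFrame:
    """Compare error rates across groups (illustrative only, not COMPAS code).

    Expects hypothetical columns:
      'group'      - defendant's race, recorded for auditing only
      'high_risk'  - 1 if the tool labeled the defendant high risk, else 0
      'reoffended' - 1 if the defendant was actually re-arrested later, else 0
    """
    rows = []
    for group, cases in df.groupby("group"):
        no_new_crime = cases[cases["reoffended"] == 0]
        new_crime = cases[cases["reoffended"] == 1]
        rows.append({
            "group": group,
            # flagged as high risk but never re-offended
            "false_positive_rate": no_new_crime["high_risk"].mean(),
            # labeled low risk but did re-offend
            "false_negative_rate": 1 - new_crime["high_risk"].mean(),
        })
    return pd.DataFrame(rows)
```

A consistently higher false-positive rate for one group means the tool is flagging people in that group as high risk who never go on to re-offend, which is the kind of disparity the researchers described.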

The developer of the computer program, however, says that it relies on 137 separate data points when evaluating rehabilitation programs and opportunities, although only six of them are weighed when assessing a person’s risk of re-offending. A recent study, using data compiled by ProPublica, looked at the performance of this program in Broward County, Florida, across 2013 and 2014. The software was no better than the human participants in the study when it came to assessing the risk of someone re-offending. If you’ve been accused, you must act quickly to protect your rights.
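To show what “no better than humans” means in practice, here is a purely illustrative sketch (made-up field names and data, not the study’s actual method) of how accuracy can be compared between the tool’s labels and the volunteers’ guesses on the same cases:

```python
from dataclasses import dataclass

@dataclass
class Case:
    reoffended: bool           # what actually happened
    software_said_high: bool   # the tool's high-risk label
    humans_said_high: bool     # the volunteers' majority guess

def accuracy(cases: list[Case], use_software: bool) -> float:
    """Fraction of cases where the prediction matched the real outcome."""
    correct = 0
    for c in cases:
        prediction = c.software_said_high if use_software else c.humans_said_high
        if prediction == c.reoffended:
            correct += 1
    return correct / len(cases)

if __name__ == "__main__":
    # Tiny made-up example; the real study used thousands of Broward County cases.
    cases = [
        Case(reoffended=True,  software_said_high=True,  humans_said_high=True),
        Case(reoffended=False, software_said_high=True,  humans_said_high=False),
        Case(reoffended=False, software_said_high=False, humans_said_high=False),
        Case(reoffended=True,  software_said_high=False, humans_said_high=True),
    ]
    print("software accuracy:", accuracy(cases, use_software=True))
    print("human accuracy:   ", accuracy(cases, use_software=False))
```

On the real Broward County cases, the study found the software’s predictions were no more accurate than the volunteers’.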

Posted in : Administrator Attorney Leads
