Courtrooms across the nation are using computer programs to predict who will be a future criminal. The programs help inform decisions on everything from bail to sentencing. They are meant to make the criminal justice system fairer — and to weed out human biases. ProPublica tested one such program and found that it’s often wrong — and biased.
What Algorithmic Injustice Looks Like In Real Life
A computer program rated defendants’ risk of committing a future crime. These are the results.