Some judges in America have recently started using a closed-source algorithm that predicts how likely convicts are to commit another crime. Mosquito Bites shared an article by law professor Frank Pasquale raising concerns about the algorithms:
They may seem scientific, an injection of computational rationality into a criminal justice system riddled with discrimination and inefficiency. However, they are troubling for several reasons: many are secretly computed; they deny due process and intelligible explanations to defendants; and they promote a crabbed and inhumane vision of the role of punishment in society…
When an algorithmic scoring process is kept secret, it is impossible to challenge key aspects of it. How is the algorithm weighting different data points, and why? Each of these inquiries is crucial to two core legal principles: due process, and the ability to meaningfully appeal an adverse decision… A secret risk assessment algorithm that offers a damning score is analogous to evidence offered by an anonymous expert, whom one cannot cross-examine… Humans are in charge of governments, and can demand explanations for decisions in natural language, not computer code. Failing to do so in the criminal context risks ceding inherently governmental and legal functions to an unaccountable computational elite.
This issue will only grow in importance, Pasquale argues, because proprietary analytics software now also predicts "the chances that any given person will be mentally ill, a bad employee, a failing student, a criminal, or a terrorist."
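To make concrete what "weighting different data points" means, here is a minimal sketch of the kind of model a risk score could be built on. Every feature name, weight, and number below is invented for illustration; the actual model, inputs, and weights of a proprietary tool are exactly what secrecy hides from defendants.

```python
# Hypothetical illustration: a simple risk score is a weighted sum of
# data points passed through a squashing function (a logistic model).
# All feature names and weights here are made up for demonstration.
import math

# Invented weights. In a transparent system, these would be open to
# scrutiny: why does each factor count, and by how much?
WEIGHTS = {
    "prior_convictions": 0.6,
    "age_at_first_offense": -0.05,
    "employed": -0.4,
}
BIAS = -1.0

def risk_score(features):
    """Return a probability-like score in (0, 1)."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# A hypothetical defendant's data points.
defendant = {"prior_convictions": 3, "age_at_first_offense": 19, "employed": 1}
print(round(risk_score(defendant), 3))
```

With the weights visible, a defendant could contest any individual factor or its influence; with them hidden, the score arrives as an unchallengeable verdict, which is the core of Pasquale's due-process objection.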
Read more of this story at Slashdot.