Algorithms, Increasingly Controlling Lives, Lack Accountability
Algorithms are ubiquitous in our lives. They map out the best route to our destination and suggest new music based on what we already listen to. But they are also being used to inform fundamental decisions about our lives.
Companies use them to sort through stacks of résumés from job seekers. Credit agencies use them to determine our credit scores. And the criminal justice system is increasingly using algorithms to predict a defendant’s future criminality.
Those computer-generated criminal “risk scores” were at the center of a recent Wisconsin Supreme Court decision that set the first significant limits on the use of risk algorithms in sentencing.
The court ruled that while judges could use these risk scores, the scores could not be a “determinative” factor in whether a defendant was jailed or placed on probation. And, most important, the court stipulated that a presentence report submitted to the judge must include a warning about the limits of the algorithm’s accuracy.
This warning requirement is an important milestone in the debate over how our data-driven society should hold decision-making software accountable. But advocates for big data due process argue that much more must be done to ensure the appropriateness and accuracy of algorithmic results.