Artificial intelligence is everywhere these days, including in the criminal justice system. But a new report out Friday joins a chorus of voices warning that the software isn’t ready for the task.
“You need to understand as you’re deploying these tools that they’re extremely approximate, extremely inaccurate,” said Peter Eckersley, research director at Partnership on A.I., a consortium of Silicon Valley heavyweights and civil liberties groups that helped publish the report. “And that if you think of them as ‘Minority Report,’ you’ve gotten it entirely wrong,” he added, referencing the Steven Spielberg science fiction blockbuster from 2002 that has become shorthand for predictive policing.
The study, “Algorithmic Risk Assessment Tools in the U.S. Criminal Justice System,” scrutinizes how these tools are increasingly being used in courts across the country.
Algorithmic software crunches data about an individual along with statistics about groups that person belongs to. What level of education did this individual attain? How many criminal offenses did this individual commit before the age of 18? What is the likelihood of, say, skipping bail for individuals who never finished high school and committed two crimes before the age of 18?
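To make the mechanics concrete, here is a minimal sketch in Python of how a scoring model of this kind might combine such inputs into a probability. The feature names, weights, and logistic formula are illustrative assumptions for the sake of the example, not drawn from any specific tool described in the report:

```python
# Hypothetical sketch of an algorithmic risk score. The coefficients below
# are invented for illustration; a real tool would fit them to historical
# data about large groups of past defendants.

import math

def risk_of_skipping_bail(finished_high_school: bool, offenses_before_18: int) -> float:
    """Return an illustrative probability (0 to 1) that a defendant skips bail."""
    intercept = -1.5
    w_education = -0.8   # finishing high school lowers the modeled risk
    w_offenses = 0.6     # each offense before age 18 raises it

    score = (intercept
             + w_education * int(finished_high_school)
             + w_offenses * offenses_before_18)
    # Logistic function squashes the raw score into a probability.
    return 1.0 / (1.0 + math.exp(-score))

# Example: an individual who never finished high school and committed
# two offenses before the age of 18.
print(f"{risk_of_skipping_bail(False, 2):.0%}")  # prints "43%"
```

The point of the sketch is that the output for any one person is driven entirely by how people with similar group traits behaved in the past, which is exactly why critics call the results approximate.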