The Defendant and the Algorithm
Should we put algorithms in charge of our criminal justice system? Would an impartial AI make better decisions than our human, fallible judges when it comes to deciding legal issues? One of the news stories making the rounds recently claims that a computer algorithm would be a significant improvement over a judge's discretion when deciding which defendants are eligible for pre-trial release and which ones should stay in jail awaiting trial. From the article in the MIT Technology Review:
In a new study from the National Bureau of Economic Research, economists and computer scientists trained an algorithm to predict whether defendants were a flight risk from their rap sheet and court records using data from hundreds of thousands of cases in New York City. When tested on over a hundred thousand more cases that it hadn’t seen before, the algorithm proved better at predicting what defendants will do after release than judges.
Yet another article talks about how judges in New Jersey are using an algorithm to guide their decisions on granting defendants pre-trial release without bail.
The promise of AI: that a machine can suddenly remove the implicit bias of judges and make the justice system more just. Remove the decision-making power from humans; put the machine in charge. Now, salivating over the prospect of the terms "AI" and "machine learning," the techno-elite are aiming their disruptor beams at the criminal justice system.
This isn't to say the criminal justice system is operating well. Quite the contrary. Judges do have implicit bias. Minorities are policed and convicted at a much higher rate than non-minorities.
The idea of algorithms taking decision making out of judges' hands reminds me of a similar scheme from the 1980s, when states and the federal government began enacting mandatory minimum sentences. Judges, so the argument went, had too much discretion and were letting dangerous criminals go with light sentences because their emotions were getting in the way. Mandatory minimums were supposed to standardize punishment for crimes, reduce recidivism, and deter people from crime simply by being on the books. This has not worked as planned. The list of ills goes on. And on. And on.
It also reminds me of a scene from the movie Brazil, where a bug gets caught in a mechanical typewriter, causes a typo and thereby the death of someone whose name is remarkably close to that of an alleged terrorist.
Perhaps, then, our betters would proceed with a modicum of caution when considering the wonders of their new and improved standardized justice algorithm.
But there are more than philosophical problems with implementing algorithmic justice at the court level. Courts are strapped for funding. The computer systems courts use, even when up-to-date, are error-prone and faulty. Some of those errors even mean warrants are issued for the wrong people, people stay jailed longer than they should, and court staff and attorneys must work overtime to prevent the machine-created errors from causing more injustice.
Changing to an algorithm-driven justice system will mean adding new layers of complexity onto an already burdened system. Algorithms, after all, are only as good as the data that gets fed to them. In many jurisdictions the Odyssey system (with the problems mentioned above) handles all of the court's data, both internally and externally. Add to that the problem of different jurisdictions and agencies having their data in different silos, and our true and just algorithm is nothing more than a computer program operating on incomplete and erroneous data sets.
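The "garbage in, garbage out" point above can be made concrete with a toy sketch. The numbers and the scoring rule below are entirely hypothetical, not drawn from any real system: two groups have the same underlying behavior, but one group's records reflect heavier policing, and a model trained on those records dutifully scores that group as riskier.

```python
# Toy illustration with made-up data: a model trained on historical
# records inherits whatever bias those records contain. Groups "A" and
# "B" have the same true behavior, but group B was policed more
# heavily, so its *recorded* arrest counts are higher.

records = [
    # (group, recorded_arrests) -- hypothetical, not real data
    ("A", 1), ("A", 0), ("A", 0), ("A", 1),
    ("B", 2), ("B", 1), ("B", 1), ("B", 2),
]

def train_risk_model(data):
    """'Train' by averaging recorded arrests per group -- a crude
    stand-in for what a real model does with such features."""
    totals, counts = {}, {}
    for group, arrests in data:
        totals[group] = totals.get(group, 0) + arrests
        counts[group] = counts.get(group, 0) + 1
    return {g: totals[g] / counts[g] for g in totals}

risk = train_risk_model(records)
print(risk)  # group B scores higher purely because of heavier policing
```

The "model" here is just a per-group average, but the failure mode is the same one that afflicts far fancier systems: the algorithm can only see the recorded data, not the policing practices that produced it.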
Computer algorithms are not without their own inherent biases. A study by ProPublica found that a computer program used to determine a defendant's risk of re-offending was heavily biased against black people. Consider too the problem of who (or what) is the actual owner of the algorithm once it's in place. Does the actual computer code fall under public records laws so that it can be examined for bias? Or does it, like the Stingray cell phone monitoring device, fall under some kind of non-disclosure agreement that encourages the court to keep it a closely-guarded secret? Would the machine's inherent bias even be up for debate?
There is room in criminal justice for data analysis and algorithmic guidance. This site, after all, is built on the idea that algorithmic data analysis can help people facing the court system. But when it's the state imposing its might through a faceless algorithm, we have to take great care. The rules of disruption - move fast and break things - shouldn't apply when we're talking about someone's liberty.