We Are Not Ready for a Robo-Judge

September 30, 2017

Photo Source: Wikimedia Commons

      The phrase “as sober as a judge” reflects the high expectations we place on judges and their responsibility to apply the law objectively and evenly to everyone in their court. Judges bear a heavy burden: they dispense the law and render not just judgment but sentencing as well. Given the awesome responsibility placed on judges and the pivotal role they play in the lives of the plaintiffs and defendants who stand before them, it would seem very unlikely that we would hand the role of judge over to a computer. Artificial intelligence, however, has already begun to enter the judge’s chambers.

      One system being sold to courts, COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), was developed by Northpointe Inc. to evaluate the risk that a criminal will commit another crime, in other words, their likelihood of recidivism (COMPAS). Combining artificial intelligence with information gathered from the subject in an interview and additional data such as criminal records, health history, history of violence, gender, race, and more, COMPAS produces one estimate of whether a defendant is likely to commit another crime and a separate estimate of whether the defendant is likely to commit a violent crime. Courts and prisons use COMPAS to help guide sentencing and inmate management.

      Another system, Modria, marketed by Tyler Technologies, functions as an online dispute resolution platform (Tyler). Modria was founded by Colin Rule in 2011. Before starting Modria, Rule led the development of eBay’s automated dispute resolution system, which reportedly resolves over 60 million disputes a year for eBay clients. Modria and systems like it guide disputants through a series of questions and, based on calculations, propose resolutions both parties can agree on. If no resolution can be found, human adjudicators step in.

      A third example of computer-based resolution was developed in the United Kingdom by a student named Joshua Browder. Browder built an application that helps people dispute parking tickets. The chatbot, called DoNotPay, has been dubbed the “World’s First Robot Lawyer” (The Guardian). The system reportedly uses artificial intelligence to drive a dispute resolution algorithm that has helped over 160,000 people overturn their parking tickets, recouping over $4 million. Proponents of the free service laud the application of AI to resolving parking tickets without having to take time off to go to court or hire an expensive lawyer.

 

      The acceptance of artificial intelligence in the legal system has not been completely smooth. COMPAS, for example, has been in the news for several reasons. ProPublica, a nonprofit news organization based in New York, published an evaluation of COMPAS concluding that its risk assessment is racially biased. In the article “How We Analyzed the COMPAS Recidivism Algorithm,” published in May 2016, the authors describe how they compared the actual recidivism of over 10,000 criminal defendants from Broward County, Florida, with their predicted risk of recidivism (ProPublica). They found that COMPAS accurately predicted recidivism 59% of the time for white defendants and 63% of the time for black defendants, both only slightly better odds than a coin flip. Moreover, the investigation concluded that COMPAS is biased against black defendants: in the COMPAS system, black defendants were twice as likely to be misclassified as a higher risk of reoffending. The authors’ conclusions suggest that the algorithm lacks the impartiality the legal system requires.

      COMPAS also featured in State v. Loomis, heard by the Wisconsin Supreme Court. The defendant, Eric Loomis, pled guilty to charges connected with a drive-by shooting. At sentencing, Loomis was characterized as a “high risk to the community” (SCOTUSblog), an assessment derived in part from information provided by COMPAS. Loomis argued that because the algorithm used in his sentencing is proprietary and could not be fully examined, he was denied the due process guaranteed under the US Constitution, and that his sentence should therefore be set aside. The Wisconsin Supreme Court determined that the same sentence would have been handed down even without the COMPAS information. Loomis appealed to the Supreme Court of the United States, which denied the request to hear the case but noted that a “sentencing court’s use of actuarial risk assessments raises novel constitutional questions that may merit this Court’s attention in a future case.” In other words, the Court expects that computer-aided sentencing will eventually make its way to the Supreme Court for evaluation.
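      To see how similar overall accuracy can mask this kind of disparity, consider a small worked example. The Python sketch below uses hypothetical confusion-matrix counts, not ProPublica’s actual data or Northpointe’s algorithm, to show how two groups can be scored with roughly the same accuracy while one group’s false-positive rate, the share of non-reoffenders wrongly labeled high risk, is twice the other’s.

```python
# A minimal, illustrative sketch with hypothetical counts -- not
# ProPublica's actual data or Northpointe's algorithm.

def rates(tp, fp, tn, fn):
    """Return overall accuracy and false-positive rate from
    confusion-matrix counts (tp = correctly flagged reoffenders,
    fp = non-reoffenders wrongly flagged as high risk, etc.)."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    false_positive_rate = fp / (fp + tn)
    return accuracy, false_positive_rate

# Hypothetical counts per 1,000 defendants in each group.
acc_a, fpr_a = rates(tp=200, fp=110, tn=390, fn=300)
acc_b, fpr_b = rates(tp=300, fp=260, tn=330, fn=110)

print(f"Group A: accuracy {acc_a:.0%}, false-positive rate {fpr_a:.0%}")
print(f"Group B: accuracy {acc_b:.0%}, false-positive rate {fpr_b:.0%}")
# Group A: accuracy 59%, false-positive rate 22%
# Group B: accuracy 63%, false-positive rate 44%
```

      Run as written, the sketch prints similar accuracies (59% and 63%) alongside false-positive rates of 22% and 44%: the same shape of disparity ProPublica reported, invisible if you look at accuracy alone.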

 

      Judges accept the weighty responsibility of determining the guilt or innocence of defendants and of sentencing the guilty. Automation in dispute resolution and sentencing has already made its way into court with systems like COMPAS, Modria, and DoNotPay. Judges, of course, are not infallible. Take, for example, the now famous study “Extraneous Factors in Judicial Decisions,” in which the authors concluded that judges are far less likely to hand down a favorable ruling just before a break such as lunch, or at the end of the day, than at the start of a session or right after a break (PNAS). Judges may be fallible, but complex, proprietary computer programs should not replace them. Technological enhancements to legal decision-making need to be transparent to everyone to whom they are applied.

 
