Warning: Reform Bill Could Add To Disparities in Judicial Sentences (Northpointe)


I was researching why credible studies consistently show that minority defendants are far more likely to face incarceration, and to receive longer prison terms, than non-minority offenders for similar legal infractions, when I came across a widely used practice that explains part of this inequity. Courts have been relying on software that uses a defendant's answers to a series of questions to estimate the likelihood that he or she will become a repeat offender. The most popular such software is made by Northpointe.

You would hope that studies had been conducted to evaluate the effectiveness of this practice, so as to avoid adding another layer of unfair bias against minority suspects, but this was not to be. That gap matters because a landmark sentencing reform bill currently under consideration by the U.S. Congress would expand the use of these assessments.


ProPublica, the independent investigative newsroom, decided to take on this task by analyzing the issue itself. Its article on machine bias was published on May 23, 2016, by Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner, under the headline, “There’s software used across the country to predict future criminals. And it’s biased against blacks.” Here are some excerpts:

Analysis by ProPublica

(Example  1)

“The first time Paul Zilly heard of his score (based on software designed to predict future criminality)— and realized how much was riding on it — was during his sentencing hearing on Feb. 15, 2013, in court in Barron County, Wisconsin. Zilly had been convicted of stealing a push lawnmower and some tools. The prosecutor recommended a year in county jail and follow-up supervision that could help Zilly with “staying on the right path.” His lawyer agreed to a plea deal.”

“But Judge James Babler had seen Zilly’s scores. Northpointe’s software had rated Zilly as a high risk for future violent crime and a medium risk for general recidivism. “When I look at the risk assessment,” Babler said in court, “it is about as bad as it could be.”

“Then Babler overturned the plea deal that had been agreed on by the prosecution and defense and imposed two years in state prison and three years of supervision.”


(Example 2)

“However, Boessenecker, who trains other judges around the state (California) in evidence-based sentencing, cautions his colleagues that the score doesn’t necessarily reveal whether a person is dangerous or if they should go to prison.”

“A guy who has molested a small child every day for a year could still come out as a low risk because he probably has a job,” Boessenecker said. “Meanwhile, a drunk guy will look high risk because he’s homeless. These risk factors don’t tell you whether the guy ought to go to prison or not; the risk factors tell you more about what the probation conditions ought to be.”

“Scores (based on software) — known as risk assessments — are increasingly common in courtrooms across the nation. They are used to inform decisions about who can be set free at every stage of the criminal justice system, from assigning bond amounts — as is the case in Fort Lauderdale — to even more fundamental decisions about defendants’ freedom. In Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington and Wisconsin, the results of such assessments are given to judges during criminal sentencing.”

“Rating a defendant’s risk of future crime is often done in conjunction with an evaluation of a defendant’s rehabilitation needs. The Justice Department’s National Institute of Corrections now encourages the use of such combined assessments at every stage of the criminal justice process. And a landmark sentencing reform bill currently pending in Congress would mandate the use of such assessments in federal prisons.”

“In 2014, then U.S. Attorney General Eric Holder warned that the risk scores might be injecting bias into the courts. He called for the U.S. Sentencing Commission to study their use. “Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice,” he said, adding, “they may exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society.”


“The sentencing commission did not, however, launch a study of risk scores. So ProPublica did, as part of a larger examination of the powerful, largely hidden effect of algorithms in American life.”

“We obtained the risk scores assigned to more than 7,000 people arrested in Broward County, Florida, in 2013 and 2014 and checked to see how many were charged with new crimes over the next two years, the same benchmark used by the creators of the algorithm.”

“The score proved remarkably unreliable in forecasting violent crime: Only 20 percent of the people predicted to commit violent crimes actually went on to do so.”

“When a full range of crimes were taken into account — including misdemeanors such as driving with an expired license — the algorithm was somewhat more accurate than a coin flip. Of those deemed likely to re-offend, 61 percent were arrested for any subsequent crimes within two years.”
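The benchmark ProPublica describes amounts to comparing the tool's predictions against two-year re-arrest outcomes. The sketch below (with invented numbers, not ProPublica's data) shows the two quantities at issue: the share of people deemed likely to re-offend who actually did, and overall agreement, the figure compared against a 50 percent coin flip.

```python
# Hypothetical illustration (invented data, not ProPublica's dataset or code)
# of the benchmark described above: compare predictions of re-offending
# against whether each person was charged with a new crime within two years.
predictions = [True, True, True, False, False, True, False, True, False, True]
reoffended  = [True, False, True, False, True, True, False, False, False, True]

# Among those deemed likely to re-offend, what share actually were re-arrested?
flagged_outcomes = [o for p, o in zip(predictions, reoffended) if p]
hit_rate = sum(flagged_outcomes) / len(flagged_outcomes)

# Overall agreement between prediction and outcome — the number that gets
# compared against a coin flip's 50 percent.
accuracy = sum(p == o for p, o in zip(predictions, reoffended)) / len(predictions)
print(hit_rate, accuracy)
```

With real data the lists would hold thousands of records, but the arithmetic is the same.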


“We also turned up significant racial disparities, just as Holder feared. In forecasting who would re-offend, the algorithm made mistakes with black and white defendants at roughly the same rate but in very different ways.”

  • “The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.”
  • “White defendants were mislabeled as low risk more often than black defendants.”
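The disparity in the two bullet points above is a difference in error rates by group: how often non-re-offenders were wrongly flagged high risk, and how often re-offenders were wrongly labeled low risk. A minimal sketch of how such a false-positive-rate gap is measured, using invented records (not ProPublica's data):

```python
# Hypothetical records (invented for illustration): each tuple is
# (risk_label, re_offended_within_two_years, group).
records = [
    ("high", False, "A"), ("high", False, "A"), ("high", True, "A"),
    ("low",  False, "A"), ("low",  True,  "A"),
    ("high", True,  "B"), ("high", False, "B"),
    ("low",  False, "B"), ("low",  False, "B"), ("low", True, "B"),
]

def false_positive_rate(rows):
    """Share of people who did NOT re-offend but were labeled high risk."""
    non_reoffenders = [r for r in rows if not r[1]]
    falsely_flagged = [r for r in non_reoffenders if r[0] == "high"]
    return len(falsely_flagged) / len(non_reoffenders)

for group in ("A", "B"):
    rows = [r for r in records if r[2] == group]
    print(group, round(false_positive_rate(rows), 2))
```

In these made-up records group A's false positive rate comes out twice group B's, mirroring the shape (though not the actual figures) of the disparity ProPublica reports.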

“Could this disparity be explained by defendants’ prior crimes or the type of crimes they were arrested for? No. We ran a statistical test that isolated the effect of race from criminal history and recidivism, as well as from defendants’ age and gender. Black defendants were still 77 percent more likely to be pegged as at higher risk of committing a future violent crime and 45 percent more likely to be predicted to commit a future crime of any kind. (Read our analysis.)”


“The algorithm used to create the Florida risk scores is a product of a for-profit company, Northpointe. The company disputes our analysis.”

“In a letter, it criticized ProPublica’s methodology and defended the accuracy of its test: “Northpointe does not agree that the results of your analysis, or the claims being made based upon that analysis, are correct or that they accurately reflect the outcomes from the application of the model.”

“Northpointe’s software is among the most widely used assessment tools. The company does not publicly disclose the calculations used to arrive at defendants’ risk scores, so it is not possible for either defendants or the public to see what might be driving the disparity. (On 5/22/2016, Northpointe gave ProPublica the basics of its future-crime formula — which includes factors such as education, and whether a defendant has a job. It did not share the specific calculations, which it said are proprietary.)”


“Northpointe’s core product is a set of scores derived from 137 questions that are either answered by defendants or pulled from criminal records. Race is not one of the questions. The survey asks defendants such things as: “Was one of your parents ever sent to jail or prison?” “How many of your friends are taking drugs illegally?” and “How often did you get in fights while at school?” The questionnaire also asks people to agree or disagree with statements such as “A hungry person has a right to steal” and “If people make me angry or lose my temper, I can be dangerous.”

“The appeal of risk scores is obvious: The United States locks up far more people than any other country, a disproportionate number of them black. For more than two centuries, the key decisions in the legal process, from pretrial release to sentencing to parole, have been in the hands of human beings guided by their instincts and personal biases.”

“If computers could accurately predict which defendants were likely to commit new crimes, the criminal justice system could be fairer and more selective about who is incarcerated and for how long. The trick, of course, is to make sure the computer gets it right. If it’s wrong in one direction, a dangerous criminal could go free. If it’s wrong in another direction, it could result in someone unfairly receiving a harsher sentence or waiting longer for parole than is appropriate.”

6 comments

  1. What makes this truly fascinating is that we can’t impose mandatory treatment on someone who is floridly psychotic because it violates their right to refuse treatment. In other words, we use software with a built in class bias to incarcerate the poor (and mentally ill) in prison but we can’t use objective medical criteria to impose treatment in a hospital on someone who is demonstrably ill with a mental illness.

    • Dear Rob,

This just adds another layer to the judicial unfairness and bias against minority defendants that already exists. It continues the current trend of Black defendants being incarcerated at almost six times the rate of White suspects, which packs the jails.

      What I don’t know is if a defendant can refuse to take this assessment.

      Something needs to be done to end this practice. Voting against the GOP party is a start.

      Thanks a million for your comments and for the reblog, Gronda

• It was the GOP that privatized the prison system. It was the GOP that cut funding for the public defenders office and legal aid. It was the GOP that closed all public programs designed to help people to learn new skills and work their way out of poverty. Clearly, voting the GOP out of power won’t end a system that turns the poor into dollars for greedy men and women. The government must be used to re-allocate public money to services that assist rather than use people.
