Changes Proposed to Bail System That Jails Poor Defendants

U.S. News

Lawmakers and advocates across the political spectrum are joining together to propose changes to Michigan's cash bail system, which is under criticism for allowing low-income defendants to be jailed when they cannot afford to post bond.

Legal AI is still biased in 2019


In October 2017, we published an article on how legal Artificial Intelligence systems had turned out to be as biased as we are. One of the cases that made headlines was the COMPAS system, risk assessment software used to predict the likelihood of somebody being a repeat offender. It turned out the system had a double racial bias: one in favour of white defendants and one against black defendants. To this day, the problems persist, and other cases have since come to light.

An Algorithm Is Helping Reform The Criminal Justice System In New Jersey

International Business Times

New Jersey is using a new algorithm to help spark criminal justice reform in the state by removing bias from the bail system, replacing the cash bail system the state previously used. The new risk-based system assesses the potential threat a defendant poses and how likely they are to flee before their court date, in combination with their prior convictions, to determine the terms of release. Previously, a judge would set a bail amount based on the crime, which left poor defendants who were unable to post bail on minor charges sitting in jail while awaiting trial. Twelve percent of defendants were unable to post a bail of $2,500 or less, and more than two-thirds of those defendants were minorities, according to a fact sheet from the state of New Jersey. Not only were poor defendants left in jail, but some dangerous defendants who could afford bail were released; the algorithm is designed to prevent both scenarios.
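The article does not spell out the scoring details, but a point-based pretrial assessment of the kind described can be sketched roughly as follows. The factor names, weights, and cutoffs below are illustrative assumptions, not New Jersey's actual model:

```python
# Hypothetical sketch of a point-based pretrial risk assessment.
# Factors, weights, and cutoffs are invented for illustration only.

def risk_score(prior_convictions, prior_failures_to_appear, violent_charge):
    """Combine factors into a 0-6 score; higher means greater assessed risk."""
    score = 0
    score += min(prior_convictions, 2)          # prior-record contribution, capped
    score += min(prior_failures_to_appear, 2)   # flight-risk signal, capped
    score += 2 if violent_charge else 0         # threat-to-public signal
    return score

def release_decision(score):
    """Map the score to a recommendation instead of a cash bail amount."""
    if score <= 1:
        return "release on recognizance"
    if score <= 3:
        return "release with monitoring"
    return "detain pending hearing"

print(release_decision(risk_score(0, 0, False)))  # release on recognizance
print(release_decision(risk_score(3, 2, True)))   # detain pending hearing
```

The key design point is that the output is a release recommendation rather than a dollar figure, so the decision no longer depends on what the defendant can afford.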

Is your Machine Learning Model Biased? – Towards Data Science


ProPublica's analysis of the COMPAS tool found that black defendants were far more likely than white defendants to be incorrectly judged to be at a higher risk of recidivism, while white defendants were more likely than black defendants to be incorrectly flagged as low risk. Northpointe, the company behind the tool, responded that the model was not unfair because its overall performance was similar for white and black defendants. The table above presents the results from each model for the outcome of any arrest for African American and white men. The AUCs for African American men range from .64 to .73, while for white men they range from .69 to .75. Northpointe hence concluded that since the AUC results for white men are quite similar to the results for African American men, their algorithm is completely fair.
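The crux of the disagreement is that two groups can have identical AUCs while the tool still makes different kinds of errors for each. A minimal sketch with made-up scores (not COMPAS data) shows equal AUCs coexisting with very different false positive rates at the same cutoff:

```python
def auc(scores, labels):
    """AUC = probability a random positive scores above a random negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def false_positive_rate(scores, labels, threshold):
    """Share of true negatives flagged 'high risk' at the given cutoff."""
    neg = [s for s, y in zip(scores, labels) if y == 0]
    return sum(s >= threshold for s in neg) / len(neg)

# Toy risk scores (1-10) for two hypothetical groups; label 1 = rearrested.
group_a = ([1, 2, 3, 6, 4, 5, 7, 8],  [0, 0, 0, 0, 1, 1, 1, 1])
group_b = ([3, 4, 5, 8, 6, 7, 9, 10], [0, 0, 0, 0, 1, 1, 1, 1])

print(auc(*group_a), auc(*group_b))        # identical AUCs: 0.875 and 0.875
print(false_positive_rate(*group_a, 5))    # 0.25 of non-reoffenders flagged
print(false_positive_rate(*group_b, 5))    # 0.5 of non-reoffenders flagged
```

In this sketch both groups get the same AUC, yet twice as many non-reoffenders in group B are flagged high risk, which is the shape of the disparity ProPublica reported.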

Change of plea hearing set for third Camp Minden defendant

FOX News

SHREVEPORT, La. – A change of plea hearing is scheduled Monday for a third defendant in a case involving an industrial explosion at a site leased from the Louisiana National Guard. Kenneth Lampkin was program manager at Explo Systems, which abandoned 7,800 tons (7,100 metric tons) of potentially explosive artillery propellant at Camp Minden when it went bankrupt in 2013. He has pleaded not guilty to 29 counts of conspiracy, false statements and wire fraud. His change of plea would leave an owner from Tennessee and two officials to go on trial June 4. An owner from Kentucky and the company's inventory control officer have pleaded guilty to reduced charges.