Legal AI is still biased in 2019
In October 2017, we published an article on how legal Artificial Intelligence systems had turned out to be as biased as we are. One of the cases that had made headlines was the COMPAS system, risk assessment software used to predict the likelihood of somebody being a repeat offender. It turned out the system had a double racial bias: one in favour of white defendants, and one against black defendants. To this day, the problems persist, and by now other cases have come to light.
Oct-1-2019, 06:37:34 GMT