BRITISH cops are using a system to stop crimes BEFORE they happen. Police in Durham are employing artificial intelligence designed to help officers decide whether to keep a suspect in custody. Dubbed the Harm Assessment Risk Tool (HART), it predicts the risk of a suspect re-offending by categorising them as low, medium or high risk. The force says the system is due to go live in the next few months, and could be picked up elsewhere in the country before the end of the year. HART was developed by Dr Geoffrey Barnes of the University of Cambridge, in a partnership between Durham Constabulary and the university's Centre for Evidence-Based Policing.
AI is rocking the world of policing -- and the consequences are still unclear. British police are poised to go live with a predictive artificial intelligence system that will help officers assess the risk of suspects re-offending. It's not Minority Report (yet), but it certainly sounds scary. Just like the evil AIs in the movies, this tool has an acronym: HART, which stands for Harm Assessment Risk Tool, and it's going live in Durham after a long trial. The system, which classifies suspects as at low, medium, or high risk of committing a future offence, was tested in 2013 using data that Durham police gathered from 2008 to 2012.
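The articles above describe the tool only at this level: it takes information about a suspect and outputs one of three risk bands. As a purely illustrative sketch of what such a three-band classifier looks like in code — the inputs, weights, and thresholds below are invented for demonstration and are not HART's actual model or features:

```python
# Toy three-band risk classifier, in the spirit of the tool described.
# NOTE: the features (prior offences, time since last offence) and the
# scoring rule are hypothetical; the articles do not disclose HART's
# real inputs or model.

def assess_risk(prior_offences: int, months_since_last_offence: int) -> str:
    """Return 'low', 'medium' or 'high' for a suspect's re-offending risk."""
    # More priors push the score up; a long offence-free period pulls it down.
    score = prior_offences * 2 - months_since_last_offence // 12
    if score >= 6:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

print(assess_risk(5, 6))    # many recent priors -> "high"
print(assess_risk(1, 48))   # one old offence   -> "low"
```

A real system of this kind would be trained on historical outcomes — as the trial reportedly used 2008–2012 custody data — rather than hand-written thresholds like these.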
Some victims of domestic violence and other serious crimes have to wait days to be seen by police officers because 999 calls are not getting a prompt response, a report has found. Her Majesty's Inspectorate of Constabulary and Fire & Rescue Services (HMICFRS) said a quarter of forces in England and Wales were often "overwhelmed" by demand. Its annual review found instances of police taking days to respond to calls that should be acted on within an hour. Police chiefs said increases in demand had put policing under "real strain". The inspectorate's annual assessment of police effectiveness said most police forces were doing a good job and keeping the public safe.
A UK police force which was using an algorithm designed to help it make custody decisions has been forced to alter it amid concerns that it could discriminate against poor people. Durham Constabulary has been developing an algorithm to better predict the risk posed by offenders and to ensure that only the most "suitable" are granted police bail. But the programme has also highlighted potential social inequalities that can be maintained through the use of these big data strategies. This might seem surprising, since a key selling point of such programmes is their apparent neutrality: technocratic evaluations of risk based on information that is "value-free", resting on objective calculation and eschewing subjective bias. In practice, that neutrality is questionable.
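The point about questionable neutrality can be made concrete with a small, entirely synthetic example. Suppose a seemingly neutral variable — say, a suspect's postcode area — correlates with deprivation and with how heavily an area is policed, so that more offences are *recorded* there. A naive risk score built on that variable then reproduces the disparity. Everything below (the areas, the records, the rates) is invented for illustration:

```python
# Hypothetical demonstration of a proxy variable. All data is synthetic.
# Area "A" stands in for a deprived, heavily policed area where more
# offences are recorded; area "B" for a less-policed one.
records = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", True), ("B", False),
]

def recorded_reoffence_rate(area: str) -> float:
    """Fraction of recorded re-offences among suspects from this area."""
    outcomes = [reoffended for a, reoffended in records if a == area]
    return sum(outcomes) / len(outcomes)

# A model that scores risk by area rates residents of "A" as riskier
# purely because of where they live, not because of anything they did.
print(recorded_reoffence_rate("A"))  # 0.75
print(recorded_reoffence_rate("B"))  # 0.25
```

The "value-free" input never mentions income or class, yet the output encodes both — which is the mechanism the critics of the Durham programme are pointing at.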