Police have said they are seeking "balance" in the use of artificial intelligence to predict crimes, after freedom of information requests found that 14 UK police forces were deploying, testing or investigating predictive AI techniques. The report by the civil liberties group Liberty, "Policing by Machine", warned that the tools risk entrenching existing biases and delivering inaccurate predictions. Liberty urged police to end the use of predictive AI, saying mapping techniques rely on "problematic" historical arrest data, while individual risk-assessment programmes "encourage discriminatory profiling".

The forces using or trialling predictive mapping programmes are Avon and Somerset Constabulary, Cheshire Constabulary, Dyfed-Powys Police, Greater Manchester Police, Kent Police, Lancashire Police, Merseyside Police, the Metropolitan Police Service, Norfolk Constabulary, Northamptonshire Police, Warwickshire Police, West Mercia Police, West Midlands Police and West Yorkshire Police. Three forces – Avon and Somerset, Durham and West Midlands – are using or trialling individual risk-assessment programmes.

Norfolk Police, for instance, is trialling a system for deciding whether burglaries should be investigated; Durham Constabulary's Harm Assessment Risk Tool (Hart) advises custody officers on individuals' risk of re-offending; and West Midlands Police uses hotspot mapping and a data-driven analysis project.
Feb-12-2019, 08:17:20 GMT