UK police
UK police under fire for arresting French publisher
The London Metropolitan Police have been condemned by writers, journalist unions and activists for questioning and detaining a French publisher under the United Kingdom's Terrorism Act. Ernest Moret, foreign rights manager for Editions La Fabrique and for popular science fiction author Alain Damasio, was on his way to the London Book Fair when he was stopped by police officers on Monday evening. Editions La Fabrique, in a joint statement with the British publishing house Verso Books, said police officers pulled Moret aside for questioning under Schedule 7 of the Terrorism Act after he arrived at London's St Pancras railway station. The legislation gives police officers the power to stop, question and detain people in order to determine whether they are involved in the "preparation or instigation of acts of terrorism", according to the Metropolitan Police's definition. The officers said Moret had taken part in demonstrations in France against a controversial pension reform, the publishers said in their statement.
- Europe > United Kingdom (1.00)
- Europe > France (0.44)
- Law Enforcement & Public Safety > Crime Prevention & Enforcement (1.00)
- Government > Regional Government > Europe Government > United Kingdom Government (0.70)
- Transportation > Ground > Rail (0.58)
'Minority Report' now a reality? UK police to use AI in war on 'pre-crime'
Suggesting that budget cuts have rendered mere human police incapable of doing their jobs without cybernetic help, project lead Iain Donnelly claims working with an AI system will allow the force to do more with less. He insists that the National Data Analytics Solution, as it's called, will target only those individuals already known to have criminal tendencies, sniffing out likely offenders to divert them with therapeutic "interventions," including individuals who are stopped and searched but never arrested or charged. Donnelly claims the program is not designed to "pre-emptively arrest" anyone, but to provide "support from local health or social workers," giving the example of an individual with a history of mental health problems being flagged as a likely violent offender, then contacted by social services. Given that a violent mental case would almost certainly react negatively to being contacted out of nowhere by a mysterious social worker – and that a history of mental health problems is not in itself criminal – Donnelly was wise to end his example there. While "interventions" will be offered only to potential offenders, the NDAS also claims to be able to identify their likely victims.
- Information Technology > Artificial Intelligence (1.00)
- Information Technology > Data Science > Data Mining (0.37)
UK police need to slow down with face recognition, says data watchdog
A legal code of practice is needed before face recognition technology can be safely deployed by police forces in public places, says the UK's data regulator. The Information Commissioner's Office (ICO) said in a blog post that it has serious concerns about the use of the technology, as it relies on large amounts of personal information. Current laws, codes and practices "will not drive the ethical and legal approach that's needed to truly manage the risk that this technology presents," said information commissioner Elizabeth Denham. She called for police forces to be compelled to show justification that face recognition is "strictly necessary, balanced and effective" in each case it is deployed. Face recognition can map faces in a crowd by measuring the distance between facial features, then compare results with a "watch list" of images, which can include suspects, missing people and persons of interest. South Wales Police and the Met Police have been trialling face recognition as a possible way to reduce crime, but the move has been divisive.
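The matching step the article describes, measuring distances between facial features and comparing the result against a "watch list", can be illustrated with a minimal sketch. The feature vectors, names and threshold below are invented for illustration only; real deployments use learned face embeddings, not a handful of hand-picked measurements.

```python
from math import dist  # Euclidean distance (Python 3.8+)

# Hypothetical watch list: name -> feature vector (e.g. normalised
# distances between facial landmarks). All values are made up.
WATCH_LIST = {
    "suspect_a": (0.42, 0.31, 0.77),
    "missing_b": (0.12, 0.88, 0.45),
}

def match_face(features, watch_list=WATCH_LIST, threshold=0.1):
    """Return the closest watch-list entry if within threshold, else None."""
    best_name, best_d = None, float("inf")
    for name, ref in watch_list.items():
        d = dist(features, ref)
        if d < best_d:
            best_name, best_d = name, d
    return best_name if best_d <= threshold else None

# A face close to suspect_a's vector matches; a distant one does not.
print(match_face((0.43, 0.30, 0.76)))  # suspect_a
print(match_face((0.99, 0.01, 0.02)))  # None
```

The `threshold` parameter is where the controversy in these stories lives: loosen it and false matches rise, tighten it and genuine matches are missed.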
Artificial intelligence used by UK police to predict crimes amplifies human bias
Artificial intelligence technology used by police forces in the UK to predict future crimes replicates - and in some cases amplifies - human prejudices, according to a new report. While "predictive policing" tools have been used in the UK since at least 2004, advances in machine learning and AI have enabled the development of more sophisticated systems. These are now used for a wide range of functions including facial recognition and video analysis, mobile phone data extraction, social media intelligence analysis, predictive crime mapping and individual risk assessment. However, the report by the Royal United Services Institute (RUSI) warns that human biases are being built into these machine learning algorithms, resulting in people being unfairly discriminated against due to their race, sexuality and age. One police officer who was interviewed for the report commented that: "Young black men are more likely to be stop and searched than young white men, and that's purely down to human bias. "That human bias is then introduced into the datasets, and bias is then generated in the outcomes of the application of those datasets." In addition to these inherent biases, the report points out that individuals from disadvantaged sociodemographic backgrounds are likely to engage with public services more frequently. As a result, police often have access to more data relating to these individuals, which "may in turn lead to them being calculated as posing a greater risk". Matters could worsen over time, another officer said, when software is used to predict future crime hotspots. "We pile loads of resources into a certain area and it becomes a self-fulfilling prophecy, purely because there's more policing going into that area, not necessarily because of discrimination on the part of officers," the officer said. The report also warns that police forces could become over-reliant on the AI to predict future crimes, and discount other relevant information. 
"Officers often disagree with the algorithm.
UK police are using AI to spot spikes in Brexit-related hate crimes
The UK police are monitoring hundreds of thousands of Twitter posts containing hate speech every day. It is part of a pilot project to predict spikes in hate crimes in the run up to 31 October, when the UK is due to leave the European Union. The Online Hate Speech Dashboard is being used by analysts at the National Police Chiefs' Council's online hate crime hub, which was established by the Home Office in 2017 to "tackle the emerging threat of online hate crime". It gathers Twitter posts from across the UK and uses artificially intelligent algorithms to detect speech that is, for example, Islamophobic, anti-Semitic or directed against people from certain countries or with disabilities or from LGBT groups. The police chiefs' council tasked Matthew Williams at Cardiff University, UK, and his colleagues with developing the dashboard so that government organisations could monitor hate speech.
- Europe > United Kingdom > Wales (0.05)
- Europe > United Kingdom > England (0.05)
- Europe > Germany (0.05)
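The dashboard's core task, flagging spikes in daily counts of detected hate speech, can be sketched with a simple rolling-mean threshold. The counts, window and z-threshold below are invented; the real system's classifiers and alerting logic are not public.

```python
from statistics import mean, stdev

def find_spikes(daily_counts, window=7, z=2.0):
    """Flag days whose count exceeds the rolling mean of the previous
    `window` days by more than `z` standard deviations."""
    spikes = []
    for i in range(window, len(daily_counts)):
        prev = daily_counts[i - window:i]
        mu, sd = mean(prev), stdev(prev)
        if sd > 0 and daily_counts[i] > mu + z * sd:
            spikes.append(i)
    return spikes

# Steady baseline of ~100 detections/day with a jump on day 9.
counts = [100, 98, 103, 101, 99, 102, 100, 101, 99, 180, 102]
print(find_spikes(counts))  # [9]
```

A rolling baseline rather than a fixed one matters here: hate speech volume drifts with the news cycle, so a spike is only meaningful relative to the recent past.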
UK police: 2 drones found near Gatwick Airport not involved
LONDON – British police said Saturday that two drones found near London's Gatwick Airport were not involved in the disruption that shut down the busy airport just days before Christmas. Sussex Police Chief Giles York told BBC radio that police have searched 26 potential launch sites near the airport but do not believe they have found the drone that was seen near the runway on Dec. 19 and Dec. 20. York said he is "absolutely certain that there was a drone flying throughout the period that the airport was closed." A senior detective said last week it was possible the drones had never flown over the airport at all, sowing confusion, but police later insisted that the drone sightings were authentic. The airport's closure led to more than 100,000 people being stranded or delayed in the worst ever drone-related disruption at an international airport.
- Europe > United Kingdom > England > West Sussex (0.97)
- Europe > United Kingdom > England > East Sussex (0.27)
- Transportation > Infrastructure & Services > Airport (1.00)
- Transportation > Air (1.00)
- Law Enforcement & Public Safety > Crime Prevention & Enforcement (1.00)
- Government > Regional Government > Europe Government > United Kingdom Government (0.89)
Exclusive: UK police want AI to stop violent crime before it happens
Police in the UK want to predict serious violent crime using artificial intelligence, New Scientist can reveal. The idea is that individuals flagged by the system will be offered interventions, such as counselling, to avert potential criminal behaviour. However, one of the world's leading data science institutes has expressed serious concerns about the project after seeing a redacted version of the proposals. The system, called the National Data Analytics Solution (NDAS), uses a combination of AI and statistics to try to assess the risk of someone committing or becoming a victim of gun or knife crime, as well as the likelihood of someone falling victim to modern slavery. West Midlands Police is leading the project and has until the end of March 2019 to produce a prototype.
- Europe > United Kingdom > England > West Midlands (0.27)
- North America > United States > California > Los Angeles County > Los Angeles (0.16)
- North America > United States > District of Columbia (0.05)
- (2 more...)
- Information Technology > Artificial Intelligence (1.00)
- Information Technology > Data Science > Data Mining (0.36)
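The article says NDAS "uses a combination of AI and statistics" to score someone's risk of involvement in gun or knife crime, with flagged individuals offered interventions. A minimal sketch of how such a pipeline might combine weighted features into a single score with an intervention threshold is below; the feature names, weights and threshold are invented for illustration and bear no relation to NDAS's actual, unpublished model.

```python
from math import exp

# Hypothetical features and weights; a real system would learn these
# from historical data rather than hard-code them.
WEIGHTS = {
    "prior_contacts": 0.8,    # prior police contacts (normalised)
    "known_associates": 0.5,  # flagged associates (normalised)
    "victim_history": 0.6,    # prior victimisation (normalised)
}
BIAS = -2.0
THRESHOLD = 0.5

def risk_score(features):
    """Logistic combination of weighted features -> score in (0, 1)."""
    s = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1 / (1 + exp(-s))

def flag_for_intervention(features):
    """Flag an individual when the score crosses the threshold."""
    return risk_score(features) >= THRESHOLD

print(flag_for_intervention(
    {"prior_contacts": 2.0, "known_associates": 1.5, "victim_history": 1.0}))
print(flag_for_intervention({"prior_contacts": 0.1}))
```

Note how directly the sketch illustrates the RUSI report's concern above: every input is a record of past police contact, so whoever the police already watch most scores highest.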
UK police forced to ground drones after DJI warns that some are falling out of the sky mid-flight
Dronemaker DJI has warned that some of its unmanned aerial vehicles are suddenly falling out of the sky mid-flight. The company says there have been a 'small number' of reports surrounding its Matrice 200 series drones, where a power issue is causing them to crash mid-flight. The warning has prompted UK police to ground some of their drones. The United Kingdom Civil Aviation Authority issued a safety notice saying that some 200 model drones lost power mid-flight and dropped straight down to the ground. In one case, a drone experienced an 'in-flight issue' and landed on the roof of a commercial building.
- Transportation > Air (1.00)
- Government > Regional Government > Europe Government > United Kingdom Government (0.61)
Facial recognition tech used by UK police is making a ton of mistakes
At the end of each summer for the last 14 years, the small Welsh town of Porthcawl has been invaded. Every year its population of 16,000 is swamped by up to 35,000 Elvis fans. Many people attending the yearly festival look the same: they slick back their hair, throw on oversized sunglasses and don white flares. At 2017's Elvis festival, impersonators were faced with something different. Police were trialling automated facial recognition technology to track down criminals.
- Europe > United Kingdom > Wales (0.78)
- Asia > China (0.05)
- Oceania > Australia (0.05)
- Europe > United Kingdom > England > Leicestershire (0.05)
- Law Enforcement & Public Safety > Crime Prevention & Enforcement (1.00)
- Law (0.96)
- Government > Regional Government > Europe Government > United Kingdom Government (0.50)
- Leisure & Entertainment > Sports > Soccer (0.30)
The Latest: UK Police Seek Sightings of Poisoned Spy's Car
"We will always do what is necessary to defend ourselves, our allies and our values against an attack of this sort, which is an attack not only on the United Kingdom, but upon the international rules-based system on which all countries, including Russia, depend for their safety and security," Bristow told reporters.