Wales


Police drones are taking to the skies

ZDNet

Police forces in the UK are trialing the use of drones to provide air support to forces on the ground in cases where deploying a helicopter or an aeroplane might be less practical. The National Police Air Service (NPAS), the police aviation service that assists territorial police forces in England and Wales, is evaluating how drone technology might complement its existing national fleet of helicopters and aeroplanes. The first trials of the technology kicked off at West Wales Airport near Aberporth, and included typical scenarios that the NPAS fleet might be confronted with. Police forces typically request NPAS assistance with tasks such as searching for suspects or missing people, vehicle pursuits, public order, counter-terrorism and firearms incidents.


Banwen rave: Eight fined and arrests made for drug driving

BBC News

Eight people have now been fined up to £10,000 after an illegal rave that attracted 3,000 people, with arrests also made for public order offences and driving under the influence of drugs. The unlicensed event at Banwen, on the edge of the Brecon Beacons, started on Saturday night, and there were still 400 people at the site on Monday morning. South Wales Police Assistant Chief Constable Dave Thorne said drone footage would help identify the organisers. A student who attended the rave admitted being taken aback by the scale of the event and likened it to a festival.


AI Weekly: Surveillance, structural racism, and the Biden 2020 presidential campaign

#artificialintelligence

In the United Kingdom there has been some landmark AI news recently involving government use of the technology. First, the use of facial recognition by South Wales Police was ruled unlawful by the Court of Appeal, in part for violating privacy rights and for the force's failure to verify that the technology did not exhibit race or gender bias. How the U.K. treats facial recognition is important, since London has more CCTV cameras than any major city outside of China. Then, U.K. government officials used a grading algorithm that ended up benefiting students at private schools and downgrading students from disadvantaged backgrounds. Prime Minister Boris Johnson defended the algorithm's grading results as "robust" and "dependable for employers."


Court finds some fault with UK police force's use of facial recognition tech – TechCrunch

#artificialintelligence

Civil rights campaigners in the UK have won a legal challenge to South Wales Police's (SWP) use of facial recognition technology. The win on appeal is being hailed as a "world-first" victory in the fight against the use of an "oppressive surveillance tool", as human rights group Liberty puts it. However, the police force does not intend to appeal the ruling, and has said it remains committed to "careful" use of the tech. The back story here is that SWP has been trialing automated facial recognition (AFR) technology since 2017, deploying a system known as AFR Locate on around 50 occasions between May 2017 and April 2019 at a variety of public events in Wales. The force has used the technology in conjunction with watchlists of between 400 and 800 people, which included persons wanted on warrants; persons who had escaped from custody; persons suspected of having committed crimes; persons who may be in need of protection; vulnerable persons; persons of possible interest to the force for intelligence purposes; and persons whose presence at a particular event causes particular concern, per a press summary issued by the appeals court.


Police use of facial recognition gets reined in by UK court - CNET

CNET - News

Since 2017, police in the UK have been testing live, or real-time, facial recognition in public places to try to identify criminals. The legality of these trials has been widely questioned by privacy and human rights campaigners, who just won a landmark case that could have a lasting impact on how police use the technology in the future. In a ruling Tuesday, the UK Court of Appeal said South Wales Police had been using the technology unlawfully, which amounted to a violation of human rights. In a case brought by civil liberties campaigner Ed Bridges and supported by human rights group Liberty, three senior judges ruled that South Wales Police had violated Bridges' right to privacy under the European Convention on Human Rights.


Is police use of face recognition now illegal in the UK?

New Scientist

The UK Court of Appeal has unanimously reached a decision against a face-recognition system used by South Wales Police. The judgment, which called the use of automated face recognition (AFR) "unlawful", could have ramifications for the widespread use of such technology across the UK. But there is disagreement about exactly what the consequences will be. Ed Bridges, who launched the original case after police cameras digitally analysed his face in the street, had appealed against police use of face recognition with the support of civil liberties campaign group Liberty. The police force claimed in court that the technology was similar to the use of closed-circuit television (CCTV) cameras in cities.


South Wales police lose landmark facial recognition case

The Guardian

The use of facial recognition technology by South Wales police broke race and sex equalities law and breached privacy rights because the force did not apply proper safeguards, the court of appeal has ruled. The critical judgment came in a case brought by Ed Bridges, a civil liberties campaigner, who was scanned by the police software in Cardiff in 2017 and 2018. He argued that the capture of thousands of faces was indiscriminate. Bridges' case had previously been rejected by the high court, but the court of appeal ruled in his favour on three counts, in a significant test case for how the controversial technology is applied in practice by police. Among other findings, the appeal court held that Bridges' right to privacy, under article 8 of the European convention on human rights, was breached because there was "too broad a discretion" left to police officers as to whom to put on the force's watchlist of suspects.


UK court rules police facial recognition trials violate privacy laws

Engadget

Human rights organization Liberty is claiming a win in its native Britain after a court ruled that police trials of facial recognition technology violated privacy laws. The Court of Appeal ruled that the use of automatic facial recognition systems unfairly impacted claimant Ed Bridges' right to a private life. Judges added that there were issues around how people's personal data was being processed, and said that the trials should be halted for now. The court also found that South Wales Police (SWP) had not done enough to satisfy itself that the facial recognition technology was free of race and gender bias. A spokesperson for SWP told the BBC that it would not appeal the judgment, but Chief Constable Matt Jukes said that the force would find a way to "work with" it.


Facial recognition use by South Wales Police ruled unlawful

BBC News

The use of automatic facial recognition (AFR) technology by South Wales Police is unlawful, the Court of Appeal has ruled. It follows a legal challenge brought by civil rights group Liberty and Ed Bridges, 37, from Cardiff. But the court also found its use was a proportionate interference with human rights, as the benefits outweighed the impact on Mr Bridges. South Wales Police said it would not be appealing the findings. Mr Bridges had said being identified by AFR caused him distress.


Police built an AI to predict violent crime. It was seriously flawed

#artificialintelligence

A flagship artificial intelligence system designed to predict gun and knife violence before it happens had a serious flaw that made it unusable, police have admitted. The error led to large drops in accuracy, and the system was ultimately rejected by all of the experts reviewing it for ethical problems. The prediction system, known as Most Serious Violence (MSV), is part of the National Data Analytics Solution (NDAS) project. The Home Office has funded NDAS with at least £10 million over the last two years, with the aim of creating machine learning systems that can be used across England and Wales. As a result of the failure of MSV, police have stopped developing the prediction system in its current form.