Police facial recognition system faces legal challenge

BBC News

A legal challenge against the use of automatic facial recognition technology by police has been launched by a civil liberties group. Automatic Facial Recognition uses CCTV or surveillance cameras to record and compare facial characteristics with images on police databases. Lawyers for Big Brother Watch argue the use of AFR breaches the rights of individuals under the Human Rights Act. The Metropolitan Police says the technology will help keep London safe. The system is being piloted in London, with three other forces - Humberside, South Wales, and Leicestershire - also trialling the technology.


'I was shocked it was so easy': meet the professor who says facial recognition can tell if you're gay

The Guardian

Vladimir Putin was not in attendance, but his loyal lieutenants were. On 14 July last year, the Russian prime minister, Dmitry Medvedev, and several members of his cabinet convened in an office building on the outskirts of Moscow. On to the stage stepped a boyish-looking psychologist, Michal Kosinski, who had been flown from the city centre by helicopter to share his research. "There was Lavrov, in the first row," he recalls several months later, referring to Russia's foreign minister. "You know, a guy who starts wars and takes over countries." Kosinski, a 36-year-old assistant professor of organisational behaviour at Stanford University, was flattered that the Russian cabinet would gather to listen to him talk. "Those guys strike me as one of the most competent and well-informed groups," he tells me. Kosinski's research includes groundbreaking work on technology, mass persuasion and artificial intelligence (AI) – work that inspired the creation of the political consultancy Cambridge Analytica. Five years ago, while a graduate student at Cambridge University, he showed how even benign activity on Facebook could reveal personality traits – a discovery that was later exploited by the data-analytics firm that helped put Donald Trump in the White House.


Weaponized drones. Machines that attack on their own. 'That day is going to come'

#artificialintelligence

Technicians and researchers are cautioning about the threat such technology poses for cybersecurity, the fundamentally important practice that keeps our computers and data -- and governments' and corporations' computers and data -- safe from hackers. In February, a study from teams at the University of Oxford and the University of Cambridge warned that AI could be used as a tool to hack into drones and autonomous vehicles, turning them into potential weapons. "Autonomous cars like Google's (Waymo) are already using deep learning, can already evade obstacles in the real world," Caspi said, "so evading traditional anti-malware system in cyber domain is possible." Another study, by U.S. cybersecurity software giant Symantec, said that 978 million people across 20 countries were affected by cybercrime last year. Victims of cybercrime lost a total of $172 billion -- an average of $142 per person -- as a result, researchers said.


A.I. Powered Civil Servants Could Take Over the Government

#artificialintelligence

Until robots reach some kind of full consciousness, nobody feels bad about not paying them. That's why austerity-focused governments are on a fast track to replacing civil servants with automated workers to cut costs: we're talking robo-cops, robo-firefighters, robo-mailmen. If artificial intelligence gets good enough to take private-sector jobs, there's little to stop it from driving a technological revolution that could leave government services unrecognizable. For the past six years, the British Conservative Party has been pursuing an austerity programme to cut the deficit, slashing funds for local services and reducing staff numbers in the public sector. A recent report, published in October by Oxford University and Deloitte, suggests the government could go much further: over 850,000 public-sector jobs could be lost to automation by 2030.