The Chinese government's plans for mass surveillance using facial recognition have received a boost from one of the country's tech powerhouses, after Alibaba led a $600m investment in SenseTime, which develops technology for tracking individuals. The company is working on facial and object recognition technology that can accurately identify people on camera, recently demonstrated on CCTV in Beijing. Honda is using SenseTime in its driverless-car research and development, and the technology is also deployed at shopping counters that let customers check out using their faces. SenseTime has already smashed the record for AI funding, beating British competitor DeepMind, which was bought by Google for an...
San Francisco supervisors approved a ban on police use of facial recognition technology, making it the first city in the U.S. with such a restriction. SAN FRANCISCO – A routine traffic stop goes dangerously awry when a police officer's body camera uses its built-in facial recognition software to misidentify a motorist as a convicted felon. At best, lawsuits are launched. That imaginary scenario is what some California lawmakers are trying to avoid by supporting Assembly Bill 1215, the Body Camera Accountability Act, which would ban the use of facial recognition software in police body cameras – a national first if it passes a Senate vote this summer and is signed by Gov. Gavin Newsom. California law enforcement officials do not currently use the technology to scan people in officers' line of sight.
The facial-recognition cameras installed near the bounce houses at the Warehouse, an after-school recreation center in Bloomington, Indiana, are aimed low enough to scan the face of every parent, teenager and toddler who walks in. The center's director, David Weil, learned of the surveillance system earlier this year from a church newsletter, and within six weeks he had bought his own, believing it promised a security breakthrough that was both affordable and cutting-edge. Since last month, the system has logged thousands of visitors' faces – alongside their names, phone numbers and other personal details – and checked them against a regularly updated blacklist of sex offenders and unwanted guests. The system's Israeli developer, Face-Six, also promotes it for use in prisons and drones. "Some parents still think it's kind of '1984,'" said Weil, whose 21-month-old granddaughter is among the scanned.
Facebook has boosted its face recognition capabilities with the acquisition of startup FacioMetrics. FacioMetrics uses facial image analysis to determine emotions, and is aimed at sectors including gaming, healthcare, augmented reality and robotics. Fernando De la Torre, founder and CEO of FacioMetrics, said the company was formed to cater to the increasing interest in and demand for facial image analysis, with applications including augmented/virtual reality, animation and audience reaction measurement. The technology grew out of research at Carnegie Mellon University on computer vision and machine learning algorithms for facial image analysis. "Over time, we have successfully developed and integrated this cutting-edge technology into battery-friendly and efficient mobile applications, and also created new applications of this technology," said De la Torre.
Artificial intelligence has made major strides in the past few years, but those rapid advances are now raising some big ethical conundrums. Chief among them is the way machine learning can identify people's faces in photos and video footage with great accuracy. This might let you unlock your phone with a smile, but it also means that governments and big corporations have gained a powerful new surveillance tool. A new report from the AI Now Institute, an influential research institute based in New York, has identified facial recognition as a key challenge for society and policymakers. The speed at which facial recognition has advanced comes down to the rapid development of a type of machine learning known as deep learning.