These glasses trick facial recognition software into thinking you're someone else

#artificialintelligence

Facial recognition software has become increasingly common in recent years. Facebook uses it to tag your photos; the FBI has a massive facial recognition database spanning hundreds of millions of images; and in New York, there are even plans to add smart facial recognition surveillance cameras to every bridge and tunnel. But while these systems seem inescapable, the technology that underpins them is far from infallible. In fact, it can be beaten with a pair of psychedelic-looking glasses that cost just $0.22. Researchers from Carnegie Mellon University have shown that specially designed spectacle frames can fool even state-of-the-art facial recognition software.
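The excerpt does not describe how the CMU attack works, but the general idea behind such "adversarial glasses" is to restrict an adversarial perturbation to an eyeglass-shaped region of the image and optimize it until the classifier's prediction changes. The following is a minimal, purely illustrative sketch of that mask-restricted gradient ascent against a toy linear classifier; it is not the CMU method or code, and every model, size, and parameter below is an assumption chosen only to make the example self-contained.

```python
# Illustrative sketch only: the published attack targets deep face-recognition
# networks with a printable eyeglass-frame texture. Here, a random linear
# "classifier" over a 32x32 grayscale image stands in, purely to show the
# mask-restricted gradient-ascent idea. All names and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

H = W = 32                        # toy image size (assumption)
n_classes = 10                    # toy identity classes (assumption)

# Stand-in "face classifier": softmax over a random linear map of the pixels.
weights = rng.normal(scale=0.1, size=(n_classes, H * W))

def predict_logits(img):
    return weights @ img.ravel()

# Eyeglass-frame mask: the perturbation is only allowed inside a band
# across the eye region of the image.
mask = np.zeros((H, W))
mask[10:14, 4:28] = 1.0           # crude "frames" region (assumption)

face = rng.uniform(0.0, 1.0, size=(H, W))    # stand-in face image
target_id = 7                                 # identity to impersonate (assumption)

perturb = np.zeros((H, W))
step = 0.05
for _ in range(200):
    logits = predict_logits(np.clip(face + mask * perturb, 0.0, 1.0))
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    # Gradient of the target class's log-probability w.r.t. the pixels
    # (exact for the linear model; clipping at the box bounds is ignored).
    grad = (weights[target_id] - probs @ weights).reshape(H, W)
    perturb += step * np.sign(grad) * mask    # sign-gradient ascent, masked
    perturb = np.clip(perturb, -1.0, 1.0)

adv = np.clip(face + mask * perturb, 0.0, 1.0)
print("clean prediction:      ", predict_logits(face).argmax())
print("adversarial prediction:", predict_logits(adv).argmax())
```

Because only the pixels under the mask ever change, the optimized perturbation corresponds to a pattern that could, in principle, be printed onto glasses frames, which is the intuition the news item gestures at.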


Boston becomes the second largest city in the US to ban facial recognition software

Daily Mail - Science & tech

Boston will become the second largest city in the US to ban facial recognition software for government use after a unanimous city council vote. Following San Francisco, which banned facial recognition in 2019, Boston will bar city officials from using facial recognition systems. The ordinance will also bar them from working with any third-party companies or organizations to acquire information gathered through facial recognition software. The ordinance was co-sponsored by Councilors Ricardo Arroyo and Michelle Wu, who were especially concerned about the potential for racial bias in the technology, according to a report from WBUR. 'Boston should not be using racially discriminatory technology and technology that threatens our basic rights,' Wu said at a hearing before the vote.


Opinion: There's no federal standard on facial recognition. Congress should step in.

#artificialintelligence

AND THEN there were three. Amazon has joined Microsoft and Google in supporting regulation of facial recognition technology, and it is easy to guess why: Research on bias in the software has amplified public skepticism, and legislators are starting to take note by proposing restrictions and even bans. Facial recognition technology could have many beneficial effects. The software could help stop human trafficking, reunify refugee families and make everyday services -- from banking to paying for groceries -- safer and faster. But it could come with costs, too, which is why regulators are right to pay attention.


AI Startup Develops Facial Recognition Software For Dogs

#artificialintelligence

Better known as a supplier of facial recognition software used by the Chinese government, the Alibaba-backed AI startup Megvii has developed software that can identify dogs by their noses. No, it isn't April 1st; the software really can tell one dog from another using nasal biometrics. KrAsia reports that the company developed the software on the basis that dogs have unique nose prints. Dr. David Dorman, a professor of toxicology, has previously said: "Like human fingerprints, each dog has a unique nose print. Some kennel clubs have used dog nose prints for identification."


Opinion: Facial recognition could make us safer -- and less free

#artificialintelligence

IT IS something of an American tourist tradition to gaze through the iron fences around the White House lawn, but citizens think little about how the government might be gazing back. A pilot program by the Secret Service to test the use of facial recognition in and around 1600 Pennsylvania Ave. should prompt everyone, and especially Congress, to start paying attention. The Department of Homeland Security published details recently on its plans to scan feeds from existing cameras in the executive complex and run them through recognition software. This is slightly less scary than it sounds: The cameras will capture people in adjacent public spaces, but only consenting Secret Service employees will be in the program database -- so, barring false positives, faces of passersby that do not match participants' photos will not be stored. More concerning is the potential for future misuse of the technology.