The FBI maintains a huge database of more than 411m photos culled from sources including driver's licenses, passport applications and visa applications, which it cross-references with photos of criminal suspects using largely untested and questionably accurate facial recognition software. A study from the Government Accountability Office (GAO) released on Wednesday revealed the extent of the program for the first time; the Electronic Frontier Foundation (EFF) had probed it several years earlier through a Freedom of Information Act request. The GAO, a watchdog office internal to the US federal government, found that the FBI did not appropriately disclose the database's impact on public privacy until the office audited the bureau in May. The GAO recommended that the attorney general determine why the FBI did not obey the disclosure requirements, and that the bureau conduct accuracy tests to determine whether the software is correctly cross-referencing driver's license and passport photos with images of criminal suspects. The Department of Justice "disagreed" with three of the GAO's six recommendations, according to the office, which stood by all of them.
Our brains are wired to differentiate between objects, both living and non-living, simply by looking at them. In fact, recognizing objects and situations through visualization is the fastest way to gather, as well as to relate, information. This is a much bigger challenge for computers, which must be fed vast amounts of data before they can perform such an operation on their own. With each passing day, it is becoming more important for machines to identify objects through facial recognition, so that humans can take the next big step toward a more scientifically advanced social mechanism. So, what progress have we really made in that respect?
Facial recognition software has become increasingly common in recent years. Facebook uses it to tag your photos; the FBI has a massive facial recognition database spanning hundreds of millions of images; and in New York, there are even plans to add smart, facial recognition surveillance cameras to every bridge and tunnel. But while these systems seem inescapable, the technology that underpins them is far from infallible. In fact, it can be beaten with a pair of psychedelic-looking glasses that cost just $0.22. Researchers from Carnegie Mellon University have shown that specially designed spectacle frames can fool even state-of-the-art facial recognition software.
These are just some of the questions being raised by lawmakers, civil libertarians, and privacy advocates in the wake of an ACLU report released last summer that claimed Amazon's facial recognition software, Rekognition, misidentified 28 members of Congress as criminals. Rekognition is a general-purpose application programming interface (API) that developers can use to build applications that detect and analyze scenes, objects, faces, and other items within images. The source of the controversy was a pilot program in which Amazon teamed up with the police departments of two cities, Orlando, Florida and Washington County, Oregon, to explore the use of facial recognition in law enforcement. In January 2019, the Daily Mail reported that the FBI had been testing Rekognition since early 2018. The Project on Government Oversight also revealed via a Freedom of Information Act request that Amazon had pitched Rekognition to ICE in June 2018.
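To make the API concrete: the kind of comparison the ACLU ran is exposed through Rekognition's `CompareFaces` operation, callable from Python via boto3 (the AWS SDK). The sketch below is illustrative, not the ACLU's actual code; the bucket and object names are hypothetical placeholders, and the live call is kept behind a `__main__` guard because it requires AWS credentials.

```python
# Hedged sketch: building and issuing a Rekognition CompareFaces request
# with boto3. Bucket/key names are placeholders, not from the article.

def build_compare_request(source_bucket, source_key,
                          target_bucket, target_key,
                          similarity_threshold=80.0):
    """Assemble keyword arguments for rekognition.compare_faces().

    Rekognition only returns face matches whose similarity score
    meets SimilarityThreshold (a value from 0 to 100)."""
    return {
        "SourceImage": {"S3Object": {"Bucket": source_bucket,
                                     "Name": source_key}},
        "TargetImage": {"S3Object": {"Bucket": target_bucket,
                                     "Name": target_key}},
        "SimilarityThreshold": similarity_threshold,
    }


if __name__ == "__main__":
    import boto3  # requires AWS credentials configured locally

    client = boto3.client("rekognition")
    request = build_compare_request("my-probe-bucket", "probe.jpg",
                                    "my-gallery-bucket", "candidate.jpg")
    response = client.compare_faces(**request)
    for match in response["FaceMatches"]:
        print(f"Similarity: {match['Similarity']:.1f}%")
```

Note that the threshold matters: Amazon responded to the ACLU report by pointing out that law-enforcement use should rely on a high similarity threshold, while lower thresholds return more (and less reliable) candidate matches.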
Data brokers already buy and sell detailed profiles that describe who you are. They track your public records and your online behavior to figure out your age, your gender, your relationship status, your exact location, how much money you make, which supermarket you shop at, and on and on and on. It's entirely reasonable to wonder how companies are collecting and using images of you, too. Facebook already uses facial recognition software to tag individual people in photos. Apple's new app, Clips, recognizes individuals in the videos you take.