Study shows face recognition experts perform better with AI as partner


Experts at recognizing faces often play a crucial role in criminal cases. A photo from a security camera can mean prison or freedom for a defendant, and testimony from highly trained forensic face examiners informs the jury whether that image actually depicts the accused. Just how good are facial recognition experts?

In work that combines forensic science with psychology and computer vision research, a team of scientists from the National Institute of Standards and Technology (NIST) and three universities has tested the accuracy of professional face identifiers, providing at least one revelation that surprised even the researchers: trained human beings perform best with a computer as a partner, not another person.

"This is the first study to measure face identification accuracy for professional forensic facial examiners, working under circumstances that apply in real-world casework," said NIST electronic engineer P. Jonathon Phillips.

Amazon's facial-recognition tool misidentified 28 lawmakers as people arrested for a crime, study finds

Washington Post - Technology News

Amazon's facial recognition tools incorrectly identified Rep. John Lewis (D-Ga.) and 27 other members of Congress as people arrested for a crime during a test commissioned by the American Civil Liberties Union of Northern California, the watchdog said Thursday. The ACLU said its findings show that Amazon's so-called Rekognition technology -- already in use at law-enforcement agencies in Oregon and Orlando -- is hampered by inaccuracies that disproportionately put people of color at risk and should prompt regulators to halt "law enforcement use of face surveillance." Amazon chief executive Jeffrey P. Bezos owns The Washington Post. For its test, the ACLU of Northern California created a database of 25,000 publicly available arrest photos, though the civil liberties watchdog did not give details about where it obtained the images or the kinds of individuals in the photos. It then used Amazon's Rekognition software to compare that database against photos of every member of the U.S. House and Senate.
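The ACLU test described above is a one-to-many search: each lawmaker's photo is scored against every entry in the mugshot gallery, and any score at or above a similarity threshold counts as a "match." The sketch below illustrates, with entirely hypothetical gallery IDs and scores, why the threshold choice matters so much — a permissive default can report matches that a stricter setting would reject.

```python
# Sketch of an ACLU-style one-to-many face search. All gallery IDs and
# similarity scores here are hypothetical; a real system would obtain
# scores from a face-matching service.

def find_matches(probe_scores, threshold):
    """Return gallery IDs whose similarity to the probe meets the threshold."""
    return [gid for gid, score in probe_scores.items() if score >= threshold]

# Hypothetical similarity scores (in percent) for one probe photo
# against a small mugshot gallery.
scores = {"mugshot_017": 81.2, "mugshot_342": 64.5, "mugshot_901": 79.9}

print(find_matches(scores, 80))  # permissive threshold: reports a match
print(find_matches(scores, 99))  # strict threshold: reports none
```

With the permissive 80 percent threshold, `mugshot_017` is reported as a match; raising the bar to 99 percent eliminates it, which is why threshold settings featured prominently in the dispute over the ACLU's methodology.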

Is Facial Recognition Technology Racist?

The Tech Connoisseur


Recent studies demonstrate that machine learning algorithms can discriminate based on classes such as race and gender. In this work, we present an approach to evaluate bias present in automated facial analysis algorithms and datasets with respect to phenotypic subgroups. Using the dermatologist-approved Fitzpatrick Skin Type classification system, we characterize the gender and skin type distribution of two facial analysis benchmarks, IJB-A and Adience. We find that these datasets are overwhelmingly composed of lighter-skinned subjects (79.6% for IJB-A and 86.2% for Adience) and introduce a new facial analysis dataset which is balanced by gender and skin type. We evaluate 3 commercial gender classification systems using our dataset and show that darker-skinned females are the most misclassified group (with error rates of up to 34.7%).
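The audit approach described above hinges on disaggregation: rather than reporting one pooled accuracy number, error rates are computed separately for each gender and skin-type subgroup, which is what exposes the gap for darker-skinned females. A minimal sketch of that disaggregation, using hypothetical records and labels:

```python
# Sketch of a subgroup-accuracy audit: classification errors are tallied
# per (gender, skin type) subgroup instead of pooled across the whole
# dataset. The records below are hypothetical examples.
from collections import defaultdict

def subgroup_error_rates(records):
    """records: iterable of (true_gender, skin_type, predicted_gender)."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for gender, skin, predicted in records:
        key = (gender, skin)
        totals[key] += 1
        if predicted != gender:
            errors[key] += 1
    return {key: errors[key] / totals[key] for key in totals}

records = [
    ("female", "darker", "male"),    # misclassified
    ("female", "darker", "female"),
    ("male", "lighter", "male"),
    ("male", "lighter", "male"),
]
print(subgroup_error_rates(records))
```

A pooled metric over these four records would report a 25% error rate and hide the fact that every error falls in one subgroup; the per-subgroup breakdown makes the disparity visible.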

Supporting Feedback and Assessment of Digital Ink Answers to In-Class Exercises

AAAI Conferences

Effective teaching involves treating the presentation of new material and the assessment of students' mastery of this material as part of a seamless and continuous feedback cycle. We have developed a computer system, called Classroom Learning Partner (CLP), that supports this methodology, and we have used it in teaching an introductory computer science course at MIT over the past year. Through evaluation of controlled classroom experiments, we have demonstrated that this approach reaches students who would have otherwise been left behind, and that it leads to greater attentiveness in class, greater student satisfaction, and better interactions between the instructor and student. The current CLP system consists of a network of Tablet PCs, and software for posing questions to students, interpreting their handwritten answers, and aggregating those answers into equivalence classes, each of which represents a particular level of understanding or misconception of the material. The current system supports a useful set of recognizers for specific types of answers, and employs AI techniques in the knowledge representation and reasoning necessary to support interpretation and aggregation of digital ink answers.
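The aggregation step described above — grouping recognized ink answers into equivalence classes so the instructor reviews a few representative answers rather than one per student — can be sketched as grouping by a normalization key. The normalization rule below is a hypothetical simplification; CLP's actual recognizers and equivalence reasoning are more sophisticated.

```python
# Sketch of CLP-style answer aggregation: recognized answers are grouped
# into equivalence classes by a normalization key. The normalization rule
# here (ignore case and whitespace) is a hypothetical stand-in for CLP's
# answer-type-specific reasoning.
from collections import defaultdict

def normalize(answer):
    # Hypothetical rule: lowercase and strip all whitespace.
    return "".join(answer.lower().split())

def aggregate(answers):
    """Group raw answers into equivalence classes keyed by normalized form."""
    classes = defaultdict(list)
    for ans in answers:
        classes[normalize(ans)].append(ans)
    return classes

groups = aggregate(["O(n log n)", "o(nlogn)", "O(n^2)"])
for key, members in groups.items():
    print(key, "<-", members)
```

Here the first two answers collapse into one equivalence class while the incorrect answer forms its own, so the instructor sees two groups instead of three individual submissions.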

Amazon's facial recognition software mistakes women for men, and darker-skinned women for men at even higher rates

Daily Mail - Science & tech

Amazon's controversial facial recognition software, Rekognition, is facing renewed criticism. A new study from the MIT Media Lab found that Rekognition may have gender and racial biases: the software performed worse when identifying the gender of female faces, and worse still for darker-skinned female faces. When the software was presented with a number of female faces, it incorrectly labeled 19 percent of them as male.