The lifts rising to Yitu Technology's headquarters have no buttons. The pass cards of the staff and visitors stepping into the elevators that serve floors 23 and 25 of a newly built skyscraper in Shanghai's Hongqiao business district are read automatically – no swipe required – and each passenger is deposited at their specified floor. The only way to beat the system and alight at a different floor is to wait for someone who does have access and jump out alongside them. Or, if this were a sci-fi thriller, you'd set off the fire alarms and take the stairs while everyone else was evacuating. But even in that scenario you'd be caught: Yitu's cameras record everyone coming into the building and track them inside.
Question: can AI vision systems from Microsoft and Google, which are available for free to anybody, identify NSFW (not safe for work, nudity) images? Can this identification be used to automatically censor images by blacking out or blurring NSFW areas of the image? Method: I spent a few hours over the weekend knocking together some very rough code in Microsoft Office to find files on my computer and send them to Google Vision and Microsoft Vision for analysis. Result: yes, they did reasonably well at (a) identifying images that could need censoring and (b) identifying where on the image things should be blocked out.
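The censoring step itself is straightforward once a vision API has flagged an image and returned region coordinates. A minimal sketch in Python (not the author's Office code): the image is modeled as a grid of RGB tuples, and the bounding box is a stand-in for whatever region a vision API might report, so the coordinates here are purely hypothetical.

```python
# Sketch: black out a rectangular region of an image, given a bounding
# box of the kind a vision API might return for a flagged area.
# The box coordinates below are hypothetical, not a real API response.

def black_out(pixels, box):
    """Set every pixel inside box to black.

    pixels: list of rows, each row a list of (r, g, b) tuples.
    box: (left, top, right, bottom) in pixel coordinates,
         right/bottom exclusive.
    """
    left, top, right, bottom = box
    for y in range(top, bottom):
        for x in range(left, right):
            pixels[y][x] = (0, 0, 0)
    return pixels

# Tiny 4x4 all-white "image" for demonstration.
img = [[(255, 255, 255)] * 4 for _ in range(4)]
censored = black_out(img, (1, 1, 3, 3))
```

In a real pipeline the same loop would run over a decoded image buffer (e.g. via an imaging library), and blurring would replace the black fill with an average over a neighbourhood of each pixel.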
But a new, comprehensive report on the status of facial recognition as a tool in law enforcement shows the sheer scope and reach of the FBI's database of faces and those of state-level law enforcement agencies: roughly half of American adults are included in those collections. The 150-page report, released on Tuesday by the Center for Privacy & Technology at the Georgetown University law school, found that law enforcement databases now include the facial recognition information of 117 million Americans, about one in two U.S. adults. Meanwhile, since law enforcement facial recognition systems often include mug shots, and arrest rates among African Americans are higher than among the general population, algorithms may be disproportionately able to find a match for black suspects. In reaction to the report, a coalition of more than 40 civil rights and civil liberties groups, including the American Civil Liberties Union and the Leadership Conference on Civil and Human Rights, launched an initiative on Tuesday asking the Department of Justice's Civil Rights Division to evaluate current use of facial recognition technology around the country.