Civil Rights & Constitutional Law


Stanford professor says face-reading AI will detect IQ

Daily Mail

Stanford researcher Dr Michal Kosinski went viral last week after publishing research suggesting AI can tell whether someone is straight or gay based on photos. Dr Kosinski now claims he is working on AI software that can identify political beliefs, with promising preliminary results.


FaceApp removes 'Ethnicity Filters' after racism storm

Daily Mail

When asked to make his picture 'hot', the app lightened his skin and changed the shape of his nose. The app's creators claim it will 'transform your face using Artificial Intelligence', allowing selfie-takers to alter their photos. Earlier this year people accused the popular photo editing app Meitu of being racist and of giving users 'yellow face'. Twitter user Vaughan posted a picture of Kanye West with a filter applied, along with the caption: 'So Meitu's pretty racist'


Rise of the racist robots – how AI is learning all our worst impulses

#artificialintelligence

Last year, Lum and a co-author showed that PredPol, a program for police departments that predicts hotspots where future crime might occur, could potentially get stuck in a feedback loop of over-policing majority black and brown neighbourhoods. Programs developed by companies at the forefront of AI research have resulted in a string of errors that look uncannily like the darker biases of humanity: a Google image recognition program labelled the faces of several black people as gorillas; a LinkedIn advertising program showed a preference for male names in searches; and a Microsoft chatbot called Tay spent a day learning from Twitter and began spouting antisemitic messages. Lum and her co-author took PredPol – the program that suggests the likely location of future crimes based on recent crime and arrest statistics – and fed it historical drug-crime data from the city of Oakland's police department. As if that wasn't bad enough, the researchers also simulated what would happen if police had acted directly on PredPol's hotspots every day and increased their arrests accordingly: the program entered a feedback loop, predicting more and more crime in the neighbourhoods that police visited most.
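The feedback loop the researchers describe can be seen in a toy simulation. This is a minimal illustrative sketch, not PredPol's actual model or the authors' code: two hypothetical neighbourhoods have identical underlying crime rates, but a biased arrest history steers patrols toward one of them, and those patrols generate the arrests that justify the next day's prediction.

```python
import random

random.seed(0)

TRUE_RATE = {"A": 0.10, "B": 0.10}   # identical true crime rates (assumption)
arrests   = {"A": 60,   "B": 40}     # historically biased arrest counts (assumption)

for day in range(50):
    # "Predict" tomorrow's hotspot from past arrest records alone.
    predicted_hotspot = max(arrests, key=arrests.get)
    # Police act on the prediction: extra patrols in the hotspot mean
    # more of its (identical) crime gets observed and recorded.
    for hood, rate in TRUE_RATE.items():
        patrols = 10 if hood == predicted_hotspot else 2
        for _ in range(patrols):
            if random.random() < rate:
                arrests[hood] += 1

share_A = arrests["A"] / sum(arrests.values())
print(f"Share of recorded arrests in neighbourhood A: {share_A:.0%}")
```

Because the prediction only ever sees recorded arrests, not true crime, the initial imbalance compounds: neighbourhood A ends up with a growing majority of the arrest record despite being no more crime-prone than B.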


Algorithms aren't racist. Your skin is just too dark.

#artificialintelligence

Lately, I have been in the press discussing the need for more inclusive artificial intelligence and more representative data sets. One way to deal with the challenges of illumination is to train a facial detection system on a set of diverse images with a variety of lighting conditions. My face is visible to the human eye, as is the face of my demonstration partner, but the human eye and the visual cortex that processes its input are far more advanced than a humble web camera. Who has to take extra steps to make technology work?


'Racist' FaceApp beautifying filter lightens skin tone

Daily Mail

When asked to make his picture 'hot', the app lightened his skin and changed the shape of his nose. The app's creators claim it will 'transform your face using Artificial Intelligence', allowing selfie-takers to alter their photos. Earlier this year people accused the popular photo editing app Meitu of being racist and of giving users 'yellow face'. Twitter user Vaughan posted a picture of Kanye West with a filter applied, along with the caption: 'So Meitu's pretty racist'


People call FaceApp racist after it lightens their skin tone

Mashable

The makers of FaceApp are backtracking after users accused the popular face-morphing app of racism. Troublingly, this last option, labeled as "hot," appears to lighten users' skin tone. One user tweeted: "Since I did my face reveal, I also did the face app thing, tell me what you think, also 'hot' me is just me with lighter skin, so racist :P" pic.twitter.com/l2rrVobWyW Mashable reached out to FaceApp founder and CEO Yaroslav Goncharov about the criticism, and he was quick to apologize. This is not the first time that an app has been accused of racism for altering users' appearance.


If Artificial Intelligence Is Taught To Think Like Humans, Then Are Machines Going To Be Sexist, Racist And Discriminatory?

#artificialintelligence

Our devices are connected, personal digital assistants answer our queries, algorithms track our habits and make recommendations, AI is sparking advancements in medicine, cars will soon be driving themselves, and robots will be delivering our pizza. An AI-judged beauty contest went through thousands of selfies and chose 44 fair-skinned faces, and only one dark-skinned face, as winners. Tools are usually designed for men, women's clothing has no pockets, and seat belts were until recently tested only on male dummies, putting women at greater risk in a crash. Artificial intelligence gives us the incredible opportunity to wipe out human bias in decision making.


5 AI Solutions Showing Signs of Racism

#artificialintelligence

Several artificial intelligence projects have been created over the past few years, most of which still had kinks to work out. Multiple AI solutions showed signs of racism once they were deployed in a live environment. It turned out the creators of the AI-driven algorithm powering Pokemon Go neither provided a diverse training set nor spent time in those neighborhoods. It is becoming evident that many of these artificial intelligence solutions show signs of "white supremacy".


Nowhere to hide

BBC News

And Russian app FindFace lets you match a photograph you've taken of someone to their social media profile on the country's popular social media platform Vkontakte. Carl Gohringer, founder and director at Allevate, a facial recognition firm that works with law enforcement, intelligence and government agencies, says: "The amount of media - such as videos and photos - available to us as individuals, organisations and businesses, and to intelligence and law enforcement agencies, is staggering. But Ruth Boardman, data privacy specialist at international law firm Bird & Bird, says individual rights still vary from one EU state to another. And the automation of security vetting decisions based on facial recognition tech raises serious privacy issues.


Cops Have a Database of 117M Faces. You're Probably in It

WIRED

But a new, comprehensive report on the status of facial recognition as a tool in law enforcement shows the sheer scope and reach of the FBI's database of faces and those of state-level law enforcement agencies: Roughly half of American adults are included in those collections. The 150-page report, released on Tuesday by the Center for Privacy & Technology at the Georgetown University law school, found that law enforcement databases now include the facial recognition information of 117 million Americans, about one in two U.S. adults. Meanwhile, since law enforcement facial recognition systems often include mug shots, and arrest rates among African Americans are higher than among the general population, algorithms may be disproportionately able to find a match for black suspects. In reaction to the report, a coalition of more than 40 civil rights and civil liberties groups, including the American Civil Liberties Union and The Leadership Conference on Civil and Human Rights, launched an initiative on Tuesday asking the Department of Justice's Civil Rights Division to evaluate current use of facial recognition technology around the country.