People think robots are stupid, new study finds

#artificialintelligence

Dang robots are crummy at so many jobs, and they tell lousy jokes to boot. In two new studies, these were common biases human participants held toward robots. The studies were originally designed to test for gender bias: whether people thought a robot perceived as female would be less competent at some jobs than one perceived as male, and vice versa. The studies' titles even included the words "gender," "stereotypes," and "preference," but researchers at the Georgia Institute of Technology found no significant sexism against the machines. There was only a slight, statistically non-significant difference for a couple of jobs.


The Genderless Digital Voice the World Needs Right Now

WIRED

Open the settings for your digital voice assistant of choice and you're likely to find two options for the gender you prefer interacting with: male or female. The problem is, that binary choice isn't an accurate representation of the complexities of gender. Some folks don't identify as either male or female, and they may want their voice assistant to mirror that identity. But a group of linguists, technologists, and sound designers--led by Copenhagen Pride and Vice's creative agency Virtue--is on a quest to change that with a new, genderless digital voice, made from real voices, called Q. Q isn't going to show up in your smartphone tomorrow, but the idea is to pressure the tech industry into acknowledging that gender isn't necessarily binary, a matter of man or woman, masculine or feminine. The project is confronting a new digital universe fraught with problems.


Gender and Smart Learning Technologies

#artificialintelligence

How can we tackle gender imbalance in the personalities of AI learning tools? The expected growth in the use of artificial intelligence (AI) in learning applications is raising concerns about both the potential gendering of these tools and the risk that they will display the inherent biases of their developers. To make it easier for us to integrate AI tools and chatbots into our lives, designers often give them human attributes. For example, applications and robots are often given a personality and a gender. Unfortunately, in many cases, gender stereotypes are being perpetuated.


Robot gender: Is it bad for human women?

#artificialintelligence

The creators of robots, then, have both a fantastic opportunity and a very real responsibility to consider what gender means as they design the machines that are becoming increasingly present in our hospitals, our schools, our homes, and our public spaces at large. Some researchers suggest gender stereotypes could be beneficial for robot interfacing by, for example, capitalizing on our tendency to be more comfortable with women as caretakers. More feminine home health care robots could put patients at ease. But that might be a dangerous path, one that's antithetical to the decades of ongoing work to bring women into fields like business, politics, and particularly science and technology. If robots with a feminine appearance are built only when someone wants a sexbot or an in-home maid--leaving masculine robots with all the heavy lifting--what does that say to the flesh-and-blood humans who work with them?


Is AI Sexist?

#artificialintelligence

It started as a seemingly sweet Twitter chatbot. Modeled after a millennial, it awakened on the internet from behind a pixelated image of a full-lipped young female with a wide and staring gaze. Microsoft, the multinational technology company that created the bot, named it Tay, assigned it a gender, and gave "her" account a tagline that promised, "The more you talk the smarter Tay gets!" She brimmed with enthusiasm: "can i just say that im stoked to meet u? humans are super cool." She asked innocent questions: "Why isn't #NationalPuppyDay everyday?" Tay's designers built her to be a creature of the web, reliant on artificial intelligence (AI) to learn, to engage in human conversations, and to get better at them by interacting with people over social media. As the day went on, Tay gained followers. She also quickly fell prey to Twitter users targeting her vulnerabilities. For internet antagonists looking to manipulate Tay, it didn't take much effort; they engaged the bot in ugly conversations, tricking the technology into mimicking their racist and sexist behavior.