Collaborating Authors

Alexa, Google, Siri: What are Your Pronouns? Gender and Anthropomorphism in the Design and Perception of Conversational Assistants Artificial Intelligence

Technology companies have produced varied responses to concerns about the effects of the design of their conversational AI systems. Some have claimed that their voice assistants are in fact not gendered or human-like -- despite design features suggesting the contrary. We compare these claims to user perceptions by analysing the pronouns users employ when referring to AI assistants. We also examine the systems' responses and the extent to which they generate output that is gendered and anthropomorphic. We find that, while some companies appear to be addressing the ethical concerns raised, in some cases their claims do not hold true. In particular, our results show that system outputs are ambiguous as to the humanness of the systems, and that users tend to personify and gender them as a result.

Improving Humanness of Virtual Agents and Users' Cooperation through Emotions Artificial Intelligence

In this paper, we analyze the performance of an agent developed according to a well-accepted appraisal theory of human emotion with respect to how it modulates play in the context of a social dilemma. We ask whether the agent is capable of generating interactions that are considered more human than machine-like. We conduct an experiment with 117 participants and show how participants rate our agent on dimensions of human-uniqueness (which separates humans from animals) and human-nature (which separates humans from machines). We show that our appraisal-theoretic agent is perceived as more human-like than baseline models, significantly improving both the human-nature and human-uniqueness aspects of the intelligent agent. We also show that perception of humanness positively affects enjoyment and cooperation in the social dilemma.

People think robots are stupid, new study finds
Dang, robots are crummy at so many jobs, and they tell lousy jokes to boot. In two new studies, these were common biases human participants held toward robots. The studies were originally intended to test for gender bias -- that is, whether people thought a robot believed to be female might be less competent at some jobs than a robot believed to be male, and vice versa. The studies' titles even included the words "gender," "stereotypes," and "preference," but researchers at the Georgia Institute of Technology found no significant sexism against the machines. There was only a very slight, non-significant difference for a couple of jobs.

What It Means To Grow Up As A Sex Object

Huffington Post - Tech news and opinion

There is absolutely nothing sexy about being an object. And Jessica Valenti's new memoir, Sex Object, makes that painfully clear. The book, which uses Valenti's own experiences to explore the harassment, sexual objectification, and dehumanization that women and girls face on a daily basis, is essentially a 204-page lesson in the power of women's stories. When women aren't heard from -- or when the culture they live in does its best to ignore their words and experiences -- it becomes easier for people to pretend they aren't fully realized persons. "Thinking of women as not full human beings makes it easy to flash them, it makes it easy to call them names or write them a harassing email," Valenti told The Huffington Post.