Tetris reveals how people respond to an unfair AI algorithm

AIHub 

An experiment in which two people play a modified version of Tetris – the 40-year-old block-stacking video game – revealed that players who get fewer turns perceive the other player as less likable, regardless of whether a person or an algorithm allocates the turns.

"We expected that people working in a team would care if they are treated unfairly by another human or an AI," said Malte Jung, associate professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, whose group conducted the study.

Most studies on algorithmic fairness focus on the algorithm or the decision itself, but Jung sought to explore the relationships among the people affected by the decisions.

"We are starting to see a lot of situations in which AI makes decisions on how resources should be distributed among people," Jung said. "We want to understand how that influences the way people perceive one another and behave towards each other. We see more and more evidence that machines mess with the way we interact with each other."
