In her new book Race After Technology: Abolitionist Tools for the New Jim Code, Ruha Benjamin breaks down the "New Jim Code": technology design that promises a utopian future but serves racial hierarchies and racial bias. When people change how they speak or act in order to conform to dominant norms, we call it "code-switching." And, like other types of codes, the practice of code-switching is power-laden. Justine Cassell, a professor at Carnegie Mellon's Human-Computer Interaction Institute, creates educational programs for children. She found that avatars using African American Vernacular English led Black children "to achieve better results in teaching scientific concepts than when the computer spoke in standard English." But when it came to tutoring the children for class presentations, she explained: "We wanted it [the avatar] to practice with them in 'proper English.' Standard American English is still the code of power, so we needed to develop an agent that would train them in code-switching."
People have a tendency to treat technology and data as neutral, sterile and immune to mortal failings. Yet the digital tools we use at school, work and home don't simply fall from the sky; humans produce them. And that means human biases can and do slip right into the algorithms that have increasing power over our lives. This week for the podcast, we're talking with someone who's questioning the assumptions embedded in the technology and data we use in education, health care, law enforcement and beyond. That guest is Ruha Benjamin, associate professor of African American Studies at Princeton University and author of the new book "Race After Technology."
Ruha Benjamin is an associate professor of African American studies at Princeton University, and lectures on the intersection of race, justice and technology. She founded the Just Data Lab, which aims to bring together activists, technologists and artists to reassess how data can be used for justice. Her latest book, Race After Technology, looks at how the design of technology can be discriminatory. Where did the motivation to write this book come from? It seems like we're looking to outsource decisions to technology, on the assumption that it will make better decisions than we do.
The coronavirus pandemic, protests over police killings and systemic racism, and a contentious election have created the perfect storm for misinformation on social media. But don't expect AI to save us. Twitter's recent decision to red-flag President Donald Trump's false claims about mail-in ballots has reinvigorated the debate on whether social media platforms should fact-check posts. In response, tech leaders explored the idea of using open-source, fully automated fact-checking technology to solve the problem.
A video that shows an automatic bathroom soap dispenser failing to detect the hand of a dark-skinned man has gone viral and raised questions about racism in technology, as well as the lack of diversity in the industry that creates it. The now-viral video was uploaded to Twitter on Wednesday by Chukwuemeka Afigbo, Facebook's head of platform partnerships in the Middle East and Africa. He tweeted: 'If you have ever had a problem grasping the importance of diversity in tech and its impact on society, watch this video.' The video begins with a white man waving his hand under the dispenser and instantly getting soap on his first try. Then, a darker-skinned man waves his hand under the dispenser in various directions for ten seconds, but no soap is ever released.