Whilst the main focus of Noble (2018) is on Google's algorithms and how its search engine reinforces racism and sexism, a wider social issue that can be deduced from her book is the growth of artificial intelligence technologies, which may have negative consequences for society despite the assistance they provide in our everyday lives. Noble (2018) believes that artificial intelligence will become a major human rights issue in the twenty-first century. Noble spends some time examining the reasons for Google's racist and sexist search engine results. Her explanations range from paid advertising (which I discussed in my first blog post) to errors produced by Google's automated algorithms. These errors can have serious repercussions for women because, as Halavais points out, search engines often help us with everyday enquiries, and we therefore tend to trust the results that appear without questioning them (as cited in Noble, 2018, p. 25).
The internet might seem like a level playing field, but it isn't. Safiya Umoja Noble came face to face with that fact one day when she used Google's search engine to look for subjects her nieces might find interesting. She entered the term "black girls" and came back with pages dominated by pornography.
Google image searches for "three black teenagers" and "three white teenagers" return very different results, raising troubling questions about how racial bias in society and the media is reflected online. Kabir Alli, an 18-year-old graduating senior from Clover Hill High School in Midlothian, Va., posted a video clip on Twitter of a Google image search for "three black teenagers", which turned up an array of police mugshots. He and his friends then searched for "three white teenagers" and found groups of smiling young people. "I had actually heard about this search from one of my friends and just wanted to see everything for myself," he said.
This kind of algorithmic bias can also be found in machine learning tools that predict whether or not criminals will re-offend, tools which rate black offenders as much more likely to re-offend.