Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble
My rating: 5 of 5 stars
I found out about this book through a series of events: a woman I follow on Twitter tweeted about a talk one of her peers gave, and in that talk she referenced this book. I plan to share this book any time I see someone claiming that coding is “color blind” and that past experience and history don’t matter.
One of the things that I’ve learned from doing Agile software development is that outcomes are more important than the specifics of the code. This book focuses on the outcomes of search engine algorithms and explains how those algorithms drive racist results.
The author, Safiya Umoja Noble, does a fantastic job explaining the basis of her research: querying Google for “black girls” and getting a page full of porn. She verified the results monthly for almost six years. She meticulously explains that these results are racist, sexist, and the responsibility of Google.
Noble explains why Google believes these results are not biased: the company argues that it objectively applies an algorithm to determine which links appear where in its search results. These results, Google argues, reflect what is popular, and popularity makes the content “true.” The basis for Google search is library science, which uses citations to determine the relatedness between pieces of content and how important specific works are, regardless of context or whether the citation is a negative or positive attribution. This is called bibliometrics (I did some of this work while earning my Masters).
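To make the context-blindness of citation-based ranking concrete, here is a minimal, purely illustrative sketch (this is not Google’s actual algorithm, and the function and data names are my own invention): a naive in-link count, the simplest bibliometric measure, which treats every link as a vote regardless of whether the linking page is citing the target approvingly or critically.

```python
# Illustrative sketch of naive citation-count ranking (NOT Google's
# real algorithm). Every incoming link counts as a vote for the
# target, with no notion of context, accuracy, or whether the
# citation is positive or negative -- the blindness Noble describes.

def rank_by_inlinks(links):
    """links: list of (source, target) pairs.
    Returns targets sorted by raw in-link count, most 'important' first."""
    counts = {}
    for _source, target in links:
        counts[target] = counts.get(target, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)

# Hypothetical link graph: the heavily linked page outranks the
# accurate one, because popularity is the only signal measured.
links = [
    ("a", "popular"), ("b", "popular"), ("c", "popular"),
    ("a", "accurate"),
]
print(rank_by_inlinks(links))  # ['popular', 'accurate']
```

The point of the sketch is that nothing in the scoring asks whether a result is true, harmful, or representative; “most linked” is silently equated with “most relevant,” which is exactly the leap Noble challenges.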
To my surprise, Noble showed me that the Dewey Decimal System was racist, sexist, and Eurocentric. I was taught that this was simply the way libraries were organized, and that it was the best way to organize books. Noble explains, briefly, that the system still used racist categories like “Black Question” and similar classifications. The US Library of Congress uses comparable subject headings. This is problematic because other major library systems take their lead from the US and follow suit.
Google claims that it cannot control its outcomes; however, Noble points out, repeatedly, that the company will modify its search results in response to criticism or serious problems. Furthermore, Noble explains that within five months of her article in Bitch Magazine, the results for “black girls” had shifted significantly.
I believe that outcomes are the responsibility of engineers, testers, and product owners. Algorithms are not neutral technologies. Hell, even bridges can be non-neutral technologies, as they can be shaped to prevent buses from using them (which happened in NYC https://en.wikipedia.org/wiki/Robert_…). So the belief, held by some engineers and leaders, that software projects are always neutral is itself flawed.
The best solution is to hire Black women, Black people, Latinas, and other minorities. Furthermore, it’s important to let those developers significantly impact the outcomes of the product; otherwise, having them as employees will not be effective.