Algorithms Of Oppression

''Algorithms of Oppression: How Search Engines Reinforce Racism'' is a 2018 book by Safiya Umoja Noble in the fields of information science, machine learning, and human-computer interaction.


Background

Noble earned an undergraduate degree in sociology from California State University, Fresno in the 1990s, then worked in advertising and marketing for fifteen years before earning a Master of Library and Information Science degree at the University of Illinois Urbana-Champaign in the early 2000s. The book's first inspiration came in 2011, when Noble Googled the phrase "black girls" and saw pornographic results on the first page. Noble's doctoral thesis, completed in 2012, was titled "Searching for Black girls: Old traditions in new media," and it was around this time that she settled on "Algorithms of Oppression" as the title of the eventual book. Changes to Google's algorithm have since altered the most common results for a search of "black girls," though the underlying biases remain influential. Noble became an assistant professor at the University of California, Los Angeles in 2014. In 2017, she published an article on racist and sexist bias in search engines in The Chronicle of Higher Education. The book was published on February 20, 2018.


Overview

''Algorithms of Oppression'' is based on over six years of academic research on Google search algorithms, examining search results from 2009 to 2015. The book addresses the relationship between search engines and discriminatory biases. Noble argues that search algorithms are racist and perpetuate societal problems because they reflect the negative biases that exist in society and in the people who create them. Noble dismantles the idea that search engines are inherently neutral by explaining how search algorithms privilege whiteness, returning positive cues when keywords like “white” are searched as opposed to “Asian,” “Hispanic,” or “Black.” Her main example contrasts the search results for "black girls" versus "white girls" and the biases depicted in those results. These algorithms can carry negative biases against women of color and other marginalized populations, while also affecting Internet users in general by leading to "racial and gender profiling, misrepresentation, and even economic redlining." The book argues that algorithms perpetuate oppression and discriminate against people of color, specifically women of color. Noble takes a Black intersectional feminist approach to studying how Google's algorithms affect people differently by race and gender. Intersectional feminism takes into account the diverse experiences of women of different races and sexualities when discussing their oppression in society, and how their distinct backgrounds shape their struggles. Additionally, Noble addresses how racism infiltrates the Google algorithm itself, as it does many other computational systems, including facial recognition and medical care programs. While many new technological systems promote themselves as progressive and unbiased, Noble argues that many technologies, including Google's algorithm, "reflect and reproduce existing inequities."


Chapter Summaries


Chapter 1

In Chapter 1 of ''Algorithms of Oppression'', Noble explores how Google Search's autosuggestion feature can be demoralizing. On September 18, 2011, a mother Googled “black girls” while trying to find fun activities to show her stepdaughter and nieces. To her surprise, the results encompassed websites and images of pornography. This result exemplifies the data failures specific to people of color and women, which Noble terms algorithmic oppression. Noble adds that, as a society, we must adopt a feminist lens, with racial awareness, to understand the “problematic positions about the benign instrumentality of technologies.” Noble also discusses how Google could apply human curation to the first page of results to eliminate any potential racial slurs or inappropriate imagery. Another example discussed in this chapter is the public dispute over the results returned when “Jew” was searched on Google. The results included a number of anti-Semitic pages, and Google claimed little ownership for the way it presented these identities. Google instead encouraged people to search for “Jews” or “Jewish people” and claimed that the actions of white supremacist groups were out of its control. Unless pages are unlawful, Google allows its algorithm to continue acting without removing them. Noble then reflects on AdWords, Google's advertising tool, and how it can add to the biases on Google. AdWords allows anyone to advertise on Google's search pages and is highly customizable. Google first ranks ads on relevance, then displays them on pages it believes are relevant to the search query taking place. Advertisers can also set a maximum amount of money to spend on advertising per day; the more they spend, the higher the probability that their ads appear near the top. Therefore, an advertiser who is passionate about a controversial topic may see it appear first in a Google search.


Chapter 2

In Chapter 2 of ''Algorithms of Oppression'', Noble explains how Google has exacerbated racism and how it continues to deny responsibility for it. Google puts the blame on those who create the content as well as those who actively seek this information. Google's algorithm has maintained social inequalities and stereotypes for Black, Latina, and Asian women, due in part to Google's design and infrastructure, which normalizes whiteness and men. She explains that the Google algorithm categorizes information in ways that exacerbate stereotypes while also encouraging white hegemonic norms. Noble found that after searching for "black girls," the first results were common stereotypes of black girls, or the categories that Google created based on its own idea of a black girl. Google hides behind an algorithm that has been shown to perpetuate inequalities.


Chapter 3

In Chapter 3 of ''Algorithms of Oppression'', Noble discusses how Google's search engine combines multiple sources to create threatening narratives about minorities. She presents a case study in which she searched “black on white crimes” on Google. Noble highlights that the sources found after the search pointed to conservative outlets that skewed information, displaying racist and anti-Black material from white supremacist sources. Ultimately, she argues that this readily available, false information fueled the actions of white supremacist Dylann Roof, who committed the 2015 Charleston church massacre.


Chapter 4

In Chapter 4 of ''Algorithms of Oppression'', Noble furthers her argument by discussing the way in which Google has oppressive control over identity. This chapter highlights multiple examples of women being shamed for their activity in the porn industry, regardless of whether it was consensual. She critiques the internet's ability to influence one's future because of its permanent nature, and compares U.S. privacy laws to those of the European Union, which give citizens “the right to forget or be forgotten.” When people use search engines such as Google, these breaches of privacy disproportionately affect women and people of color. Google claims that it safeguards our data in order to protect us from losing our information, but fails to address what happens when you want your data to be deleted.


Chapter 5

In Chapter 5 of ''Algorithms of Oppression'', Noble moves the discussion away from Google and onto other information sources deemed credible and neutral. Noble says that prominent libraries, including the Library of Congress, promote whiteness, heteronormativity, patriarchy, and other societal standards as correct, and alternatives as problematic. She illustrates this problem with a case involving Dartmouth College and the Library of Congress, in which the "student-led organization the Coalition for Immigration Reform, Equality (CoFired) and DREAMers" engaged in a two-year battle to change the Library's terminology from 'illegal aliens' to 'noncitizen' or 'unauthorised immigrants.' Noble then discusses the problems that ensue from misrepresentation and classification, which allows her to underscore the importance of contextualisation. Noble argues that it is not just Google but all digital search engines that reinforce societal structures and discriminatory biases, and in doing so she points out just how interconnected technology and society are.


Chapter 6

In Chapter 6 of ''Algorithms of Oppression'', Noble discusses possible solutions to the problem of algorithmic bias. She first argues that public policies enacted by local and federal governments will reduce Google's “information monopoly” and regulate the ways in which search engines filter their results. She insists that governments and corporations bear the most responsibility to reform the systemic issues leading to algorithmic bias. At the same time, Noble condemns the common neoliberal argument that algorithmic biases will disappear if more women and racial minorities enter the industry as software engineers. She calls this argument “complacent” because it places responsibility on individuals, who have less power than media companies, and because it indulges a mindset she calls “big-data optimism”: a failure to recognize that institutions do not always solve, and sometimes perpetuate, inequalities. To illustrate this point, she uses the example of Kandis, a Black hairdresser whose business suffers because the review site Yelp has used biased advertising practices and search strategies against her. She closes the chapter by calling on the Federal Communications Commission (FCC) and the Federal Trade Commission (FTC) to “regulate decency,” that is, to limit the amount of racist, homophobic, or prejudiced rhetoric on the Internet. She urges the public to shy away from “colorblind” ideologies toward race because they have historically erased the struggles faced by racial minorities. Lastly, she points out that big-data optimism leaves out discussion of the harms that big data can disproportionately inflict on minority communities.


Conclusion

In ''Algorithms of Oppression'', Noble explores the social and political implications of the results of our Google searches and of our search patterns online. Noble challenges the idea of the internet as a fully democratic or post-racial environment. Each chapter examines a different layer of the algorithmic biases formed by search engines. By outlining crucial points and theories throughout the book, ''Algorithms of Oppression'' is not limited to academic readers, allowing Noble's writing to reach a wider and more inclusive audience.


Critical reception

Critical reception for ''Algorithms of Oppression'' has been largely positive. In the ''Los Angeles Review of Books'', Emily Drabinski writes, "What emerges from these pages is the sense that Google’s algorithms of oppression comprise just one of the hidden infrastructures that govern our daily lives, and that the others are likely just as hard-coded with white supremacy and misogyny as the one that Noble explores." In ''PopMatters'', Hans Rollman writes that ''Algorithms of Oppression'' "demonstrate[s] that search engines, and in particular Google, are not simply imperfect machines, but systems designed by humans in ways that replicate the power structures of the western countries where they are built, complete with all the sexism and racism that are built into those structures." In ''Booklist'', reviewer Lesley Williams states, "Noble’s study should prompt some soul-searching about our reliance on commercial search engines and about digital social equity." In early February 2018, ''Algorithms of Oppression'' received press attention when the official Twitter account of the Institute of Electrical and Electronics Engineers (IEEE) criticized the book, saying that the results of a Google search suggested in its blurb did not match Noble's predictions. IEEE's outreach historian, Alexander Magoun, later revealed that he had not read the book and issued an apology.


See also

* Algorithmic bias
* Techlash


References

{{reflist}}


External links


Algorithms of Oppression: How Search Engines Reinforce Racism