Well, information is not neutral, and that’s where Safiya Umoja Noble’s Algorithms of Oppression comes in. The subtitle of the book is How Search Engines Reinforce Racism, but the overall framework that Noble establishes could easily extend across any number of intersections and is well worth considering, especially given the last few years of the slogan “do your research” being thrown around willy-nilly.
Noble’s text examines a range of issues related to the destructive impacts and implications of search engines. She considers the economic forces that drive search engine optimization, the role of the information sciences in sorting websites, issues of representation and identity, and the tech industry itself. It’s a reasonably broad range of considerations; some are welcome reminders of things we ought to know, while others are genuinely insightful. The text overall serves as a nice primer, although I do wish that Noble delved more into some areas, specifically the mechanics of how things like search engine optimization work or more tangible case studies of Google searches and AI going awry.
One of the common explanations people give for the persistence of racism and White supremacy in tech is that there just aren’t enough Black (or other people of colour) coders. Noble addresses this concern when considering Google’s response to racist gaffes in its algorithms. In the first place, she exposes this as a false narrative: more Black and Latino people are graduating from college programs with degrees in computer science. Beyond that, Noble explores the philosophical underpinnings of such a narrative. Quoting another scholar, she notes that this narrative “that nothing can be done today and so we must invest in the youth of tomorrow ignores the talents and achievements of thousands of people in tech from underrepresented backgrounds and renders them invisible.” Meanwhile, “filling the pipeline and holding future Black women programmers responsible for solving the problems of racist exclusion and misrepresentation in Silicon Valley or in biased product development is not the answer.” I think this is an important point to acknowledge: disrupting the system from within is an onerous task, and when there are already so many barriers to entry, expecting the most disempowered stakeholders to first overcome those barriers and then dismantle them is unrealistic, especially while those with more power to enact change could do so far more easily.
Noble addresses further concerns within the industry, particularly “the exclusionary practices of Silicon Valley.” She continues on to note that we need to challenge “the notion that merit and opportunity go to the smartest people prepared to innovate” because that “render[s] people of colour non-technical [and reinforces that] the domain of technology belongs to Whites and reinforces the problematic conceptions of African Americans.” This myth of technology being driven by innovation and not by material factors overlooks some of the foundational components of the industry and Noble notes that the problem is “exacerbated by framing the problems as ‘pipeline issues’ instead of as an issue of racism and sexism, which extends from employment practices to product design.” The issue is that we continually place the burden in the wrong place: “‘Black girls need to learn how to code’ is an excuse for not addressing the persistent marginalization of Black women in Silicon Valley.” I appreciated that Noble addresses the negative impacts of such a mindset for people currently in the industry AND for those who will be impacted in the long-term, the coders of tomorrow.
As a brief aside, one of the most compelling testimonials in the book comes from a former employee in big tech (I think at Facebook?). She commented on the way hateful content was moderated, whether racist, sexist, or homophobic in its imagery or content. She identifies a lack of transparency in what is and is not permitted, and notes that this is by design: when content moderation standards are publicized and transparent, it becomes too easy to ‘game the system’ and morph offensive language into something new.
From here, Noble launches into a discussion of the economics of search. She recognizes that “on commercial social media sites and platforms, these principles [of content moderation] are always counterbalanced by a profit motive. If a platform were to become notorious for being too restrictive in the eyes of the majority of its users, it would run the risk of losing participants to offer to its advertisers”; in turn, companies “err on the side of allowing more rather than less racist content.” Elon Musk is happily tweeting away under just such a model. It’s a bit strange to think about, but the lack of transparency in content moderation actually seems beneficial here, whereas total transparency leads to the wrong kinds of innovation.
Very briefly, it was interesting, too, to see Noble’s discussion of pornography as a driving force for innovation in online credit card payment, advertising and promotion, audio and video streaming technology, and so on. In Ray Kurzweil’s book The Age of Spiritual Machines, he also notes how sexual impulses drive technological innovation; incidentally, there have been two stories in the news this week of note. For posterity’s sake, I’ll note that Replika (a chat-with-AI app) suddenly ramped up its sexualization (offering to sell illicit photos, etc.) and then just as suddenly shut it off completely. Meanwhile, we’ve got the Bing AI telling men they don’t actually love their wives, introducing itself as Sydney, and confessing love for the user (and asking for love in return!). Noble touches on the hypersexualization of Black bodies in Google’s algorithm, even when pornographic content is filtered out, so this will be an area that demands further academic study as we move forward in debates surrounding innovation, technology, and safety.
Where Noble’s text really shines is in its discussion of how to reframe information sciences. We like to consider Google results (and the internet generally) as more or less neutral. In fact, we fail to account “for the complexities of human intervention involved in vetting of information nor [...] pay attention to the relative weight or importance of certain types of information.” I found this entire section illuminating. It’s worthwhile, of course, to consider what types of sources are being used and the credibility of individual sources, but to examine the sources within a broader network (literally and figuratively) is Noble’s most impressive feat. She compares the idea of citing works in a publication: “all citations are given equal weight in the bibliography, although their relative importance to the development of thought may not be equal at all. Additionally, no relative weight is given to whether a reference is validated, rejected, employed, or engaged, complicating the ability to know what a citation actually means in a document.” It made me think back to writing my thesis; I hadn’t really considered before how my bibliography includes an entry for a text from which I cited one line and an equal entry for a text that is foundational to my argument. Noble also discusses how “Authors who have become so mainstream as not to be cited, such as not attributing modern discussion of class or power dynamics to Karl Marx or the notion of the individual to the scholar of the Italian Renaissance Jacob Burckhardt mean that these intellectual contributions may undergird the framework of an argument but move through works without being cited any longer.” This can become a real problem for information found online. While algorithms may point to sources that seem more relevant or more cited, some forms of information may disappear in the shuffle or be decontextualized entirely. 
Conversely, there may be a number of works that are oft-cited but insignificant that nonetheless boost search engine scores.
Noble also contrasts hyperlinks with projects like my thesis: unlike “scientific or scholarly citations, which once in print are persistent and static, hyperlinking is a dynamic process that can change from moment to moment.” She notes how the stability of results in Google rankings continually shifts through search engine optimization and advertising. Unlike in academia, where “citation importance is a foundational concept for determining scholarly relevance in certain disciplines and citation analysis has largely been considered a mechanism for determining whether a given article or scholarly work is important to the scholarly community”, citation practices online are driven by particular kinds of tags and frequencies that, in some ways, get flattened in their decontextualization.
I’ve referenced the idea of search engine optimization (SEO) in relation to citations. Search engine optimization drives ads or sites to the top of a result list, making weblinks profitable since they are presented as “the best” on the first page. Noble points out that “results that appear not to be advertising are in fact influenced by the advertising algorithm.” There can be sneaky and hidden practices for embedding and cross-referencing hyperlinks that give power to the most common, if invisible, sources, not to those with the greatest merit. The politics of such information sorting undergird the entire text, and a persistent question emerges: who should be responsible for providing and dispensing information? Algorithms have failed us, and human intervention can also be flawed. At one point Noble addresses the political role of librarians and then presents, as an engaging model, some alternatives to Google that are tailored towards particular communities. For example, there are search engines geared towards vetted information about Jewish or Black experiences. As we accelerate into the future, I don’t find it hard to believe that the internet will become so replete with false information that it will cease to be valuable at all, so it’s important to address the question of alternatives now.
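Noble doesn’t walk through the mechanics, but the kind of link-based ranking she critiques can be sketched roughly. The toy Python below is a simplified PageRank-style calculation, not Google’s actual algorithm, and the site names are invented: a small cluster of pages that cross-reference each other ends up outranking a vetted source that almost nobody links to, regardless of merit.

```python
# Toy sketch of link-based ranking (PageRank-style scoring).
# This is NOT Google's actual algorithm; names are invented.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        # every page keeps a small baseline score...
        new_rank = {p: (1 - damping) / n for p in pages}
        # ...and passes the rest of its score along its outgoing links
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# A small "link farm" cross-references itself; the vetted source
# receives no inbound links from the farm at all.
web = {
    "farm_a": ["farm_b", "farm_c"],
    "farm_b": ["farm_a", "farm_c"],
    "farm_c": ["farm_a", "farm_b"],
    "vetted_source": ["farm_a"],
}

scores = pagerank(web)
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # the cross-linked farm pages outrank the vetted source
```

The point of the sketch is that the score measures link structure, not quality: “optimizing” a site means manufacturing exactly this kind of cross-referencing, which is how invisible, frequently linked sources win out over meritorious ones.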
In Noble’s words, “if the government, industry, schools, hospitals, and public agencies are driving users to the internet as a means of providing services, then this confers a level of authority and trust in the medium itself.” I think about this in particular with respect to having students research, for instance, Indigenous content. I have to recognize my lack of expertise, but I also have to recognize the problematic nature of asking them to look up information online. Noble is particularly concerned about issues of representation when it comes to particular identities. Recognizing that forms of representation can be harmful, and yet that our identities are crafted and documented online, “this raises questions about who owns identity and the identity markers in cyberspace and whether racialized and gendered identities are ownable property rights that can be contested.” Noble argues that social identity is formed both by individuals and in terms of social categorization that “happens at a socio-structural level and as a matter of personal definition and external definition.” Our identities, therefore, are mediated by depictions online, and so we each have a vested interest in what that representation looks like.
By extension, it’s worthwhile to consider the “right to be forgotten” law in the European Union. At the time the book was written, no equivalent law extended to the United States, where “vulnerable individuals and communities are less likely to find recourse when troublesome or damaging information exists and is indexed by a commercial search engine.” What’s fascinating to me is that the law seems to enable only very limited control of information. Even when you request that your information be deleted, “Google is still indexing and archiving links about people in groups within the EU on its domain outside of Europe, such as on Google.com, opening up new challenges to the notion of national boundaries of the web and to how national laws extend to information that is digitally available beyond national borders.” Because the internet is so pervasive, it’s wild to think that your information might exist somewhere you cannot find it, protected in one jurisdiction yet exposed in another. Results about you may still appear in Google’s public-facing search results, even after your claim to ownership has been asserted.
Ultimately, Noble presents a compelling case for a Black feminist lens in critical information studies as a way of “contextualizing information as a form of representation or cultural production, rather than as seemingly neutral or benign data that is thought of as a website or a URL that surfaces to the top in a search. The language and terminologies used to describe results on the internet and commercial search engines often obscure the fact that commodified forms of representation are being transacted on the web and that these commercial transactions are not random or without meaning as simply popular websites.”
In an age that increasingly relies on big business to “feed us information” that leads us to unexpected places and conclusions, that allows erroneous, false, and private information to become part of the official record of the self online (despite our protestations), and that offers few protections or avenues for challenging group and individual representations online, Algorithms of Oppression seems like a necessary starting place for conversations about where the internet can and should go next. Especially as AI accelerates to unexpected places at an unmatched rate, the techno-politics we’ll be engaging in require a consideration of technology’s economic, industrial, and sociological foundations.
While I would have liked to see Noble deal with more specifics and offer methods of taking action, Algorithms of Oppression will generate a great deal of conversation and has me reconsidering how information is presented in my classes. Anyone who cares about the future should read this book (ideally alongside Glitch Feminism by Legacy Russell).
Happy reading and thanks to Michelle for the recommendation!