Race After Technology Part 2
As part of your reflection, based on the self-portrait app presentations, comment on how your experience with technology is similar or different from your peers'.
Race After Tech Reading
Please read Chapters 1 and 2 of Race After Technology. Below are some discussion questions to consider in your reflection:
- What is the difference between the types of racist technology discussed in Chapter 1 versus Chapter 2? Why do you think Dr. Benjamin makes a distinction between these two types?
- If both algorithms and people discriminate, which type of discrimination is preferable? What are the benefits and disadvantages of each?
- Dr. Benjamin writes: "As machines become more 'intelligent,' that is, as they learn to think more like humans, they are likely to become more racist." Why might we expect computers to become more or less racist as they increase in their capabilities?
- Dr. Benjamin argues that simply diversifying the tech work force will not stop algorithms from being racist. Yet this is where most of the focus is. What kinds of progress can be made by diversifying the workforce? What kinds of problems will not be solved by diversifying the workforce?
- What are some of the "glitches" that Dr. Benjamin discusses, and what do they signal about the deeper biases of technology? What do they signal about the deeper biases of our society?
- Dr. Benjamin describes (famous) people who say racist things as "racial glitches." In what way are these glitches similar to technological glitches? When Paula Deen says something racist, does our society normally think that signals something deep about our culture, or just about her? How does this reaction compare to, for example, our societal reaction when a Black person commits a crime? How does this relate to technological glitches?
- What other questions do you have from the reading? What were you surprised about? What interested you, or felt particularly important?
Glitch Case Study: Google Search
Watch Safiya Umoja Noble's talk on Algorithms of Oppression. (12:18) Note: The talk discusses pornography, particularly pornography depicting Black girls, but contains no explicit images or descriptions.
Dr. Noble's discovery of this "glitch," in which searching the term "Black girls" returned pornographic results, led her to investigate more systemic problems with Google Search. Here are a few of the issues she identifies and discusses in her research:
- Google's primary objective is not to provide you with the best information, but to sell your information to others, who then try to sell you things.
- Their search algorithm can be manipulated through SEO (search engine optimization) by organizations that have a lot of money. The porn industry is one such group that invests heavily in SEO.
- Google has adjusted its algorithm in response to criticism. For example, if you try searching for "Black girls" on Google today, you will not find pornographic content on the first page. Google also filters content in countries where, for example, neo-Nazism is illegal. This in turn suggests that Google is moderating its content, and is acting more like a newspaper than like a library or service provider.
Take some time to look into Google Search, as well as alternative search engines (e.g. DuckDuckGo, StartPage, Qwant, etc.), and try them out. Some things that you might want to consider:
In your reflection, please discuss:
Further Optional Reading
- Safiya Umoja Noble's book, Algorithms of Oppression, is available from the Middlebury College Library
- Watch an extended version of Dr. Noble's talk