Ruha Benjamin
“This book explores how such technologies, which often pose as objective, scientific, or progressive, too often reinforce racism and other forms of inequity. Together, we will work to decode the powerful assumptions and values embedded in the material and digital architecture of our world.”
Benjamin outlines the objectives of Race After Technology. This quote also demonstrates her literary use of tech puns to describe society: she uses “decode” as a synonym for “understand” and “embedded” to describe how certain values are built into our society.
“The stakes are high not only because parents’ decisions will follow their children for a lifetime, but also because names reflect much longer histories of conflict and assimilation and signal fierce political struggles—as when US immigrants from Eastern Europe anglicize their names, or African Americans at the height of the Black Power movement took Arabic or African names to oppose White supremacy.”
Benjamin unpacks the cultural and political significance of names. By tying names to the histories that engendered them, Benjamin draws connections between the past and the present—a temporal move she repeats throughout the book.
“A ‘normal’ name is just one of many tools that reinforce racial invisibility.”
This anticipates the theme of visibility and exposure. As Chapter 3 will explore, technologies such as cameras and facial recognition often interact differently with Black individuals, making them hypervisible or invisible. Likewise, names can subject a person to more government surveillance.
“Codes, in short, operate within powerful systems of meaning that render some things visible, others invisible, and create a vast array of distortions and dangers.”
Benjamin defines the word “code” beyond the context of technology. Evoking law and naming, she foregrounds a broader understanding of code that encompasses anything with implied meaning.
“Karl Marx might call tech personalization our era’s opium of the masses and encourage us to ‘just say no,’ though he might also point out that not everyone is in an equal position to refuse, owing to existing forms of stratification.”
Intertextuality is a key element of Race After Technology. Here, Benjamin draws connections between her own meditation on social hierarchies and that of social theorist Karl Marx.
“This is why the notion that tech bias is ‘unintentional’ or ‘unconscious’ obscures the reality—that there is no way to create something without some intention and intended user in mind.”
A main argument of Race After Technology is that technology often and easily reproduces the prejudices of its human creators. This quote foreshadows the concept of the “glitch” that Benjamin explores in Chapter 2: small moments of tech bias that appear unintentional indicate a greater, more insidious source.
“But the outsourcing of human decisions is, at once, the insourcing of coded inequity.”
Benjamin challenges the idea that the solution to human problems is to bypass the human. Instead, she calls for accountability and for humans to be responsible programmers and designers who anticipate their own biases rather than assuming technology is immune to them.
“However, a universalizing lens may actually hide many of the dangers of discriminatory design, because in many ways Black people already live in the future. The plight of Black people has consistently been a harbinger of wider processes—bankers using financial technologies to prey on Black homeowners, law enforcement using surveillance technologies to control Black neighborhoods, or politicians using legislative techniques to disenfranchise Black voters—which then get rolled out on an even wider scale.”
Discussions around technology typically involve ideas of the future. When new technologies are pitched and marketed, rhetoric often envisions an idealized era to come, where problems are solved and boundaries are bravely pushed. Benjamin suggests a different rhetorical relationship between technology and the future, where tech used to control Black freedom is a precursor of broader threats to individual freedom.
“Until we come to grips with the ‘reasonableness’ of racism, we will continue to look for it on the bloody floors of Charleston churches and in the dashboard cameras on Texas highways, and overlook it in the smart-sounding logics of textbooks, policy statements, court rulings, science journals, and cutting-edge technologies.”
Benjamin asks the reader to consider the impact of subtle inequities that go unseen because we are distracted by more obvious ones. This quote also demonstrates Benjamin’s awareness of her present moment. By mentioning mass shootings and police brutality, she makes Race After Technology current and timely.
“For those of us who believe in a more egalitarian notion of power, of collective empowerment without domination, how we imagine our relation to robots offers a mirror for thinking through and against race as technology.”
The relationship between robots and humans is an analogue for the master and slave relationship that characterized the pre-emancipation US. Benjamin’s analogy draws connections between the past and present. This quote also captures the community ethos that is central to Race After Technology, asking us to tackle the New Jim Code together while working toward equity across backgrounds and identities.
“For instance, we might reflect upon the fact that the infrared technology of an automated soap dispenser treats certain skin tones as normative and upon the reason why this technology renders Black people invisible when they hope to be seen, while other technologies, for example facial recognition for police surveillance, make them hypervisible when they seek privacy. When we draw different technologies into the same frame, the distinction between “trivial” and “consequential” breaks down and we can begin to understand how Blackness can be both marginal and focal to tech development.”
The theme of visibility is characterized by a seeming paradox: Black people are both seen too much—as with police surveillance—and not seen at all—as with an automated soap dispenser. This creates a situation where privacy and protection are not equally accessible to all.
“And, while the idiom of the New Jim Code draws on the history of racial domination in the United States as a touchstone for technologically mediated injustice, our focus must necessarily reach beyond national borders and trouble the notion that racial discrimination is isolated and limited to one country, when a whole host of cross-cutting social ideologies make that impossible.”
Although the reference to Jim Crow evokes US history, the New Jim Code exceeds the boundaries of the United States because technology does as well. While this might make the name “New Jim Code” seem inadequate, it also evokes legalized racism, an issue with international roots.
“Consider, too, how gender norms are encoded in the value accorded to buying diapers, together with the presumption that parenthood varnishes (and, by extension, childlessness tarnishes) one’s character.”
Chapter 1 explores fictionalized and real examples of social credit systems where individuals can rate each other. Those ratings carry weight for how people are valued by others and the government. While Race After Technology is primarily about racism, this example explores sexism as well. Benjamin implies that the technologies that perpetuate racial prejudices are reinforcing other forms of discrimination.
“Glitches are generally considered a fleeting interruption of an otherwise benign system, not an enduring and constitutive feature of social life. But what if we understand glitches instead to be a slippery place (with reference to the possible Yiddish origin of the word) between fleeting and durable, micro-interactions and macro-structures, individual hate and institutional indifference?”
Academic studies commonly examine the etymology of their key terms to highlight greater meanings behind them. Benjamin does this with the Yiddish roots of the word “glitch,” which expands the word’s meaning.
“But, as with Trinity’s response to Neo in the Matrix regarding his path being crossed twice by a black cat, perhaps if we situated racist “glitches” in the larger complex of social meanings and structures, we too could approach them as a signal rather than as a distraction.”
This reference to The Matrix exemplifies how Benjamin uses popular culture to make her arguments about technology and racism. Like mentions of Twitter or presidential candidacies, The Matrix enables Benjamin to keep her book accessible and relatable to a wide readership while tackling theoretically challenging concepts.
“Some technologies fail to see Blackness, while others render Black people hypervisible and expose them to systems of racial surveillance. Exposure, in this sense, takes on multiple meanings. Exposing film is a delicate process—artful, scientific, and entangled in forms of social and political vulnerability and risk. Who is seen and under what terms holds a mirror onto more far-reaching forms of power and inequality.”
This is another example of Benjamin’s use of puns, or words with double meanings. The choice of the word “exposure” creates a transition into a discussion on the histories of photography and racism.
“This story reveals to us that a key feature of Black life in racist societies is the constant threat of exposure and of being misread; and that being exposed is also a process of enclosure, a form of suffocating social constriction.”
Chapter 3 focuses on “scopic vulnerability” (101), or the dangers that come with being seen. However, the vulnerability comes not only with viewing but also with the viewer’s interpretation of what they have seen. This quote evokes the concept of the stereotype, in which an outside appearance is assumed to carry meaning that is reductive and inaccurate.
“The popular trope that technology is always one step ahead of society is not only misleading but incorrect, when viewed through the lens of enduring invisibility.”
Benjamin challenges the idea of advancement. She insists there are manifestations of older prejudices and inequalities in our rapidly changing technology. Rather than being “one step ahead of society,” technology frequently shows itself to be perfectly in step with our prevailing biases.
“Nothing short of a collective and sustained effort that, like the aforementioned Polaroid Revolutionary Workers’ Movement, draws together those who work inside and outside powerful institutions can begin to counter the many violent exposures underway.”
Benjamin acknowledges that only group effort can succeed in combating the New Jim Code. This runs counter to the individualism popular in American culture, as well as in the tech industry.
“The very solutions to mass incarceration and prison overcrowding, in other words, give rise to innovative forms of injustice. They are, in short, racial fixes that harm even as they purport to help.”
Like the paradoxical visibility and invisibility that Black people are subjected to, tech fixes can also be paradoxical: they promise to provide a solution but create another problem. Benjamin uses paradox to destabilize how we understand our society.
“Returning to electronic monitoring, like other forms of racial fixing, its function is to create vertical realities—surveillance and control for some, security and freedom for others.”
Verticality refers to societal hierarchies, where some have more power than others and where many are subject to the authority of larger institutions. Electronic monitoring (such as ankle bracelets) is one tech fix that promises to protect individuals from having a vertical relationship with prisons. However, it reinforces power dynamics in the individual’s relationship to law enforcement, since monitored people are easier to surveil.
“In one sense, these forms of discriminatory design—engineered inequity, default discrimination, coded exposure, and technological benevolence—fall on a spectrum that ranges from most obvious to oblivious in the way it helps produce social inequity.”
Benjamin asks us to see the coded inequities that tend to be invisible. While hypervisibility can be harmful for Black subjects targeted by the law, making the New Jim Code hypervisible is necessary for deconstructing it.
“Let us shift, then, from technology as an outcome to toolmaking as a practice, so as to consider the many different types of tools needed to resist coded inequity, to build solidarity, and to engender liberation.”
The word “liberation” ties back to Benjamin’s use of “abolition” and its historical breadth. By using the word “liberation,” Benjamin emphasizes how coded inequity is not just discriminatory but challenges people’s very freedom.
“Virtual reality (VR) technology in particular is routinely described as an ‘empathy machine’ because of the way it allows us to move through someone else’s world. Perhaps it does, in some cases. But, as some critics emphasize, this rhetoric creates a moral imperative to sell headsets and to consume human anguish, and in the process ‘pain is repurposed as a site of economic production.’”
Technology often corresponds to a product. When capitalism and social justice are partnered, Benjamin says, financial incentives will frequently eclipse equity. This concern about capitalism connects back to Benjamin’s invocation of social theorist Karl Marx.
“In the breathless race for newer, faster, better technology, what ways of thinking, being, and organizing social life are potentially snuffed out? If design is treated as inherently moving forward, that is, as the solution, have we even agreed upon the problem?”
Race After Technology insists that we slow down our rush for “better technology.” By slowing down, we afford ourselves time to consider how we might be perpetuating bias and to create other, perhaps more human ways of tackling the challenges that we face as a society.