
Shoshana Zuboff

The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power

Nonfiction | Book | Adult | Published in 2018


Part 2: Chapter Summaries & Analyses

Part 2: “The Advance of Surveillance Capitalism”

Part 2, Chapter 7 Summary: “The Reality Business”

The concept of “ubiquitous computing” dominates Chapter 7. Zuboff argues that surveillance capitalism grew so competitive that firms were forced to increase both the amount of data they culled and the quality of their predictive products. Surveillance capitalism thus evolved to adopt a second economic imperative: the prediction imperative. If the extraction imperative involved economies of scale, focusing merely on the amount of data collected, the second imperative sought to incorporate economies of scope, with deeper, more varied methods of data extraction.

However, Zuboff is quick to point out that even this evolution into economies of scope proved insufficient for surveillance capitalism. Firms needed surefire ways to secure profit, and there was still too much risk—or, put another way, human agency—involved with economies of scale and scope. Surveillance capitalists thus turned to economies of action, which aim to actively intervene in one’s life and suggest the direction of one’s choices. This evolution transforms surveillance capitalism into an economic form that relies on a means of behavioral modification, not a means of production.

Zuboff also introduces the concept of the “uncontract.” The uncontract is surveillance capitalism’s image of the future, an alternative mode of contractual agreements and regulations that relies not on human relationships but on automation. Notably, this concept originates with Hal Varian, Google’s Chief Economist and one of the spearhead figures of surveillance capitalism throughout the book. While Varian claims that new technologies simply offer new contracts, Zuboff corrects his language to the uncontract precisely because automation, alongside surveillance capitalism’s incursions into private lives through ubiquitous computing, acts of rendition, and behavioral modification, will change the very fibers of society.

The rest of the chapter describes different types of technologies that illustrate economies of scope and action at work. Zuboff considers telemetry, invented in 1964 by scientists like R. Stuart MacKay to track and observe animal populations in the wild. A combination of biology, engineering, physics, and electronics, telemetry involves tagging animals with tiny sensors that monitor their every movement without detection. This is the template for surveillance capitalism’s goal of ubiquitous computing. MacKay’s work was then taken up by contemporary researchers to transform normal physical spaces into datified, “browse-able” spaces, such as the insides of vehicles and even entire cities. Many car insurance companies hope to rely on telematics to monitor and control drivers’ behavior and rapidly adjust premiums based on a driver’s habits.

Even more concerning for Zuboff is the growing number of “smart cities,” urban areas where a multitude of sensors and voice activation modules collect staggering amounts of data about citizens. The effort to move smart cities from mere concept to reality has already begun, with the first venture tracing back to 2016, when the United States Department of Transportation partnered with technology firm Sidewalk Labs, a subsidiary of Google’s holding company Alphabet, to install a new digital infrastructure in Columbus, Ohio. On the surface, Sidewalk revolutionized Columbus with new technologies such as free Wi-Fi stations and an online parking space “market.” However, each of these services also served as a rigorous behavioral surplus farm, drawing in the personal data of an entire city for Sidewalk Labs’ profit.

Zuboff turns her attention to a 2016 talk that Sidewalk CEO Dan Doctoroff gave at the Yale Club in New York City. Zuboff quotes Doctoroff’s speech, drawing attention to his goals of “replicating the digital experience in physical space [...][to achieve] ubiquitous connectivity. [...] We can actually then target ads to people in proximity, and then obviously over time track them through things like beacons and location services as well as their browsing activity” (220). Doctoroff’s language of ubiquity draws the chapter full circle, with Zuboff observing that his comments reflect Google’s utterly transparent goal of using “smart cities” not to improve people’s lives but to increase market operations and profit off of lived experiences.

Chapter 7 ends by noting that Sidewalk announced partnerships with 16 more American cities in 2016, and in 2017 it partnered with Toronto, Canada, moving Google’s mission of ubiquity into the international sphere.

Part 2, Chapter 8 Summary: “Rendition: From Experience to Data”

This next chapter focuses on the concept of rendition, the name Zuboff uses for the strategies that bring ubiquitous computing to life. One illustration of the dynamics of rendition is the iRobot company’s use of its Roomba. The Roomba, a cleaning robot, is a foundational machine in iRobot’s mission to establish “smart homes,” in which automation systems monitor and control home appliances. In 2017, iRobot drew controversy when its plans were made public: The company aimed to sell floor plans of customer homes drawn from the Roomba’s mapping capabilities to Google, Amazon, or Apple. In preparation for these sales, iRobot revamped the Roomba to include a camera, advanced sensors, and mapping software.

Despite concerns from privacy experts, iRobot got away with such a plan because CEO Colin Angle insisted that it all relied on customer choice; if people didn’t want the floor plans of their homes sold, they could simply opt out. Yet, Zuboff points out, when one tries to opt out, many of the “smart” features of the Roomba, such as phone controls of the vacuum, scheduled cleanings, and automatic software updates—the main reasons that people purchase these devices—stop working. Other smart home products, like the SleepIQ mattress or the Nest thermostat, operate under the same caveat.

Zuboff predicts that homes will be the next target of surveillance capitalism. She refers to the practice of collecting ever more detailed information about citizens as “economies of depth.” Economies of depth will exert their power in everyday lives as companies turn seemingly innocuous objects like toothbrushes, lightbulbs, and ovens into “smart” technology to render citizens’ lives into profitable data. One way in which economies of depth are already well at work in our lives is through smartphones, which are rich sources of location data for surveillance capitalists; indeed, location services are one of the most significant sources of behavioral surplus for Google.

Firms defend their collections of behavioral surplus gleaned from these location strategies by claiming that the data are kept as metadata stored in large, anonymous collections, which cannot be used to identify individuals. However, Zuboff argues that when supposedly anonymous metadata is cross-referenced with public record information like birthday, zip code, or sex, it is easy to “de-anonymize” this location data.

Chapter 8 draws to a close by analyzing the power dynamics at work through rendition. Zuboff refers to surveillance capitalism as an intimidating regime and argues that rendition necessitates ignorant consumers who are blind to the true power dynamics behind services such as smart home technology.

Part 2, Chapter 9 Summary: “Rendition: From the Depths”

The next chapter explores the concept of personalization and its relationship to ubiquitous computing and economies of depth. Personal digital assistants learn from their users and become highly personalized to fit different users’ habits and needs. While personalization is presented as a progressive benefit to consumers, Zuboff argues that personalization is a method used by surveillance capitalism to further secure streams of behavioral surplus data—and mine them at new personal depths.

Unsurprisingly, Google was at the forefront of personal digital assistants. The firm developed Google Now, a digital assistant equipped with a predictive search that combined every system Google had ever invented—including voice search and machine intelligence—to analyze one’s digital habits and predict what a user will ask. Products like Google Now signaled a new type of prediction product that brought surveillance capitalism’s goal of ubiquitous computing to life. In the race to perfect the personal digital assistant, companies began folding in voice recognition technology to deepen user interactions with these devices. Exploitative applications of voice recognition technology did not end with digital personal assistants, however. Companies like Samsung and Vizio sold smart TVs that recorded audio of everything said in their vicinity, and Genesis Toys marketed toys that captured and processed what a child said.

One of the most effective examples of rendition is affective computing, which involves a computer observing humans’ voices and physiological responses and linking them to particular emotions. MIT Professor Rosalind Picard was one prominent intellectual who became invested in affective computing. While Picard admitted that privacy guidelines were necessary to ensure such technology was not abused by governments or others in power, surveillance capitalism latched onto her work and used it to its own benefit. Facebook, for example, applied for an emotion detection patent in 2014 to gauge user interest in different pages and content. Zuboff notes that Picard later co-founded a company called Affectiva, which designed a system of machine intelligence called MindReader in the hopes of creating technology that could identify moods and emotions in autistic children.

Affectiva, however, received significant interest from market research firms and advertisers, and the company took a more commercial turn. Picard led Affectiva to embrace Emotion AI, an artificial intelligence aiming to compile an emotional data repository storing data from video games, conversations, and online videos. Picard even expressed interest in creating an “emotional chip” that could be installed in users’ computers, phones, TVs, and other devices to constantly assess their emotions. Although Picard’s intentions with affective computing started off as benign, Zuboff observes that her vision was “subordinated to the larger aims of surveillance capitalism, [and] the thrust of the affective project changed as if distorted in a fun-house mirror” (276). At the chapter’s conclusion, Picard serves as an example to further emphasize how surveillance capitalism overtakes ingenuity and corrupts it into an exploitative force.

Part 2, Chapter 10 Summary: “Make Them Dance”

Chapter 10 is largely focused on actuation and the impacts of economies of action on human psychology. Building off of the concept of ubiquitous computing, actuation involves ubiquitous action, wherein surveillance capitalism aims to alter one’s choices and behaviors in the real world to increase predictive products’ performances and overall revenue.

The first actuation strategy is tuning, which can include subliminal messages or “nudges.” These occur on websites and user interfaces that are specifically designed to elicit certain patterns of behavior and engagement. The second strategy, herding, is achieved through controlling elements of an environment to force people down a certain path of behavior; for example, a smart TV can be programmed to shut off at a particular hour to make a viewer stop watching and go to bed to encourage healthy sleep schedules. Conditioning is the final strategy of actuation. Inspired by the work of Harvard behaviorist B. F. Skinner, conditioning involves a chain of action in which a stimulus initiates a desired behavior. The behavior is then reinforced, thus shaping specific behavioral patterns.

To illustrate how actuation strategies work, Zuboff turns to the Pokémon Go app, owned by Niantic Labs—which, notably, receives funding from Google and was created by John Hanke, the former head of Google Street View. Hanke identified the mobile app as a significant potential source of behavioral surplus and a promising route to explore actuation methods. The app helped Niantic learn how to condition, herd, and nudge behaviors at an unprecedented scale. Most importantly, Pokémon Go was among the first and most successful applications to modify behavior out in the real world.

The pattern of conditioning was built right into the game’s framework: A user downloads the app, which then uses GPS and the smartphone camera to let players hunt Pokémon creatures that “appear” in their surroundings via augmented reality. When a user captures a Pokémon, they are rewarded with game currency and points. Niantic began sponsoring locations like bars and restaurants, where Pokémon would pop up in large numbers, thus attracting app users. Niantic then amassed a network of collaborators who, alongside the app’s creators, enjoyed increased profits as a result of the app’s behavioral modification strategies.

Zuboff then rewinds her chapter’s perspective to the 1970s, when the conceptual roots of behavioral modification lay. She traces the science of manipulating behavior at scale to the American government during the Cold War. Anxieties over the ideological spread of Communism from the East spurred the CIA to create MKUltra, a secret set of experiments that sought to identify methods of behavioral modification and control. The Constitutional Rights Subcommittee convened in 1971 to address public concerns over such research, eventually arguing that the First Amendment must protect one’s right to independent thought and that privacy must extend to protecting citizens’ thoughts, emotions, and behavior.

The Subcommittee’s arguments led to a rewriting of psychologists’ ethical codes. The National Research Act of 1974 created the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research and would eventually lead to the Common Rule, which dictates protections for research subjects in federal studies. Zuboff points out that, looking back on this history of behavioral modification, past anxieties centered on state abuse of behavioral modification; corporate abuses of behavior were essentially unimaginable. Little did the senators and privacy rights activists of the 1970s know that the power they fought so hard to defeat would merely respawn in a different form: surveillance capitalism.

Part 2, Chapter 11 Summary: “The Right to the Future Tense”

Chapter 11 is preoccupied with the role of free will in individuals’ futures. Surveillance capitalism seeks to corral and curtail freedom of thought and behavior and is therefore a significant threat to humanity’s future, according to the author. In an effort to help society gather its collective bearings, fully understand the threat to our humanity, and face the challenge head-on, Zuboff outlines 16 reasons why surveillance capitalism has been able to succeed. These reasons include surveillance capitalism’s unprecedented nature, its use of the language of conquering invaders, the Cold War historical context that facilitated its rise, and the extent to which surveillance capitalists like Google are shielded from government regulation and scrutiny. Zuboff also points to the dispossession cycle, whereby firms push incursions into public privacy until they are challenged at the government level; the challenge prompts public relations ministrations and surface-level changes that buy time to adjust strategy behind the scenes, allowing firms to continue behavioral data extraction or camouflage it in a new way. Firms also encourage users to become dependent on their services, either to fulfill perceived needs or to feel included.

Moreover, surveillance capitalists present themselves as master innovators and hard workers; people identify with and grow to love their success stories, even aspiring to become leaders of this kind themselves. Surveillance capitalists also present themselves as authorities on the future of our society and the leaders of our collective progress, and many believe in and look up to their supposed “expertise.” Finally, surveillance capitalism has grown so prevalent that it is essentially impossible to escape. People are left with no alternatives, leading to a sense of “inevitabilism” around surveillance capitalism.

In response to these explanations, Zuboff demands laws that reject and dismantle the functions of surveillance capitalism. Furthermore, she argues there must be a collective social movement representing a withdrawal of agreement to surveillance capitalism’s existence.

Part 2 Analysis

Whereas Part 1 was dedicated to explaining the historical, political, and ideological foundations of surveillance capitalism, Part 2 explains how it presently operates. In this respect, the most important concepts covered in Part 2 are ubiquitous computing, methods of rendition, and behavioral modification. Each of these three operations is an expression of surveillance capitalism’s economies of action. Zuboff structures her book to establish the “building blocks” of surveillance capitalism, where each Part—and each individual chapter within it—builds on the last to present a cumulative image of surveillance capitalism’s construction and methods of operation. Thus, understanding Zuboff’s thesis requires a proper grasp of each of the concepts explaining how the economic form works, and of how they relate to one another.

Each of Part 2’s operations—ubiquitous computing, rendition, and behavioral modification—represents a step in surveillance capitalism’s economic evolution. Zuboff purposefully structures Part 2 to reflect this evolution from economies of scale to economies of scope to economies of action: Chapter 7 discusses ubiquitous computing, Chapters 8 and 9 discuss rendition and personalization as advanced applications of ubiquitous computing, and Chapters 10 and 11 discuss behavioral modification and its potential impact on society at large. Chapter 7 covers how ubiquitous computing represents the concept of extension in the economies of scope. Employing a methodology similar to that of Part 1, Zuboff relies on case studies to illustrate how the particular aspects of surveillance capitalism she discusses work in real life. In the case of ubiquitous computing, her exploration in Chapter 7 of how Columbus, Ohio was transformed into a “smart city” is particularly helpful, especially in establishing thematic links between Parts 1 and 2. Whereas Zuboff used Part 1 to explore large-scale links between the government and surveillance capitalism, such as how the War on Terror led to state interest in surveillance policies and thus fueled the rise of surveillance capitalism, Chapter 7’s case study of Columbus, Ohio is an effective example of how surveillance capitalism and ubiquitous computing can express themselves on a local scale.

If ubiquitous computing and rendition represented the dual strategies of economies of scope, then the last operation covered by Part 2, behavioral modification, represents the final and most threatening evolution of surveillance capitalism: economies of action. Perhaps the most poignant observation that Zuboff makes at this juncture of her book is how surveillance capitalism operates in secret to exploit human nature itself. Surveillance capitalism’s targeting, corralling, and controlling of human psychology, and even of core fibers of humanity such as free will, is a theme developed throughout Part 2.

Perhaps the most important philosophical concept of Part 2 is the uncontract. First mentioned in Chapter 7, the uncontract is a recurring term throughout this portion of Zuboff’s book, and Chapter 11 is wholly dedicated to its implications. Surveillance capitalists like Varian present their technologies as the ultimate solutions to customers’ complex lives because automation and predictive products erase uncertainty.

However, Zuboff argues, these “solutions” come at a price for the mass public. With uncertainty come free will, choice, and human collaboration toward collective solutions and goals. The uncontract’s automation creates certainty not because it wishes to solve society’s issues but because it seeks to erase the good aspects of uncertainty; no profit can be derived from humans’ capacity for free will. As Zuboff explains:

In the dystopia of the uncontract, surveillance capitalism’s drive toward certainty fills the space once occupied by all the human work of building and replenishing social trust, which is now reinterpreted as unnecessary friction in the march toward guaranteed outcomes. This deletion of uncertainty is celebrated as a victory over human nature: our cunning and our opportunism (317).

Under surveillance capitalism, something as profound as free will is merely a roadblock to profit, an obstacle that must be destroyed in pursuit of greater gain. Intelligence, opinions, debates, and democracy are all friction slowing the evolution of surveillance capitalism and are thus in danger. With its attention on the uncontract, Part 2 imparts to readers that in a future ruled by surveillance capitalism, the very essence of human connection will be undone.
