Fri 19 Apr 2019 01:08:13 PM -03

Topics

  • Second-modernity individuals.
  • Dispossession Cycle: incursion, habituation, adaptation, redirection.
  • Division of learning: who knows? Who decides? Who decides who decides?
  • Shadow text.

Excerpts

Just a moment ago, it still seemed reasonable to focus our concerns on the
challenges of an information workplace or an information society. Now the
oldest questions must be addressed to the widest possible frame, which is best
defined as “civilization” or, more specifically, information civilization. Will
this emerging civilization be a place that we can call home?

[...]

The sense of home slipping away provokes an unbearable yearning. The
Portuguese have a name for this feeling: saudade, a word said to capture the
homesickness and longing of separation from the homeland among emigrants across
the centuries. Now the disruptions of the twenty-first century have turned
these exquisite anxieties and longings of dislocation into a universal story
that engulfs each one of us.3

[...]

Although the saying tells us “If it’s free, then you are the product,” that
is also incorrect. We are the sources of surveillance capitalism’s crucial
surplus: the objects of a technologically advanced and increasingly inescapable
raw-material-extraction operation. Surveillance capitalism’s actual customers
are the enterprises that trade in its markets for future behavior.

[...]

Surveillance capitalism operates through unprecedented asymmetries in
knowledge and the power that accrues to knowledge. Surveillance capitalists
know everything about us, whereas their operations are designed to be
unknowable to us. They accumulate vast domains of new knowledge from us, but
not for us. They predict our futures for the sake of others’ gain, not ours. As
long as surveillance capitalism and its behavioral futures markets are allowed
to thrive, ownership of the new means of behavioral modification eclipses
ownership of the means of production as the fountainhead of capitalist wealth
and power in the twenty-first century.  These facts and their consequences for
our individual lives, our societies, our democracies, and our emerging
information civilization are examined in detail in the coming chapters. The
evidence and reasoning employed here suggest that surveillance capitalism is a
rogue force driven by novel economic imperatives that disregard social norms
and nullify the elemental rights associated with individual autonomy that are
essential to the very possibility of a democratic society.  Just as industrial
civilization flourished at the expense of nature and now threatens to cost us
the Earth, an information civilization shaped by surveillance capitalism and
its new instrumentarian power will thrive at the expense of human nature and
will threaten to cost us our humanity. The industrial legacy of climate chaos
fills us with dismay, remorse, and fear. As surveillance capitalism becomes the
dominant form of information capitalism in our time, what fresh legacy of
damage and regret will be mourned by future generations?

[...]

For now, suffice to say that despite all the futuristic sophistication of
digital innovation, the message of the surveillance capitalist companies barely
differs from the themes once glorified in the motto of the 1933 Chicago World’s
Fair: “Science Finds—Industry Applies—Man Conforms.”

[...]

In order to challenge such claims of technological inevitability, we must
establish our bearings. We cannot evaluate the current trajectory of
information civilization without a clear appreciation that technology is not
and never can be a thing in itself, isolated from economics and society. This
means that technological inevitability does not exist. Technologies are always
economic means, not ends in themselves: in modern times, technology’s DNA comes
already patterned by what the sociologist Max Weber called the “economic
orientation.” Economic ends, Weber observed, are always intrinsic to
technology’s development and deployment. “Economic action” determines
objectives, whereas technology provides “appropriate means.” In Weber’s
framing, “The fact that what is called the technological development of modern
times has been so largely oriented economically to profit-making is one of the
fundamental facts of the history of technology.”15 In a modern capitalist
society, technology was, is, and always will be an expression of the economic
objectives that direct it into action. A worthwhile exercise would be to delete
the word “technology” from our vocabularies in order to see how quickly
capitalism’s objectives are exposed.

[...]

Surveillance capitalism employs many technologies, but it cannot be equated
with any technology. Its operations may employ platforms, but these operations
are not the same as platforms. It employs machine intelligence, but it cannot
be reduced to those machines. It produces and relies on algorithms, but it is
not the same as algorithms. Surveillance capitalism’s unique economic
imperatives are the puppet masters that hide behind the curtain orienting the
machines and summoning them to action. These imperatives, to indulge another
metaphor, are like the body’s soft tissues that cannot be seen in an X-ray but
do the real work of binding muscle and bone. We are not alone in falling prey
to the technology illusion. It is an enduring theme of social thought, as old
as the Trojan horse. Despite this, each generation stumbles into the quicksand
of forgetting that technology is an expression of other interests. In modern
times this means the interests of capital, and in our time it is surveillance
capital that commands the digital milieu and directs our trajectory toward the
future. Our aim in this book is to discern the laws of surveillance capitalism
that animate today’s exotic Trojan horses, returning us to age-old questions as
they bear down on our lives, our societies, and our civilization.

[...]

We have stood at this kind of precipice before. “We’ve stumbled along for a
while, trying to run a new civilization in old ways, but we’ve got to start to
make this world over.” It was 1912 when Thomas Edison laid out his vision for a
new industrial civilization in a letter to Henry Ford. Edison worried that
industrialism’s potential to serve the progress of humanity would be thwarted
by the stubborn power of the robber barons and the monopolist economics that
ruled their kingdoms. He decried the “wastefulness” and “cruelty” of US
capitalism: “Our production, our factory laws, our charities, our relations
between capital and labor, our distribution—all wrong, out of gear.” Both
Edison and Ford understood that the modern industrial civilization for which
they harbored such hope was careening toward a darkness marked by misery for
the many and prosperity for the few.

[...]

Most important for our conversation, Edison and Ford understood that the
moral life of industrial civilization would be shaped by the practices of
capitalism that rose to dominance in their time. They believed that America,
and eventually the world, would have to fashion a new, more rational capitalism
in order to avert a future of misery and conflict. Everything, as Edison
suggested, would have to be reinvented: new technologies, yes, but these would
have to reflect new ways of understanding and fulfilling people’s needs; a new
economic model that could turn those new practices into profit; and a new
social contract that could sustain it all. A new century had dawned, but the
evolution of capitalism, like the churning of civilizations, did not obey the
calendar or the clock. It was 1912, and still the nineteenth century refused to
relinquish its claim on the twentieth.

[...]

I describe the “collision” between the centuries-old historical processes
of individualization that shape our experience as self-determining individuals
and the harsh social habitat produced by a decades-old regime of neoliberal
market economics in which our sense of self-worth and needs for
self-determination are routinely thwarted. The pain and frustration of this
contradiction are the condition that sent us careening toward the internet for
sustenance and ultimately bent us to surveillance capitalism’s draconian quid
pro quo.

[...]

The youngest members of our societies already experience many of these
destructive dynamics in their attachment to social media, the first global
experiment in the human hive. I consider the implications of these developments
for a second elemental right: the right to sanctuary. The human need for a
space of inviolable refuge has persisted in civilized societies from ancient
times but is now under attack as surveillance capital creates a world of “no
exit” with profound implications for the human future at this new frontier of
power.

[...]

The Apple inversion depended on a few key elements. Digitalization made it
possible to rescue valued assets—in this case, songs—from the institutional
spaces in which they were trapped. The costly institutional procedures that
Sloan had described were eliminated in favor of a direct route to listeners. In
the case of the CD, for example, Apple bypassed the physical production of the
product along with its packaging, inventory, storage, marketing,
transportation, distribution, and physical retailing. The combination of the
iTunes platform and the iPod device made it possible for listeners to
continuously reconfigure their songs at will. No two iPods were the same, and
an iPod one week was different from the same iPod another week, as listeners
decided and re-decided the dynamic pattern. It was an excruciating development
for the music industry and its satellites—retailers, marketers, etc.—but it was
exactly what the new listeners wanted.

[...]

The implication is that new market forms are most productive when they are
shaped by an allegiance to the actual demands and mentalities of people. The
great sociologist Emile Durkheim made this point at the dawn of the twentieth
century, and his insight will be a touchstone for us throughout this book.
Observing the dramatic upheavals of industrialization in his time—factories,
specialization, the complex division of labor—Durkheim understood that although
economists could describe these developments, they could not grasp their cause.
He argued that these sweeping changes were “caused” by the changing needs of
people and that economists were (and remain) systematically blind to these
social facts: The division of labor appears to us otherwise than it does to
economists. For them, it essentially consists in greater production. For us,
this greater productivity is only a necessary consequence, a repercussion of
the phenomenon. If we specialize, it is not to produce more, but it is to
enable us to live in the new conditions of existence that have been made for
us.7

[...]

The sociologist identified the perennial human quest to live effectively in
our “conditions of existence” as the invisible causal power that summons the
division of labor, technologies, work organization, capitalism, and ultimately
civilization itself. Each is forged in the same crucible of human need that is
produced by what Durkheim called the always intensifying “violence of the
struggle” for effective life: “If work becomes more divided,” it is because the
“struggle for existence is more acute.”

[...]

What are these modernities and how do they matter to our story? The advent
of the individual as the locus of moral agency and choice initially occurred in
the West, where the conditions for this emergence first took hold. First let’s
establish that the concept of “individualization” should not be confused with
the neoliberal ideology of “individualism” that shifts all responsibility for
success or failure to a mythical, atomized, isolated individual, doomed to a
life of perpetual competition and disconnected from relationships, community,
and society. Neither does it refer to the psychological process of
“individuation” that is associated with the lifelong exploration of
self-development. Instead, individualization is a consequence of long-term
processes of modernization.10

[...]

The Spanish poet Antonio Machado captured the exhilaration and daring of
these first-modernity individuals in his famous song: “Traveler, there is no
road; the road is made as you go.” This is what “search” has meant: a journey
of exploration and self-creation, not an instant swipe to already composed
answers.

[...]

Socialization and adaptation were the materials of a psychology and
sociology that regarded the nuclear family as the “factory” for the “production
of personalities” ready-made for conformity to the social norms of mass
society.12 Those “factories” also produced a great deal of pain: the feminine
mystique, closeted homosexuals, church-going atheists, and back-alley
abortions. Eventually, though, they even produced people like you and me.

[...]

The free-market creed originated in Europe as a sweeping defense against
the threat of totalitarian and communist collectivist ideologies. It aimed to
revive acceptance of a self-regulating market as a natural force of such
complexity and perfection that it demanded radical freedom from all forms of
state oversight. Hayek explained the necessity of absolute individual and
collective submission to the exacting disciplines of the market as an
unknowable “extended order” that supersedes the legitimate political authority
vested in the state: “Modern economics explains how such an extended order…
constitutes an information-gathering process… that no central planning agency,
let alone any individual, could know as a whole, possess, or control.…”22 Hayek
and his ideological brethren insisted on a capitalism stripped down to its raw
core, unimpeded by any other force and impervious to any external authority.
Inequality of wealth and rights was accepted and even celebrated as a necessary
feature of a successful market system and as a force for progress.23 Hayek’s
ideology provided the intellectual superstructure and legitimation for a new
theory of the firm that became another crucial antecedent to the surveillance
capitalist corporation: its structure, moral content, and relationship to
society.

[...]

In 1976 Jensen and Meckling published a landmark article in which they
reinterpreted the manager as a sort of parasite feeding off the host of
ownership: unavoidable, perhaps, but nonetheless an obstacle to shareholder
wealth.

[...]

In the “crisis of democracy” zeitgeist, the neoliberal vision and its
reversion to market metrics was deeply attractive to politicians and policy
makers, both as the means to evade political ownership of tough economic
choices and because it promised to impose a new kind of order where disorder
was feared.25 The absolute authority of market forces would be enshrined as the
ultimate source of imperative control, displacing democratic contest and
deliberation with an ideology of atomized individuals sentenced to perpetual
competition for scarce resources. The disciplines of competitive markets
promised to quiet unruly individuals and even transform them back into subjects
too preoccupied with survival to complain.


[...]

In Capital in the Twenty-First Century, the French economist Thomas
Piketty integrated years of income data to derive a general law of
accumulation: the rate of return on capital tends to exceed the rate of
economic growth. This tendency, summarized as r > g, is a dynamic that
produces ever-more-extreme income divergence and with it a range of
antidemocratic social consequences long predicted as harbingers of an
eventual crisis of capitalism.
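
A minimal numeric sketch of the r > g dynamic (the rates below are illustrative
assumptions, not Piketty’s estimates): capital compounds at r while total income
grows at g, so the capital-to-income ratio keeps rising and wealth concentrates
relative to income.

    # Illustrative only: r and g are assumed rates, not figures from Piketty.
    r = 0.05    # annual rate of return on capital
    g = 0.015   # annual growth rate of total income
    capital, income = 4.0, 1.0   # start from a capital/income ratio of 4

    for year in (10, 30, 50):
        k = capital * (1 + r) ** year
        y = income * (1 + g) ** year
        print(f"year {year:2d}: capital/income ratio = {k / y:.1f}")
    # The ratio climbs from 4 toward roughly 22 after 50 years: returns to
    # capital outpace growth, and ownership of capital pulls away from income.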

[...]

Many scholars have taken to describing these new conditions as
neofeudalism, marked by the consolidation of elite wealth and power far
beyond the control of ordinary people and the mechanisms of democratic
consent.55 Piketty calls it a return to “patrimonial capitalism,” a
reversion to a premodern society in which one’s life chances depend
upon inherited wealth rather than meritocratic achievement.56

[...]

We now have the tools to grasp the collision in all of its destructive
complexity: what is unbearable is that economic and social inequalities
have reverted to the preindustrial “feudal” pattern but that we, the
people, have not. We are not illiterate peasants, serfs, or slaves.
Whether “middle class” or “marginalized,” we share

[...]

Nevertheless, Occupy revealed a similar conflict between inequality’s
facts and inequality’s feelings, expressed in a creatively
individualized political culture that insisted on “direct democracy”
and “horizontal leadership.”60 Some analysts concluded that it was this
conflict that ultimately crippled the movement, with its “inner core”
of leaders unwilling to compromise their highly individualized approach
in favor of the strategies and tactics required for a durable mass
movement.61 However

[...]

This is the existential contradiction of the second modernity that
defines our conditions of existence: we want to exercise control over
our own lives, but everywhere that control is thwarted.
Individualization has sent each one of us on the prowl for the
resources we need to ensure effective life, but at each turn we are
forced to do battle with an economics and politics from whose vantage
point we are but ciphers. We live in the knowledge that our lives have
unique value, but we are treated as invisible

[...]

The deepest contradiction of our time, the social philosopher Zygmunt
Bauman wrote, is “the yawning gap between the right of self-assertion
and the capacity to control the social settings which render such
self-assertion feasible. It is from that abysmal gap that the most
poisonous effluvia contaminating the lives of contemporary individuals
emanate.”

[...]

When it comes to genuine economic mutation, there is always a tension
between the new features of the form and its mother ship. A combination
of old and new is reconfigured in an unprecedented pattern.
Occasionally, the elements of a mutation find the right environment in
which to be “selected” for propagation. This is when the new form
stands a chance of becoming fully institutionalized and establishes its
unique migratory path toward the future. But it’s even more likely that
potential mutations meet their fate in “transition failure,” drawn back
by the gravitational pull of established practices.63

[...]

Among the many violations of advocacy expectations, ubiquitous
“terms-of-service agreements” were among the most pernicious.67 Legal
experts call these “contracts of adhesion” because they impose
take-it-or-leave-it conditions on users that stick to them whether they
like it or not.

[...]

These “contracts” impose an unwinnable infinite regress upon the user
that law professor Nancy Kim describes as “sadistic.”

[...]

The digital milieu has been essential to these degradations. Kim points
out that paper documents once imposed natural restraints on contracting
behavior simply by virtue of their cost to produce, distribute, and
archive. Paper contracts require a physical signature, limiting the
burden a firm is likely to impose on a customer by requiring her to
read multiple pages of fine print. Digital terms, in contrast, are
“weightless.

[...]

Radin calls this “private eminent domain,” a unilateral seizure of
rights without consent. She

[...]

Once firms understood that the courts were disposed to validate their
click-wrap and browse-wrap agreements, there was nothing to stop them
from expanding the reach of these degraded contracts “to extract from
consumers additional benefits unrelated to the transaction.”73 This
coincided with the discovery of behavioral surplus that we examine in
Chapter 3, as

[...]

2008 two Carnegie Mellon professors calculated that a reasonable
reading of all the privacy policies that one encounters in a year would
require 76 full workdays at a national opportunity cost of $781
billion.75 The numbers are much higher today. Still, most
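
A sketch of how an estimate of this kind is put together (the inputs below are
placeholder assumptions chosen only to show the structure of the calculation;
they do not reproduce the parameters or results of the 2008 Carnegie Mellon
study cited above).

    # Hypothetical inputs; the real study derived its own time and wage figures.
    minutes_per_policy = 10          # assumed time to read one privacy policy
    policies_per_year = 1_400        # assumed distinct sites visited per year
    online_users = 220_000_000       # assumed national online population
    value_per_hour = 16.0            # assumed dollar value of an hour of time

    hours_per_user = minutes_per_policy * policies_per_year / 60
    workdays_per_user = hours_per_user / 8          # 8-hour workdays
    national_cost = hours_per_user * online_users * value_per_hour

    print(f"{workdays_per_user:.0f} workdays per person per year")
    print(f"${national_cost / 1e9:.0f} billion national opportunity cost")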

[...]

These developments reflect the simple truth that genuine economic
reformation takes time and that the internet world, its investors and
shareholders, were and are in a hurry. The credo of digital innovation
quickly turned to the language of disruption and an obsession with
speed, its campaigns conducted under the flag of “creative
destruction.” That famous, fateful phrase coined by evolutionary
economist Joseph Schumpeter was seized upon as a way to legitimate what
Silicon Valley euphemistically calls “permissionless innovation.”77
Destruction rhetoric promoted what I think of as a “boys and their
toys” theory of history, as if the winning hand in capitalism is about
blowing things up with new technologies. Schumpeter’s analysis was, in
fact, far more nuanced and complex than modern destruction rhetoric
suggests.

[...]

was as if a shark had been silently circling the depths all along, just
below the surface of the action, only to occasionally leap glistening
from the water in pursuit of a fresh bite of flesh

[...]

Over time, the shark revealed itself as a rapidly multiplying,
systemic, internally consistent new variant of information capitalism
that had set its sights on domination. An unprecedented formulation of
capitalism was elbowing its way into history: surveillance capitalism.

[...]

As we shall explore in detail throughout the coming chapters, thanks to
surveillance capitalism the resources for effective life that we seek
in the digital realm now come encumbered with a new breed of menace.
Under this new regime, the precise moment at which our needs are met is
also the precise moment at which our lives are plundered for behavioral
data, and all for the sake of others’ gain. The result is a perverse
amalgam of empowerment inextricably layered with diminishment. In the
absence of a decisive societal response that constrains or outlaws this
logic of accumulation, surveillance capitalism appears poised to become
the dominant form of capitalism in our time.

[...]

This left us wholly unprepared to defend ourselves from new companies
with imaginative names run by young geniuses that seemed able to
provide us with exactly what we yearn for at little or no cost. This
new regime’s most poignant harms, now and later, have been difficult to
grasp or theorize, blurred by extreme velocity and camouflaged by
expensive and illegible machine operations, secretive corporate
practices, masterful rhetorical misdirection, and purposeful cultural
misappropriation. On this road, terms whose meanings we take to be
positive or at least banal—“the open internet,” “interoperability,” and
“connectivity”—have been quietly harnessed to a market process in which
individuals are definitively cast as the means to others’ market ends.

[...]

The new harms we face entail challenges to the sanctity of the
individual, and chief among these challenges I count the elemental
rights that bear on individual sovereignty, including the right to the
future tense and the right to sanctuary. Each of these rights invokes
claims to individual agency and personal autonomy as essential
prerequisites to freedom of will and to the very concept of democratic
order.

[...]

The Spanish Data Protection Agency recognized that not all information
is worthy of immortality

[...]

As for the Spanish people, their Data Protection Agency, and the
European Court of Justice, the passage of time is likely to reveal
their achievements as a stirring early chapter in the longer story of
our fight for a third modern that is first and foremost a human future,
rooted in an inclusive democracy and committed to the individual’s
right to effective life. Their message is carefully inscribed for our
children to ponder: technological inevitability is as light as
democracy is heavy, as temporary as the scent of rose petals and the
taste of honey are enduring.

[...]

The point for us is that every successful vaccine begins with a close
understanding of the enemy disease. The mental models, vocabularies,
and tools distilled from past catastrophes obstruct progress. We smell
smoke and rush to close doors to rooms that are already fated to
vanish. The result is like hurling snowballs at a smooth marble wall
only to watch them slide down its facade, leaving nothing but a wet
smear: a fine paid here, an operational detour there, a new encryption
package there.

[...]

It is the habitat for progress “at the speed of dreams,” as one Google
engineer vividly describes it.100 My aim here is to slow down the
action in order to enlarge the space for such debate and unmask the
tendencies of these new creations as they amplify inequality, intensify
social hierarchy, exacerbate exclusion

[...]

Peter Drucker’s field studies for his seminal Concept of the
Corporation, the 1946 book that codified the practices of the twentieth

[...]

The closest thing we have to a Buck Weaver or James Couzens codifying
Google’s practices and objectives is the company’s longtime chief
economist, Hal Varian, who aids the cause of understanding with
scholarly articles that explore important themes. Varian has been
described as “the Adam Smith of the discipline of Googlenomics” and the
“godfather” of its advertising model.6 It is in

[...]

Nowadays there is a computer in the middle of virtually every
transaction… now that they are available these computers have several
other uses.”8 He then identifies four such new uses: “data extraction
and analysis,” “new contractual forms due to better monitoring,”
“personalization and customization,” and “continuous
experiments.” Varian’s discussions of

[...]

“Data extraction and analysis,” Varian writes, “is what everyone is
talking about when they talk about big data.” “Data” are the raw
material necessary for surveillance capitalism’s novel manufacturing
processes. “Extraction” describes the social relations and material
infrastructure with which the firm asserts authority over those raw
materials to achieve economies of scale in its raw-material supply
operations. “Analysis” refers to the complex of highly specialized
computational systems that I will generally refer to in these chapters
as “machine intelligence.” I like this umbrella phrase because it
trains us on the forest rather than the trees, helping us decenter from
technology to its objectives. But in choosing this phrase I also follow
Google’s lead. The company describes itself “at the forefront of
innovation in machine intelligence,” a term in which it includes
machine learning as well as “classical” algorithmic production, along
with many computational operations that

[...]

Amit Patel, a young Stanford graduate student with a special interest
in “data mining,” is frequently credited with the groundbreaking
insight into the significance of Google’s accidental data caches. His

[...]

Google’s engineers soon grasped that the continuous flows of collateral
behavioral data could turn the search engine into a recursive learning
system that constantly improved search results and spurred product
innovations such as spell check,

[...]

was Google that recognized the gold dust in the detritus of its
interactions with its users and took the trouble to collect it up.…
Google exploits information that is a by-product of user interactions,
or data exhaust, which is automatically recycled

[...]

that early stage of Google’s development, the feedback loops involved
in improving its Search functions produced a balance of power: Search
needed people to learn from, and people needed Search

[...]

Hal Varian, who joined Google as its chief economist in 2002, would
note, “Every action a user performs is considered a signal to be
analyzed and fed back into the system.”16 The PageRank algorithm,
named after its founder
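
A minimal, textbook power-iteration sketch of what the PageRank algorithm
computes (a generic illustration, not Google’s implementation, and the tiny link
graph is made up): a page’s score is the long-run probability that a “random
surfer” lands on it, so pages linked from many well-ranked pages rank higher.

    # Generic PageRank by power iteration over a made-up four-page link graph.
    links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    pages = list(links)
    damping, n = 0.85, len(pages)
    rank = {p: 1.0 / n for p in pages}       # start from a uniform distribution

    for _ in range(50):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)   # pass rank along each outlink
            for target in outlinks:
                new[target] += damping * share
        rank = new

    print(sorted(rank.items(), key=lambda kv: -kv[1]))   # "c" ranks highest here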

[...]

The key point for us rests on a critical distinction. During this early
period, behavioral data were put to work entirely on the user’s behalf.
User data provided value at no cost, and that value was reinvested in
the user experience in the form of improved services: enhancements that
were also offered at no cost to users. Users provided the raw material
in the form of behavioral data, and those data were harvested to
improve speed, accuracy, and relevance and to help build ancillary
products such as translation. I call this the behavioral value
reinvestment cycle, in which all behavioral data are reinvested in the
improvement of the product or service (see Figure 1).

[...]

cycle was similarly oriented toward the individual as its subject, but
without a physical product to sell, it floated outside the marketplace,
an interaction with “users” rather than a market transaction with
customers.

[...]

Users are not paid for their labor, nor do they operate the means of
production, as we’ll discuss in more depth later in this chapter.
Finally, people often say that the user is the “product.” This is also
misleading, and it is a point that we will revisit more than once. For
now let’s say that users are not products, but rather we are the
sources of raw-material supply. As we shall see, surveillance
capitalism

[...]

impatient money

[...]

These behavioral data available for uses beyond service improvement
constituted a surplus, and it was on the strength of this behavioral
surplus that the young company would find its way to the “sustained and
exponential profits” that would be necessary for survival. Thanks to a
perceived

[...]

the New York Times reported, “The precision of the Carol Brady data was
eye-opening for some.” Even Brin was stunned by the clarity of Search’s
predictive power, revealing events and trends before they “hit the
radar” of traditional media. As he told the Times, “It was like trying
an electron microscope for the first time. It was like a
moment-by-moment barometer

[...]

Google maximizes the revenue it gets from that precious real estate by
giving its best position to the advertiser who is likely to pay Google
the most in total, based on the price per click multiplied by Google’s
estimate of the likelihood that someone will actually click on the
ad.”42 That pivotal multiplier was the result of Google’s advanced
computational capabilities trained on its most significant and secret
discovery: behavioral surplus
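
A minimal sketch of that multiplier (illustrative only; the actual auction adds
quality factors and second-price rules not shown here): candidate ads are ordered
by bid-per-click times the predicted probability of a click, i.e. by expected
revenue for the slot, which is why the click-likelihood estimate is so valuable.

    # Hypothetical candidates: (ad name, bid per click in dollars, predicted CTR).
    candidates = [
        ("ad_1", 2.50, 0.010),
        ("ad_2", 0.80, 0.045),
        ("ad_3", 1.20, 0.020),
    ]

    def expected_revenue(ad):
        _, bid_per_click, predicted_click_prob = ad
        return bid_per_click * predicted_click_prob

    # Best position goes to the highest bid-times-likelihood-of-click.
    ranked = sorted(candidates, key=expected_revenue, reverse=True)
    print([name for name, _, _ in ranked])    # ['ad_2', 'ad_1', 'ad_3']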

[...]

competitors, in which ads were targeted to keywords or content, were
unable to identify relevant ads “for a particular user.” Now the
inventors offered a scientific solution that exceeded the
most-ambitious dreams of any advertising executive

[...]

This new Google assures its actual customers that it will do whatever
it takes to transform the natural obscurity of human desire into
scientific fact. This Google is the superpower that establishes its own
values and pursues its own purposes above and beyond the social
contracts to which others are bound.

[...]

Google’s unique auction methods and capabilities earned a great deal of
attention, which distracted observers from reflecting on exactly what
was being auctioned: derivatives of behavioral surplus. Click-through
metrics institutionalized “customer” demand for these prediction
products and thus established the central importance of economies of
scale in surplus supply operations. Surplus capture would have to
become automatic and ubiquitous if the new logic was to succeed, as
measured by the successful trading of behavioral futures.

[...]

kind of commerce that depended upon online surveillance at scale.
Insiders referred to Google’s new science of behavioral

[...]

here was an unprecedented and lucrative brew: behavioral surplus, data
science, material infrastructure, computational power, algorithmic
systems, and automated platforms. This convergence produced
unprecedented “relevance” and billions of auctions. Click-through rates
skyrocketed. Work on AdWords and AdSense became just as important as
work on Search

[...]

their community effectively declared a “state of exception” in which it
was judged necessary to suspend the values and principles that had
guided Google’s founding and early practices.

[...]

Google’s inventions, their origins in emergency, and the 180-degree
turn from serving users to surveilling them. Most of all, he credited
the discovery of behavioral surplus as the game-changing asset that
turned Google into a fortune-telling giant, pinpointing Google’s
breakthrough transformation of the Overture model, when the young
company first applied its analytics of behavioral surplus to predict
the likelihood of a click:

[...]

Google loosed a new incarnation of capitalism upon the world, a
Pandora’s box whose contents we are only beginning

[...]

On the strength of Google’s inventions, discoveries, and strategies, it
became the mother ship and ideal type of a new economic logic based on
fortune-telling and selling—an ancient and eternally lucrative craft
that has fed on humanity’s confrontation with uncertainty from the
beginning of the human story.

[...]

The scientific and material complexity that supported the capture and
analysis of behavioral surplus also enabled the hiding strategy, an
invisibility cloak over the whole operation. “Managing search at our
scale is a very serious barrier to entry,” Schmidt warned would-be
competitors.79 To be sure, there are always sound business

[...]

public were told that Google’s magic derived from its exclusive
capabilities in unilateral surveillance of online behavior and its
methods specifically designed to override individual decision rights?
Google policies had to enforce secrecy in order to protect operations
that were designed to be undetectable because they took things from
users without asking and employed those unilaterally claimed resources
to work in the service of others’ purposes.

[...]

George Orwell once observed that euphemisms are used in politics, war,
and business as instruments that “make lies sound truthful and murder
respectable.”81 Google has been careful to camouflage the significance
of

[...]

Google discovered this necessary element of the new logic of
accumulation: it must assert the rights to take the information upon
which its success depends.

[...]

signing on with Facebook, the talented Sandberg became the “Typhoid
Mary” of surveillance capitalism as she led Facebook’s transformation
from a social networking site to an advertising behemoth. Sandberg
understood that Facebook’s social graph represented an awe-inspiring
source of behavioral surplus: the extractor’s equivalent of a
nineteenth-century prospector stumbling into a valley that sheltered
the largest diamond mine and the deepest gold mine ever to be
discovered. “We have better information than anyone else. We know
gender, age, location, and it’s real data as opposed to the stuff other
people infer,” Sandberg

[...]

Sandberg understood that through the artful manipulation of Facebook’s
culture of intimacy and sharing, it would be possible to use behavioral
surplus not only to satisfy demand but also to create demand. For
starters, that meant inserting advertisers into the fabric of
Facebook’s online culture, where they could

[...]

This new market form declares that serving the genuine needs of people
is less lucrative, and therefore less important, than selling
predictions of their behavior. Google discovered that we are less
valuable than others’ bets on our future behavior. This changed
everything.

[...]

VIII. Summarizing the Logic and Operations of Surveillance Capitalism

[...]

is obscene to suppose that this harm can be reduced to the obvious fact
that users receive no fee for the raw material they supply. That
critique is a feat of misdirection that would use a

[...]

remarkable questions here concern the facts that our lives are rendered
as behavioral data in the first place; that ignorance is a condition of
this ubiquitous rendition; that decision rights vanish before one even
knows that there is a decision to make; that there are consequences to
this diminishment of rights that we can neither see nor foretell; that
there is no exit, no voice, and no loyalty, only helplessness,
resignation, and psychic numbing; and that encryption is the only
positive action left to discuss when we sit around the dinner table and
casually ponder how to hide from the forces that hide from us.

[...]

Social theorist David Harvey builds on Arendt’s insight with his notion
of “accumulation by dispossession”: “What accumulation by dispossession
does is to release a set of assets… at very low (and in some instances
zero) cost. Overaccumulated capital can seize hold of such assets and
immediately turn them to profitable use.” He adds that entrepreneurs
who are determined to “join the system” and enjoy “the benefits of
capital accumulation” are often the ones who drive this

[...]

Even when knowledge derived from our behavior is fed back to us as a
quid pro quo for participation, as in the case of so-called
“personalization,” parallel secret operations pursue the conversion of
surplus into sales that point far beyond our interests. We have no
formal control because we are not essential to this market action. In
this future we are exiles from our own behavior, denied access to or
control over knowledge derived from its dispossession by others for
others. Knowledge

[...]

When asked about government regulation, Schmidt said that technology
moves so fast that governments really shouldn’t try to regulate it
because it will change too fast, and any problem will be solved by
technology. ‘We’ll move much faster than any government.’”26 Both Brin
and Page are even more candid in their contempt

[...]

Economic historians describe the dedication to lawlessness among the
Gilded Age “robber barons” for whom Herbert Spencer’s social Darwinism
played the same role that Hayek, Jensen, and even Ayn Rand play for
today’s digital barons. In the same way that surveillance capitalists
excuse their corporations’ unprecedented

[...]

There was no need for law, they argued, when one had the “law of
evolution,” the “laws of capital,” and the “laws of industrial
society.” John Rockefeller insisted that his outsized oil fortune was
the result of “the natural law of trade development.” Jay Gould, when
questioned by Congress on the need for federal regulation of railroad
rates, replied that rates were already regulated by “the laws of supply
and demand, production and consumption.”31 The millionaires mobilized
in 1896 to defeat the populist Democrat William Jennings Bryan, who had
vowed to tether economic policy to the political realm, including
regulating the railroads and protecting the people from “robbery

[...]

Surveillance After September 11, surveillance scholar David Lyon

[...]

After several decades in which data-protection officials, privacy
watchdogs, civil rights groups, and others have tried to mitigate
negative social effects of surveillance, we are witnessing a sharp tilt
toward more exclusionary and intrusive surveillance practices.”56 This
abrupt refocusing of governmental power and policy after the 9/11
attacks in New York City and Washington, DC

[...]

With the attacks of September 11, 2001, everything changed. The new
focus was overwhelmingly on security rather than privacy.”61 The
privacy provisions debated just months earlier vanished from the
conversation more or less overnight. In both the US Congress and across
the EU, legislation was

[...]

including Germany (a country that had been highly sensitized to
surveillance under the hammer of both Nazi and Stalinist
totalitarianism), the UK, and France.62 In the US the failure to
“connect the dots” on the terrorist attack was a source of shame and
dismay that overwhelmed other concerns. Policy guidelines shifted from
“need to know” to “need to share” as agencies were urged to tear down
walls and blend databases for comprehensive information and analysis.63
In a parallel development, privacy scholar Chris Jay

[...]

The elective affinity between public and private missions was evident
as early as 2002, when former National Security Advisor Admiral John Poindexter
proposed his Total Information Awareness (TIA) program with a vision
that reads like an early guide to the foundational mechanisms of
behavioral surplus capture and analysis:

[...]

secret public-private intelligence collaborations that tend to be
“orchestrated around handshakes rather than legal formalities, such as
search warrants, and may be arranged this way to evade oversight and,
at times, to defy the law.”88 He observed that intelligence agencies
are irresistibly drawn to “and in some respects dependent upon” firms’
privately held data resources.89

[...]

former NSA Director Mike McConnell offered another glimpse into the
elective affinities between Google and the intelligence community.
Writing in the Washington Post, McConnell made clear that Google’s
surveillance-based operations in data capture, extraction, and analysis
were both taken for granted and coveted. Here the boundaries of private
and public melt in the intense heat of new threats and their
high-velocity demands that must be met in “milliseconds.” In
McConnell’s future there is one “seamless” surveillance empire in which
the requirements of self-preservation leave no opportunity for the
amenities of

[...]

Once again, history offers us no control groups

[...]

(1) the demonstration of Google’s unique capabilities as a source of
competitive advantage in electoral politics; (2) a deliberate blurring
of public and private interests through relationships and aggressive
lobbying activities; (3) a revolving door of personnel who migrated
between Google and the Obama administration, united by elective
affinities during Google’s crucial growth years of 2009–2016; and (4)
Google’s intentional campaign of influence over academic work and the
larger cultural conversation so vital to policy

[...]

Obama used his proximity to Schmidt to cement his own identity as the
innovation candidate poised to disrupt business as usual in
Washington.98 Once Obama was elected, Schmidt joined the Transition Economic
Advisory Board and appeared next to Obama at

[...]

Political correspondent Jim Rutenberg’s New York Times account of the
data scientists’ seminal role in the 2012 Obama victory offers a vivid
picture of the capture and analysis of behavioral surplus as a
political methodology. The campaign knew “every single wavering voter
in the country that it needed to persuade to vote for Obama, by name,
address, race, sex, and income,” and it had figured out how to target
its television ads to these individuals. One breakthrough was the
“persuasion score” that identified

[...]

According to the Center for Media and Democracy’s investigatory
research report, “The Googlization of the Far Right,” the corporation’s
2012 list of grantees featured a new

[...]

Meanwhile, a list of Google Policy Fellows for 2014 included
individuals from a range of nonprofit organizations whom one would
expect to be leading the fight against that corporation’s
concentrations of information and power, including the Center for
Democracy and Technology, the Electronic Frontier Foundation, the
Future of Privacy Forum, the National Consumers League, the Citizen
Lab, and the Asociación por los Derechos Civiles.116 In July 2017 the
Wall Street Journal reported that

[...]

That summer, one of the New America Foundation’s most highly regarded
scholars and a specialist in digital monopolies, Barry Lynn, posted a
statement praising the EU’s historic decision to levy a $2.7 billion
fine on Google as the result of a multiyear antitrust investigation.
According to the New York Times and Lynn’s own account, New America’s
director bent to pressure from Schmidt, firing Lynn and his Open
Markets team of ten researchers. “Google is very aggressive in throwing
its money around Washington and Brussels, and then pulling strings,”
Lynn told the New York Times. “People are so afraid of Google now.” The
reporters cite Google

[...]

Google in the lead, surveillance capitalism vastly expanded the market
dynamic as it learned to expropriate human experience and translate it
into coveted behavioral predictions. Google and this larger
surveillance project have been birthed, sheltered, and nurtured to
success by the historical conditions of their era—second-modernity
needs, the neoliberal inheritance, and the realpolitik of surveillance
exceptionalism—as well as by their own purpose-built fortifications
designed to protect supply chain operations from scrutiny through
political and cultural capture.

[...]

the capture of behavioral surplus and the acquisition of decision
rights. Like a river running to the sea, if one route is blocked

[...]

increasingly ruthless cycle of kidnapping human experience, cornering
surplus supplies, and competing in new behavioral futures markets

[...]

The extraction imperative demands that everything be possessed. In this
new context, goods and services are merely surveillance-bound supply
routes. It’s not the car; it’s the behavioral data from driving the
car. It’s not the map; it’s the behavioral data from interacting with
the map. The ideal here is continuously expanding borders that
eventually describe the world and everything in it, all the time.

[...]

Traditionally, monopolies on goods and services disfigure markets by
unfairly eliminating competition in order to raise prices at will.
Under surveillance capitalism, however, many of the practices defined
as monopolistic actually function as means of cornering user-derived
raw-material supplies. There is no monetary price for the user to pay,
only an opportunity

[...]

The corporation unfairly impedes competitors in Search in order to
protect the dominance of its most important supply route, not primarily
to fix prices. These cornering operations are not abstractions,

[...]

products such as Android are valued more for supply than for sales.
Disconnect, Inc., founded in 2011 by two former Google engineers and a
privacy-rights attorney, developed

[...]

Google executive, noting that if other manufacturers switched to
Skyhook, it “would be awful for Google, because it will cut off our
ability to continue collecting data” for the company’s Wi-Fi location
database. Court documents from Skyhook’s eventual lawsuit against
Motorola (and Samsung) include an e-mail from Google’s senior vice
president of Mobile

[...]

Finally, extraordinary research from the French nonprofit Exodus
Privacy and the Yale Privacy Lab in 2017 documented the exponential
proliferation of tracking software. Exodus identified 44 trackers in
more than 300 apps for Google’s Android platform, some

[...]

For example, the ad tracker FidZup developed “communication between a
sonic emitter and a mobile phone.…” It can detect the presence of
mobile phones and therefore their owners by diffusing a tone, inaudible
to the human ear, inside a building: “Users installing
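
A rough sketch of the kind of detection being described (a generic, assumed
approach, not FidZup’s code; the sample rate, beacon frequency, and threshold
are placeholders): the phone’s microphone buffer is scanned for a narrow spike
of energy at a near-ultrasonic frequency that an in-store emitter would broadcast.

    import numpy as np

    SAMPLE_RATE = 44_100    # assumed microphone sample rate (Hz)
    BEACON_FREQ = 18_500    # hypothetical near-ultrasonic beacon tone (Hz)
    TOLERANCE = 100         # width of the frequency window around the tone (Hz)
    THRESHOLD = 10.0        # beacon-band peak vs. median spectral energy

    def beacon_present(samples: np.ndarray) -> bool:
        """Return True if a narrow inaudible tone stands out in the spectrum."""
        windowed = samples * np.hanning(len(samples))
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE)
        band = (freqs > BEACON_FREQ - TOLERANCE) & (freqs < BEACON_FREQ + TOLERANCE)
        return spectrum[band].max() > THRESHOLD * np.median(spectrum)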

[...]

a pattern foreshadowed by the Google patent that we examined in Chapter
3 and that we shall see repeatedly in the coming chapters, the research
findings emphasize that the always-on tracking is impervious to the
Android “permissions system,” despite its promises of user control.17

[...]

Disconnect software was banned from Google Play’s vast catalog of
mobile apps, leading to Disconnect’s lawsuit against Google in 2015.
The startup’s complaint explains that “advertising companies including
Google use these invisible

[...]

dispossession operations reveal a predictable sequence of stages that
must be crafted and orchestrated in great detail in order to achieve
their ultimate destination as a system of facts through which surplus
extraction is normalized. The four stages of the cycle are incursion,
habituation, adaptation, and redirection. Taken together, these stages
constitute a “theory of change” that describes and predicts
dispossession as a political and cultural

[...]

with Google’s wider practice: it’s great to empower people, but not too
much, lest they notice the pilfering of their decision rights and try
to reclaim them. The firm wants to enable people to make

[...]

Google’s ideal society is a population of distant users, not a
citizenry. It idealizes people who are informed, but only in the ways
that the corporation chooses. It means for us to be docile, harmonious,
and, above all, grateful.

[...]

Within days, an independent analysis by German security experts proved
decisively that Street View’s cars were extracting unencrypted personal
information from homes. Google was forced to concede that it had
intercepted and stored “payload data,” personal information grabbed
from unencrypted Wi-Fi transmissions. As its apologetic blog post
noted,

[...]

Google’s “Spy-Fi” scandal filled headlines around the world. Many
believed that the Street View revelations would inflict irreparable

[...]

April 2012 FCC report is heart-wrenching in its way, a melancholic
depiction of democracy’s vulnerability in the face-off with a wealthy,
determined, and audacious surveillance capitalist opponent. In November
2010 the FCC sent Google a letter of inquiry

[...]

The second point is that in retrospect, one sees that the very idea of
a single rogue engineer was designed and elaborated as a brilliant
piece of misdirection, a classic scapegoating ploy. It directed
attention away from the ambitious and controversial agenda of the
extraction imperative toward a different narrative of a single infected
cell excised from the flesh of an enormous but innocent organism. All
that was left was to excise the infected flesh and let the organism
declare itself cured of its privacy kleptomania. Then—a return to the
streets

[...]

This is to say that her job was a logical impossibility. That she may
have nevertheless taken it seriously is suggested by

[...]

Street View’s redirection and elaboration announced a critical shift in
the orientation and ambition of the surveillance program: it would no
longer be only about routes, but about routing

[...]

For now, suffice to say that Street View and the larger project of
Google Maps illustrate the new and even more ambitious goals toward
which this cycle of dispossession would soon point: the migration from
an online data source to a real-world monitor to an advisor to an
active shepherd—from knowledge to influence to control. Ultimately,
Street View’s elaborate data would become the basis for another complex
of spectacular Google incursions: the self-driving car and “Google
City,” which we learn more about in Chapter 7. Those programs aim to
take surplus capture to new levels while opening up substantial new
frontiers for the establishment of behavioral futures markets in the
real world of goods and services. It is important to understand that
each level of innovation builds on the one before and that all are
united in one aim, the extraction of behavioral surplus at scale. In
this progression, Google perceives an opportunity

[...]

Google discovered by chance or intention the source of every mapmaker’s
power

[...]

The first US rectangular land survey captured this language perfectly
in its slogan: “Order upon the Land.”72 The cartographer is the
instrument of power as the author of that order, reducing reality to
only two conditions: the map and oblivion. The cartographer’s truth
crystallizes the message that Google and all surveillance capitalists
must impress upon all humans: if you are not on our map, you do not
exist

[...]

Google has done incrementally and furtively what would plainly be
illegal

[...]

done all at once.”98

[...]

Nevertheless, the company made a canny decision not to disclose the
true extent of Cortana’s knowledge to its users. It wants to know
everything about you, but it does not want you to know how much it
knows or that its operations are entirely geared to continuously
learning more. Instead, the “bot” is programmed to ask for permission
and confirmation. The idea is to avoid spooking the public by
presenting Cortana’s intelligence as “progressive” rather than
“autonomous,” according to the project’s group program manager, who
noted that people do not want to be surprised by how much their phones
are starting to take over: “We made an explicit decision to be a little
less ‘magical’ and little

[...]

The Siren Song of Surveillance Revenues

[...]

PrecisionID

[...]

ID is then broadcast to every “unencrypted website a Verizon customer
visits from a mobile device. It allows third-party advertisers and
websites to assemble a deep, permanent profile of visitors’ web
browsing habits without their consent.”126 Alarmed by the threat of
fresh competition, Google, posing as a privacy advocate, launched a
campaign for a new internet protocol that would prevent “header
injections” such as Verizon’s PrecisionID.127 Privacy expert and
journalist Julia Angwin and
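
A conceptual sketch of the header injection being described (illustrative only,
not Verizon’s implementation; X-UIDH is the commonly reported name of the
header): a carrier-side middlebox rewrites each unencrypted HTTP request in
transit, appending a persistent per-subscriber identifier that any downstream
site or ad network can read and use to build a profile.

    # Illustrative middlebox step: tag every plain-HTTP request with a tracking ID.
    def inject_tracking_header(headers: dict, subscriber_id: str) -> dict:
        tagged = dict(headers)
        tagged["X-UIDH"] = subscriber_id    # persistent ID visible to every site
        return tagged

    request_headers = {"Host": "example.com", "User-Agent": "MobileBrowser/1.0"}
    print(inject_tracking_header(request_headers, "a1b2c3d4e5"))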

[...]

UIDH [unique identifier header], and expect that to be available soon.”
The New

[...]

capitalism gene therapy. As Verizon’s president of Operations told
investors, “For

[...]

The companies understood, and they persuaded Republican senators, that
the principle of consent would strike a serious blow to the
foundational mechanisms of the new capitalism: the legitimacy of
unilateral surplus dispossession, ownership rights to surplus, decision
rights over surplus, and the right to lawless space for the prosecution
of these activities.146 To this end the resolution also prevented

[...]

another trend, surveillance in the interest of behavioral surplus
capture and sale has become a service in its own right. Such companies
are often referred to as “software-as-a-service” or SaaS, but they are
more accurately termed “surveillance as a service,” or “SVaaS.” For
example, a new app-based approach to lending instantly establishes
creditworthiness based on detailed

[...]

You’re able to get in and really understand the daily life of these
customers,” explained the CEO of one lending company that analyzes
10,000 signals per customer.151 Such methods were originally

[...]

Surveillance capitalism was born digital, but as we shall see in
following chapters, it is no longer confined to born-digital companies.
This logic for translating investment into revenue is highly adaptive
and exceptionally lucrative as long as raw-material supplies are free
and law is kept at bay. The rapid migration to surveillance revenues
that is now underway recalls the late-twentieth-century shift from
revenues derived from goods and services to revenues derived from
mastering the speculative and shareholder-value-maximizing

[...]

Who knows? Who decides? Who decides who decides

[...]

According to the philosopher of language John Searle, a declaration is
a particular way of speaking and acting that establishes facts out of
thin air, creating a new reality where there was nothing. Here is how
it works: sometimes we speak to simply describe the world—“you have
brown eyes”—or to change it—“Shut the door.” A declaration combines
both, asserting a new reality by describing the world as if a desired
change were already true: “All humans are created equal.” “They are
yours to command.” As Searle writes, “We

[...]

Searle concludes, “All of institutional reality, and therefore… all of
human civilization is created by… declarations

[...]

Instead, Durkheim trained his sights on the social transformation
already gathering around him, observing that “specialization” was
gaining “influence” in politics, administration, the judiciary,
science, and the arts. He concluded that the division of labor was no
longer quarantined in the industrial workplace. Instead, it had burst
through those factory walls to become the critical organizing
principle of industrial society. This is also an example of Edison’s
insight: that the principles of capitalism initially aimed at
production eventually shape the wider social and moral milieu.
“Whatever opinion one has about the division of labor,” Durkheim wrote,
“everyone knows that it exists, and is more and more becoming one of
the fundamental bases of the social order.”17 Economic imperatives
predictably mandated the division of labor in production, but what was
the purpose of the division of labor in society? This was the question
that motivated Durkheim’s analysis, and his century

[...]

What would hold society together in the absence of the rules and
rituals of clan and kin? Durkheim’s answer was the division of labor.
People’s needs for a coherent new source of meaning and structure were
the cause, and the effect was an ordering

[...]

conclusions are still relevant for us now. He argued that the division
of labor accounts for the interdependencies and reciprocities that link
the many diverse members of a modern industrial society in a larger
prospect of solidarity. Reciprocities breed mutual need, engagement,
and respect, all of which imbue this new ordering principle with moral
force

[...]

Britain, university administrators are already talking about a “missing
generation” of data scientists. The huge salaries of the tech firms
have lured so many professionals that there is no one left to teach the
next generation of students. As one scholar described it, “The real
problem is these people are not dispersed through society. The
intellect and expertise is concentrated in a small number of
companies.”32

[...]

Under the regime of surveillance capitalism, the corporation’s
scientists are not recruited to solve world hunger or eliminate
carbon-based fuels. Instead, their genius is meant to storm the gates
of human experience, transforming it into data and translating it into
a new market colossus that creates wealth by predicting, influencing

[...]

under scrutiny, those long-awaited delivery trucks look more like
automated vehicles of invasion and conquest: more Mad Max than Red
Cross, more Black Sails than Carnival Cruise. The wizards behind their
steering wheels careen across every hill and hollow, learning how to
scrape and stockpile our behavior

[...]

Schmidt was, in fact, merely paraphrasing computer scientist Mark
Weiser’s seminal 1991 article, “The Computer for the 21st Century,”
which has framed Silicon Valley’s technology objectives for nearly
three decades. Weiser introduced what he called “ubiquitous computing”
with two legendary sentences: “The most profound technologies are those
that disappear. They weave themselves into the fabric of everyday life
until they are indistinguishable from it.” He described a new way of
thinking “that allows the computers themselves to vanish into the
background.… Machines that fit the human environment instead of forcing
humans to enter theirs will make using a computer as refreshing as
taking a walk in the woods.”2

[...]

new phase, supply operations were enlarged and intensified to
accommodate economies of scope and economies of action. What does this
entail? The shift toward economies of scope defines a new set of aims:
behavioral surplus must be vast, but it must also be varied. These
variations are developed along two dimensions. The first is the
extension of extraction operations from the virtual world into the
“real

[...]

call economies of action. In order to achieve these economies, machine
processes are configured to intervene in the state of play in the real
world among real people and things. These interventions are designed to
enhance certainty by doing things: they nudge, tune, herd, manipulate,
and modify behavior in specific directions by executing actions as
subtle as inserting a specific phrase into your Facebook news feed,
timing the appearance of a BUY button on your phone, or shutting down
your car engine when an insurance payment is late.

[...]

means of behavioral modification.” The aim of this undertaking is not
to impose behavioral norms, such as conformity or obedience, but rather
to produce behavior that reliably, definitively, and certainly leads to
desired commercial results. The research director of Gartner, the
well-respected business advisory and research firm, makes the point
unambiguously when he observes

[...]

shadow text.4 As the prediction imperative gathers force, it gradually
becomes clear that extraction was the first phase of a
far-more-ambitious project. Economies of action mean that real-world
machine architectures must be able to know as well as to do. Extraction
is not enough; now it must be twinned with execution. The extraction
architecture is combined with a new execution architecture, through
which hidden economic objectives are imposed upon the vast and varied
field of behavior.5 Gradually, as surveillance capitalism’s imperatives

[...]

is an extraordinary statement because there can be no such guarantees
in the absence of the power to make it so. This wider complex that we
refer to as the “means of behavioral modification” is the expression of
this gathering power. The prospect of guaranteed outcomes alerts us to
the force of the prediction imperative, which demands that surveillance
capitalists make the future for the sake of predicting it. Under this
regime, ubiquitous computing is not just a knowing machine; it is an
actuating machine designed to produce more certainty about us and for
them.

[...]

Finally, I want to underscore that although it may be possible to
imagine something like the “internet of things” without surveillance
capitalism, it is impossible to imagine surveillance capitalism without
something like the “internet of things.” Every command arising from the
prediction imperative requires this pervasive real-world material
“knowing and doing” presence. The new apparatus is the material
expression of the prediction imperative, and it represents a new kind
of power animated by the economic compulsion toward certainty. Two
vectors converge in this fact: the early ideals of ubiquitous computing
and the economic imperatives of surveillance capitalism. This
convergence signals the metamorphosis of the digital infrastructure
from a thing that we have to a thing that has us.

[...]

was coaxed to life nearly sixty years ago under the warm equatorial sun
of the Galapagos Islands, when a giant tortoise stirred from her torpor
to swallow a succulent chunk of cactus into which a dedicated scientist
had wedged a small machine. It was a time when scientists reckoned with
the obstinacy of free-roaming animals and concluded that surveillance
was the necessary price of knowledge. Locking these creatures in a zoo
would only eliminate the very behavior that scientists wanted to study,
but how were they to be surveilled? The solutions once concocted by
scholars of elk herds, sea turtles, and geese have been refurbished by
surveillance capitalists and presented as an inevitable feature of
twenty-first-century life on Earth. All that has changed is that now we
are the animals

[...]

If you’re not in the system, you don’t exist

[...]

dark data

[...]

This means that surplus must be both plentiful (economies of scale) and
varied (economies of scope) in both range and depth.

[...]

gamification

[...]

all else fails, insurers are advised to induce a sense of inevitability
and helplessness in their customers. Deloitte counsels companies to
emphasize “the multitude of other technologies already in play to
monitor driving” and that “enhanced surveillance and/or geo-location
capabilities are part of the world we live in now, for better or
worse.”46

[...]

Behavioral data drawn from their experience are processed, and the
results flow in two directions. First, they return to the drivers,
executing procedures to interrupt and shape behavior in order to
enhance the certainty, and therefore profitability, of predictions
(economies of action). Second, prediction products that rank and sort
driver behavior flow into newly convened behavioral futures markets in
which third parties lay bets on what drivers will do now, soon, and
later: Will

[...]

uncontract

[...]

This is not the automation of society, as some might think, but rather
the replacement of society with machine action dictated by economic
imperatives. The uncontract is not

[...]

Despite its pervasiveness both in Silicon Valley and in the wider
culture of data scientists and technology developers, inevitabilism is
rarely discussed or critically evaluated. Paradiso

[...]

Paradiso imagines a society in which it falls to each individual to
protect herself from the omniscient ubiquitous sensate computational
systems of the new apparatus. Rather than paradise, it seems a recipe
for a new breed of madness. Yet this is precisely the world that is now
under construction around us, and this madness appears to be a happy
feature of the plan.

[...]

Cisco Kinetic gets the right data to the right applications at the
right time… while executing policies to enforce data ownership,
privacy, security and even data sovereignty laws.”73 But, as is so
often the case, the most audacious effort to transform the urban
commons into the surveillance capitalist’s equivalent of Paradiso’s
250-acre marsh comes from Google, which has introduced and legitimated
the concept of the “for-profit city.” Just as MacKay had counseled and
Weiser proselytized, the computer would be operational everywhere and
detectable nowhere, always beyond the edge of individual awareness. In
2015, shortly after Google reorganized

[...]

We fund it all… through a very novel advertising model.… We can
actually then target ads to people in proximity, and then obviously
over time track them through things like beacons and location services
as well as their browsing activity

[...]

As we are shorn of alternatives, we are forced to purchase products
that we can never own while our payments fund our own surveillance and
coercion. Adding insult to injury, data rendered by this wave of things

[...]

life pattern marketing” based on techniques derived from military
intelligence known as “patterns of life analysis

[...]

It allows you to tap into people’s compulsive nature by encouraging
impulse buys with the notifications you send out.… It also allows you
to gain insight on your current customers by reading what they’re
saying on Yelp and Facebook.…”24 Another mobile marketing firm
recommends

[...]

November 2017 Quartz investigative reporters discovered that since
early 2017, Android phones had been collecting location information by
triangulating the nearest cell towers, even when location services were
disabled, no apps were running, and no carrier SIM card was installed
in the phone. The information was used to manage Google’s “push”
notifications and messages sent to users on their Android phones,
enabling the company to track “whether an individual with
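
Aside (mine, not the book's): locating a phone from nearby cell towers is, at its simplest, a weighted guess from the towers' known positions. A minimal Python sketch; the tower coordinates and signal weights below are entirely hypothetical:

    # Toy illustration: approximate a phone's position as the signal-
    # strength-weighted centroid of nearby cell towers. Coordinates and
    # weights are hypothetical.
    towers = [
        ((40.7128, -74.0060), 0.9),  # ((lat, lon), relative signal strength)
        ((40.7200, -74.0100), 0.5),
        ((40.7100, -73.9900), 0.3),
    ]

    total_weight = sum(w for _, w in towers)
    lat = sum(p[0] * w for p, w in towers) / total_weight
    lon = sum(p[1] * w for p, w in towers) / total_weight
    print(f"approximate position: {lat:.4f}, {lon:.4f}")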

[...]

databases of ruin.”34

[...]

The company built an “employment index” for the national economy as
well as a “consumption index.” It also touted its ability to generate
quite-specific predictions such

[...]

The agencies’ well-meaning guidelines overlook the inconvenient truth
that transparency and privacy represent friction for surveillance
capitalists in much the same way that improving working conditions,
rejecting child labor, or shortening the working day represented
friction for the early industrial capitalists. It took targeted laws to
change working conditions back

[...]

The only real protection is when an app randomly but regularly
generates a new MAC address for your phone, but of the nine trackers,
only Apple’s performed this operation. The report also identifies a
general pattern of careless
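
Aside: the MAC-address randomization the report credits only to Apple's tracker can be illustrated in a few lines. This is a toy Python sketch of the idea, not any vendor's implementation; it only shows what a random, locally administered address looks like:

    import random

    def random_mac() -> str:
        # First octet: set the "locally administered" bit (0x02) and clear
        # the multicast bit (0x01); the remaining five octets are random.
        first = (random.randint(0, 255) | 0x02) & 0xFE
        rest = [random.randint(0, 255) for _ in range(5)]
        return ":".join(f"{octet:02x}" for octet in [first] + rest)

    print(random_mac())  # a different, unlinkable-looking address each call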

[...]

other words, privacy policies are more aptly referred to as
surveillance policies, and that is what I suggest we call them. There
are many new territories of body rendition

[...]

Nobody reckoned with the fact that the prediction imperative makes
individual ignorance the preferred condition for rendition operations,
just as Arendt had observed and MacKay had prescribed for animals in
the wild. Original sin prefers the dark. The talks continued without
the advocates, and

[...]

1767 the political economist Nathaniel Forster worried that
“fashionable luxury” was spreading “like a contagion,” and he
complained of the “perpetual restless ambition in each of the inferior
ranks to raise themselves to the level of those immediately above
them.”4 Adam Smith wrote insightfully on this social process, noting
that upper-class luxuries can in time be recast as “necessaries.” This
occurs as “the established rules of decency” change to reflect new
customs introduced by elites, triggering lower-cost production methods
that transform what was once unattainable into newly affordable goods
and services.5 Ford’s Model T

[...]

Conversation” stands alone in its promise to dominate raw-material
supply, and the rewards to the One Voice would be astronomical. Casual
talk helps to blur the boundaries between “it”—the apparatus saturated
with commercial agents—and us. In conversation we imagine friendship.
The more we fancy the apparatus as our confidante, nanny, governess,
and support system—a disembodied, pervasive “Mrs. Doubtfire” for each
person—the more experience we allow it to render, and the richer its
supply operations grow. Communication is the first human joy, and a
conversational interface is prized for the frictionless ease with which a
mere utterance can trigger action

[...]

2018, Amazon had inked deals with home builders, installing its Dot
speakers directly into ceilings throughout the house as well as Echo
devices and

[...]

Mark Zuckerberg’s unilateral upending of established privacy norms in
2010, when he famously announced that Facebook users no longer have an
expectation of privacy. Zuckerberg had described the corporation’s
decision to unilaterally release users’ personal information,
declaring, “We decided that these would be the social norms now, and we
just went for it.”55 Despite

[...]

The personalization project descends deeper toward the ocean floor with
these new tools, where they lay claim to yet a new frontier of
rendition trained not only on your personality but also on your
emotional life. If this project of surplus from the depths is to
succeed, then your unconscious—where feelings form before there are
words to express them—must be recast as simply one more source of
raw-material supply for machine rendition and analysis, all of it for
the sake of more-perfect prediction. As a market research report on
affective computing explains, “Knowing the real-time emotional state
can help businesses to sell their product and thereby increase
revenue.”88 Emotion analytics products such as SEWA use

[...]

Conditioning” is a well-known approach to inducing behavior change,
primarily associated with the famous Harvard behaviorist B. F. Skinner.
He argued that behavior modification should mimic the evolutionary
process, in which naturally occurring behaviors are “selected” for
success by environmental conditions. Instead of the earlier, more
simplistic model of stimulus/response, associated with behaviorists
such as Watson and Pavlov, Skinner interpolated a third variable:
“reinforcement.” In his laboratory work with mice and pigeons, Skinner
learned how to observe a range of naturally occurring behaviors in the
experimental animal and then reinforce the specific action, or
“operant,” that he wanted the animal to reproduce. Ultimately, he
mastered intricate designs or “schedules” of reinforcement that could
reliably shape precise behavioral routines. Skinner called the
application
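
Aside: operant conditioning can be caricatured in a few lines of Python. An action that the experimenter reinforces becomes progressively more probable; the action names and reinforcement increment here are invented for illustration:

    import random

    # Toy model of operant conditioning: one "operant" is reinforced, and
    # its selection weight grows until it dominates behavior.
    weights = {"peck_left": 1.0, "peck_right": 1.0, "turn_circle": 1.0}
    reinforced = "peck_right"

    for _ in range(500):
        actions, w = zip(*weights.items())
        chosen = random.choices(actions, weights=w, k=1)[0]
        if chosen == reinforced:
            weights[chosen] += 0.5  # reinforcement strengthens the operant

    print(weights)  # the reinforced action now accounts for most behavior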

[...]

Conditioning at scale is essential to the new science of massively
engineered human behavior.” He believes that smartphones, wearable
devices, and the larger

[...]

Varian endorsed and celebrated this self-authorizing experimental role,
warning that all the data in the world “can only measure correlation,
not causality.”3 Data tell what happened but not why it happened

[...]

element in the construction of high-quality prediction products—i.e.,
those that approximate guaranteed outcomes—depends upon causal
knowledge. As Varian says, “If you really want to understand causality,
you have to run experiments. And if you run experiments continuously,
you can continuously improve your system.”4
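
Aside: the "run experiments continuously" logic Varian describes is, at bottom, randomized A/B assignment. A minimal Python sketch; the base rate and lift are made-up numbers:

    import random
    import statistics

    def run_experiment(n_users: int, base_rate: float = 0.10, lift: float = 0.02):
        """Randomly assign users to control/treatment and record a binary outcome."""
        control, treatment = [], []
        for _ in range(n_users):
            if random.random() < 0.5:
                control.append(1 if random.random() < base_rate else 0)
            else:
                treatment.append(1 if random.random() < base_rate + lift else 0)
        return statistics.mean(control), statistics.mean(treatment)

    c, t = run_experiment(100_000)
    print(f"control: {c:.3f}  treatment: {t:.3f}  estimated effect: {t - c:+.3f}")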

[...]

Psychologists have found that the more a person can project himself or
herself into the feelings of another and take the other’s perspective,
the more likely he or she is to be influenced by subliminal cues,
including hypnosis. Empathy orients people toward other people. It
allows one to get absorbed in emotional experience and to resonate with
others’ experiences

[...]

Facebook’s persistence warns us again of the dispossession cycle’s
stubborn march. Facebook had publicly acknowledged and apologized for
its overt experimental incursions into behavior modification and
emotional manipulation, and it promised adaptations to curb or mitigate
these practices. Meanwhile, a new threshold of intimate life had been
breached. Facebook’s potential mastery of emotional manipulation became
discussable and even taken for granted as habituation set in. From
Princeton’s Fiske to critic Grimmelmann and supporter Meyer, the
experts believed that if Facebook’s activities were to be forced into a
new regulatory regime, the corporation would merely continue in secret

[...]

Individual awareness is the

[...]

The evasion of individual and group awareness was critical to
Facebook’s behavior-modification success, just as MacKay had
stipulated. The first paragraph of the research article on emotional
contagion celebrates this evasion: “Emotional states can be transferred
to others via emotional contagion, leading people to experience the
same emotions without their awareness.” Nor do the young adults of
Australia’s great cities suspect that the precise measure of their
fears and fantasies is exploited for commercial result at the hour and
moment of their greatest vulnerability.

[...]

enemy of telestimulation because it is the necessary condition for the
mobilization of cognitive and existential resources. There is no
autonomous judgment without awareness. Agreement and disagreement,
participation and withdrawal, resistance or collaboration: none of
these self-regulating choices can exist without awareness.

[...]

Indeed, some theorists have suggested that the primary purpose of
self-awareness is to enable self-regulation.” Every threat to human autonomy
begins with an assault on awareness, “tearing down our capacity to
regulate our thoughts, emotions, and desires.”22

[...]

salience of self-awareness as a bulwark against self-regulatory failure
is also underscored in the work of two Cambridge University researchers
who developed a scale to measure a person’s “susceptibility to
persuasion.” They found that the single most important determinant of
one’s ability to resist persuasion is what they call “the ability to
premeditate.”23 This means that people who harness self-awareness to
think through the consequences of their actions are more disposed to
chart their own course and are significantly less vulnerable to
persuasion techniques. Self-awareness also figures in the
second-highest-ranking factor on their scale: commitment. People who
are consciously committed to a course of action or set of principles

[...]

We have seen already that democracy threatens surveillance revenues.
Facebook’s practices suggest an equally disturbing conclusion: human
consciousness itself is a threat to surveillance revenues, as awareness
endangers the larger project of behavior modification. Philosophers
recognize “self-regulation,” “self-determination,” and “autonomy” as
“freedom of will.” The word autonomy derives from the Greek and
literally means “regulation by the self.” It stands in contrast to
heteronomy, which means “regulation by others.” The competitive
necessity of economies of action means that surveillance capitalists
must use all means available to supplant autonomous action with
heteronomous action.

[...]

However, it would be dangerous to nurse the notion that today’s
surveillance capitalists simply represent more of the same. This
structural requirement of economies of action turns the means of
behavioral modification into an engine of growth. At no other time in
history have private corporations of unprecedented wealth and power
enjoyed the free exercise of economies of action supported by a
pervasive global architecture of ubiquitous computational knowledge and
control constructed and maintained by all the advanced scientific
know-how that money can buy.

[...]

Most research on games concludes that these structures can be effective
at motivating action, and researchers generally predict that games will
increasingly be used as the methodology of choice to change individual
behavior.34 In practice, this has meant that the power of games to
change behavior is shamelessly instrumentalized as gamification spreads
to thousands of situations in which a company merely wants to tune,
herd, and condition the behavior of its customers or employees toward
its own objectives

[...]

One analyst compiled a survey of more than ninety such “gamification
cases,” complete with return-on-investment statistics.35 Ian Bogost, a
professor of interactive computing at Georgia Tech and a digital
culture observer, insists that these systems should be called
“exploitationware” rather than games because their sole aim is behavior
manipulation and modification.36

[...]

The zeal for Pokémon Go gradually diminished, but the impact of Hanke’s
accomplishments is indelible. “We’ve only just scratched the surface,”
Hanke told a crowd of fans.46 The game had demonstrated that it was
possible to achieve economies of action on a global scale while
simultaneously directing specific individual actions toward precise
local market opportunities where high bidders enjoy an ever-closer
approximation of guaranteed outcomes. Niantic’s distinctive
accomplishment

[...]

TechCrunch noted the game’s “precise location tracking” and “ability to
perform audio fingerprinting” through its access to your camera and
microphone, concluding, “So it’s prudent to expect some of your
location data to end up in Google’s hands.”48 The Electronic Privacy
Information Center noted in a letter of complaint to the Federal

[...]

However, it does not acknowledge that its services operate on two
levels: game services for players and prediction services for Niantic’s
customers. The company concedes that it uses third-party services,
including Google’s, to “collect and interpret data,” but it is careful
to sidestep the aims of those analyses.51 The seven-page letter
mentions “sponsored

[...]

The genius of Pokémon Go was to transform the game you see into a
higher-order game of surveillance capitalism, a game about a game

[...]

the end we recognize that the probe was designed to explore the next
frontier: the means of behavioral modification. The game about the game
is, in fact, an experimental facsimile of surveillance capitalism’s
design for our future

[...]

Thus began a morbidly fascinating and often bizarre chapter in the
history of American spy craft.55 Much of the new work was conducted in
the context of the CIA’s highly classified MKUltra project, which was
tasked with “research and development of chemical, biological, and
radiological materials capable of employment in clandestine operations
to control human behavior

[...]

Another factor was the 1971 publication of B. F. Skinner’s incendiary
social meditation Beyond Freedom & Dignity. Skinner prescribed a future
based on behavioral control, rejecting the very idea of freedom (as
well as every tenet of a liberal society) and casting the notion of human
dignity as an accident of self-serving narcissism

[...]

First Amendment, the subcommittee argued, “must equally protect the
individual’s right to generate ideas,” and the right to privacy should
protect citizens from intrusions into their thoughts, behavior,
personality, and identity lest these concepts “become meaningless.” It
was in this context that Skinnerian behavioral engineering was singled
out for critical examination

[...]

Where is the hammer of democracy now, when the threat comes from your
phone, your digital assistant, your Facebook login? Who will stand for
freedom now, when Facebook threatens to retreat into the shadows if we
dare to be the friction that disrupts economies of action that have
been carefully, elaborately, and expensively constructed to exploit our
natural empathy, elude our awareness, and circumvent our prospects for
self-determination? If we fail to take notice now, how long before we
are numb to this incursion and to all the incursions? How long until we
notice nothing at all? How long before we forget who we were before
they owned us, bent over the old texts of self-determination in the dim

[...]

Now we know that surveillance capitalists’ ability to evade our
awareness is an essential condition for knowledge production. We are
excluded because we are friction that impedes

[...]

The commodification of behavior under the conditions of surveillance
capitalism pivots us toward a societal future in which an exclusive
division of learning is protected by secrecy, indecipherability, and
expertise. Even when knowledge derived from your behavior is fed back
to you in the first text as a quid pro quo for participation, the
parallel secret operations of the shadow text capture surplus for
crafting into prediction products destined for other marketplaces that
are about you rather than for you. These markets do not depend upon you
except first as a source of raw material from which surplus is derived,
and then as a target for guaranteed outcomes

[...]

this future we are exiles from our own behavior, denied access to or
control over knowledge derived from our experience. Knowledge,
authority, and power rest with surveillance capital, for which we are
merely “human natural resources

[...]

Centuries of debate have been levied on the notion of free will, but
too often their effect has been to silence our own declarations of
will, as if we are embarrassed to assert this most fundamental human
fact. I recognize my direct experience of freedom as an inviolate truth
that cannot be reduced to the behaviorists’ formulations of life as
necessarily accidental and random, shaped by external stimuli beyond my
knowledge or

[...]

influence and haunted by irrational and untrustworthy mental processes
that I can neither discern nor avoid

[...]

American philosopher John Searle, whose work on the “declaration” we
discussed in Chapter 6, comes to a similar conclusion in his
examination of “free will.” He points to the “causal gap” between the
reasons for our actions and their enactment. We may have good reasons
to do something, he observes, but that does not necessarily mean it
will be done. “The traditional name of this gap in philosophy is ‘the
freedom of the will.’” In response to the “sordid history” of this
concept, he reasons, “even if the gap is an illusion it is one we
cannot shake off.… The notion of making and keeping promises
presupposes the gap.… [It] requires consciousness and a sense of
freedom on the part of the promise-making and promise-keeping agent

[...]

Our freedom flourishes only as we steadily will ourselves to close the
gap between making promises and keeping them. Implicit in this action
is an assertion that through my will I can influence the future. It
does not imply total authority over the future, of course, only over my
piece

[...]

should an experience as elemental as this claim on the future tense be
cast as a human right? The short answer is that it is only necessary
now because it is imperiled. Searle argues that such elemental
“features of human life” rights are crystallized as formal human rights
only at that moment in history when they come under systematic threat.
So, for example, the ability to speak is elemental. The concept of
“freedom of speech” as a formal right emerged only when society evolved
to a degree of political complexity that the freedom to speak came
under threat. The philosopher observes that speech is not more
elemental to human life than breathing or being able to move one’s
body. No one has declared a “right to breathe” or a “right to bodily
movement” because these elemental rights have not come under attack and
therefore do not require formal protection. What counts as a basic
right, Searle argues, is both “historically contingent” and “pragmatic

[...]

Most simply put, there is no freedom without uncertainty; it is the
medium in which human will is expressed in promises. Of course, we do
not only make promises to ourselves; we also make promises to one
another. When we join our wills and our promises, we create the
possibility of collective action toward a shared future, linked in
determination to make our vision real in the world. This is the origin
of the institution we call “contract,” beginning with the ancient
Romans.6 Contracts originated as shared “islands of predictability”
intended to mitigate uncertainty for the human community, and they
still retain this meaning. “The simplest way

[...]

The uncontract aims instead for a condition that the economist Oliver
Williamson describes as “contract utopia”: a state of perfect
information known to perfectly rational people who always perform
exactly as promised.12 The problem is, as Williamson writes, “All
complex contracts are unavoidably incomplete

[...]

you have ever seen a house built according to architectural plans, then
you have a good idea of what Williamson means. There is no blueprint
that sufficiently details everything needed to convert drawings and
specifications into an actual house. No plan anticipates every problem
that might arise, and most do not come close. The builders’ skills are
a function of how they collaborate to invent the actions that fulfill
the intention of the drawings as they solve the unexpected but
inevitable complications that arise along the way. They work together
to construct a reality from the uncertainty of the plan

[...]

Were “contract utopia” to exist, Williamson says, it would best be
described as a “plan” that, like other “utopian modes,” requires “deep
commitment to collective purposes” and “personal subordination.”
Subordination to what? To the plan. Contract in this context of perfect
rationality is what Williamson describes as “a world of planning.” Such
planning was the basic institution of socialist economics, where the
“new man” was idealized as possessing “a high level of cognitive
competence” and therefore, it was espoused, could design highly
effective plans.14 Varian deftly swaps out socialism’s “new man” and
installs instead a market defined by surveillance capitalism’s economic
imperatives, expressed through a ubiquitous computational architecture,
the machine intelligence capabilities to which data are continuously
supplied, the analytics that discern patterns, and the algorithms that
convert them into rules. This is the essence of the uncontract, which

[...]

Uncertainty is not

[...]

chaos but rather the necessary habitat of the present tense. We choose
the fallibility of shared promises and problem solving over the certain
tyranny imposed by a dominant power or plan because this is the price
we pay for the freedom to will, which founds our right to the future
tense. In the absence of this freedom, the future collapses into an
infinite present of mere behavior, in which there can be no subjects
and no projects: only objects. In the future

[...]

Life inclines us to take action and to make commitments even when the
future is unknown. Anyone who has brought a child into the world or

[...]

the real world of human endeavor, there is no perfect information and
no perfect rationality

[...]

improve their approximation to guaranteed outcomes. Just as industrial
capitalism was driven to the continuous intensification of the means of
production, so surveillance capitalists are now locked in a cycle of
continuous intensification of the means of behavioral modification

[...]

Surveillance capitalists’ interests have shifted from using automated
machine processes to know about your behavior to using machine
processes to shape your behavior according to their interests. In other
words, this decade-and-a-half trajectory has taken us from automating
information flows about you to automating you. Given the conditions of
increasing ubiquity, it has become difficult if not impossible to
escape this audacious, implacable web

[...]

In order to reestablish our bearings, I have asked for a rebirth of
astonishment and outrage. Most of all, I have asked that we reject the
Faustian pact of participation for dispossession that requires our
submission to the means of behavioral modification built on the
foundation of the Google declarations. I am also mindful, though, that
when we ask How did they get away with it? there are many compelling
reasons to consider, no one of which stands alone

[...]

need laws that reject the fundamental legitimacy of surveillance
capitalism’s declarations and interrupt its most basic operations,
including the illegitimate rendition of human experience as behavioral
data; the use of behavioral surplus as free raw material; extreme
concentrations of the new means of production

[...]

shock and awe

[...]

withdrawal of agreement takes two broad forms, a distinction that will
be useful as we move into Part III. The first is what I call the
counter-declaration. These are defensive measures such as encryption
and other privacy tools, or arguments for “data ownership.” Such
measures may be effective in discrete situations

[...]

turn to the history of the Berlin Wall as an illustration of these two
forms of disagreement

[...]

industrial capitalism dangerously disrupted nature, what havoc might
surveillance capitalism wreak on human nature? The answer to this
question requires a return to imperatives

[...]

Industrial capitalism brought us to the brink of epic peril, but not as
a consequence of an evil lust for destruction or runaway technology.
Rather, this result was ineluctably driven by its own inner logic of
accumulation, with its imperatives of profit maximization, competition,
the relentless drive for labor productivity through the technological
elaboration of production, and growth funded by the continuous
reinvestment of

[...]

Similarly, the meaning of Polanyi’s prophecy for us now can be grasped
only through the lens of surveillance capitalism’s economic imperatives
as they frame its claim to human experience. If we are to rediscover
our sense of astonishment, then let it be here: if industrial
civilization flourished at the expense of nature and now threatens to
cost us the Earth, an information civilization shaped by surveillance
capitalism will thrive at the expense of human nature and threatens to
cost us our humanity

[...]

The idea from the start was that naming and taming are inextricable,
that fresh and careful naming can better equip us to intercept these
mechanisms of dispossession, reverse their action, produce urgently
needed friction, challenge the pathological division of learning, and
ultimately synthesize new forms of information capitalism that
genuinely meet our needs for effective life

[...]

the heart of Gentile’s political philosophy is the concept of the
“total.”3 The state was to be understood as an inclusive organic unity
that transcends individual

[...]

secret plans executed by secret police, the silent complicities and
hidden atrocities, the ceaseless transformation of who or what was up
or down, the intentional torsion of facts into anti-facts accompanied
by a perpetual deluge of propaganda, misinformation, euphemism, and
mendacity. The authoritative leader, or “egocrat,” to use the French
philosopher Claude Lefort’s term, displaces the rule of law and
“common” sense to become the quixotic judge of what is just or unjust,
truth or lie, at each moment.9

[...]

Great Terror

[...]

murders of whole sectors of the Soviet population, from poets to
diplomats, generals to political loyalists. According to Soviet
historian Robert Conquest, that two-year period saw seven million
arrests, one million executions, two million deaths in labor camps, one
million people imprisoned, and another seven million people still in
camps by the end of 1938.11 Despite the immediacy of catastrophic

[...]

Until the rise of surveillance capitalism, the prospect of
instrumentarian power was relegated to a gauzy world of dream and
delusion. This new species of power follows the logic of Planck, Meyer,
and Skinner in the forfeit of freedom for knowledge, but those
scientists each failed to anticipate the actual terms of this
surrender. The knowledge that now displaces our freedom is proprietary.
The knowledge is theirs, but the lost freedom belongs solely to us.
With this origin story in

[...]

Instrumentarian power cultivates an unusual “way of knowing” that
combines the “formal indifference” of the neoliberal worldview with the
observational perspective of radical behaviorism

[...]

Forget the cliché that if it’s free, “You are the product.” You are not
the product; you are the abandoned carcass. The “product” derives from
the surplus that is ripped from your life. Big Other finally enables the
universal technology of behavior that, as Skinner, Stuart MacKay, Mark
Weiser, and Joe Paradiso each insisted, accomplishes its aims quietly
and persistently, using methods that intentionally bypass our
awareness, disappearing into the background of all things. Recall that
Alphabet/Google’s Eric Schmidt provoked uproar in 2015 when in response
to a question on the future of the web, he said, “The internet will
disappear.” What he really meant was that “The internet will disappear
into Big Other.”

[...]

We may confuse Big Other with the behaviorist god of the vortex, but
only because it effectively conceals the machinations of surveillance
capital that are the wizard behind the digital curtain

[...]

Under the regime of instrumentarian power, the mental agency and
self-possession of the right to the future tense are gradually
submerged beneath a new kind of automaticity: a lived experience of
stimulus-response-reinforcement aggregated as the comings and goings of
mere organisms. Our conformity is irrelevant to instrumentarianism’s
success. There is no need for mass submission to social norms, no loss
of self to the collective induced by terror and compulsion, no offers
of acceptance

[...]

Take one wrong step, one deviation from the path of seamless
frictionless predictability, and that same voice turns acid in an
instant as it instructs “the vehicular monitoring system not to allow
the car to be started.”

[...]

belonging as a reward for bending to the group. All of that is
superseded by a digital order that thrives within things and bodies,
transforming volition into reinforcement and action into conditioned
response. In this way instrumentarian power produces endlessly accruing
knowledge for surveillance capitalists and endlessly diminishing
freedom for us as it continuously renews surveillance capitalism’s
domination of the division of learning in society. False consciousness
is no longer produced by the hidden facts of class and their relation
to production but rather by the hidden facts of instrumentarian power’s
command over the division of learning in society as it usurps the
rights to answer the essential questions: Who knows? Who decides? Who
decides who decides? Power was once identified with the ownership of
the means of production, but it is now identified with ownership of the
means of behavioral modification that is Big Other.

[...]

The last stage of the laboring society, the society of jobholders,
demands of its members a sheer automatic functioning, as though
individual life had actually been submerged in the over-all life
process of the species and the only active decision still required of
the individual were to let go, so to speak, to abandon his
individuality, the still individually sensed pain and trouble of
living, and acquiesce in a dazed

[...]

tranquilized,” functional type of behavior. The trouble with modern
theories of behaviorism is not that they are wrong but that they could
become true, that they actually are the best possible conceptualization
of certain obvious trends in modern society. It is quite conceivable
that the modern age—which began with such an unprecedented and
promising outburst of human activity—may end in the deadliest, most
sterile passivity history has ever known.5 Is this to be our home

[...]

Now imagine, decades hence, another thinker meditating on the
“disturbing relevance” of instrumentarian power, observing that “the
true problems of our time cannot be understood, let alone solved,
without acknowledgement that instrumentarianism became this century’s
curse only because it so terrifyingly took care of its problems.” What
problems? I have

[...]

In the age of surveillance capitalism it is instrumentarian power that
fills the void, substituting machines for social relations, which
amounts to the substitution of certainty for society. In this imagined
collective life, freedom is forfeit to others’ knowledge, an
achievement that is only possible with the resources of the shadow text

[...]

private institutions of capital led the way in this ambitious
reformation of collective life and individual experience, but they
found necessary support from public institutions, especially as the
declaration of a “war on terror” legitimated every inclination to
enshrine machine-produced certainty as the ultimate solution to
societal uncertainty. These mutual affinities assured that
instrumentarian power would not be a stepchild but rather an equal
partner or even, with increasing regularity, the lord and master upon
whom the state depends in its quest for “total awareness.” That
instrumentarian power is regarded as the certain

[...]

White House briefing memo encouraged the companies to develop a
“radicalism algorithm” that would digest social media and other sources
of surplus to produce something comparable to a credit score, but aimed
at evaluating the “radicalness” of online content.14 The turn to
instrumentarian

[...]

Global Internet Forum to Counter Terrorism. The objective was to
tighten the net of instrumentarian power through

[...]

One startup, Geofeedia, specializes in detailed location tracking of
activists and protesters, such as Greenpeace members or union
organizers, and the computation of individualized

[...]

ACLU attorney countered that the government is using tech companies “to
build massive dossiers on people” based on nothing more than their
constitutionally protected speech.26 Another, more prominent
surveillance-as-a-service company, Palantir, once touted by Bloomberg
Businessweek as “the war on terror’s secret weapon,” was found to be in
a secret collaboration with the New Orleans Police Department to test
its “predictive policing” technology. Palantir’s software not only
identified gang members but also “traced people’s ties to other gang
members, outlined

[...]

thrust their scores into an inexorable downward spiral: “First your
score drops. Then your friends hear you are on the blacklist and,
fearful that their scores might be affected, quietly drop you as a
contact. The algorithm notices, and your

[...]

places less value on privacy than does Western culture and that most
Chinese have accommodated to the certain knowledge of online government
surveillance and censorship. The most common word for privacy, yinsi,
didn’t even appear in popular Chinese dictionaries until the
mid-1990s.42 Chinese citizens have accepted national ID cards with
biometric chips, “birth permits,” and now social credit rankings
because their society has been saturated with surveillance and
profiling for decades. For example, the “dang’an” is a wide-ranging
personal dossier compiled on hundreds of millions of urban residents
from childhood and maintained throughout life. This “Mao-era system for
recording the most intimate details of life” is updated by teachers,
Communist Party officials, and employers. Citizens have no rights to
see its contents, let alone contest them. The dossier is only one
feature of long-institutionalized

[...]

government urges the tech companies to train their algorithms for a
“radicalism” score. Indeed, the work of the shadow text is to evaluate,
categorize, and predict our behavior in millions of ways that we can
neither know nor combat—these are our digital dossiers. When it comes
to credit scoring, US and UK banks and

[...]

the Chinese context, the state will run the show and own it, not as a
market project but as a political one, a machine solution that shapes a
new society of automated behavior for guaranteed political and social
outcomes: certainty without terror. All the pipes from all the supply
chains will carry behavioral surplus to this new, complex means of
behavioral modification. The state will assume the role of the
behaviorist god, owning the shadow text and determining the schedule of
reinforcements and the behavioral routines that it will shape. Freedom
will be forfeit to knowledge, but it will be the state’s knowledge that
it exercises, not for the sake of revenue but for the sake of its own
perpetuation.

[...]

Joe” Stalin

[...]

The road from Shenzhen to an American or European airport also leads to
the Roomba vacuum cleaner mapping your living room and your breakfast
with Alexa

[...]

one direction lies the possibility of a synthetic declaration for a
third modernity based on the strengthening of democratic institutions
and the creative construction of a double movement for our time. On
this road we harness the digital to forms of information capitalism
that reunite supply and demand in ways that are both genuinely
productive of effective life and compatible with a flourishing
democratic social order. The first step down this road begins with
naming, establishing our bearings, reawakening our astonishment, and
sharing a sense of righteous indignation.

[...]

They aim to fashion a new society that emulates machine learning in
much the same way that industrial society was patterned on the
disciplines and methods of factory production. In their vision,
instrumentarian power replaces social trust, Big Other substitutes
certainty for social relations, and society as we know it shades into
obsolescence

[...]

Citing Abraham Lincoln, Facebook’s founder located his company’s
mission in the evolutionary time line of civilization, during which
humanity organized itself first in tribes, then cities, then nations.
The next phase of social evolution would be “global community,” and
Facebook was to lead the way, constructing the means and overseeing the
ends.14 Speaking at Facebook’s 2017 developers’ conference, Zuckerberg
linked his assertion of the company’s historic role in establishing a
“global community” to the standard myth of the modern utopia, assuring
his followers, “In the future, technology is going to… free us up to
spend more time on the things we all care about, like enjoying and
interacting with each other and expressing ourselves in new ways.… A
lot more of us are gonna do what today is considered the arts, and
that’s gonna form the basis of a lot of our communities

[...]

The “societal goal” articulated by the leading surveillance capitalists
fits snugly into the notion of limitless technological progress that
dominated utopian thought from the late eighteenth century through the
late nineteenth century, culminating with Marx. Indeed, surveillance
capitalists such as Nadella, Page, and Zuckerberg conform to five of
the six elements with which the great scholars of utopian thought,
Frank and Fritzie Manuel, define the classic profile of the most
ambitious modern utopianists: (1) a tendency toward highly focused
tunnel vision that simplifies the utopian challenge, (2) an earlier and
more trenchant grasp of a “new state of being” than other
contemporaries, (3) the obsessive pursuit and defense of an idée fixe,
(4) an unshakable belief in the inevitability of one’s ideas coming to
fruition, and (5) the drive for total reformation at the level of the
species and the entire world system

[...]

Often a utopian foresees the later evolution and consequences of
technological development already present in an embryonic state; he may
have antennae sensitive to the future. His gadgets, however, rarely go
beyond the mechanical potentialities of his age. Try as he may to
invent something wholly new, he cannot make a world out of nothing.”18
In our time, however, surveillance capitalists can and do make such a
world—a genuinely historic deviation from the norm. Individually and
collectively, the

[...]

The only way to grasp the theory advanced in their applied utopistics
is to reverse engineer their operations and scrutinize their meaning,
as we have done throughout these chapters.

[...]

Microsoft’s instrumentarian society, the factories and workplaces are
like Skinner’s labs, and the machines replace his pigeons and rats.
These are the settings where the architecture and velocities of
instrumentarian power are readied for translation to society in a
digital-age iteration of Walden Two in which machine relations are the
model for social relations. Nadella’s construction site exemplifies the
grand confluence in which machines and humans are united as objects in
the cloud, all instrumented and orchestrated in accordance with the
“policies.” The magnificence of “policies” lies precisely in the fact

[...]

result is that “policies” are functionally equivalent to plans, as Big
Other directs human and machine action. It ensures that doors will be
locked or unlocked, car engines will shut down or come to life, the
jackhammer will scream “no” in suicidal self-sacrifice, the worker will
adhere to norms, the group will swarm to defeat anomalies. We will all
be safe as each organism hums in harmony with every other organism,
less a society than a population that ebbs and flows in perfect
frictionless confluence, shaped by the means of behavioral modification
that elude our awareness and thus can neither be mourned nor resisted.

[...]

the twentieth century the critical success factors of industrial
capitalism—efficiency, productivity, standardization,
interchangeability, the minute division of labor, discipline,
attention, scheduling, conformity, hierarchical administration, the
separation of knowing and doing, and so forth—were discovered and
crafted in the workplace and then transposed to society, where they
were institutionalized in schools, hospitals, family life, and
personality. As generations of scholars have documented, society became
more factory-like so that we might train and socialize the youngest
among us to fit the new requirements of a mass production order.

[...]

With conspicuously thin theory complemented by thick practice, the
patented device is designed to monitor user behavior in order to
preemptively detect “any deviation from normal or acceptable behavior
that is likely to affect the

[...]

Alternatively, the behavior could be assessed in relation to a “feature
distribution representing normal and/or acceptable behavior for an
average member of a population
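
Aside: "deviation from a feature distribution representing an average member of a population" reads, in its simplest form, like a z-score test. A toy Python sketch; the population values and flagged reading are hypothetical:

    import statistics

    # Toy reading of the patent's idea: flag values far outside the spread
    # observed for an "average member of a population".
    population = [4.1, 3.8, 5.0, 4.4, 4.7, 3.9, 4.2, 4.6, 4.0, 4.3]
    mean = statistics.mean(population)
    stdev = statistics.stdev(population)

    def deviates(value: float, threshold: float = 3.0) -> bool:
        return abs(value - mean) / stdev > threshold

    print(deviates(9.5))  # True: far outside the population's distribution
    print(deviates(4.5))  # False: within the normal range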

[...]

user’s mental state

[...]

the circle widens as the patent specifications unfold. The scientists
note the utility of alerts for health care providers, insurance
companies, and law-enforcement personnel. Here is a new
surveillance-as-a-service opportunity geared to preempt whatever
behavior clients choose. Microsoft’s patent returns us to Planck,

[...]

In each case, corporate objectives define the “policies” toward which
confluent behavior harmoniously streams.

[...]

The machine hive—the confluent mind created by machine learning—is the
material means to the final elimination of the chaotic elements that
interfere with guaranteed outcomes

[...]

Instead of the typical assurances that machines can be designed to be
more like human beings and therefore less threatening, Schmidt and
Thrun argue just the opposite: it is necessary for people to become
more machine-like.

[...]

In this world the “correct” outcomes are known in advance and
guaranteed in action. The same ubiquitous instrumentation and
transparency that define the machine system must also define the social
system, which in the end is simply another way of describing the ground
truth of instrumentarian society.

[...]

this human hive, individual freedom is forfeit to collective knowledge
and action. Nonharmonious elements are preemptively targeted with high
doses of tuning, herding, and conditioning, including the full
seductive force of social persuasion and influence. We march in
certainty, like the smart machines. We learn to sacrifice our freedom
to collective knowledge imposed by others and for the sake of their
guaranteed outcomes. This is the signature of the third modernity
offered up by surveillance capital as its answer to our quest for
effective life together

[...]

Pentland is often referred to as the “godfather of wearables,”
especially Google Glass. In 1998 he predicted that wearables “can
extend one’s senses, improve memory, aid the wearer’s social life and
even help him or her stay calm and collected

[...]

Most noteworthy is that Pentland “completes” Skinner, fulfilling his
social vision with big data, ubiquitous digital instrumentation,
advanced mathematics, sweeping theory, numerous esteemed coauthors,
institutional legitimacy, lavish funding, and corporate friends in high
places without having attracted the worldwide backlash, moral
revulsion, and naked vitriol once heaped on Harvard’s outspoken
behaviorist. This fact alone suggests the depth of psychic numbing to
which we have succumbed and the loss of our collective bearings.

[...]

’s like watching beavers from outer space, like Jane Goodall watching
gorillas. You observe from a distance.”7 (This is a slur on Goodall, of
course, whose seminal genius was her ability to understand the animals
she studied not as “other ones” but rather as “one of us.”)

[...]

The team saw that it would be possible to exploit the increasingly
“ubiquitous infrastructure” of mobile phones and combine those data
with new streams of information from their wearable behavioral
monitors. The result was a radical new solution that Pentland and Eagle
called “reality mining

[...]

Pentland argued that information gathered by his
sociometers—“unobtrusive wearable sensors” measuring communication,
voice tones, and body language—“could help managers understand who is
working with whom and infer the relationships between colleagues” and
“would be an efficient way to find people who might work well
together.”20

[...]

people analytics

[...]

Pentland appeared in 2016 at a conference organized by Singularity
University, a Silicon Valley hub of instrumentarian ideology funded in
part by Larry Page. An interviewer tasked to write about Pentland
explains, “Though people are one of the most valuable assets in an
organization, many companies are still approaching management with a
20th century mentality.… Pentland saw the factor that was always
messing things up was—the people.”29 Like Nadella, Pentland described
his aims as developing the social systems that would work along the
same lines as the machine systems, using behavioral data flows to judge
the “correctness” of action patterns and to intervene when it is
necessary to change “bad” action to “correct” action. “If people aren’t
interacting correctly and information isn’t spreading correctly,”
Pentland warns, “people

[...]

Pentland articulated his ambitions for the capabilities and objectives
of this new milieu in a series of papers, published primarily between
2011 and 2014, but one remarkable 2011 essay of which he is the sole
author stands out: “Society’s Nervous System: Building Effective
Government, Energy, and Public Health Systems.”31

[...]

The initial premise is reasonable enough: industrial-age technology
once revolutionized the world with reliable systems for water, food,
waste, energy, transportation, police, health care, education, and so
forth, but these systems are now hopelessly “old,” “centralized,”
“obsolete,” and “unsustainable.” New digital systems are required that
must be “integrated,” “holistic,” “responsive,” “dynamic,” and
“self-regulating”: “We need a radical rethinking of societies’ systems.
We must create a nervous system for humanity that maintains the
stability of our societies’ systems

[...]

What is missing… are the dynamic models of demand and reaction,” along
with an architecture that guarantees “safety, stability, and
efficiency.… The models required must describe human

[...]

Regarding incentives, Pentland outlines a principle of “social
efficiency,” which means that participation must provide value to the
individual but also to the system as a whole.37 For the sake of this
wholeness, it is believed, each of us will surrender to a totally
measured life of instrumentarian order

[...]

Skinner advocated, via Frazier, that the virtue of a “planned society”
is “to keep intelligence on the right track, for the good of society
rather than of the intelligent individual.… It does this by making sure
that the individual will not forget his

[...]

Pentland says that “continuous streams of data about human behavior”
mean that everything from traffic, to energy use, to disease, to street
crime will be accurately forecast, enabling a “world without war or
financial crashes, in which infectious disease is quickly detected and
stopped, in which energy, water, and other resources are no longer
wasted, and in which governments are part of the solution rather than
part of the problem.”48 This new “collective intelligence” operates to
serve the greater good as we learn to act “in a coordinated manner”
based on “social universals.” “Great leaps in health care,
transportation

[...]

The main barriers are privacy concerns and the fact that we don’t yet
have any consensus around the trade-offs between personal and social
values.” Like Skinner, he is emphatic that these attachments to a
bygone era of imperfect knowledge threaten to undermine the prospect of
a perfectly engineered future society: “We cannot ignore the public
goods that such a nervous system could provide.…”49 Pentland avoids the
question “Whose greater good?” How is the greater good determined when
surveillance capitalism owns the machines and the means of behavioral
modification? “Goodness” arrives already oriented toward the interests
of the owners of the means of behavioral modification and the clients
whose guaranteed outcomes they seek to achieve. The greater good is
someone’s, but it may not be ours

[...]

Capitalism and socialism are equally tainted by their shared emphasis
on economic growth, which breeds overconsumption and pollution. Skinner
is intrigued by the Chinese system but rejects it on the grounds of the
bloody revolution that any effort to convert Westerners would entail.
“Fortunately,” Skinner concludes in the preface to Walden Two, “there
is another possibility.” This option is Skinner’s version of a
behaviorist society that provides a way in which “political action is
to be avoided.” In Walden Two a “plan” replaces politics, overseen by a
“noncompetitive” group of “Planners” who eschew power in favor of the
dispassionate administration of the schedules of reinforcement aimed at
the greater good.52 Planners exercise unique control over society but
“only because that control is necessary for the proper functioning of
the community

[...]

Pentland worries that our political-economic constructs such as
“market” and “class” hail from an old, slow world of the eighteenth and
nineteenth centuries. The new, “light-speed hyperconnected world”
leaves no time for the kind of rational deliberation and face-to-face
negotiation and compromise that characterized the social milieu in
which such political concepts originated

[...]

There is no room for politics in this instrumentarian society because
politics means establishing and asserting our bearings. Individual
moral and political bearings are a source of friction that wastes
precious time and diverts behavior from confluence

[...]

Computation thus replaces the political life of the community as the
basis for governance. The depth and breadth of instrumentation make it
possible, Pentland says, to calculate idea flow, social network
structure, the degree of social influence between people, and even
“individual susceptibilities to new ideas.” Most important,
instrumentation makes it possible for those with the God view to modify
others’ behavior. The data provide a “reliable prediction of how
changing any of these variables will change the performance of all the
people

[...]

Frazier acknowledges that you cannot coerce people into doing the right
thing. The solution is far more subtle and sophisticated, based upon
scientifically calibrated schedules of reinforcement: “Instead you have
to set up certain behavioral processes which will lead the individual
to design his own ‘good’ conduct.… We call that sort of thing
‘self-control.’ But don’t be misled, the control always rests in the
last analysis in the hands of society

[...]

Pentland’s idea is comparable: “The social physics approach to getting
everyone to cooperate” is “social network incentives,” his version of
“reinforcement.” With such incentives, he explains, “we focus on
changing the connections between people rather than focusing on getting
people individually to change their behavior.… We can leverage those
exchanges to generate social pressure for change.”60 Social media is
critical to establishing these tuning capabilities, Pentland believes,
because this is the environment in which social pressure can best be
controlled, directed, manipulated, and scaled

[...]

Pentland ignores the role of empathy in emulation because empathy is a
felt experience that is not subject to the observable metrics required
for computational governance. Instead, Pentland subscribes to the label
Homo imitans to convey that it is mimicry, not empathy, and certainly
not politics, which defines human existence

[...]

stream of ideas as a swarm or collective intelligence, flowing through
time, with all the humans in it learning from each other’s experiences
in order to jointly discover the patterns of preferences and habits of
action that best

[...]

What is being abolished is autonomous man—the inner man, the
homunculus, the possessing demon, the man defended by the literatures
of freedom and dignity

[...]

One important study of Bitcoin, the cryptocurrency that relies on
blockchain, suggests that such machine solutions both express and
contribute to the general erosion of the social fabric in ways that are
both consistent with instrumentarianism and further pave the way for
its success. Information scholars Primavera De Filippi and Benjamin
Loveluck conclude that contrary to popular belief, “Bitcoin is neither
anonymous nor privacy-friendly

[...]

We can begin by asking our children. Without knowing it, we sent the
least formed and most vulnerable among us to scout the hive and settle
its wilderness. Now their messages are filtering in from the frontier

[...]

Indeed, Facebook’s early advantage in this work arose in no small
measure from the simple fact that its founders and original designers
were themselves adolescents and emerging adults. They designed
practices for an imagined universe of adolescent users and college

[...]

contrary to Pentland’s belief that “class” divisions would disappear,
life in the hive produces new cleavages and forms of stratification:
not only tune or be tuned but also pressure or be pressured

[...]

Schüll learned that addictive players seek neither entertainment nor
the mythical jackpot of cash. Instead, they chase what Harvard Medical
School addiction researcher Howard Shaffer calls “the capacity of the
drug or gamble to shift subjective experience,” pursuing an
experiential state that Schüll calls the “machine zone,” a state of
self-forgetting in which one is carried along by an irresistible
momentum that feels like one is “played by the machine.”12 The machine
zone achieves a sense of complete immersion

[...]

Addiction by Design

[...]

Shaffer, the addiction researcher, has identified five elements that
characterize this state of compulsion: frequency of use, duration of
action, potency, route of administration, and player attributes

[...]

Perhaps the most difficult quality to capture is that in this period
that precedes the hard bargaining, an “inner” sense of “self” simply
does not yet exist. It is a time when “I” am whatever the “others”
think of me, and how “I” feel is a function of how the “others” treat
me. Instead of a stable sense of identity, there is only a chameleon
that reinvents itself depending upon the social mirror into which it is
drawn. In this condition, the “others” are not individuals but the
audience for whom I perform. Who “I” am depends upon the audience. This
state

[...]

Research shows that these big leaps in self-construction are stimulated
by experiences such as structured reflection, conflict, dissonance,
crisis, and failure. The people who help trigger this new inward
connection refuse to act as our mirrors. They reject fusion in favor of
genuine reciprocity

[...]

What are the consequences of the failure to win a healthy balance
between inner and outer, self and relationship? Clinical studies
identify specific patterns associated with this developmental
stagnation. Not surprisingly, these include an inability to tolerate
solitude, the feeling of being merged with others, an unstable sense of
self, and even an excessive need to control others as a way of keeping
the mirror close. Loss of the mirror is the felt equivalent of
extinction

[...]

The cultivation of inner resources is thus critical to the capacity for
intimacy and relationship, challenges that have become more
time-consuming with each new phase of the modern era. And while young
people are bound as ever to the enduring existential task of
self-making, our story suggests three critical ways in which this task
now converges with history and the unique conditions of existence in
our time

[...]

For example, Evil by Design author Chris Nodder, a user-experience
consultant, explains that evil design aims to exploit human weakness by
creating interfaces that “make users emotionally involved in doing
something that benefits the designer more than them.” He coaches his
readers in psychic numbing, urging them to accept the fact that such
practices have become the standard and suggesting that consumers and
designers find ways to “turn them to your advantage

[...]

Facebook’s precocious mastery of “social proof”: “Much of our behavior
is determined by our impressions of what is the correct thing to do…
based on what we observe others doing.… This influence is known as
social proof

[...]

Most critical is that the more the need for the “others” is fed, the
less able one is to engage the work of self-construction. So
devastating is the failure to attain that positive equilibrium between
inner and outer life that Lapsley and Woodbury say it is “at the heart”
of most adult personality disorders

[...]

more that a user “liked,” the more that she informed Facebook about the
precise shape and composition of her “hand,” thus allowing the company
to continuously tighten the glove and increase the predictive value of
her signals.

[...]

On the demand side, Facebook’s “likes” were quickly coveted and craved,
morphing into a universal reward system or what one young app designer
called “our generation’s crack cocaine.” “Likes” became those variably
timed dopamine shots, driving users to double down on their bets “every
time they

[...]

News Feed is also the fulcrum of the social mirror. In the years
between revulsion and reverence, News Feed became Facebook’s most
intensely scrutinized object of data science and the subject of
extensive organizational innovation, all of it undertaken at a level of
sophistication and capital intensity that one might more naturally
associate with the drive to solve world hunger, cure cancer, or avert
climate destruction

[...]

laugh, cry, smile, click, like, share, or comment.”40 The glove
tightens around the hand with closed feedback loops enabled by the God
view, which favors posts from people with whom you have already
interacted, posts that have drawn high levels of engagement from
others, and posts that are like the ones with which you have already
engaged

[...]

According to the 302 most significant quantitative research studies on
the relationships between social media use and mental health (most of
them produced since 2013), the psychological process that most defines
the Facebook experience is what psychologists call “social
comparison.”45 It is usually considered a natural and virtually

[...]

One study found an increase in criminal larceny as television diffused
across society, awakening an awareness of and desire for consumer
goods. A related issue was that increased exposure to television
programs depicting affluence led to “the overestimation of others’
wealth and more dissatisfaction with one’s own life

[...]

Both television and social media deprive us of real-life encounters, in
which we sense the other’s inwardness and share something of our own,
thus establishing some threads of communality. Unlike

[...]

consequence of the new density of social comparison triggers and their
negative feedback loops is a psychological condition known as FOMO
(“fear of missing out”). It

[...]

Profile inflation triggers more negative self-evaluation among
individuals as people compare themselves to others, which then leads to
more profile inflation, especially among larger networks that include
more “distant friends.” As one study concluded, “Expanding one’s social
network by adding a number of distant friends through Facebook may be
detrimental by stimulating negative emotions for users

[...]

This compulsive behavior is intended to produce relief in the form of
social reassurance, but it predictably breeds more anxiety and more
searching.52 Social comparison

[...]

When considered from the vantage point of the self-other balance,
positive social comparisons are just as pernicious as negative
comparisons. Both are substitutes for the “hard bargain” of carving out
a self that is capable of reciprocity rather than fusion

[...]

Facebook use does not promote well-being.… Individual social media
users might do well to curtail their use of social media and focus
instead on real-world relationships

[...]

This is the world of Pentland’s “social learning,” his theory of
“tuning” little more than the systematic manipulation of the rewards
and punishments of inclusion and exclusion. It succeeds through the
natural human inclination to avoid psychological pain

[...]

confluence,” in which harmonies are achieved at the expense of the
psychological integrity of participants

[...]

This synthetic hive is a devilish pact for a young person. In terms of
sheer everyday effectiveness—contact, logistics, transactions,
communications—turn away, and you are lost. And if you simply crave the
fusion juice that is proof of life at a certain age and stage—turn
away, and you are extinguished

[...]

Just as Pentland stipulated, these closed loops are imposed outside the
realm of politics and individual volition. They move in stealth,
crafting their effects at the level of automatic psychological
responses and tipping the self-other balance toward the
pseudo-harmonies of the hive mind. In this process, the inwardness that
is the necessary source of autonomous action and moral judgment suffers
and suffocates. These are the preparatory steps toward the death of
individuality that Pentland advocates. In fact, this

[...]

the eighteenth century’s political ideal of the individual as the
repository of inalienable dignity, rights, and obligations; (2) the
early twentieth century’s individualized

[...]

human being called into existence by history, embarking on Machado’s
road because she must, destined to create “a life of one’s own” in a
world of ever-intensifying social complexity and receding traditions;
and (3) the late twentieth century’s psychologically autonomous
individual whose inner resources and capacity for moral judgment rise
to the challenges of self-authorship that history demands and act as a
bulwark against the predations of power. The self-authorship toward
which young people strive

[...]

post-political societal processes that bind the hive rely on social
comparison and social pressure for their durability and predictive
certainty, eliminating the need for trust

[...]

the closing lines of Jean-Paul Sartre’s existential drama No Exit, the
character Garcin arrives at his famous realization, “Hell is other
people.” This was not intended as a statement of misanthropy but rather
a recognition that the self-other balance can never be adequately
struck as long as the “others” are constantly “watching.” The
mid-century sociologist Erving Goffman took up these themes in his
immortal The Presentation of Self in Everyday Life. Goffman
developed the idea of the “backstage” as the region in which the self
retreats from the performative demands of social life.

[...]

work as in life, “control of the backstage” allows individuals “to
buffer themselves from the deterministic demands that surround them.”
Backstage, the language is one of reciprocity, familiarity, intimacy,
humor. It offers the seclusion in which one can surrender to the
“uncomposed” face in sleep, defecation, sex, “whistling, chewing,
nibbling, belching, and flatulence.” Perhaps most of all, it is an
opportunity for “regression,” in which we don’t have to be “nice”: “The
surest sign of backstage solidarity is to feel that it is safe to lapse
into an asociable mood of sullen, silent irritability.” In the absence
of such respite where a “real” self can incubate and grow, Sartre’s
idea of hell begins to make sense.62

[...]

Milgram identified three key themes in the subway experiment as he and
his students debriefed their experiences. The first was a new sense of
gravitas toward “the enormous inhibitory anxiety that ordinarily
prevents us from breaching social norms.” Second was that the reactions
of the “breacher” are not an expression of individual personality but
rather are “a compelled playing out of the logic of social relations

[...]

Embarrassment and the fear of violating apparently trivial norms often
lock us into intolerable predicaments.… These are not minor regulatory
forces in social life, but basic ones.” Finally, Milgram understood
that any confrontation of social norms crucially depends upon the
ability to escape. It was not an adolescent who boarded the subway that
day. Milgram was an erudite adult and an expert on human behavior,
especially the mechanisms entailed in obedience to authority, social
influence, and conformity. The subway was just an ordinary slice of
life, not a capital-intensive architecture of surveillance and behavior
modification, not a “personalized reward device.” Still, Milgram could
not fight off the anxiety of the situation. The only thing that made it
tolerable was the possibility of an exit. Unlike Milgram, we face an
intolerable situation

[...]

are meant to fuse with the system and play to extinction: not the
extinction of our funds but rather the extinction of our selves.
Extinction is a design feature formalized in the conditions of no exit.
The aim of the tuners is to contain us within “the power of immediate
circumstances” as we are compelled by the “logic of social relations”
in the hive to bow to social pressure exerted in calculated patterns
that exploit our natural empathy. Continuously tightening feedback
loops cut off the means of exit, creating impossible levels of anxiety
that further drive the loops toward confluence. What is to be killed
here is the inner impulse toward autonomy and the arduous, exciting
elaboration of the autonomous self as a source of moral judgment and
authority capable of asking for a subway seat or standing against rogue
power.

[...]

To exit means to enter the place where a self can be birthed and
nurtured. History has a name for that kind of place: sanctuary

[...]

We know that nothing guarantees safety and certainty in this world, but
we are comforted by the serenity of this home and its layered silences.
The days unfurl now

[...]

In the march of institutional interests intent on implementing Big
Other, the very first citadel to fall is the most ancient: the
principle of sanctuary. The sanctuary privilege has stood as an
antidote to power since the beginning of the human story. Even in
ancient societies where tyranny prevailed, the right of sanctuary stood
as a fail-safe. There was an exit from totalizing power, and that exit
was the entrance to a sanctuary in the form of a city, a community, or
a temple.4 By the time of the Greeks, sanctuaries were sacred sites
built across the ancient Greek world and consecrated to the purposes of
asylum and religious sacrifice. The Greek word asylon means
“unplunderable” and founds the notion

[...]

sanctuary as an inviolable space.5 The right of asylum survived into
the eighteenth century in many parts of Europe, attached to holy sites,
churches, and monasteries. The demise of the sanctuary privilege was
not a repudiation but rather a reflection of social evolution and the
firm establishment of the rule of law. One historian summarized this
transformation: “justice as sanctuary.”6 In the modern

[...]

empirical study makes the point. In “Psychological Functions of
Privacy,” Darhl Pedersen defines privacy as a “boundary control
process” that invokes the decision rights associated with “restricting
and seeking interaction

[...]

The same themes appear from the perspective of psychology. Those who
would eviscerate sanctuary are keen to take the offensive, putting us
off guard with the guilt-inducing question “What have you got to hide?”
But as we have seen, the crucial developmental challenges of the
self-other balance cannot be negotiated adequately without the sanctity
of “disconnected” time and space for the ripening of inward awareness
and the possibility of reflexivity: reflection on and by oneself. The
real psychological truth is this: If you’ve got nothing to hide, you
are nothing.

[...]

six categories of privacy behaviors: solitude, isolation

[...]

proper realm of inaccessibility or secrecy with respect to the world at
large as well as a recognition of the important social dimension of
such protected inner space.…”7

[...]

contemplation, autonomy, rejuvenation, confiding, freedom, creativity,
recovery, catharsis, and concealment

[...]

anonymity, reserve, intimacy with friends, and intimacy with family

[...]

billions of sensors filled with personal data fall outside of Fourth
Amendment protections, a large-scale surveillance network will exist
without constitutional limits

[...]

This theme is illustrated in the odyssey of Belgian mathematician and
data protection activist Paul-Olivier Dehaye, who in December 2016
initiated a request for his personal data collected through Facebook’s
Custom Audiences and tracking Pixel tools, which would reveal the web
pages where Facebook had tracked him. Dehaye probably knew more about
the rogue data operations of Cambridge Analytica than anyone in the
world, outside of its own staff and masterminds

[...]

of the right to contest “automatic decision making.” If the algorithms
are to be contestable in any meaningful way, it will require new
countervailing authority and power, including machine resources and
expertise to reach into the core disciplines of machine intelligence
and construct new approaches that are available for inspection, debate,
and combat. Indeed, one expert has already proposed the creation of a
government agency—an “FDA for algorithms

[...]

is already possible to see a new awakening to empowering collective
action, at least in the privacy domain. One example is None of Your
Business (NOYB), a nonprofit organization led by privacy activist Max
Schrems. After many years of legal contest, Schrems made history in
2015 when his challenge to Facebook’s data-collection and
data-retention practices—which he asserted were in violation of EU
privacy law—led the Court of Justice of the European Union to
invalidate the Safe Harbor agreement that governed data transfers

[...]

the absence of synthetic declarations that secure the road to a human
future, the intolerability of glass life turns us toward a societal
arms race of counter-declarations in which we search for and embrace
increasingly complex ways to hide in our own lives, seeking respite
from lawless machines and their masters. We do this to satisfy our
enduring need for sanctuary and as an act of resistance with which to
reject the instrumentarian disciplines of the hive, its “extended
chilling effects,” and Big Other’s relentless greed. In the context of
government surveillance, the practices of “hiding” have been called
“privacy protests” and are well-known for drawing the suspicion of
law-enforcement agencies.33 Now, hiding is also provoked by Big Other
and its market masters, whose reach is far and deep as they install
themselves in our walls, our bodies, and on our streets, claiming our
faces, our feelings, and our fears of exclusion. I have suggested

[...]

Equally poignant is the way in which a new generation of
activists, artists, and inventors feels itself called to create the art
and science of hiding.34 The intolerable conditions of glass life
compel these

[...]

Chicago artist Leo Selvaggio produces 3-D–printed resin prosthetic
masks to confound facial recognition. He calls his effort “an organized
artistic intervention.”35 Perhaps most poignant is the Backslash Tool
Kit: “a series of functional devices designed for protests and riots of
the future,

[...]

New Museum of Contemporary Art in Manhattan, and you pass a display of
its bestseller: table-top mirrors whose reflecting surface is covered
with the bright-orange message “Today’s Selfie Is Tomorrow’s Biometric
Profile.” This “Think Privacy Selfie Mirror” is a project of the young
Berlin-based artist Adam Harvey, whose work is aimed at the problem of
surveillance and

[...]

Trevor Paglen’s richly orchestrated performance art combines music,
photography, satellite imagery, and artificial intelligence to reveal
Big Other’s omnipresent knowing and doing. “It’s trying to look inside
the software that is running an AI… to look into the architectures of
different computer vision

[...]

greatest danger is that we come to feel at home in glass life or in the
prospect of hiding from it. Both alternatives rob us of the
life-sustaining inwardness

[...]

Glass life is intolerable, but so is fitting our faces with masks and
draping our bodies in digitally resistant fabrics to thwart the
ubiquitous lawless machines. Like every counter-declaration, hiding
risks becoming an adaptation when it should be a rallying point for
outrage. These conditions are unacceptable. Tunnels under this wall are
not enough. This wall must come down.

[...]

Surveillance capitalists are no different from other capitalists in
demanding freedom from any sort of constraint. They insist upon the
“freedom to” launch every novel practice while aggressively asserting
the necessity of their “freedom from” law and regulation. This classic
pattern reflects two bedrock assumptions about capitalism made by its
own theorists: The first is that markets are intrinsically unknowable.
The second is that the ignorance produced by this lack of knowledge
requires wide-ranging freedom of action for market actors

[...]

Adam Smith’s famous metaphor of the “invisible hand” drew on these
enduring realities of human life. Each individual, Smith reasoned,
employs his capital locally in pursuit of immediate comforts and
necessities. Each one attends to “his own security… his own gain… led
by an invisible hand to promote an end which was no part of his
intention.” That end is the efficient employ of capital in the broader
market: the wealth of nations. The individual actions that produce
efficient markets add up to a staggeringly complex pattern, a mystery
that no one person or entity could hope to know or understand, let
alone to direct

[...]

Adam Smith,” Hayek wrote, “was the first to perceive that we have
stumbled upon methods of ordering human economic cooperation that
exceed the limits of our knowledge and perception. His ‘invisible hand’
had perhaps better have been described as an invisible or unsurveyable
pattern

[...]

As with Planck, Meyer, and Skinner, both Hayek and Smith unequivocally
link freedom and ignorance. In Hayek’s framing, the mystery of the
market is that a great many people can behave effectively while
remaining ignorant of the whole. Individuals not only can choose
freely, but they must freely choose their own pursuits because there is
no alternative, no source of total knowledge or conscious control to
guide them. “Human design” is impossible, Hayek says, because the
relevant information flows are “beyond the span of the control of any
one mind

[...]

However, Big Other and the steady application of instrumentarian power
challenge the classic quid pro quo of freedom for ignorance. When it
comes to surveillance capitalist operations, the “market” is no longer
invisible

[...]

Hayek chose the market over democracy, arguing that the market system
enabled not only the division of labor but also “the coordinated
utilization of resources based on equally divided knowledge.” This
system, he argued, is the only one compatible with freedom. Perhaps
some other kind of civilization might have been devised, he reckoned,
“like the ‘state’ of the termite ants,” but it would not be compatible
with human freedom.4

[...]

More astonishing still is that surveillance capital derives from the
dispossession of human experience, operationalized in its unilateral
and pervasive programs of rendition: our lives are scraped and sold to
fund their freedom and our subjugation, their knowledge and our
ignorance about what they know

[...]

One conclusion of our investigations is that surveillance capitalism’s
command and control of the division of learning in society are the
signature feature that breaks with the old justifications of the
invisible hand and its entitlements. The combination of knowledge and
freedom works to accelerate the asymmetry of power between surveillance
capitalists and the societies in which they operate. This cycle will be
broken only when we acknowledge as citizens, as societies, and indeed
as a civilization that surveillance capitalists know too much to
qualify for freedom

[...]

The surveillance capitalists that operate at hyperscale or outsource to
hyperscale operations dramatically diminish any reliance on their
societies as sources of employees, and the few for whom they do
compete, as we have seen, are drawn from the most-rarefied strata of
data science

[...]

The absence of organic reciprocities with people as either consumers or
employees is a matter of exceptional importance in light
of the historical relationship between market capitalism and democracy

[...]

’s dependency on the “masses” and their contribution to the prosperity
necessitated by the new organization of production.23 The rise of
volume production and its wage-earning labor force established British
workers’ economic power and led to a growing appreciation of their
political legitimacy and power. This produced a new sense of
interdependence between ordinary people and elites. Acemoglu and
Robinson conclude that the “dynamic positive feedback” between
“inclusive economic institutions” (i.e., industrial firms defined by
employment reciprocities) and political institutions was critical to
Britain’s substantial and nonviolent democratic reforms. Inclusive
economic institutions, they argue, “level the playing field,”
especially when it comes to the fight for power, making it more
difficult for elites to “crush the masses” rather than accede to their
demands. Reciprocities in employment produced and sustained
reciprocities in politics

[...]

sharp contrast to the pragmatic concessions of Britain’s early
industrial capitalists, surveillance capitalists’ extreme structural
independence from people breeds exclusion rather than inclusion and
lays the foundation for the unique approach that we have called
“radical indifference

[...]

significant result of the systematic application of radical
indifference is that the public-facing “first text” is vulnerable to
corruption with content that would normally be perceived as repugnant:
lies, systematic disinformation, fraud, violence, hate speech, and so
on. As long as content contributes to “growth tactics,” Facebook
“wins.” This vulnerability can be an explosive problem on

[...]

guiding principles of radical indifference are reflected in the
operations of Facebook’s hidden low-wage labor force charged with
limiting the perversion of the first text. Nowhere is surveillance
capitalism’s outsized influence over the division of learning in
society more concretely displayed than in this outcast function of
“content moderation,” and nowhere is the nexus of economic imperatives
and

[...]

The larger point of the exercise is to find the point of equilibrium
between the ability to pull users and their surplus into

[...]

site and the risk of repelling them. This is a calculation of radical
indifference that has nothing to do with assessing the truthfulness of
content or respecting reciprocities with users.36 This tension helps to
explain why disinformation is not a priority. One investigative report
quotes a Facebook insider: “They absolutely have the tools to shut down
fake news.

[...]

radical indifference is a permanent invitation to the corruption of the
first text

[...]

It is obvious that the rogue forces of disinformation grasp this fact
more crisply than do Facebook’s or Google’s genuine users and customers
as those forces learn to exploit the blind eye of radical indifference
and escalate the perversion of learning in an open society

[...]

Surveillance capitalism’s antidemocratic and antiegalitarian juggernaut
is best described as a market-driven coup from above. It is not a coup
d’état in the classic sense but rather a coup de gens: an overthrow of
the people concealed as the technological Trojan horse that is Big
Other. On the strength of its annexation of human experience, this coup
achieves exclusive concentrations of knowledge and power that sustain
privileged influence over the division of learning in society: the
privatization of the central principle of social ordering in the
twenty-first century. Like the adelantados and their silent
incantations of the Requerimiento, surveillance capitalism operates in
the declarative form and imposes the social relations of a premodern
absolutist authority. It is a form of tyranny that feeds on people but
is not of the people. In a surreal paradox, this coup is celebrated as
“personalization,” although it defiles, ignores, overrides, and
displaces everything about you and me that is personal

[...]

“Tyranny” is not a word that I choose lightly. Like the instrumentarian
hive, tyranny is the obliteration of politics. It is founded on its own
strain of radical indifference in which every person, except the
tyrant, is understood as an organism among organisms in an equivalency
of Other-Ones. Hannah Arendt observed that tyranny is a perversion of
egalitarianism because it treats all others as equally insignificant:
“The tyrant rules in accordance with his own will and interest… the
ruler who rules one against all, and the ‘all’ he oppresses are all
equal, namely equally powerless.” Arendt notes that classical political
theory regarded the tyrant as “out of mankind altogether… a wolf in
human shape.…”55Surveillance capitalism rules by instrumentarian power
through its materialization in Big Other, which, like the ancient
tyrant, exists out of mankind while paradoxically assuming human shape

[...]

Polanyi’s lens, we see that surveillance capitalism annexes human
experience to the market dynamic so that it is reborn as behavior: the
fourth “fictional commodity.” Polanyi’s first three fictional
commodities—land, labor, and money—were subjected to law. Although
these laws have been imperfect, the institutions of labor law,
environmental law, and banking law are regulatory frameworks intended
to defend society (and nature, life, and exchange) from the worst
excesses of raw capitalism’s destructive power. Surveillance
capitalism’s expropriation of human experience has faced no such
impediments.

[...]

will be Facebook, he says, that will address problems that are
civilizational in scale and scope, building “the long-term
infrastructure to bring humanity together” and keeping people safe with
“artificial intelligence” that quickly understands “what is happening
across our community.”56 Like Pentland, Zuckerberg imagines machine
intelligence that can “identify risks that nobody would have flagged at
all, including terrorists planning attacks using private channels,
people bullying someone too afraid to report it themselves, and other
issues both local and global.”57 When asked about his responsibility to
shareholders, Zuckerberg told CNN, “That’s why it helps to have control
of the company

[...]

industrial civilization aimed to exert control over nature for the sake
of human betterment. Machines were our means of extending and
overcoming the limits of the animal body so that we could accomplish
this aim of domination. Only later did we begin to fathom the
consequences

[...]

Years later, in his moving 1966 essay “Education after Auschwitz,”
social theorist Theodor Adorno attributed the success of German fascism
to the ways in which the quest for effective life had become an
overwhelming burden for too many people: “One must accept that fascism
and the terror it caused are connected with the fact that the old
established authorities… decayed and were toppled, while the people
psychologically were not yet ready for self-determination. They proved
to be unequal to the freedom that fell into their laps.”67

[...]

We can now see that surveillance capitalism takes an even more
expansive turn toward domination than its neoliberal source code would
predict, claiming its right to freedom and knowledge, while setting its
sights on a collectivist vision that claims the totality of society.
Though still sounding like Hayek, and even Smith, its antidemocratic
collectivist ambitions reveal it as an insatiable child devouring its
aging fathers

[...]

The critical role of public opinion explains why even the most
destructive “ages” do not last forever. I echo here what Edison said a
century ago: that capitalism is “all wrong, out of gear.” The
instability of Edison’s day threatened every promise of industrial
civilization. It had to give way, he insisted, to a new synthesis that
reunited capitalism and its populations. Edison was prophetic.
Capitalism has survived the longue durée less because of any specific
capability and more because of its plasticity. It survives and thrives
by periodically renewing its roots in the social, finding new ways to
generate new wealth by meeting new needs. Its evolution has been marked
by a convergence of basic principles—private property, the profit
motive, and

[...]

It is not OK for every move, emotion, utterance, and desire to be
catalogued, manipulated, and then used to surreptitiously herd us
through the future tense for the sake of someone else’s profit. “These
things are brand-new,” I tell them. “They are unprecedented. You should
not take them for granted because they are not OK

[...]

Burnham’s cowardice is a cautionary tale. We are living in a moment
when surveillance capitalism and its instrumentarian power appear to be
invincible. Orwell’s courage demands that we refuse to cede the future
to illegitimate power. He asks us to break the spell of enthrallment,
helplessness, resignation, and numbing. We answer his call when we bend
ourselves toward friction, rejecting the smooth flows of coercive
confluence. Orwell’s courage sets us against the relentless tides of
dispossession that demean all human experience. Friction, courage, and
bearings are the resources we require to begin the shared work of
synthetic declarations that claim the digital future as a human place,
demand that digital capitalism operate as an inclusive force bound to
the people it must serve, and defend the division of learning in
society as a source of genuine democratic renewal