Capitalism Has Always Been “Rogue”

Companies like Google and Facebook make money not just by predicting our behavior, but by influencing our choices. It’s an intensification of the surveillance that has always been at the heart of capitalism, not a new economic system.


A long time ago, in what now feels like a past life, I spent several years working in London as a management consultant for consultancies that specialized in advising consumer-facing companies of various kinds — retailers, media firms, packaged-goods manufacturers, and the like. For someone who would go on to become a scholar interested in understanding the nuts and bolts of capitalism, it was an invaluable experience. I’m not a physical scientist, but, if I were, it would be as if I had done my laboratory trials before even applying to do my doctorate. The research is done. Now, what are my research questions, and who is going to fund me?

It was also in numerous respects a salutary experience. In immersing myself in my clients’ operations I was frequently taken aback by how unscientific the whole enterprise of business appeared to be — and how peculiar. Having grown up believing that our “captains of industry” knew what they were doing and that they did it in ways that were eminently rational, I was repeatedly shocked to find things otherwise, although with time that shock diminished. Managers often seemed to be fumbling around in the dark, guessing, more or less, at the likely outcomes of strategic choices. Entire industries functioned in ways that would have been essentially unimaginable had one sat down to design them from scratch.

Television advertising was — and is — one example of this seeming wrongheadedness. Imagine you are a major branded manufacturer, let’s say, Volvo, sitting down to negotiate an advertising deal with a major broadcaster, let’s call them Channel 4 (one of the UK’s leading advertiser-funded broadcasters).

The potential audience is segmented into different demographic groups based principally on gender, age (e.g., 16–34s), social grade (e.g., ABC1s) or some combination thereof. So far, I suppose, so sensible: You, Volvo, are more interested in your advertisements being seen by some audiences than others because they will have varying degrees of propensity to purchase your vehicles. Their “eyeballs” (the term used) have different value. The odd thing, though — or at least I always thought it was odd — is that only some eyeballs actually get traded in advertising deals. When I consulted in this sector, Friends was a 16–34 sale and The West Wing was an ABC1 sale. All other eyeballs were considered “waste.”
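
Here, roughly, is the arithmetic of such a deal, sketched with numbers that are entirely invented (they are neither Channel 4’s nor Volvo’s): the spot is priced only against the demographic being traded, and everyone else who sees it is counted as waste.

```python
# Illustrative only: the figures below are invented, not real Channel 4 or Volvo numbers.

total_viewers = 1_200_000   # everyone watching the slot (hypothetical)
viewers_16_34 = 300_000     # the demographic actually being traded
price_paid = 24_000.0       # what the advertiser pays for the spot (hypothetical, GBP)

# The deal is priced only against the traded demographic (a "16-34 sale").
cost_per_thousand_traded = price_paid / (viewers_16_34 / 1000)

# Everyone outside that demographic sees the ad too, but counts as "waste".
wasted_eyeballs = total_viewers - viewers_16_34

print(f"CPT on traded 16-34 eyeballs: £{cost_per_thousand_traded:.2f}")
print(f"Eyeballs reached but not paid for ('waste'): {wasted_eyeballs:,}")
```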

Neither Channel 4 nor Volvo has any idea how many of the individuals who see Volvo’s advertisements on Channel 4 go on to actually buy a Volvo vehicle; more than that, neither is sure how many individuals in different audience groups actually see the ads in the first place.

It is true that all countries with advertiser-funded television broadcasting have audience measurement services designed to track what people are watching. Without such services and the metrics they furnish, advertising deals as they are currently negotiated would be impossible. The UK service, for example, which is run by the Broadcasters’ Audience Research Board (BARB), relies on a “representative” panel of some 5,300 homes. But few people think the reported viewing figures accurately represent viewing realities. Such skepticism is justified: when the panel samples are changed, reported viewing patterns often swing wildly. When BARB updated its panel in 2002, ITV1, the country’s leading advertiser-funded TV channel, immediately saw its overall reported audience plummet by 25 percent. Had a quarter of its audience really suddenly abandoned it? Of course not.
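
To see why the reported figures are so fragile, consider a rough sketch of how a panel number gets scaled up to a national audience. The population size, panel counts, and arithmetic below are assumptions for illustration, not BARB’s actual methodology; the point is simply that a change in who happens to sit on the panel can move the reported audience by a quarter without a single real viewer switching off.

```python
# Hypothetical illustration of panel-based audience measurement (not BARB's actual method).

tv_population = 27_000_000   # assumed number of TV households nationally

def reported_audience(panel_homes_watching: int, panel_size: int) -> float:
    """Scale the share of panel homes watching a channel up to the whole population."""
    share = panel_homes_watching / panel_size
    return share * tv_population

# Old panel: 5,300 homes, of which 1,060 are tuned to the channel in a given slot.
old = reported_audience(panel_homes_watching=1_060, panel_size=5_300)

# New panel, same size, but its composition happens to skew differently: 795 homes watching.
new = reported_audience(panel_homes_watching=795, panel_size=5_300)

print(f"Old panel implies {old:,.0f} households watching")
print(f"New panel implies {new:,.0f} households watching")
print(f"Reported audience change: {(new - old) / old:.0%}")   # -25%, with no change in real viewing
```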

One of the other sectors to which I enjoyed a degree of exposure was retail, especially UK grocery retail. The sector was going through changes that enabled the likes of Tesco and Sainsbury — if you are reading this in the United States, think Wal-Mart on a smaller scale — to know considerably more about their shoppers’ purchasing habits than had previously been possible. Loyalty cards were the key breakthrough: suddenly, at least for the growing proportion of its customers who used such cards, Tesco knew exactly who had bought what, at what price, where, and when.

Nevertheless, a considerable degree of fumbling still seemed to go on. Concerned to know, for instance, how different shop-floor formats or placements of items — a branded soup here, an own-label pasta sauce there — shaped purchasing behaviors, retailers, guided by their all-seeing consultants, would run small-scale experiments. Formats and placements in individual shops would be altered, and changes in local purchasing patterns then monitored, on the assumption that the latter resulted directly from the former.

On the basis of such experiments, the consultants would model the predicted impacts of nationwide reformatting and re-placement on a retailer’s overall costs, revenues, and profits. Their clients, paying handsomely for the insights, would more often than not act accordingly. Aisles and shelves were reorganized and restocked at scale; having once sat next to the sweet corn, olives now found themselves cheek-by-jowl with gherkins.
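
The logic of those projections can be caricatured in a few lines. Everything below is invented (store counts, sales figures, margins), but it captures the leap involved: a lift observed in a handful of test stores is attributed entirely to the re-placement and then multiplied across the national estate.

```python
# Hypothetical sketch of the consultants' extrapolation logic; all figures are invented.

# Weekly sales of the re-placed item in a handful of test stores, before and after the change.
test_before = [420.0, 515.0, 380.0]   # GBP per store per week
test_after  = [460.0, 540.0, 415.0]

# Assume the whole change is caused by the re-placement (the leap of faith in the text).
lift = sum(test_after) / sum(test_before) - 1.0

# Extrapolate to the national estate, with an assumed gross margin and reformat cost.
national_stores = 600
avg_weekly_sales = 450.0
gross_margin = 0.30
reformat_cost = 1_500_000.0   # one-off cost of reorganising every store (assumed)

extra_annual_profit = national_stores * avg_weekly_sales * 52 * lift * gross_margin

print(f"Observed lift in test stores: {lift:.1%}")
print(f"Projected extra annual profit: £{extra_annual_profit:,.0f}")
print(f"Pays back the reformat in {reformat_cost / extra_annual_profit:.1f} years")
```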


I was reminded of these experiences and the befuddlement they frequently inspired when reading Shoshana Zuboff’s new book, The Age of Surveillance Capitalism. Zuboff’s thesis is simply stated: capitalism is today increasingly surveillance-based, driven by companies who make money not only by knowing our behaviors, but also by attempting to influence or configure those behaviors in ways that maximize money-making opportunities.

Surveillance capitalism, Zuboff claims, was born with the internet and with the proliferation of digital and networked technologies more generally. Facebook, Microsoft, and, more than any other company, Google are, on her telling, the surveillance capitalists par excellence. (She is more ambivalent about Amazon and Apple. The former is said to be “migrating toward” surveillance capitalism; meanwhile, “time will tell” if Apple “succumbs.”)

But if surveillance capitalism was originally a creature of the virtual world, it has long since escaped those spatial shackles. As the internet itself has increasingly come to permeate the “real” world of physical objects linked together by virtue of the data-enabled sensors and software embedded within them (the so-called “Internet of Things”), so too has the surveillance capitalism that is its calling card.

At the heart of Zuboff’s account is the idea of prediction. Prediction is surveillance capitalism’s modus operandi, if you like. If, as Zuboff says, companies are better able to predict what we will do in a particular context, they can better tailor commercial offerings to accommodate and profit from those predicted behaviors. And what surer method of accurate prediction than rigging the game through intervention in the behaviors in question?

“The most predictive source of all,” Zuboff writes in one of the best lines in the book, “is behavior that has already been modified to orient toward guaranteed outcomes.” At its extreme, prediction — probability — becomes certainty: hence Amazon’s long-mooted idea that based on our viewing and purchasing habits it can ship us books it knows we will like before we ourselves realize it. Zuboff ultimately envisions a techno-commercial dystopia of “machines that modify the behavior of individuals, groups, and populations in the service of market objectives.”

There are certainly some false steps along the way. Like many commentators on Facebook, Google, et al., Zuboff overstates what is actually happening with the data that those companies’ surveillance of us occasions. Predictive data, she tells readers, are being traded in “a new kind of marketplace for behavioral predictions.” Generally speaking, however, this is not true. Neither Facebook nor Google sells its user data; whatever else one thinks of them, on this count they are not guilty. Data is not the new oil. What Facebook and Google principally sell to advertisers, like Channel 4 and other television broadcasters, is eyeballs, or attention. User data has value insofar as it enables Facebook and Google (or rather the algorithms they use to read that data) to predict how attentive different users will be to different stimuli.

Still, there is much that Zuboff gets right. She offers a penetrating and persuasive account of how groups like Facebook and Google mine online behavior so as to better match advertisements to users’ interests, and indeed of just how much data such companies possess. (I vividly recall a conversation I had with a Stanford MBA graduate who founded and went on to run one of America’s biggest real estate search sites. When I mentioned that I did research on housing markets and asked him what type of relevant data he had, his answer was, “Basically, everything.” Given customer privacy and company confidentiality concerns, he wasn’t sure he could give me all the data I might be looking for; but he was totally sure that he had it.)

If users’ behavior can be accurately predicted and they are shown the “right” ads, then they click through more than they otherwise might and Facebook or Google or whoever it happens to be earns more money (“click-through rates” having become the standard pricing metric). Given this business model, the incentive to orient the user toward clicking through — through a range of techniques that Zuboff refers to collectively as “behavioral modification,” or what behavioral economists have termed “nudging” — is clear enough.
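
The revenue logic is simple enough to sketch. What follows is a generic illustration of why better click-through prediction translates into more money per impression; it is not a description of any platform’s actual ad auction, and the advertiser names, bids, and predicted rates are all invented.

```python
# Generic sketch of why predicted click-through matters to an ad platform's revenue.
# Not Google's or Facebook's actual auction logic; names and numbers are invented.

from dataclasses import dataclass

@dataclass
class Ad:
    advertiser: str
    bid_per_click: float    # what the advertiser pays if the user clicks
    predicted_ctr: float    # the platform's estimate that this user will click

    @property
    def expected_revenue(self) -> float:
        # The platform only earns when the click happens, so it cares about bid x probability.
        return self.bid_per_click * self.predicted_ctr

candidates = [
    Ad("car_brand", bid_per_click=2.50, predicted_ctr=0.012),
    Ad("shoe_shop", bid_per_click=0.80, predicted_ctr=0.060),
    Ad("insurer",   bid_per_click=4.00, predicted_ctr=0.004),
]

# The better the CTR prediction, the more money the same inventory of attention yields.
winner = max(candidates, key=lambda ad: ad.expected_revenue)
print(f"Show '{winner.advertiser}': expected revenue {winner.expected_revenue:.3f} per impression")
```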

Here, Zuboff discusses the various experiments undertaken at Facebook and Google, designed to prime users toward desirable online behaviors, for example by testing the reactions of millions of users to hundreds of variations in page characteristics, “from layout to buttons to fonts.” “Because the ‘system’ is intended to produce predictions,” Zuboff writes, citing a Google economist, “‘continuously improving the system’ means closing the gap between prediction and observation in order to approximate certainty.”
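
In code, “closing the gap between prediction and observation” can be caricatured as nothing more than continuously re-estimating click rates for each page variant as results stream in. The sketch below is purely illustrative and assumes nothing about how Facebook or Google actually run such experiments.

```python
# Minimal sketch of variant testing: keep a running click-rate estimate per page variant
# and update it as observations arrive. Purely illustrative, with invented variant names.

from collections import defaultdict

class VariantStats:
    def __init__(self) -> None:
        self.impressions = 0
        self.clicks = 0

    def record(self, clicked: bool) -> None:
        self.impressions += 1
        self.clicks += int(clicked)

    @property
    def observed_ctr(self) -> float:
        return self.clicks / self.impressions if self.impressions else 0.0

stats: dict[str, VariantStats] = defaultdict(VariantStats)

# Each observation: which layout/button/font variant was shown, and whether the user clicked.
observations = [("blue_button", True), ("blue_button", False),
                ("red_button", False), ("red_button", False),
                ("blue_button", True)]

for variant, clicked in observations:
    stats[variant].record(clicked)

for variant, s in stats.items():
    print(f"{variant}: observed CTR {s.observed_ctr:.2f} over {s.impressions} impressions")
```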


What seems much less clear is Zuboff’s contention that, in view of these developments in the collation and use of user data and in the attempts to nudge consumers toward behaving in ways that suit capitalists, surveillance capitalism amounts to a decisively new economic system. She terms this “rogue” capitalism, presumably to indicate that, as well as being fundamentally different from what has come before, surveillance capitalism is different in a bad way. It is abnormal, undisciplined, dangerous.

I can understand why Zuboff makes this claim, and yet my mind keeps going back to Tesco and Sainsbury and to Channel 4 and Volvo. What are grocery-retailer loyalty schemes and television audience measurement services if not, at least in part, systems of user surveillance? In both cases a good part of the rationale for the arrangement is to better cognize how users are behaving and to do so precisely in order to enable improved — that is, more profitable — service delivery.

Is this so different from the systems of online surveillance described by Zuboff? I am not sure that it is. Surely the thing that most meaningfully distinguishes Google’s monitoring of what its users do online from, say, BARB’s monitoring of what television viewers do in their living rooms is the quality, intensity, and accuracy of the former. As a system of surveillance, BARB, baldly stated, just isn’t very good.

This strikes me as a more useful way of thinking about what is new about Zuboff’s surveillance capitalism. From the perspective of an advertiser, Google is Channel 4, only (in most ways) far superior as a vehicle for relaying its message. The particular attractiveness of advertising via free-to-air mass-market broadcasters such as Channel 4 has always been that few other media allow brand owners simultaneously to reach such a large audience. Facebook and Google offer the same scale — actually, substantially greater scale — and two other enormously valuable capabilities. One is the ability to see how effective one’s ads are: if a user clicks through, the advertiser (and Google) knows. The other, perhaps more valuable still, is the ability to display advertisements only to those parts of the audience that are likely to be responsive to them.

The analogy would be Channel 4 broadcasting different advertisements into each home based on comprehensive information about who lives there, what type of person they are, and how they typically respond to different commercial triggers. Hence, what’s new about surveillance capitalism surely is not that there are now ‘rogue’ capitalists mischievous enough to do such things. What’s new is that the capabilities to do so now exist. Would Channel 4 do what Facebook and Google do if it could? You are at best naïve if you think it wouldn’t. It “wastes” millions of eyeballs not because it chooses to but because the technical constraints associated with broadcast television give it no choice.

Similarly, when one reads Zuboff’s assertion that surveillance capitalists’ efforts at behavioral modification are aimed “to produce behavior that reliably, definitively, and certainly leads to desired commercial results,” it is hard to avoid the thought that such aims have always been integral to capitalism in general and advertising in particular. (The only other credible description I have heard of advertising’s core function was one provided by a former senior consulting colleague, who said that the function of advertising is to make you unhappy.) What, after all, were Tesco’s aisle and shelf rearrangements, if not exercises in attempted behavior modification, albeit crude and — with the benefit of hindsight — almost quaint ones?

In examining the large-scale user reaction-testing experiments run in recent years by Facebook and Google, which, as she says, are enabled by the fact that in the digital world such experimentation can be entirely automated, Zuboff notably points out that in the analogue world experiments of this kind would be far too expensive to be practical. And this, I think, is to approach the nub of the matter. Capitalism hasn’t gone rogue. Rather, it can now do that which has always been desirable but which has until now been frustrated.


My point here, then, is not the one that other critics of Zuboff have made, which is that her characterization of surveillance capitalism as ‘rogue’ legitimates all other forms of capitalism — although that is definitely true. Nor is my point that there is nothing new about surveillance capitalism as Zuboff describes it. Clearly there is. The point is that surveillance capitalism is not a radical departure. It represents the realization or intensification of tendencies that have always been part of capitalism’s DNA but which, for a variety of reasons associated primarily with the technical infrastructures of product and service delivery, have not until now been realizable.

There is, in the first chapter of the first part of Zuboff’s book, a revealing passage that I think goes a long way to explaining why Zuboff misreads intensification as transformation. Asking how it came to pass that surveillance capitalism has so rapidly advanced to the point of potentially becoming dominant, she writes: “over the centuries we have imagined threat in the form of state power. This left us wholly unprepared to defend ourselves from new companies with imaginative names run by young geniuses that seemed able to provide us with exactly what we yearn for at little or no cost.”

The narrowness of historical perspective in this sentence really is nothing short of remarkable. It is not so much that capitalists of various ilks have been providing “us” with things at little or no cost, getting others to pay them to do so, for as long as capitalism has existed. Analogue, advertiser-funded mass media — seemingly invisible to Zuboff — is only one example. Bank current (“checking”) accounts, usually free to use and funded by the spread between borrowing and lending rates, are another.

It is more Zuboff’s perception of where power resides, and thus of where threats associated with the use or abuse of power might emanate from. Hers is a world in which, prior to the internet age, power resided exclusively with the state. Markets, one imagines, were “free” and transparent, and exploitation existed only in the relationship between state and citizenry, not in the relationship between company and worker, still less in the relationship between company and consumer. It is, in short, a liberal ideal-type world that, to all intents and purposes, has never existed.

Equipped with such rose-tinted spectacles, Zuboff casts a wary eye over the new breed of surveillance capitalists. If you mistake what has always been for something it is not, then it is no wonder you see as new something that has long been emergent but perhaps only now is coming to full fruition.