In this essay for Surveillance and Society, Mark Andrejevic distinguishes usefully between panoptic surveillance (as outlined by Foucault in Discipline and Punish, drawing on Jeremy Bentham’s ideas) and the surveillance rapidly being implemented by tech companies today. The aims of these two forms of surveillance are not merely different but antithetical: one induces internal obedience; the other aims to obliterate the idea of psychic interiority altogether.
As Andrejevic explains, Bentham’s panopticon — a central watchtower capable of observing all of a prison’s cells — “operated on the principle of parsimony: the least actual surveillance, the least punishment, and the fewest overseers could achieve the greatest impact.” It didn’t matter whether anyone was actually in the tower watching — the mere threat of being observed was meant to condition the behavior of those in the tower’s shadow. The logic is similar to putting a camera in a convenience store or an apartment building hallway: it might not even be connected to anything, but its presence alone raises the possibility of being caught. The finding (described in this Smithsonian article) that simply putting pictures of watching eyes on billboards may be sufficient to discourage crime takes this principle to its logical extreme: the feeling of being watched makes people conscious of their behavior, which induces them to exercise a certain control over it.
By contrast, today’s world of ubiquitous data collection, what Andrejevic calls “automated surveillance,” seeks to operate invisibly and universally — “the monitoring must be as comprehensive as possible.” The modes of tracking are not centralized in an imposing tower but embedded in devices and distributed across countless points of contact. The point of this blanket surveillance is not to force you to control yourself — the intention of a “disciplinary society,” in which individuals assume responsibility for adhering to the rules and norms instilled in them by various institutions and social practices — but to enclose your entire life within an environment where all your behavior can be captured, exploited, and remolded in real time. The point is to anticipate what you will do and to make sure someone is there to profit somehow from your doing it — what Shoshana Zuboff describes in her book about “surveillance capitalism.”
In Discipline and Punish, Foucault’s analysis moves from a literal panopticon to a sort of conceptual one, a condition of “unavoidable visibility” that he saw in terms of the “age of the infinite examination and of compulsory objectification.” He assumed that perpetual visibility meant being subject to a continual test of our inner resources to conform to expectations, and that this amounted to a kind of masochistic will to objecthood. So surveillance had to invest us with individuality — with will and agency — so we could relinquish it in conforming to the norms. It was the relinquishing itself that was the most important norm, the habit of obedience experienced as self-mastery. This is what assured the “automatic functioning of power.”
Foucault argued that discipline and punishment didn’t govern particular acts but “the individual as he may be described, judged, measured, compared with others, in his very individuality.” Being an “individual” went from being a privilege (reserved for the literate, in Foucault’s account) to an imposed means of control. Althusser made a similar claim in his “Ideological State Apparatuses” essay: “the individual is interpellated as a (free) subject so that he shall submit freely to the commandments of the Subject, i.e. in order that he shall (freely) accept his subjection, i.e. in order that he shall make the gestures and actions of his subjection ‘all by himself.’ There are no subjects except by and for their subjection.”
Power, in Foucault’s account, is productive: it makes people into subjects; it constitutes their individuality. Under automated surveillance as Andrejevic describes it, people are not induced into becoming subjects (albeit normalized ones) but are instead objects for manipulation, whose will doesn’t meaningfully enter into the mechanisms of control. Our experience of will is instead epiphenomenal, following from the kinds of choices prearranged for us.
So the idea of ideological interpellation that Althusser described — the famous “Hey, you there” yelled out by a cop that makes you into a discrete person of interest when you acknowledge it instinctively — is dispensed with. Even though surveillance and algorithmic systems account for us as individual users, they don’t hail us as individuals in Althusser’s sense and make us recognize ourselves as subjects; algorithms don’t have any expectations of our active conformity and can operate without our recognizing ourselves in how they call out to us. We can laugh at the bad automatic recommendations and the failed attempts to target us and the various misfires of AI. But these local failures are not indications that the system as a whole is failing. Rather, they legitimate what the system needs to function: the collection of more data in the name of better “accuracy.”
The point of automated surveillance isn’t prevention but anticipation. It merely wants more data, so that it can reshape the possibilities of action around us, making what we choose to do or how much we obey beside the point. As Andrejevic notes, automated surveillance intends to pre-empt individual behavior through a continual application of external force, directed by ever more invasive and comprehensive monitoring. Our internal mechanisms of self-control are not reshaped (or called into existence) by this kind of surveillance, because it presumes that individual agency has been completely overwritten. “Being watched does not result in the internalization of the monitoring gaze and its imperatives,” Andrejevic writes. All the choices will already be preapproved, and disobedience will be impossible. What goes on in our minds is irrelevant. Maybe, in a sense, that means we are free; we get to experience our prescribed choices as open and autonomous.
In one sense, automated surveillance appears post-ideological, because it doesn’t require control over how a subject imagines his relation to “the real conditions of his existence,” to draw from Althusser’s definition. But it also seems that we are invited to perceive personal autonomy in the algorithmic assistance that surveillance systems supply, in the convenience that they administer. That is, in Althusser’s terms, convenience is the “imaginary distortion” of our relationship to the reality (total surveillance) that allows us to believe we have some kind of autonomy, even in the absence of the friction that makes resistance possible. Convenience substitutes for any personal desires we might have conceived without anticipatory algorithmic intervention and in effect becomes the only thing left that it’s possible to desire. Inside the Skinner box there is nothing but convenience. Andrejevic argues that automated surveillance is “post-representational”: It doesn’t show us a camera or eyes and expect us to act differently. It simply acts directly on us according to the prerogatives of the system. “The automated system,” he writes, “triggers the ongoing application of force.”
Under these conditions we don’t have to be interpellated as individual subjects at all. We have a tracking number and an archive of behavioral data but little in the way of meaningful agency. Automated surveillance makes “individuals” unnecessary; what it requires are what Deleuze, in his “Control Society” essay, called “dividuals.” With the mesh of networks and surveillance technologies closing around us, we are rendered as discrete nodes in a total system of control that doesn’t rely at all on conditioning our will before the fact. This is why, when platforms like Facebook purport to give us more agency, it is essentially meaningless; the choices don’t have any impact on the premise of full enclosure, as Lizzie O’Shea suggests in this Baffler essay about Facebook’s ludicrous pivot to privacy: “Mark wants you to be able to easily exclude people from seeing your messages and stories—unless that person happens to be him. This vision of privacy doesn’t hold any power because it does not challenge the definitive power framework for users of social media.”
But privacy feels like an inadequate framework for this. “Privacy has necessarily become an expansive concept in the digital age, given the myriad ways in which technology occupies more of our personal spaces,” O’Shea notes, but even that seems to set up a distinction between “personal space” and surveilled space that no longer applies. O’Shea argues that “the right to privacy includes the right to exist outside of the market. It is the right to enjoy spaces without feeling as though your presence is being used by marketers to predict your future.” I agree totally, but I’m not sure many of us have any experience of being “outside of the market” — personal identity has been fully subsumed by consumerism for decades. The problem is that platforms offer us a life in which individuality, understood as a kind of resistance to the world around us, to the various social pressures and ascribed identities imposed on us, is superfluous — just friction. The problem isn’t that platforms are being invasive when they try to predict our future for marketing purposes. It’s more that they have erected a total environment in which any resistance is already captured. We aren’t asked to obey, so we can’t be defiant.
Andrejevic’s articulation of the control society as one of “automated surveillance” helps me understand a line from Deleuze’s essay that has always puzzled me: “The disciplinary man was a discontinuous producer of energy, but the man of control is undulatory, in orbit, in a continuous network. Everywhere surfing has already replaced the older sports.” Why surfing? He is not talking about surfing the internet; the essay predates the web. Presumably he means that a person is situated in a state of pure reaction to the uncontrollable force of the wave, rather than dictating the action in one way or another. The “ongoing application of force” in our control society is not a wave but the impact of a connected environment full of sensors, capable of reconfiguring itself around us and in anticipation of us. It does this to profit from us; automated surveillance is a mode of continual extraction. So we are not working in “discontinuous” spurts. Instead we are having productivity continually extracted from us through our efforts merely to keep our balance.