Proactive Paranoia

The dark web’s embrace of state-endorsed “operations security” tactics previews what could become standard procedure for us all

Until this past July 4, Alphabay had been the largest market for drugs and stolen information on the “dark web,” the part of the internet accessible only through special routing software, such as Tor, I2P, or Freenet. But on July 20, U.S. Attorney General Jeff Sessions finally announced what dark web market participants had suspected for weeks: Alphabay had been seized by law enforcement agents. The marketplace had already been offline for weeks, and users had been wondering, in Reddit subs and Tor-based forums such as the Hub, whether it was down for maintenance or something more ominous. After Sessions’s announcement, panic replaced nervousness: How did law enforcement find Alphabay? And am I next?

One Redditor, digging through the civil forfeiture documents provided by the Department of Justice, proposed an explanation for what went wrong; it was all due to a “simple mistake.” The Redditor pointed out that Alphabay’s alleged founder, Alexandre Cazes, had been caught because his personal email had been sent to new users via the header of the Alphabay registration welcome email. “Just think about it,” the Redditor wrote, “He made one mistake, and got fucked.”

Central to OPSEC is a radical distrust of everyone one associates with. You can’t be paranoid in hindsight

On Alphabay, Cazes was Alpha02, the administrator of a three-year-old market for drugs, stolen credit cards, and weapons. His Hotmail account, by contrast, belonged to a Canadian expat living in Thailand, driving expensive cars and posting about his sexual exploits on pickup artist Roosh V’s forum. Cazes’s “one mistake,” which law enforcement agents exploited, was failing to keep these identities apart. The Redditor called this a “simple OPSEC mistake,” drawing on military jargon for “operations security,” and other Redditors and dark web forum participants also reminded one another: “Let’s keep our OPSEC on point here people.”

This is far from the first time dark web market participants have invoked “OPSEC.” The term is a keyword for dark web markets, regularly appearing in thousands of forum posts, particularly in the wake of the last great market bust — of Silk Road in late 2013. It has effectively become the organizing principle of dark web markets. In a YouTube video of a 2012 presentation from a Malaysian security convention, widely shared among dark web market participants, the information-security researcher and software-exploit broker known as the Grugq lays out techniques for hackers to avoid drawing the attention of state agencies. “What the fuck is OPSEC?” he asks. “OPSEC, in a nutshell, is keep your mouth shut. Don’t say it. The less you say, the harder it is for people to figure out what you’re doing … In short, shut the fuck up.”

According to the Grugq, OPSEC is not a matter of technology but of mentalities, practices, and relationships. Central to it is a radical distrust of everyone one associates with. “This particularly goes for people you are operating with,” he says. “They are not your friends; they are criminal co-defendants … there is a high likelihood that they [will get busted] because they are dumb, because they are doing what they are doing.” He intones that “it hurts to get fucked,” meaning that it hurts to go to jail. And because of this pain, “No one is going to go to jail for you … Your friends will betray you.” The Grugq argues, above all, that one needs to be “proactively paranoid,” because you can’t be paranoid in hindsight.

Notably, the Grugq’s presentation includes multiple favorable references to the agencies of state power, including the military, which is not surprising given operations security’s provenance: the Vietnam War, according to a heavily redacted, formerly top-secret U.S. National Security Agency research report.

In 1965 and 1966, U.S. bombing raids were inflicting few casualties and doing little damage to Viet Cong or North Vietnamese Army equipment, because the enemy was receiving advance warning of the attacks. The U.S. military formed a research team, Purple Dragon, to discover the source of these early warnings. After discounting theories that the North Vietnamese forces had broken American encryption, the team focused on the mundane ways an enemy could gather information: monitoring Voice of America or BBC broadcasts, listening to tactical radio broadcasts and paying attention to military call signs, or reading nonclassified documentation such as requests for food for specific regions. None of these on its own provided specific information about impending attacks, but together they yielded many small details that could be pieced into a high-probability prediction about upcoming targets.

By training commanders and soldiers to avoid talking about seemingly insignificant details via “open source” (i.e., non-encrypted or classified) channels, the Purple Dragon team was able to reduce forewarning of bombing attacks from eight hours in 1966 to under 30 minutes in 1968. In other words, operations security increased the body counts.

Based on the success of Purple Dragon, operations security was championed by the National Security Agency, which set up a training course in the practices in the 1980s that thousands of government workers would eventually take. Toward the end of the decade, President Ronald Reagan signed an executive order instructing all U.S. government agencies and their contractors to provide operations-security training for their employees. As it spread from the military to government contractors and beyond, “OPSEC” became a keyword for corporate organizations seeking to defend their intellectual property. From there, it moved into information security circles more broadly — culminating in hackercon presentations such as the Grugq’s.

By engaging in OPSEC politics, dark web market participants reinforce the idea that communication is a Manichean battle of states and subjects

The phrase from OPSEC training that seems to resonate most among dark web market participants is the Grugq’s call for “proactive paranoia.” This is understandable, given the law enforcement raids, fear of surveillance, and scams that permeate these markets. Everything in dark web markets is subjected to intense OPSEC-driven skepticism, from the infrastructural level (Is Tor compromised? Is this market’s web software secure?) to the administrative (Are the people running this market law enforcement agents?) to fellow market participants (Is this vendor going to steal my Bitcoins? Am I selling to a cop?) to the larger flows of information (Is this bank routing information secure, and if not, can I take advantage of it?). OPSEC’s proactive paranoia provides heuristics for market participants as they relate to the state, to their market colleagues, and to the broader information society.

Above all, OPSEC as politics has the goal of structuring an emergent social order through radical self-regulation and individualism. On dark web market forums, new members are advised to self-regulate by studying operations-security guides, which are now a regular feature of dark web markets and forums. New users are instructed to read the guides, practice the skills, and defend themselves. They also are taught that, in the end, they are solely responsible for their own safety and security. As one participant put it at the Hub, “Big things are coming … are you prepared to Learn how to protect yourselves? Don’t end up like fuckwad Vendors that do not take their Safety and their clients Safety seriously.” And as the Alphabay FAQs warned users, “We take no responsibility if you get caught, so protecting yourself is your responsibility.”

The proactive paranoia of OPSEC politics on markets prompts distrust of others. While the Grugq argues that law enforcement agencies are the apex predators of the internet, dark web market participants also have one another to fear. The history of dark web markets is littered with scams. Market forums are full of posts by vendors complaining about buyers, buyers about vendors, and everyone about market admins. The goal of OPSEC, then, is to avoid being put into a position of vulnerability, not only to the state but also to fellow market participants. Markets are, after all, fine places for people to exploit one another.

Similarly, OPSEC politics orients dark web market participants to the wider world of unsecured information. Besides drugs, Alphabay was best known as a market for credit card fraud, Paypal and Amazon scams, and the theft and sale of “fullz” (full identifying information for a person, including Social Security number, address, and date of birth). These activities reveal the flip side of dark web OPSEC. Markets for poorly secured information are a direct result of the increasing pressure to move more and more of our personal information into digital databases, from our social connections to our shopping habits to our state-sanctioned legal documentation. In other words, while OPSEC politics is geared toward the individual’s paranoid ability to self-regulate with respect to the revelation of personal information, it also teaches dark web market members to watch for opportunities to exploit others who don’t “shut the fuck up.” While the rest of us reveal ourselves via social media, health or dating apps, or commercial exchanges, the paranoid exfiltrate our data.


While “proactive paranoia” sounds like a pathological condition reserved for users of hidden web sites, OPSEC politics functions as a means to structure online social relations and from there build a social order. After all, despite scams, exploitation, and arrests, dark web markets continue to thrive. This is where dark web market OPSEC politics become instructive for those of us who never visit the dark web. Proactive paranoia was forged in the toxic cultural contexts of increasingly militarized states, ubiquitous surveillance, global neoliberalism, and an all-out hustle for online money. In other words, OPSEC politics need not be limited to the dark web. In a world of arrests, scams, fines and fees, constant monitoring, and extreme caveat emptor, suspicion and paranoia are rational responses — not just in dark web markets, but in our daily lives.

Reasons to be paranoid seem to be endless. The use of police to harass citizens — particularly people of color — to gather revenue through citations and court fees has increased in the past few decades as tax-averse governments use policing to try to fill coffers. Dark web markets are explicitly marked by scams and fraud, but then again, scams and fraud abound in our daily lives as well: Wells Fargo, for instance, was recently found to have repeatedly charged customers for accounts opened without their knowledge or consent. Data breaches and personal-information exfiltration are increasingly common (and, incidentally, these data often show up for sale on the dark web). Legal recourse for violations such as these is being undermined by forced arbitration agreements, making disputes a market commodity. Whether dealing with states or markets — the powerful institutions of our contemporary age — each of us is on our own.

It seems then that OPSEC politics has application beyond the dark web markets: Perhaps we all could use dark-web-style proactive paranoia and a radical lack of trust. In light of the cruelties of life under neoliberalism, we may feel the need to “shut the fuck up,” to never trust anyone, to suspect every institution in our lives. We may welcome the growing market for privacy technologies, laud those who avoid paying taxes to the state, or move our transactions into cryptocurrencies. We may see the state as the adversary and use our self-interest as our only compass. The dark web, it seems, has something to teach us.

But there are dangers lurking in OPSEC politics. First, as an appropriation of state practices, drawn from the language of the U.S. military and National Security Agency, dark-web-style operations security can be, in turn, re-appropriated by the state. By engaging in OPSEC politics, dark web market participants reinforce the idea that communication is a Manichean battle of states and subjects — that communication and information can be “weaponized” and thus should be subject to state regulation and policing. Like other social practices linked to war metaphors, dark web markets’ appropriation of operations security will further fuel the expansion of military and police surveillance of and action in spaces of communication, continuing to make communication itself a theater of war — the purportedly legitimate sphere of state control. This isn’t limited to the dark web but is directed at all forms of digital communications, as shown by the search warrant filed by the Justice Department (which it has since filed to amend) for information on visitors to an anti-Trump website.

Even in the face of the cruelty of contemporary neoliberal life, where making a mistake may mean “getting fucked,” taking up OPSEC politics in our own lives risks strengthening the now commonsensical idea that everyone is out to get everyone. This comes at the expense of other potential social formations. A radical lack of trust may make sense in a market-driven, hypercompetitive world where every institution is out to take advantage of us — and where, conversely, those who can exploit others are lauded as winners. But collective organizing would continue to give way to individual grievance and self-defense. This is the real lesson of the seizure of Alphabay and its OPSEC failure: Even if OPSEC offers a prescription for self-defense, the adversaries it takes on are too great for any of us alone.

Robert W. Gehl is an associate professor of communication at the University of Utah. This essay is based on a chapter from his forthcoming book, Weaving Dark Webs: Violence, Propriety, Authenticity (MIT Press).