No Alternative

How culture jamming was culture-jammed

Though I barely qualify as a millennial — I was born in 1982, a year after the Pew Research Center sets the cutoff, and right on the cusp according to the U.S. Census Bureau — I have to confess that I have long savored my classification. I enjoy being the subject of thinkpiece after thinkpiece about phones, debt, political leanings; part of the generation destined to destroy everything from napkins to Applebee’s. But according to this Huffington Post article, which went viral last summer, I am not a millennial after all but a “xennial” — part of a microgeneration between Gen X and millennials that combines their governing attributes: Gen X “cynicism” and millennial “striving.”

Defining generations has always struck me as akin to astrology — a potential conversation starter that shouldn’t be taken too seriously. But one “xennial” periodization struck me as too apt to ignore: analog childhood, digital adulthood. My adolescence coincided with the adolescence of the web, between its infancy as a preserve of academics and geeks and its mature ubiquity as social media (or now, “platforms”). Both unfolded in a decade that itself seems liminal, hard to pin down: the 1990s, wedged between the conquest of neoliberalism and the onslaught of neoconservatism — for some, a time of flannels and the other trappings of “alternative” culture, but for me the time of the internet. As major media corporations quickly determined the most effective methods for packaging the latest dissident music culture, I discovered an impossibly labyrinthine digital alternative, full of parody, pranks, and politics that MTV and the Gap seemingly could never conquer. And though digital culture has never yielded an unconditional surrender to commerce and conformity, something else has happened: The internet’s ways became the foundation for all forms of expression, from “culture jam” to culture, period.

The brief times I got to experience something different, like the art theater or the indie record store, felt like unchecked rebellion, even if that rebellion was just another act of consumption

The internet was important to me then because I experienced it as an escape from provincial Ohio — as an extension of the books, movies, and music that already served this purpose. But media was also my home. My suburban ennui was inextricable from the full-spectrum dominance of mass commercial culture; the television was always on. Though I watched quite a lot of it, I understood that television was my enemy. So the brief times I got to experience something different — the occasional trip downtown to the art theater or the indie record store — felt like unchecked rebellion, even if that rebellion was just another act of consumption.

Television had been under suspicion since it infiltrated American homes in the 1940s and ’50s, but the ’90s were a great time to hate it. Bolstered by deregulation and the expansion of cable, it had never seemed more powerful or more deeply loathed. A mere decade after TV journalists had been credited with turning public opinion against the Vietnam War, television news had become a problem, not for what it reported but by sheer virtue of its form, which was seen as inherently open to manipulation and hostile to substantive facts. As political journalist William Greider wrote in 1984, “whatever the newspapers printed about Reagan, no matter how harsh, became almost irrelevant; only the pictures really mattered.” While Noam Chomsky meticulously detailed the way capitalist media companies and the state could construct a propaganda apparatus within a democracy, brasher voices blamed television for unleashing America’s basest elements: “Ronald Reagan is merely an anthology of the worst of American popular culture, edited for television,” wrote media critic Mark Crispin Miller. “Reagan won because he was the perfect television spectacle.”

All the concerns of academic postmodernism — the depthlessness of image-based culture, the end of grand meta-narratives in favor of the fragmentary, the loss of historical context and the rise of self-referentiality and intertextuality — could be detected in the distraught reaction to a decade of Reagan’s rule. Guy Debord’s claim, in the 1967 book The Society of the Spectacle, that “everything that was directly lived has moved away into a representation,” seemed to ring ever truer, and Jean Baudrillard, who became a touchstone for cultural criticism then, took this further, diagnosing the Western world as a “hyperreality” where representation had, in folding in on itself, mushroomed into an expanded universe that referred, ultimately and unverifiably, only to itself.

If we are trapped in a world of representations of representations, then what is to be done? In 1993, the year before I got online, writer Mark Dery penned a manifesto of sorts that addressed this question: “Culture Jamming: Hacking, Slashing, and Sniping in the Empire of Signs.” Dery reconceived the totalizing mediascape as a kind of battleground and, following semiotician Umberto Eco, theorized a new kind of political actor: “communications guerrillas, who would restore a critical dimension to passive reception” by making parodic versions of pop culture, corporate logos, and advertising slogans. Rather than completely sabotage the signal — think Roddy Piper blowing up the aliens’ mind-control device in John Carpenter’s 1988 film They Live — culture jammers intervened in the flow of media, détourning it as Debord’s Situationists had advocated, to inculcate viewers with a critical perspective on media.

Dery’s manifesto was not simply a call to arms, but a diagnostic that tried to knit together an assortment of contemporary aesthetic practices into an oppositional movement. Those practices, however, had already begun to crystallize under a different banner, the flimsy and seemingly self-negating term “alternative.”


As nationwide circuits for punk re-emerged, with Sonic Youth and then Nirvana as flagships, the concept of alternative was more and more frequently invoked but remained notoriously difficult to define. This was a running joke at the time: “Alternative to what?” was the obligatory rejoinder.

Even with the benefit of academic hindsight, “alternative media” has remained too baggy to hem in. Mitzi Waltz’s book on the subject, Alternative and Activist Media (2005), notes that it refers first to “media that provide a different point of view from that usually expressed, that cater to communities not well served by the mass media, or that expressly advocate social change.” It is defined by what it negates, which, as Waltz notes, means that “alternative media are ‘alternative’ only in the context of their response to, and participation in, the cultures within which they are produced and consumed.” Rather than articulate a clear vision of, say, a radically reorganized society, alternative is cursed to indicate the locus of rebellion with nothing programmatic following behind. In the context of the seemingly totalitarian mass-media apparatus of the 1990s, this reactive posture felt significant enough. But as soon as the ground underneath shifts, that raison d’être disappears.

What it leaves behind is, often enough, little more than a sneer, a prank. The continual impulse to negate that was intrinsic to alternative revealed itself in the ironic pose and reflexive sarcasm so forcefully pinned to Generation X as its major cultural contribution. This radicalism bereft of seriousness is perfectly encapsulated by Sonic Youth guitarist Thurston Moore’s pastiche performance of a left intellectual in the tour documentary 1991: The Year Punk Broke: “I think we should destroy the bogus capitalist process that is destroying youth culture,” he babbles to some gobsmacked German fans. “The first step is to destroy the record companies. Do you not agree?”

Yet even without concrete precepts, other than perhaps a vague commitment to the ethos of “Do It Yourself” — what we now call, in its fully commodified form, “artisanal” — there seemed to be something in the air. In 1993, Jürgen Habermas, the ur-theorist of the public sphere, was provoked to modify his view that the public sphere would inevitably be colonized by capitalism, positing instead a “periodically recurring violent revolt of a counterproject to the hierarchical world of domination, with its official celebrations and everyday disciplines.”

The early internet seemed like a perfect example of this revolt, an ideal setting for the apparently anti-corporate media criticism of alternative culture. Dery notes this with enthusiasm, enlisting alternativity as part of the web’s natural structure: Online space was “interactive rather than passive, nomadic and atomized rather than resident and centralized, egalitarian rather than elitist.” Again, the implicit foil here is television. The problem of TV was that it was seen as structurally producing a unidirectional and passive consumerist subjectivity. You watched what others chose to show you, or chose among their choices. But the web was something else: an unruly, unmappable mess that required tenacity and creativity to navigate, especially before the development of search engines. The web cultivated a researcher’s élan: Do a deep dive through a “rabbit hole” of links, and you might be rewarded with the treasure of truly eccentric content. (I’m still addicted to that feeling.)

Caught up in its radical difference from established media, many a media scholar temporarily shelved any antipathy to technological determinism and embraced rhetoric about the web’s inherently egalitarian and non-hierarchical nature. Yet if the web made for an alternative public sphere, this did not come about automatically as a consequence of its technological differences. Rather, it was a product of how people made their way through its networks and responded to the battles that flashed across its landscapes. That is, this alternative public sphere developed as a historical phenomenon.

If the web made for an ideal setting for anti-corporate alternative culture, it was a product of how people responded to the battles that flashed across its landscapes

One way the internet built a set of alternative media and cultural practices was through the practical demands that going online made on would-be users. Like the punk scenes that flowed into alternative music, the early web was DIY, both by choice and by necessity, a kind of sprawling folk creation that artist Olia Lialina has dubbed “the vernacular web”: “bright, rich, personal, slow and under construction.” Corporations went online sluggishly and cautiously, which meant that much of the work of designing the web and filling it with information was left to amateurs and hobbyists. If you, like me, wanted to look up Nine Inch Nails lyrics, you had to go to someone’s meticulously (and likely poorly) crafted fan page. Official sites, if they existed at all, tended to hold back the good stuff anyway. What was one corporate-sanctioned background wallpaper and some old tour dates compared with a trove of lovingly hand-scanned images and arduously compiled discographies, given away freely by a fan just like you?

Second, there were real conflicts between corporations and the new capabilities being explored on the web. Many of the internet’s early adopters — programmers like Eric Raymond and Richard Stallman — had cut their teeth in the fight for free and open-source software and bequeathed to web culture an extensively articulated antipathy to copyright, which aligned with the art pranks of the early culture jammers. But this indifference to intellectual property extended beyond politicized rebels to anyone self-publishing online. The cease-and-desist notices sent to X-Files devotees, Harry Potter fanfiction writers, and even the tender souls with pages hosting images of Disney characters spoke to how quotidian the fight against corporate control and intellectual property could be online.

This politicization of copyright forged a strange alliance between those in thrall to the avatars of mass culture and those who sought to undermine them through parody. But at the time it made perfect sense. The web’s homespun aesthetics — the irreverence for professionalism contained in every gaudy color choice and trademark-flaunting Tinkerbell gif — came to signify, like the deliberate rawness of alternative music, a proudly adversarial culture. Among my earliest online activities were downloading a bitmap of the Adbusters “Corporate States of America” flag and reading all manner of rough-and-ready cultural criticism, from reviews of Aphex Twin albums to rants about the vapidity of MTV. Soon I would be writing these sorts of things myself, posting them to my own ramshackle Geocities page. In its most heroic form, this online alternative media meant sites like muckraking anarchist infoshops, some still producing new anti-authoritarian FAQs after all these years. If we’re being honest, though, the major players at the time were shock sites like Rotten.com, which drew relatively huge audiences for their outrageous images without any of the apparatus of built-in likes and automated feedback loops that now makes for viral culture.

But the occasional Goatse was the price we paid for liberty: Extreme anti-authoritarianism and pornography have traveled arm in arm since at least the libelles that scandalized France’s ancien régime. After all, the internet was a frontier, as J.P. Barlow told us, and its rugged independence needed defending. The prophylactic web offered then by America Online presaged a web of safe and bland curation, of filters and restrictions, and so it was attacked in a thousand ways — rhetorically, of course, but more directly as well. A popular and rather easy-to-use set of hacking tools, AOHell, let even a novice script kiddie wreak havoc (more properly spelled HaVoK) on the gentler sorts just beginning to participate on the web, knocking newbies offline and flooding chat rooms with vulgar ASCII art. AOL’s merger with media conglomerate Time Warner proved the righteousness of our cause.

This kind of culture jamming was part of a defense against the intrusion of dreaded corporations, with their standardization, censorship, and perceived paternalistic approach to culture. But it was also the mark of a subcultural elitism. In meatspace, I was an unfashionable nerd, but online I was part of a thriving subculture where my outré tastes were acknowledged and celebrated. And this, as much as anti-corporate rebellion, was a major attraction of dissident digital culture: It was a source of identity and status. This is the same mechanism that transformed the lackadaisical slackers of the alternative 1990s into the sneering hipsters of the indie 2000s. It is also, to some extent, what transformed niche gamers into the frothing swarms of Gamergate.

Of course, the pre-corporate web is gone, enclosed into a handful of proprietary platforms where our digital activity can be rationalized and monetized. Our free-ranging through shabby websites has been replaced by algorithmically curated feeds of standardized posts. But my point is less to mourn the death of the open web than to confront how the political valence of the normative ideals it embodied — an independent, homemade, anti-commercial “alternative” culture with a flippant orientation toward established media — has shifted. As the web was emerging from its adolescence, these ideals seemed inherently “progressive,” to use a shopworn term, but they have quickly deteriorated. Now they are suspect. What changed?


The internet did conquer the “dinosaurs” of old media. But rather than being culture-jammed, the old media were economically disrupted. In their own grand act of appropriation, two platforms, Facebook and Google, snatched up the revenue sources of old media by seizing upon their product — the ability to deliver demographically aligned stables of data-producing consumers (a.k.a. us) — and supercharging it. We’ve got identity now, too much of it. Social media jacked the old media’s mass audiences and remixed them into targeted micro-demographics. The data analytics company YouGov tells me that Sonic Youth fans tend to be male, urban, Democrat, and earn six figures. One in five work in advertising and marketing. Truly, no alternative here.

Online interactivity has been limited to a few large sites oriented toward advertising and shopping. Scrolling through your feed — the word itself conjures a pig’s trough — feels more like flipping through channels than surfing the web of yore. What once seemed like the birth of an interactive and participatory new media environment has shifted to an ostensibly more consumer-friendly (and more scrutinized) method of watching television, and today’s cultural critics, rather than condemn television, help us use our high-speed connections to select which shows to “binge” on.

But beyond the conquest of digital cool, the internet, which not so long ago was viewed as having helped elect the U.S.’s first black president and spread “democratic revolutions” across the world, is now blamed for stoking the flames of dangerous right-wing populism. Specifically, it is the techniques of culture jamming that, rather than show us a way out, have come under fire for bringing us to this point. Indeed, if culture jamming has had an apotheosis, it was the 2016 election, where memes, hacking, conspiracy theories, infodumps, and pranks consistently scandalized the media and galvanized support for the alternative outsider candidate, who happened to be a billionaire game show host.

In the wake of Trump’s win, media hoaxing — “culture jamming in its purest form,” according to Dery — has become the pernicious “fake news” that nudged voters into the arms of a dangerous incompetent. Trump was even able to détourn establishment media’s criticism of reactionary propaganda and turn it against them, making “fake news” a rallying cry for his supporters against CNN and the New York Times. And one has to admit that 4channers convincing people that Hillary Clinton was involved in a child-sex-trafficking ring run out of a pizza place is not miles away, in form, from the famous culture-jamming antics of the Yes Men, who have printed and distributed thousands of imitations of the New York Times and the New York Post.

In the wake of Trump’s election, intellectuals and politicos have not told us to hack, snipe, poach, or wage guerrilla war. Instead, we’ve been told to subscribe to the New York Times

During the election, all-volunteer cadres of racist meme warriors weaponized alternative aesthetics until even Hillary Clinton felt it necessary to publicly acknowledge Pepe the Frog, a character appropriated from underground comics, as a white supremacist symbol. In one of the purest examples of the inversion of the old values of the web, the cartoon’s creator, Matt Furie, has leaned on his intellectual property rights to issue DMCA takedowns of unauthorized uses of the frog.

In this truly topsy-turvy world, “alternative” has been truncated to “alt” and appended to “right,” where irreverent appropriation is thriving. The style guide for the alt-Nazi outlet the Daily Stormer boils culture-jamming techniques down to their essence. Rather than write news stories, the Stormer specializes in reappropriation and recontextualization, deliberately modeled on snarky liberal sites like Gawker, with the goal to “hijack culture” and “co-opt the perceived authority of the mainstream media.” Rather than weighty Sturm und Drang, these fascists strive for a light and irreverent tone: “we rely on lulz.” The Stormer’s founder, Andrew Anglin, hails from the same Ohio suburb as I do. I might have run into him in the halls of my high school, but he attended the alternative program.

The alt-right cultivates a desire for an idealized past where white supremacy and patriarchy were more firmly entrenched, an aspiration that would be unremarkable among American conservatives — normie, even — but for its lulzy vulgarity. Since Trump, though, a parallel nostalgia has arisen among the shell-shocked #Resistance: a yearning for the past of Big Media. When political theorist Francis Fukuyama, who has taken aim at “postmodern relativism” for decades, despairs over the internet’s ability to “liberate us from gatekeepers” of authority, it comes as no surprise. But when Joan Donovan, a researcher at the tech-savvy Data & Society think tank, reproduces Fukuyama’s criticism almost exactly — “we have few gatekeepers and we’ve meshed all of the different ways in which we consume reality” — something has changed.

For critics like Jackson Lears, the post-election obsession with internet-based disinformation offers a convenient scapegoat for Democrats unwilling to confront their political failures. But it reveals something else: a complete reversal of the 1990s sensibilities that yearned to leave stodgy Big Media verities behind and valued a subversive and jocular underground media. So powerful is this reversal that it has produced the surreal spectacle of reverence for the New York Times, whose deceptions and collusions with the Bush regime, as Lears wryly notes, facilitated the Iraq War, and for the Washington Post, that illuminator of democracy now owned by Amazon’s Jeff Bezos, one of the wealthiest and most exploitative employers in the country.

Those who awoke on November 9, 2016, and found themselves in a nation they no longer recognized reached for old stabilities and comforts — for those who might once again keep the gates. Even Clinton partisan Peter Daou’s much-mocked platform Verrit, which promises to verify each of its talking points with a numeric code, speaks, in its clumsy fashion, to the yearning for stable truths and trustworthy institutions, even as its own bizarre recursivity comes off as the kind of knotty joke a culture jammer might play. (Facebook has recently unveiled a similar system but, true to its business model, has outsourced the work of authentication to its users.)

In short, we wanted off the frontier. We wanted the state back. We wanted its protection from hackers (especially Russian ones); we wanted its legal concepts of property, expression, identity, movement, and context; we wanted it to regulate Facebook and Google in ways we can barely articulate. We wanted to call our congressperson on the phone because it’s more “meaningful” than email. We wanted an end to anonymity. We wanted God. We wanted big media. Hell, we wanted IBM and ITT and AT&T and Union Carbide, if only they’d have us.

Once, not long ago, critical media scholars told us to jam culture for the sake of good. Now when we spread an obvious parody on Twitter, we are part of the problem, demonstrating “people’s willingness to suspend disbelief and their susceptibility to confirmation bias.” In the wake of Trump’s election, intellectuals and politicos have not enjoined us to create a “counterproject” media sphere to combat hegemonic ideology. They have not told us to hack, snipe, poach, or otherwise take to the semiotic hills to wage guerrilla war. Instead, we’ve been told to bolster capitalist media: to subscribe to the New York Times, to dutifully consume advertisements by whitelisting our favorite sites, to obtain our music from commercial platforms — so the artists get paid, of course.

In the 1990s, one radical solution was to create more critical, combative, active media consumers and to expand the range of perspectives on offer. “Alternative” was, it seems to me now, itself a kind of insufficient consumerist prerogative, a demand for more choices that was agnostic to the content of those choices. It certainly left room for the reactionary alternative media we have today, which exploits as an alibi the reflexive ironization that the ’90s avant-garde used to obfuscate its aims and seduce outsiders: Poe’s Law as a weapons-grade version of alternative culture’s skepticism of earnestness.

But the rush back into the arms of establishment media seems like another kind of problem. It speaks to another simple story: that the answer to Trump’s careening demagoguery is a return to neoliberal technocracy and its officially sanctioned ideologies. It is a desire for a narrower world where corporations promise to, once again, produce a stable sense of shared reality through mass culture. If the world cannot be better, it can at least be predictable, the way television was. The new alternative is no alternative, again.


This essay is part of a collection of essays on the theme of ALTERNATIVE. Also from this week, Adam Clair on the alternative channel of loneliness.

Gavin Mueller is a contributing editor at Jacobin. He lives in Washington, D.C.