In September 2017, Firefox launched their “farm to browser” ad campaign on Instagram, partnering with influencers and food bloggers for a series of embedded posts that juxtapose sleek, luxury devices — MacBooks and iPhones, often encased in on-trend rose gold marble phone cases — with meals and ingredients associated with “clean eating”: kale, chia seeds, açai berries, avocado toast, charcoal lattes, and smoothie bowls. Videos on their own account feature smiling millennials holding handfuls of dirt, farm-raised chickens, and heads of lettuce. The campaign compares using Firefox to other values-based decisions, like buying organic fruits and vegetables, shopping at farmer’s markets, and practicing yoga. “So we don’t have organic heirloom tomatoes inside our code,” one caption reads. “We are the only major browser backed by a non-profit, though, and that matters.”
These posts are meant to highlight the fact that Firefox, unlike Chrome and other browsers, does not profit from selling your data, making them the more “ethical” choice for internet users. Providing this option, Firefox suggests, is just as important as the work that farmers do to provide free-range, chemical-free, local, organic, healthy food. Though not without humor, the message is clear: Your consumption habits matter, and you should think about your browsing habits the same way you would about the food you eat and the clothes you wear. Firefox wants to package their attention to data privacy together with other forms of ethical living, making data privacy issues recognizable and appealing through humor, Silicon Valley aesthetics, and a very familiar rhetoric of salvation: aspirational purification, self-improvement, and control.
Firefox is just one example of metabolic metaphors — food, dieting, and detoxing — used in data privacy guides and in advertising for internet services; once you notice, you find them everywhere. Google’s Digital Wellbeing toolkit offers features, tips, and dashboards for you to help “fine-tune your tech habits to achieve your personal digital wellbeing goals.” Media outlets from Forbes to the Dr. Oz Show offer month-long “technology diet” guides for their readers; news articles and opinion pieces regularly instruct and analyze the latest digital diets and social media detoxes. Digital Detox, LLC, a company that describes itself as “a slow-down and not a start-up,” sells device-free workshops, retreats, and happy hours for individuals and corporate teams across the United States.
Together, these guides and services promote what one might call personal data hygiene: a set of tools, habits, skill sets, technical knowledge, and digital literacy that constitute practices of a so-called healthy digital lifestyle. They encourage personal data hygiene affectively, linking browsing habits with certain feelings and structures of feeling. We have reason to be conscientious about our browsing habits: our personal information is susceptible to being leaked, and will almost certainly be sold to analytics companies in order to sell more things to us. But much like the health products they align themselves with, these campaigns promote an elite vision of purity that pathologizes individuals for the choices they do or don’t make — urging the user to keep themselves “clean,” while shifting attention and responsibility away from the systemic ways they are targeted for their data.
The farm to browser campaign borrows from an ethos in which aesthetic choices are equated with health, and health with productivity: quality-over-quantity minimalism, managing distractions through mindfulness, screen-free meditation retreats. One way to understand this visual language is through the intertwined histories of the countercultural movement and the emerging entrepreneurial culture of Silicon Valley in the late 1960s, as described by Fred Turner in his book From Counterculture to Cyberculture. Turner describes a countercultural community who, even as their peers organized political protests, “turned away from political action and toward technology and the transformation of consciousness as the primary sources of social change.” These New Communalists, as he calls them, share with technologists the utopian vision of empowered individualism, collaborative community, and global harmony through the “cybernetic notion of the globe as a single, interlinked pattern of information.”
Richard Barbrook and Andy Cameron advance a similar argument in their 1995 essay “The Californian Ideology,” describing how a neoliberal digital utopianism emerged from the unlikely ideological fusion of free-spirited hippies from San Francisco and the entrepreneurs and hi-tech industries of Silicon Valley. They describe how the Californian myth of free market entrepreneurship relies on decades of immense public funding — via subsidized Cold War defense contracts — as well as a long history of DIY initiatives and cultural bohemianism in the Bay Area, the signifiers of which (“community media, ‘new age’ spiritualism, surfing, health food, recreational drugs, pop music…”) are still recognizable in media like Firefox’s campaign.
Out of this fusion of countercultural values, technological determinism, and libertarian individualism emerged a class of workers Barbrook and Cameron call digital artisans — well-paid media and information technology workers who are “heirs of the ideas of radical community activists” but make up “a privileged part of the labor force.” Although digital artisans “enjoy cultural freedoms won by the hippies, most of them are no longer actively involved in the struggle to build ‘ecotopia.’” Digital artisans look for liberation not through struggle and solidarity, they argue, but through technological innovation and individual consumption.
Firefox’s campaign and others like it similarly sell the path to health, happiness, and self-worth by changing individual consumption habits or adopting new technologies. The Data Detox Kit, a promotional publication co-produced by Mozilla and Tactical Tech, is an eight-day guide to teach users about personal data hygiene habits. The website asks, “Do you feel like your digital self is slipping out of control? Have you let yourself install too many apps, clicked ‘I agree’ a few too many times, lost track of how many accounts you’ve created? Perhaps you’re not as in-control of your digital lifestyle as you’d like to be.” Colloquially, the term “detox” is a way of going on a diet without actually saying you’re on a diet — losing weight, or flushing toxins from your body, often through the consumption of some products (juices, say) over others.
Though it never uses the term “diet,” the kit uses much of the same moralizing language as the diet magazines in the check-out aisle: you are not “in control” because of what you have “let yourself” do. You’ve “lost track” of how much you’ve consumed — if you just had more discipline, you could prevent yourself from being tracked and spied on online. It’s worth noting that this sort of rhetoric has a gendered history — companies and organizations have long marketed dieting, detoxing, toning up, and slimming down to women as a way of achieving self-worth through controlling what you eat. (The farm to browser campaign was also targeted mostly at Instagram, a platform where the majority of users are women, and specifically women under 35. Even in the realm of the datafied self, women are continual targets for the ways they take up space.)
The Data Detox Kit describes how to reduce your “data bloat — a toxic build-up of data that can lead to uncomfortable consequences in the longer term,” and instructs users how to establish a regimen to prevent more toxic data build-up in the future. It emphasizes how easy and straightforward this path will be, but in fact provides quite an extensive list of things to keep track of: Not only does it require you to change your own habits with pretty much every website or app you use, it also demands that you convince your friends and family to do the detox too. (“Every time they tag you, mention you or upload data about you, it adds to your data build-up, no matter how conscientious you’ve been.”)
The emphasis on “personal data hygiene” more or less ignores the questions of who has time, money, and access to the spaces and rituals of purification. Eating organic isn’t cheap; yoga studios, farmer’s markets, the farm to table movement, juice bars, even the Pacific Northwest, where Firefox has its offices, have all been criticized as overwhelmingly white spaces. Scholars like Alison Hope Alkon and Julian Agyeman have explored the ways racism, classism, and imperialism have created foodways that continue to privilege the global elite. As they argue in their book Cultivating Food Justice, “Communities of color and poor communities have time and again been denied access to the means of food production, and, due to both price and store location, often cannot access the diet advocated by the food movement.” Urban planning policies, zoning ordinances, and mortgage requirements shape the built environment, creating food deserts that restrict access to affordable and nutritious food — usually in poor communities and communities of color.
Barbrook and Cameron write that this utopian vision of California “depends on a willful blindness toward the other, much less positive, features of life on the West coast: racism, poverty, and environmental degradation.” In their video launching the farm to browser campaign, Firefox tells the viewer: “You value things that are fresh. Things that take hard work” — the narrator pauses — “by someone else.” In some cases, it is exactly the farmworkers who cultivate fresh produce who suffer most from hunger and food insecurity. This line in the video points to questions of class and access: These campaigns are centered on middle-class consumers, overlooking the working conditions of producers, of agricultural laborers both in the U.S. and worldwide.
The wellness-centric language of these campaigns also obscures who is harmed the most in these practices of surveillance and data extraction. Barbrook and Cameron warn that if only some people have access to new information technologies, the seemingly admirable ideal of a technology-fueled democracy will only become a “hi-tech version of the plantation economy of the Old South.” Scholars like Simone Browne, Virginia Eubanks, Safiya Noble, and Cathy O’Neil have shown that, in some ways, their warning has come true: contemporary systems of algorithmic classification and data extraction continue to deepen social inequalities and replicate long-existing patterns of racial and gender bias. Any movement for data justice that does not explicitly engage with issues of feminism and racial and economic justice runs the risk of perpetuating these same norms.
Digital wellness campaigns promote an aspirational notion of purity, positing it as a feeling one can achieve through a few easy steps, while ignoring the class- and race-based constructions of what is considered “pure” in the first place. In her influential 1966 book Purity and Danger, anthropologist Mary Douglas argues that dirt is “matter out of place.” The distinctions between clean and unclean, she argues, are less about inherent cleanliness than about maintaining order. Scholars like Dana Berthold have explored the ways current preoccupations with hygiene are genealogically linked with explicitly racist ideals of physical and moral purity. She argues the United States has “a history of extreme preoccupation with hygiene because racialized ‘cleanliness’ (for example the one-drop rule and segregation) has been utterly crucial to class status — high status being anxiously guarded by whites and truly unattainable for others because of the ways in which it was coded white.”
A brief glance at the history of natural purity reveals the deep associations of cleanliness with civility, high social status, and whiteness; notions of purity cannot be separated from these histories of colonialism, slavery, and white supremacy. Personal data hygiene practices are products of such classification systems. Though they never say so explicitly, these digital wellness campaigns, in promoting aspirations of digital purity, are linked genealogically to histories of racism and exclusion.
In her book Against Purity, Alexis Shotwell writes, “Being against purity means that there is no primordial state we might wish to get back to, no Eden we have desecrated, no pretoxic body we might uncover through enough chia seeds and kombucha. There is not a preracial state we could access, erasing histories of slavery, forced labor on railroads, colonialism, genocide, and their concomitant responsibilities and requirements. There is no food we can eat, clothing we can buy, or energy we can use without deepening our ties to complex webs of suffering.” Rather than aspirations of digital purity and shame for inevitably falling short, I want to use this as a starting point for data justice and, as Shotwell suggests, explore the radical potentials of impurity, implication, and compromise in social action.
Metabolic metaphors ignore the structural factors that place internet users in peril, putting the burden on individuals to know where their data exists, how they’re being tracked, who has access to the data, and how it is being used to make decisions about them. And while these habits might make people feel like they have a modicum more control, they distract from the real issue: the corporations actually doing the extracting, and the systems that allow this in the first place.
Ultimately, dieting metaphors replace one bad feeling with another — trading anxiety for shame, a painful feeling of being fundamentally flawed. Shame can induce feelings of inadequacy, avoidance, and fear of failure, and can create the urge to disconnect from those around you. Furthermore, using shame to get people to do things doesn’t actually work. If we want to motivate people to think differently about their data rights and take action, drawing on feelings of shame might be counterproductive — discouraging people rather than galvanizing them. This is not an issue of unhealthy browsing; it’s an issue of an industry built on surveillance and data extraction.
The metaphors used by Firefox and other campaigns do the important work of encouraging internet users to think critically about their browsing habits and might inspire them to turn away from the monocultures of Google and Facebook. I want to encourage work toward building a stronger and deeper critique of data extraction, though, one that considers how data justice intersects with issues of feminism and social and economic justice. By focusing on production rather than consumption, and looking at the ways these systems of oppression continue to exploit those who are already marginalized, we can focus on the real need to organize. By using a framework of data labor and data rights, we can link issues of data extraction with other movements of workers and other social justice movements happening right now.