Necessary Purity

Western medicine’s trouble understanding allergies points beyond simple explanations of toxins and purity

Some people live with impurity less easily than others.

Take me, for instance. I grew up on an organic farm in Maine. Farm life is supposed to fortify a body against later immune dysfunction, like allergies. But my body doesn’t tolerate very much. My list of environmental allergies is long: Most of the major allergenic pollens in the northeastern United States, every mold I’ve been tested for, cockroaches, dogs, cats, and horses. Not only does my body have high standards for purity, but the things I must exclude are all over the map of statistical probability and at times defy the logic of medical science. When it comes to food, I’m allergic to select nuts — especially cashews, but not peanuts (the most common food allergy) or almonds — and most fish, but not, as far as I know, the more commonly allergenic crustaceans. I’m doubled over in pain after eating some stealthy anchovy paste in French-style roast chicken, but I can gobble down a jar of kimchi as a snack with no trouble. And keep me away from anything made with Chardonnay grapes, though Rieslings are welcome.

I am far from alone, though. While data before 1997 is thin, surveys suggest that food allergy is on the rise. As many as one in eight people in the United States may have the condition, with the “top eight” allergens — peanut, tree nut, milk, egg, wheat, soy, fish, and shellfish — accounting for 90 percent of food allergies. Somewhere in the neighborhood of 2,000 people a year may be hospitalized due to a reaction, which can turn serious quickly, progressing from a skin rash and gastrointestinal symptoms to wheezing, fainting, a “feeling of doom,” and cardiac and respiratory distress just minutes after eating the wrong food.

Rising rates of food allergies seem to validate the menacing sense that the postindustrial world is inimical to human health. It is even tempting to regard food allergy as the signature disease of modernity, as historian Mark Jackson has suggested. Some speculate that people with food allergies may be canaries in the coal mine, their hypervigilant bodies warning us all about the eventual dangers we’ll face from exposure to the pesticides and industrial additives of the unclean modern world. If so, a return to pure, clean living — avoiding pollution, pesticides, the hustle and bustle of modern life — would seem to be the solution.

But the forms of scientific evidence and analysis are not — at least, not yet — elaborated fully enough to explain food allergy in terms of our general condition of contamination. Biomedicine, which treats biology as the exclusive explanation for sickness, does not make such grand claims. In scientific terms, food allergy involves complex, as yet incompletely understood interactions between inherited genetic and epigenetic dispositions, conditions in the womb, microbial colonization in the first moments of life, and the precise timing and route of early-life exposure to foods. Tying modern life to food allergy scientifically — at either the level of the individual body or the level of society — remains a work in progress, often driven by non-scientists who experience bodily changes as a response to modernity.

From a biomedical point of view, “purity” is a matter of timing individual bodily encounters with foods relative to the immune system’s development. An intentional feeding of eggs or peanut butter to an infant by a conscientious caretaker might trigger an allergy. Or it might be a child’s peanut-covered fist finding its way into their mouth after grabbing food-coated playground equipment — an accidental exposure to a normal substance in an ordinary environment. This “impurity” appears to have little to do with the generalized pollution of modernity.

At the same time, certain forms of early exposure can help prevent allergy formation: Exposure to the mother’s microbiome in the vagina seeds the newborn infant with a borderline unruly zoo of friendly and pathogenic microbes that may have a protective effect against food allergy. Though it may appear to violate the hygienic sensibility of the aseptic modern hospital, it potentially stands as a sort of necessary impurity. But once food allergies develop, impurity is no longer indeterminate. A nut, a seed, a crumb, or an invisible smear of protein and oil on a cooking surface becomes a contaminant, posing a major threat to exquisitely sensitive bodies.

Material purity has thus become a rallying cry of food allergy activists, with local political efforts directed toward keeping peanuts out of schools and sports stadiums and toward changing federal food-purity reporting standards. Such purity projects are not needless, nor are they reactionary responses to modernity’s undetectable dangers. In this limited sense, there is a necessary purity for foods and environments that can make the difference between living well and a potentially catastrophic illness. This makes purity more than a metaphor for those living with allergies, even as its metaphorical register does, at times, inspire political intervention. As a metaphor, purity easily translates from necessary practices to exclusionary principles, from protecting an individual’s body in ways appropriate for them to limiting exposure to all kinds of people who might be carrying or cooking with the food that might pose a threat.

How, then, should we understand the biomedical significance of purity without further muddling its material and metaphorical uses? What forms of purity are necessary? Could a degree of impurity save us from rising rates of chronic immunological illness? What evidence is acceptable for answering these questions, and how might we collect it? Who has the authority to say what forms of hygiene and pollution lead to sickness and health? And how do we find answers to these questions without demanding purity in our politics and societies where none is needed?

If food allergy appears to be a modern disease, manageable by techniques of purification and carefully controlled contamination, it also uncannily recalls pre-20th century ecological understandings of human health, in which environments rather than specific pathogens were seen as responsible for illness. With allergies, no clearly identifiable trigger tips the body over the threshold from “healthy” to “ill.” While an array of contributing factors can be described, pinpointing a single, specific cause is elusive. Instead, one becomes allergic during the course of life, often early but possibly at any time. While the stereotypical food-allergic individual is an elementary school child with a peanut allergy, my own nut allergies appeared well into my twenties. In online communities, middle-aged women recount acquiring five, 10, even 20 food allergies: after the birth of a child, gradually over many years, or suddenly, with no apparent trigger at all. Their cases confound their doctors, which in turn compounds their sense of fragility and fear.

The way that food allergy tracks an individual’s biography is reminiscent of pre-germ theory environmental medicine. In this paradigm, bodies with certain underlying constitutional dispositions were thought capable of living harmoniously with the diseases, soil emanations, and climatic vagaries of particular places. A healthful body, then, was one whose features complemented those of the place where the person lived.

For example, as historian Conevery Bolton Valenčius documented, mid-19th century American settlers in the Arkansas and Missouri territories were fixated upon the healthfulness of the land. Healthful land could be identified in part by geospatial features: elevation, climate, wetness or dryness, vegetation, and so on. The healthfulness of the land could also be detected through its effects on human bodies. In the world of 19th century America, the human body itself was understood to be dynamically, flexibly, and continuously interacting with its environment. Smells indicated the presence or absence of putrescent materials that could give rise to disease. A land that produced frequent fevers in its inhabitants was more than likely an unhealthy land, whereas one that grew tall, muscled, hardy young people undoubtedly had essential features to commend it. To live in healthy country was to have a healthful body; a healthy body was evidence that the country was healthful.

By abiding in a locale, some people’s bodies could, within certain bounds, become accustomed to a place. This process was called seasoning. The porousness of the body was, for individuals with the right constitutions, potentially a resource for survival. But not all bodies were thought to be susceptible to seasoning in the same way, or to mesh well with every environment. In the American South, for example, human bodies of African descent were thought to intrinsically possess the right constitutional characteristics to adapt most successfully to subtropical climes, regardless of their individual histories of exposure. Thus, they were seen as more suited to agricultural labor than white settler bodies. Such ideas traveled everywhere that white Europeans and Americans colonized in the 19th and early 20th centuries, providing scientific justification for the exploitation of the peoples they occupied and persisting as an influential medical framework until as late as the 1920s.

Starting in the late 19th century, discrete biological causes like microbes began to replace environmental, constitutional, and seasoning mechanisms as the explanations for disease. Allergies recall these older kinds of environmental and constitutional explanations, though to date they have resisted the same sort of pernicious racialization. Perhaps it is this anachronistic aura that prompts the question I often receive when people find out what I study: Are food allergies real?

As I understand this question, it employs a very restricted sense of the word real. I think what people are actually asking is: Are food allergies verifiably biological, and not just in someone’s head? Can they be detected objectively with modern diagnostic technologies? Can they be shown to be inherent in specific, individual bodies rather than a sign of our general decline as a nation, as a society, as a human race? That is, can we assure ourselves that they are the problem of certain, troublesome individuals, and not something that might be a general pathology triggered by modern life?

The case for the “reality” of food allergy is complicated by the current state of food allergy diagnostics. Blood tests are available for food allergy, as they are for many conditions. It’s just that they are not universally thought to be useful for helping patients manage their condition. These tests measure the body’s response to specific allergens rather than some intrinsic, independent property of the body itself. Confusingly, an immunological response in a blood test does not always mean a person will have symptoms; from a treatment perspective, testing positive is not the same as having the disease. Many people with “positive” blood tests to certain foods show no symptoms when they eat them, and they may restrict their diets unnecessarily if they avoid foods based on that metric alone.

For these reasons, blood tests are often regarded as secondary to other diagnostic tools, like a patient’s history of reactions and direct tests of the body’s sensitivity through skin-prick tests and other exposures. Most definitive and direct of all is the so-called oral food challenge: feeding a patient the foods they believe they are allergic to in a controlled clinical setting to see if they react. In this simulation of ordinary eating, sudden onset of illness serves as incontrovertible evidence of an allergy.

Food allergy is peculiar among modern diseases because the patient’s memory and the real-time functions of their body are often considered more reliable than the way the body is mediated through modern medicine’s sophisticated technological apparatus. Much of modern biomedical practice makes a habit of suspending belief in a patient’s symptoms until they can be technologically verified, whether through a 19th century technology like the stethoscope or a computerized CT scanner. How can a disease be “real” in the 21st century if it cannot be validated by laboratory techniques that operate independently of human bodies?

For philosopher Bruno Latour, this desire for laboratory verification extends the laboratory into society, and extends society into the laboratory. According to Latour, Louis Pasteur, the 19th-century inventor of the sterilization process that bears his name and an advocate for germ theory, was the first to master these translations. In his research on anthrax, he made aspects of the laboratory transparent to outside observers and assumed the authority to speak for the anthrax microbes he observed there in a variety of public forums. After Pasteur, Latour suggests, everyone in society must enter the laboratory to have their non-expert sensations and observations verified as factual, as real. The laboratory — and, by extension, scientific expertise more generally — has subsequently gained the authority to adjudicate on what is “real” in the society that lies beyond its walls.

But food allergy so far defies this measure of reality. It is epistemologically elusive, not easily reduced to objective proof from high-tech tests or measurements. It is an ontologically impure disease, troubling biomedical biases about the borders and definitions of disease. Food allergy is not an either/or kind of disease (either environmental or individual, either testable or a matter of memory, either subjective or objective) but a both/and: both externally verifiable and personally experienced, both a question of how bodies encounter the world and how the processes within them function. It promiscuously skirts the edges of what counts as a sensible biomedical category: It is both genetic and environmental, both constitutionally patterned and triggered by specific exposures, both detectable in the lab and diagnosable through patient experience.

It is within this confusing matrix of purity and impurity, laboratory verified facts and embodied sensations, that people with food allergies are tasked with finding things to eat that do not make them sick.

Sociologist Alexis Shotwell recently inveighed against the pursuit of purity in Against Purity: Living Ethically in Compromised Times. For Shotwell, the pursuit of purity has dark political implications. Defending environmental purity through a discourse of moral panic about gender-ambiguous frogs, she suggests, devalues queer and trans struggles for political recognition. And clinging to a pure definition of a disease as a problem affecting a single demographic group or generating a single set of symptoms can exclude patients from necessary treatment. During the AIDS crisis in the 1980s and early 1990s, women were also afflicted and died. But since their symptoms were illegible to medical science, they often did not receive the same needed treatment and disability payments as their male counterparts. Purity as a guiding principle, Shotwell tells us, marginalizes and kills.

But here again, food allergy resists the rubric. With food allergy, there is, at the level of practice, necessary purity and helpful impurity. Precisely timed impurity early in life might lower the chance that a person would ever develop the disease. But once one develops the condition, material purity — food made of purely what it is said to be — is not just a metaphor but a bodily necessity.

What food allergy teaches the careful observer is to be specific about the kind of purity necessary to live. It does not extend to avoidance of foods that have not triggered reactions. Nor does it extend to a narrow understanding of single-cause disease etiology or a definitive testing regime that precludes the need to listen to messy patient stories and to observe real-time bodily reactions (at least not yet). And it most certainly does not extend to exclusionary interpersonal or institutional politics — something to remain especially vigilant of because of the historical tendency of white Americans to claim that the bodies and lifeways of people of color, immigrants, and people with disabilities introduce contaminating influences into society.

Purity is necessary in a limited, practical sense in the case of food allergy. Thinking about purity in this context serves as a parable for understanding its role in other areas of life. We must be specific concerning the politics that necessary purity practices enact or demand. While a politics of providing appropriate care for specific bodies throughout the life course is a good politics for unpredictable times, demands for total purity of category, kind, and substance at all costs seem more dubious. Yet purity practices are embedded in — and can even be ways of materializing — our politics. We hope that purifying our problems by identifying simple causes of our ailments or demanding political allegiance around a single analytic frame or idea can simplify our search for a better life. But this has rarely been the case, and it should not motivate our every pursuit.

We must remain vigilant that necessary purity practices do not translate into exploitive and exclusionary purity politics. Necessary purity exists, but it is specific, modest, material, and local. The muddled status of food allergy points to the limits of purity as an epistemic, ontological, and material ideal for biomedical science. Identifying simple causal triggers, inventing new tools that guarantee easy identification, and developing quick fixes for diseases on the rise, like food allergy, are tempting. But these pursuits would also limit what we are capable of seeing as reasonable, long-term solutions to caring for our messy, unpredictable bodies.

Should we be seeking quick fixes for ailments of individual bodies? Or should we be aiming to make it easier to live as part of collectives and communities with a wider variety of bodily difference, by lowering the cost of medical treatments and expanding health-care access and other social services? Will new obsessions with protecting the purity of bodies translate into new forms of paranoia about living with bodies that are racially, sexually, or otherwise different from our own? We ignore the material specificity and historical lessons of the changing politics of bodily purity at our own peril. Seeking purity and simplicity limits how we confront the challenges that face us in constructing more livable, healthful futures — and even what we imagine them to be.

Danya Glabau researches, teaches, and writes about gender, bodies, and technology in New York at the Brooklyn Institute for Social Research and NYU Tandon School of Engineering. She is working on a book on food allergies, gender, and capitalism.