Doctor’s Orders

Vaccine refusal can’t be understood outside the context of the broader medicalization of society

In the midst of the Covid pandemic’s first winter, a contagious form of hope picked up steam and carried people through the darkness of lockdowns and social distancing, offering the promise that soon it would be a faint memory. With the swift arrival of a vaccine, a human-made miracle, all would go back to normal. But this optimistic, magic-bullet view didn’t anticipate an ever-morphing virus that would spawn more contagious variants, breakthrough cases, booster shots, and continued mask mandates — let alone a divided political climate that would manifest as vaccine hesitancy and outright refusal.

Through the months spent waiting for the vaccine in the U.S., momentum built for the idea that post-vaccine, life would be brighter than normal: “Hot vax summer.” Some planned “vaxxications” to begin exactly two weeks after their second dose. New York City, hit earliest and hardest by Covid, was treated to Uber ads that dusted off an old Ace Frehley hit, promising everyone that they would soon be “back in the New York Groove,” bustling again at its pre-Covid apex.

That is not to say that the Covid vaccines — themselves among the myriad serviceable public health interventions employed daily in this country to save lives — don’t work. By July, among the 18 states with a Covid case rate higher than the U.S. average, all but two had a vaccination rate below the national average of 55 percent, according to USAFacts. And the mRNA Covid vaccines have proved unquestionably protective against Covid-related hospitalizations. Yet as with their predecessors, masks, these vaccines have become another issue that divides Americans into warring factions who cannot find and do not seek unity.

The U.S. has long employed prophylactic compulsory vaccination to prevent the spread of communicable illness. Those seeking a green card, for instance, must be vaccinated against no fewer than eight transmissible diseases, and compulsory vaccination stands as one of the few legally enforced prerequisites to a child’s right to a free education. Resistance to these mandates was long a fringe movement, but the Covid vaccine has spawned a different response. At an August news conference, Mississippi governor Tate Reeves told reporters that he opposed Covid vaccine mandates, but when asked about extant mandates for measles and rubella, he deflected: “Y’all are going down some silly rabbit trails now.” Yet these are exactly the rabbit trails one would need to go down to understand this incongruity.

Why has this vaccine specifically been viewed with such suspicion or as an assault on personal freedom by so many? And how is that related to the equally widespread belief that vaccines alone could end the complex social and political condition that the Covid pandemic has become? These seemingly antithetical beliefs do not cancel each other out but are two opposing reactions to the same underlying history of the progressive medicalization of American society, and the understanding of almost all problems as solvable through a medical intervention administered on an individual basis.


For a certain bracket of Americans, advice about social distancing, mask-wearing, hand washing, and vaccination was received as liturgy, largely delivered through the scripture of CNN and the New York Times. Meanwhile, a different bracket of Americans rejected such strictures, sometimes questioning the severity of the pandemic itself, and whether mask and vaccine mandates weren’t themselves a denial of basic human freedoms. It was common for the first bracket to regard this second bracket as paranoid, selfish, politically brainwashed, and altogether divorced from reality. And it was common for the second bracket to view the first in nearly the exact same terms. While skepticism regarding medical expertise was seen as ignorance, belief in its authority was also viewed as, yes, ignorance.

To be fair, the sheer amount of garbled information with respect to best practices (due in part to disputes within the medical field) has made absorbing guidance in the Covid epoch anything but seamless. On February 27, 2020, the CDC tweeted that it did not recommend the use of face masks to prevent coronavirus, but it swiftly began singing a different tune as the crisis mounted (though to this day mask-wearing protocols remain potentially confusing and can seem counterintuitive). Then there were the months of expert disagreement over whether the coronavirus spread through the air in fine aerosolized particles rather than only in larger “droplets” (followed by the eventual confirmation that it does). Such inconsistency was at times obscured by the politicization of expertise itself.

But concerns about blindly obeying medical advice haven’t sprung up only in response to the pandemic. And this type of reactionary response to medical mandates is not only the province of right-wing zealots, libertarians, and self-righteous parents. Vaccine refusal, and some of the broader sociopolitical concerns brought to the surface by Covid, may be seen as an expression of resistance to what sociologists, philosophers, and psychiatrists — including Irving Zola, Ivan Illich, Peter Conrad, and Thomas Szasz — began describing in the late 1960s as medicalization: the elevation of medical advice and knowledge to the status of society’s ground for truth. The problem with medicalization is not that doctors are often wrong or that medical intervention is misguided. Rather, as health researcher Tiago Correia writes in “Revisiting Medicalization: A Critique of the Assumptions of What Counts As Medical Knowledge” (2017), the problem is how it seeks to impose social conformity under the auspices of personal health:

Medicalization first entered the vocabulary of social scientists to address the “tendency [of] medical institutions to deal with non-conforming behavior.” The concept speaks to the influence and role of medical regulation in daily life, which replaced previous social control institutions, namely the church and the law, in the management of deviance as explanations for human health conditions gradually changed from sin, to crime, and eventually to sickness.

This perspectival transformation is not always deleterious in its effects: Addiction and ADHD are often cited as conditions whose medicalization has led to a more humane understanding of them not as “sin” or personal moral failing but as illness. But medicalization has also encroached on a range of conditions that were not necessarily subject previously to criminalization or regimes of institutional normalization. It has played out as the redescription of many previously quotidian human circumstances — shyness, thin lips, pregnancy, and boredom, to name just a few. Such “problems” as hating one’s job or inheriting a “weak” chin began to be construed as “fixable” through pharmaceutical or surgical intervention.

As a result, the boundaries between medical issues and any concern whatsoever have been blurred, and the lines between medical professionals and those offering “solutions” to medicalized problems have also been blurred. Medical expertise is seen as more important to a broader range of experiences, but it is also stretched thinner — “patients” and “medical practitioners” now comprise a dominant swath of society (maybe its entirety), which allows for both groups to begin to view the whole of life through the lens of medicalized language and behavior. The crude instrumentalization of medical practice makes it harder to take a holistic view of health and makes medical-seeming advice appear more open to financial incentivization.

As society begins to pathologize more everyday problems as medical concerns, the medical industry itself follows what philosopher Ivan Illich, in his 1974 book Medical Nemesis, described as the “medical-decision rule,” wherein illness is presumed until health is proved, just as the legal system purports to presume innocence. “To ascribe the pathology to some Tom, Dick, or Harry is the first task of the physician acting as member of a consulting profession,” Illich writes. “Trained to ‘do something’ and express his concern, he feels active, useful, and effective when he can diagnose disease. Though, theoretically, at the first encounter the physician does not presume that his patient is affected by a disease, through a form of fail-safe principle he usually acts as if imputing a disease to the patient were better than disregarding one. The medical-decision rule pushes him to seek safety by diagnosing illness rather than health.”  The litigiousness of the American health-care system may also encourage and reinforce this attitude, not to mention the financial incentives for finding a way to bill insurance companies for various diagnoses, be they actual or fraudulent.

In Illich’s view, a society that turned “people into patients … inevitably loses some of its autonomy to its healers.” And medical professionals do wield unprecedented authority far beyond the treatment room: Think, for instance, of the current passport-like value of a vaccination card or even the power a simple doctor’s note has over employers and administrators. This phenomenon, which Illich labeled “diagnostic imperialism,” gives “medical bureaucrats” the power to “subdivide people into those who may drive a car, those who may stay away from work, those who must be locked up, those who may become soldiers, those who may cross borders, cook, or practice prostitution, those who may not run for the vice-presidency of the United States, those who are dead, those who are competent to commit a crime, and those who are liable to commit one.” While this insight does not mean that one should reject public health measures to gain some mythical form of medical sovereignty, it helps frame vaccine refusal as not simply an act of willful ignorance but as a political struggle against perceived bureaucratic overreach at the level of both the government and the medical system. If it appears that all power rests with medicine, it may seem to some individuals that the only way to manifest and express their power is through self-diagnosis and self-medicating.

Amid political polarization, appeals to the neutrality of the medical field seem to offer a way out. In “Medicine as an Institution of Social Control” (1972) sociologist Irving Zola argued that medical discourse was “nudging aside, if not incorporating, the more traditional institutions of religion and law. It is becoming the new repository of truth, the place where absolute and often final judgments are made by supposedly morally neutral and objective experts.” Judgments made in the name of “health” would seem to be incontrovertible — who wants to be unhealthy? But “health” itself is not a straightforward universal but a complex and multisided concept. It is ultimately a political concept; it is open to debate and can be conceived in any number of ways, with different degrees of emphasis on individual and collective aims. What appears as health to some may appear as “disability” to others.

In “Toward the Necessary Universalizing of a Disability Policy” (1989), Zola argued that “the issues facing someone with a disability are not essentially medical. They are not purely the result of some physical or mental impairment but rather of the fit of such impairments with the social, attitudinal, architectural, medical, economic, and political environment.” In other words, a person is not “disabled” in a vacuum, and what appears as a personal disability may instead be understood as a social inequity, a problem with how systems, institutions, and the built world have been organized.

Just as there are many ways one can be disadvantaged in American society, there are many ways to define what “abled” is meant to convey. Similarly, the definition of “health” is dynamic, not a universal and unconditional fact that medical professionals can be trained to unilaterally intuit. Rather, medical professionals may be understood as one competing interest in that political field. The inescapable ambiguity of what “health” means, and of what constitutes disease, can cast a shadow over the practices of for-profit medicine and pharmaceutical companies. The opioid crisis stands as a classic example of what Illich termed “iatrogenesis,” whereby medical intervention itself increases disease and social ills.

While Covid is undeniably a public health crisis that is best mitigated through public health interventions like the vaccine, it is also a global calamity that touches almost every corner of human experience. Yet government agencies have sought, and dominant media outlets have communicated, little besides medical expertise to address the crisis, presenting an inconsistent narrative that may have reinforced suspicions about the integrity of this government-commissioned medical expertise and the press’s motives in communicating it. Inadvertently or not, this has reinforced the idea that the stakes of the pandemic rest entirely in who decides to administer what medical interventions on themselves.


While much media coverage of Covid skepticism and vaccine hesitancy has focused on discrediting politically motivated, conspiracy-driven misinformation and unproven treatments — including the infamous “fish tank cleaner” and “horse dewormer” — in ways that often seem framed to underscore the ignorance and credulity of certain groups, these responses are refractions of the same impulse: to trust personally in a single intervention to solve the pandemic. They are expressions of the same medicalization of society and the individualism it produces.

But disease is not ultimately an entirely individualized phenomenon. In a paper published this June in Nature, “Thinking clearly about social aspects of infectious disease transmission,” Caroline Buckee, Abdisalan Noor, and Lisa Sattenspiel outline the necessity of exploring the “social, economic, and cultural forces [that] also shape patterns of exposure, health-seeking behavior, infection outcomes, the likelihood of diagnosis and reporting of cases, and the uptake of interventions.” Rather than focus solely on sanctioned (or, for that matter, renegade) medical treatments, the authors argue that “one of the most important lessons of the pandemic so far is that the central forces shaping local and global variation in disease burden and dynamics have been social, not biological. Although substantial biological questions remain unanswered, the multiple waves of infection that have been driven by shifting control policies and the heterogeneous public response to them as well as the disproportionate impact of the disease on poor and marginalized communities around the world are the defining features of the pandemic’s trajectory on local and global scales.” The medicalized perspective has neglected to account for how communal norms affect the overall health of society — within its limited framework of individual diagnostics, it does not prioritize “social” health.

This may be a result of the profit motive. In the U.S., health care is a for-profit industry that can be seen as benefiting from the illnesses of those it treats, particularly when treatments create new problems in the form of dependency and addiction. Medical advertising, which is commonplace in the U.S. and geared to individual consumers (“Do you have trouble sleeping at night?”), likewise sends the message that profitability trumps prudent diagnosis. No matter how rigorously the standards and codes of professional conduct are enforced, the industry can’t help but operate under the cloud of capitalism’s imperatives and consumerism’s demands, which is why a tragedy like the opioid epidemic occurred in the first place. This colors how people interpret sensationalistic news reporting about “silent killer” diseases and imperceptible chemical imbalances, and the insistence on regular checkups and “knowing our numbers.” It also colors the interpretation of directives about the pandemic, which have not been handed down outside that context.

Because of the inroads made by medicalization, many people have a hard time understanding suffering — never mind their lives — in anything but medical terms. It may be that the only time they can feel assured that another human being will show genuine concern for their well-being is when they enter the doctor’s office or an emergency room. But as the medical field has expanded to subsume other possible notions of wellness, this has produced a counter-reaction: a thriving world of self-help and competing “alternative” wellness discourses that have flourished on social media platforms. “Self-care” was originally coined by nursing theorist Dorothea Orem to describe the responsibility an individual has for the management and maintenance of their own health. She argued that a patient should not become overly dependent on medical assistance and should instead practice self-care up until a “self-care deficit” occurs — at which point a nurse should step in. This is the idea that Illich was gesturing toward when he claimed that “effective health care depends on self-care.” But the term has since been swallowed by capitalism as well, turning “health” into another product to be consumed, as an “alternative” to conventional medicine and the broader discourses of medicalization, if not a source of individualized pampering. Implicit in the Oremian notion of self-care, and in much of grand nursing theory, is both individual responsibility and the manner in which functioning communities care for themselves: Public health is about maintaining one’s own health and also looking after the health of those whose deficits make them more susceptible to illness.

In the U.S., “health care” is often understood to mean health insurance, which is largely tied to employment. This tethers health to one’s economic viability, as if maintaining wellness were only a matter of maintaining a worker’s productivity. Health itself — alongside other intangibles like safety and happiness — has been framed as something that can be individualistically consumed. This is the legacy of the medicalized framework that we live under and that has shaped the logic of the Covid response, both of the state and of individuals. In our current climate of distrust, some people have framed the issue of the vaccine as a battle between warring factions: one group wants perfect safety through medical intervention, the other perfect freedom through refusal. Taking or not taking the vaccine can provide neither.

That is no reason not to take it — doing so maintains a standard of true, community-led public health care (versus health “insurance”) that bolsters a care-centered democracy. But it also does not mean we should not question the limits of what experts can do to address not only the pandemic but the broader problems it has exposed and exacerbated. Choosing to be vaccinated during a public health emergency is likely the best and most universally beneficial approach we have in this moment, and mandates (which are nothing new in America) help spur that choice. However, the controversy surrounding mandates should serve less as a reason to hate thy neighbor and more as a wake-up call to issues with far deeper roots. We must begin to try, as a people, to redefine what a more complete picture of human health looks like and to recognize that individual medical treatment is only part of the equation.

Aimee Walleston is a writer and adjunct professor based in New York. She has contributed writing to publications including Art in America, T Magazine, and Mousse. She teaches at New York University, the International Center of Photography, and Sotheby’s Institute of Art.