Sins of the Mother

The benefits of self-tracking don’t outweigh the risks of disciplinary surveillance

At the Apple Special Event this September, CEO Tim Cook announced the company’s latest device, the Apple Watch Series 4, to a crowd of industry insiders and reporters. Hailing it as a way to “stay connected, be more active, and live a healthier day,” Cook characterized the Series 4 as Apple’s latest attempt to create technologies that are “more personal.” With cell service, built-in fitness tracking, and a newly added electrical heart sensor, the watch had become “an intelligent guardian for your health.” Its sensors can now follow your every move (at least, anywhere that matters, anywhere with cell service) at any time (until the 18-hour battery fails) while doing any activity (except deep-sea diving).

Unlike most recent Apple device announcements, this one hooked me. Since August, I had already been tracking my every move, calorie, cramp, and sickness using another self-tracking app, under the vague impression that tracking myself would speed along the process of becoming that most overdetermined kind of cyborg, a pregnant woman. I even had a file on my cloud drive called “Notes on Pregnancy.” I intended to use it to capture the thoughts, sensations, and encounters with fertility-enhancing technologies that could not be easily tracked through the app. But my enthusiasm had already begun to wane by the time the Apple event came around.

As I watched video of the event, I wondered if the Watch could salvage and redeem my self-tracking ambitions, somehow facilitate my admittance to the cult of motherhood. Could it help me better discipline my sex life in line with my fertility? Wrangle my BMI closer to an ideal level for a woman moving briskly toward the age threshold of geriatric pregnancy? Could a wise digital guardian help me and my potential children become stronger, prettier, and healthier? Could it help me transcend the unseemly limits of my earthly flesh, to pass through a trial-by-data and come out renewed and rejuvenated on the other side?

The temptation of transcendence through technology — the possibility to overcome weaknesses of body and willpower that seem to be part of the ordinary human experience — pervades digital life. Self-tracking speaks directly to this enculturated desire for transcendence. We hope to be able to change ourselves through knowing the numbers, so we seek out more and more information about ourselves to change our futures.

But there are risks that come with seeking to exceed the embodied self through data. It requires relinquishing a great degree of control over our private activities and motivations. Complex desires and activities get broken down into fragments as they are captured and stored as data. The pieces of us can be tracked in minute detail by whoever is granted access under terms-of-service contracts. The clouds of data gathered about us by our technological add-ons quickly escape our control. Instead, they become the means by which we can be controlled.

The potential risks are especially acute for people seeking to bear children and for their offspring. Pregnancy intensifies already dense networks of surveillance. Pregnant women and new mothers are expected to adhere to a strict regimen of doctor’s appointments, diet, and exercise. In the United States, they are seen as a prime target for advertising. As sociologist Janet Vertesi discovered in 2014, avoiding the gaze of marketers involves extensive planning for even mundane purchases and an evasiveness that can make one look like a criminal in the process. More frighteningly, the everyday surveillance of the internet has outed at least one woman’s pregnancy without her consent.

The imperative to do more and be better is not only a question of the well-being of the person carrying the child. At stake (so we are told!) are concerns that are bigger than us and yet seem to depend on us: the future of the national economy and the health of the species. When pregnant people fall short, they fail not only themselves but the imagined heirs, nations, and biological kin by whom they could have done better.

So while digital self-tracking might seem to be the answer for someone like me who seeks to become a parent, using it to record evidence of all the ways that the parent might set the child up for failure could make it easier to visit the sins of the mother upon the child. To safeguard future generations, it may be more important to guard against self-tracking’s intrusion into our lives than to reap its benefits. Is the Apple Watch truly a guardian, caring for our well-being — or is it a warden, watching and waiting for us to make a misstep?


These reservations about self-tracking were on my mind as I walked into the Apple Store in downtown Brooklyn. The store juts out from one side of a new luxury high-rise, sharing the ground floor with a 365 Market, the new, budget-friendly version of Amazon–Whole Foods, and a simulation of a neighborhood food court populated by stores selling $5 loaves of bread and $10 cups of green juice. Glass walls rise to a height of several stories, bending gradually toward each other like the prow of a ship. This vessel for the future is aimed down Flatbush Avenue, the next step in central Brooklyn’s gentrification.

The store is not just a place to buy Apple products. It is also where individuals are meant to come learn how to remake themselves — their bodies, their minds, their design portfolios — for a better future, powered by Apple technology. It is an outcropping of the Californian ideology that preaches liberation through technology, transcendence through market choice, and becoming cyborg mainly as a way of becoming a better consuming citizen. We are invited in to be seduced by promises of freedom from the everyday, the material, the now. We are invited to live in the future, closer to the world of our descendants.

But what is good for the disembodied, future-oriented cyborg may not be good for the fleshy person — especially the reproducing one. While it may seem necessary and prudent to track the self to ensure good outcomes in the future, the real risk may lie in handing over extensive, real-time data about oneself to a third party. The promise of transcendence relies on the faith that the outside party truly has an individual’s best interests at heart. To be a responsible user of self-tracking technologies and to get the most out of them (as I have learned in my fits and starts at being a self-tracker), one must relinquish a great deal of privacy. To get the most accurate estimate of energy exerted during a run or a walk, for example, I need to allow the Fitbit app on my iPhone to keep track of where I go. It knows where I live and work. I tell it how much I weigh and what I eat. I even tell it where I shop when I scan bar codes from store-brand foods to automatically look up nutritional content. While this is risky in the best of times, the stakes of data misuse are even higher when the life of a future child is in the picture.

While Apple advertises the Watch as a caretaker, its effects are more in line with those of an overseer. Digital self-tracking tools are the latest wave of disciplinary technologies, imprisoning their users in a cage of data. This makes digital self-tracking consistent with modernity’s signature strategy of social control as theorized by Michel Foucault: discipline. Foucault famously argued that a fundamental break happened in how the state approached crimes and moral transgressions beginning in the 17th century. While earlier forms of punishment relied on spectacular episodes of violent force — up to and including public executions — dealt out in the name of the sovereign king, the modern “gentle way” in social control aligned individuals to social norms not by doling out painful or deadly bodily harm as deterrents but by monitoring them.

Modern disciplinary techniques urge penitence and reform. The disciplinary state collects data about its citizens both to identify mathematical norms for society and to improve those numbers within the population as a whole through education and enforced compliance. In Foucault’s reference point of mid-century France, this involved state institutions like the military, the school, the hospital, and the prison. When individuals fell short of the norms, these institutions delivered exhortations to shape up, receive more training, and work through their deficiencies toward a better, future self.

We face similar injunctions from the contemporary equivalents of these institutions — not only from schools, hospitals, and prisons but increasingly from our health insurance companies, public service agencies, and workplaces as well. While Foucault’s 18th- and 19th-century disciplinary institutions were controlled by the state, the disciplinary apparatus has now extended into private industry and local government. Over time, we internalize the message of these institutions, and their demands for a better self morph into a personal commitment to progress, a genuine desire to submit to a responsible overseer who can guide and train us.

Social scientists have imported this framework into conversations about digital self-tracking technologies. Rather than inducing submission to a nation-state, contemporary self-tracking technologies prompt compliance with the norms and assumptions built into them by medical and technical professionals: advice about how many steps to take per day, how many calories to eat, and when a person might be at the peak of fertility. The apps and devices themselves become the warden of an individual’s opt-in confinement, with the data stored and owned by the company that makes them. This data is largely exempt from oversight by elected officials, and can move quickly from device and app manufacturers through third parties to marketers and advertisers. Popular business models for self-tracking tools scratch the individual’s itch for self-knowledge while also providing useful data for future marketing campaigns, as digital sociologist Deborah Lupton explains.

More disturbingly, personal data is being centralized in the public sector to police, predict, and punish across generations. As Virginia Eubanks documents in Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, new technologies are being used to make predictions about the behavior of future generations based on the actions of current ones. Designers of such systems are thereby instantiating the most dystopian and literal version of social reproduction.

It seems only a matter of time before medical data from self-tracking tools becomes part of the determinative matrix in both the public and private sectors. Fitbit, for example, now has an entire division devoted to selling tracking-centric wellness programs to large employers as a way to find savings on “annual health care spend.” The expansion of digital surveillance combined with the expanded use of automated decision-making tools is creating new threats to privacy and liberty for everyone. If these two regimes become fully interdigitated, a wide array of behaviors, interactions, and motivations captured in data in the past will shape the futures of individuals and communities, leaving little room for self-determination.

As digital surveillance becomes embedded in our lives, we come to identify the possibility of future well-being with tracking and tinkering with our measurable behavior in the present. But the effects and affects this monitoring produces are unevenly distributed. They reinforce medical discipline for those of us who are already disproportionately surveilled, like pregnant people, as well as people of color, immigrants, women, and disabled people. For the children of people who are disproportionately intervened upon by criminal justice institutions, they reinscribe punishment before any harm has been done. It seems only a matter of time before self-tracking can be used to set up future people as consumers before they are even old enough to pay for their own lunch or sign up for a credit card. By reducing behavior to numerical data, the platforms and cloud technologies behind consumer self-tracking tools provide powerful ways to link the many domains in which minute-by-minute records of what we do can be turned into data, money, and control.


In the end, I did not buy the Apple Watch, despite the initial stirrings of desire. I remembered that tracking the self is still work — of attention, of caring about the metrics, the outcomes, the future risks one is creating in the present. Buying a Watch could get more expensive than it seems, as the first purchase opens the floodgates for more invitations to buy technological consumer goods. One device could spawn the need for more devices (connected thermometers, connected scales, connected water bottles). And the data collected would most likely be crunched into new appeals to buy more and be more both for me and for my potential future children.

Although the Watch could help me bring my body in line as I prepare my womb, I would rather not intensify the inevitable onslaught of baby-related advertising by sharing daily data about my activity, diet, and health status with companies that seek to monetize it in creative ways. But most important, I cannot bear to think that what I give that device today would be held against me and my lineage in the future. If I don’t eat right, if I weigh too much, if I drink wine after conception, or if I don’t meet prescribed quotas of daily physical activity, could such data be used by a pediatrician to place the blame for perceived deficiencies in my child back on me? Would it be a reason for a company to withhold affordable health insurance from my family? Will schools crunch my prenatal health data to track my toddler into the appropriate preschool classroom? If the social services benefits used by a parent can already determine the relationship between a child and local welfare agencies, it seems only a matter of time before biological data is also enlisted into multigenerational systems of discipline.

In a different arrangement of society, I would be comfortable with, or even excited about, tracking and storing data about my health at this (so the world tells me) critical biological juncture. If I could be confident that the sins of the mother would not and could not be visited upon the child, then perhaps. If my possible missteps today meant extra care and regard for future generations, then gladly. In a world where children were not tracked into future careers from toddlerhood, where disability was not seen as an individual failing and blamed on the poor choices of mothers, where social benefits flowed freely to those who needed them, and where the slightest deviance from bodily or social norms did not result in swift and harsh financial retribution from employers, criminal justice systems, and public agencies, then the questions I find myself asking in this essay would be entirely moot. I wish I lived in that world. But for now, this earthbound cyborg’s wrist remains conspicuously bare.

Danya Glabau researches, teaches, and writes about gender, bodies, and technology in New York at the Brooklyn Institute for Social Research and NYU Tandon School of Engineering. She is working on a book on food allergies, gender, and capitalism.