Bill of Health

How medical data is becoming a luxury good

Forward, a startup launched by former executives from Google and Uber in 2017, touts itself as the future of health care. But the future the company is creating is one where health is an exclusive, premium-priced good rather than a human right or a general social aim. For $199 per month (and access to a wide array of your personal information), it offers “primary care membership focused on prevention,” using digital monitoring technologies to try to forestall disease with data. Through knowing oneself, the marketing suggests, one can prevent future catastrophe. But Forward adds something new to the now-familiar pitch for health-data gadgets like period trackers and diet-and-exercise logging bracelets. For the monthly fee, a person can experience the aura of health in person in the company’s bespoke offices, buying peace of mind to be delivered in the future.

During a standard visit, a variety of connected “smart” devices are used to “gather comprehensive data wirelessly for you and your doctor.” If its promotional material is any indication, going to a Forward location is less like visiting a doctor’s office and more like dropping into a minimalist, high-end boutique. The clinical space depicted on its website is accented with flat screens and technological gadgetry that doesn’t immediately disclose its medical purpose. The company promises a 3-D full-body scan for new clients, though what that machine can detect remains undefined. The entire presentation is medicine as mystery and as luxury.

Data-driven health services promise transcendence through data collection. But health data is also becoming a means of stratification

Following appointments, patient data is aggregated and self-knowledge is served on demand to customers on the company’s glossy platforms. In an article for Quartz, Michael J. Cohen compared its clinical strategy to that of an app: sleek on the user-interface end, with “software and data” powering “the new operating system of health care” on the back end. The New Yorker noted the company’s aspiration to be “the Apple Store of doctor’s offices,” and compared Forward to a gym, as did Business Insider.

As with a high-end gym, Forward’s monthly membership fee and the possibility of casually stopping by for routine tune-ups are designed to bring peace of mind. While gyms, yoga studios, spinning facilities, and other fitness spaces now offer raw juices, custom fragrances, and specially designed socks as talismans for the pursuit of elevated states of wellness, Forward and other data-driven health services promise transcendence through data collection. But the luxe ambiance makes it obvious how access to health data is also becoming a means of stratification. For those with cash to burn each month, health data collection can be timed to fit their schedule and the resulting information can be readily accessed through the device in their pocket. But for those without such services, access to their own health records remains difficult, with incompatible formats, paper records, and physicians’ reluctance to share information keeping them in the dark.

In the real world where people frequently fall ill, doctor-patient encounters are crunched into shorter and shorter time frames and preventive health-care services are becoming scarcer in rural areas in the U.S. Finding a doctor who can provide adequate basic health care is increasingly a privilege of the rich and urban. Forward offers its urban customers — it was launched in California and recently expanded to New York City — an intensification of that privilege: not only the knowledge that one is looked after by a medical professional but also reams of exclusive data about the self.

The public fanfare over services like Forward makes explicit how medical treatment in the U.S. is increasingly positioned as a luxury for the wealthy. At the same time, the rest of us are led to believe that it is not money or affordable care that really matters but self-knowledge. Devices like the Apple Watch and the FitBit are proxies for the attentive, face-to-face health care that all people actually need. But in the utopian realm of digital health, personalized on-demand health data is framed as both sufficient and aspirational. Like the latest iPhone, health is becoming an exclusive novelty item, enticing to early adopters with cash to spare.


At the beginning of this century, health data was positioned as key to improving outcomes for all people worldwide, an unquestionable social good for everyone. New techniques — often digital — for collecting, sharing, and accessing quantitative data from blood tests; historical symptom data revealed by patients to their doctors; and records of drug prescriptions were supposed to make it easier for everyone to live longer and better. The more health information that was available to individuals, the more people would live life by the numbers, taking preventive measures to stay a step ahead of serious illness. Early diagnosis of everything from cancer to diabetes would make it easier to treat complex diseases; proactive individuals equipped with up-to-the-minute sleep and weight data could adjust their routines to optimize everything from attention to heart health.

More data — about the self and about future potential patients — had come to be seen as key to producing more health, as though “health” were an absolute quantity that could be counted and increased. In his 2012 book Drugs for Life, anthropologist Joseph Dumit described how the prevalence of these ideas turned health data into a valuable asset for pharmaceutical companies, which ensure future profits by anticipating demand for drugs based on population-level data. Data-driven market research, Dumit observed, became a site of knowledge production as valuable as the laboratory and the clinic, allowing companies to predict and prepare for future health trends.

Data appears straightforward. It doesn’t depend on the people who generate it, with their messy lives filled with competing priorities

The incursion of Silicon Valley investors and professionals into health care has since intensified the health data obsession. Armed with the faith that knowledge always leads rational individuals to take appropriate action, the health-technology and digital-health fields blossomed, spawning new services (like DNA testing service 23andMe), platforms (like period-tracking app Clue), devices (including exercise trackers like FitBit), and now clinics like Forward, centered on collecting and organizing health data and slickly reporting it to doctors and individuals.

As these fields have grown, they have contributed to the impression that data alone is sufficient for medical care. Compared with the intricacies of human biology and relationships, data appears straightforward. It doesn’t depend on the people who generate it, with their messy lives filled with competing priorities. While a patient’s prevention regimes and adherence to active treatments can be upset by everything from a new job to distrust of a particular medical professional, data is easy to make and distribute. As an executive of a health-tech or digital-health company, it’s easy to know how much of it you have and to assign each unit a corresponding price as you amass more. And it is easy to convince yourself that assigning it financial value makes it more medically valuable.

The procedural and economic advantages of data processing reinforce the conviction that data is the key to health. As medical practice is reconfigured around easily digitized packets of information that can be traded and sold, nuance in doctor-patient interactions and the integrity of health research are threatened. With data as the goal, the value of people and their needs and rights recedes from view.

The growing number of ethically questionable transfers of health data, both between for-profit companies and to law enforcement, is evidence of this. For example, the recent disclosure of an agreement between the direct-to-consumer genetic testing company 23andMe and the pharmaceutical giant GSK demonstrates that companies holding private health data might change how they use that data in the future. Further, many observers complained that it was unfair for the company to profit from data that customers had already paid to provide when they used the service, with none of those additional profits flowing back to the people whose data helped to generate them. And while DNA ancestry data might be helpful in catching the occasional criminal (like the Golden State Killer), critiques of such sharing have raised questions about what law-enforcement access to ancestry data will mean for the privacy of future generations, distant relatives of suspected criminals, and racialized groups.

The myopic emphasis on data collection distracts us from observing how access to health care is being further stratified by social class: Rich urbanites can sign up for concierge health data services like Forward. Middle-class people get FitBits for Christmas. The working class and the poor, meanwhile, must treat access to attentive, culturally sensitive doctors as a distant dream.


Rather than alleviating risks in a cost-efficient manner, the rush to build new data-driven business models is producing new risks of its own. Many health technology companies try to strike a bargain with their customers: In return for agreeing to have their data shared with a variety of unnamed third parties, the customer gets access to doctors, or genetic data, or calming meditation experiences, or alerts to prepare for menstruation, as well as a wealth of data about their personal health. If health data is capital, this is a new form of profit sharing.

Potential users need to consider whether the promise of more health knowledge today is truly worth a potential lifetime of precarity and discrimination

But it is unlikely that any particular health tech company will last long. Despite a few out-of-the-park successes, the sector overall has struggled to realize returns for investors. One can imagine a company like Forward looking for new ways to monetize its trove of customer information during difficult times, following in 23andMe’s footsteps and selling information to a pharmaceutical company. Or perhaps it will sell itself to one of the large tech companies, which are launching their own health efforts (making the landscape even harder for small-scale ventures), or to one of its tech-affiliated investors. If the company follows the example of its peers in tech, that additional revenue would not flow back to users as reduced fees or improved service; it would benefit investors and executives.

The use of monitoring devices to generate health data should also be a cause for concern. U.S. lawmakers are only just starting to debate “internet of things” security legislation and federal regulators are just beginning to draft rules for the protection of digital patient data. The ways that apps and medical devices like 3-D scanners, “smart” thermometers, and even heart implants communicate with each other are notoriously insecure. As yet, there is no meaningful carrot or stick to entice device makers to ensure the security of their devices in isolation, let alone in complex networks of exchange. Data may be traveling even further than device and service terms disclose.

Many medical tech companies’ terms of service claim that information will be de-identified before being shared with third parties, but researchers have demonstrated that re-identifying digital health data is not all that hard and could expose a person’s financial, demographic, or location data as well as their health status, opening them up to potential identity theft. In the U.S., where health insurers and life insurers are allowed to discriminate based on health and disability status, re-identification of health data could easily lead to a lifetime of denied access and extreme financial vulnerability.

What is at stake throughout the health technology sector are structural questions about who the health data economy will benefit. In the long term, customers and patients who give up their data for free or at relatively low cost might be the product, not the beneficiary — just like in other online social networks. The sector’s norms of exchanging data with third parties, from drug companies to advertisers, piggyback on the now widely suspect data-for-cash business model of many tech platforms. Potential users thus need to consider whether the promise of more health knowledge today is truly worth a potential lifetime of precarity and discrimination in any of the many scenarios in which their data travels in unanticipated ways.

For the rich, this may not be a concern, because any ill effects can be minimized with wealth. But for everyone else — for whom desirable devices might help to fill gaps in basic health care or cope with chronic illnesses of poverty — evading law enforcement, suing the nefarious actors who stole their data, and amending the record with potential employers could be difficult, if not ruinous.

While more data seems like a commonsense solution to a rickety health system, it further divides the haves from the have-nots, distracting consumers and decision-makers with glossy interfaces. The health data economy is an economy of risk, and like any risky business, the jig will be up before long.

Danya Glabau researches, teaches, and writes about gender, bodies, and technology in New York at the Brooklyn Institute for Social Research and NYU Tandon School of Engineering. She is working on a book on food allergies, gender, and capitalism.