
Double Trouble

You should know who the algorithms think you are

I don’t watch much reality television, but this summer I found myself obsessed with the MTV reality dating show Are You the One. This was the infamous “sexually fluid” eighth season, and I came mostly for the thrill of messy queer drama on national television. But as the show went on, I found myself fascinated by the premise: 16 singles must figure out who in the house is their “perfect match,” as determined by a complex matchmaking algorithm that takes into account interviews with friends, family, and exes; a personality test; and a thousand-page questionnaire.

The contestants alternate between playing with “heart” and “strategy” — seeing who they naturally connect with, and using math and deduction to work out who the algorithm might have picked for them. Leading with heart is encouraged: the premise of the show suggests that someone’s true feelings should lead them to their algorithmic match. If a contestant happens to fall in love with a “no match,” their feelings are taken to be incorrect, and they are urged to get back out there and do better.


One of the defining story arcs of AYTO’s eighth season revolves around Max: a self-proclaimed “straight in Ohio / bi in LA” #ravegod party boy who meets and falls in love with the highly sought-after Justin. Being with his first boyfriend is a deeply formative, potentially life-changing experience for Max. But the algorithm is not a deity that can be appealed to. The verdict? Not a perfect match.

Max is heartbroken, but there’s no time to wallow: the gang has only two more episodes to sort out the correct matches if they want to win the money. Fortunately, great agony leads to great clarity. In a pivotal moment, Max finds himself red-faced with revelation during a conversation with Kari, a party planner from New Jersey who he’s been at odds with all season. “It’s Kari! She’s my match!” he insists, overcome. Kari is everything he told the matchmakers he wanted, before he discovered that what he really wanted was Justin. He isn’t recognizing his soulmate, but rather the data soulmate of his data double. Max fed the matchmakers the sum total of his emotional data, and the machine responded with Kari.

AYTO deliberately casts contestants who are “bad at relationships,” and throughout the season they repeat (likely per their contract) how they are ready to find their perfect match, how the algorithm knows best, how it can do what they, as ordinary humans, cannot. Although the contestants are supposedly there to fall in love, their real task is to discover who they’ve been profiled to fall in love with. The show rewards a very specific digital literacy: In order to win, the AYTO cast has to find their algorithmic soulmate by divining their algorithmically legible self.


Watching the contestants struggle to put aside their own instincts and desires and trust in the algorithm felt very familiar — it is something that is asked of us every day. The Spotify algorithm determines what I listen to; the Facebook/Instagram/Twitter algorithms determine whose lives I keep up with, what events I am aware of, what ads I see, even what news I read. I try to practice active browsing, but inevitably slide back into the algorithmic stream. The thing is, it’s not totally wrong. Most of the time the algorithm guesses accurately, or accurately enough, and I am comforted by the customized ease of my digital environment.

Meanwhile, everything hidden from me slips smoothly out of my awareness. I am only jarred into noticing the curated nature of my reality when something obfuscated inexplicably reappears at the edges of my digital periphery. Where has this person been? How could I have forgotten them? What coincidence of data has brought them back?

The algorithms encase me in a bubble of “me,” or at least a bubble of who they think I am. It’s not a particularly deep assessment. If I’m listening to “female-fronted pop-punk,” it will return more of the same (or, usually, gender-segregated bands from different genres). It won’t venture to give me country or harsh noise, though I love artists who work in those modes as well. The algorithm lets me reify certain choices, but it doesn’t let me grow.

As humans, we have our own internal algorithms for reading people around us, with our own biases. When I see a large man in salmon shorts I might think “bro,” and depending on how he moves I might sort him into “dangerous” or “chill.” A person with brightly dyed hair and a septum ring might scan as “queer” or “alt.” I will scan as “ma’am” and “sir” to different people’s algorithms, often over the course of a single hour. The data our algorithms take in is largely visual, but incorporates other factors such as body language, conversation, smell.


Like digital algorithms, ours are based on information we’ve been trained on over the course of our lives. (Unlike digital algorithms, we do not factor in where someone has been over the last week or how they spend their money, at least not without extensive stalking.) Human algorithms are equipped with plenty of bias, as any woman whose pain is ignored by a doctor or person of color who is routinely stopped by the police can attest. It’s difficult enough to hold humans accountable: even with anti-discrimination laws in place, it can be hard to prove that a human-made hiring decision was a result of someone’s name or gender marker and not just someone not being “the right fit for the job.” Algorithmic evangelists believe machines can rectify our human faults with a divine mathematical mandate of pure truth. But we create our machines, and when we do, we pass our flaws down to them; the machines then turn our most damaging myths into mathematical truths, applied at a massive scale.

Most of us are warned from an early age of the potential human biases acting against, and within, us. This information is passed down from parents, older siblings, friends. We learn to code-switch to get what we want, or to achieve a sense of safety. We try to develop an awareness of these shifts to maintain a consistent sense of self outside of someone else’s gaze. But most of us still have no idea how to code-switch before the algorithm, or even which of our data points we would need to modify. In many cases the option might not even be available to us. While human beings largely profile each other based on immediately visible data, digital algorithms often pull from data long buried in the past, fostering a feeling of helplessness and codifying disposability. It can feel impossible.

Just sitting with that despair is a part of working toward undoing it. Algorithmic literacy is not necessarily about cracking the code but just realizing that there is one: that the everydayness of our lives is intensely mediated.


In 2016, Cathy O’Neil’s Weapons of Math Destruction dove into the effect algorithms have on our material circumstances, whether it’s a company deciding not to hire someone based on a history of mental illness illegally detected through a personality test, or a predictive policing model that turns arrest histories (regardless of conviction or severity of crime) into self-fulfilling prophecies by overpolicing areas where arrest rates are already high. Repetition shapes reality. Without an awareness of the algorithms at work, those affected start to believe that they are somehow fundamentally broken. Their suspicions are dismissed as paranoia. The prevailing belief is still that computers are meant to eliminate human error, not perpetuate it.

The algorithms that shape and determine our lives are necessarily complex: job candidates aren’t flagged for any one answer but rather for certain patterns; car insurance payments are calculated not only based on a customer’s driving history but also on their credit score, zip code, and the routes they take. The government will not reveal why someone has ended up on the No Fly List even after multiple challenges by the ACLU. These algorithms are often described as a sort of “black box”: their compounded nature makes it impossible to see what goes on inside. But even though their machinery is obfuscated, we can sometimes sense when they are acting upon us. And just as we can follow a feeling of dissonance to the version of ourselves another human has created in their mind, we can sometimes follow our unease to discover the outline of our algorithmically legible self: our data double.

As anyone who’s looked up their Facebook targeted ads knows, our data refracts an uncanny version of ourselves that is only partially accurate. This “data double,” as coined by Kevin D. Haggerty and Richard V. Ericson, is the sum total of what is visible to our surveillants, a “decorporealized body of pure virtuality.” Our data doubles are similar enough to be recognizable, but distorted enough to be disturbing, built out of our clicks and likes and digital movements. “Google tells me she loves American football, and I wonder what twist / of data gave her this quirk, this sweet brave way in which she diverges / from me, diverts surveillance, leads the advertisers astray,” writes Navya Dasari in her poem “Data Double.” Our double’s glitches reassure us that we are still the most authentic version of ourselves.

Scrolling through Instagram, my data double and I read our advertisements. We’re both tempted by knockoff Balenciagas and discount Crunchyroll subscriptions. Like me, the data double has moved recently, judging from the furniture and mattress ads, but somehow they got to stay in New York. We both used to fall for androgynous goth couture, but recently it seems like they’re becoming the type of person who’s more lured in by cookware and vegan bath products. Which of us has changed? Also — when did they get a dog?

During one post-election cybersecurity workshop I tried to look through my computer from a stranger’s point of view. What kind of person would keep a drive like this? Are they a person who would have an easy or a hard time getting past airport security? In a dystopian secret police raid, would this be the kind of person flagged for extermination? As countries begin asking people for their social media information at the border, I consider whether my Twitter (which an IBM personality metric rated as “inconsiderate” and “disagreeable” after incorrectly guessing that I didn’t listen to country music) resembles that of a welcome guest.

We try to gather a sense of self from the world being created around us. Just as the “attractiveness” of the people OkCupid shows a user reflects how “attractive” the app judges that user to be, or as the tone of my classroom reflects my energy for the day, I try to assemble every ad and alert shown to me into an algorithmic portrait.

When my Lyft driver drops my ride at 4 a.m. I have to wonder — did they change their mind, or is my Lyft rating dangerously low? My financial data affects whether I’m approved for a credit card or a house; my health data determines my insurance premiums. For a while, something about my data meant I was repeatedly being pulled into secondary screenings while traveling, until, just as mysteriously, it stopped.


We are told to delete drunken college photos, to watch what we post lest it be used to justify an arrest (particularly for already-targeted populations), to turn off Location Services whenever possible, to get a VPN or open Tor when reaching for privacy. We guess cash is digitally clear (though not untrackable), Signal is more or less encrypted, and the safest conversation is one where your phone is in the other room. There are several guides for guarding our digital personas; few for totally disguising them. While these measures offer a feeling of control, we are inherently leaky beings. There is no data double charm school I know of.

I run my Facebook through Apply Magic Sauce, which promises to show “what your digital footprint says about your psychological profile,” and experience momentary disappointment at how bad a psychic it is. It gets my age and gender mostly right, but the similarities stop there. Further, it pulls back the curtain and reveals which values it assigns to various Likes. Liking Canadian nu metal band Kittie marks me as “more feminine,” “conservative,” “contemplative,” “easily stressed,” and “less intelligent.” Liking techno live-stream session platform Boiler Room marks me as “younger,” “hard working,” “laid back,” “married,” and “more Catholic.” The whole illusion is shattered.


As Facebook and other social data repositories integrate with other surveillant assemblages, reading our data doubles is likely to prove increasingly material. In O’Neil’s book, when a Vanderbilt student named Kyle Behm is repeatedly rejected from minimum-wage jobs, his friend is able to find out that Kyle had been red-lighted by the personality test. Kyle’s father, an attorney, discerns that the personality test flagged Kyle for bipolar disorder — something that is both misguided and illegal under the Americans with Disabilities Act. Most people do not have the resources or the access to determine where their ill fortune comes from. Without an awareness of how algorithms interpret our digital footprints, we run the risk of attributing the material consequences to our own perceived failings, believing that we are unqualified or unworthy, rather than trailed by data doubles who unwittingly set off algorithmic alarms.

In Kyle’s case, algorithmic literacy led to material change: His father filed a suit, and at least one company Kyle had applied to changed its application process. In other cases, the effects of algorithmic literacy might be merely existential. But that itself is no small victory.

According to a self-help meme that my algorithm thought I might like, one thing I can never control is what other people think of me. This is supposed to be reassuring. After being locked out of account after account due to forgetting my new super-strong passwords and learning that nearly every securitizing mechanism I have the capacity to understand is always-already compromised, I start to wonder if it isn’t better to just own our vulnerabilities rather than pretend we don’t have any. I know that I have to keep my data double in check, even if I can’t yet grasp or anticipate the full scope of why. But I also want to ignore them, to let us lead our separate lives and be confident in our differences. That is, until the next time they’re called upon to represent us.

Chasing one’s data double is a lifelong project. Regardless of how effective it is, it gives me a sense of control in a digital space that’s simultaneously wholly personalized and wholly alienating. And, psychologically, I take solace in the fact that there are some things the AI can’t see: perhaps how we are feeling (though with the advent of affect surveillance, not for much longer). Perhaps our dreams (though sometimes I am certain there are major blockbusters pulled straight from my nightmares). Perhaps what this literacy offers is a hard-won sense that we are more than the sum total of our likes.


As with many algorithmic ventures, the algorithm on Are You the One inevitably lets the contestants down. Only one confirmed “perfect match” is still together, while plenty of “no match” couples are dating happily. After the season wrapped, Kari, the party planner from New Jersey, expressed an enduring faith in the algorithm and a desire to make it work with Max despite glaring red flags in their relationship. “I don’t speak to Kari. I don’t want any sort of relationship with Kari after she hooked up with Justin in the hotel,” Max said during a reunion special, adding: “Kari and I never really got along. I found out she was my match and that was that. I never wanted any relationship with Kari outside of that.” 

Watching the show was both hopeful and sobering. Despite their best attempts, algorithms still can’t fully gauge our true selves or desires. Call it a “soul” or a “vibe,” a kind of light that doesn’t reach into the uncanny valley. Whatever it is, it is the ultimate romantic notion: that something ineffable in us is still, for the time being, unsurveilled.

In my own life I’ve alternated between running from my data double, trying to shape them in my image, and trying to build a safe dummy data double to hide behind. (As my favorite Privacy Issues song goes, “Every day you watch me, so / I change the way I speak / I change the way I speak.”) For a brief, paranoid period of time, I debated keeping my legal name as my byline long after I stopped using it in meatspace. I wanted to plan a good life for her — one that could go on separate from myself. I would keep up the data trail that kept her going, I reasoned, and she would in turn sneak me safely past hawk-eyed gatekeepers. A past self as an alter ego — a ghost I could wrap around myself like a soft sheet.

It didn’t work for long. I was no Norman Bates — the deadname had lived a good life and deserved to be laid to rest with dignity. Despite my understanding of security culture and my keen post-Soviet paranoia, I was overcome with a perverse need to have my data double resemble me more closely. My own craving for data intimacy surprised me: Something in me was willing to sacrifice being safe for being known. Call it data dysphoria. I wanted to recognize myself in the black mirror.

At the end of Annihilation, Natalie Portman’s character is locked in an eerie mirror dance with her alien twin. The twin crushes her against a wall, but when Portman collapses, the twin collapses as well. As Portman wraps her twin’s hands around a grenade and pulls the tab, the iridescent thing takes on her face.

NM Mashurov is a music writer, poet, and nightlife worker from Brooklyn, and current MFA candidate at UCSD. They write on collectivity, queer intimacies under the surveillance state, cyborg affect, smashing borders of all kinds, gay noir, and, occasionally, nü metal.