In a post-apocalyptic future, a man I barely know offers to risk his life so that I might find my kidnapped son. Actually he’s a robot, both in the sense that I can see gears turning through a hole in his jaw and in that he’s an artificially intelligent non-player character (NPC) written into the video game Fallout 4. Either way, I’m floored by his generosity. A doctor plugs a neural implant into the robot’s brain to retrieve memories recorded by my son’s abductor. When the operation succeeds — when the robot, Nick Valentine, not only survives the procedure but retains his personality — I feel as though I have just seen a friend through a risky surgery. Controller in hand, I feel gratitude and relief.

The sequence in Fallout 4 where Nick Valentine offers his hardware toward the pursuit of the player character’s main quest is pre-scripted: He will risk his safety no matter how the player has treated him. But many of his actions in other parts of the narrative will refer to data accrued by the game’s social bonding mechanism, which evaluates the player’s behavior against a companion NPC’s likes and dislikes. Do something a companion approves of, and a notification appears in the top left corner of the screen: “Nick liked that.”


Certain social rewards within the game mirror the way two people might bond with each other in person. Valentine “likes” or “loves” it when the player chooses to help other NPCs, just as someone might gain respect for an acquaintance upon watching them assist a stranger. He dislikes it when the player is disrespectful, just as someone might cancel a second date if a potential partner is rude to a waiter. Other in-game rewards are so specific as to be perplexing. Several NPC companions delight in watching the player character pick locks, offering a “like” when a door swings open but not when the player takes a bullet for them. Each NPC’s reward system is encoded into the game before the player begins, simulating sociality not as a dynamic, reciprocal engagement with another being but as a series of points to score. There are even unique conversations that can be unlocked once the player accumulates enough points, giving the player ample motivation to keep impressing their companion.
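The mechanic described above reduces to a lookup table and a signed counter. A minimal sketch, with invented NPC names, actions, and point values (not Fallout 4's actual data), might look like:

```python
# Hypothetical affinity system: each NPC's likes and dislikes are fixed
# before play begins, and every action maps to a single signed number.
NPC_PREFERENCES = {
    "Nick Valentine": {"help_stranger": 15, "pick_lock": 5, "be_rude": -15},
}

affinity = {"Nick Valentine": 0}

def perform_action(npc: str, action: str) -> None:
    """Score the player's action against the NPC's preset preferences."""
    delta = NPC_PREFERENCES[npc].get(action, 0)
    affinity[npc] += delta
    if delta > 0:
        print(f"{npc} liked that.")     # the on-screen notification
    elif delta < 0:
        print(f"{npc} disliked that.")
    # Note: ambivalence is unrepresentable; the meter only rises or falls.

perform_action("Nick Valentine", "pick_lock")
```

The design choice worth noticing is that the table is static: the NPC cannot revise its tastes in response to the relationship's history, which is precisely what makes the system a score rather than a bond.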

While there is no internet in Fallout 4 — radio and DOS-equipped computers are about as far as its retrofuturistic civilization got before it fell to nuclear war — the way it quantifies social relationships precisely mimics the mechanics of contemporary social media. Even the language is similar: likes and loves, dislikes and hates. A companion won’t necessarily tell the player they’ve been offended or impressed by an action, but the game will, superimposing a hate or a like over the player’s field of vision. The reaction happens instantaneously, and there’s no room for ambivalence. The relationship meter either rises or falls; it cannot register both at once, and it tracks only a single axis.

In the world of Fallout 4 — and other games with similar social mechanics, like the popular Sims series — sociality comprises approval and disapproval, nothing more. These games imagine human bonding the way a Facebook database might log interactions among users. The player cannot complete an action that would make an NPC feel simultaneously proud and envious, say, or inspire that gnawing combination of gratitude and guilt that sometimes arises in response to profound kindness. Characters in these games may be richly written, to the point where they can seem like real people, but their relationships with the player unfold along a single dimension. These relations don’t reproduce the nuanced friendships that can arise between two human actors. Instead, they replicate the quantified social media scaffolding that can grow around such friendships.


On Facebook and Twitter, the “like” constitutes the primary mode of interaction. By default, it’s a positive engagement, a unit of approval tossed at a post. Justin Rosenstein, a former Facebook engineer credited with having helped to invent the now-ubiquitous “like” button, told The Ringer in 2017 that he hoped the platform’s new mechanism would encourage users to treat each other better. “Something I had been thinking about was, is there a way to increase positivity in the system?” he said. “Not force it, but to increase the likelihood that Facebook is contributing to creating a world in which people uplift each other rather than tear each other down.”

By now, with precise algorithmic tooling of the news feed, the lifting he describes is not merely a self-esteem boost but a literal amplification. A popular post is more likely to be seen. On Facebook, as on Twitter, a rapid accumulation of “likes” increases the potential for a post to be delivered to other users, even if they don’t follow the post’s creator, leading to the opportunity for ever more “likes.” The system motivates users to create the sort of content that racks up lots of approval quickly: Facebook posts describing major life events, tweets that spark political outrage or an easy laugh. The system itself does not explicitly differentiate between the tenor of each post; a eulogy for a loved one may receive the same treatment within the algorithm as a well-executed pun or the dissemination of a video of police brutality. Despite Facebook’s recent attempt to allow emotional reactions other than “like,” the way a post is weighted in the system derives mainly from the volume of its reactions, not their kind.
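A toy illustration of the dynamic described above — not Facebook's or Twitter's actual ranking code, just an assumed sketch of ranking by rate of approval — shows how volume alone can decide delivery, indifferent to a post's tenor:

```python
# Hypothetical feed ranking: score = likes accumulated per hour.
# Content and tone play no role; only the velocity of approval does.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    hours_old: float

def score(post: Post) -> float:
    """Posts that rack up approval quickly get delivered more widely."""
    return post.likes / max(post.hours_old, 1.0)

feed = [
    Post("a eulogy for a loved one", likes=40, hours_old=8),
    Post("a well-executed pun", likes=900, hours_old=2),
]
feed.sort(key=score, reverse=True)
```

Under this scoring, the pun outranks the eulogy purely on throughput, which is the indifference the essay points to.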


In his 2014 paper “What Do Metrics Want? How Quantification Prescribes Social Interaction on Facebook,” artist and new media professor Ben Grosser describes how social media’s tendency to reward volume over affect mirrors capitalism’s presiding ethos of growth at any cost. “Our need for personal worth is highly dependent on these social interactions … If this essential human need can only be fulfilled within the confines of capitalism, then it stands to reason that we are subject to a deeply ingrained desire for more: a state of being where more exchange, more value, or more trade equals more personal worth,” Grosser writes. “Personal worth becomes synonymous with quantity.” Grosser notes that Facebook’s “like” metric “functions as a form of symbolic capital, as a unit of trade in the recognition and prestige within one’s social group.”

In recent years, the function of quantified social metrics has grown beyond the symbolic. Social media skills don’t merely lead to an increase in online visibility; they make it easier to access material resources, whether that’s advertising revenue from a successful YouTube channel, a job opportunity from a viral tweet about getting unjustly fired, or a lucrative pool of potential donors to a medical campaign. But unlike geographically specific offline communities, whose robustness may also ensure the material wellbeing of their members, social media connect users to potential resources in exchange for data extracted from their activity. The overarching goal of any for-profit online network is not to facilitate bonds, but to serve advertisements with increasing algorithmic precision.

The system as it stands has no incentive to reward users who speak to each other, learn from each other, or forge meaningful bonds online. The system individuates. It wants only to know each user’s preferences and investments so that it might direct attention to third party advertisements — and, of course, so that the user might be lured back to the network to refine their profile and see more ads. The sociality of each network is not its focus, but its bait. Everyone wants to be where their friends are. Everyone would like to be heard.

A 2016 study conducted by researchers at UCLA found that teenagers were more likely to derive pleasure from looking at images they perceived as popular, even if the images themselves were mundane. “When the youngsters viewed images that had a lot of likes, there was greater activity in neural regions of the brain involved with reward processing, social cognition, imitation and attention,” wrote Roni Caryn Rabin in an article about the study for the New York Times. “Teenagers were more likely to give a like to an image that had already gotten dozens of likes, even if it was a fairly banal picture of a plate of food or a pair of sunglasses. They were less apt to like the same kind of image if it had gotten few likes.”

The prioritization of the like itself over what is being liked can steer the behavior of social media users. In a recent New York Times essay about the proliferation of fembots — both women pretending to be robots and robots pretending to be women — Amanda Hess notes that reality TV star and social media dynamo Kylie Jenner adjusts her online behavior in response to the number of likes she receives on a given post. “On her reality series, E!’s ‘Life of Kylie,’ Ms. Jenner has said that her followers drive her pressure to post more and better selfies to Instagram,” Hess writes. “She’ll delete images that don’t instantly please, turning her own image into a site of crowd curation.” While much offline social interaction comes with a certain degree of behavioral tailoring — most people won’t repeat a joke that turns out to be a dud among their circle of IRL friends — social media amplifies the sensation of performance and reward. Its mechanics can guide users to the biggest, brightest, most hyperbolic modes of expression, not as a way of fitting in with a group, necessarily, but as a way of earning the virtual currency of likes.

Social media-adjacent systems can also guide behavior in games like Fallout 4. When I am traveling with a companion who likes to see me pick locks, I’ll make sure I’m in the character’s line of sight before I jostle a door open. I’ll even pick locks I don’t need to pick, just to score more affinity points. This habit serves no narrative function within the game, and it doesn’t make me feel closer to the character I’m trying to impress. I’m simply acting the way the system has motivated me to act.


After Twitter stopped showing tweets chronologically and started weighting tweets based on popularity, regardless of source, I noticed a change in the network’s tenor. I saw fewer conversational tweets, fewer quiet thoughts and subtle jokes meant for the enclosed audience of their author’s followers. Instead, my feed became front-loaded with tweets from people I did not follow that had already racked up thousands of likes. Often these tweets tended toward the hyperbolic: Something is wrong and something must be done about it by everyone, or something is so funny that the person sharing it is literally shaking in literal tears. Sometimes these tweets would be rehashings of tweets I’d seen some months ago, their language blunted for maximum impact. Sometimes they’d be a near-replica of a tweet I’d seen earlier that day.

From what can be seen of their end results, social media algorithms prioritize mass over specificity, rewarding users who wield the kind of language that can launch a tweet to thousands of impressions. Big gestures dominate the feed, while small moments may get lost in the shuffle. A few years ago, Twitter for me felt more like the JavaScript chatrooms I’d used as a tween: a semi-private space where a group of people with overlapping interests could convene at will and speak to the room about nothing in particular. It was less confrontational than a text or an email — you were only speaking to people who had shown up to be spoken at, and there was no real obligation for anyone to reply. Now, with its feed punched out of chronological time, Twitter feels to me less like a forum and more like a TV channel, partially live but rife with pre-recorded messages designed for maximum impact, blaring asynchronously alongside notes from people staring at the screen at the same time I am staring at the screen. Even organic, unpaid tweets have taken on the tenor of advertising, once they snowball into something that the algorithm has decided I should see.


There’s a computer game a lot of people on my Twitter feed play, a charming independent RPG called Stardew Valley. Developed by a single designer who goes by ConcernedApe and released in early 2016, the game begins with a prelude in which the player character’s grandfather dies, bequeathing them an old farm in a small town. The player leaves their soulless office job at a fictional corporation that’s coded in-game as something like Walmart, but could easily be Amazon or Google or any kind of far-reaching, community-disrupting corporate powerhouse. They move to the small town and work the farm out of disrepair. Their goal, as dictated by the grandfather in his will, is to bond with the members of the community to which they’ve relocated, to shed capitalism’s habituations and build a meaningful life.

The goal is a noble one to which the game itself struggles to ascend. Though Stardew Valley comes replete with an idiosyncratic cast of characters, it limits itself to a transactional bonding mechanism: To form connections with neighbors, the player must give the correct gifts. Each character has a preordained list of objects they love and hate to receive; the player has to discern each character’s preferences through trial and error, or by paying close attention during conversations. Dialogue with non-player characters may vary depending on the time, season, and scenario, but conversation itself adds negligible points to the relationship meters the game tallies next to its inventory screen. You can talk to people all you want, but it won’t make them like you. To make friends, you have to give people shit.
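The gift mechanic described above can be sketched schematically. The character name, items, and heart values here are invented for illustration, not Stardew Valley's real data:

```python
# Hypothetical gift-based friendship system: preordained loves and hates,
# a heart meter per neighbor, and conversation worth almost nothing.
GIFT_TASTES = {
    "Penny": {"loved": {"poppy"}, "hated": {"rotten_plant"}},
}

hearts = {"Penny": 0.0}

def give_gift(npc: str, item: str) -> None:
    """The primary way to raise the heart meter is a material transaction."""
    tastes = GIFT_TASTES[npc]
    if item in tastes["loved"]:
        hearts[npc] += 0.8
    elif item in tastes["hated"]:
        hearts[npc] -= 0.4
    else:
        hearts[npc] += 0.2  # even a neutral gift buys a little affection

def talk_to(npc: str) -> None:
    """Conversation adds negligible points compared to the right object."""
    hearts[npc] += 0.05

give_gift("Penny", "poppy")
talk_to("Penny")
```

A session of nothing but talking would take dozens of conversations to equal a single loved gift, which is the transactional asymmetry the essay critiques.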

Systems like Stardew Valley’s sociality model reiterate the capitalist notion that one’s human worth is equivalent to one’s capacity for productivity. By displaying the player’s connections with other characters via heart meters, and encouraging them to fill the meters, the game likens sociality to a form of currency. Bonding with NPCs is much like climbing a corporate ladder: The player impresses other characters with material transactions in order to progress along certain gameplay branches. Rather than community-building, whose work tends to be reciprocal among participants and unmotivated by financial gain, Stardew Valley simulates networking.


In a 2014 essay for Paste, Austin Walker critiques systems that steer players inevitably toward violence, even in games with relatively complex social systems. “I can’t touch anyone. This has been bugging me since I started playing Watch Dogs,” he writes. “I want that so much more than the ability to do harm, but it’s all I can do. And here, then, is the largest problem with these systems as they stand. No matter how many songs the Orcs of Mordor sing, no matter the desperation of the out-of-work Chicagoan teacher, all I can do is hurt people.”


In Stardew Valley and in Fallout 4, I can help people. I can return a stolen family heirloom and I can forage leeks for an elderly man in a wheelchair. But I still can’t touch anyone. These games so rarely give me the chance to speak; when they do, I have to choose from a handful of scripted responses, some of them aligned along a clear positive/negative binary. I can be a jerk, or I can be kind. I can’t keep a conversation going once an NPC has decided it’s over. I can’t do anything for them beyond letting them speak their lines, picking locks in front of them, or giving them the objects they like.

Compared to games in which the only available interaction with NPCs is murder, Fallout 4 and Stardew Valley entertain the possibility of positive in-game sociality. And yet both cast sociality as a series of points to be scored by performing for NPCs within narrow parameters. There is nothing I can do with these characters; there is only what I can do for them. Both games ostensibly take place somewhere outside capitalism — Fallout 4 after capitalism’s fall, Stardew Valley in an isolated locale minimally inflected by consumerism — but still structure their social interactions according to capitalism’s logics. Affection and its in-game rewards are traded for the player’s willingness to jump through a specific set of hoops.

What would it look like for a game to simulate not just the accumulation of approval points on social media, but sociality as a broader whole? What, for that matter, would social media look like were it not designed to reward the massive accumulation of approval points above all else? I am imagining a connection that does not look like money, one with space for variations in tone and affect, one whose goal is not necessarily “more.” These bonds form on social media all the time, but between the lines of the system’s intended purpose; friendships flourish in the cracks, in enclaves known as “weird Facebook” and “weird Twitter,” for instance, whose denizens use the platforms so chaotically as to be rendered all but illegible to ad-tracking devices. Outside the algorithm’s gaze, under pseudonyms and cartoon avatars, users redirect their online activity away from, say, building a Twitter following impressive enough to net a spot on a prestigious masthead or networking with industry gatekeepers. Because their presences are not tied to their government identity or their jobs, people who use social media in this way find space to simply play, and to bond with each other without simultaneously pursuing material reward.

“What would videogaming look like if it rejected the machine as a model for play, if more games incorporated gratuitous moments of relaxation from their constant, accelerated striving? Or if more games did not treat us as employees but as autonomous co-creators?” asks critic Steven Poole in the 2008 essay “Working for the Man.” These, perhaps, are the kinds of questions that sociality-simulating games should be asking. I’d like less goal-chasing, less methodical progress toward maximum hearts. I’d like more reciprocity with the characters with whom I’m supposed to bond. I’d like to see them not as coworkers in a labor-driven system, but as co-conspirators in leisure and disobedience. I’d like more surprises from them. I’d like more chaos.


This essay is part of a collection on the theme of POSITIVITY. Also from this week, Olivia Rosane on how solarpunk aims to cancel the apocalypse, and Hanif Abdurraqib on online’s small, sustaining joys