Calling themselves Dadabots, researchers CJ Carr and Zack Zukowski have released several albums of AI-generated music, including a death metal album called Coditany of Timeness (lauded as “potentially enjoyable” in this Outline piece by Jon Christian). Their approach involved training a neural net on samples derived from the metal band Krallice, so that it could learn what sorts of sounds plausibly follow what other sorts of sounds within the training set and then generate them. In this research paper describing their method, they admit that this approach sometimes yielded “long silences.” Trained on death metal, some models would learn to be quiet.
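The mechanics can be illustrated with a toy sketch. (Dadabots actually trained a SampleRNN-style neural net on raw audio; the Markov chain below, with made-up sound tokens, only shows the shape of next-sound prediction — including how a model can wander into a “long silence” it never leaves.)

```python
import random
from collections import defaultdict

def train(sequence):
    """Record which sound-token follows which in the training data."""
    model = defaultdict(list)
    for current, following in zip(sequence, sequence[1:]):
        model[current].append(following)
    return model

def generate(model, start, length, seed=0):
    """Sample a new sequence, one plausible next token at a time."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < length:
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return out

# Hypothetical training data: because "quiet" is only ever followed by
# "quiet", any generated sequence that reaches it stays silent forever.
riff = ["blast", "growl", "riff", "blast", "growl", "quiet", "quiet", "quiet"]
model = train(riff)
clip = generate(model, "blast", length=8)
```

The point of the sketch is only the dynamic: the model has no notion of “metal,” just conditional likelihoods, and silence is as learnable a pattern as a blast beat.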

Recently, as a follow-up, Carr and Zukowski have made a YouTube channel named Relentless Doppelganger, which they describe as a “neural network generating technical death metal, via livestream 24/7 to infinity.” In the notes posted below the stream, they call it part of their “research into eliminating humans from metal.” That is possibly the most metal thing one could do — the apotheosis of the genre’s constitutive nihilism. Hideous Gnosis, the proceedings from the Black Metal Theory Symposium, offers some useful context on that front: “Black Metal is also about self loathing, for modernity has transformed us, our minds, bodies and spirit, into an alien life form; one not suited to life on earth without the mediating forces of technology, culture and organized religion,” Steven Shakespeare writes. “We are weak and pitiful in our strength over the earth—in conquering, we have destroyed ourselves. Black Metal expresses disgust with humanity and revels in the misery that one finds when the falseness of our lives is revealed.” As I’m writing now, there are 17 people listening to the Relentless Doppelganger stream, and the chat is a bit listless.

It can seem like spontaneously generated death metal is implicitly mocking the genre, its performers, and its fans: It is so tuneless and formulaic, even a machine could make it! From that perspective, the music isn’t meant to be listened to, per se, any more than Kenneth Goldsmith’s procedural “uncreative writing” stunts — where he transcribes weather reports or an entire newspaper or as much of the internet as he can print out — are meant to be literally read. Rather they are conceptual objects for contemplation at an aesthetic distance. You don’t necessarily need to listen to the stream to grasp the metaphoric implications of an endless onslaught of aggressively abrasive non-music produced by algorithms. Here they are, grinding away the last remnants of human creativity as we are forcibly integrated into a world administered by machines and geared toward their smooth functioning and not our own.

Actually appreciating death metal might get in the way of one’s appreciation for the stunt and its possible meanings. Some might read the project as an invitation to feel superior to those who purport to enjoy death metal and find creativity and aesthetic merit in it — they are just dupes, perhaps caught up in the optics of rebellion. But the Relentless Doppelganger exposes the ruse: Just as no one is supposed to actually listen to this algorithmic music, no one really should listen to death metal either.

That doesn’t seem to be Carr and Zukowski’s intention — they seem to genuinely like metal and appear more interested in augmenting human composing techniques with AI, much as Robbie Barrat discusses here with respect to fashion. However, statements like “we demonstrated that creating music can be as simple as specifying a set of music influences on which a machine learning model trains” seem open to interpretation: One might take away that genre music is “simple,” and the people who are content to consume it are simpletons.

But only a simpleton would think that. Every genre is complex: Its loosely codified principles become a stable backdrop against which more and more intricate variations can appear. It seems a common experience to think every song in some genre sounds the same, only to discover the genre’s richness as you gain more familiarity with it. Suddenly you can hear the significance of aesthetic choices made within the formal constraints; suddenly a genre becomes a tapestry of nuances meant primarily to reward connoisseurs, not beguile philistines capable only of consuming formula.

Projects like Relentless Doppelganger, though, let listeners consume a genre not in its subtleties but in and of itself. Genre appears as fundamentally a system of formal rules rather than a set of songs or particular performers with particular competencies and ideological inclinations. In other words, genre is boiled down to an algorithm, which can have the additional effect of seeming to strip away the genre’s semiotics. In consuming streams rather than songs, one’s attention is directed toward the underlying patterns being reproduced in the content, and the assumptions being made to generate the flow.

This seems to me true of all the various algorithmic products we now routinely consume, not necessarily as infinite death metal but as the Facebook news feed or Twitter’s top tweets or YouTube’s recommended videos and so on. Content consumption online is oriented by sorted feeds toward consuming genre, experienced as a flow, rather than individual pieces of content, experienced as significant for the particular difference they express. Difference is subordinated to an overall sameness, a homogenization orchestrated at the level of the automated stream.

Since we seem to consume algorithms all the time, I had a hard time figuring out what political science professor Davide Panagia meant when, in this interview with the Los Angeles Review of Books, he claimed that “we don’t experience algorithms. We experience inputs and outputs. But not algorithms.” Maybe that means algorithms are a metastructure that merely orders or filters other kinds of media and not a medium themselves. But that seems a bit like saying we don’t experience poems, just words being ordered in different ways. Or maybe his point is that the output of an algorithm works to conceal its operation, though it seems that exposure to algorithmic systems eventually pushes algorithms to the fore. They are designed to make us dependent on them, but that dependency isn’t generated by algorithms being hidden from us.

Sometimes critics want to make algorithmic sorting a problem of awareness: They’ll point to the fact that many social media users don’t realize their feeds are algorithmically sorted as if that ignorance would convert to outrage automatically if they knew. But that’s far from a sure thing. In example after example, users have been shown to consume more of sorted feeds than “raw” chronological ones, but what does that mean? That algorithms are brainwashing us into mindless intensified consumption, or that users “prefer” algorithmic sorting, and the data reveals their preference? It may depend on your prior assumptions.

My usual argumentative move at this point is to claim that we consume the ongoing calculations of such algorithms as a way of consuming ourselves as an alienated product, as a dynamic representation, as a relentless doppelganger. In fact, the self posited by content-recommendation algorithms works similarly to the Dadabots music generator: Our behavioral data is scrutinized for patterns that allow an algorithm to predict what we are most likely to do next. We can then convince ourselves that this is precisely what we wanted — desire thereby follows from rather than precedes a course of action. Of course, it doesn’t always work that way: We ignore or reject many algorithmic recommendations, but this too affords a related satisfaction of proving our “humanity” in the face of computational totalization. Either way, these kinds of algorithms conflate who we are with how they work; this invests us with an interest in immersing ourselves in them, learning how they operate and allowing them to take on a sort of material form through the way they affect us. To my mind, that is “experiencing algorithms.” We can come to think of ourselves as a genre, a set of rules, a series of filters, a certain set of plausible associations and exclusions.

The algorithmic approach to behavior is, as Brian House details in this essay about smart cities, reminiscent of Vito Acconci’s Following Piece, a 1969 performance for which he gave himself these rules: “Each day I pick out, at random, a person walking in the street. I follow a different person everyday; I keep following until that person enters a private place (home, office, etc.) where I can’t get in.” As House notes, Acconci basically surrendered himself to a simple algorithm (it wouldn’t be too hard for a neural net to decode these correlations) and accordingly felt like an automaton: “I don’t have to control myself,” Acconci wrote. “I am almost not an ‘I’ anymore; I put myself in the service of this scheme.”
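Acconci’s score is mechanical enough to translate almost line for line into code. A hedged sketch, with entirely invented pedestrians and routes (not a reconstruction of the 1969 performance):

```python
import random

PRIVATE_PLACES = {"home", "office"}  # "(home, office, etc.)"

def following_piece(passersby, seed=1969):
    """Pick a person at random; follow their route until they enter
    a private place "where I can't get in" — then stop."""
    rng = random.Random(seed)
    person = rng.choice(sorted(passersby))
    followed = []
    for place in passersby[person]:
        if place in PRIVATE_PLACES:
            break  # the day's performance ends here
        followed.append(place)
    return person, followed

# Hypothetical data: each passerby's route through the city that day.
passersby = {
    "stranger_1": ["street", "park", "cafe", "home"],
    "stranger_2": ["street", "subway", "office"],
}
person, path = following_piece(passersby)
```

Everything consequential happens in the data, not the rules — which is exactly the sense in which Acconci could stop being an “I” and put himself in the service of the scheme.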

That might sound like a bad thing, but surrendering control to a system or a habit or a set of rules also can provide a great deal of relief. It’s easy to experience that sort of surrender as a kind of self-aggrandizing convenience. This essay by Colin Horgan suggests a “tyranny of convenience” at work in contemporary society in which our short-term thinking leads to ill-conceived trade-offs of “frictionlessness” for a loss of individual privacy and autonomy. Horgan suggests that “nobody seems to want” the “dystopian world tech companies are building,” which implies that we were somehow tricked into consenting to it, one expedient terms-of-service acceptance at a time. This is equivalent to assuming that no one could enjoy the death metal generator, or anything generic — that genres always operate without depth, as superficial tricks.

But to see convenience as a shallow trick would be to overlook its masochistic appeal: the way it makes us feel like the tyrant by depriving us of choices. Convenience is a paradoxical kind of self-centeredness that makes exercising one’s will feel like an unreasonable and compromising amount of effort to have to put forth. But the paradox is the point: It allows us to feel as though we are taking up more space (in that we are the specific center of a decision-making apparatus) by doing less (the decisions are made for us and we obey). Algorithms are a more pointed implementation of this in that they leverage personal data to make your surrender even more specific to your circumstances.

If we could hear what convenience sounds like, it would be comforting to pretend it sounds like the Relentless Doppelganger: something easy for us to dismiss, to render other, to see as made for someone else, someone with less discerning or more esoteric tastes. But convenience probably sounds more like the silences that Dadabots’ algorithms tended to generate without human supervision and intervention: It’s what you hear when you don’t listen to anyone else’s voice or consider anyone else’s needs as real. Perhaps the predictive algorithms will eventually find that the most likely thing for us to do next is nothing at all.