The media businesses that have relied on brokering attention are mostly shrinking. The cable TV model is teetering on the brink of collapse; newspapers and magazines have spent decades consolidating and cutting staff and coverage to no avail; the movie business has bifurcated its own markets while shifting its attention almost entirely to China and other emerging markets; and the music business has shrunk so dramatically (and been so thoroughly “disrupted” by Spotify, Apple Music, and other streaming platforms) as to be an almost unrecognizably different industry.

Video games, on the other hand, are flourishing. Their market and market share are growing every year. That is because, despite some similarities to traditional media, the games industry has structural, ideological, and technological factors that make it particularly well adapted to an increasingly polarized consumer economy.

It may not be the case that ads aren’t working; it could be that there’s no one to sell the products to

On its face, it would seem absurd to claim that the popularity of video games might reflect a shift away from conventional consumerism — that is, the system whereby mass consumption rationalizes and justifies wage work and consumer goods are the major engine of the economy. After all, games, understood mainly as products, seem to support those established business models so well. They rationalize the purchase of ever more expensive computer and network hardware and assure the serial obsolescence of consoles every three to five years. Games generate endless franchise sequels and are supported by a boosterish press focused on telling consumers which games to buy. The games themselves have seemingly no use beyond their pleasurable consumption, and produce in their wake legions of “gamers” whose identity, like that of consumers more generally, is linked purely to their consumption habits, patterns, and preferences.

In the early days of video games, in the 1980s and ’90s, the cultural products they most resembled were toys and models: a somewhat niche cultural market that made money almost exclusively through the sale of the commodities themselves. Games and the means to play them were mostly sold in toy stores. This longstanding association with toys has contributed to an aggrieved defensiveness among gaming’s adult fans about its cultural importance, an attitude that still drives much of video-game criticism and debate. (Are they art?)

This defensiveness about video games has proven a significant boon to developers, as organized gamer-rage has helped crush union efforts and assisted in generalized labor discipline across the industry. But this embattled defensiveness about the medium qua medium has also meant that many of the fans, consumers, and even harsh critics of video games still identify the health of their favorite hobby with the health of its biggest studios, which is to say, with the profits of the industry’s owners. (This has also played out in “console wars”; loyalty to certain studios, brands, and franchises; and other perverse forms of “fandom.”) Because the consumer base remains committed to the health of the “video game industry” long past the point at which it needs any defending, fans often accept or even support and rationalize increasingly exploitative and predatory models of sales and consumption.

And that’s been important for video games’ growth, because, as a form of cultural entertainment, they push forward a different model from the one that supported most 20th century media. For most of the last century, media businesses were basically parasitic on marketing: They predominantly made money from ads for other products rather than the direct sale of their own. Media companies gathered and held an audience’s attention with entertainment, and then showed that audience ads that promoted brands and products and naturalized the idea that what we consume constitutes who we are or how we are seen. Advertising and entertainment programming were integrated but largely distinct: Without the pleasures of television, for instance, most people wouldn’t agree to sit through 18 minutes of advertisements every hour.

Social media and web-based “traditional” media have, for the most part, attempted to continue the model, but they are failing: Advertising-based business models are collapsing. Part of this has to do with how metrics have merely underscored the inefficacy of pay-per-click advertising models, but the real problem is the collapse of America’s consumer base. Technological changes have facilitated (and have been used to accelerate) the automating, globalizing, and deskilling of production labor, a 45-year-and-counting process, and this has decimated the spending power of the middle and working classes. In 2012, the top five percent of income earners in the U.S. accounted for 39 percent of consumption spending, most of that on luxury goods (expensive jewelry, vehicles, fine art—purchases that are also often investments). For the bottom 60 percent of Americans — who account for less than 15 percent (!) of national spending — the biggest expenses by far are food, housing, transportation, medicine, and debt servicing. In other words, there is a much smaller market for nonessential consumer goods than there used to be. So it may not be the case that ads aren’t working; it could be that there’s no one to sell the products to.

In the world of platforms, consumers become less like owners of commodities and more like workers who produce their own entertainment as self-exploitation

In a world of growing consumption inequality, consumer purchases like cars, appliances, vacations, and the like fall out of reach for more and more people. They were part of a consumerism that featured a range of products and brands, from low-end to mid-tier to high-end, which required and facilitated their own assembly lines, showrooms, and distribution networks. But as the middle falls out and people at the bottom spend less and less on consumer goods, supporting production at those differing scales just isn’t profitable. When the top 25 percent of Americans are doing nearly 80 percent of the consuming, it just doesn’t make sense for most companies to market to anyone but the rich.

But the video games industry is particularly suited to a more unequal society. As with all digitally managed products, the games industry facilitates rent-to-own models, but those models make more immediate sense in the context of gaming than in other cultural industries dominated by digital rights management. Apple removing a copy of Spy Kids 2 from your laptop as a result of a rights dispute just feels like they’re taking something from you. But when game developers remove content or discontinue support for a video game through updates, changes, or sequels, such clawback behavior suddenly feels technologically sound, reasonable, and part and parcel of making gameplay more enjoyable. And this continued, networked control of the product by developers facilitates more and more micro-transactions — technologically advanced innovations on an economic model that plagued the working classes in times of more explicit exploitation. (Think of the early 20th century electricity meters that apartment dwellers had to put a coin in to operate.) Games are products-as-platforms to the highest degree, allowing for an ongoing extortion of users on a variety of fronts, making for the perfect product to pull every last penny out of stressed-out, broke, precarious people in the name of fun and relaxation.

As we spend more of our money in digitally enhanced marketplaces, the fluidity, rate, and volume of economic transactions increase, even as the cost of each individual transaction decreases. This lends itself well to systems that are capable of ever more fine-tuned economic stratification. As paywalls and privileges proliferate around more and more of the things we consume, and as the possibility of sharing the things we buy decreases through digital rights management and other technological shifts, our ownership of even these impoverished commodities becomes less reliable and less complete. Video games make this logic seem immanent, reasonable, even fun.

The stark class divide made plain by income inequality and spending disparities in the broader economy is reflected within games themselves, in a way that other cultural markets are increasingly mimicking. “Prestige” games — the games written about by games journalists and considered “serious” works of culture — tend to cost between $30 and $60 and run on systems that cost hundreds if not thousands of dollars. These are predominantly pointed — ideologically and culturally — toward what remains of the middle classes. Centering the psychological motivations of individual protagonists, they offer story-driven narratives of hard work and perseverance leading to triumph. This is the heroic individualism of the American Dream or the crypto-fascistic hero myth of “chosen ones” who rise up to save the world. They offer massive, spectacular feats of simulation and fantasy, huge amounts of money and labor embedded as graphics, “things to do,” hours of play, or complexity of systems. Some of these games are surrounded by competitive e-sports infrastructures that make them even more “serious,” as they exist as both hobby and spectator sport.

When the more expensive gaming systems are available to working class people, it’s often under lease or rent-to-own agreements that mean, over the life of the system, they pay significantly more than those who can afford an upfront investment. A PS4 Pro will set you back $400, but if you get one from Rent-a-Center you’ll ultimately spend $1,000 on it — and if you miss a few payments you’ll lose it altogether, along with whatever content you purchased and installed. For the working class, ownership of the means of even our own entertainment is increasingly limited.
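The premium described above is easy to put in concrete terms. As a rough illustration: the $400 retail price and roughly $1,000 rent-to-own total come from the figures cited here, while the weekly payment amount and lease term below are hypothetical numbers chosen only to match those endpoints.

```python
# Rough illustration of the rent-to-own premium, using the figures cited
# in the text. The $400 retail price and ~$1,000 total are from the text;
# the weekly payment and term are hypothetical values chosen to match.
UPFRONT_PRICE = 400       # PS4 Pro at retail
WEEKLY_PAYMENT = 20       # hypothetical Rent-a-Center weekly payment
WEEKS = 50                # hypothetical lease term

rto_total = WEEKLY_PAYMENT * WEEKS     # total paid over the life of the lease
premium = rto_total - UPFRONT_PRICE    # extra cost of paying over time

print(rto_total)   # 1000
print(premium)     # 600 -- a 150 percent markup over buying outright
```

On terms like these, missing payments late in the lease means forfeiting both the console and nearly everything already paid toward it.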

The rent-to-own model is so profitable that on some fronts, it is being extended to everyone: Microsoft is beginning to sell its Xbox hardware through a monthly “All Access” subscription service, and technologists increasingly look toward future gaming systems being streamed completely from the cloud. Following in the footsteps of digital-rights-management systems that allowed Apple, Amazon, and Adobe to revoke your access to content on your devices, erstwhile video game consoles are increasingly becoming strictly digital platforms: Nintendo recently announced the closing of its Wii store and servers, meaning that, after a certain date, people will no longer be able to download or access content they’ve purchased. In the world of platforms, consumers become less like owners of commodities and more like workers who produce their own entertainment as self-exploitation, renting the owner’s equipment under increasingly complex and opaque systems of digital “ownership.”

But the majority of games that are played are not these “prestige” games and systems. Most are not made into a leasable property that supplies a steady stream of rents. Most of the games actually played in the world are cheap or free-to-play — a crucial factor in Fortnite’s success — and are played on phones, browsers, or low-end computers. These games, more readily available to working-class gamers, have tended to be much more nakedly repetitive, attempting to inculcate and prey on compulsion: endless runners, match-three games, slot machines, gacha games, and clickers. These games are littered with micro-transactions, hidden fees for progression, advertisements, and addictive casino-like spending mechanics. As a result, over a significant amount of play, they end up being much more expensive than the $60 prestige experiences — but available to people who couldn’t easily spend that money up front.

As the possibility of sharing the things we buy decreases, our ownership of these impoverished commodities becomes less reliable. Video games make this logic seem reasonable, even fun

And as with the rent-to-own strategy around consoles, these exploitative techniques are increasingly coming to dominate prestige games as well. One of the most important video game news stories of 2017 was a controversy over micro-transactions. Anger about the practice came to a head around the release of EA’s 2017 Star Wars shooter Battlefront 2, which, despite its $60 price, paywalled a tremendous percentage of the game’s content behind loot boxes—a casino-like mechanic in which you buy a blind box, knowing all the possible things you could get but with no guarantee of any particular one. The tent-pole game faced a massive backlash and sold far worse than EA needed it to. The backlash got so bad that legislation has appeared in multiple countries to regulate or outright ban loot boxes.

But while many game companies have been chastened, the industry’s reliance on micro-transactions isn’t going anywhere. That’s because micro-transactions are not simply a canny maneuver by video-game companies; video games as a medium lend themselves to these economies. Micro-transactions are one of the foundational economic models of video game play, dating back to arcade machines that required players to pump in quarters to continue.

It is hard to imagine any equivalent sort of repetitive purchase for novels, TV shows, or movies: Their seriality requires the wholesale production of new content — new episodes or sequels that demand another marketing push, another massive expenditure on par with the first. Even merchandising has limits — eventually you’ll need to put out a new movie for new characters to sell toys of. But a single game can keep adding new content, assets, and characters indefinitely; it can tweak core gameplay and update the graphics; it can itself become and remain an active marketplace without developers having to reinvest the time, money, and creativity it took to make the game in the first place.

It takes only a few lines of code to change things that can significantly alter players’ experiences: in-game costumes, skins for avatars, stat boosts, extra lives or continues, faster character unlocks, in-game currency, new downloadable content. This demonstrates how video games are well situated to exploit new models of digital delivery — not only for the games themselves but for in-game experiences — in ways that other cultural products simply are not, because of the necessarily bounded, “finished” nature of those products at the point of consumption.
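The low marginal cost of this kind of content can be sketched in miniature. In many live-service games, a new purchasable cosmetic is effectively one more entry in a data table the client already knows how to render and sell; the catalog, item names, and prices below are entirely hypothetical.

```python
# Hypothetical sketch: a new monetizable item as a data entry, not new
# engineering. Names and prices are invented for illustration only.
store_catalog = {
    "skin_knight_red": {"type": "skin", "price_coins": 800},
    "boost_double_xp": {"type": "stat_boost", "price_coins": 300},
}

# "Shipping" a new cosmetic is one line of data -- the rendering, storefront,
# and payment plumbing already exist:
store_catalog["skin_knight_gold"] = {"type": "skin", "price_coins": 1500}

def purchase(wallet_coins, item_id):
    """Deduct the item's price from the player's wallet if affordable."""
    item = store_catalog[item_id]
    if wallet_coins < item["price_coins"]:
        raise ValueError("insufficient coins")
    return wallet_coins - item["price_coins"]

print(purchase(2000, "skin_knight_gold"))  # 500 coins left
```

Compare this with a film franchise, where offering a new “product” means producing a new movie: here each additional item is nearly pure margin, which is exactly what makes the marketplace model so attractive to studios.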

The “prestige” games industry has changed rapidly to accommodate this: Major studios are releasing fewer games every year but spending much longer supporting and producing content for already released or “continuing” games, fostering communities of consumer-players who might spend hundreds or even thousands in in-game marketplaces over time.

The virtuous cycle of the post-war era, in which consumption led to more production, which led to higher wages, which led to more consumption, is long over. The post-1970s solutions to this sag in real consumer power — cheap credit, financialization, price-reduction through globalization — have reached their limits. The “democratization” of capitalism that Cold Warriors always espoused — the relative consumer wealth of the imperial heartlands, every (white) family with their own house and car, a TV and a washing machine — is a fantasy now as distant as the Cold War.

The depression that even many mainstream economists agree is coming this year will smash much of what remains of the middle classes. Video games are designed to survive that crash. They rationalize and reify some of the ugliest outcomes of economic hardship: More dependence on the market and its global supply chains (because games have to be graphically cutting edge and are increasingly played and supplied online), less and less personal control and autonomy over our time (because games are as much compulsive as escapist and entertaining), and increased and increasingly stratified inequality (because games bring a new immediacy to the logic of class distinction that used to work itself out more slowly and less pervasively in conventional consumer goods). Video games, consequently, are the perfect cultural vehicle to help 21st century technology secure the return of a 19th century economy.