The Organic Myth

There’s no type of content that is natural to social media

When the U.S. Congress held a hearing about election-related abuse of Facebook’s platform ahead of the 2018 elections, there was enough talk about “feeds” and “organic content” that you’d be forgiven for mistaking it for an agricultural convention. “What percentage of the content on Facebook is inorganic?” Senator Kamala Harris asked Facebook’s chief operating officer Sheryl Sandberg, after Sandberg deflected a question about ads on the network. Implicit in the question is the idea that something can be organic to the platform, something that “naturally” happens.

The word organic suggests that everything that happens on a platform is naturally ordered, magically appearing with no intervention

The myth of “organic” experience has been sold by platforms (here Twitter touts “organic Tweet analytics”) and tech companies (Google distinguishes in this SEO starter guide between “organic” and paid search results) for almost two decades. But just as much of the food we eat today is engineered, so are the online feeds we consume on a daily basis. The word organic tends to be used in a tech context to differentiate between advertising’s “paid reach” and the other kinds of content that appear in one’s feed, but its implications go beyond that: It suggests that everything that happens on a platform is naturally ordered, magically appearing with no intervention — as if once you removed the ads, what would remain is a garden of wildflowers growing on the wings of nature.

The politics surrounding “organic” food are messy. The term designates no specific or naturally occurring quality; rather, the rules governing what may be labeled organic vary from country to country. People often assume that organic farming uses no pesticides or that it is better for the environment, but neither is necessarily the case. While some researchers argue that organic food is healthier, others find little difference.

When it comes to the “organic” ordering of our media feeds, the situation is even murkier. Unlike the agriculture industry, big tech and the advertising ecosystem that fuels it have no clear, let alone enforced, operating standards. We know very little about what goes into making and organizing our feeds. Interfaces hide how they work and the “chemicals” involved in growing our engagement.

Interfaces typically offer us tools such as likes, shares, comments, and retweets to give us the impression that these are sufficient for understanding how platforms work. But these are just a small part of the data used to analyze our activity and then shape our experience and influence our behavior. The mechanisms that actually dictate what appears in our feeds involve a multitude of elements, including the algorithms that arrange what will connect with whom, at what time, and how, as well as the automated advertising market that monitors us and bids on space in our feeds in auctions that take milliseconds. For example, as I show in my new book Media Distortions, Facebook spies on, measures, and records many of users’ “silent” actions and decides which ones to make visible according to what will yield more profit.

In other words, platforms don’t just moderate or filter “content”; they alter what registers to us and our social groups as “social” or as “experience.” Their design influences behavioral patterns across different conversations, groups, and geographical areas, with different frequencies and paces. Given these different orchestrated (re)organizings, it may be better to think of how feeds are ordered not as “organic” and “inorganic” but in terms of their rhythm, or as I call it, “rhythmedia.” This concept describes how media companies order our tempo-spatial experience according to economic purposes. Rhythmedia shapes the way we engage with information and others and how we understand what we can do in mediated environments.

Facebook conducts experiments in search of an elusive balance between engagement and well-being, as though it were a trade-off

Rhythms have a conductor (or several) who shapes how elements turn into a tune. Similarly, platforms orchestrate people, objects, and their relations in a rhythm that privileges behaviors that lead to more engagement — those that are more emotional, sensational, or repetitive — while filtering out those that are not. This shapes our sense of what “engagement” is and how to value different types of sociality. It engineers how we understand and engage with various topics while manipulating our emotions and sense of time.

This can be clearly seen in the series of experiments Facebook conducted (described here), in which it asked some users whether posts they saw in their feeds were “good” or “bad” for the world and found that the posts with the most reach (the ones that were most frequently repeated) were considered bad. When Facebook designed an algorithm to predict which posts users would consider “bad” and to suppress them in feeds, it reduced “negative” content but also decreased the number of times people opened Facebook, a metric called “sessions” that derives from the ad industry. This finding prompted the company “to try a different approach,” intentionally changing an algorithm that gave people a “good” experience (whatever that may be) in favor of one that made them open the app more. This shows how Facebook conducts ongoing experiments in search of an elusive balance between engagement and people’s well-being, as though it were a trade-off.
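To make that trade-off concrete, here is a minimal, purely illustrative sketch of what demoting posts that a classifier predicts users would call “bad for the world” might look like. The class names, scores, and the demotion_strength knob are my assumptions for illustration, not Facebook’s actual ranking system.

```python
# Hypothetical sketch of the feed re-ranking experiment described above:
# demote posts a classifier predicts users would rate "bad for the world."
# All names and weights are illustrative assumptions, not Facebook's system.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    engagement_score: float   # baseline ranking signal (clicks, reactions, etc.)
    p_bad_for_world: float    # predicted probability users would rate it "bad"

def rank_feed(posts, demotion_strength=0.5):
    """Order posts by engagement, discounted by the predicted 'bad' probability."""
    def adjusted(post):
        return post.engagement_score * (1 - demotion_strength * post.p_bad_for_world)
    return sorted(posts, key=adjusted, reverse=True)

feed = rank_feed([
    Post("a", engagement_score=9.0, p_bad_for_world=0.8),  # sensational, likely "bad"
    Post("b", engagement_score=5.0, p_bad_for_world=0.1),  # mundane, likely "good"
])
print([p.post_id for p in feed])
```

Raising demotion_strength suppresses more of the “negative” content, but, as Facebook’s own experiments suggest, it can also cost “sessions,” which is why the company reportedly changed course.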

“Sessions” are a particularly important metric for Facebook. A few years ago, the company conducted research, with partial funding from the U.S. Army, to examine people’s behavior on the platform in short “sessions” — minutes instead of hours or days. The researchers found that such data as the time spent on stories, the number of different interactions during the first minute of the session, the time of day, the time since the previous session, and the number of notifications at the beginning of the session helped them predict how much and which type of content a person would engage with, what they would do within a session, and when they would return to the platform. This can be seen as a kind of tuning of users’ rhythms.
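A hedged sketch of how such session-level signals might be assembled into a predictive model follows. The feature names, toy data, and off-the-shelf regressor are illustrative assumptions, not the researchers’ actual method.

```python
# Illustrative sketch (not Facebook's code) of session-level features like
# those reportedly used: time on stories, interactions in the first minute,
# time of day, time since the previous session, and notifications waiting at
# session start. A simple regressor stands in for whatever model was used.

from sklearn.linear_model import LinearRegression

def session_features(session):
    return [
        session["seconds_on_stories"],
        session["interactions_first_minute"],
        session["hour_of_day"],
        session["minutes_since_last_session"],
        session["notifications_at_start"],
    ]

# Toy training data: past sessions and the minutes until the user returned.
sessions = [
    {"seconds_on_stories": 120, "interactions_first_minute": 3,
     "hour_of_day": 9,  "minutes_since_last_session": 480, "notifications_at_start": 5},
    {"seconds_on_stories": 30,  "interactions_first_minute": 0,
     "hour_of_day": 23, "minutes_since_last_session": 60,  "notifications_at_start": 1},
]
minutes_until_return = [45, 300]

model = LinearRegression().fit(
    [session_features(s) for s in sessions], minutes_until_return)
print(model.predict([session_features(sessions[0])]))
```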

The careful sequencing of content on social media platforms mirrors how television was programmed in the 20th century. In his book Rhythmanalysis (published posthumously in 1992), sociologist Henri Lefebvre shows how television worked to create what he calls “the media day,” by which the experience of time is divided into “hourly slices. The output (rhythm) changes according to intention and the hour.” That is to say, media companies orchestrate pieces of data to induce a desired rhythm in consumers, with specific economic intentions in mind. This happens especially with repetitions, which, according to Lefebvre, are instrumental in shaping us, turning viewers into particular sorts of subjects whose understanding and feeling of time follows television’s rhythms.

As with doomscrolling today, the “media day” doesn’t have a beginning or end. Rather it blurs viewers’ sense of time and space, immersing them into the medium. Likewise, in Television: Technology and Cultural Form (1974), Raymond Williams argued that television networks and advertisers shaped people’s experience by implementing what he called “planned flow,” which deliberately effaced the lines between content and ads to naturalize a feeling of uninterrupted immersion: Turn on (the TV), tune in, and engage more.

Williams argued that “broadcasting can be diagnosed as a new and powerful form of social integration and control,” even though it was consumed on individual sets by atomized viewers. “Many of its main uses can be seen as socially, commercially and at times politically manipulative.” This is similar to the experience of social media, which offers an illusion of individualized control that can mask the other levels of manipulation.

Television, as these writers’ analyses show, was deployed to sell us the “organic” myth: that there is such a thing as “natural” experience that unfolds spontaneously through our habitual navigation on media. This myth still conditions the effects of our media consumption; only now, with “social” platforms, the specific techniques have changed. Their multilayered ecosystem — which includes interface design and algorithmic ordering as well as human content moderation and user behavior itself — filters and orders various elements, such as when and where we engage with news, ads, our friends, celebrities, and politicians, so that we engage longer on platforms under their definition of sociality. Through this system, tech companies not only decide what kinds of behavior, content, and profiles are considered legitimate on their platforms but, more important, what type of sociality is possible.


It’s no secret that the main goal of platforms, especially Facebook, Twitter, Instagram, and Snapchat, is to keep users on their platforms for as long as possible. As artist Benjamin Grosser showed in 2011 with his Facebook Demetricator, such platform metrics as likes, comments, shares, and even time on site and the timing of when we post are displayed in ways designed to keep us on the platform. The same goes for notifications, which signal us in red and get pushed whenever someone has a birthday, comments, or approves an event. The technique of resurfacing older posts as “memories,” which Facebook introduced in 2013, uses nostalgia to secure engagement. These interface designs are meant to suck us back into platforms and make us stay as long as possible.

These sorts of interventions attempt to take control of our sense of time, and our sense of what needs attention now and what doesn’t. Where television established a planned flow, contemporary media platforms pursue forms of disruption and distraction as well. Because Facebook can (or at least claims to) predict when you’ll come back to the platform, how much time you’ll spend there, and what you’ll do, its interface design, its algorithmic feed ordering, and its timing of notifications can all be harmonized in an ongoing process of fine-tuning. So while television requires immersion and being in a certain space, online platforms combine doomscrolling with multiple types of interruptions, inviting us to come back at particular times.

These sorts of interventions attempt to take control of our sense of time, and our sense of what needs attention now and what doesn’t

As I show in a recent article, your rhythms and especially your repetitions (the frequency and pace of behaviors, interactions with others, self-expressions, etc.) matter to Facebook because they indicate preferences that the company can leverage in its advertising backend. Facebook’s ad auction enables companies and advertisers to buy rhythm interventions — how often they can intervene in people’s experiences, as with the “Reach and frequency” feature. By understanding what people do in each session, Facebook can both order the news feed to maximize engagement and regulate how many interventions companies will bid for and when, so that they are not too intrusive and yet still profitable.
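As a rough illustration of what “buying rhythm interventions” can mean in practice, here is a minimal frequency-capping sketch: an advertiser reaches a given user at most a set number of times per window, and the platform paces the rest. The class and parameters are assumptions for illustration, not Facebook’s “Reach and frequency” API.

```python
# Minimal sketch of frequency capping: limit how often a campaign can
# intervene in one user's feed within a window. Names and numbers are
# illustrative assumptions, not a real ad platform's interface.

from collections import defaultdict

class FrequencyCap:
    def __init__(self, max_impressions_per_week=2):
        self.max_impressions = max_impressions_per_week
        self.impressions = defaultdict(int)  # (campaign_id, user_id) -> count this week

    def can_show(self, campaign_id, user_id):
        return self.impressions[(campaign_id, user_id)] < self.max_impressions

    def record_impression(self, campaign_id, user_id):
        self.impressions[(campaign_id, user_id)] += 1

cap = FrequencyCap(max_impressions_per_week=2)
for session in range(4):  # four sessions by the same user in the same week
    if cap.can_show("shoe_ad", "user_42"):
        cap.record_impression("shoe_ad", "user_42")
        print(f"session {session}: ad shown")
    else:
        print(f"session {session}: ad withheld (cap reached)")
```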

Working in tandem with the automated and algorithmic methods of controlling user experience and establishing its rhythms are human moderators, who tune individual and group feeds to filter content that companies have decided to prohibit, often on the basis that it is anti-social. Content moderators’ work determines which behaviors and which people are deemed illegitimate and deviant, noisy and therefore removable. In her 2019 book Behind the Screen, Sarah T. Roberts notes that the details of content moderation policies have been “treated as trade secrets to be used internally, and ultimately functioned in the service of brand management for MegaTech.” That moderation standards are considered proprietary suggests they are less about “community standards” than about maintaining profitable engagement.

Working under extreme time pressure, commercial content moderators have only a few seconds to decide whether an image, video, post, or hashtag will stay on the platform. This can influence whether a piece of content, group, or hashtag goes viral, because social media’s algorithms prioritize not only popularity but newness. Similarly, Facebook has an internal metric for “violence and incitement trends” that is used to suppress certain hashtags, conversations, and groups. Platforms try to regulate the frequency with which certain topics appear and the speed at which they circulate, and to identify which should be removed, filtered, or slowed down because their rhythms are harmful. This in turn drives the development of tactics to neutralize moderators’ interventions. For example, President Trump recently used a technique described as typosquatting to bypass content moderators trying to demote problematic hashtags, posting the hashtag #BidenCrimeFamiily, with an extra i, to buy more time for his conspiracy theories to spread.
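A small sketch of why the extra “i” buys time: an exact-match suppression list misses the typosquatted tag, while a fuzzy similarity check would still flag it. The blocklist and threshold below are assumptions for illustration, not any platform’s actual filter.

```python
# Illustration of how a typosquatted hashtag evades an exact-match blocklist
# while a simple edit-distance-style similarity check still catches it.

from difflib import SequenceMatcher

blocked = {"bidencrimefamily"}  # hypothetical suppression list

def exact_match_blocked(tag):
    return tag.lower().lstrip("#") in blocked

def fuzzy_blocked(tag, threshold=0.9):
    cleaned = tag.lower().lstrip("#")
    return any(SequenceMatcher(None, cleaned, b).ratio() >= threshold for b in blocked)

tag = "#BidenCrimeFamiily"        # note the extra "i"
print(exact_match_blocked(tag))   # False: the typo slips past the exact blocklist
print(fuzzy_blocked(tag))         # True: a similarity check still flags it
```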

The cumulative effect of platforms’ various attempts to control the rhythm of user experience is to produce users themselves as a further filter. We impose a platform’s rhythms and imperatives on ourselves and other users, which shapes our sense of time and our interpretation of events. For example, repeating misinformation creates an “illusory truth effect,” making people feel that something is true even if the sources are not credible and even if they know the information is false.

There is nothing “organic” in how we experience platforms. Both human and nonhuman filtering mechanisms orchestrate our experience to make it feel as if it were “natural” and smoothly functioning. But this default setting aims to keep us as engaged as possible, as though that were the only natural thing to do. As tech companies attempt to engineer our lives, we must counter their strategies by, for instance, creating alternative business models, supporting organizations that challenge big tech, or building online parks. We can change the soundtrack of our lives and demand rhythms that provide harmony rather than discord.

Dr. Elinor Carmi is a feminist, journalist, and postdoctoral research associate in the Department of Communication and Media at the University of Liverpool, UK. She is currently working on several projects around data literacies and has recently published her second book, Media Distortions: Understanding the Power Behind Spam, Noise and Other Deviant Media, with Peter Lang.