In November of last year, 17-year-old Zander Cumbey noticed some disturbing social media posts shared by a classmate, including a “countdown” and more “really scary stuff.” Other students at his high school were also aware of these posts, and some of them stayed home on November 30th in fear for their safety. Cumbey decided to attend anyway because he “assumed that school administrators had the situation handled.”
That day, 15-year-old Ethan Crumbley shot multiple students at his high school in Oxford, Michigan, killing four of them. In the week that followed, more than 150 threats were made against schools across the country (roughly as many, for comparison, as were made during the entire month of September that year).
Just over two weeks after the Oxford shooting, several school districts were forced to respond to further alleged shooting threats by issuing warnings to parents, increasing security, or closing entirely; and students in several states, including Florida, Connecticut, Pennsylvania, Texas, and Ohio, were arrested in connection with them. This wave of threats — while part of the post-Oxford cascade of copycats — was determined to be part of a social media trend apparently started and propagated on TikTok as a “challenge”: to make December 17th, the last day of class before the winter break, a day of nationwide school shootings. It’s unclear whether the primary motivation was actually to inspire a wave of shootings or simply to prompt school closures. Most of the threats were deemed by police and school districts not to be credible.
Nobody seems able to find the original threats on TikTok, or even to determine whether they existed at all. (TikTok’s Communications Team tweeted that they were “working with law enforcement” to investigate the threats, but had found no evidence of them.) Instead, the rumors spread as warnings, shared by users about what would happen on the 17th. The ambiguity of the threat — simultaneously trending but also nowhere to be found, targeting all U.S. schools but also specific ones, “not credible” but also resulting in arrests — served to intensify the discombobulation of students, parents, and teachers trying to understand what to do. Students encouraged one another to stay home and expressed distrust of authorities who said that the threats were false; most of all, they talked about the fear and confusion they felt and told others to “stay safe.” Teens were not the only ones posting about the challenge; parents on TikTok also warned each other about sending their kids to school on the 17th, and gave advice for talking to their children about it.
The dominant reaction by parents and adult authorities was to lecture and condescend to kids who, in the end, were only trying to keep themselves safe. The police chief of Frisco, Texas, told students in a community message that they “must convey to [their] classmates these threats are not funny or cool.” Other messages recommended that students avoid sharing posts about threats on social media to stop the spread of rumors. While TikTok was pilloried for seeming to profit off of the terrorizing of young people, and for the platform incentives that made this viral campaign possible in the first place, it was ultimately “TikTok teens” who got blamed for spreading misinformation. The moral panic that ensued was only quelled by news of the emerging Omicron variant.
This brief but tragic episode highlights the ways social media and its vulnerable or marginalized users can be scapegoated for larger social problems. What really drove this phenomenon was a certain desperation among schoolchildren who felt abandoned by their elders — in the Oxford shooting, authorities had an opportunity to stop Crumbley but did nothing to contain him. As we’ve all experienced to one degree or another over the last two years, fear and the sense of abandonment by powerful authorities can drive compulsive behavior online, from doomscrolling to sharing a viral TikTok in the desperate hope that it’ll save someone’s life. Young people — facing just as many threats, but with less power to protect themselves — are just as susceptible as their elders.
What emerged in the wake of Oxford was what many feminists might call a “whisper network.” Though that term is often used to describe the informal network of contacts who warn one another about abusive members of an affinity group, it can also describe any informal association of people who are attempting to reverse-engineer protective infrastructure for themselves when official channels for such things have been blocked or suborned.
Whisper networks are infrastructural desire paths created by communities that have unmet needs. In the horrible aftermath of the Oxford shooting, many students felt a spike in vulnerability, exacerbated by the pandemic, which remains a threat to their lives. With the ostentatious failure of school officials to stop Crumbley, a sense of “we can only rely on ourselves” settled upon students nationwide.
While it’s condescending to indulge in moral panic about “TikTok teens,” we do them a disservice by ignoring their agency and casting them only as scared kids indulging in antisocial behavior because adults are AWOL. There is a worthy question to be asked here about why and how TikTok is vulnerable to these sorts of potentially toxic viral challenges, and why students so eagerly shared them.
In one sense, these challenges have less in common with other memes that bear the name — the Ice Bucket Challenge, for instance — than they do with the chain emails that still plague discourse among older generations of internet users. These emails, based on the “chain letters” that preceded them, usually consisted of a threat (of financial ruin, physical harm, etc.) if the recipient didn’t forward them along to others, along with a promise (of wealth, true love, etc.) if they did. The peer pressure, and the sense of having nothing to lose from sharing but something to lose from not, prey on the psychology of individuals together alone online. Though the medium is different, the message is altogether similar: a digital Pascal’s Wager against your own demise.
What distinguishes platforms like TikTok is the way that they have harnessed virality, something email was never designed to do. Email suffered from numerous social problems from its earliest days — spam, harassment, threats, and so on. But because it was meant to be a digital imitation of traditional on-paper post, virality was more incidental to the medium. Contemporary platforms like Twitter or TikTok, on the other hand, thrive on virality and are designed to encourage it. Every incentive impels users towards making numbers go up; as in a massively multiplayer online game, there is a particular dopamine hit that comes from the high score. By contrast, a large number of unopened emails is now a byword for ennui and depression.
These platforms afford what we might call high-velocity content: short hits that can be read or viewed quickly and that travel far as a consequence. TikTok’s algorithm in particular excels at propelling content from low-follow accounts into the viral stratosphere. When the viral content feels like a game of digital Russian roulette — share this, or you and your friends might die in a school shooting — there’s a certain impetus to keep that content’s velocity high. The frisson of gambling, the status conferred by virality, and very genuine fear for one’s life, all melted together in the pressure cooker of a platform like TikTok, make episodes like this almost inevitable.
Between the dissociative effects of internet use — which distance us from the impact of our words and actions in online spaces — and the freedom from accountability that such platforms offer, it can be easy for bad actors to send a lie halfway around the world. When lies are popular, and the platform’s design greases the skids, so to speak, a whisper network, however vital, becomes just another rumor mill.
In the aftermath of events like this December’s, adults — even ones who are hardly conservative or reactionary — often fixate on the deeds of potential bad actors. In an interview with the Washington Post, information science scholar Casey Fiesler likened the sharing of this viral challenge to pulling a fire alarm in the school building: scare your classmates and teachers, shut down the school, get an instant day off. But while we all know some teenagers who would do such a thing, fixating on bad faith prevents us from recognizing the agency and humanity of these students.
The impetus to view these whisper networks as fronts of misinformation, while understandable, obscures the deeper problem: an unwillingness to tackle the structural issues making American schools unsafe. Banning or stigmatizing students’ backchannelling on social media will do precisely nothing to combat this. Even our primary sources of information about the credibility of these threats are statements from local and federal law enforcement, which we are asked to accept at face value despite these agencies’ track record of spreading both mis- and disinformation themselves. Whether the reported investigations and arrests of students in the aftermath were actually connected to these threats is unknown, and such reports should be taken with a grain of salt.
We’ve seen, from Parkland to Oakland to Brooklyn Tech, students genuinely and sincerely standing up for their beliefs as young activists, tackling issues that they have been saddled with by the adults who govern their lives. If they can act in good faith there, we should allow that this TikTok challenge, destructive as it was, might have involved good intentions gone awry as well. A mistake, to be sure. But one that was part of a larger network of problems.
In the rush to scapegoat these platforms as corrupting influences on children and teens, we smuggle in assumptions that may only create conditions that redouble abuse and compel young people to suffer in silence once more.
Ours is an age saturated by fear. The Covid-19 pandemic, like the school shootings that preceded it and have continued since it began, represents a structural problem that we are encouraged to deal with individually. In response, social media is flooded with rage, anxiety, breakdowns, shaming, and terror as people struggle to find some way of taking control of the uncontrollable. They seek some means of protecting themselves, their loved ones, and their communities when the government is unwilling to do so, even if those means are dubious or outright toxic.
There can be little question that scaremongering on Twitter — the sharing of misleading or false content designed to make people even more afraid of an already horrifying pandemic — has little to offer us beyond further eroding our fragile collective mental health. Yet the platform itself urges this content into virality, and that content fills the void left by structural failures. Put another way, disinformation is a symptom, not a cause, of these myriad social problems: the absence of genuine, societal-level support created an environment where misinformation and disinformation could flourish among individuals.
The same is true of these teenagers, cleaving to something like a nightmarish chain email in hopes that desperate Toks may save lives where government officials and school administrators willfully fail. In the face of adult and administrative incompetence, teenagers are using the tools they have access to in attempts to keep themselves safe. To whatever extent they err, or fall prey to the algorithm’s perverse incentives, cognitive biases, or even their own personal moral shortcomings, they merely reflect the same problems adults struggle with every single day on social media. If we must condemn the TikTok teens, then we must condemn ourselves as well.