The Right to Have Remained Silent

Journalists serve the public by telling the truth, but not the whole story

A few years ago, I accidentally set a bad legal precedent. In October 2011 I was arrested along with hundreds of others during an Occupy Wall Street march across the Brooklyn Bridge. Like everyone else I was charged with “disorderly conduct,” and People v. Harris was totally unexceptional until I got an email from Twitter. At first it looked like spam, but my filter didn’t flag it. The New York district attorney’s office had subpoenaed my account information, and Twitter was giving me a heads-up and a chance to try and stop them.

What had been little more than a parking ticket turned into a heavyweight bout. Twitter filed its own motion to quash the subpoena. The ACLU, the NYCLU, the Electronic Frontier Foundation, and Public Citizen joined in on an amicus brief defending my right to dispute the search and asserting the state’s need for a warrant. Judge Matthew Sciarrino was not buying it, even though he had had his own troubles with imperfect digital hygiene. (Two years before the Brooklyn Bridge arrests, Sciarrino was moved from the Staten Island Criminal Court to Manhattan after attorneys complained he had been friending them on Facebook before hearings, according to the New York Post.)

The prosecutors said there was something in my tweets that incriminated me, and the judge believed them. When Sciarrino started threatening Twitter with major fines, the bird squawked. Twitter’s official policy continues to be that authorities seeking user data require a warrant, but the company turned mine over without one. With the prosecutor seeking jail time — unusual for disorderly conduct — I pleaded guilty on the advice of my attorney and did some community service. I was thinking of how worried my mother would be, and how I really did not want to go to jail, even just for a couple weeks.

Privacy became a central question in the case, and though my tweets were public, having them brought to bear against me in a courtroom felt like a violation. The Fifth Amendment protects us from self-incrimination, and without Twitter’s cooperation or my testimony, the tweets would have been hard to authenticate. But the prosecutor claimed that because I tweeted them, my words ceased to belong to me — except insofar as they could be used to undermine my defense. Twitter didn’t think that was true, and their user agreement said as much. But that’s what the judge decided.

Even more galling was that the prosecutor, assistant district attorney Lee Langston, was young, five or six years older than me (I was 23 at the time). He is only a few connections removed from me socially, and I heard through the grapevine that he considered himself “really liberal” and that he “supported” Occupy Wall Street. I did not feel supported. Here he was, arguing that everything you post can be used against you in a court of law, while as a young American living in part on the internet he knew damn well that there are different kinds of public information. If he’s going to argue that social media posts are totally public, then he shouldn’t have gotten mad at me when I tweeted out unprotected Facebook photos of him dancing silly at a Harvard Law prom.

Objectively speaking, you should not troll prosecutors. Objectively speaking, I should not have tweeted that Langston and the editor of a prominent feminist blog were dating. But why shouldn’t the messy public-private divide cut both ways? I knew Langston was reading my tweets, and his whole “trying to put me in jail” thing felt very personal. What’s off-limits when someone wants to lock you in a cage? What could be a greater privacy violation than jail? It’s not like I tried to cavity search him. Besides, I was scared. Humanizing the state in the person of this prosecutor made me feel safer, even though it may have just put me at greater risk.


When so much of our communication, shopping, even thinking happens on the record, how (and how much) can we achieve any meaningful privacy? Do we surrender our information in perpetuity to the public, to the press, to corporations, and to the government when it goes online? Is it naive to expect anything less?

My weird experience at the intersection of these and more digital-privacy questions is what prompted me to read Meg Leta Jones’s CTRL+Z: The Right to Be Forgotten. Jones, a professor of communication, culture, and technology at Georgetown University, explores what could be called the regime of digital-past management, comparing the U.S.’s laissez-faire model with the European Union’s varied and more active policies. While the First Amendment puts almost everything up for grabs, the Europeans believe individuals should have tools to defend themselves from their pasts.

Central to the author’s questions about private data is the length of time we’re tied to the information we shed. Information never dies, the saying goes, but a lot of it is much harder to reach than you might imagine. The average web page doesn’t last forever; it might not even last a few months. Jones quotes a few studies that put the half-life of information on the internet — the time at which 50 percent of it will have vanished — at three years. And this is really vanished, like you couldn’t find it if you tried.

Depending on your perspective, this “URL death” is either like torching the Library of Alexandria every week or one of the internet’s most essential features. In terms Jones borrows from Sumit Paul-Choudhury, the “preservationists” on one side fear a world unwound by link-rot entropy, while the “deletionists” on the other worry that the Tetris blocks of information are stacking far faster than our ability to handle them responsibly. Yet most people fall somewhere between these poles, due to a paradox in privacy that Jones points out: We want it for ourselves, but we still want to peek at others.

This is one of the reasons Jones doesn’t put a lot of faith in approaches to digital-past management that call not for new laws but new attitudes about privacy, like legal scholar Jeffrey Rosen’s case for “new norms of atonement and forgiveness.” From Rosen’s point of view, we need social solutions to social problems. If we’re all more exposed, then we need to do a better job accepting nudity. A village of glass houses should cut it out with the stone throwing.

Jones sees the effects in the meantime — harassment, shame, cramped lifestyles, paranoia — as too damaging to allow. People need legal recourse, she argues, when past bad or even just embarrassing acts threaten to overwhelm a person’s present and future. The E.U.’s answer is Article 17 of the “Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals With Regard to the Processing of Personal Data and on the Free Movement of Such Data,” which provides for a “right to erasure” for E.U. citizens. Essentially, you fill out a request form asking Google to detach embarrassing but publicly inconsequential search results from your name. The information stays online, but it doesn’t show up when someone looks for you. There’s a way for the individual to escape their past, at least on Google.

Beyond individual well-being, collectively our privacy rights constrain institutional actors. You can’t, for example, elect to have the government monitor your sex life. Individual choices — malicious or forgiving as they may be — can’t protect society, especially when our choices are gameable. For example, individuals will click “accept” on anything you put between them and a new tool or toy, but that doesn’t mean we want companies to use that knowledge to obviate any and all rights we have over our personal data. We want to use Google Maps and Amazon and all the rest, but that doesn’t mean we want them to have the information our usage generates.

With full access to the average person’s data trail, you can know a lot: where they’ve been when, what they talk about and with whom, what they wear and eat, the pills they take, whom they love, whom they hate, where they’ll be, and so on. With that knowledge and all the computing power companies can buy, people are reduced to infinitely manipulatable input-output functions.

American law is badly suited to deal with these questions, perhaps uniquely so. As written, the Constitution is much more afraid of governmental interference with the press than of the press interfering with privacy. The U.S.’s press privacy regime is based on the idea that there are two kinds of people who can reach the public as such: journalists and the people they write about. American journalists are broadly protected from government action when it comes to both the good-faith pursuit of news and prurient gossip. Scrutiny is, by design, the price for fame or power in America.

But with social media, we can all theoretically reach everyone at once. Tweet the wrong (or right) thing, and a no one can become a someone in a minute. One time, some offhand joke I made was retweeted by Justin Bieber, and suddenly that joke was the most popular thing I had ever said in my life. We may not know whether we’re now journalists or celebrities, but we’re all very suddenly public. In this environment, U.S. law isn’t set up to be any help.

The First Amendment’s protections for speech and the press aren’t unlimited, but they’re very broad, and stronger than the E.U.’s. Whereas the U.K. press is governed by stricter libel laws, the American approach to speech and public information is norm-based. If celebrities sued every time an American tabloid printed lies about them, they’d never have time to be famous.

Rather, reader norms delegitimize tabloids. With few exceptions, when the National Enquirer publishes a bombshell story, nothing happens. If publications want to be taken seriously, if they want to affect the world, they don’t print lies — and apologize when they do. Other norms include respecting the division between public and private people (your neighbor being an asshole isn’t news) and the division between public and private behavior (a celebrity’s beach body isn’t news either).

Given that the First Amendment singles out American journalists for trust, journalists’ first responsibility is to be accountable for the trust placed in the profession. Journalists need to do more than just tell the truth; they are supposed to abide by a set of standards and practices that may exceed our individual appreciation of their usefulness. Maybe it’s not immediately apparent why a newspaper might on ethical grounds refrain from printing a photo of a kid breaking the windshield of a police car — it’s public! it’s news! — but editors are called to balance benefits to the public against the costs to their subjects. An exposed corrupt congressman gets what he deserves, but does the publicly incriminated rioter? Is that what the First Amendment is for? Restraint is one of journalism’s cardinal virtues; the profession serves the public by not saying things, too.

Online platforms did not sign up for journalism’s responsibilities any more than the ink producers did, but they’ve always walked a murky line. If you post your content on Google’s servers and YouTube makes money on the ads they sell against it, Google fulfills the function of a publisher or a broadcaster but without any of the voluntary responsibilities. And when American courts allow Google to comply with French privacy laws by delinking people from unflattering information on the .fr domain and nowhere else, for better or worse they project around the world our quirky Constitution and its emphasis on unlimited speech over personal privacy.

U.S. courts generally don’t hold online platforms liable for any illegality with respect to content users post unless the platforms know it’s illegal beforehand. This is usually only an issue when it comes to intellectual property. The copyright holders of the old cartoon Fillmore! can’t successfully sue YouTube just because someone posts full, infringing episodes. When it comes to libel on platforms, American law is not much help. A defamed individual is probably better off trying to change the narrative.

Compared with the costly responsibilities of journalism, being a platform looks appealing, especially when you have to compete with platforms for attention. Gawker is not unique, but its synthesis of British journalism norms and Cayman Islands liability made Nick Denton’s project a trailblazer. The site has famously transgressed both public-private divides, outing a relatively unknown corporate executive by exposing his texts with an escort and publishing a Hulk Hogan sex tape. It is playing a version of the privacy paradox: People will still click on things they think shouldn’t have been published. And if Gawker publishes these things, other sites will too, or at least write about Gawker doing it. Self-restraint works only if everyone does it, and though Gawker built a bridge between legitimate news and tabloid gossip, it’s now the so-called legacy media companies who walk it every day.

Supervillain Peter Thiel, with his many lawsuits against Gawker, claims to be playing white knight for the American press norms of yesteryear, but whether he wins or loses, that fragile deal is definitely over. Is that so bad? Privacy norms shift over time. Allegations of Bill Cosby’s repeated sexual assault were public and well-founded, but for a long time they fell into the media void of “private behavior.” A university professor of no particular public profile might be considered a private person and his sexual harassment of graduate students not newsworthy, but exposing this behavior can go a long way toward creating a social environment of accountability. Privacy norms, in part, protect the locally powerful, and journalists who abide by them aren’t always serving the public after all.

Which brings me back to People v. Harris. It’s flattering to think the district attorney’s office picked me out of a crowd of hundreds because they thought I was important, but I’m pretty sure that’s not what happened. Rather, I’m almost positive that the assistant district attorney saw a tweet of mine quoted in a Guardian article about the march. The tweet was public, but I didn’t expect it to be that kind of public. My audience, though potentially limitless, was practically limited to my fewer than 2,000 followers at the time. And if the Guardian’s writer had contacted me, I wouldn’t have given the same quote for fear of getting myself in trouble. Do I want a law that would have prevented the Guardian from publishing the tweet? I do not. Do I wish the reporter had asked me anyway? Hell yes.

I was lucky that I didn’t face more serious consequences; someone in a different situation could have had their life ruined. Allen Bullock, a black teenager from Baltimore, faced possible life in prison and a $500,000 bail after dramatic photos of him breaking a cop-car window with a traffic cone during a 2015 riot led coverage around the world. It’s impossible to know if Bullock would have felt compelled to turn himself in if the pictures had never been disseminated. Punishing their subject is probably not what the photographers intended, but it was a predictable outcome of their actions.

America is not Europe, and American content producers and platforms are not likely to see their lawful speech regulated any time soon, at least not in the name of privacy. In CTRL+Z, Jones doesn’t get into regulatory capture, but American multinationals have a lot of influence on the laws they have to follow. Even if regulators were able to navigate the First Amendment challenges, it’s hard to imagine getting strong digital privacy rights past Silicon Valley lobbyists. This Wild West approach makes the internet we have possible, but it also makes the internet we have hard to manage.

Jones thinks the process of changing norms is too slow for the immediate harm reduction we need, but norms are all we have. Whether most editors and journalists have ever lived up to the special trust our system puts in them, I don’t know, but they’re not so special anymore. In the age of data, regular folks can’t rely on the impotence of their words, and there’s no one to stop us from broadcasting whatever dumb — or allegedly incriminating — shit we think up. No law restricting information will save us, from ourselves or each other. If we want to live networked in safety, dignity, and respect anyway, we’ll all have to be professionally thoughtful.

Malcolm Harris is a freelance writer and an editor at the New Inquiry.