It’s great that Facebook, Twitter, and other large social media companies are doing what they’re doing to prevent the spread of misinformation around Covid-19. Sure, Facebook is still ignoring what’s shared in private groups and has co-hosted a town hall with Fox News; sure, Google’s efforts to ban Covid-19 advertising will play right into Donald Trump’s hands. The response isn’t perfect. Still, these companies deserve credit for intervening so decisively.
The flipside of that applause is, well, why wouldn’t they intervene decisively? There are no good arguments for platforming information that will get people killed. Beyond that, false claims about the virus are often cut-and-dried, and therefore more amenable to fact-checks. Even Mark Zuckerberg agrees, telling the New York Times that falsehoods about the virus cross a clear threshold, making it “easier to set policies that are a little more black and white and take a much harder line.” These policies—including Facebook’s efforts to surface accurate information, remove harmful information, and ban exploitative Covid-19 ads—don’t represent some sudden about-face on moderating content. Instead, they reflect the uniqueness of the moment. Zuckerberg’s responses suggest that medical and political misinformation are simply different: they have different standards and features and consequences. So don’t worry, he implies, once our current information lockdown order lifts, such policies can slip back into their much less clear-cut, hands-off normal.

God help us if they do. The pre-pandemic normal is part of the reason we’re in this mess. By not taking seriously the ways in which political misinformation is itself a threat to public health, we’ll fail to learn what must be learned from this pandemic.

From the outset of the Wuhan outbreak in January, coronavirus conspiracy theories roared across social media. On the reactionary fringes, these centered on QAnon and the usual Deep State suspects, narratives that have been percolating in far-right corners for years. Within the more mainstream right—to the extent that such a thing exists in 2020—commentators may have sidestepped QAnon, but they’ve still pinned the tail on the Deep State. For example, Sean Hannity said earlier this month that it “may be true” that a nefarious army of resistance bureaucrats was using the outbreak to “manipulate economies, suppress dissent, and push mandated medicines.” Many others, including Donald Trump, insisted that the response to Covid-19 was a feverish overreaction of the fake news media and their Democratic allies, who were desperate to tank the economy in order to hurt Trump’s reelection. It was just another impeachment hoax.

And so millions of people in the US downplayed the threat, blamed the Democrats, and derided scientific expertise. The specific circumstances of the Covid-19 outbreak may have been new, but the underlying arguments were not. Donald Trump won the 2016 election on a wave of screw-the-Left, drain-the-swamp, ignore-the-lamestream-media energies. Given that buildup of pollution, and all the time it had to filter through the far-right water table, it’s no surprise that the Covid-19 threat was met—at least in the critical first few months, when we could have started preparing en masse—with partisan jeers, attacks on the media, and efforts to own the libs through social un-distancing. It certainly wasn’t a surprise that someone like Anthony Fauci would be roped into his very own Deep State plot.

We’re only just now beginning to feel the consequences. Our healthcare system, already strained, is struggling to keep pace with surging cases. Sirens wail day and night through New York City.

The population didn’t know what it needed to know, wasn’t doing what it needed to do, and seemed on the verge of indescribable loss if something didn’t change. That’s precisely why Facebook and Twitter and YouTube and the like were forced to take such drastic measures to curb the spread of false information. But the platforms only acted after having wasted months dithering in principled restraint—treating Covid-19 conspiracy theories and racist invective and false cures as if they were all no different from normal political speech, and thus deserving of the same broad protections. At least by 2020 standards, this was normal political speech. But from the very outset, it was also a threat to public health. The platforms only made that connection—and their public relations case—after the World Health Organization declared Covid-19 a global pandemic.

Of course some falsehoods have slipped through the cracks since then; even after the WHO designation, QAnon supporters on Facebook have kept themselves plenty busy, and so have antivaxxers. What’s different now is the posture of social responsibility affected by those in charge. In the past, Zuckerberg has stated that he doesn’t think Facebook should remove deeply offensive, deeply false things like Holocaust denial. People should be allowed to be wrong, he said; it’s not Facebook’s job to intervene when they are. Now, the argument is that Facebook should take false Covid-19 information down. That it is the platform’s job to intervene.

But these are extraordinary times. Writing in Foreign Affairs last week, political communication scholars Sarah Kreps and Brendan Nyhan argued that the spread of false political information doesn’t and shouldn’t warrant the same kinds of sweeping, break-glass-in-case-of-emergency measures as the spread of false medical information, because political information “does not threaten people’s health.”