Nine Pandemic Words That Almost No One Gets Right

Actually, you’re probably not in quarantine.


One of the best and toughest parts of being a science writer is acting as a kind of jargon liaison. Weird, obscure, aggressively multisyllabic words appear in scientific discourse; I, wielding nothing but a Google Doc, a cellphone, and the Powers of the Internet™, wrest these terms from their academic hidey-holes and try to pin them down with some endearing yet accurate analogy. If I do my job well, sometimes readers never even need to see the original word, because there’s a more approachable way to describe it.

In a lot of cases, that’s how these words move—from academic to journalist to reader. (Hi there.) But sometimes the words leapfrog me. And that’s when I panic.

I have panicked a lot in this way during the pandemic. The coronavirus has prompted a huge shift in the ways we talk with one another, and about one another. That’s what people do in a crisis: We borrow, massage, and invent words to make sense of what’s happening around us.

But this most recent go-round has involved a lot of linguistic “leakage,” the linguist Elena Semino told me last month. “All of a sudden, something for a professional community is being used for everyone.” We’ve had to assimilate a whole slew of terms from public health, immunology, and medicine, some of them totally foreign (cytokines, positive predictive value, R-naught), others more familiar but with colloquial and academic meanings that at least partially conflict (bubbles, breakthroughs, boosters). The transition doesn’t always go smoothly, and confusion and misunderstandings, much like contagion, are very hard to rein in once they’ve started to spread.

By now a lot of our pandemic verbiage has been misconstrued. Last week, I asked experts, friends, family, and colleagues what field-hopping terms or phrases had been causing the biggest headaches this past year; the recommendations came pouring in. What follows is by no means comprehensive, and probably represents a futile exercise in refining and redefining: The horses have left the barn, the ships have sailed from the harbor, the words have already slipped through my fingers like so much semantic sand. But I suppose I will continue to grasp at them, until they have escaped me entirely.


Let’s start with asymptomatic, which scientists use to denote infections that never make people feel sick. Seems simple enough. But many who start off their infection symptomless might not stay that way, and until someone is rid of the coronavirus, it’s impossible to say whether they’re asymptomatic or presymptomatic. The boundary between no symptoms and symptoms is also surprisingly fuzzy. COVID-19 symptoms vary enormously from person to person, and are somewhat subjective: A headache two days after a positive coronavirus test could be a COVID symptom or an ill-timed hangover.

Truly silent cases, though, are detectable only through a test that hunts for bits of the coronavirus. These infections don’t count as COVID-19, a term that’s supposed to be reserved for a documentable, symptomatic disease that unspools from a subset of SARS-CoV-2 infections. The virus, SARS-CoV-2, is what actually infects us, what actually transmits, what tests actually detect. Not COVID. (I am screaming into a void here, but that also means there’s no such thing as a COVID test, and there’s no such thing as asymptomatic COVID.)

Okay, fine. Say you do test positive for SARS-CoV-2, and you lose your sense of smell, and your nose is kind of running a bit—you have straight-up symptomatic COVID. Maybe the person you mingled with unmasked a few nights ago does too, but they’ve got chills, nausea, and a high fever that will wreck them for weeks. Surprise! Both of you have mild COVID-19, a euphemistic term that’s still commonly used to describe all cases too “inconsequential” to land someone in the hospital. (At that point, a case is “severe.”) Mild might be useful for collecting population-level data, but a lot of experts dislike the adjective because it elides the debilitating and sometimes very lengthy illnesses that can unfurl from a SARS-CoV-2 infection, including long COVID. From the beginning, it’s been clear that “there’s mild, moderate, and severe, even for outpatients,” Sri Edupuganti, an infectious-disease physician and vaccinologist at Emory University, told me.

Whichever direction the pendulum swings, for the first few days after your symptoms start, you’re going to be in … quarantine, right? Sadly, no. Two years into our run with COVID, that’s still one of the terms we most commonly mess up. Correctly used, quarantine describes the period of time when people who think they’ve been exposed to SARS-CoV-2 are supposed to cloister themselves—a precaution in case an infection manifests. If you know you’re infected, thanks to, say, a positive test or legit COVID symptoms, you’re going into full-blown isolation. (Unless you’re in the United Kingdom, where they apparently play it pretty fast and loose with these terms and “use them interchangeably,” Saskia Popescu, an infection-prevention expert at George Mason University, told me. Woof.)

To confuse matters further, we have also adopted quarantine as a catchall moniker for somewhat sheltered pandemic life, or lockdown-lite. (Just check Google for 8 trillion listicles on quarantine cats, quarantine TV shows, quarantine meals, quarantine quarantinis …) Part of this obsession is probably cultural baggage: If Americans heard quarantine before the pandemic, it was usually in foreboding contexts—outbreak-centric history texts, or the plot twists of Contagion-esque sci-fi thrillers. (We have, after all, been using the term for centuries, since at least the time when ships arriving from plague-stricken countries were cordoned off for 40 days before docking—hence the quar- root.) Isolation is a much more well-worn term, something we’ve all gotten at least a taste of; it lacks that only-in-crisis allure. Quarantine—quarantine!—sounds way worse.


We’ve struggled with cheerier words, too. The prospect of being fully vaccinated, for instance, is pretty appealing. Our COVID shots substantially reduce the risk of getting infected or seriously sick with SARS-CoV-2, and slash the chances that the virus will be passed on to others.

But oh boy, is fully vaccinated also a nightmare to define. For starters, being fully dosed isn’t the same as being fully immunized, because it takes a couple of weeks for immune cells to learn the contents of a shot and react. (Even the professionals use this one in a confusing way: The CDC counts people as fully vaccinated the day they receive their second dose of Moderna’s or Pfizer’s vaccine or their first of Johnson & Johnson’s, but says they aren’t “considered” fully vaccinated until two weeks after that.) The rise of third doses and booster shots has also made the concept of full vaccination quite a bit squishier. If these additional shots are meant to build iteratively on prior defenses, does that take us to … fuller vaccination? Super vaccination? Or did we at some point get less full? (For now, at least, you don’t need a third dose or a booster to be considered fully vaccinated.) Fully also implies completeness, even invulnerability, when no vaccine in existence can ever confer such a thing.

That vaccines aren’t impenetrable shields against infection isn’t bad news; it’s very much in keeping with how immunity works, waxing or waning as encounters with microbes or vaccines build it up, or as time or pathogen evolution erodes it away. This has been a point of confusion when discussing vaccine effectiveness, the formal term for how we measure a shot’s success; those numbers will always vary, depending on what we’re measuring effectiveness against. (This one, to be fair, isn’t widely misused so much as widely misunderstood.) Establishing any infection at all is the easiest feat for a virus to accomplish—the first step toward causing disease—and the hardest event for a vaccinated immune system to block. That’s always where protection will falter first.

That sounds like a bummer, but SARS-CoV-2 infections among the vaccinated are entirely expected—especially because our shots were designed to help us stamp out disease, not eradicate all positive test results. It’s unfortunate, then, that we’ve spent months wringing our hands over breakthroughs of all severities. The term breakthrough has an established history in vaccinology—counting up these events is necessary to know how well inoculations are working in and out of trials. But because of our fuzzy understanding of vaccine effectiveness, the word’s use in pandemic times has become much more doom and gloom, with some reports even equating breakthroughs with vaccine failures. That’s absolutely not the case.

Consider the CDC’s definition for a SARS-CoV-2 breakthrough: any test-based detection of the virus in someone who’s been fully vaccinated against the coronavirus. This dumps an enormous range of postinoculation outcomes into the same category, everything from exceedingly rare hospitalizations and deaths to totally silent infections that would’ve gone unnoticed if not for a fortuitously timed test. Simply receiving a positive test result does not guarantee that a person will experience disease or spread the virus to someone else. For these reasons, a lot of experts have sworn off using the term breakthrough—and wince noticeably when it comes up in conversation. (Many prefer post-vaccination infection.)

If the terminology of breakthroughs has been exaggerated toward the negative, the discourse around natural immunity might be its overhyped foil. Natural immunity is another foster-phrase; long before the pandemic started, scientists used it to describe the protection left behind after an infection by a bona fide pathogen. But in the age of COVID, the phrase has become weaponized into a false binary: If infection-induced immunity is natural, some have argued, immunity obtained through different means must be unnatural—artificial, undesirable, a dangerous hoax, or even, in some cases, a moral failure, the religious-studies expert Alan Levinovitz recently explained in The Washington Post.

But that dichotomy is scientifically nonexistent. Inoculations are designed to mimic the microbes that cause infections, and often end up tickling pretty similar responses out of immune cells. The main difference is that vaccines deliver their defensive lessons safely, without risking disease. As a nod to this, the immunologist John Wherry and others prefer using terms such as infection-acquired and vaccine-acquired immunity. They’ve even started using another phrase—hybrid immunity—to refer to the heightened protection that’s afforded when people with a prior SARS-CoV-2 infection get vaccinated.

If the worry truly is that vaccines are a technological unknown, there’s at least one other way to look at this. Vaccines, like many other human inventions, are body-inspired. They leverage and build on our inborn defenses, in much the same way that glasses can enhance vision and good running shoes can speed up a person’s pace. They’re not an indictment of the immune system and its numerous powers, but a tribute to them. In a pandemic, vaccines, in protecting both the people who receive them and the people those recipients interact with, really do accomplish what no other tool can—and that, if anything, is worth saying over and over and over again.

The Atlantic’s COVID-19 coverage is supported by grants from the Chan Zuckerberg Initiative and the Robert Wood Johnson Foundation.

Katherine J. Wu is a staff writer at The Atlantic.