Before Omicron, before Delta, before anti-vaxxers and booster jabs, before we had absorbed an entire vocabulary of Covidian epiphenomena, one fundamental question persisted far longer than it should have: How was the SARS-CoV-2 virus transmitted? More bluntly, how did people catch the damn thing? At first, the authorities seemed unanimous: the virus spread through droplets of saliva or respiratory fluid expelled with coughing and sneezing, droplets that were heavy enough to fall straight to the ground. You should be okay, we were told, even indoors and without a mask, if you washed your hands frequently, disinfected surfaces, and maintained enough “social distance” to allow droplets to plunge safely to the carpet. Per the World Health Organization, that meant at least one meter, or about 3.3 feet. The governments of Australia, Germany, and Italy went with 1.5 meters; of Spain, Britain, and Canada with two meters; the U.S. Centers for Disease Control with six feet. But you get the idea. We all did. We washed our hands. We sanitized. We anxiously wiped down packages.
Yet almost from the first there was reason to suspect the official wisdom. On April 10, 2020, less than a month after Covid was declared a pandemic, the physicist Lidia Morawska, an expert on the dynamics of ultrafine particles, and Junji Cao, who directs the Chinese Academy of Sciences’ Institute of Atmospheric Physics, published an article arguing that the virus was likely transmitted through particles so small that they did not fall to the ground at all but stayed aloft, blown about on air currents that could carry them up to “tens of meters from where they originated.” If Morawska and Cao were right, this would change everything. It should have, anyway.
Already at that point, significant evidence appeared to contradict the droplet theory. At a hospital in Singapore, swabs taken from air vents in a Covid ward tested positive for the virus. At an ICU in Wuhan, researchers detected the virus in air collected about four meters from any patient. During a choral practice at a church in Skagit Valley, Washington, fifty-three of sixty-one choir members were infected in one evening of divine praise. A single tourist on the Diamond Princess cruise ship appears to have been responsible for the infection of 712 others; the virus spread even after passengers were confined to their cabins.
As the first wave tapered off, scientists around the world were independently concluding that the virus spread through ultrafine aerosol mists. The WHO remained in denial. On March 28, the organization had tweeted, “FACT: #COVID19 is NOT airborne.” In July, 239 scientists signed a statement, with Morawska as lead author, appealing to public health authorities to acknowledge the likelihood of airborne transmission and encourage measures to mitigate it. The focus, they argued, should shift to preventing overcrowding and providing adequate ventilation, “particularly in public buildings, workplace environments, schools, hospitals, and aged care homes.” The way out of the disaster, in other words, was not individual but collective and infrastructural.
WHO officials promised to review the evidence. Nearly ten months later, on April 30, 2021, a Q&A page on the WHO website was quietly updated to include the possibility that the virus was transmitted through aerosols that “remain suspended in the air or travel farther than one meter (long-range).” The CDC followed suit seven days later. All of this is to say that officials responsible for preserving public health on a national and global scale let the pandemic spread for more than a year before they allowed the publication of an account of the virus’s transmission that was consistent with the prevailing scientific evidence. Why this extraordinary delay?
Early accounts put it down to the “slow and risk-averse” nature of bureaucracy, as if bureaucracies do not otherwise accomplish a great deal without disastrous delay. Some speculated that WHO staffers were reluctant to commit to a position without irrefutable proof. But transmission through droplets had also not been directly demonstrated. To date, there is no decisive evidence that Covid has ever spread via fomites—objects like door handles, countertops, and keyboards that the WHO was advising us to disinfect. The sociologist Zeynep Tufekci suggested in the New York Times that theories upending conventional wisdom face a higher “standard of proof.” But evidence for the aerosol transmission of diseases has existed since the 1890s. So why did droplet transmission become the dominant paradigm? Apparently, several media accounts suggested, it all had something to do with the nineteenth-century victory of germ theory over the medieval superstition that illness spread through miasmatic clouds: a battle with such “high stakes,” in Tufekci’s words, that it still makes epidemiologists sweat.
If that explanation is unsatisfying, it is also largely fictional. The battle fought in that era was not a noble crusade of science against superstition, as modernist mythmaking would have it, but a political struggle. The stakes then were not different from what they are today: the role of state and commerce in society, the degree to which “free” markets could be permitted to rule. Aerosol transmission was never merely a medical question. Accepting it, as Dr. John Conly, an infectious disease specialist advising the WHO on its Covid guidelines, told Reuters that first summer, “would affect our entire way of life.”
Corruptions of the Air
The prevailing narrative goes something like this. From Hippocrates in the fifth century BC until the late 1800s, humanity was trapped in the miasma of, well, miasma theory. Disease was believed to be spread via poisonous vapors, a “corruption of the air” caused by unfavorable climatic conditions and decomposing organic matter, better known as “filth.” The ailment we still call “malaria” owes its name to the Italian words mala and aria, literally bad air. Illness was seen as environmental: it spread from places to people, not from people to people. Thanks, however, to the ever-growing rationality of European societies, and to a few men of genius, science prevailed. We learned the mechanics of contagion: bacteria and viruses spread between individuals, sometimes through animal or insect hosts, causing ailments that we know as infectious diseases.
The story is not without its martyrs. John Snow famously halted a London cholera outbreak in 1854 by convincing officials to remove the pump handle of a contaminated well; he died “from an attack of apoplexy” years before his theories gained acceptance. (The Lancet only acknowledged that its excoriations of Snow had been “perhaps somewhat overly negative in tone” 155 years after his death.) Ignaz Semmelweis was ridiculed by peers for suggesting that doctors might reduce maternal mortality by washing their hands between performing autopsies and delivering babies. He died in a mental institution in 1865, of septic wounds sustained in a beating from guards. But his and Snow’s sufferings were not in vain. Their findings paved the way for the work of Louis Pasteur and Robert Koch, who in the 1880s fully elaborated the germ theory of disease.
This narrative is tidy, heroic, and tragic enough to be compelling: a triumphal sub-arc to the broader sweep of human progress. It is, however, entirely false—in scope if not in each detail. Human beings have observed for a very long time that sick people make other people sick. This is why the authors of the Book of Leviticus recorded careful instructions for the identification and isolation of lepers. In the fourteenth century, traumatic experience with plague led European states to set up institutions like quarantine and the lazaretto, as well as basic sanitary measures for handling the dead and segregating the sick. In 1546, the Renaissance physician Girolamo Fracastoro laid out a comprehensive theory of contagion, arguing that ailments like syphilis, typhus, and leprosy were spread by invisible spores that could be transmitted through touch, via fomites (a word he coined), and through the air. A century later, Athanasius Kircher inspected plague victims’ blood under a microscope and identified what he believed were the living seeds that carried the disease.
In fact, theories of contagion were already “rather old” by 1800, as the medical historian Erwin Ackerknecht argued in 1948, in one of the most influential articles in the history of medicine. But it was around 1800—on the very hinge of modernity, just after the American, French, and Haitian revolutions and just before the full emergence of globalized industrial capitalism—that those theories “experienced the deepest depression and devaluation in their long and stormy career.” Their antagonists were not retrograde defenders of superstition but some of the most “outstanding scientists” of their day. For them, Ackerknecht wrote, the battle against contagionism “was a fight for science, against outdated authorities and medieval mysticism; for observation and research against systems and speculation.” It was a “medical revolution” to parallel the political upheavals of the time.
The actual epidemiological battle of the early nineteenth century was thus a mirror image of the modernist myth that has been spun around it. The contagionists were the ones defending antiquated tradition. The empiricist crusaders—whom Ackerknecht lumped together as “anticontagionists”—were on the side of an updated miasma theory. They argued that infectious diseases such as yellow fever and cholera were produced by environmental factors and unsanitary conditions. The notion of contagion through invisible germs was to them “an insult to the understandings of mankind” and based on “a series of chimeras,” as the British polemicist physician Charles Maclean put it. To be fair, there was good scientific reason to discard a theory that could neither offer a verifiable mechanism for disease transmission nor account for its variable nature—some individuals stayed healthy after contact with the sick while others fell ill even in isolation.
The issue was hardly academic. The economic system that emerged with the spread of European colonial empires had created unprecedented microbial hazards. Old World diseases like smallpox and measles tore through the indigenous populations of the Americas while illnesses previously unknown to Europeans decimated port cities on both sides of the Atlantic. Yellow fever, endemic to West Africa, was most likely ferried to the Americas in the cargo holds of slave ships. In 1793, it wiped out approximately one-tenth of the population of Philadelphia before crossing the Atlantic again to wreak havoc in Spanish ports. By 1832, cholera, which originated in India, had killed tens of thousands across Europe.
At the height of the controversy, partisans of both sides were hopping across the Atlantic and Mediterranean, investigating epidemics of yellow fever in Savannah (1820), Barcelona (1821), Gibraltar (1828), not to mention cholera and plague, all the while issuing manifestoes and submitting reports to commissions, academies, and boards of health. The true believers demonstrated their convictions in spectacular fashion: the Frenchman Nicolas Chervin, Ackerknecht’s clear favorite, was not the only anticontagionist to risk drinking “large amounts” of the black vomit expelled by yellow fever patients. (He was fine. Yellow fever is spread by mosquitoes.)
The question of quarantine—the customary forty-day hold imposed on ships arriving from foreign ports—loomed large over these debates. The contagionists tended to support it. They were, per Ackerknecht, mainly “high ranking royal military or naval officers . . . or bureaucrats . . . with the corresponding convictions.” If illness could be proven to be caused by local conditions rather than imported contagions, the practice would no longer be justified. And by the 1800s, quarantine was reviled equally by laissez-faire liberals and by governments dependent on profits extracted from colonial empires, France and Great Britain foremost among them. Maclean, the most vociferous of the British anticontagionists, worked as an East India Company surgeon while crusading against the evils of quarantine. As late as 1886, the British government, worried that the victory of germ theory might jeopardize free movement through the Suez Canal, convened a committee of experts to officially refute Robert Koch’s recently published work on cholera.
Like Maclean, the men whom Ackerknecht lumped together as anticontagionists tended to be exemplary liberals whose faith in scientific rigor went hand in hand with a valorization of individual liberty and opposition to arbitrary state power. Ackerknecht stressed their antipathy to quarantine above all else, but they were not hostile to other state interventions. The sanitary movement in Britain and the United States, which overhauled urban infrastructure from London to New Orleans, was fundamentally miasmatist. The causes of epidemics were understood to be local and environmental. Disease was both literally and metaphorically in the air: it hung like a mist over the social relations that produced it.
The implications were potentially revolutionary. At their most “messianic,” wrote the historian Peter Baldwin, they entailed “a total program of thoroughgoing reform.” The great German radical physician Rudolf Virchow, an anticontagionist in his youth, was dispatched to investigate an outbreak of typhus in 1848; he blamed the epidemic on immiseration and exploitation, prescribing “free and unlimited democracy.” The great champions of rationality and progress were thus aligned against the theory that now sits at the foundations of modern scientific medicine. “To many of them,” Ackerknecht wrote, “both slogans: freedom of commerce (no quarantines) and freedom of science (anticontagionism) were . . . the natural expression of the same fundamental attitude and social position.”
In the nearly seventy-five years since he first published on the subject, Ackerknecht has been criticized for posing the debate in overly dualistic terms. But in outline at least, his basic narrative has stood up well. The ultimate victory of germ theory was not a triumphal tale of science vindicated but a complex history in which epidemiology was bound up with the politics of capital and empire. Ackerknecht’s fundamental insight, that “prophylaxis is a continuation of politics,” as Baldwin put it, feels truer and more pertinent these days than ever.
By the end of the nineteenth century, when germ theory emerged triumphant, it was evident that the paradigm that had prevailed was a conservative one. Virchow had been such a consistent irritant to Otto von Bismarck, the future “Iron Chancellor” of the German Reich, that Bismarck once challenged him to a duel. Robert Koch, by contrast, was a darling of the German imperial state. When cholera broke out in Hamburg in 1892, the central government dispatched Koch to impose quarantine and disinfection measures. His success there marked the final victory of contagionism.
The benefits that germ theory has since brought to the species are difficult to dispute—many of us would not be alive without them. But they came with a loss. The arena in which disease could be fought shrank. It no longer encompassed all of society, its relations with the earth and with the air that we breathe. Instead, it would be confined to individual cases, individual behaviors, and the controlled spaces of the laboratory and the examining room. Contagionism had triumphed within the bureaucratic centralism of Bismarck’s Germany. Within a generation, however, it would be made compatible with the free-market ideology that has dominated most of the globe ever since.
Terror from the Air
Which brings us back to aerosols. It may have occurred to you that germ theory does not rule out aerosol transmission. In 1917, it was still possible to lament, as Charles Value Chapin, superintendent of health for Providence, Rhode Island, did in a book titled How to Avoid Infection, that germ theory even “encouraged belief in air infection, for, when it was suggested that bacteria are the real cause of disease, it was assumed that such tiny particles could easily float in the air.”
This was the conclusion of the German bacteriologist Carl Flügge and his colleagues, who in a series of experiments performed in the 1890s showed that tuberculosis was transmitted through microscopic droplets expelled from a patient’s respiratory tract. Confusingly, Flügge used the term “droplet” to include mists so fine that the infecting bacilli could stay airborne for hours, in what we would now characterize as “aerosols.” Chapin, who would later serve as president of the American Public Health Association, was well aware of these findings. In The Sources and Modes of Infection (1910), which for decades served as a foundational textbook in American epidemiology, he selectively discussed Flügge’s research, only to dismiss its implications for less than empirical reasons. “Infection by air, if it does take place, as is commonly believed,” Chapin wrote, “is so difficult to avoid or guard against.” Admitting the very possibility, he worried, would discourage people from taking basic measures to avoid infection.
In the same text, Chapin went out of his way to link the notion of airborne transmission to the recently discredited miasma theory. “It will be a great relief to most persons to be freed from the specter of infected air,” he wrote,
which has pursued the race from the time of Hippocrates, and we may rest assured that if people can as a consequence be better taught to practice strict personal cleanliness, they will be led to do that which will more than anything else prevent aerial infection also, if that should in the end be proved to be of more importance than now appears.
In case he was wrong, that is, and diseases did turn out to be airborne, we would do best to deny the possibility. Got it?
It is not hard to see ideology at work in these contortions. Like his predecessors, Chapin was engaged in a campaign that was as much political as it was scientific. His wider goal was to limit and confine the “public” part of public health—an explicit riposte to the ethic of Virchow’s social medicine and of the crusading sanitarians, mostly anticontagionists, whose concern with removing filth had led to the construction of sewer systems and the rebuilding of crowded urban centers from Naples to Chicago. (Such reforms did effectively reduce outbreaks, though not for the reasons put forward.) The epidemiologist Hibbert Winslow Hill, who championed Chapin’s ideas in a book called The New Public Health, put it thus: “The old public health was concerned with the environment; the new is concerned with the individual. The old sought the sources of infectious disease in the surroundings of man; the new finds them in man himself.” You will recognize here the same battle fought between contagionists and anticontagionists, still alive after Koch’s resounding victory.
Government, Chapin allowed, should still provide clean water and regulate hygiene in dairies and slaughterhouses. The most important task of the public health profession, however, was controlling outbreaks of communicable disease through laboratory work and data collection. Otherwise, responsibility for disease prevention must be left to individuals. “It was a common saying of sanitarians of years ago that for every death from typhoid fever someone ought to be hung,” whether it be “the city councilman, the health officer, or the landlord,” Chapin wrote. “It is more in accord with present day knowledge and modern conditions to say that if a man has typhoid fever it is his own just punishment for sanitary sins.”
Like economists of his day and ours, Chapin imagined the public as a collection of individuals blessed with sufficient time, resources, and liberty to make rational choices. Whatever the government did or didn’t do to regulate dairies, he judged, the “best way to improve the milk supply is for the consumer to purchase only the best milk.” Since rats carry fleas that transmit plague, he advised, “when a man builds a house, or a stable, or a shed, he should build it rat proof. It costs only a few dollars more.” Plain sense and individual action would be enough to keep disease away. “By turning the face from the coughing and loud talking of our neighbors,” Chapin wrote in 1917, “by putting nothing in the mouth except clean food and drink; by never putting the fingers in the mouth, or nose; most contagious diseases can be avoided.”
His philosophy would soon be put to the test. The influenza pandemic known as the Spanish Flu arrived in 1918. By the time its fourth and final wave faded in the spring of 1920, it had killed between seventeen and one hundred million people, many of them no doubt responsible consumers of the cleanest comestibles. Chapin, who at the time headed Providence’s health department, played a role that will now be familiar. Believing that the outbreak would “run its course” within about six weeks, he consistently advised Rhode Island’s governor not to bother closing schools, churches, or theaters. “Community action cannot check the disease,” he wrote in the Dallas Morning News, but “the individual can do something to protect himself.”
A few years earlier, Chapin had acknowledged that tiny droplets containing “the influenza bacilli” could float in the air for as long as five hours. Now he kept his own counsel and said nothing about the possibility that the virus was airborne. (Aerosol transmission of Influenza A has since been confirmed in numerous studies.) He told the newspapers that the flu was “chiefly spread by droplets” and that staying “at arm’s length from everybody” should be sufficient protection so long as you “put nothing in the nose or mouth except what belongs there” and don’t “let people talk in your face.” As soon as cases began to drop, Chapin declared the outbreak over. The state’s ban on public gatherings, imposed despite his misgivings, was lifted one week later. That was late October. Predictably, deaths began to surge again before the year was out.
In the century that followed, Chapin’s vision of public health prevailed. Copious evidence has accumulated that many infectious diseases (measles, rubella, strep, smallpox, flu, TB, among others) spread via aerosols. Yet skepticism about the possibility of airborne infection remains a reflex among public health officials. If that mistrust is still expressed in scientific terms, it remains inextricable from politics. A disease that is airborne, as Chapin feared, does not care how diligently we wash our hands. It forces us to engage critically with the way we organize our societies: how we work, where we live, how we move, what we do with the old and the infirm and with those we condemn as criminal. It is no accident that some of the worst Covid outbreaks and highest rates of mortality have taken place among society’s least valued: in prisons, in elder care homes, among immigrants laboring elbow to elbow on the killing floors of slaughterhouses. To date, no CDC or WHO “guidelines” have questioned the viability of these arrangements.
Airborne or not, we have known a great deal about what causes Covid fatalities since the earliest days of the pandemic. Being Black or Latino in America put you at greater risk of dying, and being Native American was most deadly of all. Race was not the only nonmedical condition comorbid with severe Covid. So was not having insurance, not having attended college, being poor, and simply living in a state that allowed its eviction ban to expire. The inequalities that float through our lives like a noxious mist have proved as murderous as the virus itself. But which public health official dares discuss the pandemic in the context of racial and economic violence? The Biden administration, even more definitively than Trump’s, has abandoned all interventions save vaccination, which itself has been cast in moralizing terms. (For Americans, that is. The rest of the world is on its own.) Responsibility for survival once again falls on each of us. As CDC director Rochelle Walensky put it on Twitter, “Your health is in your hands.”
Months have passed since the WHO and CDC conceded that Covid spreads via aerosols. But the broader implications of airborne transmission have been effectively kept at bay. An airborne pandemic demands a more comprehensive response than providing HVAC filters and N95 masks for all, or updating building codes to improve air flow, not that anyone is doing that. It means letting the collectivity back in, allowing our relationship to place, and to each other, back into our understanding of health and disease. Perhaps first of all, it means accepting that we were sick long before Covid hit. Our recovery, if it is not to be just another illness, will have to begin with the acknowledgment that our destinies are shared, like the air we have no choice but to breathe.