Illustrations: Théophile Bartz

What We Learned From Smallpox, Measles, Cholera, and Other Health Crises

The novel coronavirus is just that — completely new. But disease is very old, indeed, and outbreaks, epidemics, and even pandemics have plagued humankind since the beginning of time. Could past pandemics help illuminate the path ahead? Here are some lessons worth considering.

Cholera: Geography matters

In 19th-century Europe, scientists and the public thought cholera was caused by “miasma,” or bad air. The acute diarrheal infection is actually caused by Vibrio cholerae, a nasty bacterium that spreads through water and food contaminated with feces. But in a time before the germ theory of disease had taken hold, it seemed like periodic outbreaks, and even pandemics, were simply inevitable.

In 1854, the world was in the midst of the third of seven cholera pandemics, and London was hit hard. A doctor named John Snow practiced in the city’s Soho district, which three out of four residents abandoned when it was hit by the disease. But Snow had a hunch, and he stayed behind to interview the families of victims.


Snow wasn’t convinced that foul air was to blame, and though he didn’t uncover a cure, his interviews did uncover a pattern. The more than 500 people who died in just 10 days all lived in a cluster around a public water pump on Broad Street. When Snow examined water from the pump under a microscope, he noticed “small white, flocculent particles.” He convinced local officials to remove the pump’s handle so people could no longer draw the contaminated water, and the outbreak ground to a halt.

Snow had just discovered that cholera was waterborne, but he had also contributed something invaluable to public health. By using interviews and mapping to determine the precise cause of an outbreak, he’d pioneered the outbreak investigation. Modern epidemiologists rely on field investigations like Snow’s to confirm an outbreak, figure out what’s driving it, and control and prevent infection. The takeaway: It’s worth investing in robust field investigations of even small-scale outbreaks, like the choir practice in Washington State linked to dozens of Covid-19 cases.

Influenza: Delay can be fatal…

The influenza pandemic of 1918 infected an estimated one-third of the world’s population and killed a mind-boggling 50 million people worldwide, about 675,000 of them in the United States. But the toll could have been drastically reduced if officials had been willing to act earlier.

Philadelphia had the second-highest death rate in the nation, thanks in part to a fatal decision to stage a massive war bonds parade in a city that didn’t realize it was already in the throes of a pandemic. Wilmer Krusen, the city’s public health director, had hoped an outbreak of influenza among sailors could be contained. But Krusen misjudged the size of the outbreak, and amid political pressure tied to ongoing American involvement in World War I, the gigantic parade, designed to help recruit soldiers and raise funds for the war, went forward as planned.

As a result, the 200,000 people who lined the streets of downtown Philadelphia on September 28, 1918, became inadvertent influenza vectors. Within 72 hours of the parade, every single hospital bed in the city was filled, and the daily death toll kept climbing. Eventually, over 16,000 people died there, most of them within five weeks of the parade.

In contrast, cities that took swift action lost fewer lives. In St. Louis, the health commissioner, Max Starkloff, called for broad closures and other measures — including canceling a war bond drive like the one that proved so fatal in Philadelphia — after just 100 St. Louis residents and a group of 500 soldiers billeted outside city limits contracted influenza. As a result, the city’s death toll was far lower than Philadelphia’s, and the sixth-lowest in the nation. The takeaway: Don’t wait for infections to explode before taking action.

Polio: …but haste can kill

In 1955, a deadly polio epidemic in the United States turned even deadlier when some of the vaccine designed to prevent the disease ended up causing it instead. The Cutter incident, as it is now known, illustrates what can happen when new treatments are rushed to market.

There was good reason to hurry: By the time Jonas Salk’s trial of a vaccine made with inactivated poliovirus was announced as a success in 1955, the U.S. had endured wave after wave of the disease, which can cause muscle weakness and paralysis, for years. The April 12, 1955, announcement that Salk’s vaccine worked was greeted with fanfare and relief. Less than three hours after the vaccine was declared safe and effective, the federal government had licensed it to five companies that had already submitted samples of the vaccine for safety testing. Despite warnings from an NIH researcher who had realized that a vaccine produced by Cutter Laboratories had given a monkey polio during a routine safety test, Cutter was among the companies licensed to produce the vaccine.

In the weeks that followed, 120,000 doses of Cutter’s vaccine were administered to American children who had no idea they were being injected with live poliovirus. Forty thousand of the vaccinated children came down with polio, and by passing the virus to family and community contacts, the faulty vaccine sparked its own epidemic. By the time the vaccine was recalled, over 160 people were paralyzed and 10 were dead.

The disaster panicked parents and provoked a backlash as some withdrew their children from vaccination programs. The deadly incident spurred tighter vaccine safety protocols and disease surveillance, but it also made pharmaceutical companies less likely to manufacture vaccines: A landmark ruling against Cutter, on behalf of children who had been paralyzed after receiving the company’s polio vaccine, made it easier to obtain verdicts against manufacturers that produced dangerous products.

“The Cutter incident resulted in the first coordinated response to a national emergency,” writes Paul A. Offit, MD, director of the Vaccine Education Center at Children’s Hospital of Philadelphia. “But [its] final ironic legacy may lie in a court ruling that, although designed to protect children from harm, has greatly reduced the willingness of pharmaceutical companies to make lifesaving vaccines.” The takeaway: Too-hasty approval of treatments or vaccines could backfire.

Ebola: Surveillance is everything

To respond to an outbreak, public health workers need information. Who’s sick? Where? A lack of answers to those questions can waste time and resources and cause the outbreak to spread undetected. Lives are in the balance — and as one very recent epidemic showed, a lack of robust testing and surveillance systems can be tragic.

When Ebola virus disease broke out in Guinea in 2014, it swept through West Africa. During the next two and a half years, 28,600 people got sick and 11,325 people died. Earlier outbreaks of the disease, which causes fever, vomiting, and sometimes internal bleeding, and which kills an average of 50% of the people who contract it, had been contained. So why did it spread in 2014?

The answer largely came down to surveillance. Disease surveillance (collecting and analyzing information about an outbreak, then using it to guide the response) is a bedrock of public health. And tragically, the Ebola outbreak of 2014 and 2015 became a case study in what not to do. The disease hit people in isolated and poor areas of countries with underfunded public health systems, and even when reports did reach authorities, they often lacked the resources to investigate.

As a result, the deadly disease spread through rural areas, prompting public health officials to come up with creative, mostly low-tech surveillance solutions over the course of the years-long epidemic. In Sierra Leone, for example, community members helped identify “trigger events,” such as households with more than two sick people or deaths, or unsafe burials. Over a seven-month period in 2015, these workers alerted officials to over 12,000 trigger events, detecting about a third of all Ebola cases in their districts. This didn’t just help identify hotspots: Because the spotters reported dead bodies, too, they also helped officials figure out when the epidemic was slowing.

Because of its experience tackling Ebola and other public health crises, says Cornell University international studies professor Rachel Beatty Riedl, PhD, “Africa is the first place we should be looking to understand effective responses and to learn from existing responses to health crises.” As of April 28, the WHO Africa region, which comprises 47 countries across most of the continent, has seen only 22,000 cases of Covid-19 and 899 deaths. That reprieve is unlikely to last long, but existing surveillance systems and techniques can be adapted to respond to the novel coronavirus, too.

“Faced with novel and complex diseases, African communities did not sit back and wait for the disaster to destroy them. They rallied the best they could with whatever was available,” writes The Nation’s Nanjala Nyabola. The takeaway: Engage community members to shore up shoddy surveillance systems.

Smallpox: Solutions may already be out there

Smallpox was a terrifying fact of life for much of human history: Written accounts of the disease date from as early as the fourth century A.D., and evidence of smallpox rash has been found on Egyptian mummies buried in the third century B.C. Over thousands of years, the disease swept through continent after continent as trade, modernization, settlement, and colonization expanded. Smallpox caused fever and a painful, pustular rash and could result in disfigurement and death; about 30% of its victims died. But even before vaccination was invented, people in Asia, the Middle East, and Africa knew a way to provoke immunity, and an enslaved man in colonial America helped bring the concept of inoculation to the Western world.

As a smallpox epidemic swept through Boston in 1721, the city panicked. The situation seemed hopeless until a preacher and a local doctor teamed up to test the concept of inoculation. It had been introduced to the preacher, Cotton Mather, by Onesimus, an African man he enslaved. Onesimus had undergone a procedure in which pus from a smallpox-infected person was rubbed into a cut in his arm. When he told Mather about the technique, the preacher was immediately intrigued, and when Mather later read an account of inoculation in the Ottoman Empire in a scientific journal, he learned that the practice was already common in the Middle East. Mather became an inoculation convert.

As the 1721 epidemic took hold, Mather and Zabdiel Boylston, a local doctor, began experimenting with inoculation. The idea was initially extremely unpopular, and Mather was ridiculed and even threatened for believing what an enslaved African man had told him. “There was an ugly racial element to the anger,” Brown University historian Ted Widmer, PhD, told WGBH. But inoculation worked: Only one in 40 of the people inoculated in the test run died, compared with one in seven of those who contracted smallpox without inoculation.

Soon, the practice of “variolation,” as the technique became known, was commonly accepted in the Western world. People who were inoculated usually developed a mild but contagious smallpox infection, then went on to develop immunity. The practice set the stage for Edward Jenner’s discovery of vaccination in 1796. Jenner realized that cowpox, an animal disease, protected against smallpox, too, and that people inoculated with cowpox, unlike those who underwent variolation, couldn’t pass smallpox to others. Before long, variolation was abandoned in favor of vaccination.

If Mather and Boylston had been unwilling to listen to what was already common knowledge in other parts of the world, smallpox might have taken even longer to conquer. Instead, the disease eventually became the first in history to be completely eradicated, in 1980. The takeaway: Consider existing treatments even while investing in new solutions.


Malaria: Nature may contain the cure — but it might not be replicable

For centuries, indigenous people used the bark of the cinchona, or quina-quina, tree to treat the symptoms of malaria, a parasitic infection that can cause fever, chills, and sweating, or more severe symptoms like coma, respiratory distress, or organ failure. In 1820, two French scientists isolated a compound in cinchona bark that they called quinine, and it soon became the front-line treatment for the disease. Though other treatments were eventually discovered, some forms of malaria resist those newer drugs, and a practical way to reproduce quinine in the lab long eluded chemists.

During World War II, the world’s dependence on the cinchona tree for quinine became a big problem when the Allies were cut off from the Dutch colonies that produced the majority of the world’s cinchona bark. The war eventually ended, but chemists’ quest didn’t. In the end, more than 150 years passed between the isolation of quinine and a reliable laboratory synthesis. Even today, cinchona trees are the only economically feasible source of quinine.

The story of quinine illustrates the benefits and potential drawbacks of looking to nature for disease cures. “With over 28,000 species known to be used in traditional medicine for a variety of diseases and symptoms, there are ample options to explore in the search for new drugs,” says Cassandra Quave, PhD, herbarium curator and assistant professor of dermatology and human health at Emory University.

Nature’s pharmacopeia presents some unique challenges, though, says Quave. Plant ingredients tend to be chemically complex, and sometimes their compounds interact in surprising ways. But according to Quave, advances like metabolomics (the study of the small molecules produced by biological processes), machine learning, and molecular networking (which reveals chemical relationships between molecules) could speed up the process. “Nature is incredibly chemically complex,” she says. “In fact, many of our essential medicines were originally discovered in plants.” The takeaway: Effective treatments could come from nature… but there’s no bypassing the complexity of biology.

Guinea worm: You don’t always need a cure

In 1986, an estimated 3.5 million people in Africa and Asia had guinea worm disease, a parasitic infection that causes painful blisters, swelling, secondary bacterial infections, and disability. But by 2019, only 54 cases were detected, even without a cure or a vaccine. That drop didn’t happen by chance; it was the result of a decades-long public health campaign that has nearly stamped out guinea worm for good.

By the 1980s, it was clear that researchers were nowhere near being able to develop an effective vaccine against Dracunculus medinensis, the parasite that causes the disease. Since it’s transmitted through contaminated drinking water, public health experts knew local efforts focused on clean water would pay dividends. The Carter Center, the NGO founded by Jimmy Carter, took the lead, working with ministries of health in affected countries and helping establish community-based programs.

Today, village volunteers — people who educate their peers about the parasite’s transmission, teach others how to filter their water, and help surveil cases — are at the heart of the approach. They have proven particularly effective at helping address local practices and superstitions that can make it hard for international public health workers to make inroads.

It could be years before researchers dial in a strategy for creating effective vaccines against parasites of any kind. George Washington University microbiologist Jeffrey Bethony, PhD, who is working on vaccines against hookworms and other parasites, has called vaccines against parasitic infections “the ultimate challenge.” But by the time researchers are able to create a vaccine against D. medinensis, they may not need to. An estimated 80 million cases of guinea worm are thought to have been averted by the public health campaign, and eradication no longer seems just possible, but imminent. The takeaway: We still need a cure for Covid-19, but even if one isn’t found right away, education and accurate information can empower people to limit its spread in the meantime.

Measles: We’re still learning what viruses are capable of

Measles was thought of as a common childhood disease for generations, but once vaccines became available in the 1960s, hope grew that it could be eradicated. Though the world is still chasing that goal, the disease’s incidence has been dramatically reduced, and measles deaths dropped 73% worldwide between 2000 and 2018.

It’s been 66 years since the measles virus was first isolated, and measles is now thought of as a preventable disease. But there’s still more to learn about what the virus is capable of. Just last year, researchers revealed that the measles virus damages the immune system’s ability to remember how to fight other, non-measles diseases, wiping out between 11% and 73% of the average person’s antibodies against other pathogens.

That helps answer long-standing questions about why the measles vaccine seems to protect against other diseases, too. But it also shows how much there still is to learn about viruses. Unlike measles, the novel coronavirus hasn’t been in scientists’ sights for decades, and even viruses that have been studied that long can still hold secrets that take years to unlock. The takeaway: Rapid scientific progress and knowledge gleaned in past outbreaks, epidemics, and pandemics may help us, but there’s no way to tell just how long the world will be fighting, or learning about, SARS-CoV-2.


Yellow fever: Privileging immunity may exacerbate social ills

Health disparities have long been known to prolong pandemics, epidemics, and outbreaks. Just look at Madagascar, where the same plague that fueled the Black Death has proven hard to stamp out due to social inequality, deteriorating infrastructure, political instability, and environmental degradation. But there’s another danger in disease outbreaks: Giving too much privilege to people who have survived can actually increase inequality.

During the 19th century, New Orleans was a hotbed of yellow fever. The viral infection, which is spread by mosquitoes, hit the humid city year after year. And without effective medications or preventatives, the city’s residents had no choice but to endure it and its gruesome symptoms, which can range from a fever to delirium to black vomit that contains blood. The cycles of disease, suffering, and death became so familiar that yellow fever was thought of as a feature of daily life. The only people who weren’t subject to yellow fever’s caprices were the ones who had survived a prior bout with it — and their immunity became a potent source of social capital in the city.

“There was a hierarchy of citizens” based on the disease, explains historian Kathryn Olivarius, DPhil, an assistant professor of history at Stanford University. “People who survived yellow fever were at the top. Then you had ‘unacclimated strangers’ [those who had never had the disease], as they were called. And then you have the dead at the bottom.” People who had never had yellow fever were considered immoral or unmanly or worse, she explains. Proof of “acclimatization,” or immunity, became a kind of pass to society’s benefits. Without having survived yellow fever, people couldn’t get loans, access high-paying jobs, or even marry their desired spouse.

This had disastrous effects, prompting people to purposely get infected with yellow fever and justifying elites’ abandonment of those they considered socially inferior. People would spend time around the sick or even eat black vomit in a desperate attempt to get the disease and get their ordeal over with. Despite a widely held belief that Black people had an innate immunity to the disease, slaveholders were prepared to pay up to 50% more for an enslaved person who had survived yellow fever.

Although there was no way to verify immunity, says Olivarius, yellow fever made societal stratification worse. “This is human beings manipulating the microbial world to their advantage. They used what they saw as biology to double down on incredible racial and social discrimination.”

And it could happen again. It’s not yet clear whether recovering from Covid-19 confers lasting immunity, and antibody tests are still unreliable. Officials in multiple countries, including the United States, have floated the idea of certificates of immunity that would allow people who have had Covid-19 to exit lockdown. But, says Olivarius, that carries its own risks.

“We cannot allow immunity to become one more form of discrimination in our society,” she says. “Viruses don’t think, but we as humans overlay them with meaning. We cannot allow this pandemic to become an engine of further discrimination.” The takeaway: Covid-19 has hamstrung the economy and obliterated life as we once knew it. To allow it to also short-circuit people’s sense of community obligation would be to let history repeat itself.
