Ebola's lessons: How WHO mishandled the crisis

A contact tracer for the World Health Organization in a community in Conakry, Guinea, after a family member was infected with Ebola. Will WHO be better prepared for the next global outbreak? Photo by: Martine Perret / UNMEER / CC BY-ND 

EDITOR’S NOTE: Understanding the disastrous international response to last year’s Ebola epidemic in West Africa is important to ensure mistakes won’t be repeated. In this essay, Laurie Garrett, senior fellow for global health at the Council on Foreign Relations, takes a closer look at WHO’s missteps and makes the case for why the U.N. health agency needs to evolve. An excerpt.

In a biological sense, last year’s Ebola epidemic, which struck West Africa, spilled over into the United States and Europe, and has to date led to more than 27,000 infections and more than 11,000 deaths, was a great surprise.

Local health and political leaders did not know of the presence of the hemorrhagic fever virus in the 35,000-square-mile Guinea forest region, and no human cases had ever been identified in the region before the outbreak. Its appearance in the tiny Guinean village of Meliandou in December 2013 went unnoticed, save as a domestic tragedy for the Ouamouno family, who lost their toddler son Emile to a mysterious fever. Practically all the nonbiological aspects of the crisis, however, were entirely unsurprising, as the epidemic itself and the fumbling response to it played out with deeply frustrating predictability. The world has seen these mistakes before.

Humanity’s first known encounter with Ebola occurred in 1976, with an outbreak in the village of Yambuku, Zaire (now the Democratic Republic of the Congo), and surrounding areas. A horrible unknown disease suddenly started causing internal bleeding, high fevers, sometimes hallucinations and deranged behavior, and often death; it was eventually named Ebola after a nearby river.

Back then, science lacked today’s toolkit for the rapid identification and genetic analysis of viruses, not to mention meaningful antiviral treatments, biotechnology, sophisticated hazmat suits and cellphones. Considerable courage, combined with a fair amount of swagger and medical savvy, was the key trait of the couple of dozen foreigners who swooped in to assist the local disease fighters. Most were veterans of battles against other microbes, such as smallpox or yellow fever, but had not previously worked together. Karl Johnson, a virologist at the U.S. Centers for Disease Control and Prevention, took charge, and the multinational group operated as a team of rivals, jockeying for their respective institutional or national stature in the loosely governed investigation.

The group conducted its work under the brutal dictatorship of Mobutu Sese Seko, and its every small achievement, from corralling air transport to communicating with the CDC’s headquarters in Atlanta, was a near miracle. But within a few months, the virus was identified, the Belgian Catholic mission hospital at the center of the outbreak was closed, quarantines were enacted and the epidemic ended. Almost 300 people had died.

The world’s second serious confrontation with Ebola came 19 years later, in 1995, when the disease again broke out in Zaire — this time in Kikwit, a community of nearly half a million people spread out along the edges of a vast rainforest in what amounted to a giant village of mud roads, with no running water, no electricity, no phones, no media of any kind, and only the crudest of medical facilities. I took up temporary residence in Kikwit during the epidemic, reporting on how it played out. There was (and still is) only one paved road out of town, the N1, heading around 300 miles due west to Kinshasa and 550 miles southeast to Mwene-Ditu. At the time, Mobutu held Zaire in his clutches and used its national treasury as his family’s personal account; he would die two years later, and the nation would discover its bank vaults were empty. When the mysterious disease plaguing the community was finally confirmed as Ebola, the despot had his military cut off access to the highway, leaving the people of Kikwit to suffer on their own.

The global response boiled down to the Zairean doctor Jean-Jacques Muyembe-Tamfum and his medical team; three physicians from Médecins Sans Frontières; three World Health Organization officials; and about two dozen clinicians and scientists from the CDC, France’s Institut Pasteur, Belgium’s Institute of Tropical Medicine, South Africa’s National Institute for Virology (now the National Institute for Communicable Diseases), and other Western agencies and academic centers. Supplies and funds were scarce, electricity was available only from generators, and there were no rapid diagnostic tools, medicines or vaccines.

The Kikwit epidemic ended after around nine months, having killed 250 people. Afterward, the leader of the global response, David Heymann, an American employed by the CDC but temporarily working at WHO’s headquarters in Geneva, returned to Switzerland with a list of frustrations. Some of his concerns mirrored those of Johnson in fighting Ebola 19 years earlier: there was still no vaccine, no treatment, no field diagnostic tools, limited supplies of protective gear, nearly nonexistent local health care systems and trained medical personnel, no clear lines of national and global authority for epidemic response, few qualified scientists capable of and interested in being deployed, no international law governing actions inside countries lacking the capacity to stop epidemics on their own, and no money. Heymann had scoured Europe looking for funds to get his team and supplies to Kikwit. WHO had not been able to help much, and in the end, the German airline Lufthansa provided free travel and logistical support.

Yet another 19 years on, when I visited Liberia in late 2014, I found that little had improved. Although there had been at least 16 more Ebola outbreaks across the Congo basin and Uganda in the interim, the world had not developed any new technical or medical tools for addressing the virus. Treatment was only incrementally more sophisticated than it had been back in 1995, it was still impossible to rapidly diagnose infections, and there was still no vaccine.

Same old story

The 1976 Yambuku outbreak came at a time of tremendous optimism in the fields of global health and Western medicine. The previous decades had seen the development and widespread use of a host of remarkably effective vaccines. They had brought horrors such as diphtheria, measles, pertussis, polio, rubella, and tetanus down to insignificant levels in rich countries, offering the hope that immunization campaigns in poor countries could eliminate the diseases entirely. New antibiotics kept appearing on the market, pushing the prices of older stalwarts, such as penicillin and tetracycline, further down toward affordability in poor countries. The medical establishment in the United States was growing in size and sophistication, producing specialists offering treatments for rare forms of cancer, obscure inherited disorders and deep psychiatric afflictions. The pharmaceutical industry was at the beginning of an enormous boom. And WHO was successfully straddling both sides of the Cold War, garnering support from the Soviet Union and the United States.

But 1976 was also a year of harbingers of bad things to come. There was not just Ebola’s emergence in Yambuku. The United States struggled with two strange new outbreaks of its own, of swine flu and Legionnaires’ disease. In addition, the sexual revolution was spreading across Europe and North America, with increases in unprotected sex leading to a rising incidence of sexually transmitted diseases, such as gonorrhea, herpes and syphilis. Within five years, physicians in the United States would note a set of new, fatal symptoms among hemophiliacs, gay men and intravenous drug users; the disease would eventually be called acquired immune deficiency syndrome, or AIDS, caused by the human immunodeficiency virus, or HIV.

In what became known as the swine flu fiasco, the Ford administration and the American public health establishment overreacted to the death of a U.S. Army private from the disease. The fatality was isolated, but it sparked a panic and a national immunization campaign. Convinced that a massive pandemic was on the way, Congress indemnified the vaccine industry. Vaccines were rushed into production; amid claims of contamination and side effects, years of lawsuits followed. The episode left policymakers skeptical of their health care professionals and determined never again to indemnify drugmakers; manufacturers, in turn, ran for cover, and some drug companies shed their vaccine production lines entirely. An infuriated Congress convened hearings to rake the CDC over the coals, forcing the resignation of the agency’s director.

Six months after the death of the army private, 34 hotel guests attending an American Legion convention in Philadelphia died from a mysterious illness (later dubbed Legionnaires’ disease). The inability of the CDC and Pennsylvania health authorities to rapidly determine what had happened further undermined policymakers’ confidence, and when the cause of the disease turned out to be a previously unknown species of bacteria lurking in the air conditioning system, the public was shocked. If the age of infectious diseases was past, how could a new bacterial ailment appear, go undiagnosed for months, and prove tough to treat with antibiotics?

AIDS would, of course, prove the greatest challenge — to human hubris, the pharmaceutical and research communities, and international global health governance. Shortly after his first visit to Liberia to see the Ebola epidemic firsthand last August, the CDC’s current director, Thomas Frieden, told reporters, “I will say that in the 30 years I’ve been working in public health, the only thing like this has been AIDS. And we have to work now so that this is not the world’s next AIDS.”

Frieden was referring not to the disease itself but to the world’s disastrous response to it. For two decades, as the AIDS pandemic unfolded in country after country, governments and general populations almost always proved more interested in attacking the subpopulations at greatest risk for the disease than in fighting the virus itself. Children infected by HIV-contaminated blood transfusions were banned from schools, the homes of hemophiliacs were burned, masses of gay men died with little attention from the heterosexual communities around them, intravenous drug users were denied sterile syringes, female prostitutes were imprisoned or denied access to health care, and many medical and dental providers refused to allow HIV-positive individuals access to care unrelated to their infections.

From the perspective of HIV prevention, in nearly every country in the world, the 1980s and 1990s were long, ugly decades during which the virus spread relentlessly, with AIDS eventually ranking as the third-largest pandemic in world history (after the Black Death and the 1918 influenza pandemic). In comparing Ebola and AIDS, Frieden was not forecasting that Ebola would infect 60 million people, as HIV has; rather, he was indicating that the ignorant, inept and cruel response to AIDS was being mirrored by events unfolding in West Africa in 2014.

During the 1980s, WHO failed to recognize the importance of HIV and AIDS. Inside its Geneva headquarters, some experts exhibited as much prejudice against the populations at greatest risk for AIDS — especially homosexuals — as did the general public. For a brief time in the mid-1980s, its Global Program on AIDS (GPA) thrived, led by the epidemiologist Jonathan Mann. But WHO insiders grumbled and complained about the millions of dollars in AIDS funds Mann was raising and about the dire (and, in retrospect, mostly accurate) forecasts his group was issuing. A common refrain among insider critics was, “Since more people die of diarrhea — or cancer or high blood pressure or malaria or whatever — than of AIDS, why is it getting so much money and media attention?”

Heeding the grousing, WHO’s director general, Hiroshi Nakajima, forced Mann’s resignation, slashed the AIDS budget, and eventually shut down the GPA, essentially walking away from the largest pandemic in modern history.

Since then, the global response to the rise of new pathogens has continued to be limited, uncoordinated and dysfunctional. From severe acute respiratory syndrome to Middle East respiratory syndrome to bird flu, the story has been similar. Poor nations are unable to detect new diseases quickly and bring them swiftly under control. Rich nations generally show only marginal interest in outbreaks until the microbes seem to directly threaten their citizens, at which point they hysterically overreact. Governments look after their own interests, cover up outbreaks, hoard scarce pharmaceutical supplies, prevent exports of lifesaving medicines, shut borders and bar travel.

The global health infrastructure has shown itself to be weak, fractured, prone to infighting, and more interested in searching for technological silver bullets than engaging in the hard slog of social mobilization and classic local public health work. And through it all, WHO has struggled to remain credible, as its financial resources have shrunk, tensions have grown between its Geneva headquarters and its regional offices, and rival multilateral organizations have taken control over much of the global health action and agenda.

Republished with permission from Foreign Affairs magazine. Read the full article.

About the author

  • Laurie Garrett

    Since 2004, Laurie Garrett has been a senior fellow for global health at the Council on Foreign Relations in New York. Her expertise includes global health systems, chronic and infectious diseases, and bioterrorism.