Our mastery over microbes is only a few decades old. It is also far more precarious than we imagine.
KYLE HARPER
In 1879, Jules Verne published a science fiction novel, The Begum’s Millions. In the story, two rival heirs—a French doctor and a German chemist—fall into a fortune left by a long-lost relative. They each use their share of the inheritance to build model cities far in the American West.
The contrasts are not subtle. The German establishes “Steel-City,” a nightmarish industrial town dedicated to the manufacture of destructive weapons. The doctor, by contrast, founds a city called “France-ville,” premised from top to bottom on scientific principles of hygiene, where public spaces and private habits are minutely regulated to promote healthy living: “To clean, clean ceaselessly, to destroy as soon as they are formed those miasmas which constantly emanate from a human collective, such is the primary job of the central government.”
Citizens are indoctrinated from childhood “with such a rigorous sense of cleanliness that they consider a spot on their simple clothes as a dishonor.” Hygiene is a public imperative and private duty. In exchange for uncompromising vigilance, the residents of France-ville enjoy the blessings of long life.
Verne’s story is a fable of modern science; it envisions a future in which human ingenuity is used to destroy or to preserve life. In the novel’s happy ending, Steel-City destroys itself, but there is something unsettling about France-ville, too: life is consumed in the crusade against disease and death.
Such visions were in the air when Verne wrote; he cribbed the sanitary rules of France-ville almost verbatim from a contemporary British reformer named Benjamin Ward Richardson.
In a speech delivered in 1876, Richardson imagined a utopia that he called Hygeia, the City of Health. For Richardson there was nothing inherently fictional about this future. “The details of the city exist,” he said. “They have been worked out by those pioneers of sanitary science.” In the coming years, he hoped the “desires and aspirations” of the hygienic reformers would become the lived reality of the mass of humanity.
Today, many of us live in a version of Hygeia. Around 1870, even in the most rapidly developing nations, infectious disease still filled the graveyards. But soon, human societies brought infectious disease under control. Toward the end of the nineteenth century, in the United States and Britain, a great threshold was crossed for the first time in the history of our species: noninfectious causes of death—cancer, cardiovascular disorders, and other chronic and degenerative diseases—accounted for a greater portion of total mortality than did infectious diseases.
By 1915, an American social reformer could observe that “a generation ago we could only vainly mourn” the deaths of children from disease: “To-day we know that every dying child accuses the community. For knowledge is available for keeping alive and well so nearly all, that we may justly be said to sin in the light of the new day when we let any die.” By midcentury, dying of infectious disease had become anomalous, virtually scandalous, in the developed world.
The control of infectious disease is one of the unambiguously great accomplishments of our species. Through a succession of overlapping and mutually reinforcing innovations at several scales—from public health reforms and the so-called hygiene revolution, to chemical controls and biomedical interventions like antibiotics, vaccines, and improvements to patient care—humans have learned to make the environments we inhabit unfit for microbes that cause us harm. This transformation has prevented immeasurable bodily pain and allowed billions of humans the chance to reach their full potential. It has relieved countless parents from the anguish of burying their children. It has remade our basic assumptions about life and death. Scholars have found plenty of candidates for what made us “modern” (railroads, telephones, science, Shakespeare), but the control of our microbial adversaries is as compelling as any of them. The mastery of microbes is so elemental and so intimately bound up with the other features of modernity—economic growth, mass education, the empowerment of women—that it is hard to imagine a counterfactual path to the modern world in which we lack a basic level of control over our germs. Modernity and pestilence are mutually exclusive; the COVID-19 pandemic only underscores their incompatibility.
But to grasp the full significance of the history of infectious disease, we need more than ever to understand this recent success through the lens of ecology and evolution. Modern human expansion is unnatural in its scale. Global population in 1900 was 1.6 billion; in 2020, humans numbered 7.8 billion. The primary and proximate reason the human population has ballooned is the control of infectious disease. As Peter Adamson noted in the New Internationalist in 1974, human beings have multiplied deliriously not because we “suddenly started breeding like rabbits. It’s just that we’ve stopped dying like flies.” The rapid development and diffusion of disease control technologies thus helped to provoke what is known as the Great Acceleration, the startling intensification of human impact on the planet from the middle of the twentieth century.
Our indelible impact has led many earth scientists and others to propose that we now live in the Anthropocene, a geological epoch defined by humanity’s pervasive imprint on the planet. The Anthropocene has a microbiological dimension too, as microbes respond and adapt to the human-dominated Earth.
Indeed, as human control of the environment has expanded, Darwinian evolution has continued, or even accelerated.
In fact, the evolution of new pathogens is not an anomaly but, rather, the strictest obedience to the laws of nature.
COVID-19 was the evolutionary product of the ecological conditions we have created—our numbers, density, and connectivity, especially in the age of jet travel. The ongoing pandemic has been a jarring reminder that humanity’s control over nature is necessarily incomplete and unstable. We urgently need to learn from the experience of this pandemic—and the larger history of the globalization of health over the twentieth century—to ready ourselves for the inevitable next one.
In 1956 American demographer Kingsley Davis published a paper called “The Amazing Decline of Mortality in Underdeveloped Areas.” Davis was an influential academic whose work on global population shaped international policy. He wrote at a time when it was becoming evident that massive improvements in health were possible, even without equivalent progress in the economic realm.
“The truth is that these areas [i.e., low-income societies] do not need to become economically developed to reduce their death rates drastically,” he wrote. There were some spectacular examples that underscored his point. During the 1940s, the mortality rate in Puerto Rico fell by 46 percent. In Taiwan (known then as Formosa) it fell by 43 percent; in Jamaica it fell by 23 percent. In Sri Lanka (then called Ceylon), where DDT was used to suppress the mosquito vectors of malaria, the crude mortality rate fell by 34 percent in a single year. As Davis noted, “this was no fluke, because the death rate continued to fall.” Gains that had once taken decades to achieve might now be compressed into a few years.
At the cost of some simplification, we can sort the countries experiencing the health transition in the twentieth century into three groups. (1) The first followers were those countries where sustained mortality decline started around 1900, just on the heels of the pioneers. In these nations, health often preceded economic development, sanitation and hygiene played a leading role, and major gains were still to be achieved midcentury. (2) In a second group, the global majority experienced headlong transition later, centered on the period from the 1940s to the 1970s; because the full kit of biomedical interventions and insecticides was now available, this transition was the largest and fastest mortality decline in human history. (3) A third group of countries still had life expectancies in the thirties or forties by 1970. Largely though not exclusively in Africa, these countries were located in regions with the heaviest burden of infectious disease, especially falciparum malaria. Mosquito abundance, insecticide resistance, AIDS, and poverty have made progress slower and more grinding.
As historian James C. Riley observes in his book Low Income, Social Growth, and Public Health: A History of Twelve Countries (2007), there was no single logic determining which countries joined the first followers. It helped to be northerly, like Japan and Korea, but places like Costa Rica and Jamaica managed to defy the geographic odds. Japan, as a temperate-latitude island, enjoyed a naturally favorable geography, and the nation’s culture reinforced sanitary and hygienic programs. Life expectancy was already in the thirties during the Tokugawa Period (1603–1868). The Meiji Restoration of 1868 launched a modernization project that deliberately emulated aspects of western culture, including medical science and industrialization. In 1870, the government formally adopted German medicine as its model and hired professors from Germany, who came to dominate the medical establishment in Japan. Smallpox vaccination was officially adopted in the 1870s. At least by the 1890s, the health transition was underway in Japan, and by the start of World War II, life expectancy was in the high forties, even though incomes remained modest. Progress resumed after the war, and economic growth took off. Japan invested heavily in health and education, and by the end of the century it had achieved the longest life expectancy in the world.
Korea is an interesting parallel, because it too launched into the health transition early, and it did so as a colonial society under Japanese rule. The health transition started around 1910, largely due to economic and social reforms imposed by Japan. These reforms included smallpox vaccination, quarantine, sanitary policy, and the expansion of medical care. In 1910, life expectancy was only around 23.5 years, but by 1942, it had reached 44.9. The country remained poor, and the midcentury conflicts interrupted progress. But from 1953, improvements resumed quickly (in South Korea), as mortality from infectious disease declined for two decades. South Korea became one of the healthiest and most educated populations on the planet, and prosperity followed.
There is some uncertainty about when life expectancy started to increase in Costa Rica, but it was probably in the 1890s. The country attained independence in the 1820s, and in the later nineteenth century an export-based plantation economy—largely financed and controlled by foreign capital—grew up around coffee and bananas. The United Fruit Company, an American corporation that came to own vast properties, sponsored campaigns against hookworm and malaria, extended by the Rockefeller Foundation’s programs to include diarrheal disease. (Scholars have argued over the imperial nature of these projects.) The “targeted” campaign against specific diseases became an important approach in the early twentieth century. Progress was slow at first but gathered pace from the 1920s. Insecticides like Paris Green were sprayed to combat malaria, and the school system was used as a conduit to teach health and hygiene. The Costa Rican state reinforced social development with investment in public health and welfare programs. The heavy artillery of antibiotics and DDT arrived midcentury, and gains continued apace through the 1970s.
Jamaica is a unique case. In this former British colony once dominated by the plantation system, mortality had been very high, with malaria, yellow fever, and diarrheal disease taking a heavy toll. But the colony experienced a form of the early health transition, thanks to drainage, quarantine, cinchona bark, sanitation, and hygiene. White mortality declined from the mid-eighteenth century, in parallel with western Europe. Among the enslaved population, mortality improved later, and the institution of slavery was abolished by 1838. Across the nineteenth century, life expectancy hovered in the high thirties. The island enjoyed greater autonomy from Britain in the 1890s. Gains were achieved against yellow fever and malaria, and progress accelerated in the 1920s. The Rockefeller Foundation was one catalyst, seeking to eliminate hookworm and carrying out wider health promotion campaigns. These broad-based efforts were remarkably successful in spreading knowledge about infectious disease and empowering Jamaicans to take ownership of the public health movement, as well as in building durable local infrastructure to maintain public health advances. Improvements in life expectancy stretched from the 1920s to the 1970s.
These first followers were slightly ahead of the global curve. From the 1920s, the number of countries entering the health transition climbed rapidly, with the era of greatest improvement concentrated in the middle decades of the twentieth century. The global majority came to enjoy long life as humanity’s control over its parasites was extended worldwide. Let us look briefly at how infectious diseases were brought under control in two small countries, Ghana and Sri Lanka, and then the world’s two most populous countries: India and China.
Ghana, known as the Gold Coast prior to independence in 1957, sits on the Gulf of Guinea in West Africa. The coastal south of Ghana is equatorial forest, whereas the north is an arid savanna. The country bears the full burden of tropical diseases, including malaria, schistosomiasis, tropical ulcers, sleeping sickness, yellow fever, and yaws. A long history of trading ensured that the cosmopolitan diseases were either permanently present or repeatedly introduced. A burst of development around 1900 only made the disease environment more perilous: roads and railways expanded, forest was cleared (this was especially conducive to malaria), and headlong urbanization proceeded. Accra, the capital, was home to around 18,000 people in 1900, about 135,000 half a century later, and some 2.5 million today.
Ghana was a British colony; as in most European colonies, a medical department had been established by the imperial government. However, the mission of the medical administration was to make the colony safe for Europeans. Racist ideas bred fatalism about the poor health of colonial populations. Public health was beyond the scope of official concern, except insofar as the general sanitary and hygienic conditions of the territory infringed on the well-being of whites.
Gradually, paternalistic attitudes fostered broader interest in the population, allowing limited if insufficient improvements.
For example, in 1908, a committee assessing Britain’s imperial medical administration in Africa baldly stated, “The idea that the officers of the West Africa Medical Service are appointed solely to look after Government officers should be discouraged. It is a part of the duty of a medical officer to increase the progress of sanitation amongst the natives dwelling in his district.”
Moreover, missionaries and private companies (with different motives) also helped to promote the diffusion of medical science.
In Ghana, the imperial medical administration expanded from the 1910s. Clinics and dispensaries were opened across the country, inadequate in proportion to the population but broader than had existed before. A new health branch promoted sanitation, hygiene, vector control, and vaccination. Life expectancy in the 1920s, the earliest decade for which estimates can be made, was still in the twenties.
But progress over the next few decades was rapid. Sleeping sickness and yaws were targeted. Antibiotics reduced mortality rates from respiratory infections. Vaccination was carried out more widely. Modest economic development alleviated the most grievous poverty. Malaria remained intractable, although therapeutics blunted the mortality rate. Mortality declined rapidly in the 1930s and 1940s. By midcentury, life expectancy in Ghana was forty-one years, the highest in sub-Saharan Africa. Progress broadened in the postcolonial period, and life expectancy is now sixty-four years.
The experience of Sri Lanka, known as Ceylon during the colonial period, is also illustrative. The island was a British colony, and sustained increases in life expectancy began in the 1920s. The Rockefeller Foundation’s campaign against hookworm had positive spillover effects in promoting better sanitation. Public-sector investments by the local government helped to build a robust social welfare system, providing education and health care broadly across the population. A malaria outbreak in 1934–35 interrupted this early progress, but in the space of about a decade after World War II, Sri Lanka achieved perhaps the most rapid sustained improvement in population health in the history of our species. The heavy weaponry of human disease control—DDT, antibiotics, vaccines—was deployed in full force. The rapid gains were followed by five decades of steady progress. The case of Sri Lanka involves both “magic bullets” and broad investment in social welfare.
The demographic history of India recapitulates these themes on a grander scale. As late as 1920, life expectancy in India was only around twenty. Malaria, cholera, plague, and influenza had kept mortality rates high. The crude death rate hovered around forty per one thousand. Mortality then declined gradually before independence (in 1947). Thereafter, gains in life expectancy were truly impressive. Broader access to health care, economic development, and improvements in sanitation (e.g., sewage, piped water) reduced the prevalence of infectious disease. The National Malaria Control Programme was established in 1953 and proved radically successful. By 1965 malaria was basically eliminated as a cause of death, but DDT-resistant mosquitos evolved, and soon the disease was in resurgence. Vaccination programs reduced the incidence of smallpox. Cholera was targeted and has declined. TB programs met with limited success. But by the early 1970s life expectancy was above fifty; today life expectancy has surpassed seventy years.
China arrived at similar results by a different route. Around 1930, life expectancy in China was between twenty-five and thirty-five years. Diarrheal disease, tuberculosis, and other respiratory infections were rife. Malaria was endemic in the south, and schistosomiasis was a major burden in rice-producing regions. Under Mao, who seized power in 1949, China experienced rapid modernization in the arena of public health, despite the horrific famine of 1959–61. The government carried out mass vaccination, and in 1952 it launched the Patriotic Health Campaign to control infectious disease, build sanitary infrastructure, and implement vector eradication. These efforts were accompanied by health education and expansion of health care provision as well as medical training. In the space of a few decades, China was transformed from a stagnant peasant society with life expectancy near the bottom of the international rankings into a modern nation with life expectancy near seventy years.
These cases illustrate common themes in the patterns of midcentury development. The first is an enduring tension in global health between, on the one hand, “magic bullets” or targeted technical or biomedical interventions to control specific diseases and, on the other, broader development programs. This tension runs throughout the entire history of public health. It was there in the beginning, in European debates over whether disease caused poverty, or poverty caused disease. It has been a constant question for governments, philanthropies, and global governance organizations operating on finite budgets: whether to invest in projects that reduce morbidity and mortality, or in reforms that focus more holistically on equity and human development. The two can be complementary, but visions compete for resources and influence.
Against the backdrop of this longstanding tension, the 1940s were the pivot of a structural change in which the control of infectious disease became possible at unprecedentedly low cost.
Scientific and medical advances enabled underdeveloped societies living on near-subsistence wages to achieve life expectancies that were previously impossible. To put it crudely, before vaccines, antibiotics, and insecticides, it was impossible for a society with per capita income of less than $1,000 to have a life expectancy above fifty years.
But health without wealth—or, rather, the control of infectious disease absent economic development—became progressively more attainable. The allure of low-cost interventions was greater than ever before.
Not coincidentally, the miraculous midcentury progress took place against the background of decolonization and the Cold War. Progress preceded the end of imperialism and also helped to hasten its demise. Social development raised expectations and political engagement. Newly sovereign nations, like India, were then able to expand and sustain progress, often on a broader scale. But international support remained critical, and the geopolitics of the Cold War loomed large over international health efforts from the 1950s to the 1970s. The United States was the world’s supplier and promoter of DDT, and malaria eradication was a focus throughout the 1950s and early 1960s, until advances against the disease stalled. The campaign to eradicate smallpox was born of uneasy cooperation between the Soviet Union and the United States, inseparable from jostling between the West and Communism to earn goodwill in the so-called Third World.
The flip side of rapid progress, from the earliest health transition to the present, has been unprecedented population expansion. The passage that all modernizing societies have undergone from a high-mortality, high-fertility regime to a low-mortality, low-fertility regime is known as the demographic transition. It is part and parcel of modernization—no society has modernized without passing through this biological and social transformation. Despite the variety of human cultures, and variations in the timing and mechanisms, the demographic transition has virtually always happened in the same sequence: mortality falls, then total population grows for a time, then fertility falls. The decline in mortality can be seen as the proximate cause of explosive population growth and the remote cause of fertility decline. Mortality decline was achieved more slowly in the early movers, and more abruptly in societies that started the transition later.
Population growth was therefore even more rapid in late starters. But every modernizing society has contributed to the dramatic expansion of human numbers.
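To see in rough terms why the growth was so explosive, a back-of-the-envelope illustration may help; the figures below are assumed round numbers of the kind cited above (a crude birth rate holding near forty per one thousand while the crude death rate falls to fifteen per one thousand), not any particular country’s statistics.

\[
r \approx \frac{\text{births} - \text{deaths}}{1{,}000} = \frac{40 - 15}{1{,}000} = 0.025 \text{ per year},
\qquad
t_{\text{doubling}} \approx \frac{\ln 2}{r} \approx \frac{0.693}{0.025} \approx 28 \text{ years}.
\]

On those assumptions a population doubles roughly once a generation, which is why the later and steeper a country’s mortality decline, the more dramatic the growth that followed.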
As the global majority experienced mortality decline, anxieties regarding overpopulation only intensified; these anxieties formed the explicit background of the academic scholarship on the demographic transition represented by Kingsley Davis’s study. Tellingly, the full title of his article refers to “the population specter.”
Population control—often tainted by racist and eugenic assumptions—became entwined with global development and health initiatives. A whole range of intrusive and sometimes involuntary measures were enacted, requiring sterilization and fertility limitation, such as the “one child” policy introduced in China in 1979. Some of these programs may have hastened the fertility decline. But ultimately the mortality decline has reliably triggered a fertility decline with a two- or three-generation lag, and in most of the countries that experienced midcentury improvements the demographic transition is now starting to reach its equilibrium.
By the 1960s and 1970s, a number of countries still seriously lagged behind in the control of infectious disease. There are several common ingredients. One is persistent conflict, civil war, or kleptocracy—often a legacy of colonialism and artificially drawn state boundaries—which have hindered economic growth, social development, and the adoption of public health measures.
Another is geography, because most of these nations are in the Old World tropics and bear the heaviest natural burden of infectious disease. Malaria and diarrheal diseases, which proved most obstinate in poor and rural areas, have remained overarching factors. Not all mosquito vectors are created equal, and malaria programs proved insufficient in regions with the species of Anopheles that are most adapted to bite humans. The environmental side effects of DDT made it less appealing from the late 1960s, and mosquitos have evolved resistance to insecticides as fast as new ones can be brought into use. In the context of these challenges, the tensions between broader development programs and technical interventions flared anew.
The single greatest reason for the slow pace of gains among the trailing countries, though, is the evolutionary misfortune of HIV and the conditions that allowed it to become a catastrophe.
Sadly, the AIDS pandemic was the global health calamity of the late twentieth century. Its emergence was enabled by the forces that brought the modern world into being. It remains a serious humanitarian crisis, falling hardest on sub-Saharan Africa. Recent progress in bringing the disease under partial control can be celebrated while recognizing that remaining gaps are both an affront to humanity and a lingering peril for global health.
Over the past generation, the fight against infectious disease has been shaped by structural shifts in the framework of the post–Cold War world. Notably, a system of “international health” gave way to the regime of “global health” in which nonstate actors, notably philanthropies, play a massive role in funding projects, measuring progress, and setting priorities. Simultaneously, the power and budget of the World Health Organization remains anemic relative to its vision and purpose. Critics of this order point to the lack of accountability inherent in a system where private actors loom so large. Defenders underline the value of entrepreneurial approaches, and they point to the measurable progress that has undoubtedly been achieved. Meanwhile, the old tension between holistic social welfare and targeted technical intervention persists. The United Nations’ Millennium Development Goals (affirmed in 2000) and subsequent Sustainable Development Goals (affirmed in 2015) represent a collaborative “blueprint” in which the control of infectious disease is part of a more encompassing vision of equality and human well-being.
Whether that blueprint can succeed—within the structures of endless growth that it presumes and the consequent environmental effects that are hard to control or predict—is the open question. Forebodingly, perhaps, the UN has noted that progress toward each of the development goals has already been gravely interrupted by a pandemic that “has quickly become the worst human and economic crisis of our lifetime.”
In 1991, in response to the rise of HIV and a range of other novel threats to human health, the U.S. Institute of Medicine commissioned an expert panel whose charge was to assess the future of infectious disease. The panel’s report, published the following year as Emerging Infections: Microbial Threats to Health in the United States, was a broadside against “complacency.” One of its cochairs, the Nobel Prize–winning microbiologist Joshua Lederberg, was a pioneering expert in microbial evolution. He knew better than anyone that nature does not stand still, and he had been sounding the alarm for years. “In the context of infectious diseases, there is nowhere in the world from which we are remote and no one from whom we are disconnected,” the report stated. The arsenal of magic bullets had given humans the upper hand, but this advantage was unstable. “Because of the evolutionary potential of many microbes,” the panel warned, “the use of these weapons may inadvertently contribute to the selection of certain mutations, adaptations, and migrations that enable pathogens to proliferate or nonpathogens to acquire virulence.”
The Institute’s report gave “emerging infectious diseases” a place in both scientific and public consciousness. This clarion call still echoes. Complacency and alarm have coexisted in an uneasy mix for the past generation. Sometimes our anxiety has been rational, leading to smarter investments in disease surveillance, global public health, international development, and basic science. Anxiety also manifests itself in collective fascination with new diseases, in sensationalist journalism, in lurid fiction and films, even in zombie apocalypses. Prophets in the line of Lederberg have continuously forewarned us that new diseases were one of the most fundamental risks we face as a species.
Now, the COVID-19 pandemic makes it all too evident that their alarms were both prescient and unheeded. We were, in short, complacent.
Some have attempted to portray the COVID-19 pandemic as a force majeure, an act of God. For scholars who study the past or present of infectious disease, however, the pandemic was a perfectly inevitable disaster and one that stands in a long line of emerging threats. No one could have known that a novel coronavirus would infect humans precisely in central China late in 2019. Yet it was bound to happen that some new pathogen would emerge and evade our collective defense systems. It was a reasonable likelihood that the culprit would be a highly contagious RNA virus of zoonotic origins spread via the respiratory route. In short, a destabilizing pandemic was inescapable and its contours predictable, even if its details were essentially random.
Indeed, the COVID-19 pandemic is part of a deep pattern shaped by the interplay of ecology and evolution. The combination of predictability and unpredictability, of structure and chance, of pattern and contingency, lies in the very nature of infectious disease. New diseases emerge at the meeting point between long-term ecological changes and constant but essentially random evolutionary processes. Humans have uniquely sped up ecological timescales, experiencing dizzying population growth and unparalleled resource usage. Humanity’s ecological supremacy is unnatural—or, perhaps more accurately, a force of nature itself. What we think of as a medical triumph—the control of infectious disease—is from a planetary perspective a truly novel, systemic breakdown of an ecological buffer. And, as ever, our parasites respond to the ecological circumstances we present them.
In 2000, atmospheric scientist Paul Crutzen and biologist Eugene Stoermer proposed that we are living in a new geological epoch characterized by humanity’s overarching influence on the earth’s natural systems: the Anthropocene. Even as it is still being debated by geologists, the Anthropocene has become an indispensable concept for thinking about the relationship between humanity and the environment. It recognizes the sheer dizzying novelty of the current experiment in human supremacy. And although climate change is often considered the preeminent environmental problem of the Anthropocene, humanity’s influence on planetary biota is as important as, and inseparable from, our influence on the physical climate. This influence extends to microorganisms, which now must learn to live on a human-dominated planet.
Evolution and ecology are the basic reasons we can never entirely escape the risk of global pandemics. New diseases often emerge when microorganisms that infect animals cross the species barrier and acquire the ability to transmit between humans. New strains of old diseases evolve in response to selective pressures that we place upon them. Antibiotic resistance, for example, is a form of evolutionary response to our ample use of a select number of chemical weapons against bacteria. Similarly, microbes have a strong incentive to change their outward appearance in order to escape from our vaccines. On basic Darwinian principles, those strains that acquire the ability to survive and reproduce in such an environment will pass on their genes to future generations—to our peril.
Indeed, over the past few generations, infectious diseases have emerged faster than ever. Anything that affects either the exposure to potential pathogens or the transmission dynamics of infectious disease will bear strongly on the Darwinian pressures and opportunities for disease-causing microbes. Our exposure to the threat of new diseases has never been greater simply because of our numbers. There are now nearly eight billion of us, and by midcentury there will be around ten. Moreover, the knock-on effects of demographic increase also widen our exposure to potential pathogens. Human land use continues to expand, and the human-animal interface is dominated by our need to feed ourselves and, in particular, the unslakable global demand for meat. Industrial farming, and especially meat production, creates evolutionary environments where new germs adapt and emerge. Chickens are now the most numerous bird on the planet; there are more than twenty-three billion of them. The hundreds of millions of cows and pigs brought together in unnatural aggregations are also an evolutionary stewpot of new germs. To make matters more perilous, the overuse of antibiotics in farm animals, largely to promote weight gain, supercharges the evolution of antibiotic resistance.
The way we live also shapes the transmission dynamics of infectious disease, in turn providing the context for the evolutionary prospects of our germs. Today we live more densely than at any time in the past. A number of cities have populations above twenty million; the population of Tokyo today is at least double the number of humans that existed on the entire planet at the beginning of the Holocene epoch. Moreover, we are more interconnected and interdependent than ever before. The jet airplane is a transportation technology of fundamental epidemiological significance, on par with the horse, the steamship, and the railroad.
Over the past generation, the fight against infectious disease has been shaped by structural shifts in the framework of the post–Cold War world. Notably, a system of “international health” gave way to the regime of “global health” in which nonstate actors, notably philanthropies, play a massive role in funding projects, measuring progress, and setting priorities. Simultaneously, the power and budget of the World Health Organization remains anemic relative to its vision and purpose. Critics of this order point to the lack of accountability inherent in a system where private actors loom so large. Defenders underline the value of entrepreneurial approaches, and they point to the measurable progress that has undoubtedly been achieved. Meanwhile, the old tension between holistic social welfare and targeted technical intervention persists. The United Nations’ Millennium Development Goals (affirmed in 2000) and subsequent Sustainable Development Goals (affirmed in 2015) represent a collaborative “blueprint” in which the control of infectious disease is part of a more encompassing vision of equality and human well-being.
Whether that blueprint can succeed—within the structures of endless growth that it presumes and the consequent environmental effects that are hard to control or predict—is the open question. Forebodingly, perhaps, the UN has noted that progress toward each of the development goals has already been gravely interrupted by a pandemic that “has quickly become the worst human and economic crisis of our lifetime.”
In 1991, in response to the rise of HIV and a range of other novel threats to human health, the U.S. Institute of Medicine commissioned an expert panel whose charge was to assess the future of infectious disease. The panel’s report, published the following year as Emerging Infections: Microbial Threats to Health in the United States, was a broadside against “complacency.” One of its cochairs, the Nobel Prize–winning microbiologist Joshua Lederberg, was a pioneering expert in microbial evolution. He knew better than anyone that nature does not stand still, and he had been sounding the alarm for years. “In the context of infectious diseases, there is nowhere in the world from which we are remote and no one from whom we are disconnected,” the report stated. The arsenal of magic bullets had given humans the upper hand, but this advantage was unstable. “Because of the evolutionary potential of many microbes,” the panel warned, “the use of these weapons may inadvertently contribute to the selection of certain mutations, adaptations, and migrations that enable pathogens to proliferate or nonpathogens to acquire virulence.”
The Institute’s report gave “emerging infectious diseases” a place in both scientific and public consciousness. This clarion call still echoes. Complacency and alarm have coexisted in an uneasy mix for the past generation. Sometimes our anxiety has been rational, leading to smarter investments in disease surveillance, global public health, international development, and basic science. Anxiety also manifests itself in collective fascination with new diseases, in sensationalist journalism, in lurid fiction and films, even in zombie apocalypses. Prophets in the line of Lederberg have continuously forewarned us that new diseases were one of the most fundamental risks we face as a species.
Now, the COVID-19 pandemic makes it all too evident that their alarms were both prescient and unheeded. We were, in short, complacent.
Some have attempted to portray the COVID-19 pandemic as a force majeure, an act of God. For scholars who study the past or present of infectious disease, however, the pandemic was a perfectly inevitable disaster, one that stands in a long line of emerging threats. No one could have known that a novel coronavirus would infect humans precisely in central China late in 2019. Yet it was bound to happen that some new pathogen would emerge and evade our collective defense systems. It was reasonably likely that the culprit would be a highly contagious RNA virus of zoonotic origins spread via the respiratory route. In short, a destabilizing pandemic was inescapable and its contours predictable, even if its details were essentially random.
Indeed, the COVID-19 pandemic is part of a deep pattern shaped by the interplay of ecology and evolution. The combination of predictability and unpredictability, of structure and chance, of pattern and contingency, lies in the very nature of infectious disease. New diseases emerge at the meeting point between long-term ecological changes and constant but essentially random evolutionary processes. Humans have uniquely sped up ecological timescales, experiencing dizzying population growth and unparalleled resource usage. Humanity’s ecological supremacy is unnatural—or, perhaps more accurately, a force of nature itself. What we think of as a medical triumph—the control of infectious disease—is from a planetary perspective a truly novel, systemic breakdown of an ecological buffer. And, as ever, our parasites respond to the ecological circumstances we present them.
In 2000, atmospheric scientist Paul Crutzen and biologist Eugene Stoermer proposed that we are living in a new geological epoch characterized by humanity’s overarching influence on the earth’s natural systems: the Anthropocene. Even as it is still being debated by geologists, the Anthropocene has become an indispensable concept for thinking about the relationship between humanity and the environment. It recognizes the sheer dizzying novelty of the current experiment in human supremacy. And although climate change is often considered the preeminent environmental problem of the Anthropocene, humanity’s influence on planetary biota is as important as, and inseparable from, our influence on the physical climate. This influence extends to microorganisms, which now must learn to live on a human-dominated planet.
Evolution and ecology are the basic reasons we can never entirely escape the risk of global pandemics. New diseases often emerge when microorganisms that infect animals cross the species barrier and acquire the ability to transmit between humans. New strains of old diseases evolve in response to selective pressures that we place upon them. Antibiotic resistance, for example, is a form of evolutionary response to our ample use of a select number of chemical weapons against bacteria. Similarly, microbes have a strong incentive to change their outward appearance in order to escape our vaccines. On basic Darwinian principles, those strains that acquire the ability to survive and reproduce in such an environment will pass on their genes to future generations—to our peril.
Indeed, over the past few generations, infectious diseases have emerged faster than ever. Anything that affects either the exposure to potential pathogens or the transmission dynamics of infectious disease will bear strongly on the Darwinian pressures and opportunities for disease-causing microbes. Our exposure to the threat of new diseases has never been greater simply because of our numbers. There are now nearly eight billion of us, and by midcentury there will be around ten. Moreover, the knock-on effects of demographic increase also widen our exposure to potential pathogens. Human land use continues to expand, and the human-animal interface is dominated by our need to feed ourselves and, in particular, the unslakable global demand for meat. Industrial farming, and especially meat production, creates evolutionary environments where new germs adapt and emerge. Chickens are now the most numerous bird on the planet; there are more than twenty-three billion of them. The hundreds of millions of cows and pigs brought together in unnatural aggregations are also an evolutionary stewpot of new germs. To make matters more perilous, the overuse of antibiotics in farm animals, largely to promote weight gain, supercharges the evolution of antibiotic resistance.
The way we live also shapes the transmission dynamics of infectious disease, in turn providing the context for the evolutionary prospects of our germs. Today we live more densely than at any time in the past. A number of cities have populations above twenty million; the population of Tokyo today is at least double the number of humans that existed on the entire planet at the beginning of the Holocene epoch. Moreover, we are more interconnected and interdependent than ever before. The jet airplane is a transportation technology of fundamental epidemiological significance, on par with the horse, the steamship, and the railroad.
Since the mid-twentieth century, international travel has grown continuously. It has become possible to travel between faraway points on the planet during the incubation phase of many infections. The 1968 influenza, which claimed more than one million victims globally, might be said to mark the first pandemic of the jet age. Biosecurity policy and international cooperation have not caught up to the new reality.
There are also environmental and human wild cards that will shape the future of infectious disease. Climate change is one. As greenhouse gas emissions heat the planet, the biological repercussions are unpredictable. Warming is likely to expand the territorial range of vector species. Mosquitos that transmit malaria, dengue fever, Chikungunya fever, yellow fever, Zika fever, and so on may find that they can thrive at higher latitudes and higher elevations for more of the year, with consequences for human health. Similarly, regional changes in patterns of precipitation are likely to alter the breeding habitats of mosquitos and thus affect the geography of infectious disease. And the vicious cycle of climate crisis, subsistence migration, and violent conflict remains a multifaceted threat to health that may be no easier to contain in the future than it has been throughout our past. Finally, if continuing demographic and economic growth pushes the earth system beyond sustainable boundaries in the Anthropocene, then the dream of limitless global development will fail, and future convergence in living standards and life expectancy will be unlikely.
A final wild card is biological warfare. The intentional weaponization of biological agents is within humanity’s technical capacity. In principle, existing or potential pathogens could be genetically manipulated in the laboratory to be made more deadly or more contagious. Although the deployment of biological weapons is a violation of international law and moral norms, the ability of global institutions to enforce these legal and ethical standards is highly limited. Only a few states have traditionally had the wherewithal and infrastructure for advanced programs of research, but the democratization of knowledge means that a growing number of states have become, or will become, players in biological research that is either directly or indirectly relevant to the weaponization of disease. Moreover, the potential for nonstate actors to develop or deploy biological weapons will only continue to increase.
When it comes to infectious disease, we might prove to be our own worst enemy.
It is too soon to know whether the COVID-19 pandemic is a painful lesson on our path to even greater control over microbial threats or the harbinger of a new period in which our freedom from infectious disease becomes increasingly less secure. The unsettling truth about the current pandemic is that it could be much worse. Diseases with higher case fatality rates, like Ebola or the first SARS, have so far proven less effective at transmitting, or more susceptible to our interventions. Even though SARS-CoV-2 emerged in just the manner we were warned was most predictable, it has still managed to buckle our systems of response. While it is disquieting to consider, the long history of disease counsels us to expect the unexpected.
In other words, the worst threat may be the one we cannot see coming.
William McNeill’s influential history of disease, Plagues and Peoples (1976), closes with words that remain as true today as they were when he wrote them four and a half decades ago. “Infectious disease which antedated the emergence of humankind will last as long as humanity itself, and will surely remain, as it has been hitherto, one of the fundamental parameters and determinants of human history,” he wrote. Human mastery over nature may wax and wane, as the unprecedented experiment in planetary dominance plays out. Paradoxically, we are in some ways more fragile than our ancestors, precisely because our societies depend on a level of security against infectious disease that may be unrealistic.
Even though our tools of control are vastly more powerful than in the past, we still have much to learn from the experience of those who lived and died before us.
It is urgent that we do so. Our salvation in the long term is unlikely to be technical, or at least not purely so. The future of human health is, as ever, a question of values and cooperation. It remains a matter of both chance and choice, and the role of history is to help us see our choices a little more clearly, by seeing ourselves a little more clearly: as one species, ingenious and vulnerable, whose health is intimately dependent on each other and on the planet we share with our invisible companions.