Vacated niches, competitive release and the community ecology of pathogen eradication

James O. Lloyd-Smith

Abstract

A recurring theme in the epidemiological literature on disease eradication is that each pathogen occupies an ecological niche, and eradication of one pathogen leaves a vacant niche that favours the emergence of new pathogens to replace it. However, eminent figures have rejected this view unequivocally, stating that there is no basis to fear pathogen replacement and even that pathogen niches do not exist. After exploring the roots of this controversy, I propose resolutions to disputed issues by drawing on broader ecological theory, and advance a new consensus based on robust mechanistic principles. I argue that pathogen eradication (and cessation of vaccination) leads to a ‘vacated niche’, which could be re-invaded by the original pathogen if introduced. Consequences for other pathogens will vary, with the crucial mechanisms being competitive release, whereby the decline of one species allows its competitors to perform better, and evolutionary adaptation. Hence, eradication can cause a quantitative rise in the incidence of another infection, but whether this leads to emergence as an endemic pathogen depends on additional factors. I focus on the case study of human monkeypox and its rise following smallpox eradication, but also survey how these ideas apply to other pathogens and discuss implications for eradication policy.

1. Introduction

I can see no reason why any other virus should ever occupy the smallpox ‘niche’ now that it has been vacated.
Fenner [1, p. 481]

Could monkeypox or cowpox virus emerge from its natural reservoir to become a fully human-adapted pathogen, occupying the ecologic niche vacated by the eradication of smallpox? We cannot know the answer, but doubt about the possibility should be tempered by the realization that smallpox itself must once have been a zoonosis.
Bray [2, p. 500]

A similar threat exists for both rinderpest and measles: global eradication may create niches for other morbilliviruses to fill.
de Swart et al. [3, p. 333]

Nature abhors a vacuum.
Aristotle

Since the era of smallpox eradication, a debate has smouldered over the possible effects that eradicating one pathogen may have on the emergence of others. The debate is centred on the idea that an ecological niche, once occupied by the eradicated pathogen, is left vacant. What impact might this have on the remaining community of pathogens? What is the likelihood that another pathogen will move in to ‘replace’ the eradicated pathogen? Is this a serious possibility, to be studied and quantified, or is it a hypothetical that distracts from the real operational issues of eradication and post-eradication planning?

These questions, and many others, were circulating when Fenner [1] staked out his position dismissing the possibility that another virus could capitalize on the vacated smallpox niche. Over ensuing decades, the hypothesis that a ‘replacement’ pathogen may fill a vacated ecological niche was criticized further. The proceedings of the Dahlem Conference on eradication contained two articles taking aim at the notion [4,5], stating that ‘Some authors have contended that if a disease is eradicated, another will arise to fill that ecologic niche. We believe that there is no basis for this belief’ [4, p. 3]. On the specific question of whether monkeypox virus might ‘soon take over the ecologic niche left vacant by smallpox’, Breman & Henderson [6] found that ‘available data do not support this possibility’ (p. 556). An article seeking to clarify ecological concepts for parasitologists endorsed the view that vacant niches do not even exist: ‘although extinct species had niches, there can be no unfilled niches’ [7, citing 8, p. 30].

Yet the idea persists, and it appears to be gaining currency. The possible risks arising from the niche vacated by smallpox eradication are invoked routinely, with reference to zoonotic poxviruses or smallpox bioterrorism [2,9–11]. The case of monkeypox virus has garnered particular attention, as epidemiological studies in the Congo basin have identified dramatic increases in the incidence of human infection since the 1980s [2,11–13]. Even as the march towards poliovirus eradication presses forward, a study provocatively titled ‘Will the polio niche remain vacant?’ [14] has been followed by molecular studies examining the risks from related enteroviruses [15,16]. With the eradication of rinderpest complete and that of measles being contemplated, similar questions arise for morbilliviruses [3,17]. Niche-based mechanisms have been proposed for the rise of zoonotic malaria caused by Plasmodium knowlesi [18], and invoked as a legitimate concern in planning for possible dengue eradication [19]. Finally, concepts of vacated niches and strain replacement are commonly used to explain shifting epidemiological patterns after strain-specific vaccines are deployed against multi-strain pathogens [20,21].

Thus the literature stands in a state of disagreement on the question of whether niches vacated by pathogen eradication lead to significant risks for future pathogen emergence—and indeed whether such niches exist. Distinguished epidemiologists who played leading roles in past eradication programmes have rejected the idea that vacated niches matter, but it is evident that the notion resonates intuitively for many infectious disease researchers and continues to have influence in the field. In this article I aim to untangle the roots of this conflict, and to identify the underlying principles that are robust. I begin by summarizing the main critiques of the risks from replacement pathogens and vacant niches following eradications. I review the controversial history of the niche concept in the ecological literature, including its multiple and conflicting definitions, and summarize some more recent developments in ecological theory. Using monkeypox as a central case study, I argue that the niche and associated ecological ideas provide a useful framework to assess the possible impacts of widespread control of one pathogen on the surrounding pathogen community. I propose the term ‘vacated niche’ to describe the gap left by an eradication, and delineate the factors governing which—if any—unintended consequences may arise. I conclude by surveying the relevance of these ideas to a range of infectious diseases, and discussing the implications for eradication and post-eradication planning.

2. Roots of the controversy: the case against niches

To move towards a reconciled picture, it is useful to review the basis and context for the dissenting views. Fenner's 1981 statement was in response to an article by Yekutiel [22], which put forward factors to be considered in choosing whether to pursue eradication of a disease. Yekutiel described a biological argument against eradication, namely the concern that elimination of one disease ‘may be followed by its replacement by other pathogens producing similar disease syndromes’ [22, p. 466]. Yekutiel opposed the indiscriminate application of this view, citing examples of disease agents that had been regionally eliminated for 75 years with no signs of replacement pathogens, and stating his view that intensive epidemiological studies had shown insignificant danger that smallpox or malaria (another erstwhile target for global eradication) would be replaced by similar organisms from monkeys [22].

Fenner [1] wrote a response, stating that he knew of no evidence supporting the idea of replacement pathogens. He pointed to controversy among ecologists as to the validity of the niche concept for plants and animals, and stated his view that it does not apply to viruses or (probably) to other microbes. To support this view, he cited the existence of humans for tens of thousands of years in Australia and the Americas without measles, smallpox or similar diseases. ‘The ‘niches’ were there, but they were unoccupied’ (p. 481). This led to his conclusion that no other virus should ever occupy the vacated smallpox niche. Finally, he emphasized that human monkeypox is not a new disease that might occupy the smallpox niche, but rather a zoonosis, probably ancient, whose occasional occurrences are newly recognized because of the eradication of smallpox. (For good measure, he made a parallel argument about non-polio enteroviruses that occasionally cause paralytic disease, pointing to the lack of evidence that their frequency had increased in regions where wild polioviruses had been eliminated.)

The two articles from the Dahlem Conference took a similarly strident position against the notion of vacated niches and associated risks from replacement pathogens, but glimmers of possible reconciliation began to appear. First, it is noteworthy that the strongest words of opposition were reserved for very forceful (arguably straw-man) statements of the niche argument, such as ‘It is sometimes argued that each infectious agent occupies an ecological niche and that removal of a species from that niche will inevitably result in adaptive changes in other species, either currently prevalent or newly emerging, to fill that niche, i.e. eradication is pointless since new infections will arise’ [5, p. 56]. Few biologists would agree with this extreme statement of the niche theory (particularly ‘inevitably’ and ‘pointless’), so it is not surprising that the authors rejected it. However, they went on to make more measured arguments. Fenner et al. [4] acknowledged that there was natural concern about the vacated smallpox niche when monkeypox virus was first discovered to infect humans in Africa in 1970. They then reviewed subsequent genetic and epidemiological evidence, showing that monkeypox is not a direct ancestor of smallpox and transmissibility among humans had been concluded to be too low to allow establishment in human populations [23]. (These arguments run parallel to those cited by Breman & Henderson [6] in dismissing any imminent risk from monkeypox.) Together with the fact that numerous countries had remained free of smallpox and other diseases for extended periods, Fenner et al. [4] were left sceptical of the niche hypothesis.

Ottesen et al. [5] also took a sceptical tone, but discussed the existence of pathogen niches and introduced the idea of resource competition among pathogens. They emphasized that there is no evidence that vacated niches are beneficial to other pathogens ‘on a relevant time scale’ (their italics) [5, p. 56]. For support, they again cited the lack of replacements for smallpox (then eradicated 20 years prior), polioviruses in the Americas (‘several years’), or rabies in Britain (approx. 100 years). Curiously, they cited the emergence of novel pathogens such as HIV as evidence that ‘niches may remain unfilled for very long periods’ [5, p. 56], which they attributed to adaptive barriers to developing transmissibility. In a significant development, they went on to discuss the competition between pathogen species, or strains within species, whereby reduction in the incidence of one pathogen leads to increased incidence of another that is antigenically related. Without stating it explicitly, this invoked the idea that the pathogen species are competing for a shared resource of hosts that lack cross-immunizing antibodies. Ottesen et al. borrowed from community ecology to apply the term ‘competitive release’ [5, p. 57], which describes the phenomenon whereby one species becomes more successful when a competing species declines in abundance (table 1). However, while they said some evidence suggests this relationship applies to yaws and syphilis, they did not find it likely that smallpox eradication led to additional human monkeypox infections (instead it was ‘much more likely that humans have always been accidental hosts for monkeypox’ [5, p. 56–57]). Furthermore, they argued that any replacement pathogens will be less well adapted and hence easier to control or eradicate, and that it would be unethical to curtail eradication programmes because of hypothetical rises in other pathogens.

Table 1. Definitions of key terms.

A contemporary viewpoint, and the benefit of hindsight, allow one to take issue with some of the specific arguments summarized above. Research on pathogen emergence and disease ecology has advanced substantially in recent decades, and the ways in which cross-species transmission, pathogen adaptation, and human social and land-use changes can combine to cause viral host jumps are better understood [24–26]. At the same time, there are plain truths behind some of the statements above, which need to be incorporated into any new synthesis on this topic. Also, Fenner had a valid point that ecologists themselves have struggled with the validity and applicability of the niche concept, and indeed multiple, sometimes conflicting, definitions of the ecological niche are in common usage. Given that even the niche sceptics recognize the basic principle of competition among pathogen species or strains, the dispute over the vacated niche hypothesis may be largely a matter of semantics. Therefore, I will review the relevant dimensions of the niche concept in the ecological literature, including recent developments in the theory, before considering how these principles apply to our current understanding of pathogen eradication, competition and competitive release.

3. The concept(s) of niche in ecology

The niche concept occupies a curious position in ecology, simultaneously lauded as a central organizing principle and damned for its ambiguous and varied definitions. Of many memorable quotations about the niche [27], this one is most relevant to the current discussion:

No concept in ecology has been more variously defined or more universally confused than ‘niche’.
Real & Levin [28, p. 180]

It is no wonder that epidemiologists also cannot agree on it. In recent years, the theory has been reinvigorated by the syntheses offered by Chase & Leibold [27] and Peterson et al. [29], but instead of a unified niche concept we now have a proliferation of ‘types’ of niche (Grinnellian, Eltonian and Hutchinsonian niches [29,30], or recess/role, population-persistence and resource-utilization niches [31]) aiming to clarify the discourse. There is no simple definition, so there will be no simple resolution to the debate considered here. Instead, my aim is to summarize the major themes underlying niche concepts in ecology, and to draw upon recent theoretical developments to clarify whether, and how, these ideas can inform our understanding of pathogen eradication.

(a) A taxonomy of niches

In attempting to systematize the profusion of niche concepts, ecologists have proposed a number of (more or less) dichotomous keys. Most authors recognize two broad classes of niche concepts: one centred on the environmental requirements necessary for a species to thrive, and another focused on the role of a species in its community and its impacts on that community [27,32]. The environmental parameters that define a niche (sometimes called niche dimensions or niche variables) can be divided into abiotic versus biotic components. Alternatively, and not quite equivalently [29], they can be divided into linked variables, which are typically resources that are dynamically consumed and hence the object of competition, versus unlinked variables that are not consumed, such as temperature or vegetation structure (Hutchinson called these bionomic and scenopoetic (for ‘setting the stage’), respectively [33]). A final, crucial distinction is whether the niche is viewed to be a property of a species or of the environment.

The seminal contribution of Grinnell [34] spanned the requirement and role components of the niche concept, describing the habitat requirements, physiological tolerances, feeding habits and interspecific interactions of the California thrasher. Elton [35], writing independently on animal ecology, presented the niche of a species as its functional role within a community, and especially its position in the food chain and its effects on other species. While the niche concepts of Grinnell and Elton differed significantly in their emphases, both used the term to describe a place or ‘recess’ in the environment that had potential to support a species with appropriate characteristics [31,32]. (‘Recess’ is used in the sense of ‘nook’, giving rise to the term niche.) Schoener [31] calls this the recess/role niche concept; importantly it is a property of the environment not the subject species, so in principle ecologically equivalent species can occupy the same niche in different geographical areas.

Hutchinson [36] proposed a new niche concept that rejected this view, defining the niche in terms of the demographic success of a focal species. He defined the ‘fundamental niche’ of a species as the complete set of all environmental conditions (formalized as an n-dimensional hypervolume of environmental variables) ‘which would permit the species to exist indefinitely’ (p. 416). This definition is customarily interpreted to mean conditions where the instantaneous growth rate of the population is positive, so that a population will grow and persist if introduced [27, though see 30,31]. Hutchinson's ‘fundamental niche’ was defined in the absence of other species (except those that are resources for the focal species), and falls into the class of niche concepts focused on environmental requirements. He went on, however, to define the ‘realized niche’ as the subset of the fundamental niche that a species occupies in the presence of competing species.
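These definitions can be stated compactly. The following is a minimal formalization in my own notation (not Hutchinson's), with E the space of environmental variables and r the per capita growth rate of the focal species:

```latex
% Minimal formalization of Hutchinson's niche definitions (notation is illustrative only)
\[
  \text{fundamental niche:}\quad N_F = \{\, \mathbf{x} \in E : r(\mathbf{x}) > 0 \,\},
\]
\[
  \text{realized niche:}\quad N_R = \{\, \mathbf{x} \in N_F : r(\mathbf{x} \mid \text{competitors present}) > 0 \,\} \subseteq N_F .
\]
```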

Hutchinson's niche concept has had enormous influence, but has been criticized for its over-emphasis on competition to the exclusion of other community interactions, and for the lack of explicit feedbacks that depict how a species impacts its environment. Chase & Leibold [27] sought to address these challenges, and to present a unified niche concept that embraces the dual interpretations of niches as environmental requirements and as impacts on the community. Their framework is posed in terms of zero-net-growth isoclines (describing which environmental conditions support population growth) and impact vectors (describing how that growth affects the environment), and can encompass any species interaction that can be specified with a mechanistic mathematical model. Schoener [31] groups this approach with Hutchinson's as the population-persistence niche concept.

MacArthur and Levins led an influential movement known as ‘niche theory’ that analysed how many species could coexist in a given community, based on the breadth and overlap of resources used by each species [37]. The theory was built on a foundation of mathematical models of interspecific competition, but spawned extensive empirical studies that quantified the resource use of different species along various dimensions. Schoener [31] terms the associated niche concept the resource-utilization concept and emphasizes its operational utility. Core ideas included niche partitioning, wherein species arrange themselves (via evolutionary or behavioural mechanisms) on a resource axis, and limiting similarity, which was the highest degree of niche overlap that enabled two species to coexist. In its heyday in the 1970s, niche theory influenced research across many domains of ecology. In the early 1980s it suffered a dramatic fall from grace, in a turbulent period marked by bitter feuding about study design and controversy over the presumed dominance of competition as an ecological force [38]. This spectacle, which triggered a 20-year decline in the application of niche concepts in ecology, may have influenced Fenner [1] when he rejected the validity of the niche for eradication problems in 1981.

(b) Source–sink dynamics and dispersal limitation

Ecology emerged from these controversies with a broader worldview, and research in the 1980s and 1990s shifted its emphasis to new problems. One growth area was the investigation of spatial patterns and processes, playing out across landscapes or on disconnected habitat patches. Pulliam [39] considered the influence of heterogeneities among patches, including ‘source’ habitats where population growth rates are positive and ‘sink’ habitats where growth rates are negative. He pointed out that populations can exist indefinitely in sink habitats, if they are maintained by continued immigration from nearby sources. In this scenario, he argued, the realized niche could be said to be larger than the fundamental niche, since the range of habitats continually occupied by the species is greater than the range that supports positive population growth.

In a follow-up study, Pulliam [32] provided a fuller exploration of how dispersal patterns and habitat heterogeneity influence the relationship between the niche and distribution of a species. He argued that many species may often be present in unsuitable habitat (i.e. outside their fundamental niches) because of on-going immigration, but this is difficult to prove definitively because it requires precise measurement of demographic rates and experimental confirmation of the role of immigration. Conversely, he emphasized that limited dispersal abilities often mean that species are absent from suitable habitat, and cited examples at numerous spatial scales where such dispersal limitation has been demonstrated.

(c) Empty niches?

A final point of contention (and confusion) in the ecological literature concerns whether it is meaningful to discuss empty niches. The answer varies, depending on which niche concept is applied. The recess/role niche, with its tenet that the niche is a property of the environment, allows the possibility of empty niches, though they must be defined with respect to some reference species. Meanwhile the niche theory of community assembly, based on the resource-utilization niche, posits a certain maximum diversity of species in an environment based on the number and breadth of resource dimensions. In this paradigm, a community that is unsaturated can be thought to have empty niches, but these niches do not have specific properties beyond the existence of excess resources that enable another species to be squeezed in [40].

The population-persistence niche of Hutchinson and others takes a stricter view. Because of its focus on species properties, the Hutchinsonian niche concept is widely interpreted to exclude the possibility of empty niches [8,31,32]. Indeed, it was their adherence to Hutchinson's concept that led Bush et al. [7] to advise parasitologists to discard the notion of empty niches. They conceded that the term would probably continue to be used, to describe the absence of a parasite species from one system when the species is known to be present in a similar system elsewhere, and probably would be interpreted to mean that resources are not limiting in the focal system. They pointed out that the term ‘empty niche’ is not needed to make this argument, nor is the conclusion about resources necessarily valid because some other factor may prevent the species from completing its life cycle in the unoccupied site. In light of Pulliam's work [32], it is also important to note that dispersal limitation is a common cause of the absence of a species from apparently suitable habitat.

(d) Revisiting the critiques

We can now reassess the critiques lodged against using the niche concept to assess the possible consequences of eradication programmes. First, Fenner questioned whether the niche concept can be applied to viruses at all. Because a virus can be viewed as a consumer species, with host species as its resource(s), the ecological niche concepts with their strong emphasis on consumer-resource interactions seem well suited. Viruses also have requirements for abiotic conditions such as humidity or temperature, and exhibit all sorts of interspecific interactions with hosts, vectors and other microbes, giving them a recess/role in their community. From a Hutchinsonian perspective, the requirement for demographic growth defines the fundamental niche for a given virus as that set of conditions where the basic reproductive number, R0, exceeds 1. Even the resource-utilization niche could be applied, by placing bounds on the degree of cross-immunity that enables two viruses to coexist in a host population. It is noteworthy that the International Committee on Taxonomy of Viruses defines a virus species as ‘a polythetic class of viruses that constitutes a replicating lineage and occupies a particular ecological niche’ [41]. It is not clear which niche concept they have in mind, but the broader point seems to be agreed.
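For a directly transmitted virus, the boundary of such a fundamental niche can be sketched in familiar SIR terms. The expression below is a generic illustration, in which the transmission rate β, recovery rate γ and susceptible host density S are symbols for exposition rather than quantities estimated in this article:

```latex
% Illustrative SIR-type statement of the fundamental-niche boundary for a directly transmitted virus
\[
  R_0 = \frac{\beta S}{\gamma} > 1
  \quad\Longleftrightarrow\quad
  S > \frac{\gamma}{\beta},
\]
% where \beta is the transmission rate, \gamma the recovery rate and S the density of susceptible
% hosts; abiotic niche dimensions (e.g. humidity, temperature) enter through their effects on \beta.
```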

Moving to the problem of eradication, Fenner made two historical arguments. He (and others) argued that humans had existed free of smallpox for millennia on several continents, seemingly showing that unoccupied niches pose no hazard for pathogen emergence. He also emphasized that monkeypox is not a new disease, and has probably been spilling into human populations since ancient times. In fact, the second point annuls the first. The absence of smallpox (and measles and many other infections) from the Americas and Australia is clearly an example of dispersal limitation. These viruses emerged in humans in the Old World, from animal hosts present only in the Old World [42]. The devastating epidemics that occurred when they were finally introduced to naive populations on other continents are ample testament to the hazards arising from large pools of susceptible humans. In contrast, the long history of frequent animal-to-human spillover of monkeypox virus means that introduction is not a limiting factor in these locations, and we should expect no such delay in the response of monkeypox to the expanding resource of susceptible humans. Monkeypox is always knocking on the door, and crossing the threshold for brief visits; what we do not know is whether it will ever take up residence (i.e. whether it can have R0 > 1 in any human society).

The remaining criticisms of the niche hypothesis are directed at overly strong statements of its implications for pathogen replacement. These points are valid. There is no basis, in general, to say that a new pathogen ‘will arise’ to fill a vacated niche, never mind ‘by an etiologic agent of equal or greater pathogenicity’ [4, p. 14]. As stated above, no evolutionary biologist would say that an available niche ‘will inevitably result in adaptive changes’ in other pathogen species [5, p. 56]. Finally, there is no basis to say that the possibility of replacement pathogens necessarily renders eradication efforts ‘pointless’ (though one can imagine extreme examples where this is probably true) [4,5]. It is important to note, though, that these usages are almost always rhetorical flourishes of authors arguing against the niche hypothesis. Perhaps these statements (often phrased ‘Some have argued … ’ [4, p. 14]) are based on impassioned discussions in meeting rooms around the world. In the published literature, statements of the risk from niches vacated by eradication tend to be appropriately cautious.

To move forward, it is useful to build upon the points of agreement outlined in §2. Ottesen et al. [5] frame the problem in terms of competition between pathogen species, and the potential for competitive release of one species owing to eradication of another. Fenner [1], in his 1981 remarks about non-polio enteroviruses, identifies the need to look for a quantitative rise in incidence as the first sign of competitive release. In §4, I will develop these arguments and consider their implications for the case study of monkeypox and its relation to smallpox eradication.

4. Pathogen eradication, vacated niches and competitive release

(a) A brief history of monkeypox

Monkeypox virus was discovered in laboratory monkeys in 1959. It is an orthopoxvirus, related to variola virus (the agent of smallpox), though not as closely as other species in the genus [43]. Monkeypox infections of humans were first discovered in 1970 in the Congo basin, after smallpox had been locally eliminated. Human monkeypox resembles smallpox in its clinical presentation, though its case fatality rates are lower (estimated to be 2–10% for the central African clade of the virus) [10,44]. A zoonosis, it causes sporadic clusters of cases in human populations in affected areas of central and West Africa. It exhibits limited human-to-human transmission, causing ‘stuttering chains’ of transmission that can last several generations but invariably go extinct [45]. In the animal reservoir, there is serological and epidemiological evidence that numerous species, including rodents, antelope and monkeys, are naturally infected by monkeypox virus. The true maintenance hosts (i.e. the species responsible for the virus's persistence in the landscape [46]) are unknown, but squirrels and other rodents are suspected [44,47].

After smallpox was eradicated by a tremendous global vaccination campaign, the World Health Organization led an intensified surveillance campaign to assess the risk posed by monkeypox virus. This is the first and best-known example of the concern that a ‘replacement pathogen’ might fill the niche vacated by an eradication effort. In a remarkable effort, from 1981 to 1986 surveillance teams characterized the epidemiology of monkeypox in rural areas of the Democratic Republic of Congo (DRC; then Zaire). They observed frequent animal-to-human spillover of the virus, and quantified the secondary attack rate among humans. All observed transmission chains were short, but their results showed that past smallpox vaccination gave strong cross-protective immunity against monkeypox, leading to concern that monkeypox might establish sustained transmission in an unvaccinated population. In a pioneering analysis, Jezek et al. [48] extrapolated their findings to show that—if nothing changed except the proportion vaccinated in the population—monkeypox would not establish sustained transmission among humans. A follow-up article re-stated this finding in terms of the basic reproductive number, reporting that R0 for monkeypox in an unvaccinated population (in rural DRC) was approximately 0.83, less than the threshold value of 1 needed for persistent circulation [23]. This conclusion influenced the decision to cease intensified surveillance for monkeypox and maintain the cessation of smallpox vaccination.
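The logic of that extrapolation can be illustrated with a simple scaling argument: under homogeneous mixing, the reproductive number observed in a partly vaccinated population is the fully susceptible R0 discounted by the effectively immune fraction. The sketch below uses rounded placeholder values rather than the published inputs of Jezek et al. [48] or Fine et al. [23]; the function and parameter names are mine.

```python
# Hedged illustration with placeholder numbers: extrapolating the reproductive number
# observed under partial smallpox-vaccine coverage to a fully unvaccinated population,
# assuming homogeneous mixing, so R_observed = R0_unvaccinated * (1 - coverage * cross_protection).

def r0_unvaccinated(r_observed: float, coverage: float, cross_protection: float) -> float:
    """Back out R0 in a fully susceptible population from the observed reproductive number."""
    effectively_immune = coverage * cross_protection
    return r_observed / (1.0 - effectively_immune)

# Placeholder inputs chosen only to show the arithmetic (not the published estimates):
r_obs = 0.32        # reproductive number observed during 1980s surveillance (illustrative)
coverage = 0.70     # fraction of the population ever vaccinated against smallpox (illustrative)
protection = 0.85   # cross-protection of smallpox vaccine against monkeypox (illustrative)

print(f"Implied R0 with nobody vaccinated: {r0_unvaccinated(r_obs, coverage, protection):.2f}")
# With these inputs the implied value is ~0.79, below the threshold of 1, echoing the
# published conclusion that R0 was approximately 0.83 in rural DRC [23].
```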

Following a long period of passive and sporadic surveillance effort, an intensified surveillance programme re-started in the central DRC over the period 2005–2007. Its findings showed a dramatic rise in the incidence of human monkeypox, concentrated in the younger age groups that were born since the cessation of smallpox vaccination programmes [13] (figure 1). In older age groups, unvaccinated individuals had a 5.2-fold higher risk of monkeypox than vaccinated individuals. Hundreds of isolated case clusters were observed in this 2-year period, indicating frequent animal-to-human transmission. While quantitative comparisons are challenged by different surveillance methodologies and shifting demographic conditions, the one health zone covered by intensified surveillance in both the 1980s and 2000s showed a 20-fold rise (95% CI: 14–29 fold) in per capita incidence over this period (and surveillance efforts were, if anything, weaker in the later period) [13]. Teams working in other central and western African countries have reported long chains of human-to-human transmission [49], and surprisingly high seroprevalence of anti-orthopoxvirus antibodies in unvaccinated populations [47,50]. Further empirical research is badly needed, particularly to quantify transmissibility, but the strong age structure of incidence (and especially its shift over recent decades) demonstrates clearly that the expanding pool of orthopoxvirus-naive humans has led to more cases of human monkeypox (figure 1).

Figure 1.

Monkeypox incidence and smallpox vaccine coverage, by age class, for two periods of intensified surveillance in the Kole health zone of the Democratic Republic of Congo. The per capita incidence within an age class varies inversely with the proportion of the population that has ever received the smallpox vaccine. Smallpox vaccination was officially discontinued in 1980 (a), so the continued protection in 2005–2007 (b) shows that the smallpox vaccine gives lasting immunity against clinical monkeypox infection. The overall fraction of the population that is unvaccinated is rising with time since smallpox eradication, causing a rise in monkeypox incidence. Data re-plotted from Rimoin et al. [13]; error bars show 95% CIs.
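The headline comparison between surveillance eras is an incidence rate ratio. A standard Poisson-based calculation is sketched below with hypothetical case counts and person-years, chosen only to reproduce the order of magnitude of the published 20-fold estimate [13]; the real estimate was derived from the surveillance data themselves, which are not reproduced here.

```python
# Hedged sketch: incidence rate ratio (IRR) between two surveillance eras, with a
# Wald-type 95% CI on the log scale. Counts and person-years below are hypothetical
# placeholders, not the data of Rimoin et al. [13].
import math

def rate_ratio_ci(cases1, py1, cases2, py2, z=1.96):
    """IRR comparing era 2 to era 1, assuming Poisson-distributed case counts."""
    irr = (cases2 / py2) / (cases1 / py1)
    se_log = math.sqrt(1 / cases1 + 1 / cases2)   # SE of log(IRR) under Poisson counts
    lo, hi = irr * math.exp(-z * se_log), irr * math.exp(z * se_log)
    return irr, lo, hi

# Hypothetical inputs for illustration only:
irr, lo, hi = rate_ratio_ci(cases1=60, py1=600_000,    # 1980s surveillance era
                            cases2=760, py2=380_000)   # 2000s surveillance era
print(f"IRR = {irr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```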

(b) Vacated niches

How does this phenomenon relate to the concept(s) of ecological niches? If one accepts the proposition that viruses can have niches, then it follows that smallpox occupied a niche while it was circulating. This could be conceived as a recess/role niche, representing the role of transmitting among that portion of the human population that is immunologically susceptible to orthopoxvirus infection, or a population-persistence niche, representing those conditions of human population density and low vaccination coverage where smallpox had R0 > 1.

Motivated by the strong claim that ‘there can be no unfilled niches’ for extinct species [7, citing 8, p. 30], let us consider whether the smallpox niche remains a valid concept in the post-eradication world. I argue that concluding otherwise requires absurdly fine lines to be drawn. If the smallpox niche ceased to exist upon eradication, then does it follow that the smallpox niche disappeared in each country and continent, in turn, as the virus was eliminated regionally? This seems an obviously nonsensical idea, particularly given the often-realized risk of importation from other regions [51]. Yet, we are confronted with the reality that smallpox is not irrevocably extinct, given existing laboratory stocks and the possibility of de novo synthesis, so the difference between eradication and regional elimination is a matter of degree. As a thought experiment, consider the consequences if smallpox were released in a major city in 2013 and left uncontrolled for six months. It could re-occupy its niche just as surely as it colonized the Americas centuries ago. As such it seems clear that its niche still exists.

To avoid entanglement with the disputed notion of empty niches, I propose the term ‘vacated niche’ to describe this post-eradication scenario. This term captures the key point that distinguishes the post-eradication setting from other hypothetical empty niches: the focal species did occupy the niche, and was persisting successfully in it. It was only through the deliberate human act of eradication that the niche became unoccupied. Even the strict Hutchinsonian definition seems to be satisfied by this circumstance. Importantly, the vacated niche is fundamentally different from a putative empty niche identified by comparing two apparently similar systems and noting that a given species is absent from one of them. As argued by Bush et al. [7], it is impossible to know whether some other factor is preventing demographic success in this comparative situation, whereas in the vacated niche demographic success has been observed. Of course, as time passes since eradication, it is possible that the environment changes such that the vacated niche becomes inhospitable. It is even conceivable, if a replacement pathogen were to arise and thrive, that the original pathogen would be unable to reinvade its vacated niche if re-introduced.

(c) A niche for monkeypox?

What does this mean for monkeypox, or for any other pathogen that is a candidate to fill the vacated niche? Does the vacated niche of smallpox equal a new niche for monkeypox? Certainly there is a growing population of humans who have not been exposed to smallpox or its vaccine, and who are thus susceptible to orthopoxvirus infection [13,50,52]. Thus, there is an unused resource pool, and an unoccupied ecological role. Under the recess/role niche concept, which pertains to the environment not to a particular species, this corresponds to an available niche. Reynolds et al. [52] call this an ‘immunologic niche’, which keeps the focus on the host (i.e. the resource) and leaves the impact on the pathogen species unspecified.

Applying the population-persistence niche concept raises more issues. In this species-centric view, we must consider the demographic ability of the candidate replacement pathogen to ‘exist indefinitely’ (in Hutchinson's words) in the environment under consideration, which in this case is the human population. Monkeypox can have a fundamental niche in the human population only if its R0 for human-to-human spread exceeds one. In the rural DRC, R0 was projected to remain below one even if nobody were immune [23], so this fundamental niche is not expected to exist. Motivated by Pulliam, we could observe that the rural human population is a sink for monkeypox virus (i.e. the virus cannot persist), but that cross-species spillover transmission plays the role of immigration in subsidizing the population of monkeypox-infected humans. At some scale of aggregation and for some frequency of spillover events, monkeypox will be continually present in the human population, and so the realized niche of monkeypox could be said to include humans [39]. However, this stretches the definition of a realized niche, and muddies the distinction between endemic and sporadic human infections. It is more useful to focus on the qualitative threshold of R0 > 1, which corresponds to an ‘indigenous disease’ that circulates without continued introduction and aligns with the modern definition of disease elimination [53,54].
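The source–sink analogy can be made quantitative with a simple branching-process argument: if each spillover event seeds a subcritical chain with human-to-human reproductive number R < 1, the expected chain size is 1/(1 − R), so the expected number of human cases scales as the spillover rate multiplied by that factor. The sketch below is illustrative, assuming a constant spillover rate, homogeneous transmission and complete case observation; the parameter values are placeholders rather than estimates.

```python
# Hedged sketch: expected human monkeypox burden when humans are a demographic 'sink'
# (R < 1) subsidized by zoonotic spillover, under a simple branching-process model.

def expected_chain_size(r: float) -> float:
    """Expected total cases per spillover (index case included) for a subcritical chain."""
    assert r < 1.0, "the formula applies only below the epidemic threshold"
    return 1.0 / (1.0 - r)

def expected_annual_cases(spillover_rate: float, r: float) -> float:
    """Spillovers per year times mean chain size gives the expected annual case count."""
    return spillover_rate * expected_chain_size(r)

# Illustrative values: R ~ 0.3 in a partly vaccinated population vs ~0.8 once immunity wanes.
for r in (0.3, 0.8):
    print(f"R = {r}: ~{expected_chain_size(r):.1f} cases per spillover, "
          f"~{expected_annual_cases(100, r):.0f} cases/yr at 100 spillovers/yr")
# The same spillover rate yields a roughly 3.5-fold larger burden as R rises from 0.3 to 0.8,
# even though R stays below 1; above 1 the calculation breaks down and endemic spread is possible.
```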

In this light, a more important possibility is that the vacated smallpox niche may create a fundamental niche for monkeypox in some as-yet-uncharacterized human population. Human-to-human transmissibility of monkeypox depends on the number of close contacts of the infected case [44], which could be significantly higher in urban or peri-urban settings than in the rural villages studied so far. As a point of comparison, smallpox transmitted with two- to threefold greater efficiency in crowded settings with poor hygienic standards [55,56]. Populations that are immune-compromised by HIV infection, malnutrition or diabetes could also sustain greater spread [2,57]. Given that R0 for monkeypox in rural settings is around 0.8, it is certainly conceivable (though not certain) that human populations exist where R0 > 1. Such populations would comprise a fundamental niche for human monkeypox.

To illustrate these possibilities and connect them to contemporary niche concepts, I adapt the graphical representation of Chase & Leibold [27] to show how monkeypox and smallpox population growth depends on the densities of susceptible humans and susceptible rodents (figure 2). (I have made the simplifying assumption that there is one species of rodent that supports enzootic spread of monkeypox.) The solid lines show the zero-net-growth isoclines (ZNGI) for smallpox and monkeypox in this two-dimensional resource space. These lines show the boundary between susceptible population densities too low to sustain pathogen transmission and higher densities where positive growth is possible. Smallpox does not infect rodents so its ZNGI is horizontal, defining a boundary only in terms of human density. Monkeypox is primarily maintained by rodents, so its ZNGI is mostly vertical. When the density of susceptible humans reaches a critical level, however, R0 for human-to-human spread exceeds 1, and the ZNGI for monkeypox bends to become horizontal. At this level a second ZNGI can be defined for monkeypox spreading solely in the human population (shown by a dashed line).

Figure 2.

Schematic of the niche relationships of smallpox (SPX) and monkeypox (MPX) in the resource space defined by susceptible human hosts and susceptible rodents. Solid lines show the zero-net-growth isoclines (ZNGIs) for each disease, as described in the text. The dashed line shows the ZNGI for monkeypox spreading only among humans. The area outside the ZNGIs (further from the origin) represents the fundamental niche for each pathogen–host combination. The points depict four different scenarios: A, a rural village in DRC at the height of the smallpox eradication campaign; B, a rural village where the population is naive to orthopoxvirus infection (i.e. after vaccination coverage has reduced to zero); C, a high-density urban setting with an orthopoxvirus-naive human population; D, a high-density setting without monkeypox-susceptible rodents. Note that both C and D fall within the fundamental niche for monkeypox transmission in humans only.

Consideration of different points in this resource space underscores how the existence of population-persistence niches for pathogens is dynamic, and can vary in space and time as the densities of susceptible hosts vary. The labelled point A corresponds to a rural village in DRC during (or immediately following) the smallpox eradication campaign. Vaccination coverage is high enough to eliminate smallpox, but monkeypox is still maintained in the rodent reservoir. Point B shows this rural village 75 years later, when population immunity has dropped owing to demographic turnover; smallpox could persist, but is eradicated, and monkeypox has R0 around 0.8 so is still maintained only in rodents. Point C depicts a high-density urban setting such as Kinshasa, after population immunity has dropped. Now R0 in humans exceeds 1 and the pathogen can be maintained in either host species. Finally, point D shows a high-density, immunologically naive human population that can sustain monkeypox spread in the absence of the rodent reservoir. I emphasize that scenarios C and D are hypothetical, but they illustrate how monkeypox could gain a fundamental niche for transmission in the human population.
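One way to make this resource-space picture concrete is with a next-generation-matrix formulation of the two-host system. The sketch below is illustrative only: the transmission coefficients and susceptible densities are invented placeholders, chosen so that the four scenarios loosely mirror points A–D, and the overall R0 is computed as the spectral radius of a 2 × 2 next-generation matrix.

```python
# Hedged sketch: locating points in the (susceptible rodent, susceptible human) resource
# space relative to zero-net-growth isoclines defined by R0 = 1, using a 2 x 2
# next-generation matrix for monkeypox. All coefficients and densities are illustrative.
import numpy as np

def monkeypox_r0(s_rodents: float, s_humans: float,
                 k_rr=1.2, k_hh=1.0, k_hr=0.05, k_rh=0.0) -> float:
    """Spectral radius of the next-generation matrix.

    k_xy scales transmission to host x from host y per unit density of susceptibles;
    k_rh = 0 assumes infected humans do not seed infection back into rodents.
    """
    ngm = np.array([[k_rr * s_rodents, k_rh * s_rodents],   # new rodent infections
                    [k_hr * s_humans,  k_hh * s_humans]])    # new human infections
    return float(max(abs(np.linalg.eigvals(ngm))))

# Scenarios loosely mirroring points A-D of figure 2 (densities in arbitrary units):
scenarios = {"A: rural, vaccinated":  (1.0, 0.3),
             "B: rural, naive":       (1.0, 0.8),
             "C: urban, naive":       (1.0, 1.5),
             "D: urban, no rodents":  (0.0, 1.5)}
for name, (s_r, s_h) in scenarios.items():
    r0_total = monkeypox_r0(s_r, s_h)      # solid ZNGI: both host species available
    r0_human = monkeypox_r0(0.0, s_h)      # dashed ZNGI: spread among humans only
    print(f"{name:22s} overall R0 = {r0_total:.2f}, human-only R0 = {r0_human:.2f}")
```

With these placeholder values, scenarios A and B are maintained only via the rodent reservoir (human-only R0 below 1), whereas C and D cross the dashed isocline and lie within a hypothetical fundamental niche for human-to-human spread.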

(d) Unintended consequences of eradication: competitive release, evolutionary emergence or none

It is certainly not true that the niche vacated by an eradication effort inevitably leads to unintended, adverse consequences. Three broad classes of outcome can be imagined, depending on the structure of the pathogen community.

First, if the community includes one or more pathogen species that are direct competitors for the hosts used by the eradicated pathogen, then competitive release will occur. This follows simply from the fact that more hosts are available for infection by the remaining pathogen(s), assuming that vaccination programmes are halted after eradication. The outcome of competitive release is not pre-ordained and may be difficult to predict. A quantitative rise in incidence is expected, and will show the hallmark (observed for monkeypox [13]; figure 1) of affecting a broader age range of hosts as time since eradication increases. Yet this does not guarantee that the ‘released’ pathogen will become a full-blown replacement pathogen, filling the vacated niche. This will depend on the intrinsic transmissibility of the pathogen in the host population, and whether its reproductive number rises above the threshold value of 1. In many instances, it will not (e.g. cowpox infection, which is not known to transmit among humans), and the full consequence may be a mild rise in incidence. Another determinant of the outcome is the frequency of introduction of the competing pathogen. If cross-species transmission is rare, or if dispersal limitations (of the animal host or infected humans) prevent the pathogen from reaching a geographically separated human population, then the impact of competitive release will be diminished or delayed.
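The expected quantitative signature of competitive release can be sketched with simple demographic bookkeeping: once vaccination stops, the fraction of the population lacking cross-protective immunity grows with population turnover, and (using the branching-process logic above) both the number of susceptible targets for spillover and the amplification by onward transmission rise with it. In the illustrative sketch below, I assume a constant per capita turnover rate and that the human-to-human reproductive number scales linearly with the naive fraction; all values are placeholders.

```python
# Hedged sketch: rise of a 'released' zoonosis as vaccine-derived cross-immunity decays
# through demographic turnover. Assumes a constant per capita turnover rate, that the
# reproductive number scales linearly with the orthopoxvirus-naive fraction, and that
# spillovers land uniformly in the population. All parameter values are illustrative.
import math

R0_NAIVE = 0.8          # human-to-human R0 in a fully naive population (cf. [23])
TURNOVER = 1 / 40.0     # per capita birth/death rate (1/years), illustrative
SPILLOVERS_PER_YEAR = 100

def naive_fraction(years_since_cessation: float) -> float:
    """Fraction of the population born (hence unprotected) since vaccination stopped."""
    return 1.0 - math.exp(-TURNOVER * years_since_cessation)

def annual_cases(years: float) -> float:
    """Human cases per year: naive spillover targets times chain amplification 1/(1 - R_eff)."""
    s = naive_fraction(years)
    r_eff = R0_NAIVE * s                       # effective human-to-human reproductive number
    return s * SPILLOVERS_PER_YEAR / (1.0 - r_eff)

for t in (5, 15, 30, 60):
    print(f"{t:2d} yr after cessation: naive fraction {naive_fraction(t):.2f}, "
          f"~{annual_cases(t):.0f} human cases/yr")
```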

A second possibility is that a new pathogen may emerge through evolutionary mechanisms to capitalize on the vacated niche. This may occur by the same pathway that gave rise to the original pathogen, or by a novel route. At first glance, evolutionary emergence may seem such a low-probability event that it can be ignored, but the immediacy of the risk depends on pathogen biology and ecology. For example, Vasilakis et al. [19] state that any future eradication of human dengue viruses would be unlikely to last without continued vaccination, given the frequency of spillover events and low adaptive barrier for emergence of sylvatic dengue virus in humans. This also highlights the potential for competitive release and evolutionary emergence to act in concert, as lengthening transmission chains provide greater opportunities for adaptation to human hosts [30,58].

The final possibility is that no unintended consequences will occur. If the eradicated pathogen had no direct competitors, and no menacing relatives with potential to evolve into replacements, then the eradication will stand unchallenged by any adverse effects. Of course, this outcome may also arise when some risks do exist, if by luck or preventive measures those risks are not realized.

5. Application to other pathogens

Concerns about possible competitive release or pathogen replacement linked to eradication efforts extend well beyond monkeypox. The examples below illustrate other manifestations and other dimensions of the phenomena described above, as well as the challenges that arise in trying to understand pathogen community ecology in a complex world, often with imperfect data.

First, remaining within the orthopoxviruses, reports of other zoonoses such as cowpox and buffalopox have also risen in the aftermath of smallpox eradication [59–61]. The data are less systematic, so quantitative comparisons are challenging, but these patterns are broadly consistent with competitive release arising from the decline of cross-immunity from smallpox and its vaccine. However, the degree of cross-immunity for these viruses is less well characterized than for monkeypox, and the absence of stable, long-term surveillance data means that increases in reported cases could result from heightened attention (perhaps due to the suspected connection to smallpox eradication) rather than true increases in incidence [62]. Similar concerns have been raised for monkeypox, in considering why other monkeypox-affected countries, particularly in West Africa, do not report the conspicuous rise in incidence reported in the DRC [9]. The simplest explanation is that surveillance is weak—it is known that monkeypox case counts correlate strongly with the intensity of surveillance programmes [13], and this effect may be accentuated by the less virulent monkeypox strains of West Africa [10]. A more intriguing hypothesis comes from a recent study in Ghana, which found greater than 50 per cent seroprevalence against orthopoxvirus antigens in unvaccinated people with no history of monkeypox-like illness, and suggested that widespread exposure to an uncharacterized orthopoxvirus may be the cause [47]. In this scenario, monkeypox may be subject to continued competition from another cross-immunizing orthopoxvirus, explaining the lack of competitive release. All of these questions cry out for further investigation.

The eradication of rinderpest was a colossal achievement, but appears to have led to competitive release of the related morbillivirus that causes peste des petits ruminants (PPR) [63]. PPR virus is chiefly associated with sheep and goats, but can infect other livestock species as well as wild ungulates [64]. Historically, PPR was controlled using live-attenuated rinderpest vaccine, but this practice had to be stopped in order for countries to confirm rinderpest-free status [64,65]. Over the decade since the last rinderpest cases were detected, reported outbreaks of PPR have grown in intensity and the geographical range of the virus has expanded in Africa and Asia [63,64]. It is unknown whether rinderpest infection or vaccination was the greater contributor to cross-immunity, and the importance of wildlife hosts in the resurgence of PPR is also unclear; these factors will influence future prospects for PPR control. Ironically, PPR is thought to have aided rinderpest eradication in regions where vaccination levels were inadequate, by subclinically infecting cattle and bolstering herd immunity [66]. Such cross-immunizing ‘help’ from the pathogen community turns out to be a double-edged sword in light of competitive release.

For poliovirus, concerns have centred on the possibility that related viruses may evolve to become neurovirulent and cause polio-like disease. Recent studies proposed that polioviruses emerged by mutation from C-cluster coxsackie A viruses (C-CAVs), which circulate widely but cause mild disease, and that replacement pathogens may arise by the same pathway if polio eradication succeeds [14,15]. A related line of work has shown that some circulating vaccine-derived polioviruses have arisen through recombination between C-CAVs and polio vaccine strains [15,16]. Thus evolutionary mechanisms exist to produce a replacement pathogen (i.e. a neurovirulent enterovirus that transmits among humans), particularly if live-attenuated oral vaccines are used in the endgame for polio eradication. Important unknowns remain, including the selection pressures for or against these phenotypes, the impact of population immunity on this emergence process, and whether such pathogens could achieve sustained transmission in human populations—i.e. whether a Hutchinsonian fundamental niche exists for them. A further concern about polio has emerged in recent years, with reports that multiple doses of monovalent oral polio vaccine are correlated with non-polio acute flaccid paralysis that carries high rates of severe outcomes [67–69]. These findings have been controversial, but raise the possibility that vaccination is perturbing communities of non-polio enteroviruses at individual or population scales [69].

Similar concerns have been weighed for other pathogens that are candidates for possible future eradication efforts. Sanders et al. [70] conducted a risk analysis of possible sources of reintroduction of measles virus, after a hypothetical future eradication. While focusing mostly on other hazards, they invoked the ‘theoretical’ possibility that another morbillivirus could ‘jump the species barrier to occupy the human niche left by measles’ (p. S76). In the absence of concrete evidence, they judged the potential risk of such a jump to be very low. de Swart et al. [3] were not so dismissive of the threat from zoonotic morbilliviruses in a post-measles world, observing that canine distemper virus has adapted to cause massive outbreaks in non-human primates [71]. Clearly many unknown factors prevent a definitive assessment of this risk. We know more about the possible impact of sylvatic dengue strains on potential efforts to eradicate human dengue using tetravalent vaccines. Because sylvatic strains spill over frequently to humans, and epidemiological and experimental evidence shows no sign that significant adaptation to humans is needed, it appears likely that sylvatic strains could readily fill a vacated niche for dengue if vaccination were terminated [19,72]. A contrary example is yellow fever, for which urban transmission cycles were interrupted in the Americas in the 1930s, and sylvatic strains have not reinvaded [22]. Proposed explanations include cross-protective immunity from dengue virus or other flaviviruses (i.e. competition), the distance between cities and foci of sylvatic transmission (i.e. dispersal limitation), and the fact that, unlike dengue, yellow fever has not become adapted to the urban mosquito Aedes aegypti [73]. Nonetheless some experts believe it is just a matter of time before urban yellow fever is re-established in South America [73].

Competitive release and niche arguments also pertain to many pathogens subject to widespread, effective control measures. One prominent example is P. knowlesi, a zoonotic source of severe human malaria that appears to have risen in incidence as successful control efforts have driven down malaria caused by the endemic human strains [18]. Current evidence suggests that, like monkeypox in Africa, P. knowlesi has been a sporadic zoonotic infection in Southeast Asia since ancient times. Thus, it is expected that a reduction in cross-immunizing protection from endemic strains should cause a quantitative increase in incidence, as reported; whether there is further potential for emergence as a fifth endemic human malaria will depend on human-to-human transmissibility, vector ecology and land-use change [18].

Finally, widespread evidence for competitive release can be seen in the impacts of strain-specific vaccines on the dynamics of multi-strain pathogens. In 2002, Burke [74] cited adenovirus as ‘the best example of an evolving or re-emerging virus that is filling an ecological niche left vacant after vaccination’ (p. 109), describing the apparent release of adenovirus-7 by the deployment of an adenovirus-4 specific vaccine. More recently, the multivalent pneumococcal conjugate vaccine has led to serotype replacement, successfully reducing prevalence of target serotypes but allowing non-target serotypes to rise [75]. A recent modelling study applied niche concepts (in the resource-utilization vein, based on partitioning antigenic space) to explain this and other patterns of pneumococcal diversity [20]. Other models have highlighted the potential for strain replacement owing to new vaccines against rotavirus [76] and human papillomavirus [21]. The prevailing assumption is that strain replacement occurs because the vaccine is more effective against some strains than others, hence releasing non-target strains from competition, though other ecological and evolutionary mechanisms can also suffice [77]. The analogy is not perfect, but many insights from this literature may carry over to the growing theory of pathogen eradication.

6. Implications for eradication programmes

While the situation for every pathogen is unique, there is abundant evidence that eradication or widespread control can lead to undesirable changes in the surrounding community of pathogens. It is certainly not true that the possibility of competitive release or pathogen replacement renders all eradication efforts ‘pointless’, but these risks should be considered before, during and after eradication.

The community of host species is classically a major criterion in assessing the potential to eradicate a pathogen, because the existence of a non-human reservoir adds significant (often intractable) challenges. This view should be extended to encompass the community of pathogens, in order to assess the relative hazards arising from competitive release or emergence of replacements. Sometimes these risks will be sufficient to negate the benefit of eradication, or at least to alter the cost–benefit analysis by removing the major savings from permanently stopping vaccination after a pathogen is eradicated (though note that eradication still may be cost-effective [78]). Dengue may be an example of a disease for which vaccination could not be stopped owing to valid concerns about replacement [19]. Often a robust assessment of risk from the pathogen community will require basic research, from studying the ancestries and emergence pathways of circulating pathogens to probing the determinants of host range. A mechanistic understanding of interactions among pathogens, hosts and vectors, such as the key host or vector species that bridge sylvatic and urban transmission cycles, may suggest other targeted means of reducing adverse impacts. Sometimes technological solutions may exist, such as the development of marker vaccines that do not interfere with certification of disease-free status, so that cross-protection against related pathogens can be maintained. Such a vaccine could have prevented the competitive release of PPR in livestock when rinderpest vaccination had to be halted [64].

Further measures should be taken after an eradication campaign succeeds. An essential priority is to conduct systematic and well-documented surveillance for any pathogens that could benefit from the vacated niche, first to set a baseline and then to monitor for outbreaks and signs of adaptation towards greater transmissibility [23]. The monkeypox surveillance programmes in 1981–1986 and 2005–2007 were exemplary in this regard, though consistent measures of surveillance effort would have enabled more quantitative comparisons across eras, and updated transmissibility studies are needed [13]. Surveillance studies should be designed to support estimation of R0, for example by measuring secondary attack rates and counting contacts [23] or by collecting systematic data on the size of transmission chains [79]; a complementary priority is to develop better methods for assessing transmissibility when R0 < 1, particularly in light of real-world heterogeneities and imperfect case detection [45,80]. If pathogen replacement is a significant concern, then the global community should maintain the ability to conduct vaccination in response to outbreaks (through a vaccine stockpile, or by developing new vaccines as needed). A policy of maintaining vaccinal herd immunity in selected firewall populations may be warranted under some circumstances, for example when the ecological conditions for spillover into human populations are spatially restricted. Such a strategy has been recommended to prevent re-establishment of yellow fever in South American cities [22], though in practice this policy is implemented imperfectly, with vaccination focused on enzootic areas rather than the boundaries where disease range could expand [73].
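As a concrete example of the chain-size approach mentioned above, a simple branching-process model gives an expected chain size of 1/(1 − R), so a crude method-of-moments estimator is R̂ = 1 − 1/(mean chain size). The sketch below uses invented cluster sizes and ignores imperfect case detection, censored chains and superspreading, all of which matter in practice [45,79,80].

```python
# Hedged sketch: crude estimate of the human-to-human reproductive number R from the sizes
# of completed 'stuttering' transmission chains, using the branching-process identity
# E[chain size] = 1/(1 - R). Ignores imperfect detection, censoring and superspreading.
from statistics import mean

def estimate_r_from_chain_sizes(chain_sizes: list[int]) -> float:
    """Method-of-moments estimator: R_hat = 1 - 1/mean(chain size)."""
    return 1.0 - 1.0 / mean(chain_sizes)

# Hypothetical cluster sizes (index case plus secondary cases) from outbreak surveillance:
observed_chains = [1, 1, 1, 2, 1, 3, 1, 1, 2, 5, 1, 1, 2, 1, 4]
r_hat = estimate_r_from_chain_sizes(observed_chains)
print(f"Estimated R from {len(observed_chains)} chains: {r_hat:.2f}")
```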

Finally, it is helpful to counter certain misconceptions. Some authors have argued that pathogen species benefiting from competitive release will be less well adapted to humans and hence easier to control or eradicate than the original, eradicated pathogen [5]. While it might be true that they are less well adapted to humans, such statements overlook the inherent and substantial challenges that arise when trying to control (and especially eradicate) a zoonotic pathogen. With that said, it is important to recognize that if cessation of vaccination is a key cause of competitive release, then a known effective tool is at hand to combat any replacement pathogen that arises (provided a vaccine stockpile and/or production capability is maintained). Another common error in the literature is to draw upon past surveillance data to assess present risk from pathogens undergoing competitive release. Unfortunately, it is not safe to conclude that epidemiological patterns seen in the immediate aftermath of eradication will continue to hold. For instance, as population immunity against orthopoxvirus infection declines, the fraction of monkeypox cases owing to secondary transmission is expected to rise because human-to-human transmission becomes more efficient [53].

7. Summary

This article has considered whether it is sensible to discuss the niche left behind when a pathogen is eradicated, and to worry about the risk that this niche will be recolonized by another pathogen causing a similar disease. This has been a contentious point in the epidemiological literature surrounding eradication, but continued frequent appearances of the idea underscore its intuitive resonance. I have argued that eradication of a successful pathogen does give rise to a ‘vacated niche’, which can alter the epidemiology of the surrounding community of pathogens. Mechanisms of competitive release or evolutionary adaptation (or both, acting synergistically) can elevate the health burden from other pathogens, with outcomes ranging from quantitative rises in incidence to establishment of new endemic pathogens. However, it is important to avoid the implication that a vacated niche will necessarily cause emergence of a replacement pathogen, or that any such pathogen will have particular disease characteristics. The vacated niche is an opportunity for other pathogens, but many factors will determine whether and how they may capitalize on it. We can learn from epidemiological history, and take heed of the pertinent issue of ‘relevant time scales’ raised by past critics of the vacated niche concept, but we must also factor in the sociological and environmental changes that have quickened the pace of pathogen emergence in recent decades.

Faced with the complexity of a community of pathogens spreading among a community of host species, the most balanced approach would pair appropriate caution with proactive data-driven investigation of the ecological, epidemiological and evolutionary processes that could lead to undesired consequences of eradication. While it may have been obscured by controversies, this is not a new insight into the planning or conduct of eradication programmes. I close as I began, by quoting another hero of the smallpox eradication campaign:

The greatest impediment to the total eradication of smallpox is … premature complacency and the failure to achieve a full understanding of the relation of smallpox virus to the rest of the pox-virus family.
Foege [81, p. 671]

Acknowledgements

I thank Petra Klepac, Anne Rimoin, Bryan Grenfell, Peter Hudson, and members of my research group at UCLA for illuminating discussions. I am grateful for the support of NSF grant no. EF-0928690, the De Logi Chair in Biological Sciences, and the RAPIDD programme of the Science & Technology Directorate, Department of Homeland Security and the Fogarty International Center, National Institutes of Health.

References
