Stigmatization is characterized by chronic social and physical avoidance of a person(s) by other people. Infectious disease may produce an apparently similar form of isolation—disease avoidance—but on symptom remission this often abates. We propose that many forms of stigmatization reflect the activation of this disease-avoidance system, which is prone to respond to visible signs and labels that connote disease, irrespective of their accuracy. A model of this system is presented, which includes an emotional component, whereby visible disease cues directly activate disgust and contamination, motivating avoidance, and a cognitive component, whereby disease labels bring to mind disease cues, indirectly activating disgust and contamination. The unique predictions of this model are then examined, notably that people who are stigmatized evoke disgust and are contaminating. That animals too show avoidance of diseased conspecifics, and that disease-related stigma targets are avoided in most cultures, also supports this evolutionary account. The more general implications of this approach are then examined, notably how it can be used to good (e.g. improving hygiene) or bad (e.g. racial vilification) ends, by yoking particular labels with cues that connote disease and disgust. This broadening of the model allows for stigmatization of groups with little apparent connection to disease.
1. Disease avoidance as a functional basis for stigmatization
An individual is stigmatized when they possess some sign that leads to their being permanently avoided by members of the larger society within which they reside. Stigmatization is important to study because of its adverse consequences for personal and social wellbeing. Stigmatized groups experience inequities in employment, education and healthcare settings, as well as adverse health outcomes and difficulties forming interpersonal relationships. In this article, we argue that stigmatization of many different groups may result either directly or indirectly from an evolved predisposition to avoid diseased conspecifics. This basic claim derives from two observations. The first is that reactions to people who have an infectious disease are similar to reactions to people who are stigmatized. The second is that the most severely stigmatized groups (i.e. those who are most avoided) are individuals who bear apparent signs of disease. Stigmatization, then, may be in part a consequence of a signal detection problem: the detectable cues to infectious disease are imperfect, so it is less costly to avoid those who appear sick even if they are not.
While this type of argument has been made before [3–6], it has not been developed into a fully fledged theory of stigmatization—that is, one which is testable and that can potentially accommodate the multiple forms of stigmatization (see table 1 for a tentative list of stigmatized groups). The aim of this article is to develop such a theory and provide a preliminary evaluation of its status relative to other theoretical approaches. To do so, we start in §2 by briefly reiterating the evidence for the first observation above—namely that both stigmatized and infectious individuals are avoided. While the pattern of avoidance is arguably similar, a major problem lies in identifying whether this results from a common underlying cause—disease avoidance. Although there are theoretical reasons to suspect that it does—namely error management theory, which we discuss in §3—the disease-avoidance perspective on stigmatization is too poorly specified to predict the precise similarities that should be evident. This lack of specificity is one factor that motivated our desire to develop a more complete account of disease-avoidance-based stigmatization. The second observation, above, was that more overt disease signs were likely to be associated with more severe stigmatization. This too is difficult to evaluate precisely, because there has been no previous attempt to identify—as we do here—exactly what constitutes a disease sign and how such signs might be detected.
In §4 we propose a model of disease-avoidance-based stigmatization, which includes three interrelated pathways that lead to avoidance: one based upon disgust, one based directly upon knowledge and another, stemming from this, based upon fear. To evaluate this model, we start by outlining the alternative theories of stigmatization available in the literature. From this, in §5, a set of predictions is developed that is unique to the disease-avoidance model relative to these other theoretical formulations. The evidence bearing on these predictions is then examined. In §6 we discuss how our model can be expanded to include other forms of stigmatization—some current and some historical—that seemingly have no direct connection with disease avoidance. Finally, in §7 we outline predictions of the model that have yet to be tested, discuss its broader basis in mate selection, and identify the forms of stigmatization which it cannot readily accommodate.
2. People with infections and stigmatized people are both avoided
Individual and societal reactions to novel diseases and to known infectious agents are characterized by avoidance of contact, thereby minimizing the opportunity for contagion [13,41,42]. At the societal level, this is reflected in the process of quarantine, a procedure deployed historically and also in more recent threatened or actual epidemics, as we illustrate next. In the early stages of the human immunodeficiency virus (HIV) epidemic, legislation seeking to quarantine people living with HIV/AIDS (acquired immune deficiency syndrome) was proposed in several states in the USA, and the UK still includes HIV/AIDS among diseases covered by the nation's quarantine laws. Similarly, mandatory home quarantine was sanctioned in Hong Kong, Singapore, Taiwan and Canada following the outbreak of severe acute respiratory syndrome (SARS), and mandatory home quarantines and surveillance, closure of schools, cancellation of public sporting events, voluntary avoidance of public spaces and the use of facemasks in public followed the World Health Organization's H1N1 influenza pandemic alert.
Avoidance is also observed at the individual level, and has been reported for HIV/AIDS sufferers, for individuals with SARS, and for children and adults who were infected with the H1N1 virus. A particularly striking feature of this individual-level disease-related avoidance is its sensitivity to any perceived connection back to the disease carrier. It has been reported that almost one-third of respondents would not wear a laundered sweater previously worn by a person living with HIV/AIDS, nor would they drink from a washed, sterilized glass that had been used a few days earlier by such a person. The SARS and H1N1 outbreaks provide similar observations. In one study, almost 20 per cent of respondents believed that even 18 months post-recovery, SARS patients were still infectious and that shaking hands or dining with them would transmit the disease. Such post-recovery social exclusion was also widely observed for children (and their families) who had contracted the H1N1 virus [46–48].
Isolation, temporary or otherwise, of sick (or recently sick) people has been used extensively as a form of protection against contagious diseases, both historically and into the present day. Public reaction, as the earlier examples suggest, takes a similar form, namely behavioural avoidance and social exclusion. A notable feature of these reactions is their apparently indiscriminate nature, in which any individual who may even remotely be infectious is excluded. It is this targeting of such individuals, identified by visible signs of infection or by the label ‘carrier’ or ‘recovered carrier’, that has attracted previous investigators' attention, because of its apparent similarity to the avoidance and exclusion behaviours levelled against many stigmatized groups.
A large literature suggests that people who merely appear to be unwell experience social and physical exclusion. The best examples of this are conditions that most directly appear to signal an infectious disease, even when the stigmatizing person knows that the condition is patently non-infectious. Individuals with acne, psoriasis and eczema, as well as people with cleft palates, burns or birthmarks, have all reported experiencing social and physical avoidance by other people [50–53]. Such self-reports by sufferers could perhaps be written off as a consequence of intense self-awareness, but this does not appear to be the case. Numerous behavioural studies, some of which we detail later, indicate otherwise.
For example, behavioural avoidance was reported in a naturalistic study that measured the personal space afforded to a disfigured or non-disfigured confederate by pedestrians in a busy street. The study employed two types of disfigurement: a birthmark under the right eye (permanent disfigurement), and trauma scarring and bruising (temporary disfigurement). Members of the public stood further away from the confederate in the disfigured conditions than in the no-disfigurement condition. Other similar findings have also been reported. People travelling on a suburban railway avoided sitting next to someone who appeared to have a facial port-wine stain, relative to controls. This tendency to avoid individuals with facial lesions extends to other visible indicators of ‘ill-health’. Another experiment examined interactions between healthy participants and a confederate who appeared either disabled (via a wheelchair) or physically normal. Participants terminated interviews with the physically disabled confederate sooner, thereby physically removing themselves from the interaction. The type of physical disability has also been found to influence the amount of personal space given to that person. Healthy participants maintained a close personal distance with confederates who appeared ‘normal’ or who feigned a temporary condition (e.g. a broken arm), relative to confederates feigning more permanent conditions such as an amputated leg or clubfoot. Several other studies have observed similar effects. For example, participants stood further away from people described as amputees or epileptics than from people described as ‘normal’, maintained greater physical distance from a disabled confederate than from a ‘normal’ confederate during face-to-face interviews, and increased their personal distance during an encounter (e.g. volunteering directions) with a confederate in a wheelchair, relative to an able-bodied confederate. Notably, the authors observed that while participants appeared equally willing to help, they did not want to ‘catch’ whatever it was that the disabled confederate had.
A key issue, then, is whether the behavioural avoidance detailed earlier—of people with non-infectious facial lesions or physical disabilities—and the exclusion experienced by people directly or indirectly affected by infectious disease all result from the same underlying cause: disease avoidance. One theoretical reason for suspecting that they might comes from error management theory, discussed in §3.
3. Error management theory
An obvious problem for members of a social species is how to avoid the pathogens that are transmitted from one individual to another. The increased risk of exposure to infection that comes with social behaviour creates a tension between selection pressures for sociality (social proximity) and selection pressures for disease avoidance (social avoidance). Disease can affect an individual in a number of different ways, many of which lead to visible signs, as we discuss later. A disease-avoidance system can capitalize on these signs as cues for the presence of disease. While such signs can be predictive of contagious disease, they may also be benign. This creates a signal detection problem: when an organism evaluates potential disease risks, it can err by making a false alarm (a healthy person is erroneously perceived to be sick) or a false rejection (a sick person is erroneously perceived to be healthy). Any general tendency towards avoiding false alarms increases the rate of false rejections, and vice versa. Misjudging a diseased individual to be healthy can carry a large cost—death, or an inability to sire or bear an adequate number of offspring. Conversely, mistaking superficial imperfections for signs of disease will limit the available number of social and sexual partners. Evolutionary logic suggests that one should minimize the error that poses the greatest threat to one's fitness—error management theory [40,59]. In this case, as with most systems designed for self-protection, humans should be biased towards false alarms because false rejections are more costly [5,60]. This bias might take the form of reacting to relatively scant evidence that someone is harbouring a contagious disease, but requiring much stronger evidence that someone is healthy (the ‘smoke detector principle’). Accordingly, we are likely to be especially sensitive to signs of sickness, and the perception of such signals may result in avoidance, especially if the threat of disease is highly salient or if we feel particularly vulnerable to disease (‘functional flexibility’).
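The cost asymmetry at the heart of this argument can be made concrete with a toy expected-cost comparison. The following sketch is purely illustrative (the function name and all numbers are our own hypothetical choices, not from the source): it shows why, when infection is far costlier than a forgone interaction, avoidance becomes the cost-minimizing response even at very low perceived probabilities of disease.

```python
# Illustrative sketch of the error-management logic with hypothetical costs.

def should_avoid(p_disease, cost_infection, cost_missed_interaction):
    """Return True if avoidance minimizes expected cost.

    Avoiding always forfeits the interaction; approaching risks
    infection with probability p_disease.
    """
    expected_cost_avoid = cost_missed_interaction
    expected_cost_approach = p_disease * cost_infection
    return expected_cost_avoid < expected_cost_approach

# If infection is 100 times costlier than a lost interaction, avoidance
# is favoured whenever the perceived probability of disease exceeds 1%.
assert should_avoid(0.02, cost_infection=100, cost_missed_interaction=1)
assert not should_avoid(0.005, cost_infection=100, cost_missed_interaction=1)
```

Because the avoidance threshold scales inversely with the cost of infection, a system tuned this way will produce many false alarms (avoiding healthy-looking but benign targets) while rarely committing the costly false rejection—the quantitative form of the smoke detector principle.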
While error management theory provides an overarching theoretical argument, it does not—nor was it designed to—provide specific testable predictions as they might apply to a disease-avoidance account of stigmatization. Nor, for that matter, can it address whether the apparent similarity between the exclusion of individuals with signs of disease—facial lesions or physical disability—and of those with infectious disease stems from a common cause. It is for these reasons that we felt the need to articulate a theoretical account of how disease-related stigmatization might operate. An important consequence of taking this step is that it allows for the generation of testable hypotheses about how disease-related stigmatization should manifest. This then allows the disease-avoidance account to be contrasted with other theoretical accounts of stigmatization, as well as allowing us to assess whether reactions to real infectious diseases and to apparent signs of ill-health share a common underlying cause.
4. A disease avoidance model of stigmatization
In this section, we present a model of disease-avoidance-based stigmatization. Consistent with error management theory, we start from the premise that many infectious agents provide less than perfect cues to their presence and that this gave rise to a disease-avoidance system biased towards false alarms [5,6,62]. This bias then leads to aversive and avoidant reactions to individuals exhibiting signs of disease. According to this view, individuals are being evaluated for disease-relatedness. That is, the perception of some unusual feature or ‘mark’ may suggest a relationship to contagious disease and this then initiates avoidance. A significant issue for this approach lies in defining what constitutes a ‘sign’ of disease and we examine this issue first.
(a) What constitutes a disease sign?
The primary disease signs are those that can be detected by one or more of the senses, and that correspond with a true sign of infection. An examination of the 25 infectious diseases that currently impose (or historically imposed) the highest human mortality indicates that 16 present with readily visible facial lesions (rashes, bleeding under the skin, cyanosis and changes in colour of the sclera), 20 present with fever (which will include abnormal facial colouring and perspiration), six with a cough or nasal discharge, five with abnormal movement or behaviour that extends beyond illness-related malaise (muscle spasms, torpor and psychosis) and five with changes to the physical structure of the body (swollen neck, cachexia and lipodystrophy) (table 2). In total, all 25 diseases—15 of which are contracted via human transmission, nine via animal vectors and one via wound infection—demonstrate one or more of these signs, with many (23/25) displaying them directly on the face (i.e. skin lesions, jaundice, fever and cough/nasal discharge).
The face represents the point of initial focus in social encounters. It therefore seems likely that, relative to other forms of bodily distortion, facial abnormalities attract special attention [3,65]. Indeed, facial perception appears to have a specific locus in the brain, and the ready detection of facial abnormalities may be an innate signal for disease. A recent study demonstrated that disfigured faces were more likely to hold attention than normal faces. Consistent with this claim, a grossly distorted face is disturbing to viewers, a conceptually similar reaction is reported in newborns, and these effects seem to hold cross-culturally [67,70]. This contention is also supported by evidence that minor structural anomalies of the face, such as ears with attached lobes, hair whorls and widely spaced eyes, are associated with increased susceptibility to infectious diseases, mental disorders and hyperactivity. This is not to say that bodily abnormalities may not also act as signs of disease, but these may be more easily hidden and, as the brief analysis of the 25 major human infectious diseases suggests, the face is an especially useful indicator of infection status. Consequently, we regard the face as the most prominent location for a disease sign, and for this reason distortions to the face should be particularly effective at generating stigmatization.
A secondary class of disease sign, and one that is unique to humans, is the disease label. A disease label, once applied, should operate like a visible sign of disease. That is, it too should lead to avoidance, in two related ways. First, the label may trigger mental images of disease (e.g. the label ‘herpes’ may bring to mind yellow crusted ulcers on the lip). Second, the label may access disease knowledge (e.g. herpes is contagious). Both processes should result in aversive and avoidant reactions, and both are ultimately dependent upon disease-related semantic knowledge: unless one knows what the label means, its implications cannot be appraised and mental images relating to it cannot be formed. Of the two classes of sign—the visible sign (perceptual oddity) and the label (meaning)—we suggest that the label is the less potent.
Finally, while we can easily assign certain ‘signs’ as being disease-related (e.g. rashes), many are more continuous than discrete. Notable examples include bodily asymmetries, gait, facial feature ratios, height and weight, but—to some extent—all disease signs will demonstrate variation in degree (e.g. one pustule versus one hundred). Sensitivity to such variability is dealt with by a particular feature of the model that we present next.
(b) A model of disease-avoidance responding
The model of disease-based stigmatization presented here is composed of three interrelated but functionally dissociable modules (components), which are illustrated in figure 1. The model's unique features include its specification of how particular components interrelate, its suggestion of two discrete forms of contamination and its ability to integrate disease labelling and disease signs within the same model. This latter feature can potentially explain how labels may come to evoke disgust and avoidance, and, indeed, how labelling can contribute to a broadening of disease-related stigmatization to groups with no apparent connection with infection.
We start by providing a summary description of this model and then examine each of its individual components in more detail. The first component is emotive and reflexive, and its output is feelings of disgust and its consequent, contamination. Disgust provides a powerful impetus to avoid particular people, objects connected with them and things they may have touched—contamination—thus acting as an implicit ‘germ theory’. Our use of emotion here closely parallels an emotional system previously proposed within an evolutionary framework. The second component is cognitive, but largely inaccessible to consciousness, and is based on passive exposure to human body forms accrued over a lifetime. It functions to detect deviations from ‘typical’ body forms, both discrete and continuous (e.g. bodily asymmetries, gait, facial feature ratios, height and weight), and will thus act to draw attention to any feature that is atypical and thus a possible disease cue. The third component is cognitive and largely consciously accessible. It functions to evaluate output from the first and second components, and to trigger activity in the disgust/contamination component via mental imagery based on experience and knowledge (e.g. he's got herpes, it's contagious, with yellow crusted ulcers on the lip). This cognitive component may also generate other emotions, notably anticipatory fear (i.e. anxiety at the thought of contact), and it contains explicit contamination beliefs based on western germ theory (among people educated in this way). It is this third component, we hypothesize, that is initially responsive to labels, and that gives human disease avoidance its unique flexibility. It is also this component of the model that can be ‘hijacked’ to political ends, to stigmatize particular groups or people (i.e. by associating them with disease-related themes), or to positive ends, by associating particular behaviours or situations with disease (e.g. hand hygiene). We discuss this capacity in §6.
(i) Module 1: disgust/contamination
Many disease-avoidant behaviours are likely to be reflexive reactions to primary disease signs that occur largely independently of conscious decision-making. It has been suggested that primary disease signs evoke disgust so that the individual avoids potential sources of contamination with a pathogen [5,75]. This process, being largely automatic, should operate on any cue that resembles a disgust elicitor, such that a benign object resembling an agent of disease can assume the infectious threat value of the actual repugnant stimulus. Consistent with this, people are not only disgusted by things that pose a genuine disease risk (e.g. pustules); they are also disgusted by things that pose no risk at all but which simply resemble genuine disease risks (e.g. psoriasis)—just as error management theory would suggest.
Disgust is typically experienced as a feeling of revulsion, sometimes accompanied by nausea, along with a strong desire to withdraw from the eliciting stimulus [78,79]. The perception of a potent disease cue should not only produce behavioural disgust (e.g. avoidance); it should also be accompanied by facial displays of disgust (i.e. slightly narrowed brows, a curled upper lip, wrinkling of the nose and visible protrusion of the tongue), although different disease signs may produce variants of this expression. It has been suggested that disgust is fundamentally revulsion at the prospect of oral incorporation of an offensive substance. This is in keeping with the disgust facial expression, which seems to function to keep offensive substances from the nose and mouth. In general, the more intimate the contact with the offensive substance (e.g. proximity to the mouth), and hence the more real the threat of incorporation, the greater the disgust. However, the mouth is not the only orifice vulnerable to contamination. A series of experiments tested reactions to contact between various contaminants and different body apertures and surfaces. The mouth and the vagina were reported to be the most contamination-sensitive points of the body. Moreover, the more vulnerable an aperture is to contamination, the more potent it is as a source of contamination for other persons. This finding is likely to be especially relevant in the context of interpersonal disgust—namely here, that involved in stigmatization. That is, disgust towards, and avoidance of, a person who looks sick might intensify when the prospect of contact with that person becomes more sexualized (e.g. kissing and genital contact).
(ii) Module 2: atypicality detection
Humans are good at categorization [84,85] and this seems to depend—in part—on forming memories of individual instances of particular members of a category (e.g. faces, body morphology). These forms of knowledge appear to be acquired tacitly, are cognitively impenetrable (i.e. feelings of atypicality—abnormality—are unavoidable), attention-demanding and may be influenced by the context in which categorization judgements are made (e.g. by knowing that a contagious illness abounds and that particular features are predictive, making the system especially sensitive to these features). In addition, frequent exposure to abnormal body forms (e.g. obesity) may render these typical, paralleling the apparent habituation of disgust to repeated instances of a particular type of elicitor [86,87].
An instance-based knowledge of typical body forms likely underpins our ability to attend to particular features (e.g. facial lesions, asymmetry and missing limbs), which potentially identify their bearer as atypical. Such atypicalities are likely to be especially attention-demanding if they are located on the face, as abnormality detection here may be partially hard-wired. Finally, while the outcome of module 1—disgust and contamination—is primarily avoidance, module 2 differs in that its output only generates behaviour indirectly, either through drawing attention to features that are themselves disgust-evoking or via appraisal of the meaning of the deviation in the third component of the model.
(iii) Module 3: cognition
The explicitly accessible component of the model can be conceptualized as having three sub-components. These are: (i) the capacity to learn, understand and apply labels relating to disease—labelling—and to generate mental images based upon these; (ii) a model of contamination based upon learning, which may include germ theory as well as radiation, chemical and other forms of contamination—cognitive contamination—together with the fear that may result from directly encountering such situations, or the anticipatory fear (anxiety) that results from contemplating contact with them; and (iii) a capacity to evaluate labels, cognitive contamination and the inputs from the other model components, and to plan appropriate responses (avoidance, decontamination)—evaluation and action.
(iv) Labelling
Once a person has learnt the meaning of a particular label (e.g. HIV+), contact with a person bearing that label (e.g. person X is HIV+) may activate the model's first component—disgust. This is not because the label is intrinsically disgusting (although it may become so via associative learning), but because it can bring to mind disease-related knowledge and images that the person has learned to associate with that label. It is arguably this ability of labels to induce mental representations of disease—disgust-evoking images—that allows them to trigger the same affective and behavioural reactions as visible correlates of disease (i.e. to activate module 1—disgust and contamination). Indeed, there is evidence to suggest that mere reference to a disease label (e.g. AIDS, SARS and H1N1), contagious or otherwise, is sufficient to provoke avoidance. People who have been labelled as having hepatitis C report chronic social avoidance despite the absence of overt physical symptoms. People avoid shaking hands with, and using silverware previously used by, people who are known to have cancer. People also report avoiding swimming in pools where they have been told that psychiatric patients have swum, and desire greater physical distance from individuals labelled with a non-contagious condition (e.g. physical disability).
Not only may labels lead to disgust-inducing mental images, but they may also have two further consequences. The first is that they may engage a mental model of contamination based purely upon explicit knowledge about a particular disease or condition. While this may be entirely rational (e.g. washing hands after touching a person who is unwell), on other occasions it may not, because cognition itself accesses processing routines—mental heuristics—that may be evolutionarily rational but irrational in the particular context (e.g. ‘better safe than sorry’). The second and broader consequence is the use of labels to deliberately associate people or objects with disease. This process, which seems to parallel Rozin's concept of moralization, is discussed in §6 and, as noted already, it can provide a powerful explanation for how people can be stigmatized via a disease-avoidance mechanism even when they are in fact disease-free and free of visible disease signs.
(v) Cognitive contamination
A cognitive contamination model simply refers to our knowledge of (and beliefs about) contemporary germ theory, alongside knowledge of other forms of contamination that are based upon scientific understanding of the natural world. While these types of models may exist in western cultures, it is important to recognize that other cultural groups may have cognitive models of contamination based upon entirely different principles, which may or may not align with the disgust-based contamination response (i.e. implicit germ theory) of the first component of the model. Knowledge about the consequences of becoming contaminated may itself induce fear (e.g. at having eaten something contaminated with Salmonella or having ingested food with a chemical contaminant such as mercury), and the key distinction here is that the fear response is dependent upon understanding the risk of the contamination. Similarly, imagining contact with a contaminant may induce anticipatory fear—anxiety or ‘fear of contamination’, which again may be directed at cues that activate either one of the contamination systems or both.
(vi) Evaluation and action
All of the outputs—disgust, contamination, cognitive contamination and detection of atypical body forms—are evaluated in this component of the model. In some cases, action may be overwhelmingly driven by disgust (e.g. it may be impossible to mask one's reaction to gangrenous flesh or the need to clean faeces from one's hands), but in other cases appraisal and the disgust/contamination component may operate to generate other effects. If an object is known to have been in contact with a disgust or disease cue, this may then bring to mind images of the disgust cue whenever the contaminated object is perceived. This may result in long-lasting or permanent contamination—long after any physical trace has been lost. This has been observed in laboratory settings in participants' reluctance, for example, to drink juice that had been in contact with a sterilized cockroach, or to wear a laundered sweater previously worn by a person reported to have HIV/AIDS.
The capacity to evaluate disease threats may also result in output suppression. That is, individuals who routinely engage in socially desirable behaviours might work harder to minimize or ‘adjust’ any negative reactions towards individuals whom society at large deems vulnerable. A recent study demonstrated such a relationship—an association between social desirability and reported avoidance of people who were labelled as diseased. Similarly, Snyder et al. found that people tended to avoid interactions with disabled people when a socially acceptable excuse was available. In this experiment, participants were asked to choose between watching a movie with a person wearing a leg brace and watching a very similar movie alone. Participants reliably preferred to watch the movie alone; however, when the two movies were identical—so that viewing alone offered no plausible excuse—participants chose to watch the movie with the disabled confederate. These findings suggest that healthy participants prefer to avoid interactions with physically disabled individuals, but only when this socially undesirable behaviour can go undetected.
Further evidence for output mediation of disease-related avoidance is provided by the disparity between expressed attitudes towards people with physical disabilities and the nature of the behaviour displayed towards them. One study had participants teach a craft activity to either an able-bodied confederate or a ‘disabled’ confederate in a wheelchair. Participants reported equally positive impressions of able-bodied and ‘disabled’ learners; however, when interacting with ‘disabled’ confederates, the participants displayed non-verbal behaviour suggesting anxiety and avoidance. The discrepancy between explicit and implicit measures suggests that behaviours readily under the control of participants (e.g. speech content) were more likely to conform to the normative pressure to be ‘kind’ to disabled people, whereas non-verbal behaviours tended to be unresponsive to this norm, most probably because they are under less voluntary control.
In summary, this model suggests that a variety of cues may alert an individual to a disease threat. These cues or signs may be indirect (labels) or direct (perceived) and will activate both explicit knowledge and emotional reactions, especially disgust, leading to avoidance of that person. Individuals bearing permanent disease-related signs should experience chronic avoidance by other people because these signs never remit. Moreover, as these signs directly and permanently connote disease, such persons will experience avoidance from most people—stigmatization. To evaluate this account, we start in §5a by presenting a summary of alternative accounts of stigmatization. We then examine whether the disease-avoidance model makes unique predictions, relative to these other accounts, and whether these unique predictions are supported.
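The dual-route structure summarized above can be made concrete with a toy computation. This sketch is purely illustrative: the function names, the weight on the indirect route and the avoidance threshold are our assumptions for demonstration, not part of the model's specification.

```python
# Illustrative toy sketch of the two routes to avoidance described above.
# The weights and threshold are assumptions chosen for demonstration.

from dataclasses import dataclass

@dataclass
class Target:
    visible_cue: bool = False    # direct route: perceived disease sign
    disease_label: bool = False  # indirect route: label connoting disease

def disgust(target: Target, imagery: float = 1.0) -> float:
    """Return a disgust level in [0, 1].

    A visible cue activates disgust directly; a label acts indirectly,
    via mental imagery, so its effect scales with imagery ability.
    """
    direct = 1.0 if target.visible_cue else 0.0
    indirect = 0.8 * imagery if target.disease_label else 0.0
    return max(direct, indirect)

def avoided(target: Target, threshold: float = 0.5) -> bool:
    # Avoidance is motivated once disgust passes a threshold; a permanent
    # visible cue keeps disgust high permanently, yielding chronic avoidance.
    return disgust(target) >= threshold
```

Note that in this sketch a labelled target with no visible signs is still avoided, and that hampering imagery (lowering `imagery`) weakens label-driven disgust, mirroring the indirect route described above.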
5. Does the disease-avoidance model make unique predictions?
(a) Other potential accounts of stigmatization
A number of other theories could also account for the aversive and avoidant responding directed at stigmatized individuals, and there is some empirical evidence bearing on each. One proposition is that stigmatization can enhance self-esteem by motivating favourable inter-group comparisons. According to social identity theory, the social categorization of people into out-groups (different from the self) and in-groups (including the self) motivates a desire to achieve a sense of positive group distinctiveness. This motivation can initiate a search for dimensions on which the in-group is favoured over the out-group (and a greater emphasis on those dimensions), and it can also motivate active avoidance and denigration of out-group members in the form of stigmatization. Enhancement of one's in-group reflects positively on the ‘collective as well as personal self-esteem’ [96, p. 8].
A second perspective focuses on the fact that members of a stigmatized group may remind perceivers of their own mortality—an aversive experience with implications for stigmatization. Terror management theory (TMT) holds that humans are unique because of our awareness of our own mortality, and that ‘death can often occur prematurely and unexpectedly’ [98, p. 95]. The awareness of one's mortality can create an overwhelming and incapacitating anxiety. According to the theory, individuals defend against this existential anxiety by subscribing to a cultural worldview that imposes order and meaning on an otherwise random and senseless world. Perceptions of difference and deviance are sufficient to arouse existential anxiety; however, this is especially likely to occur when such differences generate concerns in people about their own vulnerability, such as when faced with physical disability and disfigurement. The experience of existential anxiety, in turn, motivates people to bolster their cultural worldview, and one way to do so is to reject those who are different, particularly those who deviate from cultural norms or standards.
A third explanation is that particular stigmas may be confronting to the perceiver and that the negative affect and avoidance they experience reflect the perceiver's social unease rather than any fear of contracting disease. For example, a condition's perceived or actual lethalness may produce social awkwardness, and interaction partners might struggle to find the right thing to say. Social unease may also reflect lack of contact with stigmatized persons (e.g. disabled persons), and uncertainty about the ways in which to behave towards them. It has been proposed that such discomfort arises owing to a conflict between the desire to stare at a novel stimulus (e.g. a disabled person) and the suppression of that desire for the sake of social desirability. In support, participants have been reported to stare at a photo of a disabled person for longer when alone than when in the company of an observer. Finally, it has also been suggested that a combination of fear, sadness, shame and guilt leads to the experience of anxiety. Therefore, non-stigmatized individuals might be apprehensive about mixed interactions because of the guilt and shame that arise over the negative affect they hold towards their interaction partner, and because they are fearful that this negative affect will be apparent to that partner.
It has also been suggested that avoidance may result from a process in which stigma targets are judged to have limited potential in the realm of social exchange [5,104]. Tooby & Cosmides propose that humans have a finite number of friendship niches. Because an individual has only a limited amount of time and can associate only with a limited number of people, they must be judicious in their selection of potential affiliates. The selection of one affiliation presumably constitutes a decision to decline another. Therefore, those individuals who fail to qualify as good dyadic cooperators (e.g. who pose a social cost greater than their potential social benefit) should be avoided. Kurzban & Leary suggest that poor exchange partners might include those who are financially poor, elderly or infirm. These individuals possess characteristics that suggest an inability to provide future social benefits and accordingly might activate systems that induce one to exclude them from cooperative interactions.
A reverse halo effect might also be a factor in the social avoidance of stigmatized groups. A large body of social psychology research has documented that facial attractiveness exerts a strong effect on impressions. This work has revealed that perceivers attribute more desirable personal qualities to attractive persons than to unattractive persons. Studies have shown that attractive individuals have an advantage in employment settings, are more likely to be acquitted of a crime, and are treated less harshly by teachers and peers [109,110]. Conversely, the facially disfigured are stigmatized by society for looking different and are valued less than others who are not disfigured.
A final account for certain forms of stigmatization is that they result from blame. Lifestyle-related factors, such as perceived disease controllability, clearly contribute to the stigmatization of certain groups (e.g. the multiple sexual partners perceived to accompany a gay lifestyle in HIV; the perceived choice made by people who smoke in lung cancer; and the perceived choice to overeat in obesity). The perception that a person is ultimately responsible for their lifestyle then leads to blame—and hence stigmatization—for its consequences. While these documented cases for smoking, obesity and a gay lifestyle are clearly supported in the literature, blame is unlikely to explain all instances of stigmatization, nor all instances directed at these groups. For example, HIV status still leads to reported social exclusion even in individuals who acquired the disease incidentally via blood transfusion.
(b) Evaluating these differing theoretical perspectives
It is clear that multiple theories may account for or contribute to stigmatization. One argument for the primacy of a disease-avoidance account could be made upon the basis of phylogenetic continuity. While avoidance of death probably reflects the functional basis for death anxiety, solely cognitive-based explanations for stigmatization—which arguably include all of the alternative theories described above—would suggest that stigmatization is a uniquely human phenomenon. However, as we describe in §5c, avoidance of sick conspecifics and of potential mates who bear signs of disease is very widespread in animals and seems to occur even in phyla with a limited capacity for neural processing. One implication of this observation is that only the disease-avoidance account would seem to suggest the presence of a similar functionally grounded behaviour in animals. Another implication is that because disease avoidance is functionally important, people with disease-related physical cues and atypical body forms should be stigmatized in all cultures. In considering cross-cultural similarity, two caveats need to be borne in mind that are both specific to the model presented here. First, some cultures will have more exposure to disease signs than others (with habituation of disgust and possibly reduced atypicality). Second, cultures should differ in their explicit knowledge about disease and in their cognitive models of contamination and contagion—a point we return to below.
The specific model we presented in §4 generates a number of novel predictions, but unfortunately, relevant data are available only for two of these. First, the response to stigma targets should accord with our model and include behavioural avoidance (which is general to all the theories above and is widely supported in the literature as we outlined earlier), and, uniquely, include disgust and contamination. That is, stigmatized individuals should be capable of inducing disgust, and hence be able to contaminate previously neutral objects and people (this may involve disgust-related contamination, cognitive contamination, or both). Second, a further feature of the model is its capacity to form associations between labels (e.g. a particular social group) and disease, such that a particular label can then act to bring to mind disease-related images and thoughts. This particular prediction is examined in §7, because of its broader implications for forms of stigmatization that appear unconnected with disease.
As we noted above, the model suggests that while different cultures should share a common reaction to visible disease signs, there should be considerable variation in their cognitive models of contamination. Although such differences have been reported, as far as we are aware there have been few attempts to document them systematically, or to test whether the cognitive module's ‘contamination model’ can result in the emotions of fear and anxiety—as our model suggests. An additional feature, which also has not been tested, is the capacity of a disease label—or indeed labels more generally as they apply here—to invoke mental imagery, and then disgust. We would predict that any procedure that hampered a person's ability to form a relevant mental image would reduce the degree of disgust felt towards the person bearing that label.
More broadly, we would also expect that it would be possible to show deficits in one component module while the other two remain intact. This claim rests on the assumption that the different components are instantiated in different regions of the brain, with disgust/contamination involving the insula and basal ganglia, atypical body form detection mediated by temporal lobe structures (notably the inferotemporal cortex) and the cognitive component of the model by fronto-temporal structures. Indeed, some evidence for such dissociations may already be present, in that Huntington's disease patients may evidence impaired disgust processing (feeling the emotion), but still retain semantic knowledge about disgust and contamination. More generally, the impact of specific dysfunctions within the disgust module versus the cognitive module should allow for the detection of a double dissociation between disgust-based contamination and cognitive-based contamination. These impairments should also translate into abnormal interpersonal evaluations, restricting the range of stigma targets to which the person can respond (e.g. disease signs versus disease labels). Finally, as we noted earlier, interpersonal disgusts—stigmatization—may be focused on a different locus of contamination than object-based disgusts, with disgust increasing as contact with the stigmatized target becomes more sexualized (i.e. a sexual rather than an oral focus). Again, this particular prediction has not been explicitly tested, and as with the others above there are few data available.
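The dissociation logic just described can be sketched minimally. This is our assumption about how such a lesioning analysis might be formalized, not the authors' own model; all names and flags below are hypothetical, chosen only to illustrate the predicted double dissociation.

```python
# Hypothetical sketch of the predicted double dissociation between the
# disgust/contamination module and the cognitive module. Flag names are
# illustrative assumptions, not part of the source account.

def respond(sign: bool, label: bool,
            disgust_module: bool = True,
            cognitive_module: bool = True) -> dict:
    """Return which reactions an intact or lesioned perceiver can generate."""
    return {
        # Felt disgust in response to visible signs (insula/basal ganglia).
        "felt_disgust": disgust_module and sign,
        # Semantic contamination knowledge from labels or signs
        # (fronto-temporal structures).
        "semantic_contamination": cognitive_module and (label or sign),
    }

# Huntington's-like profile: impaired felt disgust, spared semantic knowledge.
hd = respond(sign=True, label=False, disgust_module=False)

# Mirror profile: spared felt disgust, impaired contamination knowledge.
mirror = respond(sign=True, label=True, cognitive_module=False)
```

In this sketch, disabling one module leaves the other's output unchanged, which is exactly the dissociation pattern the paragraph above predicts.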
In §5c, we examine the evidence that is available and that is relevant to three particular predictions. First, that there is phylogenetic continuity, namely that disease avoidance can be observed in animals. Although this is general to all disease-avoidance models, it is very important because it points towards the primacy of this type of explanation for human stigmatization over other explanations which cannot predict such continuity. We then examine data with respect to a further general prediction of a disease-avoidance account, namely that avoidant responses to disease signs will be observed cross-culturally. Finally, we consider a prediction specific to our model of disease avoidance, namely that reaction towards stigmatized groups will frequently be characterized by disgust, and that stigmatized individuals will have the capacity to contaminate other people and objects.
(c) An animal precursor of stigmatization: avoidance of diseased conspecifics
Most, if not all, animals engage in behaviours that function to avoid disease. There should be fitness advantages for animals able to recognize and avoid conspecifics infected with a transmissible disease; in fact, it has been proposed that animals that do not engage in such behaviours get sick. The extensive literature detailing such behaviours in animals is particularly important as it suggests a continuity of behavioural strategies between humans and other animals.
Like humans, some species rely more heavily on visual cues, such as deviations from normal appearance or behaviour, as markers of infection. Chimpanzees are one such species and have been observed to engage in behavioural avoidance of diseased conspecifics. Goodall reported the social exclusion of two chimps suffering from the infectious viral disease poliomyelitis. The behavioural markers of this disease, such as awkward movements owing to paralysis and muscular wasting, appeared to alarm and deter the other chimps. Goodall noted ‘Of the total number of 32 adult and adolescent chimpanzees who visited camp at the time, 17 approached the crippled male … Only nine adults approached closely … and of these only four actually touched him (two aggressively) … Humphrey [possibly his biological nephew] was the only chimpanzee who sometimes slept within 20 m of the stricken male … Perhaps the most striking aspect was the fact that not once in the 24 h was [he] involved in a session of social grooming’ [120, pp. 233–234]. Goodall suggested that the social distancing of conspecifics showing abnormal behaviour might be adaptive because it reduces the risk of spreading contagious disease.
Primate populations have also been observed to socially exclude ‘strangers’. Primates have to interact with ‘out-group’ members because inbreeding often produces negative outcomes. New affiliates are often kept at a physical distance from the primary group for months. Freeland suggests that such peripheralization serves a disease-avoidance function. That is, the long period of social distancing offers a protective front from which the ‘in-group’ members can observe that a potential group member is not carrying a latent disease. Any candidate members showing signs of disease will remain marginalized.
Behavioural avoidance and social rejection of diseased individuals are observed at many different taxonomic levels. For example, bullfrog tadpoles selectively avoid swimming in proximity to tadpoles infected with transmissible intestinal parasites, and three-spined sticklebacks avoid other sticklebacks that emit cues indicating parasitic infestation. Healthy spiny lobsters also avoid infected members of their species. Infected lobsters were observed to rarely share shelters with conspecifics (less than 7% shared dens and more than 93% were solitary), even though healthy lobsters generally live together. Similarly, healthy killifish prefer not to shoal with other killifish that have been injected with black ink-spots to mimic the effects of a common parasite.
Judicious selection of mates may also be related to disease avoidance. Sexual-selection studies have found that females of some species avoid breeding with diseased males. For example, female mice select unparasitized males because they obtain the direct benefit of avoiding parasitic infection, and more generally, diseases ‘may impair fertility, induce abortion, or cause malformations in the young’ [119, p. 281]. Females can detect disease by the urinary odour of males, and avoid mating with male mice that are infected with viruses, protozoa and larval nematodes. Disease-free males have also been shown to refuse copulation with infected females, and to avoid parasitized others, thus reducing the likelihood of infection. Similarly, termites and three-spined sticklebacks also avoid mating with infected males.
Sexual-selection studies also reveal that the expression of male characteristics may reliably signal disease resistance. A study of courtship and spawning success in a species of fish (Copadichromis) found that males that spawned had significantly fewer parasites in their livers than males that did not spawn. In addition, males that spawned had significantly heavier gonads than ‘unsuccessful’ males. Likewise, a study of experimentally infected red jungle fowl found that, at sexual maturity, infected roosters displayed duller combs and eyes, shorter combs and tail feathers, and paler hackle feathers than controls. Mate-choice tests revealed that females preferred unparasitized to parasitized males, and that hens were using the traits on which the two groups differed to make their mate-choice decisions. Similarly, it was reported that male grouse that had been subjected to experimental alteration (e.g. red paint applied to the wattle) enjoyed less success in attracting female mates than males who had not been painted. These findings suggest that females choose to mate with males that signal disease resistance. Relatedly, some species engage in post-copulatory grooming as a disease-avoidance strategy. Chimpanzees regularly practise penile hygiene—wiping their penises, either with leaves or their hands, after mating [133,134]. This activity is thought to help prevent the acquisition of sexually transmitted diseases.
The non-human evidence by no means proves that human beings reject one another for the same reasons. However, if humans were the only animals to demonstrate anything resembling within-species stigmatization on account of disease, then an evolutionary explanation could not be offered. This is clearly not the case. The abovementioned behaviours among non-human animals that resemble human social exclusion suggest that similar principles might be at work. These phenomena cannot be explained easily by any of the alternative explanations of stigma discussed above. As noted by Kurzban & Leary [5, p. 191], presumably, ‘sticklebacks do not try to boost their self-esteem by avoiding parasitized others, … and McGregor's assailants did not attack this poor chimpanzee because their social identity was threatened’. The continuity of evidence across species makes a disease-avoidance account of stigmatization plausible.
(d) Cross-cultural evidence for false alarms
Evidence that certain characteristics are stigmatized across many cultures would suggest the presence of a common underlying component (or goal) to stigma. A small but consistent body of work suggests that people with visible physical disabilities experience some form of stigma in most societies. While the form and/or degree of stigmatization may vary from culture to culture, there is evidence of some universal agreement regarding who gets avoided. For example, there is cross-cultural support for the social exclusion of people with facial disfigurement [67,70] and physical impairments [9,137–139]. Relatedly, concealable forms of physical disability such as asthma, diabetes and heart disease were reported as among the least stigmatized conditions, whereas more obvious conditions such as paraplegia, dwarfism and cerebral palsy were reported as among the least accepted across Chinese, Italian, German, Greek and Australian communities.
The social rejection of people with dermatological disorders, especially those that affect the face, is widespread in Indian society [28,29,49]. People with psoriasis report avoidance by others in social places (e.g. communal baths) and social relationships (e.g. marriage). A comparative study of psoriasis and leprosy patients found that reported social exclusion was equivalent for the two conditions. Similarly, people with a common pigment condition, vitiligo, also report social rejection. This condition is particularly disfiguring in people with dark skin and carries such severe social stigma in Indian society that affected people are deemed unmarriageable. The social avoidance of people with superficial skin conditions has also been reported in Uganda, Nepal and Southeast Asia.
Distortions of the body (e.g. crippling, paralysis and amputation) are generally associated with universal social rejection and avoidance. Social distancing from physically disabled people has been reported in West African, Arab-Israeli, Native American, Mexican, Chinese and Southeast Asian communities. A recent study examined the social exclusion of people with physical disabilities in the Dominican Republic and Ghana. In both countries, people with disfigured limbs reported being teased about their physical appearance, gossiped about and shunned by community members, health workers and even friends and family. Such treatment resulted in public rejection and forced exclusion from many social situations. Similarly, Machado-Joseph disease or ‘stumbling disease’, a rare hereditary disease among Azorean-Portuguese that produces staggering, lurching, ataxia, muscular weakness, spasticity and uncoordinated body movements, results in public ridicule, gossip and social isolation. Finally, a study conducted in the West Indies also found that people with physical disabilities (e.g. amputations, paralysis and deformity of one or more limbs, the trunk or a combination of body parts) are excluded from social relationships with peers and neighbours. On the basis of the available evidence, there does appear to be support for the existence of universally recognized signs of disease (e.g. physical disability and disfigurement), and these signs motivate behavioural avoidance in healthy people across many cultures.
(e) Disgust and contamination in response to stigmatized people
While there is overwhelming evidence for the behavioural avoidance of individuals carrying signs of disease (reviewed in §5c), there are considerably fewer studies that have specifically (or incidentally) examined whether disgust and contamination feature in the response. Stangor & Crandall propose that direct contact with a person exhibiting distorted physical features results in a ‘visceral physiological arousal, experienced … as aversion or disgust’ [69, p. 77], and this effect is thought to be strongest for facial distortions. Indeed, facially disfigured people have long complained that others act unfavourably towards them in social encounters. For example, facial burn victims report that family members and friends, as well as strangers, react with disgust displays and avoid close contact. Similarly, it has been observed that expressions of disgust are not uncommon on the faces of people who encounter a person with a facial deformity. Recipients of facial transplants also report being subjected to disgust grimaces, social avoidance and other negative reactions in daily life.
Such anecdotal reports are supported by empirical work. Participants react with disgust faces when viewing photos depicting severe forms of facial deformity, and facial disfigurement has been observed to elicit a negative response from perceivers reflecting a desire to ‘remove it from one's sight’ [154, p. 53]. Relatedly, a recent functional magnetic resonance imaging study found that people with psoriasis had significantly smaller signal responses in the insular cortex when observing disgusted faces relative to healthy controls. This was accompanied by a behavioural deficit in detecting facial disgust displays, as measured by a facial expression recognition task. The reported effects were specific to disgust: people with psoriasis did not differ from controls in their brain response to, or recognition of, fearful faces. The authors interpreted this as a learned response that helps people with psoriasis cope with their condition and the attendant aversive reactions (e.g. facial displays of disgust) of others.
Healthy people also report disgust at the sight of physical abnormalities of the body. For example, disabled individuals were more likely than non-disabled individuals to be associated with disease, and this effect was stronger among people especially sensitive to disgust or concerned about disease transmission. People also report being ‘repulsed’ and ‘turned off’ by extremely underweight individuals. Obese people are commonly described as dirty, smelly and disgusting [158,159], as having poor personal hygiene, and are avoided in interpersonal domains [160,161].
As noted in §4b, a central feature of cues that evoke disgust is that contact with them can render a neutral object itself disgusting—contamination [92,162,163]. Therefore, stigmatized individuals should be capable of contaminating previously neutral objects and people. Indeed, the reported willingness to wear a previously desirable sweater decreases significantly after it has been worn (and thoroughly laundered) by a healthy stranger (history unknown), and decreases progressively further for a person maimed in an automobile accident (unlucky), a murderer (moral taint), a person with AIDS and, finally, a person with tuberculosis. This finding is consistent with the notion that a range of properties (e.g. personal characteristics and moral standing) can be transferred by physical contact. Contagion concerns can also be inferred from the tendency to avoid shaking hands with, or using silverware previously used by, people who have cancer, and from reluctance to swim in pools in which psychiatric patients have swum. The idea that interpersonal aversion might extend to indirect contact with these conditions is further supported by a recent study reporting that participants did not want to wear a (clean) sweater previously worn by a range of targets. Not only were participants reluctant to wear a sweater previously contacted by someone described as having HIV/AIDS or influenza, but also by someone described as obese, mentally ill, brain injured, elderly, an amputee, as having cancer, or as having a birthmark. These effects might result from visualizing the physical and behavioural anomalies that accompany these conditions (e.g. a scalp wound, slurred speech and poor hygiene in the case of brain injury and mental illness).
A further study found that healthy participants not only reported avoiding physical contact with individuals who appear unwell (e.g. with acne, eczema or a birthmark), but also that this propensity for avoidance increased with proximity to the target person (e.g. sit next to < handshake < social kiss), occurred independently of contagion knowledge, and extended to objects that the stigma target had previously contacted (i.e. contamination). Moreover, participants' ratings of disgust sensitivity and avoidance were positively related, suggesting that reactions to the conditions featured in the vignettes, and the images they brought to mind, were a further correlate of avoidance.
Evidence that obesity is viewed as a contaminant comes from research describing a ‘stigma by association’ process, in which being in physical proximity to an obese individual has a negative impact on people's evaluations of the bystander. Bystanders seated next to obese (versus average-weight) individuals were denigrated consistently, regardless of the perceived depth of the relationship, the evaluator's anti-fat attitudes or gender, and whether or not positive information was presented concerning the obese individual. This effect has also been reported in children as young as 5 years. A more recent developmental study found that obese children transmit negative properties to previously neutral objects they have contacted. Here, Caucasian American and Chinese 7 and 10 year-olds were presented with beverages purportedly created by, and thereby having come into contact with, either obese or average-weight children. Compared with drinks created by average-weight peers, children believed that drinks created by obese peers tasted worse and would more likely result in illness following ingestion. In addition, Chinese children, who are less familiar with obesity, showed the same effects as their Caucasian counterparts.
The available evidence on disgust and contamination in response to stigma targets is consistent with our disease-avoidance account—it appears that people who are vulnerable to stigmatization carry the potential to contaminate objects and other people, a phenomenon that cannot be readily accommodated within alternative theories of stigmatization. These data also imply that the parallel reactions noted earlier—to people with infectious disease and to people who merely appear to have disease signs or labels—probably share the same underlying cause: disease avoidance mediated by disgust.
6. Extension of the model into other domains of stigma
Many forms of stigmatization, such as those based on sexual orientation, poverty, skin colour, ethnicity and occupation, might appear to fall completely outside the scope of this model. While this is not a problem in one regard, as many of the theories considered earlier may be correct within specific domains—as may disease avoidance—it is a problem for the broader claim that disease avoidance contributes to some degree to many, if not all, forms of stigmatization. For this broader claim to be correct, there must be a route from the more directly related forms of disease avoidance (visible disease signs and disease labels) to the apparently unrelated types of stigmatization outlined above and in table 1. We suggest that the model we presented earlier contains within it the mechanisms necessary to supply this link.
Before turning to look at the stigmatization of particular groups, it is instructive to consider a more innocuous example of how social attitudes may be transformed, in this case for the better, by linking particular behaviours (hygiene)—and ultimately people who violate the norm this creates—to disease. In contemporary western society, poor personal hygiene—visibly dirty hands or body odour—is reported to be ‘disgusting’, and people may avoid unhygienic others [168,169]. This was not always so. Historically, many westerners did not bathe and smelled strongly of body odour. It was only with changing medical opinion, from the dangers of bathing to its advantages, along with the hygiene movement of the nineteenth century, that personal cleanliness came to acquire a moral dimension. To be dirty was to be slothful and, crucially, meant risking infectious disease for oneself and one's kin. Indeed, this message was frequently reinforced in advertisements and other social media, linking poor personal hygiene with disease [170,171]. As these measures tended to be adopted first by those higher in the social hierarchy, cleanliness became aspirational and its meaning further transmuted to equate cleanliness with personal and social success, and terms like ‘clean’ and ‘dirty’ came to imply not just the physical state of an individual but also their moral condition (i.e. clean-living, dirty-minded). This process of ‘moralization’ recruits the emotion of disgust and any associated negative qualities and meanings, and projects them onto previously acceptable attitudes, products and … people. In this sense, the emotion of disgust has been culturally co-opted to influence distinctions between desirable and undesirable behaviour, by explicitly linking that behaviour to disease. In the context of the model outlined earlier, the disgust and contamination component becomes invoked, via the cognitive component of the model, by cues that hitherto would not have evoked disgust.
Ethnic out-groups have often been blamed for outbreaks of epidemic diseases [172–174], and such outbreaks can also provoke negative reactions to any ‘outsiders’ (see , for a discussion of the treatment of Mexican Americans during the recent H1N1 outbreak). Members of ethnic out-groups are also more directly associated with concepts of disease, and there are many historical examples. Notably, people of African descent in the USA were often portrayed as disease vectors capable of infecting other parts of the American population. This purported relationship has been cited in both the scientific and popular press. For example, a 1911 medical paper reported that the incidence of hookworm, a debilitating condition prevalent in the southern states of America, ‘possibly indicates that the Negro has brought [it] with him from Africa … and spread it broadcast through the South. … we must frankly face the fact that the Negro … because of his unsanitary habit of polluting the soil … is a menace to others’ [8, p. 531].
The association between race and disease concepts was also evident in ‘scientific’ work supporting apartheid in South Africa (1948–1994). Here, ‘blacks’ were thought to carry ‘inferior characteristics’ and were to be avoided lest they risk ‘poisonous infiltration’ of the white community [177, p. 79]. A number of laws were formalized to eliminate contact between whites and other races (i.e. blacks, coloureds and Indians), including forced racial segregation in all public amenities, buildings and transport, prohibition of inter-racial marriage, and prohibition of inter-racial sexual relations. The risks deemed to be associated with inter-racial marriage or sexual relationships were children prone to poor health and weak constitutions, who also suffered great ‘ … physical disharmony (e.g. large native teeth in a small European mouth) … ’ [177, p. 226]. Likewise, it was suggested that children of ‘mixed blood’ lacked moral balance and were a threat to civilized white society [177, p. 227].
Similarly, the American eugenics movement essentially halted immigration in the early twentieth century (the Immigration Act of 1924) by alleging that recent American immigrants (e.g. Russian and Polish Jews, Italians and Central Europeans) possessed various inadequacies (e.g. mental illness, deformities, poor hygiene, tuberculosis) which posed a threat to the health of the larger community. Likewise, much Nazi propaganda focused on linking Jewish people with parasites and vectors of disease.
A parallel argument can also be made for the association between a gay lifestyle and the AIDS epidemic, which, as described earlier, led to the stigmatization of groups connected with this disease. Reactions to the HIV/AIDS epidemic over recent decades suggest that sexually transmitted diseases are intimately caught up in a society's understanding of what constitutes ‘normal’ or acceptable sexual behaviour and what it defines as atypical or deviant. Social attitudes towards sexuality and sexual behaviour are known to be shaped by the disease history of the local inhabitants. Indeed, sexually transmitted diseases have played a large role in public debate and government policy, from fears in the late nineteenth and early twentieth centuries about the future of the white race and eugenics (e.g. segregation of Australian Aborigines), to the perceived dissolution of morality in wartime (e.g. quarantine of prostitutes), to the liberalization of the 1960s and 1970s (e.g. the advent of penicillin). In sum, cultural pressures can come to identify certain groups with disease, resulting in disease-avoidant behaviour and stigmatization of individuals who are identified—labelled—as belonging to such groups. This process, moralization, as instantiated in the model we presented earlier, may then allow a disease-avoidance process to contribute to multiple forms of stigmatization that show no immediate or obvious connection with disease.
The central argument of this article is that both stigmatization and avoidance of persons with infectious diseases are consequences of the same underlying process. This process of physical avoidance and social exclusion results from three functionally discrete systems operating in an integrated manner (figure 1). These are: a disgust/contamination component, which motivates avoidance; the detection of atypical body forms, which alerts the perceiver to potential disease-related threats; and a cognitive system that enables activation of the disgust/contamination module via labels, can evaluate and respond to disease threats with reference to knowledge of contemporary germ theory, and can engage the emotion of fear/anxiety. This system, which in animals may involve only the first two components, has in humans become sufficiently flexible, with the addition of the third cognitive component, that it can be used to promote avoidance of people who appear to be healthy, but who have become linked to disease-related knowledge by a label or, more indirectly, via the societal process of moralization. In this final section, we discuss the evidence reviewed above in relation to our theoretical account, noting its problems, examine whether other forms of stigmatization can be accommodated within it and look at its relationship to mate selection.
While most of the literature reviewed above provides ample evidence of avoidance of disease-related signs (i.e. physical and label), far fewer studies have examined disgust and contamination in response to such cues. Four studies reported disgust facial expressions in response to facial disfigurements [26,151–153], four provided self-report ratings indicating felt disgust towards disabled, obese and underweight people [156–159], and eight indicated evidence of contamination sensitivity in relation to disease signs [25,74,80,89,164–167]. The most distinctive aspect of our theoretical approach, in contrast to other potential accounts, is that the stigmatized target should generally engender disgust, and be able to contaminate other objects and people. Both predictions are relatively straightforward to test and have yet to be examined across a wide range of stigma targets.
This line of enquiry would also need to be extended into three further domains. First, disgust-related responding, and hence stigmatization, towards reliable correlates of disease (e.g. facial distortions) should be observable early in development. Second, cross-culturally, we would expect that a common characteristic of stigmatized groups is their capacity to induce disgust and to contaminate other objects and people, but there have been no direct tests of this prediction. Third, the nature of disease-avoidant behaviour in animals is not well understood. At present, it would seem that while there is plenty of evidence of avoidance of disease-related cues across many different taxonomic classes (from termites to primates), whether (for higher vertebrates at least) this includes some rudimentary form of disgust and contamination is not known.
A further issue is the validity of the model presented in this article. It is important to note that the claim that disease avoidance underpins certain or many forms of stigmatization is independent of the correctness of the model we have presented. However, we felt that a notable shortcoming of previous claims that disease avoidance underpins stigmatization was their lack of specificity, which makes it hard to derive testable predictions. The model presented here, by contrast, generates such predictions, many of which were discussed above. The model is unique in proposing two separate contamination systems (implicit and explicit) and in its account of how verbal labels can come to activate the disgust/contamination component of the system. Examining this model for plausibility, it offers no obvious contradictions with what is known about disgust/contamination or with contemporary knowledge of cognition.
Of particular interest for any theory is the evidence that is poorly accommodated within it, and below we discuss a number of groups who are known to be stigmatized and who may not fit within the model. One such group is people with a mental illness. Mentally ill people experience universal rejection [183–186], report avoidance by friends and family, and are disadvantaged in the employment and real estate markets. The avoidance of mentally ill people is thought to be motivated by fear, as they are commonly described as unpredictable and dangerous. For example, a survey found that 80 per cent of Americans believed that mentally ill individuals are more likely to commit violent crimes, that it is natural and appropriate to be afraid of someone who is mentally ill, and that former mental patients are dangerous. However, there are several reasons why mental illness may also serve as a disease cue. People with mental illnesses have often been described as ‘unclean’, ‘dirty’ and ‘unkempt’, all of which are clearly associated with poor hygiene and thus with disease [15,62,192]. These descriptors may only be relevant for a subset of mentally ill people—schizophrenia and depression are often characterized by a disregard for appearance and personal hygiene. Homelessness in the mentally ill is also associated with diminished treatment compliance and efficacy, and the condition of living on the streets or in shelters is also likely to impair hygiene practices, producing feelings of disgust in observers and thus motivating avoidance. Therefore, the rejection of mentally ill people appears to be driven primarily by fear, but also by other factors, one of which may be disease avoidance.
There is much evidence reporting that elderly people endure various forms of segregation and social exclusion [11,194,195]. However, ageism does not appear to be a universal phenomenon. Attitudes towards the elderly were reported to be most favourable in ‘primitive’ societies and to decrease with increasing modernization. Chinese and Japanese communities typically treat elders with respect, as their age is recognized as a source of prestige and honour. It might be that in western societies, ageism reflects the fact that, relative to younger adults, older adults are characterized by decreased levels of cognitive flexibility and physical ability [198–200], leading them to be judged as having limited potential in the realm of social exchange relationships [5,104]. In addition, it has been suggested that elderly people remind us of our own mortality—an aversive experience with implications for a mortality salience-based form of avoidance [99,100]. Nonetheless, there are certain parallels between signs of old age and disease. Some of the physical characteristics of older adults (e.g. wrinkles, hair loss and skin changes) may be perceived as physically unattractive and sickly, and could therefore trigger disgust. Similarly, the greater incidence of disease in the elderly, and the loss of bladder and bowel control in those with certain conditions, might also predispose towards disgust and hence avoidance. As with mental illness, it seems that multiple factors may shape attitudes towards the elderly, one of which again might be disease avoidance.
Recent theorizing has suggested that disease-avoidance mechanisms may generalize beyond the tendency to respond to cues signalling abnormality in morphology or motor behaviour; they may also respond to cues signalling ‘cultural strangeness’ [6, p. 17]. Individuals may be especially adept at learning to detect a wide range of inferential cues that discriminate between familiar (low disease risk) and foreign peoples (high disease risk). Consistent with this view, individuals often exhibit disgust reactions when speaking about ethnic out-groups. Similarly, in Rozin et al.'s typology of disgust, ‘direct and indirect contact with strangers or undesirables’ constitutes the domain of ‘interpersonal contamination’, suggesting that this category of interaction can potentially be regarded as disgust-eliciting. Despite this, we do not consider that ‘strangers’ experience the same response as other stigmatized groups. This is because stigmatized individuals experience chronic avoidance—that is, avoidance most of the time and in many social situations and contexts. This is not likely to be true for the category of ‘stranger’ because presumably they enjoy a form of in-group membership elsewhere. In addition, ‘strangers’ can become familiars, arguably losing the characteristic that may have initially led to avoidance. Rather, we might better regard avoidance of ‘strangers’ in the same way we might regard our response to a friend who had the flu—a temporary state of disease-related avoidance.
Those who have been convicted of a criminal offence or who have served time in jail are another stigmatized population that can be only partially accommodated within a disease-avoidance account. Stigmatization of this group may arise from numerous sources, such as fear of attack (i.e. violent offenders) or because of violation of social rules of access to resources (i.e. poor social exchange partners). However, as with mental illness, prisoners (and jails) are often associated with dirt, disease and poor hygiene, which as noted above can produce feelings of disgust that may then motivate avoidance and rejection.
Finally, the animal and human literature provides plenty of evidence that selection of a healthy mate is an important part of selecting a sexual partner [203,204]. Errors in this process would serve to disadvantage the healthy partner in passing on their genes to the next generation. From this perspective, mate selection might be especially important in relation to disease avoidance and thus to stigmatization, and three predictions can be derived from this suggestion. First, we might expect that females would be more likely to stigmatize both men and women, as they typically invest more in reproduction than men do. For a female to select an unhealthy mate, or to associate with men or women carrying infectious pathogens, is arguably more costly than it is for a man. Second, and as we noted earlier in a different context, we might expect that the prospect of sexual contact with a stigmatized target should elicit the highest levels of avoidance and fear of contamination, and while this should be true for both men and women, it might be more pronounced for the latter for the reason outlined above. Third, we might expect stigmatization of other people to be most pronounced during the reproductively active phase of men's and women's lives, and that older people and people already in stable relationships would be less likely to stigmatize others. As far as we are aware, these possibilities have not been directly tested.
A disease-avoidance model should not be taken as a dismissal of the many other processes that contribute to discriminatory behaviour. As noted earlier, there is an abundance of research reporting the influence of multiple processes, including social identity, terror management, social unease, social exchange, halo effects and blame. But there is mounting evidence of another, less obvious process that also contributes to most forms of stigmatization, one which we suggest is both more fundamental and more general: a psychological system designed to protect our bodies from contact with infectious disease.
The authors thank the Australian Research Council for their continued support.
One contribution of 11 to a Theme Issue ‘Disease avoidance: from animals to culture’.
This journal is © 2011 The Royal Society