Time series analysis has revealed two distinct patterns of smallpox epidemics in Britain in the seventeenth and eighteenth centuries: in large conurbations (exemplified by London) the disease was endemic, whereas medium-sized rural towns (exemplified by Penrith, Cumbria) suffered 5-year epidemics with no cases of smallpox in the inter-epidemic years. The oscillations (epidemics) persisted for over 150 years, and it is suggested that both systems were pumped up by regular fluctuations in susceptibility (<latex>$\delta \beta $</latex>). Modelling suggests that: (i) in large cities the oscillations have a natural period of two years and the system is pumped up by a 1-year seasonal input; (ii) in medium-sized towns it takes five years for new births to build up a pool of susceptibles, and epidemics are then triggered by a 5-year input. The equations describe a system with two components, a basic linear element with the remainder of the system being nonlinear; modelling a progressive increase in <latex>$\delta \beta $</latex> in London illustrates theoretically how a predominantly linear response changes to a nonlinear response and ultimately to chaos. A variation in susceptibility is a theoretical condition for inducing chaos; the undriven system cannot become chaotic. Modelling populations of progressively increasing size/density and applying a 1-year or 5-year sinusoidal oscillation in <latex>$\delta \beta $</latex> illustrates the fundamental distinction between the responses of medium-sized rural towns and large cities.
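The driven-oscillation idea above can be sketched numerically. The following is a minimal illustration, not the paper's actual model: a standard birth-death SIR system in which the transmission rate carries a sinusoidal fluctuation standing in for the <latex>$\delta \beta $</latex> term. All parameter values (transmission rate, two-week recovery, 50-year lifespan, 10% forcing) are illustrative assumptions. Without forcing, the epidemics damp toward a steady endemic level; with the seasonal pump, oscillations are sustained indefinitely, mirroring the qualitative distinction the abstract draws between the undriven and driven systems.

```python
import math

def beta(t, beta0, delta, period):
    # Sinusoidal fluctuation in transmission, standing in for the
    # paper's delta-beta forcing term (illustrative form, not the
    # authors' actual parameterization).
    return beta0 * (1.0 + delta * math.sin(2.0 * math.pi * t / period))

def simulate(delta=0.0, period=1.0, years=200, dt=1.0e-3,
             beta0=130.0, gamma=365.0 / 14.0, mu=1.0 / 50.0):
    """Euler integration of a birth-death SIR model with forced beta.

    Illustrative parameters (per year): beta0 -> mean transmission,
    gamma -> recovery after ~2 weeks, mu -> births/deaths for a
    ~50-year lifespan. Returns the infective fraction sampled every
    0.1 year over the final 20 years, after the transient has died out.
    """
    s, i = 0.06, 0.001           # initial susceptible/infective fractions
    samples = []
    for k in range(int(years / dt)):
        t = k * dt
        b = beta(t, beta0, delta, period)
        ds = mu * (1.0 - s) - b * s * i
        di = b * s * i - (gamma + mu) * i
        s += ds * dt
        i += di * dt
        if t >= years - 20.0 and k % 100 == 0:
            samples.append(i)
    return samples

# Undriven (delta = 0): oscillations damp toward a fixed endemic level.
flat = simulate(delta=0.0)
# Driven (delta > 0): the 1-year pump sustains persistent oscillations.
driven = simulate(delta=0.1, period=1.0)
```

Comparing the late-time spread of the two runs shows the driven system still oscillating long after the undriven one has settled; increasing `delta` further pushes the response from this near-linear regime toward the nonlinear and chaotic behaviour described in the text.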