Evolution of virulence for vertebrate hosts

Ewald (1994) has argued that the evolution of high vertebrate virulence is favoured for many vector-borne pathogens because immobilization associated with severe disease favours vector feeding on hosts with impaired defensive behaviour. However, many arboviruses, despite their virulence for humans and domesticated animals, which represent dead-end hosts, have very limited virulence for their natural reservoir hosts (Weaver, 1997). Explanations for limits on virulence to the vertebrate host have invoked the reasoning that host, and hence parasite, mortality selects against more virulent strains.

However, selection for low virulence within clones of the rodent malaria parasite, Plasmodium chabaudi, failed to attenuate the pathogen despite strong selection that mimicked 50–75 % host mortality. Selection on between-host differences in virulence was also unable to counteract selection for increased virulence caused by within-host selection processes (Mackinnon & Read, 1999). In other studies employing parasite clones selected for virulence, Mackinnon et al. (2002) showed that the total number of gametocytes produced during the infection, a measure of parasite fitness, was fourfold higher in mice that survived than in those that died. Among mice that survived, total gametocyte production was greatest in the host genotype that suffered intermediate levels of morbidity (anaemia and weight loss). Thus transmission costs of high virulence were partly due to host mortality, but perhaps also due to some factor related to high morbidity. Other studies examining the impact of host immunity on malaria parasite virulence predict that anti-disease vaccines will select for higher virulence in those microparasites for which virulence is linked to transmission (Mackinnon & Read, 2003).
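The observation that parasite fitness peaked at intermediate morbidity is the empirical signature of the classic virulence-transmission trade-off. A minimal numerical sketch of that trade-off, with illustrative (not source-derived) parameter values, shows how fitness can be maximized at an intermediate virulence when transmission gains saturate while virulence-induced mortality keeps rising:

```python
# Illustrative sketch of the standard virulence-transmission trade-off model.
# Parasite fitness is approximated by the basic reproductive number
#   R0(v) = beta(v) / (mu + v + gamma),
# where v is virulence (parasite-induced host mortality), mu is background
# host mortality, gamma is the recovery rate, and beta(v) is transmission.
# The saturating form beta(v) = v / (1 + v) and all parameter values below
# are assumptions chosen only to illustrate the shape of the argument.

def r0(v, mu=0.1, gamma=0.2):
    """Fitness R0(v) under a saturating transmission-virulence trade-off."""
    beta = v / (1.0 + v)            # transmission gain saturates with virulence
    return beta / (mu + v + gamma)  # gained transmission vs. shortened infection

# Scan a grid of virulence values and find the fitness-maximizing one.
grid = [i / 100.0 for i in range(1, 501)]
v_opt = max(grid, key=r0)

# The optimum lies at an intermediate virulence: both very avirulent and
# very virulent strains transmit less over the whole infection, mirroring
# the gametocyte-production pattern described above.
print(f"fitness-maximizing virulence: {v_opt:.2f}")
```

Under these assumptions the optimum falls well inside the interior of the virulence range, so selection acting through lifetime transmission need not favour ever-lower virulence, consistent with the failure of attenuation in the P. chabaudi selection experiments.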
