INTRODUCTION
In a previous paper (Vagenas et al. 2007) a deterministic, dynamic model was developed which accounts for the interaction of parasitism with sheep nutrition, and for the consequences of gastrointestinal parasitism on the performance of the host. In this model growth was described by a standard Gompertz function, and comprehensive descriptions of host-parasite interactions and of the impact of parasitism on nutrient utilization were incorporated. Importantly, the model makes it possible to explore joint effects of host nutrition and of host genotype for performance and resistance.
There is a substantial body of literature on the trade-offs between immunity and life-history traits, which include traits of economic importance for animal production, such as growth (e.g. Norris and Evans, 2000; Lochmiller and Deerenberg, 2000; Long and Nanthakumar, 2004). These trade-offs could be the consequence of life-history and immunity traits competing for scarce nutrients. However, it has also been suggested that resistance to pathogens can be achieved through non-costly means of defence, such as altered host biochemistry or host behaviour (Coustau et al. 2000; Rigby et al. 2002). With respect to gastrointestinal parasite infections in sheep, these 2 mechanisms could be modelled through 2 routes, which may operate simultaneously: (a) as a consequence of the partitioning of scarce nutrients to different body functions, and (b) as non-costly differences in traits affecting the host-parasite interaction. In the second case, hosts explicitly differ in traits of an immunological nature, whereas in the first this is not necessarily so. The model of Vagenas et al. (2007) can be used to investigate both mechanisms. Importantly, it allows exploration of relationships between immunity and performance without explicitly assuming trade-offs per se. An investigation of these 2 mechanisms requires the effect of the nutritional environment to be explored.
The aims of this paper were 3-fold. Firstly, the sensitivity of the model outputs to its various input parameter assumptions was investigated. Secondly, the effect of 3 nutritional regimes on performance, and on traits representing the interaction between host and parasite, was addressed. Finally, the combined effect of host genotype, nutrition and different levels of parasite exposure was explored. The nutritional regimes represented different levels of nutrient availability under different management scenarios. The genotypes modelled differed only in their growth characteristics and, as a consequence, in the way they partitioned nutrients between competing functions. By using 2 different genotypes we were able to explore the proposition that differences in the growth characteristics of different genotypes can, on their own, account for the observed differences in resistance between genotypes. The outcome of this investigation was expected to provide insights into the interaction between nutrition and genotype at different levels of parasitism.
MATERIALS AND METHODS
Sensitivity analyses
In this model the values of a number of parameters had to be assumed on conceptual grounds, since they could not be estimated from the literature. Thus, a sensitivity analysis was undertaken to evaluate the effect of variation in the values of these parameters on model outputs. The sensitivity analysis was performed for the model parameters given in Table 1, these being parameters deemed to be particularly influential and for which there is considerable uncertainty. Each parameter was varied by ±50% of its default value, apart from LImax (the maximum larval intake up to which immunity was assumed to continue to increase), which was varied by ±25% of its default value. In this case, a −50% value is incompatible with the value of the parameter LIinf (the inflection point in the relationship that describes protein loss as a function of larval intake) occurring at 50% of LImax. The output traits for which the effects of the parameters were evaluated were food intake (effects on food intake were larger than those on growth rate) and worm burden. Worm burden is the trait one would ideally measure to determine the resistance of the host (Beh and Maddox, 1996; Raadsma et al. 1998); this is not currently possible in live animals, and faecal egg counts are used as an indicator trait for worm burden.
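The one-at-a-time perturbation scheme can be summarized with a short sketch. The `run_model` function and the parameter names below are hypothetical placeholders for the simulation of Vagenas et al. (2007); the sketch only illustrates the ±50% (±25% for LImax) design described above, not the model itself.

```python
# Minimal sketch of the one-at-a-time sensitivity design described above.
# `run_model` is a hypothetical wrapper around the simulation; it is assumed to
# return cumulative food intake and worm burden over 120 days post-infection.
DEFAULTS = {"K_eps": 1.0, "C5": 1.0, "LI_max": 1.0}  # placeholder default values

def sensitivity(run_model, defaults=DEFAULTS):
    base_intake, base_burden = run_model(defaults)   # benchmark run
    results = {}
    for name, default in defaults.items():
        # LImax is varied by +/-25% (a -50% change conflicts with LIinf at 0.5*LImax);
        # all other parameters are varied by +/-50% of their default value.
        step = 0.25 if name == "LI_max" else 0.50
        for factor in (1 - step, 1 + step):
            params = dict(defaults, **{name: default * factor})
            intake, burden = run_model(params)
            # Express outputs relative to the benchmark, as in Table 3.
            results[(name, factor)] = (intake / base_intake, burden / base_burden)
    return results
```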
Table 1. Model parameters for which a sensitivity analysis was performed: default values and brief description of their function
For the sensitivity analyses a genotype equivalent to the Scottish Blackface was used (Friggens et al. 1997). The Scottish Blackface is an important breed in the UK; it is raised extensively and faces continuous, natural parasitic challenges. The default parameters used to describe this genotype are given in Table 2. The sheep was given an in silico trickle infection of 3000 L3 Teladorsagia circumcincta per day and offered ad libitum access to a good quality grass (containing 12·6 MJ Metabolisable Energy (ME)/kg Dry Matter (DM) and 0·19 kg Crude Protein (CP)/kg DM; AFRC, 1993). The model was used to simulate the period from weaning (2 months of age) to slaughter at 6 months of age.
Table 2. The growth and body composition characteristics of the two genotypes used in the simulations
1 Values estimated by Friggens et al. (1997).
2 Values estimated by Blaxter et al. (1982).
3 Weaning weight at the same age of 2 months.
Exploration of the model
The model was used to investigate the effect of nutrition and of varying levels of challenge with T. circumcincta on the performance of a lamb growing from 2 to 6 months of age. Two experiments were simulated, assessing (a) the effect of nutrition on the growth performance of a single genotype under 1 level of parasitic challenge, and (b) the effect of genotype and level of challenge on growth performance under 3 different nutritional regimes.
(a) Experiment 1
Genotype was defined by the mature body protein and lipid weights, from which the value of the (Gompertz) growth rate parameter, B, is estimated (Jones et al. 2004). A genotype similar to the Scottish Blackface, as described above, was used. The animals were assumed to be either uninfected or given a high trickle challenge (6000 L3 per day). This level of challenge is sufficiently high to demonstrate subclinical effects of parasitism, and similar levels have been used in the literature to assess the effect of infection with T. circumcincta on the performance of lambs (Coop et al. 1977, 1982).
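To illustrate how genotype enters the simulations, a minimal sketch of a Gompertz body protein trajectory is given below. The functional form is the standard Gompertz equation; the parameter values shown are placeholders (not those of Table 2), and the derivation of B from mature protein and lipid weights (Jones et al. 2004) is not reproduced here.

```python
import math

def gompertz_protein(t, P0, Pm, B):
    """Body protein mass (kg) at time t (days) under a standard Gompertz growth function.
    P0: protein mass at the start of the simulation; Pm: protein mass at maturity;
    B: the growth rate parameter discussed in the text."""
    # Equivalent form: P(t) = Pm * exp(ln(P0/Pm) * exp(-B*t)); at t = 0 this returns P0,
    # and as t increases the animal approaches its mature protein mass Pm.
    return Pm * (P0 / Pm) ** math.exp(-B * t)

# Illustrative values only (placeholders, not the Table 2 parameters):
print(round(gompertz_protein(t=120, P0=1.2, Pm=9.0, B=0.01), 2))
```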
Three different nutritional regimes were simulated: (i) both control (uninfected) and trickle-infected animals were offered a good quality grass (containing 12·6 MJ ME/kg DM and 0·19 kg CP/kg DM; AFRC, 1993) ad libitum. In this case the animals were expected to be able to cover their nutritional requirements, and thus the effect of parasitism in non-limiting circumstances could be assessed. (ii) The animals were offered a poor quality grass (containing 7·5 MJ ME/kg DM and 0·097 kg CP/kg DM; AFRC, 1993) ad libitum. (iii) The animals were offered the poor quality grass in restricted amounts, with the amount of grass offered being set to that consumed by an uninfected animal offered the good quality grass ad libitum. This type of restriction tracks the changing nutritional requirements of growing animals and creates an appropriate level of nutrient scarcity. Such a treatment might arise when sheep graze grass of short sward height (Phillips, 1993).
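For reference, the three regimes can be summarized in a small configuration sketch; the composition values are those quoted above (AFRC, 1993), while the layout itself is simply an illustrative encoding rather than anything taken from the model code.

```python
# The 3 simulated nutritional regimes, encoded for reference. Composition values are
# those quoted in the text (AFRC, 1993); the dictionary layout itself is illustrative.
GOOD_GRASS = {"ME_MJ_per_kg_DM": 12.6, "CP_kg_per_kg_DM": 0.19}
POOR_GRASS = {"ME_MJ_per_kg_DM": 7.5,  "CP_kg_per_kg_DM": 0.097}

REGIMES = {
    "good_adlib": {"feed": GOOD_GRASS, "access": "ad libitum"},
    "poor_adlib": {"feed": POOR_GRASS, "access": "ad libitum"},
    # Restricted: the daily allowance is set to the intake of an uninfected lamb
    # offered the good quality grass ad libitum on the same day of the simulation.
    "poor_restricted": {"feed": POOR_GRASS, "access": "restricted"},
}
```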
(b) Experiment 2
Two genotypes were simulated, equivalent to the Scottish Blackface and the Suffolk, the latter being an important terminal sire genotype of higher mature weight and lower rate of maturation than the Blackface. The 2 genotypes were assumed to be similar in their immunological status and in their ability to mount an immune response, but they differed in their weaning and mature weights, their body composition at maturity and their rate of maturation. The parameters used to describe both genotypes are given in Table 2. In this experiment the 3 nutritional regimes used in Exp. 1 were repeated. Two extra levels of challenge were used, a low (1000 L3 per day) and a medium level (3000 L3 per day), in addition to the high level (6000 L3 per day). These levels were chosen to cover a spectrum of subclinical challenges and are similar to challenges used in the literature for assessing the effect of infection with T. circumcincta on the performance of lambs (e.g. Coop et al. 1977, 1982).
RESULTS AND DISCUSSION
Sensitivity analyses
The results of the sensitivity analyses on food intake and worm burden are presented in Table 3. Results for high and low parameter values are presented relative to those obtained for the benchmark values, both for the average effect over the simulation period and the maximum divergence.
Table 3. Output of the sensitivity analyses performed
(The results are presented as cumulative food intake or worm burden over 120 days post-infection, relative to the equivalent trait at the benchmark value. The numbers in parentheses are the maximum relative deviations from the benchmark value.)
† The low and high values for LImax correspond to 0·75 and 1·25×Benchmark, respectively. In the case of LImax, 0·5 times the benchmark value would result in a singularity, so this narrower range was used instead.
Varying the investigated parameters generally had small overall effects on food intake. The largest effect was observed when increasing the parameter in the function used for the prediction of the reduction of the growth rate parameter (C5): an average reduction of 4% over the 120-day period and a maximum reduction of 19% occurred when this parameter was increased by 50% of its default value. This may be expected, as this parameter directly controls the reduction of the growth rate parameter, which in turn is responsible for the anorexia observed during parasitism. On the other hand, 3 parameters had a large impact on worm burden. When the parameter used in the function for estimating the establishment rate of incoming larvae (Kε) was increased by 50%, the worm burden decreased by 46%, whereas a 50% decrease in the value of Kε led to a 76% increase in worm burden. The maximum effects on worm burden (in parentheses in Table 3) in these cases were a 67% decrease and a 314% increase, respectively. Changes in the maximum larval intake up to which the immune response is assumed to continue to increase (LImax) had a relative effect on worm burden similar to that of changes in Kε (N.B. LImax was varied by 25% rather than 50%).
Experiment 1
The predicted food intake of a Scottish Blackface lamb for the 3 nutritional regimes investigated is shown in Fig. 1. When the animal was given access to good quality grass, anorexia was observed, the maximum of which was a reduction of 23% relative to the control. This is qualitatively similar to the pattern of anorexia observed in the literature (Coop et al. 1982). Anorexia was modelled as a function of adult worm mass; thus it increases as more adult worms develop in the gastrointestinal tract, and it subsequently declines as the host becomes able to control the parasitic burden through the expression of immunity. However, the expression of immunity creates a nutritional demand on the animal, leading to an increased desired food intake. Therefore, the challenged animals on good quality feed consumed extra food to meet this demand, amounting to approximately 5% at 120 days of age. This extra demand continued as long as the animals continued to be challenged with parasites (results not shown). This is in agreement with the results of Liu et al. (2005), who estimated that resistance to gastrointestinal parasites led to 5% higher requirements in young sheep. Furthermore, in an experiment by Kyriazakis et al. (1996), trickle-infected animals consumed more feed than control animals following the expression of immunity and the recovery of their feed intake. The results of Liu et al. (2005) and Kyriazakis et al. (1996) provide independent support for the predictions of this model.
Fig. 1. The effect of 3 nutritional regimes on the food intake of a Scottish Blackface lamb which was either uninfected or trickle infected with a high dose of Teladorsagia circumcincta, from weaning (2 months of age) to slaughter (6 months of age). The nutritional regimes were: (a) good quality grass, ad libitum (— uninfected, – – – infected); (b) poor quality grass, ad libitum (... uninfected, –.– infected); (c) poor quality grass, restricted. The animals given the poor quality grass in a restricted amount had the same food intake as the uninfected control offered the good quality grass ad libitum, and are not shown.
On the other hand, when a poor quality grass was offered ad libitum, the intake of the animal was constrained by the bulk of the feed (Fig. 1). In this case the animal needs to eat more grass to meet its nutritional requirements than when good quality grass is offered. However, the intake required to meet its nutrient requirements is greater than the amount of feed the animal can physically consume, due to the bulk constraint, and as a consequence no anorexia is predicted. The bulk constraint was assumed to be a function of body weight, and since the challenged animal has a lower growth rate than the control, its bulk capacity is slightly lower. Thus, the challenged animal consumed less feed than the uninfected control, but the mechanism through which this happens differs from the case in which good quality grass was offered ad libitum. To our knowledge, this prediction has yet to be tested experimentally.
When both the control and the challenged animals were offered a limited amount of the poor quality feed there was no difference in their feed intake, as the amount offered was below the bulk constraint and also below that required to meet their nutritional requirements. As a consequence, both the control and the challenged animals consumed all the feed offered and no anorexia was observed.
The challenge with T. circumcincta had a temporary effect on growth rate (Fig. 2). For the control animals, growth rates differed between the nutritional regimes, being highest for animals offered the good quality grass (0·347 kg/day), intermediate for the ad libitum poor quality grass (0·224 kg/day) and lowest for the restricted poor quality grass (0·181 kg/day). The relative reduction in empty body weight due to the parasitic challenge was similar in all 3 nutritional regimes. Since compensatory body protein growth was not assumed in the model, the growth rates of the control and the infected animals were the same once the infected animals had developed immunity. Furthermore, as the animals approached maturity the differences in empty body weight were reduced, and eventually mature animals attained the same body weight (results not shown), irrespective of nutritional regime and parasitic challenge.
Fig. 2. The effect of 3 different nutritional regimes on the Empty Body Weight (EBW) of a Scottish Blackface sheep either uninfected or trickle infected with a high dose of Teladorsagia circumcincta, from weaning (2 months of age) to slaughter (6 months of age). The nutritional regimes were: (a) good quality grass, ad libitum (— uninfected, – – – infected) (b) poor quality grass, ad libitum (... uninfected, – . – infected) (c) poor quality grass, restricted, (.. – uninfected, … … infected).
The worm burdens of the infected animals on the 3 nutritional regimes are given in Fig. 3A. Immunity developed at the same rate in all 3 cases, as it is a function of cumulative larval intake only. As immunity develops, the ability to control worm burden is affected by nutrient availability and, as expected, the greater the nutrient availability the lower the peak worm burden. Peak worm burdens for the ad libitum and restricted poor quality grass were 1·17 and 1·26 times that for the ad libitum good quality grass, respectively. Since the simulated challenge was subclinical, the animals eventually managed to control the worm burden effectively and reduce it to low levels. However, worms were never totally eliminated, as the establishment and mortality parameters are never reduced to zero.
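The qualitative behaviour described above can be illustrated with a deliberately simplified daily update; the functional forms and numbers below are assumptions for illustration only, not the published equations of Vagenas et al. (2007).

```python
# Illustrative (not the published) worm burden dynamics under a daily trickle infection:
# as immunity develops, larval establishment falls towards a floor and adult worm
# mortality rises towards a ceiling, so the burden peaks and then declines without
# the worms ever being completely eliminated.
LARVAL_DOSE = 6000                       # L3 ingested per day
establishment, mortality = 0.6, 0.01     # naive-host starting values (placeholders)
worms = 0.0

for day in range(120):
    establishment = max(0.02, establishment * 0.96)  # immunity reduces establishment
    mortality = min(0.50, mortality * 1.05)          # immunity increases adult mortality
    worms += establishment * LARVAL_DOSE - mortality * worms
    if day % 30 == 0:
        print(day, round(worms))
```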
Fig. 3 (A and B) The effect of different levels of nutrition on the worm burden and faecal egg counts (eggs per gram of dry faeces) respectively of a Scottish Blackface sheep infected with a high dose of Teladorsagia circumcincta, from weaning (2 months of age) to slaughter (6 months of age). The nutritional regimes were: (a) good quality grass, ad libitum (—) (b) poor quality grass, ad libitum (...) (c) poor quality grass, restricted (– . –).
Qualitatively, the nutritional effects on worm burdens over time are similar to those observed in experiments (van Houtert et al. 1995). Experimentally, it is currently not possible to determine the worm burden of a living animal; thus, the relevant data are collected in serial slaughter experiments (e.g. Steel et al. 1980; Dobson et al. 1992; Barnes and Dobson, 1993; van Houtert et al. 1995). The general pattern observed for worm burdens over time in such experiments is an initial increase followed by a reduction to low levels. However, even immune animals harbour a small number of worms (Houdijk et al. 2001).
The effect of nutrition on faecal egg counts is given in Fig. 3B, presented on a dry matter basis rather than per gram of wet faeces as is usual. Faecal egg counts were lowest for the poor quality grass offered ad libitum, and highest for the good quality grass offered ad libitum. At first sight this seems counter-intuitive, since it is the reverse of the order predicted for worm burden. However, faecal egg counts are a function of both the egg output of the adult female worms and the faecal output of the host. Total egg output on the 3 nutritional treatments (not shown) followed a pattern similar to the worm burdens shown in Fig. 3A. However, the predicted faecal output of the host differed considerably between the 3 nutritional treatments, with animals on the good quality grass consuming less feed than those on the 2 other nutritional treatments. Furthermore, the good quality grass had a higher digestibility (AFRC, 1993) and thus faecal output, for the same intake, was lower than for the poor quality feed. Thus, the concentration of worm eggs in the faeces was higher for the good quality grass. This observation might explain the conflicting results found in the literature with respect to the effect of various treatments on faecal egg count (e.g. Hoste et al. 2006).
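The dilution effect described above is purely arithmetic, and can be shown with a small hypothetical calculation; the intake and digestibility values below are placeholders chosen only to make the point.

```python
# Why faecal egg counts (per g DM faeces) can rank treatments differently from worm burdens:
# the same daily egg output is diluted by whatever mass of faecal dry matter the host produces.
def fec_per_g_dm(eggs_per_day, intake_kg_dm, digestibility):
    faecal_dm_g = intake_kg_dm * (1.0 - digestibility) * 1000.0  # indigestible DM excreted
    return eggs_per_day / faecal_dm_g

# Identical egg output; placeholder intakes and digestibilities for the two grasses:
print(round(fec_per_g_dm(100_000, intake_kg_dm=1.2, digestibility=0.75)))  # good grass -> ~333 eggs/g
print(round(fec_per_g_dm(100_000, intake_kg_dm=1.6, digestibility=0.55)))  # poor grass -> ~139 eggs/g
```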
Experiment 2
Good quality grass, ad libitum
The consequences of the 3 levels of infection on food intake for the 2 genotypes are given in Fig. 4 (A–C). In the absence of challenge, the Scottish Blackface consumed less feed than the Suffolk. The food intake of both genotypes was slightly increased when they were trickle infected with 1000 L3 (Fig. 4A), as the increased requirements for immunity and damage repair outweighed the potential anorexia effects. However, this difference was quite small and it is unlikely that it would be detected experimentally. Anorexia was observed for the higher infectious doses, increasing with dose rate (Fig. 4B and C). Both breeds showed the same relative anorexia, with maximum reductions in food intake of 5% and 23% for the 3000 L3 and 6000 L3 infection levels, respectively. As in Exp. 1, the infected animals had a higher predicted intake than uninfected animals after the development of immunity. Effects on growth rate and empty body weight were similar for the 2 breeds.
Fig. 4 (A–C). The effect of different levels of trickle infection on the food intake of 2 genotypes of sheep (Suffolk vs. Scottish Blackface) offered ad libitum a good quality grass from weaning (2 months of age) to slaughter (6 months of age). (— uninfected, – – – 1000 L3 Teladorsagia circumcincta, … 3000 L3 T. circumcincta, – . – 6000 L3 T. circumcincta. Thin lines Scottish Blackface, bold lines Suffolk.)
Worm burdens for the 3 levels of challenge, for the ad libitum good quality grass, are given in the first row of Fig. 5. Because both genotypes were able to consume sufficient feed to meet their growth and immunity requirements, they had the same worm burdens for the same level of challenge.
Fig. 5. Worm burdens of sheep of different genotypes offered 1 of the 3 levels of nutrition whilst exposed to 1 of the 3 levels of challenge. The 3 columns are drawn for LI=1000 L3, 3000 L3 and 6000 L3 (left to right), and the rows for ad libitum offered good quality grass, ad libitum offered poor quality grass and restricted offered poor quality grass (top to bottom). The 2 genotypes were Scottish Blackface (- - -) and Suffolk (—).
The faecal egg counts for the 2 genotypes at the 3 levels of challenge are shown in the first row of Fig. 6. The Scottish Blackface had higher faecal egg counts for the same level of challenge, because they had a lower food intake, and thus a lower faecal output, than the Suffolk.
Fig. 6. Faecal egg counts (eggs/g DM faeces) of sheep of different genotypes offered 1 of the 3 levels of nutrition whilst exposed to 1 of the 3 levels of challenge. The 3 columns are drawn for LI=1000 L3, 3000 L3 and 6000 L3 (left to right), and the rows for ad libitum offered good quality grass, ad libitum offered poor quality grass and restricted offered poor quality grass (top to bottom). The 2 genotypes were Scottish Blackface (- - -) and Suffolk (—).
Poor quality grass, ad libitum
As in Exp. 1, the animals in this nutritional treatment did not consume enough feed to satisfy their requirements, due to the bulkiness of the feed (Fig. 7). The Suffolk and Scottish Blackface showed the same food intake pattern and therefore only the results for the Scottish Blackface are shown. The maximum differences in food intake, compared to the control, were 4%, 5% and 7% for animals trickle infected with 1000 L3, 3000 L3 and 6000 L3 of T. circumcincta per day, respectively. Effects of infection on growth rate were similar to those on the ad libitum good quality food.
Fig. 7. The effect of different levels of trickle infection on the food intake of Scottish Blackface sheep offered ad libitum a poor quality grass from weaning (2 months of age) to slaughter (6 months of age) and exposed to 3 levels of challenge. (— uninfected, – – – 1000 L3 Teladorsagia circumcincta, … 3000 L3 T. circumcincta, – . – 6000 L3 T. circumcincta.)
Under this feeding regime, maximum worm burdens and faecal egg counts were marginally higher in the Scottish Blackface than in the Suffolk (Fig. 5, second row), due to the higher intake of the Suffolk: relative to their intake, and hence their protein resources, the Suffolk faced a marginally lower challenge than the Blackface. This difference was greater for faecal egg counts (Fig. 6, second row), as the higher feed consumption of the Suffolk also resulted in a lower concentration of eggs in their faeces compared to the Scottish Blackface. The greater the level of challenge, the greater the observed difference between the 2 genotypes.
Poor quality grass, restricted
When the animals were offered a limiting quantity of the poor quality grass their growth was further restricted, compared to the previous 2 nutritional treatments. In this case both genotypes had access to the same quantity of nutrients. Due to the rule used for partitioning scarce nutrients (Vagenas et al. 2007), the Suffolk partitioned relatively more nutrients towards growth than the Scottish Blackface, and less towards immunity. This led to a more pronounced difference in the worm burdens of the 2 genotypes (Fig. 5, last row). In this nutritional treatment the Scottish Blackface had a lower worm burden, and thus appeared to be more resistant than the Suffolk, in contrast to the case when the poor quality feed was offered ad libitum. The faecal egg counts of the Scottish Blackface were lower than those of the Suffolk at all levels of parasitic challenge (Fig. 6, last row).
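The behaviour of the two genotypes on the restricted treatment follows from partitioning scarce protein between competing functions. The sketch below assumes a simple pro-rata rule and hypothetical daily requirements; it is intended only to show the direction of the effect, not to reproduce the exact rule of Vagenas et al. (2007).

```python
# A simple pro-rata partitioning of scarce protein between growth and immunity (assumed form).
def partition_protein(supply, growth_req, immunity_req):
    total_req = growth_req + immunity_req
    if supply >= total_req:
        return growth_req, immunity_req          # both requirements can be met
    share = supply / total_req
    return growth_req * share, immunity_req * share

# Hypothetical daily protein requirements (kg): the faster-growing "Suffolk-like" genotype
# has the larger growth requirement, so under the same restricted supply less protein is
# allocated to immunity than in the "Blackface-like" genotype.
print(partition_protein(0.080, growth_req=0.110, immunity_req=0.015))  # Suffolk-like
print(partition_protein(0.080, growth_req=0.080, immunity_req=0.015))  # Blackface-like
```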
GENERAL DISCUSSION
The aim of this paper was to use the model of Vagenas et al. (2007) to gain insights into the interaction between parasitism and nutrition for host genotypes differing in their expected growth rate. The 2 genotypes used for the simulations differed in their initial and mature body weight, composition and growth, being similar to Suffolk and Scottish Blackface. No explicit assumptions were made for differences in immunological resistance to parasites between the 2 genotypes. However, differences in the expression of immunological traits were created as a consequence of the modelled host-parasite interactions.
The model predicted that when the intake of animals is already limited by the bulkiness of the feed (i.e. poor quality grass), parasitized animals will not exhibit pathogen-induced anorexia. This is because the model assumes that food intake is limited by the most limiting constraint (Emmans and Kyriazakis, 2001). An alternative view is that the effects of constraints on food intake may be additive (Anil et al. 1993), which would imply that the food intake of parasitized sheep is reduced to the same extent on either good or poor quality foods. Currently there are no experiments in the literature, for macro-parasites at least, that would allow us to distinguish between these 2 views. We have favoured the former because it seems more consistent to assume that an animal whose intake is already constrained (and hence is in a state of nutrient scarcity) would not further reduce its intake when exposed to parasites.
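The difference between the two views can be made concrete with a toy calculation; the numbers are placeholders, and the "parasite limit" is simply a hypothetical intake ceiling imposed by parasite-induced anorexia.

```python
# Toy comparison of the two assumptions about how intake constraints combine.
def intake_most_limiting(desired, bulk_limit, parasite_limit):
    # The assumption used in the model: only the most limiting constraint operates.
    return min(desired, bulk_limit, parasite_limit)

def intake_additive(desired, bulk_limit, parasite_limit):
    # Alternative view: each constraint removes its own increment from desired intake.
    reduction = max(0.0, desired - bulk_limit) + max(0.0, desired - parasite_limit)
    return max(0.0, desired - reduction)

# A bulky, poor quality feed (bulk limit below desired intake) plus a parasite-induced ceiling:
print(intake_most_limiting(2.0, bulk_limit=1.6, parasite_limit=1.8))  # 1.6 kg/day: no extra anorexia
print(intake_additive(2.0, bulk_limit=1.6, parasite_limit=1.8))       # 1.4 kg/day: anorexia still expressed
```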
Faecal egg counts are used as an indicator trait of worm burden, and therefore of resistance to gastrointestinal nematodes. Faecal egg counts are usually expressed as the concentration of eggs in wet faeces, whereas the output of this model is faecal egg counts per gram of faecal dry matter. This was because (a) water consumed by the animal was not modelled per se, and (b) expressing counts on a dry matter basis removes the uncertainty associated with the water content of the gastrointestinal tract and faeces. The model output for faecal egg counts suggests that comparing faecal egg counts of animals which have different food intakes can lead to misleading conclusions. Animals exposed to a trickle infection which consume greater quantities of food would, other factors being equal, have lower faecal egg counts due to their higher faecal output. This could lead to flawed conclusions about the relative resistance of the animals being compared. The model output also indicates that part of the high variation in estimated faecal egg counts can be attributed to variation in food intake.
One question that can be addressed with our model is whether differences in the growth rate of sheep are sufficient, on their own, to account for differences in the observed resistance of animals. Scottish Blackface sheep are generally, albeit anecdotally, considered to be relatively more resistant to gastrointestinal worms than Suffolk sheep. Furthermore, the 2 breeds differ in their growth rate, with the Scottish Blackface having a lower growth rate, but a faster rate of maturation and a lower mature weight, than the Suffolk. The results of our simulations suggest that this difference in growth would result in differences in apparent resistance only when nutrients are scarce. However, even then the predicted differences between the 2 breeds were small; for example, the Scottish Blackface sheep had 34% higher and 5% lower faecal egg counts than the Suffolk sheep when both were given access to the low quality grass ad libitum and restrictedly, respectively. Given the between-sheep variation in resistance characteristics seen in the field (Bishop et al. 1996), the small differences predicted by this model are unlikely to translate into differences of practical significance. Therefore, other factors, in addition to growth differences, most likely contribute to the differences between breeds reported in the literature.
New genomic techniques are providing insight into what these factors could be. In a microarray experiment comparing lines of Perendale sheep divergently selected for resistance to nematodes, Keane et al. (2006) found increased expression of many stress-response genes in the gastrointestinal tract of susceptible, naïve lambs. This indicates that the gastrointestinal tract of susceptible animals shows signs of stress, and is compromised, even in the absence of infection by gastrointestinal nematodes. This could be incorporated in the current model, for example, as an increased level of damage for the same level of infection in susceptible animals compared to resistant ones. Furthermore, the same authors found that several genes vital for maintaining a healthy immune system were expressed to a higher degree in resistant than in susceptible animals. Such differences can be readily incorporated into our model by introducing appropriate variation in the parameters determining the rate of establishment, fecundity and mortality of the parasites. Using appropriate values, susceptible animals could be simulated so as to harbour more parasites than resistant animals. For example, a larger value for the exponent of establishment would lead to a faster decrease in the establishment rate of incoming larvae for the same quantity of available protein, and hence a faster development of immunity. Such changes would create differences in resistance beyond those created by differences in the growth rate of animals.
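As an illustration of how such variation could be introduced, the sketch below uses an assumed form (not the published equation) for the establishment rate of incoming larvae as a function of the protein allocated to immunity, in which a larger exponent produces a faster decline in establishment.

```python
import math

# Assumed (illustrative) form for the establishment rate of incoming larvae: establishment
# falls from a naive-host maximum towards a non-zero floor as the protein allocated to the
# immune response increases. A larger exponent K gives a faster decline, i.e. a more
# "resistant" parameterization for the same quantity of available protein.
def establishment_rate(immune_protein, eps_max=0.7, eps_min=0.05, K=2.0):
    return eps_min + (eps_max - eps_min) * math.exp(-K * immune_protein)

for K in (1.0, 2.0, 4.0):
    print(K, round(establishment_rate(0.5, K=K), 3))   # 0.444, 0.289, 0.138
```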
Our model currently simulates a single animal exposed to parasites at different levels of nutrition over time. It can be expanded to explore the interaction between genotype and environment when the animals of a population differ in their genetic characteristics for both immunity and production traits. Expanding the model in this way would add further insight, but also further complexity. For example, the between-animal distributions of many of the model parameters are unknown, as are the correlations between these parameters or traits. Exploring the distributional properties of the underlying traits and parameters, the way they interact, and the effect of the nutritional environment would help to elucidate the apparently conflicting results for the observed traits reported in the literature (e.g. McEwan et al. 1995; Bishop et al. 1996). The issue of predicting the performance and resistance of sheep at the population level will be a focus of future research.
This work was financially supported by the Biotechnology and Biological Sciences Research Council. The Scottish Agricultural College receives support from the Scottish Executive Environment and Rural Affairs Department. Our colleagues in the Animal Nutrition and Health Department of the Scottish Agricultural College, and Gerry Emmans in particular, are acknowledged for their input at various stages of the model development.