Introduction
Animal welfare plays an increasingly important role in decisions within and attitudes towards animal agriculture in the USA. For example, because of pressure from consumers and advocacy groups, food retailers have begun to require their suppliers to pass animal welfare audits on a regular basis. Although this auditing process was initially limited to highly centralized aspects of animal agriculture, such as slaughter plants, food retailers face growing pressure to provide evidence that animal welfare is considered from the farm to the plate. Indeed, the consumer perception of animal welfare can drive the sustainability of the production method, as in the egg industry, where concerns about animal welfare have led to proposed, but ultimately unsuccessful, national legislation banning conventional cages for laying hens (Mench et al., Reference Mench, Sumner and Rosen-Molina2011; Greene and Cowan, Reference Greene and Cowan2012).
The scientific study of animal welfare can inform best practice on farm and, in many cases, provide a basis for animal care, thus contributing to the public discussion about these issues. Sufficient scientific information will allow decisions to be made based on evidence and measured outcomes, rather than solely on perceived ethical concerns. Throughout this review, emphasis is placed on the empirical evidence informing the three main aspects of animal welfare: (1) physical functioning, in that animals should be healthy and thriving, such that one function should not be enhanced to the detriment of another, (2) naturalness, meaning that animals have the ability to engage in behaviors that they are strongly motivated to perform, and (3) subjective states, in that animals can enjoy life; that is to say, they experience positive states and negative states (e.g. pain) are minimized (Fraser, Reference Fraser2008). Empirical evidence about these issues comes from scientific literature about cattle; we included both international and national peer-reviewed and non-peer-reviewed literature of direct relevance to the USA. We have focused on intensive aspects of beef production, but have included information from dairy when relevant and identified it as such.
Our objective is to review scientific information about the welfare of beef cattle in the USA in order to identify priorities for future research. Our process involved both our professional expertise and interpretation of existing literature about intensive aspects of beef production. Comprehensive reviews are available for many of the topics we cover, including more extensive beef systems (Petherick, Reference Petherick2005). Our review aims to address a specific objective: providing a review of existing knowledge in order to identify where we believe additional information is needed. Intensive aspects of beef production have been divided into the following sections: nutrition and growth, health, painful procedures, environmental and housing conditions, social interactions, transport, handling, and slaughter. Key issues and future research needs are highlighted within each section; our summary is provided at the end.
Nutrition and growth
Several areas of nutrition have either received considerable attention within the scientific literature or have been identified as key animal welfare concerns: weaning, feeding high concentrate diets, using body condition as a tool for assessment, and production-related technologies, such as antibiotics, ionophores, hormones (both fed and as implants), and ß-adrenergic agonists.
Weaning
The loss of milk and social contact associated with weaning creates a stressful situation for cows and calves (Weary et al., Reference Weary, Jasper and Hötzel2008), and increases the risk of sickness, particularly when it is coupled with other factors such as transport. For example, calves that are weaned and immediately transported to a new location and comingled with unfamiliar calves have a higher mortality risk and decreased weight gain compared with calves weaned and left at home (e.g. Step et al., Reference Step, Krehbiel, Depra, Cranston, Fulton, Kirkpatrick, Gill, Payton, Montelongo and Confer2008; Edwards, Reference Edwards2010); only a single study found no differences between these two approaches (Boyles et al., Reference Boyles, Loerch and Lowe2007).
Research has explored several aspects of weaning in order to alleviate the stress associated with the process. Calves that are weaned but allowed fence-line contact with their dams display fewer behavioral signs of stress (Boland et al., Reference Boland, Scaglia, Swecker and Burke2008) and gain more weight than traditionally weaned calves up to 10 weeks after separation (Price, Reference Price2002). Similarly, the two-step (or two-stage) weaning procedure (Haley et al., Reference Haley, Bailey and Stookey2005) allows contact between cows and calves. In the first stage, nursing is prevented by placing a plastic anti-sucking device into the calf's nose for 4 to 7 days. In the second stage, the flap is removed and the pairs separated. Weaning in two steps results in fewer vocalizations, more time spent eating, and less time walking in the first week following separation for both cows and calves, compared with abrupt weaning (Haley et al., Reference Haley, Bailey and Stookey2005; Boland et al., Reference Boland, Scaglia, Swecker and Burke2008; Loberg et al., Reference Loberg, Hernandez, Thierfelder, Jensen, Berg and Lidfors2008). Calves weaned in two steps also gain more weight than abruptly weaned calves in the first week following separation (Haley et al., Reference Haley, Bailey and Stookey2005). One study did not show behavioral advantages in two-step calves compared with abruptly weaned calves (Enríquez et al., Reference Enríquez, Ungerfeld, Quintans, Guidoni and Hötzel2010), perhaps because the two-step calves wore the anti-sucking devices for 17 days, at least 10 days longer than in Haley et al. (Reference Haley, Bailey and Stookey2005); anecdotal reports that extended wearing of the flaps allows calves to learn to nurse around the device could explain these results. Overall, the short-term benefits of two-step weaning seem robust, but the long-term health and economic implications are unknown.
Future research
Research is needed to determine the best management practices for non-abrupt weaning methods (e.g. duration of fence line and flap insertion in two-step) and to quantify the effects of these on calf health.
High concentrate feeding
Upon arrival at the feedlot, 58% of US cattle are fed a diet that is at least 50% concentrates; 83% of cattle in the finishing phase are fed this type of diet (United States Department of Agriculture or USDA, 2011b). High concentrate feeding has been associated with nutritional disease, the most common of which are acidosis, liver abscesses and laminitis, the latter two occurring secondary to acidosis (Nocek, Reference Nocek1997; Galyean and Rivera, Reference Galyean and Rivera2003; Nagaraja and Lechtenberg, Reference Nagaraja and Lechtenberg2007a). Acidosis is the result of excessive acid production in the rumen (Owens et al., Reference Owens, Secrist, Hill and Gill1998; Penner et al., Reference Penner, Steele, Aschenbach and Mcbride2011) and can be defined as either acute (clinical) acidosis, when the pH is <5.0, or sub-acute ruminal acidosis (SARA), when ruminal pH < 5.8 for more than 12 h day−1 (Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Beauchemin, Gibb, Crews, Hickman, Streeter and Mcallister2003). The welfare issues related to acute acidosis include observable illness and mortality, while SARA is associated with variable feed intake and reduced weight gain, rumenitis (lactate-induced thickening of the stratum corneum of the rumen mucosa causing mucosal lesions; Owens et al., Reference Owens, Secrist, Hill and Gill1998), liver abscesses (Nagaraja and Lechtenberg, Reference Nagaraja and Lechtenberg2007b), and laminitis (Cook et al., Reference Cook, Nordlund and Oetzel2004; Nordlund et al., Reference Nordlund, Cook and Oetzel2004). At this time, the prevalence of acidosis in feedlots is not known; however, the incidence of liver abscesses has been reported to range between 12 and 32% (Nagaraja and Lechtenberg, Reference Nagaraja and Lechtenberg2007a) and can be as high as 56% (Fox et al., Reference Fox, Thomson, Lindberg and Barling2009). Beyond the scope of the problem, little is known about its severity, for example, whether conditions such as SARA are painful.
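As a minimal sketch, the pH thresholds cited above (acute acidosis at ruminal pH < 5.0; SARA at pH < 5.8 for more than 12 h per day) could be applied to logged ruminal pH data, for example from indwelling pH boluses. The sampling interval, function name and example values below are illustrative assumptions rather than part of any cited protocol.

```python
from typing import List

# Thresholds taken from the definitions cited above (Schwartzkopf-Genswein et al., 2003);
# the sampling interval and function name are illustrative assumptions.
ACUTE_PH = 5.0           # clinical (acute) acidosis: ruminal pH below 5.0
SARA_PH = 5.8            # sub-acute ruminal acidosis threshold
SARA_HOURS_PER_DAY = 12  # SARA: pH < 5.8 for more than 12 h within a day


def classify_daily_acidosis(ph_readings: List[float], minutes_per_reading: int = 15) -> str:
    """Classify one animal-day of ruminal pH readings as 'acute', 'SARA' or 'normal'."""
    if min(ph_readings) < ACUTE_PH:
        return "acute"
    hours_below = sum(ph < SARA_PH for ph in ph_readings) * minutes_per_reading / 60.0
    if hours_below > SARA_HOURS_PER_DAY:
        return "SARA"
    return "normal"


# Example: 96 readings at 15-min intervals (one day), with pH just under 5.8 for ~14 h
example_day = [5.7] * 56 + [6.2] * 40
print(classify_daily_acidosis(example_day))  # -> 'SARA'
```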
Factors contributing to acidosis include the length of time to adapt to high concentrate feeding (Beaver and Olson, Reference Beaver and Olson1997; Owens et al., Reference Owens, Secrist, Hill and Gill1998; Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Beauchemin, Gibb, Crews, Hickman, Streeter and Mcallister2003; Nagaraja and Titgemeyer, Reference Nagaraja and Titgemeyer2007), the amount and fermentability of concentrate and fiber in the diet (Campbell et al., Reference Campbell, Marshall, Mandell and Wilton1992; Yang and Beauchemin, Reference Yang and Beauchemin2006, Reference Yang and Beauchemin2007), feed intake (Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Beauchemin, Gibb, Crews, Hickman, Streeter and Mcallister2003), feeding behavior and competition (Gibb et al., Reference Gibb, Mcallister, Huisma and Wiedmeier1998; Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Beauchemin, Gibb, Crews, Hickman, Streeter and Mcallister2003, Reference Schwartzkopf-Genswein, Hickman, Shah, Krehbiel, Genswein, Silasi, Gibb, Crews and Mcallister2011; González et al., Reference González, Manteca, Calsamiglia, Schwartzkopf-Genswein and Ferret2012a) and an individual's ability to cope with high acid production (Goad et al., Reference Goad, Goad and Nagaraja1998). Other risk factors include an animal's capacity for fermentation, acid absorption, epithelial molecular level adaptation and epithelial proliferation (Penner et al., Reference Penner, Steele, Aschenbach and Mcbride2011).
Strategies to reduce the incidence of acute acidosis and SARA include feeding diets containing adequate amounts of fiber (Nagaraja and Lechtenberg, Reference Nagaraja and Lechtenberg2007a) introduced gradually (Vasconcelos and Galyean, Reference Vasconcelos and Galyean2007; Aschenbach et al., Reference Aschenbach, Penner, Stumpff and Gäbel2011; Penner et al., Reference Penner, Steele, Aschenbach and Mcbride2011) or feeding ionophores, which are believed to alter ruminal fermentation and feeding behavior (Nagaraja and Lechtenberg, Reference Nagaraja and Lechtenberg2007a; González et al., Reference González, Correa, Ferret, Manteca, Ruíz-De-la-torre and Calsamiglia2009, Reference González, Manteca, Calsamiglia, Schwartzkopf-Genswein and Ferret2012a). Although buffers such as sodium bicarbonate, seaweed, yeast cultures, and bacteria have been used to mitigate acidosis and SARA, their effects have not been consistent (Enemark, Reference Enemark2008). Several studies provide evidence that feeding antibiotics decreases the incidence of liver abscesses (Nagaraja and Lechtenberg, Reference Nagaraja and Lechtenberg2007b; Wileman et al., Reference Wileman, Thomson, Reinhardt and Renter2009) and increases weight gain (Nagaraja and Lechtenberg, Reference Nagaraja and Lechtenberg2007b). However, broadly feeding antimicrobials to groups of cattle has come under scrutiny (US Food and Drug Administration, 2012) and may not be viable in the future. In addition, despite the knowledge about specific strategies for reducing SARA, little is known about the effectiveness of combinations of practices; thus, epidemiological studies are needed to identify protective and risk factors.
In addition to the health concerns associated with feeding high-concentrate diets, other aspects of welfare, such as the motivation to consume forage and to ruminate, may also be affected; however, little work has addressed this issue in beef cattle. In a single study, beef cattle preferred to spend time at pasture, in particular for lying, but preferred the feedlot during feeding times (Lee et al., Reference Lee, Fisher, Colditz, Lea and Ferguson2013). It is unclear what drives these results, as the microclimate, the comfort of the lying surface and the type of feed available all differ between environments; such work does not provide direct evidence about the motivation for forage. No work has identified how much forage, or what particle size/length, is required to stimulate rumination. Geophagia, or pica, has been observed in cattle fed high concentrate diets and may indicate that beef cattle are motivated to consume forage and/or ruminate, but no studies to date have quantified this response.
Future research
Epidemiological analysis is needed to understand the prevalence of acidosis and the risk factors (e.g. diets resulting in least digestive disturbance and best performance, optimal diet transition strategies and relationships between animal feeding behavior, ruminal physiology, metabolism and genetics) that contribute to this problem in commercial feedlots. Further work is needed in order to understand the non-health implications of high concentrate feeding, from whether acidosis and/or sub-clinical acidosis are painful for cattle (e.g. the extent of the welfare concern) to a better understanding of the motivation to consume forage and to ruminate. Finally, the economic implications of high concentrate feeding (e.g. mortality, morbidity, growth efficiency and carcass quality) require more attention.
Body condition score (BCS)
BCS is currently assessed in the Beef Quality Assurance program (e.g. National Cattlemen's Beef Association, 2012) as an indirect evaluation of adequate nutrition or disease. Biological implications of low BCS include lower reproductive performance in cows with BCS below 4.5 (out of 9) and in those that have lost BCS between calving and breeding (Spitzer et al., Reference Spitzer, Morrison, Wettemann and Faulkner1995; Kasimanickam et al., Reference Kasimanickam, Asay, Firth, Whittier and Hall2012; Soca et al., Reference Soca, Carriquiry, Keisler, Claramunt, Do Carmo, Olivera-Muzante, Rodríguez and Meikle2013), unless they had been reared on extensive rangeland (Mulliniks et al., Reference Mulliniks, Cox, Kemp, Endecott, Waterman, Vanleeuwen and Petersen2012). Indeed, while there is a long history of using such scoring systems in beef cattle and relating them to fertility and the number of days to estrus postpartum, relatively little research has evaluated the other, non-health-related implications of body condition, such as hunger. In addition, retailer and industry welfare assessment schemes include BCS as an outcome measure, but face the challenge of determining a threshold of acceptability. Canada's Code of Practice for the Care and Handling of Beef Cattle (National Farm Animal Care Council, 2013) requires prompt corrective action to improve the body condition of cattle with a score of 2 or less (out of 5). The scientific basis for such cut-offs is unclear.
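As an illustration of how such cut-offs might be operationalized in an assessment scheme, the sketch below flags scores against the two thresholds mentioned above (2 or less on Canada's 5-point scale; below 4.5 on the 9-point scale linked to reproductive performance). The function and example scores are hypothetical, and, as noted, the scientific basis for these cut-offs remains unclear.

```python
# Illustrative flagging of body condition scores against the cut-offs cited above.
# The cut-off values come from the text (Canadian Code of Practice: <= 2 of 5;
# reproductive risk: < 4.5 of 9); everything else here is a hypothetical sketch.

def flag_bcs(score: float, scale: int = 9) -> str:
    """Return an action flag for a single body condition score."""
    if scale == 5:
        return "corrective action" if score <= 2.0 else "acceptable"
    if scale == 9:
        return "at-risk (reproduction)" if score < 4.5 else "acceptable"
    raise ValueError("expected a 5- or 9-point scale")


herd_scores = [5.5, 4.0, 6.0, 3.5]  # hypothetical scores on the 9-point scale
flags = {score: flag_bcs(score, scale=9) for score in herd_scores}
print(flags)  # {5.5: 'acceptable', 4.0: 'at-risk (reproduction)', ...}
```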
Future research
If BCSs are used as an outcome measure in welfare assurance schemes, additional research is needed to understand when BCS reaches an unacceptable threshold.
Production-related technologies
Several technologies are commonly used for either health or management reasons in beef cattle (% of US feedlot cattle affected): antibiotics (48%) are fed to both prevent and treat illness and to improve weight gain (for an overview of label-approved uses see Gadberry, Reference Gadberry2011), ionophores (90%) are used to reduce SARA by altering ruminal fermentation and feeding behavior (Nagaraja and Lechtenberg, Reference Nagaraja and Lechtenberg2007a; González et al., Reference González, Correa, Ferret, Manteca, Ruíz-De-la-torre and Calsamiglia2009, Reference González, Manteca, Calsamiglia, Schwartzkopf-Genswein and Ferret2012a), and ß-adrenergic agonists (57%; USDA, 2011b) are fed in the latter stages of the finishing period to increase average daily gain, improve feed efficiency, and maximize lean muscle growth (e.g. Lean et al., Reference Lean, Thompson and Dunshea2014). Hormonal treatments are also used. Melengestrol acetate (MGA) is fed to 85% of heifers to both improve feed efficiency and suppress estrus in feedlots (Gadberry, Reference Gadberry2011; USDA, 2011b) and is discussed further in the section on painful procedures. Hormonal implants are used to increase average daily gain and improve feed efficiency in 84% of cattle (USDA, 2011b; Stewart, Reference Stewart2013).
Research has primarily focused on the performance benefits of these technologies, while the effects on animal welfare have received some, but relatively less, attention. Using implants in summer, when faster-growing animals might be more susceptible to heat, resulted in no change (Kreikemeier and Mader, Reference Kreikemeier and Mader2004; Mader and Kreikemeier, Reference Mader and Kreikemeier2006; Mader et al., Reference Mader, Gaughan, Kreikemeier and Parkhurst2008) or an increase (Gaughan et al., Reference Gaughan, Kreikemeier and Mader2005) in physiological indicators of heat stress, such as panting score or body temperature, compared with animals without implants. Use of implants is thought to either increase (oestrogens) or decrease (androgens) buller steer syndrome, an animal welfare concern (Blackshaw et al., Reference Blackshaw, Blackshaw and Mcglone1997). Implants either have no effect (Baker and Gonyou, Reference Baker and Gonyou1986; Newman et al., Reference Newman, Tong, Doornenbal, Tennessen, Coulter and Mears1990; Godfrey et al., Reference Godfrey, Lunstra and Schanbacher1992) or decrease aggression and/or sexual behavior (O'Lamhna and Roche, Reference O'lamhna and Roche1983; Unruh et al., Reference Unruh, Gray and Dikeman1986; Vanderwert et al., Reference Vanderwert, Berger, Mckeith, Baker, Gonyou and Bechtel1985; Hawkins et al., Reference Hawkins, Field, Orme, Dyer and Lunt2005). Only two studies found that implants (Synovex-S, progesterone and estradiol benzoate; Revalor-S, trenbolone acetate and estradiol) increased mounting and agonistic behavior, respectively (Lesmeister and Ellington, Reference Lesmeister and Ellington1977; Stackhouse-Lawson et al., Reference Stackhouse-Lawson, Tucker, Calvo-Lorenzo and Mitloehner2015). Unless stated otherwise, the majority of these studies examined a specific estrogenic implant, Zeranol. These results likely depend on the type of implant used, the age and sex of the animal implanted, and the dose, as well as aspects of experimental design, such as sample size.
Until recently, ß-adrenergic agonists have not been well studied from an animal welfare perspective. β-adrenergic agonists significantly increase the likelihood of death in feedlot cattle (Loneragan et al., Reference Loneragan, Thomson and Scott2014), but the mechanism is unknown. Zilpaterol, a ß-adrenergic agonist, has been associated with increased skin temperature in sheep (Macías-Cruz et al., Reference Macías-Cruz, Álvarez-Valenzuela, Torrentera-Olivera, Velázquez-Morales, Correa-Calderón, Robinson and Avendaño-Reyes2010) and Baszczak et al. (Reference Baszczak, Grandin, Gruber, Engle, Platter, Laudert, Schroeder and Tatum2006; 200 mg/steer per day) reported a slight increase in speed when cattle entered the squeeze chute. Zilpaterol also results in more lateral lying and, in combination with hormonal implants, agonistic behavior in feedlot steers, compared with untreated controls (Stackhouse-Lawson et al., Reference Stackhouse-Lawson, Tucker, Calvo-Lorenzo and Mitloehner2015). It also decreases dry matter (DM) intake (Reinhardt et al., Reference Reinhardt, Vahl, Depenbusch, Hutcheson and Thomson2014) and heart rate (Frese et al., submitted). Anecdotally, both hoof and animal handling problems with cattle have been reported (Thomson et al., Reference Thomson, Loneragan, Henningson, Ensley and Bawa2015) and in 2013, Merck removed zilpaterol from the market (Centner et al., Reference Centner, Alvey and Stelzleni2014).
Future research
Research is needed to understand the more nuanced effects of both hormones (implanted and fed) and ß-adrenergic agonists on beef cattle welfare.
Health
Cattle are affected by a number of diseases. Rather than review all literature in these areas, this review focuses on two problems, Bovine Respiratory Disease (BRD) and lameness, as examples of health issues that are relatively well and less understood, respectively.
BRD is estimated to cause 67% of total mortalities in US feedlots (Loneragan and Brashears, Reference Loneragan and Brashears2005) and, as such, is the most costly disease for the beef industry (Smith, Reference Smith1998). BRD is associated with treatment costs and death loss (Galyean et al., Reference Galyean, Perino and Duff1999; Irsik et al., Reference Irsik, Langemeier, Schroeder, Spire and Roder2006) and decreased feedlot performance and carcass qualities (Smith, Reference Smith1998; Larson, Reference Larson2005). BRD research can be divided into two major categories: prevention and detection/treatment.
Many of the risk factors for BRD are known. Cattle at the highest risk for morbidity and mortality are of light body weight, commingled, immunologically naïve, hauled long distances from the farm of origin, and not used to being fed or watered from bunks or troughs (Duff and Galyean, Reference Duff and Galyean2007). Currently, nearly half (50%) of producers wean and ship calves the same day (USDA, 2007b). In contrast, preconditioning, a protocol that includes vaccination, starting calves on feed and avoiding immediate transportation after weaning, among other practices, offers a consistent health advantage to the calf (Peterson et al., Reference Peterson, Strohbehn, Ladd and Willham1989b; Schipper et al., Reference Schipper, Church and Harris1989), but there is not always a financial advantage to the producer (Peterson et al., Reference Peterson, Strohbehn, Ladd and Willham1989a). The majority of producers (61%) never vaccinate calves for respiratory diseases before sale and only 34% hold calves for more than 31 days after weaning and before sale (USDA, 2007b). Over 80% of large feedlot operators recognize the value of these practices, but only 35% always have access to this information when calves arrive at their feedlot (USDA, 2011b). Indeed, for some feedlot operators, there can be financial advantages associated with finishing these high-risk calves.
Detection of BRD and other illnesses is more challenging, particularly in extensive environments or in large feedlot groups. The specificity and sensitivity of BRD detection by pen riders, compared against post-mortem lung lesions, are relatively poor (approximately 60%; White and Renter, Reference White and Renter2009), and this may result in sick, but untreated, animals. Using aspects of the generalized sickness response, such as changes in behavior, is a promising way to improve detection (e.g. Weary et al., Reference Weary, Huzzey and Von Keyserlingk2009). For example, previous research has demonstrated that GrowSafe®, a tool used to automatically record feeding behavior, can be used to detect BRD earlier than observation by feedlot personnel (Sowell et al., Reference Sowell, Branine, Bowman, Hubbert, Sherwood and Quimby1999; Quimby et al., Reference Quimby, Sowell, Bowman, Branine, Hubbert and Sherwood2001). These tools may also aid evaluation of treatment outcomes, particularly if they become cost-effective; little work has addressed these ideas.
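As a simplified illustration of this general approach (and not the algorithm used by GrowSafe® or in the cited studies), an animal could be flagged for a health check when its daily feeding time drops well below its own recent baseline. The 7-day baseline window and 60% threshold below are assumptions made for the sketch.

```python
from statistics import mean
from typing import List

# Hypothetical rule: flag an animal when today's feeding duration falls below a
# fraction of its own recent baseline. The 7-day window and 60% threshold are
# illustrative assumptions, not values from the cited studies.
BASELINE_DAYS = 7
DROP_THRESHOLD = 0.60


def flag_for_health_check(minutes_at_bunk: List[float]) -> bool:
    """Return True if the latest day's feeding time warrants a closer look."""
    if len(minutes_at_bunk) < BASELINE_DAYS + 1:
        return False  # not enough history to establish a baseline
    baseline = mean(minutes_at_bunk[-(BASELINE_DAYS + 1):-1])
    return minutes_at_bunk[-1] < DROP_THRESHOLD * baseline


# Example: steady ~90 min/day at the bunk, then a sharp drop to 40 min
history = [92, 88, 95, 90, 87, 93, 91, 40]
print(flag_for_health_check(history))  # -> True
```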
In contrast to BRD, other health problems have received considerably less attention within the published literature. For example, the causes, prevention, detection and treatment of lameness in beef cattle are not well understood, nor has the national prevalence been documented. Veterinarians and feedlot managers estimate that lameness causes 10% of mortality in dirt, open-air feedlots and suggest that the primary causes are foot rot, injury and toe abscesses (Terrell et al., Reference Terrell, Thomson, Stackhouse-Lawson, Apley, Larson and Reinhardt2013). Improving understanding of lameness is important for several reasons. Firstly, it is known to be painful, with the degree and severity differing among causes (e.g. Whay et al., Reference Whay, Waterman, Webster and O'brien1998). Secondly, lameness is costly. Involuntary culling of lame cattle continues to be an important reason for losses in both the dairy and beef industries (USDA, 2010, 2011a). Finally, evaluation of the prevalence of lame cattle is one of the primary factors in third-party welfare audit programs, including the National Dairy Farmers Assuring Responsible Management (FARM) Program, Validus, the New York State Cattle Health Assurance Program (NYSCHAP), and others (NYS Department of Agriculture and Markets, 2002; National Milk Producers Federation, 2012; Validus Ventures LLC, 2012). It is possible that similar parameters could be included in beef evaluation schemes. An understanding of the risk factors, prevalence and treatment of specific causes of lameness will improve the value of these audits.
Future research
For diseases that are relatively well understood, additional work is required to improve uptake of preventative management strategies and validate automated ways to improve cost-effective disease detection. For diseases where less is known, information is required about prevalence, risk factors and evaluation of treatment effectiveness (e.g. lameness, SARA, as described in the nutrition section). Epidemiological studies at the feedlot or slaughter level would be useful starting points for these types of health concerns.
Painful procedures
Beef cattle undergo a number of painful procedures, including castration, dehorning and branding. Other procedures that are less prevalent in beef cattle include spaying heifers and other forms of identification, such as wattling (cutting the dewlap) and ear notching. The focus of this review is on the most prevalent management procedures that are associated with pain in cattle, including options for reducing that pain, such as anesthetics, non-steroidal anti-inflammatory drugs (NSAIDs), opioids, α2-agonists, and N-methyl-D-aspartate receptor antagonists (Thurmon et al., Reference Thurmon, Tranquilli and Benson1996).
Castration
Castration involves either surgical removal of the testes or methods that irreparably damage the testicles by disruption of the blood supply using a rubber ring or latex band (American Veterinary Medical Association or AVMA, 2009) or a castration clamp (Burdizzo castration). Surgical and band castration are the most common methods in the USA at 49 and 47%, respectively (USDA, 2007a). Of those operations that castrate before sale (41%), most (75%) performed the procedure at an average age of less than 93 days (USDA, 2007a). All castration methods have been demonstrated to produce physiological, neuroendocrine, and behavioral changes indicative of pain (Fisher et al., Reference Fisher, Crowe, De La Varga and Enright1996; Stafford and Mellor, Reference Stafford and Mellor2005b; Pang et al., Reference Pang, Earley, Sweeney and Crowe2006, Reference Pang, Earley, Gath and Crowe2008; Coetzee et al., Reference Coetzee, Lubbers, Toeber, Gehring, Thomson, White and Apley2008; Stilwell et al., Reference Stilwell, Lima and Broom2008; White et al., Reference White, Coetzee, Renter, Babcock, Thomson and Andresen2008; Currah et al., Reference Currah, Hendrick and Stookey2009; González et al., Reference González, Schwartzkopf-Genswein, Caulkett, Janzen, Mcallister, Fierheller, Schaefer, Haley, Stookey and Hendrick2010). The combination of local anesthesia and an NSAID achieved the greatest reduction of pain within this literature, suggesting that a multimodal analgesic approach is more effective in mitigating pain associated with castration than use of a single analgesic agent.
To date, most work on the benefits of analgesia has focused on the short-term responses. Recently, additional benefits, such as lower incidence of BRD, have also been found when NSAIDs were given to bulls castrated upon entering a feedlot (Coetzee et al., Reference Coetzee, Edwards, Mosher, Bello, O'connor, Wang, Kukanich and Blasi2012a). Given that castration wounds take between 10 days and 9 weeks to heal (Molony et al., Reference Molony, Kent and Robertson1995; Fisher et al., Reference Fisher, Knight, Cosgrove, Death, Anderson, Duganzich and Matthews2001; Stafford et al., Reference Stafford, Mellor, Todd, Bruce and Ward2002), these longer-term effects (and benefits of pain relief) deserve more attention.
Dehorning/disbudding
Disbudding is a method of preventing horn growth in calves when buds are 5–10 mm long and the blood supply can be cauterized or the tissue of the entire area damaged (Stafford and Mellor, Reference Stafford and Mellor2005a). Once horns grow longer, they become attached to the skull and must be removed by amputation (dehorning). There are three primary methods to remove or prevent horn growth in cattle: (1) amputation using scoop dehorners, gouges, saws and Gigli wire; (2) cautery using an electric, gas or battery-powered hot iron and (3) chemical application of caustic paste, usually consisting of a strong alkaline agent such as sodium hydroxide or calcium hydroxide; use of these methods varies between regions (USDA, 2007b). Regardless of the method, this procedure is painful (reviewed by Stafford and Mellor, Reference Stafford and Mellor2011; Stock et al., Reference Stock, Baldridge, Griffin and Coetzee2013).
The use of local anesthetics, systemic anti-inflammatories, and sedatives with analgesic properties following dehorning/disbudding have been investigated in several studies using behavioral, physiological, and neuroendocrine biomarkers as endpoints (reviewed in Stafford and Mellor, Reference Stafford and Mellor2011). As with castration, recent reviews of this procedure suggest that a multimodal approach, using a combination of anesthetic and analgesic compounds with different mechanisms of action, provides optimal pain relief in cattle following dehorning/disbudding. Local anesthetics and sedatives-analgesics aid in the attenuation of the acute response, and NSAIDs mitigate the longer term pain associated with inflammation (Coetzee, Reference Coetzee2011; Coetzee et al., Reference Coetzee, Mosher, Kukanich, Gehring, Robert, Reinbold and White2012b; Theurer et al., Reference Theurer, White, Coetzee, Edwards, Mosher and Cull2012). Most studies that assess stress and nociception in dehorned cattle examine the acute response; however, very few studies have evaluated chronic pain or distress responses following dehorning/disbudding.
Branding
Brands are a permanent thermal injury to the skin or hair follicles (AVMA, 2012b). Hot-iron branding induces scarring, resulting in permanent alopecia. Freeze branding kills the pigment in the hair follicles, resulting in the regrowth of depigmented hair (AVMA, 2012b).
Pain responses during hot-iron branding include tail flicking, kicking, and falling down (Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Stookey, Crowe and Genswein1998), escape attempts (Lay et al., Reference Lay, Friend, Randel, Bowers, Grissom and Jenkins1992c), and vocalization (Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Stookey and Welford1997; Watts and Stookey, Reference Watts and Stookey2000). Similarly, freeze branding is associated with increases in plasma cortisol concentrations (Lay et al., Reference Lay, Friend, Bowers, Grissom and Jenkins1992a) and more tail-flicking compared with sham-branded animals (Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Stookey and Welford1997). The magnitude and duration of the pain response is less in freeze compared with hot-iron branding (Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Stookey and Welford1997, Reference Schwartzkopf-Genswein, Stookey, Crowe and Genswein1998). Hot-iron brand wounds take at least 8 weeks to heal and remain sensitive to palpation during this time (Tucker et al., Reference Tucker, Mintline, Banuelos, Walker, Hoar, Drake and Weary2014a, Reference Tucker, Mintline, Banuelos, Walker, Hoar, Varga, Drake and Wearyb). Use of NSAIDs or a cooling gel at the time of the procedure does not have an effect (Tucker et al., Reference Tucker, Mintline, Banuelos, Walker, Hoar, Drake and Weary2014a, Reference Tucker, Mintline, Banuelos, Walker, Hoar, Varga, Drake and Wearyb). Studies that specifically evaluate interventions, such as topical anesthesia, at the time of branding are needed.
Age
The age at which these procedures are performed is often thought to play a role in the pain experienced. However, comparisons between ages are difficult because the behavioral repertoire of the animal also changes with age, and creates an inherent confound. Thus, the clearest arguments for performing painful procedures at a younger age are about minimizing the invasiveness or the amount of tissue damage. For example, the AVMA recommends disbudding before the horns attach to the skull (around 8 weeks of age; AVMA, 2012a), as this procedure is less invasive than dehorning and involves a smaller wound. Regardless of the age the procedure is performed, effective analgesia is still needed. In addition, further work is needed to understand the effects of performing multiple painful procedures at the same time. Many experiments focus on a single procedure in order to control other factors, but in reality, these management practices occur at the same time.
Challenges in providing analgesia
There are several challenges associated with providing effective analgesia in the USA (reviewed in Coetzee, Reference Coetzee2013). Firstly, there are currently no analgesic drugs specifically approved for the alleviation of pain in livestock (North American Compendiums, 2010). Indeed, lidocaine and flunixin meglumine are the only compounds with analgesic properties that are approved by the US Food and Drug Administration (FDA) for use in cattle. Alternatives (and even these approved drugs) used for the purpose of pain relief constitute extra-label drug use (Smith et al., Reference Smith, Davis, Tell, Webb and Riviere2008). Under the Animal Medicinal Drug Use Clarification Act (AMDUCA) (1994), extra-label use is permitted for relief of suffering in cattle provided specific conditions are met (Coetzee, Reference Coetzee2013). A second challenge to providing effective analgesia in cattle is that there is often a delay between the time of drug administration and the onset of analgesic activity. For example, local anesthetics require 2–5 min before a maximal effect is achieved (Lemke and Dawson, Reference Lemke and Dawson2000; Spoormakers et al., Reference Spoormakers, Donker and Ensink2004). This may slow animal processing. A third challenge is the route or method of analgesic drug administration. For example, intravenous administration, the only approved route for the NSAID flunixin meglumine in the USA (Smith et al., Reference Smith, Davis, Tell, Webb and Riviere2008), requires adequate animal restraint and a trained operator. Similar issues are encountered with epidural analgesic drug administration and use of local anesthetics in the scrotum; the latter procedure is also considered hazardous by many livestock handlers. The fourth challenge is that the majority of analgesic drugs available in the USA have a short elimination half-life, necessitating frequent administration in order to be effective (Smith et al., Reference Smith, Davis, Tell, Webb and Riviere2008). For all of these reasons, less than 20% of US veterinarians currently report using analgesia routinely at the time of castration (Coetzee et al., Reference Coetzee, Nutsch, Barbur and Bradburn2010).
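A small worked example illustrates why a short elimination half-life necessitates frequent administration. It assumes simple first-order (exponential) elimination, and the half-life and concentrations used are made-up values for illustration, not drug label figures.

```python
import math

# First-order elimination: C(t) = C0 * 0.5 ** (t / t_half).
# The concentrations and half-life below are purely illustrative, not label values.

def time_above_threshold(c0: float, c_min: float, t_half_h: float) -> float:
    """Hours until plasma concentration decays from c0 to the minimum effective level."""
    return t_half_h * math.log(c0 / c_min, 2)


# A hypothetical analgesic with a 4-h half-life, dosed to 8x its minimum effective
# concentration, stays above that level for only ~12 h, implying at least
# twice-daily re-dosing to maintain analgesia.
print(round(time_above_threshold(c0=8.0, c_min=1.0, t_half_h=4.0), 1))  # -> 12.0
```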
Studies examining the health and performance effects of newer drugs with extended duration of activity are also needed. Regulatory approval of safe, cost effective, and convenient analgesic compounds will support the implementation of practical pain management strategies as a part of standard industry practice at the time of castration, dehorning/disbudding and/or branding.
Alternatives to painful procedures
Eliminating the need for a given procedure is the most straightforward way to manage pain. For example, using polled genetics has successfully reduced the number of animals undergoing dehorning; this approach has resulted in a 58% reduction in beef calves born with horns from 1992 to 2007 (USDA, 2007b). There are also alternatives for both spaying heifers and castration of bulls.
Heifers may undergo bilateral ovariectomy or ‘spaying’ to prevent pregnancy and, in feedlots, the mounting and higher activity levels associated with estrus (AVMA, 2011). Evidence suggests that both flank and vaginal approaches cause pain (Petherick et al., Reference Petherick, Mccosker, Mayer, Letchford and Mcgowan2011, Reference Petherick, Mccosker, Mayer, Letchford and Mcgowan2013) that may be long-lasting. Indeed, spaying results in reduced weight gain for at least 6 weeks compared with unspayed controls (McCosker et al., Reference Mccosker, Letchford, Petherick, Meyer and Mcgowan2010). Rather than surgically spaying females, orally administered estrus-suppressing compounds (MGA) are used with 85% of feedlot heifers (USDA, 2011b) to achieve the same goals.
Chemical castration is also an alternative to physical methods. Gonadotropin-releasing hormone vaccines induce neutralizing antibodies, resulting in immunocastration in bulls characterized by suppression of luteinizing hormone and testosterone (Amatayakul-Chantler et al., Reference Amatayakul-Chantler, Jackson, Stegner, King, Rubio, Howard, Lopez and Walker2012). When used with anabolic implants, immunocastration increased bodyweight, average daily gain, and hot carcass weight (Amatayakul-Chantler et al., Reference Amatayakul-Chantler, Jackson, Stegner, King, Rubio, Howard, Lopez and Walker2012). One challenge with immunocastration is that two injections, 42 days apart, are required and this may be impractical under extensive range conditions. Lack of approval by the USDA limits current use in the USA.
Future research
If producers and veterinarians are expected to adopt interventions to manage pain during processing, FDA-approved compounds are needed. Furthermore, studies investigating the potential health and performance benefits are critical to determine if the cost of analgesia can be offset by production benefits in beef systems. In addition, most studies are confined to investigating the effect of either disbudding/dehorning or castration or branding separately when these procedures are typically conducted at the same time and may have a cumulative effect on the calf. Finally, research investigating pain management for branding is needed.
Environmental and housing conditions
Mud
According to the National Beef Quality Audit in 2005, only 26% of beef cattle had no mud or manure on their bodies at the time of slaughter (Garcia et al., Reference Garcia, Nicholson, Hoffman, Lawrence, Hale, Griffin, Savell, Vanoverbeke, Morgan, Belk, Field, Scanga, Tatum and Smith2008). These findings indicate that at least some beef cattle are exposed to environments with mud or manure before processing. In addition, cattle kept in open lots have dirtier hides than those reared indoors (Honeyman et al., Reference Honeyman, Busby, Lonergan, Johnson, Maxwell, Harmon and Shouse2010). However, there is little evidence documenting environmental conditions (e.g. depth of mud, % of days with mud or rain present, average temperature) during rearing for either cow-calf operations or feedlots.
There is some evidence that muddy conditions affect performance. Mud reduces growth rate in beef steers (Morrison et al., Reference Morrison, Givens, Garrett and Bond1970), and decreased productivity may be mediated by additional energy requirements associated with thermoregulation in wet environments (Degen and Young, Reference Degen and Young1993) and walking in mud (Dijkman and Lawrence, Reference Dijkman and Lawrence1997). Finally, mud is thought to increase health problems such as lameness (see research needs in health section of this paper), as exposure to moisture can weaken the integrity of the hoof (Borderas et al., Reference Borderas, Pawluczuk, De Passillé and Rushen2004).
Wet lying areas are also undesirable. When given a choice, dairy cattle clearly avoid wet sawdust in freestalls (86 vs. 27% DM, Fregonesi et al., Reference Fregonesi, Veira, Von Keyserlingk and Weary2007) and will spend less time lying down on wet surfaces (Fisher et al., Reference Fisher, Stewart, Verkerk, Morrow and Matthews2003; Fregonesi et al., Reference Fregonesi, Veira, Von Keyserlingk and Weary2007; Tucker et al., Reference Tucker, Rogers, Verkerk, Kendall, Webster and Matthews2007; Reich et al., Reference Reich, Weary, Veira and Von Keyserlingk2010; Schütz et al., Reference Schütz, Clark, Cox, Matthews and Tucker2010). Beef cattle also spend less time lying in cold conditions, particularly when housed outside (Gonyou et al., Reference Gonyou, Christopherson and Young1979; Johnson et al., Reference Johnson, Lonergan, Busby, Shouse, Maxwell, Harmon and Honeyman2011). Exposure to wet conditions and/or deprivation of lying time cause a cascade of physiological responses beginning with cortisol (Fisher et al., Reference Fisher, Verkerk, Morrow and Matthews2002; Tucker et al., Reference Tucker, Rogers, Verkerk, Kendall, Webster and Matthews2007; Webster et al., Reference Webster, Stewart, Rogers and Verkerk2008) but also including lower white blood cell counts, higher non-esterified fatty acid and thyroxine levels and lower body temperature (Tucker et al., Reference Tucker, Rogers, Verkerk, Kendall, Webster and Matthews2007; Webster et al., Reference Webster, Stewart, Rogers and Verkerk2008). In contrast, others have found minimal changes in physiological function with exposure to mud in feedlots (Wilson et al., Reference Wilson, Fell, Colditz and Collins2002). In feedlots, mounds of bedding and/or dirt are often used to provide a dry lying area (61% of feedlots, USDA, 2000), but little is known about appropriate stocking density (m2 of mound/animal), nor the effectiveness of this management strategy. In addition, little is known about other strategies used to manage mud in feedlots including changes in pen scraping, addition of bedding and use of wind breaks. To date, most of the research looking at the responses to mud and wet conditions has been conducted in dairy cattle and rarely encompasses the combination of wet and cold conditions seen in many feedlots in winter.
Cold
Beef cattle are often reared and managed in environments with severe winters. Cattle will use man-made windbreaks (Olson and Wallander, Reference Olson and Wallander2002) and shelters that provide protection from rain (Vandenheede et al., Reference Vandenheede, Nicks, Shehi, Canart, Dufrasne, Biston and Lecomte1995). Windbreaks can result in mixed effects on production and measures of growth or body condition (e.g. Olson et al., Reference Olson, Wallander and Paterson2000), as the energy saved by improved thermoregulation can be offset by reduced time spent feeding. Windbreaks are used in 44% of feedlots (provided to most pens, USDA, 2000), but there is little experimental evidence evaluating how to best provide and manage protection from winter weather in this phase of the production cycle. In addition to the use of windbreaks, cattle also use conspecifics for protection (Graunke et al., Reference Graunke, Schuster and Lidfors2011) and will orient towards the sun in cold winter weather (Gonyou and Stricklin, Reference Gonyou and Stricklin1981). Finally, beef cattle shiver in response to cold conditions; this response becomes less marked when animals have more exposure to winter weather (Gonyou et al., Reference Gonyou, Christopherson and Young1979). Together, these behavioral changes may provide insight into when cattle begin to perceive the adverse effects of winter weather and when protection may be needed.
Future research
Research is needed to evaluate methods used to provide dry lying areas in winter (e.g. mounds, changes in stocking density, addition of bedding, pen scraping, and wind breaks) in terms of both short-term behavioral responses and longer-term health and production outcomes.
Heat
Heat load affects cattle in a number of ways, ranging from changes in behavior, physiology and productivity to, in extreme cases, death. Physiological changes include an increase in respiration rate and eventually panting, an increase in body temperature, and sweating (e.g. Brown-Brandl et al., Reference Brown-Brandl, Nienaber, Eigenberg, Hahn and Freetly2003). Behavioral responses include shade seeking and an increase in time spent standing overall and near water sources (e.g. Widowski, Reference Widowski, Stowell, Bucklin and Bottcher2001). In addition, the duration of exposure and previous acclimation to heat are also thought to shape the response to heat waves; however, much of this evidence is based on experiments in environmentally controlled chambers without solar radiation (Brown-Brandl et al., Reference Brown-Brandl, Eigenberg, Hahn, Nienaber, Mader, Spiers and Parkhurst2005) and retrospective analysis of death events (as reviewed by Nienaber and Hahn, Reference Nienaber and Hahn2007).
Two management strategies are commonly used to reduce heat load in beef cattle: shade provision and cooling with water. Both options are effective (Mitlöhner et al., Reference Mitlöhner, Galyean and Mcglone2002; Kendall et al., Reference Kendall, Verkerk, Webster and Tucker2007; Gaughan et al., Reference Gaughan, Bonner, Loxton, Mader, Lisle and Lawrence2010; Sullivan et al., Reference Sullivan, Cawdell-Smith, Mader and Gaughan2011), but neither option is particularly common in feedlots, with only 14 and 13% of feedlots providing either method of cooling to all or most pens (USDA, 2000). In addition, other factors, such as air flow, may influence heat load, but are not easily documented in surveys of management practices, or in on-farm assessment of animal welfare.
Determining when heat abatement is needed is challenging because of the number of weather parameters involved, the effects of acclimatization, and individual differences in susceptibility (e.g. breeds differ in sensitivity to heat, Brown-Brandl et al., Reference Brown-Brandl, Eigenberg and Nienaber2006; Gaughan et al., Reference Gaughan, Bonner, Loxton, Mader, Lisle and Lawrence2010). Previous attempts to set thresholds for ‘stress’ (e.g. Armstrong, Reference Armstrong1994) draw clear lines about when heat abatement would be beneficial, but the scientific basis for these categorizations is unclear and further complicated by the use of different measures of heat load (ambient temperature, temperature-humidity index, and heat-load index). More recently, there is growing evidence that heat abatement is beneficial in preventing an increase in body temperature or a drop in production, often at lower temperatures or heat indices than previously thought (e.g. Bryant et al., Reference Bryant, Lopez-Villalobos, Pryce, Holmes, Johnson and Garrick2007; Chen et al., Reference Chen, Schutz and Tucker2013). These more recent papers treat heat load as a continuous variable (as opposed to categorical), lending themselves to identification of thresholds based on biological responses. In addition, research is needed to make recommendations about the physical features of heat abatement (e.g. type of shade provided, droplet size of water used, air flow, space at the water trough) as well as the effects of the management of these resources (e.g. m2 of shade provided per animal) and other management decisions (e.g. movement of cattle or other factors that may increase risk). Recently, several groups have begun to address these questions (e.g. Eigenberg et al., Reference Eigenberg, Brown-Brandl and Nienaber2009; Sullivan et al., Reference Sullivan, Cawdell-Smith, Mader and Gaughan2011), but more work is needed to make industry-wide recommendations. These recommendations may play an important role in on-site animal welfare assessments, as indicators of heat stress are weather-dependent (e.g. respiration rate, panting score) and thus difficult to include in single-time-point audits. For example, the European Union cattle welfare assessment tool has no animal-based measure associated with heat stress (Welfare Quality®, 2009), likely because of the difficulty of ensuring that the auditor evaluates the farm in the appropriate season or when weather is a concern.
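For illustration, the sketch below computes one commonly cited temperature-humidity index (THI) formulation and treats heat load as a continuous variable. The choice of formula and any threshold applied to its output are assumptions on our part, since, as discussed, several heat-load measures exist and category cut-offs are debated.

```python
# One commonly cited temperature-humidity index formulation for cattle
# (air temperature in deg C, relative humidity in %). The formula choice and the
# example conditions are illustrative assumptions; other heat-load measures exist.

def thi(temp_c: float, rel_humidity_pct: float) -> float:
    """Return a temperature-humidity index value from ambient conditions."""
    return 0.8 * temp_c + (rel_humidity_pct / 100.0) * (temp_c - 14.4) + 46.4


# Treating heat load as continuous: log THI alongside biological responses
# (respiration rate, panting score) rather than only flagging fixed categories.
for t, rh in [(25, 40), (30, 60), (34, 75)]:
    print(f"{t} C, {rh}% RH -> THI {thi(t, rh):.1f}")
```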
Future research
Research is needed to inform producers when heat abatement is needed from an animal welfare perspective. An improved understanding of how to best provide and manage heat abatement (e.g. type and amount of shade, droplet size, role of air flow, effect of other management such as cattle movement, size, and placement of water troughs) is needed and an epidemiological approach to addressing these questions would improve the industry-wide relevance.
Social interactions
Mixing or commingling of unfamiliar cattle is a common practice in the feedlot industry, where pens are often filled with animals from multiple sources. Fighting among cattle is usually limited to the first day of regrouping, but changes in behavior and avoidance among unfamiliar steers can last up to 5 days (Patison et al., Reference Patison, Swain, Bishop-Hurley, Robins, Pattison and Reid2010). The stress from regrouping beef cattle appears subtle, but in dairy cows it is evident in reduced milk yield (as much as 4% for the first 5 days after mixing; Jezierski and Podlużny, Reference Jezierski and Podlużny1984). In large groups, individual recognition is more difficult and the incidence of aggression increases (Stricklin et al., Reference Stricklin, Graves, Wilson and Singh1980). In feedlots, aggression among unfamiliar cattle has been identified as a contributing factor in dark cutters arising from cattle that are regrouped before slaughter (Warren et al., Reference Warren, Mandell and Bateman2010).
The large group sizes of unfamiliar cattle in feedlots, as well as other contributing factors, may contribute to the emergence of the buller steer syndrome, a problem with an annual incidence between 2 and 4% (Brower and Kiracofe, Reference Brower and Kiracofe1978; Irwin et al., Reference Irwin, Melendy, Amoss and Hutcheson1979). The buller steer syndrome is a behavioral problem characterized by the repeated mounting of a steer (referred to as the buller) by a group of steers (known as the riders) who persistently follow and perform the mounting behavior. The buller steer becomes exhausted from the excessive mounting, often shows loss of hair, swelling and trauma on the rump and tail head and in extreme cases can suffer broken bones. In addition, bullers were 2.5 times more likely to be reclassified as ‘sick’ and 3.2 times more at risk to die than non-buller steers (Taylor et al., Reference Taylor, Booker, Jim and Guichon1997).
Several causative factors of the buller steer syndrome have been suggested, including the use of anabolic agents, improper implantation, re-implantation or double dosing (Irwin et al., Reference Irwin, Melendy, Amoss and Hutcheson1979; Voyles et al., Reference Voyles, Brown, Swingle and Karr2004; Bryant et al., Reference Bryant, Rabe, Bowers, Hutches and Jaragin2008), as well as changes in weather and seasonal factors, excessive mud or dusty pen conditions, entry weights, disease, group size, improper or late castration, feeding management, transportation, handling, mixing and aggressive social dominance behavior (see review Blackshaw et al., Reference Blackshaw, Blackshaw and Mcglone1997). The incidence of bullers tends to increase as the number of animals in the pen increases (Brower and Kiracofe, Reference Brower and Kiracofe1978; Irwin et al., Reference Irwin, Melendy, Amoss and Hutcheson1979); for every 9.3 m2 increase in pen size, the buller rate decreased by 0.05% (Irwin et al., Reference Irwin, Melendy, Amoss and Hutcheson1979). Preventative methods that tend to reduce the incidence of bullers include forming feedlot pens with as few groups as possible, keeping the number of steers per pen below 200 and implanting on arrival (as opposed to delayed implantation); some anecdotal evidence suggests overhead barriers that allow steers to escape mounting may also have some benefit (Blackshaw et al., Reference Blackshaw, Blackshaw and Mcglone1997).
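Read as a simple linear effect, the pen-size relationship reported by Irwin et al. (1979) can be illustrated as below. The 3% baseline buller rate is an assumed value chosen within the 2–4% incidence range cited above, and the slope should not be extrapolated beyond the conditions of the original study.

```python
# Simple linear reading of the relationship reported by Irwin et al. (1979):
# buller rate falls ~0.05 percentage points per 9.3 m2 of additional pen space.
# The 3% baseline is an assumed starting point for illustration only.
SLOPE_PCT_PER_M2 = 0.05 / 9.3


def predicted_buller_rate(baseline_rate_pct: float, extra_space_m2: float) -> float:
    """Extrapolate the reported linear trend from an assumed baseline buller rate."""
    return max(0.0, baseline_rate_pct - SLOPE_PCT_PER_M2 * extra_space_m2)


print(predicted_buller_rate(3.0, 93.0))  # -> 2.5 (% bullers after an extra 93 m2 of pen space)
```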
In general, low space allowance results in slower rates of gain in heifers and bulls (Andersen et al., Reference Andersen, Jensen, Munksgaard and Ingvartsen1997; Mogensen et al., Reference Mogensen, Nielsen, Hindhede, Sørensen and Krohn1997). However, insufficient bunk space has a greater effect on cattle behavior, such as aggression and displacements, than group size (as summarized in Albright and Arave, Reference Albright and Arave1997). Animals that are more competitive (win more aggressive encounters) ultimately gain more time at the feed bunk and have higher DM intake when bunk space is limited (Zobel et al., Reference Zobel, Schwartzkopf-Genswein, Genswein and Von Keyserlingk2011). Though agonistic behavior becomes more common as group size increases and as space decreases, distance between individual cattle remains constant (10–12 m between animals) after space reaches 360 m2 per animal (Kondo et al., Reference Kondo, Sekine, Okubo and Asahida1989). Relatively little is known about best management practices for group size, nor for space allocation of key resources such as food or water in feedlots.
Future research
Research is needed to address several issues associated with social interactions. Science-based recommendations for key resources, such as bunk space and water, are needed. More work is required to understand the health and performance effects as well as the best management strategy for grouping unfamiliar animals. Similarly, the effectiveness of overhead barriers and other protocols for preventing, eliminating and treating buller steer syndrome needs to be evaluated.
Transport
Duration
Transport regulations regarding travel duration and distance for cattle are less stringent in North America than in other countries, including the EU, Australia, and New Zealand. According to the 28-Hour Law (USDA, 1999), American cattle can be in transport for up to 28 h, while the maximum allowable transport duration in Canada, an important player in the North American beef system, is 52 h before cattle must reach their final destination (Canadian Agri-Food Research Council, 2001). A range of studies suggests that the vast majority of cattle transported in North America fall within the maximum regulated times specified by the USA and Canada of either 28 or 52 h, respectively (Warren et al., Reference Warren, Mandell and Bateman2010; Cernicchiaro et al., Reference Cernicchiaro, White, Renter, Babcock, Kelly and Slattery2012a, Reference Cernicchiaro, White, Renter, Babcock, Kelly and Slatteryb; González et al., Reference González, Schwartzkopf-Genswein, Bryan, Silasi and Brown2012b). Less is known about the cumulative transport duration of cattle that are sold through markets or auctions, nor is there enforcement of the 28-Hour Law within the USA.
Transport duration, as opposed to distance, better defines the length of time cattle are confined on a transport vehicle. Transport durations ranging between 2 and 48 h have been shown to result in shrink values between 0 and 8% of body weight (Lofgreen et al., Reference Lofgreen, Dunbar, Addis and Clark1975; Mayes et al., Reference Mayes, Asplund and Anderson1979; Jones et al., Reference Jones, Schaefer, Robertson and Vincent1990; Zavy et al., Reference Zavy, Juniewicz, Phillips and Vontungeln1992; Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Booth-Mclean, Shah, Entz, Bach, Mears, Schaefer, Cook, Church and Mcallister2007, Reference Schwartzkopf-Genswein, Faucitano, Dadgar, Shand, González and Crowe2012; Cernicchiaro et al., Reference Cernicchiaro, White, Renter, Babcock, Kelly and Slattery2012a). González et al. (Reference González, Schwartzkopf-Genswein, Bryan, Silasi and Brown2012c) reported that cattle in transport for longer than 30 h had a greater likelihood of becoming non-ambulatory, lame, or dying. They also documented that shrink increased most rapidly in cattle transported for longer durations at higher ambient temperatures and concluded that transport durations greater than 30 h should be avoided during extreme climatic conditions. Understanding of the effect of transport on unfit animals, such as cull cows or very young calves, is lacking, yet this is likely where the largest welfare issues exist.
Space allowance
Space allowance (loading density) is the space available to an animal when placed into a trailer compartment, calculated either as kg of body weight per m² or as m² per animal. In the USA and Canada, loading densities are not regulated but are based on industry code of practice guidelines, which vary slightly between the two countries (USDA, 1999; Canadian Agri-Food Research Council, 2001). Over- or under-loading cattle on trucks may pose a risk to animal welfare and meat quality (Eldridge et al., Reference Eldridge, Winfield and Cahill1988; Tarrant et al., Reference Tarrant, Kenny and Harrington1988, Reference Tarrant, Kenny, Harrington and Murphy1992).
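To make the two expressions of loading density concrete, the sketch below converts a space allowance in m² per animal into kg of body weight per m², using the example space allowances cited in the next paragraph; the 600 kg body weight is an illustrative assumption, not a regulatory or experimental value.

```python
# Minimal sketch (illustrative values, not regulatory figures): converting a
# space allowance in m2 per animal into a loading density in kg of body
# weight per m2 for a hypothetical 600 kg animal.

def density_kg_per_m2(body_weight_kg: float, space_per_animal_m2: float) -> float:
    """Loading density expressed as kg of body weight per m2 of floor space."""
    return body_weight_kg / space_per_animal_m2

if __name__ == "__main__":
    weight_kg = 600.0  # hypothetical body weight (assumption, not from the text)
    for label, space_m2 in [("high", 0.89), ("medium", 1.16), ("low", 1.39)]:
        print(f"{label:>6}: {space_m2:.2f} m2/animal = "
              f"{density_kg_per_m2(weight_kg, space_m2):.0f} kg/m2")
```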
The negative effects of high loading density (for example, 0.89 m² per animal) on animal welfare and carcass quality, in comparison with medium (e.g. 1.16 m² per animal) and low (e.g. 1.39 m² per animal) densities, are well established. These effects include increased indicators of stress, such as plasma cortisol and glucose concentrations, reduced carcass weight, downgrading of carcasses, a higher incidence of severely bruised cattle, and increased risk of injury and mortality (Eldridge and Winfield, Reference Eldridge and Winfield1988; Tarrant et al., Reference Tarrant, Kenny and Harrington1988, Reference Tarrant, Kenny, Harrington and Murphy1992; Grandin, Reference Grandin2007). Creatine kinase, a muscle enzyme, was found to increase with loading density, suggesting greater muscle damage (Tarrant et al., Reference Tarrant, Kenny and Harrington1988, Reference Tarrant, Kenny, Harrington and Murphy1992). In addition to overall stocking density, location within the truck affects effective space allowance: cattle in the nose, back, and doghouse were reported to have 44, 4, and 60% more space, while cattle in the belly and deck had 8 and 6% less space, than recommended by CARC (2001) and USDA (1997) (González et al., Reference González, Schwartzkopf-Genswein, Bryan, Silasi and Brown2012e). Indeed, cull cows transported in the doghouse have more carcass bruising than those transported elsewhere in the truck (Goldhawk et al., Reference Goldhawk, Janzen, Gonzalez, Crowe, Kastelic, Kehler, Ominski, Pajor and Schwartzkopf-Genswein2015).
Feed and water withdrawal
During transport in the USA, cattle are not provided with feed or water. The majority of weight loss, or shrink, has been attributed to feed and water withdrawal; it can account for 12 to 15% of the animal's live weight and is both a welfare and an economic concern (Grandin, Reference Grandin2007). Evidence of dehydration after transport includes increases in red blood cell count, hemoglobin, total protein, and packed cell volume (Lambooy and Hulsegge, Reference Lambooy and Hulsegge1988; Tarrant et al., Reference Tarrant, Kenny, Harrington and Murphy1992; Warriss et al., Reference Warriss, Brown, Knowles, Kestin, Edwards, Dolan and Phillips1995).
Rest stops where cattle are unloaded can be used to provide cattle with respite from standing, as well as opportunities to eat and drink. In addition, since July 2013, truck drivers have been required to stop for 30 min for every 8 h of driving (US Department of Transportation, 2015), although it seems unlikely that cattle would be unloaded during this type of stop. Little is known about the effect of providing rest stops, with or without unloading, on beef cattle welfare.
Environmental conditions
Environmental conditions can vary greatly within a single journey, between compartments, and between locations within a compartment, depending on the weather, whether the truck is moving, the type of passive ventilation (Mitchell and Kettlewell, Reference Mitchell and Kettlewell2008), and whether measurements are taken at animal height (Goldhawk et al., Reference Goldhawk, Crowe, Gonzalez, Janzen, Kastelic, Pajor and Schwartzkopf-Genswein2014a, Reference Goldhawk, Crowe, Janzen, Gonzalez, Kastelic, Pajor and Schwartzkopf-Gensweinb). Transport vehicles are not climate controlled, so environmental conditions can influence animal welfare and meat quality (Schwartzkopf-Genswein et al., Reference Schwartzkopf-Genswein, Faucitano, Dadgar, Shand, González and Crowe2012). A number of studies have evaluated trailer, compartment, and location temperatures and sought to link these parameters to either ambient weather conditions or management practices (Wikner et al., Reference Wikner, Gebresenbet and Nilsson2003; Goldhawk et al., Reference Goldhawk, Janzen, Gonzalez, Crowe, Pajor, Kastelic and Schwartzkopf-Genswein2011; Stanford et al., Reference Stanford, Bryan, Peters, Gonzalez, Stephens and Schwartzkopf-Genswein2011). In one of the few studies assessing the relationship between environmental conditions and animal welfare, González et al. (Reference González, Schwartzkopf-Genswein, Bryan, Silasi and Brown2012d) found that deaths increased sharply when ambient temperatures fell below −15°C, while the likelihood of becoming non-ambulatory increased when temperatures rose above 30°C. There is also experimental evidence that mitigation at either extreme is beneficial: Warren et al. (Reference Warren, Mandell and Bateman2010) reported that boarding (plastic, fiberglass, or plywood pieces inserted to cover side perforations and alter air flow) reduced the incidence of dark cutters during winter transport. Beyond these few studies, little is known in this area; the relationships between air flow (wind speed and direction), ambient conditions, perforation patterns, and other design features and animal welfare outcomes such as weight loss, morbidity, mortality, and injury from extreme conditions (e.g. frostbite) need further study.
Animal type and temperament
Calves are believed to be more susceptible to the stresses of transport than mature cattle because their naive immune systems and lack of exposure to new environments make them more vulnerable (calves less than 4 weeks of age vs. animals >6 months of age) (Swanson and Morrow-Tesch, Reference Swanson and Morrow-Tesch2001). In addition, calves are more likely to become non-ambulatory and/or die compared with slaughter-weight and feeder cattle (González et al., Reference González, Schwartzkopf-Genswein, Bryan, Silasi and Brown2012d). Cernicchiaro et al. (Reference Cernicchiaro, White, Renter, Babcock, Kelly and Slattery2012a) reported that shrink increased the risk of BRD in lighter compared with heavier weight calves. There are also concerns about appropriate animal handling and management (i.e. use of bedding in the trailers or long transport durations) during transport of old or cull animals because of their relatively low economic value (Grandin, Reference Grandin2001b). González et al. (Reference González, Schwartzkopf-Genswein, Bryan, Silasi and Brown2012c, Reference González, Schwartzkopf-Genswein, Bryan, Silasi and Brownd) found that cull cattle were at the greatest risk of poor welfare during long-haul transport (≥400 km), as they were more likely to become lame at the time of loading and unloading and to be declared non-ambulatory or dead at off-loading compared with calves, feeder, and slaughter-weight cattle. Handling stress due to loading and unloading has been shown to vary with animal temperament (Burdick et al., Reference Burdick, Carroll, Hulbert, Dailey, Willard, Vann, Welsh and Randel2010), handling quality (gentle vs. rough), the experience of the handler and the animal, the animal's condition, and the quality of the handling facilities (Grandin, Reference Grandin2001b). As an example, Lay et al. (Reference Lay, Friend, Grissom, Bowers and Mal1992b, Reference Lay, Friend, Randel, Bowers, Grissom and Jenkinsc) found that pasture-raised beef cattle unaccustomed to handling had higher heart rates and cortisol concentrations, indicative of greater stress, during restraint and handling compared with frequently handled dairy cattle. Recently, several studies have reported that administration of a non-steroidal anti-inflammatory drug prior to long-distance transportation may reduce stress and improve performance on arrival, likely due to a combination of analgesic and anti-inflammatory effects (Guarnieri Filho et al., Reference Guarnieri Filho, Cooke, Cappellozza, Reis, Marques and Bohnert2014; Van Engen et al., Reference Van Engen, Stock, Engelken, Vann, Wulf, Karriker, Busby, Lakritz, Carpenter, Bradford, Hsu, Wang and Coetzee2014). Science-based information on the interrelationships between animal type (age, size, and condition), temperament, and animal and handler experience, as they relate to transport, is scarce.
Future research
Research is required to determine optimal loading densities by animal type and weather, the effect of the transportation durations experienced by cattle sold and resold through auction markets, the effects of rest stops (with and without unloading), trailer design features that control environmental conditions (ventilation, use of bedding, and boarding), and internal ramp and compartment construction. The identification of alternative strategies to mitigate transportation stress is also needed.
Handling
Handling represents an area of considerable improvement in beef cattle welfare. Numerical scoring systems that assess handling in meat plants and in feedlots have been implemented on a wide scale. The scoring system for meat plants is fully described in Grandin (Reference Grandin1998b, Reference Grandin2010, Reference Grandin2012a). For example, baseline data collected in 1996 showed that the percentage of cattle moved with an electric prod ranged from 4% in the best plant to 90% in the worst (Grandin, Reference Grandin1998b). After 4 years of audits by retailers such as McDonald's and Wendy's (as part of the North American Meat Institute audit; Grandin, Reference Grandin2012b), the average electric prod score was reduced to 15% (Grandin, Reference Grandin2005).
Vocalizations provide insight into an animal's perception of aversive events (Watts and Stookey, Reference Watts and Stookey1999) and increase with use of electric prods, excessive pressure from a restraint device, and gates slamming on or falling onto animals (Grandin, Reference Grandin1998a). In addition, cattle that vocalized during restraint or were shocked with electric prods had higher cortisol levels (Dunn, Reference Dunn1990; Hemsworth et al., Reference Hemsworth, Rice, Karlen, Calleja, Barnett, Nash and Coleman2011). Bourguet et al. (Reference Bourguet, Deiss, Tannugi and Terlouw2011) also found that 25% of vocalizations were associated with the use of restraint devices. Vocalization scores during handling have also improved with the auditing process. For example, in 1996 the percentage of cattle vocalizing during handling ranged from 0 to 32%, with a mean of 7.7%, whereas Grandin (Reference Grandin2005) reported a median vocalization percentage of 2% in audits conducted in 1999, 2000, 2001, and 2002.
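As an illustration of how these numerical scoring systems are tallied, the sketch below computes the percentage of animals moved with an electric prod, vocalizing, or falling from simple per-animal observations; the record structure, hypothetical 100-animal sample, and counts are our own illustrative assumptions, not the published North American Meat Institute audit format or criteria.

```python
# Minimal sketch of a numerical handling-audit tally (illustrative only).
# Each observed animal is recorded as a set of yes/no outcomes; each audit
# score is the percentage of observed animals showing that outcome.

from dataclasses import dataclass

@dataclass
class Observation:
    prodded: bool    # moved with an electric prod
    vocalized: bool  # vocalized during handling/restraint
    fell: bool       # fell during handling

def audit_scores(observations: list[Observation]) -> dict[str, float]:
    n = len(observations)
    return {
        "electric_prod_pct": 100 * sum(o.prodded for o in observations) / n,
        "vocalization_pct": 100 * sum(o.vocalized for o in observations) / n,
        "falling_pct": 100 * sum(o.fell for o in observations) / n,
    }

if __name__ == "__main__":
    # Hypothetical 100-animal sample: 15 prodded, 2 vocalized, 1 fell.
    sample = ([Observation(True, False, False)] * 15
              + [Observation(False, True, False)] * 2
              + [Observation(False, False, True)] * 1
              + [Observation(False, False, False)] * 82)
    print(audit_scores(sample))
```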
There are several mechanisms to improve handling, regardless of the situation. Firstly, the physical environment can be altered to minimize balking and to improve cattle flow. For example, cattle balked less when an entrance was lit (Grandin, Reference Grandin2001a). Similarly, reducing pressure on the neck from a head restraint device decreased the percentage of cattle vocalizing from 23 to 0% (Grandin, Reference Grandin2001a). Secondly, low levels of electric prod use, vocalization, and falling can be achieved by altering human behavior: better employee training, increased supervision, and moving smaller groups of cattle. Thirdly, several studies and reviews have shown that habituating cattle to handling procedures, such as being moved through chutes, can help reduce stress when the cattle are handled in the future (Fordyce, Reference Fordyce1987; Grandin, Reference Grandin1997; Cooke et al., Reference Cooke, Arthington, Araujo and Lamb2009, Reference Cooke, Bohnert, Cappellozza, Mueller and Delcurto2012; Petherick et al., Reference Petherick, Doogan, Holroyd, Olsson and Venus2009). Finally, other animal factors, such as breed, can affect handling. For example, Brahman cross and continental cattle run out of the squeeze chute faster than British cattle (Baszczak et al., Reference Baszczak, Grandin, Gruber, Engle, Platter, Laudert, Schroeder and Tatum2006), and plasma cortisol levels were higher in Bos indicus calves than in Bos taurus calves (Zavy et al., Reference Zavy, Juniewicz, Phillips and Vontungeln1992).
The type of restraint can affect beef cattle welfare. For example, there has been recurrent interest in electrical immobilization. However, each scientific study on electrical immobilization has found that it is extremely aversive (Lambooy, Reference Lambooy1985; Grandin et al., Reference Grandin, Curtis, Widowski and Thurmon1986; Pascoe, Reference Pascoe1986; Petherick et al., Reference Petherick, Mccosker, Mayer, Letchford and Mcgowan2013). The AVMA states, ‘Electrical immobilization that paralyzes an animal without first inducing unconsciousness is extremely aversive and is unacceptable’ (AVMA, 2013).
Future research
Research is required to understand the effects of some management practices, such as feeding ß-adrenergic agonists, on animal handling (see Nutrition section for more detail).
Slaughter
Welfare issues during slaughter have recently been reviewed by Grandin (Reference Grandin and Grandin2009, Reference Grandin2013, Reference Grandin2014). Like handling, slaughter is an area that has undergone considerable improvement through programs such as the North American Meat Institute audit of meat packing plants (Grandin, Reference Grandin2012b). Current problems at slaughter center on animals that are unfit for transport or the kill process because of morbidity or poor body condition. The World Organization for Animal Health has guidelines on suitability for transport (2012). When these guidelines are not met, euthanasia is the key alternative for animals not suitable for transport, and the AVMA has recently revised its recommendations for these procedures (2013).
Animal welfare during religious slaughter is poorly understood and controversial. Discussions of this issue, reflecting different perspectives, can be found in Grandin (Reference Grandin2013), Rosen (Reference Rosen2004), Gibson et al. (Reference Gibson, Johnson, Murrell, Hulls, Mitchinson, Stafford, Johnstone and Mellor2009), AVMA (2013), Grandin (Reference Grandin1994), Grandin and Regenstein (Reference Grandin and Regenstein1994), Gregory and Grandin (Reference Gregory and Grandin2007), and Anil (Reference Anil2012). When the animal is held in a well-designed restraint device for religious slaughter, the vocalization score will be 5% or less (Grandin, Reference Grandin2012a). Scientific studies report conflicting results about the possible stress and pain that occur when the throat is cut in a fully sensible animal (Grandin and Regenstein, Reference Grandin and Regenstein1994; Rosen, Reference Rosen2004; Gibson et al., Reference Gibson, Johnson, Murrell, Hulls, Mitchinson, Stafford, Johnstone and Mellor2009; Gregory et al., Reference Gregory, Von Wenzlawowicz and Von Holleben2009, Reference Gregory, Schuster, Mirabito, Kolesar and Mcmanus2012). Grandin (Reference Grandin1992) describes the operation of a well-designed restraint device for slaughter without stunning; ultimately, the decision about the acceptability of these methods may lie in public discussion rather than in scientific evidence.
Future research
Research is required to understand the relationship between kill method, pain and sensibility.
Conclusions
We have aimed to strategically review existing scientific information about the effects of intensive aspects of production on the welfare of beef cattle in the USA. Within the areas where more research is required, we offered our thoughts about priorities, based on our interpretation of their scientific importance or our anticipation of issues that are likely to be the subject of public discussion, where evidence, rather than perceived ethical concerns alone, would be beneficial. Within our priorities, we identified two themes: areas where implementation or policy-based actions are needed and issues where additional empirical research is needed. We discuss each in turn.
For some topics, considerable research informs best practice, yet gaps remain between scientific knowledge and implementation. BRD prevention provides a good example. As we reviewed, many of the risk factors and management strategies that prevent this disease are known but are used by only a portion of the beef industry. Yet one of the most pressing welfare issues facing the beef industry is the high health risk to newly arrived feedlot cattle and the subsequent reliance on antimicrobials to treat and prevent respiratory disease at the feedlot. This is both a public relations and an animal health issue that will require leadership, discussion, sincere effort, and financial incentives across all segments of the beef industry to gain widespread adoption of practices that are in the best interest of the calves’ welfare. Other areas where a lack of empirical evidence does not limit decisions about how to improve animal welfare include the use of analgesics and the promotion of alternatives to painful procedures. However, if producers and veterinarians are expected to adopt interventions to manage pain during processing, it is critical that they be provided with FDA-approved compounds to accomplish this goal. Animal welfare in these areas could be improved considerably and quickly with leadership and regulatory changes that increase the number and types of analgesics available. There is evidence of success when such actions are taken, as illustrated by the dramatic improvements in handling at slaughter facilities reviewed here.
In some cases, however, additional empirical evidence is needed to inform best practice. The summary provided here encapsulates many of the ideas laid out in the main body of this review. Below, we offer our thoughts on the most urgent of these issues, based on both their scientific importance and our sense of which topics will be timely now or in the near future.
• Evaluation of the animal welfare implications of technologies used to either promote growth or manage cattle in feedlots, particularly ß-adrenergic agonists, hormonal implants, immunocastration and MGA. In some cases, relatively little is known about the welfare implications of these technologies, while in others we anticipate continued public scrutiny and interest in these management tools.
• Understanding risk factors for health problems that are poorly studied at the population level, such as lameness, and the effect that management has on both these rarer conditions and more common diseases (e.g. SARA). Epidemiological work in this area is the first step toward understanding how these problems may be prevented across the entire industry; it would also provide insight into the long-term consequences of management decisions, such as weaning method and the transition to high-concentrate diets, as well as the economic benefits of pain mitigation.
• Understanding the welfare implications of limited feed, water, and rest intervals (both with and without unloading) for cattle transported for long durations under extreme climatic conditions and extreme management conditions (e.g. excessively high or low loading density). Transport is a highly visible and important aspect of the cattle industry that has been poorly studied.
• Research is required to develop recommendations about year-round stocking density in feedlots with respect to key resources: dry lying areas (mounds), shade, water, and feed. This work is needed at a commercial scale, using industry-relevant group sizes and multiple measures of welfare, with particular emphasis on animal behavior, as these dependent variables provide insight into competition for, and simultaneous use of, these resources. This work is important for identifying how best to manage summer and winter weather, as well as social interactions, in feedlots.
• Understanding the welfare implications of aspects of trailer design, including optimal loading densities by animal type and weather, trailer design features that control environmental conditions (ventilation, use of bedding, boarding), and internal ramp and compartment construction. All of these features or management decisions need to be evaluated in terms of behavioral and physiological indicators of health and welfare and meat quality.
Investment in these research areas would advance science-based recommendations about beef cattle welfare.
Acknowledgments
This project was sponsored by The Beef Checkoff. We thank Janice Swanson (Michigan State University) for thoughtful comments on an earlier draft. We are also grateful to Erin Mintline (UC Davis) for her help in proofreading and reference organization.