
The contribution of behavioral economics to crisis management decision-making

Published online by Cambridge University Press:  18 December 2017

John A. Parnell
Affiliation:
School of Business, University of North Carolina at Pembroke, Pembroke, NC, USA
William ‘Rick’ Crandall
Affiliation:
School of Business, University of North Carolina at Pembroke, Pembroke, NC, USA
Corresponding author: john.parnell@uncp.edu

Abstract

Scholarly work in the field of crisis management has flourished in recent years with contributions from numerous disciplines, including strategic management, organizational behavior, public relations, risk management, and disaster management. However, the substantial and prospective applications from behavioral economics – from Herbert Simon to modern theorists – have yet to be systematically integrated into the literature. This paper presents a framework that categorizes applications from behavioral economics along three stages of the crisis management life cycle – crisis preparation, crisis action, and postcrisis. It provides insights for scholars and practitioners into the crisis decision-making process and outlines why ‘less-than-rational’ decision-making approaches often appear in crisis environments.

Research Article

Copyright © Cambridge University Press and Australian and New Zealand Academy of Management 2017

INTRODUCTION

An organizational crisis is a low-probability, high-impact event that threatens an organization and its stakeholders. It is largely unexpected, can create substantial damage to the organization, and requires a prompt, decisive response (Crandall, Parnell, & Spillan, 2014; Haibing, Jinhong, Qi, & Wilbur, 2015). If not addressed effectively, a crisis can seriously damage an organization's performance and reputation (Pearson & Clair, 1998; Coombs & Holladay, 2006; Coombs, 2007). Effectively managing a crisis is a multifarious process that extends beyond quotidian decision-making (Pearson & Clair, 1998; Roux-Dufort, 2007; Parnell, Köseoglu, & Spillan, 2010; Crandall, Parnell, & Spillan, 2014). Crisis management (CM) is an interdisciplinary field, with scholarly contributions from a variety of disciplines including finance, strategic management, organizational behavior, public relations, risk management, and disaster management (Paraskevas, 2006; Piotrowski, Watt, & Armstrong, 2010; Gerrans, 2012; Maiorescu, 2016). CM has been conceptualized as the management of exceptions (Roux-Dufort, 2007).

CM is important in both public and private organizations (Hunter, Van Wassenhove, & Besiou, 2016), although government agencies are typically not as effective as private organizations in CM activities (Piotrowski, 2006). Facing resource limitations, smaller organizations often struggle to survive when a crisis hits. Nonetheless, there is relatively little published work on crisis planning in small- and medium-sized organizations, with notable exceptions in select crisis-prone industries such as tourism and hospitality (Racherla & Clark, 2009; Herbane, 2013; Sawalha, Jraisat, & Al-Quduh, 2013; Morakabati, Page, & Fletcher, 2017).

Effective CM requires action before, during, and after a crisis event. Crises should be avoided when possible, the negative effects of unavoidable crises should be mitigated, and firms should promote learning after the crisis is under control. Crisis management teams (CMTs) should develop plans, including the assessment of worst-case scenarios, prior to a crisis event (Jaques, 2007, 2010; Wester, 2009; Cirka & Corrigall, 2010). However, uncertainty and organization-specific vulnerability determine the priority, time, and resources available for crisis planning. As such, CMT members require salient and accurate information throughout the process (Somers, 2009; Crandall, Parnell, & Spillan, 2014), and all organizational members should understand their specific responsibilities and be empowered to manage the crisis in their own departments (Fowler, Kling, & Larson, 2007).

Early CM scholars (e.g., Pearson & Mitroff, 1993; Lerbinger, 1997; Pheng, Ho, & Ann, 1999) crafted categories and typologies to identify and explain the CM process. Mitroff's (2005) typology of seven crisis families – economic, informational, physical, human resource, reputational, psychopathic acts, and natural disasters – remains widely referenced. Research addressing the strategic and long-term effects of crises (Elsubbaugh, Fildes, & Rose, 2004; Evans & Elphick, 2005; Coombs & Holladay, 2006) and the diversity of crisis types (Robert & Lajtha, 2002; Coleman, 2004; Crandall, Parnell, & Spillan, 2014; Christensen, Lægreid, & Rykkja, 2016) has increased in recent years.

Central to the CM process is the challenge of crisis decision-making, which requires leaders to render rapid decisions amid stress, high uncertainty, and complexity. Indeed, decision-making tends to be multifaceted and more challenging under such circumstances (de Waard, Volberda, & Soeters, 2012; Kantur & Iseri-Say, 2012). Scholars have traditionally viewed decision-making in all organizational contexts, including CM, as an inherently rational process whereby decision-makers collect and analyze facts, identify alternatives, and select the optimal one for implementation. In practice, however, exclusively rational decision-making is rarely the norm; even the most objective decision-makers do not make rational decisions in every instance. Although rational models represent a starting point for managerial decision-making, their shortcomings become more acute in crisis situations.

Behavioral economics (BE) challenges the rationality assumption inherent in traditional decision-making models – that decision-makers evaluate a full set of alternatives and select one through a systematic process. BE integrates the fields of psychology, sociology, and neuroscience, and has flourished in recent years (McDonald, 2008). Herbert Simon, often regarded as the founder of BE, proposed one of its cornerstones, the concept of bounded rationality. Although Simon rejected the classical economic definition of rationality, bounded rationality is not akin to irrational decision-making (Diacon, Donici, & Maha, 2013). Instead, decision-making tactics are rational to the decision-makers themselves, who can usually justify them (Simon, 1985). In practice, this means perfectly informed, rational decisions do not occur, because incomplete information, time constraints, and cognitive limitations are always present (Diacon, Donici, & Maha, 2013).

The concept of bounded rationality rests on two real-world notions, limited search and satisficing (Simon, 1991; Schilirò, 2012). Individuals must make decisions within prescribed bounds, and a satisficing decision is more feasible because it need only be satisfactory, not perfect. Satisficing reflects the decision-maker's approach when shortcuts are required: often, the first satisfactory course of action is selected instead of continuing to search for the optimal one (Fox, 2015).
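
Simon's formulation is verbal, but the contrast between full optimization and satisficing can be sketched in a few lines of code. The alternatives, scores, and aspiration threshold below are hypothetical illustrations, not part of Simon's work:

    # Optimizing search: evaluate every alternative, then select the best one.
    def optimize(alternatives, score):
        return max(alternatives, key=score)

    # Satisficing search: accept the first alternative that clears the
    # aspiration level, leaving later (possibly better) options unexamined.
    def satisfice(alternatives, score, aspiration):
        for option in alternatives:
            if score(option) >= aspiration:
                return option
        return None  # no option was 'good enough'

    # Hypothetical crisis-response options and effectiveness scores.
    options = ['no recall', 'partial recall', 'total recall']
    scores = {'no recall': 0.2, 'partial recall': 0.7, 'total recall': 0.9}
    print(optimize(options, scores.get))         # 'total recall'
    print(satisfice(options, scores.get, 0.65))  # 'partial recall' (search stops early)

The satisficing search terminates after two evaluations, illustrating limited search; whether that shortcut is acceptable depends on the cost of further search relative to the stakes of the decision.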

The rational approach to decision-making inherent in classical economics assumes that individuals make choices that maximize their utility. It involves identifying all available alternatives to a given problem and evaluating each one until the optimal alternative is found. Whereas rational decision-making assumes a systematic process that ends with an optimal choice, BE posits that decision-makers often resort to heuristics that are 'less-than-rational' because a fully rational thought process is not possible or feasible.

BE exposes shortcomings associated with rational decision-making models, particularly in crisis environments. Given the constraints and demands of crisis situations, prospective contributions from BE to CM are legion, because both fields recognize the psychological, emotional, and interpersonal anomalies associated with multifaceted, nonroutine decisions (Carone & Di Iorio, 2013; Diacon, Donici, & Maha, 2013). Organizational crisis scholars encourage practitioners to make sense of complex and sometimes frantic crisis-related activities by following a systematic process. Indeed, responding to such unexpected and high-impact events calls for a rational mindset, embodied in the organization's CMT and the implementation of its crisis management plan. However, prescribing a systematic process does not mean that decision-makers will actually pursue a rational approach before, during, or after the crisis event.

In this paper, we examine links between BE concepts and organizational CM, including various forms of less-than-rational decision-making in crisis planning and management. We review 13 BE concepts that influence one or more of the crisis stages and explain how their application can enrich our understanding of the CM process. We conclude with more specific applications to the CM field.

A BE PERSPECTIVE ON CM

Even within a rational decision-making context, decision-makers often revert to heuristics, an approach more useful when decisions are routine and the environment is predictable than when time pressure, complexity, and unfamiliarity intrude. These same constraints limit the rational model. BE scholars recognize the importance of the human element: most decisions are made in less-than-ideal situations, and decision-making is fraught with emotion, bias, and inconsistency (Peters & Stark, 2005; Shefrin, 2013, 2015).

A two-system approach to bounded rationality developed by Stanovich and West (2000) and elaborated by Kahneman (2003, 2011) sheds light on this conundrum. System 1 is based on intuition and encompasses rapid, routine, and habitual decisions, whereas system 2 is based on reasoning and encompasses slower, more deliberate decisions. The amount of effort required is a key distinguishing factor between the two systems (Kahneman, 2003). Whereas the system 1 process is largely emotional, effortless, and slow-learning, the system 2 process requires greater effort, is governed by rules, and is more flexible. Crisis decision-making challenges decision-makers because effectiveness requires both the speed of system 1 and the flexibility of system 2.

In the discussion that follows, we examine 13 BE concepts and operationalize them in the crisis planning, crisis action, and postcrisis stages of the CM process. The resulting framework amends the traditional rational decision-making process, whereby crisis threats are formally identified, alternatives are evaluated systematically, and the decision is based on the most logical choice of action. Indeed, while rationality provides a useful paradigm for thinking about CM, the reality of an organizational crisis suggests that human emotion, unique and dire circumstances, and an expedient time frame dictate that an alternative thinking process may be at work. Given the high uncertainty, complexity and asymmetry of information, and the need for a rapid response inherent in crisis decision-making, an approach to decision-making that incorporates BE is germane.

In this discussion, the crisis planning stage is characterized by the formation of the CMT, the identification of potential crisis events unique to the organization, and a plan to address these potential crises. The crisis action stage is characterized by the activation of the CMT and involves containment of the crisis. During the postcrisis stage, external stakeholders such as customers begin to voice their viewpoints on what has transpired, sometimes with damaging results. We begin the discussion with BE heuristics that affect the crisis preparation stage.

Crisis preparation

Crisis preparation is a necessity for any organization that seeks to identify and mitigate crisis events that could affect the entity. The key steps in crisis preparation include establishing the CMT, identifying threats to the organization, and crafting a plan to address those threats (Crandall, Parnell, & Spillan, 2014). While this process may appear straightforward, rational, and strategic, several heuristics can intervene and affect decision-making.

Bounded rationality

We discuss bounded rationality in both the crisis preparation stage and the crisis action stage. Bounded rationality – as proposed by Simon – can exist in the crisis planning stage, particularly with newer CMTs and other managers unfamiliar with the process. Because team members have limited experience in crisis identification and planning, they may have to satisfice until they have acquired a breadth of experience dealing with crisis events. As a result, some important potential crises may be overlooked.

Optimism bias

The tendency to overestimate the probability of a positive occurrence and underestimate the probability of a negative occurrence is referred to as optimism bias (Weinstein, 1980), or the 'positivity illusion' (Ariely, 2009). Many decision-makers assume that crises are more likely to happen to someone else (Caponecchia, 2010); the optimism bias helps explain this mindset. Optimism pushed to the extreme can be dysfunctional to organizations, society, and even the entire economy (Parnell & Dent, 2009).

Visualization

Crisis scenarios that can be easily visualized are more likely to be perceived as viable threats. While photos of airplane crashes and fires are readily available, it is difficult to visualize a technology breakdown, although the latter is more common. For this reason, highly visual crises may receive greater attention even if their likelihood is remote when compared to more likely, but less visual crises. The most prevalent crises facing an organization may not be the most easily visualized. We place visualization in the crisis preparation stage because it is a viable procedure in assessing the worst-case scenarios the organization faces.

Temporal discounting

A rational approach to decision-making assumes that individuals discount the value of future benefits associated with each alternative. However, individuals tend to accept a less attractive outcome in the present, even if appropriately discounted, rather than wait for a more attractive outcome, especially if risk is associated with the delay. This process, known as temporal discounting (Whelan & McHugh, 2009; Dittmar & Bond, 2010), can be a serious concern for individuals seeking to raise crisis awareness in an organization and/or procure the resources necessary to engage in crisis preparation.
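
A worked illustration (the dollar amounts and parameter values are our own, chosen only to make the effect visible): suppose a crisis plan would avert an expected loss of $1,000,000 ten years from now. A standard exponential discounter at 5% and a hyperbolic discounter of the kind often used in BE value that future benefit very differently:

\[
PV_{\text{exponential}} = \frac{A}{(1+r)^{t}} = \frac{\$1{,}000{,}000}{(1.05)^{10}} \approx \$614{,}000,
\qquad
PV_{\text{hyperbolic}} = \frac{A}{1+kt} = \frac{\$1{,}000{,}000}{1+(0.3)(10)} = \$250{,}000.
\]

To the hyperbolic discounter, a crisis-preparation program costing $300,000 today looks like a bad bargain, even though conventional discounting would endorse it.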

Managers who advocate crisis awareness and planning find themselves in a precarious position. On one hand, they are championing crisis planning for events that are unlikely to occur; such planning yields few tangible outcomes to point to, other than that the organization can position itself as 'ready' for a crisis. On the other hand, if crisis planning is not championed, the results can be catastrophic, as an unprepared firm may suffer considerable harm if a major crisis strikes.

We place temporal discounting in the crisis planning stage. The less attractive present outcome is to not plan (or at best, marginally plan) for crisis events. The more attractive future outcome is to have a well-managed crisis management plan in place when a disaster occurs. Temporal discounting means the former option, not the latter, will often seem more desirable, even though it is less beneficial to the organization.

Probability weighting

Identifying quantifiable probabilities associated with alternatives does not guarantee an effective decision. According to prospect theory, decision-makers tend to overrate low-probability events and underrate high-probability events (Kahneman & Tversky, 1979; Barberis, 2013). Ironically, this phenomenon helps raise crisis awareness, but it may shift resources in the wrong direction. Raising awareness can be productive, but resource allocation decisions should be based on objective data, not just estimated probabilities. It would be unwise, for instance, for a CMT to focus a substantial amount of time and energy preparing for a fire while ignoring less spectacular, but more likely, events such as a supply chain disruption caused by weather or a denial of website service due to a computer virus.

We place probability weighting in the crisis preparation stage. Individuals tend to convert objective information about probability into a subjective view of what might happen (Becker & Greenberg, 1978; Mukherjee, 2011). Even when the probabilities of outcomes are reliably known, individuals tend to create their own probability weights in a way that overemphasizes low-probability outcomes.
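
One standard way to formalize this, drawn from the prospect theory literature (the functional form and the γ estimate come from Tversky and Kahneman's later cumulative prospect theory work, used here purely for illustration), is the probability weighting function

\[
w(p) = \frac{p^{\gamma}}{\left(p^{\gamma} + (1-p)^{\gamma}\right)^{1/\gamma}}, \qquad \gamma \approx 0.61.
\]

Under this function, a crisis scenario with a true 1% probability receives a decision weight of w(0.01) ≈ 0.055, roughly five times its objective likelihood, while a 90% scenario receives only w(0.90) ≈ 0.71. Rare, vivid crises are overweighted; mundane, likely ones are underweighted.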

Inertia

Inertia is a tendency to favor the status quo. The concept helps explain much of human behavior (van Witteloostuijn, 1998) and is well understood by marketers. Magazine publishers understand the power of inertia and are often willing to deeply discount a subscription if it is accompanied by an automatic renewal provision. Likewise, many companies offer free trials that require a credit card and continue as a paid service unless the customer calls to cancel (Madrian & Shea, 2001).

We place inertia in the crisis planning stage. Before the crisis, inertia lulls decision-makers into thinking a crisis is unlikely. Inertia can create a false sense of security if the effects of a potential crisis are not easily felt or visualized, and helps explain why many organizations do not engage in crisis planning. Hence, it can reinforce other BE factors that hinder crisis planning.

Crisis action

We define the crisis action stage as the management interventions that commence and are ongoing after a crisis has occurred. During the crisis action stage, the CMT should be activated and steps should be taken to mitigate the effects of the crisis and restore a normal operating state as quickly as possible.

Bounded rationality

A rational approach to decision-making is intuitively appealing, but time constraints and available resources rarely permit a complete and accurate analysis of every available alternative. During an organizational crisis, decisions must be made in real time, in real situations, and by real actors. As such, the complexity and asymmetry of information common to crisis situations suggest that a bounded rationality approach is more feasible.

During an actual crisis, the search for decision options is limited because individuals often lack sufficient time and energy to investigate every available alternative. Effective decision-makers therefore identify key decision criteria, evaluate potential alternatives against them, and know when to make a choice that satisfies those criteria and move on. Satisficing – accepting a satisfactory, but not necessarily optimal, alternative – may lead to error when misused, but it can also be an effective decision-making tool when time is constrained. Perfect decisions are elusive, and spending excessive time and resources to pursue perfection can make a situation worse. Hence, in some cases – and especially during crises – more effective decisions often occur when limitations are embraced, major issues are examined, and decisions are made expediently.

We place bounded rationality in both the crisis planning (discussed previously) and crisis action stages. During the latter stage, decisions must be evaluated and rendered even if time and information may be limited. It may not be feasible to evaluate all of the options before a decision is made.

Compromise effect

There are situations whereby all of the alternatives are deemed acceptable. When faced with a difficult choice in such situations, decision-makers sometimes select an alternative that appears to reside conceptually in the middle of the options. This is known as the compromise effect (Mourali, Böckenholt, & Laroche, 2007; Chuang, Cheng, Chang, & Chiang, 2013).

The compromise effect can be prevalent in a crisis situation. When faced with an array of alternatives, some team members may believe that 'the truth is always in the middle' and gravitate toward ostensibly safe, compromise decisions. For example, the options considered in a product recall discussion typically include a total recall, a partial recall, or no recall. Some team members might favor recalling some of the products under consideration – perhaps those with the greatest likelihood of damage – but not all of them. This tendency toward compromise can result in a CMT ignoring more salient considerations associated with each alternative. We place the compromise effect in the crisis action stage.

Perceptions of gains and losses

Decision-makers perceive potential gains and losses differently. As a rule, they tend to be risk averse in potential gain situations and risk taking in potential loss situations (Pixley, 2010). Crisis situations typically involve potential loss, suggesting that crisis managers might be willing to take irrational risks to avoid or minimize the downside.

We place perceptions of gains and losses in the crisis action stage. Because of personality influences, some CMT members may be risk averse in some areas but risk seeking in others. Differences in how individuals approach risk are linked to perceptions and to how they mentally process negative outcomes.
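
The asymmetry can be made explicit with the prospect theory value function. The parameter values below are the commonly cited estimates from Tversky and Kahneman's cumulative prospect theory and serve only as an illustration:

\[
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0,\\
-\lambda(-x)^{\beta} & \text{if } x < 0,
\end{cases}
\qquad \alpha = \beta = 0.88, \quad \lambda = 2.25.
\]

A sure loss of $500 is valued at v(-500) ≈ -534, whereas a 50% chance of losing $1,000 is valued (setting probability weighting aside) at 0.5 × v(-1,000) ≈ -491. The gamble feels less painful even though the expected dollar losses are identical, which is why crisis managers facing certain losses may embrace risk.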

Omission bias

Decision-makers typically anticipate regret when choosing an alternative. To avoid regret, they often choose safe, default choices that can be easily defended, even when a riskier choice might be objectively better (Sunstein, 2013). They often fail to take action that might help a situation because they fear the regret that would follow if the action led to harm, a phenomenon known as omission bias. Individuals do not tend to think of harms of omission as a personal fault. Consider the case of childhood vaccinations. Some parents are reluctant to have their children vaccinated, even though the likelihood of illness or death from vaccination is typically lower than from forgoing it (Ritov & Baron, 1990). Vaccinating a child who ultimately experiences harm is likely to engender substantial feelings of guilt because the injury resulted from parental action.

The key to understanding omission bias is distinguishing between acts of commission and acts of omission. Acts of omission typically lead to less regret than acts of commission in the short run. This changes in the long run, however, as acts of omission – what could have been done – become more important (Schwartz, 2004).

Omission bias can be seen in crisis decision-making as well. For example, because press conferences typically involve confrontation, an organization in crisis may avoid holding one even though doing so would be beneficial over the long term. In other words, avoiding or delaying the press conference is perceived to be a safe choice, but it is not the best one. We place the omission bias in the crisis action stage, where most of the action decisions are made.

Postcrisis

After a crisis, external stakeholders begin to voice their opinions on what they think the organization did wrong. Reference points, an overreliance on outliers, and framing help explain this mindset.

Reference points

People tend to use reference points rather than absolute values when evaluating alternatives (Baucells, Weber, & Welfens, 2011; Paraschiv & Chenavaz, 2011). Reference points are useful and appropriate, but decision-makers must take care to identify them accurately. During a crisis, some managers may identify the reference point as the state of the fully functional organization before the crisis occurred. However, a crisis alters the realm of possibilities, so decision-makers may delay appropriate crisis responses, evading the reality of the crisis and attempting to push the organization back to an earlier reference point that is no longer attainable. Instead, they should make decisions that offer the best possible outcome for the organization given the current situation.

We place the use of reference points in the crisis action stage. Decisions made during this stage determine whether the organization returns to its original equilibrium point (the precrisis state) or must deviate from the status quo to a new point.

Overreliance on outliers

Individuals tend to remember and overemphasize extraordinary events, such as a time when an airline lost baggage, even though such events rarely occur. A wide range of evidence should be considered when evaluating the likelihood of crisis events.

We place overreliance on outliers in the crisis planning stage. The outlier phenomenon may be present in this stage because crisis events are, by definition, outliers. Raising awareness for certain crises is important, but too much awareness can stymie a firm from a strategic perspective. Crisis managers should be cognizant of outliers when evaluating scenarios, but should not overemphasize them beyond the risk and potential loss they present. Indeed, some risk is always inherent in decisions. The key is to understand and manage it accordingly.

Framing

Framing occurs when the same situation or alternative is stated using different terms, thereby producing different perceptions (Foss & Lindenberg, 2013). For example, many consumers view a product described as 90% fat-free more favorably than one described as 10% fat, although the two descriptions are equivalent. Framing can create challenges for crisis decision-makers. Although crises are often framed as completely unpredictable, chance, or unavoidable events, this is not always the case. A 'we can't help it' perspective is often promoted by individuals who are poorly trained in CM, by those who seek to escape responsibility for crisis planning – an act of omission, as discussed earlier – or both.

We place framing in the crisis action stage. Framing is also a challenge when alternative crisis responses are evaluated. The frames through which alternatives are presented – consciously or otherwise – can influence the attractiveness of those alternatives (Hossain & List, 2012; Goh & Bockstedt, 2013). For example, a production line that is 96% free of defects also means that four products out of 100 will be defective. When assessing alternatives, decision-makers should take care to rephrase and evaluate facts and options from all angles to minimize problems associated with framing.

IMPLICATIONS

Understanding the basic tenets of BE can help explain the decision-making of CMT members, as well as the perspectives of customers and other external stakeholders. In this respect, BE can augment rational decision-making models by addressing their limitations in crisis environments. Understanding the reasoning behind less-than-optimal decisions is a first step in improving the quality of an organization's CM response. In the discussion that follows, we focus on three stages of the crisis life cycle – crisis preparation, crisis action, and postcrisis (see Table 1). To augment the discussion, we reference a prominent crisis that United Airlines experienced on 9 April 2017.

Table 1 Implications for management

Crisis preparation: BE explains why crisis planning may be conservative and overlook important potential crises

The identification of worst-case scenarios is a key charge of the CMT. From this list, crisis contingency plans are prepared and resources are made available, ‘just in case.’ But the motivation to identify these scenarios is complicated by certain behavioral factors. Examining a recent corporate crisis illustrates the dynamics of BE heuristics. On 9 April 2017, United Airlines personnel asked a passenger, Dr. David Dao, to depart the aircraft prior to takeoff. The seat was required for one of four crew members the airline needed to transport to Louisville, Kentucky. Dr. Dao refused, and he was forcibly removed from the aircraft to the dismay of fellow passengers. The event was recorded on a smartphone and the video went viral on social media.

While the problem faced by United was multifaceted, the request emanated from a common industry practice of overbooking flights. In this instance, four passengers were involuntarily denied boarding after volunteers could not be found. Their seats were required so that four crew members could be repositioned for another flight. Unfortunately, the four individuals selected for seat denial had already been permitted to board the flight.

Overbooking in the airline industry is a form of revenue enhancement and a valid business process (Carey, 2017). The rationale is that not all passengers who reserve a flight arrive in time to board. Consequently, no-show passengers create empty seats, which represent lost revenue, known as 'spoilage' in the airline industry (Klophaus & Pölt, 2006).

BE helps explain overbooking from the airlines' perspective and its general acceptance among most passengers, as well as the public's negative reaction after the United Airlines crisis ensued. Airlines use elaborate mathematical models to calculate how much a specific flight should be overbooked so that the plane is filled to its optimum level without causing paying passengers to be involuntarily denied boarding (Siddappa, Rosenberger, & Chen, 2007; Gerchick, 2013). If a flight is overbooked, airlines offer incentives to passengers to take later flights. Volunteers are usually, but not always, identified.

The practice of overbooking has been largely successful, and the number of passengers affected by involuntary boarding denial has been relatively small. In 2016, ~660 million passengers boarded flights in the United States; only 434,425, or 0.07%, were 'bumped' and voluntarily took a later flight, while 40,629 passengers, or 0.006%, were removed involuntarily (Hopper, 2017). The industry has viewed the practice as a success; indeed, fewer empty seats mean more revenue, lower per-seat costs, and lower fares (Klophaus & Pölt, 2006). Hence, a form of inertia developed. Within the airline industry, overbooking was an accepted part of the status quo, and as long as problems were minimal, the status quo would become even more entrenched.
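
The revenue logic behind overbooking can be sketched with a simple binomial no-show model. The capacity, show-up probability, fare, and bump compensation below are illustrative assumptions, not figures from this case or from any airline's actual model:

    # Illustrative overbooking model: find the booking limit that maximizes
    # expected revenue net of compensation paid to bumped passengers.
    from scipy.stats import binom

    CAPACITY = 180     # seats on the aircraft (assumed)
    P_SHOW = 0.95      # probability a booked passenger shows up (assumed)
    FARE = 250         # revenue per boarded passenger, in dollars (assumed)
    BUMP_COST = 800    # compensation per denied passenger, in dollars (assumed)

    def expected_net_revenue(bookings):
        total = 0.0
        for shows in range(bookings + 1):
            prob = binom.pmf(shows, bookings, P_SHOW)
            boarded = min(shows, CAPACITY)     # seats actually filled
            bumped = max(shows - CAPACITY, 0)  # passengers denied boarding
            total += prob * (boarded * FARE - bumped * BUMP_COST)
        return total

    # Selling exactly CAPACITY tickets leaves expected empty seats ('spoilage');
    # a modest overbooking margin raises expected revenue despite rare bumps.
    best = max(range(CAPACITY, CAPACITY + 16), key=expected_net_revenue)
    print(best - CAPACITY, 'extra tickets maximize expected net revenue')

Under these assumed numbers the model overbooks by a handful of seats; the involuntary denials that BE helps explain arise in the rare tail where more passengers show up than the model anticipated.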

Given this record of success, managers in the industry had not focused on prospective crises that could result from overbooking. Planning for worst-case scenarios was essentially nonexistent, and bounded rationality regarding crisis planning became the norm. Because managers have limited experience in crisis identification and planning, they often satisfice until they have acquired the breadth of knowledge and experience required to address specific crises in a precise manner. Prior to the United Airlines crisis, industry players lacked experience with this type of event. Recall also that optimism bias is the tendency to overestimate the probability of a positive outcome and underestimate the probability of a negative outcome. United was overly optimistic that a crisis could always be avoided. This high level of confidence seemed reasonable, as passengers had generally been compliant for decades (Rothstein, 1985).

Recall also that visualization encourages risk identification and planning before a crisis event occurs, including a process for addressing such events should they arise. While some crisis scenarios, such as a power outage, are easily visualized, others, such as a computer chip failure, are not. Hence, visualization is biased against identifying certain types of crises. Prior to the United incident, forcible removal of a ticketed passenger from an aircraft had occurred only when he or she was disruptive or posed a security threat (Hopper, 2017). The problem of visualization was present: the United crisis was a true cosmology episode (see Weick, 1993) in that no major airline had ever experienced such a confluence of events. Indeed, United officials had never visualized this scenario or the disastrous impact it would have.

Temporal discounting contributed to the response as well. This mindset assumes that decision-makers discount the value of future benefits associated with an alternative. In this situation, the option chosen by airline personnel – forcibly removing a seated passenger – was believed, even with its downside, to lead to a more positive outcome. Allowing the passenger to remain on the aircraft was deemed unsuitable because it carried perceived risk: the flight crew might not reach its destination, creating problems for a future flight. The total cost of passenger removal was not considered, and the subsequent public relations outcry and eventual passenger lawsuit exceeded any benefits associated with transporting the flight crew. Did the flight generate enough value to United to offset the crisis it created? Temporal discounting suggests that it did, but this does not mean that the correct decision was made. Rather, it explains why the decision might have been made.

Probability weighting also helps explain (but not justify) the rationale behind United's decision to deplane a seated, paying passenger. Recall that decision-makers tend to overrate low-probability events and underrate high-probability events. United's removal of Dr. Dao was a low-probability event; it had never happened before (Hopper, 2017), so the risks associated with it were not considered. However, given the advent of social media, United failed to weigh a higher-probability outcome: that other passengers would record and disseminate the event. The video went viral, the event became known across the globe, and the crisis fully ensued within minutes. United had overrated the importance of one event and underrated the importance of another.

In summary, probability weighting introduces a bias that may cause crisis planners to emphasize the wrong worst-case scenarios. It leads the decision-maker to place too much weight on a low-probability event and too little weight on a high-probability event. This is akin to committing too many resources to defend against, for example, a physical terrorist attack (a low-probability event in most organizations) while allocating considerably fewer resources to guarding against an information system breakdown (a higher-probability event in most organizations). Hence, probability weighting is biased against identifying certain types of crises, specifically those perceived as low-probability events, although they may be more likely to occur than perceived. Figure 1 shows the probability weighting function line. In the gray area, the decision-maker overestimates the importance of the crisis event; in the white area, the decision-maker underestimates its severity.

Figure 1 The effect of the probability weighting function, optimism bias, and overreliance on outliers on crisis decision-making

Figure 1 also depicts how an overreliance on outliers (discussed in the postcrisis stage) may create a false sense of urgency in favor of crisis events that are serious but unlikely to occur to the organization. The overreliance on outliers tends to put the decision-maker in the gray area of the figure. Like probability weighting, then, an overreliance on outliers is biased against identifying certain types of crisis events. In this case, however, the decision-maker is biased against focusing on higher-probability events in favor of allocating more attention to extreme, low-probability events – outliers.

The white area of Figure 1 depicts the region where the decision-maker places less weight on the occurrence of a crisis than she should place on planning for it. This can result in the organization taking on an unnecessary level of risk for which it is not adequately prepared. Two concepts skew decision-making into this region: optimism bias and probability weighting. Per optimism bias, the decision-maker simply does not perceive that a crisis could occur at his or her organization; the probability weighting function line also passes through this white region.

Figure 2 Bounded rationality and decision options

Optimism bias, temporal discounting, and inertia contribute to our knowledge of why crisis decision-makers are reluctant to change. Inertia makes it difficult to embrace crisis planning as a legitimate activity because movement is difficult when one is already standing still. In other words, if crisis planning has not been the norm in the organization, it becomes more difficult over time to take it seriously. If the organizational culture does not support a crisis readiness mentality, then crisis planning will be difficult to implement until there is a cultural shift (Smith & Elliott, 2007). Hence, inertia is the BE concept that identifies the thinking pattern, whereas organizational culture describes why that pattern exists.

On the other hand, optimism bias and temporal discounting do not necessarily dismiss the threat of a crisis. Instead, they lull management into thinking the threats either will occur elsewhere or are so far removed from the firm that they are not worth considering. Deferred maintenance illustrates this phenomenon. Consider that (1) deferred maintenance will save money in the present fiscal period, (2) the result will be an enhanced bottom line, (3) the unintended consequence will be safety issues, and (4) a workplace accident will eventually result from the eroding equipment and machinery. The 2010 explosion and fire at Tesoro Corporation in the state of Washington, USA, which killed seven workers, illustrates this scenario. Investigations revealed that Tesoro had continued to operate failing equipment that should have been replaced and had intentionally postponed maintenance, both of which were contributing factors in the accident (Crandall, Parnell, & Spillan, 2014).

Crisis action: BE explains why decisions made during a crisis may gravitate toward a pre-established decision rule instead of a logical problem-solving thought process

When United Airlines had to find four seats for its crew members, a cosmology episode ensued, as passengers were already seated on the aircraft; standard operating procedure is to address such issues before passengers board. Hence, United was in new territory, and there was little time to evaluate all the alternatives for resolving the problem. Bounded rationality suggests that during an organizational crisis, decisions must be made expediently and with satisficing in mind; indeed, good enough is good enough. With very little time to make decisions, United invoked its involuntary denial of boarding policy after passengers were seated on the aircraft. Given the constraints of bounded rationality, this was logical, but not optimal. Still, it explains decision-making under complex circumstances that are only somewhat predictable. What was not predicted was what happened next: Dr. Dao's refusal to deplane.

Recall that the compromise effect occurs when all of the alternatives in a set are deemed acceptable. When faced with choices, decision-makers often select an alternative that appears to reside conceptually in the middle of the options. The source of the decision to deplane the four seated passengers is not entirely clear, but other alternatives were available: the crew members could have been placed on another flight, or passengers could have been offered a higher payout to give up their seats (Hopper, 2017). The alternative selected could have been a compromise decision, so we offer it here as a rationale as well. Compromise decisions are typically middle-of-the-road choices, neither too extreme nor too difficult to implement. In the United case, the compromise decision of invoking the involuntary denial of boarding policy was arguably the simplest option available. Of course, hindsight proves that it was not the right decision. As United CEO Oscar Munoz noted after the crisis, 'We let our policies and procedures get in the way of doing the right thing' (Carey, 2017: B1).

Perceptions of gains and losses were a factor in the crisis as well. As a rule, decision-makers tend to be risk averse in potential gain situations and risk seeking in potential loss situations (Pixley, 2010). Since a crisis involves potential loss, managers may be more willing to take extra risks to minimize possible losses. The need to transport four crew members on an already full flight could be viewed as cutting losses: if the crew members were not seated, another flight might have to be cancelled, creating losses for the airline, whereas removing passengers would cost only the fares and payouts for the four who deplaned. Given the option of losing revenue on an entire future flight or losing revenue from four passengers on a current flight, United chose the latter. But how does an airline manage the one passenger who refuses to leave a boarded flight? United chose to cut its losses by forcibly removing the passenger, a risky option in part because other passengers could easily record the incident. United was within its legal rights to remove the passenger involuntarily, but doing so was clearly the wrong decision.

Recall that decision-makers anticipate regret when choosing an alternative; to avoid regret, they may choose safe, default choices that can be easily defended. Here, the default choice was to continue with the involuntary denial of boarding policy associated with overbooking. However, United failed to anticipate that executing the policy under a different scenario could backfire. Passenger reactions were influenced by the fact that passengers were being removed so that United crew members could board, and that the passengers were already seated on the plane. The United spokesperson who initially noted that the airline was following policy was correct; the policy was the easily defended default.

Organizational crises create unwelcome time constraints that can be inconsistent with the time-neutral assumption inherent in rational decision-making models. With speed at a premium, the CMT must ultimately resort to a bounded rationality approach and make decisions expediently. Figure 2 depicts a hypothetical situation involving various decision alternatives for an organization facing a crisis. Although decision-makers would prefer the chance to evaluate all recall options in a systematic manner, a sense of urgency prevails and management must consider only three possible options.

To summarize, bounded rationality, the compromise effect, perceptions of gains and losses, and the omission bias can cause decision-making to sway toward alternatives in the conceptual middle. When compromise decisions are made, it is difficult to assess whether options at either end of the spectrum would have been more effective. Moreover, a middle-of-the-road decision can hinder the effectiveness of the crisis strategy. Because of group dynamics, decision-makers are often pushed to compromise, minimize perceived losses, and avoid regret. Such decision-making does not always address the crisis effectively.

Postcrisis: BE explains why consumers and external stakeholders may view a company in a negative light after a crisis

Crisis situations are extraordinary and, by definition, outliers. In the crisis preparation stage, decision-makers must advocate the allocation of time, energy, and resources to prepare for situations many in the organization – including executives – do not believe will happen. In the crisis action stage, they must make decisions under duress, with incomplete information and limited time. In the postcrisis stage, external stakeholder opinions surface and, because of social media, can be noteworthy and powerful. Reference points, an overreliance on outliers, and framing help explain how these external cognitions occur.

Recall that people tend to use reference points rather than absolute values when evaluating alternatives, and that decision-makers must take care to identify reference points accurately. In the United case, passengers and airline officials viewed the reference points differently. Airline officials view passengers as sources of revenue whether they have boarded an aircraft or not. Passengers, however, draw a sharp distinction between being denied boarding at the gate and being removed from a seat on the aircraft. In CM, understanding this distinction is critical, as it can be the deciding factor in an ensuing crisis. Clearly, reference points matter.

Recall that an overreliance on outliers is the tendency to remember and emphasize extraordinary events, even though they rarely occur. Clearly, the United Airlines case is an outlier. Many might assume it is unlikely to recur, but this assumption may not be accurate. Most consumers probably perceive forcible removal from an aircraft as an outlier, but it is also one that is easily recorded and can be experienced again and again on social media. The event is unlikely to recur at United, but passengers will carry the mindset that it could. United and other airlines may not get the benefit of the doubt in future confrontations; passengers will be quick to pull out their smartphones and record anything they see as potentially damaging or newsworthy.

Recall that framing occurs when the same situation is perceived in different ways by different people or groups. Regarding overbooking, some passengers frame the situation as a positive, while others frame it as a negative. Passengers who voluntarily take a later flight can benefit financially by receiving vouchers to offset the inconvenience; in fact, United reports that 96% of its overbooked customers agree to take another flight in return for compensation (Carey, 2017). However, the United crisis has been framed negatively by passengers, as many on-board incidents between passengers and flight attendants are now being recorded and posted on social media outlets.

CONCLUSION

The field of BE offers keen insight into decision-making in all stages of the CM process, particularly the crisis preparation and crisis action stages. While rational decision-making serves as a sound conceptual basis for thinking about CM, it has practical limits. Indeed, rational decision-making is typically more difficult when an organizational crisis is involved. Many textbook cases of CM depict ineffectiveness and suggest clear, serious and often repeated mistakes by crisis managers. Comprehensive academic and practical understandings of the CM field hinge on a modified decision-making framework that invokes less-than-rational, evidence-based concepts. BE explains why the utility of rational decision-making models declines as an organization shifts into crisis mode. As such, BE offers both a scholarly richness and a toolkit to shift crisis thinking to a more behavioral perspective.

Acknowledgement

This manuscript is an original work that has not been submitted to nor published elsewhere. All authors have read and approved the paper.

References

Ariely, D. (2009). The curious paradox of ‘optimism bias’. Business Week, 48.Google Scholar
Barberis, N. C. (2013). Thirty years of prospect theory in economics: A review and assessment. Journal of Economic Perspectives, 27(1), 173196.10.1257/jep.27.1.173CrossRefGoogle Scholar
Baucells, M., Weber, M., & Welfens, F. (2011). Reference-point formation and updating. Mathematics of Operations Research, 36(1), 506519.Google Scholar
Becker, B. W., & Greenberg, M. G. (1978). Probability estimates by respondents: Does weighting improve accuracy? Journal of Marketing Research (JMR), 15(3), 482486.10.1177/002224377801500318CrossRefGoogle Scholar
Caponecchia, C. (2010). It won’t happen to me: An investigation of optimism bias in occupational safety and health. Journal of Applied Social Psychology, 40(3), 601617.10.1111/j.1559-1816.2010.00589.xCrossRefGoogle Scholar
Carey, S. (2017, April 27). United cites litany of failures. The Wall Street Journal, p. B1.Google Scholar
Carone, A., & Di Iorio, L. (2013). Crisis management: An extended reference framework for decision-makers. Journal of Business Continuity & Emergency Planning, 6(4), 347359.Google ScholarPubMed
Christensen, T., Lægreid, P., & Rykkja, L. H. (2016). Organizing for crisis management: Building governance capacity and legitimacy. Public Administration Review, 76(6), 887897. http://dx.doi.org/10.1111/puar.12558CrossRefGoogle Scholar
Chuang, S. C., Cheng, Y. H., Chang, C. J., & Chiang, Y. T. (2013). The impact of self-confidence on the compromise effect. International Journal of Psychology, 48(4), 660675.10.1080/00207594.2012.666553CrossRefGoogle ScholarPubMed
Cirka, C. C., & Corrigall, E. A. (2010). Expanding possibilities through metaphor: Breaking biases to improve crisis management. Journal of Management Education, 34, 303323.10.1177/1052562909336912CrossRefGoogle Scholar
Coleman, L. (2004). The frequency and costs of corporate crises. Journal of Contingencies & Crisis Management, 14, 311.10.1111/j.1468-5973.2006.00476.xCrossRefGoogle Scholar
Coombs, W. (2007). Ongoing Crisis Communication: Planning, Managing, and Responding (2nd ed.), Thousand Oaks, CA: Sage.Google Scholar
Coombs, W. T., & Holladay, S. J. (2006). Unpacking the halo effect: Reputation and crisis management. Journal of Communication Management, 10(2), 123137.10.1108/13632540610664698CrossRefGoogle Scholar
Crandall, W. R., Parnell, J. A., & Spillan, J. E. (2014). Crisis Management: Leading in the New Strategy Landscape (2nd ed.), Thousand Oaks, CA: Sage Publications.Google Scholar
de Waard, E. J., Volberda, H. W., & Soeters, J. (2012). How to support sensing capabilities in highly volatile situations. Journal of Management & Organization, 18, 774794.10.5172/jmo.2012.18.6.774CrossRefGoogle Scholar
Diacon, P. E., Donici, G.-A., & Maha, L.-G. (2013). Perspectives of economics – Behavioural economics. Theoretical & Applied Economics, 20(7), 2732.Google Scholar
Dittmar, H., & Bond, R. (2010). I want it and I want it now’: Using a temporal discounting paradigm to examine predictors of consumer impulsivity. British Journal of Psychology, 101(4), 751776.10.1348/000712609X484658CrossRefGoogle Scholar
Elsubbaugh, S., Fildes, R., & Rose, M. B. (2004). Preparation for crisis management: A proposed model and empirical evidence. Journal of Contingencies and Crisis Management, 12(3), 112127.10.1111/j.0966-0879.2004.00441.xCrossRefGoogle Scholar
Evans, N., & Elphick, S. (2005). Models of crisis management: An evaluation of their value for strategic planning in the international travel industry. International Journal of Tourism Research, 7(3), 135150.10.1002/jtr.527CrossRefGoogle Scholar
Foss, N. J., & Lindenberg, S. (2013). Microfoundations for strategy: A goal-framing perspective on the drivers of value creation. Academy of Management Perspectives, 27(2), 85102.10.5465/amp.2012.0103CrossRefGoogle Scholar
Fowler, K. L., Kling, N. D., & Larson, M. D. (2007). Organizational preparedness for coping with a major crisis or disaster. Business & Society, 46(1), 88103.10.1177/0007650306293390CrossRefGoogle Scholar
Fox, J. (2015). From “Economic Man” to behavioral economics: A short history of modern decision-making. Harvard Business Review, 93(5), 7985.Google Scholar
Gerchick, M. (2013). Full upright and locked position: Not-so-comfortable truths about air travel today. New York/London: W.W. Norton and Company.Google Scholar
Gerrans, P. (2012). Retirement savings investment choices in response to the global financial crisis: Australian evidence. Australian Journal of Management, 37, 415439.10.1177/0312896212450041CrossRefGoogle Scholar
Goh, K. H., & Bockstedt, J. C. (2013). The framing effects of multipart pricing on consumer purchasing behavior of customized information good bundles. Information Systems Research, 24(2), 334351.10.1287/isre.1120.0428CrossRefGoogle Scholar
Haibing, G., Jinhong, X., Qi, W., & Wilbur, K. C. (2015). Should ad spending increase or decrease before a recall announcement? The marketing-finance interface in product-harm crisis management. Journal of Marketing, 79(5), 8099.Google Scholar
Herbane, B. (2013). Exploring crisis management in UK small- and medium-sized enterprises. Journal of Contingencies & Crisis Management, 21(2), 8295.10.1111/1468-5973.12006CrossRefGoogle Scholar
Hopper, N. (2017). United’s no good, very bad day – And what it means for all of us. Time, 189(15), 22.Google Scholar
Hossain, T., & List, J. A. (2012). The behavioralist visits the factory: Increasing productivity using simple framing manipulations. Management Science, 58(12), 21512167.10.1287/mnsc.1120.1544CrossRefGoogle Scholar
Hunter, M. L., Van Wassenhove, L. N., & Besiou, M. (2016). The new rules for crisis management. MIT Sloan Management Review, 57(4), 7178.Google Scholar
Jaques, T. (2007). Issue management and crisis management: An integrated, non-linear, relational construct. Public Relations Review, 33(2), 147157.10.1016/j.pubrev.2007.02.001CrossRefGoogle Scholar
Jaques, T. (2010). Reshaping crisis management. The challenge for organizational design. Organization Development Journal, 28(1), 917.Google Scholar
Kahneman, D. (2003). Maps of bounded rationality: Psychology for behavioral economics. American Economic Review, 93, 14491475.10.1257/000282803322655392CrossRefGoogle Scholar
Kahneman, D. (2011). Thinking fast and slow. New York: Farrar, Straus and Giroux.Google Scholar
Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47(2), 263–292. https://doi.org/10.2307/1914185
Kantur, D., & Iseri-Say, A. (2012). Organizational resilience: A conceptual, integrative framework. Journal of Management & Organization, 18, 762–773. https://doi.org/10.5172/jmo.2012.18.6.762
Klophaus, R., & Pölt, S. (2006). Airline overbooking with dynamic spoilage costs. Journal of Revenue and Pricing Management, 6(1), 9–18.
Lerbinger, O. (1997). The crisis manager: Facing risk and responsibility. Mahwah, NJ: Lawrence Erlbaum Associates.
Madrian, B. C., & Shea, D. F. (2001). The power of suggestion: Inertia in 401(k) participation and savings behavior. Quarterly Journal of Economics, 116(4), 1149–1187.
Maiorescu, R. D. (2016). Crisis management at General Motors and Toyota: An analysis of gender-specific communication and media coverage. Public Relations Review, 42(4), 556–563. https://doi.org/10.1016/j.pubrev.2016.03.011
McDonald, I. (2008). Behavioral economics. The Australian Economic Review, 41(2), 222–228.
Mitroff, I. (2005). Why some companies emerge stronger and better from a crisis: 7 essential lessons for surviving a disaster. New York: AMACOM.
Morakabati, Y., Page, S. J., & Fletcher, J. (2017). Emergency management and tourism stakeholder responses to crises: A global survey. Journal of Travel Research, 56(3), 299–316. https://doi.org/10.1177/0047287516641516
Mourali, M., Böckenholt, U., & Laroche, M. (2007). Compromise and attraction effects under prevention and promotion motivations. Journal of Consumer Research, 34(2), 234–247. https://doi.org/10.1086/519151
Mukherjee, K. (2011). Thinking styles and risky decision-making: Further exploration of the affect-probability weighting link. Journal of Behavioral Decision Making, 24(5), 443–455. https://doi.org/10.1002/bdm.700
Paraschiv, C., & Chenavaz, R. (2011). Sellers’ and buyers’ reference point dynamics in the housing market. Housing Studies, 26(3), 329–352.
Paraskevas, A. (2006). Crisis management or crisis response system? A complexity science approach to organizational crises. Management Decision, 44, 892–907. https://doi.org/10.1108/00251740610680587
Parnell, J. A., & Dent, E. L. (2009). Philosophy, ethics and capitalism: An interview with BB&T CEO John Allison. Academy of Management Learning & Education, 8, 587–596.
Parnell, J. A., Köseoglu, M. A., & Spillan, J. E. (2010). Crisis readiness in Turkey and the United States. Journal of Contingencies & Crisis Management, 18(2), 108–116. https://doi.org/10.1111/j.1468-5973.2010.00603.x
Pearson, C., & Clair, J. (1998). Reframing crisis management. Academy of Management Review, 23(1), 59–76. https://doi.org/10.5465/amr.1998.192960
Pearson, C., & Mitroff, I. (1993). From crisis prone to crisis prepared: A framework for crisis management. Academy of Management Executive, 7(1), 48–59.
Peters, E., & Stark, J. W. C. (2005). Behavioral economics: The art and science of making choices. New Orleans Health/Pension Spring Meeting, 31(2), 125.
Pheng, L., Ho, D., & Ann, Y. (1999). Crisis management: A survey of property development firms. Property Management, 17(3), 31–51.
Piotrowski, C. (2006). Hurricane Katrina and organization development: Implications of chaos theory. Organization Development Journal, 24(3), 10–19.
Piotrowski, C., Watt, J. D., & Armstrong, T. (2010). The interdisciplinary nature of the field of crisis management: A call for research collaboration. Organization Development Journal, 28(3), 85–91.
Pixley, J. (2010). The use of risk in understanding financial decisions and institutional uncertainty. Journal of Socio-Economics, 39(2), 209–222.
Racherla, P., & Clark, H. (2009). A framework for knowledge-based crisis management in the hospitality and tourism industry. Cornell Hospitality Quarterly, 50, 561–577.
Ritov, I., & Baron, J. (1990). Reluctance to vaccinate: Omission bias and ambiguity. Journal of Behavioral Decision Making, 3, 263–277.
Robert, B., & Lajtha, C. (2002). A new approach to crisis management. Journal of Contingencies and Crisis Management, 10(4), 181–191.
Rothstein, M. (1985). OR and the airline overbooking problem. Operations Research, 33(2), 237–248.
Roux-Dufort, C. (2007). Is crisis management (only) a management of exceptions? Journal of Contingencies and Crisis Management, 15(2), 105–114. https://doi.org/10.1111/j.1468-5973.2007.00507.x
Sawalha, I. H. S., Jraisat, L. F., & Al-Quduh, K. A. M. (2013). Crisis and disaster management in Jordanian hotels: Practices and cultural considerations. Disaster Prevention and Management, 22(3), 210–228.
Schilirò, D. (2012). Bounded rationality and perfect rationality: Psychology into economics. Theoretical & Practical Research in Economic Fields, 3(2), 101–111.
Schwartz, B. (2004). The paradox of choice: How the culture of abundance robs us of satisfaction. New York, NY: Harper Perennial.
Shefrin, H. (2013). When CFOs think fast…They may get the company into a lot of trouble; here’s where they go astray—and how to avoid it. Wall Street Journal, p. R5.
Shefrin, H. (2015). The behavioral paradigm shift. Revista de Administração de Empresas, 55(1), 95–98.
Siddappa, S., Rosenberger, J., & Chen, V. (2007). Optimizing airline overbooking using a hybrid gradient approach and statistical modeling. Journal of Revenue and Pricing Management, 7(2), 207–218.
Simon, H. A. (1985). Human nature in politics: The dialogue of psychology and political science. The American Political Science Review, 79(2), 293–304.
Simon, H. A. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134. https://doi.org/10.1287/orsc.2.1.125
Smith, D., & Elliott, D. (2007). Exploring the barriers to learning from crisis: Organizational learning and crisis. Management Learning, 38(5), 519–538.
Somers, S. (2009). Measuring resilience potential: An adaptive strategy for organizational crisis planning. Journal of Contingencies & Crisis Management, 17(1), 12–23. https://doi.org/10.1111/j.1468-5973.2009.00558.x
Stanovich, K. E., & West, R. F. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23, 645–665.
Sunstein, C. R. (2013). The Storrs Lectures: Behavioral economics and paternalism. Yale Law Journal, 122(7), 1826–1899.
van Witteloostuijn, A. (1998). Bridging behavioral and economic theories of decline: Organizational inertia, strategic competition and chronic failure. Management Science, 44(4), 501–521.
Weick, K. (1993). The collapse of sensemaking in organizations: The Mann Gulch Disaster. Administrative Science Quarterly, 38, 628–652.
Weinstein, N. D. (1980). Unrealistic optimism about future life events. Journal of Personality and Social Psychology, 39, 806–820. https://doi.org/10.1037/0022-3514.39.5.806
Whelan, R., & McHugh, L. A. (2009). Temporal discounting of hypothetical monetary rewards by adolescents, adults, and older adults. Psychological Record, 59(2), 247–258.
Table 1 Implications for management

Figure 1 The effect of the probability weighting function, optimism bias, and overreliance on outliers on crisis decision-making

Figure 2 Bounded rationality and decision options