
Establishing a Foundation for Performance Measurement for Local Public Health Preparedness

Published online by Cambridge University Press:  06 May 2021

Jeff Schlegelmilch*
Affiliation:
The National Center for Disaster Preparedness at Columbia University’s Earth Institute, New York, USA
Mitch Stripling
Affiliation:
Planned Parenthood Federation of America, New York, USA
Thomas Chandler
Affiliation:
The National Center for Disaster Preparedness at Columbia University’s Earth Institute, New York, USA
Sabine Marx
Affiliation:
The National Center for Disaster Preparedness at Columbia University’s Earth Institute, New York, USA
Paul Bonwoo Gu
Affiliation:
The National Center for Disaster Preparedness at Columbia University’s Earth Institute, New York, USA
*
Corresponding author: Jeff Schlegelmilch, Email: js4645@columbia.edu.

Abstract

The development of performance measures is not a new concept in the disaster preparedness space. For over a decade, goals have been developed and tied to federal preparedness grant programs. However, these measures have been heavily criticized for their inability to truly measure preparedness. There is also growing frustration at the local level that these performance measures do not account for local readiness priorities or the outcome-driven value of emergency response activities. To define an appropriate theoretical framework for the development of performance measures, a review of the literature on existing planning and preparedness frameworks was conducted, combined with an iterative feedback process with a local health agency. This paper presents the elements of that literature review that were most directly integrated, along with the conceptual framework that was used as a starting point for future iterations of a comprehensive performance measure development project.

Type
Concepts in Disaster Medicine
Copyright
© Society for Disaster Medicine and Public Health, Inc. 2021

Background

The development of performance measures is not a new concept in the disaster preparedness space. For over a decade, goals have been developed and tied to federal preparedness grant programs administered by the U.S. Department of Homeland Security (DHS) and the Federal Emergency Management Agency (FEMA), as well as by the U.S. Department of Health and Human Services (DHHS). Reference Bascetta1-3 Specific to health and medical preparedness, these measures are administered through the Centers for Disease Control and Prevention (CDC) and the Assistant Secretary for Preparedness and Response (ASPR), which manage the Public Health Emergency Preparedness Cooperative Agreements (PHEP) and the Hospital Preparedness Program (HPP), respectively. 4,5 However, these measures have been heavily criticized for the past decade due to their inability to truly measure preparedness. 6-Reference Nelson, Lurie and Wasserman15 There is also growing frustration at the local level that these performance measures do not account for local readiness priorities or the outcome-driven value of emergency response activities.

Developing performance measures that are simultaneously nationally and locally relevant for a topic as complex as public health preparedness is incredibly challenging. However, while progress can be slow, the goal of incorporating performance measures into organizational preparedness and response can add enormous benefit to local and national preparedness efforts. In recognition of the importance of this, the New York City Department of Health and Mental Hygiene (NYC DOHMH) contracted with the National Center for Disaster Preparedness (NCDP) at Columbia University’s Earth Institute in 2016 to develop a process and a set of measures that would meet the local preparedness and response requirements of a large city agency. The first step was to establish a theoretical framework. To achieve this, a thorough grounding in the health and medical preparedness capability and performance measure landscape was needed. At the center of this process was a project team that consisted of representatives of the NYC DOHMH, the Bureau of Agency Preparedness and Response, and the Bureau of Healthcare Systems Readiness.

In order to define an appropriate theoretical framework for the development of performance measures, a review of the literature on existing planning and preparedness frameworks was conducted. This review incorporated publications on planning frameworks from multiple disciplines that would lead to high-quality performance measures, rather than focusing exclusively on programs taken on by federal agencies. It also included an examination of non-disaster fields to gain insights that could benefit the development of performance measures. Particular attention was also given to existing planning frameworks and actual response evaluations compiled by NYC DOHMH over a decade of incident command activations.

The findings from the literature review were presented to the project team at NYC DOHMH for further discussion of the value of the various approaches, and for selection for integration into the performance measurement process. These were selected based on their grounding in evidence and applicability to the challenges of performance measurement in an environment with a high degree of uncertainty and limited operational data, due to the relatively rare and non-standard environment that disasters present. Therefore, to accommodate the focus and parameters of this article submission, this paper presents only the sources from the literature review that were most directly integrated into the initial framework for this project, described in the final section of this paper.

This review and framework provide value to researchers and practitioners by showing both the potential of well-designed performance measures to practically measure the value of preparedness/response and the key role of uncertainty in the design of such measures. Subsequent papers from this effort will elaborate on the steps that followed this project to build out and validate this framework, and how it was ultimately integrated into the NYC DOHMH’s Strategic Preparedness and Response Total Alignment (SPARTA) framework.

Grounding the Approach

Current Health and Medical Disaster Planning and Performance Guidance

The first National Health Security Strategy (NHSS) was published in 2009 to provide a coordination plan for all stakeholders, ranging from the federal government to community-based organizations, to minimize the health impacts of disasters. 16 In 2012, the Strategy was strengthened with more specific activities. However, its scope was largely at the federal level and did not fully consider the roles of non-federal collaborators. 17 The Strategy lacked specific, high-quality performance measures, which further hindered progress evaluations in key areas. In 2015, the National Health Security Strategy and Implementation Plan for 2015–2018 was released. Starting in 2016, the strategy proposed to incorporate new qualitative and quantitative data from a number of available sources, including the National Health Security Preparedness Index (NHSPI), among others. Nonetheless, the plan acknowledges that performance measurement will remain a challenge, as the NHSPI is developmental and needs further augmentation and refinement. 18

The National Health Security Preparedness Index (NHSPI) was launched in December 2013 through the partnership of the Association of State and Territorial Health Officials (ASTHO), the Centers for Disease Control and Prevention (CDC), and the Office of Public Health Preparedness and Response (OPHPR) to address the needs of health security preparedness by providing an annual measure at the national and state level. Reference Uzun Jacobson, Inglesby and Khan19 In both the 2013 and 2014 Indices, the NHSPI applied 197 individual measures to provide a detailed picture of the health security capabilities of each state and of the nation as a whole. Despite the large number of current measures and ongoing improvement, the NHSPI has been criticized for the very limited number of its measures that are research-tested and validated, as well as for limitations in the validity of self-reported data. Reference Mays, Childress, Hoover and Bollinger20 The NHSPI has also been criticized for having an incomplete set of variables, particularly related to environmental health indicators. Reference Mays, Childress, Hoover and Bollinger20 It is worth noting, however, that the NHSPI is the first major public health preparedness index developed for the United States, and it continues to evolve along with the evidence base and as more understanding of public health disaster dynamics is developed. From the project team’s perspective, the NHSPI has had limited value in setting objectives or showcasing the value of local preparedness work to date.

In an effort to facilitate strategic planning of state and local health departments, the Centers for Disease Control and Prevention (CDC) created 15 Public Health Emergency Preparedness (PHEP) Capabilities to serve as the national standard. 21 Despite being 1 of the largest funding sources for state and local public health preparedness, the PHEP Capabilities have struggled to be simultaneously locally relevant and to provide a cohesive picture of national public health preparedness, and a strong evidence base for their validity as a measure of readiness is lacking. Reference Nelson, Lurie, Wasserman and Zakowski22,Reference Savoia, Lin, Bernard, Klein, James and Guicciardi23 While some studies analyzing PHEP date back more than a decade, these challenges and shortcomings were still largely relevant at the time of the review, with little contemporary evidence suggesting a marked shift in utility and validity.

For healthcare readiness, under the Office of the Assistant Secretary for Preparedness and Response (ASPR), the Hospital Preparedness Program (HPP) Capabilities aim to improve preparedness and resilience by providing support to state and local agencies in identifying gaps in preparedness measures, determining specific priorities, and developing plans for strengthening their specific healthcare capabilities. Each of the 8 HPP Capabilities is aligned with the 15 PHEP Capabilities, 5 but is specific to the elements of healthcare preparedness under the grant program.

Other Disaster Planning Frameworks Utilized by NYC DOHMH

At the initiation of this project, there were several grounding frameworks already integrated into NYC DOHMH planning and preparedness activities. Therefore, it was important to use these existing frameworks as the starting context for the new approach.

Many of the foundational tenets of disaster preparedness planning are rooted in the research of Quarantelli, with a focus on the social and behavioral sciences. In particular, he developed 10 general principles of good disaster planning, followed by general principles of good disaster management. The overall theme of both is that planning and doctrine do not necessarily translate directly into good disaster management. He also emphasizes that the process of arriving at preparedness matters at least as much as the documentation that it produces and the doctrine that it follows. Reference Quarantelli24 Therefore, any preparedness activities should follow a robust, inclusive process focused directly on enhancing response activities. This is distinct from most federal guidance, which treats preparedness and response as separate, more loosely connected spheres.

Building on these themes, Keim outlines a process of planning referred to as O2C3. This acronym represents Operational-level planning, Objectives-based planning, Capability-based planning, Consensus-based planning, and Compliance with national preparedness doctrine. Reference Keim25 After developing this planning process, Keim elaborated on the planning methodology with the SOARS framework (SOARS is an acronym for Strategic Objectives, Operational Objectives, Activities, Responsible Parties, and Standard Operating Procedures). This framework translated the values of the O2C3 framework into a process and associated tools to define and develop preparedness activities, and to organize them in a manner that is conducive to broader data management through a nested system of objectives, activities and procedures (see example in Figure 1 below). Reference Keim26

Figure 1. Framework and illustrative example of SOARS framework from Keim (2013).
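As an illustration only, the nested structure that SOARS prescribes lends itself to a simple hierarchical data model. The following Python sketch is hypothetical (the class names, fields, and example values are ours, not drawn from Keim); it shows how Strategic Objectives, Operational Objectives, Activities, responsible parties, and SOP references might be nested to support the kind of broader data management described above:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    """A direct response action, with its responsible party and SOP."""
    name: str
    responsible_party: str
    sop_reference: str

@dataclass
class OperationalObjective:
    """A desired end-state, composed of its Activities."""
    name: str
    activities: List[Activity] = field(default_factory=list)

@dataclass
class StrategicObjective:
    """Top level of the nested hierarchy."""
    name: str
    operational_objectives: List[OperationalObjective] = field(default_factory=list)

    def all_activities(self) -> List[Activity]:
        """Flatten the nested Activities for reporting or data management."""
        return [a for oo in self.operational_objectives for a in oo.activities]

# Illustrative example (hypothetical content)
so = StrategicObjective(
    "Limit disease transmission",
    [OperationalObjective(
        "Establish points of dispensing",
        [Activity("Open POD sites", "Operations Section", "SOP-POD-01")])],
)
print(len(so.all_activities()))  # 1
```

The nesting mirrors the figure: each Activity can always be traced upward to its parent objectives, which is what makes behavior observable "in relation to their parent objectives" later in the measurement process.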

As a means to capture the open-ended and dynamic qualities of organizations, Dynes developed a typology which classifies organizations along 2 dimensions: tasks and structure (see Figure 5). Reference Dynes27 According to Dynes, tasks are characterized as either regular or non-regular, and structure is characterized as either old or new, resulting in 4 types of organized responses to disasters. Type I, or established organizations, rely on a previously established structure and carry out routine tasks during disasters. Type II, or expanding organizations, are also required to quickly respond to disasters, as they also carry out regular tasks. However, in doing so, they depend on new structural arrangements. Type III, or extending organizations, are usually not anticipated as being responder organizations. They are characterized by a pre-existing structure, but during disasters they perform non-regular tasks. Type IV organizations must adapt to both new structures and new tasks. These organizations will have the most difficulty in achieving success in their emergent roles, and it is essential to address their needs to the extent possible as a disaster unfolds.
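Because the typology is a 2 × 2 classification, it can be expressed as a simple lookup. The Python sketch below is purely illustrative (the function name and label strings are ours, not Dynes’):

```python
def dynes_type(tasks: str, structure: str) -> str:
    """Classify an organized disaster response per Dynes' 2x2 typology.

    tasks: 'regular' or 'non-regular'; structure: 'old' or 'new'.
    """
    types = {
        ("regular", "old"): "Type I (established)",
        ("regular", "new"): "Type II (expanding)",
        ("non-regular", "old"): "Type III (extending)",
        ("non-regular", "new"): "Type IV (emergent)",
    }
    return types[(tasks, structure)]

# e.g., a group with a pre-existing structure taking on unfamiliar tasks:
print(dynes_type("non-regular", "old"))  # Type III (extending)
```

The lookup makes explicit that Type IV organizations sit at the intersection of both sources of novelty, which is why the text identifies them as facing the greatest difficulty.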

Using these methodologies, it becomes possible to create a more data-driven approach: nesting and linking specific tasks creates enhanced opportunities to observe specific behaviors in relation to their parent objectives and capabilities across a broader disaster management perspective, while also acknowledging some degree of uncertainty and the contributions of extending and emergent organizations.

Frameworks on Managing Uncertainty

It is important to recognize that these tasks/strategies (even if structured) must be adaptable to emerging or otherwise unanticipated conditions, an aspect that has been largely missed by the models developed within the traditional realm of disaster preparedness and response. A domain to consider for potential approaches to filling this gap is the business and management sector, which offers a wealth of literature on rapid adaptation in response to changing market conditions. A useful model is Emergent Strategy. This approach grew out of the recognition that strategic planning is highly vulnerable to changes in the business operating environment, a challenge very similar to those in the public agency domain, even in the absence of disasters (e.g., changes in elected leadership, budget priorities, etc.). Emergent Strategy recognizes that strategic plans are developed with primary intentions based on the context in which they were conceived. However, factors change that impact the decision-making landscape and alter the actual realized strategy. Therefore, strategies must be adaptable to these emerging conditions as they are presented. Approaching strategic planning through the lens of Emergent Strategy uses an overlapping approach of ‘Defining the Game,’ ‘Defining the Fitness Criteria,’ and ‘Stimulating Action.’ Reference Carpenter, Bauer and Erdogan28 ‘Defining the Game’ refers to the scope of the industry in which an organization competes and the parameters for success. ‘Defining the Fitness Criteria’ represents the kinds of capacities and capabilities an organization must have to be successful in its defined industry (or within the defined game). Finally, ‘Stimulating Action’ seeks to take steps to improve fitness and competitiveness within the defined industry. This last concept of fitness begins to create a pathway for linking uncertainty with performance measures.

Another aspect of disaster preparedness and response that affects performance in significant ways is that many commitments need to be made without sufficient data and without full knowledge of all possible future impacts. Raynor addresses this need for organizations to make commitments with downstream consequences long before sufficient data are available to actually make informed decisions. This conundrum is also very similar to those experienced in the public sector and in disaster management. He has developed methods to confront what he calls the ‘Strategy Paradox.’ In order to escape this paradox, he asserts that the role of the strategic planner is to create and preserve options available to the organization. This requires the integration of defining the uncertainty dimensions that are relevant to an organization and understanding likely extremes of how these dimensions could play out. He also considers organizational structures, defining the highest levels of an organization as being responsible for managing uncertainty with operational levels focused on making and/or fulfilling commitments. Reference Raynor29 This supposes that different levels of uncertainty might need to be assessed to design performance measures at different levels of the SOARS framework. For example, ‘Strategic Objectives’ would be creating options, thus handling more uncertainty, and ‘Activities’ would focus on fulfilling commitments, thus limiting uncertainty (see Figure 2).

Figure 2. Organizational levels and relationship to managing uncertainty adapted from Raynor. Reference Raynor29

Other Performance Measure Development Approaches

In order to fully connect these theoretical frameworks for planning and managing uncertainty to actual performance, a more robust understanding of performance measure development must be attained. Scholars have attempted various approaches. For instance, Adair, et al. Reference Adair, Simpson, Casebeer, Birdsell, Hayden and Lewis30 analyzed 664 health and business publications that discussed organizational performance measurement. In short, the authors suggested a need for a comprehensive approach in addressing Performance Measurement, further refining the measures and the system while ensuring that Performance Measures would ultimately improve the entirety of the healthcare system. Additionally, Asch, et al. Reference Asch, Stoto and Mendes12 analyzed 27 existing instruments for planning, assessing, or evaluating the preparedness of public health agencies and evaluated each instrument using 4 criteria, based on the Essential Public Health Services framework: (1) clarity of measurement parameters and normative standards, (2) balance between structural and process measures, (3) evidence of effectiveness, and (4) specification of an accountable entity. They found that there is a lack of consensus among the instruments on what constitutes preparedness and how it should be measured. The authors also asserted that the disjunction between evidence and preparedness guidelines creates difficulty in conducting effective studies in public health practice. Given such gaps in currently available instruments, the authors provide 3 recommendations: (1) improved communication across federal-state-local agencies, (2) improved delineation of accountability for specific preparedness functions in measurement instruments, and (3) more explicit approaches that define the underlying evidence behind measures.

The Health Resources and Services Administration (HRSA) provides a framework based on a summary of best practices in the health and medical sector. 31 This includes guidance from the Institute of Medicine, which indicated that healthcare should be Safe, Timely, Effective, Efficient, Equitable, and Patient-centered (STEEP). 32 The HRSA model emphasizes that performance measures should align with organizational goals, demonstrating a relationship to positive outcomes, while also being under the control of the organization developing them in ways that are reliable, valid, and standardized. Reference Point33

Some additional perspectives on performance measure development include those of the National State Auditors Association (NSAA), which suggests 6 steps in the best practices of developing performance measures in government: Define Desired Measures, Assess Measures, Select Key Indicators, Identify Limitations, Simplify Measures, and Establish Targets. 34 The Performance-Based Management Special Interest Group (PBM SIG), funded by the US Department of Energy, defines a number of characteristics of good performance measures, including ensuring that measures are results-based, practical and easy to understand, measurable, normalized for benchmarking, and regularly assessed. This approach also highlighted that the value of the measures should exceed the cost of measurement. Reference Artley35 Finally, Wolk, et al. prepared a Root Cause How-to Guide, which provides a Performance Measurement Cycle system that allows each activity and measurement to be evaluated and refined after each cycle. Reference Wolk, Dholakia and Kreitz36

All of these approaches demonstrate the importance of defining performance measures, setting targets, and assessing performance, as well as measuring validity through an iterative cycle of performance management. However, none are well-suited to the episodic performance environment and the improvisation involved within a public health emergency preparedness and response framework. This is because they are based in relatively routine, high-frequency operations that can be evaluated at high volumes on a regular basis. However, by applying these principles of defining measures and establishing measurable targets, an adaptive approach begins to emerge.

An Integrated Approach to Performance Measure Development at the NYC DOHMH

As the review of available literature came to a conclusion, the project team determined that the baseline framework to be used by the NYC DOHMH should follow the SOARS process Reference Keim26 as the anchor for developing and articulating Performance Measures, since it allowed preparedness work to focus squarely on response activities. This framework also integrated the Strategy Paradox, Reference Raynor29 and principles of Emergent Strategy, Reference Carpenter, Bauer and Erdogan28 to ensure that the management of uncertainty could be integrated into the development of Strategic Objectives in a manner that was reliable, valid, and standardized. It was the project team’s aim to ensure that performance measures would be developed and/or integrated into the framework to cover the full range of strategic possibilities, but with sufficient detail to implement tangible actions in disaster response and recovery. The process of developing performance measures was also influenced by numerous examples of best practices, 34-Reference Lichiello and Turnock37 and was designed to ensure maximum applicability and reliability of the measures from the perspective of the stakeholders involved.

In developing performance measures, Strategic Objectives were first developed to frame the areas of uncertainty that the NYC DOHMH would be assumed to operate within. This process recognized that the more strategic the definition, the greater the need to manage uncertainty by developing and maintaining response options. As more specific Operational Objectives and Activities were developed, the focus shifted from managing uncertainty to implementing response decisions, or commitments. By appropriately framing the boundaries of uncertainty through the development of Strategic Objectives, the project team sought to ensure that the subsequent Activities and Performance Measures resulted in a comprehensive toolkit of response/recovery activities that would then be tested further by the agency. Figure 3 below depicts the relationship between uncertainty and the SOARS framework, where uncertainty dimensions are the categories of information unavailable for decision-making at the time. These may be specifics about the threat (e.g., transmission rate of a virus, level of contamination in the air), a range of political actions (e.g., will elected officials be heavily involved or minimally involved), or any other factor that may influence the response and recovery that is unknown, but highly impactful.

Figure 3. SOARS framework with integration of uncertainty.

Once the Strategic Objectives were aligned, the Operational Objectives were then developed to represent the desired end-states and their composite activities. These activities were the direct actions that generated impact for the residents of New York City in the response and recovery phases of emergency management, and were thus the most critical measures to define. Figure 4 below depicts where performance measures were situated within the flow of the SOARS framework.

Figure 4. SOARS framework performance measures integrated.

For each Activity, Performance Standards were developed to define the optimal level of performance. Likewise, the measure provided the method for actually measuring progress towards the achievement of the Performance Standard. Reference Point33 Given the uncertainty of the measures, ratios that could be calculated in situ were generally used instead of discrete numerical targets (e.g., x percent of new sites inspected within 1 operational period).
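For illustration, a ratio of this kind can be computed in situ from running counts, without needing to know the eventual denominator before the incident. The function and values below are hypothetical examples of the approach, not actual NYC DOHMH measures:

```python
def inspection_ratio(inspected_in_period: int, new_sites_identified: int) -> float:
    """Percent of new sites inspected within 1 operational period.

    A ratio measure stays meaningful even when the absolute number of
    sites (the denominator) cannot be known before the disaster occurs.
    """
    if new_sites_identified == 0:
        return 100.0  # no new sites identified: the standard is trivially met
    return 100.0 * inspected_in_period / new_sites_identified

# e.g., 18 of 24 newly identified sites inspected this operational period
print(inspection_ratio(18, 24))  # 75.0
```

The same running counts can then be compared against the Performance Standard (say, a target percentage) at the end of each operational period, rather than against a fixed count set in advance.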

For each Activity, a Performance Standard was developed to represent the optimal performance based on the mission, objectives, and goals of the NYC DOHMH. When developing Performance Measures, the perspectives of the Activity owner(s) as well as internal and external stakeholders responsible for the outcomes were considered as they were the people most interested in the result. Reference Lichiello and Turnock37 Each measure also required a clear definition of its meaning, frequency, relevance, precision, compatibility, cost effectiveness, simplicity and validity in order to effectively monitor and evaluate the progress towards the achievement of the Performance Standard. 34

The process for developing Performance Measures was designed to utilize a stakeholder engagement process that included Performance Measure development best practices, and the following steps:

  1. Set objectives for performance

  2. Establish the measurement frame

  3. Gather Subject Matter Experts (SMEs) according to the measurement frame

  4. Identify Critical Tasks/Activities

  5. Draft Performance Measures

  6. Ensure validity of Performance Measures with SMEs

Critical factors in determining the likelihood of success of the Performance Measures were the individual and organizational readiness to conduct the defined activities. This was the ‘fitness criteria’ derived from Emergent Strategy, Reference Carpenter, Bauer and Erdogan28 which points to the kinds of actions that individuals and organizations can take to prepare for their operational roles. Understanding the level of inherent fitness helped to guide preparedness investments towards meeting Performance Standards, which, in aggregate, were designed to achieve Operational, and Strategic Objectives. This ‘inherent fitness’ speaks to the similarity of emergency functions to day-to-day tasks within both the individual and organizational contexts, and the readiness to transition to those emergency functions rapidly. Tasks that are similar likely need less preparedness work to achieve fitness, and vice versa, allowing more nuanced measurement/design of preparedness activities.

Depending on the response function, individuals must be prepared to manage uncertainty (create and preserve response options) and/or implement commitments. Reference Raynor29 Strategic personnel will require the tools and skillsets for managing uncertainty, whereas operational personnel will be more focused on tactical knowledge and skills. Alignment of emergency roles to day-to-day functions increases inherent fitness and decreases (though would never eliminate) needed preparedness (see Figure 5).

An organization’s ability to successfully implement its tasks will depend on how closely aligned it is to its core operations. Reference Dynes27 The more emergent the task, the more challenging it will be for the organization and thus the more pro-active the preparedness investments will need to be (see Figure 5).

Figure 5. Individual and organizational roles.

Conclusion

The performance measure development project team involved in this process reviewed current health and medical disaster planning and performance guidance, predominantly from U.S. federal agencies, as well as planning frameworks rooted in the private sector. As noted throughout this review, many of the federal efforts to develop and evaluate performance measures have been unsuccessful or inconclusive, primarily due to their lack of relevance to local response measurement procedures and concerns. NYC DOHMH and NCDP developed a new approach anchored in the SOARS model. This new framework strived to bring together the most relevant portions of the disaster literature with other frameworks to create a practical approach to the otherwise impractical events of disaster response. This approach combines tactical analysis of real-world response actions, their similarity (or not) to daily work and an in-depth assessment of the uncertainty which can never be separated from performance measurement in an emergency context.

This framework provided depth and focus to the Strategic Preparedness and Response Total Alignment (SPARTA) planning framework developed by DOHMH and grounded preparedness efforts within that framework. In fact, beginning with a conceptual understanding of performance measures within each Strategic or Operational Objective was found to make preparedness efforts more specific, and thus more successful. Still, full engagement of stakeholders, analysis of uncertainty across the range of potential Strategic Objectives, and the craftsmanship/validation of the actual Performance Measures are critical to ensure improved outcomes in a disaster response, and should be prioritized in future efforts.

Researchers should also consider this work in light of new insights and publications since this project was conducted. Additionally, while this project was developed within a United States public health preparedness framework, there may be elements worthy of additional consideration overseas. In particular, the integration of uncertainty dimensions into performance measurement is likely to be of value to public health agencies in the U.S. and internationally.

Funding statement

Original funding was provided under a grant to the New York City Department of Health and Mental Hygiene from the US Department of Health and Human Services Hospital Preparedness Program and Public Health Preparedness Program (Award U90TP00546).

References

Bascetta, CA. Public health and hospital emergency preparedness programs: evolution of performance measurement systems to measure progress. U.S. Government Accountability Office Website. 2007. https://www.gao.gov/products/GAO-07-485R. Accessed December 15, 2019.
Department of Homeland Security (DHS). National Preparedness Goal. Washington, DC: US Department of Homeland Security; 2011. https://www.dhs.gov/national-preparedness-goal.
Department of Homeland Security (DHS). Target Capabilities List: A Companion to the National Preparedness Guidelines. Washington, DC: US Department of Homeland Security; 2007.
U.S. Centers for Disease Control and Prevention. Public Health Emergency Preparedness Cooperative Agreement: Budget Period 2 Performance Measure Specifications and Implementation Guidance; July 1, 2013 – June 30, 2014. Washington, DC: US Department of Health and Human Services; 2013.
U.S. Department of Health and Human Services (HHS)/Office of the Assistant Secretary for Preparedness and Response (ASPR). Healthcare Preparedness Capabilities: National Guidance for Healthcare System Preparedness. Washington, DC: US Department of Health and Human Services; 2012.
U.S. Government Accountability Office (GAO). National preparedness: improvements needed for measuring awardee performance in meeting medical and public health preparedness goals: A report to the Committee on Energy and Commerce, House of Representatives; 2013. https://www.gao.gov/products/GAO-13-278. Accessed December 15, 2019.
Maurer, DC, U.S. Government Accountability Office (GAO). National preparedness: FEMA has made progress, but additional steps are needed to improve grant management and assess capabilities: Testimony before the Subcommittee on Emergency Management, Intergovernmental Relations and the District of Columbia, Committee on Homeland Security and Government Affairs, U.S. Senate; 2013. https://www.gao.gov/assets/gao-13-637t.pdf. Accessed December 15, 2019.
Jenkins, WO. FEMA has made limited progress in efforts to develop and implement a system to assess national preparedness capabilities; 2010. US Government Accountability Office Website. https://www.gao.gov/products/GAO-11-51R. Accessed December 15, 2019.
Jackson, BA. The Problem of Measuring Emergency Preparedness: The Need for Assessing “Response Reliability” as Part of Homeland Security Planning. Santa Monica, CA: RAND Corporation; 2008. https://www.rand.org/pubs/occasional_papers/OP234.html. Accessed December 15, 2019.
Broughton, PN. Measuring Preparedness: Assessing the Impact of the Homeland Security Grant Program [dissertation]. Monterey, CA: Naval Postgraduate School; 2009.
Freking, K. US struggles to ensure funds aid fight against bioterrorism. Boston Globe. 2007. http://archive.boston.com/news/nation/washington/articles/2007/03/11/us_struggles_to_ensure_funds_aid_fight_against_bioterrorism/. Accessed December 15, 2019.
Asch, SM, Stoto, M, Mendes, M, et al. A review of instruments assessing public health preparedness. Public Health Rep. 2005;120(5):532-542.
Davis, MV, Bevc, CA, Schenck, AP. Declining trends in local health department preparedness capacities. Am J Public Health. 2014;104(11):2233-2238.
Lurie, N, Wasserman, J, Nelson, CD. Public health preparedness: Evolution or revolution? Health Aff (Millwood). 2006;25(4):935-945.
Nelson, C, Lurie, N, Wasserman, J. Assessing public health emergency preparedness: Concepts, tools, and challenges. Annu Rev Public Health. 2007;28:1-18.
U.S. Department of Health and Human Services (HHS). National Health Security Strategy of the United States of America. Washington, DC: US Department of Health and Human Services; 2009.
U.S. Department of Health and Human Services (HHS). National Health Security Review of the United States of America. Washington, DC: US Department of Health and Human Services; 2014.
U.S. Department of Health and Human Services (HHS). National Health Security Strategy and Implementation Plan 2015–2018; 2015. US Department of Health and Human Services Website. https://www.phe.gov/Preparedness/planning/authority/nhss/Pages/default.aspx. Accessed December 15, 2019.
Uzun Jacobson, E, Inglesby, T, Khan, AS, et al. Design of the national health security preparedness index. Biosecur Bioterror. 2014;12(3):122-131.
Mays, GP, Childress, M, Hoover, AG, Bollinger, C. National Health Security Preparedness Index: Recommendations from the National Advisory Committee; 2015. https://uknowledge.uky.edu/cgi/viewcontent.cgi?article=1082&context=hsm_present. Accessed December 15, 2019.
U.S. Centers for Disease Control and Prevention. Funding and Guidance for State and Local Health Departments. Washington, DC: US Department of Health and Human Services; 2015.
Nelson, C, Lurie, N, Wasserman, J, Zakowski, S. Conceptualizing and defining public health emergency preparedness. Am J Public Health. 2007;97 Suppl 1(Suppl 1):S9-S11.
Savoia, E, Lin, L, Bernard, D, Klein, N, James, LP, Guicciardi, S. Public health system research in public health emergency preparedness in the United States (2009-2015): actionable knowledge base. Am J Public Health. 2017;107(S2):e1-e6.
Quarantelli, EL. Research Based Criteria for Evaluating Disaster Planning and Managing. Newark, DE: Disaster Research Center, University of Delaware; 1997. http://udspace.udel.edu/handle/19716/136. Accessed December 15, 2019.
Keim, ME. O2C3: A unified model for emergency operations planning. Am J Disaster Med. 2010;5(3):169-179.
Keim, ME. An innovative approach to capability-based emergency operations planning. Disaster Health. 2013;1(1):54-62.
Dynes, RR. Organized Behavior in Disaster. Lexington, MA: Heath Lexington Books; 1970.
Carpenter, MA, Bauer, T, Erdogan, B. Principles of Management 1.1. Irvington, NY: Flat World Knowledge; 2010.
Raynor, ME. The Strategy Paradox: Why Committing to Success Leads to Failure (And What to Do About It). New York, NY: Crown Business; 2007.
Adair, CE, Simpson, E, Casebeer, AL, Birdsell, JM, Hayden, KA, Lewis, S. Performance measurement in healthcare: part II, state of the science findings by stage of the performance measurement process. Healthc Policy. 2006;2(1):56-78.
HRSA. Performance Management and Measurement. Washington, DC: US Department of Health and Human Services; 2011. http://www.hrsa.gov/quality/toolbox/508pdfs/performancemanagementandmeasurement.pdf. Accessed December 15, 2019.
Institute of Medicine (US) Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.
Turning Point. From Silos to Systems: Using Performance Management to Improve the Public’s Health. Washington, DC: Public Health Foundation; 2004.
National State Auditors Association (NSAA). Best Practices in Performance Measurement Part 1: Developing Performance Measures. Lexington, KY: National State Auditors Association Best Practices Document; 2014.
Artley, W. The Performance-Based Management Handbook. Oak Ridge, TN: Performance-Based Management Special Interest Group; 2001.
Wolk, A, Dholakia, A, Kreitz, K. Building a Performance Measurement System: Using Data to Accelerate Social Impact. Cambridge, MA: Root Cause; 2009.
Lichiello, P, Turnock, BJ. Guidebook for Performance Measurement. Washington, DC: University of Washington Turning Point National Program; 1999.
Figure 1. Framework and illustrative example of SOARS framework from Keim (2013).

Figure 2. Organizational levels and relationship to managing uncertainty adapted from Raynor.

Figure 3. SOARS framework with integration of uncertainty.

Figure 4. SOARS framework performance measures integrated.

Figure 5. Individual and organizational roles.