
Are We Ready for Mass Fatality Incidents? Preparedness of the US Mass Fatality Infrastructure

Published online by Cambridge University Press:  28 December 2015

Jacqueline A. Merrill
Affiliation:
Department of Nursing in Biomedical Informatics, Columbia University Medical Center, New York, New York
Mark Orr
Affiliation:
Social and Decision Analytics Laboratory, Virginia Polytechnic Institute and State University-National Capital Region, Arlington, Virginia
Daniel Y. Chen
Affiliation:
Social and Decision Analytics Laboratory, Virginia Polytechnic Institute and State University-National Capital Region, Arlington, Virginia
Qi Zhi
Affiliation:
Phillip R. Lee Institute for Health Policy Studies, School of Medicine, University of California, San Francisco
Robyn R. Gershon*
Affiliation:
Phillip R. Lee Institute for Health Policy Studies and Department of Epidemiology and Biostatistics, School of Medicine, University of California, San Francisco.
*
Correspondence and reprint requests to Robyn R. Gershon, DrPH, MHS, Phillip R. Lee Institute for Health Policy Studies and Department of Epidemiology and Biostatistics, School of Medicine, University of California, San Francisco, 3333 California Street, Suite 280, San Francisco, CA 94118 (e-mail: robyn.gershon@ucsf.edu).

Abstract

Objective

To assess the preparedness of the US mass fatality infrastructure, we developed and tested metrics for 3 components of preparedness: organizational, operational, and resource sharing networks.

Methods

In 2014, data were collected from 5 response sectors: medical examiners and coroners, the death care industry, health departments, faith-based organizations, and offices of emergency management. Scores were calculated within and across sectors and a weighted score was developed for the infrastructure.

Results

A total of 879 respondents reported highly variable organizational capabilities: 15% had responded to a mass fatality incident (MFI); 42% reported staff trained for an MFI, but only 27% for an MFI involving hazardous contaminants. Respondents estimated that 75% of their staff would be willing and able to respond, but only 53% if contaminants were involved. Most perceived their organization as somewhat prepared, but 13% indicated “not at all.” Operational capability scores ranged from 33% (death care industry) to 77% (offices of emergency management). Network capability analysis found that only 42% of possible reciprocal relationships between resource-sharing partners were present. The cross-sector composite score was 51%; that is, half the key capabilities for preparedness were in place.

Conclusions

The sectors in the US mass fatality infrastructure report suboptimal capability to respond. National leadership is needed to ensure sector-specific and infrastructure-wide preparedness for a large-scale MFI. (Disaster Med Public Health Preparedness. 2016;10:87–97)

Type
Original Research
Copyright
Copyright © Society for Disaster Medicine and Public Health, Inc. 2015 

Two of the most enduring American values are respect for the deceased and compassion for the bereaved. Even in the aftermath of a mass fatality incident (MFI), we strive to uphold these values through rapid identification of decedents, the provision of spiritual and psychological support to families and responders, and timely release of remains for final interment. However, management of a large-scale or complex MFI is extremely challenging and requires the cooperation and collaboration of a large network of public and private agencies and organizations. In the United States, this is referred to as the mass fatality infrastructure. The sectors that constitute this infrastructure, as shown in Figure 1, include the medico-legal system of medical examiners and coroners (ME/C), the death care industry (DCI; funeral homes, cemeteries, crematories, and funeral industry suppliers), departments of health (DOHs), faith-based organizations (FBOs) and other voluntary organizations, offices of emergency management (OEMs), and the Disaster Mortuary Operational Response Teams (DMORTs).

Figure 1 US Mass Fatality Infrastructure. Abbreviations: DHHS, US Department of Health and Human Services; OSHA, Occupational Safety and Health Administration.

Cooperation between sectors is essential but difficult for a number of reasons: each sector has different but interdependent responsibilities and must be cognizant of the roles that other sectors play, each has varying levels of capabilities, and each operates at different levels. For example, the medico-legal system and the DOHs operate at the county, regional, or state level, and members of both the DCI and FBOs may be local (eg, local funeral homes, community faith-based organizations) or national (eg, nationwide funeral industry services). The DMORTs, which are organized by Federal Emergency Management Agency (FEMA) regions, respond under the command of the US Department of Health and Human Services and the control of the Assistant Secretary for Preparedness and Response. Although each sector plays an essential independent role, ultimately, the successful management of a large-scale or complex MFI depends on an effective and cooperative response from all infrastructure members.

An MFI is defined by convention as a situation in which the number of deaths exceeds the local jurisdiction’s response capabilities. A large-scale MFI may involve hundreds of fatalities. A complex MFI refers to contamination of the site or remains with chemical, biological, radiological, nuclear, or explosive (CBRNE) contaminants. During an MFI, it is first the local authorities, most often the ME/C, that are responsible for managing additional deaths above those expected during normal operations. When demand exceeds capacity, additional resources may be provided, typically by county and state OEMs and DOHs.1-3 Collaboration with nongovernmental partners such as funeral homes, churches, and voluntary organizations such as the American Red Cross also occurs.2,4 When local capabilities are exceeded, the plan in the United States is to call upon state and then federal resources, such as the DMORTs.5

Nearly a decade ago, amid growing concerns of deliberate attacks involving weapons of mass destruction and fears of a high-fatality pandemic event, a Working Group was convened by the US Northern Command in cooperation with the US Department of Health and Human Services to discuss the nation’s ability to effectively manage an MFI.6 The Working Group concluded that the capabilities of the US mass fatality management infrastructure were limited, especially with respect to large-scale catastrophic events, and it recommended that additional preparedness steps be taken immediately to improve readiness. However, improvements have lagged; as noted in the National Preparedness Report,7 states and territories consistently rank themselves poorly prepared in the area of fatality management, which is 1 of the 31 core capabilities defined in the National Preparedness Goals.5 Disaster preparedness experts both within and external to the infrastructure consistently voice concerns regarding large-scale MFI preparedness in the United States.

An important barrier to improvement has been the lack of consensus on the appropriate measures of preparedness for each of the key sectors in the MFI infrastructure. This study was designed to address this gap by (1) developing and testing new sector-specific metrics of preparedness and (2) using these metrics to assess preparedness for each sector and for the infrastructure as a whole.

Methods

Metric and Survey Development

With input from stakeholders in each response sector and experts in mass fatality management and disaster response, we first identified 3 key components or domains to “define” preparedness: (1) organizational capabilities, (2) operational capabilities, and (3) strength of reciprocal ties with resource-sharing partners.

To construct sector-specific items for each component, we followed a 4-part process: (1) review of federal MFI response documents and relevant analogous documents for each of the sectors; (2) an environmental scan to obtain existing materials for review, such as state annexes, mass fatality plans, or other documents; (3) drafting of survey items based on document review; and (4) iterative review by the research team and external experts. Individualized surveys were then constructed for each of the infrastructure sectors and formatted for anonymous online administration.8 The surveys underwent extensive validation testing before administration. Our expert panel reviewed draft surveys and assessed them for construct validity (including both face and content validity) as well as criterion validity. We also assessed convergent validity via pilot testing with representative members of each sector. The pilot testing also served to refine the internet-based survey format and to determine the average length of time for completion. Each survey contained about 50 items tailored to its sector and took about 20 minutes to complete. At the start of each survey, a brief introductory paragraph described the purpose of the study and the research team members. No incentives were used because all participants were professional representatives of their organizations. All procedures were approved by the University of California, San Francisco, and Columbia University Institutional Review Boards (CHR 12-09425 and IRB-AAAL0206). All study materials, including the surveys, may be obtained by contacting the corresponding author.

Data Collection

Data were collected over a 6-week period. Respondents were recruited through collaborating professional groups and associations: the Association of State and Territorial Health Officials; the National Association of County and City Health Officials; the National Association of Medical Examiners; the International Association of Coroners and Medical Examiners; the International Cemetery, Cremation and Funeral Association; the National Funeral Directors Association; the International Association of Emergency Managers; and the National Disaster Interfaiths Network. DMORTs were contacted directly. Because a great many voluntary organizations may respond to large-scale MFIs and it was beyond the scope of this study to include them all, we focused on FBOs, which have played an increasingly prominent role in MFI response. Other voluntary organizations, such as the American Red Cross, were, however, included on the surveys as possible response partners, along with local health organizations, first responder agencies, and others. Announcements regarding the study were made through the collaborating organizations, and a link to the survey was provided. Interested individuals were asked to forward the link to the person in their organization most knowledgeable about planning for MFIs. For that reason, and because most of these organizations have fluid memberships, an accurate denominator cannot be ascertained.

Measures

To assess organizational capacity to respond to MFIs, 5 survey questions called for yes or no answers: whether the respondent’s jurisdiction experienced an MFI in the past 5 years, whether their organization conducted or participated in jurisdiction-wide (city, county, state) drills or exercises, whether their organization had mutual aid agreements in place to share resources with jurisdictional or community partners, and whether their organization had provided training to its staff on (a) the mass fatality plan and (b) MFIs involving CBRNE. Additionally, respondents were asked to estimate the percentage of staff expected to report during an MFI on a scale from 0% to 100% in increments of 10%. The question was asked 4 ways: percentage willing to respond to MFI, percentage willing to respond to MFI involving CBRNE, percentage able to respond to MFI, and percentage able to respond to MFI involving CBRNE (this question was not part of the survey for the OEM sector because they are mandated to respond). Respondents were also asked to indicate which of 7 elements (eg, more training, more drills, more written plans) their organization needed to be better prepared. Perception of organizational preparedness was measured on a 5-point Likert scale by asking, “How would you rate the overall preparedness of your agency, office or organization?”

To assess operational capability to respond, each sector’s survey contained a list of key elements or capabilities tailored to the response role for that sector. Respondents were asked to indicate whether or not an item was included in their organization’s plan. The survey for the FBO sector did not measure this capacity, because responders from the National Disaster Interfaiths Network are not drawn from any single organization.

To assess network capability for collaborative resource sharing between MFI response partners, answers to 2 questions were combined: (1) From which agencies or organizations does your agency/office expect to obtain necessary resources to manage MFI? and (2) To which agencies or organizations does your agency/office expect to provide necessary resources to manage MFI? Answers were dichotomous (yes or no). The DOH and ME/C sectors were asked about relationships with 11 possible response partners (eg, local first response organizations, local health care organizations, state DOH). The OEM and DCI sectors were asked about 9 and 6 possible partners, respectively. The FBO sector is not expected to obtain resources from response partners and therefore was only asked about providing resources. Possible response partners at all levels (local, state, and federal) were included. We did not differentiate between “appropriate” and “possible” relationships (ie, state-level organizations would be expected to have relationships at the federal level), for 2 reasons: (1) to standardize surveys across all subgroups, we used similar lists of all possible response partners, and (2) to learn more about all possible partners—even unexpected ones—because information on this is lacking. To estimate overall MFI preparedness, we devised a composite score that combined the weighted averages of each sector’s organizational, operational, and network capabilities.

Statistical Analysis

Descriptive statistics were performed. In all cases, missing data and “don’t know” responses were dropped when calculating point estimates. The average percentage of “don’t know” and missing responses across questions in each of the surveys was about 7% for DCI, 9% for DOH, 16% for FBO, 12% for ME/C, and 10% for OEM. Data preparation and analysis were performed by using R,9 and network analysis was performed with ORA.10

To analyze items related to organizational and operational capability, we computed the percentage answering “yes” to each item. The proportions responding positively (ie, valid percents) were aggregated within sectors and then averaged across sectors. For “perception of workplace preparedness,” we computed the distribution of responses across the answer categories. For estimates of staff willing and able to respond, we produced 2 scores: (1) mean percentage estimated willing and able to respond in an MFI and (2) mean percentage estimated willing and able to respond in an MFI with CBRNE.
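As a minimal illustration of this valid-percent calculation (the authors' actual analysis was done in R; the response values below are hypothetical):

```python
import numpy as np

# Hypothetical item responses for one sector: 1 = yes, 0 = no,
# NaN = missing or "don't know" (dropped from the denominator).
responses = np.array([1.0, 0.0, 1.0, np.nan, 1.0, np.nan, 0.0])

valid = responses[~np.isnan(responses)]          # drop missing/"don't know"
valid_percent = 100 * valid.sum() / valid.size   # percent "yes" among valid answers
# valid_percent is 60.0 for this toy vector
```

Item-level valid percents computed this way would then be aggregated within each sector and averaged across sectors.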

To analyze network capability, a matrix was created for each of the 2 resource-sharing questions, with survey respondents on the vertical axis and possible response partners on the horizontal. In each cell, a value of 1 was assigned to a “yes” response and all other responses were assigned 0. For each sector, the 2 matrices were added and binarized such that a bi-directional tie (ie, expecting both to provide and to obtain resources) was valued as 1 and all others as 0. This network of reciprocal relationships for each sector, when multiplied by its transpose, produced a matrix in which the cells on the diagonal contained counts of reciprocal ties with each possible partner. These counts were divided by the valid number of respondents who answered the question to yield the proportion in each sector that reported reciprocal ties with a specific response partner. Note that we did not perform separate network analyses at the state and local levels because only the DOH and ME/C sectors have both local and state departments or offices.
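A sketch of this matrix procedure, using NumPy rather than the R/ORA tooling the authors report and a hypothetical toy sector of 4 respondents and 3 possible partners:

```python
import numpy as np

# Hypothetical answers: rows = respondents, columns = possible partners; 1 = "yes".
provide = np.array([[1, 0, 1],
                    [1, 1, 0],
                    [0, 1, 1],
                    [1, 0, 0]])
obtain = np.array([[1, 0, 0],
                   [1, 1, 0],
                   [0, 0, 1],
                   [0, 0, 0]])

# Add the two matrices and binarize: a reciprocal tie exists only where a
# respondent expects BOTH to provide and to obtain resources (sum == 2).
reciprocal = ((provide + obtain) == 2).astype(int)

# Multiplying by the transpose places the count of reciprocal ties with each
# partner on the diagonal.
counts = np.diag(reciprocal.T @ reciprocal)

# Divide by the number of valid respondents to get the proportion reporting
# a reciprocal tie with each partner.
proportions = counts / reciprocal.shape[0]
# counts -> [2, 1, 1]; proportions -> [0.5, 0.25, 0.25]
```

In the study, "valid respondents" would be those who answered the question, so the denominator may be smaller than the number of rows shown here.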

Network density was calculated for each sector on the “provide resources,” “obtain resources,” and reciprocal networks. This measure of network cohesion represents the existing ties between entities as a proportion of all ties that are possible.11 Density scores were normalized to fall between 0 and 1 to allow comparison between the sectors. Averages weighted by the valid number of respondents were calculated across sectors.
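For a respondent-by-partner tie matrix of the kind described above, this density measure reduces to the share of cells containing a tie; a minimal sketch with hypothetical data:

```python
import numpy as np

# Hypothetical respondent x partner tie matrix for one sector's
# "provide resources" network (1 = tie reported).
ties = np.array([[1, 0, 1],
                 [0, 0, 1],
                 [1, 1, 0]])

# Density: existing ties as a proportion of all possible ties;
# this is already normalized to the 0-1 range.
density = ties.sum() / ties.size
# density -> 5/9, about 0.56
```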

To estimate MFI preparedness for the overall response infrastructure, we aggregated the responses for each capability. For the item addressing “elements needed for better preparedness,” we reversed the calculation so that respondents indicating that their organization needed fewer of the 7 elements received a higher score (ie, “were better prepared”). We used density of the reciprocal ties network as an estimate of network capability.

Results

There were a total of 879 US responses across the 5 sectors, distributed across all FEMA regions. International responses (a few, mainly from Canada) were excluded. There were 294 responses from representatives of the DCI sector, 178 from DOHs, 124 from FBOs, 122 from ME/C, 161 from OEMs, and 5 from DMORTs. Because there were so few DMORTs in the sample (there are only 10 in the United States, although not all are currently active), and because they are organized by FEMA region, the DMORT responses were not included in the analysis presented here in order to protect their privacy. Most states, territories, and the District of Columbia were represented by at least one respondent from at least one sector (Figure 2), and responses were obtained from all levels (county, city, state, and federal). Responses were more concentrated in the northeast, and Texas was over-represented in the data.

Figure 2 Distribution of Sector Respondents, by State (N=879).

Organizational Capability

Across the 5 sectors, 48% of respondents indicated that their organization participated in jurisdiction-wide drills, as shown in Table 1. The greatest proportion was from the ME/C sector and the smallest was from the DCI sector.

Table 1 Organizational Capability to Respond in 3 CategoriesFootnote a

a Abbreviations: CBRNE, chemical, biological, radiological, nuclear, and explosives; DCI, death care industry; DOH, department of health; FBO, faith-based organization; MEC, medical examiners/coroners; MFI, mass fatality incident; OEM, offices of emergency management.

b Question was not asked.

Across sectors, only 15% indicated that their jurisdiction experienced an MFI in the past 5 years. The FBO sector, which responds nationally via the National Disaster Interfaiths Network,12 reported the greatest proportion. The DCI sector reported the smallest. The most frequently reported needs were for more training, planning, and drills. About half of the respondents indicated a need for more funding and greater surge capacity. The least frequently reported needs were for written MFI plans and inter-agency agreements.

Across sectors, respondents estimated that 75% of their organizations’ staff would be “willing and able” to respond to MFIs. The ME/C sector estimated the greatest percentages, followed by DCI and DOH. The estimates from the FBO sector were the smallest. The estimated proportion “willing and able” decreased by about 20 percentage points if the MFI involved CBRNE contamination, and this was true across all sectors. Most respondents perceived their organization to be prepared to some degree. Only 13% indicated that their organization was “not at all prepared” (Table 2).

Table 2 Distribution of Respondent’s Perception of Workplace PreparednessFootnote a

a Abbreviations: DCI, death care industry; DOH, department of health; FBO, faith-based organization; MEC, medical examiners/coroners; OEM, offices of emergency management.

Operational Capability

The proportion of sector-specific capabilities that respondents reported to be present in their organizations’ operational plans ranged between 33% (DCI) and 77% (OEM). The results are given in alphabetical order in Table 3.

Table 3 Operational Capability to Respond: Elements Included in the Mass Fatality Incident Plan of the Respondents’ OrganizationsFootnote a

a Abbreviations: DCI, death care industry; DOH, department of health; MEC, medical examiners/coroners; OEM, offices of emergency management; PPE, personal protective equipment.

b Hazardous materials.

c A dash indicates the item was not included in the sector’s survey.

DCI respondents reported 33% capability across 8 items. A majority reported that their plans covered up-to-date contact information for staff and suppliers (74%). Fewer reported that their plans addressed the ability to repair critical equipment (47%), included up-to-date contact information for back-up staff (48%), or included written inventories of key supplies (35%). Even fewer reported written plans for maintaining regular service (16%), formal agreements with suppliers (14%), written plans for staff absences (14%), and written plans for provision of altered standards of service (12%).

DOH respondents reported 72% capability across 19 items. A majority, ranging between 97% and 51%, reported that their operational MFI plans addressed 16 or more of the 19 capabilities. A minority reported that their plan addressed 3 items related to security of the disaster site and human remains.

ME/C respondents reported an average of 52% capability across 34 items. Over 70% reported that their MFI capabilities included morgue services and management of human remains, but the proportion dropped to below 60% for their ability to carry out joint investigations and to maintain site security. The lowest operational capability scores reported by the ME/C were for providing long-term family management and memorials (8%), identification of temporary interment sites (17%), communicating via social media (21%), and organizing a missing persons call center (23%). The ability to provide interment that was sensitive to religious and cultural considerations was low (30%).

OEM respondents reported 77% overall operational capability, using 16 items in each of 3 distinct categories of MFI they might be required to respond to: “regular” MFI, epidemic MFI, and hazardous materials (CBRNE) MFI. For an MFI without pandemic/epidemic or hazardous materials (CBRNE) involvement, an average 97% capability was reported. In contrast, OEM respondents reported much lower capability for responding to MFI with pandemic/epidemic (66%) and MFI contaminated with CBRNE (69%). Items addressing management of sites and human remains were the source of the greatest discrepancy between “regular” and contaminated response capability.

Network Capability

The proportion of respondents reporting reciprocal resource-sharing relationships with possible response partners is shown in Table 4. Across groups, respondents reported the lowest rates of reciprocity with faith-based and other volunteer organizations. Across sectors, there was considerable variability. High proportions (60%-70%) of ME/C respondents reported reciprocity with local and state OEMs, local health departments, and local first response organizations. High proportions of DOH respondents (~60%) reported reciprocity with the coroner/sheriff/justice of the peace and local first responders. OEM sector respondents reported the highest reciprocal ties with local first responders (68%), coroners (51%), and nearby health departments (50%). Most DCI sector respondents did not report reciprocity with possible partners, except for organizations “similar to their own” (ie, other local funeral homes, cemeteries, and crematories), with which 64% reported reciprocal relationships.

Table 4 Network Capability: Resource-Sharing Relationships With Selected Response PartnersFootnote a

a Abbreviations: CERT, community emergency response team; DCI, death care industry; DMORT, Disaster Mortuary Operational Response Team; DOD, Department of Defense; DOE, Department of Energy; DOH, department of health; EMS, emergency medical services; FBO, faith-based organization; MEC, medical examiners/coroners; NERT, neighborhood emergency response team; OEM, offices of emergency management.

b In the DCI questionnaire, the medical examiner category included coroner.

c A dash indicates the item was not included in the sector’s survey.

Of particular note, few OEM respondents (23%) reported reciprocity with local funeral homes, cemeteries, and crematories, and even fewer DCI respondents (9%) reported reciprocity with the local OEM. An interesting reciprocity gap was noted; more ME/C and DOH respondents reported reciprocity with local funeral homes (59% and 55%, respectively) than DCI sector respondents reported with ME/C and DOH (27% and 10%, respectively).

Network density measurements are shown in Table 5. In the “provide resources” network, 56% of possible relationships were present. In the “obtain resources” network, 52% of possible relationships were present. In the reciprocal network, 42% of possible relationships were present. The governmental sectors (DOH, OEM, and ME/C) had the highest density measurements, indicating greater potential for resource sharing. The DCI sector had the lowest, with the notable exception of the “provide resources” network, which was on par with the governmental sectors.

Table 5 Comparison of Density of Ties Between Response Partners Within Each Sector for Resource Sharing NetworksFootnote a

a Abbreviations: DCI, death care industry; DOH, department of health; FBO, faith-based organization; MEC, medical examiners/coroners; OEM, offices of emergency management. Density is shown as normalized proportion (0–1).

A composite estimate of national MFI preparedness calculated from the weighted averages for each of the 3 components of preparedness across the 5 response sectors was 51%. This was based on within-sector estimates of overall preparedness capability at 36% for the DCI sector, 55% for FBO, 50% for DOH, 53% for ME/C, and 63% for OEM.
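As a rough check of this arithmetic, weighting each sector's overall score by its respondent count gives a composite near the reported value. Note this sketch uses total respondents per sector as weights, whereas the study weighted by valid respondents per component, so it does not reproduce the 51% figure exactly:

```python
# Sector overall preparedness scores and respondent counts from the study.
scores = {"DCI": 0.36, "DOH": 0.50, "FBO": 0.55, "MEC": 0.53, "OEM": 0.63}
counts = {"DCI": 294, "DOH": 178, "FBO": 124, "MEC": 122, "OEM": 161}

total = sum(counts.values())  # 879 respondents in the analyzed sample
composite = sum(scores[s] * counts[s] for s in scores) / total
# composite comes out to roughly 0.49 with these (approximate) weights
```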

Discussion

In this study of a national sample of 879 individual representatives of 5 MFI response sectors, preparedness capabilities were uneven. The governmental sectors (DOH, ME/C, and OEM) generally reported greater capability than the private sectors (DCI and FBO), although important gaps were reported in all sectors. A composite estimate across sectors suggests the nation is about half prepared for a complex MFI, mirroring federal estimates.5,7

In terms of organizational capability, our investigation found that these sectors have limited experience to draw upon. Only about half of our sample reported organizational participation in MFI drills; far fewer reported prior experience with MFI. We have shown that MFI experience is positively correlated with higher levels of preparedness,13 and emergency response experience is associated with better organizational performance in general.14

Fewer than half of the sample reported that their staff had been trained for MFI, and only about a quarter reported that staff had been trained for MFI with CBRNE. There is additional cause for concern when the data on MFI experience and lack of training are viewed in light of willingness to respond; the number of experienced staff both willing and able to respond is likely to be suboptimal for contaminated MFI. These findings are similar to those of other studies on intentions to respond to disasters when hazardous contaminants are involved.13,15-17 Assumptions regarding response workforce strength may compromise even the best disaster planning; therefore, data on the expected “true” response of staff are a critical piece of planning information for mass fatality infrastructure organizations.7 Most respondents perceived that their workplace was “somewhat prepared,” yet almost all reported that more training, drills, and planning were needed to improve their level of preparedness: a clear mandate for allocating additional resources to the MFI infrastructure for preparedness activities.

With respect to operational capability, the OEM sector appears adequately prepared for an uncomplicated MFI, but considerably less so if the MFI involves CBRNE contamination. The most dramatic deficits were reported in the capacity to manage sites and human remains contaminated with CBRNE. The DOH sector reported unique strengths such as maintaining continuity of its nonresponse functions, providing health and safety guidance and training for other responders and the public, and providing mental health services. In comparison, the ME/C sector, designated lead in MFI response, reported only about half the operational capability to do so. In our survey, 42% of ME/C reported that 24 or fewer additional fatalities over their normal caseload (in a 48-hour time period) would exceed their capacity. This suggests a potentially critical gap in surge capacity. In a recent analysis of a mass murder incident with 33 deaths, Fierro and colleagues noted that whereas a relatively small incident (<50 fatalities) can generally be managed by ME/C without additional out-of-state resources, even these smaller incidents require local system surge capacity.18 Other gaps reported by ME/C included their ability to identify temporary interment sites, provide staff respite areas, manage missing person call centers, and communicate with the public through social media. The ability to provide interment that was sensitive to a range of religious beliefs was low. We have no information on the ability of ME/C to provide other services (such as missing person call centers) for non-English-speaking populations, as this was not addressed in our survey (for the ME/C or any other sector); however, this is an important issue that deserves closer examination.
For some of the operational capabilities for which the ME/C had notable gaps (eg, providing staff respite areas, managing missing person call centers, and communicating with the public through social media), the DOH sector reportedly has better capacity, suggesting that the 2 sectors could improve MFI preparedness by building on synergistic capabilities.19 The DCI sector reported the lowest operational capability, with the greatest deficits in formal written aspects of planning for supporting a response, suggesting that this critical civilian sector is not effectively included in local preparedness strategies.

Our network results reflect uneven resource-sharing potential. Although three-quarters of respondents reported that their organizations had mutual aid agreements in place, fewer than half reported reciprocal ties with specific response partners. When relationships are not reciprocal, a hierarchical arrangement is more likely, with less peer-to-peer collaboration. Not every sector may require equal reciprocity during a response, but the existence of reciprocal relationships is an indicator of collaborative potential.2,20 Gaps were found between all sectors despite well-documented evidence that effective response requires collaboration between the government and the private sector.4,21 Of particular note was the imbalance in reciprocity reported by the governmental sectors (OEM, DOH, ME/C) versus the DCI sector. That imbalance, combined with the high density measurement in the DCI “provide resources” network, suggests that the DCI sector understands its important role in MFI response but that death care organizations are not effectively incorporated into the overall response infrastructure. When collaboration exists mainly within (rather than across) sectors, a full-spectrum response that requires extensive on-scene coordination may be delayed. Such delay can result in inefficiencies, additional costs, increased use of resources, and greater risk to responders and the general public.

Our network density measurements suggest that about half of the possible relationships between response partners in the MFI infrastructure are likely to be present. Density is associated with "flow" (eg, of communication, resources, services) between entities and is correlated with the potential to coordinate.11,22-24 In general, the expected effect of lower density is less capability for handling complex situations, such as multi-sector response.23 For example, density between fire and rescue teams during effective response has been demonstrated to be as high as 95%.25 Organizations with many reciprocal relationships and high capacity to obtain resources are likely to be better prepared.14,26
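To make the two network metrics discussed above concrete, the following is a minimal sketch (in plain Python, not the ORA toolkit used in the study) of how directed density and reciprocity are computed. The sector ties shown are entirely hypothetical and invented for illustration; they do not reflect the study's data.

```python
# Illustrative sketch of the two network metrics discussed in the text:
# density (observed ties / possible ties) and reciprocity (fraction of
# ties whose reverse tie is also present). Hypothetical example only.

def density(edges, n_nodes):
    """Directed density: observed ties divided by n*(n-1) possible ties."""
    return len(edges) / (n_nodes * (n_nodes - 1))

def reciprocity(edges):
    """Fraction of directed ties that are reciprocated."""
    edge_set = set(edges)
    return sum((b, a) in edge_set for a, b in edge_set) / len(edge_set)

# Hypothetical resource-sharing ties: ("A", "B") means sector A reports
# sharing resources with sector B.
nodes = ["OEM", "DOH", "ME/C", "DCI", "FBO"]
ties = [
    ("OEM", "DOH"), ("DOH", "OEM"),    # reciprocal pair
    ("OEM", "ME/C"), ("ME/C", "OEM"),  # reciprocal pair
    ("DCI", "OEM"), ("DCI", "ME/C"),   # unreciprocated outreach by DCI
    ("FBO", "DOH"),                    # unreciprocated
]

print(round(density(ties, len(nodes)), 2))  # 7 of 20 possible ties
print(round(reciprocity(ties), 2))          # 4 of 7 ties reciprocated
```

In this toy network the DCI node illustrates the imbalance described above: it names two partners, but no partner names it back, which lowers reciprocity even while density stays moderate.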

Limited capability within individual sectors diminishes the effectiveness of the MFI infrastructure as a whole. For example, excellent operational capability for response to an uncomplicated MFI within the OEM sector and adequate operational capability in the ME/C and DOH sectors were offset by weak operational capability in the DCI sector, which provides critically needed services during an MFI response. The result is a composite preparedness score at about half the maximum, suggesting that an effective and efficient response to a large-scale or complex MFI is unlikely.

The study findings are constrained by several important limitations. Our purposive convenience sample is not representative of all key sectors, either in terms of numbers or in distribution across the country. Any of the 3143 counties and county equivalents in the United States27 may or may not have a DOH, OEM, and/or ME/C, and the respondents in our study represent only a small fraction of these organizations. More research using extensive, representative samples is clearly needed. We also limited our examination of voluntary organizations' preparedness to members of the faith-based sector; other organizations, most notably the American Red Cross, also play crucial voluntary roles. Further examination of the preparedness of these other voluntary organizations is warranted in future studies.

Another threat to reliability stemming from the convenience sample is systematic bias. We have no explanation for the distribution of responses, for the higher response rates from Texas, or for the lower response from other states, nor do we know whether this imbalance affected our results. However, we do know that Texas has shown important leadership in MFI preparedness.28 Furthermore, our estimates of national preparedness are based on individuals responding on behalf of their organizations, who may or may not have been the most knowledgeable about their organization's MFI capabilities. Finally, the missing Disaster Mortuary Operational Response Team (DMORT) sector is important to acknowledge because these teams bring significant expertise to MFI response; however, anecdotal reports indicate that they have been underfunded and under-resourced for several years. This important national asset might be severely compromised in a response to a large-scale MFI involving thousands or more deaths.

Despite these limitations, our study produced well-defined measures of MFI preparedness for this large and critical infrastructure, as well as the only known empirical evidence on preparedness within and across the 5 response sectors that make up the infrastructure. These findings begin to fill a significant gap in our knowledge about relationships between organizations that have essential roles and responsibilities in MFI response, with important implications for national response policy and planning.

Conclusions

These results provide evidence of a pressing need for a collaborative MFI response infrastructure that is optimized through planning, training, and drills across sectors at all levels.29,30 The variability we found presents a compelling argument for enhanced planning so that individual sector weaknesses can be identified and addressed. This approach is exemplified in FEMA training courses that focus on collaborative planning through simulations.31 Regional planning is another promising approach for mitigating the economic, environmental, and psychological consequences of MFI. The pioneering multi-sector planning underway in New York, New Jersey, Connecticut, and Pennsylvania is expected to maximize regional coordination, communication, and unity of effort.32

All sectors need clear guidance regarding their MFI responsibilities. Predefined roles can be useful, but the focus should remain on collaboration, because predefined roles may not apply in a novel response. The professional organizations that represent each sector in the mass fatality infrastructure can provide leadership by developing and distributing guidance and by advocating for local, state, and federal support. For example, core competencies for MFI response, similar to those proposed for disaster medicine and public health, are needed to guide workforce training.33

Finally, in the National Response Framework,5 support for the management of mass fatalities is defined by the Public Health and Medical Services Annex, Emergency Support Function (ESF) #8, which is coordinated by the US Department of Health and Human Services.34 It should be noted, however, that the National Response Framework does not provide preparedness standards for any of the sectors in the mass fatality infrastructure or for the infrastructure as a whole. Furthermore, ESF #8 currently has a large scope, including evacuation and care of the living, and federal stakeholders may wish to consider whether a dedicated MFI response function is needed.

We know that preparedness is essential for effective emergency response. The resiliency and recovery of affected communities, and of the nation as a whole, depend on the readiness of the government and private sectors to collaborate efficiently. History, experience, and current trends tell us that future MFIs are inevitable. Enhanced national preparedness for MFIs of any scale is crucial for ensuring a response that is both effective and respectful of the victims and of society's deeply held values.

Acknowledgments

The authors are grateful to the following individuals who graciously shared their expert advice: Ms Cynthia Gavin, Dr Kathleen M. Carley, Mr John Nesler, Ms Allison Woody, Dr Jason Wiersema, Ms Emily Carroll, Mr Frank DePaolo, Dr Suzanne Utley, Dr Lisa LaPoint, Dr John Fudenberg, Mr Robert Fells, Ms Lesley Witter, Ms Deana Gillespie, Ms Lori Cascaden, Ms Carmela Hinderacker, Ms Malaya Fletcher, Dr Naveena Bobba, Mr Andrew Roszak, Ms Resham Patel, Mr David Zane, Ms Lynne Bratka, Mr Peter Gudaitis, Ms Dawn Shiley, Dr Elin Gursky, Mr Edward Kilbane, and Mr Kevin Sheehan. We also thank Dr Martin Sherman, Ms Halley Riley, Ms Tara McAlexander, and Ms Denise McNally for their input in questionnaire development. We are also deeply appreciative of the Association of State and Territorial Health Officials; the National Association of County and City Health Officials; the National Association of Medical Examiners; the International Association of Coroners and Medical Examiners; the International Cemetery, Cremation and Funeral Association; the National Funeral Directors Association; the International Association of Emergency Managers; and the National Disaster Interfaiths Network for their assistance in questionnaire development and distribution and participant recruitment. A special note of thanks to the study participants for their enthusiastic participation in the various aspects of this study.

Funding

This study was funded by a grant (CMMI-1233673) provided by the National Science Foundation.

References

1. Terbush JW, Huckaby GC, Luke M. Civil-military integration for mass-fatality events. In: Gursky EA, Fierro MF, eds. Death in Large Numbers: The Science, Policy, and Management of Mass Fatality Events. Chicago, IL: American Medical Association; 2012.
2. Centers for Disease Control and Prevention. A Framework for Improving Cross-Sector Coordination for Emergency Preparedness and Response. http://www.cdc.gov/phlp/docs/CDC_BJA_Framework.pdf. Published July 2008. Accessed November 19, 2015.
3. Abbott D. Disaster public health considerations. Prehosp Disaster Med. 2000;15(4):158-166.
4. Kapucu N, Van Wart M. The evolving role of the public sector in managing catastrophic disasters: lessons learned. Administration & Society. 2006;38(3):279-308.
5. Federal Emergency Management Agency. National Response Framework. https://www.fema.gov/national-response-framework. Accessed November 19, 2015.
6. Gursky E. A working group consensus statement on mass-fatality planning for pandemics and disasters. Journal of Homeland Security. July 2007.
7. Federal Emergency Management Agency. National Preparedness Report. http://www.fema.gov/national-preparedness-report. Accessed November 19, 2015.
8. SurveyMonkey Inc. https://www.surveymonkey.com/. Accessed July 7, 2015.
9. R: A Language and Environment for Statistical Computing [computer program]. Vienna, Austria: R Foundation for Statistical Computing; 2010.
10. Carley KM. ORA: a toolkit for dynamic network analysis and visualization. In: Alhajj R, Rokne J, eds. Encyclopedia of Social Network Analysis and Mining. New York, NY: Springer; 2014. http://dx.doi.org/10.1007/978-1-4614-6170-8_309.
11. Wasserman S, Faust K. Social Network Analysis: Methods and Applications. Vol 8. New York, NY: Cambridge University Press; 1994. http://dx.doi.org/10.1017/CBO9780511815478.
12. National Disaster Interfaiths Network. http://www.n-din.org/. Accessed July 7, 2015.
13. Gershon RR, Magda LA, Riley HE, Merrill JA. Mass fatality preparedness in the death care sector. J Occup Environ Med. 2011;53(10):1179-1186.
14. Merrill JA, Carley KM, Orr MG, et al. Patterns of interaction among local public health officials and the adoption of recommended practices. Front Public Health Serv Syst Res. 2012;1(1):6.
15. Chaffee M. Willingness of health care personnel to work in a disaster: an integrative review of the literature. Disaster Med Public Health Prep. 2009;3(1):42-56. http://dx.doi.org/10.1097/DMP.0b013e31818e8934.
16. Gershon RR, Magda LA, Qureshi KA, et al. Factors associated with the ability and willingness of essential workers to report to duty during a pandemic. J Occup Environ Med. 2010;52(10):995-1003.
17. Qureshi K, Gershon RR, Sherman MF, et al. Health care workers' ability and willingness to report to duty during catastrophic disasters. J Urban Health. 2005;82(3):378-388. http://dx.doi.org/10.1093/jurban/jti086.
18. Fierro MF. Mass murder in a university setting: analysis of the medical examiner's response. Disaster Med Public Health Prep. 2007;1(1 suppl):S25-S30. http://dx.doi.org/10.1097/DMP.0b013e31814cf374.
19. Santa Clara County Public Health Department Advanced Practice Center. Managing Mass Fatalities: A Toolkit for Planning. https://www.sccgov.org/sites/sccphd/en-us/HealthProviders/BePrepared/Pages/Managing-Mass-Fatalities.aspx. Published May 2008. Accessed November 19, 2015.
20. Floría LM, Gracia-Lázaro C, Gómez-Gardeñes J, et al. Social network reciprocity as a phase transition in evolutionary cooperation. Phys Rev E. 2009;79(2):026106. http://dx.doi.org/10.1103/PhysRevE.79.026106.
21. Simo G, Bies AL. The role of nonprofits in disaster response: an expanded model of cross-sector collaboration. Public Adm Rev. 2007;67(s1):125-142. http://dx.doi.org/10.1111/j.1540-6210.2007.00821.x.
22. Haythornthwaite C. Social network analysis: an approach and technique for the study of information exchange. Libr Inf Sci Res. 1996;18(4):323-342. http://dx.doi.org/10.1016/S0740-8188(96)90003-1.
23. Hossain L, Kuti M. Disaster response preparedness coordination through social networks. Disasters. 2010;34(3):755-786. http://dx.doi.org/10.1111/j.1467-7717.2010.01168.x.
24. Hawe P, Webster C, Shiell A. A glossary of terms for navigating the field of social network analysis. J Epidemiol Community Health. 2004;58(12):971-975. http://dx.doi.org/10.1136/jech.2003.014530.
25. Mohammadfam I, Bastani S, Esaghi M, et al. Evaluation of coordination of emergency response team through the social network analysis. Case study: oil and gas refinery. Saf Health Work. 2015;6(1):30-34.
26. Kenis P, Knoke D. How organizational field networks shape interorganizational tie-formation rates. Acad Manage Rev. 2002;27(2):275-293.
27. United States Census Bureau. USA Counties. 1969-2007. http://censtats.census.gov/usa/usainfo.shtml. Accessed May 4, 2015.
28. Woody AC, Boyer DA. Mass fatality management: local & regional planning. Presented at: Texas Emergency Management Conference; March 2013; Texas.
29. Waugh WL, Streib G. Collaboration and leadership for effective emergency management. Public Adm Rev. 2006;66(s1):131-140. http://dx.doi.org/10.1111/j.1540-6210.2006.00673.x.
30. Partnership for Public Service. Annual Report. Washington, DC: Partnership for Public Service; 2008.
31. Federal Emergency Management Agency. National Training and Education (NTE). https://training.fema.gov/. Accessed July 15, 2015.
32. DePaolo F. Regional catastrophic mass fatality management response system. Presented at: International Mass Fatality Management Conference; April 25-27, 2012; New York, NY.
33. Walsh L, Subbarao I, Gebbie K, et al. Core competencies for disaster medicine and public health. Disaster Med Public Health Prep. 2012;6(1):44-52. http://dx.doi.org/10.1001/dmp.2012.4.
34. US Department of Health and Human Services. Emergency Support Function #8 - Public Health and Medical Services Annex. http://www.fema.gov/media-library-data/20130726-1914-25045-3446/final_esf_8_public_health_medical_20130501.pdf. Accessed July 7, 2015.
Figure 1 US Mass Fatality Infrastructure. Abbreviations: DHHS, US Department of Health and Human Services; OSHA, Occupational Safety and Health Administration.

Figure 2 Distribution of Sector Respondents, by State (N=879).

Table 1 Organizational Capability to Respond in 3 Categories

Table 2 Distribution of Respondents' Perception of Workplace Preparedness

Table 3 Operational Capability to Respond: Elements Included in the Mass Fatality Incident Plan of the Respondents' Organizations

Table 4 Network Capability: Resource-Sharing Relationships With Selected Response Partners

Table 5 Comparison of Density of Ties Between Response Partners Within Each Sector for Resource-Sharing Networks