
Computer-Facilitated Assessment of Disaster Preparedness for Remote Hospitals in a Long-Distance, Virtual Tabletop Drill Model

Published online by Cambridge University Press:  20 September 2011

Brian Gillett*
Affiliation:
Director of Center for Clinical Simulation, Maimonides Medical Center, Brooklyn, New York USA
Mark Silverberg
Affiliation:
Associate Residency Director, Assistant Professor of Emergency Medicine, State University of New York Downstate, Brooklyn, New York USA
Patricia Roblin
Affiliation:
Administrative Director for Division of Disaster Medicine, New York Institute for All Hazard Preparedness, State University of New York Downstate, Brooklyn, New York USA
John Adelaine
Affiliation:
Senior Resident, Department of Emergency Medicine, State University of New York Downstate, Brooklyn, New York USA
Walter Valesky
Affiliation:
Director of Disaster Preparedness, Associate Professor of Emergency Medicine, State University of New York Downstate, Brooklyn, New York USA
Bonnie Arquilla
Affiliation:
Associate Professor of Emergency Medicine, Director of Emergency Preparedness, SUNY Downstate Medical Center, Kings County Hospital Center, New York Institute for All Hazard Preparedness, Brooklyn, New York USA
*
Correspondence: Brian Gillett, MD, 4802 10th Avenue, Brooklyn, New York 11219 USA E-mail: bpgillett@earthlink.net
Rights & Permissions [Opens in a new window]

Abstract

Introduction: Emergency preparedness experts generally are based at academic or governmental institutions. A mechanism for experts to remotely facilitate a distant hospital’s disaster readiness is lacking.

Objective: The objective of this study was to develop and examine the feasibility of an Internet-based software tool to assess disaster preparedness for remote hospitals using a long-distance, virtual, tabletop drill.

Methods: An Internet-based system that remotely acquires information and analyzes disaster preparedness for hospitals at a distance using a virtual, tabletop drill model was piloted. Nine hospitals in Cape Town, South Africa, designated as receiving institutions for the 2010 FIFA World Cup Games, together with the tournament's organizers, utilized the system over a 10-week period. At one-week intervals, the system e-mailed each hospital's leadership a description of a stadium disaster and instructed them to log in to the system and answer questions relating to their hospital's state of readiness. A total of 169 questions were posed relating to operational and surge capacities, communication, equipment, major incident planning, public relations, staff safety, hospital supplies, and security in each hospital. The system was used to analyze answers and generate a real-time grid that reflected readiness as a percent for each hospital in each of the above categories. It also created individualized recommendations of how to improve preparedness for each hospital. To assess the feasibility of such a system, the end users' compliance and response times were examined.

Results: Overall, compliance was excellent, with an aggregate response rate of 98%. The mean response interval, defined as the time elapsed between sending a stimulus and receiving a response, was eight days (95% CI = 8–9 days).

Conclusions: A web-based data acquisition system using a virtual, tabletop drill to remotely facilitate assessment of disaster preparedness is efficient and feasible. Weekly reinforcement for disaster preparedness resulted in strong compliance.

Type
Brief Report
Copyright
Copyright © World Association for Disaster and Emergency Medicine 2011

Introduction

Emerging public health threats, whether natural or deliberate in origin, considerably jeopardize the health and safety of communities across the globe. As such, a baseline measurement of the emergency capabilities of hospitals is needed to ensure adequate preparedness. This baseline assessment provides information about emergency management planning and response capabilities so that weaknesses can be addressed. Progress then can be measured relative to the initial baseline so that resources can be allocated to the areas of greatest vulnerability.

Regional hospital collaborative networks for disaster response are important in accommodating surges of ill or injured patients.[1] This requires an efficient mechanism for both acquiring and sharing hospitals' states of emergency preparedness; however, the capacity for such networks to communicate their preparedness information with each other and with government agencies is lacking.[2] Because assessing disaster preparedness requires specialized expertise, the ability of a hospital's leadership to assess preparedness varies across the globe. Additionally, in the US, the provision of health care largely occurs in the private sector, while disaster preparedness and response typically are coordinated by local, state, and federal agencies. Lastly, experts in emergency preparedness tend to be based at academic or government institutions, making it difficult for private healthcare facilities to exploit these intellectual resources.[3]

A mechanism that allows experts to assess an institution's disaster readiness from a distance, and to participate in the exercise from their own base of operations, is needed for efficient mass-casualty planning on a global scale. Such a system was developed and implemented to remotely assess the preparedness of nine hospitals in Cape Town, South Africa in preparation for the 2010 FIFA World Cup games.

Methods

Study Design

An online program was developed with the PHP and HTML programming languages and a MySQL database to acquire information and analyze disaster preparedness for hospitals from a distance.
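The authors do not publish RDDAT's source code; the sketch below only illustrates the kind of MySQL schema such a PHP-backed tool might use. All table and column names are hypothetical assumptions, not the actual RDDAT design.

```php
<?php
// Illustrative schema for an RDDAT-like tool (hypothetical names; not the
// authors' actual design). Connection credentials are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=rddat', 'user', 'password');

$pdo->exec("CREATE TABLE IF NOT EXISTS hospitals (
    hospital_id   INT AUTO_INCREMENT PRIMARY KEY,
    name          VARCHAR(255) NOT NULL,
    contact_email VARCHAR(255) NOT NULL
)");

$pdo->exec("CREATE TABLE IF NOT EXISTS questions (
    question_id  INT AUTO_INCREMENT PRIMARY KEY,
    category     VARCHAR(64) NOT NULL,            -- e.g., 'communication'
    answer_type  ENUM('yes_no','numeric') NOT NULL,
    text         TEXT NOT NULL,
    activated_at DATETIME NULL                    -- time stamp applied on activation
)");

$pdo->exec("CREATE TABLE IF NOT EXISTS responses (
    response_id  INT AUTO_INCREMENT PRIMARY KEY,
    hospital_id  INT NOT NULL,
    question_id  INT NOT NULL,
    answer       VARCHAR(32) NOT NULL,            -- 'yes', 'no', or a number
    submitted_at DATETIME NOT NULL                -- time stamp applied on submission
)");
```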

Nine hospitals in Cape Town, South Africa, designated as receiving institutions for the 2010 FIFA World Cup games, along with the tournament organizers, utilized the Internet-based system, the Remote Disaster Drill and Analysis Tool (RDDAT), over a 10-week period in preparation for the World Cup events.

The RDDAT system was hosted on a server in New York, and integrated automated electronic mail to deliver survey questions by sending weekly “stimuli” (Figure 1) to leadership at each of the nine hospitals over a 10-week period. Stimuli were in the form of status updates relating to increasing tourist density in the Cape Town metropolitan area during the first two weeks of the exercise, and subsequently, to a mass-casualty incident in a stadium during the remaining weeks. Upon receipt of each e-mail stimulus, the designated recipients were instructed to log in to RDDAT and answer questions relating to their hospital's state of readiness.

Figure 1 Example of an e-mail stimulus generated by RDDAT
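As a rough illustration of the automated mail-out described above, a weekly stimulus could be dispatched with PHP's built-in mail() function, as in the hedged sketch below. The stimulus text, subject line, and table names are assumptions for illustration only; the actual stimulus texts are not reported.

```php
<?php
// Sketch of a weekly stimulus mail-out (illustrative only; RDDAT's actual
// code and stimulus texts are not published). Assumes the hypothetical
// schema sketched earlier.
$pdo = new PDO('mysql:host=localhost;dbname=rddat', 'user', 'password');

// Placeholder text; the real stimuli were status updates tailored to the exercise.
$stimulus = "Status update: tourist density in the Cape Town metropolitan "
          . "area continues to increase ahead of the tournament.";

foreach ($pdo->query('SELECT name, contact_email FROM hospitals') as $hospital) {
    $body = $stimulus . "\n\nPlease log in to RDDAT and answer the questions "
          . "relating to your hospital's state of readiness.";
    if (!mail($hospital['contact_email'], 'RDDAT weekly exercise stimulus', $body)) {
        error_log("Could not queue stimulus e-mail for {$hospital['name']}");
    }
}
```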

Respondents were instructed to answer the questions as quickly and accurately as possible. They were informed that RDDAT records the time required to answer questions, and that this time interval may be factored into evaluation of their state of readiness.

Each hospital was asked 169 questions. Of these, 124 were in a “yes/no” format and related to communication, equipment, major incident planning, public relations, staff safety, surge capacity, hospital supplies, and security; affirmative responses indicated preparedness. The remaining 45 questions required a numerical answer and related to patient bed capacity. All questions were adapted from the Hospital Emergency Analysis Tool (HEAT).[4]
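A minimal sketch of how the two answer formats might be validated on submission follows; the function and type names are assumptions, not RDDAT's actual code.

```php
<?php
// Hypothetical validator for the two question formats described above.
function isValidAnswer(string $answerType, string $raw): bool
{
    if ($answerType === 'yes_no') {
        // 124 questions: affirmative responses indicated preparedness.
        return in_array(strtolower(trim($raw)), ['yes', 'no'], true);
    }
    if ($answerType === 'numeric') {
        // 45 questions: non-negative numbers (e.g., patient bed capacity).
        return is_numeric($raw) && (float)$raw >= 0;
    }
    return false;
}

var_dump(isValidAnswer('yes_no', 'Yes'));   // bool(true)
var_dump(isValidAnswer('numeric', '120'));  // bool(true)
var_dump(isValidAnswer('numeric', 'many')); // bool(false)
```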

The RDDAT recorded the respondents' answers, identities, and response intervals in a MySQL database. The system provided real-time monitoring of these responses to both the respondents and the FIFA World Cup organizers. Institutions were able to monitor only their own response data; however, the World Cup organizers and the study group could monitor data from all nine facilities. The RDDAT provided both institution-specific and regional network summary views for monitoring data (Figure 2).

Figure 2 Grid generated by RDDAT displaying real-time percent readiness for each hospital organized by category

Disaster readiness data were broken down into six categories for display and analysis purposes: (1) communication; (2) equipment; (3) major incident planning; (4) public relations and risk communications; (5) surge capacity; and (6) safety, supplies, fire, and security. Assessments involved comparison between measures of actual performance and standards that described ideal performance.[5,6] The RDDAT displayed real-time readiness as a percent, based on the proportion of affirmative answers, in a grid format for each category across the nine hospitals. The hospitals' specific responses to each survey question also were categorized under the above headings, and displayed in a separate grid (Figure 3).

Figure 3 Portion of a grid generated by RDDAT displaying the hospitals’ specific responses organized by category
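The percent-readiness figure is simply the proportion of affirmative answers per category. Under the hypothetical schema sketched earlier, it could be computed with a single aggregate query, as in this sketch.

```php
<?php
// Sketch of the per-hospital, per-category "percent readiness" calculation:
// the proportion of affirmative answers among the yes/no questions.
$pdo = new PDO('mysql:host=localhost;dbname=rddat', 'user', 'password');

$cells = $pdo->query("
    SELECT r.hospital_id,
           q.category,
           100.0 * SUM(r.answer = 'yes') / COUNT(*) AS percent_ready
    FROM responses r
    JOIN questions q ON q.question_id = r.question_id
    WHERE q.answer_type = 'yes_no'
    GROUP BY r.hospital_id, q.category
")->fetchAll(PDO::FETCH_ASSOC);

// Print one grid cell per row; a web front end would render these as a table.
foreach ($cells as $cell) {
    printf("Hospital %d | %-35s | %5.1f%%\n",
        $cell['hospital_id'], $cell['category'], $cell['percent_ready']);
}
```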

The system also analyzed negative answers to programmatically generate individualized, plain-English recommendations for each hospital (Figure 4). After the exercise, detailed reports were generated by RDDAT and presented to the leadership of each hospital and to the World Cup organizers.

Figure 4 Example section from a Summary of Recommendations report automatically generated by RDDAT
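Mechanically, recommendation generation can be as simple as a lookup from each negatively answered question to a canned remediation text. The sketch below illustrates this pattern; the question IDs and recommendation wording are invented for illustration and are not the texts RDDAT produced.

```php
<?php
// Hypothetical mapping from question IDs to plain-English recommendations
// issued when the corresponding answer is 'no' (illustrative wording only).
$recommendationFor = [
    12 => 'Establish a redundant communication channel with regional EMS.',
    47 => 'Verify that emergency generators can power all essential services for 72 hours.',
    88 => 'Designate and train a public relations officer for incident communications.',
];

$pdo  = new PDO('mysql:host=localhost;dbname=rddat', 'user', 'password');
$stmt = $pdo->prepare(
    "SELECT question_id FROM responses WHERE hospital_id = ? AND answer = 'no'"
);
$stmt->execute([3]); // example hospital ID

echo "Summary of Recommendations\n";
foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $questionId) {
    if (isset($recommendationFor[$questionId])) {
        echo "- {$recommendationFor[$questionId]}\n";
    }
}
```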

To assess the feasibility of the RDDAT system, the end users’ compliance and response times were examined. These values were tracked by the RDDAT and reported to the study group at the conclusion of the exercise.

The study was approved by the State University of New York Downstate Institutional Review Board.

Study Setting

The New York Institute for All Hazard Preparedness conducted this long-distance, virtual, tabletop drill and data acquisition exercise between May and September 2009, involving nine hospitals in the Western Cape of South Africa. These hospitals represent the majority of receiving facilities for mass-casualty incidents occurring before, during, or following the 2010 FIFA World Cup games. The drill, facilitated by RDDAT, initially simulated a surge in the Cape Town population density, and subsequently, a stampede-type event at the Green Point Stadium in Cape Town, South Africa.

Data Collection

The RDDAT software applied and recorded a time stamp and a unique identifier for each question at the time of activation, and for each response at the time of submission. The RDDAT also programmatically recorded the number of questions asked and the number of answers submitted for each institution. The RDDAT program stored all raw data in a MySQL database, and programmatically generated “percent readiness” scores in real time based on the proportion of affirmative answers.
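Given activation and submission time stamps, the response interval reduces to a date difference. A sketch under the same hypothetical schema:

```php
<?php
// Sketch of the response-interval calculation: time elapsed between a
// question's activation and the answer's submission (hypothetical schema).
$pdo = new PDO('mysql:host=localhost;dbname=rddat', 'user', 'password');

$rows = $pdo->query("
    SELECT r.hospital_id,
           AVG(TIMESTAMPDIFF(HOUR, q.activated_at, r.submitted_at)) / 24.0
               AS mean_interval_days
    FROM responses r
    JOIN questions q ON q.question_id = r.question_id
    GROUP BY r.hospital_id
")->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    printf("Hospital %d: mean response interval %.1f days\n",
        $row['hospital_id'], $row['mean_interval_days']);
}
```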

Statistical Processing

The end users' response times and percentages of questions answered were presented as means and proportions with 95% confidence intervals (CIs). All data were entered into SAS statistical software, version 9.0 (SAS Institute).
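The study's statistics were computed in SAS; purely to make the calculation concrete, the sketch below shows the standard normal-approximation 95% CI for a mean. The sample values are placeholders, not study data.

```php
<?php
// Normal-approximation 95% confidence interval for a mean (illustrative;
// the study used SAS 9.0 for its statistical processing).
function meanWith95Ci(array $values): array
{
    $n    = count($values);
    $mean = array_sum($values) / $n;
    $ss   = 0.0;
    foreach ($values as $v) {
        $ss += ($v - $mean) ** 2;
    }
    $se = sqrt($ss / ($n - 1) / $n);  // standard error of the mean
    return [$mean, $mean - 1.96 * $se, $mean + 1.96 * $se];
}

// Placeholder response intervals in days (not the actual study data).
[$mean, $low, $high] = meanWith95Ci([7.5, 8.2, 9.1, 8.0, 7.8, 8.6]);
printf("Mean %.1f days (95%% CI = %.1f-%.1f days)\n", $mean, $low, $high);
```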

Results

Overall, compliance was excellent, with an aggregate response rate of 98% (by hospital: 100%, 100%, 100%, 100%, 99%, 99%, 98%, 97%, and 86%). The mean response interval, defined as the time elapsed between sending a stimulus and receiving a response, was eight days (95% CI = 8–9 days).

Discussion

Infectious, geological, and intentional disasters occur unexpectedly. Hospitals prepare for such disasters by drilling with either tabletop exercises or mock patient surges involving live actors. Utilizing live actors is not feasible for many hospitals, as their use requires significant human and financial resources; for this reason, institutions often substitute the tabletop format.[7] Effective tabletop drill design and post-drill data processing require specialized expertise; however, experts in emergency preparedness tend to be employed at academic or government institutions. Thus, obtaining expert consultation can be difficult for private healthcare facilities. Also, the field of disaster medicine is not well established in many regions of the world.

A web-based emergency preparedness assessment, such as RDDAT, has many attractive properties. Remote assessment offers a global public health benefit to underdeveloped regions and to the private sector, where disaster expertise is lacking. The RDDAT acquired preparedness data in the context of a virtual, tabletop drill, automating both the drill's execution and the response analysis. Automating these complex tasks minimized expended human resources and expedited the development of both institution-specific and collaborative, network-oriented recommendations. Such a system also may eliminate delays and expenses relating to travel of experts to remote sites.

This system allows assessments to be administered to large groups, or even entire censuses, of health departments and staff. Another advantage is that responses to surveys and checklists allow for relatively straightforward data processing, including the calculation of aggregate measures and comparisons across subgroups.

The six dimensions of emergency preparedness assessed were: (1) communication; (2) equipment; (3) major incident planning; (4) public relations and risk communications; (5) surge capacity; and (6) safety, supplies, fire, and security. These categories and their respective questions, adapted from HEAT, provided a thorough framework for assessing the hospitals' state of readiness.[3] The stimuli and questions were not hard-coded into RDDAT; instead, RDDAT incorporated an interface for creating and editing the stimuli and their accompanying questions. This broadens the system's applicability to a wide range of both health-related and non-health-related facilities.

Organizing and analyzing preparedness data are cumbersome tasks, especially when assessing a regional collaborative network of hospitals. A total of 1,521 answers were collected across the nine hospitals (169 per hospital). This volume of data would have required tremendous staff-hours to process, and could be susceptible to arithmetic and recording errors if handled manually by assessment personnel. However, the RDDAT system programmatically organized and analyzed responses instantly upon receipt, which permitted real-time monitoring. The RDDAT displayed percent readiness for each category, as well as hospital-specific recommendations for improvement, in a readily accessible and organized layout.

Efficient communication of disaster preparedness between hospitals and with outside organizations remains challenging.[2] The RDDAT's grid display permits disaster readiness, organized by category, to be shared across regional hospital networks as well as with a region's emergency medical services and government and regulatory agencies. Such a grid has the potential to improve regional disaster planning.

Compliance in responding to surveys is a common problem.[8,9] The weekly stimuli accompanying the questions, in the form of incident status updates, reinforced the importance of the exercise and promoted compliance. The specific stimulus texts used in this study are not reported, as the actual text should be tailored to the unique setting in which such a system is implemented.

The time stamps applied by RDDAT upon sending questions and receiving answers permitted the program to analyze the hospitals' response intervals. The mean response interval of eight days is acceptable for non-emergent data acquisition, but would not be appropriate during a live incident. Informal interviews with the hospitals' staff shed light on the relatively long response interval: participants related that they often responded to questions when they were less busy with other tasks, and several explained that generating some answers required several days because the information was difficult to obtain. Further research is needed to clarify the factors influencing response times in such a system.

It is assumed, based on traditional pedagogical philosophy, that drilling the process of obtaining answers will improve efficiency in executing the same tasks during an actual incident. Also, comparing response intervals across preparedness categories offers insight into an institution's efficiency in monitoring important information, and helps identify specific areas requiring improvement.

Limitations

There are many reasons to approach written assessments with caution. First, such assessments tend to focus on public health structure, and little rigorous evidence exists linking specific public health preparedness structures with the ability to execute effective response processes and, ultimately, to impact health. Thus, such structural measures may not be valid indicators of preparedness. Also, the design of the study did not allow the authors to validate survey responses against data from actual operations, and the validity of the responses is not easily verifiable.

In addition, a system such as RDDAT requires access to the Internet, which varies between regions of the world. Also, the response intervals may not accurately reflect the true times required by a hospital to gather the requested information, because the system cannot account for a user's delay in opening stimulus e-mails or in submitting gathered information.

Lastly, the questions presented to the hospitals were of a “yes/no” format which, in certain circumstances, fails to capture sufficient detail. For example, the question “Is your emergency power generating equipment adequate to provide enough power for all essential services for three days?” was asked. A hospital that has power coverage for 90% of essential services and a hospital with only 40% coverage would both have to respond “No.” While a negative response to this question does not provide the entire answer, it does serve to flag the topic for further exploration by the institution or by a disaster preparedness consultant. Future versions of this computer-based system should incorporate a mechanism to accommodate non-dichotomous data where applicable, as sketched below.
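One simple way a future version could accommodate non-dichotomous data is a graded answer in place of the strict yes/no. The following sketch is entirely hypothetical and not part of RDDAT as studied; the bands and wording are invented for illustration.

```php
<?php
// Hypothetical graded scoring for the generator-coverage example above,
// distinguishing hospitals that a yes/no question would lump together.
function generatorCoverageBand(float $fractionCovered): string
{
    if ($fractionCovered >= 1.0)  return 'fully prepared';
    if ($fractionCovered >= 0.9)  return 'nearly prepared: minor gap to close';
    if ($fractionCovered >= 0.5)  return 'partially prepared: remediation needed';
    return 'unprepared: priority remediation needed';
}

// The two hospitals from the example above now receive distinct flags.
echo generatorCoverageBand(0.90), "\n"; // nearly prepared: minor gap to close
echo generatorCoverageBand(0.40), "\n"; // unprepared: priority remediation needed
```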

Conclusions

Despite its limitations, a web-based data acquisition system using a virtual, tabletop drill to remotely facilitate assessment of disaster preparedness is efficient and feasible. Weekly reinforcement for disaster preparedness resulted in strong compliance and in a large amount of data at the exercise's conclusion. Further study is required to validate such a system's impact on actual disaster response.

Abbreviations:

FIFA = Fédération Internationale de Football Association

HEAT = Hospital Emergency Analysis Tool

RDDAT = Remote Disaster Drill and Analysis Tool

References

1. Courtney B, Toner E, Waldhorn R, et al: Preparing the healthcare system for catastrophic emergencies. Biosecur Bioterr 2009;7(1).
2. Maldin-Morgenthau B, Toner E, Waldhorn R, et al: Roundtable: promoting partnerships for regional healthcare preparedness and response. Biosecur Bioterr 2007;5(2).
3. Courtney B, Toner E, O'Toole T, et al: Healthcare coalitions: the new foundation for national healthcare preparedness and response for catastrophic health emergencies. Biosecur Bioterr 2009;7(2).
4. Oster NS, Chaffee MW: Hospital preparedness analysis using the hospital emergency analysis tool (the HEAT). Ann Emerg Med 2004;44(4).
5. Toner E, Waldhorn R, Franco C, et al: Descriptive Framework for Healthcare Preparedness for Mass Casualty Events. Prepared by the Center for Biosecurity of UPMC for the US Department of Health and Human Services under Contract No. HHSO100200700038C; 2008.
6. Phillips SJ, Knebel A (eds): Mass Medical Care with Scarce Resources: A Community Planning Guide. Prepared by Health Systems Research, Inc. under Contract No. 290-04-0010. AHRQ Publication No. 07-0001. Rockville, MD: Agency for Healthcare Research and Quality; 2007.
7. Catlett C, Perl T, Bass E, et al: Training of Clinicians for Public Health Events Relevant to Bioterrorism Preparedness. Evidence Report/Technology Assessment Number 51. AHRQ Publication No. 02-E011; January 2002.
8. Jenkins J, Kelen G, McCarthy M, et al: Review of hospital preparedness instruments for National Incident Management System compliance. Disaster Med Public Health Prep 2009;3:s83–s89.
9. McCarthy M, Brewster P, Kelen G, et al: Consensus and tools needed to measure health care emergency management capabilities. Disaster Med Public Health Prep 2009;3(2 Suppl):s45–s51.