In September 2014 the first case of Ebola virus disease (EVD) was diagnosed in the United States. Three more cases were diagnosed in October. The responses that followed this emerging public health emergency raised questions about the readiness of our health care system to prevent and manage future cases. 1 The American College of Emergency Physicians (ACEP) took an active role in the response by assessing hospital readiness and disseminating Ebola-related information. ACEP conducted 3 electronic surveys among its members to assess the state of Ebola preparedness of US emergency departments (EDs) and hospitals.
ACEP provides a model for how professional organizations can use member surveys to help inform preparedness and response to unfolding public health emergencies. Such surveys can identify successes and challenges and serve as a basis for sharing best practices. One practical way to achieve this is to incorporate survey data into a local-, state-, or national-level intra-action report. The intra-action report tool is an innovation in public health emergency management that aims to identify both successes (that may be replicated) and failures (that should be addressed) to inform further actions during ongoing response and recovery. 2 This tool is especially relevant to public health emergencies involving a relatively long-term response effort, because responses to protracted emergencies may have to be altered midresponse to fit the needs and challenges of the unfolding situation. Information gleaned from surveys conducted through health professional networks during such emergencies can support timely reshaping of an informed emergency response. Leveraging such survey-based tools to understand and proactively disseminate best practices may improve patient outcomes and better protect health care providers.
This article describes the methodology used and the results of these 3 ACEP surveys as an illustrative case to demonstrate how timely, survey-based information networks can rapidly collect data on challenges and best practices from health care providers and systems across the nation. Such data can then be used to quickly inform preparedness or response efforts for public health emergencies, disasters, and other incidents that may require real-time adjustments to health care delivery.
Methods
Selection of Participants and Data Collection
ACEP developed and disseminated 3 electronic surveys. The survey questions were selected by the ACEP Ebola Expert Panel (which consisted of infectious disease and disaster management experts). Some questions were recommended by individuals from the US Department of Health and Human Services, Office of the Assistant Secretary for Preparedness and Response. The 3 surveys were informally piloted in a small sample of people before distribution; those pilots indicated that the surveys could be completed in less than 10 minutes. ACEP members received an e-mail with a link to a website to complete the survey. The first, purely qualitative, survey was sent to 700 ACEP emergency medical services and disaster preparedness section members on October 15 with a 2-day response window. The second survey, consisting of both closed-ended (quantitative) and open-ended (qualitative) questions, was sent on October 18 to the 1200 members of ACEP’s Emergency Medicine Practice Research Network (EMPRN), a practice-based network designed to capture information from emergency practitioners to improve clinical experience and outcomes. 3 The third survey, also including both closed- and open-ended questions, was sent to all ACEP members (21,981) on November 12. The second and third surveys did not specify a response close date.
Description of the Data and Analytic Approach
The qualitative data in each survey were independently analyzed by 3 authors (MM, MC, MA). These independent analyses were then synthesized to create a unified set of themes.
We used the STATA statistical package (StataCorp, College Station, TX) for all quantitative analyses. Outputs from the second and third surveys included frequency distributions for each question and crosstabs to examine relationships between outcomes (dependent variables) and potential contributing factors (independent variables). The third survey also included logistic regression analysis of factors associated with the perceived ability to screen and admit EVD patients.
The second survey included one outcome of interest—whether the respondent’s hospital was able to admit EVD patients. Independent variables examined included preparedness planning as well as hospital infrastructure elements, procedures, and training. The third survey included data on several preparedness-related hospital characteristics that served as outcomes of interest, each examined in relation to hospital size and geography: Midwest, Mid-Atlantic, West, Southeast, New England, South Central, Rocky Mountain West, and North Central. It also included an outcome similar to that examined in the second survey: whether the hospital was prepared to screen and admit an EVD patient rather than screen and transfer.
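The frequency distributions and crosstabs described above can be sketched as follows. This is a minimal illustration in Python (pandas/scipy) standing in for the Stata analysis the authors actually ran; the records, variable names, and counts are entirely hypothetical.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical survey records; variable names are illustrative, not ACEP's.
df = pd.DataFrame({
    "can_admit_evd": ["yes", "yes", "no", "no", "yes", "no", "yes", "no"],
    "ppe_training":  ["yes", "yes", "no", "no", "yes", "no", "no", "no"],
})

# Frequency distribution for one question
freq = df["can_admit_evd"].value_counts()

# Crosstab of the outcome (dependent variable) against one
# preparedness factor (independent variable)
xtab = pd.crosstab(df["ppe_training"], df["can_admit_evd"])

# Chi-square test of association between the two variables
chi2, p, dof, expected = chi2_contingency(xtab)
print(xtab)
print(f"chi2={chi2:.3f}, p={p:.3f}")
```

The same pattern, repeated per question and per factor, yields the tables reported in the Results.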
Results
First Survey: October 15, 2014
The first survey aimed to inform messages for public audiences and policy makers. This survey garnered 19 responses. The respondents noted the great responsibilities emergency physicians bear as frontline responders for EVD case identification, safe care, screening, transport, lab diagnosis, and isolation. A sample response indicated how they also hold themselves to a high standard: “For Ebola care, as with all other things, it is likely that practice makes perfect—and that only perfect is good enough.”
Three themes arose from this survey. First, limited resources such as funds, personal protective equipment (PPE), personnel, and expert guidance remained an important issue. Most felt that their hospital was reasonably prepared for Ebola, but that important preparedness gaps remained. Second, respondents expressed frustration with the high frequency of updates to guidance from the Centers for Disease Control and Prevention (CDC) and others. Third, respondents strongly supported regionalization of Ebola care to improve efficiency and allow individual hospitals to tailor their level of preparedness to their role within a broader context of coordinated facilities, with only some being designated to provide EVD care. However, several respondents noted that transferring patients to regional centers would work only if the number of cases remained small; such centers could become overwhelmed if case numbers surged.
A few respondents noted learning from past successful experiences with SARS, H1N1 influenza, and measles and applying these experiences to Ebola. They called for further learning from successes in Ebola care in the United States and Africa.
Second Survey: October 18, 2014
ACEP received 231 responses from EMPRN members (response rate, 19.3%); 63% of these responses were received by October 21, within 72 hours after the survey e-mail was sent.
Quantitative Results
More than half of the respondents (132; 57%) indicated that their hospital could admit a suspected EVD patient; the others felt their hospital could not (46; 20%) or were unsure (53; 23%). When asked who had provided the information or resources necessary to care for potential EVD patients, more respondents agreed or strongly agreed that their home institution had done so (162; 70%) than the federal government (140; 61%) or state health department (128; 55%). However, most were confident that state, regional, or local resources were available to assist in caring for EVD patients (150; 65%).
Some aspects of preparedness were stronger than others in the respondents’ hospitals. Nearly all respondents indicated that triage and assessment for potential EVD always or usually includes a travel history (218; 94%). One-third (81; 35%) noted that their hospital had conducted an Ebola drill. About two-thirds indicated that their hospital was either somewhat prepared (115; 50%) or very prepared (37; 16%) to handle EVD patient waste; a similar proportion (159; 69%) indicated that their hospital had a decontamination room. However, only about half (111; 48%) indicated that their hospital staff had received practical PPE training, and a similar number (116; 50%) indicated that their hospital required a monitor for donning and doffing PPE (ie, an observer to ensure appropriate donning and doffing); a PPE monitor was recommended in the hospital of an additional 41 (18%). Only about half of respondents (104; 45%) indicated that their hospital’s laboratory had communicated with the ED about procedures for handling specimens from EVD patients. Furthermore, only one-fourth (68; 29%) had a plan to manage EVD if their hospital’s capacity was exceeded.
We examined factors that might contribute to a hospital’s readiness to admit EVD patients. The consistency of taking a travel history and the availability of most individual PPE items were not statistically associated with admission ability; however, the availability of 3 specific PPE items did reach statistical significance: hood (P=0.0085, one-tailed Fisher’s exact test [FE]), boots with leg coverings (P=0.0171, FE), and booties (P=0.0492, FE). Nonmaterial dimensions of hospital preparedness were even more important (Table 1). Factors that were statistically significantly associated with a hospital’s ability to admit EVD patients included hospitals where:
- Staff had received practical PPE training and a monitor for PPE donning and doffing was required;
- An Ebola drill had been conducted;
- The laboratory had communicated with the ED on how to handle EVD specimens;
- There were a greater number of isolation rooms or the ability to isolate a greater number of EVD patients;
- There was a decontamination room;
- EVD medical waste could be handled;
- The hospital had a plan for managing EVD patients if its own capacity was exceeded.
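The one-tailed Fisher’s exact tests reported above can be sketched as follows, again in Python (scipy) rather than the Stata the authors used. The 2×2 counts are purely illustrative, not the survey’s actual data.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = PPE item available (yes/no),
# columns = hospital able to admit an EVD patient (yes/no).
# Counts are illustrative only.
table = [[40, 20],   # item available: 40 can admit, 20 cannot
         [15, 30]]   # item not available: 15 can admit, 30 cannot

# One-tailed test, as in the survey analysis: does item availability
# increase the odds of being able to admit an EVD patient?
odds_ratio, p_one_tailed = fisher_exact(table, alternative="greater")
print(f"OR={odds_ratio:.2f}, one-tailed p={p_one_tailed:.4f}")
```

Fisher’s exact test is the appropriate choice here because several of the survey’s 2×2 cells had small counts, where the chi-square approximation is unreliable.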
Table 1 Survey 2: Ability to Admit EVD Patient, by PPE Training and Other Preparedness^a
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary-alt:20160921123103-98620-mediumThumb-S1935789316000938_tab1.jpg?pub-status=live)
a Abbreviations: EVD, Ebola virus disease; PPE, personal protective equipment.
We also examined whether the ability of a hospital to admit EVD patients was associated with the degree of information and other support received or expected from various sources (Table 2). The most strongly associated sources of information and resource support were the home institution (P=0.017) and the state health department (P=0.017); support from the federal government was not associated with the ability to admit EVD patients, nor was the expectation of assistance from state, regional, or local sources to care for EVD patients.
Table 2 Survey 2: Ability to Admit EVD Patient, by Information and Resources Available or Provided^a
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary-alt:20160921123103-82396-mediumThumb-S1935789316000938_tab2.jpg?pub-status=live)
a Abbreviation: EVD, Ebola virus disease.
Qualitative Results
Respondents were asked what education or assistance is needed from ACEP. Many reported being inundated with Ebola information from many sources. Several suggested that ACEP serve as a central hub providing concise, current information and educational and training materials on best practices related to PPE donning and doffing, screening, isolation, preventing and dealing with contamination of the waiting room and ED, clinical management, and standards of care. They suggested channels for disseminating such information, such as a website, a 24-hour hotline, or regular social media updates. Another recommendation was for ACEP to serve as an advocate for frontline emergency physicians with local, state, and federal government entities. Respondents indicated that ACEP should work with other entities to allay fear and hysteria in public communications.
Respondents were also asked what education or assistance is needed from government sources. Many cited confusing, overlapping, frequently changing, and at times contradictory guidance from government sources, as well as a need for government to provide consistent, trustworthy, actionable guidance.
Respondents also called for more resources from government, including funding for PPE, isolation, and quarantine equipment, as well as CDC rapid response teams that could be deployed to hospitals to guide EVD care. Some urged government action to designate and certify Ebola treatment centers and to establish a 24-hour information hotline. A few respondents suggested learning from the positive experiences of the Nebraska Ebola treatment center and Doctors Without Borders.
Third Survey: November 12, 2014
This survey was sent to all 21,981 ACEP members. In response, 9510 (43%) opened the e-mail, 1345 (6.1%) clicked on the survey link, and 1147 (5.2%) submitted a completed survey by November 18. A total of 91.2% of the responses were received within 72 hours of the e-mail message. To focus on US preparedness, we excluded the 33 responses from outside the United States, leaving 1114 responses for analysis.
Quantitative Results
About half of the respondents (514; 46%) were from large hospitals with an annual ED census of at least 60,000, and about half (481; 43%) were from the Mid-Atlantic or Midwest region. Roughly half (485; 44%) indicated that their hospital could screen and admit EVD patients (Table 3). About three-fourths (869; 78%) indicated that their hospital had “adequate PPE for a potential EVD case,” but only about half (617; 55%) indicated that they had “enough PPE for a proven EVD case”; more respondents were unsure of the latter (265; 24%) than of the former (106; 10%). Respondents felt that fewer physicians had adequate practical PPE training (611; 55%) than ED nurses and technicians (664; 60%), but these emergency physician respondents were more unsure about the level of training of nurses and technicians (179; 16%) than of physicians (96; 9%). Two-thirds of respondents (766; 69%) indicated that their hospital had screened no suspected EVD cases; most of the rest (335; 30%) had screened 1 to 10 suspected cases, and very few (13; 1%) had screened more than 10.
Table 3 Survey 3: Frequency Distributions of Selected Variables^a
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary-alt:20160921123103-03310-mediumThumb-S1935789316000938_tab3.jpg?pub-status=live)
a Abbreviations: ED, emergency department; EVD, Ebola virus disease; PPE, personal protective equipment.
Some outcomes varied significantly by hospital size. Larger hospitals were significantly more likely to have a plan for managing EVD in the ED, to have adequate PPE, and to have screened at least one patient for EVD (Table 4). At larger hospitals, physicians were significantly more likely to have had PPE training compared to smaller hospitals; training of ED nurses and technicians did not vary by hospital size. At the smallest hospitals, Ebola preparedness was led significantly more frequently by safety management and administrative personnel and less frequently by infectious disease physicians and nurses.
Table 4 Survey 3: Relationships Between Hospital Size and Outcomes of Interest^a
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary-alt:20160921123103-70077-mediumThumb-S1935789316000938_tab4.jpg?pub-status=live)
a Abbreviations: ED, emergency department; EVD, Ebola virus disease; PPE, personal protective equipment.
b P value: chi-square test.
c Number of patients screened: “None” versus one or more
Most elements of Ebola preparedness varied significantly by region (Table 5). These included having enough PPE (highest in Midwest and Rocky Mountain West regions); having physicians, nurses, and technicians who received PPE training (highest in Rocky Mountain West region for both groups); and having screened at least one patient for EVD (highest in North Central and New England regions).
Table 5 Survey 3: Relationships Between Outcomes of Interest and Region^a
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary-alt:20160921123103-08340-mediumThumb-S1935789316000938_tab5.jpg?pub-status=live)
a Abbreviations: ED, emergency department; EVD, Ebola virus disease; PPE, personal protective equipment.
b P value: chi-square test.
c Excludes “unsure” responses.
d “None” versus “1-10” versus “11+.”
e Number of patients screened: “none” versus one or more.
We first used bivariate analysis to assess factors potentially associated with a hospital’s ability to screen and admit an EVD patient rather than screen and transfer. Hospitals that were more likely to be able to admit an EVD patient were larger, had adequate PPE, had physicians or nurses who had received PPE training, had already screened at least one patient for EVD, or had Ebola preparedness efforts led by infectious disease physicians and nurses (Table 6).
Table 6 Survey 3: Factors Potentially Associated with Ability to Screen and Admit an EVD Patient^a
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary-alt:20160921123103-52366-mediumThumb-S1935789316000938_tab6.jpg?pub-status=live)
a Abbreviations: ED, emergency department; EVD, Ebola virus disease; PPE, personal protective equipment.
b P value: chi-square test.
c Excludes “unsure” response.
d Excludes “still being decided” response.
e Yes/no by admit or transfer.
f None versus one or more by admit or transfer.
To gain further insight, we used logistic regression to evaluate the relationship between each factor from Table 6 and a hospital’s plan to either screen and admit or screen and transfer EVD patients. Through sensitivity analysis and forward stepwise model fitting, we determined that only 3 factors had independent, statistically significant relationships to the dependent variable at a standard threshold of 0.05: (1) annual ED census, (2) the number of potential EVD patients the hospital had screened, and (3) whether infectious disease physicians and nurses led the preparedness efforts. Each factor had an independent positive relationship with a hospital’s self-determined capacity to admit EVD patients. The odds that a hospital was capable of screening and admitting EVD patients increased by a factor of 1.57 with each step up in ED census category and by a factor of 2.4 with each increase in the hospital’s experience screening EVD patients; the odds increased by a factor of 1.64 when infectious disease physicians and nurses led preparedness efforts.
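Because odds ratios from a logistic model act multiplicatively on the odds scale, the three reported factors can be combined numerically. The sketch below illustrates that interpretation; the baseline odds of 1:1 and the chosen category steps are hypothetical, and only the three odds ratios come from the text.

```python
# Adjusted odds ratios reported from the stepwise logistic model
OR_CENSUS = 1.57     # per step up in ED census category
OR_SCREENED = 2.40   # per step up in EVD screening experience
OR_ID_LED = 1.64     # preparedness led by infectious disease staff

# Hypothetical baseline odds of 1:1 (probability 0.5) of being able
# to admit. A hospital two census categories larger, with screening
# experience one category higher, and ID-led preparedness has odds:
odds = 1.0 * OR_CENSUS**2 * OR_SCREENED * OR_ID_LED

# Convert odds back to a probability
prob = odds / (1 + odds)
print(f"odds={odds:.2f}, probability={prob:.2f}")
```

Under these illustrative assumptions, the combined odds are roughly 9.7:1, showing how the three modest odds ratios compound into a substantially higher predicted capacity to admit.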
Qualitative Results
The qualitative data in this third survey derived from one open-ended question that asked respondents for anecdotes in dealing with Ebola. The largest number of comments related to PPE availability and training. Most hospitals had either trained their staff in PPE use, including practical training in donning and doffing, or were in the process. However, many questioned the adequacy of the PPE and felt that there was not enough post-training practice to ensure that skills for PPE remained current.
Several respondents commented on their hospital’s overall level of preparedness. Some described efforts to prepare for Ebola, such as establishing patient screening and isolation areas in underutilized hospital spaces. Ebola preparedness—especially PPE availability and training, hospital drills, and “false alarms” of patients who were ruled out—increased staff confidence in some hospitals. Moreover, the drills helped to identify preparedness gaps, such as guidance for laboratory specimen handling. Some respondents perceived a gap between their hospital’s ability to plan and to execute, commenting that their training did not adequately prepare them to manage Ebola.
Respondents tended to support a collaborative regional approach to EVD management. Well-exercised hospital plans were important, but the quality and level of detail in hospital plans seemed to vary widely. Moreover, in areas that took a regional approach to Ebola management, the burden for planning was shared among facilities and the process was cooperative; respondents from such areas expressed more comfort with their plans.
Many respondents commented on hospital leadership in Ebola preparedness, including both who the leaders were (or should be) and their effectiveness. Respondents commented most favorably on multidisciplinary leadership schemes, especially those that include infectious disease and ED physicians and engaged frontline practitioners. Administrative and corporate leadership schemes received the most negative comments.
Many respondents voiced concerns about the burden that Ebola preparedness was placing on their EDs. One characterized the burden as “crippling” when even one patient is admitted to the ED to rule out EVD; the 72-hour rule-out can close down part or all of the ED, take clinicians “off-line” for quarantine, and disrupt hospital services. Others noted that being designated as an Ebola treatment center stigmatizes the hospital and can reduce patient volume.
Many respondents commented on ethical issues. They expressed concerns about the fairness to both the suspected Ebola patients and others in the ED and whether some hospital policies violated the Emergency Medical Treatment and Labor Act (EMTALA). In one instance, the hospital administration had directed the hospital to refuse treatment to any suspected Ebola patients.
Several respondents suggested potential roles for ACEP to improve preparedness. For example, ACEP might explore alternative approaches to reduce ED burden while maintaining the norms of safety, medical ethics, and legality.
Discussion
The 3 ACEP surveys reflected the current situation and perspectives from across the country when domestic concerns about Ebola were high. Although survey response rates were low, the number of responses to the second and third surveys was sufficient for statistical analysis, which showed significant associations between selected hospital characteristics and perceived ability to admit EVD patients. However, our main intention was more to illustrate, through empiric example, the potential value of such surveys than to present the results as a definitive basis for policy change.
Because the questions in each survey differed, the ability to use the surveys to assess evolution in preparedness over time was limited. For example, the second and third surveys asked about the availability of PPE in different ways. The responses were somewhat comparable: availability of specific PPE items in the October 18 survey ranged from 56% to 67%, and in the November 12 survey 55% of respondents felt the facility had “enough PPE for proven EVD” and 78% had “adequate PPE for potential EVD.” Approximately 55% of physicians in both surveys had received practical PPE training that included donning and doffing. No other questions were comparable across surveys.
Although the second and third surveys were not issued with a response deadline, the majority of responses were received within 72 hours after distribution (63% for survey 2, 91% for survey 3). This may indicate the feasibility of quick-turnaround surveys to collect near-real-time data. Furthermore, although the response rates were low, they were higher than those of routinely conducted ACEP surveys. For example, in the case of the EMPRN survey, there was a 6% click rate (compared to a typical rate of <2%), a 43% open rate (compared to a typical rate of 25-30%), and a 5.6% response rate (compared to a typical rate of <2%). This suggests that ACEP members may be willing to participate in surveys distributed in times of emergency and presents an opportunity to evaluate measures that could further improve response rates when health professional networks are used to collect near-real-time data.
The ACEP Ebola survey experience points to opportunities for improving future surveys, such as (1) using the same questions for serial surveys to assess evolution over time if indicated, (2) focusing on questions that are most relevant and “analyzable,” (3) identifying and applying strategies to improve response rates, and (4) distributing survey results to network members and other stakeholders in a timely manner to inform ongoing practice. Another opportunity relates to harnessing the survey information into a larger emergency management context, for example, by incorporating serial survey data into local, state, or national intra-action reports that can be used to identify and act to potentially replicate successes and address failures. Timely survey data collection and analysis could provide current situational awareness and inform policy or practice recommendations. This would allow for health care workers and facilities to adjust practices in real time to reflect ongoing emergency response.
Early experiences with using new media to collect timely, valid data after disasters are encouraging. For example, Georgia’s emergency response to 125,000 evacuees from Hurricane Katrina successfully used the internet-based State Electronic Notifiable Disease Surveillance System to collect clinical data on evacuees from health care providers documenting health encounters at multiple locations across the state. 4 More recently, creating online surveys has been simplified through widely available technology that allows nonexperts to conduct surveys. For example, platforms such as SurveyMonkey (Palo Alto, CA) and Qualtrics (Provo, UT) may be used with electronic mailing lists, with the ability to authenticate the recipient as the respondent, follow up with them, and de-identify the data received. Furthermore, social media outlets may be used to conduct surveys of both health professional and lay communities during emergencies.
There are several limitations to these surveys. First, the respondents did not represent a random sample: the first 2 surveys targeted subsets of ACEP members and the third was sent to all members. The selective targeting and low response rates may have introduced bias and diminished how representative the findings are of the overall state of Ebola preparedness. Nonresponse bias is a further limitation; given the time-sensitive nature of these surveys, follow-up with nonrespondents was not attempted. The low response rates also limit the generalizability of the findings. We attribute the low response rates to the short time frame allowed for responding (in the case of the first survey) and the lack of time to follow up with nonresponders (all surveys). As noted, this article’s purpose was primarily to highlight ACEP’s model and the type of valuable data and insights that such surveys can yield; as such, the low response rates in these particular surveys are relatively less important. Nevertheless, strategies to improve response rates under time pressure should be identified to increase the value of surveys conducted during or after future incidents. Second, it is not possible to determine whether multiple individuals responded from the same facility; if there were multiple respondents from the same facility, we could not assess interrater reliability. Third, because each survey asked different questions, the ability to compare serial surveys and assess improvements in preparedness over time was limited. Finally, the term adequate in the survey questionnaire was not defined a priori and was left to the respondents’ interpretation. While qualitative responses to open-ended questions provide valuable information, they do not necessarily lend themselves to precision or standardization of terms.
Notwithstanding the limitations of the reported surveys, we believe that such surveys can provide enormous value to both the medical community and emergency management officials. ACEP is a respected professional association that is relevant to local-, state-, and national-level emergency management policy and practice because its members are on the front lines of emergency medicine. ACEP has demonstrated its initiative and ability to carry out real-time member surveys that take the pulse of hospital preparedness across the country during a public health emergency. As such, it serves as a strong illustrative example for other professional associations that might also carry out member surveys assessing readiness, needs, and best practices. The results of these surveys were shared with the US Department of Health and Human Services (all-member survey results were shared on December 2) and CDC, and informed EVD-related protocols and messages distributed to the health care community. Full use of such surveys during and after future incidents should include timely analysis and dissemination to all relevant stakeholders, beginning with the professional network members.
In conclusion, the ACEP surveys demonstrated how a professional association can use its network of members to collect critical survey data in real time regarding public health emergency preparedness and response. Surveys conducted as public health emergencies unfold can inform local, state, and national emergency management and sharing of best practices.
Acknowledgment
The authors thank the ACEP Ebola Expert Panel for their contributions to the surveys.
Funding
The efforts of RAND-affiliated authors on this project were funded by internal funds from the RAND Corporation.
Author Contributions
Mahshid Abir: Conceptualizing the premise behind the manuscript, analyzing qualitative data, writing the manuscript. Melinda Moore: Quantitative and qualitative data analysis, writing the manuscript. Margaret Chamberlin: Analyzing quantitative and qualitative data, preparing manuscript tables, reviewing/editing the manuscript. Kristi L. Koenig: Conceptualizing the ACEP Ebola surveys, reviewing/editing the manuscript. Jon Mark Hirshon: Reviewing/editing the manuscript. Cynthia Singh: Conceptualizing the ACEP Ebola surveys, reviewing/editing the manuscript. Sandra Schneider: Contributing to survey content, reviewing/editing the manuscript. Stephen Cantrill: Conceptualizing and contributing to survey content, reviewing/editing the manuscript.