
ILI-Related School Dismissal Monitoring System: An Overview and Assessment

Published online by Cambridge University Press:  08 April 2013


Abstract

Objective: This report provides an overview and assessment of the School Dismissal Monitoring System (SDMS) that was developed by the Centers for Disease Control and Prevention (CDC) and the US Department of Education (ED) to monitor influenza-like illness (ILI)-related school dismissals during the 2009-2010 school year in the United States.

Methods: SDMS was developed with considerable consultation with CDC's and ED's partners. Further, each state appointed a single school dismissal monitoring contact, even if that state also had its own school-dismissal monitoring system in place. The SDMS received data from three sources: (1) direct reports submitted through CDC's Web site, (2) state monitoring systems, and (3) media scans and online searches. All cases identified through any of the three data sources were verified.

Results: Between August 3, 2009, and December 18, 2009, a total of 812 dismissal events (ie, a single school dismissal or dismissal of all schools in a district) were reported in the United States. These dismissal events had an impact on 1947 schools, approximately 623 616 students, and 40 521 teachers.

Conclusions: The SDMS yielded real-time, national summary data that were used widely throughout the US government for situational awareness, to assess the impact of CDC guidance and community mitigation efforts, and to inform the development of guidance, resources, and tools for schools.

(Disaster Med Public Health Preparedness. 2012;6:104-112)

Type: From the Field

Copyright © Society for Disaster Medicine and Public Health, Inc. 2012

In mid-April 2009, a novel influenza A virus was first identified in Mexico and the United States. As the 2009 H1N1 influenza A virus (H1N1) outbreaks began to occur in multiple locations in the United States, the federal government mounted an aggressive response while concurrently beginning vaccine development. On April 28, 2009, the Centers for Disease Control and Prevention (CDC) released guidance suggesting the use of selected community- and individual-level nonpharmaceutical measures to slow the spread of the disease in communities and protect those at increased risk for complications. This CDC guidance included a recommendation to dismiss affected schools to help stem transmission among school-aged children and the larger community. With limited epidemiologic data and anecdotal reports of severe disease in Mexico, CDC updated the guidance for schools on May 1, 2009, to suggest that communities with laboratory-confirmed cases of H1N1 consider extended (eg, two-week) school dismissals. As more was rapidly learned about the virus, which suggested that it was less virulent than initially assumed, CDC revised the guidance on May 5, 2009, advising communities to keep most schools open while reiterating that sick students and staff stay home, and stressing continued enhanced personal protective behaviors such as frequent hand washing. It is important to note that the decision to dismiss schools in the United States is typically made at the local level. The federal government has no authority to dismiss schools.

OBJECTIVE

This report provides an overview and assessment of the School Dismissal Monitoring System (SDMS) that was developed by CDC and the US Department of Education (ED) to monitor influenza-like illness (ILI)-related school dismissals (public and nonpublic) during the 2009-2010 school year in the United States. SDMS was a key component of the response by CDC and ED to the 2009 pandemic influenza A (H1N1) virus in the United States.

School Dismissal Monitoring During Spring 2009

Development of SDMS was motivated in part by the school dismissal monitoring activities that ED and CDC conducted during spring 2009. These monitoring activities were undertaken reactively when local decisions were made to dismiss schools in response to the 2009 influenza A (H1N1) virus pandemic and in response to requests for data about school dismissals from throughout the US government's emergency management structure and the media. No formal national system was in place to monitor dismissals at that time, and neither ED nor CDC had official authority to require reporting of school dismissals.

Consequently, beginning on April 27, 2009, two staff members detailed temporarily to this task (one at CDC and one at ED) began monitoring school dismissals using three data sources: (1) online searches for news articles and broadcasts reporting school dismissals, (2) state and local health and education agency Web sites listing school dismissals, and (3) direct reports from schools and school districts. The direct reports were obtained from the American Association of School Administrators (AASA). At CDC's request, AASA posted a form on its Web site on May 1, 2009, for schools and school districts to report school dismissals to AASA. These reports were then forwarded to CDC. The Office of Safe and Drug-Free Schools (OSDFS) at ED also posted a link on its “Lead and Manage My School, Emergency Planning” Web site for schools and school districts to directly report school dismissals to ED. The direct reports represented less than 1% of all reported dismissals, but they demonstrated that public and nonpublic schools were willing to voluntarily report school dismissals to an outside organization or agency.

An attempt was made to verify every dismissal by reviewing health and education agency Web sites to look for official announcements, “dear parent” letters, or other information verifying the decision to dismiss students or by contacting the local health or education agency directly if the online approach did not yield sufficient information. Each school day, ED compiled the results of the searching and verification processes into a spreadsheet, which provided the names of the schools and districts that had initiated a dismissal of all students on that day or that had an ongoing dismissal from a previous day; the type of school (public or nonpublic); the beginning and projected ending dates of the dismissal; and the numbers of students and teachers that were affected. ED also compiled a second spreadsheet that provided the names of the schools and districts that re-opened each day, the type of school, and the numbers of students and teachers affected by the re-opening. The data on numbers of students and teachers affected were based on ED's National Center for Education Statistics (NCES) Common Core of Data public school district data for the 2006-2007 school year and the NCES Private School Universe Survey data for the 2007-2008 school year. On behalf of the CDC and ED collaboration, ED distributed the detailed daily spreadsheets widely at CDC, ED, the Department of Health and Human Services (DHHS), and the Department of Homeland Security (DHS) and shared national summary data only (total numbers of schools dismissed, total numbers of students and teachers affected, and a list of the affected states) with the media each school day. The names of schools and school districts were not shared outside the US government.

Monitoring of school dismissals in this manner continued through June 12, 2009, when most of the schools in the nation had closed for summer break. At that time, on behalf of the CDC and ED collaboration, CDC constructed a master computer database (using the daily spreadsheets) of all school dismissals that were detected between April 27, 2009, and June 12, 2009, and began an analysis of the quality of the data that had been collected and reported, using retrospective reviews of news articles and broadcasts and state and local health and education agency Web sites describing school dismissals. This analysis indicated that during the first two weeks of monitoring (when 88% of the total school dismissal days occurred), from 2% to 26% of the total number of school dismissals per day were not reported on the first day they occurred. While schools and school districts that were not detected on a timely basis (ie, the first day they dismissed students) were added to the master database as they were discovered, the amount of reporting delay was a cause for concern. Further, the detection and verification process did not sufficiently involve CDC or ED partners and relied almost entirely on passive, secondary sources of data (ie, news articles and broadcasts). Finally, the process depended almost entirely on just two persons and required an extraordinary amount of work that was error-prone and not sustainable for such a large-scale public health emergency affecting communities across the nation.

Nonetheless, the school dismissal monitoring process developed during spring 2009 effectively provided important information used for daily situational awareness throughout the US government, for the development of guidance on the use of selected community- and individual-level nonpharmaceutical measures, and for further research. Analysis of the master database indicated that from April 27, 2009, through June 12, 2009, at least 1351 schools (1% of schools nationwide) from 34 states plus the District of Columbia dismissed all students for at least one day as a result of the H1N1 outbreak. The largest number of school dismissals occurred on May 5, 2009, when 980 schools and approximately 607 778 students were affected. From May 11 through May 22, 2009, fewer than 100 schools per day were affected, and from May 26 through June 12, 2009, fewer than 40 schools per day were affected. Of all school dismissal days, 59% occurred in Texas, followed by 8% in New York. The mean length of the dismissal events was 3.8 days (range 1 to 9 days) per school. A total of at least 5137 days of school were cancelled. At least 824 966 students and 53 217 teachers were affected by school dismissals. A total of at least 3 170 061 student-days of school were lost, at least in part due to the outbreak during spring 2009. Additional analyses of these data are ongoing.

METHODS

Development of the School Dismissal Monitoring System (SDMS)

Due to the limitations of the school dismissal monitoring activity used during spring 2009 and because of the expected reemergence of 2009 pandemic influenza A (H1N1) virus during fall 2009 and winter 2010, CDC and ED decided to develop a School Dismissal Monitoring System (SDMS) during the summer of 2009 in preparation for the 2009-2010 school year. The Division of Adolescent and School Health (DASH), National Center for Chronic Disease Prevention and Health Promotion, was given primary responsibility for developing the SDMS.

Development of SDMS was guided by two goals: (1) to minimize reporting burden on schools and local and state health and education agencies and (2) to increase accuracy and timeliness of data reported by CDC and ED. From the outset, the new system was to be characterized by voluntary and redundant reporting from schools, school districts, and local health agencies supplemented with media reports of school dismissals and a focus only on flu-related exceptions to the official school calendar. Further, CDC and ED were committed to implementing SDMS in full partnership with key national health and education organizations and state health and education agencies. Development of SDMS began in June 2009. The system was activated August 3, 2009, when the first schools in the nation opened for the 2009-2010 school year.

Partnerships

SDMS was formed from a partnership between CDC and ED and was developed with considerable consultation with CDC's and ED's partners, especially the Association of State and Territorial Health Officials (ASTHO), the Council of State and Territorial Epidemiologists (CSTE), and the National Association of County and City Health Officials (NACCHO). In addition, in June 2009, CDC and ED met with representatives from 16 national educational organizations including AASA, the National Association of Independent Schools (NAIS), the National Association of School Nurses (NASN), and the National School Boards Association (NSBA), which represented important leadership in schools nationwide. The national organizations stressed the importance of fully involving state and local health and education agencies during both the development and implementation of the system, making sure that the SDMS (1) had no information about dismissals that the state did not also have, (2) collaborated closely with and supported states and communities that already had their own monitoring system in place or that wished to develop one, and (3) restricted release of the names of schools and districts dismissed daily to the state agencies and officials within the US government (primarily at CDC, ED, DHHS, and DHS) with a need to know. In addition, they reached out to their membership to test the feasibility of various data collection and reporting strategies and provided very helpful feedback. The support of ASTHO, CSTE, and NACCHO, as well as the educational organizations, was critical to the success of SDMS and to the active participation of state and local health and education agencies nationwide.

Each of the four primary education organizations and NACCHO prominently displayed information about reporting school dismissals on their Web sites and communicated directly (through webinars, newsletters, and conference sessions) with their membership about the importance of doing so. CDC provided a “button” to help brand and highlight SDMS that linked directly to the direct reporting form on CDC's Web site.

State Dismissal Monitoring Contacts

To formally establish a partnership with every state health and education agency, a letter from CDC and ED was sent on July 30, 2009, to all 50 state health officers and chief state school officers. The letter to the state health officers was disseminated by ASTHO and the letter to the state school officers was disseminated by ED. Each state was asked to appoint a single school dismissal monitoring contact in either the health or education agency, even if that state already had its own school dismissal monitoring system in place. Eighteen states appointed a contact from the state education agency, and the remaining 33 states (including the District of Columbia) appointed a contact from the state health agency. In addition, the Bureau of Indian Education (BIE) provided a school dismissal monitoring contact. While duties varied from state to state, the school dismissal monitoring contacts were responsible for administering state-based systems where they existed; encouraging schools, districts, and local health agencies to report school dismissals; receiving direct reports of school dismissals from schools, school districts, and local health agencies in their state; helping to verify reports of school dismissals; distributing school dismissal reports to others in their agency and other agencies; and serving as the first point of contact for CDC for any school dismissal monitoring issue in their state. The school dismissal monitoring contacts proved to be a critical component of SDMS. Many interacted daily with CDC and contributed significantly to timely and accurate reporting of school dismissal data.

Case Definitions

A school dismissal was defined for SDMS as follows (matching the definition used during spring 2009):

  • Any instance of a public or nonpublic school or a public school district with any of grades K-12 that dismissed all students (but not staff) for one or more days, OR

  • Any instance when an entire school building or all school buildings in a district were completely closed to all students and staff,

  AND in response to any of the following:

    • confirmed or suspected case(s) of 2009 pandemic influenza A (H1N1) virus infection, OR

    • an unusually high number of student or teacher absences due to confirmed or suspected influenza-like illness (ILI) that interfered with the school's ability to function, OR

    • community or administrative interest in cleaning and sanitizing school facilities regardless of the presence of a confirmed or suspected case of 2009 pandemic influenza A (H1N1) virus infection or confirmed or suspected ILI among students and staff, OR

    • any other aspect of a community's response to the 2009 pandemic influenza A (H1N1) virus or confirmed or suspected ILI.

A dismissal event was defined as either a single school dismissal or dismissal of all schools in a district. If only some schools in a district dismissed students, then each of those schools was considered a separate dismissal event. Because a district-level dismissal event could represent one or more schools, the number of dismissed schools was higher than the number of dismissal events.
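The distinction between dismissal events and dismissed schools can be illustrated with a short sketch. The data structures below are hypothetical and are not taken from the SDMS application; they simply show how a district-level event expands to all of its schools, so that the school count exceeds the event count.

```python
# Minimal sketch (hypothetical data structures) of how a district-level
# dismissal event expands to multiple schools, so the count of dismissed
# schools exceeds the count of dismissal events.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class DismissalEvent:
    level: str              # "school" or "district"
    school_ids: List[str]   # one ID for a school event, all schools for a district event


def tally(events: List[DismissalEvent]) -> Tuple[int, int]:
    """Return (number of dismissal events, number of dismissed schools)."""
    n_events = len(events)
    n_schools = sum(len(e.school_ids) for e in events)
    return n_events, n_schools


# Example: one district-wide event (4 schools) plus two single-school events
events = [
    DismissalEvent("district", ["s1", "s2", "s3", "s4"]),
    DismissalEvent("school", ["s5"]),
    DismissalEvent("school", ["s6"]),
]
print(tally(events))  # (3, 6): 3 dismissal events but 6 dismissed schools
```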

Data Sources

SDMS received data from three sources: (1) direct reports, (2) state monitoring systems, and (3) media scans and online searches. In many cases, both a direct report and a media report were obtained for the same dismissal. This redundancy was purposefully built into the system and helped with the data verification process.

Direct reports were submitted through CDC's Web site at www.cdc.gov/FluSchoolDismissal. All direct reports used a common reporting form that required only the name and zip code of the school or school district and the dates of the dismissal. Contact information also could be provided. In response to requests by health and education officials to reduce reporting burden as much as possible and CDC's and ED's interest in increasing willingness to report as much as possible, the amount of information required on the form was kept to the absolute minimum needed to positively identify the school. The common reporting form could be submitted online, by far the most common mode of submission, or via e-mail or fax. SDMS was designed so that the online and e-mail submissions were automatically forwarded simultaneously to both CDC and the appropriate school dismissal monitoring contact, using the zip code of the school or school district as an indicator of the appropriate state school dismissal monitoring contact. This process was established in recognition of the need to keep states informed of all school dismissals directly reported to the SDMS. Because it occurred automatically and required no staff intervention, forwarding could occur at any time or day, whenever a report of a dismissal was directly submitted. Faxed reports, however, required forwarding by CDC staff to the school dismissal monitoring contact. This forwarding typically occurred early each business day or as soon as faxed reports were received during the regular business day. Fortunately, few school dismissals were reported via fax.
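The zip-code-based routing described above can be sketched as follows. This is an illustrative outline only, not the actual CDC implementation; the zip-to-state lookup table, e-mail addresses, and field names are placeholders.

```python
# Illustrative sketch (not the actual CDC implementation) of routing a direct
# report to both CDC and the appropriate state dismissal monitoring contact,
# keyed on the school's zip code. All addresses and mappings are placeholders.
ZIP_PREFIX_TO_STATE = {"30": "GA", "40": "KY", "48": "MI"}  # illustrative only

STATE_CONTACT_EMAIL = {          # hypothetical addresses
    "GA": "contact-ga@example.org",
    "KY": "contact-ky@example.org",
    "MI": "contact-mi@example.org",
}
CDC_MAILBOX = "fluschooldismissal@example.org"  # placeholder, not a real address


def route_report(report: dict) -> list:
    """Return the recipients for a submitted dismissal report."""
    state = ZIP_PREFIX_TO_STATE.get(report["zip"][:2])
    recipients = [CDC_MAILBOX]               # CDC always receives the report
    if state in STATE_CONTACT_EMAIL:
        recipients.append(STATE_CONTACT_EMAIL[state])  # so does the state contact
    return recipients


print(route_report({"school": "Example Middle School", "zip": "48201",
                    "start": "2009-10-21", "end": "2009-10-23"}))
```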

Ten states (Arkansas, Idaho, Kentucky, Massachusetts, Maine, New York, Ohio, Oklahoma, Tennessee, and West Virginia) had their own school dismissal monitoring systems. The characteristics of these systems varied tremendously. Some were mandatory systems (ie, schools were not able to dismiss students without notifying someone at the state level), while others were strictly voluntary; some only applied to public schools and others applied to both public and nonpublic schools; some involved electronic submissions and electronic reporting, while others were telephone- or e-mail-based; some generated daily reports and others only weekly ones; some were part of larger systems that also captured school dismissals for other reasons (eg, weather) or measured other topics (eg, absenteeism); and some posted results to publicly available Web sites, while others did not routinely share any information outside the state government. CDC worked with each of these systems to establish data-sharing arrangements. In addition, each state agreed to accept direct reports of school dismissals submitted through www.cdc.gov/FluSchoolDismissal, since it was not possible to keep this Web-based system out of any state.

Media scans were conducted daily by the Emergency Risk Communication Branch, Division of Health Communication and Marketing (DHCM), National Center for Health Marketing. A search was run each day in Nexis and AP Exchange using a set of search terms jointly agreed on in advance by DASH and DHCM (eg, flu, school, closed). In addition, CDC and ED reviewed daily an extensive list of flu-related pages on every state and 28 local health and education agency Web sites and numerous media Web sites for school dismissal notifications. Systematic online searches also were run daily to identify news articles, blogs, and social media Web site postings about school dismissals.
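A simple keyword filter gives a flavor of how agreed-on search terms can flag candidate media items for review. The term lists and matching logic below are illustrative assumptions, not the actual Nexis or AP Exchange queries used by DASH and DHCM.

```python
# Hypothetical sketch of screening daily media items against agreed-on search
# terms (eg, flu, school, closed). The real scans used Nexis and AP Exchange
# rather than this simple keyword match; term lists here are illustrative.
ILLNESS_TERMS = {"flu", "influenza", "h1n1"}
SCHOOL_TERMS = {"school", "district"}
ACTION_TERMS = {"closed", "close", "dismiss", "dismissal"}


def is_candidate(headline: str) -> bool:
    """Flag a headline for manual review if it mentions flu, schools, and closure."""
    words = set(headline.lower().split())
    return (bool(words & ILLNESS_TERMS)
            and bool(words & SCHOOL_TERMS)
            and bool(words & ACTION_TERMS))


print(is_candidate("District closed Friday after flu outbreak"))  # True
print(is_candidate("Football game rescheduled due to rain"))      # False
```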

Data Verification Process

All cases identified through any of the three data sources were verified. The most frequently used verification sources were school and district Web sites. Many schools and districts that dismissed students posted announcements or “dear parent” letters about the dismissal on their Web site. Others made specific changes to their school year calendar, indicating the schedule change. Because this material about the dismissal was often posted only for a short time on the school and district Web sites, CDC printed a copy of it to provide permanent proof of the verification. These printed materials provided a rich qualitative data source that could be analyzed to better understand flu-related messaging and communication and responsiveness to CDC guidance on this topic. If a case could not be verified through the school or district Web site, CDC contacted the state school dismissal contact (who often contacted local health or education agency staff) for verification or called the school district or school directly. In addition, in January 2010, the entire list of cases from fall 2009 for each state was verified by the school dismissal monitoring contact in each state, and any discrepancies were resolved. The data in this report reflect that final verification process.

Intranet Reporting Application

All verified dismissal events were then entered at CDC into the SDMS intranet reporting application. This application was developed at CDC during the summer of 2009 to facilitate reporting of dismissed schools and school districts and the number of students and teachers who were affected. The reporting application was based on a data set of all primary and secondary public and nonpublic schools (n = 130 133) and public school districts (n = 14 039) in the United States. The data set was created from NCES's 2006-2007 Common Core of Data (CCD) and the 2007-2008 Private School Universe Survey (PSS). CCD is an NCES program that annually collects data about all public schools, public school districts, and state education agencies in the United States. The data are reported to ED annually by state education agency officials, and they describe schools and school districts and demographic information about their students and staff. The Private School Universe Survey is conducted biennially by NCES; it produces data for nonpublic schools similar to those the CCD provides for public schools, including the numbers of nonpublic school students and teachers. The data set provided the following information about each school: complete address, telephone number, county, school district, number of students enrolled, number of teachers employed, type of school, the NCES district and school ID numbers, and the Web site address, if available. Consequently, it was unnecessary to ask for this information on the common reporting form, which reduced the reporting burden considerably. Also, because the NCES district and school ID numbers were available, it was possible to add further demographic information to the database, again without any increase in reporting burden.
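The following sketch suggests why the common reporting form could be limited to a name and zip code: those two fields are enough to match a report to an NCES-derived record that already carries the remaining details. The records and matching logic are illustrative assumptions, not the actual SDMS application code.

```python
# Minimal sketch (with made-up records) of matching a direct report, which
# carried only a name and zip code, to an NCES-derived record that supplies
# the remaining fields. Real data came from the CCD and PSS.
from typing import Optional

NCES_SCHOOLS = [  # illustrative rows only
    {"nces_id": "2612345", "name": "Example High School", "zip": "48201",
     "district": "Example City Schools", "students": 950, "teachers": 58,
     "type": "public"},
]


def lookup_school(name: str, zip_code: str) -> Optional[dict]:
    """Match a direct report to the NCES record that supplies all other fields."""
    name = name.strip().lower()
    for rec in NCES_SCHOOLS:
        if rec["zip"] == zip_code and rec["name"].lower() == name:
            return rec
    return None


rec = lookup_school("Example High School", "48201")
if rec:
    # Enrollment and staffing come from the data set, not the reporting form.
    print(rec["students"], rec["teachers"])  # 950 58
```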

A search function in the application allowed CDC to locate schools or districts in the list that dismissed students. Once the school or district was located in the database, CDC entered the dismissal and projected re-opening dates and a description of how the dismissal event was identified and verified and then saved the school or school district to the SDMS database. The application was designed to allow changes to closing and re-opening dates and the addition of more comments as needed. It was also possible to add missing demographic information into the application such as Web site addresses or telephone numbers if the data were incomplete.

Daily Reports

The Intranet reporting application also was used by CDC to generate three daily reports of dismissals and re-openings from the SDMS database that were issued every school day beginning August 3, 2009. All three reports, which were prepared in a spreadsheet, provided information similar to that in the daily reports from spring 2009: the dates of dismissal and reopening, type of school, and the numbers of schools, students, and teachers that were affected. One report described all new and ongoing dismissal events for a given day by state and school district or school name and type. This report was shared with CDC and ED staff only. Another report provided the same information as the first, with zip code substituted for the school district or school name, and was shared with DHHS, DHS, and other federal agency staff. CDC and ED created this report to fulfill the needs of additional federal partners and their private sector partners, while protecting the identities of school districts and schools. A third report described all school districts or schools by name that re-opened on that day regardless of when their dismissal event had begun, and was shared only with CDC and ED staff. A fourth report, prepared manually by CDC in presentation software using data from the first report, described the numbers of schools and students that were affected daily over time by dismissals. CDC released all four reports daily around 3:00 PM on behalf of the CDC and ED collaboration. In addition, CDC divided the first report into pieces by state and e-mailed these state-level reports with school and school district names each day to each relevant state school dismissal monitoring contact, both to make sure that contacts knew exactly what CDC was reporting from their state and to provide one more quality control check. CDC and ED released to the media national summary data (numbers of schools, students, and teachers and number of states) and did not release state-level or school district or school name-based data. All state-specific questions from the media were referred to the relevant state school dismissal contact.
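The daily reports can be thought of as simple queries over the verified event records: new and ongoing dismissals for a given day, the same rows with zip codes substituted for names, and the schools re-opening that day. The sketch below is a hypothetical outline of that logic, with made-up field names and records.

```python
# Hedged sketch of the daily report logic described above: new/ongoing
# dismissals for a given day (with names for CDC/ED, with zip codes
# substituted for wider federal sharing) and re-openings for that day.
# Records and field names are illustrative, not the SDMS database schema.
from datetime import date

events = [
    {"state": "MI", "name": "Example High School", "zip": "48201",
     "type": "public", "schools": 1, "students": 950, "teachers": 58,
     "dismissed": date(2009, 10, 21), "reopen": date(2009, 10, 26)},
]


def new_and_ongoing(day):
    """Events whose dismissal covers the given day."""
    return [e for e in events if e["dismissed"] <= day < e["reopen"]]


def masked(rows):
    """Replace school/district names with zip codes for wider federal sharing."""
    return [{**r, "name": r["zip"]} for r in rows]


def reopening(day):
    """Events whose schools re-open on the given day."""
    return [e for e in events if e["reopen"] == day]


today = date(2009, 10, 23)
print(len(new_and_ongoing(today)), len(reopening(today)))  # 1 0
```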

The Intranet reporting application also could be used to generate a summary report, in spreadsheet format, of all dismissal events occurring since August 3, 2009. This report contained additional information about each dismissal event including county, complete address, and Web address. Over time additional descriptive and evaluative variables were added. This report was used to provide more detailed descriptive analyses of all dismissal events and analyses over time.

RESULTS

SDMS was activated on August 3, 2009. While daily reporting ceased in mid-January 2010, the system continued to operate through April 30, 2010. However, no dismissals were detected in 2010. Consequently, the following data describe dismissals detected during the fall semester of the 2009-2010 school year.

Between August 3, 2009, and December 18, 2009, a total of 812 dismissal events occurred in the United States. These dismissal events had an impact on 1947 schools (1.5% of all schools) and approximately 623 616 students and 40 521 teachers (1.1% of all students and teachers). A total of at least 4940 days of school were cancelled, and a total of at least 1 565 321 student-days of school (<.0000001% of all student-days of school) were lost, at least in part, due to ILI.

The mean length of the dismissal events was 2.5 days, the median was 2.0 days, and the range was 1 to 8 days. Table 1 provides the percentage of dismissal events by number of days dismissed.

TABLE 1 Percentage of Dismissal Events by Number of Days Dismissed, Fall Semester 2009

The largest number of new and ongoing dismissal events (n = 184) occurred on Friday, October 23, 2009, when 451 schools and approximately 153 588 students and 9584 teachers were affected (Figure). From late September through early November 2009, dismissals followed a similar pattern each week, with a few dismissals on Monday and Tuesday, an increasing number on Wednesday and Thursday, and the highest number of the week on Friday. Anecdotal information from school and school district officials indicated that, to minimize the number of instructional days lost, they used the weekends to extend the number of days students were not in class together. Nonetheless, not all school dismissals were accompanied by cancellation of extracurricular activities, such as Friday night football games.

Figure. Number of School and Student Influenza-Like Illness-Related Dismissals, by Day, From August 3 to December 18, 2009.

The distribution by month of new and ongoing dismissal events and schools that dismissed students and approximate numbers of students and teachers affected is shown in Table 2.

TABLE 2 Number of Dismissal Events and Schools That Dismissed Students by Month, Fall Semester 2009

School dismissals during the fall semester 2009 occurred in 46 states (all except the District of Columbia, Hawaii, Mississippi, Nevada, and Rhode Island) and varied by state as shown in Table 3. The largest number of dismissal events occurred in Michigan (n = 282, 34.7% of all dismissal events), followed by Kentucky (n = 73, 9.0%), Missouri (n = 58, 7.1%), and Texas (n = 56, 6.9%). The highest rate of school dismissals occurred in Kentucky (22.5% of all schools in Kentucky), followed by Michigan (13.0% of all schools in Michigan), Tennessee (8.8% of all schools in Tennessee), and Missouri (4.1% of all schools in Missouri).

TABLE 3 Dismissal Events and Dismissed Schools, Students, and Teachers by State, Fall Semester 2009

School dismissals also varied by school type. A total of 627 dismissal events occurred among public schools or school districts, which affected 1762 public schools (90.5% of all dismissed schools) and approximately 579 844 students and 37 414 teachers. In contrast, 185 nonpublic schools (9.5% of all dismissed schools) dismissed students, which affected approximately 43 772 students and 3106 teachers. The 1762 public schools and 185 nonpublic schools represent 1.8% and 0.6% of all public and nonpublic schools, respectively, nationwide.

School dismissals also varied by urbanicity (urban or rural).2 Among the 1889 schools with data on urbanicity, 639 urban schools dismissed students, affecting approximately 233 005 students and 14 249 teachers, while 1250 rural schools dismissed students, affecting approximately 378 581 students and 25 888 teachers. The 639 urban schools and 1250 rural schools represent 0.7% and 3.3% of all urban and rural schools, respectively, nationwide.

A goal of SDMS was to improve the timeliness of reporting of school dismissals. Analyses of the lag between the first day of each dismissal event and the day each event was first included in an SDMS report indicated that 67.1% of all dismissal events were reported on the day they first occurred, and 81.9% were reported on the day they occurred or only one day later. In many instances, the dismissal events reported one day later were reported to the SDMS on the day they occurred, but after the cutoff time (1:00 PM) for the daily report. The median number of days it took to report all school dismissal events was 0 days, the mean was 2.4 days, and the range was 0 to 62 days.
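The timeliness measure amounts to computing, for each dismissal event, the lag between its first day and the day it first appeared in an SDMS report, and then summarizing those lags. The sketch below uses made-up lag values purely to show the calculation.

```python
# Minimal sketch of the timeliness measure: lag in days between the first day
# of each dismissal event and the day it first appeared in an SDMS report.
# The lag values below are illustrative, not actual SDMS data.
from statistics import mean, median

lags = [0, 0, 0, 0, 0, 1, 1, 2, 5]  # illustrative lags in days

same_day = sum(1 for lag in lags if lag == 0) / len(lags)
within_one_day = sum(1 for lag in lags if lag <= 1) / len(lags)

print(f"reported the day they occurred: {same_day:.1%}")
print(f"reported within one day:        {within_one_day:.1%}")
print(f"median {median(lags)} days, mean {mean(lags):.1f} days, "
      f"range {min(lags)}-{max(lags)} days")
```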

2 NCES Common Core of Data descriptions were used to define rural schools as (1) Rural, Fringe—rural territory that is less than or equal to 5 miles from an urbanized area, as well as rural territory that is less than or equal to 2.5 miles from an urban cluster; (2) Rural, Distant—rural territory that is more than 5 miles but less than or equal to 25 miles from an urbanized area, as well as rural territory that is more than 2.5 miles but less than or equal to 10 miles from an urban cluster; or (3) Rural, Remote—rural territory that is more than 25 miles from an urbanized area and is also more than 10 miles from an urban cluster. All other schools were considered to be urban.
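The urban/rural grouping in the footnote can be expressed as a small classification function. This sketch assumes the NCES urban-centric locale codes, in which 41 (Rural, Fringe), 42 (Rural, Distant), and 43 (Rural, Remote) are the rural categories; it is an illustration, not code from the SDMS application.

```python
# Sketch of the urban/rural dichotomy in the footnote, assuming the NCES
# urban-centric locale codes: 41 (Rural, Fringe), 42 (Rural, Distant), and
# 43 (Rural, Remote) are rural; all other locale codes (city, suburb, town)
# are treated as urban.
RURAL_LOCALE_CODES = {"41", "42", "43"}


def urbanicity(locale_code: str) -> str:
    """Collapse an NCES locale code into the urban/rural grouping used here."""
    return "rural" if locale_code in RURAL_LOCALE_CODES else "urban"


print(urbanicity("42"))  # rural
print(urbanicity("21"))  # urban (suburb)
```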

CONCLUSIONS

The use of direct reporting to CDC of school dismissals by local health and education agencies as a primary data source for monitoring ILI-related school dismissals had never been attempted before activation of SDMS on August 3, 2009. Between August 3, 2009, and December 18, 2009, a total of 54.7% (n = 444) of all dismissal events were reported through the direct reporting system at www.cdc.gov/fluschooldismissal. The support of state health and education agencies and national organizations for using the direct report process was key to the high percentage of dismissal events reported in this manner. The use of direct reporting also significantly reduced the amount of staff time necessary to detect and verify dismissals, thereby allowing for more timely reporting.

Limitations of SDMS

As with any surveillance system, SDMS had several limitations. First, not every dismissal event was reported on the first day it occurred. As described, nearly all dismissal events were reported on the day they occurred or only one day later, but reporting delays still occurred. Second, to reduce reporting burden, information on the numbers of students and teachers affected and the number of schools per district was based on the latest data from NCES. However, the NCES data are typically two to three years old and may not match exactly the current demographic characteristics of all schools and school districts nationwide. Further, NCES data were not available for a very small number of mostly nonpublic schools, which contributed to a slight underestimate of the numbers of students and teachers that were affected. Third, SDMS was not designed to monitor dismissals among preschools or institutions of higher education. Fourth, SDMS was designed only to monitor school dismissals and not student or teacher absenteeism, ILI among the school community, or factors influencing the decision to dismiss schools. Establishing a national system to do so would require resources beyond those available for SDMS. Finally, even though every effort was made to obtain reports of every school dismissal, it is possible some were missed. Similar to most disease-focused public health surveillance systems, SDMS was exception based.

Considerations for the Future

Based on the development and implementation of SDMS in response to the 2009 H1N1 influenza A virus (H1N1) outbreaks, it would be important to consider several issues should the need for this type of surveillance arise again in the future. First, SDMS was not designed to provide data on why school dismissals occurred. However, considerable interest in the reasons why dismissals were occurring was expressed throughout fall 2009. The information downloaded from school and school district Web sites about dismissals could have helped answer questions about why the dismissals occurred, but this information was not available for every case, often lacked sufficient precision, and was qualitative in nature, which made quick analyses impossible with the staffing available. Based on conversations with school administrators, it is unclear whether they could have reported the reasons for dismissing school with sufficient precision, or whether asking for this information would have increased respondent burden enough to reduce willingness to report at all. However, new studies and qualitative analyses of the school and school district Web site information could improve understanding of why ILI-related dismissals occur.

Second, SDMS relied heavily on the state school dismissal monitoring contacts who reported and helped verify dismissals. Unlike most other surveillance systems at CDC, each state was allowed to select whether their contact would be from the state health or state education agency. Many of these persons had more than full-time responsibilities (not necessarily related to flu or surveillance) long before school dismissal monitoring was added to their workload, and some had little experience or familiarity with their state's education system. Nonetheless, their contributions greatly enhanced the quality and timeliness of the data, and their role should be considered critical if SDMS is activated in the future.

Third, direct online reporting from schools, school districts, and local health agencies to CDC is a unique feature of SDMS and, as mentioned, had not been previously attempted as part of a CDC surveillance system. While simultaneous reporting occurred to the state school dismissal monitoring contacts, the direct online reports still came from a local agency to CDC. It was unclear when SDMS was activated whether this mode of reporting would work and whether schools, school districts, and local health agencies would voluntarily submit online reports to CDC in a timely manner. As mentioned, neither CDC nor ED has official authority to require reporting. Consequently, the support of state health and education agencies and national organizations for using the direct online reporting process was key to the high percentage of dismissal events reported in this manner. Many state agencies and national organizations promoted the use of the online reporting form on their Web sites, in newsletters, and through other modes of communication with their constituencies and members, and they did so at the request of CDC and ED without remuneration. Any future activation of SDMS should recognize the critical role of these partnerships and should include time for this component of the system to be re-established.

Finally, the SDMS intranet reporting application greatly enhanced the ability of CDC to generate real-time daily school dismissal reports. However, additional reports and analyses were requested that could not be produced by the application. Enhancing the SDMS intranet reporting application to produce additional reports and more detailed analyses might be a good investment before the application is needed again.

In conclusion, CDC and ED established the online direct reporting system and intranet reporting application used for SDMS, identified and collaborated with state school dismissal monitoring contacts, conducted media scans and online searches, collaborated with critical national organizations and other federal agencies, and supervised and staffed daily operations during the fall semester of the 2009-2010 school year. This investment yielded real-time national summary data that were used widely throughout the US government in three primary ways: (1) for situational awareness, (2) to assess the impact of CDC guidance and community mitigation efforts, and (3) to inform the development of guidance, resources, and tools for schools. Further, SDMS data were used to respond in a timely manner to the considerable media interest in dismissals at both CDC and ED.

Fortunately, SDMS could prove helpful in the future. The relationships established and the technology created could be used in response to other public health emergencies (eg, pandemic flu or hurricanes). With resources for staffing and in collaboration with state health and education agencies and ED, CDC could activate SDMS again and use it to monitor ILI-related or other types of school dismissals.

Disclaimer: The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.

Acknowledgment: Lisa Barrios contributed to the development and implementation of the school dismissal monitoring system.
