
The content, links, and PDFs are no longer maintained and might be outdated.

  • The content on this page is being archived for historic and reference purposes only.
  • For current, updated information see the MMWR website.

Syndromic Surveillance System Evaluation --- District of Columbia, 2001--2004

Michael A. Stoto,1 A. Jain,1 A. Diamond,2 J. Davies-Cole,3 A. Adade,3 S. Washington,3 G. Kidane,3 C. Glymph3
1RAND, Arlington, Virginia; 2Harvard University, Cambridge, Massachusetts; 3District of Columbia Department of Health, Washington, DC

Corresponding author: Michael A. Stoto, RAND, 1200 South Hayes St., Arlington, VA 22202. Telephone: 703-413-1100, ext. 5472; Fax: 703-413-8111; E-mail: stoto@rand.org.

Disclosure of relationship: The contributors of this report have disclosed that they have no financial interest, relationship, affiliation, or other association with any organization that might represent a conflict of interest. In addition, this report does not contain any discussion of unlabeled use of commercial products or products for investigational use.

Abstract

Introduction: In September 2001, the District of Columbia Department of Health began a syndromic surveillance program based on hospital emergency department (ED) visits. ED logs are faxed daily to the health department, where staff code them by chief complaint and record the number of patients in each hospital who die or experience sepsis, rash, respiratory complaints, gastrointestinal complaints, unspecified infection, or neurologic or other complaints.
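The coding step described above is performed manually by health department staff; the report does not specify the coding rules. As a minimal sketch of how free-text chief complaints might be mapped to these symptom groups, the keyword table below is entirely hypothetical:

```python
from collections import Counter

# Hypothetical keyword map for illustration only; the health department's
# actual coding of ED chief complaints is manual and not described here.
SYMPTOM_KEYWORDS = {
    "respiratory": ["cough", "shortness of breath", "wheezing"],
    "gastrointestinal": ["vomiting", "diarrhea", "nausea"],
    "rash": ["rash", "lesion"],
    "sepsis": ["sepsis", "septic"],
    "neurologic": ["seizure", "headache", "dizziness"],
}

def code_complaint(chief_complaint: str) -> str:
    """Assign a free-text chief complaint to a symptom group."""
    text = chief_complaint.lower()
    for group, keywords in SYMPTOM_KEYWORDS.items():
        if any(k in text for k in keywords):
            return group
    return "unspecified infection/other"

def daily_counts(complaints):
    """Tally one day's ED visits by symptom group."""
    return Counter(code_complaint(c) for c in complaints)
```

A record such as "severe cough and fever" would be tallied under the respiratory group, with unmatched complaints falling into a residual category.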

Objectives: This study evaluates the completeness, usefulness, and effectiveness of the syndromic surveillance system.

Methods: Data were received from nine hospitals during the first 32 months of the system's operation (September 2001--May 2004). These data were used to describe the completeness of the system (whether reports were sent to the health department daily) by hospital, season, and day of the week, and the variability in patterns of symptom groups across hospitals and seasons. Three statistical detection algorithms also were applied retrospectively to identify departures from normal patterns associated with the beginning of the winter influenza season and other disease outbreaks.
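The report does not specify which three detection algorithms were applied or their parameters. As one hedged illustration of the general approach, a one-sided CUSUM over standardized daily counts flags sustained departures from a historical baseline; the tuning constants `k` and `h` below are conventional illustrative values, not those used in the study:

```python
def cusum_flags(counts, baseline_mean, baseline_sd, k=0.5, h=4.0):
    """One-sided CUSUM on standardized daily symptom-group counts.

    Accumulates standardized excesses over the baseline, resetting at zero,
    and flags a day when the running sum exceeds threshold h. k (allowance)
    and h (decision limit) are illustrative, not the study's parameters.
    """
    s, flags = 0.0, []
    for y in counts:
        z = (y - baseline_mean) / baseline_sd
        s = max(0.0, s + z - k)  # reset-at-zero cumulative sum
        flags.append(s > h)
    return flags
```

Run retrospectively on a series of daily counts, the sum stays near zero during normal periods and climbs quickly once counts shift upward, as at the onset of an influenza season.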

Results: Completeness varied by calendar quarter and hospital, ranging from no missing data to 100% missing data for particular hospitals and quarters. Data were missing primarily in weekly patterns and in stretches of time that varied across hospitals, which might reflect staff availability to fax data to the health department. In the seven of nine hospitals for which data were more than 75% complete, with limited exceptions, the number and proportion of cases in each symptom group were constant over time. The distribution of symptom groups was similar in all except one hospital, possibly reflecting a different patient population. Day-of-the-week effects were apparent in certain hospitals but varied substantially by symptom group and hospital. Application of various detection algorithms indicated that, particularly when data were pooled across seven hospitals, the syndromic surveillance data could be used to identify the onset of the influenza season within 2--3 days. The data also revealed indications of the "worried well" who sought care during the 2001 anthrax attacks and a previously undetected series of gastrointestinal illness outbreaks that occurred during a 4-month period in five different hospitals. No single symptom group or detection algorithm consistently signaled each of the gastrointestinal events.
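Because day-of-the-week effects varied by symptom group and hospital, a detection algorithm would ordinarily be run on counts adjusted for those effects, and pooling across hospitals smooths single-site noise. The report does not describe how such adjustment was done; the sketch below simply subtracts each weekday's historical mean as one plausible approach:

```python
import statistics

def dow_adjusted(counts, weekdays):
    """Remove day-of-week effects from a daily count series.

    counts:   daily visit counts (possibly pooled across hospitals)
    weekdays: parallel list of weekday codes (0=Mon .. 6=Sun)
    Subtracts each weekday's mean, returning residuals for a detection
    algorithm. Illustrative only; the study's adjustment is not described.
    """
    by_day = {}
    for y, d in zip(counts, weekdays):
        by_day.setdefault(d, []).append(y)
    means = {d: statistics.mean(v) for d, v in by_day.items()}
    return [y - means[d] for y, d in zip(counts, weekdays)]

def pooled(per_hospital_counts):
    """Sum parallel daily series from several hospitals into one series."""
    return [sum(day) for day in zip(*per_hospital_counts)]
```

Residuals near zero indicate a day consistent with its usual weekday level; a run of large positive residuals is the kind of departure a detection algorithm would flag.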

Conclusion: If problems with completeness of the data can be improved through a planned automatic electronic reporting system, syndromic surveillance data might offer the potential for early detection of influenza and other disease outbreaks. Additional research is needed, however, to characterize normal patterns in the data, identify the most effective detection algorithms and symptom groups for various purposes, and characterize their sensitivity and specificity when used prospectively in real time.

Use of trade names and commercial sources is for identification only and does not imply endorsement by the U.S. Department of Health and Human Services.


References to non-CDC sites on the Internet are provided as a service to MMWR readers and do not constitute or imply endorsement of these organizations or their programs by CDC or the U.S. Department of Health and Human Services. CDC is not responsible for the content of pages found at these sites. URL addresses listed in MMWR were current as of the date of publication.

Disclaimer   All MMWR HTML versions of articles are electronic conversions from ASCII text into HTML. This conversion may have resulted in character translation or format errors in the HTML version. Users should not rely on this HTML document, but are referred to the electronic PDF version and/or the original MMWR paper copy for the official text, figures, and tables. An original paper copy of this issue can be obtained from the Superintendent of Documents, U.S. Government Printing Office (GPO), Washington, DC 20402-9371; telephone: (202) 512-1800. Contact GPO for current prices.

Questions or messages regarding errors in formatting should be addressed to mmwrq@cdc.gov.

Date last reviewed: 8/5/2005


Safer, Healthier People

Morbidity and Mortality Weekly Report
Centers for Disease Control and Prevention
1600 Clifton Rd, MailStop E-90, Atlanta, GA 30333, U.S.A
