

Comparison of Syndromic Surveillance and a Sentinel Provider System in Detecting an Influenza Outbreak --- Denver, Colorado, 2003

Debra P. Ritzwoller,1 K. Kleinman,2 T. Palen,1 A. Abrams,2 J. Kaferly,1 W. Yih,2 R. Platt2,3
1Kaiser Permanente Colorado, Denver, Colorado; 2Harvard Pilgrim Health Care and Harvard Medical School, Boston, Massachusetts; 3Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts

Corresponding author: Debra P. Ritzwoller, Clinical Research Unit, Kaiser Permanente Colorado, 580 Mohawk Dr., Boulder, CO 80301. Telephone: 303-554-5045; Fax: 303-554-5043; E-mail: debra.ritzwoller@kp.org.

Disclosure of relationship: The contributors of this report have disclosed that they have no financial interest, relationship, affiliation, or other association with any organization that might represent a conflict of interest. In addition, this report does not contain any discussion of unlabeled use of commercial products or products for investigational use.

Abstract

Introduction: Syndromic surveillance systems can be useful in detecting naturally occurring illness.

Objectives: Syndromic surveillance performance was assessed to identify an early and severe influenza A outbreak in Denver in 2003.

Methods: During October 1, 2003--January 31, 2004, syndromic surveillance signals generated for detecting clusters of influenza-like illness (ILI) were compared with ILI activity identified through a sentinel provider system and with reports of laboratory-confirmed influenza. The syndromic surveillance and sentinel provider systems identified ILI activity based on ambulatory-care visits to Kaiser Permanente Colorado. The syndromic surveillance system counted a visit as ILI if the provider recorded any of a list of 30 respiratory diagnoses plus fever. The sentinel provider system required the provider to select "influenza" or "ILI."

Results: Laboratory-confirmed influenza cases, syndromic surveillance ILI episodes, and sentinel provider reports of patient visits for ILI all increased substantially during the week ending November 8, 2003. A greater absolute increase in syndromic surveillance episodes was observed than in sentinel provider reports, suggesting that sentinel clinicians failed to code certain cases of influenza. During the week ending December 6, when reports of laboratory-confirmed cases peaked, the number of sentinel provider reports exceeded the number of syndromic surveillance episodes, possibly because clinicians diagnosed influenza without documenting fever.

Conclusion: Syndromic surveillance performed as well as the sentinel provider system, particularly when clinicians were advised to be alert to influenza, suggesting that syndromic surveillance can be useful for detecting clusters of respiratory illness in various settings.

Introduction

The 2003--04 influenza season in the Denver metropolitan area began earlier and was more severe than in recent years, and it included reports of pediatric mortality (1). Influenza outbreaks occur each year, but uncertainty exists with respect to their timing and severity. Effective surveillance is critical for tracking the spread and severity of disease and for determining the types and subtypes of viruses that circulate during the influenza season.

Most public health organizations that monitor influenza use the U.S. Influenza Sentinel Provider Surveillance Network, a collaborative effort between CDC, state and local health departments, and health-care providers (2). This system monitors influenza activity in the general population. Traditionally, the sentinel provider surveillance system operates from October to mid-May each year. Each week, sentinel providers report the total number of patient visits during the preceding week and the total number of patient visits for influenza-like illness (ILI) (2), stratified by age categories.

In recent years, substantial investments have been made in syndromic surveillance systems. These systems allow rapid detection of natural infectious disease clusters and of intentional acts of terrorism (3--5). Previous studies have demonstrated that syndromic surveillance can be useful in detecting ILI (3,6,7). Investment in these systems might enhance public health organizations' ability to identify and react to infectious disease outbreaks. This report compares the dates on which a syndromic surveillance system and a Sentinel Provider Network, both in a single health-care delivery system, identified unusual ILI activity associated with the onset of the fall 2003 influenza outbreak in Denver, as determined by reported laboratory-confirmed cases of influenza.

Methods

A retrospective comparison of data collected by three ILI detection systems (i.e., two ambulatory care--based and one laboratory-based) was conducted during fall 2003. The two ambulatory-care surveillance systems were situated in Kaiser Permanente Colorado (KPCO), a closed panel, group-model health maintenance organization serving approximately 380,000 members in the Denver-metropolitan area.

Laboratory-Based Surveillance

During the 2003--04 influenza season, laboratories in Colorado reported positive influenza tests to the Colorado Department of Public Health and Environment (CDPHE), either via the Colorado Electronic Disease Reporting System (CEDRS) or by fax or telephone; test results were displayed graphically by week of report on CDPHE's website (8). Weekly electronic newsletters that included county-specific counts of laboratory-confirmed influenza cases were generated and distributed to providers and public health officials. Laboratory-confirmed cases were from the seven-county Denver-metropolitan area, consistent with KPCO's service area; however, test results were obtained by clinicians both within and outside KPCO*. Laboratory confirmation of influenza cases was based on direct fluorescent antibody and viral culture results. Information about other viruses in the community was obtained from The Children's Hospital (TCH) in Denver, Colorado, which published counts of confirmed cases of respiratory syncytial virus (RSV), adenovirus, parainfluenza, rhinovirus, and pertussis (9). These laboratory specimens were obtained from pediatric patients who sought medical care for respiratory illness at the TCH emergency department during October--May.

Syndromic Surveillance System

The CDC-sponsored National Bioterrorism Syndromic Surveillance Demonstration Program, in which KPCO participates, has been described previously (4,7,9--11). This syndromic surveillance system is based on diagnostic codes entered in patients' electronic medical records (EMR) by providers during the routine delivery of ambulatory care. Diagnostic codes are mapped to 13 syndromes. To be counted as a case of ILI, the encounter must have at least one of a set of respiratory illness codes and a measured fever of at least 100ºF (37.8ºC) in the temperature field. If no value is present in that field, an International Classification of Diseases, Ninth Revision (ICD-9) primary, secondary, or tertiary code of fever (code 780.6) must be present. These data are extracted daily, and counts by ZIP code are reported daily on a secure website in both graphical and map formats. Signals of unusual clusters of ILI are identified by three statistical models: a small area method, a spatio-temporal method, and a purely temporal method. These models are estimated daily, and signals are reported based on predetermined thresholds.
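
The following is a minimal Python sketch of this case definition, intended only as an illustration; the record layout and the example respiratory ICD-9 codes are placeholders and do not reproduce the demonstration program's actual specification.

```python
# Minimal sketch of the syndromic ILI case definition described above.
# RESPIRATORY_CODES and the record layout are illustrative placeholders,
# not the actual code set used by the demonstration program.

RESPIRATORY_CODES = {"465.9", "466.0", "486"}  # example ICD-9 respiratory codes
FEVER_ICD9 = "780.6"                           # ICD-9 code for fever
FEVER_THRESHOLD_F = 100.0                      # 100 degrees F (37.8 degrees C)

def is_ili_episode(diagnosis_codes, measured_temp_f=None):
    """Return True if an encounter meets the syndromic ILI definition:
    at least one respiratory diagnosis code AND either a measured
    temperature of at least 100 F or, when no temperature was recorded,
    a primary, secondary, or tertiary ICD-9 fever code (780.6)."""
    if not any(code in RESPIRATORY_CODES for code in diagnosis_codes):
        return False
    if measured_temp_f is not None:
        return measured_temp_f >= FEVER_THRESHOLD_F
    return FEVER_ICD9 in diagnosis_codes

# Examples
print(is_ili_episode(["466.0"], measured_temp_f=101.2))  # True: respiratory code plus fever
print(is_ili_episode(["486", "780.6"]))                  # True: fever code, no temperature recorded
print(is_ili_episode(["466.0"], measured_temp_f=98.6))   # False: no documented fever
```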

The small area method has been described previously (12). The historical series of counts in each small area is used to create a regression estimate of the count expected in that area on each day, adjusting for seasonal, weekly, and secular trends, as well as holiday effects. The results are used to create p-values for statistical significance. These estimates are then corrected to account for multiple ZIP codes and used to create recurrence intervals (RIs). RIs are defined as the number of surveillance days required for a count as unusual as the one observed to be expected to occur exactly once by chance. This method, also called the SMART (Small Area Regression and Testing) scores method, is advantageous in that large values imply more unusual results, and the multiple tests are adjusted for in the same step. For this analysis, ZIP codes were used as the small areas.
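
How a daily count and its model-based expectation translate into a recurrence interval can be illustrated with the following Python sketch. It is a simplification: a plain Poisson tail probability stands in for the p-value produced by the SMART regression, and a Bonferroni-style multiplication stands in for the method's adjustment for the number of ZIP codes under surveillance.

```python
from math import exp, factorial

def poisson_tail(observed, expected):
    """P(X >= observed) for X ~ Poisson(expected); a stand-in for the
    model-based p-value produced by the SMART regression."""
    p_below = sum(exp(-expected) * expected**k / factorial(k) for k in range(observed))
    return max(1.0 - p_below, 0.0)

def recurrence_interval(observed, expected, n_areas):
    """Recurrence interval in days: the number of surveillance days over
    which one count at least this unusual would be expected by chance,
    after a simple Bonferroni-style adjustment for the number of ZIP codes
    (the SMART scores method handles this adjustment within its model)."""
    p_adjusted = min(poisson_tail(observed, expected) * n_areas, 1.0)
    return 1.0 / p_adjusted

# Example: 9 ILI episodes observed in a ZIP code where ~2 are expected,
# with 40 ZIP codes under surveillance; yields an RI of roughly 100 days.
print(round(recurrence_interval(9, 2.0, 40), 1))
```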

The spatio-temporal method is a space-time scan statistic, implemented by using the public domain SaTScan software (13). Day of the week, holidays, season, secular trends, and the unique characteristics of each ZIP code area (e.g., the health-seeking behavior of the population) were adjusted for by using the results of the regression needed in the SMART scores method (14,15). The space-time scan statistic searches all circular areas incorporating one or more ZIP codes for the most unusual grouping, as measured by a likelihood ratio statistic. A p-value is calculated by using Monte Carlo methods (16). The maximum geographic size was set at 50% of the adjusted population at risk (14) and the temporal length at 3 days.

The temporal method implemented the space-time scan statistic, also using SaTScan, but required the candidate cluster to include 100% of the surveillance area, effectively removing the spatial aspect of the test. In all other respects, it was identical to the space-time scan statistic used. For the space-time and temporal scan statistics, the RI was calculated; no correction for multiple comparisons was required, and the RI was the inverse of the p-value.
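
The following Python sketch illustrates the general logic shared by these scan statistics under simplifying assumptions: a Poisson likelihood ratio is computed for candidate clusters, a Monte Carlo p-value is obtained by redistributing the observed cases according to the adjusted expectations, and the RI is taken as the inverse of that p-value. For brevity, candidate zones here are single areas rather than the circular, multi-ZIP-code zones that SaTScan actually searches, and the expected counts are assumed to come from the adjustment regression described above.

```python
import random
from math import log

def scan_llr(c, mu, total):
    """Log likelihood ratio for a candidate cluster with c observed and mu
    expected cases out of `total` cases overall, following the Poisson form
    used by SaTScan-type scan statistics; returns 0 when there is no excess."""
    if c <= mu or c == 0:
        return 0.0
    llr = c * log(c / mu)
    if total - c > 0:
        llr += (total - c) * log((total - c) / (total - mu))
    return llr

def monte_carlo_p(observed, expected, n_sim=999, seed=1):
    """Monte Carlo p-value (and recurrence interval, RI = 1/p) for the most
    unusual single-area excess; a simplification of the SaTScan search."""
    random.seed(seed)
    total = sum(observed)
    # Condition on the total count: rescale expectations to sum to the total.
    scale = total / sum(expected)
    expected = [mu * scale for mu in expected]
    best = max(scan_llr(c, mu, total) for c, mu in zip(observed, expected))
    exceed = 0
    for _ in range(n_sim):
        # Redistribute the cases across areas in proportion to expectations.
        draws = random.choices(range(len(expected)), weights=expected, k=total)
        sim = [draws.count(i) for i in range(len(expected))]
        if max(scan_llr(c, mu, total) for c, mu in zip(sim, expected)) >= best:
            exceed += 1
    p = (exceed + 1) / (n_sim + 1)
    return p, 1.0 / p

# Example: one ZIP code with a clear excess over its adjusted expectation.
p_value, ri = monte_carlo_p(observed=[14, 3, 2, 4], expected=[3.5, 3.0, 2.5, 4.0])
print(p_value, ri)  # a small p-value and a correspondingly large RI
```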

The full-text medical records of patients with ILI who were counted as part of signals in September and October with RI >30 days were reviewed by a clinician to assess whether ILI clinical characteristics were documented as present or absent or were not mentioned. These characteristics included headache, myalgia, or muscle aches; malaise; cough; sore throat; ocular pain; photophobia; dyspnea; and/or fever.

Sentinel Provider System

The sentinel provider system is overseen by CDPHE and is part of the CDC-funded Colorado Influenza Surveillance Project (2). This ongoing project recruits providers each season to report weekly ILI activity from early October through mid-May. In 2003, KPCO submitted ILI data from its EMR for approximately 250 primary care providers. The surveillance system relies on a weekly report of the total number of patients evaluated and the number of those patients with ILI, stratified by age group. From these data, the percentage of patient visits for ILI is calculated. During the 2003--04 influenza season, KPCO used an EMR system that employed a controlled medical terminology vocabulary from SNOMED (17) for the documentation of diagnoses. A patient visit was reported to the sentinel provider system as ILI if the physician actively selected either of the SNOMED terms "influenza" or "influenza-like illness" within the diagnosis section of the patient's EMR. Data were extracted weekly, based on the specific SNOMED terms selected rather than on specific ICD-9 codes. SNOMED terms were extracted from the patient's electronic chart for analysis because in KPCO's data warehouse both "influenza-like illness" and "influenza" were mapped only to ICD-9 code 487.1, which is "influenza, not otherwise specified." In addition to reporting the percentage of all visits for ILI, this analysis stratified data by the specialty of the provider (i.e., pediatrics, family practice, internal medicine, and urgent care).
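
The weekly sentinel calculation can be summarized with the following Python sketch, which computes the percentage of patient visits assigned an ILI term, stratified by provider specialty; the record layout and diagnosis terms are hypothetical simplifications of the actual EMR extract.

```python
from collections import defaultdict

# Terms counted as ILI for the sentinel report (simplified).
ILI_TERMS = {"influenza", "influenza-like illness"}

def weekly_ili_percentage(visits):
    """visits: iterable of (specialty, diagnosis_term) tuples for one week.
    Returns {specialty: percentage of that specialty's visits coded as ILI}."""
    totals, ili = defaultdict(int), defaultdict(int)
    for specialty, term in visits:
        totals[specialty] += 1
        if term.lower() in ILI_TERMS:
            ili[specialty] += 1
    return {s: 100.0 * ili[s] / totals[s] for s in totals}

# Example week of visits (hypothetical records)
week = [
    ("pediatrics", "influenza"),
    ("pediatrics", "otitis media"),
    ("internal medicine", "hypertension"),
    ("internal medicine", "influenza-like illness"),
]
print(weekly_ili_percentage(week))  # {'pediatrics': 50.0, 'internal medicine': 50.0}
```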

Although many visits captured by one KPCO provider-based system were also captured by the other, the two sets were not completely overlapping. The sentinel provider system did not explicitly require the patient to meet the fever criterion of the syndromic surveillance system. In addition, the syndromic surveillance ILI system could capture visits for which the provider did not assign the influenza or ILI diagnosis, either because the provider believed the cause was not influenza or because the provider simply chose a different diagnosis, such as cough, upper respiratory infection, or pneumonia.

Results

A single, positive laboratory-confirmed influenza case was reported before October 25, 2003, in Denver. On November 10, CDPHE reported a substantial increase in reported laboratory-confirmed cases, from seven during the week ending November 1 to 69 during the week ending November 8. A total of 447 laboratory-confirmed cases were reported during the week ending November 15; cases peaked at 1,504 during the week ending December 6 (Figure). During that week, TCH reported <15 positive tests each for RSV, parainfluenza, rhinoviruses, or pertussis.

When the daily syndromic surveillance data were aggregated into comparable weekly units, the number of episodes that met the syndromic surveillance definition for ILI exceeded the number of sentinel provider reports until the week ending November 15. ILI episodes identified by the syndromic system increased from 89 (week ending November 1) to 242 (week ending November 8) and to 556 (week ending November 15). For the sentinel system, identified cases increased from 11 (week ending November 1) to 100 (week ending November 8) and to 567 (week ending November 15). For both the syndromic and sentinel systems, the number of episodes and cases peaked the week ending November 22, with 859 episodes and 1,304 cases, respectively (Figure).

The number of new clinical episodes identified in the ambulatory-care setting that met the syndromic surveillance system definition of ILI is illustrated, as are signals with RIs >30 days for the three signal detection algorithms (Figure). Signals of at least this magnitude occurred on 1 day in September and 2 days in October; during this 2-month period, six events would have been expected by chance (1 per method per 30 days). Signals of at least this magnitude were then observed on November 1 and every day during November 6--December 1. RIs exceeded 10,000 (expected to occur by chance no more often than once in 27 years) on every day during November 8--30.

Among the three signal detection algorithms, the SMART score generated RIs of >30 days on 3 days during November 1--8. The SMART scores also generated RIs of 200--9,000 on 8 days and RIs of >10,000 on 9 days during November 8--December 6. Both the spatio-temporal and purely temporal SaTScan generated RIs of >30 days beginning November 7, with RIs of >10,000 days for 23 days during November 8--December 6.

Manual review of the medical records for the 20 patients who were part of the September and October ILI signals generated by either SMART scores or SaTScan indicated that all 20 patients had fever (at least 100ºF [37.8ºC]), 15 had cough or nasal congestion, and 17 had cough, nasal congestion, or sore throat. Chart abstraction also indicated that more than half of the patients associated with the early signals (i.e., 12 of 20 patients in signals before November 1, 2003) had cough and fever, the combination most predictive of influenza (18). Two or more of the four signs that are most commonly reported among patients with confirmed influenza A diagnoses and consistent with CDC's case definition (i.e., fever, cough, nasal congestion, and sore throat) were reported in 15 of 20 patients (19--21).

The average weekly sentinel surveillance data from visits to all KPCO primary care providers first indicated an increase in ILI to above 1% of visits during the week ending November 15. During the peak weeks of the outbreak (weeks ending November 8--December 6), substantial variation by primary care specialty was observed among the sentinel providers. During the week of November 22, Pediatrics department providers assigned an ILI diagnosis for >8% of visits, compared with <2% of visits in the Internal Medicine department.

Discussion

Although the ambulatory-care--based syndromic surveillance system described in this report was designed principally to detect terrorism events, it would have automatically generated ILI alerts at the same time a meaningful number of laboratory-confirmed cases were reported. The sentinel provider system also demonstrated increased activity. The syndromic surveillance and sentinel provider systems shared important features. Both obviated the need for clinicians to participate directly in reporting because the information recorded as part of routine documentation of clinical encounters was extracted from an EMR. For this reason, this is a best-case implementation of sentinel provider surveillance. A difference between the syndromic and sentinel systems is that the syndromic surveillance system is more standardized: it does not require clinicians to make an explicit diagnosis of influenza or ILI. Instead, influenza can be defined by signs and symptoms the provider might not recognize or code as influenza. Because of this, clinicians did not need to be reminded to use the influenza codes, whereas outreach to clinicians was a prominent feature of the sentinel provider program. This difference would be particularly important for surveillance of conditions that are not expected at a particular season or that are not readily recognized by clinicians; examples include the early phases of many terrorism-related illnesses or severe acute respiratory syndrome.

Another difference is that the sentinel provider system did not explicitly require the patient to meet the fever criterion of the syndromic surveillance system. In addition, the syndromic surveillance ILI system could capture visits for which the provider did not assign the influenza or ILI diagnosis, either because the provider believed the cause was not influenza or because the provider simply chose a different diagnosis, such as cough, upper respiratory infection, or pneumonia.

The findings of this report suggest that clinicians' likelihood of choosing an "influenza" diagnosis might have been influenced by external information about the presence of influenza in the community (i.e., media reports and public health alerts). During the week ending November 8, the number of ILI episodes identified by KPCO's syndromic surveillance system increased by 153, whereas the number of cases identified by the sentinel provider system increased by 89. Given the increase in reported laboratory-confirmed cases and the lack of evidence that other respiratory viruses were circulating in the metropolitan area during that week, a substantial fraction of the difference represents cases that were missed by the sentinel provider system. In contrast, after the outbreak was recognized, the number of cases reported by sentinel providers (1,304) substantially exceeded the number of new syndromic surveillance episodes (859). This is also the week that the Infectious Disease Department sent reminders to KPCO primary care providers asking them to use the KPCO-specific ILI coding terms for suspected influenza cases. Most of this difference in the counts of new episodes versus cases (445) is likely a result of clinicians' assigning an influenza diagnosis without documenting fever.

The findings also demonstrate the utility of the signal detection algorithms that were used to analyze the syndromic surveillance data. They provided unequivocal signals, despite the syndrome definition being nonspecific, as evidenced by the baseline rate of nearly 100 new episodes per week before influenza became widespread in the community. These methods might also be useful for detecting unusual clusters of other endemic infectious diseases, despite being designed to ignore typical seasonal increases in ILI episodes.

Theoretical considerations suggest that the spatio-temporal approach has the best combination of sensitivity and specificity for detecting events that occur in more than one adjoining small area (e.g., more than one ZIP code under surveillance), whereas a purely temporal approach is best when the events are scattered throughout all regions under surveillance. The size of the 2003 influenza outbreak overwhelmed these theoretical differences among the algorithms, and all three (SMART score, spatio-temporal SaTScan, and purely temporal SaTScan) provided strong signals that coincided with the increase in laboratory-confirmed cases of influenza.

All of the signal detection algorithms used in the syndromic surveillance system adjusted for typical seasonal fluctuations in illness, including ILI, to be able to detect a terrorism event against a background of normal patterns of morbidity. The fact that the influenza season arrived earlier than usual facilitated its identification in November. If identification of seasonal respiratory illness is a goal of such a syndromic surveillance system, it will be necessary to develop signal detection algorithms that are optimized for this purpose.

Conclusion

Despite being designed to detect terrorism rather than natural outbreaks of diseases such as influenza, automated syndromic surveillance identified unusual ILI activity as early as did traditional sentinel provider surveillance and reports of laboratory-confirmed influenza cases. The syndromic surveillance system's ability to use uniform criteria for case identification might be an advantage in situations in which clinicians are not alerted to the potential presence of a problem. In addition, because the syndromic surveillance system is passive, it might be less subject to bias from external factors, such as media reports and public health alerts, than is sentinel provider recognition. Finally, the three different signal detection algorithms used by the syndromic surveillance system proved useful and might have broader applicability for surveillance of other infectious diseases.

Acknowledgments

This study was funded by CDC cooperative agreement UR8/CCU115079. The findings in this report are based, in part, on contributions by Ken Gershman and Barbara Stone from the Colorado Department of Public Health and Environment. Additional assistance was also provided by Ray Bem, Mike Bodily, Capp Luckett, and Elizabeth Newsom from the Kaiser Permanente Colorado Clinical Research Unit, and Adam Jackson from Kaiser Permanente Department of Infectious Disease, Denver, Colorado.

References

  1. CDC. Update: influenza activity---United States and worldwide, 2003--04 season, and composition of the 2004--05 influenza vaccine. MMWR 2004;53:547--52.
  2. CDC. Preliminary assessment of the effectiveness of the 2003--04 inactivated influenza vaccine---Colorado, December 2003. MMWR 2004;53:8--11.
  3. Lewis MD, Pavlin JA, Mansfield JL, et al. Disease outbreak detection system using syndromic data in the greater Washington, DC area. Am J Prev Med 2002;23:180--6.
  4. Lazarus R, Kleinman K, Dashevsky I, et al. Use of automated ambulatory-care encounter records for detection of acute illness clusters, including potential bioterrorism events. Emerg Infect Dis 2002;8:753--60.
  5. Bravata DM, McDonald KM, Smith WM, et al. Systematic review: surveillance systems for early detection of bioterrorism-related diseases. Ann Intern Med 2004;140:910--22.
  6. Miller B, Kassenborg H, Dunsmuir W, et al. Syndromic surveillance for influenzalike illness in ambulatory care network. Emerg Infect Dis 2004;10:1806--11.
  7. Lazarus R, Kleinman K, Dashevsky I, DeMaria A, Platt R. Using automated medical records for rapid identification of illness syndromes: the example of lower respiratory infection. BioMed Central Public Health 2001;1:1--9.
  8. Colorado Department of Public Health and Environment. Summary of 2003--04 influenza season, reported cases of laboratory-confirmed influenza. Available at http://www.cdphe.state.co.us/dc/Influenza/lab_chart_03_04.pdf.
  9. The Children's Hospital Departments of Epidemiology and Pathology. Contagious comments 'bug watch'. Available at http://www.thechildrenshospital.org/pro/publications/bug.pdf.
  10. Platt R, Bocchino C, Caldwell B, et al. Syndromic surveillance using minimum transfer of identifiable data: the example of the National Bioterrorism Syndromic Surveillance Demonstration Program. J Urban Health 2003;80(Suppl 1):i25--i31.
  11. Yih WK, Caldwell B, Harmon R, et al. The National Bioterrorism Syndromic Surveillance Demonstration Program. In: Syndromic surveillance: reports from a national conference, 2003. MMWR 2004;53(Suppl):43--6.
  12. Kleinman K, Lazarus R, Platt R. A generalized linear mixed models approach for detecting incident clusters of disease in small areas, with an application to biological terrorism (with invited commentary). Am J Epidemiol 2004;159:217--24.
  13. Kulldorff M; Information Management Services Inc. SaTScan v4.0: software for the spatial and space-time scan statistics. Available at http://www.satscan.org.
  14. Kleinman K, Abrams A, Kulldorff M, Platt R. A model-adjusted space-time scan statistic with an application to syndromic surveillance. Epidemiol Infect 2005; in press.
  15. Kulldorff M. Prospective time periodic geographic disease surveillance using a scan statistic. J R Stat Soc [Ser A] 2001;164:61--72.
  16. Dwass M. Modified randomization tests for nonparametric hypotheses. Ann Math Stat 1957;28:181--7.
  17. SNOMED International. SNOMED clinical terms. Available at http://www.snomed.org.
  18. Monto AS, Gravenstein S, Elliott M, Colopy M, Schweinle J. Clinical signs and symptoms predicting influenza infection. Arch Intern Med 2000;160:3243--7.
  19. Boivin G, Hardy I, Tellier G, Maziade J. Predicting influenza infections during epidemics with use of a clinical case definition. Clin Infect Dis 2000;31:1166--9.
  20. Besag J, Newell J. The detection of clusters in rare diseases. J R Stat Soc [Ser A] 1991;154:143--55.

* Counties include Adams, Arapahoe, Boulder, Broomfield, Denver, Douglas, and Jefferson.

Figure

Figure 1

Use of trade names and commercial sources is for identification only and does not imply endorsement by the U.S. Department of Health and Human Services.


References to non-CDC sites on the Internet are provided as a service to MMWR readers and do not constitute or imply endorsement of these organizations or their programs by CDC or the U.S. Department of Health and Human Services. CDC is not responsible for the content of pages found at these sites. URL addresses listed in MMWR were current as of the date of publication.

