Daily Emergency Department Surveillance System, Bergen County, New Jersey
Marc Paladini
Bergen County Department of Health Services, Paramus, New Jersey
Corresponding author: Marc Paladini, New York City Department of Health and Mental Hygiene, 125 Worth St., New York, NY
10013. Telephone: 212-788-4320; Fax: 212-788-5470; E-mail: firstname.lastname@example.org.
The purpose of the Daily Emergency Department Surveillance System (DEDSS) is to provide consistent, timely, and
robust data that can be used to guide public health activities in Bergen County, New Jersey. DEDSS collects data on all
emergency department visits in four hospitals in Bergen County and analyzes them for aberrant patterns of disease or single instances of certain diseases or syndromes. The system monitors for clusters of patients with syndromes consistent with the prodrome of
a terrorism-related illness (e.g., anthrax or smallpox) or naturally occurring disease (e.g., pandemic influenza or food
and waterborne outbreaks). The health department can use these data to track and characterize the temporal and geographic spread of a known outbreak or demonstrate the absence of cases during the same period (e.g., severe acute respiratory syndrome
[SARS] or anthrax). DEDSS was designed to be flexible and readily adaptable as local, state, or federal surveillance needs evolve.
In 2001, the Bergen County Department of Health Services instituted a countywide syndromic surveillance system that
uses hospital emergency department (ED) data. Located in northeast New Jersey across the Hudson River from New York
City, Bergen County has a population of approximately 884,000 persons (U.S. Census 2000) living within
234 square miles.
The first step in creating the Daily Emergency Department Surveillance System (DEDSS) was to identify the
appropriate stakeholders. Within the health department, the creative team consisted of an epidemiologist, an information technology (IT) professional, and the director of planning. Next, immediate external stakeholders, including the infection-control practitioner (ICP), the ED director, the hospital IT professional, and the hospital director of security, were brought into the discussion. After the system was developed, local health officers, health department nurses, and state and regional health department epidemiologists were updated on its progress.
Four of six Bergen County hospitals provide daily data to DEDSS, representing 85% of all daily ED visits. Early
each morning, the hospital's computer system generates a text file containing the following fields for each person who visited
the ED the previous day: date of visit, residential zip code, age, chief complaint, and admission status. The file, abstracted from the hospital's database, uses data produced during normal clinical ED workflow. The text file is then automatically sent to a password-protected file transfer protocol (FTP) server, where it is stored. File sizes vary; the four hospitals together record 400--600 visits/day. At 8:00 a.m. each day, the epidemiologist's computer automatically starts DEDSS. The program connects to the FTP site and downloads, formats, integrates, and analyzes the data. DEDSS then
creates standardized reports and e-mails them to the epidemiologist, along with an alert to his cellular telephone indicating the system ran successfully. The epidemiologist can then access the reports remotely and determine any needed response.
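The daily file ingestion described above can be sketched in Python. The field order follows the article; the pipe delimiter, the `"A"` admission code, and all identifier names are illustrative assumptions, not the hospitals' actual file layout:

```python
import csv
from dataclasses import dataclass
from io import StringIO

@dataclass
class EDVisit:
    visit_date: str
    zip_code: str
    age: int
    chief_complaint: str
    admitted: bool

def parse_daily_file(text):
    """Parse one day's hospital extract into EDVisit records.
    Malformed rows are skipped so a single bad line cannot abort the run."""
    visits = []
    for row in csv.reader(StringIO(text), delimiter="|"):
        if len(row) != 5:
            continue
        date, zipc, age, complaint, status = (field.strip() for field in row)
        # Complaints are upper-cased so later keyword matching is case-insensitive.
        visits.append(EDVisit(date, zipc, int(age), complaint.upper(), status == "A"))
    return visits
```

In practice each hospital's nightly extract would be fetched from the FTP server and passed through a parser like this before the four files are combined for analysis.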
Data are analyzed daily by using a modified version of the cumulative sum (CUSUM) method (1), programmed in SAS® (2). For each syndrome in each hospital, a ratio is calculated by dividing the number of visits caused by the syndrome by the total number of ED visits. This ratio is then compared with the mean of an 11-day moving baseline that precedes the day of interest.
The first 3 days before the current observation are ignored to act as a buffer for an outbreak that might grow slowly over 1--2 days, and the mean is tabulated for days 4--14 before the day of interest. Because the data are not transformed and any signals that might arise remain in the data set, the health department uses both a buffer and an 11-day moving average to offset the effects that days of increased activity would have on the analysis.
If an observation is higher than expected, on the basis of the moving average plus 3 standard deviations, a signal is
created and two reports are generated. The first report includes the syndrome signaled, hospital (if the signal has occurred at a single hospital) or county (if the signal has occurred at
two or more hospitals), date, total number of visits, total number in the
syndrome, ratio for that day, and baseline ratio with which it was compared. For each signal, a corresponding report is generated that features a line listing of all persons who were part of the signal.
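The detection rule described above can be sketched as follows. This is an illustrative Python reimplementation (the production system is programmed in SAS); the choice of a population standard deviation over the 11 baseline days is an assumption:

```python
import statistics

def detect_signal(syndrome_counts, total_counts):
    """Flag the most recent day if its syndrome ratio exceeds the mean + 3 SD
    of an 11-day moving baseline. Days 1--3 before the day of interest are
    skipped as a buffer for slowly growing outbreaks; the baseline covers
    days 4--14 before. Inputs are parallel daily lists, oldest first.
    Returns (flag, today_ratio, baseline_mean)."""
    if len(syndrome_counts) < 15:
        raise ValueError("need at least 15 days of history")
    ratios = [s / t for s, t in zip(syndrome_counts, total_counts)]
    today = ratios[-1]
    baseline = ratios[-15:-4]   # 11 values: days 4..14 before today
    mean = statistics.mean(baseline)
    sd = statistics.pstdev(baseline)
    return today > mean + 3 * sd, today, mean
```

Because the data are never transformed, a spike that enters the baseline window will raise the mean and standard deviation on later days, which is exactly the effect the buffer and 11-day window are meant to dampen.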
The first step, as in any outbreak investigation, is to verify the diagnosis. Because using text strings to identify
affected patients can result in inclusion of patients who do not have the chief complaints of interest (e.g., "no fever" instead of "fever"), the chief-complaint field for each member of the line listing is examined. This field contains a mixture of triage information, clinical diagnoses, and patient statements. For example, a case of viral respiratory disease (e.g., influenza) might be coded as "fever and cough," "viral syndrome," or "I don't feel well," depending on the hospital. After an investigation
determines the system properly identified appropriate chief complaints and all of the observations appear to be valid, a level of concern is assigned.
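The "no fever" problem above is a simple negation check. The sketch below shows one way to guard keyword matching against negated complaints; the keyword and negation lists are hypothetical examples, not the health department's actual complaint groups:

```python
import re

# Hypothetical ILI keyword list; the real complaint groups are maintained
# and revised by the health department.
ILI_KEYWORDS = ["FEVER", "COUGH", "VIRAL SYNDROME", "FLU"]
NEGATIONS = ["NO ", "DENIES ", "WITHOUT "]

def matches_ili(chief_complaint):
    """Return True if the complaint contains an ILI keyword that is not
    immediately preceded by a negation term (e.g., 'NO FEVER')."""
    text = chief_complaint.upper()
    for kw in ILI_KEYWORDS:
        # Word boundaries keep "FLU" from matching inside "FLUID".
        for m in re.finditer(r"\b" + re.escape(kw) + r"\b", text):
            prefix = text[:m.start()]
            if not any(prefix.endswith(neg) for neg in NEGATIONS):
                return True
    return False
```

Even with this guard, a human review of the line listing remains necessary, because free-text complaints mix triage notes, diagnoses, and patient statements in ways no fixed keyword list anticipates.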
Three levels of concern can be assigned to signals: low, moderate, or elevated, each with corresponding steps.
The epidemiologist assigns the level after reviewing each day's report, which usually takes <10 minutes. If a signal is attributable
to low numbers (<10), is just above the baseline, is attributable to seasonality (e.g., pneumonia in winter), and exhibits
no obvious epidemiologic links (e.g., age or zip code), then the signal level assigned is
low, and no action is taken.
A level of moderate is assigned if multiple signals occur on the same day in different hospitals; if two consecutive
low-level signals occur in the same hospital; if a low-level signal arises with possible epidemiologic links (e.g., geographic clustering);
or if the signal is substantially but not exceptionally higher than the baseline (on the basis of experience rather than
statistics, until an algorithm is developed to quantify this). Response to a moderate signal includes e-mail notification of possible
activity to hospital ICPs and epidemiologists in surrounding counties. Those epidemiologists and ICPs then
decide whether to investigate their jurisdiction's conditions.
If a signal is exceptionally higher than the baseline (on the basis of experience rather than statistics) or if moderate
signals occur at more than one hospital on a given day, a signal level of
elevated is assigned. An elevated signal entails
immediate notification of hospital ICPs, internal chain of command, regional epidemiologists, and state health department officials
that further investigation is warranted. Status of hospitals
involved in an elevated-level signal is determined through
phone consultation, and if disease activity remains high, an epidemiologic investigation is initiated. Depending on the number of persons and hospitals involved, either the epidemiologist or the epidemiologic response team are sent to the hospital to review charts, interview patients, and confer with hospital personnel regarding next steps.
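The escalation rules above can be summarized as a small decision function. This is a simplified sketch of the written protocol, not the department's actual coding; the `excess` categories and tuple layout are assumptions made for illustration:

```python
def day_level(signals, prior_low_days):
    """Assign the day's overall level of concern.
    Each signal is a tuple (hospital, excess, has_epi_links), where excess
    is 'slight', 'substantial', or 'exceptional' relative to baseline.
    prior_low_days maps hospital -> consecutive prior days with a
    low-level signal at that hospital."""
    per_signal = []
    for hosp, excess, epi_links in signals:
        if excess == "exceptional":
            per_signal.append("elevated")
        elif excess == "substantial" or epi_links or prior_low_days.get(hosp, 0) >= 1:
            per_signal.append("moderate")
        else:
            per_signal.append("low")
    if "elevated" in per_signal:
        return "elevated"
    moderate_hosps = {s[0] for s, lv in zip(signals, per_signal) if lv == "moderate"}
    if len(moderate_hosps) > 1:
        return "elevated"          # moderate signals at >1 hospital escalate
    if moderate_hosps:
        return "moderate"
    low_hosps = {s[0] for s, lv in zip(signals, per_signal) if lv == "low"}
    if len(low_hosps) > 1:
        return "moderate"          # same-day signals at different hospitals
    return "low" if signals else "none"
```

As the article notes, the "substantial" versus "exceptional" distinction currently rests on the epidemiologist's experience rather than a statistical threshold, so any machine encoding of these rules is necessarily approximate.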
Although the burden to Bergen County has been minimal, the system's cost and maintenance requirements need to be
better quantified, both in terms of resources spent and person-hours used to respond to system alerts. Furthermore, the better the system operators (e.g., epidemiologists and IT personnel) understand hospitals' coding and triage practices, the better they will understand the system's output and be able to alter it as needed. To date, no elevated signals have occurred. Moderate signals have occurred but none that required more than a telephone consultation with hospital ICPs. In all cases, the
numbers decreased substantially after 1 day, and no specimens were collected by hospital physicians.
DEDSS monitors two primary syndromes: influenza-like illness (ILI) and gastrointestinal illness (GI). Each syndrome has
a corresponding case definition, complaint group (i.e., a list of chief complaints being monitored), and diagnostic group (i.e., a list of International Classification of Diseases, Ninth
Revision [ICD-9] codes for validation studies). Preliminary comparisons
of chief complaint to ICD-9-coded diagnoses indicate sensitivity of 76%, specificity of 96%, and positive predictive value
of 53% for ILI and sensitivity of 61%, specificity of 97%, and positive predictive value of 32% for GI. Specific results need to be analyzed further to identify and quantify the source of noise and discrepancies within the syndrome definitions,
especially when examining positive predictive value.
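The reported validation metrics come from a standard 2x2 comparison of the chief-complaint classification (the test) against the ICD-9-coded diagnosis (the reference standard). A minimal sketch, with illustrative cell counts rather than the study's actual data:

```python
def validation_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive predictive value from the
    four cells of a 2x2 table: tp/fp/fn/tn are true-positive,
    false-positive, false-negative, and true-negative visit counts."""
    sensitivity = tp / (tp + fn)   # of ICD-9-confirmed syndrome visits, fraction flagged
    specificity = tn / (tn + fp)   # of non-syndrome visits, fraction correctly not flagged
    ppv = tp / (tp + fp)           # of flagged visits, fraction truly in the syndrome
    return sensitivity, specificity, ppv
```

Note how a rare syndrome depresses positive predictive value even at high specificity: when true cases are few, even a small false-positive rate applied to the large pool of non-cases yields flagged visits that are mostly noise, consistent with the 53% and 32% PPVs reported above.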
As the system is fine-tuned and case definitions and complaint groups revised, the epidemiologist can easily change
the coding as needed. The system's malleability enables the health department to monitor seasonal or short-term
disease-activity trends. During a crisis, the epidemiologist can request that hospitals place a keyword in the complaint field for all visits relating to a certain event (e.g., alleged anthrax exposures) to monitor visits more precisely.
DEDSS is designed to accommodate inclusion of new fields when necessary. If the system were also able to link the
clinical aspects of a patient's visit (e.g., X-ray results, medications prescribed, laboratory results, or blood work) to each observation, the epidemiologist reviewing the day's data would have more information to examine when assigning the level of concern.
Because the infrastructure is already in place, establishing future projects that capture different data will be even easier.
Obstacles and Benefits
The primary obstacles encountered during development and maintenance of DEDSS involve IT and resources. The
ability to troubleshoot technical and programmatic computer problems has been limited by departmental
resources. Although the system is intended to be automated and electronic, certain hospitals had difficulty scheduling tasks and transferring the files. Fortunately, the fundamental act of creating the daily data file was not a problem for any hospitals. However, because
hospital IT personnel are instrumental to the mechanics of file creation, automation, and transfer, including them in early planning is essential.
After establishing standard analytic methods and reporting protocols within a jurisdiction, the next step is to
coordinate surveillance systems within the region; as multiple systems come online, maintaining communication and
methodologic developments in real time is crucial. Conducting surveillance and validation regionally would enable joining of resources to accomplish similar goals.
Beyond DEDSS' stated goals, the system has had additional benefits. The process of meeting with the hospital
personnel and setting up the data transfer generated excellent working relations between the health department and the hospitals.
It increased the timeliness of reporting routine incidents and fostered communication around unusual occurrences.
Furthermore, an infrastructure supporting the electronic transfer of data between hospitals and the health department is now in
place. Unfortunately, redundant capabilities are not yet built into the system; currently, when one aspect of the system fails, the entire system goes offline. The system also lacks a single, dedicated manager. These limitations can result in periods of system inactivity.
The health department hopes the system will be useful for more than terrorism-preparedness purposes. Its goal is to have
a multifaceted system that uses multiple analytic processes and creates reports for multiple users on different aspects of
public health and health-care delivery.
1. Hutwagner LC, Maloney EK, Bean NH, et al. Using laboratory-based surveillance data for prevention: an algorithm for detecting Salmonella outbreaks. Emerg Infect Dis 1997;3:395--400.
2. SAS Institute. SAS®, version 8.2 [software]. Cary, NC: SAS Institute.
Use of trade names and commercial sources is for identification only and does not imply endorsement by the U.S. Department of Health and Human Services.