Morbidity and Mortality Weekly Report (MMWR)
Centers for Disease Control and Prevention

Evaluation of Syndromic Surveillance Based on National Health Service Direct Derived Data --- England and Wales

Alexander Doroshenko,1 D. Cooper,1 G. Smith,1 E. Gerard,2 F. Chinemana,3 N. Verlander,4 A. Nicoll4
1Health Protection Agency West Midlands, Birmingham, England; 2NHS Direct National Team, England; 3NHS Direct Hampshire and Isle of Wight, Southampton, England; 4Health Protection Agency, Center for Infections, London, England

Corresponding author: Duncan Cooper, Health Protection Agency West Midlands, Floor 2, Lincoln House, Heartlands Hospital, Birmingham, England B9 5SS. Telephone: 0121-773-7077; Fax: 0121-773-1407; E-mail: duncan.cooper@hpa.org.uk.

Disclosure of relationship: The contributors of this report have disclosed that they have no financial interest, relationship, affiliation, or other association with any organization that might represent a conflict of interest. In addition, this report does not contain any discussion of unlabeled use of commercial products or products for investigational use.

Abstract

Introduction: Syndromic surveillance systems might serve as an early warning to detect outbreaks of infectious diseases and chemical poisoning, including those caused by deliberate release. In England and Wales, data from National Health Service (NHS) Direct, a national telephone health advice service, were used for surveillance of 10 syndromes commonly occurring in the community.

Objectives: The objective of this study was to evaluate NHS Direct syndromic surveillance using the "Framework for Evaluating Public Health Surveillance Systems for Early Detection of Outbreaks", published by CDC.

Methods: Quantitative and qualitative assessments were performed. Examination of daily data flow was used to determine the timeliness and data quality. Validity was determined by comparing NHS Direct surveillance with a well-established clinical-based surveillance system using a time series analysis. Semistructured interviews of main stakeholders were conducted to determine usefulness, flexibility, acceptability, portability, stability, and system costs.

Results: NHS Direct syndromic surveillance has representative national coverage, provides near real-time recording and data analysis, and can potentially detect high-risk, large-scale events. Direct costs are low and variable costs are unpredictable. Flexibility depends on urgency of the need for change, and portability relies on the existence of infrastructure similar to NHS Direct. Statistically significant correlation exists between NHS Direct surveillance and a surveillance system based on the Royal College of General Practitioners data for influenza-like illness.

Conclusion: The CDC framework is a useful tool to standardize the evaluation of syndromic surveillance. NHS Direct syndromic surveillance is timely, representative, useful, and acceptable with low marginal costs and borderline flexibility and portability. Cross-correlation time series modeling might represent an appropriate method in the evaluation of syndromic surveillance validity.

Introduction

Emphasis has been placed worldwide on improving existing surveillance systems and developing innovative new ones. Commitments to improve surveillance for health protection have been made in the United Kingdom (UK) (1). Because certain emerging infections and chemical poisonings, including those caused by deliberate release, might first appear as ill-defined syndromes, rapid outbreak detection is a challenge. Suspicious patterns of patient presentations might be apparent at the community level well before laboratory data raise an alarm. Syndromic surveillance might serve as an early warning to detect such occurrences (2,3).

In 2004, an evaluation of the usefulness of 35 detection and diagnostic decision support systems for biologic terrorism response was performed; most of the underlying evaluations were critically deficient (4,5). The need for more detailed evaluation of syndromic surveillance projects culminated in the publication of the "Framework for Evaluating Public Health Surveillance Systems for Early Detection of Outbreaks" by CDC in May 2004 (6). This guidance aims to standardize frequently fragmented evaluation efforts. The CDC framework is designed for the evaluation of relatively mature, fully operational syndromic surveillance systems (7). This report expands on existing work on the NHS Direct syndromic surveillance system in England and Wales, which is based on call data from the national telephone health advice helpline operated by the NHS (8,9), and presents a preliminary evaluation of that system according to the CDC framework.

Methods

Both quantitative and qualitative assessments using CDC guidance were performed. Information was gathered with respect to the construct and utility of NHS Direct syndromic surveillance. Comprehensive semistructured qualitative interviews with eight main stakeholders were conducted to determine usefulness, flexibility, acceptability, portability, stability, and costs of the system. Respondents were selected on the basis of their knowledge and experience of NHS Direct syndromic surveillance and included consultants in communicable disease control (CCDC), regional epidemiologists (RE), NHS Direct managerial and scientific staff, and national experts from the Health Protection Agency (HPA). All interviews were conducted by the same investigator using a standard questionnaire, and all answers were recorded and transcribed in a standard way. Examination of daily electronic NHS Direct surveillance data and weekly NHS Direct syndromic surveillance bulletins was used to determine timeliness and data quality. Qualitative estimates of the number of outbreaks detected by NHS Direct syndromic surveillance were also obtained through interviews. These estimates were based on professional judgment and supplemented by the quantitative analysis.

Quantitative analysis included an evaluation of the system's validity by comparing NHS Direct syndromic surveillance for influenza-like illnesses (ILIs) with a well-established national clinical surveillance system (the Royal College of General Practitioners Weekly Returns Service [WRS]). WRS is a broadly representative network of 78 general practices that voluntarily participate in a scheme to collect information on consultations and episodes of illness diagnosed in general practice. Weekly incidence rates per 100,000 population for common illnesses are calculated. On the basis of historical trends, robust thresholds for ILI activity have been developed by WRS. These thresholds determine four levels of ILI activity in England and Wales: baseline activity, normal seasonal activity, higher-than-average seasonal activity, and epidemic activity. Weekly surveillance data on ILI syndromes were compared between the NHS Direct and WRS systems during August 2001--August 2004. NHS Direct surveillance began collecting data on ILIs in August 2001, so all NHS Direct data available at the time of study were analyzed. NHS Direct call data were aggregated from daily to weekly to conform to the WRS data format, and two time series were constructed (Figure 1). Data from both sources were compared by calculating the Spearman rank correlation coefficient, fitting time-series models, and estimating a cross-correlogram between the two time series at different lags (weeks of observations). For the time series models, both data sets were transformed and detrended by differencing to ensure that the transformed series were stationary. Appropriate autoregressive moving average models were then fitted to the differenced, transformed time series so that each set of residuals was white noise. The models were specified by examining the autocorrelation and partial autocorrelation functions to identify the autoregressive and moving average components.
Residuals were obtained from the models, checked for normality and against the fitted values, and checked for white noise by the Portmanteau test. Cross-correlation between the two residual series was estimated at different lags, with the limit for statistically significant correlation being 2/√(N-1) in either direction, where N represents the number of data points.
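The prewhitening and cross-correlation procedure described above can be sketched as follows. This is a minimal illustration using synthetic weekly series and an AR(1) model fitted after first differencing; it is not the study's data nor its exact autoregressive moving average models, and NumPy is assumed to be available.

```python
# Sketch of the validation method: difference each weekly series to remove
# trend, prewhiten with a fitted AR(1) model, then cross-correlate the
# residuals at lags 0-3 against the approximate bound 2/sqrt(N-1).
import numpy as np

rng = np.random.default_rng(0)
weeks = 156  # three years of weekly observations, as in the study
season = 50 * np.sin(2 * np.pi * np.arange(weeks) / 52)
signal = rng.normal(0, 10, weeks)  # shared underlying ILI activity
# The WRS-style series is constructed to lag the NHS Direct-style series
# by 2 weeks (a hypothetical choice mirroring the reported lead time).
nhs = 200 + season + signal + rng.normal(0, 5, weeks)
wrs = 30 + 0.2 * season + 0.15 * np.roll(signal, 2) + rng.normal(0, 2, weeks)

def prewhiten(series):
    """First-difference to detrend, then fit an AR(1) model by least
    squares and return its residuals (approximately white noise)."""
    d = np.diff(series)
    x, y = d[:-1], d[1:]
    phi = float(np.dot(x, y) / np.dot(x, x))  # AR(1) coefficient
    return y - phi * x

res_a, res_b = prewhiten(nhs), prewhiten(wrs)
n = len(res_a)
bound = 2 / np.sqrt(n - 1)  # approximate 95% significance limit

ccf = {}
for lag in range(4):
    # correlation of NHS residuals at week t with WRS residuals at t+lag
    r = np.corrcoef(res_a[: n - lag], res_b[lag:])[0, 1]
    ccf[lag] = float(r)
    print(f"lag {lag}: r = {r:+.3f} (significant: {abs(r) > bound})")
```

With this construction the lag-2 cross-correlation exceeds the significance bound, reflecting the 2-week lead built into the synthetic data; the study's finding that NHS Direct increases precede WRS increases by 1--3 weeks has the same shape.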

Results

System Description

The initial purpose of NHS Direct syndromic surveillance was to augment other surveillance systems in detecting outbreaks of influenza. The aim was to facilitate the early implementation of preventive measures. In December 2001, the surveillance of 10 syndromes began, and the purpose of the system was expanded to provide an early warning for potential deliberate release of harmful biologic and chemical agents. The system is more likely to detect large-scale events or outbreaks and rises in symptoms with no evident cause. NHS Direct syndromic surveillance is an example of how a system, initially designed for the clinical assessment (by telephone) of common conditions presenting in communities, has been used for surveillance purposes. Because the surveillance process is fully operational, the validation of the system is considered a priority.

At the time of the inception of the NHS Direct syndromic surveillance system, the list of stakeholders was limited to NHS Direct central management, NHS Direct sites, the NHS Direct Health Intelligence Unit (HIU), and the Regional Surveillance Unit (RSU) of the Health Protection Agency West Midlands. The wider public health community has since taken greater interest in the activities of NHS Direct syndromic surveillance, and regional and national networks have been established. These networks include other divisions of the Health Protection Agency, the Faculty of Public Health, acute hospital NHS trusts, and primary-care organizations. The need for additional expertise to interpret trends detected by NHS Direct surveillance resulted in collaboration with the UK Meteorological Office. Distribution of data within and across regional boundaries improved knowledge sharing and networking among epidemiologists and physicians working in public health.

Operations of the NHS Direct syndromic surveillance system have been described previously (10). NHS Direct is a nurse-led telephone helpline that provides health information and health advice to callers with symptoms, including directing them to the appropriate NHS service. NHS Direct handles 6 million calls per year (11). Nurses at 22 NHS Direct sites use a computerized clinical decision support system (CAS) containing approximately 200 clinical algorithms, each with a series of questions relating to symptoms. The NHS Direct syndromic surveillance system provides surveillance of 10 syndromes (i.e., cold/influenza, cough, diarrhea, difficulty breathing, double vision, eye problems, lumps, fever, rash, and vomiting) commonly occurring in the community and requiring telephone health advice. An increase in the number of callers with these syndromes might be caused by a naturally occurring outbreak (e.g., influenza) or the early stages of illnesses caused by biologic or chemical weapons. At the RSU, information derived from the call data is initially analyzed using confidence interval and control chart methodology (stage 1 investigation). Any statistical aberrations from historical trends (i.e., exceedances) are further investigated by a team of scientific and medical staff (stage 2 investigation). A public health alert (stage 3 investigation) is issued if no plausible explanation can be found for the exceedance (10). An alert is usually triggered by close geographic clustering of calls and/or a sustained high level of calls for the same syndrome (Figure 2).
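The stage 1 check can be illustrated with a simple control-chart sketch. The threshold rule (mean plus three standard deviations), the baseline window, and the counts below are hypothetical choices for illustration only, not the RSU's actual methodology, which combines confidence interval and control chart techniques:

```python
# Illustrative stage 1 exceedance check: compare today's call count for a
# syndrome against an upper control limit derived from a historical baseline.
import statistics

def stage1_exceedance(history, today, z=3.0):
    """Flag an exceedance if today's count exceeds mean + z*sd of history."""
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)
    upper_limit = mean + z * sd
    return today > upper_limit, upper_limit

# Hypothetical baseline: cold/influenza calls on eight comparable prior days.
baseline = [112, 98, 105, 120, 101, 96, 110, 108]
flagged, limit = stage1_exceedance(baseline, today=160)
print(f"limit = {limit:.1f}, exceedance: {flagged}")
```

A flagged day would then proceed to the stage 2 investigation by scientific and medical staff, where plausible explanations (e.g., media coverage of an illness) are sought before any stage 3 alert.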

Outbreak Detection

The NHS Direct syndromic surveillance system captures an event instantly when a caller contacts the NHS Direct helpline. Every weekday, approximately 5 minutes are required to process the previous day's data at each NHS Direct site and transmit them to the HIU. The HIU collates the files from all 22 NHS Direct sites and transmits them to the RSU. Application of the pattern-recognition tools, including confidence interval and control chart methodologies, is normally complete by midday. Further (stage 2) investigation is completed within 2 hours of the detection of an exceedance and, if necessary, a stage 3 investigation is initiated on the same day. Public health interventions usually include communications and alerts to local public health professionals. These are normally implemented by the end of the working day and, depending on the severity and urgency of the situation, very prompt public health responses can be initiated. Other public health interventions include enhanced analysis of call data until the exceedance abates. During weekends, data are collected but not analyzed until the following Monday. A similar lag exists during public holidays in England and Wales, although emergency surveillance and epidemiologist support can be provided if necessary. NHS Direct syndromic surveillance is the only system producing daily surveillance data for England and Wales and can record an increase in syndromes 12--36 hours after the calls have been made.

Data quality is determined by the completeness of the data and the representativeness of coverage. The system is designed to capture all events from the population of England and Wales. The volume of calls is disproportionately low for elderly persons (aged >65 years) and high for young children (aged <5 years), suggesting that the surveillance system might have potential for the surveillance of common viruses predominantly affecting children (e.g., rotavirus). Regional variations in call rates are not substantial. Calls represent the ethnic mix of the UK population; however, 65% of callers are female (12). During the preceding 3 years, the volume of calls to NHS Direct has increased, indirectly improving the representativeness of the surveillance system. Completeness of data transmitted from the 22 NHS Direct sites to the Regional Surveillance Unit consistently approaches 100%; completeness of data collection at NHS Direct sites is more difficult to evaluate. Intuitively, because of the simplicity of use of NHS Direct software systems, data collection should be complete; however, a separate audit is necessary to support this assumption.

Validation of the surveillance system's performance was conducted using both qualitative and quantitative approaches. Most interviewed stakeholders perceived that the NHS Direct syndromic surveillance system registered an increase in calls about diarrhea and vomiting at times when traditional public health surveillance systems indicated a national increase in Norovirus activity. Similarly, an increase in calls about colds and fever coincided with a national increase in influenza incidence. Although this is a subjective view, it was recognized that NHS Direct surveillance augmented data from other surveillance systems.

Traditionally, a quantitative approach to determining the validity of a surveillance system involves the calculation of sensitivity, specificity, and positive predictive value (6,13). In the context of syndromic surveillance, this is difficult to achieve: the unit of analysis is the detection of an outbreak or trend, not an individual illness, and such detection is frequently based on drawing information from various sources and ultimately on professional judgment. The reference standard needed for these calculations is rarely available and frequently represents a variable itself. Another approach is to determine the correlation between data derived from different surveillance systems, for example by calculating Spearman rank correlation coefficients (2). However, that approach represents a historical evaluation over a prolonged period and does not take into consideration natural trends and seasonality. Therefore, while using the same principle of comparing NHS Direct syndromic surveillance with other robust surveillance systems, time series analyses were used to determine the cross-correlation between the NHS Direct and WRS time series for ILIs. A comparison with laboratory-based surveillance was considered less appropriate because only a small proportion of influenza samples are collected and tested in the laboratory in the UK. Data from the real time series and those predicted by the models demonstrated a satisfactory fit (Figure 3). The Portmanteau test results indicated that both sets of residuals were white noise. Statistically significant but weak correlations were detected at lags (weeks) 0, 1, 2, and 3 between the NHS Direct and WRS time series (Table 1). This indicates that an increase in consultations for ILI recorded by WRS is preceded by an increase in calls to NHS Direct for ILI by 1--3 weeks and that increases recorded by both systems can occur simultaneously.
The Spearman rank correlation coefficient was calculated to be 0.85, but the time series modeling approach takes into account the timing of observations and indicates how NHS Direct data are correlated with WRS data by giving the correlations at different lags. The conclusions of this report depend on the fit of the time-series models, the normality of the residuals, and the number of observations (156 in the model). Time-series models with altered parameters were fitted, but results remained similar. NHS Direct syndromic surveillance offers the additional benefit of having data available daily, in contrast to the WRS operated by the Royal College of General Practitioners.
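The Spearman comparison mentioned above can be reproduced in outline. The weekly counts below are invented for illustration (not the study's data), and the rank-difference formula used assumes no tied values:

```python
# Spearman's rank correlation: it captures monotonic agreement across the
# whole period, but, unlike the cross-correlogram, it ignores the timing of
# observations, trend, and seasonality.

def rank(values):
    """Assign ranks 1..n (values assumed distinct, as counts usually are)."""
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0.0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = float(r)
    return ranks

def spearman(x, y):
    """Spearman's rho via the rank-difference formula (no tie correction)."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical weekly ILI call counts (NHS Direct-style) and consultation
# counts (WRS-style) over a 10-week influenza season.
nhs_weekly = [120, 135, 180, 260, 310, 280, 210, 160, 130, 125]
wrs_weekly = [18, 19, 30, 52, 70, 61, 40, 27, 20, 21]

print(f"Spearman rho = {spearman(nhs_weekly, wrs_weekly):.2f}")
```

Here rho is high (0.95) because both hypothetical series rise and fall with the same season, even though the calculation says nothing about which series leads the other; that is precisely the limitation the cross-correlogram addresses.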

Experience

Qualitative interviews indicated that analysis and interpretation of data from the NHS Direct syndromic surveillance system resulted in outbreak detection, public health actions in response to alerts generated by the system, and research and development work. The majority of stakeholders agreed that NHS Direct surveillance detected national (England and Wales) outbreaks of ILI and increases in diarrhea and vomiting. An increase in callers reporting difficulty breathing was documented at a regional (countywide) level. Mapping of calls to NHS Direct by residential postcode was also feasible (10). It is unclear, however, whether NHS Direct syndromic surveillance can be used to detect small-scale local outbreaks (e.g., in a neighborhood or small town). On the basis of modeling work, evidence exists that the potential of the NHS Direct surveillance system to detect local outbreaks will be improved by the predicted rise in NHS Direct call rates in England and Wales (14).

A pilot study to investigate the feasibility of influenza self-testing by NHS Direct callers was conducted during the winter of 2003--2004. A total of 22% of the callers involved in this pilot study tested positive for the influenza virus strain known to be prevalent during that season's influenza epidemic (15). NHS Direct syndromic surveillance output has also been used to track epidemics, for example, by identifying the age groups most affected during the influenza season (9), and to reassure the public during periods of increased perceived risk that illness in the community has not increased. The majority of stakeholders agreed that the system contributed to a better understanding of trends and baselines of the syndromes under surveillance. In addition, the system has promoted collaborative work between public health professionals and led to the formation of professional networks.

NHS Direct syndromic surveillance examines call data for 10 syndromes commonly presented in the community. All interviewed stakeholders believed that expansion of the system to capture syndromes other than the 10 under surveillance would be feasible because the clinical assessment software used by NHS Direct staff includes approximately 200 algorithms. Additional input would be required in terms of professional time and funds, but if a strong need exists, such changes can be implemented quickly. Additional algorithms to handle potential deliberate release events can also be added to the NHS Direct clinical assessment software through negotiation with NHS Direct and switched on in an emergency situation. The limitation is that when new data become available, time is needed to form a meaningful baseline against which to interpret new trends. The system can also potentially aid outbreak detection and management by examining local or regional trends once an outbreak has been declared.

The NHS Direct surveillance system is embedded into operations of the NHS Direct service; therefore, duplicating such a surveillance system in different settings or jurisdictions is dependent on an existing service similar to NHS Direct. Although disseminating NHS Direct surveillance experience within the United Kingdom (i.e., to Scotland and Northern Ireland) might be easier, national coverage and reliance on the NHS Direct infrastructure for operations might preclude the system's replication elsewhere.

Most data acquisition is conducted by staff of NHS Direct sites, who contribute indirectly to the operation of the surveillance system. The majority of stakeholders consider NHS Direct syndromic surveillance important and acceptable. Regional epidemiologists who staff the on-call roster to help interpret surveillance output and initiate public health actions accept the additional workload. The system is easy to operate because it does not require extensive computer or programming training for most front-line staff. The clinical assessment software used at NHS Direct sites is under constant review, and the proven service robustness of NHS Direct ensures that data are not lost if a technical problem occurs at an individual site. Data transmission systems are also regularly serviced and upgraded. Although occasional personnel shortages are recorded, these have never resulted in the loss of data. The NHS Direct syndromic surveillance system is funded by continuous appropriations and by research grants ranging from short to long term.

System Costs

The direct annual cost of operating the NHS Direct surveillance system is an estimated $280,000. This includes salaries and benefits for one full-time scientist, one full-time information analyst, and four part-time professionals funded by the surveillance project. Medical epidemiologists are funded by the Health Protection Agency. Surveillance activity is embedded into wider NHS Direct operations; therefore, no additional costs are accrued for the use of NHS Direct software, maintenance of facilities, and data transmission. Overall, the marginal cost of operating the system is low. Estimating variable costs is more difficult because they depend on the frequency of additional analyses and initiated public health actions. Given the current workload, no extra cost has been incurred as a result of acting on genuine alerts and screening out false alarms. In addition to personnel considerations, variable costs might increase if further testing (e.g., laboratory testing) is initiated to validate trends detected by the syndromic surveillance system. No information is available on clinical outcomes to estimate benefits resulting from decreases in morbidity caused by precise outbreak detection or costs resulting from missed outbreaks or excessive false alarms.

Conclusion

The NHS Direct syndromic surveillance system is the only national syndromic surveillance system in England and Wales. Since the start of its operations in 1999, its capabilities have been expanded from augmenting data from other surveillance systems to detecting a variety of syndromic trends, forming historical baselines, and providing potential detection of deliberate release of harmful agents. Dissemination of the NHS Direct syndromic surveillance output has prompted interagency collaborations between medical, scientific, and public health professionals. NHS Direct syndromic surveillance is regarded as timely, representative, useful, and acceptable, with low marginal costs. More work is needed to improve its portability and flexibility. It has the potential to detect high-risk, large-scale events but in its current state is less likely to detect smaller, localized outbreaks.

The CDC framework is a benchmark tool to evaluate well-established syndromic surveillance systems. The greatest challenge is to develop consistent techniques to assess whether syndromic surveillance systems provide an early warning of outbreaks of disease in the community. In the future, this needs to be considered at the stage of planning and purpose formulation of new systems. The creation and maintenance of an international database of evaluation projects can be beneficial for further development of research on syndromic surveillance.

Acknowledgments

The authors thank all stakeholders who volunteered their time to participate in this research, the staff of NHS Direct sites, and Douglas Fleming (Director of the Birmingham Research Unit of the Royal College of General Practitioners, which operates the Weekly Returns Service) for providing WRS data.

References

  1. UK Department of Health. Getting ahead of the curve: a strategy for combating infectious diseases. Available at http://www.doh.gov.uk/cmo/idstrategy.
  2. Lewis MD, Pavlin JA, Mansfield JL, et al. Disease outbreak detection system using syndromic data in the greater Washington, DC area. Am J Prev Med 2002;23:229--30.
  3. Fleming DM, Barley MA, Chapman RS. Surveillance of the bioterrorist threat: a primary care response. Commun Dis Public Health 2004;7:68--72.
  4. Bravata DM, McDonald KM, Smith WM, et al. Systematic review: surveillance systems for early detection of bioterrorism-related diseases. Ann Intern Med 2004;140:910--22.
  5. Bravata DM, Sundaram V, McDonald KM, et al. Evaluating detection and diagnostic decision support systems for bioterrorism response. Emerg Infect Dis 2004;10:100--8.
  6. Buehler JW, Hopkins RS, Overhage JM, Sosin DM, Tong V; CDC Working Group. Framework for evaluating public health surveillance systems for early detection of outbreaks: recommendations from the CDC Working Group. MMWR 2004;53(No. RR-5).
  7. CDC. ESSENCE II and the framework for evaluating syndromic surveillance systems. In: Syndromic surveillance: reports from a national conference, 2003. MMWR 2004;53:159--65.
  8. Baker M, Smith GE, Cooper D, et al. Early warning and NHS Direct: a role in community surveillance? J Public Health Med 2003;25:362--8.
  9. Cooper DL, Smith GE, Hollyoak VA, Joseph CA, Johnson L, Chaloner R. Use of NHS Direct calls for surveillance of influenza---a second year's experience. Commun Dis Public Health 2002;5:127--31.
  10. CDC. National symptom surveillance using calls to a telephone health advice service---United Kingdom, December 2001--February 2003. MMWR 2004;53:179--83.
  11. Directorate of Access and Choice. Developing NHS Direct: a strategy document for the next 3 years. London, England: Department of Health; 2003.
  12. Cooper DL, Arnold E, Smith GE, et al. The effect of deprivation, age and gender on NHS Direct call rates. Br J Gen Pract 2005;55:287--91.
  13. Murakami Y, Hashimoto S, Taniguchi K, Osaka K, Fuchigami H, Nagai M. Evaluation of a method for issuing warnings pre-epidemics and epidemics in Japan by infectious diseases surveillance. J Epidemiol 2004;14:33--40.
  14. Cooper DL, Verlander NQ, Smith GE, et al. Can syndromic surveillance data detect local outbreaks of communicable disease? A model using a historical cryptosporidiosis outbreak. Epidemiol Infect. In press.
  15. Cooper DL. Can we use self-testing to augment syndromic surveillance? A pilot study using influenza. Presented at the 2004 Syndromic Surveillance Conference, Boston, Massachusetts; November 3--4, 2004.

Table 1

Figure 1

Figure 2

Figure 3

Use of trade names and commercial sources is for identification only and does not imply endorsement by the U.S. Department of Health and Human Services.


References to non-CDC sites on the Internet are provided as a service to MMWR readers and do not constitute or imply endorsement of these organizations or their programs by CDC or the U.S. Department of Health and Human Services. CDC is not responsible for the content of pages found at these sites. URL addresses listed in MMWR were current as of the date of publication.
