Ohio Promotes Accurate Stroke Care Data
Success Story from the Paul Coverdell National Acute Stroke Program
Alice Liskay, MPA, RN, BSN, a nurse consultant with the Ohio Coverdell Stroke Program, remembers manually collecting stroke data from paper medical charts 14 years ago to make sure the data were correct. “The process was pretty laborious,” she recalled. But a lot has changed since then, and for the better.
Quality improvement for stroke care depends on collecting data from every stroke patient in the same way across patients and hospital sites. Two staff members, called data abstractors, independently review the same patient record, and the data from each abstractor are then compared to identify any differences. This is called the interrater reliability process.
The degree to which the information collected by each data abstractor matches is called the item-specific percent agreement (ISPA) score. The greater the agreement between the two data abstractors, the higher the ISPA score. Hospitals use ISPA scores to understand the reliability and trustworthiness of their data over time.
This process also involves entering stroke patient data into the Get With The Guidelines®−Stroke (GWTG–Stroke) database. This national database is supported by the American Heart Association, the American Stroke Association, and their contractor, IQVIA.
Stroke Data Collection
Data abstraction: The process of extracting key elements from a medical record. The data are used to analyze performance measures for stroke care.
Interrater reliability process: The method used to collect and compare data from two different data abstractors.
Item-specific percent agreement (ISPA): The degree to which data elements abstracted from a patient’s medical record by two data abstractors match, which generates a numeric score.
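The ISPA calculation defined above can be sketched in a few lines. The sketch below is illustrative only: the data layout (each abstractor's results as a list of dictionaries mapping data-element names to recorded values) and the element names are assumptions, not the actual GWTG–Stroke schema.

```python
# Minimal sketch of item-specific percent agreement (ISPA).
# Assumed layout: each abstractor's work is a list of dicts,
# one dict per chart, mapping data elements to recorded values.

def ispa_scores(abstractor_a, abstractor_b):
    """Return {data element: percent agreement} across re-abstracted charts."""
    if len(abstractor_a) != len(abstractor_b):
        raise ValueError("Each chart must be reviewed by both abstractors")
    scores = {}
    for element in abstractor_a[0]:
        matches = sum(
            1 for chart_a, chart_b in zip(abstractor_a, abstractor_b)
            if chart_a.get(element) == chart_b.get(element)
        )
        scores[element] = 100.0 * matches / len(abstractor_a)
    return scores

# Example: two abstractors review the same three charts.
a = [
    {"last_known_well": "08:15", "arrival_time": "09:02"},
    {"last_known_well": "21:40", "arrival_time": "22:31"},
    {"last_known_well": "13:05", "arrival_time": "13:50"},
]
b = [
    {"last_known_well": "08:15", "arrival_time": "09:02"},
    {"last_known_well": "21:40", "arrival_time": "22:35"},  # one mismatch
    {"last_known_well": "13:05", "arrival_time": "13:50"},
]
print(ispa_scores(a, b))
```

In this toy example, "last known well" scores 100% agreement while "arrival time" scores roughly 67%, flagging it as an element the abstractors should review.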
The Ohio Coverdell Stroke Program helps support the calculation of the ISPA score and works to improve the accuracy of data from nearly 90 hospitals in the state. During its latest funding cycle, the Ohio program worked with the MetroHealth Clinical Consulting Team to develop and revise the interrater reliability process.
Data abstractors review each patient’s medical record for certain key elements. This information is then put into the web-based GWTG–Stroke database. To make sure the data abstractors collect the data accurately and in a unified way, the Ohio Coverdell Stroke Program randomly selects medical records from five patients who had a stroke for the hospital to re-abstract or review again each quarter. An example of a key variable that needs to be collected is the GWTG data element “time that the patient was last known well” (or the time immediately before the stroke symptoms started).
“It’s important that ‘last known well’ is correct because all treatment decisions for patients with stroke are based on that time,” said Liskay. “Depending on how many hours have passed, the measure determines the treatment patients can be offered, so that’s a vitally important measure to get right.”
A high ISPA score ensures that this information, which is critical to patient care, is reliable. Quarterly reports on the interrater reliability process:
- Give hospitals timely assessments of their ISPA scores.
- Help hospitals meet stroke certification requirements.
- Identify areas where hospitals can improve their data accuracy in a timely manner.
- Ensure that accurate data reach each hospital’s stroke committee and CDC.
Challenge and Approach
The Ohio Coverdell Stroke Program wanted to make their interrater reliability process easier and quicker. Staff from the Ohio Department of Health and the MetroHealth Clinical Consulting Team worked with IQVIA, the American Heart Association, and hospitals participating in the Coverdell Program to improve the process. There were several challenges, including the following:
- COVID-19 forced some hospitals to shift staff who were collecting stroke data to support the pandemic response, leaving less time for data collection.
- New hospitals and new staff at current hospitals needed training on the process.
A biostatistician with the MetroHealth Clinical Consulting Team, Steven Lewis, MS, MBA, created an automated program that randomly selects five charts from each hospital, which saves staff time. The program compares the information collected by the two data abstractors, gives feedback on which elements were missed, and provides an ISPA score for all CDC-required data.
Results from each hospital were compared, and hospitals were told which elements were mismatched so they could improve their data collection.
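The workflow described above, randomly selecting five charts per hospital and reporting which data elements mismatched, can be sketched as below. The function names, chart IDs, and data elements are hypothetical; this is not the actual MetroHealth program, only a sketch of the technique it describes.

```python
# Hypothetical sketch of the automated interrater reliability workflow:
# (1) randomly select five charts per hospital for re-abstraction,
# (2) report which data elements differ between the two abstractions.
import random

def select_charts_for_review(chart_ids, n=5, seed=None):
    """Randomly pick n charts from a hospital's quarterly stroke charts."""
    rng = random.Random(seed)
    return rng.sample(chart_ids, min(n, len(chart_ids)))

def mismatch_report(chart_a, chart_b):
    """List the data elements where two abstractions of one chart differ."""
    return sorted(
        element
        for element in set(chart_a) | set(chart_b)
        if chart_a.get(element) != chart_b.get(element)
    )

# Example: one hospital's quarter of 40 stroke charts.
charts = [f"chart-{i:03d}" for i in range(1, 41)]
selected = select_charts_for_review(charts, seed=42)
print(selected)  # five randomly chosen chart IDs

first = {"last_known_well": "08:15", "nihss_score": "12"}
second = {"last_known_well": "08:20", "nihss_score": "12"}
print(mismatch_report(first, second))  # ['last_known_well']
```

Feeding each mismatch report back to the hospital is what lets abstractors see exactly which elements they disagreed on, mirroring the feedback loop the program describes.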
“A huge ‘thank you’ to Coverdell and its staff for developing a very specific and organized interrater reliability process. Coverdell has helped make our stroke program exceptional with the aid of the interrater reliability service.”
The Ohio Coverdell Stroke Program reported a state-wide ISPA score above 94% for all participating hospitals every year for the past 5 years of the current funding cycle. It has achieved this score despite a 62% increase in the number of participating hospitals and significant turnover in hospital stroke teams. During Year 5:
- Data on 23,568 patients were entered into the GWTG®–Stroke database.
- 1,023 charts were re-abstracted.
- 93 pieces of CDC data (data elements) were reviewed.
- Another 68 data elements were reviewed and scored for the eight Comprehensive Stroke Center hospitals that use the Ohio Coverdell Stroke Program to meet interrater reliability certification requirements.
“The process has proven to be a success with its ease and consistency,” said Weigand.
Key activities included meeting with key stakeholders, developing educational materials for data abstractors, and sharing the new process with all participating hospitals.
The Ohio Coverdell Stroke Program conducts training sessions on data abstraction twice a year for all participating hospitals and encourages all new hospitals and staff to attend. Training sessions can be onsite or virtual. They teach hospitals how to collect data properly according to established guidelines and address common areas of disagreement. During the training sessions, participants can discuss challenges and network with peers.
The program was able to successfully:
- Document each step of the interrater reliability process so information can be shared with other hospitals.
- Create a process that shows hospitals how reliable their data are, which helps them improve the quality of their care.