
CDC Science Clips: Volume 11, Issue 19, May 14, 2019

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention score to track social and mainstream media mentions!

  1. Top Articles of the Week

    Selected weekly by a senior CDC scientist from the standard sections listed below.

    The names of CDC authors are indicated in bold text.
    • Chronic Diseases and Conditions
      • Morbidity and mortality after lifestyle intervention for people with impaired glucose tolerance: 30-year results of the Da Qing Diabetes Prevention Outcome Study
        Gong Q, Zhang P, Wang J, Ma J, An Y, Chen Y, Zhang B, Feng X, Li H, Chen X, Cheng YJ, Gregg EW, Hu Y, Bennett PH, Li G.
        Lancet Diabetes Endocrinol. 2019 Apr 26.
        BACKGROUND: Lifestyle interventions can delay the onset of type 2 diabetes in people with impaired glucose tolerance, but whether this leads subsequently to fewer complications or to increased longevity is uncertain. We aimed to assess the long-term effects of lifestyle interventions in people with impaired glucose tolerance on the incidence of diabetes, its complications, and mortality. METHODS: The original study was a cluster randomised trial, started in 1986, in which 33 clinics in Da Qing, China, were randomly assigned to either be a control clinic or provide one of three interventions (diet, exercise, or diet plus exercise) for 6 years for 577 adults with impaired glucose tolerance who usually receive their medical care from the clinics. Subsequently, participants were followed for up to 30 years to assess the effects of intervention on the incidence of diabetes, cardiovascular disease events, composite microvascular complications, cardiovascular disease death, all-cause mortality, and life expectancy. FINDINGS: Of the 577 participants, 438 were assigned to an intervention group and 138 to the control group (one refused baseline examination). After 30 years of follow-up, 540 (94%) of 576 participants were assessed for outcomes (135 in the control group, 405 in the intervention group). During the 30-year follow-up, compared with control, the combined intervention group had a median delay in diabetes onset of 3.96 years (95% CI 1.25 to 6.67; p=0.0042), fewer cardiovascular disease events (hazard ratio 0.74, 95% CI 0.59-0.92; p=0.0060), a lower incidence of microvascular complications (0.65, 0.45-0.95; p=0.025), fewer cardiovascular disease deaths (0.67, 0.48-0.94; p=0.022), fewer all-cause deaths (0.74, 0.61-0.89; p=0.0015), and an average increase in life expectancy of 1.44 years (95% CI 0.20-2.68; p=0.023). 
INTERPRETATION: Lifestyle intervention in people with impaired glucose tolerance delayed the onset of type 2 diabetes and reduced the incidence of cardiovascular events, microvascular complications, and cardiovascular and all-cause mortality, and increased life expectancy. These findings provide strong justification to continue to implement and expand the use of such interventions to curb the global epidemic of type 2 diabetes and its consequences. FUNDING: US Centers for Disease Control and Prevention, WHO, Chinese Center for Disease Control and Prevention, World Bank, Ministry of Public Health of the People’s Republic of China, Da Qing First Hospital, China-Japan Friendship Hospital, and National Center for Cardiovascular Diseases & Fuwai Hospital.

      • Intervention increases physical activity and healthful diet among South African adolescents over 54 months: A randomized controlled trial
        Jemmott JB, Zhang J, Jemmott LS, Icard LD, Ngwane Z, Makiwane M, O’Leary A.
        J Adolesc Health. 2019 Apr 23.
        PURPOSE: Scant research has investigated whether health promotion interventions have sustained effects in increasing physical activity and healthful diet among adolescents in sub-Saharan Africa, which is experiencing an epidemiological transition from infectious diseases to noncommunicable diseases as leading causes of mortality. We examined whether an intervention increased adherence to 5-a-day diet and physical activity guidelines during a 54-month postintervention period among South African adolescents and whether its effects weakened at long-term (42 and 54 months postintervention) compared with short-term (3, 6, and 12 months postintervention) follow-up. METHODS: We randomized 18 randomly selected schools serving grade 6 learners (mean age = 12.6) in a township and a semirural area in Eastern Cape Province, South Africa, to one of the two 12-hour interventions: health promotion, targeting healthful diet and physical activity; attention-matched control, targeting sexual risk behaviors. We tested the intervention’s effects on adherence to 5-a-day diet and physical activity guidelines using generalized estimating equations logistic regression models adjusting for baseline behavior and clustering within schools. RESULTS: Health promotion intervention participants had higher odds of meeting 5-a-day diet and physical activity guidelines than control participants. The effect on 5-a-day diet did not weaken at long-term compared with short-term follow-up, but the effect on physical activity guidelines was weaker at long-term follow-up, mainly because of a reduced effect on muscle-strengthening physical activity. The intervention also increased health promotion attitude and intention and health knowledge and reduced binge drinking compared with the control group. 
CONCLUSIONS: A 12-hour intervention in grade 6 shows promise in increasing self-reported adherence to healthful diet and physical activity guidelines during a 4.5-year postintervention period among South African adolescents.

    • Communicable Diseases

      • Text-based illness monitoring for detection of novel influenza A virus infections during an influenza A (H3N2)v virus outbreak in Michigan, 2016: Surveillance and survey
        Stewart RJ, Rossow J, Eckel S, Bidol S, Ballew G, Signs K, Conover JT, Burns E, Bresee JS, Fry AM, Olsen SJ, Biggerstaff M.
        JMIR Public Health Surveill. 2019 Apr 26;5(2):e10842.
        BACKGROUND: Rapid reporting of human infections with novel influenza A viruses accelerates detection of viruses with pandemic potential and implementation of an effective public health response. After detection of human infections with influenza A (H3N2) variant (H3N2v) viruses associated with agricultural fairs during August 2016, the Michigan Department of Health and Human Services worked with the US Centers for Disease Control and Prevention (CDC) to identify infections with variant influenza viruses using a text-based illness monitoring system. OBJECTIVE: To enhance detection of influenza infections using text-based monitoring and evaluate the feasibility and acceptability of the system for use in future outbreaks of novel influenza viruses. METHODS: During an outbreak of H3N2v virus infections among agricultural fair attendees, we deployed a text-illness monitoring (TIM) system to conduct active illness surveillance among households of youth who exhibited swine at fairs. We selected all fairs with suspected H3N2v virus infections. For fairs without suspected infections, we selected only those fairs that met predefined criteria. Eligible respondents were identified and recruited through email outreach and/or on-site meetings at fairs. During the fairs and for 10 days after selected fairs, enrolled households received daily, automated text-messages inquiring about illness; reports of illness were investigated by local health departments. To understand the feasibility and acceptability of the system, we monitored enrollment and trends in participation and distributed a Web-based survey to households of exhibitors from five fairs. RESULTS: Among an estimated 500 households with a member who exhibited swine at one of nine selected fairs, representatives of 87 (17.4%) households were enrolled, representing 392 household members. 
Among fairs that were ongoing when the TIM system was deployed, the number of respondents peaked at 54 on the third day of the fair and then steadily declined throughout the rest of the monitoring period; 19 out of 87 household representatives (22%) responded through the end of the 10-day monitoring period. We detected 2 H3N2v virus infections using the TIM system, which represents 17% (2/12) of all H3N2v virus infections detected during this outbreak in Michigan. Of the 70 survey respondents, 16 (23%) had participated in the TIM system. A total of 73% (11/15) participated because it was recommended by fair coordinators and 80% (12/15) said they would participate again. CONCLUSIONS: Using a text-message system, we monitored for illness among a large number of individuals and households and detected H3N2v virus infections through active surveillance. Text-based illness monitoring systems are useful for detecting novel influenza virus infections when active monitoring is necessary. Participant retention and testing of persons reporting illness are critical elements for system improvement.

    • Health Economics
      • BACKGROUND: Although influenza vaccination has been shown to reduce the incidence of major adverse cardiac events (MACE) among those with existing cardiovascular disease (CVD), in the 2015-16 season coverage for persons with heart disease was only 48% in the US. METHODS: We built a Monte Carlo (probabilistic) spreadsheet-based decision tree in 2018 to estimate the cost-effectiveness of increased influenza vaccination to prevent MACE readmissions. We based our model on current US influenza vaccination coverage of the estimated 493,750 US acute coronary syndrome (ACS) patients from the healthcare payer perspective. We excluded outpatient costs and time lost from work and included only hospitalization and vaccination costs. We also estimated the incremental cost per MACE case averted and incremental cost per QALY gained (ICER) if 75% of hospitalized ACS patients were vaccinated by discharge, and in a sensitivity analysis estimated the impact of increasing vaccination coverage incrementally by 5% up to 95% among hospitalized adults aged ≥65 years and 18-64 years, varying vaccine effectiveness from 30% to 40%. RESULTS: At 75% vaccination coverage by discharge, vaccination was cost-saving from the healthcare payer perspective in adults ≥65 years; the ICER was $12,680/QALY (95% CI: 6,273-20,264) in adults 18-64 years and $2,400/QALY (95% CI: −1,992 to 7,398) in all adults ≥18 years. This resulted in approximately 500 (95% CI: 439-625) additional averted MACEs per year for all adult patients aged ≥18 years and added approximately 700 (95% CI: 578-825) QALYs. In the sensitivity analysis, vaccination became cost-saving in adults ≥18 years at about an 80% vaccination rate. Achieving a 75% vaccination rate in all adults aged ≥18 years would require an additional cost of $3 million. Vaccine effectiveness, the cost of vaccination, and the vaccination coverage rate had the greatest impact on the results.
CONCLUSION: Increasing the vaccination rate among hospitalized ACS patients has a favorable cost-effectiveness profile and becomes cost-saving when at least 80% are vaccinated.
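The cost-effectiveness comparisons above rest on the standard incremental cost-effectiveness ratio (ICER). A minimal illustrative sketch follows; the function name and the round-number inputs are ours, not the study's model:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    when moving from the old strategy to the new one."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical round numbers for illustration only: suppose raising
# coverage costs an extra $3 million and adds about 700 QALYs.
print(round(icer(cost_new=3_000_000, cost_old=0, qaly_new=700, qaly_old=0)))  # → 4286
```

A strategy is "cost-saving" when the incremental cost is negative while QALYs increase, which is why no positive dollar figure is reported in that case.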

    • Healthcare Associated Infections
      • Using NHSN’s antimicrobial use option to monitor and improve antibiotic stewardship in neonates
        O’Leary EN, van Santen KL, Edwards EM, Braun D, Buus-Frank ME, Edwards JR, Guzman-Cottrill JA, Horbar JD, Lee GM, Neuhauser MM, Roberts J, Schulman J, Septimus E, Soll RF, Srinivasan A, Webb AK, Pollock DA.
        Hosp Pediatr. 2019 May;9(5):340-347.
        BACKGROUND: The Antimicrobial Use (AU) Option of the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN) is a surveillance resource that can provide actionable data for antibiotic stewardship programs. Such data are used to enable measurements of AU across hospitals and before, during, and after stewardship interventions. METHODS: We used monthly AU data and annual facility survey data submitted to the NHSN to describe hospitals and neonatal patient care locations reporting to the AU Option in 2017, examine frequencies of most commonly reported agents, and analyze variability in AU rates across hospitals and levels of care. We used results from these analyses in a collaborative project with Vermont Oxford Network to develop neonatal-specific Standardized Antimicrobial Administration Ratio (SAAR) agent categories and neonatal-specific NHSN Annual Hospital Survey questions. RESULTS: As of April 1, 2018, 351 US hospitals had submitted data to the AU Option from at least 1 neonatal unit. In 2017, ampicillin and gentamicin were the most frequently reported antimicrobial agents. On average, total rates of AU were highest in level III NICUs, followed by special care nurseries, level II-III NICUs, and well newborn nurseries. Seven antimicrobial categories for neonatal SAARs were created, and 6 annual hospital survey questions were developed. CONCLUSIONS: A small but growing percentage of US hospitals have submitted AU data from neonatal patient care locations to NHSN, enabling the use of AU data aggregated by NHSN as benchmarks for neonatal antimicrobial stewardship programs and further development of the SAAR summary measure for neonatal AU.

    • Immunity and Immunization
      • Poliovirus and rotavirus share notable similarities. Although rotavirus is not amenable to eradication because of animal reservoirs, live, attenuated oral vaccines have been the bedrock of both prevention and control programs, providing intestinal and humoral immunity. Both programs have also encountered challenges, including safety concerns and suboptimal immune responses to oral vaccines in low-income settings, prompting the search for alternative solutions. In this paper, we review the progress made by polio prevention and eradication efforts over the past six decades. Specifically, we discuss the roles of the oral polio vaccine (OPV) and the inactivated polio vaccine (IPV) in achieving polio eradication, and explore potential applications of these lessons to rotavirus. Recent scientific evidence has confirmed that a combined schedule of IPV and OPV adds synergistic value that may give the polio eradication effort the tools to end all poliovirus circulation worldwide. For rotavirus, oral vaccine is the only currently licensed and recommended vaccine for use in all children worldwide, providing heterologous protection against a broad range of strains. However, parenteral rotavirus vaccines are in the preclinical and clinical trial stages, and insight from polio provides strong justification for accelerating their development. While challenges for parenteral rotavirus vaccines, such as achieving protection against a broad range of strains, will need to be addressed, the combined use of oral and parenteral rotavirus vaccines may provide the humoral and intestinal immunity necessary to close the efficacy gaps between developing and developed countries and thereby control rotavirus worldwide. This strategy may also potentially reduce the risk of intussusception.

    • Laboratory Sciences
      • The Quansys multiplex (Q-Plex) measures ferritin (Fer), soluble transferrin receptor (sTfR), C-reactive protein (CRP), alpha-1-acid glycoprotein (AGP), and retinol-binding protein (RBP). We compared Q-Plex results with reference-type assays and evaluated Q-Plex performance. Pearson correlation and Lin’s concordance coefficients between the Q-Plex and reference assays were: Fer 0.98 and 0.91, sTfR 0.88 and 0.35, CRP 0.98 and 0.98, AGP 0.82 and 0.81, and RBP 0.68 and 0.31, respectively. The median relative differences between the Q-Plex and reference assays were: Fer -2.4%, sTfR 107%, CRP 0.03%, AGP -1.3%, and RBP 51%. The Q-Plex intra-assay CVs were <5%; the inter-assay CVs were higher: Fer 11%, sTfR 14%, CRP 9.3%, AGP 7.5%, and RBP 19%. EDTA plasma produced 74% higher Q-Plex sTfR concentrations compared to serum. Analyte stability was good for ≤5 freeze-thaw cycles. After adjusting Q-Plex data to the reference assays, sensitivity and specificity were >85% for Fer and CRP; specificity was >85% for sTfR, AGP, and RBP. Using performance criteria derived from biologic variation, Fer, CRP, and AGP met the minimum allowable imprecision (<10.7%, <31.7%, and <8.5%, respectively) and difference from the reference assay (<±7.7%, <±32.7%, and <±10.3%, respectively), while sTfR and RBP exceeded these thresholds (<8.5% and <7.8% for imprecision and <±7.7% and <±12% for difference, respectively). The Q-Plex measures multiple biomarkers simultaneously, is easy to perform, and uses small sample volumes. With some improvements in accuracy and precision (i.e., for sTfR and RBP), this assay could be a useful tool for low-resource laboratories conducting micronutrient surveys for epidemiologic screening applications. These findings need to be verified in other populations, particularly those with inadequate micronutrient status.
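The contrast drawn above between Pearson correlation (precision) and Lin’s concordance coefficient (agreement) can be made concrete: a proportional bias leaves Pearson’s r at 1.0 but lowers the CCC, much as reported for sTfR. A small sketch with toy numbers, not the study’s data:

```python
import statistics

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between paired assay results.
    Unlike Pearson's r, it also penalises systematic shifts in mean and scale."""
    n = len(x)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Toy data: the second assay reads roughly double the reference values,
# so the points are perfectly correlated yet far from agreement.
ref = [10, 20, 30, 40]
qplex = [21, 41, 61, 81]
print(round(lins_ccc(ref, qplex), 2))  # → 0.38
```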

    • Occupational Safety and Health
      • Since at least 2015, a major Zika virus epidemic has impacted the Americas and the Caribbean. There is an ongoing risk of Aedes mosquito-borne transmission in more than 90 countries and territories worldwide. In these areas, as well as in places that are not experiencing active outbreaks, workers in a variety of jobs may be exposed to the virus. In addition to outdoor workers in places with ongoing, vector-borne transmission who may be exposed when bitten by Zika-infected mosquitoes, biomedical researchers studying the virus and health care workers and staff in clinical laboratories may encounter blood and infectious body fluids from infected individuals, including travelers from Zika virus-affected areas. Because of potentially serious health outcomes, including reproductive effects, sometimes associated with Zika, the Occupational Safety and Health Administration and National Institute for Occupational Safety and Health previously issued guidance to help US employers protect workers from exposure to the virus on the job. This commentary summarizes the details of these recommendations and explains their rationale, which is important to understand when adapting and implementing workplace controls to prevent occupational Zika virus exposures and infections at individual worksites. The industrial hygiene hierarchy of controls, including elimination and substitution, engineering controls, administrative controls, and safe work practices, and personal protective equipment, serves as a framework for infection prevention practices for at-risk workers discussed here.

    • Substance Use and Abuse
      • OBJECTIVES: Assess use and reasons for use of electronic vapour products (EVPs) shaped like universal serial bus (USB) flash drives among adults in the USA. METHODS: Data came from SummerStyles, an internet survey of US adults aged ≥18 years (N=4088) fielded in June to July 2018. Respondents were shown product images and asked about ever use, current (past 30 days) use and reasons for use. Weighted point estimates and adjusted ORs were assessed. RESULTS: In 2018, 7.9% of participants had ever used flash drive-shaped EVPs, including 25.7% of current cigarette smokers and 45.9% of current EVP users. Moreover, 2.0% reported current use, including 6.8% of cigarette smokers and 34.3% of EVP users. Leading reasons for ever use were ‘to deliver nicotine’ (30.7%) and ‘friend or family member used them’ (30.2%). CONCLUSIONS: About one in 13 US adults have ever used flash drive-shaped EVPs, with use being highest among current EVP users. Nicotine content and friend/family use are drivers of ever use. PUBLIC HEALTH IMPLICATIONS: Understanding use of emerging EVP types can inform strategies to maximise any potential benefits for adult cessation and minimise risks of youth initiation.

  2. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Chronic Diseases and Conditions
      1. Anti-hypertensive medication use and factors related to adherence among adults with intellectual and developmental disabilities
        Cyrus AC, Royer J, Carroll DD, Courtney-Long EA, McDermott S, Turk MA.
        Am J Intellect Dev Disabil. 2019 May;124(3):248-262.
        Adults with intellectual and developmental disabilities (IDD) are known to experience significant health disparities; however, few studies have described anti-hypertensive medication adherence in this population. Using administrative data from South Carolina from 2000-2014, we evaluated the odds of adherence to anti-hypertensive medication among a cohort of adults with IDD and hypertension. Approximately half (49.5%) of the study cohort were adherent to anti-hypertensive medication. Those who lived in a supervised residence, had a Medicaid waiver, and had more frequent contact with a primary care provider were more likely to be adherent. Organizations that serve people with IDD have an opportunity to increase adherence by educating these individuals, their family members, and caregivers about the importance of adherence to anti-hypertensive medication.

      2. Surveillance of congenital heart defects among adolescents at three U.S. sites
        Lui GK, McGarry C, Bhatt A, Book W, Riehle-Colarusso TJ, Dunn JE, Glidewell J, Gurvitz M, Hoffman T, Hogue CJ, Hsu D, Obenhaus S, Raskind-Hood C, Rodriguez FH, Zaidi A, Van Zutphen AR.
        Am J Cardiol. 2019 Apr 10.
        The prevalence, co-morbidities, and healthcare utilization in adolescents with congenital heart defects (CHDs) are not well understood. Adolescents (11 to 19 years old) with a healthcare encounter between January 1, 2008 (January 1, 2009 for MA) and December 31, 2010 with a CHD diagnosis code were identified from multiple administrative data sources compiled at 3 US sites: Emory University, Atlanta, Georgia (EU); Massachusetts Department of Public Health (MA); and New York State Department of Health (NY). The estimated prevalence for any CHD was 4.77 (EU), 17.29 (MA), and 4.22 (NY) and for severe CHDs was 1.34 (EU), 3.04 (MA), and 0.88 (NY) per 1,000 adolescents. Private or commercial insurance was the most common insurance type for EU and NY, and Medicaid for MA. Inpatient encounters were more frequent in severe CHDs. Cardiac co-morbidities included rhythm and conduction disorders at 20% (EU), 46% (MA), and 9% (NY) as well as heart failure at 3% (EU), 15% (MA), and 2% (NY). Leading noncardiac co-morbidities were respiratory/pulmonary (22% EU, 34% MA, 16% NY), infectious disease (17% EU, 22% MA, 20% NY), non-CHD birth defects (12% EU, 23% MA, 14% NY), gastrointestinal (10% EU, 28% MA, 13% NY), musculoskeletal (10% EU, 32% MA, 11% NY), and mental health (9% EU, 30% MA, 11% NY). In conclusion, this study used a novel approach of uniform CHD definition and variable selection across administrative data sources in 3 sites for the first population-based CHD surveillance of adolescents in the United States. High resource utilization and co-morbidities illustrate the ongoing significant burden of disease in this vulnerable population.

      3. Effect of health information technologies on cardiovascular risk factors among patients with diabetes
        Yoshida Y, Boren SA, Soares J, Popescu M, Nielson SD, Koopman RJ, Kennedy DR, Simoes EJ.
        Curr Diab Rep. 2019 Apr 27;19(6):28.
        PURPOSE OF REVIEW: To identify a common effect of health information technologies (HIT) on the management of cardiovascular disease (CVD) risk factors among people with type 2 diabetes (T2D) across randomized controlled trials (RCTs). RECENT FINDINGS: CVD is the most frequent cause of morbidity and mortality among patients with diabetes. HIT are effective in reducing HbA1c; however, their effect on cardiovascular risk factor management for patients with T2D has not been evaluated. We identified 21 eligible studies (23 estimates) with measurement of SBP, 20 (22 estimates) of DBP, 14 (17 estimates) of HDL, 14 (17 estimates) of LDL, 15 (18 estimates) of triglycerides, and 10 (12 estimates) of weight across databases. We found significant reductions in SBP, DBP, LDL, and triglycerides, and a significant improvement in HDL associated with HIT. As adjuvants to standard diabetic treatment, HIT can be effective tools for improving CVD risk factors among patients with T2D, especially in those whose CVD risk factors are not at goal.

    • Communicable Diseases
      1. BACKGROUND: Infections with Histoplasma can range from asymptomatic to life-threatening acute pulmonary or disseminated disease. Histoplasmosis can be challenging to diagnose and is widely under-recognized. We analyzed insurance claims data to better characterize histoplasmosis testing and treatment practices and its burden on patients. METHODS: We used the IBM® MarketScan® Research Databases to identify patients with histoplasmosis (International Classification of Diseases, Ninth Revision, Clinical Modification [ICD-9-CM] codes 115.00-115.99) during 2012-2014. We analyzed claims from the 3 months before to the 1 year after diagnosis and examined differences between probable (hospitalized or >1 outpatient visit) and suspect (1 outpatient visit) patients. RESULTS: Among 1,935 patients (943 probable, 922 suspect), 54% had codes for symptoms or findings consistent with histoplasmosis and 35% had ≥2 healthcare visits in the 3 months before diagnosis. Overall, 646 (33%) had any fungal-specific laboratory test: histoplasmosis antibody test (n=349, 18%), Histoplasma antigen test (n=349, 18%), fungal smear (n=294, 15%), or fungal culture (n=223, 12%); 464 (24%) had a biopsy. Forty-nine percent of probable patients and 10% of suspect patients were prescribed antifungal medication in the outpatient setting. In total, 19% were hospitalized. Patients’ last histoplasmosis-associated healthcare visits occurred a median of 6 months after diagnosis. CONCLUSIONS: Some histoplasmosis patients experienced severe disease, apparent diagnostic delays, and prolonged illness, whereas other patients lacked symptoms and were likely diagnosed incidentally (e.g., via biopsy). Low rates of histoplasmosis-specific testing also suggest incidental diagnoses and low provider suspicion, highlighting the need for improved awareness of this disease.

      2. Trends in pretreatment HIV-1 drug resistance in antiretroviral therapy-naive adults in South Africa, 2000-2016: A pooled sequence analysis
        Chimukangara B, Lessells RJ, Rhee SY, Giandhari J, Kharsany AB, Naidoo K, Lewis L, Cawood C, Khanyile D, Ayalew KA, Diallo K, Samuel R, Hunt G, Vandormael A, Stray-Pedersen B, Gordon M, Makadzange T, Kiepiela P, Ramjee G, Ledwaba J, Kalimashe M, Morris L, Parikh UM, Mellors JW, Shafer RW, Katzenstein D, Moodley P, Gupta RK, Pillay D, Abdool Karim SS, de Oliveira T.
        EClinicalMedicine. 2019 ;9:26-34.
        Background: South Africa has the largest public antiretroviral therapy (ART) programme in the world. We assessed temporal trends in pretreatment HIV-1 drug resistance (PDR) in ART-naive adults from South Africa. Methods: We included datasets from studies conducted between 2000 and 2016, with HIV-1 pol sequences from more than ten ART-naive adults. We analysed sequences for the presence of 101 drug resistance mutations. We pooled sequences by sampling year and performed a sequence-level analysis using a generalized linear mixed model, including the dataset as a random effect. Findings: We identified 38 datasets, and retrieved 6880 HIV-1 pol sequences for analysis. The pooled annual prevalence of PDR remained below 5% until 2009, then increased to a peak of 11.9% (95% confidence interval (CI) 9.2-15.0) in 2015. The pooled annual prevalence of non-nucleoside reverse-transcriptase inhibitor (NNRTI) PDR remained below 5% until 2011, then increased to 10.0% (95% CI 8.4-11.8) by 2014. Between 2000 and 2016, there was a 1.18-fold (95% CI 1.13-1.23) annual increase in NNRTI PDR (p < 0.001), and a 1.10-fold (95% CI 1.05-1.16) annual increase in nucleoside reverse-transcriptase inhibitor PDR (p = 0.001). Interpretation: Increasing PDR in South Africa presents a threat to efforts to end the HIV/AIDS epidemic. These findings support the recent decision to modify the standard first-line ART regimen, but also highlight the need for broader public health action to prevent the further emergence and transmission of drug-resistant HIV. Source of Funding: This research project was funded by the South African Medical Research Council (MRC) with funds from National Treasury under its Economic Competitiveness and Support Package. Disclaimer: The contents of this publication are solely the responsibility of the authors and do not necessarily represent the official views of CDC.
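The fold-changes reported above are per-year multipliers from the regression model, so they compound over the study period; a quick sketch of that arithmetic (the function name is ours, for illustration):

```python
def fold_after_years(annual_fold, years):
    """Cumulative multiplier implied by a constant annual fold-change."""
    return annual_fold ** years

# A 1.18-fold annual increase, like that estimated for NNRTI PDR,
# compounds to roughly a five-fold increase over a decade.
print(round(fold_after_years(1.18, 10), 1))  # → 5.2
```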

      3. Epidemiological profile of individuals diagnosed with HIV: results from the preliminary phase of case-based surveillance in Kenya
        Harklerode R, Waruiru W, Humwa F, Waruru A, Kellogg T, Muthoni L, Macharia J, Zielinski-Gutierrez E.
        AIDS Care. 2019 Apr 27:1-7.
        Understanding the characteristics of individuals who are newly diagnosed with HIV is critical to controlling the HIV epidemic. Characterizing this population can improve strategies to identify undiagnosed positives and assist in targeting the provision of HIV services to improve health outcomes. We describe the characteristics of newly diagnosed HIV cases in western Kenya from 124 health facilities. Cases were matched to prevent duplication, and patients newly diagnosed between January and June 2015 were identified and descriptive analyses performed. Among 8664 newly identified HIV cases during the pilot timeframe, 3.1% (n=265) had retested for HIV after initial diagnosis. Linkage to care was recorded for approximately half (45.3%, n=3930), and 28.0% (n=2425) had a CD4 count available during the pilot timeframe. The median baseline CD4 count was 332 cells/μL (IQR: 156-544). Among the newly diagnosed aged 15 years or older with a CD4 test, 53.0% (n=1216) were diagnosed late, including 32.9% (n=755) who had advanced HIV at diagnosis. Factors associated with late diagnosis included being male and being in an age group older than 34 years. In western Kenya, continued efforts are needed in the area of testing to enhance early HIV diagnosis and epidemic control.

      4. Clinical development of therapeutic agents for hospitalized patients with influenza: Challenges and innovations
        King JC, Beigel JH, Ison MG, Rothman RE, Uyeki TM, Walker RE, Neaton JD, Tegeris JS, Zhou JA, Armstrong KL, Carter W, Miele PS, Willis MS, Dugas AF, Tracy LA, Vock DM, Bright RA.
        Open Forum Infect Dis. 2019 Apr;6(4):ofz137.
        Background: Since 1999, the US Food and Drug Administration has approved neuraminidase and endonuclease inhibitors to treat uncomplicated outpatient influenza, but not severe hospitalized influenza. After the 2009 pandemic, several influenza hospital-based clinical therapeutic trials were unsuccessful, possibly due to certain study factors. Therefore, in 2014, the US Health and Human Services agencies formed a Working Group (WG) to address related clinical challenges. Methods: Starting in 2014, the WG obtained retrospective data from failed hospital-based influenza therapeutic trials and nontherapeutic hospital-based influenza studies. These data allowed the WG to identify factors that might improve hospital-based therapeutic trials, including primary clinical endpoints, increased clinical site enrollment, and appropriate baseline enrollment criteria. Results: During 2018, the WG received retrospective data from a National Institutes of Health hospital-based influenza therapeutic trial whose primary endpoint, time to resolution of respiratory status, proved unsatisfactory. The WG statisticians examined these data and believed that ordinal outcomes might be a more powerful primary endpoint. Johns Hopkins researchers provided the WG with data from an emergency-department (ED) triage study that identified patients with confirmed influenza using molecular testing. During the 2013-2014 influenza season, 4 EDs identified 1074 influenza patients, which suggested that triage testing could increase enrollment at hospital-based clinical trial sites. In 2017, the WG received data from Northwestern Memorial Hospital researchers regarding 703 influenza inpatients over 5 seasons. The WG applied the National Early Warning Score (NEWS) at patient baseline to identify appropriate criteria to enroll patients into hospital-based therapeutic trials.
Conclusions: Data received by the WG indicated that hospital-based influenza therapeutic trials could use ordinal outcome analyses, ED triage to identify influenza patients, and NEWS for enrollment criteria.
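
        The National Early Warning Score mentioned above aggregates routine vital signs into a single severity score. As an illustration of how a baseline NEWS might be computed, the sketch below follows the published Royal College of Physicians scoring bands; it is an illustrative sketch only, not the Working Group's enrollment tool.

```python
def news_score(resp_rate, spo2, temp_c, sys_bp, pulse, on_oxygen, alert):
    """Simplified National Early Warning Score (NEWS) calculator.

    Bands follow the Royal College of Physicians NEWS chart; this is
    an illustrative sketch, not the Working Group's enrollment tool.
    """
    score = 0
    # Respiration rate (breaths/min)
    if resp_rate <= 8: score += 3
    elif resp_rate <= 11: score += 1
    elif resp_rate <= 20: score += 0
    elif resp_rate <= 24: score += 2
    else: score += 3
    # Oxygen saturation (%)
    if spo2 <= 91: score += 3
    elif spo2 <= 93: score += 2
    elif spo2 <= 95: score += 1
    # Temperature (deg C)
    if temp_c <= 35.0: score += 3
    elif temp_c <= 36.0: score += 1
    elif temp_c <= 38.0: score += 0
    elif temp_c <= 39.0: score += 1
    else: score += 2
    # Systolic blood pressure (mmHg)
    if sys_bp <= 90: score += 3
    elif sys_bp <= 100: score += 2
    elif sys_bp <= 110: score += 1
    elif sys_bp <= 219: score += 0
    else: score += 3
    # Pulse (beats/min)
    if pulse <= 40: score += 3
    elif pulse <= 50: score += 1
    elif pulse <= 90: score += 0
    elif pulse <= 110: score += 1
    elif pulse <= 130: score += 2
    else: score += 3
    # Supplemental oxygen in use
    if on_oxygen: score += 2
    # Level of consciousness (AVPU: anything below Alert scores 3)
    if not alert: score += 3
    return score

# A patient with entirely normal vitals scores 0
print(news_score(16, 98, 37.0, 120, 72, False, True))  # 0
```

        Enrollment criteria for a trial could then be stated as a threshold on this score (e.g., enroll at or above a given value), which is the kind of baseline cutoff the WG evaluated.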

      5. HIV incidence and risk behaviours of people who inject drugs in Bangkok, 1995-2012
        Martin M, Vanichseni S, Sangkum U, Mock PA, Leethochawalit M, Chiamwongpaet S, Pitisuttithum P, Kaewkungwal J, van Griensven F, McNicholl JM, Tappero JW, Mastro TD, Kittimunkong S, Choopanya K.
        EClinicalMedicine. 2019 ;9:44-51.
        Background: Three consecutive prospective studies were conducted among people who inject drugs (PWID) from May 1995 through June 2012 in Bangkok, Thailand. We examined data from these studies to evaluate HIV incidence and explore trends in risk behaviours. Methods: We used data from a 1995-1998 cohort study, a 1999-2004 HIV vaccine trial, and a 2005-2012 HIV pre-exposure prophylaxis (PrEP) study to examine per-quarter trends in HIV incidence, using a restricted cubic spline function for time in a Poisson regression. We also examined temporal trends in HIV-associated risk behaviours. Findings: HIV incidence declined from 5.7 per 100 person-years during the cohort study, to 2.7 per 100 person-years in the vaccine trial, to 0.7 per 100 person-years among PrEP study placebo recipients. Incidence peaked at 12.1 per 100 person-years in 1996 and declined to <1.0 per 100 person-years during 2005-2012. Reports of injecting drugs and sharing needles also declined from the cohort study to the PrEP study (p < 0.0001). Heroin was the most common drug injected during the cohort study and the vaccine trial, but stimulants (e.g., methamphetamine) and sedatives (e.g., midazolam) were injected more often during the PrEP study. Interpretation: HIV incidence among PWID declined during 2005-2012. Several factors likely contributed to the decline, including decreases in the frequency of injecting and sharing, improved access to HIV testing and antiretroviral therapy, and the use of PrEP. Expanding access to effective HIV prevention tools can hasten control of the HIV epidemic among PWID. Funding: The Bangkok Metropolitan Administration and U.S. Centers for Disease Control and Prevention, Division of HIV/AIDS Prevention.
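
        The incidence figures above are crude rates: events divided by accumulated person-time, scaled to 100 person-years. A minimal sketch with hypothetical counts (not the study's underlying data):

```python
def incidence_per_100_py(cases, person_years):
    """Crude incidence rate per 100 person-years of follow-up."""
    return 100.0 * cases / person_years

# Hypothetical example: 57 seroconversions over 1000 person-years of
# follow-up gives a rate on the scale reported for the 1995-1998 cohort
print(incidence_per_100_py(57, 1000))  # 5.7
```

        The study's per-quarter trend modeling then smooths such rates over calendar time with a restricted cubic spline inside a Poisson regression.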

      6. Hepatitis C in pregnant American Indian and Alaska Native women; 2003-2015
        Nolen LD, O’Malley JC, Seeman SS, Bruden DJ, Apostolou A, McMahon BJ, Bruce MG.
        Int J Circumpolar Health. 2019 Dec;78(1):1608139.
        Recent reports have found a rise in Hepatitis C virus (HCV) infection in reproductive age women in the USA. Surveillance data suggest that one group at increased risk of HCV infection is the American Indian and Alaska Native (AI/AN) population. Using the National Center for Health Statistics (NCHS) birth certificate and the Indian Health Service, Tribal, and Urban Indian (IHS) databases, we evaluated reported cases of HCV infection in pregnant women between 2003 and 2015. In the NCHS database, 38 regions consistently reported HCV infection. The percentage of mothers who were known to have HCV infection increased between 2011 and 2015 in both the AI/AN population (0.57% to 1.19%, p < 0.001) and the non-AI/AN population (0.21% to 0.36%, p < 0.001). The IHS database confirmed these results. Individuals with hepatitis B infection or intravenous drug use (IDU) had significantly higher odds of HCV infection (OR 16.4 and 17.6, respectively). In total, 62% of HCV-positive women did not have IDU recorded. This study demonstrates a significant increase in the proportion of pregnant women infected with HCV between 2003 and 2015. This increase was greater in AI/AN women than in non-AI/AN women, highlighting the need for HCV screening and prevention in pregnant AI/AN women.

      7. BACKGROUND: The association between the type of diagnostic testing algorithm for HIV infection and the time from diagnosis to care has not been fully evaluated. Here we extend an earlier analysis of this association by controlling for patient and diagnosing facility characteristics. STUDY DESIGN: Descriptive analysis of HIV infection diagnoses during 2016 reported to the National HIV Surveillance System through December 2017. Algorithm type: traditional = initial HIV antibody immunoassay followed by a Western blot or immunofluorescence antibody test; recommended = initial HIV antigen/antibody immunoassay followed by HIV-1/2 type-differentiating antibody test; rapid = two CLIA-waived rapid tests on the same date. RESULTS: In multivariate analyses controlling for patient and diagnosing facility characteristics, persons whose infection was diagnosed using the rapid algorithm were more likely to be linked to care within 30 days than those whose infection was diagnosed using the other testing algorithms (p < 0.01). The median time to link to care during a 30-day follow-up was 9.0 days (95% CI 8.0-12.0) after the rapid algorithm, 17.0 days (95% CI 17.0-18.0) after the recommended algorithm, and 23.0 days (95% CI 22.0-25.0) after the traditional algorithm. CONCLUSIONS: The time from HIV diagnosis to care varied with the type of testing algorithm. The median time to care was shortest for the rapid algorithm, longest for the traditional algorithm, and intermediate for the recommended algorithm. These results demonstrate the importance of choosing an algorithm with a short time between initial specimen collection and report of the final result to the patient.

    • Disease Reservoirs and Vectors
      1. Comparative analysis of serologic cross-reactivity using convalescent sera from filovirus-experimentally infected fruit bats
        Schuh AJ, Amman BR, Sealy TS, Flietstra TD, Guito JC, Nichol ST, Towner JS.
        Sci Rep. 2019 Apr 30;9(1):6707.
        With the exception of Reston and Bombali viruses, the marburgviruses and ebolaviruses (family Filoviridae) cause outbreaks of viral hemorrhagic fever in sub-Saharan Africa. The Egyptian rousette bat (ERB) is a natural reservoir host for the marburgviruses, and evidence suggests that bats are also natural reservoirs for the ebolaviruses. Although the search for the natural reservoirs of the ebolaviruses has largely involved serosurveillance of the bat population, there are no validated serological assays to screen bat sera for ebolavirus-specific IgG antibodies. Here, we generate filovirus-specific antisera by prime-boost immunization of groups of captive ERBs with all seven known culturable filoviruses. After validating a system of filovirus-specific indirect ELISAs utilizing infectious-based virus antigens for detection of virus-specific IgG antibodies from bat sera, we assess the level of serological cross-reactivity between the virus-specific antisera and heterologous filovirus antigens. These data are then used to generate a filovirus antibody fingerprint that can predict which of the filovirus species in the system is most antigenically similar to the species responsible for past infection. Our filovirus IgG indirect ELISA system will be a critical tool for identifying bat species with high ebolavirus seroprevalence rates to target for longitudinal studies aimed at establishing natural reservoir host-ebolavirus relationships.

    • Global Health
      1. Migration and forced displacement are at record levels in today’s geopolitical environment; ensuring the health of migrating populations and the health security of asylum and receiving countries is critically important. Overseas screening, treatment, and vaccination during planned migration to the United States represents one successful model. These strategies have improved tuberculosis detection and treatment, reducing rates in the United States; decreased transmission and importation of vaccine-preventable diseases; prevented morbidity and mortality from parasitic diseases among refugees; and saved health costs. We describe the work of CDC’s Division of Global Migration and Quarantine and partners in developing and implementing these strategies.

    • Health Economics
      1. Estimated cost of comprehensive syringe service program in the United States
        Teshale EH, Asher A, Aslam MV, Augustine R, Duncan E, Rose-Wood A, Ward J, Mermin J, Owusu-Edusei K, Dietz PM.
        PLoS One. 2019 ;14(4):e0216205.
        OBJECTIVE: To estimate the cost of establishing and operating a comprehensive syringe service program (SSP) free to clients in the United States. METHODS: We identified the major cost components of a comprehensive SSP (one-time start-up cost, plus annual costs associated with personnel, operations, and prevention/medical services) and estimated the anticipated total costs (2016 US dollars) based on program size (number of clients served each year) and geographic location of the service (rural, suburban, and urban). RESULTS: The estimated costs ranged from $0.4 million for a small rural SSP (serving 250 clients) to $1.9 million for a large urban SSP (serving 2,500 clients), of which 1.6% and 0.8% are the start-up costs of a small rural and a large urban SSP, respectively. Cost per syringe distributed varied from $3 (small urban SSP) to $1 (large rural SSP), and cost per client per year varied from $2000 (small urban SSP) to $700 (large rural SSP). CONCLUSIONS: Estimates of the cost of SSPs in the United States vary by number of clients served and geographic location of service. Accurate costing can be useful for planning programs, developing policy, allocating funds for establishing and supporting SSPs, and providing data for economic evaluation of SSPs.
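
        The per-client and per-syringe figures above are simple unit costs. A minimal sketch, with hypothetical inputs chosen only to echo the scale of the large-SSP estimates (these are not the paper's cost model):

```python
def unit_costs(total_annual_cost, clients, syringes_distributed):
    """Unit costs for a syringe service program (SSP).

    Hypothetical figures; the paper's full model also separates
    one-time start-up cost from annual personnel, operations, and
    prevention/medical service costs.
    """
    return {
        "cost_per_client": total_annual_cost / clients,
        "cost_per_syringe": total_annual_cost / syringes_distributed,
    }

# Illustrative program serving 2,500 clients on a $1.75M annual budget
costs = unit_costs(total_annual_cost=1_750_000,
                   clients=2500,
                   syringes_distributed=1_750_000)
print(costs)  # {'cost_per_client': 700.0, 'cost_per_syringe': 1.0}
```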

    • Healthcare Associated Infections
      1. Comparison of two glove-sampling methods to discriminate between study arms of a hand hygiene and glove-use study
        Robinson GL, Otieno L, Johnson JK, Rose LJ, Harris AD, Noble-Wang J, Thom KA.
        Infect Control Hosp Epidemiol. 2018 Jul;39(7):884-885.

        [No abstract]

    • Immunity and Immunization
      1. BACKGROUND: Rotavirus is the leading cause of severe diarrhea among children worldwide, and vaccines can reduce morbidity and mortality by 50-98%. The test-negative control (TNC) study design is increasingly used for evaluating the effectiveness of vaccines against rotavirus and other vaccine-preventable diseases. In this study design, symptomatic patients who seek medical care are tested for the pathogen of interest. Those who test positive are classified as cases; those who test negative, as controls. METHODS: We use a probability model to evaluate the bias of estimates of rotavirus vaccine effectiveness (VE) against rotavirus diarrhea resulting in hospitalization in the presence of possible confounding and selection biases due to differences in the propensity of seeking medical care (PSMC) between vaccinated and unvaccinated children. RESULTS: The TNC-based VE estimate corrects for confounding bias when the confounder’s effects on the probabilities of rotavirus and non-rotavirus related hospitalizations are equal. If this condition is not met, then the estimated VE may be substantially biased. The bias is more severe in low-income countries, where VE is known to be lower. Under our model, differences in PSMC between vaccinated and unvaccinated children do not result in selection bias when the TNC study design is used. CONCLUSIONS: In practice, one can expect the association of PSMC (or other potential confounders) with the probabilities of rotavirus and non-rotavirus related hospitalization to be similar, in which case the confounding effects will only result in small bias in the VE estimate from TNC studies. The results of this work, along with those of our previous paper, confirm that the TNC design can be expected to provide reliable estimates of rotavirus VE in both high- and low-income countries.
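
        In a test-negative design, crude VE is derived from the odds ratio comparing vaccination among test-positives (cases) and test-negatives (controls). A minimal, unadjusted sketch with hypothetical counts (real TNC analyses adjust for confounders, typically via logistic regression):

```python
def tnc_vaccine_effectiveness(vacc_pos, vacc_neg, unvacc_pos, unvacc_neg):
    """Crude vaccine effectiveness from a test-negative design.

    Cases are test-positive, controls are test-negative; VE is
    estimated as (1 - odds ratio) x 100. Unadjusted sketch only.
    """
    odds_ratio = (vacc_pos * unvacc_neg) / (vacc_neg * unvacc_pos)
    return 100.0 * (1.0 - odds_ratio)

# Hypothetical counts: 40 vaccinated cases, 160 vaccinated controls,
# 100 unvaccinated cases, 100 unvaccinated controls
print(tnc_vaccine_effectiveness(40, 160, 100, 100))  # 75.0
```

        The paper's point is that this estimate stays nearly unbiased when a confounder (such as care-seeking propensity) affects rotavirus and non-rotavirus hospitalization probabilities similarly.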

      2. Data resource profile: Household Influenza Vaccine Evaluation (HIVE) Study
        Monto AS, Malosh RE, Evans R, Lauring AS, Gordon A, Thompson MG, Fry AM, Flannery B, Ohmit SE, Petrie JG, Martin ET.
        Int J Epidemiol. 2019 Apr 30.

        [No abstract]

      3. Rotavirus vaccine impact assessment surveillance in India: protocol and methods
        Nair NP, Reddy NS, Giri S, Mohan VR, Parashar U, Tate J, Shah MP, Arora R, Gupte M, Mehendale SM, Kang G.
        BMJ Open. 2019 Apr 25;9(4):e024840.
        INTRODUCTION: Rotavirus infection accounts for 39% of under-five diarrhoeal deaths globally, and 22% of these deaths occur in India. Introduction of rotavirus vaccine in a national immunisation programme is considered to be the most effective intervention in preventing severe rotavirus disease. In 2016, India introduced an indigenous rotavirus vaccine (Rotavac) into the Universal Immunisation Programme in a phased manner. This paper describes the protocol for surveillance to monitor the performance of rotavirus vaccine following its introduction into the routine childhood immunisation programme. METHODS: An active surveillance system was established to identify acute gastroenteritis cases among children less than 5 years of age. For all children enrolled at sentinel sites, case reporting forms are completed, and a copy of the vaccination record and a stool specimen are obtained. The forms and specimens are sent to the referral laboratory for data entry, analysis, testing and storage. Data from sentinel sites in states that have introduced rotavirus vaccine into their routine immunisation schedule will be used to determine rotavirus vaccine impact and effectiveness. ETHICS AND DISSEMINATION: The Institutional Review Board of Christian Medical College, Vellore, and all the site institutional ethics committees approved the project. Results will be disseminated in peer-reviewed journals and with stakeholders of the universal immunisation programme in India.

      4. The role of immune correlates of protection on the pathway to licensure, policy decision and use of group B Streptococcus vaccines for maternal immunization: considerations from World Health Organization consultations
        Vekemans J, Crofts J, Baker CJ, Goldblatt D, Heath PT, Madhi SA, Le Doare K, Andrews N, Pollard AJ, Saha SK, Schrag SJ, Smith PG, Kaslow DC.
        Vaccine. 2019 Apr 25.
        The development of a group B Streptococcus (GBS) vaccine for maternal immunization constitutes a global public health priority, to prevent GBS-associated early life invasive disease, stillbirth, premature birth, maternal sepsis, and adverse neurodevelopmental consequences, and to reduce perinatal antibiotic use. Sample size requirements for the conduct of a randomized placebo-controlled trial to assess vaccine efficacy against the most relevant clinical endpoints, under conditions of appropriate ethical standards of care, constitute a significant obstacle on the pathway to vaccine availability. Alternatively, indirect evidence of protection based on immunologic data from vaccine and sero-epidemiological studies, complemented by data from opsonophagocytic in vitro assays and animal models, could be considered as pivotal data for licensure, with subsequent confirmation of effectiveness against disease outcomes in post-licensure evaluations. Based on discussions initiated by the World Health Organization, we present key considerations about the potential role of correlates of protection towards an accelerated pathway for GBS vaccine licensure and wide scale use. Priority activities to support progress to regulatory and policy decisions are outlined.

      5. Progress toward sustainable influenza vaccination in the Lao People’s Democratic Republic, 2012-2018
        Xeuatvongsa A, Mott JA, Khanthamaly V, Patthammavong C, Phounphenghak K, McKinlay M, Mirza S, Lafond KE, McCarron M, Corwin A, Moen A, Olsen SJ, Bresee JS.
        Vaccine. 2019 Apr 23.
        Despite global recommendations for influenza vaccination of high-risk, target populations, few low and middle-income countries have national influenza vaccination programs. Between 2012 and 2017, Lao PDR planned and conducted a series of activities to develop its national influenza vaccine program as a part of its overall national immunization program. In this paper, we review the underlying strategic planning for this process, and outline the sequence of activities, research studies, partnerships, and policy decisions that were required to build Laos’ influenza vaccine program. The successful development and sustainability of the program in Laos offers lessons for other low and middle-income countries interested in initiating or expanding influenza immunization.

    • Informatics
      1. Predictive analytics: Helping guide the implementation research agenda at the National Heart, Lung, and Blood Institute
        Engelgau MM, Khoury MJ, Roper RA, Curry JS, Mensah GA.
        Glob Heart. 2019 Mar;14(1):75-79.

        [No abstract]

      2. How did Ebola information spread on twitter: broadcasting or viral spreading?
        Liang H, Fung IC, Tse ZT, Yin J, Chan CH, Pechta LE, Smith BJ, Marquez-Lameda RD, Meltzer MI, Lubell KM, Fu KW.
        BMC Public Health. 2019 Apr 25;19(1):438.
        BACKGROUND: Information and emotions towards public health issues could spread widely through online social networks. Although aggregate metrics on the volume of information diffusion are available, we know little about how information spreads on online social networks. Health information could be transmitted from one to many (i.e., broadcasting) or along a chain from individual to individual (i.e., viral spreading). The aim of this study is to examine the spreading pattern of Ebola information on Twitter and identify influential users regarding Ebola messages. METHODS: Our data were purchased from GNIP. We obtained all Ebola-related tweets posted globally from March 23, 2014 to May 31, 2015. We reconstructed Ebola-related retweeting paths based on Twitter content and the follower-followee relationships. Social network analysis was performed to investigate retweeting patterns. In addition to describing the diffusion structures, we classified users in the network into four categories (i.e., influential user, hidden influential user, disseminator, common user) based on following and retweeting patterns. RESULTS: On average, 91% of the retweets were directly retweeted from the initial message. Moreover, 47.5% of the retweeting paths of the original tweets had a depth of 1 (i.e., from the seed user to its immediate followers). These observations suggested that broadcasting was more pervasive than viral spreading. We found that influential users and hidden influential users triggered more retweets than disseminators and common users. Disseminators and common users relied more on the viral model for spreading information beyond their immediate followers via influential and hidden influential users. CONCLUSIONS: Broadcasting was the dominant mechanism of information diffusion of a major health event on Twitter. This suggests that public health communicators can work beneficially with influential and hidden influential users to get the message across, because influential and hidden influential users can reach more people who are not following the public health Twitter accounts. Although both influential users and hidden influential users can trigger many retweets, recognizing and using the hidden influential users as the source of information could potentially be a cost-effective communication strategy for public health promotion. However, challenges remain due to the uncertain credibility of these hidden influential users.
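
        The depth-of-retweet statistic that separates broadcasting (depth 1) from viral spreading (longer chains) can be sketched as follows. The cascade below is toy data, not the GNIP dataset used in the study:

```python
def path_depths(edges, seed):
    """Depth of each node in a retweet cascade rooted at `seed`.

    `edges` maps each retweeter to the user they retweeted from.
    Depth 1 means a direct retweet of the seed (broadcasting);
    deeper chains indicate viral spreading. Toy sketch only.
    """
    depths = {}

    def depth(user):
        if user == seed:
            return 0
        if user not in depths:
            depths[user] = depth(edges[user]) + 1
        return depths[user]

    for user in edges:
        depth(user)
    return depths

# Toy cascade: four direct retweets of the seed, one second-hop retweet
edges = {"a": "seed", "b": "seed", "c": "seed", "d": "seed", "e": "a"}
d = path_depths(edges, "seed")
share_depth1 = sum(1 for v in d.values() if v == 1) / len(d)
print(share_depth1)  # 0.8 -> mostly broadcasting
```

        A real analysis would also need the follower-followee graph to resolve which upstream user each retweet actually came from, since Twitter attributes all retweets to the original author.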

    • Injury and Violence
      1. Functional outcome trajectories following inpatient rehabilitation for TBI in the United States: A NIDILRR TBIMS and CDC Interagency Collaboration
        Dams-O’Connor K, Ketchum JM, Cuthbert JP, Corrigan JD, Hammond FM, Haarbauer-Krupa J, Kowalski RG, Miller AC.
        J Head Trauma Rehabil. 2019 Apr 25.
        OBJECTIVE: To describe trajectories of functioning up to 5 years after traumatic brain injury (TBI) that required inpatient rehabilitation in the United States using individual growth curve models conditioned on factors associated with variability in functioning and independence over time. DESIGN: Secondary analysis of population-weighted data from a multicenter longitudinal cohort study. SETTING: Acute inpatient rehabilitation facilities. PARTICIPANTS: A total of 4624 individuals 16 years and older with a primary diagnosis of TBI. MAIN OUTCOME MEASURES: Ratings of global disability and supervision needs as reported by participants or proxy during follow-up telephone interviews at 1, 2, and 5 years postinjury. RESULTS: Many TBI survivors experience functional improvement through 1 and 2 years postinjury, followed by a decline in functioning and decreased independence by 5 years. However, there was considerable heterogeneity in outcomes across individuals. Factors such as older age, non-White race, lower preinjury productivity, public payer source, longer length of inpatient rehabilitation stay, and lower discharge functional status were found to negatively impact trajectories of change over time. CONCLUSIONS: These findings can inform the content, timing, and target recipients of interventions designed to maximize functional independence after TBI.

      2. Prior research has demonstrated the scope and impact of adverse childhood experiences (ACEs) on health and wellbeing. Less is known about the trajectories from exposure to ACEs, such as witnessing family conflict and violence in the community, to teen dating violence perpetration, and the protective factors that buffer the association between early exposure to ACEs and later teen dating violence perpetration. Students (n = 1611) completed self-report surveys six times during middle and high school from 2008 to 2013. In early middle school, the sub-sample was 50.2% female and racially/ethnically diverse: 47.7% Black, 36.4% White, 3.4% Hispanic, 1.7% Asian/Pacific Islander, and 10.8% other. Youth were, on average, 12.7 years old. Latent transition analysis was used to assess how trajectories of exposure to parental conflict and community violence during middle school transition into classes of teen dating violence perpetration (e.g., sexual, physical, threatening, relational, and verbal) in high school. Protective factors were then analyzed as moderators of the transition probabilities. Three trajectory classes of ACEs during middle school were identified: decreasing family conflict and increasing community violence (n = 103; 6.4%), stable low family conflict and stable low community violence (n = 1027; 63.7%), and stable high family conflict and stable high community violence (n = 481; 29.9%). A three-class solution for teen dating violence perpetration in high school was found: a high all teen dating violence class (n = 113; 7.0%), a physical and verbal only teen dating violence class (n = 335; 20.8%), and a low all teen dating violence class (n = 1163; 72.2%). Social support, empathy, school belonging and parental monitoring buffered some transitions from ACEs exposure trajectory classes to teen dating violence perpetration classes.
Comprehensive prevention strategies that address multiple forms of violence while bolstering protective factors across the social ecology may buffer negative effects of exposure to violence in adolescence.

      3. The purpose was to explore the underlying mechanisms that drive relationships between knowledge, attitudes and intervening bystander behavior to improve bystander violence prevention program effectiveness. Perceptual effects theory was used to understand third-person and first-person perceptions (TPP and FPP) as related to bystander intervention programs and to what extent perceptual gaps influence one’s intention to intervene. A web-based survey was conducted with 379 undergraduate students recruited from a large northeastern university. The survey covered demographics, previous bystander training, self-efficacy to engage in bystander behavior, social desirability of bystander intervention training programs, and perceived effects on self and others. Participants indicated how they would act in six hypothetical dating violence/bullying and sexual violence scenarios, and how they thought an average student on campus would act. Perceived ambiguity and risk for each of the scenarios were also measured. Descriptive statistics, paired-sample t-tests, and multilevel model analyses were conducted. Results showed that a robust first-person perception effect existed (i.e., students perceived themselves as being more influenced by bystander interventions/messages than their peers). The magnitude of FPP was increased by sex (significantly larger gap among female students) and previous training. Results show promise to further tailor and refine bystander interventions and provide directions to improve program effectiveness. Despite study limitations, the results indicate the first-person effect warrants further consideration for programming and messaging. Tailoring bystander training or repeated exposure may increase bystander behaviors. More research is needed to fully uncover TPP/FPP effects, predictors, and impacts on bystander intervention programs.

    • Laboratory Sciences
      1. Development and application of a high throughput one-pot extraction protocol for quantitative LC-MS/MS analysis of phospholipids in serum and lipoprotein fractions in normolipidemic and dyslipidemic subjects
        Gardner MS, Kuklenyik Z, Lehtikoski A, Carter KA, McWilliams LG, Kusovschi J, Bierbaum K, Jones JI, Rees J, Reis G, Pirkle JL, Barr JR.
        J Chromatogr B Analyt Technol Biomed Life Sci. 2019 Apr 22;1118-1119:137-147.
        Progress toward better diagnosis and treatment of lipid metabolism-related diseases requires high throughput approaches for multiplexed quantitative analysis of structurally diverse lipids, including phospholipids (PLs). This work demonstrates a simplified “one-pot” phospholipid extraction protocol, as an alternative to conventional liquid-liquid extraction. Performed in a 96-well format, the extraction was coupled with high throughput UPLC and multiplexed tandem mass spectrometry (MS/MS) detection, allowing non-targeted quantification of phosphatidylcholines (PC), sphingomyelins (SM), lysophosphatidylcholines (LPC), phosphatidylethanolamines (PE), and phosphatidylinositols (PI). Using 50 µL aliquots of serum samples from 110 individuals, lipoproteins were fractionated by size, and analyzed for phospholipids and non-polar lipids including free cholesterol (FC), cholesteryl esters (CEs) and triglycerides (TGs). Analysis of serum samples with a wide range of Total-TG levels showed significant differences in PL composition. The correlations of molar ratios in lipoprotein size fractions, SM/PL with FC/PL, PE/PL with TG/CE, and PE/PL with PI/PL, demonstrate the applicability of the method for quantitative composition analysis of high, low and very-low density lipoproteins (HDL, LDL and VLDL), and characterization of lipid metabolism related disease states.

      2. Cold-induced vasodilation responses before and after exercise in normobaric normoxia and hypoxia
        Gerhart HD, Seo Y, Vaughan J, Followay B, Barkley JE, Quinn T, Kim JH, Glickman EL.
        Eur J Appl Physiol. 2019 Apr 25.
        PURPOSE: Cold-induced vasodilation (CIVD) is known to protect humans against local cold injuries and improve manual dexterity. The current study examined the effects of metabolic heat production on cold-induced vasodilation responses in normobaric hypoxia and normoxia. METHODS: Ten participants immersed their non-dominant hand into 5 °C water for 15 min. Minimum finger temperature (Tmin), maximum finger temperature (Tmax), onset time, amplitude, and peak time were measured before and after exercise under normoxia (21% O2) and two levels of normobaric hypoxia (17% O2 and 13% O2). RESULTS: Neither Tmin nor amplitude was affected by hypoxia. However, Tmax was significantly decreased by hypoxia while reduction in onset time and peak time trended towards significance. Tmin, Tmax, and amplitude were significantly higher during post-exercise CIVD than pre-exercise CIVD. CONCLUSION: The CIVD response may be negatively affected by the introduction of hypoxia whereas metabolic heat production via exercise may counteract adverse effects of hypoxia and improve CIVD responses.

      3. Development and implementation of evidence-based laboratory safety management tools for a public health laboratory
        Keckler MS, Anderson K, McAllister S, Rasheed JK, Noble-Wang J.
        Safety Science. 2019 August;117:205-216.
        We developed an evidence-based continuous quality improvement (CQI) cycle for laboratory safety as a method of utilizing survey data to improve safety in a public health laboratory setting. Expert Opinion: The CQI cycle begins with the solicitation of laboratory staff input via an annual survey addressing potential chemical, physical and radiological hazards associated with multiple laboratory activities. The survey collects frequency, severity and exposure data related to these activities in the context of the most pathogenic organisms handled at least weekly. Gap Analysis: Step 2 of the CQI cycle used survey data to identify areas needing improvement. Typically, the traditional two-dimensional risk assessment matrix is used to prioritize mitigations. However, we added an additional dimension – frequency of exposure – to create three-dimensional risk maps to better inform and communicate risk priorities. Mitigation Measures: Step 3 of the CQI cycle was to use these results to develop mitigations. This included evaluating the identified risks to determine what risk control measures (elimination, substitution, engineering, administrative or PPE) were needed. In the 2016 iteration of the CQI cycle described here, all mitigations were based on administrative controls. Evaluation and Feedback: The last step of the CQI cycle was to evaluate the inferred effects of interventions through subsequent surveys, allowing for qualitative assessment of intervention effectiveness while simultaneously restarting the cycle by identifying new hazards. Here we describe the tools used to drive this CQI cycle, including the survey tool, risk analysis method, design of interventions and inference of mitigation effectiveness.
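
        The three-dimensional risk map described above extends the usual severity-by-likelihood matrix with a frequency-of-exposure axis. A minimal sketch, assuming illustrative 1-5 scales rather than the survey's actual scoring:

```python
def risk_score(severity, likelihood, exposure_frequency):
    """Three-dimensional risk priority score.

    A traditional risk matrix multiplies severity by likelihood; the
    CQI cycle described above adds frequency of exposure as a third
    dimension. The 1-5 scales here are illustrative assumptions, not
    the survey's actual instrument.
    """
    for v in (severity, likelihood, exposure_frequency):
        if not 1 <= v <= 5:
            raise ValueError("each dimension is scored on a 1-5 scale")
    return severity * likelihood * exposure_frequency

# A moderately severe, unlikely hazard encountered daily can outrank
# a severe hazard that is rarely encountered
print(risk_score(3, 2, 5))  # 30
print(risk_score(5, 2, 1))  # 10
```

        Ranking hazards by such a composite score is one way the third axis changes mitigation priorities relative to a two-dimensional matrix.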

      4. Estradiol reference intervals in women during the menstrual cycle, postmenopausal women and men using an LC-MS/MS method
        Verdonk SJ, Vesper HW, Martens F, Sluss PM, Hillebrand JJ, Heijboer AC.
        Clin Chim Acta. 2019 Apr 11;495:198-204.
        BACKGROUND: For optimal medical decision-making, harmonized reference intervals for estradiol for different ages and both sexes are needed. Our aim was to establish reference intervals using a highly accurate and traceable LC-MS/MS method and to compare these with reference intervals in the literature. METHODS: Estradiol was measured in serum obtained daily during the menstrual cycle of 30 healthy premenopausal women and in serum of 64 men and 33 postmenopausal women. The accuracy of our LC-MS/MS method was demonstrated by a method comparison with the CDC reference method. RESULTS: Our LC-MS/MS method was traceable to the reference method. The estradiol reference interval during the early follicular phase (days -15 to -6) was 31-771 pmol/L; during the late follicular phase (days -5 to -1), 104-1742 pmol/L; during the LH peak (day 0), 275-2864 pmol/L; during the early luteal phase (days +1 to +4), 95-1188 pmol/L; during the mid luteal phase (days +5 to +9), 151-1941 pmol/L; during the late luteal phase (days +10 to +14), 39-1769 pmol/L. The reference interval for men was 12-136 pmol/L and for postmenopausal women <26 pmol/L. CONCLUSIONS: The established estradiol reference intervals can be used for all traceable LC-MS/MS methods for medical decision-making.
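
        Nonparametric reference intervals like those above are conventionally the central 95% of a reference population (2.5th to 97.5th percentiles). A minimal sketch using toy data; guideline methods (e.g., CLSI EP28-A3c) add sample-size and outlier-handling rules not shown here:

```python
def reference_interval(values, lower_pct=2.5, upper_pct=97.5):
    """Nonparametric reference interval: central 95% of a reference
    population, with linear interpolation between order statistics.

    Illustrative sketch only; not the authors' statistical method.
    """
    xs = sorted(values)

    def percentile(p):
        rank = (len(xs) - 1) * p / 100.0
        lo = int(rank)
        hi = min(lo + 1, len(xs) - 1)
        return xs[lo] + (xs[hi] - xs[lo]) * (rank - lo)

    return percentile(lower_pct), percentile(upper_pct)

# Toy data: 101 evenly spaced "measurements" from 0 to 100 pmol/L
lo, hi = reference_interval(range(101))
print(lo, hi)  # 2.5 97.5
```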

    • Occupational Safety and Health
      1. Animal production, insecticide use and self-reported symptoms and diagnoses of COPD, including chronic bronchitis, in the Agricultural Health Study
        Rinsky JL, Richardson DB, Kreiss K, Nylander-French L, Beane Freeman LE, London SJ, Henneberger PK, Hoppin JA.
        Environ Int. 2019 Apr 24;127:764-772.
        BACKGROUND: Occupational exposure to animal production is associated with chronic bronchitis symptoms; however, few studies consider associations with chronic obstructive pulmonary disease (COPD). We estimated associations between animal production activities and prevalence of self-reported COPD among farmers in the Agricultural Health Study. METHODS: During a 2005-2010 interview, farmers self-reported information about their operations (i.e., size, type, number of animals, insecticide use), respiratory symptoms, and COPD diagnoses (i.e., COPD, chronic bronchitis, emphysema). Operations were classified as small or medium/large based on regulatory definitions. Farmers were classified as having a COPD diagnosis, chronic bronchitis symptoms (cough and phlegm for ≥3 months during 2 consecutive years), or both. Polytomous logistic regression was used to estimate odds ratios (OR) and 95% confidence intervals (CI). RESULTS: Of 22,491 participating farmers (median age: 59 years), 922 (4%) reported a COPD diagnosis only, 254 (1%) reported a diagnosis and symptoms, and 962 (4%) reported symptoms only. Compared to raising no commercial animals, raising animals on a medium/large operation was positively associated with chronic bronchitis symptoms with (OR: 1.59; 95% CI: 1.16, 2.18) and without a diagnosis (OR: 1.69; 95% CI: 1.42, 2.01). Ever use of multiple organophosphates, carbaryl, lindane, and permethrin was positively associated with chronic bronchitis symptoms. CONCLUSION: Animal production work, including insecticide use, was positively associated with chronic bronchitis symptoms, but not consistently with COPD diagnosis alone. Our results support the need for further investigation into the role of animal production-related exposures in the etiology of COPD and better respiratory protection for agricultural workers.
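For readers unfamiliar with the OR (95% CI) notation used above, the sketch below computes an unadjusted odds ratio with a Wald confidence interval from a 2x2 table. The counts are invented; the study itself used polytomous logistic regression, which this simple two-group calculation does not replicate.

```python
# Unadjusted odds ratio with a 95% Wald CI from a 2x2 table.
# Counts here are hypothetical, for illustration only.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI given exposed cases a, exposed non-cases b,
    unexposed cases c, unexposed non-cases d."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(40, 200, 30, 300)
print(f"OR {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")  # OR 2.00 (95% CI 1.21, 3.32)
```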

    • Occupational Safety and Health – Mining
      1. Respirable coal mine dust in underground mines, United States, 1982-2017
        Doney BC, Blackley D, Hale JM, Halldin C, Kurth L, Syamlal G, Laney AS.
        Am J Ind Med. 2019 Apr 29.
        BACKGROUND: This study summarized the mass concentration and quartz mass percent of respirable coal mine dust samples (annually, by district, and by occupation) from underground coal mines during 1982-2017. METHODS: Respirable dust and quartz data collected and analyzed by the Mine Safety and Health Administration (MSHA) were summarized by year, coal mining occupation, and geographical area. The older (before August 2016) 2.0 mg/m3 respirable dust MSHA permissible exposure limit (PEL) was used across all years for comparative purposes. For respirable dust and quartz, the geometric mean and the percent of samples exceeding the respirable dust PEL (2.0 mg/m3, or a reduced standard for samples with >5% quartz content) were calculated. For quartz samples, the average percent quartz content was also calculated. RESULTS: The overall geometric mean concentration for 681,497 respirable dust samples was 0.55 mg/m3, and 5.5% of the samples exceeded the 2.0 mg/m3 PEL. The overall respirable quartz geometric mean concentration for 210,944 samples was 0.038 mg/m3, and 18.7% of these samples exceeded the applicable standard. There was a decline over time in the percent of respirable dust samples exceeding 2.0 mg/m3. The respirable dust geometric mean concentration was lower in central Appalachia compared to the rest of the United States. However, the respirable quartz geometric mean concentration and the mean percent quartz content were higher in central Appalachia. CONCLUSION: This study summarizes respirable dust and quartz concentrations from coal mine inspector samples and may provide insight into differences in the prevalence of pneumoconiosis by region and occupation.
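The two summary statistics in this abstract — the geometric mean concentration and the percent of samples exceeding the PEL — can be sketched in a few lines. The sample values below are invented for illustration.

```python
# Geometric mean and percent-exceeding-PEL for dust concentrations (mg/m3).
# The sample list is hypothetical.
import math

def geometric_mean(values):
    """Geometric mean of positive concentrations."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

def percent_exceeding(values, pel=2.0):
    """Percent of samples above the PEL (2.0 mg/m3 pre-August-2016 standard)."""
    return 100.0 * sum(v > pel for v in values) / len(values)

samples = [0.3, 0.5, 0.8, 1.2, 2.5]
print(round(geometric_mean(samples), 2), percent_exceeding(samples))  # → 0.82 20.0
```

The geometric mean is the conventional summary for lognormally distributed exposure data, which is why it appears here rather than the arithmetic mean.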

      2. Investigating the impact of caving on longwall mine ventilation using scaled physical modeling
        Gangrade V, Schatzel S, Harteis S, Addis J.
        Min Metall Explor. 2019 Apr.
        In longwall mining, ventilation is considered one of the more effective means for controlling gases and dust. In order to study longwall ventilation in a controlled environment, researchers built a unique physical model called the Longwall Instrumented Aerodynamic Model (LIAM) in a laboratory at the National Institute for Occupational Safety and Health (NIOSH) Pittsburgh Mining Research Division (PMRD) campus. LIAM is a 1:30 scale physical model geometrically designed to simulate a single longwall panel with a three-entry headgate and tailgate configuration, along with three back bleeder entries. It consists of a two-part heterogeneous gob that simulates a less compacted unconsolidated zone and a more compacted consolidated zone. It has a footprint of 8.94 m (29 ft.) by 4.88 m (16 ft.), with a simulated face length of 220 m (720 ft.) in full scale. LIAM is built with critical details of the face, gob, and mining machinery. It is instrumented with pressure gauges, flow anemometers, temperature probes, a fan, and a data acquisition system. Scaling relationships are derived on the basis of Reynolds and Richardson numbers to preserve the physical and dynamic similitude. This paper discusses the findings from a study conducted in the LIAM to investigate the gob-face interaction, airflow patterns within the gob, and airflow dynamics on the face for varying roof caving characteristics. Results are discussed to show the impact of caving behind the shields on longwall ventilation.
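To give a sense of what Reynolds-number similitude implies for a 1:30 model, the sketch below shows the generic back-of-envelope calculation: with the same working fluid (air), matching Re = V·L/ν requires scaling the model velocity up by the geometric scale factor. This is a textbook similitude calculation, not the specific scaling law derived for LIAM.

```python
# Generic Reynolds-number similitude for a 1:30 scale model in air.
# Matching Re = V * L / nu with the same fluid means the model velocity
# must increase by the geometric scale factor.

def model_velocity(full_scale_velocity: float, scale: float = 30.0) -> float:
    """Velocity needed in a 1:scale model to match the full-scale Re."""
    return full_scale_velocity * scale

def reynolds(velocity: float, length: float, nu: float = 1.5e-5) -> float:
    """Reynolds number for air (kinematic viscosity ~1.5e-5 m^2/s)."""
    return velocity * length / nu

# Full scale: 2 m/s along a 220 m face; the 1:30 model needs 60 m/s
# to reproduce the same Reynolds number.
print(model_velocity(2.0))  # → 60.0
```

In practice exact Re matching is often relaxed at large scale factors, which is one reason the paper also invokes the Richardson number for dynamic similitude.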

      3. Continued increase in prevalence of r-type opacities among underground coal miners in the USA
        Hall NB, Blackley DJ, Halldin CN, Laney AS.
        Occup Environ Med. 2019 Apr 25.
        INTRODUCTION: Respirable crystalline silica exposure has been implicated in the resurgence of coal workers' pneumoconiosis (CWP) in the USA. A 2010 report found an increasing prevalence of r-type opacities, which are associated with silicosis lung pathology, on the radiographs of working underground coal miners in central Appalachia. This analysis updates that report by assessing the prevalence of r-type opacities during 2010-2018 compared with earlier decades. METHODS: Data from the Coal Workers' Health Surveillance Program were used to calculate the prevalence of r-type opacities on radiographs of working underground coal miners. The data were restricted to radiographs taken during 1 January 1980 to 15 September 2018. The presence of r-type opacities was defined as an r-type classification for either the primary or secondary shape/size of small opacities. Prevalence ratios for r-type opacities were calculated using log binomial regression. RESULTS: Radiograph classifications for 106,506 miners were included in the analysis. For the USA overall, the prevalence of r-type opacities among miners with radiographs taken during 2010-2018 compared with 1980-1989 increased (PR 2.4; 95% CI 1.9 to 3.0). For central Appalachia, the proportion of r-type opacities observed increased when comparing 1980-1989 to 2010-2018 (PR 6.0; 95% CI 4.6 to 7.9). CONCLUSIONS: The prevalence of r-type opacities on the radiographs of Appalachian underground coal miners continues to increase, implicating exposure to crystalline silica in respirable coal mine dust. The current findings underscore the importance of monitoring and controlling exposure to silica in coal mines.
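The prevalence ratio (PR) reported above compares the prevalence of a finding in two periods. As a hedged sketch, the unadjusted two-group version with a log-scale Wald confidence interval is shown below; the study itself fit log binomial regression, and the counts here are invented.

```python
# Unadjusted prevalence ratio with a 95% CI on the log scale.
# Counts are hypothetical, for illustration only.
import math

def prevalence_ratio_ci(cases1, n1, cases0, n0, z=1.96):
    """PR comparing group 1 with group 0, with a Wald CI on log(PR)."""
    pr = (cases1 / n1) / (cases0 / n0)
    se = math.sqrt(1/cases1 - 1/n1 + 1/cases0 - 1/n0)
    lo = math.exp(math.log(pr) - z * se)
    hi = math.exp(math.log(pr) + z * se)
    return pr, lo, hi

pr, lo, hi = prevalence_ratio_ci(60, 1000, 30, 1000)
print(f"PR {pr:.1f} (95% CI {lo:.2f} to {hi:.2f})")  # PR 2.0 (95% CI 1.30 to 3.07)
```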

      4. The National Institute for Occupational Safety and Health (NIOSH) maintains the Pittsburgh Mining Research Division (PMRD), where a wide variety of mining-related health and safety research is conducted. Part of this research is devoted to reducing the incidence of noise-induced hearing loss (NIHL) among the nation's mining workforce. The need for this research is particularly important, as NIHL is the second most common occupation-related disease among miners. Many types of equipment operators are overexposed to noise, and NIOSH has worked to develop noise controls that reduce the sound level at the equipment operator's location and, thus, operator noise exposure. Examples include a urethane-coated flight bar chain for continuous mining machines and a drill bit isolator for roof bolting machines. This article discusses the development of a retrofitted noise control package for haul trucks and load-haul-dumps (LHDs) used in underground metal/nonmetal mines. Experimental methods under discussion include dosimetry and time-motion studies to determine when an operator accumulates the most noise dose. Noise source identification techniques are used to determine the primary noise contributors to the sound level at the operator's position. Proof-of-concept testing using rudimentary noise controls is undertaken to confirm that treating the suspected noise sources will actually reduce the sound level at the operator's location. Next, a description is given of the development of noise controls: an iterative process in which noise controls are fabricated, evaluated in an acoustic laboratory, refined, and tested again. Those noise controls that show promise are then field tested under actual mine operating conditions.
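The "noise dose" that the dosimetry and time-motion studies quantify can be sketched with the NIOSH criteria (85 dBA reference level, 3 dB exchange rate): a dose above 100% exceeds the recommended exposure limit. The exposure segments below are invented; this is an illustration of the standard calculation, not data from the article.

```python
# Daily noise dose under NIOSH criteria: 85 dBA reference, 3 dB exchange rate.
# Dose (%) = 100 * sum(C_i / T_i), where C_i is time spent at a level and
# T_i is the permissible duration at that level.

def allowed_hours(level_dba: float, criterion=85.0, exchange=3.0) -> float:
    """Permissible exposure duration (hours) at a given sound level."""
    return 8.0 / 2 ** ((level_dba - criterion) / exchange)

def daily_dose(segments) -> float:
    """Percent dose from (hours, dBA) segments."""
    return 100.0 * sum(hours / allowed_hours(level) for hours, level in segments)

# Hypothetical shift: 4 h at 85 dBA, 2 h at 91 dBA, 2 h at 79 dBA
print(daily_dose([(4, 85.0), (2, 91.0), (2, 79.0)]))  # → 156.25 (over the REL)
```

Segment-by-segment dose accumulation like this is exactly what makes time-motion data useful: it shows which tasks contribute most to an operator's total dose.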

    • Parasitic Diseases
      1. The ability to identify mixed-species infections and track the origin of Plasmodium parasites can further enhance the development of treatment and prevention recommendations as well as outbreak investigations. Here, we explore the utility of using the full Plasmodium mitochondrial genome to classify Plasmodium species, detect mixed infections, and infer the geographical origin of P. falciparum parasites imported to the United States (U.S.). Using the recently developed standardized, high-throughput Malaria Resistance Surveillance (MaRS) protocol, the full Plasmodium mitochondrial genomes of 265 malaria cases imported to the U.S. from 2014-2017 were sequenced and analyzed. P. falciparum infections were found in 94.7% (251/265) of samples. Five percent (14/265) of samples were identified as mixed-Plasmodium-species or non-P. falciparum infections, including P. vivax, P. malariae, P. ovale curtisi, and P. ovale wallikeri. Analysis of P. falciparum mitochondrial haplotypes revealed that more than 18% of samples had at least two P. falciparum mitochondrial genome haplotypes, indicating either heteroplasmy or multi-clonal infections. Maximum-likelihood phylogenies of 912 P. falciparum mitochondrial genomes with known country of origin were used to infer the geographical origin of thirteen samples from persons with unknown travel histories as: Africa (country unspecified) (n = 10), Ghana (n = 1), Southeast Asia (n = 1), and the Philippines (n = 1). We demonstrate the utility and current limitations of using the Plasmodium mitochondrial genome to classify samples with mixed infections and infer the geographical origin of imported P. falciparum malaria cases to the U.S. with unknown travel history.
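The mixed-infection signal described above reduces to a simple rule: a sample carrying two or more distinct mitochondrial haplotypes suggests heteroplasmy or a multi-clonal infection. The toy sketch below illustrates that rule; the sample IDs and sequences are invented and far shorter than real mitochondrial genomes.

```python
# Toy mixed-infection call: >= 2 distinct haplotypes per sample -> "mixed".
# Sample IDs and sequences are hypothetical.

def classify_samples(haplotypes: dict) -> dict:
    """Map sample ID -> 'single' or 'mixed' by counting distinct haplotypes."""
    return {
        sample: "mixed" if len(set(seqs)) >= 2 else "single"
        for sample, seqs in haplotypes.items()
    }

calls = classify_samples({
    "US-2015-001": ["ACGTACGT", "ACGTACGT"],   # one haplotype
    "US-2016-042": ["ACGTACGT", "ACGAACGT"],   # two haplotypes -> mixed
})
print(calls)  # → {'US-2015-001': 'single', 'US-2016-042': 'mixed'}
```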


CDC Science Clips Production Staff

  • John Iskander, MD MPH, Editor
  • Gail Bang, MLIS, Librarian
  • Kathy Tucker, Librarian
  • William (Bill) Thomas, MLIS, Librarian
  • Onnalee Gomez, MS, Health Scientist
  • Jarvis Sims, MIT, MLIS, Librarian


DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinion, findings and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC Websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.
