CDC Science Clips: Volume 11, Issue 31, August 6, 2019

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention score to track social and mainstream media mentions!

This week, Science Clips is pleased to collaborate with CDC Vital Signs by featuring scientific articles from the latest issue on Naloxone. The articles marked with an asterisk are general review articles which may be of particular interest to clinicians and public health professionals seeking background information in this area.

  1. CDC Vital Signs
    • Naloxone
      1. BACKGROUND: Drug overdose deaths have been rising since the early 1990s and are the leading cause of injury death in the United States. Overdose from prescription opioids constitutes a large proportion of this burden. State policy and systems-level interventions have the potential to impact prescription drug misuse and overdose. METHODS: We searched the literature to identify evaluations of state policy or systems-level interventions using non-comparative, cross-sectional, before-after, time series, cohort, or comparison group designs or randomized/non-randomized trials. Eligible studies examined intervention effects on provider behavior, patient behavior, and health outcomes. RESULTS: Overall study quality is low, with a limited number of time-series or experimental designs. Knowledge and prescribing practices were measured more often than health outcomes (e.g., overdoses). Limitations include lack of baseline data and comparison groups, inadequate statistical testing, small sample sizes, self-reported outcomes, and short-term follow-up. Strategies that reduce inappropriate prescribing and use of multiple providers and focus on overdose response, such as prescription drug monitoring programs, insurer strategies, pain clinic legislation, clinical guidelines, and naloxone distribution programs, are promising. Evidence of improved health outcomes, particularly from safe storage and disposal strategies and patient education, is weak. CONCLUSIONS: While important efforts are underway to affect prescriber and patient behavior, data on state policy and systems-level interventions are limited and inconsistent. Improving the evidence base is a critical need so states, regulatory agencies, and organizations can make informed choices about policies and practices that will improve prescribing and use, while protecting patient health.

      2. Naloxone access through established healthcare settings is critical to responding to the opioid crisis. We conducted a systematic review to assess the acceptability and feasibility of prescribing naloxone to patients in primary care. We queried PubMed, EmBase and CINAHL for US-based, peer-reviewed, full-length, original articles relating to acceptability or feasibility of prescribing naloxone in primary care. Searches yielded 270 unduplicated articles; one analyst reviewed all titles and abstracts. Two analysts independently reviewed eligible articles for study design, study outcome, and acceptability and/or feasibility. Analyses were compared and a third reviewer consulted if discrepancies emerged. Seventeen articles were included. Providers’ willingness to prescribe naloxone appeared to increase over time. Most studies provided prescribers in-person naloxone trainings, including how to write a prescription and indications for prescribing. Most studies implemented universal prescribing, whereby anyone prescribed long-term opioids or otherwise at risk for overdose was eligible for naloxone. Patient education was largely provided by prescribers and most studies provided take-home educational materials. Providers reported concerns around naloxone prescribing including lack of knowledge around prescribing and educating patients. Providers also reported benefits such as improving difficult conversations around opioids and resetting the culture around opioids and overdose. Current literature supports the acceptability and feasibility of naloxone prescribing in primary care. Provision of naloxone through primary care may help normalize such medication safety interventions, support larger opioid stewardship efforts, and expand access to patients not served by a community distribution program.

      3. Importance: Given high rates of opioid-related fatal overdoses, improving naloxone access has become a priority. States have implemented different types of naloxone access laws (NALs) and there is controversy over which of these policies, if any, can curb overdose deaths. We hypothesize that NALs granting direct authority to pharmacists to provide naloxone will have the greatest potential for reducing fatal overdoses. Objectives: To identify which types of NALs, if any, are associated with reductions in fatal overdoses involving opioids and examine possible implications for nonfatal overdoses. Design, Setting, and Participants: State-level changes in both fatal and nonfatal overdoses from 2005 to 2016 were examined across the 50 states and the District of Columbia after adoption of NALs using a difference-in-differences approach while estimating the magnitude of the association for each year relative to time of adoption. Policy environments across full state populations were represented in the primary data set. The association for 3 types of NALs was examined: NALs providing direct authority to pharmacists to prescribe, NALs providing indirect authority to prescribe, and other NALs. The study was conducted from January 2017 to January 2019. Exposures: Fatal and nonfatal overdoses in states that adopted NAL laws were compared with those in states that did not adopt NAL laws. Further consideration was given to the type of NAL passed in terms of its association with these outcomes. We hypothesize that NALs granting direct authority to pharmacists to provide naloxone will have the greatest potential for reducing fatal overdoses. Main Outcomes and Measures: Fatal overdoses involving opioids were the primary outcome. Secondary outcomes were nonfatal overdoses resulting in emergency department visits and Medicaid naloxone prescriptions. Results: In this evaluation of the dispensing of naloxone across the United States, NALs granting direct authority to pharmacists were associated with significant reductions in fatal overdoses, but they may also increase nonfatal overdoses seen in emergency department visits. The effect sizes for fatal overdoses grew over time relative to adoption of the NALs. These policies were estimated to reduce opioid-related fatal overdoses by 0.387 (95% CI, 0.119-0.656; P = .007) per 100000 people in 3 or more years after adoption. There was little evidence of an association for indirect authority to dispense (increase by 0.121; 95% CI, -0.014 to 0.257; P = .09) and other NALs (increase by 0.094; 95% CI, -0.040 to 0.227; P = .17). Conclusions and Relevance: Although many states have passed some type of law affecting naloxone availability, only laws allowing direct dispensing by pharmacists appear to be useful. Communities in which access to naloxone is improved should prepare for increases in nonfatal overdoses and link these individuals to effective treatment.
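
        As an illustration of the difference-in-differences approach described above, the sketch below fits a two-way fixed-effects model to a simulated state-year panel of overdose death rates. The adoption years, effect sizes, and data are hypothetical; this is a schematic of the analysis type, not the authors' model or data.

```python
# Hypothetical two-way fixed-effects difference-in-differences sketch for a
# state-year panel of opioid overdose death rates. All values are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for state in range(51):                            # 50 states + DC
    adopt_year = rng.choice([2010, 2013, 9999])    # 9999 = never adopts a direct-authority NAL
    base = rng.normal(8.0, 2.0)                    # baseline deaths per 100,000
    for year in range(2005, 2017):
        treated = int(year >= adopt_year)
        # assume adoption lowers the rate, with the effect growing over time
        effect = -0.4 * (year - adopt_year + 1) if treated else 0.0
        rows.append({"state": state, "year": year, "treated": treated,
                     "rate": base + 0.3 * (year - 2005) + effect + rng.normal(0, 0.5)})
panel = pd.DataFrame(rows)

# State and year fixed effects; cluster-robust standard errors by state.
did = smf.ols("rate ~ treated + C(state) + C(year)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["state"]})
print("DiD estimate (deaths per 100,000):", round(did.params["treated"], 3),
      "SE:", round(did.bse["treated"], 3))
```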

      4. Academic detailing pilot for naloxone prescribing among primary care providers in San Francisco
        Behar E, Rowe C, Santos GM, Santos N, Coffin PO.
        Fam Med. 2017 Feb;49(2):122-126.
        BACKGROUND: Improving the safety of prescribed opioids in clinical settings is a national priority. While co-prescribing naloxone is increasingly recommended, there is little understanding of the optimal way to implement this practice. METHODS: We developed and delivered an academic detailing intervention to 40 randomly selected opioid-prescribing primary care providers in San Francisco from February to May 2015. Process outcomes were tracked and included provider demographics, number and type of contact attempts, reason for refusal (if applicable), name of detailer, duration of intervention, topics covered, provider concerns, and follow-up plan. Outcome evaluation included changes in the rate of naloxone prescriptions 4 months before and after academic detailing by provider based on de-identified Medi-Cal claims data. Using a difference-in-differences approach, we developed a negative binomial regression model to compare changes in naloxone prescribing to Medi-Cal patients between providers that did and did not receive the intervention. RESULTS: Eighty-three percent of 48 providers contacted accepted the intervention after a mean of 2.6 contacts. Detailing lasted a mean of 28 minutes (range 5-60 minutes) and most frequently covered indications for naloxone, examples of naloxone prescriptions, language to use with patients, and pharmacy outreach. Those who received the academic detailing had a significantly greater increase in naloxone prescriptions compared to those who did not receive the intervention (IRR=11.0, 95%CI=1.8-67.8, P=.010). CONCLUSIONS: Academic detailing addressing opioid safety and naloxone prescribing was well-received by primary care providers and associated with an increase in naloxone prescriptions filled by Medi-Cal patients.
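
        The analysis above pairs a difference-in-differences design with negative binomial regression to obtain an incidence rate ratio (IRR). A minimal sketch of that model form follows, using simulated provider-level counts and invented effect sizes rather than the Medi-Cal claims data.

```python
# Sketch of a negative binomial difference-in-differences model for provider-level
# naloxone prescription counts. Data, variable names, and effects are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_providers = 200
df = pd.DataFrame({
    "provider": np.repeat(np.arange(n_providers), 2),
    "post": np.tile([0, 1], n_providers),                       # before/after detailing period
    "detailed": np.repeat(rng.integers(0, 2, n_providers), 2),  # received academic detailing?
})
# assume detailed providers prescribe more naloxone after the intervention
mu = np.exp(-1.0 + 0.2 * df["post"] + 0.1 * df["detailed"] + 1.5 * df["post"] * df["detailed"])
df["rx_count"] = rng.poisson(mu)

nb = smf.glm("rx_count ~ post * detailed", data=df,
             family=sm.families.NegativeBinomial()).fit()
# The interaction term is the DiD estimate; exponentiating gives an IRR.
print("IRR:", round(np.exp(nb.params["post:detailed"]), 2))
```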

      5. Overdose education and naloxone for patients prescribed opioids in primary care: A qualitative study of primary care staff
        Binswanger IA, Koester S, Mueller SR, Gardner EM, Goddard K, Glanz JM.
        J Gen Intern Med. 2015 Dec;30(12):1837-44.
        BACKGROUND: The rate of fatal unintentional pharmaceutical opioid poisonings has increased substantially since the late 1990s. Naloxone is an effective opioid antidote that can be prescribed to patients for bystander use in the event of an overdose. Primary care clinics represent settings in which large populations of patients prescribed opioids could be reached for overdose education and naloxone prescription. OBJECTIVE: Our aim was to investigate the knowledge, attitudes and beliefs about overdose education and naloxone prescription among clinical staff in primary care. DESIGN: This was a qualitative study using focus groups to elucidate both clinic-level and provider-level barriers and facilitators. SETTING: Ten primary care internal medicine, family medicine and infectious disease/HIV practices in three large Colorado health systems. METHODS: A focus group guide was developed based on behavioral theory. Focus group transcripts were coded for manifest and latent meaning, and analyzed for themes using a recursive approach that included inductive and deductive analysis. RESULTS: Themes emerged in four content areas related to overdose education and naloxone prescription: knowledge, barriers, benefits and facilitators. Clinical staff (N = 56) demonstrated substantial knowledge gaps about naloxone and its use in outpatient settings. They expressed uncertainty about who to prescribe naloxone to, and identified a range of logistical barriers to its use in practice. Staff also described fears about offending patients and concerns about increased risk behaviors in patients prescribed naloxone. When considering naloxone, some providers reflected critically and with discomfort on their own opioid prescribing. These barriers were balanced by beliefs that prescribing naloxone could prevent death and result in safer opioid use behaviors. LIMITATIONS: Findings from these qualitative focus groups may not be generalizable to other settings. CONCLUSION: In addition to evidence gaps, logistical and attitudinal barriers will need to be addressed to enhance uptake of overdose education and naloxone prescription for patients prescribed opioids for pain.

      6. CDC Guideline for Prescribing Opioids for Chronic Pain – United States, 2016
        Dowell D, Haegerich TM, Chou R.
        MMWR Recomm Rep. 2016 Mar 18;65(1):1-49.
        This guideline provides recommendations for primary care clinicians who are prescribing opioids for chronic pain outside of active cancer treatment, palliative care, and end-of-life care. The guideline addresses 1) when to initiate or continue opioids for chronic pain; 2) opioid selection, dosage, duration, follow-up, and discontinuation; and 3) assessing risk and addressing harms of opioid use. CDC developed the guideline using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) framework, and recommendations are made on the basis of a systematic review of the scientific evidence while considering benefits and harms, values and preferences, and resource allocation. CDC obtained input from experts, stakeholders, the public, peer reviewers, and a federally chartered advisory committee. It is important that patients receive appropriate pain treatment with careful consideration of the benefits and risks of treatment options. This guideline is intended to improve communication between clinicians and patients about the risks and benefits of opioid therapy for chronic pain, improve the safety and effectiveness of pain treatment, and reduce the risks associated with long-term opioid therapy, including opioid use disorder, overdose, and death. CDC has provided a checklist for prescribing opioids for chronic pain (http://stacks.cdc.gov/view/cdc/38025) as well as a website (http://www.cdc.gov/drugoverdose/prescribingresources.html) with additional tools to guide clinicians in implementing the recommendations.

      7. Disparity in naloxone administration by emergency medical service providers and the burden of drug overdose in US rural communities
        Faul M, Dailey MW, Sugerman DE, Sasser SM, Levy B, Paulozzi LJ.
        Am J Public Health. 2015 Jul;105 Suppl 3:e26-32.
        OBJECTIVES: We determined the factors that affect naloxone (Narcan) administration in drug overdoses, including the certification level of emergency medical technicians (EMTs). METHODS: In 2012, 42 states contributed all or a portion of their ambulatory data to the National Emergency Medical Services Information System. We used a logistic regression model to measure the association between naloxone administration and emergency medical services certification level, age, gender, geographic location, and patient primary symptom. RESULTS: The odds of naloxone administration were much higher among EMT-intermediates than among EMT-basics (adjusted odds ratio [AOR] = 5.4; 95% confidence interval [CI] = 4.5, 6.5). Naloxone use was higher in suburban areas than in urban areas (AOR = 1.41; 95% CI = 1.3, 1.5), followed by rural areas (AOR = 1.23; 95% CI = 1.1, 1.3). Although the odds of naloxone administration were 23% higher in rural areas than in urban areas, the opioid drug overdose rate is 45% higher in rural communities. CONCLUSIONS: Naloxone is less often administered by EMT-basics, who are more common in rural areas. In most states, the scope-of-practice model prohibits naloxone administration by basic EMTs. Reducing this barrier could help prevent drug overdose death.
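
        As a rough illustration of the logistic regression used above to estimate adjusted odds ratios for naloxone administration, the sketch below fits a similar model to simulated EMS encounter data. The variables, categories, and effect sizes are hypothetical and do not come from the NEMSIS dataset.

```python
# Logistic regression sketch yielding adjusted odds ratios (AORs) for naloxone
# administration by EMS certification level. All data below are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "cert": rng.choice(["basic", "intermediate", "paramedic"], n),
    "urbanicity": rng.choice(["urban", "suburban", "rural"], n),
    "age": rng.integers(18, 90, n),
})
# assume higher-certified providers administer naloxone more often
logit_p = -2.0 + 1.7 * (df["cert"] != "basic") + 0.3 * (df["urbanicity"] != "urban")
df["naloxone"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit(
    "naloxone ~ C(cert, Treatment(reference='basic')) "
    "+ C(urbanicity, Treatment(reference='urban')) + age",
    data=df).fit(disp=False)
print(np.exp(fit.params))   # adjusted odds ratios relative to the reference categories
```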

      8. BACKGROUND AND AIMS: Distribution of take-home naloxone (THN) to emergency department (ED) patients who have survived an opioid overdose (OD) could reduce future opioid mortality, but is not commonly performed. We examined whether electronic health record (EHR) prompts provided to ED physicians when discharging a patient after an OD could improve THN distribution. DESIGN: Interrupted time-series analysis to compare the percentage of OD patients who received THN during the 11 months before and after implementation of an EHR prompt on 18 June 2017. SETTING AND PARTICIPANTS: A total of 3492 adult patients with diagnoses of OD discharged from nine EDs in a single health system in Western Pennsylvania from July 2016 to April 2018. INTERVENTION AND COMPARATOR: The EHR prompt was triggered by the presence of specific terms in the nurse’s initial assessment note. The EHR displayed a pop-up window during the ED physician discharge process asking the physician to consider prescribing or providing naloxone to the patient. The comparator was ‘no EHR prompt’. MEASUREMENTS: Measurements were based on standard criteria from ICD diagnostic codes and chief complaint keywords. FINDINGS: In July 2016, 16.3% [95% confidence interval (CI) = 14.0, 18.5] of OD patients received THN, which decreased every month through June 2017 by 1.2% (P < 0.0001, 95% CI = 0.8,1.7). For each month post-EHR prompt there was an increase of 2.8% of OD patients receiving THN (P < 0.001, 95% CI = 2.0, 3.5). No increases occurred in the ED with the highest pre-EHR prompt THN distribution. Rates of THN distribution varied by patient age and race prior to, but not after, implementation of EHR prompts. CONCLUSIONS: Electronic health record prompts are associated with increased take-home naloxone distribution for emergency department patients discharged after opioid overdoses.
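
        The interrupted time-series design described above is typically analyzed with segmented regression. The sketch below shows that structure on a simulated 22-month series; the level, slopes, and noise are arbitrary illustrative values, not the study's data.

```python
# Minimal interrupted time-series (segmented regression) sketch for a monthly
# percentage of overdose patients receiving take-home naloxone, pre/post an EHR
# prompt. The series is simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
month = np.arange(22)                                  # 11 months pre, 11 months post
post = (month >= 11).astype(int)
months_since_prompt = np.where(post == 1, month - 10, 0)
pct_thn = 16 - 1.2 * month + 4.0 * months_since_prompt + rng.normal(0, 1.0, month.size)
df = pd.DataFrame({"month": month, "post": post,
                   "months_since_prompt": months_since_prompt, "pct_thn": pct_thn})

its = smf.ols("pct_thn ~ month + post + months_since_prompt", data=df).fit()
# 'month' is the pre-prompt slope; 'months_since_prompt' is the change in slope
# after the prompt (the post-prompt slope is their sum).
print(its.params)
```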

      9. Association of naloxone coprescription laws with naloxone prescription dispensing in the United States
        Sohn M, Talbert JC, Huang Z, Lofwall MR, Freeman PR.
        JAMA Netw Open. 2019 Jun 5;2(6):e196215.
        Importance: To mitigate the opioid overdose crisis, states have implemented a variety of legal interventions aimed at increasing access to the opioid antagonist naloxone. Recently, Virginia and Vermont mandated the coprescription of naloxone for potentially at-risk patients. Objective: To assess the association between naloxone coprescription legal mandates and naloxone dispensing in retail pharmacies. Design, Setting, and Participants: This was a population-based, state-level cohort study. The sample included all prescriptions dispensed for naloxone in the retail pharmacy setting contained in IQVIA’s national prescription audit, which represents 90% of all retail pharmacies in the United States. The unit of observation was state-month and the study period was January 1, 2011, to December 31, 2017. Exposures: State legal intervention mandating naloxone coprescription. Main Outcomes and Measures: Number of naloxone prescriptions dispensed. State rates of naloxone prescriptions dispensed per month per 100000 standard population were calculated. Results: The rate of naloxone dispensing increased after implementation of legal mandates for naloxone coprescription. An estimated 88 naloxone prescriptions per 100000 were dispensed in Virginia and 111 prescriptions per 100 000 were dispensed in Vermont during the first full month the legal requirement was effective. In comparison, 16 naloxone prescriptions per 100 000 were dispensed in the 10 states (including the District of Columbia) with the highest opioid overdose death rates and 6 prescriptions per 100 000 were dispensed in the 39 remaining states. The number of naloxone prescriptions dispensed was associated with the legal mandate for naloxone coprescription (incidence rate ratio [IRR], 7.75; 95% CI, 1.22-49.35). Implementation of the naloxone coprescription mandate was associated with an estimated 214 additional naloxone prescriptions dispensed per month in the period following the mandates, holding all other variables constant. Among covariates, naloxone access laws (IRR, 1.37; 1.05-1.78), opioid overdose death rates (IRR, 1.06; 95% CI, 1.04-1.08), the percentage of naloxone prescriptions paid by third-party payers (IRR 1.009; 1.008-1.010), and time (IRR, 1.06; 95% CI, 1.05-1.07) were significantly associated with naloxone prescription dispensing. Conclusions and Relevance: These study findings suggest that legally mandated naloxone prescription for those at risk for opioid overdose may be associated with substantial increases in naloxone dispensing and further reduction in opioid-related harm.
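
        Rates per 100 000 and incidence rate ratios such as those above usually come from a count model with a population offset. The sketch below fits a Poisson model with state fixed effects to a simulated state-month panel; the mandate timing, effect size, and populations are invented and are not the IQVIA data.

```python
# Count model for monthly naloxone prescriptions with a population offset, so
# coefficients exponentiate to incidence rate ratios (IRRs). Panel is simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_states, n_months = 51, 84                    # 2011-2017
df = pd.DataFrame({
    "state": np.repeat(np.arange(n_states), n_months),
    "month": np.tile(np.arange(n_months), n_states),
})
df["population"] = rng.integers(600_000, 30_000_000, len(df))
df["mandate"] = ((df["state"] < 2) & (df["month"] > 60)).astype(int)  # two states adopt late
rate = np.exp(-9.5 + 0.02 * df["month"] + 2.0 * df["mandate"])        # prescriptions per person
df["naloxone_rx"] = rng.poisson(rate * df["population"])

model = smf.glm("naloxone_rx ~ mandate + month + C(state)", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["population"])).fit(cov_type="HC0")
print("IRR for coprescription mandate:", round(np.exp(model.params["mandate"]), 2))
```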

      10. BACKGROUND: In response to the ongoing opioid overdose epidemic, many states have enacted laws increasing naloxone access by lay people, such as friends and family members of people who use drugs (PWUD), as well as PWUD themselves. METHOD: We utilized Symphony Health Solutions’ PHAST Prescription data from 2007 to 2016 to investigate whether naloxone access laws were associated with an increase in naloxone dispensed from retail pharmacies in the United States. RESULT: Using a negative binomial regression, we found that naloxone access laws were associated with an average increase of 78 prescriptions dispensed per state per quarter. This represents an average 79% increase in naloxone dispensed from U.S. retail pharmacies, compared with states where there were no such laws. CONCLUSION: Our study suggests that naloxone access laws can increase the availability and accessibility of naloxone.

  2. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Chronic Diseases and Conditions
      1. Tribal practices for wellness in Indian Country
        Andrade NS, Jones M, Frazier SM, Percy C, Flores M, Bauer UE.
        Prev Chronic Dis. 2019 Jul 25;16:E97.

        [No abstract]

      2. Clinicians can play a role in skin cancer prevention by counseling their patients on use of sun protection and indoor tanning avoidance. We used data from the 2016 DocStyles, a web-based survey of U.S. primary care providers, to examine skin cancer prevention counseling practices among 1506 providers. In 2018, we conducted logistic regression analyses to examine factors associated with regularly providing counseling. Almost half (48.5%) of all providers reported regularly counseling on sun protection, and 27.4% reported regularly counseling on indoor tanning. Provider characteristics associated with regular counseling included having practiced medicine for >/=16years (sun protection: adjusted prevalence ratio [aPR]=1.27, 95% confidence interval [CI]=1.15, 1.41; indoor tanning: aPR=1.38, 95% CI=1.17, 1.63), having treated sunburn in the past year (sun protection: aPR=1.78, 95% CI=1.46, 2.17; indoor tanning: aPR=2.42, 95% CI=1.73, 3.39), and awareness of US Preventive Services Task Force recommendations (sun protection: aPR=1.73, 95% CI=1.51, 2.00; indoor tanning: aPR=2.70, 95% CI=2.09, 3.48). Reporting barriers to counseling was associated with a lower likelihood of regularly counseling on sun protection (1-3 barriers: aPR=0.82, 95% CI=0.71, 0.94; 4+ barriers: aPR=0.80, 95% CI=0.69, 0.93) and indoor tanning (1-3 barriers: aPR=0.72, 95% CI=0.57, 0.91; 4+ barriers: aPR=0.61, 95% CI=0.47, 0.78). Barriers to counseling included lack of time (58.1%), more urgent health concerns (49.1%), and patient disinterest (46.3%). Although many providers report regularly counseling patients on skin cancer prevention, most report serious barriers to providing such counseling. Additional research could explore strategies to integrate compelling and informative skin cancer prevention counseling into current provider practices.
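
        Adjusted prevalence ratios (aPRs) like those reported above are commonly estimated with a log-binomial or "modified Poisson" model. The sketch below shows the modified-Poisson variant (Poisson regression with robust standard errors) on simulated provider data; the variables and effect sizes are hypothetical, and this is not the DocStyles analysis itself.

```python
# "Modified Poisson" regression (Poisson with robust standard errors) for a binary
# outcome, one common way to obtain adjusted prevalence ratios. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 1500
df = pd.DataFrame({
    "years_practice_ge16": rng.integers(0, 2, n),
    "treated_sunburn": rng.integers(0, 2, n),
    "aware_uspstf": rng.integers(0, 2, n),
})
# assume experience and recent sunburn treatment raise the counseling prevalence
p = 0.25 * (1 + 0.3 * df["years_practice_ge16"] + 0.7 * df["treated_sunburn"])
df["counsels_sun_protection"] = rng.binomial(1, p.clip(upper=0.95))

fit = smf.glm("counsels_sun_protection ~ years_practice_ge16 + treated_sunburn + aware_uspstf",
              data=df, family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(fit.params))   # adjusted prevalence ratios
```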

      3. Chronic obstructive pulmonary disease and arthritis among US adults, 2016
        Liu Y, Wheaton AG, Murphy LB, Xu F, Croft JB, Greenlund KJ.
        Prev Chronic Dis. 2019 Jul 18;16.
        INTRODUCTION: More than 54 million US adults have arthritis, and more than 15 million US adults have chronic obstructive pulmonary disease (COPD). Arthritis and COPD share many risk factors, such as tobacco use, asthma history, and age. The objective of this study was to assess the relationship between self-reported physician-diagnosed COPD and arthritis in the US adult population. METHODS: We analyzed data from 408,774 respondents aged 18 or older in the 2016 Behavioral Risk Factor Surveillance System to assess the association between self-reported physician-diagnosed COPD and arthritis in the US adult population by using multivariable logistic regression analyses. RESULTS: Overall crude prevalence was 6.4% for COPD and 25.2% for arthritis. The prevalence of age-adjusted COPD was higher among respondents with arthritis than among respondents without arthritis (13.7% vs 3.8%, P < .001). The association remained significant among most subgroups (P < .001) particularly among adults aged 18 to 44 (11.5% vs 2.0%) and never smokers (7.6% vs 1.7%). In multivariable logistic regression analyses, arthritis status was significantly associated with COPD status after controlling for sociodemographic characteristics, risk behaviors, and health-related quality of life measures (adjusted prevalence ratio = 1.5, 95% confidence interval, 1.4-1.5, P < .001). CONCLUSION: Our results confirmed that arthritis is associated with a higher prevalence of COPD in the US adult population. Health care providers may assess COPD and arthritis symptoms for earlier detection of each condition and recommend that patients with COPD and/or arthritis participate in pulmonary rehabilitation and self-management education programs such as the Chronic Disease Self-Management Program, the proven benefits of which include increased aerobic activity and reduced shortness of breath, pain, and depression.

      4. [No abstract]

      5. Origins and organization of the NHLBI State of the Science Workshop: Generating a national blueprint for future research on factor VIII inhibitors
        Sabatino DE, Pipe SW, Nugent DJ, Soucie JM, Hooper WC, Hoots WK, DiMichele DM.
        Haemophilia. 2019 Jul;25(4):575-580.
        INTRODUCTION: The major complication of protein replacement therapy for haemophilia A is the development of anti-FVIII antibodies or inhibitors that occur in 25%-30% of persons with severe haemophilia A. Alternative therapeutics such as bypassing agents or immune tolerance induction protocols have additional challenges and are not always effective. AIM: Assemble a National Heart, Lung and Blood Institute (NHLBI) State of the Science (SOS) Workshop to generate a national blueprint for research on inhibitors to solve the problem of FVIII immunogenicity. METHODS: An Executive Steering Committee was formed in October 2017 to establish the scientific focus and Scientific Working Groups for the SOS Workshop in May 2018. Four working groups were assembled to address scientific priorities in basic, translational and clinical research on inhibitors. RESULTS: Working Group 1 was charged with determining the scientific priorities for clinical trials to include the integration of non-intravenous, non-factor therapeutics including gene therapy into the standard of care for people with haemophilia A with inhibitors. Working Group 2 established the scientific priorities for 21st-century data science and biospecimen collection for observational inhibitor cohort studies. The scientific priorities for acquiring an actionable understanding of FVIII immunogenicity and the immunology of the host response and FVIII tolerance were developed by Working Group 3. Working Group 4 designed prospective pregnancy/birth cohorts to study FVIII immunogenicity, inhibitor development and eradication. CONCLUSION: The NHLBI SOS Workshop generated a focused summary of scientific priorities and implementation strategies to overcome the challenges of eradicating and preventing inhibitors in haemophilia A.

    • Communicable Diseases
      1. Progress toward poliomyelitis eradication – Nigeria, January 2018-May 2019
        Adamu US, Archer WR, Braka F, Damisa E, Siddique A, Baig S, Higgins J, Sume GE, Banda R, Korir CK, Waziri N, Gidado S, Bammeke P, Edukugo A, Nganda GW, Forbi JC, Burns CC, Liu H, Jorba J, Asekun A, Franka R, Wassilak SG, Bolu O.
        MMWR Morb Mortal Wkly Rep. 2019 Jul 26;68(29):642-646.
        The number of wild poliovirus (WPV) cases in Nigeria decreased from 1,122 in 2006 to six WPV type 1 (WPV1) in 2014 (1). During August 2014-July 2016, no WPV cases were detected; during August-September 2016, four cases were reported in Borno State. An insurgency in northeastern Nigeria had resulted in 468,800 children aged <5 years deprived of health services in Borno by 2016. Military activities in mid-2016 freed isolated families to travel to camps, where the four WPV1 cases were detected. Oral poliovirus vaccine (OPV) campaigns were intensified during August 2016-December 2017; since October 2016, no WPV has been detected (2). Vaccination activities in insurgent-held areas are conducted by security forces; however, 60,000 unvaccinated children remain in unreached settlements. Since 2018, circulating vaccine-derived poliovirus type 2 (cVDPV2) has emerged and spread from Nigeria to Niger and Cameroon; outbreak responses to date have not interrupted transmission. This report describes progress in Nigeria polio eradication activities during January 2018-May 2019 and updates the previous report (2). Interruption of cVDPV2 transmission in Nigeria will need increased efforts to improve campaign quality and include insurgent-held areas. Progress in surveillance and immunization activities will continue to be reviewed, potentially allowing certification of interruption of WPV transmission in Africa in 2020.

      2. An evaluation of a tuberculosis case-finding and treatment program among Syrian refugees-Jordan and Lebanon, 2013-2015
        Boyd AT, Cookson ST, Almashayek I, Yaacoub H, Qayyum MS, Galev A.
        Confl Health. 2019 ;13:32.
        Background: The displacement crisis in Syria poses challenges for tuberculosis (TB) control across the region. Since 2012 in Jordan and 2013 in Lebanon, the International Organization for Migration (IOM) has supported the National TB Program (NTP) in detecting and treating TB among Syrian refugees. In December 2016, IOM asked US Centers for Disease Control and Prevention (CDC) staff to evaluate its program of support to Jordan and Lebanon’s NTPs for TB control among Syrian refugees. This manuscript focuses on case-finding, including contact investigations, and treatment components of the IOM program during 2013-2015 in Jordan and 2015 in Lebanon. Methods: The evaluation consisted of a retrospective review of de-identified Jordan and Lebanon line lists of TB cases and of investigated contacts (Lebanon only). Syrian refugee TB cases were categorized by sex, age group (age < 5 years, 5-14 years, >/=15 years), TB type (pulmonary versus extra-pulmonary), and additionally in Jordan, by refugee camp status (residence in versus outside a refugee camp), to evaluate differences in treatment completion and contact investigation. Results: In Jordan, Syrian refugee cases represented 24.4% of TB cases in 2013, when Syrian refugees made up 6.8% of the country’s population, and 13.8% of TB cases in 2015, when Syrians made up 8.3% of the total population. In Lebanon in 2015, Syrian refugee cases represented 21.4% of TB cases, when Syrians made up 20.1% of the total population. In Jordan, the proportion of Syrian TB cases residing in refugee camps (29.3%) was higher than the proportion of Syrians refugees residing in camps (17.1%). Of Syrian TB cases in 2015, 94.8% in Jordan and 87.8% in Lebanon completed treatment. In Lebanon, among Syrian TB cases with household contacts listed, contact investigation was completed for 77.8% of cases. Conclusion: IOM’s program of NTP support provides critical TB services for Syrian refugees with high treatment completion rates. More community and health practitioner outreach for enhanced active case finding among community-based Syrian refugees in Jordan may improve TB case detection in populations outside of refugee camps. Thorough contact investigations need continued emphasis, including completely recording investigations in both countries, to find active TB cases.

      3. Antiretroviral adherence level necessary for HIV viral suppression using real-world data
        Byrd KK, Hou JG, Hazen R, Kirkham H, Suzuki S, Clay PG, Bush T, Camp NM, Weidle PJ, Delpino A.
        J Acquir Immune Defic Syndr. 2019 Jul 18.
        BACKGROUND: A benchmark of near-perfect adherence (>/=95%) to antiretroviral therapy (ART) is often cited as necessary for HIV viral suppression. However, given newer, more effective ART medications the threshold for viral suppression might be lower. We estimated the minimum ART adherence level necessary to achieve viral suppression. SETTINGS: The Patient-centered HIV Care Model demonstration project. METHODS: Adherence to ART was calculated using the Proportion of Days Covered (PDC) measure for the 365-day period prior to each viral load test result, and grouped into five categories (<50%, 50%-<80%, 80%-<85%, 85%-<90%, and >/=90%). Binomial regression analyses were conducted to determine factors associated with viral suppression (HIV RNA <200 copies/mL); demographics, PDC category and ART regimen type were explanatory variables. Generalized estimating equations with an exchangeable working correlation matrix accounted for correlation within subjects. In addition, probit regression models were used to estimate adherence levels required to achieve viral suppression in 90% of HIV viral load tests. RESULTS: The adjusted odds of viral suppression did not differ between persons with an adherence level of 80%-<85% or 85%-<90% and those with an adherence level of >/=90%. Additionally, the overall estimated adherence level necessary to achieve viral suppression in 90% of viral load tests was 82% and varied by regimen type; integrase inhibitor- and non-nucleoside reverse transcriptase inhibitor-based regimens achieved 90% viral suppression with adherence levels of 75% and 78%, respectively. CONCLUSIONS: The ART adherence level necessary to reach HIV viral suppression may be lower than previously thought and may be regimen dependent.
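
        The abstract above describes two modeling steps: a binomial GEE with an exchangeable working correlation, and probit models used to back out the adherence level at which 90% of viral load tests are suppressed. The sketch below mirrors that structure on simulated data, using a continuous proportion-of-days-covered measure for simplicity rather than the study's five adherence categories; all values are invented.

```python
# (1) Binomial GEE with exchangeable within-patient correlation; (2) probit model
# inverted to find the adherence level giving 90% probability of suppression.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy.stats import norm

rng = np.random.default_rng(6)
n_patients, tests_per_patient = 300, 4
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_patients), tests_per_patient),
    "pdc": rng.uniform(0.3, 1.0, n_patients * tests_per_patient),  # proportion of days covered
})
df["suppressed"] = rng.binomial(1, norm.cdf(-1.0 + 3.0 * df["pdc"]))

gee = smf.gee("suppressed ~ pdc", groups="patient", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(gee.params)

probit = smf.probit("suppressed ~ pdc", data=df).fit(disp=False)
pdc_90 = (norm.ppf(0.90) - probit.params["Intercept"]) / probit.params["pdc"]
print("Estimated adherence for 90% suppression:", round(pdc_90, 2))
```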

      4. Urine emtricitabine and tenofovir concentrations provide markers of recent antiretroviral drug exposure among HIV-negative men who have sex with men
        Haaland RE, Martin A, Livermont T, Fountain J, Dinh C, Holder A, Lupo LD, Hall L, Conway-Washington C, Kelley CF.
        J Acquir Immune Defic Syndr. 2019 Jul 4.
        BACKGROUND: Urine provides a minimally invasive specimen that may allow for development of rapid tests to detect antiretroviral drugs (ARVs) and provide opportunities to improve individual adherence. This study sought to determine if urine could provide a biomarker of adherence for currently approved PrEP and HIV treatment regimens. METHODS: Urine and blood were collected from 34 HIV-negative men who have sex with men aged 18-49 years enrolled in a clinical trial comparing 2 ARV regimens. Specimens were collected 4 and 24 hours after a single oral dose of tenofovir disoproxil fumarate (TDF)/emtricitabine (FTC) (n=10) or tenofovir alafenamide (TAF)/FTC/cobicistat (COBI)/elvitegravir (EVG) (n=8), or after 4 and 10 days of daily oral TDF/FTC (n=9) or TAF/FTC/COBI/EVG (n=7). Tenofovir (TFV), FTC, and EVG were measured by high performance liquid chromatography-mass spectrometry. RESULTS: Median urine FTC concentrations at 4 and 24 hours were similar between men receiving TDF/FTC (4 hours 147 microg/mL; 24 hours 10 microg/mL) and men receiving TAF/FTC/COBI/EVG (4 hours 333 microg/mL, p=0.173; 24 hours 13 microg/mL, p=0.681). Median urine TFV concentrations were lower among men receiving TAF/FTC/COBI/EVG (4 hours 1.2 microg/mL; 24 hours 0.8 microg/mL) compared to men receiving TDF/FTC (4 hours 17 microg/mL, p<0.001; 24 hours 7 microg/mL, p=0.001). Urine TFV concentrations remained reduced among men receiving TAF/FTC/COBI/EVG compared to men receiving TDF/FTC following daily dosing. EVG was not consistently measureable in urine. CONCLUSION: High urine FTC and TFV concentrations could provide an indication of adherence to daily oral dosing with TDF or TAF-based regimens used for treatment and prevention.

      5. Data from mathematical models suggest that kissing and saliva exchange during sexual activity might be major contributors to community gonorrhoea morbidity. Although there is little evidence to support this, it provokes discussion of the potential role of the oropharynx in gonorrhoea control. Improved sensitivity and ease of diagnostic testing, as well as increased screening for extragenital infections among men who have sex with men, have increased awareness of the high frequency of oropharyngeal gonorrhoea. However, there are insufficient data to determine the mechanisms of transmission for these infections. Innovative studies that use quantitative microbiological techniques are needed to accurately assess how oral gonorrhoea or saliva exchange in infected people contribute to the morbidity of gonorrhoea in the community. More empirical data on pharyngeal gonorrhoea infections, and the role of transmission to and from the oropharynx, are needed to inform prevention planning.

      6. The power of partners: positively engaging networks of people with HIV in testing, treatment and prevention
        Katz DA, Wong VJ, Medley AM, Johnson CC, Cherutich PK, Green KE, Huong P, Baggaley RC.
        J Int AIDS Soc. 2019 Jul;22 Suppl 3:e25314.

        [No abstract]

      7. Trends in diagnosed chronic hepatitis B in a US health system population, 2006-2015
        Lu M, Zhou Y, Holmberg SD, Moorman AC, Spradling PR, Teshale EH, Boscarino JA, Daida YG, Schmidt MA, Li J, Rupp LB, Trudeau S, Gordon SC.
        Open Forum Infect Dis. 2019 Jul;6(7):ofz286.
        Background: Trends in the epidemiology of chronic hepatitis B (CHB) among routine clinical care patients in the United States are not well documented. We used data from the Chronic Hepatitis Cohort Study to investigate changes in prevalence and newly recorded cases of CHB from 2006 to 2015. Methods: Annual percentage changes (APCs) were estimated using joinpoint Poisson regression. Analyses were adjusted by study site; when an interaction with the trend was observed, APCs were estimated by subgroups. Differences in rates based on race, age, and sex were calculated with rate ratios. Results: We identified 5492 patients with CHB within select health systems with total populations that ranged from 1.9 to 2.4 million persons. From 2006 to 2014, the prevalence of diagnosed CHB increased from 181.3 to 253.0 per 100 000 persons in the health system population; from 2014 to 2015, it declined to 237.0 per 100 000 persons. APC was +3.7%/y through 31 December 2014 (P < .001) and -15.0%/y (P < .001) thereafter. The rate of newly reported cases of CHB did not change significantly across the study period (APC, -1.1%/y; P = .07). The rates of newly reported cases were 20.5 times higher among patients in the Asian American/American Indian/Pacific Islander (ASINPI) category, compared with white patients, and 2.8 times higher among African American patients. The ratio of male to female patients was roughly 3:2. Conclusions: The prevalence of diagnosed CHB in this US patient population increased from 2006 to 2014, after which it decreased significantly. Rates declined most rapidly among patients </=40 or 61-70 years old, as well as among ASINPI patients. The rate of newly reported cases remained steady over the study period.
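
        The annual percentage change (APC) reported above corresponds to exponentiating the slope of a log-linear (Poisson) trend model; joinpoint regression additionally searches for points where the slope changes. The single-segment sketch below uses invented counts and denominators purely to show how an APC is computed, and is not the cohort's data.

```python
# Estimating an annual percentage change (APC) from a log-linear Poisson trend
# model with a population offset. Counts and denominators are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
years = np.arange(2006, 2015)
population = np.full(years.size, 2_000_000)
true_rate = 181.3e-5 * (1.037 ** (years - 2006))   # ~3.7%/year growth in prevalence
cases = rng.poisson(true_rate * population)
df = pd.DataFrame({"year": years - 2006, "cases": cases, "pop": population})

trend = smf.glm("cases ~ year", data=df, family=sm.families.Poisson(),
                offset=np.log(df["pop"])).fit()
apc = 100 * (np.exp(trend.params["year"]) - 1)
print(f"Estimated APC: {apc:.1f}% per year")
```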

      8. The US South accounted for 51% of annual new HIV infections, 50% of undiagnosed infections and 45% of persons with HIV infection in 2016 while comprising 38% of the population. Myriad structural and contextual factors are associated with HIV-related disparities. This paper describes initiatives and strategies conducted by the Centers for Disease Control and Prevention and Health Resources and Services Administration to identify opportunities and activities addressing the disparity of HIV diagnoses in the South. Targeted HIV prevention and care efforts can change the trajectory of outcomes along the HIV care continuum and reduce HIV-related disparities in the South.

      9. Social mixing and clinical features linked with transmission in a network of extensively drug-resistant (XDR) tuberculosis cases in KwaZulu-Natal, South Africa
        Nelson KN, Jenness SM, Mathema B, Lopman BA, Auld SC, Shah NS, Brust JC, Ismail N, Omar SV, Brown TS, Allana S, Campbell A, Moodley P, Mlisana K, Gandhi NR.
        Clin Infect Dis. 2019 Jul 12.
        BACKGROUND: Tuberculosis (TB) is the leading infectious cause of death globally and drug-resistant TB strains pose a serious threat to controlling the global TB epidemic. The clinical features, locations, and social factors driving transmission in settings with a high incidence of drug-resistant TB are poorly understood. METHODS: We measured a network of genomic links using Mycobacterium tuberculosis (Mtb) whole genome sequences. RESULTS: Cases with 2-3 months of cough or who spent time in urban locations, were more likely to be linked in the network, while cases with sputum smear-positive disease were less likely to be linked than those with smear-negative disease. Associations persisted using different thresholds to define genomic links and irrespective of assumptions about the direction of transmission. CONCLUSIONS: Identifying factors that lead to many transmissions, including contact with urban areas, can suggest settings instrumental in transmission and indicate optimal locations and groups to target with interventions.

      10. Prevalence, incidence, and clearance of anal high-risk human papillomavirus infection among HIV-infected men in the SUN Study
        Patel P, Bush T, Kojic EM, Conley L, Unger ER, Darragh TM, Henry K, Hammer J, Escota G, Palefsky JM, Brooks JT.
        J Infect Dis. 2018 Mar 5;217(6):953-963.
        Background: The natural history of anal human papilloma virus (HPV) infection among human immunodeficiency virus (HIV)-infected men is unknown. Methods: Annually, from 2004 to 2012, we examined baseline prevalence, incidence, and clearance of anal HPV infection at 48 months, and associated factors among HIV-infected men. Results: We examined 403 men who have sex with men (MSM) and 96 men who have sex with women (MSW) (median age 42 years for both, 78% versus 81% prescribed cART, median CD4+ T-lymphocyte cell count 454 versus 379 cells/mm3, and 74% versus 75% had undetectable viral load, respectively). Type 16 prevalence among MSM and MSW was 38% versus 14% (P < .001), and incidence 24% versus 7% (P = .001). Type 18 prevalence was 24% versus 8% (P < .001), and incidence 13% versus 4% (P = .027). Among MSM and MSW, clearance of prevalent HPV 16 and HPV 18 was 31% and 60% (P = .392), and 47% and 25% (P = .297), respectively. Among MSM, receptive anal sex (with or without a condom) was associated with persistent HPV 16 (OR 2.24, P < .001). Conclusions: MSM had higher prevalence and incidence of HPV than MSW, but similar clearance. Receptive anal sex may predict cancer risk among HIV-infected MSM.

      11. Dolutegravir Use at Conception – Additional Surveillance Data from Botswana
        Raesima MM, Ogbuabo CM, Thomas V, Forhan SE, Gokatweng G, Dintwa E, Petlo C, Motswere-Chirwa C, Rabold EM, Tinker SC, Odunsi S, Malima S, Mmunyane O, Modise T, Kefitlhile K, Dare K, Letebele M, Roland ME, Moore CA, Modi S, Williamson DM.
        N Engl J Med. 2019 Jul 22.

        [No abstract]

      12. Health care autonomy of women living with HIV
        Redfield RR, Modi S, Moore CA, Delaney A, Honein MA, Tomlinson HL.
        N Engl J Med. 2019 Jul 24.

        [No abstract]

      13. BACKGROUND: Despite recommendations for preventive health services and routine HIV care for HIV-positive women, limited data are available regarding uptake of recommendations. METHODS: We used data from the 2013-2014 data cycles of the Medical Monitoring Project. We calculated weighted estimates and used multivariable logistic regression with adjusted prevalence ratios (aPR) and 95% confidence intervals (CI) to examine associations between preventive health screenings, routine HIV care (based on viral load (VL) and CD4 measures as proxies), and sociodemographic factors. RESULTS: Of 2,766 women, 47.7% were >/= 50 years old, 61.7% non-Hispanic black, 37.2% had > high school education, 63.3% had been living with HIV for >/= 10 years, 68.4% were living at or below the federal poverty level, 67.3% had public health insurance; 93.8% were prescribed antiretroviral therapy (ART); 66.1% had sustained/durable suppression (12 months). For women aged >/= 18 years, cervical cancer, breast cancer, and STI screenings were documented for 44.3%, 27.6%, and 34.7%, respectively; 26% did not meet 6-month, and 37% did not meet 12-month, VL and CD4 test measure goals. In multivariable analyses, women with no VLs in past 6 months were less likely to be durably suppressed, and women who did not have >/= three CD4 or VL tests (past 12 months) were less likely to be living above the poverty level, and more likely to have public insurance, compared to private health insurance (p < 0.05). CONCLUSION: Receipt of recommended preventive care was suboptimal. Targeted interventions are warranted to help ensure access to comprehensive HIV care and prevention services for women.

      14. Assessing uncertainty in an anatomical site-specific gonorrhea transmission model of men who have sex with men
        Spicknall IH, Mayer KH, Aral SO, Romero-Severson EO.
        Sex Transm Dis. 2019 May;46(5):321-328.
        BACKGROUND: Increased gonorrhea detection highlights the need for additional prevention efforts. Gonorrhea may only be acquired when there is contact between infected and uninfected anatomical sites. With 3 sites of infection, this leads to 7 plausible routes of men who have sex with men (MSM) transmission: urethra-to-rectum, rectum-to-urethra, urethra-to-oropharynx, rectum-to-oropharynx, oropharynx-to-urethra, oropharynx-to-rectum, and oropharynx-to-oropharynx. We characterize the uncertainty and potential importance of transmission from each anatomical site using a deterministic compartmental mathematical model. METHODS: We developed a model of site-specific gonococcal infection, where individuals are infected at 0, 1, 2, or all 3 sites. Sexual behavior and infection duration parameters were fixed similar to a recent model analysis of Australian MSM. Markov chain Monte Carlo methods were used to sample the posterior distribution of transmission probabilities that were consistent with site-specific prevalence in American MSM populations under specific scenarios. Scenarios were defined by whether transmission routes may or may not transmit by constraining specific transmission probabilities to zero rather than fitting them. RESULTS: Transmission contributions from each site have greater uncertainty when more routes may transmit; in the most extreme case, when all routes may transmit, the oropharynx can contribute 0% to 100% of all transmissions. In contrast, when only anal or oral sex may transmit, transmission from the oropharynx can account for only 0% to 25% of transmission. Intervention effectiveness against transmission from each site also has greater uncertainty when more routes may transmit. CONCLUSIONS: Even under ideal conditions (ie, when site-specific gonococcal prevalence, relative rates of specific sex acts, and duration of infection at each anatomical site are known and do not vary), the relative importance of different anatomical sites for gonococcal infection transmission cannot be inferred with precision. Additional data informing per act transmissibility are needed to understand site-specific gonococcal infection transmission. This understanding is essential for predicting population-specific intervention effectiveness.
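
        The analysis above builds a deterministic compartmental model of site-specific infection and uses Markov chain Monte Carlo to sample transmission probabilities consistent with observed prevalence. The sketch below is a deliberately oversimplified two-site version with fixed, arbitrary parameters; it is meant only to convey the flavor of such a model, not to reproduce the authors' three-site, seven-route framework or their fitting procedure.

```python
# Two-site SIS-style toy model: urethral and oropharyngeal prevalence, with a
# single cross-site transmission route in each direction. Parameters are
# arbitrary placeholders chosen so the system settles at a nonzero equilibrium.
import numpy as np
from scipy.integrate import solve_ivp

contact_rate = 20.0      # partnerships per year (assumed)
beta_oro_to_ure = 0.5    # per-partnership probability, oropharynx -> urethra (assumed)
beta_ure_to_oro = 0.6    # per-partnership probability, urethra -> oropharynx (assumed)
recovery_ure = 6.0       # 1 / duration of urethral infection, per year (assumed)
recovery_oro = 4.0       # 1 / duration of oropharyngeal infection, per year (assumed)

def sis(t, y):
    i_ure, i_oro = y     # population prevalence of infection at each site
    new_ure = contact_rate * beta_oro_to_ure * i_oro * (1 - i_ure)
    new_oro = contact_rate * beta_ure_to_oro * i_ure * (1 - i_oro)
    return [new_ure - recovery_ure * i_ure,
            new_oro - recovery_oro * i_oro]

sol = solve_ivp(sis, (0.0, 50.0), [0.01, 0.01])
print("Equilibrium prevalence (urethra, oropharynx):", np.round(sol.y[:, -1], 3))
```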

      15. Progress in testing for and treatment of hepatitis C virus infection among persons who inject drugs – Georgia, 2018
        Stvilia K, Spradling PR, Asatiani A, Gogia M, Kutateladze K, Butsashvili M, Zarkua J, Tsertsvadze T, Sharvadze L, Japaridze M, Kuchuloria T, Gvinjilia L, Tskhomelidze I, Gamkrelidze A, Khonelidze I, Sergeenko D, Shadaker S, Averhoff F, Nasrullah M.
        MMWR Morb Mortal Wkly Rep. 2019 Jul 26;68(29):637-641.
        In April 2015, the country of Georgia, with a high prevalence of hepatitis C virus (HCV) infection (5.4% of the adult population, approximately 150,000 persons), embarked on the world’s first national elimination program (1,2). Nearly 40% of these infections are attributed to injection drug use, and an estimated 2% of the adult population currently inject drugs, among the highest prevalence of injection drug use in the world (3,4). Since 2006, needle and syringe programs (NSPs) have been offering HCV antibody testing to persons who inject drugs and, since 2015, referring clients with positive test results to the national treatment program. This report summarizes the results of these efforts. Following implementation of the elimination program, the number of HCV antibody tests conducted at NSPs increased from an average of 3,638 per year during 2006-2014 to an average of 21,551 during 2015-2018. In 2017, to enable tracking of clinical outcomes among persons who inject drugs, NSPs began encouraging clients to voluntarily provide their national identification number (NIN), which all citizens must use to access health care treatment services. During 2017-2018, a total of 2,780 NSP clients with positive test results for HCV antibody were identified in the treatment database by their NIN. Of 494 who completed treatment and were tested for HCV RNA >/=12 weeks after completing treatment, 482 (97.6%) were cured of HCV infection. Following the launch of the elimination program, Georgia has made much progress in hepatitis C screening among persons who inject drugs; recent data demonstrate high cure rates achieved in this population. Testing at NSPs is an effective strategy for identifying persons with HCV infection. Tracking clients referred from NSPs through treatment completion allows for monitoring the effectiveness of linkage to care and treatment outcomes in this population at high risk, a key to achieving hepatitis C elimination in Georgia. The program in Georgia might serve as a model for other countries.

      16. Understanding the association of internalized HIV stigma with retention in HIV care
        Valverde E, Rodriguez A, White B, Guo Y, Waldrop-Valverde D.
        J HIV AIDS. 2018 Oct;4(3).
        Internalized HIV stigma plays a detrimental role in terms of linkage to HIV care and adherence to antiretroviral treatment. Yet, little is known regarding the association of internalized HIV stigma with retention in HIV care. We conducted an analysis of interview and medical record abstraction data collected from 188 HIV positive men and women receiving HIV care in Miami, Florida. Demographic characteristics, HIV risk behaviors and care related factors were used to explore the association of internalized HIV stigma with retention in care in a Poisson regression analysis. The relationship of internalized HIV stigma and retention in care was moderated by the patient’s level of engagement with an HIV care provider (p=0.004) in that higher levels of provider engagement were significantly associated with higher retention in care rates among patients with moderate levels of internalized HIV stigma. Additionally, retention in care rates were lower for females than for males and for 18-44 year olds than for individuals 44 years and older. Our findings indicate that higher levels of patient-provider engagement may reduce the impact that internalized HIV stigma has on retention in HIV care for some patients. Interventions with HIV care providers or patients to enhance patient-provider engagement may be beneficial.
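
        The moderation result above corresponds to an interaction term in a Poisson regression. The sketch below fits such a model to simulated visit counts with a log follow-up offset; the outcome definition, scores, and effect sizes are hypothetical, not the Miami study data.

```python
# Poisson regression with an interaction (moderation) term and a log follow-up
# offset. All data below are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n = 188
df = pd.DataFrame({
    "stigma": rng.uniform(1, 4, n),          # internalized stigma score
    "engagement": rng.uniform(1, 5, n),      # patient-provider engagement score
    "followup_months": rng.uniform(6, 24, n),
})
mu = np.exp(-1.5 - 0.2 * df["stigma"] + 0.1 * df["engagement"]
            + 0.05 * df["stigma"] * df["engagement"]) * df["followup_months"]
df["visits_kept"] = rng.poisson(mu)

fit = smf.glm("visits_kept ~ stigma * engagement", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["followup_months"])).fit(cov_type="HC0")
print(fit.params)   # 'stigma:engagement' captures the moderation effect
```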

      17. Short and long-term pharmacologic measures of HIV pre-exposure prophylaxis use among high-risk men who have sex with men in HPTN 067/ADAPT
        Velloza J, Bacchetti P, Hendrix CW, Murnane P, Hughes JP, Li M, Curlin M, Holtz TH, Mannheimer S, Marzinke MA, Amico KR, Liu A, Piwowar-Manning E, Eshleman SH, Dye BJ, Gandhi M, Grant RM.
        J Acquir Immune Defic Syndr. 2019 Jul 4.
        BACKGROUND: The effectiveness of oral emtricitabine (FTC)/tenofovir (TFV) disoproxil fumarate (TDF)-based HIV pre-exposure prophylaxis (PrEP) depends on adherence. Pharmacologic measures help interpret patterns and predictors of PrEP adherence. SETTING: We analyzed data from the sub-sample of men who have sex with men (MSM) enrolled in HPTN 067/ADAPT in Bangkok, Thailand, and Harlem, NY, U.S. METHODS: After a five-week directly observed therapy period, participants were randomized to daily, time-driven, or event-driven PrEP. Follow-up occurred at weeks 4, 12, and 24 post-randomization. Plasma and hair FTC/TFV levels indicated short and long-term PrEP use, respectively. Electronic pill bottle data (Wisepill) were collected weekly. Pearson correlation coefficients between PrEP use measures were calculated; linear mixed models assessed predictors of plasma and hair drug concentrations. RESULTS: Among 350 participants (median age 31 years, interquartile range [IQR]: 25-38), 49.7% were from Harlem, half had less than college education, and 21% reported heavy alcohol use. In multivariable models, being enrolled in Harlem, being in non-daily arms, and having less than college education were associated with lower hair FTC/TFV concentrations; heavy alcohol use was associated with higher concentrations. Similar results were found for plasma concentrations by site and arm, but older age and greater number of sex partners were associated with higher concentrations. Hair and plasma FTC/TFV concentrations were moderately correlated with Wisepill data (r>/=0.29) across visits. CONCLUSION: In HPTN067, plasma, hair, and Wisepill data correlated with one another and served as complementary adherence measures. Site, arm, education, age, alcohol, and sexual behavior influenced patterns of adherence.
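
        The linear mixed models described above can be sketched with a random intercept per participant, as below. The simulated concentrations, covariates, and effect directions are placeholders and do not reflect the HPTN 067/ADAPT data.

```python
# Linear mixed model for repeated (log) hair drug concentrations with a random
# intercept per participant. Data and effects are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n_participants, n_visits = 350, 3
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n_participants), n_visits),
    "visit": np.tile([4, 12, 24], n_participants),
    "site_harlem": np.repeat(rng.integers(0, 2, n_participants), n_visits),
    "nondaily_arm": np.repeat(rng.integers(0, 2, n_participants), n_visits),
})
subject_effect = np.repeat(rng.normal(0, 0.5, n_participants), n_visits)
df["log_hair_tfv"] = (1.0 - 0.4 * df["site_harlem"] - 0.6 * df["nondaily_arm"]
                      + subject_effect + rng.normal(0, 0.3, len(df)))

lmm = smf.mixedlm("log_hair_tfv ~ site_harlem + nondaily_arm + C(visit)",
                  data=df, groups=df["pid"]).fit()
print(lmm.params)
```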

    • Disaster Control and Emergency Services
      1. Enhanced One Health Surveillance during the 58th Presidential Inauguration – District of Columbia, January 2017
        Garrett-Cherry TA, Hennenfent AK, McGee S, Davies-Cole J.
        Disaster Med Public Health Prep. 2019 Jul 23:1-7.
        OBJECTIVE: In January 2017, Washington, DC, hosted the 58th United States presidential inauguration. The DC Department of Health leveraged multiple health surveillance approaches, including syndromic surveillance (human and animal) and medical aid station-based patient tracking, to detect disease and injury associated with this mass gathering. METHODS: Patient data were collected from a regional syndromic surveillance system, medical aid stations, and an internet-based emergency department reporting system. Animal health data were collected from DC veterinary facilities. RESULTS: Of 174 703 chief complaints from human syndromic data, there were 6 inauguration-related alerts. Inauguration attendees who visited aid stations (n = 162) and emergency departments (n = 180) most commonly reported feeling faint/dizzy (n = 29; 17.9%) and pain/cramps (n = 34; 18.9%). In animals, of 533 clinical signs reported, most were gastrointestinal (n = 237; 44.5%) and occurred in canines (n = 374; 70.2%). Ten animals that presented dead on arrival were investigated; no significant threats were identified. CONCLUSION: Use of multiple surveillance systems allowed for near-real-time detection and monitoring of disease and injury syndromes in humans and domestic animals potentially associated with inaugural events and in local health care systems.

    • Disease Reservoirs and Vectors
      1. Marmots and Yersinia pestis Strains in Two Plague Endemic Areas of Tien Shan Mountains
        Sariyeva G, Bazarkanova G, Maimulov R, Abdikarimov S, Kurmanov B, Abdirassilova A, Shabunin A, Sagiyev Z, Dzhaparova A, Abdel Z, Mussagaliyeva R, Morand S, Motin V, Kosoy M.
        Front Vet Sci. 2019 ;6:207.
        The main purpose of this study was to clarify the role of gray marmots (Marmota baibacina) in the long-term maintenance of highly virulent strains of Yersinia pestis in two plague endemic foci of the Tien Shan Mountains in Kyrgyzstan. We present data from regular observations of populations of M. baibacina and small rodents cohabiting with marmots in the mountainous grasslands of the Sari-Dzhas (east of Issyk-Kul Lake) and the Upper-Naryn (south of Issyk-Kul Lake) natural foci. During 2012-2017, the abundance of marmots and their ectoparasites (fleas and ticks) was significantly higher in Upper-Naryn compared with Sari-Dzhas, although there were no differences in the number and diversity of small rodents cohabiting with marmots. The plague bacterium was detected either in marmots or in their ectoparasites collected during 4 of 6 years of observation in Sari-Dzhas and during 2 of 4 years of observation in Upper-Naryn. Plague was found in three sectors situated closely to each other in Sari-Dzhas and in 1 of 8 repeatedly surveyed sectors in Upper-Naryn. During 6 years, we isolated 9 strains of Y. pestis from marmots, two from their fleas Oropsylla silantiewi, one from an unidentified tick, and one from the gray hamster (Cricetulus migratorius). All plague strains isolated from the rodents and their ectoparasites in this study were similar to the Antiqua biovar specific for marmots. The results indicate that plague can circulate continuously in the Tien Shan Mountains in populations of gray marmots and their ectoparasites with a facultative involvement of other rodent species after significant changes in rodent communities that occurred in Kyrgyzstan during the previous two decades. The simultaneous field survey of the two natural foci of plague, Sari-Dzhas and Upper-Naryn, would be important for further analysis of circulation of Y. pestis strains belonging to the Antiqua biovar in the Tien Shan Mountains.

      2. Autocidal gravid ovitraps protect humans from chikungunya virus infection by reducing Aedes aegypti mosquito populations
        Sharp TM, Lorenzi O, Torres-Velasquez B, Acevedo V, Perez-Padilla J, Rivera A, Munoz-Jordan J, Margolis HS, Waterman SH, Biggerstaff BJ, Paz-Bailey G, Barrera R.
        PLoS Negl Trop Dis. 2019 Jul;13(7):e0007538.
        BACKGROUND: Public health responses to outbreaks of dengue, chikungunya, and Zika virus have been stymied by the inability to control the primary vector, Aedes aegypti mosquitos. Consequently, the need for novel approaches to Aedes vector control is urgent. Placement of three autocidal gravid ovitraps (AGO traps) in ~85% of homes in a community was previously shown to sustainably reduce the density of female Ae. aegypti by >80%. Following the introduction of chikungunya virus (CHIKV) to Puerto Rico, we conducted a seroprevalence survey to estimate the prevalence of CHIKV infection in communities with and without AGO traps and evaluate their effect on reducing CHIKV transmission. METHODS AND FINDINGS: Multivariate models that calculated adjusted prevalence ratios (aPR) showed that among 175 and 152 residents of communities with and without AGO traps, respectively, an estimated 26.1% and 43.8% had been infected with CHIKV (aPR = 0.50, 95% CI: 0.37-0.91). After stratification by time spent in their community, protection from CHIKV infection was strongest among residents who reported spending many or all weekly daytime hours in their community: 10.3% seropositive in communities with AGO traps vs. 48.7% in communities without (PR = 0.21, 95% CI: 0.11-0.41). The age-adjusted rate of fever with arthralgia attributable to CHIKV infection was 58% (95% CI: 46-66%). The monthly numbers of CHIKV-infected mosquitos and symptomatic residents were lower in communities with AGO traps compared to those without. CONCLUSIONS: These findings indicate that AGO traps are an effective tool that protects humans from infection with a virus transmitted by Ae. aegypti mosquitos. Future studies should evaluate their protective effectiveness in large, urban communities.
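
        As a rough check on the reported protection, the crude prevalence ratios can be recomputed directly from the seroprevalence figures quoted above; a minimal Python sketch follows. The aPR of 0.50 reported by the authors is model-adjusted, so it is not expected to match the crude value exactly.

        # Illustrative only: crude prevalence ratios from the quoted seroprevalences.
        seroprev_with_traps = 0.261      # communities with AGO traps
        seroprev_without_traps = 0.438   # communities without AGO traps
        print(round(seroprev_with_traps / seroprev_without_traps, 2))   # crude PR ~0.60 (reported aPR: 0.50)

        # Stratum: residents spending many/all weekly daytime hours in their community
        print(round(0.103 / 0.487, 2))                                  # crude PR ~0.21, matching the reported stratified PR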

    • Environmental Health
        BACKGROUND: Introduction of an organic diet can significantly reduce exposure to some classes of pesticides in children and adults, but no long-term trials have been conducted. OBJECTIVES: To assess the effect of a long-term (24-week) organic produce intervention on pesticide exposure among pregnant women. METHODS: We recruited 20 women from the Idaho Women, Infants, and Children (WIC) program during their first trimester of pregnancy. Eligible women were nonsmokers aged 18-35 years who reported eating exclusively conventionally grown food. We randomly assigned participants to receive weekly deliveries of either organic or conventional fruits and vegetables throughout their second or third trimesters and collected weekly spot urine samples. Urine samples, which were pooled to represent monthly exposures, were analyzed for biomarkers of organophosphate (OP) and pyrethroid insecticides. RESULTS: Food diary data demonstrated that 66% of all servings of fruits and vegetables consumed by participants in the “organic produce” group were organic, compared to <3% in the “conventional produce” group. We collected an average of 23 spot samples per participant (461 samples total), which were combined to yield 116 monthly composites. 3-Phenoxybenzoic acid (3-PBA, a non-specific biomarker of several pyrethroids) was detected in 75% of the composite samples, and 3-PBA concentrations were significantly higher in samples collected from women in the conventional produce group compared to the organic produce group (0.95 vs 0.27 µg/L, p=0.03). Another pyrethroid biomarker, trans-3-(2,2-dichlorovinyl)-2,2-dimethylcyclopropane carboxylic acid, was detected more frequently in women in the conventional produce group compared to the organic produce group (16% vs 4%, p=0.05). In contrast, we observed no statistically significant differences in detection frequency or concentrations for any of the four biomarkers of OP exposure quantified in this trial. DISCUSSION: To our knowledge, this is the first long-term organic diet intervention study, and the first to include pregnant women. These results suggest that addition of organic produce to an individual’s diet, as compared to conventional produce, significantly reduces exposure to pyrethroid insecticides.

      2. Notes from the Field: Targeted Biomonitoring for GenX and Other Per- and Polyfluoroalkyl Substances Following Detection of Drinking Water Contamination – North Carolina, 2018
        Pritchett JR, Rinsky JL, Dittman B, Christensen A, Langley R, Moore Z, Fleischauer AT, Koehler K, Calafat AM, Rogers R, Esters L, Jenkins R, Collins F, Conner D, Breysse P.
        MMWR Morb Mortal Wkly Rep. 2019 Jul 26;68(29):647-648.

        [No abstract]

      3. Prenatal exposure to Polychlorinated Biphenyls and body fatness in girls
        Wang A, Jeddy Z, Sjodin A, Taylor EV, Marks KJ, Hartman TJ.
        Chemosphere. 2019 Jul 12;236:124315.
        Polychlorinated biphenyls (PCBs) are synthetic organochlorine compounds previously used in industrial processes. Although banned across Europe in the 1980s, these chemicals persist in the environment and are associated with adverse health outcomes in children. We investigated the association between in utero concentrations of PCBs and girls’ body fatness. Concentrations of various PCB congeners (PCB 118, PCB 138, PCB 153, PCB 170, and PCB 180) were measured in maternal serum samples collected in the early 1990s. Body fatness was measured in the daughters at 9 years of age using body mass index (BMI) and dual-energy x-ray absorptiometry (DXA) for percent body fat. Using multivariable linear regression, we explored associations between prenatal PCB congener concentrations and body fatness outcomes. Among 339 mother-daughter dyads, the median and interquartile range (IQR) for PCB congeners ranged from 15.0 ng g(-1) (11.0-20.8) for PCB 118 to 64.6 ng g(-1) (48.6-86.3) for PCB 153. Among daughters, the median was 27.5% (21.7-34.6) for percent body fat, 39.6% (36.4-43.5) for percent trunk fat, 4.9 kg m(-2) (3.5-7.0) for fat mass index, and 18.1 kg m(-2) (16.3-20.6) for body mass index. Multivariable-adjusted regression analyses showed little or no association between prenatal PCB concentrations and daughters’ body fatness measures. Prenatal concentrations of PCB congeners were not strongly associated with measures of body fatness in girls.

    • Epidemiology and Surveillance
      1. Advancing biological hazards risk assessment
        Messens W, Hugas M, Afonso A, Aguilera J, Berendonk TU, Carattoli A, Dhollander S, Gerner-Smidt P, Kriz N, Liebana E, Medlock J, Robinson T, Stella P, Waltner-Toews D, Catchpole M.
        EFSA Journal. 2019 ;17(S1).
        This paper focusses on biological hazards at the global level and considers the challenges to risk assessment (RA) from a One Health perspective. Two topics – vector-borne diseases (VBD) and antimicrobial resistance (AMR) – are used to illustrate the challenges ahead and to explore the opportunities that new methodologies such as next-generation sequencing can offer. Globalisation brings complexity and introduces drivers for infectious diseases. Cooperation and the application of an integrated RA approach – one that takes into consideration food farming and production systems including social and environmental factors – are recommended. Also needed are methodologies to identify emerging risks at a global level and propose prevention strategies. AMR is one of the biggest threats to human health in the infectious disease environment. Whereas new genomic typing techniques such as whole genome sequencing (WGS) provide further insights into the mechanisms of spread of resistance, the role of the environment is not fully elucidated, nor is the role of plants as potential vehicles for spread of resistance. Historical trends and recent experience indicate that (re)-emergence and/or further spread of VBD within the EU is a matter of when rather than if. Standardised and validated vector monitoring programs are required to be implemented at an international level for continuous surveillance and assessment of potential threats. There are benefits to using WGS – such as a quicker and better response to outbreaks and additional evidence for source attribution. However, significant challenges need to be addressed, including method standardisation and validation to fully realise these benefits; barriers to data sharing; and establishing epidemiological capacity for cluster triage and response.

    • Health Communication and Education
      1. Use of mass communication by public health programs in nonmetropolitan regions
        Kreslake JM, Elkins A, Thomas CN, Gates S, Lehman T.
        Prev Chronic Dis. 2019 Jul 25;16:E96.

        [No abstract]

    • Health Economics
      1. Estimating the cost of illness and burden of disease associated with the 2014-2015 chikungunya outbreak in the U.S. Virgin Islands
        Feldstein LR, Ellis EM, Rowhani-Rahbar A, Hennessey MJ, Staples JE, Halloran ME, Weaver MR.
        PLoS Negl Trop Dis. 2019 Jul 19;13(7):e0007563.
        Chikungunya virus (CHIKV), an alphavirus that causes fever and severe polyarthralgia, swept through the Americas in 2014 with almost 2 million suspected or confirmed cases reported by April 2016. In this study, we estimate the direct medical costs, cost of lost wages due to absenteeism, and years lived with disability (YLD) associated with the 2014-2015 CHIKV outbreak in the U.S. Virgin Islands (USVI). For this analysis, we used surveillance data from the USVI Department of Health, medical cost data from three public hospitals in the USVI, and data from two studies of laboratory-positive cases up to 12 months post illness. On average, employed case-patients missed 9 days of work in the 12 months following their disease onset, which resulted in an estimated cost of $15.5 million. Estimated direct healthcare costs were $2.9 million for the first 2 months and $0.6 million for 3-12 months following the outbreak. The total estimated cost associated with the outbreak ranged from $14.8 to $33.4 million (approximately 1% of gross domestic product), depending on the proportion of the population infected with symptomatic disease, degree of underreporting, and proportion of cases who were employed. The estimated YLDs associated with long-term sequelae from the CHIKV outbreak in the USVI ranged from 599 to 1,322. These findings highlight the significant economic burden of the recent CHIKV outbreak in the USVI and will aid policy-makers in making informed decisions about prevention and control measures for inevitable future CHIKV outbreaks.
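
        The headline figure can be reassembled with simple arithmetic: the sketch below sums the cost components quoted above, while the published range ($14.8 to $33.4 million) comes from varying assumptions such as the symptomatic attack rate, underreporting, and the share of cases who were employed.

        # Illustrative aggregation of the cost components quoted in the abstract (Python).
        lost_wages = 15.5e6                 # productivity loss from absenteeism over 12 months
        direct_costs_first_2_months = 2.9e6
        direct_costs_months_3_to_12 = 0.6e6
        total = lost_wages + direct_costs_first_2_months + direct_costs_months_3_to_12
        print(f"${total / 1e6:.1f} million")   # ~$19.0 million, within the reported $14.8-$33.4 million range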

      2. Estimates of testing for latent tuberculosis infection and cost, United States, 2013
        Marks SM, Woodruff RY, Owusu-Edusei K, Asay GR, Hill AN.
        Public Health Rep. 2019 Jul 24:33354919862688.
        OBJECTIVES: Tracking trends in the testing of latent tuberculosis infection (LTBI) can help measure tuberculosis elimination efforts in the United States. The objectives of this study were to estimate (1) the annual number of persons tested for LTBI and the number of LTBI tests conducted, by type of test and by public, private, and military sectors, and (2) the cost of LTBI testing in the United States. METHODS: We searched the biomedical literature for published data on private-sector and military LTBI testing in 2013, and we used back-calculation to estimate public-sector LTBI testing. To estimate costs, we applied Medicare-allowable reimbursements in 2013 by test type. RESULTS: We estimated an average (low-high) of 13.3 million (11.3-15.4 million) persons tested for LTBI and 15.3 million (12.9-17.7 million) LTBI tests, of which 13.2 million (11.1-15.3 million) were tuberculin skin tests and 2.1 million (1.8-2.4 million) were interferon-gamma release assays (IGRAs). Eighty percent of persons tested were in the public sector, 18% were in the private sector, and 2% were in the military. Costs of LTBI tests and of chest radiography totaled $314 million (range, $256 million to $403 million). CONCLUSIONS: To achieve tuberculosis elimination, millions more persons will need to be tested in all sectors. By targeting testing to only those at high risk of tuberculosis and by using more specific IGRA tests, the incidence of tuberculosis in the United States can be reduced and resources can be more efficiently used.
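
        The costing approach described (test volumes multiplied by 2013 Medicare-allowable reimbursements) can be sketched as follows; the unit costs and the chest radiography volume below are hypothetical placeholders, not the rates used in the study.

        # Structure of the unit-cost calculation only; all rates are hypothetical placeholders (Python).
        volumes = {"tst": 13.2e6, "igra": 2.1e6, "chest_xray": 1.0e6}    # chest x-ray volume is a placeholder
        unit_cost_usd = {"tst": 10.0, "igra": 80.0, "chest_xray": 30.0}  # hypothetical 2013 reimbursements
        total_cost = sum(volumes[test] * unit_cost_usd[test] for test in volumes)
        print(f"${total_cost / 1e6:.0f} million")   # the study estimate, using actual rates, was $314 million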

      3. Cost barriers to asthma care by health insurance type among children with asthma
        Pate CA, Qin X, Bailey CM, Zahran HS.
        J Asthma. 2019 Jul 25:1-7.
        Objective: Children with asthma have ongoing health care needs, and health insurance is a vital part of their health care access. Health care coverage may be associated with various cost barriers to asthma care. We examined cost barriers to receiving asthma care by health insurance type and coverage continuity among children with asthma using the 2012-2014 Child Asthma Call-back Survey (ACBS). Methods: The study sample included 3788 children under age 18 years with current asthma who had responses to the ACBS by adult proxy. Associations between cost barriers to asthma care and treatment were analyzed by demographic, health insurance coverage, and urban residence variables using multivariable logistic regression models. Results: Among insured children, more blacks reported a cost barrier to seeing a doctor (10.6% [5.9, 18.3]) compared with whites (2.9% [2.1, 4.0]) (p = 0.03). Adjusting for demographic factors (sex, age, and race), being uninsured and having partial-year coverage were associated with cost barriers to seeing a doctor (adjusted prevalence ratio aPR = 8.07 [4.78, 13.61] and aPR = 6.58 [3.78, 11.45], respectively) and to affording medication (aPR = 8.35 [5.23, 13.34] and aPR = 4.93 [2.96, 8.19], respectively), compared with children who had full-year coverage. Public insurance was associated with a cost barrier to seeing a doctor (aPR = 4.43 [2.57, 7.62]), compared with private insurance. Conclusions: Having no health insurance, partial-year coverage, and public insurance were associated with cost barriers to asthma care. Improving health insurance coverage may help strengthen access to and reduce cost barriers to asthma care.

    • Healthcare Associated Infections
      1. Accuracy of catheter-associated urinary tract infections reported to the National Healthcare Safety Network, January 2010 through July 2018
        Bagchi S, Watkins J, Norrick B, Scalise E, Pollock DA, Allen-Bridson K.
        Am J Infect Control. 2019 Jul 17.
        BACKGROUND: Surveillance of health care-associated, catheter-associated urinary tract infections (CAUTI) is the cornerstone of infection prevention activity. The Centers for Disease Control and Prevention’s National Healthcare Safety Network provides standard definitions for CAUTI surveillance, which have been updated periodically to increase the objectivity, credibility, and reliability of urinary tract infection definitions. Several state health departments have validated CAUTI data, providing insights into the accuracy of CAUTI reporting and adherence to the CAUTI definition. METHODS: Data accuracy measures included pooled mean sensitivity, specificity, positive predictive value, and negative predictive value. The total CAUTI error rate was computed as the proportion of mismatches among total records. The impact of the 2015 CAUTI definition changes was tested by comparing pooled accuracy estimates from validations conducted before 2015 with those conducted after 2015. RESULTS: At least 19 state health departments conducted CAUTI validations, which indicated a pooled mean sensitivity of 88.3%, specificity of 98.8%, positive predictive value of 93.6%, and negative predictive value of 97.6% for CAUTI reporting to the National Healthcare Safety Network. Among the 121 misclassified CAUTIs, 66% were underreported and 34% were overreported. The CAUTI classification error rate declined significantly from 4.3% (pre-2015) to 2.4% (post-2015). Reasons for CAUTI misclassifications included misapplication of the CAUTI definition, misapplication of general health care-associated infection definitions, and use of clinical judgement over the surveillance definition. CONCLUSIONS: CAUTI underreporting is a major concern; validations provide transparency, education, and relationship building to improve reporting accuracy.
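
        The accuracy measures named in the methods follow the standard 2x2 validation-table definitions, treating the validator determination as the reference standard; the Python sketch below uses hypothetical counts, not the pooled state validation data.

        # Hypothetical validation counts: validator determination vs. facility-reported status.
        tp = 83    # validator-confirmed CAUTI, reported by the facility
        fn = 11    # validator-confirmed CAUTI, not reported (underreported)
        fp = 6     # reported by the facility, not confirmed by the validator (overreported)
        tn = 900   # not a CAUTI and not reported

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        error_rate = (fp + fn) / (tp + fp + fn + tn)   # mismatches / total records reviewed
        print(sensitivity, specificity, ppv, npv, error_rate)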

      2. Francisella tularensis transmission by solid organ transplantation, 2017
        Nelson CA, Murua C, Jones JM, Mohler K, Zhang Y, Wiggins L, Kwit NA, Respicio-Kingry L, Kingry LC, Petersen JM, Brown J, Aslam S, Krafft M, Asad S, Dagher HN, Ham J, Medina-Garcia LH, Burns K, Kelley WE, Hinckley AF, Annambhotla P, Carifo K, Gonzalez A, Helsel E, Iser J, Johnson M, Fritz CL, Basavaraju SV.
        Emerg Infect Dis. 2019 Apr;25(4):767-775.
        In July 2017, fever and sepsis developed in 3 recipients of solid organs (1 heart and 2 kidneys) from a common donor in the United States; 1 of the kidney recipients died. Tularemia was suspected only after blood cultures from the surviving kidney recipient grew Francisella species. The organ donor, a middle-aged man from the southwestern United States, had been hospitalized for acute alcohol withdrawal syndrome, pneumonia, and multiorgan failure. F. tularensis subsp. tularensis (clade A2) was cultured from archived spleen tissue from the donor and blood from both kidney recipients. Whole-genome multilocus sequence typing indicated that the isolated strains were indistinguishable. The heart recipient remained seronegative with negative blood cultures but had been receiving antimicrobial drugs for a medical device infection before transplant. Two lagomorph carcasses collected near the donor’s residence were positive by PCR for F. tularensis subsp. tularensis (clade A2). This investigation documents F. tularensis transmission by solid organ transplantation.

      3. Evaluation of a water and hygiene project in health-care facilities in Siaya County, Kenya, 2016
        Davis W, Odhiambo A, Oremo J, Otieno R, Mwaki A, Rajasingham A, Kim S, Quick R.
        Am J Trop Med Hyg. 2019 Jul 22.

        [No abstract]

      4. We describe an outbreak of imipenemase metallo-beta-lactamase-producing organisms in a long-term-care facility (LTCF) amid a larger community outbreak of extended-spectrum beta-lactamase-producing organisms. Transmission was propagated by inadequate infection prevention practices. We provided infection prevention recommendations and education, facilitated colonization screening, and increased interfacility communication. This outbreak demonstrates the unmet need for infection prevention education in long-term-care facilities and the importance of prompt public health response to ensure appropriate identification, containment, and prevention of emerging resistance.

        Clostridioides difficile is a common pathogen that is well known to survive for extended periods of time on fecally contaminated environmental healthcare surfaces. During epidemiological investigations of healthcare-associated infections, it is important to be able to detect whether or not there are viable spores of C. difficile on surfaces. Current methods to detect C. difficile can take up to 7 days for culture, and in the case of detection by PCR, the viability of the spores cannot be ascertained. Prevention of C. difficile infection in healthcare settings includes adequate cleaning and disinfection of environmental surfaces, which increases the likelihood of detecting dead organisms from an environmental sample during an investigation. In this study, we were able to adapt a rapid-viability PCR (RV-PCR) method, first developed for detection of viable Bacillus anthracis spores, for the detection of viable C. difficile spores. RV-PCR uses the change in cycle threshold after incubation to confirm the presence of live organisms. Using this modified method, we were able to detect viable C. difficile after 22h of anaerobic incubation in Cycloserine Cefoxitin Fructose Broth (CCFB). This method also used bead beating combined with the Maxwell 16 Casework kit for DNA extraction and purification and a real-time duplex PCR assay for toxin B and cdd3 genes to confirm the identity of the C. difficile spores. Spiked environmental sponge-wipes with and without added organic load were tested to determine the limit of detection (LOD). The LOD from spiked environmental sponge-wipe samples was 10(4) spores/mL, but after incubation, initial spore levels of 10(1) spores/mL were detected. Use of this method would greatly decrease the amount of time required to detect viable C. difficile spores; incubation of samples is only required for germination (22h or less) instead of colony formation, which can take up to 7 days. In addition, PCR can then quickly confirm or rule out the identity of the organism at the same time it confirms viability. The presence of viable C. difficile spores could be detected at very low levels within 28h total, compared with the 2- to 10-day process that would be needed for culture, identification, and toxin detection.
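
        The RV-PCR decision logic described above can be sketched in a few lines: viable spores germinate and multiply during the broth incubation, so the cycle threshold (Ct) for the target drops between the pre- and post-incubation PCR runs. The cutoff and Ct values below are hypothetical placeholders, not the thresholds validated in the study.

        # Viability call based on the drop in cycle threshold (Ct) after incubation (Python).
        DELTA_CT_CUTOFF = 6.0   # hypothetical placeholder cutoff

        def rv_pcr_viable(ct_pre_incubation, ct_post_incubation):
            """Return True when the post-incubation Ct drop indicates growth of live organisms."""
            return (ct_pre_incubation - ct_post_incubation) >= DELTA_CT_CUTOFF

        print(rv_pcr_viable(35.2, 24.6))   # True: large Ct drop, consistent with viable spores
        print(rv_pcr_viable(34.8, 34.1))   # False: little change, consistent with residual DNA from dead organisms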

    • Immunity and Immunization
      1. Survey of adult influenza vaccination practices and perspectives among US primary care providers (2016-2017 influenza season)
        Cataldi JR, O’Leary ST, Lindley MC, Hurley LP, Allison MA, Brtnikova M, Beaty BL, Crane LA, Kempe A.
        J Gen Intern Med. 2019 Jul 19.
        BACKGROUND: Seasonal influenza vaccination is recommended for all adults; however, little is known about how primary care physicians can communicate effectively with patients about influenza vaccination. OBJECTIVE: To assess, among general internal medicine (GIM) and family physicians (FP), the following aspects of adult influenza vaccination: (1) recommendation and administration practices, (2) barriers to discussing vaccination and perceived reasons for patient refusal, and (3) factors associated with physician self-efficacy in convincing patients to be vaccinated. DESIGN: Email and mail survey conducted in February-March 2017. PARTICIPANTS: Nationally representative sample of GIM and FP. MAIN MEASURES: Factor analysis was used to group similar items for multivariable analysis of barriers and strategies associated with high physician self-efficacy about convincing patients to be vaccinated (defined as disagreeing that they could do nothing to change resistant patients’ minds). KEY RESULTS: The response rate was 67% (620/930). Ninety-eight percent always/almost always recommended influenza vaccine to adults >/= 65 years, 90% for adults 50-64 years, and 75% for adults 19-49 years. Standing orders (76%) and electronic alerts (64%) were the most commonly used practice-based immunization strategies. Frequently reported barriers to discussing vaccination were other health issues taking precedence (41%), time (29%), and feeling they were unlikely to change patients’ minds (24%). Fifty-eight percent of physicians reported high self-efficacy about convincing patients to be vaccinated; these providers reported fewer patient belief barriers contributing to vaccine refusal (RR = 0.93 per item; 95% CI (0.89-0.98); Cronbach’s alpha = 0.70), were more likely to report using both fact- (1.08/item; (1.03-1.14); 0.66) and personal experience-based (1.07/item; (1.003-1.15); 0.65) communication strategies, and were more likely to work in practices using patient reminders for influenza vaccine (1.32; (1.16-1.50)). CONCLUSIONS: Physicians identified barriers to successfully communicating about adult influenza vaccination but few effective strategies to counter them. Interventions to promote self-efficacy in communication and under-utilized practice-based immunization strategies are needed.

      2. Antibody responses against influenza B lineages among community-dwelling individuals 65 years of age or older having received trivalent inactivated influenza vaccine during two consecutive seasons in Thailand
        Chittaganpitch M, Puthavathana P, Praphasiri P, Waicharoen S, Shrestha M, Mott JA, Prasert K.
        Southeast Asian J Trop Med Public Health. 2019 ;50(3):500-513.
        Seasonal trivalent influenza vaccines (TIV) have been recommended since 2008 for people >/=65 years of age in Thailand. While two distinct antigenic lineages of influenza B virus, namely B/Yamagata and B/Victoria, often co-circulate in Thailand, TIV contains only one influenza B lineage. Little is known regarding the cross-protection offered by the current TIV against the heterologous influenza B lineage among older Thai persons. The kinetics, longevity, and cross-reactivity of antibody responses to both influenza B lineages were measured in 85 healthy Thai persons >/=65 years of age vaccinated with TIV containing B/Phuket/3073/2013 (Yamagata lineage) in the 2015-2016 season and then with TIV containing B/Brisbane/60/2008 (Victoria lineage) in the 2016-2017 season. Hemagglutination-inhibition assays were performed on blood specimens collected at five time intervals during the study period. Seroconversion rate, seroprotection rate, geometric mean titer (GMT), and GMT ratio peaked at one month after the first vaccination and declined over time. At one month after the second vaccination, antibody responses were shown not only for the homologous B/Brisbane virus but also for the heterologous B/Phuket virus. The study suggests the elderly could develop antibody responses to both influenza B lineages when primed with a TIV containing one B lineage and boosted with a TIV containing the other B lineage. Although more research is needed, in resource-limited countries where quadrivalent influenza vaccines are not available, repeated TIV vaccination may be beneficial in developing immunological protection against influenza B in people >/= 65 years of age.

      3. The Global Vaccine Action Plan – insights into its utility, application, and ways to strengthen future plans
        Daugherty MA, Hinman AR, Cochi SL, Garon JR, Rodewald LE, Nowak G, McKinlay MA, Mast EE, Orenstein WA.
        Vaccine. 2019 Jul 17.
        BACKGROUND: The pace of global progress must increase if the Global Vaccine Action Plan (GVAP) goals are to be achieved by 2020. We administered a two-phase survey to key immunization stakeholders to assess the utility and application of GVAP, including how it has impacted country immunization programs, and to find ways to strengthen the next 10-year plan. METHODS: For the Phase I survey, an online questionnaire was sent to global immunization stakeholders in summer 2017. The Phase II survey was sent to regional and national immunization stakeholders in summer 2018, including WHO Regional Advisors on Immunization, Expanded Programme on Immunization managers, and WHO and UNICEF country representatives from 20 countries. Countries were selected based on improvements (10) versus decreases (10) in DTP3 coverage from 2010 to 2016. RESULTS: Global immunization stakeholders (n=38) cite global progress in improving vaccine delivery (88%) and engaging civil society organizations as advocates for vaccines (83%). Among regional and national immunization stakeholders (n=58), 70% indicated reaching mobile and underserved populations with vaccination activities as a major challenge. The top ranked activities for helping country programs achieve progress toward GVAP goals include improved monitoring of vaccination coverage and upgrading disease surveillance systems. Most respondents (96%) indicated GVAP as useful for determining immunization priorities and 95% were supportive of a post-2020 GVAP strategy. CONCLUSIONS: Immunization stakeholders see GVAP as a useful tool, and there is cause for excitement as the global immunization community looks toward the next decade of vaccines. The next 10-year plan should attempt to increase political will, align immunization activities with other health system agendas, and address important issues like reaching mobile/migrant populations and improving data reporting systems.

      4. An approach for preparing and responding to adverse events following immunization reported after hepatitis B vaccine birth dose administration
        Gidudu JF, Shaum A, Habersaat K, Wilhelm E, Woodring J, Mast E, Zuber P, Amarasinghe A, Nelson N, Kabore H, Abad N, Tohme RA.
        Vaccine. 2019 Jul 20.
        The success of immunization programs in lowering the incidence of vaccine-preventable diseases (VPDs) has led to increased public attention on potential health risks associated with vaccines. As a result, a scientifically rigorous response to investigating reported adverse events following immunization (AEFI) and effective risk communication strategies are critical to ensure public confidence in immunization. Globally, an estimated 257 million people have chronic hepatitis B virus (HBV) infection, which causes more than 686,000 premature deaths from liver cancer and cirrhosis. Hepatitis B vaccination is the most effective way to prevent mother-to-child transmission of HBV infection, especially when a timely birth dose (HepB-BD) is given within 24h of birth. However, an infant’s risk of dying is highest in the neonatal period, and thus, administering HepB-BD within 24h of birth overlaps with the most fragile period in an infant’s life. A working group was formed in July 2016 following the publication of case reports describing the effects of media reports of infant deaths after HepB-BD administration on vaccination coverage in China and Vietnam. The goal of the working group was to create a framework and describe best practices for preparing for and responding to AEFI reported after HepB-BD administration, using existing resources. The framework includes six steps: three preparation steps and three response steps. This document is written for national and regional immunization program staff. Prior to using the framework for preparation and response to AEFIs reported after HepB-BD administration, staff members should be familiar with how AEFI are detected, reported, and investigated in the country. The document might also be of interest to national regulatory staff members who monitor vaccine safety within the country.

      5. Projected population benefit of increased effectiveness and coverage of influenza vaccination on influenza burden – United States
        Hughes MM, Reed C, Flannery B, Garg S, Singleton JA, Fry AM, Rolfes MA.
        Clin Infect Dis. 2019 Jul 25.
        BACKGROUND: Vaccination is the best way to prevent influenza; however, greater benefit could be achieved. To help guide research and policy agendas, we aimed to quantify the magnitude of influenza disease that would be prevented through targeted increases in vaccine effectiveness (VE) or coverage. METHODS: For three influenza seasons (2011-12, 2015-16, and 2017-18), we used a mathematical model to estimate the number of prevented influenza-associated illnesses, medically-attended illnesses, and hospitalizations across five age groups. Compared with estimates of prevented illness during each season, given observed VE and coverage, we explored the number of additional outcomes that would be prevented by a 5% absolute increase in VE or coverage, or by achieving 60% VE or 70% coverage. RESULTS: During the 2017-18 season, compared with the burden already prevented by influenza vaccination, a 5% absolute VE increase would prevent an additional 1,050,000 illnesses and 25,000 hospitalizations (76% among those aged >/=65 years), while achieving 60% VE would prevent an additional 190,000 hospitalizations. A 5% coverage increase would result in 785,000 fewer illnesses (56% among those aged 18-64 years) and 11,000 fewer hospitalizations; reaching 70% coverage would prevent an additional 39,000 hospitalizations. CONCLUSIONS: Small, attainable improvements in effectiveness or coverage of influenza vaccine could lead to substantial additional reductions in influenza burden in the U.S. Improvements in VE would have the greatest impact in reducing hospitalizations in adults aged >/=65 years, and coverage improvements would have the largest benefit in reducing illnesses in adults aged 18-49 years.
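
        As a rough illustration of how such counterfactuals are often computed (a simplified static approximation, not the authors' season- and age-specific model), one can back-calculate the burden expected with no vaccination from the observed burden, coverage, and VE, and then re-apply a higher VE or coverage. All inputs below are hypothetical placeholders.

        def additional_prevented(observed_burden, coverage, ve, new_coverage=None, new_ve=None):
            """Extra outcomes prevented when coverage and/or VE rise, under a simple static model."""
            new_coverage = coverage if new_coverage is None else new_coverage
            new_ve = ve if new_ve is None else new_ve
            burden_without_vaccination = observed_burden / (1 - coverage * ve)   # back-calculation
            burden_with_new_program = burden_without_vaccination * (1 - new_coverage * new_ve)
            return observed_burden - burden_with_new_program

        # Hypothetical inputs: 10 million observed illnesses, 45% coverage, 40% VE,
        # then a 5-percentage-point absolute increase in VE.
        print(round(additional_prevented(10_000_000, 0.45, 0.40, new_ve=0.45)))   # ~274,000 additional illnesses prevented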

      6. Effectiveness of monovalent rotavirus vaccine against hospitalization with acute rotavirus gastroenteritis in Kenyan children
        Khagayi S, Omore R, Otieno GP, Ogwel B, Ochieng JB, Juma J, Apondi E, Bigogo G, Onyango C, Ngama M, Njeru R, Owor BE, Mwanga MJ, Addo Y, Tabu C, Amwayi A, Mwenda JM, Tate JE, Parashar UD, Breiman RF, Nokes DJ, Verani JR.
        Clin Infect Dis. 2019 Jul 20.
        BACKGROUND: Rotavirus remains a leading cause of diarrheal illness and death among children worldwide. Data on rotavirus vaccine effectiveness in sub-Saharan Africa are limited. Kenya introduced monovalent rotavirus vaccine (RV1) in July 2014. We assessed RV1 effectiveness against rotavirus-associated hospitalization in Kenyan children. METHODS: Between July 2014 and December 2017, we conducted surveillance for acute gastroenteritis (AGE) in three hospitals across Kenya. We analysed data from children age-eligible for >/=1 RV1 dose, with stool tested for rotavirus and confirmed vaccination history. We compared RV1 coverage among those who tested rotavirus-positive (cases) versus rotavirus-negative (controls) using multivariable logistic regression; effectiveness was calculated as (1 - adjusted odds ratio for vaccination) x 100%. RESULTS: Among 677 eligible children, 110 (16%) were rotavirus-positive. Vaccination data were available for 91 (83%) cases; 51 (56%) had received 2 RV1 doses and 33 (36%) 0 doses. Among 567 controls, 418 (74%) had vaccination data; 308 (74%) had 2 doses and 69 (16%) 0 doses. Overall 2-dose effectiveness was 64% (95% confidence interval [CI]: 35-80%); it was 67% (95% CI: 30-84%) for children aged <12 months and 72% (95% CI: 10-91%) for children aged >/=12 months. Significant effectiveness was seen in children with normal weight-for-age (84% [95%CI: 62-93%]), length/height-for-age (75% [95%CI: 48-88%]) and weight-for-length/height (84% [95%CI: 64-93%]); however, no protection was found among underweight, stunted, or wasted children. CONCLUSIONS: RV1 in the routine Kenyan immunization program provides significant protection against rotavirus AGE hospitalization. Protection was sustained beyond infancy. Malnutrition appears to diminish vaccine effectiveness. Efforts to improve rotavirus vaccine uptake and nutritional status are important to maximize vaccine benefit.
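
        Using the effectiveness formula and the dose counts quoted above, the crude (unadjusted) two-dose estimate can be reproduced as follows; the 64% reported by the authors is adjusted through multivariable logistic regression, so it differs slightly from this crude figure.

        # Crude two-dose VE from the counts in the abstract (test-negative design, Python).
        cases_2dose, cases_0dose = 51, 33          # rotavirus-positive children
        controls_2dose, controls_0dose = 308, 69   # rotavirus-negative children

        crude_odds_ratio = (cases_2dose * controls_0dose) / (cases_0dose * controls_2dose)
        crude_ve = (1 - crude_odds_ratio) * 100
        print(f"{crude_ve:.0f}%")   # ~65%, close to the adjusted 64%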

      7. [No abstract]

    • Informatics
      1. Background: The history of large-scale technological advances, such as the digital revolution in our era, suggests that core technologies yield wide benefits by serving as a method of invention, spawning new tools and techniques that surpass the performance of their predecessors. Methods: Digital platforms provide a method of invention in the health sector by enabling innovations in data collection, use, and sharing. Although wide adoption of computerized information technology in healthcare has produced mixed results, the advent of mobile health (mHealth) creates new opportunities for device-mediated advances in surgical and public health practice. Conclusion: Mobile solutions for collecting, using, and sharing patient-generated health data after surgery can yield important benefits for post-operative monitoring, whether the data are used to evaluate and manage individual patients or track infections and other outcomes in patient populations.

    • Injury and Violence
      1. Fall-related traumatic brain injury in children ages 0-4 years
        Haarbauer-Krupa J, Haileyesus T, Gilchrist J, Mack KA, Law CS, Joseph A.
        J Safety Res. 2019 ;70:127-133.
        Introduction: Falls are the leading cause of traumatic brain injury (TBI) for children in the 0-4 year age group. There is limited literature pertaining to fall-related TBIs in children aged 4 years and under and the circumstances surrounding these TBIs. This study provides a national estimate and describes actions and products associated with fall-related TBI in this age group. Method: Data analyzed were from the 2001-2013 National Electronic Injury Surveillance System-All Injury Program (NEISS-AIP), a nationally representative sample of emergency departments (ED). Case narratives were coded for actions associated with the fall, and product codes were abstracted to determine fall location and product type. All estimates were weighted. Results: An estimated 139,001 children younger than 5 years were treated annually in EDs for nonfatal, unintentional fall-related TBIs (total = 1,807,019 during 2001-2013). Overall, child actions (e.g., running) accounted for the greatest proportion of injuries, and actions by others (e.g., being carried) accounted for the greatest proportion among children younger than 1 year. The majority of falls occurred in the home and involved surfaces, fixtures, furniture, and baby products. Conclusions: Fall-related TBI in young children represents a significant public health burden. The majority of children seen for TBI assessment in EDs were released home. Prevention efforts that target parent supervision practices and the home environment are indicated. Practical applications: Professionals in contact with parents of young children can remind them to establish a safe home and be attentive to the environment when carrying young children to prevent falls.
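
        The national figures above are weighted NEISS-AIP estimates; the annual average can be checked with simple arithmetic over the 13-year study period, as in the sketch below (the per-case sample weights themselves are not given in the abstract).

        # Annual average = weighted 13-year total / number of study years (Python).
        total_weighted_estimate = 1_807_019   # fall-related TBI ED visits, 2001-2013
        study_years = 13
        print(round(total_weighted_estimate / study_years))   # 139001 per year, matching the abstract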

      2. CDC’s guideline on pediatric mild traumatic brain injury: Recommendations for neurologists
        Weissman B, Joseph M, Gronseth G, Sarmiento K, Giza CC.
        Neurol Clin Pract. 2019 Jun;9(3):241-249.
        Purpose of review: In September 2018, the Centers for Disease Control and Prevention (CDC) published an evidence-based guideline on the diagnosis and management of mild traumatic brain injury (mTBI) among children. Recent findings: Based on a systematic review of the evidence covering research published over a 25-year span (1990-2015), the CDC Pediatric mTBI Guideline strives to optimize the care of pediatric patients with mTBI. The guideline was developed using a rigorous methodology developed by the American Academy of Neurology. Summary: Clinical practice recommendations in the CDC Pediatric mTBI Guideline can help guide neurologists in making critical diagnostic and management decisions and in implementing evidence-based strategies for the recovery of their young patients with this injury.

    • Laboratory Sciences
      1. Acquisition of cancer stem cell-like properties in human small airway epithelial cells after a long-term exposure to carbon nanomaterials
        Kiratipaiboon C, Stueckle TA, Ghosh R, Rojanasakul LW, Chen YC, Dinu CZ, Rojanasakul Y.
        Environmental Science: Nano. 2019 ;6(7):2152-2170.
        Cancer stem cells (CSCs) are a key driver of tumor formation and metastasis, but how they are affected by nanomaterials is largely unknown. The present study investigated the effects of different carbon-based nanomaterials (CNMs) on neoplastic and CSC-like transformation of human small airway epithelial cells and determined the underlying mechanisms. Using a physiologically relevant exposure model (long-term/low-dose) with system validation using a human carcinogen, asbestos, we demonstrated that single-walled carbon nanotubes, multi-walled carbon nanotubes, ultrafine carbon black, and crocidolite asbestos induced particle-specific anchorage-independent colony formation, DNA-strand breaks, and p53 downregulation, indicating the genotoxicity and carcinogenic potential of CNMs. The chronic CNM-exposed cells exhibited CSC-like properties as indicated by 3D spheroid formation, anoikis resistance, and CSC marker expression. Mechanistic studies revealed specific self-renewal and epithelial-mesenchymal transition (EMT)-related transcription factors that are involved in the cellular transformation process. Pathway analysis of gene signaling networks supports the role of SOX2 and SNAI1 signaling in CNM-mediated transformation. These findings support the potential carcinogenicity of high aspect ratio CNMs and identified molecular targets and signaling pathways that may contribute to disease development.

      2. Efficacy of a solar concentrator to inactivate E. coli and C. perfringens spores in latrine waste in Kenya
        Murphy JL, Ayers T, Foote A, Woods E, Wamola N, Fagerli K, Waiboci L, Mugoh R, Mintz ED, Zhao K, Marano N, O’Reilly CE, Hill VR.
        Sci Total Environ. 2019 Jul 2;691:401-406.
        Alternative sanitation options are needed for effective waste management in low-income countries where centralized, large-scale waste treatment is not easily achievable. A newly designed solar concentrator technology utilizes solar thermal energy to treat feces contained in drums. This pilot study assessed the efficacy of the new design to inactivate microbes in 13 treatment drums under field conditions in Kenya. Three-quarters of the drums contained <1000 E. coli/g of total solids following 6h of solar thermal treatment, and inactivation of thermotolerant C. perfringens spores ranged from <1.8 to >5.0 log10. Nearly all (94%) samples collected from treatment drums achieved thermophilic temperatures (>50 degrees C) during the treatment period; however, this alone did not ensure that samples met the WHO E. coli guideline; higher, sustained thermophilic temperatures tended to be more effective in reaching this guideline. The newly designed solar concentrator was capable of inactivating thermotolerant, environmentally-stable microorganisms as efficiently as, or possibly more efficiently than, a previous design. Additional data are needed to better characterize how temperature, time, and other parameters affect the ability of the solar concentrator to inactivate microbes in feces.
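
        The spore results are expressed as log10 reductions; the arithmetic is shown below with hypothetical spore counts (the abstract reports only the resulting range of <1.8 to >5.0 log10).

        import math

        # Log10 reduction = log10(initial count / surviving count); counts are hypothetical (Python).
        initial_spores_per_g = 2.0e6
        surviving_spores_per_g = 1.5e1
        log10_reduction = math.log10(initial_spores_per_g / surviving_spores_per_g)
        print(f"{log10_reduction:.1f} log10 inactivation")   # ~5.1 log10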

        We tested the hypothesis that the impact of the Fms-like tyrosine kinase 3-ligand (Flt3L; FL) on recombinant Vibrio cholerae ghost (rVCG) vaccine-induced chlamydial immunity is influenced by route of vaccine delivery. Female C57BL/6J mice were immunized rectally (IR) or intramuscularly (IM) with rVCG co-expressing the Chlamydia trachomatis PmpD and PorB proteins (rVCG-PmpD/PorB) with and without FL, or with glycoprotein D of HSV-2 (rVCG-gD2) as an antigen control. Vaccine evaluation was based on measurement of T cell proliferation, Th1/Th2 cytokine, and humoral responses at systemic and mucosal compartments, and protection against intravaginal challenge infection. Results revealed that high levels of CD4+ T cell-mediated and humoral immune responses were elicited in mice as a function of both IR and IM immunization. Unexpectedly, co-administration of vaccine with FL enhanced specific Th1-type cytokine levels and T cell proliferative responses following IR but not IM immunization. While administration of vaccine with FL enhanced the specific mucosal and systemic IgA antibody responses following both immunization routes, IgG2c responses were not enhanced following IR delivery. The vaccine-induced immune effectors protected mice against live heterologous C. muridarum infection irrespective of route of vaccine administration, with the regimen incorporating FL having a protective advantage. Further evaluation showed that protection afforded by the FL-adjuvanted vaccine was facilitated by CD4+ T cells, as indicated by reduction in the intensity and duration of genital chlamydial shedding by naive mice following adoptive transfer of immune CD4+ T cells. Taken together, the results indicate that comparable protective immunity, which is enhanced by co-delivery with FL, is elicited in the female genital tract against Chlamydia infection after mucosal and systemic administration, highlighting the ability of FL to function as an effective immunostimulator at both mucosal and systemic sites. The differential modulation of humoral and cellular immune responses and protective immunity afforded by the FL-adjuvanted vaccine following IR administration indicates that the immunomodulatory impact of FL on chlamydial-specific immunity is influenced by the route of vaccine administration. Thus, targeting of VCG-based vaccines to antigen presenting cells by co-delivery with FL is a feasible immunization approach for inducing effective chlamydial immunity in the female genital tract.

      4. Messenger RNA levels of the Polo-like kinase gene (PLK) correlate with cytokinesis in the Trypanosoma rangeli cell cycle
        Prestes EB, Stoco PH, de Moraes MH, Moura H, Grisard EC.
        Exp Parasitol. 2019 Jul 22:107727.
        BACKGROUND: Trypanosoma rangeli is a protozoan parasite that is non-virulent to the mammalian host and is morphologically and genomically related to Trypanosoma cruzi; whether T. rangeli proliferates within the mammalian host remains controversial. OBJECTIVES: We aimed to investigate the T. rangeli cell cycle in vitro and in vivo by characterizing the timespan of the parasite life cycle and by proposing a molecular marker to assess cytokinesis. METHODOLOGY: The morphological events and their timing during the cell cycle of T. rangeli epimastigotes were assessed using DNA staining, flagellum labelling and bromodeoxyuridine incorporation. Messenger RNA levels of four genes previously associated with the cell cycle of trypanosomatids (AUK1, PLK, MOB1 and TRACK) were evaluated in the different T. rangeli forms. FINDINGS: T. rangeli epimastigotes completed the cell cycle in vitro in 20.8h. PLK emerged as a potential molecular marker for cell division, as its mRNA levels were significantly increased in exponentially growing epimastigotes compared with growth-arrested parasites or in vitro-differentiated trypomastigotes. PLK expression in T. rangeli can be detected near the flagellum protrusion site, reinforcing its role in the cell cycle. Interestingly, T. rangeli bloodstream trypomastigotes exhibited very low mRNA levels of PLK and were almost entirely composed of parasites in G1 phase. MAIN CONCLUSIONS: Our work is the first to describe the T. rangeli cell cycle in vitro and proposes that PLK mRNA levels could be a useful tool to investigate the ability of T. rangeli to proliferate within the mammalian host bloodstream.

      5. Stable Occupancy of the Crimean-Congo Hemorrhagic Fever Virus-Encoded Deubiquitinase Blocks Viral Infection
        Scholte FE, Hua BL, Spengler JR, Dzimianski JV, Coleman-McCray JD, Welch SR, McMullan LK, Nichol ST, Pegan SD, Spiropoulou CF, Bergeron E.
        MBio. 2019 Jul 23;10(4).
        Crimean-Congo hemorrhagic fever virus (CCHFV) infection can result in a severe hemorrhagic syndrome for which there are no antiviral interventions available to date. Certain RNA viruses, such as CCHFV, encode cysteine proteases of the ovarian tumor (OTU) family that antagonize interferon (IFN) production by deconjugating ubiquitin (Ub). The OTU of CCHFV, a negative-strand RNA virus, is dispensable for replication of the viral genome, despite being part of the large viral RNA polymerase. Here, we show that mutations that prevent binding of the OTU to cellular ubiquitin are required for the generation of recombinant CCHFV containing a mutated catalytic cysteine. Similarly, the high-affinity binding of a synthetic ubiquitin variant (UbV-CC4) to CCHFV OTU strongly inhibits viral growth. UbV-CC4 inhibits CCHFV infection even in the absence of intact IFN signaling, suggesting that its antiviral activity is not due to blocking the OTU’s immunosuppressive function. Instead, the prolonged occupancy of the OTU with UbV-CC4 directly targets viral replication by interfering with CCHFV RNA synthesis. Together, our data provide mechanistic details supporting the development of antivirals targeting viral OTUs. IMPORTANCE: Crimean-Congo hemorrhagic fever virus is an important human pathogen with a wide global distribution for which no therapeutic interventions are available. CCHFV encodes a cysteine protease belonging to the ovarian tumor (OTU) family which is involved in host immune suppression. Here we demonstrate that artificially prolonged binding of the OTU to a substrate inhibits virus infection. This provides novel insights into CCHFV OTU function during the viral replicative cycle and highlights the OTU as a potential antiviral target.

      6. Identification of key hemagglutinin residues responsible for cleavage, acid stability, and virulence of fifth-wave highly pathogenic avian influenza A(H7N9) viruses
        Sun X, Belser JA, Yang H, Pulit-Penaloza JA, Pappas C, Brock N, Zeng H, Creager HM, Stevens J, Maines TR.
        Virology. 2019 Jul 12;535:232-240.
        We previously demonstrated that, despite no increase in airborne transmissibility compared to low pathogenic avian influenza viruses, select human isolates of highly pathogenic avian influenza A(H7N9) virus exhibit greater virulence in animal models and a lower threshold pH for fusion. In the current study, we utilized both in vitro and in vivo approaches to identify key residues responsible for hemagglutinin (HA) intracellular cleavage, acid stability, and virulence in mice. We found that the four-amino-acid insertion (-KRTA-) at the HA cleavage site of A/Taiwan/1/2017 virus is essential for HA intracellular cleavage and contributes to disease in mice. Furthermore, a lysine-to-glutamic acid mutation at position HA2-64 increased the threshold pH for HA activation and reduced virus stability and replication in mice. Identification of a key residue responsible for enhanced acid stability of A(H7N9) viruses is of great significance for future surveillance activities and improvements in vaccine stability.

      7. Conjugal transfer, whole-genome sequencing, and plasmid analysis of four mcr-1-bearing isolates from U.S. patients
        Zhu W, Lawsin A, Lindsey RL, Batra D, Knipe K, Yoo BB, Perry KA, Rowe LA, Lonsway D, Walters MS, Rasheed JK, Halpin AL.
        Antimicrob Agents Chemother. 2019 Apr;63(4).
        Four Enterobacteriaceae clinical isolates bearing mcr-1 gene-harboring plasmids were characterized. All isolates demonstrated the ability to transfer colistin resistance to Escherichia coli; plasmids were stable in conjugants after multiple passages on nonselective media. mcr-1 was located on an IncX4 (n = 3) or IncN (n = 1) plasmid. The IncN plasmid harbored 13 additional antimicrobial resistance genes. Results indicate that the mcr-1-bearing plasmids in this study were highly transferable in vitro and stable in the recipients.

    • Maternal and Child Health
      1. Exome sequencing of family trios from the National Birth Defects Prevention Study: Tapping into a rich resource of genetic and environmental data
        Jenkins MM, Almli LM, Pangilinan F, Chong JX, Blue EE, Shapira SK, White J, McGoldrick D, Smith JD, Mullikin JC, Bean CJ, Nembhard WN, Lou XY, Shaw GM, Romitti PA, Keppler-Noreuil K, Yazdy MM, Kay DM, Carter TC, Olshan AF, Moore KJ, Nascone-Yoder N, Finnell RH, Lupo PJ, Feldkamp ML, Nickerson DA, Bamshad MJ, Brody LC, Reefhuis J.
        Birth Defects Res. 2019 Jul 21.
        BACKGROUND: The National Birth Defects Prevention Study (NBDPS) is a multisite, population-based, case-control study of genetic and nongenetic risk factors for major structural birth defects. Eligible women had a pregnancy affected by a birth defect or a liveborn child without a birth defect between 1997 and 2011. They were invited to complete a telephone interview to collect pregnancy exposure data and were mailed buccal cell collection kits to collect specimens from themselves, their child (if living), and their child’s father. Over 23,000 families representing more than 30 major structural birth defects provided DNA specimens. METHODS: To evaluate their utility for exome sequencing (ES), specimens from 20 children with colonic atresia were studied. Evaluations were conducted on specimens collected using cytobrushes stored and transported in open versus closed packaging, on native genomic DNA (gDNA) versus whole genome amplified (WGA) products and on a library preparation protocol adapted to low amounts of DNA. RESULTS: The DNA extracted from brushes in open packaging yielded higher quality sequence data than DNA from brushes in closed packaging. Quality metrics of sequenced gDNA were consistently higher than metrics from corresponding WGA products and were consistently high when using a low input protocol. CONCLUSIONS: This proof-of-principle study established conditions under which ES can be applied to NBDPS specimens. Successful sequencing of exomes from well-characterized NBDPS families indicated that this unique collection can be used to investigate the roles of genetic variation and gene-environment interaction effects in birth defect etiologies, providing a valuable resource for birth defect researchers.

      2. Geospatial analysis for reproductive, maternal, newborn, child and adolescent health: gaps and opportunities
        Matthews Z, Rawlins B, Duong J, Molla YB, Moran AC, Singh K, Serbanescu F, Tatem AJ, Nilsen K.
        BMJ Glob Health. 2019 ;4(Suppl 5):e001702.

        [No abstract]

      3. Best practices in availability, management and use of geospatial data to guide reproductive, maternal, child and adolescent health programmes
        Molla YB, Nilsen K, Singh K, Ruktanonchai CW, Schmitz MM, Duong J, Serbanescu F, Moran AC, Matthews Z, Tatem AJ.
        BMJ Glob Health. 2019 ;4(Suppl 5):e001406.

        [No abstract]

      4. The PRogram In Support of Moms (PRISM): study protocol for a cluster randomized controlled trial of two active interventions addressing perinatal depression in obstetric settings
        Moore Simas TA, Brenckle L, Sankaran P, Masters GA, Person S, Weinreb L, Ko JY, Robbins CL, Allison J, Byatt N.
        BMC Pregnancy Childbirth. 2019 Jul 22;19(1):256.
        BACKGROUND: Perinatal depression, the most common pregnancy complication, is associated with negative maternal-offspring outcomes. Despite existence of effective treatments, it is under-recognized and under-treated. Professional organizations recommend universal screening, yet multi-level barriers exist to ensuring effective diagnosis, treatment, and follow-up. Integrating mental health and obstetric care holds significant promise for addressing perinatal depression. The overall study goal is to compare the effectiveness of two active interventions: (1) the Massachusetts Child Psychiatry Access Program (MCPAP) for Moms, a state-wide, population-based program, and (2) the PRogram In Support of Moms (PRISM) which includes MCPAP for Moms plus a proactive, multifaceted, practice-level intervention with intensive implementation support. METHODS: This study is conducted in two phases: (1) a run-in phase which has been completed and involved practice and patient participant recruitment to demonstrate feasibility for the second phase, and (2) a cluster randomized controlled trial (RCT), which is ongoing, and will compare two active interventions 1:1 with ten Ob/Gyn practices as the unit of randomization. In phase 1, rates of depressive symptoms and other demographic and clinical features among patients were examined to inform practice randomization. Patient participants to be recruited in phase 2 will be followed longitudinally until 13 months postpartum; they will have 3-5 total study visits depending on whether their initial recruitment and interview was at 4-24 or 32-40 weeks gestation, or 1-3 months postpartum. Sampling throughout pregnancy and postpartum will ensure participants with different depressive symptom onset times. Differences in depression symptomatology and treatment participation will be compared between patient participants by intervention arm. DISCUSSION: This manuscript describes the full two-phase study protocol. The study design is innovative because it combines effectiveness with implementation research designs and integrates critical components of participatory action research. Our approach assesses the feasibility, acceptance, efficacy, and sustainability of integrating a stepped-care approach to perinatal depression care into ambulatory obstetric settings; an approach that is flexible and can be tailored and adapted to fit unique workflows of real-world practices. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT02760004, registered prospectively on May 3, 2016.

      5. Survival of infants with spina bifida and the role of maternal prepregnancy body mass index
        Pace ND, Siega-Riz AM, Olshan AF, Chescheir NC, Cole SR, Desrosiers TA, Tinker SC, Hoyt AT, Canfield MA, Carmichael SL, Meyer RE.
        Birth Defects Res. 2019 Jul 19.
        OBJECTIVE: To investigate first-year survival of infants born with spina bifida, and examine the association of maternal prepregnancy body mass index (BMI) with infant mortality. METHODS: This is a retrospective cohort study of 1,533 liveborn infants with nonsyndromic spina bifida with estimated dates of delivery from 1998 to 2011 whose mothers were eligible for the National Birth Defects Prevention Study (NBDPS). NBDPS data were linked to death records to conduct survival analyses. Kaplan-Meier survival functions estimated mortality risk over the first year of life. Cox proportional hazards models estimated hazard ratios (HRs) for maternal prepregnancy BMI categorized as underweight (<18.5), normal (18.5-24.9), overweight (25-29.9), and obese (>/=30). RESULTS: Infant mortality risk among infants with spina bifida was 4.4% [3.52, 5.60%]. Infants with multiple co-occurring defects, very preterm delivery, multiple gestation, high-level spina bifida lesions, or non-Hispanic Black mothers had an elevated risk of infant mortality. Maternal prepregnancy underweight and obesity were associated with higher infant mortality (15.7% [7.20, 32.30%] and 5.82% [3.60, 9.35%], respectively). Adjusted HR estimates showed underweight and obese mothers had greater hazard of infant mortality compared to normal weight mothers (HR: 4.5 [1.08, 16.72] and 2.6 [1.36, 8.02], respectively). CONCLUSION: The overall risk of infant mortality for infants born with spina bifida was lower than most previously reported estimates. Infants born with spina bifida to mothers who were underweight or obese prepregnancy were at higher risk of infant mortality. This study provides additional evidence of the importance of healthy maternal weight prior to pregnancy.
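
        For readers who want to reproduce this kind of analysis, the sketch below illustrates the general approach named in the abstract (a Kaplan-Meier survival estimate and a Cox proportional hazards model with prepregnancy BMI categories). It is not the authors' code; the file and column names are hypothetical, and it assumes the Python lifelines library.

          # Illustrative sketch of the survival methods named above (not the NBDPS analysis code).
          # Hypothetical columns: days_survived (0-365), died (1 = infant death, 0 = censored at one year),
          # bmi_category (underweight, normal, overweight, obese).
          import pandas as pd
          from lifelines import KaplanMeierFitter, CoxPHFitter

          df = pd.read_csv("spina_bifida_cohort.csv")  # hypothetical input file

          # Kaplan-Meier estimate of survival over the first year of life
          km = KaplanMeierFitter()
          km.fit(df["days_survived"], event_observed=df["died"])
          print("Estimated survival at 365 days:", km.predict(365))

          # Cox proportional hazards model with 'normal' BMI as the reference category
          df["bmi_category"] = pd.Categorical(
              df["bmi_category"], categories=["normal", "underweight", "overweight", "obese"]
          )
          design = pd.get_dummies(
              df[["days_survived", "died", "bmi_category"]],
              columns=["bmi_category"], drop_first=True, dtype=float,
          )
          cph = CoxPHFitter()
          cph.fit(design, duration_col="days_survived", event_col="died")
          cph.print_summary()  # hazard ratios and 95% CIs by BMI category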

    • Military Medicine and Health
      1. Oligodendrocyte involvement in Gulf War Illness
        Belgrad J, Dutta DJ, Bromley-Coolidge S, Kelly KA, Michalovicz LT, Sullivan KA, O’Callaghan JP, Fields RD.
        Glia. 2019 Jul 24.
        Low level sarin nerve gas and other anti-cholinesterase agents have been implicated in Gulf War illness (GWI), a chronic multi-symptom disorder characterized by cognitive, pain and fatigue symptoms that continues to afflict roughly 32% of veterans from the 1990-1991 Gulf War. How disrupting cholinergic synaptic transmission could produce chronic illness is unclear, but recent research indicates that acetylcholine also mediates communication between axons and oligodendrocytes. Here we investigated the hypothesis that oligodendrocyte development is disrupted by Gulf War agents, by experiments using the sarin-surrogate acetylcholinesterase inhibitor, diisopropyl fluorophosphate (DFP). The effects of corticosterone, which is used in some GWI animal models, were also investigated. The data show that DFP decreased both the number of mature and dividing oligodendrocytes in the rat prefrontal cortex (PFC), but differences were found between PFC and corpus callosum. The differences seen between the PFC and corpus callosum likely reflect the higher percentage of proliferating oligodendroglia in the adult PFC. In cell culture, DFP also decreased oligodendrocyte survival through a non-cholinergic mechanism. Corticosterone promoted maturation of oligodendrocytes, and when used in combination with DFP it had protective effects by increasing the pool of mature oligodendrocytes and decreasing proliferation. Cell culture studies indicate direct effects of both DFP and corticosterone on OPCs, and by comparison with in vivo results, we conclude that in addition to direct effects, systemic effects and interruption of neuron-glia interactions contribute to the detrimental effects of GW agents on oligodendrocytes. Our results demonstrate that oligodendrocytes are an important component of the pathophysiology of GWI.

    • Nutritional Sciences
      1. Nutrient content of squeeze pouch foods for infants and toddlers sold in the United States in 2015
        Beauregard JL, Bates M, Cogswell ME, Nelson JM, Hamner HC.
        Nutrients. 2019 Jul 23;11(7).
        BACKGROUND: To describe the availability and nutrient composition of U.S. commercially available squeeze pouch infant and toddler foods in 2015. MATERIALS AND METHODS: Data were from information presented on nutrition labels for 703 ready-to-serve, pureed food products from 24 major U.S. infant and toddler food brands. We described nutritional components (e.g., calories, fat) and compared them between packaging types (squeeze pouch versus other packaging types) within food categories. RESULTS: 397 (56%) of the analyzed food products were packaged as squeeze pouches. Differences in 13 nutritional components between squeeze pouch versus other packaging types were generally small and varied by food category. Squeeze pouches in the fruits and vegetables, fruit-based, and vegetable-based categories were more likely to contain added sugars than other package types. CONCLUSION: In 2015, squeeze pouches were prevalent in the U.S. commercial infant and toddler food market. Nutrient composition differed between squeeze pouches and other packaging types for some macro- and micronutrients. Although it is recommended that infants and toddlers under two years old not consume any added sugars, a specific area of concern may be the inclusion of sources of added sugar in squeeze pouches. Linking this information with children’s dietary intake would facilitate understanding how these differences affect overall diet quality.

    • Occupational Safety and Health
      1. Association of occupational stress with waking, diurnal, and bedtime cortisol response in police officers
        Allison P, Mnatsakanova A, Fekedulegn DB, Violanti JM, Charles LE, Hartley TA, Andrew ME, Miller DB.
        Am J Hum Biol. 2019 Jul 22:e23296.
        OBJECTIVE: Police officers have higher rates of cardiovascular disease (CVD) morbidity and mortality than the U.S. general population. Officers are exposed to conventional and unexpected workplace stressors. The hypothalamic-pituitary-adrenal (HPA) axis plays a major role responding to stressor exposure by releasing cortisol. Prolonged release or excessive levels may result in disease. Our study investigated cross-sectional associations between self-reported work stress and various salivary cortisol parameters. METHODS: A total of 285 police officers (76.5% male) from the Buffalo Cardio-Metabolic Occupational Police Stress (BCOPS) Study (2004-2009) completed the Spielberger Police Stress Survey, reporting frequency and severity of work events during the past month and year to calculate stress indices. Officers provided saliva samples to measure levels of cortisol secretion. Linear regression assessed associations between stress indices and various cortisol parameters, adjusted for age, gender, race/ethnicity, abdominal height, and smoking status. RESULTS: Significant positive associations were observed between stress indices (overall stress, physical danger stress, and past-month lack of support) and diurnal cortisol (AUCg: total area under the curve). Administrative, overall, and physical danger stress in the past year were significantly associated with the diurnal slope. Overall, administrative, and physical danger stress were significantly associated with bedtime levels. There were no significant associations between the stress indices and the awakening cortisol parameters. CONCLUSIONS: Higher stress ratings were related to blunted diurnal decline in cortisol, suggesting conventional and unexpected police stressors may result in HPA axis dysfunction. Future studies investigating possible associations between elevated cortisol and subclinical CVD are needed.
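
        As a rough illustration of the cortisol metrics and regression adjustment described above, the sketch below computes a diurnal AUCg (total area under the curve) by the trapezoidal rule and regresses it on a stress index with the covariates listed in the abstract. It is not the BCOPS analysis code; the file and column names are hypothetical.

          # Illustrative sketch only (not the BCOPS analysis code); column names are hypothetical.
          import numpy as np
          import pandas as pd
          import statsmodels.formula.api as smf

          samples = pd.read_csv("saliva_samples.csv")   # officer_id, hours_since_waking, cortisol_nmol_l
          officers = pd.read_csv("officers.csv")        # officer_id, stress_index, age, gender, race, abdominal_height, smoker

          def aucg(group):
              """Total area under one officer's diurnal cortisol curve (trapezoidal rule)."""
              g = group.sort_values("hours_since_waking")
              return np.trapz(g["cortisol_nmol_l"], g["hours_since_waking"])

          auc = samples.groupby("officer_id").apply(aucg).rename("aucg").reset_index()
          data = officers.merge(auc, on="officer_id")

          # Linear regression of diurnal cortisol output on the stress index, adjusted for
          # age, gender, race/ethnicity, abdominal height, and smoking status.
          model = smf.ols(
              "aucg ~ stress_index + age + C(gender) + C(race) + abdominal_height + C(smoker)",
              data=data,
          ).fit()
          print(model.summary())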

      2. Assessing work-related risk factors for musculoskeletal knee disorders in construction roofing tasks
        Breloff SP, Dutta A, Dai F, Sinsel EW, Warren CM, Ning X, Wu JZ.
        Appl Ergon. 2019;81.
        Roofers often suffer from musculoskeletal disorders (MSDs) to their knees due to spending a large amount of time kneeling while performing work-related roofing activities on sloped rooftops. Several ergonomic studies have identified kneeling as a potential risk factor for knee injuries and disorders. Existing biomechanical models and sensor technologies used to assess work-related risk factors for different construction trades are not applicable in roof work settings especially on slanted rooftop surfaces. This work assesses the impacts of work-related factors, namely working posture and roof slope, on the potential risk of developing knee MSDs due to residential roofing tasks in a laboratory setting. Nine human subjects participated in the experiment and mimicked shingle installation on a slope-configurable wooden platform. Maximum angles of right and left knee flexion, abduction, adduction, and axial rotation (internal and external) were measured as risk indicators using a motion capture system under different roof slope settings. The results demonstrated that roof slope, working posture and their interaction may have significant impacts on developing knee MSDs during roofing activities. Knees are likely to be exposed to increased risk of MSDs due to working in a dynamic kneeling posture during shingle installation. In our study, flexion in both knees and adduction in the right knee were found lower in high-pitched rooftops; however, abduction in the left knee and internal rotation in the right knee were found higher during shingle installation. Hence proper attention is needed for these situations. This study provides useful information about the impact of roof work settings on knee MSDs development, which may facilitate effective interventions such as education, training, and tools to prevent knee injuries in construction roofing tasks.

      3. Potential occupational and respiratory hazards in a Minnesota cannabis cultivation and processing facility
        Couch JR, Grimes GR, Wiegand DM, Green BJ, Glassford EK, Zwack LM, Lemons AR, Jackson SR, Beezhold DH.
        Am J Ind Med. 2019 Jul 23.
        BACKGROUND: Cannabis has been legalized in some form for much of the United States. The National Institute for Occupational Safety and Health (NIOSH) received a health hazard evaluation request from a Minnesota cannabis facility and their union to undertake an evaluation. METHODS: NIOSH representatives visited the facility in August 2016 and April 2017. Surface wipe samples were collected for analysis of delta-9 tetrahydrocannabinol (Delta9-THC), delta-9 tetrahydrocannabinol acid (Delta9-THCA), cannabidiol, and cannabinol. Environmental air samples were collected for volatile organic compounds (VOCs), endotoxins (limulus amebocyte lysate assay), and fungal diversity (NIOSH two-stage BC251 bioaerosol sampler with internal transcribed spacer region sequencing analysis). RESULTS: Surface wipe samples identified Delta9-THC throughout the facility. Diacetyl and 2,3-pentanedione were measured in initial VOC screening and subsequent sampling during tasks where heat transference was greatest, though levels were well below the NIOSH recommended exposure limits. Endotoxin concentrations were highest during processing activities, while internal transcribed spacer region sequencing revealed that the Basidiomycota genus, Wallemia, had the highest relative abundance. CONCLUSIONS: To the authors’ knowledge, this is the first published report of potential diacetyl and 2,3-pentanedione exposure in the cannabis industry, most notably during cannabis decarboxylation. Endotoxin exposure was elevated during grinding, indicating that this is a potentially high-risk task. The findings indicate that potential health hazards of significance are present during cannabis processing, and employers should be aware of potential exposures to VOCs, endotoxin, and fungi. Further research into the degree of respiratory and dermal hazards and resulting health effects in this industry is recommended.

    • Parasitic Diseases
      1. Multiplex serology demonstrate cumulative prevalence and spatial distribution of malaria in Ethiopia
        Assefa A, Ali Ahmed A, Deressa W, Sime H, Mohammed H, Kebede A, Solomon H, Teka H, Gurrala K, Matei B, Wakeman B, Wilson GG, Sinha I, Maude RJ, Ashton R, Cook J, Shi YP, Drakeley C, von Seidlein L, Rogier E, Hwang J.
        Malar J. 2019 Jul 22;18(1):246.
        BACKGROUND: Measures of malaria burden using microscopy and rapid diagnostic tests (RDTs) in cross-sectional household surveys may incompletely describe the burden of malaria in low-transmission settings. This study describes the pattern of malaria transmission in Ethiopia using serological antibody estimates derived from a nationwide household survey completed in 2015. METHODS: Dried blood spot (DBS) samples were collected during the Ethiopian Malaria Indicator Survey in 2015 from malarious areas across Ethiopia. Samples were analysed using bead-based multiplex assays for IgG antibodies for six Plasmodium antigens: four human malaria species-specific merozoite surface protein-1 19kD antigens (MSP-1) and Apical Membrane Antigen-1 (AMA-1) for Plasmodium falciparum and Plasmodium vivax. Seroprevalence was estimated by age, elevation and region. The seroconversion rate was estimated using a reversible catalytic model fitted with maximum likelihood methods. RESULTS: Of the 10,278 DBS samples available, 93.6% (9622/10,278) had valid serological results. The mean age of participants was 15.8 years and 53.3% were female. National seroprevalence for antibodies to P. falciparum was 32.1% (95% confidence interval (CI) 29.8-34.4) and 25.0% (95% CI 22.7-27.3) to P. vivax. Estimated seroprevalences for Plasmodium malariae and Plasmodium ovale were 8.6% (95% CI 7.6-9.7) and 3.1% (95% CI 2.5-3.8), respectively. For P. falciparum seroprevalence estimates were significantly higher at lower elevations (< 2000 m) compared to higher (2000-2500 m) (aOR 4.4; p < 0.01). Among regions, P. falciparum seroprevalence ranged from 11.0% (95% CI 8.8-13.7) in Somali to 65.0% (95% CI 58.0-71.4) in Gambela Region and for P. vivax from 4.0% (95% CI 2.6-6.2) in Somali to 36.7% (95% CI 30.0-44.1) in Amhara Region. Models fitted to measure seroconversion rates showed variation nationally and by elevation, region, antigen type, and within species. CONCLUSION: Using multiplex serology assays, this study explored the cumulative malaria burden and regional dynamics of the four human malarias in Ethiopia. High malaria burden was observed in the northwest compared to the east. High transmission in the Gambela and Benishangul-Gumuz Regions and the neglected presence of P. malariae and P. ovale may require programmatic attention. The use of a multiplex assay for antibody detection in low transmission settings has the potential to act as a more sensitive biomarker.
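
        The reversible catalytic model mentioned in the abstract has a standard closed form: the probability of being seropositive at age a is SCR/(SCR+SRR) x (1 - exp(-(SCR+SRR) x a)), where SCR is the seroconversion rate and SRR the seroreversion rate. The sketch below fits that standard form by maximum likelihood to hypothetical age-aggregated data; it is not the authors' implementation.

          # Minimal maximum-likelihood fit of a reversible catalytic seroconversion model
          # (standard form; not the authors' implementation). All data below are hypothetical.
          import numpy as np
          from scipy.optimize import minimize
          from scipy.stats import binom

          age = np.array([2, 7, 12, 20, 35, 55], dtype=float)   # age-band midpoints (years)
          n_tested = np.array([900, 1100, 1000, 1400, 1300, 700])
          n_pos = np.array([60, 180, 250, 460, 520, 310])       # seropositive for one antigen

          def seroprev(a, scr, srr):
              """Expected seroprevalence at age a under the reversible catalytic model."""
              return scr / (scr + srr) * (1.0 - np.exp(-(scr + srr) * a))

          def neg_log_lik(log_params):
              scr, srr = np.exp(log_params)                     # log scale keeps both rates positive
              return -np.sum(binom.logpmf(n_pos, n_tested, seroprev(age, scr, srr)))

          fit = minimize(neg_log_lik, x0=np.log([0.05, 0.01]), method="Nelder-Mead")
          scr_hat, srr_hat = np.exp(fit.x)
          print(f"Seroconversion rate: {scr_hat:.3f}/yr; seroreversion rate: {srr_hat:.3f}/yr")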

      2. Evaluating effectiveness of mass and continuous long-lasting insecticidal net distributions over time in Madagascar: A sentinel surveillance based epidemiological study
        Girond F, Madec Y, Kesteman T, Randrianarivelojosia M, Randremanana R, Randriamampionona L, Randrianasolo L, Ratsitorahina M, Herbreteau V, Hedje J, Rogier C, Piola P.
        EClinicalMedicine. 2018 Jul;1:62-69.
        Background: The reduction of global malaria burden over the past 15 years is largely attributed to the expansion of mass distribution campaigns (MDCs) of long-lasting insecticidal nets (LLIN). In Madagascar, two LLIN MDCs were implemented and one district also benefited from a community-based continuous distribution (CB-CD). Malaria incidence dropped but eventually rebounded after a decade. Methods: Data from a sentinel surveillance network over the 2009-2015 period were analyzed. Alerts were defined as the weekly number of malaria cases exceeding the 90th percentile value for three consecutive weeks. Statistical analyses assessed the temporal relationship between LLIN MDCs and (i) number of malaria cases and (ii) malaria alerts detected, and (iii) the effect of a combination of MDCs and a CB-CD in Toamasina District. Findings: Analyses showed an increase of 13.6 points and 21.4 points in the percentile value of weekly malaria cases during the second and the third year following the MDC of LLINs, respectively. The percentage of alert-free sentinel sites was 98.2% during the first year after LLIN MDC, 56.7% during the second year and 31.5% during the third year. The number of weekly malaria cases decreased by 14% during the CB-CD in Toamasina District. In contrast, sites without continuous distribution had a 12% increase of malaria cases. Interpretation: These findings support the malaria-preventive effectiveness of MDCs in Madagascar but highlight their limited duration when not followed by continuous distribution. The resulting policy implications are crucial to sustain reductions in malaria burden in high transmission settings.
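
        The alert rule used here (a weekly case count above the 90th percentile value for three consecutive weeks) can be expressed compactly in code. The sketch below is only an illustration of that rule, not the surveillance system's software; it assumes the percentile threshold is taken from a historical baseline series, and the example counts are hypothetical.

          # Illustration of the alert rule described above (not the surveillance system's code).
          import pandas as pd

          def detect_alerts(weekly_cases: pd.Series, baseline: pd.Series, window: int = 3) -> pd.Series:
              """Flag the week that completes `window` consecutive weeks above the baseline 90th percentile."""
              threshold = baseline.quantile(0.90)
              above = (weekly_cases > threshold).astype(int)
              return above.rolling(window).sum() == window

          # Hypothetical weekly counts for one sentinel site:
          baseline = pd.Series([10, 14, 9, 22, 18, 12, 25, 16, 11, 20, 15, 13])  # historical weeks
          current = pd.Series([12, 19, 30, 34, 38, 17, 11])                      # recent weeks
          print(detect_alerts(current, baseline))   # True at index 4, the third consecutive week above threshold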

      3. Modelling the relationship between malaria prevalence as a measure of transmission and mortality across age groups
        Khagayi S, Desai M, Amek N, Were V, Onyango ED, Odero C, Otieno K, Bigogo G, Munga S, Odhiambo F, Hamel MJ, Kariuki S, Samuels AM, Slutsker L, Gimnig J, Vounatsou P.
        Malar J. 2019 Jul 23;18(1):247.
        BACKGROUND: Parasite prevalence has been used widely as a measure of malaria transmission, especially in malaria endemic areas. However, its contribution and relationship to malaria mortality across different age groups has not been well investigated. Previous studies in a health and demographic surveillance systems (HDSS) platform in western Kenya quantified the contribution of incidence and entomological inoculation rates (EIR) to mortality. The study assessed the relationship between outcomes of malaria parasitaemia surveys and mortality across age groups. METHODS: Parasitological data from annual cross-sectional surveys from the Kisumu HDSS between 2007 and 2015 were used to determine malaria parasite prevalence (PP) and clinical malaria (parasites plus reported fever within 24 h or temperature above 37.5 degrees C). Household surveys and verbal autopsy (VA) were used to obtain data on all-cause and malaria-specific mortality. Bayesian negative binomial geo-statistical regression models were used to investigate the association of PP/clinical malaria with mortality across different age groups. Estimates based on yearly data were compared with those from aggregated data over 4 to 5-year periods, which is the typical period that mortality data are available from national demographic and health surveys. RESULTS: Using 5-year aggregated data, associations were established between parasite prevalence and malaria-specific mortality in the whole population (RRmalaria = 1.66; 95% Bayesian Credible Intervals: 1.07-2.54) and children 1-4 years (RRmalaria = 2.29; 1.17-4.29). While clinical malaria was associated with both all-cause and malaria-specific mortality in combined ages (RRall-cause = 1.32; 1.01-1.74); (RRmalaria = 2.50; 1.27-4.81), children 1-4 years (RRall-cause = 1.89; 1.00-3.51); (RRmalaria = 3.37; 1.23-8.93) and in older children 5-14 years (RRall-cause = 3.94; 1.34-11.10); (RRmalaria = 7.56; 1.20-39.54), no association was found among neonates, adults (15-59 years) and the elderly (60+ years). Distance to health facilities, socioeconomic status, elevation and survey year were important factors for all-cause and malaria-specific mortality. CONCLUSION: Malaria parasitaemia from cross-sectional surveys was associated with mortality across age groups over 4 to 5 year periods with clinical malaria more strongly associated with mortality than parasite prevalence. This effect was stronger in children 5-14 years compared to other age-groups. Further analyses of data from other HDSS sites or similar platforms would be useful in investigating the relationship between malaria and mortality across different endemicity levels.
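
        The core regression in this study relates mortality counts to parasite prevalence with person-time as the exposure. The deliberately simplified sketch below (non-Bayesian and non-spatial) shows that relation with a negative binomial GLM; the published analysis used Bayesian geo-statistical negative binomial models, and all file and column names here are hypothetical.

          # Deliberately simplified, non-Bayesian, non-spatial sketch of the core relation
          # (the published analysis used Bayesian negative binomial geo-statistical models).
          # File and column names are hypothetical.
          import numpy as np
          import pandas as pd
          import statsmodels.api as sm
          import statsmodels.formula.api as smf

          d = pd.read_csv("hdss_clusters.csv")
          # columns: deaths, person_years, parasite_prev, dist_to_facility, ses, elevation, year

          model = smf.glm(
              "deaths ~ parasite_prev + dist_to_facility + ses + elevation + C(year)",
              data=d,
              family=sm.families.NegativeBinomial(),
              offset=np.log(d["person_years"]),   # person-years as exposure offset
          ).fit()
          print(np.exp(model.params))   # rate ratios, e.g. RR per unit increase in parasite prevalence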

      4. A search for snail-related answers to explain differences in response of Schistosoma mansoni to praziquantel treatment among responding and persistent hotspot villages along the Kenyan Shore of Lake Victoria
        Mutuku MW, Laidemitt MR, Beechler BR, Mwangi IN, Otiato FO, Agola EL, Ochanda H, Kamel B, Mkoji GM, Steinauer ML, Loker ES.
        Am J Trop Med Hyg. 2019 Jul;101(1):65-77.
        Following a 4-year annual praziquantel (PZQ) treatment campaign, the resulting prevalence of Schistosoma mansoni was seen to differ among individual villages along the Kenyan shore of Lake Victoria. We have investigated possible inherent differences in snail-related aspects of transmission among such 10 villages, including six persistent hotspot (PHS) villages (</= 30% reduction in prevalence following repeated treatments) located along the west-facing shore of the lake and four PZQ-responding (RESP) villages (> 30% prevalence reduction following repeated treatment) along the Winam Gulf. When taking into account all sampling sites, times, and water hyacinth presence/absence, shoreline-associated Biomphalaria sudanica from PHS and RESP villages did not differ in relative abundance or prevalence of S. mansoni infection. Water hyacinth intrusions were associated with increased B. sudanica abundance. The deeper water snail Biomphalaria choanomphala was significantly more abundant in the PHS villages, and prevalence of S. mansoni among villages both before and after control was positively correlated with B. choanomphala abundance. Worm recoveries from sentinel mice did not differ between PHS and RESP villages, and abundance of non-schistosome trematode species was not associated with S. mansoni abundance. Biomphalaria choanomphala provides an alternative, deepwater mode of transmission that may favor greater persistence of S. mansoni in PHS villages. As we found evidence for ongoing S. mansoni transmission in all 10 villages, we conclude that conditions conducive for transmission and reinfection occur ubiquitously. This argues for an integrated, basin-wide plan for schistosomiasis control to counteract rapid reinfections facilitated by large snail populations and movements of infected people around the lake.

      5. Lack of evidence for Toxocara infection in Italian myelitis patients
        Nicoletti A, Garcia HH, Cicero CE, Portaro G, Giuliano L, Patti F, Sofia V, Noh J, Handali S, Zappia M.
        Neurol Sci. 2019 Jul 22.
        Acute myelitis is a common neurological manifestation due to different causes, but in about 15-30% of cases its etiology remains unknown (idiopathic myelitis). Myelitis represents the most common manifestation of neurotoxocariasis, the infection of the human nervous system by larvae of the nematode Toxocara spp.; however, despite the high seroprevalence worldwide, its contribution to the burden of disease has not been assessed. We evaluated the presence of antibodies against Toxocara spp. in cerebrospinal fluid (CSF) from a sample of 28 patients with a diagnosis of idiopathic myelitis (N = 20) or encephalomyelitis (N = 8) who attended the Neurological Unit of the University Hospital of Catania, Sicily. Antibodies against Toxocara spp. were measured using a multiplex bead-based assay and Toxocara immunoblot using Toxocara canis excretory secretory antigens. All samples tested negative for the presence of anti-T. canis IgG antibodies. In this series, we found no evidence of a contribution of neurotoxocariasis to the burden of myelitis.

      6. The safety of double- and triple-drug community mass drug administration for lymphatic filariasis: A multicenter, open-label, cluster-randomized study
        Weil GJ, Bogus J, Christian M, Dubray C, Djuardi Y, Fischer PU, Goss CW, Hardy M, Jambulingam P, King CL, Kuttiat VS, Krishnamoorthy K, Laman M, Lemoine JF, O’Brian KK, Robinson LJ, Samuela J, Schechtman KB, Sircar A, Srividya A, Steer AC, Supali T, Subramanian S.
        PLoS Med. 2019 Jun;16(6):e1002839.
        BACKGROUND: The Global Programme to Eliminate Lymphatic Filariasis (GPELF) provides antifilarial medications to hundreds of millions of people annually to treat filarial infections and prevent elephantiasis. Recent trials have shown that a single-dose, triple-drug treatment (ivermectin with diethylcarbamazine and albendazole [IDA]) is superior to a two-drug combination (diethylcarbamazine plus albendazole [DA]) that is widely used in LF elimination programs. This study was performed to assess the safety of IDA and DA in a variety of endemic settings. METHODS AND FINDINGS: Large community studies were conducted in five countries between October 2016 and November 2017. Two studies were performed in areas with no prior mass drug administration (MDA) for filariasis (Papua New Guinea and Indonesia), and three studies were performed in areas with persistent LF despite extensive prior MDA (India, Haiti, and Fiji). Participants were treated with a single oral dose of IDA (ivermectin, 200 μg/kg; diethylcarbamazine, 6 mg/kg; plus albendazole, a fixed dose of 400 mg) or with DA alone. Treatment assignment in each study site was randomized by locality of residence. Treatment was offered to residents who were >/=5 years of age and not pregnant. Adverse events (AEs) were assessed by medical teams with active follow-up for 2 days and passive follow-up for an additional 5 days. A total of 26,836 persons were enrolled (13,535 females and 13,300 males). A total of 12,280 participants were treated with DA, and 14,556 were treated with IDA. On day 1 or 2 after treatment, 97.4% of participants were assessed for AEs. The frequency of all AEs was similar after IDA and DA treatment (12% versus 12.1%, adjusted odds ratio for IDA versus DA 1.15, 95% CI 0.87-1.52, P = 0.316); 10.9% of participants experienced mild (grade 1) AEs, 1% experienced moderate (grade 2) AEs, and 0.1% experienced severe (grade 3) AEs. Rates of serious AEs after DA and IDA treatment were 0.04% (95% CI 0.01%-0.1%) and 0.01% (95% CI 0.00%-0.04%), respectively. Severity of AEs was not significantly different after IDA or DA. Five of six serious AEs reported occurred after DA treatment. The most common AEs reported were headache, dizziness, abdominal pain, fever, nausea, and fatigue. AE frequencies varied by country and were higher in adults and in females. AEs were more common in study participants with microfilaremia (33.4% versus 11.1%, P < 0.001) and more common in microfilaremic participants after IDA than after DA (39.4% versus 25.6%, P < 0.001). However, there was no excess of severe or serious AEs after IDA in this subgroup. The main limitation of the study was that it was open-label. Also, aggregation of AE data from multiple study sites tends to obscure variability among study sites. CONCLUSIONS: In this study, we observed that IDA was well tolerated in LF-endemic populations. Posttreatment AE rates and severity did not differ significantly after IDA or DA treatment. Thus, results of this study suggest that IDA should be as safe as DA for use as an MDA regimen for LF elimination in areas that currently receive DA. TRIAL REGISTRATION: Clinicaltrials.gov registration number: NCT02899936.

    • Public Health Leadership and Management
      1. Mitigating ethical risks in public-private partnerships in public health
        Yassanye DM, Anason AP, Barrett DH.
        J Public Health Manag Pract. 2019.
        Context: Partnerships between the public and private sectors are necessary in public health and health care. Each partner provides skills, resources, and capabilities. When the public sector, including government, enters into a partnership with a nongovernmental or corporate entity, it is important to determine in advance whether there are real or perceived ethical, financial, or programmatic risks to the organization that might need mitigation. Program: This article describes how the Centers for Disease Control and Prevention has approached assessing ethical considerations of public-private partnerships, especially those involving monetary or in-kind gifts. Implementation: There are practices that can be applied no matter the size or structure of the organization that can lead to transparency and accountability for a potential partnership. Discussion: Examples in this article include a list of practical considerations to review before entering into a new partnership, as well as illustrative anecdotes.

    • Zoonotic and Vectorborne Diseases
      1. Distinguishing patients with laboratory-confirmed chikungunya from dengue and other acute febrile illnesses, Puerto Rico, 2012-2015
        Alvarado LI, Lorenzi OD, Torres-Velasquez BC, Sharp TM, Vargas L, Munoz-Jordan JL, Hunsperger EA, Perez-Padilla J, Rivera A, Gonzalez-Zeno GE, Galloway RL, Glass Elrod M, Mathis DL, Oberste MS, Nix WA, Henderson E, McQuiston J, Singleton J, Kato C, Garcia-Gubern C, Santiago-Rivera W, Muns-Sosa R, Ortiz-Rivera JD, Jimenez G, Rivera-Amill V, Andujar-Perez DA, Horiuchi K, Tomashek KM.
        PLoS Negl Trop Dis. 2019 Jul;13(7):e0007562.
        Chikungunya, a mosquito-borne viral, acute febrile illness (AFI) is associated with polyarthralgia and polyarthritis. Differentiation from other AFI is difficult due to the non-specific presentation and limited availability of diagnostics. This 3-year study identified independent clinical predictors by day post-illness onset (DPO) at presentation and age-group that distinguish chikungunya cases from two groups: other AFI and dengue. Specimens collected from participants with fever </=7 days were tested for chikungunya, dengue viruses 1-4, and 20 other pathogens. Of 8,996 participants, 18.2% had chikungunya, and 10.8% had dengue. Chikungunya cases were more likely than other groups to be older, report a chronic condition, and present <3 DPO. Regardless of timing of presentation, significant positive predictors for chikungunya versus other AFI were: joint pain, muscle, bone or back pain, skin rash, and red conjunctiva; with dengue as the comparator, red swollen joints (arthritis), joint pain, skin rash, any bleeding, and irritability were predictors. Chikungunya cases were less likely than AFI and dengue to present with thrombocytopenia, signs of poor circulation, diarrhea, headache, and cough. Among participants presenting <3 DPO, predictors for chikungunya versus other AFI included: joint pain, skin rash, and muscle, bone or back pain, and absence of thrombocytopenia, poor circulation and respiratory or gastrointestinal symptoms; when the comparator was dengue, joint pain and arthritis, and absence of thrombocytopenia, leukopenia, and nausea were early predictors. Among all groups presenting 3-5 DPO, pruritic skin became a predictor for chikungunya, joint, muscle, bone or back pain were no longer predictive, while arthritis became predictive in all age-groups. Absence of thrombocytopenia was a significant predictor regardless of DPO or comparison group. This study identified robust clinical indicators such as joint pain, skin rash and absence of thrombocytopenia that can allow early identification of and accurate differentiation between patients with chikungunya and other common causes of AFI.

      2. Seroprevalence, risk factors, and rodent reservoirs of leptospirosis in an urban community of Puerto Rico, 2015
        Briskin EA, Casanovas-Massana A, Ryff KR, Morales-Estrada S, Hamond C, Perez-Rodriguez NM, Benavidez KM, Weinberger DM, Castro-Arellano I, Wunder EA, Sharp TM, Rivera-Garcia B, Ko AI.
        J Infect Dis. 2019 Jul 2.
        BACKGROUND: The burden of leptospirosis in Puerto Rico remains unclear due to underreporting. METHODS: A cross-sectional survey and rodent trapping was performed in a community within San Juan, Puerto Rico to determine the seroprevalence and risk factors for Leptospira infection. The microscopic agglutination test was used to detect anti-Leptospira antibodies as a marker of previous infection. We evaluated Leptospira carriage by qPCR among rodents trapped at the community site. RESULTS: Of 202 study participants, 55 (27.2%) had Leptospira agglutinating antibodies. Among the 55 seropositive individuals, antibodies were directed most frequently against serogroups Icterohaemorrhagiae (22.0%) and Autumnalis (10.6%). Of 18 captured rodents, 11 (61.1%) carried pathogenic Leptospira (L. borgpetersenii, 7 and L. interrogans, 2). Four participants showed their highest titer against an isolate obtained from a rodent (serogroup Ballum). Increasing household distance to the canal that runs through the community was associated with decreased risk of infection (OR = 0.934 per 10m increase, 95% CI: 0.952-0.992). CONCLUSION: There are high levels of Leptospira exposure in an urban setting in Puerto Rico, for which rodents may be an important reservoir for transmission. Our findings indicate that prevention should focus on mitigating risk posed by infrastructure deficiencies such as the canal.
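
        As an illustration of how the distance-to-canal association above could be estimated, the sketch below fits a logistic regression and reports the odds ratio per 10 m increase with its 95% CI. It is not the study's analysis code; the file and column names are hypothetical.

          # Illustrative sketch only (not the study's analysis code); column names are hypothetical.
          import numpy as np
          import pandas as pd
          import statsmodels.formula.api as smf

          df = pd.read_csv("leptospirosis_survey.csv")     # seropositive (0/1), distance_to_canal_m, ...
          df["distance_10m"] = df["distance_to_canal_m"] / 10.0   # rescale so the OR is per 10 m

          fit = smf.logit("seropositive ~ distance_10m", data=df).fit()
          or_per_10m = np.exp(fit.params["distance_10m"])
          ci_low, ci_high = np.exp(fit.conf_int().loc["distance_10m"])
          print(f"OR per 10 m increase in distance to the canal: "
                f"{or_per_10m:.3f} (95% CI {ci_low:.3f}-{ci_high:.3f})")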


CDC Science Clips Production Staff

  • John Iskander, MD MPH, Editor
  • Gail Bang, MLIS, Librarian
  • Kathy Tucker, Librarian
  • William (Bill) Thomas, MLIS, Librarian
  • Onnalee Gomez, MS, Health Scientist
  • Jarvis Sims, MIT, MLIS, Librarian

____

DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinion, findings and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC Websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.

Page last reviewed: August 9, 2019, 12:00 AM