

CDC Science Clips: Volume 9, Issue 41, October 17, 2017

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention Score that tracks social and mainstream media mentions.

  1. CDC Public Health Grand Rounds
    • Maternal and Child Health – Global Prevention of Neural Tube Defects
      1. A 2015 global update on folic acid-preventable spina bifida and anencephaly
        Arth A, Kancherla V, Pachon H, Zimmerman S, Johnson Q, Oakley GP.
        Birth Defects Res A Clin Mol Teratol. 2016 Jul;106(7):520-9.
        BACKGROUND: Spina bifida and anencephaly are two major neural tube defects. They contribute substantially to perinatal, neonatal, infant, and under-five mortality and life-long disability. To monitor progress toward the total prevention of folic acid-preventable spina bifida and anencephaly (FAP SBA), we examined their global status in 2015. METHODS: Based on existing data, we modeled the proportion of FAP SBA prevented in 2015 through mandatory folic acid fortification globally. We included only those countries with mandatory fortification that added at least 1.0 ppm folic acid as a fortificant to wheat and maize flour and had complete information on coverage. Our model assumed that mandatory folic acid fortification at 200 µg/day is fully protective against FAP SBA and reduces the rate of spina bifida and anencephaly to a minimum of 0.5 per 1000 births. RESULTS: Our estimates show that, in 2015, 13.2% (35,500 of approximately 268,700 global cases) of FAP SBA were prevented in 58 countries through mandatory folic acid fortification of wheat and maize flour. Most countries in Europe, Africa, and Asia were not implementing mandatory fortification with folic acid. CONCLUSION: Knowledge that folic acid prevents spina bifida and anencephaly has existed for 25 years, yet only a small fraction of FAP SBA is being prevented worldwide. Several countries still have 5- to 20-fold epidemics of FAP SBA. Implementation of mandatory fortification with folic acid offers governments a proven and rapid way to prevent FAP SBA-associated disability and mortality, and to help achieve health-related Sustainable Development Goals.
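As a sanity check, the headline prevention proportion follows directly from the two case counts quoted in the abstract; a back-of-envelope sketch (all figures taken from the text above):

```python
# Back-of-envelope check of the prevention proportion reported in the abstract.
prevented_cases = 35_500      # FAP SBA cases prevented via fortification, 2015
global_cases = 268_700        # approximate global FAP SBA cases, 2015

proportion = prevented_cases / global_cases
print(f"{proportion:.1%}")    # prints 13.2%, matching the reported figure
```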

      2. Population red blood cell folate concentrations for prevention of neural tube defects: Bayesian model
        Crider KS, Devine O, Hao L, Dowling NF, Li S, Molloy AM, Li Z, Zhu J, Berry RJ.
        BMJ. 2014 Jul 29;349:g4554.
        OBJECTIVE: To determine an optimal population red blood cell (RBC) folate concentration for the prevention of neural tube birth defects. DESIGN: Bayesian model. SETTING: Data from two population based studies in China. PARTICIPANTS: 247,831 participants in a prospective community intervention project in China (1993-95) to prevent neural tube defects with 400 µg/day folic acid supplementation, and 1194 participants in a population based randomized trial (2003-05) to evaluate the effect of folic acid supplementation on blood folate concentration among Chinese women of reproductive age. INTERVENTION: Folic acid supplementation (400 µg/day). MAIN OUTCOME MEASURES: Estimated RBC folate concentration at the time of neural tube closure (day 28 of gestation) and risk of neural tube defects. RESULTS: Risk of neural tube defects was high at the lowest estimated RBC folate concentrations (for example, 25.4 (95% uncertainty interval 20.8 to 31.2) neural tube defects per 10,000 births at 500 nmol/L) and decreased as estimated RBC folate concentration increased. Risk of neural tube defects was substantially attenuated at estimated RBC folate concentrations above about 1000 nmol/L (for example, 6 neural tube defects per 10,000 births at 1180 (1050 to 1340) nmol/L). The modeled dose-response relation was consistent with the existing literature. In addition, neural tube defect risk estimates developed using the proposed model and population level RBC information were consistent with the prevalence of neural tube defects in the US population before and after food fortification with folic acid. CONCLUSIONS: A threshold for “optimal” population RBC folate concentration for the prevention of neural tube defects could be defined (for example, approximately 1000 nmol/L). Population based RBC folate concentrations, as a biomarker for risk of neural tube defects, can be used to facilitate evaluation of prevention programs as well as to identify subpopulations at elevated risk of a neural tube defect affected pregnancy due to folate insufficiency.

      3. Iron, zinc, folate, and vitamin B-12 status increased among women and children in Yaounde and Douala, Cameroon, 1 year after introducing fortified wheat flour
        Engle-Stone R, Nankap M, Ndjebayi AO, Allen LH, Shahab-Ferdows S, Hampel D, Killilea DW, Gimou MM, Houghton LA, Friedman A, Tarini A, Stamm RA, Brown KH.
        J Nutr. 2017 Jul;147(7):1426-1436.
        Background: Few data are available on the effectiveness of large-scale food fortification programs. Objective: We assessed the impact of mandatory wheat flour fortification on micronutrient status in Yaounde and Douala, Cameroon. Methods: We conducted representative surveys 2 y before and 1 y after the introduction of fortified wheat flour. In each survey, 10 households were selected within each of the same 30 clusters (n = approximately 300 households). Indicators of inflammation, malaria, anemia, and micronutrient status [plasma ferritin, soluble transferrin receptor (sTfR), zinc, folate, and vitamin B-12] were assessed among women aged 15-49 y and children 12-59 mo of age. Results: Wheat flour was consumed in the past 7 d by ≥90% of participants. Postfortification, mean total iron and zinc concentrations of flour samples were 46.2 and 73.6 mg/kg (target added amounts were 60 and 95 mg/kg, respectively). Maternal anemia prevalence was significantly lower postfortification (46.7% compared with 39.1%; adjusted P = 0.01), but mean hemoglobin concentrations and child anemia prevalence did not differ. For both women and children, mean plasma concentrations postfortification were greater for ferritin and lower for sTfR after adjustment for potential confounders. Mean plasma zinc concentrations were greater postfortification, and the prevalence of low plasma zinc concentration in women was lower after fortification (21%) than before (39%, P < 0.001); likewise in children, the prevalence postfortification (28%) was lower than prefortification (47%, P < 0.001). Mean plasma total folate concentrations were approximately 250% greater postfortification among women (47 compared with 15 nmol/L) and children (56 compared with 20 nmol/L), and the prevalence of low plasma folate values was <1% after fortification in both population subgroups. In a nonrepresentative subset of plasma samples, folic acid was detected in 77% of women (73% of those fasting) and 93% of children. Mean plasma and breast-milk vitamin B-12 concentrations were >50% greater postfortification. Conclusion: Although the pre-post survey design limits causal inference, iron, zinc, folate, and vitamin B-12 status increased among women and children in urban Cameroon after mandatory wheat flour fortification.

      4. Retrospective assessment of cost savings from prevention: Folic acid fortification and spina bifida in the U.S.
        Grosse SD, Berry RJ, Mick Tilford J, Kucik JE, Waitzman NJ.
        Am J Prev Med. 2016 May;50(5 Suppl 1):S74-80.
        INTRODUCTION: Although fortification of food with folic acid has been calculated to be cost saving in the U.S., updated estimates are needed. This analysis calculates new estimates from the societal perspective of net cost savings per year associated with mandatory folic acid fortification of enriched cereal grain products in the U.S. that was implemented during 1997-1998. METHODS: Estimates of annual numbers of live-born spina bifida cases in 1995-1996 relative to 1999-2011 based on birth defects surveillance data were combined during 2015 with published estimates of the present value of lifetime direct costs updated in 2014 U.S. dollars for a live-born infant with spina bifida to estimate avoided direct costs and net cost savings. RESULTS: The fortification mandate is estimated to have reduced the annual number of U.S. live-born spina bifida cases by 767, with a lower-bound estimate of 614. The present value of mean direct lifetime cost per infant with spina bifida is estimated to be $791,900, or $577,000 excluding caregiving costs. Using a best estimate of numbers of avoided live-born spina bifida cases, fortification is estimated to reduce the present value of total direct costs for each year’s birth cohort by $603 million more than the cost of fortification. A lower-bound estimate of cost savings using conservative assumptions, including the upper-bound estimate of fortification cost, is $299 million. CONCLUSIONS: The estimates of cost savings are larger than previously reported, even using conservative assumptions. The analysis can also inform assessments of folic acid fortification in other countries.
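The net-savings figure can be reproduced from the abstract's own estimates by multiplying avoided cases by the cost per case. A minimal sketch (the implied annual fortification cost is derived by subtraction and is not stated in the abstract):

```python
# Reproduce the cost-savings arithmetic from the abstract.
cases_avoided = 767        # best estimate of avoided live-born spina bifida cases per year
cost_per_case = 791_900    # present value of mean direct lifetime cost per infant (2014 US$)

avoided_direct_costs = cases_avoided * cost_per_case
print(f"avoided direct costs: ${avoided_direct_costs:,}")  # prints $607,387,300

# The reported net savings of $603 million imply a fortification cost of
# roughly $4.4 million per year (a derived figure, not stated in the abstract).
implied_fortification_cost = avoided_direct_costs - 603_000_000
print(f"implied fortification cost: ${implied_fortification_cost:,}")
```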

      5. Components of Successful Staple Food Fortification Programs: Lessons From Latin America
        Martorell R, de Romana DL.
        Food Nutr Bull. 2017 Sep;38(3):384-404.
        BACKGROUND: There are few effectiveness evaluations of food fortification programs, and little is known about what makes programs successful. OBJECTIVE: We examined 3 food fortification programs in Latin America to identify common features that might explain their success and to draw lessons for program design and implementation everywhere: the vitamin A fortification of sugar in Guatemala, with impact on the vitamin A status of the population; the fortification of a basket of foods with iron and other micronutrients in Costa Rica, with impact on iron status and anemia in women and children; and the fortification of wheat flour with folic acid in Chile, which reduced the incidence of neural tube defects. METHODS: We identified pertinent literature about these preselected programs and asked regional experts for any additional information. We also conducted structured interviews of key informants to provide historical and contextual information. RESULTS: Institutional research capacity and champions of fortification are features of successful programs in Latin America. We also found that private/public partnerships (industry, government, academia, and civil society) might be key for sustainability. To achieve impact, program managers need to use fortification vehicles that are consumed by the nutritionally vulnerable and to add bioavailable fortificants at adequate content levels in order to fill dietary gaps and reduce micronutrient deficiencies. Adequate monitoring and quality control are essential. CONCLUSIONS: For future programs, we recommend that the evaluation be specified up-front, including a baseline/endline design and data collection along the program impact pathway to inform needed improvements and to strengthen causal inferences.

      6. Large-scale wheat flour folic acid fortification program increases plasma folate levels among women of reproductive age in urban Tanzania
        Noor RA, Abioye AI, Ulenga N, Msham S, Kaishozi G, Gunaratna NS, Mwiru R, Smith E, Dhillon CN, Spiegelman D, Fawzi W.
        PLoS One. 2017 ;12(8):e0182099.
        There is a widespread vitamin and mineral deficiency problem in Tanzania, with known deficiencies of at least vitamin A, iron, folate, and zinc, resulting in lasting negative consequences, especially for maternal health, cognitive development, and thus the nation's economic potential. Folate deficiency is associated with significant adverse health effects among women of reproductive age, including a higher risk of neural tube defects. Several countries, including Tanzania, have implemented mandatory fortification of wheat and maize flour, but evidence on the effectiveness of these programs in developing countries remains limited. We evaluated the effectiveness of Tanzania's food fortification program by examining folate levels among women of reproductive age, 18-49 years. A prospective cohort study enrolled 600 non-pregnant women concurrent with the initiation of food fortification and followed them for 1 year thereafter. Blood samples, dietary intake, and fortified-food consumption data were collected at baseline and at 6 and 12 months. Plasma folate levels were determined using a competitive assay with folate binding protein. Using univariate and multivariate linear regression, we compared the change in plasma folate levels at 6 and 12 months of the program from baseline. We also assessed the relative risk of folate deficiency during follow-up using log-binomial regression. The mean (±SE) pre-fortification plasma folate level was 5.44 (±2.30) ng/ml at baseline. These levels improved significantly at 6 months [difference: 4.57 ng/ml (±2.89)] and 12 months [difference: 4.27 ng/ml (±4.18)]. Based on a plasma folate cut-off of 4 ng/ml, the prevalence of folate deficiency was 26.9% at baseline and 5% at 12 months. A 1 ng/ml increase in plasma folate from baseline was associated with a 25% decreased risk of folate deficiency at 12 months (RR = 0.75; 95% CI: 0.67-0.85, P < 0.001). In a setting where folate deficiency is high, a food fortification program with folic acid resulted in significant improvements in folate status among women of reproductive age.

      7. Folate deficiency is prevalent in women of childbearing age in Belize and is negatively affected by coexisting vitamin B-12 deficiency: Belize National Micronutrient Survey 2011
        Rosenthal J, Largaespada N, Bailey LB, Cannon M, Alverson CJ, Ortiz D, Kauwell GP, Sniezek J, Figueroa R, Daly R, Allen P.
        J Nutr. 2017 Jun;147(6):1183-1193.
        Background: Folate deficiency, vitamin B-12 deficiency, and anemia can have adverse effects on birth outcomes. Also, low vitamin B-12 reduces the formation of metabolically active folate. Objectives: We sought to establish the baseline prevalence of and factors associated with folate deficiency and insufficiency, vitamin B-12 deficiency, and anemia among women of childbearing age (WCBA) in Belize. Methods: In 2011, a national probability-based survey was completed among Belizean nonpregnant WCBA aged 15-49 y. Blood samples for determination of hemoglobin, folate (RBC and serum), and vitamin B-12 (plasma), along with sociodemographic and health information, were collected from 937 women. RBC and serum folate concentrations were measured by microbiologic assay (MBA). Folate status was defined based on both the WHO-recommended radioprotein-binding assay and the assay adjusted for the MBA. Results: The national prevalence estimates for folate deficiency in WCBA, based on serum and RBC folate concentrations by using the assay-matched cutoffs, were 11.0% (95% CI: 8.6%, 14.0%) and 35.1% (95% CI: 31.3%, 39.2%), respectively. By using the assay-matched compared with the WHO-recommended cutoffs, a substantially higher prevalence of folate deficiency was observed based on serum (6.9% absolute difference) and RBC folate (28.9% absolute difference) concentrations. The prevalence of RBC folate insufficiency was 48.9% (95% CI: 44.8%, 53.1%). Prevalence estimates for vitamin B-12 deficiency, marginal deficiency, and anemia were 17.2% (95% CI: 14.2%, 20.6%), 33.2% (95% CI: 29.6%, 37.1%), and 22.7% (95% CI: 19.5%, 26.2%), respectively. The adjusted geometric means of the RBC folate concentration increased significantly (P-trend < 0.001) in WCBA who had normal vitamin B-12 status relative to WCBA who were vitamin B-12 deficient. Conclusions: In Belize, the prevalence of folate and vitamin B-12 deficiencies continues to be a public health concern among WCBA. Furthermore, low folate status co-occurred with low vitamin B-12 status, underlining the importance of providing adequate vitamin B-12 and folic acid intake through approaches such as mandatory food fortification.

      8. BACKGROUND: Red blood cell (RBC) folate concentrations are a potential biomarker of folate-sensitive neural tube defect (NTD) risk in the population. The purpose of this analysis was to describe women in the U.S. population with RBC folate concentrations below those associated with optimal NTD prevention. METHODS: We used data from the 2007 to 2012 National Health and Nutrition Examination Survey (NHANES) to assess the RBC folate status of U.S. women of childbearing age relative to NTD risk categories based on RBC folate concentrations. We defined suboptimal RBC folate concentrations as those associated with a prevalence of ≥9 NTDs per 10,000 live births. RESULTS: Among nonpregnant women aged 12 to 49 years, 22.8% (95% confidence interval: 21.1, 24.6) had suboptimal RBC folate concentrations. Women had greater odds of having a suboptimal RBC folate concentration if they did not use dietary supplements containing folic acid; had mandatorily fortified enriched cereal grain products as their only source of folic acid; were non-Hispanic black or Hispanic; or were current smokers. CONCLUSION: Based on RBC folate concentrations, we would predict that the majority of U.S. women of reproductive age are not at increased risk for folate-sensitive NTDs in the presence of mandatory folic acid fortification. Prevention policies and programs can be aimed at population subgroups identified as having higher predicted risk for folate-sensitive NTDs based on RBC folate concentrations.

      9. Assessing the association between the methylenetetrahydrofolate reductase (MTHFR) 677C>T polymorphism and blood folate concentrations: a systematic review and meta-analysis of trials and observational studies
        Tsang BL, Devine OJ, Cordero AM, Marchetta CM, Mulinare J, Mersereau P, Guo J, Qi YP, Berry RJ, Rosenthal J, Crider KS, Hamner HC.
        Am J Clin Nutr. 2015 Jun;101(6):1286-94.
        BACKGROUND: The methylenetetrahydrofolate reductase (MTHFR) 677C>T polymorphism is a risk factor for neural tube defects. The T allele produces an enzyme with reduced folate-processing capacity, which has been associated with lower blood folate concentrations. OBJECTIVE: We assessed the association between MTHFR C677T genotypes and blood folate concentrations among healthy women aged 12-49 y. DESIGN: We conducted a systematic review of the literature published from January 1992 to March 2014 to identify trials and observational studies that reported serum, plasma, or red blood cell (RBC) folate concentrations and MTHFR C677T genotype. We conducted a meta-analysis for estimates of percentage differences in blood folate concentrations between genotypes. RESULTS: Forty studies met the inclusion criteria. Of the 6 studies that used the microbiologic assay (MA) to measure serum or plasma (S/P) and RBC folate concentrations, the percentage difference between genotypes showed a clear pattern of CC > CT > TT. The percentage difference was greatest for CC > TT [S/P: 13%; 95% credible interval (CrI): 7%, 18%; RBC: 16%; 95% CrI: 12%, 20%] followed by CC > CT (S/P: 7%; 95% CrI: 1%, 12%; RBC: 8%; 95% CrI: 4%, 12%) and CT > TT (S/P: 6%; 95% CrI: 1%, 11%; RBC: 9%; 95% CrI: 5%, 13%). S/P folate concentrations measured by using protein-binding assays (PBAs) also showed this pattern but to a greater extent (e.g., CC > TT: 20%; 95% CrI: 17%, 22%). In contrast, RBC folate concentrations measured by using PBAs did not show the same pattern and are presented in the Supplemental Material only. CONCLUSIONS: Meta-analysis results (limited to the MA, the recommended population assessment method) indicated a consistent percentage difference in S/P and RBC folate concentrations across MTHFR C677T genotypes. Lower blood folate concentrations associated with this polymorphism could have implications for a population-level risk of neural tube defects.

      10. Describing the prevalence of neural tube defects worldwide: A systematic literature review
        Zaganjor I, Sekkarie A, Tsang BL, Williams J, Razzaghi H, Mulinare J, Sniezek JE, Cannon MJ, Rosenthal J.
        PLoS One. 2016 ;11(4):e0151586.
        BACKGROUND: Folate-sensitive neural tube defects (NTDs) are an important, preventable cause of morbidity and mortality worldwide. There is a need to describe the current global burden of NTDs and identify gaps in available NTD data. METHODS AND FINDINGS: We conducted a systematic review and searched multiple databases for NTD prevalence estimates and abstracted data from peer-reviewed literature, birth defects surveillance registries, and reports published between January 1990 and July 2014 that had greater than 5,000 births and were not solely based on mortality data. We classified countries according to World Health Organization (WHO) regions and World Bank income classifications. The initial search yielded 11,614 results; after systematic review we identified 160 full text manuscripts and reports that met the inclusion criteria. Data came from 75 countries. Coverage by WHO region varied in completeness (i.e., % of countries reporting) as follows: African (17%), Eastern Mediterranean (57%), European (49%), Americas (43%), South-East Asian (36%), and Western Pacific (33%). The reported NTD prevalence ranges and medians for each region were: African (5.2-75.4; 11.7 per 10,000 births), Eastern Mediterranean (2.1-124.1; 21.9 per 10,000 births), European (1.3-35.9; 9.0 per 10,000 births), Americas (3.3-27.9; 11.5 per 10,000 births), South-East Asian (1.9-66.2; 15.8 per 10,000 births), and Western Pacific (0.3-199.4; 6.9 per 10,000 births). The presence of a registry or surveillance system for NTDs increased with country income level: low income (0%), lower-middle income (25%), upper-middle income (70%), and high income (91%). CONCLUSIONS: Many WHO member states (120/194) did not have any data on NTD prevalence. Where data are collected, prevalence estimates vary widely. These findings highlight the need for greater NTD surveillance efforts, especially in lower-income countries. 
NTDs are an important public health problem that can be prevented with folic acid supplementation and fortification of staple foods.

  2. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Chronic Diseases and Conditions
      1. BACKGROUND: It is unknown whether decreases in the prevalence of prostate-specific antigen (PSA) screening and prostate cancer incidence rates, following the US Preventive Services Task Force (USPSTF) recommendations against routine PSA screening, are similar across socioeconomic groups and US census regions. METHODS: We analyzed incidence rates and PSA screening prevalence by age, race/ethnicity, disease stage, US region, and area-level socioeconomic status (SES). Annual percent changes were used to examine changes in rates over time. Predicted marginal probabilities and 95% confidence intervals (CIs) were calculated to estimate changes in PSA screening. RESULTS: Incidence rates for men aged ≥50 years decreased in all racial/ethnic, regional, and SES groups. From 2007-2013, overall incidence rates for localized cancers significantly decreased by 7.5% (95% CI: -10.5, -4.4) per year among men aged 50-74 years and by 11.1% (95% CI: -14.1, -8.1) per year among men aged ≥75 years. In contrast, incidence of distant-stage cancer significantly increased by 1.4% (95% CI: 0.3, 2.5) per year from 2008-2013 among men aged 50-74 years, but stabilized from 2011-2013 among men aged ≥75 years (5.1% per year, 95% CI: -3.4, 14.4). Distant-stage disease rates increased with increasing poverty level among men aged 50-74 years, but not among those aged ≥75 years. CONCLUSIONS: Prostate cancer incidence decreased for early-stage disease in men 50 years and older, while rates of distant-stage disease slightly increased in men aged 50-74 years following USPSTF recommendations against routine PSA screening. Further studies with additional years of data are needed to substantiate our findings and to monitor the effects of the increase in late-stage disease on prostate cancer mortality rates.

      2. Estimating health benefits and cost-savings for achieving the Healthy People 2020 objective of reducing invasive colorectal cancer
        Hung MC, Ekwueme DU, White A, Rim SH, King JB, Wang JD, Chang SH.
        Prev Med. 2017 Sep 27.
        This study aims to quantify the aggregate potential life-years (LYs) saved and healthcare cost-savings if the Healthy People 2020 objective were met to reduce invasive colorectal cancer (CRC) incidence by 15%. We identified patients (n=886,380) diagnosed with invasive CRC between 2001 and 2011 from a nationally representative cancer dataset. We stratified these patients by sex, race/ethnicity, and age. Using these data and data from the 2001-2011 U.S. life tables, we estimated a survival function for each CRC group and the corresponding reference group and computed per-person LYs saved. We estimated per-person annual healthcare cost-savings using the 2008-2012 Medical Expenditure Panel Survey. We calculated aggregate LYs saved and cost-savings by multiplying the reduced number of CRC patients by the per-person LYs saved and lifetime healthcare cost-savings, respectively. We estimated an aggregate of 84,569 and 64,924 LYs saved for men and women, respectively, accounting for healthcare cost-savings of $329.3 and $294.2 million (in 2013$), respectively. Per person, we estimated 6.3 potential LYs saved related to those who developed CRC for both men and women, and healthcare cost-savings of $24,000 for men and $28,000 for women. Non-Hispanic whites and those aged 60-64 had the highest aggregate potential LYs saved and cost-savings. Achieving the HP2020 objective of reducing invasive CRC incidence by 15% by year 2020 would potentially save nearly 150,000 life-years and $624 million on healthcare costs.
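The aggregate totals quoted in the final sentence can be checked against the sex-specific estimates earlier in the abstract; a quick sketch using the figures above:

```python
# Check the aggregate totals against the sex-specific estimates in the abstract.
life_years_men, life_years_women = 84_569, 64_924  # potential life-years saved
savings_men, savings_women = 329.3, 294.2          # cost-savings, $ millions (2013$)

total_life_years = life_years_men + life_years_women
total_savings = savings_men + savings_women
print(total_life_years)                 # prints 149493 ("nearly 150,000 life-years")
print(f"${total_savings:.1f} million")  # prints $623.5 million (~$624 million as quoted)
```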

      3. The prevalence of atrial fibrillation (AF) is increasing in the United States as the population ages, but national surveillance is lacking. This cross-sectional study (2006 to 2014) analyzed data from the Healthcare Cost and Utilization Project’s Nationwide Emergency Department Sample, the National (Nationwide) Inpatient Sample, and the National Vital Statistics System. Event totals were estimated independently for emergency department (ED) visits, hospitalizations, and mortality, and then collectively after applying criteria to identify mutually exclusive events. Rates were calculated for AF as primary diagnosis or underlying cause of death (primary AF), as well as secondary diagnosis or contributing cause of death (co-morbid AF), and standardized by age to the 2010 US population. From 2006 to 2014, event rates increased for primary AF (249 to 268 per 100,000) and co-morbid AF (1,473 to 1,835 per 100,000). In 2014, an estimated 599,790 ED visits, 453,060 hospitalizations, and 21,712 deaths listed AF as primary. A total of 684,470 mutually exclusive primary AF and 4,695,997 mutually exclusive co-morbid AF events occurred. Among ED visits and hospitalizations with primary AF, the most common secondary diagnoses were hypertension, heart failure, ischemic heart disease, and diabetes. The mean cost per hospitalization with primary AF was $8,819. Mean costs were higher for those with co-morbid AF than for those without among hospitalizations with a primary diagnosis of ischemic heart disease, heart failure, stroke, hypertension, or diabetes (all p ≤ 0.01). In conclusion, given the substantial health and economic impact of AF and an aging US population, improved diagnosis, prevention, management, and surveillance of AF are increasingly important.

      4. Excessive weight gain, obesity, and cancer: Opportunities for clinical intervention
        Massetti GM, Dietz WH, Richardson LC.
        JAMA. 2017 Oct 03.

        [No abstract]

    • Communicable Diseases
      1. Identifying gaps in HIV policy and practice along the HIV care continuum: evidence from a national policy review and health facility surveys in urban and rural Kenya
        Cawley C, McRobie E, Oti S, Njamwea B, Nyaguara A, Odhiambo F, Otieno F, Njage M, Shoham T, Church K, Mee P, Todd J, Zaba B, Reniers G, Wringe A.
        Health Policy Plan. 2017 Nov 01;32(9):1316-1326.
        The last decade has seen rapid evolution in guidance from the WHO concerning the provision of HIV services along the diagnosis-to-treatment continuum, but the extent to which these recommendations are adopted as national policies in Kenya, and subsequently implemented in health facilities, is not well understood. Identifying gaps in policy coverage and implementation is important for highlighting areas for improving service delivery, leading to better health outcomes. We compared WHO guidance with national policies for HIV testing and counselling, prevention of mother-to-child transmission, HIV treatment, and retention in care. We then investigated implementation of these national policies in health facilities at one rural (Kisumu) and one urban (Nairobi) site in Kenya. Implementation was documented using structured questionnaires administered to in-charge staff at 10 health facilities in Nairobi and 34 in Kisumu. Policies were defined as widely implemented if they were reported to occur in >70% of facilities, partially implemented if reported in 30-70% of facilities, and as having limited implementation if reported in <30% of facilities. Overall, Kenyan national HIV care and treatment policies were well aligned with WHO guidance. Policies promoting access to treatment and retention in care were widely implemented, but there was partial or limited implementation of several policies promoting access to HIV testing, and of the more recent policy of Option B+ for HIV-positive pregnant women. Efforts are needed to improve implementation of policies designed to increase rates of diagnosis, thus facilitating entry into HIV care, if morbidity and mortality burdens are to be further reduced in Kenya as the country moves towards universal access to antiretroviral therapy.

      2. Incidence of measles in the United States, 2001-2015
        Clemmons NS, Wallace GS, Patel M, Gastanaduy PA.
        JAMA. 2017 Oct 03;318(13):1279-1281.

        [No abstract]

      3. Global HIV antiretroviral drug resistance: A perspective and report of a National Institute of Allergy and Infectious Diseases consultation
        Godfrey C, Thigpen MC, Crawford KW, Jean-Phillippe P, Pillay D, Persaud D, Kuritzkes DR, Wainberg M, Raizes E, Fitzgibbon J.
        J Infect Dis. 2017 Jun 17.

        [No abstract]

      4. Rates of virological suppression and drug resistance in adult HIV-1-positive patients attending primary healthcare facilities in KwaZulu-Natal, South Africa
        Hunt GM, Dokubo EK, Takuva S, de Oliveira T, Ledwaba J, Dube N, Moodley P, Sabatier J, Deyde V, Morris L, Raizes E.
        J Antimicrob Chemother. 2017 Aug 03.
        Background: KwaZulu-Natal (KZN) Province in South Africa has the highest HIV disease burden in the country, with an estimated population prevalence of 24.7%. A pilot sentinel surveillance project was undertaken in KZN to classify the proportion of adult patients failing first-line ART and to describe the patterns of drug resistance mutations (DRMs) in patients with virological failure (VF). Methods: Cross-sectional surveillance of acquired HIV drug resistance was conducted in 15 sentinel ART clinics between August and November 2013. Two population groups were surveyed: patients on ART for 12-15 months (Cohort A) or for 24-36 months (Cohort B). Plasma specimens with viral load ≥1000 copies/mL were defined as VF and genotyped for DRMs. Results: A total of 1299 adults were included in the analysis. The prevalence of VF was 4.0% (95% CI 1.8-8.8) among 540 adults in Cohort A and 7.7% (95% CI 4.4-13.0) among 759 adults in Cohort B. Treatment with efavirenz was more likely to suppress viral load in Cohort A (P = 0.005). Independent predictors of VF in Cohort B included male gender, advanced WHO stage at ART initiation, and treatment with stavudine or zidovudine compared with tenofovir. DRMs were detected in 89% of 123 specimens with VF, including M184I/V, K103N/S, K65N/R, V106A/M, and Y181C. Conclusions: VF in adults in KZN was <8% up to 3 years post-ART initiation but was associated with a high frequency of DRMs. These data identify key groups for intensified adherence counselling and highlight the need to optimize first-line regimens to maintain viral suppression.

      5. Trends in diagnoses among hospitalizations of HIV-infected children and adolescents in the United States: 2003-2012
        Hurst SA, Ewing AC, Ellington SR, Kourtis AP.
        Pediatr Infect Dis J. 2017 Oct;36(10):981-987.
        OBJECTIVE: Using data from 2003-2012, we updated a previous analysis of trends in hospitalizations of HIV-infected children and adolescents in the United States. METHODS: We used data from the Kids Inpatient Database of the Healthcare Cost and Utilization Project to derive nationally representative estimates of the number of hospitalizations and the rates per 1000 hospitalizations of select discharge diagnoses and procedures in 2003, 2006, 2009 and 2012 among HIV-infected and HIV-uninfected children and adolescents ≤18 years, excluding hospitalizations for conditions related to pregnancy/delivery and neonatal diagnoses. We also examined trends in the prevalence of select discharge diagnoses and procedures using multivariable logistic regression models. RESULTS: During 2003-2012, the number of hospitalizations for HIV-infected children declined 58% versus 17% for uninfected, but the odds of having discharge codes for most of the diagnoses and procedures studied, including death during hospitalization, remained higher among HIV-infected compared with uninfected children. Among HIV-infected children, the prevalence of discharge diagnoses for pneumonia, pneumococcal disease and varicella/herpes zoster infections and odds of death during hospitalization decreased over time, while bacterial infections/sepsis and methicillin-resistant Staphylococcus aureus increased. Among HIV-uninfected children, there was no increase in diagnoses of bacterial infection/sepsis, but otherwise trends were similar. CONCLUSIONS: The number of hospitalizations for HIV-infected children declined from 2003 to 2012. The decreased prevalence of several discharge diagnoses and lower risk of death during hospitalization likely reflect improvements in HIV therapies and increased uptake of other preventive strategies. However, the increasing prevalence of discharge diagnoses for bacterial infections/sepsis warrants further attention and monitoring.

      6. Delta hepatitis: Towards improved diagnostics
        Kamili S, Drobeniuc J, Mixson-Hayden T, Kodani M.
        Hepatology. 2017 Sep 29.

        [No abstract]

      7. Antimicrobial drug prescription and Neisseria gonorrhoeae susceptibility, United States, 2005-2013
        Kirkcaldy RD, Bartoces MG, Soge OO, Riedel S, Kubin G, Del Rio C, Papp JR, Hook EW, Hicks LA.
        Emerg Infect Dis. 2017 Oct;23(10):1657-1663.
        We investigated whether outpatient antimicrobial drug prescribing is associated with Neisseria gonorrhoeae antimicrobial drug susceptibility in the United States. Using susceptibility data from the Gonococcal Isolate Surveillance Project during 2005-2013 and QuintilesIMS data on outpatient cephalosporin, macrolide, and fluoroquinolone prescribing, we constructed multivariable linear mixed models for each antimicrobial agent with 1-year lagged annual prescribing per 1,000 persons as the exposure and geometric mean MIC as the outcome of interest. Multivariable models did not demonstrate associations between antimicrobial drug prescribing and N. gonorrhoeae susceptibility for any of the studied antimicrobial drugs during 2005-2013. Elucidation of epidemiologic factors contributing to resistance, including further investigation of the potential role of antimicrobial drug use, is needed.
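The outcome in these models, the geometric mean MIC, can be computed with a short sketch (a generic illustration, not the authors' analysis code; the function name and MIC values are hypothetical):

```python
import math

def geometric_mean_mic(mics):
    """Geometric mean of MIC values (e.g., in µg/mL):
    the exponential of the arithmetic mean of the log-MICs."""
    if not mics:
        raise ValueError("need at least one MIC value")
    return math.exp(sum(math.log(m) for m in mics) / len(mics))

# Doubling-dilution MICs; the geometric mean respects the log2 scale
print(round(geometric_mean_mic([0.25, 0.5, 1.0]), 6))  # 0.5
```

The geometric mean is the conventional summary for MICs because susceptibility testing uses two-fold dilution series.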

      8. Rhinovirus viremia in patients hospitalized with community acquired pneumonia
        Lu X, Schneider E, Jain S, Bramley AM, Hymas W, Stockmann C, Ampofo K, Arnold SR, Williams DJ, Self WH, Patel A, Chappell JD, Grijalva CG, Anderson EJ, Wunderink RG, McCullers JA, Edwards KM, Pavia AT, Erdman DD.
        J Infect Dis. 2017 Aug 29.
        Background: Rhinoviruses (RVs) are ubiquitous respiratory pathogens that often cause mild or subclinical infections. Molecular detection of RV from the upper respiratory tract can be prolonged, complicating etiologic association in persons with severe lower respiratory tract infections. Little is known about RV viremia and its value as a diagnostic indicator in persons hospitalized with community-acquired pneumonia (CAP). Methods: Sera from RV-positive children and adults hospitalized with CAP were tested for RV by real-time RT-PCR. RV species and type were determined by partial genome sequencing. Results: Overall, 57/570 (10%) RV-positive patients were viremic and all were children <10 years old [57/375 (15.2%)]. Although RV-A was the most common RV species detected from respiratory specimens (48.8%), almost all viremias were RV-C (98.2%). Viremic patients had fewer co-detected pathogens and were more likely to have chest retractions, wheezing and a history of underlying asthma/reactive airway disease than patients without viremia. Conclusions: More than one out of seven RV-infected children <10 years old hospitalized with CAP were viremic. In contrast with other RV species, RV-C infections were highly associated with viremia and more often the only respiratory pathogen identified, suggesting that RV-C viremia may be an important diagnostic indicator in pediatric pneumonia.

      9. Predictors of voluntary medical male circumcision prevalence among men aged 25-39 years in Nyanza region, Kenya: Results from the baseline survey of the TASCO study
        Odoyo-June E, Agot K, Grund JM, Onchiri F, Musingila P, Mboya E, Emusu D, Onyango J, Ohaga S, Soo L, Otieno-Nyunya B.
        PLoS One. 2017 ;12(10):e0185872.
        INTRODUCTION: Uptake of voluntary medical male circumcision (VMMC) as an intervention for prevention of HIV acquisition has been low among men aged ≥25 years in Nyanza region, western Kenya. We conducted a baseline survey of the prevalence and predictors of VMMC among men ages 25-39 years as part of the preparations for a cluster randomized controlled trial (cRCT) called the Target, Speed and Coverage (TASCO) Study. The TASCO Study aimed to assess the impact of two demand creation interventions-interpersonal communication (IPC) and dedicated service outlets (DSO), delivered separately and together (IPC + DSO)-on VMMC uptake. METHODS: As part of the preparatory work for implementation of the cRCT to evaluate tailored interventions to improve uptake of VMMC, we conducted a survey of men aged 25-39 years from a traditionally non-circumcising Kenyan ethnic community within non-contiguous locations selected as study sites. We determined their circumcision status, estimated the baseline circumcision prevalence and assessed predictors of being circumcised using univariate and multivariate logistic regression. RESULTS: A total of 5,639 men were enrolled, of which 2,851 (50.6%) reported being circumcised. The odds of being circumcised were greater for men with secondary education (adjusted odds ratio (aOR) = 1.65; 95% CI: 1.45-1.86, p<0.001), post-secondary education (aOR = 1.72; 95% CI: 1.44-2.06, p<0.001), and those employed (aOR = 1.32; 95% CI: 1.18-1.47, p<0.001). However, the odds were lower for men with a history of being married (currently married, divorced, separated, or widowed). CONCLUSION: Among adult men in the rural Nyanza region of Kenya, men with post-primary education and the employed were more likely to be circumcised. VMMC programs should focus on specific sub-groups of men, including those aged 25-39 years who are married, divorced/separated/widowed, and of low socio-economic status (low education and unemployed).
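The adjusted odds ratios reported above are exponentiated coefficients from the multivariate logistic regression; a minimal sketch of that conversion (the coefficient value below is illustrative, not taken from the study):

```python
import math

def odds_ratio(log_odds_coef):
    """Exponentiate a logistic-regression coefficient to get an odds ratio."""
    return math.exp(log_odds_coef)

# Illustrative: a coefficient of 0.5 corresponds to an aOR of about 1.65
print(round(odds_ratio(0.5), 2))  # 1.65
```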

      10. Inhibition of rubella virus replication by the broad-spectrum drug nitazoxanide in cell culture and in a patient with a primary immune deficiency
        Perelygina L, Hautala T, Seppanen M, Adebayo A, Sullivan KE, Icenogle J.
        Antiviral Res. 2017 Sep 30.
        Persistent rubella virus (RV) infection has been associated with various pathologies such as congenital rubella syndrome, Fuchs’ uveitis, and cutaneous granulomas in patients with primary immune deficiencies (PID). Currently there are no drugs to treat RV infections. Nitazoxanide (NTZ) is an FDA-approved drug for parasitic infections, and has recently been shown to have broad-spectrum antiviral activities. Here we found that empiric 2-month therapy with oral NTZ was associated with the decline/elimination of RV antigen from lesions in a PID patient with RV-positive granulomas, while peginterferon treatment had no effect. In addition, we characterized the effects of NTZ on cell culture models of persistent RV infection. NTZ significantly inhibited RV replication in a primary culture of human umbilical vein endothelial cells (HUVEC) and in the Vero and A549 epithelial cell lines in a dose-dependent manner, with an average 50% inhibitory concentration of 0.35 µg/mL (1.1 µM). RV strains representing currently circulating genotypes were inhibited to a similar extent. NTZ affected early and late stages of infection by inhibiting synthesis of cellular and RV RNA and interfering with intracellular trafficking of the RV surface glycoproteins, E1 and E2. These results suggest a potential application of NTZ for the treatment of persistent rubella infections, but more studies are required.

      11. Background: Increasing antibiotic resistance limits treatment options for gonorrhea. We examined the impact of a hypothetical point-of-care (POC) test reporting antibiotic susceptibility profiles on slowing resistance spread. Methods: A mathematical model describing gonorrhea transmission incorporated resistance emergence probabilities and fitness costs associated with resistance based on characteristics of ciprofloxacin (A), azithromycin (B), and ceftriaxone (C). We evaluated time to 1% and 5% prevalence of resistant strains among all isolates with: (1) empiric treatment (B and C), and treatment guided by POC tests determining susceptibility to (2) A only, and (3) all three antibiotics. Results: Continued empiric treatment without POC testing was projected to result in >5% of isolates being resistant to both B and C within 15 years. Use of either POC test in 10% of identified cases delayed this by 5 years. The three-antibiotic POC test delayed the time to reach 1% prevalence of triply-resistant strains by 6 years, while the A-only test resulted in no delay. Results were less sensitive to assumptions about fitness costs and test characteristics with increasing test uptake. Conclusions: Rapid diagnostics reporting antibiotic susceptibility may extend the usefulness of existing antibiotics for gonorrhea treatment, but ongoing monitoring of resistance patterns will be critical.

      12. A point system to forecast hepatocellular carcinoma risk before and after treatment among persons with chronic hepatitis C
        Xing J, Spradling PR, Moorman AC, Holmberg SD, Teshale EH, Rupp LB, Gordon SC, Lu M, Boscarino JA, Schmidt MA, Trinacty CM, Xu F.
        Dig Dis Sci. 2017 Sep 30.
        BACKGROUND: Risk of hepatocellular carcinoma (HCC) may be difficult to determine in the clinical setting. AIM: Develop a scoring system to forecast HCC risk among patients with chronic hepatitis C. METHODS: Using data from the Chronic Hepatitis Cohort Study collected during 2005-2014, we derived HCC risk scores for males and females using an extended Cox model with aspartate aminotransferase-to-platelet ratio index (APRI) as a time-dependent variable and mean Kaplan-Meier survival functions from patient data at two study sites, and used data collected at two separate sites for external validation. For model calibration, we used the Greenwood-Nam-D’Agostino goodness-of-fit statistic to examine differences between predicted and observed risk. RESULTS: Of 12,469 patients (1628 with a history of sustained viral response [SVR]), 504 developed HCC; median follow-up was 6 years. Final predictors in the model included age, alcohol abuse, interferon-based treatment response, and APRI. Point values, ranging from -3 to 14 (males) and -3 to 12 (females), were established using hazard ratios of the predictors aligned with 1-, 3-, and 5-year Kaplan-Meier survival probabilities of HCC. Discriminatory capacity was high (c-index 0.82 males and 0.84 females) and external calibration demonstrated no differences between predicted and observed HCC risk for 1-, 3-, and 5-year forecasts among males (all p values >0.97) and for 3- and 5-year risk among females (all p values >0.87). CONCLUSION: This scoring system, based on age, alcohol abuse history, treatment response, and APRI, can be used to forecast up to a 5-year risk of HCC among hepatitis C patients before and after SVR.
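APRI itself follows a standard formula, (AST / upper limit of normal) × 100 / platelet count; a minimal sketch (the study's point values and cutoffs are not reproduced here, and the input values below are illustrative):

```python
def apri(ast_iu_per_l, ast_uln_iu_per_l, platelets_10e9_per_l):
    """AST-to-platelet ratio index (APRI):
    (AST / upper limit of normal) * 100 / platelet count (10^9/L)."""
    return (ast_iu_per_l / ast_uln_iu_per_l) * 100 / platelets_10e9_per_l

# Illustrative: AST 80 IU/L, ULN 40 IU/L, platelets 100 x 10^9/L
print(apri(80, 40, 100))  # 2.0
```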

      13. Estimated rates of influenza-associated outpatient visits during 2001-2010 in six US integrated health care delivery organizations
        Zhou H, Thompson WW, Belongia EA, Fowlkes A, Baxter R, Jacobsen SJ, Jackson ML, Glanz JM, Naleway AL, Ford DC, Weintraub E, Shay DK.
        Influenza Other Respir Viruses. 2017 Sep 27.
        BACKGROUND: Population-based estimates of influenza-associated outpatient visits including both pandemic and inter-pandemic seasons are uncommon. Comparisons of such estimates with laboratory-confirmed rates of outpatient influenza are rare. OBJECTIVE: To estimate influenza-associated outpatient visits in six US integrated health care delivery organizations enrolling ~7.7 million persons. METHODS: Using negative-binomial regression methods, we modeled rates of influenza-associated visits with ICD-9-CM-coded pneumonia or acute respiratory outpatient visits during 2001-2010. These estimated counts were added to visits coded specifically for influenza to derive estimated rates. We compared these rates with those observed in two contemporaneous studies recording RT-PCR-confirmed influenza outpatient visits. RESULTS: Outpatient rates estimated with pneumonia visits were 39 (95% confidence interval [CI], 30-70) and 203 (95% CI, 180-240) per 10,000 person-years, respectively, for inter-pandemic and pandemic seasons. Corresponding rates estimated with respiratory visits were 185 (95% CI, 161-255) and 542 (95% CI, 441-823) per 10,000 person-years. During the pandemic, children aged 2-17 years had the largest increase in rates (when estimated with pneumonia visits, from 64 [95% CI, 50-121] to 381 [95% CI, 366-481]). Rates estimated with pneumonia visits were consistent with rates of RT-PCR-confirmed influenza visits during 4 of 5 seasons in one comparison study. In another, rates estimated with pneumonia visits during the pandemic for children and adults were consistent in timing, peak, and magnitude. CONCLUSIONS: Estimated rates of influenza-associated outpatient visits were higher in children than adults during pre-pandemic and pandemic seasons. Rates estimated with pneumonia visits plus influenza-coded visits were similar to rates from studies using RT-PCR-confirmed influenza. This article is protected by copyright. All rights reserved.
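The per-10,000-person-year rates reported above reduce to a simple calculation once modeled visit counts are in hand (a generic sketch; the counts below are illustrative, not the study's data):

```python
def rate_per_10k_person_years(events, person_years):
    """Incidence rate expressed per 10,000 person-years of observation."""
    return events / person_years * 10_000

# Illustrative: 3,000 modeled visits over 770,000 person-years
print(round(rate_per_10k_person_years(3_000, 770_000)))  # 39
```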

    • Disaster Control and Emergency Services
      1. PURPOSE: The effects of climate change are far-reaching and multifactorial, with potential impacts on food security and conflict. Large population movements, whether from the aftermath of natural disasters or resulting from conflict, can precipitate the need for humanitarian response in what can become complex humanitarian emergencies (CHEs). Nurses need to be prepared to respond to affected communities in need, whether the emergency is domestic or global. The purpose of the article is to describe a novel course for nursing students interested in practice within the confines of CHEs and natural disasters. METHODS AND FRAMEWORK: The authors used the Sphere Humanitarian Charter and Minimum Standards as a practical framework to inform the course development. They completed a review of the literature on the interactions among climate change, conflict, and health, and on competencies related to working in CHEs. Resettled refugees, as well as experts in the area of humanitarian response, recovery, and mitigation from the Centers for Disease Control and Prevention and nongovernmental organizations, further informed the development of the course. CLINICAL RELEVANCE: This course prepares the nursing workforce to respond appropriately to large population movements that may arise from the aftermath of natural disasters or conflict, both of which can comprise a complex humanitarian disaster. Using The Sphere Project e-learning course, students learn about the Sphere Project, which works to ensure accountability and quality in humanitarian response and offers core minimal standards for technical assistance. These guidelines are seen globally as the gold standard for humanitarian response and address many of the competencies for disaster nursing.

    • Disease Reservoirs and Vectors
      1. Rickettsia parkeri (Rickettsiales: Rickettsiaceae) detected in ticks of the Amblyomma maculatum (Acari: Ixodidae) group collected from multiple locations in southern Arizona
        Allerdice ME, Beati L, Yaglom H, Lash RR, Delgado-de la Mora J, Licona-Enriquez JD, Delgado-de la Mora D, Paddock CD.
        J Med Entomol. 2017 Jul 24.
        Rickettsia parkeri is an emerging human pathogen transmitted by Amblyomma ticks in predominantly tropical and subtropical regions of the western hemisphere. In 2014 and 2015, one confirmed case and one probable case of R. parkeri rickettsiosis were reported from the Pajarita Wilderness Area, a semi-arid mountainous region in southern Arizona. To examine more closely the potential public health risk of R. parkeri in this region, a study was initiated to investigate the pervasiveness of Amblyomma maculatum Koch group ticks in mountainous areas of southern Arizona and to ascertain the infection frequencies of R. parkeri in these ticks. During July 2016, a total of 182 adult ticks were collected and evaluated from the Pajarita Wilderness Area in Santa Cruz County and two additional sites in Cochise and Santa Cruz counties in southern Arizona. DNA of R. parkeri was detected in a total of 44 (24%) of these ticks. DNA of “Candidatus Rickettsia andeanae” and Rickettsia rhipicephali was detected in three (2%) and one (0.5%) of the samples, respectively. These observations corroborate previous collection records and indicate that established populations of A. maculatum group ticks exist in multiple foci in southern Arizona. The high frequency of R. parkeri in these tick populations suggests a public health risk, as well as a need to increase education about R. parkeri rickettsiosis for those residing in, working in, or visiting this area.

      2. Ticks (Acari: Ixodidae) were collected from 44 desert bighorn sheep (Ovis canadensis) and 10 mule deer (Odocoileus hemionus) in southern California during health inspections in 2015-16. Specimens were identified and screened by PCR analysis to determine the presence and prevalence of Bartonella, Borrelia, and Rickettsia species in ticks associated with these wild ruminants. None of the 60 Dermacentor hunteri and 15 Dermacentor albipictus ticks tested yielded positive PCR results. Additional tick specimens should be collected and tested to determine the prevalence of these confirmed or suspected tickborne pathogens within ruminant populations.

      3. Assessing monkeypox virus prevalence in small mammals at the human-animal interface in the Democratic Republic of the Congo
        Doty JB, Malekani JM, Kalemba LN, Stanley WT, Monroe BP, Nakazawa YU, Mauldin MR, Bakambana TL, Liyandja Dja Liyandja T, Braden ZH, Wallace RM, Malekani DV, McCollum AM, Gallardo-Romero N, Kondas A, Peterson AT, Osorio JE, Rocke TE, Karem KL, Emerson GL, Carroll DS.
        Viruses. 2017 Oct 03;9(10).
        During 2012, 2013 and 2015, we collected small mammals within 25 km of the town of Boende in Tshuapa Province, the Democratic Republic of the Congo. The prevalence of monkeypox virus (MPXV) in this area is unknown; however, cases of human infection were previously confirmed near these collection sites. Samples were collected from 353 mammals (rodents, shrews, pangolins, elephant shrews, a potamogale, and a hyrax). Some rodents and shrews were captured from houses where human monkeypox cases have recently been identified, but most were trapped in forests and agricultural areas near villages. Real-time PCR and ELISA were used to assess evidence of MPXV infection and other Orthopoxvirus (OPXV) infections in these small mammals. Seven (2.0%) of these animal samples were found to be anti-orthopoxvirus immunoglobulin G (IgG) antibody positive (six rodents: two Funisciurus spp.; one Graphiurus lorraineus; one Cricetomys emini; one Heliosciurus sp.; one Oenomys hypoxanthus, and one elephant shrew Petrodromus tetradactylus); no individuals were found positive in PCR-based assays. These results suggest that a variety of animals can be infected with OPXVs, and that epidemiology studies and educational campaigns should focus on animals that people regularly contact, including larger rodents used as protein sources.

      4. Protection of bats (Eptesicus fuscus) against rabies following topical or oronasal exposure to a recombinant raccoon poxvirus vaccine
        Stading B, Ellison JA, Carson WC, Satheshkumar PS, Rocke TE, Osorio JE.
        PLoS Negl Trop Dis. 2017 Oct 04;11(10):e0005958.
        Rabies is an ancient neglected tropical disease that causes tens of thousands of human deaths and millions of cattle deaths annually. In order to develop a new vaccine for potential use in bats, a reservoir of rabies infection for humans and animals alike, an in silico antigen designer tool was used to create a mosaic glycoprotein (MoG) gene using available sequences from the rabies Phylogroup I glycoprotein. This sequence, which represents strains more likely to occur in bats, was cloned into raccoonpox virus (RCN), and the efficacy of this novel RCN-MoG vaccine was compared to RCN-G, which expresses the glycoprotein gene from CVS-11 rabies, or luciferase (RCN-luc, negative control) in mice and big brown bats (Eptesicus fuscus). Mice vaccinated and boosted intradermally with 1 × 10^7 plaque forming units (PFU) of each RCN-rabies vaccine construct developed neutralizing antibodies and survived at significantly higher rates than controls. No significant difference in antibody titers or survival was noted between rabies-vaccinated groups. Bats were vaccinated either oronasally (RCN-G, RCN-MoG) with 5 × 10^7 PFU or by topical application in glycerin jelly (RCN-MoG, dose 2 × 10^8 PFU), boosted (same dose and route) at 46 days post vaccination (dpv), and then challenged with wild-type big brown bat variant RABV at 65 dpv. Prior to challenge, 90% of RCN-G and 75% of RCN-MoG oronasally vaccinated bats had detectable levels of serum rabies neutralizing antibodies. Bats from the RCN-luc and topically vaccinated RCN-MoG groups did not have measurable antibody responses. The RCN-rabies constructs were highly protective and not significantly different from each other. RCN-MoG provided 100% protection (n = 9) when delivered oronasally and 83% protection (n = 6) when delivered topically; protection provided by the RCN-G construct was 70% (n = 10). All rabies-vaccinated bats survived at a significantly (P ≤ 0.02) higher rate than control bats (12%; n = 8). We have demonstrated the efficacy of a novel, in silico designed rabies MoG antigen that conferred protection from rabies challenge in mice and big brown bats in laboratory studies. With further development, topical or oronasal administration of the RCN-MoG vaccine could potentially mitigate rabies in wild bat populations, reducing spillover of this deadly disease into humans, domestic mammals, and other wildlife.

      5. Francisella-like endosymbiont detected in Haemaphysalis ticks (Acari: Ixodidae) from the Republic of Korea
        Takhampunya R, Kim HC, Chong ST, Korkusol A, Tippayachai B, Davidson SA, Petersen JM, Klein TA.
        J Med Entomol. 2017 Jul 12.
        A total of 6,255 ticks belonging to three genera and six species (Haemaphysalis flava Neumann, Haemaphysalis longicornis Neumann, Haemaphysalis phasiana Saito, Ixodes nipponensis Kitaoka & Saito, Ixodes persulcatus Schulze, and Amblyomma testudinarium Koch) collected from May to August 2013 in four southwestern provinces in the Republic of Korea (ROK) were submitted to the Armed Forces Research Institute of Medical Sciences and assayed for selected tick-borne pathogens. One pool each of H. flava and H. phasiana was positive by PCR and sequencing for a Francisella-like endosymbiont, while all pools were negative for Francisella tularensis, the causative agent of tularemia.

    • Drug Safety
      1. Improving safe use of medications during pregnancy: The roles of patients, physicians, and pharmacists
        Lynch MM, Amoozegar JB, McClure EM, Squiers LB, Broussard CS, Lind JN, Polen KN, Frey MT, Gilboa SM, Biermann J.
        Qual Health Res. 2017 Oct 01:1049732317732027.
        Our study sought to explore the actual and potential roles of patients, physicians, and pharmacists, as well as their shared challenges and opportunities, in improving the safety of medication use during pregnancy. We conducted virtual focus groups with 48 women and in-depth interviews with nine physicians and five pharmacists. Qualitative analysis revealed that all three groups of participants reported “playing it safe,” the need for an engaged patient making informed decisions, challenges surrounding communication about pregnancy status, and a lack of patient-centric resources. Patients, physicians, and pharmacists are highly motivated to protect developing babies from potential harms of medication use during pregnancy while maintaining the patient’s health. Strategic messaging could maximize the effectiveness of these interactions by helping physicians discuss the benefits and risks of medication use during pregnancy, pharmacists screen for pregnancy and counsel on medication safety, and patients who use medications share pregnancy intentions with their providers before pregnancy.

      2. Evidence of superficial knowledge regarding antibiotics and their use: Results of two cross-sectional surveys in an urban informal settlement in Kenya
        Omulo S, Thumbi SM, Lockwood S, Verani JR, Bigogo G, Masyongo G, Call DR.
        PLoS One. 2017 ;12(10):e0185827.
        We assessed knowledge and practices related to antibiotic use in Kibera, an urban informal settlement in Kenya. Surveys were administered at the beginning (entry) and again at the end (exit) of a 5-month longitudinal study of AMR. Two hundred households were interviewed at entry, of which 149 were also interviewed at exit. The majority (>65%) of respondents in both surveys could name at least one antibiotic, with amoxicillin and cotrimoxazole jointly accounting for 85% and 77% of antibiotics mentioned during entry and exit, respectively. More than 80% of respondents felt antibiotics should not be shared or discontinued following the alleviation of symptoms. Nevertheless, 66% and 74% of respondents considered antibiotics effective for treating colds and flu in the entry and exit surveys, respectively. There was a high (87% at entry; 70% at exit) level of reported antibiotic use (past 12 months), mainly for colds/flu, coughs and fever, with >80% of respondents obtaining antibiotics from health facilities and pharmacies. Less than half of respondents remembered getting information on the correct use of antibiotics, although 100% of those who did reported improved attitudes towards antibiotic use. Clinicians and community pharmacists were highly trusted information sources. Paired household responses (n = 149) generally showed improved knowledge and attitudes by the exit survey although practices were largely unchanged. Weak agreement (kappa = -0.003 to 0.22) between survey responses suggests both that unintended learning had not occurred, and that participant responses were not based on established knowledge or behaviors. Targeted public education regarding antibiotics is needed to address this gap.
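Kappa here is Cohen's chance-corrected agreement statistic, (po − pe) / (1 − pe); a minimal sketch with illustrative agreement proportions (not values from the study):

```python
def cohens_kappa(p_observed, p_expected):
    """Cohen's kappa: agreement beyond chance, scaled by the maximum
    possible agreement beyond chance."""
    return (p_observed - p_expected) / (1 - p_expected)

# Illustrative: 60% observed vs 50% chance-expected agreement
print(round(cohens_kappa(0.60, 0.50), 2))  # 0.2
```

Values near zero, as in the entry-exit comparison above, indicate agreement little better than chance.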

    • Entomology
      1. Aedes aegypti (L.) and Ae. albopictus (Skuse) are important arbovirus vectors in the United States, and the recent emergence of Zika virus disease as a public health concern in the Americas has reinforced a need for tools to rapidly distinguish between these species in collections made by vector control agencies. We developed a duplex real-time PCR assay that detects both species and does not cross-amplify in any of the other seven Aedes species tested. The lower limit of detection for our assay is equivalent to approximately 0.03 of a first-instar larva in a 60-µl sample (0.016 ng of DNA per real-time PCR reaction). The assay was sensitive and specific in mixtures of both species that reflected up to a 2,000-fold difference in DNA concentration. In addition, we developed a simple protocol to extract DNA from sonicated first-instar larvae, and used that DNA to test the assay. Because it uses real-time PCR, the assay saves time by not requiring a separate visualization step. This assay can reduce the time needed for vector control agencies to make species identifications, and thus inform decisions about surveillance and control.

    • Environmental Health
      1. Plasma concentrations of per- and polyfluoroalkyl substances at baseline and associations with glycemic indicators and diabetes incidence among high-risk adults in the Diabetes Prevention Program Trial
        Cardenas A, Gold DR, Hauser R, Kleinman KP, Hivert MF, Calafat AM, Ye X, Webster TF, Horton ES, Oken E.
        Environ Health Perspect. 2017 Oct 02;125(10):107001.
        BACKGROUND: Several per- and polyfluoroalkyl substances (PFAS) are ubiquitous anthropogenic pollutants almost universally detected in humans. Experimental evidence indicates that PFAS alter glucose metabolism and insulin secretion. However, epidemiological studies have yielded inconsistent results. OBJECTIVE: We sought to examine associations between plasma PFAS concentrations, glycemic indicators, and diabetes incidence among high-risk adults. METHODS: Within the Diabetes Prevention Program (DPP), a trial for the prevention of type 2 diabetes among high-risk individuals, we quantified baseline plasma concentrations of nine PFAS among 957 participants randomized to a lifestyle intervention or placebo. We evaluated adjusted associations for plasma PFAS concentrations with diabetes incidence and key glycemic indicators measured at baseline and annually over up to 4.6 y. RESULTS: Plasma PFAS concentrations were similar to those reported in the U.S. population in 1999-2000. At baseline, in cross-sectional analysis, a doubling in plasma perfluorooctanesulfonic acid (PFOS) and perfluorooctanoic acid (PFOA) concentrations was associated with higher homeostatic model assessment of insulin resistance (HOMA-IR) [βPFOS = 0.39; 95% confidence interval (CI): 0.13, 0.66; βPFOA = 0.64; 95% CI: 0.34, 0.94], β-cell function (HOMA-β) (βPFOS = 9.62; 95% CI: 1.55, 17.70; βPFOA = 15.93; 95% CI: 6.78, 25.08), fasting proinsulin (βPFOS = 1.37 pM; 95% CI: 0.50, 2.25; βPFOA = 1.71 pM; 95% CI: 0.72, 2.71), and glycated hemoglobin (HbA1c) (βPFOS = 0.03%; 95% CI: 0.002, 0.07; βPFOA = 0.04%; 95% CI: 0.001, 0.07). There was no strong evidence of associations between plasma PFAS concentrations and diabetes incidence or prospective changes in glycemic indicators during the follow-up period. CONCLUSIONS: At baseline, several PFAS were cross-sectionally associated with small differences in markers of insulin secretion and beta-cell function. However, there was limited evidence suggesting that PFAS concentrations are associated with diabetes incidence or changes in glycemic indicators during the follow-up period.

      2. Environmental exposure to manganese in air: Tremor, motor and cognitive symptom profiles
        Kornblith ES, Casey SL, Lobdell DT, Colledge MA, Bowler RM.
        Neurotoxicology. 2017 Sep 28.
        BACKGROUND: Excessive exposure to manganese (Mn) may cause parkinsonian-like motor and tremor symptoms and adverse cognitive effects, including problems with executive functioning (EF), resembling those found in later-stage Parkinson’s disease (PD). Studies seeking to differentiate PD patients into subgroups with associated cognitive and functional outcomes using motor and tremor symptoms identified tremor-dominant (TD) and non-tremor dominant (NTD) subtypes. It is unclear whether differing patterns of pathophysiology and symptoms exist in Mn neurotoxicity, as they do in PD. METHODS: Residents of East Liverpool (n=83) and Marietta, OH (n=99) exposed to chronic (>10 years) environmental Mn through industrial pollution were administered neuropsychological measures and a physician-rated scale of movement-disorder symptoms. Two-step cluster analysis was used to group residents based on tremor symptoms, bradykinesia/rigidity symptoms, gait disturbance, and executive function. Cluster membership was validated using modeled air-Mn exposure and a computerized tremor measure. RESULTS: Elevated tremor and motor symptoms and executive dysfunction were observed, and TD and NTD symptom clusters were identified. Two additional clusters were also identified: Executive Dysfunction and Normal Functioning. The NTD residents, with elevated levels of gait disturbance and other movement disorder symptoms, did not evidence EF impairment, as predicted. Instead, residents with EF impairment formed their own cluster, and were relatively free of movement disorder symptoms. CONCLUSIONS: Results resemble reports in the PD literature with TD and NTD clusters identified, but executive dysfunction did not cluster with NTD symptoms. PD and Mn exposure likely have differing pathophysiology and developmental courses, and therefore different symptom patterns, even when similar symptoms are present.

      3. Environmental survey of drinking water sources in Kampala, Uganda during a typhoid fever outbreak
        Murphy JL, Kahler AM, Nansubuga I, Nanyunja EM, Kaplan B, Jothikumar N, Routh J, Gomez GA, Mintz ED, Hill VR.
        Appl Environ Microbiol. 2017 Sep 29.
        In 2015, a typhoid fever outbreak began in downtown Kampala, Uganda, and spread into adjacent districts. In response, an environmental survey of drinking water source types was conducted in areas of the city with high case numbers. A total of 122 samples were collected from 12 source types and tested for E. coli, free chlorine, and conductivity. An additional 37 grab samples from seven source types and 16 paired large-volume (20-L) samples from wells and springs were also collected and tested for the presence of Salmonella Typhi. Escherichia coli was detected in 60% of kaveras (drinking water sold in plastic bags) and 80% of refilled water bottles; free chlorine was not detected in either source type. Most jerry cans (68%) contained E. coli and had free chlorine residuals below the WHO-recommended level of 0.5 mg/L during outbreaks. Elevated conductivity readings for kaveras, refilled water bottles, and jerry cans (compared to treated surface water supplied by the water utility) suggested they likely contained untreated ground water. All unprotected springs and wells and more than 60% of protected springs contained E. coli. Water samples collected from the water utility had acceptable free chlorine levels and no detectable E. coli. While S. Typhi was not detected in water samples, Salmonella spp. were detected in samples from two unprotected springs, one protected spring, and one refilled water bottle. These data provided clear evidence that unregulated vended water and ground water represented a risk for typhoid transmission. IMPORTANCE: Despite the high incidence of typhoid fever globally, relatively few outbreak investigations incorporate drinking water testing. During waterborne disease outbreaks, measurement of physical-chemical parameters, such as free chlorine residual and electrical conductivity, and microbiological parameters, such as the presence of E. coli or the implicated etiologic agent, in drinking water samples can identify contaminated sources. This investigation indicated that unregulated vended water and ground water sources were contaminated and were therefore a risk to consumers during the 2015 typhoid fever outbreak in Kampala. Identification of contaminated drinking water sources and sources that do not contain adequate disinfectant levels can lead to rapid targeted interventions.
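The screening logic described above (free chlorine below the WHO outbreak threshold of 0.5 mg/L, or any detectable E. coli) can be sketched as a simple per-sample check. The sample records below are hypothetical, not the study's data:

```python
# Hedged sketch: flag water samples that fail either criterion cited in the
# abstract. The 0.5 mg/L threshold is from the abstract; samples are invented.

WHO_OUTBREAK_FREE_CL = 0.5  # mg/L, WHO-recommended residual during outbreaks

samples = [
    {"source": "jerry can",   "free_cl": 0.1, "ecoli_detected": True},
    {"source": "utility tap", "free_cl": 0.7, "ecoli_detected": False},
    {"source": "kavera",      "free_cl": 0.0, "ecoli_detected": True},
]

# A sample is flagged if chlorine residual is inadequate OR E. coli is present.
flagged = [s["source"] for s in samples
           if s["free_cl"] < WHO_OUTBREAK_FREE_CL or s["ecoli_detected"]]
print(flagged)  # ['jerry can', 'kavera']
```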

      4. Reactive indoor air chemistry and health – A workshop summary
        Wells JR, Schoemaecker C, Carslaw N, Waring MS, Ham JE, Nelissen I, Wolkoff P.
        Int J Hyg Environ Health. 2017 Sep 23.
        The chemical composition of indoor air changes due to the reactive nature of the indoor environment. Historically, only the stable parent compounds were investigated due to their ease of measurement by conventional methods. Today, however, scientists can better characterize oxidation products (gas and particulate-phase) formed by indoor chemistry. An understanding of occupant exposure can be developed through the investigation of indoor oxidants, the use of derivatization techniques, atmospheric pressure detection, the development of real-time technologies, and improved complex modeling techniques. Moreover, the connection between exposure and health effects is now receiving more attention from the research community. Nevertheless, a need still exists for improved understanding of the possible link between indoor air chemistry and observed acute or chronic health effects and long-term effects such as work-related asthma.

    • Epidemiology and Surveillance
      1. Evaluation of syndromic surveillance systems in 6 US state and local health departments
        Thomas MJ, Yoon PW, Collins JM, Davidson AJ, Mac Kenzie WR.
        J Public Health Manag Pract. 2017 Sep 28.
        OBJECTIVE: Evaluating public health surveillance systems is critical to ensuring that conditions of public health importance are appropriately monitored. Our objectives were to qualitatively evaluate 6 state and local health departments that were early adopters of syndromic surveillance in order to (1) understand the characteristics and current uses, (2) identify the most and least useful syndromes to monitor, (3) gauge the utility for early warning and outbreak detection, and (4) assess how syndromic surveillance impacted their daily decision making. DESIGN: We adapted evaluation guidelines from the Centers for Disease Control and Prevention (CDC) and gathered input from CDC subject matter experts in public health surveillance to develop a questionnaire. PARTICIPANTS: We interviewed staff members from a convenience sample of 6 local and state health departments with syndromic surveillance programs that had been in operation for more than 10 years. RESULTS: Three of the 6 interviewees provided an example of using syndromic surveillance to identify an outbreak (ie, cluster of foodborne illness in 1 jurisdiction) or detect a surge in cases for seasonal conditions (eg, influenza in 2 jurisdictions) prior to traditional, disease-specific systems. Although all interviewees noted that syndromic surveillance has not been routinely useful or efficient for early outbreak detection or case finding in their jurisdictions, all agreed that the information can be used to improve their understanding of dynamic disease control environments and conditions (eg, situational awareness) in their communities.
CONCLUSION: In the jurisdictions studied, syndromic surveillance may be useful for monitoring the spread and intensity of large outbreaks of disease, especially influenza; enhancing public health awareness of mass gatherings and natural disasters; and assessing new, otherwise unmonitored conditions when real-time alternatives are unavailable. Future studies should explore opportunities to strengthen syndromic surveillance by including broader access to and enhanced analysis of text-related data from electronic health records. Health departments may accelerate the development and use of syndromic surveillance systems, including the improvement of the predictive value and strengthening the early outbreak detection capability of these systems. These efforts support getting the right information to the right people at the right time, which is the overarching goal of CDC’s Surveillance Strategy.

    • Food Safety
      1. From 1998 to 2008, produce-related illness outbreaks accounted for roughly one-half of reported foodborne outbreaks in the United States. In 2013, Mexico accounted for approximately 50 and 30% of the monetary value of all vegetables and fruits, respectively, imported into the United States. We used historical import data to examine the correlation between the port of entry for five implicated produce vehicles from five multistate outbreaks and the geospatial and temporal distribution of illnesses in the corresponding outbreaks in the United States. For comparison, we analyzed the geospatial and temporal distribution of cases from two U.S. multistate outbreaks associated with domestically grown produce. The geospatial distribution of illnesses in the two outbreaks linked to domestic produce differed from that of the import-related produce outbreaks. The results of our pilot study suggest that geospatial distribution of early-onset cases may be used to identify ports of entry for produce likely to be responsible for causing multistate outbreaks in the United States and that targeted sampling of produce items from these ports of entry may expedite identification of an outbreak vehicle.

    • Health Communication and Education
      1. The U.S. National Action Plan to Improve Health Literacy: A model for positive organizational change
        Baur C, Harris L, Squire E.
        Stud Health Technol Inform. 2017 ;240:186-202.
        This chapter presents the U.S. National Action Plan to Improve Health Literacy and its unique contribution to public health and health care in the U.S. The chapter details what the National Action Plan is, how it evolved, and how it has influenced priorities for health literacy improvement work. Examples of how the National Action Plan fills policy and research gaps in health care and public health are included. The first part of the chapter lays the foundation for the development of the National Action Plan, and the second part discusses how it can stimulate positive organizational change to help create health literate organizations and move the nation towards a health literate society.

    • Immunity and Immunization
      1. CONTEXT: Before participating in a project funded by the Centers for Disease Control and Prevention, most state and local health departments (LHDs) were not seeking reimbursement or being fully reimbursed by insurance plans for the cost of immunization services (including vaccine costs and administration fees) they provided to insured patients. Centers for Disease Control and Prevention’s Billables Project was designed to enable state and LHDs to bill public and private insurance plans for immunization services provided to insured patients. OBJECTIVE: Identify and describe key barriers state and LHDs may encounter while planning and implementing a billing program, as well as possible solutions for overcoming those barriers. DESIGN: This study used reports from Billables Project participants to explore barriers they encountered when planning and implementing a billing program and steps taken to address those barriers. SETTING AND PARTICIPANTS: Thirty-eight state immunization programs. RESULTS: Based on project participants’ reports, barriers were noted in 7 categories: (1) funding and costs, (2) staff, (3) health department characteristics, (4) third-party payers and insurance plans, (5) software, (6) patient insurance status, and (7) other barriers. Possible solutions for overcoming those barriers included hiring or seeking external help, creating billing guides and training modules, streamlining workflows, and modifying existing software systems. CONCLUSION: Overcoming barriers during planning and implementation of a billing program can be challenging for state and LHDs, but the experiences and suggestions of past Billables Project participants can help guide future billing program efforts.

      2. Pneumococcal vaccination among adults with work-related asthma
        Dodd KE, Mazurek JM.
        Am J Prev Med. 2017 Sep 22.
        INTRODUCTION: Pneumococcal vaccination is recommended for all adults with asthma, and a Healthy People 2020 goal aims to achieve 60% coverage among high-risk adults, including those with asthma. Adults with work-related asthma have more severe asthma symptoms than those with non-work-related asthma and are particularly vulnerable to pneumococcal pneumonia. METHODS: To assess pneumococcal vaccination coverage by work-related asthma status, data from the 2012-2013 Behavioral Risk Factor Surveillance System Asthma Call-back Survey for ever-employed adults aged 18-64 years with current asthma from 29 states were examined in 2016. Adults with work-related asthma had ever been told by a physician that their asthma was work-related. Pneumococcal vaccine recipients self-reported having ever received a pneumococcal vaccine. Multivariate logistic regression was used to calculate adjusted prevalence ratios and associated 95% CIs. RESULTS: Among an estimated 12 million ever-employed adults with current asthma in 29 states, 42.0% received a pneumococcal vaccine. Adults with work-related asthma were more likely to have received a pneumococcal vaccine than adults with non-work-related asthma (53.7% versus 35.0%, respectively; prevalence ratio=1.24, 95% CI=1.06, 1.45). Among adults with work-related asthma, pneumococcal vaccine coverage was lowest among Hispanics (36.2%) and those without health insurance (38.5%). CONCLUSIONS: Pneumococcal vaccination coverage among adults with work-related asthma and non-work-related asthma is below the Healthy People 2020 target level. Healthcare providers should verify pneumococcal vaccination status in their patients with asthma and offer the vaccine to those not vaccinated.

      3. Febrile seizure risk after vaccination in children one to five months of age
        Duffy J, Hambidge SJ, Jackson LA, Kharbanda EO, Klein NP, Naleway A, Omer SB, Weintraub E.
        Pediatr Neurol. 2017 Aug 23.
        BACKGROUND: The risk of febrile seizure is temporarily increased for a few days after the administration of certain vaccines in children aged six to 23 months. Our objective was to determine the febrile seizure risk following vaccination in children aged one to five months, when six different vaccines are typically administered. METHODS: We identified emergency department visits and inpatient admissions with International Classification of Diseases, Ninth Revision, febrile seizure codes among children enrolled in nine Vaccine Safety Datalink participating health care organizations from 2006 through 2011. Febrile seizures were confirmed by medical record abstraction. We used the self-controlled risk-interval method to compare the incidence of febrile seizure during postvaccination days 0 to 1 (risk interval) versus days 14 to 20 (control interval). RESULTS: We identified 15 febrile seizure cases that occurred after 585,342 vaccination visits. The case patients were aged three to five months. The patients had received a median of four (range two to six) vaccines simultaneously. The incidence rate ratio of febrile seizure after vaccination was 23 (95% confidence interval 5.13 to 100.8), and the attributable risk was 3.92 (95% confidence interval 1.68 to 6.17) febrile seizure cases per 100,000 persons vaccinated. CONCLUSIONS: Vaccination in children aged three to five months was associated with a large relative risk of febrile seizure on the day of and the day after vaccination, but the risk was small in absolute terms. Postvaccination febrile seizure should not be a concern for the vast majority of children receiving vaccines, but clinicians might take this risk into consideration when evaluating and treating children susceptible to seizures precipitated by fever.
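The self-controlled risk-interval comparison described above contrasts the case rate in the 2-day risk interval (postvaccination days 0-1) with the rate in the 7-day control interval (days 14-20). A minimal sketch follows; the case counts are hypothetical, not the study's actual counts:

```python
# Hedged sketch of an incidence rate ratio from a self-controlled
# risk-interval design. Interval lengths match the abstract (2-day risk
# window, 7-day control window); case counts are invented for illustration.

def incidence_rate_ratio(risk_cases: int, risk_days: int,
                         control_cases: int, control_days: int) -> float:
    """IRR = (risk-interval rate) / (control-interval rate).
    Computed as a cross-product to keep the arithmetic exact."""
    return (risk_cases * control_days) / (risk_days * control_cases)

# e.g. 13 cases in the 2-day risk window vs 2 cases in the 7-day control window
print(incidence_rate_ratio(13, 2, 2, 7))  # 22.75
```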

      4. Meningococcal carriage following a university serogroup B meningococcal disease outbreak and vaccination campaign with MenB-4C and MenB-FHbp – Oregon, 2015-2016
        McNamara LA, Thomas JD, MacNeil J, Chang HY, Day M, Fisher E, Martin S, Poissant T, Schmink SE, Steward-Clark E, Jenkins LT, Wang X, Acosta A.
        J Infect Dis. 2017 Aug 26.
        Background: Limited data exist on the impact of the serogroup B meningococcal (MenB) vaccines MenB-FHbp and MenB-4C on meningococcal carriage and herd protection. We therefore assessed meningococcal carriage following a MenB vaccination campaign in response to a university serogroup B meningococcal disease outbreak in 2015. Methods: A convenience sample of students recommended for vaccination provided oropharyngeal swabs and completed questionnaires during four carriage surveys over 11 months. Isolates were tested by real-time PCR, slide agglutination, and whole genome sequencing. Vaccination history was verified via university records and the state immunization registry. Results: A total of 4,225 oropharyngeal swabs were analyzed from 3,802 unique participants. Total meningococcal and genotypically serogroup B carriage prevalence among sampled students were stable at 11-17% and 1.2%-2.4% during each round, respectively; no participants carried the outbreak strain. Neither 1-3 doses of MenB-FHbp nor 1-2 doses of MenB-4C was associated with decreased total or serogroup B carriage prevalence. Conclusions: While few participants completed the full MenB vaccination series, limiting analytic power, these data suggest that MenB-FHbp and MenB-4C do not have a large, rapid impact on meningococcal carriage and are unlikely to provide herd protection in the context of an outbreak response.

      5. Reduction in diarrhea- and rotavirus-related healthcare visits among children <5 years of age after national rotavirus vaccine introduction in Zimbabwe
        Mujuru HA, Yen C, Nathoo KJ, Gonah NA, Ticklay I, Mukaratirwa A, Berejena C, Tapfumanei O, Chindedza K, Rupfutse M, Weldegebriel G, Mwenda JM, Burnett E, Tate JE, Parashar UD, Manangazira P.
        Pediatr Infect Dis J. 2017 Oct;36(10):995-999.
        BACKGROUND: In Zimbabwe, rotavirus accounted for 41%-56% of acute diarrhea hospitalizations before rotavirus vaccine introduction in 2014. We evaluated rotavirus vaccination impact on acute diarrhea- and rotavirus-related healthcare visits in children. METHODS: We examined monthly and annual acute diarrhea and rotavirus test-positive hospitalizations and Accident and Emergency Department visits among children <60 months of age at 3 active surveillance hospitals during 2012-2016; we compared prevaccine introduction (2012-2013) with postvaccine introduction (2015 and 2016) data for 2 of the hospitals. We examined monthly acute diarrhea hospitalizations by year and age group for 2013-2016 from surveillance hospital registers and monthly acute diarrhea outpatient visits reported to the Ministry of Health and Child Care during 2012-2016. RESULTS: Active surveillance data showed winter seasonal peaks in diarrhea- and rotavirus-related visits among children <60 months of age during 2012-2014 that were substantially blunted in 2015 and 2016 after vaccine introduction; the percentage of rotavirus test-positive visits followed a similar seasonal pattern and decrease. Hospital register data showed similar pre-introduction seasonal variation and post-introduction declines in diarrhea hospitalizations among children 0-11 and 12-23 months of age. Monthly variation in outpatient diarrhea-related visits mirrored active surveillance data patterns. At 2 surveillance hospitals, the percentage of rotavirus-positive visits declined by 40% and 43% among children 0-11 months of age and by 21% and 33% among children 12-23 months of age in 2015 and 2016, respectively. CONCLUSION: Initial reductions in diarrheal illness among children <60 months of age, particularly among those 0-11 months of age, after vaccine introduction are encouraging. These early results provide evidence to support continued rotavirus vaccination and rotavirus surveillance in Zimbabwe.

      6. Association between parent attitudes and receipt of human papillomavirus vaccine in adolescents
        VanWormer JJ, Bendixsen CG, Vickers ER, Stokley S, McNeil MM, Gee J, Belongia EA, McLean HQ.
        BMC Public Health. 2017 Oct 02;17(1):766.
        BACKGROUND: Human papillomavirus (HPV) vaccine coverage rates remain low. This is believed to reflect parental hesitancy, but few studies have examined how changes in parents’ attitudes impact HPV vaccine uptake. This study examined the association between changes in parents’ vaccine attitudes and HPV vaccine receipt in their adolescent children. METHODS: A baseline and 1-year follow-up survey of HPV vaccine attitudes was administered to parents of 11-17 year olds who had not completed the HPV vaccine series. Changes in attitudinal scores (barriers, harms, ineffectiveness, and uncertainties) from the Carolina HPV Immunization Attitudes and Beliefs Scale were assessed. Two outcomes were measured (in parents’ adolescent children) over an 18-month period and analyzed using multivariable regression: receipt of the next scheduled HPV vaccine dose and 3-dose series completion. RESULTS: There were 221 parents who completed the baseline survey (11% response rate) and 164 with available follow-up data; 60% of their adolescent children received a next HPV vaccine dose and 38% completed the vaccine series at follow-up. Decrease in parents’ uncertainties was a significant predictor of vaccine receipt, with each 1-point reduction in uncertainties score associated with 4.9-fold higher odds of receipt of the next vaccine dose. Higher baseline harms score was the only significant predictor of lower series completion. CONCLUSIONS: Reductions in parents’ uncertainties appeared to result in greater likelihood of their children receiving the HPV vaccine. Only baseline concerns about vaccine harms were associated with lower series completion rate. Education for parents should emphasize the HPV vaccine’s safety profile.
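Because the uncertainties effect comes from logistic regression, a per-point odds ratio scales multiplicatively with larger score changes. A small sketch: the 4.9 figure is the per-point odds ratio reported in the abstract, while the multi-point extrapolation is illustrative only.

```python
# Hedged sketch: multiplicative scaling of a logistic-regression odds ratio.
# OR_PER_POINT is taken from the abstract; extrapolating beyond 1 point is an
# illustration of the model form, not a result from the study.

OR_PER_POINT = 4.9

def odds_ratio_for_change(points: float) -> float:
    """Odds ratio implied by a `points`-unit reduction in the uncertainties score."""
    return OR_PER_POINT ** points

print(round(odds_ratio_for_change(1), 1))  # 4.9
print(round(odds_ratio_for_change(2), 2))  # 24.01
```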

    • Injury and Violence
      1. Factors associated with physical violence by a sexual partner among girls and women in rural Kenya
        Gust DA, Pan Y, Otieno F, Hayes T, Omoro T, Phillips-Howard PA, Odongo F, Otieno GO.
        J Glob Health. 2017 Dec;7(2):020406.
        BACKGROUND: Intimate partner physical violence increases women’s risk for negative health outcomes and is an important public health concern. The purpose of the present study was to determine 1) the proportion of girls (≤18 years) and women (>18 years) who experienced physical violence by a sexual partner, and 2) factors (including self-reported HIV infection) associated with girls and women who experienced physical violence by a sexual partner. METHODS: Cross-sectional surveys were conducted in the Gem Health and Demographic Surveillance System (HDSS) area in Siaya County, western Kenya, in 2011-2012 (Round 1) and 2013-2014 (Round 2). FINDINGS: Among 8003 unique participants (582 girls and 7421 women), 11.6% reported physical violence by a sexual partner in the last 12 months (girls: 8.4%; women: 11.8%). Three factors were associated with physical violence by a sexual partner among girls: being married or cohabiting (nearly 5-fold higher risk), low education, and reporting forced sex in the last 12 months (both with an approximate 2-fold higher risk). Predictive factors were similar for women, with the addition of partner alcohol/drug use and deliberately terminating a pregnancy. Self-reported HIV status was not associated with recent physical violence by a sexual partner among girls or women. CONCLUSIONS: Gender-based physical violence is prevalent in this rural setting and has a strong relationship with marital status, low education level, and forced sex among girls and women. Concerted efforts to prevent child marriage and retain girls in school, as well as implementation of school- and community-based anti-violence programs, may help mitigate this risk.

      2. BACKGROUND: Healthcare providers and law enforcement (LE) officers are among the most common first responders to injuring events. Despite frequent interface between the health system (HS) and LE sectors, the published evidence that supports their collaboration in injury surveillance, control and prevention has not been comprehensively reviewed. METHODS: We conducted a scoping review of literature published from 1990 to 2016 that focused on local and regional HS and LE collaborations in injury surveillance, control and prevention. Our aim was to describe what is known and what remains unexplored about these cross-sector efforts. RESULTS: 128 articles were included in the final review. These were categorised by their focus on either surveillance activities or partnerships in injury control and prevention programmes. The majority of surveillance articles focused on road traffic injuries. Conversely, articles describing partnerships and programme evaluations primarily targeted the prevention of interpersonal violence. DISCUSSION: This review yielded two major findings: overall, the combination of HS and LE injury data added value to surveillance systems, especially as HS data augmented LE data; and HS and LE partnerships have been developed to improve injury control and prevention. However, there are few studies that have evaluated the impact and sustainability of these partnerships. CONCLUSIONS: The current evidence to support HS and LE collaboration in injury surveillance and control and prevention programmes is heterogeneous. Notable gaps suggest ample opportunity for further research and programme evaluation across all types of injury.

    • Laboratory Sciences
      1. Improving laboratory efficiency in the Caribbean to attain the World Health Organization HIV Treat All recommendations
        Alemnji GA, Chase M, Branch S, Guevara G, Nkengasong JN, Albalak R.
        AIDS Res Hum Retroviruses. 2017 Oct 01.
        Scientific evidence showing the benefits of early initiation of antiretroviral therapy (ART) prompted the World Health Organization (WHO) to recommend that all persons diagnosed HIV-positive should commence ART irrespective of CD4 count and disease progression. Based on this recommendation, countries should adopt and implement the HIV “Treat All” policy to achieve the UNAIDS 90-90-90 targets and ultimately reach epidemic control. Attaining this goal along the HIV treatment cascade depends on the laboratory to monitor progress and measure impact. The laboratory plays an important role in HIV diagnosis to attain the first 90, and in viral load (VL) and HIV drug resistance testing to reinforce adherence, improve viral suppression, and measure the third 90. Countries in the Caribbean region have endorsed the WHO HIV “Treat All” recommendation; however, they are faced with diminishing financial resources to support laboratory testing, seen as a rate-limiting factor in achieving this goal. To improve laboratory coverage with fewer resources in the Caribbean, there is a need to optimise laboratory operations to ensure the implementation of high-quality, less expensive, evidence-based approaches that will result in more efficient and effective service delivery. Suggested practical and innovative approaches to achieve this include: 1) targeted testing within HIV hotspots; 2) strengthening sample referral systems for VL; 3) better laboratory data collection systems; and 4) use of treatment cascade data for programmatic decision making. Furthermore, strengthening quality improvement and procurement systems will minimize diagnostic errors and guarantee a continuum of uninterrupted testing, which is critical for routine monitoring of patients to meet the stated goal.

      2. Pulmonary toxicity following acute coexposures to diesel particulate matter and α-quartz crystalline silica in the Sprague-Dawley rat
        Farris BY, Antonini JM, Fedan JS, Mercer RR, Roach KA, Chen BT, Schwegler-Berry D, Kashon ML, Barger MW, Roberts JR.
        Inhal Toxicol. 2017 Oct 01:1-18.
        The effects of acute pulmonary coexposures to silica and diesel particulate matter (DPM), which may occur in various mining operations, were investigated in vivo. Rats were exposed by intratracheal instillation (IT) to silica (50 or 233 µg), DPM (7.89 or 50 µg), or silica and DPM combined in phosphate-buffered saline (PBS), or to PBS alone (control). At one day, one week, one month, two months, and three months postexposure, bronchoalveolar lavage and histopathology were performed to assess lung injury, inflammation, and immune response. While higher doses of silica caused inflammation and injury at all time points, DPM exposure alone did not. DPM (50 µg) combined with silica (233 µg) increased inflammation at one week and one month postexposure and caused an increase in the incidence of fibrosis at one month compared with exposure to silica alone. To assess susceptibility to lung infection following coexposure, rats were exposed by IT to 233 µg silica, 50 µg DPM, a combination of the two, or PBS control one week before intratracheal inoculation with 5 × 10⁵ Listeria monocytogenes. At 1, 3, 5, 7, and 14 days following infection, pulmonary immune response and bacterial clearance from the lung were evaluated. Coexposure to DPM and silica did not alter bacterial clearance from the lung compared to control. Although DPM and silica coexposure did not alter pulmonary susceptibility to infection in this model, the study showed that noninflammatory doses of DPM had the capacity to increase silica-induced lung injury, inflammation, and onset/incidence of fibrosis.

      3. High-density microprojection array delivery to rat skin of low doses of trivalent inactivated poliovirus vaccine elicits potent neutralising antibody responses
        Muller DA, Fernando GJ, Owens NS, Agyei-Yeboah C, Wei JC, Depelsenaire AC, Forster A, Fahey P, Weldon WC, Oberste MS, Young PR, Kendall MA.
        Sci Rep. 2017 Oct 03;7(1):12644.
        To secure a polio-free world, the live attenuated oral poliovirus vaccine (OPV) will eventually need to be replaced with inactivated poliovirus vaccines (IPV). However, current IPV delivery is less suitable for campaign use than OPV, and more expensive. We are progressing a microarray patch delivery platform, the Nanopatch, as an easy-to-use device to administer vaccines, including IPV. The Nanopatch contains an ultra-high-density array (10,000/cm²) of short (~230 µm) microprojections that delivers dry-coated vaccine into the skin. Here, we compare the relative immunogenicity of Nanopatch immunisation versus intramuscular (IM) injection in rats, using monovalent and trivalent formulations of IPV. Nanopatch delivery elicits faster antibody response kinetics, with high titres of neutralising antibody after just one (IPV2) or two (IPV1 and IPV3) immunisations, while IM injection requires two (IPV2) or three (IPV1 and IPV3) immunisations to induce similar responses. Seroconversion to each poliovirus type was seen in 100% of rats that received ~1/40th of a human dose of IPV delivered by Nanopatch, but not in rats given ~1/8th or ~1/40th dose by IM injection. Ease of administration coupled with the dose reduction observed in this study suggests the Nanopatch could facilitate inexpensive IPV vaccination in campaign settings.

      4. Novel multipurpose pod-intravaginal ring for the prevention of HIV, HSV, and unintended pregnancy: Pharmacokinetic evaluation in a macaque model
        Smith JM, Moss JA, Srinivasan P, Butkyavichene I, Gunawardana M, Fanter R, Miller CS, Sanchez D, Yang F, Ellis S, Zhang J, Marzinke MA, Hendrix CW, Kapoor A, Baum MM.
        PLoS One. 2017 ;12(10):e0185946.
        Globally, women bear an uneven burden for sexual HIV acquisition. Results from two clinical trials evaluating intravaginal rings (IVRs) delivering the antiretroviral agent dapivirine have shown that protection from HIV infection can be achieved with this modality, but high adherence is essential. Multipurpose prevention technologies (MPTs) can potentially increase product adherence by offering protection against multiple vaginally transmitted infections and unintended pregnancy. Here we describe a coitally independent, long-acting pod-IVR MPT that could potentially prevent HIV and HSV infection as well as unintended pregnancy. The pharmacokinetics of MPT pod-IVRs delivering tenofovir alafenamide hemifumarate (TAF2) to prevent HIV, acyclovir (ACV) to prevent HSV, and etonogestrel (ENG) in combination with ethinyl estradiol (EE), FDA-approved hormonal contraceptives, were evaluated in pigtailed macaques (N = 6) over 35 days. Pod IVRs were exchanged at 14 days, with the only modification being lower ENG release rates in the second IVR. Plasma progesterone was monitored weekly to determine the effect of ENG/EE on the menstrual cycle. The mean in vivo release rates (mg/day) for the two formulations over 30 days ranged as follows: TAF2 0.35-0.40; ACV 0.56-0.70; EE 0.03-0.08; ENG (high releasing) 0.63; and ENG (low releasing) 0.05. Mean peak progesterone levels were 4.4 ± 1.8 ng/mL prior to IVR insertion and 0.075 ± 0.064 ng/mL for 5 weeks after insertion, suggesting that systemic EE/ENG levels were sufficient to suppress menstruation. The TAF2 and ACV release rates and resulting vaginal tissue drug concentrations (medians: TFV, 2.4 ng/mg; ACV, 0.2 ng/mg) may be sufficient to protect against HIV and HSV infection, respectively.
This proof-of-principle study demonstrates that MPT pod-IVRs could serve as a potent biomedical prevention tool to protect women’s sexual and reproductive health and may increase adherence to HIV PrEP even among younger high-risk populations.

      5. Joint toxicity of different heavy metal mixtures after a short-term oral repeated-administration in rats
        Su H, Li Z, Fiati Kenston SS, Shi H, Wang Y, Song X, Gu Y, Barber T, Aldinger J, Zou B, Ding M, Zhao J, Lin X.
        Int J Environ Res Public Health. 2017 Oct 01;14(10).
        The systemic toxicity of different combinations of heavy metal mixtures (HMMs) was studied according to equivalent proportions of the eight most common detectable heavy metals found in fish consumed in the Ningbo area of China. The ion mass proportions of Zn, Cu, Mn, Cr, Ni, Cd, Pb, and Hg were 1070.0, 312.6, 173.1, 82.6, 30.0, 13.3, 6.6, and 1.0, respectively. In this study, 10 experimental groups were set as follows: M8 (Pb + Cd + Hg + Ni + Cu + Zn + Mn + Cr); M5 (Pb + Cd + Hg + Ni + Cr); M4A (Pb + Cd + Hg + Ni); M4B (Cu + Zn + Mn + Cr); M3 (Cu + Zn + Mn); Cr; Cu; Zn; Mn; and control. Sprague Dawley (SD) rats were orally treated with a single dose of each group every three days (10 times in total) for 34 days. After the Morris water maze test, blood and tissue samples were collected for biochemical, histopathological, and western blot analysis. Results showed that abnormalities could be observed in different treatment groups; the M4B combination had the most significant changes compared with all other groups. In conclusion, combined HMMs may have adverse effects on hematologic, hepatic, renal, and neurobehavioral function, and may also disturb electrolyte and lipid balance. Why the M4B combination generated much higher toxic effects than any other combination or individual heavy metal needs further evaluation.

      6. Early assessment and correlations of nanoclay’s toxicity to their physical and chemical properties
        Wagner AL, White AP, Stueckle TA, Banerjee D, Sierros KA, Rojanasakul Y, Agarwal S, Gupta RK, Dinu CZ.
        ACS Applied Materials and Interfaces. 2017 ;9(37):32323-32335.
        Functionalization of nanoclays with organic modifiers increases their barrier properties, thermal stability, and mechanical properties and allows for ease of implementation in food packaging materials or medical devices. Previous reports have shown that, while integration of organic modifiers between the layered mineral silicates yields nanoclays with different degrees of hydrophobicity that become easily miscible in polymers, the modifiers could also pose hazards via inhalation or ingestion routes of exposure. Through a systematic analysis of three organically modified nanoclays and one pristine nanoclay, we aimed to relate, for the first time, the physical and chemical characteristics determined via microscopic and spectroscopic techniques with the potential of these nanoclays to induce deleterious effects in in vitro cellular systems, i.e., in immortalized and primary human lung epithelial cell lines. To derive information on how functionalization could shape toxicological profiles throughout the nanoclays’ life cycle, both as-received and thermally degraded nanoclays were evaluated. Our analysis showed that the chemical composition of the organic modifiers influenced both the physical and chemical characteristics of the nanoclays and their toxicity. Overall, when cells were exposed to nanoclays with organic modifiers containing bioreactive groups, they displayed lower cell numbers as well as more elongated cell morphologies relative to the pristine nanoclay and the nanoclay containing a modifier with long carbon chains. Additionally, thermal degradation caused loss of the organic modifiers as well as changes in the size and shape of the nanoclays, which led to changes in toxicity upon exposure of our model cellular systems. 
Our study provides insight into the combined effects of the chemical composition, size, and shape of the nanoclays on their toxicological profiles under conditions that mimic exposure in manufacturing and disposal environments, respectively, and can aid safe-by-design manufacturing of nanoclays with user-controlled functionalization and lower toxicity for food packaging applications. (c) 2017 American Chemical Society.

      7. Multiplex RT-PCR for simultaneous surveillance of influenza A and B viruses
        Zhou B, Deng YM, Barnes JR, Sessions O, Chou TW, Wilson M, Stark TJ, Volk M, Spirason N, Halpin RA, Kamaraj US, Ding T, Stockwell TB, Salvatore M, Ghedin E, Barr IG, Wentworth DE.
        J Clin Microbiol. 2017 Oct 04.
        Influenza A and B viruses are the causative agents of annual influenza epidemics that can be severe; influenza A viruses intermittently cause pandemics. Sequence information from influenza genomes is instrumental in determining mechanisms underpinning antigenic evolution and antiviral resistance. However, due to sequence diversity and the dynamics of influenza evolution, rapid and high-throughput sequencing of influenza viruses remains a challenge. We developed a single-reaction FluA/B Multiplex RT-PCR method that amplifies the most critical genomic segments (HA, NA, and M) of seasonal influenza A and B viruses for next-generation sequencing, regardless of viral types, subtypes, or lineages. Herein we demonstrate that the strategy is highly sensitive and robust. The strategy was validated on thousands of seasonal influenza A and B virus positive specimens using multiple next-generation sequencing platforms.

    • Nutritional Sciences
      1. The validity of predictive equations to estimate 24-hour sodium excretion: The MESA and CARDIA Urinary Sodium Study
        Allen NB, Zhao L, Loria CM, Van Horn L, Wang CY, Pfeiffer CM, Cogswell ME, Wright J, Liu K.
        Am J Epidemiol. 2017 Jul 15;186(2):149-159.
        We examined the population distribution of urinary sodium concentrations and the validity of existing equations predicting 24-hour sodium excretion from a single spot urine sample among older adults with and without hypertension. In 2013, 24-hour urine collections were obtained from 554 participants in the Multi-Ethnic Study of Atherosclerosis and the Coronary Artery Risk Development in Young Adults study, who were aged 45-79 years and of whom 56% were female, 58% were African American, and 54% had hypertension, in Chicago, Illinois. One-third provided a second 24-hour collection. Four timed (overnight, morning, afternoon, and evening) spot urine specimens and the 24-hour collection were analyzed for sodium and creatinine concentrations. Mean 24-hour sodium excretion was 3,926 (standard deviation (SD), 1,623) mg for white men, 2,480 (SD, 1,079) mg for white women, 3,454 (SD, 1,651) mg for African-American men, and 3,397 (SD, 1,641) mg for African-American women, and did not differ significantly by hypertensive status. Mean bias (difference) in predicting 24-hour sodium excretion from the timed spot urine specimens ranged from -182 (95% confidence interval: -285, -79) to 1,090 (95% confidence interval: 966, 1,213) mg/day overall. Although the Tanaka equation using the evening specimen produced the least bias overall, no single equation worked well across subgroups of sex and race/ethnicity. A single spot urine sample is not a valid indicator of individual sodium intake. New equations are needed to accurately estimate 24-hour sodium excretion for older adults.
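The bias comparisons above (predicted minus measured 24-hour sodium excretion, reported with 95% confidence intervals) can be illustrated with a minimal sketch. The values below are hypothetical, not study data, and the confidence interval uses a simple normal approximation rather than the study's actual methods:

```python
import math

def mean_bias_ci(predicted, measured):
    """Mean bias (predicted - measured) with a normal-approximation 95% CI."""
    diffs = [p - m for p, m in zip(predicted, measured)]
    n = len(diffs)
    mean = sum(diffs) / n
    # Sample standard deviation of the paired differences
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    se = sd / math.sqrt(n)
    return mean, (mean - 1.96 * se, mean + 1.96 * se)

# Hypothetical mg/day values for five participants, not study data
pred = [3500, 2900, 4100, 3300, 3800]
meas = [3400, 3100, 3900, 3200, 3600]
bias, (lo, hi) = mean_bias_ci(pred, meas)
```

A confidence interval for the bias that excludes zero would indicate systematic over- or under-prediction, which is the pattern the study reports for several equation/specimen combinations.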

      2. Experiences and lessons learned for delivery of micronutrient powders interventions
        Reerink I, Namaste SM, Poonawala A, Nyhus Dhillon C, Aburto N, Chaudhery D, Kroeun H, Griffiths M, Haque MR, Bonvecchio A, Jefferds ME, Rawat R.
        Matern Child Nutr. 2017 Sep;13 Suppl 1.
        An effective delivery strategy coupled with relevant social and behaviour change communication (SBCC) has been identified as central to the implementation of micronutrient powder (MNP) interventions, but there has been limited documentation of what works. Under the auspices of “The Micronutrient Powders Consultation: Lessons Learned for Operational Guidance,” three working groups were formed to summarize experiences and lessons across countries regarding MNP interventions for young children. This paper focuses on programmatic experiences related to MNP delivery (models, platforms, and channels), SBCC, and training. Methods included a review of published and grey literature, interviews with key informants, and deliberations throughout the consultation process. We found that most countries distributed MNP free of charge via the health sector, although distribution through other platforms and using subsidized fee-for-product or mixed payment models has also been used. Community-based distribution channels have generally shown higher coverage and, when part of an infant and young child feeding approach, may provide additional benefit given their complementarity. SBCC for MNP has worked best when focused on meeting the MNP behavioural objectives (appropriate use, intake adherence, and related infant and young child feeding behaviours). Programmers have learned that incorporating SBCC and training throughout the intervention life cycle has allowed for much-needed adaptations. Diverse experiences delivering MNP exist, and although no one-size-fits-all approach emerged, well-established delivery platforms, community involvement, and SBCC-centred designs tended to have more success. Much still needs to be learned about MNP delivery, and we propose a set of implementation research questions that require further investigation.

    • Occupational Safety and Health
      1. Sleep apnea and pesticide exposure in a study of US farmers
        Baumert BO, Carnes MU, Hoppin JA, Jackson CL, Sandler DP, Freeman LB, Henneberger PK, Umbach DM, Shrestha S, Long S, London SJ.
        Sleep Health. 2017 .
        Introduction: Carbamate and organophosphate pesticides inhibit acetylcholinesterase, and poisoning leads to respiratory depression. Thus, involvement in sleep apnea is plausible, but no data exist at lower levels of exposure. Other pesticides could impact sleep apnea by different mechanisms but have not been studied. Our study examines the associations between pesticide exposure and sleep apnea among pesticide applicators from a US farming population. Participants and methods: We analyzed data from 1569 male pesticide applicators, mostly farmers, from an asthma case-control study nested within the prospective Agricultural Health Study. On questionnaires, participants reported use of specific pesticides and physician diagnosis plus prescribed treatments for sleep apnea. We used multivariable logistic regression to estimate associations between ever use of 63 pesticides and sleep apnea (234 cases, 1335 noncases). Results: The most notable association was for carbofuran, a carbamate (100 exposed cases; odds ratio 1.83, 95% confidence interval 1.34-2.51, P = .0002). Carbofuran use began before the reported onset of sleep apnea in all cases. Discussion: This study adds to the known adverse health outcomes of exposure to carbofuran, a pesticide that was canceled for most agricultural purposes in the United States in 2009 but that persists in the environment and remains in use in some other countries. Conclusions: We conducted the first epidemiological study investigating the association of pesticide exposure and sleep apnea. Our results in a male agricultural population suggest that exposure to carbofuran is positively associated with sleep apnea.
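For context, the odds ratio reported above was estimated by multivariable logistic regression with adjustment for covariates. As a simplified, hypothetical illustration only, an unadjusted odds ratio can be computed from a 2 x 2 exposure-by-outcome table as the cross-product ratio; the counts below are invented, not the study's data:

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Unadjusted odds ratio from a 2x2 table: (a*d) / (b*c)."""
    return (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)

# Hypothetical counts: 234 cases and 1335 noncases split by ever-use of a pesticide
or_unadj = odds_ratio(100, 450, 134, 885)
```

An odds ratio above 1 indicates that cases had higher odds of exposure than noncases; the study's adjusted estimate of 1.83 for carbofuran follows the same interpretation.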

      2. Factors associated with crewmember survival of cold water immersion due to commercial fishing vessel sinkings in Alaska
        Lucas DL, Case SL, Lincoln JM, Watson JR.
        Safety Science. 2018 January;101:190-196.
        Occupational fatality surveillance has identified that fishing vessel disasters, such as sinkings and capsizings, continue to account for the most deaths among crewmembers in the US fishing industry. When a fishing vessel sinks at sea, crewmembers are at risk of immersion in water and subsequent drowning. This study examined survival factors for crewmembers following cold water immersion after the sinking of decked commercial fishing vessels in Alaskan waters during 2000-2014. Two immersion scenarios were considered separately: immersion for any length of time, and long-term immersion, defined as immersion lasting over 30 min. Logistic regression was used to predict the odds of crewmember survival. Of the 617 crewmembers onboard 187 fishing vessels that sank in Alaska during 2000-2014, 557 (90.3%) survived and 60 died. For crewmembers immersed for any length of time, the significant adjusted predictors of survival were: entering a life-raft, sinking within three miles of shore, the sinking not being weather-related, and working as a deckhand. For crewmembers immersed for over 30 min, the significant adjusted predictors of survival were: wearing an immersion suit, entering a life-raft, working as a deckhand, and the sinking not being weather-related. The results of this analysis demonstrate that in situations where cold water immersion becomes inevitable, having access to well-maintained, serviceable lifesaving equipment and the knowledge and skills to use it properly are critical.

      3. Many antineoplastic drugs used to treat cancer, particularly alkylating agents and topoisomerase inhibitors, are known to induce genetic damage in patients. Elevated levels of chromosomal aberrations, micronuclei, and DNA damage have been documented in cancer patients. Elevations in these same biomarkers of genetic damage have been reported in numerous studies of healthcare workers, such as nurses and pharmacists, who routinely handle these drugs, but results vary across studies. To obtain an overall assessment of the exposure effect, we performed a meta-analysis on data obtained from peer-reviewed publications reporting chromosomal aberration levels in healthcare workers exposed to antineoplastic drugs. A literature search identified 39 studies reporting on occupational exposure to antineoplastic drugs and measurement of chromosomal aberrations in healthcare workers. After applying strict inclusion criteria for data quality and presentation, data from 17 studies included in 16 publications underwent meta-analysis using Hedges’ bias-corrected g and a random-effects model. Results showed the level of chromosomal aberrations in healthcare workers exposed to antineoplastic drugs was significantly higher than in controls. The standardized mean differences (difference of means divided by the within-group standard deviation) from all studies were pooled, yielding a value of 1.006 (unitless) with p < 0.001. Thus, in addition to the documented genotoxic effects of antineoplastic drugs in cancer patients, this meta-analysis confirmed a significant association between occupational exposure to antineoplastics during the course of a normal work day and increases in chromosomal aberrations in healthcare workers. Based on the studies reviewed, we were unable to accurately assess whether appropriate use of protective measures might reduce the incidence of genetic damage in healthcare workers.
However, given the potential for increased cancer risk linked to increases in chromosomal aberrations, the results of this study support the need to limit occupational exposure of healthcare workers to antineoplastic drugs as much as possible.
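As a rough sketch of the per-study effect size the abstract names (Hedges' bias-corrected g: a standardized mean difference with a small-sample correction factor), computed on entirely hypothetical values rather than data from the studies reviewed:

```python
import math

def hedges_g(mean_exp, mean_ctl, sd_exp, sd_ctl, n_exp, n_ctl):
    """Bias-corrected standardized mean difference (Hedges' g)."""
    # Pooled within-group standard deviation
    sd_pooled = math.sqrt(((n_exp - 1) * sd_exp**2 + (n_ctl - 1) * sd_ctl**2)
                          / (n_exp + n_ctl - 2))
    d = (mean_exp - mean_ctl) / sd_pooled      # Cohen's d
    j = 1 - 3 / (4 * (n_exp + n_ctl) - 9)      # small-sample bias correction
    return j * d

# Hypothetical study: mean aberrations per 100 cells, exposed vs. control workers
g = hedges_g(3.2, 1.9, 1.4, 1.2, 25, 25)
```

In a random-effects meta-analysis, per-study g values like this are pooled with weights reflecting both within-study and between-study variance; the abstract reports a pooled value of 1.006.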

    • Parasitic Diseases
      1. Household costs among patients hospitalized with malaria: evidence from a national survey in Malawi, 2012
        Hennessee I, Chinkhumba J, Briggs-Hagen M, Bauleni A, Shah MP, Chalira A, Moyo D, Dodoli W, Luhanga M, Sande J, Ali D, Gutman J, Lindblade KA, Njau J, Mathanga DP.
        Malar J. 2017 Oct 02;16(1):395.
        BACKGROUND: With 71% of Malawians living on < $1.90 a day, high household costs associated with severe malaria are likely a major economic burden for low income families and may constitute an important barrier to care seeking. Nevertheless, few efforts have been made to examine these costs. This paper describes household costs associated with seeking and receiving inpatient care for malaria in health facilities in Malawi. METHODS: A cross-sectional survey was conducted in a representative nationwide sample of 36 health facilities providing inpatient treatment for malaria from June to August 2012. Patients admitted at least 12 h before study team visits who had been prescribed an antimalarial after admission were eligible to provide cost information for their malaria episode, including care seeking at previous health facilities. An ingredients-based approach was used to estimate direct costs. Indirect costs were estimated using a human capital approach. Key drivers of total household costs for illness episodes resulting in malaria admission were assessed by fitting a generalized linear model, accounting for clustering at the health facility level. RESULTS: Out of 100 patients who met the eligibility criteria, 80 (80%) provided cost information for their entire illness episode to date and were included: 39% of patients were under 5 years old and 75% had sought care for the malaria episode at other facilities prior to coming to the current facility. Total household costs averaged $17.48 per patient; direct and indirect household costs averaged $7.59 and $9.90, respectively. Facility management type, household distance from the health facility, patient age, high household wealth, and duration of hospital stay were all significant drivers of overall costs. 
CONCLUSIONS: Although malaria treatment is supposed to be free in public health facilities, households in Malawi still incur high direct and indirect costs for malaria illness episodes that result in hospital admission. Finding ways to minimize the economic burden of inpatient malaria care is crucial to protect households from potentially catastrophic health expenditures.

      2. The effect of holes in long-lasting insecticidal nets on malaria in Malawi: results from a case-control study
        Minta AA, Landman KZ, Mwandama DA, Shah MP, Eng JL, Sutcliffe JF, Chisaka J, Lindblade KA, Mathanga DP, Steinhardt LC.
        Malar J. 2017 Oct 02;16(1):394.
        BACKGROUND: Long-lasting insecticidal nets (LLINs) are a cornerstone of malaria prevention. Holes develop in LLINs over time and compromise their physical integrity, but how holes affect malaria transmission risk is not well known. METHODS: After a nationwide mass LLIN distribution in July 2012, a study was conducted to assess the relationship between LLIN damage and malaria. From March to September 2013, febrile children ages 6-59 months who consistently slept under LLINs (every night for 2 weeks before illness onset) were enrolled in a case-control study at Machinga District Hospital outpatient department. Cases were positive for Plasmodium falciparum asexual parasites by microscopy while controls were negative. Digital photographs of participants’ LLINs were analysed using an image-processing programme to measure holes. Total hole area was classified by quartiles and according to the World Health Organization’s proportionate hole index (pHI) cut-offs [<79 cm² (good), 80-789 cm² (damaged), and >790 cm² (too torn)]. Number of holes by location and size, and total hole area, were compared between case and control LLINs using non-parametric analyses and logistic regression. RESULTS: Of 248 LLINs analysed, 97 (39%) were from cases. Overall, 86% of LLINs had at least one hole. The median number of holes of any size was 9 [interquartile range (IQR) 3, 22], and most holes were located in the lower halves of the nets [median 7 (IQR 2, 16)]. There were no differences in number or location of holes between LLINs used by cases and controls. The median total hole area was 10 cm² (IQR 2, 125) for control LLINs and 8 cm² (IQR 2, 47) for case LLINs (p = 0.10). Based on pHI, 109 (72%) control LLINs and 83 (86%) case LLINs were in “good” condition. Multivariable modeling showed no association between total hole area and malaria, controlling for child age, caregiver education, and iron- versus thatched-roof houses. 
CONCLUSIONS: LLIN holes were not associated with increased odds of malaria in this study. However, most of the LLINs were in relatively good condition 1 year after distribution. Future studies should examine associations between LLIN holes and malaria risk with more damaged nets.

      3. Protocol and baseline data for a multi-year cohort study of the effects of different mass drug treatment approaches on functional morbidities from schistosomiasis in four African countries
        Shen Y, King CH, Binder S, Zhang F, Whalen CC, Secor WE, Montgomery SP, Mwinzi PN, Olsen A, Magnussen P, Kinung’hi S, Phillips AE, Nala R, Ferro J, Aurelio HO, Fleming F, Garba A, Hamidou A, Fenwick A, Campbell CH, Colley DG.
        BMC Infect Dis. 2017 Sep 29;17(1):652.
        BACKGROUND: The Schistosomiasis Consortium for Operational Research and Evaluation (SCORE) focus is on randomized trials of different approaches to mass drug administration (MDA) in endemic countries in Africa. Because their studies provided an opportunity to evaluate the effects of mass treatment on Schistosoma-associated morbidity, nested cohort studies were developed within SCORE’s intervention trials to monitor changes in a suite of schistosomiasis disease outcomes. This paper describes the process SCORE used to select markers for prospective monitoring and the baseline prevalence of these morbidities in four parallel cohort studies. METHODS: In July 2009, SCORE hosted a discussion of the potential impact of MDA on morbidities due to Schistosoma infection that might be measured in the context of multi-year control. Candidate markers were reviewed and selected for study implementation. Baseline data were then collected from cohorts of children in four country studies: two in high endemic S. mansoni sites (Kenya and Tanzania), and two in high endemic S. haematobium sites (Niger and Mozambique), with these cohorts to be followed prospectively over 5 years. RESULTS: At baseline, 62% of children in the S. mansoni sites had detectable eggs in their stool, and 10% had heavy infections (≥400 eggs/g feces). Heavy S. mansoni infections were found to be associated with increased baseline risk of anemia, although children with moderate or heavy intensity infections had lower risk of physical wasting. Prevalence of egg-positive infection in the combined S. haematobium cohorts was 27%, with 5% of individuals having heavy infection (≥50 eggs/10 mL urine). At baseline, light intensity S. haematobium infection was associated with anemia and with lower scores in the social domain of health-related quality-of-life (HRQoL) assessed by Pediatric Quality of Life Inventory. 
CONCLUSIONS: Our consensus on practical markers of Schistosoma-associated morbidity indicated that height, weight, hemoglobin, exercise tolerance, HRQoL, and ultrasound abnormalities could be used as reference points for gauging treatment impact. Data collected over five years of program implementation will provide guidance for future evaluation of morbidity control in areas endemic for schistosomiasis. TRIAL REGISTRATION: These cohort studies are registered and performed in conjunction with the International Standard Randomised Controlled Trial Registry trials ISRCTN16755535, ISRCTN14117624, ISRCTN95819193, and ISRCTN32045736.

      4. A persistent hotspot of Schistosoma mansoni infection in a five-year randomized trial of praziquantel preventative chemotherapy strategies
        Wiegand RE, Mwinzi PN, Montgomery SP, Chan YL, Andiego K, Omedo M, Muchiri G, Ogutu MO, Rawago F, Odiere MR, Karanja DM, Secor WE.
        J Infect Dis. 2017 Sep 16.
        Background: Persistent hotspots have been described following mass drug administration (MDA) for the control of schistosomiasis, but have not been studied during the course of a multi-year MDA program. Methods: In data from a five-year study of school-based and village-wide preventive chemotherapy strategies for Schistosoma mansoni, spatial scan statistics were used to find infection hotspots in three populations: 5-8 year olds, 9-12 year olds, and adults. Negative binomial regression was used to analyze changes from baseline, and ROC analyses were used to predict which villages would reach prevalence and intensity endpoints. Results: We identified a persistent hotspot, not associated with study arm, where S. mansoni infection prevalence and intensity did not decrease as much as in villages outside the hotspot. Significant differences from baseline were realized after one year of MDA; we did not identify factors that moderated this relationship. Villages meeting specified endpoints at year 5 were predicted from prior year data with moderately high sensitivity and specificity. Conclusions: MDA strategies were less effective at reducing prevalence and intensity in the hotspot compared to other villages. Villages that reached year 5 endpoints could be detected earlier, providing the opportunity to amend intervention strategies.

      5. Knowledge and perception towards net care and repair practice in Ethiopia
        Zewde A, Irish S, Woyessa A, Wuletaw Y, Nahusenay H, Abdelmenan S, Demissie M, Gulema H, Dissanayake G, Chibsa S, Solomon H, Yenehun MA, Kebede A, Lorenz LM, Ponce-de-Leon G, Keating J, Worku A, Berhane Y.
        Malar J. 2017 Oct 02;16(1):396.
        BACKGROUND: Long-lasting insecticidal nets (LLINs) are a key malaria control intervention. Although LLINs are presumed to be effective for 3 years under field or programmatic conditions, net care and repair approaches by users influence the physical and chemical durability. Understanding how knowledge, perception and practices influence net care and repair practices could guide the development of targeted behavioural change communication interventions related to net care and repair in Ethiopia and elsewhere. METHODS: This population-based, household survey was conducted in four regions of Ethiopia [Amhara, Oromia, Tigray, Southern Nations Nationalities Peoples Region (SNNPR)] in June 2015. A total of 1839 households were selected using multi-stage sampling procedures. The household respondents were the heads of households. A questionnaire was administered and the data were captured electronically. STATA software version 12 was used to analyse the data. Survey commands were used to account for the multi-stage sampling approach. Household descriptive statistics related to characteristics and levels of knowledge and perception on net care and repair are presented. Ordinal logistic regression was used to identify factors associated with net care and repair perceptions. RESULTS: Less than a quarter of the respondents (22.3%: 95% CI 20.4-24.3%) reported adequate knowledge of net care and repair; 24.6% (95% CI 22.7-26.5%) of the respondents reported receiving information on net care and repair in the previous 6 months. Thirty-five per cent of the respondents (35.1%: 95% CI 32.9-37.4%) reported positive perceptions towards net care and repair. Respondents with adequate knowledge on net care and repair (AOR 1.58: 95% CI 1.2-2.02), and those who discussed net care and repair with their family (AOR 1.47: 95% CI 1.14-1.89) had higher odds of having positive perceptions towards net care and repair. 
CONCLUSIONS: The low level of reported knowledge on net care and repair, as well as the low level of reported positive perception towards net repair need to be addressed. Targeted behavioural change communication campaigns could be used to target specific groups; increased net care and repair would lead to longer lasting nets.

    • Physical Activity
      1. Objectively measured physical activity and risk of knee osteoarthritis
        Qin J, Barbour KE, Nevitt MC, Helmick CG, Hootman JM, Murphy LB, Cauley JA, Dunlop DD.
        Med Sci Sports Exerc. 2017 Oct 02.
        PURPOSE: To examine the association between objectively measured physical activity and risk of developing incident knee osteoarthritis (OA) in a community-based cohort of middle-aged and older adults. METHODS: We used data from the Osteoarthritis Initiative (OAI), an ongoing prospective cohort study of adults aged 45 to 83 at initial enrollment with elevated risk of symptomatic knee OA. Moderate-vigorous physical activity (MVPA) was measured by a uniaxial accelerometer for seven continuous days in two data collection cycles, and was categorized as inactive (<10 minutes/week), low activity (10-<150 minutes/week), and active (≥150 minutes/week). Incident knee OA based on radiographic and symptomatic OA and joint space narrowing were analyzed as outcomes over four years of follow-up. Participants free of the outcome of interest in both knees at study baseline were included (sample sizes ranged from 694 to 1,331 for different outcomes). We estimated hazard ratios (HRs) and their 95% confidence intervals (CIs). RESULTS: In multivariate adjusted analyses, active MVPA participation was not significantly associated with risk of incident radiographic knee OA (HR: 1.52; 95% CI: 0.68-3.40), symptomatic knee OA (HR: 1.17; 95% CI: 0.44-3.09), or joint space narrowing (HR: 0.87; 95% CI: 0.37-2.06), when compared with inactive MVPA participation. Similar results were found for participants with low activity MVPA. CONCLUSION: MVPA was not associated with the risk of developing incident knee OA or joint space narrowing over four years of follow-up among OAI participants who are at increased risk of knee OA.

    • Reproductive Health
      1. OBJECTIVE: We sought to determine the prevalence of postpartum contraceptive use among women with postpartum depressive symptoms (PDS) and examine the association between PDS and contraceptive method. STUDY DESIGN: We evaluated data from 16,357 postpartum women participating in the 2009-2011 Pregnancy Risk Assessment Monitoring System. PDS was defined as an additive score of ≥10 for three questions on depression, hopelessness, and feeling physically slowed. Contraceptive use was categorized as permanent, long-acting reversible contraception (LARC), user-dependent hormonal, and user-dependent non-hormonal. Logistic regression models compared postpartum contraceptive use and method by PDS status. RESULTS: In total, 12.3% of women with a recent live birth reported PDS. Large percentages of women with (69.4%) and without (76.1%) PDS used a user-dependent method or no contraceptive method. There were no associations between PDS and use of any postpartum contraception (adjusted prevalence ratio (aPR) = 1.00, 95% CI 0.98-1.03) or permanent contraception (aPR = 1.05, 95% CI 0.88-1.27). LARC use was elevated, but not significantly, among women with PDS compared to those without (aPR = 1.16, 95% CI: 1.00-1.34). CONCLUSIONS: Large percentages of women with and without PDS used user-dependent or no contraception. Since depression may be associated with misuse of user-dependent methods, counseling women about how to use methods more effectively, as well as the effectiveness of non-user-dependent methods, may be beneficial. IMPLICATIONS: A large percentage of women with PDS are either not using contraception or using less effective user-dependent methods. Since depression may be associated with misuse of user-dependent contraceptive methods, counseling women about how to use methods more effectively, as well as non-user-dependent options, such as LARC, may be beneficial.

      2. Embryo cryopreservation and preeclampsia risk
        Sites CK, Wilson D, Barsky M, Bernson D, Bernstein IM, Boulet S, Zhang Y.
        Fertil Steril. 2017 Sep 30.
        OBJECTIVE: To determine whether assisted reproductive technology (ART) cycles involving cryopreserved-warmed embryos are associated with the development of preeclampsia. DESIGN: Retrospective cohort study. SETTING: IVF clinics and hospitals. PATIENT(S): A total of 15,937 births from ART: 9,417 singleton and 6,520 twin. INTERVENTION(S): We used linked ART surveillance, birth certificate, and maternal hospitalization discharge data, considering resident singleton and twin births from autologous or donor eggs from 2005-2010. MAIN OUTCOME MEASURE(S): We compared the frequency of preeclampsia diagnosis for cryopreserved-warmed versus fresh ET and used multivariable logistic regression to adjust for confounders. RESULT(S): Among pregnancies conceived with autologous eggs resulting in singletons, preeclampsia was greater after cryopreserved-warmed versus fresh ET (7.51% vs. 4.29%, adjusted odds ratio = 2.17 [1.67-2.82]). Preeclampsia without and with severe features, preeclampsia with preterm delivery, and chronic hypertension with superimposed preeclampsia were more frequent after cryopreserved-warmed versus fresh ET (3.99% vs. 2.55%; 2.95% vs. 1.41%; 2.76% vs. 1.48%; and 0.95% vs. 0.43%, respectively). Among pregnancies from autologous eggs resulting in twins, the frequency of preeclampsia with severe features (9.26% vs. 5.70%) and preeclampsia with preterm delivery (14.81% vs. 11.74%) was higher after cryopreserved versus fresh transfers. Among donor egg pregnancies, rates of preeclampsia did not differ significantly between cryopreserved-warmed and fresh ET (10.78% vs. 12.13% for singletons and 28.0% vs. 25.15% for twins). CONCLUSION(S): Among ART pregnancies conceived using autologous eggs resulting in live births, those involving transfer of cryopreserved-warmed embryos, as compared with fresh ET, had increased risk for preeclampsia with severe features and preeclampsia with preterm delivery.
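Comparisons like the 7.51% vs. 4.29% singleton result above are summarized with odds ratios. A crude (unadjusted) odds ratio can be computed directly from the two proportions; note the study's reported 2.17 is a multivariable-adjusted estimate, so this sketch deliberately will not reproduce it:

```python
def crude_odds_ratio(p_exposed: float, p_unexposed: float) -> float:
    """Unadjusted odds ratio from two outcome proportions.
    Illustration only: the abstract's 2.17 comes from multivariable
    logistic regression adjusting for confounders, which a crude
    two-proportion calculation cannot replicate."""
    odds_exposed = p_exposed / (1 - p_exposed)
    odds_unexposed = p_unexposed / (1 - p_unexposed)
    return odds_exposed / odds_unexposed

# Singleton preeclampsia: 7.51% (cryopreserved-warmed) vs. 4.29% (fresh ET)
print(round(crude_odds_ratio(0.0751, 0.0429), 2))  # crude OR, below the adjusted 2.17
```

The gap between the crude and adjusted estimates reflects the confounders (e.g., maternal characteristics) the regression controls for.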

    • Substance Use and Abuse
      1. Contribution of opioid-involved poisoning to the change in life expectancy in the United States, 2000-2015
        Dowell D, Arias E, Kochanek K, Anderson R, Guy GP, Losby JL, Baldwin G.
        JAMA. 2017 Sep 19;318(11):1065-1067.

        [No abstract]

      2. Cadmium and cadmium/zinc ratios and tobacco-related morbidities
        Richter P, Faroon O, Pappas RS.
        Int J Environ Res Public Health. 2017 Sep 29;14(10).
        Metals are one of five major categories of carcinogenic or toxic constituents in tobacco and tobacco smoke. Cadmium is highly volatile, and a higher percentage of the total tobacco cadmium content is efficiently transferred to mainstream tobacco smoke than for many other toxic metals in tobacco. Inhaled cadmium bioaccumulates in the lungs and is distributed beyond the lungs to other tissues, with a total body biological half-life of one to two decades. Chronic cadmium exposure through tobacco use elevates blood and urine cadmium concentrations. Cadmium is a carcinogen and an inducer of proinflammatory immune responses. Elevated exposure to cadmium is associated with reduced pulmonary function, obstructive lung disease, bronchogenic carcinoma, cardiovascular diseases including myocardial infarction, peripheral arterial disease, prostate cancer, cervical cancer, pancreatic cancer, and various oral pathologies. Cadmium and zinc have a toxicologically inverse relationship. Zinc is an essential element and is reportedly antagonistic to some manifestations of cadmium toxicity. This review summarizes associations between blood, urine, and tissue cadmium concentrations and several disease states, with emphasis on cadmium exposure due to tobacco use. Available data about zinc and cadmium/zinc ratios and tobacco-related diseases are summarized from studies reporting smoking status. Collectively, the data suggest that blood, urine, and tissue cadmium concentrations and cadmium/zinc ratios are often significantly different between smokers and nonsmokers, and that they also differ among smokers across several diseases and cancers. Additional biomonitoring data, such as blood or serum and urine zinc and cadmium levels and cadmium/zinc ratios in smokers, may provide further insight into the development and progression of diseases of the lung, cardiovascular system, and possibly other organs.

    • Zoonotic and Vectorborne Diseases
      1. Zoonotic and vector borne agents causing disease in adult patients hospitalized due to fever of unknown origin in Thailand
        Hinjoy S, Wacharapluesadee S, Iamsirithaworn S, Smithsuwan P, Padungtod P.
        Asian Pacific Journal of Tropical Disease. 2017 Oct 1;7(10):577-581.
        Objective: To determine the etiologic agents of fever of unknown origin among populations in agricultural communities and to assess the possible risk factors for zoonotic infections. Methods: Hospitalized patients with fever of unknown origin under physician care were asked to participate and provide blood samples for laboratory tests and screening for endemic diseases at the hospitals. Samples were stored at -80 °C until they were tested at Chulalongkorn University to identify additional pathogens. Results: We were able to identify the etiologic agents in 24.6% of the 463 enrolled patients. Zoonotic and vector borne agents were confirmed in 59 cases (12.7%). Dengue virus (7.3%) was the most frequently detected agent, followed by scrub typhus (3.2%). There were two cases of co-infection with scrub typhus and dengue fever. The other six cases of zoonoses were leptospirosis, melioidosis, and Streptococcus suis infections. Patients with zoonotic/vector borne agents noticed rats in their houses and reported having contact with livestock feces more frequently than patients without zoonotic/vector borne agents. Conclusions: Dengue virus and scrub typhus were mostly detected in the rainy season. During this season, clinicians should consider these diseases when patients are admitted to the hospital with fever of an unidentified source.

CDC Science Clips Production Staff

  • John Iskander, MD MPH, Editor
  • Gail Bang, MLIS, Librarian
  • Kathy Tucker, Librarian
  • William (Bill) Thomas, MLIS, Librarian
  • Onnalee Gomez, MS, Health Scientist
  • Jarvis Sims, MIT, MLIS, Librarian


DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinion, findings and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC Websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.