
Volume 12, Issue 2, January 21, 2020

CDC Science Clips: Volume 12, Issue 2, January 21, 2020

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention score to track social and mainstream media mentions!

  1. CDC Public Health Grand Rounds
    • Genetics and Genomics - Emerging Role of Pathogen Genomics in Public Health
      1. Pathogen genomics in public health
        Armstrong GL, MacCannell DR, Taylor J, Carleton HA, Neuhaus EB, Bradbury RS, Posey JE, Gwinn M.
        N Engl J Med. 2019 Dec 26;381(26):2569-2580.
        Rapid advances in DNA sequencing technology ("next-generation sequencing") have inspired optimism about the potential of human genomics for "precision medicine." Meanwhile, pathogen genomics is already delivering "precision public health" through more effective investigations of outbreaks of foodborne illnesses, better-targeted tuberculosis control, and more timely and granular influenza surveillance to inform the selection of vaccine strains. In this article, we describe how public health agencies have been adopting pathogen genomics to improve their effectiveness in almost all domains of infectious disease. This momentum is likely to continue, given the ongoing development in sequencing and sequencing-related technologies.

      2. Prediction of susceptibility to first-line tuberculosis drugs by DNA sequencing
        Allix-Beguec C, Arandjelovic I, Bi L, Beckert P, Bonnet M, Bradley P, Cabibbe AM, Cancino-Munoz I, Caulfield MJ, Chaiprasert A, Cirillo DM, Clifton DA, Comas I, Crook DW, De Filippo MR, de Neeling H, Diel R, Drobniewski FA, Faksri K, Farhat MR, Fleming J, Fowler P, Fowler TA, Gao Q, Gardy J, Gascoyne-Binzi D, Gibertoni-Cruz AL, Gil-Brusola A, Golubchik T, Gonzalo X, Grandjean L, He G, Guthrie JL, Hoosdally S, Hunt M, Iqbal Z, Ismail N, Johnston J, Khanzada FM, Khor CC, Kohl TA, Kong C, Lipworth S, Liu Q, Maphalala G, Martinez E, Mathys V, Merker M, Miotto P, Mistry N, Moore DA, Murray M, Niemann S, Omar SV, Ong RT, Peto TE, Posey JE, Prammananan T, Pym A, Rodrigues C, Rodrigues M, Rodwell T, Rossolini GM, Sanchez Padilla E, Schito M, Shen X, Shendure J, Sintchenko V, Sloutsky A, Smith EG, Snyder M, Soetaert K, Starks AM, Supply P, Suriyapol P, Tahseen S, Tang P, Teo YY, Thuong TN, Thwaites G, Tortoli E, van Soolingen D, Walker AS, Walker TM, Wilcox M, Wilson DJ, Wyllie D, Yang Y, Zhang H, Zhao Y, Zhu B.
        N Engl J Med. 2018 Oct 11;379(15):1403-1415.
        BACKGROUND: The World Health Organization recommends drug-susceptibility testing of Mycobacterium tuberculosis complex for all patients with tuberculosis to guide treatment decisions and improve outcomes. Whether DNA sequencing can be used to accurately predict profiles of susceptibility to first-line antituberculosis drugs has not been clear. METHODS: We obtained whole-genome sequences and associated phenotypes of resistance or susceptibility to the first-line antituberculosis drugs isoniazid, rifampin, ethambutol, and pyrazinamide for isolates from 16 countries across six continents. For each isolate, mutations associated with drug resistance and drug susceptibility were identified across nine genes, and individual phenotypes were predicted unless mutations of unknown association were also present. To identify how whole-genome sequencing might direct first-line drug therapy, complete susceptibility profiles were predicted. These profiles were predicted to be susceptible to all four drugs (i.e., pansusceptible) if they were predicted to be susceptible to isoniazid and to the other drugs or if they contained mutations of unknown association in genes that affect susceptibility to the other drugs. We simulated the way in which the negative predictive value changed with the prevalence of drug resistance. RESULTS: A total of 10,209 isolates were analyzed. The largest proportion of phenotypes was predicted for rifampin (9660 [95.4%] of 10,130) and the smallest was predicted for ethambutol (8794 [89.8%] of 9794). Resistance to isoniazid, rifampin, ethambutol, and pyrazinamide was correctly predicted with 97.1%, 97.5%, 94.6%, and 91.3% sensitivity, respectively, and susceptibility to these drugs was correctly predicted with 99.0%, 98.8%, 93.6%, and 96.8% specificity. Of the 7516 isolates with complete phenotypic drug-susceptibility profiles, 5865 (78.0%) had complete genotypic predictions, among which 5250 profiles (89.5%) were correctly predicted. 
Among the 4037 profiles that were predicted to be pansusceptible, 3952 (97.9%) were correctly predicted. CONCLUSIONS: Genotypic predictions of the susceptibility of M. tuberculosis to first-line drugs were found to be correlated with phenotypic susceptibility to these drugs. (Funded by the Bill and Melinda Gates Foundation and others.)
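The sensitivity, specificity, and prevalence-dependent negative predictive value discussed in this abstract follow from the standard diagnostic-accuracy definitions. As a minimal illustration in Python (not the study's actual analysis pipeline; function names and data are invented), the relationship can be sketched as:

```python
# Illustrative sketch: scoring genotypic resistance predictions against
# phenotypes, and showing how negative predictive value (NPV) shifts
# with the prevalence of drug resistance.

def sensitivity_specificity(pairs):
    """pairs: list of (predicted_resistant, truly_resistant) booleans."""
    tp = sum(p and t for p, t in pairs)
    fn = sum((not p) and t for p, t in pairs)
    tn = sum((not p) and (not t) for p, t in pairs)
    fp = sum(p and (not t) for p, t in pairs)
    return tp / (tp + fn), tn / (tn + fp)

def npv(sensitivity, specificity, prevalence):
    """NPV = P(truly susceptible | predicted susceptible), by Bayes' rule."""
    true_neg = specificity * (1 - prevalence)
    false_neg = (1 - sensitivity) * prevalence
    return true_neg / (true_neg + false_neg)
```

At the paper's isoniazid figures (97.1% sensitivity, 99.0% specificity), this shows why a "predicted susceptible" call is most reassuring where resistance is rare: NPV declines as the prevalence of resistance rises, which is the simulation the authors describe.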

      3. Next-generation sequencing technologies and their application to the study and control of bacterial infections
        Besser J, Carleton HA, Gerner-Smidt P, Lindsey RL, Trees E.
        Clin Microbiol Infect. 2018 Apr;24(4):335-341.
        BACKGROUND: With the efficiency and the decreasing cost of next-generation sequencing, the technology is being rapidly introduced into clinical and public health laboratory practice. AIMS: The historical background and principles of first-, second- and third-generation sequencing are described, as are the characteristics of the most commonly used sequencing instruments. SOURCES: Peer-reviewed literature, white papers and meeting reports. CONTENT AND IMPLICATIONS: Next-generation sequencing is a technology that could potentially replace many traditional microbiological workflows, providing clinicians and public health specialists with more actionable information than hitherto achievable. Examples of the clinical and public health uses of the technology are provided. The challenge of comparability of different sequencing platforms is discussed. Finally, the future directions of the technology integrating it with laboratory management and public health surveillance systems, and moving it towards performing sequencing directly from the clinical specimen (metagenomics), could lead to yet another fundamental transformation of clinical diagnostics and public health surveillance.

      4. A primer on microbial bioinformatics for nonbioinformaticians
        Carrico JA, Rossi M, Moran-Gilad J, Van Domselaar G, Ramirez M.
        Clin Microbiol Infect. 2018 Apr;24(4):342-349.
        BACKGROUND: Presently, the bottleneck in the deployment of high-throughput sequencing technology is the ability to analyse the increasing amount of data produced in a fit-for-purpose manner. The field of microbial bioinformatics is thriving and quickly adapting to technological changes, which creates difficulties for nonbioinformaticians in following the complexity and increasingly obscure jargon of this field. AIMS: This review is directed towards nonbioinformaticians who wish to gain understanding of the overall microbial bioinformatic processes, from raw data obtained from sequencers to final outputs. SOURCES: The software and analytical strategies reviewed are based on the personal experience of the authors. CONTENT: The bioinformatic processes of transforming raw reads to actionable information in a clinical and epidemiologic context is explained. We review the advantages and limitations of two major strategies currently applied: read mapping, which is the comparison with a predefined reference genome, and de novo assembly, which is the unguided assembly of the raw data. Finally, we discuss the main analytical methodologies and the most frequently used freely available software and its application in the context of bacterial infectious disease management. IMPLICATIONS: High-throughput sequencing technologies are overhauling outbreak investigation and epidemiologic surveillance while creating new challenges due to the amount and complexity of data generated. The continuously evolving field of microbial bioinformatics is required for stakeholders to fully harness the power of these new technologies.

      5. Restriction enzyme digestion of host DNA enhances universal detection of parasitic pathogens in blood via targeted amplicon deep sequencing
        Flaherty BR, Talundzic E, Barratt J, Kines KJ, Olsen C, Lane M, Sheth M, Bradbury RS.
        Microbiome. 2018 Sep 17;6(1):164.
        BACKGROUND: Targeted amplicon deep sequencing (TADS) of the 16S rRNA gene is commonly used to explore and characterize bacterial microbiomes. Meanwhile, attempts to apply TADS to the detection and characterization of entire parasitic communities have been hampered since conserved regions of many conserved parasite genes, such as the 18S rRNA gene, are also conserved in their eukaryotic hosts. As a result, targeted amplification of 18S rRNA from clinical samples using universal primers frequently results in competitive priming and preferential amplification of host DNA. Here, we describe a novel method that employs a single pair of universal primers to capture all blood-borne parasites while reducing host 18S rRNA template and enhancing the amplification of parasite 18S rRNA for TADS. This was achieved using restriction enzymes to digest the 18S rRNA gene at cut sites present only in the host sequence prior to PCR amplification. RESULTS: This method was validated against 16 species of blood-borne helminths and protozoa. Enzyme digestion prior to PCR enrichment and Illumina amplicon deep sequencing led to a substantial reduction in human reads and a corresponding 5- to 10-fold increase in parasite reads relative to undigested samples. This method allowed for discrimination of all common parasitic agents found in human blood, even in cases of multi-parasite infection, and markedly reduced the limit of detection in digested versus undigested samples. CONCLUSIONS: The results herein provide a novel methodology for the reduction of host DNA prior to TADS and establish the validity of a next-generation sequencing-based platform for universal parasite detection.
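The selective-digestion idea in this abstract can be pictured with a toy sketch: an enzyme cuts only templates containing its recognition site, so a site present in host 18S rRNA but absent from parasite 18S leaves the parasite template intact for PCR. The sequences and recognition site below are invented for illustration, and real enzymes cut within (not around) their site:

```python
# Toy model of restriction-based host-DNA depletion. The recognition
# site and sequences are invented; str.split drops the site itself,
# a simplification of how a real enzyme cleaves.

def digest(sequence, recognition_site):
    """Return the fragments produced by cutting at every site occurrence."""
    if recognition_site in sequence:
        return sequence.split(recognition_site)
    return [sequence]  # no site: template survives intact

host_18s = "AAGGTACCTT" + "GGCGCC" + "TTAACCGG"   # contains the invented site
parasite_18s = "AAGGTACCTTTTAACCGG"                # lacks the site

host_fragments = digest(host_18s, "GGCGCC")        # host template fragmented
parasite_fragments = digest(parasite_18s, "GGCGCC")  # parasite template intact
```

After digestion, only the intact parasite template remains a full-length substrate for the universal primers, which is the mechanism behind the reported drop in human reads.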

      6. Next-generation sequencing of infectious pathogens
        Gwinn M, MacCannell D, Armstrong GL.
        JAMA. 2019 Mar 5;321(9):893-894.

      7. Precision public health for the era of precision medicine
        Khoury MJ, Iademarco MF, Riley WT.
        Am J Prev Med. 2016 Mar;50(3):398-401.

      8. PulseNet and the changing paradigm of laboratory-based surveillance for foodborne diseases
        Kubota KA, Wolfgang WJ, Baker DJ, Boxrud D, Turner L, Trees E, Carleton HA, Gerner-Smidt P.
        Public Health Rep. 2019 Nov/Dec;134(2_suppl):22s-28s.
        PulseNet, the National Molecular Subtyping Network for Foodborne Disease Surveillance, was established in 1996 through a collaboration with the Centers for Disease Control and Prevention; the US Department of Agriculture, Food Safety and Inspection Service; the US Food and Drug Administration; 4 state public health laboratories; and the Association of Public Health Laboratories. The network has since expanded to include 83 state, local, and food regulatory public health laboratories. In 2016, PulseNet was estimated to prevent 270 000 foodborne illnesses annually. PulseNet is undergoing a transformation toward whole-genome sequencing (WGS), which provides better discriminatory power and precision than pulsed-field gel electrophoresis (PFGE). WGS improves the detection of outbreak clusters and could replace many traditional reference identification and characterization methods. This article highlights the contributions made by public health laboratories in transforming PulseNet's surveillance and describes how the transformation is changing local and national surveillance practices. Our data show that WGS is better at identifying clusters than PFGE, especially for clonal organisms such as Salmonella Enteritidis. The need to develop prioritization schemes for cluster follow-up and additional resources for both public health laboratory and epidemiology departments will be critical as PulseNet implements WGS for foodborne disease surveillance in the United States.

      9. Precision epidemiology for infectious disease control
        Ladner JT, Grubaugh ND, Pybus OG, Andersen KG.
        Nat Med. 2019 Feb;25(2):206-211.
        Advances in genomics and computing are transforming the capacity for the characterization of biological systems, and researchers are now poised for a precision-focused transformation in the way they prepare for, and respond to, infectious diseases. This includes the use of genome-based approaches to inform molecular diagnosis and individual-level treatment regimens. In addition, advances in the speed and granularity of pathogen genome generation have improved the capability to track and understand pathogen transmission, leading to potential improvements in the design and implementation of population-level public health interventions. In this Perspective, we outline several trends that are driving the development of precision epidemiology of infectious disease and their implications for scientists' ability to respond to outbreaks.

      10. In the decade and a half since the introduction of next-generation sequencing (NGS), the technical feasibility, cost, and overall utility of sequencing have changed dramatically, including applications for infectious disease epidemiology. Massively parallel sequencing technologies have decreased the cost of sequencing by more than 6 orders of magnitude over this time, with a corresponding increase in data generation and complexity. This review provides an overview of the basic principles, chemistry, and operational mechanics of current sequencing technologies, including both conventional Sanger and NGS approaches. As the generation of large amounts of sequence data becomes increasingly routine, the role of bioinformatics in data analysis and reporting becomes all the more critical, and the successful deployment of NGS in public health settings requires careful consideration of changing information technology, bioinformatics, workforce, and regulatory requirements. While there remain important challenges to the sustainable implementation of NGS in public health, in terms of both laboratory and bioinformatics capacity, the impact of these technologies on infectious disease surveillance and outbreak investigations has been nothing short of revolutionary. Understanding the important role that NGS plays in modern public health laboratory practice is critical, as is the need to ensure appropriate workforce, infrastructure, facilities, and funding consideration for routine NGS applications, future innovation, and rapidly scaling NGS-based infectious disease surveillance and outbreak response activities. *This article is part of a curated collection.

      11. Identifying clusters of recent and rapid HIV transmission through analysis of molecular surveillance data
        Oster AM, France AM, Panneer N, Banez Ocfemia MC, Campbell E, Dasgupta S, Switzer WM, Wertheim JO, Hernandez AL.
        J Acquir Immune Defic Syndr. 2018 Dec 15;79(5):543-550.
        BACKGROUND: Detecting recent and rapid spread of HIV can help prioritize prevention and early treatment for those at highest risk of transmission. HIV genetic sequence data can identify transmission clusters, but previous approaches have not distinguished clusters of recent, rapid transmission. We assessed an analytic approach to identify such clusters in the United States. METHODS: We analyzed 156,553 partial HIV-1 polymerase sequences reported to the National HIV Surveillance System and inferred transmission clusters using 2 genetic distance thresholds (0.5% and 1.5%) and 2 periods for diagnoses (all years and 2013-2015, ie, recent diagnoses). For rapidly growing clusters (with >/=5 diagnoses during 2015), molecular clock phylogenetic analysis estimated the time to most recent common ancestor for all divergence events within the cluster. Cluster transmission rates were estimated using these phylogenies. RESULTS: A distance threshold of 1.5% identified 103 rapidly growing clusters using all diagnoses and 73 using recent diagnoses; at 0.5%, 15 clusters were identified using all diagnoses and 13 using recent diagnoses. Molecular clock analysis estimated that the 13 clusters identified at 0.5% using recent diagnoses had been diversifying for a median of 4.7 years, compared with 6.5-13.2 years using other approaches. The 13 clusters at 0.5% had a transmission rate of 33/100 person-years, compared with previous national estimates of 4/100 person-years. CONCLUSIONS: Our approach identified clusters with transmission rates 8 times those of previous national estimates. This method can identify groups involved in rapid transmission and help programs effectively direct and prioritize limited public health resources.
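The clustering step described in this abstract, linking sequences whose pairwise genetic distance falls at or below a cutoff (0.5% or 1.5%) and treating connected components as transmission clusters, can be sketched with a toy union-find linkage. The distances and interface below are illustrative only, not the surveillance system's actual implementation:

```python
# Hypothetical sketch of threshold-based transmission-cluster inference:
# sequence pairs within a genetic-distance cutoff are linked, and
# connected components form clusters. Distances are toy values.

def clusters(n, distances, threshold):
    """n sequences; distances: dict {(i, j): distance}; union-find linkage."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for (i, j), d in distances.items():
        if d <= threshold:
            parent[find(i)] = find(j)  # union the two components

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    # Report only clusters of at least 2 linked sequences
    return [g for g in groups.values() if len(g) > 1]
```

Tightening the threshold (e.g., from 0.015 to 0.005) retains only the most closely related sequences, which is why the 0.5% clusters in the study represent more recent divergence than the 1.5% clusters.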

      12. Next-generation sequencing and bioinformatics protocol for malaria drug resistance marker surveillance
        Talundzic E, Ravishankar S, Kelley J, Patel D, Plucinski M, Schmedes S, Ljolje D, Clemons B, Madison-Antenucci S, Arguin PM, Lucchi NW, Vannberg F, Udhayakumar V.
        Antimicrob Agents Chemother. 2018 Apr;62(4).
        The recent advances in next-generation sequencing technologies provide a new and effective way of tracking malaria drug-resistant parasites. To take advantage of this technology, an end-to-end Illumina targeted amplicon deep sequencing (TADS) and bioinformatics pipeline for molecular surveillance of drug resistance in P. falciparum, called malaria resistance surveillance (MaRS), was developed. TADS relies on PCR enrichment of genomic regions, specifically target genes of interest, prior to deep sequencing. MaRS enables researchers to simultaneously collect data on allele frequencies of multiple full-length P. falciparum drug resistance genes (crt, mdr1, k13, dhfr, dhps, and the cytochrome b gene), as well as the mitochondrial genome. Information is captured at the individual patient level for both known and potential new single nucleotide polymorphisms associated with drug resistance. The MaRS pipeline was validated using 245 imported malaria cases that were reported to the Centers for Disease Control and Prevention (CDC). The chloroquine resistance crt CVIET genotype (mutations underlined) was observed in 42% of samples, the highly pyrimethamine-resistant dhfr IRN triple mutant in 92% of samples, and the sulfadoxine resistance dhps mutation SGEAA in 26% of samples. The mdr1 NFSND genotype was found in 40% of samples. With the exception of two cases imported from Cambodia, no artemisinin resistance k13 alleles were identified, and 99% of patients carried parasites susceptible to atovaquone-proguanil. Our goal is to implement MaRS at the CDC for routine surveillance of imported malaria cases in the United States and to aid in the adoption of this system at participating state public health laboratories, as well as by global partners.


  2. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Chronic Diseases and Conditions
      1. Platelet count variation and risk for coronary artery abnormalities in Kawasaki disease
        Ae R, Abrams JY, Maddox RA, Schonberger LB, Nakamura Y, Shindo A, Kuwabara M, Makino N, Matsubara Y, Kosami K, Sasahara T, Belay ED.
        Pediatr Infect Dis J. 2019 Dec 16.
        BACKGROUND: Platelet count is considered a biomarker for the development of coronary artery abnormalities (CAAs) among Kawasaki disease (KD) patients. However, previous studies have reported inconsistent results. We addressed the controversial association of platelet count with CAAs using a large-scale dataset. METHODS: A retrospective cohort study was conducted using KD survey data from Japan (2015-2016; n = 25,448). Classifying patients by intravenous immunoglobulin (IVIG) responsiveness, we described the trends in platelet count using the lowest and highest values along with the specific illness days. Multivariate logistic regression analysis was performed to evaluate the association between platelet count and CAAs, adjusting for relevant factors. RESULTS: Platelet counts rapidly decreased from admission, reached the lowest count at 6-7 days, and peaked after 10 days. Platelet counts in IVIG non-responders decreased with a lower minimum value than IVIG responders, but subsequently rebounded toward a higher maximum. Compared with patients with normal platelet counts (150-450 x 10^9/L), patients with abnormally high platelet counts (>450 x 10^9/L) were more likely to have CAAs at admission (adjusted odds ratio: IVIG responders, 1.50 [95% confidence interval 1.20-1.87] and non-responders, 1.46 [1.01-2.12]). By contrast, IVIG non-responding patients whose counts were below normal (<150 x 10^9/L) after hospitalization were at higher risk for developing CAAs (2.27 [1.44-3.58]). CONCLUSIONS: Platelet count varied widely by illness day and was confounded by IVIG responsiveness, which might have contributed to previous inconsistent findings. KD patients with abnormally high platelet counts at admission or abnormally low counts after hospitalization were at higher risk for CAAs.

      2. Current work hours and coronary artery calcification (CAC): The Multi-Ethnic Study of Atherosclerosis (MESA)
        Allison PJ, Jorgensen NW, Fekedulegn D, Landsbergis P, Andrew ME, Foy C, Hinckley Stukovsky K, Charles LE.
        Am J Ind Med. 2019 Dec 17.
        BACKGROUND: Long work hours may be associated with adverse outcomes, including cardiovascular disease. We investigated cross-sectional associations of current work hours with coronary artery calcification (CAC). METHODS: Participants (n = 3046; 54.6% men) were from the Multi-Ethnic Study of Atherosclerosis. The number of hours worked in all jobs was obtained by questionnaire and CAC from computed tomography. The probability of a positive CAC score was modeled using log-binomial regression. Positive scores were modeled using analysis of covariance and linear regression. RESULTS: Sixteen percent of the sample worked over 50 hours per week. The overall geometric mean CAC score was 5.2 +/- 10.0; 40% had positive scores. In fully-adjusted models, prevalence ratios were: <40 hours, 1.00 (confidence interval [CI]: 0.88-1.12); 40 hours, reference; 41-49 hours, 1.13 (CI: 0.99-1.30); and >/=50 hours, 1.07 (CI: 0.94-1.23). Longer current work hours were also not associated with higher mean CAC scores (<40 hours, 56.0 [CI: 47.3-66.3]; 40 hours, 57.8 [CI: 45.6-73.3]; 41-49 hours, 59.2 [CI: 45.2-77.6]; >/=50 hours, 51.2 [CI: 40.5-64.8]; P = .686). CONCLUSIONS: Current work hours were not independently associated with CAC scores.

      3. Estimates of incidence and mortality of cervical cancer in 2018: a worldwide analysis
        Arbyn M, Weiderpass E, Bruni L, de Sanjose S, Saraiya M, Ferlay J, Bray F.
        Lancet Glob Health. 2019 Dec 4.
        BACKGROUND: The knowledge that persistent human papillomavirus (HPV) infection is the main cause of cervical cancer has resulted in the development of prophylactic vaccines to prevent HPV infection and HPV assays that detect nucleic acids of the virus. WHO has launched a Global Initiative to scale up preventive, screening, and treatment interventions to eliminate cervical cancer as a public health problem during the 21st century. Therefore, our study aimed to assess the existing burden of cervical cancer as a baseline from which to assess the effect of this initiative. METHODS: For this worldwide analysis, we used data of cancer estimates from 185 countries from the Global Cancer Observatory 2018 database. We used a hierarchy of methods dependent on the availability and quality of the source information from population-based cancer registries to estimate incidence of cervical cancer. For estimation of cervical cancer mortality, we used the WHO mortality database. Countries were grouped in 21 subcontinents and were also categorised as high-resource or lower-resource countries, on the basis of their Human Development Index. We calculated the number of cervical cancer cases and deaths in a given country, directly age-standardised incidence and mortality rate of cervical cancer, indirectly standardised incidence ratio and mortality ratio, cumulative incidence and mortality rate, and average age at diagnosis. FINDINGS: Approximately 570 000 cases of cervical cancer and 311 000 deaths from the disease occurred in 2018. Cervical cancer was the fourth most common cancer in women, ranking after breast cancer (2.1 million cases), colorectal cancer (0.8 million) and lung cancer (0.7 million). The estimated age-standardised incidence of cervical cancer was 13.1 per 100 000 women globally and varied widely among countries, with rates ranging from less than 2 to 75 per 100 000 women. 
Cervical cancer was the leading cause of cancer-related death in women in eastern, western, middle, and southern Africa. The highest incidence was estimated in Eswatini, with approximately 6.5% of women developing cervical cancer before age 75 years. China and India together contributed more than a third of the global cervical cancer burden, with 106 000 cases in China and 97 000 cases in India, and 48 000 deaths in China and 60 000 deaths in India. Globally, the average age at diagnosis of cervical cancer was 53 years, ranging from 44 years (Vanuatu) to 68 years (Singapore). The global average age at death from cervical cancer was 59 years, ranging from 45 years (Vanuatu) to 76 years (Martinique). Cervical cancer ranked in the top three cancers affecting women younger than 45 years in 146 (79%) of 185 countries assessed. INTERPRETATION: Cervical cancer continues to be a major public health problem affecting middle-aged women, particularly in less-resourced countries. The global scale-up of HPV vaccination and HPV-based screening-including self-sampling-has potential to make cervical cancer a rare disease in the decades to come. Our study could help shape and monitor the initiative to eliminate cervical cancer as a major public health problem. FUNDING: Belgian Foundation Against Cancer, DG Research and Innovation of the European Commission, and The Bill & Melinda Gates Foundation.
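The "age-standardised incidence" quoted in this abstract comes from direct standardisation: age-specific rates are weighted by a standard population's age structure so that countries with different age distributions can be compared. A minimal sketch, using invented numbers rather than the study's data:

```python
# Direct age standardisation: weight each age band's incidence rate by
# the standard population's share of that band. All numbers below are
# invented for illustration.

def age_standardised_rate(cases, person_years, std_weights):
    """cases/person_years given per age band; std_weights sum to 1."""
    assert len(cases) == len(person_years) == len(std_weights)
    rate = sum(w * (c / py)
               for c, py, w in zip(cases, person_years, std_weights))
    return rate * 100_000  # expressed per 100 000, as in the abstract
```

Because the weights come from a shared standard population, a country with an older population does not appear to have higher incidence merely because cancer risk rises with age.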

      4. Prevalence of diabetes by race and ethnicity in the United States, 2011-2016
        Cheng YJ, Kanaya AM, Araneta MR, Saydah SH, Kahn HS, Gregg EW, Fujimoto WY, Imperatore G.
        JAMA. 2019 Dec 24;322(24):2389-2398.
        Importance: The prevalence of diabetes among Hispanic and Asian American subpopulations in the United States is unknown. Objective: To estimate racial/ethnic differences in the prevalence of diabetes among US adults 20 years or older by major race/ethnicity groups and selected Hispanic and non-Hispanic Asian subpopulations. Design, Setting, and Participants: National Health and Nutrition Examination Surveys, 2011-2016, cross-sectional samples representing the noninstitutionalized, civilian, US population. The sample included adults 20 years or older who had self-reported diagnosed diabetes during the interview or measurements of hemoglobin A1c (HbA1c), fasting plasma glucose (FPG), and 2-hour plasma glucose (2hPG). Exposures: Race/ethnicity groups: non-Hispanic white, non-Hispanic black, Hispanic and Hispanic subgroups (Mexican, Puerto Rican, Cuban/Dominican, Central American, and South American), non-Hispanic Asian and non-Hispanic Asian subgroups (East, South, and Southeast Asian), and non-Hispanic other. Main Outcomes and Measures: Diagnosed diabetes was based on self-reported prior diagnosis. Undiagnosed diabetes was defined as HbA1c 6.5% or greater, FPG 126 mg/dL or greater, or 2hPG 200 mg/dL or greater in participants without diagnosed diabetes. Total diabetes was defined as diagnosed or undiagnosed diabetes. Results: The study sample included 7575 US adults (mean age, 47.5 years; 52% women; 2866 [65%] non-Hispanic white, 1636 [11%] non-Hispanic black, 1952 [15%] Hispanic, 909 [6%] non-Hispanic Asian, and 212 [3%] non-Hispanic other). A total of 2266 individuals had diagnosed diabetes; 377 had undiagnosed diabetes. Weighted age- and sex-adjusted prevalence of total diabetes was 12.1% (95% CI, 11.0%-13.4%) for non-Hispanic white, 20.4% (95% CI, 18.8%-22.1%) for non-Hispanic black, 22.1% (95% CI, 19.6%-24.7%) for Hispanic, and 19.1% (95% CI, 16.0%-22.1%) for non-Hispanic Asian adults (overall P < .001). 
Among Hispanic adults, the prevalence of total diabetes was 24.6% (95% CI, 21.6%-27.6%) for Mexican, 21.7% (95% CI, 14.6%-28.8%) for Puerto Rican, 20.5% (95% CI, 13.7%-27.3%) for Cuban/Dominican, 19.3% (95% CI, 12.4%-26.1%) for Central American, and 12.3% (95% CI, 8.5%-16.2%) for South American subgroups (overall P < .001). Among non-Hispanic Asian adults, the prevalence of total diabetes was 14.0% (95% CI, 9.5%-18.4%) for East Asian, 23.3% (95% CI, 15.6%-30.9%) for South Asian, and 22.4% (95% CI, 15.9%-28.9%) for Southeast Asian subgroups (overall P = .02). The prevalence of undiagnosed diabetes was 3.9% (95% CI, 3.0%-4.8%) for non-Hispanic white, 5.2% (95% CI, 3.9%-6.4%) for non-Hispanic black, 7.5% (95% CI, 5.9%-9.1%) for Hispanic, and 7.5% (95% CI, 4.9%-10.0%) for non-Hispanic Asian adults (overall P < .001). Conclusions and Relevance: In this nationally representative survey of US adults from 2011 to 2016, the prevalence of diabetes and undiagnosed diabetes varied by race/ethnicity and among subgroups identified within the Hispanic and non-Hispanic Asian populations.
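The case definitions stated in this abstract translate directly into a classification rule: undiagnosed diabetes is any of HbA1c >= 6.5%, FPG >= 126 mg/dL, or 2hPG >= 200 mg/dL in a participant without a prior diagnosis, and total diabetes is diagnosed or undiagnosed. A sketch (the thresholds are the abstract's; the function name and interface are illustrative, not the study's code):

```python
# Classification rule implied by the abstract's case definitions.
# Missing lab values (None) simply cannot trigger the lab criterion.

def classify(diagnosed, hba1c=None, fpg=None, two_hr_pg=None):
    """diagnosed: self-reported prior diagnosis; labs in %, mg/dL, mg/dL."""
    labs_elevated = any([
        hba1c is not None and hba1c >= 6.5,      # HbA1c criterion
        fpg is not None and fpg >= 126,          # fasting plasma glucose
        two_hr_pg is not None and two_hr_pg >= 200,  # 2-hour plasma glucose
    ])
    undiagnosed = (not diagnosed) and labs_elevated
    return {"undiagnosed": undiagnosed, "total": diagnosed or undiagnosed}
```

Applied over survey participants with sampling weights, counting these two flags yields the diagnosed, undiagnosed, and total prevalence estimates the study reports.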

      5. PURPOSE: To examine self-reported oral health among adults aged 40 years and older with and without vision impairment. DESIGN: Cross-sectional, with a nationally representative sample. METHODS: We used publicly available data from the Oral Health Module, last administered in 2008, of the National Health Interview Survey. Outcome variables included fair/poor oral health status, mouth condition compared to others the same age, mouth problems (mouth sores, difficulty eating, dry mouth, bad breath, and/or jaw pain), teeth problems (toothache; broken/missing fillings or teeth; loose, crooked, or stained teeth; and/or bleeding gums), and lack of social participation. Using descriptive statistics and multivariate logistic regression, we examined the association (P < .05) between vision impairment and oral health outcomes by age group, sociodemographics, and other explanatory variables. RESULTS: Our study sample included 12,090 adults; 12.8% of adults aged 40-64 years reported vision impairment, and among them, 44.5% reported fair/poor oral health status and 47.2% reported any mouth problems. Among adults aged >/=65 years, 17.3% reported vision impairment, of whom 36.3% reported fair/poor oral health status and 57.3% reported any mouth problems. There is a strong association between vision impairment and poorer oral health of adults; adults aged 40-64 years with vision impairment reported 90%-150% greater odds of oral health problems, including fair/poor oral health status, mouth problems, and teeth problems, compared to people without vision impairment. CONCLUSIONS: Oral health disparities exist between adults with and without vision impairment. Targeted interventions are required to improve oral health in this vulnerable population.

      6. Recent epidemiologic trends in periodontitis in the USA
        Eke PI, Borgnakke WS, Genco RJ.
        Periodontol 2000. 2020 Feb;82(1):257-267.
        The most important development in the epidemiology of periodontitis in the USA during the last decade is the result of improvements in survey methodologies and statistical modeling of periodontitis in adults. Most of these advancements have occurred as the direct outcome of work by the joint initiative known as the Periodontal Disease Surveillance Project by the Centers for Disease Control and Prevention and the American Academy of Periodontology that was established in 2006. This report summarizes some of the key findings of this important initiative and its impact on our knowledge of the epidemiology of periodontitis in US adults. This initiative first suggested new periodontitis case definitions for surveillance in 2007 and revised them slightly in 2012. This classification is now regarded as the global standard for periodontitis surveillance and is used worldwide. First, application of such a standard in reporting finally enables results from different researchers in different countries to be meaningfully compared. Second, this initiative tackled the concern that prior national surveys, which used partial-mouth periodontal examination protocols, grossly underestimated the prevalence of periodontitis by potentially more than 50%. Consequently, because previous national surveys significantly underestimated the true prevalence of periodontitis, it is not possible to extrapolate any trend in periodontitis prevalence in the USA over time. Any difference calculated may not represent any actual change in periodontitis prevalence, but rather is a consequence of using different periodontal examination protocols. Finally, the initiative addressed the need for state and local data on periodontitis prevalence.
Through the direct efforts of the Centers for Disease Control and Prevention and the American Academy of Periodontology initiative, full-mouth periodontal probing at six sites around all nonthird molar teeth was included in the 6 years of National Health and Nutrition Examination Surveys from 2009-2014, yielding complete data for 10,683 dentate community-dwelling US adults aged 30 to 79 years. Applying the 2012 periodontitis case definitions to the 2009-2014 National Health and Nutrition Examination Surveys data, the periodontitis prevalence turned out to be much greater than previously estimated, namely 42.2% of the population, with 7.8% experiencing severe periodontitis. It was also discovered that only the moderate type of periodontitis drives the increase in periodontitis prevalence with age; the mild and severe types do not increase consistently with age but remain at ~10%-15% in all age groups of 40 years and older. The greatest risk for having periodontitis of any type was seen in older people, in males, in minority race/ethnic groups, in poorer and less educated groups, and especially in cigarette smokers. The Centers for Disease Control and Prevention and the American Academy of Periodontology initiative reported, for the first time, the periodontitis prevalence estimated at both local and state levels, in addition to the national level. Also, this initiative developed and validated in field studies a set of eight items for self-reported periodontitis for use in direct survey estimates of periodontitis prevalence in existing state-based surveys. These items were also included in the 2009-2014 National Health and Nutrition Examination Surveys for validation against clinically determined cases of periodontitis.
Another novel result of this initiative is that, for the first time, the geographic distribution of practicing periodontists has been mapped in relation to the geographic distribution of people with severe periodontitis. In summary, the precise periodontitis prevalence and distribution among subgroups in the dentate US noninstitutionalized population aged 30-79 years are better understood because of the application of valid periodontitis case definitions to full-mouth periodontal examination, in combination with reliable information on demographic and health-related measures. We now can monitor the trend of periodontitis prevalence over time as well as guide public health preventive and intervention initiatives for the betterment of the health of the adult US population.

      7. RATIONALE & OBJECTIVE: Dialysis-requiring acute kidney injury (AKI-D) has increased substantially in the United States. We examined trends in and comorbid conditions associated with hospitalizations and in-hospital mortality in the setting of AKI-D among people with versus without diabetes. STUDY DESIGN: Cross-sectional study. SETTING & PARTICIPANTS: Nationally representative data from the National Inpatient Sample and National Health Interview Survey were used to generate 16 cross-sectional samples of US adults (aged >/=18 years) between 2000 and 2015. EXPOSURE: Diabetes, defined using International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnosis codes. OUTCOME: AKI-D, defined using ICD-9-CM diagnosis and procedure codes. ANALYTICAL APPROACH: Annual age-standardized rates of AKI-D and AKI-D mortality were calculated for adults with and without diabetes, by age and sex. Data were weighted to be representative of the US noninstitutionalized population. Trends were assessed using joinpoint regression with annual percent change (Δ/y) reported. RESULTS: In adults with diabetes, AKI-D increased between 2000 and 2015 (from 26.4 to 41.1 per 100,000 persons; Δ/y, 3.3%; P < 0.001), with relative increases greater in younger versus older adults. In adults without diabetes, AKI-D increased between 2000 and 2009 (from 4.8 to 8.7; Δ/y, 6.5%; P < 0.001) and then plateaued. AKI-D mortality significantly declined in people with and without diabetes. In adults with and without diabetes, the proportion of AKI-D hospitalizations with liver, rheumatic, and kidney disease comorbid conditions increased between 2000 and 2015, while the proportion of most cardiovascular comorbid conditions decreased. LIMITATIONS: Lack of laboratory data to corroborate AKI diagnosis; National Inpatient Sample data are hospital-level rather than person-level data; no data for type of diabetes; residual unmeasured confounding.
CONCLUSIONS: Hospitalization rates for AKI-D have increased considerably while mortality has decreased in adults with and without diabetes. Hospitalization rates for AKI-D remain substantially higher in adults with diabetes. Greater AKI risk-factor mitigation is needed, especially in young adults with diabetes.
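The annual percent change (Δ/y) used in the trend analysis above is, between joinpoints, the slope of a log-linear regression of rate on year, back-transformed to a percentage. A minimal sketch of that calculation, using hypothetical rates (a full joinpoint analysis would additionally search for change points):

```python
import math

def annual_percent_change(years, rates):
    """Fit log(rate) = a + b*year by least squares; APC = 100*(exp(b) - 1)."""
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(math.log(r) for r in rates) / n
    sxy = sum((x - xbar) * (math.log(r) - ybar) for x, r in zip(years, rates))
    sxx = sum((x - xbar) ** 2 for x in years)
    b = sxy / sxx  # slope on the log scale
    return 100.0 * (math.exp(b) - 1.0)

# Hypothetical series: a rate growing ~3% per year from a 2000 baseline of 26.4
years = list(range(2000, 2016))
rates = [26.4 * 1.03 ** (y - 2000) for y in years]
print(f"APC = {annual_percent_change(years, rates):.1f}%")  # ≈ 3.0%
```

Because the fit is on the log scale, the APC describes a constant multiplicative (percentage) change per year rather than a constant absolute change.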

      8. The CDC Colorectal Cancer Control Program, 2009-2015
        Joseph DA, DeGroff A.
        Prev Chronic Dis. 2019 Dec 5;16:E159.

      9. Psychometric evaluation of the National Institutes of Health Patient-Reported Outcomes Measurement Information System in a multiracial, multiethnic systemic lupus erythematosus cohort
        Katz P, Yazdany J, Trupin L, Rush S, Helmick CG, Murphy LB, Lanata C, Criswell LA, Dall'Era M.
        Arthritis Care Res (Hoboken). 2019 Dec;71(12):1630-1639.
        OBJECTIVE: We examined psychometric performance of Patient-Reported Outcomes Measurement Information System (PROMIS) measures in a racially/ethnically and linguistically diverse cohort with systemic lupus erythematosus (SLE). METHODS: Data were from the California Lupus Epidemiology Study, a multiracial/multiethnic cohort of individuals with physician-confirmed SLE. The majority (n = 332) attended in-person research visits that included interviews conducted in English, Spanish, Cantonese, or Mandarin. Up to 12 PROMIS short forms were administered (depending on language availability). An additional 99 individuals completed the interview by phone only. Internal consistency was examined with Cronbach's alpha and item-total correlations. Correlations with the Short Form 36 subscales and both self-reported and physician-assessed disease activity assessed convergent validity. All analyses were repeated within each racial/ethnic group. Differences in scores by race/ethnicity were examined in bivariate analyses and by multiple regression analyses controlling for age, sex, disease duration, and disease damage and activity. RESULTS: The total sample was 30.0% white, 22.3% Hispanic, 10.9% African American, 33.7% Asian, and 3.0% other race/ethnicity. Seventy-seven percent of interviews were conducted in-person. Non-English interviews were conducted in 26.0% of the Hispanic subjects and 18.6% of the Asian subjects. Each scale demonstrated adequate reliability and validity overall and within racial/ethnic groups. Minimal floor effects were observed, but ceiling effects were noted. Missing item responses were minimal for most scales, except for items related to work. No differences were noted by mode of administration or by language of administration among Hispanics and Asians. After accounting for differences in disease status, age, and sex, few differences in mean scores between whites and other racial/ethnic groups were noted. 
CONCLUSION: PROMIS measures appear reliable and valid in persons with lupus across racial/ethnic groups.

      10. Awareness of the link between breast cancer and risk factors such as family history of breast cancer and alcohol consumption may help modify health behaviors. To reduce risk factors for breast cancer among young women, it is important to understand overall levels of risk awareness and socioeconomic differences in awareness. Data from the National Survey of Family Growth 2011-2015 were used to examine awareness of two risk factors for breast cancer, positive family history and alcohol consumption, among women aged 15-44 years (n = 10,940) in the United States by presence of risk factors and by socioeconomic characteristics. The prevalence of positive family history, non-binge drinking, and binge drinking was 30%, 29%, and 31%, respectively, among women aged 15-44. Awareness of positive family history as a risk factor for breast cancer was 88%, whereas awareness of alcohol consumption was 25%. Awareness of family history as a risk factor was higher among women with a positive family history of breast cancer compared to those without. Current drinkers were more likely to believe that alcohol was not a risk factor for breast cancer compared to those who did not drink. Racial/ethnic minority women and those with lower education and income had lower awareness of family history as a risk factor. Awareness of alcohol consumption as a risk factor for breast cancer was low across all socioeconomic groups. Evidence-based interventions to increase risk awareness and decrease excessive alcohol use among young women are needed to reduce the risk of developing breast cancer.

      11. Five-year U.S. trends in the North American Cancer Survival Index, 2005-2014
        Morawski BM, Weir HK, Johnson CJ.
        Am J Prev Med. 2019 Dec 10.
        INTRODUCTION: Progress in U.S. 5-year survival trends for all cancers combined was assessed using the North American Cancer Survival Index, a sum of age-, sex-, and cancer site-standardized relative survival ratios. METHODS: In January 2019, the authors calculated 5-year cancer survival indices and 95% CIs by race and sex for 2005-2011, 2006-2012, 2007-2013, and 2008-2014 diagnosis cohorts with data from 42 cancer registries. RESULTS: Overall 5-year survival increased from 63.5% (95% CI=63.4, 63.5) in 2005-2011 to 64.1% (95% CI=64.1, 64.2) in 2008-2014. Survival increased 0.9 and 0.5 percentage points in female and male patients, respectively; the survival disparity between blacks and whites decreased by 0.5%. In 2008-2014, the Cancer Survival Index was 7.7% higher for whites (64.6%; 95% CI=64.6, 64.7) than for blacks (56.9%; 95% CI=56.7, 57.1). CONCLUSIONS: Cancer Survival Index survival estimates increased among all race and sex subpopulations during 2005-2014. A substantial but decreasing survival gap persisted between blacks and whites. The Cancer Survival Index can assist decision makers and others in comparing cancer survival among populations and over time and in monitoring progress toward national cancer surveillance objectives.

      12. Combating gastric cancer in Alaska Native people: An expert and community symposium: Alaska Native Gastric Cancer Symposium
        Nolen LD, Vindigni SM, Parsonnet J, Bruce MG, Martinson HA, Thomas TK, Sacco F, Nash S, Olnes MJ, Miernyk K, Bruden D, Ramaswamy M, McMahon B, Goodman KJ, Bass AJ, Hur C, Inoue M, Camargo MC, Cho SJ, Parnell K, Allen E, Woods T, Melkonian S.
        Gastroenterology. 2019 Dec 10.
        Alaska Native (AN) people experience higher incidence of, and mortality from, gastric cancer compared to other U.S. populations(1, 2). Compared to the general U.S. population, gastric cancer in AN people occurs at a younger age, is diagnosed at later stages, is more evenly distributed between the sexes, and is more frequently signet-ring or diffuse histology(3). It is known that the prevalence of Helicobacter pylori (Hp) infection, a risk factor for gastric cancer, is high in AN people(4); however, high antimicrobial resistance combined with high reinfection rates in Alaska make treatment at the population level complex(5). In addition, health issues in AN people are uniquely challenging due to the extremely remote locations of many residents. A multiagency workgroup hosted a symposium in Anchorage that brought internationally-recognized experts and local leaders together to evaluate issues around gastric cancer in the AN population. The overall goal of this symposium was to identify the best strategies to combat gastric cancer in the AN population through prevention and early diagnosis.

      13. State-specific prevalence and characteristics of frequent mental distress and history of depression diagnosis among adults with arthritis - United States, 2017
        Price JD, Barbour KE, Liu Y, Lu H, Amerson NL, Murphy LB, Helmick CG, Calanan RM, Sandoval-Rosario M, Samanic CM, Greenlund KJ, Thomas CW.
        MMWR Morb Mortal Wkly Rep. 2020 Jan 3;68(5152):1173-1178.

      14. INTRODUCTION: In the United States, children in Puerto Rico and non-Hispanic black children in the mainland US have a higher burden of asthma than non-Hispanic white children in the mainland US. We examined indoor environmental control (IEC) practices that reduce asthma triggers, by race/ethnicity among children in the mainland US and Puerto Rico. METHODS: We used 2013 and 2014 data from the Behavioral Risk Factor Surveillance System Asthma Call-back Survey Child Questionnaire from 14 states and Puerto Rico to measure the association between race/ethnicity and IEC practices, adjusting for sociodemographic covariates, among children identified as ever receiving an asthma diagnosis. Racial/ethnic groups were compared in 14 US states using aggregated data. Separate analyses compared IEC practices for children diagnosed with asthma in Puerto Rico with children of all races/ethnicities diagnosed with asthma in 14 states. RESULTS: Among households in 14 US states that had a child with asthma, non-Hispanic black children were more likely than non-Hispanic white children to use an air purifier (36.8% vs 25.2%; adjusted odds ratio [aOR] = 2.0; 95% confidence interval [CI], 1.3-3.2) and avoid pets in the bedroom (87.9% vs 58.3%; aOR = 4.5; 95% CI, 2.3-8.8). Children in Puerto Rico were more likely than children in 14 states to use dust mite-impermeable pillow covers (53.7% vs 36.4%; aOR = 3.6; 95% CI, 1.8-7.1) and mattress encasements (60.3% vs 30.3%; aOR = 2.4; 95% CI, 1.2-4.8). CONCLUSION: IEC practices such as using air purifiers, pillow covers, mattress encasements, and avoiding pets in the bedroom vary by race/ethnicity among children with asthma. These findings show that vulnerable populations are using IEC practices, but asthma prevention and control measures should continue to be assessed.
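The study above reports odds ratios adjusted for sociodemographic covariates. As a rough illustration of the underlying quantity, the crude (unadjusted) odds ratio implied by two group proportions can be computed directly; `crude_odds_ratio` is a hypothetical helper, and the crude value is expected to differ from the adjusted estimates quoted in the abstract:

```python
def crude_odds_ratio(p1, p2):
    """Unadjusted odds ratio from two group proportions (0 < p < 1)."""
    return (p1 / (1 - p1)) / (p2 / (1 - p2))

# Air purifier use: 36.8% (non-Hispanic black) vs 25.2% (non-Hispanic white)
or_purifier = crude_odds_ratio(0.368, 0.252)
print(f"crude OR = {or_purifier:.2f}")  # smaller than the reported aOR of 2.0
```

The gap between a crude OR like this and a reported adjusted OR reflects confounding by the covariates the regression model controls for.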

      15. Longitudinal changes in glucose metabolism in women with gestational diabetes, from late pregnancy to the postpartum period
        Waters TP, Kim SY, Sharma AJ, Schnellinger P, Bobo JK, Woodruff RT, Cubbins LA, Haghiac M, Minium J, Presley L, Wolfe H, Hauguel-de Mouzon S, Adams W, Catalano PM.
        Diabetologia. 2019 Dec 9.
        AIMS/HYPOTHESIS: This study aimed to determine, in women with gestational diabetes (GDM), the changes in insulin sensitivity (Matsuda Insulin Sensitivity Index; ISOGTT), insulin response and disposition index (DI) from late pregnancy (34-37 weeks gestation, T1), to early postpartum (1-5 days, T2) and late postpartum (6-12 weeks, T3). A secondary aim was to correlate the longitudinal changes in maternal lipids, adipokines, cytokines and weight in relation to the changes in ISOGTT, insulin response and DI. METHODS: ISOGTT, insulin response and DI were calculated at the three time points (T1, T2 and T3) using the results of a 75 g OGTT. Adipokines, cytokines and lipids were measured prior to each OGTT. Linear mixed-effects models were used to compare changes across each time point. Changes in ISOGTT, insulin response and DI were correlated with changes in maternal adipokines, cytokines and lipids at each time point. RESULTS: A total of 27 women completed all assessments. Compared with T1, ISOGTT was 11.20 (95% CI 8.09, 14.31) units higher at 1-5 days postpartum (p < 0.001) and was 5.49 (95% CI 2.38, 8.60) units higher at 6-12 weeks postpartum (p < 0.001). Compared with T1, insulin response values were 699.6 (95% CI -957.5, -441.6) units lower at T2 (p < 0.001) and were 356.3 (95% CI -614.3, -98.3) units lower at T3 (p = 0.004). Compared with T1, the DI was 6434.1 (95% CI 2486.2, 10,381.0) units higher at T2 (p = 0.001) and was 4262.0 (95% CI 314.6, 8209.3) units higher at T3 (p = 0.03). There was a decrease in mean cholesterol, triacylglycerol, LDL-cholesterol and VLDL-cholesterol from T1 to T2 (all p < 0.001), and an increase in mean C-reactive protein, IL-6 and IL-8 from T1 to T2 (all p < 0.001). Mean leptin decreased from T1 to T2 (p = 0.001). There was no significant change in mean adiponectin (p = 0.99) or TNF-alpha (p = 0.81) from T1 to T2. The mean maternal BMI decreased from T1 to T2 (p = 0.001) and T3 (p < 0.001).
There were no significant correlations between any measure of change in ISOGTT, insulin response and DI and change in maternal cytokines, adipokines, lipids or weight from T1 to T2. CONCLUSIONS/INTERPRETATION: In women with GDM, delivery was associated with improvement in both insulin sensitivity and insulin production within the first few days. Improvement in insulin production persisted for 6-12 weeks, but insulin sensitivity deteriorated slightly. These changes in glucose metabolism were not associated with changes in lipids, leptin, inflammation markers or body weight. TRIAL REGISTRATION: ClinicalTrials.gov NCT02082301.

      16. Hospitalizations for inflammatory bowel disease among Medicare fee-for-service beneficiaries - United States, 1999-2017
        Xu F, Wheaton AG, Liu Y, Lu H, Greenlund KJ.
        MMWR Morb Mortal Wkly Rep. 2019 Dec 13;68(49):1134-1138.
        Crohn's disease and ulcerative colitis, collectively referred to as inflammatory bowel disease (IBD), are conditions characterized by chronic inflammation of the gastrointestinal tract. The incidence and prevalence of IBD are increasing globally, and although the disease has little impact on mortality, the number of older adults with IBD is expected to increase as the U.S. population ages (1). Older adults with IBD have worse hospitalization outcomes than do their younger counterparts (2). CDC analyzed Medicare Provider Analysis and Review (MedPAR) data to estimate IBD-related hospitalization rates and hospitalization outcomes in 2017 among Medicare fee-for-service beneficiaries aged >/=65 years, by selected demographics, and trends in hospitalization rates by race/ethnicity during 1999-2017. In 2017, the age-adjusted hospitalization rate for Crohn's disease was 15.5 per 100,000 Medicare enrollees, and the IBD-associated surgery rate was 17.4 per 100 hospital stays. The age-adjusted hospitalization rate for ulcerative colitis was 16.2 per 100,000 Medicare enrollees, and the surgery rate was 11.2 per 100 stays. During 1999-2017, the hospitalization rate for both Crohn's disease and ulcerative colitis decreased among non-Hispanic white (white) beneficiaries, but not among non-Hispanic black (black) beneficiaries. Health care utilization assessment is needed among black beneficiaries with IBD. Disease management for older adults with IBD could focus on increasing preventive care and preventing emergency surgeries that might result in further complications.

    • Communicable Diseases
      1. Tuberculosis treatment outcomes among people living with HIV diagnosed using Xpert MTB/RIF versus sputum-smear microscopy in Botswana: a stepped-wedge cluster randomised trial
        Agizew T, Chihota V, Nyirenda S, Tedla Z, Auld AF, Mathebula U, Mathoma A, Boyd R, Date A, Pals SL, Lekone P, Finlay A.
        BMC Infect Dis. 2019 Dec 16;19(1):1058.
        BACKGROUND: Xpert(R) MTB/RIF (Xpert) has high sensitivity for diagnosing tuberculosis (TB) compared to sputum-smear microscopy (smear) and can reduce time-to-diagnosis, time-to-treatment and potentially unfavorable patient-level treatment outcomes. METHODS: People living with HIV (PLHIV) initiating antiretroviral therapy at 22 HIV clinics were enrolled and underwent systematic screening for TB (August 2012-November 2014). GeneXpert instruments were deployed following a stepped-wedge design at 13 centers from October 2012-June 2013. Treatment outcomes were classified as unfavorable (died, treatment failure or loss to follow-up) or favorable (cured or treatment completed). To determine outcome, smear was performed at month 5 or 6. Empiric treatment was defined as initiating treatment without/before receiving TB-positive results. Adjusting for intra-facility correlation, we compared patient-level treatment outcomes between patients screened using smear-based (smear arm) and Xpert-based (Xpert arm) algorithms. RESULTS: Among 6041 patients enrolled (smear arm, 1816; Xpert arm, 4225), 256 (199 per 2985 and 57 per 1582 person-years of follow-up in the Xpert and smear arms, respectively; adjusted incidence rate ratio, 9.07; 95% confidence interval [CI]: 4.70-17.48; p < 0.001) received a TB diagnosis and were treated. TB treatment outcomes were available for 203 patients (79.3%; Xpert, 157; smear, 46). Unfavorable outcomes were reported for 21.7% (10/46) in the smear arm and 13.4% (21/157) in the Xpert arm (adjusted hazard ratio, 1.40; 95% CI: 0.75-2.26; p = 0.268). Compared to the smear arm, the median time from sputum collection to TB treatment in the Xpert arm was 6 days (interquartile range [IQR] 2-17) versus 22 days (IQR 3-51), p = 0.005, and patients with an available sputum test result had microbiologically confirmed TB in 59.0% (102/173) versus 41.9% (18/43) (adjusted odds ratio [aOR], 2.00; 95% CI: 1.01-3.96; p = 0.048).
Empiric treatment was more common in the smear arm (68.4% [39/57]) than in the Xpert arm (48.7% [97/199]; aOR, 2.28; 95% CI: 1.24-4.20; p = 0.011). CONCLUSIONS: TB treatment outcomes were similar between the smear and Xpert arms. However, compared to the smear arm, more patients in the Xpert arm received a TB diagnosis, more had microbiologically confirmed TB, time-to-treatment was shorter, and empiric treatment was less common. Further research is recommended to identify potential gaps in the Botswana health system and similar settings. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT02538952. Retrospectively registered on 2 September 2015.

      2. Interferon-gamma release assays in children <15 years of age
        Ahmed A, Feng PI, Gaensbauer JT, Reves RR, Khurana R, Salcedo K, Punnoose R, Katz DJ.
        Pediatrics. 2020 Jan;145(1).
        OBJECTIVES: The tuberculin skin test (TST) has been preferred for screening young children for latent tuberculosis infection (LTBI) because of concerns that interferon-gamma release assays (IGRAs) may be less sensitive in this high-risk population. In this study, we compared the predictive value of IGRAs to the TST for progression to tuberculosis disease in children, including those <5 years old. METHODS: Children <15 years old at risk for LTBI or progression to disease were tested with TST, QuantiFERON-TB Gold In-Tube test (QFT-GIT), and T-SPOT.TB test (T-SPOT) and followed actively for 2 years, then with registry matches, to identify incident disease. RESULTS: Of 3593 children enrolled September 2012 to April 2016, 92% were born outside the United States; 25% were <5 years old. Four children developed tuberculosis over a median 4.3 years of follow-up. Sensitivities for progression to disease for TST and IGRAs were low (50%-75%), with wide confidence intervals (CIs). Specificities for TST, QFT-GIT, and T-SPOT were 73.4% (95% CI: 71.9-74.8), 90.1% (95% CI: 89.1-91.1), and 92.9% (95% CI: 92.0-93.7), respectively. Positive and negative predictive values for TST, QFT-GIT, and T-SPOT were 0.2 (95% CI: 0.1-0.8), 0.9 (95% CI: 0.3-2.5), and 0.8 (95% CI: 0.2-2.9) and 99.9 (95% CI: 99.7-100), 100 (95% CI: 99.8-100), and 99.9 (95% CI: 99.8-100), respectively. Of 533 children with TST-positive, IGRA-negative results not treated for LTBI, including 54 children <2 years old, none developed disease. CONCLUSIONS: Although both types of tests poorly predict disease progression, IGRAs are no less predictive than the TST and offer high specificity and negative predictive values. Results from this study support the use of IGRAs for children, especially those who are not born in the United States.
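The pattern above, low positive predictive value but very high negative predictive value when disease is rare, follows directly from the standard 2x2-table definitions. A minimal sketch with hypothetical counts (not the study's data):

```python
def predictive_values(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from 2x2-table counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)  # probability of disease given a positive test
    npv = tn / (tn + fn)  # probability of no disease given a negative test
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for a rare outcome: even a fairly specific test
# yields a low PPV, while the NPV stays close to 100%.
sens, spec, ppv, npv = predictive_values(tp=3, fp=270, fn=1, tn=2726)
print(f"sens={sens:.2f} spec={spec:.3f} PPV={ppv:.3f} NPV={npv:.4f}")
```

Because PPV depends on how common the disease is in the tested population, a gain in specificity (as with the IGRAs above) translates into fewer false positives per true positive, while NPV remains high for any reasonably sensitive test.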

      3. Estimated burden of community-onset respiratory syncytial virus-associated hospitalizations among children aged <2 years in the United States, 2014-15
        Arriola CS, Kim L, Langley G, Anderson EJ, Openo K, Martin AM, Lynfield R, Bye E, Como-Sabetti K, Reingold A, Chai S, Daily P, Thomas A, Crawford C, Reed C, Garg S, Chaves SS.
        J Pediatric Infect Dis Soc. 2019 Dec 23.
        BACKGROUND: Respiratory syncytial virus (RSV) is a major cause of hospitalizations in young children. We estimated the burden of community-onset RSV-associated hospitalizations among US children aged <2 years by extrapolating rates of RSV-confirmed hospitalizations in 4 surveillance states and using probabilistic multipliers to adjust for ascertainment biases. METHODS: From October 2014 through April 2015, clinician-ordered RSV tests identified laboratory-confirmed RSV hospitalizations among children aged <2 years at 4 influenza hospitalization surveillance network sites. Surveillance populations were used to estimate age-specific rates of RSV-associated hospitalization, after adjusting for detection probabilities. We extrapolated these rates using US census data. RESULTS: We identified 1554 RSV-associated hospitalizations in children aged <2 years. Of these, 27% were admitted to an intensive care unit, 6% needed mechanical ventilation, and 5 died. Most cases (1047/1554; 67%) had no underlying condition. Adjusted age-specific RSV hospitalization rates per 100,000 population were 1970 (95% confidence interval [CI], 1787 to 2177), 897 (95% CI, 761 to 1073), 531 (95% CI, 459 to 624), and 358 (95% CI, 317 to 405) for ages 0-2, 3-5, 6-11, and 12-23 months, respectively. Extrapolating to the US population, an estimated 49,509-59,867 community-onset RSV-associated hospitalizations among children aged <2 years occurred during the 2014-2015 season. CONCLUSIONS: Our findings highlight the importance of RSV as a cause of hospitalization, especially among children aged <2 months. Our approach to estimating RSV-related hospitalizations could be used to provide a US baseline for assessing the impact of future interventions.
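The extrapolation step described above multiplies each age-specific rate by the corresponding population count and sums across age bands. A minimal sketch using the point-estimate rates from the abstract but hypothetical (illustrative) population counts:

```python
# Age-specific RSV hospitalization rates per 100,000 (point estimates from the abstract)
rates_per_100k = {"0-2 mo": 1970, "3-5 mo": 897, "6-11 mo": 531, "12-23 mo": 358}

# Hypothetical US population counts per age band -- illustrative only,
# not census figures
population = {
    "0-2 mo": 990_000,
    "3-5 mo": 990_000,
    "6-11 mo": 1_980_000,
    "12-23 mo": 3_960_000,
}

# Estimated hospitalizations = sum over bands of (rate / 100,000) * population
total = sum(rates_per_100k[a] / 100_000 * population[a] for a in rates_per_100k)
print(f"estimated hospitalizations: {total:,.0f}")  # within the abstract's range
```

With these illustrative counts the total falls inside the 49,509-59,867 range the study reports; the published estimate additionally propagates the rate uncertainty to produce that interval.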

      4. Progress and challenges in a pioneering hepatitis C elimination program in the country of Georgia, 2015-2018
        Averhoff F, Shadaker S, Gamkrelidze A, Kuchuloria T, Gvinjilia L, Getia V, Sergeenko D, Butsashvili M, Tsertsvadze T, Sharvadze L, Zarkua J, Skaggs B, Nasrullah M.
        J Hepatol. 2019 Dec 4.
        BACKGROUND & AIMS: Georgia, with a high prevalence of hepatitis C virus (HCV) infection, launched the world's first national hepatitis C elimination program in April 2015. A key strategy is the identification, treatment, and cure of the estimated 150,000 HCV-infected persons living in the country. We report on progress and key challenges from Georgia's experience. METHODS: We constructed a care cascade by analyzing linked data from the national hepatitis C screening registry and treatment databases during 2015-2018. We assessed the impact of reflex hepatitis C core antigen (HCVcAg) testing on rates of viremia testing and treatment initiation (i.e. linkage to care). RESULTS: As of December 31, 2018, 1,101,530 adults (39.6% of the adult population) had been screened for HCV antibody, of whom 98,430 (8.9%) tested positive; of these, 78,484 (79.7%) received viremia testing, and 66,916 (85.3%) of those tested positive for active HCV infection. A total of 52,576 persons with active HCV infection initiated treatment, and 48,879 completed their course of treatment. Of the 35,035 who were tested for cure (i.e., sustained virologic response [SVR]), 34,513 (98.5%) achieved SVR. Reflex HCVcAg testing, implemented in March 2018, increased rates of monthly viremia testing among persons screening positive for anti-HCV by 97.5%; however, rates of treatment initiation decreased by 60.7% among diagnosed viremic patients. CONCLUSIONS: Over one-third of persons living with HCV in Georgia have been detected and linked to care and treatment; however, identification and linkage to care of the remaining persons with HCV infection is challenging. Novel interventions, such as reflex testing with HCVcAg, can improve rates of viremia testing but may result in unintended consequences, such as decreased rates of treatment initiation. Linked data systems allow for regular review of the care cascade, allowing for identification of deficiencies and development of corrective actions.
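Each percentage in the care cascade above is a conditional proportion: the count at one stage divided by the count at the stage before it. A minimal sketch reproducing two of the quoted figures from the counts in the abstract (stage labels are shortened for illustration):

```python
# Consecutive care-cascade stages (name, count), from the abstract
cascade = [
    ("screened anti-HCV positive", 98_430),
    ("received viremia testing", 78_484),
    ("active HCV infection confirmed", 66_916),
]

# Report each stage as a percentage of the stage immediately before it
for (_, prev), (name, count) in zip(cascade, cascade[1:]):
    print(f"{name}: {count} ({100 * count / prev:.1f}%)")
```

Running this prints 79.7% and 85.3%, matching the abstract; the same pattern extends to the treatment and SVR stages.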

      5. Evaluation of the influenza sentinel surveillance system in the Democratic Republic of Congo, 2012-2015
        Babakazo P, Kabamba-Tshilobo J, Wemakoy EO, Lubula L, Manya LK, Ilunga BK, Disasuani W, Nkwembe E, Kavunga-Membo H, Changachanga JC, Muhemedi S, Tamfum JM, Tempia S.
        BMC Public Health. 2019 Dec 10;19(1):1652.
        BACKGROUND: The World Health Organization recommends periodic evaluations of influenza surveillance systems to identify areas for improvement and provide evidence of data reliability for policymaking. However, data about the performance of established influenza surveillance systems are limited in Africa, including in the Democratic Republic of Congo (DRC). METHODS: We used the Centers for Disease Control and Prevention guidelines to evaluate the performance of the influenza sentinel surveillance system (ISSS) in DRC during 2012-2015. The performance of the system was evaluated using eight surveillance attributes: (i) data quality and completeness for key variables, (ii) timeliness, (iii) representativeness, (iv) flexibility, (v) simplicity, (vi) acceptability, (vii) stability and (viii) utility. For each attribute, specific indicators were developed and described using quantitative and qualitative methods. Scores for each indicator were as follows: < 60% weak performance; 60-79% moderate performance; >/=80% good performance. RESULTS: During 2012-2015, we enrolled and tested 4339 patients with influenza-like illness (ILI) and 2869 patients with severe acute respiratory illness (SARI) from 11 sentinel sites situated in 5 of 11 provinces. Influenza viruses were detected in 446 (10.3%) samples from patients with ILI and in 151 (5.5%) samples from patients with SARI, with higher detection during December-May. Data quality and completeness were > 90% for all evaluated indicators. Other strengths of the system were timeliness, simplicity, stability and utility, which scored > 70% each. Representativeness, flexibility and acceptability had moderate performance.
It was reported that the ISSS contributed to: (i) a better understanding of the epidemiology, circulating patterns and proportional contribution of influenza virus among patients with ILI or SARI; (ii) acquisition of new key competences related to influenza surveillance and diagnosis; and (iii) continuous education of surveillance staff and clinicians at sentinel sites about influenza. However, due to limited resources no actions were undertaken to mitigate the impact of seasonal influenza epidemics. CONCLUSIONS: The system performed overall satisfactorily and provided reliable and timely data about influenza circulation in DRC. The simplicity of the system contributed to its stability. A better use of the available data could be made to inform and promote prevention interventions especially among the most vulnerable groups.

      6. Design of an enhanced public health surveillance system for hepatitis C virus elimination in King County, Washington
        Baer A, Fagalde MS, Drake CD, Sohlberg EH, Barash E, Glick S, Millman AJ, Duchin JS.
        Public Health Rep. 2020 Jan;135(1):33-39.
        INTRODUCTION: With the goal of eliminating hepatitis C virus (HCV) as a public health problem in Washington State, Public Health-Seattle & King County (PHSKC) designed a Hepatitis C Virus Test and Cure (HCV-TAC) data system to integrate surveillance, clinical, and laboratory data into a comprehensive database. The intent of the system was to promote identification, treatment, and cure of HCV-infected persons (ie, HCV care cascade) using a population health approach. MATERIALS AND METHODS: The data system automatically integrated case reports received via telephone and fax from health care providers and laboratories, hepatitis test results reported via electronic laboratory reporting, and data on laboratory and clinic visits reported by 6 regional health care systems. PHSKC examined patient-level laboratory test results and established HCV case classification using Council of State and Territorial Epidemiologists criteria, classifying patients as confirmed if they had detectable HCV RNA. RESULTS: The data enabled PHSKC to report the number of patients at various stages along the HCV care cascade. Of 7747 HCV RNA-positive patients seen by a partner site, 5377 (69%) were assessed for severity of liver fibrosis, 3932 (51%) were treated, and 2592 (33%) were cured. PRACTICE IMPLICATIONS: Data supported local public health surveillance and HCV program activities. The data system could serve as a foundation for monitoring future HCV prevention and control programs.

      7. Coordinating the real-time use of global influenza activity data for better public health planning
        Biggerstaff M, Dahlgren FS, Fitzner J, George D, Hammond A, Hall I, Haw D, Imai N, Johansson MA, Kramer S, McCaw JM, Moss R, Pebody R, Read JM, Reed C, Reich NG, Riley S, Vandemaele K, Viboud C, Wu JT.
        Influenza Other Respir Viruses. 2019.
        Health planners from global to local levels must anticipate year-to-year and week-to-week variation in seasonal influenza activity when planning for and responding to epidemics to mitigate their impact. To help with this, countries routinely collect incidence of mild and severe respiratory illness and virologic data on circulating subtypes and use these data for situational awareness, burden of disease estimates and severity assessments. Advanced analytics and modelling are increasingly used to aid planning and response activities by describing key features of influenza activity for a given location and generating forecasts that can be translated to useful actions, such as enhanced risk communications and informing clinical supply chains. Here, we describe the formation of the Influenza Incidence Analytics Group (IIAG), a coordinated global effort to apply advanced analytics and modelling to public influenza data, both epidemiological and virologic, in real time and thus provide additional insights to countries that provide routine surveillance data to WHO. Our objectives are to systematically increase the value of data to health planners by applying advanced analytics and forecasting and for results to be immediately reproducible and deployable using an open repository of data and code. We expect the resources we develop and the associated community to provide an attractive option for the open analysis of key epidemiological data during seasonal epidemics and the early stages of an influenza pandemic.

      8. The Manaus Declaration: Current Situation of Histoplasmosis in the Americas, Report of the II Regional Meeting of the International Histoplasmosis Advocacy Group
        Caceres DH, Adenis A, de Souza JV, Gomez BL, Cruz KS, Pasqualotto AC, Ravasi G, Perez F, Chiller T, de Lacerda MV, Nacher M.
        Curr Fungal Infect Rep. 2019.
        Purpose of Review: The aim of this report is to summarize the conclusions of the II Regional Meeting on Histoplasmosis in the Americas held in Manaus, Brazil, on March 22–24, 2019. Recent Findings: Persons living with advanced HIV are at high risk for developing histoplasmosis. Clinical signs and symptoms of this disease are often non-specific, making it difficult to establish a diagnosis. Despite recent technological advances, in vitro diagnostics and medicines for histoplasmosis are often not available in many regions around the world. In addition, histoplasmosis is often not included in HIV care and treatment programs, resulting in inadequate health system planning and missed opportunities to save lives. Summary: The II Regional Meeting on Histoplasmosis in the Americas gathered a multidisciplinary audience. The meeting produced recommendations to be included in the WHO guidelines for the diagnosis and treatment of histoplasmosis in advanced HIV, which are expected to be published in early 2020.

      9. Invasive nontypeable Haemophilus influenzae infection among adults with HIV in metropolitan Atlanta, Georgia, 2008-2018
        Collins LF, Havers FP, Tunali A, Thomas S, Clennon JA, Wiley Z, Tobin-D'Angelo M, Parrott T, Read TD, Satola SW, Petit RA, Farley MM.
        JAMA. 2019 Dec 24;322(24):2399-2410.
        Importance: Invasive nontypeable Haemophilus influenzae (NTHi) infection among adults is typically associated with bacteremic pneumonia. Nontypeable H influenzae is genetically diverse and clusters of infection are uncommon. Objective: To evaluate an increase in invasive NTHi infection from 2017-2018 among HIV-infected men who have sex with men in metropolitan Atlanta, Georgia. Design, Setting, and Participants: A population-based surveillance study with a cohort substudy and descriptive epidemiological analysis identified adults aged 18 years or older with invasive NTHi infection (isolation of NTHi from a normally sterile site) between January 1, 2008, and December 31, 2018 (final date of follow-up). Exposures: Time period, HIV status, and genetic relatedness (ie, cluster status) of available NTHi isolates. Main Outcomes and Measures: The primary outcome was incidence of invasive NTHi infection (from 2008-2016 and 2017-2018) among persons with HIV and compared with NTHi infection from 2008-2018 among those without HIV. The secondary outcomes were assessed among those aged 18 to 55 years with invasive NTHi infection and included epidemiological, clinical, and geographic comparisons by cluster status. Results: Among 553 adults with invasive NTHi infection (median age, 66 years [Q1-Q3, 48-78 years]; 52% male; and 38% black), 60 cases occurred among persons with HIV. Incidence of invasive NTHi infection from 2017-2018 among persons with HIV (41.7 cases per 100000) was significantly greater than from 2008-2016 among those with HIV (9.6 per 100000; P < .001) and from 2008-2018 among those without HIV (1.1 per 100000; P < .001). Among adults aged 18 to 55 years with invasive NTHi infections from 2017-2018 (n = 179), persons with HIV (n = 31) were significantly more likely than those from 2008-2018 without HIV (n = 124) to be male (94% vs 49%, respectively; P < .001), black (100% vs 53%; P < .001), and have septic arthritis (35% vs 1%; P < .001). 
Persons with HIV who had invasive NTHi infection from 2017-2018 (n = 31) were more likely than persons with HIV who had invasive NTHi infection from 2008-2016 (n = 24) to have septic arthritis (35% vs 4%, respectively; P = .01). Pulsed-field gel electrophoresis of 174 of 179 NTHi isolates from 18- to 55-year-olds identified 2 genetically distinct clonal groups: cluster 1 (C1; n = 24) and cluster 2 (C2; n = 23). Whole-genome sequencing confirmed 2 clonal lineages of NTHi infection and revealed all C1 isolates (but none of the C2 isolates) carried IS1016 (an insertion sequence associated with H influenzae capsule genes). Persons with HIV were significantly more likely to have C1 or C2 invasive NTHi infection from 2017-2018 (28/31 [90%]) compared with from 2008-2016 among persons with HIV (10/24 [42%]; P < .001) and compared with from 2008-2018 among those without HIV (9/119 [8%]; P < .001). Among persons with C1 or C2 invasive NTHi infection who had HIV (n = 38) (median age, 34.5 years; 100% male; 100% black; 82% men who have sex with men), 32 (84%) lived in 2 urban counties and an area of significant spatial aggregation was identified compared with those without C1 or C2 invasive NTHi infection. Conclusions and Relevance: Among persons with HIV in Atlanta, the incidence of invasive nontypeable H influenzae infection increased significantly from 2017-2018 compared with 2008-2016. Two unique but genetically related clonal strains were identified and were associated with septic arthritis among black men who have sex with men and who lived in geographic proximity.

      10. Epidemiology and clinical outcomes of hospitalizations for acute respiratory or febrile illness and laboratory-confirmed influenza among pregnant women during six influenza seasons, 2010-2016
        Dawood FS, Garg S, Fink RV, Russell ML, Regan AK, Katz MA, Booth S, Chung H, Klein NP, Kwong JC, Levy A, Naleway A, Riesel D, Thompson MG, Wyant BE, Fell DB.
        J Infect Dis. 2019 Dec 26.
        BACKGROUND: Pregnant women are at increased risk of seasonal influenza hospitalizations, but data about the epidemiology of severe influenza among pregnant women remain largely limited to pandemics. METHODS: To describe the epidemiology of hospitalizations for acute respiratory infection or febrile illness (ARFI) and influenza-associated ARFI among pregnant women, administrative and electronic health record data were analyzed from retrospective cohorts of pregnant women hospitalized with ARFI who had testing for influenza viruses by RT-PCR in Australia, Canada, Israel and the United States during 2010-2016. RESULTS: Of 18,048 ARFI-coded hospitalizations, 1,064 (6%) included RT-PCR testing for influenza viruses, of which 614 (58%) were influenza-positive. Of 614 influenza-positive ARFI hospitalizations, 35% were in women with low socioeconomic status, 20% with underlying conditions, and 67% in their third trimesters. The median length of influenza-positive hospitalizations was 2 days (IQR 1-4), 18% (95% confidence interval (CI) 15-21%) resulted in delivery, 10% (95% CI 8-12%) included a pneumonia diagnosis, 5% (95% CI 3-6%) required intensive care, 2% (95% CI 1-3%) included a sepsis diagnosis, and <1% (95% CI 0-1%) resulted in respiratory failure. CONCLUSIONS: Our findings characterize seasonal influenza hospitalizations among pregnant women and can inform assessments of the public health and economic impact of seasonal influenza on pregnant women.

      11. HIV care outcomes among Hispanics/Latinos with diagnosed HIV in the United States by place of birth-2015-2018, Medical Monitoring Project
        Demeke HB, Luo Q, Luna-Gierke RE, Padilla M, Girona-Lozada G, Miranda-De Leon S, Weiser J, Beer L.
        Int J Environ Res Public Health. 2019 Dec 25;17(1).
        Relocation from one's birthplace may affect human immunodeficiency virus (HIV) outcomes, but national estimates of HIV outcomes among Hispanics/Latinos by place of birth are limited. We analyzed Medical Monitoring Project data collected in 2015-2018 from 2564 HIV-positive Hispanic/Latino adults and compared clinical outcomes between mainland US-born (referent group), Puerto Rico-born (PR-born), and those born outside the United States (non-US-born). We reported weighted percentages of characteristics and used logistic regression with predicted marginal means to examine differences between groups (p < 0.05). PR-born Hispanics/Latinos were more likely to be prescribed antiretroviral therapy (ART) (94%) and retained in care (94%) than mainland US-born (79% and 77%, respectively) and non-US-born (91% and 87%, respectively) Hispanics/Latinos. PR-born Hispanics/Latinos were more likely to have sustained viral suppression (75%) than mainland US-born Hispanics/Latinos (57%). Non-US-born Hispanics/Latinos were more likely to be prescribed ART (91% vs. 79%), retained in care (87% vs. 77%), and have sustained viral suppression (74% vs. 57%) than mainland US-born Hispanics/Latinos. Greater Ryan White HIV/AIDS-funded facility usage among PR-born, better mental health among non-US-born, and less drug use among PR-born and non-US-born Hispanics/Latinos may have contributed to better HIV outcomes. Expanding programs with comprehensive HIV/AIDS services, including for mental health and substance use, may reduce HIV outcome disparities among Hispanics/Latinos.

      12. Challenges of HIV diagnosis and management in the context of pre-exposure prophylaxis (PrEP), post-exposure prophylaxis (PEP), test and start and acute HIV infection: a scoping review
        Elliott T, Sanders EJ, Doherty M, Ndung'u T, Cohen M, Patel P, Cairns G, Rutstein SE, Ananworanich J, Brown C, Fidler S.
        J Int AIDS Soc. 2019 Dec;22(12):e25419.
        INTRODUCTION: Knowledge of HIV status relies on accurate HIV testing, and is the first step towards access to HIV treatment and prevention programmes. Globally, HIV-status unawareness represents a significant challenge for achieving zero new HIV infections and deaths. In order to enhance knowledge of HIV status, the World Health Organisation (WHO) recommends a testing strategy that includes the use of HIV-specific antibody point-of-care tests (POCT). These POCTs do not detect acute HIV infection, the stage of disease when viral load is highest but HIV antibodies are undetectable. Complicating matters further, in the presence of antiretroviral therapy (ART) for pre-exposure prophylaxis (PrEP) or post-exposure prophylaxis (PEP), other currently available testing technologies, such as viral load detection for diagnosis of acute HIV infection, may yield false-negative results. In this scoping review, we evaluate the evidence and discuss alternative HIV testing algorithms that may mitigate diagnostic dilemmas in the setting of increased utilization of ART for immediate treatment and prevention of HIV infection. DISCUSSION: Missed acute HIV infection prevents people living with HIV (PLHIV) from accessing early treatment, increases the likelihood of onward transmission, and allows for inappropriate initiation or continuation of PrEP, which may result in HIV drug resistance. While immediate ART is recommended for all PLHIV, studies have shown that starting ART in the setting of acute HIV infection may result in a delayed or complete absence of development of HIV-specific antibodies, posing a diagnostic challenge that is particularly pertinent to resource-limited, high HIV burden settings where HIV-antibody POCTs are standard of care. Similarly, ART used as PrEP or PEP may suppress HIV RNA viral load, complicating current HIV testing algorithms in resource-wealthy settings where viral detection is included.
As rollout of PrEP continues, HIV testing algorithms may need to be modified. CONCLUSIONS: With increasing use of PrEP and ART in acute infection we anticipate diagnostic challenges using currently available HIV testing strategies. Research and surveillance are needed to determine the most appropriate assays and optimal testing algorithms that are accurate, affordable and sustainable.

      13. Hepatitis C virus antibody testing among 13- to 21-year-olds in a large sample of US federally qualified health centers
        Epstein RL, Wang J, Hagan L, Mayer KH, Puro J, Linas BP, Assoumou SA.
        JAMA. 2019 Dec 10;322(22):2245-2248.

      14. Hepatitis C Guidance 2019 Update: AASLD-IDSA Recommendations for Testing, Managing, and Treating Hepatitis C Virus Infection
        Ghany MG, Marks KM, Morgan TR, Wyles DL, Aronsohn AI, Bhattacharya D, Broder T, Falade-Nwulia OO, Feld JJ, Gordon SC, Heller T, Jhaveri RR, Jonas MM, Kiser JJ, Linas BP, Lo Re V, Peters MG, Reddy KR, Reynolds A, Scott JD, Searson G, Spradling P, Terrault NA, Trooskin SB, Verna EC, Wong JB, Woolley AE, Workowski KA.
        Hepatology. 2019 Dec 9.
        The American Association for the Study of Liver Diseases (AASLD) and the Infectious Diseases Society of America (IDSA) initiated the hepatitis C guidance project (hereafter HCV guidance) in 2013. The AASLD-IDSA HCV guidance website (www.HCVGuidelines.org) disseminates up-to-date, peer-reviewed, unbiased, evidence-based recommendations to aid clinicians making decisions regarding the testing, management, and treatment of hepatitis C virus (HCV) infection. Utilizing a web-based system enables timely and nimble distribution of the HCV guidance, which is periodically updated in near real time as necessitated by emerging research data, recommendations from public health agencies, the availability of new therapeutic agents, or other significant developments affecting the rapidly evolving hepatitis C arena.

      15. High levels of pretreatment and acquired HIV drug resistance in Nicaragua: results from the first nationally representative survey, 2016
        Giron-Callejas A, Garcia-Morales C, Mendizabal-Burastero R, Roman M, Tapia-Trejo D, Perez-Garcia M, Quiroz-Morales VS, Juarez SI, Ravasi G, Vargas C, Gutierrez R, Romero L, Solorzano A, Sajquim E, Northbrook S, Avila-Rios S, Reyes-Teran G.
        J Int AIDS Soc. 2019 Dec;22(12):e25429.
        INTRODUCTION: A nationally representative HIV drug resistance (HIVDR) survey in Nicaragua was conducted to estimate the prevalence of pretreatment HIVDR (PDR) among antiretroviral therapy (ART) initiators and acquired HIVDR among people living with HIV (PLHIV) who had received ART for 12 +/- 3 months (ADR12) and >/=48 months (ADR48). METHODS: A nationwide cross-sectional survey with a two-stage cluster sampling was conducted from March to November 2016. Nineteen of 45 total ART clinics representing >90% of the national cohort of adults on ART were included. ART initiators were defined as PLHIV initiating or reinitiating first-line ART. HIVDR was assessed for protease, reverse transcriptase and integrase Sanger sequences using the Stanford HIVdb algorithm. Viral load (VL) suppression was defined as <1000 copies/mL. Results were weighted according to the survey design. RESULTS AND DISCUSSION: A total of 638 participants were enrolled (PDR: 171; ADR12: 114; ADR48: 353). The proportion of ART initiators with prior exposure to antiretrovirals (ARVs) was 12.3% (95% CI: 5.8% to 24.3%). PDR prevalence to any drug was 23.4% (95% CI: 14.4% to 35.6%), and 19.3% (95% CI: 12.2% to 29.1%) to non-nucleoside reverse transcriptase inhibitors (NNRTI). NNRTI PDR was higher in ART initiators with previous ARV exposure compared with those with no exposure (76.2% vs. 11.0%, p < 0.001). Protease inhibitor (PI) and integrase strand transfer inhibitor PDR was not observed. VL suppression rate was 77.8% (95% CI: 67.1% to 85.8%) in ADR12 and 70.3% (95% CI: 66.7% to 73.8%) in ADR48. ADR12 prevalence to any drug among PLHIV without VL suppression was 85.1% (95% CI: 66.1% to 94.4%), 82.4% to NNRTI and 70.2% to nucleoside reverse transcriptase inhibitors (NRTI). ADR48 prevalence to any drug among PLHIV without VL suppression was 75.5% (95% CI: 63.5% to 84.5%), 70.7% to NNRTI, 59.4% to NRTI and 4.6% to PI.
CONCLUSIONS: Despite implementation challenges yielding low-precision HIVDR estimates, high rates of NNRTI PDR were observed in Nicaragua, suggesting consideration of non-NNRTI-based first-line regimens for ART initiators. Strengthened HIVDR monitoring, systematic VL testing, and improved ART adherence support are also warranted.

      16. Trends in incidence of norovirus-associated acute gastroenteritis in 4 Veterans Affairs Medical Center populations in the United States, 2011-2015
        Grytdal S, Browne H, Collins N, Vargas B, Rodriguez-Barradas MC, Rimland D, Beenhouwer DO, Brown ST, Goetz MB, Lucero-Obusan C, Holodniy M, Kambhampati A, Parashar U, Vinje J, Lopman B, Hall AJ, Cardemil CV.
        Clin Infect Dis. 2020 Jan 1;70(1):40-48.
        BACKGROUND: Norovirus is an important cause of epidemic acute gastroenteritis (AGE), yet the burden of endemic disease in adults has not been well documented. We estimated the prevalence and incidence of outpatient and community-acquired inpatient norovirus AGE at 4 Veterans Affairs Medical Centers (VAMCs) (Atlanta, Georgia; Bronx, New York; Houston, Texas; and Los Angeles, California) and examined trends over 4 surveillance years. METHODS: From November 2011 to September 2015, stool specimens collected within 7 days of AGE symptom onset for clinician-requested diagnostic testing were tested for norovirus, and positive samples were genotyped. Incidence was calculated by multiplying norovirus prevalence among tested specimens by AGE-coded outpatient encounters and inpatient discharges, and dividing by the number of unique patients served. RESULTS: Of 1603 stool specimens tested, 6% were positive for norovirus; GII.4 viruses (GII.4 New Orleans [17%] and GII.4 Sydney [47%]) were the most common genotypes. Overall prevalence and outpatient and inpatient community-acquired incidence followed a seasonal pattern, with higher median rates during November-April (9.2%, 376/100 000, and 45/100 000, respectively) compared to May-October (3.0%, 131/100 000, and 13/100 000, respectively). An alternate-year pattern was also detected, with the highest peak prevalence and outpatient and inpatient community-acquired norovirus incidence rates in the first and third years of surveillance (14%-25%, 349-613/100 000, and 43-46/100 000, respectively). CONCLUSIONS: This multiyear analysis of laboratory-confirmed AGE surveillance from 4 VAMCs demonstrates dynamic intra- and interannual variability in prevalence and incidence of outpatient and inpatient community-acquired norovirus in US Veterans, highlighting the burden of norovirus disease in this adult population.
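The incidence formula described in the methods (prevalence among tested specimens, multiplied by AGE-coded encounters or discharges, divided by unique patients served) can be sketched as follows; the function name and example inputs are illustrative, not figures from the study:

```python
def incidence_per_100k(n_positive, n_tested, age_coded_events, unique_patients):
    """Incidence as described in the methods: norovirus prevalence among
    tested specimens, multiplied by AGE-coded encounters (or discharges),
    divided by unique patients served, scaled to a rate per 100,000."""
    prevalence = n_positive / n_tested
    return prevalence * age_coded_events / unique_patients * 100_000

# Illustrative inputs only (not VAMC data):
rate = incidence_per_100k(n_positive=6, n_tested=100,
                          age_coded_events=500, unique_patients=10_000)
print(rate)  # 300.0 per 100,000
```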

      17. Demographics and health profile on precursors of non-communicable diseases in adults testing for HIV in Soweto, South Africa: a cross-sectional study
        Hopkins KL, Hlongwane K, Otwombe K, Dietrich J, Cheyip M, Khanyile N, Doherty T, Gray GE.
        BMJ Open. 2019 Dec 15;9(12):e030701.
        OBJECTIVES: This cross-sectional study investigated the burden of HIV-non-communicable disease (NCD) precursor comorbidity by age and sex. Policies stress integrated HIV-NCD screenings; however, NCD screening is poorly implemented in South African HIV testing services (HTS). SETTING: Walk-in HTS Centre in Soweto, South Africa. PARTICIPANTS: 325 voluntary adults, aged 18+ years, who provided written or verbal informed consent (with impartial witness) for screening procedures were enrolled. PRIMARY AND SECONDARY OUTCOMES: Data on sociodemographics, tuberculosis and sexually transmitted infection symptoms, blood pressure (BP) (>/=140/90=elevated) and body mass index (<18.5 underweight; 18.5-25.0 normal; >25 overweight/obese) were stratified by age-group, sex and HIV status. RESULTS: Of the 325 participants, the largest proportions were female (51.1%; n=166/325), single (71.5%; n=231/323) and 25-34 years (33.8%; n=110/325). Overall, 20.9% (n=68/325) were HIV infected, 27.5% (n=89/324) had high BP and 33.5% (n=109/325) were overweight/obese. Among HIV-infected participants, 20.6% (14/68) had high BP and 30.9% (21/68) were overweight/obese, as compared with 29.3% (75/256) and 12.1% (31/256) of the HIV-uninfected participants, respectively. Females were more likely to be HIV infected than males (26.5% (44/166) vs 15.1% (24/159); p=0.012). In both HIV-infected and uninfected groups, high BP was most prevalent in those aged 35-44 years (25% (6/24) vs 36% (25/70); p=0.3353) and >44 years (29% (4/14) vs 48% (26/54); p=0.1886). Males had higher BP than females (32.9% (52/158) vs 22.3% (37/166); p=0.0323); more females were overweight/obese relative to males (45.8% (76/166) vs 20.8% (33/159); p<0.0001). Females were more likely to be HIV infected and overweight/obese. CONCLUSION: Among HTS clients, rates of NCD precursors and co-morbidities were high. Elevated BP occurred more in older participants.
Targeted integrated interventions for HIV-infected females and HIV-infected people aged 18-24 and 35-44 years could improve HIV public health outcomes. Additional studies on whether integrated HTS will improve the uptake of NCD treatment and improve health outcomes are required.

      18. Estimating the incidence of influenza at the state level - Utah, 2016-17 and 2017-18 influenza seasons
        Hughes MM, Carmack AE, McCaffrey K, Spencer M, Reed GM, Hill M, Dunn A, Risk I, Garg S, Reed C, Biggerstaff M, Mayer J, Gesteland P, Korgenski K, Dascomb K, Pavia A, Rolfes MA.
        MMWR Morb Mortal Wkly Rep. 2019 Dec 20;68(50):1158-1161.
        The 2017-18 U.S. influenza season was notable for its high severity, with approximately 45 million illnesses and 810,000 influenza-associated hospitalizations throughout the United States (1). The purpose of the investigation reported here was to create a state-level estimate of the number of persons in Utah who became ill with influenza disease during this severe national seasonal influenza epidemic and to create a sustainable system for making timely updates in future influenza seasons. Knowing the extent of influenza-associated illness can help public health officials, policymakers, and clinicians tailor influenza messaging, planning, and responses for seasonal influenza epidemics or during pandemics. Using national methods and existing influenza surveillance and testing data, the influenza burden (number of influenza illnesses, medical visits for influenza, and influenza-associated hospitalizations) in Utah during the 2016-17 and 2017-18 influenza seasons was estimated. During the 2016-17 season, an estimated 265,000 symptomatic illnesses affecting 9% of Utah residents occurred, resulting in 125,000 medically attended illnesses and 2,700 hospitalizations. During the 2017-18 season, an estimated 338,000 symptomatic illnesses affecting 11% of Utah residents occurred, resulting in 160,000 medically attended illnesses and 3,900 hospitalizations. Other state or county health departments could adapt similar methods in their jurisdictions to estimate the burden of influenza locally and support prompt public health activities.
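The "national methods" cited above are a multiplier approach: reported influenza-associated hospitalizations are corrected for under-detection (not every hospitalized patient is tested, and tests miss some true infections), then extrapolated to total and medically attended illnesses. A hedged sketch of that general logic follows; every parameter value below is invented for illustration and is not a Utah or CDC figure.

```python
def adjusted_hospitalizations(reported, frac_tested, test_sensitivity):
    # Correct reported counts for under-detection: divide by the fraction of
    # hospitalized patients who were tested and by the test's sensitivity.
    return reported / (frac_tested * test_sensitivity)

def burden_estimate(reported_hosp, frac_tested, test_sensitivity,
                    illnesses_per_hosp, medically_attended_frac):
    # Sketch of a multiplier-based burden estimate; multiplier values would
    # come from surveillance and survey data in a real application.
    hosp = adjusted_hospitalizations(reported_hosp, frac_tested, test_sensitivity)
    illnesses = hosp * illnesses_per_hosp          # extrapolate total symptomatic illnesses
    visits = illnesses * medically_attended_frac   # subset who seek medical care
    return {"hospitalizations": hosp, "illnesses": illnesses, "medical_visits": visits}

# Invented example: 80 reported hospitalizations, 50% of patients tested,
# 80% test sensitivity, 20 illnesses per hospitalization, half seeking care.
print(burden_estimate(80, 0.5, 0.8, 20, 0.5))
```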

      19. Expanded eligibility for HIV testing increases HIV diagnoses - A cross-sectional study in seven health facilities in western Kenya
        Joseph RH, Musingila P, Miruka F, Wanjohi S, Dande C, Musee P, Lugalia F, Onyango D, Kinywa E, Okomo G, Moth I, Omondi S, Ayieko C, Nganga L, Zielinski-Gutierrez E, Muttai H, De Cock KM.
        PLoS One. 2019 ;14(12):e0225877.
        Homa Bay, Siaya, and Kisumu counties in western Kenya have the highest estimated HIV prevalence (16.3-21.0%) in the country, and struggle to meet program targets for HIV testing services (HTS). The Kenya Ministry of Health (MOH) recommends annual HIV testing for the general population. We assessed the degree to which reducing the interval for retesting to less than 12 months increased diagnosis of HIV in outpatient departments (OPD) in western Kenya. We conducted a retrospective analysis of routinely collected program data from seven high-volume (>800 monthly OPD visits) health facilities in March-December 2017. Data from persons >/=15 years of age seeking medical care (patients) in the OPD and non-care-seekers (non-patients) accompanying patients to the OPD were included. Outcomes were meeting MOH (routine) criteria versus criteria for a reduced retesting interval (RRI) of <12 months, and HIV test result. STATA version 14.2 was used to calculate frequencies and proportions, and to test for differences using bivariate analysis. During the 9-month period, 119,950 clients were screened for HIV testing eligibility, of whom 79% (94,766) were eligible and 97% (92,153) received a test. Among 92,153 clients tested, the median age was 28 years, 57% were female and 40% (36,728) were non-patients. Overall, 20% (18,120) of clients tested met routine eligibility criteria: 4% (3,972) had never been tested, 10% (9,316) reported a negative HIV test more than 12 months earlier, and 5% (4,832) met other criteria. The remaining 80% (74,033) met criteria for a RRI of <12 months. In total, 1.3% (1,185) of clients had a positive test. Although the percent yield was over 2-fold higher among those meeting routine criteria (2.4% vs. 1.0%; p<0.001), 63% (750) of all HIV infections were found among clients last tested less than 12 months earlier, the majority (81%) of whom reported having a negative test in the past 3-12 months.
Non-patients accounted for 45% (539) of all HIV-positive persons identified. Percent yield was higher among non-patients than among patients (1.5% vs. 1.2%; p<0.001) overall and across eligibility criteria and age categories. The majority of HIV diagnoses in the OPD occurred among clients reporting a negative HIV test in the past 12 months, clients ineligible for testing under the current MOH guidelines. Nearly half of all HIV-positive individuals identified in the OPD were non-patients. Our findings suggest that in the setting of a generalized HIV epidemic, retesting persons reporting an HIV-negative test in the past 3-12 months, and routine testing of non-patients accessing the OPD, are key strategies for timely diagnosis of persons living with HIV.

      20. Mapping the study characteristics and topics of HIV pre-exposure prophylaxis research literature: A scoping review
        Kamitani E, Mizuno Y, Wichser M, Adegbite AH, DeLuca JB, Higa DH.
        AIDS Educ Prev. 2019 Dec;31(6):505-522.
        Since WHO released the first PrEP guidance in 2012, the PrEP research literature has rapidly increased, but PrEP uptake is still low. To identify research gaps, this scoping review describes study characteristics, identifies populations, and maps study topics in PrEP publications. We identified 561 PrEP primary studies published in English between 2006 and 2018. The most commonly used study design was cross-sectional. Almost half of studies were conducted in non-U.S. countries and focused on men who have sex with men. We mapped study topics using five categories. The most studied category was Potential PrEP user/prescriber (41.3%) followed by Considerations while on PrEP (28.2%), PrEP efficacy and safety (20.9%), Cost-effectiveness or economic evaluation (5.2%), and Methods of and experiences with PrEP clinical trials (4.2%). Although the PrEP literature has dramatically increased, some research areas (e.g., PrEP awareness in non-U.S. countries, intervention studies to promote PrEP use) and populations (e.g., Black women) are still understudied.

      21. Measles and rubella seroprevalence among adults in Georgia in 2015: helping guide the elimination efforts
        Khetsuriani N, Chitadze N, Russell S, Ben Mamou M.
        Epidemiol Infect. 2019 Dec 11;147:e319.
        A large-scale measles outbreak (11 495 reported cases, 60% aged >/=15 years) occurred in Georgia during 2013-2015. A nationwide, multistage, stratified cluster serosurvey for hepatitis B and C among persons aged >/=18 years conducted in Georgia in late 2015 provided an opportunity to assess measles and rubella (MR) susceptibility after the outbreak. Residual specimens from 3125 participants aged 18-50 years were tested for Immunoglobulin G antibodies against MR using ELISA. Nationwide, 6.3% (95% CI 4.9%-7.6%) of the surveyed population were seronegative for measles and 8.6% (95% CI 7.1%-10.1%) were seronegative for rubella. Measles susceptibility was highest among 18-24 year-olds (10.1%) and declined with age to 1.2% among 45-50 year-olds (P < 0.01). Susceptibility to rubella was highest among 25-29 year-olds (15.3%), followed by 18-24 year-olds (11.6%) and 30-34 year-olds (10.2%), and declined to <5% among persons aged >/=35 years (P < 0.001). The susceptibility profiles in the present serosurvey were consistent with the epidemiology of recent MR cases and the history of the immunization programme. Measles susceptibility levels >10% among 18-24 year-olds in Georgia revealed continued risk for outbreaks among young adults. High susceptibility to rubella among 18-34 year-olds indicates a continuing risk for congenital rubella cases.

      22. Sexually transmissible infection testing among pregnant women in the US, 2011-15
        Leichliter JS, Haderxhanaj LT, Gift TL, Dittus PJ.
        Sex Health. 2019 Nov 4.
        Introduction: Sexually transmissible infections (STIs) are increasing in the US. Pregnant women and infants are susceptible to serious STI-related sequelae; however, some STIs can be cured during pregnancy with appropriate, timely screening. Methods: We used data from the 2011-15 National Survey of Family Growth to examine STI testing (in the past 12 months) among women who were pregnant in the past 12 months (n = 1155). In bivariate and multivariable analyses, we examined associations between demographics, health care access and two outcome variables, namely receipt of a chlamydia test and receipt of other STI tests. Results: Among women who were pregnant in the past 12 months, 48% reported receiving a chlamydia test and 54% reported that they received an STI test other than chlamydia in the past 12 months. In adjusted analyses, non-Hispanic Black women were more likely to receive a chlamydia test (adjusted odds ratio (aOR) 2.82; 95% confidence interval (CI) 1.86-4.26) and other STI tests (aOR 2.43; 95% CI 1.58-3.74) than non-Hispanic White women. Women living in a metropolitan statistical area but not the principal city were less likely to report chlamydia (aOR 0.62; 95% CI 0.44-0.86) and other STI (aOR 0.57; 95% CI 0.40-0.81) testing than women living in a principal city. Women born outside the US were significantly less likely to have received a chlamydia test (aOR 0.35; 95% CI 0.19-0.64) or other STI test (aOR 0.34; 95% CI 0.20-0.58), whereas those who had received prenatal care were more likely to receive a chlamydia test (aOR 2.10; 95% CI 1.35-3.28) or another STI test (aOR 2.32; 95% CI 1.54-3.49). Conclusions: The findings suggest that interventions are needed to increase adherence to recommended STI screenings during pregnancy.

      23. Guillain-Barre syndrome and antecedent cytomegalovirus infection, USA 2009-2015
        Leung J, Sejvar JJ, Soares J, Lanzieri TM.
        Neurol Sci. 2019 Dec 11.
        OBJECTIVE: To describe incidence and clinical characteristics of cases of Guillain-Barre syndrome (GBS) in the USA during 2009-2015, and characteristics of GBS cases with antecedent cytomegalovirus (CMV) infection among persons with employer-sponsored insurance. METHODS: We analyzed medical claims from IBM Watson MarketScan(R) databases. GBS patients were defined as enrollees with an inpatient claim with GBS as the principal diagnosis code, based on ICD-9 or ICD-10, and ≥1 claim for lumbar puncture or EMG/nerve conduction study. We assessed intensive care unit (ICU) hospitalization, intubation, dysautonomia, and death. We also assessed selected infectious illness within 60 days prior to the first GBS-coded inpatient claim. RESULTS: We identified 3486 GBS patients; annual incidence was 1.0-1.2/100,000 persons during 2009-2015. GBS incidence was higher in males (1.2/100,000) than in females (0.9/100,000) (p = 0.006) and increased with age, from 0.4/100,000 in persons 0-17 years old to 2.1/100,000 in persons ≥65 years old (p < 0.001). Half of GBS patients were hospitalized in the ICU, 8% were intubated, 2% developed dysautonomia, and 1% died. Half had a claim for antecedent illness, but only 125 (3.5%) had a claim for specific infectious pathogens. The mean age among 18 GBS patients with antecedent CMV infection was 39 years versus 47 years among those without antecedent CMV infection (p = 0.038). CONCLUSIONS: Incidence of GBS using a large national claims database was comparable to that reported in the literature, but cases appeared to be less severe. Half of GBS patients reported prior infectious illness, but only a minority had a specific pathogen identified.

      24. Diarrhoeal disease and subsequent risk of death in infants and children residing in low-income and middle-income countries: analysis of the GEMS case-control study and 12-month GEMS-1A follow-on study
        Levine MM, Nasrin D, Acacio S, Bassat Q, Powell H, Tennant SM, Sow SO, Sur D, Zaidi AK, Faruque AS, Hossain MJ, Alonso PL, Breiman RF, O'Reilly CE, Mintz ED, Omore R, Ochieng JB, Oundo JO, Tamboura B, Sanogo D, Onwuchekwa U, Manna B, Ramamurthy T, Kanungo S, Ahmed S, Qureshi S, Quadri F, Hossain A, Das SK, Antonio M, Saha D, Mandomando I, Blackwelder WC, Farag T, Wu Y, Houpt ER, Verweiij JJ, Sommerfelt H, Nataro JP, Robins-Browne RM, Kotloff KL.
        Lancet Glob Health. 2019 Dec 18.
        BACKGROUND: The Global Enteric Multicenter Study (GEMS) was a 3-year case-control study that measured the burden, aetiology, and consequences of moderate-to-severe diarrhoea (MSD) in children aged 0-59 months. GEMS-1A, a 12-month follow-on study, comprised two parallel case-control studies, one assessing MSD and the other less-severe diarrhoea (LSD). In this report, we analyse the risk of death with each diarrhoea type and the specific pathogens associated with fatal outcomes. METHODS: GEMS was a prospective, age-stratified, matched case-control study done at seven sites in Africa and Asia. Children aged 0-59 months with MSD seeking care at sentinel health centres were recruited along with one to three randomly selected matched community control children without diarrhoea. In the 12-month GEMS-1A follow-on study, children with LSD and matched controls, in addition to children with MSD and matched controls, were recruited at six of the seven sites; only cases of MSD and controls were enrolled at the seventh site. We compared risk of death during the period between enrolment and one follow-up household visit done about 60 days later (range 50-90 days) in children with MSD and LSD and in their respective controls. Approximately 50 pathogens were detected using, as appropriate, classic bacteriology, immunoassays, gel-based PCR and reverse transcriptase PCR, and quantitative real-time PCR (qPCR). Specimens from a subset of GEMS cases and controls were also tested by a TaqMan Array Card that compartmentalised probe-based qPCR for 32 enteropathogens. FINDINGS: 223 (2.0%) of 11,108 children with MSD and 43 (0.3%) of 16,369 matched controls died between study enrolment and the follow-up visit at about 60 days (hazard ratio [HR] 8.16, 95% CI 5.69-11.68, p<0.0001). 12 (0.4%) of 2962 children with LSD and seven (0.2%) of 4074 matched controls died during the follow-up period (HR 2.78, 95% CI 0.95-8.11, p=0.061). 
Risk of death was lower in children with dysenteric MSD than in children with non-dysenteric MSD (HR 0.20, 95% CI 0.05-0.87, p=0.032), and lower in children with LSD than in those with non-dysenteric MSD (HR 0.29, 0.14-0.59, p=0.0006). In children younger than 24 months with MSD, infection with typical enteropathogenic Escherichia coli, enterotoxigenic E coli encoding heat-stable toxin, enteroaggregative E coli, Shigella spp (non-dysentery cases), Aeromonas spp, Cryptosporidium spp, and Entamoeba histolytica increased risk of death. Of 61 deaths in children aged 12-59 months with non-dysenteric MSD, 31 occurred among 942 children qPCR-positive for Shigella spp and 30 deaths occurred in 1384 qPCR-negative children (HR 2.2, 95% CI 1.2-3.9, p=0.0090), showing that Shigella was strongly associated with increased risk of death. INTERPRETATION: Risk of death is increased following MSD and, to a lesser extent, LSD. Considering there are approximately three times more cases of LSD than MSD in the population, more deaths are expected among children with LSD than in those with MSD. Because the major attributable LSD-associated and MSD-associated pathogens are the same, implementing vaccines and rapid diagnosis and treatment interventions against these major pathogens are rational investments. FUNDING: Bill & Melinda Gates Foundation.

      25. Applying infectious disease forecasting to public health: a path forward using influenza forecasting examples
        Lutz CS, Huynh MP, Schroeder M, Anyatonwu S, Dahlgren FS, Danyluk G, Fernandez D, Greene SK, Kipshidze N, Liu L, Mgbere O, McHugh LA, Myers JF, Siniscalchi A, Sullivan AD, West N, Johansson MA, Biggerstaff M.
        BMC Public Health. 2019 Dec 10;19(1):1659.
        BACKGROUND: Infectious disease forecasting aims to predict characteristics of both seasonal epidemics and future pandemics. Accurate and timely infectious disease forecasts could aid public health responses by informing key preparation and mitigation efforts. MAIN BODY: For forecasts to be fully integrated into public health decision-making, federal, state, and local officials must understand how forecasts were made, how to interpret forecasts, and how well the forecasts have performed in the past. Since the 2013-14 influenza season, the Influenza Division at the Centers for Disease Control and Prevention (CDC) has hosted collaborative challenges to forecast the timing, intensity, and short-term trajectory of influenza-like illness in the United States. Additional efforts to advance forecasting science have included influenza initiatives focused on state-level and hospitalization forecasts, as well as other infectious diseases. Using CDC influenza forecasting challenges as an example, this paper provides an overview of infectious disease forecasting; applications of forecasting to public health; and current work to develop best practices for forecast methodology, applications, and communication. CONCLUSIONS: These efforts, along with other infectious disease forecasting initiatives, can foster the continued advancement of forecasting science.

      26. Amphetamine use is higher among men who have sex with men (MSM) compared with other men, and is associated with sexual behavior linked to HIV transmission. No national estimates of amphetamine use among MSM with HIV have been published. We used data from the Medical Monitoring Project, a nationally representative sample of persons with diagnosed HIV, to describe patterns in amphetamine use in the past 12 months among MSM during 2015-2016 (N = 3796). Prevalence of amphetamine use in this population was 9.6% (95% CI 7.6, 11.6%) in the past 12 months. MSM who used amphetamines were more likely to have condomless sex with partners without HIV or of unknown serostatus (PR 1.87; 95% CI 1.62, 2.16) and less likely to be durably virally suppressed (PR 0.81; 95% CI 0.71, 0.91). Interventions to address amphetamine use and associated transmission risk behaviors among MSM living with HIV may decrease transmission.

      27. Clinical characteristics of enterovirus A71 neurological disease during an outbreak in children in Colorado, USA, in 2018: an observational cohort study
        Messacar K, Spence-Davizon E, Osborne C, Press C, Schreiner TL, Martin J, Messer R, Maloney J, Burakoff A, Barnes M, Rogers S, Lopez AS, Routh J, Gerber SI, Oberste MS, Nix WA, Abzug MJ, Tyler KL, Herlihy R, Dominguez SR.
        Lancet Infect Dis. 2019 Dec 16.
        BACKGROUND: In May, 2018, Children's Hospital Colorado noted an outbreak of enterovirus A71 (EV-A71) neurological disease. We aimed to characterise the clinical features of EV-A71 neurological disease during this outbreak. METHODS: In this retrospective observational cohort study, children (younger than 18 years) who presented to Children's Hospital Colorado (Aurora, CO, USA) between March 1 and November 30, 2018, with neurological disease (defined by non-mutually exclusive criteria, including meningitis, encephalitis, acute flaccid myelitis, and seizures) and enterovirus detected from any biological specimen were eligible for study inclusion. The clinical characteristics of children with neurological disease associated with EV-A71 were compared with those of children with neurological disease associated with other enteroviruses during the same period. To explore the differences in clinical presentation of acute flaccid myelitis, we also used a subgroup analysis to compare clinical findings in children with EV-A71-associated acute flaccid myelitis during the study period with these findings in those with enterovirus D68 (EV-D68)-associated acute flaccid myelitis at the same hospital between 2013 and 2018. FINDINGS: Between March 10 and Nov 10, 2018, 74 children presenting to Children's Hospital Colorado were found to have enterovirus neurological disease; EV-A71 was identified in 43 (58%) of these children. The median age of the children with EV-A71 neurological disease was 22.7 months (IQR 4.0-31.9), and most of these children were male (34 [79%] children). 40 (93%) children with EV-A71 neurological disease had findings suggestive of meningitis, 31 (72%) children showed evidence of encephalitis, and ten (23%) children met our case definition of acute flaccid myelitis. All children with EV-A71 disease had fever and 18 (42%) children had hand, foot, or mouth lesions at or before neurological onset. 
Children with EV-A71 disease were best differentiated from those with other enteroviruses (n=31) by the neurological findings of myoclonus, ataxia, weakness, and autonomic instability. Of the specimens collected from children with EV-A71, this enterovirus was detected in 94% of rectal, 79% of oropharyngeal, 56% of nasopharyngeal, and 20% of cerebrospinal fluid specimens. 39 (93%) of 42 children with EV-A71 neurological disease who could be followed up showed complete recovery by 1-2 months. Compared with children with EV-D68-associated acute flaccid myelitis, children with EV-A71-associated acute flaccid myelitis were younger, showed neurological onset earlier after prodromal symptom onset, had milder weakness, showed more rapid improvement, and were more likely to completely recover. INTERPRETATION: This outbreak of EV-A71 neurological disease, the largest reported in the Americas, was characterised by fever, myoclonus, ataxia, weakness, autonomic instability, and full recovery in most patients. Because EV-A71 epidemiology outside of Asia remains difficult to predict, identification of future outbreaks will be aided by prompt recognition of these distinct clinical findings, testing of non-sterile and sterile site specimens, and enhanced enterovirus surveillance. FUNDING: None.

      28. We sought to identify and compare correlates of condomless receptive anal intercourse with HIV-positive or unknown status partners (CRAI) for younger (< 25 years) and older (≥25 years) Hispanic/Latino, black/African-American, and white men who have sex with men (MSM). Baseline data from the Evaluation of Rapid HIV Self-Testing among MSM Project (eSTAMP), a randomized controlled trial with MSM (n = 2665, analytical sample size = 2421), were used. Potential correlates included participants' sociodemographic characteristics and HIV status as well as the characteristics of participants' partners. Younger Hispanic/Latino and black men were most likely to report having older sex partners (≥50% of partners being at least 5 years older), and having older partners was a significant correlate of CRAI among younger Hispanic/Latino and white men. Regardless of race/ethnicity, not knowing one's HIV status was a significant correlate of CRAI among younger men, whereas having a black sex partner was a significant correlate among older men. HIV prevention initiatives could address these and other correlates specific to race/ethnicity groups to target their prevention resources and messaging.

      29. Urban-rural disparities in treatment outcomes among recurrent TB cases in Southern Province, Zambia
        Mutembo S, Mutanga JN, Musokotwane K, Kanene C, Dobbin K, Yao X, Li C, Marconi VC, Whalen CC.
        BMC Infect Dis. 2019 Dec 30;19(1):1087.
        BACKGROUND: At least 13-20% of all tuberculosis (TB) cases are recurrent TB. Recurrent TB has critical public health importance because recurrent TB patients have a high risk of multidrug-resistant TB (MDR-TB). It is critical to understand variations in the prevalence and treatment outcomes of recurrent TB between different geographical settings. The objective of our study was to estimate the prevalence of recurrent TB among TB cases and compare the risk of unfavorable treatment outcomes between rural and urban settings. METHODS: In a retrospective cohort study conducted in Southern Province of Zambia, we used mixed-effects logistic regression to assess associations between explanatory and outcome variables. The primary outcome was all-cause mortality and the exposure was setting (rural/urban). Data were abstracted from the facility TB registers. RESULTS: Overall, 3566 recurrent TB cases were diagnosed among 25,533 TB patients. The prevalence of recurrent TB was 15.3% (95% CI: 14.8-15.9) in urban and 11.3% (95% CI: 10.7-12.0) in rural areas. Death occurred in 197 (5.5%), 103 (2.9%) were lost to follow-up, and 113 (3.2%) failed treatment. Rural settings had a 70% higher risk of death (adjusted OR: 1.7; 95% CI: 1.2-2.7). The risk of loss to follow-up was twice as high in rural as in urban settings (adjusted OR: 2.0; 95% CI: 1.3-3.0). Compared with HIV-uninfected individuals, HIV-infected individuals on antiretroviral treatment (ART) were 70% more likely to die (adjusted OR: 1.7; 95% CI: 1.2-3.1). CONCLUSION: Recurrent TB prevalence was generally high in both urban and rural settings. The risk of mortality and loss to follow-up was higher among rural patients. We recommend a well-organized directly observed therapy strategy, adapted to setting, in which heightened TB control activities are focused on areas with poor treatment outcomes.

      30. Temporal trends in the incidence of anogenital warts: Impact of human papillomavirus vaccination
        Naleway AL, Crane B, Smith N, Francisco M, Weinmann S, Markowitz LE.
        Sex Transm Dis. 2019 Dec 26.
        BACKGROUND: Studies in countries with high human papillomavirus (HPV) vaccination coverage have demonstrated marked reductions in anogenital warts (AGW) incidence. Our goal was to assess the impact of HPV vaccination in a population with suboptimal coverage by comparing AGW incidence trends in the years prior to and following vaccine introduction. METHODS: We conducted a retrospective analysis of AGW incidence trends using an ecologic study design among 11 through 39 year olds enrolled at Kaiser Permanente Northwest. We defined incidence as the proportion of persons who had a new AGW diagnosis for each calendar year in the pre-vaccine periods (2000 through 2006 for females; 2000 through 2010 for males) and the post-vaccine periods (2007 through 2016 for females; 2011 through 2016 for males). We also described cumulative HPV vaccination coverage. RESULTS: The average annual AGW incidence rates in the pre-vaccine periods were 27.8 per 10,000 in females and 26.9 per 10,000 in males. In the post-vaccine periods, AGW incidence rates decreased 31% (p<.001) in females and 10% (p=.006) in males; the largest reductions were observed in 15- to 19-year-old females (67%, p<.001) and males (45%, p<.001). Three-dose HPV coverage rates were less than 50% in all age groups and both sexes. CONCLUSIONS: In a population of young adults with moderate HPV vaccination coverage, we observed declines in AGW incidence among both females and males following the introduction of HPV vaccination. The largest incidence reductions were observed in 15- to 19-year-olds, who were most likely to have been vaccinated.

      31. Despite high pregnancy rates and HIV incidence among adolescents, their uptake of prevention of mother-to-child HIV transmission (PMTCT) services is not well characterized. This paper describes current PMTCT program coverage among adolescents aged <20 years. Using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, PubMed/MEDLINE (NCBI), SCOPUS (Elsevier), grey literature, EMBASE, and websites of international organizations and conferences were searched for eligible studies published from 2000 to 2017. Compared with adults, adolescents had lower rates of planned pregnancies, were less likely to know their HIV infection status before their first ANC visit, had lower use of ARVs, higher rates of loss to follow-up, and higher rates of MTCT. This study identified differential uptake of PMTCT services for adolescents compared with adults. Age-disaggregated data are urgently needed to understand the suboptimal uptake of HIV services for adolescents in PMTCT and to support the design of effective interventions to close these gaps.

      32. Bridging the knowledge gap on mycoses in Africa: setting up a Pan-African Mycology Working Group
        Oladele RO, Akase IE, Fahal A, Govender NP, Honigl M, Gangneux JP, Chiller TM, Denning DW, Cornely OA, Chakrabarti A.
        Mycoses. 2019 Dec 12.
        Most African countries have poorly funded and overburdened health systems. Additionally, a high prevalence of HIV in Sub-Saharan Africa contributes to a high burden of opportunistic fungal infections. Data generated by GAFFI from 15 of 57 African countries revealed that an estimated 47 million Africans suffer from fungal diseases, of whom an estimated 1.7 million suffer from a serious fungal infection annually. Almost all African countries lack a surveillance system for fungal infections, with the exception of South Africa. South Africa is also the only African country with a national mycology reference laboratory. Across the continent, there is a pervasive picture of inadequate diagnostic capacity, low awareness among health care workers and policy makers, and unavailability of and lack of access to essential antifungal medications. Recent outreach efforts by the International Society for Human and Animal Mycology (ISHAM) and the European Confederation of Medical Mycology (ECMM) have aimed to increase involvement of African countries and experts in global initiatives such as "One World One Guideline" and the ECMM Academy. Recently, under the auspices of ISHAM, the African sub-region created a network of mycology experts whose goal is to organize and engage African leaders in the field of medical mycology. The aim of this ISHAM Working Group is to facilitate interaction and synergy among regional leaders in order to develop educational programs for capacity building to aid in the diagnosis and care of patients with fungal infections in Africa. The working group will also encourage country initiatives to develop clinical guidelines, to support surveys, and to support the establishment of reference mycology laboratories.

      33. The control of diarrhea, the case of a rotavirus vaccine
        Parashar UD, Tate JE.
        Salud Publica Mex. 2020 Jan-Feb;62(1):1-5.

      34. On August 8, 2016, a confirmed case of mumps was reported to the Arkansas Department of Health (ADH) in an adult resident of Springdale, Arkansas. By July 2017, nearly 3,000 cases of mumps were reported to ADH from 37 of the 75 counties in Arkansas. Over 50% of cases were in the Arkansas Marshallese community, a close-knit community characterized by large, extended families sharing the same living space and communal activities. In a statewide effort, ADH collaborated with CDC, the Republic of the Marshall Islands' (RMI) Ministry of Health, and the Arkansas Department of Education (ADE) to rapidly respond to and contain the outbreak. We assessed the economic burden to ADH of the outbreak response in terms of containment and vaccination costs, as well as response costs incurred by CDC, RMI, and ADE. The 2016-2017 Arkansas mumps outbreak was the second largest US mumps outbreak in over 30 years and was unique in size, spread, and population affected. Total public health response costs as a result of the outbreak were over $2.1 million, approximately $725 per case. The costs incurred to control this outbreak reflect the response strategies tailored to the affected populations, including consideration of social, cultural, and political factors in controlling transmission and the need for distinctive public health outreach strategies. Aside from the burden these outbreaks have on the affected population, we demonstrate the potentially high economic burden of these outbreaks to public health.

      35. Cervical cancer risk in women living with HIV across four continents: A multicohort study
        Rohner E, Butikofer L, Schmidlin K, Sengayi M, Maskew M, Giddy J, Taghavi K, Moore RD, Goedert JJ, Gill MJ, Silverberg MJ, D'Souza G, Patel P, Castilho JL, Ross J, Sohn A, Bani-Sadr F, Taylor N, Paparizos V, Bonnet F, Verbon A, Vehreschild JJ, Post FA, Sabin C, Mocroft A, Dronda F, Obel N, Grabar S, Spagnuolo V, Quiros-Roldan E, Mussini C, Miro JM, Meyer L, Hasse B, Konopnicki D, Roca B, Barger D, Clifford GM, Franceschi S, Egger M, Bohlius J.
        Int J Cancer. 2020 Feb 1;146(3):601-609.
        We compared invasive cervical cancer (ICC) incidence rates in Europe, South Africa, Latin and North America among women living with HIV who initiated antiretroviral therapy (ART) between 1996 and 2014. We analyzed cohort data from the International Epidemiology Databases to Evaluate AIDS (IeDEA) and the Collaboration of Observational HIV Epidemiological Research in Europe (COHERE) in EuroCoord. We used flexible parametric survival models to determine regional ICC rates and risk factors for incident ICC. We included 64,231 women from 45 countries. During 320,141 person-years (pys), 356 incident ICC cases were diagnosed (Europe 164, South Africa 156, North America 19 and Latin America 17). Raw ICC incidence rates per 100,000 pys were 447 in South Africa (95% confidence interval [CI]: 382-523), 136 in Latin America (95% CI: 85-219), 76 in North America (95% CI: 48-119) and 66 in Europe (95% CI: 57-77). Compared with European women, ICC rates at 5 years after ART initiation were more than double in Latin America (adjusted hazard ratio [aHR]: 2.43, 95% CI: 1.27-4.68) and 11 times higher in South Africa (aHR: 10.66, 95% CI: 6.73-16.88), but similar in North America (aHR: 0.79, 95% CI: 0.37-1.71). Overall, ICC rates increased with age (>50 years vs. 16-30 years, aHR: 1.57, 95% CI: 1.03-2.40) and lower CD4 cell counts at ART initiation (per 100 cells/μL decrease, aHR: 1.25, 95% CI: 1.15-1.36). Improving access to early ART initiation and effective cervical cancer screening in women living with HIV should be key parts of global efforts to reduce cancer-related health inequities.
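        The crude rates quoted in this abstract follow directly from the reported counts: an incidence rate per 100,000 person-years is simply events divided by person-time, scaled. A minimal sketch using the overall figures from the abstract (356 incident ICC cases over 320,141 person-years):

```python
# Crude incidence rate = events / person-time, scaled to 100,000 person-years.
def incidence_per_100k(cases: int, person_years: float) -> float:
    return cases / person_years * 100_000

# Overall figures reported in the abstract above.
overall_rate = incidence_per_100k(356, 320_141)
print(f"{overall_rate:.1f} per 100,000 person-years")  # ~111.2
```

        Note that the overall crude rate sits between the region-specific rates (66 in Europe, 447 in South Africa), as expected for a person-time-weighted average.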

      36. The potential population-level impact of different gonorrhea screening strategies in Baltimore and San Francisco: an exploratory mathematical modeling analysis
        Ronn MM, Testa C, Tuite AR, Chesson HW, Gift TL, Schumacher C, Williford SL, Zhu L, Bellerose M, Earnest R, Malyuta Y, Hsu KK, Salomon JA, Menzies NA.
        Sex Transm Dis. 2019 Dec 12.
        BACKGROUND: Baltimore and San Francisco represent high burden areas for gonorrhea in the United States. We explored different gonorrhea screening strategies and their comparative impact in the two cities. METHODS: We used a compartmental transmission model of gonorrhea stratified by sex, sexual orientation, age, and race/ethnicity, calibrated to city-level surveillance data for 2010-2017. We analyzed the benefits of 5-year interventions that improved retention in the care cascade or increased screening from current levels. We also examined a 1-year outreach screening intervention targeting high-activity populations. RESULTS: In Baltimore, annual screening of the population aged 15-24 was the most efficient of the 5-year interventions, with 17.9 additional screening tests (95% credible interval [CrI] 11.8-31.4) needed per infection averted, while twice-annual screening of the same population averted the most infections overall (5.4%, 95% CrI 3.1-8.2%), with 25.3 (95% CrI 19.4-33.4) tests per infection averted. In San Francisco, quarterly screening of all men who have sex with men was the most efficient, with 16.2 additional tests (95% CrI 12.5-44.5) needed per infection averted, and it also averted the most infections (10.8%, 95% CrI 1.2-17.8%). Interventions that reduce loss to follow-up after diagnosis improved outcomes. Depending on the ability of a short-term outreach intervention to reach populations at higher acquisition risk, such interventions can offer efficient ways to expand screening coverage. CONCLUSIONS: Data on local gonorrhea prevalence distribution and time trends would improve the analyses. More focused intervention strategies could increase the impact and efficiency of screening interventions.
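        The study's compartmental model is finely stratified and calibrated to surveillance data; as a deliberately minimal illustration of the compartmental approach only, the sketch below simulates a one-population SIS (susceptible-infected-susceptible) model in which screening-and-treatment adds to the recovery rate. All parameter values are hypothetical and chosen purely to show the qualitative effect of expanded screening on equilibrium prevalence:

```python
# Toy SIS model: dI/dt = beta*S*I - (gamma + screen_rate)*I, with S = 1 - I.
# All parameters are hypothetical; the published model is far more detailed.
def sis_prevalence(beta: float, gamma: float, screen_rate: float,
                   steps: int = 20_000, dt: float = 0.01) -> float:
    """Return infected fraction after forward-Euler integration."""
    i = 0.05  # initial infected fraction
    for _ in range(steps):
        s = 1.0 - i
        i += dt * (beta * s * i - (gamma + screen_rate) * i)
    return i

baseline = sis_prevalence(beta=3.0, gamma=2.0, screen_rate=0.2)
expanded = sis_prevalence(beta=3.0, gamma=2.0, screen_rate=0.6)
print(f"baseline prevalence:          {baseline:.3f}")
print(f"with expanded screening:      {expanded:.3f}")  # lower than baseline
```

        In this toy model the endemic equilibrium is 1 - (gamma + screen_rate)/beta, so raising the screening rate lowers prevalence, mirroring (in caricature) why the paper compares screening strategies by infections averted per additional test.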

      37. The impact of concurrent antiretroviral therapy and MDR-TB treatment on adverse events
        Smith JP, Gandhi NR, Shah NS, Mlisana K, Moodley P, Johnson BA, Allana S, Campbell A, Nelson KN, Master I, Brust JC.
        J Acquir Immune Defic Syndr. 2020 Jan 1;83(1):47-55.
        BACKGROUND: South Africa has among the highest incidence of multidrug-resistant tuberculosis (MDR-TB) and more than 70% of patients are HIV co-infected. MDR-TB treatment is associated with frequent adverse events (AEs). Although guidelines recommend concurrent treatment of MDR-TB and HIV, safety data on concurrent therapy are limited. METHODS: We conducted a prospective observational study of MDR-TB patients with and without HIV-coinfection in South Africa between 2011 and 2015. Participants received standardized MDR-TB and HIV regimens. Participants were followed monthly for the duration of MDR-TB therapy and screened for clinical and laboratory AEs. Audiometry was performed monthly during the intensive phase; color discrimination testing was performed every 2 months. RESULTS: We enrolled 150 HIV-infected and 56 HIV-uninfected participants. Nearly all experienced at least one clinical (93%) or laboratory (96%) AE. The most common clinical AEs were peripheral neuropathy (50%) and difficulty sleeping (48%); the most common laboratory AEs were hypokalemia (47%) and decreased creatinine clearance (46%). Among 19 clinical and lab AEs examined, there were no differences by HIV status, except for diarrhea (27% HIV-infected vs. 13% HIV-uninfected, P = 0.03). Hearing loss was experienced by 72% of participants (8% severe loss). Fourteen percent experienced color discrimination loss (4% severe loss). There were no differences in frequency or severity of hearing or vision loss by HIV status. CONCLUSIONS: AEs were common, but not more frequent or severe among MDR-TB/HIV co-infected participants receiving concurrent antiretroviral therapy. Given the favorable treatment outcomes associated with concurrent treatment, antiretroviral therapy initiation should not be delayed in MDR-TB patients with HIV-coinfection.

      38. Syphilis management in pregnancy: a review of guideline recommendations from countries around the world
        Trinh T, Leal AF, Mello MB, Taylor MM, Barrow R, Wi TE, Kamb ML.
        Sex Reprod Health Matters. 2019 Dec;27(1):69-82.
        Guidelines can help healthcare practitioners manage syphilis in pregnancy and prevent perinatal death or disability. We conducted systematic reviews to locate guidance documents describing management of syphilis in pregnancy, 2003-2017. We compared country and regional guidelines with current World Health Organization (WHO) guidelines. We found 64 guidelines with recommendations on management of syphilis in pregnancy representing 128 of the 195 WHO member countries, including the two WHO guidelines published in 2016 and 2017. Of the 62 country and regional guidelines, 16 were for countries in Africa, 21 for the Americas, two for the Eastern Mediterranean, six for Europe and 17 for Asia or the Pacific. Fifty-seven (92%) guidelines recommended universal syphilis screening in pregnancy, of which 46 (81%) recommended testing at the first antenatal care visit. Also, 46 (81%) recommended repeat testing, including 21 guidelines that recommended testing during the third pregnancy trimester and/or at delivery. Fifty-nine (95%) guidelines recommended benzathine penicillin G (BPG) as the first-line therapy for syphilis in pregnancy, consistent with WHO guidelines. Alternative regimens to BPG were listed in 42 (68%) guidelines, primarily from Africa and Asia; only 20 specified that non-penicillin regimens are not proven effective in treating the fetus. We identified guidance recommending use of injectable penicillin in exposed infants for 112 countries. Most guidelines recommended universal syphilis testing for pregnant women, repeat testing for high-risk women and treatment of infected women with BPG; but several did not. Updating guidance on syphilis testing and treatment in pregnancy to reflect global norms could prevent congenital syphilis and save newborn lives.

      39. Bacteriologically-confirmed pulmonary tuberculosis in an Ethiopian prison: Prevalence from screening of entrant and resident prisoners
        Tsegaye Sahle E, Blumenthal J, Jain S, Sun S, Young J, Manyazewal T, Woldeamanuel H, Teferra L, Feleke B, Vandenberg O, Rey Z, Briggs-Hagen M, Haubrich R, Amogne W, McCutchan JA.
        PLoS One. 2019 ;14(12):e0226160.
        BACKGROUND: Pulmonary tuberculosis (PTB) is a major health problem in prisons. Multiple studies of TB in regional Ethiopian prisons have assessed prevalence and risk factors but have not examined recently implemented screening programs for TB in prisons. This study compares bacteriologically-confirmed PTB (BC-PTB) prevalence in prison entrants versus residents and identifies risk factors for PTB in Kality prison, a large federal Ethiopian prison located in Addis Ababa, through a study of an enhanced TB screening program. METHODS: Participating prisoners (n = 13,803) consisted of 8,228 entrants screened continuously and 5,575 residents screened in two cross-sectional waves for PTB symptoms, demographics, TB risk factors, and medical history. Participants reporting at least one symptom of PTB were asked to produce sputum, which was examined by microscopy for acid-fast bacilli, Xpert MTB/RIF assay and MGIT liquid culture. Prevalence of BC-PTB, defined as evidence of Mycobacterium tuberculosis (MTB) in sputum by the above methods, was compared between entrants and residents. Descriptive analysis of prevalence was followed by bivariate and multivariate analyses of risk factors. RESULTS: Prisoners were mainly male (86%), young (median age 26 years) and literate (89%). Prevalence of TB symptoms by screening was 17% (2,334/13,803), with rates in residents >5-fold higher than in entrants. Prevalence of BC-PTB detected by screening in participating prisoners was 0.16% (22/13,803). Prevalence in residents increased in the second resident screening compared to the first (R1 = 0.10% and R2 = 0.39%, p = 0.027), and remained higher than in entrants (4.3-fold higher during R1 and 3.1-fold higher during R2). Drug resistance (DR) was found in 38% (5/13) of culture-isolated MTB. Risk factors including ever having been diagnosed with TB, history of TB contact and low Body Mass Index (BMI) (<18.5) were significantly associated with BC-PTB (p<0.05). 
CONCLUSIONS: BC-PTB prevalence was strikingly lower than previously reported from other Ethiopian prisons. PTB appears to be transmitted within this prison based on its higher prevalence in residents than in entrants. Whether a sustained program of PTB screening of entrants and/or residents reduces prevalence of PTB in prisons is not clear from this study, but our findings suggest that resources should be prioritized to resident, rather than entrant, screening due to higher BC-PTB prevalence. Detection of multi- and mono-DR TB in both entrant and resident prisoners warrants regular screening for active TB and adoption of methods to detect drug resistance.
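The prevalence figures above (e.g., 22/13,803 = 0.16% BC-PTB overall) are small proportions for which a Wilson score interval is a standard way to attach uncertainty. A minimal sketch; the counts come from the abstract, the function itself is generic:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Overall BC-PTB prevalence reported in the abstract: 22/13,803
lo, hi = wilson_ci(22, 13803)
print(f"prevalence = {22/13803:.4%}, 95% CI ({lo:.4%}, {hi:.4%})")
```

The Wilson interval behaves better than the usual Wald interval when the proportion is this close to zero.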

      40. Andrew Vernon and co-authors discuss adherence to therapy and its measurement in tuberculosis treatment trials.

      41. Interim effect evaluation of the hepatitis C elimination programme in Georgia: a modelling studyexternal icon
        Walker JG, Kuchuloria T, Sergeenko D, Fraser H, Lim AG, Shadaker S, Hagan L, Gamkrelidze A, Kvaratskhelia V, Gvinjilia L, Aladashvili M, Asatiani A, Baliashvili D, Butsashvili M, Chikovani I, Khonelidze I, Kirtadze I, Kuniholm MH, Otiashvili D, Sharvadze L, Stvilia K, Tsertsvadze T, Zakalashvili M, Hickman M, Martin NK, Morgan J, Nasrullah M, Averhoff F, Vickerman P.
        Lancet Glob Health. 2019 Dec 18.
        BACKGROUND: Georgia has a high prevalence of hepatitis C, with 5.4% of adults chronically infected. On April 28, 2015, Georgia launched a national programme to eliminate hepatitis C by 2020 (90% reduction in prevalence) through scaled-up treatment and prevention interventions. We evaluated the interim effect of the programme and feasibility of achieving the elimination goal. METHODS: We developed a transmission model to capture the hepatitis C epidemic in Georgia, calibrated to data from biobehavioural surveys of people who inject drugs (PWID; 1998-2015) and a national survey (2015). We projected the effect of the administration of direct-acting antiviral treatments until Feb 28, 2019, and the effect of continuing current treatment rates until the end of 2020. Effect was estimated in terms of the relative decrease in hepatitis C incidence, prevalence, and mortality relative to 2015 and of the deaths and infections averted compared with a counterfactual of no treatment over the study period. We also estimated treatment rates needed to reach Georgia's elimination target. FINDINGS: From May 1, 2015, to Feb 28, 2019, 54 313 patients were treated, with approximately 1000 patients treated per month since mid 2017. Compared with 2015, our model projects that these treatments have reduced the prevalence of adult chronic hepatitis C by a median 37% (95% credible interval 30-44), the incidence of chronic hepatitis C by 37% (29-44), and chronic hepatitis C mortality by 14% (3-30) and have prevented 3516 (1842-6250) new infections and averted 252 (134-389) deaths related to chronic hepatitis C. Continuing treatment of 1000 patients per month is predicted to reduce prevalence by 51% (42-61) and incidence by 51% (40-62), by the end of 2020. To reach a 90% reduction by 2020, treatment rates must increase to 4144 (2963-5322) patients initiating treatment per month. 
INTERPRETATION: Georgia's hepatitis C elimination programme has achieved substantial treatment scale-up, which has reduced the burden of chronic hepatitis C. However, the country is unlikely to meet its 2020 elimination target unless treatment scales up considerably. FUNDING: CDC Foundation, National Institute for Health Research, National Institutes of Health.
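The authors' calibrated transmission model is not reproduced here, but the arithmetic of treatment scale-up can be illustrated with a toy monthly difference equation. Every parameter value below (pool size, incidence and clearance rates) is invented for illustration; only the two treatment rates are taken from the abstract:

```python
def project_prevalence(chronic0, beta, mu, cures_per_month, months):
    """Toy monthly update: new infections (beta), background clearance and
    deaths (mu), and a fixed number of treatment-induced cures per month."""
    c = chronic0
    for _ in range(months):
        c = max(c + beta * c - mu * c - cures_per_month, 0.0)
    return c

baseline = 150_000           # illustrative size of the chronic pool
months = 68                  # roughly May 2015 to end of 2020
for rate in (1_000, 4_144):  # treatment rates discussed in the abstract
    final = project_prevalence(baseline, 0.004, 0.002, rate, months)
    print(rate, round(100 * (1 - final / baseline)), "% reduction")
```

Even this crude sketch shows the qualitative point of the modelling study: a roughly fourfold increase in the monthly treatment rate is what separates a partial reduction from near-elimination over the same horizon.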

      42. Persons with multidrug-resistant tuberculosis (MDR-TB) have a disease resulting from a strain of tuberculosis (TB) that does not respond to at least isoniazid and rifampicin, the two most effective anti-TB drugs. MDR-TB is always treated with multiple antimicrobial agents. Our data consist of individual patient data from 31 international observational studies with varying prescription practices, access to medications, and distributions of antibiotic resistance. In this study, we develop identifiability criteria for the estimation of a global treatment importance metric in the context where not all medications are observed in all studies. With stronger causal assumptions, this treatment importance metric can be interpreted as the effect of adding a medication to the existing treatments. We then use this metric to rank 15 observed antimicrobial agents in terms of their estimated add-on value. Using the concept of transportability, we propose an implementation of targeted maximum likelihood estimation (TMLE), a doubly robust and locally efficient plug-in estimator, to estimate the treatment importance metric. A clustered sandwich estimator is adopted to compute variance estimates and produce confidence intervals. Simulation studies are conducted to assess the performance of our estimator, verify the double robustness property, and assess the appropriateness of the variance estimation approach.
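TMLE itself adds a targeting step to initial outcome and propensity fits, which is beyond a short sketch. As a minimal illustration of the doubly robust idea the abstract invokes, here is the closely related AIPW estimator on simulated data; all variable names, models, and parameter values are invented for illustration and are not from the study:

```python
import numpy as np

def fit_logistic(X, y, iters=50):
    """Logistic regression by Newton-Raphson (X includes an intercept column)."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ b))
        H = X.T @ (X * (p * (1 - p))[:, None]) + 1e-8 * np.eye(X.shape[1])
        b += np.linalg.solve(H, X.T @ (y - p))
    return b

def aipw_ate(X, a, y):
    """Augmented IPW (doubly robust) estimate of a binary treatment effect."""
    Xi = np.column_stack([np.ones(len(a)), X])
    ps = 1 / (1 + np.exp(-Xi @ fit_logistic(Xi, a)))   # propensity scores
    def ols_pred(mask):                                 # arm-specific outcome fit
        beta, *_ = np.linalg.lstsq(Xi[mask], y[mask], rcond=None)
        return Xi @ beta
    m1, m0 = ols_pred(a == 1), ols_pred(a == 0)
    return np.mean(a * (y - m1) / ps + m1 - (1 - a) * (y - m0) / (1 - ps) - m0)

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)
a = rng.binomial(1, 1 / (1 + np.exp(-0.5 * x)))         # confounded treatment
y = 1.0 * a + 0.8 * x + rng.normal(size=n)              # true effect = 1.0
print(round(aipw_ate(x[:, None], a, y), 2))
```

The estimator remains consistent if either the outcome regression or the propensity model is correctly specified, which is the "double robustness" property the simulation studies in the paper verify for TMLE.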

      43. Natural selection favoring more transmissible HIV detected in United States molecular transmission networkexternal icon
        Wertheim JO, Oster AM, Switzer WM, Zhang C, Panneer N, Campbell E, Saduvala N, Johnson JA, Heneine W.
        Nat Commun. 2019 Dec 19;10(1):5788.
        HIV molecular epidemiology can identify clusters of individuals with elevated rates of HIV transmission. These variable transmission rates are primarily driven by host risk behavior; however, the effect of viral traits on variable transmission rates is poorly understood. Viral load, the concentration of HIV in blood, is a heritable viral trait that influences HIV infectiousness and disease progression. Here, we reconstruct HIV genetic transmission clusters using data from the United States National HIV Surveillance System and report that viruses in clusters, inferred to be frequently transmitted, have higher viral loads at diagnosis. Further, viral load is higher in people in larger clusters and with increased network connectivity, suggesting that HIV in the United States is experiencing natural selection to be more infectious and virulent. We also observe a concurrent increase in viral load at diagnosis over the last decade. This evolutionary trajectory may be slowed by prevention strategies prioritized toward rapidly growing transmission clusters.

      44. Masculine gender norms, male circumcision, and men's engagement with health care in the Dominican Republicexternal icon
        Wiginton JM, Fleming PJ, Barrington C, Donastorg Y, Lerebours L, Brito MO.
        Glob Public Health. 2019 Dec 24:1-12.
        Overall, adult men are less likely to seek and receive health care than women, but male circumcision for HIV prevention has been successful in engaging men in health services. The purpose of this paper is to examine the relationship between masculine norms and health care-seeking among men participating in a voluntary male medical circumcision (VMMC) programme in the Dominican Republic (DR). We employed a mixed methods approach integrating survey data collected 6-12 months post-circumcision (n = 293) and in-depth interviews with a sub-sample of these men (n = 30). In our qualitative analysis, we found that health care-seeking is connected to masculine norms among men in the DR, including the perceptions of medical facilities as feminine spaces. Participants' narratives demonstrate that male circumcision programmes may facilitate men overcoming masculinity-related barriers to health care engagement. In quantitative analysis, we found that being concerned about being perceived as masculine was associated with health care-seeking behaviour in the past five years, though this association was not retained in multivariable analyses. Findings indicate that male circumcision programmes can familiarise men with the healthcare system and masculinise health care-seeking and utilisation, easing associated discomfort.

      45. Assessment of readiness to transition from antenatal HIV surveillance surveys to PMTCT programme data-based HIV surveillance in South Africa: The 2017 Antenatal Sentinel HIV Surveyexternal icon
        Woldesenbet SA, Kufa T, Barron P, Ayalew K, Cheyip M, Chirombo BC, Lombard C, Manda S, Pillay Y, Puren AJ.
        Int J Infect Dis. 2019 Nov 9;91:50-56.
        OBJECTIVE: South Africa has used antenatal HIV surveys for HIV surveillance in pregnant women since 1990. We assessed South Africa's readiness to transition to programme data-based antenatal HIV surveillance with respect to PMTCT uptake, accuracy of point-of-care rapid testing (RT), and selection bias associated with using programme data, in the context of the 2017 antenatal HIV survey. METHODS: Between 1 October and 15 November 2017, the national survey was conducted in 1,595 public antenatal facilities selected using a stratified multistage cluster sampling method. Results of point-of-care RT were obtained from medical records. Blood samples were taken from eligible pregnant women and tested for HIV using immunoassays (IA) in the laboratory. Descriptive statistics were used to report on: PMTCT uptake; agreement between HIV point-of-care RT and laboratory-based HIV-1 IA; and selection bias associated with using programme data for surveillance. RESULTS: PMTCT HIV testing uptake was high (99.8%). The positive percent agreement (PPA) between RT and IA was lower than the World Health Organization (WHO) benchmark (97.6%) at 96.3% (95% confidence interval (CI): 95.9%-96.6%). The negative percent agreement was above the WHO benchmark (99.5%), at 99.7% (95% CI: 99.6%-99.7%) nationally. PPA varied markedly by province (92.9%-98.3%). Selection bias due to exclusion of participants with no RT results was within the recommended threshold at 0.3%. CONCLUSION: For the three components assessed, South Africa was close to meeting the WHO standard for transitioning to routine RT data for antenatal HIV surveillance. The wide variations in PPA across provinces should be addressed.
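The PPA/NPA comparison described above reduces to arithmetic on the RT-by-IA cross-classification. A minimal sketch, with hypothetical counts chosen only so the point estimates match the abstract's national figures (96.3% and 99.7%), and Wilson score intervals attached:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion."""
    p, d = k / n, 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / d
    half = (z / d) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

def percent_agreement(tp, fn, tn, fp):
    """PPA/NPA of the rapid test against the laboratory immunoassay."""
    return (tp / (tp + fn), wilson_ci(tp, tp + fn)), \
           (tn / (tn + fp), wilson_ci(tn, tn + fp))

# Hypothetical cross-classification counts (not the survey's actual cells)
(ppa, ppa_ci), (npa, npa_ci) = percent_agreement(tp=9630, fn=370, tn=29910, fp=90)
print(f"PPA {ppa:.1%}, NPA {npa:.1%}")  # PPA 96.3%, NPA 99.7%
```

PPA conditions on laboratory-positive specimens and NPA on laboratory-negative ones, which is why the two benchmarks are evaluated separately.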

      46. Viral suppression and factors associated with failure to achieve viral suppression among pregnant women in South Africa: a national cross-sectional surveyexternal icon
        Woldesenbet SA, Kufa T, Barron P, Chirombo BC, Cheyip M, Ayalew K, Lombard C, Manda S, Diallo K, Pillay Y, Puren AJ.
        Aids. 2019 Dec 9.
        OBJECTIVE: To describe viral load (VL) levels among pregnant women and factors associated with failure to achieve viral suppression (VS; VL ≤50 copies/mL) during pregnancy. DESIGN: Between 1 October and 15 November 2017, a cross-sectional survey was conducted among 15-49 year old pregnant women attending antenatal care (ANC) in 1,595 nationally representative public facilities. METHODS: Blood specimens were taken from each pregnant woman and tested for HIV. VL testing was done on all HIV-positive specimens. Demographic and clinical data were extracted from medical records or self-reported. Survey logistic regression examined factors associated with failure to achieve VS. RESULTS: Of 10,052 HIV-positive participants with VL data, 56.2% were virally suppressed. Participants initiating ART prior to pregnancy had higher VS (71.0%) by their third trimester compared with participants initiating ART during pregnancy (59.3%). Booking for ANC during the third trimester vs earlier (adjusted odds ratio (AOR): 1.8, 95% confidence interval (CI): 1.4-2.3), low frequency of ANC visits (AOR for 2 ANC visits vs ≥4 ANC visits: 2.0, 95% CI: 1.7-2.4), delayed initiation of ART (AOR for ART initiated in the second trimester vs before pregnancy: 2.2, 95% CI: 1.8-2.7), and younger age (AOR for 15-24 years vs 35-49 years: 1.4, 95% CI: 1.2-1.8) were associated with failure to achieve VS at the third trimester. CONCLUSION: Failure to achieve VS was primarily associated with late initiation of ANC and late initiation of ART. Efforts to improve early ANC booking and early ART initiation in the general population would help improve VS rates among pregnant women. In addition, the study found that, despite initiating ART prior to pregnancy, more than one quarter of participants did not achieve VS in their third trimester. This highlights the need to closely monitor VL and strengthen counselling and support services for ART adherence.

    • Disaster Control and Emergency Services
      1. Preparedness for influenza vaccination during a pandemic in the World Health Organization Western Pacific Regionexternal icon
        Bell L, Peters L, Heffelfinger JD, Sullivan SG, Vilajeliu A, Shin J, Bresee J, Dueger E.
        Western Pac Surveill Response J. 2018;9(5 Suppl 1):11-14.

      2. CDC's multiple approaches to safeguard the health, safety, and resilience of Ebola respondersexternal icon
        Klomp RW, Jones L, Watanabe E, Thompson WW.
        Prehosp Disaster Med. 2019 Dec 10:1-7.
        Over 27,000 people were sickened by Ebola and over 11,000 people died between March of 2014 and June of 2016. The US Centers for Disease Control and Prevention (CDC; Atlanta, Georgia USA) was one of many public health organizations that sought to stop this outbreak. This agency deployed almost 2,000 individuals to West Africa during that timeframe. Deployment to these countries exposed these individuals to a wide variety of dangers, stressors, and risks. Being concerned about the at-risk populations in Africa, and also the well-being of its professionals who willingly deployed, the CDC did several things to help safeguard the health, safety, and resilience of these team members before, during, and after deployment. The accompanying special report highlights innovative pre-deployment training initiatives, customized screening processes, and post-deployment outreach efforts intended to protect and support the public health professionals fighting Ebola. Before deploying, the CDC team members were expected to participate in both internally-created and externally-provided trainings. These ranged from pre-deployment briefings, to Preparing for Work Overseas (PFWO) and Public Health Readiness Certificate Program (PHRCP) courses, to Incident Command System (ICS) 100, 200, and 400 courses. A small subset of non-clinical deployers also participated in a three-day training designed in collaboration with the Center for the Study of Traumatic Stress (CSTS; Bethesda, Maryland USA) to train individuals to assess and address the well-being and resilience of themselves and their teammates in the field during a deployment.
Participants in this unique training were immersed in a Virtual Reality Environment (VRE) that simulated deployment to one of seven different types of emergencies. The CDC leadership also requested a pre-deployment screening process that helped professionals in the CDC's Occupational Health Clinic (OHC) determine whether or not individuals were at an increased risk of negative outcomes by participating in a rigorous deployment at that time. When deployers returned from the field, they received personalized invitations to participate in a voluntary, confidential, post-deployment operational debriefing, either one-on-one or in a group. Implementing these approaches provided more information to clinical decision makers about the readiness of deployers. It provided deployers with a greater awareness of the kinds of challenges they were likely to face in the field. The post-deployment outreach efforts reminded staff that their contributions were appreciated and that resources were available if they needed help processing any of the potentially-traumatizing things they may have experienced.

      3. The United States (US) and Caribbean regions remain vulnerable to the impact of severe tropical storms, hurricanes, and typhoons. In 2017, a series of hurricanes posed threats to residents living in inland and coastal communities as well as on islands isolated from the US mainland. Harvey, Irma, Jose, and Maria caused catastrophic infrastructure damage, resulting in a loss of electrical power and communications due to damaged or downed utility poles, cell towers, and transmission lines. Emergency managers are public officials who are accountable to both political leaders and citizens. During disaster events, emergency managers must prioritize areas of effort, manage personnel, and communicate with stakeholders to address critical infrastructure interdependencies. Essential lifeline services (eg, energy and communications) were inoperable for many months, which led to increased attention from policy-makers, media, and the public.

      4. PanStop: a decade of rapid containment exercises for pandemic preparedness in the WHO Western Pacific Regionexternal icon
        Moturi E, Horton K, Bell L, Breakwell L, Dueger E.
        Western Pac Surveill Response J. 2018;9(5 Suppl 1):71-74.

    • Disease Reservoirs and Vectors
      1. Isolation and characterization of Akhmeta virus from wild-caught rodents (Apodemus spp.) in Georgiaexternal icon
        Doty JB, Maghlakelidze G, Sikharulidze I, Tu SL, Morgan CN, Mauldin MR, Parkadze O, Kartskhia N, Turmanidze M, Matheny AM, Davidson W, Tang S, Gao J, Li Y, Upton C, Carroll DS, Emerson GL, Nakazawa Y.
        J Virol. 2019 Dec 15;93(24).
        In 2013, a novel orthopoxvirus was detected in skin lesions of two cattle herders from the Kakheti region of Georgia (country); this virus was named Akhmeta virus. Subsequent investigation of these cases revealed that small mammals in the area had serological evidence of orthopoxvirus infections, suggesting their involvement in the maintenance of these viruses in nature. In October 2015, we began a longitudinal study assessing the natural history of orthopoxviruses in Georgia. As part of this effort, we trapped small mammals near Akhmeta (n = 176) and Gudauri (n = 110). Here, we describe the isolation and molecular characterization of Akhmeta virus from lesion material and pooled heart and lung samples collected from five wood mice (Apodemus uralensis and Apodemus flavicollis) in these two locations. The genomes of Akhmeta virus obtained from rodents group into 2 clades: one clade represented by viruses isolated from A. uralensis samples, and one clade represented by viruses isolated from A. flavicollis samples. These genomes also display several presumptive recombination events for which gene truncation and identity have been examined. IMPORTANCE: Akhmeta virus is a unique Orthopoxvirus that was described in 2013 from the country of Georgia. This paper presents the first isolation of this virus from small mammal (Rodentia; Apodemus spp.) samples and the molecular characterization of those isolates. The identification of the virus in small mammals is an essential component to understanding the natural history of this virus and its transmission to human populations and could guide public health interventions in Georgia. Akhmeta virus genomes harbor evidence suggestive of recombination with a variety of other orthopoxviruses; this has implications for the evolution of orthopoxviruses, their ability to infect mammalian hosts, and their ability to adapt to novel host species.

      2. Attenuated strains of rabies virus (RABV) have been used for oral vaccination of wild carnivores in Europe and North America. However, some RABV vaccines caused clinical rabies in target animals. To improve the safety of attenuated RABV as an oral vaccine for field use, strategies using selection of escape mutants under monoclonal antibody neutralization pressure and reverse genetics-defined mutations have been used. We tested the safety, immunogenicity, and efficacy of one RABV construct developed with reverse genetics, ERA-g333, administered by intramuscular (IM) or oral (PO) routes in big brown bats (Eptesicus fuscus). Twenty-five bats received 5 × 10^6 mouse intracerebral median lethal doses (MICLD50) of ERA-g333 by the IM route, 10 received 5 × 10^6 MICLD50 of ERA-g333 by the PO route, and 22 bats served as unvaccinated controls. Twenty-one days after vaccination, 44 bats were infected by the IM route with 10^2.9 MICLD50 of E. fuscus RABV. We report both the immunogenicity and efficacy of ERA-g333 delivered by the IM route; no induction of humoral immunity was detected in bats vaccinated by the PO route. Two subsets of bats vaccinated IM (n=5) and PO (n=3) were not challenged, and none developed clinical rabies from ERA-g333. Scarce reports exist on the evaluation of oral rabies vaccines in insectivorous bats, although the strategy evaluated here may be feasible for future application to these important RABV reservoirs.

      3. Associations between household environmental factors and immature mosquito abundance in Quetzaltenango, Guatemalaexternal icon
        Madewell ZJ, Sosa S, Brouwer KC, Juarez JG, Romero C, Lenhart A, Cordon-Rosales C.
        BMC Public Health. 2019 Dec 23;19(1):1729.
        BACKGROUND: Aedes aegypti-borne diseases are becoming major public health problems in tropical and sub-tropical regions. While socioeconomic status has been associated with larval mosquito abundance, the drivers or possible factors mediating this association, such as environmental factors, are yet to be identified. We examined possible associations between proximity to houses and roads and immature mosquito abundance, and assessed whether these factors and mosquito prevention measures mediated any association between household environmental factors and immature mosquito abundance. METHODS: We conducted two cross-sectional household container surveys in February-March and November-December, 2017, in urban and rural areas of Quetzaltenango, Guatemala. We used principal components analysis to identify factors from 12 variables to represent the household environment. One factor which included number of rooms in house, electricity, running water, garbage service, cable, television, telephone, latrine, well, and sewer system, was termed "environmental capital." Environmental capital scores ranged from 0 to 5.5. Risk factors analyzed included environmental capital, and distance from nearest house/structure, paved road, and highway. We used Poisson regression to determine associations between distance to nearest house/structure, roads, and highways, and measures of immature mosquito abundance (total larvae, total pupae, and positive containers). Using cubic spline generalized additive models, we assessed non-linear associations between environmental capital and immature mosquito abundance. We then examined whether fumigation, cleaning containers, and distance from the nearest house, road, and highway mediated the relationship between environmental capital and larvae and pupae abundance. RESULTS: We completed 508 household surveys in February-March, and we revisited 469 households in November-December. 
Proximity to paved roads and other houses/structures was positively associated with larvae and pupae abundance and mediated the associations between environmental capital and total numbers of larvae/pupae (p ≤ 0.01). Distance to highways was not associated with larval/pupal abundance (p ≥ 0.48). Households with the lowest and highest environmental capital had fewer larvae/pupae than households in the middle range (p < 0.01). CONCLUSIONS: We found evidence that proximity to other houses and paved roads was associated with greater abundance of larvae and pupae. Understanding risk factors such as these can allow for improved targeting of surveillance and vector control measures in areas considered at higher risk for arbovirus transmission.
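The pipeline this abstract describes, a principal-component "environmental capital" score fed into a Poisson count regression, can be sketched on synthetic data. Everything below, from the indicator matrix to the coefficient values, is invented for illustration and does not use the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400

# Hypothetical binary stand-ins for the household-environment indicators
env = rng.binomial(1, 0.5, size=(n, 10)).astype(float)

# First principal component of the standardized indicators as the score
Z = (env - env.mean(0)) / env.std(0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
capital = Z @ Vt[0]

# Synthetic larval counts driven by standardized distance and the score
dist = rng.exponential(50.0, n)                 # distance to nearest house, say
d = (dist - dist.mean()) / dist.std()
larvae = rng.poisson(np.exp(1.0 - 0.5 * d + 0.2 * capital))

# Poisson regression (log link) fitted by Newton-Raphson
X = np.column_stack([np.ones(n), d, capital])
b = np.zeros(3)
for _ in range(50):
    mu = np.exp(X @ b)
    H = X.T @ (X * mu[:, None]) + 1e-8 * np.eye(3)
    b += np.linalg.solve(H, X.T @ (larvae - mu))
print(np.round(b, 2))   # estimates should be near the true (1.0, -0.5, 0.2)
```

Exponentiating a Poisson coefficient gives a rate ratio per unit of the covariate, which is how associations like "proximity to paved roads" are typically reported.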

      4. The white-footed mouse, Peromyscus leucopus (Rafinesque), is a reservoir for the Lyme disease spirochete Borrelia burgdorferi sensu stricto in the eastern half of the United States, where the blacklegged tick, Ixodes scapularis Say (Acari: Ixodidae), is the primary vector. In the Midwest, an additional Lyme disease spirochete, Borrelia mayonii, was recorded from naturally infected I. scapularis and P. leucopus. However, an experimental demonstration of reservoir competence was lacking for a natural tick host. We therefore experimentally infected P. leucopus with B. mayonii via I. scapularis nymphal bites and then fed uninfected larvae on the mice to demonstrate spirochete acquisition and passage to resulting nymphs. Of 23 mice fed on by B. mayonii-infected nymphs, 21 (91%) developed active infections. The infection prevalence for nymphs fed as larvae on these infected mice 4 wk post-infection ranged from 56 to 98%, and the overall infection prevalence for 842 nymphs across all 21 P. leucopus was 75% (95% confidence interval, 72-77%). To assess duration of infectivity, 10 of the P. leucopus were reinfested with uninfected larval ticks 12 wk after the mice were infected. The overall infection prevalence for 480 nymphs across all 10 P. leucopus at the 12-wk time point was 26% (95% confidence interval, 23-31%), when compared with 76% (95% confidence interval, 71-79%) for 474 nymphs from the same subset of 10 mice at the 4-wk time point. We conclude that P. leucopus is susceptible to infection with B. mayonii via bite by I. scapularis nymphs and an efficient reservoir for this Lyme disease spirochete.
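The 4-wk versus 12-wk contrast above (76% vs 26% nymphal infection prevalence) is a comparison of two proportions. A minimal sketch of a two-proportion z-test, with counts approximated from the abstract's percentages and denominators:

```python
import math

def two_prop_z(k1, n1, k2, n2):
    """Two-proportion z-test: difference in prevalence and z statistic."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return p1 - p2, (p1 - p2) / se

# ~26% of 480 nymphs at 12 wk vs ~76% of 474 nymphs at 4 wk (approximated)
diff, z = two_prop_z(125, 480, 360, 474)
print(f"difference = {diff:.1%}, z = {z:.1f}")
```

A |z| this large corresponds to a vanishingly small p-value, consistent with the abstract's conclusion that infectivity wanes substantially between 4 and 12 weeks.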

    • Environmental Health
      1. Urine bisphenol A and arsenic levels in residents of the Cheyenne River Sioux Tribe, South Dakota, with and without diabetesexternal icon
        Chang A, Ridpath A, Carpenter J, Kieszak S, Sircar K, Espinosa-Bode A, Nelson D, Martin C.
        J Med Toxicol. 2019 Dec 17.
        INTRODUCTION: Diabetes disproportionately affects American Indians/Alaskan Natives (AI/AN). Bisphenol A (BPA) and arsenic (As), environmental toxicants which may be associated with diabetes, have not been well studied in this population. Our objectives were to determine if urinary BPA and As are associated with diabetes among adults in the Cheyenne River Sioux Tribe (CRST), and to compare their urinary levels with the general US population. METHODS: We performed a case-control study among 276 volunteers. We matched our cases (persons with diabetes) and controls (persons without diabetes) using age. We collected questionnaire data and urine samples which were tested for BPA and speciated As analytes. We used paired t tests and McNemar's chi-square test to compare continuous and categorical variables, respectively, between cases and controls and linear regression to assess the association between self-reported exposures and BPA and As levels. We used conditional logistic regression to investigate the association between case status and BPA and As levels. BPA and As levels among participants were compared with those from the 2011-2012 National Health and Nutrition Examination Survey (NHANES). RESULTS: The average age of participants was 46 years. The majority identified as AI/AN race (97%) and 58% were female. The geometric means from CRST participant urine specimens were 1.83 µg/L for BPA and 3.89 µg/L for total As. BPA geometric means of CRST participants were higher than NHANES participants while total As geometric means were lower. BPA and As were not associated with case status. CONCLUSION: The results of this study are consistent with others that have reported no association between diabetes and exposure to BPA or As.
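Urinary biomarker levels like those above are summarized with geometric rather than arithmetic means because the concentrations are right-skewed. A minimal sketch; the concentration values are made up for illustration:

```python
import math

def geometric_mean(values):
    """Geometric mean: exponentiated mean of the log-transformed values."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical urinary BPA concentrations (µg/L) for a handful of participants
print(round(geometric_mean([0.9, 1.4, 2.2, 3.1, 1.8]), 2))  # prints 1.73
```

The geometric mean is always at or below the arithmetic mean, so it is less distorted by a few participants with very high exposures.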

      2. Exposure of nail salon workers to phthalates, di(2-ethylhexyl) terephthalate, and organophosphate esters: A pilot studyexternal icon
        Craig JA, Ceballos DM, Fruh V, Petropoulos ZE, Allen JG, Calafat AM, Ospina M, Stapleton HM, Hammel S, Gray R, Webster TF.
        Environ Sci Technol. 2019 Dec 17;53(24):14630-14637.
        Relatively little is known about the exposure of nail technicians to semivolatile organic compounds (SVOCs) in nail salons. We collected preshift and postshift urine samples and silicone wrist bands (SWBs) worn on lapels and wrists from 10 female nail technicians in the Boston area in 2016-17. We analyzed samples for phthalates, phthalate alternatives, and organophosphate esters (OPEs) or their metabolites. Postshift urine concentrations were generally higher than preshift concentrations for SVOC metabolites; the greatest change was for a metabolite of the phthalate alternative di(2-ethylhexyl) terephthalate (DEHTP): mono(2-ethyl-5-carboxypentyl) terephthalate (MECPTP) more than tripled from 11.7 to 36.6 µg/g creatinine. DEHTP biomarkers were higher in our study participants' postshift urine compared to 2015-2016 National Health and Nutrition Examination Survey females. Urinary MECPTP and another DEHTP metabolite were moderately correlated (r = 0.37-0.60) with DEHTP on the SWBs, suggesting occupation as a source of exposure. Our results suggest that nail technicians are occupationally exposed to certain phthalates, phthalate alternatives, and OPEs, with metabolites of DEHTP showing the largest increase across a work day. The detection of several of these SVOCs on SWBs suggests that they can be used as a tool for examining potential occupational exposures to SVOCs among nail salon workers.

      3. Notes from the field: Methylmercury toxicity from a skin lightening cream obtained from Mexico - California, 2019external icon
        Mudan A, Copan L, Wang R, Pugh A, Lebin J, Barreau T, Jones RL, Ghosal S, Lee M, Albertson T, Jarrett JM, Lee J, Betting D, Ward CD, De Leon Salazar A, Smollin CG, Blanc PD.
        MMWR Morb Mortal Wkly Rep. 2019 Dec 20;68(50):1166-1167.

      4. Association of long-term ambient ozone exposure with respiratory morbidity in smokersexternal icon
        Paulin LM, Gassett AJ, Alexis NE, Kirwa K, Kanner RE, Peters S, Krishnan JA, Paine R, Dransfield M, Woodruff PG, Cooper CB, Barr RG, Comellas AP, Pirozzi CS, Han M, Hoffman EA, Martinez FJ, Woo H, Peng RD, Fawzy A, Putcha N, Breysse PN, Kaufman JD, Hansel NN.
        JAMA Intern Med. 2019 Dec 9.
        Importance: Few studies have investigated the association of long-term ambient ozone exposures with respiratory morbidity among individuals with a heavy smoking history. Objective: To investigate the association of historical ozone exposure with risk of chronic obstructive pulmonary disease (COPD), computed tomography (CT) scan measures of respiratory disease, patient-reported outcomes, disease severity, and exacerbations in smokers with or at risk for COPD. Design, Setting, and Participants: This multicenter cross-sectional study, conducted from November 1, 2010, to July 31, 2018, obtained data from the Air Pollution Study, an ancillary study of SPIROMICS (Subpopulations and Intermediate Outcome Measures in COPD Study). Data analyzed were from participants enrolled at 7 (New York City, New York; Baltimore, Maryland; Los Angeles, California; Ann Arbor, Michigan; San Francisco, California; Salt Lake City, Utah; and Winston-Salem, North Carolina) of the 12 SPIROMICS clinical sites. Included participants had historical ozone exposure data (n = 1874), were either current or former smokers (≥20 pack-years), were with or without COPD, and were aged 40 to 80 years at baseline. Healthy persons with a smoking history of 1 or more pack-years were excluded from the present analysis. Exposures: The 10-year mean historical ambient ozone concentration at participants' residences estimated by cohort-specific spatiotemporal modeling. Main Outcomes and Measures: Spirometry-confirmed COPD, chronic bronchitis diagnosis, CT scan measures (emphysema, air trapping, and airway wall thickness), 6-minute walk test, modified Medical Research Council (mMRC) Dyspnea Scale, COPD Assessment Test (CAT), St. George's Respiratory Questionnaire (SGRQ), postbronchodilator forced expiratory volume in the first second of expiration (FEV1) % predicted, and self-report of exacerbations in the 12 months before SPIROMICS enrollment, adjusted for demographics, smoking, and job exposure.
Results: A total of 1874 SPIROMICS participants were analyzed (mean [SD] age, 64.5 [8.8] years; 1479 [78.9%] white; and 1013 [54.1%] male). In adjusted analysis, a 5-ppb (parts per billion) increase in ozone concentration was associated with a greater percentage of emphysema (beta = 0.94; 95% CI, 0.25-1.64; P = .007) and percentage of air trapping (beta = 1.60; 95% CI, 0.16-3.04; P = .03); worse scores for the mMRC Dyspnea Scale (beta = 0.10; 95% CI, 0.03-0.17; P = .008), CAT (beta = 0.65; 95% CI, 0.05-1.26; P = .04), and SGRQ (beta = 1.47; 95% CI, 0.01-2.93; P = .048); lower FEV1% predicted value (beta = -2.50; 95% CI, -4.42 to -0.59; P = .01); and higher odds of any exacerbation (odds ratio [OR], 1.37; 95% CI, 1.12-1.66; P = .002) and severe exacerbation (OR, 1.37; 95% CI, 1.07-1.76; P = .01). No association was found between historical ozone exposure and chronic bronchitis, COPD, airway wall thickness, or 6-minute walk test result. Conclusions and Relevance: This study found that long-term historical ozone exposure was associated with reduced lung function, greater emphysema and air trapping on CT scan, worse patient-reported outcomes, and increased respiratory exacerbations for individuals with a history of heavy smoking. The association between ozone exposure and adverse respiratory outcomes suggests the need for continued reevaluation of ambient pollution standards that are designed to protect the most vulnerable members of the US population.
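Effect estimates like the odds ratios above are reported per 5-ppb increase, which is a rescaling of the fitted per-unit log-odds coefficient. A minimal sketch; the coefficient and standard error below are hypothetical values chosen for illustration, not the study's fitted model:

```python
import math

def odds_ratio_per_increment(beta_per_unit, se_per_unit, increment, z=1.96):
    """Rescale a per-unit log-odds coefficient to an OR per `increment` units."""
    b = beta_per_unit * increment
    s = se_per_unit * increment
    return math.exp(b), (math.exp(b - z * s), math.exp(b + z * s))

# Hypothetical: a logistic coefficient of 0.063 per ppb (SE 0.02) becomes
or5, (lo, hi) = odds_ratio_per_increment(0.063, 0.02, 5)
print(f"OR per 5 ppb = {or5:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Because the rescaling happens on the log-odds scale, both the point estimate and the confidence limits are exponentiated after multiplying by the increment.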

      5. Hypertension in relation to dioxins and polychlorinated biphenyls from the Anniston Community Health Survey Follow-Up
        Pavuk M, Serio TC, Cusack C, Cave M, Rosenbaum PF, Birnbaum LS.
        Environ Health Perspect. 2019 Dec;127(12):127007.
        BACKGROUND: In 2014, we conducted a longitudinal study [Anniston Community Health Survey (ACHS II)] 8 y after the baseline (ACHS I). OBJECTIVES: We investigated the relationship between persistent chlorinated compounds and hypertension in residents living around the former polychlorinated biphenyl (PCB) production plant in Anniston, Alabama. We also examined the potential role of inflammatory cytokines in those with hypertension. METHODS: A total of 338 participants had their blood pressure measured and medications recorded, gave a blood sample, and completed a questionnaire. Prevalent hypertension was defined as taking antihypertensive medication or having systolic blood pressure >140 mmHg and/or diastolic pressure >90 mmHg; incident hypertension used similar criteria in those who developed hypertension since the baseline in 2005-2007. PCB congeners were categorized into structure-activity groups, and toxic equivalencies (TEQs) were calculated for dioxin-like compounds. Descriptive statistics, logistic and linear regressions, as well as Cox proportional hazard models, were used to analyze the associations between exposures and hypertension. RESULTS: Prevalent hypertension (78%) in ACHS II showed statistically significant adjusted odds ratios (ORs) for PCBs 74, 99, 138, 153, 167, 177, 183, and 187, ranging from 2.18 [95% confidence interval (CI): 1.10, 4.33] to 2.76 (95% CI: 1.14, 6.73), as well as for two estrogenic-like PCB groups, and the thyroid-like group [ORs ranging from 2.25 (95% CI: 1.07, 4.75) to 2.54 (95% CI: 1.13, 5.74)]. Furthermore, analysis of quartiles demonstrated a monotonic relationship for dioxin-like non-ortho (non-o)-PCB TEQs [fourth vs. first quartile: 3.66 (95% CI: 1.40, 9.56)]. Longitudinal analyses of incident hypertension supported those positive associations. 
The results were strongest for the di-o-PCBs [hazard ratio (HR)=1.93 (95% CI: 0.93, 4.00)] and estrogenic II PCB group [HR=1.90 (95% CI: 0.96, 3.78)] but were weaker for the dioxin TEQs. DISCUSSION: Findings supportive of positive associations were reported for dioxin-like mono-o- and non-o-PCBs as well as for nondioxin-like estrogenic and thyroid-like congeners with prevalent and incident hypertension, suggesting that multiple pathways may be involved in hypertension development. https://doi.org/10.1289/EHP5272.
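        The toxic equivalency (TEQ) approach referenced above weights each dioxin-like congener's concentration by a toxic equivalency factor (TEF) and sums the products. As a minimal sketch only: the congener names, concentrations, and TEF values below are illustrative placeholders, not the WHO TEFs or this study's data.

```python
# Sketch of a toxic equivalency (TEQ) calculation for dioxin-like compounds:
# TEQ = sum over congeners of (concentration_i * TEF_i).
# All congener names and numeric values here are hypothetical.

def toxic_equivalency(concentrations, tefs):
    """Return the TEQ: sum of concentration * TEF over congeners with a TEF."""
    return sum(concentrations[c] * tefs[c] for c in concentrations if c in tefs)

# Made-up concentrations (pg/g lipid) and TEF weights
conc = {"PCB-126": 10.0, "PCB-169": 2.0}
tef = {"PCB-126": 0.1, "PCB-169": 0.03}
teq = toxic_equivalency(conc, tef)  # 10*0.1 + 2*0.03 = 1.06
```

        Congeners without an assigned TEF simply drop out of the sum, which mirrors how TEQs are restricted to dioxin-like compounds.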

      6. Effect of body composition and energy expenditure on permethrin biomarker concentrations among US Army National Guard members
        Scarpaci MM, Haven CC, Maule AL, Heaton KJ, Taylor KM, Rood J, Ospina M, Calafat AM, Proctor SP.
        J Occup Environ Med. 2019 Dec 31.
        OBJECTIVE: To examine the effects of percent body fat (%BF) and total energy expenditure (TEE) on permethrin exposure among Army National Guard (ARNG) Soldiers wearing permethrin-treated uniforms. METHODS: ARNG members (n = 47) participated in a nine-day study. Repeated body composition (height, weight, %BF) measurements and daily urine samples, analyzed for permethrin and N,N-diethyl-meta-toluamide (DEET) metabolites, were collected. TEE was determined via doubly labeled water protocol. Linear mixed and regression models were used for analyses. RESULTS: Neither %BF nor TEE was significantly associated with permethrin or DEET biomarkers. However, a significant interaction effect (F = 10.76; p = 0.0027) between laundering history and %BF was observed; 10% higher %BF was significantly associated with 25% higher permethrin biomarker concentrations among those wearing uniforms washed ≤25 (compared with >25) times. CONCLUSIONS: Uniform laundering history significantly affects the association between %BF and exposure from permethrin-treated uniforms.

      7. Methyl tertiary-butyl ether exposure from gasoline in the U.S. population, NHANES 2001-2012
        Silva LK, Espenship MF, Pine BN, Ashley DL, De Jesus VR, Blount BC.
        Environ Health Perspect. 2019 Dec;127(12):127003.
        BACKGROUND: Methyl tertiary-butyl ether (MTBE) was used as a gasoline additive in the United States during 1995-2006. Because of concerns about potential exposure and health effects, some U.S. states began banning MTBE use in 2002, leading to a nationwide phaseout in 2006. OBJECTIVES: We investigated the change in blood MTBE that occurred during the years in which MTBE was being phased out of gasoline. METHODS: We used data from the National Health and Nutrition Examination Survey (NHANES) from 2001-2012 to assess the change in blood MTBE over this period. We fit sample-weighted multivariate linear regression models to 12,597 human blood MTBE concentrations from the NHANES 2001-2002 to 2011-2012 survey cycles. RESULTS: The unweighted proportion of individuals with MTBE blood levels above the limit of detection (LOD) of 1.4 ng/L was 93.9% for 2001-2002. This proportion dropped to 25.4% for 2011-2012. Weighted blood MTBE median levels (ng/L) (25th and 75th percentiles) decreased from 25.8 (6.08, 68.1) ng/L for the period from 2001-2002 to 4.57 (1.44, 19.1) ng/L for the period from 2005-2006. For the entire postban period (2007-2012), MTBE median levels were below the detection limit of 1.4 ng/L. DISCUSSION: These decreases in blood MTBE coincided with multiple statewide bans that began in 2002 and a nationwide ban in 2006. The multivariate log-linear regression model for the NHANES 2003-2004 data showed significantly higher blood MTBE concentrations in the group who pumped gasoline less than 7 h before questionnaire administration compared to those who pumped gasoline more than 12 h before questionnaire administration (p=0.032). This study is the first large-scale, national-level confirmation of a substantial decrease in blood MTBE levels in the general population following the phaseout of the use of MTBE as a fuel additive. https://doi.org/10.1289/EHP5572.

      8. Particle and organic vapor emissions from children's 3-D pen and 3-D printer toys
        Yi J, Duling MG, Bowers LN, Knepp AK, LeBouf RF, Nurkiewicz TR, Ranpara A, Luxton T, Martin SB, Burns DA, Peloquin DM, Baumann EJ, Virji MA, Stefaniak AB.
        Inhal Toxicol. 2019 Dec 24:1-14.
        Objective: Fused filament fabrication "3-dimensional (3-D)" printing has expanded beyond the workplace to 3-D printers and pens for use by children as toys to create objects. Materials and methods: Emissions from two brands of toy 3-D pens and one brand of toy 3-D printer were characterized in a 0.6 m³ chamber (particle number, size, elemental composition; concentrations of individual and total volatile organic compounds (TVOC)). The effects of print parameters on these emission metrics were evaluated using mixed-effects models. Emissions data were used to model particle lung deposition and TVOC exposure potential. Results: Geometric mean particle yields (10⁶-10¹⁰ particles/g printed) and sizes (30-300 nm) and TVOC yields (<detectable to 590 µg TVOC/g printed) for the toys were similar to those from 3-D printers used in workplaces. Metal emissions included manganese (1.6-92.3 ng/g printed) and lead (0.13-1.2 ng/g printed). Among toys, extruder nozzle conditions (diameter, temperature) and filament (type, color, and extrusion speed) significantly influenced particle and TVOC emissions. Dose modeling indicated that emitted particles would deposit in the lung alveoli of children. Exposure modeling indicated that TVOC concentration from use of a single toy would be 1-31 µg/m³ in a classroom and 3-154 µg/m³ in a residential living room. Discussion: Potential exists for inhalation of organic vapors and metal-containing particles during use of these toys. Conclusions: If deemed appropriate, e.g. where multiple toys are used in a poorly ventilated area or a toy is positioned near a child's breathing zone, control technologies should be implemented to reduce emissions and exposure risk.

    • Food Safety
      1. Shiga toxin-producing E. coli infections associated with romaine lettuce - United States, 2018
        Bottichio L, Keaton A, Thomas D, Fulton T, Tiffany A, Frick A, Mattioli M, Kahler A, Murphy J, Otto M, Tesfai A, Fields A, Kline K, Fiddner J, Higa J, Barnes A, Arroyo F, Salvatierra A, Holland A, Taylor W, Nash J, Morawski BM, Correll S, Hinnenkamp R, Havens J, Patel K, Schroeder MN, Gladney L, Martin H, Whitlock L, Dowell N, Newhart C, Watkins LF, Hill V, Lance S, Harris S, Wise M, Williams I, Basler C, Gieraltowski L.
        Clin Infect Dis. 2019 Dec 9.
        BACKGROUND: Produce-associated outbreaks of Shiga toxin-producing Escherichia coli (STEC) were first identified in 1991. In April 2018, New Jersey and Pennsylvania officials reported a cluster of STEC O157 infections associated with multiple locations of a restaurant chain. CDC queried PulseNet, the national laboratory network for foodborne disease surveillance, for additional cases and began a national investigation. METHODS: A case was defined as an infection, between March 13 and August 22, 2018, with one of the 22 identified outbreak-associated E. coli O157:H7 or E. coli O61 pulsed-field gel electrophoresis pattern combinations, or with a strain of STEC O157 that was closely related to the main outbreak strain by whole genome sequencing. We conducted epidemiologic and traceback investigations to identify illness sub-clusters and common sources. An FDA-led environmental assessment, which tested water, soil, manure, compost, and scat samples, was conducted to evaluate potential sources of STEC contamination. RESULTS: We identified 240 case-patients from 37 states; 104 were hospitalized, 28 developed hemolytic uremic syndrome, and five died. Of 179 people who were interviewed, 152 (85%) reported consuming romaine lettuce in the week before illness onset. Twenty sub-clusters were identified. Product traceback from sub-cluster restaurants identified numerous romaine lettuce distributors and growers; all lettuce originated from the Yuma growing region. Water samples collected from an irrigation canal in the region yielded the outbreak strain of STEC O157. CONCLUSION: We report on the largest multistate leafy green-linked STEC O157 outbreak in several decades. The investigation highlights the complexities associated with investigating outbreaks involving widespread environmental contamination.

    • Genetics and Genomics
      1. Pathogen genomics in public health
        Armstrong GL, MacCannell DR, Taylor J, Carleton HA, Neuhaus EB, Bradbury RS, Posey JE, Gwinn M.
        N Engl J Med. 2019 Dec 26;381(26):2569-2580.
        Rapid advances in DNA sequencing technology ("next-generation sequencing") have inspired optimism about the potential of human genomics for "precision medicine." Meanwhile, pathogen genomics is already delivering "precision public health" through more effective investigations of outbreaks of foodborne illnesses, better-targeted tuberculosis control, and more timely and granular influenza surveillance to inform the selection of vaccine strains. In this article, we describe how public health agencies have been adopting pathogen genomics to improve their effectiveness in almost all domains of infectious disease. This momentum is likely to continue, given the ongoing development in sequencing and sequencing-related technologies.

      2. Implications of mobile genetic elements for Salmonella enterica single-nucleotide polymorphism subtyping and source tracking investigations
        Li S, Zhang S, Baert L, Jagadeesan B, Ngom-Bru C, Griswold T, Katz LS, Carleton HA, Deng X.
        Appl Environ Microbiol. 2019 Dec 15;85(24).
        Single-nucleotide polymorphisms (SNPs) are widely used for whole-genome sequencing (WGS)-based subtyping of foodborne pathogens in outbreak and source tracking investigations. Mobile genetic elements (MGEs) are commonly present in bacterial genomes and may affect SNP subtyping results if their evolutionary history and dynamics differ from that of the bacterial chromosomes. Using Salmonella enterica as a model organism, we surveyed major categories of MGEs, including plasmids, phages, insertion sequences, integrons, and integrative and conjugative elements (ICEs), in 990 genomes representing 21 major serotypes of S. enterica. We evaluated whether plasmids and chromosomal MGEs affect SNP subtyping with 9 outbreak clusters of different serotypes found in the United States in 2018. The median total length of chromosomal MGEs accounted for 2.5% of a typical S. enterica chromosome. Of the 990 analyzed S. enterica isolates, 68.9% contained at least one assembled plasmid sequence. The median total length of assembled plasmids in these isolates was 93,671 bp. Plasmids that carry high densities of SNPs were found to substantially affect both SNP phylogenies and SNP distances among closely related isolates if they were present in the reference genome for SNP subtyping. In comparison, chromosomal MGEs were found to have limited impact on SNP subtyping. We recommend the identification of plasmid sequences in the reference genome and the exclusion of plasmid-borne SNPs from SNP subtyping analysis. IMPORTANCE: Despite increasingly routine use of WGS and SNP subtyping in outbreak and source tracking investigations, whether and how MGEs affect SNP subtyping has not been thoroughly investigated. Besides chromosomal MGEs, plasmids are frequently entangled in draft genome assemblies and have yet to be assessed for their impact on SNP subtyping. 
This study provides evidence-based guidance on the treatment of MGEs in SNP analysis for Salmonella to infer phylogenetic relationship and SNP distance between isolates.
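        The recommendation above (identify plasmid sequences in the reference genome and exclude plasmid-borne SNPs before computing SNP distances) can be sketched as follows. This is a hypothetical illustration, not the authors' pipeline; the contig names, coordinates, and alleles are made up.

```python
# Sketch: drop SNP sites that fall on plasmid contigs of the reference
# genome, then compute pairwise SNP distances on chromosomal sites only.
# Contig names, positions, and alleles below are hypothetical.

PLASMID_CONTIGS = {"plasmid_pA"}  # contigs identified as plasmid in the reference

def chromosomal_sites(calls):
    """Drop allele calls at sites located on plasmid contigs."""
    return {site: base for site, base in calls.items()
            if site[0] not in PLASMID_CONTIGS}

def snp_distance(calls_a, calls_b):
    """Count shared sites at which two isolates carry different alleles."""
    shared = calls_a.keys() & calls_b.keys()
    return sum(calls_a[s] != calls_b[s] for s in shared)

# Two isolates; each site is (contig, position) -> allele
iso1 = {("chrom", 101): "A", ("chrom", 250): "G", ("plasmid_pA", 12): "T"}
iso2 = {("chrom", 101): "C", ("chrom", 250): "G", ("plasmid_pA", 12): "C"}

print(snp_distance(iso1, iso2))  # 2: the plasmid-borne SNP inflates the distance
print(snp_distance(chromosomal_sites(iso1), chromosomal_sites(iso2)))  # 1
```

        Filtering by reference contig, as here, is one simple way to realize the paper's advice; real pipelines would also handle missing and low-quality calls.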


    • Health Communication and Education
      1. Awareness of "The Real Cost" campaign among US middle and high school students: National Youth Tobacco Survey, 2017
        Delahanty J, Ganz O, Bernat JK, Trigger S, Smith A, Lavinghouze R, Rao P.
        Public Health Rep. 2020 Jan;135(1):82-89.
        OBJECTIVES: Monitoring awareness of a public education campaign can help to better understand the extent of sustained population-level exposure to the campaign. We examined unaided awareness (awareness that does not include a visual image to remind the respondent of the campaign or advertisement) and correlates of unaided awareness of "The Real Cost," a national youth tobacco education campaign developed by the US Food and Drug Administration and implemented in 2014. METHODS: This secondary analysis examined unaided campaign awareness by using data from the 2017 National Youth Tobacco Survey, a nationally representative school-based sample of young persons aged 9-19 years (n = 17 269) surveyed approximately 3 years after campaign launch. We compared unaided campaign awareness among various cigarette user groups (experimenters, susceptible nonsmokers, current or former smokers, and nonsusceptible nonsmokers). We examined associations between unaided campaign awareness and demographic and tobacco-related correlates, overall and by cigarette user group. RESULTS: Three years after "The Real Cost" campaign was launched, most middle and high school students (58.5%) still reported unaided campaign awareness. Of 17 269 middle and high school students in the sample, 62.0% of susceptible nonsmokers and 64.5% of experimenters reported unaided campaign awareness. Among susceptible nonsmokers, unaided campaign awareness differed by age and race/ethnicity and was higher among students with greater tobacco-related harm perceptions (vs lower harm perceptions) and exposure to pro-tobacco marketing (vs no exposure). CONCLUSIONS: Future surveillance and research could examine awareness of "The Real Cost" campaign and effects of the campaign on young persons' knowledge, attitudes, and beliefs to further assess the public health impact of tobacco prevention campaigns.

    • Health Disparities
      1. Unexpected race and ethnicity differences in the US National Veterans Affairs Kidney Transplant Program
        Myaskovsky L, Kendall K, Li X, Chang CH, Pleis JR, Croswell E, Ford CG, Switzer GE, Langone A, Mittal-Henkle A, Saha S, Thomas CP, Adams Flohr J, Ramkumar M, Dew MA.
        Transplantation. 2019 Dec;103(12):2701-2714.
        BACKGROUND: Racial/ethnic minorities have lower rates of deceased donor kidney transplantation (DDKT) and living donor kidney transplantation (LDKT) in the United States. We examined whether social determinants of health (eg, demographics, cultural, psychosocial, knowledge factors) could account for differences in the Veterans Affairs (VA) Kidney Transplantation (KT) Program. METHODS: We conducted a multicenter longitudinal cohort study of 611 Veterans undergoing evaluation for KT at all National VA KT Centers (2010-2012) using an interview after KT evaluation and tracking participants via medical records through 2017. RESULTS: Hispanics were more likely to get any KT (subdistribution hazard ratios [SHR] [95% confidence interval (CI)]: 1.8 [1.2-2.8]) or DDKT (SHR [95% CI]: 2.0 [1.3-3.2]) than non-Hispanic white Veterans in univariable analysis. Social determinants of health, including marital status (SHR [95% CI]: 0.6 [0.4-0.9]), religious objection to LDKT (SHR [95% CI]: 0.6 [0.4-1.0]), and donor preference (SHR [95% CI]: 2.5 [1.2-5.1]), accounted for some racial differences, and changes to Kidney Allocation System policy (SHR [95% CI]: 0.3 [0.2-0.5]) mitigated race differences in DDKT in multivariable analysis. For LDKT, non-Hispanic African American Veterans were less likely to receive an LDKT than non-Hispanic white Veterans (SHR [95% CI]: 0.2 [0.0-0.7]), but accounting for age (SHR [95% CI]: 1.0 [0.9-1.0]), insurance (SHR [95% CI]: 5.9 [1.1-33.7]), presenting with a living donor (SHR [95% CI]: 4.1 [1.4-12.3]), dialysis duration (SHR [95% CI]: 0.3 [0.2-0.6]), network of potential donors (SHR [95% CI]: 1.0 [1.0-1.1]), self-esteem (SHR [95% CI]: 0.4 [0.2-0.8]), transplant knowledge (SHR [95% CI]: 1.3 [1.0-1.7]), and changes to Kidney Allocation System policy (SHR [95% CI]: 10.3 [2.5-42.1]) in multivariable analysis eliminated those disparities. CONCLUSIONS: The VA KT Program does not exhibit the same pattern of disparities in KT receipt as non-VA centers. 
Transplant centers can use identified risk factors to target patients who may need more support to ensure they receive a transplant.

      2. County-level social capital and bacterial sexually transmitted infections in the United States
        Owusu-Edusei K, McClendon-Weary B, Bull L, Gift TL, Aral SO.
        Sex Transm Dis. 2019 Dec 13.
        BACKGROUND: Research on the association between county-level social capital indices (SCIs) and the three most commonly reported sexually transmitted infections (STIs) in the United States is lacking. In this study, we determined and examined the association between two recently developed county-level SCIs (i.e., Penn State social capital index [PSSCI] vs. United States Congress social capital index [USCSCI]) and the three most commonly reported bacterial STIs (chlamydia, gonorrhea and syphilis) using spatial and non-spatial regression techniques. METHODS: We assembled and analyzed multi-year (2012-2016) cross-sectional data on STIs and two SCIs (PSSCI vs. USCSCI) on counties in all 48 contiguous states. We explored two non-spatial regression models (univariate and multiple generalized linear models) and three spatial regression models (spatial lag model, spatial error model and the spatial autoregressive moving average model) for comparison. RESULTS: Without exception, all the SCIs were negatively associated with morbidity for all three STIs. A one-unit increase in the SCIs was associated with at least a 9% (p<0.001) decrease in each STI. Our test of the magnitude of the estimated associations indicated that the estimates for the USCSCI were at least twice those for the PSSCI for all STIs (highest p-value=0.01). CONCLUSIONS: Overall, our results highlight the potential benefits of applying/incorporating social capital concepts to STI control and prevention efforts. In addition, our results suggest that for the purpose of planning, designing and implementing effective STI control and prevention interventions/programs, understanding the communities' associational life (as indicated by the factors/data used to develop the USCSCI) may be important.

    • Health Economics
      1. Nonadherence to any prescribed medication due to costs among adults with HIV infection - United States, 2016-2017
        Beer L, Tie Y, Weiser J, Shouse RL.
        MMWR Morb Mortal Wkly Rep. 2019 Dec 13;68(49):1129-1133.
        The United States spends more per capita on prescription drugs than do other high-income countries (1). In 2017, patients paid 14% of this cost out of pocket (2). Prescription drug cost-saving strategies, including nonadherence to medications due to cost concerns, have been documented among U.S. adults (3) and can negatively affect morbidity and, in the case of persons with human immunodeficiency virus (HIV) infection, can increase transmission risk (4,5). However, population-based data on prescription drug cost-saving strategies among U.S. persons with HIV are lacking. CDC's Medical Monitoring Project* analyzed cross-sectional, nationally representative, surveillance data on behaviors, medical care, and clinical outcomes among adults with HIV infection. During 2016-2017, 14% of persons with HIV infection used a prescription drug cost-saving strategy for any prescribed medication, and 7% had cost saving-related nonadherence. Nonadherence due to prescription drug costs was associated with reporting an unmet need for medications from the Ryan White AIDS Drug Assistance Program (ADAP), not having Medicaid coverage, and having private insurance. Persons who were nonadherent because of cost concerns were more likely to have visited an emergency department, have been hospitalized, and not be virally suppressed. Reducing barriers to ADAP and Medicaid coverage, in addition to reducing medication costs for persons with private insurance, might help to decrease nonadherence due to cost concerns and thus contribute to improved viral suppression rates and other health outcomes among persons with HIV infection.

      2. Sound economic evaluations of health interventions provide valuable information for justifying resource allocation decisions, planning for implementation, and enhancing the sustainability of the interventions. However, the quality of intervention cost estimates is seldom addressed in the literature. Reliable cost data form the foundation of economic evaluations, and without reliable estimates, evaluation results, such as cost-effectiveness measures, could be misleading. A common method to derive both fixed and variable costs of an intervention involves collecting data from the bottom up for each resource consumed (micro-costing). In this project, we identified data collection tools often used to obtain reliable data for estimating costs of interventions that prevent and manage chronic conditions and considered practical applications to promote their use. We scanned economic evaluation literature published in 2008-2018 and identified micro-costing data collection tools used. We categorized the identified tools and discuss their practical applications in an example study of health interventions, including their potential strengths and weaknesses. Micro-costing data collection tools often used in the literature include standardized comprehensive templates, targeted questionnaires, activity logs, on-site administrative databases, and direct observation. These tools are not mutually exclusive and are often used in combination. Each tool has unique merits and limitations, and some may be more applicable than others under different circumstances. Proper application of micro-costing tools can produce quality cost estimates and enhance the usefulness of economic evaluations to inform resource allocation decisions.

      3. Assessing costs of a hypertension management program: An application of the HEARTS costing tool in a program planning workshop in Thailand
        Husain MJ, Allaire BT, Hutchinson B, Ketgudee L, Srisuthisak S, Yueayai K, Pisitpayat N, Nugent R, Datta BK, Joseph KT, Kostova D.
        J Clin Hypertens (Greenwich). 2019 Dec 24.
        The HEARTS technical package, a part of the Global Hearts Initiative to improve cardiovascular health globally, is a strategic approach for cardiovascular disease prevention and control at the primary care level. To support the evaluation of costs associated with HEARTS program components, a costing tool was developed to evaluate the incremental cost of program implementation. This report documents an application of the HEARTS costing tool during a costing workshop prior to the initiation of a HEARTS pilot program in Thailand's Phothong District, 2019-2020. During the workshop, a mock exercise was conducted to estimate the expected costs of the pilot study. The workshop application of the tool underscored its applicability to the HEARTS program planning process by identifying cost drivers associated with individual program elements. It further illustrated that by supporting disaggregation of costs into fixed and variable categories, the tool can inform the scalability of pilot projects to larger populations. Lessons learned during the initial development and application of the costing tool can inform future HEARTS evaluation efforts.

      4. Effectiveness and cost-effectiveness of human papillomavirus vaccination through age 45 years in the United States
        Laprise JF, Chesson HW, Markowitz LE, Drolet M, Martin D, Benard E, Brisson M.
        Ann Intern Med. 2019 Dec 10.
        Background: In the United States, the routine age for human papillomavirus (HPV) vaccination is 11 to 12 years, with catch-up vaccination through age 26 years for women and 21 years for men. U.S. vaccination policy on use of the 9-valent HPV vaccine in adult women and men is being reviewed. Objective: To evaluate the added population-level effectiveness and cost-effectiveness of extending the current U.S. HPV vaccination program to women aged 27 to 45 years and men aged 22 to 45 years. Design: The analysis used HPV-ADVISE (Agent-based Dynamic model for VaccInation and Screening Evaluation), an individual-based transmission dynamic model of HPV infection and associated diseases, calibrated to age-specific U.S. data. Data Sources: Published data. Target Population: Women aged 27 to 45 years and men aged 22 to 45 years in the United States. Time Horizon: 100 years. Perspective: Health care sector. Intervention: 9-valent HPV vaccination. Outcome Measures: HPV-associated outcomes prevented and cost-effectiveness ratios. Results of Base-Case Analysis: The model predicts that the current U.S. HPV vaccination program will reduce the number of diagnoses of anogenital warts and cervical intraepithelial neoplasia of grade 2 or 3 and cases of cervical cancer and noncervical HPV-associated cancer by 82%, 80%, 59%, and 39%, respectively, over 100 years and is cost saving (vs. no vaccination). In contrast, extending vaccination to women and men aged 45 years is predicted to reduce these outcomes by an additional 0.4, 0.4, 0.2, and 0.2 percentage points, respectively. Vaccinating women and men up to age 30, 40, and 45 years is predicted to cost $830 000, $1 843 000, and $1 471 000, respectively, per quality-adjusted life-year gained (vs. current vaccination). Results of Sensitivity Analysis: Results were most sensitive to assumptions about natural immunity and progression rates after infection, historical vaccination coverage, and vaccine efficacy. 
Limitation: Uncertainty about the proportion of HPV-associated disease due to infections after age 26 years and about the level of herd effects from the current HPV vaccination program. Conclusion: The current HPV vaccination program is predicted to be cost saving. Extending vaccination to older ages is predicted to produce small additional health benefits and result in substantially higher incremental cost-effectiveness ratios than the current recommendation. Primary Funding Source: Centers for Disease Control and Prevention.
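        The "cost per quality-adjusted life-year gained" figures reported above are incremental cost-effectiveness ratios (ICERs): the extra cost of a strategy divided by the extra QALYs it yields relative to a comparator. A minimal sketch with hypothetical numbers, not the model's actual inputs or outputs:

```python
# Incremental cost-effectiveness ratio (ICER): incremental cost per
# incremental QALY of a new strategy versus a comparator strategy.
# All dollar and QALY figures below are made up for illustration.

def icer(cost_new, cost_old, qalys_new, qalys_old):
    """Return (cost_new - cost_old) / (qalys_new - qalys_old)."""
    return (cost_new - cost_old) / (qalys_new - qalys_old)

# A strategy costing $1.5 million more that gains 1.2 QALYs
ratio = icer(2_500_000, 1_000_000, 101.2, 100.0)  # about $1.25 million per QALY
```

        A strategy that costs less and gains QALYs (negative numerator, positive denominator) is "cost saving," the term the abstract uses for the current program versus no vaccination.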

      5. OBJECTIVE: To estimate the average medical care cost of fatal and non-fatal injuries in the USA comprehensively by injury type. METHODS: The attributable cost of injuries was estimated by mechanism (eg, fall), intent (eg, unintentional), body region (eg, head and neck) and nature of injury (eg, fracture) among patients injured from 1 October 2014 to 30 September 2015. The cost of fatal injuries was the multivariable regression-adjusted average among patients who died in hospital emergency departments (EDs) or inpatient settings as reported in the Healthcare Cost and Utilization Project Nationwide Emergency Department Sample and National Inpatient Sample, controlling for demographic (eg, age), clinical (eg, comorbidities) and health insurance (eg, Medicaid) factors. The 1-year attributable cost of non-fatal injuries was assessed among patients with ED-treated injuries using MarketScan medical claims data. Multivariable regression models compared total medical payments (inpatient, outpatient, drugs) among non-fatal injury patients versus matched controls during the year following injury patients' ED visit, controlling for demographic, clinical and insurance factors. All costs are 2015 US dollars. RESULTS: The average medical cost of all fatal injuries was approximately $6880 and $41 570 per ED-based and hospital-based patient, respectively (range by injury type: $4764-$10 289 and $31 912-$95 295). The average attributable 1-year cost of all non-fatal injuries per person initially treated in an ED was approximately $6620 (range by injury type: $1698-$80 172). CONCLUSIONS AND RELEVANCE: Injuries are costly and preventable. Accurate estimates of attributable medical care costs are important to monitor the economic burden of injuries and help to prioritise cost-effective public health prevention activities.

      6. The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention. BACKGROUND: Continued indirect effects provided by the childhood pneumococcal conjugate vaccine (13-valent pneumococcal conjugate vaccine [PCV13]) program in the United States have decreased disease in the adult population, reducing the potential direct effects of vaccinating older adults. OBJECTIVE: We examined the incremental cost-effectiveness of continuing to recommend PCV13 in series with 23-valent pneumococcal polysaccharide vaccine (PPSV23) at age 65 compared to a strategy that only included a recommendation for PPSV23 at age 65. METHODS: We used a probabilistic model following a cohort of 65 year olds in 2019. We used vaccination coverage and disease incidence estimates for healthy adults and adults with chronic medical conditions. We incorporated continued indirect effects from the childhood PCV13 program on adult disease incidence. RESULTS: In the base case scenario, continuing to recommend PCV13 at age 65 cost $561,682 per quality-adjusted life year (QALY) gained. In a scenario where PPSV23 provided modest protection against non-invasive pneumococcal pneumonia, costs increased to $2.3 million per QALY. These estimates are larger than our prior estimates for cost-effectiveness of this recommendation in the context of predicted indirect effects due to new data indicating PCV13 provided limited impact on serotype 3, the major cause of the remaining PCV13-type disease. Under our prior assumptions about PCV13 effectiveness against serotype 3 disease, the cost of continuing the recommendation is $207,607 per QALY. CONCLUSION: Indirect effects from the childhood PCV13 program have dramatically increased the cost per QALY of continuing to recommend PCV13 at age 65 after only a few years.

      7. Cost-effectiveness of reflex laboratory-based cryptococcal antigen screening for the prevention and treatment of cryptococcal meningitis in Botswanaexternal icon
        Tenforde MW, Muthoga C, Callaghan A, Ponetshego P, Ngidi J, Mine M, Jordan A, Chiller T, Larson BA, Jarvis JN.
        Wellcome Open Res. 2019;4:144.
        Background: Cryptococcal antigen (CrAg) screening for antiretroviral therapy (ART)-naive adults with advanced HIV/AIDS can reduce the incidence of cryptococcal meningitis (CM) and all-cause mortality. We modeled the cost-effectiveness of laboratory-based "reflex" CrAg screening for ART-naive CrAg-positive patients with CD4<100 cells/microL (those currently targeted in guidelines) and ART-experienced CrAg-positive patients with CD4<100 cells/microL (who make up an increasingly large proportion of individuals with advanced HIV/AIDS). Methods: A decision analytic model was developed to evaluate CrAg screening and treatment based on local CD4 count and CrAg prevalence data, and realistic assumptions regarding programmatic implementation of the CrAg screening intervention. We modeled the number of CrAg tests performed, the number of CrAg positives stratified by prior ART experience, the proportion of patients started on pre-emptive antifungal treatment, and the number of incident CM cases and CM-related deaths. Screening and treatment costs were evaluated, and cost per death or disability-adjusted life year (DALY) averted estimated. Results: We estimated that of 650,000 samples undergoing CD4 testing annually in Botswana, 16,364 would have a CD4<100 cells/microL and receive a CrAg test, with 70% of patients ART-experienced at the time of screening. Under base model assumptions, CrAg screening and pre-emptive treatment restricted to ART-naive patients with a CD4<100 cells/microL prevented 20% (39/196) of CM-related deaths in patients undergoing CD4 testing at a cost of US$2 per DALY averted. Expansion of preemptive treatment to include ART-experienced patients with a CD4<100 cells/microL resulted in 55 additional deaths averted (a total of 48% [94/196]) and was cost-saving compared to no screening. Findings were robust across a range of model assumptions. 
        Conclusions: Reflex laboratory-based CrAg screening for patients with CD4<100 cells/microL is a cost-effective strategy in Botswana, even in the context of a relatively low proportion of advanced HIV/AIDS in the overall HIV-infected population, the majority of whom are ART-experienced.

      8. Assessment of the cost effectiveness of a brief video intervention for sexually transmitted disease preventionexternal icon
        Williams AM, Gift TL, O'Donnell LN, Rietmeijer CA, Malotte CK, Margolis AD, Warner L.
        Sex Transm Dis. 2019 Dec 12.
        BACKGROUND: Cost-effective, scalable interventions are needed to address high rates of sexually transmitted diseases (STDs) in the United States. Safe in the City, a 23-minute video intervention designed for STD clinic waiting rooms, effectively reduced new infections among STD clinic clients. A cost-effectiveness analysis of this type of intervention could inform whether it should be replicated. METHODS: The cost-effectiveness of a brief video intervention was calculated under a baseline scenario in which this type of intervention was expanded to a larger patient population. Alternative scenarios included expanding the intervention over a longer period of time or to more clinics, including HIV prevention benefits, and operating the intervention part time. Program costs, net costs per STD case averted, and the discounted net cost of the intervention were calculated from a health sector perspective across the scenarios. Monte Carlo simulations were used to calculate 95 percent confidence intervals surrounding the cost-effectiveness measures. RESULTS: The net cost per case averted was $75 in the baseline scenario. The net cost of the intervention was $108,015, and most of the alternative scenarios found that the intervention was cost-saving compared to usual care. CONCLUSIONS: Single-session, video-based interventions can be highly cost-effective when implemented at scale. Updated video-based interventions that account for the changing STD landscape in the United States could play an important role in addressing the recent increases in infections.
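
The "net cost per case averted" metric used above is the program's cost minus the treatment costs it averts, divided by the number of cases averted; a negative value means the program is cost-saving. A minimal sketch with hypothetical numbers, not the study's actual inputs:

```python
def net_cost_per_case_averted(program_cost: float,
                              averted_treatment_costs: float,
                              cases_averted: float) -> float:
    """Net cost (program cost minus medical costs averted) per case
    averted; a negative result means the intervention saves money."""
    return (program_cost - averted_treatment_costs) / cases_averted

# Hypothetical: a $500,000 program averting $350,000 in treatment
# costs and 2,000 infections nets $75 per case averted.
example = net_cost_per_case_averted(500_000, 350_000, 2_000)
```

This framing explains how the alternative scenarios in the abstract can flip to "cost-saving": once averted treatment costs exceed program costs, the net cost goes negative.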

    • Healthcare Associated Infections
      1. Prioritizing prevention to combat multidrug resistance in nursing homes: A call to actionexternal icon
        Jacobs Slifka KM, Kabbani S, Stone ND.
        J Am Med Dir Assoc. 2020 Jan;21(1):5-7.

      2. Notes from the Field: Hospital water contamination associated with a pseudo-outbreak of Mycobacterium porcinum - Wisconsin, 2016-2018external icon
        Kloth H, Elbadawi LI, Bateman A, Louison L, Shrivastwa N.
        MMWR Morb Mortal Wkly Rep. 2019 Dec 13;68(49):1149.

      3. Bloodstream infections with a novel nontuberculous mycobacterium involving 52 outpatient oncology clinic patients - Arkansas, 2018external icon
        Labuda SM, Garner K, Cima M, Moulton-Meissner H, Laufer Halpin A, Charles-Toney N, Yu P, Bolton E, Pierce R, Crist MB, Gomes D, Gable P, McAllister G, Lawsin A, Houston H, Patil N, Wheeler JG, Bradsher R, Vyas K, Haselow D.
        Clin Infect Dis. 2019 Nov 16.
        BACKGROUND: In July 2018, the Arkansas Department of Health (ADH) was notified by Hospital A of three patients with bloodstream infections (BSIs) with a rapidly growing, nontuberculous Mycobacterium (NTM) species; on September 5, 2018, six additional BSIs were reported. All were among oncology patients at Clinic A. We investigated to identify sources and to prevent further infections. METHODS: ADH performed an onsite investigation at Clinic A on September 7, 2018 and reviewed patient charts, obtained environmental samples, and cultured isolates. Isolates were sequenced (whole genome, 16S, rpoB) by the Centers for Disease Control and Prevention to determine species identity and relatedness. RESULTS: By December 31, 2018, 52 (34%) of 151 oncology patients with chemotherapy ports accessed at Clinic A during March 22-September 12, 2018 had NTM BSIs. Infected patients received significantly more saline flushes than uninfected patients (P <0.001) during the risk period. NTM grew from 6 unused saline flushes compounded by Clinic A. The identified species was novel and designated Mycobacterium FVL 201832. Isolates from patients and saline flushes were highly related by whole-genome sequencing, indicating a common source. Clinic A changed to prefilled saline flushes on September 12 as recommended. CONCLUSIONS: Mycobacterium FVL 201832 caused BSIs in oncology clinic patients. Laboratory data allowed investigators to rapidly link infections to contaminated saline flushes; cooperation between multiple institutions resulted in timely outbreak resolution. New state policies being considered because of this outbreak include adding extrapulmonary NTM to ADH's reportable disease list and providing more oversight to outpatient oncology clinics.

      4. Reinforcement of an infection control bundle targeting prevention practices for Clostridioides difficile in Veterans Health Administration nursing homesexternal icon
        Mayer J, Stone ND, Leecaster M, Hu N, Pettey W, Samore M, Pacheco SM, Sambol S, Donskey C, Jencson A, Gupta K, Strymish J, Johnson D, Woods C, Young E, McDonald LC, Gerding D.
        Am J Infect Control. 2019 Dec 4.
        BACKGROUND: Clostridioides difficile infection (CDI) causes significant morbidity in nursing home residents. Our aim was to describe adherence to a bundled CDI prevention initiative, which had previously been deployed nationwide in Veterans Health Administration (VA) long-term care facilities (LTCFs), and to improve compliance with reinforcement. METHODS: A multicenter pre- and post-reinforcement evaluation of the VA bundle, consisting of environmental management, hand hygiene, and contact precautions, was conducted in 6 VA LTCFs. A campaign to reinforce VA bundle components, as well as to promote select antimicrobial stewardship recommendations and contact precautions for 30 days, was employed. Hand hygiene, antimicrobial usage, and environmental contamination, before and after bundle reinforcement, were assessed. RESULTS: All LTCFs reported following the guidelines for cleaning and contact precautions until diarrhea resolution pre-reinforcement. Environmental specimens rarely yielded C difficile pre- or post-reinforcement. Proper hand hygiene across all facilities did not change with reinforcement (pre 52.51%, post 52.18%), nor did antimicrobial use (pre 87-197 vs. post 84-245 antibiotic days per 1,000 resident-days). LTCFs found it challenging to maintain prolonged contact precautions. DISCUSSION: Variation in infection prevention and antimicrobial prescribing practices across LTCFs was identified, and lessons were learned. CONCLUSIONS: Introducing bundled interventions in LTCFs is challenging, given the available resources, and may be more successful with fewer components and more intensive execution with feedback.

      5. Knowledge, attitudes, and practices of pediatric long-term care facility staff regarding infection control for acute respiratory infections and influenza vaccinationexternal icon
        Saiman L, Wilmont S, Hill-Ricciuti A, Jain M, Collins E, Ton A, Neu N, Prill MM, Garg S, Larson E, Stone ND, Gerber SI, Kim L.
        J Pediatric Infect Dis Soc. 2019 Dec 18.
        We surveyed clinical staff and on-site teachers working at pediatric long-term care facilities regarding prevention and control of acute respiratory infections and influenza in staff and residents. We uncovered knowledge gaps, particularly among teachers and clinical staff working <5 years at sites, thereby elucidating areas for targeted staff education.

      6. OBJECTIVES: To summarize patient notifications resulting from unsafe injection practices by health care personnel in the United States and describe recommended actions for prevention and response. PATIENTS AND METHODS: We examined records of events involving communications to groups of patients, conducted from January 1, 2012, through December 31, 2018, in which bloodborne pathogen testing was recommended or offered because of potential exposure to unsafe injection practices by health care personnel in the United States. Information compiled included: health care setting(s), type of unsafe injection practice(s), number of patients notified, number of outbreak-associated infections, and whether evidence suggesting bloodborne pathogen transmission prompted the notification. We compared these numbers with a similar review conducted from January 1, 2001, through December 31, 2011. RESULTS: From 2012 through 2018, more than 66,748 patients were notified as part of 38 patient notification events. Twenty-one involved exposures in non-hospital settings. Twenty-five involved syringe and/or needle reuse in the context of routine patient care; 11 involved drug tampering by a health care provider. The majority of events (n=25) were prompted by identification of unsafe injection practices alone, absent any documented infections at the time of notification. Outbreak-associated hepatitis B virus and/or hepatitis C virus infections were documented for 11 of the events; 8 involved patient-to-patient transmission, and 3 involved provider-to-patient transmission. CONCLUSIONS: Since 2001, nearly 200,000 patients in the United States were notified about potential exposure to blood-contaminated medications or injection equipment. Facility leadership has an obligation to ensure adherence to safe injection practices and to respond properly if unsafe injection practices are identified.

      7. Epidemiology of antibiotic use for urinary tract infection in nursing home residentsexternal icon
        Thompson ND, Penna A, Eure TR, Bamberg WM, Barney G, Barter D, Clogher P, DeSilva MB, Dumyati G, Epson E, Frank L, Godine D, Irizarry L, Kainer MA, Li L, Lynfield R, Mahoehney JP, Nadle J, Ocampo V, Perry L, Ray SM, Davis SS, Sievers M, Wilson LE, Zhang AY, Stone ND, Magill SS.
        J Am Med Dir Assoc. 2019 Dec 7.
        OBJECTIVES: Describe antibiotic use for urinary tract infection (UTI) among a large cohort of US nursing home residents. DESIGN: Analysis of data from a multistate, 1-day point prevalence survey of antimicrobial use performed between April and October 2017. SETTING AND PARTICIPANTS: Residents of 161 nursing homes in 10 US states of the Emerging Infections Program (EIP). METHODS: EIP staff reviewed nursing home medical records to collect data on systemic antimicrobial drugs received by residents, including therapeutic site, rationale for use, and planned duration. For drugs with the therapeutic site documented as urinary tract, pooled mean and nursing home-specific prevalence rates were calculated per 100 nursing home residents, and proportion of drugs by selected characteristics were reported. Data were analyzed in SAS, version 9.4. RESULTS: Among 15,276 residents, 407 received 424 antibiotics for UTI. The pooled mean prevalence rate of antibiotic use for UTI was 2.66 per 100 residents; nursing home-specific rates ranged from 0 to 13.6. One-quarter of antibiotics were prescribed for UTI prophylaxis, with a median planned duration of 111 days compared with 7 days when prescribed for UTI treatment (P < .001). Fluoroquinolones were the most common (18%) drug class used. CONCLUSIONS AND IMPLICATIONS: One in 38 residents was receiving an antibiotic for UTI on a given day, and nursing home-specific prevalence rates varied by more than 10-fold. UTI prophylaxis was common with a long planned duration, despite limited evidence to support this practice among older persons in nursing homes. The planned duration was >/=7 days for half of antibiotics prescribed for treatment of a UTI. Fluoroquinolones were the most commonly used antibiotics, despite their association with significant adverse events, particularly in a frail and older adult population. These findings help to identify priority practices for nursing home antibiotic stewardship.
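
The pooled prevalence figures in this abstract follow directly from the reported counts: 407 of 15,276 residents receiving an antibiotic for UTI yields both the rate per 100 residents and the "one in 38" framing. A quick check of that arithmetic:

```python
residents_on_uti_antibiotics = 407
residents_surveyed = 15_276

# Pooled mean prevalence per 100 nursing home residents.
rate_per_100 = residents_on_uti_antibiotics / residents_surveyed * 100

# Equivalent "one in N residents" framing used in the conclusions.
one_in_n = residents_surveyed / residents_on_uti_antibiotics
```

Both values round to the numbers stated in the abstract (2.66 per 100 residents; roughly one in 38).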

      8. Elution efficiency of healthcare pathogens from environmental sampling toolsexternal icon
        West-Deadwyler RM, Moulton-Meissner HA, Rose LJ, Noble-Wang JA.
        Infect Control Hosp Epidemiol. 2019 Dec 9:1-3.
        Standardizing healthcare surface sampling requires the evaluation of sampling tools for organism adherence. Here, 7 sampling tools were evaluated to assess their elution efficiencies in the presence of 5 pathogens. Foam sponges (80.6%), microfiber wipes (80.5%), foam swabs (77.9%), and cellulose sponges (66.5%) yielded the highest median elution efficiencies.

      9. Laboratory analysis of an outbreak of Candida auris in New York from 2016 to 2018 - impact and lessons learnedexternal icon
        Zhu Y, O'Brien B, Leach L, Clark A, Bates M, Adams E, Ostrowsky B, Quinn M, Dufort E, Southwick K, Erazo R, Haley VB, Bucher C, Chaturvedi V, Limberger RJ, Blog D, Lutterloh E, Chaturvedi S.
        J Clin Microbiol. 2019 Dec 18.
        Candida auris is a multidrug-resistant yeast that has emerged in healthcare facilities worldwide; however, little is known about identification methods, patient colonization, environmental survival, spread, and drug resistance. Colonization of both biotic (patients) and abiotic (healthcare objects) surfaces and travel appear to be the major factors in the spread of this pathogen across the globe. In this investigation, we present laboratory findings from an ongoing C. auris outbreak in New York (NY) from August 2016 through 2018. A total of 540 clinical isolates, 11,035 patient surveillance specimens, and 3,672 environmental surveillance samples were analyzed. Laboratory methods included matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) for yeast isolate identification, real-time PCR for rapid surveillance sample screening, culture on selective/non-selective media for recovery of C. auris and other yeasts from surveillance samples, antifungal susceptibility testing to determine the C. auris resistance profile, and Sanger sequencing of the internal transcribed spacer (ITS) and D1/D2 regions of the ribosomal gene for C. auris genotyping. Results included: a) identification and confirmation of C. auris in 413 clinical isolates and 931 patient surveillance isolates, as well as identification of 277 clinical cases and 350 colonized cases from 151 healthcare facilities including 59 hospitals, 92 nursing homes, 1 long-term acute care hospital (LTACH), and 2 hospices, b) successful utilization of an in-house developed C. auris real-time PCR assay for the rapid screening of patient and environmental surveillance samples, c) demonstration of relatively heavier colonization of C. auris in nares compared to the axilla/groin, and d) predominance of the South Asia clade I with intrinsic resistance to fluconazole and elevated minimum inhibitory concentration (MIC) to voriconazole (81%), amphotericin B (61%), 5-FC (3%) and echinocandins (1%). These findings reflect greater regional prevalence and incidence of C. auris and the deployment of better detection tools in an unprecedented outbreak.

    • Immunity and Immunization
      1. BACKGROUND: Despite the success of rotavirus vaccines over the last decade, rotavirus remains a leading cause of severe diarrheal disease among young children. Further progress in reducing the burden of disease is inhibited, in part, by vaccine underperformance in certain settings. Early trials suggested that oral poliovirus vaccine (OPV), when administered concomitantly with rotavirus vaccine, reduces rotavirus seroconversion rates after the first rotavirus dose with modest or nonsignificant interference after completion of the full rotavirus vaccine course. Our study aimed to identify a range of individual-level characteristics, including concomitant receipt of OPV, that affect rotavirus vaccine immunogenicity in high- and low-child-mortality settings, controlling for individual- and country-level factors. Our central hypothesis was that OPV administered concomitantly with rotavirus vaccine reduced rotavirus vaccine immunogenicity. METHODS AND FINDINGS: Pooled, individual-level data from GlaxoSmithKline's Phase II and III clinical trials of the monovalent rotavirus vaccine (RV1), Rotarix, were analyzed, including 7,280 vaccinated infants (5-17 weeks of age at first vaccine dose) from 22 trials and 33 countries/territories (5 countries/territories with high, 13 with moderately low, and 15 with very low child mortality). Two standard markers for immune response were examined including antirotavirus immunoglobulin A (IgA) seroconversion (defined as the appearance of serum antirotavirus IgA antibodies in subjects initially seronegative) and serum antirotavirus IgA titer, both collected approximately 4-12 weeks after administration of the last rotavirus vaccine dose. Mixed-effect logistic regression and mixed-effect linear regression of log-transformed data were used to identify individual- and country-level predictors of seroconversion (dichotomous) and antibody titer (continuous), respectively. 
Infants in high-child-mortality settings had lower odds of seroconverting compared with infants in low-child-mortality settings (odds ratio [OR] = 0.48, 95% confidence interval [CI] 0.43-0.53, p < 0.001). Similarly, among those who seroconverted, infants in high-child-mortality settings had lower IgA titers compared with infants in low-child-mortality settings (mean difference [beta] = 0.83, 95% CI 0.77-0.90, p < 0.001). Infants who received OPV concomitantly with both their first and their second doses of rotavirus vaccine had 0.63 times the odds of seroconverting (OR = 0.63, 95% CI 0.47-0.84, p = 0.002) compared with infants who received OPV but not concomitantly with either dose. In contrast, among infants who seroconverted, OPV concomitantly administered with both the first and second rotavirus vaccine doses was found to be positively associated with antirotavirus IgA titer (beta = 1.28, 95% CI 1.07-1.53, p = 0.009). Our findings may have some limitations in terms of generalizability to routine use of rotavirus vaccine because the analysis was limited to healthy infants receiving RV1 in clinical trial settings. CONCLUSIONS: Our findings suggest that OPV given concomitantly with RV1 was a substantial contributor to reduced antirotavirus IgA seroconversion, and this interference was apparent after the second vaccine dose of RV1, consistent with the original clinical trials on which our reanalysis is based. However, our findings suggest that the forthcoming withdrawal of OPV from the infant immunization schedule globally has the potential to improve RV1 performance.
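
The odds ratios reported above act on odds, not directly on probabilities: "OR = 0.63" means the odds of seroconversion are multiplied by 0.63. A small helper illustrating the conversion (the baseline probability below is hypothetical, chosen only for the worked example):

```python
def apply_odds_ratio(baseline_prob: float, odds_ratio: float) -> float:
    """Scale the odds implied by baseline_prob by odds_ratio and
    convert the result back to a probability."""
    odds = baseline_prob / (1 - baseline_prob)
    new_odds = odds * odds_ratio
    return new_odds / (1 + new_odds)

# Hypothetical baseline of 80% seroconversion gives odds of 4.0;
# applying OR = 0.63 yields odds of 2.52, i.e. roughly a 72%
# seroconversion probability.
example = apply_odds_ratio(0.80, 0.63)
```

Note that the drop in probability (80% to about 72% here) is smaller than the OR of 0.63 might naively suggest, which is why odds ratios should not be read as risk ratios when the outcome is common.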

      2. Use of anthrax vaccine in the United States: Recommendations of the Advisory Committee on Immunization Practices, 2019external icon
        Bower WA, Schiffer J, Atmar RL, Keitel WA, Friedlander AM, Liu L, Yu Y, Stephens DS, Quinn CP, Hendricks K.
        MMWR Recomm Rep. 2019 Dec 13;68(4):1-14.
        This report updates the 2009 recommendations from the CDC Advisory Committee on Immunization Practices (ACIP) regarding use of anthrax vaccine in the United States (Wright JG, Quinn CP, Shadomy S, Messonnier N. Use of anthrax vaccine in the United States: recommendations of the Advisory Committee on Immunization Practices [ACIP], 2009. MMWR Recomm Rep 2010;59[No. RR-6]). The report 1) summarizes data on estimated efficacy in humans using a correlates of protection model and safety data published since the last ACIP review, 2) provides updated guidance for use of anthrax vaccine adsorbed (AVA) for preexposure prophylaxis (PrEP) and in conjunction with antimicrobials for postexposure prophylaxis (PEP), 3) provides updated guidance regarding PrEP vaccination of emergency and other responders, 4) summarizes the available data on an investigational anthrax vaccine (AV7909), and 5) discusses the use of anthrax antitoxins for PEP. Changes from previous guidance in this report include the following: 1) a booster dose of AVA for PrEP can be given every 3 years instead of annually to persons not at high risk for exposure to Bacillus anthracis who have previously received the initial AVA 3-dose priming and 2-dose booster series and want to maintain protection; 2) during a large-scale emergency response, AVA for PEP can be administered using an intramuscular route if the subcutaneous route of administration poses significant materiel, personnel, or clinical challenges that might delay or preclude vaccination; 3) recommendations on dose-sparing AVA PEP regimens if the anthrax vaccine supply is insufficient to vaccinate all potentially exposed persons; and 4) clarification on the duration of antimicrobial therapy when used in conjunction with vaccine for PEP. These updated recommendations can be used by health care providers and can guide emergency preparedness officials and planners who are developing plans to provide anthrax vaccine, including preparations for a wide-area aerosol release of B. anthracis spores. The recommendations also provide guidance on dose-sparing options, if needed, to extend the supply of vaccine to increase the number of persons receiving PEP in a mass casualty event.

      3. Comparative immunogenicity of several enhanced influenza vaccine options for older adults: A randomized, controlled trialexternal icon
        Cowling BJ, Perera R, Valkenburg SA, Leung NH, Iuliano AD, Tam YH, Wong JH, Fang VJ, Li AP, So HC, Ip DK, Azziz-Baumgartner E, Fry AM, Levine MZ, Gangappa S, Sambhara S, Barr IG, Skowronski DM, Peiris JS, Thompson MG.
        Clin Infect Dis. 2019 Dec 12.
        BACKGROUND: Enhanced influenza vaccines may improve protection for older adults, but comparative immunogenicity data are limited. Our objective was to examine immune responses to enhanced influenza vaccines, compared to standard-dose vaccines, in community-dwelling older adults. METHODS: Community-dwelling older adults aged 65-82 years in Hong Kong were randomly allocated (October 2017-January 2018) to receive 2017-2018 Northern hemisphere formulations of a standard-dose quadrivalent vaccine, MF59-adjuvanted trivalent vaccine, high-dose trivalent vaccine, or recombinant-hemagglutinin (rHA) quadrivalent vaccine. Sera collected from 200 recipients of each vaccine before and at 30 days postvaccination were assessed for antibodies to egg-propagated vaccine strains by hemagglutination inhibition (HAI) and to cell-propagated A/Hong Kong/4801/2014(H3N2) virus by microneutralization (MN). Influenza-specific CD4+ and CD8+ T cell responses were assessed in 20 participants per group. RESULTS: Mean fold rises (MFR) in HAI titers to egg-propagated A(H1N1) and A(H3N2) and the MFR in MN to cell-propagated A(H3N2) were statistically significantly higher in the enhanced vaccine groups, compared to the standard-dose vaccine. The MFR in MN to cell-propagated A(H3N2) was highest among rHA recipients (4.7), followed by high-dose (3.4) and MF59-adjuvanted (2.9) recipients, compared to standard-dose recipients (2.3). Similarly, postvaccination MN titers to cell-propagated A(H3N2) among rHA recipients were 2.57-fold higher than those among standard-dose recipients, a ratio statistically higher than that observed for high-dose (1.33-fold) and MF59-adjuvanted (1.43-fold) recipients. Enhanced vaccines also resulted in the boosting of T-cell responses. CONCLUSIONS: In this head-to-head comparison, older adults receiving enhanced vaccines showed improved humoral and cell-mediated immune responses, compared to standard-dose vaccine recipients. CLINICAL TRIALS REGISTRATION: NCT03330132.

      4. INTRODUCTION: The Advisory Committee on Immunization Practices (ACIP) recommends vaccination with tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis vaccine (Tdap) in persons >/=65 years of age. To date, few studies have assessed the safety of Tdap in this population. We aimed to summarize reports submitted to the Vaccine Adverse Event Reporting System (VAERS) following receipt of Tdap in this age group. METHODS: We searched for and analyzed U.S. VAERS reports of Tdap among individuals >/=65 years of age submitted from September 1, 2010 through December 31, 2018. We classified reports according to concurrent vaccination, seriousness, and outcome (death, non-death) and determined the frequency of reported adverse events (AEs). For serious reports, we reviewed available medical records. Data mining analyses were undertaken to detect disproportionality in reporting. RESULTS: VAERS received a total of 1,798 reports following Tdap, of which 104 (6%) were serious. The most common AEs were injection site erythema (26%; n=468), injection site pain (19%; n=335), injection site swelling (18%; n=329), and erythema (18%; n=321). We identified seven deaths; none were attributed to Tdap. Among serious non-death reports, nervous system disorders (35.1%; n=34) and infections and infestations (18.6%; n=18) were most commonly reported. Data mining did not identify any vaccine-AE combination reported more frequently than expected. CONCLUSIONS: We did not identify any new safety concern over nearly a decade of recommended Tdap use among adults >/=65 years of age. Findings from this post-marketing review are consistent with prior post-marketing observations and pre-licensure studies.

      5. BACKGROUND: Cholera is a major public health concern in displaced-person camps, which often contend with overcrowding and scarcity of resources. Maela, the largest and longest-standing refugee camp in Thailand, located along the Thai-Burmese border, experienced four cholera outbreaks between 2005 and 2010. In 2013, a cholera vaccine campaign was implemented in the camp. To assist in the evaluation of the campaign and planning for subsequent campaigns, we developed a mathematical model of cholera in Maela. METHODS: We formulated a Susceptible-Infectious-Water-Recovered-based transmission model and estimated parameters using incidence data from 2010. We next evaluated the reduction in cases conferred by several immunization strategies, varying timing, effectiveness, and resources (i.e., vaccine availability). After the vaccine campaign, we generated case forecasts for the next year, to inform on-the-ground decision-making regarding whether a booster campaign was needed. RESULTS: We found that preexposure vaccination can substantially reduce the risk of cholera even when <50% of the population is given the full two-dose series. Additionally, the preferred number of doses per person should be considered in the context of one- vs. two-dose effectiveness and vaccine availability. For reactive vaccination, a trade-off between timing and effectiveness was revealed, indicating that it may be beneficial to give one dose to more people rather than two doses to fewer people, given that a two-dose schedule would incur a delay in administration of the second dose. Forecasting using realistic coverage levels predicted that there was no need for a booster campaign in 2014 (consistent with our predictions, there was not a cholera epidemic in 2014). CONCLUSIONS: Our analyses suggest that vaccination in conjunction with ongoing water sanitation and hygiene efforts provides an effective strategy for controlling cholera outbreaks in refugee camps.
Effective preexposure vaccination depends on timing and effectiveness. If a camp is facing an outbreak, delayed distribution of vaccines can substantially alter the effectiveness of reactive vaccination, suggesting that quick distribution of vaccines may be more important than ensuring every individual receives both vaccine doses. Overall, this analysis illustrates how mathematical models can be applied in public health practice, to assist in evaluating alternative intervention strategies and inform decision-making.

      6. Clinical practices for measles-mumps-rubella vaccination among US pediatric international travelersexternal icon
        Hyle EP, Rao SR, Bangs AC, Gastanaduy P, Fiebelkorn AP, Hagmann SH, Walker AT, Walensky RP, Ryan ET, LaRocque RC.
        JAMA Pediatr. 2019 Dec 9:e194515.
        Importance: The US population is experiencing a resurgence of measles, with more than 1,000 cases during the first 6 months of 2019. Imported measles cases among returning international travelers are the source of most US measles outbreaks, and these importations can be reduced with pretravel measles-mumps-rubella (MMR) vaccination of pediatric travelers. Although it is estimated that children account for less than 10% of US international travelers, pediatric travelers account for 47% of all known measles importations. Objective: To examine clinical practice regarding MMR vaccination of pediatric international travelers and to identify reasons for nonvaccination of pediatric travelers identified as MMR eligible. Design, Setting, and Participants: This cross-sectional study of pediatric travelers (ages >/=6 months and <18 years) attending pretravel consultation at 29 sites associated with Global TravEpiNet (GTEN), a Centers for Disease Control and Prevention-supported consortium of clinical sites that provide pretravel consultations, was performed from January 1, 2009, through December 31, 2018. Main Outcomes and Measures: Measles-mumps-rubella vaccination among MMR vaccination-eligible pediatric travelers. Results: Of 14,602 pretravel consultations for pediatric international travelers, 2,864 travelers (19.6%; 1,475 [51.5%] males; 1,389 [48.5%] females) were eligible to receive pretravel MMR vaccination at the time of the consultation: 365 of 398 infants aged 6 to 12 months (91.7%), 2,161 of 3,623 preschool-aged travelers aged 1 to 6 years (59.6%), and 338 of 10,581 school-aged travelers aged 6 to 18 years (3.2%). Of 2,864 total MMR vaccination-eligible travelers, 1,182 (41.3%) received the MMR vaccine and 1,682 (58.7%) did not. The MMR vaccination-eligible travelers who did not receive vaccine included 161 of 365 infants (44.1%), 1,222 of 2,161 preschool-aged travelers (56.5%), and 299 of 338 school-aged travelers (88.5%). We observed a diversity of clinical practice at different GTEN sites. In multivariable analysis, MMR vaccination-eligible pediatric travelers were less likely to be vaccinated at the pretravel consultation if they were school-aged (model 1: odds ratio [OR], 0.32 [95% CI, 0.24-0.42; P < .001]; model 2: OR, 0.26 [95% CI, 0.14-0.47; P < .001]) or evaluated at specific GTEN sites (South: OR, 0.06 [95% CI, 0.01-0.52; P < .001]; West: OR, 0.10 [95% CI, 0.02-0.47; P < .001]). The most common reasons for nonvaccination were clinician decision not to administer MMR vaccine (621 of 1,682 travelers [36.9%]) and guardian refusal (612 [36.4%]). Conclusions and Relevance: Although most infant and preschool-aged travelers evaluated at GTEN sites were eligible for pretravel MMR vaccination, only 41.3% were vaccinated during pretravel consultation, mostly because of clinician decision or guardian refusal. Strategies may be needed to improve MMR vaccination among pediatric travelers and to reduce measles importations and outbreaks in the United States.

      7. Japanese encephalitis vaccine-specific envelope protein E138K mutation does not attenuate virulence of West Nile virusexternal icon
        Kaiser JA, Luo H, Widen SG, Wood TG, Huang CY, Wang T, Barrett AD.
        NPJ Vaccines. 2019 ;4:50.
        West Nile (WNV) and Japanese encephalitis viruses (JEV) are closely related, mosquito-borne neurotropic flaviviruses. Although there are no licensed human vaccines for WNV, JEV has multiple human vaccines, including the live, attenuated vaccine SA14-14-2. Investigations into determinants of attenuation of JE SA14-14-2 demonstrated that envelope (E) protein mutation E138K was crucial to the attenuation of mouse virulence. As WNV is closely related to JEV, we investigated whether the E-E138K mutation would be beneficial to include in a candidate live attenuated WNV vaccine. Rather than conferring an attenuated phenotype in mice, the WNV E-E138K mutant reverted and retained a wild-type mouse virulence phenotype. Next-generation sequencing analysis demonstrated that, although the consensus sequence of the mutant had the E-E138K mutation, there was increased variation in the E protein, including a single-nucleotide variant (SNV) revertant to the wild-type glutamic acid residue. Modeling of the E protein and analysis of SNVs showed that reversion was likely due to electrostatic incompatibility among critical E-protein residues. Therefore, this mutation may not be reliable for inclusion in candidate live attenuated vaccines for related flaviviruses, such as WNV, and care must be taken when translating attenuating mutations from one virus to another, even if they are closely related.

      8. With the rapid development of computing technology, Bayesian statistics has gained increasing attention in various areas of public health. However, the full potential of Bayesian sequential methods applied to vaccine safety surveillance has not yet been realized, despite the acknowledged practical benefits and philosophical advantages of Bayesian statistics. In this paper, we describe how sequential analysis can be performed in a Bayesian paradigm in the field of vaccine safety. We compared the performance of a frequentist sequential method, specifically the Maximized Sequential Probability Ratio Test (MaxSPRT), and a Bayesian sequential method using simulations and a real-world vaccine safety example. Performance is evaluated using three metrics: false positive rate, false negative rate, and average earliest time to signal. Depending on the background rate of adverse events, the Bayesian sequential method could significantly improve the false negative rate and decrease the earliest time to signal. We consider the proposed Bayesian sequential approach to be a promising alternative for vaccine safety surveillance.
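        The Bayesian sequential idea described above can be illustrated with a conjugate Gamma-Poisson model: the posterior over the adverse-event rate is updated after each monitoring interval, and a signal is raised once the posterior probability that the rate exceeds the background rate passes a threshold. This is a sketch only, not the paper's implementation; the Gamma(1, 1) prior, one-exposure-unit-per-week assumption, and 0.95 signal threshold are choices made for the example.

```python
# Illustrative sketch only (not the paper's implementation): a Bayesian
# sequential monitor for vaccine adverse-event counts using a conjugate
# Gamma-Poisson model. The Gamma(1, 1) prior, per-week exposure unit, and
# 0.95 signal threshold are assumptions chosen for the example.
from math import exp, lgamma, log


def gamma_tail_prob(rate, shape, scale, n_grid=20000):
    """P(lambda > rate) under a Gamma(shape, scale) posterior, by midpoint
    integration of the density (avoids non-stdlib incomplete-gamma calls)."""
    upper = max(rate, shape * scale) * 10.0   # covers essentially all mass
    step = upper / n_grid
    log_norm = -lgamma(shape) - shape * log(scale)
    tail = 0.0
    for i in range(n_grid):
        x = (i + 0.5) * step
        if x > rate:
            tail += exp(log_norm + (shape - 1.0) * log(x) - x / scale) * step
    return tail


def sequential_monitor(weekly_counts, expected_per_week, threshold=0.95):
    """Update the posterior after each week of counts; signal (return the
    week number) once P(true rate > expected rate) reaches the threshold."""
    shape, inv_scale = 1.0, 1.0               # Gamma(1, 1) prior (assumption)
    for week, count in enumerate(weekly_counts, start=1):
        shape += count                        # conjugate update for Poisson data
        inv_scale += 1.0                      # one exposure unit per week
        if gamma_tail_prob(expected_per_week, shape, 1.0 / inv_scale) >= threshold:
            return week                       # earliest time to signal
    return None                               # no signal in the window
```

        Under this toy setup, weekly counts well above the expected background signal immediately, while counts near background never cross the threshold; the paper's comparison against MaxSPRT evaluates exactly this trade-off between earliest time to signal and error rates.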

      9. BACKGROUND: Swine origin A(H3N2) variant (A(H3N2)v) viruses continue to evolve and remain a public health threat. Recent outbreaks in humans in 2016-2018 were caused by a newly emerged A(H3N2)v cluster, 2010.1, which is genetically and antigenically distinct from the previously predominant cluster IV. To address the public health risk, we evaluated the levels of heterologous cross-reactive antibodies to A(H3N2)v cluster 2010.1 viruses induced by an existing cluster IV A(H3N2)v vaccine and several seasonal inactivated influenza vaccines (IIVs) in adults, the elderly, and children. METHODS: Human vaccine sera and ferret antisera were analyzed by hemagglutination inhibition (HI) and neutralization assays against representative A(H3N2)v viruses from clusters IV and 2010.1, and seasonal A(H3N2) viruses. RESULTS: Ferret antisera detected no or little cross-reactivity between the two A(H3N2)v clusters, or between A(H3N2)v and seasonal A(H3N2) viruses. In humans, cluster IV A(H3N2)v vaccine induced antibodies cross-reactive to cluster 2010.1 viruses in about 1/3 of the 89 adult and elderly vaccinees. Seasonal IIVs did not induce seroprotective antibody titers (>/=40) to A(H3N2)v viruses in young children, but induced higher antibody titers to A(H3N2)v viruses in cluster 2010.1 than to those in cluster IV in adults. CONCLUSIONS: Cluster IV A(H3N2)v vaccine did not provide sufficient heterologous antibody responses against the new 2010.1 cluster A(H3N2)v viruses. Seasonal IIV could not induce seroprotective antibodies to 2010.1 cluster A(H3N2)v viruses in young children, suggesting that young children remain at high risk from the newly emerged A(H3N2)v viruses. Continued surveillance of A(H3N2)v viruses is critical for risk assessment and pandemic preparedness.

      10. BACKGROUND: Annual vaccination against seasonal influenza is widely recognized as the primary intervention method in preventing morbidity and mortality from influenza, but coverage among adults is suboptimal in the United States. Safety and effectiveness perceptions regarding vaccines are consistently cited as factors that influence adults' decisions to accept or reject vaccination. Therefore, we conducted this analysis in order to understand sociodemographic, attitude, and knowledge factors associated with these perceptions for influenza vaccine among adults in three different age groups. METHODS: Probability-based Internet panel surveys using nationally representative samples of adults aged >/=19 years in the United States were conducted during February-March of 2017 and 2018. We asked respondents if they believed the influenza vaccine was safe and effective. We calculated prevalence ratios using chi-square and pairwise t-tests to determine associations between safety and effectiveness beliefs and sociodemographic variables for adults aged 19-49, 50-64, and >/=65 years. RESULTS: Survey completion rates were 58.2% (2017) and 57.2% (2018); we analyzed 4597 combined responses. Overall, most adults reported the influenza vaccine was safe (86.3%) and effective (73.0%). However, fewer younger adults reported positive perceptions compared with older age groups. Respondents who believed the vaccine was safe also reported it was effective. CONCLUSIONS: Generally, adults perceived the influenza vaccine as safe and effective. Considering this, any improvements to these perceptions would likely be minor and have a limited effect on coverage. Future research to understand why, despite positive perceptions, adults are still choosing to forego the vaccine may be informative.

      11. Vaccine-associated anaphylaxisexternal icon
        McNeil MM.
        Curr Treat Options Allergy. 2019 Sep;6(3):297-308.
        Purpose of Review: Anaphylaxis is a rare, serious hypersensitivity reaction following vaccination, which is rapid in onset and characterized by multisystem involvement. Although anaphylaxis may occur after any vaccine, understanding the risk for this outcome, particularly following influenza vaccines, is important because of the large number of persons vaccinated annually. Recent Findings: Two recent CDC safety studies confirmed the rarity of post-vaccination anaphylaxis. In a 25-year review of data from the Vaccine Adverse Event Reporting System (VAERS), reports in children were most common following childhood vaccinations and among adults more often followed influenza vaccine. In a Vaccine Safety Datalink (VSD) study, the estimated incidence of anaphylaxis was 1.3 per million vaccine doses administered for all vaccines and 1.6 per million doses for IIV3 (trivalent) influenza vaccine. Summary: Despite its rarity, its rapid onset (usually within minutes) and potentially lethal nature require that all personnel and facilities providing vaccinations have procedures in place for anaphylaxis management.

      12. Neonatal seizures: Case definition & guidelines for data collection, analysis, and presentation of immunization safety dataexternal icon
        Pellegrin S, Munoz FM, Padula M, Heath PT, Meller L, Top K, Wilmshurst J, Wiznitzer M, Das MK, Hahn CD, Kucuku M, Oleske J, Vinayan KP, Yozawitz E, Aneja S, Bhat N, Boylan G, Sesay S, Shrestha A, Soul JS, Tagbo B, Joshi J, Soe A, Maltezou HC, Gidudu J, Kochhar S, Pressler RM.
        Vaccine. 2019 Dec 10;37(52):7596-7609.

      13. Does having a seasonal influenza program facilitate pandemic preparedness? An analysis of vaccine deployment during the 2009 pandemicexternal icon
        Porter RM, Goldin S, Lafond KE, Hedman L, Ungkuldee M, Kurzum J, Azziz-Baumgartner E, Nannei C, Bresee JS, Moen A.
        Vaccine. 2019 Dec 12.
        BACKGROUND: National seasonal influenza programs have been recommended as a foundation for pandemic preparedness. During the 2009 pandemic, WHO aimed to increase Member States' equitable access to influenza vaccines through pandemic vaccine donation. METHODS: This analysis explores whether the presence of a seasonal influenza program contributed to more rapid national submission of requirements to receive vaccine during the 2009 influenza pandemic. Data from 2009 influenza vaccine donation, deployment, and surveillance initiatives were collected during May-September 2018 from WHO archival material. Data about the presence of seasonal influenza vaccine programs prior to 2009 were gathered from the WHO-UNICEF Joint Reporting Form. Cox proportional hazards models were used to assess the relationship between presence of a seasonal influenza program and time to submission of a national deployment and vaccination plan and to vaccine delivery. FINDINGS: Of 97 countries eligible to receive WHO-donated vaccine, 83 (86%) submitted national deployment and vaccination plans and 77 (79%) received vaccine. Countries with a seasonal influenza vaccine program were more likely to submit a national deployment and vaccination plan (hazard ratio [HR] 2.1; 95% confidence interval [CI]). Countries with regulatory delays were less likely to receive vaccine than those without these delays (HR 0.4, 95% CI: 0.2-0.6). INTERPRETATION: During the 2009 pandemic, eligible countries with a seasonal influenza vaccine program were more ready to receive and use donated vaccines than those without a program. Our findings suggest that robust seasonal influenza vaccine programs increase national familiarity with the management of influenza vaccines and therefore enhance pandemic preparedness. FUNDING: N/A.

      14. Influenza vaccine effectiveness against hospitalizations in children and older adults - Data from South America, 2013-2017. A test negative designexternal icon
        Sofia Arriola C, El Omeiri N, Azziz-Baumgartner E, Thompson MG, Sotomayor-Proschle V, Fasce RA, Von Horoch M, Enrique Carrizo Olalla J, Aparecida Ferreira de Almeida W, Palacios J, Palekar R, Couto P, Descalzo M, Maria Ropero-Alvarez A.
        Vaccine X. 2019 Dec 10;3:100047.
        Background: In 2013, the Pan American Health Organization established a multi-site, multi-country network to evaluate influenza vaccine effectiveness (VE). We pooled data from five consecutive seasons in five countries to conduct an analysis of southern hemisphere VE against laboratory-confirmed influenza hospitalizations in young children and older adults. Methods: We used a test-negative design to estimate VE against laboratory-confirmed influenza in hospitalized young children (aged 6-24 months) and older adults (aged >/=60 years) in Argentina, Brazil, Chile, Colombia, and Paraguay. Following country-specific influenza surveillance protocol, hospitalized persons with severe acute respiratory infections (SARI) at 48 sentinel hospitals (March 2013-December 2017) were tested for influenza virus infection by rRT-PCR. VE was estimated for young children and older adults using logistic random effects models accounting for cluster (country), adjusting for sex, age (months for children, and age-in-year categories for adults), calendar year, country, preexisting conditions, month of illness onset and prior vaccination as an effect modifier for the analysis in adults. Results: We included 8426 SARI cases (2389 children and 6037 adults) in the VE analyses. Among young children, VE against SARI hospitalization associated with any influenza virus was 43% (95%CI: 33%, 51%) for children who received two doses, but was 20% (95%CI: -16%, 45%) and not statistically significant for those who received one dose in a given season. Among older adults, overall VE against SARI hospitalization associated with any influenza virus was 41% (95%CI: 28%, 52%), 45% (95%CI: 34%, 53%) against A(H3N2), 40% (95%CI: 18%, 56%) against A(H1N1)pdm09, and 20% (95%CI: -40%, 54%) against influenza B viruses.
Conclusions: Our results suggest that over the five-year study period, influenza vaccination programs in five South American countries prevented more than one-third of laboratory confirmed influenza-associated hospitalizations in young children receiving the recommended two doses and vaccinated older adults.

      15. Erythema multiforme, Stevens Johnson syndrome, and toxic epidermal necrolysis reported after vaccination, 1999-2017external icon
        Su JR, Haber P, Ng CS, Marquez PL, Dores GM, Perez-Vilar S, Cano MV.
        Vaccine. 2019 Dec 20.
        BACKGROUND: Since the last review of vaccine safety surveillance data for erythema multiforme (EM), Stevens Johnson syndrome (SJS), SJS/TEN, and toxic epidermal necrolysis (TEN) (EM/SJS/TEN), over 37 new vaccines have been introduced in the United States. We sought to describe reported EM/SJS/TEN after vaccines during 1999-2017. METHODS: We identified U.S. reports of EM/SJS/TEN received by the Vaccine Adverse Event Reporting System (VAERS) during 1999-2017. We stratified analysis by condition (EM, SJS, or TEN), and analyzed reports by serious or non-serious status, sex, age group, time from vaccination to symptom onset, exposure to known causes of EM/SJS/TEN, and vaccines administered. We used Empirical Bayesian data mining to detect vaccine-AE pairs reported more frequently than expected. RESULTS: Of 466,027 reports to VAERS during 1999-2017, we identified 984 reports of EM, 89 reports of SJS, 6 reports of SJS/TEN, and 7 reports of TEN. Few reports of EM (9%) were serious, whereas most reports of SJS (52%) and all reports of SJS/TEN (100%) and TEN (100%) were. Overall, 55% of reports described males, 48% described children aged < 4 years; 58% of EM/SJS/TEN occurred </= 7 days after vaccination. Few reports (</=5%) described exposure to known causes of EM/SJS/TEN. Overall, childhood vaccines (e.g., combined measles, mumps, and rubella vaccine) were most commonly reported. We identified 6 deaths; 4 were exposed to medications associated with EM/SJS/TEN. EM after smallpox vaccine was reported disproportionately among people aged 19-49 years. CONCLUSIONS: EM/SJS/TEN were rarely reported after vaccination; data mining identified a known association between EM and smallpox vaccine.

    • Injury and Violence
      1. Driving under the influence of marijuana and illicit drugs among persons aged >/=16 years - United States, 2018external icon
        Azofeifa A, Rexach-Guzman BD, Hagemeyer AN, Rudd RA, Sauber-Schatz EK.
        MMWR Morb Mortal Wkly Rep. 2019 Dec 20;68(50):1153-1157.
        In the United States, driving while impaired is illegal. Nonetheless, an estimated 10,511 alcohol-impaired driving deaths occurred in 2018. The contribution of marijuana and other illicit drugs to these and other impaired driving deaths remains unknown. Data from the Substance Abuse and Mental Health Services Administration's National Survey on Drug Use and Health (NSDUH) indicated that in the United States during 2014, 12.4% of all persons aged 16-25 years reported driving under the influence of alcohol, and 3.2% reported driving under the influence of marijuana (1). The impairing effects of alcohol are well established, but less is known about the effects of illicit substances or other psychoactive drugs (e.g., marijuana, cocaine, methamphetamines, and opioids, including heroin). This report provides the most recent national estimates of self-reported driving under the influence of marijuana and illicit drugs among persons aged >/=16 years, using 2018 public-use data from NSDUH. Prevalences of driving under the influence of marijuana and illicit drugs other than marijuana were assessed for persons aged >/=16 years by age group, sex, and race/ethnicity. During 2018, 12 million (4.7%) U.S. residents reported driving under the influence of marijuana in the past 12 months; 2.3 million (0.9%) reported driving under the influence of illicit drugs other than marijuana. Driving under the influence was more prevalent among males and among persons aged 16-34 years. Effective measures that deter driving under the influence of drugs are limited (2). Development, evaluation, and further implementation of strategies to prevent alcohol-impaired, drug-impaired, and polysubstance-impaired driving, coupled with standardized testing of impaired drivers and drivers involved in fatal crashes, could advance understanding of drug- and polysubstance-impaired driving and support prevention efforts.

      2. National prevalence of sexual violence by a workplace-related perpetratorexternal icon
        Basile KC, D'Inverno AS, Wang J.
        Am J Prev Med. 2019 Dec 10.
        INTRODUCTION: Workplace sexual violence is not a new phenomenon but has received increased attention recently with the re-emergence of the #metoo movement. Gaps exist in the understanding of the prevalence of this problem in the U.S., its perpetrators, and its impacts. METHODS: Using 2010-2012 data from the National Intimate Partner and Sexual Violence Survey (22,590 women and 18,584 men), this study examined the prevalence of several types of sexual violence by a workplace-related perpetrator (authority figure or nonauthority figure) and numerous impacts of the violence, including psychological impacts, safety concerns, and missing days of work or school. Data were analyzed in 2018. RESULTS: In the U.S., 5.6% of women (almost 7 million) and 2.5% of men (nearly 3 million) reported some type of sexual violence by a workplace-related perpetrator. Almost 4% of women (3.9%) reported sexual violence by nonauthority figures and 2.1% by authority figures; 2.0% of men reported sexual violence by nonauthority figures, and 0.6% by authority figures. For women, the most commonly reported sexual violence type was unwanted sexual contact (3.5% of women); for men, it was noncontact unwanted sexual experiences (1.3% of men). An estimated 1 million women (0.8%) have been raped by a workplace-related perpetrator. For women and men, fear was the most commonly reported impact of workplace-related sexual violence. CONCLUSIONS: These findings suggest that workplace prevention efforts that do not address different components of workplace harassment may not be adequate to address all forms of sexual violence occurring across the U.S. in the workplace context.

      3. Prevalence of intimate partner reproductive coercion in the United States: Racial and ethnic differencesexternal icon
        Basile KC, Smith SG, Liu Y, Miller E, Kresnow MJ.
        J Interpers Violence. 2019 Dec 6:886260519888205.
        Reproductive coercion (RC) is a specific type of intimate partner violence (IPV). Although clinical studies have highlighted women's experiences of RC, we know little about its national prevalence and differences in prevalence by sex category and race/ethnicity. Data are from the National Intimate Partner and Sexual Violence Survey (NISVS), years 2010 to 2012. NISVS is an ongoing, nationally representative random-digit-dial telephone survey of the noninstitutionalized English- or Spanish-speaking U.S. adult population. This article reports the national lifetime and 12-month prevalence of two RC victimization measures, and proportions among IPV victims. T tests were used to examine differences in estimates across racial/ethnic groups. In the United States, 9.7% of men and 8.4% of women experienced any RC by an intimate partner during their lifetime. Men reported more commonly than women that a partner tried to get pregnant when the man did not want her to; women reported higher prevalence of partner condom refusal. Examination by race/ethnicity revealed that non-Hispanic (NH) Black women and men had significantly higher lifetime prevalence of both RC types than all other groups; in the last 12 months, NH Blacks had significantly higher prevalence across the board than NH Whites. Hispanics had significantly higher lifetime and 12-month prevalence of any RC and partner condom refusal than NH Whites. RC is at the intersection of two public health concerns-IPV and reproductive health. Documenting its prevalence and differences by sex and race/ethnicity may inform prevention efforts to reduce occurrence and negative health outcomes among specific populations.

      4. Male adolescents' gender attitudes and violence: Implications for youth violence preventionexternal icon
        Miller E, Culyba AJ, Paglisotti T, Massof M, Gao Q, Ports KA, Kato-Wallace J, Pulerwitz J, Espelage DL, Abebe KZ, Jones KA.
        Am J Prev Med. 2019 Dec 19.
        INTRODUCTION: This study analyzed the associations among male adolescents' gender attitudes, intentions to intervene, witnessing peers' abusive behaviors, and multiple forms of adolescent violence perpetration. This community-based evaluation aims to inform future youth violence prevention efforts through the identification of potential predictors of interpersonal violence perpetration. METHODS: Cross-sectional data were from baseline surveys conducted with 866 male adolescents, aged 13-19 years, from community settings in 20 lower-resource neighborhoods in Pittsburgh, PA (August 2015 - June 2017), as part of a cluster RCT to evaluate a sexual violence prevention program. Participants completed in-person, anonymous electronic surveys about gender attitudes, bystander intentions, witnessing peers' abusive behaviors, violence perpetration, and demographics. The analysis was conducted between 2018 and 2019. RESULTS: The youth identified mostly as African American (70%) or Hispanic, multiracial, or other (21%). Most (88%) were born in the U.S., and 85% were in school. Youth with more equitable gender attitudes had lower odds of self-reported violence perpetration across multiple domains, including dating abuse (AOR=0.46, 95% CI=0.29, 0.72) and sexual harassment (AOR=0.50, 95% CI=0.37, 0.67). The relationship between intentions to intervene and violence perpetration was inconclusive. Witnessing peers engaged in abusive behaviors was associated with increased odds of multiple types of violence perpetration, such as dating abuse (witnessed 3 or more behaviors, AOR=2.41, 95% CI=1.31, 4.44). CONCLUSIONS: This is the first U.S.-based study to elicit information from male adolescents in community-based settings (rather than schools or clinics) about multiple types of interpersonal violence perpetration. 
Findings support violence prevention strategies that challenge harmful gender and social norms while simultaneously increasing youths' skills in interrupting peers' disrespectful and harmful behaviors.

      5. INTRODUCTION: Despite progress, injury remains the leading cause of preventable death for American Indians and Alaska Natives (AI/AN) aged 1 to 44 years. There are few publications on injuries among the AI/AN population, especially those on traumatic brain injury (TBI). A TBI can cause short- or long-term changes in cognition, communication, and/or emotion. METHODS: To describe changes over time in TBI incidence by mechanism of injury, injury intent, and age group among AI/ANs, the CDC analyzed hospitalization and death data from the 2008-2014 Healthcare Cost and Utilization Project (HCUP) National Inpatient Sample (NIS) and the National Vital Statistics System (NVSS), respectively. RESULTS: From 2008-2014, the incidence of TBI-related hospitalizations increased by 32% (1,477 in 2008 to 1,945 in 2014) and resulted in a 21% increase in age-adjusted rates of people hospitalized with TBI. TBI-related deaths increased in number (569 in 2008 to 644 in 2014) and age-adjusted rate (22.7 in 2008 to 25.4 in 2014) by approximately 13% and 12%, respectively. Motor-vehicle crashes were the leading cause of TBI-related deaths among AI/ANs aged 0-54 years. PRACTICAL APPLICATION: Prevention efforts should focus on increasing motor-vehicle safety and advancing prevention strategies for other leading causes of TBI, including falls, intentional self-harm, and assaults.

      6. Few comprehensive primary prevention approaches for youth have been evaluated for effects on multiple types of violence. Dating Matters(R): Strategies to Promote Healthy Teen Relationships (Dating Matters) is a comprehensive teen dating violence (TDV) prevention model designed by the Centers for Disease Control and Prevention and evaluated using a longitudinal stratified cluster-randomized controlled trial to determine effectiveness for preventing TDV and promoting healthy relationship behaviors among middle school students. In this study, we examine the prevention effects on secondary outcomes, including victimization and perpetration of physical violence, bullying, and cyberbullying. This study examined the effectiveness of Dating Matters compared to a standard-of-care TDV prevention program in 46 middle schools in four high-risk urban communities across the USA. The analytic sample (N = 3301; 53% female; 50% Black, non-Hispanic; and 31% Hispanic) consisted of 6th-8th grade students who had an opportunity for exposure to Dating Matters in all three grades or the standard-of-care in 8th grade only. Results demonstrated that both male and female students attending schools implementing Dating Matters reported 11% less bullying perpetration and 11% less physical violence perpetration than students in comparison schools. Female Dating Matters students reported 9% less cyberbullying victimization and 10% less cyberbullying perpetration relative to the standard-of-care. When compared to an existing evidence-based intervention for TDV, Dating Matters demonstrated protective effects on physical violence, bullying, and cyberbullying for most groups of students. The Dating Matters comprehensive prevention model holds promise for reducing multiple forms of violence among middle school-aged youth. ClinicalTrials.gov Identifier: NCT01672541.

    • Laboratory Sciences
      1. Sensitivity of C-reactive protein for the identification of patients with laboratory-confirmed bacterial infections in northern Tanzaniaexternal icon
        Althaus T, Lubell Y, Maro VP, Mmbaga BT, Lwezaula B, Halleux C, Biggs HM, Galloway RL, Stoddard RA, Perniciaro JL, Nicholson WL, Doyle K, Olliaro P, Crump JA, Rubach MP.
        Trop Med Int Health. 2019 Dec 6.
        OBJECTIVE: Identifying febrile patients requiring antibacterial treatment is challenging, particularly in low-resource settings. In Southeast Asia, C-reactive protein (CRP) has been demonstrated to be highly sensitive and moderately specific in detecting bacterial infections, and to safely reduce unnecessary antibacterial prescriptions in primary care. As evidence is scant in sub-Saharan Africa, we assessed the sensitivity of CRP in identifying serious bacterial infections in Tanzania. METHODS: Samples were obtained from inpatients and outpatients in a prospective febrile illness study at two hospitals in Moshi, Tanzania, 2011-2014. Bacterial bloodstream infections (BSI) were established by blood culture, and bacterial zoonotic infections were defined by >/=4-fold rise in antibody titer between acute and convalescent sera. The sensitivity of CRP in identifying bacterial infections was estimated using thresholds of 10, 20, and 40 mg/L. Specificity was not assessed because determining false positive CRP results was limited by the lack of diagnostic testing to confirm non-bacterial etiologies and because ascertaining true negative cases was limited by the imperfect sensitivity of the diagnostic tests used to identify bacterial infections. RESULTS: Among 235 febrile outpatients and 569 febrile inpatients evaluated, 31 (3.9%) had a bacterial BSI and 61 (7.6%) had a bacterial zoonosis. Median (interquartile range) CRP values were 173 (80-315) mg/L in bacterial BSI, and 108 (31-208) mg/L in bacterial zoonoses. The sensitivity (95% confidence interval) of CRP was 97% (83-99%), 94% (79-98%), and 90% (74-97%) for identifying bacterial BSI, and 87% (76-93%), 82% (71-90%), and 72% (60-82%) for bacterial zoonoses, using thresholds of 10, 20, and 40 mg/L, respectively. CONCLUSION: CRP was moderately sensitive for bacterial zoonoses and highly sensitive for identifying BSIs.
Based on these results, operational studies are warranted to assess the safety and clinical utility of CRP for the management of non-malaria febrile illness at first-level health facilities in sub-Saharan Africa.
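        The threshold sensitivities reported above amount to the proportion of laboratory-confirmed bacterial infections with CRP at or above each cutoff. A minimal sketch with a Wilson 95% interval; the CRP values below are hypothetical, not the study's data:

```python
# Sketch of the threshold-sensitivity calculation: the proportion of
# laboratory-confirmed bacterial infections with CRP at or above each cutoff,
# with a Wilson 95% interval. The CRP values are hypothetical, not study data.
from math import sqrt


def sensitivity_with_ci(crp_values, threshold, z=1.96):
    """Sensitivity among confirmed infections, with a Wilson score interval."""
    n = len(crp_values)
    p = sum(1 for v in crp_values if v >= threshold) / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, center - half, center + half


# Hypothetical CRP values (mg/L) from confirmed bacterial infections.
confirmed_crp = [173, 80, 315, 12, 55, 210, 9, 140, 33, 98]
results = {cutoff: sensitivity_with_ci(confirmed_crp, cutoff)
           for cutoff in (10, 20, 40)}
```

        As in the study, sensitivity falls as the cutoff rises, and, because specificity could not be assessed, only this one operating characteristic is reported per threshold.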

      2. Effect of a high fat diet and occupational exposure in different rat strains on lung and systemic responses: examination of the exposome in an animal modelexternal icon
        Antonini JM, Kodali V, Shoeb M, Kashon M, Roach KA, Boyce G, Meighan T, Stone S, McKinney W, Boots T, Roberts JR, Zeidler-Erdely PC, Erdely A.
        Toxicol Sci. 2019 Dec 23.
        The exposome is the measure of all exposures of an individual in a lifetime and how those exposures relate to health. The goal was to examine an experimental model integrating multiple aspects of the exposome by collecting biological samples during critical life stages of an exposed animal that are applicable to worker populations. Genetic contributions were assessed using strains of male rats with different genetic backgrounds [Fischer-344, Sprague-Dawley, Brown-Norway] maintained on a regular (REG) or high fat (HF) diet for 24 wk. At wk 7 during diet maintenance, groups of rats from each strain were exposed to stainless steel welding fume (WF; 20 mg/m3 x 3 hr/d x 4 d/wk x 5 wk) or air until wk 12, at which time some animals were euthanized. A separate set of rats from each strain were allowed to recover from WF exposure until the end of the 24 wk period. Bronchoalveolar lavage fluid and serum were collected at 7, 12, and 24 wk to assess general health indices. Depending on animal strain, WF exposure and HF diet together worsened kidney toxicity as well as altered different serum enzymes and proteins. Diet had minimal interaction with WF exposure for pulmonary toxicity endpoints. Experimental factors of diet, exposure, and strain were all important, depending on the health outcome measured. Exposure had the most significant influence related to pulmonary responses. Strain was the most significant contributor regarding the other health indices examined, indicating that genetic differences possibly drive the exposome effect in each strain.

      3. As influenza A viruses continue to jump species barriers, data generated in the ferret model to assess influenza virus pathogenicity, transmissibility, and tropism of these novel strains continues to inform an increasing scope of public health-based applications. This review presents the suitability of ferrets as a small mammalian model for influenza viruses and describes the breadth of pathogenicity and transmissibility profiles possible in this species following inoculation with a diverse range of viruses. Adaptation of aerobiology-based techniques and analyses have furthered our understanding of data obtained from this model and provide insight into the capacity of novel and emerging influenza viruses to cause human infection and disease.

      4. Cardiovascular and renal outcome trials demonstrate nephroprotection with sodium-glucose cotransporter-2 inhibitors in people with type 2 diabetes. Attenuation of hyperfiltration is believed to be responsible for the nephroprotection, and studies in young adults with type 1 diabetes suggest that afferent arteriolar vasoconstriction induced by a tubuloglomerular feedback mechanism may be responsible for this effect. The study by van Bommel et al. suggests that this mechanism may not hold true in older adults with type 2 diabetes, who instead attenuate elevated glomerular filtration rate via post-glomerular vasodilation.

      5. Identification and characterization of Shigella with decreased susceptibility to azithromycin in the United States, 2005 to 2014external icon
        Campbell D, Bowen A, Bhatnagar A, McCullough A, Grass J, Chen J, Folster JP.
        J Glob Antimicrob Resist. 2019 Dec 19.
        OBJECTIVES: To identify Shigella isolates in the U.S. with decreased susceptibility to azithromycin (DSA) and characterize the genetic mechanisms responsible for this resistance. METHODS: The National Antimicrobial Resistance Monitoring System (NARMS) at the Centers for Disease Control and Prevention (CDC) collects and conducts broth microdilution antimicrobial susceptibility testing on Shigella to determine minimum inhibitory concentrations (MIC) for up to 15 drugs, including azithromycin. Isolates with decreased susceptibility to azithromycin were subjected to molecular methods (PCR, whole genome sequencing, and plasmid typing/transformation) to identify the genetic mechanisms of resistance. RESULTS: A total of 118 isolates with decreased susceptibility to azithromycin were tested; 65 (55%) isolates contained only mphA, one (<1%) isolate contained only ermB, and 51 (43%) isolates contained both mechanisms. Seven isolates contained IncFII plasmids with mphA, ermB, or mphA and ermB, while one isolate contained an IncB/O plasmid with mphA. One (<1%) isolate that contained neither mphA nor ermB contained mutations in rrlH, rplD, and rplV genes, and an insertion in rplV, the functions of which are not yet known. CONCLUSIONS: Additional studies are needed to understand the effect on treatment outcomes, epidemiology, and possible additional mechanisms responsible for decreased susceptibility to azithromycin in Shigella.

      6. The disulfide stress response and protein S-thioallylation caused by allicin and diallyl polysulfanes in Bacillus subtilis as revealed by transcriptomics and proteomicsexternal icon
        Chi BK, Huyen NT, Loi VV, Gruhlke MC, Schaffer M, Mader U, Maass S, Becher D, Bernhardt J, Arbach M, Hamilton CJ, Slusarenko AJ, Antelmann H.
        Antioxidants (Basel). 2019 Nov 29;8(12).
        Garlic plants (Allium sativum L.) produce antimicrobial compounds, such as diallyl thiosulfinate (allicin) and diallyl polysulfanes. Here, we investigated the transcriptome and protein S-thioallylomes under allicin and diallyl tetrasulfane (DAS4) exposure in the Gram-positive bacterium Bacillus subtilis. Allicin and DAS4 caused a similar thiol-specific oxidative stress response, protein and DNA damage as revealed by the induction of the OhrR, PerR, Spx, YodB, CatR, HypR, AdhR, HxlR, LexA, CymR, CtsR, and HrcA regulons in the transcriptome. At the proteome level, we identified, in total, 108 S-thioallylated proteins under allicin and/or DAS4 stress. The S-thioallylome includes enzymes involved in the biosynthesis of surfactin (SrfAA, SrfAB), amino acids (SerA, MetE, YxjG, YitJ, CysJ, GlnA, YwaA), nucleotides (PurB, PurC, PyrAB, GuaB), translation factors (EF-Tu, EF-Ts, EF-G), antioxidant enzymes (AhpC, MsrB), as well as redox-sensitive MarR/OhrR and DUF24-family regulators (OhrR, HypR, YodB, CatR). Growth phenotype analysis revealed that the low molecular weight thiol bacillithiol, as well as the OhrR, Spx, and HypR regulons, confer protection against allicin and DAS4 stress. Altogether, we show here that allicin and DAS4 cause a strong oxidative, disulfide and sulfur stress response in the transcriptome and widespread S-thioallylation of redox-sensitive proteins in B. subtilis. The results further reveal that allicin and polysulfanes have similar modes of action and thiol reactivities and modify a similar set of redox-sensitive proteins by S-thioallylation.

      7. Francisella opportunistica sp. nov., isolated from human blood and cerebrospinal fluidexternal icon
        Dietrich EA, Kingry LC, Kugeler KJ, Levy C, Yaglom H, Young JW, Mead PS, Petersen JM.
        Int J Syst Evol Microbiol. 2019 Dec 20.
        Two isolates of a Gram-negative, non-spore-forming coccobacillus cultured from the blood and cerebrospinal fluid of immunocompromised patients in the United States were described previously. Biochemical and phylogenetic analyses revealed that they belong to a novel species within the Francisella genus. Here we describe a third isolate of this species, recovered from the blood of a febrile patient with renal failure, and formally name the Francisella species. Whole genome comparisons indicated the three isolates display greater than 99.9 % average nucleotide identity (ANI) to each other and are most closely related to the tick endosymbiont F. persica, with only 88.6-88.8 % ANI to the type strain of F. persica. Based on biochemical, metabolic and genomic comparisons, we propose that these three isolates should be recognized as Francisella opportunistica sp. nov., with the type strain of the species, PA05-1188(T), available through the Deutsche Sammlung von Mikroorganismen und Zellkulturen (DSM 107100) and the American Type Culture Collection (ATCC BAA-2974).

      8. Determining the molecular drivers of species-specific interferon-stimulated gene product 15 interactions with nairovirus ovarian tumor domain proteasesexternal icon
        Dzimianski JV, Scholte FE, Williams IL, Langley C, Freitas BT, Spengler JR, Bergeron E, Pegan SD.
        PLoS One. 2019 ;14(12):e0226415.
        Tick-borne nairoviruses (order Bunyavirales) encode an ovarian tumor domain protease (OTU) that suppresses the innate immune response by reversing the post-translational modification of proteins by ubiquitin (Ub) and interferon-stimulated gene product 15 (ISG15). Ub is highly conserved across eukaryotes, whereas ISG15 is only present in vertebrates and shows substantial sequence diversity. Prior attempts to address the effect of ISG15 diversity on viral protein-ISG15 interactions have focused on only a single species' ISG15 or a limited selection of nairovirus OTUs. To gain a more complete perspective of OTU-ISG15 interactions, we biochemically assessed the relative activities of 14 diverse nairovirus OTUs for 12 species' ISG15 and found that ISG15 activity is predominantly restricted to particular nairovirus lineages reflecting, in general, known virus-host associations. To uncover the underlying molecular factors driving OTU affinity for ISG15, X-ray crystal structures of Kupe virus and Ganjam virus OTUs bound to sheep ISG15 were solved and compared to complexes of Crimean-Congo hemorrhagic fever virus and Erve virus OTUs bound to human and mouse ISG15, respectively. Through mutational and structural analysis, seven residues in ISG15 were identified that predominantly influence ISG15 species specificity among nairovirus OTUs. Additionally, OTU residues were identified that influence ISG15 preference, suggesting the potential for viral OTUs to adapt to different host ISG15s. These findings provide a foundation to further develop research methods to trace nairovirus-host relationships and delineate the full impact of ISG15 diversity on nairovirus infection.

      9. Integrated transcriptomics, metabolomics, and lipidomics profiling in rat lung, blood, and serum for assessment of laser printer-emitted nanoparticle inhalation exposure-induced disease risksexternal icon
        Guo NL, Poh TY, Pirela S, Farcas MT, Chotirmall SH, Tham WK, Adav SS, Ye Q, Wei Y, Shen S, Christiani DC, Ng KW, Thomas T, Qian Y, Demokritou P.
        Int J Mol Sci. 2019 Dec 16;20(24).
        Laser printer-emitted nanoparticles (PEPs) generated from toners during printing represent one of the most common types of life cycle released particulate matter from nano-enabled products. Toxicological assessment of PEPs is therefore important for occupational and consumer health protection. Our group recently reported exposure to PEPs induces adverse cardiovascular responses including hypertension and arrhythmia via monitoring left ventricular pressure and electrocardiogram in rats. This study employed genome-wide mRNA and miRNA profiling in rat lung and blood integrated with metabolomics and lipidomics profiling in rat serum to identify biomarkers for assessing PEPs-induced disease risks. Whole-body inhalation of PEPs perturbed transcriptional activities associated with cardiovascular dysfunction, metabolic syndrome, and neural disorders at every observed time point in both rat lung and blood during the 21 days of exposure. Furthermore, the systematic analysis revealed PEPs-induced transcriptomic changes linking to other disease risks in rats, including diabetes, congenital defects, auto-recessive disorders, physical deformation, and carcinogenesis. The results were also confirmed with global metabolomics profiling in rat serum. Among the validated metabolites and lipids, linoleic acid, arachidonic acid, docosahexaenoic acid, and histidine showed significant variation in PEPs-exposed rat serum. Overall, the identified PEPs-induced dysregulated genes, molecular pathways and functions, and miRNA-mediated transcriptional activities provide important insights into the disease mechanisms. The discovered important mRNAs, miRNAs, lipids and metabolites may serve as candidate biomarkers for future occupational and medical surveillance studies. To the best of our knowledge, this is the first study systematically integrating in vivo, transcriptomics, metabolomics, and lipidomics to assess PEPs inhalation exposure-induced disease risks using a rat model.

      10. ICTV virus taxonomy profile: Peribunyaviridaeexternal icon
        Hughes HR, Adkins S, Alkhovskiy S, Beer M, Blair C, Calisher CH, Drebot M, Lambert AJ, de Souza WM, Marklewitz M, Nunes MR, Shi X.
        J Gen Virol. 2019 Dec 17.
        Peribunyaviruses are enveloped and possess three distinct, single-stranded, negative-sense RNA segments comprising 11.2-12.5 kb in total. The family includes globally distributed viruses in the genera Orthobunyavirus, Herbevirus, Pacuvirus and Shangavirus. Most viruses are maintained in geographically-restricted vertebrate-arthropod transmission cycles that can include transovarial transmission from arthropod dam to offspring. Others are arthropod-specific. Arthropods can be persistently infected. Human infection occurs through blood feeding by an infected vector arthropod. Infections can result in a diversity of human and veterinary clinical outcomes in a strain-specific manner. Segment reassortment is evident between some peribunyaviruses. This is a summary of the International Committee on Taxonomy of Viruses (ICTV) Report on the taxonomy of the family Peribunyaviridae, which is available at ictv.global/report/peribunyaviridae.

      11. Application of the fentanyl analog screening kit toward the identification of emerging synthetic opioids in human plasma and urine by LC-QTOFexternal icon
        Krajewski LC, Swanson KD, Bragg WA, Shaner RL, Seymour C, Carter MD, Hamelin EI, Johnson RC.
        Toxicol Lett. 2020 Mar 1;320:87-94.
        Human exposures to fentanyl analogs, which significantly contribute to the ongoing U.S. opioid overdose epidemic, can be confirmed through the analysis of clinical samples. Our laboratory has developed and evaluated a qualitative approach coupling liquid chromatography and quadrupole time-of-flight mass spectrometry (LC-QTOF) to address novel fentanyl analogs and related compounds using untargeted, data-dependent acquisition. Compound identification was accomplished by searching against a locally established mass spectral library of 174 fentanyl analogs and metabolites. Currently, our library can identify 150 fentanyl-related compounds from the Fentanyl Analog Screening (FAS) Kit, plus an additional 25 fentanyl-related compounds from individual purchases. Plasma and urine samples fortified with fentanyl-related compounds were assessed to confirm the capabilities and intended use of this LC-QTOF method. For fentanyl, 8 fentanyl-related compounds, and naloxone, lower reportable limits (LRL100), defined as the lowest concentration with a 100% true positive rate (n = 12) within clinical samples, were evaluated and ranged from 0.5 ng/mL to 5.0 ng/mL in urine and from 0.25 ng/mL to 2.5 ng/mL in plasma. The application of this high resolution mass spectrometry (HRMS) method enables the real-time detection of known and emerging synthetic opioids present in clinical samples.
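
The LRL100 definition above (the lowest tested concentration at which every replicate, here n = 12, is detected) lends itself to a short illustration. The sketch below is a hypothetical Python rendering of that definition; the function name and the spike-in results are invented, not data from the study.

```python
def lrl100(detections_by_conc):
    """Return the lowest concentration (ng/mL) at which all replicates
    were detected (a 100% true positive rate), or None if no tested
    level qualifies.

    detections_by_conc: dict mapping concentration -> list of boolean
    detection results, one per replicate (e.g. n = 12).
    """
    qualifying = [conc for conc, hits in detections_by_conc.items()
                  if hits and all(hits)]
    return min(qualifying) if qualifying else None

# Hypothetical spike-in results for one analyte in urine:
results = {
    0.25: [True] * 10 + [False] * 2,  # 10/12 replicates detected
    0.5:  [True] * 12,                # 12/12 detected
    1.0:  [True] * 12,                # 12/12 detected
}
print(lrl100(results))  # -> 0.5
```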

      12. Animal models have been used to gain insight into the risk of noise-induced hearing loss (NIHL) and its potential prevention using investigational new drug agents. A number of compounds have yielded benefit in pre-clinical (animal) models. However, the acute traumatic injury models commonly used in pre-clinical testing are fundamentally different from the chronic and repeated exposures experienced by many human populations. Diverse populations that are potentially at risk and could be considered for enrollment in clinical studies include service members, workers exposed to occupational noise, musicians and other performing artists, and children and young adults exposed to non-occupational (including recreational) noise. Both animal models and clinical populations were discussed in this special issue, followed by discussion of individual variation in vulnerability to NIHL. In this final contribution, study design considerations for NIHL otoprotection in pre-clinical and clinical testing are integrated and broadly discussed with evidence-based guidance offered where possible, drawing on the contributions to this special issue as well as other existing literature. The overarching goals of this final paper are to (1) review and summarize key information across contributions and (2) synthesize information to facilitate successful translation of otoprotective drugs from animal models into human application.

      13. Cultivation and aerosolization of Stachybotrys chartarum for modeling pulmonary inhalation exposureexternal icon
        Lemons AR, Croston TL, Goldsmith WT, Barnes MA, Jaderson MA, Park JH, McKinney W, Beezhold DH, Green BJ.
        Inhal Toxicol. 2019 Dec 24:1-11.
        Objective: Stachybotrys chartarum is a hydrophilic fungal species commonly found as a contaminant in water-damaged building materials. Although several studies have suggested that S. chartarum exposure elicits a variety of adverse health effects, the ability to characterize the pulmonary immune responses to exposure is limited by delivery methods that do not replicate environmental exposure. This study aimed to develop a method of S. chartarum aerosolization to better model inhalation exposures. Materials and methods: An acoustical generator system (AGS) was previously developed and utilized to aerosolize and deliver fungal spores to mice housed in a multi-animal nose-only exposure chamber. In this study, methods for cultivating, heat-inactivating, and aerosolizing two macrocyclic trichothecene-producing strains of S. chartarum using the AGS are described. Results and discussion: In addition to conidia, acoustical generation of one strain of S. chartarum resulted in the aerosolization of fungal fragments (<2 µm aerodynamic diameter) derived from conidia, phialides, and hyphae that initially comprised 50% of the total fungal particle count but was reduced to less than 10% over the duration of aerosolization. Acoustical generation of heat-inactivated S. chartarum did not result in a similar level of fragmentation. Delivery of dry, unextracted S. chartarum using these aerosolization methods resulted in pulmonary inflammation and immune cell infiltration in mice inhaling viable, but not heat-inactivated S. chartarum. Conclusions: These methods of S. chartarum growth and aerosolization allow for the delivery of fungal bioaerosols to rodents that may better simulate natural exposure within water-damaged indoor environments.

      14. Azole-resistant Aspergillus fumigatus: What you need to knowexternal icon
        Lockhart SR, Beer K, Toda M.
        Clin Microbiol Newsl. 2020 ;42(1):1-6.
        Aspergillosis is one of the most common fungal infections. The predominant cause of aspergillosis is the species Aspergillus fumigatus. There have been increasing reports of A. fumigatus isolates that are resistant to azole antifungals. The predominant causes of this resistance are environmentally acquired mutations in the target gene, CYP51A, known as TR34/L98H and TR46/Y121F/T289A. They consist of a tandem repeat in the promoter region (TR) and one or two amino acid changes in the protein sequence, respectively. Unfortunately, the capacity in the United States for mold antifungal susceptibility testing is limited, so the extent of azole resistance in clinical practice is largely unknown. This review discusses the causes and implications of azole-resistant A. fumigatus and the role that antifungal susceptibility testing might play in its identification.

      15. Uncrewed aircraft systems versus motorcycles to deliver laboratory samples in west Africa: a comparative economic studyexternal icon
        Ochieng WO, Ye T, Scheel C, Lor A, Saindon J, Yee SL, Meltzer MI, Kapil V, Karem K.
        Lancet Glob Health. 2020 Jan;8(1):e143-e151.
        BACKGROUND: Transportation of laboratory samples in low-income and middle-income countries is often constrained by poor road conditions, difficult geographical terrain, and insecurity. These constraints can lead to long turnaround times for laboratory diagnostic tests and hamper epidemic control or patient treatment efforts. Although uncrewed aircraft systems (UAS; ie, drones) can mitigate some of these transportation constraints, their cost-effectiveness compared with land-based transportation systems is unclear. METHODS: We did a comparative economic study of the costs and cost-effectiveness of UAS versus motorcycles in Liberia (west Africa) for transportation of laboratory samples under simulated routine conditions and public health emergency conditions (based on the 2013-16 west African Ebola virus disease epidemic). We modelled three UAS with operational ranges of 30 km, 65 km, and 100 km (UAS30, UAS65, and UAS100) and lifespans of 1000 to 10 000 h, and compared the costs and number of samples transported with an established motorcycle transportation programme (most commonly used by the Liberian Ministry of Health and the charity Riders for Health). Data for UAS were obtained from Skyfire (a UAS consultancy), Vayu (a UAS manufacturer), and Sandia National Laboratories (a private company with UAS research experience). Motorcycle operational data were obtained from Riders for Health. In our model, we included costs for personnel, equipment, maintenance, and training, and did univariate and probabilistic sensitivity analyses for UAS lifespans, range, and accidents or failures. FINDINGS: Under the routine scenario, the per-sample transport costs were US$0.65 (95% CI 0.01-2.85) and $0.82 (0.56-5.05) for motorcycles and UAS65, respectively. Per-sample transport costs under the emergency scenario were $24.06 (95% CI 21.14-28.20) for motorcycles, $27.42 (95% CI 19.25-136.75) for an unadjusted UAS model with insufficient geographical coverage, and $34.09 (95% CI 26.70-127.40) for an adjusted UAS model with complementary motorcycles. Motorcycles were more cost-effective than short-range UAS (ie, UAS30). However, with increasing range and operational lifespans, UAS became increasingly more cost-effective. INTERPRETATION: Given the current level of technology, purchase prices, equipment lifespans, and operational flying ranges, UAS are not a viable option for routine transport of laboratory samples in west Africa. Field studies are required to generate evidence about UAS lifespan, failure rates, and performance under different weather conditions and payloads. FUNDING: None.
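
The per-sample cost figures above reduce to total programme cost over samples transported. A minimal sketch, assuming only the cost categories named in the abstract (personnel, equipment, maintenance, training); all numbers below are invented for illustration, not the study's inputs.

```python
def cost_per_sample(personnel, equipment, maintenance, training, samples):
    """Total programme cost (USD) divided by number of samples moved."""
    return (personnel + equipment + maintenance + training) / samples

# Hypothetical annual figures (USD), for illustration only:
moto = cost_per_sample(personnel=30_000, equipment=8_000,
                       maintenance=5_000, training=2_000, samples=70_000)
uas = cost_per_sample(personnel=40_000, equipment=25_000,
                      maintenance=10_000, training=5_000, samples=90_000)
print(f"motorcycle: ${moto:.2f}/sample, UAS: ${uas:.2f}/sample")
# -> motorcycle: $0.64/sample, UAS: $0.89/sample
```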
      16. Performance of an alternative laboratory-based HIV diagnostic testing algorithm using HIV-1 RNA viral loadexternal icon
        Pitasi MA, Patel SN, Wesolowski LG, Masciotra S, Luo W, Owen SM, Delaney KP.
        Sex Transm Dis. 2019 Dec 30.
        BACKGROUND: Since 2014, the recommended algorithm for laboratory diagnosis of HIV infection in the United States has consisted of an HIV-1/2 antigen/antibody (Ag/Ab) test followed by an HIV-1/2 antibody (Ab) differentiation test and, if necessary, a diagnostic HIV-1 nucleic acid test (NAT) to resolve discordant or indeterminate results. METHODS: Using stored specimens from persons seeking HIV testing who had not received a previous diagnosis or treatment, we compared the performance of a three-step alternative algorithm consisting of an Ag/Ab test followed by a quantitative HIV-1 RNA viral load assay and, if viral load is not detected, an Ab differentiation test, to that of the recommended algorithm. We calculated the sensitivity and specificity of five Ag/Ab tests and the proportion of specimens correctly classified by the alternative algorithm compared to the recommended algorithm. Results were examined separately for specimens classified as early infection, established infection, and false-reactive screening. RESULTS: Sensitivity and specificity were similar among all Ag/Ab tests. Viral load quantification correctly classified all specimens from early infection, all false-reactive screening specimens, and the majority of specimens from established infection. CONCLUSIONS: Although cost, regulatory barriers, test availability, and the ability to differentiate early from established infection must be considered, this alternative algorithm can potentially decrease the total number of tests performed and reduce turnaround time, thereby streamlining HIV diagnosis and initiation of treatment.
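
The three-step alternative algorithm can be sketched as a decision function. This is a minimal illustration of the flow described in the abstract (Ag/Ab screen, then viral load, then antibody differentiation); the function and parameter names are hypothetical, and real-world handling of indeterminate results is omitted.

```python
def alternative_algorithm(ag_ab_reactive, hiv1_rna_detected,
                          differentiation_result=None):
    """Sketch of the three-step alternative HIV diagnostic algorithm."""
    if not ag_ab_reactive:
        # Non-reactive Ag/Ab screen: no further testing in this flow.
        return "negative screen; no further testing"
    if hiv1_rna_detected:
        # Quantifiable viral load resolves the reactive screen directly.
        return "HIV-1 infection (RNA detected)"
    # No detectable viral load: fall back to the Ab differentiation test.
    return f"resolved by differentiation test: {differentiation_result}"

print(alternative_algorithm(True, True))
# -> HIV-1 infection (RNA detected)
```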

      17. Calculation and uncertainty of zeta potentials of microorganisms in a 1:1 electrolyte with a conductivity similar to surface waterexternal icon
        Polaczyk AL, Amburgey JE, Alansari A, Poler JC, Propato M, Hill VR.
        Colloids Surf A Physicochem Eng Asp. 2020 Feb 5;586.
        The electrophoretic mobilities (EPMs) of fifteen different microbes (6 viruses, 5 vegetative bacteria, 2 bacterial endospores, 2 protozoa) and one microbial particle surrogate (polystyrene microspheres) were measured, and five models were used to convert the EPMs of these microorganisms to zeta potentials. The Helmholtz-Smoluchowski, Huckel-Onsager, Henry, modified Booth, and O'Brien and Hunter models were compared over their ranges of applicability for various microbes in a weak electrolyte solution intended to simulate the conductivity of surface water. The results from each of the models were compared by assessing the magnitude of the error due to inherent limitations of the models and comparing it to the error associated with the measurement of the EPM. Results indicated that differences imparted to the calculated zeta potentials by double layer distortion corrections were typically smaller than the uncertainty of the EPM measurement from which the zeta potential value was calculated. Based on our analyses, the Helmholtz-Smoluchowski equation was most appropriate for application to bacteria (vegetative and endospores) and parasites, while the Henry or modified Booth models were necessary for viruses. Zeta potential calculations with corresponding uncertainty values are presented for each of the microbes and the surrogate for each of the five models studied. A zone chart was created to help avoid unnecessary error, which can exceed 50%, in calculating microbial zeta potentials.
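
The two limiting-case conversions compared above follow standard electrokinetic theory: Helmholtz-Smoluchowski for thin double layers (large particles, e.g. bacteria and parasites) and Huckel-Onsager for thick double layers (small particles). A minimal sketch with water-at-25 °C constants; the example mobility value is hypothetical, not one of the study's measurements.

```python
EPS0 = 8.854e-12   # vacuum permittivity, F/m
EPS_R = 78.5       # relative permittivity of water at 25 degrees C
ETA = 8.9e-4       # dynamic viscosity of water at 25 degrees C, Pa*s

def zeta_smoluchowski(mobility):
    """Helmholtz-Smoluchowski limit (kappa*a >> 1): zeta = mu*eta/eps.
    mobility in m^2/(V*s); returns zeta in volts."""
    return mobility * ETA / (EPS0 * EPS_R)

def zeta_huckel(mobility):
    """Huckel-Onsager limit (kappa*a << 1): zeta = 3*mu*eta/(2*eps)."""
    return 3 * mobility * ETA / (2 * EPS0 * EPS_R)

# Hypothetical bacterial EPM of -2e-8 m^2/(V*s):
mu = -2e-8
print(zeta_smoluchowski(mu) * 1000)  # zeta in mV, roughly -26 mV here
```

Note the two limits differ only by the constant factor 3/2, which is why the choice of model matters less than the EPM measurement uncertainty in many cases.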

      18. Comprehensive laboratory evaluation of a lateral flow assay for the detection of Yersinia pestisexternal icon
        Prentice KW, DePalma L, Ramage JG, Sarwar J, Parameswaran N, Petersen J, Yockey B, Young J, Joshi M, Thirunavvukarasu N, Singh A, Chapman C, Avila JR, Pillai CA, Manickam G, Sharma SK, Morse SA, Venkateswaran KV, Anderson K, Hodge DR, Pillai SP.
        Health Secur. 2019 Nov/Dec;17(6):439-453.
        We conducted a comprehensive, multiphase laboratory evaluation of the Plague BioThreat Alert® (BTA) test, a lateral flow immunoassay (LFA), for the rapid detection of Yersinia pestis. The study was conducted in 7 phases at 2 sites to assess the performance of the LFA. The limit of detection (LOD) was determined using both a virulent and avirulent strain of Y. pestis, CO99-3015 (10^5 CFU/ml) and A1122 (10^4 CFU/ml), respectively. In the other phases, 18 Y. pestis strains, 20 phylogenetic near-neighbor strains, 61 environmental background microorganisms, 26 white powders, and a pooled aerosol sample were also tested. A total of 1,110 LFA test results were obtained, and their analysis indicates that this LFA had a sensitivity of 97.65% and specificity of 96.57%. These performance data are important for accurate interpretation of qualitative results arising from testing suspicious white powders and aerosol samples in the field. Any positive specimen in this assay is considered presumptive positive and should be referred to the Centers for Disease Control and Prevention Laboratory Response Network for additional testing, confirmation, and characterization for an appropriate public health response.
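
Sensitivity and specificity here are the usual confusion-matrix ratios. The counts in the sketch below are hypothetical, chosen only so the printed percentages match the figures reported in the abstract; the paper's actual true/false positive breakdown is not reproduced here.

```python
def sensitivity(true_pos, false_neg):
    """Fraction of truly positive samples the assay flags positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of truly negative samples the assay flags negative."""
    return true_neg / (true_neg + false_pos)

# Hypothetical counts (not the study's raw data):
print(f"sensitivity: {sensitivity(415, 10):.2%}")   # -> sensitivity: 97.65%
print(f"specificity: {specificity(620, 22):.2%}")   # -> specificity: 96.57%
```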

      19. Antifungal triazole posaconazole targets an early stage of the parechovirus A3 life cycleexternal icon
        Rhoden E, Ng TF, Campagnoli R, Nix WA, Konopka-Anstadt J, Selvarangan R, Briesach L, Oberste MS, Weldon WC.
        Antimicrob Agents Chemother. 2019 Dec 9.
        Viruses in species Parechovirus A (Picornaviridae) are associated with a wide variety of clinical manifestations. Parechovirus A3 (PeV-A3) is known to cause sepsis-like illness, meningitis, and encephalitis in infants and young children. To date, no specific therapies are available to treat PeV-A3-infected children. We had previously identified two FDA-cleared antifungal drugs, itraconazole (ITC) and posaconazole (POS), with potent and specific antiviral activity against PeV-A3. Time-of-addition and synchronized infection assays revealed that POS targets an early stage of the PeV-A3 life cycle. POS exerts an antiviral effect, evidenced by a reduction in viral titer following the addition of POS to Vero-P cells before infection, coaddition of POS and PeV-A3 to Vero-P cells, incubation of POS and PeV-A3 prior to Vero-P infection, and addition of POS at the attachment step. POS exerts less of an effect on virus entry. A PeV-A3 ELISA inhibition experiment, using an anti-PeV-A3 monoclonal antibody (mAb), suggested that POS binds directly to the PeV-A3 capsid. POS-resistant PeV-A3 strains, developed by serial passage in the presence of POS, acquired substitutions in multiple regions of the genome, including the capsid. Reverse genetics confirmed substitutions in capsid proteins VP0, VP3, VP1 and nonstructural proteins 2A and 3A. Single mutants VP0_K66R, VP0_A124T, VP3_N88S, VP1_Y224C, 2A_S788L and 3A_T1I were respectively 4-, 9-, 12-, 34-, 51-, and 119-fold more resistant to POS than the susceptible prototype strain. Our studies demonstrate that POS may be a valuable tool in developing an antiviral therapy for PeV-A3.

      20. Exploring mechanistic toxicity of mixtures using PBPK modeling and computational systems biologyexternal icon
        Ruiz P, Emond C, McLanahan E, Joshi-Barr S, Mumtaz M.
        Toxicol Sci. 2019 Dec 18.
        Mixtures risk assessment needs an efficient integration of in vivo, in vitro and in silico data with epidemiology and human studies data. This involves several approaches, some in current use and others under development. This work extends the Agency for Toxic Substances and Disease Registry PBPK toolkit, available for risk assessors, to include a mixture PBPK model of benzene, toluene, ethylbenzene, xylenes (BTEX). The recoded model was evaluated and applied to exposure scenarios to evaluate the validity of dose additivity for mixtures. In the second part of this work, we studied TEX-gene-disease associations using the Comparative Toxicogenomics Database, pathway analysis and published microarray data from human gene expression changes in blood samples after short-term and long-term exposures. Collectively, this information was used to establish hypotheses on potential linkages between TEX exposures and human health. The results show that 236 expressed genes were common between the short-term and long-term exposures. These genes could be central for the interconnecting biological pathways potentially stimulated by TEX exposure, likely related to respiratory and neurological diseases. Using publicly available data, we propose a conceptual framework to study pathway perturbations leading to toxicity of chemical mixtures. This proposed methodology lends mechanistic insight into the toxicity of mixtures and, when experimentally validated, will allow data gaps to be filled for mixtures' toxicity assessment. This work proposes an approach using current knowledge and available multiple-stream data, applying computational methods to advance mixtures risk assessment.

      21. Onsite healthcare worker acceptability and performance of the point-of-care Pima CD4 assay in Dar es Salaam, Tanzaniaexternal icon
        Schmitz ME, Chang K, Arnett N, Kohatsu L, Lemwayi R, Mwasekaga M, Nkengasong J, Bolu O, Mosha F, Westerman L.
        Afr J Lab Med. 2019 ;8(1):740.
        Background: Healthcare workers' acceptance of and ability to perform point-of-care testing is important for reliable and accurate results. The Alere Pima™ CD4 assay (Pima CD4) is the CD4 point-of-care test for HIV management in Tanzania. Objectives: To evaluate healthcare workers' acceptance and performance of Pima CD4 testing. Methods: The study was implemented in five high-volume sites in Dar es Salaam, Tanzania, in 2011. Trained healthcare workers performed Pima testing using three whole-blood specimens collected from each patient: venous blood, fingerstick blood directly applied to a Pima cartridge (capillary-direct), and fingerstick blood collected in a microtube (capillary-microtube). Using a semi-structured interview guide, we interviewed 11 healthcare workers about specimen collection methods and Pima CD4 acceptability. Quantitative responses were analysed using descriptive statistics. Open-ended responses were summarised by thematic areas. Pima CD4 results were analysed to determine variation between cadres. Results: Healthcare workers found Pima CD4 user-friendly and recommended its use in low-volume, peripheral facilities. Both venous and capillary-direct blood were considered easy to collect, with venous preferred. Advantages noted with venous and capillary-microtube methods were the ability to retest, perform multiple tests, or delay testing. Pima CD4 results were trusted by the healthcare workers and were in agreement with laboratory Pima testing. Conclusion: In this point-of-care testing setting, the Pima CD4 assay was accepted by healthcare workers. Both venous and fingerstick capillary blood specimens can be used with Pima CD4, but fingerstick methods may require more intensive training on technique to minimise variation in results and increase acceptability.

      22. Three years of shared service HIV-1 and HIV-2 nucleic acid testing for public health laboratories: worthwhile for HIV-1 but not for HIV-2external icon
        Styer LM, Gaynor AM, Parker MM, Bennett SB, Wesolowski LG, Ethridge S, Chavez PR, Sullivan TJ, Fordan S, Wroblewski K.
        Sex Transm Dis. 2019 Dec 24.
        BACKGROUND: In 2016, HIV-2 nucleic acid testing (NAT) was added to a shared service program that conducts HIV-1 NAT for public health laboratories performing the recommended algorithm for diagnosing HIV. Here we evaluate the usefulness of HIV-2 NAT in this program as compared to HIV-1 NAT. METHODS: Specimens eligible for HIV-1 NAT were reactive on an HIV-1/2 antibody or antigen/antibody initial test and non-reactive or indeterminate on a supplemental antibody test or were reactive for HIV-1 antigen-only on an HIV-1/2 antigen/antibody initial test. Specimens eligible for HIV-2 NAT were reactive on an initial test, HIV-2 indeterminate or HIV indeterminate on a supplemental antibody test and had no detectable HIV-1 RNA or were reactive for HIV-2 antibody on an HIV-1/2 antigen/antibody test and this reactivity was not confirmed with a supplemental antibody assay. All specimens were tested in a reference laboratory using APTIMA HIV-1 qualitative RNA and/or a validated qualitative HIV-2 RNA real-time PCR assay. RESULTS: During 2016-2019, HIV-1 RNA was detected in 234/1731 (14%) specimens tested. HIV-2 RNA was not detected in 52 specimens tested. Median time from specimen collection to reporting of HIV-1 and HIV-2 NAT results by year ranged from 9-10 days and 22-27 days, respectively. Two specimens with HIV-2 indeterminate results on a supplemental antibody test had detectable HIV-1 RNA. CONCLUSIONS: A shared service model for HIV-1 NAT is both feasible and beneficial for public health laboratories. However, because no HIV-2 infections were detected, our data suggest that this program should reconsider the usefulness of HIV-2 NAT testing.

      23. Multistate population and whole genome sequence-based strain surveillance of invasive pneumococci recovered in the USA during 2017external icon
        Varghese J, Chochua S, Tran T, Walker H, Li Z, Snippes Vagnone PM, Lynfield R, McGee L, Li Y, Metcalf BJ, Pilishvili T, Beall B.
        Clin Microbiol Infect. 2019 Sep 16.
        OBJECTIVES: We aimed to provide population-based and whole-genome sequence (WGS) -based characterization of invasive pneumococcal disease isolates collected from multistate surveillance in the USA during 2017. METHODS: We obtained short-read WGS from 2881 isolates with associated bioinformatics pipeline strain feature predictions. For quality control, capsular serotypes and antimicrobial MICs were also obtained conventionally from 442 isolates. Annotated WGS were provided (inclusive of serotypes, MICs, multilocus sequence types, pilus type(s)) from 2723 isolates. For 158 isolates with suboptimal WGS, antimicrobial MICs were obtained conventionally. RESULTS: There were 127 isolates from children <5 years of age and 2754 isolates from those ≥5 years old in 2017. One of 43 different serotypes was predicted for 2877 of the 2881 isolates. Serotypes in the 13-valent conjugate vaccine together with 6C (PCV13+6C) accounted for 816 (28.3%) isolates, with PCV13 serotype 3 being the most common serotype overall. Non-PCV13+6C serotypes accounted for 2065 (71.7%) isolates, comprising 96 (75.6%) isolates from children <5 years old and 1969 (61.4%) isolates from those aged ≥5 years. Of 36 different categories of recently emerged serotype-switch variants, three showed marked increases relative to 2015-2016 in that the number from 2017 surpassed the number from 2015-2016 combined. Two of these included antimicrobial-resistant serotype 11A and 35B serotype-switch variants of the ST156 clonal complex. CONCLUSIONS: PCV13+6C strains were still identified in 2017, but non-PCV13-type strains impose a considerable burden. This well-annotated year 2017 WGS/strain data set will prove useful for a broad variety of analyses and has improved our understanding of invasive pneumococcal disease-causing strains in the post-PCV13 era.

      24. Evaluation of the performance of the Cepheid Xpert HIV-1 Viral Load Assay for quantitative and diagnostic usesexternal icon
        Wesolowski L, Fowler W, Luo W, Sullivan V, Masciotra S, Smith T, Rossetti R, Delaney K, Oraka E, Chavez P, Ethridge S, Switzer WM, Owen SM.
        J Clin Virol. 2019 Nov 15;122:104214.
        BACKGROUND: Cepheid's Xpert HIV-1 Viral Load (Xpert VL), a simplified, automated, single-use quantitative assay used with the GeneXpert System, is not FDA approved. OBJECTIVES: Using stored plasma, we conducted a study to assess the ability of Xpert VL to quantify viral load relative to the Roche COBAS AmpliPrep/COBAS TaqMan HIV-1 (Cobas VL) and to examine the use of the Xpert VL as a qualitative diagnostic test. STUDY DESIGN: Following HIV-1 viral stock dilutions, we conducted a probit analysis to identify the concentration where 95% of specimens had quantified VLs. We also examined Xpert and Cobas log VL correlation in linearity panels; compared the proportion of 220 seroconverter specimens with virus detected using McNemar's test; and tested specimens from persons with untreated, established HIV-1 infection (n=149) and uninfected persons (n=497). Furthermore, we examined Xpert VL as a qualitative test in seroconverter specimens with early (n=20) and later (n=68) acute infections. RESULTS: At 1.80 log10 copies/mL, 95% of specimens had quantifiable virus using Xpert VL. Xpert and Cobas VLs were highly correlated (R² = 0.994). The proportion of seroconverter specimens with virus detected using Cobas and with Xpert VL was not statistically different (p=0.0578). Xpert VL detected 97.9% of established infections, and specificity was 99.80% (95% CI 98.87%-99.99%). Xpert VL detected 90% and 98.5% of early and later acute infections, respectively. CONCLUSIONS: If approved, Xpert VL could allow U.S. laboratories that cannot support large, complex testing platforms to conduct HIV monitoring. An approval for diagnostic use may provide timely identification of HIV infections.
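The probit analysis described above (finding the concentration at which 95% of specimens yield a quantifiable result) can be illustrated with a short, standard-library-only sketch. The function names are hypothetical, and the fit uses least squares on probit-transformed detection proportions rather than the full maximum-likelihood procedure a validation study would use:

```python
import math

def probit(p):
    """Inverse standard normal CDF via Newton iteration on erf (stdlib only)."""
    x = 0.0
    for _ in range(50):
        cdf = 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
        pdf = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
        x -= (cdf - p) / pdf
    return x

def c95_from_dilutions(log_conc, prop_detected):
    """Fit probit(proportion detected) = a + b * log10(conc) by least squares,
    then solve for the log10 concentration with 95% detection probability."""
    pairs = [(x, probit(p)) for x, p in zip(log_conc, prop_detected)
             if 0.0 < p < 1.0]  # probit is undefined at exactly 0 and 1
    xs, ys = zip(*pairs)
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in pairs)
         / sum((x - xbar) ** 2 for x in xs))
    a = ybar - b * xbar
    return (probit(0.95) - a) / b  # log10 concentration at 95% detection
```

Feeding in the proportion of replicates detected at each dilution level returns the log10 concentration analogous to the 1.80 log10 copies/mL figure reported above.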

      25. Measurement of microcystin and nodularin activity in human urine by immunocapture-protein phosphatase 2A assayexternal icon
        Wharton RE, Cunningham BR, Schaefer AM, Guldberg SM, Hamelin EI, Johnson RC.
        Toxins (Basel). 2019 Dec 13;11(12).
        Microcystins (MC) and nodularin (NOD) are toxins released by cyanobacteria during harmful algal blooms. They are potent inhibitors of protein phosphatases 1 and 2A (PP1 and PP2A) and cause a variety of adverse symptoms in humans and animals if ingested. More than 250 chemically diverse congeners of MCs have been identified, but certified reference materials are only available for a few. A diagnostic test that does not require each reference material for detection is necessary to identify human exposures. To address this need, our lab has developed a method that uses an antibody to specifically isolate MCs and NOD from urine prior to detection via a commercially available PP2A kit. This assay quantitates the summed inhibitory activity of nearly all MCs and NOD on PP2A relative to a common MC congener, microcystin-LR (MC-LR). The quantitation range for MC-LR using this method is from 0.050-0.500 ng/mL. No background responses were detected in a convenience set of 50 individual urines. Interday and intraday % accuracies ranged from 94%-118% and relative standard deviations were 15% or less, meeting FDA guidelines for receptor binding assays. The assay detected low levels of MCs in urines from three individuals living in close proximity to harmful algal blooms (HABs) in Florida.

      26. Isolation and phylogenomic analysis of buffalopox virus from human and buffaloes in Indiaexternal icon
        Yadav PD, Mauldin MR, Nyayanit DA, Albarino CG, Sarkale P, Shete A, Guerrero LW, Nakazawa Y, Nichol ST, Mourya DT.
        Virus Res. 2019 Dec 7:197836.
        Three genome sequences of Buffalopox virus (BPXV) were retrieved from scab samples from a human and two buffaloes. Phylogenomic analysis of the BPXV indicates that it shares a most recent common ancestor with Lister and closely related vaccine strains when compared to potential wild-type VACV strains (like Horsepox virus).

      27. Landscape of gene expression variation of natural isolates of Cryptococcus neoformans in response to biologically relevant stressesexternal icon
        Yu CH, Chen Y, Desjardins CA, Tenor JL, Toffaletti DL, Giamberardino C, Litvintseva A, Perfect JR, Cuomo CA.
        Microb Genom. 2019 Dec 20.
        Cryptococcus neoformans is an opportunistic fungal pathogen that at its peak epidemic levels caused an estimated one million cases of cryptococcal meningitis per year worldwide. This species can grow in diverse environmental (trees, soil and bird excreta) and host niches (intracellular microenvironments of phagocytes and free-living in host tissues). The genetic basis for adaptation to these different conditions is not well characterized, as most experimental work has relied on a single reference strain of C. neoformans. To identify genes important for yeast infection and disease progression, we profiled the gene expression of seven C. neoformans isolates grown in five representative in vitro environmental and in vivo conditions. We characterized gene expression differences using RNA-Seq (RNA sequencing), comparing clinical and environmental isolates from two of the major lineages of this species, VNI and VNBI. These comparisons highlighted genes showing lineage-specific expression that are enriched in subtelomeric regions and in lineage-specific gene clusters. By contrast, we find few expression differences between clinical and environmental isolates from the same lineage. Gene expression specific to in vivo stages reflects available nutrients and stresses, with an increase in fungal metabolism within macrophages, and an induction of ribosomal and heat-shock gene expression within the subarachnoid space. This study provides the widest view to date of the transcriptome variation of C. neoformans across natural isolates, and provides insights into genes important for in vitro and in vivo growth stages.

    • Maternal and Child Health
      1. Middle ear effusion in children with congenital cytomegalovirus infectionexternal icon
        Chung W, Leung J, Lanzieri TM, Blum P, Demmler-Harrison G.
        Pediatr Infect Dis J. 2019 Dec 20.
        BACKGROUND: Sensorineural hearing loss (SNHL) is well described in children with congenital cytomegalovirus (CMV) infection, but limited data are available on middle ear effusion (MEE) occurrence in this population. We assessed the prevalence of MEE and the degree of transient hearing change associated with MEE among children with congenital CMV infection. METHODS: Children with congenital CMV infection enrolled in a longitudinal study received hearing and tympanometric testing during scheduled follow-up visits annually up to 6 years of age. We used a generalized linear mixed-effect logistic regression model to compare the odds of MEE, defined as type B tympanogram (normal ear canal volume with little tympanic membrane movement) among patients categorized as symptomatic or asymptomatic based on the presence of congenital CMV-associated signs in the newborn period. RESULTS: Forty-four (61%) of 72 symptomatic and 24 (28%) of 87 asymptomatic patients had ≥1 visit with MEE. After controlling for the number of visits, symptomatic patients had significantly higher odds of MEE (odds ratio: 2.09; 95% confidence interval: 1.39-3.14) than asymptomatic patients. Transient hearing decrease associated with a type B tympanogram ranged from 10 to 40 dB, as measured by audiometric air-bone gap in 11 patients. CONCLUSIONS: Among children with congenital CMV, MEE can result in transient hearing decrease, which can reduce the efficacy of a hearing aid in those with SNHL. Routine audiologic and tympanometric testing is warranted for children with congenital CMV infection and SNHL to better manage hearing aid amplification levels.

      2. U-shaped pillows and sleep-related infant deaths, United States, 2004-2015external icon
        Cottengim C, Parks SE, Erck Lambert AB, Dykstra HK, Shaw E, Johnston E, Olson CK, Shapiro-Mendoza CK.
        Matern Child Health J. 2019 Dec 11.
        OBJECTIVES: To describe infant deaths where a u-shaped pillow was under or around an infant and to describe cases classified as Explained Suffocation. METHODS: We examined demographics and circumstances of 141 infant deaths during 2004-2015 in the US National Fatality Review Case Reporting System with u-shaped pillows in the sleep environment. RESULTS: Most infants were <6 months old (92%), male (58%), and non-Hispanic White (53%). Of the nine explained suffocation deaths, four occurred when the u-shaped pillow obstructed the infant's airway; five occurred when the infant rolled off the pillow and the airway was obstructed by another object. CONCLUSIONS FOR PRACTICE: Although infrequent, infant deaths involving u-shaped pillows have occurred. In their advice to caregivers, health care providers may discuss the importance of following product packaging precautions and warning labels for commonly used consumer products such as u-shaped pillows.

      3. Maternal surgery and anesthesia during pregnancy and risk of birth defects in the National Birth Defects Prevention Study, 1997-2011external icon
        Fisher SC, Siag K, Howley MM, Van Zutphen AR, Reefhuis J, Browne ML.
        Birth Defects Res. 2019 Dec 16.
        BACKGROUND: There is little recent research on the teratogenicity of maternal anesthesia exposure. We used National Birth Defects Prevention Study data to describe surgical procedures conducted during pregnancy and to estimate the risk of birth defects associated with periconceptional anesthesia exposure. METHODS: We used logistic regression to assess associations between general and local anesthesia for surgery during the periconceptional period and specific birth defects. We calculated odds ratios and 95% confidence intervals for 25 birth defects with at least five exposed cases (11,501 controls, 24,337 cases), adjusted for maternal race/ethnicity, age, body mass index, periconceptional exposure to X-ray, CT, or radionuclide scans, and study site. RESULTS: The most commonly reported procedures were dental, dermatologic, and cervical cerclage procedures, regardless of gestational timing. Overall, 226 case and 73 control women reported periconceptional general anesthesia; 230 case and 89 control women reported periconceptional local anesthesia. Women who reported general or local anesthesia were disproportionately non-Hispanic white and were more likely to report periconceptional opioid use and at least one periconceptional X-ray/CT/radionuclide scan. Women who reported general anesthesia were also more likely to report periconceptional injury. We did not observe any significant associations between either type of anesthesia exposure and the birth defects studied. Odds ratios were generally close to null and imprecise. CONCLUSIONS: Our study population reported a wide range of surgical procedures during pregnancy, requiring both general and local anesthesia. Our findings suggest that periconceptional anesthesia is not strongly associated with the birth defects assessed in this study.

      4. Risk of stillbirth for fetuses with specific birth defectsexternal icon
        Heinke D, Nestoridi E, Hernandez-Diaz S, Williams PL, Rich-Edwards JW, Lin AE, Van Bennekom CM, Mitchell AA, Nembhard WN, Fretts RC, Roberts DJ, Duke CW, Carmichael SL, Yazdy MM.
        Obstet Gynecol. 2019 Dec 5.
        OBJECTIVE: To estimate the risk of stillbirth (fetal death at 20 weeks of gestation or more) associated with specific birth defects. METHODS: We identified a population-based retrospective cohort of neonates and fetuses with selected major birth defects and without known or strongly suspected chromosomal or single-gene disorders from active birth defects surveillance programs in nine states. Abstracted medical records were reviewed by clinical geneticists to confirm and classify all birth defects and birth defect patterns. We estimated risks of stillbirth specific to birth defects among pregnancies overall and among those with isolated birth defects; potential bias owing to elective termination was quantified. RESULTS: Of 19,170 eligible neonates and fetuses with birth defects, 17,224 were liveborn, 852 stillborn, and 672 electively terminated. Overall, stillbirth risks ranged from 11 per 1,000 fetuses with bladder exstrophy (95% CI 0-57) to 490 per 1,000 fetuses with limb-body-wall complex (95% CI 368-623). Among those with isolated birth defects not affecting major vital organs, elevated risks (per 1,000 fetuses) were observed for cleft lip with cleft palate (10; 95% CI 7-15), transverse limb deficiencies (26; 95% CI 16-39), longitudinal limb deficiencies (11; 95% CI 3-28), and limb defects due to amniotic bands (110; 95% CI 68-171). Quantified bias analysis suggests that failure to account for terminations may lead to up to fourfold underestimation of the observed risks of stillbirth for sacral agenesis (13/1,000; 95% CI 2-47), isolated spina bifida (24/1,000; 95% CI 17-34), and holoprosencephaly (30/1,000; 95% CI 10-68). CONCLUSION: Birth defect-specific stillbirth risk was high compared with the U.S. stillbirth risk (6/1,000 fetuses), even for isolated cases of oral clefts and limb defects; elective termination may appreciably bias some estimates. These data can inform clinical care and counseling after prenatal diagnosis.

      5. Introduction: Infant mortality is a key population health indicator, and accurate cause of death reporting is necessary to design infant mortality prevention strategies. Death certificates and child fatality review (CFR) both track leading infant causes of death in Ohio but produce different results. Our aim was to determine the frequency and characteristics of differences between the two systems to understand both cause of death ranking systems for Ohio. Methods: We linked and analyzed data from death certificates and CFR records for all infant deaths (aged <1 year) in Ohio during 2009-2013. Death certificate and CFR cause of death assignments were compared. Kappa statistic was used to measure concordance. Death certificate-CFR cause of death pairs were plotted to identify common concordant and discordant pairs. Results: A total of 5030 infant deaths with death certificate and CFR records were analyzed. The most common discordant cause of death pair was other perinatal condition on the death certificate and prematurity by CFR (1119 deaths). Specific injury categories had higher concordance (kappa 0.71-1.00) than medical categories (kappa 0.00-0.78). Among 456 deaths categorized as sudden infant death syndrome on death certificates, approximately 50% (230) were categorized as missing, unknown, or undetermined by CFR. Discussion: Linking death certificate and CFR causes of death provided a more robust understanding of infant causes of death in Ohio. Separately, each system serves distinct and valuable purposes that should be reviewed before selecting one system for ranking leading causes of infant mortality.
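The kappa statistic used above to measure concordance between the two coding systems can be sketched with a short, standard-library-only example. The function name and toy category labels are hypothetical, not the Ohio coding categories:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Unweighted Cohen's kappa for two paired categorical code assignments,
    e.g. cause-of-death categories from death certificates vs. CFR records."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed agreement: proportion of pairs assigned the same category.
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected agreement if the two systems coded independently.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    categories = set(freq_a) | set(freq_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in categories) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

Kappa of 1.0 indicates perfect agreement and 0.0 indicates agreement no better than chance, which is how the injury (0.71-1.00) and medical (0.00-0.78) ranges above should be read.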

      6. INTRODUCTION: Maternal and child health (MCH) and chronic disease programs at state health agencies may not routinely collaborate. The objective of this study was to describe a project that enhanced relationships between MCH and chronic disease epidemiologists at the Florida Department of Health, increased epidemiologic capacity, and informed both programs. METHODS: We collaborated to assess hypertension-related severe maternal morbidity (H-SMM) and hypertensive disorders (preexisting hypertension, gestational hypertension, and preeclampsia) among women at delivery of their live birth to help determine the burden on health care systems in Florida. We identified ways to improve the health of women before they conceive and to help them manage any chronic diseases during the perinatal period. RESULTS: We found differences by maternal characteristics in H-SMM rates among 979,660 women who delivered live births. We proposed strategies to support collaboration between state MCH and chronic disease staff. First, increase the screening, monitoring, and management of hypertension before, during, and after pregnancy. Second, examine H-SMM concurrently with maternal mortality to help find prevention strategies. Third, include reproductive-aged women in ongoing hypertension prevention and intervention efforts. Fourth, expand team-based care to include obstetricians, midwives, and doulas who can work together with primary care providers for hypertension management. And fifth, create and share data products that guide various groups about hypertension and related risk factors among reproductive-aged women. CONCLUSION: The collaboration between the Florida Department of Health MCH and chronic disease epidemiologists produced 1) a program-relevant indicator, H-SMM and 2) strategies for enhancing program and clinical activities, communication, and surveillance to reduce H-SMM rates.

      7. Disparities in documented diagnoses of autism spectrum disorder based on demographic, individual, and service factorsexternal icon
        Wiggins LD, Durkin M, Esler A, Lee LC, Zahorodny W, Rice C, Yeargin-Allsopp M, Dowling NF, Hall-Lande J, Morrier MJ, Christensen D, Shenouda J, Baio J.
        Autism Res. 2019 Dec 23.
        The objectives of our study were to (a) report how many children met an autism spectrum disorder (ASD) surveillance definition but had no clinical diagnosis of ASD in health or education records and (b) evaluate differences in demographic, individual, and service factors between children with and without a documented ASD diagnosis. ASD surveillance was conducted in selected areas of Arizona, Arkansas, Colorado, Georgia, Maryland, Minnesota, Missouri, New Jersey, North Carolina, Tennessee, and Wisconsin. Children were defined as having ASD if sufficient social and behavioral deficits and/or an ASD diagnosis were noted in health and/or education records. Among 4,498 children, 1,135 (25%) had ASD indicators without having an ASD diagnosis. Of those 1,135 children without a documented ASD diagnosis, 628 (55%) were not known to receive ASD services in public school. Factors associated with not having a clinical diagnosis of ASD were non-White race, no intellectual disability, older age at first developmental concern, older age at first developmental evaluation, special education eligibility other than ASD, and need for fewer supports. These results highlight the importance of reducing disparities in the diagnosis of children with ASD characteristics so that appropriate interventions can be promoted across communities. LAY SUMMARY: Children who did not have a clinical diagnosis of autism spectrum disorder (ASD) documented in health or education records were more likely to be non-White and have fewer developmental problems than children with a clinical diagnosis of ASD. They were brought to the attention of healthcare providers at older ages and needed fewer supports than children with a clinical diagnosis of ASD. All children with ASD symptoms who meet diagnostic criteria should be given a clinical diagnosis so they can receive treatment specific to their needs.

      8. The reality of cerebral palsy in Ugandaexternal icon
        Yeargin-Allsopp M.
        Dev Med Child Neurol. 2019 Dec 17.

    • Nutritional Sciences
      1. Demographic, physiologic, and lifestyle characteristics observed with serum total folate differ among folate forms: Cross-sectional data from fasting samples in the NHANES 2011-2016external icon
        Fazili Z, Sternberg MR, Potischman N, Wang CY, Storandt RJ, Yeung L, Yamini S, Gahche JJ, Juan W, Qi YP, Paladugula N, Gabey G, Pfeiffer CM.
        J Nutr. 2019 Dec 25.
        BACKGROUND: Serum folate forms were measured in the US population during recent NHANES to assess folate status. OBJECTIVE: We describe post-folic acid-fortification concentrations of serum folate forms in the fasting US population ≥1 y from the NHANES 2011-2016. METHODS: We measured 5 biologically active folates and 1 oxidation product (MeFox) of 5-methyltetrahydrofolate (5-methyl-THF). We calculated geometric means of 5-methyl-THF, unmetabolized folic acid (UMFA), nonmethyl folate (sum of tetrahydrofolate, 5-formyltetrahydrofolate, and 5,10-methenyltetrahydrofolate), total folate (sum of above biomarkers), and MeFox by demographic, physiologic, and lifestyle variables; estimated the magnitude of variables on biomarker concentrations after covariate adjustment; and determined the prevalence of UMFA >2 nmol/L. RESULTS: After demographic adjustment, age, sex, and race-Hispanic origin were significantly associated with most folate forms. MeFox increased with age, while 5-methyl-THF, UMFA, and nonmethyl folate displayed U-shaped age patterns. Compared with non-Hispanic whites, non-Hispanic blacks had 23% lower predicted 5-methyl-THF but comparable UMFA; non-Hispanic Asians had comparable 5-methyl-THF but 28% lower UMFA; Hispanics, non-Hispanic Asians, and non-Hispanic blacks had approximately 20% lower MeFox. After additional physiologic and lifestyle adjustment, predicted UMFA and MeFox concentrations were 43% and 112% higher, respectively, in adults with chronic kidney disease and 17% and 15% lower, respectively, in adults consuming daily 1-<2 alcoholic beverages; 5-methyl-THF concentrations were 20% lower in adult smokers. The prevalence of UMFA >2 nmol/L was highest in persons aged ≥70 y (9.01%) and lowest in those aged 12-19 y (1.14%). During 2011-2014, the prevalence was 10.6% in users and 2.22% in nonusers of folic acid-containing supplements. CONCLUSIONS: In fasting persons ≥1 y, the demographic, physiologic, and lifestyle characteristics observed with serum total folate differed among folate forms, suggesting biological and/or genetic influences on folate metabolism. High UMFA was mostly observed in supplement users and older persons.

      2. Age, ethnicity, glucose-6-phosphate dehydrogenase deficiency, micronutrient powder intake, and biomarkers of micronutrient status, infection, and inflammation are associated with anemia among children 6-59 months in Nepalexternal icon
        Ford ND, Bichha RP, Parajuli KR, Paudyal N, Joshi N, Whitehead RD, Chitekwe S, Mei Z, Flores-Ayala R, Adhikari DP, Rijal S, Jefferds ME.
        J Nutr. 2019 Dec 28.
        BACKGROUND: Anemia is a major concern for children in Nepal; however, little is known about context-specific causes of anemia. OBJECTIVE: We used cross-sectional data from the 2016 Nepal National Micronutrient Status Survey to evaluate factors associated with anemia in a nationally representative, population-based sample of children 6-59 mo (n = 1367). METHODS: Hemoglobin, biomarkers of iron status and other micronutrients, infection, inflammation, and blood disorders were assessed from venous blood samples. Soil-transmitted helminth (STH) and Helicobacter pylori infections were assessed from stool. Anthropometry was measured with standard procedures. Sociodemographic and household characteristics, diet, micronutrient powder (MNP) intake, pica, and morbidity recall were ascertained by caregiver interview. Multivariable logistic regression that accounted for the complex sampling design determined predictors of anemia (hemoglobin <11.0 g/dL, altitude adjusted); candidate predictors were variables with P < 0.05 in bivariate models. RESULTS: Anemia prevalence was 18.6% (95% CI: 15.8, 21.4). MNP intake [adjusted OR (AOR): 0.25, 95% CI: 0.07, 0.86], log (ln) ferritin (µg/L) (AOR: 0.49, 95% CI: 0.38, 0.64), and ln RBP (µmol/L) (AOR: 0.42, 95% CI: 0.18, 0.95) were associated with reduced odds of anemia. Younger age (6-23 mo compared with 24-59 mo; AOR: 2.29, 95% CI: 1.52, 3.46), other Terai ethnicities (AOR: 2.59, 95% CI: 1.25, 5.35) and Muslim ethnicities (AOR: 3.15, 95% CI: 1.30, 7.65) relative to Brahmin/Chhetri ethnicities, recent fever (AOR: 1.68, 95% CI: 1.08, 2.59), ln C-reactive protein (mg/L) (AOR: 1.23, 95% CI: 1.03, 1.45), and glucose-6-phosphate dehydrogenase deficiency (AOR: 2.84, 95% CI: 1.88, 4.30) were associated with increased odds of anemia. CONCLUSION: Both nonmodifiable and potentially modifiable factors were associated with anemia. Thus, some but not all anemia might be addressed through effective public health policy, programs, and delivery of nutrition and infection prevention and control.

      3. BACKGROUND: RBC folate (RBF) is an indicator of folate status and risk of neural-tube defects. It is calculated from whole blood folate (WBF), serum folate (SFOL), and hematocrit (Hct). SFOL and/or Hct are sometimes unavailable; hemoglobin (Hb) is generally available in surveys. OBJECTIVES: We assessed the ability of different RBF approximations to generate population data in women aged 12-49 y. METHODS: Using SFOL, RBF, Hct, Hb, and mean corpuscular Hb content (MCHC) from prefortification (1988-1994) and postfortification (1999-2006, 2007-2010) NHANES, we applied 6 approaches: #1) assume SFOL = 0; #2) impute SFOL (population median); #3) impute Hct (population median); #4) estimate Hct (Hb/MCHC); #5) assume SFOL = 0 and estimate Hct; and #6) predict SFOL (from WBF) and estimate Hct. For each approach, we calculated the paired percentage difference to the "true" RBF and estimated various statistics. RESULTS: For 2007-2010 (unweighted data), the median relative difference from "true" RBF was lowest for approaches #2 (-0.74%), #4 (-0.96%), and #6 (-1.15%), intermediate for #3 (-3.36%), and highest for #5 (4.96%) and #1 (5.78%). The 95% agreement limits were smallest for approach #1 (2.33%, 13.0%) and largest for #3 (-20.8%, 11.3%). Approach #2 showed concentration-dependence (negative compared with positive differences at low compared with high RBF). Using weighted data, we found similar patterns across approaches for mean relative differences by demographic subgroup for all 3 time periods. CONCLUSIONS: We obtained the best agreement between estimated and "true" RBF when we predicted SFOL using a regression equation obtained from a subset of samples (approach #6). Alternatively, the consistent overestimation of RBF when assuming SFOL = 0 (approximately 6%) could be addressed by adjusting the data (approach #5). Similar observations for pre- and postfortification periods suggest applicability to low and high folate status situations, but should be confirmed elsewhere. To estimate RBF, at least WBF and Hb are needed.
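The RBF calculation from WBF, SFOL, and Hct described above follows from a simple mass balance: whole-blood folate is the hematocrit-weighted sum of RBC folate and plasma (serum) folate. A minimal sketch, with variable names and units (nmol/L, Hct as a fraction) assumed for illustration:

```python
def rbc_folate(wbf, hct, sfol=0.0):
    """RBC folate from whole-blood folate (wbf), hematocrit fraction (hct),
    and serum folate (sfol), all folates in the same concentration units.
    Mass balance: wbf = rbf * hct + sfol * (1 - hct), solved for rbf."""
    return (wbf - sfol * (1.0 - hct)) / hct
```

Calling the function with sfol=0 corresponds to approach #1 in the abstract, which is why that approach consistently overestimates RBF: the plasma contribution is wrongly attributed to the red cells.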

    • Occupational Safety and Health
      1. Police stress and depressive symptoms: role of coping and hardinessexternal icon
        Allison P, Mnatsakanova A, McCanlies E, Fekedulegn D, Hartley TA, Andrew ME, Violanti JM.
        Policing. 2019 .
        Purpose: Chronic exposure to occupational stress may lead to depressive symptoms in police officers. The association between police stress and depressive symptoms and the potential influences of coping and hardiness were evaluated. The paper aims to discuss this issue. Design/methodology/approach: Stress level was assessed in the Buffalo Cardio-Metabolic Occupational Police Stress Study (2004–2009) with the Spielberger Police Stress Survey. The frequency and severity of events at work were used to calculate stress indices for the past year. The Center for Epidemiologic Studies Depression (CES-D) Scale was used to measure depressive symptoms during the past week. Linear regression was used to evaluate the association between the stress indices and depressive symptom scores. Models were adjusted for age, sex, race, smoking status and alcohol intake, and stratified by median values for coping (passive, active and support seeking) and hardiness (control, commitment and challenge) to assess effect modification. Findings: Among the 388 officers (73.2 percent men), a significant positive association was observed between total stress and the CES-D score (β=1.98 (SE=0.36); p<0.001). Lower CES-D scores were observed for officers who reported lower passive coping (β=0.94 (SE=0.45); p=0.038) and higher active coping (β=1.41 (SE=0.44); p=0.002), compared with their counterparts. Officers higher in hardiness had lower CES-D scores, particularly for commitment (β=0.86 (SE=0.35); p=0.016) and control (β=1.58 (SE=0.34); p<0.001). Originality/value: Results indicate that high active coping and hardiness modify the effect of work stress in law enforcement, acting to reduce depressive symptoms.

        https://www.scopus.com/inward/record.uri?eid=2-s2.0-85076858014&doi=10.1108%2fPIJPSM-04-2019-0055&partnerID=40&md5=1b242ab617ce627d3b34795d3ba9ee65

      2. Worker exposure to flame retardants in manufacturing, construction and service industriesexternal icon
        Estill CF, Slone J, Mayer A, Chen IC, La Guardia MJ.
        Environ Int. 2019 Dec 3;135:105349.
        Workers in several industries are occupationally exposed to flame retardants. This study characterizes flame retardant exposure for nine industries through air and hand wipe measures for 105 workers. Specifically, we analyzed 24 analytes from three chemical classes: organophosphate flame retardants (OFRs), polybrominated diphenyl ethers (PBDEs), and non-PBDE brominated flame retardants (NPBFRs). The industries were: carpet installation, chemical manufacturing, foam manufacturing, electronic scrap, gymnastics, rigid board installation, nail salons, roofing, and spray polyurethane foam. Workers wore personal air samplers for two entire workdays and provided hand wipe samples before and after the second work day. Bulk products were also analyzed. The air, hand wipe and bulk samples were evaluated for relevant flame retardants. Spray polyurethane foam workers' tris(1-chloro-2-propyl) phosphate air (geometric mean = 48,500 ng/m(3)) and hand wipe (geometric mean = 83,500 ng per sample) concentrations had the highest mean industry concentration of any flame retardant analyzed in this study, followed by triphenyl phosphate air concentration and tris(1,3-dichloro-2-propyl) phosphate hand wipe concentration from chemical manufacturers. Overall, OFR air and hand wipe concentrations were higher and more prevalent than PBDEs or non-PBDE brominated flame retardants. Some industries including spray polyurethane foam application, chemical manufacturing, foam manufacturing, nail salons, roofing, and rigid polyiso board installation had high potential for both air and hand exposure to OFRs. Carpet installers, electronic scrap workers, and gymnastic workers had exposures to all three classes of flame retardants including PBDEs, which were phased out of production in 2013. Air and dermal exposures to OFRs are prevalent in many industries and are replacing PBDEs in some industries.
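The exposure summaries above are geometric means, the conventional statistic for air and wipe concentrations, which are typically log-normally distributed. A minimal standard-library sketch:

```python
import math

def geometric_mean(values):
    """Geometric mean: exp of the arithmetic mean of the log-transformed
    values. Appropriate for log-normally distributed exposure data."""
    if not values or any(v <= 0 for v in values):
        raise ValueError("geometric mean requires positive values")
    return math.exp(sum(math.log(v) for v in values) / len(values))
```

Because it averages on the log scale, the geometric mean is less dominated by occasional very high samples than the arithmetic mean, e.g. geometric_mean([1.0, 100.0]) is 10.0 rather than 50.5.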

      3. Population-based age adjustment tables for use in occupational hearing conservation programsexternal icon
        Flamme GA, Deiters KK, Stephenson MR, Themann CL, Murphy WJ, Byrne DC, Goldfarb DG, Zeig-Owens R, Hall C, Prezant DJ, Cone JE.
        Int J Audiol. 2019 Dec 17:1-11.
        Objective: In occupational hearing conservation programmes, age adjustments may be used to subtract expected age effects. Adjustments used in the U.S. came from a small dataset and overlooked important demographic factors, ages, and stimulus frequencies. The present study derived a set of population-based age adjustment tables and validated them using a database of exposed workers. Design: Cross-sectional population-based study and retrospective longitudinal cohort study for validation. Study sample: Data from the U.S. National Health and Nutrition Examination Survey (unweighted n = 9937) were used to produce these tables. Male firefighters and emergency medical service workers (76,195 audiograms) were used for validation. Results: Cross-sectional trends implied less change with age than assumed in current U.S. regulations. Different trends were observed among people identifying with non-Hispanic Black race/ethnicity. Four age adjustment tables (age range: 18-85) were developed (women or men; non-Hispanic Black or other race/ethnicity). Validation outcomes showed that the population-based tables matched median longitudinal changes in hearing sensitivity well. Conclusions: These population-based tables provide a suitable replacement for those implemented in current U.S. regulations. These tables address a broader range of worker ages, account for differences in hearing sensitivity across race/ethnicity categories, and have been validated for men using longitudinal data.
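        The age-adjustment step described in this abstract can be sketched in a few lines: the expected age effect between baseline and current test is subtracted from the raw threshold shift. This is an illustrative sketch only; the correction values below are hypothetical placeholders in the style of age-correction tables, not the population-based tables derived in the study.

```python
# Hypothetical age-correction values (dB HL) at one test frequency, keyed by
# age. These are placeholders, NOT the tables published in the study.
AGE_CORRECTION = {25: 5, 35: 9, 45: 14, 55: 21}

def age_adjusted_shift(current_hl, baseline_hl, current_age, baseline_age,
                       table=AGE_CORRECTION):
    """Subtract the expected age effect from the raw audiometric threshold shift."""
    raw_shift = current_hl - baseline_hl               # total observed change (dB)
    expected_aging = table[current_age] - table[baseline_age]  # change due to age alone
    return raw_shift - expected_aging                  # work-attributable shift

# A 20 dB raw shift between ages 25 and 45, with 9 dB attributable to aging,
# leaves an 11 dB work-attributable shift.
print(age_adjusted_shift(35, 15, 45, 25))  # 11
```

The design question the study addresses is which correction values to put in the table; the arithmetic applying them is the same either way.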

      4. The National Institute for Occupational Safety and Health B Reader Certification Program - an update report (1987 to 2018) and future directionsexternal icon
        Halldin CN, Hale JM, Weissman DN, Attfield MD, Parker JE, Petsonk EL, Cohen RA, Markle T, Blackley DJ, Wolfe AL, Tallaksen RJ, Laney AS.
        J Occup Environ Med. 2019 Dec;61(12):1045-1051.
        OBJECTIVE: The National Institute for Occupational Safety and Health (NIOSH) B Reader Program provides the opportunity for physicians to demonstrate proficiency in the International Labour Office (ILO) system for classifying radiographs of pneumoconioses. We summarize trends in participation and examinee attributes and performance during 1987 to 2018. METHODS: Since 1987, NIOSH has maintained details of examinees and examinations. Attributes of examinees and their examination performance were summarized. Simple linear regression was used in trend analysis of passing rates over time. RESULTS: The mean passing rate for certification and recertification for the study period was 40.4% and 82.6%, respectively. Since the mid-1990s, the number of B Readers has declined and the mean age and years certified have increased. CONCLUSIONS: To address the declining B Reader population, NIOSH is currently taking steps to modernize the program and offer more opportunities for training and testing.
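        The trend analysis named in this abstract (simple linear regression of passing rates over time) reduces to a least-squares slope. The sketch below uses hypothetical passing rates, not the NIOSH program's actual data.

```python
def ols_slope(years, rates):
    """Least-squares slope of passing rate on year (percentage points per year)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(rates) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, rates))
    sxx = sum((x - mean_x) ** 2 for x in years)
    return sxy / sxx

# Hypothetical annual certification passing rates (%), for illustration only.
years = [2014, 2015, 2016, 2017, 2018]
rates = [44.0, 42.5, 41.0, 40.0, 38.5]
print(round(ols_slope(years, rates), 2))  # -1.35 percentage points per year
```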

      5. Industrial exoskeletons: Need for intervention effectiveness researchexternal icon
        Howard J, Murashov VV, Lowe BD, Lu ML.
        Am J Ind Med. 2019 Dec 11.
        Exoskeleton devices are being introduced across several industry sectors to augment, amplify, or reinforce the performance of a worker's existing body components-primarily the lower back and the upper extremity. Industrial exoskeletons may play a role in reducing work-related musculoskeletal disorders arising from lifting and handling heavy materials or from supporting heavy tools in overhead work. However, wearing an exoskeleton may pose a number of risks that are currently not well-studied. There are only a few studies about the safety and health implications of wearable exoskeletons and most of those studies involve only a small number of participants. Before the widespread implementation of industrial exoskeletons occurs, there is need for prospective interventional studies to evaluate the safety and health effectiveness of exoskeletons across various industry sectors. Developing a research strategy to fill current safety and health knowledge gaps, understanding the benefits, risks, and barriers to adoption of industrial exoskeletons, determining whether exoskeletons can be considered a type of personal protective equipment, and advancing consensus standards that address exoskeleton safety, should be major interests of both the occupational safety and health research and practice communities.

      6. Night shift work and cardiovascular disease biomarkers in female nursesexternal icon
        Johnson CY, Tanz LJ, Lawson CC, Schernhammer ES, Vetter C, Rich-Edwards JW.
        Am J Ind Med. 2019 Dec 11.
        BACKGROUND: Night shift work is associated with cardiovascular disease, but its associations with cardiovascular disease biomarkers are unclear. We investigated these associations in a study of female nurses. METHODS: We used data from the Nurses' Health Study II for total cholesterol, low-density lipoprotein cholesterol, high-density lipoprotein (HDL) cholesterol, triglycerides, C-reactive protein (CRP), and fibrinogen. The sample sizes for our analysis ranged from 458 (fibrinogen) to 3574 (total cholesterol). From questionnaires, we determined the number of night shifts worked in the 2 weeks before blood collection and total years of rotating night shift work. We used quantile regression to estimate differences in biomarker levels by shift work history, adjusting for potential confounders. RESULTS: Nurses working 1 to 4 recent night shifts had median HDL cholesterol levels 4.4 mg/dL (95% confidence interval [CI]: 0.3, 7.5) lower than nurses without recent night shifts. However, working ≥5 recent night shifts and years of rotating night shift work were not associated with HDL cholesterol. There was no association between recent night shifts and CRP, but median CRP levels were 0.1 (95% CI: 0.0, 0.2), 0.2 (95% CI: 0.1, 0.4), and 0.2 (95% CI: 0.0, 0.4) mg/L higher among nurses working rotating night shifts for 1 to 5, 6 to 9, and ≥10 years compared with nurses never working rotating night shifts. These associations were attenuated when excluding postmenopausal women and women taking statins. We observed no associations between night shift work and other biomarkers. CONCLUSIONS: We found suggestive evidence of adverse short-term and long-term effects of night shift work on select cardiovascular disease biomarkers.

      7. BACKGROUND: Mortality tends to be higher among people who do not work than among workers, but the impact of work-related disability on mortality has not been well studied. METHODS: The vital status through 2015 was ascertained for 14 219 workers with an accepted workers' compensation claim in West Virginia for a low back injury in 1998 or 1999. Mortality among the cohort compared with the West Virginia general population was assessed using standard life table techniques. Associations of mortality and disability-related factors within the cohort were evaluated using Cox proportional hazards regression. RESULTS: Compared to the general population, mortality from accidental poisoning was significantly elevated among the overall cohort and lost-time claimants. Most deaths from accidental poisoning in the cohort were due to drug overdoses involving opioids. Mortality from intentional self-harm was also significantly elevated among lost-time claimants. In internal analyses, overall mortality and mortality from cancer, heart disease, intentional self-harm, and drug overdoses involving opioids was significantly associated with lost time. Overall mortality and mortality from drug overdoses involving opioids were also significantly associated with amount of lost time, permanent partial disability, and percent permanent disability. Heart disease mortality was also significantly associated with the amount of lost time. CONCLUSIONS: The results suggest that disability itself may impact mortality risks. If confirmed, these results reinforce the importance of return to work and other efforts to reduce disability.

      8. Cleaning products and work-related asthma, 10 year updateexternal icon
        Rosenman K, Reilly MJ, Pechter E, Fitzsimmons K, Flattery J, Weinberg J, Cummings K, Borjan M, Lumia M, Harrison R, Dodd K, Schleiff P.
        J Occup Environ Med. 2019 Dec 31.
        OBJECTIVE: To describe the frequency of work-related asthma (WRA) and characteristics of individuals with exposure to cleaning products 1998-2012, compared to 1993-1997. METHODS: Cases of WRA from products used for cleaning or disinfecting surfaces were identified from California, Massachusetts, Michigan (1998-2012), New Jersey (1998-2011), and New York (2009-2012). RESULTS: There were 1,199 (12.4%) cleaning product cases among all 9,667 WRA cases; 77.8% were women, 62.1% white non-Hispanic, and the average age was 43 years. The highest percentage worked in healthcare (41.1%); the most common occupations were building cleaners (20.3%) and registered nurses (14.1%). CONCLUSIONS: The percentage of WRA cases from exposure to cleaning products from 1998-2012 was unchanged from 1993-1997, indicating that continued and additional prevention efforts are needed to reduce unnecessary use, identify safer products, and implement safer work processes.

      9. Toward an expanded focus for occupational safety and health: A commentaryexternal icon
        Schulte PA, Delclos G, Felknor SA, Chosewood LC.
        Int J Environ Res Public Health. 2019 Dec 6;16(24).
        Powerful and ongoing changes in how people work, the workforce, and the workplace require a more holistic view of each of these. We argue that an expanded focus for occupational safety and health (OSH) is necessary to prepare for and respond rapidly to future changes in the world of work that will certainly challenge traditional OSH systems. The WHO Model for Action, various European efforts at well-being, and the Total Worker Health concept provide a foundation for addressing changes in the world of work. However, a paradigm expansion to include the recognition of worker and workforce well-being as an important outcome of OSH will be needed. It will also be vital to stimulate transdisciplinary efforts and find innovative ways to attract and train students into OSH professions as the paradigm expands. This will require active marketing of the OSH field as a vibrant career choice, as a profession filled with meaningful, engaging responsibilities, and as a well-placed investment for industry and society. An expanded paradigm will result in the need for new disciplines and specialties in OSH, which may be useful in new efforts to attract new professionals. Ultimately, to achieve worker and workforce well-being we must consider how to implement this expanded focus.

      10. Electrocardiographic responses following live-fire firefighting drillsexternal icon
        Smith DL, Horn GP, Fernhall B, Kesler RM, Fent KW, Kerber S, Rowland TW.
        J Occup Environ Med. 2019 Dec;61(12):1030-1035.
        OBJECTIVE: Firefighting-related environmental and physiological factors associated with cardiovascular strain may promote arrhythmias and myocardial ischemia, which induce sudden cardiac events (SCE) in susceptible individuals. The present study evaluated electrocardiographic (ECG) changes that may reflect increased SCE risk following simulated live firefighting. METHODS: Using a repeated measures design, ECG tracings from 32 firefighters were recorded 12 hours post-firefighting in a residential structure and compared with a 12-hour control period. RESULTS: Ventricular arrhythmias were present in 20%, and ST segment changes indicative of myocardial ischemia in 16%, of firefighters 12 hours post-firefighting that were not detected in the control period. CONCLUSION: Live firefighting induces significant ECG changes, including ventricular arrhythmias and ST segment changes, which may reflect myocardial ischemia. The extent to which such ECG changes explain the increased cardiovascular risk in firefighters warrants further research.

      11. Spinal loading and lift style in confined vertical spaceexternal icon
        Weston EB, Dufour JS, Lu ML, Marras WS.
        Appl Ergon. 2020 Apr;84:103021.
        The objective of this study was to investigate biomechanical loads on the lumbar spine as a function of working in a confined vertical space, consistent with baggage handling inside the baggage compartment of an airplane. Ten male subjects performed baggage handling tasks using confined (kneeling, sitting) and unconfined (stooping) lifting styles. Dependent measures of torso flexion and three-dimensional spinal loads were assessed with an electromyography-driven biomechanical model. Lifting exertions typical to airline baggage handling posed significant risk to the lumbar spine, regardless of lifting style. Statistically significant differences attributable to lift style (stooping, kneeling, sitting) were not observed for peak compressive, lateral shear, or resultant spinal loads, but lifting while kneeling decreased anterior/posterior (A/P) shear spinal loads relative to stooping (p = 0.02). Collectively, kneeling offers the greatest benefit when lifting in confined spaces because of the ability to keep the torso upright, subsequently reducing shear forces on the lumbar spine.

    • Occupational Safety and Health - Mining
      1. Respirable coal mine dust at surface mines, United States, 1982-2017external icon
        Doney BC, Blackley D, Hale JM, Halldin C, Kurth L, Syamlal G, Laney AS.
        Am J Ind Med. 2019 Dec 9.
        BACKGROUND: Exposure to respirable coal mine dust can cause pneumoconiosis, an irreversible lung disease that can be debilitating. The mass concentration and quartz mass percent of respirable coal mine dust samples (annually, by occupation, by geographic region) from surface coal mines and surface facilities at U.S. underground mines during 1982-2017 were summarized. METHODS: Mine Safety and Health Administration (MSHA) collected and analyzed data for respirable dust, and a subset of the samples were analyzed for quartz content. We calculated the respirable dust and quartz concentration geometric mean, arithmetic mean, and percent of samples exceeding the respirable dust permissible exposure limit (PEL) of 2.0 mg/m3, and the average percent of quartz content in samples. RESULTS: The geometric mean for 288 705 respirable dust samples was 0.17 mg/m3, with 1.6% of the samples exceeding the 2.0 mg/m3 PEL. Occupation-specific geometric means for respirable dust in active mining areas were highest among drillers. The geometric mean for respirable dust was higher in central Appalachia compared to the rest of the U.S. The geometric mean for respirable quartz (54 040 samples) was 0.02 mg/m3, with 15.3% of these samples exceeding the applicable standard (PEL or reduced PEL). Occupation-specific geometric means for respirable quartz were highest among drillers. CONCLUSION: Higher concentrations of respirable dust or quartz in specific coal mining occupations, notably drilling occupations, and in certain U.S. regions, underscore the need for continued surveillance to identify workers at higher risk for pneumoconiosis.
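        The summary statistics used in this abstract (geometric mean concentration and percent of samples exceeding the PEL) are straightforward to compute. The sample values below are hypothetical, not MSHA data.

```python
import math

def geometric_mean(xs):
    """Geometric mean via the mean of logs (all values must be > 0)."""
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

def percent_exceeding(xs, limit):
    """Percent of samples above an exposure limit, e.g. the 2.0 mg/m3 dust PEL."""
    return 100.0 * sum(1 for x in xs if x > limit) / len(xs)

# Hypothetical respirable dust sample concentrations (mg/m3).
samples = [0.05, 0.12, 0.17, 0.30, 0.85, 2.4]
print(round(geometric_mean(samples), 2))        # 0.29
print(round(percent_exceeding(samples, 2.0), 1))  # 16.7
```

The geometric mean is the conventional summary for exposure data because concentrations are typically right-skewed (approximately lognormal).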

      2. Use of the field-based silica monitoring technique in a coal mine: A case studyexternal icon
        Pampena JD, Cauda EG, Chubb LG, Meadows JJ.
        Min Metall Explor. 2019 .
        Exposure to respirable crystalline silica (RCS) can cause serious and irreparable negative health effects, including silicosis and lung cancer. Workers in coal mines can be exposed to RCS found in dust generated by various mining processes. The silica content of respirable dust in a single mine can vary substantially over both time and location. The current monitoring approach for RCS relies on traditional air sampling followed by laboratory analysis. Results generated using this approach are generally not available for several days to several weeks after sampling, and this delay prevents timely and effective intervention if needed. An alternate analytical method is needed to reduce the time required to quantify the RCS exposure of mine workers. The National Institute for Occupational Safety and Health (NIOSH) has developed a new method using commercially available portable infrared spectrometers for measuring RCS at the end of the sampling shift. This paper describes the application of the new field-based RCS analytical process for coal mines, including the use of the new method with the existing Coal Mine Dust Personal Sampler Unit. In a case study conducted by NIOSH with a coal mine operator in West Virginia, field-based RCS analysis was completed at a mine site to evaluate the new technique. The RCS analysis results obtained by the field-based method in this case study showed sufficiently strong correlation with results obtained by the MSHA standard laboratory analysis method to allow the mine operator to use the field-based method for evaluating process improvements.

      3. Field study results of a 3rd generation roof bolter canopy air curtain for respirable coal mine dust controlexternal icon
        Reed WR, Shahan M, Klima S, Ross G, Singh K, Cross R, Grounds T.
        Int J Coal Sci Technol. 2019 Nov:[Epub ahead of print].
        A 3rd generation roof bolter canopy air curtain (CAC) has been developed and constructed by J.H. Fletcher & Co., Inc. As with the previous generation of the CAC, this design uses the principle of providing uniform airflow across the canopy area as recommended by the National Institute for Occupational Safety and Health. The new modifications include a plenum constructed of a single flat aluminum plate, smaller-diameter airflow openings, and a single row of perimeter nozzles designed to prevent mine air contaminated by respirable dust from entering the CAC protection zone. Field testing was conducted on this new 3rd generation design, showing reductions in coal mine respirable dust exposure for roof bolter operators. Dust control efficiencies of the CAC for the left bolter operator (intake side) ranged from approximately 26% to 60%, while the efficiencies for the right bolter operator (return side) ranged from 3% to 47%.

      4. Underground mine air and strata temperature change due to the use of refuge alternativesexternal icon
        Yan L, Yantek DS, Reyes MA.
        Min Metall Explor. 2019 Nov:[Epub ahead of print].
        Heat and humidity buildup within refuge alternatives (RAs) may expose occupants to physiological hazards such as heat stress. The Mine Safety and Health Administration (MSHA) regulations require RAs in underground coal mines to provide a life-sustaining environment for miners trapped underground when escape is impossible. RAs are required to sustain life for 96 h while maintaining an apparent temperature (AT) below 95 degrees F (35 degrees C). The National Institute for Occupational Safety and Health (NIOSH) tested a 10-person tent-type RA, a 23-person tent-type RA, and a 6-person metal-type RA in its underground coal mine facilities to investigate the thermal environment over a 96-h period. The test results showed that mine air and mine strata temperatures surrounding an RA occupied by simulated miners (SMs) increased over the 96-h test period. The test results suggest that RA manufacturers should consider this increase in temperatures when calculating and evaluating RA components during surface and laboratory tests. The findings can equip stakeholders with additional considerations for calculating the interior temperature and humidity profiles of occupied RAs not tested in situ.

    • Parasitic Diseases
        Given that the C580Y polymorphism in the Plasmodium falciparum propeller domain of the kelch 13 gene (pfk13) was documented in Guyana, monitoring for mutations associated with antimalarial resistance was undertaken in neighboring Roraima state in Brazil. Polymorphisms in the pfmdr1 and pfk13 genes were investigated in 275 P. falciparum samples. No pfk13 mutations were observed. The triple mutant 184F, 1042D, and 1246Y was observed in 100% of the samples successfully sequenced for the pfmdr1 gene, with 20.1% of these having an additional mutation at codon 1034C. Among them, 2.5% of samples harbored two copies of the pfmdr1 gene. We found no evidence of the spread of C580Y parasites to Roraima state, Brazil. As previously observed, the 184F, 1042D, and 1246Y mutations in the pfmdr1 gene appear to be fixed in this region. Continued molecular surveillance is essential to detect any potential migration or local emergence of artemisinin-resistant mutations.

      2. Biannual versus annual mass azithromycin distribution and malaria seroepidemiology among preschool children in Niger: a sub-study of a cluster randomized trialexternal icon
        Oldenburg CE, Amza A, Cooley G, Kadri B, Nassirou B, Arnold BF, Rosenthal PJ, O'Brien KS, West SK, Bailey RL, Porco TC, Keenan JD, Lietman TM, Martin DL.
        Malar J. 2019 Dec 3;18(1):389.
        BACKGROUND: Biannual mass azithromycin administration to preschool children reduces all-cause mortality, but the mechanism for the effect is not understood. Azithromycin has activity against malaria parasites, and malaria is a leading cause of child mortality in the Sahel. The effect of biannual versus annual azithromycin distribution for trachoma control on serological response to merozoite surface protein 1 (MSP-1(19)), a surrogate for malaria incidence, was evaluated among children in Niger. METHODS: Markers of malaria exposure were measured in two arms of a factorial randomized controlled trial designed to evaluate targeted biannual azithromycin distribution to children under 12 years of age compared to annual azithromycin to the entire community for trachoma control (N = 12 communities per arm). Communities were treated for 36 months (6 versus 3 distributions). Dried blood spots were collected at 36 months among children ages 1-5 years, and MSP-1(19) antibody levels were assessed using a bead-based multiplex assay to measure malaria seroprevalence. RESULTS: Antibody results were available for 991 children. MSP-1(19) seropositivity was 62.7% in the biannual distribution arm compared to 68.7% in the annual arm (prevalence ratio 0.91, 95% CI 0.83 to 1.00). Mean semi-quantitative antibody levels were lower in the biannual distribution arm compared to the annual arm (mean difference -0.39, 95% CI -0.05 to -0.72). CONCLUSIONS: Targeted biannual azithromycin distribution was associated with lower malaria seroprevalence compared to that in a population that received annual distribution. Trial Registration Clinicaltrials.gov NCT00792922.
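        The prevalence ratio reported in this abstract is simply the seroprevalence in the biannual arm divided by that in the annual arm, which can be checked directly:

```python
def prevalence_ratio(p_intervention, p_comparison):
    """Ratio of two prevalences (same units for both, e.g. percent)."""
    return p_intervention / p_comparison

# 62.7% vs 68.7% seropositivity -> PR of about 0.91, as reported.
print(round(prevalence_ratio(62.7, 68.7), 2))  # 0.91
```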

      3. Percutaneous emergence of Gnathostoma spinigerum following praziquantel treatmentexternal icon
        Sapp SG, Kaminski M, Abdallah M, Bishop HS, Fox M, Ndubuisi M, Bradbury RS.
        Trop Med Infect Dis. 2019 Dec 14;4(4).
        A Bangladeshi patient with prior travel to Saudi Arabia was hospitalized in the United States for a presumptive liver abscess. Praziquantel was administered following a positive Schistosoma antibody test. Ten days later, a subadult worm migrated to the skin surface and was identified morphologically as Gnathostoma spinigerum. This case highlights the challenges of gnathostomiasis diagnosis, raising questions on potential serologic cross-reactivity and the possible role of praziquantel in stimulating outward migration of Gnathostoma larvae/subadults.

      4. Taenia solium cysticercosis and taeniasis in urban settings: Epidemiological evidence from a health-center based study among people with epilepsy in Dar es Salaam, Tanzaniaexternal icon
        Schmidt V, O'Hara MC, Ngowi B, Herbinger KH, Noh J, Wilkins PP, Richter V, Kositz C, Matuja W, Winkler AS.
        PLoS Negl Trop Dis. 2019 Dec;13(12):e0007751.
        In Africa, urbanization is happening faster than ever before, with new implications for transmission of infectious diseases. For the zoonotic parasite Taenia solium, a major cause of acquired epilepsy in endemic countries, the prevalence in urban settings is unknown. The present study investigated epidemiological, neurological, and radiological characteristics of T. solium cysticercosis and taeniasis (TSCT) in people with epilepsy (PWE) living in Dar es Salaam, Tanzania, one of the fastest growing cities worldwide. A total of 302 PWE were recruited from six health centers in the Kinondoni district of Dar es Salaam. Serological testing for T. solium cysticercosis-antigen (Ag) and -antibodies (Abs) and for T. solium taeniasis-Abs was performed in all PWE. In addition, clinical and radiological examinations that included cranial computed tomography (CT) were performed. With questionnaires, demographic data from study populations were collected, and factors associated with TSCT were assessed. Follow-up examinations were conducted in PWE with TSCT. T. solium cysticercosis-Ag was detected in three (0.99%; 95% CI: 0-2.11%), -Abs in eight (2.65%; 95% CI: 0.84-4.46%), and taeniasis-Abs in five (1.66%; 95% CI: 0.22-3.09%) of 302 PWE. Six PWE (1.99%; 95% CI: 0.41-3.56%) were diagnosed with neurocysticercosis (NCC). This study demonstrates the presence of TSCT in Dar es Salaam; however, NCC was associated with only a few cases of epilepsy. The small fraction of PWE with cysticercosis- and taeniasis-Abs may suggest that active transmission of T. solium plays only a minor role in Dar es Salaam. A sufficiently powered risk analysis was hampered by the small number of PWE with TSCT; therefore, further studies are required to determine the exact routes of infection and risk behavior of affected individuals.

      5. Assessing the role of the private sector in surveillance for malaria elimination in Haiti and the Dominican Republic: a qualitative studyexternal icon
        Sidibe A, Maglior A, Cueto C, Chen I, Le Menach A, Chang MA, Eisele TP, Andrinopolous K, Cherubin J, Lemoine JF, Bennett A.
        Malar J. 2019 Dec 5;18(1):408.
        BACKGROUND: Haiti and the Dominican Republic (DR) are targeting malaria elimination by 2022. The private health sector has been relatively unengaged in these efforts, even though most primary health care in Haiti is provided by non-state actors, and many people use traditional medicine. Data on private health sector participation in malaria elimination efforts are lacking, as are data on care-seeking behaviour of patients in the private health sector. This study sought to describe the role of private health sector providers, care-seeking behaviour of individuals at high risk of malaria, and possible means of engaging the private health sector in Hispaniola's malaria elimination efforts. METHODS: In-depth interviews with 26 key informants (e.g. government officials), 62 private providers, and 63 patients of private providers, as well as 12 focus group discussions (FGDs) with community members, were conducted within seven study sites in Haiti and the DR. FGDs focused on local definitions of the private health sector and identified private providers for interview recruitment, while interviews focused on private health sector participation in malaria elimination activities and treatment-seeking behaviour of febrile individuals. RESULTS: Interviews revealed that self-medication is the most common first step in the trajectory of care for fevers in both Haiti and the DR. Traditional medicine is more commonly used in Haiti than in the DR, with many patients seeking care from traditional healers before, during, and/or after care in the formal health sector. Private providers were interested in participating in malaria elimination efforts but emphasized the need for ongoing support and training. Key informants agreed that the private health sector needs to be engaged, especially traditional healers in Haiti. The Haitian migrant population was reported to be one of the most at-risk groups by participants from both countries. CONCLUSION: Malaria elimination efforts across Hispaniola could be enhanced by engaging traditional healers in Haiti and other private providers with ongoing support and trainings; directing educational messaging to encourage proper treatment-seeking behaviour; and refining cross-border strategies for surveillance of the high-risk migrant population. Increasing distribution of rapid diagnostic tests (RDTs) and bi-therapy to select private health sector facilities, accompanied by adopting regulatory policies, could help increase numbers of reported and correctly treated malaria cases.

      6. A robust estimator of malaria incidence from routine health facility dataexternal icon
        Thwing J, Camara A, Candrinho B, Zulliger R, Colborn J, Painter J, Plucinski MM.
        Am J Trop Med Hyg. 2019 Dec 12.
        Routine incident malaria case data have become a pillar of malaria surveillance in sub-Saharan Africa. These data provide granular, timely information to track malaria burden. However, incidence data are sensitive to changes in care seeking rates, rates of testing of suspect cases, and reporting completeness. Based on a set of assumptions, we derived a simple algebraic formula to convert crude incidence rates to a corrected estimation of incidence, adjusting for biases in variable and suboptimal rates of care seeking, testing of suspect cases, and reporting completeness. We applied the correction to routine incidence data from Guinea and Mozambique, and aggregate data for sub-Saharan African countries from the World Malaria Report. We calculated continent-wide needs for malaria tests and treatments, assuming universal testing but current care seeking rates. Countries in southern and eastern Africa reporting recent increases in malaria incidence generally had lower overall corrected incidence than countries in Central and West Africa. Under current care seeking rates, the unmet need for malaria tests was estimated to be 160 million (M) (interquartile range [IQR]: 139-188) and for malaria treatments to be 37 M (IQR: 29-51). Maps of corrected incidence were more consistent with maps of community survey prevalence than was crude incidence in Guinea and Mozambique. Crude malaria incidence rates need to be interpreted in the context of suboptimal testing and care seeking rates, which vary over space and time. Adjusting for these factors can provide an insight into the spatiotemporal trends of malaria burden.
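        The abstract does not give the authors' algebraic formula, but one common form of such an adjustment divides crude incidence by the product of the care seeking, testing, and reporting rates. The sketch below uses that assumed form with hypothetical inputs; the paper's derived formula may differ.

```python
def corrected_incidence(crude_incidence, care_seeking_rate,
                        testing_rate, reporting_completeness):
    """Scale crude incidence up by the combined coverage of the surveillance
    cascade (assumed multiplicative form, for illustration only)."""
    coverage = care_seeking_rate * testing_rate * reporting_completeness
    return crude_incidence / coverage

# 100 reported cases per 1000 person-years, with 50% care seeking, 80%
# testing of suspect cases, and 90% reporting completeness.
print(round(corrected_incidence(100, 0.5, 0.8, 0.9)))  # 278
```

The point of such a correction is that two districts with the same true burden can report very different crude incidence if their care seeking or testing rates differ.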

    • Physical Activity
      1. Creating activity-friendly communities: Exploring the intersection of public health and the artsexternal icon
        Cornett K, Bray-Simons K, Devlin HM, Iyengar S, Moore Shaffer P, Fulton JE.
        J Phys Act Health. 2019 Sep 30:1-3.

    • Reproductive Health
      1. The Teen Access and Quality Initiative: Improving adolescent reproductive health best practices in publicly funded health centersexternal icon
        Brittain AW, Tevendale HD, Mueller T, Kulkarni AD, Middleton D, Garrison ML, Read-Wahidi MR, Koumans EH.
        J Community Health. 2019 Dec 9.
        Quality adolescent sexual and reproductive health (ASRH) services play an important role in supporting the overall health and well-being of adolescents. Improving access to this care can help reduce unintended pregnancies, sexually transmitted diseases (STDs), and human immunodeficiency virus (HIV) infection and their associated consequences, as well as promote health equity. The Centers for Disease Control and Prevention funded three grantees to implement a clinic-based ASRH quality improvement initiative complemented by activities to strengthen systems to refer and link youth to ASRH services. The purpose of this study is to describe the initiative and baseline assessment results of ASRH best practice implementation in participating health centers. The assessment found common use of the following practices: STD/HIV screening, education on abstinence and the use of dual protection, and activities to increase accessibility (e.g., offering after-school hours and walk-in and same-day appointments). The following practices were used less frequently: provider training for Long-Acting Reversible Contraception (LARC) insertion and removal, LARC availability, same-day provision of all contraceptive methods, and consistent sharing of information about confidentiality and minors' rights with adolescent clients. This study describes the types of training and technical assistance being implemented at each health center and discusses implications for future programming.

      2. Trends in ectopic pregnancy diagnoses in United States emergency departments, 2006-2013external icon
        Mann LM, Kreisel K, Llata E, Hong J, Torrone EA.
        Matern Child Health J. 2019 Dec 17.
        OBJECTIVES: Ectopic pregnancy is an important adverse pregnancy outcome that is under-surveilled. Emergency department (ED) data can help provide insight on the trends of ectopic pregnancy incidence in the United States (US). METHODS: Data from the largest US all-payer ED database, the Healthcare Cost and Utilization Project Nationwide ED Sample, were used to identify trends in the annual ratio of ED ectopic pregnancy diagnoses to live births during 2006-2013, and the annual rate of diagnoses among all pregnancies during 2006-2010. Diagnoses were identified through International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis and procedure codes and CPT codes. RESULTS: The overall ratio of weighted ED visits with an ectopic pregnancy diagnosis during 2006-2013 was 12.3 per 1000 live births. This ratio increased significantly from 2006 to 2013, from 11.0 to 13.7 ectopic pregnancies per 1000 live births, with no inflections in trend. The rate of ectopic pregnancy diagnoses per 1000 pregnancies increased during 2006-2010, from 7.0 to 8.3, with no inflections in trend. Females of all age groups experienced increases, though increases were less pronounced with increasing age. All geographic regions experienced increases, with increases being most pronounced in the Northeast. CONCLUSIONS: Our study suggests that ED ectopic pregnancy diagnoses may be increasing in the US, although the drivers of these increases are not clear. Our results highlight the need for national measures of total pregnancies, stratified by pertinent demographic variables, to evaluate trends in pregnancy-related conditions among key populations.

    • Statistics as Topic
      1. Latent class analysis (LCA) has been effectively used to cluster multiple survey items. However, causal inference with an exposure variable, identified by an LCA model, is challenging because (1) the exposure variable is unobserved and harbors the uncertainty of estimating parameters in the LCA model and (2) confounding bias adjustments need to be done with the unobserved LCA-driven exposure variable. In addition to these challenges, complex survey design features and survey weights must be accounted for if they are present. Our solutions to these issues are to (1) assess point estimates with the expected estimating function approach and (2) modify the survey design weights with LCA-based propensity scores. This paper aims to introduce a statistical procedure to apply the estimating equation approach to assessing the effects of an LCA-driven cause in complex survey data, using an example from the National Health and Nutrition Examination Survey.

      2. In August 2017 the National Center for Health Statistics (NCHS), part of the U.S. Federal Statistical System, published new standards for determining the reliability of proportions estimated using their data. These standards require an analyst to consider the Korn-Graubard confidence interval (CI), along with CI widths, sample size, and degrees of freedom, to assess the reliability of a proportion and determine whether it can be presented. The assessment itself involves determining whether several conditions are met. This manuscript presents kg_nchs, a postestimation command that is used following svy: proportion. It allows Stata users to (a) calculate the Korn-Graubard CI and associated statistics used in applying the NCHS presentation standards for proportions, and (b) display a series of three dichotomous flags that show if the standards are met. The empirical examples provided show how kg_nchs can be used to easily apply the standards and prevent Stata users from needing to perform manual calculations. While developed for NCHS survey data, this command can also be used with data from any survey with a complex sample design.
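For readers outside Stata, the interval itself can be sketched in Python. This is a minimal illustration of the standard Korn-Graubard construction (a Clopper-Pearson interval computed via beta quantiles, with the nominal sample size replaced by a design-adjusted effective sample size); it is not the kg_nchs implementation, and the capping of the effective sample size at n is an assumption of this sketch.

```python
# Hedged sketch of a Korn-Graubard confidence interval for a survey proportion.
# Assumptions (not taken from the article): effective sample size
# n_eff = p(1-p)/var(p), a t-quantile-ratio degrees-of-freedom adjustment,
# and Clopper-Pearson limits computed from beta quantiles.
from scipy import stats

def korn_graubard_ci(p, var_p, n, df, alpha=0.05):
    """CI for proportion p with design-based variance var_p,
    nominal sample size n, and design degrees of freedom df."""
    if var_p <= 0:
        raise ValueError("design-based variance must be positive")
    # Effective sample size implied by the design effect.
    n_eff = p * (1 - p) / var_p if 0 < p < 1 else n
    # Degrees-of-freedom adjustment via the ratio of t quantiles.
    adj = (stats.t.ppf(1 - alpha / 2, n - 1) / stats.t.ppf(1 - alpha / 2, df)) ** 2
    n_star = min(n, n_eff * adj)   # cap at the nominal sample size (assumption)
    x = n_star * p                 # "effective" number of successes
    lower = 0.0 if x <= 0 else stats.beta.ppf(alpha / 2, x, n_star - x + 1)
    upper = 1.0 if x >= n_star else stats.beta.ppf(1 - alpha / 2, x + 1, n_star - x)
    return lower, upper
```

Under simple random sampling (var_p = p(1-p)/n, df = n-1) the adjustment is 1 and the interval reduces to the ordinary Clopper-Pearson interval.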

    • Substance Use and Abuse
      1. Vitamin E acetate in bronchoalveolar-lavage fluid associated with EVALIexternal icon
        Blount BC, Karwowski MP, Shields PG, Morel-Espinosa M, Valentin-Blasini L, Gardner M, Braselton M, Brosius CR, Caron KT, Chambers D, Corstvet J, Cowan E, De Jesus VR, Espinosa P, Fernandez C, Holder C, Kuklenyik Z, Kusovschi JD, Newman C, Reis GB, Rees J, Reese C, Silva L, Seyler T, Song MA, Sosnoff C, Spitzer CR, Tevis D, Wang L, Watson C, Wewers MD, Xia B, Heitkemper DT, Ghinai I, Layden J, Briss P, King BA, Delaney LJ, Jones CM, Baldwin GT, Patel A, Meaney-Delman D, Rose D, Krishnasamy V, Barr JR, Thomas J, Pirkle JL.
        N Engl J Med. 2019 Dec 20.
        BACKGROUND: The causative agents for the current national outbreak of electronic-cigarette, or vaping, product use-associated lung injury (EVALI) have not been established. Detection of toxicants in bronchoalveolar-lavage (BAL) fluid from patients with EVALI can provide direct information on exposure within the lung. METHODS: BAL fluids were collected from 51 patients with EVALI in 16 states and from 99 healthy participants who were part of an ongoing study of smoking involving nonsmokers, exclusive users of e-cigarettes or vaping products, and exclusive cigarette smokers that was initiated in 2015. Using the BAL fluid, we performed isotope dilution mass spectrometry to measure several priority toxicants: vitamin E acetate, plant oils, medium-chain triglyceride oil, coconut oil, petroleum distillates, and diluent terpenes. RESULTS: State and local health departments assigned EVALI case status as confirmed for 25 patients and as probable for 26 patients. Vitamin E acetate was identified in BAL fluid obtained from 48 of 51 case patients (94%) in 16 states but not in such fluid obtained from the healthy comparator group. No other priority toxicants were found in BAL fluid from the case patients or the comparator group, except for coconut oil and limonene, which were found in 1 patient each. Among the case patients for whom laboratory or epidemiologic data were available, 47 of 50 (94%) had detectable tetrahydrocannabinol (THC) or its metabolites in BAL fluid or had reported vaping THC products in the 90 days before the onset of illness. Nicotine or its metabolites were detected in 30 of 47 of the case patients (64%). CONCLUSIONS: Vitamin E acetate was associated with EVALI in a convenience sample of 51 patients in 16 states across the United States. (Funded by the National Cancer Institute and others.).

      2. Update: Interim guidance for health care professionals evaluating and caring for patients with suspected e-cigarette, or vaping, product use-associated lung injury and for reducing the risk for rehospitalization and death following hospital discharge - United States, December 2019external icon
        Evans ME, Twentyman E, Click ES, Goodman AB, Weissman DN, Kiernan E, Hocevar SA, Mikosz CA, Danielson M, Anderson KN, Ellington S, Lozier MJ, Pollack LA, Rose DA, Krishnasamy V, Jones CM, Briss P, King BA, Wiltz JL.
        MMWR Morb Mortal Wkly Rep. 2020 Jan 3;68(5152):1189-1194.

      3. Patient characteristics and product use behaviors among persons with e-cigarette, or vaping, product use-associated lung injury - Indiana, June-October 2019external icon
        Gaub KL, Hallyburton S, Samanic C, Paddack D, Clark CR, Pence S, Brown JA, Hawkins E.
        MMWR Morb Mortal Wkly Rep. 2019 Dec 13;68(49):1139-1141.
        As of December 4, 2019, a total of 2,291 cases of hospitalized e-cigarette, or vaping, product use-associated lung injury (EVALI) have been reported from 50 states, the District of Columbia, and two U.S. territories (Puerto Rico and the U.S. Virgin Islands) (1). State health departments, including the Indiana State Department of Health (ISDH), are working with their local health departments and with CDC, the Food and Drug Administration, and other clinical and public health partners in investigating this outbreak of EVALI. On August 7, 2019, ISDH issued an advisory regarding patients hospitalized in Wisconsin with severe acute lung injury who reported the use of e-cigarette, or vaping, products (2); health care providers were requested to notify ISDH of similar cases. On August 8, 2019, ISDH received reports of five similar cases among Indiana residents. Suspected EVALI cases reported to ISDH were investigated further only among patients who required hospitalization. Established case definitions were used to classify cases.* Medical record abstractions and patient interviews were completed using nationally standardized forms to ascertain patient characteristics, medical care received, and product-use behaviors.


      4. Syndromic surveillance for e-cigarette, or vaping, product use-associated lung injuryexternal icon
        Hartnett KP, Kite-Powell A, Patel MT, Haag BL, Sheppard MJ, Dias TP, King BA, Melstrom PC, Ritchey MD, Stein Z, Idaikkadar N, Vivolo-Kantor AM, Rose DA, Briss PA, Layden JE, Rodgers L, Adjemian J.
        N Engl J Med. 2019 Dec 20.

      5. Update: Demographic, product, and substance-use characteristics of hospitalized patients in a nationwide outbreak of e-cigarette, or vaping, product use-associated lung injuries - United States, December 2019external icon
        Lozier MJ, Wallace B, Anderson K, Ellington S, Jones CM, Rose D, Baldwin G, King BA, Briss P, Mikosz CA.
        MMWR Morb Mortal Wkly Rep. 2019 Dec 13;68(49):1142-1148.

      6. Characteristics of patients experiencing rehospitalization or death after hospital discharge in a nationwide outbreak of e-cigarette, or vaping, product use-associated lung injury - United States, 2019external icon
        Mikosz CA, Danielson M, Anderson KN, Pollack LA, Currie DW, Njai R, Evans ME, Goodman AB, Twentyman E, Wiltz JL, Rose DA, Krishnasamy V, King BA, Jones CM, Briss P, Lozier M, Ellington S.
        MMWR Morb Mortal Wkly Rep. 2020 Jan 3;68(5152):1183-1188.

      7. Trends in intentional and unintentional opioid overdose deaths in the United States, 2000-2017external icon
        Olfson M, Rossen LM, Wall MM, Houry D, Blanco C.
        JAMA. 2019 Dec 17;322(23):2340-2342.

      8. BACKGROUND: The unpredictable physiologic and pharmacologic effects of synthetic cannabinoids (SCs) are continuously changing as the chemical structure of SCs evolve to avoid classification as a Schedule I drug under the Controlled Substances Act in the U.S. This results in unpredictable pharmacologic effects and subsequent sequelae. Little is known about national or regional trends of SC clusters. The objective of this study is to investigate trends in SC exposure using emergency department (ED) syndromic data. METHODS: We analyzed ED syndromic data to detect quarterly trends from January 2016 through September 2019 for SC-related exposures within 59 jurisdictions in 47 states by U.S. region. Pearson chi-square tests detected quarter-to-quarter changes and Joinpoint regression assessed trends over time. RESULTS: From January 2016 to September 2019, 21,714 of 303.5 million ED visits involved suspected SC exposures. Nationally, SC-related exposures decreased by 1.9% (p = .04) on average per quarter, yet exposures increased in the Midwest by 6.3% (p = .002) and in the Northeast by 3.2% (p = .03) on average per quarter, and decreased on average per quarter by 7.7% (p ≤ .001) in the Southeast and 11.4% in the West (p ≤ .001). Known SC exposures that may align with clusters were identified in quarter-to-quarter monitoring. CONCLUSIONS: Only a small proportion of ED visits were related to suspected SC exposure. Although we did identify a small decrease in national SC exposures, there was wide variation by region. Additional efforts are needed to understand variation and to develop prevention and response strategies.

      9. Neonatal abstinence syndrome incidence and health care costs in the United States, 2016external icon
        Strahan AE, Guy GP, Bohm M, Frey M, Ko JY.
        JAMA Pediatr. 2019 Dec 16.

      10. Smokeless tobacco (ST) products are used worldwide and are a major public health concern. In addition to harmful chemicals found in these products, microbes found in ST products are believed to be responsible for generating harmful tobacco-specific nitrosamines (TSNAs), the most abundant carcinogens in ST. These microbes also contribute endotoxins and other pro-inflammatory components. A greater understanding of the microbial constituents in these products is sought in order to potentially link select design aspects or manufacturing processes to avoidable increases in harmful constituents. Previous studies looked primarily at bacterial constituents and had not differentiated between viable and nonviable organisms, so in this study, we sought to use a dual metatranscriptomic and metagenomic analysis to see if differences exist. Using high-throughput sequencing, we observed that there were differences in taxonomic abundances between the metagenome and metatranscriptome, and in the metatranscriptome, we also observed an abundance of plant virus RNA not previously reported in DNA-only studies. We also found in the product tested, that there were no viable bacteria capable of metabolizing nitrate to nitrite. Therefore, the product tested would not be likely to increase TSNAs during shelf storage. We tested only a single product to date using the strategy presented here, but succeeded in demonstrating the value of using these methods on tobacco products. These results present novel findings from the first combined metagenome and metatranscriptome of a commercial tobacco product.

    • Zoonotic and Vectorborne Diseases
      1. Community-based prevention of epidemic Rocky Mountain spotted fever among minority populations in Sonora, Mexico, using a One Health approachexternal icon
        Alvarez-Hernandez G, Drexler N, Paddock CD, Licona-Enriquez JD, la Mora JD, Straily A, Del Carmen Candia-Plata M, Cruz-Loustaunau DI, Arteaga-Cardenas VA.
        Trans R Soc Trop Med Hyg. 2019 Dec 10.
        BACKGROUND: Rocky Mountain spotted fever (RMSF) is a significant public health problem in Sonora, Mexico, resulting in thousands of cases and hundreds of deaths. Outbreaks of RMSF are perpetuated by heavy brown dog tick infestations in and around homes. During 2009-2015, there were 61 RMSF cases and 23 deaths in a single community of Sonora (Community A). METHODS: An integrated intervention was carried out from March-November 2016 aimed at reducing tick populations with long-acting acaricidal collars on dogs, environmental acaricides applied to peri-domestic areas and RMSF education. Tick levels were measured by inspection of community dogs to monitor efficacy of the intervention. A similar neighborhood (Community B) was selected for comparison and received standard care (acaricide treatment and education). RESULTS: The prevalence of tick-infested dogs in Community A declined from 32.5% to 8.8% (p<0.01). No new cases of RMSF were identified in this area during the subsequent 18 mo. By comparison, the percentage of tick-infested dogs in Community B decreased from 19% to 13.4% (p=0.36) and two cases were reported, including one death. CONCLUSIONS: Community-based interventions using an integrated approach to control brown dog ticks can diminish the morbidity and mortality attributable to RMSF.

      2. Serologic testing is the standard for laboratory diagnosis and confirmation of Lyme disease. Serodiagnostic assays to detect antibodies against Borrelia burgdorferi, the agent of Lyme borreliosis, are used for detection of infection. However, serologic testing within the first month of infection is less sensitive as patients' antibody responses continue to develop. Previously, we screened several B. burgdorferi in vivo expressed antigens for candidates that elicit early antibody responses in patients with Stage 1 and 2 Lyme disease. We evaluated patient IgM seroreactivity against 6 antigens and found an increase in sensitivity without compromising specificity when compared to current IgM second-tier immunoblot scoring. In this study, we continued the evaluation using a multi-antigen panel to measure IgM plus IgG seroreactivity in these early Lyme disease patients' serum samples. Using two statistical methods for calculating positivity cutoff values, sensitivity was 70% and 84-87% for early acute and early convalescent Lyme disease patients, respectively. Specificity was 98-100% for healthy non-endemic control patients, and 96-100% for healthy endemic controls depending on the statistical analysis. We conclude that improved serologic testing for early Lyme disease may be achieved by the addition of multiple borrelial antigens that elicit IgM and IgG antibodies early in infection.

      3. Risk of yellow fever virus importation into the United States from Brazil, outbreak years 2016-2017 and 2017-2018external icon
        Dorigatti I, Morrison S, Donnelly CA, Garske T, Bowden S, Grills A.
        Sci Rep. 2019 Dec 31;9(1):20420.
        Southeast Brazil has experienced two large yellow fever (YF) outbreaks since 2016. While the 2016-2017 outbreak mainly affected the states of Espirito Santo and Minas Gerais, the 2017-2018 YF outbreak primarily involved the states of Minas Gerais, Sao Paulo, and Rio de Janeiro, the latter two of which are highly populated and popular destinations for international travelers. This analysis quantifies the risk of YF virus (YFV) infected travelers arriving in the United States via air travel from Brazil, including both incoming Brazilian travelers and returning US travelers. We assumed that US travelers were subject to the same daily risk of YF infection as Brazilian residents. During both YF outbreaks in Southeast Brazil, three international airports-Miami, New York-John F. Kennedy, and Orlando-had the highest risk of receiving a traveler infected with YFV. Most of the risk was observed among incoming Brazilian travelers. Overall, we found low risk of YFV introduction into the United States during the 2016-2017 and 2017-2018 outbreaks. Decision makers can use these results to employ the most efficient and least restrictive actions and interventions.

      4. Description of Eschar-associated rickettsial diseases using passive surveillance data - United States, 2010-2016external icon
        Drexler N, Nichols Heitman K, Cherry C.
        MMWR Morb Mortal Wkly Rep. 2020 Jan 3;68(5152):1179-1182.

      5. An evaluation of the flea index as a predictor of plague epizootics in the West Nile Region of Ugandaexternal icon
        Eisen RJ, Atiku LA, Mpanga JT, Enscore RE, Acayo S, Kaggwa J, Yockey BM, Apangu T, Kugeler KJ, Mead PS.
        J Med Entomol. 2019 Dec 31.
        Plague is a low incidence flea-borne zoonosis that is often fatal if treatment is delayed or inadequate. Outbreaks occur sporadically and human cases are often preceded by epizootics among rodents. Early recognition of epizootics coupled with appropriate prevention measures should reduce plague morbidity and mortality. For nearly a century, the flea index (a measure of fleas per host) has been used as a measure of risk for epizootic spread and human plague case occurrence, yet the practicality and effectiveness of its use in surveillance programs has not been evaluated rigorously. We sought to determine whether long-term monitoring of the Xenopsylla flea index on hut-dwelling rats in sentinel villages in the plague-endemic West Nile region of Uganda accurately predicted plague occurrence in the surrounding parish. Based on observations spanning ~6 yr, we showed that on average, the Xenopsylla flea index increased prior to the start of the annual plague season and tended to be higher in years when plague activity was reported in humans or rodents compared with years when it was not. However, this labor-intensive effort had limited spatial coverage and was a poor predictor of plague activity within sentinel parishes.

      6. Measurement error, microcephaly prevalence and implications for Zika: an analysis of Uruguay perinatal dataexternal icon
        Harville EW, Buekens PM, Cafferata ML, Gilboa S, Tomasso G, Tong V.
        Arch Dis Child. 2019 Dec 13.
        BACKGROUND AND OBJECTIVE: The Zika virus outbreak has drawn attention to microcephaly, whose definition is based on head circumference measuring below a percentile or number of SDs below the mean. The objective of this analysis was to assess how differences in measurement precision might affect prevalence and trends of microcephaly. METHODS: Data from all births in Uruguay during 2010-2015 were obtained from the Perinatal Information System. The prevalence of births with microcephaly was calculated based on head circumference measurement at birth applying the INTERGROWTH-21(st) standards for sex and gestational age, and compared by method of ascertaining gestational age. RESULTS: Rounding and digit preference was observed: 74% of head circumference measurements were reported as a whole centimetre value. The prevalence of births varied substantially by the criterion used to define microcephaly (<3 SD, <2 SD, <3rd percentile for gestational age) and could be halved or doubled based on adding or subtracting a half-centimetre from all reported head circumference measurements. If 4 days were added to gestational age calculations, rather than using completed gestational weeks (without days) for gestational age reporting, the prevalence was 1.7-2 times higher. DISCUSSION: Rounding in measurement of head circumference and reporting preferences of gestational age may have contributed to a lower prevalence of microcephaly than expected in this population. Differences in head circumference measurement protocols and gestational age dating have the potential to affect the prevalence of babies reported with microcephaly, and this limitation should be acknowledged when interpreting head circumference data collected for surveillance.
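The sensitivity of microcephaly prevalence to a half-centimetre shift can be illustrated with a toy calculation. This sketch assumes head circumference is approximately normally distributed; the mean and SD values are hypothetical and not taken from the Uruguay data.

```python
# Illustrative sketch (not the study's method): if head circumference is
# roughly normal, shifting every measurement by 0.5 cm moves the z-score
# at the "<2 SD" cutoff and can roughly halve or double the share of
# births classified as microcephalic.
from math import erf, sqrt

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

MEAN_CM, SD_CM = 34.0, 1.3       # hypothetical term-newborn values
cutoff = MEAN_CM - 2 * SD_CM     # the "<2 SD" microcephaly threshold

for shift in (-0.5, 0.0, +0.5):  # subtract / add half a centimetre
    z = (cutoff - (MEAN_CM + shift)) / SD_CM
    print(f"shift {shift:+.1f} cm -> prevalence {normal_cdf(z):.2%}")
```

With no shift the prevalence is the familiar 2.3% below -2 SD; shifting all measurements by 0.5 cm in either direction moves it to roughly 0.9% or 5.3%, consistent with the halving/doubling the authors describe.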

      7. Comparing vector and human surveillance strategies to detect arbovirus transmission: A simulation study for Zika virus detection in Puerto Ricoexternal icon
        Madewell ZJ, Hemme RR, Adams L, Barrera R, Waterman SH, Johansson MA.
        PLoS Negl Trop Dis. 2019 Dec 26;13(12):e0007988.
        BACKGROUND: Detecting and monitoring the transmission of arboviruses such as Zika virus (ZIKV), dengue virus, and chikungunya virus is critical for prevention and control activities. Previous work has compared the ability of different human-focused surveillance strategies to detect ZIKV transmission in U.S. counties where no known transmission had occurred, but whether virological surveillance in mosquitoes could represent an effective surveillance system is unclear. OBJECTIVES: We leveraged a unique set of data from human and virological surveillance in Ae. aegypti during the 2016 ZIKV epidemic in Caguas, Puerto Rico, to compare alternative strategies for detecting and monitoring ZIKV activity. METHODS: We developed a simulation model for mosquito and human surveillance strategies and simulated different transmission scenarios with varying infection rates and mosquito trap densities. We then calculated the expected weekly number of detected infections, the probability of detecting transmission, and the number of tests needed and compared the simulations with observed data from Caguas. RESULTS: In simulated high transmission scenarios (1 infection per 1,000 people per week), the models demonstrated that both approaches had estimated probabilities of detection of greater than 78%. In simulated low incidence scenarios, vector surveillance had higher sensitivity than human surveillance and sensitivity increased with more traps, more trapping effort, and testing. In contrast, the actual data from Caguas indicated that human virological surveillance was more sensitive than vector virological surveillance during periods of both high and low transmission. CONCLUSION: In scenarios where human surveillance is not possible or when transmission intensity is very low, virological surveillance in Ae. aegypti may be able to detect and monitor ZIKV epidemic activity. However, surveillance for humans seeking care for Zika-like symptoms likely provides an equivalent or more sensitive indicator of transmission intensity in most circumstances.
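The core quantity in a comparison like this, the probability that a surveillance system detects at least one infection in a week, can be approximated without a full simulation. The sketch below is not the authors' model; it uses a Poisson approximation, and the coverage and sensitivity figures are made-up illustrations.

```python
# Hedged sketch (not the paper's simulation model): a Poisson approximation
# to the weekly probability that a surveillance system detects at least one
# infection, given incidence, sampling coverage, and test sensitivity.
import math

def detection_probability(population, weekly_incidence, coverage, sensitivity):
    """P(detect >= 1 infection in a week), with
    lambda = expected number of detected infections."""
    lam = population * weekly_incidence * coverage * sensitivity
    return 1 - math.exp(-lam)

# The paper's "high transmission" scenario is 1 infection per 1,000 people
# per week; population, coverage, and sensitivity below are illustrative.
p_high = detection_probability(100_000, 1 / 1_000, coverage=0.05, sensitivity=0.9)
p_low = detection_probability(100_000, 1 / 100_000, coverage=0.05, sensitivity=0.9)
print(f"high transmission: {p_high:.0%}, low transmission: {p_low:.1%}")
```

This makes the qualitative result intuitive: at high incidence even modest coverage detects transmission almost surely, while at low incidence detection hinges on expanding coverage (more traps, more trapping effort, more testing).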

      8. Zika virus surveillance at the human-animal interface in West-Central Brazil, 2017-2018external icon
        Pauvolid-Correa A, Goncalves Dias H, Marina Siqueira Maia L, Porfirio G, Oliveira Morgado T, Sabino-Santos G, Helena Santa Rita P, Teixeira Gomes Barreto W, Carvalho de Macedo G, Marinho Torres J, Arruda Gimenes Nantes W, Martins Santos F, Oliveira de Assis W, Castro Rucco A, Mamoru Dos Santos Yui R, Bosco Vilela Campos J, Rodrigues Leandro ES, da Silva Ferreira R, Aparecido da Silva Neves N, Charlles de Souza Costa M, Ramos Martins L, Marques de Souza E, Dos Santos Carvalho M, Goncalves Lima M, de Cassia Goncalves Alves F, Humberto Guimaraes Riquelme-Junior L, Luiz Batista Figueiro L, Fernandes Gomes de Santana M, Gustavo Rodrigues Oliveira Santos L, Serra Medeiros S, Lopes Seino L, Hime Miranda E, Henrique Rezende Linhares J, de Oliveira Santos V, Almeida da Silva S, Araujo Lucio K, Silva Gomes V, de Araujo Oliveira A, Dos Santos Silva J, de Almeida Marques W, Schafer Marques M, Junior Franca de Barros J, Campos L, Couto-Lima D, Coutinho Netto C, Strussmann C, Panella N, Hannon E, Cristina de Macedo B, Ramos de Almeida J, Ramos Ribeiro K, Carolina Barros de Castro M, Pratta Campos L, Paula Rosa Dos Santos A, Marino de Souza I, de Assis Bianchini M, Helena Ramiro Correa S, Ordones Baptista Luz R, Dos Santos Vieira A, Maria de Oliveira Pinto L, Azeredo E, Tadeu Moraes Figueiredo L, Augusto Fonseca Alencar J, Maria Barbosa de Lima S, Miraglia Herrera H, Dezengrini Shlessarenko R, Barreto Dos Santos F, Maria Bispo de Filippis A, Salyer S, Montgomery J, Komar N.
        Viruses. 2019 Dec 16;11(12).
        Zika virus (ZIKV) was first discovered in 1947 in Uganda but was not considered a public health threat until 2007, when it was found to be the source of epidemic activity in Asia. Epidemic activity spread to Brazil in 2014 and continued to spread throughout the tropical and subtropical regions of the Americas. Despite ZIKV being zoonotic in origin, information about transmission, or even exposure of non-human vertebrates and mosquitoes to ZIKV in the Americas, is lacking. Accordingly, from February 2017 to March 2018, we sought evidence of sylvatic ZIKV transmission by sampling whole blood from approximately 2000 domestic and wild vertebrates of over 100 species in West-Central Brazil within the active human ZIKV transmission area. In addition, we collected over 24,300 mosquitoes of at least 17 genera and 62 species. We screened whole blood samples and mosquito pools for ZIKV RNA using pan-flavivirus primers in a real-time reverse-transcription polymerase chain reaction (RT-PCR) in a SYBR Green platform. Positives were confirmed using ZIKV-specific envelope gene real-time RT-PCR and nucleotide sequencing. Of the 2068 vertebrates tested, none were ZIKV positive. Of the 23,315 non-engorged mosquitoes consolidated into 1503 pools tested, 22 (1.5%) with full data available showed some degree of homology to insect-specific flaviviruses. To identify previous exposure to ZIKV, 1498 plasma samples representing 62 species of domestic and sylvatic vertebrates were tested for ZIKV-neutralizing antibodies by plaque reduction neutralization test (PRNT90). From these, 23 (1.5%) of seven species were seropositive for ZIKV and negative for dengue virus serotype 2, yellow fever virus, and West Nile virus, suggesting potential monotypic reaction for ZIKV. Results presented here suggest no active transmission of ZIKV in non-human vertebrate populations or in alternative vector candidates, but suggest that vertebrates around human populations have indeed been exposed to ZIKV in West-Central Brazil.

      9. Balancing sensitivity and specificity of Zika virus case definitionsexternal icon
        Paz-Bailey G, Gregory CJ.
        Lancet Infect Dis. 2019 Dec 20.

      10. Strategies for combating avian influenza in the Asia-Pacificexternal icon
        Peters L, Greene C, Azziz-Baumgartner E, Zhou S, Lupisan S, Dayan W, Hammond A, Claes F, Mumford E, Dueger E.
        Western Pac Surveill Response J. 2018;9(5 Suppl 1):8-10.

      11. Factors that mattered in helping travelers from countries with Ebola outbreaks participate in post-arrival monitoring during the 2014-2016 Ebola epidemicexternal icon
        Prue CE, Williams PN, Joseph HA, Johnson M, Wojno AE, Zulkiewicz BA, Macom J, Alexander JP, Ray SE, Southwell BG.
        Inquiry. 2019 Jan-Dec;56.
        During the 2014-2016 Ebola epidemic in West Africa, the US Centers for Disease Control and Prevention (CDC) developed the CARE+ program to help travelers arriving to the United States from countries with Ebola outbreaks to meet US government requirements of post-arrival monitoring. We assessed 2 outcomes: (1) factors associated with travelers' intention to monitor themselves and report to local or state public health authority (PHA) and (2) factors associated with self-reported adherence to post-arrival monitoring and reporting requirements. We conducted 1195 intercept in-person interviews with travelers arriving from countries with Ebola outbreaks at 2 airports between April and June 2015. In addition, 654 (54.7%) of these travelers participated in a telephone interview 3 to 5 days after intercept, and 319 (26.7%) participated in a second telephone interview 2 days before the end of their post-arrival monitoring. We used regression modeling to examine variance in the 2 outcomes due to 4 types of factors: (1) programmatic, (2) perceptual, (3) demographic, and (4) travel-related factors. Factors associated with the intention to adhere to requirements included clarity of the purpose of screening (B = 0.051, 95% confidence interval [CI], 0.011-0.092), perceived approval of others (B = 0.103, 95% CI, 0.058-0.148), perceived seriousness of Ebola (B = 0.054, 95% CI, 0.031-0.077), confidence in one's ability to perform behaviors (B = 0.250, 95% CI, 0.193-0.306), ease of following instructions (B = 0.053, 95% CI, 0.010-0.097), and trust in CARE Ambassador (B = 0.056, 95% CI, 0.009-0.103). Respondents' perception of the seriousness of Ebola was the single factor associated with adherence to requirements (odds ratio [OR] = 0.81, 95% CI, 0.673-0.980, for non-adherent vs adherent participants and OR = 0.86, 95% CI, 0.745-0.997, for lost to follow-up vs adherent participants). Results from this assessment can guide public health officials in future outbreaks by identifying factors that may affect adherence to public health programs designed to prevent the spread of epidemics.

      12. Ebola patient virus cycle threshold and risk of household transmission of Ebola virusexternal icon
        Reichler MR, Bruden D, Thomas H, Erickson BR, Knust B, Duffy N, Klena J, Hennessy T.
        J Infect Dis. 2019 Dec 19.
        BACKGROUND: Identifying risk factors for household transmission of Ebola virus (EBOV) is important to guide preventive measures during Ebola outbreaks. METHODS: We enrolled all confirmed persons with EBOV disease who were the first case patient in a household from December 2014 to April 2015 in Freetown, Sierra Leone, and their household contacts. Index patients and contacts were interviewed, and contacts were followed up for 21 days to identify secondary cases. Epidemiologic data were linked to EBOV real-time reverse-transcription polymerase chain reaction cycle threshold (Ct) data from initial diagnostic specimens obtained from enrolled index case patients. RESULTS: Ct data were available for 106 (71%) of 150 enrolled index patients. Of the Ct results, 85 (80%) were from blood specimens from live patients and 21 (20%) from oral swab specimens from deceased patients. The median Ct values for blood and swab specimens were 21.0 and 24.0, respectively (P = .007). In multivariable analysis, a Ct value from blood specimens in the lowest quintile was an independent predictor of both increased risk of household transmission (P = .009) and higher secondary attack rate among household contacts (P = .03), after adjustment for epidemiologic factors. CONCLUSIONS: Our findings suggest the potential to use Ct values from acute EBOV diagnostic specimens for index patients as an early predictor of high-risk households and high-risk groups of contacts to help prioritize EBOV disease investigation and control efforts.



CDC Science Clips Production Staff

  • Takudzwa Sayi, Editor
  • Gail Bang, MLIS, Librarian
  • Kathy Tucker, Librarian
  • William (Bill) Thomas, MLIS, Librarian
  • Jarvis Sims, MIT, MLIS, Librarian

____

DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinion, findings and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC Websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.

Page last reviewed: January 27, 2020, 12:00 AM