CDC Science Clips: Volume 13, Issue 3, January 26, 2021

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention score to track social and mainstream media mentions.

  1. CDC Authored Publications
    • Chronic Diseases and Conditions
      1. Emergency medical services utilization for acute stroke care: Analysis of the Paul Coverdell National Acute Stroke Program, 2014-2019
        Asaithambi G, Tong X, Lakshminarayan K, Coleman King SM, George MG, Odom EC.
        Prehosp Emerg Care. 2021 Jan 19:1-9.
        OBJECTIVE: Emergency medical service (EMS) transportation after acute stroke is associated with shorter symptom-to-arrival times and more rapid medical attention when compared to patient transportation by private vehicle. METHODS: We analyzed data from the Paul Coverdell National Acute Stroke Program from 2014 to 2019 among stroke (ischemic and hemorrhagic) and transient ischemic attack (TIA) patients to examine patterns in EMS utilization. RESULTS: Of 500,829 stroke and TIA patients (mean age 70.9 years, 51.3% women) from 682 participating hospitals during the study period, 60% arrived by EMS. Patients aged 18-64 years vs. ≥65 years (AOR 0.67) were less likely to utilize EMS. Severe stroke patients (AOR 2.29, 95% CI, 2.15-2.44) and hemorrhagic stroke patients vs. ischemic stroke patients (AOR 1.47, 95% CI, 1.43-1.51) were more likely to utilize EMS. Medicare (AOR 1.35, 95% CI, 1.32-1.38) and Medicaid (AOR 1.41, 95% CI, 1.37-1.45) beneficiaries were more likely than privately insured patients to utilize EMS, but no difference was found between no insurance/self-pay patients and privately insured patients on EMS utilization. Overall, there was a decreasing trend in the utilization of EMS (59.6% to 59.3%, p = 0.037). The decreasing trend was identified among ischemic stroke (p < 0.0001) patients but not among TIA (p = 0.89) or hemorrhagic stroke (p = 0.44) patients. There was no observed trend in pre-notification among stroke patients' arrival by EMS across the study period (56.9% to 56.5%, p = 0.99). CONCLUSIONS: Strategies to help increase stroke awareness and utilization of EMS among those with symptoms of stroke should be considered to help improve stroke outcomes.

      2. Prevalence of systemic lupus erythematosus in the United States: Estimates from a meta-analysis of the Centers for Disease Control and Prevention National Lupus Registries
        Izmirly PM, Parton H, Wang L, McCune WJ, Lim SS, Drenkard C, Ferucci ED, Dall'Era M, Gordon C, Helmick CG, Somers EC.
        Arthritis Rheumatol. 2021 Jan 20.
        OBJECTIVE: Epidemiologic data for systemic lupus erythematosus (SLE) are limited, particularly for racial/ethnic subpopulations in the United States (U.S.). Leveraging data from the Centers for Disease Control and Prevention (CDC) National Lupus Registry network of population-based SLE registries, a meta-analysis estimating U.S. SLE prevalence was performed. METHODS: The CDC National Lupus Registry network included four registries in unique states and a fifth in the Indian Health Service (IHS). All registries used the 1997 revised American College of Rheumatology (ACR) classification criteria for the SLE case definition. Case finding spanned either 2002-2004 or 2007-2009. A random effects model was employed given heterogeneity across sites. Applying sex/race-stratified estimates to the 2018 Census population, an estimate for the number of SLE cases in the U.S. was generated. RESULTS: 5,417 cases fulfilled the ACR SLE classification criteria. Pooled prevalence from the four state-specific registries was 72.8/100,000 (95% CI: 65.3-81.0), 9 times higher for females than males (128.7 vs 14.6), and highest among Black females (230.9), followed by Hispanic (120.7), white (84.7) and Asian/Pacific Islander females (84.4). Male prevalence was highest in Black males (26.7) followed by Hispanic (18.0), Asian/Pacific Islander (11.2), and white males (8.9). The American Indian/Alaska Native population had the highest race-specific SLE estimates for females (270.6/100,000) and males (53.8/100,000). In 2018, 204,295 persons (95% CI: 160,902-261,725) in the U.S. fulfilled ACR SLE classification criteria. CONCLUSIONS: A coordinated network of population-based SLE registries provided more accurate estimates for SLE prevalence and numbers affected in the U.S.
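        The national projection step in this abstract (applying sex-stratified prevalence to Census denominators) is simple arithmetic. A minimal Python sketch using the pooled female/male prevalence figures reported above, with hypothetical population denominators rather than actual 2018 Census values:

```python
# Project SLE case counts by applying prevalence (cases per 100,000)
# to population denominators. Prevalence values are the pooled
# estimates from the abstract; population figures are hypothetical.
prevalence_per_100k = {"female": 128.7, "male": 14.6}
population = {"female": 166_000_000, "male": 161_000_000}  # illustrative only

estimated_cases = {
    group: prevalence_per_100k[group] / 100_000 * population[group]
    for group in prevalence_per_100k
}
total = sum(estimated_cases.values())
print({group: round(n) for group, n in estimated_cases.items()}, round(total))
```

        The registry analysis additionally stratifies by race/ethnicity and propagates uncertainty through a random-effects model; this sketch shows only the rate-times-population step.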

      3. Hypertension is one of the largest modifiable risk factors for cardiovascular disease in the United States, and when it occurs during pregnancy, it can lead to serious risks for both the mother and child. There is currently no nationwide or state surveillance system that specifically monitors hypertension among women of reproductive age (WRA). We reviewed hypertension information available in the Behavioral Risk Factor Surveillance System (BRFSS), National Health and Nutrition Examination Survey (NHANES), National Health Interview Survey (NHIS), and Pregnancy Risk Assessment and Monitoring System (PRAMS) health surveys; the Healthcare Cost and Utilization Project administrative data sets (National Inpatient Sample, State Inpatient Databases, Nationwide Emergency Department Sample, State Emergency Department Databases, and the Nationwide Readmissions Database); and the National Vital Statistics System. BRFSS, NHIS, NHANES, and the administrative data sets have the capacity to segment nonpregnant WRA from pregnant women. PRAMS collects information on hypertension before and during pregnancy only among women with a live birth. Detailed information on hypertension in the postpartum period is lacking in the data sources that we reviewed. Enhanced data collection may improve opportunities to conduct surveillance of hypertension among WRA.

      4. Summary of current guidelines for cervical cancer screening and management of abnormal test results: 2016-2020
        Perkins RB, Guido RL, Saraiya M, Sawaya GF, Wentzensen N, Schiffman M, Feldman S.
        J Womens Health (Larchmt). 2021 Jan;30(1):5-13.
        Cervical cancer can be prevented through routine screening and follow-up of abnormal results. Several guidelines have been published in the last 4 years from various medical societies and organizations. These guidelines aim to personalize screening and management, reducing unnecessary testing in low-risk patients and managing high-risk patients with more intensive follow-up. However, the resulting complexity can lead to confusion among providers. The CDC, NCI, and obstetrician-gynecologists involved in guideline development summarized current screening and management guidelines. For screening, guidelines for average-risk and high-risk populations are summarized and presented. For management, differences between the 2012 and 2019 consensus guidelines for managing abnormal cervical cancer screening tests and cancer precursors are summarized. Current screening guidelines for average-risk individuals have minor differences, but are evolving toward an HPV-based strategy. For management, HPV testing is preferred to cytology because it is a more sensitive test for cancer precursor detection and also allows for precise risk stratification. Current risk-based screening and management strategies can improve care by reducing unnecessary tests and procedures in low-risk patients and focusing resources on high-risk patients. Knowledge of screening and management guidelines is important to improve adherence and avoid both over- and under-use of screening and colposcopy.

      5. Association of greenness with blood pressure among individuals with type 2 diabetes across rural to urban community types in Pennsylvania, USA
        Poulsen MN, Schwartz BS, Nordberg C, DeWalle J, Pollak J, Imperatore G, Mercado CI, Siegel KR, Hirsch AG.
        Int J Environ Res Public Health. 2021 Jan 13;18(2).
        Greenness may impact blood pressure (BP), though evidence is limited among individuals with type 2 diabetes (T2D), for whom BP management is critical. We evaluated associations of residential greenness with BP among individuals with T2D in geographically diverse communities in Pennsylvania. To address variation in greenness type, we evaluated modification of associations by percent forest. We obtained systolic (SBP) and diastolic (DBP) BP measurements from medical records of 9593 individuals following diabetes diagnosis. Proximate greenness was estimated within 1250-m buffers surrounding individuals' residences using the normalized difference vegetation index (NDVI) prior to blood pressure measurement. Percent forest was calculated using the U.S. National Land Cover Database. Linear mixed models with robust standard errors accounted for spatial clustering; models were stratified by community type (townships/boroughs/cities). In townships, the greenest communities, an interquartile range increase in NDVI was associated with reductions in SBP of 0.87 mmHg (95% CI: -1.43, -0.30) and in DBP of 0.41 mmHg (95% CI: -0.78, -0.05). No significant associations were observed in boroughs or cities. Evidence for modification by percent forest was weak. Findings suggest a threshold effect whereby high greenness may be necessary to influence BP in this population and support a slight beneficial impact of greenness on cardiovascular disease risk.

    • Communicable Diseases
      1. Molecular epidemiological analysis of the origin and transmission dynamics of the HIV-1 CRF01_AE sub-epidemic in Bulgaria
        Alexiev I, Campbell EM, Knyazev S, Pan Y, Grigorova L, Dimitrova R, Partsuneva A, Gancheva A, Kostadinova A, Seguin-Devaux C, Elenkov I, Yancheva N, Switzer WM.
        Viruses. 2021 Jan 16;13(1).
        HIV-1 subtype CRF01_AE is the second most predominant strain in Bulgaria, yet little is known about the molecular epidemiology of its origin and transmissibility. We used a phylodynamics approach to better understand this sub-epidemic by analyzing 270 HIV-1 polymerase (pol) sequences collected from persons diagnosed with HIV/AIDS between 1995 and 2019. Using network analyses at a 1.5% genetic distance threshold (d), we found a large 154-member outbreak cluster composed mostly of persons who inject drugs (PWID) who were predominantly men. At d = 0.5%, which was used to identify more recent transmission, the large cluster dissociated into three clusters of 18, 12, and 7 members, respectively, five dyads, and 107 singletons. Phylogenetic analysis of the Bulgarian sequences with publicly available global sequences showed that CRF01_AE likely originated from multiple Asian countries, with Vietnam as the likely source of the outbreak cluster between 1988 and 1990. Our findings indicate that CRF01_AE was introduced into Bulgaria multiple times since 1988, and infections then rapidly spread among PWID locally with bridging to other risk groups and countries. CRF01_AE continues to spread in Bulgaria as evidenced by the more recent large clusters identified at d = 0.5%, highlighting the importance of public health prevention efforts in the PWID communities.
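        The network analysis summarized above links sequences whose pairwise genetic distance falls at or below a threshold d and treats connected components as putative transmission clusters. A hedged Python sketch of that idea using union-find on toy distances (not real pol sequence data; the actual study used established phylodynamic tools):

```python
# Threshold-based cluster detection: sequences with pairwise genetic
# distance <= d are linked; connected components form clusters.
# Pairs absent from the distance map are treated as unlinked.
def clusters(distances, n, d):
    """Union-find over n sequences; distances maps (i, j) -> distance."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for (i, j), dist in distances.items():
        if dist <= d:
            parent[find(i)] = find(j)  # union the two components
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values(), key=len, reverse=True)

# Toy example: 5 sequences linked in a chain of varying distances.
dist = {(0, 1): 0.004, (1, 2): 0.012, (2, 3): 0.003, (3, 4): 0.014}
print(clusters(dist, 5, 0.015))  # one 5-member component
print(clusters(dist, 5, 0.005))  # two pairs plus a singleton
```

        Lowering d from 1.5% to 0.5% prunes the longer links, so one large component can dissociate into smaller clusters and singletons, mirroring the pattern described in the abstract.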

      2. Population-based estimates of COVID-19-like illness, COVID-19 illness, and rates of case ascertainment, hospitalizations, and deaths - non-institutionalized New York City residents, March-April 2020
        Alroy KA, Crossa A, Dominianni C, Sell J, Bartley K, Sanderson M, Fernandez S, Levanon Seligson A, Lim SW, Wang SM, Dumas SE, Perlman SE, Konty K, Olson DR, Gould LH, Greene SK.
        Clin Infect Dis. 2021 Jan 18.
        Based on a population-based, representative telephone survey, an estimated ~930,000 New York City residents had COVID-19 illness beginning March 20-April 30, 2020, a period with limited testing. For every 1000 persons estimated with COVID-19 illness, 141.8 were tested and reported as cases, 36.8 were hospitalized, and 12.8 died, varying by demographic characteristics.

      3. Health care coverage and preexposure prophylaxis (PrEP) use among men who have sex with men living in 22 US cities with Medicaid expansion, 2017
        Baugher AR, Finlayson T, Lewis R, Sionean C, Whiteman A, Wejnert C.
        Am J Public Health. 2021 Jan 21:e1-e9.
        Objectives. To compare health care coverage and utilization between men who have sex with men (MSM) in Medicaid expansion versus nonexpansion states. Methods. We used cross-sectional weighted data from the National HIV Behavioral Surveillance system, which used venue-based methods to interview and test MSM in 22 US cities from June through December 2017 (n = 8857). We compared MSM in Medicaid expansion versus nonexpansion states by using the Rao-Scott χ(2) test stratified by HIV status. We used multivariable logistic regression to model the relationship between Medicaid expansion, coverage, and preexposure prophylaxis (PrEP) use. Results. MSM in expansion states were more likely to have insurance (87.9% vs 71.6%), have Medicaid (21.3% vs 3.8%), discuss PrEP with a provider (58.8% vs 44.3%), or use PrEP (31.1% vs 17.5%). Conclusions. Medicaid expansion is associated with higher coverage and care, including PrEP. Public Health Implications. States may consider expanding Medicaid to help end the HIV epidemic. (Am J Public Health. Published online ahead of print January 21, 2021: e1-e9.)

      4. Integrated TB and HIV care for Mozambican children: temporal trends, site-level determinants of performance, and recommendations for improved TB preventive treatment
        Buck WC, Nguyen H, Siapka M, Basu L, Greenberg Cowan J, De Deus MI, Gleason M, Ferreira F, Xavier C, Jose B, Muthemba C, Simione B, Kerndt P.
        AIDS Res Ther. 2021 Jan 9;18(1):3.
        BACKGROUND: Pediatric tuberculosis (TB), human immunodeficiency virus (HIV), and TB-HIV co-infection are health problems with evidence-based diagnostic and treatment algorithms that can reduce morbidity and mortality. Implementation and operational barriers affect adherence to guidelines in many resource-constrained settings, negatively affecting patient outcomes. This study aimed to assess performance in the pediatric HIV and TB care cascades in Mozambique. METHODS: A retrospective analysis of routine PEPFAR site-level HIV and TB data from 2012 to 2016 was performed. Patients 0-14 years of age were included. Descriptive statistics were used to report trends in TB and HIV indicators. Linear regression was done to assess associations of site-level variables with performance in the pediatric TB and HIV care cascades using 2016 data. RESULTS: Routine HIV testing and cotrimoxazole initiation for co-infected children in the TB program were nearly optimal at 99% and 96% in 2016, respectively. Antiretroviral therapy (ART) initiation was lower at 87%, but steadily improved from 2012 to 2016. From the HIV program, TB screening at the last consultation rose steadily over the study period, reaching 82% in 2016. The percentage of newly enrolled children who received either TB treatment or isoniazid preventive treatment (IPT) also steadily improved in all provinces, but in 2016 was only at 42% nationally. Larger volume sites were significantly more likely to complete the pediatric HIV and TB care cascades in 2016 (p value range 0.05 to < 0.001). CONCLUSIONS: Mozambique has made significant strides in improving the pediatric care cascades for children with TB and HIV, but there were missed opportunities for TB diagnosis and prevention, with IPT utilization being particularly problematic. Strengthened TB/HIV programming that continues to focus on pediatric ART scale-up while improving delivery of TB preventive therapy, either with IPT or newer rifapentine-based regimens for age-eligible children, is needed.

      5. Norovirus and other viral causes of medically attended acute gastroenteritis across the age spectrum: Results from the MAAGE Study in the United States
        Burke RM, Mattison C, Marsh Z, Shioda K, Donald J, Salas SB, Naleway AL, Biggs C, Schmidt MA, Hall AJ.
        Clin Infect Dis. 2021 Jan 21.
        BACKGROUND: Acute gastroenteritis (AGE) causes a substantial burden in the United States, but its etiology frequently remains undetermined. Active surveillance within an integrated healthcare delivery system was used to estimate the prevalence and incidence of medically attended norovirus, rotavirus, sapovirus, and astrovirus. METHODS: Active surveillance was conducted among all enrolled members of Kaiser Permanente Northwest during July 2014 - June 2016. An age-stratified, representative sample of AGE-associated medical encounters were recruited to provide a stool specimen to be tested for norovirus, rotavirus, sapovirus, and astrovirus. Medically attended AGE (MAAGE) encounters for a patient occurring within 30 days were grouped into one episode, and all-cause MAAGE incidence was calculated. Pathogen- and healthcare setting-specific incidence estimates were calculated using age-stratified bootstrapping. RESULTS: The overall incidence of MAAGE was 40.6 episodes per 1000 person-years (PY), with most episodes requiring no more than outpatient care. Norovirus was the most frequently detected pathogen, with an incidence of 5.5 medically attended episodes per 1000 PY. Incidence of norovirus MAAGE was highest among children aged <5 years (20.4 episodes per 1000 PY), followed by adults aged ≥65 years (4.5 episodes per 1000 PY). Other study pathogens showed similar patterns by age, but lower overall incidence (sapovirus: 2.4 per 1000 PY, astrovirus: 1.3 per 1000 PY, rotavirus: 0.5 per 1000 PY). CONCLUSIONS: Viral enteropathogens, particularly norovirus, are an important contributor to MAAGE, especially among children <5 years of age. The present findings underline the importance of judicious antibiotics use for pediatric AGE and suggest that an effective norovirus vaccine could substantially reduce MAAGE.
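        The episode definition used above (MAAGE encounters within 30 days grouped into one episode) can be sketched in a few lines; this assumes the 30-day window is measured from the first encounter of the episode, which is one plausible reading of the rule:

```python
# Group a patient's AGE encounters into episodes: an encounter within
# 30 days of the current episode's first encounter belongs to that
# episode; otherwise it starts a new one. Dates are illustrative.
from datetime import date

def count_episodes(encounter_dates, window_days=30):
    episodes = 0
    episode_start = None
    for day in sorted(encounter_dates):
        if episode_start is None or (day - episode_start).days > window_days:
            episodes += 1
            episode_start = day
    return episodes

visits = [date(2015, 1, 3), date(2015, 1, 20), date(2015, 3, 1)]
print(count_episodes(visits))  # the two January visits form one episode
```

        Incidence per 1,000 person-years then follows as total episodes divided by summed enrollment time, multiplied by 1,000; the study's age-stratified bootstrapping adds uncertainty intervals on top of that ratio.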

      6. Prospective cohort study of children with suspected SARS-CoV-2 infection presenting to paediatric emergency departments: a Paediatric Emergency Research Networks (PERN) Study Protocol
        Funk AL, Florin TA, Dalziel SR, Mintegi S, Salvadori MI, Tancredi DJ, Neuman MI, Payne DC, Plint AC, Klassen TP, Malley R, Ambroggio L, Kim K, Kuppermann N, Freedman SB.
        BMJ Open. 2021 Jan 15;11(1):e042121.
        INTRODUCTION: Relatively limited data are available regarding paediatric COVID-19. Although most children appear to have mild or asymptomatic infections, infants and those with comorbidities are at increased risk of experiencing more severe illness and requiring hospitalisation due to COVID-19. The recent but uncommon association of SARS-CoV-2 infection with development of a multisystem inflammatory syndrome has heightened the importance of understanding paediatric SARS-CoV-2 infection. METHODS AND ANALYSIS: The Paediatric Emergency Research Network-COVID-19 cohort study is a rapid, global, prospective cohort study enrolling 12 500 children who are tested for acute SARS-CoV-2 infection. 47 emergency departments across 12 countries on four continents will participate. At enrolment, regardless of SARS-CoV-2 test results, all children will have the same information collected, including clinical, epidemiological, laboratory, imaging and outcome data. Interventions and outcome data will be collected for hospitalised children. For all children, follow-up at 14 and 90 days will collect information on further medical care received, and long-term sequelae, respectively. Statistical models will be designed to identify risk factors for infection and severe outcomes. ETHICS AND DISSEMINATION: Sites will seek ethical approval locally, and informed consent will be obtained. There is no direct risk or benefit of study participation. Weekly interim analysis will allow for real-time data sharing with regional, national, and international policy makers. Harmonisation and sharing of investigation materials with WHO will contribute to synergising global efforts for the clinical characterisation of paediatric COVID-19. Our findings will enable the implementation of countermeasures to reduce viral transmission and severe COVID-19 outcomes in children. TRIAL REGISTRATION NUMBER: NCT04330261.

      7. Emergence of SARS-CoV-2 B.1.1.7 lineage - United States, December 29, 2020-January 12, 2021
        Galloway SE, Paul P, MacCannell DR, Johansson MA, Brooks JT, MacNeil A, Slayton RB, Tong S, Silk BJ, Armstrong GL, Biggerstaff M, Dugan VG.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 22;70(3):95-99.
        On December 14, 2020, the United Kingdom reported a SARS-CoV-2 variant of concern (VOC), lineage B.1.1.7, also referred to as VOC 202012/01 or 20I/501Y.V1. The B.1.1.7 variant is estimated to have emerged in September 2020 and has quickly become the dominant circulating SARS-CoV-2 variant in England (1). B.1.1.7 has been detected in over 30 countries, including the United States. As of January 13, 2021, approximately 76 cases of B.1.1.7 have been detected in 12 U.S. states. Multiple lines of evidence indicate that B.1.1.7 is more efficiently transmitted than are other SARS-CoV-2 variants (1-3). The modeled trajectory of this variant in the U.S. exhibits rapid growth in early 2021, becoming the predominant variant in March. Increased SARS-CoV-2 transmission might threaten strained health care resources, require extended and more rigorous implementation of public health strategies (4), and increase the percentage of population immunity required for pandemic control. Taking measures to reduce transmission now can lessen the potential impact of B.1.1.7 and allow critical time to increase vaccination coverage. Collectively, enhanced genomic surveillance combined with continued compliance with effective public health measures, including vaccination, physical distancing, use of masks, hand hygiene, and isolation and quarantine, will be essential to limiting the spread of SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19). Strategic testing of persons without symptoms but at higher risk of infection, such as those exposed to SARS-CoV-2 or who have frequent unavoidable contact with the public, provides another opportunity to limit ongoing spread.

      8. SARS-CoV-2 serologic assay needs for the next phase of the US COVID-19 pandemic response
        Gundlapalli AV, Salerno RM, Brooks JT, Averhoff F, Petersen LR, McDonald LC, Iademarco MF.
        Open Forum Infect Dis. 2021 Jan;8(1):ofaa555.
        BACKGROUND: There is a need for validated and standardized severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) quantitative immunoglobulin G (IgG) and neutralization assays that can be used to understand the immunology and pathogenesis of SARS-CoV-2 infection and support the coronavirus disease 2019 (COVID-19) pandemic response. METHODS: Literature searches were conducted to identify English language publications from peer-reviewed journals and preprints from January 2020 through November 6, 2020. Relevant publications were reviewed for mention of IgG or neutralization assays for SARS-CoV-2, or both, and the methods of reporting assay results. RESULTS: Quantitative SARS-CoV-2 IgG results have been reported from a limited number of studies; most studies used in-house laboratory-developed tests in limited settings, and only two semiquantitative tests have received US Food and Drug Administration (FDA) Emergency Use Authorization (EUA). As of November 6, 2020, there is only one SARS-CoV-2 neutralization assay with FDA EUA. Relatively few studies have attempted correlation of quantitative IgG titers with neutralization results to estimate surrogates of protection. The number of individuals tested is small compared with the magnitude of the pandemic, and persons tested are not representative of disproportionately affected populations. Methods of reporting quantitative results are not standardized to enable comparisons and meta-analyses. CONCLUSIONS: Lack of standardized SARS-CoV-2 quantitative IgG and neutralization assays precludes comparison of results from published studies. Interassay and interlaboratory validation and standardization of assays will support efforts to better understand antibody kinetics and longevity of humoral immune responses postillness, surrogates of immune protection, and vaccine immunogenicity and efficacy. Public-private partnerships could facilitate realization of these advances in the United States and worldwide.

      9. COVID-19 trends among persons aged 0-24 years - United States, March 1-December 12, 2020
        Leidman E, Duca LM, Omura JD, Proia K, Stephens JW, Sauber-Schatz EK.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 22;70(3):88-94.
        Coronavirus disease 2019 (COVID-19) case and electronic laboratory data reported to CDC were analyzed to describe demographic characteristics, underlying health conditions, and clinical outcomes, as well as trends in laboratory-confirmed COVID-19 incidence and testing volume among U.S. children, adolescents, and young adults (persons aged 0-24 years). This analysis provides a critical update and expansion of previously published data, to include trends after fall school reopenings, and adds preschool-aged children (0-4 years) and college-aged young adults (18-24 years) (1). Among children, adolescents, and young adults, weekly incidence (cases per 100,000 persons) increased with age and was highest during the final week of the review period (the week of December 6) among all age groups. Time trends in weekly reported incidence for children and adolescents aged 0-17 years tracked consistently with trends observed among adults since June, with both incidence and positive test results tending to increase since September after summer declines. Reported incidence and positive test results among children aged 0-10 years were consistently lower than those in older age groups. To reduce community transmission, which will support schools in operating more safely for in-person learning, communities and schools should fully implement and strictly adhere to recommended mitigation strategies, especially universal and proper masking, to reduce COVID-19 incidence.

      10. Toxic shock syndrome in patients younger than 21 years of age, United States, 2006-2018
        Leung J, Abrams JY, Maddox RA, Godfred-Cato S, Schonberger LB, Belay ED.
        Pediatr Infect Dis J. 2021 Jan 12.
        We examined the incidence of toxic shock syndrome in the United States during 2006-2018 among persons <21 years old with commercial or Medicaid insurance using administrative data. There were 1008 commercially insured and 481 Medicaid-insured toxic shock syndrome cases. The annual rate was 1 per 100,000 and stable over time. Rates were even lower in children <5 years old and stable over time.

      11. Guillain-Barré syndrome and antecedent cytomegalovirus infection, USA 2009-2015
        Leung J, Sejvar JJ, Soares J, Lanzieri TM.
        Neurol Sci. 2020 Apr;41(4):885-891.
        OBJECTIVE: To describe incidence and clinical characteristics of cases of Guillain-Barré syndrome (GBS) in the USA during 2009-2015, and characteristics of GBS cases with antecedent cytomegalovirus (CMV) infection among persons with employer-sponsored insurance. METHODS: We analyzed medical claims from IBM Watson MarketScan® databases. GBS patients were defined as enrollees with an inpatient claim with GBS as the principal diagnosis code, based on ICD-9 or ICD-10, and ≥ 1 claim for lumbar puncture or EMG/nerve conduction study. We assessed intensive care unit (ICU) hospitalization, intubation, dysautonomia, and death. We also assessed selected infectious illness within 60 days prior to the first GBS-coded inpatient claim. RESULTS: We identified 3486 GBS patients; annual incidence was 1.0-1.2/100,000 persons during 2009-2015. GBS incidence was higher in males (1.2/100,000) than in females (0.9/100,000) (p = 0.006) and increased with age, from 0.4/100,000 in persons 0-17 years old to 2.1/100,000 in persons ≥ 65 years old (p < 0.001). Half of GBS patients were hospitalized in the ICU, 8% were intubated, 2% developed dysautonomia, and 1% died. Half had a claim for antecedent illness, but only 125 (3.5%) had a claim for specific infectious pathogens. The mean age among 18 GBS patients with antecedent CMV infection was 39 years versus 47 years among those without antecedent CMV infection (p = 0.038). CONCLUSIONS: Incidence of GBS using a large national claims database was comparable to that reported in the literature, but cases appeared to be less severe. Half of GBS patients reported prior infectious illness, but only a minority had a specific pathogen identified.

      12. Evaluation of Abbott BinaxNOW rapid antigen test for SARS-CoV-2 infection at two community-based testing sites - Pima County, Arizona, November 3-17, 2020
        Prince-Guerra JL, Almendares O, Nolen LD, Gunn JK, Dale AP, Buono SA, Deutsch-Feldman M, Suppiah S, Hao L, Zeng Y, Stevens VA, Knipe K, Pompey J, Atherstone C, Bui DP, Powell T, Tamin A, Harcourt JL, Shewmaker PL, Medrzycki M, Wong P, Jain S, Tejada-Strop A, Rogers S, Emery B, Wang H, Petway M, Bohannon C, Folster JM, MacNeil A, Salerno R, Kuhnert-Tallman W, Tate JE, Thornburg NJ, Kirking HL, Sheiban K, Kudrna J, Cullen T, Komatsu KK, Villanueva JM, Rose DA, Neatherlin JC, Anderson M, Rota PA, Honein MA, Bower WA.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 22;70(3):100-105.
        Rapid antigen tests, such as the Abbott BinaxNOW COVID-19 Ag Card (BinaxNOW), offer results more rapidly (approximately 15-30 minutes) and at a lower cost than do highly sensitive nucleic acid amplification tests (NAATs) (1). Rapid antigen tests have received Food and Drug Administration (FDA) Emergency Use Authorization (EUA) for use in symptomatic persons (2), but data are lacking on test performance in asymptomatic persons to inform expanded screening testing to rapidly identify and isolate infected persons (3). To evaluate the performance of the BinaxNOW rapid antigen test, it was used along with real-time reverse transcription-polymerase chain reaction (RT-PCR) testing to analyze 3,419 paired specimens collected from persons aged ≥10 years at two community testing sites in Pima County, Arizona, during November 3-17, 2020. Viral culture was performed on 274 of 303 residual real-time RT-PCR specimens with positive results by either test (29 were not available for culture). Compared with real-time RT-PCR testing, the BinaxNOW antigen test had a sensitivity of 64.2% for specimens from symptomatic persons and 35.8% for specimens from asymptomatic persons, with near 100% specificity in specimens from both groups. Virus was cultured from 96 of 274 (35.0%) specimens, including 85 (57.8%) of 147 with concordant antigen and real-time RT-PCR positive results, 11 (8.9%) of 124 with false-negative antigen test results, and none of three with false-positive antigen test results. Among specimens positive for viral culture, sensitivity was 92.6% for symptomatic and 78.6% for asymptomatic individuals. When the pretest probability for receiving positive test results for SARS-CoV-2 is elevated (e.g., in symptomatic persons or in persons with a known COVID-19 exposure), a negative antigen test result should be confirmed by NAAT (1). Despite a lower sensitivity to detect infection, rapid antigen tests can be an important tool for screening because of their quick turnaround time, lower costs and resource needs, high specificity, and high positive predictive value (PPV) in settings of high pretest probability. The faster turnaround time of the antigen test can help limit transmission by more rapidly identifying infectious persons for isolation, particularly when used as a component of serial testing strategies.

      13. Epidemiology of cytomegalovirus infection among mothers and infants in Colombia
        Rico A, Dollard SC, Valencia D, Corchuelo S, Tong V, Laiton-Donato K, Amin MM, Benavides M, Wong P, Newton S, Daza M, Cates J, Gonzalez M, Zambrano LD, Mercado M, Ailes EC, Rodriguez H, Gilboa SM, Acosta J, Ricaldi J, Pelaez D, Honein MA, Ospina ML, Lanzieri TM.
        J Med Virol. 2021 Jan 21.
        We assessed maternal and infant cytomegalovirus (CMV) infection in Colombia. Maternal serum was tested for CMV immunoglobulin G antibodies at a median of 10 (interquartile range: 8-12) weeks gestation (n=1,501). CMV DNA polymerase chain reaction was performed on infant urine to diagnose congenital (≤21 days of life) and postnatal (>21 days) infection. Maternal CMV seroprevalence was 98.1% (95% confidence interval [CI]: 97.5-98.8%). Congenital CMV prevalence was 8.4 (95% CI: 3.9-18.3; 6/711) per 1,000 live births. Among 472 infants without confirmed congenital CMV infection subsequently tested at age 6 months, 258 (54.7%, 95% CI: 50.2%-59.1%) had postnatal infection.
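The reported congenital CMV prevalence and its confidence interval can be recovered from the raw counts (6 cases among 711 live births). A sketch using the Wilson score interval, which reproduces the abstract's 95% CI under the assumption that this (or a numerically similar) method was used:

```python
from math import sqrt

def wilson_ci(cases, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = cases / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

lo, hi = wilson_ci(6, 711)
# Per 1,000 live births: point estimate 8.4, CI roughly 3.9-18.3
print(round(6 / 711 * 1000, 1), round(lo * 1000, 1), round(hi * 1000, 1))
```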

      14. COVID-19 case investigation and contact tracing efforts from health departments - United States, June 25-July 24, 2020external icon
        Spencer KD, Chung CL, Stargel A, Shultz A, Thorpe PG, Carter MW, Taylor MM, McFarlane M, Rose D, Honein MA, Walke H.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 22;70(3):83-87.
        Case investigation and contact tracing are core public health tools used to interrupt transmission of pathogens, including SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19); timeliness is critical to effectiveness (1,2). In May 2020, CDC funded* 64 state, local, and territorial health departments(†) to support COVID-19 response activities. As part of the monitoring process, case investigation and contact tracing metrics for June 25-July 24, 2020, were submitted to CDC by 62 health departments. Descriptive analyses of case investigation and contact tracing load, timeliness, and yield (i.e., the number of contacts elicited divided by the number of patients prioritized for interview) were performed. A median of 57% of patients were interviewed within 24 hours of report of the case to a health department (interquartile range [IQR] = 27%-82%); a median of 1.15 contacts were identified per patient prioritized for interview(§) (IQR = 0.62-1.76), and a median of 55% of contacts were notified within 24 hours of identification by a patient (IQR = 32%-79%). With higher caseloads, the percentage of patients interviewed within 24 hours of case report was lower (Spearman coefficient = -0.68), and the number of contacts identified per patient prioritized for interview also decreased (Spearman coefficient = -0.60). The capacity to conduct timely contact tracing varied among health departments, largely driven by investigators' caseloads. Incomplete identification of contacts affects the ability to reduce transmission of SARS-CoV-2. Enhanced staffing capacity and improved community engagement could lead to more timely interviews and identification of more contacts.
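The timeliness and yield metrics described above are straightforward to compute from case-level records; a sketch over hypothetical data (the record layout is illustrative, not the health departments' actual reporting schema):

```python
from datetime import datetime, timedelta

# Hypothetical case investigation records: when the case was reported to the
# health department, when the patient was interviewed, contacts elicited.
cases = [
    {"reported": datetime(2020, 7, 1, 9), "interviewed": datetime(2020, 7, 1, 20), "contacts": 2},
    {"reported": datetime(2020, 7, 1, 9), "interviewed": datetime(2020, 7, 3, 10), "contacts": 0},
    {"reported": datetime(2020, 7, 2, 8), "interviewed": datetime(2020, 7, 2, 15), "contacts": 1},
]

# Timeliness: share of patients interviewed within 24 hours of case report
within_24h = sum(
    c["interviewed"] - c["reported"] <= timedelta(hours=24) for c in cases
) / len(cases)

# Yield: contacts elicited per patient prioritized for interview
yield_per_patient = sum(c["contacts"] for c in cases) / len(cases)
```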

      15. Risk factors and clinical profile of sapovirus-associated acute gastroenteritis in early childhood: A Nicaraguan birth cohort studyexternal icon
        Vielot NA, González F, Reyes Y, Zepeda O, Blette B, Paniagua M, Toval-Ruíz C, Diez-Valcarce M, Hudgens MG, Gutiérrez L, Blandón P, Herrera R, Cuadra EC, Bowman N, Vilchez S, Vinjé J, Becker-Dreps S, Bucardo F.
        Pediatr Infect Dis J. 2021 Jan 12.
        BACKGROUND: Sapovirus is increasingly recognized as an important cause of acute gastroenteritis (AGE) in children. We identified risk factors and characterized the clinical profile of sapovirus AGE in a birth cohort in León, Nicaragua. METHODS: We conducted a case-control study nested within a birth cohort (n = 444). Fieldworkers conducted weekly household AGE surveillance. AGE stools were tested for sapovirus by reverse transcriptase quantitative polymerase chain reaction. For each first sapovirus episode, we selected 2 healthy age-matched controls and estimated independent risk factors of sapovirus AGE using conditional logistic regression. We compared clinical characteristics of sapovirus AGE episodes with episodes associated with other etiologies and identified co-infections with other enteric pathogens. RESULTS: From June 2017 to July 2019, we identified 63 first sapovirus AGE episodes and selected 126 controls. Having contact with an individual with AGE symptoms and vaginal delivery were independent risk factors for sapovirus AGE. All cases experienced diarrhea, lasting a median 6 days; 23% experienced vomiting. Compared to children with AGE due to another etiology, sapovirus AGE was similar in severity, with less reported fever. Most cases experienced co-infections and were more likely than controls to be infected with diarrheagenic Escherichia coli or astrovirus. CONCLUSIONS: Sapovirus was a commonly identified AGE etiology in this Central American setting, and symptoms were similar to AGE associated with other etiologies. The association between vaginal delivery and sapovirus is a novel finding. Gut microbiome composition might mediate this relationship, or vaginal delivery might be a proxy for other risk factors. Further investigation into more specific biological mechanisms is warranted.

      16. High prevalence of pulmonary tuberculosis among female sex workers, men who have sex with men, and transgender women in Papua New Guineaexternal icon
        Willie B, Hakim AJ, Badman SG, Weikum D, Narokobi R, Coy K, Gabuzzi J, Pekon S, Gene S, Amos A, Kupul M, Hou P, Dala NM, Whiley DM, Wapling J, Kaldor JM, Vallely AJ, Kelly-Hanku A.
        Trop Med Health. 2021 Jan 13;49(1):4.
        BACKGROUND: Papua New Guinea (PNG) had a tuberculosis (TB) case notification rate of 333 cases per 100,000 population in 2016 and is one of the 14 countries classified by the World Health Organization (WHO) as "high-burden" for TB, multi-drug-resistant TB (MDR-TB), and TB/HIV. The country's HIV epidemic is mixed, with higher prevalence among key populations: female sex workers (FSW), men who have sex with men (MSM), and transgender women (TGW). METHODS: We conducted a cross-sectional HIV biobehavioral survey (BBS) using a respondent-driven sampling method among FSW, MSM, and TGW in Port Moresby, Lae, and Mt. Hagen (2016-2017). As part of the study, participants were screened for the four symptoms suggestive of TB infection using the WHO TB screening algorithm. Sputum and venous whole blood samples were collected and tested for pulmonary TB and HIV infection, respectively. Pulmonary TB testing was performed using the GeneXpert®MTB/RIF molecular point-of-care test, and HIV testing was done following the PNG national HIV testing algorithm. All data discussed are weighted unless otherwise mentioned. RESULTS: Among FSW, 72.6%, 52.0%, and 52.9% in Port Moresby, Lae, and Mt. Hagen, respectively, experienced at least one symptom suggestive of TB infection. Among MSM and TGW, 69% and 52.6% in Port Moresby and Lae, respectively, experienced at least one symptom suggestive of TB infection. Based on GeneXpert®MTB/RIF results, the estimated TB prevalence rate among FSW was 1200, 700, and 200 per 100,000 in Port Moresby, Lae, and Mt. Hagen, respectively. Among MSM and TGW, the estimated TB prevalence rate was 1000 and 1200 per 100,000 in Port Moresby and Lae, respectively. Co-prevalence of TB/HIV among FSW was 0.1% in Port Moresby and 0.2% in Lae. There were no co-prevalent cases among FSW in Mt. Hagen or among MSM and TGW in Port Moresby and Lae. CONCLUSIONS: Key populations have a higher estimated rate of pulmonary TB than the national rate of pulmonary and extra-pulmonary TB combined. These findings suggest that PNG's national TB response should integrate TB screening for key populations into HIV programs, regardless of HIV status.

      17. Viral suppression and factors associated with failure to achieve viral suppression among pregnant women in South Africaexternal icon
        Woldesenbet SA, Kufa T, Barron P, Chirombo BC, Cheyip M, Ayalew K, Lombard C, Manda S, Diallo K, Pillay Y, Puren AJ.
        Aids. 2020 Mar 15;34(4):589-597.
        OBJECTIVE: To describe viral load levels among pregnant women and factors associated with failure to achieve viral suppression (viral load ≤50 copies/ml) during pregnancy. DESIGN: Between 1 October and 15 November 2017, a cross-sectional survey was conducted among 15-49-year-old pregnant women attending antenatal care (ANC) at 1595 nationally representative public facilities. METHODS: Blood specimens were taken from each pregnant woman and tested for HIV. Viral load testing was done on all HIV-positive specimens. Demographic and clinical data were extracted from medical records or self-reported. Survey logistic regression examined factors associated with failure to achieve viral suppression. RESULTS: Of 10 052 HIV-positive participants with viral load data, 56.2% were virally suppressed. Participants initiating antiretroviral therapy (ART) prior to pregnancy had higher viral suppression (71.0%) by their third trimester compared with participants initiating ART during pregnancy (59.3%). Booking for ANC during the third trimester vs. earlier [adjusted odds ratio (AOR) 1.8, 95% confidence interval (CI): 1.4-2.3], low frequency of ANC visits (AOR for 2 ANC visits vs. ≥4 ANC visits: 2.0, 95% CI: 1.7-2.4), delayed initiation of ART (AOR for ART initiated in the second trimester vs. before pregnancy: 2.2, 95% CI: 1.8-2.7), and younger age (AOR for 15-24 vs. 35-49 years: 1.4, 95% CI: 1.2-1.8) were associated with failure to achieve viral suppression during the third trimester. CONCLUSION: Failure to achieve viral suppression was primarily associated with late ANC booking and late initiation of ART. Efforts to improve early ANC booking and early ART initiation in the general population would help improve viral suppression rates among pregnant women. In addition, the study found that, despite initiating ART prior to pregnancy, more than one quarter of participants did not achieve viral suppression by their third trimester. This highlights the need to closely monitor viral load and strengthen counselling and support services for ART adherence.

    • Disaster Control and Emergency Services
      1. A workshop on cognitive aging and impairment in the 9/11-exposed populationexternal icon
        Daniels RD, Clouston SA, Hall CB, Anderson KR, Bennett DA, Bromet EJ, Calvert GM, Carreón T, DeKosky ST, Diminich ED, Finch CE, Gandy S, Kreisl WC, Kritikos M, Kubale TL, Mielke MM, Peskind ER, Raskind MA, Richards M, Sano M, Santiago-Colón A, Sloan RP, Spiro A, Vasdev N, Luft BJ, Reissman DB.
        Int J Environ Res Public Health. 2021 Jan 14;18(2).
        The terrorist attacks on 11 September 2001 potentially exposed more than 400,000 responders, workers, and residents to psychological and physical stressors, and numerous hazardous pollutants. In 2011, the World Trade Center Health Program (WTCHP) was mandated to monitor and treat persons with 9/11-related adverse health conditions and conduct research on physical and mental health conditions related to the attacks. Emerging evidence suggests that persons exposed to 9/11 may be at increased risk of developing mild cognitive impairment. To investigate further, the WTCHP convened a scientific workshop that examined the natural history of cognitive aging and impairment, biomarkers in the pathway of neurodegenerative diseases, the neuropathological changes associated with hazardous exposures, and the evidence of cognitive decline and impairment in the 9/11-exposed population. Invited participants included scientists actively involved in health-effects research of 9/11-exposed persons and other at-risk populations. Attendees shared relevant research results from their respective programs and discussed several options for enhancements to research and surveillance activities, including the development of a multi-institutional collaborative research network. The goal of this report is to outline the meeting's agenda and provide an overview of the presentation materials and group discussion.

      2. Public health emergency management capacity building in Guinea: 2014-2019external icon
        Martel LD, Phipps M, Traore A, Standley CJ, Soumah ML, Lamah A, Wone A, Asima M, Barry AM, Berete M, Attal-Juncqua A, Katz R, Robert A, Sompare I, Sorrell EM, Toure Y, Morel-Vulliez A, Keita S.
        Int J Emerg Manag. 2020 ;16(2):179-200.
        Before the Ebola virus disease (EVD) outbreak of 2014-2016, Guinea did not have an emergency management system in place. During the outbreak, Global Health Security Agenda (GHSA) 2014-2019 funds made it possible to rapidly improve the country's capacity to manage epidemics through the development of public health emergency operation centres (PHEOCs) at the national and district levels. Since the end of the response, the infrastructure, staff, and systems of these PHEOCs have been further reinforced and well integrated into the daily activities of Guinea's National Agency for Health Security, the entity responsible for the management of epidemics. The development of PHEOCs as emergency management tools for epidemics in Guinea would not have been possible without strong endorsement within the Ministry of Health. Guinea's PHEOC network is well positioned to serve as a model of excellence for other Ministries in Guinea and Ministries of Health of other countries of West Africa.

    • Disease Reservoirs and Vectors
      1. We identified an established population of the Gulf Coast tick (Amblyomma maculatum Koch) infected with Rickettsia parkeri in Connecticut, representing the northernmost range limit of this medically relevant tick species. Our finding highlights the importance of tick surveillance and public health challenges posed by geographic expansion of tick vectors and their pathogens.

    • Environmental Health
      1. Backpack use as an alternative water transport method in Kisumu, Kenyaexternal icon
        Kim S, Curran K, Deng L, Odhiambo A, Oremo J, Otieno R, Omore R, Handzel T, Quick R.
        J Water Sanit Hyg Develop. 2020 ;10(4):986-995.
        In developing countries, most households transport water from distant sources, placing physical burdens on women and children, who commonly carry water on their heads. A lightweight backpack was developed to alleviate physical stress from water carriage and provide a safe storage container. In 2015, we conducted a baseline survey among 251 Kenyan households with children <5 years old, distributed one backpack per household, and made 6 monthly home visits to ask about backpack use. At baseline, the median reported water collection time was 40 minutes/round trip; 80% of households reported collecting water daily (median 3 times/day). At follow-up visits, reported backpack use to carry water in the previous day ranged from 4% to 20%; reported backpack use for water storage in the previous day ranged from 31% to 67%. Pain from water carriage was reported at 9% of all follow-up visits. The odds of backpack use in the past day to collect water were lower during the rainy season (OR: 0.3, 95% CI: 0.2–0.3) and not associated with reported pain (OR: 1.7, 95% CI: 0.9–3.3). Our study suggests that participants preferred using the backpacks for storage rather than transport of water. Further dissemination of the backpacks is not recommended because of modest use for transport.

      2. Impact of community health promoters on awareness of a rural social marketing program, purchase and use of health products, and disease risk, Kenya, 2014–2016external icon
        Kim S, Laughlin M, Morris J, Otieno R, Odhiambo A, Oremo J, Graham J, Hirai M, Wells E, Basler C, Okello A, Matanock A, Eleveld A, Quick R.
        J Water Sanit Hyg Develop. 2020 ;10(4):940-950.
        The Safe Water and AIDS Project (SWAP), a non-governmental organization in western Kenya, opened kiosks run as businesses by community health promoters (CHPs) to increase access to health products among poor rural families. We conducted a baseline survey in 2014 before kiosks opened, and a post-intervention follow-up in 2016, enrolling 1,517 households with children <18 months old. From baseline to follow-up, we observed increases in reported exposure to the SWAP program (3–11%, p = 0.01) and reported purchases of any SWAP product (3–10%, p < 0.01). The percent of households with confirmed water treatment (detectable free chlorine residual (FCR) >0.2 mg/L) was similar from baseline to follow-up (7% vs. 8%, p = 0.57). The odds of reported diarrhea in children decreased from baseline to follow-up (odds ratio or OR: 0.77, 95% CI: 0.64–0.93), and households with detectable FCR had lower odds of diarrhea (OR: 0.53, 95% CI: 0.34–0.83). Focus group discussions with CHPs suggested that high product prices, lack of affordability, and expectations that products should be free contributed to low sales. In conclusion, modest reported increases in SWAP exposure and product sales in the target population were insufficient to impact health, but children in households confirmed to chlorinate their water had decreased diarrhea.

        Characterizing indoor microbial communities using molecular methods provides insight into bacterial assemblages present in environments that can influence occupants' health. We conducted an environmental assessment as part of an epidemiologic study of 50 elementary schools in a large city in the northeastern USA. We vacuumed dust from the edges of the floor in 500 classrooms, yielding 499 processed dust aliquots for 16S Illumina MiSeq sequencing to characterize bacterial assemblages. DNA sequences were organized into operational taxonomic units (OTUs) and identified using a database derived from the National Center for Biotechnology Information. Bacterial diversity and ecological analyses were performed at the genus level. We identified 29 phyla, 57 classes, 148 orders, 320 families, 1193 genera, and 2045 species in 3073 OTUs. The number of genera per school ranged from 470 to 705. The phylum Proteobacteria was the richest, while Firmicutes was the most abundant. The most abundant orders included Lactobacillales, Spirulinales, and Clostridiales. Halospirulina was the most abundant genus and has not previously been reported in school studies. Gram-negative bacteria were more abundant and richer (relative abundance = 0.53; 1632 OTUs) than gram-positive bacteria (0.47; 1441). Outdoor environment-associated genera were identified in greater abundance in the classrooms, in contrast to homes, where human-associated bacteria are typically more abundant. Effects of school location, degree of water damage, building condition, number of students, air temperature and humidity, floor material, and classroom floor level on bacterial richness or community composition were statistically significant but subtle, indicating relative stability of the classroom microbiome under environmental stress. Our study indicates that classroom floor dust had a characteristic bacterial community different from typical house dust, which is characterized by more gram-positive and human-associated bacteria. Health implications of exposure to the microbiomes in classroom floor dust may differ from those in homes for school staff and students.

        Background: Toluene diisocyanate (TDI) is a highly reactive chemical that causes sensitization and has also been associated with increased lung cancer. A risk assessment was conducted based on occupational epidemiologic estimates for several health outcomes. Methods: Exposure and outcome details were extracted from published studies and a NIOSH Health Hazard Evaluation for new-onset asthma, pulmonary function measurements, symptom prevalence, and mortality from lung cancer and respiratory disease. Summary exposure-response estimates were calculated taking into account relative precision and possible survivor selection effects. Attributable incidence of sensitization was estimated, as were annual proportional losses of pulmonary function. Excess lifetime risks and benchmark doses were calculated. Results: Respiratory outcomes exhibited strong survivor bias. Asthma/sensitization exposure response decreased with increasing facility-average TDI air concentration, as did TDI-associated pulmonary impairment. In a mortality cohort where mean employment duration was less than 1 year, survivor bias pre-empted estimation of lung cancer and respiratory disease exposure response. Conclusions: Controlling for survivor bias and assuming a linear dose-response with facility-average TDI concentrations, excess lifetime risks exceeding one per thousand occurred at about 2 ppt TDI for sensitization and respiratory impairment. Under alternate assumptions regarding stationary and cumulative effects, one-per-thousand excess risks were estimated at TDI concentrations of 10-30 ppt. The unexplained reported excess mortality from lung cancer and other lung diseases, if attributable to TDI or associated emissions, could represent a lifetime risk comparable to that of sensitization.
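Under the linear dose-response assumption stated in the conclusions, the concentration at a benchmark excess lifetime risk is a simple rearrangement; a sketch with an illustrative slope chosen to match the abstract's one-per-thousand figure at about 2 ppt (the slope itself is not given in the abstract):

```python
def benchmark_concentration(target_risk, slope_per_ppt):
    """Concentration (ppt) at which a linear no-threshold dose-response
    model reaches the target excess lifetime risk."""
    return target_risk / slope_per_ppt

# Assumed slope consistent with ~1/1,000 excess risk at ~2 ppt TDI
slope = 1e-3 / 2.0
conc = benchmark_concentration(1e-3, slope)  # 2.0 ppt
```

The alternate 10-30 ppt range in the conclusions corresponds, under the same linear form, to slopes roughly 5 to 15 times shallower.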

    • Genetics and Genomics
      1. Complete genome sequence of Francisella sp. Strain LA11-2445 (FDC406), a novel Francisella species isolated from a human skin lesionexternal icon
        Öhrman C, Uneklint I, Karlsson L, Respicio-Kingry L, Forsman M, Petersen JM, Sjödin A.
        Microbiol Resour Announc. 2021 Jan 14;10(2).
        Here, we present the 2,139,666-bp circular chromosome of Francisella sp. strain LA11-2445 (FDC406), a proposed novel species of Francisella that was isolated from a human cutaneous lesion and is related to Francisella species from marine environments.

    • Health Economics
      1. Cost effectiveness of the Tips From Former Smokers Campaign - U.S., 2012-2018external icon
        Shrestha SS, Davis K, Mann N, Taylor N, Nonnemaker J, Murphy-Hoefer R, Trivers KF, King BA, Babb SD, Armour BS.
        Am J Prev Med. 2021 Jan 14.
        INTRODUCTION: Since 2012, the Centers for Disease Control and Prevention has conducted the national Tips From Former Smokers public education campaign, which motivates smokers to quit by featuring people living with the real-life health consequences of smoking. Cost effectiveness, from the healthcare sector perspective, of the Tips From Former Smokers campaign was compared over 2012-2018 with that of no campaign. METHODS: A combination of survey data from a nationally representative sample of U.S. adults that includes cigarette smokers and literature-based lifetime relapse rates were used to calculate the cumulative number of Tips From Former Smokers campaign‒associated lifetime quits during 2012-2018. Then, lifetime health benefits (premature deaths averted, life years saved, and quality-adjusted life years gained) and healthcare sector cost savings associated with these quits were assessed. All costs were adjusted for inflation in 2018 U.S. dollars. The Tips From Former Smokers campaign was conducted and the survey data were collected during 2012-2018. Analyses were conducted in 2019. RESULTS: During 2012-2018, the Tips From Former Smokers campaign was associated with an estimated 129,100 premature deaths avoided, 803,800 life years gained, 1.38 million quality-adjusted life years gained, and $7.3 billion in healthcare sector cost savings on the basis of an estimated 642,200 campaign-associated lifetime quits. The Tips From Former Smokers campaign was associated with cost savings per lifetime quit of $11,400, per life year gained of $9,100, per premature death avoided of $56,800, and per quality-adjusted life year gained of $5,300. CONCLUSIONS: Mass-reach health education campaigns, such as Tips From Former Smokers, can help smokers quit, improve health outcomes, and potentially reduce healthcare sector costs.
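The cost-effectiveness ratios in the abstract are the total healthcare sector savings divided by each outcome count. Recomputing them is a quick consistency check; small differences from the published figures reflect rounding of the published inputs:

```python
savings = 7.3e9  # healthcare sector cost savings, 2018 USD (from the abstract)

outcomes = {
    "lifetime quit": 642_200,
    "premature death avoided": 129_100,
    "life year gained": 803_800,
    "QALY gained": 1_380_000,
}

for name, count in outcomes.items():
    # e.g., savings per lifetime quit comes out near the published $11,400
    print(f"savings per {name}: ${savings / count:,.0f}")
```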

    • Healthcare Associated Infections
      1. Antimicrobial resistance control efforts in Africa: a survey of the role of Civil Society Organisationsexternal icon
        Fraser JL, Alimi YH, Varma JK, Muraya T, Kujinga T, Carter VK, Schultsz C, Del Rio Vilas VJ.
        Glob Health Action. 2021 Jan 1;14(1):1868055.
        Background: Antimicrobial resistance (AMR) is a growing public health threat in Africa. AMR prevention and control requires coordination across multiple sectors of government and civil society partners. Objectives: To assess the current role, needs, and capacities of civil society organisations (CSOs) working in AMR in Africa. Methods: We conducted an online survey of 35 CSOs working in 37 countries across Africa. The survey asked about priorities for AMR, current AMR-specific activities, monitoring practices, training needs, and preferences for sharing information on AMR. Further data were gathered on the main roles of the organisations, the length of time engaged in and budget spent on AMR-related activities, and their involvement in the development and implementation of National Action Plans (NAPs). Results were assessed against the Africa Centres for Disease Control and Prevention (Africa CDC) Framework for Antimicrobial Resistance (2018-2023). Results: CSOs with AMR-related activities are working in all four areas of Africa CDC's Framework: improving surveillance, delaying emergence, limiting transmission, and mitigating harm from infections caused by AMR microorganisms. Engagement with the four objectives is mainly through advocacy, followed by accountability and service delivery. There were limited monitoring activities reported by CSOs, with only seven (20%) providing an example metric used to monitor their activities related to AMR, and 27 (80%) CSOs reporting having no AMR-related strategy. Half the CSOs reported engaging with the development and implementation of NAPs; however, only three CSOs are aligning their work with these national strategies. Conclusion: CSOs across Africa are supporting AMR prevention and control; however, there is potential for more engagement. Africa CDC and other government agencies should support the training of CSOs in strategies to control AMR.
Tailored training programmes can build knowledge of AMR, capacity for monitoring processes, and facilitate further identification of CSOs' contribution to the AMR Framework and alignment with NAPs and regional strategies.

      2. Association between socioeconomic status and incidence of community-associated Clostridioides difficile infection - United States, 2014-2015external icon
        Skrobarcek KA, Mu Y, Ahern J, Basiliere E, Beldavs ZG, Brousseau G, Dumyati G, Fridkin S, Holzbauer SM, Johnston H, Kainer MA, Meek J, Ocampo VL, Parker E, Perlmutter R, Phipps EC, Winston L, Guh A.
        Clin Infect Dis. 2021 Jan 19.
        We evaluated the association between socioeconomic status (SES) and community-associated Clostridioides difficile infection (CA-CDI) incidence across 2474 census tracts in 10 states. Highly correlated community-level SES variables were transformed into distinct factors using factor analysis. We found low SES communities were associated with higher CA-CDI incidence.

    • Immunity and Immunization
      1. Waning vaccine effectiveness against influenza-associated hospitalizations among adults, 2015-2016 to 2018-2019, US Hospitalized Adult Influenza Vaccine Effectiveness Networkexternal icon
        Ferdinands JM, Gaglani M, Martin ET, Monto AS, Middleton D, Silveira F, Talbot HK, Zimmerman R, Patel M.
        Clin Infect Dis. 2021 Jan 19.
        We observed decreased effectiveness of influenza vaccine with increasing time since vaccination for prevention of influenza A(H3N2), influenza A(H1N1)pdm09, and influenza B(Yamagata)-associated hospitalizations among adults. Maximum vaccine effectiveness (VE) was observed shortly after vaccination, followed by an absolute decline in VE of about 8 to 9% per month post-vaccination.
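The waning pattern can be expressed as a simple linear decline in absolute VE per month since vaccination; a purely illustrative sketch, since the abstract reports only the monthly decline and the peak VE used here is hypothetical:

```python
def projected_ve(peak_ve, months_since_vaccination, monthly_decline=0.085):
    """Project absolute vaccine effectiveness assuming the reported
    ~8-9% absolute decline per month (floored at zero)."""
    return max(0.0, peak_ve - monthly_decline * months_since_vaccination)

# With a hypothetical peak VE of 50% shortly after vaccination:
print(round(projected_ve(0.50, 4), 2))  # 0.16 four months later
```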

      2. US primary care physicians' viewpoints on HPV vaccination for adults 27 to 45 yearsexternal icon
        Hurley LP, O'Leary ST, Markowitz LE, Crane LA, Cataldi JR, Brtnikova M, Beaty BL, Gorman C, Meites E, Lindley MC, Kempe A.
        J Am Board Fam Med. 2021 Jan-Feb;34(1):162-170.
        INTRODUCTION: In June 2019, the Advisory Committee on Immunization Practices (ACIP) recommended shared clinical decision making (SCDM) regarding human papillomavirus (HPV) vaccination for adults 27 to 45 years. Our objectives were to assess among primary care physicians 1) recent practice regarding HPV vaccination for adults 27 to 45 years, 2) knowledge of HPV and the new SCDM recommendation, and 3) attitudes toward and anticipated effect of the new SCDM recommendation. METHODS: From October to December 2019, we administered an Internet and mail survey to national networks of 494 general internist (GIM) and 474 family physician (FP) members of the American College of Physicians and American Academy of Family Physicians, respectively. RESULTS: The response rate was 64% (617/968; GIM, 57%; FP, 71%). Fifty-eight percent were aware of the new ACIP recommendation; 42% had recommended HPV vaccination to adults 27 to 45 years, but most had administered HPV vaccine to very few of these patients (73% to none and 22% to only 1 to 3). Fifty-five percent and 63% were unaware that HPV vaccination does not prevent progression of existing HPV-related cancers or infections, respectively, and 57% were not sure what to emphasize when having an SCDM conversation about HPV vaccination. A majority reported they will be more likely to recommend HPV vaccination to adults in the 27-to-45-year age range as a result of the new recommendation. CONCLUSIONS: Physicians are interested in recommending HPV vaccination for adults aged 27 to 45 years despite ACIP not routinely recommending it in this age range. The majority need more education about the optimal use of HPV vaccine in this age group.

      3. A Retrospective observational cohort study of the effect of antenatal influenza vaccination on birth outcomes in Cape Town, South Africa, 2015-2016external icon
        McMorrow ML, Rossi L, Meiring S, Bishop K, Itzikowitz R, Isaacs W, Stellenboom F, Walaza S, Hellferscee O, Treurnicht FK, Zar HJ, Tempia S, Cohen C.
        Influenza Other Respir Viruses. 2021 Jan 16.
        BACKGROUND: There are conflicting data concerning the impact of antenatal influenza vaccination on birth outcomes including low birthweight (LBW), preterm birth, small for gestational age (SGA), and stillbirth. METHODS: We conducted a retrospective observational cohort study of infants born to women residing in Mitchells Plain, Cape Town. Infants were born at 4 health facilities during May 28 - December 31, 2015 and April 15 - December 31, 2016. We performed crude and multivariable logistic regression, propensity score (PS) matching logistic regression, and inverse probability of treatment weighted (IPTW) regression to assess vaccine effectiveness (VE) against LBW, preterm birth, SGA, and stillbirth adjusting for measured confounders. RESULTS: Maternal vaccination status, antenatal history, and ≥1 birth outcome(s) were available for 4084/5333 (76.6%) pregnancies, 2109 (51.6%) vaccinated, and 1975 (48.4%) unvaccinated. The proportion LBW was lower in vaccinated (6.9%) vs. unvaccinated (12.5%) in multivariable [VE 0.27 (95% CI 0.07-0.42)], PS [VE 0.30 (95% CI 0.09-0.51)], and IPTW [VE 0.24 (95% CI 0.04-0.45)]. Preterm birth was less frequent in vaccinated (8.6%) than unvaccinated (16.4%) in multivariable [VE 0.26 (0.09-0.40)], PS [VE 0.25 (95% CI 0.09-0.41)], and IPTW [VE 0.34 (95% CI 0.18-0.51)]. The proportion SGA was lower in vaccinated (6.0%) than unvaccinated (8.8%) but not in adjusted models. There were few stillbirths in our study population, 30/4084 (0.7%). CONCLUSIONS: Using multiple analytic approaches, we found that influenza vaccination was associated with lower prevalence of LBW (24-30%) and preterm birth (25-34%) in Cape Town during 2015-2016.

      4. Vaccination coverage with selected vaccines and exemption rates among children in kindergarten - United States, 2019-20 school yearexternal icon
        Seither R, McGill MT, Kriss JL, Mellerson JL, Loretan C, Driver K, Knighton CL, Black CL.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 22;70(3):75-82.
        State and local school vaccination requirements serve to protect students against vaccine-preventable diseases (1). This report summarizes data collected by state and local immunization programs* on vaccination coverage among children in kindergarten (kindergartners) in 48 states, exemptions for kindergartners in 49 states, and provisional enrollment and grace period status for kindergartners in 28 states for the 2019-20 school year, which was more than halfway completed when most schools moved to virtual learning in the spring because of the coronavirus disease 2019 (COVID-19) pandemic. Nationally, vaccination coverage(†) was 94.9% for the state-required number of doses of diphtheria and tetanus toxoids, and acellular pertussis vaccine (DTaP); 95.2% for 2 doses of measles, mumps, and rubella vaccine (MMR); and 94.8% for the state-required number of varicella vaccine doses. Although 2.5% of kindergartners had an exemption from at least one vaccine,(§) another 2.3% were not up to date for MMR and did not have a vaccine exemption. Schools and immunization programs can work together to ensure that undervaccinated students are caught up on vaccinations in preparation for returning to in-person learning. This follow-up is especially important in the current school year, in which undervaccination is likely higher because of disruptions in vaccination during the ongoing COVID-19 pandemic (2-4).

      5. Continued HPV vaccination in the face of unexpected challenges: A commentary on the rationale for an extended interval two-dose scheduleexternal icon
        Whitworth HS, Schiller J, Markowitz LE, Jit M, Brisson M, Simpson E, Watson-Jones D.
        Vaccine. 2021 Jan 13.

    • Injury and Violence
      1. Objectives. To report trends in sexual violence (SV) emergency department (ED) visits in the United States. Methods. We analyzed monthly changes in SV rates (per 100 000 ED visits) from January 2017 to December 2019 using the Centers for Disease Control and Prevention's National Syndromic Surveillance Program data. We stratified the data by sex and age groups. Results. There were 196 948 SV-related ED visits from January 2017 to December 2019. Females had higher rates of SV-related ED visits than males. Across the entire time period, females aged 50 to 59 years showed the highest increase (57.33%) in SV-related ED visits when stratified by sex and age group. In all strata examined, SV-related ED visits displayed positive trends from January 2017 to December 2019; 10 out of the 24 observed positive trends were statistically significant increases. We also observed seasonal trends, with spikes in SV-related ED visits during warmer months and declines during colder months, particularly in ages 0 to 9 years and 10 to 19 years. Conclusions. We identified several significant increases in SV-related ED visits from January 2017 to December 2019. Syndromic surveillance offers near-real-time surveillance of ED visits and can aid in the prevention of SV. (Am J Public Health. Published online ahead of print January 21, 2021: e1-e9.)

      2. Corporal punishment (CP) leads to detrimental mental and physical consequences for a child. One way to prevent CP is to encourage parents to apply alternative discipline strategies that do not involve violence. Based on the knowledge-behavior gap framework in public health education, this study analyzed focus group data from 75 low-income Black, Latino, and White parents to uncover commonalities and differences in their knowledge, self-efficacy, and response efficacy of alternative discipline strategies. Findings revealed that parents knew several alternative discipline strategies and had confidence in their ability to carry out these strategies. However, parents reported that some strategies were hard to implement because they lacked the relevant resources. Moreover, parents did not perceive that alternative discipline strategies were effective without using some form of CP. Gaps in knowledge, self-efficacy, and response efficacy of alternative discipline strategies are risk factors for child physical abuse, and addressing them will help prevent injury and health impacts on children while providing safe, stable, nurturing relationships and environments for child development.

      3. Introduction: National estimates for nonfatal self-directed violence (SDV) presenting at EDs are calculated from the National Electronic Injury Surveillance System – All Injury Program (NEISS–AIP). In 2005, the Centers for Disease Control and Prevention and the Consumer Product Safety Commission added several questions on patient characteristics and event circumstances for all intentional, nonfatal SDV captured in NEISS–AIP. In this study, we evaluated these additional questions along with the parent NEISS–AIP, which together is referred to as NEISS–AIP SDV for study purposes. Methods: We used a mixed methods design to evaluate the NEISS–AIP SDV as a surveillance system through an assessment of key system attributes. We reviewed data entry forms, the coding manual, and training materials to understand how the system functions. To identify strengths and weaknesses, we interviewed multiple key informants. Finally, we analyzed the NEISS–AIP SDV data from 2018—the most recent data year available—to assess data quality by examining the completeness of variables. Results: National estimates of SDV are calculated from NEISS–AIP SDV. Quality control activities suggest more than 99% of the cause and intent variables were coded consistently with the open text field that captures the medical chart narrative. Many SDV variables have open-ended response options, making them difficult to analyze efficiently. Conclusions: NEISS–AIP SDV provides the opportunity to describe systematically collected risk factors and characteristics associated with nonfatal SDV that are not regularly available through other data sources. With some modifications to data fields and yearly analysis of the additional SDV questions, NEISS–AIP SDV can be a valuable tool for informing suicide prevention. Practical Applications: NEISS–AIP may consider updating the SDV questions and responses and analyzing SDV data on a regular basis. Findings from analyses of the SDV data may lead to improvements in ED care.

      4. Introduction: Falls among older adults are a significant health concern, affecting more than a quarter of adults aged 65 and older. Certain fall risk factors, such as medication use, increase fall risk in this population. Aim: The aim of this study is to examine the association between antidepressant-medication subclass use and self-reported falls in community-dwelling older adults. Methods: This analysis used the 2009–2013 Medicare Current Beneficiary Survey, a nationally representative panel survey. A total of 8,742 community-dwelling older adults, representing 40,639,884 older Medicare beneficiaries, were included. We compared self-reported falls and psychoactive medication use, including antidepressant subclasses. Analyses controlled for demographic, functional, and health characteristics associated with increased fall risk. Descriptive analyses and multivariate logistic regression analyses were conducted using SAS 9.4 and Stata 15 software. Results: The most commonly used antidepressant subclass was selective serotonin reuptake inhibitor (SSRI) antidepressants (13.1%). After controlling for characteristics associated with increased fall risk (including depression and concurrent psychoactive medication use), the risk of falling among older adults increased by approximately 30% among those who used an SSRI or a serotonin-norepinephrine reuptake inhibitor (SNRI) compared to non-users. The adjusted risk ratio (aRR) for SSRI was 1.29 (95% CI = 1.13, 1.47) and for SNRI was 1.32 (95% CI = 1.07, 1.62). Conclusion: SSRI and SNRI use is associated with increased risk of falling after adjusting for important confounders. Medication use is a modifiable fall risk factor in older adults and can be targeted to reduce risk of falls. Practical Applications: Use of selective serotonin reuptake inhibitors and serotonin-norepinephrine reuptake inhibitors increased the risk of falling in older adults by approximately 30%, even after controlling for demographic, functional, and health characteristics, including depression. Health care providers can work towards reducing fall risk among their older patients by minimizing the use of certain medications when potential risks outweigh the benefits.

      5. BACKGROUND: There is a dearth of information and guidance for healthcare providers on how to manage a patient's return to driving following a mild traumatic brain injury (mTBI). METHODS: Using the 2020 DocStyles survey, 958 healthcare providers were surveyed about their diagnosis and management practices related to driving after an mTBI. RESULTS: Approximately half (52.0%) of respondents reported routinely (more than 75% of the time) talking with patients with mTBI about how to safely return to driving after their injury. When asked how many days they recommend their patients with mTBI wait before returning to driving after their injury, 1.0% recommended 1 day or less; 11.7% recommended 2-3 days; 24.5% recommended 4-7 days; and 45.9% recommended more than 7 days. Many respondents did not consistently screen patients with mTBI for risk factors that may affect their driving ability or provide them with written instructions on how to safely return to driving (59.7% and 62.6%, respectively). Approximately 16.8% of respondents reported that they do not usually make a recommendation regarding how long patients should wait after their injury to return to driving. CONCLUSIONS: Many healthcare providers in this study reported that they do not consistently screen or educate patients with mTBI about driving after their injury. To develop interventions, future studies are needed to assess factors that influence healthcare providers' behaviours on this topic.

    • Laboratory Sciences
      1. Evaluation of antiretroviral drug concentrations in minimally invasive specimens for potential development of point of care drug assaysexternal icon
        Haaland R, Martin A, Mengesha M, Dinh C, Fountain J, Lupo LD, Hall L, Conway-Washington C, Kelley C.
        AIDS Res Hum Retroviruses. 2021 Jan 18.
        Point of care (POC) tests for antiretroviral drugs (ARVs) could help improve individual adherence. This study sought to define the utility of urine, blood, and buccal swabs as minimally invasive specimens amenable to development of POC tests for ARVs. Urine, dried blood spots (DBS) and buccal swabs were collected from 35 HIV-negative men between 2 and 96 hours following a single dose of tenofovir alafenamide (TAF)/emtricitabine (FTC)/elvitegravir (EVG)/cobicistat and darunavir (DRV). ARV concentrations were measured by high performance liquid chromatography-mass spectrometry. High concentrations of FTC, DRV and TFV were detectable in urine at least 24 hours after dosing. FTC, DRV and EVG remained detectable in DBS at least 24 hours post dose. FTC and DRV were detectable on buccal swabs up to 2- and 24-hours post dose, respectively. TFV was not detectable in DBS or buccal swabs collected between 2 and 96 hours after dosing. Variable distribution of ARVs in minimally invasive specimens highlights the challenge of developing POC assays for recent ARV exposure.

      2. The extensive use of carbon nanomaterials such as carbon nanotubes/nanofibers (CNTs/CNFs) in industrial settings has raised concerns over the potential health risks associated with occupational exposure to these materials. These exposures are commonly in the form of CNT/CNF-containing aerosols, resulting in a need for a reliable structure classification protocol to perform meaningful exposure assessments. However, airborne carbonaceous nanomaterials are very likely to form mixtures of individual nano-sized particles and micron-sized agglomerates with complex structures and irregular shapes, making structure identification and classification extremely difficult. While manual classification from transmission electron microscopy (TEM) images is widely used, it is time-consuming due to the lack of automation tools for structure identification. In the present study, we applied a convolutional neural network (CNN) based machine learning and computer vision method to recognize and classify airborne CNT/CNF particles from TEM images. We introduced a transfer learning approach to represent images by hypercolumn vectors, which were clustered via K-means and processed into a Vector of Locally Aggregated Descriptors (VLAD) representation to train a softmax classifier with the gradient boosting algorithm. This method achieved 90.9% accuracy on the classification of a 4-class dataset and 84.5% accuracy on a more complex 8-class dataset. The developed model established a framework to automatically detect and classify complex carbon nanostructures with potential applications that extend to the automated structural classification for other nanomaterials.
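The VLAD aggregation step named in the abstract can be sketched compactly. The function below is a generic minimal implementation with hypothetical toy descriptors, not the paper's trained pipeline, which feeds CNN hypercolumn vectors and K-means centers into this step.

```python
import numpy as np

def vlad(descriptors, centers):
    """Vector of Locally Aggregated Descriptors: for each local descriptor,
    accumulate its residual to the nearest cluster center, then flatten and
    L2-normalize the result. A minimal sketch of the aggregation step only."""
    k, d = centers.shape
    # Hard assignment of every descriptor to its closest center.
    dists = np.linalg.norm(descriptors[:, None, :] - centers[None, :, :], axis=2)
    nearest = dists.argmin(axis=1)
    v = np.zeros((k, d))
    for desc, c in zip(descriptors, nearest):
        v[c] += desc - centers[c]   # accumulate the residual, not the raw descriptor
    v = v.ravel()
    norm = np.linalg.norm(v)
    return v / norm if norm > 0 else v

# Hypothetical 2-D descriptors and a 2-center codebook.
desc = np.array([[0.0, 0.0], [1.0, 0.0], [4.0, 4.0]])
centers = np.array([[0.0, 0.0], [4.0, 4.0]])
rep = vlad(desc, centers)           # fixed-length (k*d) feature vector
```

In the full pipeline, descriptors would come from CNN hypercolumns and the codebook from K-means; the resulting fixed-length vectors then feed the downstream softmax classifier.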

      3. The feasibility and performance of participant-collected mid-turbinate nasal swabs for detection of influenza virus, respiratory syncytial virus, and human metapneumovirus infections among pregnant womenexternal icon
        Suntarattiwong P, Mott JA, Mohanty S, Sinthuwattanawibool C, Srisantiroj N, Patamasingh Na Ayudhaya O, Klungthong C, Fernandez S, Kim L, Hunt D, Hombroek D, Brummer T, Chotpitayasunondh T, Dawood FS, Kittikraisak W.
        J Infect Dis. 2021 Jan 18.
        BACKGROUND: We assessed performance of participant-collected mid-turbinate nasal swabs compared to study staff-collected mid-turbinate nasal swabs for the detection of respiratory viruses among pregnant women in Bangkok, Thailand. METHODS: We enrolled pregnant women aged ≥18 years and followed them throughout the 2018 influenza season. Women with acute respiratory illness (ARI) self-collected mid-turbinate nasal swabs at home for influenza virus, RSV, and hMPV real-time RT-PCR testing, while the study nurse collected a second mid-turbinate nasal swab during home visits. Paired specimens were processed and tested on the same day. RESULTS: The majority (109, 60%) of 182 participants were 20-30 years old. All 200 paired swabs had optimal specimen quality. The median time from symptom onset to participant-collected swabs was two days, as was the median time to staff-collected swabs. The median time difference between the two swabs was two hours. Compared to staff-collected swabs, the participant-collected swabs were 93% sensitive and 99% specific for influenza virus detection, 94% sensitive and 99% specific for RSV detection, and 100% sensitive and 100% specific for hMPV detection. CONCLUSIONS: Participant-collected mid-turbinate nasal swabs were a valid alternative approach for laboratory confirmation of influenza-, RSV-, and hMPV-associated illnesses among pregnant women in a community setting.
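The reported sensitivity and specificity reduce to simple 2x2-table arithmetic, treating the staff-collected swab as the reference standard. A sketch with hypothetical counts, not the study's data:

```python
# Agreement of self-collected vs. staff-collected swabs, with the
# staff-collected swab as the reference. All counts are hypothetical.
tp, fn = 28, 2    # reference-positive pairs: self-swab positive / negative
tn, fp = 168, 2   # reference-negative pairs: self-swab negative / positive

sensitivity = tp / (tp + fn)   # fraction of reference positives detected
specificity = tn / (tn + fp)   # fraction of reference negatives confirmed
```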

    • Maternal and Child Health
      1. Duchenne and Becker muscular dystrophies' prevalence in MD STARnet surveillance sites: An examination of racial and ethnic differencesexternal icon
        Zhang Y, Mann JR, James KA, McDermott S, Conway KM, Paramsothy P, Smith T, Cai B.
        Neuroepidemiology. 2021 Jan 21:1-9.
        INTRODUCTION: Previous studies indicated variability in the prevalence of Duchenne and Becker muscular dystrophies (DBMD) by racial/ethnic groups. The Centers for Disease Control and Prevention's (CDC) Muscular Dystrophy Surveillance, Tracking, and Research network (MD STARnet) conducts muscular dystrophy surveillance in multiple geographic areas of the USA and continues to enroll new cases. This provides an opportunity to continue investigating differences in DBMD prevalence by race and ethnicity and to compare the impact of using varying approaches for estimating prevalence. OBJECTIVE: To estimate overall and race/ethnicity-specific prevalence of DBMD among males aged 5-9 years and compare the performance of three prevalence estimation methods. METHODS: The overall and race/ethnicity-specific 5-year period prevalence rates were estimated with MD STARnet data using three methods. Method 1 used the median of 5-year prevalence, and methods 2 and 3 calculated prevalence directly with different birth cohorts. To compare prevalence between racial/ethnic groups, Poisson modeling was used to estimate prevalence ratios (PRs) with non-Hispanic (NH) whites as the referent group. Comparison between methods was also conducted. RESULTS: In the final population-based sample of 1,164 DBMD males, the overall 5-year prevalence of DBMD among males aged 5-9 years ranged from 1.92 to 2.48 per 10,000 males: 0.74-1.26 for NH blacks, 1.78-2.26 for NH whites, 2.24-4.02 for Hispanics, and 0.61-1.83 for NH American Indian or Alaska Native and Asian or Native Hawaiian or Pacific Islander (AIAN/API). The PRs for NH blacks/NH whites, Hispanics/NH whites, and NH AIAN/API/NH whites were 0.46 (95% CI: 0.36-0.59), 1.37 (1.17-1.61), and 0.61 (0.40-0.93), respectively. CONCLUSIONS: In males aged 5-9 years, compared with NH whites, DBMD prevalence was lower in NH blacks and NH AIAN/API and higher in Hispanics. All methods produced similar prevalence estimates; however, method 1 produced narrower confidence intervals and method 2 produced fewer zero prevalence estimates than the other two methods.
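The prevalence-ratio contrasts reported above can be illustrated with a minimal sketch using a log-scale confidence interval. The counts below are hypothetical, and the study itself used Poisson modeling rather than this closed form:

```python
import math

# Prevalence ratio (PR) with a log-scale 95% CI from simple counts,
# the kind of contrast the Poisson models estimate. Hypothetical counts.
cases_a, pop_a = 50, 500_000     # comparison group: cases / population
cases_b, pop_b = 100, 500_000    # referent group: cases / population

pr = (cases_a / pop_a) / (cases_b / pop_b)
# Approximate standard error of ln(PR) for two independent proportions.
se = math.sqrt(1 / cases_a - 1 / pop_a + 1 / cases_b - 1 / pop_b)
ci_lo = math.exp(math.log(pr) - 1.96 * se)
ci_hi = math.exp(math.log(pr) + 1.96 * se)
```

A PR below 1 with an upper confidence limit below 1 (as for NH blacks vs. NH whites above) indicates a significantly lower prevalence than the referent group.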

    • Mining
      1. The Subtropolis Mine is a room-and-pillar mine extracting the Vanport limestone near Petersburg, Ohio, at a depth of approximately 59.4 m (190 ft). In February 2018, mine management began implementing a new layout to better control the negative effects of excessive levels of horizontal stress. Almost immediately, the conditions in the headings improved. Conversely, and as expected, stress-related damage concentrated within crosscuts. Over the last 18 months, the mine operator has diligently experimented with different techniques to lessen the impact of the instabilities in the outby crosscuts. The range of controls used by the mine operator includes angled crosscuts, crosscut offsets, increased distance between crosscuts, arched crosscuts, cable-bolted crosscuts, altered blasting patterns, and windows. A window is used to resist roof deformation by leaving a strong brow of roof rock within the crosscuts; it reduces the crosscut dimensions vertically and, in some applications, horizontally. With each application of engineering controls, conditions were monitored and analyzed using observational and measurement techniques. In every case, the advantages in ground conditions were weighed against their impacts on haulage, ventilation, and other mining considerations. This paper examines how each engineering control was implemented and assessed. All of these controls are based on well-established geomechanics principles, but experience has shown that local modifications are needed to deal with unique local conditions such as geology, mining method, mine equipment, and in situ stress conditions.

      2. Mining-induced stresses in underground coal mines play a significant role in pillar and ground support design, and hence in the safety of mining operations. In the US, the Analysis of Longwall Pillar Stability (ALPS) software, and in Australia, the Analysis of Longwall Tailgate Serviceability (ALTS) software, are used for designing longwall coal mine layouts; in the US, the Analysis of Retreat Mining Pillar Stability (ARMPS) software is used to design retreat room-and-pillar mine layouts. All of these programs determine the adequacy of a design by comparing the estimated loads to the load-bearing capacity of the pillars, and they use the “abutment angle” concept and a square decay stress distribution function to calculate the magnitude and distribution of the mining-induced loads. The abutment angle concept has been successfully applied to longwall coal mines through ALPS in the US and ALTS in Australia, and ARMPS uses the same concept for retreat room-and-pillar coal mine design in the US. The suggested abutment angle of 21° for US coal mines was derived by back analysis of underground stress measurements from the 1990s and implemented in ALPS and ARMPS. The ALPS methodology was re-examined and calibrated for Australian conditions with additional Australian stress measurements, resulting in the original ALTS methodology, which has been continually improved and expanded with additional cases. In this paper, some recent stress measurements are back-analyzed, and the abutment angles are investigated to verify the applicability of using 21° in retreat room-and-pillar mines with different depths and mining dimensions. For shallow mines, the derivation of the 21° abutment angle is supported by the new case histories. However, at depths greater than 200 m, the abutment angle was found to decrease with increasing depth. In this study, a new equation for the calculation of the abutment angle for moderate and deep cover cases was constructed and tested for its applicability in retreat room-and-pillar mines. The differences in the mechanism of complete side abutment loads in shallow and deep cover mines are further analyzed by applying the finite volume modeling (FVM) approach to two case study mines, one shallow and one deep cover. A 2D model of each mine is created, and one-side and two-side abutment loads of consecutive panels are analyzed. Analysis of the deep cover mine indicated that the prior panel gobs provide a considerable amount of support to the overburden strata. These higher gob loads prevent a higher percentage of overburden loads from being transferred to the active panel workings, which is in agreement with the lower abutment angles observed for deep cover mines. The findings of this study should only be used for retreat room-and-pillar mines’ production pillar loads, since these are calculated geometrically using the abutment angle concept.
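The geometry behind the abutment angle concept can be sketched for the simplest (supercritical) case, in which the side abutment load per unit length of gob edge is taken as the weight of the overburden wedge bounded by a plane rising at the abutment angle from the panel edge. The inputs below are hypothetical, and this is a geometric sketch only, not the ARMPS/ALTS implementation:

```python
import math

# Side abutment load implied by the abutment angle concept for a
# supercritical panel: the weight of the overburden wedge defined by
# the angle beta. Hypothetical inputs; a geometric sketch only.
H = 200.0                     # depth of cover, m
beta = math.radians(21.0)     # abutment angle (the 21 degrees re-examined here)
gamma = 25.0                  # overburden unit weight, kN/m^3

Ls = gamma * H**2 * math.tan(beta) / 2.0   # side abutment load, kN per m of gob edge
```

Because the load grows with tan(beta), a smaller abutment angle at depth (as the paper finds beyond 200 m of cover) implies proportionally less load transferred to the pillars than the constant-21° assumption would predict.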

      3. Coal mine entry rating system: A case studyexternal icon
        Van Dyke MA, Klemetti TM, Compton C.
        Int J Min Sci Technol. 2021.
        Coal mines are continuously seeking to determine the performance of entries with different ground control products and installation methods. There are many factors that impact how an entry will perform which include but are not limited to geology, overburden, bolting type and pattern, and mine design. At the National Institute for Occupational Safety and Health (NIOSH), research has been instituted to examine the relationship of the parts of a coal mine entry as a system and not as individual components. To study this relationship, the first step in this study was to create a numeric rating system that accurately reflects visual observations of the mine entry and is easy to implement. NIOSH researchers devised this rating system to improve upon previous ideas, offering increased flexibility which can be incorporated into an overall entry condition that offers different levels of confidence based on the user's time devoted to the inspection. This new entry rating system was implemented at three different mines over varying periods of time to evaluate the ground response to the geology, bolt installation pattern, stress changes by mining, overburden, and time dependency.

      4. Bleeder entries are critically important to longwall mining for moving supplies and personnel and for diluting mine air contaminants. By design, these entries must stay open for many years for ventilation. Standing supports in moderate-cover bleeder entries were observed, numerically modeled, and instrumented by researchers at the National Institute for Occupational Safety and Health (NIOSH). The measurements from the installed borehole pressure cells (BPCs), standing support load cells, convergence meters, and roof extensometers are presented in this paper, in addition to the numerical modeling results and visual observations made by the NIOSH researchers in the bleeder entries. The results include the effects of multiple panels being extracted in close proximity to the instrumented site as well as over one and a half years of aging. As expected, standing supports closer to the longwall gob showed the greatest load and convergence. The roof sag appeared generally independent of the proximity to the longwall gob. The BPC readings were driven by both the proximity to the gob and the depth into the pillar. The results of this study demonstrated that the entry roof can respond independently of the pillar and standing support loading. In addition, the rear abutment stress experienced by this bleeder entry design was minimal. The closer the mine development, pillar, or supports are to the gob, the greater the applied load due to rear abutment stress.

    • Nutritional Sciences
      1. Percentage of adolescents meeting federal fruit and vegetable intake recommendations - Youth Risk Behavior Surveillance System, United States, 2017external icon
        Lange SJ, Moore LV, Harris DM, Merlo CL, Lee SH, Demissie Z, Galuska DA.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 22;70(3):69-74.
        According to the 2020-2025 Dietary Guidelines for Americans, persons should consume fruits and vegetables as part of a healthy eating pattern to reduce their risk for diet-related chronic diseases, such as cardiovascular disease, type 2 diabetes, some cancers, and obesity.* A healthy diet is important for healthy growth in adolescence, especially because adolescent health behaviors might continue into adulthood (1). The U.S. Department of Agriculture (USDA) recommends minimum daily intake of 1.5 cups of fruit and 2.5 cups of vegetables for females aged 14-18 years and 2 cups of fruit and 3 cups of vegetables for males aged 14-18 years.(†) Despite the benefits of fruit and vegetable consumption, few adolescents consume these recommended amounts (2-4). In 2013, only 8.5% of high school students met the recommendation for fruit consumption, and only 2.1% met the recommendation for vegetable consumption (2). To update the 2013 data, CDC analyzed data from the 2017 national and state Youth Risk Behavior Surveys (YRBSs) to describe the percentage of students who met intake recommendations, overall and by sex, school grade, and race/ethnicity. The median frequencies of fruit and vegetable consumption nationally were 0.9 and 1.1 times per day, respectively. Nationally, 7.1% of students met USDA intake recommendations for fruits (95% confidence interval [CI] = 4.0-10.3) and 2.0% for vegetables (upper 95% confidence limit = 7.9) using previously established scoring algorithms. State-specific estimates of the percentage of students meeting fruit intake recommendations ranged from 4.0% (Connecticut) to 9.3% (Louisiana), and the percentage meeting vegetable intake recommendations ranged from 0.6% (Kansas) to 3.7% (New Mexico). Additional efforts to expand the reach of existing school and community programs or to identify new effective strategies, such as social media approaches, might help address barriers and improve adolescent fruit and vegetable consumption.

      2. Retinol-binding protein, retinol, and modified-relative-dose response in Ugandan children aged 12-23 months and their non-pregnant caregiversexternal icon
        Whitehead RD, Ford ND, Mapango C, Ruth LJ, Zhang M, Schleicher RL, Ngalombi S, Halati S, Ahimbisibwe M, Lubowa A, Sheftel J, Tanumihardjo SA, Jefferds ME.
        Exp Biol Med (Maywood). 2021 Jan 19.
        Retinol-binding protein (RBP), retinol, and modified-relative-dose response (MRDR) are used to assess vitamin A status. We describe vitamin A status in Ugandan children and women using dried blood spot (DBS) RBP, serum RBP, plasma retinol, and MRDR and compare DBS-RBP, serum RBP, and plasma retinol. Blood was collected from 39 children aged 12-23 months and 28 non-pregnant mothers aged 15-49 years as a subsample from a survey in Amuria district, Uganda, in 2016. DBS RBP was assessed using a commercial enzyme immunoassay kit, serum RBP using an in-house sandwich enzyme-linked immunosorbent assay, and plasma retinol/MRDR test using high-performance liquid chromatography. We examined (a) median concentration or value (Q1, Q3); (b) R(2) between DBS-RBP, serum RBP, and plasma retinol; and (c) Bland-Altman plots. Median (Q1, Q3) for children and mothers, respectively, were as follows: DBS-RBP 1.15 µmol/L (0.97, 1.42) and 1.73 (1.52, 1.96), serum RBP 0.95 µmol/L (0.78, 1.18) and 1.47 µmol/L (1.30, 1.79), plasma retinol 0.82 µmol/L (0.67, 0.99) and 1.33 µmol/L (1.22, 1.58), and MRDR 0.025 (0.014, 0.042) and 0.014 (0.009, 0.019). DBS RBP-serum RBP R(2) was 0.09 for both children and mothers. The mean biases were -0.19 µmol/L (95% limits of agreement [LOA] 0.62, -0.99) for children and -0.01 µmol/L (95% LOA -1.11, -1.31) for mothers. DBS RBP-plasma retinol R(2) was 0.11 for children and 0.13 for mothers. Mean biases were 0.33 µmol/L (95% LOA -0.37, 1.03) for children, and 0.29 µmol/L (95% LOA -0.69, 1.27) for mothers. Serum RBP-plasma retinol R(2) was 0.75 for children and 0.55 for mothers, with mean biases of 0.13 µmol/L (95% LOA -0.23, 0.49) for children and 0.18 µmol/L (95% LOA -0.61, 0.96) for mothers. Results varied by indicator and matrix. The serum RBP-retinol R(2) for children was moderate (0.75), but poor for other comparisons. Understanding the relationships among vitamin A indicators across contexts and population groups is needed.
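The mean bias and 95% limits of agreement reported here follow the standard Bland-Altman calculation, which a short sketch makes explicit. The paired values below are hypothetical, in µmol/L:

```python
import numpy as np

# Bland-Altman agreement statistics for two measures of the same analyte:
# mean bias and 95% limits of agreement (bias +/- 1.96 SD of the paired
# differences). Paired values below are hypothetical.
measure_a = np.array([1.00, 1.20, 0.90, 1.10, 1.00])
measure_b = np.array([0.90, 1.00, 1.00, 1.00, 0.90])

diff = measure_a - measure_b
bias = diff.mean()                 # mean bias between the two measures
sd = diff.std(ddof=1)              # sample SD of the paired differences
loa_lower = bias - 1.96 * sd       # lower 95% limit of agreement
loa_upper = bias + 1.96 * sd       # upper 95% limit of agreement
```

A small bias with narrow limits of agreement indicates the two measures can be used interchangeably; wide limits (as for the DBS RBP comparisons above) indicate poor agreement even when the mean bias is near zero.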

    • Occupational Safety and Health
      1. Exposure to radon and progeny in a tourist cavernexternal icon
        Anderson JL, Zwack LM, Brueck SE.
        Health Phys. 2021 Jan 19.
        The primary objective of this work was to characterize employee exposure to radon and progeny while performing guide/interpretation and concessions duties in a tourist cavern. Radon gas and progeny concentrations, fraction of unattached progeny, and other environmental parameters were evaluated in a popular tourist cavern in Southeastern New Mexico. Alpha-track detectors were used to measure radon gas in several cavern locations during a 9-mo period. Additionally, radon gas and attached and unattached fractions of radon progeny were measured at three primary cavern work locations during a 1-d period using a SARAD EQF 3220. Radon gas concentrations in the cavern were elevated due to extremely low air exchange rates with substantial seasonal variation. Mean measured radon concentrations ranged from 970 to 2,600 Bq m-3 in the main cavern and from 5,400 to 6,000 Bq m-3 in a smaller cave associated with the regional cave system. Measurements of unattached fractions (0.40-0.60) were higher than those commonly found in mines and other workplaces, leading to the potential for relatively high worker dose. Although radon gas concentrations were below the Occupational Safety and Health Administration Permissible Exposure Limit, employees working in the cavern have the potential to accrue ionizing radiation dose in excess of the annual effective dose limit recommended by the National Council on Radiation Protection and Measurements due to a high unattached fraction of radon progeny. There was a strong negative correlation between unattached fractions and equilibrium factors, but these parameters should be further evaluated for seasonal variation. Introduction of engineering controls such as ventilation could damage the cavern environment, so administrative controls, such as time management, are preferred to reduce employee dose.

      2. Occupational use of high-level disinfectants and asthma incidence in early- to mid-career female nurses: a prospective cohort studyexternal icon
        Dumas O, Gaskins AJ, Boggs KM, Henn SA, Le Moual N, Varraso R, Chavarro JE, Camargo CA.
        Occup Environ Med. 2021 Jan 15.
        OBJECTIVES: Occupational use of disinfectants among healthcare workers has been associated with asthma. However, most studies are cross-sectional, and longitudinal studies are not entirely consistent. To limit the healthy worker effect, it is important to conduct studies among early- to mid-career workers. We investigated the prospective association between use of disinfectants and asthma incidence in a large cohort of early- to mid-career female nurses. METHODS: The Nurses' Health Study 3 is an ongoing, prospective, internet-based cohort of female nurses in the USA and Canada (2010-present). Analyses included 17 280 participants without a history of asthma at study entry (mean age: 34 years) and who had completed ≥1 follow-up questionnaire (sent every 6 months). Occupational use of high-level disinfectants (HLDs) was evaluated by questionnaire. We examined the association between HLD use and asthma development, adjusted for age, race, ethnicity, smoking status and body mass index. RESULTS: During 67 392 person-years of follow-up, 391 nurses reported incident clinician-diagnosed asthma. Compared with nurses who reported ≤5 years of HLD use (89%), those with >5 years of HLD use (11%) had increased risk of incident asthma (adjusted HR (95% CI), 1.39 (1.04 to 1.86)). The risk of incident asthma was elevated but not statistically significant in those reporting >5 years of HLD use and current use of ≥2 products (1.72 (0.88 to 3.34)); asthma risk was significantly elevated in women with >5 years of HLD use but no current use (1.46 (1.00 to 2.12)). CONCLUSIONS: Occupational use of HLDs was prospectively associated with increased asthma incidence in early- to mid-career nurses.

    • Occupational Safety and Health - Mining
      1. Field-based methods for the analysis of respirable crystalline silica are now possible with the availability of portable instrumentation. Such methods also require the use of cassettes that facilitate direct-on-filter analysis of field samples. Conventional sampling cassettes can be modified such that they are amenable to direct-on-filter analysis while remaining compatible with common respirable dust samplers. The required modifications are described herein, and one version of such an analysis-ready cassette is described and evaluated in comparison to more traditional cassette designs. The novel cassette was found to result in a slightly higher mass of collected respirable material (for the same sampling duration), though this is likely due to the conductive material of the cassettes, which prevents particle wall losses in comparison to the more commonly used styrene cassette material. Both types of cassettes demonstrated comparable predictability of the respirable crystalline silica content in a sample.

      2. Exploring the differences in safety climate among mining sectorsexternal icon
        Haas EJ, Yorio PL.
        Min Metall Explor. 2021;38(1):655-668.
        Currently, the US mining industry is encouraged, but not required, to adopt a formal health and safety management system. Previous research has shown that the adoption of such systems has been more difficult in some subsectors of the mining industry than others. Given the interdependencies between management systems and safety climate in addition to their predictive utility of incidents, it is important to assess differences in the perceptions of safety climate among mining subsectors in the USA. If significant differences exist, then mining subsectors may not necessarily be able to adopt a one-size-fits-all approach to system implementation. To that end, the National Institute for Occupational Safety and Health assessed mineworkers’ perceptions of several individual and organizational safety climate constructs. Participants consisted of 2945 mineworkers at coal, industrial mineral, and stone/sand/gravel mine sites throughout 18 states. Linear regressions were used to answer the research question. The results suggest that coal miners, in comparison to those miners in industrial mineral and stone/sand/gravel sectors, had significantly less favorable perceptions of each of the organizational climate constructs measured (i.e., organizational support, supervisor support and communication, coworker communication, engagement/involvement, and training) (p < 0.001 in all cases). Importantly, these results parse out organizational indicators to show that perceptions are not only lower in one area of organizational or supervisor support. Rather, engagement, training, and communication practices were all significantly lower among coal miners, prompting considerations for these significant differences and actions that can be taken to improve system practices.

      3. Capability of the airstream helmet for protecting mine workers from diesel particulate matterexternal icon
        Noll J, Lee T, Vanderslice S, Barone T.
        Min Metall Explor. 2021;38(1):635-644.
        Diesel particulate matter (DPM) is considered carcinogenic to humans by the International Agency for Research on Cancer (IARC), and mine workers have some of the highest exposures to DPM in the USA. Therefore, mines have been developing control strategies for reducing DPM exposures of mine workers. Many of these strategies include engineering and administrative controls. In addition to these types of controls, a respirator program is used at some mines to provide further protection to mine workers where elevated concentrations of DPM exist. However, sometimes mine workers may feel restricted by the use of a half-mask respirator or inconvenienced by the requirement to remove facial hair. Another option which may be more appealing to some mine workers than a half-mask respirator is an airstream helmet, which provides filtered air in the breathing zone of the worker. The airstream helmet does not restrict breathing, provides some cooling, and does not require the worker to be clean shaven to work properly. These helmets are being used to help reduce respirable dust exposures in some coal mines, and this study investigated how effective this helmet may be for reducing DPM exposures. The airstream helmet with a HEPA filter was found to reduce DPM exposures by over 99% in static conditions by both mass and particle counting data. The airstream helmet can be an important part of a mine’s DPM control plan because it can provide clean air into a mine worker’s breathing zone in areas of elevated concentrations.

    • Parasitic Diseases
      1. The use of dried tube specimens of Plasmodium falciparum in an external quality assessment programme to evaluate health worker performance for malaria rapid diagnostic testing in healthcare centres in Togoexternal icon
        Dorkenoo AM, Kouassi KC, Koura AK, Adams ML, Gbada K, Katawa G, Yakpa K, Charlebois R, Milgotina E, Merkel MO, Aidoo M.
        Malar J. 2021 Jan 20;20(1):50.
        BACKGROUND: The use of rapid diagnostic tests (RDTs) to diagnose malaria is common in sub-Saharan African laboratories, remote primary health facilities and in the community. Currently, there is a lack of reliable methods to ascertain health worker competency to accurately use RDTs in the testing and diagnosis of malaria. Dried tube specimens (DTS) have been shown to be a consistent and useful method for quality control of malaria RDTs; however, their application in National Quality Management programmes has been limited. METHODS: A Plasmodium falciparum strain was grown in culture and harvested to create DTS of varying parasite density (0, 100, 200, 500 and 1000 parasites/µL). Using the dried tube specimens as quality control material, a proficiency testing (PT) programme was carried out in 80 representative health centres in Togo. Health worker competency for performing malaria RDTs was assessed using five blinded DTS samples, and the DTS were tested in the same manner as a patient sample would be tested by multiple testers per health centre. RESULTS: All the DTS with 100 parasites/µl and 50% of DTS with 200 parasites/µl were classified as non-reactive during the pre-PT quality control step. Therefore, data from these parasite densities were not analysed as part of the PT dataset. PT scores across all 80 facilities and 235 testers were 100% for 0 parasites/µl, 63% for 500 parasites/µl and 93% for 1000 parasites/µl. Overall, 59% of the 80 healthcare centres that participated in the PT programme received a score of 80% or higher on a set of 0, 500 and 1000 parasites/µl DTS samples. Sixty percent of health workers at these centres recorded correct test results for all three samples. CONCLUSIONS: The use of DTS for a malaria PT programme was the first of its kind ever conducted in Togo. The ease of use and stability of the DTS illustrate that this type of sample can be considered for the assessment of staff competency. The implementation of quality management systems, refresher training and expanded PT at remote testing facilities are essential elements to improve the quality of malaria diagnosis.

      2. Anti-malarial efficacy and resistance monitoring of artemether-lumefantrine and dihydroartemisinin-piperaquine shows inadequate efficacy in children in Burkina Faso, 2017-2018external icon
        Gansané A, Moriarty LF, Ménard D, Yerbanga I, Ouedraogo E, Sondo P, Kinda R, Tarama C, Soulama E, Tapsoba M, Kangoye D, Compaore CS, Badolo O, Dao B, Tchwenko S, Tinto H, Valea I.
        Malar J. 2021 Jan 19;20(1):48.
        BACKGROUND: The World Health Organization recommends regularly assessing the efficacy of artemisinin-based combination therapy (ACT), which is a critical tool in the fight against malaria. This study evaluated the efficacy of two artemisinin-based combinations recommended to treat uncomplicated Plasmodium falciparum malaria in Burkina Faso in three sites: Niangoloko, Nanoro, and Gourcy. METHODS: This was a two-arm randomized control trial of the efficacy of artemether-lumefantrine (AL) and dihydroartemisinin-piperaquine (DP). Children aged 6-59 months old were monitored for 42 days. The primary outcomes of the study were uncorrected and PCR-corrected efficacies to day 28 for AL and 42 for DP. Molecular markers of resistance to artemisinin derivatives and partner drugs were also analysed. RESULTS: Of 720 children enrolled, 672 reached study endpoints at day 28, 333 in the AL arm and 339 in the DP arm. PCR-corrected 28-day per protocol efficacy in the AL arm was 74% (64-83%) in Nanoro, 76% (66-83%) in Gourcy, and 92% (84-96%) in Niangoloko. The PCR-corrected 42-day per protocol efficacy in the DP arm was 84% (75-89%) in Gourcy, 89% (81-94%) in Nanoro, and 97% (92-99%) in Niangoloko. No Pfk13 mutation previously associated with artemisinin resistance was observed. No statistically significant association was found between treatment outcome and presence of the 86Y mutation in the Pfmdr1 gene. There was also no association observed between treatment outcome and Pfpm2 or Pfmdr1 copy number variation. CONCLUSION: The results of this study indicate evidence of inadequate efficacy of AL at day 28 and DP at day 42 in the same two sites (Nanoro and Gourcy). A change of first-line ACT may be warranted in Burkina Faso. Trial registration: Pan African Clinical Trial Registry, identifier PACTR201708002499311. Date of registration: 8/3/2017

    • Physical Activity
      1. Rationale for the essential components of physical educationexternal icon
        Michael SL, Wright C, Mays Woods A, van der Mars H, Brusseau TA, Stodden DF, Burson SL, Fisher J, Killian CM, Mulhearn SC, Nesbitt DR, Pfledderer CD.
        Res Q Exerc Sport. 2021 Jan 19:1-7.
        Purpose: This introductory article provides the context and rationale for conducting systematic literature reviews on each of the essential components of physical education, including policy and environment, curriculum, appropriate instruction, and student assessment. Methods: Four research teams from Doctoral Physical Education Teacher Education programs (D-PETE) conducted these systematic reviews using the PRISMA guidelines process. Results: This article explains the role of the national framework for increasing physical education and physical activity (i.e., Comprehensive School Physical Activity Program) in supporting the essential components of physical education. It also highlights the expectations for physical education and provides a brief history of these components. Lastly, this article highlights each of the articles presented in the special feature. Conclusion: Understanding the implementation of these components may be important for improving the physical education experience for all students and creating a foundation for lifelong physical activity and health.

    • Public Health Leadership and Management
      1. INTRODUCTION: In low/middle-income countries (LMICs), training is often used to improve healthcare provider (HCP) performance. However, important questions remain about how well training works and the best ways to design training strategies. The objective of this study is to characterise the effectiveness of training strategies to improve HCP practices in LMICs and identify attributes associated with training effectiveness. METHODS: We performed a secondary analysis of data from a systematic review on improving HCP performance. The review included controlled trials and interrupted time series, and outcomes measuring HCP practices (eg, percentage of patients correctly treated). Distributions of effect sizes (defined as percentage-point (%-point) changes) were described for each training strategy. To identify effective training attributes, we examined studies that directly compared training approaches and performed random-effects linear regression modelling. RESULTS: We analysed data from 199 studies from 51 countries. For outcomes expressed as percentages, educational outreach visits (median effect size when compared with controls: 9.9 %-points; IQR: 4.3-20.6) tended to be somewhat more effective than in-service training (median: 7.3 %-points; IQR: 3.6-17.4), which seemed more effective than peer-to-peer training (4.0 %-points) and self-study (by 2.0-9.3 %-points). Mean effectiveness was greater (by 6.0-10.4 %-points) for training that incorporated clinical practice and training at HCPs' work site. Attributes with little or no effect were: training with computers, interactive methods or over multiple sessions; training duration; number of educational methods; distance training; trainers with pedagogical training; and topic complexity. For lay HCPs, in-service training had no measurable effect. Evidence quality for all findings was low. CONCLUSIONS: Although additional research is needed, by characterising the effectiveness of training strategies and identifying attributes of effective training, decision-makers in LMICs can improve how these strategies are selected and implemented.

    • Reproductive Health
      1. Required examinations and tests before initiating contraception: provider practices from a national cross-sectional surveyexternal icon
        Krashin JW, Zapata LB, Morgan IA, Tepper NK, Jatlaoui TC, Frederiksen BN, Whiteman MK, Curtis KM.
        Contraception. 2021 Jan 14.
        OBJECTIVE(S): We estimated the prevalence of requiring specific examinations or tests before providing contraception in a nationwide survey of family planning providers. STUDY DESIGN: We conducted a cross-sectional survey of public-sector health centers and office-based physicians providing family planning services across the United States in 2019 (n=1,395). We estimated the weighted proportion of providers (or their health center or practice) who required blood pressure measurement, pelvic examination (bimanual examination and cervical inspection), Papanicolaou (Pap) smear, clinical breast examination (CBE), and chlamydia and gonorrhea (CT/GC) screening before initiating hormonal or intrauterine contraception (IUC) for healthy women. We performed multivariable regression to identify factors associated with pelvic examination practices aligned with clinical recommendations; these recommendations classify examinations and tests as recommended or unnecessary before initiation of specific contraceptive methods. RESULTS: The overall response rate was 51%. Most providers required blood pressure measurement before initiating each method. Unnecessary CBE, Pap smears, and CT/GC screening were required by 14-33% of providers across methods. Fifty-two to 62% of providers required recommended pelvic examination before IUC placement; however, 16-23% of providers required unnecessary pelvic examinations before non-intrauterine hormonal method initiation. Factors associated with recommendation-aligned pelvic examination practices included having a higher proportion of patients using public funding (Medicaid or other assistance) and more recently completing formal clinical training. CONCLUSIONS: Almost half (47%) of providers did not require necessary pelvic examination before placing IUC. Conversely, many providers required unnecessary examinations and tests before contraception initiation for patients. IMPLICATIONS: Most providers required the few recommended examinations and tests for safe contraceptive provision. Reduction of unnecessary examinations and tests may reduce barriers to contraceptive access. There are also opportunities to increase use of recommended examinations, as up to 48% of providers did not require recommended pelvic examination before IUC.

    • Substance Use and Abuse
      1. Binge drinking, other substance use, and concurrent use in the U.S., 2016-2018external icon
        Esser MB, Pickens CM, Guy GP, Evans ME.
        Am J Prev Med. 2021 Feb;60(2):169-178.
        INTRODUCTION: The use of multiple substances heightens the risk of overdose. Multiple substances, including alcohol, are commonly found among people who experience overdose-related mortality. However, the associations between alcohol use and the use of a range of other substances are often not assessed. Therefore, this study examines the associations between drinking patterns (e.g., binge drinking) and other substance use in the U.S., the concurrent use of alcohol and prescription drug misuse, and how other substance use varies by binge-drinking frequency. METHODS: Past 30-day alcohol and other substance use data from the 2016-2018 National Survey on Drug Use and Health were analyzed in 2020 among 169,486 U.S. respondents aged ≥12 years. RESULTS: The prevalence of other substance use ranged from 6.0% (nondrinkers) to 24.1% (binge drinkers). Among people who used substances, 22.2% of binge drinkers reported using substances in 2 additional substance categories. Binge drinking was associated with 4.2 (95% CI=3.9, 4.4) greater adjusted odds of other substance use than nondrinking. Binge drinkers were twice as likely to report concurrent prescription drug misuse while drinking as nonbinge drinkers. The prevalence of substance use increased with binge-drinking frequency. CONCLUSIONS: Binge drinking was associated with other substance use and concurrent prescription drug misuse while drinking. These findings can guide the implementation of a comprehensive approach to prevent binge drinking, substance misuse, and overdoses. This might include population-level strategies recommended by the Community Preventive Services Task Force to prevent binge drinking (e.g., increasing alcohol taxes and regulating alcohol outlet density).

      2. Frequency of cannabis use during pregnancy and adverse infant outcomes, by cigarette smoking status - 8 PRAMS states, 2017external icon
        Haight SC, King BA, Bombard JM, Coy KC, Ferré CD, Grant AM, Ko JY.
        Drug Alcohol Depend. 2021 Jan 8;220:108507.
        BACKGROUND: Research on prenatal cannabis use and adverse infant outcomes is inconsistent, and findings vary by frequency of use or cigarette use. We assess (1) the prevalence of high frequency (≥once/week), low frequency (<once/week), and any cannabis use during pregnancy by maternal characteristics and adverse infant outcomes; (2) the prevalence of infant outcomes by cannabis use frequency, stratified by cigarette smoking; and (3) the association between cannabis use frequency and infant outcomes, stratified by cigarette smoking. METHODS: Cross-sectional data from 8 states' 2017 Pregnancy Risk Assessment Monitoring System (n = 5548) were analyzed. We calculated adjusted prevalence ratios (aPR) between cannabis use frequency and infant outcomes with Modified Poisson regression. RESULTS: Approximately 1.7 % and 2.6 % of women reported low and high frequency prenatal cannabis use, respectively. Prevalence of use was higher among women with small-for-gestational age (SGA) (10.2 %) and low birthweight (9.7 %) deliveries, and cigarette use during pregnancy (21.2 %). Among cigarette smokers (aPR: 1.8; 95 % CI: 1.1-3.0) and non-smokers (aPR: 2.1; 95 % CI: 1.1-3.9), high frequency cannabis use doubled the risk of low birthweight delivery but did not increase preterm or SGA risk. Regardless of cigarette use, low frequency cannabis use did not significantly increase infant outcome risk. CONCLUSIONS: Prenatal cannabis use was more common among women who smoked cigarettes during pregnancy. High frequency cannabis use was associated with low birthweight delivery, regardless of cigarette use. Healthcare providers can implement recommended substance use screening and provide evidence-based counseling and cessation services to help pregnant women avoid tobacco and cannabis use.

      3. A comprehensive approach to increase adult tobacco cessationexternal icon
        VanFrank B, Presley-Cantrell L.
        JAMA. 2021 Jan 19;325(3):232-233.

      4. Hepatitis C virus infection and polysubstance use among young adult people who inject drugs in a rural county of New Mexicoexternal icon
        Wagner K, Zhong Y, Teshale E, White K, Winstanley EL, Hettema J, Thornton K, Bisztray B, Fiuty P, Page K.
        Drug Alcohol Depend. 2021 Jan 11;220:108527.
        AIMS: We assessed prevalence and correlates for hepatitis C virus (HCV) infection in young adult people who inject drugs (PWID) in rural New Mexico, where opioid use has been historically problematic. METHODS: Participants were 18-29 years old with self-reported injection drug use in the past 90 days. We conducted testing for HCV antibodies (anti-HCV) and HCV ribonucleic acid (RNA) and assessed sociodemographic and risk exposures. We provided counseling and referrals to prevention services and drug treatment. We estimated prevalence ratios (PR) to assess bivariate associations with HCV infection; and adjusted PRs using modified Poisson regression methods. RESULTS: Among 256 participants tested for anti-HCV, 156 (60.9 %) had been exposed (anti-HCV positive), and of 230 tested for both anti-HCV and HCV RNA, 103 (44.8 %) had current infection (RNA-positive). The majority (87.6 %) of participants were Hispanic. Almost all (96.1 %) had ever injected heroin; 52.4 % and 52.0 % had ever injected methamphetamine or cocaine, respectively. Polysubstance injecting (heroin and any other drug) was associated with significantly higher prevalence of HCV infection (76.0 %) compared to injecting only heroin (24.0 %) (PR: 3.17 (95 % CI:1.93, 5.23)). Years of injecting, history of non-fatal opioid-involved overdose, polysubstance injecting, and stable housing were independently associated with HCV infection. CONCLUSIONS: HCV is highly prevalent among young adult PWID in rural NM. The high reported prevalence of polysubstance injecting and its association with HCV infection should be considered in prevention planning.

    • Zoonotic and Vectorborne Diseases
      1. Another dengue fever outbreak in Eastern Ethiopia-An emerging public health threatexternal icon
        Gutu MA, Bekele A, Seid Y, Mohammed Y, Gemechu F, Woyessa AB, Tayachew A, Dugasa Y, Gizachew L, Idosa M, Tokarz RE, Sugerman D.
        PLoS Negl Trop Dis. 2021 Jan 19;15(1):e0008992.
        BACKGROUND: Dengue Fever (DF) is a viral disease primarily transmitted by Aedes (Ae.) aegypti mosquitoes. Outbreaks in Eastern Ethiopia were reported during 2014-2016. In May 2017, we investigated the first suspected DF outbreak from Kabridahar Town, Somali region (Eastern Ethiopia) to describe its magnitude, assess risk factors, and implement control measures. METHODS: Suspected DF cases were defined as acute febrile illness plus ≥2 symptoms (headache, fever, retro-orbital pain, myalgia, arthralgia, rash, or hemorrhage) in Kabridahar District residents. All reported cases were identified through medical record review and active searches. Severe dengue was defined as DF with severe organ impairment, severe hemorrhage, or severe plasma leakage. We conducted a neighborhood-matched case-control study using a subset of suspected cases and conveniently-selected asymptomatic community controls and interviewed participants to collect demographic and risk factor data. We tested sera by RT-PCR to detect dengue virus (DENV) and identify serotypes. Entomologists conducted mosquito surveys at community households to identify species and estimate larval density using the house index (HI), container index (CI) and Breteau index (BI), with BI≥20 indicating high density. RESULTS: We identified 101 total cases from May 12-31, 2017, including five with severe dengue (one death). The attack rate (AR) was 17/10,000. Of 21 tested samples, 15 (72%) were DENV serotype 2 (DENV 2). In the case-control study with 50 cases and 100 controls, a lack of formal education (AOR [Adjusted Odds Ratio] = 4.2, 95% CI [Confidence Interval] 1.6-11.2) and open water containers near the home (AOR = 3.0, 95% CI 1.2-7.5) were risk factors, while long-lasting insecticide treated-net (LLITN) usage (AOR = 0.21, 95% CI 0.05-0.79) was protective. HI and BI were 66/136 (49%) and 147 per 100 homes (147%) respectively, with 151/167 (90%) adult mosquitoes identified as Ae. aegypti. CONCLUSION: The epidemiologic, entomologic, and laboratory investigation confirmed a DF outbreak. Mosquito indices were far above safe thresholds, indicating inadequate vector control. We recommended improved vector surveillance and control programs, including best practices in storing water and disposing of open containers to reduce Aedes mosquito density.
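The larval indices reported in the abstract above (HI, CI, BI) follow standard entomological definitions. As an illustrative sketch only: the snippet below reproduces the reported Kabridahar figures from those definitions. The positive-container count is back-calculated from the reported BI of 147 per 100 homes, so it is an assumption for illustration, not a figure from the article.

```python
def house_index(positive_houses, inspected_houses):
    """HI: percentage of inspected houses with >=1 larva-positive container."""
    return 100 * positive_houses / inspected_houses

def breteau_index(positive_containers, inspected_houses):
    """BI: larva-positive containers per 100 inspected houses."""
    return 100 * positive_containers / inspected_houses

# Figures reported in the abstract: 66 positive houses of 136 inspected.
hi = house_index(66, 136)        # ~48.5, reported as 49%

# ~200 positive containers is back-calculated from BI = 147 per 100 homes
# over 136 houses (an assumption, not reported directly).
bi = breteau_index(200, 136)     # ~147 per 100 homes

assert bi >= 20  # the abstract's threshold for high larval density
```

This makes concrete why the investigators flagged vector control as inadequate: the computed BI exceeds the BI≥20 high-density threshold by more than sevenfold.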

  2. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Chronic Diseases and Conditions
      1. What's the "secret sauce"? How implementation variation affects the success of colorectal cancer screening outreachexternal icon
        Coury J, Miech EJ, Styer P, Petrik AF, Coates KE, Green BB, Baldwin LM, Shapiro JA, Coronado GD.
        Implement Sci Commun. 2021 Jan 11;2(1):5.
        BACKGROUND: Mailed fecal immunochemical testing (FIT) programs can improve colorectal cancer (CRC) screening rates, but health systems vary how they implement (i.e., adapt) these programs for their organizations. A health insurance plan implemented a mailed FIT program (named BeneFIT), and participating health systems could adapt the program. This multi-method study explored which program adaptations might have resulted in higher screening rates. METHODS: First, we conducted a descriptive analysis of CRC screening rates by key health system characteristics and program adaptations. Second, we generated an overall model by fitting a weighted regression line to our data. Third, we applied Configurational Comparative Methods (CCMs) to determine how combinations of conditions were linked to higher screening rates. The main outcome measure was CRC screening rates. RESULTS: Seventeen health systems took part in at least 1 year of BeneFIT. The overall screening completion rate was 20% (4-28%) in year 1 and 25% (12-35%) in year 2 of the program. Health systems that used two or more adaptations had higher screening rates, and no single adaptation clearly led to higher screening rates. In year 1, small systems, with just one clinic, that used phone reminders (n = 2) met the implementation success threshold (≥ 19% screening rate) while systems with > 1 clinic were successful when offering a patient incentive (n = 4), scrubbing mailing lists (n = 4), or allowing mailed FIT returns with no other adaptations (n = 1). In year 2, larger systems with 2-4 clinics were successful with a phone reminder (n = 4) or a patient incentive (n = 3). Of the 10 systems that implemented BeneFIT in both years, seven improved their CRC screening rates in year 2. CONCLUSIONS: Health systems can choose among many adaptations and successfully implement a health plan's mailed FIT program. Different combinations of adaptations led to success, with health system size emerging as an important contextual factor.

      2. Screening and interventions for glaucoma and eye health through telemedicine (SIGHT) studiesexternal icon
        De Moraes CG, Hark LA, Saaddine J.
        J Glaucoma. 2021 Jan 7;Publish Ahead of Print.

      3. Cancer screening test receipt - United States, 2018external icon
        Sabatino SA, Thompson TD, White MC, Shapiro JA, de Moor J, Doria-Rose VP, Clarke T, Richardson LC.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 15;70(2):29-35.
        Screening for breast cancer, cervical cancer, and colorectal cancer (CRC) reduces mortality from these cancers.* However, screening test receipt has been below national targets with disparities observed in certain populations (1,2). National Health Interview Survey (NHIS) data from 2018 were analyzed to estimate percentages of adults up to date with U.S. Preventive Services Task Force (USPSTF) screening recommendations. Screening test receipt remained below national Healthy People 2020 (HP2020) targets, although CRC test receipt neared the target. Disparities were evident, with particularly low test receipt among persons who were uninsured or did not have usual sources of care. Continued monitoring helps assess progress toward targets and could inform efforts to promote screening and reduce barriers for underserved populations.

      4. Association of community types and features in a case-control analysis of new onset type 2 diabetes across a diverse geography in Pennsylvaniaexternal icon
        Schwartz BS, Pollak J, Poulsen MN, Bandeen-Roche K, Moon K, DeWalle J, Siegel K, Mercado C, Imperatore G, Hirsch AG.
        BMJ Open. 2021 Jan 13;11(1):e043528.
        OBJECTIVES: To evaluate associations of community types and features with new onset type 2 diabetes in diverse communities. Understanding the location and scale of geographic disparities can lead to community-level interventions. DESIGN: Nested case-control study within the open dynamic cohort of health system patients. SETTING: Large, integrated health system in 37 counties in central and northeastern Pennsylvania, USA. PARTICIPANTS AND ANALYSIS: We used electronic health records to identify persons with new-onset type 2 diabetes from 2008 to 2016 (n=15 888). Persons with diabetes were age, sex and year matched (1:5) to persons without diabetes (n=79 435). We used generalised estimating equations to control for individual-level confounding variables, accounting for clustering of persons within communities. Communities were defined as (1) townships, boroughs and city census tracts; (2) urbanised area (large metro), urban cluster (small cities and towns) and rural; (3) combination of the first two; and (4) county. Community socioeconomic deprivation and greenness were evaluated alone and in models stratified by community types. RESULTS: Borough and city census tract residence (vs townships) were associated (OR (95% CI)) with higher odds of type 2 diabetes (1.10 (1.04 to 1.16) and 1.34 (1.25 to 1.44), respectively). Urbanised areas (vs rural) also had increased odds of type 2 diabetes (1.14 (1.08 to 1.21)). In the combined definition, the strongest associations (vs townships in rural areas) were city census tracts in urban clusters (1.41 (1.22 to 1.62)) and city census tracts in urbanised areas (1.33 (1.22 to 1.45)). Higher community socioeconomic deprivation and lower greenness were each associated with increased odds. CONCLUSIONS: Urban residence was associated with higher odds of type 2 diabetes than residence in other areas. Higher community socioeconomic deprivation in city census tracts and lower greenness in all community types were also associated with type 2 diabetes.

      5. Incidence of venous thromboembolism in a racially diverse population of Oklahoma County, Oklahomaexternal icon
        Wendelboe AM, Campbell J, Ding K, Bratzler DW, Beckman MG, Reyes NL, Raskob GE.
        Thromb Haemost. 2021 Jan 10.
        BACKGROUND:  Contemporary incidence data for venous thromboembolism (VTE) from racially diverse populations are limited. The racial distribution of Oklahoma County closely mirrors that of the United States. OBJECTIVE:  To evaluate VTE incidence and mortality, including demographic and racial subgroups. DESIGN:  Population-based prospective study. SETTING:  We conducted VTE surveillance at all relevant tertiary care facilities and outpatient clinics in Oklahoma County, Oklahoma during 2012 to 2014, using both active and passive methods. Active surveillance involved reviewing all imaging reports used to diagnose VTE. Passive surveillance entailed identifying VTE events from hospital discharge data and death certificate records. MEASUREMENTS:  We used Poisson regression to calculate crude, age-stratified, and age-adjusted incidence and mortality rates per 1,000 population per year and 95% confidence intervals (CIs). RESULTS:  The incidence rate of all VTE was 3.02 (2.92-3.12) for those age ≥18 years and 0.05 (0.04-0.08) for those <18 years. The age-adjusted incidence rates of all VTE, deep vein thrombosis, and pulmonary embolism were 2.47 (95% CI: 2.39-2.55), 1.47 (1.41-1.54), and 0.99 (0.93-1.04), respectively. The age-adjusted VTE incidence and the 30-day mortality rates, respectively, were 0.63 and 0.121 for Asians/Pacific Islanders, 3.25 and 0.355 for blacks, 0.67 and 0.111 for Hispanics, 1.25 and 0.195 for Native Americans, and 2.71 and 0.396 for whites. CONCLUSION:  The age-adjusted VTE incidence and mortality rates vary substantially by race. The incidence of three per 1,000 adults per year indicates an important disease burden, and is informative of the burden in the U.S.
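The crude rate calculation described above (events per 1,000 population per year with a 95% CI) can be sketched as follows. This is a minimal illustration of the standard log-rate normal approximation, not the authors' Poisson regression code, and the function name and inputs are hypothetical:

```python
import math

def incidence_rate_per_1000(events, person_years):
    # Crude incidence rate per 1,000 person-years with an approximate 95% CI.
    # Uses the usual normal approximation on the log rate, where the standard
    # error of the log rate is 1/sqrt(events).
    rate = events / person_years * 1000
    se_log = 1 / math.sqrt(events)
    lower = rate * math.exp(-1.96 * se_log)
    upper = rate * math.exp(1.96 * se_log)
    return rate, lower, upper
```

For example, 300 events over 100,000 person-years gives a rate of 3.0 per 1,000 per year, comparable in scale to the adult VTE incidence reported above. The study itself used Poisson regression, which additionally supports age stratification and adjustment.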

    • Communicable Diseases
      1. Associations of father and adult male presence with first pregnancy and HIV infection: Longitudinal evidence from Adolescent Girls and Young Women in Rural South Africa (HPTN 068)external icon
        Albert LM, Edwards J, Pence B, Speizer IS, Hillis S, Kahn K, Gómez-Olivé FX, Wagner RG, Twine R, Pettifor A.
        AIDS Behav. 2021 Jan 8.
        This study, a secondary analysis of the HPTN 068 randomized control trial, aimed to quantify the association of father and male presence with HIV incidence and first pregnancy among 2533 school-going adolescent girls and young women (AGYW) in rural South Africa participating in the trial between March 2011 and April 2017. Participants' ages ranged from 13-20 years at study enrollment and 17-25 at the post-intervention visit. HIV and pregnancy incidence rates were calculated for each level of the exposure variables using Poisson regression, adjusted for age using restricted quadratic spline variables, and, in the case of pregnancy, also adjusted for whether the household received a social grant. Our study found that AGYW whose fathers were deceased and adult males were absent from the household were most at risk for incidence of first pregnancy and HIV (pregnancy: aIRR = 1.30, Wald 95% CI 1.05, 1.61, Wald chi-square p = 0.016; HIV: aIRR = 1.27, Wald 95% CI 0.84, 1.91, Wald chi-square p = 0.263) as compared to AGYW whose biological fathers resided with them. For AGYW whose fathers were deceased, having other adult males present as household members seemed to attenuate the incidence (pregnancy: aIRR = 0.92, Wald 95% CI 0.74, 1.15, Wald chi-square p = 0.462; HIV: aIRR = 0.90, Wald 95% CI 0.58, 1.39, Wald chi-square p = 0.623) such that it was similar, and therefore not statistically significantly different, to AGYW whose fathers were present in the household.

      2. Time from start of quarantine to SARS-CoV-2 positive test among quarantined college and university athletes - 17 states, June-October 2020external icon
        Atherstone C, Peterson ML, Malone M, Honein MA, MacNeil A, O'Neal CS, Paul S, Harmon KG, Goerl K, Wolfe CR, Casani J, Barrios LC.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 8;70(1):7-11.
        To safely resume sports, college and university athletic programs and regional athletic conferences created plans to mitigate transmission of SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19). Mitigation measures included physical distancing, universal masking, and maximizing outdoor activity during training; routine testing; 10-day isolation of persons with COVID-19; and 14-day quarantine of athletes identified as close contacts* of persons with confirmed COVID-19. Regional athletic conferences created testing and quarantine policies based on National Collegiate Athletic Association (NCAA) guidance (1); testing policies varied by conference, school, and sport. To improve compliance with quarantine and reduce the personal and economic burden of quarantine adherence, the quarantine period has been reduced in several countries from 14 days to as few as 5 days with testing (2) or 10 days without testing (3). Data on quarantined athletes participating in NCAA sports were used to characterize COVID-19 exposures and assess the amount of time between quarantine start and first positive SARS-CoV-2 test result. Despite the potential risk for transmission from frequent, close contact associated with athletic activities (4), more athletes reported exposure to COVID-19 at social gatherings (40.7%) and from roommates (31.7%) than they did from exposures associated with athletic activities (12.7%). Among 1,830 quarantined athletes, 458 (25%) received positive reverse transcription-polymerase chain reaction (RT-PCR) test results during the 14-day quarantine, with a mean of 3.8 days from quarantine start (range = 0-14 days) until the positive test result. Among athletes who had not received a positive test result by quarantine day 5, the probability of having a positive test result decreased from 27% after day 5 to <5% after day 10. 
These findings support new guidance from CDC (5) in which different options are provided to shorten quarantine for persons such as collegiate athletes, especially if doing so will increase compliance, balancing the reduced duration of quarantine against a small but nonzero risk for postquarantine transmission. Improved adherence to mitigation measures (e.g., universal masking, physical distancing, and hand hygiene) at all times could further reduce exposures to SARS-CoV-2 and disruptions to athletic activities because of infections and quarantine (1,6).
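The key quantity in the abstract above — among athletes not yet positive by day d, the probability of a positive result later in the 14-day quarantine — is a simple conditional proportion. A minimal sketch, with a hypothetical function name and toy inputs (not the study's data):

```python
def prob_positive_after_day(first_positive_days, n_quarantined, day):
    # first_positive_days: day of the first positive test (0-14) for each
    # quarantined person who tested positive; all other quarantined persons
    # are assumed to have had no positive result.
    later = sum(1 for d in first_positive_days if d > day)
    not_yet_positive = n_quarantined - sum(1 for d in first_positive_days if d <= day)
    # Conditional probability of a positive result after `day`, given no
    # positive result through `day`.
    return later / not_yet_positive
```

With this kind of tabulation, the abstract's finding corresponds to the conditional proportion falling from 27% after day 5 to under 5% after day 10.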

      3. Rates of COVID-19 among residents and staff members in nursing homes - United States, May 25-November 22, 2020external icon
        Bagchi S, Mak J, Li Q, Sheriff E, Mungai E, Anttila A, Soe MM, Edwards JR, Benin AL, Pollock DA, Shulman E, Ling S, Moody-Williams J, Fleisher LA, Srinivasan A, Bell JM.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 15;70(2):52-55.
        During the beginning of the coronavirus disease 2019 (COVID-19) pandemic, nursing homes were identified as congregate settings at high risk for outbreaks of COVID-19 (1,2). Their residents also are at higher risk than the general population for morbidity and mortality associated with infection with SARS-CoV-2, the virus that causes COVID-19, in light of the association of severe outcomes with older age and certain underlying medical conditions (1,3). CDC's National Healthcare Safety Network (NHSN) launched nationwide, facility-level COVID-19 nursing home surveillance on April 26, 2020. A federal mandate issued by the Centers for Medicare & Medicaid Services (CMS), required nursing homes to commence enrollment and routine reporting of COVID-19 cases among residents and staff members by May 25, 2020. This report uses the NHSN nursing home COVID-19 data reported during May 25-November 22, 2020, to describe COVID-19 rates among nursing home residents and staff members and compares these with rates in surrounding communities by corresponding U.S. Department of Health and Human Services (HHS) region.* COVID-19 cases among nursing home residents increased during June and July 2020, reaching 11.5 cases per 1,000 resident-weeks (calculated as the total number of occupied beds on the day that weekly data were reported) (week of July 26). By mid-September, rates had declined to 6.3 per 1,000 resident-weeks (week of September 13) before increasing again, reaching 23.2 cases per 1,000 resident-weeks by late November (week of November 22). COVID-19 cases among nursing home staff members also increased during June and July (week of July 26 = 10.9 cases per 1,000 resident-weeks) before declining during August-September (week of September 13 = 6.3 per 1,000 resident-weeks); rates increased by late November (week of November 22 = 21.3 cases per 1,000 resident-weeks). Rates of COVID-19 in the surrounding communities followed similar trends. 
Increases in community rates might be associated with increases in nursing home COVID-19 incidence, and nursing home mitigation strategies need to include a comprehensive plan to monitor local SARS-CoV-2 transmission and minimize high-risk exposures within facilities.
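The resident-week denominator defined in the abstract above (each reporting week contributes that week's occupied-bed count) makes the rate calculation straightforward. A minimal sketch with hypothetical names, not NHSN's own computation:

```python
def cases_per_1000_resident_weeks(cases, weekly_occupied_beds):
    # Each element of weekly_occupied_beds is the number of occupied beds on
    # the day that week's data were reported; their sum is resident-weeks.
    resident_weeks = sum(weekly_occupied_beds)
    return cases / resident_weeks * 1000
```

For instance, 23 cases over two weeks in a 500-bed facility (1,000 resident-weeks) yields 23.0 cases per 1,000 resident-weeks, the scale of the late-November peak reported above.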

      4. Mitigation policies and COVID-19-associated mortality - 37 European countries, January 23-June 30, 2020external icon
        Fuller JA, Hakim A, Victory KR, Date K, Lynch M, Dahl B, Henao O.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 15;70(2):58-62.
        As cases and deaths from coronavirus disease 2019 (COVID-19) in Europe rose sharply in late March 2020, most European countries implemented strict mitigation policies, including closure of nonessential businesses and mandatory stay-at-home orders. These policies were largely successful at curbing transmission of SARS-CoV-2, the virus that causes COVID-19 (1), but they came with social and economic costs, including increases in unemployment, interrupted education, social isolation, and related psychosocial outcomes (2,3). A better understanding of when and how these policies were effective is needed. Using data from 37 European countries, the impact of the timing of these mitigation policies on mortality from COVID-19 was evaluated. Linear regression was used to assess the association between policy stringency at an early time point and cumulative mortality per 100,000 persons on June 30. Implementation of policies earlier in the course of the outbreak was associated with lower COVID-19-associated mortality during the subsequent months. An increase of one standard deviation in policy stringency at an early time point was associated with 12.5 fewer cumulative deaths per 100,000 on June 30. Countries that implemented stringent policies earlier might have saved several thousand lives relative to countries that implemented similar policies later. Earlier implementation of mitigation policies, even by just a few weeks, might be an important strategy to reduce the number of deaths from COVID-19.
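The per-standard-deviation association reported above is an ordinary least-squares slope rescaled by the predictor's standard deviation. A minimal sketch with a hypothetical function name and toy data (not the study's dataset or exact model, which may have included covariates):

```python
def slope_per_sd(x, y):
    # OLS slope of y on x, rescaled to the change in y per one standard
    # deviation increase in x -- the form of effect reported above
    # (e.g., fewer cumulative deaths per 100,000 per SD of early stringency).
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx                  # change in y per unit of x
    sd_x = (sxx / (n - 1)) ** 0.5      # sample SD of x
    return slope * sd_x                # change in y per 1 SD of x
```

Reporting effects per SD makes stringency indices with arbitrary units comparable across analyses.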

      5. Geographic range of recreational water-associated primary amebic meningoencephalitis, United States, 1978-2018external icon
        Gharpure R, Gleason M, Salah Z, Blackstock AJ, Hess-Homeier D, Yoder JS, Ali IK, Collier SA, Cope JR.
        Emerg Infect Dis. 2021 Jan;27(1):271-274.
        Naegleria fowleri is a free-living ameba that causes primary amebic meningoencephalitis (PAM), a rare but usually fatal disease. We analyzed trends in recreational water exposures associated with PAM cases reported during 1978-2018 in the United States. Although PAM incidence remained stable, the geographic range of exposure locations expanded northward.

      6. Performance of Xpert MTB/RIF and Determine TB-LAM Ag in HIV-infected adults in peri-urban sites in Zambiaexternal icon
        Kasaro MP, Chilyabanyama ON, Shah NS, Muluka B, Kapata N, Krüüner A, Mwaba I, Kaunda K, Coggin WL, Wen XJ, Henostroza G, Reid S.
        Public Health Action. 2020 Dec 21;10(4):134-140.
        SETTING: Peri-urban health facilities providing HIV and TB care in Zambia. OBJECTIVE: To evaluate 1) the impact of Xpert(®) MTB/RIF on time-to-diagnosis, treatment initiation, and outcomes among adult people living with HIV (PLHIV) on antiretroviral therapy (ART); and 2) the diagnostic performance of Xpert and Determine™ TB-LAM Ag assays. DESIGN: Quasi-experimental study design with the first cohort evaluated per standard-of-care (SOC; first sputum tested using smear microscopy) and the second cohort per an algorithm using Xpert as initial test (intervention phase; IP). Xpert testing was provided onsite in Chongwe District, while samples were transported 5-10 km in Kafue District. TB was confirmed using mycobacterial culture. RESULTS: Among 1350 PLHIV enrolled, 156 (15.4%) had confirmed TB. Time from TB evaluation to diagnosis (P = 0.018), and from evaluation to treatment initiation (P = 0.03) was significantly shorter for IP than for SOC. There was no difference in all-cause mortality (7.0% vs. 8.6%). TB-LAM Ag showed higher sensitivity with lower CD4 cell count: 81.8% at CD4 < 50 cells/mm(3) vs. 31.7% overall. CONCLUSION: Xpert improved time to diagnosis and treatment initiation, but there was no difference in all-cause mortality. High sensitivity of Determine TB-LAM Ag at lower CD4 count supports increased use in settings providing care to PLHIV, particularly with advanced HIV disease.

      7. Screening and treatment of sexually transmitted infections among Medicaid populations - a two-state analysisexternal icon
        Merrell MA, Betley C, Crouch E, Hung P, Stockwell I, Middleton A, Pearson WS.
        Sex Transm Dis. 2021 Jan 8;Publish Ahead of Print.
        BACKGROUND: Chlamydia, gonorrhea, and syphilis are common, treatable sexually transmitted infections (STIs) that are highly prevalent in the general U.S. population. Costs associated with diagnosing and treating these conditions for individual states' Medicaid participants are unknown. The purpose of this study was to estimate the cost of screening and treatment for three common STIs for state Medicaid program budgets in Maryland and South Carolina. METHODS: A retrospective, cross-sectional study was conducted using Medicaid administrative claims data over a two-year period. Claims were included based on the presence of one of the three study conditions in either diagnosis or procedure codes. Descriptive analyses were used to characterize the participant population and expenditures for services provided. RESULTS: Total Medicaid expenditures for STI care in state fiscal years 2016 and 2017 averaged $43.5 million and $22.3 million for each year in Maryland and South Carolina, respectively. Maryland had a greater proportion of costs associated with outpatient hospital and laboratory settings. Costs for care provided in the emergency department were highest in South Carolina. CONCLUSIONS: Diagnosis and treatment of commonly reported STIs may have a considerable financial impact on individual state Medicaid programs. Public health activities directed at STI prevention are important tools for reducing these costs to states.

      8. Recognizing the hidden: strengthening the HIV surveillance system among key and priority populations in Mozambiqueexternal icon
        Semá Baltazar C, Boothe M, Chitsondzo Langa D, Sathane I, Horth R, Young P, Schaad N, Raymond HF.
        BMC Public Health. 2021 Jan 7;21(1):91.
        High quality, representative data from HIV surveillance systems that have country ownership and commitment are critical for guiding national HIV responses, especially among key and priority populations given their disproportionate role in the transmission of the virus. Between 2011 and 2013, the Mozambique Ministry of Health conducted five Biobehavioral Surveillance Surveys among key populations (female sex workers, men who have sex with men, and people who inject drugs) and priority populations (long-distance truck drivers and miners) as part of the national HIV surveillance system. We describe the experience of strengthening the HIV surveillance system among those populations through the implementation of these surveys in Mozambique. We document the lessons learned through the impact on coordination and collaboration; workforce development and institutional capacity building; data use and dissemination; advocacy and policy impact; financial sustainability and community impact. Key lessons learned include the importance of multisectoral collaboration, the vital role of data in supporting key populations' visibility and advocacy efforts, and institutional capacity building of government agencies and key population organizations. Given that traditional surveillance methodologies based on routine data often do not capture these hidden populations, it will be important to ensure that Biobehavioral Surveillance Surveys are an integral part of ongoing HIV surveillance activities in Mozambique.

      9. An influenza A (H3N2) virus outbreak in the Kingdom of Cambodia during the COVID-19 pandemic of 2020external icon
        Sovann LY, Sar B, Kab V, Yann S, Kinzer M, Raftery P, Albalak R, Patel S, Hay PL, Seng H, Um S, Chin S, Chau D, Khalakdina A, Karlsson E, Olsen SJ, Mott JA.
        Int J Infect Dis. 2020 Nov 26;103:352-357.
        BACKGROUND: Global influenza virus circulation decreased during the COVID-19 pandemic, possibly due to widespread community mitigation measures. Cambodia eased some COVID-19 mitigation measures in June and July 2020. On 20 August a cluster of respiratory illnesses occurred among residents of a pagoda, including people who tested positive for influenza A but none who were positive for SARS-CoV-2. METHODS: A response team was deployed on 25 August 2020. People with influenza-like illness (ILI) were asked questions regarding demographics, illness, personal prevention measures, and residential arrangements. Respiratory swabs were tested for influenza and SARS-CoV-2 by real-time reverse transcription PCR, and viruses were sequenced. Sentinel surveillance data were analyzed to assess recent trends in influenza circulation in the community. RESULTS: Influenza A (H3N2) viruses were identified during sentinel surveillance in Cambodia in July 2020, prior to the reported pagoda outbreak. Among the 362 pagoda residents, 73 (20.2%) ILI cases were identified and 40 were tested, of whom 33 (82.5%) were confirmed positive for influenza A (H3N2). All 40 were negative for SARS-CoV-2. Among the 73 residents with ILI, none were vaccinated against influenza, 47 (64%) clustered in 3/8 sleeping quarters, 20 (27%) reported often wearing a mask, 27 (36%) reported often washing hands, and 11 (15%) reported practicing social distancing. All viruses clustered within clade 3c2.A1, close to strains circulating in Australia in 2020. CONCLUSIONS: Circulation of influenza viruses began in the community following the relaxation of national COVID-19 mitigation measures, and prior to the outbreak in a pagoda with limited social distancing. Continued surveillance and influenza vaccination are required to limit the impact of influenza globally.

      10. Documenting successes 30 years after passage of the Ryan White CARE Act: To the editorexternal icon
        Weiser J, Dempsey A, Mandsager P, Shouse RL.
        J Assoc Nurses AIDS Care. 2021 Jan 6;Publish Ahead of Print.

      11. Understanding the contribution of CDC-funded testing toward diagnosing HIV informs efforts to end the HIV epidemic. Due to differences in surveillance data and CDC program data, which sometimes rely on self-reported information, the number of new diagnoses cannot be directly compared. CDC recently asked grantees to check surveillance data to inform the identification of new diagnoses from CDC-funded tests. In this analysis, we use this newly available information to estimate the percent of all HIV diagnoses from 2010 to 2017 in the United States that result from CDC-funded tests. Among tests with surveillance information, correlates of correct categorization using self-report only were assessed. Weights were calculated from that analysis and used to estimate the total number of CDC-funded new diagnoses. Estimates are presented overall and by demographics/transmission risk group. We estimate that one third of all HIV diagnoses in the United States from 2010 to 2017 resulted from a CDC-funded test. The percent of diagnoses that resulted from CDC-funded tests was higher among some high-risk groups: 41% among 20-29-year-olds and 39% among blacks/African Americans. When compared to total diagnoses in the United States from 2010 to 2017, a large proportion resulted from CDC-funded tests, particularly among young individuals and blacks/African Americans. CDC's contribution to new HIV diagnoses was previously unknown. CDC-funded testing is an important part of the national effort to diagnose all people with HIV as early as possible after infection.

    • Disaster Control and Emergency Services
      1. OBJECTIVE: This article describes the development of a prototype dry decontamination system (DryCon) for use in the event of a contamination incident involving a particulate contaminant. Disrobing and showering is currently recommended almost exclusively in mass decontamination, although it may not be feasible when water is scarce, in cold weather environments, or when there may be compliance issues with the requirement to disrobe, ie, unwillingness to disrobe. During disrobing, dust particles could also re-aerosolize, leading to inhalation of contaminants. DESIGN: The DryCon prototype uses air jets for dry decontamination. The system is portable and can run on building-supplied 220-V power or generator power. Multiple contaminated persons can be treated rapidly, one after the other, using this system. SETTING: We tested DryCon in a controlled environment, using a manikin and three different types of fabric squares to investigate its effectiveness, with a decontamination time of 60 seconds. MAIN OUTCOME: At the higher airflow tested, ie, 90 percent of full blower speed or approximately 540 cfm (15 m3/minute), mean decontamination efficiencies of 56.8 percent, 70.3 percent, and 80.7 percent were measured for firefighter (FF) turnout fabric, cotton denim, and polyester double knit fabric, respectively. RESULTS: Removal of this easily re-aerosolized fraction of the contaminants helps protect contaminated people, as well as healthcare providers they come in contact with, from the potential risk of further inhalation exposures from the re-aerosolization caused by doffing clothing. CONCLUSION: The results demonstrate the promise of the DryCon system for use where water is not available, as a first step prior to wet decontamination, or in an industrial setting for post-work-shift decontamination. 
Further lab and field research will be necessary to prove the effectiveness of this technique in real-world applications and to determine if respiratory protection or other personal protective equipment (PPE) is needed during use of the DryCon system.

      2. Respiratory protection in a time of crisis: NIOSH testing of international respiratory protective devices for emergency useexternal icon
        Andrews AS, Powers JR, Cichowicz JK, Coffey CC, Fries ML, Yorio PL, D'Alessandro MM.
        Health Secur. 2021 Jan 11.
        National Institute for Occupational Safety and Health (NIOSH)-approved respirators are required by the Occupational Safety and Health Administration (OSHA) when personal respiratory protection is used in US occupational settings. During the COVID-19 pandemic, the demand for NIOSH-approved N95 filtering facepiece respirators overwhelmed the available supply. To supplement the national inventory of N95 respirators, contingency and crisis capacity strategies were implemented and incorporated a component that endorsed the use of non-NIOSH-approved respiratory protective devices that conformed to select international standards. The development and execution of this strategy required the collaborative effort of numerous agencies. The Food and Drug Administration temporarily authorized non-NIOSH-approved international respiratory protective devices through an emergency use authorization, OSHA relaxed their enforcement guidance concerning their use in US workplaces, and NIOSH initiated a supplemental performance assessment process to verify the quality of international devices. NIOSH testing revealed that many of the non-NIOSH-approved respiratory protective devices had filtration efficiencies below 95% and substantial inconsistencies in filtration performance. This article reports the results of the NIOSH testing to date and discusses how it has contributed to continuous improvement of the crisis strategy of temporarily permitting the use of non-NIOSH-approved respirators in US occupational settings during the COVID-19 pandemic.

      3. CDC's Emergency Management Program activities - worldwide, 2013-2018external icon
        Rico A, Sanders CA, Broughton AS, Andrews M, Bader FA, Maples DL.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 15;70(2):36-39.
        CDC continually evaluates its Emergency Management Program (EMP) activities, including Incident Management System (IMS) activations, use of EMP functions (referred to as EMP utilizations), and exercises, to ensure that the agency is ready to respond to infectious disease outbreaks, disasters (human-made or natural), and security events. Such evaluation not only documents baseline preparedness and response activities during a selected analytical period, but also highlights significant EMP actions that can guide and inform future emergency operations. To characterize EMP activities that occurred during January 1, 2013-December 31, 2018, CDC conducted a retrospective analysis of operational activity logs. The results showed 253 domestic (U.S. states and territories) and international EMP activities, including 12 IMS activations, 147 EMP utilizations, and 94 exercises. Infectious diseases were the most common threat among both IMS activations (58%) and EMP utilizations (52%). CDC responded to the 2014 Ebola epidemic and the 2016 Zika outbreak; each response lasted approximately 2 years and required extended collaboration with domestic and international partners. Understanding the trends in EMP activities, including knowing the most common threats, aids CDC in allocating resources and focusing preparedness efforts. In 2013, CDC became the first federal agency to receive full agency-wide accreditation by the Emergency Management Accreditation Program (EMAP) in recognition of CDC's commitment to preparedness and its ability to respond to domestic and global public health threats. CDC received EMAP reaccreditation in December 2018 (1,2).

      4. An analytic perspective of a mixed methods study during humanitarian crises in South Sudan: translating facility- and community-based newborn guidelines into practiceexternal icon
        Sami S, Amsalu R, Dimiti A, Jackson D, Kenneth K, Kenyi S, Meyers J, Mullany LC, Scudder E, Tomczyk B, Kerber K.
        Confl Health. 2021 Jan 12;15(1):5.
        BACKGROUND: In South Sudan, the civil war in 2016 led to mass displacement in Juba that rapidly spread to other regions of the country. Access to health care was limited because of attacks against health facilities and workers and pregnant women and newborns were among the most vulnerable. Translation of newborn guidelines into public health practice, particularly during periods of on-going violence, are not well studied during humanitarian emergencies. During 2016 to 2017, we assessed the delivery of a package of community- and facility-based newborn health interventions in displaced person camps to understand implementation outcomes. This case analysis describes the challenges encountered and mitigating strategies employed during the conduct of an original research study. DISCUSSION: Challenges unique to conducting research in South Sudan included violent attacks against humanitarian aid workers that required research partners to modify study plans on an ongoing basis to ensure staff and patient safety. South Sudan faced devastating cholera and measles outbreaks that shifted programmatic priorities. Costs associated with traveling study staff and transporting equipment kept rising due to hyperinflation and, after the July 2016 violence, the study team was unable to convene in Juba for some months to conduct refresher trainings or monitor data collection. Strategies used to address these challenges were: collaborating with non-research partners to identify operational solutions; maintaining a locally-based study team; maintaining flexible budgets and timelines; using mobile data collection to conduct timely data entry and remote quality checks; and utilizing a cascade approach for training field staff. 
CONCLUSIONS: The case analysis provides lessons that are applicable to other humanitarian settings including the need for flexible research methods, budgets and timelines; innovative training and supervision; and a local research team with careful consideration of sociopolitical factors that impact their access and safety. Engagement of national and local stakeholders can ensure health services and data collection continue and findings translate to public health action, even in contexts facing severe and unpredictable insecurity.

    • Disease Reservoirs and Vectors
      1. Bats are key hosts in the radiation of mammal-associated Bartonella bacteriaexternal icon
        McKee CD, Bai Y, Webb CT, Kosoy MY.
        Infect Genet Evol. 2021 Jan 11:104719.
        Bats are notorious reservoirs of several zoonotic diseases and may be uniquely tolerant of infection among mammals. Broad sampling has revealed the importance of bats in the diversification and spread of viruses and eukaryotes to other animal hosts. Vector-borne bacteria of the genus Bartonella are prevalent and diverse in mammals globally, and recent surveys have revealed numerous Bartonella lineages in bats. We assembled a sequence database of Bartonella strains, consisting of nine genetic loci from 209 previously characterized Bartonella lineages and 121 new cultured isolates from bats, and used these data to perform a comprehensive phylogenetic analysis of the Bartonella genus. This analysis included estimation of divergence dates using a molecular clock and ancestral reconstruction of host associations and geography. We estimate that Bartonella began infecting mammals 62 million years ago near the Cretaceous-Paleogene boundary. Additionally, the radiation of particular Bartonella clades correlates strongly with the timing of diversification and biogeography of mammalian hosts. Bats were inferred to be the ancestral hosts of all mammal-associated Bartonella and appear to be responsible for the early geographic expansion of the genus. We conclude that bats have had a deep influence on the evolutionary radiation of Bartonella bacteria and their spread to other mammalian orders. These results support a 'bat seeding' hypothesis that could explain similar evolutionary patterns in other mammalian parasite taxa. Applying the phylogenetic tools used here to other taxa may reveal the general importance of bats in the ancient diversification of mammalian parasites.

      2. Characterization of Pyrethroid Resistance Mechanisms in Aedes aegypti from the Florida Keysexternal icon
        Scott ML, Hribar LJ, Leal AL, McAllister JC.
        Am J Trop Med Hyg. 2021 Jan 11.
        The status of insecticide resistance in Aedes aegypti is of concern in areas where Aedes-borne arboviruses like chikungunya, dengue, and Zika occur. In recent years, outbreaks involving these arboviruses have occurred, for which vaccines do not exist; therefore, disease prevention is only through vector control and personal protection. Aedes aegypti are present on every inhabited island within the Florida Keys. The resistance status of Ae. aegypti in the Florida Keys was assessed to guide knowledge of the best choice of chemical for use during an outbreak. Mosquito eggs were collected using ovitraps placed on Key West, Stock Island, Vaca Key, Upper Matecumbe Key, Plantation Key, and Key Largo. Bottle bioassays were conducted at the Florida Keys Mosquito Control District using Bifenthrin(®) 30+30. Further bottle testing using malathion and permethrin occurred at the CDC, Fort Collins, CO, in addition to molecular and biochemical assays. Levels of resistance varied between islands with different underlying mechanisms present. Resistance was seen to bifenthrin 30+30 but not to permethrin, indicating that piperonyl butoxide (PBO) or the inert ingredients may be involved in resistance. No study has been conducted to date examining the role of PBO in resistance. Key Largo was treated the most with adulticides and expressed the highest levels of alpha and beta esterases, oxidases, glutathione-S-transferases, and frequency of the V1016I knockdown mutation from all sites tested. Knowledge of localized resistance and underlying mechanisms helps in making rational decisions in selection of appropriate and effective insecticides.

    • Environmental Health
      1. Gestational and childhood exposure to per- and polyfluoroalkyl substances and cardiometabolic risk at age 12 yearsexternal icon
        Li N, Liu Y, Papandonatos GD, Calafat AM, Eaton CB, Kelsey KT, Cecil KM, Kalkwarf HJ, Yolton K, Lanphear BP, Chen A, Braun JM.
        Environ Int. 2021 Jan 6;147:106344.
        BACKGROUND: Per- and polyfluoroalkyl substances (PFAS) may adversely influence cardiometabolic risk. However, few studies have examined if the timing of early life PFAS exposure modifies their relation to cardiometabolic risk. We examined the influence of gestational and childhood PFAS exposure on adolescents' cardiometabolic risk. METHODS: We quantified concentrations of four PFAS (perfluorooctanoate [PFOA], perfluorooctane sulfonate [PFOS], perfluorononanoate [PFNA], and perfluorohexane sulfonate [PFHxS]) in sera collected during pregnancy, at birth, and at ages 3, 8, and 12 years from 221 mother-child pairs in the HOME Study (enrolled 2003-06, Cincinnati, Ohio). We measured cardiometabolic risk factors using physical examinations, fasting serum biomarkers, and dual-energy X-ray absorptiometry scans at age 12 years. Cardiometabolic risk summary scores were calculated by summing age- and sex-standardized z-scores for individual cardiometabolic risk factors. We used multiple informant models to estimate covariate-adjusted associations of serum PFAS concentrations (log(2)-transformed) at each visit with cardiometabolic risk scores and their individual components, and tested for differences in associations across visits. RESULTS: The associations of serum PFOA concentrations with cardiometabolic risk scores differed across visits (P for heterogeneity = 0.03). Gestational and cord serum PFOA concentrations were positively associated with cardiometabolic risk scores (βs and 95% confidence intervals [95% CIs]: gestational 0.8 [0.0, 1.6]; cord 0.9 [-0.1, 1.9] per interquartile range increase). These positive associations were primarily driven by homeostatic model assessment for insulin resistance index (β = 0.3 [0.1, 0.5]) and adiponectin to leptin ratio (β = -0.5 [-1.0, 0.0]). Other individual cardiometabolic risk factors associated with gestational PFOA included insulin and waist circumference. Gestational and cord PFHxS were also associated with higher cardiometabolic risk scores (βs: gestational 0.9 [0.2, 1.6]; cord 0.9 [0.1, 1.7]). CONCLUSION: In this cohort of children with higher gestational PFOA exposure, fetal exposure to PFOA and PFHxS was associated with unfavorable cardiometabolic risk in adolescence.

      2. This study assesses the potential impact of drought on arsenic exposure from private domestic wells by using a previously developed statistical model that predicts the probability of elevated arsenic concentrations (>10 μg per liter) in water from domestic wells located in the conterminous United States (CONUS). The application of the model to simulate drought conditions used systematically reduced precipitation and recharge values. The drought conditions resulted in higher probabilities of elevated arsenic throughout most of the CONUS. While the increase in the probability of elevated arsenic was generally less than 10% at any one location, when considered over the entire CONUS, the increase has considerable public health implications. The population exposed to elevated arsenic from domestic wells was estimated to increase from approximately 2.7 million to 4.1 million people during drought. The model was also run using total annual precipitation and groundwater recharge values from the year 2012 when drought existed over a large extent of the CONUS. This simulation provided a method for comparing the duration of drought to changes in the predicted probability of high arsenic in domestic wells. These results suggest that the probability of exposure to arsenic concentrations greater than 10 μg per liter increases with increasing duration of drought. These findings indicate that drought has a potentially adverse impact on the arsenic hazard from domestic wells throughout the CONUS.
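        The population estimate above aggregates modeled per-location probabilities into an expected count of exposed people. A minimal sketch of that aggregation step, using entirely hypothetical area populations and probabilities (the study's actual model spans the conterminous United States and estimated an increase from ~2.7 to ~4.1 million people under drought):

```python
# Expected exposed population = sum over areas of
# (population using domestic wells) x P(arsenic > 10 ug/L).
# All values below are hypothetical illustration data.
areas = [
    # (population using domestic wells, P(arsenic > 10 ug/L))
    (10_000, 0.05),
    (25_000, 0.12),
    (8_000, 0.30),
]
expected_exposed = sum(pop * p for pop, p in areas)
# re-running with drought-adjusted (higher) probabilities would raise
# this total, mirroring the study's comparison of baseline vs. drought
```

Raising each probability by even a few percentage points, as the drought simulation did, compounds across millions of wells into a large absolute change in the exposed population.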

      3. A prospective ultrasound study of plasma polychlorinated biphenyl concentrations and incidence of uterine leiomyomataexternal icon
        Wesselink AK, Claus Henn B, Fruh V, Orta OR, Weuve J, Hauser R, Williams PL, McClean MD, Sjodin A, Bethea TN, Brasky TM, Baird DD, Wise LA.
        Epidemiology. 2021 Jan 6;Publish Ahead of Print.
        BACKGROUND: Uterine leiomyomata, or fibroids, are hormone-dependent neoplasms of the myometrium that can cause severe gynecologic morbidity. In previous studies, incidence of these lesions has been positively associated with exposure to polychlorinated biphenyls (PCBs), a class of persistent endocrine-disrupting chemicals. However, previous studies have been retrospective in design and none has used ultrasound to reduce disease misclassification. METHODS: The Study of Environment, Lifestyle, and Fibroids is a prospective cohort of 1,693 reproductive-aged Black women residing in Detroit, Michigan (enrolled during 2010-2012). At baseline and every 20 months for 5 years, women completed questionnaires, provided blood samples, and underwent transvaginal ultrasound to detect incident fibroids. We analyzed 754 baseline plasma samples for concentrations of 24 PCB congeners using a case-cohort study design. We used multivariable Cox proportional hazards regression to estimate hazard ratios (HRs) and 95% confidence intervals (CIs) for the association between plasma PCB concentrations and ultrasound-detected fibroid incidence over a 5-year period. RESULTS: We observed little association between PCB congener concentrations and fibroid incidence. The HR for a one-standard deviation increase in log-transformed total PCBs was 0.94 (95% CI: 0.78, 1.1). The PCB congener with the largest effect estimate was PCB 187 (HR for a one-standard deviation increase in log-transformed exposure=0.88, 95% CI: 0.73, 1.1). Associations did not appear to vary strongly across PCB groupings based on hormonal activity. CONCLUSIONS: In this cohort of reproductive-aged Black women, plasma PCB concentrations typical of the contemporary general population were not appreciably associated with higher risk of fibroids.

    • Food Safety
      1. Recency-weighted statistical modeling approach to attribute illnesses caused by 4 pathogens to food sources using outbreak data, United Statesexternal icon
        Batz MB, Richardson LC, Bazaco MC, Parker CC, Chirtel SJ, Cole D, Golden NJ, Griffin PM, Gu W, Schmitt SK, Wolpert BJ, Kufel JS, Hoekstra RM.
        Emerg Infect Dis. 2021 Jan;27(1):214-222.
        Foodborne illness source attribution is foundational to a risk-based food safety system. We describe a method for attributing US foodborne illnesses caused by nontyphoidal Salmonella enterica, Escherichia coli O157, Listeria monocytogenes, and Campylobacter to 17 food categories using statistical modeling of outbreak data. This method adjusts for epidemiologic factors associated with outbreak size, down-weights older outbreaks, and estimates credibility intervals. On the basis of 952 reported outbreaks and 32,802 illnesses during 1998-2012, we attribute 77% of foodborne Salmonella illnesses to 7 food categories (seeded vegetables, eggs, chicken, other produce, pork, beef, and fruits), 82% of E. coli O157 illnesses to beef and vegetable row crops, 81% of L. monocytogenes illnesses to fruits and dairy, and 74% of Campylobacter illnesses to dairy and chicken. However, because Campylobacter outbreaks probably overrepresent dairy as a source of nonoutbreak campylobacteriosis, we caution against using these Campylobacter attribution estimates without further adjustment.
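        The down-weighting of older outbreaks described above can be sketched with a toy recency-weighted attribution, using hypothetical outbreak records and a hypothetical geometric decay rate (the paper's actual model also adjusts for factors associated with outbreak size and estimates credibility intervals):

```python
# Toy recency-weighted source attribution: each outbreak contributes
# its illness count scaled by a geometric weight that shrinks with age,
# so recent outbreaks dominate the attributed shares.
from collections import defaultdict

def attribute(outbreaks, ref_year, decay=0.9):
    """Return each food category's share of recency-weighted illnesses.
    `decay` is a hypothetical annual down-weighting factor."""
    weighted = defaultdict(float)
    for year, category, illnesses in outbreaks:
        weight = decay ** (ref_year - year)
        weighted[category] += weight * illnesses
    total = sum(weighted.values())
    return {c: w / total for c, w in weighted.items()}

# Hypothetical (year, category, illnesses) records
outbreaks = [
    (1998, "eggs", 100),
    (2010, "chicken", 50),
    (2012, "beef", 30),
]
shares = attribute(outbreaks, ref_year=2012)
```

Note how the 1998 outbreak, despite having the most illnesses, ends up contributing less weight than the two recent, smaller outbreaks.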

      2. Attribution of illnesses transmitted by food and water to comprehensive transmission pathways using structured expert judgment, United Statesexternal icon
        Beshearse E, Bruce BB, Nane GF, Cooke RM, Aspinall W, Hald T, Crim SM, Griffin PM, Fullerton KE, Collier SA, Benedict KM, Beach MJ, Hall AJ, Havelaar AH.
        Emerg Infect Dis. 2021 Jan;27(1):182-195.
        Illnesses transmitted by food and water cause a major disease burden in the United States despite advancements in food safety, water treatment, and sanitation. We report estimates from a structured expert judgment study using 48 experts who applied Cooke's classical model to estimate the proportion of disease attributable to 5 major transmission pathways (foodborne, waterborne, person-to-person, animal contact, and environmental) and 6 subpathways (food handler-related, under foodborne; recreational, drinking, and nonrecreational/nondrinking, under waterborne; and presumed person-to-person-associated and presumed animal contact-associated, under environmental). Estimates for 33 pathogens were elicited, including bacteria such as Salmonella enterica, Campylobacter spp., Legionella spp., and Pseudomonas spp.; protozoa such as Acanthamoeba spp., Cyclospora cayetanensis, and Naegleria fowleri; and viruses such as norovirus, rotavirus, and hepatitis A virus. The results highlight the importance of multiple pathways in the transmission of the included pathogens and can be used to guide prioritization of public health interventions.

    • Genetics and Genomics
      1. Challenges and opportunities for communication about the role of genomics in public healthexternal icon
        Allen CG, Green RF, Bowen S, Dotson WD, Yu W, Khoury MJ.
        Public Health Genomics. 2021 Jan 14:1-7.
        Despite growing awareness about the potential for genomic information to improve population health, lingering communication challenges remain in describing the role of genomics in public health programs. Identifying and addressing these challenges provide an important opportunity for appropriate communication to ensure the translation of genomic discoveries for public health benefits. In this commentary, we describe 5 common communication challenges encountered by the Centers for Disease Control and Prevention's Office of Genomics and Precision Public Health based on over 20 years of experience in the field. These include (1) communicating that using genomics to assess rare diseases can have an impact on public health; (2) providing evidence that genetic factors can add important information to environmental, behavioral, and social determinants of health; (3) communicating that although genetic factors are nonmodifiable, they can increase the impact of public health programs and communication strategies; (4) addressing the concern that genomics is not ready for clinical practice; and (5) communicating that genomics is valuable beyond the domain of health care and can be integrated as part of public health programs. We discuss opportunities for addressing these communication challenges and provide examples of ongoing approaches to communication about the role of genomics in public health to the public, researchers, and practitioners.

      2. Complete genome sequence of a serotype 7 Listeria monocytogenes strain, FSL R9-0915external icon
        Peters TL, Hudson LK, Bryan DW, Song Y, den Bakker HC, Kucerova Z, Denes TG.
        Microbiol Resour Announc. 2021 Jan 7;10(1).
        Listeria monocytogenes serotype 7 lacks glycosidic constituents in wall teichoic acids. Here, we present the complete genome sequence of L. monocytogenes serotype 7 strain FSL R9-0915 and an analysis of genes known to affect L. monocytogenes antigenicity. This strain is used as a control strain in Listeria phage host range analyses.

    • Health Disparities
      1. OBJECTIVES: We assessed the association between hospitalization for illness from COVID-19 infection and chronic conditions among Medicare beneficiaries (MBs) with fee-for-service (FFS) claims by race and ethnicity for January 1-September 30, 2020. METHODS: We used 2020 monthly Medicare data from January 1-September 30, 2020, reported to the Centers for Medicare and Medicaid Services to compute hospitalization rates per 100 COVID-19 MBs with FFS claims who were hospitalized (ICD-10-CM codes: B97.29 before April 1, 2020; ICD-10-CM codes: U07.1 from April 1, 2020, onward) with or without selected chronic conditions. We used logistic regression to estimate adjusted odds ratios with 95% confidence intervals for association of person-level rate of being hospitalized with COVID-19 and each of 27 chronic conditions by race/ethnicity, controlling for age, sex, and urban-rural residence among MBs. RESULTS: COVID-19-related hospitalizations were associated with all selected chronic conditions, except osteoporosis and Alzheimer disease/dementia among COVID-19 MBs. The top five conditions with the highest odds for hospitalization among COVID-19 MBs were end-stage renal disease (adjusted odds ratios (aOR): 2.15; 95% CI: 2.10-2.21), chronic kidney disease (aOR: 1.54; 95% CI: 1.52-1.56), acute myocardial infarction (aOR: 1.45; 95% CI: 1.39-1.53), heart failure (aOR: 1.43; 95% CI: 1.41-1.44), and diabetes (aOR: 1.37; 95% CI: 1.36-1.39). CONCLUSIONS: Racial/ethnic disparities in hospitalization rate persist among MBs with COVID-19, and associations of COVID-19 hospitalization with chronic conditions differ among racial/ethnic groups in the USA. These findings indicate the need for interventions in racial/ethnic populations at the highest risk of being hospitalized with COVID-19.
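        The adjusted odds ratios above come from logistic regression controlling for age, sex, and urban-rural residence. As a simplified, unadjusted illustration of the underlying quantity, a crude odds ratio and Wald confidence interval can be computed from a 2x2 table (all counts below are hypothetical, not the study's data):

```python
# Crude (unadjusted) odds ratio from a 2x2 table. The study's aORs come
# from logistic regression; this sketch shows only the unadjusted analogue.
import math

def odds_ratio(a, b, c, d):
    """OR for a table: a=exposed cases, b=exposed noncases,
    c=unexposed cases, d=unexposed noncases."""
    return (a * d) / (b * c)

def or_ci(a, b, c, d, z=1.96):
    """Wald 95% CI on the log-odds-ratio scale."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return (math.exp(log_or - z * se), math.exp(log_or + z * se))

# Hypothetical counts: hospitalized vs. not, with vs. without a condition
crude_or = odds_ratio(300, 700, 200, 800)
lo, hi = or_ci(300, 700, 200, 800)
```

An aOR such as 2.15 (95% CI: 2.10-2.21) is read the same way: the interval excluding 1 indicates a statistically significant association after adjustment.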

    • Health Economics
      1. Economic burden of Legionnaires' disease, United States, 2014external icon
        Baker-Goering M, Roy K, Edens C, Collier S.
        Emerg Infect Dis. 2021 Jan;27(1):255-257.
        Through the use of published estimates of medical costs and new calculations of productivity losses, we estimate the lifetime economic burden of 2014 Legionnaires' disease cases in the United States at ≈$835 million. This total includes $21 million in productivity losses caused by absenteeism and $412 million in productivity losses caused by premature deaths.

      2. Medical claims paid by workers' compensation insurance among US Medicare beneficiaries, 1999-2016external icon
        Kurth L, Casey M, Chin B, Mazurek JM, Schleiff P, Halldin C, Blackley DJ.
        Am J Ind Med. 2021 Jan 11.
        BACKGROUND: Workers' compensation claims among Medicare beneficiaries have not been described previously. To examine the healthcare burden of work-related injury and illness among Medicare beneficiaries, we assessed the characteristics, healthcare utilization, and financial costs among Medicare beneficiaries with claims for which workers' compensation was the primary payer. METHODS: We extracted final action fee-for-service Medicare claims from 1999 to 2016 where workers' compensation had primary responsibility for claim payment and beneficiary, claim type, diagnoses, and cost information from these claims. RESULTS: During 1999-2016, workers' compensation was the primary payer for 2,010,200 claims among 330,491 Medicare beneficiaries, and 58.7% of these beneficiaries had more than one claim. Carrier claims submitted by noninstitutional providers constituted the majority (94.5%) of claims. Diagnosis codes indicated 19.4% of claims were related to diseases of the musculoskeletal system and connective tissue and 12.9% were related to diseases of the circulatory system. Workers' compensation insurance paid $880.4 million for these claims while Medicare paid $269.7 million and beneficiaries paid $37.4 million. CONCLUSIONS: Workers' compensation paid 74% of the total amount to providers for these work-related medical claims among Medicare beneficiaries. Claim diagnoses were similar to those of all workers' compensation claims in the United States. Describing these work-related claims helps identify the healthcare burden of occupational injury and illness among Medicare beneficiaries and identifies a need for more comprehensive collection and surveillance of work-related medical claims.

      3. Medical expenditures for hypertensive disorders during pregnancy that resulted in a live birth among privately insured womenexternal icon
        Li R, Kuklina EV, Ailes EC, Shrestha SS, Grosse SD, Fang J, Wang G, Leung J, Barfield WD, Cox S.
        Pregnancy Hypertens. 2020 Dec 15;23:155-162.
        OBJECTIVE: To estimate the excess maternal health services utilization and direct maternal medical expenditures associated with hypertensive disorders during pregnancy and one year postpartum among women with private insurance in the United States. STUDY DESIGN: We used 2008-2014 IBM MarketScan® Commercial Databases to identify women aged 15-44 who had a pregnancy resulting in live birth during 1/1/09-12/31/13 and were continuously enrolled with non-capitated or partially capitated coverage from 12 months before pregnancy through 12 months after delivery. Hypertensive disorders identified by diagnosis codes were categorized into three mutually exclusive types: preeclampsia and eclampsia, chronic hypertension, and gestational hypertension. Multivariate negative binomial and generalized linear models were used to estimate service utilization and expenditures, respectively. MAIN OUTCOME MEASURES: Per person excess health services utilization and medical expenditures during pregnancy and one year postpartum associated with hypertensive disorders (in 2014 US dollars). RESULTS: Women with preeclampsia and eclampsia, chronic hypertension, and gestational hypertension had $9,389, $6,041, and $2,237 higher mean medical expenditures compared to women without hypertensive disorders ($20,252), respectively (all p < 0.001). One-third (36%) of excess expenditure associated with hypertensive disorders during pregnancy was attributable to outpatient services. CONCLUSIONS: Hypertensive disorders during pregnancy were associated with significantly higher health services utilization and medical expenditures among privately insured women. Medical expenditures varied by type of hypertensive disorder. Stakeholders can use this information to assess the potential economic benefits of interventions that prevent these conditions or their complications.

      4. Prevalence and medical expenditures of diabetes-related complications among adult Medicaid enrollees with diabetes in eight U.S. statesexternal icon
        Ng BP, Laxy M, Shrestha SS, Soler RE, Cannon MJ, Smith BD, Zhang P.
        J Diabetes Complications. 2020 Nov 26:107814.
        AIMS: To estimate the prevalence and medical expenditures of diabetes-related complications (DRCs) among adult Medicaid enrollees with diabetes. METHODS: We estimated the prevalence and medical expenditures for 12 diabetes-related complications by Medicaid eligibility category (disability-based vs. non-disability-based) in eight states. We used generalized linear models with log link and gamma distribution to estimate the total per-person annual medical expenditures for DRCs, controlling for demographics and other comorbidities. RESULTS: Among non-disability-based enrollees (NDBEs), 40.1% (in California) to 47.5% (in Oklahoma) had one or more DRCs, compared to 53.6% (in Alabama) to 64.8% (in Florida) among disability-based enrollees (DBEs). The most prevalent complication was neuropathy (16.1%-27.1% for NDBEs; 20.2%-30.4% for DBEs). Lower extremity amputation (<1% for both eligibilities) was the least prevalent complication. The costliest per-person complication was dialysis (per-person excess annual expenditure of $22,481-$41,298 for NDBEs; $23,569-$51,470 for DBEs in 2012 USD). Combining prevalence and per-person excess expenditures, the three costliest complications were nephropathy, heart failure, and ischemic heart disease (IHD) for DBEs, compared to neuropathy, nephropathy, and IHD for NDBEs. CONCLUSIONS: Our study provides data that can be used for assessing the health care resources needed for managing DRCs and evaluating the cost-effectiveness of interventions to prevent and manage DRCs.

    • Healthcare Associated Infections
      1. Healthcare-associated outbreaks of bacterial infections in Africa, 2009-2018: A reviewexternal icon
        Fraser JL, Mwatondo A, Alimi YH, Varma JK, Vilas VJ.
        Int J Infect Dis. 2020 Dec 17;103:469-477.
        BACKGROUND: Healthcare-associated infections (HAIs) are a major global public health problem, increasing the transmission of drug-resistant infections. In Africa, the prevalence of HAIs among all hospital inpatients is estimated to be between 3% and 15%, but outbreaks are infrequently reported. Failure to detect and/or report outbreaks can increase the risk of ongoing infections and recurrent outbreaks. METHODS: A search of the PubMed, Web of Science, Cochrane Library, and other outbreak databases was performed to identify published literature on bacterial HAI outbreaks in Africa (January 2009 to December 2018). Details of the outbreak characteristics, hospital environment, and the control measures implemented were extracted. RESULTS: Twenty-two studies published over the 10-year period were identified. These reported 31 distinct outbreaks and a total of 31 causative pathogens, including Klebsiella pneumoniae (six outbreaks, 19%), Staphylococcus aureus (six outbreaks, 19%), and Enterococcus (five outbreaks, 16%). Most outbreaks were reported from university (n = 8, 26%) and tertiary hospitals (n = 11, 55%), from South Africa (n = 9, 41%) and Tunisia (n = 4, 18%). Interventions to control the outbreaks were described in 27 (90%) outbreaks, and all instituted or recommended enhancing hand hygiene and education. CONCLUSIONS: Few facilities in Africa reported HAI outbreaks over the 10-year period, suggesting substantial under-detection and under-reporting. The quality and timeliness of reporting require improvement to ensure changes in public health practice.

      2. Antimicrobial resistance genes are enriched in aerosols near impacted urban surface waters in La Paz, Boliviaexternal icon
        Ginn O, Nichols D, Rocha-Melogno L, Bivins A, Berendes D, Soria F, Andrade M, Deshusses MA, Bergin M, Brown J.
        Environ Res. 2021 Jan 11:110730.
        Antibiotic resistance poses a major global health threat. Understanding emergence and dissemination of antibiotic resistance in environmental media is critical to the design of control strategies. Because antibiotic resistance genes (ARGs) may be aerosolized from contaminated point sources and disseminated more widely in localized environments, we assessed ARGs in aerosols in urban La Paz, Bolivia, where wastewater flows in engineered surface water channels through the densely populated urban core. We quantified key ARGs and a mobile integron (MI) via ddPCR and E. coli spp. as a fecal indicator by culture over two years during both the rainy and dry seasons in sites near wastewater flows. ARG targets represented major antibiotic groups-tetracyclines (tetA), fluoroquinolones (qnrB), and beta-lactams (bla(TEM))-and an MI (intI1) represented the potential for mobility of genetic material. Most air samples (82%) had detectable targets above the experimentally determined LOD: most commonly bla(TEM) and intI1 (68% and 47% respectively) followed by tetA and qnrB (17% and 11% respectively). ARG and MI densities in positive air samples ranged from 1.3 × 10(1) to 6.6 × 10(4) gene copies/m(3) air. Additionally, we detected culturable E. coli in the air (52% of samples <1 km from impacted surface waters) with an average density of 11 CFU/m(3) in positive samples. We observed decreasing density of bla(TEM) with increasing distance up to 150 m from impacted surface waters. To our knowledge, this is the first study to conduct absolute quantification and a spatial analysis of ARGs and MIs in ambient urban air of a city with contaminated surface waters. Environments in close proximity to urban wastewater flows in this setting may experience locally elevated concentrations of ARGs, a possible concern for the emergence and dissemination of antimicrobial resistance in cities with poor sanitation.
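        Densities reported as gene copies per m³ of air combine the ddPCR count with the fraction of the extract analyzed and the volume of air sampled. A back-of-envelope sketch of that conversion, with all parameter values hypothetical (the study does not publish this particular calculation):

```python
# Hypothetical conversion from ddPCR output to gene copies per m^3 of
# air: scale the copies in one reaction aliquot up to the whole nucleic
# acid extract, then divide by the air volume drawn through the sampler.
def copies_per_m3(copies_per_rxn, aliquot_ul, extract_ul, air_volume_m3):
    copies_in_extract = copies_per_rxn * (extract_ul / aliquot_ul)
    return copies_in_extract / air_volume_m3

# e.g. 50 copies detected in a 5 uL aliquot of a 100 uL extract,
# from 1.5 m^3 of sampled air (all values illustrative)
density = copies_per_m3(50, 5, 100, 1.5)
```

Values computed this way land in the same broad range as the reported densities (~10¹ to ~10⁴ copies/m³), depending on sampler run time and target abundance.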

      3. Characterization of Clostridioides difficile isolates available through the CDC & FDA Antibiotic Resistance Isolate Bankexternal icon
        Paulick A, Adamczyk M, Anderson K, Vlachos N, Machado MJ, McAllister G, Korhonen L, Guh AY, Halpin AL, Rasheed JK, Karlsson M, Lutgring JD, Gargis AS.
        Microbiol Resour Announc. 2021 Jan 7;10(1).
        Thirty Clostridioides difficile isolates collected in 2016 through the Centers for Disease Control and Prevention Emerging Infections Program were selected for reference antimicrobial susceptibility testing and whole-genome sequencing. Here, we present the genetic characteristics of these isolates and announce their availability in the CDC & FDA Antibiotic Resistance Isolate Bank.

      4. Candida auris outbreak in a COVID-19 specialty care unit - Florida, July-August 2020external icon
        Prestel C, Anderson E, Forsberg K, Lyman M, de Perio MA, Kuhar D, Edwards K, Rivera M, Shugart A, Walters M, Dotson NQ.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 15;70(2):56-57.
        In July 2020, the Florida Department of Health was alerted to three Candida auris bloodstream infections and one urinary tract infection in four patients with coronavirus disease 2019 (COVID-19) who received care in the same dedicated COVID-19 unit of an acute care hospital (hospital A). C. auris is a multidrug-resistant yeast that can cause invasive infection. Its ability to colonize patients asymptomatically and persist on surfaces has contributed to previous C. auris outbreaks in health care settings (1-7). Since the first C. auris case was identified in Florida in 2017, aggressive measures have been implemented to limit spread, including contact tracing and screening upon detection of a new case. Before the COVID-19 pandemic, hospital A conducted admission screening for C. auris and admitted colonized patients to a separate dedicated ward.

    • Immunity and Immunization
      1. Allergic reactions including anaphylaxis after receipt of the first dose of Pfizer-BioNTech COVID-19 vaccine - United States, December 14-23, 2020external icon
        CDC COVID-19 Response Team, Food and Drug Administration.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 15;70(2):46-51.
        As of January 3, 2021, a total of 20,346,372 cases of coronavirus disease 2019 (COVID-19) and 349,246 associated deaths have been reported in the United States. Long-term sequelae of COVID-19 over the course of a lifetime currently are unknown; however, persistent symptoms and serious complications are being reported among COVID-19 survivors, including persons who initially experience a mild acute illness.* On December 11, 2020, the Food and Drug Administration (FDA) issued an Emergency Use Authorization (EUA) for Pfizer-BioNTech COVID-19 vaccine to prevent COVID-19, administered as 2 doses separated by 21 days. On December 12, 2020, the Advisory Committee on Immunization Practices (ACIP) issued an interim recommendation for use of Pfizer-BioNTech COVID-19 vaccine (1); initial doses were recommended for health care personnel and long-term care facility residents (2). As of December 23, 2020, a reported 1,893,360 first doses of Pfizer-BioNTech COVID-19 vaccine had been administered in the United States, and reports of 4,393 (0.2%) adverse events after receipt of Pfizer-BioNTech COVID-19 vaccine had been submitted to the Vaccine Adverse Event Reporting System (VAERS). Among these, 175 case reports were identified for further review as possible cases of severe allergic reaction, including anaphylaxis. Anaphylaxis is a life-threatening allergic reaction that does occur rarely after vaccination, with onset typically within minutes to hours (3). Twenty-one cases were determined to be anaphylaxis (a rate of 11.1 per million doses administered), including 17 in persons with a documented history of allergies or allergic reactions, seven of whom had a history of anaphylaxis. The median interval from vaccine receipt to symptom onset was 13 minutes (range = 2-150 minutes). Among 20 persons with follow-up information available, all had recovered or been discharged home. Of the remaining case reports that were determined not to be anaphylaxis, 86 were judged to be nonanaphylaxis allergic reactions, and 61 were considered nonallergic adverse events. Seven case reports were still under investigation. This report summarizes the clinical and epidemiologic characteristics of case reports of allergic reactions, including anaphylaxis and nonanaphylaxis allergic reactions, after receipt of the first dose of Pfizer-BioNTech COVID-19 vaccine during December 14-23, 2020, in the United States. CDC has issued updated interim clinical considerations for use of mRNA COVID-19 vaccines currently authorized in the United States (4) and interim considerations for preparing for the potential management of anaphylaxis (5). In addition to screening for contraindications and precautions before administering COVID-19 vaccines, vaccine locations should have the necessary supplies available to manage anaphylaxis, should implement postvaccination observation periods, and should immediately treat persons experiencing anaphylaxis signs and symptoms with intramuscular injection of epinephrine (4,5).
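        The reported anaphylaxis rate follows directly from the case and dose counts given in the abstract:

```python
# Reproducing the reported anaphylaxis rate: 21 confirmed cases among
# 1,893,360 administered first doses of Pfizer-BioNTech COVID-19 vaccine.
cases = 21
doses = 1_893_360
rate_per_million = cases / doses * 1_000_000
# this rounds to the reported 11.1 cases per million doses administered
```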

      2. Use of Ebola vaccine: Recommendations of the Advisory Committee on Immunization Practices, United States, 2020external icon
        Choi MJ, Cossaboom CM, Whitesell AN, Dyal JW, Joyce A, Morgan RL, Campos-Outcalt D, Person M, Ervin E, Yu YC, Rollin PE, Harcourt BH, Atmar RL, Bell BP, Helfand R, Damon IK, Frey SE.
        MMWR Recomm Rep. 2021 Jan 8;70(1):1-12.
        This report summarizes the recommendations of the Advisory Committee on Immunization Practices (ACIP) for use of the rVSVΔG-ZEBOV-GP Ebola vaccine (Ervebo) in the United States. The vaccine contains rice-derived recombinant human serum albumin and live attenuated recombinant vesicular stomatitis virus (VSV) in which the gene encoding the glycoprotein of VSV was replaced with the gene encoding the glycoprotein of Ebola virus species Zaire ebolavirus. Persons with a history of severe allergic reaction (e.g., anaphylaxis) to rice protein should not receive Ervebo. This is the first and only vaccine currently licensed by the Food and Drug Administration for the prevention of Ebola virus disease (EVD). These guidelines will be updated based on availability of new data or as new vaccines are licensed to protect against EVD. ACIP recommends preexposure vaccination with Ervebo for adults aged ≥18 years in the U.S. population who are at highest risk for potential occupational exposure to Ebola virus species Zaire ebolavirus because they are responding to an outbreak of EVD, work as health care personnel at federally designated Ebola treatment centers in the United States, or work as laboratorians or other staff at biosafety level 4 facilities in the United States. Recommendations for use of Ervebo in additional populations at risk for exposure and other settings will be considered and discussed by ACIP in the future.

      3. Safety, reactogenicity, and health-related quality of life after trivalent adjuvanted vs trivalent high-dose inactivated influenza vaccines in older adults: A randomized clinical trialexternal icon
        Schmader KE, Liu CK, Harrington T, Rountree W, Auerbach H, Walter EB, Barnett ED, Schlaudecker EP, Todd CA, Poniewierski M, Staat MA, Wodi P, Broder KR.
        JAMA Netw Open. 2021 Jan 4;4(1):e2031266.
        IMPORTANCE: Trivalent adjuvanted inactivated influenza vaccine (aIIV3) and trivalent high-dose inactivated influenza vaccine (HD-IIV3) are US-licensed for adults aged 65 years and older. Data are needed on the comparative safety, reactogenicity, and health-related quality of life (HRQOL) effects of these vaccines. OBJECTIVE: To compare safety, reactogenicity, and changes in HRQOL scores after aIIV3 vs HD-IIV3. DESIGN, SETTING, AND PARTICIPANTS: This randomized blinded clinical trial was a multicenter US study conducted during the 2017 to 2018 and 2018 to 2019 influenza seasons. Among 778 community-dwelling adults aged at least 65 years and assessed for eligibility, 13 were ineligible and 8 withdrew before randomization. Statistical analysis was performed from August 2019 to August 2020. INTERVENTIONS: Intramuscular administration of aIIV3 or HD-IIV3 after age-stratification (65-79 years; ≥80 years) and randomization. MAIN OUTCOMES AND MEASURES: Proportions of participants with moderate-to-severe injection-site pain and 14 other solicited reactions during days 1 to 8, using a noninferiority test (5% noninferiority margin), and serious adverse events (SAE) and adverse events of clinical interest (AECI), including new-onset immune-mediated conditions, during days 1 to 43. Changes in HRQOL scores before and after vaccination (days 1, 3) were also compared between study groups. RESULTS: A total of 757 adults were randomized, 378 to receive aIIV3 and 379 to receive HD-IIV3. Of these participants, there were 420 women (55%) and 589 White individuals (78%) with a median (range) age of 72 (65-97) years. The proportion reporting moderate-to-severe injection-site pain, limiting or preventing activity, after aIIV3 (12 participants [3.2%]) (primary outcome) was noninferior compared with HD-IIV3 (22 participants [5.8%]) (difference -2.7%; 95% CI, -5.8 to 0.4). 
Ten reactions met noninferiority criteria for aIIV3; 4 (moderate-to-severe injection-site tenderness, arthralgia, fatigue, malaise) did not. It was inconclusive whether these 4 reactions occurred in higher proportions of participants after aIIV3. No participant sought medical care for a vaccine reaction. No AECI was observed. Nine participants had at least 1 SAE after aIIV3 (2.4%; 95% CI, 1.1% to 4.5%); 3 had at least 1 SAE after HD-IIV3 (0.8%; 95% CI, 0.2% to 2.2%). No SAE was associated with vaccination. Changes in prevaccination and postvaccination HRQOL scores were not clinically meaningful and not different between the groups. CONCLUSIONS AND RELEVANCE: Overall safety and HRQOL findings were similar after aIIV3 and HD-IIV3, and consistent with prelicensure data. From a safety standpoint, this study's results support using either vaccine to prevent influenza in older adults. TRIAL REGISTRATION: Identifier: NCT03183908.
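The noninferiority comparison above can be sketched numerically. The abstract does not state the trial's exact interval method, so the following uses a simple Wald interval for the difference in proportions as an approximation; it lands close to the reported bounds (-5.8 to 0.4), and the upper bound falling below the 5% margin is what makes aIIV3 noninferior:

```python
import math

def wald_diff_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% CI for a difference in proportions (p1 - p2)."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# aIIV3: 12/378 with moderate-to-severe pain; HD-IIV3: 22/379
diff, lo, hi = wald_diff_ci(12, 378, 22, 379)
# Noninferior if the upper CI bound stays below the 5% margin
print(round(diff * 100, 1), round(lo * 100, 1), round(hi * 100, 1), hi < 0.05)
# -2.6 -5.6 0.3 True
```

A more exact interval method would shift the bounds slightly (hence the published -2.7%; -5.8 to 0.4), but the comparison against the margin is the same.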

      4. Public health role for fractional dosage of yellow fever vaccineexternal icon
        Staples JE, Alvarez AR.
        Lancet. 2021 Jan 9;397(10269):76-77.

      5. Myopericarditis after vaccination, Vaccine Adverse Event Reporting System (VAERS), 1990-2018external icon
        Su JR, McNeil MM, Welsh KJ, Marquez PL, Ng C, Yan M, Cano MV.
        Vaccine. 2021 Jan 6.
BACKGROUND: Myopericarditis after vaccination has been sporadically reported in the medical literature. Here, we present a thorough descriptive analysis of reports to a national passive vaccine safety surveillance system (VAERS) of myopericarditis after vaccines licensed for use in the United States. METHODS: We identified U.S. reports of myopericarditis received by VAERS during 1990-2018 that met a published case definition for myopericarditis or were physician-diagnosed. We stratified analysis by age group (<19, 19-49, ≥50 years), describing reports by serious/non-serious status, sex, time to symptom onset after vaccination, vaccine(s) administered, and exposure to other known causes of myopericarditis. We used Empirical Bayesian data mining to detect disproportionate reporting of myopericarditis after vaccination. RESULTS: VAERS received 620,195 reports during 1990-2018: 708 (0.1%) met the case definition or were physician-diagnosed as myopericarditis. Most (79%) myopericarditis reports described males; 69% were serious; 72% had symptom onset ≤ 2 weeks postvaccination. Overall, smallpox (59%) and anthrax (23%) vaccines were most commonly reported. By age group, the most commonly reported vaccines were Haemophilus influenzae type b (22, 22%) and hepatitis B (18, 18%) among persons aged <19 years; smallpox (387, 79%) among persons aged 19-49 years; and inactivated influenza (31, 36%) and live attenuated zoster (19, 22%) among persons aged ≥50 years. The vaccines most commonly reported remained unchanged when excluding 138 reports describing other known causes of myopericarditis. Data mining revealed disproportionate reporting of myopericarditis only after smallpox vaccine. CONCLUSIONS: Despite the introduction of new vaccines over the years, myopericarditis remains rarely reported after vaccines licensed for use in the United States. In this analysis, myopericarditis was most commonly reported after smallpox vaccine, and less commonly after other vaccines.

    • Informatics
      1. Electronic health records and pulmonary function data: Developing an interoperability roadmap. An Official American Thoracic Society Workshop Reportexternal icon
        McCormack MC, Bascom R, Brandt M, Burgos F, Butler S, Caggiano C, Dimmock AE, Fineberg A, Goldstein J, Guzman FC, Halldin CN, Johnson JD, Kerby GS, Krishnan JA, Kurth L, Morgan G, Mularski RA, Pasquale CB, Ryu J, Sinclair T, Stachowicz NF, Taite A, Tilles J, Truta JR, Weissman DN, Wu TD, Yawn BP, Drummond MB.
        Ann Am Thorac Soc. 2021 Jan;18(1):1-11.
A workshop "Electronic Health Records and Pulmonary Function Data: Developing an Interoperability Roadmap" was held at the American Thoracic Society 2019 International Conference. "Interoperability" is defined as the ability of different information-technology systems and software applications to directly communicate, exchange data, and use the information that has been exchanged. At present, pulmonary function test (PFT) equipment is not required to be interoperable with other clinical data systems, including electronic health records (EHRs). For this workshop, we assembled a diverse group of experts and stakeholders, including representatives from patient-advocacy groups, adult and pediatric general and pulmonary medicine, informatics, government and healthcare organizations, pulmonary function laboratories, and EHR and PFT equipment and software companies. The participants were tasked with two overarching objectives: 1) identifying the key obstacles to achieving interoperability of PFT systems and the EHR and 2) recommending solutions to the identified obstacles. Successful interoperability of PFT data with the EHR impacts the full scope of individual patient health and clinical care, population health, and research. The existing EHR-PFT device platforms lack sufficient data standardization to promote interoperability. Cost is a major obstacle to PFT-EHR interoperability, and incentives are insufficient to justify the needed investment. The current vendor-EHR system lacks sufficient flexibility, thereby impeding interoperability. To advance the goal of achieving interoperability, next steps include identifying and standardizing priority PFT data elements. To increase the motivation of stakeholders to invest in this effort, it is necessary to demonstrate the benefits of PFT interoperability across patient care and population health.

      2. BACKGROUND: Electronic Health Record Systems (EHRs) are being rolled out nationally in many low- and middle-income countries (LMICs) yet assessing actual system usage remains a challenge. We employed a nominal group technique (NGT) process to systematically develop high-quality indicators for evaluating actual usage of EHRs in LMICs. METHODS: An initial set of 14 candidate indicators were developed by the study team adapting the Human Immunodeficiency Virus (HIV) Monitoring, Evaluation, and Reporting indicators format. A multidisciplinary team of 10 experts was convened in a two-day NGT workshop in Kenya to systematically evaluate, rate (using Specific, Measurable, Achievable, Relevant, and Time-Bound (SMART) criteria), prioritize, refine, and identify new indicators. NGT steps included introduction to candidate indicators, silent indicator ranking, round-robin indicator rating, and silent generation of new indicators. 5-point Likert scale was used in rating the candidate indicators against the SMART components. RESULTS: Candidate indicators were rated highly on SMART criteria (4.05/5). NGT participants settled on 15 final indicators, categorized as system use (4); data quality (3), system interoperability (3), and reporting (5). Data entry statistics, systems uptime, and EHRs variable concordance indicators were rated highest. CONCLUSION: This study describes a systematic approach to develop and validate quality indicators for determining EHRs use and provides LMICs with a multidimensional tool for assessing success of EHRs implementations.

    • Injury and Violence
      1. Circumstances associated with suicides among females-16 states, United States, 2005-2016external icon
        Crosby AE, Ertl A, Lyons BH, Ivey-Stephenson AZ, Jack SP.
        Med Care. 2021 Feb 1;59:S92-s99.
        BACKGROUND: Suicide rates in the United States have been consistently increasing since 2005 and increasing faster among females than among males. Understanding circumstances related to the changes in suicide may help inform prevention programs. This study describes the circumstances associated with suicides among females in the United States using the National Violent Death Reporting System. METHODS: We analyzed the circumstances of suicides occurring from 2005 to 2016 in 16 states (Alaska, Colorado, Georgia, Kentucky, Maryland, Massachusetts, New Jersey, New Mexico, North Carolina, Oklahoma, Oregon, Rhode Island, South Carolina, Utah, Virginia, and Wisconsin) among females aged 10 years and above. We compared the percentages of circumstances reported for the entire sample, by age group, and by race/ethnicity. Trends in changes in the leading circumstances were analyzed using Joinpoint regression. RESULTS: From 2005 to 2016, there were 27,809 suicides among females 10 years and older in the 16 states. Overall, the 2 leading precipitating circumstances were current mental health problem and ever treated for mental health problem. The leading circumstances differed by demographics. Joinpoint analysis showed inflection points in reports of job problems, financial problems, and non-intimate partner relationship problems during 2005-2009. During 2010-2016, downward inflections were seen in reports of job problems and financial problems and upward inflections in substance abuse problems and a recent or impending crisis. CONCLUSIONS: These findings show changes by age group and race/ethnicity in the circumstances associated with suicides among females in the 16 states have occurred. Studying these shifts and identifying the most salient circumstances among female suicide decedents may help prevention programs adapt to different needs.

The Centers for Disease Control and Prevention (CDC)'s 2018 Guideline for current practices in pediatric mild traumatic brain injury (mTBI; also referred to as concussion herein) systematically identified the best up-to-date practices based on current evidence and, specifically, identified recommended practices regarding computed tomography (CT), magnetic resonance imaging (MRI), and skull radiograph imaging. In this article, we discuss types of neuroimaging not discussed in the guideline in terms of their safety for pediatric populations, their potential application, and the research investigating the future use of certain modalities to aid in the diagnosis and treatment of mTBI in children. The role of neuroimaging in pediatric mTBI cases should be considered for the potential contribution to children's neural and social development, in addition to the immediate clinical value (as in the case of acute structural findings). Selective use of specific neuroimaging modalities in research has already been shown to detect aspects of diffuse brain injury, disrupted cerebral blood flow, and correlate physiological factors with persistent symptoms, such as fatigue, cognitive decline, headache, and mood changes, following mTBI. However, these advanced neuroimaging modalities are currently limited to the research arena, and any future clinical application of advanced imaging modalities in pediatric mTBI will require robust evidence for each modality's ability to provide measurement of the subtle conditions of brain development, disease, damage, or degeneration, while accounting for variables at both non-injury and time-post-injury epochs. Continued collaboration and communication between researchers and healthcare providers is essential to investigate, develop, and validate the potential of advanced imaging modalities in pediatric mTBI diagnostics and management.

      3. Sports- and physical activity-related concussion and risk for youth violenceexternal icon
        Lowry R, Haarbauer-Krupa J, Breiding MJ, Simon TR.
        Am J Prev Med. 2021 Jan 6.
        INTRODUCTION: Sports and physical activities are an important cause of traumatic brain injury among adolescents. Childhood traumatic brain injury has been associated with cognitive impairment, emotional problems, and impaired behavior control, and these neuropsychological changes may place these youth at increased risk for engagement in violence-related behaviors. METHODS: Data from the 2017 National Youth Risk Behavior Survey (N=14,765), a nationally representative survey of U.S. high school students, were analyzed in 2019 to examine the associations between sports- and physical activity-related concussion and violence-related behaviors occurring in the community and at school. Multivariable logistic regression models were used to calculate sex-stratified, adjusted (for race/ethnicity, grade, athlete status, impaired cognitive functioning, feeling sad/hopeless, and current substance use) prevalence ratios. Prevalence ratios were considered statistically significant if p<0.05. RESULTS: Male students (17.1%) were more likely than female students (13.0%) to experience a sports- and physical activity-related concussion during the 12 months preceding the survey. Compared with students who did not have a concussion, those who experienced ≥1 sports- and physical activity-related concussion were more likely to be in a physical fight (male students, adjusted prevalence ratio=1.45; female students, adjusted prevalence ratio=1.55), carry a weapon (male students, adjusted prevalence ratio=1.24; female students, adjusted prevalence ratio=1.79), and fight at school (male students, adjusted prevalence ratio=1.40; female students, adjusted prevalence ratio=1.77). In addition, male students were more likely to carry a gun (adjusted prevalence ratio=1.62) and carry a weapon at school (adjusted prevalence ratio=1.73). 
CONCLUSIONS: Although the direction of these associations is unknown, return-to-school programs may benefit from inclusion of assessment and counseling around issues of psychological and social functioning, conflict resolution, and coordination with violence prevention programs.

    • Laboratory Sciences
      1. Diagnostic testing for Galactose-alpha-1,3-galactose (Alpha-gal), United States, 2010-2018external icon
        Binder AM, Commins S, Altrich ML, Wachs T, Biggerstaff BJ, Beard CB, Petersen LR, Kersh GJ, Armstrong PA.
        Ann Allergy Asthma Immunol. 2021 Jan 7.
BACKGROUND: Alpha-gal syndrome (AGS) is an emerging immunoglobulin E (IgE)-mediated allergy to galactose-alpha-1,3-galactose (alpha-gal). The geographic distribution and burden of AGS in the United States are unknown. OBJECTIVE: To characterize alpha-gal IgE testing patterns and describe trends and distribution during 2010-2018 in the United States. METHODS: This retrospective analysis included all persons tested for alpha-gal IgE antibodies by Viracor-IBT Laboratories (Lee's Summit, MO), the primary site of testing in the United States. Data included age and sex of person tested, specimen state of origin, collection date, and result value; persons with at least one positive test (≥0.1 kU/L) were compared to negatives. Proportions tested and with positive test results were calculated using U.S. Census population estimates. RESULTS: Overall, 122,068 specimens from 105,674 persons were tested for alpha-gal IgE during July 1, 2010-December 31, 2018. Nearly one-third (34,256, 32.4%) had at least one positive result. The number of persons testing positive increased 6-fold from 1,110 in 2011 to 7,798 in 2018. Of those testing positive, mean [SD] age was 46.9 [19.8] years; males were more likely to test positive than females (43.3% vs 26.0%). Arkansas, Virginia, Kentucky, Oklahoma, and Missouri had the highest number of persons who were tested and had a positive result per 100,000 population. CONCLUSION: More than 34,000 persons, most presumably symptomatic, have tested positive for IgE antibodies to alpha-gal, suggesting AGS is an increasingly recognized public health problem. The geographic distribution of persons who tested positive is consistent with exposure to Amblyomma americanum ticks.

      2. Surface contamination generated by "one-pot" methamphetamine productionexternal icon
        Ciesielski AL, Wagner JR, Alexander-Scott M, Smith J, Snawder J.
J Chem Health Saf. 2021.
Methamphetamine production is the most common form of illicit drug manufacture in the United States. The "one-pot" method is the most prevalent methamphetamine synthesis method and is a modified Birch reduction, reducing pseudoephedrine with lithium and ammonia gas generated in situ. This research examined the amount of methamphetamine surface contamination generated by one-pot syntheses or "cooks", as well as the effectiveness of hosing with water as a simplified decontamination technique, to assess associated public health and environmental consequences. Concentrations of methamphetamine contamination were examined prior to production, after production, and after decontamination with water. Contamination was qualitatively field screened using lateral flow immunoassays and quantitatively assessed using a fluorescence covalent microbead immunosorbent assay. Following screening, 0 of 23 pre-cook samples, 29 of 41 post-cook samples, and 5 of 27 post-decontamination samples were positive. Quantitatively, one pre-cook sample had a methamphetamine concentration of 1.36 ng/100 cm2. Post-cook and post-decontamination samples had average methamphetamine concentrations of 26.50 ± 63.83 and 6.22 ± 12.17 ng/100 cm2, respectively. While all one-pot methamphetamine laboratories generate different amounts of waste, depending on the amount of precursors used and whether the reaction vessel remained uncompromised, this study examined the surface contamination generated by a popular one-pot method known to law enforcement. By understanding the amount of surface contamination generated by common methods of one-pot methamphetamine production and the effectiveness of decontamination techniques used to remediate them, health risks associated with these production sites can be better understood and environmental contamination can be mitigated.

      3. Influenza virus NS1- C/EBPβ gene regulatory complex inhibits RIG-I transcriptionexternal icon
        Kumari R, Guo Z, Kumar A, Wiens M, Gangappa S, Katz JM, Cox NJ, Lal RB, Sarkar D, Fisher PB, García-Sastre A, Fujita T, Kumar V, Sambhara S, Ranjan P, Lal SK.
        Antiviral Res. 2020 Apr;176:104747.
        Influenza virus non-structural protein 1 (NS1) counteracts host antiviral innate immune responses by inhibiting Retinoic acid inducible gene-I (RIG-I) activation. However, whether NS1 also specifically regulates RIG-I transcription is unknown. Here, we identify a CCAAT/Enhancer Binding Protein beta (C/EBPβ) binding site in the RIG-I promoter as a repressor element, and show that NS1 promotes C/EBPβ phosphorylation and its recruitment to the RIG-I promoter as a C/EBPβ/NS1 complex. C/EBPβ overexpression and siRNA knockdown in human lung epithelial cells resulted in suppression and activation of RIG-I expression respectively, implying a negative regulatory role of C/EBPβ. Further, C/EBPβ phosphorylation, its interaction with NS1 and occupancy at the RIG-I promoter was associated with RIG-I transcriptional inhibition. These findings provide an important insight into the molecular mechanism by which influenza NS1 commandeers RIG-I transcriptional regulation and suppresses host antiviral responses.

      4. In the case of a radiological or nuclear incident, valuable information could be obtained in a timely manner by using liquid scintillation counting (LSC) technique through fast screening of urine samples from potentially contaminated persons. This work describes the optimization of LSC parameters on PerkinElmer (PE) Tri-Carb and Quantulus GCT series instruments to develop a rapid method for screening urine in an emergency response situation.

    • Mining
      1. LiDAR mapping of ground damage in a heading re-orientation case studyexternal icon
        Evanek N, Slaker B, Iannacchione A, Miller T.
Int. J Min Sci Technol. 2021.
        The Subtropolis Mine is a room-and-pillar mine extracting the Vanport limestone near Petersburg, Ohio, U.S. In February of 2018, mine management began implementing a heading re-orientation to better control the negative effects of excessive levels of horizontal stress. The conditions in the headings improved, but as expected, stress-related damage concentrated within crosscuts. The mine operator has worked to lessen the impact of the instabilities in the outby crosscuts by implementing several engineering controls. With the implementation of each control, conditions were monitored and analyzed using observational and measurement techniques including 3D LiDAR surveys. Since the heading re-orientation, several 3D LiDAR surveys have been conducted and analyzed by researchers from the National Institute for Occupational Safety and Health (NIOSH). This study examines (1) the characteristics of each 3D LiDAR survey, (2) the change in the detailed strata conditions in response to stress concentrations, and (3) the change detection techniques between 3D LiDAR surveys to assess entry stability. Ultimately, the 3D LiDAR surveys proved to be a useful tool for characterizing ground instability and assessing the effectiveness of the engineering controls used in the heading re-orientation at the Subtropolis Mine.
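The abstract refers to change detection between 3D LiDAR surveys without specifying the algorithms used. As a rough illustration of the idea only, a minimal cloud-to-cloud comparison computes, for each point in a later survey, the distance to the nearest point in an earlier survey; real workflows first co-register the scans (e.g., with ICP) and use spatial indexes rather than brute force, and the point coordinates below are invented:

```python
import math

def change_map(scan_a, scan_b):
    """For each point in scan_b, distance (m) to nearest point in scan_a.
    Brute force O(n*m); illustrative only."""
    return [min(math.dist(p, q) for q in scan_a) for p in scan_b]

# Toy "surveys" of a roof line; one point has sagged 0.5 m between scans
before = [(0.0, 0.0, 3.0), (1.0, 0.0, 3.0), (2.0, 0.0, 3.0)]
after_ = [(0.0, 0.0, 3.0), (1.0, 0.0, 2.5), (2.0, 0.0, 3.0)]
print(change_map(before, after_))  # [0.0, 0.5, 0.0]
```

Large per-point distances in the resulting map flag zones of deformation or roof fall between surveys.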

    • Nutritional Sciences
      1. Prevalence of aflatoxins in dietary staples in the border county of Busia, Western Kenyaexternal icon
        Awuor AO, Thuita FM, Okoth SD.
Afr J Food Agric Nutr Dev. 2020;20(7):17045-17062.
Aflatoxins, secondary metabolites of some Aspergillus fungi, are of public health importance. They are major contaminants of cereals and tubers. Data on prevalence of aflatoxin contamination of sorghum, millet and cassava in Busia County are limited. The extent of aflatoxin contamination in dietary staples in Busia County was assessed, and potential sources associated with the contamination were evaluated. A tool designed to collect sociodemographic profile, food sources and storage locations and vessels and food consumption habits of respondents was loaded onto an Open Data Kit and used in 3 subcounties. Quantitative data were analyzed using SAS version 15 software. Maize, millet, sorghum, cassava and groundnut samples were collected from 469 households. A competitive enzyme-linked immunosorbent assay method was used to determine total aflatoxin levels in food samples. Sixty-eight percent of the maize samples were sourced from the market. Approximately 75% of maize samples were stored in polypropylene sacks. Samples of all five foods had detectable levels of aflatoxin. Overall, maize had the highest level of contamination (mean 100 ppb; SD 252.9; range 1-1584 ppb), with about a third of maize samples above the East African Community regulatory limit (10 ppb). The levels of aflatoxin ranged from 0.3 to 740 ppb in sorghum, 0.5 to 15 ppb in cassava, from 0.5 to 12 ppb in millet and from 0.1 to 2.8 ppb in groundnuts. The odds of contamination above 10 ppb were 1.2 times higher for market-sourced maize than for homegrown maize (OR 1.185; 95% CI 0.554-2.534). Sorghum stored in buckets was 12.8 times more likely to exceed allowable aflatoxin limits (OR 12.82; 95% CI 2.566-63.992) than sorghum stored in polypropylene sacks. Aflatoxin is prevalent in the dietary staples consumed in households within Busia County. Residents are at risk of chronic exposure to aflatoxin. Enhanced market surveillance within the county is recommended.
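The odds ratios above come from 2x2 contingency tables. The abstract does not report the underlying counts, so the sketch below uses invented counts purely to illustrate how an OR and its Woolf (log-scale) confidence interval are computed, and why an interval that spans 1 (as for market vs. homegrown maize) indicates no statistically significant difference:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-scale) 95% CI for a 2x2 table:
    a = exposed & contaminated, b = exposed & clean,
    c = unexposed & contaminated, d = unexposed & clean."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 30/100 market-sourced vs 25/100 homegrown samples >10 ppb
or_, lo, hi = odds_ratio_ci(30, 70, 25, 75)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.29 0.69 2.4
```

Here the interval (0.69-2.4) contains 1, so, as with the reported maize OR of 1.185 (95% CI 0.554-2.534), the point estimate alone does not establish a real difference between sources.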

    • Occupational Safety and Health
      1. Legionellosis cluster associated with working at a racetrack facility in West Virginia, 2018external icon
        Rispens JR, Hast M, Edens C, Ritter T, Mercante JW, Siegel M, Martin SB, Thomasson E, Barskey AE.
        J Environ Health. 2021 Jan/Feb;83(6):14-19.
        In October 2018, the Centers for Disease Control and Prevention was notified of a cluster of Legionnaires' disease cases in workers at a racetrack facility. The objective of the resulting investigation was to determine the extent of the outbreak and identify potential sources of exposure to halt transmission. Case-finding and interviews were conducted among symptomatic racetrack workers who were known to be at the facility within 14 days prior to symptom onset. An environmental assessment of the facility and surrounding area was conducted for sources of potential Legionella exposure. In total, 17 legionellosis cases were identified. The environmental assessment revealed a poorly maintained hot tub in the jockey locker room as the most likely source. Further investigation identified deficiencies in the facility's ventilation systems, which suggested a transmission mechanism for workers who never entered the locker room floor. Considering indirect exposure routes via air handling systems can be useful for source identification and case-finding in legionellosis outbreaks.

Peak exposures are of concern because they can potentially overwhelm normal defense mechanisms and induce adverse health effects. Metrics of peak exposure have been used in epidemiologic and exposure studies, but consensus on their definition is lacking. The relevant characteristics of peak exposure depend upon exposure patterns, biokinetics of exposure, and disease mechanisms. The objective of this review was to summarize the use of peak metrics in epidemiologic and exposure studies. A comprehensive search of Medline, Embase, Web of Science, and NIOSHTIC-2 databases was conducted using keywords related to peak exposures. The retrieved references were reviewed and selected for indexing if they included a peak metric and met additional criteria. Information on health outcomes and peak exposure metrics was extracted from each reference. A total of 1,215 epidemiologic or exposure references were identified, of which 182 were indexed and summarized. For the 72 epidemiologic studies, the health outcomes most frequently evaluated were chronic respiratory effects, cancer, and acute respiratory symptoms. Exposures were frequently assessed using task-based and full-shift time-integrated methods, qualitative methods, and real-time instruments. Peak exposure summary metrics included the presence or absence of a peak event, the highest exposure intensity, and the frequency of exposures greater than a target level. Peak metrics in the 110 exposure studies most frequently included highest exposure intensity, average short-duration intensity, and graphical presentation of the real-time data (plots). This review provides a framework for considering biologically relevant peak exposure metrics for epidemiologic and exposure studies to help inform risk assessment and exposure mitigation.

    • Parasitic Diseases
      1. Widespread zoophagy and detection of Plasmodium spp. in Anopheles mosquitoes in southeastern Madagascarexternal icon
        Finney M, McKenzie BA, Rabaovola B, Sutcliffe A, Dotson E, Zohdy S.
        Malar J. 2021 Jan 7;20(1):25.
BACKGROUND: Malaria is a top cause of mortality on the island nation of Madagascar, where many rural communities rely on subsistence agriculture and livestock production. Understanding feeding behaviours of Anopheles in this landscape is crucial for optimizing malaria control and prevention strategies. Previous studies in southeastern Madagascar have shown that Anopheles mosquitoes are more frequently captured within 50 m of livestock. However, it remains unknown whether these mosquitoes preferentially feed on livestock. Here, mosquito blood meal sources and Plasmodium sporozoite rates were determined to evaluate patterns of feeding behaviour in Anopheles spp. and malaria transmission in southeastern Madagascar. METHODS: Across a habitat gradient in southeastern Madagascar, 7,762 female Anopheles spp. mosquitoes were collected. Of the captured mosquitoes, 492 were visibly blood fed and morphologically identifiable, and a direct enzyme-linked immunosorbent assay (ELISA) was used to test for swine, cattle, chicken, human, and dog blood among these specimens. Host species identification was confirmed for multiple blood meals using PCR along with Sanger sequencing. Additionally, 1,607 Anopheles spp. were screened for the presence of Plasmodium falciparum, P. vivax 210, and P. vivax 247 circumsporozoite (cs) proteins by ELISA. RESULTS: Cattle and swine accounted, respectively, for 51% and 41% of all blood meals, with the remaining 8% split between domesticated animals and humans. Of the 1,607 Anopheles spp. screened for Plasmodium falciparum, Plasmodium vivax 210, and Plasmodium vivax 247 cs-protein, 45 tested positive, the most prevalent being P. vivax 247, followed by P. vivax 210 and P. falciparum. Both variants of P. vivax were observed in secondary vectors, including Anopheles squamosus/cydippis, Anopheles coustani, and unknown Anopheles spp. Furthermore, evidence of coinfection of P. falciparum and P. vivax 210 in Anopheles gambiae sensu lato (s.l.) was found. 
CONCLUSIONS: Here, feeding behaviour of Anopheles spp. mosquitoes in southeastern Madagascar was evaluated, in a livestock rich landscape. These findings suggest largely zoophagic feeding behaviors of Anopheles spp., including An. gambiae s.l. and presence of both P. vivax and P. falciparum sporozoites in Anopheles spp. A discordance between P. vivax reports in mosquitoes and humans exists, suggesting high prevalence of P. vivax circulating in vectors in the ecosystem despite low reports of clinical vivax malaria in humans in Madagascar. Vector surveillance of P. vivax may be relevant to malaria control and elimination efforts in Madagascar. At present, the high proportion of livestock blood meals in Madagascar may play a role in buffering (zooprophylaxis) or amplifying (zoopotentiation) the impacts of malaria. With malaria vector control efforts focused on indoor feeding behaviours, complementary approaches, such as endectocide-aided vector control in livestock may be an effective strategy for malaria reduction in Madagascar.

      2. Use of a tablet-based system to perform abdominal ultrasounds in a field investigation of schistosomiasis-related morbidity in western Kenyaexternal icon
        Straily A, Malit AO, Wanja D, Kavere EA, Kiplimo R, Aera R, Momanyi C, Mwangi S, Mukire S, Souza AA, Wiegand RE, Montgomery SP, Secor WE, Odiere M.
        Am J Trop Med Hyg. 2021 Jan 11.
        Chronic intestinal schistosomiasis can cause severe hepatosplenic disease and is a neglected tropical disease of public health importance in sub-Saharan Africa, including Kenya. Although the goal of control programs is to reduce morbidity, milestones for program performance focus on reductions in prevalence and intensity of infection, rather than actual measures of morbidity. Using ultrasound to measure hepatosplenic disease severity is an accepted method of determining schistosomiasis-related morbidity; however, ultrasound has not historically been considered a field-deployable tool because of equipment limitations and unavailability of expertise. A point-of-care tablet-based ultrasound system was used to perform abdominal ultrasounds in a field investigation of schistosomiasis-related morbidity in western Kenya; during the study, other pathologies and pregnancies were also identified via ultrasound, and participants referred to care. Recent technological advances may make it more feasible to implement ultrasound as part of a control program and can also offer important benefits to the community.

      3. A comparative evaluation of mobile medical APPS (MMAS) for reading and interpreting malaria rapid diagnostic testsexternal icon
        Visser T, Ramachandra S, Pothin E, Jacobs J, Cunningham J, Menach AL, Gatton ML, Dos Santos Souza S, Nelson S, Rooney L, Aidoo M.
        Malar J. 2021 Jan 13;20(1):39.
        BACKGROUND: The World Health Organization recommends confirmatory diagnosis by microscopy or malaria rapid diagnostic test (RDT) in patients with suspected malaria. In recent years, mobile medical applications (MMAs) that can interpret RDT results have entered the market. To evaluate the performance of commercially available MMAs, an evaluation was conducted comparing RDT results read by MMAs to RDT results read by the human eye. METHODS: Five different MMAs were evaluated on six different RDT products using cultured Plasmodium falciparum blood samples at five dilutions ranging from 20 to 1000 parasites (p)/microlitre (µl), along with malaria-negative blood samples. The RDTs were performed in a controlled, laboratory setting by a trained operator who visually read the RDT results. A second trained operator then used the MMAs to read the RDT results. Sensitivity (Sn) and specificity (Sp) for the RDTs were calculated in a Bayesian framework using mixed models. RESULTS: The RDT Sn of the P. falciparum (Pf) test line, when read by the trained human eye, was significantly higher than when read by MMAs (74% vs. an average of 47%) in samples at 20 p/µl. In higher-density samples, the Sn of three MMAs was comparable to that of the human eye (97%). The RDT Sn of test lines that detect all Plasmodium species (Pan line), when read by the trained human eye, was significantly higher than when read by MMAs (79% vs. an average of 56%) across all densities. The RDT Sp, whether read by the human eye or by MMAs, was 99% for both the Pf and Pan test lines across all densities. CONCLUSIONS: The study results show that in a laboratory setting, most MMAs interpreted the Pf test line of RDTs comparably to the human eye at parasite densities typically found in patients who experience malaria symptoms (> 100 p/µl). At low parasite densities for the Pf line, and across all parasite densities for the Pan line, MMAs were less accurate than the human eye. Future efforts should focus on improving band/line detection at lower band intensities and on evaluating additional MMA functionalities, such as the ability to identify and classify RDT errors or anomalies.
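        The paper's full Bayesian mixed-model framework pools estimates across RDT products; as a minimal sketch of the underlying idea, sensitivity for a single reader can be estimated from hypothetical true-positive/false-negative counts with a conjugate Beta prior (the counts and the Beta(1, 1) prior below are illustrative assumptions, not values from the study):

```python
# Hedged sketch: posterior mean of sensitivity under a Beta(a, b) prior,
# a simplified stand-in for the study's Bayesian mixed-model estimates.
def posterior_sensitivity(true_pos, false_neg, a=1.0, b=1.0):
    """Posterior mean of Sn given binomial counts and a Beta(a, b) prior."""
    return (true_pos + a) / (true_pos + false_neg + a + b)

# e.g. a hypothetical MMA correctly reading 47 of 100 positive samples:
est = round(posterior_sensitivity(47, 53), 2)  # ≈ 0.47
```

        The same formula with Sp's true-negative/false-positive counts gives the specificity estimate; the prior keeps estimates away from 0 or 1 when counts are small.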

    • Physical Activity
      1. Perceived importance of physical activity and walkable neighborhoods among US adults, 2017
        Carlson SA, Ussery EN, Watson KB, Cornett KA, Fulton JE.
        Prev Chronic Dis. 2020 Dec 31;17:E168.
        The importance of physical activity and of community-level promotion strategies is well established, but little is known about adults' perceptions of the importance of physical activity. In a nationwide sample of US adults, we examined the self-reported importance of regular physical activity and of living in a walkable neighborhood. About 55% of adults strongly agreed that regular physical activity is important, 40% strongly agreed that living in a walkable neighborhood is important, and 31% strongly agreed that both are important. For each measure separately, estimates were lower among adults with lower education levels and among those who did not meet the aerobic physical activity guideline. Opportunities exist to improve perceptions of the importance of both physical activity and walkable neighborhoods.

    • Program Evaluation
      1. Over the past decade, CDC has been implementing a high-impact prevention (HIP) approach to HIV, directing funds towards activities with the greatest likelihood of reducing new infections and disparities. Corresponding to this shift, the Division of HIV/AIDS Prevention (DHAP) began funding a series of multi-site demonstration projects to provide extra support and evaluative capacity to select health departments to initiate new HIP programming, with the intention of ascertaining and sharing lessons with other health departments. In this paper, we provide context for the PrEP, Implementation, Data2Care, Evaluation (PrIDE) evaluation by describing the evolution of evaluation goals and activities across three prior demonstration projects, highlighting four areas of change: 1) integrated evaluation and program implementation; 2) local program evaluation in addition to cross-site performance monitoring; 3) prescriptive allocation of resources to support local program evaluation; and 4) expansion beyond single site program evaluation to identify effective cross-site programmatic strategies. Together, these changes reflect our own learning about achieving the greatest contribution from multi-site projects and set the stage for unique aspects of program evaluation within PrIDE. We describe these features, concluding with lessons learned from this most recent approach to structuring and supporting evaluation within CDC DHAP's health department demonstration projects.

    • Substance Use and Abuse
      1. Characterization of acrylonitrile exposure in the United States based on urinary N-acetyl-S-(2-cyanoethyl)-L-cysteine (2CYEMA): NHANES 2011-2016
        De Jesús VR, Zhang L, Bhandari D, Zhu W, Chang JT, Blount BC.
        J Expo Sci Environ Epidemiol. 2021 Jan 11.
        BACKGROUND: Acrylonitrile is a possible human carcinogen that is used in polymers and formed in tobacco smoke. We assessed acrylonitrile exposure in the US population by measuring its urinary metabolites N-acetyl-S-(2-cyanoethyl)-L-cysteine (2CYEMA) and N-acetyl-S-(1-cyano-2-hydroxyethyl)-L-cysteine (1CYHEMA) in participants from the 2011-2016 National Health and Nutrition Examination Survey. OBJECTIVE: To assess acrylonitrile exposure using population-based biomonitoring data of the US civilian, non-institutionalized population. METHODS: Laboratory data for 8057 participants were reported for 2CYEMA and 1CYHEMA using ultrahigh-performance liquid chromatography/tandem mass spectrometry. Exclusive tobacco smokers were distinguished from non-users using a combination of self-reporting and serum cotinine data. We used multiple linear regression models to fit 2CYEMA concentrations with sex, age, race/Hispanic origin, and tobacco user group as predictor variables. RESULTS: The median 2CYEMA level was higher for exclusive cigarette smokers (145 µg/g creatinine) than for non-users (1.38 µg/g creatinine). Compared to unexposed individuals (serum cotinine ≤0.015 ng/ml) and controlling for confounders, presumptive second-hand tobacco smoke exposure (serum cotinine >0.015 to ≤10 ng/ml and 0 cigarettes per day, CPD) was significantly associated with 36% higher 2CYEMA levels (p < 0.0001). Smoking 1-10 CPD was significantly associated with 6720% higher 2CYEMA levels (p < 0.0001). SIGNIFICANCE: We show that tobacco smoke is an important source of acrylonitrile exposure in the US population and provide important biomonitoring data on acrylonitrile exposure.

      2. Neonatal abstinence syndrome and maternal opioid-related diagnoses in the US, 2010-2017
        Hirai AH, Ko JY, Owens PL, Stocks C, Patrick SW.
        JAMA. 2021 Jan 12;325(2):146-155.
        IMPORTANCE: Substantial increases in both neonatal abstinence syndrome (NAS) and maternal opioid use disorder have been observed through 2014. OBJECTIVE: To examine national and state variation in NAS and maternal opioid-related diagnoses (MOD) rates in 2017 and to describe national and state changes since 2010 in the US, which included expanded MOD codes (opioid use disorder plus long-term and unspecified use) implemented in the International Classification of Diseases, 10th Revision, Clinical Modification. DESIGN, SETTING, AND PARTICIPANTS: Repeated cross-sectional analysis of the 2010 to 2017 Healthcare Cost and Utilization Project's National Inpatient Sample and State Inpatient Databases, an all-payer compendium of hospital discharge records from community nonrehabilitation hospitals in 47 states and the District of Columbia. EXPOSURES: State and year. MAIN OUTCOMES AND MEASURES: NAS rate per 1000 birth hospitalizations and MOD rate per 1000 delivery hospitalizations. RESULTS: In 2017, there were 751,037 birth hospitalizations and 748,239 delivery hospitalizations in the national sample; 5375 newborns had NAS and 6065 women had MOD documented in the discharge record. Mean gestational age was 38.4 weeks and mean maternal age was 28.8 years. From 2010 to 2017, the estimated NAS rate significantly increased by 3.3 per 1000 birth hospitalizations (95% CI, 2.5-4.1), from 4.0 (95% CI, 3.3-4.7) to 7.3 (95% CI, 6.8-7.7). The estimated MOD rate significantly increased by 4.6 per 1000 delivery hospitalizations (95% CI, 3.9-5.4), from 3.5 (95% CI, 3.0-4.1) to 8.2 (95% CI, 7.7-8.7). Larger increases for MOD vs NAS rates occurred with new International Classification of Diseases, 10th Revision, Clinical Modification codes in 2016.
From a census of 47 state databases in 2017, NAS rates ranged from 1.3 per 1000 birth hospitalizations in Nebraska to 53.5 in West Virginia, with Maine (31.4), Vermont (29.4), Delaware (24.2), and Kentucky (23.9) also exceeding 20 per 1000 birth hospitalizations. MOD rates ranged from 1.7 per 1000 delivery hospitalizations in Nebraska to 47.3 in Vermont, with West Virginia (40.1), Maine (37.8), Delaware (24.3), and Kentucky (23.4) also exceeding 20 per 1000 delivery hospitalizations. From 2010 to 2017, NAS and MOD rates increased significantly for all states except Nebraska and Vermont, which had increases only in MOD. CONCLUSIONS AND RELEVANCE: In the US from 2010 to 2017, estimated rates of NAS and MOD significantly increased nationally and for the majority of states, with notable state-level variation.

      3. BACKGROUND: COVID-19 community mitigation measures (e.g., stay-at-home orders) may worsen mental health and substance use-related harms such as opioid use disorder and overdose and limit access to medications for these conditions. We used nationally representative data to assess dispensing of select substance use and mental health medications during the pandemic in the U.S. METHODS: IQVIA Total Patient Tracker data were used to calculate U.S. monthly numbers of unique patients dispensed buprenorphine, extended-release (ER) intramuscular naltrexone, naloxone, selective serotonin or serotonin-norepinephrine reuptake inhibitors, benzodiazepines, and for comparison, HMG-CoA reductase inhibitors (statins) and angiotensin receptor blockers (ARBs) between January 2019 and May 2020. Forecasted estimates of the number of unique patients dispensed medications, generated by exponential smoothing statistical forecasting, were compared to actual numbers of patients by month to examine access during mitigation measures (March 2020-May 2020). RESULTS: Between March 2020 and May 2020, the numbers of unique patients dispensed buprenorphine and naloxone were within forecasted estimates. Numbers dispensed ER intramuscular naltrexone were significantly below forecasted estimates in March 2020 (-1039; 95% CI: -1528 to -550), April 2020 (-2139; 95% CI: -2629 to -1650), and May 2020 (-2498; 95% CI: -2987 to -2009). Numbers dispensed antidepressants and benzodiazepines were significantly above forecasted estimates in March 2020 (977,063; 95% CI: 351,384 to 1,602,743 and 450,074; 95% CI: 189,999 to 710,149 additional patients, respectively), but were within forecasted estimates in April 2020-May 2020. Dispensing patterns for statins and ARBs were similar to those for antidepressants and benzodiazepines. CONCLUSIONS: Ongoing concerns about the impact of the COVID-19 pandemic on substance use and mental health underscore the need for innovative strategies to facilitate continued access to treatment.
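        The forecast-versus-actual comparison described above can be sketched with simple exponential smoothing; the monthly counts and smoothing constant below are hypothetical, and the study used a fuller forecasting model with confidence intervals on IQVIA data:

```python
# Minimal sketch with made-up monthly patient counts (not study data).
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing: forecast for the period after the last
    observation, with smoothing constant alpha in (0, 1]."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

baseline = [100, 102, 101, 103, 104, 105]   # hypothetical pre-pandemic months
forecast = ses_forecast(baseline, alpha=0.5)  # 104.0
actual = 96                                   # hypothetical pandemic month
shortfall = actual - forecast                 # negative = below forecast
```

        In the study, a month counted as significantly below (or above) forecast only when the actual value fell outside the forecast's confidence interval, not merely below the point estimate as in this sketch.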

      4. Assessment of neonatal abstinence syndrome surveillance - Pennsylvania, 2019
        Krause KH, Gruber JF, Ailes EC, Anderson KN, Fields VL, Hauser K, Howells CL, Longenberger A, McClung N, Oakley LP, Reefhuis J, Honein MA, Watkins SM.
        MMWR Morb Mortal Wkly Rep. 2021 Jan 15;70(2):40-45.
        The incidence of neonatal abstinence syndrome (NAS), a withdrawal syndrome associated with prenatal opioid or other substance exposure (1), has increased as part of the U.S. opioid crisis (2). No national NAS surveillance system exists (3), and data about the accuracy of state-based surveillance are limited (4,5). In February 2018, the Pennsylvania Department of Health began surveillance for opioid-related NAS in birthing facilities and pediatric hospitals (6). In March 2019, CDC helped the Pennsylvania Department of Health assess the accuracy of this reporting system at five Pennsylvania hospitals. Medical records of 445 infants who possibly had NAS were abstracted; these infants had either been reported by hospital providers as having NAS or assigned an International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) hospital discharge code potentially related to NAS. Among these 445 infants, 241 were confirmed as having NAS. Pennsylvania's NAS surveillance identified 191 (sensitivity = 79%) of the confirmed cases. The proportion of infants with confirmed NAS who were assigned the ICD-10-CM code for neonatal withdrawal symptoms from maternal use of drugs of addiction (P96.1) was similar among infants reported to surveillance (71%) and those who were not (78%; p = 0.30). Infants with confirmed NAS who were not assigned code P96.1 typically had less severe signs and symptoms. Accurate NAS surveillance, which is necessary to monitor changes and regional differences in incidence and to assist with planning for needed services, is strengthened by combining diagnosis code assessment with focused medical record review.
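        The 79% sensitivity figure above is simply the number of confirmed cases captured by the reporting system divided by all confirmed cases, which can be checked directly:

```python
# Surveillance sensitivity: reported true positives / all confirmed cases.
def surveillance_sensitivity(reported, confirmed):
    return reported / confirmed

# 191 of 241 confirmed NAS cases were captured by Pennsylvania's system:
pct = round(surveillance_sensitivity(191, 241) * 100)  # -> 79
```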

    • Veterinary Medicine
      1. Barriers and opportunities for canine rabies vaccination campaigns in Addis Ababa, Ethiopia
        Yoak AJ, Haile A, O'Quin J, Belu M, Birhane M, Bekele M, Murphy S, Medley A, Vincent E, Stewart D, Shiferaw ML, Tafese K, Garabed R, Pieracci EG.
        Prev Vet Med. 2021 Jan 4;187:105256.
        BACKGROUND: Canine rabies is endemic in Ethiopia and presents a significant burden for both animal and human health. We investigate barriers to dog vaccination in Addis Ababa, Ethiopia. These results can be utilized to improve and target future rabies control efforts. METHODOLOGY/PRINCIPAL FINDINGS: During May of 2017, dog owners were surveyed during a free canine rabies vaccination program that utilized both door-to-door (DtD) and central point (CP) vaccination methods. Surveys collected information on preferences for rabies vaccine delivery and were administered in Amharic. A total of 1057 surveys were completed. Of those surveyed, 62.4 % indicated that their dogs had been vaccinated against rabies within the last year. Commonly reported barriers to vaccination were a lack of awareness that dogs required rabies vaccines (18.1 %) and lack of knowledge about where to find the vaccine (15.0 %). The median price owners were willing to pay for vaccination was 25 birr ($0.91 USD) and the median distance willing to travel was 1.0 km; however, 48.9 % of those surveyed during DtD were unwilling to travel at all. Latent class analysis grouped respondents into three classes based on their responses: 'the Unaware', 'the Vaccinators', and 'the Multiple Barriers'. CONCLUSIONS/SIGNIFICANCE: Although many respondents were willing to pay for rabies vaccine (94.0 %), the preferred (median) cost was less than the actual cost of providing the vaccine. This supports the need for reduced-cost or free vaccine to achieve and sustain the 70 % vaccine coverage target threshold for canine rabies elimination. Additionally, a significant portion (41.5 %) of those surveyed indicated that they were unwilling to travel in order to have their dog vaccinated. The latent class analysis provides useful guidance on how to reach target vaccination.
Owners from 'the Unaware' group made up 18.1 % of respondents and their high rate of allowing their dogs to roam identifies them as a prime target for canine health and behavior education. 'The Multiple Barriers' owners reported lower degrees of dog roaming and were substantially more likely to be found by DtD campaigns, possibly because they have limited ability/interest in handling their dogs. These results demonstrate the importance of incorporating DtD vaccination as well as subsidies to maximize vaccine coverage in Addis Ababa.

    • Zoonotic and Vectorborne Diseases
      1. Role of cats in human toxocarosis
        Castro PD, Sapp SG.
        Companion Animal. 2021;26(1).
        Toxocara cati, the feline ascarid, is ubiquitous in domestic cats globally and is increasingly recognised as an important zoonotic species. In the definitive host, infections with the adult ascarid usually do not present any clinical signs; when clinical signs do appear, it is usually in kittens infected with T. cati, especially by the transmammary route. Signs may include cachexia, a pot-bellied appearance, respiratory disorders, diarrhoea, and vomiting, among others, and these may present as early as 3 weeks of age. However, infections with Toxocara spp. larvae in paratenic hosts (including humans and many other animals) can result in serious complications from the migration of larvae. Historically, there has been an assumption that Toxocara canis was the most likely cause of Toxocara spp.-related disease; while it is probably true that T. canis is responsible for the majority of infections, it is important that those caused by T. cati are accurately identified so that the contribution of this parasite to human disease can be established and then handled appropriately. Overall, the detection of infections in cats and the control of parasite stages in the environment are essential to minimise the infection risk to other animals or humans.

      2. Fatal case of chronic Jamestown Canyon virus encephalitis diagnosed by metagenomic sequencing in patient receiving rituximab
        Solomon IH, Ganesh VS, Yu G, Deng XD, Wilson MR, Miller S, Milligan TA, Mukerji SS, Mathewson A, Linxweiler J, Morse D, Ritter JM, Staples JE, Hughes H, Gould CV, Sabeti PC, Chiu CY, Piantadosi A.
        Emerg Infect Dis. 2021 Jan;27(1):238-42.
        A 56-year-old man receiving rituximab who had months of neurologic symptoms was found to have Jamestown Canyon virus in cerebrospinal fluid by clinical metagenomic sequencing. The patient died, and postmortem examination revealed extensive neuropathologic abnormalities. Deep sequencing enabled detailed characterization of viral genomes from the cerebrospinal fluid, cerebellum, and cerebral cortex.

CDC Science Clips Production Staff

  • Takudzwa Sayi, Editor
  • Gail Bang, MLIS, Librarian
  • Kathy Tucker, Librarian
  • William (Bill) Thomas, MLIS, Librarian
  • Jarvis Sims, MIT, MLIS, Librarian
  • William Friedman, MLIS, Librarian


DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention (CDC), nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinions, findings, and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.

Page last reviewed: January 26, 2021, 12:00 AM