
CDC Science Clips: Volume 14, Issue 21, May 24, 2022

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention score to track social and mainstream media mentions.

  1. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Antimicrobial Resistance and Antibiotic Stewardship
      1. Transmitted drug resistance among human immunodeficiency virus (HIV)-1 diagnoses in the United States, 2014-2018
        McClung RP, Oster AM, Ocfemia MC, Saduvala N, Heneine W, Johnson JA, Hernandez AL.
        Clin Infect Dis. 2022 Mar 23;74(6):1055-1062.
        BACKGROUND: Transmitted human immunodeficiency virus (HIV) drug resistance can threaten the efficacy of antiretroviral therapy and pre-exposure prophylaxis (PrEP). Drug-resistance testing is recommended at entry to HIV care in the United States and provides valuable insight for clinical decision making and population-level monitoring. METHODS: We assessed transmitted drug-resistance-associated mutation (TDRM) prevalence and predicted susceptibility to common HIV drugs among US persons with HIV diagnosed during 2014-2018 who had a drug resistance test performed ≤3 months after HIV diagnosis and reported to the National HIV Surveillance System and who resided in 28 jurisdictions where ≥20% of HIV diagnoses had an eligible sequence during this period. RESULTS: Of 50 747 persons in the analysis, 9616 (18.9%) had ≥1 TDRM. TDRM prevalence was 0.8% for integrase strand transfer inhibitors (INSTIs), 4.2% for protease inhibitors, 6.9% for nucleoside reverse transcriptase inhibitors (NRTIs), and 12.0% for non-NRTIs. Most individual mutations had a prevalence <1.0% including M184V (0.9%) and K65R (0.1%); K103N was most prevalent (8.6%). TDRM prevalence did not increase or decrease significantly during 2014-2018 overall, for individual drug classes, or for key individual mutations except for M184V (12.9% increase per year; 95% confidence interval, 5.6-20.6%). CONCLUSIONS: TDRM prevalence overall and for individual drug classes remained stable during 2014-2018; transmitted INSTI resistance was uncommon. Continued population-level monitoring of INSTI and NRTI mutations, especially M184V and K65R, is warranted amidst expanding use of second-generation INSTIs and PrEP.

      2. Pharmacist-driven transitions of care practice model for prescribing oral antimicrobials at hospital discharge
        Mercuro NJ, Medler CJ, Kenney RM, MacDonald NC, Neuhauser MM, Hicks LA, Srinivasan A, Divine G, Beaulac A, Eriksson E, Kendall R, Martinez M, Weinmann A, Zervos M, Davis SL.
        JAMA Netw Open. 2022 May 2;5(5):e2211331.
        IMPORTANCE: Although prescribers face numerous patient-centered challenges during transitions of care (TOC) at hospital discharge, prolonged duration of antimicrobial therapy for common infections remains problematic, and resources are needed for antimicrobial stewardship throughout this period. OBJECTIVE: To evaluate a pharmacist-driven intervention designed to improve selection and duration of oral antimicrobial therapy prescribed at hospital discharge for common infections. DESIGN, SETTING, AND PARTICIPANTS: This quality improvement study used a nonrandomized stepped-wedge design with 3 study phases from September 1, 2018, to August 31, 2019. Seventeen distinct medicine, surgery, and specialty units from a health system in Southeast Michigan participated, including 1 academic tertiary hospital and 4 community hospitals. Hospitalized adults who had urinary, respiratory, skin and/or soft tissue, and intra-abdominal infections and were prescribed antimicrobials at discharge were included in the analysis. Data were analyzed from February 18, 2020, to February 28, 2022. INTERVENTIONS: Clinical pharmacists engaged in a new standard of care for antimicrobial stewardship practices during TOC by identifying patients to be discharged with a prescription for oral antimicrobials and collaborating with primary teams to prescribe optimal therapy. Academic and community hospitals used both antimicrobial stewardship and clinical pharmacists in a multidisciplinary rounding model to discuss, document, and facilitate order entry of the antimicrobial prescription at discharge. MAIN OUTCOMES AND MEASURES: The primary end point was frequency of optimized antimicrobial prescription at discharge. Health system guidelines developed from national guidelines and best practices for short-course therapies were used to evaluate optimal therapy. 
RESULTS: A total of 800 patients prescribed oral antimicrobials at hospital discharge were included in the analysis (441 women [55.1%]; mean [SD] age, 66.8 [17.3] years): 400 in the preintervention period and 400 in the postintervention period. The most common diagnoses were pneumonia (264 [33.0%]), upper respiratory tract infection and/or acute exacerbation of chronic obstructive pulmonary disease (214 [26.8%]), and urinary tract infection (203 [25.4%]). Patients in the postintervention group were more likely to have an optimal antimicrobial prescription (time-adjusted generalized estimating equation odds ratio, 5.63 [95% CI, 3.69-8.60]). The absolute increase in optimal prescribing in the postintervention group was consistent in both academic (37.4% [95% CI, 27.5%-46.7%]) and community (43.2% [95% CI, 32.4%-52.8%]) TOC models. There were no differences in clinical resolution or mortality. Fewer severe antimicrobial-related adverse effects (time-adjusted generalized estimating equation odds ratio, 0.40 [95% CI, 0.18-0.88]) were identified in the postintervention (13 [3.2%]) compared with the preintervention (36 [9.0%]) groups. CONCLUSIONS AND RELEVANCE: The findings of this quality improvement study suggest that targeted antimicrobial stewardship interventions during TOC were associated with increased optimal, guideline-concordant antimicrobial prescriptions at discharge.

      3. Antimicrobial susceptibility survey of invasive Haemophilus influenzae in the United States in 2016
        Potts CC, Rodriguez-Rivera LD, Retchless AC, Buono SA, Chen AT, Marjuki H, Blain AE, Wang X.
        Microbiol Spectr. 2022 May 10:e0257921.
        Antibiotics are important for the treatment and prevention of invasive Haemophilus influenzae disease. Reduced susceptibility to clinically relevant drugs, except ampicillin, has been uncommon in the United States. Susceptibility of 700 invasive H. influenzae isolates, collected through population-based surveillance during 2016, was assessed for 15 antibiotics using broth microdilution, according to the CLSI guidelines; a subset of 104 isolates was also assessed for rifampin susceptibility using Etest. Genomes were sequenced to identify genes and mutations known to be associated with reduced susceptibility to clinically relevant drugs. A total of 508 isolates (72.6%) had reduced susceptibility to at least one antibiotic, and more than half of the isolates exhibited reduced susceptibility to only one (33.6%) or two (21.6%) antibiotic classes. All tested isolates were susceptible to rifampin, a chemoprophylaxis agent, and <1% (n = 3) of isolates had reduced susceptibility to third-generation cephalosporins, which are recommended for invasive disease treatment. In contrast, ampicillin resistance was more common (28.1%) and predominantly associated with the detection of a β-lactamase gene; 26.2% of isolates in the collection contained either a TEM-1 or ROB-1 β-lactamase gene, including 88.8% of ampicillin-resistant isolates. β-lactamase-negative ampicillin-resistant (BLNAR) isolates were less common and associated with ftsI mutations; resistance to amoxicillin-clavulanate was detected in <2% (n = 13) of isolates. The proportion of reduced susceptibility observed was higher among nontypeable H. influenzae and serotype e than among other serotypes. US invasive H. influenzae isolates remain predominantly susceptible to clinically relevant antibiotics except ampicillin, and BLNAR isolates remain uncommon. IMPORTANCE Antibiotics play an important role in the treatment and prevention of invasive Haemophilus influenzae disease. An antimicrobial resistance survey of invasive H. influenzae isolates collected in 2016 showed that the US H. influenzae population remained susceptible to clinically relevant antibiotics, except for ampicillin. Detection of ampicillin resistance and β-lactamase-containing strains in approximately a quarter of isolates demonstrates that resistance mechanisms can be acquired and sustained within the H. influenzae population, highlighting the continued importance of antimicrobial resistance surveillance for H. influenzae to monitor susceptibility trends and mechanisms of resistance.

    • Chronic Diseases and Conditions
      1. Aminoaciduria and metabolic dysregulation during diabetic ketoacidosis: Results from the diabetic kidney alarm (DKA) study
        Melena I, Piani F, Tommerdahl KL, Severn C, Chung LT, MacDonald A, Vinovskis C, Cherney D, Pyle L, Roncal-Jimenez CA, Lanaspa MA, Rewers A, van Raalte DH, Cara-Fuentes G, Parikh CR, Nelson RG, Pavkov ME, Nadeau KJ, Johnson RJ, Bjornstad P.
        J Diabetes Complications. 2022 Apr 28:108203.
        OBJECTIVE: We examined changes in the excretion of various amino acids and in glycolysis and ketogenesis-related metabolites, during and after diabetic ketoacidosis (DKA) diagnosis, in youth with known or new onset type 1 diabetes (T1D). METHODS: Urine samples were collected from 40 youth with DKA (52% boys, mean age 11 ± 4 years, venous pH 7.2 ± 0.1, blood glucose 451 ± 163 mg/dL) at 3 time points: 0-8 h and 12-24 h after starting an insulin infusion, and 3 months after hospital discharge. Mixed-effects models evaluated the changes in amino acids and other metabolites in the urine. RESULTS: Concentrations of urine histidine, threonine, tryptophan, and leucine per creatinine were highest at 0-8 h (148.8 ± 23.5, 59.5 ± 12.3, 15.4 ± 1.4, and 24.5 ± 2.4% of urine creatinine, respectively), and significantly decreased over 3 months (p = 0.028, p = 0.027, p = 0.019, and p < 0.0001, respectively). Urine histidine, threonine, tryptophan, and leucine per urine creatinine decreased by 10.6 ± 19.2-, 0.7 ± 0.9-, 1.3 ± 0.9-, and 0.5 ± 0.3-fold, respectively, between the 0-8 h and 3-month time points. CONCLUSIONS: In our study, DKA was associated with profound aminoaciduria, suggestive of proximal tubular dysfunction analogous to Fanconi syndrome.

      2. Role attainment in emerging adulthood: Subjective evaluation by male adolescents and adults with Duchenne and Becker muscular dystrophy
        Peay HL, Do BT, Khosla N, Paramsothy P, Erickson SW, Lamb MM, Whitehead N, Fox DJ, Pandya S, Kinnett K, Wolff J, Howard JF.
        J Neuromuscul Dis. 2022;9(3):447-456.
        BACKGROUND: Youth with Duchenne and Becker muscular dystrophy (DBMD) experience challenges in attaining adult roles, which may impact quality of life. New interventions and treatments may facilitate adult role attainment through improved function. Historical data on adult role attainment are important to assess the impact of new interventions on teens and young adults with DBMD. This study assesses medical knowledge, independence and employment, and relationships among adolescents and young adults with DBMD. METHODS: This study uses data from a 2013 Muscular Dystrophy Surveillance, Tracking, and Research Network (MD STARnet) survey on adult transition. Males with DBMD aged 16-30 years were included. RESULTS: Sixty-five of 258 eligible males participated; we report results on 60 participants with an MD STARnet case definition of DMD or BMD. Individuals with BMD reported higher rates than those with DMD of frequently staying home without supervision (50% BMD; 14% DMD), independently performing daily physical needs (93% BMD; 7% DMD) and being employed full or part time (33% BMD; 4% DMD). Most participants understood medication and physical therapy goals; less than half indicated being often or always responsible for scheduling DBMD-related management and refilling medications. Most had not been in a romantic relationship but reported desiring such relationships. CONCLUSIONS: Our data reinforce the impact of DMD (and to a lesser extent, BMD) on transition to adult roles. These results provide an important historical comparator for teen and adult patients who are trying new interventions and therapies. Such data are important for assessing the quality-of-life impact of new treatments and to inform support and training programs for people with DBMD as they transition to new adult roles and responsibilities.

    • Communicable Diseases
      1. Trends in HIV care outcomes among adults and adolescents in the U.S. South, 2015-2019
        Gant Z, Dailey A, Wang S, Lyons SJ, Watson M, Lee K, Johnson AS.
        Ann Epidemiol. 2022 May 4.
        PURPOSE: HIV disparities continue to persist in the southern United States and among some populations. Early HIV diagnosis, prompt linkage to care, and viral suppression among persons with HIV in the South, in particular the Deep South, are critical to reduce disparities and achieve national prevention goals. METHODS: Estimated annual percent changes were calculated to assess trends during 2015-2019 in percentage distributions for stage of disease at the time of diagnosis, linkage to HIV medical care, and viral suppression. RESULTS: Among 95,488 persons with HIV diagnosed in the South (Deep South: 81,848; Other South: 13,640), the overall percentage that received a diagnosis classified as stage 0 increased 9.0%, stages 1-2 increased 1.8%, linkage to HIV care increased 2.9%, and viral suppression increased 5.9%. Changes in care outcomes among American Indian/Alaska Native persons and persons with infection attributed to injection drug use were minimal. CONCLUSIONS: To reach the goals of Ending the HIV Epidemic (EHE) and other federal initiatives, efforts need to focus on prevention and care among persons residing in the South. Addressing factors such as stigma and discrimination and elimination of barriers to HIV testing, care, and treatment are needed to effectively address these disparities in HIV-related care outcomes.
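The trend statistic used in this study, the estimated annual percent change (EAPC), is conventionally obtained from a log-linear regression of the annual percentage on calendar year. A minimal sketch of that calculation, using made-up yearly percentages rather than the study's data:

```python
import math

# Hypothetical annual percentages (illustrative only, not the study's data),
# e.g., the share of diagnoses virally suppressed in each year 2015-2019.
years = [2015, 2016, 2017, 2018, 2019]
pcts = [60.0, 62.0, 64.5, 66.0, 68.5]

# EAPC assumes log-linear change: ln(pct) = a + b*year, so
# EAPC = (e^b - 1) * 100. Fit the slope b by ordinary least squares.
n = len(years)
x_mean = sum(years) / n
logs = [math.log(p) for p in pcts]
y_mean = sum(logs) / n
b = sum((x - x_mean) * (y - y_mean) for x, y in zip(years, logs)) / \
    sum((x - x_mean) ** 2 for x in years)
eapc = (math.exp(b) - 1) * 100
print(round(eapc, 2))  # ≈ 3.33, i.e., about a 3.3% increase per year
```

In practice the regression is fit with confidence intervals (as in the abstract's "12.9% increase per year; 95% CI, 5.6-20.6%"); the sketch shows only the point estimate.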

      2. Survey of incidence, lifetime prevalence, and treatment of self-reported vulvovaginal candidiasis, United States, 2020
        Benedict K, Singleton AL, Jackson BR, Molinari NA.
        BMC Womens Health. 2022 May 10;22(1):147.
        BACKGROUND: Vulvovaginal candidiasis (VVC) is a common gynecologic problem in the United States but estimates of its true incidence and prevalence are lacking. We estimated self-reported incidence and lifetime prevalence of healthcare provider-diagnosed VVC and recurrent VVC (RVVC), assessed treatment types, and evaluated demographic and health-related risk factors associated with VVC. METHODS: An online survey was sent to 4548 U.S. adults; data were weighted to be representative of the population. We conducted descriptive and bivariate analyses to examine demographic characteristics and health-related factors associated with having VVC in the past year, lifetime prevalence of VVC, and over-the-counter (OTC) and prescription antifungal treatment use. We conducted multivariate analyses to assess features associated with 1) having VVC in the past year, 2) number of VVC episodes in the past year, and 3) lifetime prevalence of VVC. RESULTS: Among the subset of 1869 women respondents, 98 (5.2%) had VVC in the past year; of those, 5 (4.7%) had RVVC. In total, 991 (53%) women reported healthcare provider-diagnosed VVC in their lifetime. Overall, 72% of women with VVC in the past year reported prescription antifungal treatment use, 40% reported OTC antifungal treatment use, and 16% reported both. In multivariate analyses, odds of having VVC in the past year were highest for women with less than a high school education (aOR = 6.30, CI: 1.84-21.65), with a child/children under 18 years old (aOR = 3.14, CI: 1.58-6.25), with diabetes (aOR = 2.93, CI: 1.32-6.47), who were part of a couple (aOR = 2.86, CI: 1.42-5.78), and with more visits to a healthcare provider for any reason (aOR = 2.72, CI: 1.84-4.01). Similar factors were associated with increasing number of VVC episodes in the past year and with lifetime prevalence of VVC. CONCLUSION: VVC remains a common infection in the United States. Our analysis supports known clinical risk factors for VVC and suggests that antifungal treatment use is high, underscoring the need to ensure appropriate diagnosis and treatment.

      3. HIV incidence in Botswana rural communities with high antiretroviral treatment coverage: Results from the Botswana Combination Prevention Project, 2013-2017
        Ussery F, Bachanas P, Alwano MG, Lebelonyane R, Block L, Wirth K, Ussery G, Sento B, Gaolathe T, Kadima E, Abrams W, Segolodi T, Hader S, Lockman S, Moore J.
        J Acquir Immune Defic Syndr. 2022 May 6.
        BACKGROUND AND SETTING: The Botswana Combination Prevention Project (BCPP) demonstrated a 30% reduction in community HIV incidence through expanded HIV testing, enhanced linkage to care, and universal antiretroviral treatment, and exceeded the Joint United Nations Programme on HIV/AIDS 90-90-90 targets. We report rates and characteristics of incident HIV infections. METHODS: BCPP was a community-randomized controlled trial conducted in 30 rural/peri-urban Botswana communities from 2013 to 2017. Home-based and mobile HIV-testing campaigns were conducted in 15 intervention communities, with 39% of participants testing at least twice. We assessed the HIV incidence rate (IR; number of new HIV infections per 100 person-years (py) at risk) among repeat testers and assessed risk factors with a Cox proportional hazards regression model. RESULTS: During 27,517 py of follow-up, 195 of 18,597 repeat testers became HIV-infected (0.71/100py); 79% were women. Women had a higher IR (1.01/100py; 95% CI, 0.99 to 1.02) than men (0.34/100py; 95% CI, 0.33 to 0.35). The highest IRs were among women aged 16-24 years (1.87/100py) and men aged 25-34 years (0.56/100py). The lowest IRs were among those aged 35-64 years (women, 0.41/100py; men, 0.20/100py). The hazard of incident infection was highest among women aged 16-24 years (HR = 7.05). Sex and age were significantly associated with incidence (both P < 0.0001). CONCLUSIONS: Despite an overall reduction in HIV incidence and approaching the UNAIDS 95-95-95 targets, high HIV incidence was observed in adolescent girls and young women. These findings highlight the need for additional prevention services [pre-exposure prophylaxis (PrEP), DREAMS] to achieve epidemic control in this subpopulation and increased efforts to reach men with undiagnosed HIV.
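The headline figure here, an incidence rate per 100 person-years, is simply events divided by follow-up time. Using the counts published in the abstract (195 infections over 27,517 person-years):

```python
def incidence_rate_per_100py(events, person_years):
    """New infections per 100 person-years at risk."""
    return events / person_years * 100

# Counts reported in the abstract: 195 seroconversions over 27,517 py.
overall = incidence_rate_per_100py(195, 27517)
print(round(overall, 2))  # 0.71, matching the reported 0.71/100py
```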

      4. SARS-CoV-2 virus dynamics in recently infected people - data from a household transmission study
        Mellis AM, Meece JK, Halasa NB, Chappell JD, McLean HQ, Grijalva CG, Hanson KE, Zhu Y, Kim A, Deyoe J, Ivacic LC, Reed C, Talbot HK, Rolfes MA.
        J Infect Dis. 2022 May 5.
        We used daily real-time reverse-transcription polymerase chain reaction (rRT-PCR) results from 67 cases of SARS-CoV-2 infection in a household transmission study, conducted April 2020-May 2021, to examine the trajectory of cycle threshold (Ct) values, an inverse correlate of viral RNA concentration. Ct values varied across RT-PCR platforms and by participant age. Specimens from children and adolescents had higher Ct values, and specimens from adults aged ≥50 years had lower Ct values, than those from adults aged 18-49 years. Ct values were lower on days when participants reported experiencing symptoms, with the lowest Ct value occurring 2-6 days after symptom onset.

      5. Among 342 US infants with congenital cytomegalovirus treated with antivirals, 114 (33%) received ganciclovir (with or without valganciclovir) and 228 (67%) received valganciclovir only, for a median of 8 and 171 days, starting at a median of 15 and 45 days of life, respectively, with neutropenia diagnosed in 25% and 17%.

      6. Acute hepatitis and adenovirus infection among children - Alabama, October 2021-February 2022
        Baker JM, Buchfellner M, Britt W, Sanchez V, Potter JL, Ingram LA, Shiau H, Gutierrez Sanchez LH, Saaybi S, Kelly D, Lu X, Vega EM, Ayers-Millsap S, Willeford WG, Rassaei N, Bullock H, Reagan-Steiner S, Martin A, Moulton EA, Lamson DM, St George K, Parashar UD, Hall AJ, MacNeil A, Tate JE, Kirking HL.
        MMWR Morb Mortal Wkly Rep. 2022 May 6;71(18):638-640.
        During October-November 2021, clinicians at a children's hospital in Alabama identified five pediatric patients with severe hepatitis and adenovirus viremia upon admission. In November 2021, hospital clinicians, the Alabama Department of Public Health, the Jefferson County Department of Health, and CDC began an investigation. This activity was reviewed by CDC and conducted consistent with applicable federal law and CDC policy.

      7. SARS-CoV-2 infection among pregnant people at labor and delivery and changes in infection rates in the general population: Lessons learned from Illinois
        Goyal S, Gerardin J, Cobey S, Son C, McCarthy O, Dror A, Lightner S, Ezike NO, Duffus WA, Bennett AC.
        Public Health Rep. 2022 May 5:333549221091826.
        OBJECTIVES: The Illinois Department of Public Health (IDPH) assessed whether increases in the SARS-CoV-2 test positivity rate among pregnant people at labor and delivery (L&D) could signal increases in SARS-CoV-2 prevalence in the general Illinois population earlier than current state metrics. MATERIALS AND METHODS: Twenty-six birthing hospitals universally testing for SARS-CoV-2 at L&D voluntarily submitted data from June 21, 2020 through January 23, 2021, to IDPH. Hospitals reported the daily number of people who delivered, SARS-CoV-2 tests, and test results as well as symptom status. We compared the test positivity rate at L&D with the test positivity rate of the general population and the number of hospital admissions for COVID-19-like illness by quantifying correlations in trends and identifying a lead time. RESULTS: Of 26 633 reported pregnant people who delivered, 96.8% (n = 25 772) were tested for SARS-CoV-2. The overall test positivity rate was 2.4% (n = 615); 77.7% (n = 478) were asymptomatic. In Chicago, the only region with a sufficient sample size for analysis, the test positivity rate at L&D (peak of 5% on December 7, 2020) was lower and more stable than the test positivity rate of the general population (peak of 14% on November 13, 2020) and lagged hospital admissions for COVID-19-like illness (peak of 118 on November 15, 2020) and the test positivity rate of the general population by about 10 days (Pearson correlation = 0.73 and 0.75, respectively). PRACTICE IMPLICATIONS: Trends in the test positivity rate at L&D did not provide an earlier signal of increases in Illinois's SARS-CoV-2 prevalence than current state metrics did. Nonetheless, the role of universal testing protocols in identifying asymptomatic infection is important for clinical decision making and patient education about infection prevention and control.
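Estimating a lead time between two surveillance series, as done here with Pearson correlations, amounts to sliding one series against the other and picking the lag that maximizes the correlation. The sketch below uses synthetic series purely to illustrate the mechanic; the `best_lag` helper and its data are illustrative, not from the study.

```python
def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def best_lag(leading, lagging, max_lag):
    """Lag (in time steps) at which `lagging` best matches `leading` shifted forward."""
    scores = {}
    for lag in range(max_lag + 1):
        scores[lag] = pearson(leading[: len(leading) - lag], lagging[lag:])
    return max(scores, key=scores.get), scores

# Synthetic series: the second repeats the first, delayed by 2 time steps.
general_pop = [1, 2, 4, 7, 11, 10, 8, 5, 3, 2]
labor_delivery = [0, 0] + general_pop[:-2]
lag, scores = best_lag(general_pop, labor_delivery, max_lag=4)
print(lag)  # 2: the lagging series trails the leading one by 2 steps
```

With real surveillance data the correlation at the best lag would be well below 1 (the abstract reports 0.73 and 0.75 at a roughly 10-day lag), but the mechanic is the same.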

      8. Shigellosis cases with bacterial sexually transmitted infections: Population-based data from 6 US jurisdictions, 2007-2016
        Ridpath AD, Vanden Esschert KL, Bragg S, Campbell S, Convery C, Cope A, Devinney K, Diesel JC, Kikuchi N, Lee N, Lewis FM, Matthias J, Pathela P, Pugsley R, Slutsker JS, Schillinger JA, Thompson C, Tingey C, Wilson J, Newman DR, Marsh ZA, Garcia-Williams AG, Kirkcaldy RD.
        Sex Transm Dis. 2022 May 5.
        BACKGROUND: Shigella species, which cause acute diarrheal disease, are transmitted via fecal-oral and sexual contact. To better understand the overlapping populations affected by Shigella infections and sexually transmitted infections (STIs) in the United States, we examined the occurrence of reported STIs within 24 months among shigellosis case-patients. METHODS: Culture-confirmed Shigella cases diagnosed during 2007-2016 among residents of six U.S. jurisdictions were matched to reports of STIs (chlamydia, gonorrhea, and all stages of syphilis) diagnosed 12 months before or after the shigellosis case. We examined epidemiologic characteristics and reported temporal trends of Shigella cases by sex and species. RESULTS: During 2007-2016, 10,430 shigellosis cases were reported. The annual number of reported shigellosis cases across jurisdictions increased 70%, from 821 cases in 2007 to 1,398 cases in 2016, with a larger increase among males than females. Twenty percent of male shigellosis case-patients had an STI reported in the reference period, versus 4% of female case-patients. The percentage of male shigellosis case-patients with an STI increased from 11% (2007) to 28% (2016); the overall percentage among females remained low. CONCLUSIONS: We highlight the substantial proportion of males with shigellosis who were diagnosed with STIs within 24 months and the benefit of matching data across programs. STI screening may be warranted for male shigellosis case-patients.
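The reported 70% rise in annual cases is a relative change between the first and last years of the series; a quick check with the counts given in the abstract:

```python
# Annual reported shigellosis cases, from the abstract.
cases_2007, cases_2016 = 821, 1398

# Relative (percent) change from the first to the last year of the series.
pct_increase = (cases_2016 - cases_2007) / cases_2007 * 100
print(round(pct_increase))  # 70, matching the reported 70% increase
```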

    • Community Health Services
      1. To assess healthcare provider awareness of the Food and Drug Administration's (FDA's) 2019 approval of nucleic acid amplification tests (NAATs) using extragenital specimens for chlamydia and gonorrhea, several questions were included in Porter Novelli's fall 2020 DocStyles survey, a nationally representative, semi-annual, web-based survey of US healthcare providers. There were 1502 respondents included in this study: 1000 family practitioners/internists as primary care physicians (PCPs), 251 obstetricians/gynecologists (OBs/GYNs), and 251 nurse practitioners/physician assistants (NPs/PAs). Awareness of this FDA approval was 34.3% overall and varied significantly by provider specialty: 45.0% for OBs/GYNs versus 23.5% for NPs/PAs, p < 0.01. OBs/GYNs had the lowest rate of ordering any extragenital gonorrhea and chlamydia tests in the past 12 months (31.6%) versus the other providers (ranging from 46.2% for NPs/PAs to 60.7% for PCPs). Respondents were more likely to be aware of the FDA approval if they had ordered extragenital chlamydia or gonorrhea testing for men who have sex with men (MSM) than if they had not (72.3% versus 43.7%, p < 0.01). Of 1502 respondents, lack of reimbursement was the most frequently mentioned barrier to ordering extragenital tests for chlamydia and gonorrhea (16.6% overall) and did not vary significantly by provider specialty. Further outreach is needed to educate healthcare providers on the changes in the FDA approval for extragenital gonorrhea and chlamydia testing so that they can provide comprehensive care to their patients and reduce the potential for antimicrobial resistance.

    • Disaster Preparedness and Emergency Services
      1. Conducting public health surveillance in areas of armed conflict and restricted population access: a qualitative case study of polio surveillance in conflict-affected areas of Borno State, Nigeria
        Wiesen E, Dankoli R, Musa M, Higgins J, Forbi J, Idris J, Waziri N, Ogunbodede O, Mohammed K, Bolu O, WaNganda G, Adamu U, Pinsker E.
        Confl Health. 2022 May 7;16(1):20.
        This study examined the impact of armed conflict on public health surveillance systems, the limitations of traditional surveillance in this context, and innovative strategies to overcome these limitations. A qualitative case study was conducted to examine the factors affecting the functioning of poliovirus surveillance in conflict-affected areas of Borno state, Nigeria using semi-structured interviews of a purposeful sample of participants. The main inhibitors of surveillance were inaccessibility, the destroyed health infrastructure, and the destroyed communication network. These three challenges created a situation in which the traditional polio surveillance system could not function. Three strategies to overcome these challenges were viewed by respondents as the most impactful. First, local community informants were recruited to conduct surveillance for acute flaccid paralysis in children in the inaccessible areas. Second, the informants engaged in local-level negotiation with the insurgency groups to bring children with paralysis to accessible areas for investigation and sample collection. Third, GIS technology was used to track the places reached for surveillance and vaccination and to estimate the size and location of the inaccessible population. A modified monitoring system tracked tailored indicators including the number of places reached for surveillance and the number of acute flaccid paralysis cases detected and investigated, and utilized GIS technology to map the reach of the program. The surveillance strategies used in Borno were successful in increasing surveillance sensitivity in an area of protracted conflict and inaccessibility. This approach and some of the specific strategies may be useful in other areas of armed conflict.

      2. Purpose: Social vulnerability in the context of disaster management refers to the sociodemographic characteristics of a population and the physical, social, economic, and environmental factors that increase their susceptibility to adverse disaster outcomes and capacity to anticipate, cope with, resist, and recover from disaster events. Because disasters do not impact people equally, researchers, public health practitioners, and emergency managers need training to meet the complex needs of vulnerable populations. Design/methodology/approach: To address gaps in current education, the CONVERGE initiative, headquartered at the Natural Hazards Center at the University of Colorado Boulder, developed the Social Vulnerability and Disasters Training Module. This free online course draws on decades of research to examine the factors that influence social vulnerability to disasters. Examples of studies and evidence-based programs are included to illuminate common methods for studying social vulnerability and ways that research can guide practice. To evaluate the module, all trainees completed a pre- and post-training questionnaire. Findings: Between July 2019 and September 2021, 1,089 people completed the module. Wilcoxon signed rank tests demonstrated a significant perceived increase in self-rated knowledge, skills, and attitudes (KSA). Students, members of historically underrepresented populations, and those new to or less experienced in the field, had the greatest perceived increase. Practical implications: This training module can help participants understand the specific needs of socially vulnerable populations to help reduce human suffering from disasters. Originality/value: This article describes a novel web-based training and offers evaluation data showing how it can help educate a broad hazards and disaster workforce on an important topic for disaster management. © 2022, Rachel Marie Adams, Candace Evans, Amy Wolkin, Tracy Thomas and Lori Peek.

    • Environmental Health
      1. Seasonality and ecological suitability modelling for anthrax (Bacillus anthracis) in Western Africa
        Pittiglio C, Shadomy S, El Idrissi A, Soumare B, Lubroth J, Makonnen Y.
        Animals. 2022 ;12(9).
        Anthrax is hyper-endemic in West Africa, affecting wildlife, livestock and humans. Prediction is difficult due to the lack of accurate outbreak data. However, predicting the risk of infection is important for public health, wildlife conservation and livestock economies. In this study, the seasonality of anthrax outbreaks in West Africa was investigated using climate time series and ecological niche modeling to identify environmental factors related to anthrax occurrence, develop geospatial risk maps and identify seasonal patterns. Outbreak data in livestock, wildlife and humans between 2010 and 2018 were compiled from different sources and analyzed against monthly rates of change in precipitation, normalized difference vegetation index (NDVI) and land surface temperature. Maximum Entropy was used to predict and map the environmental suitability of anthrax occurrence. The findings showed that: (i) Anthrax outbreaks increased significantly (at the 99% confidence level) with incremental changes in monthly precipitation and vegetation growth and decremental changes in monthly temperature during January–June. This explains the occurrence of the anthrax peak during the early wet season in West Africa. (ii) Livestock density, precipitation seasonality, NDVI and alkaline soils were the main predictors of anthrax suitability. (iii) Our approach optimized the use of limited and heterogeneous datasets and ecological niche modeling, demonstrating the value of integrated disease notification data and outbreak reports to generate risk maps. Our findings can inform public, animal and environmental health and enhance national and regional One Health disease control strategies. © 2022 by the authors. Licensee MDPI, Basel, Switzerland.

      2. Analysis of Salmonella enterica isolated from a mixed-use watershed in Georgia, USA: Antimicrobial resistance, serotype diversity, and genetic relatedness to human isolatesexternal icon
        Cho S, Hiott LM, House SL, Woodley TA, McMillan EA, Sharma P, Barrett JB, Adams ES, Brandenburg JM, Hise KB, Bateman McDonald JM, Ottesen EA, Lipp EK, Jackson CR, Frye JG.
        Appl Environ Microbiol. 2022 May 9:e0039322.
        As the cases of Salmonella enterica infections associated with contaminated water are increasing, this study was conducted to address the role of surface water as a reservoir of S. enterica serotypes. We sampled rivers and streams (n = 688) over a 3-year period (2015 to 2017) in a mixed-use watershed in Georgia, USA, and 70.2% of the total stream samples tested positive for Salmonella. A total of 1,190 isolates were recovered and characterized by serotyping, antimicrobial susceptibility testing, and pulsed-field gel electrophoresis (PFGE). A wide range of serotypes was identified, including those commonly associated with humans and animals, with S. enterica serotype Muenchen being predominant (22.7%) and each serotype exhibiting a high degree of strain diversity by PFGE. About half (46.1%) of the isolates had PFGE patterns indistinguishable from those of human clinical isolates in the CDC PulseNet database. A total of 52 isolates (4.4%) were resistant to antimicrobials, out of which 43 isolates were multidrug resistant (MDR; resistance to two or more classes of antimicrobials). These 52 resistant Salmonella isolates were screened for the presence of antimicrobial resistance genes, plasmid replicons, and class 1 integrons, out of which four representative MDR isolates were selected for whole-genome sequencing analysis. The results showed that 28 MDR isolates resistant to 10 antimicrobials had bla(CMY-2) on an A/C plasmid. Persistent contamination of surface water with a high diversity of Salmonella strains, some of which are drug resistant and genetically indistinguishable from human isolates, supports a role of environmental surface water as a reservoir for and transmission route of this pathogen.
IMPORTANCE Salmonella has been traditionally considered a foodborne pathogen, as it is one of the most common etiologies of foodborne illnesses worldwide; however, recent Salmonella outbreaks attributed to fresh produce and water suggest a potential environmental source of Salmonella that causes some human illnesses. Here, we investigated the prevalence, diversity, and antimicrobial resistance of Salmonella isolated from a mixed-use watershed in Georgia, USA, in order to enhance the overall understanding of waterborne Salmonella. The persistence and widespread distribution of Salmonella in surface water confirm environmental sources of the pathogen. A high proportion of waterborne Salmonella with clinically significant serotypes and genetic similarity to strains of human origin supports the role of environmental water as a significant reservoir of Salmonella and indicates a potential waterborne transmission of Salmonella to humans. The presence of antimicrobial-resistant and MDR Salmonella demonstrates additional risks associated with exposure to contaminated environmental water.

      3. Integrating public health surveillance and environmental data to model presence of histoplasma in the United Statesexternal icon
        Hepler SA, Kaufeld KA, Benedict K, Toda M, Jackson BR, Liu X, Kline D.
        Epidemiology. 2022 May 5.
        BACKGROUND: In the United States, the true geographic distribution of the environmental fungus Histoplasma capsulatum remains poorly understood but appears to have changed since it was first characterized. Histoplasmosis is caused by inhalation of the fungus and can range in severity from asymptomatic to life-threatening. Due to limited public health surveillance and underdetection of infections, it is challenging to directly use reported case data to characterize spatial risk. METHODS: Using monthly and yearly county-level public health surveillance data and various environmental and socioeconomic characteristics, we use a spatio-temporal occupancy model to estimate latent, or unobserved, presence of H. capsulatum, accounting for imperfect detection of histoplasmosis cases. RESULTS: We estimate areas with higher probabilities of the presence of H. capsulatum in the East North Central states around the Great Lakes, reflecting a shift of the endemic region to the north from previous estimates. The presence of H. capsulatum was strongly associated with higher soil nitrogen levels. CONCLUSIONS: In this investigation, we were able to mitigate challenges related to reporting and illustrate a shift in the endemic region from historical estimates. This work aims to help inform future surveillance needs, clinical awareness, and testing decisions for histoplasmosis.

    • Epidemiology and Surveillance
      1. BACKGROUND: Equal-tailed confidence intervals that maintain nominal coverage (0.95 or greater probability that a 95% confidence interval covers the true value) are useful in interval-based statistical reliability standards, because they remain conservative. For age-adjusted death rates, while the Fay-Feuer gamma method remains the gold standard, modifications have been proposed to streamline implementation and/or obtain more efficient intervals (shorter intervals that retain nominal coverage). METHODS: This paper evaluates three such modifications for use in interval-based statistical reliability standards, the Anderson-Rosenberg, Tiwari, and Fay-Kim intervals, when data are sparse and sample size-based standards alone are overly coarse. Initial simulations were anchored around small populations (P = 2400 or 1200), the median crude all-cause US mortality rate in 2010-2019 (833.8 per 100,000), and the corresponding age-specific probabilities of death. To allow for greater variation in the age-adjustment weights and age-specific probabilities, a second set of simulations draws those at random, while holding the mean number of deaths at 20 or 10. Finally, county-level mortality data by race/ethnicity from four causes are selected to capture even greater variation: all causes, external causes, congenital malformations, and Alzheimer disease. RESULTS: The three modifications had comparable performance when the number of deaths was large relative to the denominator and the age distribution was as in the standard population. However, for sparse county-level data by race/ethnicity for rarer causes of death, and for which the age distribution differed sharply from the standard population, coverage probability in all but the Fay-Feuer method sometimes fell below 0.95. More efficient intervals than the Fay-Feuer interval were identified under specific circumstances. 
When the coefficient of variation of the age-adjustment weights was below 0.5, the Anderson-Rosenberg and Tiwari intervals appeared to be more efficient, whereas when it was above 0.5, the Fay-Kim interval appeared to be more efficient. CONCLUSIONS: As national and international agencies reassess prevailing data presentation standards to release age-adjusted estimates for smaller areas or population subgroups than previously presented, the Fay-Feuer interval can be used to develop interval-based statistical reliability standards with appropriate thresholds that are generally applicable. For data that meet certain statistical conditions, more efficient intervals could be considered.
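The gold-standard interval discussed above has a closed form: the age-adjusted rate is a weighted sum of age-specific rates, and the Fay-Feuer method matches a gamma distribution to its mean and variance, widening the upper limit by one death at the maximum per-person weight. A minimal sketch with hypothetical counts, populations, and standard-population weights:

```python
# Fay-Feuer gamma confidence interval for an age-adjusted death rate.
# All counts, populations, and weights below are hypothetical.
from scipy.stats import gamma

deaths = [3, 7, 12, 20]                  # age-specific death counts d_i
pops = [50_000, 40_000, 30_000, 20_000]  # age-specific populations n_i
weights = [0.4, 0.3, 0.2, 0.1]           # standard-population shares w_i (sum to 1)

# Age-adjusted rate y = sum(w_i * d_i / n_i) and variance v = sum((w_i/n_i)^2 * d_i).
rate = sum(w * d / n for w, d, n in zip(weights, deaths, pops))
var = sum((w / n) ** 2 * d for w, d, n in zip(weights, deaths, pops))
w_max = max(w / n for w, n in zip(weights, pops))  # largest per-person weight

alpha = 0.05
# Lower limit: gamma quantile with shape y^2/v and scale v/y (mean y).
lower = gamma.ppf(alpha / 2, a=rate**2 / var, scale=var / rate)
# Upper limit: same form after adding one death at the maximum weight.
r_u, v_u = rate + w_max, var + w_max**2
upper = gamma.ppf(1 - alpha / 2, a=r_u**2 / v_u, scale=v_u / r_u)

print(f"rate {rate*1e5:.1f} per 100,000, 95% CI ({lower*1e5:.1f}, {upper*1e5:.1f})")
```

The upper-limit adjustment is what keeps coverage at or above the nominal level when deaths are sparse; the modifications the paper evaluates alter this step to shorten the interval.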

    • Health Equity and Health Disparities
      1. HIV care outcomes among transgender persons with HIV infection in the United States, 2006-2021external icon
        Becasen JS, Morris JD, Denard CL, Mullins MM, Kota KK, Higa DH.
        AIDS. 2022 Feb 1;36(2):305-315.
        OBJECTIVES: HIV prevalence is an estimated 14% among transgender women (TW) and 3% among transgender men (TM). HIV care is vital for viral suppression but is hindered by transphobia and HIV stigma. We assessed HIV care outcomes among transgender persons (TG) with HIV in the United States. DESIGN: Systematic review and meta-analysis of peer-reviewed journal articles. METHODS: We searched multiple electronic databases and Centers for Disease Control and Prevention's HIV Prevention Research Synthesis database for 2006-September 2020. Eligible reports were US-based studies that included TG and reported HIV care outcomes. Random-effects models were used to calculate HIV care outcome rates. The protocol is registered with PROSPERO (CRD42018079564). RESULTS: Few studies reported outcomes for TM; therefore, only TW meta-analysis results are reported. Fifty studies were identified, with low-to-medium risk-of-bias scores. Among TW with HIV, 82% had ever received HIV care; 72% were receiving care, and 83% of those were retained in HIV care. Sixty-two percent were currently virally suppressed. Among those receiving HIV care or antiretroviral therapy (ART), 67% were virally suppressed at last test. Sixty-five percent were linked to HIV care 3 months or less after diagnosis. Seventy-one percent had ever been prescribed ART. Approximately 66% were taking ART, and 66% were ART-adherent. Only 56% were currently adherent the previous year. CONCLUSIONS: HIV care outcomes for TW were not ideal, and research gaps exist for TM. High heterogeneity was observed; therefore, caution should be taken when interpreting the findings. Interventions that integrate transgender-specific health needs are needed to improve outcomes for transgender persons across the HIV care continuum.

      2. BACKGROUND: To explore the prevalence, pharmacologic treatment, and control of hypertension among US non-pregnant women of reproductive age by race/Hispanic origin to identify potential gaps in care. METHODS: We pooled data from the 2011 to March 2020 (pre-pandemic) National Health and Nutrition Examination Survey cycles. Our analytic sample included 4,590 non-pregnant women aged 20-44 years who had at least one examiner-measured blood pressure (BP) value. We estimated prevalences and 95% CIs of hypertension, pharmacologic treatment, and control based on the 2003 Seventh Report of the Joint National Committee on High Blood Pressure (JNC 7) and the 2017 American College of Cardiology/American Heart Association (ACC/AHA) guidelines. We evaluated differences by race/Hispanic origin using Rao-Scott chi-square tests. RESULTS: Applying ACC/AHA guidelines, hypertension prevalence ranged from 14.0% (95% CI: 12.0, 15.9) among Hispanic women to 30.9% (95% CI: 27.8, 34.0) among non-Hispanic Black women. Among women with hypertension, non-Hispanic Black women had the highest eligibility for pharmacological treatment (65.5%, 95% CI: 60.4, 70.5); current use was highest among non-Hispanic White women (61.8%, 95% CI: 53.8, 69.9). BP control ranged from 5.2% (95% CI: 1.1, 9.3) among women of Another or Multiple non-Hispanic races to 18.6% (95% CI: 12.1, 25.0) among Hispanic women. CONCLUSIONS: These findings highlight the importance of monitoring hypertension, pharmacologic treatment, and control by race/Hispanic origin and addressing barriers to equitable hypertension care among women of reproductive age.

      3. BACKGROUND: Group A streptococci (GAS), while usually responsible for mild infections, can sometimes spread into normally sterile sites and cause invasive disease (iGAS). Because both the risk of iGAS disease and occurrence of outbreaks are elevated within certain communities, such as those composed of people who inject drugs (PWID) and people experiencing homelessness (PEH), understanding the transmission dynamics of GAS is of major relevance to public health. METHODS: We employed a cluster detection tool to scan genomes of 7,552 Streptococcus pyogenes isolates acquired through the population-based Active Bacterial Core surveillance (ABCs) during 2015-2018 to identify genomically related clusters representing previously unidentified iGAS outbreaks. RESULTS: We found that 64.6% of invasive isolates were included within clusters of at least 4 temporally related isolates. Calculating a cluster odds ratio (COR) for each emm type revealed that types vary widely in their propensity to form transmission clusters. By incorporating additional epidemiological metadata for each isolate, we found that emm types with a higher proportion of cases occurring among PEH and PWID were associated with higher CORs. Higher CORs were also correlated with emm types that are less geographically dispersed. DISCUSSION: Early identification of clusters with implementation of outbreak control measures could result in a significant reduction in iGAS disease.

      4. Impact of rural hospital closures on hospitalizations and associated outcomes for ambulatory and emergency care sensitive conditionsexternal icon
        Khushalani JS, Holmes M, Song S, Arifkhanova A, Randolph R, Thomas S, Hall DM.
        J Rural Health. 2022 May 5.
        PURPOSE: The purpose of this paper is to examine the impact of rural hospital closures on age-adjusted hospitalization rates for ambulatory care sensitive conditions (ACSCs) and emergency care sensitive conditions (ECSCs) and associated outcomes, such as length of stay and in-hospital mortality, in hospital service areas (HSAs) that utilized the closed hospital. METHODS: We used the State Inpatient Data from the Healthcare Cost and Utilization Project for 9 states from 2010 to 2017 and classified admissions as ACSC or ECSC. We compared age-adjusted admission rates and length of stay (LOS) for ACSCs and ECSCs, and age-adjusted in-hospital mortality rates for ECSCs, among rural ZIP codes in HSAs with a closure versus rural ZIP codes in HSAs without closures. We used propensity score-weighted regression analysis and event study design. FINDINGS: Findings suggest that ACSC admission rates started to increase right before the closure. However, this increase leveled off 2 years after closure. LOS for ACSCs decreased significantly about a year after closure. ECSC admissions showed a significant decrease for a few quarters 1 year before the closure. CONCLUSIONS: Rural hospital closures were associated with an increase in ACSC admissions right before closure and for nearly 2 years post closure as well as a decrease in ECSC admissions before closure. As rural hospitals continue to close, efforts to ensure communities affected by these closures maintain access to primary health care may help eliminate increases in costly preventable hospital admissions for ACSCs while ensuring access to emergency care services.

      5. Geographic differences in sex-specific chronic obstructive pulmonary disease mortality rate trends among adults aged ≥25 years - United States, 1999-2019external icon
        Carlson SA, Wheaton AG, Watson KB, Liu Y, Croft JB, Greenlund KJ.
        MMWR Morb Mortal Wkly Rep. 2022 May 6;71(18):613-618.
        Chronic obstructive pulmonary disease (COPD) accounts for the majority of deaths from chronic lower respiratory diseases, the fourth leading cause of death in the United States in 2019.* COPD mortality rates are decreasing overall. Although rates in men remain higher than those in women, declines have occurred among men but not women (1). To examine the geographic variation in sex-specific trends in age-adjusted COPD mortality rates among adults aged ≥25 years, CDC analyzed 1999-2019 death certificate data, by urban-rural status,(†) U.S. Census Bureau region,(§) and state. Among women, no significant change in overall COPD mortality occurred during this period; however, rates increased significantly in small metropolitan (average annual percent change [AAPC] = 0.6%), micropolitan (1.2%), and noncore (1.9%) areas and in the Midwest (0.6%). Rates decreased significantly in large central (-0.9%) and fringe metropolitan (-0.4%) areas and in the Northeast (-0.5%) and West (-1.2%). Among men, rates decreased significantly overall (-1.3%), in all urban-rural areas (range = -1.9% [large central metropolitan] to -0.4% [noncore]) and in all regions (range = -2.0% [West] to -0.9% [Midwest]). Strategies to improve the prevention, treatment, and management of COPD are needed, especially to address geographic differences and improve the trend in women, to reduce COPD deaths.

      6. Comparison of national vulnerability indices used by the Centers for Disease Control and Prevention for the COVID-19 Responseexternal icon
        Wolkin A, Collier S, House J, Reif D, Motsinger-Reif A, Duca L, Sharpe D.
        Public Health Rep. 2022 May 5:333549221090262.
        OBJECTIVE: Vulnerability indices use quantitative indicators and geospatial data to examine the level of vulnerability to morbidity in a community. The Centers for Disease Control and Prevention (CDC) uses 3 indices for the COVID-19 response: the CDC Social Vulnerability Index (CDC-SVI), the US COVID-19 Community Vulnerability Index (CCVI), and the Pandemic Vulnerability Index (PVI). The objective of this review was to describe these tools and explain the similarities and differences between them. METHODS: We described the 3 indices, outlined the underlying data sources and metrics for each, and discussed their use by CDC for the COVID-19 response. We compared the percentile score for each county for each index by calculating Spearman correlation coefficients (Spearman r). RESULTS: These indices have some, but not all, component metrics in common. The CDC-SVI is a validated metric that estimates social vulnerability, which comprises the underlying population-level characteristics that influence differences in health risk among communities. To address risk specific to the COVID-19 pandemic, the CCVI and PVI build on the CDC-SVI and include additional variables. The 3 indices were highly correlated. Spearman r for comparisons between the CDC-SVI score and the CCVI and between the CCVI and the PVI score was 0.83. Spearman r for the comparison between the CDC-SVI score and PVI score was 0.73. CONCLUSION: The indices can empower local and state public health officials with additional information to focus resources and interventions on disproportionately affected populations to combat the ongoing pandemic and plan for future pandemics.
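The county-level comparison described above reduces to a rank correlation of paired percentile scores. A minimal sketch with toy percentiles (not the actual CDC-SVI/CCVI/PVI data):

```python
# Spearman rank correlation between county percentile scores from two
# vulnerability indices. Scores below are illustrative toy values.
from scipy.stats import spearmanr

svi_percentile = [0.12, 0.45, 0.33, 0.78, 0.91, 0.50, 0.07, 0.66]
ccvi_percentile = [0.20, 0.40, 0.30, 0.85, 0.80, 0.55, 0.10, 0.90]

# Spearman r measures agreement in county rankings, not raw scores,
# which suits indices built from different component metrics.
rho, p_value = spearmanr(svi_percentile, ccvi_percentile)
print(f"Spearman r = {rho:.2f} (p = {p_value:.3f})")
```

Rank-based correlation is the natural choice here because each index aggregates different variables on different scales; only the counties' relative ordering is comparable.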

    • Immunity and Immunization
      1. Why aren't we achieving high vaccination rates for rotavirus vaccine in the United States?external icon
        Kempe A, O'Leary ST, Cortese MM, Crane LA, Cataldi JR, Brtnikova M, Beaty BL, Hurley LP, Gorman C, Tate JE, St Pierre JL, Lindley MC.
        Acad Pediatr. 2022 May-Jun;22(4):542-550.
        BACKGROUND: Rotavirus vaccine (RV) coverage levels for US infants are <80%. METHODS: We surveyed nationally representative networks of pediatricians by internet/mail from April to June, 2019. Multivariable regression assessed factors associated with difficulty administering the first RV dose (RV#1) by the maximum age. RESULTS: The response rate was 68% (303/448). Ninety-nine percent of providers reported strongly recommending RV. The most common barriers to RV delivery overall (definite/somewhat of a barrier) were: parental concerns about vaccine safety overall (27%), parents wanting to defer (25%), parents not thinking RV was necessary (12%), and parent concerns about RV safety (6%). The most commonly reported reasons for nonreceipt of RV#1 by 4 to 5 months (often/always) were parental vaccine refusal (9%), hospitals not giving RV at discharge from nursery (7%), infants past the maximum age when discharged from neonatal intensive care unit/nursery (6%), and infant not seen before maximum age for well care visit (3%) or seen but no vaccine given (4%). Among respondents, 4% strongly agreed and 25% somewhat agreed that they sometimes have difficulty giving RV#1 before the maximum age. A higher percentage of State Child Health Insurance Program/Medicaid-insured children in the practice and agreement that recommendations for the timing of RV doses are too complicated were associated with reported difficulty delivering RV#1 by the maximum age. CONCLUSIONS: US pediatricians identified multiple, actionable issues that may contribute to suboptimal RV immunization rates, including lack of vaccination prior to leaving nurseries after prolonged stays, infants not being seen for well care visits by the maximum age, missed opportunities at visits, and parents refusing/deferring.

      2. Qualitative assessment of caregiver experiences when navigating childhood immunisation in urban communities in Sierra Leoneexternal icon
        Jalloh MF, Patel P, Sutton R, Kulkarni S, Toure M, Wiley K, Sessay T, Lahuerta M.
        BMJ Open. 2022 May 9;12(5):e058203.
        OBJECTIVE: To gain in-depth understanding of the caregiver experience when navigating urban immunisation services for their children. DESIGN: An exploratory qualitative assessment comprising 16 in-depth interviews using an interpretative phenomenology approach. SETTING: Caregivers were purposively recruited from slums (n=8) and other urban communities (n=8) in the capital city of Sierra Leone. PARTICIPANTS: Caregivers of children aged 6-36 months who were fully vaccinated (n=8) or undervaccinated (n=8). RESULTS: Emotional enablers of vaccination were evident in caregivers' sense of parental obligation to their children while also anticipating reciprocal benefits in children's ability to take care of their parents later in life. Practical enablers were found in the diversity of immunisation reminders, information access, information trust, getting fathers more involved, positive experiences with health workers and postvaccination information sharing in the community. Underlying barriers to childhood vaccination were due to practical constraints such as overcrowding and long waiting times at the clinic, feeling disrespected by health workers, expecting to give money to health workers for free services and fear of serious vaccine side effects. To improve vaccination outcomes, caregivers desired more convenient and positive clinic experiences and deeper community engagement. CONCLUSIONS: Health system interventions, community engagement and vaccination outreach need to be tailored for urban settings. Vaccine communication efforts may resonate more strongly with caregivers when vaccination is framed both around parental responsibilities to do the right thing for the child and the future benefits to the parent.

      3. Timing of headache after COVID-19 vaccines and its association with cerebrovascular events: An analysis of 41,700 VAERS reportsexternal icon
        Garcia-Azorin D, Baykan B, Beghi E, Doheim MF, Fernandez-de-Las-Penas C, Gezegen H, Guekht A, Hoo FK, Santacatterina M, Sejvar J, Tamborska AA, Thakur KT, Westenberg E, Winkler AS, Frontera JA.
        Cephalalgia. 2022 May 6:3331024221099231.
        BACKGROUND: Delayed onset of headache seems a specific feature of cerebrovascular events after COVID-19 vaccines. METHODS: All consecutive events reported to the United States Vaccine Adverse Reporting System following COVID-19 vaccines (1 January to 24 June 2021) were assessed. The timing of headache onset post-vaccination in subjects with and without concomitant cerebrovascular events, including cerebral venous thrombosis, ischemic stroke, and intracranial haemorrhage, was analysed. The diagnostic accuracy of the guideline-proposed threshold of three days from vaccination to headache onset in predicting concurrent cerebrovascular events was evaluated. RESULTS: There were 314,610 events following 306,907,697 COVID-19 vaccine doses, including 41,700 headaches, and 178/41,700 (0.4%) cerebrovascular events. The median time between the vaccination and the headache onset was shorter in isolated headache (1 day vs. 4 (in cerebral venous thrombosis), 3 (in ischemic stroke), or 10 (in intracranial hemorrhage) days, all P < 0.001). Delayed onset of headache had an area under the curve of 0.83 (95% CI: 0.75-0.97) for cerebral venous thrombosis, 0.70 (95% CI: 0.63-0.76) for ischemic stroke and 0.76 (95% CI: 0.67-0.84) for intracranial hemorrhage, and >99% negative predictive value. CONCLUSION: Headache following COVID-19 vaccination most commonly occurs within 1 day and is rarely associated with cerebrovascular events. Delayed onset of headache (≥3 days post-vaccination) was an accurate diagnostic biomarker for the occurrence of concomitant cerebrovascular events.

      4. Effectiveness of severe acute respiratory syndrome coronavirus 2 messenger RNA vaccines for preventing coronavirus disease 2019 hospitalizations in the United Statesexternal icon
        Tenforde MW, Patel MM, Ginde AA, Douin DJ, Talbot HK, Casey JD, Mohr NM, Zepeski A, Gaglani M, McNeal T, Ghamande S, Shapiro NI, Gibbs KW, Files DC, Hager DN, Shehu A, Prekker ME, Erickson HL, Exline MC, Gong MN, Mohamed A, Henning DJ, Steingrub JS, Peltan ID, Brown SM, Martin ET, Monto AS, Khan A, Hough CL, Busse LW, Ten Lohuis CC, Duggal A, Wilson JG, Gordon AJ, Qadir N, Chang SY, Mallow C, Gershengorn HB, Babcock HM, Kwon JH, Halasa N, Chappell JD, Lauring AS, Grijalva CG, Rice TW, Jones ID, Stubblefield WB, Baughman A, Womack KN, Lindsell CJ, Hart KW, Zhu Y, Olson SM, Stephenson M, Schrag SJ, Kobayashi M, Verani JR, Self WH.
        Clin Infect Dis. 2022 May 3;74(9):1515-1524.
        BACKGROUND: As severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) vaccination coverage increases in the United States, there is a need to understand the real-world effectiveness against severe coronavirus disease 2019 (COVID-19) and among people at increased risk for poor outcomes. METHODS: In a multicenter case-control analysis of US adults hospitalized March 11-May 5, 2021, we evaluated vaccine effectiveness to prevent COVID-19 hospitalizations by comparing odds of prior vaccination with a messenger RNA (mRNA) vaccine (Pfizer-BioNTech or Moderna) between cases hospitalized with COVID-19 and hospital-based controls who tested negative for SARS-CoV-2. RESULTS: Among 1212 participants, including 593 cases and 619 controls, median age was 58 years, 22.8% were Black, 13.9% were Hispanic, and 21.0% had immunosuppression. SARS-CoV-2 lineage B.1.1.7 (Alpha) was the most common variant (67.9% of viruses with lineage determined). Full vaccination (receipt of 2 vaccine doses ≥14 days before illness onset) had been received by 8.2% of cases and 36.4% of controls. Overall vaccine effectiveness was 87.1% (95% confidence interval [CI], 80.7-91.3). Vaccine effectiveness was similar for Pfizer-BioNTech and Moderna vaccines, and highest in adults aged 18-49 years (97.4%; 95% CI, 79.3-99.7). Among 45 patients with vaccine-breakthrough COVID-19 hospitalizations, 44 (97.8%) were ≥50 years old and 20 (44.4%) had immunosuppression. Vaccine effectiveness was lower among patients with immunosuppression (62.9%; 95% CI, 20.8-82.6) than without immunosuppression (91.3%; 95% CI, 85.6-94.8). CONCLUSION: During March-May 2021, SARS-CoV-2 mRNA vaccines were highly effective for preventing COVID-19 hospitalizations among US adults. SARS-CoV-2 vaccination was beneficial for patients with immunosuppression, but effectiveness was lower in the immunosuppressed population.
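In a test-negative case-control design like the one above, unadjusted vaccine effectiveness is (1 - odds ratio) x 100, where the odds ratio compares the odds of prior full vaccination in cases versus controls. A sketch with illustrative counts roughly consistent with the reported percentages (the study's published estimate is adjusted, so it differs from this raw calculation):

```python
# Unadjusted vaccine effectiveness from a test-negative case-control design.
# Counts are illustrative (approximating 8.2% of 593 cases and 36.4% of
# 619 controls fully vaccinated), not the study's analytic dataset.
cases_vacc, cases_unvacc = 49, 544         # hospitalized COVID-19 cases
controls_vacc, controls_unvacc = 225, 394  # SARS-CoV-2-negative controls

# Odds ratio of vaccination: (vaccinated/unvaccinated odds) in cases vs. controls.
odds_ratio = (cases_vacc / cases_unvacc) / (controls_vacc / controls_unvacc)
ve = (1 - odds_ratio) * 100
print(f"unadjusted VE = {ve:.1f}%")
```

The adjusted estimate in the abstract (87.1%) additionally controls for covariates such as age and calendar time, which a raw odds ratio cannot.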

      5. Expected rates of select adverse events after immunization for coronavirus disease 2019 vaccine safety monitoringexternal icon
        Abara WE, Gee J, Delorey M, Tun Y, Mu Y, Shay DK, Shimabukuro T.
        J Infect Dis. 2022 May 4;225(9):1569-1574.
        Using meta-analytic methods, we calculated expected rates of 20 potential adverse events of special interest (AESI) that would occur after coronavirus disease 2019 (COVID-19) vaccination within 1-, 7-, and 42-day intervals without causal associations. Based on these expected rates, if 10 000 000 persons are vaccinated, (1) 0.5, 3.7, and 22.5 Guillain-Barre syndrome cases, (2) 0.3, 2.4, and 14.3 myopericarditis cases, and (3) 236.5, 1655.5, and 9932.8 all-cause deaths would occur coincidentally within 1, 7, and 42 days postvaccination, respectively. Expected rates of potential AESI can contextualize events associated temporally with immunization, aid in safety signal detection, guide COVID-19 vaccine health communications, and inform COVID-19 vaccine benefit-risk assessments.
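The expected-count arithmetic above is background rate times person-time at risk. A minimal sketch, assuming a hypothetical background rate of about 2 Guillain-Barre syndrome cases per 100,000 person-years (the paper's pooled rates come from its meta-analysis, not from this value):

```python
# Expected coincidental case counts within a post-vaccination risk window,
# assuming no causal association: rate x person-time.
def expected_cases(rate_per_100k_py: float, n_vaccinated: int, window_days: int) -> float:
    """Expected cases among n_vaccinated persons observed for window_days."""
    person_years = n_vaccinated * window_days / 365.25
    return rate_per_100k_py * person_years / 100_000

# Illustrative: ~2 Guillain-Barre cases per 100,000 person-years among
# 10 million vaccinees, for 1-, 7-, and 42-day windows.
for days in (1, 7, 42):
    print(days, round(expected_cases(2.0, 10_000_000, days), 1))
```

With this assumed rate the 1-, 7-, and 42-day expectations come out near the abstract's 0.5, 3.7, and 22.5, illustrating how a fixed background rate scales linearly with the window length.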

      6. Effectiveness of a COVID-19 additional primary or booster vaccine dose in preventing SARS-CoV-2 infection among nursing home residents during widespread circulation of the omicron variant - United States, February 14-March 27, 2022external icon
        Prasad N, Derado G, Nanduri SA, Reses HE, Dubendris H, Wong E, Soe MM, Li Q, Dollard P, Bagchi S, Edwards J, Shang N, Budnitz D, Bell J, Verani JR, Benin A, Link-Gelles R, Jernigan J, Pilishvili T.
        MMWR Morb Mortal Wkly Rep. 2022 May 6;71(18):633-637.
        Nursing home residents have experienced disproportionally high levels of COVID-19-associated morbidity and mortality and were prioritized for early COVID-19 vaccination (1). Following reported declines in vaccine-induced immunity after primary series vaccination, defined as receipt of 2 primary doses of an mRNA vaccine (BNT162b2 [Pfizer-BioNTech] or mRNA-1273 [Moderna]) or 1 primary dose of Ad26.COV2.S (Johnson & Johnson [Janssen]) vaccine (2), CDC recommended that all persons aged ≥12 years receive a COVID-19 booster vaccine dose.* Moderately to severely immunocompromised persons, a group that includes many nursing home residents, are also recommended to receive an additional primary COVID-19 vaccine dose.(†) Data on vaccine effectiveness (VE) of an additional primary or booster dose against infection with SARS-CoV-2 (the virus that causes COVID-19) among nursing home residents are limited, especially against the highly transmissible B.1.1.529 and BA.2 (Omicron) variants. Weekly COVID-19 surveillance and vaccination coverage data among nursing home residents, reported by skilled nursing facilities (SNFs) to CDC's National Healthcare Safety Network (NHSN)(§) during February 14-March 27, 2022, when the Omicron variant accounted for >99% of sequenced isolates, were analyzed to estimate relative VE against infection for any COVID-19 additional primary or booster dose compared with primary series vaccination. After adjusting for calendar week and variability across SNFs, relative VE of a COVID-19 additional primary or booster dose was 46.9% (95% CI = 44.8%-48.9%). These findings indicate that among nursing home residents, COVID-19 additional primary or booster doses provide greater protection against Omicron variant infection than does primary series vaccination alone. All immunocompromised nursing home residents should receive an additional primary dose, and all nursing home residents should receive a booster dose, when eligible, to protect against COVID-19.
Efforts to keep nursing home residents up to date with vaccination should be implemented in conjunction with other COVID-19 prevention strategies, including testing and vaccination of nursing home staff members and visitors.

      7. Use of a modified preexposure prophylaxis vaccination schedule to prevent human rabies: Recommendations of the Advisory Committee on Immunization Practices - United States, 2022external icon
        Rao AK, Briggs D, Moore SM, Whitehill F, Campos-Outcalt D, Morgan RL, Wallace RM, Romero JR, Bahta L, Frey SE, Blanton JD.
        MMWR Morb Mortal Wkly Rep. 2022 May 6;71(18):619-627.
        Human rabies is an acute, progressive encephalomyelitis that is nearly always fatal once symptoms begin. Several measures have been implemented to prevent human rabies in the United States, including vaccination of targeted domesticated and wild animals, avoidance of behaviors that might precipitate an exposure (e.g., provoking high-risk animals), awareness of the types of animal contact that require postexposure prophylaxis (PEP), and use of proper personal protective equipment when handling animals or laboratory specimens. PEP is widely available in the United States and highly effective if administered after an exposure occurs. A small subset of persons has a higher level of risk for being exposed to rabies virus than does the general U.S. population; these persons are recommended to receive preexposure prophylaxis (PrEP), a series of human rabies vaccine doses administered before an exposure occurs, in addition to PEP after an exposure. PrEP does not eliminate the need for PEP; however, it does simplify the rabies PEP schedule (i.e., eliminates the need for rabies immunoglobulin and decreases the number of vaccine doses required for PEP). As rabies epidemiology has evolved and vaccine safety and efficacy have improved, Advisory Committee on Immunization Practices (ACIP) recommendations to prevent human rabies have changed. During September 2019-November 2021, the ACIP Rabies Work Group considered updates to the 2008 ACIP recommendations by evaluating newly published data, reviewing frequently asked questions, and identifying barriers to adherence to previous ACIP rabies vaccination recommendations. Topics were presented and discussed during six ACIP meetings. 
The following modifications to PrEP are summarized in this report: 1) redefined risk categories; 2) fewer vaccine doses in the primary vaccination schedule; 3) flexible options for ensuring long-term protection (immunogenicity); 4) less frequent or no antibody titer checks for some risk groups; 5) a new minimum rabies antibody titer (0.5 international units [IU] per mL); and 6) clinical guidance, including for ensuring effective vaccination of certain special populations.

    • Informatics
      1. WHO competency framework for health authorities and institutions to manage infodemics: its development and featuresexternal icon
        Rubinelli S, Purnat TD, Wihelm E, Traicoff D, Namageyo-Funa A, Thomson A, Wardle C, Lamichhane J, Briand S, Nguyen T.
        Hum Resour Health. 2022 May 7;20(1):35.
        BACKGROUND: In April 2020, the World Health Organization (WHO) Information Network for Epidemics produced an agenda for managing the COVID-19 infodemic. "Infodemic" refers to the overabundance of information-including mis- and disinformation. This agenda identified the need to create a competency framework for infodemic management (IM), which WHO released on 20 September 2021. This paper presents the WHO framework for IM and highlights the investigative steps behind its development. METHODS: The framework was built in three steps. Step 1 comprised preparatory work following the guidelines in the Guide to writing Competency Framework for WHO Academy courses. Step 2 was a qualitative study with participants (N = 25) identified worldwide on the basis of their academic background in fields relevant to IM or their professional experience in IM activities at the institutional level. The interviews were conducted online between December 2020 and January 2021; they were video-recorded and analyzed using thematic analysis. In Step 3, two stakeholder panels were convened to revise the framework. RESULTS: The competency framework contains four primary domains, each comprising main activities, related tasks, and knowledge and skills. It identifies competencies to manage and monitor infodemics; to design, conduct, and evaluate appropriate interventions; and to strengthen health systems. Its main purpose is to assist institutions in reinforcing their IM capacities and implementing effective IM processes and actions according to their individual contexts and resources. CONCLUSION: The competency framework is intended neither as a regulatory document nor as a training curriculum. As a WHO initiative, it serves as a reference tool to be applied according to local priorities and needs within different countries.
This framework can assist institutions in strengthening IM capacity through hiring, staff development, and human resources planning.

      2. Development of a standards-based city-wide health information exchange for public health in response to COVID-19external icon
        Hota B, Casey P, McIntyre AF, Khan J, Rab S, Chopra A, Lateef O, Layden JE.
        JMIR Public Health Surveill. 2022 May 7.
        BACKGROUND: Disease surveillance is a critical function of public health: it provides essential information about disease burden and the clinical and epidemiologic parameters of disease, and it is an important element of effective and timely case investigation and contact tracing. The COVID-19 pandemic demonstrates the essential role of disease surveillance in preserving public health. In theory, the standard data formats and exchange methods provided by EHR meaningful use should enable rapid healthcare data exchange during disruptive healthcare events such as a pandemic. In reality, access to data remains challenging and, even when data are available, they often lack conformity to regulated standards. As a result of the COVID-19 pandemic, we developed a regional data hub to enhance public health surveillance. OBJECTIVE: We sought to use regulated interoperability standards already in production to generate awareness of regional bed capacity and enhance the capture of epidemiological risk factors and clinical variables among patients tested for SARS-CoV-2. We describe the technical and operational components, governance model, and timelines required to implement the public health order that mandated electronic reporting of data from EHRs among hospitals in the Chicago jurisdiction. We also evaluate the data sources, infrastructure requirements, and completeness of data supplied to the platform, and the capacity to link these sources. METHODS: Following a public health order mandating data submission by all acute care hospitals in Chicago, we developed the technical infrastructure to combine multiple data feeds from those EHR systems. We measured the completeness of each feed and the match rate between feeds. RESULTS: A cloud-based environment was created that received electronic laboratory reporting (ELR), consolidated clinical data architecture, and bed capacity data feeds from sites. Data governance was planned from project initiation to aid in consensus and principles for data use.
Data from 88,906 persons in consolidated clinical data architecture (CCDA) records from 14 facilities, and from 408,741 persons in ELR records from 88 facilities, were submitted. Most (90.1%) records could be matched between the CCDA and ELR feeds. Data fields absent from ELR feeds included travel histories, clinical symptoms, and comorbidities. Less than 5% of CCDA data fields were empty. Merging CCDA with ELR data improved the availability of race, ethnicity, comorbidity, and hospitalization data. CONCLUSIONS: We describe the development of a city-wide public health data hub for the surveillance of SARS-CoV-2 infection. We were able to assess the completeness of existing ELR feeds, augment these feeds with CCDA documents, establish secure transfer methods for data exchange, develop cloud-based architecture to enable secure data storage and analytics, and produce dashboards for monitoring capacity and disease burden. We consider this public health and clinical data registry an informative example of the power of common standards across EHRs and a potential template for future use of standards to improve public health surveillance.
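The feed-matching step described above can be sketched in a few lines. This is a hypothetical illustration only: the field names, normalization rule, and counts are assumptions, not the project's actual linkage logic.

```python
# Hypothetical deterministic record linkage between CCDA and ELR feeds.
# Field names and the normalization rule are illustrative assumptions.
def match_key(rec):
    """Normalize identifying fields into a case-insensitive match key."""
    return (rec["last_name"].strip().lower(),
            rec["first_name"].strip().lower(),
            rec["dob"])

def match_rate(ccda_records, elr_records):
    """Fraction of CCDA records with a corresponding ELR record."""
    elr_keys = {match_key(r) for r in elr_records}
    matched = sum(1 for r in ccda_records if match_key(r) in elr_keys)
    return matched / len(ccda_records)

ccda = [{"last_name": "Doe", "first_name": "Jane", "dob": "1980-01-02"},
        {"last_name": "Smith", "first_name": "Al", "dob": "1975-06-30"}]
elr = [{"last_name": "DOE", "first_name": " Jane ", "dob": "1980-01-02"}]
print(match_rate(ccda, elr))  # 0.5
```

Real-world linkage typically layers probabilistic matching on top of deterministic keys to tolerate misspellings; the abstract does not specify which approach yielded the 90.1% match rate.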

      3. Publication and impact of preprints included in the first 100 editions of the CDC COVID-19 science update: Content analysisexternal icon
        Otridge J, Ogden C, Bernstein K, Knuth M, Fishman J, Brooks J.
        JMIR Public Health Surveill. 2022 May 10.
        BACKGROUND: Preprints are publicly available manuscripts posted to various servers that have not been peer-reviewed. Although preprints have existed since 1961, they have gained increased popularity during the COVID-19 pandemic due to the need for immediate, relevant information. OBJECTIVE: The aim of this study is to evaluate the publication rate and impact of preprints included in the CDC COVID-19 Science Update and to assess the performance of the COVID-19 Science Update team in selecting impactful preprints. METHODS: All preprints in the first 100 editions (April 1, 2020 - July 30, 2021) of the Science Update were included in the study. Preprints that were not published were categorized as "unpublished preprints." Preprints that were subsequently published exist in two versions (in a peer-reviewed journal and on the original preprint server), which were analyzed separately and referred to as "peer-reviewed preprints" and "original preprints," respectively. Time-to-publish was the interval from the date on which a preprint was first posted to the date on which it was first available as a peer-reviewed article. Impact was quantified by Altmetric Attention Score and citation count for all available manuscripts on August 6, 2021. Preprints were analyzed by publication status, rate, and time to publication. RESULTS: Among 275 preprints included in the CDC COVID-19 Science Update during the study period, most came from three servers: medRxiv (n=201), bioRxiv (n=41), and SSRN (n=25), with eight coming from other sources. More than half (152 of 275, 55.3%) were eventually published. The median time-to-publish was 2.31 months (IQR 1.38-3.73). When preprints posted in the last 2.31 months were excluded (to account for the time-to-publish), the publication rate was 67.8%. Seventy-six journals published at least one preprint from the CDC COVID-19 Science Update, and 18 journals published at least three.
The median Altmetric Attention Score for unpublished preprints (n=123) was 146 (IQR 22-552), with a median citation count of 2 (IQR 0-8); for original preprints (n=152), these values were 212 (IQR 22-1164) and 14 (IQR 2-40), respectively; for peer-reviewed preprints, they were 265 (IQR 29-1896) and 19 (IQR 3-101), respectively. CONCLUSIONS: Prior studies of COVID-19 preprints found publication rates between 5.4% and 21.1%. Preprints included in the CDC COVID-19 Science Update were published at a higher rate than COVID-19 preprints overall, and those that were ultimately published appeared in journals within months and received higher attention scores than unpublished preprints. These findings indicate that the Science Update's process selected preprints with high fidelity with respect to their likelihood of being published and impactful. Incorporation of high-quality preprints into the CDC COVID-19 Science Update improves this activity's capacity to inform meaningful public health decision making.
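The medians and interquartile ranges reported above are standard summary statistics; a minimal sketch of how they could be computed, using invented months-to-publication values rather than the study's data:

```python
import statistics

def median_iqr(values):
    """Median and (Q1, Q3) interquartile bounds of a sample."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    return statistics.median(values), (q1, q3)

# Hypothetical months-to-publication for a handful of preprints
times = [1.0, 1.5, 2.0, 2.5, 3.0, 4.0]
med, (q1, q3) = median_iqr(times)
print(med)  # 2.25
```

Note that `statistics.quantiles` defaults to the "exclusive" method; published analyses may use a different quantile convention, so exact IQR bounds can differ slightly between tools.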

    • Injury and Violence
      1. Validation and comparison of fall screening tools for predicting future falls among older adultsexternal icon
        Burns ER, Lee R, Hodge SE, Pineau VJ, Welch B, Zhu M.
        Arch Gerontol Geriatr. 2022 Apr 30;101:104713.
        BACKGROUND: Falls are the leading cause of injuries among older adults in the United States (US). Falls are preventable, and clinicians are advised to screen for fall risk yearly. There are many falls screening tools, and not all have been validated for their ability to predict future falls. METHODS: We enrolled 1905 community-dwelling older adults into a 13-month study using a probability-based representative panel of the US population recruited from NORC at the University of Chicago's National Frame. Respondents completed a baseline survey, 11 monthly fall calendars, and a final survey. The baseline survey included six falls screening tools: four instruments (the Stay Independent, the Three Key Questions (3KQ), a modified American Geriatric/British Geriatric tool, and the short Falls Efficacy Scale-International [FES-I]) and two single screening questions ("I have fallen in the past year" and "How many times did you fall in the past 12 months?"). The baseline and final surveys collected demographic and health information, including falls. Sensitivity, specificity, positive and negative likelihood ratios, and corresponding 95% confidence intervals were calculated in SAS using weighted proportions. RESULTS: There were 1563 respondents who completed the final survey (completion rate 82%). Sensitivity estimates ranged from 22.5% for the short FES-I to 68.7% for the 3KQ. Specificity estimates ranged from 57.9% for the 3KQ to 89.4% for the short FES-I. CONCLUSIONS: Falls screening tools have varying sensitivity and specificity for predicting the occurrence of a fall in the following 12 months.
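The screening-tool metrics reported above follow from a standard 2x2 confusion table (the study used weighted proportions; the unweighted sketch below uses made-up counts, not the study's data):

```python
def screen_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)        # true positive rate among future fallers
    spec = tn / (tn + fp)        # true negative rate among non-fallers
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    lr_neg = (1 - sens) / spec   # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg

# Hypothetical counts: 60 fallers flagged, 40 missed; 80 non-fallers cleared, 20 flagged
sens, spec, lr_pos, lr_neg = screen_stats(tp=60, fp=20, fn=40, tn=80)
print(sens, spec)  # 0.6 0.8
```

The sensitivity/specificity trade-off in the abstract (3KQ: high sensitivity, low specificity; short FES-I: the reverse) is exactly what this table formalizes.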

      2. BACKGROUND: Mental health problems, ranging from depression to more severe acts such as self-harm or suicidal behaviours, are a serious problem among adolescents and young adults. Exposure to violence during young people's lives can increase mental health problems among youth. This study examines the relationship between exposure to violence and mental health issues among youth using a nationally representative study in Malawi. METHODS: We analysed data from the nationally representative Violence Against Children Survey from Malawi (2013) to quantify the association between exposures to violence (physical, sexual and emotional) and mental distress, self-harm behaviours, and suicidal ideation and attempts among youth aged 13-24 years. We evaluated the association of exposures to violence in childhood with reported mental health conditions among women and men. We used ordinal logistic regression models with appropriate survey weights to assess exposures to violence and the three outcomes of interest. RESULTS: Youth aged 13-24 years exposed to violence in childhood reported higher levels of adverse mental health effects, including mental distress, self-harm behaviours, and suicidal ideation and attempts. The odds of reporting these outcomes increased as the number of violence types increased. CONCLUSIONS: Understanding the risks based on different combinations of exposures to violence in Malawi can help identify populations at higher risk and optimise violence prevention strategies.

      3. Violence against children: multifaceted approaches to a complex problemexternal icon
        Villaveces A, Viswanathan S.
        Int J Inj Contr Saf Promot. 2022 Mar;29(1):1-2.


      4. In this response to Sarah Ullman's 2020 Journal of Aggression, Maltreatment, and Trauma article, "Rape Resistance: A Critical Piece of All Women's Empowerment and Holistic Rape Prevention," the author highlights the importance of a holistic, comprehensive strategy for sexual violence prevention that involves many approaches across the social ecological model, as outlined in the Centers for Disease Control and Prevention's STOP SV technical package, including effective empowerment-based training approaches. She notes that more work is needed to evaluate and identify evidence-based approaches, including those that address prevention within marginalized groups and grassroots approaches that are already being implemented but have not been evaluated. She closes by stressing that the field has much to gain from this kind of collective, multi-sector effort.

    • Laboratory Sciences
      1. Spatial distribution of Plasmodium falciparum and Plasmodium vivax in Northern Ethiopia by microscopic, rapid diagnostic test, laboratory antibody, and antigen dataexternal icon
        Leonard CM, Assefa A, Sime H, Mohammed H, Kebede A, Solomon H, Drakeley C, Murphy M, Hwang J, Rogier E.
        J Infect Dis. 2022 Mar 2;225(5):881-890.
        BACKGROUND: Determining malaria transmission within regions of low, heterogenous prevalence is difficult. A variety of malaria tests exist and range from identification of diagnostic infection to testing for prior exposure. This study describes the concordance of multiple malaria tests using data from a 2015 household survey conducted in Ethiopia. METHODS: Blood samples (n=2279) from 3 regions in northern Ethiopia were assessed for Plasmodium falciparum and Plasmodium vivax by means of microscopy, rapid diagnostic test, multiplex antigen assay, and multiplex assay for immunoglobulin G (IgG) antibodies. Geospatial analysis was conducted with spatial scan statistics and kernel density estimation to identify malaria hot spots by different test results. RESULTS: The prevalence of malaria infection was low (1.4% by rapid diagnostic test, 1.0% by microscopy, and 1.8% by laboratory antigen assay). For P. falciparum, overlapping spatial clusters for all tests and an additional 5 unique IgG clusters were identified. For P. vivax, clusters identified with bead antigen assay, microscopy, and IgG partially overlapped. CONCLUSIONS: Assessing the spatial distribution of malaria exposure using multiple metrics can improve the understanding of malaria transmission dynamics in a region. The relative abundance of antibody clusters indicates that in areas of low transmission, IgG antibodies are a more useful marker to assess malaria exposure.
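Kernel density estimation of the kind used above for hot-spot mapping can be sketched with a toy 2-D Gaussian kernel (a didactic illustration, not the study's geospatial tooling, which also included spatial scan statistics):

```python
import math

def gaussian_kde(points, x, y, bandwidth):
    """2-D Gaussian kernel density estimate at (x, y) from case locations."""
    h2 = bandwidth ** 2
    total = sum(math.exp(-((x - px) ** 2 + (y - py) ** 2) / (2 * h2))
                for px, py in points)
    return total / (len(points) * 2 * math.pi * h2)

# Density evaluated directly on a single case location, bandwidth 1.0
cases = [(0.0, 0.0)]
print(round(gaussian_kde(cases, 0.0, 0.0, 1.0), 5))  # 0.15915, i.e. 1/(2*pi)
```

Evaluating this estimate over a grid of coordinates and contouring the result is what produces the "hot spot" surfaces compared across the different malaria tests.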

      2. A partially multiplexed HIV drug resistance (HIVDR) assay for monitoring HIVDR mutations of the protease, reverse-transcriptase (PRRT), and integrase (INT)external icon
        DeVos J, McCarthy K, Sewe V, Akinyi G, Junghae M, Opollo V, Nouhin J, Shafer R, Zeh C, Ramos A, Alexander H, Chang J.
        Microbiol Spectr. 2022 May 5:e0177621.
        As dolutegravir (DTG)-containing HIV regimens are scaled up globally, monitoring for HIV drug resistance (HIVDR) will become increasingly important. We designed a partially multiplexed HIVDR assay using Sanger sequencing technology to monitor HIVDR mutations in the protease, reverse-transcriptase (PRRT), and integrase (INT) regions. A total of 213 clinical and analytical plasma and dried blood spot (DBS) samples were used in the evaluation. The assay detected a wide range of known HIV-1 subtypes and circulating recombinant forms (CRFs) of group M from 139 samples. For INT accuracy, the average nucleotide (nt) sequence concordance with the reference sequences was 99.8% for 75 plasma samples and 99.5% for 11 DBS samples. For PRRT accuracy, the average nucleotide sequence concordance was 99.5% for 57 plasma samples and 99.2% for 33 DBS samples. The major PRRT and INT DR mutations of all samples tested were concordant with those of the reference sequences using the Stanford HIV database (db). Amplification sensitivity for samples with viral load (VL) >5000 copies/mL exceeded 95% positivity for plasma and 90% for DBS for PRRT and INT. For samples with VL 1000 to 5000 copies/mL, plasma exceeded 90% positivity and DBS reached 88% for PRRT and INT. Assay precision and reproducibility showed >99% nucleotide sequence concordance in each set of replicates for PRRT and INT. In conclusion, this HIVDR assay met WHO HIVDR assay performance criteria for surveillance, worked for plasma and DBS, used minimal sample volume, was sensitive, and was a potentially cost-effective tool to monitor HIVDR mutations in PRRT and INT. IMPORTANCE This HIVDR genotyping assay works for both plasma and DBS samples, requires low sample input, and is sensitive. This assay has the potential to be a user-friendly and cost-effective HIVDR assay because of its partially multiplexed design.
Application of this genotyping assay will help HIVDR monitoring in high HIV-burden countries using DTG-based HIV drug regimens recommended by the U.S. President's Emergency Plan for AIDS Relief and the WHO.
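Nucleotide sequence concordance of the kind reported above reduces to position-wise identity between aligned sequences. A minimal sketch with toy sequences (not assay data; real pipelines align with dedicated tools before comparing):

```python
def concordance(seq_a, seq_b):
    """Percent identical positions between two aligned, equal-length sequences."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    same = sum(a == b for a, b in zip(seq_a, seq_b))
    return 100.0 * same / len(seq_a)

# One mismatch in 10 positions -> 90% concordance
print(concordance("ACGTACGTAC", "ACGTACGTAT"))  # 90.0
```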

      3. Development of HEK-293 cell lines constitutively expressing flaviviral antigens for use in diagnosticsexternal icon
        Powers JA, Skinner B, Davis BS, Biggerstaff BJ, Robb L, Gordon E, Calvert AE, Chang GJ.
        Microbiol Spectr. 2022 May 9:e0059222.
        Flaviviruses are important human pathogens worldwide. Diagnostic testing for these viruses is difficult because many of the pathogens require specialized biocontainment. To address this issue, we generated 39 virus-like particle (VLP)- and nonstructural protein 1 (NS1)-secreting stable cell lines in HEK-293 cells representing 13 different flaviviruses, including dengue, yellow fever, Japanese encephalitis, West Nile, St. Louis encephalitis, Zika, Rocio, Ilheus, Usutu, and Powassan viruses. Antigen secretion was stable for at least 10 cell passages, as measured by enzyme-linked immunosorbent assays and immunofluorescence assays. Thirty-five cell lines (90%) had stable antigen expression over 10 passages, while three (7%) increased and one (3%) decreased in antigen expression. Antigen secretion in the HEK-293 cell lines was higher than in previously developed COS-1 cell line counterparts. These antigens can replace current antigens derived from live or inactivated virus for safer use in diagnostic testing. IMPORTANCE Serological diagnostic testing for flaviviral infections is hindered by the need for specialized biocontainment for preparation of reagents and assay implementation. The use of previously developed COS-1 cell lines secreting noninfectious recombinant viral antigen is limited due to diminished antigen secretion over time. Here, we describe the generation of 39 flaviviral virus-like particle (VLP)- and nonstructural protein 1 (NS1)-secreting stable cell lines in HEK-293 cells representing 13 medically important flaviviruses. Antigen production was more stable and statistically higher in these newly developed cell lines than in their COS-1 cell line counterparts. The use of these cell lines for production of flaviviral antigens will expand serological diagnostic testing of flaviviruses worldwide.

    • Nutritional Sciences
      1. BACKGROUND: The 2020-2025 Dietary Guidelines for Americans (DGAs) recommend intake of a variety of vegetables, including dark green; red and orange; starchy; and other vegetables. OBJECTIVES: This study aims to describe sociodemographic differences in the contribution of different categories of vegetables, and the forms in which they are consumed (i.e., discrete vegetables, mixed dishes, and other foods such as savory snacks), to total vegetable intake on a given day. DESIGN: This is a cross-sectional, secondary analysis of the 2017-2018 National Health and Nutrition Examination Survey (NHANES). PARTICIPANTS/SETTING: This study included data from 7122 persons aged ≥2 years with reliable day 1 24-hour dietary recalls. MAIN OUTCOME MEASURES: Serving equivalents of vegetables from 20 discrete categories of vegetables, and from mixed dishes and other foods, as a percentage of total vegetables. STATISTICAL ANALYSES: Pairwise differences by age, sex, race and Hispanic origin, and family income were examined using univariate t statistics, and trends by age and income were examined using orthogonal polynomials. RESULTS: Mean vegetable intake was 1.4 cup equivalents. Intake increased with age among youth, was higher among non-Hispanic Asian (NHA) persons than other subgroups, and increased with increasing family income. Overall, discrete vegetables contributed 55.2% of total vegetable intake, and the contribution increased with age in adults and with increasing family income. The top five discrete vegetable contributors were other vegetables and combinations; French fries and other fried white potatoes; lettuce and lettuce salads; mashed potatoes and white potato mixtures; and baked or boiled white potatoes. Non-starchy discrete vegetables contributed more to total vegetables for adults (37.6%) than youth (28.0%), and the contribution increased with increasing family income.
On the other hand, the contribution of mixed dishes and other foods decreased with increasing family income. CONCLUSIONS: Discrete vegetables contributed only 55.2% of total vegetable intake, and the top sources showed little variety (three were potato-based), which may explain the reported low vegetable intake relative to the DGAs. More than one-third of vegetables consumed were non-starchy discrete vegetables, many of which are high in vitamins. Non-starchy discrete vegetable intake was higher in adults than in youth and increased with family income.

    • Occupational Safety and Health
      1. Identifying essential critical infrastructure workers during the COVID-19 pandemic using standardized industry codesexternal icon
        Billock RM, Haring Sweeney M, Steege AL, Michaels R, Luckhaupt SE.
        Am J Ind Med. 2022 May 9.
        BACKGROUND: The Cybersecurity and Infrastructure Security Agency (CISA) produced an advisory list identifying essential critical infrastructure workers (ECIW) during the coronavirus disease 2019 (COVID-19) response. The CISA advisory list is the most common national definition of ECIW but has not been mapped to United States (U.S.) Census industry codes (CICs) to readily identify these worker populations in public health data sources. METHODS: We identified essential critical infrastructure industry designations corresponding to v4.0 of the CISA advisory list for all six-digit North American Industry Classification System (NAICS) codes and cross-walked NAICS codes to CICs. CICs were grouped as essential, non-essential, or mixed essential/non-essential according to component NAICS industries. We also obtained national estimated population sizes for NAICS and Census industries and cross-tabulated Census industry and occupation codes to identify industry-occupation pairs. RESULTS: We produced and made publicly available spreadsheets containing essential industry designations corresponding to v4.0 of the CISA advisory list for NAICS and Census industry titles and codes and population estimates by six-digit NAICS industry, Census industry, and Census industry-occupation pair. The CISA advisory list is highly inclusive and contains most industries and U.S. workers; 71.0% of Census industries comprising 80.6% of workers and 80.7% of NAICS industries comprising 87.1% of workers were designated as essential. CONCLUSIONS: We identified workers in essential critical infrastructure industries as defined by CISA using standardized industry codes. These classifications may support public health interventions and analyses related to the COVID-19 pandemic and future public health crises.
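The essential/non-essential/mixed grouping of Census industry codes from their component NAICS designations can be sketched as a simple set reduction. The codes and flags below are invented for illustration and are not the published crosswalk:

```python
def classify_cics(cic_to_naics, naics_essential):
    """Label each Census industry code from its component NAICS designations."""
    labels = {}
    for cic, naics_codes in cic_to_naics.items():
        flags = {naics_essential[n] for n in naics_codes}
        if flags == {True}:
            labels[cic] = "essential"
        elif flags == {False}:
            labels[cic] = "non-essential"
        else:
            labels[cic] = "mixed"
    return labels

# Invented example codes: two all-essential NAICS, one non-essential, one split
cic_to_naics = {"0170": ["111110", "111120"],
                "8660": ["721110"],
                "1070": ["311111", "445110"]}
naics_essential = {"111110": True, "111120": True,
                   "721110": False,
                   "311111": True, "445110": False}
print(classify_cics(cic_to_naics, naics_essential))
```

This mirrors the paper's rule that a Census industry is "mixed" whenever its component six-digit NAICS industries carry conflicting designations.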

      2. Physiological stress in flat and uphill walking with different backpack loads in professional mountain rescue crewsexternal icon
        Pinedo-Jauregi A, Quinn T, Coca A, Mejuto G, Cámara J.
        Appl Ergon. 2022 Apr 27;103:103784.
        This study aimed to determine the interactive physiological effect of backpack load carriage and slope during walking in professional mountain rescuers. Sixteen mountain rescuers walked on a treadmill at 3.6 km/h for 5 min in each combination of three slopes (1%, 10%, 20%) and five backpack loads (0%, 10%, 20%, 30%, and 40% body weight). Relative heart rate (%HRmax), relative oxygen consumption (%VO(2)max), and rating of perceived exertion (RPE, Borg 1-10 scale) were compared across conditions using two-way ANOVA. Significant differences in %VO(2)max, %HRmax, and RPE across slopes and loads were found where burden increased directly with slope and load (main effect of slope, p < 0.001 for all; main effect of load, p < 0.001 for all). Additionally, significant slope by load interactions were found for all parameters, indicating an additive effect (p < 0.001 for all). Mountain rescuers should consider the physiological interaction between slope and load when determining safe occupational walking capacity.

      3. Influence of preseason antibodies against influenza virus on risk of influenza infection among healthcare personnelexternal icon
        Gorse GJ, Rattigan SM, Kirpich A, Simberkoff MS, Bessesen MT, Gibert C, Nyquist AC, Price CS, Gaydos CA, Radonovich LJ, Perl TM, Rodriguez-Barradas MC, Cummings DA.
        J Infect Dis. 2022 Mar 2;225(5):891-902.
        BACKGROUND: The association of hemagglutination inhibition (HAI) antibodies with protection from influenza among healthcare personnel (HCP) with occupational exposure to influenza viruses has not been well-described. METHODS: The Respiratory Protection Effectiveness Clinical Trial was a cluster-randomized, multisite study that compared medical masks to N95 respirators in preventing viral respiratory infections among HCP in outpatient healthcare settings for 5180 participant-seasons. Serum HAI antibody titers before each influenza season and influenza virus infection confirmed by polymerase chain reaction were studied over 4 study years. RESULTS: In univariate models, the risk of influenza A(H3N2) and B virus infections was associated with HAI titers to each virus, study year, and site. HAI titers were strongly associated with vaccination. Within multivariate models, each log base 2 increase in titer was associated with 15%, 26% and 33%-35% reductions in the hazard of influenza A(H3N2), A(H1N1), and B infections, respectively. Best models included preseason antibody titers and study year, but not other variables. CONCLUSIONS: HAI titers were associated with protection from influenza among HCP with routine exposure to patients with respiratory illness and influenza season contributed to risk. HCP can be reassured about receiving influenza vaccination to stimulate immunity.
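The per-log2 hazard reductions reported above translate into hazard ratios per titer doubling (a 15% reduction corresponds to an HR of about 0.85 per doubling). Under the model's log-linearity assumption, larger titer rises can be sketched as repeated doublings; this is an interpretive illustration, not the study's fitted model:

```python
import math

def hazard_ratio(per_doubling_hr, fold_increase):
    """HR implied by a log2-linear model for a given fold increase in titer."""
    doublings = math.log2(fold_increase)
    return per_doubling_hr ** doublings

# ~15% reduction per doubling (the A(H3N2) estimate); a 4-fold rise is two doublings
print(round(hazard_ratio(0.85, 4), 4))  # 0.7225, i.e. ~28% hazard reduction
```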

      4. OBJECTIVE: Pilot test the effectiveness of an online training program for managing shift work and long work hours. METHOD: Fifty-seven officers from across the United States participated for 12 weeks in a pre-test, training-intervention, post-test design assessing the following measures: sleep, using actigraphy, diaries, and surveys; and knowledge of and feedback about the training, using surveys. RESULTS: After the training, actigraphy data showed significant reductions in sleep latency and awakenings during sleep. Survey data showed reductions in sleepiness, difficulty staying awake during the day, and difficulty getting things done. Frequency of nightmares also decreased. Participants' knowledge about sleep improved, and satisfaction with the training was high. CONCLUSION: Participants were satisfied with the training and showed objective improvements in their sleep and subjective improvements in how they felt when awake. This research will help inform interventions to improve police officer health and wellness.

      5. Banding together: making the case for occupational exposure bandsexternal icon
        Lentz TJ, Edmondson M.
        Synergist. 2022 May;33(5):38-41.
        Occupational hygienists and safety and health practitioners have a long history of using occupational exposure limits (OELs). OELs have played a significant role in characterizing workplace exposures to potentially hazardous chemicals, and they help ensure that appropriate protections are in place and functioning. In addition, OELs provide a means for hazard assessment and risk communication. Yet setting appropriate OELs is resource intensive, requiring dose-response data, exposure data, and technical expertise to accurately characterize hazards for risk management purposes. In a world of work where the number of chemical substances in use vastly exceeds the number with OELs, additional strategies for chemical risk assessment and management are needed. One such strategy gaining stronger acceptance and increasing utility is occupational exposure banding and the use of occupational exposure bands (OEBs).

    • Parasitic Diseases
      1. The immediate effects of a combined mass drug administration and indoor residual spraying campaign to accelerate progress toward malaria elimination in Grande-Anse, Haitiexternal icon
        Druetz T, Stresman G, Ashton RA, Joseph V, van den Hoogen L, Worges M, Hamre KE, Fayette C, Monestime F, Impoinvil D, Rogier E, Chang MA, Lemoine JF, Drakeley C, Eisele TP.
        J Infect Dis. 2022 May 4;225(9):1611-1620.
        BACKGROUND: Haiti is planning targeted interventions to accelerate progress toward malaria elimination. In the most affected department (Grande-Anse), a combined mass drug administration (MDA) and indoor residual spraying (IRS) campaign was launched in October 2018. This study assessed the intervention's effectiveness in reducing Plasmodium falciparum prevalence. METHODS: An ecological quasi-experimental study was designed, using a pretest and posttest with a nonrandomized control group. Surveys were conducted in November 2017 in a panel of easy-access groups (25 schools and 16 clinics) and were repeated 2-6 weeks after the campaign, in November 2018. Single-dose sulfadoxine-pyrimethamine and primaquine were used for MDA, and pirimiphos-methyl was used as the insecticide for IRS. RESULTS: A total of 10 006 participants were recruited. Fifty-two percent of the population in the intervention area reported having received MDA. Prevalence diminished between 2017 and 2018 in both areas, but the reduction was significantly larger in the intervention area (ratio of adjusted risk ratios, 0.32 [95% confidence interval, .104-.998]). CONCLUSIONS: Despite moderate coverage, the campaign was effective in reducing P. falciparum prevalence immediately after 1 round. Targeted MDA plus IRS is useful in preelimination settings to rapidly decrease the parasite reservoir, an encouraging step to accelerate progress toward malaria elimination.
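The "ratio of adjusted risk ratios" contrasts the pre-to-post prevalence change in the intervention area against the same change in the control area. A toy unadjusted calculation follows; the counts are invented, and the study's actual estimate used survey-weighted adjusted models:

```python
def risk_ratio(post_cases, post_n, pre_cases, pre_n):
    """Post/pre prevalence ratio within one area."""
    return (post_cases / post_n) / (pre_cases / pre_n)

# Invented counts: both areas start at 2% prevalence; only one received MDA+IRS
rr_intervention = risk_ratio(post_cases=4, post_n=1000, pre_cases=20, pre_n=1000)
rr_control = risk_ratio(post_cases=12, post_n=1000, pre_cases=20, pre_n=1000)
ratio_of_rr = rr_intervention / rr_control
print(round(ratio_of_rr, 2))  # 0.33: a larger relative decline in the intervention area
```

A ratio below 1, as in the study's 0.32, indicates that prevalence fell more steeply where the campaign ran than where it did not.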

      2. Epidemiological and molecular investigations of a point-source outbreak of Dracunculus medinensis infecting humans and dogs in Chad: a cross-sectional study
        Guagliardo SA, Thiele E, Unterwegner K, Narcisse Nanguita N, Dossou L, Tchindebet Ouakou P, Zirimwabagabo H, Ruiz-Tiben E, Hopkins DR, Roy SL, Cama V, Bishop H, Sapp S, Yerian S, Weiss AJ.
        Lancet Microbe. 2022 Feb;3(2):e105-e112.
        BACKGROUND: Dracunculiasis (also known as Guinea worm disease), caused by the Dracunculus medinensis nematode, is progressing towards eradication, with a reduction from 3·5 million cases in the mid-1980s to only 54 human cases at the end of 2019. Most cases now occur in Chad. On April 19, 2019, a 19-year-old woman presented with D medinensis in an area within the Salamat region of Chad, where the disease had not been previously reported. We aimed to investigate the connection between this case and others detected locally and elsewhere in Chad using a combination of epidemiological and genetic approaches. METHODS: In this cross-sectional field study, we conducted household case searches and informal group interviews in the Bogam, Liwi, and Tarh villages in Chad. All community members including children were eligible for participation in the outbreak investigation. Adult female D medinensis associated with this outbreak were collected for genetic analysis (18 from humans and two from dogs). Four mitochondrial genes and 22 nuclear microsatellite markers were used to assess relatedness of worms associated with the outbreak in comparison with other worms from elsewhere in Chad. FINDINGS: Between April 12 and Sept 6, 2019, we identified 22 human cases and two canine cases of dracunculiasis associated with 15 households. Six (40%) of the 15 affected households had multiple human or canine cases within the household. Most cases of dracunculiasis in people were from three villages in Salamat (21 [95%] of 22 cases), but one case was detected nearly 400 km away in Sarh city (outside the Salamat region). All people with dracunculiasis reported a history of consuming fish and unfiltered water. Worms associated with this outbreak were genetically similar and shared the same maternal lineage. INTERPRETATION: Molecular epidemiological results suggest a point-source outbreak that originated from a single female D medinensis, rather than newly identified sustained local transmission. The failure of the surveillance system to detect the suspected canine infection in 2018 highlights the challenge of canine D medinensis detection, particularly in areas under passive surveillance. Human movement can also contribute to dracunculiasis spread over long distances. FUNDING: The Carter Center.

    • Public Health Leadership and Management
      1. A comparative cross-sectional evaluation of the Field Epidemiology Training Program-Frontline in Ethiopia
        Kebebew T, Takele T, Zeynu N, Muluneh A, Habtetsion M, Kezali J, Demelash S, Assefa Z, Hu AE, Woldetsadik MA, Turcios-Ruiz RM, Cassell CH, Harris J, Sugerman DE.
        BMC Public Health. 2022 May 10;22(1):931.
        BACKGROUND: The Field Epidemiology Training Program (FETP)-Frontline is a three-month in-service training aimed at improving surveillance officers' capacity to collect, analyze, and interpret surveillance data, and to respond to health emergencies. We evaluated the effectiveness of FETP-Frontline, which was introduced in Ethiopia in 2016. METHODS: We conducted a comparative, randomized cross-sectional study to assess surveillance-related knowledge, skills, and performance among trained and untrained officers using a structured questionnaire and observation checklist. We compared the knowledge, skills, and performance scores of trained and untrained officers using Fisher's exact test, the chi-square test, and the t-test, with p < 0.05 for statistical significance. RESULTS: We conducted the study among 74 trained and 76 untrained surveillance officers. About three-quarters of all participants were male, and the average age was 34 (± 8.6) years. Completeness and timeliness of surveillance reports were significantly higher among trained than untrained surveillance officers. The trained officers were more likely to have produced epidemiologic bulletins (55% vs 33%), conducted active surveillance six months before the survey (88% vs 72%), provided surveillance training (88% vs 65%), conducted strengths, weaknesses, opportunities, and threats (SWOT) analysis (55% vs 17%), and utilized Microsoft Excel to manage surveillance data (87% vs 47%). We also observed improvements in trained officers' perceived skills and knowledge, and in the availability and quality of surveillance formats and reports. CONCLUSIONS: FETP-Frontline-trained surveillance officers demonstrated better knowledge, skills, and performance in most surveillance activities compared with untrained officers. FETP-Frontline can address competency gaps among district surveillance officers in Ethiopia and other countries. Scaling up the program to cover unreached districts can help achieve the human resource development core capacity requirement of the International Health Regulations (2005).

    • Substance Use and Abuse
      1. Early changes in puffing intensity when exclusively using open-label very low nicotine content cigarettes
        White CM, Watson C, Bravo Cardenas R, Ngac P, Valentin-Blasini L, Blount BC, Koopmeiners JS, Denlinger-Apte RL, Pacek LR, Benowitz NL, Hatsukami DK, Donny EC, Carpenter MJ, Smith TT.
        Nicotine Tob Res. 2022 May 7.
        INTRODUCTION: In response to a reduction in cigarette nicotine content, people who smoke could attempt to compensate by using more cigarettes or by puffing on individual cigarettes with greater intensity. Such behaviors may be especially likely under conditions where normal nicotine content (NNC) cigarettes are not readily accessible. The current within-subject, residential study investigated whether puffing intensity increased with very low nicotine content (VLNC) cigarette use, relative to NNC cigarette use, when no other nicotine products were available. METHODS: Sixteen adults who smoke daily completed two 4-night hotel stays in Charleston, South Carolina (U.S.) in 2018 during which only NNC or only VLNC cigarettes were accessible. We collected the filters from all smoked cigarettes and measured the deposited solanesol to estimate mouth-level nicotine delivery per cigarette. These estimates were averaged within and across participants, per each 24-hour period. We then compared the ratio of participant-smoked VLNC and NNC cigarette mouth-level nicotine to the ratio yielded by cigarette smoking machines (when puffing intensity is constant). RESULTS: Average mouth-level nicotine estimates from cigarettes smoked during the hotel stays indicate participants puffed VLNC cigarettes with greater intensity than NNC cigarettes in each respective 24-hour period. However, this effect diminished over time (p<0.001). Specifically, VLNC puffing intensity was 40.0% (95% CI: 29.9, 53.0) greater than NNC puffing intensity in the first period, and 16.1% (95% CI: 6.9, 26.0) greater in the fourth period. CONCLUSION: Average puffing intensity per cigarette was elevated with exclusive VLNC cigarette use, but the extent of this effect declined across four days. IMPLICATIONS: In an environment where no other sources of nicotine are available, people who smoke daily may initially attempt to compensate for cigarette nicotine reduction by puffing on individual cigarettes with greater intensity. Ultimately, the compensatory behavior changes required to achieve usual nicotine intake from VLNC cigarettes are drastic and unrealistic. Accordingly, people are unlikely to sustain attempts to compensate for very low cigarette nicotine content.
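The inference behind the reported 40% figure can be sketched as follows: if participants puffed both cigarette types identically, their VLNC/NNC mouth-level-nicotine ratio would match the fixed-puffing machine ratio, so the quotient of the two ratios indexes puffing intensity. The yield values below are hypothetical placeholders chosen to reproduce a 40% elevation, not data from the study.

```python
# Sketch of the puffing-intensity inference. Under identical puffing,
# the participant VLNC/NNC nicotine ratio would equal the machine-smoked
# ratio; a quotient above 1 indicates more intensive puffing of VLNC
# cigarettes. All numbers are hypothetical placeholders.
machine_ratio = 0.040      # VLNC/NNC yield under fixed machine puffing
participant_ratio = 0.056  # VLNC/NNC mouth-level nicotine from filters

intensity_ratio = participant_ratio / machine_ratio  # quotient of ratios
pct_greater = (intensity_ratio - 1) * 100            # % elevation vs NNC
print(f"{pct_greater:.1f}% greater puffing intensity")
```

With these placeholder inputs the quotient is 1.40, i.e. 40% greater intensity, matching the magnitude reported for the first 24-hour period.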

    • Zoonotic and Vectorborne Diseases
      1. Pathogenesis and transmission of human seasonal and swine-origin A(H1) influenza viruses in the ferret model
        Pulit-Penaloza JA, Brock N, Jones J, Belser JA, Jang Y, Sun X, Thor S, Pappas C, Zanders N, Tumpey TM, Todd Davis C, Maines TR.
        Emerg Microbes Infect. 2022 May 10:1-20.
        Influenza A viruses (IAVs) in the swine reservoir constantly evolve, resulting in expanding genetic and antigenic diversity of strains that occasionally cause infections in humans and pose a threat of emerging as strains capable of human-to-human transmission. For these reasons, there is an ongoing need for surveillance and characterization of newly emerging strains to aid pandemic preparedness efforts, particularly for the selection of candidate vaccine viruses and conducting risk assessments. Here, we performed a parallel comparison of the pathogenesis and transmission of genetically and antigenically diverse swine-origin A(H1N1) variant (v) and A(H1N2)v, and human seasonal A(H1N1)pdm09 IAVs using the ferret model. Both groups of viruses were capable of replication in the ferret upper respiratory tract; however, variant viruses were more frequently isolated from the lower respiratory tract as compared to the human-adapted viruses. Regardless of virus origin, observed clinical signs of infection differed greatly between strains, with some viruses causing nasal discharge, sneezing and, in some instances, diarrhea in ferrets. The most striking difference between the viruses was the ability to transmit through the air. Human-adapted viruses were capable of airborne transmission between all ferret pairs. In contrast, only one out of the four tested variant viruses was able to transmit via the air as efficiently as the human-adapted viruses. Overall, this work highlights the need for sustained monitoring of emerging swine IAVs to identify strains of concern such as those that are antigenically different from vaccine strains and that possess adaptations required for efficient respiratory droplet transmission in mammals.

      2. Rabies surveillance in the United States during 2020
        Ma X, Bonaparte S, Toro M, Orciari LA, Gigante CM, Kirby JD, Chipman RB, Fehlner-Gardiner C, Cedillo VG, Aréchiga-Ceballos N, Rao AK, Petersen BW, Wallace RM.
        J Am Vet Med Assoc. 2022 May 5:1-9.
        OBJECTIVE: To provide epidemiological information on animal and human cases of rabies in the US during 2020 and summaries of 2020 rabies surveillance for Canada and Mexico. ANIMALS: All animals submitted for laboratory diagnosis of rabies in the US during 2020. PROCEDURES: State and territorial public health departments and USDA Wildlife Services provided 2020 rabies surveillance data. Data were analyzed temporally and geographically to assess trends in domestic and wildlife rabies cases. RESULTS: During 2020, 54 jurisdictions submitted 87,895 animal samples for rabies testing, of which 85,483 (97.3%) had a conclusive (positive or negative) test result. Of these, 4,479 (5.2%) tested positive for rabies, representing a 4.5% decrease from the 4,690 cases reported in 2019. Texas (n = 580 [12.9%]), Pennsylvania (371 [8.3%]), Virginia (351 [7.8%]), New York (346 [7.7%]), North Carolina (301 [6.7%]), New Jersey (257 [5.7%]), Maryland (256 [5.7%]), and California (248 [5.5%]) together accounted for > 60% of all animal rabies cases reported in 2020. Of the total reported rabid animals, 4,090 (91.3%) involved wildlife, with raccoons (n = 1,403 [31.3%]), bats (1,400 [31.3%]), skunks (846 [18.9%]), and foxes (338 [7.5%]) representing the primary hosts confirmed with rabies. Rabid cats (288 [6.4%]), cattle (43 [1.0%]), and dogs (37 [0.8%]) accounted for 95% of rabies cases involving domestic animals in 2020. No human rabies cases were reported in 2020. CONCLUSIONS AND CLINICAL RELEVANCE: For the first time since 2006, the number of samples submitted for rabies testing in the US was < 90,000; this is thought to be due to factors related to the COVID-19 pandemic, as similar decreases in sample submission were also reported by Canada and Mexico.

      3. Predominance of severe plasma leakage in pediatric patients with severe dengue in Puerto Rico
        Paz-Bailey G, Sánchez-González L, Torres-Velasquez B, Jones E, Perez-Padilla J, Sharp TM, Lorenzi O, Delorey M, Munoz-Jordan J, Tomashek KM, Waterman SH, Alvarado LI, Rivera-Amil V.
        J Infect Dis. 2022 May 1.
        BACKGROUND: We evaluated clinical and laboratory findings among patients with non-severe or severe dengue in Puerto Rico to examine whether clinical manifestations vary by age. METHODS: During 2012-2014, we enrolled patients who arrived at the emergency department with fever or history of fever within 7 days of presentation. Serum samples were tested for dengue virus (DENV) by RT-PCR and IgM ELISA. Severe dengue was defined as severe plasma leakage or shock, severe bleeding, or organ involvement at presentation, during hospitalization, or follow-up. RESULTS: Of 1089 dengue patients identified, 281 (26%) were severe. Compared to those with non-severe dengue, patients with severe dengue were more often aged 10-19 years (55% vs. 40%, p < 0.001) and hospitalized (87% vs. 30%, p < 0.001). Severe plasma leakage or shock was more common among children aged 0-9 (59%) or 10-19 years (86%) than adults (49%) (p < 0.01). Severe bleeding was less common among 10-19 year-olds (24%) compared to 0-9 year-olds (45%) and adults (52%; p < 0.01). CONCLUSIONS: Severe plasma leakage was the most common presentation among children, highlighting important differences with adults. Vaccination against dengue could help prevent severe dengue among children in Puerto Rico.

      4. West Nile Virus and other domestic nationally notifiable arboviral diseases - United States, 2020
        Soto RA, Hughes ML, Staples JE, Lindsey NP.
        MMWR Morb Mortal Wkly Rep. 2022 May 6;71(18):628-632.
        Arthropod-borne viruses (arboviruses) are transmitted to humans primarily through the bite of infected mosquitoes and ticks. West Nile virus (WNV), mainly transmitted by Culex species mosquitoes, is the leading cause of domestically acquired arboviral disease in the United States (1). Other arboviruses cause sporadic cases of disease and occasional outbreaks. This report summarizes passive data for nationally notifiable domestic arboviruses in the United States reported to CDC for 2020. Forty-four states reported 884 cases of domestic arboviral disease, including those caused by West Nile (731), La Crosse (88), Powassan (21), St. Louis encephalitis (16), eastern equine encephalitis (13), Jamestown Canyon (13), and unspecified California serogroup (2) viruses. A total of 559 cases of neuroinvasive WNV disease were reported, for a national incidence of 0.17 cases per 100,000 population. Because arboviral diseases continue to cause serious illness and the locations of outbreaks vary annually, health care providers should consider arboviral infections in patients with aseptic meningitis or encephalitis that occur during periods when ticks and mosquitoes are active, perform recommended diagnostic testing, and promptly report cases to public health authorities to guide prevention strategies and messaging.
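The national incidence figure above is a simple rate per 100,000 population. As a quick sanity check, dividing the 559 neuroinvasive cases by the approximate 2020 US census count (the population value here is an assumption, not taken from the report) reproduces the reported 0.17:

```python
# Incidence per 100,000 = cases / population * 100,000.
# The population figure is the approximate 2020 US census resident
# population (an assumption for this check, not from the MMWR report).
cases = 559                # neuroinvasive WNV disease cases, 2020
population = 331_449_281   # approximate 2020 US census count (assumed)

incidence = cases / population * 100_000
print(round(incidence, 2))  # 0.17
```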

      5. Dengue: A growing problem with new interventions
        Wong JM, Adams LE, Durbin AP, Muñoz-Jordán JL, Poehling KA, Sánchez-González LM, Volkman HR, Paz-Bailey G.
        Pediatrics. 2022 May 11.
        Dengue is the disease caused by 1 of 4 distinct but closely related dengue viruses (DENV-1-4) that are transmitted by Aedes spp. mosquito vectors. It is the most common arboviral disease worldwide, with the greatest burden in tropical and sub-tropical regions. In the absence of effective prevention and control measures, dengue is projected to increase in both disease burden and geographic range. Given its increasing importance as an etiology of fever in the returning traveler or the possibility of local transmission in regions in the United States with competent vectors, as well as the risk for large outbreaks in endemic US territories and associated states, clinicians should understand its clinical presentation and be familiar with appropriate testing, triage, and management of patients with dengue. Control and prevention efforts reached a milestone in June 2021 when the Advisory Committee on Immunization Practices (ACIP) recommended Dengvaxia for routine use in children aged 9 to 16 years living in endemic areas with laboratory confirmation of previous dengue virus infection. Dengvaxia is the first vaccine against dengue to be recommended for use in the United States and one of the first to require laboratory testing of potential recipients to be eligible for vaccination. In this review, we outline dengue pathogenesis, epidemiology, and key clinical features for front-line clinicians evaluating patients presenting with dengue. We also provide a summary of Dengvaxia efficacy, safety, and considerations for use as well as an overview of other potential new tools to control and prevent the growing threat of dengue.


DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention, nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinions, findings, and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.

Page last reviewed: May 24, 2022, 12:00 AM