CDC Science Clips: Volume 9, Issue 3, January 24, 2017
Welcome to Science Clips, CDC’s weekly digest!
Each Tuesday, to enhance awareness of emerging scientific knowledge, selected science clips will be posted here for the public health community. The focus is applied public health research and prevention science that has the capacity to improve health now.
Science Clips is a service of the Stephen B. Thacker CDC Library and Information Center and CDC’s Office of the Chief Science Officer.
Science Clips is in the public domain and may be freely forwarded and reproduced without permission. The original sources and CDC Science Clips should be cited as sources. Articles featured in Science Clips may be in-press or uncorrected proofs.
For assistance in obtaining copies of these articles, contact the library at firstname.lastname@example.org or 404-639-1717. Please note that links below to CDC licensed materials are available only through the Intranet and may go through the SFX server. From the SFX window, just click on the full-text link to reach the full-text.
Top Articles of the Week. Selected weekly by a senior CDC scientist from the standard sections listed below. The names of CDC authors are indicated in bold text.
SETTING: Patients who initiated treatment for multidrug-resistant tuberculosis (MDR-TB) at 15 Programmatic Management of Drug-resistant Tuberculosis (PMDT) health facilities in the Philippines between July and December 2012. OBJECTIVES: To describe patients’ views of current interventions, and suggest changes likely to reduce MDR-TB loss to follow-up. METHODS: In-depth interviews were conducted between April and July 2014 with MDR-TB patients who were undergoing treatment, had finished treatment at the time of the interview (controls), or had been lost to follow-up (LTFU). Responses were thematically analyzed. RESULTS: Interviews were conducted with 182 patients who were undergoing or had completed treatment and 91 LTFU patients. Views and suggestions could be thematically categorized as approaches to facilitate adherence or address barriers to adherence. The top themes were the need for transportation assistance or improvements to the current transportation assistance program, food assistance, and difficulties patients encountered related to their medications. These themes were addressed by 63%, 60%, and 32% of participants, respectively. CONCLUSIONS: A more patient-centered approach is needed to improve MDR-TB treatment adherence. Programs should strive to provide assistance that considers patient preferences, is adequate to cover actual costs or needs, and is delivered in a timely, uninterrupted manner.
Acute aflatoxin exposure can cause death and disease (aflatoxicosis) in humans. Aflatoxicosis fatality rates have been documented to be as high as 40% in Kenya. The inclusion in the diet of calcium silicate 100 (ACCS100), a calcium montmorillonite clay, may reduce aflatoxin bioavailability, thus potentially decreasing the risk of aflatoxicosis. We investigated the efficacy, acceptability and palatability of ACCS100 in a population in Kenya with recurring aflatoxicosis outbreaks. Healthy adult participants were enrolled in this double-blinded, crossover clinical trial in 2014. Following informed consent, participants (n = 50) were randomised to receive either ACCS100 (3 g/day) or placebo (3 g/day) for 7 days. Treatments were switched following a 5-day washout period. Urine samples were collected daily and assessed for urinary aflatoxin M1 (AFM1). Blood samples were collected at the beginning and end of the trial and assessed for aflatoxin B1-lysine adducts from serum albumin (AFB1-lys). AFM1 concentrations in urine were significantly reduced while taking ACCS100 compared with calcium carbonate placebo (beta = 0.49, 95% confidence limit = 0.32-0.75). The 20-day interval included both the placebo and ACCS100 treatments as well as a washout period. There were no statistically significant differences in reported taste, aftertaste, appearance, colour or texture by treatment. There were no statistically significant differences in self-reported adverse events by treatment. Most participants would be willing to take ACCS100 (98%) and give it to their children (98%). ACCS100 was effective, acceptable and palatable. More work is needed to test ACCS100 among vulnerable populations and to determine if it remains effective at the levels of aflatoxin exposure that induce aflatoxicosis.
BACKGROUND: Clusters of bloodstream infections caused by Burkholderia cepacia and Stenotrophomonas maltophilia are uncommon, but have been previously identified in hemodialysis centers that reprocessed dialyzers for reuse on patients. We investigated an outbreak of bloodstream infections caused by B cepacia and S maltophilia among hemodialysis patients in clinics of a dialysis organization. STUDY DESIGN: Outbreak investigation, including matched case-control study. SETTING & PARTICIPANTS: Hemodialysis patients treated in multiple outpatient clinics owned by a dialysis organization. PREDICTORS: Main predictors were dialyzer reuse, dialyzer model, and dialyzer reprocessing practice. OUTCOMES: Case patients had a bloodstream infection caused by B cepacia or S maltophilia; controls were patients without infection dialyzed at the same clinic on the same day as a case; results of environmental cultures and organism typing. RESULTS: 17 cases (9 B cepacia and 8 S maltophilia bloodstream infections) occurred in 5 clinics owned by the same dialysis organization. Case patients were more likely to have received hemodialysis with a dialyzer that had been used more than 6 times (matched OR, 7.03; 95% CI, 1.38-69.76) and to have been dialyzed with a specific reusable dialyzer (Model R) with sealed ends (OR, 22.87; 95% CI, 4.49-infinity). No major lapses during dialyzer reprocessing were identified that could explain the outbreak. B cepacia was isolated from samples collected from a dialyzer header-cleaning machine from a clinic with cases and was indistinguishable from a patient isolate collected from the same clinic, by pulsed-field gel electrophoresis. Gram-negative bacteria were isolated from 2 reused Model R dialyzers that had undergone the facility’s reprocessing procedure. LIMITATIONS: Limited statistical power and overmatching; few patient isolates and dialyzers available for testing. 
CONCLUSIONS: This outbreak was likely caused by contamination during reprocessing of reused dialyzers. Results of this and previous investigations demonstrate that exposing patients to reused dialyzers increases the risk for bloodstream infections. To reduce infection risk, providers should consider implementing single dialyzer use whenever possible.
Tunneling nanotubes (TNTs) represent a novel route of intercellular communication. While previous work has shown that TNTs facilitate the exchange of viral or prion proteins from infected to naive cells, it is not clear whether the viral genome is also transferred via this mechanism and further, whether transfer via this route can result in productive replication of the infectious agents in the recipient cell. Here we present evidence that lung epithelial cells are connected by TNTs, and in spite of the presence of neutralizing antibodies and an antiviral agent, oseltamivir, influenza virus can exploit these networks to transfer viral proteins and genome from the infected to naive cell, resulting in productive viral replication in the naive cells. These observations indicate that influenza viruses can spread through these intercellular networks connecting epithelial cells, evading immune and antiviral defenses, and may explain the occurrence of influenza infections even in influenza-immune individuals, as well as vaccine failures.
New diagnostic platforms often use nasopharyngeal or oropharyngeal (NP/OP) swabs for pathogen detection for patients hospitalized with community-acquired pneumonia (CAP). We applied multipathogen testing to high-quality sputum specimens to determine if more pathogens can be identified relative to NP/OP swabs. Children (<18 years old) and adults hospitalized with CAP were enrolled over 2.5 years through the Etiology of Pneumonia in the Community (EPIC) study. NP/OP specimens with matching high-quality sputum (defined as ≤10 epithelial cells/low-power field [lpf] and ≥25 white blood cells/lpf or a quality score [q-score] definition of 2+) were tested by TaqMan array card (TAC), a multipathogen real-time PCR detection platform. Among 236 patients with matched specimens, a higher proportion of sputum specimens had ≥1 pathogen detected compared with NP/OP specimens in children (93% versus 68%; P < 0.0001) and adults (88% versus 61%; P < 0.0001); for each pathogen targeted, crossing threshold (CT) values were earlier in sputum. Both bacterial (361 versus 294) and viral detections (245 versus 140) were more common in sputum versus NP/OP specimens, respectively, in both children and adults. When available, high-quality sputum may be useful for testing in hospitalized CAP patients.
Introduction: During 2000-2011, 35 injuries (8 fatal) involving winches were reported to the Coast Guard in the Southern shrimp fleet. Injuries involving the main winch drums had a higher risk for fatal outcomes compared to injuries involving the winch cathead (RR = 7.5; 95% CI 1.1-53.7). The objective of this study was to design effective solutions to protect deckhands from entanglement hazards posed by winches found on the vessels in the Southern shrimp fleet. Methods: Based on injury characteristics, site visit observations, and input from vessel owners, NIOSH determined that the design and implementation of effective main-winch guarding was a feasible first-step in mitigating the entanglement hazard. Design considerations for stationary guards favor systems that are simple, affordable, durable, unobtrusive, and will not interfere with normal fishing operations. In addition, an auxiliary-stop method was tested to prevent entanglements in try-net winches. Results: Standardized passive guards were designed for three commonly found main winch models. Initial prototype guards have been sea-tested. The design of six additional guards is underway, for a total of three iterations for each winch model identified. These will incorporate features found to be valued by fishermen, will be more efficient, and will reduce the overall cost of fabrication and maintenance. Sea testing of these iterations continues. The auxiliary-stop circuit control prototype system was designed to prevent entanglements in the try-net winch and is currently being sea tested. Discussion: NIOSH has completed initial designs for stationary-winch guards. Through collaborations with shrimper associations and safety groups, the successfully tested winch guard and auxiliary stop designs will be made available to qualified welders and craftsmen to use. This approach has proven effective in preventing other types of winch injuries.
Practical applications: Injury epidemiologic methods and industry input are an effective way to identify workplace hazards and to design effective safety interventions to control hazards.
BACKGROUND: After cryptosporidiosis was reported in three workers caring for preweaned calves at an academic research laboratory, we sought to identify cases, determine risk factors, and implement control measures. METHODS: A cryptosporidiosis case was defined as diarrhea duration ≥72 hr, abdominal cramps, or vomiting in an animal research laboratory worker during July 14-July 31. A confirmed case had laboratory evidence of Cryptosporidium infection. Staff were interviewed regarding illness, potential exposures, training, and personal protective equipment (PPE) standard operating procedures (SOPs). RESULTS: The cryptosporidiosis attack rate (AR) was 74% (20/27); five were laboratory-confirmed. Median job training was 2 hr including respiratory-fit testing. No SOPs existed for doffing PPE. AR for workers who removed their gloves first was 84% (16/19) compared with 20% (1/5) for workers who removed gloves last (risk ratio = 4.2; P < 0.02). CONCLUSIONS: This outbreak highlights the importance of adequate training, enforced proper PPE procedures, and promoting a culture of safety. Am. J. Ind. Med. 60:208-214, 2017.
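The attack-rate and risk-ratio figures in the abstract above reduce to simple arithmetic; as an illustration only (not part of the original study), the calculation can be sketched from the reported counts:

```python
# Attack rates and risk ratio from the counts reported in the abstract:
# gloves removed first, 16 of 19 workers ill; gloves removed last, 1 of 5 ill.
def attack_rate(cases, at_risk):
    """Proportion of at-risk individuals who became ill."""
    return cases / at_risk

ar_gloves_first = attack_rate(16, 19)
ar_gloves_last = attack_rate(1, 5)
risk_ratio = ar_gloves_first / ar_gloves_last

print(f"AR (gloves first): {ar_gloves_first:.0%}")  # 84%
print(f"AR (gloves last):  {ar_gloves_last:.0%}")   # 20%
print(f"Risk ratio: {risk_ratio:.1f}")              # 4.2
```

The overall AR of 74% is obtained the same way (20 cases among 27 workers).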
The objective of this study was to assess usage patterns of wearable activity monitors among US adults and how user characteristics might influence physical activity estimates from this type of sample. We analyzed data on 3367 respondents to the 2015 HealthStyles survey, an annual consumer mail panel survey conducted on a nationwide sample. Approximately 1 in 8 respondents (12.5%) reported currently using a wearable activity monitor. Current use varied by sex, age, and education level. Use increased with physical activity level from 4.3% for inactive adults to 17.4% for active adults. Overall, 49.9% of all adults met the aerobic physical activity guideline, while this prevalence was 69.5% among current activity monitor users. Our findings suggest that current users of wearable activity monitors are not representative of the overall US population. Estimates of physical activity levels using data from wearable activity monitors users may be an overestimate and therefore data from users alone may have a limited role in physical activity surveillance.
To assist with public health preparedness activities, we estimated the number of expected cases of Zika virus in Puerto Rico and associated healthcare needs. Estimated annual incidence is 3.2-5.1 times the baseline, and long-term care needs are predicted to be 3-5 times greater than in years with no Zika virus.
CONTEXT: As the number of Zika virus (ZIKV) infections continues to grow, so, too, does the spectrum of recognized clinical disease, in both adult and congenital infections. Defining the tissue pathology associated with the various disease manifestations provides insight into pathogenesis and diagnosis, and potentially future prevention and treatment, of ZIKV infections. OBJECTIVE: To summarize the syndromes and pathology associated with ZIKV infection, the implications of pathologic findings in the pathogenesis of ZIKV disease, and the use of pathology specimens for diagnosis of ZIKV infection. DATA SOURCES: The major sources of information for this review were published articles obtained from PubMed and pathologic findings from cases submitted to the Infectious Diseases Pathology Branch at the Centers for Disease Control and Prevention. CONCLUSIONS: Pathologic findings associated with ZIKV infection are characteristic but not specific. In congenital Zika syndrome, tissue pathology is due to direct viral infection of neural structures, whereas in Guillain-Barre syndrome, pathology is likely due to a postviral, aberrant host-directed immune response. Both fetal and placental pathology specimens are useful for ZIKV diagnosis by molecular and immunohistochemical assays; however, the implications of ZIKV detection in placentas from second- and third-trimester normal live births are unclear, as the potential postnatal effects of late gestational exposure remain to be seen.
CDC Authored Publications The names of CDC authors are indicated in bold text. Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
BACKGROUND: Contralateral prophylactic mastectomy (CPM) rates have been increasing in the US, and although high levels of satisfaction with CPM have been reported, few studies have evaluated the long-term effects on body image, comparing CPM with breast-conserving surgery (BCS) and unilateral mastectomy (UM). METHODS: We analyzed responses from a survey of women with both a personal and family history of breast cancer who were enrolled in the Sister Study (n = 1176). Among women who underwent mastectomy, we examined satisfaction with the mastectomy decision, as well as variation in the use of reconstruction and experience of complications. Five survey items, evaluated individually and as a summed total score, were used to compare body image across surgery types (BCS, UM without reconstruction, CPM without reconstruction, UM with reconstruction, and CPM with reconstruction). RESULTS: Participants were, on average, 3.6 years post-diagnosis at the time of survey (standard deviation 1.7). The majority of women (97% of CPM, 89% of UM) were satisfied with their mastectomy decision. Reconstruction was more common after CPM than after UM (70 vs. 47%), as were complications (28 vs. 19%). Body image scores were significantly worse among women who underwent CPM than among women who underwent BCS, with the lowest scores among women who underwent CPM without reconstruction. CONCLUSIONS: In our sample, most women were highly satisfied with their mastectomy decision, including those who elected to undergo CPM. However, body image was lower among those who underwent CPM than among those who underwent BCS. Our findings may inform decisions among women considering various courses of surgical treatment.
BACKGROUND: American Indians and Alaska Natives (AI/AN) have the highest diabetes prevalence among any racial/ethnic group in the United States. Among AI/AN, diabetes accounts for 69% of new cases of end-stage renal disease (ESRD), defined as kidney failure treated with dialysis or transplantation. During 1982-1996, diabetes-related ESRD (ESRD-D) in AI/AN increased substantially and disproportionately compared with other racial/ethnic groups. METHODS: Data from the U.S. Renal Data System, the Indian Health Service (IHS), the National Health Interview Survey, and the U.S. Census were used to calculate ESRD-D incidence rates by race/ethnicity among U.S. adults aged ≥18 years during 1996-2013 and in the diabetic population during 2006-2013. Rates were age-adjusted based on the 2000 U.S. standard population. IHS clinical data from the Diabetes Care and Outcomes Audit were analyzed for diabetes management measures in AI/AN. RESULTS: Among AI/AN adults, age-adjusted ESRD-D rates per 100,000 population decreased 54%, from 57.3 in 1996 to 26.5 in 2013. Although rates for adults in other racial/ethnic groups also decreased during this period, AI/AN had the steepest decline. Among AI/AN with diabetes, ESRD-D incidence decreased during 2006-2013 and, by 2013, was the same as that for whites. Measures related to the assessment and treatment of ESRD-D risk factors also showed more improvement during this period in AI/AN than in the general population. CONCLUSION AND IMPLICATIONS FOR PUBLIC HEALTH PRACTICE: Despite well-documented health and socioeconomic disparities among AI/AN, ESRD-D incidence rates among this population have decreased substantially since 1996. This decline followed implementation by the IHS of public health and population management approaches to diabetes accompanied by improvements in clinical care beginning in the mid-1980s.
These approaches might be a useful model for diabetes management in other health care systems, especially those serving populations at high risk.
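The 54% figure reported above is a straightforward relative decline in the age-adjusted rate; purely as an illustration of that arithmetic:

```python
# Relative decline in age-adjusted ESRD-D incidence among AI/AN adults,
# using the rates per 100,000 population reported in the abstract.
rate_1996 = 57.3
rate_2013 = 26.5

relative_decline = (rate_1996 - rate_2013) / rate_1996
print(f"Relative decline, 1996-2013: {relative_decline:.0%}")  # 54%
```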
SYNOPSIS Over the last two decades, experts have reported a rising number of deaths caused by chronic kidney disease (CKD) along the Pacific coast of Central America, from southern Mexico to Costa Rica. However, this specific disease is not associated with traditional causes of CKD, such as aging, diabetes, or hypertension. Rather, this disease is a chronic interstitial nephritis termed chronic kidney disease of nontraditional etiology (CKDnT). According to the Pan American Health Organization (PAHO) mortality database, there are elevated rates of deaths related to kidney disease in many of these countries, with the highest rates being reported in El Salvador and Nicaragua. This condition has been identified in certain agricultural communities, predominantly among male farmworkers. Since CKD surveillance systems in Central America are under development or nonexistent, experts and governmental bodies have recommended creating standardized case definitions for surveillance purposes to monitor and characterize this epidemiological situation. A group of experts from Central American ministries of health, the U.S. Centers for Disease Control and Prevention (CDC), and PAHO held a workshop in Guatemala to discuss CKDnT epidemiologic case definitions. In this paper, we propose that CKD in general be identified by the standard definition internationally accepted and that a suspect case of CKDnT be defined as a person age < 60 years with CKD, without type 1 diabetes mellitus, hypertensive diseases, and other well-known causes of CKD. A probable case of CKDnT is defined as a suspect case with the same findings confirmed three or more months later.
Elevated serum tumor necrosis factor receptor 1 (TNFR1) and 2 (TNFR2) concentrations are strongly associated with increased risk of end-stage renal disease in type 2 diabetes. However, little is known about the early glomerular structural lesions that develop in patients when these markers are elevated. Here, we examined the relationships between TNFRs and glomerular structure in 83 American Indians with type 2 diabetes. Serum TNFRs and glomerular filtration rate (GFR, iothalamate) were measured during a research exam performed within a median of 0.9 months from a percutaneous kidney biopsy. Associations of TNFRs with glomerular structural variables were quantified by Spearman’s correlations and by multivariable linear regression after adjustment for age, gender, diabetes duration, hemoglobin A1c, body mass index, and mean arterial pressure. The baseline mean age was 46 years, median GFR 130 ml/min, median albumin/creatinine ratio 26 mg/g, median TNFR1 1500 pg/ml, and median TNFR2 3284 pg/ml. After multivariable adjustment, TNFR1 and TNFR2 significantly correlated inversely with the percentage of endothelial cell fenestration and the total filtration surface per glomerulus. There were significant positive correlations with mesangial fractional volume, glomerular basement membrane width, podocyte foot process width, and percentage of global glomerular sclerosis. Thus, TNFRs may be involved in the pathogenesis of early glomerular lesions in diabetic nephropathy.
CONTEXT: Knowing the subtype of vulvar cancer histology is important for estimating human papillomavirus-related cancer etiology. Surveillance of human papillomavirus-related vulvar cancers informs public health decisions related to vaccination against human papillomavirus. OBJECTIVE: To assess the accuracy of registry classifications of vulvar cancer and determine the histologic classification of cases reported as not otherwise specified. DESIGN: Pathology specimens were collected from Florida, Iowa, and Hawaii cancer registries. Registry diagnosis was compared with the pathology report from the medical record and a single expert study histology review of a representative histologic section from each case. RESULTS: The study included 60 invasive vulvar squamous cell carcinoma (SCC) cases, 6 Paget disease cases, 2 basal cell carcinoma cases, and 53 in situ cases. Comparing subtypes of invasive vulvar SCC, the registry agreed with the pathology report classification in 49 of 60 cases (81.7%). Study histology review identified the same SCC subtype as the registry in 9 of 60 cases (15.0%) and the same SCC subtype as the pathology report in 11 of 60 cases (18.3%). Whereas the registry and pathology reports classified 37 and 34 cases, respectively, as being SCC not otherwise specified, the study histology review identified a more specific subtype in all cases. CONCLUSIONS: Subtypes of vulvar cancer were frequently recorded as not otherwise specified in the cancer registry primarily because the pathology report often did not specify the histologic subtype. Vulvar cancer registry data are useful for tracking broad diagnostic categories, but are less reliable for vulvar cancer subtypes.
RATIONALE: The IFN-gamma release assays and tuberculin skin tests are used to support the diagnosis of both latent and active tuberculosis. However, we previously demonstrated that a negative tuberculin test in active tuberculosis is associated with disseminated disease and death. It is unknown whether the same associations exist for IFN-gamma release assays. OBJECTIVES: To determine the association between these tests and site of tuberculosis and death among persons with active tuberculosis. METHODS: We analyzed IFN-gamma release assays and tuberculin test results for all persons with culture-confirmed tuberculosis reported to the U.S. National Tuberculosis Surveillance System from 2010 to 2014. We used logistic regression to calculate the association between these tests and site of disease and death. MEASUREMENTS AND MAIN RESULTS: A total of 24,803 persons with culture-confirmed tuberculosis had either of these test results available for analysis. Persons with a positive tuberculin test had lower odds of disseminated disease (i.e., miliary or combined pulmonary and extrapulmonary disease), but there was no difference in the odds of disseminated disease with a positive IFN-gamma release assay. However, persons who were positive to either of these tests had lower odds of death. An indeterminate IFN-gamma release assay result was associated with greater odds of both disseminated disease and death. CONCLUSIONS: Despite perceived equivalence in clinical practice, IFN-gamma release assays and tuberculin test results have different associations with tuberculosis site, yet similar associations with the risk of death. Furthermore, an indeterminate IFN-gamma release assay result in a person with active tuberculosis is not unimportant; rather, it carries greater odds of disseminated disease and death. Prospective study may improve our understanding of the underlying mechanisms by which these tests are associated with disease localization and death.
BACKGROUND: Multidrug-resistant tuberculosis (MDR-TB) is a serious obstacle to successful TB control. The 2010-2011 Bangladesh Drug Resistance Survey (DRS) showed MDR-TB prevalence to be 7% overall, 1.4% in new and 28.5% in previously treated patients. We aimed to determine the rate of MDR-TB in selected sentinel sites in Bangladesh. METHODS: Fourteen hospitals from the seven divisions in Bangladesh were selected as sentinel surveillance sites. Newly registered TB patients were systematically enrolled from August 2011 to December 2014. Sputum specimens were processed for culture and drug susceptibility testing by the proportion method using Lowenstein-Jensen medium. RESULTS: Specimens from 1906 (84%) of 2270 enrolled patients were analysed. Isolates from 61 (3.2%) were identified as having MDR-TB. The proportion of MDR-TB was 2.3% among new and 13.8% among previously treated TB patients (P < 0.001). The overall proportion of MDR-TB was 3.2% (3.5% in males and 2.3% in females); by age, the MDR-TB rate was highest (5.2%) in those aged ≥65 years. CONCLUSIONS: The high proportion of MDR-TB among new patients found in this sentinel surveillance significantly differs from that reported in the DRS. While the sentinel surveillance sites were not designed to be nationally representative, it is worrying to observe a higher number of MDR-TB cases among new patients.
PURPOSE OF REVIEW: The role of the CD4 cell count in the management of people living with HIV is once again changing, most notably with a shift away from using CD4 assays to decide when to start antiretroviral therapy (ART). This article reflects on the past, current and future role of CD4 cell count testing in HIV programmes, and the implications for clinicians, programme managers and diagnostics manufacturers. RECENT FINDINGS: Following the results of recent randomized trials demonstrating the clinical and public health benefits of starting ART as soon as possible after HIV diagnosis is confirmed, CD4 cell count is no longer recommended as a way to decide when to initiate ART. For patients stable on ART, CD4 cell counts are no longer needed to monitor the response to treatment where HIV viral load testing is available. Nevertheless, the CD4 cell count remains the best measure of a patient’s immune and clinical status and risk of opportunistic infections, and it supports diagnostic decision-making, particularly for patients with advanced HIV disease. SUMMARY: As countries revise guidelines to provide ART to all people living with HIV and continue to scale up access to viral load, strategic choices will need to be made regarding future investments in CD4 cell count and the appropriate use for clinical disease management.
OBJECTIVE: The purpose of the current analysis is to examine subgroup differences in the distribution of opposite-sex sex partners in the United States across an approximate 10-year period to identify patterns that may inform sexually transmitted infection research and prevention. METHODS: Data were drawn from the 2002 and 2011-2013 National Survey of Family Growth, a US probability-based household survey focusing on sexual and reproductive health. The measures included in this analysis were lifetime opposite-sex sex partners and opposite-sex sex partners in the past year. Analyses were conducted separately for men and women. All analyses were conducted in R and RStudio with the “survey” package, focusing on the median and the 80th and 95th percentiles. RESULTS: In 2002, there were significant differences between men and women in median number of lifetime sex partners with men reporting more lifetime partners. However, in the 2011-2013 data, these differences are no longer significant. Still, the findings suggest that the top 20% and top 5% of men are reporting significantly more lifetime partners than their female counterparts. In comparison, partners in the past year remain relatively unchanged for both men and women. CONCLUSIONS: These findings suggest that there were important changes in the distribution of sex partners between 2002 and 2011-2013 that have implications for sexually transmitted infection prevention. Median lifetime partners are no longer different for women and men; however, the distribution of lifetime partners among men is becoming even more skewed.
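The NSFG analysis above was run in R with the survey package; as a language-neutral sketch of the design-weighted quantile idea it relies on (with hypothetical counts and weights, not NSFG data), the computation looks like this:

```python
def weighted_quantile(values, weights, q):
    """Quantile of `values` under survey weights: sort the values,
    accumulate normalized weights, and return the first value whose
    cumulative weight share reaches q."""
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    cum = 0.0
    for v, w in pairs:
        cum += w
        if cum / total >= q:
            return v
    return pairs[-1][0]

# Hypothetical partner counts and survey weights (NOT NSFG data).
partners = [1, 2, 2, 3, 4, 6, 10, 25]
weights = [2, 1, 1, 2, 1, 1, 1, 1]

for q in (0.50, 0.80, 0.95):
    print(f"{q:.0%} quantile: {weighted_quantile(partners, weights, q)}")
# 50% quantile: 3
# 80% quantile: 6
# 95% quantile: 25
```

The skew the authors describe shows up exactly here: the median barely moves while the 95th percentile is dominated by a small number of respondents with very high counts.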
Objectives: To assess Former Persons Under Monitoring (FPUMs’) experiences and perceptions of the United States (US) Ebola Active Monitoring Program. Study design: Retrospective assessment survey of FPUMs. Methods: An electronic survey was distributed to FPUMs monitored in Washington, DC, during October 2014-September 2015 (n = 830). Results: Most FPUMs (>70%) had a favourable perception of the program. Less than 5% avoided future travel or participation in outbreak response activities as a result of their monitoring experience. Approximately 29% experienced a negative consequence in the US due to their travel history. Only 19.2% reported that the Check and Report Ebola (CARE) phone was their only means of communication, and 56.5% never used it for daily reporting. Experiences and perceptions varied significantly by citizenship, with citizens of Ebola-affected countries more likely to have a favourable perception of the program, use CARE phones and express concern about Ebola transmission and development. Conclusions: FPUMs perceived the program as beneficial, and undergoing monitoring was not a barrier to future travel. Negative consequences resulting from travel were frequent. Targeted distribution of resources (e.g. CARE phones) should be considered for future programs.
OBJECTIVES: Risk compensation (RC) could reduce or offset the biological prevention benefits of HIV preexposure prophylaxis (PrEP) among those at substantial risk of infection, including men who have sex with men (MSM). We investigated the potential extent and causal mechanisms through which RC could impact HIV transmission at the population and individual levels. METHODS: Using a stochastic network-based mathematical model of HIV transmission dynamics among MSM in the United States, we simulated RC as a reduction in the probability of condom use after initiating PrEP, with heterogeneity by PrEP adherence profiles and partnership type in which RC occurred. Outcomes were changes to population-level HIV incidence and individual-level acquisition risk. RESULTS: When RC was limited to MSM highly/moderately adherent to PrEP, 100% RC (full replacement of condoms) resulted in a 2% relative decline in incidence compared to no RC, but an 8% relative increase in infection risk for MSM on PrEP. This resulted from confounding by indication: RC increased the number of MSM indicated for PrEP as a function of more condomless anal intercourse among men otherwise not indicated for PrEP; this led to an increased PrEP uptake and subsequent decline in incidence. CONCLUSIONS: RC is unlikely to decrease the prevention impact of PrEP, and in some cases RC may be counterintuitively beneficial at the population level. This depended on PrEP uptake scaling with behavioral indications. Due to the increased acquisition risk associated with RC, however, clinicians should continue to support PrEP as a supplement rather than replacement of condoms.
BACKGROUND: On 6 February 2015, Kampala city authorities alerted the Ugandan Ministry of Health of a “strange disease” that killed one person and sickened dozens. We conducted an epidemiologic investigation to identify the nature of the disease, mode of transmission, and risk factors to inform timely and effective control measures. METHODS: We defined a suspected case as onset of fever (>/=37.5 degrees C) for more than 3 days with abdominal pain, headache, negative malaria test or failed anti-malaria treatment, and at least 2 of the following: diarrhea, nausea or vomiting, constipation, fatigue. A probable case was defined as a suspected case with a positive TUBEX(R) TF test. A confirmed case had blood culture yielding Salmonella Typhi. We conducted a case-control study to compare exposures of 33 suspected case-patients and 78 controls, and tested water and juice samples. RESULTS: From 17 February to 12 June, we identified 10,230 suspected, 1038 probable, and 51 confirmed cases. Approximately 22.58% (7/31) of case-patients and 2.56% (2/78) of controls drank water sold in small plastic bags (OR(M-H) = 8.90; 95% CI: 1.60-49.00); 54.54% (18/33) of case-patients and 19.23% (15/78) of controls consumed locally-made drinks (OR(M-H) = 4.60; 95% CI: 1.90-11.00). All isolates were susceptible to ciprofloxacin and ceftriaxone. Water and juice samples exhibited evidence of fecal contamination. CONCLUSION: Contaminated water and street-vended beverages were likely vehicles of this outbreak. At our recommendation, authorities closed unsafe water sources and supplied safe water to affected areas.
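The odds ratios reported above are Mantel-Haenszel (stratified) estimates. As a rough cross-check, a crude odds ratio with a Woolf-type 95% confidence interval can be computed directly from the reported 2x2 counts for the bagged-water exposure (7/31 case-patients vs. 2/78 controls). This is an illustrative sketch only; the function name is mine, and a crude estimate will differ from the adjusted OR(M-H) = 8.90.

```python
import math

def crude_odds_ratio(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Drank water sold in small plastic bags: 7 of 31 case-patients, 2 of 78 controls
or_, lo, hi = crude_odds_ratio(7, 31 - 7, 2, 78 - 2)
```

The crude estimate comes out near 11.1 (95% CI roughly 2.2 to 57), higher than the stratified 8.90, which is expected when the stratification variables confound the crude comparison.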
BACKGROUND: The increasing reports of Middle East Respiratory Syndrome (MERS) caused by MERS coronavirus (MERS-CoV) from many countries emphasize its importance for international travel. Muslim pilgrimages of Hajj and Umrah involve mass gatherings of international travellers. We set out to assess the presence of influenza and MERS-CoV in Hajj/Umrah returnees with acute respiratory infection. METHODS: Disembarking passengers (n = 8753) from Saudi Arabia (October 2014 to April 2015) were interviewed for the presence of respiratory symptoms; 977 (11%) reported symptoms and 300 (age 26-90, median 60 years; 140 male) consented to participate in the study. After recording clinical and demographic data, twin swabs (nasopharyngeal and throat) were collected from each participant, pooled in viral transport media, and tested by real-time RT-PCR for MERS-CoV and influenza A and B viruses and their subtypes. RESULTS: The participants had symptoms of 1-15 days (median 5 days), cough (90%) and nasal discharge (86%) being the commonest. None of the 300 participants tested positive for MERS-CoV; however, 33 (11%) tested positive for influenza viruses (A/H3N2 = 13, A/H1N1pdm09 = 9 and B/Yamagata = 11). Eighteen patients received oseltamivir. No hospitalizations were needed and all had uneventful recovery. CONCLUSION: Despite a high prevalence of acute respiratory symptoms, MERS-CoV was not detected in returning pilgrims from Hajj and Umrah. However, the detection of influenza emphasises the need for preventive strategies such as vaccination.
BACKGROUND: Yaws is a treponemal infection that was almost eradicated fifty years ago; however, the disease has re-emerged in a number of countries including Ghana. A single dose of intramuscular benzathine penicillin has been the mainstay of treatment for yaws. However, intramuscular injections are painful and pose safety and logistical constraints in the poor areas where yaws occurs. A single-center randomized controlled trial (RCT) carried out in Papua New Guinea in 2012 demonstrated the efficacy of a single dose of oral azithromycin for the treatment of yaws. In this study, we compared the efficacy of a single oral dose of azithromycin as an alternative to intramuscular benzathine penicillin for the treatment of the disease in a different geographic setting. METHODOLOGY: We conducted an open-label, randomized non-inferiority trial in three neighboring yaws-endemic districts in Southern Ghana. Children aged 1-15 years with yaws lesions were assigned to receive either 30 mg/kg of oral azithromycin or 50,000 units/kg of intramuscular benzathine penicillin. The primary end point was clinical cure rate, defined as a complete or partial resolution of lesions 3 weeks after treatment. The secondary endpoint was serological cure, defined as at least a 4-fold decline in baseline RPR titre 6 months after treatment. Non-inferiority of azithromycin treatment was determined if the upper bound of a 2-sided 95% CI was less than 10%. FINDINGS: The mean age of participants was 9.5 years (SD 3.1, range: 1-15 years); 247 (70%) were male. The clinical cure rates were 98.2% (95% CI: 96.2-100) in the azithromycin group and 96.9% (95% CI: 94.1-99.6) in the benzathine penicillin group. The serological cure rates at 6 months were 57.4% (95% CI: 49.9-64.9) in the azithromycin group and 49.1% (95% CI: 41.2-56.9) in the benzathine penicillin group, thus achieving the specified criteria for non-inferiority.
CONCLUSIONS: A single oral dose of azithromycin, at a dosage of 30mg/kg, was non-inferior to a single dose of intramuscular benzathine penicillin for the treatment of early yaws among Ghanaian patients, and provides additional support for the WHO policy for use of oral azithromycin for the eradication of yaws in resource-poor settings. TRIAL REGISTRATION: Pan African Clinical Trials Registry PACTR2013030005181 http://www.pactr.org/.
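The trial's non-inferiority criterion (upper bound of the two-sided 95% CI below the 10% margin) can be sketched for the serological endpoint. The abstract does not report per-arm denominators for the serology analysis, so the group sizes below (~170 per arm) are illustrative assumptions, and a simple Wald interval stands in for whatever interval the trial actually used.

```python
import math

def noninferiority_check(p_ref, n_ref, p_new, n_new, margin=0.10, z=1.96):
    """Upper bound of the two-sided 95% Wald CI for the (reference - new)
    cure-rate difference; non-inferiority is met if it falls below the margin."""
    diff = p_ref - p_new
    se = math.sqrt(p_ref * (1 - p_ref) / n_ref + p_new * (1 - p_new) / n_new)
    upper = diff + z * se
    return upper, upper < margin

# Serological cure at 6 months: benzathine penicillin 49.1%, azithromycin 57.4%.
# n = 170 per arm is an assumed, illustrative denominator (not from the abstract).
upper, non_inferior = noninferiority_check(0.491, 170, 0.574, 170)
```

With these inputs the azithromycin point estimate is actually higher than penicillin's, so the upper bound sits well below the 10-percentage-point margin, consistent with the trial's conclusion.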
Globally, the scale-up of antiretroviral treatment represents one of the greatest successes in the history of global health. By the end of 2014, an estimated 15 million individuals, including more than 800,000 children younger than 15 years, had initiated antiretroviral therapy (ART). While pediatric treatment still lags behind adult successes, many countries, particularly in Southern Africa, have reported high rates of ART initiation, diminishing mortality, and markedly improved health outcomes among children with HIV.
OBJECTIVE: To summarize published evidence on drug interactions between hormonal contraceptives and antiretrovirals. DESIGN: Systematic review of the published literature. METHODS: We searched PUBMED, POPLINE, and EMBASE for peer-reviewed publications of studies (in any language) from inception through September 21, 2015. We included studies of women using hormonal contraceptives and antiretrovirals concurrently. Outcomes of interest were effectiveness of either therapy, toxicity, and pharmacokinetics. We used standard abstraction forms to summarize and assess strengths and weaknesses. RESULTS: Fifty reports from 46 studies were included. Most antiretrovirals, whether used for therapy or prevention, have limited interactions with hormonal contraceptive methods, with the exception of efavirenz. While depot medroxyprogesterone acetate (DMPA) is not affected, limited data on implants and combined oral contraceptive pills suggest that efavirenz-containing combination antiretroviral therapy may compromise contraceptive effectiveness of these methods. However, implants remain very effective despite such drug interactions. Antiretroviral plasma concentrations and effectiveness are generally not affected by hormonal contraceptives. CONCLUSION: Women taking antiretrovirals, for treatment or prevention, should not be denied access to the full range of hormonal contraceptive options, but should be counseled on the expected rates of unplanned pregnancy associated with all contraceptive methods, in order to make their own informed choices.
BACKGROUND: Expedited partner therapy (EPT) for Chlamydia trachomatis (Ct) is the practice of providing Ct-infected patients with medication or a prescription (prescription-EPT) to deliver to their sex partners without first examining those partners. New York City (NYC) providers commonly use prescription-EPT, yet NYC pharmacists report only occasional receipt of EPT prescriptions. This project assessed the frequency of EPT prescriptions filled in 2 NYC neighborhoods. METHODS: The 2 NYC facilities reporting the most frequent use of prescription-EPT were identified from Ct provider case reports and contacted to ascertain their EPT practices. Providers at the first facility (facility 1) prescribed two 1-g doses of azithromycin, including sex partner treatment on the index patient’s electronic prescription. Providers at the second facility (facility 2) gave patients paper prescriptions for sex partners. We reviewed prescriptions filled in 2015 for azithromycin, 1 or 2 g, at pharmacies near these facilities; prescriptions indicating partner therapy were classified as “EPT prescriptions.” RESULTS: Facility 1 providers submitted 112 Ct case reports indicating prescription-EPT, compared with 114 submitted by facility 2 providers. Twelve of 26 identified pharmacies agreed to participate. At 7 pharmacies near facility 1, we found 61 EPT prescriptions from facility 1 and 37 from other facilities. At 5 pharmacies near facility 2, we found only 1 EPT prescription from facility 2 and 3 from other facilities. CONCLUSIONS: Expedited partner therapy prescriptions were received in NYC pharmacies near EPT-prescribing facilities, but with great variability and at a lower frequency than suggested by provider case reports. Provider EPT prescribing practices may affect the likelihood that partners receive medication and should be further evaluated.
Anal intercourse is reported by many heterosexuals, and evidence suggests that its practice may be increasing. We estimated the proportion of the HIV burden attributable to anal sex in 2015 among heterosexual women and men in the United States. The HIV Optimization and Prevention Economics model was developed using parameter inputs from the literature for the sexually active U.S. population aged 13-64. The model uses differential equations to represent the progression of the population between compartments defined by HIV disease status and continuum-of-care stages from 2007 to 2015. For heterosexual women of all ages (who do not inject drugs), almost 28% of infections were associated with anal sex, whereas for women aged 18-34, nearly 40% of HIV infections were associated with anal sex. For heterosexual men, 20% of HIV infections were associated with insertive anal sex with women. Sensitivity analyses showed that varying any of 63 inputs by +/-20% resulted in no more than a 13% change in the projected number of heterosexual infections in 2015, including those attributed to anal sex. Despite uncertainties in model inputs, a substantial portion of the HIV burden among heterosexuals appears to be attributable to anal sex. Providing information about the relative risk of anal sex compared with vaginal sex may help reduce HIV incidence in heterosexuals.
We evaluate the potential for using high-risk human papillomavirus (hr-HPV) testing-based screening for cervical intraepithelial neoplasia (CIN) in routine health services in Thailand; its accuracy in comparison to that of conventional cytology (CC); and the utility of HPV16/18-positive results and liquid-based cytology (LBC) triage of HPV-positive women in the detection of high-grade CIN. Women aged 30–60 years in Ubon Ratchathani province, Thailand were screened with CC and hr-HPV testing (COBAS assay), and those abnormal on either test were referred for colposcopy and/or directed biopsies. The final diagnosis was based on histology, or on colposcopy when histology was not available. Test accuracy parameters were estimated by latent class analysis using Bayesian models. Of the 5004 women enrolled, 20 (0.4%) had abnormal CC and 174 (3.5%) were HPV-positive. Among the 185 women abnormal on CC or HPV-positive, 176 (95.1%) underwent colposcopy, of whom 101 (57.4%) had abnormal colposcopy findings. Ninety-seven women with abnormal and 69 with normal colposcopy had biopsies performed. All 21 women with histological CIN2 or worse had hr-HPV, and none were abnormal on CC. The estimated sensitivity, specificity and positive predictive value were 71.8%, 97.0% and 13.0%, respectively, for HPV testing; 53%, 98.7% and 20.3% for triage of HPV-positive women with LBC; and 70.4%, 98.2% and 16.9% when test positivity was defined as HPV16/18 irrespective of LBC result, or positivity for hr-HPV non-16/18 types with LBC triage. Our study findings indicate poor performance of cytology screening and demonstrate the potential and utility of HPV testing in public health services in Thailand, as well as the utility of primary HPV testing with LBC triage in screening for cervical neoplasia.
Leveraging an existing community health strategy, a contact tracing intervention was piloted under routine programmatic conditions at three facilities in Kisumu County, Kenya. Data collected during a 6-month period were compared to existing programmatic data. After implementation of the intervention, we found enhanced programmatic contact tracing practices, noting an increase in the proportions of index cases traced, symptomatic contacts referred, referred contacts presenting to a facility for tuberculosis screening, and eligible contacts started on isoniazid preventive therapy. As contact tracing is scaled up, health ministries should consider the adoption of similar contact tracing interventions to improve contact tracing practices.
BACKGROUND: An increase in Mycoplasma pneumoniae-associated Stevens-Johnson syndrome (SJS) cases at a Colorado pediatric hospital led to an outbreak investigation. We describe the epidemiologic and molecular characteristics of M. pneumoniae among SJS case-patients and surrounding community members during the outbreak. METHODS: M. pneumoniae PCR-positive respiratory specimens from 5 Colorado hospitals and 4 referral laboratories underwent confirmatory PCR testing; positive specimens then underwent multilocus variable-number tandem-repeat analysis (MLVA) and macrolide resistance testing. Three SJS-M. pneumoniae case-patient households were surveyed using a standardized questionnaire, and nasopharyngeal/oropharyngeal swabs were obtained from all consenting/assenting household contacts. ICD-9 codes were used to identify pneumonia cases among Colorado patients aged 5-21 years from January 2009-March 2014. RESULTS: Three different M. pneumoniae MLVA types were identified among the 5 SJS case-patients with confirmed infection; MLVA type 3-X-6-2 was seen more commonly in SJS case-patients (60%) than in 69 non-SJS community specimens (29%). Macrolide resistance was identified in 7% of community specimens but not among SJS case-patients. Of 15 household contacts, 5 (33%) were M. pneumoniae-positive; all MLVA types were identical to those of the corresponding SJS case-patient, although the specimen from one contact was macrolide-resistant. Overall pneumonia cases, as well as those caused by M. pneumoniae specifically, peaked in October 2013, coinciding with the SJS outbreak. CONCLUSIONS: The outbreak of M. pneumoniae-associated SJS may have been associated with a community outbreak of M. pneumoniae; clinicians should be aware of the M. pneumoniae-SJS relationship. Household transmission of M. pneumoniae was common within the households investigated.
Mobile applications, or apps, have gained widespread use with the advent of modern smartphone technologies. Previous research has been conducted in the use of mobile devices for learning. However, there is decidedly less research into the use of mobile apps for health learning (eg, patient self-monitoring, medical student learning). This deficiency in research on using apps in a learning context is especially severe in the disaster health field. The objectives of this article were to provide an overview of the current state of disaster health apps being used for learning, to situate the use of apps in a health learning context, and to adapt a learning framework for the use of mobile apps in the disaster health field. A systematic literature review was conducted by using the PRISMA checklist, and peer-reviewed articles found through the PubMed and CINAHL databases were examined. This resulted in 107 nonduplicative articles, which underwent a 3-phase review, culminating in a final selection of 17 articles. While several learning models were identified, none were sufficient as an app learning framework for the field. Therefore, we propose a learning framework to inform the use of mobile apps in disaster health learning.
BACKGROUND: The disasters at Seveso, Three Mile Island, Bhopal, Chernobyl, the World Trade Center (WTC) and Fukushima had historic health and economic sequelae for large populations of workers, responders and community members. METHODS: Comparative data from these events were collected to derive indications for future preparedness. Information from the primary sources and a literature review addressed: i) exposure assessment; ii) exposed populations; iii) health surveillance; iv) follow-up and research outputs; v) observed physical and mental health effects; vi) treatment and benefits; and vii) outreach activities. RESULTS: Exposure assessment was conducted in Seveso, Chernobyl and Fukushima, although none benefited from a timely or systematic strategy yielding immediate and sequential measurements after the disaster. The number of exposed subjects was generally underestimated. Health surveillance, treatment and follow-up research were implemented in Seveso, Chernobyl, Fukushima, and at the WTC, mostly focusing on workers and responders, and to a lesser extent on residents. Exposure-related physical and mental health consequences were identified, indicating the need for long-term health care of the affected populations. Fukushima has generated the largest scientific output so far, followed by the WTC Health Program (WTCHP) and Chernobyl. Benefits programs and active outreach figured prominently only in the WTCHP. The analysis of these programs yielded the following lessons: 1) Know who was there; 2) Have public health input to the disaster response; 3) Collect health and needs data rapidly; 4) Take care of the affected; 5) Maintain emergency preparedness; 6) Use data-driven needs assessment and advocacy. CONCLUSIONS: Given the long-lasting health consequences of natural and man-made disasters, health surveillance and treatment programs are critical for management of health conditions, and emergency preparedness plans are needed to prevent or minimize the impact of future threats.
OBJECTIVE: We evaluated the usefulness and accuracy of media-reported data for active disaster-related mortality surveillance. METHODS: From October 29 through November 5, 2012, epidemiologists from the Centers for Disease Control and Prevention (CDC) tracked online media reports for Hurricane Sandy-related deaths by use of a keyword search. To evaluate the media-reported data, vital statistics records of Sandy-related deaths were compared to corresponding media-reported deaths and assessed for percentage match. Sensitivity, positive predictive value (PPV), and timeliness of the media reports for detecting Sandy-related deaths were calculated. RESULTS: Ninety-nine media-reported deaths were identified and compared with the 90 vital statistics death records sent to the CDC by New York City (NYC) and the 5 states that agreed to participate in this study. Seventy-five (76%) of the media reports matched with vital statistics records. Only NYC was able to actively track Sandy-related deaths during the event. Moderate sensitivity (83%) and PPV (83%) were calculated for the matching media-reported deaths for NYC. CONCLUSIONS: During Hurricane Sandy, the media-reported information was moderately sensitive, and percentage match with vital statistics records was also moderate. The results indicate that online media-reported deaths can be useful as a supplemental source of information for situational awareness and immediate public health decision-making during the initial response stage of a disaster.
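Sensitivity and PPV as used above follow the standard definitions, with vital statistics records as the reference standard and media reports as the "test." The sketch below applies those definitions to the nationwide counts from the abstract (75 matched deaths, 90 vital-statistics records, 99 media reports) for illustration; the NYC-specific counts behind the reported 83%/83% figures are not given, so these results will differ from them.

```python
def sensitivity(tp, fn):
    # Share of reference-standard (vital statistics) deaths detected by media reports.
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    # Share of media-reported deaths confirmed in vital statistics records.
    return tp / (tp + fp)

# Nationwide counts from the abstract: 75 matches, 90 vital records, 99 media reports.
sens = sensitivity(75, 90 - 75)
ppv = positive_predictive_value(75, 99 - 75)
```

On these counts, sensitivity is about 0.83 and PPV about 0.76, in the same "moderate" range the abstract describes.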
Bats harbor a large diversity of coronaviruses (CoVs), several of which are related to zoonotic pathogens that cause severe disease in humans. Our screening of bat samples collected in Kenya during 2007-2010 not only detected RNA from several novel CoVs but, more significantly, identified sequences that were closely related to human CoVs NL63 and 229E, suggesting that these two human viruses originate from bats. We also demonstrated that human CoV NL63 is a recombinant between NL63-like viruses circulating in Triaenops bats and 229E-like viruses circulating in Hipposideros bats, with the break-point located near the 5′ and 3′ ends of the spike (S) protein gene. In addition, two further inter-species recombination events involving the S gene were identified, suggesting that this region may represent a recombination “hotspot” in CoV genomes. Finally, using a combination of phylogenetic and distance-based approaches, we showed that the genetic diversity of bat CoVs is structured primarily by host species and secondarily by geographic distance. IMPORTANCE: Understanding the driving forces of cross-species virus transmission is central to understanding the nature of disease emergence. Previous studies have demonstrated that bats are the ultimate reservoir hosts for a number of coronaviruses (CoVs), including ancestors of SARS-CoV, MERS-CoV, and HCoV-229E. However, the evolutionary pathways of bat CoVs remain elusive. We provide evidence for natural recombination between distantly related African bat coronaviruses associated with Triaenops afer and Hipposideros sp. bats that resulted in an NL63-like virus, an ancestor of the human pathogen HCoV-NL63. These results suggest that inter-species recombination may play an important role in CoV evolution and the emergence of novel CoVs with zoonotic potential.
Essentially all women are exposed to polycyclic aromatic hydrocarbons (PAHs), formed during incomplete combustion of organic materials, including fossil fuels, wood, foods, and tobacco. PAHs are ovarian toxicants in rodents, and cigarette smoking is associated with reproductive abnormalities in women. Biomonitoring of hydroxylated PAH (OH-PAH) metabolites in urine provides an integrated measure of exposure to PAHs via multiple routes and has been used to characterize exposure to PAHs in humans. We hypothesized that concentrations of OH-PAHs in urine are associated with reproductive function in women. We recruited women 18-44 years old, living in Orange County, California, to conduct daily measurement of urinary luteinizing hormone (LH) and estrone 3-glucuronide (E13G) using a microelectronic fertility monitor for multiple menstrual cycles; these data were used to calculate endocrine endpoints. Participants also collected urine samples on cycle day 10 for measurement of nine OH-PAHs. Models were constructed for eight endpoints using a Bayesian mixed modeling approach with subject-specific random effects allowing each participant to act as a baseline for her set of measurements. We observed associations between individual OH-PAH concentrations and follicular phase length, follicular phase LH and E13G concentrations, preovulatory LH surge concentrations, and periovulatory E13G slope and concentration. We have demonstrated the feasibility of using urinary reproductive hormone data obtained via fertility monitors to calculate endocrine endpoints for epidemiological studies of ovarian function during multiple menstrual cycles. The results show that environmental exposure to PAHs is associated with changes in endocrine markers of ovarian function in women in a PAH-specific manner.
Oxidative stress has been linked to many obesity-related conditions among children including cardiovascular disease, diabetes mellitus and hypertension. Exposure to environmental chemicals such as phthalates, ubiquitously found in humans, may also generate reactive oxygen species and subsequent oxidative stress. We examined longitudinal changes of 8-isoprostane urinary concentrations, a validated biomarker of oxidative stress, and associations with maternal prenatal urinary concentrations of phthalate metabolites for 258 children at 5, 9 and 14 years of age participating in a birth cohort residing in an agricultural area in California. Phthalates are endocrine disruptors, and in utero exposure has also been linked to altered lipid metabolism, as well as adverse birth and neurodevelopmental outcomes. We found that median creatinine-corrected 8-isoprostane concentrations remained constant across all age groups and did not differ by sex. Total cholesterol, systolic and diastolic blood pressure were positively associated with 8-isoprostane in 14-year-old children. No associations were observed between 8-isoprostane and body mass index (BMI), BMI Z-score or waist circumference at any age. Concentrations of three metabolites of high molecular weight phthalates measured at 13 weeks of gestation (monobenzyl, monocarboxyoctyl and monocarboxynonyl phthalates) were negatively associated with 8-isoprostane concentrations among 9-year olds. However, at 14 years of age, isoprostane concentrations were positively associated with two other metabolites (mono(2-ethylhexyl) and mono(2-ethyl-5-carboxypentyl) phthalates) measured in early pregnancy. Longitudinal data on 8-isoprostane in this pediatric population with a high prevalence of obesity provides new insight on certain potential cardiometabolic risks of prenatal exposure to phthalates.
PROBLEM/CONDITION: Higher rates of death in nonmetropolitan areas (often referred to as rural areas) compared with metropolitan areas have been described but not systematically assessed. PERIOD COVERED: 1999-2014. DESCRIPTION OF SYSTEM: Mortality data for U.S. residents from the National Vital Statistics System were used to calculate age-adjusted death rates and potentially excess deaths for nonmetropolitan and metropolitan areas for the five leading causes of death. Age-adjusted death rates included all ages and were adjusted to the 2000 U.S. standard population by the direct method. Potentially excess deaths are defined as deaths among persons aged <80 years that exceed the numbers that would be expected if the death rates of states with the lowest rates (i.e., benchmark states) occurred across all states. (Benchmark states were the three states with the lowest rates for each cause during 2008-2010.) Potentially excess deaths were calculated separately for nonmetropolitan and metropolitan areas. Data are presented for the United States and the 10 U.S. Department of Health and Human Services public health regions. RESULTS: Across the United States, nonmetropolitan areas experienced higher age-adjusted death rates than metropolitan areas. The percentages of potentially excess deaths among persons aged <80 years from the five leading causes were higher in nonmetropolitan areas than in metropolitan areas. For example, approximately half of deaths from unintentional injury and chronic lower respiratory disease in nonmetropolitan areas were potentially excess deaths, compared with 39.2% and 30.9%, respectively, in metropolitan areas. Potentially excess deaths also differed among and within public health regions; within regions, nonmetropolitan areas tended to have higher percentages of potentially excess deaths than metropolitan areas.
INTERPRETATION: Compared with metropolitan areas, nonmetropolitan areas have higher age-adjusted death rates and greater percentages of potentially excess deaths from the five leading causes of death, nationally and across public health regions. PUBLIC HEALTH ACTION: Routine tracking of potentially excess deaths in nonmetropolitan areas might help public health departments identify emerging health problems, monitor known problems, and focus interventions to reduce preventable deaths in these areas.
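The two measures described above reduce to simple arithmetic: the direct method is a weighted sum of age-specific rates, and potentially excess deaths are observed deaths minus the number expected at benchmark-state rates. The sketch below is schematic: the age-group rates, weights, and benchmark rate are invented for illustration and are not the actual 2000 U.S. standard population distribution or the report's benchmark-state rates.

```python
def direct_age_adjusted_rate(age_specific_rates, standard_weights):
    """Direct method: weight each age-specific death rate (per 100,000) by the
    standard population's share of that age group, then sum."""
    assert abs(sum(standard_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(age_specific_rates, standard_weights))

def potentially_excess_deaths(observed_deaths, benchmark_rate, population):
    """Deaths beyond the number expected if benchmark-state rates
    (per 100,000) applied to this population; floored at zero."""
    expected = benchmark_rate * population / 100_000
    return max(0.0, observed_deaths - expected)

# Illustrative inputs only: three coarse age groups with invented rates/weights,
# and an invented benchmark rate for a population of 100,000.
adjusted = direct_age_adjusted_rate([50.0, 200.0, 1500.0], [0.40, 0.45, 0.15])
excess = potentially_excess_deaths(observed_deaths=500,
                                   benchmark_rate=300.0, population=100_000)
```

Because the same standard weights are applied to every area, differences in adjusted rates reflect differences in age-specific mortality rather than differences in age structure, which is what makes the nonmetropolitan-metropolitan comparison above meaningful.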
Whole apples have not been previously implicated in outbreaks of foodborne bacterial illness. We investigated a nationwide listeriosis outbreak associated with caramel apples. We defined an outbreak-associated case as an infection, occurring from 1 October 2014 to 1 February 2015, with one or both of two outbreak strains of Listeria monocytogenes highly related by whole-genome multilocus sequence typing (wgMLST). Single-interviewer open-ended interviews identified the source. Outbreak-associated cases were compared with non-outbreak-associated cases, and traceback and environmental investigations were performed. We identified 35 outbreak-associated cases in 12 states; 34 (97%) were hospitalized and seven (20%) died. Outbreak-associated ill persons were more likely to have eaten commercially produced, prepackaged caramel apples (odds ratio 326.7, 95% confidence interval 32.2-3314). Environmental samples from the grower’s packing facility and distribution-chain whole apples yielded isolates highly related to outbreak isolates by wgMLST. This outbreak highlights the importance of minimizing produce contamination with L. monocytogenes. Investigators should perform single-interviewer open-ended interviews when a food is not readily identified.
BACKGROUND: The switch from photosynthetic or predatory to parasitic life strategies by apicomplexans is accompanied with a reductive evolution of genomes and losses of metabolic capabilities. Cryptosporidium is an extreme example of reductive evolution among apicomplexans, with losses of both the mitosome genome and many metabolic pathways. Previous observations on reductive evolution were largely based on comparative studies of various groups of apicomplexans. In this study, we sequenced two divergent Cryptosporidium species and conducted a comparative genomic analysis to infer the reductive evolution of metabolic pathways and differential evolution of invasion-related proteins within the Cryptosporidium lineage. RESULTS: In energy metabolism, Cryptosporidium species differ from each other mostly in mitosome metabolic pathways. Compared with C. parvum and C. hominis, C. andersoni possesses more aerobic metabolism and a conventional electron transport chain, whereas C. ubiquitum has further reductions in ubiquinone and polyisoprenoid biosynthesis and has lost both the conventional and alternative electron transport systems. For invasion-associated proteins, similar to C. hominis, a reduction in the number of genes encoding secreted MEDLE and insulinase-like proteins in the subtelomeric regions of chromosomes 5 and 6 was also observed in C. ubiquitum and C. andersoni, whereas mucin-type glycoproteins are highly divergent between the gastric C. andersoni and intestinal Cryptosporidium species. CONCLUSIONS: Results of the study suggest that rapidly evolving mitosome metabolism and secreted invasion-related proteins could be involved in tissue tropism and host specificity in Cryptosporidium spp. The finding of progressive reduction in mitosome metabolism among Cryptosporidium species improves our knowledge of organelle evolution within apicomplexans.
INTRODUCTION: The PhenX Toolkit, an online resource of well-established measures of phenotypes and exposures, now has 16 new measures recommended for assessing rare genetic conditions. MATERIALS AND METHODS: These measures and their protocols were selected by a working group of domain experts with input from the scientific community. RESULTS: The measures, which cover life stages from birth through adulthood, include clinical scales, characterization of rare genetic conditions, bioassays, and questionnaires. Most are broadly applicable to rare genetic conditions (e.g., family history, growth charts, bone age, and body proportions). Some protocols (e.g., sweat chloride test) target specific conditions. DISCUSSION: The rare genetic condition measures complement the existing measures in the PhenX Toolkit that cover anthropometrics, demographics, mental health, and reproductive history. The existing measures are directed at research pertaining to common and complex diseases. PhenX measures are publicly available and are recommended to help standardize assessments across a range of biomedical study designs. To facilitate incorporation of measures into human subjects’ research, the Toolkit offers data collection worksheets and compatible data dictionaries. CONCLUSION: Widespread use of standard PhenX measures in clinical, translational, and epidemiological research will enable more uniform cross-study comparisons and increase statistical power with the potential for enhancing scientific discovery.
PURPOSE: The objective of this study was to identify trends and gaps in the field of implementation science in genomic medicine. METHODS: We conducted a literature review using the Centers for Disease Control and Prevention’s Public Health Genomics Knowledge Base to examine the current literature in the field of implementation science in genomic medicine. We selected original research articles based on specific inclusion criteria and then abstracted information about study design, genomic medicine, and implementation outcomes. Data were aggregated, and trends and gaps in the literature were discussed. RESULTS: Our final review encompassed 283 articles published in 2014, the majority of which described uptake (35.7%, n = 101) and preferences (36.4%, n = 103) regarding genomic technologies, particularly oncology (35%, n = 99). Key study design elements, such as racial/ethnic composition of study populations, were underreported in studies. Few studies incorporated implementation science theoretical frameworks, sustainability measures, or capacity building. CONCLUSION: Although genomic discovery provides the potential for population health benefit, the current knowledge base around implementation to turn this promise into a reality is severely limited. Current gaps in the literature demonstrate a need to apply implementation science principles to genomic medicine in order to deliver on the promise of precision medicine.
BACKGROUND: Federal and state public health agencies in the United States are increasingly using digital advertising and social media to promote messages from broader multimedia campaigns. However, little evidence exists on population-level campaign awareness and relative cost efficiencies of digital advertising in the context of a comprehensive public health education campaign. OBJECTIVE: Our objective was to compare the impact of increased doses of digital video and television advertising from the 2013 Tips From Former Smokers (Tips) campaign on overall campaign awareness at the population level. We also compared the relative cost efficiencies across these media platforms. METHODS: We used data from a large national online survey of approximately 15,000 US smokers conducted in 2013 immediately after the conclusion of the 2013 Tips campaign. These data were used to compare the effects of variation in media dose of digital video and television advertising on population-level awareness of the Tips campaign. We implemented higher doses of digital video among selected media markets and randomly selected other markets to receive similar higher doses of television ads. Multivariate logistic regressions estimated the odds of overall campaign awareness via digital or television format as a function of higher-dose media in each market area. All statistical tests used the .05 threshold for statistical significance and the .10 level for marginal nonsignificance. We used adjusted advertising costs for the additional doses of digital and television advertising to compare the cost efficiencies of digital and television advertising on the basis of costs per percentage point of population awareness generated. RESULTS: Higher-dose digital video advertising was associated with 94% increased odds of awareness of any ad online relative to standard-dose markets (P<.001). 
Higher-dose digital advertising was associated with a marginally nonsignificant increase (46%) in overall campaign awareness regardless of media format (P=.09). Higher-dose television advertising was associated with 81% increased odds of overall ad awareness regardless of media format (P<.001). Increased doses of television advertising were also associated with significantly higher odds of awareness of any ad on television (P<.001) and online (P=.04). The adjusted cost of each additional percentage point of population-level reach generated by higher doses of advertising was approximately US $440,000 for digital advertising and US $1 million for television advertising. CONCLUSIONS: Television advertising generated relatively higher levels of overall campaign awareness. However, digital video was relatively more cost efficient for generating awareness. These results suggest that digital video may be used as a cost-efficient complement to traditional advertising modes (eg, television), but digital video should not replace television given the relatively smaller audience size of digital video viewers.
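The cost-efficiency comparison in this abstract reduces to simple arithmetic: additional adjusted advertising spend divided by the additional percentage points of population awareness it generated. A minimal sketch follows; the input spend and awareness figures below are hypothetical illustrations (the abstract reports only the resulting ratios of roughly $440,000 and $1 million per point):

```python
def cost_per_point(additional_cost, awareness_points_gained):
    """Adjusted advertising cost per percentage point of
    population-level campaign awareness generated."""
    return additional_cost / awareness_points_gained

# Hypothetical inputs chosen only to reproduce the reported ratios.
digital = cost_per_point(2_200_000, 5.0)   # ~ $440,000 per point
tv = cost_per_point(9_000_000, 9.0)        # ~ $1,000,000 per point
print(f"digital: ${digital:,.0f}/pt, television: ${tv:,.0f}/pt")
```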
PROBLEM: Medical students have limited exposure to Geriatrics in their traditional training. Service-learning offers students the opportunity to engage with older adult communities and become more comfortable interacting with this population. INTERVENTION: A preclinical elective course was developed to expand medical students’ experiences in Geriatrics through service-learning. In this course, students conducted needs assessments in diverse older adult communities, created health education projects to address community-identified needs, and reflected on their experiences through written assignments and presentations. The course instructor presented lectures on special topics in Geriatrics, including ageism and health literacy. The curriculum aimed to familiarize students with older adults’ needs in a variety of settings. CONTEXT: Over 3 years, 74 students participated in the service-learning course. Students were assigned to older adult community sites, where they conducted needs assessments and designed and implemented original educational projects targeting community concerns. Program evaluation methods included a validated survey assessing students’ attitudes toward older adults, course evaluations, review of student assignments and projects, and feedback from older adult participants and site coordinators. OUTCOME: Students gained hands-on experience working with older adults and designing appropriate health education projects. Analysis of attitude surveys demonstrated students’ increased interest in Geriatrics as a career. Both students and older adult participants described enjoyable, valuable experiences gained from service-learning activities. LESSONS LEARNED: Students appreciated the combination of community and classroom learning about Geriatrics. Service-learning was most constructive at sites with responsive coordinators, engaged older adults, and a need for health education resources. 
The course challenged students to assess health needs in communities that included cognitively impaired elders and to design educational projects tailored to older adults.
Recent global (1) and national (2,3) health equity initiatives conclude that the elimination of health disparities requires improved understanding of social context (4,5) and ability to measure social determinants of health, including food and housing security (3). Food and housing security reflect the availability of and access to essential resources needed to lead a healthy life. The 2013 Behavioral Risk Factor Surveillance System (BRFSS) included two questions to assess perceived food and housing security in 15 states.* Among 95,665 respondents, the proportion who answered “never or rarely” to the question “how often in the past 12 months would you say you were worried or stressed about having enough money to buy nutritious meals?” ranged from 68.5% to 82.4% by state. Among 90,291 respondents living in housing they either owned or rented, the proportion who answered “never or rarely” to the question, “how often in the past 12 months would you say you were worried or stressed about having enough money to pay your rent/mortgage?” ranged from 59.9% to 72.8% by state. Food security was reported less often among non-Hispanic blacks (blacks) (68.5%) and Hispanics (64.6%) than non-Hispanic whites (whites) (81.8%). These racial/ethnic disparities were present across all levels of education; housing security followed a similar pattern. These results highlight racial/ethnic disparities in two important social determinants of health, food and housing security, as well as a substantial prevalence of worry or stress about food or housing among all subgroups in the United States. The concise nature of the BRFSS Social Context Module’s single-question format for food and housing security makes it possible to incorporate these questions into large health surveys so that social determinants can be monitored at the state and national levels and populations at risk can be identified.
Background On August 24, 2011, 31 U.S.-bound refugees from Kuala Lumpur, Malaysia, arrived in Los Angeles. One was diagnosed with measles post-arrival; he exposed others during the flight and persons in the community while disembarking and seeking medical care. As a result, nine cases of measles were identified. Methods We estimated the costs of responding to this outbreak and conducted a comparative cost analysis examining what might have happened had all U.S.-bound refugees been vaccinated before leaving Malaysia. Results State-by-state costs differed and variously included vaccination, hospitalization, medical visits, and contact tracing, with costs ranging from $621 to $35,115. Domestic costs plus the costs reported by IOM Malaysia for U.S.-bound refugees totaled $137,505 [range: $134,531–$142,777 from a sensitivity analysis]. Had all U.S.-bound refugees been vaccinated while in Malaysia, vaccination would have cost approximately $19,646 and could have prevented 8 measles cases. Conclusion A program supporting complete vaccination of U.S.-bound refugees could improve refugees’ health, reduce importation of vaccine-preventable diseases into the United States, and avert measles response activities and costs.
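The comparative cost analysis can be reproduced directly from the figures reported in the abstract:

```python
# Figures reported in the abstract (US$)
response_cost = 137_505     # domestic + IOM Malaysia outbreak response costs
vaccination_cost = 19_646   # estimated cost to vaccinate all U.S.-bound refugees
cases_preventable = 8       # of 9 outbreak cases; the index case is not averted

net_savings = response_cost - vaccination_cost
cost_averted_per_case = net_savings / cases_preventable
print(f"net savings: ${net_savings:,}")                     # $117,859
print(f"averted cost per case: ${cost_averted_per_case:,.2f}")
```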
BACKGROUND: Indoor tanning is associated with an increased risk of melanoma. The US Food and Drug Administration proposed prohibiting indoor tanning among minors younger than 18 years. OBJECTIVE: We sought to estimate the health and economic benefits of reducing indoor tanning in the United States. METHODS: We used a Markov model to estimate the expected number of melanoma cases and deaths averted, life-years saved, and melanoma treatment costs saved by reducing indoor tanning. We examined 5 scenarios: restricting indoor tanning among minors younger than 18 years, and reducing the prevalence by 20%, 50%, 80%, and 100%. RESULTS: Restricting indoor tanning among minors younger than 18 years was estimated to prevent 61,839 melanoma cases, prevent 6735 melanoma deaths, and save $342.9 million in treatment costs over the lifetime of the 61.2 million youth age 14 years or younger in the United States. The estimated health and economic benefits increased as indoor tanning was further reduced. LIMITATIONS: Limitations include the reliance on available data and not examining compliance to indoor tanning laws. CONCLUSIONS: Reducing indoor tanning has the potential to reduce melanoma incidence, mortality, and treatment costs. These findings help quantify and underscore the importance of continued efforts to reduce indoor tanning and prevent melanoma.
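A Markov cohort model of the kind the study describes can be sketched in miniature: a closed cohort moves among health states according to fixed per-cycle transition probabilities, and incident cases are tallied each cycle. The three states and all probabilities below are hypothetical placeholders, not the study's inputs (the published model uses age-specific melanoma incidence, mortality, and cost data):

```python
# Hypothetical annual transition probabilities (rows sum to 1).
P = {
    "healthy":  {"healthy": 0.9985, "melanoma": 0.0005, "dead": 0.0010},
    "melanoma": {"healthy": 0.0,    "melanoma": 0.9700, "dead": 0.0300},
    "dead":     {"healthy": 0.0,    "melanoma": 0.0,    "dead": 1.0},
}

def run_cohort(start, cycles):
    """Advance the cohort through the Markov chain, counting incident cases."""
    state = dict(start)
    new_cases = 0.0
    for _ in range(cycles):                       # one cycle = one year
        new_cases += state["healthy"] * P["healthy"]["melanoma"]
        state = {s: sum(state[f] * P[f][s] for f in P) for s in P}
    return new_cases, state

cases, final = run_cohort({"healthy": 1_000_000.0, "melanoma": 0.0, "dead": 0.0}, 60)
print(f"incident melanoma cases over 60 cycles: {cases:,.0f}")
```

Comparing two such runs (e.g., with indoor-tanning-attributable incidence removed) yields cases averted, the quantity the study scales to the U.S. youth population.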
Injury-associated deaths have substantial economic consequences in the United States. The total estimated lifetime medical and work-loss costs associated with fatal injuries in 2013 were $214 billion. In 2014, unintentional injury, suicide, and homicide (the fourth, tenth, and seventeenth leading causes of death, respectively) accounted for 194,635 deaths in the United States (2). In 2014, a total of 199,756 fatal injuries occurred in the United States, and the associated lifetime medical and work-loss costs were $227 billion. This report examines the state-level economic burdens of fatal injuries by extending a previous national-level study. Numbers and rates of fatal injuries, lifetime costs, and lifetime costs per capita were calculated for each of the 50 states and the District of Columbia (DC) and for four injury intent categories (all intents, unintentional, suicide, and homicide). During 2014, injury mortality rates and economic burdens varied widely among the states and DC. Among fatal injuries of all intents, the mortality rate and lifetime costs per capita ranged from 101.9 per 100,000 and $1,233, respectively (New Mexico) to 40.2 per 100,000 and $491 (New York). States can engage more effectively and efficiently in injury prevention if they are aware of the economic burden of injuries, identify areas for immediate improvement, and devote necessary resources to those areas.
Disseminated acanthamoebiasis is a rare, often fatal, infection most commonly affecting immunocompromised patients. We report a case involving sinuses, skin, and bone in a 60-year-old woman 5 months after heart transplantation. She improved with a combination of flucytosine, fluconazole, miltefosine, and decreased immunosuppression. To our knowledge, this is the first case of successfully treated disseminated acanthamoebiasis in a heart transplant recipient and only the second successful use of miltefosine for this infection among solid organ transplant recipients. Acanthamoeba infection should be considered in transplant recipients with evidence of skin, central nervous system, and sinus infections that are unresponsive to antibiotics. Miltefosine may represent an effective component of a multidrug therapeutic regimen for the treatment of this amoebic infection.
BACKGROUND: Pneumococci are spread by persons with nasopharyngeal colonization, a necessary precursor to invasive disease. Pneumococcal conjugate vaccines can prevent colonization with vaccine serotype strains. In 2011, Kenya became one of the first African countries to introduce the 10-valent pneumococcal conjugate vaccine (PCV10) into its national immunization program. Serial cross-sectional colonization surveys were conducted to assess baseline pneumococcal colonization, antibiotic resistance patterns, and factors associated with resistance. METHODS: Annual surveys were conducted in one urban and one rural site during 2009 and 2010 among children aged <5 years. To reflect differences in vaccine target population, recruitment was age-stratified in Kibera, whereas a simple random sample of children was drawn in Lwak. Nasopharyngeal swabs were collected from eligible children. Pneumococci were isolated and serotyped. Antibiotic susceptibility testing was performed using the 2009 isolates. Antibiotic nonsusceptibility was defined as intermediate susceptibility or resistance to ≥1 antibiotic (i.e., penicillin, chloramphenicol, levofloxacin, erythromycin, tetracycline, cotrimoxazole, and clindamycin); multidrug resistance (MDR) was defined as nonsusceptibility to ≥3 antibiotics. Weighted analysis was conducted when appropriate. Modified Poisson regression was used to calculate factors associated with antibiotic nonsusceptibility. RESULTS: Of the 1,087 children enrolled (Kibera: 740; Lwak: 347), 90.0% were colonized with pneumococci, and 37.3% were colonized with PCV10 serotypes. There were no differences by survey site or year. Of the 657 isolates tested for antibiotic susceptibility (90% of 730), nonsusceptibility to cotrimoxazole and penicillin was found in 98.6% and 81.9% of isolates, respectively. MDR was found in 15.9% of isolates and most often involved nonsusceptibility to cotrimoxazole and penicillin; 40.4% of MDR isolates were PCV10 serotypes.
In the multivariable model, PCV10 serotypes were independently associated with penicillin nonsusceptibility (Prevalence Ratio: 1.2, 95% CI 1.1-1.3), but not with MDR. CONCLUSIONS: Before PCV10 introduction, nearly all Kenyan children aged <5 years were colonized with pneumococci, and PCV10 serotype colonization was common. PCV10 serotypes were associated with penicillin nonsusceptibility. Given that colonization with PCV10 serotypes is associated with greater risk for invasive disease than colonization with other serotypes, successful PCV10 introduction in Kenya is likely to have a substantial impact in reducing vaccine-type pneumococcal disease and drug-resistant pneumococcal infection.
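For a single binary exposure, the prevalence ratio that modified Poisson regression estimates can be illustrated with a hand calculation; the regression approach generalizes this to adjusted estimates with robust standard errors. The counts below are hypothetical, chosen only to mimic the reported PR of 1.2:

```python
import math

def prevalence_ratio(a, n1, b, n0):
    """PR with a 95% Wald CI on the log scale.
    a of n1 exposed have the outcome; b of n0 unexposed do."""
    p1, p0 = a / n1, b / n0
    pr = p1 / p0
    se = math.sqrt((1 - p1) / a + (1 - p0) / b)   # SE of log(PR)
    lo = pr * math.exp(-1.96 * se)
    hi = pr * math.exp(+1.96 * se)
    return pr, lo, hi

# Hypothetical: penicillin nonsusceptibility in 180/200 PCV10-serotype
# isolates vs. 600/800 non-PCV10 isolates.
pr, lo, hi = prevalence_ratio(180, 200, 600, 800)
print(f"PR = {pr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```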
During November-December 2015, as part of the 2015 cholera outbreak response in Iraq, the Iraqi Ministry of Health targeted approximately 255,000 displaced persons >1 year of age with 2 doses of oral cholera vaccine (OCV). All persons who received vaccines were living in selected refugee camps, internally displaced persons camps, and collective centers. We conducted a multistage cluster survey to obtain OCV coverage estimates in 10 governorates that were targeted during the campaign. In total, 1,226 household and 5,007 individual interviews were conducted. Overall, 2-dose OCV coverage in the targeted camps was 87% (95% CI 85%-89%). Two-dose OCV coverage in the 3 northern governorates (91%; 95% CI 87%-94%) was higher than that in the 7 southern and central governorates (80%; 95% CI 77%-82%). The experience in Iraq demonstrates that OCV campaigns can be successfully implemented as part of a comprehensive response to cholera outbreaks among high-risk populations in conflict settings.
In 1988, the World Health Assembly resolved to eradicate poliomyelitis (polio). Since then, wild poliovirus (WPV) cases have declined by >99.9%, from an estimated 350,000 cases of polio each year to 74 cases in two countries in 2015 (1). This decrease was achieved primarily through the use of trivalent oral poliovirus vaccine (tOPV), which contains types 1, 2, and 3 live, attenuated polioviruses. Since 2000, the United States has exclusively used inactivated polio vaccine (IPV), which contains all three poliovirus types (2,3). In 2013, the World Health Organization (WHO) set a target of a polio-free world by 2018 (4). Of the three WPV types, type 2 was declared eradicated in September 2015. To remove the risk for infection with circulating type 2 vaccine-derived polioviruses (cVDPV), which can lead to paralysis similar to that caused by WPV, all OPV-using countries simultaneously switched in April 2016 from tOPV to bivalent OPV (bOPV), which contains only types 1 and 3 polioviruses (5). This report summarizes current Advisory Committee on Immunization Practices (ACIP) recommendations for poliovirus vaccination and provides CDC guidance, in the context of the switch from tOPV to bOPV, regarding assessment of vaccination status and vaccination of children who might have received poliovirus vaccine outside the United States, to ensure that children living in the United States (including immigrants and refugees) are protected against all three poliovirus types. This guidance is not new policy and does not change the recommendations of ACIP for poliovirus vaccination in the United States. Children living in the United States who might have received poliovirus vaccination outside the United States should meet ACIP recommendations for poliovirus vaccination, which require protection against all three poliovirus types by age-appropriate vaccination with IPV or tOPV. 
In the absence of vaccination records indicating receipt of these vaccines, only vaccination or revaccination in accordance with the age-appropriate U.S. IPV schedule is recommended. Serology to assess immunity for children with no or questionable documentation of poliovirus vaccination will no longer be an available option and therefore is no longer recommended, because of increasingly limited availability of antibody testing against type 2 poliovirus.
BACKGROUND: Health information systems are central to strong health systems. They assist with patient and program management, quality improvement, disease surveillance, and strategic use of information. Many donors have worked to improve health information systems, particularly by supporting the introduction of electronic health information systems (EHIS), which are considered more responsive and more efficient than older, paper-based systems. As many donor-driven programs are increasing their focus on country ownership, sustainability of these investments is a key concern. This analysis explores the potential sustainability of EHIS investments in Malawi, Zambia and Zimbabwe, originally supported by the United States President’s Emergency Plan for AIDS Relief (PEPFAR). METHODS: Using a framework based on sustainability theories from the health systems literature, this analysis employs a qualitative case study methodology to highlight factors that may increase the likelihood that donor-supported initiatives will continue after the original support is modified or ends. RESULTS: Findings highlight commonalities around possible determinants of sustainability. The study found that there is great optimism about the potential for EHIS, but the perceived risks may result in hesitancy to transition completely and parallel use of paper-based systems. Full stakeholder engagement is likely to be crucial for sustainability, as well as integration with other activities within the health system and those funded by development partners. The literature suggests that a sustainable system has clearly-defined goals around which stakeholders can rally, but this has not been achieved in the systems studied. The study also found that technical resource constraints – affecting system usage, maintenance, upgrades and repairs – may limit EHIS sustainability even if these other pillars were addressed. 
CONCLUSIONS: The sustainability of EHIS faces many challenges, which could be addressed through systems’ technical design, stakeholder coordination, and the building of organizational capacity to maintain and enhance such systems. All of this requires time and attention, but is likely to enhance long-term outcomes.
Objectives: The aim of this case study was to compare two alternative strategies for prioritizing data elements for data quality assessment (DQA) in a routine health management information system. The study used data from iSante, a multi-site electronic medical record implemented by the Haitian Ministry of Health. We described and compared two prioritization strategies: (1) a Delphi process drawing iterative feedback from clinicians and stakeholders responsible for monitoring and evaluation (M&E) of health programs to identify consensus priorities for data on HIV patients; and (2) a process using burden of disease estimates from Haiti to establish priorities for data on primary care patients. Methods: The Delphi process included 26 individuals across 6 institutions, including clinicians and M&E specialists. Through three rounds of questionnaires, the stakeholders provided input for prioritization of 13 indicators for completeness, accuracy, and timeliness of HIV data. The burden of disease prioritization process revealed that cardiovascular disease contributed the greatest number of disability-adjusted life-years (DALYs). This resulted in the selection of 16 data quality indicators for primary care data. Results: Both methods informed the definition of a set of automated data quality queries to assess internal validity, completeness, and timeliness using logic and clinical plausibility. The Delphi process benefited from stakeholder input but was lengthy. The burden of disease prioritization process was objective and easier to implement but lacked stakeholder buy-in. Conclusions: A hybrid approach guided by both disease burden and stakeholder input may be most beneficial for prioritizing data elements for DQA.
OBJECTIVE: To provide a review of evidence and a consensus-based description of healthcare and educational service delivery, and related recommendations, for children with traumatic brain injury. METHODS: A literature review and group discussion of best practices in the management of children with traumatic brain injury (TBI) were performed to facilitate consensus-based recommendations from the American Congress on Rehabilitation Medicine’s Pediatric and Adolescent Task Force on Brain Injury. This group represented pediatric researchers in public health, medicine, psychology, rehabilitation, and education. RESULTS: Care for children with TBI in healthcare and educational systems is not well coordinated or integrated, resulting in increased risk for poor outcomes. Potential solutions include identifying at-risk children following TBI, evaluating their need for rehabilitation and transitional services, and improving utilization of educational services that support children across the lifespan. CONCLUSION: Children with TBI are at risk for long-term consequences requiring management as well as monitoring following the injury. Current systems of care have challenges and inconsistencies leading to gaps in service delivery. Further efforts to improve knowledge of long-term TBI effects in children and of child and family needs, and to identify best practices in pathways of care, are essential for optimal care of children following TBI.
The study aimed to map instantaneous centers of rotation (ICRs) of lumbar motion segments during a functional lifting task and to examine differences across segments and variations caused by the magnitude of weight lifted. Eleven healthy participants lifted loads of three magnitudes (4.5, 9, and 13.5 kg) from a trunk-flexed (~75 degrees) to an upright position while being imaged by a dynamic stereo X-ray (DSX) system. Tracked lumbar vertebral (L2-S1) motion data were processed into highly accurate 6DOF intervertebral (L2L3, L3L4, L4L5, L5S1) kinematics. ICRs were computed using the finite helical axis method. Effects of segment level and load magnitude on the anterior-posterior (AP) and superior-inferior (SI) ICR migration ranges were assessed with a mixed-effects model. Further, ICRs were averaged to a single center of rotation (COR) to assess segment-specific differences in COR AP- and SI-coordinates. The AP range was significantly larger for L2L3 than for L3L4 (p=0.02), L4L5, and L5S1 (p<0.001). The average ICR SI location was higher for L4L5 and L5S1 (near the superior endplate of the inferior vertebra) than for L2L3 and L3L4 (between the mid-transverse plane and the superior endplate of the inferior vertebra) (p≤0.001), but within each pair the differences were not significant (p>0.9). Load magnitude had a significant effect only on the SI component of ICR migration range (13.5 kg > 9 kg and 4.5 kg; p=0.049 and 0.017, respectively). The reported segment-specific ICR data provide improved input parameters for lumbar spine biomechanical models and the design of disc replacements, as well as baseline references for potential diagnostic applications.
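The geometric idea behind the finite helical axis method can be illustrated in two dimensions, where the helical axis reduces to a single instantaneous center of rotation: a rigid body that rotates by theta and translates by t leaves fixed exactly one point c satisfying R@c + t = c. This is a didactic sketch of the planar case, not the study's 3D implementation:

```python
import math

def icr_2d(theta, tx, ty):
    """Center of rotation for a 2D rigid motion (rotation theta, translation
    (tx, ty)): solve (I - R) c = t by inverting the 2x2 matrix directly."""
    c, s = math.cos(theta), math.sin(theta)
    det = (1 - c) ** 2 + s ** 2          # det(I - R); nonzero unless theta = 0
    return (((1 - c) * tx - s * ty) / det,
            (s * tx + (1 - c) * ty) / det)

# Build a motion that rotates 10 degrees about (0.03, -0.02) m, then recover it.
theta = math.radians(10.0)
cx, cy = 0.03, -0.02
tx = cx - (math.cos(theta) * cx - math.sin(theta) * cy)
ty = cy - (math.sin(theta) * cx + math.cos(theta) * cy)
print(icr_2d(theta, tx, ty))             # recovers approximately (0.03, -0.02)
```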
PURPOSE OF REVIEW: Viral load measurement is a key indicator that determines patients’ response to treatment and risk for disease progression. Efforts are ongoing in different countries to scale-up access to viral load testing to meet the Joint United Nations Programme on HIV and AIDS target of achieving 90% viral suppression among HIV-infected patients receiving antiretroviral therapy. However, the impact of these initiatives may be challenged by increased inefficiencies along the viral load testing spectrum. This will translate to increased costs and ineffectiveness of scale-up approaches. This review describes different parameters that could be addressed across the viral load testing spectrum aimed at improving efficiencies and utilizing test results for patient management. RECENT FINDINGS: Though progress is being made in some countries to scale-up viral load, many others still face numerous challenges that may affect scale-up efficiencies: weak demand creation; ineffective supply chain management systems; poor specimen referral systems; inadequate data and quality management systems; and weak laboratory-clinical interface leading to diminished uptake of test results. SUMMARY: In scaling up access to viral load testing, there should be a renewed focus to address efficiencies across the entire spectrum, including factors related to access, uptake, and impact of test results.
PURPOSE OF REVIEW: Recent advances in point-of-care technologies to ensure universal access to affordable quality-assured diagnostics have the potential to transform patient management, surveillance programmes, and control of infectious diseases. Decentralization of testing can put tremendous stresses on fragile health systems if the laboratory is not involved in the planning, introduction, and scale-up strategies. RECENT FINDINGS: The impact of investments in novel technologies can only be realized if these tests are evaluated, adopted, and scaled up within the healthcare system with appropriate planning and understanding of the local contexts in which these technologies will be used. SUMMARY: In this digital age, the laboratory needs to take on the role of the Command Centre for technology introduction and implementation. Implementation science is needed to understand the political, cultural, economic, and behavioural context for technology introduction. The new paradigm should include: building a comprehensive system of laboratories and point-of-care testing sites to provide quality-assured diagnostic services with good laboratory-clinic interface to build trust in test results and linkage to care; building and coordinating a comprehensive national surveillance and communication system for disease control and global health emergencies; conducting research to monitor the impact of new tools and interventions on improving patient care.
BACKGROUND: Chlamydia trachomatis (CT) and Trichomonas vaginalis (TV), two prevalent sexually transmitted infections, are known to increase HIV risk in women and could potentially diminish pre-exposure prophylaxis (PrEP) efficacy, particularly for topical interventions that rely on local protection. We investigated in macaques whether co-infection with CT/TV reduces the protection conferred by vaginal tenofovir (TFV) gel. METHODS: Vaginal TFV gel dosing previously shown to provide 100% or 74% protection when applied either 30 minutes or 3 days before SHIV challenge was assessed in pigtailed macaques co-infected with CT/TV and challenged twice weekly with SHIV162p3 for up to 10 weeks (2 menstrual cycles). Three groups of six macaques received either placebo or 1% TFV gel 30 minutes or 3 days before each SHIV challenge. We additionally assessed TFV and TFV-diphosphate (TFV-DP) concentrations in plasma and vaginal tissues in CT/TV co-infected (n = 4) and uninfected (n = 4) macaques. RESULTS: CT/TV co-infections were maintained during the SHIV challenge period. All macaques that received placebo gel were SHIV-infected after a median of 7 challenges (1 menstrual cycle). In contrast, no infections were observed in macaques treated with TFV gel 30 minutes before SHIV challenge (p < 0.001). Efficacy was reduced to 60% when TFV gel was applied 3 days before SHIV challenge (p = 0.07). Plasma TFV and TFV-DP concentrations in tissues and vaginal lymphocytes were significantly higher in CT/TV co-infected than in CT/TV uninfected macaques. CONCLUSIONS: Our findings in this model suggest that CT/TV co-infection may have little or no impact on the efficacy of highly effective topical TFV modalities and highlight a significant modulation of TFV pharmacokinetics.
Nanocellulose (NC) is emerging as a highly promising nanomaterial for a wide range of applications. Many types of NC are produced, each exhibiting a slightly different shape, size, and chemistry. The main objective of this study was to compare the cytotoxic effects of cellulose nanocrystals (CNC) and nanofibrillated cellulose (NCF). Human lung epithelial cells (A549) were exposed for 24 h and 72 h to five different NC particles to determine how variations in properties contribute to cellular outcomes, including cytotoxicity, oxidative stress, and cytokine secretion. Our results showed that NCF were more toxic than CNC particles with respect to cytotoxicity and oxidative stress responses. However, exposure to CNC caused an inflammatory response, with significantly elevated inflammatory cytokines/chemokines compared with NCF. Interestingly, cellulose staining indicated that CNC particles, but not NCF, were taken up by the cells. Furthermore, clustering analysis of the inflammatory cytokines revealed that the NCF response resembled that of carbon nanofibers, whereas the CNC response resembled that of chitin, a known immune modulator and innate cell activator. Taken together, the present study revealed distinct differences between fibrillar and crystalline nanocellulose and demonstrated that the physicochemical properties of NC are critical in determining their toxicity.
BACKGROUND: We evaluated the performance of the Becton Dickinson Veritor System Flu A + B rapid influenza diagnostic test (RIDT) to detect influenza viruses in respiratory specimens from patients enrolled at five surveillance sites in Kenya, a tropical country where influenza seasonality is variable. METHODS: Nasal swab (NS) and nasopharyngeal (NP)/oropharyngeal (OP) swabs were collected from patients with influenza-like illness and/or severe acute respiratory infection. The sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of the RIDT using NS specimens were evaluated against nasal swabs tested by real-time reverse transcription polymerase chain reaction (rRT-PCR). The performance parameter results were expressed as 95% confidence intervals (CI) calculated using binomial exact methods, with P < 0.05 considered significant. Two-sample Z tests were used to test for differences in sample proportions. Analysis was performed using SAS software version 9.3. RESULTS: From July 2013 to July 2014, 3,569 patients were recruited, of whom 78.7% were aged <5 years. Overall, 14.4% of NS specimens were influenza-positive by RIDT. RIDT overall sensitivity was 77.1% (95% CI 72.8-81.0%) and specificity was 94.9% (95% CI 94.0-95.7%) compared to rRT-PCR using NS specimens. RIDT sensitivity for influenza A virus compared to rRT-PCR using NS specimens was 71.8% (95% CI 66.7-76.4%) and was significantly higher than for influenza B, which was 43.8% (95% CI 33.8-54.2%). PPV ranged from 30%-80% depending on background prevalence of influenza. CONCLUSION: Although the variable seasonality of influenza in tropical Africa presents unique challenges, RIDTs may have a role in making influenza surveillance sustainable in more remote areas of Africa, where laboratory capacity is limited.
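As a brief illustration of the diagnostic accuracy measures reported above (sensitivity, specificity, PPV, NPV), the sketch below computes them from a 2x2 table. The counts are hypothetical, not the study's data, and the exact binomial (Clopper-Pearson) confidence intervals the authors used are omitted here.

```python
# Hypothetical sketch of the diagnostic accuracy calculations described above;
# the 2x2 counts below are illustrative only, not the study's actual data.

def diagnostic_accuracy(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, and NPV from 2x2 counts,
    with the rRT-PCR result taken as the reference standard."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among reference positives
        "specificity": tn / (tn + fp),  # true negatives among reference negatives
        "ppv": tp / (tp + fp),          # P(reference positive | RIDT positive)
        "npv": tn / (tn + fn),          # P(reference negative | RIDT negative)
    }

# Illustrative table: RIDT result (rows) vs. rRT-PCR result (columns)
acc = diagnostic_accuracy(tp=77, fp=15, fn=23, tn=285)
print({k: round(v, 3) for k, v in acc.items()})
```

Note that PPV and NPV, unlike sensitivity and specificity, shift with background prevalence, which is why the abstract reports PPV as a range (30%-80%) across settings.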
Human plasma to be analyzed for exposure to cholinesterase inhibitors is stored at 4°C or lower to prevent denaturation of human butyrylcholinesterase (HuBChE), the biomarker of exposure. Currently published protocols immunopurify HuBChE using antibodies that bind native HuBChE before analysis by mass spectrometry. It is anticipated that the plasma collected from human casualties may be stored nonideally at elevated temperatures of up to 45°C for days or maybe weeks. At 45°C, the plasma loses 50% of its HuBChE activity in 8 days and 95% in 40 days. Our goal was to identify a set of monoclonal antibodies that could be used to immunopurify HuBChE from plasma stored at 45°C. The folding states of pure human HuBChE stored at 4°C and 45°C and boiled at 100°C were visualized on nondenaturing gels stained with Coomassie blue. Fully active pure HuBChE tetramers had a single band, but pure HuBChE stored at 45°C had four bands, representing native, partly unfolded, aggregated, and completely denatured, boiled tetramers. The previously described monoclonal B2 18-5 captured native, partly unfolded, and aggregated HuBChE tetramers, whereas a new monoclonal, C191, developed in our laboratory, was found to selectively capture completely denatured, boiled HuBChE. The highest quantity of HuBChE protein was extracted from 45°C heat-denatured human plasma when HuBChE was immunopurified with a combination of monoclonals B2 18-5 and C191. Using a mixture of these two antibodies in future emergency response assays may increase the capability to confirm exposure to cholinesterase inhibitors.
Filter-based toxicology studies are conducted to establish the biological plausibility of the well-established health impacts associated with fine particulate matter (PM2.5) exposure. Ambient PM2.5 collected on filters is extracted into solution for toxicology applications, but frequently, characterization is nonexistent or only performed on filter-based PM2.5, without consideration of compositional differences that occur during the extraction processes. To date, the impact of making associations to measured components in ambient instead of extracted PM2.5 has not been investigated. Filter-based PM2.5 was collected at locations (n = 5) and detailed characterization of both ambient and extracted PM2.5 was performed. Alveolar macrophages (AMJ2-C11) were exposed (3, 24, and 48 h) to PM2.5 and the pro-inflammatory cytokine interleukin (IL)-6 was measured. IL-6 release differed significantly between PM2.5 collected from different locations; surprisingly, IL-6 release was highest following treatment with PM2.5 from the lowest ambient concentration location. IL-6 was negatively correlated with the sum of ambient metals analyzed, as well as with concentrations of specific constituents which have been previously associated with respiratory health effects. However, positive correlations of IL-6 with extracted concentrations indicated that the negative associations between IL-6 and ambient concentrations do not accurately represent the relationship between inflammation and PM2.5 exposure. Additionally, seven organic compounds had significant associations with IL-6 release when considering ambient concentrations, but they were not detected in the extracted solution. Basing inflammatory associations on ambient concentrations that are not necessarily representative of in vitro exposures creates misleading results; this study highlights the importance of characterizing extraction solutions to conduct accurate health impact research.
With rapid development of novel nanotechnologies that incorporate engineered nanomaterials (ENMs) into manufactured products, long-term, low-dose ENM exposures in occupational settings are forecasted to occur, with potential adverse outcomes to human health. Few ENM human health risk assessment efforts have evaluated the tumorigenic potential of ENMs. Two widely used nano-scaled metal oxides (NMOs), cerium oxide (nCeO2) and ferric oxide (nFe2O3), were screened in the current study using a sub-chronic exposure to human primary small airway epithelial cells (pSAECs). Multi-walled carbon nanotubes (MWCNT), a known ENM tumor promoter, were used as a positive control. Advanced dosimetry modeling was employed to ascertain delivered vs. administered dose in all experimental conditions. Cells were continuously exposed in vitro to deposited doses of 0.18 μg/cm2 or 0.06 μg/cm2 of each NMO or MWCNT, respectively, over 6 and 10 weeks, while saline- and dispersant-only exposed cells served as passage controls. Cells were evaluated for changes in several cancer hallmarks, as evidence for neoplastic transformation. At 10 weeks, nFe2O3- and MWCNT-exposed cells displayed a neoplastic-like transformation phenotype with significantly increased proliferation, invasion, and soft agar colony formation ability compared to controls. nCeO2-exposed cells showed increased proliferative capacity only. Isolated nFe2O3 and MWCNT clones from soft agar colonies retained their respective neoplastic-like phenotypes. Interestingly, nFe2O3-exposed cells, but not MWCNT cells, exhibited immortalization and retention of the neoplastic phenotype after repeated passaging (12-30 passages) and after cryofreeze and thawing. High content screening and protein expression analyses in acute exposure ENM studies vs. immortalized nFe2O3 cells, and isolated ENM clones, suggested that long-term exposure to the tested ENMs resulted in iron homeostasis disruption, an increased labile ferrous iron pool, and subsequent reactive oxygen species generation, a well-established tumorigenesis promoter. In conclusion, sub-chronic exposure of human pSAECs assessed with a cancer hallmark screening battery identified nFe2O3 as possessing neoplastic-like transformation ability, thus suggesting that further tumorigenic assessment is needed.
BACKGROUND: Although rates of neural tube defects (NTDs) have declined in the United States since fortification, disparities still exist, with Hispanic women having the highest risk of giving birth to a baby with an NTD. The Promotora de Salud model using community lay health workers has been shown to be an effective tool for reaching Hispanics for a variety of health topics; however, literature on its effectiveness in folic acid interventions is limited. MATERIALS AND METHODS: An intervention using the Promotora de Salud model was implemented in four U.S. counties with large populations of Hispanic women. The study comprised the following: (1) a written pretest survey to establish baseline levels of folic acid awareness, knowledge, and consumption; (2) a small group education intervention along with a 90-day supply of multivitamins; and (3) a postintervention (posttest) assessment conducted 4 months following the intervention. RESULTS: Statistically significant differences in pre- and posttests were observed for general awareness about folic acid and vitamins and specific knowledge about the benefits of folic acid. Statistically significant changes were also seen in vitamin consumption and multivitamin consumption. Folic acid supplement consumption increased dramatically by the end of the study. CONCLUSIONS: The Promotora de Salud model relies on interpersonal connections forged between promotoras and the communities they serve to help drive positive health behaviors. The findings underscore the positive impact that these interpersonal connections can have on increasing awareness, knowledge, and consumption of folic acid. Utilizing the Promotora de Salud model to reach targeted populations might help organizations successfully implement their programs in a culturally appropriate manner.
BACKGROUND: Group B Streptococcus (GBS) and Escherichia coli have historically dominated as causes of early-onset neonatal sepsis. Widespread use of intrapartum prophylaxis for GBS disease led to concerns about the potential adverse impact on E coli incidence. METHODS: Active, laboratory, and population-based surveillance for culture-positive (blood or cerebrospinal fluid) bacterial infections among infants 0 to 2 days of age was conducted statewide in Minnesota and Connecticut and in selected counties of California and Georgia during 2005 to 2014. Demographic and clinical information were collected and hospital live birth denominators were used to calculate incidence rates (per 1000 live births). We used the Cochran-Armitage test to assess trends. RESULTS: Surveillance identified 1484 cases. GBS was most common (532) followed by E coli (368) and viridans streptococci (280). Eleven percent of cases died and 6.3% of survivors had sequelae at discharge. All-cause (2005: 0.79; 2014: 0.77; P = .05) and E coli (2005: 0.21; 2014: 0.18; P = .25) sepsis incidence were stable. GBS incidence decreased (2005: 0.27; 2014: 0.22; P = .02). Among infants <1500 g, incidence was an order of magnitude higher for both pathogens and stable. The odds of death among infants <1500 g were similar for both pathogens, but among infants ≥1500 g, the odds of death were greater for E coli cases (odds ratio: 7.0; 95% confidence interval: 2.7-18.2). CONCLUSIONS: GBS prevention efforts have not led to an increasing burden of early-onset E coli infections. However, the stable burden of E coli sepsis and associated mortality underscore the need for interventions.
BACKGROUND AND OBJECTIVES: In resource-limited settings, the uptake of antenatal care (ANC) visits among women, especially teenage pregnant women, is disturbingly low. Factors that influence the uptake of ANC services among teenage women are largely understudied and poorly understood in John Taolo Gaetsewe (JTG), a predominantly rural and poor district of South Africa. The aim of this study was to determine the factors that influence uptake of ANC services among teenage mothers in JTG district. METHODS: A cross-sectional health facility-based study utilising mixed methods was conducted in all public health facilities (n=44) in JTG district. Mother-infant pairs (n=383) who brought their infants for the six-week first DPT immunisation during the study period were enrolled in the study. Structured questionnaires were used to collect data on demographic, socio-economic and ANC uptake indicators. RESULTS: Of 272 respondent mothers, 18.68% were adolescent mothers (13-19 years). Logistic regression analysis showed that mother's age (OR=2.11; 95% CI = 1.04-4.27), distance to the nearest health facility (OR=3.38; 95% CI = 1.45-7.87), and client service satisfaction (OR=8.58; 95% CI = 2.10-34.95) were significantly associated with poor uptake of ANC services. CONCLUSION AND GLOBAL HEALTH IMPLICATIONS: There is a need to improve the quality of adolescent reproductive health services tailored to adolescents' health and developmental needs. Moreover, addressing the social determinants of health that shape individuals' lifestyles and health-seeking behavior is critical.
Pumpable roof supports are currently being used to provide a safe working environment for longwall mining. Because different pumpable supports are visually similar and installed fundamentally in the same manner as other supports, there is a tendency to believe they all perform the same way. However, there are several design parameters that can affect their performance, including the cementitious material properties and the bag construction practices that influence the degree of confinement provided. A full understanding of the impact of these design parameters is necessary to optimize the support application and to provide a foundation for making further improvements in support performance. This paper evaluates the impact of various support design parameters by examining full-scale performance tests conducted using the National Institute for Occupational Safety and Health (NIOSH) Mine Roof Simulator (MRS) as part of manufacturers' developmental and quality control testing. These tests were analyzed to identify correlations between the support design parameters and the resulting performance. Based on more than 160 tests over 7 years, quantifiable patterns were examined to assess the correlation between the support dimensions, cementitious material type, wire pitch, and single-walled vs. dual-walled bag designs and the support capacity, stiffness, load shedding events, and yield characteristics.
Why do some room and pillar retreat panels encounter abnormal conditions? What factors deserve the most consideration during the planning and execution phases of mining, and what can be done to mitigate those abnormal conditions when they are encountered? To help answer these questions, and to determine some of the relevant factors influencing the conditions of room and pillar (R & P) retreat mining entries, four consecutive R & P retreat panels were evaluated. This evaluation was intended to examine the influence of topographic changes, depth of cover, multiple-seam interactions, geological conditions, and mining geometry. This paper details observations made in the four consecutive R & P retreat panels and data collected from an instrumentation site during retreat mining. The primary focus was on the differences observed among the four panels and within the panels themselves. The instrumentation study was initially planned to evaluate the interactions between primary and secondary support, but produced rather interesting results relating to the loading encountered under the current mining conditions. In addition to the observation and instrumentation, numerical modeling was performed to evaluate the stress conditions. Both the LaModel 3.0 and Rocscience Phase 2 programs were used to evaluate these four panels. The results of both models indicated a drastic reduction in the vertical stresses experienced in these panels due to the full extraction mining in overlying seams when compared to the full overburden load. Both models showed a higher level of stress associated with the outside entries of the panels. These results agree quite well with the observations and instrumentation studies performed at the mine. These efforts provided two overarching conclusions concerning R & P retreat mine planning and execution.
The first was that there are four areas that should not be overlooked during R & P retreat mining: topographic relief, multiple-seam stress relief, stress concentrations near the gob edge, and geologic changes in the immediate roof. The second is that in order to successfully retreat an R & P panel, a three-phased approach to the design and analysis of the panel should be conducted: the planning phase, evaluation phase, and monitoring phase.
Dynamic failures, or bumps, remain an imperative safety concern in underground coal mining, despite significant advancements in engineering controls. The presence of spatially discrete, stiff roof units is one feature that has been linked to these events. However, an empirical stratigraphic review indicates that no significant difference exists in the relative commonality of discrete units between bumping and non-bumping deposits. Instead, an apparent relationship exists between reportable bumping and the overall stiffness of the host rock. However, this initial study is too simplistic to be conclusive; to weight the relative impact of changes in a single variable, such as the thickness or location of sandstone members, it must be examined in isolation, i.e., in a setting where all other variables are held constant. Numerical modelling provides this setting, and the effects of variability in a stiff discrete member in a hypothetical longwall mining scenario are investigated within the context of three stratigraphic types: Compliant, Intermediate, and Stiff. A modelling experiment examines changes in rupture potential in stiff roof units for each stratigraphic type as discrete unit thickness and location are manipulated through a range of values. Results suggest that the stiff-to-compliant ratio of the host rock has an impact on the relative stress-inducing effects of discrete stiff members. In other words, it is necessary to consider both the thickness and the distance to the seam, within the context of the host rock, to accurately anticipate areas of elevated rupture-induced hazard; acknowledging the presence of a discrete unit within the overburden in general terms is an insufficient indicator of risk. This finding helps to refine our understanding of the role of individual stiff, strong roof members in bumping phenomena, and suggests that a holistic view of overburden lithology and site-specific numerical modelling may be necessary to improve miner safety.
Understanding coal mine rib behavior is important for inferring pillar loading conditions as well as ensuring the safety of miners who are regularly exposed to ribs. Due to the variability in the geometry of underground openings and ground behavior, point measurements often fail to capture the true movement of mine workings. Photogrammetry is a potentially fast, cheap, and precise supplemental measurement tool in comparison to extensometers, tape measures, or laser range meters, but its application in underground coal mines has been limited. The practical use of photogrammetry was tested at the National Institute for Occupational Safety and Health (NIOSH) Safety Research Coal Mine. A commercially available digital single-lens reflex (DSLR) camera was used to perform the photogrammetric surveys for the experiment. Several experiments were performed using different lighting conditions, distances to subject, camera settings, and photograph overlaps, with results summarized as follows. The lighting method was found to be insignificant if the scene was appropriately illuminated. Distance to the subject had a minimal impact on result accuracy, while camera settings had a significant impact on the photogrammetric quality of images. Increasing photograph resolution was preferable when measuring plane orientations; otherwise, a high point cloud density would likely be excessive. Focal ratio (F-stop) changes affect the depth of field and image quality in situations where multiple angles are necessary to survey cleat orientations. Photograph overlap is very important to proper three-dimensional reconstruction, and at least 60% overlap between photograph pairs is ideal to avoid unnecessary post-processing. The suggestions and guidelines proposed are designed to increase the quality of photogrammetry inputs and outputs as well as minimize processing time, and serve as a starting point for an underground coal photogrammetry study.
This paper presents the results of a comprehensive study conducted by CONSOL Energy, Marcellus Shale Coalition, and Pennsylvania Coal Association to evaluate the effects of longwall-induced subsurface deformations on the mechanical integrity of shale gas wells drilled over a longwall abutment pillar. The primary objective is to demonstrate that a properly constructed gas well in a standard longwall abutment pillar can maintain mechanical integrity during and after mining operations. A study site was selected over a southwestern Pennsylvania coal mine, which extracts 457-m-wide longwall faces under about 183 m of cover. Four test wells and four monitoring wells were drilled and installed over a 38-m by 84-m centers abutment pillar. In addition to the test wells and monitoring wells, surface subsidence measurements and underground coal pillar pressure measurements were conducted as the 457-m-wide longwall panels on the south and north sides of the abutment pillar were mined by. To evaluate the resulting coal protection casing profile and lateral displacement, three separate 60-arm caliper surveys were conducted. This research represents a very important step and initiative to utilize the knowledge and science obtained from mining research to improve miner and public safety as well as the safety and health of the oil and gas industries.
BACKGROUND: National- and state-level self-reported frequency of fruit and vegetable (F/V) consumption is available for high school students from the Centers for Disease Control and Prevention’s Youth Risk Behavior Surveillance System (YRBSS). YRBSS monitors priority health-risk behaviors among a nationally representative sample of US high school students and representative samples of students in states and selected large urban school districts. However, YRBSS measures intake in times per day and not the cup equivalents that national goals use, which limits interpretation. OBJECTIVE: To help states track youth progress, scoring algorithms were developed from external data and applied to 2013 YRBSS data to estimate the percentages of high school students in the nation and 33 states meeting the US Department of Agriculture’s Food Patterns F/V intake recommendations. DESIGN: Twenty-four-hour dietary recalls were used from the 2007-2010 National Health and Nutrition Examination Survey to fit sex-specific models for 14- to 18-year-olds that estimate probabilities of meeting recommendations as a function of reported frequency of consumption and race/ethnicity, adjusting for day-to-day dietary variation. Model regression parameters were then applied to national cross-sectional YRBSS data (n=12,829) and to data from the 33 states (n=141,006) that had complete F/V data to estimate percentages meeting recommendations. RESULTS: Based on the prediction equations, 8.5% of high school students nationwide met fruit recommendations (95% CI 4.9% to 12.1%) and 2.1% met vegetable recommendations (95% CI 0.0% to 8.1%). State estimates ranged from 5.3% in Nebraska and Missouri to 8.9% in Florida for fruit and 1.0% in New Jersey, North Dakota, and South Carolina to 3.3% in New Mexico for vegetables. 
CONCLUSIONS: This method provides a new tool for states to track youth progress toward meeting dietary recommendations and indicates that a high percentage of youth in all states examined have low intakes of F/V.
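The scoring-algorithm idea described above can be sketched as a logistic prediction: a model estimates each student's probability of meeting an intake recommendation from reported consumption frequency, and the population estimate is the mean of those probabilities. The coefficients and survey responses below are hypothetical placeholders, not the NHANES-derived parameters the study actually fit.

```python
import math

# Minimal sketch of a frequency-to-probability scoring algorithm.
# Intercept and slope are HYPOTHETICAL, not the study's regression parameters.

def prob_meets_recommendation(times_per_day, intercept=-4.0, slope=1.2):
    """Logistic prediction: P(meets recommendation | reported frequency)."""
    logit = intercept + slope * times_per_day
    return 1.0 / (1.0 + math.exp(-logit))

# Population estimate = mean predicted probability across surveyed students.
freqs = [0, 1, 1, 2, 3]  # hypothetical reported times-per-day responses
estimate = sum(prob_meets_recommendation(f) for f in freqs) / len(freqs)
print(round(estimate, 3))
```

In the study, separate sex-specific models also conditioned on race/ethnicity and adjusted for day-to-day dietary variation, which this sketch does not attempt to reproduce.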
PURPOSE: Researchers previously examined the relationship between school beverage policies and sugar-sweetened beverage (SSB) consumption. This study addressed a research gap by examining cross-sectional associations between district-level policies and practices and U.S. high school students' consumption of milk and 100% fruit juice. METHODS: Data from the 2012 School Health Policies and Practices Study and 2013 Youth Risk Behavior Surveillance System were linked for 12 large urban school districts. Outcome variables were daily milk consumption (≥1 glass/day) and 100% fruit juice consumption (≥1 time/day). Exposure variables were five district policies (i.e., restrict SSB sales, maintain closed campuses, offer/sell healthful alternatives, restrict promotional products, and require nutrition education). Logistic regression models estimated the odds of consuming milk or 100% fruit juice daily, conditional on the policies and adjusting for sex, race/ethnicity, grade level, weight status, and district free/reduced-price lunch eligibility (n = 23,173). RESULTS: Students in districts that required/recommended restricting the times of SSB sales had 55% higher (adjusted odds ratio [AOR], 1.55; 95% confidence interval [CI], 1.28-1.87) odds of consuming ≥1 glass/day of milk than students in districts without this policy. Closed campus policies were associated with lower odds of consuming milk (AOR, .72; 95% CI, .63-.82) and higher odds of consuming juice (AOR, 1.27; 95% CI, 1.07-1.50). Policies requiring/recommending that districts offer/sell healthful alternatives were associated with lower odds of consuming 100% fruit juice daily. CONCLUSIONS: Results suggest that restricting SSB sales may support adolescents' milk consumption. Future studies should assess whether the implementation of federal standards that further restrict SSB sales in school leads to increased milk consumption.
Problem: Roadway incidents are the leading cause of work-related death in the United States. Methods: The objective of this research was to evaluate whether two types of feedback from a commercially available in-vehicle monitoring system (IVMS) would reduce the incidence of risky driving behaviors in drivers from two companies. IVMS were installed in 315 vehicles representing the industries of local truck transportation and oil and gas support operations, and data were collected over an approximately two-year period in intervention and control groups. In one period, intervention group drivers were given feedback from in-cab warning lights from an IVMS that indicated occurrence of harsh vehicle maneuvers. In another period, intervention group drivers viewed video recordings of their risky driving behaviors with supervisors, and were coached by supervisors on safe driving practices. Results: Risky driving behaviors declined significantly more during the period with coaching plus instant feedback with lights than during the period with lights-only feedback (ORadj = 0.61, 95% CI 0.43-0.86; Holm-adjusted p = 0.035) and than in the control group (ORadj = 0.52, 95% CI 0.33-0.82; Holm-adjusted p = 0.032). Lights-only feedback was not found to be significantly different from the control group's decline from baseline (ORadj = 0.86, 95% CI 0.51-1.43; Holm-adjusted p > 0.05). Conclusions: The largest decline in the rate of risky driving behaviors occurred when feedback included both supervisory coaching and lights. Practical applications: Supervisory coaching is an effective form of feedback to improve driving habits in the workplace. The potential advantages and limitations of this IVMS-based intervention program are discussed.
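For readers unfamiliar with the odds ratios reported above, the sketch below computes a crude (unadjusted) odds ratio with a Wald 95% confidence interval from a 2x2 table. The counts are hypothetical, and the study's adjusted ORs came from multivariable models with Holm correction, which this illustration does not reproduce.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from 2x2 counts:
    a/b = group 1 with/without the outcome,
    c/d = group 2 with/without the outcome (hypothetical labels)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only: an OR below 1 indicates
# lower odds of risky driving events in group 1 than in group 2.
or_, lo, hi = odds_ratio_ci(a=30, b=120, c=45, d=90)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval that excludes 1.0 corresponds to a statistically significant difference at the chosen alpha level, which is how the lights-only comparison above (CI 0.51-1.43, spanning 1.0) fails to reach significance.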
Introduction: More than 5,000 fatalities and eight million injuries occurred in the workplace in 2007, at a cost of $6 billion and $186 billion, respectively. Neurotoxic chemicals are known to affect central nervous system functions among workers, including balance and hearing disorders. However, it is not known whether there is an association between exposure to noise and solvents and acute injuries. Method: A thorough review was conducted of the literature on the relationship between noise or solvent exposures and hearing loss with various health outcomes. Results: The search resulted in 41 studies. Health outcomes included hearing loss, workplace injuries, absence from work due to sickness, fatalities, hospital admissions due to workplace accidents, traffic accidents, hypertension, balance, slips, trips, or falls, cognitive measures, and disability retirement. Important covariates in these studies were age of employee, type of industry or occupation, and length of employment. Discussion: Most authors that evaluated noise exposure concluded that higher exposure to noise resulted in more of the chosen health effect, but the relationship is not well understood. Studies that evaluated hearing loss found that hearing loss was related to occupational injury, disability retirement, or traffic accidents. Studies that assessed both noise exposure and hearing loss as risk factors for occupational injuries reported that hearing loss was related to occupational injuries as much as or more than noise exposure. Evidence suggests that solvent exposure is likely to be related to accidents or other health consequences such as balance disorders. Conclusions: Many authors reported that noise exposures and hearing loss, respectively, are likely to be related to occupational accidents. Practical applications: The potential significance of the study is that its findings could be used by managers to reduce injuries and the costs associated with those injuries.
Introduction: Policing involves inherent physical and psychological dangers as well as occupational stressors that could lead to chronic fatigue. Although accounts of adverse events associated with police fatigue are not scarce, literature on the association between chronic fatigue and on-duty injury is limited. Methods: Participants were officers from the Buffalo Cardio-Metabolic Occupational Police Stress (BCOPS) Study. A 10-item questionnaire was administered to assess how tired or energetic the officers generally felt, irrespective of sleep hours or workload. The questionnaire consisted of five positively worded and five negatively phrased items that measured feelings of vigor/energy and tiredness, respectively. Total as well as separate scores for positive and negative items were computed by summing scores of individual items. Payroll records documenting each officer's work history were used to assess occurrence of injury. Poisson regression was used to estimate prevalence ratios (PR) of injury. Results: Nearly 40% of officers reported feeling drained. Overall prevalence of on-duty injury during the past year was 23.9%. Injury prevalence showed a significant increasing trend across tertiles of total fatigue score: 19.6%, 21.7%, and 30.8% for the lowest, middle, and highest tertiles, respectively (trend p-value = 0.037). After controlling for potential confounders, a 5-unit increase in total fatigue score was associated with a 12% increase in prevalence of injury, which was marginally significant (p = 0.075). A 5-unit increase in fatigue score of the positively worded items was associated with a 33% increase in prevalence of injury (PR = 1.33, 95% CI: 1.04-1.70, p = 0.022). Conclusion: Officers who did not feel active, full of vigor, alert, or lively had a significantly higher prevalence of non-fatal workplace injury compared to their counterparts.
Practical applications: With additional prospective evidence, workplace interventions designed to enhance level of energy may reduce feelings of tiredness and hence may prevent workplace injury.
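The tertile prevalences reported above (19.6%, 21.7%, 30.8%) imply crude prevalence ratios relative to the lowest-fatigue group, sketched below. The study's adjusted PRs came from Poisson regression with confounder control, which this simple ratio does not reproduce.

```python
# Crude prevalence ratios from the tertile prevalences reported in the
# abstract; this ignores the confounder adjustment the authors performed.

def prevalence_ratio(p_exposed, p_ref):
    """Ratio of outcome prevalence in an exposed vs. reference group."""
    return p_exposed / p_ref

tertiles = {"lowest": 0.196, "middle": 0.217, "highest": 0.308}
pr_mid = prevalence_ratio(tertiles["middle"], tertiles["lowest"])    # ~1.11
pr_high = prevalence_ratio(tertiles["highest"], tertiles["lowest"])  # ~1.57
print(round(pr_mid, 2), round(pr_high, 2))
```

A PR of roughly 1.57 means on-duty injury was about 57% more prevalent in the highest fatigue tertile than in the lowest, consistent with the significant increasing trend the authors report.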
BACKGROUND: Asthma and obliterative bronchiolitis (OB) cases have occurred among styrene-exposed workers. We aimed to investigate styrene as a risk factor for non-malignant respiratory disease (NMRD). METHODS: From a literature review, we identified case reports and assessed cross-sectional and mortality studies for strength of evidence of positive association (i.e., strong, intermediate, suggestive, none) between styrene exposure and NMRD-related morbidity and mortality. RESULTS: We analyzed 55 articles and two unpublished case reports. Ten OB cases and eight asthma cases were identified. Six (75%) asthma cases had abnormal styrene inhalation challenges. Thirteen (87%) of 15 cross-sectional studies and 12 (50%) of 24 mortality studies provided at least suggestive evidence that styrene was associated with NMRD-related morbidity or mortality. Six (66%) of nine mortality studies assessing chronic obstructive pulmonary disease-related mortality indicated excess mortality. CONCLUSIONS: Available evidence suggests styrene exposure is a potential risk factor for NMRD. Additional studies of styrene-exposed workers are warranted.
BACKGROUND: We evaluated cancer incidence in a cohort of polychlorinated biphenyl (PCB) exposed workers. METHODS: Incident cancers, identified using state registries, were compared to those in a national population using standardized incidence ratios. Trends in prostate cancer incidence with cumulative PCB exposure were evaluated using standardized rate ratios and Cox regression models. For selected sites, cumulative PCB exposure was compared between aggressive (fatal/distant stage) and localized/regional cancers. RESULTS: We identified 3,371 invasive first primary cancer diagnoses among 21,317 eligible workers through 2007. Overall relative incidence was reduced. Elevations were observed only for respiratory cancers and, among women, urinary organ cancers. Among men, prostate cancer incidence was reduced and not associated with cumulative PCB exposure, although median exposures were significantly higher for aggressive compared with localized/regional prostate cancers. CONCLUSION: Previously observed associations between cumulative PCB exposure and prostate cancer mortality were not confirmed in this analysis; prostate cancer stage at diagnosis may explain the discrepancy.
Dust containing crystalline silica is common in mining environments in the U.S. and around the world. The exposure to respirable crystalline silica remains an important occupational issue and it can lead to the development of silicosis and other respiratory diseases. Little has been done with regard to the characterization of the crystalline silica content of specific particle sizes of mine-generated dust. Such characterization could improve monitoring techniques and control technologies for crystalline silica, decreasing worker exposure to silica and preventing future incidence of silicosis. Three gold mine dust samples were aerosolized in a laboratory chamber. Particle size-specific samples were collected for gravimetric analysis and for quantification of silica using the Microorifice Uniform Deposit Impactor (MOUDI). Dust size distributions were characterized via aerodynamic and scanning mobility particle sizers (APS, SMPS) and gravimetrically via the MOUDI. Silica size distributions were constructed using gravimetric data from the MOUDI and proportional silica content corresponding to each size range of particles collected by the MOUDI, as determined via X-ray diffraction and infrared spectroscopic quantification of silica. Results indicate that silica does not comprise a uniform proportion of total dust across all particle sizes and that the size distributions of a given dust and its silica component are similar but not equivalent. Additional research characterizing the silica content of dusts from a variety of mine types and other occupational environments is necessary in order to ascertain trends that could be beneficial in developing better monitoring and control strategies.
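The silica size distribution described above is built by weighting each MOUDI stage's gravimetric dust mass by that stage's silica fraction from X-ray diffraction/infrared quantification. A minimal sketch of that bookkeeping, using hypothetical stage cut-points, masses, and silica fractions (not the study's measurements):

```python
# Hypothetical MOUDI stage data: aerodynamic cut-point (um),
# collected dust mass (mg), and silica mass fraction for that stage.
stages = [
    (0.56, 0.12, 0.05),
    (1.0,  0.30, 0.08),
    (1.8,  0.55, 0.12),
    (3.2,  0.80, 0.15),
    (5.6,  0.60, 0.10),
    (10.0, 0.40, 0.07),
]

total_dust = sum(mass for _, mass, _ in stages)
silica_masses = [(d50, mass * frac) for d50, mass, frac in stages]
total_silica = sum(silica for _, silica in silica_masses)

# Per-stage share of total silica mass: the silica size distribution.
for d50, silica in silica_masses:
    share = 100 * silica / total_silica
    print(f"stage d50 {d50:5.2f} um: {silica:.3f} mg silica ({share:.1f}%)")

# The overall silica fraction differs from the per-stage fractions,
# consistent with silica not being uniform across particle sizes.
print(f"overall silica fraction: {total_silica / total_dust:.3f}")
```

Because the stage fractions differ, the dust and silica size distributions in this sketch are similar in shape but not equivalent, which is the pattern the abstract reports.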
Mine Safety and Health Administration (MSHA) regulations require underground coal mines to install refuge alternatives (RAs). In the event of a disaster, RAs must be able to provide a breathable air environment for 96 h. The interior environment of an occupied RA, however, may become hot and humid during the 96 h due to the miners’ metabolic heat and heat from the carbon dioxide scrubbing system. The internal heat and humidity may result in miners suffering heat stress or even death. To investigate heat and humidity buildup within an occupied RA, the National Institute for Occupational Safety and Health (NIOSH) conducted testing on a ten-person, tent-type training RA in its Safety Research Coal Mine (SRCM), in a test area that was isolated from the mine ventilation system. The test results showed that the average measured air temperature within the RA increased by 11.4 °C (20.5 °F) and the relative humidity approached 90% RH. The test results were used to benchmark a thermal simulation model of the tested RA. The validated thermal simulation model predicted the average air temperature inside the RA at the end of 96 h to within 0.6 °C (1.1 °F) of the measured average air temperature.
Infection by the rat lungworm Angiostrongylus cantonensis is the most common cause of infectious eosinophilic meningitis in humans, producing central nervous system (CNS) angiostrongyliasis. Most CNS angiostrongyliasis cases have been described in Asia, the Pacific Basin, Australia, and limited parts of Africa and the Americas. CNS angiostrongyliasis has been reported in the Caribbean but never in the Lesser Antilles. The primary objectives of this study were to describe the first case of CNS angiostrongyliasis in the Lesser Antilles and to investigate the environmental presence of A. cantonensis in Guadeloupe, Lesser Antilles. In December 2013, a suspected case of CNS angiostrongyliasis in an 8-month-old infant in Guadeloupe was investigated by real-time polymerase chain reaction (PCR) testing of cerebrospinal fluid (CSF). The environmental investigation was performed by collecting Achatina fulica molluscs from different parts of Guadeloupe and testing for the presence of A. cantonensis by real-time PCR. CSF from the suspected case of angiostrongyliasis was positive for A. cantonensis by real-time PCR. Of the 34 snails collected for the environmental investigation, 32.4% were positive for A. cantonensis. In conclusion, we report the first laboratory-confirmed case of CNS angiostrongyliasis in the Lesser Antilles. We identified the presence and high prevalence of A. cantonensis in A. fulica in Guadeloupe. These results highlight the need to increase awareness of this disease and implement public health programs in the region to prevent human cases of angiostrongyliasis and improve management of eosinophilic meningitis patients.
Leishmaniasis in humans is caused by Leishmania spp. in the subgenera Leishmania and Viannia. Species identification often has clinical relevance. Until recently, our laboratory relied on conventional PCR amplification of the internal transcribed spacer 2 (ITS2) region (ITS2-PCR) followed by sequencing analysis of the PCR product to differentiate Leishmania spp. Here we describe a novel real-time quantitative PCR (qPCR) approach based on SYBR green technology (LSG-qPCR), which uses genus-specific primers that target the ITS1 region and amplify DNA from at least 10 Leishmania spp., followed by analysis of the melting temperature (Tm) of the amplicons on qPCR platforms (the Mx3000P qPCR system [Stratagene-Agilent] and the 7500 real-time PCR system [ABI Life Technologies]). We initially evaluated the assay by testing reference Leishmania isolates and comparing the results with those from the conventional ITS2-PCR approach. We then compared the results from the real-time and conventional molecular approaches for clinical specimens from 1,051 patients submitted to the reference laboratory of the Centers for Disease Control and Prevention for Leishmania diagnostic testing. Specimens from 477 patients tested positive for Leishmania spp. with the LSG-qPCR assay; specimens from 465 of these 477 patients also tested positive with the conventional ITS2-PCR approach, and specimens from 10 of these 465 patients had positive results because of retesting prompted by LSG-qPCR positivity. On the basis of the Tm values of the LSG-qPCR amplicons from reference and clinical specimens, we were able to differentiate four groups of Leishmania parasites: the Viannia subgenus in aggregate; the Leishmania (Leishmania) donovani complex in aggregate; the species L. (L.) tropica; and the species L. (L.) mexicana, L. (L.) amazonensis, L. (L.) major, and L. (L.) aethiopica in aggregate.
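The melting-curve step in an assay like the one above amounts to mapping each amplicon's Tm to one of a small number of species groups. The sketch below shows that lookup logic with entirely hypothetical Tm windows; the abstract does not report the assay's actual Tm values, which in practice also vary by qPCR platform.

```python
# Hypothetical Tm windows (degrees C) for the four groups the LSG-qPCR
# assay distinguishes; real cut-offs would be established empirically.
TM_GROUPS = [
    ((78.0, 80.0), "Viannia subgenus"),
    ((80.0, 82.0), "L. (L.) donovani complex"),
    ((82.0, 83.5), "L. (L.) tropica"),
    ((83.5, 86.0), "L. (L.) mexicana/amazonensis/major/aethiopica"),
]

def classify_by_tm(tm: float) -> str:
    """Map an amplicon melting temperature to a Leishmania group."""
    for (lo, hi), group in TM_GROUPS:
        if lo <= tm < hi:
            return group
    return "indeterminate"

print(classify_by_tm(81.2))
```

A Tm outside every window returns "indeterminate", reflecting that a melt-curve assay can only resolve groups whose amplicon Tm values are well separated.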
An estimated 50 million persons worldwide are infected with cysticerci, the larval forms of the Taenia solium tapeworm. Neurocysticercosis can cause seizures, epilepsy, and hydrocephalus, and fatal cases have been reported in the United States in immigrants and in travelers returning from endemic countries. Pregnant women with symptomatic neurocysticercosis present treatment challenges, whereas those with the adult tapeworm infection (i.e., taeniasis) can put their infants and other family members, as well as obstetrician-gynecologists and their staff, at risk for cysticercosis. A questionnaire developed by the American College of Obstetricians and Gynecologists was sent to a representative sample of 1,000 physicians to assess their awareness of T. solium infection and the potential for it to be encountered in an obstetrics and gynecology setting. In total, 31.4% of respondents correctly answered that taeniasis is caused by eating undercooked pork containing T. solium cysts (95% confidence interval [CI] = 26.6-36.5). While only 14.5% (95% CI = 11.0-18.6) of respondents correctly answered that cysticercosis is acquired by ingesting tapeworm eggs shed in human stools, twice that number (30.3%; 95% CI = 25.5-35.3) correctly answered that a mother with taeniasis can cause cysticercosis in her infant. Practicing in a state in which cysticercosis was reportable at the time of the survey was not significantly associated with answering any of the 12 knowledge questions correctly. Overall, knowledge of T. solium infection among U.S. obstetrician-gynecologists is limited. This may result in missed opportunities to diagnose and treat pregnant women with taeniasis, which may put family members and obstetrics clinical staff at risk for cysticercosis.
The emergence of Plasmodium falciparum (Pf) resistance to artemisinin in Southeast Asia threatens malaria control and elimination activities worldwide. Multiple polymorphisms in the Pf kelch gene found on chromosome 13 (Pfk13) have been associated with artemisinin resistance. Surveillance of potential drug resistance loci within a population that may emerge under increasing drug pressure is an important public health activity. In this context, P. falciparum infections from an observational surveillance study in Senegal were genotyped using targeted amplicon deep sequencing (TADS) for Pfk13 polymorphisms. The results were compared to previously reported Pfk13 polymorphisms from around the world. A total of 22 Pfk13 propeller domain polymorphisms were identified in this study, of which 12 have not previously been reported. Interestingly, all 10 of the polymorphisms identified in the present study at previously reported codon positions involved a different amino acid substitution at those positions. Most of the polymorphisms were present at low frequencies and were confined to single isolates, suggesting they are likely transient polymorphisms that are part of naturally evolving parasite populations. The results of this study underscore the need to identify potential drug resistance loci existing within a population, which may emerge under increasing drug pressure.
In the United States, animal contact exhibits, such as petting zoos and agricultural fairs, have been sources of zoonotic infections, including infections with Escherichia coli, Salmonella, and Cryptosporidium. The National Association of State Public Health Veterinarians recommends handwashing after contact with animals as an effective measure to prevent disease transmission at these exhibits. This report provides a list of states that have used law, specifically statutes and regulations, as public health interventions to increase hand sanitation at animal contact exhibits. The report is based on an assessment conducted by CDC’s Public Health Law Program, in collaboration with the Division of Foodborne, Waterborne, and Environmental Diseases in CDC’s National Center for Emerging and Zoonotic Infectious Diseases. The assessment found that seven states have used statutes or regulations to require hand sanitation stations at these exhibits (5). Jurisdictions seeking to improve rates of hand sanitation at animal contact exhibits can use this report as a resource in developing their own legal interventions.
CONTEXT: Partnerships are emerging as critically important vehicles for addressing health in local communities. Coalitions involving local health departments can be viewed as the embodiment of a local public health system. Although it is known that these networks are heavily involved in assessment and community planning activities, limited studies have evaluated whether health coalitions are functioning at an optimal capacity. OBJECTIVE: This study assesses the extent to which health coalitions met or exceeded expectations for building functional capacity within their respective networks. DESIGN: An evaluative framework was developed focusing on 8 functional characteristics of coalitions previously identified by Erwin and Mills. Twenty-nine indicators were identified that served as “proxy” measures of functional capacity within health coalitions. SETTING AND PARTICIPANTS: Ninety-three County Health Councils (CoHCs) in Tennessee. MAIN OUTCOME MEASURE(S): Diverse member representation; formal rules, roles, and procedures; open, frequent interpersonal communication; task-focused climate; council leadership; resources; active member participation; and external linkages were assessed to determine the level of functionality of CoHCs. Scores across all CoHCs were analyzed using descriptive statistics such as frequency distributions, measures of central tendency, and measures of variability. Data were analyzed using SAS 9.3. RESULTS: Among the 68 responding CoHCs (73% response rate), the total mean score for the level of functional characteristics was 30.5 (median = 30.5; SD = 6.3; range, 18-44). Of the 8 functional characteristics, CoHCs met or exceeded all indicators associated with council leadership, task-focused climate, and external linkages. Lowest scores were for having a written communications plan, written priorities or goals, and opportunities for training.
CONCLUSION: This study advances the research on health coalitions by establishing a process for quantifying the functionality of health coalitions. Future studies will be conducted to examine the association between health coalition functional capacity, local health departments’ community health assessment and planning efforts, and changes in community health status.
PURPOSE: To explore perceptions of facilitators/barriers to sexual and reproductive health (SRH) care use among an urban sample of African-American and Hispanic young men aged 15-24 years, including sexual minorities. METHODS: Focus groups were conducted between April 2013 and May 2014 in one mid-Atlantic U.S. city. Young men aged 15-24 years were recruited from eight community settings to participate in 12 groups. A moderator guide explored facilitators/barriers to SRH care use. A brief pregroup self-administered survey assessed participants’ sociodemographics and SRH information sources. Content analysis was conducted, and three investigators independently verified the themes that emerged. RESULTS: Participants included 70 males: 70% were aged 15-19 years, 66% African-American, 34% Hispanic, 83% heterosexual, and 16% gay/bisexual. Results indicated young men’s perceptions of facilitators/barriers to their SRH care use come from multiple levels of their socioecology, including cultural, structural, social, and personal contexts, and dynamic inter-relationships existed across contexts. A health care culture focused on women’s health and traditional masculinity scripts provided an overall background. Structural level concerns included cost, long visits, and confidentiality; social level concerns included stigma of being seen by community members and needs regarding health care provider interactions; and personal level concerns included self-risk assessments on decisions to seek care and fears/anxieties about sexually transmitted infection/HIV testing. Young men also discussed that SRH care help-seeking sometimes involved family and/or other social network members, and they expressed needs related to patient-provider interactions about SRH care. CONCLUSIONS: Study findings provide a foundation for better understanding young men’s SRH care use and considering ways to engage them in care.
Rabies, resulting from infection by Rabies virus (RABV) and related lyssaviruses, is one of the most deadly zoonotic diseases and is responsible for up to 70,000 estimated human deaths worldwide each year. Rapid and accurate laboratory diagnosis of rabies is essential for timely administration of post-exposure prophylaxis in humans and control of the disease in animals. Currently, only the direct fluorescent antibody (DFA) test is recommended for routine rabies diagnosis. Reverse-transcription polymerase chain reaction (RT-PCR) based diagnostic methods have been widely adopted for the diagnosis of other viral pathogens, but there is currently no widely accepted rapid real-time RT-PCR assay for the detection of all lyssaviruses. In this study, we demonstrate the validation of a newly developed multiplex real-time RT-PCR assay named LN34, which uses a combination of degenerate primers and probes along with probe modifications to achieve superior coverage of the Lyssavirus genus while maintaining sensitivity and specificity. The primers and probes of the LN34 assay target the highly conserved non-coding leader region and part of the nucleoprotein (N) coding sequence of the Lyssavirus genome to maintain assay robustness. The probes were further modified by locked nucleotides to increase their melting temperature to meet the requirements for an optimal real-time RT-PCR assay. The LN34 assay was able to detect all RABV variants and other lyssaviruses in a validation panel that included representative RABV isolates from most regions of the world as well as representatives of 13 additional Lyssavirus species. The LN34 assay was successfully used for both ante-mortem and post-mortem diagnosis of over 200 clinical samples as well as field-derived surveillance samples.
This assay represents a major improvement over previously published rabies specific RT-PCR and real-time RT-PCR assays because of its ability to universally detect RABV and other lyssaviruses, its high throughput capability and its simplicity of use, which can be quickly adapted in a laboratory to enhance the capacity of rabies molecular diagnostics. The LN34 assay provides an alternative approach for rabies diagnostics, especially in rural areas and rabies endemic regions that lack the conditions and broad experience required to run the standard DFA assay.
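Degenerate primers like those the LN34 assay uses encode several concrete sequences in one oligonucleotide via IUPAC ambiguity codes (e.g., R = A or G), which is how a single primer set can cover a diverse genus. A small sketch of expanding a degenerate sequence into all the concrete sequences it matches; the example primer is made up for illustration, not the published LN34 sequence.

```python
from itertools import product

# IUPAC nucleotide ambiguity codes and the bases each one matches.
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "S": "GC", "W": "AT",
    "K": "GT", "M": "AC", "B": "CGT", "D": "AGT",
    "H": "ACT", "V": "ACG", "N": "ACGT",
}

def expand_degenerate(primer: str) -> list[str]:
    """List every concrete sequence a degenerate primer can match."""
    return ["".join(bases) for bases in product(*(IUPAC[b] for b in primer))]

# An illustrative 5-mer with two degenerate positions (R and Y)
# expands to 2 x 2 = 4 concrete sequences.
variants = expand_degenerate("ACRYT")
print(len(variants), variants)
```

Each degenerate position multiplies the number of concrete oligonucleotides in the synthesized primer pool, so assay designers balance genus-wide coverage against pool complexity.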
DISCLAIMER: Articles listed in the CDC Knowledge to Action Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article’s inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention, nor does it imply endorsement of the article’s methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinions, findings, and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.