

CDC Science Clips: Volume 9, Issue 30, August 1, 2017

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention Score to track social and mainstream media mentions.

  1. Top Articles of the Week

    Selected weekly by a senior CDC scientist from the standard sections listed below.

    The names of CDC authors are indicated in bold text.
    • Communicable Diseases
      • Prevalence of enteric infections among hospitalized patients in two referral hospitals in Ghana
        Akuffo R, Armah G, Clemens M, Kronmann KC, Jones AH, Agbenohevi P, Sagoe K, Puplampu N, Talla Nzussouo N, Ampofo W, Koram K, Duplessis C, Dueger E.
        BMC Res Notes. 2017 Jul 17;10(1):292.
        BACKGROUND: Diarrhea is an important cause of morbidity and mortality worldwide. In Africa and Ghana in particular, it is estimated to contribute directly to 19 and 25% of mortality among children under 5 years, respectively. METHODS: Surveillance for hospitalized acute diarrheal illness was conducted from November 2010 through October 2012 in a referral hospital in southern Ghana and a teaching hospital in northern Ghana. Consenting hospitalized patients who met a standardized case definition for acute diarrheal illness provided demographic and epidemiologic data. Stool samples were collected and tested by culture for bacteria and by enzyme immunoassays for a panel of viruses and parasites. RESULTS: A total of 429 patients were enrolled; 216 (50.3%) were under 5 years, and 221 (51.5%) were females. Stool samples were received from 153 patients. Culture isolates included Shigella sp., Salmonella spp., Plesiomonas sp. and Vibrio cholerae. Of 147 samples tested for viruses, 41 (27.9%) were positive for rotaviruses, 11 (7.5%) for astroviruses, 10 (6.8%) for noroviruses, and 8 (5.4%) for adenoviruses. Of 116 samples tested for parasitic infections, 4 (3.4%) were positive for Cryptosporidium sp. and 3 (2.6%) for Giardia lamblia. Of the enrolled patients, 78.8% had taken antibiotics prior to sample collection. CONCLUSIONS: Diarrheal pathogens were identified across all ages but predominantly (81%) in children under 5 years of age. This study also detected high antibiotic use, which has the potential to increase antibiotic resistance. The most common enteric pathogen detected (49.4%) was rotavirus.

      • Harshbarger C, Taylor O, Uhrig JD, Lewis MA.
        J Commun Healthc. 2017 :1-8.
        HIV prevention efforts are increasingly aimed at engaging people living with HIV (PLWH) in healthcare to enhance treatment adherence and retention in care. Clinics need evidence-based interventions to support these goals and guidance on how to successfully implement these interventions in clinical settings. We describe the development of Positive Health Check (PHC) digital health intervention to support adherence and retention in care as well as a pilot implementation to determine feasibility. We developed PHC using input from seven HIV primary care providers. Over 15 months, providers gave feedback on the development of PHC by participating in nine inquiries, via online methods or webinars, addressing topics related to intervention development and implementation. After a 1-month pilot test, four providers shared their impressions of PHC implementation via a structured, open-ended interview. Providers’ comments resulted in script revisions, feedback on the filming of four virtual doctors (actors) balanced by race and gender, a user-friendly visual design, and more engaging messaging. Implementation feedback informed protocols to increase privacy and strategies to gain buy-in from clinics. Providers responded positively after using the final version of PHC, and again after the 1-month pilot implementation. Using a collaborative development approach with healthcare providers is a viable method for developing clinic-based interventions to support clinical encounters for PLWH. Intervention development should include strategies to support integrating mobile interventions into clinic workflows.

      • Demographic and ecological risk factors for human influenza A virus infections in rural Indonesia
        Root ED, Agustian D, Kartasasmita C, Uyeki TM, Simoes EA.
        Influenza Other Respir Viruses. 2017 Jul 17.
        BACKGROUND: Indonesia has the world’s highest reported mortality for human infections with highly pathogenic avian influenza (HPAI) A(H5N1) virus. Indonesia is an agriculturally-driven country where human-animal mixing is common and provides a unique environment for zoonotic influenza A virus transmission. OBJECTIVES: To identify potential demographic and ecological risk factors for human infection with seasonal influenza A viruses in rural Indonesia, a population-based study was conducted in Cileunyi and Soreang sub-districts near Bandung in western Java from 2008 to 2011. METHODS: Passive influenza surveillance with RT-PCR confirmation of influenza A viral RNA in respiratory specimens was utilized for case selection and ascertainment. A population census and mapping were utilized for population data collection. The presence of influenza A(H3N2) and A(H1N1)pdm09 virus infections in a household was modeled using Generalized Estimating Equations. RESULTS: Each additional child aged <5 years in a household increased the odds of H3N2 approximately 5 times (OR=4.59, 95%CI: 3.30-6.24) and H1N1pdm09 by 3.5 times (OR=3.53, 95%CI: 2.51-4.96). In addition, the presence of 16-30 birds in the house was associated with an increased odds of H3N2 (OR=5.08, 95%CI: 2.00-12.92) and H1N1pdm09 (OR=12.51 95%CI: 6.23-25.13). CONCLUSION: Our findings suggest an increase in influenza A virus infections in rural Indonesian villagers living in households with young children and poultry. This article is protected by copyright. All rights reserved.
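        For illustration, the household-level analysis described above (a GEE model for the presence of influenza A virus infection in a household) can be sketched as follows. This is a minimal sketch with simulated data; the column names (village_id, h3n2_present, n_under5, birds_16_30) are hypothetical placeholders, not variables from the study.

        ```python
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 500
        df = pd.DataFrame({
            "village_id": rng.integers(0, 40, n),   # clustering unit for the working correlation (hypothetical)
            "h3n2_present": rng.integers(0, 2, n),  # 1 if >=1 RT-PCR-confirmed H3N2 case in the household
            "n_under5": rng.poisson(1.0, n),        # number of children aged <5 years in the household
            "birds_16_30": rng.integers(0, 2, n),   # 1 if 16-30 birds kept in the house
        })

        # Logistic GEE with an exchangeable working correlation within clusters.
        model = smf.gee(
            "h3n2_present ~ n_under5 + birds_16_30",
            groups="village_id",
            data=df,
            family=sm.families.Binomial(),
            cov_struct=sm.cov_struct.Exchangeable(),
        )
        result = model.fit()
        print(np.exp(result.params))  # odds ratios, analogous to the ORs quoted above
        ```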

    • Disease Reservoirs and Vectors
      • Explaining variation in adult Anopheles indoor resting abundance: the relative effects of larval habitat proximity and insecticide-treated bed net use
        McCann RS, Messina JP, MacFarlane DW, Bayoh MN, Gimnig JE, Giorgi E, Walker ED.
        Malar J. 2017 Jul 17;16(1):288.
        BACKGROUND: Spatial determinants of malaria risk within communities are associated with heterogeneity of exposure to vector mosquitoes. The abundance of adult malaria vectors inside people’s houses, where most transmission takes place, should be associated with several factors: proximity of houses to larval habitats, structural characteristics of houses, indoor use of vector control tools containing insecticides, and human behavioural and environmental factors in and near houses. While most previous studies have assessed the association of larval habitat proximity in landscapes with relatively low densities of larval habitats, in this study these relationships were analysed in a region of rural, lowland western Kenya with high larval habitat density. METHODS: 525 houses were sampled for indoor-resting mosquitoes across an 8 by 8 km study area using the pyrethrum spray catch method. A predictive model of larval habitat location in this landscape, previously verified, provided derivations of indices of larval habitat proximity to houses. Using geostatistical regression models, the association of larval habitat proximity, long-lasting insecticidal nets (LLIN) use, house structural characteristics (wall type, roof type), and peridomestic variables (cooking in the house, cattle near the house, number of people sleeping in the house) with mosquito abundance in houses was quantified. RESULTS: Vector abundance was low (mean, 1.1 adult Anopheles per house). Proximity of larval habitats was a strong predictor of Anopheles abundance. Houses without an LLIN had more female Anopheles gambiae s.s., Anopheles arabiensis and Anopheles funestus than houses where some people used an LLIN (rate ratios, 95% CI 0.87, 0.85-0.89; 0.84, 0.82-0.86; 0.38, 0.37-0.40) and houses where everyone used an LLIN (RR, 95% CI 0.49, 0.48-0.50; 0.39, 0.39-0.40; 0.60, 0.58-0.61). Cooking in the house also reduced Anopheles abundance across all species. The number of people sleeping in the house, presence of cattle near the house, and house structure modulated Anopheles abundance, but the effect varied with Anopheles species and sex. CONCLUSIONS: Variation in the abundance of indoor-resting Anopheles in rural houses of western Kenya varies with clearly identifiable factors. Results suggest that LLIN use continues to function in reducing vector abundance, and that larval source management in this region could lead to further reductions in malaria risk by reducing the amount of an obligatory resource for mosquitoes near people’s homes.

    • Drug Safety
      • Safer and more appropriate opioid prescribing: a large healthcare system’s comprehensive approach
        Losby JL, Hyatt JD, Kanter MH, Baldwin G, Matsuoka D.
        J Eval Clin Pract. 2017 Jul 14.
        RATIONALE, AIMS, AND OBJECTIVES: The United States is in the midst of a public health epidemic with more than 40 people dying each day from prescription opioid overdoses. Health care systems are taking steps to address the opioid overdose epidemic by implementing policy and practice interventions to mitigate the risks of long-term opioid therapy. Kaiser Permanente Southern California launched a comprehensive initiative to transform the way that chronic pain is viewed and treated. Kaiser Permanente Southern California created prescribing and dispensing policies, monitoring and follow-up processes, and clinical coordination through electronic health record integration. The purpose of this paper is to describe the implementation of these interventions and assess the impact of this set of interventions on opioid prescribing. METHOD: The study used a retrospective pre-post evaluation design to track outcomes before and after the intervention. The study population comprised Kaiser Permanente Southern California members aged 18 and older, excluding cancer, hospice, and palliative care patients; this subpopulation of 3,203,880 members was approximately 75% of all Kaiser Permanente Southern California members. All data are from Kaiser Permanente’s pharmacy data systems and electronic health record, collected on a rolling basis as interventions were implemented from January 2010 to December 2015. RESULTS: There were reductions in all tracked outcomes: a 30% reduction in prescribing opioids at high doses; a 98% reduction in the number of prescriptions with quantities greater than 200 pills; a 90% decrease in the combination of an opioid prescription with benzodiazepines and carisoprodol; a 72% reduction in the prescribing of Long Acting/Extended Release opioids; and a 95% reduction in prescriptions of brand name opioid-acetaminophen products. In addition, methadone prescribing did not increase during this period. CONCLUSIONS: This study adds promising evidence that a comprehensive system-level strategy can positively affect opioid prescribing. The basic components of the intervention are generalizable and applicable to other health care settings.

    • Health Economics
      • Economic burden of chronic conditions among survivors of cancer in the United States
        Guy GP, Yabroff KR, Ekwueme DU, Rim SH, Li R, Richardson LC.
        J Clin Oncol. 2017 Jun 20;35(18):2053-2061.
        Purpose The prevalence of cancer survivorship and chronic health conditions is increasing. Limited information exists on the economic burden of chronic conditions among survivors of cancer. This study examines the prevalence and economic effect of chronic conditions among survivors of cancer. Methods Using the 2008 to 2013 Medical Expenditure Panel Survey, we present nationally representative estimates of the prevalence of chronic conditions (heart disease, high blood pressure, stroke, emphysema, high cholesterol, diabetes, arthritis, and asthma) and multiple chronic conditions (MCCs) and the incremental annual health care use, medical expenditures, and lost productivity for survivors of cancer attributed to individual chronic conditions and MCCs. Incremental use, expenditures, and lost productivity were evaluated with multivariable regression. Results Survivors of cancer were more likely to have chronic conditions and MCCs compared with adults without a history of cancer. The presence of chronic conditions among survivors of cancer was associated with substantially higher annual medical expenditures, especially for heart disease ($4,595; 95% CI, $3,262 to $5,927) and stroke ($3,843; 95% CI, $1,983 to $5,704). The presence of four or more chronic conditions was associated with increased annual expenditures of $10,280 (95% CI, $7,435 to $13,125) per survivor of cancer. Annual lost productivity was higher among survivors of cancer with other chronic conditions, especially stroke ($4,325; 95% CI, $2,687 to $5,964), and arthritis ($3,534; 95% CI, $2,475 to $4,593). Having four or more chronic conditions was associated with increased annual lost productivity of $9,099 (95% CI, $7,224 to $10,973) per survivor of cancer. The economic impact of chronic conditions was similar among survivors of cancer and individuals without a history of cancer. Conclusion These results highlight the importance of ensuring access to lifelong personalized screening, surveillance, and chronic disease management to help manage chronic conditions, reduce disruptions in employment, and reduce medical expenditures among survivors of cancer.

    • Laboratory Sciences
      • Evaluation of the field performance of ImmunoCard STAT!® rapid diagnostic test for rotavirus in Dadaab Refugee Camp and at the Kenya-Somalia border
        Ope M, Nyoka R, Unshur A, Oyier FO, Mowlid SA, Owino B, Ochieng SB, Okello CI, Montgomery JM, Wagacha B, Galev A, Abdow A, Esona MD, Tate J, Fitter D, Cookson ST, Arunmozhi B, Marano N.
        Am J Trop Med Hyg. 2017 Jun;96(6):1302-1306.
        Rotavirus commonly causes diarrhea in children, leading to hospitalization and even death. Rapid diagnostic tests are feasible alternatives for determining rotavirus outbreaks in refugee camps that have inadequate laboratory capacity. We evaluated the field performance of ImmunoCard STAT!® Rotavirus (ICS-RV) in Dadaab Refugee Camp and at the Kenya-Somalia border. From May to December 2014, we prospectively enrolled children aged < 5 years hospitalized with acute diarrhea, defined as ≥3 episodes of loose stool in 24 hours for < 7 days. Stool samples were collected and tested by trained surveillance clerks using ICS-RV per manufacturer’s instructions. The field performance characteristics of ICS-RV were evaluated against the gold standard test, Premier Rotaclone® enzyme immunoassay. The operational characteristics were evaluated using World Health Organization (WHO) ASSURED criteria to determine whether ICS-RV is appropriate as a point-of-care test by administering a standard questionnaire and observing surveillance clerks performing the test. We enrolled 213 patients with a median age of 10 months (range = 1-48); 58.2% were male. A total of 71 (33.3%) and 60 (28.2%) patients tested positive for rotavirus infection by immunoassay and ICS-RV, respectively. The sensitivity, specificity, and positive and negative predictive values of ICS-RV compared with the immunoassay were 83.1% (95% confidence interval [CI] = 72.3-91.0), 99.3% (95% CI = 96.1-100), 98.3% (95% CI = 91.1-100), and 92.1% (95% CI = 86.6-95.5), respectively. The ICS-RV fulfilled the WHO ASSURED criteria for point-of-care testing. ICS-RV is a field-ready point-of-care test with good field performance and operational characteristics. It can be useful in determining rotavirus outbreaks in resource-limited settings.
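        As a reference for how the accuracy figures above are derived, the sketch below computes sensitivity, specificity, and predictive values from a 2x2 table. The cell counts are approximate reconstructions consistent with the reported totals and percentages (71 EIA-positive, 60 ICS-RV-positive, 213 tested); they are not taken directly from the paper.

        ```python
        # Minimal sketch: diagnostic accuracy from a 2x2 table (index test vs. gold standard).
        # Cell counts below are approximate reconstructions, not figures from the paper.
        def diagnostic_metrics(tp, fp, fn, tn):
            return {
                "sensitivity": tp / (tp + fn),  # ICS-RV positive among EIA positives
                "specificity": tn / (tn + fp),  # ICS-RV negative among EIA negatives
                "ppv": tp / (tp + fp),          # EIA positive among ICS-RV positives
                "npv": tn / (tn + fn),          # EIA negative among ICS-RV negatives
            }

        print(diagnostic_metrics(tp=59, fp=1, fn=12, tn=141))
        # approximately: sensitivity 0.83, specificity 0.99, ppv 0.98, npv 0.92
        ```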

    • Occupational Safety and Health – Mining
      • Haas EJ, Cecala AB.
        Min Eng. 2017 ;69(7):105-109.
        Personal respirable dust sampling and the evaluation of control technologies have been providing exposure information to the mining industry but not necessarily in a way that shows how technology can be integrated to provide organizational support and resources for workers to mitigate dust sources on site. In response, the U.S. National Institute for Occupational Safety and Health (NIOSH) used previously developed Helmet-CAM technology to design and engage in a behavioral/engineering cooperative intervention to initiate and enhance mine site conversations about the risks and potential occurrences of respirable silica dust exposures on the job as well as provide impetus and solutions for mitigating higher sources of dust. The study involved 48 workers from five mine sites, who agreed to participate between April 2015 and September 2016. Using the Helmet-CAM in this series of longitudinal interventions revealed several exposure trends in respirable silica dust sources and, in many cases, simple quick-fix strategies to reduce their sources. This paper focuses on several specific identified sources of dust that were elevated but could be reduced through basic engineering fixes, low-cost resources, and supportive communication from management to remind and engage workers in protective work practices.

    • Parasitic Diseases
      • Recurrence of Guinea worm disease in Chad after a 10-year absence: Risk factors for human cases identified in 2010-2011
        Sreenivasan N, Weiss A, Djiatsa JP, Toe F, Djimadoumaji N, Ayers T, Eberhard M, Ruiz-Tiben E, Roy S.
        Am J Trop Med Hyg. 2017 Jun 12.
        A decade after reporting its last case of Guinea worm disease (GWD), a waterborne parasitic disease targeted for eradication, Chad reported 20 confirmed human cases from 17 villages: 10 cases in 2010 and 10 in 2011. In 2012, the first GWD dog infections were diagnosed. We conducted a case-control study during April-May 2012 to identify human transmission risk factors and epidemiologic links. We recruited 19 cases and 45 controls matched by age, sex, time, and location of exposure based on the case patients’ periods of infection 10-14 months earlier. Data were analyzed with simple conditional logistic regression models using Firth penalized likelihood methods. Unusually, GWD did not appear to be associated with household primary water sources. Instead, secondary water sources (those used outside the village or other nonprimary sources used at home) were risk factors (matched odds ratio = 38.1, 95% confidence interval = 1.6-728.2). This study highlights the changing epidemiology of GWD in Chad: household primary water sources were not identified as risk factors, and few epidemiologic links were identified between the handful of sporadic cases per year, a trend that continues. Since this investigation, annual dog infections have increased, far surpassing human cases. An aquatic paratenic host is a postulated mode of transmission for both dogs and humans, although fish could not be assessed in this case-control study due to their near-universal consumption. GWD’s evolving nature in Chad underscores the continued need for interventions to prevent both waterborne and potential foodborne transmission until the true mechanism is established.

    • Zoonotic and Vectorborne Diseases
      • Ecology of filoviruses
        Amman BR, Swanepoel R, Nichol ST, Towner JS.
        Curr Top Microbiol Immunol. 2017 Jul 15.
        Filoviruses can cause severe and often fatal disease in humans. To date, there have been 47 outbreaks resulting in more than 31,500 cases of human illness and over 13,200 reported deaths. Since their discovery, researchers from many scientific disciplines have worked to better understand the natural history of these deadly viruses. Citing original research wherever possible, this chapter reviews laboratory and field-based studies on filovirus ecology and summarizes efforts to identify where filoviruses persist in nature, how virus is transmitted to other animals and ultimately, what drivers cause spillover to human beings. Furthermore, this chapter discusses concepts on what constitutes a reservoir host and highlights challenges encountered while conducting research on filovirus ecology, particularly field-based investigations.

      • Evaluation of a spotted fever group Rickettsia public health surveillance system in Tennessee
        Fill MA, Moncayo AC, Bloch KC, Dunn JR, Schaffner W, Jones TF.
        Am J Trop Med Hyg. 2017 Jul 03.
        Spotted fever group (SFG) rickettsioses are endemic in Tennessee, with approximately 2,500 cases reported during 2000-2012. Because of this substantial burden of disease, we performed a three-part evaluation of Tennessee’s routine surveillance for SFG rickettsioses cases and deaths to assess the system’s effectiveness. Tennessee Department of Health (TDH) SFG rickettsioses surveillance records were matched to three patient series: 1) patients with positive serologic specimens from a commercial reference laboratory during 2010-2011, 2) tertiary medical center patients with positive serologic tests during 2007-2013, and 3) patients identified from death certificates issued during 1995-2014 with SFG rickettsiosis-related causes of death. Chart reviews were performed and patients were classified according to the Council of State and Territorial Epidemiologists’ case definition. Of 254 SFG Rickettsia-positive serologic specimens from the reference laboratory, 129 (51%) met the case definition for confirmed or probable cases of rickettsial disease after chart review. The sensitivity of the TDH surveillance system to detect cases was 45%. Of the 98 confirmed or probable cases identified from the medical center, the sensitivity of the TDH surveillance system to detect cases was 34%. Of 27 patients identified by death certificates, 12 (44%) were classified as confirmed or probable cases; four (33%) were reported to TDH, but none were correctly identified as deceased. Cases of SFG rickettsioses were underreported and fatalities not correctly identified. Efforts are needed to improve SFG rickettsiosis surveillance in Tennessee.

  2. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Chronic Diseases and Conditions
      1. Ascertainment of Alaska native stroke incidence, 2005-2009: Lessons for assessing the global burden of stroke
        Boden-Albala B, Allen J, Roberts ET, Bulkow L, Trimble B.
        J Stroke Cerebrovasc Dis. 2017 Jul 14.
        BACKGROUND: Stroke is a critical public health issue in the United States and globally. System models to optimally capture stroke incidence in rural and culturally diverse communities are needed. The epidemiological transition to a western lifestyle has been associated with an increased burden of vascular risk factors among Alaska Native (AN) people. The burden of stroke in AN communities remains understudied. METHODS: The Alaska Native Stroke Registry (ANSR) was designed to screen and capture all stroke cases between 2005 and 2009 through its integration into the existing single-payer Alaska Tribal Health System infrastructure. Registry staff received notification each time stroke International Classification of Diseases, Ninth Revision codes (430-436) were initiated anywhere in the system. Trained chart abstractors reviewed medical records to document incident strokes among AN patients, which were adjudicated. RESULTS: Between October 2005 and October 2009, over 2100 alerts were screened, identifying 514 unique stroke cases, of which 372 were incident strokes. The average annual incidence of stroke (per 100,000) among AN adults was 190.6: 219.2 in men and 164.7 in women. Overall, the ischemic stroke incidence rate was 148.5 per 100,000, with men (184.6) having higher ischemic rates per 100,000 than women (118.3). Men had higher rates of ischemic stroke at all ages, whereas women over the age of 75 years experienced higher rates of hemorrhagic stroke. CONCLUSIONS: We report a high rate of overall stroke, 190.6 per 100,000. The ANSR methods and findings have implications for other indigenous populations and for global health populations currently undergoing similar epidemiological transitions.
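        The crude average annual incidence quoted above is a rate per 100,000 person-years. The sketch below only illustrates the arithmetic; the population denominator is an assumed round figure chosen to make the calculation concrete, not the registry's actual population count.

        ```python
        # Minimal sketch of the rate arithmetic: incident cases / (population x years) x 100,000.
        incident_strokes = 372             # incident cases reported above
        years_of_surveillance = 4          # October 2005 - October 2009
        assumed_adult_population = 48_800  # hypothetical denominator, NOT taken from the paper

        rate_per_100k = incident_strokes / (assumed_adult_population * years_of_surveillance) * 100_000
        print(round(rate_per_100k, 1))     # ~190.6 per 100,000 with this assumed denominator
        ```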

      2. An American Thoracic Society/National Heart, Lung, and Blood Institute Workshop Report: Addressing respiratory health equality in the United States
        Celedon JC, Burchard EG, Schraufnagel D, Castillo-Salgado C, Schenker M, Balmes J, Neptune E, Cummings KJ, Holguin F, Riekert KA, Wisnivesky JP, Garcia JG, Roman J, Kittles R, Ortega VE, Redline S, Mathias R, Thomas A, Samet J, Ford JG.
        Ann Am Thorac Soc. 2017 May;14(5):814-826.
        Health disparities related to race, ethnicity, and socioeconomic status persist and are commonly encountered by practitioners of pediatric and adult pulmonary, critical care, and sleep medicine in the United States. To address such disparities and thus progress toward equality in respiratory health, the American Thoracic Society and the National Heart, Lung, and Blood Institute convened a workshop in May of 2015. The workshop participants addressed health disparities by focusing on six topics, each of which concluded with a panel discussion that proposed recommendations for research on racial, ethnic, and socioeconomic disparities in pulmonary, critical care, and sleep medicine. Such recommendations address best practices to advance research on respiratory health disparities (e.g., characterize broad ethnic groups into subgroups known to differ with regard to a disease of interest), risk factors for respiratory health disparities (e.g., study the impact of new tobacco or nicotine products on respiratory diseases in minority populations), addressing equity in access to healthcare and quality of care (e.g., conduct longitudinal studies of the impact of the Affordable Care Act on respiratory and sleep disorders), the impact of personalized medicine on disparities research (e.g., implement large studies of pharmacogenetics in minority populations), improving design and methodology for research studies in respiratory health disparities (e.g., use study designs that reduce participants’ burden and foster trust by engaging participants as decision-makers), and achieving equity in the pulmonary, critical care, and sleep medicine workforce (e.g., develop and maintain robust mentoring programs for junior faculty, including local and external mentors). Addressing these research needs should advance efforts to reduce, and potentially eliminate, respiratory, sleep, and critical care disparities in the United States.

      3. Looking again at the Look AHEAD study
        Gregg EW, Wing R.
        Lancet Diabetes Endocrinol. 2017 Jul 12.

        [No abstract]

      4. Early-life farm exposures and adult asthma and atopy in the Agricultural Lung Health Study
        House JS, Wyss AB, Hoppin JA, Richards M, Long S, Umbach DM, Henneberger PK, Beane Freeman LE, Sandler DP, Long O’Connell E, Barker-Cummings C, London SJ.
        J Allergy Clin Immunol. 2017 Jul;140(1):249-256.e14.
        BACKGROUND: Previous studies, mostly from Europe, suggest that early-life farming exposures protect against childhood asthma and allergy; few data exist on asthma and allergy in adults. OBJECTIVE: We sought to examine associations between early-life farming exposures and current asthma and atopy in an older adult US farming population. METHODS: We analyzed data from 1746 farmers and 1555 spouses (mean age, 63) from a case-control study nested within the Agricultural Health Study. Current asthma and early-life farming exposures were assessed via questionnaires. We defined atopy based on specific IgE > 0.70 IU/mL to at least 1 of 10 allergens measured in blood. We used logistic regression, adjusted for age, sex, race, state (Iowa or North Carolina), and smoking (pack years), to estimate associations between early-life exposures and asthma (1198 cases and 2031 noncases) or atopy (578 cases and 2526 noncases). RESULTS: Exposure to the farming environment in utero and in early childhood had little or no association with asthma but was associated with reduced odds of atopy. The strongest association was seen for having a mother who performed farm activities while pregnant (odds ratio, 0.60; 95% CI, 0.48-0.74) and remained significant in models with correlated early-life exposures including early childhood farm animal contact and raw milk consumption. CONCLUSIONS: In a large US farming population, early-life farm exposures, particularly maternal farming activities while pregnant, were strongly associated with reduced risk of atopy in adults. These results extend previous work done primarily on childhood outcomes and suggest that protective associations of early-life farming exposures on atopy endure across the life course.

    • Communicable Diseases
      1. Using data from the National HIV Surveillance System, we examined HIV infections diagnosed between 2010 and 2015 attributed to heterosexual contact with partners previously known to be HIV infected. More than four in 10 HIV infections among heterosexual men and five in 10 HIV infections among heterosexual women were attributed to this group. Findings may inform the prioritization of prevention and care efforts and resource allocation modeling for reducing new HIV infection among discordant partnerships.

      2. Notes from the field: Cluster of acute flaccid myelitis in five pediatric patients – Maricopa County, Arizona, 2016
        Iverson SA, Ostdiek S, Prasai S, Engelthaler DM, Kretschmer M, Fowle N, Tokhie HK, Routh J, Sejvar J, Ayers T, Bowers J, Brady S, Rogers S, Nix WA, Komatsu K, Sunenshine R.
        MMWR Morb Mortal Wkly Rep. 2017 Jul 21;66(28):758-760.

        [No abstract]

      3. McCollum ED, Park DE, Watson NL, Buck WC, Bunthi C, Devendra A, Ebruke BE, Elhilali M, Emmanouilidou D, Garcia-Prats AJ, Githinji L, Hossain L, Madhi SA, Moore DP, Mulindwa J, Olson D, Awori JO, Vandepitte WP, Verwey C, West JE, Knoll MD, O’Brien KL, Feikin DR, Hammit LL.
        BMJ Open Respir Res. 2017 ;4(1).
        Introduction Paediatric lung sound recordings can be systematically assessed, but methodological feasibility and validity are unknown, especially from developing countries. We examined the performance of acoustically interpreting recorded paediatric lung sounds and compared sound characteristics between cases and controls. Methods Pneumonia Etiology Research for Child Health staff in six African and Asian sites recorded lung sounds with a digital stethoscope in cases and controls. Cases aged 1-59 months had WHO severe or very severe pneumonia; age-matched community controls did not. A listening panel assigned examination results of normal, crackle, wheeze, crackle and wheeze or uninterpretable, with adjudication of discordant interpretations. Classifications were recategorised into any crackle, any wheeze or abnormal (any crackle or wheeze) and primary listener agreement (first two listeners) was analysed among interpretable examinations using the prevalence-adjusted, bias-adjusted kappa (PABAK). We examined predictors of disagreement with logistic regression and compared case and control lung sounds with descriptive statistics. Results Primary listeners considered 89.5% of 792 case and 92.4% of 301 control recordings interpretable. Among interpretable recordings, listeners agreed on the presence or absence of any abnormality in 74.9% (PABAK 0.50) of cases and 69.8% (PABAK 0.40) of controls, presence/absence of crackles in 70.6% (PABAK 0.41) of cases and 82.4% (PABAK 0.65) of controls and presence/absence of wheeze in 72.6% (PABAK 0.45) of cases and 73.8% (PABAK 0.48) of controls. Controls, tachypnoea, >3 uninterpretable chest positions, crying, upper airway noises and study site predicted listener disagreement. Among all interpretable examinations, 38.0% of cases and 84.9% of controls were normal (p<0.0001); wheezing was the most common sound (49.9%) in cases. Conclusions Listening panel and case-control data suggest our methodology is feasible, likely valid and that small airway inflammation is common in WHO pneumonia. Digital auscultation may be an important future pneumonia diagnostic in developing countries.
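        For two raters and a binary classification, PABAK reduces to 2 × observed agreement − 1, so the kappa values above can be checked directly from the reported agreement percentages. The snippet below only illustrates that arithmetic; it is not code from the study.

        ```python
        # Minimal sketch: prevalence-adjusted, bias-adjusted kappa (PABAK) for two raters
        # and two categories is 2 * observed agreement - 1.
        def pabak(observed_agreement):
            return 2.0 * observed_agreement - 1.0

        for label, agreement in [("any abnormality, cases", 0.749),
                                 ("any abnormality, controls", 0.698),
                                 ("crackles, controls", 0.824)]:
            print(f"{label}: PABAK = {pabak(agreement):.2f}")
        # any abnormality, cases: PABAK = 0.50
        # any abnormality, controls: PABAK = 0.40
        # crackles, controls: PABAK = 0.65
        ```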

      4. Exploring factors associated with declining HIV diagnoses among African American females
        McCree DH, Jeffries 4th W, Beer L, Gant Z, Elmore K, Sutton M.
        J Racial Ethn Health Disparities. 2017 Jul 19.
        HIV diagnoses among females in the USA declined 40% during 2005-2014 with the largest decline (42%) among Black/African Americans. African American females remain disproportionately affected. We explored contributions of STD rates and sexual risk behaviors among African American females, HIV diagnoses among potential male partners, and sexual risk behaviors and viral suppression rates among HIV-positive potential male partners to declining rates of HIV diagnoses among African American females. Results suggest temporal trends in the factors that increase HIV infectiousness and transmissibility within sexual networks, i.e., decreases in rates of other sexually transmitted infections among African American females, decreases in HIV diagnoses among potential male partners, and increases in viral suppression among heterosexual and bisexual HIV-positive potential male partners in care, may explain the decline. Findings highlight a need for future research that provides context to the sexual risk behaviors and sexual network factors in order to continue progress.

      5. From one to the other: responding to Ebola cases on either side of the line
        Merrill RD, Ward SE, Oppert MC, Schneider DA.
        Pan Afr Med J. 2017 ;27(Suppl 1):12.
        This case study is adapted from events that occurred along the Sierra Leone and Guinea land border during the 2014-2016 Ebola epidemic in West Africa. The response activities involved Sierra Leone and Guinea officials, along with assistance from U.S. Centers for Disease Control and Prevention (CDC) and the World Health Organisation (WHO). This case study builds upon an understanding of basic surveillance systems and outbreak response activities. Through this exercise, students will understand how to incorporate communication and coordination into surveillance and response efforts with counterparts across the border in neighbouring countries. This integration is important to reduce the spread of communicable diseases between neighbouring countries. The time required to complete this case study is 2-3 hours.

      6. Incidence of active tuberculosis and cohort retention among adolescents in western Kenya
        Nduba V, Van’t Hoog AH, Mitchell EM, Borgdorff M, Laserson KF.
        Pediatr Infect Dis J. 2017 Jul 14.
        SETTING: Siaya County, with the highest tuberculosis notification rates in Kenya. OBJECTIVE: To determine the incidence of active tuberculosis and one-year cohort retention among adolescents aged 12-18 years, in preparation for Phase III tuberculosis vaccine trials. METHODS: Adolescents were enrolled and followed up for 1-2 years to determine tuberculosis incidence. Adolescents with a positive tuberculin skin test (TST), history of cohabitation with a tuberculosis case, or at least one tuberculosis symptom received clinical and sputum examination and a chest radiograph. Definite tuberculosis cases were bacteriologically confirmed and clinical cases diagnosed by a clinician based on a suggestive chest radiograph and having clinical symptoms. Risk factors were explored using Poisson regression. RESULTS: Among 4934 adolescents without tuberculosis at baseline, 26 tuberculosis cases were identified during follow-up, corresponding to an incidence density of 4.4 (95% CI, 3.0-6.4) events per 1000 person-years of observation; the 12 definite tuberculosis cases corresponded to an incidence density of 2.0 (95% CI, 0.9-3.1). Having previous tuberculosis (RR = 12.5, CI 1.8-100) and presence of TST conversion (RR = 3.4, CI 1.5-7.7) were significantly associated with higher risk of tuberculosis. Overall, 83.0% (4086/4925) of adolescents were retained in the study after 1 year of follow-up. Being female, being older, being out of school, and being orphaned were significant risk factors for loss to follow-up. CONCLUSION: The tuberculosis incidence in adolescents will help inform future tuberculosis vaccine trial sample size calculations for this setting. The predictive factors for tuberculosis and retention can be further explored in future trials.
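        The incidence analysis above (case counts over person-years, with risk factors explored by Poisson regression) is typically fit with a log person-time offset. The sketch below uses simulated data and hypothetical column names, not the cohort dataset.

        ```python
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 4934
        df = pd.DataFrame({
            "tb_case": rng.binomial(1, 0.005, n),      # incident TB during follow-up (hypothetical)
            "person_years": rng.uniform(1.0, 2.0, n),  # 1-2 years of observation per adolescent
            "previous_tb": rng.binomial(1, 0.01, n),
            "tst_conversion": rng.binomial(1, 0.10, n),
        })

        # Poisson regression with person-years entering as the exposure (log offset) term.
        model = smf.glm(
            "tb_case ~ previous_tb + tst_conversion",
            data=df,
            family=sm.families.Poisson(),
            exposure=df["person_years"],
        )
        result = model.fit()
        print(np.exp(result.params))  # rate ratios, analogous to the RRs reported above
        ```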

      7. High mortality and coinfection in a prospective cohort of human immunodeficiency virus/acquired immune deficiency syndrome patients with histoplasmosis in Guatemala
        Samayoa B, Roy M, Cleveland AA, Medina N, Lau-Bonilla D, Scheel CM, Gomez BL, Chiller T, Arathoon E.
        Am J Trop Med Hyg. 2017 Jul;97(1):42-48.
        Histoplasmosis is one of the most common and deadly opportunistic infections among persons living with human immunodeficiency virus (HIV)/acquired immune deficiency syndrome in Latin America, but due to limited diagnostic capacity in this region, few data on the burden and clinical characteristics of this disease exist. Between 2005 and 2009, we enrolled patients ≥18 years of age with suspected histoplasmosis at a hospital-based HIV clinic in Guatemala City. A case of suspected histoplasmosis was defined as a person presenting with at least three of five clinical or radiologic criteria. A confirmed case of histoplasmosis was defined as a person with a positive culture or urine antigen test for Histoplasma capsulatum. Demographic and clinical data were also collected and analyzed. Of 263 enrolled as suspected cases of histoplasmosis, 101 (38.4%) were confirmed cases. Median time to diagnosis was 15 days after presentation (interquartile range [IQR] = 5-23). Crude overall mortality was 43.6%; median survival time was 19 days (IQR = 4-69). Mycobacterial infection was diagnosed in 70 (26.6%) cases; 26 (25.7%) histoplasmosis cases were coinfected with mycobacteria. High mortality and short survival time after initial symptoms were observed in patients with histoplasmosis. Mycobacterial coinfection diagnoses were frequent, highlighting the importance of pursuing diagnoses for both diseases.

      8. Factors associated with the duration of moderate-to-severe diarrhea among children in rural western Kenya enrolled in the Global Enteric Multicenter Study, 2008-2012
        Schilling KA, Omore R, Derado G, Ayers T, Ochieng JB, Farag TH, Nasrin D, Panchalingam S, Nataro JP, Kotloff KL, Levine MM, Oundo J, Parsons MB, Bopp C, Laserson K, Stauber CE, Rothenberg R, Breiman RF, O’Reilly CE, Mintz ED.
        Am J Trop Med Hyg. 2017 Jul;97(1):248-258.
        Diarrheal disease is a leading cause of death among young children worldwide. As rates of acute diarrhea (AD; 1-6 days duration) have decreased, persistent diarrhea (PD; > 14 days duration) accounts for a greater proportion of the diarrheal disease burden. We describe factors associated with the duration of moderate-to-severe diarrhea in Kenyan children < 5 years old enrolled in the Global Enteric Multicenter Study. We found 587 (58%) children experienced AD, 360 (35%) had prolonged acute diarrhea (ProAD; 7-13 days duration), and 73 (7%) had PD. We constructed a Cox proportional hazards model to identify factors associated with diarrheal duration. Risk factors independently associated with longer diarrheal duration included infection with Cryptosporidium (hazard ratio [HR]: 0.868, P = 0.035), using an unimproved drinking water source (HR: 0.87, P = 0.035), and being stunted at enrollment (HR: 0.026, P < 0.0001). Diarrheal illness of extended duration appears to be multifactorial; given its association with adverse health and development outcomes, effective strategies should be implemented to reduce the duration and severity of diarrheal illness. Effective treatments for Cryptosporidium should be identified, interventions to improve drinking water are imperative, and nutrition should be improved through exclusive breastfeeding in infants ≤6 months and appropriate continued feeding practices for ill children.
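        The duration analysis above uses a Cox proportional hazards model, where a hazard ratio below 1 means slower resolution (a longer episode). Below is a minimal sketch with the lifelines package, using simulated data and hypothetical column names rather than the GEMS dataset.

        ```python
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(2)
        n = 1020
        df = pd.DataFrame({
            "duration_days": rng.integers(1, 21, n),  # days until the diarrhea episode resolved
            "resolved": rng.binomial(1, 0.95, n),     # 1 = resolution observed, 0 = censored
            "cryptosporidium": rng.binomial(1, 0.10, n),
            "unimproved_water": rng.binomial(1, 0.30, n),
            "stunted": rng.binomial(1, 0.25, n),
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="duration_days", event_col="resolved")
        cph.print_summary()  # hazard ratios < 1 correspond to longer episode duration
        ```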

    • Drug Safety
      1. Changing the conversation about opioid tapering
        Dowell D, Haegerich TM.
        Ann Intern Med. 2017 Jul 18.

        [No abstract]

      2. Using prescription monitoring program data to characterize out-of-pocket payments for opioid prescriptions in a state Medicaid program
        Hartung DM, Ahmed SM, Middleton L, Van Otterloo J, Zhang K, Keast S, Kim H, Johnston K, Deyo RA.
        Pharmacoepidemiol Drug Saf. 2017 Jul 19.
        BACKGROUND: Out-of-pocket payment for prescription opioids is believed to be an indicator of abuse or diversion, but few studies describe its epidemiology. Prescription drug monitoring programs (PDMPs) collect controlled substance prescription fill data regardless of payment source and thus can be used to study this phenomenon. OBJECTIVE: To estimate the frequency and characteristics of prescription fills for opioids that are likely paid out-of-pocket by individuals in the Oregon Medicaid program. RESEARCH DESIGN: Cross-sectional analysis using Oregon Medicaid administrative claims and PDMP data (2012 to 2013). SUBJECTS: Continuously enrolled nondually eligible Medicaid beneficiaries who could be linked to the PDMP with two opioid fills covered by Oregon Medicaid. MEASURES: Patient characteristics and fill characteristics for opioid fills that lacked a Medicaid pharmacy claim. Fill characteristics included opioid name, type, and association with indicators of high-risk opioid use. RESULTS: A total of 33,592 Medicaid beneficiaries filled a total of 555,103 opioid prescriptions. Of these opioid fills, 74,953 (13.5%) could not be matched to a Medicaid claim. Hydromorphone (30%), fentanyl (18%), and methadone (15%) were the most likely to lack a matching claim. The 3 largest predictors for missing claims were opioid fills that overlapped with other opioids (adjusted odds ratio [aOR] 1.37; 95% confidence interval [CI], 1.34-1.4), long-acting opioids (aOR 1.52; 95% CI, 1.47-1.57), and fills at multiple pharmacies (aOR 1.45; 95% CI, 1.39-1.52). CONCLUSIONS: Prescription opioid fills that were likely paid out-of-pocket were common and associated with several known indicators of high-risk opioid use.
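        The core linkage step described above (identifying PDMP opioid fills with no matching Medicaid pharmacy claim) amounts to a left anti-join. The sketch below illustrates the idea with toy data; the match keys (patient_id, drug_ndc, fill_date) are hypothetical simplifications of a real record linkage.

        ```python
        import pandas as pd

        # Toy PDMP fills and Medicaid pharmacy claims; all values are invented placeholders.
        pdmp_fills = pd.DataFrame({
            "patient_id": [1, 1, 2, 3],
            "drug_ndc": ["A1", "A1", "B2", "C3"],
            "fill_date": pd.to_datetime(["2013-01-05", "2013-02-04", "2013-01-10", "2013-03-01"]),
        })
        medicaid_claims = pd.DataFrame({
            "patient_id": [1, 2],
            "drug_ndc": ["A1", "B2"],
            "fill_date": pd.to_datetime(["2013-01-05", "2013-01-10"]),
        })

        # Left join with an indicator; fills present only in the PDMP lack a Medicaid claim.
        merged = pdmp_fills.merge(medicaid_claims,
                                  on=["patient_id", "drug_ndc", "fill_date"],
                                  how="left", indicator=True)
        merged["likely_out_of_pocket"] = merged["_merge"] == "left_only"
        print(merged[["patient_id", "drug_ndc", "fill_date", "likely_out_of_pocket"]])
        ```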

    • Environmental Health
      1. The use of gamma-survey measurements to better understand radon potential in urban areas
        Berens AS, Diem J, Stauber C, Dai D, Foster S, Rothenberg R.
        Sci Total Environ. 2017 Jul 13;607-608:888-899.
        Accounting for as much as 14% of all lung cancers worldwide, cumulative radon progeny exposure is the leading cause of lung cancer among never-smokers both internationally and in the United States. To understand the risk of radon progeny exposure, studies have mapped radon potential using aircraft-based measurements of gamma emissions. However, these efforts are hampered in urban areas where the built environment obstructs aerial data collection. To address part of this limitation, this study aimed to evaluate the effectiveness of using in situ gamma readings (taken with a scintillation probe attached to a ratemeter) to assess radon potential in an urban environment: DeKalb County, part of the Atlanta metropolitan area, Georgia, USA. After taking gamma measurements at 402 survey sites, empirical Bayesian kriging was used to create a continuous surface of predicted gamma readings for the county. We paired these predicted gamma readings with indoor radon concentration data from 1351 residential locations. Statistical tests showed the interpolated gamma values were significantly but weakly positively related with indoor radon concentrations, though this relationship is decreasingly informative at finer geographic scales. Geology, gamma readings, and indoor radon were interrelated, with granitic gneiss generally having the highest gamma readings and highest radon concentrations and ultramafic rock having the lowest of each. Our findings indicate the highest geogenic radon potential may exist in the relatively undeveloped southeastern part of the county. It is possible that in situ gamma, in concert with other variables, could offer an alternative to aerial radioactivity measurements when determining radon potential, though future work will be needed to address this project’s limitations.
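        The interpolation step above used empirical Bayesian kriging in a GIS. As a rough stand-in only, the sketch below interpolates point gamma readings with ordinary kriging via the pykrige package; the coordinates, readings, and variogram choice are simulated and illustrative, not from the study.

        ```python
        import numpy as np
        from pykrige.ok import OrdinaryKriging

        rng = np.random.default_rng(5)
        x = rng.uniform(0.0, 30.0, 402)                    # easting in km (hypothetical survey sites)
        y = rng.uniform(0.0, 30.0, 402)                    # northing in km (hypothetical)
        gamma = 8.0 + 0.1 * x + rng.normal(0.0, 1.0, 402)  # simulated gamma readings

        # Ordinary kriging (a simpler alternative to the empirical Bayesian kriging used in the paper).
        ok = OrdinaryKriging(x, y, gamma, variogram_model="spherical")
        grid_x = np.linspace(0.0, 30.0, 60)
        grid_y = np.linspace(0.0, 30.0, 60)
        z_pred, z_var = ok.execute("grid", grid_x, grid_y)
        print(z_pred.shape)  # (60, 60) surface of predicted gamma readings
        ```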

      2. Pilot intervention study of household ventilation and fine particulate matter concentrations in a low-income urban area, Dhaka, Bangladesh
        Weaver AM, Parveen S, Goswami D, Crabtree-Ide C, Rudra C, Yu J, Mu L, Fry AM, Sharmin I, Luby SP, Ram PK.
        Am J Trop Med Hyg. 2017 May 30.
        Fine particulate matter (PM2.5) is a risk factor for pneumonia; ventilation may be protective. We tested behavioral and structural ventilation interventions on indoor PM2.5 in Dhaka, Bangladesh. We recruited 59 good ventilation (window or door in ≥3 walls) and 29 poor ventilation (no window, one door) homes. We monitored baseline indoor and outdoor PM2.5 for 48 hours. We asked all participants to increase ventilation behavior, including opening windows and doors, and operating fans. Where permitted, we installed windows in nine poor ventilation homes, then repeated PM2.5 monitoring. We estimated effects using linear mixed-effects models and conducted qualitative interviews regarding motivators and barriers to ventilation. Compared with poor ventilation homes, good ventilation homes were larger, their residents wealthier and less likely to use biomass fuel. In multivariable linear mixed-effects models, ventilation structures and opening a door or window were inversely associated with the number of hours PM2.5 concentrations exceeded 100 and 250 μg/m3. Outdoor air pollution was positively associated with the number of hours PM2.5 concentrations exceeded 100 and 250 μg/m3. Few homes accepted window installation, due to landlord refusal and fear of theft. Motivators for ventilation behavior included cooling of the home and sunlight; barriers included rain, outdoor odors or noise, theft risk, mosquito entry, and, for fan use, perceptions of wasting electricity or unavailability of electricity. We concluded that ventilation may reduce indoor PM2.5 concentrations, but there are barriers to increasing ventilation, and in areas with high ambient PM2.5 concentrations, indoor concentrations may remain above recommended levels.
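        The analysis above fit linear mixed-effects models for hours above PM2.5 thresholds. Below is a minimal sketch with statsmodels using a random intercept per home; the data and column names are simulated placeholders, not the Dhaka measurements.

        ```python
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n_homes, n_periods = 88, 2
        df = pd.DataFrame({
            "home_id": np.repeat(np.arange(n_homes), n_periods),
            "hours_over_100": rng.uniform(0.0, 24.0, n_homes * n_periods),  # hours/day PM2.5 > 100 ug/m3
            "good_ventilation": np.repeat(rng.binomial(1, 0.67, n_homes), n_periods),
            "door_or_window_open": rng.binomial(1, 0.5, n_homes * n_periods),
            "outdoor_pm25": rng.normal(120.0, 30.0, n_homes * n_periods),
        })

        # Random intercept for each home; fixed effects for ventilation and outdoor PM2.5.
        model = smf.mixedlm(
            "hours_over_100 ~ good_ventilation + door_or_window_open + outdoor_pm25",
            data=df,
            groups=df["home_id"],
        )
        result = model.fit()
        print(result.params)
        ```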

    • Epidemiology and Surveillance
      1. Case studies in applied epidemiology
        Dicker RC.
        Pan Afr Med J. 2017 ;27(Suppl 1):1.

        [No abstract]

      2. PURPOSE: This study aimed to characterize the sociodemographic characteristics of sexual minority (i.e., gay, lesbian, bisexual) adults and compare sexual minority and heterosexual populations on nine Healthy People 2020 leading health indicators (LHIs). METHODS: Using a nationally representative, cross-sectional survey (National Health Interview Survey 2013-2015) of the civilian, noninstitutionalized population (228,893,944 adults), nine Healthy People 2020 LHIs addressing health behaviors and access to care, stratified using a composite variable of sex (female, male) and sexual orientation (gay or lesbian, bisexual, heterosexual), were analyzed individually and in aggregate. RESULTS: In 2013-2015, sexual minority adults represented 2.4% of the U.S. population. Compared to heterosexuals, sexual minorities were more likely to be younger and to have never married. Gays and lesbians were more likely to have earned a graduate degree. Gay males were more likely to have a usual primary care provider, but gay/lesbian females were less likely than heterosexuals to have a usual primary care provider and health insurance. Gay males received more colorectal cancer screening than heterosexual males. Gay males, gay/lesbian females, and bisexual females were more likely to be current smokers than their sex-matched, heterosexual counterparts. Binge drinking was more common in bisexuals compared to heterosexuals. Sexual minority females were more likely to be obese than heterosexual females; the converse was true for gay males. Sexual minorities underwent more HIV testing than their heterosexual peers, but bisexual males were less likely than gay males to be tested. Gay males were more likely to meet all eligible LHIs than heterosexual males. Overall, more sexual minority adults met all eligible LHIs compared to heterosexual adults. Similar results were found regardless of HIV testing LHI inclusion. CONCLUSION: Differences between sexual minorities and heterosexuals suggest the need for targeted health assessments and public health interventions aimed at reducing specific negative health behaviors.

    • Food Safety
      1. Notes from the Field: Cronobacter sakazakii infection associated with feeding extrinsically contaminated expressed human milk to a premature infant – Pennsylvania, 2016
        Bowen A, Wiesenfeld HC, Kloesz JL, Pasculle AW, Nowalk AJ, Brink L, Elliot E, Martin H, Tarr CL.
        MMWR Morb Mortal Wkly Rep. 2017 Jul 21;66(28):761-762.

        [No abstract]

      2. Evaluation of the use of zero-augmented regression techniques to model incidence of Campylobacter infections in FoodNet
        Tremblay M, Crim SM, Cole DJ, Hoekstra RM, Henao OL, Dopfer D.
        Foodborne Pathog Dis. 2017 Jul 18.
        The Foodborne Diseases Active Surveillance Network (FoodNet) is currently using a negative binomial (NB) regression model to estimate temporal changes in the incidence of Campylobacter infection. FoodNet active surveillance in 483 counties collected data on 40,212 Campylobacter cases between years 2004 and 2011. We explored models that disaggregated these data to allow us to account for demographic, geographic, and seasonal factors when examining changes in incidence of Campylobacter infection. We hypothesized that modeling structural zeros and including demographic variables would increase the fit of FoodNet’s Campylobacter incidence regression models. Five different models were compared: NB without demographic covariates, NB with demographic covariates, hurdle NB with covariates in the count component only, hurdle NB with covariates in both zero and count components, and zero-inflated NB with covariates in the count component only. Of the models evaluated, the nonzero-augmented NB model with demographic variables provided the best fit. Results suggest that even though zero inflation was not present at this level, individualizing the level of aggregation and using different model structures and predictors per site might be required to correctly distinguish between structural and observational zeros and account for risk factors that vary geographically.
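        The model comparison described above can be sketched by fitting a standard negative binomial count model and a zero-inflated negative binomial to the same counts and comparing fit (e.g., by AIC). The data, covariates, and offset below are simulated placeholders, not FoodNet data, and the dispersion handling is simplified.

        ```python
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

        rng = np.random.default_rng(4)
        n = 2000
        X = pd.DataFrame({
            "intercept": np.ones(n),
            "pct_under_5": rng.uniform(0.04, 0.10, n),  # hypothetical demographic covariate
            "summer": rng.binomial(1, 0.25, n),         # hypothetical seasonal indicator
        })
        y = rng.negative_binomial(2, 0.3, n)            # simulated site-quarter case counts
        offset = np.log(rng.uniform(1e4, 1e6, n))       # log population denominator (simulated)

        # Negative binomial GLM (dispersion fixed at alpha=1 for simplicity).
        nb = sm.GLM(y, X, family=sm.families.NegativeBinomial(), offset=offset).fit()
        # Zero-inflated negative binomial with an intercept-only inflation component.
        zinb = ZeroInflatedNegativeBinomialP(
            y, X, exog_infl=X[["intercept"]], offset=offset
        ).fit(maxiter=500, disp=False)

        # Lower AIC indicates the better-fitting specification on these simulated data.
        print({"NB AIC": round(nb.aic, 1), "ZINB AIC": round(zinb.aic, 1)})
        ```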

    • Genetics and Genomics
      1. Comparative analysis of extended-spectrum-beta-lactamase CTX-M-65-producing Salmonella enterica serovar infantis isolates from humans, food animals, and retail chickens in the United States
        Tate H, Folster JP, Hsu CH, Chen J, Hoffmann M, Li C, Morales C, Tyson GH, Mukherjee S, Brown AC, Green A, Wilson W, Dessai U, Abbott J, Joseph L, Haro J, Ayers S, McDermott PF, Zhao S.
        Antimicrob Agents Chemother. 2017 Jul;61(7).
        We sequenced the genomes of 10 Salmonella enterica serovar Infantis isolates containing blaCTX-M-65 obtained from chicken, cattle, and human sources collected between 2012 and 2015 in the United States through routine National Antimicrobial Resistance Monitoring System (NARMS) surveillance and product sampling programs. We also completely assembled the plasmids from four of the isolates. All isolates had a D87Y mutation in the gyrA gene and harbored between 7 and 10 resistance genes [aph(4)-Ia, aac(3)-IVa, aph(3′)-Ic, blaCTX-M-65, fosA3, floR, dfrA14, sul1, tetA, aadA1] located in two distinct sites of a megaplasmid ( approximately 316 to 323 kb) similar to that described in a blaCTX-M-65-positive S Infantis isolate from a patient in Italy. High-quality single nucleotide polymorphism (hqSNP) analysis revealed that all U.S. isolates were closely related, separated by only 1 to 38 pairwise high-quality SNPs, indicating a high likelihood that strains from humans, chickens, and cattle recently evolved from a common ancestor. The U.S. isolates were genetically similar to the blaCTX-M-65-positive S Infantis isolate from Italy, with a separation of 34 to 47 SNPs. This is the first report of the blaCTX-M-65 gene and the pESI (plasmid for emerging S Infantis)-like megaplasmid from S Infantis in the United States, and it illustrates the importance of applying a global One Health human and animal perspective to combat antimicrobial resistance.

    • Health Disparities
      1. Ventanillas de Salud: A collaborative and binational health access and preventive care program
        Rangel Gomez MG, Tonda J, Zapata GR, Flynn M, Gany F, Lara J, Shapiro I, Rosales CB.
        Front Public Health. 2017 ;5:151.
        While individuals of Mexican origin are the largest immigrant group living in the U.S., this population also has the highest uninsured rate. Health disparities related to access to health care, among other social determinants, continue to be a challenge for this population. The government of Mexico, in an effort to address these disparities and improve the quality of life of citizens living abroad, has partnered with governmental and non-governmental health-care organizations in the U.S. by developing and implementing an initiative known as Ventanillas de Salud (Health Windows; VDS). The VDS are located throughout the Mexican consular network and aim to increase access to health care and health literacy, provide health screenings, and promote healthy lifestyle choices among low-income and immigrant Mexican populations in the U.S.

    • Health Economics
      1. Trends in beverage prices following the introduction of a tax on sugar-sweetened beverages in Barbados
        Alvarado M, Kostova D, Suhrcke M, Hambleton I, Hassell T, Alafia Samuels T, Adams J, Unwin N.
        Prev Med. 2017 Jul 14.
        A 10% excise tax on sugar sweetened beverages (SSBs) was implemented in Barbados in September 2015. A national evaluation has been established to assess the impact of the tax. We present a descriptive analysis of initial price changes following implementation of the SSB tax using price data provided by a major supermarket chain in Barbados over the period 2014-2016. We summarize trends in price change before and after the tax using year-on-year mean price per liter change between SSBs and non-SSBs. We find that prior to the tax, year-on-year price growth of SSBs and non-SSBs was very similar (approximately 1%). During the quarter in which the tax was implemented, the trends diverged, with SSB prices growing by almost 3% while prices of non-SSBs decreased slightly. The growth of SSB prices outpaced non-SSBs prices in each quarter thereafter, reaching 5.9% growth compared to <1% for non-SSBs. Future analyses will assess the trends in prices of SSBs and non-SSBs over a longer period and will integrate price data from additional sources to assess heterogeneity of post-tax price changes. A continued examination of the impact of the SSB tax in Barbados will expand the evidence base available to policymakers worldwide in considering SSB taxes as a lever for reducing the consumption of added sugars at the population level.
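        The descriptive comparison above is essentially a year-on-year growth calculation on mean price per litre within each beverage class. Below is a toy illustration with invented prices, not the Barbados supermarket data.

        ```python
        import pandas as pd

        # One row per (year, quarter, beverage class) with mean price per litre; values are invented.
        prices = pd.DataFrame({
            "year": [2014, 2015, 2014, 2015],
            "quarter": ["Q4", "Q4", "Q4", "Q4"],
            "beverage_class": ["SSB", "SSB", "non-SSB", "non-SSB"],
            "mean_price_per_litre": [3.00, 3.09, 2.50, 2.51],
        })

        wide = prices.pivot_table(index=["beverage_class", "quarter"],
                                  columns="year", values="mean_price_per_litre")
        # Year-on-year growth compares each quarter with the same quarter one year earlier.
        wide["yoy_growth_pct"] = (wide[2015] / wide[2014] - 1) * 100
        print(wide)
        # In this toy example SSB prices grow ~3% vs ~0.4% for non-SSBs, echoing the pattern above.
        ```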

    • Immunity and Immunization
      1. Population genetic structure, antibiotic resistance, capsule switching and evolution of invasive pneumococci before conjugate vaccination in Malawi
        Chaguza C, Cornick JE, Andam CP, Gladstone RA, Alaerts M, Musicha P, Peno C, Bar-Zeev N, Kamng’ona AW, Kiran AM, Msefula CL, McGee L, Breiman RF, Kadioglu A, French N, Heyderman RS, Hanage WP, Bentley SD, Everett DB.
        Vaccine. 2017 Jul 12.
        INTRODUCTION: Pneumococcal infections cause a high death toll in sub-Saharan Africa (SSA) but the recently rolled out pneumococcal conjugate vaccines (PCV) will reduce the disease burden. To better understand the population impact of these vaccines, comprehensive analysis of large collections of pneumococcal isolates sampled prior to vaccination is required. Here we present a population genomic study of the invasive pneumococcal isolates sampled before the implementation of PCV13 in Malawi. MATERIALS AND METHODS: We retrospectively sampled and whole-genome sequenced 585 invasive isolates from 2004 to 2010. We determined the pneumococcal population genetic structure and assessed serotype prevalence, antibiotic resistance rates, and the occurrence of serotype switching. RESULTS: Population structure analysis revealed 22 genetically distinct sequence clusters (SCs), which consisted of closely related isolates. Serotype 1 (ST217), a vaccine-associated serotype in clade SC2, showed highest prevalence (19.3%), and was associated with the highest MDR rate (81.9%) followed by serotype 12F, a non-vaccine serotype in clade SC10 with an MDR rate of 57.9%. Prevalence of serotypes was stable prior to vaccination although there was an increase in the PMEN19 clone, serotype 5 ST289, in clade SC1 in 2010 suggesting a potential undetected local outbreak. Coalescent analysis revealed recent emergence of the SCs and there was evidence of natural capsule switching in the absence of vaccine induced selection pressure. Furthermore, the majority of the highly prevalent capsule-switched isolates were associated with acquisition of vaccine-targeted capsules. CONCLUSIONS: This study provides descriptions of capsule-switched serotypes and serotypes with potential to cause serotype replacement post-vaccination such as 12F. Continued surveillance is critical to monitor these serotypes and antibiotic resistance in order to design better infection prevention and control measures such as inclusion of emerging replacement serotypes in future conjugate vaccines.

      2. Attitudes about vaccines to prevent Ebola virus disease in Guinea at the end of a large Ebola epidemic: Results of a national household survey
        Irwin KL, Jalloh MF, Corker J, Alpha Mahmoud B, Robinson SJ, Li W, James NE, Sellu M, Jalloh MB, Diallo AA, Tracy L, Hajjeh R, VanSteelandt A, Bunnell R, Martel L, Raghunathan PL, Marston B.
        Vaccine. 2017 Jul 14.
        INTRODUCTION: In 2014-2016, an Ebola epidemic devastated Guinea; more than 3800 cases and 2500 deaths were reported to the World Health Organization. In August 2015, as the epidemic waned and clinical trials of an experimental Ebola vaccine continued in Guinea and neighboring Sierra Leone, we conducted a national household survey about Ebola-related knowledge, attitudes, and practices (KAP) and opinions about “hypothetical” Ebola vaccines. METHODS: Using cluster-randomized sampling, we selected participants aged 15 years or older in Guinea’s 8 administrative regions, which had varied cumulative case counts. The questionnaire assessed socio-demographic characteristics, experiences during the epidemic, Ebola-related KAP, and Ebola vaccine attitudes. To assess the potential for Ebola vaccine introduction in Guinea, we examined the association between vaccine attitudes and participants’ characteristics using categorical and multivariable analyses. RESULTS: Of 6699 persons invited to participate, 94% responded to at least 1 Ebola vaccine question. Most agreed that vaccines were needed to fight the epidemic (85.8%) and that their family would accept safe, effective Ebola vaccines if they became available in Guinea (84.2%). These measures of interest and acceptability were significantly more common among participants who were male, wealthier, more educated, and lived with young children who had received routine vaccines. Interest and acceptability were also significantly higher among participants who understood Ebola transmission modes, had witnessed Ebola response teams, knew Ebola-affected persons, believed Ebola was not always fatal, and would access Ebola treatment centers. In multivariable analyses of the majority of participants living with young children, interest and acceptability were significantly higher among those living with vaccinated children than among those living with unvaccinated children. DISCUSSION: The high acceptability of hypothetical vaccines indicates strong potential for introducing Ebola vaccines across Guinea. Strategies to build public confidence in use of Ebola vaccines should highlight any similarities with safe, effective vaccines routinely used in Guinea.

      3. Progress toward measles elimination – Bangladesh, 2000-2016
        Khanal S, Bohara R, Chacko S, Sharifuzzaman M, Shamsuzzaman M, Goodson JL, Dabbagh A, Kretsinger K, Dhongde D, Liyanage J, Bahl S, Thapa A.
        MMWR Morb Mortal Wkly Rep. 2017 Jul 21;66(28):753-757.
        In 2013, at the 66th session of the Regional Committee of the World Health Organization (WHO) South-East Asia Region (SEAR), a regional goal was established to eliminate measles and control rubella and congenital rubella syndrome by 2020 (1). WHO-recommended measles elimination strategies in SEAR countries include 1) achieving and maintaining ≥95% coverage with 2 doses of measles-containing vaccine (MCV) in every district, delivered through the routine immunization program or through supplementary immunization activities (SIAs); 2) developing and sustaining a sensitive and timely measles case-based surveillance system that meets targets for recommended performance indicators; and 3) developing and maintaining an accredited measles laboratory network (2). In 2014, Bangladesh, one of 11 countries in SEAR, adopted a national goal for measles elimination by 2018 (2,3). This report describes progress and challenges toward measles elimination in Bangladesh during 2000-2016. Estimated coverage with the first MCV dose (MCV1) increased from 74% in 2000 to 94% in 2016. The second MCV dose (MCV2) was introduced in 2012, and MCV2 coverage increased from 35% in 2013 to 93% in 2016. During 2000-2016, approximately 108.9 million children received MCV during three nationwide SIAs conducted in phases. During 2000-2016, reported confirmed measles incidence decreased 82%, from 34.2 to 6.1 per million population. However, in 2016, 56% of districts did not meet the surveillance performance target of ≥2 discarded nonmeasles, nonrubella cases per 100,000 population. Additional measures that include increasing MCV1 and MCV2 coverage to ≥95% in all districts with additional strategies for hard-to-reach populations, increasing sensitivity of measles case-based surveillance, and ensuring timely transport of specimens to the national laboratory will help achieve measles elimination.

      4. Intussusception among children less than 2 years of age: Findings from pre-vaccine introduction surveillance in Pakistan
        Yousafzai MT, Thobani R, Qazi SH, Saddal N, Yen C, Aliabadi N, Ali SA.
        Vaccine. 2017 Jul 11.
        BACKGROUND: Rotavirus vaccination introduction in routine immunization is under consideration in Pakistan. Data on the baseline epidemiology of intussusception will inform surveillance strategies for intussusception after rotavirus vaccine introduction in Pakistan. We describe the epidemiology of intussusception-associated hospitalizations among children <2 years of age in Karachi, Pakistan. METHODS: We conducted a retrospective chart review for July 01, 2012 through June 30, 2015 at the National Institute of Child Health (NICH) and Aga Khan University Hospital (AKUH) Karachi. At AKUH, the International Classification of Disease, ninth revision, code 560.0 for intussusception was used to retrieve intussusception case records. At NICH, daily Operation Theater, Emergency Room, and surgical daycare log sheets and surgical ward census sheets were used to identify cases. Records of children who fulfilled eligibility criteria and the Brighton Collaboration level one case definition of intussusception were selected for data analysis. We used structured case report forms to extract data for the descriptive analysis. RESULTS: We identified 158 cases of confirmed intussusception; 30 cases (19%) were from AKUH. More than half (53%) of the cases occurred in children aged 6-12 months, followed by 35% among those aged <6 months. Two-thirds (106/158) of the cases were male. The most common presenting complaints were vomiting and bloody stool. At NICH, almost all (93%) were managed surgically, while at AKUH, approximately 57% of the cases were managed with enemas. Three deaths occurred, all from NICH. Cases occurred without any seasonality. At NICH, 4% (128/3618) of surgical admissions among children aged <2 years were attributed to intussusception, while that for AKUH was 2% (30/1702). CONCLUSION: In this chart review, intussusception predominantly affected children 0-6 months of age and occurred more commonly in males. This information on the baseline epidemiology of intussusception will inform post-vaccine introduction adverse event monitoring related to intussusception in Pakistan.

    • Injury and Violence RSS Word feed
      1. Racial and ethnic differences in homicides of adult women and the role of intimate partner violence – United States, 2003-2014
        Petrosky E, Blair JM, Betz CJ, Fowler KA, Jack SP, Lyons BH.
        MMWR Morb Mortal Wkly Rep. 2017 Jul 21;66(28):741-746.
        Homicide is one of the leading causes of death for women aged ≤44 years. In 2015, homicide caused the death of 3,519 girls and women in the United States. Rates of female homicide vary by race/ethnicity (1), and nearly half of victims are killed by a current or former male intimate partner (2). To inform homicide and intimate partner violence (IPV) prevention efforts, CDC analyzed homicide data from the National Violent Death Reporting System (NVDRS) among 10,018 women aged ≥18 years in 18 states during 2003-2014. The frequency of homicide by race/ethnicity and precipitating circumstances of homicides associated with and without IPV were examined. Non-Hispanic black and American Indian/Alaska Native women experienced the highest rates of homicide (4.4 and 4.3 per 100,000 population, respectively). Over half of all homicides (55.3%) were IPV-related; 11.2% of victims of IPV-related homicide experienced some form of violence in the month preceding their deaths, and argument and jealousy were common precipitating circumstances. Targeted IPV prevention programs for populations at disproportionate risk and enhanced access to intervention services for persons experiencing IPV are needed to reduce homicides among women.

      2. Introduction: The Monitoring the Future (MTF) survey provides nationally-representative annual estimates of licensure and driving patterns among U.S. teens. A previous study using MTF data reported substantial declines in the proportion of high school seniors that were licensed to drive and increases in the proportion of nondrivers following the recent U.S. economic recession. Method: To explore whether licensure and driving patterns among U.S. high school seniors have rebounded in the post-recession years, we analyzed MTF licensure and driving data for the decade of 2006-2015. We also examined trends in teen driver involvement in fatal and nonfatal injury crashes for that decade using data from the Fatality Analysis Reporting System and National Automotive Sampling System General Estimates System, respectively. Results: During 2006-2015, the proportion of high school seniors that reported having a driver’s license declined by 9 percentage points (11%) from 81% to 72% and the proportion that did not drive during an average week increased by 8 percentage points (44%) from 18% to 26%. The annual proportion of black seniors that did not drive was consistently greater than twice the proportion of nondriving white seniors. Overall during the decade, 17- and 18-year-old drivers experienced large declines in fatal and nonfatal injury crashes, although crashes increased in both 2014 and 2015. Conclusions: The MTF data indicate that licensure and driving patterns among U.S. high school seniors have not rebounded since the economic recession. The recession had marked negative effects on teen employment opportunities, which likely influenced teen driving patterns. Possible explanations for the apparent discrepancies between the MTF data and the 2014 and 2015 increases in crashes are explored. Practical applications: MTF will continue to be an important resource for clarifying teen driving trends in relation to crash trends and informing strategies to improve teen driver safety.

    • Laboratory Sciences RSS Word feed
      1. Characterizing reactivity to Onchocerca volvulus antigens in multiplex bead assays
        Feeser KR, Cama V, Priest JW, Thiele EA, Wiegand RE, Lakwo T, Feleke SM, Cantey PT.
        Am J Trop Med Hyg. 2017 Jul 03.
        Multiplex bead assays (MBAs) may provide a powerful integrated tool for monitoring, evaluation, and post-elimination surveillance of onchocerciasis and co-endemic diseases; however, the specificity and sensitivity of Onchocerca volvulus antigens have not been characterized within this context. An MBA was developed to evaluate three antigens (OV-16, OV-17, and OV-33) for onchocerciasis. Receiver operating characteristics (ROC) analyses were used to characterize antigen performance using a panel of 610 specimens: 109 O. volvulus-positive specimens, 426 non-onchocerciasis controls with filarial and other confirmed parasitic infection, and 75 sera from patients with no other parasitic infection. The IgG and IgG4 assays for OV-16 demonstrated sensitivities of 95.4% and 96.3%, and specificities of 99.4% and 99.8%, respectively. The OV-17 IgG and IgG4 assays had sensitivities of 86.2% and 76.1% and specificities of 79.2% and 82.8%. For OV-33, the IgG and IgG4 assays had sensitivities of 90.8% and 96.3%, and specificities of 96.8% and 98.6%. The OV-16 IgG4-based MBA had the best assay characteristics, followed by OV-33 IgG4. The OV-16 IgG4 assay would be useful for monitoring and evaluation using the MBA platform. Further evaluations are needed to review the potential use of OV-33 as a confirmatory test in the context of program evaluations.
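        The sensitivity and specificity figures above come from receiver operating characteristic-style analyses of assay readouts against known infection status. The generic sketch below (Python with scikit-learn) shows that calculation on simulated antibody responses; the response values and cutoff are invented, and only the group sizes (109 positives, 501 controls) mirror the panel described above. It is not the OV-16/OV-17/OV-33 analysis itself.

          import numpy as np
          from sklearn.metrics import roc_auc_score, roc_curve

          rng = np.random.default_rng(0)
          # 1 = confirmed O. volvulus specimen, 0 = control (counts mirror the panel described above)
          status = np.r_[np.ones(109, dtype=int), np.zeros(501, dtype=int)]
          # Simulated assay responses (fluorescence-intensity-like values)
          response = np.where(status == 1,
                              rng.lognormal(mean=3.0, sigma=0.5, size=status.size),
                              rng.lognormal(mean=1.0, sigma=0.5, size=status.size))

          fpr, tpr, thresholds = roc_curve(status, response)
          auc = roc_auc_score(status, response)

          cutoff = 10.0                      # hypothetical positivity threshold
          positive = response >= cutoff
          sensitivity = (positive & (status == 1)).sum() / (status == 1).sum()
          specificity = (~positive & (status == 0)).sum() / (status == 0).sum()
          print(f"AUC = {auc:.3f}, sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")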

      2. Transcriptomics in toxicology
        Joseph P.
        Food Chem Toxicol. 2017 Jul 15.
        Xenobiotics, of which many are toxic, may enter the human body through multiple routes. Excessive human exposure to xenobiotics may exceed the body’s capacity to defend against the xenobiotic-induced toxicity and result in potentially fatal adverse health effects. Prevention of the adverse health effects, potentially associated with human exposure to the xenobiotics, may be achieved by detecting the toxic effects at an early, reversible and, therefore, preventable stage. Additionally, an understanding of the molecular mechanisms underlying the toxicity may be helpful in preventing and/or managing the ensuing adverse health effects. Human exposures to a large number of xenobiotics are associated with hepatotoxicity or pulmonary toxicity. Global gene expression changes taking place in biological systems, in response to exposure to xenobiotics, may represent the early and mechanistically relevant cellular events contributing to the onset and progression of xenobiotic-induced adverse health outcomes. Hepatotoxicity and pulmonary toxicity resulting from exposure to xenobiotics are discussed as specific examples to demonstrate the potential application of transcriptomics or global gene expression analysis in the prevention of adverse health effects associated with exposure to xenobiotics.

      3. Enterococcus crotali sp. nov., isolated from faecal material of a timber rattlesnake
        McLaughlin RW, Shewmaker PL, Whitney AM, Humrighouse BW, Lauer AC, Loparev VN, Gulvik CA, Cochran PA, Dowd SE.
        Int J Syst Evol Microbiol. 2017 Jun;67(6):1984-1989.
        A facultatively anaerobic, Gram-stain-positive bacterium, designated ETRF1T, was found in faecal material of a timber rattlesnake (Crotalus horridus). Based on a comparative 16S rRNA gene sequence analysis, the isolate was assigned to the genus Enterococcus. The 16S rRNA gene sequence of strain ETRF1T showed >97 % similarity to that of the type strains of Enterococcus rotai, E. caccae, E. silesiacus, E. haemoperoxidus, E. ureasiticus, E. moraviensis, E. plantarum, E. quebecensis, E. ureilyticus, E. termitis, E. rivorum and E. faecalis. The organism could be distinguished from these 12 phylogenetically related enterococci using conventional biochemical testing, the Rapid ID32 Strep system, comparative pheS and rpoA gene sequence analysis, and comparative whole genome sequence analysis. The estimated in silico DNA-DNA hybridization values were <70 %, and average nucleotide identity values were <96 %, when compared to these 12 species, further validating that ETRF1T represents a unique species within the genus Enterococcus. On the basis of these analyses, strain ETRF1T (=CCUG 65857T=LMG 28312T) is proposed as the type strain of a novel species, Enterococcus crotali sp. nov.

      4. Viral pathogen detection by metagenomics and pan-viral group polymerase chain reaction in children with pneumonia lacking identifiable etiology
        Schlaberg R, Queen K, Simmon K, Tardif K, Stockmann C, Flygare S, Kennedy B, Voelkerding K, Bramley A, Zhang J, Eilbeck K, Yandell M, Jain S, Pavia AT, Tong S, Ampofo K.
        J Infect Dis. 2017 May 01;215(9):1407-1415.
        Background: Community-acquired pneumonia (CAP) is a leading cause of pediatric hospitalization. Pathogen identification fails in approximately 20% of children but is critical for optimal treatment and prevention of hospital-acquired infections. We used two broad-spectrum detection strategies to identify pathogens in test-negative children with CAP and asymptomatic controls. Methods: Nasopharyngeal/oropharyngeal (NP/OP) swabs from 70 children <5 years with CAP of unknown etiology and 90 asymptomatic controls were tested by next-generation sequencing (RNA-seq) and pan viral group (PVG) PCR for 19 viral families. Association of viruses with CAP was assessed by adjusted odds ratios (aOR) and 95% confidence intervals controlling for season and age group. Results: RNA-seq/PVG PCR detected previously missed, putative pathogens in 34% of patients. Putative viral pathogens included human parainfluenza virus 4 (aOR 9.3, P = .12), human bocavirus (aOR 9.1, P < .01), Coxsackieviruses (aOR 5.1, P = .09), rhinovirus A (aOR 3.5, P = .34), and rhinovirus C (aOR 2.9, P = .57). RNA-seq was more sensitive for RNA viruses whereas PVG PCR detected more DNA viruses. Conclusions: RNA-seq and PVG PCR identified additional viruses, some known to be pathogenic, in NP/OP specimens from one-third of children hospitalized with CAP without a previously identified etiology. Both broad-range methods could be useful tools in future epidemiologic and diagnostic studies.
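        The adjusted odds ratios reported above (controlling for season and age group) are the kind of estimate produced by logistic regression with those covariates entered alongside a detection indicator. A minimal sketch of that model (Python with statsmodels) on simulated data follows; the variable names and values are hypothetical, not the study dataset.

          import numpy as np
          import pandas as pd
          import statsmodels.formula.api as smf

          rng = np.random.default_rng(0)
          n = 160                             # roughly the 70 patients + 90 controls described above
          df = pd.DataFrame({
              "case": rng.integers(0, 2, n),                       # 1 = CAP patient, 0 = asymptomatic control
              "virus_detected": rng.integers(0, 2, n),             # 1 = virus found by RNA-seq/PVG PCR
              "season": rng.choice(["winter", "spring", "summer", "fall"], n),
              "age_group": rng.choice(["<1y", "1-2y", "3-4y"], n),
          })

          # Logistic regression of case status on detection, adjusting for season and age group
          model = smf.logit("case ~ virus_detected + C(season) + C(age_group)", data=df).fit(disp=0)
          aor = np.exp(model.params["virus_detected"])
          lo, hi = np.exp(model.conf_int().loc["virus_detected"])
          print(f"adjusted OR = {aor:.2f} (95% CI {lo:.2f}-{hi:.2f}), p = {model.pvalues['virus_detected']:.2f}")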

      5. Quantitation of fentanyl analogs in dried blood spots by flow-through desorption coupled to online solid phase extraction tandem mass spectrometry
        Shaner RL, Schulze ND, Seymour C, Hamelin EI, Thomas JD, Johnson RC.
        Analytical Methods. 2017 ;9(25):3876-3883.
        An automated dried blood spot (DBS) elution coupled with solid phase extraction and tandem mass spectrometric analysis for multiple fentanyl analogs was developed and assessed. This method confirms human exposures to fentanyl, sufentanil, carfentanil, alfentanil, lofentanil, alpha-methyl fentanyl, and 3-methyl fentanyl in blood with minimal sample volume and reduced shipping and storage costs. Seven fentanyl analogs were detected and quantitated from DBS made from venous blood. The calibration curve in matrix was linear in the concentration range of 1.0 ng mL-1 to 100 ng mL-1 with a correlation coefficient greater than 0.98 for all compounds. The limit of detection varied from 0.15 ng mL-1 to 0.66 ng mL-1 depending on target analyte. Analysis of the entire DBS minimized the effects of hematocrit on quantitation. All quality control materials evaluated resulted in <15% error; analytes with isotopically labeled internal standards had <15% RSD, while analytes without matching standards had 15-24% RSD. This method provides an automated means to detect seven fentanyl analogs, and quantitate four fentanyl analogs with the benefits of DBS at levels anticipated from an overdose of these potent opioids.
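        The linearity check described above (correlation coefficient >0.98 over 1.0-100 ng/mL) amounts to an ordinary least-squares fit of instrument response against nominal concentration. A small sketch (Python with numpy) is shown below; the calibrator responses are invented for illustration and are not the reported DBS data.

          import numpy as np

          conc = np.array([1.0, 2.5, 5.0, 10.0, 25.0, 50.0, 100.0])        # calibrators, ng/mL
          resp = np.array([0.021, 0.049, 0.102, 0.198, 0.51, 0.99, 2.02])  # analyte/internal-standard area ratios (simulated)

          slope, intercept = np.polyfit(conc, resp, 1)
          r = np.corrcoef(conc, resp)[0, 1]
          print(f"slope = {slope:.4f}, intercept = {intercept:.4f}, r = {r:.4f} (acceptance: r > 0.98)")

          # Back-calculate an unknown specimen's concentration from its measured response
          unknown_resp = 0.35
          print(f"estimated concentration: {(unknown_resp - intercept) / slope:.2f} ng/mL")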

    • Maternal and Child Health RSS Word feed
      1. Maternal stressors and social support and risks of delivering babies with gastroschisis or hypospadias
        Carmichael SL, Ma C, Tinker S, Shaw GM.
        Am J Epidemiol. 2017 May 13:1-7.
        We examined the association of maternal stressful life events and social support with risks of gastroschisis and hypospadias, using data from the National Birth Defects Prevention Study, a population-based case-control study of US births taking place in 2006-2011. We examined maternal self-reports of 7 life events and 3 sources of social support during the periconceptional period among mothers of 593 gastroschisis cases, 1,142 male hypospadias cases, and 4,399 nonmalformed controls. Responses to the questions on stressful life events were summed to form an index (higher is worse), as were responses to questions on social support (higher is better). We used logistic regression to estimate adjusted odds ratios and 95% confidence intervals. The adjusted odds ratios for gastroschisis for a 4-point increase in the stress index were 3.5 (95% confidence interval (CI): 2.6, 4.8) among nonteenage mothers (age >/=20 years) and 1.0 (95% CI: 0.5, 1.7) among teenage mothers (age <20 years). The odds ratio for hypospadias (among all mothers) was 0.8 (95% CI: 0.7, 1.1). Adjusted odds ratios for a social support score of 3 (versus 0) in the 3 respective groups were 0.6 (95% CI: 0.4, 1.0), 1.0 (95% CI: 0.5, 2.3), and 0.6 (95% CI: 0.4, 0.9). Given the lack of prior research on these outcomes and stress, results should be interpreted with caution.
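        The odds ratios reported for a 4-point increase in the stress index follow directly from the per-point logistic regression coefficient: if beta is the change in log-odds per index point, the OR for a 4-point increase is exp(4*beta). A tiny worked example (Python) is below; the coefficient is chosen only so that the result reproduces the reported gastroschisis OR of 3.5 and is not taken from the study output.

          import math

          # Per-point log-odds coefficient; value chosen for illustration so that
          # exp(4*beta) reproduces the reported gastroschisis OR of 3.5.
          beta_per_point = math.log(3.5) / 4
          or_per_4_points = math.exp(4 * beta_per_point)
          print(f"OR per index point: {math.exp(beta_per_point):.2f}")
          print(f"OR for a 4-point increase: {or_per_4_points:.1f}")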

      2. A national profile of attention-deficit hyperactivity disorder diagnosis and treatment among US children aged 2 to 5 years
        Danielson ML, Visser SN, Gleason MM, Peacock G, Claussen AH, Blumberg SJ.
        J Dev Behav Pediatr. 2017 Jul 14.
        OBJECTIVE: Clinical guidelines provide recommendations for diagnosis and treatment of attention-deficit hyperactivity disorder (ADHD), with specific guidance on caring for children younger than 6 years. This exploratory study describes ADHD diagnosis and treatment patterns among young children in the United States using 2 nationally representative parent surveys. METHODS: The National Survey of Children’s Health (2007-2008, 2011-2012) was used to produce weighted prevalence estimates of current ADHD and ADHD medication treatment among US children aged 2 to 5 years. The National Survey of Children with Special Health Care Needs (2009-2010) provided additional estimates on types of medication treatment and receipt of behavioral treatment among young children with special health care needs (CSHCN) with ADHD. RESULTS: In 2011-2012, 1.5% of young children (approximately 237,000) had current ADHD compared to 1.0% in 2007-2008. In 2011-2012, 43.7% of young children with current ADHD were taking medication for ADHD (approximately 104,000). In young CSHCN with ADHD, central nervous system stimulants were the most common medication type used to treat ADHD, and 52.8% of young CSHCN with current ADHD had received behavioral treatment for ADHD in the past year. CONCLUSION: Nearly a quarter million young children have current ADHD, a prevalence that increased by 57% from 2007-2008 to 2011-2012. The demographic patterns of diagnosis and treatment described in this study can serve as a benchmark to monitor service use patterns of young children diagnosed with ADHD over time.

      3. The effects of an integrated community case management strategy on the appropriate treatment of children and child mortality in Kono District, Sierra Leone: A program evaluation
        Ratnayake R, Ratto J, Hardy C, Blanton C, Miller L, Choi M, Kpaleyea J, Momoh P, Barbera Y.
        Am J Trop Med Hyg. 2017 Jun 12.
        Integrated community case management (iCCM) aims to reduce child mortality in areas with poor access to health care. iCCM was implemented in 2009 in Kono district, Sierra Leone, a postconflict area with high under-five mortality rates (U5MRs). We evaluated iCCM’s impact and effects on child health using cluster surveys in 2010 (midterm) and 2013 (endline) to compare indicators on child mortality, coverage of appropriate treatment, timely access to care, quality of care, and recognition of community health workers (CHWs). The sample size was powered to detect a 28% decline in U5MR. Clusters were selected proportional to population size. All households were sampled to measure mortality and systematic random sampling was used to measure coverage in a subset of households. We used program data to evaluate utilization and access; 5,257 (2010) and 3,649 (2013) households were surveyed. U5MR did not change significantly (4.54 [95% confidence interval (CI): 3.47-5.60] to 3.95 [95% CI: 3.06-4.83] deaths per 1,000 per month; P = 0.4), though a relative change smaller than 28% could not be detected. CHWs were the first source of care for 52% (2010) and 50.9% (2013) of children. Coverage of appropriate treatment of fever by CHWs or peripheral health units increased from 45.5% [95% CI: 39.2-52.0] to 58.2% [95% CI: 50.5-65.5] (P = 0.01); changes for diarrhea and pneumonia were not significant. The continued reliance on the CHW as the first source of care and improved coverage for the appropriate treatment of fever support iCCM’s role in Kono district.

      4. Variations in cause-of-death determination for sudden unexpected infant deaths
        Shapiro-Mendoza CK, Parks SE, Brustrom J, Andrew T, Camperlengo L, Fudenberg J, Payn B, Rhoda D.
        Pediatrics. 2017 ;140(1).
        OBJECTIVES: To quantify and describe variation in cause-of-death certification of sudden unexpected infant deaths (SUIDs) among US medical examiners and coroners. METHODS: From January to November 2014, we conducted a nationally representative survey of US medical examiners and coroners who certify infant deaths. Two-stage unequal probability sampling with replacement was used. Medical examiners and coroners were asked to classify SUIDs based on hypothetical scenarios and to describe the evidence considered and investigative procedures used for cause-of-death determination. Frequencies and weighted percentages were calculated. RESULTS: Of the 801 surveys mailed, 60% were returned, and 377 were deemed eligible and complete. Medical examiners’ and coroners’ classification of infant deaths varied by scenario. For 3 scenarios portraying potential airway obstruction and negative autopsy findings, 61% to 69% classified the death as suffocation/asphyxia. In the last scenario, which portrayed a healthy infant in a safe sleep environment with negative autopsy findings, medical examiners and coroners classified the death as sudden infant death syndrome (38%) and SUID (30%). Reliance on investigative procedures to determine cause varied, but 94% indicated using death scene investigations, 88% full autopsy, 85% toxicology analyses, and 82% medical history review. CONCLUSIONS: US medical examiners and coroners apply variable practices to classify and investigate SUID, and thus, they certify the same deaths differently. This variability influences surveillance and research, impacts true understanding of infant mortality causes, and inhibits our ability to accurately monitor and ultimately prevent future deaths. Findings may inform future strategies for promoting standardized practices for SUID classification.

      5. Effects of the EQUIP quasi-experimental study testing a collaborative quality improvement approach for maternal and newborn health care in Tanzania and Uganda
        Waiswa P, Manzi F, Mbaruku G, Rowe AK, Marx M, Tomson G, Marchant T, Willey BA, Schellenberg J, Peterson S, Hanson C.
        Implement Sci. 2017 Jul 18;12(1):89.
        BACKGROUND: Quality improvement is a recommended strategy to improve implementation levels for evidence-based essential interventions, but experience of and evidence for its effects in low-resource settings are limited. We hypothesised that a systemic and collaborative quality improvement approach covering district, facility and community levels, supported by report cards generated through continuous household and health facility surveys, could improve the implementation levels and have a measurable population-level impact on coverage and quality of essential services. METHODS: Collaborative quality improvement teams tested self-identified strategies (change ideas) to support the implementation of essential maternal and newborn interventions recommended by the World Health Organization. In Tanzania and Uganda, we used a plausibility design to compare the changes over time in one intervention district with those in a comparison district in each country. Evaluation included indicators of process, coverage and implementation practice analysed with a difference-of-differences and a time-series approach, using data from independent continuous household and health facility surveys from 2011 to 2014. Primary outcomes for both countries were birth in health facilities, breastfeeding within 1 h after birth, oxytocin administration after birth and knowledge of danger signs for mothers and babies. Interpretation of the results considered contextual factors. RESULTS: The intervention was associated with improvements on one of four primary outcomes. We observed a 26-percentage-point increase (95% CI 25-28%) in the proportion of live births where mothers received uterotonics within 1 min after birth in the intervention compared to the comparison district in Tanzania and an 8-percentage-point increase (95% CI 6-9%) in Uganda. The other primary indicators showed no evidence of improvement. In Tanzania, we saw positive changes for two other outcomes reflecting locally identified improvement topics. The intervention was associated with an increase in preparation of clean birth kits for home deliveries (31 percentage points, 95% CI 2-60%) and an increase in health facility supervision by district staff (14 percentage points, 95% CI 0-28%). CONCLUSIONS: The systemic quality improvement approach was associated with improvements of only one of four primary outcomes, as well as two Tanzania-specific secondary outcomes. Reasons for the lack of effects included limited implementation strength as well as a relatively short follow-up period in combination with a 1-year recall period for population-based estimates and a limited power of the study to detect changes smaller than 10 percentage points. TRIAL REGISTRATION: Pan African Clinical Trials Registry: PACTR201311000681314.
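        The difference-of-differences estimate described above can be read as the interaction term in a regression of the outcome on an intervention-district indicator, a post-period indicator, and their product, with standard errors clustered on survey cluster. A schematic version (Python with statsmodels) on simulated data follows; the variable names, cluster structure, and effect size are hypothetical, not the EQUIP analysis.

          import numpy as np
          import pandas as pd
          import statsmodels.formula.api as smf

          rng = np.random.default_rng(0)
          n = 2000
          df = pd.DataFrame({
              "intervention": rng.integers(0, 2, n),   # 1 = intervention district
              "post": rng.integers(0, 2, n),           # 1 = post-implementation survey round
              "cluster": rng.integers(0, 50, n),       # survey cluster identifier
          })
          # Simulated binary outcome with a built-in 8-percentage-point interaction effect
          p = 0.40 + 0.05 * df["intervention"] + 0.03 * df["post"] + 0.08 * df["intervention"] * df["post"]
          df["facility_birth"] = rng.binomial(1, p)

          # Linear probability model; the interaction term is the difference-of-differences
          m = smf.ols("facility_birth ~ intervention * post", data=df).fit(
              cov_type="cluster", cov_kwds={"groups": df["cluster"]})
          dod = m.params["intervention:post"] * 100
          lo, hi = m.conf_int().loc["intervention:post"] * 100
          print(f"difference-of-differences: {dod:.1f} percentage points (95% CI {lo:.1f} to {hi:.1f})")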

    • Nutritional Sciences RSS Word feed
      1. A daily dose of 5 mg folic acid for 90 days is associated with increased serum unmetabolized folic acid and reduced natural killer cell cytotoxicity in healthy Brazilian adults
        Paniz C, Bertinato JF, Lucena MR, De Carli E, Amorim P, Gomes GW, Palchetti CZ, Figueiredo MS, Pfeiffer CM, Fazili Z, Green R, Guerra-Shinohara EM.
        J Nutr. 2017 Jul 19.
        Background: The effects of high-dose folic acid (FA) supplementation in healthy individuals on blood folate concentrations and immune response are unknown. Objective: The aim of the study was to evaluate the effects of daily consumption of a tablet containing 5 mg FA on serum folate; number and cytotoxicity of natural killer (NK) cells; mRNA expression of dihydrofolate reductase (DHFR), methylenetetrahydrofolate reductase (MTHFR), interferon gamma (IFNG), tumor necrosis factor alpha (TNFA), and interleukin 8 (IL8) genes; and concentrations of serum inflammatory markers. Methods: This prospective clinical trial was conducted in 30 healthy Brazilian adults (15 women), aged 27.7 y (95% CI: 26.4, 29.1 y), with a body mass index (in kg/m2) of 23.1 (95% CI: 22.0, 24.3). Blood was collected at baseline and after 45 and 90 d of the intervention. Serum folate concentrations were measured by microbiological assay and HPLC-tandem mass spectrometry [folate forms, including unmetabolized folic acid (UMFA)]. We used real-time polymerase chain reaction to assess mononuclear leukocyte mRNA expression and flow cytometry to measure the number and cytotoxicity of NK cells. Results: Serum folate concentrations increased by approximately 5-fold after the intervention (P < 0.001), and UMFA concentrations increased by 11.9- and 5.9-fold at 45 and 90 d, respectively, when compared with baseline (P < 0.001). UMFA concentrations increased (>1.12 nmol/L) in 29 (96.6%) participants at day 45 and in 26 (86.7%) participants at day 90. We observed significant reductions in the number (P < 0.001) and cytotoxicity (P = 0.003) of NK cells after 45 and 90 d. Compared with baseline, DHFR mRNA expression was higher at 90 d (P = 0.006) and IL8 and TNFA mRNA expressions were higher at 45 and 90 d (P = 0.001 for both). Conclusion: This noncontrolled intervention showed that healthy adults responded to a high-dose FA supplement with increased UMFA concentrations, changes in cytokine mRNA expression, and reduced number and cytotoxicity of NK cells. This trial was registered at www.ensaiosclinicos.gov.br as RBR-2pr7zp.

    • Occupational Safety and Health RSS Word feed
      1. Health consequences of electric lighting practices in the modern world: A report on the National Toxicology Program’s workshop on shift work at night, artificial light at night, and circadian disruption
        Lunn RM, Blask DE, Coogan AN, Figueiro MG, Gorman MR, Hall JE, Hansen J, Nelson RJ, Panda S, Smolensky MH, Stevens RG, Turek FW, Vermeulen R, Carreon T, Caruso CC, Lawson CC, Thayer KA, Twery MJ, Ewens AD, Garner SC, Schwingl PJ, Boyd WA.
        Sci Total Environ. 2017 Jul 15;607-608:1073-1084.
        The invention of electric light has facilitated a society in which people work, sleep, eat, and play at all hours of the 24-hour day. Although electric light clearly has benefited humankind, exposures to electric light, especially light at night (LAN), may disrupt sleep and biological processes controlled by endogenous circadian clocks, potentially resulting in adverse health outcomes. Many of the studies evaluating adverse health effects have been conducted among night- and rotating-shift workers, because this scenario gives rise to significant exposure to LAN. Because of the complexity of this topic, the National Toxicology Program convened an expert panel at a public workshop entitled “Shift Work at Night, Artificial Light at Night, and Circadian Disruption” to obtain input on conducting literature-based health hazard assessments and to identify data gaps and research needs. The Panel suggested describing light both as a direct effector of endogenous circadian clocks and rhythms and as an enabler of additional activities or behaviors that may lead to circadian disruption, such as night-shift work and atypical and inconsistent sleep-wake patterns that can lead to social jet lag. Future studies should more comprehensively characterize and measure the relevant light-related exposures and link these exposures to both time-independent biomarkers of circadian disruption and biomarkers of adverse health outcomes. This information should lead to improvements in human epidemiological and animal or in vitro models, more rigorous health hazard assessments, and intervention strategies to minimize the occurrence of adverse health outcomes due to these exposures.

      2. Surveillance for silicosis deaths among persons aged 15-44 years – United States, 1999-2015
        Mazurek JM, Wood JM, Schleiff PL, Weissman DN.
        MMWR Morb Mortal Wkly Rep. 2017 Jul 21;66(28):747-752.
        Silicosis is usually a disease of long latency affecting mostly older workers; therefore, silicosis deaths in young adults (aged 15-44 years) suggest acute or accelerated disease. To understand the circumstances surrounding silicosis deaths among young persons, CDC analyzed the underlying and contributing causes of death using multiple cause-of-death data (1999-2015) and industry and occupation information abstracted from death certificates (1999-2013). During 1999-2015, among 55 pneumoconiosis deaths of young adults with International Classification of Diseases, Tenth Revision (ICD-10) code J62 (pneumoconiosis due to dust containing silica), 38 (69%) had code J62.8 (pneumoconiosis due to other dust containing silica), and 17 (31%) had code J62.0 (pneumoconiosis due to talc dust) listed on their death certificate. Decedents whose cause of death code was J62.8 most frequently worked in the manufacturing and construction industries and production occupations where silica exposure is known to occur. Among the 17 decedents who had death certificates listing code J62.0 as cause of death, 13 had certificates with an underlying or a contributing cause of death code listed that indicated multiple drug use or drug overdose. In addition, 13 of the 17 death certificates listing code J62.0 as cause of death had information on decedent’s industry and occupation; among the 13 decedents, none worked in talc exposure-associated jobs, suggesting that their talc exposure was nonoccupational. Examining detailed information on causes of death (including external causes) and industry and occupation of decedents is essential for identifying silicosis deaths associated with occupational exposures and reducing misclassification of silicosis mortality.

      3. Occupational health contributions to the development and promise of occupational health psychology
        Sauter SL, Hurrell JJ.
        J Occup Health Psychol. 2017 July;22(3):251-258.
        Occupational health psychology (OHP), as it is known today, is preceded by over a century of inquiry in psychology, sociology, philosophy, and other disciplines regarding the conditions of work and the welfare of workers, organizations, and society. This diverse body of research is richly detailed in reports on the history of OHP. Less represented in these reports are the formative interests of the occupational health field in OHP. In the present discussion, we begin by giving greater visibility to these interests. As we show, the expressions occupational health psychology and occupational health psychologist and a vision for training and participation of psychologists in occupational health research and practice appeared in the occupational health literature four decades ago. We describe how this interest inspired initiatives in OHP by the National Institute for Occupational Safety and Health which, in turn, influenced the formalization of OHP as a discipline in the United States. We then document sustained interests of occupational health in OHP today and illustrate the promise of this interest for psychologists, for the discipline of OHP itself, and for the health, safety, and well-being of working people. We conclude by arguing that, to realize this promise, there is value to closer and more formal engagement of psychologists and OHP institutions with the field of occupational health.

      4. Interventions to prevent occupational noise-induced hearing loss
        Tikka C, Verbeek JH, Kateman E, Morata TC, Dreschler WA, Ferrite S.
        Cochrane Database Syst Rev. 2017 Jul 07;7:Cd006396.
        BACKGROUND: This is the second update of a Cochrane Review originally published in 2009. Millions of workers worldwide are exposed to noise levels that increase their risk of hearing disorders. There is uncertainty about the effectiveness of hearing loss prevention interventions. OBJECTIVES: To assess the effectiveness of non-pharmaceutical interventions for preventing occupational noise exposure or occupational hearing loss compared to no intervention or alternative interventions. SEARCH METHODS: We searched the CENTRAL; PubMed; Embase; CINAHL; Web of Science; BIOSIS Previews; Cambridge Scientific Abstracts; and OSH UPDATE to 3 October 2016. SELECTION CRITERIA: We included randomised controlled trials (RCT), controlled before-after studies (CBA) and interrupted time-series (ITS) of non-clinical interventions under field conditions among workers to prevent or reduce noise exposure and hearing loss. We also collected uncontrolled case studies of engineering controls about the effect on noise exposure. DATA COLLECTION AND ANALYSIS: Two authors independently assessed study eligibility and risk of bias and extracted data. We categorised interventions as engineering controls, administrative controls, personal hearing protection devices, and hearing surveillance. MAIN RESULTS: We included 29 studies. One study evaluated legislation to reduce noise exposure in a 12-year time-series analysis but there were no controlled studies on engineering controls for noise exposure. Eleven studies with 3725 participants evaluated effects of personal hearing protection devices and 17 studies with 84,028 participants evaluated effects of hearing loss prevention programmes (HLPPs). Effects on noise exposure - Engineering interventions following legislation: One ITS study found that new legislation in the mining industry reduced the median personal noise exposure dose in underground coal mining by 27.7 percentage points (95% confidence interval (CI) -36.1 to -19.3 percentage points) immediately after the implementation of stricter legislation. This roughly translates to a 4.5 dB(A) decrease in noise level. The intervention was associated with a favourable but statistically non-significant downward trend in time of the noise dose of -2.1 percentage points per year (95% CI -4.9 to 0.7, 4 year follow-up, very low-quality evidence). Engineering intervention case studies: We found 12 studies that described 107 uncontrolled case studies of immediate reductions in noise levels of machinery ranging from 11.1 to 19.7 dB(A) as a result of purchasing new equipment, segregating noise sources or installing panels or curtains around sources. However, the studies lacked long-term follow-up and dose measurements of workers, and we did not use these studies for our conclusions. Hearing protection devices: In general hearing protection devices reduced noise exposure on average by about 20 dB(A) in one RCT and three CBAs (57 participants, low-quality evidence). Two RCTs showed that, with instructions for insertion, the attenuation of noise by earplugs was 8.59 dB better (95% CI 6.92 dB to 10.25 dB) compared to no instruction (2 RCTs, 140 participants, moderate-quality evidence). Administrative controls - information and noise exposure feedback: On-site training sessions did not have an effect on personal noise-exposure levels compared to information only in one cluster-RCT after four months’ follow-up (mean difference (MD) 0.14 dB; 95% CI -2.66 to 2.38). Another arm of the same study found that personal noise exposure information had no effect on noise levels (MD 0.30 dB(A), 95% CI -2.31 to 2.91) compared to no such information (176 participants, low-quality evidence). Effects on hearing loss - Hearing protection devices: In two studies the authors compared the effect of different devices on temporary threshold shifts at short-term follow-up but reported insufficient data for analysis. In two CBA studies the authors found no difference in hearing loss from noise exposure above 89 dB(A) between muffs and earplugs at long-term follow-up (OR 0.8, 95% CI 0.63 to 1.03; very low-quality evidence). Authors of another CBA study found that wearing hearing protection more often resulted in less hearing loss at very long-term follow-up (very low-quality evidence). Combination of interventions - hearing loss prevention programmes: One cluster-RCT found no difference in hearing loss at three- or 16-year follow-up between an intensive HLPP for agricultural students and audiometry only. One CBA study found no reduction of the rate of hearing loss (MD -0.82 dB per year, 95% CI -1.86 to 0.22) for a HLPP that provided regular personal noise exposure information compared to a programme without this information. There was very low-quality evidence in four very long-term studies that better use of hearing protection devices as part of a HLPP decreased the risk of hearing loss compared to less well used hearing protection in HLPPs (OR 0.40, 95% CI 0.23 to 0.69). Other aspects of the HLPP such as training and education of workers or engineering controls did not show a similar effect. In three long-term CBA studies, workers in a HLPP had a statistically non-significant 1.8 dB (95% CI -0.6 to 4.2) greater hearing loss at 4 kHz than non-exposed workers and the confidence interval includes the 4.2 dB which is the level of hearing loss resulting from 5 years of exposure to 85 dB(A). In addition, of three other CBA studies that could not be included in the meta-analysis, two showed an increased risk of hearing loss in spite of the protection of a HLPP compared to non-exposed workers and one CBA did not. AUTHORS’ CONCLUSIONS: There is very low-quality evidence that implementation of stricter legislation can reduce noise levels in workplaces. Controlled studies of other engineering control interventions in the field have not been conducted. There is moderate-quality evidence that training of proper insertion of earplugs significantly reduces noise exposure at short-term follow-up but long-term follow-up is still needed. There is very low-quality evidence that the better use of hearing protection devices as part of HLPPs reduces the risk of hearing loss, whereas for other programme components of HLPPs we did not find such an effect. The absence of conclusive evidence should not be interpreted as evidence of lack of effectiveness. Rather, it means that further research is very likely to have an important impact.

      5. Evaluating the stability of a freestanding Mast Climbing Work Platform
        Wimer B, Pan C, Lutz T, Hause M, Warren C, Dong R, Xu S.
        J Safety Res. 2017 ;62:163-172.
        Mast Climbing Work Platforms (MCWPs) are becoming more common at construction sites and are being used as an alternative to traditional scaffolding. Although their use is increasing, little to no published information exists on the potential safety hazards they could pose for workers. As a last line of defense, a personal fall-arrest system can be used to save a worker in a fall incident from the platform. There has been no published information on whether it is safe to use such a personal fall-arrest system with MCWPs. In this study, the issues of concern for occupational safety included: (a) the overall stability of the freestanding mast climber during a fall-arrest condition and (b) whether that fall-arrest system could potentially present safety hazards to other workers on the platform during a fall-arrest condition. This research project investigated those safety concerns with respect to the mast climber stability and the workers using it by creating fall-arrest impact forces that are transmitted to the equipment and by subsequently observing the movement of the mast climber and the working deck used by the workers. This study found that when the equipment was erected and used according to the manufacturer’s recommendations during a fall-arrest condition, destabilizing forces were very small and there were no signs of potential MCWP collapse. However, potential fall hazards could be presented to other workers on the platform during a fall arrest. Workers near an open platform are advised to wear a personal fall-arrest system to reduce the risk of being ejected. Due to the increasing use of MCWPs at construction sites, there is a corresponding need for evidence- and science-based safety guidelines or regulations, and further research should be conducted to continue to fill the knowledge gap with MCWP equipment.

    • Parasitic Diseases RSS Word feed
      1. Malaria-related hospitalizations in the United States, 2000-2014
        Khuu D, Eberhard ML, Bristow BN, Javanbakht M, Ash LR, Shafir SC, Sorvillo FJ.
        Am J Trop Med Hyg. 2017 Jul;97(1):213-221.
        Few data are available on the burden of malaria hospitalization in the United States. Study of malaria using hospital-based data can better define the impact of malaria and help inform prevention efforts. U.S. malaria cases identified from hospitalization discharge records in the 2000-2014 Nationwide Inpatient Sample were examined. Frequencies and population rates were reported by demographics, infecting species, clinical, financial, institutional, geographic, and seasonal characteristics, and disparities were identified. Time trends in malaria cases were assessed using negative binomial regression. From 2000 to 2014, there were an estimated 22,029 malaria-related hospitalizations (4.88 per 1 million population) in the United States, including 182 in-hospital deaths and 4,823 severe malaria cases. The rate of malaria-related hospitalizations did not change significantly over the study period. The largest number of malaria-related hospitalizations occurred in August. Malaria-related hospitalizations occurred disproportionately among patients who were male, black, or 25-44 years of age. Plasmodium falciparum accounted for the majority of malaria-related hospitalizations. On average, malaria patients were hospitalized for 4.36 days with charges of $25,789. Patients with a malaria diagnosis were more often hospitalized in the Middle Atlantic and South Atlantic census divisions, urban teaching, private not-for-profit, and large-bed-size hospitals. Malaria imposes a substantial disease burden in the United States. Enhanced primary and secondary prevention measures, including strategies to increase the use of pretravel consultations and prompt diagnosis and treatment, are needed.
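        Time trends in annual hospitalization counts of the kind assessed above are commonly modeled with negative binomial regression on calendar year, with population as an offset so that the exponentiated coefficient reads as an annual rate ratio. A hedged sketch of that approach (Python with statsmodels) on invented counts follows; it is not the Nationwide Inpatient Sample analysis itself.

          import numpy as np
          import pandas as pd
          import statsmodels.api as sm

          years = np.arange(2000, 2015)
          df = pd.DataFrame({
              "year_c": years - 2000,                                  # centered year
              "cases": [1350, 1420, 1280, 1500, 1390, 1460, 1410, 1550,
                        1480, 1520, 1440, 1600, 1490, 1530, 1580],     # simulated annual counts
              "population": np.linspace(282e6, 319e6, len(years)),     # approximate US population
          })

          X = sm.add_constant(df[["year_c"]])
          model = sm.GLM(df["cases"], X,
                         family=sm.families.NegativeBinomial(alpha=0.05),
                         offset=np.log(df["population"])).fit()
          rate_ratio = np.exp(model.params["year_c"])
          print(f"annual rate ratio: {rate_ratio:.3f} (1.0 indicates no trend)")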

      2. A comparison of Kenyan Biomphalaria pfeifferi and B. sudanica as vectors for Schistosoma mansoni, including a discussion of the need to better understand the effects of snail breeding systems on transmission
        Mutuku M, Lu L, Otiato F, Mwangi IN, Kinuthia JM, Maina GM, Laidemitt MR, Lelo EA, Ochanda H, Loker ES, Mkoji GM.
        J Parasitol. 2017 Jul 14.
        In Kenya, schistosomes infect an estimated 6 million people with >30 million people at risk of infection. We compared compatibility with, and ability to support and perpetuate, Schistosoma mansoni of Biomphalaria pfeifferi and Biomphalaria sudanica, 2 prominent freshwater snail species involved in schistosomiasis transmission in Kenya. Field-derived B. pfeifferi (from a stream in Mwea, central Kenya) and B. sudanica (from Nawa, Lake Victoria, in western Kenya) were exposed to S. mansoni miracidia isolated from fecal samples of naturally infected humans from Mwea or Nawa. Juvenile (<6 mm shell diameter), young adult (6-9 mm), and adult snails (>9 mm) were each exposed to a single miracidium. Schistosoma mansoni developed faster and consistently had higher infection rates (39.6-80.7%) in B. pfeifferi than in B. sudanica (2.4-21.5%), regardless of the source of S. mansoni or the size of the snails used. Schistosoma mansoni from Nawa produced higher infection rates in both B. pfeifferi and B. sudanica than did S. mansoni from Mwea. Mean daily cercariae production was greater for B. pfeifferi exposed to sympatric than allopatric S. mansoni (583-1,686 vs. 392-1,232), and mean daily cercariae production amongst B. sudanica was consistently low (50-590) with no significant differences between sympatric or allopatric combinations. Both non-miracidia exposed and miracidia-exposed B. pfeifferi had higher mortality rates than for B. sudanica, but mean survival time of shedding snails (9.3-13.7 weeks) did not differ significantly between the 2 snail species. A small proportion (1.5%) of the cercariae shedding B. pfeifferi survived up to 40 wk post-exposure. Biomphalaria pfeifferi was more likely to become infected and to shed more cercariae than B. sudanica, suggesting that the risk per individual snail of perpetuating transmission in Kenyan streams or lacustrine habitats may differ considerably. The high infection rates exhibited by the preferentially self-fertilizing B. pfeifferi relative to the out-crossing B. sudanica point to the need to investigate further the role of host breeding systems in influencing transmission of schistosomiasis by snail hosts.

      3. Resisting and tolerating P. falciparum in pregnancy under different malaria transmission intensities
        Ndam NT, Mbuba E, Gonzalez R, Cistero P, Kariuki S, Sevene E, Ruperez M, Fonseca AM, Vala A, Maculuve S, Jimenez A, Quinto L, Ouma P, Ramharter M, Aponte JJ, Nhacolo A, Massougbodji A, Briand V, Kremsner PG, Mombo-Ngoma G, Desai M, Macete E, Cot M, Menendez C, Mayor A.
        BMC Med. 2017 Jul 17;15(1):130.
        BACKGROUND: Resistance and tolerance to Plasmodium falciparum can determine the progression of malaria disease. However, quantitative evidence of tolerance is still limited. We investigated variations in the adverse impact of P. falciparum infections among African pregnant women under different intensities of malaria transmission. METHODS: P. falciparum at delivery was assessed by microscopy, quantitative PCR (qPCR) and placental histology in 946 HIV-uninfected and 768 HIV-infected pregnant women from Benin, Gabon, Kenya and Mozambique. Resistance was defined by the proportion of submicroscopic infections and the levels of anti-parasite antibodies quantified by Luminex, and tolerance by the relationship of pregnancy outcomes with parasite densities at delivery. RESULTS: P. falciparum prevalence by qPCR in peripheral and/or placental blood of HIV-uninfected Mozambican, Gabonese and Beninese women at delivery was 6% (21/340), 11% (28/257) and 41% (143/349), respectively. The proportion of peripheral submicroscopic infections was higher in Benin (83%) than in Mozambique (60%) and Gabon (55%; P = 0.033). Past or chronic placental P. falciparum infection was associated with an increased risk of preterm birth in Mozambican newborns (OR = 7.05, 95% CI 1.79 to 27.82). Microscopic infections were associated with reductions in haemoglobin levels at delivery among Mozambican women (-1.17 g/dL, 95% CI -2.09 to -0.24) as well as with larger drops in haemoglobin levels from recruitment to delivery in Mozambican (-1.66 g/dL, 95% CI -2.68 to -0.64) and Gabonese (-0.91 g/dL, 95% CI -1.79 to -0.02) women. Doubling qPCR-peripheral parasite densities in Mozambican women was associated with decreases in haemoglobin levels at delivery (-0.16 g/dL, 95% CI -0.29 to -0.02) and increases in the drop of haemoglobin levels (-0.29 g/dL, 95% CI -0.44 to -0.14). Beninese women had higher anti-parasite IgGs than Mozambican women (P < 0.001). No difference was found in the proportion of submicroscopic infections or in the adverse impact of P. falciparum infections in HIV-infected women from Kenya (P. falciparum prevalence by qPCR: 9%, 32/351) and Mozambique (4%, 15/417). CONCLUSIONS: The lowest levels of resistance and tolerance in pregnant women from areas of low malaria transmission were accompanied by the largest adverse impact of P. falciparum infections. Exposure-dependent mechanisms developed by pregnant women to resist the infection and minimise pathology can reduce malaria-related adverse outcomes. Distinguishing both types of defences is important to understand how reductions in transmission can affect malaria disease. TRIAL REGISTRATION: ClinicalTrials.gov NCT00811421. Registered 18 December 2008.

      4. Evaluation of onchocerciasis transmission in Tanzania: Preliminary rapid field results in the Tukuyu Focus, 2015
        Paulin HN, Nshala A, Kalinga A, Mwingira U, Wiegand R, Cama V, Cantey PT.
        Am J Trop Med Hyg. 2017 Jun 12.
        To compare diagnostic tests for onchocerciasis in a setting that has suppressed transmission, a randomized, age-stratified study was implemented in an area in Tanzania that had received 15 rounds of annual mass drug administration (MDA) with ivermectin. Study participants (N = 948) from 11 villages completed a questionnaire and underwent skin examination, skin snips, and blood draw. The burden of symptomatic disease was low. Ov-16 antibody rapid diagnostic test (RDT) results were positive in 38 (5.5%) participants, with 1 (0.5%), 1 (0.4%), and 2 (0.8%) in children aged 0-5, 6-10, and 11-15 years, respectively. Despite significant impact of MDA on transmission, the area would have failed to meet World Health Organization serologic criteria for stopping MDA if a full evaluation had been conducted. The specificity of the RDT, which is 97-98%, may result in the identification of a number of false positives that would exceed the current stop MDA threshold.

      5. Estimating the added utility of highly sensitive histidine-rich protein 2 detection in outpatient clinics in Sub-Saharan Africa
        Plucinski MM, Rogier E, Dimbu PR, Fortes F, Halsey ES, Aidoo M.
        Am J Trop Med Hyg. 2017 Jun 12.
        Most malaria testing is by rapid diagnostic tests (RDTs) that detect Plasmodium falciparum histidine-rich protein 2 (HRP2). Recently, several RDT manufacturers have developed highly sensitive RDTs (hsRDTs), promising a limit of detection (LOD) orders of magnitude lower than conventional RDTs. To model the added utility of hsRDTs, HRP2 concentration in Angolan outpatients was measured quantitatively using an ultrasensitive bead-based assay. The distribution of HRP2 concentration was bimodal in both afebrile and febrile patients. The conventional RDT was able to detect 81% of all HRP2-positive febrile patients and 52-77% of HRP2-positive afebrile patients. The added utility of hsRDTs was estimated to be greater in afebrile patients, where an hsRDT with a LOD of 200 pg/mL would detect an additional 50-60% of HRP2-positive persons compared with a conventional RDT with a LOD of 3,000 pg/mL. In febrile patients, the hsRDT would detect an additional 10-20% of cases. Conventional RDTs already capture the vast majority of symptomatic HRP2-positive individuals, and hsRDTs would have to reach a sufficiently low LOD approaching 200 pg/mL to provide added utility in identifying HRP2-positive, asymptomatic individuals.
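        The added-utility comparison above reduces to asking what fraction of HRP2-positive specimens lie above each test's limit of detection. A minimal sketch (Python with numpy) on a simulated bimodal concentration distribution, not the Angolan data, is:

          import numpy as np

          rng = np.random.default_rng(1)
          # Simulated HRP2 concentrations (pg/mL) with two log-normal modes, loosely mimicking a bimodal shape
          hrp2 = np.r_[rng.lognormal(mean=5.5, sigma=1.0, size=400),
                       rng.lognormal(mean=10.5, sigma=1.0, size=600)]

          for label, lod in [("conventional RDT", 3000.0), ("hsRDT", 200.0)]:
              detected = (hrp2 >= lod).mean()
              print(f"{label} (LOD {lod:g} pg/mL): detects {detected:.0%} of HRP2-positive specimens")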

      6. Serologic monitoring of public health interventions against Strongyloides stercoralis
        Vargas P, Krolewiecki AJ, Echazu A, Juarez M, Cajal P, Gil JF, Caro N, Nasser J, Lammie P, Cimino RO.
        Am J Trop Med Hyg. 2017 Jul;97(1):166-172.
        Northwestern Argentina is endemic for soil-transmitted helminths, and annual deworming programs are carried out in prioritized areas. A high prevalence of Strongyloides stercoralis was reported in this area; therefore, control programs including ivermectin are being evaluated. The NIE enzyme-linked immunosorbent assay (ELISA) was used for this purpose. In this community trial, two groups of patients, classified according to housing and living conditions, were evaluated. Simultaneously with the baseline survey, Group 1 was moved to new households with access to improved water and sanitation facilities (W and S), where deworming (MDA, mass drug administration) took place within 1 month, whereas Group 2 received MDA but remained living with unimproved W and S. The mean time interval between baseline and follow-up was 331 days for Group 1 and 508 days for Group 2. Anti-NIE levels were measured for each individual before and after the interventions, and follow-up optical density (OD) ratios were calculated to quantify the variation. A significant decrease in anti-NIE levels between baseline and follow-up was observed in both groups. Nonetheless, the proportion of patients who achieved the cure criterion (OD ratio < 0.6) was higher in Group 1 than in Group 2, at 72.7% (24/33) and 45.0% (18/40), respectively (P = 0.0197). Our results support the conclusion that a combined intervention including deworming and improvements in living conditions is more effective, in terms of the proportion of subjects cured, than deworming alone. Furthermore, we found that the NIE-ELISA is a useful test for assessing the response to treatment and for evaluating the outcomes of control intervention programs.

    • Public Health Leadership and Management RSS Word feed
      1. State health agency and local health department workforce: Identifying top development needs
        Beck AJ, Leider JP, Coronado F, Harper E.
        Am J Public Health. 2017 Jul 20:e1-e7.
        OBJECTIVES: To identify occupations with high-priority workforce development needs at public health departments in the United States. METHODS: We surveyed 46 state health agencies (SHAs) and 112 local health departments (LHDs). We asked respondents to prioritize workforce needs for 29 occupations and identify whether more positions, more qualified candidates, more competitive salaries for recruitment or retention, or new or different staff skills were needed. RESULTS: Forty-one SHAs (89%) and 36 LHDs (32%) participated. The SHAs reported having high-priority workforce needs for epidemiologists and laboratory workers; LHDs for disease intervention specialists, nurses, and administrative support, management, and leadership positions. Overall, the most frequently reported SHA workforce needs were more qualified candidates and more competitive salaries. The LHDs most frequently reported a need for more positions across occupations and more competitive salaries. Workforce priorities for respondents included strengthening epidemiology workforce capacity, adding administrative positions, and improving compensation to recruit and retain qualified employees. CONCLUSIONS: Strategies for addressing workforce development concerns of health agencies include providing additional training and workforce development resources, and identifying best practices for recruitment and retention of qualified candidates. (Am J Public Health. Published online ahead of print July 20, 2017: e1-e7. doi:10.2105/AJPH.2017.303875).

    • Reproductive Health RSS Word feed
      1. Health-related quality of life for women ever experiencing infertility or difficulty staying pregnant
        Boulet SL, Smith RA, Crawford S, Kissin DM, Warner L.
        Matern Child Health J. 2017 Jul 18.
        INTRODUCTION: Information on the health-related quality of life (HRQOL) for women with infertility is limited and does not account for the co-occurrence of chronic conditions or emotional distress. METHODS: We used data from state-added questions on reproductive health included in the 2013 Behavioral Risk Factor Surveillance System in seven states. HRQOL indicators included: self-reported health status; number of days in the past 30 days when physical and mental health was not good; number of days in the past 30 days that poor physical or mental health limited activities. We computed rate ratios for HRQOL for women ever experiencing infertility or difficulty staying pregnant compared with women never reporting these conditions; interactions with chronic conditions and depressive disorders were assessed. RESULTS: Of 7,526 respondents aged 18-50 years, 387 (4.9%) reported infertility only and 339 (4.3%) reported difficulty staying pregnant only. Infertility was associated with an increase in average number of days with poor physical health for women with chronic conditions [rate ratio (RR) 1.85, 95% confidence interval (CI) 1.04-3.29] but was protective for women without chronic conditions (RR 0.47, 95% CI 0.29-0.75). Difficulty staying pregnant was associated with an increase in average number of days of limited activity among both women with chronic conditions (RR 2.14, 95% CI 1.32-3.45) and women with depressive disorders (RR 1.72 95% CI 1.14-2.62). DISCUSSION: Many HRQOL measures were poorer for women who had infertility or difficulty staying pregnant compared to their counterparts; the association was modified by presence of chronic conditions and depressive disorders.

      2. Population-attributable fraction of tubal factor infertility associated with chlamydia
        Gorwitz RJ, Wiesenfeld HC, Chen PL, Hammond KR, Sereday KA, Haggerty CL, Johnson RE, Papp JR, Kissin DM, Henning TC, Hook EW, Steinkampf MP, Markowitz LE, Geisler WM.
        Am J Obstet Gynecol. 2017 May 19.
        BACKGROUND: Chlamydia trachomatis infection is highly prevalent among young women in the United States. Prevention of long-term sequelae of infection, including tubal factor infertility, is a primary goal of chlamydia screening and treatment activities. However, the population-attributable fraction of tubal factor infertility associated with chlamydia is unclear, and optimal measures for assessing tubal factor infertility and prior chlamydia in epidemiological studies have not been established. Black women have increased rates of chlamydia and tubal factor infertility compared with White women but have been underrepresented in prior studies of the association of chlamydia and tubal factor infertility. OBJECTIVES: The objectives of the study were to estimate the population-attributable fraction of tubal factor infertility associated with Chlamydia trachomatis infection by race (Black, non-Black) and assess how different definitions of Chlamydia trachomatis seropositivity and tubal factor infertility affect population-attributable fraction estimates. STUDY DESIGN: We conducted a case-control study, enrolling infertile women attending infertility practices in Birmingham, AL, and Pittsburgh, PA, during October 2012 through June 2015. Tubal factor infertility case status was primarily defined by unilateral or bilateral fallopian tube occlusion (cases) or bilateral fallopian tube patency (controls) on hysterosalpingogram. Alternate tubal factor infertility definitions incorporated history suggestive of tubal damage or were based on laparoscopic evidence of tubal damage. We aimed to enroll all eligible women, with an expected ratio of 1 and 3 controls per case for Black and non-Black women, respectively. We assessed Chlamydia trachomatis seropositivity with a commercial assay and a more sensitive research assay; our primary measure of seropositivity was defined as positivity on either assay. We estimated Chlamydia trachomatis seropositivity and calculated Chlamydia trachomatis-tubal factor infertility odds ratios and population-attributable fraction, stratified by race. RESULTS: We enrolled 107 Black women (47 cases, 60 controls) and 620 non-Black women (140 cases, 480 controls). Chlamydia trachomatis seropositivity by either assay was 81% (95% confidence interval, 73-89%) among Black and 31% (95% confidence interval, 28-35%) among non-Black participants (P < .001). Using the primary Chlamydia trachomatis seropositivity and tubal factor infertility definitions, no significant association was detected between chlamydia and tubal factor infertility among Blacks (odds ratio, 1.22, 95% confidence interval, 0.45-3.28) or non-Blacks (odds ratio, 1.41, 95% confidence interval, 0.95-2.09), and the estimated population-attributable fraction was 15% (95% confidence interval, -97% to 68%) among Blacks and 11% (95% confidence interval, -3% to 23%) among non-Blacks. Use of alternate serological measures and tubal factor infertility definitions had an impact on the magnitude of the chlamydia-tubal factor infertility association and resulted in a significant association among non-Blacks. CONCLUSION: Low population-attributable fraction estimates suggest factors in addition to chlamydia contribute to tubal factor infertility in the study population. However, high background Chlamydia trachomatis seropositivity among controls, most striking among Black participants, could have obscured an association with tubal factor infertility and resulted in a population-attributable fraction that underestimates the true etiological role of chlamydia. Choice of chlamydia and tubal factor infertility definitions also has an impact on the odds ratio and population-attributable fraction estimates.
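        For reference, the population-attributable fraction in a case-control design such as this one is conventionally estimated with Miettinen's case-based formula; the abstract does not state the exact estimator used, so the following is offered only as the standard form:

            \mathrm{PAF} \;=\; p_c \left( \frac{\mathrm{OR} - 1}{\mathrm{OR}} \right),

        where p_c is the proportion of cases exposed (here, seropositive for Chlamydia trachomatis) and OR is the exposure-outcome odds ratio. Odds ratios near or below 1 drive the PAF toward zero or negative values, which is consistent with the wide confidence intervals reported above.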

      3. Prostate specific antigen concentration in vaginal fluid after exposure to semen
        Kulczycki A, Brill I, Snead MC, Macaluso M.
        Contraception. 2017 Jul 12.
        OBJECTIVE: Prostate-specific antigen (PSA) is the best established biomarker of semen exposure. PSA in vaginal fluid returns to pre-exposure concentrations within 24-48 h, but the speed of decay during the first ten hours is unknown. We sought to determine how fast PSA concentrations decline during the first ten hours after exposure to semen. STUDY DESIGN: Women in the 50 enrolled couples were intravaginally inoculated with 10, 20, 100, and 200 μL of their partner’s semen and then collected vaginal swabs immediately after exposure and at 30 min, 4 h, and 10 h after exposure. Forty-seven sets of samples were tested for PSA. Mixed linear models for repeated measures examined the association between log-transformed PSA values and sampling time and semen exposure volume. Sensitivity analyses excluded data from non-abstainers. Fixed effect estimates from the statistical models were graphed. RESULTS: PSA values were highest at the 200 μL inoculation volume and at earlier post-exposure timepoints, then declined steadily. The lowest inoculation volume (10 μL) corresponded to the smallest PSA concentrations throughout the post-inoculation timepoints. Average PSA levels returned to clinically non-detectable levels within 10 h only at the lowest semen exposures. The PSA decay curve assumed a very similar profile across all timepoints and semen amounts. CONCLUSIONS: The PSA decay curve is similar for varying semen exposure volumes, with average PSA concentrations remaining above clinical thresholds 10 h after exposure at all except the very smallest semen exposure levels. PSA is an objective marker of recent exposure to semen, permitting such detection with high accuracy. IMPLICATIONS: This study clarifies how PSA values vary at different semen exposure levels and timepoints during the first 10 h post-exposure. Future contraceptive studies that use PSA as a semen biomarker will be better informed about PSA concentrations at different sampling times and exposure amounts.
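        A minimal sketch of the repeated-measures analysis described above, assuming a mixed linear model fit with the statsmodels package (the paper does not name its software): log-transformed PSA is regressed on sampling time and inoculation volume with a random intercept per woman. The simulated values and the one-volume-per-woman design are placeholders, not the study data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        times_h = [0.0, 0.5, 4.0, 10.0]          # sampling times after exposure (h)
        volumes_ul = [10, 20, 100, 200]          # inoculation volumes (microlitres)

        rows = []
        for woman in range(47):                  # 47 complete sample sets, as above
            vol = rng.choice(volumes_ul)
            baseline = np.log10(vol * 50) + rng.normal(0, 0.3)   # woman-specific level
            for t in times_h:
                rows.append({"woman": woman, "hours": t, "volume_ul": vol,
                             "log_psa": baseline - 0.15 * t + rng.normal(0, 0.2)})
        df = pd.DataFrame(rows)

        # Random intercept for each woman; fixed effects for time and (log) volume.
        model = smf.mixedlm("log_psa ~ hours + np.log10(volume_ul)",
                            data=df, groups=df["woman"])
        print(model.fit().summary())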

      4. Health and economic burden of preeclampsia: no time for complacency
        Li R, Tsigas EZ, Callaghan WM.
        Am J Obstet Gynecol. 2017 Jul 11.

        [No abstract]

    • Statistics as Topic RSS Word feed
      1. Analysis of dependently truncated data in Cox framework
        Liu Y, Li J, Zhang X.
        Commun Stat Simul Comput. 2017 ;2017:1-19.
        Truncation is a known feature of bone marrow transplant (BMT) registry data, for which the survival time of a leukemia patient is left truncated by the waiting time to transplant. It was recently noted that a longer waiting time was linked to poorer survival. A straightforward solution is a Cox model on the survival time with the waiting time as both truncation variable and covariate. The Cox model should also include other recognized risk factors as covariates. In this paper we focus on estimating the distribution function of waiting time and the probability of selection under the aforementioned Cox model.
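        The modelling setup described above can be sketched with the lifelines package (an assumption; the paper does not specify software). The survival time is left truncated by the waiting time to transplant, which is passed as the delayed-entry time and, in a duplicate column, also used as a covariate alongside another illustrative risk factor. The data are simulated placeholders, not BMT registry records.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(1)
        n = 500
        waiting_time = rng.exponential(scale=6.0, size=n)      # months to transplant
        age = rng.normal(loc=40, scale=10, size=n)             # illustrative risk factor
        latent_survival = rng.exponential(scale=36.0, size=n)  # months from diagnosis

        # Left truncation: a patient enters the registry only if they survive
        # long enough to receive a transplant.
        observed = latent_survival > waiting_time
        df = pd.DataFrame({
            "entry": waiting_time[observed],         # truncation (delayed-entry) time
            "waiting_time": waiting_time[observed],  # same quantity, used as a covariate
            "age": age[observed],
            "duration": latent_survival[observed],
            "event": 1,                              # all deaths observed, for simplicity
        })

        cph = CoxPHFitter()
        # entry_col handles the delayed entry; it is not itself treated as a
        # covariate, which is why waiting_time appears again as a separate column.
        cph.fit(df, duration_col="duration", event_col="event", entry_col="entry")
        cph.print_summary()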

    • Substance Use and Abuse RSS Word feed
      1. The relationship between acculturation, ecodevelopment, and substance use among Hispanic adolescents
        Martinez MJ, Huang S, Estrada Y, Sutton MY, Prado G.
        J Early Adolesc. 2017 ;37(7):948-974.
        Using structural equation modeling, we examined the effect of Hispanicism on recent substance use and whether Americanism moderated that effect in a sample of 1,141 Hispanic adolescents. The Bicultural Involvement Questionnaire (BIQ) was used to determine the degree of individual comfort in both Hispanic (Hispanicism) and American (Americanism) cultures. Hispanicism was associated with greater family functioning (beta = .36, p < .05) and school bonding (beta = .31, p < .01); Americanism moderated the effect of Hispanicism on substance use (beta = .92, p < .01). Findings suggest that Hispanic culture was protective against substance use; however, those effects differed depending on the level of Americanism.

      2. Nondaily smokers’ characteristics and likelihood of prenatal cessation and postpartum relapse
        Rockhill KM, Tong VT, England LJ, D’Angelo DV.
        Nicotine Tob Res. 2017 Jul 01;19(7):810-816.
        Introduction: This study aimed to calculate the prevalence of pre-pregnancy nondaily smoking (<1 cigarette/day), risk factors, and report of prenatal provider smoking education; and assess the likelihood of prenatal cessation and postpartum relapse for nondaily smokers. Methods: We analyzed data from 2009 to 2011 among women with live-born infants participating in the Pregnancy Risk Assessment Monitoring System. We compared characteristics of pre-pregnancy daily smokers (≥1 cigarette/day), nondaily smokers, and nonsmokers (chi-square adjusted p < .025). Between nondaily and daily smokers, we compared proportions of prenatal cessation, postpartum relapse (average 4 months postpartum), and reported provider education. Multivariable logistic regression calculated adjusted prevalence ratios (APR) for prenatal cessation among pre-pregnancy smokers (n = 27,360) and postpartum relapse among quitters (n = 13,577). Results: Nondaily smokers (11% of smokers) were more similar to nonsmokers and differed from daily smokers on characteristics examined (p ≤ .001 for all). Fewer nondaily smokers reported provider education than daily smokers (71.1%, 86.3%; p < .001). A higher proportion of nondaily compared to daily smokers quit during pregnancy (89.7%, 49.0%; p < .001), and a lower proportion relapsed postpartum (22.2%, 48.6%; p < .001). After adjustment, nondaily compared to daily smokers were more likely to quit (APR: 1.65; 95% confidence interval [CI]: 1.58-1.71) and less likely to relapse postpartum (APR: 0.55; 95% CI: 0.48-0.62). Conclusions: Nondaily smokers were more likely to quit smoking during pregnancy, less likely to relapse postpartum, and less likely to report provider education than daily smokers. Providers should educate all women, regardless of frequency of use, about the harms of tobacco during pregnancy, provide effective cessation interventions, and encourage women to be tobacco free postpartum and beyond. Implication: Nondaily smoking (<1 cigarette/day) is increasing among US smokers and carries a significant risk of disease. However, smoking patterns surrounding pregnancy among nondaily smokers are unknown. Using 2009-2011 data from the Pregnancy Risk Assessment Monitoring System, we found pre-pregnancy nondaily smokers compared to daily smokers were 65% more likely to quit smoking during pregnancy and almost half as likely to relapse postpartum. Providers should educate all women, regardless of frequency of use, about the harms of tobacco during pregnancy, provide effective cessation interventions, and encourage women to be tobacco free postpartum and beyond.
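        One common way to obtain an adjusted prevalence ratio (APR) from a logistic model is marginal standardization (predicted margins); the abstract does not state which method the authors used, so the sketch below is illustrative only, with simulated data standing in for PRAMS.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(2)
        n = 5_000
        nondaily = rng.integers(0, 2, size=n)      # 1 = nondaily smoker, 0 = daily
        age = rng.normal(27, 6, size=n)            # illustrative covariate

        # Hypothetical data-generating model for quitting during pregnancy.
        logit_p = -0.5 + 1.2 * nondaily + 0.02 * (age - 27)
        quit = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

        X = pd.DataFrame({"const": 1.0, "nondaily": nondaily, "age": age})
        fit = sm.Logit(quit, X).fit(disp=False)

        # Predicted margins: average predicted probability of quitting with
        # everyone set to nondaily = 1, then to nondaily = 0; the ratio is the APR.
        X1, X0 = X.copy(), X.copy()
        X1["nondaily"], X0["nondaily"] = 1, 0
        apr = fit.predict(X1).mean() / fit.predict(X0).mean()
        print(f"Adjusted prevalence ratio (nondaily vs. daily): {apr:.2f}")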

    • Zoonotic and Vectorborne Diseases RSS Word feed
      1. Rocky Mountain spotted fever and pregnancy: Four cases from Sonora, Mexico
        Licona-Enriquez JD, Delgado-de la Mora J, Paddock CD, Ramirez-Rodriguez CA, Candia-Plata MD, Hernandez GA.
        Am J Trop Med Hyg. 2017 Jun 12.
        We present a series of four pregnant women with Rocky Mountain spotted fever (RMSF) that occurred in Sonora, Mexico, during 2015-2016. Confirmatory diagnoses were made by polymerase chain reaction or serological reactivity to antigens of Rickettsia rickettsii by using an indirect immunofluorescence antibody assay. Each patient presented with fever and petechial rash and was treated successfully with doxycycline. Each of the women and one full-term infant delivered at 36 weeks gestation survived the infection. Three of the patients in their first trimester of pregnancy suffered spontaneous abortions. RMSF should be suspected in any pregnant woman presenting with fever, malaise and rash in regions where R. rickettsii is endemic.

      2. Diversity and phylogenetic relationships among Bartonella strains from Thai bats
        McKee CD, Kosoy MY, Bai Y, Osikowicz LM, Franka R, Gilbert AT, Boonmar S, Rupprecht CE, Peruski LF.
        PLoS One. 2017 ;12(7):e0181696.
        Bartonellae are phylogenetically diverse, intracellular bacteria commonly found in mammals. Previous studies have demonstrated that bats have a high prevalence and diversity of Bartonella infections globally. Isolates (n = 42) were obtained from five bat species in four provinces of Thailand and analyzed using sequences of the citrate synthase gene (gltA). Sequences clustered into seven distinct genogroups; four of these genogroups displayed similarity with Bartonella spp. sequences from other bats in Southeast Asia, Africa, and Eastern Europe. Thirty of the isolates representing these seven genogroups were further characterized by sequencing four additional loci (ftsZ, nuoG, rpoB, and ITS) to clarify their evolutionary relationships with other Bartonella species and to assess patterns of diversity among strains. Among the seven genogroups, there were differences in the number of sequence variants, ranging from 1-5, and the amount of nucleotide divergence, ranging from 0.035-3.9%. Overall, these seven genogroups meet the criteria for distinction as novel Bartonella species, with sequence divergence among genogroups ranging from 6.4-15.8%. Evidence of intra- and intercontinental phylogenetic relationships and instances of homologous recombination among Bartonella genogroups in related bat species were found in Thai bats.

      3. Human cases of tularemia in Armenia, 1996-2012
        Melikjanyan S, Palayan K, Vanyan A, Avetisyan L, Bakunts N, Kotanyan M, Guerra M.
        Am J Trop Med Hyg. 2017 Jun 19.
        A retrospective analysis was conducted of human cases and outbreaks of tularemia in the Republic of Armenia from 1996 to 2012, utilizing Geographic Information System software. A total of 266 human cases of tularemia were recorded in Armenia from 1996 to 2012, with yearly incidence ranging from 0 to 5.5 cases per 100,000 people. Cases predominantly affected the male population (62.8%), the 11-20 year age group (37.2%), agricultural workers (49.6%), and persons residing in rural areas (93.6%). In 2003, a waterborne outbreak involving 158 cases occurred in Kotayk Marz, and in 2007, a foodborne outbreak with 17 cases occurred in Gegharkunik Marz, attributed to exposure of food products to contaminated hay. Geospatial analysis of all cases showed that the majority were associated with the steppe vegetation zone, elevations between 1,400 and 2,300 m, and the climate zone associated with dry, warm summers and cold winters. Characterization of these environmental factors was used to develop a predictive risk model to improve surveillance and outbreak response for tularemia in Armenia.

      4. Fatal Leptospira spp./Zika virus coinfection – Puerto Rico, 2016
        Neaterour P, Rivera A, Galloway RL, Negron MG, Rivera-Garcia B, Sharp TM.
        Am J Trop Med Hyg. 2017 Jun 26.
        Coinfection with pathogens that cause acute febrile illness (AFI) can complicate patient diagnosis and management. This report describes a fatal case of Leptospira spp./Zika virus (ZIKV) coinfection in Puerto Rico. The patient presented with a 5-day history of AFI; reported behavioral risk factors for leptospirosis; was diagnosed with possible leptospirosis, dengue, chikungunya, or ZIKV disease; and received appropriate treatment for leptospirosis and dengue. Following a 3-day hospitalization, the patient died of acute gastrointestinal hemorrhage, kidney failure, and liver failure. Serologic diagnostic tests for leptospirosis and ZIKV disease were both negative; however, molecular diagnostic testing performed postmortem detected Leptospira spp. and ZIKV nucleic acid. This case demonstrates the need for continued clinical awareness of leptospirosis in areas experiencing outbreaks of pathogens that cause AFI and the need for evaluation of coinfection with AFI-causing pathogens as a risk factor for increased severity of disease.

      5. Cat scratch disease: U.S. clinicians’ experience and knowledge
        Nelson CA, Moore AR, Perea AE, Mead PS.
        Zoonoses Public Health. 2017 Jul 14.
        Cat scratch disease (CSD) is a zoonotic infection caused primarily by the bacterium Bartonella henselae. An estimated 12,000 outpatients and 500 inpatients are diagnosed with CSD annually, yet little is known regarding clinician experience with and treatment of CSD in the United States. Questions assessing clinical burden, treatment and prevention of CSD were posed to 3,011 primary care providers (family practitioners, internists, paediatricians and nurse practitioners) during 2014-2015 as part of the annual nationwide DocStyles survey. Among the clinicians surveyed, 37.2% indicated that they had diagnosed at least one patient with CSD in the prior year. Clinicians in the Pacific and Southern regions were more likely to have diagnosed CSD, as were clinicians who saw paediatric patients, regardless of specialty. When presented with a question regarding treatment of uncomplicated CSD, only 12.5% of clinicians chose the recommended treatment option of analgesics and monitoring, while 71.4% selected antibiotics and 13.4% selected lymph node aspiration. In a scenario concerning CSD prevention in immunosuppressed patients, 80.6% of clinicians chose some form of precaution, but less than one-third chose the recommended option of counseling patients to treat their cats for fleas and avoid rough play with their cats. Results from this study indicate that a substantial proportion of U.S. clinicians have diagnosed CSD within the past year. Although published guidelines exist for treatment and prevention of CSD, these findings suggest that knowledge gaps remain. Therefore, targeted educational efforts about CSD may benefit primary care providers.

      6. The eyes as a window to improved understanding of the prenatal effects of Zika virus infection
        Prakalapakorn SG, Meaney-Delman D, Honein MA, Rasmussen SA.
        J aapos. 2017 Jul 11.

        [No abstract]

      7. Temperature modulates dengue virus epidemic growth rates through its effects on reproduction numbers and generation intervals
        Siraj AS, Oidtman RJ, Huber JH, Kraemer MU, Brady OJ, Johansson MA, Perkins TA.
        PLoS Negl Trop Dis. 2017 Jul 19;11(7):e0005797.
        Epidemic growth rate, r, provides a more complete description of the potential for epidemics than the more commonly studied basic reproduction number, R0, yet the former has never been described as a function of temperature for dengue virus or other pathogens with temperature-sensitive transmission. The need to understand the drivers of epidemics of these pathogens is acute, with arthropod-borne virus epidemics becoming increasingly problematic. We addressed this need by developing temperature-dependent descriptions of the two components of r (R0 and the generation interval) to obtain a temperature-dependent description of r. Our results show that the generation interval is highly sensitive to temperature, decreasing twofold between 25 and 35 degrees C and suggesting that dengue virus epidemics may accelerate as temperatures increase, not only because of more infections per generation but also because of faster generations. Under the empirical temperature relationships that we considered, we found that r peaked at a temperature threshold that was robust to uncertainty in model parameters that do not depend on temperature. Although the precise value of this temperature threshold could be refined following future studies of empirical temperature relationships, the framework we present for identifying such temperature thresholds offers a new way to classify regions in which dengue virus epidemic intensity could either increase or decrease under future climate change.
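        The abstract does not give the authors' exact temperature-dependent formulations, but the standard relation linking the epidemic growth rate to its two components is the Euler-Lotka equation,

            \frac{1}{R_0(T)} \;=\; \int_0^{\infty} e^{-r(T)\,\tau}\, g_T(\tau)\, d\tau,

        where g_T is the generation-interval distribution at temperature T. If the generation interval is concentrated near a single value T_g(T), this reduces to r(T) \approx \ln R_0(T) / T_g(T), which makes explicit why warming can raise r both through more infections per generation (a larger R_0) and through faster generations (a smaller T_g).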


CDC Science Clips Production Staff

  • John Iskander, MD MPH, Editor
  • Gail Bang, MLIS, Librarian
  • Kathy Tucker, Librarian
  • William (Bill) Thomas, MLIS, Librarian
  • Onnalee Gomez, MS, Health Scientist
  • Jarvis Sims, MIT, MLIS, Librarian

____

DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinion, findings and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC Websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.
