

CDC Science Clips: Volume 9, Issue 26, July 6, 2017

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention score to track social and mainstream media mentions!

This week, Science Clips is pleased to collaborate with CDC Vital Signs by featuring scientific articles from the July Vital Signs (www.cdc.gov/vitalsigns). The articles marked with an asterisk are general review articles which may be of particular interest to clinicians and public health professionals seeking background information in this area.

  1. CDC Vital Signs
    • Prescription Drug Overdose
      1. *Relationship between nonmedical prescription-opioid use and heroin use
        Compton WM, Jones CM, Baldwin GT.
        N Engl J Med. 2016 Jan 14;374(2):154-63.

        [No abstract]

      2. *Opioid abuse in chronic pain – misconceptions and mitigation strategies
        Volkow ND, McLellan AT.
        N Engl J Med. 2016 Mar 31;374(13):1253-63.

        [No abstract]

      3. Association between opioid prescribing patterns and opioid overdose-related deaths
        Bohnert AS, Valenstein M, Bair MJ, Ganoczy D, McCarthy JF, Ilgen MA, Blow FC.
        JAMA. 2011 Apr 06;305(13):1315-21.
        CONTEXT: The rate of prescription opioid-related overdose death increased substantially in the United States over the past decade. Patterns of opioid prescribing may be related to risk of overdose mortality. OBJECTIVE: To examine the association of maximum prescribed daily opioid dose and dosing schedule (“as needed,” regularly scheduled, or both) with risk of opioid overdose death among patients with cancer, chronic pain, acute pain, and substance use disorders. DESIGN: Case-cohort study. SETTING: Veterans Health Administration (VHA), 2004 through 2008. PARTICIPANTS: All unintentional prescription opioid overdose decedents (n = 750) and a random sample of patients (n = 154,684) among those individuals who used medical services in 2004 or 2005 and received opioid therapy for pain. MAIN OUTCOME MEASURE: Associations of opioid regimens (dose and schedule) with death by unintentional prescription opioid overdose in subgroups defined by clinical diagnoses, adjusting for age group, sex, race, ethnicity, and comorbid conditions. RESULTS: The frequency of fatal overdose over the study period among individuals treated with opioids was estimated to be 0.04%. The risk of overdose death was directly related to the maximum prescribed daily dose of opioid medication. The adjusted hazard ratios (HRs) associated with a maximum prescribed dose of 100 mg/d or more, compared with the dose category 1 mg/d to less than 20 mg/d, were as follows: among those with substance use disorders, adjusted HR = 4.54 (95% confidence interval [CI], 2.46-8.37; absolute risk difference approximation [ARDA] = 0.14%); among those with chronic pain, adjusted HR = 7.18 (95% CI, 4.85-10.65; ARDA = 0.25%); among those with acute pain, adjusted HR = 6.64 (95% CI, 3.31-13.31; ARDA = 0.23%); and among those with cancer, adjusted HR = 11.99 (95% CI, 4.42-32.56; ARDA = 0.45%). Receiving both as-needed and regularly scheduled doses was not associated with overdose risk after adjustment. CONCLUSION: Among patients receiving opioid prescriptions for pain, higher opioid doses were associated with increased risk of opioid overdose death.
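
The ARDA figures quoted above are consistent with simply scaling the overall fatal-overdose frequency (0.04%) by the excess hazard, ARDA ≈ baseline risk × (HR − 1). This formula is our reconstruction for illustration, not one stated in the abstract; a quick check (the cancer subgroup differs only by rounding):

```python
# Check ARDA ~= baseline_risk * (HR - 1), using the 0.04% overall
# fatal-overdose frequency and the adjusted HRs from the abstract.
baseline = 0.04  # percent

reported = {  # subgroup: (adjusted HR, reported ARDA in percent)
    "substance use disorders": (4.54, 0.14),
    "chronic pain": (7.18, 0.25),
    "acute pain": (6.64, 0.23),
    "cancer": (11.99, 0.45),
}

for group, (hr, arda) in reported.items():
    approx = baseline * (hr - 1)
    print(f"{group}: approx {approx:.2f}% vs reported {arda}%")
```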

      4. Trends in long-term opioid therapy for chronic non-cancer pain
        Boudreau D, Von Korff M, Rutter CM, Saunders K, Ray GT, Sullivan MD, Campbell CI, Merrill JO, Silverberg MJ, Banta-Green C, Weisner C.
        Pharmacoepidemiol Drug Saf. 2009 Dec;18(12):1166-75.
        OBJECTIVE: To report trends and characteristics of long-term opioid use for non-cancer pain. METHODS: CONSORT (CONsortium to Study Opioid Risks and Trends) includes adult enrollees of two health plans serving over 1 per cent of the US population. Using automated data, we constructed episodes of opioid use between 1997 and 2005. We estimated age-sex standardized rates of opioid use episodes beginning in each year (incident) and on-going in each year (prevalent), and the per cent change in rates annualized (PCA) over the 9-year period. Long-term episodes were defined as > 90 days with 120+ days supply or 10+ opioid prescriptions in a given year. RESULTS: Over the study period, incident long-term use increased from 8.5 to 12.1 per 1000 at Group Health (GH) (6.0% PCA), and 6.3 to 8.6 per 1000 at Kaiser Permanente of Northern California (KPNC) (5.5% PCA). Prevalent long-term use doubled from 23.9 to 46.8 per 1000 at GH (8.5% PCA), and 21.5 to 39.2 per 1000 at KPNC (8.1% PCA). Non-Schedule II opioids were the most commonly used opioid among patients engaged in long-term opioid therapy, particularly at KPNC. Long-term use of Schedule II opioids also increased substantially at both health plans. Among prevalent long-term users in 2005, 28.6% at GH and 30.2% at KPNC were also regular users of sedative hypnotics. CONCLUSION: Long-term opioid therapy for non-cancer pain is increasingly prevalent, but the benefits and risks associated with such therapy are inadequately understood. Concurrent use of opioids and sedative-hypnotics was unexpectedly common and deserves further study.

      5. CDC Guideline for Prescribing Opioids for Chronic Pain – United States, 2016
        Dowell D, Haegerich TM, Chou R.
        MMWR Recomm Rep. 2016 Mar 18;65(1):1-49.
        This guideline provides recommendations for primary care clinicians who are prescribing opioids for chronic pain outside of active cancer treatment, palliative care, and end-of-life care. The guideline addresses 1) when to initiate or continue opioids for chronic pain; 2) opioid selection, dosage, duration, follow-up, and discontinuation; and 3) assessing risk and addressing harms of opioid use. CDC developed the guideline using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) framework, and recommendations are made on the basis of a systematic review of the scientific evidence while considering benefits and harms, values and preferences, and resource allocation. CDC obtained input from experts, stakeholders, the public, peer reviewers, and a federally chartered advisory committee. It is important that patients receive appropriate pain treatment with careful consideration of the benefits and risks of treatment options. This guideline is intended to improve communication between clinicians and patients about the risks and benefits of opioid therapy for chronic pain, improve the safety and effectiveness of pain treatment, and reduce the risks associated with long-term opioid therapy, including opioid use disorder, overdose, and death. CDC has provided a checklist for prescribing opioids for chronic pain (http://stacks.cdc.gov/view/cdc/38025) as well as a website (http://www.cdc.gov/drugoverdose/prescribingresources.html) with additional tools to guide clinicians in implementing the recommendations.

      6. Mandatory provider review and pain clinic laws reduce the amounts of opioids prescribed and overdose death rates
        Dowell D, Zhang K, Noonan RK, Hockenberry JM.
        Health Aff (Millwood). 2016 Oct 01;35(10):1876-1883.
        To address the opioid overdose epidemic in the United States, states have implemented policies to reduce inappropriate opioid prescribing. These policies could affect the coincident heroin overdose epidemic by either driving the substitution of heroin for opioids or reducing simultaneous use of both substances. We used IMS Health’s National Prescription Audit and government mortality data to examine the effect of these policies on opioid prescribing and on prescription opioid and heroin overdose death rates in the United States during 2006-13. The analysis revealed that combined implementation of mandated provider review of state-run prescription drug monitoring program data and pain clinic laws reduced opioid amounts prescribed by 8 percent and prescription opioid overdose death rates by 12 percent. We also observed relatively large but statistically insignificant reductions in heroin overdose death rates after implementation of these policies. This combination of policies was effective, but broader approaches to address these coincident epidemics are needed.

      7. The role of opioid prescription in incident opioid abuse and dependence among individuals with chronic noncancer pain: the role of opioid prescription
        Edlund MJ, Martin BC, Russo JE, DeVries A, Braden JB, Sullivan MD.
        Clin J Pain. 2014 Jul;30(7):557-64.
        OBJECTIVE: Increasing rates of opioid use disorders (OUDs) (abuse and dependence) among patients prescribed opioids are a significant public health concern. We investigated the association between exposure to prescription opioids and incident OUDs among individuals with a new episode of a chronic noncancer pain (CNCP) condition. METHODS: We utilized claims data from the HealthCore Database for 2000 to 2005. The dataset included all individuals aged 18 and over with a new CNCP episode (no diagnosis in the prior 6 mo), and no opioid use or OUD in the prior 6 months (n=568,640). We constructed a single multinomial variable describing prescription on opioid days supply (none, acute, and chronic) and average daily dose (none, low dose, medium dose, and high dose), and examined the association between this variable and an incident OUD diagnosis. RESULTS: Patients with CNCP prescribed opioids had significantly higher rates of OUDs compared with those not prescribed opioids. Effects varied by average daily dose and days supply: low dose, acute (odds ratio [OR]=3.03; 95% confidence interval [CI], 2.32, 3.95); low dose, chronic (OR=14.92; 95% CI, 10.38, 21.46); medium dose, acute (OR=2.80; 95% CI, 2.12, 3.71); medium dose, chronic (OR=28.69; 95% CI, 20.02, 41.13); high dose, acute (OR=3.10; 95% CI, 1.67, 5.77); and high dose, chronic (OR=122.45; 95% CI, 72.79, 205.99). CONCLUSIONS: Among individuals with a new CNCP episode, prescription opioid exposure was a strong risk factor for incident OUDs; magnitudes of effects were large. Duration of opioid therapy was more important than daily dose in determining OUD risk.
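
As an illustration of how odds ratios with 95% confidence intervals like those above are typically computed, the following sketch derives an unadjusted OR and a Woolf (log-scale) CI from a hypothetical 2×2 table. The counts are invented for illustration only; the study itself used a multinomial model on claims data:

```python
import math

# Hypothetical 2x2 table (invented counts, illustration only):
#                  incident OUD    no OUD
# opioid-exposed       a=120       b=9880
# unexposed            c=200      d=99800
a, b, c, d = 120, 9880, 200, 99800

or_ = (a * d) / (b * c)                   # unadjusted odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR), Woolf method
lo = math.exp(math.log(or_) - 1.96 * se)  # lower 95% bound
hi = math.exp(math.log(or_) + 1.96 * se)  # upper 95% bound
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```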

      8. IMPORTANCE: It is important to understand the magnitude and distribution of the economic burden of prescription opioid overdose, abuse, and dependence to inform clinical practice, research, and other decision makers. Decision makers choosing approaches to address this epidemic need cost information to evaluate the cost-effectiveness of their choices. OBJECTIVE: To estimate the economic burden of prescription opioid overdose, abuse, and dependence from a societal perspective. DESIGN, SETTING, AND PARTICIPANTS: Incidence of fatal prescription opioid overdose was drawn from the National Vital Statistics System, and prevalence of abuse and dependence from the National Survey on Drug Use and Health. Fatal data are for the US population; nonfatal data are a nationally representative sample of the US civilian noninstitutionalized population ages 12 and older. Cost data are from various sources, including health care claims data from the Truven Health MarketScan Research Databases and cost of fatal cases from the WISQARS (Web-based Injury Statistics Query and Reporting System) cost module. Criminal justice costs were derived from the Justice Expenditure and Employment Extracts published by the Department of Justice. Estimates of lost productivity were based on a previously published study. EXPOSURE: Calendar year 2013. MAIN OUTCOMES AND MEASURES: Monetized burden of fatal overdose and abuse and dependence of prescription opioids. RESULTS: The total economic burden is estimated to be $78.5 billion. Over one third of this amount is due to increased health care and substance abuse treatment costs ($28.9 billion). Approximately one quarter of the cost is borne by the public sector in health care, substance abuse treatment, and criminal justice costs. CONCLUSIONS AND RELEVANCE: These estimates can assist decision makers in understanding the magnitude of adverse health outcomes associated with prescription opioid use, such as overdose, abuse, and dependence.

      9. Interactive Voice Response-Based Self-management for Chronic Back Pain: The COPES Noninferiority Randomized Trial
        Heapy AA, Higgins DM, Goulet JL, LaChappelle KM, Driscoll MA, Czlapinski RA, Buta E, Piette JD, Krein SL, Kerns RD.
        JAMA Intern Med. 2017 Jun 01;177(6):765-773.
        Importance: Recommendations for chronic pain treatment emphasize multimodal approaches, including nonpharmacologic interventions to enhance self-management. Cognitive behavioral therapy (CBT) is an evidence-based treatment that facilitates management of chronic pain and improves outcomes, but access barriers persist. Cognitive behavioral therapy delivery assisted by health technology can obviate the need for in-person visits, but the effectiveness of this alternative to standard therapy is unknown. The Cooperative Pain Education and Self-management (COPES) trial was a randomized, noninferiority trial comparing IVR-CBT to in-person CBT for patients with chronic back pain. Objective: To assess the efficacy of interactive voice response-based CBT (IVR-CBT) relative to in-person CBT for chronic back pain. Design, Setting, and Participants: We conducted a noninferiority randomized trial in 1 Department of Veterans Affairs (VA) health care system. A total of 125 patients with chronic back pain were equally allocated to IVR-CBT (n = 62) or in-person CBT (n = 63). Interventions: Patients treated with IVR-CBT received a self-help manual and weekly prerecorded therapist feedback based on their IVR-reported activity, coping skill practice, and pain outcomes. In-person CBT included weekly, individual CBT sessions with a therapist. Participants in both conditions received IVR monitoring of pain, sleep, activity levels, and pain coping skill practice during treatment. Main Outcomes and Measures: The primary outcome was change from baseline to 3 months in unblinded patient report of average pain intensity measured by the Numeric Rating Scale (NRS). Secondary outcomes included changes in pain-related interference, physical and emotional functioning, sleep quality, and quality of life at 3, 6, and 9 months. We also examined treatment retention. Results: Of the 125 patients (97 men, 28 women; mean [SD] age, 57.9 [11.6] years), the adjusted average reduction in NRS with IVR-CBT (-0.77) was similar to in-person CBT (-0.84), with the 95% CI for the difference between groups (-0.67 to 0.80) falling below the prespecified noninferiority margin of 1, indicating IVR-CBT is noninferior. Fifty-four patients randomized to IVR-CBT and 50 randomized to in-person CBT were included in the analysis of the primary outcome. Statistically significant improvements in physical functioning, sleep quality, and physical quality of life at 3 months relative to baseline occurred in both treatments, with no advantage for either treatment. Treatment dropout was lower in IVR-CBT, with patients completing on average 2.3 (95% CI, 1.0-3.6) more sessions. Conclusions and Relevance: IVR-CBT is a low-burden alternative that can increase access to CBT for chronic pain and shows promise as a nonpharmacologic treatment option for chronic pain, with outcomes that are not inferior to in-person CBT. Trial Registration: clinicaltrials.gov Identifier: NCT01025752.
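
The trial's noninferiority conclusion reduces to a single comparison: IVR-CBT is declared noninferior because the upper bound of the 95% CI for the between-group difference stays below the prespecified margin. Using the numbers reported in the abstract:

```python
# Noninferiority check with the trial's reported numbers: 95% CI for
# the between-group difference in NRS change, and the margin of 1.
ci_lower, ci_upper = -0.67, 0.80  # reported 95% CI
margin = 1.0                      # prespecified noninferiority margin

noninferior = ci_upper < margin   # upper bound must fall below the margin
print("IVR-CBT noninferior to in-person CBT:", noninferior)
```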

      10. Increases in drug and opioid-involved overdose deaths – United States, 2010-2015
        Rudd RA, Seth P, David F, Scholl L.
        MMWR Morb Mortal Wkly Rep. 2016 Dec 30;65(50-51):1445-1452.
        The U.S. opioid epidemic is continuing, and drug overdose deaths nearly tripled during 1999-2014. Among 47,055 drug overdose deaths that occurred in 2014 in the United States, 28,647 (60.9%) involved an opioid (1). Illicit opioids are contributing to the increase in opioid overdose deaths (2,3). In an effort to target prevention strategies to address the rapidly changing epidemic, CDC examined overall drug overdose death rates during 2010-2015 and opioid overdose death rates during 2014-2015 by subcategories (natural/semisynthetic opioids, methadone, heroin, and synthetic opioids other than methadone).* Rates were stratified by demographics, region, and by 28 states with high quality reporting on death certificates of specific drugs involved in overdose deaths. During 2015, drug overdoses accounted for 52,404 U.S. deaths, including 33,091 (63.1%) that involved an opioid. There has been progress in preventing methadone deaths, and death rates declined by 9.1%. However, rates of deaths involving other opioids, specifically heroin and synthetic opioids other than methadone (likely driven primarily by illicitly manufactured fentanyl) (2,3), increased sharply overall and across many states. A multifaceted, collaborative public health and law enforcement approach is urgently needed. Response efforts include implementing the CDC Guideline for Prescribing Opioids for Chronic Pain (4), improving access to and use of prescription drug monitoring programs, enhancing naloxone distribution and other harm reduction approaches, increasing opioid use disorder treatment capacity, improving linkage into treatment, and supporting law enforcement strategies to reduce the illicit opioid supply.
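
The opioid-involvement percentages can be reproduced directly from the death counts given in the abstract:

```python
# Drug overdose deaths and opioid-involved deaths from the abstract.
deaths_2014, opioid_2014 = 47055, 28647
deaths_2015, opioid_2015 = 52404, 33091

pct_2014 = 100 * opioid_2014 / deaths_2014
pct_2015 = 100 * opioid_2015 / deaths_2015
print(f"2014: {pct_2014:.1f}% of overdose deaths involved an opioid")
print(f"2015: {pct_2015:.1f}% of overdose deaths involved an opioid")
```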

  2. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Chronic Diseases and Conditions
      1. Cardiac complications, earlier treatment, and initial disease severity in Kawasaki disease
        Abrams JY, Belay ED, Uehara R, Maddox RA, Schonberger LB, Nakamura Y.
        J Pediatr. 2017 Jun 12.
        OBJECTIVES: To assess whether the observed higher risks of cardiac complications for patients with Kawasaki disease (KD) treated earlier reflect bias due to confounding from initial disease severity, as opposed to any negative effect of earlier treatment. STUDY DESIGN: We used data from Japanese nationwide KD surveys from 1997 to 2004. Receipt of additional intravenous immunoglobulin (IVIG) (data available all years) or any additional treatment (available for 2003-2004) was assessed as a proxy for initial disease severity. We determined associations between earlier or later IVIG treatment (defined as receipt of IVIG on days 1-4 vs days 5-10 of illness) and cardiac complications by stratifying by receipt of additional treatment or by using logistic modeling to control for the effect of receiving additional treatment. RESULTS: A total of 48,310 patients with KD were included in the analysis. In unadjusted analysis, earlier IVIG treatment was associated with a higher risk for 4 categories of cardiac complications, including all major cardiac complications (risk ratio, 1.10; 95% CI, 1.06-1.15). Stratifying by receipt of additional treatment removed this association, and earlier IVIG treatment became protective against all major cardiac complications when controlling for any additional treatment in logistic regressions (OR, 0.90; 95% CI, 0.80-1.00). CONCLUSIONS: The observed higher risks of cardiac complications among patients with KD receiving IVIG treatment on days 1-4 of the illness are most likely due to underlying higher initial disease severity, and patients with KD should continue to be treated with IVIG as early as possible.
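
The pattern described here, a crude risk ratio above 1 that disappears after stratifying on severity, is classic confounding by indication. A toy illustration with invented counts (not the study's data), comparing a crude risk ratio with a Mantel-Haenszel stratified risk ratio:

```python
# Invented counts illustrating confounding by indication: sicker patients
# are treated earlier, so the crude RR exceeds the stratum-specific RRs.
# Each stratum: (events_early, n_early, events_late, n_late)
strata = {
    "severe": (40, 80, 10, 20),  # 50% complication risk in both arms
    "mild":   (2, 20, 8, 80),    # 10% complication risk in both arms
}

# Crude RR pools the strata and picks up the severity imbalance.
a = sum(s[0] for s in strata.values()); n1 = sum(s[1] for s in strata.values())
c = sum(s[2] for s in strata.values()); n0 = sum(s[3] for s in strata.values())
crude_rr = (a / n1) / (c / n0)

# Mantel-Haenszel RR adjusts for stratum membership.
num = sum(e1 * m0 / (m1 + m0) for e1, m1, e0, m0 in strata.values())
den = sum(e0 * m1 / (m1 + m0) for e1, m1, e0, m0 in strata.values())
mh_rr = num / den

print(f"crude RR = {crude_rr:.2f}, Mantel-Haenszel RR = {mh_rr:.2f}")
```

With these counts the crude RR is 2.33 even though earlier treatment carries no excess risk within either stratum (MH RR = 1.0), mirroring how the unadjusted association above vanished after stratification.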

      2. OBJECTIVE: To determine the impact of a health system-wide primary care diabetes management system, which included targeted guidelines for type 2 diabetes (T2DM) and prediabetes (dysglycemia) screening, on detection of previously undiagnosed dysglycemia cases. RESEARCH DESIGN AND METHODS: The intervention included electronic health record (EHR)-based decision support and standardized provider and staff training for using the American Diabetes Association guidelines for dysglycemia screening. Using EHR data, we identified 40,456 adults without T2DM or recent screening with a face-to-face visit (March 2011-December 2013) in five urban clinics. Interrupted time series analyses examined the impact of the intervention on trends in three outcomes: 1) monthly proportion of eligible patients receiving dysglycemia testing, 2) two negative comparison conditions (dysglycemia testing among ineligible patients and cholesterol screening), and 3) yield of undiagnosed dysglycemia among those tested. RESULTS: The baseline monthly proportion of eligible patients receiving testing was 7.4-10.4%. After the intervention, screening doubled (mean increase +11.0% [95% CI 9.0, 13.0], proportion range 18.6-25.3%). The proportion of ineligible patients tested also increased (+5.0% [95% CI 3.0, 8.0]) with no concurrent change in cholesterol testing (+0% [95% CI -0.02, 0.05]). About 59% of test results in eligible patients showed dysglycemia both before and after the intervention. CONCLUSIONS: Implementation of a policy for systematic dysglycemia screening, including formal training and EHR templates, in urban academic primary care clinics resulted in a doubling of appropriate testing and of the number of patients who could be targeted for treatment to prevent or delay T2DM.
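
An interrupted time series analysis like the one described above is typically implemented as a segmented regression with terms for the baseline trend and a post-intervention level change. A minimal sketch on noise-free synthetic data (the study itself used richer models on EHR data; all numbers here are illustrative):

```python
import numpy as np

# Synthetic monthly screening proportions: ~9% baseline with a slight
# trend, then an 11-point level jump after the intervention at month 18.
months = np.arange(36)
post = (months >= 18).astype(float)
y = 9.0 + 0.05 * months + 11.0 * post  # noise-free for illustration

# Segmented regression design: intercept, time trend, level change.
X = np.column_stack([np.ones_like(months, dtype=float), months, post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated level change: {beta[2]:.1f} percentage points")
```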

      3. Improving hypertension control population-wide in Minnesota
        Foti K, Auerbach J, Magnan S.
        J Public Health Manag Pract. 2017 Jun 16.
        CONTEXT: Hypertension is a common and costly risk factor for cardiovascular disease, but nationally just over half of all adults with hypertension have their blood pressure controlled. In Minneapolis-St Paul, Minnesota, the rate of hypertension control is approximately 70%, despite having been similar to the national average as recently as the first half of the 1990s. OBJECTIVE: The purposes of this study were to identify factors in Minneapolis-St Paul and state-level policies and programs in Minnesota that may have contributed to the more rapid increase in blood pressure control there than in the rest of the nation and to identify factors that can potentially be replicated in other jurisdictions. DESIGN, SETTING, PARTICIPANTS: The study included analysis of trends in hypertension control since 1980 based on the Minnesota Heart Survey and the National Health and Nutrition Examination Survey, as well as interviews with health care and public health leaders in Minnesota. MAIN OUTCOME MEASURE: Prevalence of hypertension control. RESULTS: Probable contributing factors identified include a focus on collaborative and continuous quality improvement; a forum for setting statewide clinical guidelines and measures; the willing participation of the largest health systems, purchasers, and nonprofit health plans; and the use of widely accepted mechanisms for providing feedback to clinicians and reporting performance. The relatively high rate of insurance coverage and socioeconomic status may have contributed but do not fully explain the difference in hypertension control compared with the rest of the United States. CONCLUSIONS: The experience in Minnesota demonstrates that it is possible to dramatically increase hypertension control at the population level, across health systems and health plans, in a relatively short period of time. Lessons learned may be helpful in informing local, state, and national efforts to improve hypertension control.

      4. NIAID, NIEHS, NHLBI, and MCAN Workshop Report: The indoor environment and childhood asthma-implications for home environmental intervention in asthma prevention and management
        Gold DR, Adamkiewicz G, Arshad SH, Celedon JC, Chapman MD, Chew GL, Cook DN, Custovic A, Gehring U, Gern JE, Johnson CC, Kennedy S, Koutrakis P, Leaderer B, Mitchell H, Litonjua AA, Mueller GA, O’Connor GT, Ownby D, Phipatanakul W, Persky V, Perzanowski MS, Ramsey CD, Salo PM, Schwaninger JM, Sordillo JE, Spira A, Suglia SF, Togias A, Zeldin DC, Matsui EC.
        J Allergy Clin Immunol. 2017 May 10.
        Environmental exposures have been recognized as critical in the initiation and exacerbation of asthma, one of the most common chronic childhood diseases. The National Institute of Allergy and Infectious Diseases; National Institute of Environmental Health Sciences; National Heart, Lung, and Blood Institute; and Merck Childhood Asthma Network sponsored a joint workshop to discuss the current state of science with respect to the indoor environment and its effects on the development and morbidity of childhood asthma. The workshop included US and international experts with backgrounds in allergy/allergens, immunology, asthma, environmental health, environmental exposures and pollutants, epidemiology, public health, and bioinformatics. Workshop participants provided new insights into the biologic properties of indoor exposures, indoor exposure assessment, and exposure reduction techniques. This informed a primary focus of the workshop: to critically review trials and research relevant to the prevention or control of asthma through environmental intervention. The participants identified important limitations and gaps in scientific methodologies and knowledge and proposed and prioritized areas for future research. The group reviewed socioeconomic and structural challenges to changing environmental exposure and offered recommendations for creative study design to overcome these challenges in trials to improve asthma management. The recommendations of this workshop can serve as guidance for future research in the study of the indoor environment and on environmental interventions as they pertain to the prevention and management of asthma and airway allergies.

      5. [No abstract]

      6. Risk factors for podoconiosis: Kamwenge District, Western Uganda, September 2015
        Kihembo C, Masiira B, Lali WZ, Matwale GK, Matovu JK, Kaharuza F, Ario AR, Nabukenya I, Makumbi I, Musenero M, Zhu BP, Nanyunja M.
        Am J Trop Med Hyg. 2017;96(6):1490-1496.
        Podoconiosis, a noninfectious elephantiasis, is a disabling neglected tropical disease. In August 2015, an elephantiasis case-cluster was reported in Kamwenge District, western Uganda. We investigated to identify the disease’s nature and risk factors. We defined a suspected podoconiosis case as onset in a Kamwenge resident of bilateral asymmetrical lower limb swelling lasting >= 1 month, plus >= 1 of the following associated symptoms: skin itching, burning sensation, plantar edema, lymph ooze, prominent skin markings, rigid toes, or mossy papillomata. A probable case was a suspected case with negative microfilaria antigen immunochromatographic card test (ruling out filarial elephantiasis). We conducted active case-finding. In a case-control investigation, we tested the hypothesis that the disease was caused by prolonged foot skin exposure to irritant soils, using 40 probable case-persons and 80 asymptomatic village control-persons, individually matched by age and sex. We collected soil samples to characterize irritants. We identified 52 suspected (including 40 probable) cases with onset from 1980 to 2015. Prevalence rates increased with age; annual incidence (by reported onset of disease) was stable over time at 2.9/100,000. We found that 93% (37/40) of cases and 68% (54/80) of controls never wore shoes at work (Mantel-Haenszel odds ratio [ORMH] = 7.7; 95% confidence interval [CI] = 2.0-30); 80% (32/40) of cases and 49% (39/80) of controls never wore shoes at home (ORMH = 5.2; 95% CI = 1.8-15); and 70% (27/39) of cases and 44% (35/79) of controls washed feet at day end (versus immediately after work) (OR = 11; 95% CI = 2.1-56). Soil samples were characterized as rich black-red volcanic clays. In conclusion, this reported elephantiasis is podoconiosis associated with prolonged foot exposure to volcanic soil. We recommended foot hygiene and universal use of protective shoes.
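
From the shoe-wearing counts reported in the abstract, a crude (unmatched) odds ratio can be computed directly; it differs from the reported ORMH = 7.7, which accounts for the age- and sex-matched design:

```python
# Never wore shoes at work: 37/40 cases vs 54/80 controls (abstract counts).
cases_exposed, cases_unexposed = 37, 3
controls_exposed, controls_unexposed = 54, 26

# Crude cross-product odds ratio; the matched ORMH in the abstract is 7.7.
crude_or = (cases_exposed * controls_unexposed) / (cases_unexposed * controls_exposed)
print(f"crude OR = {crude_or:.1f}")
```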

      7. BACKGROUND: Home blood pressure monitoring (HBPM) has a substantial role in hypertension management and control. METHODS: Cross-sectional data for noninstitutionalized US adults 18 years and older (n = 10,958) from the National Health and Nutrition Examination Survey (NHANES), years 2011-2014, were used to examine factors related to HBPM. RESULTS: In 2011-2014, an estimated 9.5% of US adults engaged in weekly HBPM, 7.2% engaged in monthly HBPM, 8.0% engaged in HBPM less than once a month, and 75.3% did not engage in any HBPM. The frequency of HBPM increased with age, body mass index, and the number of health care visits (all, P < 0.05). Race/ethnicity (non-Hispanic black and non-Hispanic Asian), health insurance, a diabetes diagnosis, a health care provider’s recommendation to engage in HBPM, and a hypertension diagnosis were also associated with more frequent HBPM (P < 0.05). Adjusting for covariates, hypertensive adults who were aware, treated, and controlled engaged in more frequent HBPM compared with their respective references: unaware (odds ratio [OR] = 2.00, 95% confidence interval [CI] = 1.53-2.63), untreated (OR = 1.99, 95% CI = 1.52-2.60), and uncontrolled (OR = 1.42, 95% CI = 1.13-1.82). Hypertensive adults (aware/unaware, treated/untreated, or controlled/uncontrolled) who received providers’ recommendations to perform HBPM were more likely to do so than those who did not receive recommendations (OR = 8.04, 95% CI = 6.56-9.86; OR = 7.98, 95% CI = 6.54-9.72; OR = 8.75, 95% CI = 7.18-10.67, respectively). CONCLUSIONS: Seventeen percent of US adults engaged in monthly or more frequent HBPM, and health care providers’ recommendations to engage in HBPM had a significant impact on the frequency of HBPM.

      8. No study has examined the association between intake of trans-fatty acids (TFAs) and risk of metabolic syndrome both before and after the significant reduction of TFA intake in the US population. We hypothesized that the relationship might remain significant after substantial reduction of TFA intakes in the population. We used data on 1442 and 2233 adults aged >=20 years from the National Health and Nutrition Examination Survey 1999-2000 and 2009-2010, respectively. Multivariable logistic regression analysis was used to assess the association between plasma TFA concentrations and metabolic syndrome, including each of its 5 components. The median plasma TFA concentration was reduced from 79.8 μmol/L in 1999-2000 to 36.9 μmol/L in 2009-2010. The fully adjusted prevalence ratios comparing the highest vs the lowest quintile of plasma TFA concentrations in 1999-2000 were 3.43 (95% confidence interval, 2.39-4.92) for metabolic syndrome, 1.72 (1.38-2.14) for large waistline, 8.25 (6.34-10.74) for high triglycerides, 1.96 (1.46-2.62) for low high-density lipoprotein cholesterol, 1.14 (0.85-1.55) for high blood pressure, and 1.48 (1.19-1.85) for high fasting glucose, respectively. The corresponding prevalence ratios in 2009-2010 were 2.93 (2.41-3.54), 1.62 (1.39-1.89), 14.93 (9.28-24.02), 3.09 (2.18-4.37), 1.27 (1.11-1.46), and 1.24 (1.06-1.46), respectively. The pattern of association between TFAs and metabolic syndrome and its components did not differ by cycle. The observed associations were consistent across the subgroups examined. Despite a 54% decline in plasma TFA concentrations from 1999-2000 to 2009-2010, plasma TFA concentration remained positively associated with risk of metabolic syndrome and its individual components, except for blood pressure in 1999-2000. Our findings support Food and Drug Administration initiatives to remove TFAs from industrially produced foods.

    • Communicable Diseases
      1. Patients with primary immunodeficiencies are a reservoir of poliovirus and a risk to polio eradication
        Aghamohammadi A, Abolhassani H, Kutukculer N, Wassilak SG, Pallansch MA, Kluglein S, Quinn J, Sutter RW, Wang X, Sanal O, Latysheva T, Ikinciogullari A, Bernatowska E, Tuzankina IA, Costa-Carvalho BT, Franco JL, Somech R, Karakoc-Aydiner E, Singh S, Bezrodnik L, Espinosa-Rosales FJ, Shcherbina A, Lau YL, Nonoyama S, Modell F, Modell V, Ozen A, Berlin A, Chouikha A, Partida-Gaytan A, Kiykim A, Prakash C, Suri D, Ayvaz DC, Pelaez D, da Silva EE, Deordieva E, Perez-Sanchez EE, Ulusoy E, Dogu F, Seminario G, Cuzcanci H, Triki H, Shimizu H, Tezcan I, Ben-Mustapha I, Sun J, Mazzucchelli JT, Orrego JC, Pac M, Bolkov M, Giraldo M, Belhaj-Hmida N, Mekki N, Kuzmenko N, Karaca NE, Rezaei N, Diop OM, Baris S, Chan SM, Shahmahmoodi S, Haskologlu S, Ying W, Wang Y, Barbouche MR, McKinlay MA.
        Front Immunol. 2017;8(JUN).
        Immunodeficiency-associated vaccine-derived polioviruses (iVDPVs) have been isolated from primary immunodeficiency (PID) patients exposed to oral poliovirus vaccine (OPV). Patients may excrete poliovirus strains for months or years; the excreted viruses are frequently highly divergent from the parental OPV and have been shown to be as neurovirulent as wild virus. Thus, these patients represent a potential reservoir for transmission of neurovirulent polioviruses in the post-eradication era. In support of WHO recommendations to better estimate the prevalence of poliovirus excreters among PID patients and characterize the genetic evolution of these strains, 635 patients, including 570 with primary antibody deficiencies and 65 with combined immunodeficiencies, were studied from 13 OPV-using countries. Two stool samples were collected over 4 days, tested for enterovirus, and the poliovirus-positive samples were sequenced. Thirteen patients (2%) excreted polioviruses, most for less than 2 months following identification of infection. Five (0.8%) were classified as iVDPVs (found only in combined immunodeficiencies and mostly poliovirus serotype 2). Non-polio enteroviruses were detected in 30 patients (4.7%). Patients with combined immunodeficiencies had an increased risk of delayed poliovirus clearance compared to those with primary antibody deficiencies. iVDPVs were usually detected in subjects with combined immunodeficiencies within a short period after OPV exposure, most for less than 6 months. Surveillance for poliovirus excretion among PID patients should be reinforced until polio eradication is certified and the use of OPV is stopped. Survival rates among PID patients are improving in lower- and middle-income countries, and iVDPV excreters are being identified more frequently. Antivirals or enhanced immunotherapies presently in development represent the only potential means to manage the treatment of prolonged excreters and the risk they present to the polio endgame.

      2. HIV stigma and social capital in women living with HIV
        Cuca YP, Asher A, Okonsky J, Kaihura A, Dawson-Rose C, Webel A.
        J Assoc Nurses AIDS Care. 2017 Jan-Feb;28(1):45-54.
        Women living with HIV (WLWH) continue to experience HIV-related stigma. Social capital is one resource that could mitigate HIV stigma. Our cross-sectional study examined associations between social capital and HIV-related stigma in 135 WLWH in the San Francisco Bay Area. The mean age of study participants was 48 years; 60% were African American; 29% had less than a high school education; and 19% were employed. Age was significantly associated with perceived HIV stigma (p = .001), but total social capital was not. Women with lower Value of Life social capital scores had significantly higher total stigma scores (p = .010) and higher Negative Self-image stigma scores (p = .001). Women who felt less valued in their social worlds may have been more likely to perceive HIV stigma, which could have negative health consequences. This work begins to elucidate the possible relationships between social capital and perceived HIV stigma.

      3. Capacity strengthening through pre-migration tuberculosis screening programmes: IRHWG experiences
        Douglas P, Posey DL, Zenner D, Robson J, Abubakar I, Giovinazzo G.
        Int J Tuberc Lung Dis. 2017 Jul 01;21(7):737-745.
        Effective tuberculosis (TB) prevention and care for migrants requires population health-based approaches that treat the relationship between migration and health as a progressive, interactive process influenced by many variables and addressed as far upstream in the process as possible. By including capacity building in source countries, pre-migration medical screening has the potential to become an integral component of public health promotion, as well as infection and disease prevention, in migrant-receiving nations, while simultaneously increasing capabilities in countries of origin. This article describes the collaborative experiences of five countries (Australia, Canada, New Zealand, the United Kingdom and the United States of America, members of the Immigration and Refugee Health Working Group [IRHWG]) with similar mandated pre-migration screening programmes for TB. Qualitative examples of capacity building through IRHWG programmes are provided. Combined, the IRHWG member countries screen approximately 2 million persons overseas every year. Large-scale pre-entry screening programmes undertaken by IRHWG countries require building additional capacity for health care providers, radiology facilities and laboratories. This has resulted in significant improvements in laboratory and treatment capacity, providing availability of these facilities for national public health programmes. As long as global health disparities and disease prevalence differentials exist, national public health programmes and policies in migrant-receiving nations will continue to be challenged to manage the diseases prevalent in these migrating populations. National TB programmes and regulatory systems alone will not be able to achieve TB elimination. 
The management of health issues resulting from population mobility will require integration of national and global health initiatives which, as demonstrated here, can be supported through the capacity-building endeavours of pre-migration screening programmes.

      4. Financial incentives for linkage to care and viral suppression among HIV-positive patients: A randomized clinical trial (HPTN 065)
        El-Sadr WM, Donnell D, Beauchamp G, Hall HI, Torian LV, Zingman B, Lum G, Kharfen M, Elion R, Leider J, Gordin FM, Elharrar V, Burns D, Zerbe A, Gamble T, Branson B.
        JAMA Intern Med. 2017 Jun 19.
        Importance: Achieving linkage to care and viral suppression in human immunodeficiency virus (HIV)-positive patients improves their well-being and prevents new infections. Current gaps in the HIV care continuum substantially limit such benefits. Objective: To evaluate the effectiveness of financial incentives on linkage to care and viral suppression in HIV-positive patients. Design, Setting, and Participants: A large community-based clinical trial that randomized 37 HIV test and 39 HIV care sites in the Bronx, New York, and Washington, DC, to financial incentives or standard of care. Interventions: Participants at financial incentive test sites who had positive test results for HIV received coupons redeemable for $125 cash-equivalent gift cards upon linkage to care. HIV-positive patients receiving antiretroviral therapy at financial incentive care sites received $70 gift cards quarterly, if virally suppressed. Main Outcomes and Measures: Linkage to care: proportion of HIV-positive persons at the test site who linked to care within 3 months, as indicated by CD4+ and/or viral load test results done at a care site. Viral suppression: proportion of established patients at HIV care sites with suppressed viral load (<400 copies/mL), assessed at each calendar quarter. Outcomes assessed through laboratory test results reported to the National HIV Surveillance System. Results: A total of 1061 coupons were dispensed for linkage to care at 18 financial incentive test sites and 39359 gift cards were dispensed to 9641 HIV-positive patients eligible for gift cards at 17 financial incentive care sites. Financial incentives did not increase linkage to care (adjusted odds ratio, 1.10; 95% CI, 0.73-1.67; P = .65). However, financial incentives significantly increased viral suppression. The overall proportion of patients with viral suppression was 3.8% higher (95% CI, 0.7%-6.8%; P = .01) at financial incentive sites compared with standard of care sites. 
Among patients not previously consistently virally suppressed, the proportion virally suppressed was 4.9% higher (95% CI, 1.4%-8.5%; P = .007) at financial incentive sites. In addition, continuity in care was 8.7% higher (95% CI, 4.2%-13.2%; P < .001) at financial incentive sites. Conclusions and Relevance: Financial incentives, as used in this study (HPTN 065), significantly increased viral suppression and regular clinic attendance among HIV-positive patients in care. No effect was noted on linkage to care. Financial incentives offer promise for improving adherence to treatment and viral suppression among HIV-positive patients. Trial Registration: clinicaltrials.gov Identifier: NCT01152918.

      5. Design of the HPTN 065 (TLC-Plus) study: A study to evaluate the feasibility of an enhanced test, link-to-care, plus treat approach for HIV prevention in the United States
        Gamble T, Branson B, Donnell D, Hall HI, King G, Cutler B, Hader S, Burns D, Leider J, Wood AF, Volpp KG, Buchacz K, El-Sadr WM.
        Clin Trials. 2017 Jun 01:1740774517711682.
        Background/Aims HIV continues to be a major public health threat in the United States, and mathematical modeling has demonstrated that the universal effective use of antiretroviral therapy among all HIV-positive individuals (i.e. the “test and treat” approach) has the potential to control HIV. However, to accomplish this, all the steps that define the HIV care continuum must be achieved at high levels, including HIV testing and diagnosis, linkage to and retention in clinical care, antiretroviral medication initiation, and adherence to achieve and maintain viral suppression. The HPTN 065 (Test, Link-to-Care Plus Treat [TLC-Plus]) study was designed to determine the feasibility of the “test and treat” approach in the United States. Methods HPTN 065 was conducted in two intervention communities, Bronx, NY, and Washington, DC, along with four non-intervention communities, Chicago, IL; Houston, TX; Miami, FL; and Philadelphia, PA. The study consisted of five components: (1) exploring the feasibility of expanded HIV testing via social mobilization and the universal offer of testing in hospital settings, (2) evaluating the effectiveness of financial incentives to increase linkage to care, (3) evaluating the effectiveness of financial incentives to increase viral suppression, (4) evaluating the effectiveness of a computer-delivered intervention to decrease risk behavior in HIV-positive patients in healthcare settings, and (5) administering provider and patient surveys to assess knowledge and attitudes regarding the use of antiretroviral therapy for prevention and the use of financial incentives to improve health outcomes. The study used observational cohorts, cluster and individual randomization, and made novel use of the existing national HIV surveillance data infrastructure. All components were developed with input from a community advisory board, and pragmatic methods were used to implement and assess the outcomes for each study component. 
Results A total of 76 sites in Washington, DC, and the Bronx, NY, participated in the study: 37 HIV test sites, including 16 hospitals, and 39 HIV care sites. Between September 2010 and December 2014, all study components were successfully implemented at these sites and resulted in valid outcomes. Our pragmatic approach to the study design, implementation, and the assessment of study outcomes allowed the study to be conducted within established programmatic structures and processes. In addition, it was successfully layered on the ongoing standard of care and existing data infrastructure without disrupting health services. Conclusion The HPTN 065 study demonstrated the feasibility of implementing and evaluating a multi-component “test and treat” trial that included a large number of community sites and involved pragmatic approaches to study implementation and evaluation.

      6. The National Ebola Training and Education Center: Preparing the United States for Ebola and other special pathogens
        Kratochvil CJ, Evans L, Ribner BS, Lowe JJ, Harvey MC, Hunt RC, Tumpey AJ, Fagan RP, Schwedhelm MM, Bell S, Maher J, Kraft CS, Cagliuso NV, Vanairsdale S, Vasa A, Smith PW.
        Health Secur. 2017 May/Jun;15(3):253-260.
        The National Ebola Training and Education Center (NETEC) was established in 2015 in response to the 2014-2016 Ebola virus disease outbreak in West Africa. The US Department of Health and Human Services office of the Assistant Secretary for Preparedness and Response and the US Centers for Disease Control and Prevention sought to increase the competency of healthcare and public health workers, as well as the capability of healthcare facilities in the United States, to deliver safe, efficient, and effective care to patients infected with Ebola and other special pathogens nationwide. NYC Health + Hospitals/Bellevue, Emory University, and the University of Nebraska Medical Center/Nebraska Medicine were awarded this cooperative agreement, based in part on their experience in safely and successfully evaluating and treating patients with Ebola virus disease in the United States. In 2016, NETEC received a supplemental award to expand on 3 initial primary tasks: (1) develop metrics and conduct peer review assessments; (2) develop and provide educational materials, resources, and tools, including exercise design templates; (3) provide expert training and technical assistance; and, to add a fourth task, create a special pathogens clinical research network.

      7. Changing trends in complications of chronic hepatitis C
        Lu M, Li J, Rupp LB, Zhou Y, Holmberg SD, Moorman AC, Spradling PR, Teshale EH, Boscarino JA, Daida YG, Schmidt MA, Trudeau S, Gordon SC.
        Liver Int. 2017 Jun 21.
        BACKGROUND AND AIMS: Chronic hepatitis C virus (HCV)-related complications have increased over the past decade. METHODS: We used joinpoint regression modeling to investigate trends in these complications from 2006-2015, and the impact of demographics on these trends. Using data from the Chronic Hepatitis Cohort Study (CHeCS), we identified points at which the trend significantly changed, and estimated the annual percent change (APC) in rates of cirrhosis, decompensated cirrhosis, and all-cause mortality, adjusted by race, sex, and age. RESULTS: Among 11,167 adults with chronic HCV infection, prevalence of cirrhosis increased from 20.8% to 27.6% from 2006 to 2015, with an adjusted annual percentage change (aAPC) of 1.2 (p<0.01). Although incidence of all-cause mortality increased from 1.8% in 2006 to 2.9% in 2015, a joinpoint was identified at 2010, with aAPCs of 9.6 before (2006 to 2010; p<0.01) and -5.2 after (2010 to 2015; p<0.01), indicating a decrease in mortality from 2010 onward. Likewise, although overall prevalence of decompensated cirrhosis increased from 9.3% in 2006 to 10.4% in 2015, this increase was confined to patients 60 or older (aAPC=1.5; p=0.023). Asian American and Black/African American patients demonstrated significantly higher rates of cirrhosis than White patients, while older patients and men demonstrated higher rates of cirrhosis and mortality. CONCLUSIONS: Although cirrhosis and mortality among HCV-infected patients in the US have increased in the past decade, mortality has decreased in recent years.

      8. Epidemiology of pediatric multidrug-resistant tuberculosis in the United States, 1993-2014
        Smith SE, Pratt R, Trieu L, Barry PM, Thai DT, Ahuja SD, Shah S.
        Clin Infect Dis. 2017 Jun 19.
        Background: Multidrug-resistant tuberculosis (MDR TB) is an important global public health threat, but accurate estimates of MDR TB burden among children are lacking. Methods: We analyzed demographic, clinical and laboratory data for newly-diagnosed pediatric (<15 years) TB cases reported to the US National TB Surveillance System (NTSS) during 1993-2014. MDR TB was defined as culture-confirmed TB disease with resistance to at least isoniazid and rifampicin. To ascertain potential under-estimation of pediatric MDR TB, we surveyed high burden states for clinically-diagnosed cases treated for MDR TB. Results: Of 20,789 pediatric TB cases, 5,162 (24.8%) had bacteriologically-confirmed TB. Among 4,862 (94.2%) with drug-susceptibility testing, 82 (1.7%) had MDR TB. Most pediatric MDR TB cases were female (n=51, 62%), median age was 5 years (IQR 1-12), one-third were Hispanic (n=28, 34%), and two-thirds (n=55, 67%) were born in the US. Most cases had additional resistance to ≥1 other first-line drug (n=66; 80.5%) and one-third had resistance to ≥1 second-line drug (24/73 tested). Of 77 who started treatment prior to 2013, 66 (86%) completed treatment and 4 (5%) died. Among the four high TB burden states/jurisdictions surveyed, there was 42-55% under-estimation of pediatric MDR TB cases when using only culture-confirmed case definitions. Conclusions: Only one-quarter of pediatric TB cases had culture-confirmed TB, likely resulting in underestimation of true pediatric MDR TB burden in the US using strictly bacteriologic criteria. Better estimates of pediatric MDR TB burden in the US are needed and should include clinical diagnoses based on epidemiologic criteria.

      9. Travelers’ diarrhea and other gastrointestinal symptoms among Boston-area international travelers
        Stoney RJ, Han PV, Barnett ED, Wilson ME, Jentes ES, Benoit CM, MacLeod WB, Hamer DH, Chen LH.
        Am J Trop Med Hyg. 2017;96(6):1388-1393.
        This prospective cohort study describes travelers’ diarrhea (TD) and non-TD gastrointestinal (GI) symptoms among international travelers from the Boston area, the association of TD with traveler characteristics and dietary practices, use of prescribed antidiarrheal medications, and the impact of TD and non-TD GI symptoms on planned activities during and after travel. We included adults who received a pre-travel consultation at three Boston-area travel clinics and who completed a three-part survey: pre-travel, during travel, and post-travel (2-4 weeks after return). TD was defined as self-reported diarrhea with or without nausea/vomiting, abdominal pain, or fever. Demographic and travel characteristics were evaluated by chi-square test for categorical and Wilcoxon rank-sum test for continuous variables. Analysis of dietary practices used logistic generalized estimating equation models or logistic regression models. Of 628 travelers, 208 (33%) experienced TD and 45 (7%) experienced non-TD GI symptoms. Of 208 with TD, 128 (64%), 71 (36%), and 123 (62%) were prescribed ciprofloxacin, azithromycin, and/or loperamide before travel, respectively. Thirty-nine (36%) of 108 took ciprofloxacin, 20 (38%) of 55 took azithromycin, and 28 (28%) of 99 took loperamide during travel. Of 172 with TD during travel, 24% stopped planned activities, and 2% were hospitalized. Of 31 with non-TD GI symptoms during travel, six (13%) stopped planned activities. International travelers continue to experience diarrhea and other GI symptoms, resulting in disruption of planned activities and healthcare visits for some. Although these illnesses resulted in interruption of travel plans, a relatively small proportion took prescribed antibiotics.

      10. Antiretroviral prescription and viral suppression in a representative sample of HIV-infected persons in care in four large metropolitan areas of the United States, Medical Monitoring Project, 2011 – 2013
        Wohl AR, Benbow N, Tejero J, Johnson C, Scheer S, Brady K, Gagner A, Hughes A, Eberhart M, Mattson C, Skarbinski J.
        J Acquir Immune Defic Syndr. 2017 Jun 15.
        BACKGROUND: Comparisons of ART prescription and viral suppression among people in HIV care across U.S. metropolitan areas are limited. 2011-2013 Medical Monitoring Project data were used to describe and compare associations between socio-demographics and ART prescription and viral suppression for persons receiving HIV care. SETTING: Chicago, Los Angeles County (LAC), Philadelphia, and San Francisco in the United States. METHODS: Bivariate and multivariable methods were used. RESULTS: The proportion of patients prescribed ART (91-93%) and virally suppressed (79-88%) was consistent, although more persons were virally suppressed in San Francisco compared to the other areas, and a smaller proportion was virally suppressed in Philadelphia compared to Chicago. In the combined cohort, persons ages 30-49 (aPR=0.97, CI:0.94-0.99) were less likely than persons 50+, persons reporting non-injection drug use (aPR=0.94, CI:0.90-0.98) were less likely than non-users, and Hispanics were more likely than whites (aPR=1.04, CI:1.01-1.08) to be prescribed ART. Blacks (aPR=0.93; CI:0.87-0.99) and homeless persons (aPR=0.87; CI:0.80-0.95) were less likely to be virally suppressed in the combined cohort. In LAC, persons 30-49 were less likely than those 50+ to be prescribed ART (aPR=0.94, CI:0.90-0.98). Younger persons (18-29) (aPR=0.77; CI:0.60-0.99) and persons with less than a high school education (aPR=0.80; CI:0.67-0.95) in Philadelphia, blacks (aPR=0.90; CI:0.83-0.99) and MSW (aPR=0.89; CI:0.80-0.99) in Chicago, and homeless individuals in LAC (aPR=0.80; CI:0.67-0.94) were less likely to be virally suppressed. CONCLUSION: These data highlight the need to increase ART prescription to achieve viral suppression among younger persons, non-injection drug users, blacks, and homeless persons in U.S. metropolitan areas, and underscore the importance of region-specific strategies for affected sub-groups.

    • Disaster Control and Emergency Services RSS Word feed
      1. COPEWELL: A conceptual framework and system dynamics model for predicting community functioning and resilience after disasters
        Links JM, Schwartz BS, Lin S, Kanarek N, Mitrani-Reiser J, Sell TK, Watson CR, Ward D, Slemp C, Burhans R, Gill K, Igusa T, Zhao X, Aguirre B, Trainor J, Nigg J, Inglesby T, Carbone E, Kendra JM.
        Disaster Med Public Health Prep. 2017 Jun 21:1-11.
        OBJECTIVE: Policy-makers and practitioners have a need to assess community resilience in disasters. Prior efforts conflated resilience with community functioning, combined resistance and recovery (the components of resilience), and relied on a static model for what is inherently a dynamic process. We sought to develop linked conceptual and computational models of community functioning and resilience after a disaster. METHODS: We developed a system dynamics computational model that predicts community functioning after a disaster. The computational model outputted the time course of community functioning before, during, and after a disaster, which was used to calculate resistance, recovery, and resilience for all US counties. RESULTS: The conceptual model explicitly separated resilience from community functioning and identified all key components for each, which were translated into a system dynamics computational model with connections and feedbacks. The components were represented by publicly available measures at the county level. Baseline community functioning, resistance, recovery, and resilience evidenced a range of values and geographic clustering, consistent with hypotheses based on the disaster literature. CONCLUSIONS: The work is transparent, motivates ongoing refinements, and identifies areas for improved measurements. After validation, such a model can be used to identify effective investments to enhance community resilience.
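The abstract's separation of resilience into resistance (how far functioning falls) and recovery (how quickly it returns) can be illustrated with a small sketch. This is not the COPEWELL system dynamics model itself; the function name, the 95%-of-baseline recovery criterion, and the toy time course are all assumptions for illustration.

```python
import numpy as np

def resilience_summaries(functioning, baseline):
    """Derive simple resistance and recovery summaries from a time course
    of community functioning (0-1 scale), as the abstract describes
    conceptually. Resistance = fraction of functioning retained at the
    trough; recovery = time steps from trough back to near-baseline."""
    f = np.asarray(functioning, dtype=float)
    trough_idx = int(f.argmin())
    resistance = f.min() / baseline
    # Steps at or above 95% of baseline (an assumed recovery criterion).
    recovered = np.where(f >= 0.95 * baseline)[0]
    post_trough = recovered[recovered > trough_idx]
    recovery_time = int(post_trough[0] - trough_idx) if post_trough.size else None
    return resistance, recovery_time

# Toy time course: disaster hits at step 2, functioning drops, then recovers.
course = [1.0, 1.0, 0.4, 0.6, 0.8, 0.97, 1.0]
resistance, recovery_time = resilience_summaries(course, baseline=1.0)
```

Separating the two summaries this way mirrors the paper's point that resistance and recovery are distinct components of resilience rather than a single static score.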

    • Disease Reservoirs and Vectors RSS Word feed
      1. High Triatoma brasiliensis densities and Trypanosoma cruzi prevalence in domestic and peridomestic habitats in the State of Rio Grande do Norte, Brazil: The source for Chagas disease outbreaks?
        Lilioso M, Folly-Ramos E, Rocha FL, Rabinovich J, Capdevielle-Dulac C, Harry M, Marcet PL, Costa J, Almeida CE.
        Am J Trop Med Hyg. 2017;96(6):1456-1459.
        A total of 2,431 Triatoma brasiliensis were collected from 39 populations of Paraiba (PB) and Rio Grande do Norte (RN) states, Brazil. In PB, Trypanosoma cruzi infection was not detected in either peridomestic or domestic vector populations. In contrast, in RN, T. brasiliensis was detected with high parasite prevalence in these ecotopes (30.7-40.0%). Moreover, peridomicile insect population densities were more than double the average densities of all other settings evaluated (19.17 versus < 8.94 triatomines/man-hour). Genotyped parasites evidenced a mix of T. cruzi lineages circulating in both peridomestic and sylvatic populations. Although vector control efforts have dramatically decreased Chagas disease transmission to humans, recent outbreaks have been detected in four municipalities of RN state. Our results clearly evidence a worrisome proximity between infected vectors and humans in RN. Indeed, findings of infected T. brasiliensis inside homes are routinely recorded by local vector control surveillance staff around the outbreak area, challenging the current and conventional view that vector transmissions are controlled in northeastern Brazil. This scenario calls for strengthening vector control surveillance and interventions to prevent further Chagas transmission, especially in RN State.

    • Environmental Health RSS Word feed
      1. Predictors of per- and polyfluoroalkyl substance (PFAS) plasma concentrations in 6-10 year old American children
        Harris MH, Rifas-Shiman SL, Calafat AM, Ye X, Mora AM, Webster TF, Oken E, Sagiv SK.
        Environ Sci Technol. 2017 May 02;51(9):5193-5204.
        Certain per- and polyfluoroalkyl substances (PFASs) are suspected developmental toxicants, but data on PFAS concentrations and exposure routes in children are limited. We measured plasma PFASs in children aged 6-10 years from the Boston-area Project Viva prebirth cohort, and used multivariable linear regression to estimate associations with sociodemographic, behavioral, and health-related factors, and maternal PFASs measured during pregnancy. PFAS concentrations in Project Viva children (sampled 2007-2010) were similar to concentrations among youth participants (aged 12-19 years) in the 2007-8 and 2009-10 National Health and Nutrition Examination Survey (NHANES); mean concentrations of most PFASs declined from 2007 to 2010 in Project Viva and NHANES. In mutually adjusted models, predictors of higher PFAS concentrations included older child age, lower adiposity, carpeting or a rug in the child’s bedroom, higher maternal education, and higher neighborhood income. Concentrations of perfluorooctanesulfonate (PFOS), perfluorooctanoate (PFOA), perfluorohexanesulfonate (PFHxS), and 2-(N-methyl-perfluorooctane sulfonamido) acetate (Me-PFOSA-AcOH) were 26-36% lower in children of black mothers compared to children of white mothers and increased 12-21% per interquartile range increase in maternal pregnancy PFASs. Breastfeeding duration did not predict childhood PFAS concentrations in adjusted multivariable models. Together, the studied predictors explained the observed variability in PFAS concentrations to only a modest degree.

    • Epidemiology and Surveillance RSS Word feed
      1. BACKGROUND: The China Centre for Disease Control and Prevention (CDC) developed the China Infectious Disease Automated Alert and Response System (CIDARS) in 2005. The CIDARS was used to strengthen infectious disease surveillance and aid in the early warning of outbreaks, and it has been integrated into the routine outbreak monitoring efforts of the CDC at all levels in China. The early warning threshold is crucial for outbreak detection in the CIDARS, but CDCs at all levels are currently using thresholds recommended by the China CDC, and these recommended thresholds have recognized limitations. Our study therefore seeks to explore an operational method to select the proper early warning threshold according to the epidemic features of local infectious diseases. METHODS: The data used in this study were extracted from the web-based Nationwide Notifiable Infectious Diseases Reporting Information System (NIDRIS), and data for infectious disease cases were organized by calendar week (1-52) and year (2009-2015) in Excel format. Px was calculated using a percentile-based moving window (moving window [5 weeks x 5 years], x), where x represents one of 12 centiles (0.40, 0.45, 0.50, …, 0.95). Outbreak signals for the 12 Px were calculated using the moving percentile method (MPM) based on data from the CIDARS. When the outbreak signals generated by the ‘mean + 2SD’ gold standard were in line with the Px-generated outbreak signals for each week during 2014, that Px was defined as the proper threshold for the infectious disease. Finally, the performance of the newly selected thresholds for each infectious disease was evaluated against simulated outbreak signals based on 2015 data. RESULTS: Six infectious diseases were selected in this study (chickenpox, mumps, hand, foot and mouth disease (HFMD), scarlet fever, influenza and rubella). Proper thresholds for chickenpox (P75), mumps (P80), influenza (P75), rubella (P45), HFMD (P75), and scarlet fever (P80) were identified. The selected proper thresholds for these 6 infectious diseases could detect almost all simulated outbreaks within a shorter time period compared to the thresholds recommended by the China CDC. CONCLUSIONS: It is beneficial to select the proper early warning threshold to detect infectious disease aberrations based on the characteristics and epidemic features of local diseases in the CIDARS.
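The percentile-based moving window described in the abstract can be sketched as follows. This is a minimal illustration, not the CIDARS implementation; the data layout (a dict keyed by year and week) and the function name are assumptions for this sketch.

```python
import numpy as np

def mpm_threshold(history, year, week, x=0.75, half_width=2, n_years=5):
    """Moving percentile method (MPM) threshold: the x-th percentile of
    weekly case counts in a 5-week window (current week +/- 2) drawn from
    the preceding 5 years, per the abstract's [5 weeks x 5 years] window."""
    window = []
    for y in range(year - n_years, year):
        for w in range(week - half_width, week + half_width + 1):
            w_wrapped = (w - 1) % 52 + 1   # wrap calendar weeks 1..52
            window.append(history[(y, w_wrapped)])
    return np.percentile(window, x * 100)

# Toy history: a constant baseline of 10 cases/week for 2009-2013.
history = {(y, w): 10 for y in range(2009, 2014) for w in range(1, 53)}
threshold = mpm_threshold(history, year=2014, week=20, x=0.75)  # P75
signal = bool(30 > threshold)   # 30 cases this week exceeds the threshold
```

Choosing a higher centile (e.g., P80 versus P45, as the study found for different diseases) raises the threshold and trades sensitivity for fewer false alarms.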

    • Genetics and Genomics RSS Word feed
      1. Genetic indicators of drug resistance in the highly repetitive genome of Trichomonas vaginalis
        Bradic M, Warring SD, Tooley GE, Scheid P, Secor WE, Land KM, Huang PJ, Chen TW, Lee CC, Tang P, Sullivan SA, Carlton JM.
        Genome Biol Evol. 2017 Jun 19.
        Trichomonas vaginalis, the most common non-viral sexually transmitted parasite, causes approximately 283 million trichomoniasis infections annually and is associated with pregnancy complications and increased risk of HIV-1 acquisition. The antimicrobial drug metronidazole is used for treatment, but in a fraction of clinical cases, the parasites can become resistant to this drug. We undertook sequencing of multiple clinical isolates and lab-derived lines to identify genetic markers and mechanisms of metronidazole resistance. Reduced representation genome sequencing of approximately 100 T. vaginalis clinical isolates identified 3,923 SNP markers and presence of a bipartite population structure. Linkage disequilibrium was found to decay rapidly, suggesting genome-wide recombination and the feasibility of genetic association studies in the parasite. We identified 72 SNPs associated with metronidazole resistance, and a comparison of SNPs within several lab-derived resistant lines revealed an overlap with the clinically resistant isolates. We identified SNPs in genes for which no function has yet been assigned, as well as in functionally-characterized genes relevant to drug resistance (e.g., pyruvate:ferredoxin oxidoreductase). Transcription profiles of resistant strains showed common changes in genes involved in drug activation (e.g., flavin reductase), accumulation (e.g., multidrug resistance pump), and detoxification (e.g., nitroreductase). Finally, we identified convergent genetic changes in lab-derived resistant lines of Tritrichomonas foetus, a distantly-related species that causes venereal disease in cattle. Shared genetic changes within and between T. vaginalis and Tr. foetus parasites suggest conservation of the pathways through which adaptation has occurred. These findings extend our knowledge of drug resistance in the parasite, providing a panel of markers that can be used as a diagnostic tool.

    • Health Disparities RSS Word feed
      1. Research on racial/ethnic health disparities and socioeconomic position has not fully considered occupation. However, because occupations are racially patterned, certain occupational characteristics may explain racial/ethnic differences in health. This study examines the role of occupational characteristics in racial/ethnic disparities in all-cause mortality. Data are from a U.S. community-based cohort study (n=6342, median follow-up: 12.2 years), in which 893 deaths (14.1%) occurred. We estimated mortality hazard ratios (HRs) for African Americans, Hispanics, and Chinese Americans compared with whites. We also estimated the proportion of the HR mediated by each of two occupational characteristics, substantive complexity of work (e.g., problem solving, inductive/deductive reasoning on the job) and hazardous conditions (e.g., noise, extreme temperature, chemicals), derived from the Occupational Information Network database (O*NET). Analyses were adjusted for age, sex, nativity, working status at baseline, and study sites. African Americans had a higher rate of all-cause death (HR 1.41; 95% confidence interval [CI]: 1.19-1.66) than whites. Chinese-American ethnicity was protective (HR 0.59, CI: 0.40-0.85); Hispanic ethnicity was not significantly different from whites (HR 0.88; CI: 0.67-1.17). Substantive complexity of work mediated 30% of the higher rate of death for African Americans compared with whites. For other groups, mediation was not significant. Hazardous conditions did not significantly mediate mortality in any racial/ethnic group. Lower levels of substantive complexity of work mediate a substantial part of the health disadvantage in African Americans. This job characteristic may be an important factor in explaining racial health disparities.

    • Health Economics RSS Word feed
      1. Impact and cost-effectiveness of rotavirus vaccination in Bangladesh
        Pecenka C, Parashar U, Tate JE, Khan JA, Groman D, Chacko S, Shamsuzzaman M, Clark A, Atherly D.
        Vaccine. 2017 Jun 13.
        INTRODUCTION: Diarrheal disease is a leading cause of child mortality globally, and rotavirus is responsible for more than a third of those deaths. Despite substantial decreases, the number of rotavirus deaths in children under five was 215,000 per year in 2013. Of these deaths, approximately 41% occurred in Asia and 3% of those in Bangladesh. While Bangladesh has yet to introduce rotavirus vaccination, the country applied for Gavi support and plans to introduce it in 2018. This analysis evaluates the impact and cost-effectiveness of rotavirus vaccination in Bangladesh and provides estimates of the costs of the vaccination program to help inform decision-makers and international partners. METHODS: This analysis used the Pan American Health Organization’s TRIVAC model (version 2.0) to examine nationwide introduction of two-dose rotavirus vaccination in 2017, compared to no vaccination. Three mortality scenarios (low, high, and midpoint) were assessed. Benefits and costs were examined from the societal perspective over ten successive birth cohorts with a 3% discount rate. Model inputs were locally acquired and complemented by internationally validated estimates. RESULTS: Over ten years, rotavirus vaccination would prevent 4000 deaths, nearly 500,000 hospitalizations and 3 million outpatient visits in the base scenario. With a Gavi subsidy, cost/disability-adjusted life year (DALY) ratios ranged from $58/DALY to $142/DALY averted. Without a Gavi subsidy and a vaccine price of $2.19 per dose, cost/DALY ratios ranged from $615/DALY to $1514/DALY averted. CONCLUSION: The discounted cost per DALY averted was less than the GDP per capita for nearly all scenarios considered, indicating that a routine rotavirus vaccination program is highly likely to be cost-effective. Even in a low mortality setting with no Gavi subsidy, rotavirus vaccination would be cost-effective. These estimates exclude the herd immunity benefits of vaccination, so represent a conservative estimate of the cost-effectiveness of rotavirus vaccination in Bangladesh.
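        The cost-per-DALY ratios above come from discounting both costs and health outcomes at 3%. A minimal sketch of that arithmetic, using made-up yearly streams rather than the TRIVAC model's actual inputs, might look like:

```python
# Hedged sketch: discounted cost-effectiveness ratio with a 3% rate.
# All numbers are illustrative, not values from the TRIVAC model.

def present_value(stream, rate=0.03):
    """Discount a yearly stream (year 0 first) to present value."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(stream))

# Hypothetical 10-year streams: program costs front-loaded in years 0-2
incremental_costs = [2_000_000] * 3 + [1_000_000] * 7  # net cost per year (USD)
dalys_averted = [15_000] * 10                          # DALYs averted per year

# Incremental cost-effectiveness ratio: discounted costs / discounted DALYs
icer = present_value(incremental_costs) / present_value(dalys_averted)
print(f"cost per DALY averted: ${icer:.2f}")
```

        Because the costs here are front-loaded, discounting shrinks them less than the evenly spread DALYs, so the ratio is slightly higher than the undiscounted one would be.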

      2. Cost-effectiveness evaluation of a novel integrated bite case management program for the control of human rabies, Haiti 2014-2015
        Undurraga EA, Meltzer MI, Tran CH, Atkins CY, Etheart MD, Millien MF, Adrien P, Wallace RM.
        Am J Trop Med Hyg. 2017;96(6):1307-1317.
        Haiti has the highest burden of rabies in the Western Hemisphere, with 130 estimated annual deaths. We present the cost-effectiveness evaluation of an integrated bite case management program combining community bite investigations and passive animal rabies surveillance, using a governmental perspective. The Haiti Animal Rabies Surveillance Program (HARSP) was first implemented in three communes of the West Department, Haiti. Our evaluation encompassed all individuals exposed to rabies in the study area (N = 2,289) in 2014-2015. Costs (2014 U.S. dollars) included diagnostic laboratory development, training of surveillance officers, operational costs, and postexposure prophylaxis (PEP). We used estimated deaths averted and years of life gained (YLG) from prevented rabies as health outcomes. HARSP had higher overall costs (range: $39,568-$80,290) than the no-bite-case-management (NBCM) scenario ($15,988-$26,976), partly from an increased number of bite victims receiving PEP. But HARSP had better health outcomes than NBCM, with an estimated 11 additional annual averted deaths in 2014 and nine in 2015, and 654 additional YLG in 2014 and 535 in 2015. Overall, HARSP was more cost-effective (US$ per death averted) than NBCM (2014, HARSP: $2,891-$4,735, NBCM: $5,980-$8,453; 2015, HARSP: $3,534-$7,171, NBCM: $7,298-$12,284). HARSP offers an effective human rabies prevention solution for countries transitioning from reactive to preventive strategies, such as comprehensive dog vaccination.

    • Healthcare Associated Infections RSS Word feed
      1. One needle, one syringe, only one time? A survey of physician and nurse knowledge, attitudes, and practices around injection safety
        Kossover-Smith RA, Coutts K, Hatfield KM, Cochran R, Akselrod H, Schaefer MK, Perz JF, Bruss K.
        Am J Infect Control. 2017 Jun 15.
        BACKGROUND: To inform development, targeting, and penetration of materials from a national injection safety campaign, an evaluation was conducted to assess provider knowledge, attitudes, and practices related to unsafe injection practices. METHODS: A panel of physicians (n = 370) and nurses (n = 320) were recruited from 8 states to complete an online survey. Questions, using 5-point Likert and Spector scales, addressed acceptability and frequency of unsafe practices (eg, reuse of a syringe on >1 patient). Results were stratified to identify differences among physician specialties and nurse practice locations. RESULTS: Unsafe injection practices were reported by both physicians and nurses across all surveyed physician specialties and nurse practice locations. Twelve percent (12.4%) of physicians and 3% of nurses indicated reuse of syringes for >1 patient occurs in their workplace; nearly 5% of physicians indicated this practice usually or always occurs. A higher proportion of oncologists reported unsafe practices occurring in their workplace. CONCLUSIONS: There is a dangerous minority of providers violating basic standards of care; practice patterns may vary by provider group and specialty. More research is needed to understand how best to identify providers placing patients at risk of infection and modify their behaviors.

    • Immunity and Immunization RSS Word feed
      1. Trends in influenza and pneumococcal vaccination among US nursing home residents, 2006-2014
        Black CL, Williams WW, Arbeloa I, Kordic N, Yang L, MaCurdy T, Worrall C, Kelman JA.
        J Am Med Dir Assoc. 2017 Jun 13.
        BACKGROUND: Institutionalized adults are at increased risk of morbidity and mortality from influenza and pneumococcal infection. Influenza and pneumococcal vaccination have been shown to be effective in reducing hospitalization and deaths due to pneumonia and influenza in this population. OBJECTIVE: To assess trends in influenza vaccination coverage among US nursing home residents from the 2005-2006 through 2014-2015 influenza seasons and trends in pneumococcal vaccination coverage from 2006 to 2014 among US nursing home residents, by state and demographic characteristics. METHODS: Data were analyzed from the Centers for Medicare and Medicaid Services’ (CMS’s) Minimum Data Set (MDS). Influenza and pneumococcal vaccination status were assessed for all residents of CMS-certified nursing homes using data reported to the MDS by all certified facilities. RESULTS: Influenza vaccination coverage increased from 71.4% in the 2005-2006 influenza season to 75.7% in the 2014-2015 influenza season and pneumococcal vaccination coverage increased from 67.4% in 2006 to 78.4% in 2014. Vaccination coverage varied by state, with influenza vaccination coverage ranging from 50.0% to 89.7% in the 2014-2015 influenza season and pneumococcal vaccination coverage ranging from 55.0% to 89.7% in 2014. Non-Hispanic black and Hispanic residents had lower coverage compared with non-Hispanic white residents for both vaccines, and these differences persisted over time. CONCLUSION: Influenza and pneumococcal vaccination among US nursing home residents remains suboptimal. Nursing home staff can employ strategies such as provider reminders and standing orders to facilitate offering vaccination to all residents along with culturally appropriate vaccine promotion to increase vaccination coverage among this vulnerable population.

      2. Prior season vaccination and risk of influenza during the 2014-2015 season in the U.S.
        Chung JR, Flannery B, Zimmerman RK, Nowalk MP, Jackson ML, Jackson LA, Petrie JG, Martin ET, Monto AS, McLean HQ, Belongia EA, Gaglani M, Fry AM.
        J Infect Dis. 2017 Jun 13.

        [No abstract]

      3. How to increase vaccination acceptance among apostolic communities: Quantitative results from an assessment in three provinces in Zimbabwe
        Gerede R, Machekanyanga Z, Ndiaye S, Chindedza K, Chigodo C, Shibeshi ME, Goodson J, Daniel F, Kaiser R.
        J Relig Health. 2017 Jun 17.
        A worldwide increasing trend toward vaccine hesitancy has been reported. Measles outbreaks in southern Africa in 2009-2010 were linked to objections originating from Apostolic gatherings. Founded in Zimbabwe in the 1950s, the Apostolic church has built up a large number of followers, estimated at 3.5 million in Zimbabwe in 2014. To inform planning of interventions for the 2015 measles-rubella vaccination campaign, we assessed vaccination status and knowledge, attitudes and practices among purposive samples of Apostolic caregivers in three districts each in Harare City, Manicaland and Matabeleland South in Zimbabwe. We conducted structured interviews among 97 caregivers of children aged 9-59 months and collected vaccination status for 126 children. Main Apostolic affiliations were Johanne Marange (53%), Madida (13%) and Gospel of God (11%), with considerable variation across assessment areas. The assessment also showed considerable variation among Apostolic communities in children ever vaccinated (14-100%) and retention of immunization cards (0-83%) among those ever vaccinated. Overall retention of immunization cards (12%) and documented vaccination status by card (fully vaccinated = 6%) were low compared to previously reported measures in the general population. Mothers living in monogamous relationships accounted for over 90% of all card-documented DTP-HepB-Hib-3, measles, and up-to-date immunizations during the first year of life. Results revealed opportunities to educate about immunization during utilization of health services other than vaccinations, desire to receive information about vaccinations from health personnel, and willingness to accept vaccinations when offered outside of regular services. Based on the results of the assessment, specific targeted interventions were implemented during the vaccination campaign, including an increased number of advocacy activities by district authorities. Also, health workers offered ways and timing to vaccinate children that catered to the specific situation of Apostolic caregivers, including flexible service provision after hours and outside of health facilities, meeting locations chosen by caregivers, using mobile phones to set up meeting locations, and documentation of vaccination in health facilities if home-based records posed a risk for caregivers. Coverage survey results indicate that considerable progress has been made since 2010 to increase vaccination acceptability among Apostolic communities in Zimbabwe. Further efforts will be needed to vaccinate all Apostolic children during routine and campaign activities in the country, and the results from our assessment can contribute toward this goal.

      4. Qualitative assessment of vaccination hesitancy among members of the Apostolic Church of Zimbabwe: A Case Study
        Machekanyanga Z, Ndiaye S, Gerede R, Chindedza K, Chigodo C, Shibeshi ME, Goodson J, Daniel F, Zimmerman L, Kaiser R.
        J Relig Health. 2017 Jun 19.
        Vaccine hesitancy, or lack of confidence in vaccines, is considered a threat to the success of vaccination programs. The rise and spread of measles outbreaks in southern Africa in 2009-2010 were linked to objections among Apostolic Church members, estimated at about 3.5 million in Zimbabwe as of 2014. To inform planning of interventions for a measles-rubella vaccination campaign, we conducted an assessment of the factors contributing to vaccine hesitancy using data from various stakeholders. Among nine districts in three regions of Zimbabwe, we collected data on religious attitudes toward, and perceptions of, vaccines through focus group discussions with health workers serving Apostolic communities and members of the National Expanded Programme on Immunization; semi-structured interviews with religious leaders; and open-ended questions in structured interviews with Apostolic parents/caregivers. Poor knowledge of vaccines, lack of understanding and appreciation of the effectiveness of vaccinations, religious teachings that emphasize prayers over the use of medicine, lack of privacy in a religiously controlled community, and low levels of education were found to be the main factors contributing to vaccine hesitancy among key community members and leaders. Accepting vaccination in public carries a risk of sanctions. Poor knowledge of vaccines is a major factor of hesitancy, reinforced by religious teachings on the power of prayers as alternatives. Because parents/caregivers perceive vaccines as dangerous for their children and believe they can cause death or disease, members of the Apostolic Church have more confidence in alternative methods such as use of holy water and prayers to treat diseases. Under these circumstances, it is important to debunk the myths about the power of holy water on the one hand and disseminate positive information about the efficacy of vaccines on the other hand in order to reduce hesitancy. Education about vaccines and vaccination in conjunction with government intervention, for example, through the use of social distancing policies, can provide a framework for reducing hesitancy and increasing demand for vaccination.

    • Injury and Violence RSS Word feed
      1. Childhood firearm injuries in the United States
        Fowler KA, Dahlberg LL, Haileyesus T, Gutierrez C, Bacon S.
        Pediatrics. 2017 Jun 19.
        OBJECTIVES: Examine fatal and nonfatal firearm injuries among children aged 0 to 17 in the United States, including intent, demographic characteristics, trends, state-level patterns, and circumstances. METHODS: Fatal injuries were examined by using data from the National Vital Statistics System and nonfatal injuries by using data from the National Electronic Injury Surveillance System. Trends from 2002 to 2014 were tested using joinpoint regression analyses. Incident characteristics and circumstances were examined by using data from the National Violent Death Reporting System. RESULTS: Nearly 1300 children die and 5790 are treated for gunshot wounds each year. Boys, older children, and minorities are disproportionately affected. Although unintentional firearm deaths among children declined from 2002 to 2014 and firearm homicides declined from 2007 to 2014, firearm suicides decreased between 2002 and 2007 and then showed a significant upward trend from 2007 to 2014. Rates of firearm homicide among children are higher in many Southern states and parts of the Midwest relative to other parts of the country. Firearm suicides are more dispersed across the United States with some of the highest rates occurring in Western states. Firearm homicides of younger children often occurred in multivictim events and involved intimate partner or family conflict; older children more often died in the context of crime and violence. Firearm suicides were often precipitated by situational and relationship problems. The shooter playing with a gun was the most common circumstance surrounding unintentional firearm deaths of both younger and older children. CONCLUSIONS: Firearm injuries are an important public health problem, contributing substantially to premature death and disability of children. Understanding their nature and impact is a first step toward prevention.
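        Trend reversals like the 2007 upturn in firearm suicides are what joinpoint regression detects. As a rough illustration, with invented rates and with the break year fixed by assumption (real joinpoint software also searches for the break and tests its significance), segment-wise slopes can be fit like this:

```python
# Hedged sketch: segmented linear trends in yearly rates, in the spirit of
# joinpoint regression. Rates are made up; the 2007 break is assumed, not
# estimated as the Joinpoint software would do.

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Illustrative suicide rates per 100,000, 2002-2014 (synthetic data)
years = list(range(2002, 2015))
rates = [1.9, 1.85, 1.8, 1.75, 1.7, 1.65, 1.7, 1.78, 1.85, 1.9, 1.98, 2.05, 2.1]

break_year = 2007
seg1 = [(y, r) for y, r in zip(years, rates) if y <= break_year]
seg2 = [(y, r) for y, r in zip(years, rates) if y >= break_year]

slope1 = ols_slope([y for y, _ in seg1], [r for _, r in seg1])
slope2 = ols_slope([y for y, _ in seg2], [r for _, r in seg2])
print(f"2002-2007 trend: {slope1:+.3f}/yr, 2007-2014 trend: {slope2:+.3f}/yr")
```

        A downward slope before the break and an upward slope after it is the pattern the abstract describes for firearm suicides.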

      2. Trends in the leading causes of injury mortality, Australia, Canada, and the United States, 2000-2014
        Mack K, Clapperton A, Macpherson A, Sleet D, Newton D, Murdoch J, Mackay JM, Berecki-Gisolf J, Wilkins N, Marr A, Ballesteros M, McClure R.
        Can J Public Health. 2017 Jun 16;108(2):e185-e191.
        OBJECTIVES: The aim of this study was to highlight the differences in injury rates between populations through a descriptive epidemiological study of population-level trends in injury mortality for the high-income countries of Australia, Canada and the United States. METHODS: Mortality data were available for the US from 2000 to 2014, and for Canada and Australia from 2000 to 2012. Injury causes were defined using the International Classification of Diseases, Tenth Revision external cause codes, and were grouped into major causes. Rates were direct-method age-adjusted using the US 2000 projected population as the standard age distribution. RESULTS: US motor vehicle injury mortality rates declined from 2000 to 2014 but remained markedly higher than those of Australia or Canada. In all three countries, fall injury mortality rates increased from 2000 to 2014. US homicide mortality rates declined, but remained higher than those of Australia and Canada. While the US had the lowest suicide rate in 2000, it increased by 24% during 2000-2014, and by 2012 was about 14% higher than that in Australia and Canada. The poisoning mortality rate in the US increased dramatically from 2000 to 2014. CONCLUSION: Results show marked differences and striking similarities in injury mortality between the countries and within countries over time. The observed trends differed by injury cause category. The substantial differences in injury rates between similarly resourced populations raise important questions about the role of societal-level factors as underlying causes of the differential distribution of injury in our communities.
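        Direct-method age adjustment, as used above, weights each age-specific rate by that age group's share of a standard population. A minimal sketch with hypothetical strata and counts (the study's actual standard was the US 2000 projected population):

```python
# Hedged sketch of direct-method age adjustment. All strata, counts, and
# populations below are made up for illustration.

standard_pop = {"0-24": 35_000, "25-44": 30_000, "45-64": 22_000, "65+": 13_000}
deaths       = {"0-24": 12,     "25-44": 45,     "45-64": 180,    "65+": 520}
population   = {"0-24": 40_000, "25-44": 36_000, "45-64": 25_000, "65+": 9_000}

total_std = sum(standard_pop.values())

# Weighted sum of age-specific rates, weights = standard-population shares
adjusted = sum(
    (deaths[g] / population[g]) * (standard_pop[g] / total_std)
    for g in standard_pop
) * 100_000  # express per 100,000

print(f"age-adjusted rate: {adjusted:.1f} per 100,000")
```

        Because every country's rates are weighted by the same standard age distribution, the adjusted rates are comparable even when the countries' own age structures differ.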

    • Laboratory Sciences RSS Word feed
      1. Diisocyanates are highly reactive electrophiles utilized in the manufacture of a wide range of polyurethane products, and have been identified as causative agents of occupational allergic respiratory disease. However, despite the significant occupational health burden associated with diisocyanate-induced asthma, the mechanism of disease pathogenesis remains largely unknown. To better understand the fate of inhaled diisocyanates, a nose-only aerosol exposure system was constructed and used to expose a BALB/c mouse model to aerosol generated from 4,4′-methylene diphenyl diisocyanate (MDI). Tissue and bronchoalveolar lavage samples were evaluated 4 hours and 24 hours post-exposure for evidence of diisocyanate-protein haptenation, and a label-free quantitative proteomics strategy was employed to evaluate relative changes to the protein content of the cellular fraction of the lavage fluid. Following MDI aerosol exposure, expression of a number of proteins with immunological or xenobiotic-metabolism relevance was increased, including endoplasmin, cytochrome P450, and argininosuccinate synthase. Western blot analysis indicated MDI-conjugated protein in the lavage fluid, which was identified as serum albumin. Tandem mass spectrometry analysis of MDI-albumin revealed that MDI conjugation occurs at a dilysine motif at Lys525, as well as at a glutamine-lysine motif at Lys414, in good agreement with previously published in vitro data on diisocyanate-conjugated serum albumin.

      2. 4′-Azidocytidine (R1479) inhibits henipaviruses and other paramyxoviruses with high potency
        Hotard AL, He B, Nichol ST, Spiropoulou CF, Lo MK.
        Antiviral Res. 2017 Jun 16.
        The henipaviruses Nipah virus and Hendra virus are highly pathogenic zoonotic paramyxoviruses which have caused fatal outbreaks of encephalitis and respiratory disease in humans. Despite the availability of a licensed equine Hendra virus vaccine and a neutralizing monoclonal antibody shown to be efficacious against henipavirus infections in non-human primates, there are no approved therapeutics or vaccines for human use. To explore the possibility of developing small-molecule nucleoside inhibitors against henipaviruses, we evaluated the antiviral activity of 4′-azidocytidine (R1479), a drug previously identified to inhibit flaviviruses, against henipaviruses along with other representative members of the family Paramyxoviridae. We observed similar levels of R1479 antiviral activity across the family, regardless of virus genus. Our brief study expands the documented range of viruses susceptible to R1479, and provides the basis for future investigation and development of 4′-modified nucleoside analogs as potential broad-spectrum antiviral therapeutics across both positive- and negative-sense RNA virus families.

      3. Measuring influenza laboratory capacity: use of a tool to measure improvements
        Kennedy P, Aden T, Cheng PY, Moen A.
        BMC Infect Dis. 2017 Jun 15;17(1):431.
        BACKGROUND: To collect information, identify training needs, and assist with influenza capacity building, voluntary laboratory capacity assessments were conducted using a standardized tool in CDC cooperative agreement countries. To understand the usefulness of comparing results from repeat assessments and to determine whether targeted training supported improvements, this paper compares the results of 17 repeat laboratory assessments conducted between 2009 and 2013. METHODS: Laboratory assessments were conducted by subject matter experts (SMEs) in 17 laboratories (16 countries). We reviewed the quantitative assessment results of the laboratories that conducted both an initial and a follow-up assessment between 2009 and 2013 using repeated-measures ANOVA (MIXED procedure, SAS 9.3). Additionally, we compared the overall summary scores and the assessor recommendations from the two assessments. RESULTS: We documented a statistically significant improvement between the first and second assessments on both aggregate and individual indicator scores. Within the international capacity tool, three of the eight categories recorded statistically significant improvement (equipment, management, and QA/QC), while the other categories (molecular, NIC, specimen, safety, and virology) showed improvements in scores that were not statistically significant. CONCLUSIONS: We found that using a standardized tool and quantitative framework is useful for documenting capacity and performance improvement in identified areas over time. The use of the tool and standard reports with assessor recommendations assisted laboratories with establishing, maintaining, and improving influenza laboratory practices. Ongoing assessments and the consistent application of the analytic framework over time will continue to aid in building a measurement knowledge base for laboratory capacity.

      4. Evaluation of the field performance of ImmunoCard STAT! rapid diagnostic test for rotavirus in Dadaab refugee camp and at the Kenya-Somalia Border
        Ope M, Nyoka R, Unshur A, Oyier FO, Mowlid SA, Owino B, Ochieng SB, Okello CI, Montgomery JM, Wagacha B, Galev A, Abdow A, Esona MD, Tate J, Fitter D, Cookson ST, Arunmozhi B, Marano N.
        Am J Trop Med Hyg. 2017;96(6):1302-1306.
        Rotavirus commonly causes diarrhea in children, leading to hospitalization and even death. Rapid diagnostic tests are feasible alternatives for determining rotavirus outbreaks in refugee camps that have inadequate laboratory capacity. We evaluated the field performance of ImmunoCard STAT! Rotavirus (ICS-RV) in Dadaab Refugee Camp and at the Kenya-Somalia border. From May to December 2014, we prospectively enrolled children aged < 5 years hospitalized with acute diarrhea, defined as >= 3 episodes of loose stool in 24 hours for < 7 days. Stool samples were collected and tested by trained surveillance clerks using ICS-RV per manufacturer’s instructions. The field performance characteristics of ICS-RV were evaluated against the gold standard test, the Premier Rotaclone enzyme immunoassay. The operational characteristics were evaluated using World Health Organization (WHO) ASSURED criteria to determine whether ICS-RV is appropriate as a point-of-care test, by administering a standard questionnaire and observing surveillance clerks performing the test. We enrolled 213 patients with a median age of 10 months (range = 1-48); 58.2% were male. A total of 71 (33.3%) and 60 (28.2%) patients tested positive for rotavirus infection by immunoassay and ICS-RV, respectively. The sensitivity, specificity, and positive and negative predictive values of ICS-RV compared with the immunoassay were 83.1% (95% confidence interval [CI] = 72.3-91.0), 99.3% (95% CI = 96.1-100), 98.3% (95% CI = 91.1-100), and 92.1% (95% CI = 86.6-95.5), respectively. The ICS-RV fulfilled the WHO ASSURED criteria for point-of-care testing. ICS-RV is a field-ready point-of-care test with good field performance and operational characteristics. It can be useful in determining rotavirus outbreaks in resource-limited settings.
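        The reported accuracy measures follow from a standard 2x2 comparison against the gold standard. The cell counts below are back-calculated from the abstract's totals (n = 213, 71 EIA-positive, 60 ICS-RV-positive), which is an assumption since the paper's exact table is not shown here; they reproduce the published values to within rounding:

```python
# Hedged sketch: 2x2 diagnostic accuracy vs. a gold standard (Rotaclone EIA).
# Cell counts are back-calculated from the abstract's summary figures -- an
# assumption, not the paper's published table.

tp, fn = 59, 12   # EIA-positive children: detected / missed by ICS-RV
fp, tn = 1, 141   # EIA-negative children: false alarms / true negatives

sensitivity = tp / (tp + fn)  # proportion of true positives detected
specificity = tn / (tn + fp)  # proportion of true negatives cleared
ppv = tp / (tp + fp)          # positive predictive value
npv = tn / (tn + fn)          # negative predictive value

print(f"sens={sensitivity:.1%} spec={specificity:.1%} "
      f"ppv={ppv:.1%} npv={npv:.1%}")
```

        Note that predictive values, unlike sensitivity and specificity, depend on the 33% rotavirus prevalence in this sample and would shift in a population with different prevalence.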

      5. Neuraminidase-based recombinant virus-like particles protect against lethal avian influenza A(H5N1) virus infection in ferrets
        Smith GE, Sun X, Bai Y, Liu YV, Massare MJ, Pearce MB, Belser JA, Maines TR, Creager HM, Glenn GM, Flyer D, Pushko P, Levine MZ, Tumpey TM.
        Virology. 2017 Jun 15;509:90-97.
        Avian influenza A (H5N1) viruses represent a growing threat for an influenza pandemic. The presence of widespread avian influenza virus infections further emphasizes the need for vaccine strategies for control of pre-pandemic H5N1 and other avian influenza subtypes. Influenza neuraminidase (NA) vaccines represent a potential strategy for improving vaccines against avian influenza H5N1 viruses. To evaluate a strategy for NA vaccination, we generated a recombinant influenza virus-like particle (VLP) vaccine comprising the NA protein of A/Indonesia/05/2005 (H5N1) virus. Ferrets vaccinated with influenza N1 NA VLPs elicited high-titer serum NA-inhibition (NI) antibody titers and were protected from lethal challenge with A/Indonesia/05/2005 virus. Moreover, N1-immune ferrets shed less infectious virus than similarly challenged control animals. In contrast, ferrets administered control N2 NA VLPs were not protected against H5N1 virus challenge. These results provide support for continued development of NA-based vaccines against influenza H5N1 viruses.

      6. Evaluation of the performance of Abbott m2000 and Roche COBAS Ampliprep/COBAS Taqman assays for HIV-1 viral load determination using dried blood spots and dried plasma spots in Kenya
        Zeh C, Ndiege K, Inzaule S, Achieng R, Williamson J, Chih-Wei Chang J, Ellenberger D, Nkengasong J.
        PLoS One. 2017;12(6):e0179316.
        BACKGROUND: Routine HIV viral load testing is not widely accessible in most resource-limited settings, including Kenya. To increase access to viral load testing, alternative sample types like dried blood spots (DBS), which overcome the logistic barriers associated with plasma separation and cold chain shipment, need to be considered and evaluated. The current study evaluated matched dried blood spots (DBS) and dried plasma spots (DPS) against plasma using the Abbott m2000 (Abbott) and Roche Cobas Ampliprep/Cobas TaqMan (CAP/CTM) quantitative viral load assays in western Kenya. METHODS: Matched plasma, DBS and DPS were obtained from 200 HIV-1 infected antiretroviral treatment (ART)-experienced patients attending patient support centers in western Kenya. Standard quantitative assay performance parameters with accompanying 95% confidence intervals (CI) were assessed at the assays’ lower detection limits (400 cps/ml for CAP/CTM and 550 cps/ml for Abbott) using SAS version 9.2. Receiver operating characteristic (ROC) curves were further used to assess the viral-load thresholds with best assay performance (reference assay: CAP/CTM plasma). RESULTS: Using the Abbott test, the sensitivity and specificity, respectively, for DPS were (97.3% [95%CI: 93.2-99.2] and 98.1% [95%CI: 89.7-100]) and those for DBS (93.9% [95%CI: 88.8-97.2] and 88.0% [95%CI: 82.2-92.4]). The correlation and agreement using paired plasma and DPS/DBS were strong, with r2 = 90.5 and rc = 68.1. The Bland-Altman relative percent change was 95.3 for DPS (95%CI: 90.4-97.7) and 73.6 (95%CI: 51.6-86.5) for DBS. Using the CAP/CTM assay, the sensitivity for DBS was significantly higher compared to DPS (100.0% [95% CI: 97.6-100.0] vs. 94.7% [95%CI: 89.8-97.7]), while the specificity for DBS was lower: 4% [95% CI: 0.4-13.7] compared to DPS: 94.0% [95% CI: 83.5-98.7]. When compared under different clinically relevant thresholds, the accuracy for the Abbott assay was 95% at the 1000 cps/ml cut-off, with a sensitivity and specificity of 96.6% [95% CI 91.8-98.7] and 90.4% [95% CI 78.2-96.4], respectively. The optimum threshold was at 3000 cps/ml, with an accuracy of 95.5% and sensitivity and specificity of 94.6% [95%CI 89.3-97.5] and 98.1% [95%CI 88.4-99.9], respectively. The best threshold for CAP/CTM was at 4000 copies/ml, with 92.5% accuracy (sensitivity of 96.0% [95%CI 91.0-98.3] and specificity of 82.7% [95%CI 69.2-91.3]). CONCLUSIONS: There was similar performance between matched DBS, DPS and plasma using the Abbott test, and good correlation for matched DPS and plasma using the CAP/CTM test. The findings suggest that DBS and DPS may be reliably used as alternative specimens to plasma to measure HIV-1 VL using Abbott, and DPS may be reliably used with CAP/CTM in resource-limited settings.
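        The Bland-Altman analysis mentioned above summarizes agreement between paired measurements via the mean difference (bias) and 95% limits of agreement. A minimal sketch on synthetic log10 viral loads (not the study's data):

```python
# Hedged sketch of a Bland-Altman agreement check on log10 viral loads.
# The paired values below are synthetic, not taken from the study.
import math

plasma = [4.2, 3.8, 5.1, 2.9, 4.7, 3.5]  # log10 copies/ml, reference method
dbs    = [4.0, 3.9, 4.8, 2.7, 4.6, 3.2]  # log10 copies/ml, dried blood spots

diffs = [a - b for a, b in zip(plasma, dbs)]
n = len(diffs)
bias = sum(diffs) / n  # mean difference between the two methods
sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

print(f"bias={bias:+.2f} log10, LoA=({loa[0]:+.2f}, {loa[1]:+.2f})")
```

        A bias near zero with narrow limits of agreement supports using the alternative specimen type interchangeably with plasma.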

    • Maternal and Child Health RSS Word feed
      1. Autism spectrum disorder in fragile X syndrome: Cooccurring conditions and current treatment
        Kaufmann WE, Kidd SA, Andrews HF, Budimirovic DB, Esler A, Haas-Givler B, Stackhouse T, Riley C, Peacock G, Sherman SL, Brown WT, Berry-Kravis E.
        Pediatrics. 2017 June;139:S194-S206.
        Background and objective: Individuals with fragile X syndrome (FXS) are frequently codiagnosed with autism spectrum disorder (ASD). Most of our current knowledge about ASD in FXS comes from family surveys and small studies. The objective of this study was to examine the impact of the ASD diagnosis in a large clinic-based FXS population to better inform the care of people with FXS. Methods: The study employed a data set populated with data from individuals with FXS seen at specialty clinics across the country. The data were collected by clinicians at the patient visit and by parent report for nonclinical and behavioral outcomes from September 7, 2012 through August 31, 2014. Data analyses were performed by using chi-square tests for association, t tests, and multiple logistic regression to examine the association of clinical and other factors with ASD status. Results: Half of the males and nearly 20% of females met Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition criteria for current ASD. Relative to the FXS-only group, the FXS with ASD (FXS+ASD) group had a higher prevalence of seizures (20.7% vs 7.6%, P < .001), persistence of sleep problems later in childhood, increased behavior problems, especially aggressive/disruptive behavior, and higher use of alpha-agonists and antipsychotics. Behavioral services, including applied behavior analysis, appeared to be underused in children with FXS+ASD (only 26% and 16% in prekindergarten and school-age periods, respectively) relative to other populations with idiopathic ASD. Conclusions: These findings confirm among individuals with FXS an association of an ASD diagnosis with important cooccurring conditions and identify gaps between expected and observed treatments among individuals with FXS+ASD.

      2. Adverse Childhood Experiences (ACEs) are prevalent in the population and linked to various negative long-term health and social consequences. However, due to the retrospective nature of most studies on the topic, little is currently known regarding ACEs’ immediate health impact. This study aims to provide insight into this area by examining the association between a new measurement, Adverse Family Experiences (AFEs), and flourishing amongst children ages 6-17 years in the United States. Data from the 2011/12 National Survey of Children’s Health were analyzed. Adjusted prevalence ratios assessed flourishing by the number of AFEs (0 events versus 1, 2, 3/3+) controlling for individual/household characteristics. A sub-analysis examined characteristics of flourishing children ages 12-17 years with 3/3+ AFEs. The results showed children with 1 AFE (APR=0.87; 95% CI=0.83-0.91), 2 AFEs (0.74; 0.69-0.79), and 3/3+ AFEs (0.68; 0.62-0.72) were less likely to flourish compared to those without any AFEs. Sub-analysis of children ages 12-17 years with 3/3+ AFEs revealed a higher proportion of flourishing children volunteering, participating in extracurricular activities, and working for pay compared to those who did not flourish. Findings show significant differences in flourishing by number of AFEs and suggest that social connectedness may play a role in determining flourishing amongst children with 3/3+ AFEs. Furthermore, the results highlight the potential importance of identifying children with high AFE counts and helping them build resilience outside of the home.

      3. Public health literature review of fragile X syndrome
        Raspa M, Wheeler AC, Riley C.
        Pediatrics. 2017;139:S153-S171.
        Objectives: The purpose of this systematic literature review is to describe what is known about fragile X syndrome (FXS) and to identify research gaps. The results can be used to help inform future public health research and provide pediatricians with up-to-date information about the implications of the condition for individuals and their families. Methods: An electronic literature search was conducted, guided by a variety of key words. The search focused on 4 areas of both clinical and public health importance: (1) the full mutation phenotype, (2) developmental trajectories across the life span, (3) available interventions and treatments, and (4) impact on the family. A total of 661 articles were examined and 203 were included in the review. Results: The information is presented in the following categories: developmental profile (cognition, language, functional skills, and transition to adulthood), social-emotional profile (cooccurring psychiatric conditions and behavior problems), medical profile (physical features, seizures, sleep, health problems, and physiologic features), treatment and interventions (educational/behavioral, allied health services, and pharmacologic), and impact on the family (family environment and financial impact). Research gaps also are presented. Conclusions: The identification and treatment of FXS remains an important public health and clinical concern. The information presented in this article provides a more robust understanding of FXS and the impact of this complex condition for pediatricians. Despite a wealth of information about the condition, much work remains to fully support affected individuals and their families.

      4. The future of fragile X syndrome: CDC stakeholder meeting summary
        Riley C, Mailick M, Berry-Kravis E, Bolen J.
        Pediatrics. 2017 June;139:S147-S152.

        [No abstract]

      5. Assessing the fragile X syndrome newborn screening landscape
        Riley C, Wheeler A.
        Pediatrics. 2017;139:S207-S215.
        Background: Fragile X syndrome (FXS) is the most common known inherited form of intellectual disability. Early identification is an important step in linking individuals with FXS to appropriate and timely medical and social services. Newborn screening (NBS) is 1 approach that has been used for other conditions to facilitate early identification. Methods: A literature review was conducted to identify issues, barriers, challenges, and approaches to addressing challenges related to NBS for FXS. Search terms included: fragile X syndrome, FMR1, newborn screening, screening, and genetic testing. To supplement the literature review, 9 key informant interviews were conducted. Information gathered through these interviews supplemented what was identified in the literature. Information from both the literature review and supplemental interviews was reviewed by 3 researchers who discussed and came to consensus on thematic areas and categorization of issues. Results: The barriers and challenges related to NBS for FXS identified in the literature and by experts and stakeholders are categorized into 5 thematic areas: public health burden, treatment, timing, screening/testing methodologies, and translating results. Summaries of these issues and barriers are provided, along with potential approaches to addressing them. Conclusions: The issues and barriers described in this article highlight limited areas of knowledge that need to be addressed to improve our understanding of FXS and the potential benefit of NBS. The landscape of NBS for FXS could be influenced by a series of research findings over time or a larger breakthrough that demonstrates an effective targeted treatment that has to be implemented early in life.

      6. Forward: A registry and longitudinal clinical database to study fragile X syndrome
        Sherman SL, Kidd SA, Riley C, Berry-Kravis E, Andrews HF, Miller RM, Lincoln S, Swanson M, Kaufmann WE, Brown WT.
        Pediatrics. 2017 June;139:S183-S193.
        Background and objective: Advances in the care of patients with fragile X syndrome (FXS) have been hampered by lack of data. This deficiency has produced fragmentary knowledge regarding the natural history of this condition, healthcare needs, and the effects of the disease on caregivers. To remedy this deficiency, the Fragile X Clinic and Research Consortium was established to facilitate research. Through a collective effort, the Fragile X Clinic and Research Consortium developed the Fragile X Online Registry With Accessible Research Database (FORWARD) to facilitate multisite data collection. This report describes FORWARD and the way it can be used to improve health and quality of life of FXS patients and their relatives and caregivers. Methods: FORWARD collects demographic information on individuals with FXS and their family members (affected and unaffected) through a 1-time registry form. The longitudinal database collects clinician- and parent-reported data on individuals diagnosed with FXS, focused on those who are 0 to 24 years of age, although individuals of any age can participate. Results: The registry includes >2300 registrants (data collected September 7, 2009 to August 31, 2014). The longitudinal database includes data on 713 individuals diagnosed with FXS (data collected September 7, 2012 to August 31, 2014). Longitudinal data continue to be collected on enrolled patients along with baseline data on new patients. Conclusions: FORWARD represents the largest resource of clinical and demographic data for the FXS population in the United States. These data can be used to advance our understanding of FXS: the impact of cooccurring conditions, the impact on the day-to-day lives of individuals living with FXS and their families, and short-term and long-term outcomes.

      7. Self-injurious behaviors in children with autism spectrum disorder enrolled in the Study to Explore Early Development
        Soke GN, Rosenberg SA, Rosenberg CR, Vasa RA, Lee LC, DiGuiseppi C.
        Autism. 2017 Jun 01:1362361316689330.
        We assessed potential factors associated with “current” or “ever” self-injurious behaviors, reported in the Autism Diagnostic Interview-Revised, among children with autism spectrum disorder (n = 692) from the Study to Explore Early Development. Data on factors examined were obtained from questionnaires, standardized clinical instruments, and birth certificates. We employed a log-binomial regression to assess these associations. Although most associations were quite similar for currently and ever exhibiting self-injurious behaviors, a few differences were noted. We documented previously unreported associations of current self-injurious behaviors with maternal age and cesarean delivery, and ever self-injurious behaviors with maternal age, child sex, gestational age, and maternal race. We also confirmed previously reported associations with adaptive skills, somatic conditions (sleep, gastrointestinal, and sensory abnormalities), and other behavioral problems. These findings are informative for clinical practice and future research.

      8. Implications of the FMR1 premutation for children, adolescents, adults, and their families
        Wheeler A, Raspa M, Hagerman R, Mailick M, Riley C.
        Pediatrics. 2017;139:S172-S182.
        Background and objectives: Given the nature of FMR1 gene expansions, most biological mothers, and often multiple other family members of children with fragile X syndrome (FXS), will have a premutation, which may increase individual and family vulnerabilities. This article summarizes important gaps in knowledge and notes potential implications for pediatric providers with regard to developmental and medical risks for children and adolescents with an FMR1 premutation, including possible implications into adulthood. Methods: A structured electronic literature search was conducted on FMR1 pre- and full mutations, yielding a total of 306 articles examined. Of these, 116 focused primarily on the premutation and are included in this review. Results: Based on the literature review, 5 topic areas are discussed: genetics and epidemiology; phenotypic characteristics of individuals with the premutation; implications for carrier parents of children with FXS; implications for the extended family; and implications for pediatricians. Conclusions: Although the premutation phenotype is typically less severe in clinical presentation than in FXS, premutation carriers are much more common and are therefore more likely to be seen in a typical pediatric practice. In addition, there is a wide range of medical, cognitive/developmental, and psychiatric associated features that individuals with a premutation are at increased risk for having, which underscores the importance of awareness on the part of pediatricians in identifying and monitoring premutation carriers and recognizing the impact this identification may have on family members.

    • Mining RSS Word feed
      1. There are two types of through-the-earth (TTE) wireless communication in the mining industry: magnetic loop TTE and electrode-based (or linear) TTE. While the magnetic loop systems send signal through magnetic fields, the transmitter of an electrode-based TTE system sends signal directly through the mine overburden by driving an extremely low frequency (ELF) or ultralow frequency (ULF) AC current into the earth. The receiver at the other end (underground or surface) detects the resultant current and receives it as a voltage. A wireless communication link between surface and underground is then established. For electrode-based TTE communications, the signal is transmitted through the established electric field and is received as a voltage detected at the receiver. It is important to understand the electric field distribution within the mine overburden for the purpose of designing and improving the performance of the electrode-based TTE systems. In this paper, a complete explicit solution for all three electric field components for the electrode-based TTE communication was developed. An experiment was conducted using a prototype electrode-based TTE system developed by National Institute for Occupational Safety and Health. The mathematical model was then compared and validated with test data. A reasonable agreement was found between them.
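The paper's explicit three-component field solution is not reproduced in this summary, but the constraint that drives ELF/ULF operation can be illustrated with the standard plane-wave skin-depth formula: penetration depth falls as frequency and overburden conductivity rise. A minimal sketch; the 0.01 S/m conductivity is an assumed illustrative value, not a figure from the study:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def skin_depth(freq_hz, conductivity_s_m):
    """Plane-wave skin depth: depth at which field amplitude falls to 1/e."""
    return math.sqrt(1.0 / (math.pi * freq_hz * MU0 * conductivity_s_m))

# Assumed overburden conductivity of 0.01 S/m, illustrative only
for f in (100.0, 1000.0, 3000.0):
    print(f"{f:>6.0f} Hz -> skin depth {skin_depth(f, 0.01):.0f} m")
```

The rapid fall-off of penetration with frequency is why TTE systems operate in the ELF/ULF bands.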

      2. Characterization of a mine fire using atmospheric monitoring system sensor data
        Yuan L, Thomas RA, Zhou L.
        Mining Engineering. 2017;69(6):57-62.
        Atmospheric monitoring systems (AMS) have been widely used in underground coal mines in the United States for the detection of fire in the belt entry and the monitoring of other ventilation-related parameters such as airflow velocity and methane concentration in specific mine locations. In addition to an AMS being able to detect a mine fire, the AMS data have the potential to provide fire characteristic information such as fire growth – in terms of heat release rate – and exact fire location. Such information is critical in making decisions regarding fire-fighting strategies, underground personnel evacuation and optimal escape routes. In this study, a methodology was developed to calculate the fire heat release rate using AMS sensor data for carbon monoxide concentration, carbon dioxide concentration and airflow velocity based on the theory of heat and species transfer in ventilation airflow. Full-scale mine fire experiments were then conducted in the Pittsburgh Mining Research Division’s Safety Research Coal Mine using an AMS with different fire sources. Sensor data collected from the experiments were used to calculate the heat release rates of the fires using this methodology. The calculated heat release rate was compared with the value determined from the mass loss rate of the combustible material using a digital load cell. The experimental results show that the heat release rate of a mine fire can be calculated using AMS sensor data with reasonable accuracy.
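The abstract names the inputs (CO and CO2 concentrations, airflow velocity) but not the exact formula. A minimal sketch of one standard species-generation calorimetry relation, using only the CO2 rise, is shown below; it is in the spirit of, but not necessarily identical to, the paper's methodology, and all constants and example values are illustrative assumptions:

```python
RHO_AIR = 1.2                   # air density, kg/m^3 (assumed)
M_CO2_OVER_M_AIR = 44.0 / 29.0  # molar mass ratio, CO2 to air
E_CO2 = 13.3e6                  # heat per kg CO2 generated, J/kg (typical literature value)

def heat_release_rate(velocity_m_s, area_m2, delta_x_co2):
    """HRR (W) from the rise in CO2 mole fraction across the fire zone."""
    air_mass_flow = RHO_AIR * velocity_m_s * area_m2                 # kg/s
    co2_generation = air_mass_flow * delta_x_co2 * M_CO2_OVER_M_AIR  # kg/s
    return E_CO2 * co2_generation

# Example: 2 m/s airflow in a 10 m^2 entry, CO2 mole fraction up by 0.001
print(f"{heat_release_rate(2.0, 10.0, 0.001) / 1000:.0f} kW")
```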

    • Occupational Safety and Health RSS Word feed
      1. Contamination of firefighter personal protective equipment and skin and the effectiveness of decontamination procedures
        Fent KW, Alexander B, Roberts J, Robertson S, Toennis C, Sammons D, Bertke S, Kerber S, Smith D, Horn G.
        J Occup Environ Hyg. 2017 Jun 21.
        Firefighters’ skin may be exposed to chemicals via permeation/penetration of combustion byproducts through or around personal protective equipment (PPE) or from the cross-transfer of contaminants on PPE to the skin. Additionally, volatile contaminants can evaporate from PPE following a response and be inhaled by firefighters. Using polycyclic aromatic hydrocarbons (PAHs) and volatile organic compounds (VOCs) as respective markers for non-volatile and volatile substances, we investigated the contamination of firefighters’ turnout gear and skin following controlled residential fire responses. Participants were grouped into three crews of twelve firefighters. Each crew was deployed to a fire scenario (one per day, four total) and then paired up to complete six fireground job assignments. Wipe sampling of the exterior of the turnout gear was conducted pre- and post-fire. Wipe samples were also collected from a subset of the gear after field decontamination. VOCs off-gassing from gear were also measured pre-fire, post-fire, and post-decon. Wipe sampling of the firefighters’ hands and neck was conducted pre- and post-fire. Additional wipes were collected after cleaning neck skin. PAH levels on turnout gear increased after each response and were greatest for gear worn by firefighters assigned to fire attack and to search and rescue activities. Field decontamination using dish soap, water, and scrubbing was able to reduce PAH contamination on turnout jackets by a median of 85%. Off-gassing VOC levels increased post-fire and then decreased 17-36 minutes later regardless of whether field decontamination was performed. Median post-fire PAH levels on the neck were near or below the limit of detection (< 24 micrograms per square meter [µg/m²]) for all positions. For firefighters assigned to attack, search, and outside ventilation, the 75th percentile values on the neck were 152, 71.7, and 39.3 µg/m², respectively. Firefighters assigned to attack and search had higher post-fire median hand contamination (135 and 226 µg/m², respectively) than other positions (< 10.5 µg/m²). Cleansing wipes were able to reduce PAH contamination on neck skin by a median of 54%.

      2. Reflective responses following a role-play simulation of nurse bullying
        Ulrich DL, Gillespie GL, Boesch MC, Bateman KM, Grubb PL.
        Nurs Educ Perspect. 2017 Jul/Aug;38(4):203-205.
        A qualitative exploratory design was used for this study to evaluate role-play simulation as an active learning strategy. The context for the role-play was bullying in nursing practice. Following a simulation, 333 students from five college campuses of three universities completed a reflection worksheet. Qualitative thematic findings were personal responses, nonverbal communications exhibited, actions taken by participants, and the perceived impact of bullying during the simulation. Role-play simulation was a highly effective pedagogy, eliciting learning in both the cognitive and affective domains.

    • Parasitic Diseases RSS Word feed
      1. Addressing the challenges of Chagas disease: An emerging health concern in the United States
        Edwards MS, Stimpert KK, Montgomery SP.
        Infect Dis Clin Pract (Baltim Md). 2017;25(3):118-125.

        [No abstract]

      2. The economic value of long-lasting insecticidal nets and indoor residual spraying implementation in Mozambique
        Lee BY, Bartsch SM, Stone NT, Zhang S, Brown ST, Chatterjee C, DePasse JV, Zenkov E, Briet OJ, Mendis C, Viisainen K, Candrinho B, Colborn J.
        Am J Trop Med Hyg. 2017;96(6):1430-1440.
        Malaria-endemic countries have to decide how much of their limited resources for vector control to allocate toward implementing long-lasting insecticidal nets (LLINs) versus indoor residual spraying (IRS). To help the Mozambique Ministry of Health use an evidence-based approach to determine funding allocation toward various malaria control strategies, the Global Fund convened the Mozambique Modeling Working Group which then used JANUS, a software platform that includes integrated computational economic, operational, and clinical outcome models that can link with different transmission models (in this case, OpenMalaria) to determine the economic value of vector control strategies. Any increase in LLINs (from 80% baseline coverage) or IRS (from 80% baseline coverage) would be cost-effective (incremental cost-effectiveness ratios ≤ $114/disability-adjusted life year averted). However, LLIN coverage increases tend to be more cost-effective than similar IRS coverage increases, except where both pyrethroid resistance is high and LLIN usage is low. In high-transmission northern regions, increasing LLIN coverage would be more cost-effective than increasing IRS coverage. In medium-transmission central regions, changing from LLINs to IRS would be more costly and less effective. In low-transmission southern regions, LLINs were more costly and less effective than IRS, due to low LLIN usage. In regions where LLINs are more cost-effective than IRS, it is worth considering prioritizing LLIN coverage and use. However, IRS may have an important role in insecticide resistance management and epidemic control. Malaria intervention campaigns are not a one-size-fits-all solution, and tailored approaches are necessary to account for the heterogeneity of malaria epidemiology.
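The $114-per-DALY cost-effectiveness bound cited above applies to an incremental cost-effectiveness ratio (ICER): extra spending divided by extra DALYs averted relative to the comparator. A minimal sketch with hypothetical numbers, not the study's data:

```python
def icer(cost_new, cost_base, dalys_new, dalys_base):
    """Incremental cost-effectiveness ratio: extra cost per extra DALY averted."""
    return (cost_new - cost_base) / (dalys_new - dalys_base)

# Hypothetical scale-up: $200,000 more spending averts 2,000 more DALYs
value = icer(1_200_000, 1_000_000, 12_000, 10_000)
print(f"${value:.0f} per DALY averted")
print("cost-effective" if value <= 114 else "not cost-effective")
```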

      3. Multiplex serologic assessment of schistosomiasis in Western Kenya: Antibody responses in preschool aged children as a measure of reduced transmission
        Won KY, Kanyi HM, Mwende FM, Wiegand RE, Goodhew EB, Priest JW, Lee YM, Njenga SM, Secor WE, Lammie PJ, Odiere MR.
        Am J Trop Med Hyg. 2017;96(6):1460-1467.
        Currently, impact of schistosomiasis control programs in Schistosoma mansoni-endemic areas is monitored primarily by assessment of parasitologic indicators only. Our study was conducted to evaluate the use of antibody responses as a way to measure the impact of schistosomiasis control programs. A total of 3,612 serum samples collected at three time points from children 1-5 years of age were tested for antibody responses to two schistosome antigens (soluble egg antigen [SEA] and Sm25) by multiplex bead assay. The overall prevalence of antibody responses to SEA was high at baseline (50.0%). After one round of mass drug administration (MDA), there was minimal change in odds of SEA positivity (odds ratio [OR] = 1.02, confidence interval [CI] = 0.79-1.32, P = 0.89). However, after two rounds of treatment, there was a slight decrease in odds of SEA positivity (OR = 0.80, CI = 0.63-1.02, P = 0.08). In contrast to the SEA results, prevalence of antibody responses to Sm25 was lowest at baseline (14.1%) and higher in years 2 (19.8%) and 3 (18.4%). After one round of MDA, odds of Sm25 positivity increased significantly (OR = 1.51, CI = 1.14-2.02, P = 0.005) and remained significantly higher than baseline after two rounds of MDA (OR = 1.37, CI = 1.07-1.76, P = 0.01). There was a significant decrease in the proportion of 1-year-olds with positive SEA responses from 33.1% in year 1 to 13.2% in year 3 and a corresponding decrease in the odds (OR = 3.25, CI = 1.75-6.08, P < 0.001). These results provide preliminary evidence that schistosomiasis program impact can be monitored using serologic responses.
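The odds ratios and confidence intervals reported above follow the usual 2×2-table construction with a log-normal interval. A minimal sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table: a/b = outcome present/absent
    among exposed, c/d = outcome present/absent among unexposed."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(150, 350, 120, 380)  # hypothetical counts
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```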

    • Reproductive Health RSS Word feed
      1. Association of vitamin D intake and serum levels with fertility: results from the Lifestyle and Fertility Study
        Fung JL, Hartman TJ, Schleicher RL, Goldman MB.
        Fertil Steril. 2017 Jun 16.
        OBJECTIVE: To evaluate the role of vitamin D intake and serum levels on conception of clinical pregnancy and live birth. DESIGN: Prospective cohort study. SETTING: Academic medical centers. PATIENT(S): Healthy, nulliparous women, age 18-39 years, and their male partners. INTERVENTION(S): None. MAIN OUTCOME MEASURE(S): Clinical pregnancy and live birth were compared between those who did or did not meet the vitamin D estimated average requirement (EAR) intake (10 µg/d) and with serum 25-hydroxyvitamin D (25(OH)D) considered at risk for inadequacy or deficiency (<50 nmol/L) or sufficient (≥50 nmol/L). RESULT(S): Among 132 women, 37.1% did not meet the vitamin D EAR and 13.9% had serum levels at risk for inadequacy or deficiency. Clinical pregnancies were significantly higher among women who met the vitamin D EAR (67.5% vs. 49.0%) and with sufficient serum 25(OH)D (64.3% vs. 38.9%) compared with those who did not. Live births were higher among those who met the vitamin D EAR (59.0% vs. 40.0%). The adjusted odds ratio (AOR) of conceiving a clinical pregnancy was significantly higher among those who met the EAR (AOR = 2.26; 95% confidence interval [CI], 1.05-4.86) and had sufficient serum 25(OH)D (AOR = 3.37; 95% CI, 1.06-10.70). The associations were not significant after controlling for selected nutrients and dietary quality. CONCLUSION(S): Women with vitamin D intake below EAR and serum 25(OH)D levels at risk for inadequacy or deficiency may be less likely to conceive and might benefit from increased vitamin D intake to achieve adequacy. CLINICAL TRIAL REGISTRATION NUMBER: NCT00642590.

      2. Do nonclinical community-based youth-serving professionals talk with young men about sexual and reproductive health and intend to refer them for care?
        Marcell AV, Gibbs SE, Howard SR, Pilgrim NA, Jennings JM, Sanders R, Page KR, Loosier PS, Dittus PJ.
        Am J Mens Health. 2017 Jul;11(4):1046-1054.
        Young men (ages 15-24) may benefit from community-based connections to care since many have sexual and reproductive health (SRH) needs and low care use. This study describes nonclinical community-based youth-serving professionals’ (YSPs) SRH knowledge, confidence, past behaviors, and future intentions to talk with young men about SRH and refer them to care, and examines factors associated with care referral intentions. YSPs (n = 158) from 22 settings in one mid-Atlantic city answered questions about the study’s goal, their demographics and work environment from August 2014 to December 2015. Poisson regression assessed factors associated with YSPs’ care referral intentions. On average, YSPs answered 58% of knowledge questions correctly, knew 5 of 8 SRH care dimensions of where to refer young men, and perceived being somewhat/very confident talking with young men about SRH (63%) and referring them to care (77%). During the past month, the majority (63%) talked with young men about SRH but only one-third made care referrals; the majority (66%) were somewhat/very likely to refer them to care in the next 3 months. Adjusted models indicated YSPs were more likely to refer young men if they had a very supportive work environment to talk about SRH (adjusted RR = 1.51, 95% CI [1.15, 1.98]), greater confidence in SRH care referral (1.28 [1.00, 1.62]), and greater SRH care referrals in the past month (1.16 [1.02, 1.33]). Nonclinical community-based YSPs have poor-to-moderate knowledge about young men’s SRH care, and less than one-third reported referrals in the past month. Findings have implications for educating YSPs about young men’s SRH care.

    • Social and Behavioral Sciences RSS Word feed
      1. Children’s diurnal cortisol responses to negative events at school and home
        Bai S, Robles TF, Reynolds BM, Repetti RL.
        Psychoneuroendocrinology. 2017 May 29;83:150-158.
        This study examined the within- and between-person associations between daily negative events – peer problems, academic problems and interparental conflict – and diurnal cortisol in school-age children. Salivary cortisol levels were assessed four times per day (at wakeup, 30 min later, just before dinner and at bedtime) on eight days in 47 youths ages 8-13 years old (60% female; M age=11.28, SD=1.50). The relative contributions of within- and between-person variances in each stressor were estimated in models predicting same-day diurnal cortisol slope, same-day bedtime cortisol, and next morning wakeup cortisol. Children who reported more peer problems on average showed flatter slopes of cortisol decline from wakeup to bedtime. However, children secreted more cortisol at wakeup following days when they had reported more peer or academic problems than usual. Interparental conflict was not significantly associated with diurnal cortisol. Findings from this study extend our understanding of short-term cortisol responses to naturally occurring problems in daily life, and help to differentiate these daily processes from the cumulative effects of chronic stress.
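Separating within- and between-person variance, as in the models above, is commonly implemented by person-mean centering each child's daily scores; a minimal sketch with hypothetical ratings:

```python
from statistics import mean

def within_between(daily_scores):
    """Split one child's daily scores into a between-person component
    (the person mean) and within-person deviations from that mean."""
    person_mean = mean(daily_scores)
    deviations = [x - person_mean for x in daily_scores]
    return person_mean, deviations

pm, dev = within_between([2, 0, 3, 1])  # hypothetical peer-problem ratings
print(pm, dev)
```

The person mean enters the model as the between-person predictor, while the deviations capture days that were worse (or better) than usual for that child.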

    • Statistics as Topic RSS Word feed
      1. Using multiple imputation to address missing values of hierarchical data
        Zhang Y, Crawford S, Boulet S, Monsour M, Cohen B, McKane P, Freeman K.
        J Mod Appl Stat Methods. 2017;16(1):744-752.
        Missing data are a common concern in data analysis. When the data have a hierarchical or nested structure, the SUDAAN package can be used for multiple imputation. This is illustrated with birth certificate data that was linked to the Centers for Disease Control and Prevention’s National Assisted Reproductive Technology Surveillance System database. The Cox-Iannacchione weighted sequential hot deck method was used to conduct multiple imputation for missing/unknown values of covariates in a logistic model.
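SUDAAN's Cox-Iannacchione procedure is a weighted sequential hot deck; the unweighted core idea, filling each missing value from a nearby observed donor in file order, can be sketched as follows. This is a simplification for illustration: the actual method also uses sampling weights and imputation classes when selecting donors.

```python
def sequential_hot_deck(values):
    """Unweighted sequential hot deck: records are processed in file
    order and each missing value (None) is filled with the most
    recently observed value, the 'donor'."""
    imputed, donor = [], None
    for v in values:
        if v is None:
            v = donor  # carry forward the last observed donor
        else:
            donor = v
        imputed.append(v)
    return imputed

print(sequential_hot_deck([3, None, 5, None, None, 2]))  # [3, 3, 5, 5, 5, 2]
```

Running the imputation several times with stochastic donor selection (rather than this deterministic carry-forward) is what makes the procedure a *multiple* imputation.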

    • Vital Statistics RSS Word feed
      1. Annual summary of vital statistics: 2013-2014
        Murphy SL, Mathews TJ, Martin JA, Minkovitz CS, Strobino DM.
        Pediatrics. 2017;139(6).
        The number of births in the United States increased by 1% between 2013 and 2014, to a total of 3,988,076. The general fertility rate rose 1% to 62.9 births per 1000 women. The total fertility rate also rose 0.3% in 2014, to 1862.5 births per 1000 women. The teenage birth rate fell to another historic low in 2014, 24.2 births per 1000 women. The percentage of all births to unmarried women declined to 40.2% in 2014, from 40.6% in 2013. In 2014, the cesarean delivery rate declined to 32.2% from 32.7% in 2013. The preterm birth rate declined for the seventh straight year in 2014 to 9.57%; the low birth weight rate was unchanged at 8.00%. The infant mortality rate decreased to a historic low of 5.82 infant deaths per 1000 live births in 2014. The age-adjusted death rate for 2014 was 7.2 deaths per 1000 population, down 1% from 2013. Crude death rates for children aged 1 to 19 years did not change significantly between 2013 and 2014. Unintentional injuries and suicide were, respectively, the first and second leading causes of death in this age group. These 2 causes of death jointly accounted for 46.5% of all deaths to children and adolescents in 2014.

    • Zoonotic and Vectorborne Diseases RSS Word feed
      1. Pet ownership increases human risk of encountering ticks
        Jones EH, Hinckley AF, Hook SA, Meek JI, Backenson B, Kugeler KJ, Feldman KA.
        Zoonoses Public Health. 2017 Jun 19.
        We examined whether pet ownership increased the risk for tick encounters and tickborne disease among residents of three Lyme disease-endemic states as a nested cohort within a randomized controlled trial. Information about pet ownership, use of tick control for pets, property characteristics, tick encounters and human tickborne disease were captured through surveys, and associations were assessed using univariate and multivariable analyses. Pet-owning households had 1.83 times the risk (95% CI = 1.53, 2.20) of finding ticks crawling on and 1.49 times the risk (95% CI = 1.20, 1.84) of finding ticks attached to household members compared to households without pets. This large evaluation of pet ownership, human tick encounters and tickborne diseases shows that pet owners, whether of cats or dogs, are at increased risk of encountering ticks and suggests that pet owners are at an increased risk of developing tickborne disease. Pet owners should be made aware of this risk and be reminded to conduct daily tick checks of all household members, including the pets, and to consult their veterinarian regarding effective tick control products.
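Risk ratios like the 1.83 and 1.49 reported above come from comparing event proportions between exposed and unexposed households, with a log-normal confidence interval; a minimal sketch with hypothetical counts, not the study's data:

```python
import math

def risk_ratio_ci(events_exp, n_exp, events_unexp, n_unexp, z=1.96):
    """Risk ratio and 95% CI from cohort counts: events among n exposed
    versus events among n unexposed."""
    rr = (events_exp / n_exp) / (events_unexp / n_unexp)
    se_log = math.sqrt(1/events_exp - 1/n_exp + 1/events_unexp - 1/n_unexp)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

rr, lo, hi = risk_ratio_ci(60, 200, 40, 210)  # hypothetical counts
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```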

      2. No serologic evidence of Middle East respiratory syndrome coronavirus infection among camel farmers exposed to highly seropositive camel herds: A household linked study, Kenya, 2013
        Munyua P, Corman VM, Bitek A, Osoro E, Meyer B, Muller MA, Lattwein E, Thumbi SM, Murithi R, Widdowson MA, Drosten C, Njenga MK.
        Am J Trop Med Hyg. 2017;96(6):1318-1324.
        High seroprevalence of Middle East respiratory syndrome coronavirus (MERS-CoV) among camels has been reported in Kenya and other countries in Africa. To date, the only report of MERS-CoV seropositivity among humans in Kenya is of two livestock keepers with no known contact with camels. We assessed whether persons exposed to seropositive camels at household level had serological evidence of infection. In 2013, 760 human and 879 camel sera were collected from 275 and 85 households respectively in Marsabit County. Data on human and animal demographics and type of contact with camels were collected. Human and camel sera were tested for anti-MERS-CoV IgG using a commercial enzyme-linked immunosorbent assay (ELISA) test. Human samples were confirmed by plaque reduction neutralization test (PRNT). Logistic regression was used to identify factors associated with seropositivity. The median age of persons sampled was 30 years (range: 5-90) and 50% were males. A quarter (197/760) of the participants reported having had contact with camels defined as milking, feeding, watering, slaughtering, or herding. Of the human sera, 18 (2.4%) were positive on ELISA but negative by PRNT. Of the camel sera, 791 (90%) were positive on ELISA. On univariate analysis, higher prevalence was observed in female camels and in camels over 4 years of age (P < 0.05). On multivariate analysis, only age remained significantly associated with increased odds of seropositivity. Despite high seroprevalence among camels, there was no serological confirmation of MERS-CoV infection among camel pastoralists in Marsabit County. The high seropositivity suggests that MERS-CoV or other closely related virus continues to circulate in camels and highlights ongoing potential for animal-to-human transmission.

      3. Serial head and brain imaging of 17 fetuses with confirmed Zika virus infection in Colombia, South America
        Parra-Saavedra M, Reefhuis J, Piraquive JP, Gilboa SM, Badell ML, Moore CA, Mercado M, Valencia D, Jamieson DJ, Beltran M, Sanz-Cortes M, Rivera-Casas AM, Yepez M, Parra G, Ospina Martinez M, Honein MA.
        Obstet Gynecol. 2017 Jun 06.
        OBJECTIVE: To evaluate fetal ultrasound and magnetic resonance imaging findings among a series of pregnant women with confirmed Zika virus infection to evaluate the signs of congenital Zika syndrome with respect to timing of infection. METHODS: We conducted a retrospective case series of pregnant women referred to two perinatal clinics in Barranquilla and Ibague, Colombia, who had findings consistent with congenital Zika syndrome and Zika virus infection confirmed in maternal, fetal, or neonatal samples. Serial ultrasound measurements, fetal magnetic resonance imaging results, laboratory results, and perinatal outcomes were evaluated. RESULTS: We describe 17 cases of confirmed prenatal maternal Zika virus infection with adverse fetal outcomes. Among the 14 symptomatic women, the median gestational age for maternal Zika virus symptoms was 10 weeks (range 7-14 weeks of gestation). The median time between Zika virus symptom onset and microcephaly (head circumference less than 3 standard deviations below the mean) was 18 weeks (range 15-24 weeks). The earliest fetal head circumference measurement consistent with microcephaly diagnosis was at 24 weeks of gestation. The earliest sign of congenital Zika syndrome was talipes equinovarus, which in two patients was noted first at 19 weeks of gestation. Common findings on fetal magnetic resonance imaging were microcephaly, ventriculomegaly, polymicrogyria, and calcifications. CONCLUSION: Our analysis suggests a period of at least 15 weeks between maternal Zika virus infection in pregnancy and development of microcephaly and highlights the importance of serial and detailed neuroimaging.

CDC Science Clips Production Staff

  • John Iskander, MD MPH, Editor
  • Gail Bang, MLIS, Librarian
  • Kathy Tucker, Librarian
  • William (Bill) Thomas, MLIS, Librarian
  • Onnalee Gomez, MS, Health Scientist
  • Jarvis Sims, MIT, MLIS, Librarian

____

DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinion, findings and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC Websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.
