


CDC Science Clips: Volume 9, Issue 22, June 6, 2017

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention score to track social and mainstream media mentions!

This week, Science Clips is pleased to collaborate with CDC Vital Signs by featuring scientific articles from the June Vital Signs (www.cdc.gov/vitalsigns). The articles marked with an asterisk are general review articles which may be of particular interest to clinicians and public health professionals seeking background information in this area.

  1. CDC Vital Signs
    • Legionnaires’ Disease
      1. *Legionellosis on the rise: A review of guidelines for prevention in the United States
        Parr A, Whitney EA, Berkelman RL.
        J Public Health Manag Pract. 2015 Sep-Oct;21(5):E17-26.
        CONTEXT: Reported cases of legionellosis more than tripled between 2001 and 2012 in the United States. The disease results primarily from exposure to aerosolized water contaminated with Legionella. OBJECTIVE: To identify and describe policies and guidelines for the primary prevention of legionellosis in the US. DESIGN: An Internet search for Legionella prevention guidelines in the United States at the federal and state levels was conducted from March to June 2012. Local government agency guidelines and guidelines from professional organizations that were identified in the initial search were also included. SETTING: Federal, state, and local governing bodies and professional organizations. RESULTS: Guidelines and regulations for the primary prevention of legionellosis (ie, Legionnaires’ disease and Pontiac fever) have been developed by various public health and other government agencies at the federal, state, and local levels as well as by professional organizations. These guidelines are similar in recommending maintenance of building water systems; federal and other guidelines differ in the population/institutions targeted, the extent of technical detail, and support of monitoring water systems for levels of Legionella contamination. CONCLUSIONS: Legionellosis deserves a higher public health priority for research and policy development. Guidance across public health agencies for the primary prevention of legionellosis requires strengthening as this disease escalates in importance as a cause of severe morbidity and mortality. We recommend a formal and comprehensive review of national public health guidelines for prevention of legionellosis.

      2. *Epidemiology and clinical management of Legionnaires’ disease
        Phin N, Parry-Ford F, Harrison T, Stagg HR, Zhang N, Kumar K, Lortholary O, Zumla A, Abubakar I.
        Lancet Infect Dis. 2014 Oct;14(10):1011-21.
        Legionnaires’ disease is an important cause of community-acquired and hospital-acquired pneumonia. Although uncommon, Legionnaires’ disease continues to cause disease outbreaks of public health significance. The disease is caused by any species of the Gram-negative aerobic bacteria belonging to the genus Legionella; Legionella pneumophila serogroup 1 is the causative agent of most cases in Europe. In this Review we outline the global epidemiology of Legionnaires’ disease, summarise its diagnosis and management, and identify research gaps and priorities. Early clinical diagnosis and prompt initiation of appropriate antibiotics for Legionella spp in all patients with community-acquired or hospital-acquired pneumonias is a crucial measure for management of the disease. Progress in typing and sequencing technologies might additionally contribute to understanding the distribution and natural history of Legionnaires’ disease, and inform outbreak investigations. Control of Legionnaires’ disease outbreaks relies on rapid ascertainment of descriptive epidemiological data, combined with microbiological information to identify the source and implement control measures. Further research is required to define the actual burden of disease, factors that influence susceptibility, key sources of infection, and differences in virulence between strains of Legionella species. Other requirements are improved, specific, sensitive, and rapid diagnostic tests to accurately inform management of Legionnaires’ disease, and controlled clinical trials to ascertain the optimum antibiotics for treatment.

      3. Legionellosis: Risk management for building water systems
        ANSI/ASHRAE.
        Atlanta, GA: ASHRAE; 2015.

        [No abstract]

      4. Direct healthcare costs of selected diseases primarily or partially transmitted by water
        Collier SA, Stockman LJ, Hicks LA, Garrison LE, Zhou FJ, Beach MJ.
        Epidemiol Infect. 2012 Nov;140(11):2003-13.
        Despite US sanitation advancements, millions of waterborne disease cases occur annually, although the precise burden of disease is not well quantified. Estimating the direct healthcare cost of specific infections would be useful in prioritizing waterborne disease prevention activities. Hospitalization and outpatient visit costs per case and total US hospitalization costs for ten waterborne diseases were calculated using large healthcare claims and hospital discharge databases. The five primarily waterborne diseases in this analysis (giardiasis, cryptosporidiosis, Legionnaires’ disease, otitis externa, and non-tuberculous mycobacterial infection) were responsible for over 40 000 hospitalizations at a cost of $970 million per year, including at least $430 million in hospitalization costs for Medicaid and Medicare patients. An additional 50 000 hospitalizations for campylobacteriosis, salmonellosis, shigellosis, haemolytic uraemic syndrome, and toxoplasmosis cost $860 million annually ($390 million in payments for Medicaid and Medicare patients), a portion of which can be assumed to be due to waterborne transmission.
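
The cost figures in this abstract are reported only as annual totals; dividing by the hospitalization counts gives a rough per-hospitalization average. A back-of-envelope sketch (the inputs are the rounded totals quoted above, not figures from the paper's tables, and the function name is illustrative):

```python
# Rough per-hospitalization averages implied by the aggregate figures in
# Collier et al. (2012). These are illustrations derived from the abstract's
# rounded totals, not numbers reported by the paper itself.

def avg_cost_per_hospitalization(total_cost_usd, hospitalizations):
    """Average direct cost per hospitalization, in US dollars."""
    return total_cost_usd / hospitalizations

# Five primarily waterborne diseases: >40,000 hospitalizations, ~$970 million/yr
primarily_waterborne = avg_cost_per_hospitalization(970e6, 40_000)
# Five partially waterborne diseases: ~50,000 hospitalizations, ~$860 million/yr
partially_waterborne = avg_cost_per_hospitalization(860e6, 50_000)

print(f"~${primarily_waterborne:,.0f} per hospitalization (primarily waterborne)")
print(f"~${partially_waterborne:,.0f} per hospitalization (partially waterborne)")
```

Because the hospitalization counts are lower bounds, these averages are upper-bound sketches at best.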

      5. Plan for the control of Legionella infections in long-term care facilities: role of environmental monitoring
        Cristino S, Legnani PP, Leoni E.
        Int J Hyg Environ Health. 2012 Apr;215(3):279-85.
        In accordance with the international and national guidelines, the Emilia-Romagna Region (Italy) has established regional guidelines for the surveillance and prevention of legionellosis based on the concept of risk assessment, with particular attention to environmental monitoring. The aim of this study was to verify how environmental surveillance in the context of risk assessment plans could help to guide decisions about preventive strategies against Legionella infections in Long Term Care Facilities (LTCF). In six LTCFs in the city of Bologna (Emilia-Romagna Region) a self-control plan was implemented that included the environmental monitoring of Legionella spp. and the surveillance of hospital-acquired Legionnaires’ Disease. At baseline, four hot water systems were colonized by Legionella pneumophila (3 LCTFs) and Legionella londiniensis (1 LCTF). In each establishment specific control measures were adopted based on the characteristics of the system, the virulence of the strain and the level of the contamination. The monitoring, carried out for around two years, was also extended to the ways in which the system and the distal water distribution points were used and maintained with respect to the good practices in operation and management. The adopted actions (shock and/or continuous disinfection treatments) and the implementation of the good practice measures reduced the contamination to acceptable and stable levels. No cases of hospital-acquired legionellosis occurred during the period of study. The environmental surveillance was successful in evaluating the risk and identifying the most suitable preventive strategies.

      6. The importance of clinical surveillance in detecting Legionnaires’ disease outbreaks: a large outbreak in a hospital with a Legionella disinfection system-Pennsylvania, 2011-2012
        Demirjian A, Lucas CE, Garrison LE, Kozak-Muiznieks NA, States S, Brown EW, Wortham JM, Beaudoin A, et al.
        Clin Infect Dis. 2015 Jun 01;60(11):1596-602.
        BACKGROUND: Healthcare-associated Legionnaires’ disease (LD) is a preventable pneumonia with a 30% case fatality rate. The Centers for Disease Control and Prevention guidelines recommend a high index of suspicion for the diagnosis of healthcare-associated LD. We characterized an outbreak and evaluated contributing factors in a hospital using copper-silver ionization for prevention of Legionella growth in water. METHODS: Through medical records review at a large, urban tertiary care hospital in November 2012, we identified patients diagnosed with LD during 2011-2012. Laboratory-confirmed cases were categorized as definite, probable, and not healthcare associated based on time spent in the hospital during the incubation period. We performed an environmental assessment of the hospital, including collection of samples for Legionella culture. Clinical and environmental isolates were compared by genotyping. Copper and silver ion concentrations were measured in 11 water samples. RESULTS: We identified 5 definite and 17 probable healthcare-associated LD cases; 6 case patients died. Of 25 locations (mostly potable water) where environmental samples were obtained for Legionella-specific culture, all but 2 showed Legionella growth; 11 isolates were identical to 3 clinical isolates by sequence-based typing. Mean copper and silver concentrations were at or above the manufacturer’s recommended target for Legionella control. Despite this, all samples where copper and silver concentrations were tested showed Legionella growth. CONCLUSIONS: This outbreak was linked to the hospital’s potable water system and highlights the importance of maintaining a high index of suspicion for healthcare-associated LD, even in the setting of a long-term disinfection program.

      7. Active bacterial core surveillance for legionellosis – United States, 2011-2013
        Dooling KL, Toews KA, Hicks LA, Garrison LE, Bachaus B, Zansky S, Carpenter LR, et al.
        MMWR Morb Mortal Wkly Rep. 2015 Oct 30;64(42):1190-3.
        During 2000-2011, passive surveillance for legionellosis in the United States demonstrated a 249% increase in crude incidence, although little was known about the clinical course and method of diagnosis. In 2011, a system of active, population-based surveillance for legionellosis was instituted through CDC’s Active Bacterial Core surveillance (ABCs) program. Overall disease rates were similar in both the passive and active systems, but more complete demographic information and additional clinical and laboratory data were only available from ABCs. ABCs data during 2011-2013 showed that approximately 44% of patients with legionellosis required intensive care, and 9% died. Disease incidence was higher among blacks than whites and was 10 times higher in New York than California. Laboratory data indicated a reliance on urinary antigen testing, which only detects Legionella pneumophila serogroup 1 (Lp1). ABCs data highlight the severity of the disease, the need to better understand racial and regional differences, and the need for better diagnostic testing to detect infections.

      8. Legionella and Legionnaires’ disease: 25 years of investigation
        Fields BS, Benson RF, Besser RE.
        Clin Microbiol Rev. 2002 Jul;15(3):506-26.
        There is still a low level of clinical awareness regarding Legionnaires’ disease 25 years after it was first detected. The causative agents, legionellae, are freshwater bacteria with a fascinating ecology. These bacteria are intracellular pathogens of freshwater protozoa and utilize a similar mechanism to infect human phagocytic cells. There have been major advances in delineating the pathogenesis of legionellae through the identification of genes which allow the organism to bypass the endocytic pathways of both protozoan and human cells. Other bacteria that may share this novel infectious process are Coxiella burnetti and Brucella spp. More than 40 species and numerous serogroups of legionellae have been identified. Most diagnostic tests are directed at the species that causes most of the reported human cases of legionellosis, L. pneumophila serogroup 1. For this reason, information on the incidence of human respiratory disease attributable to other species and serogroups of legionellae is lacking. Improvements in diagnostic tests such as the urine antigen assay have inadvertently caused a decrease in the use of culture to detect infection, resulting in incomplete surveillance for legionellosis. Large, focal outbreaks of Legionnaires’ disease continue to occur worldwide, and there is a critical need for surveillance for travel-related legionellosis in the United States. There is optimism that newly developed guidelines and water treatment practices can greatly reduce the incidence of this preventable illness.

      9. Vital Signs: Deficiencies in environmental control identified in outbreaks of Legionnaires’ disease – North America, 2000-2014
        Garrison LE, Kunz JM, Cooley LA, Moore MR, Lucas C, Schrag S, Sarisky J, Whitney CG.
        MMWR Morb Mortal Wkly Rep. 2016 Jun 10;65(22):576-84.
        BACKGROUND: The number of reported cases of Legionnaires’ disease, a severe pneumonia caused by the bacterium Legionella, is increasing in the United States. During 2000-2014, the rate of reported legionellosis cases increased from 0.42 to 1.62 per 100,000 persons; 4% of reported cases were outbreak-associated. Legionella is transmitted through aerosolization of contaminated water. A new industry standard for prevention of Legionella growth and transmission in water systems in buildings was published in 2015. CDC investigated outbreaks of Legionnaires’ disease to identify gaps in building water system maintenance and guide prevention efforts. METHODS: Information from summaries of CDC Legionnaires’ disease outbreak investigations during 2000-2014 was systematically abstracted, and water system maintenance deficiencies from land-based investigations were categorized as process failures, human errors, equipment failures, or unmanaged external changes. RESULTS: During 2000-2014, CDC participated in 38 field investigations of Legionnaires’ disease. Among 27 land-based outbreaks, the median number of cases was 10 (range = 3-82) and the median outbreak case fatality rate was 7% (range = 0%-80%). Sufficient information to evaluate maintenance deficiencies was available for 23 (85%) investigations. Of these, all had at least one deficiency; 11 (48%) had deficiencies in ≥2 categories. Fifteen investigations (65%) were linked to process failures, 12 (52%) to human errors, eight (35%) to equipment failures, and eight (35%) to unmanaged external changes. CONCLUSIONS AND IMPLICATIONS FOR PUBLIC HEALTH PRACTICE: Multiple common preventable maintenance deficiencies were identified in association with disease outbreaks, highlighting the importance of comprehensive water management programs for water systems in buildings. Properly implemented programs, as described in the new industry standard, could reduce Legionella growth and transmission, preventing Legionnaires’ disease outbreaks and reducing disease.
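
The incidence trend quoted in this abstract is a simple percent change over the reported rates (0.42 to 1.62 cases per 100,000 persons over 2000-2014); a quick check of the arithmetic:

```python
# Percent change and rate ratio for the incidence figures quoted in the
# Vital Signs abstract (0.42 -> 1.62 cases per 100,000 persons, 2000-2014).

def percent_change(old, new):
    """Relative change from old to new, as a percentage."""
    return (new - old) / old * 100

increase = percent_change(0.42, 1.62)
rate_ratio = 1.62 / 0.42

print(f"{increase:.0f}% increase")       # ~286% increase
print(f"rate ratio ~{rate_ratio:.1f}x")  # nearly a fourfold rise
```

This is a crude-rate comparison only; the abstract does not adjust for changes in testing or reporting practices over the period.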

      10. Current and emerging Legionella diagnostics for laboratory and outbreak investigations
        Mercante JW, Winchell JM.
        Clin Microbiol Rev. 2015 Jan;28(1):95-133.
        Legionnaires’ disease (LD) is an often severe and potentially fatal form of bacterial pneumonia caused by an extensive list of Legionella species. These ubiquitous freshwater and soil inhabitants cause human respiratory disease when amplified in man-made water or cooling systems and their aerosols expose a susceptible population. Treatment of sporadic cases and rapid control of LD outbreaks benefit from swift diagnosis in concert with discriminatory bacterial typing for immediate epidemiological responses. Traditional culture and serology were instrumental in describing disease incidence early in its history; currently, diagnosis of LD relies almost solely on the urinary antigen test, which captures only the dominant species and serogroup, Legionella pneumophila serogroup 1 (Lp1). This has created a diagnostic “blind spot” for LD caused by non-Lp1 strains. This review focuses on historic, current, and emerging technologies that hold promise for increasing LD diagnostic efficiency and detection rates as part of a coherent testing regimen. The importance of cooperation between epidemiologists and laboratorians for a rapid outbreak response is also illustrated in field investigations conducted by the CDC with state and local authorities. Finally, challenges facing health care professionals, building managers, and the public health community in combating LD are highlighted, and potential solutions are discussed.

  2. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Chronic Diseases and Conditions
      1. Trends in indoor tanning and its association with sunburn among US adults
        Guy GP, Watson M, Seidenberg AB, Hartman AM, Holman DM, Perna F.
        J Am Acad Dermatol. 2017 Jun;76(6):1191-1193.

        [No abstract]

      2. The FLASHE Study: Survey development, dyadic perspectives, and participant characteristics
        Nebeling LC, Hennessy E, Oh AY, Dwyer LA, Patrick H, Blanck HM, Perna FM, Ferrer RA, Yaroch AL.
        Am J Prev Med. 2017 Jun;52(6):839-848.
        The National Cancer Institute developed the Family Life, Activity, Sun, Health, and Eating (FLASHE) Study to examine multiple cancer preventive behaviors within parent-adolescent dyads. The purpose of creating FLASHE was to enable the examination of physical activity, diet, and other cancer preventive behaviors and potential correlates among parent-adolescent dyads. FLASHE surveys were developed from a process involving literature reviews, scientific input from experts in the field, cognitive testing, and usability testing. This cross-sectional, web-based study of parents and their adolescent children (aged 12-17 years) was administered between April and October 2014. The nationwide sample consisted of 1,573 parent-adolescent dyads (1,699 parents and 1,581 adolescents) who returned all FLASHE surveys. FLASHE assessed parent and adolescent reports of several intrapersonal and interpersonal domains (including psychosocial variables, parenting, and the community and home environments). On a subset of example FLASHE items across these domains, responses of parents and adolescents within the same dyads were positively and significantly correlated (r =0.32-0.63). Analyses were run in 2015-2016. FLASHE data present multiple opportunities for studying research questions among individuals or dyads, including the ability to examine similarity between parents and adolescents on many constructs relevant to cancer preventive behaviors. FLASHE data are publicly available for researchers and practitioners to help advance research on cancer preventive health behaviors.

      3. Prevalence of high fractional exhaled nitric oxide among US youth with asthma
        Nguyen DT, Kit BK, Brody D, Akinbami LJ.
        Pediatr Pulmonol. 2017 Jun;52(6):737-745.
        BACKGROUND: High fractional exhaled nitric oxide (FeNO) is an indicator of poor asthma control and has been proposed as a non-invasive assessment tool to guide asthma management. OBJECTIVE: We aimed to describe the prevalence of and factors associated with high FeNO among US youth with asthma. METHODS: Data from 716 children and adolescents with asthma ages 6-19 years who participated in the 2007-2012 National Health and Nutrition Examination Survey were analyzed. Using American Thoracic Society guidelines, high FeNO was defined as >50 ppb for ages 12-19 years and >35 ppb for ages 6-11 years. Multivariate logistic regression examined associations between high FeNO and age, sex, race/Hispanic origin, income status, weight status, tobacco smoke exposure, and other factors associated with asthma control (recent use of inhaled corticosteroids, recent respiratory illness, asthma-related respiratory signs/symptoms, and spirometry). RESULTS: About 16.5% of youth with asthma had high FeNO. The prevalence of high FeNO was higher among non-Hispanic black (27%, P < 0.001) and Hispanic (20.2%, P = 0.002) youth than non-Hispanic white (9.7%) youth. Differences in high FeNO prevalence by sex (girls < boys), weight status (obese < normal weight), tobacco smoke exposure (smokers < home exposure < no exposure), and FEV1/FVC (normal < abnormal) were also observed. No differences were noted between categories for the remaining covariates. CONCLUSION: High FeNO was observed to be associated with sex, race/Hispanic origin, weight status, tobacco smoke exposure, and abnormal FEV1/FVC, but was not associated with asthma-related respiratory symptoms. These findings may help inform future research and clinical practice guidelines on the use of high FeNO in the assessment of asthma control.
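
The study's age-specific definition of high FeNO (>50 ppb for ages 12-19 years, >35 ppb for ages 6-11 years, per American Thoracic Society guidelines) can be sketched as a small classifier. The function name and interface here are illustrative, not from the paper:

```python
# Minimal sketch of the age-specific "high FeNO" cutoffs described in the
# abstract (ATS guidelines): >50 ppb for ages 12-19, >35 ppb for ages 6-11.

def is_high_feno(age_years: int, feno_ppb: float) -> bool:
    """Classify a FeNO measurement as high using the ATS age-specific cutoffs."""
    if 12 <= age_years <= 19:
        return feno_ppb > 50
    if 6 <= age_years <= 11:
        return feno_ppb > 35
    raise ValueError("cutoffs in this study apply to ages 6-19 only")

print(is_high_feno(14, 55))  # True: above the 50 ppb cutoff for ages 12-19
print(is_high_feno(8, 40))   # True: above the 35 ppb cutoff for ages 6-11
print(is_high_feno(8, 30))   # False
```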

      4. Lifetime risk of symptomatic hand osteoarthritis: The Johnston County Osteoarthritis Project
        Qin J, Barbour KE, Murphy LB, Nelson AE, Schwartz TA, Helmick CG, Allen KD, Renner JB, Baker NA, Jordan JM.
        Arthritis Rheumatol. 2017 May 04.
        OBJECTIVE: Symptomatic hand osteoarthritis (OA) is a common condition that affects hand strength and function, and causes disability in activities of daily living. Prior studies have estimated that the lifetime risk of symptomatic knee OA is 45% and that of hip OA is 25%. The objective of the present study was to estimate the overall lifetime risk of symptomatic hand OA, and the stratified lifetime risk according to potential risk factors. METHODS: Data were obtained from 2,218 adult subjects (ages ≥45 years) in the Johnston County Osteoarthritis Project, a population-based prospective cohort study among residents of Johnston County, North Carolina. Data for the present study were collected from 2 of the follow-up cycles (1999-2004 and 2005-2010). Symptomatic hand OA was defined as the presence of both self-reported symptoms and radiographic OA in the same hand. Lifetime risk, defined as the proportion of the population who will develop symptomatic hand OA in at least 1 hand by age 85 years, was estimated from models using generalized estimating equations. RESULTS: Overall, the lifetime risk of symptomatic hand OA was 39.8% (95% confidence interval [95% CI] 34.4-45.3%). In this population, nearly 1 in 2 women (47.2%, 95% CI 40.6-53.9%) had an estimated lifetime risk of developing symptomatic hand OA by age 85 years, compared with 1 in 4 men (24.6%, 95% CI 19.5-30.5%). Race-specific symptomatic hand OA risk estimates were 41.4% (95% CI 35.5-47.6%) among whites and 29.2% (95% CI 20.5-39.7%) among African Americans. The lifetime risk of symptomatic hand OA among individuals with obesity (47.1%, 95% CI 37.8-56.7%) was 11 percentage points higher than that in individuals without obesity (36.1%, 95% CI 29.7-42.9%). CONCLUSION: These findings demonstrate the substantial burden of symptomatic hand OA overall and in sociodemographic and clinical subgroups. Increased use of public health and clinical interventions is needed to address its impact.
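
The "11 percentage points" figure above is an absolute gap between the two risks, which is not the same thing as a relative (percent) difference; a short illustration using the abstract's numbers:

```python
# Percentage-point gap vs. relative difference, using the lifetime-risk
# estimates quoted in the abstract (47.1% with obesity, 36.1% without).

with_obesity = 47.1     # lifetime risk, %
without_obesity = 36.1  # lifetime risk, %

pp_gap = with_obesity - without_obesity        # absolute gap, in points
relative_gap = pp_gap / without_obesity * 100  # relative increase

print(f"{pp_gap:.0f} percentage points")   # 11 percentage points
print(f"{relative_gap:.0f}% higher risk")  # ~30% higher in relative terms
```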

      5. Changes in primary healthcare providers’ attitudes and counseling behaviors related to dietary sodium reduction, DocStyles 2010 and 2015
        Quader ZS, Cogswell ME, Fang J, Coleman King SM, Merritt RK.
        PLoS One. 2017 ;12(5):e0177693.
        High blood pressure is a major risk factor for cardiovascular disease. The 2013 ACC/AHA Lifestyle Management Guideline recommends counseling pre-hypertensive and hypertensive patients to reduce sodium intake. Population sodium reduction efforts have been introduced in recent years, and dietary guidelines continued to emphasize sodium reduction in 2010 and 2015. The objective of this analysis was to determine changes in primary health care providers’ sodium-reduction attitudes and counseling between 2010 and 2015. Primary care internists, family/general practitioners, and nurse practitioners answered questions about sodium-related attitudes and counseling behaviors in DocStyles, a repeated cross-sectional web-based survey in the United States. Differences in responses between years were examined. In 2015, the majority (78%) of participants (n = 1,251) agreed that most of their patients should reduce sodium intake, and reported advising hypertensive (85%), and chronic kidney disease patients (71%), but not diabetic patients (48%) and African-American patients (43%) to consume less salt. Since 2010, the proportion of participants agreeing their patients should reduce sodium intake decreased while the proportion advising patients with these characteristics to consume less salt increased and the prevalence of specific types of advice declined. Changes in behaviors between surveys remained significant after adjusting for provider and practice characteristics. More providers are advising patients to consume less salt in 2015 compared to 2010; however, fewer agree their patients should reduce intake and counseling is not universally applied across patient groups at risk for hypertension. Further efforts and educational resources may be required to enable patient counseling about sodium reduction strategies.

      6. [Article title and authors not captured in the source]
        Cancer Epidemiol Biomarkers Prev. 2017;26(5):736-42.
        Background: Educational attainment (EA) is inversely associated with colorectal cancer risk. Colorectal cancer screening can save lives if precancerous polyps or early cancers are found and successfully treated. This study aims to estimate the potential productivity loss (PPL) and associated avoidable colorectal cancer-related deaths among screen-eligible adults residing in lower EA counties in the United States. Methods: Mortality and population data were used to examine colorectal cancer deaths (2008-2012) among adults aged 50 to 74 years in lower EA counties, and to estimate the expected number of deaths using the mortality experience from high EA counties. Excess deaths (observed-expected) were used to estimate potential years life lost, and the human capital method was used to estimate PPL in 2012 U.S. dollars. Results: County-level colorectal cancer death rates were inversely associated with county-level EA. Of the 100,857 colorectal cancer deaths in lower EA counties, we estimated that more than 21,000 (1 in 5) were potentially avoidable and resulted in nearly $2 billion in annual productivity loss. Conclusions: County-level EA disparities contribute to a large number of potentially avoidable colorectal cancer-related deaths. Increased prevention and improved screening potentially could decrease deaths and help reduce the associated economic burden in lower EA communities. Increased screening could further reduce deaths in all EA groups. Impact: These results estimate the large economic impact of potentially avoidable colorectal cancer-related deaths in economically disadvantaged communities, as measured by lower EA.
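
The excess-deaths logic in this abstract (excess = observed − expected, with "expected" derived from high-EA mortality rates) can be sketched with its own rounded figures. Note the 21,000 value is the abstract's rounded estimate, so the implied "expected" count below is an illustration, not a number from the paper:

```python
# Sketch of the excess-death arithmetic described in the abstract, using its
# rounded figures. "Expected" is what the high-EA counties' mortality
# experience would predict for the lower-EA population.

observed = 100_857  # colorectal cancer deaths in lower-EA counties, 2008-2012
excess = 21_000     # abstract's estimate of potentially avoidable deaths

expected = observed - excess          # implied expected count under high-EA rates
fraction_avoidable = excess / observed

print(f"expected ~{expected:,}")
print(f"~{fraction_avoidable:.0%} avoidable, i.e. about 1 in 5")
```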

    • Communicable Diseases
      1. Working with influenza-like illness: Presenteeism among US health care personnel during the 2014-2015 influenza season
        Chiu S, Black CL, Yue X, Greby SM, Laney AS, Campbell AP, de Perio MA.
        Am J Infect Control. 2017 May 16.
        BACKGROUND: Health care personnel (HCP) working while experiencing influenza-like illness (ILI) contribute to influenza transmission in health care settings. Studies focused on certain HCP occupations or work settings have demonstrated that some HCP often continue to work while ill. METHODS: Using a national nonprobability Internet panel survey of 1,914 HCP during the 2014-2015 influenza season, we calculated the frequency of working with self-reported ILI (ie, fever and cough or sore throat) and examined reasons for working with ILI by occupation and work setting. RESULTS: Overall, 414 (21.6%) HCP reported ILI, and 183 (41.4%) reported working with ILI (median, 3 days; range, 0-30 days). Pharmacists (67.2%) and physicians (63.2%) had the highest frequency of working with ILI. By work setting, hospital-based HCP had the highest frequency of working with ILI (49.3%). The most common reasons for working while ill included still being able to perform job duties and not feeling bad enough to miss work. Among HCP at long-term care facilities, the most common reason was inability to afford lost pay. CONCLUSIONS: More than 40% of HCP with ILI work while ill. To reduce HCP-associated influenza transmission, potential interventions could target HCP misconceptions about working while ill and paid sick leave policies.

      2. Prevalence and correlates of hepatitis C virus-associated inflammatory arthritis in a population-based cohort
        Ferucci ED, Choromanski TL, Varney DT, Ryan HS, Townshend-Bulson LJ, McMahon BJ, Wener MH.
        Semin Arthritis Rheum. 2017 Apr 24.
        OBJECTIVES: The objectives of this study were to determine the prevalence of hepatitis C virus-associated inflammatory arthritis, to describe its clinical and immunologic correlates, and to identify features that are characteristic of arthritis in chronic hepatitis C. METHODS: Participants with chronic hepatitis C infection enrolled in a population-based cohort study in Alaska and who had not received anti-viral treatment for hepatitis C were recruited. In a cross-sectional study, we assessed joint symptoms and signs, performed autoantibody and cytokine testing, and abstracted medical records for features of hepatitis C and arthritis. RESULTS: Of the 117 enrolled participants, 8 (6.8%) had hepatitis C-associated arthritis. The participants with arthritis were younger than those without (median age: 45 vs. 52, p = 0.02). Rheumatoid factor was commonly present among patients with hepatitis C-associated arthritis. The only studied autoantibody found more commonly in patients with HCV arthritis than those without arthritis was anti-nuclear antibody (63% vs. 23%, p = 0.026). The only joint symptom significantly more common in hepatitis C arthritis was self-reported joint swelling (75% vs. 26%, p = 0.007). Features of fibromyalgia were more common and functional status was worse in those with arthritis than those without. No cytokines differed in patients with and without arthritis. There were no associations of arthritis or autoantibodies with liver-related outcomes. CONCLUSIONS: In this study of a cohort of individuals with chronic HCV infection, HCV-associated arthritis was present in less than 10%. Few serologic features distinguished participants with or without arthritis, but self-reported joint swelling was more common in those with arthritis.

      3. [Article title and authors not captured in the source]
        PURPOSE: We assessed the impact of staff, clinic, and community interventions on male and female family planning client visit volume and sexually transmitted infection testing at a multisite community-based health care agency. METHODS: Staff training, clinic environmental changes, in-reach/outreach, and efficiency assessments were implemented in two Family Health Center (San Diego, CA) family planning clinics during 2010-2012; five Family Health Center family planning programs were identified as comparison clinics. Client visit records were compared between preintervention (2007-2009) and postintervention (2010-2012) for both sets of clinics. RESULTS: Of 7,826 male client visits during the preintervention period, most were for clients who were aged <30 years (50%), Hispanic (64%), and uninsured (81%). From preintervention to postintervention, intervention clinics significantly increased the number of male visits (4,004 to 8,385; Δ = +109%); for comparison clinics, male visits increased modestly (3,822 to 4,500; Δ = +18%). The proportion of male clinic visits where chlamydia testing was performed increased in intervention clinics (35% to 42%; p < .001) but decreased in comparison clinics (37% to 33%; p < .001). Subgroup analyses conducted among adolescent and young adult males yielded similar findings for male client volume and chlamydia testing. The number of female visits declined nearly 40% in both comparison (21,800 to 13,202; -39%) and intervention clinics (30,830 to 19,971; -35%) between the preintervention and postintervention periods. CONCLUSIONS: Multilevel interventions designed to increase male client volume and sexually transmitted infection testing services in family planning clinics succeeded without affecting female client volume or services.
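
The visit-volume deltas in this abstract are ordinary percent changes over the before/after counts; a short sketch reproducing them:

```python
# Reproducing the percent-change figures quoted in the abstract from the
# reported before/after visit counts.

def pct_change(before, after):
    """Percent change from the preintervention to the postintervention count."""
    return (after - before) / before * 100

print(f"{pct_change(4004, 8385):+.0f}%")    # intervention male visits: +109%
print(f"{pct_change(3822, 4500):+.0f}%")    # comparison male visits: +18%
print(f"{pct_change(30830, 19971):+.0f}%")  # intervention female visits: -35%
print(f"{pct_change(21800, 13202):+.0f}%")  # comparison female visits: -39%
```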

      4. Implementation of an integrated approach to the National HIV/AIDS Strategy for improving human immunodeficiency virus care for youths
        Fortenberry JD, Koenig LJ, Kapogiannis BG, Jeffries CL, Ellen JM, Wilson CM.
        JAMA Pediatr. 2017 May 22.
        Importance: Youths aged 13 to 24 years living with human immunodeficiency virus (HIV) are less likely than adults to receive the health and prevention benefits of HIV treatments, with only a small proportion having achieved sustained viral suppression. These age-related disparities in the HIV continuum of care are owing in part to the unique developmental issues of adolescents and young adults as well as the complexity and fragmentation of HIV care and related services. This article summarizes a national, multiagency, and multilevel approach to HIV care for newly diagnosed youths designed to bridge some of this fragmentation by addressing National HIV/AIDS Strategy goals for people living with HIV. Design, Setting, and Participants: Three federal agencies developed memoranda of understanding to sequentially implement 3 protocols addressing key National HIV/AIDS Strategy goals. The goals were addressed in the Adolescent Trials Network, with protocols implemented in 12 to 15 sites across the United States. Outcome data were collected from recently diagnosed youths referred to the program. Main Outcomes and Measures: Cross-agency collaboration, youth-friendly linkage-to-care services, community mobilization to address structural barriers to care, cooperation among service clinicians, the proportion of all men who have sex with men who tested, the proportion identified as HIV positive, and rates of linkage to prevention services. Results: The program addressed National HIV/AIDS Strategy goals 2 through 4, including steps within each goal. A total of 3986 HIV-positive youths were referred for care; more than 75% were linked to care within 6 weeks of referral, and almost 90% of those youths engaged in subsequent HIV care. Community mobilization efforts implemented and completed structural change objectives to address local barriers to care.
Age and racial/ethnic group disparities were addressed through targeted training for culturally competent, youth-friendly care, and intensive motivational interviewing training. Conclusions and Relevance: A national program to address the National HIV/AIDS Strategy specifically for youths can improve coordination of federal resources as well as implement best-practice models that are adapted to decrease service fragmentation and systemic barriers at local jurisdictions.

      5. The Happy Teen programme: a holistic outpatient clinic-based approach to prepare HIV-infected youth for the transition from paediatric to adult medical care services in Thailand
        Lolekha R, Boon-Yasidhi V, Na-Nakorn Y, Manaboriboon B, Vandepitte WP, Martin M, Tarugsa J, Nuchanard W, Leowsrisook P, Lapphra K, Suntarattiwong P, Thaineua V, Chokephaibulkit K.
        J Int AIDS Soc. 2017 May 16;20(Suppl 3):81-90.
        INTRODUCTION: We developed an 18-month Happy Teen 2 (HT2) programme comprised of a one-day workshop, two half-day sessions, and three individual sessions to prepare HIV-infected youth for the transition from paediatric to adult HIV care services. We describe the programme and evaluate the change in youth’s knowledge scores. METHODS: We implemented the HT2 programme among HIV-infected Thai youth aged 14-22 years who were aware of their HIV status and receiving care at two hospitals in Bangkok (Siriraj Hospital, Queen Sirikit National Institute of Child Health [QSNICH]). Staff interviewed youth using a standardized questionnaire to assess HIV and health-related knowledge at baseline and at 12 and 18 months while they participated in the programme. We examined factors associated with a composite knowledge score >/=95% at month 18 using logistic regression. RESULTS: During March 2014-July 2016, 192 of 245 (78%) eligible youth were interviewed at baseline. Of these, 161 (84%) returned for interviews at 12 and 18 months. Among the 161 youth, the median age was 17 years, 74 (46%) were female, and 99% were receiving antiretroviral treatment. The median composite score was 45% at baseline and increased to 82% at 12 months and 95% at 18 months (P < 0.001). Median knowledge scores for antiretroviral management, HIV monitoring, HIV services, and family planning increased significantly from baseline (range 0-75%) to 12 months (range 67-100%) and to 100% at 18 months (P < 0.001). Almost all youth were able to describe education and career goals at 12 and 18 months compared to 75% at baseline. In multivariable analysis, a composite knowledge score >/=95% at 18 months was associated with education level >high school (aOR: 2.15, 95%CI, 1.03-4.48) and receipt of care at QSNICH (aOR: 2.43, 95%CI, 1.18-4.98). Youth whose mother and father had died were less likely to have a score >/=95% (aOR: 0.22, 95%CI, 0.07-0.67) than those with living parents.
CONCLUSIONS: Knowledge useful for a successful transition from paediatric to adult HIV care increased among youth participating in the HT2 programme. Youth follow-up will continue to assess the impact of improved knowledge on outcomes following the transition to adult care services.

      6. Correlates of recent HIV testing among substance-using men who have sex with men
        Rowe C, Matheson T, Das M, DeMicco E, Herbst JH, Coffin PO, Santos GM.
        Int J STD AIDS. 2017 May;28(6):594-601.
        Men who have sex with men are disproportionately impacted by HIV, and substance use is a key driver of HIV risk and transmission among this population. We conducted a cross-sectional survey of 3242 HIV-negative substance-using men who have sex with men aged 18+ in the San Francisco Bay Area from March 2009 to May 2012. Demographic characteristics and sexual risk and substance use behaviors in the last six months were collected using structured telephone questionnaires. We used multivariable logistic regression to identify independent demographic and behavioral predictors of recent HIV testing. In all, 65% reported having an HIV test in the last six months. In multivariable analysis, increasing age (aOR = 0.87, 95% CI = 0.84-0.90) and drinking alcohol (<1 drink/day: 0.65, 0.46-0.92; 2-3 drinks/day: 0.64, 0.45-0.91; 4+ drinks/day: 0.52, 0.35-0.78) were negatively associated with recent HIV testing. Having two or more condomless anal intercourse partners (2.17, 1.69-2.79) was positively associated with having a recent HIV test, whereas condomless anal intercourse with serodiscordant partners was not significantly associated with testing. Older men who have sex with men and those who drink alcohol may benefit from specific targeting in efforts to expand HIV testing. The inherently riskier discordant serostatus of partners is not as significant a motivator of HIV testing as condomless anal intercourse in general.

      7. Epidemiology of influenza B/Yamagata and B/Victoria lineages in South Africa, 2005-2014
        Seleka M, Treurnicht FK, Tempia S, Hellferscee O, Mtshali S, Cohen AL, Buys A, McAnerney JM, Besselaar TG, Pretorius M, von Gottberg A, Walaza S, Cohen C, Madhi SA, Venter M.
        PLoS One. 2017 ;12(5):e0177655.
        BACKGROUND: Studies describing the epidemiology of influenza B lineages in South Africa are lacking. METHODS: We conducted a prospective study to describe the circulation of influenza B/Victoria and B/Yamagata lineages among patients of all ages enrolled in South Africa through three respiratory illness surveillance systems between 2005 and 2014: (i) the Viral Watch (VW) program enrolled outpatients with influenza-like illness (ILI) from private healthcare facilities during 2005-2014; (ii) the influenza-like illnesses program enrolled outpatients in public healthcare clinics (ILI/PHC) during 2012-2014; and (iii) the severe acute respiratory illnesses (SARI) program enrolled inpatients from public hospitals during 2009-2014. Influenza B viruses were detected by virus isolation during 2005 to 2009 and by real-time reverse transcription polymerase chain reaction from 2009-2014. Clinical and epidemiological characteristics of patients hospitalized with SARI and infected with different influenza B lineages were also compared using unconditional logistic regression. RESULTS: Influenza viruses were detected in 22% (8,706/39,804) of specimens from patients with ILI or SARI during 2005-2014, of which 24% (2,087) were positive for influenza B. Influenza B viruses predominated in all three surveillance systems in 2010. B/Victoria predominated prior to 2011 (except 2008) whereas B/Yamagata predominated thereafter (except 2012). B lineages co-circulated in all seasons, except in 2013 and 2014 for SARI and ILI/PHC surveillance. Among influenza B-positive SARI cases, the detection of influenza B/Yamagata compared to influenza B/Victoria was significantly higher in individuals aged 45-64 years (adjusted odds ratio [aOR]: 4.2; 95% confidence interval [CI]: 1.1-16.5) and >/=65 years (aOR: 12.2; 95% CI: 2.3-64.4) compared to children aged 0-4 years, but was significantly lower in HIV-infected patients (aOR: 0.4; 95% CI: 0.2-0.9). 
CONCLUSION: B lineages co-circulated in most seasons except in 2013 and 2014. Hospitalized SARI cases display differential susceptibility for the two influenza B lineages, with B/Victoria being more prevalent among children and HIV-infected persons.

    • Disaster Control and Emergency Services
      1. Characterization of carbon monoxide exposure during Hurricane Sandy and subsequent Nor’easter
        Schnall A, Law R, Heinzerling A, Sircar K, Damon S, Yip F, Schier J, Bayleyegn T, Wolkin A.
        Disaster Med Public Health Prep. 2017 Apr 25:1-6.
        OBJECTIVE: Carbon monoxide (CO) is an odorless, colorless gas produced by fossil fuel combustion. On October 29, 2012, Hurricane Sandy moved ashore near Atlantic City, New Jersey, causing widespread morbidity and mortality, $30 to $50 billion in economic damage, and 8.5 million households to be without power. The combination of power outages and unusually low temperatures led people to use alternate power sources, placing many at risk for CO exposure. METHODS: We examined Hurricane Sandy-related CO exposures from multiple perspectives to help identify risk factors and develop strategies to prevent future exposures. This report combined data from 3 separate sources (health departments, poison centers via the National Poison Data System, and state and local public information officers). RESULTS: Results indicated that the number of CO exposures in the wake of Hurricane Sandy was significantly greater than in previous years. The persons affected were mostly females and those in younger age categories and, despite messaging, most CO exposures occurred from improper generator use. CONCLUSIONS: Our findings emphasize the continued importance of CO-related communication and ongoing surveillance of CO exposures to support public health response and prevention during and after disasters. Additionally, regional poison centers can be a critical resource for potential on-site management, public health promotion, and disaster-related CO exposure surveillance.

    • Epidemiology and Surveillance
      1. Comparing laboratory surveillance with the notifiable diseases surveillance system in South Africa
        Benson FG, Musekiwa A, Blumberg L, Rispel LC.
        Int J Infect Dis. 2017 May 19.
        OBJECTIVE: The aim of this study was to compare laboratory surveillance with the notifiable diseases surveillance system (NDSS) in South Africa. METHODS: Data on three tracer notifiable diseases – measles, meningococcal meningitis, and typhoid – were compared to assess data quality, stability, representativeness, sensitivity and positive predictive value (PPV), using the Wilcoxon and Chi-square tests, at the 5% significance level. RESULTS: For all three diseases, fewer cases were notified than confirmed in the laboratory. Completeness for the laboratory system was higher for measles (63% vs. 47%, p<0.001) and meningococcal meningitis (63% vs. 57%, p<0.001), but not for typhoid (60% vs. 63%, p=0.082). Stability was higher for the laboratory (all 100%) compared to notified measles (24%, p<0.001), meningococcal meningitis (74%, p<0.001), and typhoid (36%, p<0.001). Representativeness was also higher for the laboratory (all 100%) than for notified measles (67%, p=0.058), meningococcal meningitis (56%, p=0.023), and typhoid (44%, p=0.009). The sensitivity of the NDSS was 50%, 98%, and 93%, and the PPV was 20%, 57%, and 81% for measles, meningococcal meningitis, and typhoid, respectively. CONCLUSIONS: Compared to laboratory surveillance, the NDSS performed poorly on most system attributes. Revitalization of the NDSS in South Africa is recommended to address the completeness, stability, and representativeness of the system.
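Sensitivity and PPV in this evaluation follow the standard surveillance-system definitions, with laboratory confirmation as the reference; a minimal sketch with hypothetical counts (not the study's data):

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Share of laboratory-confirmed cases that were also notified."""
    return true_pos / (true_pos + false_neg)

def ppv(true_pos: int, false_pos: int) -> float:
    """Share of notified cases that were laboratory-confirmed."""
    return true_pos / (true_pos + false_pos)

# Hypothetical counts: 93 cases both notified and confirmed, 7 confirmed
# but never notified, 22 notified but not confirmed.
print(f"sensitivity = {sensitivity(93, 7):.0%}, PPV = {ppv(93, 22):.0%}")
```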

    • Genetics and Genomics
      1. Insights into Reston virus spillovers and adaption from virus whole genome sequences
        Albarino CG, Wiggleton Guerrero L, Jenks HM, Chakrabarti AK, Ksiazek TG, Rollin PE, Nichol ST.
        PLoS One. 2017 ;12(5):e0178224.
        Reston virus (RESTV; family Filoviridae) is unique among the viruses of the Ebolavirus genus in that it is considered non-pathogenic in humans, in contrast to the other members, which are highly virulent. The virus has, however, been associated with several outbreaks of highly lethal hemorrhagic fever in non-human primates (NHPs), specifically cynomolgus monkeys (Macaca fascicularis) originating in the Philippines. In addition, Reston virus has been isolated from domestic pigs in the Philippines. To better understand virus spillover events and potential adaption to new hosts, the whole genome sequences of representative Reston virus isolates were obtained using a next generation sequencing (NGS) approach, and comparative genomic and virus fitness analyses were performed. Nine virus genome sequences were completed for novel and previously described isolates obtained from a variety of hosts, including a human case, non-human primates, and pigs. Results of phylogenetic analysis of the sequence differences are consistent with multiple independent introductions of RESTV from a still unknown natural reservoir into non-human primates and swine farming operations. No consistent virus genetic markers were found specific for viruses associated with primate or pig infections, but similar to what had been seen with some Ebola viruses detected in the large Western Africa outbreak in 2014-2016, a truncated version of VP30 was identified in a subgroup of Reston viruses obtained from an outbreak in pigs in 2008-2009. Finally, the genetic comparison of two closely related viruses, one isolated from a human case and one from an NHP, showed amino acid differences in the viral polymerase, and detectable differences were found in competitive growth assays on human and NHP cell lines.

      2. Critical role of RIG-I and MDA5 in early and late stages of Tulane virus infection
        Chhabra P, Ranjan P, Cromeans T, Sambhara S, Vinje J.
        J Gen Virol. 2017 May 22.
        Human noroviruses are a major cause of acute gastroenteritis worldwide, but the lack of a robust cell culture system or small animal model has hampered a better understanding of innate immunity against these viruses. Tulane virus (TV) is the prototype virus of a tentative new genus, Recovirus, in the family Caliciviridae. Its epidemiology and biological properties most closely resemble those of human norovirus. The host innate immune response to RNA virus infection primarily involves the pathogen-sensing toll-like receptors (TLRs) TLR3 and TLR7, the retinoic acid-inducible gene I-like receptor RIG-I, and melanoma differentiation-associated gene 5 (MDA5). In this study, using siRNA knockdown, we report that TV infection in LLC-MK2 cells results in an early [3 h post infection (h p.i.), P<0.05] RIG-I-dependent and type I interferon-mediated antiviral response, whereas an MDA5-mediated antiviral effect was observed at later (12 h p.i.; P<0.05) stages of TV replication. Induction of RIG-I and MDA5 was critical for inhibition of TV replication. Furthermore, pre-activation of the RIG-I/MDA5 pathway prevented TV replication (>900-fold decrease; P<0.05), suggesting that RIG-I and MDA5 ligands could be used to develop novel preventive and therapeutic measures against norovirus.

      3. Molecular epidemiology of hepatitis B virus infection in Tanzania
        Forbi JC, Dillon M, Purdy MA, Drammeh BS, Tejada-Strop A, McGovern D, Xia GL, Lin Y, Ganova-Raeva LM, Campo DS, Thai H, Vaughan G, Haule D, Kutaga RP, Basavaraju SV, Kamili S, Khudyakov YE.
        J Gen Virol. 2017 May 25.
        Despite the significant public health problems associated with hepatitis B virus (HBV) in sub-Saharan Africa, many countries in this region do not have systematic HBV surveillance or genetic information on HBV circulating locally. Here, we report on the genetic characterization of 772 HBV strains from Tanzania. Phylogenetic analysis of the S-gene sequences showed prevalence of HBV genotype A (HBV/A, n=671, 86.9 %), followed by genotypes D (HBV/D, n=95, 12.3 %) and E (HBV/E, n=6, 0.8 %). All HBV/A sequences were further classified into subtype A1, while the HBV/D sequences were assigned to a new cluster. Among the Tanzanian sequences, 84 % of HBV/A1 and 94 % of HBV/D were unique. The Tanzanian and global HBV/A1 sequences were compared and were completely intermixed in the phylogenetic tree, with the Tanzanian sequences frequently generating long terminal branches, indicating a long history of HBV/A1 infections in the country. The time to the most recent common ancestor was estimated to be 188 years ago [95 % highest posterior density (HPD): 132 to 265 years] for HBV/A1 and 127 years ago (95 % HPD: 79 to 192 years) for HBV/D. The Bayesian skyline plot showed that the number of transmissions ‘exploded’ exponentially between 1960-1970 for HBV/A1 and 1970-1990 for HBV/D, with the effective population of HBV/A1 having expanded twice as much as that of HBV/D. The data suggest that Tanzania is at least a part of the geographic origin of the HBV/A1 subtype. A recent increase in the transmission rate and significant HBV genetic diversity should be taken into consideration when devising public health interventions to control HBV infections in Tanzania.

      4. Evolutionary dynamics and genomic features of the Elizabethkingia anophelis 2015 to 2016 Wisconsin outbreak strain
        Perrin A, Larsonneur E, Nicholson AC, Edwards DJ, Gundlach KM, Whitney AM, Gulvik CA, Bell ME, Rendueles O, Cury J, Hugon P, Clermont D, Enouf V, Loparev V, Juieng P, Monson T, Warshauer D, Elbadawi LI, Walters MS, Crist MB, Noble-Wang J, Borlaug G, Rocha EP, Criscuolo A, Touchon M, Davis JP, Holt KE, McQuiston JR, Brisse S.
        Nat Commun. 2017 May 24;8:15483.
        An atypically large outbreak of Elizabethkingia anophelis infections occurred in Wisconsin. Here we show that it was caused by a single strain with thirteen characteristic genomic regions. Strikingly, the outbreak isolates show an accelerated evolutionary rate and an atypical mutational spectrum. Six phylogenetic sub-clusters with distinctive temporal and geographic dynamics are revealed, and their last common ancestor existed approximately one year before the first recognized human infection. Unlike other E. anophelis, the outbreak strain had a disrupted DNA repair mutY gene caused by insertion of an integrative and conjugative element. This genomic change probably contributed to the high evolutionary rate of the outbreak strain and may have increased its adaptability, as many mutations in protein-coding genes occurred during the outbreak. This unique discovery of an outbreak caused by a naturally occurring mutator bacterial pathogen provides a dramatic example of the potential impact of pathogen evolutionary dynamics on infectious disease epidemiology.

    • Health Disparities
      1. Social determinants contribute to health disparities. Previous research has indicated that community trauma is associated with negative health outcomes. This study examined the impact of community trauma on sexual risk, marijuana use and mental health among African-American female adolescents in a juvenile detention center. One hundred and eighty-eight African-American female adolescents, aged 13-17 years, were recruited from a short-term detention facility and completed assessments on community trauma, sexual risk behavior, marijuana use, symptoms of posttraumatic stress disorder and psychosocial HIV/STD risk factors. Findings indicate that community trauma was associated with unprotected sex, having a sex partner with a correctional/juvenile justice history, sexual sensation seeking, marijuana use, affiliation with deviant peers and posttraumatic stress disorder symptoms at baseline and longitudinally. Findings reinforce the impact of community-level factors and co-occurring health issues, particularly in high-risk environments and among vulnerable populations. Structural and community-level interventions and policy-level changes may help improve access to resources and improve adolescents’ overall health and standard of living in at-risk communities.

    • Healthcare Associated Infections
      1. Surgical site infection research opportunities
        Itani KM, Dellinger EP, Mazuski J, Solomkin J, Allen G, Blanchard JC, Kelz R, Berrios-Torres SI.
        Surg Infect (Larchmt). 2017 May/Jun;18(4):401-408.
        Much has been done to identify measures and modify risk factors to decrease the rate of surgical site infection (SSI). Development of the Centers for Disease Control and Prevention (CDC) Core recommendations for the prevention of SSI revealed evidence gaps in six areas: parenteral antimicrobial prophylaxis, glycemic control, normothermia, oxygenation, antiseptic prophylaxis, and non-parenteral antimicrobial prophylaxis. Using a modified Delphi process, seven SSI content experts identified nutritional status, smoking, obesity, surgical technique, and anemia as additional areas for SSI prevention research. After the modified Delphi process, Staphylococcus aureus colonization and SSI definition and surveillance were also deemed important topic areas for inclusion. For each topic, research questions were developed, and 10 were selected as the final SSI research questions.

      2. A national implementation project to prevent catheter-associated urinary tract infection in nursing home residents
        Mody L, Greene MT, Meddings J, Krein SL, McNamara SE, Trautner BW, Ratz D, Stone ND, Min L, Schweon SJ, Rolle AJ, Olmsted RN, Burwen DR, Battles J, Edson B, Saint S.
        JAMA Intern Med. 2017 May 19.
        Importance: Catheter-associated urinary tract infection (UTI) in nursing home residents is a common cause of sepsis, hospital admission, and antimicrobial use leading to colonization with multidrug-resistant organisms. Objective: To develop, implement, and evaluate an intervention to reduce catheter-associated UTI. Design, Setting, and Participants: A large-scale prospective implementation project was conducted in community-based nursing homes participating in the Agency for Healthcare Research and Quality Safety Program for Long-Term Care. Nursing homes across 48 states, Washington, DC, and Puerto Rico participated. Implementation of the project was conducted between March 1, 2014, and August 31, 2016. Interventions: The project was implemented over 12-month cohorts and included a technical bundle (catheter removal, aseptic insertion, regular assessment, training for catheter care, and incontinence care planning) as well as a socioadaptive bundle emphasizing leadership, resident and family engagement, and effective communication. Main Outcomes and Measures: Urinary catheter use and catheter-associated UTI rates, using National Healthcare Safety Network definitions, were collected. Facility-level urine culture order rates were also obtained. Random-effects negative binomial regression models were used to examine changes in catheter-associated UTI, catheter utilization, and urine cultures, adjusted for covariates including ownership, bed size, provision of subacute care, 5-star rating, presence of an infection control committee, and an infection preventionist. Results: In 4 cohorts over 30 months, 568 community-based nursing homes were recruited; 404 met inclusion criteria for analysis. The unadjusted catheter-associated UTI rates decreased from 6.78 to 2.63 infections per 1000 catheter-days.
With use of the regression model and adjustment for facility characteristics, the rates decreased from 6.42 to 3.33 (incidence rate ratio [IRR], 0.46; 95% CI, 0.36-0.58; P < .001). Unadjusted catheter utilization was 4.5% at baseline and 4.9% at the end of the project; in adjusted analyses, utilization remained unchanged (4.50% at baseline, 4.45% at the conclusion of the project; IRR, 0.95; 95% CI, 0.88-1.03; P = .26). The number of urine cultures ordered for all residents decreased from 3.49 per 1000 resident-days to 3.08 per 1000 resident-days. Similarly, after adjustment, the rates were shown to decrease from 3.52 to 3.09 (IRR, 0.85; 95% CI, 0.77-0.94; P = .001). Conclusions and Relevance: In a large-scale, national implementation project involving community-based nursing homes, combined technical and socioadaptive catheter-associated UTI prevention interventions successfully reduced the incidence of catheter-associated UTIs.
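Device-associated infection rates of this kind are events per 1000 device-days; a minimal sketch with hypothetical counts chosen to reproduce the abstract's unadjusted rates (the study's adjusted IRRs come from negative binomial regression, not this crude arithmetic):

```python
def rate_per_1000(events: int, device_days: int) -> float:
    """Infections per 1000 device-days."""
    return events / device_days * 1000

# Hypothetical facility data: 12 CAUTIs over 1,770 catheter-days at
# baseline, 5 CAUTIs over 1,900 catheter-days at the end of the project.
baseline = rate_per_1000(12, 1770)   # ~6.78
final = rate_per_1000(5, 1900)       # ~2.63
print(f"{baseline:.2f} -> {final:.2f} per 1000 catheter-days; crude IRR {final / baseline:.2f}")
```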

      3. Introduction to the Centers for Disease Control and Prevention and the Healthcare Infection Control Practices Advisory Committee Guideline for the Prevention of Surgical Site Infections
        Solomkin JS, Mazuski J, Blanchard JC, Itani KM, Ricks P, Dellinger EP, Allen G, Kelz R, Reinke CE, Berrios-Torres SI.
        Surg Infect (Larchmt). 2017 May/Jun;18(4):385-393.
        Surgical site infection (SSI) is a common type of health-care-associated infection (HAI) and adds considerably to the individual, social, and economic costs of surgical treatment. This document serves to introduce the updated Guideline for the Prevention of SSI from the Centers for Disease Control and Prevention (CDC) and the Healthcare Infection Control Practices Advisory Committee (HICPAC). The Core section of the guideline addresses issues relevant to multiple surgical specialties and procedures. The second procedure-specific section focuses on a high-volume, high-burden procedure: Prosthetic joint arthroplasty. While many elements of the 1999 guideline remain current, others warrant updating to incorporate new knowledge and changes in the patient population, operative techniques, emerging pathogens, and guideline development methodology.

    • Immunity and Immunization
      1. Influenza vaccination modifies disease severity among community-dwelling adults hospitalized with influenza
        Arriola CS, Garg S, Anderson EJ, Ryan PA, George A, Zansky SM, Bennett N, Reingold A, Bargsten M, Miller L, Yousey-Hindes K, Tatham L, Bohm SR, Lynfield R, Thomas A, Lindegren ML, Schaffner W, Fry AM, Chaves SS.
        Clin Infect Dis. 2017 May 19.
        Background: We investigated the effect of influenza vaccination on disease severity in adults hospitalized with laboratory-confirmed influenza during 2013-14, a season in which vaccine viruses were antigenically similar to those circulating. Methods: We analyzed data from the 2013-14 influenza season and used propensity score matching to account for the probability of vaccination within age strata (18-49, 50-64 and >/=65 years). Death, intensive care unit (ICU) admission, and hospital and ICU lengths of stay (LOS) were outcome measures for severity. Multivariable logistic regression and competing risk models were used to compare disease severity between vaccinated and unvaccinated patients, adjusting for timing of antiviral treatment and time from illness onset to hospitalization. Results: Influenza vaccination was associated with a reduction in the odds of in-hospital death among patients aged 18-49 years (adjusted odds ratio [aOR] = 0.21; 95% confidence interval [CI], 0.05 to 0.97), 50-64 years (aOR = 0.48; 95% CI, 0.24 to 0.97), and >/=65 years (aOR = 0.39; 95% CI, 0.17 to 0.66). Vaccination also reduced ICU admission among patients aged 18-49 years (aOR = 0.63; 95% CI, 0.42 to 0.93) and >/=65 years (aOR = 0.63; 95% CI, 0.48 to 0.81), shortened ICU LOS among those 50-64 years (adjusted relative hazards [aRH] = 1.36; 95% CI, 1.06 to 1.74) and >/=65 years (aRH = 1.34; 95% CI, 1.06 to 1.73), and shortened hospital LOS among those 50-64 years (aRH = 1.13; 95% CI, 1.02 to 1.26) and >/=65 years (aRH = 1.24; 95% CI, 1.13 to 1.37). Conclusions: Influenza vaccination during the 2013-14 influenza season attenuated adverse outcomes among adults who were hospitalized with laboratory-confirmed influenza.

      2. Third dose diphtheria tetanus pertussis (DTP3) administrative coverage is a commonly used indicator of immunization program performance, although studies have demonstrated data quality issues with administrative DTP3 coverage. It is possible that administrative coverage for DTP3 may be inflated more than for other antigens. To examine this theory, we compiled immunization coverage estimates from recent country surveys (n=71) and paired these with corresponding administrative coverage estimates, by country and cohort year, for DTP3 and 4 other antigens. Median administrative coverage was higher than survey estimates of coverage for all antigens (median differences from 26 to 30%); however, this difference was similar for DTP3 and for all other antigens. These findings were consistent when countries were stratified by income level and eligibility for Gavi funding. Our findings demonstrate that while country administrative coverage estimates tend to be higher than survey estimates, DTP3 administrative coverage is not inflated more than that for other antigens.

      3. Influenza vaccine effectiveness against pediatric deaths: 2010-2014
        Flannery B, Reynolds SB, Blanton L, Santibanez TA, O’Halloran A, Lu PJ, Chen J, Foppa IM, Gargiullo P, Bresee J, Singleton JA, Fry AM.
        Pediatrics. 2017 ;139(5).
        BACKGROUND AND OBJECTIVES: Surveillance for laboratory-confirmed influenza-associated pediatric deaths since 2004 has shown that most deaths occur in unvaccinated children. We assessed whether influenza vaccination reduced the risk of influenza-associated death in children and adolescents. METHODS: We conducted a case-cohort analysis comparing vaccination uptake among laboratory-confirmed influenza-associated pediatric deaths with estimated vaccination coverage among pediatric cohorts in the United States. Case vaccination and high-risk status were determined by case investigation. Influenza vaccination coverage estimates were obtained from national survey data or a national insurance claims database. We estimated odds ratios from logistic regression comparing odds of vaccination among cases with odds of vaccination in comparison cohorts. We used Bayesian methods to compute 95% credible intervals (CIs) for vaccine effectiveness (VE), calculated as (1 – odds ratio) x 100. RESULTS: From July 2010 through June 2014, 358 laboratory-confirmed influenza-associated pediatric deaths were reported among children aged 6 months through 17 years. Vaccination status was determined for 291 deaths; 75 (26%) received vaccine before illness onset. Average vaccination coverage in survey cohorts was 48%. Overall VE against death was 65% (95% CI, 54% to 74%). Among 153 deaths in children with underlying high-risk medical conditions, 47 (31%) were vaccinated. VE among children with high-risk conditions was 51% (95% CI, 31% to 67%), compared with 65% (95% CI, 47% to 78%) among children without high-risk conditions. CONCLUSIONS: Influenza vaccination was associated with reduced risk of laboratory-confirmed influenza-associated pediatric death. Increasing influenza vaccination could prevent influenza-associated deaths among children and adolescents.
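The VE formula above, (1 − odds ratio) × 100, compares the odds of vaccination among case deaths with the odds of vaccination in the comparison cohort; a crude sketch using the abstract's overall figures (the paper's 65% estimate comes from Bayesian modeling, so this unadjusted arithmetic lands nearby but not exactly):

```python
def vaccine_effectiveness(case_vacc: int, case_total: int, cohort_coverage: float) -> float:
    """VE = (1 - OR) * 100, where OR contrasts vaccination odds in cases vs. the cohort."""
    case_odds = case_vacc / (case_total - case_vacc)
    cohort_odds = cohort_coverage / (1 - cohort_coverage)
    return (1 - case_odds / cohort_odds) * 100

# 75 of 291 pediatric deaths with known status were vaccinated;
# average cohort vaccination coverage was 48%.
print(f"crude VE = {vaccine_effectiveness(75, 291, 0.48):.0f}%")  # about 62%
```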

      4. U.S. zoster vaccine uptake has been sluggish. Most adults are aware of zoster but unaware of its distressing manifestations. We found that vaccine uptake is markedly increased immediately following occurrence of zoster in a spouse. Thus, personal zoster awareness can prompt vaccination. Our findings have implications in terms of both vaccine promotion and interpretations of vaccine performance.

      5. Responding to a cVDPV1 outbreak in Ukraine: Implications, challenges and opportunities
        Khetsuriani N, Perehinets I, Nitzan D, Popovic D, Moran T, Allahverdiyeva V, Huseynov S, Gavrilin E, Slobodianyk L, Izhyk O, Sukhodolska A, Hegazi S, Bulavinova K, Platov S, O’Connor P.
        Vaccine. 2017 May 18.
        BACKGROUND: The European Region, certified polio-free in 2002, remains at risk of wild poliovirus reintroduction and emergence of circulating vaccine-derived polioviruses (cVDPV) until global polio eradication is achieved, as demonstrated by the cVDPV1 outbreak in Ukraine in 2015. METHODS: We reviewed epidemiologic, clinical and virology data on cVDPV cases, surveillance and immunization coverage data, and reports of outbreak-related surveys, country missions, and expert group meetings. RESULTS: In Ukraine, 3-dose polio vaccine coverage declined from 91% in 2008 to 15% by mid-2015. In summer 2015, two unrelated children from Zakarpattya province were paralyzed by a highly divergent cVDPV1. The isolates were 20 and 26 nucleotides divergent from the prototype Sabin strain (with 18 identical mutations), consistent with their common origin and approximately 2-year evolution. Outbreak response recommendations developed with international partner support included conducting three nationwide supplementary immunization activities (SIAs) with tOPV, strengthening surveillance and implementing communication interventions. SIAs were conducted during October 2015-February 2016 (officially reported coverage: round 1, 64.4%; round 2, 71.7%; round 3, 80.7%). Substantial challenges to outbreak response included lack of high-level support, resistance to OPV use, low perceived risk of polio, widespread vaccine hesitancy, anti-vaccine media environment, economic crisis and military conflict. Communication activities improved caregiver awareness of polio and confidence in vaccination. Surveillance was enhanced but did not consistently meet applicable performance standards. Post-outbreak assessments concluded that cVDPV1 transmission in Ukraine has likely stopped following the response, but significant gaps in population immunity and surveillance remained.
CONCLUSIONS: Chronic under-vaccination in Ukraine resulted in the accumulation of children susceptible to polioviruses and created favorable conditions for VDPV1 emergence and circulation, leading to the outbreak. Until programmatic gaps in immunization and surveillance are addressed, Ukraine will remain at high-risk for VDPV emergence and circulation, as well as at risk for other vaccine-preventable diseases.

    • Informatics RSS Word feed
      1. Objective: To identify physician and practice characteristics associated with high clinical and technical performance on the electronic clinical quality measure (eCQM) that calculates the proportion of patients with hypertension who have controlled blood pressure. Materials and Methods: The study included 268,602 physicians participating in the Medicare Electronic Health Record Incentive Program between 2011 and 2014. Independent variables included delivery reform participation and physician, practice-level, and area characteristics. Successful technical performance was a reported eCQM with non-zero values in both the numerator and denominator. Successful clinical performance was a reported eCQM value of >/=70% hypertension control. Results: Physicians with longer experience using certified health information technology, participants in delivery reform programs, and specialists that traditionally manage hypertension were 5%-15% more likely to achieve 70% control. Physicians in smaller and rural practices and a subset of physicians unlikely to primarily manage hypertension were more likely to submit measures with a zero value in either the numerator or denominator. Discussion: More physicians are using eCQMs to track and report their quality improvement efforts. This research presents the first examination of national eCQM data to identify physician and practice-level characteristics associated with performance. Conclusion: With careful selection of measures relevant to the clinician’s specialty, complete data entry, and support for continuous quality improvement, health care professionals can excel technically and clinically. As care delivery transitions from fee-for-service to quality- and value-based models, high performers may realize financial gains and better patient outcomes. These analyses suggest patterns that may inform steps to improve performance.

    • Injury and Violence RSS Word feed
      1. Improving primary care provider practices in youth concussion management
        Arbogast KB, Curry AE, Metzger KB, Kessler RS, Bell JM, Haarbauer-Krupa J, Zonfrillo MR, Breiding MJ, Master CL.
        Clin Pediatr (Phila). 2017 May 01:9922817709555.
        Primary care providers are increasingly providing youth concussion care but report insufficient time and training, limiting adoption of best practices. We implemented a primary care-based intervention including an electronic health record-based clinical decision support tool (“SmartSet”) and in-person training. We evaluated consequent improvement in 2 key concussion management practices: (1) performance of a vestibular oculomotor examination and (2) discussion of return-to-learn/return-to-play (RTL/RTP) guidelines. Data were included from 7284 primary care patients aged 0 to 17 years with initial concussion visits between July 2010 and June 2014. We compared proportions of visits pre- and post-intervention in which the examination was performed or RTL/RTP guidelines provided. Examinations and RTL/RTP were documented for 1.8% and 19.0% of visits pre-intervention, respectively, compared with 71.1% and 72.9% post-intervention. A total of 95% of post-intervention examinations were documented within the SmartSet. An electronic clinical decision support tool, plus in-person training, may be key to changing primary care provider behavior around concussion care.

      2. Universal motorcycle helmet laws to reduce injuries: A Community Guide Systematic Review
        Peng Y, Vaidya N, Finnie R, Reynolds J, Dumitru C, Njie G, Elder R, Ivers R, Sakashita C, Shults RA, Sleet DA, Compton RP.
        Am J Prev Med. 2017 Jun;52(6):820-832.
        CONTEXT: Motorcycle crashes account for a disproportionate number of motor vehicle deaths and injuries in the U.S. Motorcycle helmet use can lead to an estimated 42% reduction in risk for fatal injuries and a 69% reduction in risk for head injuries. However, helmet use in the U.S. has been declining and was at 60% in 2013. The current review examines the effectiveness of motorcycle helmet laws in increasing helmet use and reducing motorcycle-related deaths and injuries. EVIDENCE ACQUISITION: Databases relevant to health or transportation were searched from database inception to August 2012. Reference lists of reviews, reports, and gray literature were also searched. Analysis of the data was completed in 2014. EVIDENCE SYNTHESIS: A total of 60 U.S. studies qualified for inclusion in the review. Implementing universal helmet laws increased helmet use (median, 47 percentage points); reduced total deaths (median, -32%) and deaths per registered motorcycle (median, -29%); and reduced total injuries (median, -32%) and injuries per registered motorcycle (median, -24%). Repealing universal helmet laws decreased helmet use (median, -39 percentage points); increased total deaths (median, 42%) and deaths per registered motorcycle (median, 24%); and increased total injuries (median, 41%) and injuries per registered motorcycle (median, 8%). CONCLUSIONS: Universal helmet laws are effective in increasing motorcycle helmet use and reducing deaths and injuries. These laws are effective for motorcyclists of all ages, including younger operators and passengers who would have already been covered by partial helmet laws. Repealing universal helmet laws decreased helmet use and increased deaths and injuries.

      3. Boys are victims too? Sexual dating violence and injury among high-risk youth
        Reidy DE, Early MS, Holland KM.
        Prev Med. 2017 May 18.
        OBJECTIVE: Prior research with youth exposed to violence suggests that, in this high-risk population, boys may be victims of sexual teen dating violence (TDV) and injury as frequently as girls. We sought to replicate these findings with a demographically similar sample and to determine whether the findings could be attributed to the high-risk nature of the sample by assessing the impact of violence exposure on sex differences. METHODS: A cross-sectional sample of 2577 youth (ages 11-18, M=15.4, SD=1.9, 52% female, 25% Caucasian) collected in 2004 from a high-risk community reported on history of dating and exposure to multiple forms of violence. We conducted moderation analyses to test whether polyvictimization (PV) and age moderated the potential sex differences in perpetration and victimization of sexual TDV and injury. RESULTS: No significant sex differences in victimization were observed regardless of degree of PV. Boys reported more frequent sexual TDV and injury perpetration relative to girls, but only for youth reporting high degree of PV. There were no sex differences in perpetration among low PV youth. CONCLUSIONS: These findings suggest boys from high-risk communities may disproportionately perpetrate severe acts of TDV but at this early age they are equally likely to be victimized. To interrupt the cycle of violence victimization and perpetration, comprehensive violence prevention interventions targeting high-risk youth should be implemented at schools, in homes, and in the community; and they should recognize the potential for girls and boys to be victims of even the most severe forms of TDV.

      4. Changes in J-SOAP-II and SAVRY scores over the course of residential, cognitive-behavioral treatment for adolescent sexual offending
        Viljoen JL, Gray AL, Shaffer C, Latzman NE, Scalora MJ, Ullman D.
        Sex Abuse. 2017 Jun;29(4):342-374.
        Although the Juvenile Sex Offender Assessment Protocol-II (J-SOAP-II) and the Structured Assessment of Violence Risk in Youth (SAVRY) include an emphasis on dynamic, or modifiable factors, there has been little research on dynamic changes on these tools. To help address this gap, we compared admission and discharge scores of 163 adolescents who attended a residential, cognitive-behavioral treatment program for sexual offending. Based on reliable change indices, one half of youth showed a reliable decrease on the J-SOAP-II Dynamic Risk Total Score and one third of youth showed a reliable decrease on the SAVRY Dynamic Risk Total Score. Contrary to expectations, decreases in risk factors and increases in protective factors did not predict reduced sexual, violent nonsexual, or any reoffending. In addition, no associations were found between scores on the Psychopathy Checklist: Youth Version and levels of change. Overall, the J-SOAP-II and the SAVRY hold promise in measuring change, but further research is needed.
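The reliable change indices mentioned above are a standard way (in the Jacobson-Truax tradition) to ask whether an individual's pre-post score change exceeds measurement error. A minimal sketch with illustrative numbers (the baseline SD and reliability values below are assumptions, not figures from the study):

```python
import math

def reliable_change_index(score_pre, score_post, sd_baseline, reliability):
    """Jacobson-Truax reliable change index.

    |RCI| > 1.96 suggests change beyond measurement error (p < .05).
    """
    se_measurement = sd_baseline * math.sqrt(1 - reliability)
    sd_diff = math.sqrt(2) * se_measurement
    return (score_post - score_pre) / sd_diff

# Hypothetical example: a dynamic risk score drops from 18 to 11, assuming
# a baseline SD of 5 and test-retest reliability of 0.80 (illustrative only).
rci = reliable_change_index(18, 11, sd_baseline=5.0, reliability=0.80)
print(f"RCI = {rci:.2f}")  # a value beyond -1.96 indicates a reliable decrease
```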

    • Laboratory Sciences RSS Word feed
      1. Measuring changes in transmission of neglected tropical diseases, malaria, and enteric pathogens from quantitative antibody levels
        Arnold BF, van der Laan MJ, Hubbard AE, Steel C, Kubofcik J, Hamlin KL, Moss DM, Nutman TB, Priest JW, Lammie PJ.
        PLoS Negl Trop Dis. 2017 May 19;11(5):e0005616.
        BACKGROUND: Serological antibody levels are a sensitive marker of pathogen exposure, and advances in multiplex assays have created enormous potential for large-scale, integrated infectious disease surveillance. Most methods to analyze antibody measurements reduce quantitative antibody levels to seropositive and seronegative groups, but this can be difficult for many pathogens and may provide lower resolution information than quantitative levels. Analysis methods have predominantly maintained a single disease focus, yet integrated surveillance platforms would benefit from methodologies that work across diverse pathogens included in multiplex assays. METHODS/PRINCIPAL FINDINGS: We developed an approach to measure changes in transmission from quantitative antibody levels that can be applied to diverse pathogens of global importance. We compared age-dependent immunoglobulin G curves in repeated cross-sectional surveys between populations with differences in transmission for multiple pathogens, including: lymphatic filariasis (Wuchereria bancrofti) measured before and after mass drug administration on Mauke, Cook Islands, malaria (Plasmodium falciparum) before and after a combined insecticide and mass drug administration intervention in the Garki project, Nigeria, and enteric protozoans (Cryptosporidium parvum, Giardia intestinalis, Entamoeba histolytica), bacteria (enterotoxigenic Escherichia coli, Salmonella spp.), and viruses (norovirus groups I and II) in children living in Haiti and the USA. Age-dependent antibody curves fit with ensemble machine learning followed a characteristic shape across pathogens that aligned with predictions from basic mechanisms of humoral immunity. Differences in pathogen transmission led to shifts in fitted antibody curves that were remarkably consistent across pathogens, assays, and populations. 
Mean antibody levels correlated strongly with traditional measures of transmission intensity, such as the entomological inoculation rate for P. falciparum (Spearman’s rho = 0.75). In both high- and low-transmission settings, mean antibody curves revealed changes in population mean antibody levels that were masked by seroprevalence measures because changes took place above or below the seropositivity cutoff. CONCLUSIONS/SIGNIFICANCE: Age-dependent antibody curves and summary means provided a robust and sensitive measure of changes in transmission, with greatest sensitivity among young children. The method generalizes to pathogens that can be measured in high-throughput, multiplex serological assays, and scales to surveillance activities that require high spatiotemporal resolution. Our results suggest quantitative antibody levels will be particularly useful to measure differences in exposure for pathogens that elicit a transient antibody response or for monitoring populations with very high or very low transmission, when seroprevalence is less informative. The approach represents a new opportunity to conduct integrated serological surveillance for neglected tropical diseases, malaria, and other infectious diseases with well-defined antigen targets.
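The masking effect described above, where shifts in mean antibody level occur entirely above the seropositivity cutoff, is easy to illustrate with a toy simulation (hypothetical populations and cutoff, not the authors' data or code):

```python
import random
import statistics

random.seed(1)
CUTOFF = 1.0  # hypothetical log10 antibody cutoff for seropositivity

def seroprevalence(pop):
    return sum(x > CUTOFF for x in pop) / len(pop)

# Two hypothetical populations whose mean log-antibody levels differ clearly,
# yet both sit well above the cutoff (illustrative numbers only).
high = [random.gauss(3.0, 0.5) for _ in range(10_000)]
lower = [random.gauss(2.4, 0.5) for _ in range(10_000)]

print(f"means: {statistics.mean(high):.2f} vs {statistics.mean(lower):.2f}")
print(f"seroprevalence: {seroprevalence(high):.1%} vs {seroprevalence(lower):.1%}")
# The mean antibody level separates the populations; seroprevalence,
# saturated near 100% in both, does not.
```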

      2. Performance characteristics of an antibody-based multiplex kit for determining recent HIV-1 infection
        Curtis KA, Hanson DL, Price KA, Owen SM.
        PLoS One. 2017;12(5):e0176593.
        The availability of reliable laboratory methods for determining recent HIV infection is vital for accurate estimation of population-based incidence. The mean duration of recent infection (MDRI) and false recent rate (FRR) are critical parameters for HIV incidence assays, as they impact HIV incidence estimates and provide a measure of assay performance. The HIV-1 Multiplex assay is an in-house developed, magnetic bead-based assay that measures virus-specific antibody levels and avidity to multiple analytes. To ensure quality control and to facilitate transfer of the assay to external laboratories or testing facilities, the in-house assay has been adapted and produced in kit form. Here, we describe the performance characteristics of the multiplex kit and demonstrate the stability of the kit components over a one-year period. Two statistical methods were employed to estimate the MDRI of the individual analytes and of five different algorithms combining multiple analyte values. The MDRI estimates for the individual analytes and five algorithms were all between 200 and 300 days post-seroconversion, with no notable difference between the two statistical approaches. All five algorithms exhibited a 0% FRR with specimens from long-term, subtype B HIV-1-infected individuals. The assay parameters described in this study provide the necessary tools to implement the HIV-1 multiplex assay and improve the utility of the assay for field use.

      3. Size and shape distributions of primary crystallites in titania aggregates
        Grulke E, Yamamoto K, Kumagai K, Hausler I, Osterle W, Ortel E, Hodoroaba V, Brown S, Chan C, Zheng J, Yamamoto K, Yashiki K, Song N, Kim Y, Stefaniak A, Schwegler-Berry D, Coleman V, Jamting A, Herrmann J, Arakawa T, Burchett W, Lambert J, Stromberg A.
        Adv Powder Technol. 2017.
        The primary crystallite size of titania powder relates to its properties in a number of applications. Transmission electron microscopy was used in this interlaboratory comparison (ILC) to measure primary crystallite size and shape distributions for a commercial aggregated titania powder. Data of four size descriptors and two shape descriptors were evaluated across nine laboratories. Data repeatability and reproducibility was evaluated by analysis of variance. One-third of the laboratory pairs had similar size descriptor data, but 83% of the pairs had similar aspect ratio data. Scale descriptor distributions were generally unimodal and were well-described by lognormal reference models. Shape descriptor distributions were multi-modal but data visualization plots demonstrated that the Weibull distribution was preferred to the normal distribution. For the equivalent circular diameter size descriptor, measurement uncertainties of the lognormal distribution scale and width parameters were 9.5% and 22%, respectively. For the aspect ratio shape descriptor, the measurement uncertainties of the Weibull distribution scale and width parameters were 7.0% and 26%, respectively. Both measurement uncertainty estimates and data visualizations should be used to analyze size and shape distributions of particles on the nanoscale.
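As the abstract notes, the size descriptors were well described by lognormal reference models. Fitting a lognormal reduces to fitting a normal distribution to the log-transformed data; a minimal sketch with synthetic diameters (the 25 nm geometric mean and 1.6 geometric SD are illustrative assumptions, not the titania measurements from the study):

```python
import math
import random
import statistics

random.seed(0)
# Synthetic "equivalent circular diameter" data (nm), drawn from a lognormal
# with geometric mean 25 nm and geometric SD 1.6 -- illustrative values only.
diameters = [random.lognormvariate(math.log(25), math.log(1.6))
             for _ in range(5_000)]

# Fit: take logs, estimate the normal parameters, exponentiate back.
logs = [math.log(d) for d in diameters]
geo_mean = math.exp(statistics.mean(logs))  # scale parameter
geo_sd = math.exp(statistics.stdev(logs))   # width parameter

print(f"geometric mean ~ {geo_mean:.1f} nm, geometric SD ~ {geo_sd:.2f}")
```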

      4. Respirable dust: Measured downwind during rock dust application
        Harris ML, Organiscak J, Klima S, Perera IE.
        Min Eng. 2017;69(5):69-74.
        The Pittsburgh Mining Research Division of the U.S. National Institute for Occupational Safety and Health (NIOSH) conducted underground evaluations in an attempt to quantify respirable rock dust generation when using untreated rock dust and rock dust treated with an anticaking additive. Using personal dust monitors, these evaluations measured respirable rock dust levels arising from a flinger-type application of rock dust on rib and roof surfaces. Rock dust with a majority of the respirable component removed was also applied in NIOSH’s Bruceton Experimental Mine using a bantam duster. The respirable dust measurements obtained downwind from both of these tests are presented and discussed. This testing did not measure miners’ exposure to respirable coal mine dust under acceptable mining practices, but indicates the need for effective continuous administrative controls to be exercised when rock dusting to minimize the measured amount of rock dust in the sampling device.

      5. Rapid and accurate molecular identification of the emerging multidrug resistant pathogen Candida auris
        Kordalewska M, Zhao Y, Lockhart SR, Chowdhary A, Berrio I, Perlin DS.
        J Clin Microbiol. 2017 May 24.
        Candida auris is an emerging multidrug resistant fungal pathogen causing nosocomial and invasive infections associated with high mortality. C. auris is commonly misidentified as several different yeast species by commercially available phenotypic identification platforms. Thus, there is an urgent need for a reliable diagnostic method. In this paper we present fast, robust, easy to perform and interpret PCR and real-time PCR assays to identify C. auris and related species: Candida duobushaemulonii, Candida haemulonii, and Candida lusitaniae. Targeting rDNA region nucleotide sequences, we designed primers specific for C. auris only or for C. auris and related species. A panel of 140 clinical fungal isolates was used in both PCR and real-time PCR assays followed by electrophoresis or melting temperature analysis, respectively. The identification results from the assays were 100% concordant with DNA sequencing results. These molecular assays overcome the deficiencies of existing phenotypic tests to identify C. auris and related species.

      6. Assessment of the QuantiFERON-TB Gold In-Tube test for the detection of Mycobacterium tuberculosis infection in United States Navy recruits
        Lempp JM, Zajdowicz MJ, Hankinson AL, Toney SR, Keep LW, Mancuso JD, Mazurek GH.
        PLoS One. 2017;12(5):e0177752.
        BACKGROUND: Immunologic tests such as the tuberculin skin test (TST) and QuantiFERON(R)-TB Gold In-Tube test (QFT-GIT) are designed to detect Mycobacterium tuberculosis infection, both latent M. tuberculosis infection (LTBI) and infection manifesting as active tuberculosis disease (TB). These tests need high specificity to minimize unnecessary treatment and high sensitivity to allow maximum detection and prevention of TB. METHODS: We estimated QFT-GIT specificity, compared QFT-GIT and TST results, and assessed factors associated with test discordance among U.S. Navy recruits. RESULTS: Among 792 subjects with completed TST and QFT-GIT, 42 (5.3%) had TST indurations ≥10 mm, 23 (2.9%) had indurations ≥15 mm, 14 (1.8%) had positive QFT-GIT results, and 5 (0.6%) had indeterminate QFT-GIT results. Of 787 subjects with completed TST and determinate QFT-GIT, 510 (64.8%) were at low risk for infection, 277 (35.2%) were at increased risk, and none had TB. Among 510 subjects at low risk (presumed not infected), estimated TST specificity using a 15 mm cutoff (99.0%; 95% CI: 98.2-99.9%) and QFT-GIT specificity (98.8%; 95% CI: 97.9-99.8%) were not significantly different (p>0.99). Most discordance was among recruits at increased risk of infection, and most was TST-positive but QFT-GIT-negative discordance. Of 18 recruits with TST ≥15 mm but negative QFT-GIT results, 14 (78%) were at increased risk. TB prevalence in country of birth was the strongest predictor of positive TST results, positive QFT-GIT results, and TST-positive but QFT-GIT-negative discordance. Reactivity to M. avium purified protein derivative (PPD) was associated with positive TST results and with TST-positive but QFT-GIT-negative discordance using a 10 mm cutoff, but not using a 15 mm cutoff or with QFT-GIT results. CONCLUSIONS: M. tuberculosis infection prevalence was low, with the vast majority of infection occurring in recruits with recognizable risks. QFT-GIT and TST specificities were high and not significantly different. Negative QFT-GIT results among subjects with TST induration ≥15 mm who were born in countries with high TB prevalence raise concerns.
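The specificity figures above can be approximately reproduced from the counts implied in the abstract. A minimal sketch assuming 505 true negatives among the 510 low-risk recruits and a simple Wald interval (the paper's exact counts and interval method are not stated in the abstract, so both are assumptions):

```python
import math

def wald_ci(successes, n, z=1.96):
    """Point estimate and simple Wald 95% CI for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# Counts implied by the abstract: 510 low-risk recruits, TST specificity of
# 99.0% at the 15 mm cutoff suggests ~505 true negatives (an assumption).
spec, lo, hi = wald_ci(505, 510)
print(f"specificity {spec:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # ~99.0% (98.2-99.9%)
```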

      7. Monkeypox virus host factor screen using haploid cells identifies essential role of GARP complex in extracellular virus formation
        Realegeno S, Puschnik AS, Kumar A, Goldsmith C, Burgado J, Sambhara S, Olson VA, Carroll D, Damon I, Hirata T, Kinoshita T, Carette JE, Satheshkumar PS.
        J Virol. 2017 Jun 01;91(11).
        Monkeypox virus (MPXV) is a human pathogen that is a member of the Orthopoxvirus genus, which includes Vaccinia virus and Variola virus (the causative agent of smallpox). Human monkeypox is considered an emerging zoonotic infectious disease. To identify host factors required for MPXV infection, we performed a genome-wide insertional mutagenesis screen in human haploid cells. The screen revealed several candidate genes, including those involved in Golgi trafficking, glycosaminoglycan biosynthesis, and glycosylphosphatidylinositol (GPI)-anchor biosynthesis. We validated the role of a set of vacuolar protein sorting (VPS) genes during infection, VPS51 to VPS54 (VPS51-54), which comprise the Golgi-associated retrograde protein (GARP) complex. The GARP complex is a tethering complex involved in retrograde transport of endosomes to the trans-Golgi apparatus. Our data demonstrate that VPS52 and VPS54 were dispensable for mature virion (MV) production but were required for extracellular virus (EV) formation. For comparison, a known antiviral compound, ST-246, was used in our experiments, demonstrating that EV titers in VPS52 and VPS54 knockout (KO) cells were comparable to levels exhibited by ST-246-treated wild-type cells. Confocal microscopy was used to examine actin tail formation, one of the viral egress mechanisms for cell-to-cell dissemination, and revealed an absence of actin tails in VPS52KO- or VPS54KO-infected cells. Further evaluation of these cells by electron microscopy demonstrated a decrease in levels of wrapped viruses (WVs) compared to those seen with the wild-type control. Collectively, our data demonstrate the role of GARP complex genes in double-membrane wrapping of MVs necessary for EV formation, implicating the host endosomal trafficking pathway in orthopoxvirus infection. IMPORTANCE: Human monkeypox is an emerging zoonotic infectious disease caused by Monkeypox virus (MPXV).
Of the two MPXV clades, the Congo Basin strain is associated with severe disease, increased mortality, and increased human-to-human transmission relative to the West African strain. Monkeypox is endemic in regions of western and central Africa but was introduced into the United States in 2003 from the importation of infected animals. The threat of MPXV and other orthopoxviruses is increasing due to the absence of routine smallpox vaccination leading to a higher proportion of naive populations. In this study, we have identified and validated candidate genes that are required for MPXV infection, specifically, those associated with the Golgi-associated retrograde protein (GARP) complex. Identifying host targets required for infection that prevents extracellular virus formation such as the GARP complex or the retrograde pathway can provide a potential target for antiviral therapy.

      8. The immunoregulatory role of alpha enolase in dendritic cell function during Chlamydia infection
        Ryans K, Omosun Y, McKeithen DN, Simoneaux T, Mills CC, Bowen N, Eko FO, Black CM, Igietseme JU, He Q.
        BMC Immunol. 2017 May 19;18(1):27.
        BACKGROUND: We have previously reported that interleukin-10 (IL-10) deficient dendritic cells (DCs) are potent antigen presenting cells that induced elevated protective immunity against Chlamydia. To further investigate the molecular and biochemical mechanism underlying the superior immunostimulatory property of IL-10 deficient DCs, we performed proteomic analysis on protein profiles from Chlamydia-pulsed wild-type (WT) and IL-10-/- DCs to identify differentially expressed proteins with immunomodulatory properties. RESULTS: The results showed that alpha enolase (ENO1), a metabolic enzyme involved in the last step of glycolysis, was significantly upregulated in Chlamydia-pulsed IL-10-/- DCs compared to WT DCs. We further studied the immunoregulatory role of ENO1 in DC function by generating ENO1 knockdown DCs, using lentiviral siRNA technology. We analyzed the effect of the ENO1 knockdown on DC functions after pulsing with Chlamydia. Pyruvate assay, transmission electron microscopy, flow cytometry, confocal microscopy, cytokine, T-cell activation and adoptive transfer assays were also used to study DC function. The results showed that ENO1 knockdown DCs had impaired maturation and activation, with significant decrease in intracellular pyruvate concentration as compared with the Chlamydia-pulsed WT DCs. Adoptively transferred, Chlamydia-pulsed ENO1 knockdown DCs were poorly immunogenic in vitro and in vivo, especially in their ability to induce protective immunity against genital chlamydia infection. The marked remodeling of the mitochondrial morphology of Chlamydia-pulsed ENO1 knockdown DCs compared to the Chlamydia-pulsed WT DCs was associated with the dysregulation of translocase of the outer membrane (TOM) 20 and adenine nucleotide translocator (ANT) 1/2/3/4 that regulate mitochondrial permeability. The results suggest that an enhanced glycolysis is required for efficient antigen processing and presentation by DCs to induce a robust immune response.
CONCLUSIONS: The upregulation of ENO1 contributes to the superior immunostimulatory function of IL-10 deficient DCs. Our studies indicated that ENO1 deficiency causes the reduced production of pyruvate, which then contributes to a dysfunction in mitochondrial homeostasis that may affect DC survival, maturation and antigen presenting properties. Modulation of ENO1 thus provides a potentially effective strategy to boost DC function and promote immunity against infectious and non-infectious diseases.

      9. Laboratory-based performance evaluation of PIMA CD4+ T-lymphocyte count point-of-care by lay-counselors in Kenya
        Zeh C, Rose CE, Inzaule S, Desai MA, Otieno F, Humwa F, Akoth B, Omolo P, Chen RT, Kebede Y, Samandri T.
        J Immunol Methods. 2017 May 18.
        BACKGROUND: CD4+ T-lymphocyte count testing at the point-of-care (POC) may improve linkage to care of persons diagnosed with HIV-1 infection, but the accuracy of POC devices when operated by lay-counselors in the era of task-shifting is unknown. We examined the accuracy of Alere’s Pima POC device on both capillary and venous blood when performed by lay-counselors and laboratory technicians. METHODS: In Phase I, we compared the performance of POC against FACSCalibur for 280 venous specimens tested by laboratory technicians. In Phase II we compared POC performance by lay-counselors versus laboratory technicians using 147 paired capillary and venous specimens, and compared these to FACSCalibur. Statistical analyses included Bland-Altman analyses, concordance correlation coefficient, sensitivity, and specificity at treatment eligibility thresholds of 200, 350, and 500 cells/μL. RESULTS: Phase I: POC sensitivity and specificity were 93.0% and 84.1% at 500 cells/μL, respectively. Phase II: Good agreement was observed for venous POC results from both lay-counselors (concordance correlation coefficient (CCC)=0.873, bias -86.4 cells/μL) and laboratory technicians (CCC=0.920, bias -65.7 cells/μL). Capillary POC had good correlation: lay-counselors (CCC=0.902, bias -71.2 cells/μL), laboratory technicians (CCC=0.918, bias -63.0 cells/μL). Misclassification at the 500 cells/μL threshold for venous blood was 13.6% and 10.2% for lay-counselors and laboratory technicians, respectively, and 12.2% for capillary blood in both groups. POC tended to under-classify the CD4 values with increasingly negative bias at higher CD4 values. CONCLUSIONS: Pima results were comparable to FACSCalibur for both venous and capillary specimens when operated by lay-counselors. POC CD4 testing has the potential to improve linkage to HIV care without burdening laboratory technicians in resource-limited settings.
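The concordance correlation coefficient reported above is Lin's statistic, which penalizes both poor correlation and systematic bias, complementing the Bland-Altman bias values. A minimal sketch with hypothetical paired CD4 counts (illustrative values, not study data):

```python
import statistics

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient for paired measurements.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
    """
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    vx = sum((a - mx) ** 2 for a in x) / len(x)
    vy = sum((b - my) ** 2 for b in y) / len(y)
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Hypothetical paired CD4 counts (reference instrument vs point-of-care):
reference = [250, 400, 520, 610, 800]
poc = [230, 380, 470, 560, 700]
print(f"CCC = {lins_ccc(reference, poc):.3f}")
```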

    • Maternal and Child Health RSS Word feed
      1. Disaster preparedness in neonatal intensive care units
        Barfield WD, Krug SE.
        Pediatrics. 2017;139(5).
        Disasters disproportionally affect vulnerable, technology-dependent people, including preterm and critically ill newborn infants. It is important for health care providers to be aware of and prepared for the potential consequences of disasters for the NICU. Neonatal intensive care personnel can provide specialized expertise for their hospital, community, and regional emergency preparedness plans and can help develop institutional surge capacity for mass critical care, including equipment, medications, personnel, and facility resources.

      2. OBJECTIVE: A substantial percentage of children with congenital heart disease (CHD) fail to transfer to adult care, resulting in increased risk of morbidity and mortality. Transition planning discussions with a provider may increase rates of transfer, yet little is known about the frequency and content of these discussions. We assessed prevalence and predictors of transition-related discussions between providers and parents of children with special healthcare needs (CSHCN) and heart problems, including CHD. DESIGN: Using parent-reported data on 12- to 17-year-olds from the 2009-2010 National Survey of CSHCN, we calculated adjusted prevalence ratios (aPR) for associations between demographic factors and provider discussions on shift to adult care, future insurance, and adult healthcare needs, weighted to generate population-based estimates. RESULTS: Of the 5.3% of adolescents with heart problems in our sample (n = 724), 52.8% were female, 65.3% white, 62.2% privately insured, and 37.1% had medical homes. Less than 50% had parents who discussed with providers their child’s future health insurance (26.4%), shift to adult care (22.9%), and adult healthcare needs (49.0%). Transition planning did not differ between children with and without heart problems (aPR range: 1.0-1.1). Among parents of CSHCN with heart problems who did not have discussions, up to 66% desired one. Compared to 12-/13-year-olds, a larger percentage of 16-/17-year-olds had parents who discussed their shift to adult care (aPR 2.1, 95% confidence interval (CI) [1.1, 3.9]) and future insurance (aPR 1.8, 95% CI [1.1, 2.9]). Having a medical home was associated with discussing adult healthcare needs (aPR 1.5, 95% CI [1.2, 1.8]) and future insurance (aPR 1.8, 95% CI [1.3, 2.6]). CONCLUSIONS: Nationally, less than half of adolescents with heart problems had parents who discussed their child’s transition with providers, which could be contributing to the large percentage of CHD patients who do not successfully transfer to adult care.

      3. Trends and characteristics of fetal and neonatal mortality due to congenital anomalies, Colombia 1999-2008
        Roncancio CP, Misnaza SP, Pena IC, Prieto FE, Cannon MJ, Valencia D.
        J Matern Fetal Neonatal Med. 2017 May 22:1-8.
        OBJECTIVE: To describe fetal and neonatal mortality due to congenital anomalies in Colombia. METHODS: We analyzed all fetal and neonatal deaths due to a congenital anomaly registered with the Colombian vital statistics system during 1999-2008. RESULTS: The registry included 213,293 fetal deaths and 7,216,727 live births. Of the live births, 77,738 (1.08%) resulted in neonatal deaths. Congenital anomalies were responsible for 7321 fetal deaths (3.4% of all fetal deaths) and 15,040 neonatal deaths (19.3% of all neonatal deaths). The fetal mortality rate due to congenital anomalies was 9.9 per 10,000 live births and fetal deaths; the neonatal mortality rate due to congenital anomalies was 20.8 per 10,000 live births. Mortality rates due to congenital anomalies remained relatively stable during the study period. The most frequent fatal congenital anomalies were congenital heart defects (32.0%), central nervous system anomalies (15.8%), and chromosomal anomalies (8.0%). Risk factors for fetal and neonatal death included: male or undetermined sex, living in villages or rural areas, mother’s age >35 years, low and very low birthweight, and <28 weeks gestation at birth. CONCLUSIONS: Congenital anomalies are an important cause of fetal and neonatal deaths in Colombia, but many of the anomalies may be preventable or treatable.
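        The mortality rates quoted above follow directly from the registry counts given in the abstract; a quick arithmetic check (fetal rate per 10,000 live births and fetal deaths; neonatal rate per 10,000 live births):

```python
# Counts from the abstract (Colombian vital statistics, 1999-2008)
fetal_deaths_total = 213_293
live_births = 7_216_727
fetal_deaths_anomaly = 7_321
neonatal_deaths_anomaly = 15_040

# The fetal rate denominator includes live births plus fetal deaths;
# the neonatal rate denominator is live births alone.
fetal_rate = fetal_deaths_anomaly / (live_births + fetal_deaths_total) * 10_000
neonatal_rate = neonatal_deaths_anomaly / live_births * 10_000
print(round(fetal_rate, 1), round(neonatal_rate, 1))
```

Both computed values match the rates reported in the abstract (9.9 and 20.8 per 10,000).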

    • Mining RSS Word feed
      1. Ray tracing and modal methods for modeling radio propagation in tunnels with rough walls
        Zhou C.
        IEEE Trans Antennas Propag. 2017;65(5):2624-2634.
        At the ultrahigh frequencies common to portable radios, tunnels such as mine entries are often modeled as hollow dielectric waveguides. The roughness of the tunnel walls influences radio propagation and therefore should be taken into account when an accurate power prediction is needed. This paper investigates how wall roughness affects radio propagation in tunnels, and presents a unified ray tracing and modal method for modeling radio propagation in tunnels with rough walls. First, general analytical formulas for modeling the influence of wall roughness are derived, based on the modal method and the ray tracing method, respectively. Second, the equivalence of the ray tracing and modal methods in the presence of wall roughness is mathematically proved, by showing that the ray tracing-based analytical formula converges to the modal-based formula through the Poisson summation formula. The derivation and findings are verified by simulation results based on ray tracing and modal methods.
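        The equivalence proof cited above rests on the Poisson summation formula, which in its standard form relates the sum of a function over the integers to the sum of its Fourier transform:

```latex
\sum_{n=-\infty}^{\infty} f(n) \;=\; \sum_{k=-\infty}^{\infty} \hat{f}(k),
\qquad
\hat{f}(k) = \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i k x}\, dx .
```

Applied in this setting, the discrete sum over ray images (successive wall reflections) is converted into a sum over waveguide modes, which is why the two propagation models agree.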

    • Nutritional Sciences RSS Word feed
      1. Support for food and beverage worksite wellness strategies and sugar-sweetened beverage intake among employed U.S. adults
        Lee-Kwan SH, Pan L, Kimmons J, Foltz J, Park S.
        Am J Health Promot. 2017 Mar;31(2):128-135.
        PURPOSE: Sugar-sweetened beverage (SSB) consumption is high among U.S. adults and is associated with obesity. Given that more than 100 million Americans consume food or beverages at work daily, the worksite may be a venue for interventions to reduce SSB consumption. However, the level of support for these interventions is unknown. We examined associations between workday SSB intake and employees’ support for worksite wellness strategies (WWSs). DESIGN: We conducted a cross-sectional study using data from Web-based annual surveys that gather information on health-related attitudes and behaviors. SETTING: Study setting was the United States. SUBJECTS: A total of 1924 employed adults (≥18 years) selected using probability-based sampling. MEASURES: The self-reported independent variable was workday SSB intake (0, <1, or ≥1 times per day), and dependent variables were employees’ support (yes/no) for the following WWSs: (1) accessible free water, (2) affordable healthy food/drink, (3) available healthy options, and (4) less available SSB. ANALYSIS: Multivariable logistic regression was used to control for sociodemographic variables, employee size, and availability of cafeteria/vending machine. RESULTS: About half of employees supported accessible free water (54%), affordable healthy food/drink (49%), and available healthy options (46%), but only 28% supported less available SSB. Compared with non-SSB consumers, daily SSB consumers were significantly less supportive of accessible free water (adjusted odds ratio, 0.67; p < .05) or less available SSB (odds ratio, 0.49; p < .05). CONCLUSION: Almost half of employees supported increasing healthy options within worksites, although daily workday SSB consumers were less supportive of certain strategies. Lack of support could be a potential barrier to the successful implementation of certain worksite interventions.

    • Parasitic Diseases RSS Word feed
      1. Detecting infection hotspots: Modeling the surveillance challenge for elimination of lymphatic filariasis
        Harris JR, Wiegand RE.
        PLoS Negl Trop Dis. 2017 May 19;11(5):e0005610.
        BACKGROUND: During the past 20 years, enormous efforts have been expended globally to eliminate lymphatic filariasis (LF) through mass drug administration (MDA). However, small endemic foci (microfoci) of LF may threaten the presumed inevitable decline of infections after MDA cessation. We conducted microsimulation modeling to assess the ability of different types of surveillance to identify microfoci in these settings. METHODS: Five or ten microfoci of radius 1, 2, or 3 km with infection marker prevalence (intensity) of 3, 6, or 10 times background prevalence were placed in spatial simulations, run in R Version 3.2. Diagnostic tests included microfilaremia, immunochromatographic test (ICT), and Wb123 ELISA. Population size was fixed at 360,000 in a 60 x 60 km area; demographics were based on literature for Sub-Saharan African populations. Background ICT prevalence in 6-7 year olds was anchored at 1.0%, and the prevalence in the remaining population was adjusted by age. Adults ≥18 years, women aged 15-40 years (WCBA), children aged 6-7 years, or children ≤5 years were sampled. Cluster (CS), simple random sampling (SRS), and TAS-like sampling were simulated, with follow-up testing of the nearest 20, 100, or 500 persons around each infection-marker-positive person. A threshold number of positive persons in follow-up testing indicated a suspected microfocus. Suspected microfoci identified during surveillance and actual microfoci in the simulation were compared to obtain a predictive value positive (PVP). Each parameter set was referred to as a protocol. Protocols were scored by efficiency, defined as the most microfoci identified, the fewest persons requiring primary and follow-up testing, and the highest PVP. Negative binomial regression was used to estimate aggregate effects of different variables on efficiency metrics. RESULTS: All variables were significantly associated with efficiency metrics. Additional follow-up tests beyond 20 did not greatly increase the number of microfoci detected, but significantly negatively impacted efficiency. Of 3,402 protocols evaluated, 384 (11.3%) identified all five microfoci (PVP 3.4-100.0%) and required testing 0.73-35.6% of the population. All used SRS, and 378 (98.4%) only identified all five microfoci if they were 2-3 km diameter or high-intensity (6x or 10x); 374 (97.4%) required ICT or Wb123 testing to identify all five microfoci, and 281 (73.0%) required sampling adults or WCBA. The most efficient CS protocols identified two (40%) microfoci. After limiting to protocols with 1-km radius microfoci of 3x intensity (n = 378), eight identified all five microfoci; all used SRS and ICT and required testing 31.2-33.3% of the population. The most efficient CS and TAS-like protocols, as well as those using microfilaremia testing, identified only one (20%) microfocus when limited to 1-km radius and 3x intensity. CONCLUSION: In this model, SRS, ICT, and sampling of adults maximized microfocus detection efficiency. Follow-up sampling of more persons did not necessarily increase protocol efficiency. Current approaches towards surveillance, including TAS, may not detect small, low-intensity LF microfoci that could remain after cessation of MDA. The model provides many surveillance protocols that can be selected for optimal outcomes.
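        The predictive value positive (PVP) used to score protocols above is, under the standard definition, the fraction of suspected microfoci that correspond to actual microfoci; a minimal sketch with hypothetical counts:

```python
def predictive_value_positive(true_detected, suspected_total):
    """PVP = microfoci confirmed as real / all microfoci flagged by surveillance."""
    if suspected_total == 0:
        return float("nan")  # no suspected microfoci: PVP is undefined
    return true_detected / suspected_total

# Hypothetical run: a protocol flags 8 suspected microfoci, of which 5 are real
pvp_percent = 100 * predictive_value_positive(5, 8)
print(round(pvp_percent, 1))
```

A protocol that flags many false microfoci drives the PVP down even if it finds every real one, which is why PVP appears alongside the testing burden in the efficiency score.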

      2. Interpreting data from passive surveillance of antimalarial treatment failures
        Plucinski MM, Halsey ES, Venkatesan M, Kachur SP, Arguin P.
        Antimicrob Agents Chemother. 2017 Jun;61(6).
        [No abstract]

    • Public Health Law RSS Word feed
      1. An overview of state policies supporting worksite health promotion programs
        VanderVeur J, Gilchrist S, Matson-Koffman D.
        Am J Health Promot. 2017 May;31(3):232-242.
        PURPOSE: Worksite health promotion (WHP) programs can reduce the occurrence of cardiovascular disease risk factors. State law can encourage employers and employer-provided insurance companies to offer comprehensive WHP programs. This research examines state law authorizing WHP programs. DESIGN: Quantitative content analysis. SETTING: Worksites or workplaces. SUBJECTS: United States (and the District of Columbia). INTERVENTION: State law in effect in 2013 authorizing WHP programs. MEASURES: Frequency and distribution of states with WHP laws. ANALYSIS: To determine the content of the laws for analysis and coding, we identified 18 policy elements, 12 from the Centers for Disease Control and Prevention’s Worksite Health ScoreCard (HSC) and 6 additional supportive WHP strategies. We used these strategies as key words to search for laws authorizing WHP programs or select WHP elements. We calculated the number and type of WHP elements for each state with WHP laws and selected two case examples from states with comprehensive WHP laws. RESULTS: Twenty-four states authorized onsite WHP programs, 29 authorized WHP through employer-provided insurance plans, and 18 authorized both. Seven states had a comprehensive WHP strategy, addressing 8 or more of 12 HSC elements. The most common HSC elements were weight management, tobacco cessation, and physical activity. CONCLUSION: Most states had laws encouraging the adoption of WHP programs. Massachusetts and Maine are implementing comprehensive WHP laws but studies evaluating their health impact are needed.

    • Reproductive Health RSS Word feed
      1. OBJECTIVE: We sought to depict the topical, geospatial, and temporal diffusion of the 2015 North American Menopause Society position statement on the nonhormonal management of menopause-associated vasomotor symptoms released on September 21, 2015, and its associated press release from September 23, 2015. METHODS: Three data sources were used: online news articles, National Public Radio, and Twitter. For topical diffusion, we compared keywords and their frequencies among the position statement, press release, and online news articles. We also created a network figure depicting relationships across key content categories or nodes. For geospatial diffusion within the United States, we compared locations of the 109 National Public Radio (NPR) stations covering the statement to 775 NPR stations not covering the statement. For temporal diffusion, we normalized and segmented Twitter data into periods before and after the press release (September 12, 2015 to September 22, 2015 vs September 23, 2015 to October 3, 2015) and conducted a burst analysis to identify changes in tweets from before to after. RESULTS: Topical information diffused across sources was similar with the exception of the more scientific terms “vasomotor symptoms” or “vms” versus the more colloquial term “hot flashes.” Online news articles indicated media coverage of the statement was mainly concentrated in the United States. NPR station data showed similar proportions of stations airing the story across the four census regions (Northeast, Midwest, South, West; P = 0.649). Release of the statement coincided with bursts in the menopause conversation on Twitter. CONCLUSIONS: The findings of this study may be useful for directing the development and dissemination of future North American Menopause Society position statements and/or press releases.

      2. Population attributable fraction of tubal factor infertility associated with chlamydia
        Gorwitz RJ, Wiesenfeld HC, Chen PL, Hammond KR, Sereday KA, Haggerty CL, Johnson RE, Papp JR, Kissin DM, Henning TC, Hook EW, Steinkampf MP, Markowitz LE, Geisler WM.
        Am J Obstet Gynecol. 2017 May 19.
        BACKGROUND: Chlamydia trachomatis infection is highly prevalent among young women in the United States. Prevention of long-term sequelae of infection, including tubal factor infertility, is a primary goal of chlamydia screening and treatment activities. However, the population attributable fraction of tubal factor infertility associated with chlamydia is unclear, and optimal measures for assessing tubal factor infertility and prior chlamydia in epidemiologic studies have not been established. Black women have increased rates of chlamydia and tubal factor infertility compared to white women, but have been underrepresented in prior studies of the association of chlamydia and tubal factor infertility. OBJECTIVES: To estimate the population attributable fraction of tubal factor infertility associated with Chlamydia trachomatis infection by race (black, non-black), and assess how different definitions of C. trachomatis seropositivity and tubal factor infertility affect population attributable fraction estimates. STUDY DESIGN: We conducted a case-control study, enrolling infertile women attending infertility practices in Birmingham, AL and Pittsburgh, PA during October 2012 – June 2015. Tubal factor infertility case status was primarily defined by unilateral or bilateral fallopian tube occlusion (cases) or bilateral fallopian tube patency (controls) on hysterosalpingogram. Alternate tubal factor infertility definitions incorporated history suggestive of tubal damage or were based on laparoscopic evidence of tubal damage. We aimed to enroll all eligible women, with an expected ratio of one and three controls per case for black and non-black women, respectively. We assessed C. trachomatis seropositivity with a commercial assay and a more sensitive research assay; our primary measure of seropositivity was defined as positivity on either assay. We estimated C. trachomatis seropositivity and calculated C. trachomatis-tubal factor infertility odds ratios and population attributable fraction, stratified by race. RESULTS: We enrolled 107 black women (47 cases, 60 controls) and 620 non-black women (140 cases, 480 controls). C. trachomatis seropositivity by either assay was 81% (95% confidence interval 73%, 89%) among black and 31% (95% confidence interval 28%, 35%) among non-black participants (P<0.001). Using the primary C. trachomatis seropositivity and tubal factor infertility definitions, no significant association was detected between chlamydia and tubal factor infertility among blacks (odds ratio 1.22, 95% confidence interval 0.45, 3.28) or non-blacks (odds ratio 1.41, 95% confidence interval 0.95, 2.09), and the estimated population attributable fraction was 15% (95% confidence interval -97%, 68%) among blacks and 11% (95% confidence interval -3%, 23%) among non-blacks. Use of alternate serologic measures and tubal factor infertility definitions impacted the magnitude of the chlamydia-tubal factor infertility association, and resulted in a significant association among non-blacks. CONCLUSIONS: Low population attributable fraction estimates suggest factors in addition to chlamydia contribute to tubal factor infertility in the study population. However, high background C. trachomatis seropositivity among controls, most striking among black participants, could have obscured an association with tubal factor infertility and resulted in a population attributable fraction that underestimates the true etiologic role of chlamydia. Choice of chlamydia and tubal factor infertility definitions also impacts odds ratio and population attributable fraction estimates.
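        One common case-control estimator of the population attributable fraction is Miettinen’s formula, PAF = p_c(OR − 1)/OR, where p_c is the exposure prevalence among cases; the sketch below uses illustrative inputs and is not necessarily the estimator the study authors applied:

```python
def paf_case_control(p_cases_exposed, odds_ratio):
    """Miettinen's case-control PAF: exposure prevalence among cases
    multiplied by the attributable fraction among the exposed, (OR - 1)/OR."""
    return p_cases_exposed * (odds_ratio - 1.0) / odds_ratio

# Illustrative values only: 35% of cases seropositive, OR = 1.41
print(round(paf_case_control(0.35, 1.41), 3))
```

Note that when the OR is close to 1 (as in both strata reported above), the PAF is small and its confidence interval can span zero, which is consistent with the wide intervals the authors report.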

    • Substance Use and Abuse RSS Word feed
      1. Increases in prescription opioid injection abuse among treatment admissions in the United States, 2004-2013
        Jones CM, Christensen A, Gladden RM.
        Drug Alcohol Depend. 2017 May 16;176:89-95.
        BACKGROUND: The 2015 HIV outbreak in Indiana associated with prescription opioid injection, coupled with rising rates of hepatitis C, especially in areas with long-standing opioid abuse, has raised concerns about prescription opioid injection. However, research on this topic is limited. We assessed trends in treatment admissions reporting injection, smoking, and inhalation abuse of prescription opioids and examined characteristics associated with non-oral routes of prescription opioid abuse in the U.S. METHODS: Prescription opioid abuse treatment admissions in the 2004-2013 Treatment Episode Data Set were used to calculate counts and percentages of prescription opioid treatment admissions reporting oral, injection, or smoking/inhalation abuse overall, by sex, age, and race/ethnicity. Multivariable multinomial logistic regression was used to identify demographic and substance use characteristics associated with injection or smoking/inhalation abuse. RESULTS: From 2004-2013, oral abuse decreased from 73.1% to 58.9% of admissions; injection abuse increased from 11.7% to 18.1%; and smoking/inhalation abuse increased from 15.3% to 23.0%. Among treatment admissions, the following were associated with injection abuse: male sex, 18-54 year-olds, non-Hispanic whites, non-Hispanic other, homeless or dependent living, less than full-time work, living in the Midwest or South, ≥1 prior treatment episodes, younger age of first opioid use, and reporting use of cocaine/crack, marijuana, heroin, or methamphetamine. CONCLUSIONS: The proportion of treatment admissions reporting prescription opioid injection and smoking/inhalation abuse increased significantly in the U.S. between 2004 and 2013. Expanding prevention efforts as well as access to medication-assisted treatment and risk reduction services for people who inject drugs is urgently needed.

      2. Perceptions of harm to children exposed to secondhand aerosol from electronic vapor products, Styles Survey, 2015
        Nguyen KH, Tong VT, Marynak K, King BA.
        Prev Chronic Dis. 2017 May 25;14:E41.
        INTRODUCTION: The US Surgeon General has concluded that e-cigarette aerosol is not harmless and can contain harmful and potentially harmful chemicals, including nicotine. We assessed factors associated with adults’ perceptions of harm related to children’s exposure to secondhand aerosol from electronic vapor products (EVPs). METHODS: Data came from the 2015 Styles, an Internet panel survey of US adults aged 18 years or older (n = 4,127). Respondents were asked whether they believe aerosol from other people’s EVPs causes children harm. Harm perceptions were assessed overall and by cigarette smoking, EVP use, and sociodemographic characteristics. Multinomial logistic regression was used to assess odds of perceived harm. RESULTS: Overall, 5.3% of adults responded that secondhand EVP exposure caused “no harm” to children, 39.9% responded “little harm” or “some harm,” 21.5% responded “a lot of harm,” and 33.3% responded “don’t know.” Odds of “no harm” response were greater among men than among women, current and former cigarette smokers than among never smokers, and current and former EVP users than among never users; odds were lower among non-Hispanic blacks, Hispanics, and non-Hispanic other races than among non-Hispanic whites. Odds of responding “don’t know” were greater among men, current cigarette smokers, and current and former EVP users; odds were lower among those aged 45 to 64 years than those aged 18 to 24 years and lower among non-Hispanic other races and Hispanics than non-Hispanic whites. CONCLUSION: Two-fifths of US adults believe that children’s exposure to secondhand EVP aerosol causes some or little harm, while one-third do not know whether it causes harm. Efforts are warranted to educate the public about the health risks of secondhand EVP aerosol, particularly for children.

      3. BACKGROUND: Approximately 70% of current (past 30-day) adult marijuana users are current tobacco users, which may complicate tobacco cessation. We assessed prevalence and trends in tobacco cessation among adult ever tobacco users, by marijuana use status. METHODS: Data came from the National Survey on Drug Use and Health, a cross-sectional, nationally representative, household survey of U.S. civilians. Analyses included current, former, and never marijuana users aged ≥18 reporting ever tobacco use (cigarette, cigar, chew/snuff). We computed weighted estimates (2013-2014) of current tobacco use, recent tobacco cessation (quit 30 days to 12 months), and sustained tobacco cessation (quit >12 months), and adjusted trends in tobacco use and cessation (2005-2014) by marijuana use status. We also assessed the association between marijuana and tobacco use status. RESULTS: In 2013-2014, among current adult marijuana users reporting ever tobacco use, 69.1% were current tobacco users (vs. 38.5% of former marijuana users, p<0.0001, and 28.2% of never marijuana users, p<0.0001); 9.1% reported recent tobacco cessation (vs. 8.4% of former marijuana users, p<0.01, and 6.3% of never marijuana users, p<0.001), and 21.8% reported sustained tobacco cessation (vs. 53.1% of former marijuana users, p<0.01, and 65.5% of never marijuana users, p<0.0001). Between 2005 and 2014, current tobacco use declined and sustained tobacco cessation increased among all marijuana use groups. CONCLUSIONS: Current marijuana users who ever used tobacco had double the prevalence (vs. never marijuana users) of current tobacco use, and significantly lower sustained abstinence. Interventions addressing tobacco cessation in the context of use of marijuana and other substances may be warranted.

      4. Biomarkers of exposure to new and emerging tobacco and nicotine delivery products
        Schick SF, Blount BC, Jacob P, Saliba NA, Bernert JT, El Hellani A, Jatlow P, Pappas RS, Wang L, Foulds J, Ghosh A, Hecht SS, Gomez JC, Martin JR, Mesaros C, Srivastava S, St Helen G, Tarran R, Lorkiewicz PK, Blair IA, Kimmel HL, Doerschuk CM, Benowitz NL, Bhatnagar A.
        Am J Physiol Lung Cell Mol Physiol. 2017 May 18:ajplung.00343.2016.
        Accurate and reliable measurements of exposure to tobacco products are essential for identifying and confirming patterns of tobacco product use and for assessing their potential biological effects in both human populations and experimental systems. Due to the introduction of new tobacco-derived products and the development of novel ways to modify and use conventional tobacco products, precise and specific assessments of exposure to tobacco are now more important than ever. Biomarkers that were developed and validated to measure exposure to cigarettes are being evaluated to assess their utility for measuring exposure to these new products. Here, we review current methods for measuring exposure to new and emerging tobacco products, such as electronic cigarettes, little cigars, water pipe and cigarillos. Rigorously validated biomarkers specific to these new products are yet to be identified. Here, we discuss the strengths and limitations of current approaches, including whether or not they provide reliable exposure estimates. We provide specific guidance for choosing practical and economical biomarkers for different study designs and experimental conditions. Our goal is to help both new and experienced investigators measure exposure to tobacco products accurately, while avoiding common experimental errors. By identifying the capacity gaps in biomarker research on new and emerging tobacco products, we hope to provide researchers, policy makers and funding agencies with a clear action plan for conducting and promoting research on the patterns of use and health effects of these products.

      5. A novel public health approach to measuring tobacco cessation needs among cancer survivors in Alaska
        Underwood JM, Hyde-Rolland SJ, Thorsness J, Stewart SL.
        J Community Health. 2017 May 20.
        Cancer survivors who continue to smoke have poorer response to treatment, higher risk for future cancers, and lower survival rates than those who quit tobacco after diagnosis. Despite the increased risk for negative health outcomes, tobacco use among Alaskan cancer survivors is 19%, among the highest in the nation. To characterize and address tobacco cessation needs among cancer survivors who called a quit line for help in quitting tobacco, Alaska’s Comprehensive Cancer Control program initiated a novel partnership with the state’s Tobacco Quit Line. Alaska’s Tobacco Quit Line, a state-funded resource that provides confidential coaching, support, and nicotine replacement therapies for Alaskan adults who wish to quit using tobacco, was used to collect demographic characteristics, health behaviors, cessation referral methods, and other information on users. From September 2013-December 2014, the Alaska Quit Line included questions about previous cancer status and other chronic conditions to assess this information from cancer survivors who continue to use tobacco. Alaska’s Tobacco Quit Line interviewed 3,141 smokers, 129 (4%) of whom were previously diagnosed with cancer. Most cancer survivors who called in to the quit line were female (72%), older than 50 years of age (65%), white (67%), and smoked cigarettes (95%). Cancer survivors reported a higher prevalence of asthma, COPD, and heart disease than the non-cancer cohort. Approximately 34% of cancer survivors were referred to the quit line by a health care provider. This report illustrates the need for health care provider awareness of persistent tobacco use among cancer survivors in Alaska. It also provides a sound methodologic design for assessing ongoing tobacco cessation needs among cancer survivors who call a quit line. This survey methodology can be adapted by other public health programs to address needs and increase healthy behaviors among individuals with chronic disease.

    • Zoonotic and Vectorborne Diseases RSS Word feed
      1. Rapid and specific detection of Asian- and African-lineage Zika viruses
        Chotiwan N, Brewster CD, Magalhaes T, Weger-Lucarelli J, Duggal NK, Ruckert C, Nguyen C, Garcia Luna SM, Fauver JR, Andre B, Gray M, Black WC, Kading RC, Ebel GD, Kuan G, Balmaseda A, Jaenisch T, Marques ET, Brault AC, Harris E, Foy BD, Quackenbush SL, Perera R, Rovnak J.
        Sci Transl Med. 2017 May 03;9(388).
        Understanding the dynamics of Zika virus transmission and formulating rational strategies for its control require precise diagnostic tools that are also appropriate for resource-poor environments. We have developed a rapid and sensitive loop-mediated isothermal amplification (LAMP) assay that distinguishes Zika viruses of Asian and African lineages. The assay does not detect chikungunya virus or flaviviruses such as dengue, yellow fever, or West Nile viruses. The assay conditions allowed direct detection of Zika virus RNA in cultured infected cells; in mosquitoes; in virus-spiked samples of human blood, plasma, saliva, urine, and semen; and in infected patient serum, plasma, and semen samples without the need for RNA isolation or reverse transcription. The assay offers rapid, specific, sensitive, and inexpensive detection of the Asian-lineage Zika virus strain that is currently circulating in the Western hemisphere, and can also detect the African-lineage Zika virus strain using separate, specific primers.

      2. Elevation as a proxy for mosquito-borne Zika virus transmission in the Americas
        Watts AG, Miniota J, Joseph HA, Brady OJ, Kraemer MU, Grills AW, Morrison S, Esposito DH, Nicolucci A, German M, Creatore MI, Nelson B, Johansson MA, Brunette G, Hay SI, Khan K, Cetron M.
        PLoS One. 2017 ;12(5):e0178211.
        INTRODUCTION: When Zika virus (ZIKV) first began its spread from Brazil to other parts of the Americas, national-level travel notices were issued, carrying with them significant economic consequences to affected countries. Although regions of some affected countries were likely unsuitable for mosquito-borne transmission of ZIKV, the absence of high quality, timely surveillance data made it difficult to confidently demarcate infection risk at a sub-national level. In the absence of reliable data on ZIKV activity, a pragmatic approach was needed to identify subnational geographic areas where the risk of ZIKV infection via mosquitoes was expected to be negligible. To address this urgent need, we evaluated elevation as a proxy for mosquito-borne ZIKV transmission. METHODS: For sixteen countries with local ZIKV transmission in the Americas, we analyzed (i) modelled occurrence of the primary vector for ZIKV, Aedes aegypti, (ii) human population counts, and (iii) reported historical dengue cases, specifically across 100-meter elevation levels between 1,500m and 2,500m. Specifically, we quantified land area, population size, and the number of observed dengue cases above each elevation level to identify a threshold where the predicted risks of encountering Ae. aegypti become negligible. RESULTS: Above 1,600m, less than 1% of each country’s total land area was predicted to have Ae. aegypti occurrence. Above 1,900m, less than 1% of each country’s resident population lived in areas where Ae. aegypti was predicted to occur. Across all 16 countries, 1.1% of historical dengue cases were reported above 2,000m. DISCUSSION: These results suggest low potential for mosquito-borne ZIKV transmission above 2,000m in the Americas. Although elevation is a crude predictor of environmental suitability for ZIKV transmission, its constancy made it a pragmatic input for policy decision-making during this public health emergency.
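        The threshold analysis described above amounts to tabulating, for each candidate elevation cutoff, the share of population (or dengue cases) located above it; a toy sketch with entirely made-up grid cells, not the study’s data:

```python
# Each cell: (elevation_m, population, dengue_cases) -- illustrative values only
cells = [
    (500, 120_000, 900),
    (1_200, 80_000, 300),
    (1_700, 40_000, 40),
    (2_100, 10_000, 5),
    (2_600, 2_000, 0),
]

def share_above(cells, cutoff_m):
    """Fraction of total population and of dengue cases above an elevation cutoff."""
    pop_total = sum(p for _, p, _ in cells)
    case_total = sum(c for _, _, c in cells)
    pop_hi = sum(p for e, p, _ in cells if e > cutoff_m)
    case_hi = sum(c for e, _, c in cells if e > cutoff_m)
    return pop_hi / pop_total, case_hi / case_total

pop_frac, case_frac = share_above(cells, 2_000)
print(round(100 * pop_frac, 1), round(100 * case_frac, 1))
```

Sweeping the cutoff from 1,500 m to 2,500 m in 100 m steps, as the authors did, identifies the elevation at which both shares become negligible.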

Back to Top

CDC Science Clips Production Staff

  • John Iskander, MD MPH, Editor
  • Gail Bang, MLIS, Librarian
  • Kathy Tucker, Librarian
  • William (Bill) Thomas, MLIS, Librarian
  • Onnalee Gomez, MS, Health Scientist
  • Jarvis Sims, MIT, MLIS, Librarian

____

DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinion, findings and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC Websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.
