

CDC Science Clips: Volume 9, Issue 29, July 25, 2017

Science Clips is produced weekly to enhance awareness of emerging scientific knowledge for the public health community. Each article features an Altmetric Attention score to track social and mainstream media mentions.

  1. Top Articles of the Week

    Selected weekly by a senior CDC scientist from the standard sections listed below.

    The names of CDC authors are indicated in bold text.
    • Chronic Diseases and Conditions
      • Raw milk consumption and other early-life farm exposures and adult pulmonary function in the Agricultural Lung Health Study
        Wyss AB, House JS, Hoppin JA, Richards M, Hankinson JL, Long S, Henneberger PK, Beane Freeman LE, Sandler DP, O’Connell EL, Cummings CB, Umbach DM, London SJ.
        Thorax. 2017 Jul 08.
        Literature suggests that early exposure to the farming environment protects against atopy and asthma; few studies have examined pulmonary function. We evaluated associations between early-life farming exposures and pulmonary function in 3061 adults (mean age=63) from a US farming population using linear regression. Childhood raw milk consumption was associated with higher FEV1 (beta=49.5 mL, 95% CI 2.8 to 96.1 mL, p=0.04) and FVC (beta=66.2 mL, 95% CI 13.2 to 119.1 mL, p=0.01). We did not find appreciable associations with other early-life farming exposures. We report a novel association between raw milk consumption and higher pulmonary function that lasts into older adulthood.

    • Communicable Diseases
      • Sexually transmitted disease testing of human immunodeficiency virus-infected men who have sex with men: Room for improvement
        Dean BB, Scott M, Hart R, Battalora L, Novak RM, Durham MD, Brooks JT, Buchacz K.
        Sex Transm Dis. 2017;28.
        BACKGROUND: In the United States, sexually transmitted infection (STI) testing is recommended at least annually for sexually active men who have sex with men (MSM). We evaluated human immunodeficiency virus (HIV) providers’ STI testing practices and frequency of positive test results. METHODS: We analyzed data from HIV Outpatient Study (HOPS) participants who, from 2007 to 2014, completed a confidential survey about risk behaviors. Using medical records data, we assessed the frequency of gonorrhea, chlamydia, and syphilis testing and positive results during the year after the survey for MSM who reported sex without a condom in the prior 6 months. We compared testing frequency and positivity for men having 1, 2 to 3, and 4 or more sexual partners. Correlates of STI testing were assessed using a general linear model to derive relative risks (RR) with associated 95% confidence intervals (CI). RESULTS: Among 719 MSM, testing frequency was 74.5%, 74.3%, and 82.9% for gonorrhea, chlamydia, and syphilis, respectively, and was higher among men who reported more sexual partners (P < 0.001 for all). In multivariable analysis, testing for gonorrhea was significantly more likely among non-Hispanic black versus white men (RR, 1.17; 95% CI, 1.03-1.33), among men seen in private versus public clinics (RR, 1.16; 95% CI, 1.05-1.28), and among men with 2 to 3 and 4 or more sexual partners versus 1 partner (RR, 1.12; 95% CI, 1.02-1.23, and RR, 1.18; 95% CI, 1.08-1.30, respectively). Correlates of chlamydia and syphilis testing were similar. Test positivity was higher among men with more sexual partners: for gonorrhea, 0.0%, 3.0%, and 6.7% for men with 1, 2 to 3, and 4 or more partners, respectively (P < 0.001); for syphilis, 3.7%, 3.8%, and 12.5%, respectively (P < 0.001). CONCLUSIONS: Among HIV-infected MSM in HIV care who reported sex without a condom, subsequent STI testing was not documented in clinic records during the following year for up to a quarter of patients. Exploring why STI testing did not occur may improve patient care.

      • The effect of oral pre-exposure prophylaxis on the progression of HIV-1 seroconversion
        Donnell D, Ramos E, Celum C, Baeten J, Dragavon J, Tappero J, Lingappa JR, Ronald A, Fife K, Coombs RW.
        AIDS. 2017 Jul 07.
        OBJECTIVE: To investigate whether oral pre-exposure prophylaxis (PrEP) alters timing and patterns of seroconversion when PrEP use continues after HIV-1 infection. DESIGN: Retrospective testing of the timing of Fiebig stage HIV-1 seroconversion in the Partners PrEP Study, a randomized placebo-controlled clinical trial of PrEP conducted in Kenya and Uganda. METHODS: Specimens from 138 seroconverters were collected every 3 months and when HIV-1 infection was suspected based on monthly rapid HIV-1 tests. Progression of seroconversion was compared between randomized groups (PrEP versus placebo) and per-protocol groups (placebo versus PrEP participants with detectable tenofovir during the seroconversion period) using laboratory assessment of Fiebig stage. Delay in site detection of seroconversion and association with PrEP drug-regimen resistant virus were assessed using logistic regression. Analysis of time to each Fiebig stage used maximum likelihood estimation with a parametric model to accommodate the varying lengths of HIV-infection intervals. RESULTS: There was a significant increase in delayed site detection of infection associated with PrEP (OR = 3.49, p = 0.044). Delay in detection was not associated with increased risk of resistance in the PrEP arm (OR = 0.93, p = 0.95). Estimated time to each Fiebig stage was elongated in seroconverters with evidence of ongoing PrEP use, significantly only for Stage 5 (28 days versus 17 days, p = 0.05). Adjusted for Fiebig stage, viral RNA was approximately two-thirds of a log lower in those assigned to PrEP compared to placebo; no differences were found in Architect S/CO at any stage. CONCLUSIONS: Ongoing PrEP use in seroconverters may delay detection of infection and elongate seroconversion, although the delay does not increase risk of resistance.

      • Insights from a systematic search for information on designs, costs, and effectiveness of poliovirus environmental surveillance systems
        Duintjer Tebbens RJ, Zimmermann M, Pallansch MA, Thompson KM.
        Food Environ Virol. 2017 Jul 07.
        Poliovirus surveillance plays a critical role in achieving and certifying eradication and will play a key role in the polio endgame. Environmental surveillance can provide an opportunity to detect circulating polioviruses prior to the observation of any acute flaccid paralysis cases. We completed a systematic review of peer-reviewed publications on environmental surveillance for polio including the search terms “environmental surveillance” or “sewage,” and “polio,” “poliovirus,” or “poliomyelitis,” and compared characteristics of the resulting studies. The review included 146 studies representing 101 environmental surveillance activities from 48 countries published between 1975 and 2016. Studies reported taking samples from sewage treatment facilities, surface waters, and various other environmental sources, although they generally did not present sufficient details to thoroughly evaluate the sewage systems and catchment areas. When reported, catchment areas varied from 50 to over 7.3 million people (median of 500,000 for the 25% of activities that reported catchment areas; notably, 60% of the studies did not report this information and 16% reported insufficient information to estimate the catchment area population size). While numerous studies reported the ability of environmental surveillance to detect polioviruses in the absence of clinical cases, the review revealed very limited information about the costs and limited information to support quantitative population effectiveness of conducting environmental surveillance. This review motivates future studies to better characterize poliovirus environmental surveillance systems and the potential value of information that they may provide in the polio endgame.

      • INTRODUCTION: The burden of HIV infection and health outcomes for people living with HIV varies across the United States. New methods allow for estimating national and state-level HIV incidence, prevalence, and undiagnosed infections using surveillance data and CD4 values. METHODS: HIV surveillance data reported to the Centers for Disease Control and Prevention and the first CD4 value after diagnosis were used to estimate the distribution of delay from infection to diagnosis based on a well-characterized CD4 depletion model. This distribution was used to estimate HIV incidence, prevalence, and undiagnosed infections during 2010-2014. Estimated annual percentage changes were calculated to assess trends. RESULTS: During 2010-2014, HIV incidence decreased 10.3% (EAPC = -3.1%) and the percentage of undiagnosed infection decreased from 17.1% to 15.0% (EAPC = -3.3%) in the United States; HIV prevalence increased 9.1% (EAPC = 2.2%). Among 36 jurisdictions with sufficient data to produce stable estimates, HIV incidence decreased in 3 jurisdictions (Georgia, New York, and District of Columbia) and the percentage of undiagnosed HIV infections decreased in 2 states (Texas and Georgia). HIV prevalence increased in 4 states (California, Florida, Georgia, and Texas). In 2014, Southern states accounted for 50% of both new HIV infections and undiagnosed infections. CONCLUSION: HIV incidence and undiagnosed infection decreased in the United States during 2010-2014; however, outcomes varied by state and region. Progress in national HIV prevention is encouraging but intensified efforts for testing and treatment are needed in the South and states with high percentages of undiagnosed infection.

      • Global, regional, and national disease burden estimates of acute lower respiratory infections due to respiratory syncytial virus in young children in 2015: a systematic review and modelling study
        Shi T, McAllister DA, O’Brien KL, Simoes EA, Madhi SA, Gessner BD, Polack FP, Balsells E, Acacio S, Aguayo C, Alassani I, Ali A, Antonio M, Awasthi S, Awori JO, Azziz-Baumgartner E, Baggett HC, Baillie VL, Balmaseda A, Barahona A, Basnet S, Bassat Q, Basualdo W, Bigogo G, Bont L, Breiman RF, Brooks WA, Broor S, Bruce N, Bruden D, Buchy P, Campbell S, Carosone-Link P, Chadha M, Chipeta J, Chou M, Clara W, Cohen C, de Cuellar E, Dang DA, Dash-Yandag B, Deloria-Knoll M, Dherani M, Eap T, Ebruke BE, Echavarria M, de Freitas Lazaro Emediato CC, Fasce RA, Feikin DR, Feng L, Gentile A, Gordon A, Goswami D, Goyet S, Groome M, Halasa N, Hirve S, Homaira N, Howie SR, Jara J, Jroundi I, Kartasasmita CB, Khuri-Bulos N, Kotloff KL, Krishnan A, Libster R, Lopez O, Lucero MG, Lucion F, Lupisan SP, Marcone DN, McCracken JP, Mejia M, Moisi JC, Montgomery JM, Moore DP, Moraleda C, Moyes J, Munywoki P, Mutyara K, Nicol MP, Nokes DJ, Nymadawa P, da Costa Oliveira MT, Oshitani H, Pandey N, Paranhos-Baccala G, Phillips LN, Picot VS, Rahman M, Rakoto-Andrianarivelo M, Rasmussen ZA, Rath BA, Robinson A, Romero C, Russomando G, Salimi V, Sawatwong P, Scheltema N, Schweiger B, Scott JA, Seidenberg P, Shen K, Singleton R, Sotomayor V, Strand TA, Sutanto A, Sylla M, Tapia MD, Thamthitiwat S, Thomas ED, Tokarz R, Turner C, Venter M, Waicharoen S, Wang J, Watthanaworawit W, Yoshida LM, Yu H, Zar HJ, Campbell H, Nair H.
        Lancet. 2017 Jul 06.
        BACKGROUND: We have previously estimated that respiratory syncytial virus (RSV) was associated with 22% of all episodes of (severe) acute lower respiratory infection (ALRI), resulting in 55 000 to 199 000 deaths in children younger than 5 years in 2005. In the past 5 years, major research activity on RSV has yielded substantial new data from developing countries. With a considerably expanded dataset from a large international collaboration, we aimed to estimate the global incidence, hospital admission rate, and mortality from RSV-ALRI episodes in young children in 2015. METHODS: We estimated the incidence and hospital admission rate of RSV-associated ALRI (RSV-ALRI) in children younger than 5 years, stratified by age and World Bank income region, from a systematic review of studies published between Jan 1, 1995, and Dec 31, 2016, and unpublished data from 76 high quality population-based studies. We estimated the RSV-ALRI incidence for 132 developing countries using a risk factor-based model and 2015 population estimates. We estimated in-hospital RSV-ALRI mortality by combining in-hospital case fatality ratios with hospital admission estimates from hospital-based (published and unpublished) studies. We also estimated overall RSV-ALRI mortality by identifying studies reporting monthly data for ALRI mortality in the community and RSV activity. FINDINGS: We estimated that globally in 2015, 33.1 million (uncertainty range [UR] 21.6-50.3) episodes of RSV-ALRI resulted in about 3.2 million (2.7-3.8) hospital admissions and 59 600 (48 000-74 500) in-hospital deaths in children younger than 5 years. In children younger than 6 months, 1.4 million (UR 1.2-1.7) hospital admissions and 27 300 (UR 20 700-36 200) in-hospital deaths were due to RSV-ALRI. We also estimated that the overall RSV-ALRI mortality could be as high as 118 200 (UR 94 600-149 400). Incidence and mortality varied substantially from year to year in any given population. INTERPRETATION: Globally, RSV is a common cause of childhood ALRI and a major cause of hospital admissions in young children, resulting in a substantial burden on health-care services. About 45% of hospital admissions and in-hospital deaths due to RSV-ALRI occur in children younger than 6 months. An effective maternal RSV vaccine or monoclonal antibody could have a substantial effect on disease burden in this age group. FUNDING: The Bill & Melinda Gates Foundation.

    • Health Economics
      • OBJECTIVE: Most U.S. employers are not required to provide paid sick leave (PSL), and there is limited information on the economic return of providing PSL. We estimated potential benefits to employers of PSL in reducing absenteeism related to the spread of influenza-like illness (ILI). METHODS: We used nationally representative data and a negative binomial random effects model to estimate the impact of PSL in reducing overall absence due to illness or injury. We used published data to compute the share of ILI from the total days of absence, ILI transmission rates at workplaces, wages, and other parameters. RESULTS: Providing PSL could have saved employers $0.63 to $1.88 billion in reduced ILI-related absenteeism costs per year during 2007 to 2014 in 2016 dollars. CONCLUSION: These findings might help employers consider PSL as an investment rather than as a cost without any return.

    • Immunity and Immunization
      • Varicella outbreak in a highly-vaccinated school population in Beijing, China during the voluntary two-dose era
        Suo L, Lu L, Wang Q, Yang F, Wang X, Pang X, Marin M, Wang C.
        Vaccine. 2017 Jul 03.
        BACKGROUND: Two-dose varicella vaccination has been available in Beijing since 2012 in the private sector. We investigated a varicella outbreak in a highly vaccinated elementary school population. METHODS: A cohort study was carried out, and a varicella case was defined as an acute onset of generalized maculopapulovesicular rash without other apparent cause in a student attending the school from March 29 through May 17, 2015. Breakthrough varicella was defined as varicella >42 days after the last vaccine dose among both 1- and 2-dose varicella vaccine recipients. Vaccination information was collected from immunization records; information on prior varicella and clinical presentations was collected by surveying students’ parents. RESULTS: Of the 1056 students in the school, 1027 (97.3%) reported no history of varicella. Prior to the outbreak, 98.6% of students had received ≥1 dose of varicella vaccine, and most (63.2%) students had received two doses. Twenty varicella cases were identified for an overall attack rate of 2.0%. Half of the cases occurred in the classroom of the index case-patient, a two-dose recipient who was not isolated after symptom onset. Breakthrough varicella accounted for 95% of cases (19/20), with attack rates of 14.3% (1/7), 1.6% (6/362), and 2.0% (13/649) among unvaccinated, one-dose, and two-dose students, respectively. Most case-patients (18/20, 90%) had <50 lesions. No difference in clinical presentations was found between one-dose and two-dose recipients with breakthrough varicella. CONCLUSION: Moderate two-dose varicella vaccine coverage was insufficient to prevent a varicella outbreak. Two-dose recipients with breakthrough varicella are contagious. High two-dose varicella vaccine coverage and timely isolation of cases may be needed for varicella outbreak prevention in the two-dose era.
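The attack rates above are simple quotients of cases over the population at risk. As an illustrative sketch (counts taken from the abstract; rounding of the printed percentages may differ slightly from the figures reported):

```python
def attack_rate(cases, at_risk):
    """Attack rate: cases divided by the population at risk."""
    return cases / at_risk

# Counts reported in the abstract, by vaccination status
groups = {
    "unvaccinated": (1, 7),
    "one-dose": (6, 362),
    "two-dose": (13, 649),
}
rates = {name: attack_rate(cases, n) for name, (cases, n) in groups.items()}
for name, rate in rates.items():
    print(f"{name}: {rate:.1%}")
```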

    • Laboratory Sciences
      • Evaluation of combination drug therapy for treatment of antibiotic resistant inhalation anthrax in a murine model
        Heine HS, Shadomy SV, Boyer AE, Chuvala L, Riggins R, Kesterson A, Myrick J, Craig J, Candela MG, Barr JR, Hendricks K, Bower WA, Walke H, Drusano GL.
        Antimicrob Agents Chemother. 2017 Jul 10.
        Bacillus anthracis is considered a likely agent to be used as a bioweapon, and use of a strain resistant to first-line antimicrobial treatments is a concern. We determined treatment efficacy against a ciprofloxacin-resistant (Cr) strain of B. anthracis (Cr Ames) in a murine inhalational anthrax model. Ten groups of 46 BALB/c mice were exposed by inhalation to 7-35 LD50 of B. anthracis Cr Ames spores. Commencing at 36 hours (h) post-exposure, groups were administered intraperitoneal doses of sterile water for injection (SWI) or ciprofloxacin alone (control groups), or ciprofloxacin combined with two antimicrobials, including meropenem/linezolid, meropenem/clindamycin, meropenem/rifampin, meropenem/doxycycline, penicillin/linezolid, penicillin/doxycycline, rifampin/linezolid, or rifampin/clindamycin, at appropriate dosing intervals (6 or 12 hours) for the respective antibiotics. Ten mice per group were treated for 14 days and observed until day 28. Remaining animals were euthanized every 6-12 h and blood, lungs, and spleens collected for lethal factor (LF) and/or bacterial load determinations. All combination groups showed significant survival over the SWI and ciprofloxacin controls: meropenem/linezolid (p=0.004), meropenem/clindamycin (p=0.005), meropenem/rifampin (p=0.012), meropenem/doxycycline (p=0.032), penicillin/doxycycline (p=0.012), penicillin/linezolid (p=0.026), rifampin/linezolid (p=0.001), and rifampin/clindamycin (p=0.032). In controls, blood, lung, and spleen bacterial counts increased to terminal endpoints. In combination treatment groups, blood and spleen bacterial counts showed low or no colonies after 24 hours of treatment. LF fell below detection limits for all combination groups, yet remained elevated in control groups. Combinations with linezolid had the greatest inhibitory effect on mean LF levels.

    • Nutritional Sciences
      • Modeled changes in US sodium intake from reducing sodium concentrations of commercially processed and prepared foods to meet voluntary standards established in North America: NHANES
        Cogswell ME, Patel SM, Yuan K, Gillespie C, Juan W, Curtis CJ, Vigneault M, Clapp J, Roach P, Moshfegh A, Ahuja J, Pehrsson P, Brookmire L, Merritt R.
        Am J Clin Nutr. 2017 Jul 12.
        Background: Approximately 2 in 3 US adults have prehypertension or hypertension, which increases their risk of cardiovascular disease. Reducing sodium intake can decrease blood pressure and prevent hypertension. Approximately 9 in 10 Americans consume excess sodium (≥2300 mg/d). Voluntary sodium standards for commercially processed and prepared foods were established in North America, but their impact on sodium intake is unclear. Objective: We modelled the potential impact on US sodium intake of applying voluntary sodium standards for foods. Design: We used NHANES 2007-2010 data for 17,933 participants aged ≥1 y to model predicted US daily mean sodium intake and the prevalence of excess sodium intake under the standards of New York City’s National Salt Reduction Initiative (NSRI) and Health Canada for commercially processed and prepared foods. The Food and Nutrient Database for Dietary Studies food codes corresponding to foods reported by NHANES participants were matched to NSRI and Health Canada food categories, and the published sales-weighted mean percent reductions were applied. Results: The US population aged ≥1 y could have reduced their usual daily mean sodium intake of 3417 mg by 698 mg (95% CI: 683, 714 mg) by applying NSRI 2014 targets and by 615 mg (95% CI: 597, 634 mg) by applying Health Canada’s 2016 benchmarks. Significant reductions could have occurred regardless of age, sex, race/ethnicity, income, education, or hypertension status, up to a mean reduction in sodium intake of 850 mg/d in men aged ≥19 y by applying NSRI targets. The proportion of adults aged ≥19 y who consume ≥2300 mg/d would decline from 88% (95% CI: 86%, 91%) to 71% (95% CI: 68%, 73%) by applying NSRI targets and to 74% (95% CI: 71%, 76%) by applying Health Canada benchmarks. Conclusion: If established sodium standards are applied to commercially processed and prepared foods, a significant reduction in US sodium intake could occur.
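The modelling step described here (matching reported foods to target categories and applying sales-weighted mean percent reductions) can be sketched as follows; the categories, intake amounts, and reduction fractions below are hypothetical illustrations, not actual NSRI or Health Canada values:

```python
def modeled_intake(intake_by_category_mg, reduction_by_category):
    """Apply category-specific sales-weighted percent reductions to one
    person's reported sodium intake; categories without a target keep
    their original sodium content."""
    total = 0.0
    for category, mg in intake_by_category_mg.items():
        reduction = reduction_by_category.get(category, 0.0)
        total += mg * (1.0 - reduction)
    return total

# Hypothetical day of intake (mg sodium) and illustrative reduction targets
intake = {"bread": 600, "processed_meat": 900, "soup": 500, "fresh_produce": 100}
targets = {"bread": 0.25, "processed_meat": 0.20, "soup": 0.30}
modeled = modeled_intake(intake, targets)  # about 1620 mg versus a 2100 mg baseline
```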

    • Reproductive Health
      • Drug interactions between non-rifamycin antibiotics and hormonal contraception: A systematic review
        Simmons KB, Haddad LB, Nanda K, Curtis KM.
        Am J Obstet Gynecol. 2017 Jul 07.
        OBJECTIVE: To determine whether interactions between non-rifamycin antibiotics and hormonal contraceptives result in decreased effectiveness or increased toxicity of either therapy. DATA SOURCES: We searched MEDLINE, Embase, clinicaltrials.gov and Cochrane libraries from database inception through June 2016. STUDY ELIGIBILITY CRITERIA: We included trials, cohort, case-control, and pharmacokinetic (PK) studies in any language addressing pregnancy rates, pharmacodynamics, or PK outcomes when any hormonal contraceptive and non-rifamycin antibiotic were administered together versus apart. Of 7291 original records identified, 29 met criteria for inclusion. STUDY APPRAISAL AND SYNTHESIS METHODS: Two authors independently assessed study quality and risk of bias using the United States Preventive Services Task Force evidence grading system. Findings were tabulated by drug class. RESULTS: Study quality ranged from good to poor and addressed only oral contraceptive pills, emergency contraception pills, and the combined vaginal ring. Two studies demonstrated no difference in pregnancy rates in women using oral contraceptives with and without non-rifamycin antibiotics. No differences in ovulation suppression or breakthrough bleeding were observed in any study combining hormonal contraceptives with any antibiotic. No significant decreases in any progestin PK parameter occurred during co-administration with any antibiotic. Ethinyl estradiol area under the curve decreased when administered with dirithromycin but no other drug. CONCLUSION: Evidence from clinical and PK outcomes studies does not support the existence of drug interactions between hormonal contraception and non-rifamycin antibiotics. Data are limited by low quantity and quality for some drug classes. Most women can expect no reduction in hormonal contraceptive effect with concurrent use of non-rifamycin antibiotics.

    • Zoonotic and Vectorborne Diseases

  2. CDC Authored Publications
    The names of CDC authors are indicated in bold text.
    Articles published in the past 6-8 weeks authored by CDC or ATSDR staff.
    • Chronic Diseases and Conditions
      1. A population-based national estimate of the prevalence and risk factors associated with hypertension in Rwanda: implications for prevention and control
        Nahimana MR, Nyandwi A, Muhimpundu MA, Olu O, Condo JU, Rusanganwa A, Koama JB, Ngoc CT, Gasherebuka JB, Ota MO, Okeibunor JC.
        BMC Public Health. 2017 Jul 10;18(1):2.
        BACKGROUND: Hypertension is a leading cause of cardiovascular diseases and a growing public health problem in many developed and developing countries. However, population-based data to inform policy development are scarce in Rwanda. This nationally representative study aimed to determine population-based estimates of the prevalence and risk factors associated with hypertension in Rwanda. METHODS: We conducted a secondary epidemiological analysis of data collected from a cross-sectional population-based study assessing the risk factors for NCDs using the WHO STEPwise approach to Surveillance of non-communicable diseases (STEPS). Adjusted odds ratios with 95% confidence intervals were used to establish associations between hypertension, socio-demographic characteristics, and health risk behaviors. RESULTS: Of the 7116 study participants, 62.8% were females and 38.2% were males. The mean age of study participants was 35.3 years (SD 12.5). The overall prevalence of hypertension was 15.3% (16.4% for males and 14.4% for females). Twenty-two percent of hypertensive participants had been previously diagnosed. A logistic regression model revealed that age (AOR: 8.02, 95% CI: 5.63-11.42, p < 0.001), living in a semi-urban area (AOR: 1.30, 95% CI: 1.01-1.67, p = 0.040), alcohol consumption (AOR: 1.24, 95% CI: 1.05-1.44, p = 0.009), and raised BMI (AOR: 3.93, 95% CI: 2.54-6.08, p < 0.001) were significantly associated with hypertension. The risk of having hypertension was 2 times higher among obese respondents (AOR: 3.93, 95% CI: 2.54-6.08, p < 0.001) compared to those with normal BMI (AOR: 1.74, 95% CI: 1.30-2.32, p < 0.001). Females (AOR: 0.75, 95% CI: 0.63-0.88, p < 0.001) and students (AOR: 0.45, 95% CI: 0.25-0.80, p = 0.007) were less likely to be hypertensive. CONCLUSION: The findings of this study indicate that the prevalence of hypertension is high in Rwanda, suggesting the need for prevention and control interventions aimed at decreasing incidence, taking into consideration the risk factors documented in this and other similar studies.
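The adjusted odds ratios (AORs) and confidence intervals quoted above are obtained by exponentiating logistic-regression coefficients; a minimal sketch, with a hypothetical coefficient and standard error rather than values from the study:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a logistic-regression coefficient and its Wald
    95% confidence limits to obtain an odds ratio with CI."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient: beta = 0.8, SE = 0.2
aor, lower, upper = odds_ratio_ci(0.8, 0.2)
```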

      2. Applying RE-AIM to evaluate two community-based programs designed to improve access to eye care for those at high-risk for glaucoma
        Sapru S, Berktold J, Crews JE, Katz LJ, Hark L, Girkin CA, Owsley C, Francis B, Saaddine JB.
        Eval Program Plann. 2017 Jun 21;65:40-46.
        INTRODUCTION: Glaucoma is a leading cause of vision loss and blindness in the U.S. Risk factors include African American race, older age, family history of glaucoma, and diabetes. This paper describes the evaluation of a mobile eye health program and a telemedicine program designed to improve access to eye care among people at high risk for glaucoma. METHODS: The RE-AIM (reach, efficacy, adoption, implementation, and maintenance) evaluation framework was used to harmonize indicators. Both programs provided community-based eye health education and eye services related to glaucoma detection and care. Each program reported data on participants and community partners. An external evaluator conducted site visit interviews with program staff and community partners. Quantitative and qualitative data were integrated and analyzed using the RE-AIM dimensions. DISCUSSION: By targeting high-risk populations and providing comprehensive eye exams, both programs detected a large proportion of new glaucoma-related cases (17-19%), a much larger proportion than that found in the general population (<2%). The educational intervention increased glaucoma knowledge; evidence that it led people to seek eye care was inconclusive. CONCLUSIONS: Evaluation findings from the mobile eye health program and the telemedicine program may provide useful information for wider implementation in public health clinics and in optometrist clinics located in retail outlets.

    • Communicable Diseases
      1. Performance characteristics of finger-stick dried blood spots (DBS) on the determination of human immunodeficiency virus (HIV) treatment failure in a pediatric population in Mozambique
        Chang J, de Sousa A, Sabatier J, Assane M, Zhang G, Bila D, Vaz P, Alfredo C, Cossa L, Bhatt N, Koumans EH, Yang C, Rivadeneira E, Jani I, Houston JC.
        PLoS One. 2017 ;12(7):e0181054.
        Quantitative plasma viral load (VL) at 1000 copies/mL was recommended by the World Health Organization (WHO) as the threshold to confirm antiretroviral therapy (ART) failure. Because of ongoing challenges of using plasma for VL testing in resource-limited settings (RLS), especially for children, this study collected 717 DBS and paired plasma samples from children receiving ART ≥1 year in Mozambique and compared the performance of DBS using Abbott’s VL test with that of paired plasma samples using Roche’s VL test. At a cut-off of 1000 copies/mL, sensitivity of the Abbott DBS VL test was 79.9%, better than the 71.0% and 63.9% observed at 3000 and 5000 copies/mL, respectively. Specificities were 97.6%, 98.8%, and 99.3% at 1000, 3000, and 5000 copies/mL, respectively. The Kappa value at 1000 copies/mL, 0.80 (95% CI: 0.73, 0.87), was higher than 0.73 (95% CI: 0.66, 0.80) at 3000 copies/mL and 0.66 (95% CI: 0.59, 0.73) at 5000 copies/mL, also indicating better agreement. The mean difference between the DBS and plasma VL tests, with 95% limits of agreement by Bland-Altman, was 0.311 (-0.908, 1.530). Among 73 children with plasma VL between 1000 and 5000 copies/mL, the DBS results were undetectable in 53 at the 1000 copies/mL threshold. A single DBS sample tested with the Abbott DBS VL test may be an alternative method to confirm ART failure at the 1000 copies/mL threshold when a plasma sample is not an option for treatment monitoring; however, because of sensitivity concerns between 1000 and 5000 copies/mL, two DBS samples may be preferred, accompanied by careful patient monitoring and repeat testing.
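The sensitivity, specificity, and Kappa statistics reported here can all be derived from a 2x2 table of DBS results against the plasma reference; a sketch with hypothetical counts (not the study’s data):

```python
def diagnostic_agreement(tp, fp, fn, tn):
    """Sensitivity, specificity, and Cohen's kappa for an index test
    (e.g., DBS) compared against a reference test (e.g., plasma VL)."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n
    # Chance agreement from the row and column marginals
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_observed - p_chance) / (1 - p_chance)
    return sensitivity, specificity, kappa

# Hypothetical counts for illustration only
sens, spec, kappa = diagnostic_agreement(tp=40, fp=5, fn=10, tn=95)
```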

      2. Estimating influenza and respiratory syncytial virus-associated mortality in Western Kenya using health and demographic surveillance system data, 2007-2013
        Emukule GO, Spreeuwenberg P, Chaves SS, Mott JA, Tempia S, Bigogo G, Nyawanda B, Nyaguara A, Widdowson MA, van der Velden K, Paget JW.
        PLoS One. 2017 ;12(7):e0180890.
        BACKGROUND: Influenza and respiratory syncytial virus (RSV) associated mortality has not been well established in tropical Africa. METHODS: We used the negative binomial regression method and the rate-difference method (i.e., deaths during low and high influenza/RSV activity months) to estimate excess mortality attributable to influenza and RSV using verbal autopsy data collected through a health and demographic surveillance system in Western Kenya, 2007-2013. Excess mortality rates were calculated for a) all-cause mortality, b) respiratory deaths (including pneumonia), c) HIV-related deaths, and d) pulmonary tuberculosis (TB) related deaths. RESULTS: Using the negative binomial regression method, the mean annual all-cause excess mortality rate associated with influenza and RSV was 14.1 (95% confidence interval [CI] 0.0-93.3) and 17.1 (95% CI 0.0-111.5) per 100,000 person-years (PY), respectively; and 10.5 (95% CI 0.0-28.5) and 7.3 (95% CI 0.0-27.3) per 100,000 PY for respiratory deaths, respectively. The highest mortality rates associated with influenza were among persons aged ≥50 years, particularly among persons with TB (41.6 [95% CI 0.0-122.7]); the highest rates associated with RSV were among children aged <5 years. Using the rate-difference method, the excess mortality rate for influenza and RSV was 44.8 (95% CI 36.8-54.4) and 19.7 (95% CI 14.7-26.5) per 100,000 PY, respectively, for all-cause deaths; and 9.6 (95% CI 6.3-14.7) and 6.6 (95% CI 3.9-11.0) per 100,000 PY, respectively, for respiratory deaths. CONCLUSIONS: Our study shows a substantial excess mortality associated with influenza and RSV in Western Kenya, especially among children <5 years and older persons with TB, supporting recommendations for influenza vaccination and efforts to develop RSV vaccines.
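The rate-difference method mentioned above subtracts the baseline mortality rate observed during low influenza/RSV-activity months from the rate during high-activity months; a minimal sketch with hypothetical death counts and person-years:

```python
def excess_mortality_rate(deaths_high, py_high, deaths_low, py_low,
                          per=100_000):
    """Rate-difference method: mortality rate in high virus-activity
    months minus the baseline rate in low-activity months, expressed
    per 100,000 person-years (PY)."""
    return (deaths_high / py_high - deaths_low / py_low) * per

# Hypothetical: 120 deaths over 50,000 PY in high-activity months,
# 90 deaths over 50,000 PY in low-activity months
excess = excess_mortality_rate(120, 50_000, 90, 50_000)  # about 60 per 100,000 PY
```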

      3. Measles outbreak – Minnesota April-May 2017
        Hall V, Banerjee E, Kenyon C, Strain A, Griffith J, Como-Sabetti K, Heath J, Bahta L, Martin K, McMahon M, Johnson D, Roddy M, Dunn D, Ehresmann K.
        MMWR Morb Mortal Wkly Rep. 2017 Jul 14;66(27):713-717.
        On April 10, 2017, the Minnesota Department of Health (MDH) was notified about a suspected measles case. The patient was a hospitalized child aged 25 months who was evaluated for fever and rash, with onset on April 8. The child had no history of receipt of measles-mumps-rubella (MMR) vaccine and no travel history or known exposure to measles. On April 11, MDH received a report of a second hospitalized, unvaccinated child, aged 34 months, with an acute febrile rash illness with onset on April 10. The second patient’s sibling, aged 19 months, who had also not received MMR vaccine, had similar symptoms, with rash onset on March 30. Real-time reverse transcription-polymerase chain reaction (rRT-PCR) testing of nasopharyngeal swab or throat specimens performed at MDH confirmed measles in the first two patients on April 11, and in the third patient on April 13; subsequent genotyping identified genotype B3 virus in all three patients, who attended the same child care center. MDH instituted outbreak investigation and response activities in collaboration with local health departments, health care facilities, child care facilities, and schools in affected settings. Because the outbreak occurred in a community with low MMR vaccination coverage, measles spread rapidly, resulting in thousands of exposures in child care centers, schools, and health care facilities. By May 31, 2017, a total of 65 confirmed measles cases had been reported to MDH (Figure 1); transmission is ongoing.

      4. BACKGROUND: Young adults, including college students, have higher rates of chlamydia than the general population. Patient-delivered partner therapy (PDPT) is a partner treatment option for sex partners of individuals diagnosed with chlamydia or gonorrhea. We examined college health center use of PDPT in a national sample of colleges. METHODS: During 2014 to 2015, we collected data from 482 colleges and universities (55% of 885 surveyed), weighting responses by institutional characteristics abstracted from a national database (eg, 2-year vs 4-year status). We asked whether the school had a student health center and which sexual and reproductive health (SRH) services were offered. We also assessed the legal and perceived legal status of PDPT in states where schools were located. We then estimated PDPT availability at student health centers and measured associations with legal status and SRH services. RESULTS: Most colleges (n = 367) reported having a student health center; PDPT was available at 36.6% of health centers and associated with perceived legality of PDPT in the state in which the college was located (odds ratio [OR], 4.63; 95% confidence interval [CI], 1.17-18.28). Patient-delivered partner therapy was significantly associated with availability of SRH services, including sexually transmitted infection (STI) diagnosis and treatment (56.2% vs 1.1%), gynecological services (60.3% vs 12.2%), and contraceptive services (57.8% vs 7.7%) (all P < .001). Compared with schools taking no action, PDPT was more likely to be available at schools that notified partners directly (OR, 8.29; 95% CI, 1.28-53.85), but not at schools that asked patients to notify partners (OR, 3.47; 95% CI, 0.97-12.43). CONCLUSIONS: PDPT was more likely to be available in colleges that offered SRH services and where staff believed PDPT was legal. Further research could explore more precise conditions under which PDPT is used.

      5. Viral etiology, seasonality and severity of hospitalized patients with severe acute respiratory infections in the Eastern Mediterranean Region, 2007-2014
        Horton KC, Dueger EL, Kandeel A, Abdallat M, El-Kholy A, Al-Awaidy S, Kohlani AH, Amer H, El-Khal AL, Said M, House B, Pimentel G, Talaat M.
        PLoS One. 2017;12(7):e0180954.
        INTRODUCTION: Little is known about the role of viral respiratory pathogens in the etiology, seasonality or severity of severe acute respiratory infections (SARI) in the Eastern Mediterranean Region. METHODS: Sentinel surveillance for SARI was conducted from December 2007 through February 2014 at 20 hospitals in Egypt, Jordan, Oman, Qatar and Yemen. Nasopharyngeal and oropharyngeal swabs were collected from hospitalized patients meeting SARI case definitions and were analyzed for infection with influenza, respiratory syncytial virus (RSV), adenovirus (AdV), human metapneumovirus (hMPV) and human parainfluenza virus types 1-3 (hPIV1-3). We analyzed surveillance data to calculate positivity rates for viral respiratory pathogens, describe the seasonality of those pathogens and determine which pathogens were responsible for more severe outcomes requiring ventilation and/or intensive care and/or resulting in death. RESULTS: At least one viral respiratory pathogen was detected in 8,753/28,508 (30.7%) samples tested for at least one pathogen and 3,497/9,315 (37.5%) of samples tested for all pathogens: influenza in 3,345/28,438 (11.8%), RSV in 3,942/24,503 (16.1%), AdV in 923/9,402 (9.8%), hMPV in 617/9,384 (6.6%), hPIV1 in 159/9,402 (1.7%), hPIV2 in 85/9,402 (0.9%) and hPIV3 in 365/9,402 (3.9%). Multiple pathogens were identified in 501/9,316 (5.4%) participants tested for all pathogens. Monthly variation, indicating seasonal differences in levels of infection, was observed for all pathogens. Participants with hMPV infections and participants less than five years of age were significantly less likely than participants not infected with hMPV and those older than five years of age, respectively, to experience a severe outcome, while participants with a pre-existing chronic disease were at increased risk of a severe outcome, compared to those with no reported pre-existing chronic disease.
CONCLUSIONS: Viral respiratory pathogens are common among SARI patients in the Eastern Mediterranean Region. Ongoing surveillance is important to monitor changes in the etiology, seasonality and severity of pathogens of interest.

      6. Rubella surveillance and diagnostic testing among a low-prevalence population, New York City, 2012-2013
        Isaac BM, Zucker JR, Giancotti FR, Abernathy E, Icenogle J, Rakeman JL, Rosen JB.
        Clin Vaccine Immunol. 2017 Jul 12.
        The New York City Department of Health and Mental Hygiene (DOHMH) receives clinical and laboratory reports for rubella. Because rubella immunoglobulin M (IgM) assays may produce false positive results and rubella infections may be asymptomatic, interpretation of positive IgM results can be challenging. Rubella reports received by DOHMH in 2012-2013 were reviewed. The purpose of rubella IgM testing was determined through case investigation. Results of IgM testing by indirect enzyme-linked immunosorbent assay (ELISA) and capture enzyme immunoassay (EIA) were compared to determine positive predictive value (PPV) and specificity. DOHMH received 199 rubella reports; 2 were true cases. Of all reports, 77.9% were tested for rubella IgM erroneously, 19.6% were tested for diagnostic purposes, 2.0% had unknown test purpose, and 0.5% were not tested. PPV of indirect ELISA was 6% overall, 14% for diagnostic tests, and 0% for tests ordered erroneously. PPV of capture EIA was 29% overall, 50% for diagnostic tests, and 0% for tests ordered erroneously. Overall, specificity was 52% for indirect ELISA and 85% for capture EIA. Limiting rubella IgM testing to patients in whom rubella is suspected and using a more specific IgM assay have the potential to reduce false positive rubella IgM results.

      7. The accuracy of HIV rapid testing in integrated bio-behavioral surveys of men who have sex with men across 5 provinces in South Africa
        Kufa T, Lane T, Manyuchi A, Singh B, Isdahl Z, Osmand T, Grasso M, Struthers H, McIntyre J, Chipeta Z, Puren A.
        Medicine (Baltimore). 2017 Jul;96(28):e7391.
        We describe the accuracy of serial rapid HIV testing among men who have sex with men (MSM) in South Africa and discuss the implications for HIV testing and prevention. This was a cross-sectional survey conducted at five stand-alone facilities in five provinces. Demographic, behavioral, and clinical data were collected. Dried blood spots were obtained for HIV-related testing. Participants were offered rapid HIV testing using 2 rapid diagnostic tests (RDTs) in series. In the laboratory, reference HIV testing was conducted using a third-generation enzyme immunoassay (EIA), with a fourth-generation EIA as the confirmatory assay. Accuracy, sensitivity, specificity, positive predictive value, negative predictive value, and false-positive and false-negative rates were determined. Between August 2015 and July 2016, 2503 participants were enrolled. Of these, 2343 were tested by RDT on site, with a further 2137 (91.2%) having definitive results on both RDT and EIA. Sensitivity, specificity, positive predictive value, negative predictive value, false-positive rate, and false-negative rate were 92.6% [95% confidence interval (95% CI) 89.6-94.8], 99.4% (95% CI 98.9-99.7), 97.4% (95% CI 95.2-98.6), 98.3% (95% CI 97.6-98.8), 0.6% (95% CI 0.3-1.1), and 7.4% (95% CI 5.2-10.4), respectively. False negatives were similar to true positives with respect to virological profiles. Overall accuracy of the RDT algorithm was high, but sensitivity was lower than expected. Post-HIV test counseling should include discussions of possible false-negative results and the need for retesting among HIV negatives.
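The accuracy measures reported in this abstract all derive from a 2x2 table comparing the RDT algorithm against the EIA reference standard. A minimal sketch with invented counts (not the study's actual table; they roughly, not exactly, reproduce the reported figures):

```python
# Diagnostic accuracy metrics from a 2x2 table of test results vs a reference
# standard. The counts used in the example are hypothetical.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV, and NPV as percentages."""
    return {
        "sensitivity": 100 * tp / (tp + fn),  # infected persons correctly detected
        "specificity": 100 * tn / (tn + fp),  # uninfected persons correctly negative
        "ppv":         100 * tp / (tp + fp),  # positive results that are true
        "npv":         100 * tn / (tn + fn),  # negative results that are true
    }

# Hypothetical example: 500 infected (463 detected) and 1637 uninfected
# (1627 correctly negative) among participants with definitive results.
m = diagnostic_metrics(tp=463, fp=10, fn=37, tn=1627)
print({k: round(v, 1) for k, v in m.items()})
```

Note that PPV and NPV, unlike sensitivity and specificity, shift with the HIV prevalence in the tested population, which is why the authors stress retesting among HIV negatives in a high-prevalence MSM cohort.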

      8. US public sexually transmitted disease clinical services in an era of declining public health funding: 2013-14
        Leichliter JS, Heyer K, Peterman TA, Habel MA, Brookmeyer KA, Arnold Pang SS, Stenger MR, Weiss G, Gift TL.
        Sex Transm Dis. 2017 Aug;44(8):505-509.
        BACKGROUND: We examined the infrastructure for US public sexually transmitted disease (STD) clinical services. METHODS: In 2013 to 2014, we surveyed 331 of 1225 local health departments (LHDs) that either reported providing STD testing/treatment in the 2010 National Profile of Local Health Departments survey or were among the 50 local areas with the highest STD cases or rates. The sample was stratified by jurisdiction population size. We examined the primary referral clinics for STDs, the services offered, and the impact of budget cuts (limited to government funding only). Data were analyzed using SAS, and analyses were weighted for nonresponse. RESULTS: Twenty-two percent of LHDs cited a specialty STD clinic as their primary referral for STD services; this increased to 53.5% of LHDs when combination STD-family planning clinics were included. The majority of LHDs (62.8%) referred to clinics providing same-day services. Sexually transmitted disease clinics more frequently offered extragenital testing for chlamydia and/or gonorrhea (74.7%) and gonorrhea culture (68.5%) than other clinics (52.9% and 46.2%, respectively; P < 0.05). The majority of LHDs (61.5%) reported recent budget cuts. Of those with decreased budgets, the most common impacts were fewer clinic hours (42.8%; 95% confidence interval [CI], 24.4-61.2), reduced routine screening (40.2%; 95% CI, 21.7-58.8), and reductions in partner services (42.1%; 95% CI, 23.6-60.7). One quarter of those with reduced STD budgets increased fees or copays for clients. CONCLUSIONS: Findings demonstrate gaps and reductions in US public STD services, including clinical services that play an important role in reducing disease transmission. Furthermore, STD clinics tended to offer more specialized STD services than other public clinics.

      9. Improving hepatitis B birth dose coverage through village health volunteer training and pregnant women education
        Li X, Heffelfinger J, Wiesen E, Diorditsa S, Valiakolleri J, Nikuata AB, Nukuro E, Tabwaia B, Woodring J.
        Vaccine. 2017 Jul 05.
        Hepatitis B is highly endemic in the Republic of Kiribati, yet coverage of timely birth dose vaccination, the primary method shown to prevent mother-to-child transmission of hepatitis B virus, was only 66% in 2014. Children born at home are at especially high risk, as they have limited access to timely birth dose (i.e., within 24 h) vaccination. To improve birth dose coverage, a project to improve linkages between village health volunteers and health workers and to educate pregnant women on hepatitis B vaccination was carried out in 16 communities with low birth dose coverage in Kiribati from November 2014 to May 2015. After project completion, the coverage of timely birth dose administration increased significantly both in the densely populated capital region of South Tarawa (from 89% to 95%, p=0.001) and the Outer Islands (from 57% to 83%, p<0.001). The coverage of timely birth dose administration among infants born at home increased significantly from 70% to 84% in South Tarawa (p=0.001) and from 49% to 75% in the Outer Islands (p<0.001). Timely birth dose administration was associated with being born in a hospital, being born during the study period, and caregivers having developed an antenatal birth dose plan. The project demonstrates a successful model for improving hepatitis B vaccine birth dose coverage that could be adopted in other areas in Kiribati as well as other similar settings.

      10. A national survey on the use of electronic directly observed therapy for treatment of tuberculosis
        Macaraig M, Lobato MN, McGinnis Pilote K, Wegener D.
        J Public Health Manag Pract. 2017 Jul 07.
        CONTEXT: An increasing number of tuberculosis (TB) programs are adopting electronic directly observed therapy (eDOT), the use of technology to supervise patient adherence remotely. Pilot studies show that treatment adherence and completion were similar with eDOT compared with the standard in-person DOT. OBJECTIVE: In December 2015, the National Tuberculosis Controllers Association administered an online survey to determine the extent to which eDOT is used in the United States. PARTICIPANTS: Sixty-eight Centers for Disease Control and Prevention (CDC)-funded health department TB programs across the United States and a convenience sample of local health department TB programs. RESULTS: Fifty-six (82%) of 68 CDC-funded health department TB programs and an additional 57 local TB programs responded to the survey. Forty-seven (42%) of 113 TB programs are currently using eDOT, 41 (36%) are planning to implement it within the next year, and 25 (22%) have no plans to implement eDOT. Of the 47 TB programs using eDOT, 31 (66%) use synchronous video DOT, 4 (9%) asynchronous video DOT, 11 (23%) a combination of both, and 1 (2%) an ingestible sensor to conduct electronic observations. Forty-one (87%) indicated that treatment adherence and 40 (85%) indicated that treatment completion were about the same as or higher than with in-person DOT. More than 80% indicated that eDOT resulted in program cost savings, and almost all (91%) reported benefits in patient and staff satisfaction. However, 25 (53%) of the 47 TB programs that use eDOT encountered technical challenges, and 37 (79%) offer eDOT to fewer than a third of their patients. CONCLUSIONS: Results from this survey indicate that eDOT is a promising tool that can be utilized to efficiently and effectively manage TB treatment. Findings will inform other TB programs interested in implementing eDOT.
However, further evaluation is needed to assess eDOT acceptability to understand barriers to eDOT implementation from the patient and provider perspectives.

      11. Polymorphisms in the vitamin D receptor gene are associated with reduced rate of sputum culture conversion in multidrug-resistant tuberculosis patients in South Africa
        Magee MJ, Sun YV, Brust JC, Shah NS, Ning Y, Allana S, Campbell A, Hui Q, Mlisana K, Moodley P, Gandhi NR.
        PLoS One. 2017;12(7):e0180916.
        BACKGROUND: Vitamin D modulates the inflammatory and immune response to tuberculosis (TB) and also mediates the induction of the antimicrobial peptide cathelicidin. Deficiency of 25-hydroxyvitamin D and single nucleotide polymorphisms (SNPs) in the vitamin D receptor (VDR) gene may increase the risk of TB disease and decrease culture conversion rates in drug-susceptible TB. Whether these VDR SNPs are found in African populations or impact multidrug-resistant (MDR) TB treatment has not been established. We aimed to determine if SNPs in the VDR gene were associated with sputum culture conversion among a cohort of MDR TB patients in South Africa. METHODS: We conducted a prospective cohort study of adult MDR TB patients receiving second-line TB treatment in KwaZulu-Natal province. Subjects had monthly sputum cultures performed. In a subset of participants, whole blood samples were obtained for genomic analyses. Genomic DNA was extracted and genotyped with the Affymetrix Axiom Pan-African Array. Cox proportional hazards models were used to determine the association between VDR SNPs and rate of culture conversion. RESULTS: Genomic analyses were performed on 91 MDR TB subjects enrolled in the sub-study; 60% were female and median age was 35 years (interquartile range [IQR] 29-42). Smoking was reported by 21% of subjects, and most subjects had HIV (80%), were smear negative (57%), and had cavitary disease (55%). Overall, 87 (96%) subjects initially converted cultures to negative, with median time to culture conversion of 57 days (IQR 17-114). Of 121 VDR SNPs examined, 10 were significantly associated (p<0.01) with rate of sputum conversion in multivariable analyses. Each additional risk allele on SNP rs74085240 delayed culture conversion significantly (adjusted hazard ratio 0.30, 95% confidence interval 0.14-0.67).
CONCLUSIONS: Polymorphisms in the VDR gene were associated with rate of sputum culture conversion in MDR TB patients in this high HIV prevalence setting in South Africa.

      12. Infection with influenza A(H1N1)pdm09 during the first wave of the 2009 pandemic: Evidence from a longitudinal seroepidemiologic study in Dhaka, Bangladesh
        Nasreen S, Rahman M, Hancock K, Katz JM, Goswami D, Sturm-Ramirez K, Holiday C, Jefferson S, Branch A, Wang D, Veguilla V, Widdowson MA, Fry AM, Brooks WA.
        Influenza Other Respir Viruses. 2017 Jul 08.
        BACKGROUND: We determined influenza A(H1N1)pdm09 antibody levels before and after the first wave of the pandemic in an urban community in Dhaka, Bangladesh. METHODS: We identified a cohort of households by stratified random sampling. We collected baseline serum specimens during July-August 2009, just prior to the initial wave of the 2009 pandemic in this community, and a second specimen during November 2009, after the pandemic peak. Paired sera were tested for antibodies against A(H1N1)pdm09 virus using a microneutralization assay and a hemagglutination inhibition (HI) assay. A four-fold increase in antibody titer by either assay with a titer of >/=40 in the convalescent sera was considered a seroconversion. At baseline, an HI titer of >40 was considered seropositive. We collected information on clinical illness from weekly home visits. RESULTS: We tested 779 paired sera from the participants. At baseline, before the pandemic wave, 1% overall and 3% of persons >60 years old were seropositive. After the first wave of the pandemic, 211 (27%) individuals seroconverted against A(H1N1)pdm09. Children aged 5-17 years had the highest proportion (37%) of seroconversion. Among 264 (34%) persons with information on clinical illness, 191 (72%) had illness >3 weeks prior to collection of the follow-up sera and 73 (38%) seroconverted. Sixteen (22%) of these 73 seroconverted participants reported no clinical illness. CONCLUSION: After the first pandemic wave in Dhaka, one in four persons was infected with A(H1N1)pdm09 virus, and the highest burden of infection was among school-aged children. Seroprevalence studies supplement traditional surveillance systems to estimate infection burden.

      13. A review of risk assessment, testing, and treatment. ABSTRACT: Nurses play a critical role in the diagnosis and treatment of tuberculosis and in the prevention of tuberculosis transmission through infection control practices. To eliminate tuberculosis in the United States, however, an expanded approach to testing and treating people with latent tuberculosis infection must be implemented. Recently, the U.S. Preventive Services Task Force (USPSTF) issued a new recommendation statement on latent tuberculosis infection testing that expands nurses’ opportunities to identify at-risk populations for tuberculosis prevention. In combination with newer testing methodologies and shorter treatment regimens, implementation of the USPSTF recommendation has the potential to remove previously existing barriers to screening and treatment of both patients and health care providers. This article provides a general overview of tuberculosis transmission, pathogenesis, and epidemiology; presents preventive care recommendations for targeted testing among high-risk groups; and discusses the USPSTF recommendation’s applicability to public health and primary care practice in the United States.

      14. Attributable fraction of influenza virus detection to mild and severe respiratory illnesses in HIV-infected and HIV-uninfected patients, South Africa, 2012-2016
        Tempia S, Walaza S, Moyes J, Cohen AL, von Mollendorf C, McMorrow ML, Treurnicht FK, Venter M, Pretorius M, Hellferscee O, Wolter N, von Gottberg A, Nguweneza A, McAnerney JM, Dawood H, Variava E, Madhi SA, Cohen C.
        Emerg Infect Dis. 2017 Jul;23(7):1124-1132.
        The attributable fraction (AF) of influenza virus detection to illness has not been described for patients in different age groups or with different HIV infection statuses. We compared the age group-specific prevalence of influenza virus infection among patients with influenza-like illness (ILI) or severe acute or chronic respiratory illness (SARI and SCRI, respectively) with that among controls, stratified by HIV serostatus. The overall AF for influenza virus detection to illness was 92.6% for ILI, 87.4% for SARI, and 86.2% for SCRI. Among HIV-uninfected patients, the AF for all syndromes was highest among persons <1 and >65 years of age and lowest among persons 25-44 years of age; this trend was not observed among HIV-infected patients. Overall, influenza viruses, when detected in patients with ILI, SARI, or SCRI, are likely attributable to illness. This finding is particularly likely among children and the elderly irrespective of HIV serostatus, and among HIV-infected persons irrespective of age.
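The attributable fractions in this abstract come from comparing detection among ill patients with detection among controls. A common estimator in such case-control comparisons is AF = (OR - 1) / OR; this sketch illustrates that formula only and is not the authors' adjusted model:

```python
# Attributable fraction (among the exposed) from an odds ratio, using the
# standard case-control estimator AF = (OR - 1) / OR. The odds ratio below
# is a made-up value for illustration.

def attributable_fraction(odds_ratio):
    """Percentage of detections attributable to illness, given OR >= 1."""
    return 100 * (odds_ratio - 1) / odds_ratio

# A hypothetical odds ratio of 13.5 for influenza detection in ILI cases vs
# controls would imply an AF close to the 92.6% reported for ILI.
print(round(attributable_fraction(13.5), 1))
```

Intuitively, the larger the odds ratio for detection in patients versus controls, the smaller the share of detections explained by asymptomatic background carriage.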

      15. Antimicrobial resistance in Neisseria gonorrhoeae: Global surveillance and a call for international collaborative action
        Wi T, Lahra MM, Ndowa F, Bala M, Dillon JR, Ramon-Pardo P, Eremin SR, Bolan G, Unemo M.
        PLoS Med. 2017 Jul;14(7):e1002344.
        In a Policy Forum, Teodora Wi and colleagues discuss the challenges of antimicrobial resistance in gonococci.

    • Disease Reservoirs and Vectors RSS Word feed
      1. Proposal to reclassify Ehrlichia muris as Ehrlichia muris subsp. muris subsp. nov. and description of Ehrlichia muris subsp. eauclairensis subsp. nov., a newly recognized tick-borne pathogen of humans
        Pritt BS, Allerdice ME, Sloan LM, Paddock CD, Munderloh UG, Rikihisa Y, Tajima T, Paskewitz SM, Neitzel DF, Hoang Johnson DK, Schiffman E, Davis JP, Goldsmith CS, Nelson CM, Karpathy SE.
        Int J Syst Evol Microbiol. 2017 Jul 12.
        We have previously described a novel taxon of the genus Ehrlichia (type strain WisconsinT), closely related to Ehrlichia muris, that causes human ehrlichiosis among patients with exposures to ticks in the upper midwestern USA. DNA from this bacterium was also detected in Ixodes scapularis and Peromyscus leucopus collected in Minnesota and Wisconsin. To determine the relationship between the E. muris-like agent (EMLA) and other species of the genus Ehrlichia, phenotypic, genotypic and epidemiologic comparisons were undertaken, including sequence analysis of eight gene loci (3906 nucleotides) for 39 EMLA DNA samples and the type strain of E. muris AS145T. Three loci were also sequenced from DNA of nine strains of E. muris from mouse spleens from Japan. All sequences from E. muris were distinct from homologous EMLA sequences, but differences between them were less than those observed among other species of the genus Ehrlichia. Phenotypic comparison of EMLA and E. muris revealed similar culture and electron microscopic characteristics, but important differences were noted in their geographic distribution, ecological associations and behavior in mouse models of infection. Based on these comparisons, we propose that type strain WisconsinT represents a novel subspecies, Ehrlichia muris subsp. eauclairensis subsp. nov. This strain is available through the Centers for Disease Control and Prevention Rickettsial Isolate Reference Collection (CRIRC EMU002T) and through the Collection de Souches de l’Unite des Rickettsies (CSUR P2883T). The subspecies Ehrlichia muris subsp. muris subsp. nov. is automatically created, and the type strain AS145T is also available through the same collections (CRIRC EMU001T, CSUR E2T). Included is an emended description of E. muris.

      2. Identifying wildlife reservoirs of neglected taeniid tapeworms: Non-invasive diagnosis of endemic Taenia serialis infection in a wild primate population
        Schneider-Crease I, Griffin RH, Gomery MA, Dorny P, Noh JC, Handali S, Chastain HM, Wilkins PP, Nunn CL, Snyder-Mackler N, Beehner JC, Bergman TJ.
        PLoS Negl Trop Dis. 2017 Jul 13;11(7):e0005709.
        Despite the global distribution and public health consequences of Taenia tapeworms, the life cycles of taeniids infecting wildlife hosts remain largely undescribed. The larval stage of Taenia serialis commonly parasitizes rodents and lagomorphs, but has been reported in a wide range of hosts that includes geladas (Theropithecus gelada), primates endemic to Ethiopia. Geladas exhibit protuberant larval cysts indicative of advanced T. serialis infection that are associated with high mortality. However, non-protuberant larvae can develop in deep tissue or the abdominal cavity, leading to underestimates of prevalence based solely on observable cysts. We adapted a non-invasive monoclonal antibody-based enzyme-linked immunosorbent assay (ELISA) to detect circulating Taenia spp. antigen in dried gelada urine. Analysis revealed that this assay was highly accurate in detecting Taenia antigen, with 98.4% specificity, 98.5% sensitivity, and an area under the curve of 0.99. We used this assay to investigate the prevalence of T. serialis infection in a wild gelada population, finding that infection is substantially more widespread than the occurrence of visible T. serialis cysts (16.4% tested positive at least once, while only 6% of the same population exhibited cysts). We examined whether age or sex predicted T. serialis infection as indicated by external cysts and antigen presence. Contrary to the female-bias observed in many Taenia-host systems, we found no significant sex bias in either cyst presence or antigen presence. Age, on the other hand, predicted cyst presence (older individuals were more likely to show cysts) but not antigen presence. We interpret this finding to indicate that T. serialis may infect individuals early in life but only result in visible disease later in life. 
This is the first application of an antigen ELISA to the study of larval Taenia infection in wildlife, opening the doors to the identification and description of infection dynamics in reservoir populations.

    • Drug Safety RSS Word feed
      1. New data on opioid use and prescribing in the United States
        Schuchat A, Houry D, Guy GP.
        JAMA. 2017 Jul 06.

        [No abstract]

    • Environmental Health RSS Word feed
      1. Persistent organic pollutants in infants and toddlers: Relationship between concentrations in matched plasma and faecal samples
        Chen Y, Sjodin A, McLachlan MS, English K, Aylward LL, Toms LL, Varghese J, Sly PD, Mueller JF.
        Environ Int. 2017 Jul 06;107:82-88.
        Early-childhood biomonitoring of persistent organic pollutants (POPs) is challenging due to the logistic and ethical limitations associated with blood sampling. We investigated using faeces as a non-invasive matrix to estimate internal exposure to POPs. The concentrations of selected POPs were measured in matched plasma and faecal samples collected from 20 infants/toddlers (aged 13 +/- 4.8 months), including a repeat sample time point for 13 infants (~5 months apart). We observed higher rates of POP quantification in faeces (2 g dry weight) than in plasma (0.5 mL). Among the five chemicals that had quantification frequencies over 50% in both matrices, log concentrations in faeces (Cf) and blood (Cb) were correlated (r>0.74, P<0.05) for all except HCB: p,p’-dichlorodiphenyldichloroethylene (p,p’-DDE), 2,3′,4,4′,5-pentachlorobiphenyl (PCB118), 2,2′,3,4,4′,5′-hexachlorobiphenyl (PCB138) and 2,2′,4,4′,5,5′-hexachlorobiphenyl (PCB153). We determined faeces:plasma concentration ratios (Kfb), which can be used to estimate Cb from measurements of Cf for infants/toddlers. For a given chemical, the variation in Kfb across individuals was considerable (CV from 0.46 to 0.70). Between 5% and 50% of this variation was attributed to short-term intra-individual variability between successive faecal samples. This variability could be reduced by pooling faeces samples over several days. Some of the remaining variability was attributed to longer-term intra-individual variability, which was consistent with previously reported observations of a decrease in Kfb over the first year of life. The strong correlations between Cf and Cb demonstrate the promise of using faeces for biomonitoring of these compounds. Future research on the sources of variability in Kfb could improve the precision and utility of this technique.
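The paper's proposed use of Kfb is a simple ratio conversion: given a measured faecal concentration Cf and a chemical-specific faeces:plasma ratio Kfb, the plasma concentration is estimated as Cb = Cf / Kfb. A minimal sketch; the Kfb value and concentration below are invented, and units are illustrative only:

```python
# Estimate plasma concentration (Cb) from a faecal measurement (Cf) using a
# chemical-specific faeces:plasma concentration ratio (Kfb), as proposed in
# the abstract. The numbers in the example are hypothetical.

def estimate_plasma_conc(cf, kfb):
    """Cb = Cf / Kfb. Cf and the returned Cb are in matching concentration units."""
    return cf / kfb

# e.g., a hypothetical Cf of 4.2 (ng/g dry weight) and Kfb of 3.0
print(round(estimate_plasma_conc(4.2, 3.0), 2))
```

Because the study found considerable inter-individual variation in Kfb (CV 0.46-0.70), any such estimate carries wide uncertainty; pooling faecal samples over several days, as the authors suggest, narrows the short-term component of that spread.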

      2. Sanitation practices and perceptions in Kakuma refugee camp, Kenya: Comparing the status quo with a novel service-based approach
        Nyoka R, Foote AD, Woods E, Lokey H, O’Reilly CE, Magumba F, Okello P, Mintz ED, Marano N, Morris JF.
        PLoS One. 2017;12(7):e0180864.
        Globally, an estimated 2.5 billion people lack access to improved sanitation. Unimproved sanitation increases the risk of morbidity and mortality, especially in protracted refugee situations where sanitation is based on pit latrine use. Once the pit is full, waste remains in the pit, necessitating the construction of a new latrine, straining available land and funding resources. A viable, sustainable solution is needed. This study used qualitative and quantitative methods to design, implement, and pilot a novel sanitation system in Kakuma refugee camp, Kenya. An initial round of 12 pre-implementation focus group discussions (FGDs) was conducted with Dinka and Somali residents to understand sanitation practices, perceptions, and needs. FGDs and a supplementary pre-implementation survey informed the development of an innovative sanitation management system that incorporated the provision of urine- and liquid-diverting toilets, which separate urine and fecal waste, and a service-based sanitation system that included weekly waste collection. The new system was implemented on a pilot scale for 6 weeks. During the implementation, bi-weekly surveys were administered in each study household to monitor user perceptions and challenges. At the end of the pilot, the sanitation system was assessed using a second round of four post-implementation FGDs. Those who piloted the new sanitation system reported high levels of user satisfaction. Reported benefits included odor reduction, insect/pest reduction, the sitting design, the appropriateness for special populations, and waste collection. However, urine and liquid diversion presented a challenge for users who perform anal washing and for women who had experienced female genital mutilation. Refugee populations are often culturally and ethnically diverse.
Using residents’ input to inform the development of sanitation solutions can increase user acceptability and provide opportunities to improve sanitation system designs based on specific needs.

      3. Lead (Pb), cadmium (Cd), mercury (Hg), and arsenic (As) are among the top 10 pollutants of global health concern. Studies have shown that exposures to these metals produce severe adverse effects. However, the mechanisms underlying these effects, particularly joint toxicities, are poorly understood in humans. The objective of this investigation was to identify and characterize prevalent combinations of these metals and their species in the U.S. NHANES population to provide background data for future studies of potential metal interactions. Exposure was defined as urine or blood levels ≥ the medians of the NHANES 2007-2012 participants aged ≥6 years (n = 7408). Adjusted odds ratios (adj-ORs) and 95% confidence intervals were determined for covariates (age, gender, race/ethnicity, cotinine, and body mass index). Species-specific analysis was also conducted for As and Hg, including iAs (urinary arsenous acid and/or arsenic acid), met-iAs (urinary monomethylarsonic acid and/or dimethylarsinic acid), and oHg (blood methyl-mercury and/or ethyl-mercury). For combinations of As and Hg species, age- and gender-specific prevalence was determined among NHANES 2011-2012 participants (n = 2342). Data showed that approximately 49.3% of the population contained a combination of three or more metals. The most prevalent unique specific combinations were Pb/Cd/Hg/As, Pb/Cd/Hg, and Pb/Cd. Age was consistently associated with these combinations: adj-ORs ranged from 10.9 (Pb/Cd) to 11.2 (Pb/Cd/Hg/As). Race/ethnicity was significant for Pb/Cd/Hg/As. Among women of reproductive age, the frequency of oHg/iAs/met-iAs and oHg/met-iAs was 22.9 and 40.3%, respectively. These findings may help prioritize efforts to assess joint toxicities and their impact on public health.
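The adjusted odds ratios described above come from logistic regression with covariates; the crude (unadjusted) counterpart can be computed directly from a 2x2 table. A minimal sketch, with all counts invented for illustration:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return (or_,
            math.exp(math.log(or_) - 1.96 * se),
            math.exp(math.log(or_) + 1.96 * se))

# Invented counts: older vs. younger participants with/without a
# multi-metal combination (for illustration only).
or_, lo, hi = odds_ratio_ci(400, 100, 200, 500)
print(round(or_, 1), round(lo, 1), round(hi, 1))  # → 10.0 7.6 13.1
```

Adjustment for age, gender, and the other covariates would require the full regression model rather than this single-table calculation.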

      4. Manufacturing of perfluorooctanoic acid (PFOA), a synthetic chemical with a long half-life in humans, peaked between 1970 and 2002, and has since diminished. In the United States, PFOA is detected in the blood of >99% of people tested, but serum concentrations have decreased since 1999. Much is known about exposure to PFOA in drinking water; however, the impact of non-drinking water PFOA exposure on serum PFOA concentrations is not well characterized. The objective of this research is to apply physiologically based pharmacokinetic (PBPK) modeling and Monte Carlo analysis to evaluate the impact of historic non-drinking water PFOA exposure on serum PFOA concentrations. In vitro to in vivo extrapolation was utilized to inform descriptions of PFOA transport in the kidney. Monte Carlo simulations were incorporated to evaluate factors that account for the large inter-individual variability of serum PFOA concentrations measured in individuals from North Alabama in 2010 and 2016, and the Mid-Ohio River Valley between 2005 and 2008. Predicted serum PFOA concentrations were within two-fold of experimental data. With incorporation of Monte Carlo simulations, the model successfully tracked the large variability of serum PFOA concentrations measured in populations from the Mid-Ohio River Valley. Simulation of exposure in a population of 45 adults from North Alabama successfully predicted 98% of individual serum PFOA concentrations measured in 2010 and 2016 when non-drinking water ingestion of PFOA was included. Variation in serum PFOA concentrations may be due to inter-individual variability in the disposition of PFOA and potentially elevated historical non-drinking water exposures.
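The study's PBPK model with kidney transport is far richer, but the role Monte Carlo sampling plays in spreading predicted serum levels can be sketched with a one-compartment steady-state toy model (serum concentration = total daily intake / clearance). Every distribution parameter below is a hypothetical placeholder, not a calibrated value from the paper:

```python
import math
import random
import statistics

def simulate_serum_pfoa(n=10000, seed=1):
    """Monte Carlo sketch of a one-compartment steady-state model,
    C_serum = total daily intake / clearance.  All distribution
    parameters are illustrative placeholders."""
    rng = random.Random(seed)
    levels = []
    for _ in range(n):
        water = rng.lognormvariate(math.log(5.0), 0.5)       # ng/day from drinking water (hypothetical)
        other = rng.lognormvariate(math.log(20.0), 0.6)      # ng/day non-drinking-water sources (hypothetical)
        clearance = rng.lognormvariate(math.log(0.03), 0.4)  # L/day total clearance (hypothetical)
        levels.append((water + other) / clearance)           # predicted serum PFOA, ng/L
    return levels

levels = simulate_serum_pfoa()
print(round(statistics.median(levels)))  # spread across draws mimics inter-individual variability
```

Sampling each parameter per simulated person, rather than using point estimates, is what lets the model reproduce a realistic spread of serum concentrations.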

    • Epidemiology and Surveillance RSS Word feed
      1. Trend and causes of adult mortality in Kersa Health and Demographic Surveillance System (Kersa HDSS), eastern Ethiopia: verbal autopsy method
        Ashenafi W, Eshetu F, Assefa N, Oljira L, Dedefo M, Zelalem D, Baraki N, Demena M.
        Popul Health Metr. 2017 Jul 01;15(1):22.
        BACKGROUND: The health problems of adults have been neglected in many developing countries, yet many studies in these countries show high rates of premature mortality in adults. Measuring adult mortality and its causes through verbal autopsy (VA) methods is becoming an important process for mortality estimates and is a good indicator of overall mortality rates in resource-limited settings. The objective of this analysis is to describe the levels, distribution, and trends of adult mortality over time (2008-2013) and causes of adult deaths using VA in the Kersa Health and Demographic Surveillance System (Kersa HDSS). METHODS: Kersa HDSS is a demographic and health surveillance and research center established in 2007 in the eastern part of Ethiopia. This is a community-based longitudinal study in which VA methods were used to assign probable cause of death. Two or three physicians independently assigned cause of death based on the completed VA forms in accordance with the World Health Organization’s International Classification of Diseases. In this analysis, the VA data considered were of all deaths of adults aged 15 years and above over a period of six years (2008-2013). The mortality fractions were determined and the causes of death analyzed. Analysis was done using STATA, and graphs were designed using Microsoft Excel. RESULTS: A total of 1535 adult deaths occurred in the surveillance site during the study period, and VA was completed for all these deaths. In general, the adult mortality rate over the six-year period was 8.5 per 1000 adult population, higher for males (9.6) and rural residents (8.6) than for females (7.5) and urban residents (8.2). There was a general decrease in mortality rates over the study period, from 9.4 in 2008-2009 to 8.1 in 2012-2013. 
Out of the total deaths, about one-third (32.4%) occurred due to infectious and parasitic causes; the second leading cause of death was diseases of the circulatory system (11.4%), followed by gastrointestinal disorders (9.2%). Tuberculosis (TB) showed an increasing trend over the years and was the leading cause of death in 2012 and 2013 for all adult age categories (15-49, 50-64, and 65 years and over). Chronic liver disease (CLD) was indicated as the leading cause of death among adults in the age group 15-49 years. CONCLUSION: The increasing TB-related mortality in the study years, as well as the relatively high mortality due to CLD among adults aged 15-49 years, should be further investigated and triangulated with health service data to understand the root causes of death.

      2. Enhancing surveillance for mass gatherings: The role of syndromic surveillance
        Fleischauer AT, Gaines J.
        Public Health Rep. 2017 Jul/Aug;132(1_suppl):95s-98s.

        [No abstract]

      3. The evolution of BioSense: Lessons learned and future directions
        Gould DW, Walker D, Yoon PW.
        Public Health Rep. 2017 Jul/Aug;132(1_suppl):7s-11s.
        The BioSense program was launched in 2003 with the aim of establishing a nationwide integrated public health surveillance system for early detection and assessment of potential bioterrorism-related illness. The program has matured over the years from an initial Centers for Disease Control and Prevention-centric program to one focused on building syndromic surveillance capacity at the state and local level. The uses of syndromic surveillance have also evolved from an early focus on alerts for bioterrorism-related illness to situational awareness and response to various hazardous events and disease outbreaks. Future development of BioSense (now the National Syndromic Surveillance Program) includes, in the short term, a focus on data quality with an emphasis on stability, consistency, and reliability and, in the long term, increased capacity and innovation, new data sources and system functionality, and exploration of emerging technologies and analytics.

      4. Using syndromic surveillance for all-hazards public health surveillance: Successes, challenges, and the future
        Yoon PW, Ising AI, Gunn JE.
        Public Health Rep. 2017 Jul/Aug;132(1_suppl):3s-6s.

        [No abstract]

    • Food Safety RSS Word feed
      1. Outbreak characteristics associated with identification of contributing factors to foodborne illness outbreaks
        Brown LG, Hoover ER, Selman CA, Coleman EW, Schurz Rogers H.
        Epidemiol Infect. 2017 Jul 10:1-9.
        Information on the factors that cause or amplify foodborne illness outbreaks (contributing factors), such as ill workers or cross-contamination of food by workers, is critical to outbreak prevention. However, only about half of foodborne illness outbreaks reported to the United States’ Centers for Disease Control and Prevention (CDC) have an identified contributing factor, and data on outbreak characteristics that promote contributing factor identification are limited. To address these gaps, we analyzed data from 297 single-setting outbreaks reported to CDC’s new outbreak surveillance system, which collects data from the environmental health component of outbreak investigations (often called environmental assessments), to identify outbreak characteristics associated with contributing factor identification. These analyses showed that outbreak contributing factors were more often identified when an outbreak etiologic agent had been identified, when the outbreak establishment prepared all meals on location and served more than 150 meals a day, when investigators contacted the establishment to schedule the environmental assessment within a day of the establishment being linked with an outbreak, and when multiple establishment visits were made to complete the environmental assessment. These findings suggest that contributing factor identification is influenced by multiple outbreak characteristics, and that timely and comprehensive environmental assessments are important to contributing factor identification. They also highlight the need for strong environmental health and food safety programs that have the capacity to complete such environmental assessments during outbreak investigations.

    • Health Economics RSS Word feed
      1. Lifetime cost of abusive head trauma at ages 0-4, USA
        Miller TR, Steinbeigle R, Lawrence BA, Peterson C, Florence C, Barr M, Barr RG.
        Prev Sci. 2017 Jul 06.
        This paper aims to estimate lifetime costs resulting from abusive head trauma (AHT) in the USA and the break-even effectiveness for prevention. A mathematical model incorporated data from Vital Statistics, the Healthcare Cost and Utilization Project Kids’ Inpatient Database, and previous studies. Unit costs were derived from published sources. From society’s perspective, discounted lifetime cost of an AHT averages $5.7 million (95% CI $3.2-9.2 million) for a death. It averages $2.6 million (95% CI $1.0-2.9 million) for a surviving AHT victim including $224,500 for medical care and related direct costs (2010 USD). The estimated 4824 incident AHT cases in 2010 had an estimated lifetime cost of $13.5 billion (95% CI $5.5-16.2 billion) including $257 million for medical care, $552 million for special education, $322 million for child protective services/criminal justice, $2.0 billion for lost work, and $10.3 billion for lost quality of life. Government sources paid an estimated $1.3 billion. Out-of-pocket benefits of existing prevention programming would exceed its costs if it prevents 2% of cases. When a child survives AHT, providers and caregivers can anticipate a lifetime of potentially costly and life-threatening care needs. Better effectiveness estimates are needed for both broad prevention messaging and intensive prevention targeting high-risk caregivers.

      2. Medical costs of treating breast cancer among younger Medicaid beneficiaries by stage at diagnosis
        Trogdon JG, Ekwueme DU, Poehler D, Thomas CC, Reeder-Hayes K, Allaire BT.
        Breast Cancer Res Treat. 2017 Jul 12.
        BACKGROUND: Younger women (aged 18-44 years) diagnosed with breast cancer often face more aggressive tumors, higher treatment intensity, and lower survival rates than older women. In this study, we estimated incident breast cancer costs by stage at diagnosis and by race for younger women enrolled in Medicaid. METHODS: We analyzed cancer registry data linked to Medicaid claims in North Carolina from 2003 to 2008. We used Surveillance, Epidemiology, and End Results (SEER) Summary 2000 definitions for cancer stage. We split breast cancer patients into two cohorts: a younger and older group aged 18-44 and 45-64 years, respectively. We conducted a many-to-one match between patients with and without breast cancer using age, county, race, and Charlson Comorbidity Index. We calculated mean excess total cost of care between breast cancer and non-breast cancer patients. RESULTS: At diagnosis, younger women had a higher proportion of regional cancers than older women (49 vs. 42%) and lower proportions of localized cancers (44 vs. 50%) and distant cancers (7 vs. 9%). The excess costs of breast cancer (all stages) for younger and older women at 6 months after diagnosis were $37,114 [95% confidence interval (CI) = $35,769-38,459] and $28,026 (95% CI = $27,223-28,829), respectively. In the 6 months after diagnosis, the estimated excess cost was significantly higher to treat localized and regional cancer among younger women than among older women. There were no statistically significant differences in excess costs of breast cancer by race, but differences in treatment modality were present among younger Medicaid beneficiaries. CONCLUSIONS: Younger breast cancer patients not only had a higher prevalence of late-stage cancer than older women, but also had higher within-stage excess costs.

    • Immunity and Immunization RSS Word feed
      1. On September 19, 2014, CDC published the Advisory Committee on Immunization Practices (ACIP) recommendation for the routine use of 13-valent pneumococcal conjugate vaccine (PCV13) among adults aged ≥65 years, to be used in series with 23-valent pneumococcal polysaccharide vaccine (PPSV23) (1). This replaced the previous recommendation that adults aged ≥65 years should be vaccinated with a single dose of PPSV23. As a proxy for estimating PCV13 and PPSV23 vaccination coverage among adults aged ≥65 years before and after implementation of these revised recommendations, CDC analyzed claims for vaccination submitted for reimbursement to the Centers for Medicare & Medicaid Services (CMS). Claims from any time during a beneficiary’s enrollment in Medicare Parts A (hospital insurance) and B (medical insurance) since reaching age 65 years were assessed among beneficiaries continuously enrolled in Medicare Parts A and B during annual periods from September 19, 2009, through September 18, 2016. By September 18, 2016, 43.2% of Medicare beneficiaries aged ≥65 years had claims for at least 1 dose of PPSV23 (regardless of PCV13 status), 31.5% had claims for at least 1 dose of PCV13 (regardless of PPSV23 status), and 18.3% had claims for at least 1 dose each of PCV13 and PPSV23. Claims for either type of pneumococcal vaccine were higher among beneficiaries who were older, white, or had chronic and immunocompromising medical conditions than among healthy adults. Implementation of the National Vaccine Advisory Committee’s standards for adult immunization practice to assess vaccination status at every patient encounter, recommend needed vaccines, and administer vaccination or refer to a vaccinating provider might help increase pneumococcal vaccination coverage and reduce the risk for pneumonia and invasive pneumococcal disease among older adults (2).

      2. High risk for invasive meningococcal disease among patients receiving eculizumab (Soliris) despite receipt of meningococcal vaccine
        McNamara LA, Topaz N, Wang X, Hariri S, Fox L, MacNeil JR.
        MMWR Morb Mortal Wkly Rep. 2017 Jul 14;66(27):734-737.
        Use of eculizumab (Soliris, Alexion Pharmaceuticals), a terminal complement inhibitor, is associated with a 1,000-fold to 2,000-fold increased incidence of meningococcal disease (1). Administration of meningococcal vaccines is recommended for patients receiving eculizumab before beginning treatment (2,3). Sixteen cases of meningococcal disease were identified in eculizumab recipients in the United States during 2008-2016; among these, 11 were caused by nongroupable Neisseria meningitidis. Fourteen patients had documentation of receipt of at least 1 dose of meningococcal vaccine before disease onset. Because eculizumab recipients remain at risk for meningococcal disease even after receipt of meningococcal vaccines, some health care providers in the United States as well as public health agencies in other countries recommend antimicrobial prophylaxis for the duration of eculizumab treatment; a lifelong course of treatment is expected for many patients. Heightened awareness, early care seeking, and rapid treatment of any symptoms consistent with meningococcal disease are essential for all patients receiving eculizumab treatment, regardless of meningococcal vaccination or antimicrobial prophylaxis status.

      3. BACKGROUND: Vaccination of health care personnel (HCP) can reduce influenza-related morbidity and mortality among HCP and their patients. This study investigated workplace policies associated with influenza vaccination among HCP who work in ambulatory care settings without influenza vaccination requirements. METHODS: Data were obtained from online surveys conducted during April 2014 and April 2015 among nonprobability samples of HCP recruited from 2 preexisting national opt-in Internet panels. Respondents were asked about their vaccination status and workplace policies and interventions related to vaccination. Logistic regression models were used to assess the independent associations between each workplace intervention and influenza vaccination while controlling for occupation, age, and race or ethnicity. RESULTS: Among HCP working in ambulatory care settings without a vaccination requirement (n = 866), 65.7% reported receiving influenza vaccination for the previous influenza season. Increased vaccination coverage was independently associated with free onsite vaccination for 1 day (prevalence ratio [PR], 1.38; 95% confidence interval [CI], 1.07-1.78) or >1 day (PR, 1.58; 95% CI, 1.29-1.94) and with employers sending personal vaccination reminders (PR, 1.20; 95% CI, 0.99-1.46). Age ≥65 years (PR, 1.30; 95% CI, 1.07-1.56) and working as a clinical professional (PR, 1.26; 95% CI, 1.06-1.50) or clinical nonprofessional (PR, 1.28; 95% CI, 1.03-1.60) were also associated with higher coverage. Vaccination coverage increased with increasing numbers of workplace interventions. CONCLUSIONS: Implementing workplace vaccination interventions in ambulatory care settings, including free onsite influenza vaccination that is actively promoted, could help increase influenza vaccination among HCP.

    • Injury and Violence RSS Word feed
      1. Detecting suicide-related emergency department visits among adults using the District of Columbia Syndromic Surveillance System
        Kuramoto-Crawford SJ, Spies EL, Davies-Cole J.
        Public Health Rep. 2017 Jul/Aug;132(1_suppl):88s-94s.
        OBJECTIVES: Limited studies have examined the usefulness of syndromic surveillance to monitor emergency department (ED) visits involving suicidal ideation or attempt. The objectives of this study were to (1) examine whether syndromic surveillance of chief complaint data can detect suicide-related ED visits among adults and (2) assess the added value of using hospital ED data on discharge diagnoses to detect suicide-related visits. METHODS: The study data came from the District of Columbia electronic syndromic surveillance system, which provides daily information on ED visits at 8 hospitals in Washington, DC. We detected suicide-related visits by searching for terms in the chief complaints and discharge diagnoses of 248 939 ED visits for which data were available for October 1, 2015, to September 30, 2016. We examined whether detection of suicide-related visits according to chief complaint data, discharge diagnosis data, or both varied by patient sex, age, or hospital. RESULTS: The syndromic surveillance system detected 1540 suicide-related ED visits, 950 (62%) of which were detected through chief complaint data and 590 (38%) from discharge diagnosis data. The source of detection for suicide-related ED visits did not vary by patient sex or age. However, whether the suicide-related terms were mentioned in the chief complaint or discharge diagnosis differed across hospitals. CONCLUSIONS: ED syndromic surveillance systems based on chief complaint data alone would underestimate the number of suicide-related ED visits. Incorporating the discharge diagnosis into the case definition could help improve detection.
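Detection from chief complaint and discharge diagnosis fields, as described above, typically reduces to keyword matching over free text. A minimal sketch, with an invented term list that is much simpler than the DC system's actual case definition:

```python
# Illustrative term list; a real syndromic case definition also handles
# negation, misspellings, and coded diagnoses.
SUICIDE_TERMS = ("suicide", "suicidal", "self harm", "self-harm")

def detection_source(chief_complaint, discharge_diagnosis):
    """Return which free-text field first flags a visit as suicide-related."""
    def hit(text):
        t = (text or "").lower()
        return any(term in t for term in SUICIDE_TERMS)
    if hit(chief_complaint):
        return "chief complaint"
    if hit(discharge_diagnosis):
        return "discharge diagnosis"
    return None

print(detection_source("pt states suicidal ideation", ""))            # → chief complaint
print(detection_source("abdominal pain", "Suicide attempt, T14.91"))  # → discharge diagnosis
```

Tallying the return values over all visits reproduces the kind of source-of-detection breakdown reported in the study.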

    • Laboratory Sciences RSS Word feed
      1. Ambient particulate matter may upset redox homeostasis, leading to oxidative stress and adverse health effects. Size distributions of water-insoluble and water-soluble OPDTT (dithiothreitol assay, a measure of oxidative potential per air volume) are reported for a roadside site and an urban site. The average water-insoluble fractions were 23% and 51%, and 37% and 39%, for fine and coarse modes at the roadside and urban sites, respectively, measured during different periods. Water-soluble OPDTT was unimodal, peaking near 1-2.5 μm due to contributions from fine-mode organic components plus coarse-mode transition metal ions. In contrast, water-insoluble OPDTT was bimodal, with both fine and coarse modes. The main chemical components that drive both fractions appear to be the same, except that for water-insoluble OPDTT the compounds were adsorbed on surfaces of soot and non-tailpipe traffic dust. They were largely externally mixed and deposited in different regions of the respiratory system, transition metal ions predominately in the upper regions and organic species, such as quinones, deeper in the lung. Although OPDTT per mass (toxicity) was highest for ultrafine particles, estimated lung deposition was mainly from accumulation and coarse particles. Contrasts in the phases of these forms of OPDTT deposited in the respiratory system may have differing health impacts.

      2. The Centers for Disease Control and Prevention developed a biomonitoring method to rapidly and accurately quantify chromium and cobalt in human whole blood by ICP-MS. Many metal-on-metal hip implants, which contain significant amounts of chromium and cobalt, are susceptible to metal degradation. This method is used to gather population data about chromium and cobalt exposure of the U.S. population, excluding people who have metal-on-metal hip implants, so that reference values can be established for baseline levels in blood. We evaluated parameters such as helium gas flow rate, choice and composition of the diluent solution for sample preparation, and sample rinse time to determine the optimal conditions for analysis. The limits of detection for chromium and cobalt in blood were determined to be 0.41 and 0.06 μg L-1, respectively. Method precision, accuracy, and recovery were determined using quality control material created in-house and historical proficiency testing samples. We conducted experiments to determine whether quantitative changes in the method parameters affect the results by varying four parameters while analyzing human whole blood spiked with National Institute of Standards and Technology traceable materials: the dilution factor used during sample preparation, sample rinse time, diluent composition, and kinetic energy discrimination gas flow rate. The results at the increased and decreased levels for each parameter were statistically compared to the results obtained at the optimized parameters. We assessed the degree of reproducibility obtained under a variety of conditions and evaluated the method’s robustness by analyzing the same set of proficiency testing samples by different analysts, on different instruments, with different reagents, and on different days. 
The short-term stability of chromium and cobalt in human blood samples stored at room temperature was monitored over a period of 64 hours by diluting and analyzing samples at different time intervals. The stability of chromium and cobalt post-dilution was also evaluated over a period of 48 hours and at two storage temperatures (room temperature and refrigerated at 4 °C). The results obtained during the stability studies showed that chromium and cobalt are stable in human blood for a period of 64 hours.

      3. Pulsotype diversity of Clostridium botulinum strains containing serotypes A and/or B genes
        Halpin JL, Joseph L, Dykes JK, McCroskey L, Smith E, Toney D, Stroika S, Hise K, Maslanka S, Luquez C.
        Foodborne Pathog Dis. 2017 Jul 10.
        Clostridium botulinum strains are prevalent in the environment and produce a potent neurotoxin that causes botulism, a rare but serious paralytic disease. In 2010, a national PulseNet database was established to curate C. botulinum pulsotypes and facilitate epidemiological investigations, particularly for serotypes A and B strains frequently associated with botulism cases in the United States. Between 2010 and 2014, we performed pulsed-field gel electrophoresis (PFGE) using a PulseNet protocol, uploaded the resulting PFGE patterns into a national database, and analyzed data according to PulseNet criteria (UPGMA clustering, Dice coefficient, 1.5% position tolerance, and 1.5% optimization). A retrospective data analysis was undertaken on 349 entries composed of type A and B strains isolated from foodborne and infant cases to determine epidemiological relevance, resolution of the method, and the diversity of the database. Most studies to date on the pulsotype diversity of C. botulinum have encompassed very small sets of isolates; this study, with over 300 isolates, is more comprehensive than any published to date. Epidemiologically linked isolates had indistinguishable patterns, except in four instances, and no obvious geographic trends were noted. Simpson’s Index of Diversity (D) has historically been used to demonstrate species diversity and abundance within a group and is considered a standard descriptor for PFGE databases. Simpson’s Index was calculated for each restriction endonuclease (SmaI, XhoI) and the pattern combination SmaI-XhoI, as well as for each toxin serotype. The D values indicate that both enzymes provided better resolution for serotype B isolates than serotype A. XhoI as the secondary enzyme provided little additional discrimination for C. botulinum. SmaI patterns can be used to exclude unrelated isolates during a foodborne outbreak, but pulsotypes should always be considered concurrently with available epidemiological data.
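Simpson's Index of Diversity used for PFGE databases is D = 1 - Σ n_i(n_i - 1) / (N(N - 1)); a minimal sketch with hypothetical pulsotype counts:

```python
def simpsons_diversity(counts):
    """Simpson's Index of Diversity for a typing database:
    D = 1 - sum(n_i * (n_i - 1)) / (N * (N - 1)),
    where n_i is the number of isolates sharing pulsotype i."""
    total = sum(counts)
    return 1 - sum(n * (n - 1) for n in counts) / (total * (total - 1))

# Invented counts: 3 isolates share one pattern, 2 share another, 1 is unique.
print(round(simpsons_diversity([3, 2, 1]), 3))  # → 0.733
```

D approaches 1 when nearly every isolate has a distinct pattern (high discriminatory power) and 0 when all isolates share one pattern, which is why it serves as a standard descriptor for comparing enzymes.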

      4. The risk of workers’ exposure to aerosolized particles has increased with the upsurge in the production of engineered nanomaterials. Currently, a whole-body standard test method for measuring particle penetration through protective clothing ensembles is not available. Those available for respirators neglect the most common challenges to ensembles, because they use active vacuum-based filtration, designed to simulate breathing, rather than the positive forces of wind experienced by workers. Thus, a passive method that measures wind-driven particle penetration through ensemble fabric has been developed and evaluated. The apparatus includes a multidomain magnetic passive aerosol sampler housed in a shrouded penetration cell. Performance evaluation was conducted in a recirculation aerosol wind tunnel using paramagnetic Fe3O4 (i.e., iron (II, III) oxide) particles for the challenge aerosol. The particles were collected on a PVC substrate and quantified using a computer-controlled scanning electron microscope. Particle penetration levels were determined by taking the ratio of the particle number collected on the substrate with a fabric (sample) to that without a fabric (control). Results for each fabric obtained by this passive method were compared to previous results from an automated vacuum-based active fractional efficiency tester (TSI 3160), which used sodium chloride particles as the challenge aerosol. Four nonwoven fabrics with a range of thicknesses, porosities, and air permeabilities were evaluated. Smoke tests and flow modeling showed the passive sampler shroud provided smooth (non-turbulent) air flow along the exterior of the sampler, such that disturbance of flow stream lines and distortion of the particle size distribution were reduced. 
Differences between the active and passive approaches were as high as 5.5-fold for the fabric with the lowest air permeability (0.00067 m/sec-Pa), suggesting the active method overestimated penetration in dense fabrics because the active method draws air at a constant flow rate regardless of the resistance of the test fabric. The passive method indicated greater sensitivity since penetration decreased in response to the increase in permeability.
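The penetration metric described above is a simple ratio of paired counts; a sketch with invented particle counts:

```python
def penetration_pct(with_fabric, without_fabric):
    """Percent penetration: particles collected on the substrate behind the
    test fabric (sample) relative to the open control, as described above."""
    return 100.0 * with_fabric / without_fabric

# Invented particle counts from paired passive-sampler runs.
print(penetration_pct(120, 4000))  # → 3.0
```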

      5. Objectives: Workers who fabricate stone countertops using hand tools are at risk of silicosis from overexposure to respirable crystalline silica. This study explored the efficacy of simple engineering controls that can be used for dust suppression during use of hand tools by stone countertop fabricators. Methods: Controlled experiments were conducted to measure whether wet methods and on-tool local exhaust ventilation (LEV) reduced respirable dust (RD) exposures during use of various powered hand tools on quartz-rich engineered stone. RD samples collected during edge grinding with a diamond cup wheel and a silicon carbide abrasive wheel were analyzed gravimetrically as well as by X-ray diffraction to determine silica content. A personal optical aerosol monitor was used simultaneously with the RD samples and also for rapid assessment of controls for polishing, blade cutting, and core drilling. Results: On-tool LEV and sheet-flow-wetting were effective in reducing exposures, especially when used in combination. Sheet-flow-wetting with LEV reduced geometric mean exposures by as much as 95%. However, typical water-spray-wetting on a grinding cup was less effective when combined with LEV than without LEV. Mean silica content of RD samples from grinding operations was 53%, and respirable mass and silica mass were very highly correlated (r = 0.980). Optical concentration measures were moderately well correlated with gravimetric measures (r = 0.817), but on average the optical measures during a single trial using the factory calibration were only one-fifth the simultaneous gravimetric measures. Conclusions: Sheet-flow-wetting combined with on-tool LEV is an effective engineering control for reducing RD exposures during engineered stone edge grinding and blade cutting. On the other hand, addition of LEV to some water-spray-wetted tools may reduce the effectiveness of the wet method.
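Geometric means are the usual summary for lognormally distributed exposure data; a sketch of the percent-reduction calculation with invented sample values (not the study's measurements):

```python
import math

def geometric_mean(values):
    """GM = exp(mean(ln x)), the standard summary for lognormal exposure data."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Invented respirable-dust concentrations (mg/m^3): uncontrolled vs. wet + LEV.
baseline = [4.0, 6.0, 9.0]
controlled = [0.20, 0.30, 0.45]
reduction = 100 * (1 - geometric_mean(controlled) / geometric_mean(baseline))
print(round(reduction))  # → 95
```

A 95% reduction in geometric mean exposure, as reported for sheet-flow-wetting with LEV, corresponds to a 20-fold drop in the GM concentration.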

      6. This method was designed for sampling select quaternary ammonium (quat) compounds in air or on surfaces, followed by analysis using ultraperformance liquid chromatography tandem mass spectrometry. Target quats were benzethonium chloride, didecyldimethylammonium bromide, benzyldimethyldodecylammonium chloride, benzyldimethyltetradecylammonium chloride, and benzyldimethylhexadecylammonium chloride. For air sampling, polytetrafluoroethylene (PTFE) filters are recommended for 15-min to 24-hour sampling. For surface sampling, Pro-wipe 880 (PW) media was chosen. Samples were extracted in 60:40 acetonitrile:0.1% formic acid for 1 hour on an orbital shaker. Method detection limits range from 0.3 to 2 ng/ml depending on media and analyte. Matrix effects of media are minimized through the use of multiple reaction monitoring versus selected ion recording. Upper confidence limits on accuracy meet the National Institute for Occupational Safety and Health 25% criterion for PTFE and PW media for all analytes. Using PTFE and PW analyzed with multiple reaction monitoring, the method quantifies levels among the different quat compounds with high precision (<10% relative standard deviation) and low bias (<11%). The method is sensitive enough, with very low method detection limits, to capture quats on air sampling filters with only a 15-min sample duration and a maximum assessed storage time of 103 days before sample extraction. This method will support future exposure assessment and quantitative epidemiologic studies to explore exposure-response relationships and establish levels of quat exposure associated with adverse health effects.

      7. In the USA, rabies vaccines (RVs) are licensed for intramuscular (IM) use only, although RVs are licensed for use by the intradermal (ID) route in many other countries. Recent limitations in supplies of RV in the USA reopened discussions on the more efficient use of available biologics, including utilization of more stringent risk assessments and potential ID RV administration. A clinical trial was designed to compare the immunogenic and adverse effects of a purified chicken embryo cell (PCEC) RV administered ID or IM. Enrollment was designed in four arms: ID Pre-Exposure Prophylaxis (Pre-EP), IM Pre-EP, ID Booster, and IM Booster vaccination. Enrollment included 130 adult volunteers. The arms with IM administration received vaccine according to the current ACIP recommendations: Pre-EP, three 1 mL (2.5 I.U.) RV doses, one each on days 0, 7, and 21; or a routine Booster, one 1 mL dose. The ID groups received the same schedule, but the doses administered were in a volume of 0.1 mL (0.25 I.U.). The rates of increase in rabies virus neutralizing antibody titers 14-21 days after vaccination were similar in the ID and corresponding IM groups. The GMT values for ID vaccination were slightly lower than those for IM vaccination, for both naive and booster groups, and these differences were statistically significant by t-test. Fourteen days after completing vaccination, all individuals developed RV neutralizing antibody titers over the minimum arbitrary value obtained with the rapid fluorescent focus inhibition test (RFFIT). Antibodies remained over the set threshold until the end of the trial, 160 days after completed vaccination. No serious adverse reactions were reported. The most frequent adverse reactions were erythema, induration, and tenderness localized at the site of injection. Multi-use of 1 mL rabies vaccine vials for ID doses of 0.1 mL was demonstrated to be both safe and immunogenic.

      8. Characterization of Eptesipoxvirus, a novel poxvirus from a microchiropteran bat
        Tu SL, Nakazawa Y, Gao J, Wilkins K, Gallardo-Romero N, Li Y, Emerson GL, Carroll DS, Upton C.
        Virus Genes. 2017 Jul 06.
        The genome of Eptesipoxvirus (EPTV) is the first poxvirus genome isolated from a microbat. The 176,688 nt sequence, which is believed to encompass the complete coding region of the virus, is 67% A+T and is predicted to encode 191 genes. Eleven of these genes have no counterpart in GenBank and are therefore unique to EPTV. The presence of a distantly related ortholog of Vaccinia virus F5L in EPTV uncovered a link with fragmented F5L orthologs in Molluscum contagiosum virus/squirrelpox and clade II viruses. Consistent with the unique position of EPTV approximately mid-point between the orthopoxviruses and the clade II viruses, EPTV has 11 genes that are specific to the orthopoxviruses and 13 genes that are typical, if not exclusive, to the clade II poxviruses. This mosaic nature of EPTV blurs the distinction between the old description of the orthopoxvirus and clade II groups. Genome annotation and characterization failed to find any common virulence genes shared with the other poxvirus isolated from a bat (pteropoxvirus); however, EPTV encodes 3 genes that may have been transferred to or from deerpox and squirrelpox viruses; 2 of these, a putative endothelin-like protein and an MHC class I-like protein, are likely to have immunomodulatory roles.

      9. A new, low-cost approach based on the application of atmospheric radio frequency glow discharge (rf-GD) optical emission spectroscopy (OES) has been developed for near real-time measurement of multielemental concentration in the airborne particulate phase. This method involves deposition of aerosol particles on the tip of a cathode in a coaxial microelectrode system, followed by ablation, atomization, and excitation of the particulate matter using the rf-GD. The resulting atomic emissions are recorded using a spectrometer for elemental identification and quantification. The glow discharge plasma in our system was characterized by measuring spatially resolved gas temperatures (378-1438 K) and electron densities (2-5 × 10^14 cm^-3). Spatial analysis of the spectral features showed that the excitation of the analyte occurred in the region near the collection electrode. The temporal analysis of spectral features in the rf-GD showed that the collected particles were continuously ablated; the time for complete ablation of 193 ng of sucrose particles was found to be approximately 2 s. The system was calibrated using 100 nm particles containing C, Cd, Mn, and Na, respectively. The method provides limits of detection in the range of 0.055-1.0 ng and a measurement reproducibility of 5-28%. This study demonstrates that the rf-GD can be an excellent excitation source for the development of low-cost hand-held sensors for elemental measurement of aerosols.

    • Maternal and Child Health RSS Word feed
      1. Racial and geographic differences in breastfeeding – United States, 2011-2015
        Anstey EH, Chen J, Elam-Evans LD, Perrine CG.
        MMWR Morb Mortal Wkly Rep. 2017 Jul 14;66(27):723-727.
        Breastfeeding provides numerous health benefits for infants and mothers alike. The American Academy of Pediatrics recommends exclusive breastfeeding for approximately the first 6 months of life and continued breastfeeding with complementary foods through at least the first year (1). National estimates indicate substantial differences between non-Hispanic black (black) and non-Hispanic white (white) infants across breastfeeding indicators in the United States (2). CDC analyzed 2011-2015 National Immunization Survey (NIS) data for children born during 2010-2013 to describe breastfeeding initiation, exclusivity through 6 months and duration at 12 months among black and white infants. Among the 34 states (including the District of Columbia [DC]) with sufficient sample size (≥50 per group), initiation rates were significantly (p<0.05) lower among black infants than white infants in 23 states; in 14 of these states (primarily in the South and Midwest), the difference was at least 15 percentage points. A significant difference of at least 10 percentage points was identified in exclusive breastfeeding through 6 months in 12 states and in breastfeeding at 12 months in 22 states. Despite overall increases in breastfeeding rates for black and white infants over the last decade, racial disparities persist. Interventions specifically addressing barriers to breastfeeding for black women are needed.

      2. Are lower TSH cutoffs in neonatal screening for congenital hypothyroidism warranted? A debate
        Lain S, Trumpff C, Grosse SD, Olivieri A, Van Vliet G.
        Eur J Endocrinol. 2017 Jul 10.
        When newborn screening (NBS) for congenital hypothyroidism (CH) using thyroid stimulating hormone (TSH) as a primary screening test was introduced, typical TSH screening cutoffs were 20 to 50 mU/L of whole blood. Over the years, lowering of TSH cutoffs has contributed to an increased prevalence of detected CH. However, a consensus on the benefit deriving from lowering TSH cutoffs at screening is lacking. The present paper outlines arguments both for and against the lowering of TSH cutoffs at NBS. It includes a review of recently published evidence from Australia, Belgium, and Italy. A section focused on economic implications of lowering TSH cutoffs is also provided. One issue that bears further examination is the extent to which mild iodine deficiency at the population level might affect the association of neonatal TSH values with cognitive and developmental outcomes. A debate on TSH cutoffs provides the opportunity to reflect on how to make NBS for CH more effective and to guarantee optimum neurocognitive development and a good quality of life to babies with mild as well as with severe CH. All authors of this debate article agree on the need to establish optimal TSH cutoffs for screening programs in various settings and to ensure the benefits of screening and access to care for newborns worldwide.

      3. Early hearing detection and vocabulary of children with hearing loss
        Yoshinaga-Itano C, Sedey AL, Wiggin M, Chung W.
        Pediatrics. 2017 Jul 08.
        BACKGROUND AND OBJECTIVES: To date, no studies have examined vocabulary outcomes of children meeting all 3 components of the Early Hearing Detection and Intervention (EHDI) guidelines (hearing screening by 1 month, diagnosis of hearing loss by 3 months, and intervention by 6 months of age). The primary purpose of the current study was to examine the impact of the current EHDI 1-3-6 policy on vocabulary outcomes across a wide geographic area. A secondary goal was to confirm the impact of other demographic variables previously reported to be related to language outcomes. METHODS: This was a cross-sectional study of 448 children with bilateral hearing loss between 8 and 39 months of age (mean = 25.3 months, SD = 7.5 months). The children lived in 12 different states and were participating in the National Early Childhood Assessment Project. RESULTS: The combination of 6 factors in a regression analysis accounted for 41% of the variance in vocabulary outcomes. Vocabulary quotients were significantly higher for children who met the EHDI guidelines, were younger, had no additional disabilities, had mild to moderate hearing loss, had parents who were deaf or hard of hearing, and had mothers with higher levels of education. CONCLUSIONS: Vocabulary learning may be enhanced with system improvements that increase the number of children meeting the current early identification and intervention guidelines. In addition, intervention efforts need to focus on preventing widening delays with chronological age, assisting mothers with lower levels of education, and incorporating adults who are deaf/hard-of-hearing in the intervention process.

    • Nutritional Sciences RSS Word feed
      1. Availability and promotion of healthful foods in stores and restaurants – Guam, 2015
        Lundeen EA, VanFrank BK, Jackson SL, Harmon B, Uncangco A, Luces P, Dooyema C, Park S.
        Prev Chronic Dis. 2017 Jul 13;14:E56.
        Chronic disease, which is linked to unhealthy nutrition environments, is highly prevalent in Guam. The nutrition environment was assessed in 114 stores and 63 restaurants in Guam. Stores had limited availability of some healthier foods such as lean ground meat (7.5%) and 100% whole-wheat bread (11.4%), while fruits (81.0%) and vegetables (94.8%) were more commonly available; 43.7% of restaurants offered a healthy entree or main dish salad, 4.1% provided calorie information, and 15.7% denoted healthier choices on menus. Improving the nutrition environment could help customers make healthier choices.

      2. Purpose In 2014, the New Jersey Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) began requiring WIC-authorized stores to stock at least two fresh fruits and two fresh vegetables. We aimed to evaluate the effect of this policy change on fruit and vegetable purchases among WIC-participating households and to assess variation by household access to a healthy food store such as a supermarket or large grocery store. Description Households with continuous WIC enrollment from June 2013 to May 2015 were included (n = 16,415). Participants receive monthly cash-value vouchers (CVVs) to purchase fruits and vegetables. For each household, the CVV redemption proportion was calculated for the period before and after the policy by dividing the total dollar amount redeemed by the total dollar amount issued. Complete redemption was defined as a proportion ≥90% and the change in complete redemption odds was assessed after adjusting for Supplemental Nutrition Assistance Program participation. Assessment We observed a small increase following the policy change [odds ratio (OR) 1.10, 95% confidence interval (CI) 1.04-1.17]; however, the effect varied by healthy food access (p = 0.03). The odds increased for households with access to at least one healthy food store (OR 1.13, 95% CI 1.06-1.20) while no effect was observed for households without such access (OR 0.91, 95% CI 0.76-1.10). Conclusion Policy change was associated with a small increase in purchasing, but only among households with healthy food access. The state is addressing this gap through technical assistance interventions targeting WIC-authorized small stores in communities with limited access.
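
        The redemption measure described above is straightforward to compute. A minimal sketch, using hypothetical dollar amounts and the 90% threshold from the abstract:

```python
# Compute a household's cash-value voucher (CVV) redemption proportion
# and flag "complete redemption" (>= 90%), per the definition in the
# abstract. The dollar amounts in the example are hypothetical.

def redemption_proportion(redeemed_dollars, issued_dollars):
    """Total dollars redeemed divided by total dollars issued over the period."""
    if issued_dollars <= 0:
        raise ValueError("issued amount must be positive")
    return redeemed_dollars / issued_dollars

def is_complete_redemption(redeemed_dollars, issued_dollars, threshold=0.90):
    """True if the household redeemed at least 90% of issued CVV dollars."""
    return redemption_proportion(redeemed_dollars, issued_dollars) >= threshold

# Example: $172.50 redeemed of $180.00 issued over the period
print(round(redemption_proportion(172.50, 180.00), 3))  # 0.958
print(is_complete_redemption(172.50, 180.00))           # True
```

        In the study this binary indicator, computed per household before and after the policy change, is what enters the logistic model that produced the reported odds ratios.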

    • Occupational Safety and Health RSS Word feed
      1. Mortality from amyotrophic lateral sclerosis and Parkinson’s disease among different occupation groups – United States, 1985-2011
        Beard JD, Steege AL, Ju J, Lu J, Luckhaupt SE, Schubauer-Berigan MK.
        MMWR Morb Mortal Wkly Rep. 2017 Jul 14;66(27):718-722.
        Amyotrophic lateral sclerosis (ALS) and Parkinson’s disease, both progressive neurodegenerative diseases, affect >1 million Americans (1,2). Consistently reported risk factors for ALS include increasing age, male sex, and cigarette smoking (1); risk factors for Parkinson’s disease include increasing age, male sex, and pesticide exposure, whereas cigarette smoking and caffeine consumption are inversely associated (2). Relative to cancer or respiratory diseases, the role of occupation in neurologic diseases is much less studied and less well understood (3). CDC evaluated associations between usual occupation and ALS and Parkinson’s disease mortality using data from CDC’s National Institute for Occupational Safety and Health (NIOSH) National Occupational Mortality Surveillance (NOMS), a population-based surveillance system that includes approximately 12.1 million deaths from 30 U.S. states.* Associations were estimated using proportionate mortality ratios (PMRs), standardizing indirectly by age, sex, race, and calendar year to the standard population of all NOMS deaths with occupation information. Occupations associated with higher socioeconomic status (SES) had elevated ALS and Parkinson’s disease mortality. The shifts in the U.S. workforce toward older ages and higher SES occupations highlight the importance of understanding this finding, which will require studies with designs that provide evidence for causality, detailed exposure assessment, and adjustment for additional potential confounders.

      2. OBJECTIVE: Asthma severity is defined as the intensity of treatment required to achieve good control of asthma symptoms. Studies have shown that work-related asthma (WRA) can be associated with poorer asthma control and more severe symptoms than non-WRA. Associations between asthma medications and WRA status were assessed using data from the 2012-2013 Asthma Call-back Survey among ever-employed adults (≥18 years) with current asthma from 29 states. METHODS: Persons with WRA had been told by a physician that their asthma was work-related. Persons with possible WRA had asthma caused or made worse by their current or previous job, but did not have physician-diagnosed WRA. Asthma medications were classified as controller (i.e., long-acting beta-agonist, inhaled corticosteroid, oral corticosteroid, cromolyn/nedocromil, leukotriene pathway inhibitor, methylxanthine, anti-cholinergics) and rescue (i.e., short-acting beta-agonist). Demographic and clinical characteristics were examined. Associations between asthma medications and WRA status were assessed using a multivariate logistic regression to calculate adjusted prevalence ratios (PRs). RESULTS: Among an estimated 15 million ever-employed adults with current asthma, 14.7% had WRA and an additional 40.4% had possible WRA. Compared with adults with non-WRA, those with WRA were more likely to have taken anti-cholinergics (PR = 1.80), leukotriene pathway inhibitor (PR = 1.59), and methylxanthine (PR = 4.76), and those with possible WRA were more likely to have taken methylxanthine (PR = 2.85). CONCLUSIONS: Results provide additional evidence of a higher proportion of severe asthma among adults with WRA compared to non-WRA. To achieve optimal asthma control, adults with WRA may require additional intervention, such as environmental controls or removal from the workplace exposure.

      3. Capture and coding of industry and occupation measures: Findings from eight National Program of Cancer Registries states
        Freeman MB, Pollack LA, Rees JR, Johnson CJ, Rycroft RK, Rousseau DL, Hsieh MC.
        Am J Ind Med. 2017 Aug;60(8):689-695.
        BACKGROUND: Although data on industry and occupation (I&O) are important for understanding cancer risks, obtaining standardized data is challenging. This study describes the capture of specific I&O text and the ability of a web-based tool to translate text into standardized codes. METHODS: Data on 62 525 cancers cases received from eight National Program of Cancer Registries (NPCR) states were submitted to a web-based coding tool developed by the National Institute for Occupational Safety and Health for translation into standardized I&O codes. We determined the percentage of sufficiently analyzable codes generated by the tool. RESULTS: Using the web-based coding tool on data obtained from chart abstraction, the NPCR cancer registries achieved between 48% and 75% autocoding, but only 12-57% sufficiently analyzable codes. CONCLUSIONS: The ability to explore associations between work-related exposures and cancer is limited by current capture and coding of I&O data. Increased training of providers and registrars, as well as software enhancements, will improve the utility of I&O data.

      4. Mortality from circulatory diseases and other non-cancer outcomes among nuclear workers in France, the United Kingdom and the United States (INWORKS)
        Gillies M, Richardson DB, Cardis E, Daniels RD, O’Hagan JA, Haylock R, Laurier D, Leuraud K, Moissonnier M, Schubauer-Berigan MK, Thierry-Chef I, Kesminiene A.
        Radiat Res. 2017 Jul 10.
        Positive associations between external radiation dose and non-cancer mortality have been made in a number of published studies, primarily of populations exposed to high-dose, high-dose-rate ionizing radiation. The goal of this study was to determine whether external radiation dose was associated with non-cancer mortality in a large pooled cohort of nuclear workers exposed to low-dose radiation accumulated at low dose rates. The cohort comprised 308,297 workers from France, United Kingdom and United States. The average cumulative equivalent dose at a tissue depth of 10 mm [Hp(10)] was 25.2 mSv. In total, 22% of the cohort were deceased by the end of follow-up, with 46,029 deaths attributed to non-cancer outcomes, including 27,848 deaths attributed to circulatory diseases. Poisson regression was used to investigate the relationship between cumulative radiation dose and non-cancer mortality rates. A statistically significant association between radiation dose and all non-cancer causes of death was observed [excess relative risk per sievert (ERR/Sv) = 0.19; 90% CI: 0.07, 0.30]. This was largely driven by the association between radiation dose and mortality due to circulatory diseases (ERR/Sv = 0.22; 90% CI: 0.08, 0.37), with slightly smaller positive, but nonsignificant, point estimates for mortality due to nonmalignant respiratory disease (ERR/Sv = 0.13; 90% CI: -0.17, 0.47) and digestive disease (ERR/Sv = 0.11; 90% CI: -0.36, 0.69). The point estimate for the association between radiation dose and deaths due to external causes of death was nonsignificantly negative (ERR = -0.12; 90% CI: <-0.60, 0.45). Within circulatory disease subtypes, associations with dose were observed for mortality due to cerebrovascular disease (ERR/Sv = 0.50; 90% CI: 0.12, 0.94) and mortality due to ischemic heart disease (ERR/Sv = 0.18; 90% CI: 0.004, 0.36). 
The estimates of associations between radiation dose and non-cancer mortality are generally consistent with those observed in atomic bomb survivor studies. The findings of this study could be interpreted as providing further evidence that non-cancer disease risks may be increased by external radiation exposure, particularly for ischemic heart disease and cerebrovascular disease. However, heterogeneity in the estimated ERR/Sv was observed, which warrants further investigation. Further follow-up of these cohorts, with the inclusion of internal exposure information and other potential confounders associated with lifestyle factors, may prove informative, as will further work on elucidating the biological mechanisms that might cause these non-cancer effects at low doses.
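
        The excess relative risk parameterization reported above has a simple multiplicative form. A minimal sketch of the linear ERR dose-response structure commonly used in radiation epidemiology (the ERR/Sv value is taken from the abstract; the example dose is the cohort mean, and the baseline rate is left implicit):

```python
# Linear excess relative risk (ERR) model, as commonly used in radiation
# epidemiology: rate(dose) = baseline_rate * (1 + err_per_sv * dose_sv).
# ERR/Sv = 0.22 (circulatory disease) is from the abstract; treating the
# model as strictly linear in dose is the standard simplifying assumption.

def relative_rate(dose_sv, err_per_sv):
    """Rate relative to baseline under a linear ERR dose-response model."""
    return 1.0 + err_per_sv * dose_sv

# At the cohort mean cumulative dose of 25.2 mSv (0.0252 Sv):
print(round(relative_rate(0.0252, 0.22), 4))  # 1.0055
```

        The small relative rate at the mean dose illustrates why very large pooled cohorts such as INWORKS are needed to detect these associations.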

      5. Stressful life events and posttraumatic growth among police officers: A cross-sectional study
        Leppma M, Mnatsakanova A, Sarkisian K, Scott O, Adjeroh L, Andrew ME, Violanti JM, McCanlies EC.
        Stress Health. 2017 Jul 13.
        Police officers often continue to face numerous threats and stressors in the aftermath of a disaster. To date, posttraumatic growth (PTG) has been studied primarily in the context of significant trauma; thus, it is not known whether stressful life events are associated with PTG. This study investigated the development of PTG among 113 police officers working in the New Orleans area following Hurricane Katrina. Hierarchical regression was used to evaluate whether gratitude, social support, and satisfaction with life moderated the relationship between stressful life events (as measured by the total life stress score) and PTG, after adjustment for age, sex, race, level of involvement in Hurricane Katrina, and alcohol intake. Results indicate that stressful life events are independently associated with PTG. Gratitude, satisfaction with life, and social support were seen to moderate this relationship; as stressful life events increased so too did PTG, particularly among officers with higher levels of gratitude (B = 0.002, p ≤ .05), satisfaction with life (B = 0.002, p ≤ .05), and social support (B = 0.001, p ≤ .05). These findings suggest that promoting satisfaction with life, interpersonal support, and gratitude may be beneficial to those who are regularly at risk of trauma exposure.
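
        Moderation in the sense used above means the stress-PTG slope depends on the moderator, which is modeled as a regression interaction term. A minimal sketch with entirely hypothetical coefficients (only the positive sign of the interaction, matching the reported B values, is taken from the abstract):

```python
# Moderated regression: PTG = b0 + b1*stress + b2*gratitude
#                             + b3*(stress * gratitude)
# A positive interaction coefficient b3 makes the stress-PTG slope
# steeper at higher gratitude, as reported in the abstract.
# All coefficient values here are hypothetical.

def ptg_hat(stress, gratitude, b0=50.0, b1=0.01, b2=0.5, b3=0.002):
    """Predicted PTG from a regression with a stress x gratitude interaction."""
    return b0 + b1 * stress + b2 * gratitude + b3 * stress * gratitude

def stress_slope(gratitude, b1=0.01, b3=0.002):
    """Marginal effect of stress on PTG at a given gratitude level."""
    return b1 + b3 * gratitude

print(stress_slope(10))  # slope at a lower gratitude level
print(stress_slope(30))  # steeper slope at a higher gratitude level
```

        Plotting these conditional slopes at low and high moderator values is the usual way such interactions are probed.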

    • Occupational Safety and Health – Mining RSS Word feed
      1. Morbidity and health risk factors among New Mexico miners: A comparison across mining sectors
        Shumate AM, Yeoman K, Victoroff T, Evans K, Karr R, Sanchez T, Sood A, Laney AS.
        J Occup Environ Med. 2017 Jul 10.
        OBJECTIVE: This study examines differences in chronic health outcomes between coal, uranium, metal, and nonmetal miners. METHODS: In a cross-sectional study using data from a health screening program for current and former New Mexico miners, log-binomial logistic regression models were used to estimate relative risks of respiratory and heart disease, cancer, osteoarthritis, and back pain associated with mining in each sector as compared with coal, adjusting for other relevant risk factors. RESULTS: Differential risks in angina, pulmonary symptoms, asthma, cancer, osteoarthritis, and back pain between mining sectors were found. CONCLUSIONS: New Mexico miners experience different chronic health challenges across sectors. These results demonstrate the importance of using comparable data to understand how health risks differ across mining sectors. Further investigation among a broader geographic population of miners will help identify the health priorities and needs in each sector.

    • Physical Activity RSS Word feed
      1. With aging, muscle injury from rapid, continuous stretch-shortening contractions (SSCs) is prolonged and maladaptation to moderate-velocity, intermittent SSCs is more common. We investigate the hypothesis that high baseline levels of inflammatory signaling and oxidative stress may underlie these outcomes, while careful modulation of high-intensity SSC training design resets basal conditions and permits muscle adaptation to SSCs.

    • Reproductive Health RSS Word feed
      1. Pregnancy-related mortality in the United States, 2011-2013
        Creanga AA, Syverson C, Seed K, Callaghan WM.
        Obstet Gynecol. 2017 Jul 07.
        OBJECTIVE: To update national population-level pregnancy-related mortality estimates and examine characteristics and causes of pregnancy-related deaths in the United States during 2011-2013. METHODS: We conducted an observational study using population-based data from the Pregnancy Mortality Surveillance System to calculate pregnancy-related mortality ratios by year, age group, and race-ethnicity groups. We explored 10 cause-of-death categories by pregnancy outcome during 2011-2013 and compared their distribution with those in our earlier reports since 1987. RESULTS: The 2011-2013 pregnancy-related mortality ratio was 17.0 deaths per 100,000 live births. Pregnancy-related mortality ratios increased with maternal age, and racial-ethnic disparities persisted with non-Hispanic black women having a 3.4 times higher mortality ratio than non-Hispanic white women. Among causes of pregnancy-related deaths, the following groups contributed more than 10%: cardiovascular conditions ranked first (15.5%) followed by other medical conditions often reflecting pre-existing illnesses (14.5%), infection (12.7%), hemorrhage (11.4%), and cardiomyopathy (11.0%). Relative to the most recent report of Pregnancy Mortality Surveillance System data for 2006-2010, the distribution of cause-of-death categories did not change considerably. However, compared with serial reports before 2006-2010, the contribution of hemorrhage, hypertensive disorders of pregnancy, and anesthesia complications declined, whereas that of cardiovascular and other medical conditions increased (population-level percentage comparison). CONCLUSION: The pregnancy-related mortality ratio and the distribution of the main causes of pregnancy-related mortality have been relatively stable in recent years.

    • Statistics as Topic RSS Word feed
      1. Objective: Direct reading instruments are valuable tools for measuring exposure as they provide real-time measurements for rapid decision making. However, their use is limited to general survey applications in part due to issues related to their performance. Moreover, statistical analysis of real-time data is complicated by autocorrelation among successive measurements, non-stationary time series, and the presence of left-censoring due to limit-of-detection (LOD). A Bayesian framework is proposed that accounts for non-stationary autocorrelation and LOD issues in exposure time-series data in order to model workplace factors that affect exposure and estimate summary statistics for tasks or other covariates of interest. Method: A spline-based approach is used to model non-stationary autocorrelation with relatively few assumptions about autocorrelation structure. Left-censoring is addressed by integrating over the left tail of the distribution. The model is fit using Markov chain Monte Carlo within a Bayesian paradigm. The method can flexibly account for hierarchical relationships, random effects and fixed effects of covariates. The method is implemented using the rjags package in R, and is illustrated by applying it to real-time exposure data. Estimates for task means and covariates from the Bayesian model are compared to those from conventional frequentist models including linear regression, mixed-effects, and time-series models with different autocorrelation structures. Simulation studies are also conducted to evaluate method performance. Results: Simulation studies with percent of measurements below the LOD ranging from 0 to 50% showed lowest root mean squared errors for task means and the least biased standard deviations from the Bayesian model compared to the frequentist models across all levels of LOD. In the application, task means from the Bayesian model were similar to means from the frequentist models, while the standard deviations were different.
Parameter estimates for covariates were significant in some frequentist models, but in the Bayesian model their credible intervals contained zero; such discrepancies were observed in multiple datasets. Variance components from the Bayesian model reflected substantial autocorrelation, consistent with the frequentist models, except for the auto-regressive moving average model. Plots of means from the Bayesian model showed good fit to the observed data. Conclusion: The proposed Bayesian model provides an approach for modeling non-stationary autocorrelation in a hierarchical modeling framework to estimate task means, standard deviations, quantiles, and parameter estimates for covariates that are less biased and have better performance characteristics than some of the contemporary methods.
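
        The left-censoring treatment described above (integrating over the left tail of the distribution) has a simple likelihood interpretation: an observation below the LOD contributes the tail probability P(X < LOD) rather than a density value. A minimal sketch, assuming a lognormal exposure distribution (a common choice for exposure data; the LOD and parameter values are hypothetical):

```python
# Left-censored likelihood contribution: a measurement below the limit
# of detection (LOD) contributes P(X < LOD), i.e. the integral of the
# density over the left tail, rather than a point density.
# Sketch assumes a lognormal exposure distribution; mu and sigma are on
# the log scale, and all numeric values below are hypothetical.
import math

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def log_likelihood(x, lod, mu, sigma):
    """Log-likelihood contribution of one observation under left-censoring."""
    if x < lod:
        # Censored: integrate the lognormal density over (0, LOD)
        return math.log(normal_cdf((math.log(lod) - mu) / sigma))
    # Observed: lognormal log-density at x
    z = (math.log(x) - mu) / sigma
    return -math.log(x * sigma * math.sqrt(2.0 * math.pi)) - 0.5 * z * z

# Hypothetical: LOD = 0.05, lognormal with mu = -2, sigma = 1
print(round(log_likelihood(0.01, 0.05, -2.0, 1.0), 3))  # censored tail term
print(round(log_likelihood(0.20, 0.05, -2.0, 1.0), 3))  # observed density term
```

        Note that every censored value contributes the same tail term regardless of its (unknown) magnitude; in the paper's Bayesian implementation this term is evaluated inside the MCMC sampler alongside the autocorrelation model.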


CDC Science Clips Production Staff

  • John Iskander, MD MPH, Editor
  • Gail Bang, MLIS, Librarian
  • Kathy Tucker, Librarian
  • William (Bill) Thomas, MLIS, Librarian
  • Onnalee Gomez, MS, Health Scientist
  • Jarvis Sims, MIT, MLIS, Librarian

____

DISCLAIMER: Articles listed in the CDC Science Clips are selected by the Stephen B. Thacker CDC Library to provide current awareness of the public health literature. An article's inclusion does not necessarily represent the views of the Centers for Disease Control and Prevention nor does it imply endorsement of the article's methods or findings. CDC and DHHS assume no responsibility for the factual accuracy of the items presented. The selection, omission, or content of items does not imply any endorsement or other position taken by CDC or DHHS. Opinion, findings and conclusions expressed by the original authors of items included in the Clips, or persons quoted therein, are strictly their own and are in no way meant to represent the opinion or views of CDC or DHHS. References to publications, news sources, and non-CDC Websites are provided solely for informational purposes and do not imply endorsement by CDC or DHHS.
