Methodology of the Youth Risk Behavior Surveillance System — 2013
The material in this report originated in the National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, Rima F. Khabbaz, MD, Acting Director; and the Division of Adolescent and School Health, Howell Wechsler, EdD, Director.
Corresponding preparer: Nancy D. Brener, PhD, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, 4770 Buford Highway NE, MS K-33, Atlanta, GA 30341; Telephone: 770-488-6184; Fax: 770-488-6156; E-mail: nad1@cdc.gov.
Summary
Priority health-risk behaviors (i.e., interrelated and preventable behaviors that contribute to the leading causes of morbidity and mortality among youths and adults) often are established during childhood and adolescence and extend into adulthood. The Youth Risk Behavior Surveillance System (YRBSS), established in 1991, monitors six categories of priority health-risk behaviors among youths and young adults: 1) behaviors that contribute to unintentional injuries and violence; 2) sexual behaviors that contribute to human immunodeficiency virus (HIV) infection, other sexually transmitted diseases, and unintended pregnancy; 3) tobacco use; 4) alcohol and other drug use; 5) unhealthy dietary behaviors; and 6) physical inactivity. In addition, YRBSS monitors the prevalence of obesity and asthma among this population.
YRBSS data are obtained from multiple sources, including a national school-based survey conducted by CDC as well as school-based state, territorial, tribal, and large urban school district surveys conducted by education and health agencies. These surveys have been conducted biennially since 1991 and include representative samples of students in grades 9–12. In 2004, a description of the YRBSS methodology was published (CDC. Methodology of the Youth Risk Behavior Surveillance System. MMWR 2004;53[No. RR-12]). Since 2004, improvements have been made to YRBSS, including increases in coverage and expanded technical assistance. This report describes these changes and updates earlier descriptions of the system, including questionnaire content; operational procedures; sampling, weighting, and response rates; data-collection protocols; data-processing procedures; reports and publications; and data quality. This report also includes results of methods studies that systematically examined how different survey procedures affect prevalence estimates. YRBSS continues to evolve to meet the needs of CDC and other data users through the ongoing revision of the questionnaire, the addition of new populations, and the development of innovative methods for data collection.
Background and Rationale
Data from surveillance systems are critical for planning and evaluating public health programs. During the late 1980s, when CDC began funding education agencies to implement school-based programs to prevent human immunodeficiency virus (HIV) infection, only a limited number of health-related school-based surveys existed in the United States to inform program planning and evaluation. The Monitoring the Future study had been ongoing since 1975 (1). This study measured drug use and related determinants in a national sample of students in grade 12; it has since been expanded to include students in grades 8 and 10 and a broader health-risk behavior focus. In 1987, the one-time National Adolescent Student Health Survey was administered to a nationally representative sample of students in grades 8 and 10; this survey measured consumer skills (e.g., reading food and drug labels), alcohol and other drug use, injury prevention, nutrition, knowledge and attitudes about sexually transmitted diseases (STDs) and acquired immunodeficiency syndrome (AIDS), attempted suicide, and violence-related behaviors (2). In addition, in 1989, CDC conducted a national survey to measure knowledge, beliefs, and behaviors concerning HIV among high school students (3). However, surveys conducted only on a national level, one-time surveys, and surveys addressing only certain categories of health-risk behaviors could not meet the needs of state, territorial, and local education and health agencies that had begun receiving funding to implement school health programs.
More specifically, in 1987, CDC began providing financial and technical assistance to state, territorial, and local education agencies to implement effective HIV prevention programs for youths. Since 1992, CDC also has provided financial and technical assistance to state education agencies to implement additional broad-based programs, often referred to as "coordinated school health programs," which focus on obesity and tobacco use prevention. Since 2008, CDC also has funded tribal governments for HIV prevention and coordinated school health programs.
Before 1991, school-based HIV prevention programs and coordinated school health programs frequently were developed without empiric information on the prevalence of key behaviors that most influence health and on how those behaviors varied over time and across subgroups of students. To plan and help determine the effectiveness of school health programs, public health and education officials need to understand how programs influence the health-risk behaviors associated with the leading causes of morbidity and mortality among youths and adults in the United States.
In 1991, to address the need for data on the health-risk behaviors that contribute substantially to the leading causes of morbidity and mortality among U.S. youths and young adults, CDC developed the Youth Risk Behavior Surveillance System (YRBSS), which monitors six categories of priority health-risk behaviors among youths and young adults: 1) behaviors that contribute to unintentional injuries and violence; 2) sexual behaviors that contribute to HIV infection, other STDs, and unintended pregnancy; 3) tobacco use; 4) alcohol and other drug use; 5) unhealthy dietary behaviors; and 6) physical inactivity. In addition, the surveillance system monitors the prevalence of obesity and asthma among this population. The system includes a national school-based survey conducted by CDC as well as school-based state, territorial, tribal, and large urban school district surveys conducted by education and health agencies. In these surveys, conducted biennially since 1991, representative samples of students typically in grades 9–12 are drawn. In 2004, a description of the YRBSS methodology was published (4). This updated report discusses changes that have been made to YRBSS since 2004 and provides an updated, detailed description of the features of the system, including questionnaire content; operational procedures; sampling, weighting, and response rates; data-collection protocols; data-processing procedures; reports and publications; and data quality. This report also includes results of new methods studies on the use of computer-based data collection and describes enhancements made to the technical assistance system that supports state, territorial, tribal, and large urban school district surveys.
Purposes of YRBSS
YRBSS has multiple purposes. The system was designed to enable public health professionals, educators, policy makers, and researchers to 1) describe the prevalence of health-risk behaviors among youths, 2) assess trends in health-risk behaviors over time, and 3) evaluate and improve health-related policies and programs. YRBSS also was developed to provide comparable national, state, territorial, and large urban school district data as well as comparable data among subpopulations of youths (e.g., racial/ethnic subgroups) and to monitor progress toward achieving national health objectives (5–7) (Table 1) and other program indicators (e.g., CDC's performance on selected Government Performance and Results Act measures) (8). Although YRBSS is designed to produce information to help assess the effect of broad national, state, territorial, tribal, and local policies and programs, it was not designed to evaluate the effectiveness of specific interventions (e.g., a professional development program, school curriculum, or media campaign).
As YRBSS was being developed, CDC decided that the system should focus almost exclusively on health-risk behaviors rather than on the determinants of these behaviors (e.g., knowledge, attitudes, beliefs, and skills), because there is a more direct connection between specific health-risk behaviors and specific health outcomes than between determinants of behaviors and health outcomes. Many behaviors (e.g., alcohol and other drug use and sexual behaviors) measured by YRBSS also are associated with educational and social outcomes, including absenteeism, poor academic achievement, and dropping out of school (9).
Data Sources
YRBSS data sources include ongoing surveys as well as one-time national surveys, special-population surveys, and methods studies. The ongoing surveys include school-based national, state, tribal, and large urban school district surveys of representative samples of high school students and, in certain sites, representative state, territorial, and large urban school district surveys of middle school students. The ongoing surveys are conducted biennially; each cycle begins in July of the preceding even-numbered year (e.g., in 2010 for the 2011 cycle) when the questionnaire for the upcoming year is released and continues until the data are published in June of the following even-numbered year (e.g., in 2012 for the 2011 cycle). This section describes the ongoing surveys, one-time national surveys, and special-population surveys. Methods studies are described elsewhere in this report (see Data Quality).
This report focuses predominantly on the ongoing school-based national, state, territorial, tribal, and large urban school district surveys. The national Youth Risk Behavior Survey (YRBS) provides data representative of students in grades 9–12 attending U.S. high schools. State, territorial, tribal, and large urban school district surveys provide data representative of high school students or middle school students in states, territories, tribal governments, and large urban school districts that receive funding from CDC through cooperative agreements. Starting in 2013, education or health agencies in all 50 states, seven territorial education agencies, and 31 local education agencies are eligible to receive funding to conduct a YRBS.
One-Time National Surveys
Several one-time national surveys have been conducted as part of YRBSS. These one-time surveys include a Youth Risk Behavior Supplement, which was added to the 1992 National Health Interview Survey to provide information regarding persons aged 12–21 years, including youths attending school as well as those not attending school (10); a National College Health Risk Behavior Survey, which was conducted in 1995 to measure the prevalence of health-risk behaviors among undergraduate students enrolled in public and private 2- and 4-year colleges and universities (11); and a National Alternative High School Youth Risk Behavior Survey, which was conducted in 1998 to measure selected health-risk behaviors among a nationally representative sample of students in grades 9–12 attending alternative high schools (12).
In 2010, also as part of YRBSS, CDC conducted the National Youth Physical Activity and Nutrition Study (NYPANS), which was designed to 1) provide nationally representative data on behaviors and behavioral determinants related to nutrition and physical activity among high school students, 2) provide data to help improve the clarity and test the performance of questions on the YRBSS questionnaire, and 3) enhance understanding of the associations among behaviors and behavioral determinants related to physical activity and nutrition and their association with body mass index (BMI) (weight [kg]/height [m]²). The study included a paper-and-pencil questionnaire administered to a nationally representative sample of 11,429 students attending public and private schools in grades 9–12, a standardized protocol to measure height and weight among all students completing the questionnaire, and telephone interviews to measure 24-hour dietary recalls among a subsample of 909 students (8% of those who completed questionnaires).
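As a point of reference, BMI as defined above (weight in kilograms divided by the square of height in meters) can be computed in a few lines. The function below is an illustrative sketch, not part of the NYPANS or YRBSS protocol:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

# Hypothetical student: 68 kg, 1.70 m
print(round(bmi(68.0, 1.70), 1))  # prints 23.5
```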
Special-Population Surveys
Special-population surveys related to short-term federal initiatives have been conducted periodically as part of YRBSS. In 2005, 2007, and 2009, a total of 40 communities participating in the Steps to a HealthierUS program (13) conducted at least one school-based survey of students in grades 9–12 in their program intervention areas. These communities used a modified YRBSS questionnaire that measured dietary behaviors, physical activity, and tobacco use and the prevalence of obesity and asthma (13). In 2010 and 2011, a total of 44 communities participating in the Communities Putting Prevention to Work (CPPW) program conducted school-based surveys of students in grades 9–12 in their program intervention areas. Six communities used the standard YRBSS questionnaire, and 38 used a modified questionnaire that measured dietary behaviors, physical activity, and tobacco use and the prevalence of obesity. In 2012 and 2013, a total of 17 CPPW communities will conduct a second YRBS.
In addition to the fiscal and technical support provided through cooperative agreements to tribal governments to conduct a YRBS, CDC also provides technical assistance for other surveys of American Indian youths. Since 1994, the Bureau of Indian Education (BIE) has conducted a YRBS periodically among American Indian youths attending middle and high schools funded by BIE. Since 1997, the Navajo Nation has conducted a YRBS periodically in schools on Navajo reservations and in border town schools having high Navajo enrollment. In 2011, CDC also provided technical assistance to the Nez Perce Tribe to conduct a YRBS.
Questionnaire
Initial Questionnaire Development
To determine which health-risk behaviors YRBSS would assess initially, CDC first reviewed the leading causes of morbidity and mortality among youths and adults. In 1988, four causes accounted for 68% of all deaths among persons aged 1–24 years: motor-vehicle crashes (31%), other unintentional injuries (14%), homicide (13%), and suicide (10%) (14). In 2008, of all deaths among persons aged 10–24 years, 72% were attributed to these four causes: 26% resulted from motor-vehicle crashes, 17% from other unintentional injuries, 16% from homicide, and 13% from suicide (15). In 1988, substantial morbidity also resulted from approximately 1 million pregnancies occurring among adolescents (16) and the estimated 12 million cases of STDs among persons aged 15–29 years (17). Although rates of pregnancy and STDs among adolescents have decreased during 1991–2009 (18–20), pregnancy and STDs, including HIV infection, remain critical public health problems among youths. In 1988, approximately two thirds of all deaths among adults aged >25 years resulted from cardiovascular disease (41%) and cancer (23%) (14). In 2008, the percentage of deaths among persons in this age group resulting from cardiovascular disease had decreased to 34%, but the percentage resulting from cancer remained at 23% (15).
These serial reviews indicate that virtually all behaviors contributing to the leading causes of morbidity and mortality can be placed into six priority health-risk behavior categories: 1) behaviors that contribute to unintentional injuries and violence; 2) sexual behaviors that contribute to HIV infection, other STDs, and unintended pregnancy; 3) tobacco use; 4) alcohol and other drug use; 5) unhealthy dietary behaviors; and 6) physical inactivity. These behaviors frequently are interrelated and often are established during childhood and adolescence and extend into adulthood.
In 1989, CDC asked each of the federal agencies responsible for improving or monitoring the incidence and prevalence of behavioral risks in each of the six categories to appoint a person to serve on a YRBSS steering committee. In August 1989, CDC and steering committee members convened a 2-day workshop to identify priority behaviors and devise questions to measure those behaviors. For each of the six priority health-risk behavior categories, a panel was established that included scientific experts from other federal agencies, including the U.S. Department of Education, the National Institutes of Health, the Health Resources and Services Administration, and the Office of the Assistant Secretary for Health as well as scientists from academic institutions, survey research specialists from CDC's National Center for Health Statistics (NCHS), and staff from CDC's Division of Adolescent and School Health. Because YRBSS was to be implemented primarily through school-based surveys, a representative of the Society of State Directors of Health, Physical Education, and Recreation, an organization of state leaders of school-based health programs, also was included on each panel. Because students would have a single class period of approximately 45 minutes to complete the YRBSS questionnaire covering all six priority health-risk behavior categories, each panel was asked to identify only the highest priority behaviors and to recommend a limited number of questions to measure the prevalence of those behaviors. In October 1989, the first draft of the YRBSS questionnaire was completed and was reviewed by representatives from the education agency of each state, the District of Columbia, four U.S. territories, and 16 local education agencies then funded by CDC. Survey research specialists from NCHS also provided comments and suggestions. 
A second version of the YRBSS questionnaire was administered during spring 1990 to a national sample of students in grades 9–12 as well as to samples of students in 25 states and nine large urban school districts. In addition, the second version was sent to the Questionnaire Design Research Laboratory at NCHS for laboratory and field testing with high school students. NCHS staff examined student responses to the questionnaire and recommended ways to improve reliability and validity by clarifying the wording of questions, setting recall periods, and identifying response options.
In October 1990, a third version of the YRBSS questionnaire was completed. The questionnaire was similar to that used during spring 1990 but revised to take into account data collected by CDC and state and local education agencies during spring 1990, information from NCHS's laboratory and field tests, and input from YRBSS steering committee members and representatives of each state and the 16 local education agencies. It also included questions for measuring national health objectives for 2000 (5). During spring 1991, this questionnaire was used by 26 states and 11 large urban school districts to conduct a YRBS and by CDC to conduct a national YRBS.
In 1991, CDC determined that biennial surveys would be sufficient to measure health-risk behaviors among students because behavior changes typically occur gradually. Since 1991, YRBSs have been conducted every odd-numbered year at the national, state, territorial, and large urban school district levels.
Questionnaire Characteristics and Revisions
All YRBSS questionnaires are self-administered, and students record their responses on a computer-scannable questionnaire booklet or answer sheet. Skip patterns* are not included in any YRBSS questionnaire to help ensure that similar amounts of time are required to complete the questionnaire, regardless of each student's health-risk behavior status. This technique also prevents students from detecting a pattern of blank responses on other students' answer sheets and questionnaire booklets that might reveal those students' specific health-risk behaviors.
In each even-numbered year between 1991 and 1997, in consultation with the sites (states, territories, and large urban school districts) conducting a survey, CDC revised the YRBSS questionnaire to be used in the subsequent cycle. These revisions reflected site and national priorities. For example, in 1992, CDC added 10 questions to the 1993 questionnaire to measure the National Education Goal for safe, disciplined, and drug-free schools (21) and to address reporting requirements for the U.S. Department of Education's Safe and Drug-Free Schools Program (http://www2.ed.gov/about/offices/list/osdfs/index.html).
In 1997, CDC undertook an in-depth, systematic review of the YRBSS questionnaire. The review was motivated by multiple factors, including a goal for YRBSS to measure Healthy People 2010 national health objectives, which were being developed at that time. The purpose of the review and the subsequent revision process was to ensure that the questionnaire would provide the most effective assessment of the most critical health-risk behaviors among youths. To guide the decision-making process, CDC solicited input from content experts from CDC and academia as well as from representatives from other federal agencies; state, territorial, and local education agencies; state health departments; and national organizations, foundations, and institutes. On the basis of input from approximately 800 persons, CDC developed a proposed set of questionnaire revisions that were sent to all state, territorial, and local education agencies for further input. In addition to considering the amount of support from sites for the proposed revisions, CDC considered multiple factors in making final decisions about the questionnaire, including 1) input from the original reviewers, 2) whether the question measured a health-risk behavior practiced by youths, 3) whether data on the topic were available from other sources, 4) the relation of the behavior to the leading causes of morbidity and mortality among youths and adults, and 5) whether effective interventions existed that could be used to modify the behavior. As a result of this process, CDC created the 1999 YRBSS questionnaire by adding 16 new questions, deleting 11 questions, and making substantial wording changes to 14 questions. For example, two questions that assessed self-reported height and weight were added in recognition of increasing concerns regarding obesity. As a result, YRBSS now generates national, state, territorial, tribal, and large urban school district estimates of BMI calculated from self-reported data.
The 2013 YRBSS questionnaire reflects minor changes that CDC has made to the questionnaire since 1999. During each even-numbered year since 1999, CDC has sought input from experts both inside and outside of CDC regarding what questions should be changed, added, or deleted. These changes, additions, and deletions were then placed on a ballot sent to the YRBS coordinators at all sites, and each site voted for or against each proposed change, addition, and deletion. CDC considered the results of this balloting process when finalizing each questionnaire. Each cycle, CDC develops a standard questionnaire that sites can use as is or modify to meet their needs. The 2013 standard YRBSS questionnaire includes five questions that assess demographic information; 23 questions related to unintentional injuries and violence; 10 on tobacco use; 18 on alcohol and other drug use; seven on sexual behaviors; 16 on body weight and dietary behaviors, including height and weight; five on physical activity; and two on other health-related topics (i.e., asthma and sleep). The 2013 standard questionnaire and the rationale for the inclusion of each question are available at http://www.cdc.gov/yrbss.
For the national YRBS, five to 11 additional questions are added to the standard questionnaire each cycle. These questions typically cover health-related topics that do not fit in the six priority health-risk behavior categories (e.g., sun protection). The 2013 national YRBS questionnaire also is available at http://www.cdc.gov/yrbss.
Each cycle, CDC makes the standard questionnaire available to sites as a computer-scannable booklet. In 2011, nine states, one tribe, and six large urban school districts used the standard questionnaire computer-scannable booklets. CDC sends sites that wish to modify the standard questionnaire a print-ready copy of their questionnaire and scannable answer sheets. Sites can modify the standard questionnaire within certain parameters: 1) two thirds of the questions from the standard YRBSS questionnaire must remain unchanged; 2) additional questions are limited to eight mutually exclusive response options; and 3) skip patterns, grid formats, and fill-in-the-blank formats cannot be used. Furthermore, sites that modify the standard YRBSS questionnaire and use the scannable answer sheets must retain the height and weight questions as questions six and seven and cannot have more than 99 questions. This numerical limit is set so the questionnaire can be completed during a single class period by all students, even those who might read slowly.
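The modification parameters above lend themselves to a simple automated check. The sketch below encodes the stated limits (two-thirds rule, eight response options for added questions, height and weight as questions six and seven, 99-question maximum) using invented field names; it is not a CDC tool:

```python
def check_modified_questionnaire(questions, n_standard):
    """Validate a modified questionnaire against the published YRBSS parameters.
    `questions` is an ordered list of dicts with illustrative keys:
    'id', 'options' (number of response options), and 'standard'
    (True if the question is unchanged from the standard questionnaire).
    `n_standard` is the number of questions on the standard questionnaire."""
    errors = []
    if len(questions) > 99:
        errors.append("questionnaire exceeds 99 questions")
    unchanged = sum(1 for q in questions if q["standard"])
    if unchanged * 3 < n_standard * 2:  # two thirds must remain unchanged
        errors.append("fewer than two thirds of standard questions retained")
    for q in questions:
        if not q["standard"] and q["options"] > 8:
            errors.append("question %s exceeds 8 response options" % q["id"])
    if len(questions) >= 7 and (questions[5]["id"], questions[6]["id"]) != ("height", "weight"):
        errors.append("height and weight must be questions 6 and 7")
    return errors
```

For example, a modified questionnaire that moves height and weight out of positions six and seven would fail the last check.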
For sites that want to modify the standard questionnaire, CDC also provides a list of optional questions for consideration. This list has been available to sites since 1999 and is updated when the standard YRBSS questionnaire is updated. It includes questions on the current version of the national YRBS questionnaire; questions that have been included in a previous national, state, territorial, tribal, or large urban school district YRBS questionnaire; and questions designed to address topics of key interest to CDC or the sites. By using these optional questions, sites can obtain data comparable to those from the national YRBS or from other sites that use these questions and be assured they are adding questions that already have been reviewed and approved by CDC. A site also can choose to develop its own questions if none of the optional questions addresses a topic that the site wants to measure. CDC reviews site-developed questions to ensure that their complexity, reading level, and formatting are appropriate for a YRBS. In 2011, a total of 38 states, five territories, 16 large urban school districts, and three tribes modified the standard questionnaire.
Questionnaire Reliability and Validity
CDC has conducted two test-retest reliability studies of the national YRBS questionnaire, one in 1992 and one in 2000. In the first study, the 1991 version of the questionnaire was administered to a convenience sample of 1,679 students in grades 7–12. The questionnaire was administered on two occasions, 14 days apart (22). Approximately three fourths of the questions were rated as having a substantial or higher reliability (kappa = 61%–100%), and no statistically significant differences were observed between the prevalence estimates for the first and second times that the questionnaire was administered. The responses of students in grade 7 were less consistent than those of students in grades 9–12, indicating that the questionnaire is best suited for students in those grades.
In the second study, the 1999 questionnaire was administered to a convenience sample of 4,619 high school students. The questionnaire was administered on two occasions, approximately 2 weeks apart (23). Approximately one in five questions (22%) had significantly different prevalence estimates for the first and second times that the questionnaire was administered. Ten questions (14%) had both kappas <61% and significantly different time-1 and time-2 prevalence estimates, indicating that the reliability of these questions was questionable (23). These problematic questions were revised or deleted from later versions of the questionnaire.
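The kappa statistic reported in both studies measures agreement between the two administrations beyond what chance alone would produce. The sketch below computes Cohen's kappa from an agreement table, using invented counts for a hypothetical yes/no question rather than data from either study:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table, where table[i][j] is the
    number of respondents giving answer i at time 1 and answer j at time 2."""
    n = sum(sum(row) for row in table)
    k = len(table)
    observed = sum(table[i][i] for i in range(k)) / n
    row_totals = [sum(table[i]) for i in range(k)]
    col_totals = [sum(table[i][j] for i in range(k)) for j in range(k)]
    expected = sum(row_totals[i] * col_totals[i] for i in range(k)) / n ** 2
    return (observed - expected) / (1 - expected)

# 100 hypothetical students answer a yes/no question twice, 2 weeks apart
table = [[40, 10],   # "yes" at time 1: 40 repeat "yes", 10 switch to "no"
         [5, 45]]    # "no" at time 1: 5 switch to "yes", 45 repeat "no"
print(round(cohens_kappa(table), 2))  # prints 0.7
```

A kappa of 0.70 corresponds to 70% on the percentage scale used above, within the "substantial" range (61%–100%).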
No study has been conducted to assess the validity of all self-reported behaviors that are included on the YRBSS questionnaire. However, in 2003, CDC reviewed existing empiric literature to assess cognitive and situational factors that might affect the validity of adolescent self-reporting of behaviors measured by the YRBSS questionnaire (24). In this review, CDC determined that, although self-reports of these types of behaviors are affected by both cognitive and situational factors, these factors do not threaten the validity of self-reports of each type of behavior equally. In addition, each type of behavior differs in the extent to which its self-report can be validated by an objective measure. For example, reports of tobacco use are influenced by both cognitive and situational factors and can be validated by biochemical measures (e.g., cotinine). Reports of sexual behavior also can be influenced by both cognitive and situational factors, but no standard exists to validate the behavior. In contrast, reports of physical activity are influenced substantially by cognitive factors and to a lesser extent by situational ones. Such reports can be validated by mechanical or electronic monitors (e.g., heart rate monitors). Understanding the differences in factors that compromise the validity of self-reporting of different types of behavior can assist policymakers in interpreting data and researchers in designing measures that do not compromise validity.
In 2000, CDC conducted a study to assess the validity of the two YRBS questions on self-reported height and weight (25). In that study, 2,965 high school students completed the 1999 version of the YRBSS questionnaire on two occasions approximately 2 weeks apart. After completing the questionnaire, the students were weighed and had their height measured. Self-reported height, weight, and BMI calculated from these values were substantially reliable, but on average, students in the study overreported their height by 2.7 inches and underreported their weight by 3.5 pounds, which indicates that YRBSS probably underestimates the prevalence of overweight and obesity in adolescent populations.
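The direction of the resulting bias follows from the BMI formula: over-reporting height inflates the denominator while under-reporting weight shrinks the numerator. A short calculation for a hypothetical student (the measured values are invented; the 2.7-inch and 3.5-pound average errors come from the study) illustrates the underestimate:

```python
IN_TO_M = 0.0254       # inches to meters
LB_TO_KG = 0.45359237  # pounds to kilograms

def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

# Hypothetical student measured at 65 inches and 150 pounds
true_bmi = bmi(150 * LB_TO_KG, 65 * IN_TO_M)
# Self-report shifted by the study's average errors:
# height over-reported by 2.7 inches, weight under-reported by 3.5 pounds
reported_bmi = bmi((150 - 3.5) * LB_TO_KG, (65 + 2.7) * IN_TO_M)
print(round(true_bmi, 1), round(reported_bmi, 1))  # prints 25.0 22.5
```

For this student, self-reported values shift the computed BMI downward by roughly 2.5 units, consistent with underestimating the prevalence of overweight and obesity.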
Operational Procedures
The national YRBS is conducted during February–May of each odd-numbered year. All except a few sites also conduct their survey during this period; certain sites conduct their YRBS during the fall of odd-numbered years or during even-numbered years. Separate samples and operational procedures are used in the national survey and in the state, territorial, tribal, and large urban school district surveys. The national sample is not an aggregation of the state and large urban school district surveys, and state or large urban school district estimates cannot be obtained from the national survey.
In certain instances, a school is selected as part of the national sample as well as a state or large urban school district sample. Similarly, a school might be selected as part of both a state and a large urban school district sample or a state and a tribal sample. When a school is selected as part of two or more samples, the field work is conducted only once to minimize the burden on the school and eliminate duplication of efforts. The school's data then are incorporated into both datasets during data processing. The coordination of these overlapping samples is critical to the successful operation of YRBSS, and weekly meetings are required to ensure that overlapping schools are identified, responsibilities for recruitment and data collection are documented, and methods for sharing data are agreed upon.
National Survey
Since 1990, the national school-based YRBS has been conducted under contract with ICF Macro, Inc., an ICF International Company. With CDC oversight, the contractor is responsible for sample design and sample selection. After the schools have been selected, the contractor also is responsible for obtaining the appropriate state-, district-, and school-level clearances to conduct the survey in those schools. The contractor works with sampled schools to select classes, schedule data collection, and obtain parental permission. In addition, the contractor hires and trains data collectors to follow a common protocol to administer the questionnaires in the schools, coordinates data collection, weights the data, and prepares the data for analysis.
State, Territorial, Tribal, and Large Urban School District Surveys
Before 2003, CDC funded state and local education agencies for HIV prevention or coordinated school health programs, and sites could use a portion of these cooperative agreement funds to conduct a YRBS. Since the 2003 cycle, separate cooperative agreement funds have been made available to sites to conduct a survey, and since 2008, both state education and state health agencies have been eligible to apply for these funds. Each state must determine which agency will take responsibility for conducting its survey. In 2011, five state health departments directly received separate YRBSS cooperative agreement funds, and health departments in an additional eight states and one large urban school district received funds from the education agency to lead administration of their survey (Box 1). The remaining surveys were conducted by education agencies. Since 2009, tribal governments also have been eligible to apply for funds to conduct a YRBS. Certain state and local education agencies conduct a YRBS with the assistance of survey contractors. In 2011, a total of 24 state education agencies and five local education agencies hired contractors to assist with survey administration.
State, territorial, and local agencies and tribal governments funded by CDC to conduct a YRBS do so among samples of high school students. In addition, certain sites conduct a separate survey among middle school students by using a modified YRBSS questionnaire designed specifically for the reading and comprehension skills of students in this age group. In 2011, a total of 16 states, three territories, one tribe, and 14 large urban school districts conducted a middle school YRBS (Box 1). In addition, in 2011, one state (Alaska) and two large urban school districts (Memphis and San Bernardino) conducted a YRBS among alternative school students.
Certain states coordinate their YRBS sample with samples for other surveys (e.g., the Youth Tobacco Survey [YTS]) (http://www.cdc.gov/tobacco/data_statistics/surveys/yts/index.htm) to reduce the burden on schools and students and to save resources. States use one of two methods of coordinated sampling: multiple-school sampling and multiple-class sampling. In multiple-school sampling, the number of schools needed for one survey is multiplied by the number of surveys being coordinated. This method produces nonoverlapping samples of schools. The separate samples can be used during the same or separate semesters, and schools can be assured that they will be asked to participate in only one survey. This approach is most useful in sites that have at least 50 high schools, in sites that administer the surveys in different semesters, and in sites in which at least one of the surveys has been considered controversial or has not been conducted successfully. This method ensures that the success of one survey does not depend on the success of the others. In multiple-class sampling, multiple surveys are conducted simultaneously in separate classes in the same sample of schools. The number of classes needed for one survey is multiplied by the number of surveys, and then the classes are assigned randomly to each survey. This approach is useful in states with few high schools, in states where each survey has been conducted successfully, and in states where the coordinators of each survey are willing to work together closely. Regardless of the type of coordination, CDC and the sponsoring agencies work together to plan and implement the coordination. In 2011, a total of 17 states, one territory, and one tribe used coordinated samples to conduct their YRBS and YTS. In addition, one state and two large urban school districts coordinated their YRBS sample with a Communities Putting Prevention to Work (CPPW) sample (Box 1).
Technical Assistance
Technical assistance for state, territorial, and local agencies and tribal governments is provided by both CDC and Westat, which has served as CDC's technical assistance contractor since the inception of YRBSS. CDC staff include scientists and project officers who oversee the cooperative agreement for the sites. In addition, each site is assigned a survey operations specialist and a statistician from Westat.
Each YRBS site has a survey coordinator who works for a state, territorial, or local education or health agency or tribal government. These coordinators have variable expertise and experience in conducting surveys. To help ensure the quality of YRBSS, since the first cycle, CDC has provided technical assistance to the agencies conducting the surveys. This assistance has become increasingly comprehensive. During the first cycle, such assistance was limited and consisted primarily of answering questions posed by site coordinators. Since that time, technical assistance has been expanded to cover the entire survey process and is now provided on a continual, proactive basis. CDC and its technical assistance contractor provide technical assistance on survey planning, sample selection, questionnaire modification, survey administration, obtaining parental permission, data processing, weighting, report generation, and dissemination of results. Sites are responsible for administering the surveys; the role of CDC and its technical assistance contractor is to help ensure that survey administration runs smoothly and yields sufficient response rates and high-quality data.
Technical Assistance Tools
Westat has worked with CDC to develop tools for providing technical assistance. These tools include instructional materials, communication tools, and specialized software.
Instructional Materials
CDC publishes the Handbook for Conducting Youth Risk Behavior Surveys (26), a comprehensive guide that is revised each cycle on the basis of feedback from sites and questions that arose during the preceding cycle. The 2013 version of the Handbook contains 107 pages spanning eight chapters and also includes nine appendices (Box 2). As a supplement to the Handbook, in 2008 and 2010, CDC and Westat developed two short instructional videos for sites. One video focuses on scientifically selecting classes, and the other describes how to prepare the data and documentation for processing. Each video has step-by-step instructions for routine tasks. These videos use animation to make the information engaging. The videos are designed to provide an overview of these tasks at the beginning of a cycle and to serve as a resource for survey coordinators to use as they prepare to carry out each task.
Communication Tools
A monthly electronic newsletter is sent to all survey coordinators via e-mail. Each one-page newsletter focuses on a part of the survey process (e.g., sampling, questionnaire modification, or follow-up). Topics are selected to coincide with the typical survey timeline. The brief newsletters provide tips or reminders designed to help sites conduct successful surveys.
The password-protected Survey Technical Assistance Website, which CDC and Westat launched in 1999, is used by survey coordinators to request materials (e.g., questionnaire booklets and answer sheets), download references and supporting documents (e.g., the Handbook and sample parental permission forms), and check the status of their data (e.g., what processing steps already have been completed). Survey coordinators also can use the website to access contact information for CDC and technical assistance contractor staff and send e-mail messages to request further assistance. During the 2011 cycle, the website received 812 visits from the 76 survey sites. The website also provides reports to support survey management. For example, CDC and Westat use the website to track when questionnaires are received from each site and to check the status of data processing.
Survey coordinators can access peer-to-peer technical assistance through a YRBSS listserv that was established in 2009 by the South Carolina Department of Education. The listserv has 79 members, including survey coordinators and staff from CDC, Westat, and ICF Macro. CDC staff monitor the listserv to provide clarifications when needed. On average, 15 messages are posted to the listserv each month. The most common topics discussed are survey administration (e.g., techniques for obtaining parental permission), questionnaire modification, dissemination of results, and the use of incentives. Members also have used the listserv to arrange to meet at conferences or to connect with others in their region.
Specialized Software
To provide technical assistance with sample selection, in 1989, CDC and Westat developed PCSample, a specialized software program that draws two-stage cluster samples of schools and classes within sampled schools for each site. CDC and Westat use PCSample to select YRBS samples efficiently. Schools are selected with probability proportional to school enrollment size, and classes are selected randomly. When PCSample was developed, no commercially available software program existed for this purpose, and PCSample remains the only program of its type. Although PCSample was developed specifically for YRBSS, it also is used for other school-based surveys (e.g., YTS, Global YTS, and Global YRBS).
PCSample requires an updated sampling frame, which is a list of schools in the site's jurisdiction that also includes the number of students in each school enrolled in grades 9–12. If the state, territorial, or local agency or tribal government cannot provide the sampling frame, the technical assistance contractor provides the frame used previously or one created using the Common Core of Data from the National Center for Education Statistics (27). The survey coordinator then must update the sampling frame by deleting closed schools, adding newly opened schools, and providing updated enrollment numbers. PCSample also requires information on sampling parameters (e.g., expected school and student response rates, attendance rates, and desired sample size). This information is provided by the survey coordinator, with assistance from CDC and its technical assistance contractor, via a sampling parameter worksheet. The sampling parameters are used to balance the need to select a sample that is large enough to generate precise estimates but small enough so that the site's resources are not overtaxed and schools and students are not burdened unnecessarily.
PCSample generates two types of forms: school-level forms for each school in the sample and a classroom-level form for each school that is reproduced later for each sampled classroom in that school. The school-level form contains unique random numbers calculated by using a sampling interval based on the size of the school and the desired sample size; the survey coordinator uses these numbers to select classes randomly in participating schools. The survey coordinator completes a school-level form for each sampled school and a classroom-level form for each sampled classroom. The information on these forms provides a record of the sampling and survey administration process and is used to weight the data.
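The random numbers on the school-level form implement systematic random selection of classes within each school. The sketch below illustrates the idea under that assumption (the function name and interface are hypothetical; this is not PCSample code): classes in a school are numbered, a sampling interval is computed from the number of classes and the number to be selected, and a random start within the first interval determines which class positions are picked.

```python
import random

def systematic_class_picks(n_classes, n_to_select, seed=None):
    """Systematic random selection of class positions (1-based) from a
    numbered list of eligible classes in one school.

    The sampling interval is the number of classes divided by the number
    of classes to select; a random start within the first interval fixes
    the selected positions."""
    rng = random.Random(seed)
    interval = n_classes / n_to_select        # sampling interval
    start = rng.uniform(0, interval)          # random start in first interval
    # Every interval-th position after the random start is selected.
    return [int(start + i * interval) + 1 for i in range(n_to_select)]

# Example: a school with 28 eligible classes, 2 classes to be selected.
picks = systematic_class_picks(n_classes=28, n_to_select=2, seed=7)
```

Because the positions come from one random start plus fixed intervals, every eligible class has an equal chance of selection while the picks stay spread across the list.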
To help monitor site progress, the technical assistance contractor provides each site with a Microsoft Excel (28)–based tracking form for recording information on scheduling progress and school and student participation. Before development of the tracking form in 2011, CDC and Westat contacted sites regularly during data collection to check whether schools had been cleared and scheduled for survey administration. Some sites used paper tracking forms or created electronic forms to help them monitor their progress, but this recordkeeping was not done in a systematic or consistent fashion. The tracking form now in use in all sites is a spreadsheet that contains a list of the schools selected for the survey, along with columns for documenting the date the school agrees to participate, the date for survey administration, the date the survey is confirmed as completed, and student participation information. The spreadsheet is programmed to calculate the school, student, and overall response rates automatically as new information is added. Sites are required to send the tracking form to the technical assistance contractor regularly to aid troubleshooting and technical assistance.
Technical Assistance Modes
In addition to the tools developed to help sites conduct successful surveys, a key part of technical assistance is the one-on-one guidance provided to sites. This individualized technical assistance is provided most commonly through a toll-free telephone number and e-mail. CDC and its technical assistance contractor also meet with survey coordinators in person. Coordinators often attend national conferences and other meetings during which appointments can be scheduled with CDC or contractor staff also in attendance. Site visits by project officers also provide opportunities for providing site-specific technical assistance. Every conversation between any personnel at a site and CDC or its technical assistance contractor, whether in person, through e-mail, or on the telephone, is logged into the Survey Technical Assistance Website. This enables all technical assistance staff members working with the site to see in real time what questions have been asked and what information has been shared. During the 2011 cycle (July 2010–July 2012), 5,279 contacts were made between sites and Westat or CDC. The number of contacts per site during this period ranged from three to 125 (median: 42). Approximately 36% of these requests were of a general nature (e.g., how to obtain YRBSS-related materials), 22% were related to sampling, 16% to questionnaire administration, 7% each to clearance and questionnaire modification, 6% to weighting, 3% to reports, and 3% to other concerns (e.g., scanning and data processing).
A critical aspect of technical assistance is the 2-day in-person training sessions that CDC and Westat have provided for sites since 1992. These sessions are conducted during August of every even-numbered year in preparation for YRBSS data collection in the odd-numbered year. CDC invites survey coordinators and contractors who either are new or are from sites that have not conducted a YRBS successfully to attend the training. The content of the training is based on the YRBS Handbook (26) and covers all aspects of the survey process, including planning the survey, modifying questionnaires, obtaining clearance, selecting schools and classes, obtaining parental permission, administering surveys, and preparing data for analysis. The training comprises both lectures and hands-on skill-building activities and is designed by persons with expertise in adult learning principles. In addition to the Handbook, participants receive a training manual containing practice exercises and supplemental resources to help them conduct successful surveys. In addition, intensive, one-on-one technical assistance meetings are available at the training for sites that want to discuss detailed questionnaire or sampling issues. In 2012, YRBS coordinators and contractors from 29 sites participated in the training.
Sampling, Weighting, and Response Rates
State, Territorial, Tribal, and Large Urban School District Surveys
Each state, territorial, tribal, and large urban school district YRBS employs a two-stage, cluster sample design to produce a representative sample of students in grades 9–12 in its jurisdiction. Samples are selected using PCSample. In 2011, Ohio and South Dakota included both public and private schools in their sampling frames; all other states included only public schools. Each large urban school district sample included only schools in the funded school district (e.g., San Diego Unified School District) rather than in the entire area (e.g., greater San Diego County). In the first sampling stage, in all except a few sites, schools are selected with probability proportional to school enrollment size. In the second sampling stage, intact classes of a required subject or intact classes during a required period (e.g., second period) are selected randomly. All students in sampled classes are eligible to participate. In certain sites, these procedures are modified to meet the individual needs of the sites. For example, in a given site, all schools, rather than a sample of schools, might be selected to participate.
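Probability-proportional-to-size (PPS) selection, as used in the first sampling stage, is commonly implemented with cumulative-size systematic sampling. The sketch below illustrates that general technique under simplified assumptions; it is not the PCSample algorithm, and the data are invented:

```python
import random

def pps_systematic(schools, n_sample, seed=None):
    """Select n_sample schools with probability proportional to enrollment,
    using cumulative-size systematic PPS sampling.

    `schools` is a list of (name, enrollment) pairs. In this simplified
    sketch, a school whose enrollment exceeds the sampling interval can be
    selected more than once; production samplers handle such "certainty"
    schools separately."""
    rng = random.Random(seed)
    total = sum(size for _, size in schools)
    interval = total / n_sample                  # enrollment per selection
    start = rng.uniform(0, interval)             # random start
    targets = [start + i * interval for i in range(n_sample)]
    selected, cum = [], 0.0
    it = iter(targets)
    nxt = next(it, None)
    for name, size in schools:
        cum += size                              # cumulative enrollment
        while nxt is not None and nxt < cum:     # target falls in this school
            selected.append(name)
            nxt = next(it, None)
    return selected

# Example: three schools with enrollments 100, 300, and 600.
sel = pps_systematic([("A", 100), ("B", 300), ("C", 600)], n_sample=2, seed=1)
```

A school's chance of selection is its enrollment divided by the interval, which is exactly what "probability proportional to enrollment size" requires.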
Those surveys that have a sample selected according to the protocol described above, appropriate documentation of school and classroom selection, and an overall response rate of ≥60% are weighted. These three criteria are used to ensure that the data are representative of students in grades 9–12 in that jurisdiction. The overall response rate is calculated by multiplying the school response rate by the student response rate. A weight is applied to each student record to adjust for student nonresponse and the distribution of students by grade, sex, and race/ethnicity in each jurisdiction. Therefore, weighted estimates are representative of all students in grades 9–12 in each jurisdiction.
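The weighting criterion above reduces to simple arithmetic: the overall response rate is the product of the school and student response rates, and it must reach 60% (along with correct sampling and documentation) for the data to be weighted. A minimal sketch (function names are illustrative):

```python
def overall_response_rate(school_rate, student_rate):
    """Overall response rate = school response rate x student response rate."""
    return school_rate * student_rate

def eligible_for_weighting(school_rate, student_rate,
                           documented=True, protocol_followed=True):
    # Data are weighted only when sampling followed the protocol, school
    # and class selection were documented, and the overall rate is >= 60%.
    return (protocol_followed and documented
            and overall_response_rate(school_rate, student_rate) >= 0.60)

# Example: 80% of sampled schools participate and 80% of students in
# participating classes respond, so the overall rate is 64%.
rate = overall_response_rate(0.80, 0.80)
```

Note that even strong individual rates can fall short in combination; for example, 73% school participation with 60% student participation yields an overall rate of about 44%, below the weighting threshold.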
Surveys that do not have an overall response rate of ≥60% and appropriate documentation are not weighted. Unweighted data represent only the students participating in the survey. Since 1991, both the number of participating sites and the number and percentage of weighted sites have increased (Table 2). In 2011, a total of 43 states, five territories, 21 large urban school districts, and two tribal governments had weighted data; four states and one large urban school district had unweighted data (Box 1). In 2011, in sites with weighted data, the student sample sizes ranged from 1,147 to 13,201 (median: 2,170) for the state surveys, from 1,013 to 11,570 (median: 1,767) for the large urban school district surveys, and from 476 to 3,190 (median: 1,634) for the territorial surveys. Student sample sizes were 91 and 1,480 for the two tribal surveys. Among the state surveys, school response rates ranged from 73% to 100%, student response rates ranged from 60% to 88%, and overall response rates ranged from 60% to 84%. Among the large urban school district surveys, school response rates ranged from 84% to 100%, student response rates ranged from 61% to 86%, and overall response rates ranged from 61% to 86%. Among the territorial surveys, school response rates ranged from 97% to 100%, student response rates ranged from 75% to 85%, and overall response rates ranged from 75% to 85%. Among the tribal surveys, school response rates were 78% and 100%, student response rates were 77% and 83%, and overall response rates were 65% and 77%.
National Survey
The national YRBS uses a three-stage, cluster sample design to obtain a nationally representative sample of U.S. students in grades 9–12. The target population comprises all public and private school students in grades 9–12 in the 50 states and the District of Columbia. U.S. territories are not included in the sampling frame. The national YRBS sample is designed to produce estimates that are accurate within ±5% at a 95% confidence level. Overall estimates as well as estimates for sex, grade, race/ethnicity, grade by sex, and race/ethnicity by sex subgroups meet this standard. Estimates for grade by race/ethnicity subgroups are accurate within ±5% at a 90% confidence level.
The first-stage sampling frame for each national survey includes primary sampling units (PSUs) consisting of large counties or groups of smaller, adjacent counties. Since the 1999 sample, PSUs large enough to be selected with certainty are divided into sub-PSU units. Schools then are sorted by size and assigned in rotation to the newly created sub-PSU units. PSUs are selected from 16 strata categorized according to metropolitan statistical area† (MSA) status and the percentages of black and Hispanic students in PSUs. PSUs are classified as urban if they are in one of the 54 largest MSAs in the United States; otherwise, they are considered rural. PSUs are selected with probability proportional to the total school enrollment within the PSU.
In the second stage of sampling, schools are selected from PSUs. A list of public and private schools in PSUs is obtained from the Market Data Retrieval (MDR) database (29). This database combines information on both public and private schools, including enrollment figures, with the most recent data from the Common Core of Data from the National Center for Education Statistics (27). Schools with all four high school grades (9–12) are considered "whole schools." Schools with any other set of grades are considered "fragment schools" and are combined with other schools (whole or fragment) to form a "cluster school" that includes all four grades. The cluster school is treated as a single school during school selection. Schools are divided into two groups on the basis of enrollment. Schools with an estimated enrollment of ≥25 students for each grade are considered large, and schools with an estimated enrollment of <25 students for any grade are considered small. Approximately one fourth of PSUs are selected for small-school sampling. For each of these PSUs, one small school is drawn with probability proportional to size, considering only small schools within that PSU. Three large schools then are selected from all sampled PSUs with probability proportional to school enrollment size.
To enable a separate analysis of data for black and Hispanic students, CDC has used three strategies to achieve oversampling of these students: 1) larger sampling rates are used to select PSUs that are in high-black and high-Hispanic strata; 2) a modified measure of size is used that increases the probability of selecting schools that have a disproportionately high minority enrollment; and 3) two classes per grade, rather than one, are selected in schools with a high minority enrollment. All of these strategies were used in selecting the national samples through 2011. Because of decreases in the percentage of white students in the U.S. population (30), for the 2013 sample, sufficient numbers of black and Hispanic students were sampled using only the third strategy.
The final stage of sampling consists of randomly selecting one or two entire classes in each chosen school and in each of grades 9–12. Examples of classes include homerooms or classes of a required subject (e.g., English and social studies). All students in sampled classes are eligible to participate. Since 1991, the national YRBS has been conducted 11 times with an average sample size of 14,517 and average school, student, and overall response rates of 78%, 86%, and 71%, respectively (Table 3).
A weight based on student sex, race/ethnicity, and school grade is applied to each record to adjust for student nonresponse and oversampling of black and Hispanic students. To avoid inflated sampling variances, statisticians trim and distribute weights exceeding a criterion value among untrimmed weights using an iterative process (31). The final overall weights are scaled so that the weighted count of students equals the total sample size and the weighted proportions of students in each grade match national population projections for each survey year. Therefore, weighted estimates are representative of all students in grades 9–12 who attend public and private schools in the United States. For both the national YRBS and the state, territorial, tribal, and large urban school district surveys, sampled schools, classes, and students who refuse to participate are not replaced. Sampling without replacement maintains the integrity of the sample design and helps avoid the introduction of unmeasurable bias into the sample.
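The iterative trimming described above can be sketched as follows. The cap and proportional redistribution rule here are illustrative assumptions, not the exact procedure of reference 31: weights above a criterion value are cut to the cap, and the trimmed excess is spread proportionally among the remaining weights so the weighted total is preserved.

```python
def trim_weights(weights, cap, max_iter=100, tol=1e-9):
    """Iteratively trim weights above `cap`, redistributing the trimmed
    excess proportionally among the untrimmed weights so the total weight
    is unchanged. Redistribution can push an untrimmed weight past the
    cap, which is why the procedure iterates until no excess remains."""
    w = list(weights)
    for _ in range(max_iter):
        excess = sum(x - cap for x in w if x > cap)
        if excess < tol:
            break
        keep_total = sum(x for x in w if x <= cap)
        w = [cap if x > cap else x * (1 + excess / keep_total) for x in w]
    return w

# Example: one weight (6) far exceeds a cap of 3; after trimming, the
# total (10) is preserved and no weight exceeds the cap.
trimmed = trim_weights([1, 1, 2, 6], cap=3)
```

Keeping the total fixed matters because the final weights are scaled so that the weighted student count equals the sample size; trimming controls extreme weights (and thus sampling variance) without changing that total.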
Data-Collection Protocols
Data-collection protocols are similar for national, state, territorial, tribal, and large urban school district surveys. Local procedures for obtaining parental permission are followed before administering a YRBS in any school. Certain schools use active permission, meaning that parents must send back to the school a signed form indicating their approval before their child can participate. Other schools use passive permission, meaning that parents send back a signed form only if they do not want their child to participate in the survey. In the 2011 state and large urban school district surveys, four (9%) of 47 participating states (Alaska, Hawaii, New Jersey, and Utah) used statewide active permission procedures, and two (9%) of 22 large urban school districts (Dallas and San Bernardino) used district-wide active permission. Some schools within other sites also used active permission. In the 2011 national YRBS, 10% of schools used active permission, and 90% used passive permission.
For the national survey and for the majority of state, territorial, tribal, and large urban school district surveys, trained data collectors travel to each participating school to administer the questionnaires to students. These data collectors read a standardized script to participating students. The script includes an introduction to the survey. Data collectors also record information about schools and classrooms (e.g., grade level of classes sampled and number of students enrolled in a sampled class). This information is used later in the survey process to verify sample selection and to weight data.
In certain state, territorial, tribal, and large urban school district surveys, the questionnaires are sent to the school, and teachers of the selected classes administer the survey to their class by using the standardized script. The school then sends the completed questionnaires and accompanying documentation forms to the agency conducting the survey.
Procedures for all the YRBSs are designed to protect student privacy by allowing for anonymous and voluntary participation. In all surveys, students complete the self-administered questionnaire during one class period and record their responses directly in a computer-scannable booklet or on a computer-scannable answer sheet. To the extent possible, students' desks are spread throughout the classroom to minimize the chance that students can see each other's responses. Students also are encouraged to use an extra sheet of paper or an envelope provided by the data collector to cover their responses as they complete the questionnaire. In the national survey, and in certain state, territorial, tribal, and large urban school district surveys, when students complete the questionnaire, they are asked to seal their questionnaire booklet or answer sheet in an envelope before placing it in a box.
Students who are absent on the day of data collection still can complete questionnaires if their privacy can be maintained. These make-up data-collection efforts sometimes are administered by the data collector; however, if the data collector cannot administer the questionnaire, school personnel can perform this task. Allowing students who were absent on the day of data collection to take the survey at a later date increases student response rates. In addition, because absent students, especially those who are absent without parental permission, are more likely to engage in health-risk behaviors than students who are not absent (32), make-up data collection procedures help provide data representative of all high school students.
Data-Processing Procedures
Data processing for state, territorial, tribal, and large urban school district surveys is a collaborative effort between CDC and its technical assistance contractor that provides a system of checks and balances. All except a few sites send completed questionnaires or answer sheets to the contractor, which scans them and constructs a raw electronic dataset. Certain sites scan their answer sheets and send the raw electronic dataset to the contractor. The contractor sends all raw datasets to CDC, which edits them to identify out-of-range responses, logical inconsistencies, and missing data. The data cleaning and editing process is performed by the Survey Data Management System (SDMS), which CDC developed in 1999 to process all YRBSS data and produce reports. Originally developed as a stand-alone system, SDMS was transformed to a web-based system in 2008 and performs its functions using Visual Basic (33), SAS (34), and SUDAAN (35) programs. The processing system accommodates questionnaires in which questions have been deleted or added by the sites by first screening them to note differences from the standard questionnaire and then accounting for those differences during processing.
For the 2011 cycle, 179 logical edits were performed on each standard questionnaire. Responses that conflict in logical terms are both set to missing, and data are not imputed. For example, if a student responds to one question that he or she has never smoked but then responds to a subsequent question that he or she has smoked two cigarettes during the previous 30 days, the processing system sets both responses to missing. Neither response is assumed to be the correct response. Questionnaires with <20 valid responses remaining after editing are deleted from the dataset. In 2011, the median number of completed questionnaires in the state surveys that failed quality-control checks and were excluded from analysis was 13 (range: 0–351), and in the large urban school district surveys, the median was also 13 (range: 0–231).
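The smoking example above can be expressed as a small consistency check. The field names below are hypothetical, and the real SDMS edits run as Visual Basic and SAS programs; this sketch only illustrates the rule that conflicting responses are both set to missing and never imputed, and that questionnaires with fewer than 20 valid responses are dropped.

```python
def apply_smoking_edit(record):
    """If a student reports never smoking but also reports smoking during
    the previous 30 days, set both responses to missing (None).
    Neither response is assumed correct, and no value is imputed."""
    r = dict(record)
    if r.get("ever_smoked") == "never" and (r.get("days_smoked_30d") or 0) > 0:
        r["ever_smoked"] = None
        r["days_smoked_30d"] = None
    return r

def passes_quality_check(record, min_valid=20):
    """Questionnaires with < min_valid valid responses after editing are
    deleted from the dataset."""
    return sum(1 for v in record.values() if v is not None) >= min_valid

# A logically inconsistent record: "never smoked" yet smoked on 2 days.
edited = apply_smoking_edit({"ever_smoked": "never", "days_smoked_30d": 2})
```

Consistent records pass through unchanged; only the contradictory pair is blanked, so the rest of the questionnaire remains usable unless the valid-response count falls below the deletion threshold.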
Additional data edits are applied to the height, weight, and BMI variables to ensure that the results are biologically plausible. These three variables are set to missing when an observation lies outside logical limits developed by CDC's Division of Nutrition, Physical Activity, and Obesity (36). In 2011, the median number of completed questionnaires in the state surveys that had their height and weight data set to missing because either these values or their resulting BMIs were considered implausible for the student's sex and age was 32 (range: 7–567). In the large urban school district surveys, the median was 27 (range: 14–163).
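A sketch of the biologic-plausibility edit follows. The BMI limits shown are illustrative placeholders only; the actual limits developed by CDC's Division of Nutrition, Physical Activity, and Obesity (36) vary by sex and age. The key behavior is that all three variables are set to missing together.

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def apply_biologic_edit(height_m, weight_kg, lo=10.0, hi=60.0):
    """Return (height, weight, BMI), or (None, None, None) when the
    computed BMI falls outside the plausibility limits [lo, hi].
    Height, weight, and BMI are always blanked as a set, never singly."""
    b = bmi(weight_kg, height_m)
    if not (lo <= b <= hi):
        return None, None, None
    return height_m, weight_kg, b

# A plausible record is kept; an implausible one is blanked entirely.
kept = apply_biologic_edit(1.70, 65.0)
blanked = apply_biologic_edit(1.70, 300.0)
```

Blanking the full set avoids retaining a height–weight pair that implies an impossible BMI, which would otherwise contaminate obesity prevalence estimates.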
Edited data are sent to the technical assistance contractor for weighting. If response rates are sufficient, documentation is complete, and the site followed sampling protocols correctly, the contractor weights the data according to the procedures previously described in this report and sends the weights to CDC, which merges the weights onto the edited data file. CDC and Westat use file transfer and tracking functions built into the Survey Technical Assistance Website to ensure that all transfers are logged and reported.
Data processing for the national survey is similar to that performed for the state, territorial, tribal, and large urban school district surveys. The national survey contractor scans all completed questionnaires from the national survey and sends a SAS dataset to CDC. To maintain consistency with the data processing used for the state, territorial, tribal, and large urban school district surveys, CDC converts this dataset to one that is then processed in SDMS just as all other YRBS datasets are processed. The national dataset is treated as if it were a state, territorial, tribal, or large urban school district site. CDC edits the data by using the same procedures described previously. In 2011, a total of 78 questionnaires (0.5%) in the national survey failed quality-control checks and were excluded from analysis, and 182 questionnaires (1%) had their height and weight data set to missing because the height, weight, or resulting BMI was considered implausible for the student's sex and age. CDC then sends edited data to the national survey contractor, whose statisticians weight the data according to the procedures described previously and then send the weights to CDC, which merges the weights onto the data file.
Reports and Publications
Reports
CDC generates a report for the national survey and for each participating site. Before 2013, each report contained approximately 500 pages in a three-ring binder divided into two sections: survey results and survey documentation. The survey results section included a sample description; bar charts and pictographs summarizing key results; tables and graphs that provided prevalence estimates for each question, including site-added questions; and a report that provided the results of trend analyses using logistic regression to test whether results have changed over time. The tables provided estimates overall and by sex, race/ethnicity, grade, and age and included 95% confidence intervals for all sites with weighted data. To help ensure the reliability of the estimates and protect the anonymity of respondents, subgroup results were not reported if any subgroup contained <100 students. The graphs provided estimates overall and by sex, grade, and race/ethnicity, and were provided as PowerPoint (37) files to facilitate presentation of results. CDC used SDMS to generate these reports with customized Visual Basic (33) programs that ran SAS (34), SUDAAN (35), and Crystal Reports (38) to generate multiple output files (28,37,39,40). The survey documentation section of the report included a copy of the site's questionnaire, a data user's guide describing how data were edited and how each variable was calculated, a codebook for the electronic data set, information on sampling and weighting, a Sample Statistics Report including the standard error and design effect for each variable, and additional resource documents such as "Understanding, Analyzing, and Presenting Your YRBS Data," "How to Interpret YRBS Trend Data," and "Software for Analyzing YRBS Data." 
In addition, each report contained a CD-ROM with an electronic copy of the site's data in multiple data file formats (e.g., SAS, SPSS [41], and ASCII) to permit subsequent analyses and an electronic copy of all the material described above.
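The subgroup-suppression rule described above can be sketched as a simple gate on each table cell. The function name and suppression marker are hypothetical; the actual reports are produced with SAS, SUDAAN, and SDMS:

```python
MIN_SUBGROUP_N = 100  # subgroups smaller than this are not reported

def report_estimate(prevalence_pct, n_students):
    """Format a prevalence estimate for a report table, suppressing it when
    the subgroup contains fewer than 100 students (marker text is assumed)."""
    if n_students < MIN_SUBGROUP_N:
        return "N/A"  # suppressed: reliability and anonymity cannot be ensured
    return f"{prevalence_pct:.1f}%"
```

Suppressing on subgroup size serves both goals named in the text: small denominators yield unstable estimates, and rare response patterns in small groups could make individual respondents identifiable.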
Beginning with the 2013 cycle, CDC will use the SAS Output Delivery System (34) instead of Crystal Reports (38) as part of SDMS. To reduce environmental impact and cost, all report materials described above will be provided in electronic format only on a CD-ROM. Sites have been using electronic files to share their data and results with others and to post results on their websites, and they will continue to be able to do so.
To ensure the timeliness of the data, CDC typically completes site reports within 12 weeks of receipt of completed questionnaires or answer sheets. Because surveys generally are completed in the spring, most sites receive their reports during the summer, so they can use their survey results to help plan for the coming school year.
In 2008, CDC conducted interviews with sites to determine how they use the CDC-generated report (42). According to those interviews, many sites use or adapt the report by providing copies to key persons (e.g., district and school administrators) or by posting the results on their agency website. Other sites use the data to create their own presentations or products such as summary reports, brochures, or fact sheets. Such products often combine YRBS results with those from other data sources (e.g., vital statistics and educational policy information). Some sites conduct further analyses of their YRBS data, make the data set available for others to conduct secondary analyses, or integrate the data into their own online data query systems.
The 2008 interviews also asked site coordinators how they used their YRBS results in general (42). In addition to documenting the prevalence of priority health-risk behaviors among youths, YRBS data also are used to inform professional development, plan and monitor programs, support health-related policies and legislation, seek funding, and garner support for future surveys.
Publications
During September 1991–April 1992, results from the national YRBS conducted in 1990 were published for the first time in 12 reports in the weekly MMWR (43–54). One of these reports also included state and large urban school district data (46). During June–December 1992, results from the national YRBS conducted in 1991 were published for the first time in five weekly MMWR reports (55–59) and in a Public Health Reports supplement (60); results from the 1991 state and large urban school district surveys were published for the first time in four of the five weekly MMWR reports (56–59).
Beginning with the 1993 surveys, data from the national, state, and large urban school district surveys have been published together in MMWR Surveillance Summaries (61–70). Each MMWR Surveillance Summary includes a brief introduction and description of methods, followed by results for each behavior measured. National results are presented by race/ethnicity and by school grade separately for each sex. State and large urban school district results are presented by sex, and include the minimum, maximum, and median prevalence estimates for states and large urban school district sites separately. This format allows results from individual sites to be compared with those from other states or large urban school districts as well as with the national estimates. Since the 1999 cycle, information also has been provided on trends. Since 2003, only sites with weighted data have been included.
In addition to providing descriptive information on the prevalence of health-risk behaviors among youths, YRBSS also provides researchers with data for secondary analyses. These analyses have resulted in the publication of reports in both the weekly MMWR and peer-reviewed journals. During 1994–2012, approximately 90 articles by CDC authors using YRBSS data were published in peer-reviewed journals, and 23 reports using YRBSS data were published in the weekly MMWR. Although the majority of these analyses used data from the national YRBS, others used data from the national household-based survey, the National College Health Risk Behavior Survey, the National Alternative High School Youth Risk Behavior Survey, and state-based surveys. A list of journal articles by CDC authors using YRBSS data is available at http://www.cdc.gov/healthyyouth/yrbs/articles.htm, and a list of the weekly MMWR reports is available at http://www.cdc.gov/healthyyouth/yrbs/publications.htm.
Researchers outside CDC also publish reports using YRBSS data. In May 2012, CDC reviewed the following databases: MEDLINE (National Library of Medicine, National Institutes of Health, Bethesda, Maryland, available at http://www.ncbi.nlm.nih.gov/PubMed), PsycINFO (American Psychological Association, Washington, DC, available at http://psycnet.apa.org), CAB (available at http://cabdirect.org), ERIC (Education Resources Information Center, U.S. Department of Education, available at http://www.eric.ed.gov), and Web of Knowledge (Thomson Reuters, New York, New York, available at http://apps.webofknowledge.com). This review documented at least 148 scientific publications (articles, book chapters, and dissertations) written solely by non-CDC authors that have been published since a similar review was conducted in 2004. This count does not include publications based on studies in which researchers created their own questionnaires by using selected questions or groups of questions from the YRBSS questionnaire. YRBSS data also are cited extensively in the media, including magazines, newspapers, radio, television, and Internet news sites.
Website
CDC maintains a website that includes information on YRBSS (http://www.cdc.gov/yrbss). The website includes links to MMWR Surveillance Summaries, a map of participating sites, questionnaires, a list of publications, and other resource materials, such as those provided to sites as part of their report. The site also includes 18 fact sheets containing national results. Some fact sheets present results for the most current survey year by sex and by race/ethnicity; others present trend results from 1991 to the most current survey year for all students, by race/ethnicity, and by topic (e.g., sexual behaviors). For site-specific results, fact sheets on obesity-related behaviors, sexual risk behaviors, and tobacco use also are included on the website for all sites with weighted data. These fact sheets also include results from School Health Profiles (71). For the 2011 cycle, 186 site-specific fact sheets were posted on the website. Most of these fact sheets also include a Quick Response code that users can scan with a smartphone or tablet computer to link automatically to the YRBSS website.
The YRBSS website also includes a data widget, a small customizable web program that can be put on any website to display YRBSS results quickly and conveniently. The widget program resides on CDC servers, so when CDC updates the program, the widgets on other websites are updated automatically.
A cornerstone of the YRBSS website is Youth Online, a user-friendly data-query application that allows users to view detailed survey results by site, question, demographic variables, and survey year. All weighted national, state, territorial, tribal, and large urban school district results from 1991 through 2011 are available for public use. Youth Online allows users to create tables and graphs of YRBSS results, compare results from different locations, and examine trends. The YRBSS website also includes approximately 580 links to Youth Online results for specific topics for specific sites. These links provide users with easy access to Youth Online results and provide examples of Youth Online capabilities. These links also increase efficiency because they replace static PDF files that required considerable formatting and production time.
The YRBSS website also includes data files and documentation for all national surveys conducted since 1991. When these files are downloaded, researchers can conduct their own analyses of the national data. The site also includes information for researchers who want to analyze state, territorial, tribal, and large urban school district YRBSS data. Although certain sites have given CDC permission to share their datasets directly, other sites require that researchers contact them to request data. Researchers who want state, territorial, tribal, or large urban school district datasets can use an online request form to request data or site contact information. In 2012, the YRBSS website received 498,277 visits, and Youth Online received 421,782 visits (CDC, unpublished data, 2012).
In 2012, CDC launched a web-based e-mail distribution list for YRBSS as part of CDC's "Get E-mail Updates" system. Users can subscribe to this CDC-administered list to receive bulletins with new information about YRBSS. These bulletins have included an announcement of the release of the 2011 MMWR Surveillance Summary and an announcement of the release of a weekly MMWR article reporting national YRBS trend data. At the end of 2012, this list had 74,068 subscribers (GovDelivery subscriber report, December 31, 2012).
Data Quality
From the inception of YRBSS, CDC has been committed to ensuring that the data are of the highest quality. Obtaining high-quality data begins with high-quality questions. As described previously, the original questionnaire was subjected to laboratory and field testing, and CDC conducted reliability testing of the 1991 and 1999 versions of the questionnaire. In addition, two studies have been conducted to assess the effect of implementing changes to the questions that assess race and ethnicity (72,73). CDC made these changes to comply with new standards established by the Office of Management and Budget in 1997 (74). The first study tested the effect of changing the question from one in which students were required to select a single response to one that allowed students to select one or more responses. The second study tested the effect of changing from a single-question format that asked about race and ethnicity together to a two-question format that asked separate questions about race and ethnicity. Both studies indicated that the changes to the questions had only a minimal effect on reported race/ethnicity and that trend analyses that included white, black, and Hispanic subgroups were not affected (72,73).
Another aspect of data quality is the level of nonresponse to questions. For the 2011 national YRBS, nonresponse attributed to blank responses, invalid responses, out-of-range responses, and responses that did not meet edit criteria ranged from 0.5% for the question that assesses the sex of the respondent to 14% for the question that assesses the race of the respondent. For 91% of all questions, the nonresponse rate was <5%, and for 16% of all questions, the nonresponse rate was <1%.
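Per-question nonresponse as described here is simply the share of records whose answer is blank, invalid, out of range, or fails an edit check. A toy sketch, assuming blanks are represented as `None` and each question has a set of valid response codes:

```python
def nonresponse_rate(responses, valid_codes):
    """Fraction of answers that are blank (None) or not among the valid
    response codes for the question (covering invalid, out-of-range, and
    edit-failing responses alike)."""
    bad = sum(1 for r in responses if r is None or r not in valid_codes)
    return bad / len(responses)
```

Applied per question across all records, this yields the kind of distribution reported above (e.g., nonresponse <5% for 91% of questions).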
To further ensure data quality, survey administrators use standardized procedures. To determine how using different procedures can affect survey results, CDC has conducted a series of methods studies. In the first study, conducted in 2002, CDC examined how prevalence estimates were affected by varying honesty appeals,§ the wording of questions, and data-editing protocols, while holding population, setting, questionnaire context, and mode of administration constant (75). The study indicated that different honesty appeals and data-editing protocols did not have a statistically significant effect on prevalence estimates. In addition, the study indicated that, although differences in the wording of questions can create statistically significant differences in certain prevalence estimates, no particular type of wording consistently produced higher or lower estimates.
In 2004, CDC conducted a study to determine how varying the mode and setting of survey administration might affect prevalence estimates. Although previous research had examined the effects of varying setting (school versus home) (76–79) and mode (paper-and-pencil instrument [PAPI] versus computer-assisted self-interview [CASI]) (80–83), this study was the first to assign school classes randomly to one of four conditions in which mode and setting were varied systematically: school-based administration using PAPI, school-based administration using CASI, home-based PAPI administration, and home-based CASI administration. Results revealed that students completing questionnaires at school were more likely to report health-risk behaviors than students completing questionnaires at home, but that mode effects were weaker: prevalence estimates for health-risk behaviors generally did not differ for CASI and PAPI administrations, and these effects were independent of setting (84). On the basis of these results, CDC decided to continue with PAPI administration for the YRBSS. Because the use of CASI did not increase the reporting of risk behaviors, its increased cost and complicated logistics did not appear to be justified.
In 2008, CDC conducted two additional studies to determine the feasibility and effect of conducting YRBS as a web-based survey. In the first study, classes in grades 9 and 10 were assigned randomly to one of three conditions for completing the YRBS: 1) using PAPI in the classroom, 2) using web-based CASI administration in school computer labs or in classrooms with sufficient numbers of computers, or 3) using web-based CASI that students could complete on their own (i.e., at any time at any computer with Internet access). Results indicated that risk behavior prevalence estimates generated from PAPI and CASI administered in a classroom setting in schools generally were equivalent (85). However, web-based CASI administration yielded more missing data than PAPI administration, and web-based "on their own" administration yielded an unacceptably low response rate. In addition, perceived and actual privacy and perceived anonymity were compromised when administering in-class web-based questionnaires (86).
In the second study, paper-and-pencil questionnaires were mailed to a nationally representative sample of public and private high school principals to assess computer availability in U.S. schools and to assess principals' perceptions of web-based student surveys. Although 64% of principals preferred web-based student surveys to those conducted via PAPI, only 30% said they would be more likely to agree to participate in such a survey if it were conducted online. Further, this study revealed that many schools do not have sufficient computer capacity to participate in an in-class web-based survey. As a result, web-based student surveys could create a significant burden on schools and lead to unacceptably low school participation rates (87). Taken together, the results of the 2008 studies informed CDC's decision not to convert the YRBSS from PAPI to web-based administration so the quality of the system could be maintained (85–87).
Limitations
YRBSS is subject to at least five limitations. First, all YRBSS data are self-reported, and the extent of underreporting or overreporting of behaviors cannot be determined, although studies described in this report demonstrate that the data are of acceptable quality. Second, the school-based national, state, territorial, tribal, and large urban school district survey data apply only to youths who attend school and therefore are not representative of all persons in this age group. Nationwide, in 2009, approximately 4% of persons aged 16–17 years were not enrolled in a high school program and had not completed high school (88). The Youth Risk Behavior Supplement to the NHIS, conducted in 1992, demonstrated that out-of-school youths are more likely than youths attending school to engage in the majority of health-risk behaviors (89). Third, local parental permission procedures are not consistent across school-based survey sites. However, in a 2004 study, CDC demonstrated that the type of parental permission typically does not affect prevalence estimates as long as student response rates remain high (90). Fourth, state-level data are not available for all 50 states. Three states (Minnesota, Oregon, and Washington) do not participate, and in 2011, four states (California, Missouri, Nevada, and Pennsylvania) did not obtain weighted data. Finally, YRBSS addresses only those behaviors that contribute to the leading causes of morbidity and mortality among youths and adults. However, school and community interventions should focus not only on behaviors but also on the determinants of those behaviors.
Global Youth Risk Behavior Survey
CDC has applied many of the features of YRBSS successfully to the Global Youth Risk Behavior Survey (G-YRBS), also known as the Global School-Based Student Health Survey. G-YRBS was developed by the World Health Organization (WHO) and CDC in collaboration with UNICEF, UNESCO, and UNAIDS. Since 2003, G-YRBS has provided data on health behaviors and protective factors among students in 84 developing countries. G-YRBS is a school-based survey conducted primarily among students aged 13–17 years. Like YRBSS, G-YRBS uses a standardized scientific sample selection process, common school-based methodology, and a standard questionnaire. The G-YRBS standard questionnaire addresses the leading causes of morbidity and mortality among children and adults worldwide and includes 10 core modules: 1) alcohol use; 2) dietary behaviors; 3) drug use; 4) hygiene; 5) mental health; 6) physical activity; 7) protective factors, such as parental supervision; 8) sexual behaviors that contribute to HIV infection, other STDs, and unintended pregnancy; 9) tobacco use; and 10) violence and unintentional injuries. A list of core-expanded questions also is available, and countries participating in G-YRBS can modify the standard questionnaire to meet country-specific needs just as state, territorial, and local agencies and tribal governments can modify the standard YRBSS questionnaire to meet their needs.
As CDC does for YRBSS sites, WHO and CDC provide ongoing capacity building and technical support to countries conducting a G-YRBS. Capacity building includes help with sample design and selection using PCSample, training of survey coordinators, provision of survey implementation handbooks, provision and scanning of computer-scannable answer sheets, data editing and weighting, and provision or facilitation of funding and other resources. WHO and CDC also offer two capacity-building workshops for participating countries. One of these workshops, the Survey Implementation Workshop, is similar to the YRBSS training that CDC provides to sites. This 3-day workshop builds the capacity of survey coordinators to implement the survey in their country following common sampling and survey administration procedures that ensure that the surveys are standardized and comparable across countries and that the data are of the highest quality. For countries that implement a G-YRBS successfully, CDC and WHO also provide a second 4-day workshop, the Data Analysis and Reporting Workshop, which trains survey coordinators to conduct data analysis and generate a country-specific report and fact sheet using Epi Info (91). Since 2001, persons from 124 countries have attended one or both of these workshops.
As with YRBSS sites, CDC also provides individualized technical assistance and monitors site progress for countries participating in G-YRBS. Once data are collected and completed answer sheets are shipped, CDC provides similar services for G-YRBS as for YRBSS, including scanning of answer sheets, data processing, and report production using SDMS. As with YRBSS, CDC maintains a public-facing G-YRBS website (available at http://www.cdc.gov/gshs) that includes country-specific questionnaires, fact sheets, and public-use datasets available for additional analyses. A G-YRBS bibliography also is available at http://www.who.int/chp/gshs/GSHS_Bibliography.pdf.
School Health Profiles
Another CDC surveillance system that uses many of the features of YRBSS is School Health Profiles (Profiles), which provides biennial data on school health policies and practices in secondary schools in states, territories, large urban school districts, and tribal governments (71). Profiles uses two standard computer-scannable questionnaire booklets to collect data, one for school principals and one for lead health education teachers. These questionnaires are mailed to the schools. When revising these standard questionnaires for each cycle, CDC uses a voting process similar to that used for YRBSS. To draw representative samples of schools for Profiles, CDC and Westat use a specialized software program called PCSchool that is similar to PCSample. Profiles technical assistance is similar to that of YRBSS in that it includes a comprehensive handbook (92), monthly newsletters, and instructional videos, as well as the same Survey Technical Assistance Website used for YRBSS. As with YRBSS, CDC also provides individualized technical assistance to Profiles sites and monitors site progress using a standardized tracking form. In addition, as is done for YRBSS, CDC produces a comprehensive report containing data from all participating states, large urban school districts, territories, and tribes with weighted data (71) and fact sheets for each Profiles cycle, and maintains a public-facing website (available at http://www.cdc.gov/schoolhealthprofiles). The Profiles website includes questionnaires, publications, fact sheets, and a PowerPoint (37) presentation.
Future Directions
YRBSS is evolving constantly to meet the needs of CDC and other users of the data. The questionnaire is revised before each biennial cycle, and new survey populations periodically have been added to the system since its inception. In the future, additional substate sampling and analysis might be possible, similar to the Selected Metropolitan/Micropolitan Area Risk Trends that are part of the Behavioral Risk Factor Surveillance System (93). Finally, although web-based administration is not recommended for YRBSS at this time, CDC will continue to monitor schools' computer capacity as well as the development of innovative and cost-effective methods that ensure students' privacy; such advances could permit online administration of the YRBS in the future.
References
- Johnston LD, O'Malley PM, Bachman JG, Schulenberg JE. Monitoring the Future national survey results on drug use, 1975–2011. Volume I: secondary school students. Ann Arbor, MI: Institute for Social Research, The University of Michigan; 2012.
- American School Health Association, Association for the Advancement of Health Education, Society for Public Health Education, Inc. The National Adolescent Student Health Survey: a report on the health of America's youth. Oakland, CA: Third Party Publishing; 1989.
- Kann L, Anderson JE, Holtzman D, et al. HIV-related knowledge, beliefs, and behaviors among high school students in the United States: results from a national survey. J Sch Health 1991;61:397–401.
- CDC. Methodology of the Youth Risk Behavior Surveillance System. MMWR 2004;53(No. RR-12).
- Public Health Service. Healthy people 2000: national health promotion and disease prevention objectives—full report, with commentary. Washington, DC: US Department of Health and Human Services, Public Health Service; 1990. DHHS publication no. (PHS) 91-50212.
- US Department of Health and Human Services. Healthy people 2010. With understanding and improving health and objectives for improving health. Washington, DC: US Department of Health and Human Services; 2000.
- US Department of Health and Human Services, Office of Disease Prevention and Health Promotion. Healthy people 2020. Available at http://www.healthypeople.gov. Accessed February 11, 2013.
- CDC. FY 2012 online performance appendix. Available at http://www.cdc.gov/fmo/topic/Performance/performance_docs/FY2012_CDC_Online_Performance_Appendix.pdf. Accessed February 11, 2013.
- Dryfoos JG. Adolescents at risk: prevalence and prevention. New York, NY: Oxford University Press; 1990.
- Adams PF, Schoenborn CA, Moss AJ, Warren CW, Kann L. Health-risk behaviors among our nation's youth: United States, 1992. Vital Health Stat 1995;192:1–51.
- CDC. Youth risk behavior surveillance: National College Health Risk Behavior Survey—United States, 1995. MMWR 1997;46(No. SS-6).
- Grunbaum J, Kann L, Kinchen SA, et al. Youth risk behavior surveillance: National Alternative High School Youth Risk Behavior Survey, United States, 1998. MMWR 1999;48(No. SS-7).
- Brener ND, Kann L, Garcia D, et al. Youth Risk Behavior Surveillance—selected Steps communities, 2005. MMWR 2007;56(No. SS-2):1–16.
- National Center for Health Statistics. Advance report of final mortality statistics, 1989. Hyattsville, MD: US Department of Health and Human Services, Public Health Service, National Center for Health Statistics; 1992. DHHS publication no. PHS 92-1120.
- CDC, National Center for Health Statistics. Mortality data file for 2008 with all state identifiers [CD-ROM]; 2011. Available at http://www.cdc.gov/nchs/data_access/cmf.htm.
- Hofferth SL. Teenage pregnancy and its resolution. In: Hofferth SL, Hayes CD, eds. Risking the future: adolescent sexuality, pregnancy and childbearing. Washington, DC: National Academy Press; 1987:78–92.
- CDC. 1990 Division of STD/HIV prevention annual report, 1990. Atlanta, GA: US Department of Health and Human Services, CDC; 1991.
- CDC. Vital signs: teen pregnancy—United States, 1991–2009. MMWR 2011;60:414–20.
- CDC. Tracking the hidden epidemics: trends in STDs in the United States. Atlanta, GA: US Department of Health and Human Services, CDC; 2000.
- CDC. Sexually transmitted disease morbidity for selected STDs by age, race/ethnicity and gender, 1996–2009, CDC WONDER online database, June 2011. Available at http://wonder.cdc.gov/std-std-race-age.html. Accessed February 11, 2013.
- US Department of Education, National Education Goals Panel. Goal 6: safe, disciplined, and drug-free schools. In: US Department of Education. Measuring progress toward the National Education Goals: potential indicators and measurement strategies. Washington, DC: US Department of Education, 1991; publication no. 91-01.
- Brener ND, Collins JL, Kann L, Warren CW, Williams BI. Reliability of the Youth Risk Behavior Survey questionnaire. Am J Epidemiol 1995;141:575–80.
- Brener ND, Kann L, McManus T, Kinchen SA, Sundberg EC, Ross JG. Reliability of the 1999 Youth Risk Behavior Survey questionnaire. J Adolesc Health 2002;31:336–42.
- Brener ND, Billy JOG, Grady WR. Assessment of factors affecting the validity of self-reported health-risk behavior among adolescents: evidence from the scientific literature. J Adolesc Health 2003;33:436–57.
- Brener ND, McManus T, Galuska DA, Lowry R, Wechsler H. Reliability and validity of self-reported height and weight among high school students. J Adolesc Health 2003;32:281–7.
- CDC. 2013 Handbook for conducting Youth Risk Behavior Surveys. Atlanta, GA: US Department of Health and Human Services, CDC; 2012.
- US Department of Education, National Center for Education Statistics. Common core of data, Public Elementary/Secondary School Universe Survey. Available at http://nces.ed.gov/ccd. Accessed February 11, 2013.
- Microsoft Corporation. Microsoft Excel 2010. Redmond, WA: Microsoft Corporation; 2010.
- Market Data Retrieval. National Education Database Master Extract. Shelton, CT: Market Data Retrieval, Inc.; 2010.
- National Center for Education Statistics. Digest of Education Statistics, 2011. Available at http://nces.ed.gov/programs/digest. Accessed February 11, 2013.
- Potter FJ. A study of procedures to identify and trim extreme sampling weights. In: American Statistical Association. Proceedings of the Section on Survey Research Methods of the American Statistical Association. Research Triangle Park, NC: American Statistical Association; 1990:225–30.
- Eaton DK, Brener N, Kann LK. Associations of health risk behaviors with school absenteeism: does having permission for the absence make a difference? J Sch Health 2008;78:223–9.
- Microsoft Corporation. Visual Studio 2008, professional edition. Redmond, WA: Microsoft Corporation; 2007.
- SAS Institute, Inc. SAS, Version 9.2. Cary, NC: SAS Institute; 2008.
- Research Triangle Institute. SUDAAN: software for the statistical analysis of correlated data, release 10. Research Triangle Park, NC: Research Triangle Institute; 2008.
- CDC. 2011 YRBS data user's guide, 2012. Available at ftp://ftp.cdc.gov/pub/data/yrbs/2011/YRBS_2011_National_User_Guide.pdf. Accessed February 11, 2013.
- Microsoft Corporation. PowerPoint 2010. Redmond, WA: Microsoft Corporation; 2010.
- Business Objects Software, Ltd. Crystal Reports, Version 10.0.0.533. Dublin, Ireland: Business Objects Software, Ltd.; 2003.
- Microsoft Corporation. Microsoft Word 2010. Redmond, WA: Microsoft Corporation; 2010.
- Adobe Systems, Inc., Adobe Acrobat, version 9.5.0. San Jose, CA: Adobe Systems, Inc.; 2010.
- SPSS, Inc. SPSS for Windows, Release 19.0.0. Chicago, IL: SPSS Inc.; 2010.
- Foti K, Balaji A, Shanklin S. Uses of Youth Risk Behavior Survey and School Health Profiles data: applications for improving adolescent and school health. J Sch Health 2011;81:345–54.
- CDC. Participation of high school students in school physical education—United States, 1990. MMWR 1991;40:607, 613–5.
- CDC. Tobacco use among high school students—United States, 1990. MMWR 1991;40:617–9.
- CDC. Attempted suicide among high school students—United States, 1990. MMWR 1991;40:633–5.
- CDC. Current tobacco, alcohol, marijuana, and cocaine use among high school students—United States, 1990. MMWR 1991;40:659–63.
- CDC. Weapon-carrying among high school students—United States, 1990. MMWR 1991;40:681–4.
- CDC. Body-weight perceptions and selected weight-management goals and practices of high school students—United States, 1990. MMWR 1991;40:741, 747–50.
- CDC. Alcohol and other drug use among high school students—United States, 1990. MMWR 1991;40:776–7, 783–4.
- CDC. Sexual behavior among high school students—United States, 1990. MMWR 1991;40:885–8.
- CDC. Vigorous physical activity among high school students—United States, 1990. MMWR 1992;41:33–5.
- CDC. Physical fighting among high school students—United States, 1990. MMWR 1992;41:91–4.
- CDC. Safety-belt use and helmet use among high school students—United States, 1990. MMWR 1992;41:111–4.
- CDC. Selected behaviors that increase risk for HIV infection among high school students—United States, 1990. MMWR 1992;41:231, 237–40.
- CDC. Selected tobacco-use behaviors and dietary patterns among high school students—United States, 1991. MMWR 1992;41:417–21.
- CDC. Participation in school physical education and selected dietary patterns among high school students—United States, 1991. MMWR 1992;41:597–601, 607.
- CDC. Tobacco, alcohol, and other drug use among high school students—United States, 1991. MMWR 1992;41:698–703.
- CDC. Behaviors related to unintentional and intentional injuries among high school students—United States, 1991. MMWR 1992;41:760–5, 771–2.
- CDC. Selected behaviors that increase risk for HIV infection, other sexually transmitted diseases, and unintended pregnancy among high school students—United States, 1991. MMWR 1992;41:945–50.
- Kann L, Warren W, Collins JL, Ross J, Collins B, Kolbe LJ. Results from the national school-based 1991 Youth Risk Behavior Survey and progress toward achieving related health objectives for the nation. Public Health Rep 1993;108(Suppl 1):47–55.
- Kann L, Warren CW, Harris WA, et al. Youth risk behavior surveillance—United States, 1993. MMWR 1995;44(No. SS-1).
- Kann L, Warren CW, Harris WA, et al. Youth risk behavior surveillance—United States, 1995. MMWR 1996;45(No. SS-4).
- Kann L, Kinchen SA, Williams BI, et al. Youth risk behavior surveillance—United States, 1997. MMWR 1998;47(No. SS-3).
- Kann L, Kinchen SA, Williams BI, et al. Youth risk behavior surveillance—United States, 1999. MMWR 2000;49(No. SS-5).
- Grunbaum J, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—United States, 2001. MMWR 2002;51(No. SS-4).
- Grunbaum J, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—United States, 2003. MMWR 2004;53(No. SS-2).
- Eaton D, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—United States, 2005. MMWR 2006;55(No. SS-5).
- Eaton D, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—United States, 2007. MMWR 2008;57(No. SS-4).
- Eaton D, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—United States, 2009. MMWR 2010;59(No. SS-5).
- Eaton D, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—United States, 2011. MMWR 2012;61(No. SS-4).
- Brener ND, Demissie Z, Foti K, et al. School Health Profiles 2010: characteristics of health programs among secondary schools. Atlanta, GA: US Department of Health and Human Services, CDC; 2011.
- Brener ND, Kann L, McManus T. A comparison of two survey questions on race and ethnicity among high school students. Public Opinion Quarterly 2003;67:227–36.
- Eaton DK, Brener ND, Kann L, Pittman V. High school student responses to different question formats assessing race/ethnicity. J Adolesc Health 2007;41:488–94.
- Office of Management and Budget. Revisions to the standards for the classification of federal data on race and ethnicity. Federal Register 1997;62:58781–90.
- Brener ND, Grunbaum JA, Kann L, McManus T, Ross J. Assessing health risk behaviors among adolescents: the effect of question wording and appeals for honesty. J Adolesc Health 2004;35:91–100.
- Kann L, Brener ND, Warren CW, Collins JL, Giovino GA. An assessment of the effect of data collection setting on the prevalence of health-risk behaviors among adolescents. J Adolesc Health 2002;31:327–35.
- Gfroerer J, Wright D, Kopstein A. Prevalence of youth substance use: the impact of methodological differences between two national surveys. Drug Alcohol Depend 1997;47:19–30.
- Rootman I, Smart RG. A comparison of alcohol, tobacco and drug use as determined from household and school surveys. Drug Alcohol Depend 1985;16:89–94.
- Needle R, McCubbin H, Lorence J, Hochhauser M. Reliability and validity of adolescent self-reported drug use in a family-based study: a methodological report. International J Addictions 1983;18:901–12.
- Turner CF, Ku L, Rogers SM, Lindberg LD, Pleck JH, Sonenstein FL. Adolescent sexual behavior, drug use, and violence: increased reporting with computer survey technology. Science 1998;280:867–73.
- Wright DL, Aquilino WS, Supple AJ. A comparison of computer-assisted and paper-and-pencil self-administered questionnaires in a survey on smoking, alcohol, and drug use. Public Opinion Quarterly 1998;62:331–53.
- Beebe TJ, Harrison PA, McCrae JA Jr, Anderson RE, Fulkerson JA. An evaluation of computer-assisted self-interviews in a school setting. Public Opinion Quarterly 1998;62:623–32.
- Hallfors D, Khatapoush S, Kadushin C, Watson K, Saxe L. A comparison of paper vs computer-assisted self interview for school alcohol, tobacco, and other drug surveys. Evaluation and Program Planning 2000;23:149–55.
- Brener ND, Eaton DK, Kann L, et al. The association of survey setting and mode with self-reported health risk behaviors among high school students. Public Opinion Quarterly 2006;70:354–74.
- Eaton DK, Brener N, Kann L, et al. Comparison of paper-and-pencil versus web administration of the Youth Risk Behavior Survey (YRBS): health-risk behavior prevalence estimates. Evaluation Review 2010;34:137–53.
- Denniston MM, Brener N, Kann L, et al. Comparison of web-based versus paper-and-pencil administration of the Youth Risk Behavior Survey (YRBS): participation, data quality, and perceived privacy and anonymity by mode of data collection. Computers in Human Behavior 2010;26:1054–60.
- Eaton DK, Brener ND, Kann L, et al. Computer availability and principals' perception of online surveys. J Sch Health 2011;81:365–73.
- Chapman C, Laird J, Ifill N, KewalRamani A. Trends in high school dropout and completion rates in the United States: 1972–2009 (NCES 2012–006). Available at http://nces.ed.gov/pubs2012/2012006.pdf. Accessed February 11, 2013.
- CDC. Health risk behaviors among adolescents who do and do not attend school—United States, 1992. MMWR 1994;43:129–32.
- Eaton DK, Lowry R, Brener ND, Grunbaum JA, Kann L. Passive versus active parental permission in school-based survey research: does type of permission affect prevalence estimates of self-reported risk behaviors? Evaluation Review 2004;28:564–77.
- CDC. Epi Info, Release 3.5.1. [Software and documentation]. Atlanta, GA: US Department of Health and Human Services, CDC; 2008.
- CDC. Handbook for developing School Health Profiles. Atlanta, GA: US Department of Health and Human Services, CDC; 2012.
- CDC. SMART: selected metropolitan/micropolitan area risk trends. Available at http://apps.nccd.cdc.gov/brfss-smart/index.asp. Accessed February 11, 2013.
TABLE 1. National health objectives and a leading health indicator measured by the national Youth Risk Behavior Survey*

| Topic area/Objective no. | Objective |
|---|---|
| **Cancer** | |
| C-20.3 | Reduce the proportion of adolescents in grades 9–12 who report using artificial sources of ultraviolet light for tanning |
| C-20.5 | Increase the proportion of adolescents in grades 9–12 who follow protective measures that may reduce the risk of skin cancer |
| **Injury and violence prevention** | |
| IVP-34 | Reduce physical fighting among adolescents |
| IVP-35 | Reduce bullying among adolescents |
| IVP-36 | Reduce weapon carrying by adolescents on school property |
| **Mental health and mental disorders** | |
| MHMD-2 | Reduce suicide attempts by adolescents |
| MHMD-3 | Reduce the proportion of adolescents who engage in disordered eating behaviors in an attempt to control their weight |
| **Physical activity** | |
| PA-3.1 | Increase the proportion of adolescents who meet current Federal physical activity guidelines for aerobic physical activity |
| PA-3.2 | Increase the proportion of adolescents who meet current Federal physical activity guidelines for muscle-strengthening activity |
| PA-3.3 | Increase the proportion of adolescents who meet current Federal physical activity guidelines for aerobic physical activity and for muscle-strengthening activity |
| PA-5 | Increase the proportion of adolescents who participate in daily school physical education |
| PA-8.2.3 | Increase the proportion of adolescents in grades 9–12 who view television, videos, or play video games for no more than 2 hours a day |
| PA-8.3.3 | Increase the proportion of adolescents in grades 9–12 who use a computer or play computer games outside of school (for nonschool work) for no more than 2 hours a day |
| **Sleep health** | |
| SH-3 | Increase the proportion of students in grades 9–12 who get sufficient sleep |
| **Substance abuse** | |
| SA-1 | Reduce the proportion of adolescents who report that they rode, during the previous 30 days, with a driver who had been drinking alcohol |
| **Tobacco use** | |
| TU-2.1 | Reduce the use of tobacco products by adolescents (past month) |
| TU-2.2† | Reduce the use of cigarettes by adolescents (past month) |
| TU-2.3 | Reduce the use of smokeless tobacco products by adolescents (past month) |
| TU-2.4 | Reduce the use of cigars by adolescents (past month) |
| TU-7 | Increase smoking cessation attempts by adolescent smokers |

\* Source: National health objectives and leading health indicators are determined by the US Department of Health and Human Services. Adapted from US Department of Health and Human Services. Healthy People 2020. Available at http://www.healthypeople.gov/2020/default.aspx.
† Leading health indicator.
\* Skip patterns occur when a particular response to one question indicates to the respondents that they should not answer one or more subsequent questions.
† Areas with a population of ≥500,000 persons.
§ Honesty appeals, typically part of questionnaire introductions, ask respondents to be truthful when self-reporting behaviors.
Use of trade names and commercial sources is for identification only and does not imply endorsement by the U.S. Department of Health and Human Services.
References to non-CDC sites on the Internet are provided as a service to MMWR readers and do not constitute or imply endorsement of these organizations or their programs by CDC or the U.S. Department of Health and Human Services. CDC is not responsible for the content of pages found at these sites. URL addresses listed in MMWR were current as of the date of publication.
All MMWR HTML versions of articles are electronic conversions from typeset documents. This conversion might result in character translation or format errors in the HTML version. Users are referred to the electronic PDF version (http://www.cdc.gov/mmwr) and/or the original MMWR paper copy for printable versions of official text, figures, and tables. An original paper copy of this issue can be obtained from the Superintendent of Documents, U.S. Government Printing Office (GPO), Washington, DC 20402-9371; telephone: (202) 512-1800. Contact GPO for current prices.
Questions or messages regarding errors in formatting should be addressed to mmwrq@cdc.gov.