Methodology of the Youth Risk Behavior Surveillance System — 2013

Centers for Disease Control & Prevention (CDC)
Methodology of the Youth Risk Behavior Surveillance System — 2013
Author(s): Nancy D. Brener, Laura Kann, Shari Shanklin, Steve Kinchen, Danice K. Eaton,
Joseph Hawkins and Katherine H. Flint
Source: Morbidity and Mortality Weekly Report: Recommendations and Reports, Vol. 62,
No. 1 (March 1, 2013), pp. 1-20
Published by: Centers for Disease Control & Prevention (CDC)
Stable URL: https://www.jstor.org/stable/10.2307/24832543
REFERENCES
Linked references are available on JSTOR for this article:
https://www.jstor.org/stable/10.2307/24832543?seq=1&cid=pdf-reference#references_tab_contents
You may need to log in to JSTOR to access the linked references.
JSTOR is a not-for-profit service that helps scholars, researchers, and students discover, use, and build upon a wide
range of content in a trusted digital archive. We use information technology and tools to increase productivity and
facilitate new forms of scholarship. For more information about JSTOR, please contact support@jstor.org.
Your use of the JSTOR archive indicates your acceptance of the Terms & Conditions of Use, available at
https://about.jstor.org/terms
Centers for Disease Control & Prevention (CDC) is collaborating with JSTOR to digitize, preserve, and extend access to Morbidity and Mortality Weekly Report: Recommendations and Reports.
This content downloaded from
105.231.112.221 on Tue, 04 Dec 2018 04:40:42 UTC
All use subject to https://about.jstor.org/terms

Recommendations and Reports
Methodology of the
Youth Risk Behavior Surveillance System — 2013
Prepared by
Nancy D. Brener, PhD1
Laura Kann, PhD1
Shari Shanklin, MS1
Steve Kinchen1
Danice K. Eaton, PhD2
Joseph Hawkins, MA3
Katherine H. Flint, MS4
1Division of Adolescent and School Health, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, CDC
2Division of Human Development and Disability, National Center for Birth Defects and Developmental Disabilities
3Westat, Rockville, Maryland
4ICF International, Calverton, Maryland
Summary
Priority health-risk behaviors (i.e., interrelated and preventable behaviors that contribute to the leading causes of morbidity and mortality among youths and adults) often are established during childhood and adolescence and extend into adulthood. The Youth Risk Behavior Surveillance System (YRBSS), established in 1991, monitors six categories of priority health-risk behaviors among youths and young adults: 1) behaviors that contribute to unintentional injuries and violence; 2) sexual behaviors that contribute to human immunodeficiency virus (HIV) infection, other sexually transmitted diseases, and unintended pregnancy; 3) tobacco use; 4) alcohol and other drug use; 5) unhealthy dietary behaviors; and 6) physical inactivity. In addition, YRBSS monitors the prevalence of obesity and asthma among this population.
YRBSS data are obtained from multiple sources including a national school-based survey conducted by CDC as well as school-based state, territorial, tribal, and large urban school district surveys conducted by education and health agencies. These surveys have been conducted biennially since 1991 and include representative samples of students in grades 9–12. In 2004, a description of the YRBSS methodology was published (CDC. Methodology of the Youth Risk Behavior Surveillance System. MMWR 2004;53[No. RR-12]). Since 2004, improvements have been made to YRBSS, including increases in coverage and technical assistance. This report describes these changes and updates earlier descriptions of the system, including questionnaire content; operational procedures; sampling, weighting, and response rates; data-collection protocols; data-processing procedures; reports and publications; and data quality. This report also includes results of methods studies that systematically examined how survey procedures affect prevalence estimates. YRBSS continues to evolve to meet the needs of CDC and other data users through the ongoing revision of the questionnaire, the addition of new populations, and the development of innovative methods for data collection.
The material in this report originated in the National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, Rima F. Khabbaz, MD, Acting Director; and the Division of Adolescent and School Health, Howell Wechsler, EdD, Director.
Corresponding preparer: Nancy D. Brener, PhD, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, 4770 Buford Highway NE, MS K-33, Atlanta, GA 30341; Telephone: 770-488-6184; Fax: 770-488-6156; E-mail: nad1@cdc.gov.

Background and Rationale
Data from surveillance systems are critical for planning and evaluating public health programs. During the late 1980s, when CDC began funding education agencies to implement school-based programs to prevent human immunodeficiency virus (HIV), only a limited number of health-related school-based surveys existed in the United States to inform program planning and evaluation. The Monitoring the Future study had been ongoing since 1975 (1). This study measures drug use and related determinants in a national sample of students in grade 12; it has since been expanded to include students in grades 8 and 10 and a broader set of health-risk behaviors. In 1987, the one-time National Adolescent Student Health Survey was administered to a nationally representative sample of students in grades 8 and 10; this survey measured health skills (e.g., reading food and drug labels), alcohol and other drug use, injury prevention, nutrition, knowledge and behaviors about sexually transmitted diseases (STDs) and acquired immunodeficiency syndrome (AIDS), attempted suicide, and violence-related behaviors (2). In addition, in the late 1980s, CDC conducted a national survey to measure knowledge, beliefs, and behaviors concerning HIV among high school students (3). However, surveys conducted only on a national level, one-time surveys, and surveys addressing only certain categories of health-risk behaviors could not meet the needs of the state,
MMWR / March 1, 2013 / Vol. 62 / No. 1
territorial, and local education and health agencies that had
begun receiving funding to implement school health programs.
More specifically, in 1987, CDC began providing financial
and technical assistance to state, territorial, and local education
agencies to implement effective HIV prevention programs
for youths. Since 1992, CDC also has provided financial and
technical assistance to state education agencies to implement
additional broad-based programs, often referred to as
“coordinated school health programs,” which focus on obesity
and tobacco use prevention. Since 2008, CDC also has funded
tribal governments for HIV prevention and coordinated school
health programs.
Before 1991, school-based HIV prevention programs and
coordinated school health programs frequently were developed
without empiric information on the prevalence of key behaviors
that most influence health and on how those behaviors varied
over time and across subgroups of students. To plan and
help determine the effectiveness of school health programs,
public health and education officials need to understand how
programs influence the health-risk behaviors associated with
the leading causes of morbidity and mortality among youths
and adults in the United States.
In 1991, to address the need for data on the health-risk
behaviors that contribute substantially to the leading causes of
morbidity and mortality among U.S. youths and young adults,
CDC developed the Youth Risk Behavior Surveillance System
(YRBSS), which monitors six categories of priority health-risk
behaviors among youths and young adults: 1) behaviors that
contribute to unintentional injuries and violence; 2) sexual
behaviors that contribute to HIV infection, other STDs, and
unintended pregnancy; 3) tobacco use; 4) alcohol and other drug
use; 5) unhealthy dietary behaviors; and 6) physical inactivity.
In addition, the surveillance system monitors the prevalence of
obesity and asthma among this population. The system includes
a national school-based survey conducted by CDC as well as
school-based state, territorial, tribal, and large urban school
district surveys conducted by education and health agencies. In
these surveys, conducted biennially since 1991, representative
samples of students typically in grades 9–12 are drawn. In 2004,
a description of the YRBSS methodology was published (4).
This updated report discusses changes that have been made to
YRBSS since 2004 and provides an updated, detailed description
of the features of the system, including questionnaire content;
operational procedures; sampling, weighting, and response rates;
data-collection protocols; data-processing procedures; reports
and publications; and data quality. This report also includes
results of new methods studies on the use of computer-based data
collection and describes enhancements made to the technical
assistance system that supports state, territorial, tribal, and large
urban school district surveys.
Purposes of YRBSS
YRBSS has multiple purposes. The system was designed to enable public health professionals, educators, policy makers, and researchers to 1) describe the prevalence of health-risk behaviors among youths, 2) assess trends in health-risk behaviors over time, and 3) evaluate and improve health-related policies and programs. YRBSS also was designed to provide comparable national, state, territorial, tribal, and large urban school district data as well as comparable data among subpopulations of youths (e.g., racial/ethnic subgroups) and to monitor progress toward achieving national health objectives (5–7) (Table 1) and other program indicators (e.g., CDC's performance on selected Government Performance and Results Act measures) (8). Although YRBSS is designed to provide information to help assess the effect of broad national, state, territorial, tribal, and local policies and programs, it was not designed to evaluate the effectiveness of specific interventions (e.g., a professional development program, school curriculum, or media campaign).
As YRBSS was being developed, CDC decided that the system should focus almost exclusively on health-risk behaviors rather than on the determinants of these behaviors (e.g., knowledge, attitudes, beliefs, and skills), because there is a more direct connection between specific health-risk behaviors and specific health outcomes than between determinants of behaviors and health outcomes. Many behaviors (e.g., alcohol and other drug use and sexual behaviors) measured by YRBSS also are associated with educational and social outcomes, including absenteeism, poor academic achievement, and dropping out of school (9).
Data Sources
YRBSS data sources include ongoing surveys as well as one-time national surveys, special-population surveys, and methods studies. The ongoing surveys include school-based national, state, tribal, and large urban school district surveys of representative samples of high school students and, in certain sites, representative state, territorial, and large urban school district surveys of middle school students. These ongoing surveys are conducted biennially; each cycle begins in the fall of the preceding even-numbered year (e.g., in 2010 for the 2011 cycle), when the questionnaire for the upcoming cycle is released, and continues until the data are published in the following even-numbered year (e.g., in 2012 for the 2011 cycle). This section describes the ongoing surveys, one-time national surveys, and special-population surveys. Methods studies are described elsewhere in this report (see Data Quality).
TABLE 1. National health objectives and a leading health indicator measured by the national Youth Risk Behavior Survey*
Topic area/Objective no. Objective
Cancer
C-20.3 Reduce the proportion of adolescents in grades 9–12 who report using artificial sources of ultraviolet light for tanning
C-20.5 Increase the proportion of adolescents in grades 9–12 who follow protective measures that may reduce the risk of skin cancer
Injury and violence prevention
IVP-34 Reduce physical fighting among adolescents
IVP-35 Reduce bullying among adolescents
IVP-36 Reduce weapon carrying by adolescents on school property
Mental health and mental disorders
MHMD-2 Reduce suicide attempts by adolescents
MHMD-3 Reduce the proportion of adolescents who engage in disordered eating behaviors in an attempt to control their weight
Physical activity
PA-3.1 Increase the proportion of adolescents who meet current Federal physical activity guidelines for aerobic physical activity
PA-3.2 Increase the proportion of adolescents who meet current Federal physical activity guidelines for muscle-strengthening activity
PA-3.3 Increase the proportion of adolescents who meet current Federal physical activity guidelines for aerobic physical activity and muscle-strengthening activity
PA-5 Increase the proportion of adolescents who participate in daily school physical education
PA-8.2.3 Increase the proportion of adolescents in grades 9–12 who view television, videos, or play video games for no more than 2 hours a day
PA-8.3.3 Increase the proportion of adolescents in grades 9–12 who use a computer or play computer games outside of school (for non-school work) for no more than 2 hours a day
Sleep health
SH-3 Increase the proportion of students in grades 9–12 who get sufficient sleep
Substance abuse
SA-1 Reduce the proportion of adolescents who report that they rode, during the previous 30 days, with a driver who had been drinking alcohol
Tobacco use
TU-2.1 Reduce the use of tobacco products by adolescents (past month)
TU-2.2 Reduce the use of cigarettes by adolescents (past month)
TU-2.3 Reduce the use of smokeless tobacco products by adolescents (past month)
TU-2.4 Reduce the use of cigars by adolescents (past month)
TU-7 Increase smoking cessation attempts by adolescent smokers
* Source: National health objectives and leading health indicators are determined by the US Department of Health and Human Services. Adapted from US Department of Health and Human Services. Healthy People 2020. Available at http://www.healthypeople.gov/2020/default.aspx.
† Leading health indicator.
This report focuses predominantly on the ongoing school-
based national, state, territorial, tribal, and large urban school
district surveys. The national Youth Risk Behavior Survey
(YRBS) provides data representative of students in grades 9–12
attending U.S. high schools. State, territorial, tribal, and large
urban school district surveys provide data representative of high
school students or middle school students in states, territories,
tribal governments, and large urban school districts that receive
funding from CDC through cooperative agreements. Starting
in 2013, education or health agencies in all 50 states, seven
territorial education agencies, and 31 local education agencies
are eligible to receive funding to conduct a YRBS.
One-Time National Surveys
Several one-time national surveys have been conducted as
part of YRBSS. These one-time surveys include a Youth Risk
Behavior Supplement, which was added to the 1992 National
Health Interview Survey to provide information regarding
persons aged 12–21 years, including youths attending school
as well as those not attending school (10); a National College
Health Risk Behavior Survey, which was conducted in 1995 to measure the prevalence of health-risk behaviors among undergraduate students enrolled in public and private 2-year and 4-year colleges and universities (11); and a National Alternative High School Youth Risk Behavior Survey, which was conducted in 1998 to measure selected health-risk behaviors among a nationally representative sample of students in grades 9–12 attending alternative high schools (12).
In 2010, also as part of YRBSS, CDC conducted the National Youth Physical Activity and Nutrition Study (NYPANS), which was designed to 1) provide nationally representative data on behaviors and behavioral determinants related to nutrition and physical activity among high school students, 2) provide data to help improve the clarity and the performance of questions on the YRBSS questionnaire, and 3) enhance understanding of the associations between behaviors and behavioral determinants related to physical activity and nutrition and their association with body mass index (BMI) (weight [kg] / height [m]²). The study included a paper-and-pencil questionnaire administered to a nationally representative sample of 11,429 students attending public
and private schools in grades 9–12, a standardized protocol to
measure height and weight among all students completing the
questionnaire, and telephone interviews to measure 24-hour
dietary recalls among a subsample of 909 students (8% of those
who completed questionnaires).
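NYPANS's standardized height and weight protocol supports computing BMI with the formula noted above (weight in kilograms divided by the square of height in meters). As a minimal illustration of that arithmetic (the helper function below is ours, not part of the YRBSS or NYPANS protocol):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / (height_m ** 2)

# Example: a student weighing 68 kg at 1.70 m tall
print(round(bmi(68, 1.70), 1))  # 23.5
```

In the surveys themselves, BMI is derived either from measured values (as in NYPANS) or from self-reported height and weight (as in the standard YRBS questions discussed later in this report).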
Special-Population Surveys
Special-population surveys related to short-term federal
initiatives have been conducted periodically as part of
YRBSS. In 2005, 2007, and 2009, a total of 40 communities
participating in the Steps to a HealthierUS program (13)
conducted at least one school-based survey of students in grades
9–12 in their program intervention areas. These communities
used a modified YRBSS questionnaire that measured dietary
behaviors, physical activity, and tobacco use and the prevalence
of obesity and asthma (13). In 2010 and 2011, a total of
44 communities participating in the Communities Putting
Prevention to Work (CPPW) program conducted school-
based surveys of students in grades 9–12 in their program
intervention areas. Six communities used the standard YRBSS
questionnaire, and 38 used a modified questionnaire that
measured dietary behaviors, physical activity, and tobacco use
and the prevalence of obesity. In 2012 and 2013, a total of 17
CPPW communities will conduct a second YRBS.
In addition to the fiscal and technical support provided
through cooperative agreements to tribal governments to
conduct a YRBS, CDC also provides technical assistance for
other surveys of American Indian youths. Since 1994, the
Bureau of Indian Education (BIE) has conducted a YRBS
periodically among American Indian youths attending middle
and high schools funded by BIE. Since 1997, the Navajo
Nation has conducted a YRBS periodically in schools on
Navajo reservations and in border town schools having high
Navajo enrollment. In 2011, CDC also provided technical
assistance to the Nez Perce Tribe to conduct a YRBS.
Questionnaire
Initial Questionnaire Development
To determine which health-risk behaviors YRBSS would
assess initially, CDC first reviewed the leading causes of
morbidity and mortality among youths and adults. In
1988, four causes accounted for 68% of all deaths among
persons aged 1–24 years: motor-vehicle crashes (31%), other
unintentional injuries (14%), homicide (13%), and suicide
(10%) (14). In 2008, of all deaths among persons aged 10–24
years, 72% were attributed to these four causes: 26% resulted
from motor-vehicle crashes, 17% from other unintentional
injuries, 16% from homicide, and 13% from suicide (15). In 1988, substantial morbidity also resulted from the approximately 1 million pregnancies occurring among adolescents (16) and the estimated 12 million cases of STDs among persons aged 15–29 years (17). Although rates of pregnancy and STDs among adolescents have decreased since 1991, pregnancy and STDs, including HIV infection, remain important public health problems among youths. In 1988, approximately two thirds of all deaths among adults aged >25 years resulted from cardiovascular disease (41%) and cancer (23%). By 2008, the percentage of deaths among persons in this age group resulting from cardiovascular disease had decreased, whereas the percentage resulting from cancer remained at 23%.
These serial reviews indicate that virtually all behaviors contributing to the leading causes of morbidity and mortality can be placed into six priority health-risk behavior categories: 1) behaviors that contribute to unintentional injuries and violence; 2) sexual behaviors that contribute to HIV infection, other STDs, and unintended pregnancy; 3) tobacco use; 4) alcohol and other drug use; 5) unhealthy dietary behaviors; and 6) physical inactivity. These behaviors frequently are interrelated and often are established during childhood and adolescence and extend into adulthood.
In 1989, CDC asked each of the federal agencies responsible for improving or monitoring the incidence and prevalence of behavioral risks in each of the six categories to appoint a representative to serve on a YRBSS steering committee. In August 1989, CDC and steering committee members convened a 2-day meeting to identify priority behaviors and devise questions to measure those behaviors. For each of the six priority health-risk behavior categories, a panel was established that included experts from other federal agencies, including the U.S. Department of Education, the National Institutes of Health, the Health Resources and Services Administration, and the Office of the Assistant Secretary for Health, as well as experts from academic institutions, survey research specialists from CDC's National Center for Health Statistics (NCHS), and staff from CDC's Division of Adolescent and School Health. Because YRBSS was to be implemented primarily through school-based surveys, a representative of the Society of State Directors of Health, Physical Education, and Recreation, an organization of state leaders of school-based health programs, also was included on each panel. Because students would have a single class period of approximately 45 minutes to complete the YRBSS questionnaire covering all six priority health-risk behavior categories, each panel was asked to identify the highest priority behaviors and to recommend a limited number of questions to measure the prevalence of those behaviors. In October 1989, the first draft of the YRBSS questionnaire was completed and was reviewed by representatives of the
education agency of each state, the District of Columbia,
four U.S. territories, and 16 local education agencies then
funded by CDC. Survey research specialists from NCHS
also provided comments and suggestions. A second version
of the YRBSS questionnaire was administered during spring
1990 to a national sample of students in grades 9–12 as well
as to samples of students in 25 states and nine large urban
school districts. In addition, the second version was sent to
the Questionnaire Design Research Laboratory at NCHS
for laboratory and field testing with high school students.
NCHS staff examined student responses to the questionnaire
and recommended ways to improve reliability and validity by
clarifying the wording of questions, setting recall periods, and
identifying response options.
In October 1990, a third version of the YRBSS questionnaire
was completed. The questionnaire was similar to that used
during spring 1990 but revised to take into account data
collected by CDC and state and local education agencies
during spring 1990, information from NCHS’s laboratory
and field tests, and input from YRBSS steering committee
members and representatives of each state and the 16 local
education agencies. It also included questions for measuring
national health objectives for 2000 (5). During spring 1991,
this questionnaire was used by 26 states and 11 large urban
school districts to conduct a YRBS and by CDC to conduct
a national YRBS.
In 1991, CDC determined that biennial surveys would be
sufficient to measure health-risk behaviors among students
because behavior changes typically occur gradually. Since 1991,
YRBSs have been conducted every odd year at the national,
state, territorial, and large urban school district levels.
Questionnaire Characteristics and
Revisions
All YRBSS questionnaires are self-administered, and students
record their responses on a computer-scannable questionnaire
booklet or answer sheet. Skip patterns* are not included in any
YRBSS questionnaire to help ensure that similar amounts of
time are required to complete the questionnaire, regardless of
each student’s health-risk behavior status. This technique also
prevents students from detecting on other answer sheets and
questionnaire booklets a pattern of blank responses that might
identify the specific health-risk behaviors of other students.
In each even-numbered year between 1991 and 1997, in
consultation with the sites (states, territories, and large urban
school districts) conducting a survey, CDC revised the YRBSS
questionnaire to be used in the subsequent cycle. These revisions
* Skip patterns occur when a particular response to one question indicates to the
respondents that they should not answer one or more subsequent questions.
reflected site and national priorities. For example, CDC added 10 questions to the 1993 questionnaire to measure the National Education Goal for safe, disciplined, and drug-free schools (21) and to address reporting requirements for the U.S. Department of Education's Safe and Drug-Free Schools Program (http://www2.ed.gov/about/offices/list/osdfs/index.html).
In 1997, CDC undertook an in-depth, systematic review of the YRBSS questionnaire. The review was motivated by several factors, including a goal for YRBSS to measure the Healthy People 2010 national health objectives, which were being developed at that time. The purpose of the review and the subsequent revision process was to ensure that the questionnaire would provide the most effective assessment of the most critical health-risk behaviors among youths. To guide the decision-making process, CDC solicited input from content experts from CDC and academia as well as from representatives from other federal agencies; state, territorial, and local education agencies and health departments; and national organizations, foundations, and institutes. On the basis of this input, CDC developed a proposed set of questionnaire revisions that were sent to all state, territorial, and local education agencies for further input. In addition to considering the amount of support from sites for the proposed revisions, CDC considered multiple factors in making final decisions about the questionnaire, including 1) input from the original reviewers, 2) whether the question measured a health-risk behavior practiced by youths, 3) whether data on the topic were available from other sources, 4) the relation of the behavior to the leading causes of morbidity and mortality among youths and adults, and 5) whether effective interventions existed that could be used to modify the behavior. As a result of this process, CDC revised the 1999 YRBSS questionnaire by adding 16 new questions, deleting 11 questions, and making substantial wording changes to other questions. For example, two questions that assessed self-reported height and weight were added in recognition of increasing concerns regarding obesity. As a result, YRBSS now generates national, state, territorial, tribal, and large urban school district estimates of BMI calculated from self-reported data.
The 2013 YRBSS questionnaire reflects minor changes that CDC has made to the questionnaire since 1999. During each even-numbered year since 1999, CDC has sought input from experts both inside and outside of CDC regarding which questions should be changed, added, or deleted. Proposed changes, additions, and deletions were then placed on a ballot sent to the YRBS coordinators at all sites, and each site voted for or against each proposed change, addition, and deletion. CDC considered the results of this balloting process when finalizing each questionnaire. Each cycle, CDC develops a standard questionnaire that sites can use as is or modify to meet their needs. The 2013 standard YRBSS questionnaire
includes five questions that assess demographic information;
23 questions related to unintentional injuries and violence; 10
on tobacco use; 18 on alcohol and other drug use; seven on
sexual behaviors; 16 on body weight and dietary behaviors,
including height and weight; five on physical activity; and
two on other health-related topics (i.e., asthma and sleep). The
2013 standard questionnaire and the rationale for the inclusion
of each question are available at http://www.cdc.gov/yrbss.
For the national YRBS, five to 11 additional questions are
added to the standard questionnaire each cycle. These questions
typically cover health-related topics that do not fit in the six
priority health-risk behavior categories (e.g., sun protection).
The 2013 national YRBS questionnaire also is available at
http://www.cdc.gov/yrbss.
Each cycle, CDC makes the standard questionnaire available
to sites as a computer-scannable booklet. In 2011, nine states,
one tribe, and six large urban school districts used the standard
questionnaire computer-scannable booklets. CDC sends sites
that wish to modify the standard questionnaire a print-ready
copy of their questionnaire and scannable answer sheets.
Sites can modify the standard questionnaire within certain
parameters: 1) two thirds of the questions from the standard
YRBSS questionnaire must remain unchanged; 2) additional
questions are limited to eight mutually exclusive response
options; and 3) skip patterns, grid formats, and fill-in-the-blank formats cannot be used. Furthermore, sites that modify
the standard YRBSS questionnaire and use the scannable
answer sheets must retain the height and weight questions
as questions six and seven and cannot have more than 99
questions. This numerical limit is set so the questionnaire can
be completed during a single class period by all students, even
those who might read slowly.
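The modification parameters above are mechanical enough to check automatically. The sketch below is a hypothetical validator (not a CDC tool; the data layout and function name are invented for illustration) that tests a proposed site questionnaire against the constraints just listed: at least two thirds of the standard questions unchanged, no more than eight response options per added question, height and weight retained as questions 6 and 7, and at most 99 questions overall.

```python
def validate_site_questionnaire(questions, standard_texts, max_questions=99):
    """Check a modified site questionnaire against the YRBSS parameters
    described above. `questions` is a list of dicts with keys 'text',
    'options', and optional 'topic'; `standard_texts` is the set of
    question texts on the standard questionnaire.
    Returns a list of violation messages (empty when compliant)."""
    problems = []
    if len(questions) > max_questions:
        problems.append(f"{len(questions)} questions exceeds the {max_questions}-question limit")
    # At least two thirds of the standard questions must remain unchanged.
    unchanged = sum(1 for q in questions if q["text"] in standard_texts)
    if 3 * unchanged < 2 * len(standard_texts):
        problems.append("fewer than two thirds of the standard questions retained unchanged")
    # Added (non-standard) questions may have at most 8 response options.
    for i, q in enumerate(questions, start=1):
        if q["text"] not in standard_texts and len(q["options"]) > 8:
            problems.append(f"question {i} has more than 8 response options")
    # Height and weight must remain questions 6 and 7 (1-indexed).
    topics = [q.get("topic") for q in questions]
    if len(topics) >= 7 and topics[5:7] != ["height", "weight"]:
        problems.append("height and weight are not questions 6 and 7")
    return problems
```

A site survey coordinator could run such a check before submitting a modified questionnaire, although in practice CDC performs this review itself.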
For sites that want to modify the standard questionnaire, CDC
also provides a list of optional questions for consideration. This
list has been available to sites since 1999 and is updated when the
standard YRBSS questionnaire is updated. It includes questions on
the current version of the national YRBS questionnaire; questions
that have been included in a previous national, state, territorial,
tribal, or large urban school district YRBS questionnaire; and
questions designed to address topics of key interest to CDC or
the sites. By using these optional questions, sites can obtain data
comparable to those from the national YRBS or from other sites
that use these questions and be assured they are adding questions
that already have been reviewed and approved by CDC. A site also
can choose to develop its own questions if none of the optional
questions addresses a topic that the site wants to measure. CDC
reviews site-developed questions to ensure that their complexity,
reading level, and formatting are appropriate for a YRBS. In 2011,
a total of 38 states, five territories, 16 large urban school districts,
and three tribes modified the standard questionnaire.
Questionnaire Reliability and Validity
CDC has conducted two test-retest reliability studie
national YRBS questionnaire, one in 1992 and one in
In the first study, the 1991 version of the questionnai
administered to a convenience sample of 1,679 stude
grades 7–12. The questionnaire was administered
occasions, 14 days apart (22). Approximately three fo
of the questions were rated as having a substantial o
reliability (kappa = 61%–100%), and no statistically s
differences were observed between the prevalence e
for the first and second times that the questionn
administered. The responses of students in grade 7 w
consistent than those of students in grades 9–12, indi
that the questionnaire is best suited for students in th
In the second study, the 1999 questionnaire was ad
to a convenience sample of 4,619 high school st
The questionnaire was administered on two occas
approximately 2 weeks apart (23). Approximately
five questions (22%) had significantly different pr
estimates for the first and second times that the ques
was administered. Ten questions (14%) had both
<61% and significantly different time-1 and time-2 pr
estimates, indicating that the reliability of these ques
questionable (23). These problematic questions were
or deleted from later versions of the questionnaire.
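The agreement statistic used in these reliability studies can be illustrated with a short computation. The sketch below uses hypothetical responses (not data from the studies) to compute Cohen's kappa for one question administered at two time points; values of 61%–100% correspond to the "substantial or higher" band cited above:

```python
from collections import Counter

def cohens_kappa(ratings1, ratings2):
    """Cohen's kappa for two paired sets of categorical responses
    (e.g., time-1 and time-2 answers to the same YRBS question)."""
    assert len(ratings1) == len(ratings2)
    n = len(ratings1)
    observed = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    c1, c2 = Counter(ratings1), Counter(ratings2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)  # chance agreement
    return (observed - expected) / (1 - expected)

# Hypothetical yes/no answers from 10 students at time 1 and time 2
t1 = ["y", "y", "n", "n", "y", "n", "y", "y", "n", "n"]
t2 = ["y", "y", "n", "y", "y", "n", "y", "n", "n", "n"]
kappa = cohens_kappa(t1, t2)  # 0.60 here, just below the 61% cutoff
```

With 8 of 10 answers matching and 50% agreement expected by chance, kappa is (0.8 − 0.5)/(1 − 0.5) = 0.6.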
No study has been conducted to assess the validity of all
self-reported behaviors that are included on the YRBSS
questionnaire. However, in 2003, CDC reviewed existing
empiric literature to assess cognitive and situational factors that
might affect the validity of adolescent self-reporting of behaviors
measured by the YRBSS questionnaire (24). In this review,
CDC determined that, although self-reports of these types of
behaviors are affected by both cognitive and situational factors,
these factors do not threaten the validity of self-reports of each
type of behavior equally. In addition, each type of behavior
differs in the extent to which its self-report can be validated by
an objective measure. For example, reports of tobacco use are
influenced by both cognitive and situational factors and can
be validated by biochemical measures (e.g., cotinine). Reports
of sexual behavior also can be influenced by both cognitive
and situational factors, but no standard exists to validate this
behavior. In contrast, reports of physical activity are influenced
substantially by cognitive factors and to a lesser extent by
situational ones. Such reports can be validated by means of
electronic monitors (e.g., heart rate monitors). Understanding
the differences in factors that compromise the validity of self-
reporting of different types of behavior can assist policymakers
in interpreting data and researchers in designing measures that
do not compromise validity.
MMWR / March 1, 2013 / Vol. 62 / No. 1
This content downloaded from
105.231.112.221 on Tue, 04 Dec 2018 04:40:42 UTC
All use subject to https://about.jstor.org/terms

Recommendations and Reports
In 2000, CDC conducted a study to assess the validity of
the two YRBS questions on self-reported height and weight
(25). In that study, 2,965 high school students completed
the 1999 version of the YRBSS questionnaire on two
occasions approximately 2 weeks apart. After completing
the questionnaire, the students were weighed and had their
height measured. Self-reported height, weight, and BMI
calculated from these values were substantially reliable, but on
average, students in the study overreported their height by 2.7
inches and underreported their weight by 3.5 pounds, which
indicates that YRBSS probably underestimates the prevalence
of overweight and obesity in adolescent populations.
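The direction of this bias follows from the BMI formula for English units (703 × weight in pounds / height in inches squared): overreported height and underreported weight both lower computed BMI. A minimal sketch using one hypothetical student and the average biases reported above:

```python
def bmi(weight_lb, height_in):
    """Body mass index from pounds and inches (703 conversion factor)."""
    return 703 * weight_lb / height_in ** 2

# Hypothetical student: measured values vs. self-report shifted by the
# average biases cited above (+2.7 inches height, -3.5 pounds weight)
measured = bmi(weight_lb=160, height_in=65)
reported = bmi(weight_lb=160 - 3.5, height_in=65 + 2.7)
# reported BMI is lower than measured BMI, so the prevalence of
# overweight and obesity is underestimated
```

For this student, measured BMI is about 26.6 but self-reported BMI is about 24.0, moving the student below the common overweight cutoff of 25.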
Operational Procedures
The national YRBS is conducted during February–May of
each odd-numbered year. All except a few sites also conduct
their survey during this period; certain sites conduct their
YRBS during the fall of odd-numbered years or during even-
numbered years. Separate samples and operational procedures
are used in the national survey and in the state, territorial,
tribal, and large urban school district surveys. The national
sample is not an aggregation of the state and large urban
school district surveys, and state or large urban school district
estimates cannot be obtained from the national survey.
In certain instances, a school is selected as part of the national
sample as well as a state or large urban school district sample.
Similarly, a school might be selected as part of both a state and a
large urban school district sample or a state and a tribal sample.
When a school is selected as part of two or more samples, the
field work is conducted only once to minimize the burden on
the school and eliminate duplication of efforts. The school’s
data then are incorporated into both datasets during data
processing. The coordination of these overlapping samples
is critical to the successful operation of YRBSS, and weekly
meetings are required to ensure that overlapping schools are
identified, responsibilities for recruitment and data collection
are documented, and methods for sharing data are agreed upon.
National Survey
Since 1990, the national school-based YRBS has been
conducted under contract with ICF Macro, Inc., an ICF
International Company. With CDC oversight, the contractor
is responsible for sample design and sample selection. After the
schools have been selected, the contractor also is responsible
for obtaining the appropriate state-, district-, and school-
level clearances to conduct the survey in those schools. The
contractor works with sampled schools to select classes,
schedule data collection, and obtain parental permission. In
addition, the contractor hires and trains data collectors who
follow a common protocol to administer the questionnaires
in the schools, coordinates data collection, weights the data,
and prepares the data for analysis.
State, Territorial, Tribal, and Large Urban
School District Surveys
Before 2003, CDC funded state and local education agencies
for HIV prevention or coordinated school health programs, and
sites could use a portion of these cooperative agreement funds
to conduct a YRBS. Since the 2003 cycle, separate cooperative
agreement funds have been made available to sites to conduct
a survey, and since 2008, both state education and state health
agencies have been eligible to apply for these funds. Each
state must determine which agency will take responsibility for
conducting its survey. In 2011, five state health departments
directly received separate YRBSS cooperative agreement funds,
and health departments in an additional eight states and one large
urban school district received funds from the education agency
to lead administration of their survey (Box 1). The remaining
surveys were conducted by education agencies. Since 2009, tribal
governments also have been eligible to apply for funds to conduct
a YRBS. Certain state and local education agencies conduct their
YRBS with the assistance of survey contractors. In 2011, a total
of 24 state education agencies and five local education agencies
hired contractors to assist with survey administration.
State, territorial, and local agencies and tribal governments
funded by CDC to conduct a YRBS do so among samples
of high school students. In addition, certain sites conduct
a separate survey among middle school students by using a
modified YRBSS questionnaire designed specifically for the
reading and comprehension skills of students in this age group.
In 2011, a total of 16 states, three territories, one tribe, and 13
large urban school districts conducted a middle school YRBS
(Box 1). In addition, in 2011, one state (Alaska) and two
large urban school districts (Memphis and San Bernardino)
conducted a YRBS among alternative school students.
Certain states coordinate their YRBS sample with samples
for other surveys (e.g., the Youth Tobacco Survey [YTS];
http://www.cdc.gov/tobacco/data_statistics/surveys/yts/index.
htm) to reduce the burden on schools and students and to
save resources. States use one of two methods of coordinated
sampling: multiple-school sampling and multiple-class
sampling. In multiple-school sampling, the number of schools
needed for one survey is multiplied by the number of surveys
being coordinated. This method produces nonoverlapping
samples of schools. The separate samples can be used during
the same or separate semesters, and schools can be assured
that they will be asked to participate in only one survey.
BOX 1. Sites that conducted a Youth Risk Behavior Survey (YRBS), 2011
State surveys
Alabama* Illinois Montana Rhode Island
Alaska Indiana Nebraska South Carolina§,¶
Arizona Iowa Nevada§,†† South Dakota
Arkansas Kansas New Hampshire Tennessee
California**,†† Kentucky§ New Jersey Texas
Colorado§ Louisiana New Mexico§ Utah**
Connecticut Maine§ New York Vermont
Delaware§ Maryland North Carolina§,¶ Virginia¶,**
Florida Massachusetts North Dakota§,¶ West Virginia§,¶
Georgia§,¶,** Michigan Ohio** Wisconsin
Hawaii§,¶ Mississippi§,¶ Oklahoma Wyoming§
Idaho Missouri¶,†† Pennsylvania¶,††
Territorial surveys
American Samoa
Guam§,¶
Marshall Islands
Northern Mariana Islands§
Palau§
Puerto Rico
Tribal government surveys
Cherokee Nation
Winnebago Tribe§
Large urban school district surveys
Baltimore, Maryland†† Duval County, Florida§ Orange County, Florida§
Boston, Massachusetts Houston, Texas§ Palm Beach County, Florida§
Broward County, Florida Los Angeles, California* Philadelphia, Pennsylvania*
Charlotte-Mecklenburg, North Carolina§ Memphis, Tennessee§ San Bernardino, California§
Chicago, Illinois§ Miami-Dade County, Florida§ San Diego, California
Dallas, Texas§ Milwaukee, Wisconsin§ San Francisco, California§
Detroit, Michigan New York City, New York Seattle, Washington
District of Columbia§
* Site used coordinated samples to conduct its YRBS and a Communities Putting Prevention to Work (CPPW) survey.
† Health department received funds from the education agency to administer YRBS.
§ Site conducted a middle school YRBS.
¶ Site used coordinated samples to conduct its YRBS and Youth Tobacco Survey (YTS).
** Health department received YRBS-specific funds directly from CDC.
†† Site did not obtain weighted survey data.
This approach is most useful in sites that have at least 50
high schools, in sites that administer the surveys in different
semesters, and in sites in which at least one of the surveys
has been considered controversial or has not been conducted
successfully. This method ensures that the success of one survey
does not depend on the success of the others. In multiple-class
sampling, multiple surveys are conducted simultaneously in
separate classes in the same sample of schools. The number
of classes needed for one survey is multiplied by the number
of surveys, and then the classes are assigned randomly to each
survey. This approach is useful in states with few high schools,
in states where each survey has been conducted successfully, and
in states where the coordinators of each survey are willing to
work together closely. Regardless of the type of coordination,
CDC and the sponsoring agencies work together to plan and
implement the coordination. In 2011, a total of 17 states, one
territory, and one tribe used coordinated samples to conduct
their YRBS and YTS. In addition, one state and two large
urban school districts coordinated their YRBS sample with a
CPPW sample (Box 1).
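The multiple-class method can be sketched as a single draw of classes followed by random assignment to surveys. The example below is a simplified illustration with hypothetical class labels; actual selection is done with PCSample, described later in this report:

```python
import random

def assign_classes(classes, surveys, per_survey, seed=0):
    """Randomly assign sampled classes to coordinated surveys.
    Total classes drawn = classes needed per survey x number of surveys,
    so the surveys never share a class."""
    needed = per_survey * len(surveys)
    rng = random.Random(seed)
    sampled = rng.sample(classes, needed)  # draw all classes at once
    rng.shuffle(sampled)
    return {s: sampled[i * per_survey:(i + 1) * per_survey]
            for i, s in enumerate(surveys)}

# Hypothetical school with 20 classes; two coordinated surveys,
# each needing two classes
assignment = assign_classes([f"class{i}" for i in range(20)],
                            ["YRBS", "YTS"], per_survey=2)
```

Because all classes are sampled without replacement before assignment, no class is asked to complete more than one survey.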
Technical Assistance
Technical assistance for state, territorial, and local agencies
and tribal governments is provided by both CDC and Westat,
which has served as CDC’s technical assistance contractor since
the inception of YRBSS. CDC staff include scientists and project
officers who oversee the cooperative agreement for the sites. In
addition, each site is assigned a survey operations specialist and
a statistician from Westat.
Each YRBS site has a survey coordinator who works for a
state, territorial, or local education or health agency or tribal
government. These coordinators have variable expertise and
experience in conducting surveys. To help ensure the quality
of YRBSS, since the first cycle, CDC has provided technical
assistance to the agencies conducting the surveys. This assistance
has become increasingly comprehensive. During the first cycle,
such assistance was limited and consisted primarily of answering
questions posed by site coordinators. Since that time, technical
assistance has been expanded to cover the entire survey process
and is a continual and proactive system. CDC and its technical
assistance contractor provide technical assistance on survey
planning, sample selection, questionnaire modification, survey
administration, obtaining parental permission, data processing,
weighting, report generation, and dissemination of results. Sites
are responsible for administering the surveys; the role of CDC
and its technical assistance contractor is to help ensure that survey
administration runs smoothly and yields sufficient response rates
and high-quality data.
Technical Assistance Tools
Westat has worked with CDC to develop tools for providing
technical assistance. These tools include instructional materials,
communication tools, and specialized software.
Instructional Materials
CDC publishes the Handbook for Conducting Youth Risk
Behavior Surveys (26), a comprehensive guide that is revised
each cycle on the basis of feedback from sites and questions
that arose during the preceding cycle. The 2013 version of the
Handbook contains 107 pages spanning eight chapters and
also includes nine appendices (Box 2). As a supplement to the
Handbook, in 2008 and 2010, CDC and Westat developed
two short instructional videos for sites. One video focuses on
scientifically selecting classes, and the other describes how to
prepare the data and documentation for processing. Each video
has step-by-step instructions for routine tasks. These videos use
animation to make the information engaging. The videos are
designed to provide an overview of these tasks at the beginning
of a cycle and to serve as a resource for survey coordinators to
use as they prepare to carry out each task.
Communication Tools
A monthly electronic newsletter is sent to all survey
coordinators via e-mail. Each one-page newsletter focuses
on a part of the survey process (e.g., sampling, questionnaire
modification, or follow-up). Topics are selected to coincide with
the typical survey timeline. The brief newsletters provide tips
or reminders designed to help sites conduct successful surveys.
The password-protected Survey Technical Assistance
Website, which CDC and Westat launched in 1999, is used
by survey coordinators to request materials (e.g., questionnaire
booklets and answer sheets), download references and
supporting documents (e.g., the Handbook and sample
parental permission forms), and check the status of their survey
(e.g., what processing steps already have been completed).
Survey coordinators also can use the website to access contact
information for CDC and technical assistance contractor staff
and to send e-mail messages requesting further assistance. During
the 2011 cycle, the website received 812 visits from survey
sites. The website also provides reports to support project
management. For example, CDC and Westat use the website
to track when questionnaires are received from each site and
to check the status of data processing.
Survey coordinators can access peer-to-peer technical
assistance through a YRBSS listserv that was established in
2009 by the South Carolina Department of Education. The
listserv has 79 members, including survey coordinators and
staff from CDC, Westat, and ICF Macro. CDC staff monitor
the listserv to provide clarifications when needed. On average,
15 messages are posted to the listserv each month. The most
common topics discussed are survey administration (e.g.,
techniques for obtaining parental permission), questionnaire
modification, dissemination of results, and the use of
incentives. Members also have used the listserv to connect at
meetings or with others in their region.
Specialized Software
To provide technical assistance with sample selection, in
1989, CDC and Westat developed PCSample, a specialized
software program that draws two-stage cluster samples of
schools and of classes within sampled schools for each site.
CDC and Westat use PCSample to select YRBS samples
efficiently. Schools are selected with probability proportional
to school enrollment size, and classes are selected randomly.
When PCSample was developed, no commercially available
software program existed for this purpose, and PCSample
remains the only example of this type of program. Although
PCSample was developed specifically for YRBSS, it can be
used for other school-based surveys (e.g., YTS, Global YTS,
and Global YRBS).
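PCSample's internals are not published here, but selecting schools with probability proportional to enrollment can be illustrated with a textbook systematic PPS draw. The frame below is hypothetical, and this sketch shows the general technique, not CDC's actual implementation:

```python
import random

def pps_systematic(schools, n, seed=0):
    """Systematic sampling with probability proportional to size (PPS).
    `schools` is a list of (name, enrollment) pairs; larger schools are
    more likely to be selected."""
    total = sum(size for _, size in schools)
    interval = total / n
    start = random.Random(seed).uniform(0, interval)
    picks, cumulative, i = [], 0, 0
    for name, size in schools:
        cumulative += size
        # select a school each time a selection point falls in its range
        while i < n and start + i * interval <= cumulative:
            picks.append(name)
            i += 1
    return picks

frame = [("A", 1200), ("B", 300), ("C", 900), ("D", 600)]  # hypothetical enrollments
sample = pps_systematic(frame, n=2)
```

Here school A, with four times the enrollment of school B, is four times as likely to be drawn, which is what makes a later equal-weight selection of classes approximately self-weighting.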
BOX 2. Handbook for Conducting Youth Risk Behavior Surveys
(YRBS) table of contents
Chapters
Planning your YRBS
Modifying questionnaires
Obtaining clearance
Sampling schools and classes
Obtaining parental permission
Administering surveys
Preparing data for analysis
Reporting survey results
Appendices
2013 YRBS questionnaires
Item rationale
Bibliography
Letters of support
Questions and answers
Clearance strategies
Selecting classes
Parental permission
Survey TA users’ guide
PCSample requires an updated sampling frame, which is a list
of schools in the site’s jurisdiction that also includes the number
of students in each school enrolled in grades 9–12. If the state,
territorial, or local agency or tribal government cannot provide
the sampling frame, the technical assistance contractor provides
the frame used previously or one created using the Common Core
of Data from the National Center for Education Statistics (27).
The survey coordinator then must update the sampling frame
by deleting closed schools, adding newly opened schools, and
providing updated enrollment numbers. PCSample also requires
information on sampling parameters (e.g., expected school and
student response rates, attendance rates, and desired sample size).
This information is provided by the survey coordinator, with
assistance from CDC and its technical assistance contractor, via
a sampling parameter worksheet. The sampling parameters are
used to balance the need to select a sample that is large enough
to generate precise estimates but small enough so that the site’s
resources are not overtaxed and schools and students are not
burdened unnecessarily.
PCSample generates two types of forms: school-level forms
for each school in the sample and a classroom-level form for
each school that is reproduced later for each sampled classroom
in that school. The school-level form contains unique random
numbers calculated by using a sampling interval based on
the size of the school and the desired sample size; the survey
coordinator uses these numbers to select classes randomly in
participating schools. The survey coordinator completes a
school-level form for each sampled school and a classroom-
level form for each sampled classroom. The information on
these forms provides a record of the sampling and survey
administration process and is used to weight the data.
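The selection numbers printed on the school-level form can be mimicked with a sampling interval: one random draw within each interval yields class numbers spread across the school's list of classes. This is a sketch of the general idea; the exact algorithm used on the form is an assumption:

```python
import random

def class_selection_numbers(num_classes, num_to_select, seed=0):
    """One random class number drawn within each sampling interval
    (interval = classes in school / classes to select), giving an
    equal-probability spread of classes across the school."""
    interval = num_classes / num_to_select
    rng = random.Random(seed)
    return [int(rng.uniform(0, interval) + i * interval) + 1
            for i in range(num_to_select)]

# Hypothetical school with 20 second-period classes, 4 to be selected:
# one class number falls in each block of five classes
nums = class_selection_numbers(num_classes=20, num_to_select=4)
```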
To help monitor site progress, the technical assistance
contractor provides each site with a Microsoft Excel (Microsoft
Corporation) tracking form for recording information on scheduling
and school and student participation. Before development of
the tracking form in 2011, CDC and Westat contacted sites
regularly during data collection to check whether schools had
been cleared and scheduled for survey administration. Certain
sites used paper tracking forms or created electronic forms to
help them monitor their progress, but this recordkeeping was
not done in a systematic or consistent fashion. The tracking
form now in use in all sites is a spreadsheet that contains a list
of the schools selected for the survey, along with columns for
documenting the date the school agrees to participate, the date set
for survey administration, the date the survey is confirmed
as completed, and student participation information. The
spreadsheet is programmed to calculate the school, student,
and overall response rates automatically as new information
is added. Sites are required to send the tracking form to the
technical assistance contractor regularly to aid troubleshooting
and technical assistance.
Technical Assistance Modes
In addition to the tools developed to help sites
successful surveys, a key part of technical assista
one-on-one guidance provided to sites. This individua
technical assistance is provided most commonly t
toll-free telephone number and e-mail. CDC and its te
assistance contractor also meet with survey coordina
person. Coordinators often attend national conferenc
other meetings during which appointments can be sc
with CDC or contractor staff also in attendance. Site v
by project officers also provide opportunities for prov
site-specific technical assistance. Every conversation
any personnel at a site and CDC or its technical assist
contractor, whether in person, through e-mail, or
telephone, is logged into the Survey Technical As
Website. This enables all technical assistance staff m
working with the site to see in real time what questio
been asked and what information has been shared. D
2011 cycle (July 2010–July 2012), 5,279 contacts wer
between sites and Westat or CDC. The number of con
site during this period ranged from three to 125 (med
Approximately 36% of these requests were of a gene
(e.g., how to obtain YRBSS-related materials), 22%
related to sampling, 16% to questionnaire admini
MMWR / March 1, 2013/ Vol. 62/ No. 110
This content downloaded from
105.231.112.221 on Tue, 04 Dec 2018 04:40:42 UTC56 UTC
All use subject to https://about.jstor.org/terms
Document Page
Recommendations and Reports
7% each to clearance and questionnaire modification, 6%
to weighting, 3% to reports, and 3% to other concerns (e.g.,
scanning and data processing).
A critical aspect of technical assistance is the 2-day in-person
training sessions that CDC and Westat have provided for sites
since 1992. These sessions are conducted during August of every
even-numbered year in preparation for YRBSS data collection in
the odd-numbered year. CDC invites survey coordinators and
contractors who either are new or are from sites that have not
conducted a YRBS successfully to attend the training. The content
of the training is based on the YRBS Handbook (26) and covers
all aspects of the survey process, including planning the survey,
modifying questionnaires, obtaining clearance, selecting schools
and classes, obtaining parental permission, administering surveys,
and preparing data for analysis. The training comprises both
lectures and hands-on skill-building activities and is designed by
persons with expertise in adult learning principles. In addition to
the Handbook, participants receive a training manual containing
practice exercises and supplemental resources to help them
conduct successful surveys. In addition, intensive, one-on-one
technical assistance meetings are available at the training for sites
that want to discuss detailed questionnaire or sampling issues.
In 2012, YRBS coordinators and contractors from 29 sites
participated in the training.
Sampling, Weighting, and
Response Rates
State, Territorial, Tribal, and Large Urban
School District Surveys
Each state, territorial, tribal, and large urban school district
YRBS employs a two-stage, cluster sample design to produce
a representative sample of students in grades 9–12 in its
jurisdiction. Samples are selected using PCSample. In 2011,
Ohio and South Dakota included both public and private
schools in their sampling frames; all other states included
only public schools. Each large urban school district sample
included only schools in the funded school district (e.g., San
Diego Unified School District) rather than in the entire area
(e.g., greater San Diego County). In the first sampling stage,
in all except a few sites, schools are selected with probability
proportional to school enrollment size. In the second sampling
stage, intact classes of a required subject or intact classes during
a required period (e.g., second period) are selected randomly. All
students in sampled classes are eligible to participate. In certain
sites, these procedures are modified to meet the individual needs
of the sites. For example, in a given site, all schools, rather than
a sample of schools, might be selected to participate.
Those surveys that have a sample selected according to
the protocol described above, appropriate documentation of
school and classroom selection, and an overall response rate
of ≥60% are weighted. These three criteria are used to ensure
that the data are representative of students in grades 9–12 in
that jurisdiction. The overall response rate is calculated by
multiplying the school response rate by the student response
rate. A weight is applied to each student record to adjust for
student nonresponse and the distribution of students by grade,
sex, and race/ethnicity in each jurisdiction. Therefore, weighted
estimates are representative of all students in grades 9–12 in
each jurisdiction.
Surveys that do not have an overall response rate of ≥60% and
appropriate documentation are not weighted. Unweighted data
represent only the students participating in the survey. Since
1991, both the number of participating sites and the number
and percentage of weighted sites have increased (Table 2). In
2011, a total of 43 states, five territories, 21 large urban school
districts, and two tribal governments had weighted data, whereas four
states and one large urban school district had unweighted
data (Box 1). In 2011, in sites with weighted data, student
sample sizes ranged from 1,147 to 13,201 for the state surveys
and from 1,013 to 11,570 for the large urban school district
surveys; territorial sample sizes were as small as 476
(median: 1,634). Student sample sizes were 91 and 1,480 for
the two tribal surveys. Among the state surveys, school response
rates ranged up to 100%, student response rates ranged from
60% to 88%, and overall response rates ranged from 60% to
84%. Among the large urban school district surveys, school
response rates ranged from 84% to 100%, student response
rates ranged up to 86%, and overall response rates started at
61%. Among the territorial surveys, school response rates
ranged from 97% to 100%, student response rates ranged up
to 85%, and overall response rates started at 75%. Among the
tribal surveys, school response rates were as high as 100%,
student response rates were 77% and 83%, and overall
response rates were 65% and 77%.
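The weighting criteria above reduce to a simple arithmetic check. A minimal sketch with hypothetical rates:

```python
def overall_response_rate(school_rate, student_rate):
    """Overall response rate = school response rate x student response rate."""
    return school_rate * student_rate

def eligible_for_weighting(school_rate, student_rate, documented=True):
    """A survey is weighted only with appropriate sampling documentation
    and an overall response rate of at least 60%."""
    return documented and overall_response_rate(school_rate, student_rate) >= 0.60

ok = eligible_for_weighting(0.80, 0.85)   # 68% overall -> weighted
bad = eligible_for_weighting(0.70, 0.80)  # 56% overall -> unweighted
```

Note that two individually respectable rates (70% of schools, 80% of students) can still fall below the 60% overall threshold, which is why both components matter.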
National Survey
The national YRBS uses a three-stage, cluster sample design
to obtain a nationally representative sample of U.S. students in
grades 9–12. The target population comprises all public and
private school students in grades 9–12 in the 50 states and
the District of Columbia. U.S. territories are not included in
the sampling frame. The national YRBS sample is designed
to produce estimates that are accurate within ±5% at a 95%
confidence level. Overall estimates as well as estimates by
sex, grade, race/ethnicity, grade by sex, and race/ethnicity
TABLE 2. Total number of participating states, large urban school districts, territories, and tribal governments funded and percentage
with weighted data — Youth Risk Behavior Surveillance System, United States, 1991–2011

        States          Large urban          Territories       Tribal              Total sites
                        school districts                       governments
Year    No.   % wtd     No.   % wtd          No.   % wtd       No.   % wtd         No.   % wtd
1991    26    35        11    64             2     50          0     0             39    45
1993    40    55        14    64             2     100         0     0             56    59
1995    39    56        17    71             5     60          0     0             61    61
1997    38    63        17    88             5     80          0     0             60    72
1999    41    54        17    82             4     50          0     0             62    61
2001    37    60        19    74             7     29          0     0             63    60
2003    43    74        22    91             5     80          0     0             70    80
2005    44    91        23    91             4     75          0     0             71    90
2007    44    89        22    100            5     100         0     0             71    93
2009    47    89        23    91             4     75          2     50            76    87
2011    47    92        22    96             5     100         2     100           76    93
by sex subgroups meet this standard. Estimates for grade by
race/ethnicity subgroups are accurate within ±5% at a 90%
confidence level.
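The ±5% precision target can be related to sample size with the standard formula for a proportion, n = z²p(1−p)/e², inflated by a design effect for cluster sampling. This is a textbook approximation for intuition only, not CDC's actual precision calculation:

```python
import math

def required_sample_size(margin, confidence_z=1.96, p=0.5, design_effect=1.0):
    """Approximate sample size for estimating a proportion within
    +/-margin at the confidence level implied by z, inflated by a
    design effect to account for cluster sampling."""
    n_srs = (confidence_z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n_srs * design_effect)

n = required_sample_size(0.05)  # ~385 under simple random sampling
```

Under simple random sampling, ±5% at 95% confidence needs only about 385 students; a cluster design with a design effect of 2 roughly doubles that, and the need for precise subgroup estimates (by sex, grade, and race/ethnicity) pushes the national sample into the tens of thousands.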
The first-stage sampling frame for each national survey
includes primary sampling units (PSUs) consisting of large-
sized counties or groups of smaller, adjacent counties. Since the
1999 sample, PSUs large enough to be selected with certainty
are divided into sub-PSU units. Schools then are sorted by size
and assigned in rotation to the newly created sub-PSU units.
PSUs are selected from 16 strata categorized according to the
metropolitan statistical area (MSA) status and the percentages
of black and Hispanic students in PSUs. PSUs are classified as
urban if they are in one of the 54 largest MSAs in the United
States; otherwise, they are considered rural. PSUs are selected
with probability proportional to the overall school enrollment size of the PSU.
In the second stage of sampling, schools are selected from
PSUs. A list of public and private schools in PSUs is obtained
from the Market Data Retrieval (MDR) database (29). This
database includes information, including enrollment figures,
from both public and private schools and the most recent data
from the Common Core of Data from the National Center
for Education Statistics (27). Schools with all four high school
grades (9–12) are considered “whole schools.” Schools with any
other set of grades are considered “fragment schools” and are
combined with other schools (whole or fragment) to form a
“cluster school” that includes all four grades. The cluster school
is treated as a single school during school selection. Schools are
divided into two groups on the basis of enrollment. Schools
with an estimated enrollment of ≥25 students for each grade are
considered large, and schools with an estimated enrollment of
<25 students for any grade are considered small. Approximately
one fourth of PSUs are selected for small-school sampling. For
each of these PSUs, one small school is drawn with probability
proportional to size, considering only small schools within the
PSU. Three large schools then are selected from all sampled
PSUs with probability proportional to school enrollment size.
(The largest MSAs referenced above are areas with a population
of ≥500,000 persons.)
To enable a separate analysis of data for black and Hispanic
students, CDC has used three strategies to achieve oversampling
of these students: 1) larger sampling rates are used to select
PSUs that are in high-black and high-Hispanic strata; 2) a
modified measure of size is used that increases the probability of
selecting schools that have a disproportionately high minority
enrollment; and 3) two classes per grade, rather than one, are
selected in schools with a high minority enrollment. All three
strategies were used in selecting the national samples through
2011. Because of decreases in the percentage of white students
in the U.S. population (30), for the 2013 sample, sufficient
numbers of black and Hispanic students were sampled by using
only the third strategy.
The final stage of sampling consists of randomly selecting one
or two entire classes in each chosen school and in each of grades
9–12. Examples of classes include homerooms or classes of a
required subject (e.g., English and social studies). All students
in sampled classes are eligible to participate. Since 1991, the
national YRBS has been conducted 11 times with an average
sample size of 14,517 and average school, student, and overall
response rates of 78%, 86%, and 71%, respectively (Table 3).
A weight based on student sex, race/ethnicity, and school
grade is applied to each record to adjust for student nonresponse
and oversampling of black and Hispanic students. To reduce
inflated sampling variances, statisticians trim and distribute
weights exceeding a criterion value among untrimmed weights
by using an iterative process (31). The final overall weights are
scaled so that the weighted count of students equals the total
sample size and the weighted proportions of students in each
grade match national population projections for each survey
year. Therefore, weighted estimates are representative of all
students in grades 9–12 who attend public and private schools
in the United States. For both the national YRBS and the
state, territorial, tribal, and large urban school district surveys,
sampled schools, classes, and students who refuse to participate
are not replaced. Sampling without replacement maintains the
integrity of the sample design and helps avoid the introduction
of unmeasurable bias into the sample.
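The trimming step described for the national weights can be sketched generically: cap extreme weights and redistribute the excess among the remaining weights, repeating until no weight exceeds the cap, so the total weight (and thus the weighted count of students) is preserved. The cap and data below are illustrative only; the actual criterion is described in reference 31:

```python
def trim_weights(weights, cap):
    """Iteratively trim weights above `cap`, redistributing the excess
    proportionally among the untrimmed weights so total weight is
    preserved. Assumes at least one weight is at or below the cap."""
    w = list(weights)
    while True:
        excess = sum(x - cap for x in w if x > cap)
        if excess <= 1e-9:
            return w
        untrimmed_total = sum(x for x in w if x <= cap)
        w = [cap if x > cap else x * (1 + excess / untrimmed_total) for x in w]

trimmed = trim_weights([1.0, 2.0, 3.0, 10.0], cap=5.0)
# total weight (16.0) is unchanged and no weight exceeds the cap
```

Preserving the weight total while capping extremes is what limits the variance inflation caused by a few very large weights.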
Data-Collection Protocols
Data-collection protocols are similar for national, state,
territorial, tribal, and large urban school district surveys. Local
procedures for obtaining parental permission are followed
before administering a YRBS in any school. Certain schools
use active permission, meaning that parents must send back to
the school a signed form indicating their approval before their
child can participate. Other schools use passive permission,
meaning that parents send back a signed form only if they do
not want their child to participate in the survey. In the 2011
state and large urban school district surveys, four (9%) of 47
participating states (Alaska, Hawaii, New Jersey, and Utah)
used statewide active permission procedures, and two (9%)
of 22 large urban school districts (Dallas and San Bernardino)
used district-wide active permission. Some schools within
other sites also used active permission. In the 2011 national
YRBS, 10% of schools used active permission, and 90% used
passive permission.
For the national survey and for the majority of state,
territorial, tribal, and large urban school district surveys,
trained data collectors travel to each participating school to
administer the questionnaires to students. These data collectors
read a standardized script to participating students. The script
includes an introduction to the survey. Data collectors also
record information about schools and classrooms (e.g., grade
level of classes sampled and number of students enrolled in
a sampled class). This information is used later in the survey
process to verify sample selection and to weight data.
In certain state, territorial, tribal, and large urban school
district surveys, the questionnaires are sent to the school, and
teachers of the selected classes administer the survey to their
class by using the standardized script. The school then sends the
completed questionnaires and accompanying documentation
forms to the agency conducting the survey.
Procedures for all the YRBSs are designed to protect
student privacy by allowing for anonymous and voluntary
participation. In all surveys, students complete the self-
administered questionnaire during one class period and record
their responses directly in a computer-scannable booklet
or on a computer-scannable answer sheet. To the extent
possible, students’ desks are spread throughout the classroom
to minimize the chance that students can see each other's responses. Students also are encouraged to use an extra sheet of paper or an envelope provided by the data collector to cover their responses as they complete the questionnaire. In the national survey, and in certain state, territorial, tribal, and large urban school district surveys, when students complete the questionnaire, they are asked to seal their questionnaire booklet or answer sheet in an envelope before placing it in a box or large envelope.
Students who are absent on the day of data collection can complete questionnaires if their privacy can be maintained. These make-up data-collection efforts sometimes are administered by the data collector; however, if the data collector cannot administer the questionnaire, school personnel can perform this task. Allowing students who were absent on the day of data collection to take the survey at a later date increases student response rates. In addition, because absent students, especially those who are absent without parental permission, are more likely to engage in health-risk behaviors than students who are not absent (32), make-up data-collection procedures help provide data representative of all high school students.

TABLE 3. Sample sizes and response rates for national Youth Risk Behavior Surveys — United States, 1991–2011

                        Response rates (%)
Year   Sample size*    School   Student   Overall†
1991   12,272          75       90        68
1993   16,296          78       90        70
1995   10,904          70       86        60
1997   16,262          79       87        69
1999   15,349          77       86        66
2001   13,601          75       83        63
2003   15,214          81       83        67
2005   13,917          78       86        67
2007   14,041          81       84        68
2009   16,410          81       88        71
2011   15,425          81       87        71
* Number of usable questionnaires after data-editing protocols were applied.
† Overall response rate = (no. of participating schools / no. of eligible sampled schools) × (no. of usable questionnaires / no. of eligible students sampled).
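The overall response rate reported in Table 3 is the product of the school-level and student-level rates, per the table footnote. A minimal sketch of that formula follows; the counts in the example are invented for illustration, not figures from the report.

```python
def overall_response_rate(participating_schools, eligible_schools,
                          usable_questionnaires, eligible_students):
    """Overall rate = (participating schools / eligible sampled schools)
                    x (usable questionnaires / eligible sampled students)."""
    return (participating_schools / eligible_schools) * \
           (usable_questionnaires / eligible_students)

# Hypothetical counts, not figures from the report:
rate = overall_response_rate(150, 195, 15425, 17800)
print(f"{rate:.0%}")  # → 67%
```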
Data-Processing Procedures
Data processing for state, territorial, tribal, and large urban school district surveys is a collaborative effort between CDC and its technical assistance contractor that provides a system of checks and balances. All except a few sites send completed questionnaires or answer sheets to the contractor, which scans them and constructs a raw electronic dataset. Certain sites scan their answer sheets and send the raw electronic dataset to the contractor. The contractor sends all raw datasets to CDC, which edits them to identify out-of-range responses, logical inconsistencies, and missing data. The data cleaning and editing process is performed by the Survey Data Management
System (SDMS), which CDC developed in 1999 to process
all YRBSS data and produce reports. Originally developed as
a stand-alone system, SDMS was transformed to a web-based
system in 2008 and performs its functions using Visual Basic
(33), SAS (34), and SUDAAN (35) programs. The processing
system accommodates questionnaires in which questions have
been deleted or added by the sites by first screening them to
note differences from the standard questionnaire and then
accounting for those differences during processing.
For the 2011 cycle, 179 logical edits were performed on each
standard questionnaire. Responses that conflict in logical terms
are both set to missing, and data are not imputed. For example,
if a student responds to one question that he or she has never
smoked but then responds to a subsequent question that he
or she has smoked two cigarettes during the previous 30 days,
the processing system sets both responses to missing. Neither
response is assumed to be the correct response. Questionnaires
with <20 valid responses remaining after editing are deleted
from the dataset. In 2011, the median number of completed
questionnaires in the state surveys that failed quality-control
checks and were excluded from analysis was 13 (range: 0–351),
and in the large urban school district surveys, the median was
also 13 (range: 0–231).
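A logical edit of the kind described above can be sketched as follows. The variable names and response codes here are illustrative, not actual YRBS variable definitions.

```python
MISSING = None  # conflicting responses are set to missing, never imputed

def edit_smoking_conflict(record):
    """If a respondent reports never having smoked but also reports
    smoking on one or more of the past 30 days, set both responses
    to missing; neither answer is assumed to be correct."""
    if record.get("ever_smoked") == "no" and (record.get("days_smoked_30") or 0) > 0:
        record["ever_smoked"] = MISSING
        record["days_smoked_30"] = MISSING
    return record

def keep_record(record, min_valid=20):
    """Questionnaires with <20 valid responses after editing are dropped."""
    return sum(v is not MISSING for v in record.values()) >= min_valid
```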
Additional data edits are applied to the height, weight,
and BMI variables to ensure that the results are biologically
plausible. These three variables are set to missing when an
observation lies outside logical limits developed by CDC’s
Division of Nutrition, Physical Activity, and Obesity (36).
In 2011, the median number of completed questionnaires in
the state surveys that had their height and weight data set to
missing because either these values or their resulting BMIs
were considered implausible for the student’s sex and age was
32 (range: 7–567). In the large urban school district surveys,
the median was 27 (range: 14–163).
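The plausibility screen might look like the following sketch. The numeric limits below are invented placeholders: the actual limits come from CDC's Division of Nutrition, Physical Activity, and Obesity (36) and vary by the student's sex and age.

```python
def screen_anthropometrics(height_m, weight_kg,
                           height_range=(1.2, 2.1),    # assumed limits
                           weight_range=(30.0, 180.0), # assumed limits
                           bmi_range=(12.0, 55.0)):    # assumed limits
    """Return (height, weight, bmi), or (None, None, None) when any value
    falls outside its plausible range; all three variables are set to
    missing together, as in the editing protocol described above."""
    bmi = weight_kg / (height_m ** 2)
    ok = (height_range[0] <= height_m <= height_range[1]
          and weight_range[0] <= weight_kg <= weight_range[1]
          and bmi_range[0] <= bmi <= bmi_range[1])
    return (height_m, weight_kg, bmi) if ok else (None, None, None)
```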
Edited data are sent to the technical assistance contractor for
weighting. If response rates are sufficient, documentation is
complete, and the site followed sampling protocols correctly,
the contractor weights the data according to the procedures
previously described in this report and sends the weights to
CDC, which merges the weights onto the edited data file.
CDC and Westat use file transfer and tracking functions built
into the Survey Technical Assistance Website to ensure that all
transfers are logged and reported.
Data processing for the national survey is similar to that
performed for the state, territorial, tribal, and large urban
school district surveys. The national survey contractor scans
all completed questionnaires from the national survey and
sends a SAS dataset to CDC. To maintain consistency with
the data processing used for the state, territorial, tribal, and
large urban school district surveys, CDC converts this dataset
to one that is then processed in SDMS just as all other site datasets are processed. The national dataset is treated as if it were a state, territorial, tribal, or large urban school district dataset; CDC edits the data by using the same procedures described previously. In 2011, a total of 78 questionnaires (0.5%) in the national survey failed quality-control checks and were excluded from analysis, and 182 questionnaires (1%) had their height and weight data set to missing because the height, weight, or resulting BMI was considered implausible for the student's sex and age. CDC then sends edited data to the national survey contractor, whose statisticians weight the data according to the procedures described previously and then send the weights to CDC, which merges the weights onto the data file.
Reports and Publications
Reports
CDC generates a report for the national survey and for each participating site. Before 2013, each report contained approximately 500 pages in a three-ring binder divided into two sections: survey results and survey documentation. The survey results section included a sample description; graphs and pictographs summarizing key results; tables and graphs that provided prevalence estimates for each question, including site-added questions; and a report that provided the results of trend analyses using logistic regression to test whether behaviors have changed over time. The tables provided estimates overall and by sex, race/ethnicity, grade, and age and included 95% confidence intervals for all sites with weighted data. To help ensure the reliability of the estimates and protect the privacy of respondents, subgroup results were not reported when a subgroup contained <100 students. The graphs provided estimates overall and by sex, grade, and race/ethnicity and were provided as PowerPoint (37) files to facilitate presentation of results. CDC used SDMS to generate these reports through customized Visual Basic (33) programs that ran SAS (34), SUDAAN (35), and Crystal Reports (38) to generate multiple output files (28,37,39,40). The survey documentation section of the report included a copy of the site's questionnaire, a data user's guide describing how data were edited and how each variable was calculated, a codebook for the edited data set, information on sampling and weighting, a Sample Statistics Report including the standard error and design effect for each variable, and additional resource documents such as "Understanding, Analyzing, and Presenting Your YRBS Data," "How to Interpret YRBS Trend Data," and "Software for Analyzing YRBS Data." In addition, each report contained a CD-ROM with an electronic copy of the site's data in multiple data file formats (e.g., SAS, SPSS [41], and ASCII) to permit
subsequent analyses and an electronic copy of all the material
described above.
Beginning with the 2013 cycle, CDC will use the SAS
Output Delivery System (34) instead of Crystal Reports
(38) as part of SDMS. To reduce environmental impact and
cost, all report materials described above will be provided in
electronic format only on a CD-ROM. Sites have been using
electronic files to share their data and results with others and
to post results on their websites, and they will continue to be
able to do so.
To ensure the timeliness of the data, CDC typically
completes site reports within 12 weeks of receipt of completed
questionnaires or answer sheets. Because surveys generally are
completed in the spring, most sites receive their reports during
the summer, so they can use their survey results to help plan
for the coming school year.
In 2008, CDC conducted interviews with sites to determine
how they use the CDC-generated report (42). According to
those interviews, many sites use or adapt the report by providing
copies to key persons (e.g., district and school administrators)
or by posting the results on their agency website. Other sites
use the data to create their own presentations or products such
as summary reports, brochures, or fact sheets. Such products
often combine YRBS results with those from other data sources
(e.g., vital statistics and educational policy information). Some
sites conduct further analyses of their YRBS data, make the
data set available for others to conduct secondary analyses, or
integrate the data into their own online data query systems.
The 2008 interviews also asked site coordinators how
they used their YRBS results in general (42). In addition to
documenting the prevalence of priority health-risk behaviors
among youths, YRBS data also are used to inform professional
development, plan and monitor programs, support health-
related policies and legislation, seek funding, and garner
support for future surveys.
Publications
During September 1991–April 1992, results from the
national YRBS conducted in 1990 were published for the first
time in 12 reports in the weekly MMWR (43–54). One of these
reports also included state and large urban school district data
(46). During June–December 1992, results from the national
YRBS conducted in 1991 were published for the first time in
five weekly MMWR reports (55–59) and in a Public Health
Reports supplement (60); results from the 1991 state and large
urban school district surveys were published for the first time
in four of the five weekly MMWR reports (56–59).
Beginning with the 1993 surveys, data from the national,
state, and large urban school district surveys have been
published together in MMWR Surveillance Summaries (61–70). Each MMWR Surveillance Summary includes an introduction and description of methods, followed by results for each behavior measured. National results are presented by race/ethnicity and by school grade separately for each sex. State and large urban school district results are presented by sex, and include the minimum, maximum, and median prevalence estimates for states and large urban school district sites separately. This format allows results from individual sites to be compared with those from other states or large urban school districts as well as with the national estimates. Since the 1999 cycle, information also has been provided on trends over time. Since 2003, only sites with weighted data have been included.
In addition to providing descriptive information on the prevalence of health-risk behaviors among youths, YRBSS also provides researchers with data for secondary analyses. These analyses have resulted in the publication of reports in both the weekly MMWR and peer-reviewed journals. During 1994–2012, approximately 90 articles by CDC authors using YRBSS data were published in peer-reviewed journals, and 23 reports using YRBSS data were published in the weekly MMWR. Although data from the national YRBS were used in the majority of these analyses, other analyses used data from a national household-based survey, the National College Health Risk Behavior Survey, the National Alternative High School Youth Risk Behavior Survey, and state-based surveys. A list of journal articles by CDC authors using YRBSS data is available at http://www.cdc.gov/healthyyouth/yrbs/ htm, and a list of the weekly MMWR reports is available at http://www.cdc.gov/healthyyouth/yrbs/publications.htm.
Researchers outside CDC also publish reports using YRBSS data. In May 2012, CDC reviewed the following databases: PubMed (National Library of Medicine, National Institutes of Health, Bethesda, Maryland, available at http://www.ncbi.nlm.nih.gov/PubMed), PsycINFO (American Psychological Association, District of Columbia, available at http://psycnet.apa.org), CAB Direct (available at http://cabdirect.org), ERIC (Education Resources Information Center, U.S. Department of Education, available at http://eric.ed.gov), and Web of Knowledge (Thomson Reuters, New York, New York, available at http://apps.webofknowledge.com). This review documented at least 148 scientific publications (articles, book chapters, and dissertations) written solely by non-CDC authors that have been published since a similar review was conducted. Publications based on studies in which researchers created their own questionnaires by using selected questions or groups of questions from the YRBSS questionnaire are not included in this figure. YRBSS data also are cited extensively in the media, including magazines, newspapers, radio, television, and Internet sites.
Website
CDC maintains a website that includes information on
YRBSS (http://www.cdc.gov/yrbss). The website includes links
to MMWR Surveillance Summaries, a map of participating
sites, questionnaires, a list of publications, and other resource
materials, such as those provided to sites as part of their report.
The site also includes 18 fact sheets containing national results.
Some fact sheets present results for the most current survey
year by sex and by race/ethnicity; others present trend results
from 1991 to the most current survey year for all students, by
race/ethnicity, and by topic (e.g., sexual behaviors). For site-
specific results, fact sheets on obesity-related behaviors, sexual
risk behaviors, and tobacco use also are included on the website
for all sites with weighted data. These fact sheets also include
results from School Health Profiles (71). For the 2011 cycle,
186 site-specific fact sheets were posted on the website. Most
of these fact sheets also include a Quick Response code that
users can scan with a smartphone or tablet computer to link
automatically to the YRBSS website.
The YRBSS website also includes a data widget, a small
customizable web program that can be put on any website to
display YRBSS results quickly and conveniently. The widget
program resides on CDC servers, so when CDC updates the
program, the widgets on other websites are updated automatically.
A cornerstone of the YRBSS website is Youth Online, a
user-friendly data-query application that allows users to view
detailed survey results by site, question, demographic variables,
and survey year. All weighted national, state, territorial, tribal,
and large urban school district results from 1991–2011 are
available for public use. Youth Online allows users to create
tables and graphs of YRBSS results, compare results from
different locations, and examine trends. The YRBSS website
also includes approximately 580 links to Youth Online results
for specific topics for specific sites. These links provide users
with easy access to Youth Online results and provide examples
of Youth Online capabilities. These links also increase efficiency
because they replace static PDF files that required considerable
formatting and production time.
The YRBSS website also includes data files and documentation
for all national surveys conducted since 1991. When these files
are downloaded, researchers can conduct their own analyses
of the national data. The site also includes information for
researchers who want to analyze state, territorial, tribal, and
large urban school district YRBSS data. Although certain sites
have given CDC permission to share their datasets directly,
other sites require that researchers contact them to request
data. Researchers who want state, territorial, tribal, or large
urban school district datasets can use an online request form to
request data or site contact information. In 2012, the YRBSS website received 498,277 visits, and Youth Online received 421,782 visits (CDC, unpublished data, 2012).
In 2012, CDC launched a web-based e-mail distribution list for YRBSS as part of CDC's "Get E-mail Updates" service. Users can subscribe to this CDC-administered list to receive bulletins with new information about YRBSS. These bulletins have included an announcement of the release of the MMWR Surveillance Summary and an announcement of the release of a weekly MMWR article reporting national YRBS trend data. At the end of 2012, this list had 74,068 subscribers (GovDelivery subscriber report, December 31, 2012).
Data Quality
From the inception of YRBSS, CDC has been committed to ensuring that the data are of the highest quality. Obtaining high-quality data begins with high-quality questionnaires. As described previously, the original questionnaire was subjected to laboratory and field testing, and CDC conducted reliability testing of the 1991 and 1999 versions of the questionnaire. In addition, two studies have been conducted to assess the effect of implementing changes to the questions that assess race and ethnicity (72,73). CDC made these changes to comply with new standards established by the Office of Management and Budget in 1997 (74). The first study tested the effect of changing the question from one in which students were required to select a single response to one that allowed them to select one or more responses. The second study tested the effect of changing from a single-question format that asked about race and ethnicity together to a two-question format that asked separate questions about race and ethnicity. Both studies indicated that the changes to the questions had only a minimal effect on reported race/ethnicity and that analyses that included white, black, and Hispanic subgroups were not affected (72,73).
Another aspect of data quality is the level of nonresponse to questions. For the 2011 national YRBS, nonresponse attributable to blank responses, invalid responses, out-of-range responses, and responses that did not meet edit criteria was lowest for the question that assesses the sex of the respondent and highest for the question that assesses the race of the respondent. For 91% of all questions, the nonresponse rate was <5%, and for 16% of all questions, the nonresponse rate was <1%.
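Summary statistics of this kind (the share of questions whose item nonresponse rate falls below a threshold) reduce to a one-liner; the per-question rates in the example are hypothetical.

```python
def share_of_questions_below(nonresponse_rates, threshold):
    """Fraction of questions whose item nonresponse rate is below threshold."""
    return sum(r < threshold for r in nonresponse_rates) / len(nonresponse_rates)

# Hypothetical per-question nonresponse rates:
rates = [0.004, 0.020, 0.030, 0.008, 0.120]
share_of_questions_below(rates, 0.05)  # 4 of 5 questions below 5%
```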
To further ensure data quality, survey administrators follow standardized procedures. To determine how using different procedures can affect survey results, CDC has conducted a series of methods studies. In the first study, conducted in
2002, CDC examined how prevalence estimates were affected
by varying honesty appeals,§ the wording of questions, and
data-editing protocols, while holding population, setting,
questionnaire context, and mode of administration constant
(75). The study indicated that different honesty appeals and
data-editing protocols did not have a statistically significant
effect on prevalence estimates. In addition, the study indicated
that, although differences in the wording of questions can
create statistically significant differences in certain prevalence
estimates, no particular type of wording consistently produced
higher or lower estimates.
In 2004, CDC conducted a study to determine how varying
the mode and setting of survey administration might affect
prevalence estimates. While previous research had examined the
effects of varying setting (school versus home) (76–79) and mode
(paper-and-pencil instrument [PAPI] versus computer-assisted
self-interview [CASI]) (80–83), this study was the first to assign
school classes randomly to one of four conditions in which mode
and setting were varied systematically: school-based administration
using PAPI, school-based administration using CASI, home-based
PAPI administration, and home-based CASI administration.
Results revealed that students completing questionnaires at school
were more likely to report health-risk behaviors than students
completing questionnaires at home, but that mode effects were
weaker: prevalence estimates for health-risk behaviors generally did
not differ for CASI and PAPI administrations, and these effects
were independent of setting (84). On the basis of these results,
CDC decided to continue with PAPI administration for the
YRBSS. Because the use of CASI did not increase the reporting
of risk behaviors, its increased cost and complicated logistics did
not appear to be justified.
In 2008, CDC conducted two additional studies to determine
the feasibility and effect of conducting YRBS as a web-based
survey. In the first study, classes in grades 9 and 10 were
assigned randomly to complete the YRBS in three conditions:
1) using PAPI in the classroom, 2) using web-based CASI
administration in school computer labs or in classrooms with
sufficient numbers of computers, or 3) using web-based CASI
that students could complete on their own (i.e., at any time at
any computer with Internet access). Results indicated that risk
behavior prevalence estimates generated from PAPI and CASI
administered in a classroom setting in schools generally were
equivalent (85). However, web-based CASI administration
yielded more missing data than PAPI administration, and web-
based “on their own” administration yielded an unacceptably
low response rate. In addition, perceived and actual privacy and
perceived anonymity were compromised when administering
in-class web-based questionnaires (86).
§ Honesty appeals, typically part of questionnaire introductions, ask respondents to be truthful when self-reporting behaviors.
In the second study, paper-and-pencil questionnaires were mailed to a nationally representative sample of public and private high school principals to assess computer availability in U.S. schools and to assess principals' perceptions of web-based student surveys. Although 64% of principals preferred web-based student surveys to those conducted via PAPI, only 30% said they would be more likely to agree to participate in such a survey if it were conducted online. Further, this study revealed that many schools do not have sufficient computer capacity to participate in an in-class web-based survey. As a result, web-based student surveys could create a significant burden on schools and lead to unacceptably low school participation rates (87). Taken together, the results of these studies informed CDC's decision not to convert the YRBS from PAPI to web-based administration so the quality of the system could be maintained (85–87).
Limitations
YRBSS is subject to at least five limitations. First, all YRBSS data are self-reported, and the extent of underreporting or overreporting of behaviors cannot be determined, although the studies described in this report demonstrate that the data are of acceptable quality. Second, the school-based national, state, territorial, tribal, and large urban school district surveys apply only to youths who attend school and therefore are not representative of all persons in this age group. Nationwide, in 2009, approximately 4% of persons aged 16–17 years were not enrolled in a high school program and had not completed high school (88). The NHIS-Youth Risk Behavior Supplement conducted in 1992 demonstrated that out-of-school youths are more likely than youths attending school to engage in the majority of health-risk behaviors (89). Third, local parental permission procedures are not consistent across schools and survey sites. However, in a 2004 study, CDC demonstrated that the type of parental permission typically does not affect prevalence estimates as long as student response rates remain high (90). Fourth, state-level data are not available for all states. Three states (Minnesota, Oregon, and Washington) do not participate, and in 2011, four states (California, Missouri, Nevada, and Pennsylvania) did not obtain weighted data. Finally, YRBSS addresses only those behaviors that contribute to the leading causes of morbidity and mortality among youths and adults. However, school and community interventions should focus not only on behaviors but also on the determinants of those behaviors.
Global Youth Risk Behavior Survey
CDC has applied many of the features of YRBSS successfully
to the Global Youth Risk Behavior Survey (G-YRBS), also
known as the Global School-Based Student Health Survey.
G-YRBS was developed by the World Health Organization
(WHO) and CDC in collaboration with UNICEF, UNESCO,
and UNAIDS. Since 2003, G-YRBS has provided data on
health behaviors and protective factors among students in
84 developing countries. G-YRBS is a school-based survey
conducted primarily among students aged 13–17 years.
Like YRBSS, G-YRBS uses a standardized scientific sample
selection process, common school-based methodology, and a
standard questionnaire. The G-YRBS standard questionnaire
addresses the leading causes of morbidity and mortality
among children and adults worldwide and includes 10 core
modules: 1) alcohol use; 2) dietary behaviors; 3) drug use;
4) hygiene; 5) mental health; 6) physical activity; 7) protective
factors, such as parental supervision; 8) sexual behaviors that
contribute to HIV infection, other STDs, and unintended
pregnancy; 9) tobacco use; and 10) violence and unintentional
injuries. A list of core-expanded questions also is available, and
countries participating in G-YRBS can modify the standard
questionnaire to meet country-specific needs just as state,
territorial, and local agencies and tribal governments can
modify the standard YRBSS questionnaire to meet their needs.
As CDC does for YRBSS sites, WHO and CDC provide
ongoing capacity building and technical support to countries
conducting a G-YRBS. Capacity building includes help with
sample design and selection using PCSample, training of survey
coordinators, provision of survey implementation handbooks,
provision and scanning of computer-scannable answer sheets,
data editing and weighting, and provision or facilitation of
funding and other resources. WHO and CDC also offer two
capacity-building workshops for participating countries. One
of these workshops, the Survey Implementation Workshop,
is similar to the YRBSS training that CDC provides to sites.
This 3-day workshop builds the capacity of survey coordinators
to implement the survey in their country following common
sampling and survey administration procedures that ensure that
the surveys are standardized and comparable across countries
and that the data are of the highest quality. For countries
that implement a G-YRBS successfully, CDC and WHO
also provide a second 4-day workshop, the Data Analysis and
Reporting Workshop, which trains survey coordinators to
conduct data analysis and generate a country-specific report
and fact sheet using Epi Info (91). Since 2001, persons from
124 countries have attended one or both of these workshops.
As with YRBSS sites, CDC also provides individualized
technical assistance and monitors site progress for countries
participating in G-YRBS. Once data are collected and the completed answer sheets are shipped, CDC provides the same services for G-YRBS as for YRBSS, including scanning of answer sheets, data processing, and report production through SDMS. As with YRBSS, CDC maintains a public-facing G-YRBS website (available at http://www.cdc.gov/gshs), which includes country-specific questionnaires, fact sheets, and public-use datasets available for additional analyses. A bibliography also is available at http://www.who.int/chp/gshs/GSHS_Bibliography.pdf.
School Health Profiles
Another CDC surveillance system that uses many of the features of YRBSS is School Health Profiles (Profiles), which provides biennial data on school health policies and practices in secondary schools in states, territories, large urban school districts, and tribal governments (71). Profiles uses two computer-scannable questionnaire booklets to collect data: one for school principals and one for lead health education teachers. These questionnaires are mailed to the schools. When revising these standard questionnaires for each cycle, CDC uses a voting process similar to that used for YRBSS. To draw representative samples of schools for Profiles, CDC and Westat use a specialized software program similar to PCSample. Profiles technical assistance is similar to that of YRBSS in that it includes a comprehensive handbook (92), monthly newsletters, and instructional materials, as well as the same Survey Technical Assistance Website used for YRBSS. As with YRBSS, CDC also provides individualized technical assistance to Profiles sites and monitors site progress using a standardized tracking form. In addition, as is done for YRBSS, CDC produces a comprehensive report containing data from all participating states, large urban school districts, territories, and tribes with weighted data (71) and fact sheets for each Profiles cycle, and maintains a public-facing website (available at http://www.cdc.gov/schoolhealthprofiles). The Profiles website includes questionnaires, publications, fact sheets, and a PowerPoint (37) presentation.
Future Directions
YRBSS is evolving constantly to meet the needs of CDC and other users of the data. The questionnaire is revised before each biennial cycle, and new survey populations periodically have been added to the system since its inception. In the future, additional substate sampling and analysis might be possible, similar to the Selected Metropolitan/Micropolitan Area Risk Trends that are part of the Behavioral Risk Factor Surveillance
System (93). Finally, although web-based administration is not
recommended for YRBSS at this time, CDC will continue to
monitor schools’ computer capacity as well as the development
of innovative and cost-effective methods that ensure students’
privacy; such advances could permit online administration of
the YRBS in the future.
References
1. Johnston LD, O’Malley PM, Bachman JG, Schulenberg JE. Monitoring
the Future national survey results on drug use, 1975–2011. Volume I:
secondary school students. Ann Arbor, MI: Institute for Social Research,
The University of Michigan; 2012.
2. American School Health Association, Association for the Advancement
of Health Education, Society for Public Health Education, Inc. The
National Adolescent Student Health Survey: a report on the health of
America’s youth. Oakland, CA: Third Party Publishing; 1989.
3. Kann L, Anderson JE, Holtzman D, et al. HIV-related knowledge,
beliefs, and behaviors among high school students in the United States:
results from a national survey. J Sch Health 1991;61:397–401.
4. CDC. Methodology of the Youth Risk Behavior Surveillance System.
MMWR 2004;53(No RR-12).
5. Public Health Service: Healthy people 2000: national health promotion
and disease prevention objectives—full report, with commentary.
Washington, DC: US Department of Health and Human Services,
Public Health Service; 1990; DHHS publication no. (PHS) 91-50212.
6. US Department of Health and Human Services. Healthy people 2010.
With understanding and improving health and objectives for improving
health. Washington, DC: US Department of Health and Human
Services; 2000.
7. US Department of Health and Human Services, Office of Disease
Prevention and Health Promotion. Healthy people 2020. Available at
http://www.healthypeople.gov. Accessed February 11, 2013.
8. CDC. FY 2012 online performance appendix. Available at http://www.
cdc.gov/fmo/topic/Performance/performance_docs/FY2012_CDC_
Online_Performance_Appendix.pdf. Accessed February 11, 2013.
9. Dryfoos JG. Adolescents at risk: prevalence and prevention. New York,
NY: Oxford University Press; 1990.
10. Adams PF, Schoenborn CA, Moss AJ, Warren CW, Kann L. Health-risk
behaviors among our nation’s youth: United States, 1992. Vital Health
Stat 1995;192:1–51.
11. CDC. Youth risk behavior surveillance: National College Health Risk
Behavior Survey—United States, 1995. MMWR 1997;46(No. SS-6).
12. Grunbaum J, Kann L, Kinchen SA, et al. Youth risk behavior surveillance:
National Alternative High School Youth Risk Behavior Survey, United
States, 1998. MMWR 1999;48(No. SS-7).
13. Brener ND, Kann L, Garcia D, et al. Youth Risk Behavior Surveillance—
selected Steps communities, 2005. MMWR 2007;56(No. SS-2):1–16.
14. National Center for Health Statistics. Advance report of final mortality
statistics, 1989. Hyattsville, MD: US Department of Health and Human
Services, Public Health Service, National Center for Health Statistics;
1992. DHHS publication no. PHS 92-1120.
15. CDC, National Center for Health Statistics. Mortality data file for 2008
with all state identifiers [CD-ROM]; 2011. Available at http://www.
cdc.gov/nchs/data_access/cmf.htm.
16. Hofferth SL. Teenage pregnancy and its resolution. In: Hofferth SL and
Hayes CD, eds. Risking the future: adolescent sexuality, pregnancy and
childbearing. Washington, DC: National Academy Press; 1987:78–92.
17. CDC. 1990 Division of STD/HIV prevention annual report, 1990. Atlanta,
GA: US Department of Health and Human Services, CDC; 1991.
18. CDC. Vital signs: teen pregnancy—United States, 1991–2009. MMWR
2011;60:414–20.
19. CDC. Tracking the hidden epidemics: trends in STDs in the United States. Atlanta, GA: US Department of Health and Human Services, CDC; 2000.
20. CDC. Sexually transmitted disease morbidity for selected STDs by age, race/ethnicity and gender, 1996–2009, CDC WONDER online database, June 2011. Available at http://wonder.cdc.gov/std-st…html. Accessed February 11, 2013.
21. US Department of Education, National Education Goals Panel. Goal 6: safe, disciplined, and drug-free schools. In: US Department of Education. Measuring progress toward the National Education Goals: potential indicators and measurement strategies. Washington, DC: US Department of Education, 1991; publication no. 91-01.
22. Brener ND, Collins JL, Kann L, Warren CW, Williams BI. Reliability of the Youth Risk Behavior Survey questionnaire. Am J Epidemiol 1995;141:575–80.
23. Brener ND, Kann L, McManus T, Kinchen SA, Sundberg EC, Ross JG. Reliability of the 1999 Youth Risk Behavior Survey questionnaire. J Adolesc Health 2002;31:336–42.
24. Brener ND, Billy JOG, Grady WR. Assessment of factors affecting the validity of self-reported health-risk behavior among adolescents: evidence from the scientific literature. J Adolesc Health 2003;33:436–57.
25. Brener ND, McManus T, Galuska DA, Lowry R, Wechsler H. Reliability and validity of self-reported height and weight among high school students. J Adolesc Health 2003;32:281–7.
26. CDC. 2013 Handbook for conducting Youth Risk Behavior Surveys. Atlanta, GA: US Department of Health and Human Services, CDC; 2013.
27. US Department of Education, National Center for Education Statistics. Common core of data, Public Elementary/Secondary School Universe Survey. Available at http://nces.ed.gov/ccd. Accessed February 11, 2013.
28. Microsoft Corporation. Microsoft Excel 2010. Redmond, WA: Microsoft Corporation; 2010.
29. Market Data Retrieval. National Education Database Master Extract. Shelton, CT: Market Data Retrieval, Inc.; 2010.
30. National Center for Education Statistics. Digest of Education Statistics. Available at http://nces.ed.gov/programs/digest. Accessed February 11, 2013.
31. Potter FJ. A study of procedures to identify and trim extreme sampling weights. In: American Statistical Association. Proceedings of the Section on Survey Research Methods of the American Statistical Association. Research Triangle Park, NC: American Statistical Association; 1990:225–30.
32. Eaton DK, Brener N, Kann LK. Associations of health risk behaviors with school absenteeism: does having permission for the absence make a difference? J Sch Health 2008;78:223–9.
33. Microsoft Corporation. Visual Studio 2008, professional edition. Redmond, WA: Microsoft Corporation; 2007.
34. SAS Institute, Inc. SAS, Version 9.2. Cary, NC: SAS Institute; 2008.
35. Research Triangle Institute. SUDAAN: software for the statistical analysis of correlated data, release 10. Research Triangle Park, NC: Research Triangle Institute; 2008.
36. CDC. 2011 YRBS data user's guide, 2012. Available at ftp://ftp.cdc.gov/pub/data/yrbs/2011/YRBS_2011_National_User_Guide.pdf. Accessed February 11, 2013.
37. Microsoft Corporation. PowerPoint 2010. Redmond, WA: Microsoft Corporation; 2010.
38. Business Objects Software, Ltd. Crystal Reports, Version 10.0. Dublin, Ireland: Business Objects Software, Ltd.; 2003.
39. Microsoft Corporation. Microsoft Word 2010. Redmond, WA: Microsoft Corporation; 2010.
40. Adobe Systems, Inc. Adobe Acrobat, version 9.5.0. San Jose, CA: Adobe Systems, Inc.; 2010.
41. SPSS, Inc. SPSS for Windows, Release 19.0.0. Chicago, IL: SPSS, Inc.; 2010.
42. Foti K, Balaji A, Shanklin S. Uses of Youth Risk Behavior Survey and School Health Profiles data: applications for improving adolescent and school health. J Sch Health 2011;81:345–54.
43. CDC. Participation of high school students in school physical education—United States, 1990. MMWR 1991;40:607, 613–5.
44. CDC. Tobacco use among high school students—United States, 1990.
MMWR 1991;40:617–9.
45. CDC. Attempted suicide among high school students—United States,
1990. MMWR 1991;40:633–5.
46. CDC. Current tobacco, alcohol, marijuana, and cocaine use among high
school students—United States, 1990. MMWR 1991;40:659–63.
47. CDC. Weapon-carrying among high school students—United States,
1990. MMWR 1991;40:681–4.
48. CDC. Body-weight perceptions and selected weight-management goals
and practices of high school students—United States, 1990. MMWR
1991;40:741, 747–50.
49. CDC. Alcohol and other drug use among high school students—United
States, 1990. MMWR 1991;40:776–7, 783–4.
50. CDC. Sexual behavior among high school students—United States,
1990. MMWR 1991;40:885–8.
51. CDC. Vigorous physical activity among high school students—United
States, 1990. MMWR 1992;41:33–5.
52. CDC. Physical fighting among high school students—United States,
1990. MMWR 1992;41:91–4.
53. CDC. Safety-belt use and helmet use among high school students—
United States, 1990. MMWR 1992;41:111–4.
54. CDC. Selected behaviors that increase risk for HIV infection among high
school students—United States, 1990. MMWR 1992;41:231, 237–40.
55. CDC. Selected tobacco-use behaviors and dietary patterns among high
school students—United States, 1991. MMWR 1992;41:417–21.
56. CDC. Participation in school physical education and selected dietary
patterns among high school students—United States, 1991. MMWR
1992;41:597–601, 607.
57. CDC. Tobacco, alcohol, and other drug use among high school
students—United States, 1991. MMWR 1992;41:698–703.
58. CDC. Behaviors related to unintentional and intentional injuries among high
school students—United States, 1991. MMWR 1992;41:760–5, 771–2.
59. CDC. Selected behaviors that increase risk for HIV infection, other
sexually transmitted diseases, and unintended pregnancy among high
school students—United States, 1991. MMWR 1992;41:945–50.
60. Kann L, Warren W, Collins JL, Ross J, Collins B, Kolbe LJ. Results from
the national school-based 1991 Youth Risk Behavior Survey and progress
toward achieving related health objectives for the nation. Public Health
Rep 1993;108(Suppl 1):47–55.
61. Kann L, Warren CW, Harris WA, et al. Youth risk behavior surveillance—
United States, 1993. MMWR 1995;44(No. SS-1).
62. Kann L, Warren CW, Harris WA, et al. Youth risk behavior surveillance—
United States, 1995. MMWR 1996;45(No. SS-4).
63. Kann L, Kinchen SA, Williams BI, et al. Youth risk behavior
surveillance—United States, 1997. MMWR 1998;47(No. SS-3).
64. Kann L, Kinchen SA, Williams BI, et al. Youth risk behavior
surveillance—United States, 1999. MMWR 2000;49(No. SS-5).
65. Grunbaum J, Kann L, Kinchen SA, et al. Youth risk behavior
surveillance—United States, 2001. MMWR 2002;51(No. SS-4).
66. Grunbaum J, Kann L, Kinchen SA, et al. Youth risk behavior
surveillance—United States, 2003. MMWR 2004;53(No. SS-2).
67. Eaton D, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—
United States, 2005. MMWR 2006;55(No. SS-5).
68. Eaton D, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—
United States, 2007. MMWR 2008;57(No. SS-4).
69. Eaton D, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—
United States, 2009. MMWR 2010;59(No. SS-5).
70. Eaton D, Kann L, Kinchen SA, et al. Youth risk behavior surveillance—United States, 2011. MMWR 2012;61(No. SS-4).
71. Brener ND, Demissie Z, Foti K, et al. School Health Profiles 2010:
characteristics of health programs among secondary schools. Atlanta,
GA: US Department of Health and Human Services, CDC; 2011.
72. Brener ND, Kann L, McManus T. A comparison of two survey questions
on race and ethnicity among high school students. Public Opinion
Quarterly 2003;67:227–36.
73. Eaton DK, Brener ND, Kann L, Pittman V. High school students' responses to different question formats assessing race/ethnicity. J Adolesc Health 2007;41:488–94.
74. Office of Management and Budget. Revisions to the standards for the classification of federal data on race and ethnicity. Federal Register 1997;62:58781–90.
75. Brener ND, Grunbaum JA, Kann L, McManus T, Ross J. Assessing health risk behaviors among adolescents: the effect of question wording and appeals for honesty. J Adolesc Health 2004;35:91–100.
76. Kann L, Brener ND, Warren CW, Collins JL, Giovino GA. An assessment of the effect of data collection setting on the prevalence of health risk behaviors among adolescents. J Adolesc Health 2002;31:327–35.
77. Gfroerer J, Wright D, Kopstein A. Prevalence of youth substance use: the impact of methodological differences between two national surveys. Drug Alcohol Depend 1997;47:19–30.
78. Rootman I, Smart RG. A comparison of alcohol, tobacco and drug use as determined from household and school surveys. Drug Alcohol Depend 1985;16:89–94.
79. Needle R, McCubbin H, Lorence J, Hochhauser M. Reliability and validity of adolescent self-reported drug use in a family-based study: a methodological report. International J Addictions 1983;18:901–12.
80. Turner CF, Ku L, Rogers SM, Lindberg LD, Pleck JH, Sonenstein FL. Adolescent sexual behavior, drug use, and violence: increased reporting with computer survey technology. Science 1998;280:867–73.
81. Wright DL, Aquilino WS, Supple AJ. A comparison of computer-assisted and paper-and-pencil self-administered questionnaires in a survey on smoking, alcohol, and drug use. Public Opinion Quarterly 1998;62:331–53.
82. Beebe TJ, Harrison PA, McCrae JA Jr, Anderson RE, Fulkerson JA. An evaluation of computer-assisted self-interviews in a school setting. Public Opinion Quarterly 1998;62:623–32.
83. Hallfors D, Khatapoush S, Kadushin C, Watson K, Saxe L. A comparison of paper vs computer-assisted self interview for school alcohol, tobacco, and other drug surveys. Evaluation and Program Planning 2000;23:149–55.
84. Brener ND, Eaton DK, Kann L, et al. The association of survey setting and mode with self-reported health risk behaviors among high school students. Public Opinion Quarterly 2006;70:354–74.
85. Eaton DK, Brener N, Kann L, et al. Comparison of paper-and-pencil and web administration of the Youth Risk Behavior Survey (YRBS): risk behavior prevalence estimates. Evaluation Review 2010;34:137–53.
86. Denniston MM, Brener N, Kann L, et al. Comparison of web-based versus paper-and-pencil administration of the Youth Risk Behavior Survey (YRBS): participation, data quality, and perceived privacy and anonymity by mode of data collection. Computers in Human Behavior 2010;26:1054–60.
87. Eaton DK, Brener ND, Kann L, et al. Computer availability and students' perception of online surveys. J Sch Health 2011;81:365–73.
88. Chapman C, Laird J, Ifill N, KewalRamani A. Trends in high school dropout and completion rates in the United States: 1972–2009 (NCES 2012–006). Available at http://nces.ed.gov/pubs2012/2012006.pdf. Accessed February 11, 2013.
89. CDC. Health risk behaviors among adolescents who do and do not attend school—United States, 1992. MMWR 1994;43:129–32.
90. Eaton DK, Lowry R, Brener ND, Grunbaum JA, Kann L. Passive versus active parental permission in school-based survey research: does the type of permission affect prevalence estimates of self-reported risk behaviors? Evaluation Review 2004;28:564–77.
91. CDC. Epi Info, Release 3.5.1 [software and documentation]. Atlanta, GA: US Department of Health and Human Services, CDC; 2008.
92. CDC. Handbook for developing School Health Profiles. Atlanta, GA: US Department of Health and Human Services, CDC; 2012.
93. CDC. SMART: selected metropolitan/micropolitan area risk trends. Available at http://apps.nccd.cdc.gov/brfss-smart/index.asp. Accessed February 11, 2013.