
2021 and 2023 National Youth Risk Behavior Surveys and 2021 ABES

OMB: 0920-0493



SUPPORTING STATEMENT FOR THE

2021 and 2023 NATIONAL YOUTH RISK BEHAVIOR SURVEY and 2021 ADOLESCENT BEHAVIORS AND EXPERIENCES SURVEY





PART B


















Submitted by:

Nancy Brener, PhD, Project Officer

Division of Adolescent and School Health

National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention
1600 Clifton Road, Mailstop US8-1

Atlanta, GA 30329-4027
404-718-8133 (voice); 404-718-8010 (fax)
[email protected]

Centers for Disease Control and Prevention
Department of Health and Human Services

October 20, 2020

TABLE OF CONTENTS



LIST OF ATTACHMENTS

  A. Authorizing Legislation

  B. 60-Day Federal Register Notice

  C. 60-Day Federal Register Notice Comment(s)

  D. Rationale for Survey Questions

  E. Expert Reviewers for the 1989 Consultations

  F. Report on the 2018 YRBS External Peer Review

  G. Data Collection Checklist

  H. Survey Administrator Script

H1. Survey Administrator Script YRBS 2021 – Booklet

H2. Survey Administrator Script YRBS 2023 – Tablet

H3. Survey Administrator/Student Video Script ABES - Web

H4. Survey Administrator/Student Video Script ABES - Web (Spanish Version)

  I. Parental Permission Forms and Supplemental Documents

I1. Parental Permission Form and Fact Sheet (English Version)

I2. Parental Permission Form and Fact Sheet (Spanish Version)

I3. Parental Permission Form Distribution Script

I4. Parental Permission Form Reminder Notice (English Version)

I5. Parental Permission Form Reminder Notice (Spanish Version)

I6. ABES Parental Permission Form and Fact Sheet (English Version)

I7. ABES Parental Permission Form and Fact Sheet (Spanish Version)

I8. ABES Parental Permission Form Distribution Script EDL

I9. ABES Parental Permission Form Distribution Script In Person

I10. ABES Parental Permission Form Reminder Notice EDL (English Version)

I11. ABES Parental Permission Form Reminder Notice In Person (English Version)

I12. ABES Parental Permission Form Reminder Notice EDL (Spanish Version)

I13. ABES Parental Permission Form Reminder Notice In Person (Spanish Version)

  J. IRB Approval Letters

  K. Questionnaires

K1. Youth Risk Behavior Survey Questionnaire

K2. Adolescent Behaviors and Experiences Survey Questionnaire

  L. Recruitment Scripts for the Youth Risk Behavior Survey

L1. State-level Recruitment Script for the Youth Risk Behavior Survey

L2. District-level Recruitment Script for the Youth Risk Behavior Survey

L3. School-level Recruitment Script for the Youth Risk Behavior Survey

L4. ABES State-level Recruitment Script

L5. ABES District-level Recruitment Script

L6. ABES School-level Recruitment Script

  M. Example Table Shells

  N. Sampling and Weighting Plan

  O. Data Collection Checklist Supplemental Documents

O1. Letter to Teachers in Participating Schools

O2. Make-up List and Instructions

O3. ABES Letter to Teachers in Participating Schools EDL

O4. ABES Letter to Teachers in Participating Schools In Person

  P. Letters of Invitation

P1. Letter of Invitation to States

P2. Letter of Invitation to School Districts

P3. Letter of Invitation to School Administrators

P4. YRBS Fact Sheet for Schools – Booklet

P5. YRBS Fact Sheet for Schools – Tablet

P6. Letter to Agreeing Schools

P7. ABES Letter of Invitation to States

P8. ABES Letter of Invitation to School Districts

P9. ABES Letter of Invitation to School Administrators

P10. ABES Fact Sheet for Schools

P11. ABES Letter to Agreeing Schools EDL

P12. ABES Letter to Agreeing Schools In Person

  Q. Privacy Impact Assessment (PIA)





B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS

The universe for both the YRBS and ABES will consist of all regular public and private school students in grades 9, 10, 11, and 12 in the 50 states and the District of Columbia.

Independent samples of schools will be drawn for the ABES and each cycle of the YRBS. For all samples, the sampling frame for schools combines data files obtained from Market Data Retrieval (MDR), Inc. and from the National Center for Education Statistics (NCES). The MDR file contains school information, including enrollments, grades, race distributions within the school, district, and county, and contact information for public and non-public schools across the nation. The NCES files include the Common Core of Data (CCD) for public schools and the Private School Survey (PSS) for non-public schools. When the data sources are combined to form the sampling frame, duplicates are eliminated so that each school is represented once on the final frame.

Table B-1 displays the current distribution of schools nationally by urban status and type of school.

Table B-1

Distribution of Schools Nationally by Urban Status and School Type

School Type: frequency (percent of all schools)

Urban Status    | Public          | Private        | Catholic     | Total
----------------|-----------------|----------------|--------------|-----------------
City-Large      | 2,835 (10.04%)  | 739 (2.62%)    | 236 (0.84%)  | 3,810 (13.49%)
City-Midsize    | 1,165 (4.12%)   | 333 (1.18%)    | 90 (0.32%)   | 1,588 (5.62%)
City-Small      | 1,400 (4.96%)   | 363 (1.29%)    | 116 (0.41%)  | 1,879 (6.65%)
Suburb-Large    | 4,676 (16.55%)  | 1,355 (4.80%)  | 279 (0.99%)  | 6,310 (22.34%)
Suburb-Midsize  | 666 (2.36%)     | 146 (0.52%)    | 16 (0.06%)   | 828 (2.93%)
Suburb-Small    | 409 (1.45%)     | 82 (0.29%)     | 21 (0.07%)   | 512 (1.81%)
Town-Fringe     | 732 (2.59%)     | 77 (0.27%)     | 15 (0.05%)   | 824 (2.92%)
Town-Distant    | 1,673 (5.92%)   | 166 (0.59%)    | 52 (0.18%)   | 1,891 (6.69%)
Town-Remote     | 1,148 (4.06%)   | 89 (0.32%)     | 33 (0.12%)   | 1,270 (4.50%)
Rural-Fringe    | 2,602 (9.21%)   | 587 (2.08%)    | 59 (0.21%)   | 3,248 (11.50%)
Rural-Distant   | 3,192 (11.30%)  | 238 (0.84%)    | 5 (0.02%)    | 3,435 (12.16%)
Rural-Remote    | 2,580 (9.13%)   | 70 (0.25%)     | 4 (0.01%)    | 2,654 (9.40%)
Total           | 23,078 (81.69%) | 4,245 (15.03%) | 926 (3.28%)  | 28,249 (100.00%)



Sampling or other respondent selection method used: Students will be selected using the procedures described in detail below. To briefly summarize, for the ABES and each YRBS cycle, a nationally representative sample of students will be selected using a three-stage stratified cluster sample. Primary Sampling Units (PSUs - counties, a portion of a county, or a group of counties) and Secondary Sampling Units (SSUs - schools) within sampled PSUs will be selected with probability proportional to size (PPS) selection methods. Within each selected school, one class in each grade will be selected to participate, except in high minority schools where two classes per grade will be selected. All students in selected classes are eligible to participate, with the exception of students who cannot complete the survey independently (e.g., for language or cognitive reasons). A Spanish translation of both questionnaires will be made available to any students who need it.
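The PPS selections used at the first two stages can be illustrated with a systematic PPS draw. The unit names and enrollment counts below are hypothetical, and certainty selections (units larger than the sampling interval) would be handled separately in a production design:

```python
import random

def pps_systematic_sample(units, sizes, n):
    """Select n units with probability proportional to size,
    using systematic sampling from a random start."""
    total = sum(sizes)
    interval = total / n
    start = random.uniform(0, interval)
    points = [start + k * interval for k in range(n)]
    chosen, cum, idx = [], 0.0, 0
    for unit, size in zip(units, sizes):
        cum += size
        # A unit absorbs every selection point falling in its range
        while idx < n and points[idx] <= cum:
            chosen.append(unit)
            idx += 1
    return chosen

# Hypothetical PSUs with student enrollments
psus = ["PSU-A", "PSU-B", "PSU-C", "PSU-D", "PSU-E"]
enroll = [12000, 45000, 8000, 30000, 5000]
print(pps_systematic_sample(psus, enroll, 2))
```

Each unit's selection probability is proportional to its enrollment, which is what makes the later within-school equal-size class samples approximately self-weighting.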

Expected response rates for the data collection: The average participation rates over the 16 prior cycles of YRBS are 77% for schools and 86% for students. Over the five most recent cycles, the average participation rates are 75% for schools and 84% for students. We consider the historical participation rates, with particular emphasis on the rates obtained during the five most recent cycles, to reflect the current picture of school and student participation in preparing the sample design for the 2021 and 2023 YRBS.

For the ABES, we expect lower-than-historical response rates due to COVID-19. We have assumed a 40% school response rate. In order to attain the desired student yield, the typical YRBS school sample size will be augmented for the ABES.

Actual response rates achieved during the last collection period: During the most recent cycle of the YRBS, conducted in 2019, the participation rates were 75% for schools and 80% for students.

Statistical justification for all sample sizes: The expected student sample size for the YRBS is approximately 19,144 students before nonresponse and the expected student sample size for the ABES is approximately 48,312 before nonresponse. These sample sizes are necessary to meet study precision requirements. The sample sizes are calculated by inflating the sample size that would be required under the assumptions of simple random sampling by historical design effects (to account for the complex sampling design) and participation rates to account for nonresponse at both the student and school levels. It is possible that the sample size for the fall 2021 YRBS might need to be increased if school response rates are still expected to be lower than historical values because of lingering effects of the COVID-19 pandemic.
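The inflation logic can be sketched as follows. The design effect of 2.0 and the worst-case proportion p = 0.5 are illustrative assumptions, not the study's actual parameters, and the production design applies such calculations per reporting domain rather than overall:

```python
import math

def required_sample(n_srs, deff, school_rr, student_rr):
    """Inflate the simple-random-sampling size by the design effect,
    then by the inverse of expected school and student response rates."""
    return math.ceil(n_srs * deff / (school_rr * student_rr))

# SRS size for a +/-5 percentage-point margin at 95% confidence,
# worst case p = 0.5:
n_srs = math.ceil(1.96**2 * 0.5 * 0.5 / 0.05**2)   # 385

# deff = 2.0 is an assumed historical design effect
print(required_sample(n_srs, deff=2.0, school_rr=0.75, student_rr=0.84))
```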

B.2 PROCEDURES FOR THE COLLECTION OF INFORMATION

Statistical Methodology for Stratification and Sample Selection

For the ABES and each YRBS cycle, a probability sample will be selected that will support national estimates among students in grades 9-12 overall and by grade, sex, and race/ethnicity (white, black, Hispanic). The design also will support sex-specific estimates by grade or race/ethnicity and racial/ethnic-specific estimates by grade. A detailed description of the YRBS sampling design may be found in Attachment N. The ABES sampling design is similar except that it has been augmented to account for an expected 40% school response rate.

Sampling Frame. For both the ABES and each YRBS cycle, the sampling frame will cover the 50 states and the District of Columbia and will be stratified by urbanicity and minority composition. The frame is structured into geographically defined Primary Sampling Units (PSUs), each defined as a county, a portion of a county, or a group of contiguous counties. The stratification by minority composition will divide the PSUs into eight groups based on the percentages of black and Hispanic students in the PSU. This is accomplished in two steps. First, each PSU is assigned to either the Hispanic stratum or the black stratum based on whether the percentage of Hispanic or black enrolled students in the PSU is higher. Each stratum is then subdivided into four strata depending on the percentage of black or Hispanic enrolled students, as appropriate, in the PSU. The eight racial/ethnic strata will each be further divided by urban status, defined as being inside one of the 54 largest Metropolitan Statistical Areas (MSAs) versus not. In addition, the first-stage PSU sample will be implicitly stratified by geography using 5-digit ZIP code areas.
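The two-step stratum assignment can be sketched as below. The density cut points are hypothetical placeholders; the actual boundaries are specified in the sampling plan (Attachment N):

```python
def first_stage_stratum(pct_black, pct_hispanic, in_top54_msa):
    """Assign a PSU to one of the 16 first-stage strata (sketch)."""
    # Step 1: Hispanic vs. black stratum, whichever group is larger
    if pct_hispanic >= pct_black:
        group, pct = "Hispanic", pct_hispanic
    else:
        group, pct = "Black", pct_black
    # Step 2: four density levels; these cut points are assumptions,
    # not the boundaries used in the production sampling plan
    cuts = [0.22, 0.44, 0.66]
    level = 1 + sum(pct >= c for c in cuts)
    # Final split by urban status (inside one of the 54 largest MSAs or not)
    urban = "MSA" if in_top54_msa else "non-MSA"
    return f"{group}-{level}-{urban}"

print(first_stage_stratum(0.10, 0.35, True))   # Hispanic-2-MSA
```

Eight racial/ethnic strata crossed with two urban statuses yields the 16 explicit first-stage strata.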

Selection of PSUs. For each YRBS cycle, 54 PSUs will be selected, and for the ABES, 81 PSUs will be selected. For all samples, PSUs will be selected with probability proportional to the student enrollment in the PSU within strata. The allocation of PSUs to the first-stage strata will be approximately in proportion to the total enrollment in the PSU. A proportional allocation tends to maximize the precision of overall survey estimates.

Selection of Secondary Sampling Units (SSUs). An SSU is composed of either a single school (if the school includes each of grades 9-12) or multiple schools “linked” together. Schools are linked when no single physical school includes all of grades 9-12, so that each school-based SSU provides coverage of all four grades. SSUs will be classified as large or small depending on whether they have 28 or more students per grade. For each YRBS cycle, in each selected PSU, at least three large SSUs (28 students or more per grade) will be selected with probability proportional to an aggregate enrollment measure, resulting in 162 selected SSUs (54 PSUs * 3 SSUs). In addition, from a sub-sample of 15 PSUs, one small SSU (fewer than 28 students per grade) will be randomly selected to represent students attending small schools. A total of 177 SSUs will be selected (162 large and 15 small). These 177 SSUs will include approximately 200 physical schools, to account for “linked” schools that are combined during sampling to provide the full span of the grades of interest. For the ABES, a total of 293 large SSUs and 27 small SSUs will be randomly selected.
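The SSU counts above can be verified with simple arithmetic:

```python
# YRBS per-cycle SSU counts as described above
yrbs_psus = 54
yrbs_large_ssus = yrbs_psus * 3          # 3 large SSUs per PSU
yrbs_small_ssus = 15                     # one per subsample PSU
yrbs_total_ssus = yrbs_large_ssus + yrbs_small_ssus

# ABES counts: 293 large plus 27 small SSUs
abes_total_ssus = 293 + 27

print(yrbs_large_ssus, yrbs_total_ssus, abes_total_ssus)  # 162 177 320
```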



Selection of Classes. For the ABES and each cycle of the YRBS, classes in each school are randomly selected based on two specific scientific parameters to ensure a nationally representative sample.  First, classes must be selected in such a way that all students in the desired grade(s) within the school have a chance to participate.  Second, all classes must be mutually exclusive so that no student is selected more than once.   In each school, once we have determined the type of class or time period from which classes will be selected, we randomly select the appropriate number of intact classes within each grade.  To maintain acceptable school participation rates, it is essential that each school have input into the decision regarding which classes will be sampled in their school.  Examples of class sampling frames that have been used in past surveys include a required subject course such as English or all 2nd period classes.  As long as the scientific sampling parameters are met, we work with each school to identify a classroom sampling frame that will work best for the school. For the fall 2021 YRBS, additional classes will be selected in each school to test the use of tablets to administer the questionnaire.

Selection of Students. For both the YRBS and the ABES, all students in a selected classroom are eligible to participate, with the exception of students who cannot complete the survey independently (e.g., for language or cognitive reasons). As stated above, for each YRBS cycle, we will draw a sample of 54 PSUs, with 3 large SSUs (“full” schools) selected from each PSU, for a total of 162 large SSUs. Based on historical averages, each PSU will supply a sample of 336 students across all of grades 9-12 before non-response (3 SSUs * 4 grades * 28 students per grade). The estimated sample yield from these large schools will be 18,144 students (162 SSUs * 4 grades * 28 students per grade) before school and student non-response. For the ABES, we will draw a sample of 81 PSUs, for a total of 293 large SSUs. The estimated sample yield from these large schools will be 45,937 students before school and student non-response.

To provide adequate coverage of students in small schools (those with enrollments of fewer than 28 students per grade), we also will select one small SSU in each of 15 subsample PSUs for each YRBS sample, adding 15 SSUs to the sample. Based on historical averages, small SSUs are expected to add 1,000 students before non-response in each YRBS sample. For the ABES, we will select one small SSU in each of 27 subsample PSUs, adding 27 SSUs to the sample. These small SSUs are expected to add 2,375 students before non-response to the ABES sample.
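The expected yields can likewise be checked against the totals given earlier:

```python
# Expected student yields before nonresponse, per the design above
yrbs_large_yield = 162 * 4 * 28              # SSUs x grades x students/grade
yrbs_total_yield = yrbs_large_yield + 1000   # plus small-school yield
abes_total_yield = 45937 + 2375              # large plus small schools

print(yrbs_large_yield, yrbs_total_yield, abes_total_yield)
# 18144 19144 48312
```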

Refusals. School districts, schools, and students who refuse to participate in the study, and students whose parents refuse to give permission, will not be replaced in the sample. Further, any schools that participate in the ABES during spring 2021 that are also drawn into the fall 2021 YRBS sample will not be re-approached for participation in the YRBS and will be considered automatic refusals. We will record the characteristics of schools that refuse for analysis of potential study biases. Accounting for school and student nonresponse, we expect approximately 12,067 participating students in each YRBS cycle and 15,460 participating students in the ABES.
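As a rough check, multiplying the expected yields by the planning response rates approximates the participating-student counts. The 84% YRBS and 80% ABES student rates used here are assumptions drawn from the historical figures cited earlier, and the exact design figures differ slightly from this back-of-envelope product:

```python
# Approximate participating-student counts after nonresponse
# (school rate x student rate; rates are planning assumptions)
yrbs_participants = 19144 * 0.75 * 0.84   # design figure: ~12,067
abes_participants = 48312 * 0.40 * 0.80   # design figure: ~15,460

print(round(yrbs_participants), round(abes_participants))
```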

Estimation and Justification of Sample Size

The YRBS and ABES are designed to produce estimates with error margins of ±5 percent:

  • 95 percent confidence for domains defined by grade, sex, or race/ethnicity;

  • 95 percent confidence for domains defined by crossing grade by sex, and race/ethnicity by sex; and

  • 90 percent confidence for domains formed by crossing grade with race/ethnicity.

During the design of the initial YRBS cycles, CDC’s contractor conducted a series of simulation studies that investigated the relationship of various weighting functions to the resulting numbers and percentages of minority students in the obtained samples. New simulation studies are performed periodically to determine opportunities for efficiency while maintaining the target yields across grade, sex, and race/ethnicity to meet the levels of precision required for CDC’s purposes. The 2021 and 2023 YRBS sample size and design will be consistent with the parameters developed for the 2017 and 2019 cycles, but the ABES sample size will be larger to account for anticipated lower-than-historical school response rates due to COVID-19. Minor design refinements are made to account for the changing demographics of the in-school population of students, primarily, the growing number of Hispanic students.

Estimation and Statistical Testing Procedures

Sample data will be weighted by the reciprocal of the probability of case selection and adjusted for non-response. The resulting weights will be trimmed to reduce mean-squared error. Next, the strata weights will be adjusted to reflect true relative enrollments. Finally, the data will be post-stratified to match national distributions of high school students by race/ethnicity and grade.
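The four weighting steps can be sketched as below. All selection probabilities, rates, caps, and population totals are hypothetical illustrations, not the study's values:

```python
def base_weight(p_psu, p_ssu, p_class):
    """Reciprocal of the overall selection probability."""
    return 1.0 / (p_psu * p_ssu * p_class)

def nonresponse_adjust(weight, response_rate):
    """Spread nonrespondents' weight over respondents in a cell."""
    return weight / response_rate

def trim(weight, cap):
    """Cap extreme weights to reduce mean-squared error."""
    return min(weight, cap)

def poststratify(weight, pop_total, weighted_total):
    """Scale weights so they sum to a known population total."""
    return weight * pop_total / weighted_total

w = base_weight(0.02, 0.10, 0.25)     # 2000.0
w = nonresponse_adjust(w, 0.80)       # 2500.0
w = trim(w, cap=2400.0)               # 2400.0
w = poststratify(w, pop_total=15_000_000, weighted_total=14_000_000)
print(round(w, 1))                    # 2571.4
```

In practice each step is carried out within adjustment cells (e.g., by stratum, race/ethnicity, and grade) rather than with single scalars as shown here.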

Variances will be computed using linearization methods. YRBS data are also used for trend analyses where data for successive cycles are compared using statistical testing techniques. Statistical testing methods are used also to compare subgroup prevalence rates (e.g., male versus female students) for each cross-sectional survey.

Confidence intervals vary from estimate to estimate depending upon whether the estimate is for the full population or for a subset such as a particular grade or sex. Within a grouping, confidence intervals also vary depending on the level of the estimate and the design effect associated with the measure. Based on prior YRBS cycles with similar designs and sample sizes, we can expect the following for both YRBS and ABES:

  • Estimates among students overall or by grade, sex, or race/ethnicity (white, black, Hispanic) will be accurate at ±5 percent at 95 percent confidence.

  • For racial/ethnic estimates by grade (e.g., 11th grade Hispanics), about 70% will be accurate to within ±5 percent at 90 percent confidence.1
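The half-width behind these precision statements follows the usual complex-design formula; the subgroup size and design effect below are hypothetical:

```python
import math

def ci_half_width(p, n, deff, z):
    """Half-width of a confidence interval for a proportion under a
    complex design: z * sqrt(deff * p*(1-p)/n)."""
    return z * math.sqrt(deff * p * (1 - p) / n)

# Hypothetical subgroup: p = 0.5 (worst case), n = 600, deff = 2.0
print(round(ci_half_width(0.5, 600, 2.0, 1.96), 3))    # 0.057 at 95%
print(round(ci_half_width(0.5, 600, 2.0, 1.645), 3))   # 0.047 at 90%
```

This illustrates why the smaller grade-by-race/ethnicity domains are held only to the looser 90 percent confidence standard.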

Experience with these data indicates that the levels of sampling error are appropriate given the uses of the data for descriptive reporting and trend analysis.

Use of Less Frequent Than Annual Data Collection

As stated in A.6, the YRBS originally was planned and twice approved by OMB as an annual survey. Based on experience, it was determined in 1992 that it would be sufficient to address the programmatic needs of CDC and other Federal agencies to conduct the YRBS biennially. By shifting from an annual to biennial survey starting with the 1993 YRBS, burden has been reduced by half. In 2021, CDC is still conducting the YRBS only once, but since that survey is being moved to the fall because of the COVID-19 pandemic, CDC is adding the ABES in the spring to ensure data are available to monitor student health behaviors and experiences during the pandemic.

Survey Instrument

The YRBS questionnaire (Attachment K1) contains 99 items, which can be roughly divided into seven categories. The first category includes five demographic questions. The remaining questions address health risk behaviors in six categories: unintentional injuries and violence; tobacco use; alcohol and other drug use; sexual behaviors that contribute to HIV infection, other sexually transmitted diseases, and unintended pregnancy; unhealthy dietary behaviors; and physical inactivity. Obesity (assessed by self-reported height and weight) and other health behaviors also are assessed. All questions are in a multiple-choice format. During the 2021 cycle, the YRBS questionnaire will be administered as a 12-page optically scannable questionnaire booklet; additional classes will be selected to test tablet-based, electronic administration of the questionnaire. Following the 2021 cycle, the YRBS will transition to electronic data collection to reduce burden and take advantage of advances in information technology. Beginning with the 2023 cycle, the YRBS will be a digitally based, self-administered questionnaire.

The ABES questionnaire (Attachment K2) is identical to the YRBS questionnaire with two exceptions. First, all YRBS questions that refer to behaviors or experiences that occurred at school, on school property, or on the way to or from school have been revised to include a response option that allows students to indicate that they have not attended school in-person during the timeframe specified in the question. Second, additional questions have been included to assess students’ behaviors and experiences specifically during the COVID-19 pandemic.

Data Collection Procedures

For the YRBS, data will be collected by a small staff of professional data collectors, specially trained to conduct the survey. The time during the school day in which the survey is administered varies by school.  This decision is made in coordination with each school to ensure that the type of class or period of the day selected for sampling 1) meets the scientific sampling parameters to ensure a nationally representative sample and 2) results in the least burden/highest possible acceptability for the school. The data collector will have direct responsibility for administering the survey to students. Data collectors will follow a survey administrator script (Attachments H1 and H2).

Teachers will be asked to remain at the front or back of the classroom and not to walk around the room monitoring the aisles during survey administration because doing so could affect honest responses and compromise anonymity. Teachers also will be asked to identify students allowed to participate in the survey and to make sure non-participating students have appropriate alternative activities. The rationale for this is to increase the candor and comfort level of students. The only direct responsibility of teachers in data collection is to distribute and follow up on parental permission forms sent out prior to the scheduled date for data collection in the school. Teachers are provided with a parental permission form distribution script (Attachment I3) to follow when distributing permission forms to students.

For the ABES, teachers of selected classes will be provided with written documentation of all processes they need to follow (Attachment O3 for exclusively distance learning [EDL] schools and Attachment O4 for in-person schools), and will also be provided a link to a brief, high-quality video that can be viewed securely online using an access ID. This video will convey the same information as the written documentation but provides another, more user-friendly means to communicate teachers’ roles and responsibilities, reinforce survey protocol, and provide an overview of key steps for successful survey implementation. Teachers are provided with a parental permission form distribution script/email template (Attachment I8 for EDL schools and I9 for in-person schools) to follow when distributing permission forms to students or parents. Teachers also will be provided with sign-in information for students. For students attending schools in-person at least part-time, teachers will distribute sign-in cards with student access IDs when the students are physically attending school. For students attending school in an EDL environment, teachers will provide a classroom-level sign-in to all students in a selected class via the school’s established teacher-student communication channels. Upon sign in, all records are associated with a unique student-level ID in the backend database, but none of these student access IDs can be traced to any individual student.

Using any internet-connected device, students will log in to the ABES via a secure URL using the sign-in information provided by the teacher. Students will be instructed to find a comfortable place where they can take the survey in private. After viewing a brief video similar to the Survey Administrator Script (Attachment H3, Student Video Script), students will complete the web-based questionnaire. The interface allows respondents to move easily through the questions. Should a connection be interrupted, respondents will be able to start, stop, and return to their survey on the last-viewed screen without data loss. To eliminate the possibility of students intentionally or unintentionally using the same sign-in ID twice, once a questionnaire is submitted by the student, the record cannot be accessed again.
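The one-time-submission rule can be sketched as below. The real backend and its data model are not described in this document, so all names here are assumed:

```python
class SurveySession:
    """Sketch of stop/resume saving with an irreversible final submit."""

    def __init__(self):
        self.responses = {}   # access_id -> saved answers
        self.submitted = set()

    def save(self, access_id, answers):
        """Autosave: allows students to stop and resume without data loss."""
        if access_id in self.submitted:
            # Once submitted, the record can never be reopened,
            # so a sign-in ID cannot be reused
            raise PermissionError("record already submitted")
        self.responses[access_id] = answers

    def submit(self, access_id):
        """Mark the record final; later access attempts are rejected."""
        self.submitted.add(access_id)

s = SurveySession()
s.save("AB1234", {"q1": "b"})
s.submit("AB1234")
try:
    s.save("AB1234", {"q1": "c"})
except PermissionError as e:
    print(e)   # record already submitted
```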

For both the YRBS and ABES, in general, data collection procedures have been designed to ensure that:

  • Protocol is followed in obtaining access to schools

  • Everyday school activity schedules are disrupted minimally

  • Administrative burden placed on teachers is minimal

  • Parents give informed permission to participate in the survey

  • Anonymity of student participation is maintained, with no punitive actions against nonparticipants

  • Alternative activities are provided for nonparticipants

  • Control over the quality of data is maintained

Student anonymity for the tablet version of the YRBS that will be tested in 2021 and rolled out in 2023 will continue to be maintained by the same processes implemented for the scannable booklets. At the start of survey administration, professionally trained YRBS data collectors will remind students that their responses will be captured anonymously (Attachment H1). Students will be instructed to hand their tablet to the data collector at the conclusion of survey administration. Following data collection at each school, student data will be uploaded to a central repository using secure protocols and erased from the tablets.

For both the YRBS and ABES, the Data Collection Checklist (Attachment G) is completed by teachers to track which students have received parental permission to participate in the data collection. The teachers receive instructions on completing the Data Collection Checklist in the “Letter to Teachers in Participating Schools” (Attachment O1 for YRBS, Attachment O3 for ABES EDL schools, and Attachment O4 for ABES in-person schools). For the YRBS, the data collector will utilize the information on the Data Collection Checklist to identify students eligible for a make-up survey administration; this information will be recorded by the data collector on the “Make-up List and Instructions” document (Attachment O2).



Obtaining Access to and Support from Schools

For both the YRBS and ABES, all initial letters of invitation will be on CDC letterhead from the Department of Health and Human Services and signed by Kathleen Ethier, PhD, Director, Division of Adolescent and School Health, National Center for HIV, Viral Hepatitis, STD, and TB Prevention, Centers for Disease Control and Prevention.  The procedures for gaining access to schools will have three major steps:

  • Notify state education agencies (SEAs) in states with sampled schools and inform states of their schools’ selection into the ABES sample or national YRBS sample. Obtain names of supportive school district contacts and general guidance on working with the selected school districts and schools in the state, and request state-level support for the survey prior to sending district invitations. State-level letters of support for the ABES will not be solicited because of time constraints.

  • Invite school districts in which selected schools are located to participate in the study. For Catholic schools and other private schools, invite the office comparable to the school district office (e.g., diocesan office of education). Obtain approval to invite sampled schools to participate. Verify existence of schools, grade ranges, and other information as needed. Request that the school district notify schools that they may anticipate being contacted about the survey. Request general guidance on working with the selected schools and district scheduling information.

  • Once cleared at the school district level, invite selected schools to participate. Verify information previously obtained about the school. Present the burden and benefits of participation in the survey. Obtain approval for participation at the school level. After a school agrees to participate, develop a customized plan for collection of data in the school (e.g., select classes and schedule survey date). Ensure that all pre-survey materials reach the school well in advance of when they are needed. Maintain contact with schools until all data collection activities have been completed.

Prior experience suggests the process of working with each state education agency, school district, and school will have unique features. Discussions with each education agency will recognize the organizational constraints and prevailing practices of the state. Scripts for use in guiding these discussions for the YRBS may be found in Attachments L1 (state-level), L2 (district-level), and L3 (school-level). Parallel scripts for the ABES may be found in Attachments L4, L5, and L6. Attachment P contains copies of letters of invitation to states (Attachment P1 for YRBS and P7 for ABES), school districts (Attachment P2 for YRBS and P8 for ABES), and school administrators (Attachment P3 for YRBS and P9 for ABES). Attachment P also contains the YRBS Fact Sheet for Schools (Attachments P4 and P5) and the ABES Fact Sheet for Schools (Attachment P10). The letters to be sent to schools once they have agreed to participate are found in Attachment P6 for YRBS, P11 for ABES EDL schools, and P12 for ABES in-person schools.

Informed Consent

The parental permission form and fact sheet (Attachments I1 and I2 for YRBS and Attachments I6 and I7 for ABES) inform both the student and the parent about an important activity in which the student has the opportunity to participate. By providing adequate information about the activity, it ensures that permission will be informed. The permission form indicates that a copy of the questionnaire will be available for review by parents at their child’s school. The parental permission forms will be made available in both English and Spanish.

For both surveys, a waiver of written student assent was obtained because this research presents no more than minimal risk to subjects and parental permission is required for participation. The waiver will not adversely affect the rights and welfare of the students, who are free to decline to take part. In addition, some students may perceive that their responses are not anonymous if they are required to provide stated assent and sign a consent/assent document. Students are told, “Participating in this survey is voluntary and your grade in this class will not be affected, whether you answer the questions or not.” Completion of the survey implies student assent.

Quality Control

Table B.2 lists the major means of quality control for both the paper and electronic data collection methodologies. As shown, the task of collecting quality data begins with a clear and explicit study protocol, regardless of mode.

For the paper questionnaire, data collection quality control includes visual inspection and scanning of collected booklets in order to produce a single data set for analysis. For the digitally-based questionnaire, data collection quality control begins with accurate programming of the YRBS and ABES questionnaires and concludes with the regular submission of data records to a secure central repository.

Because the ultimate aim is production of high-quality data sets and reports, various quality assurance activities will be applied during the data collection phase. Subsequent to YRBS data collector training, measures must be taken to reinforce training, to assist field staff who express/exhibit difficulties completing data collection activities, and to verify compliance with data collection protocols. Similarly, teachers administering the ABES will be provided with customized technical assistance and outreach to ensure all protocols typically followed by a data collector are followed by the teachers. Also, early inspection of a preliminary data set is necessary to ensure data integrity.

Table B.2 - Major Means of Quality Control

Survey Step

Quality Control Procedures

Mailing to Districts and Schools

  • Validate district and school sample to verify/update contact information of district/diocese/school leadership (100%)

  • Check inner vs. outer label for agreement in correspondence (5% sample)

  • Verify that any errors in packaging were not systematic (100%)

  • Determine if local approval processes require a formal research proposal (100% of districts)

  • Review all formal research applications and confirm they are in accordance with local requirements (100%)

Telephone Follow-up Contacts

  • Monitor early sample of calls to ensure that the recruiter follows procedures, elicits proper information, and has proper demeanor (10%)

  • Perform spot checks on recruiters’ class selection outcomes to confirm procedures were implemented according to protocol (10%)

Previsit Logistics Verification (YRBS only)

  • Review data collection procedures with school personnel in each school to ensure that all preparatory activities are performed properly in advance of data collector arrival (e.g., distribution of permission forms) (100%)

Data Collector Training and Supervision of School Visits (YRBS only)

  • Issue quizzes during data collector training to ensure that key concepts are understood (daily during training)

  • Conduct telephone monitoring of all field staff at least weekly throughout data collection (100% of field staff)

  • Reinforce training and clarify procedures through periodic conference calls with field staff (100% of field staff)

  • Verify by telephone with a 10% sample of schools that data collection procedures are being followed

Teacher Adherence to ABES Protocols (ABES only)

  • Supplement written materials with brief, engaging videos that convey similar information regarding process and protocols

  • Conduct proactive outreach to schools prior to the survey administration date to assess school readiness, clarify instructions, and provide additional materials as needed (100% of schools)

  • Provide a direct toll-free number to project staff and display it prominently on all communications to teachers (100% of communications)

  • Conduct systematic follow-up with schools and teachers post-survey to verify administration completion and encourage high student engagement (100% of schools and teachers)

Questionnaire Programming and Testing (Electronic)

  • Ensure that displayed text matches, verbatim, the analyst/programmer version of the questionnaire (100% of question and instructional text)

  • Create a “dummy” data set to verify that all entered responses are correctly captured in the data set as intended (minimum 50 records)

Computer Scanning (Paper)

  • Verify scanning program is operating correctly by comparing scanned values against bubbled-in responses; repeat until no issues are found (10 booklets)

Receipt Control (Electronic)

  • Verify that data from the field are synced no later than 48 hours after data collection concludes (100% of schools participating in tablet data collection)

  • Verify that the number of data records received in the database matches the number of expected records reported by field staff (100% of schools participating in tablet data collection)

  • Capture date/time stamps and staff credentials in the centralized system for all transactions (100%)

Receipt Control (Paper)

  • Cross-reference received data with expected data (100%)

  • Verify that a sample of forms received the prior day were logged in and are stored in the proper location (5%)

  • Require entry of staff ID in receipt control and all other transactions (100%)

  • Verify initial data editing by all editors until standards are achieved (100%)

  • Spot check editing by editor (5%)

  • Transcribe questionnaires that are not scannable (100%)

  • Remove any scannable forms that reflect intentional misuse by respondent (100%)

Data Review (Electronic)

  • During fielding, extract records from at least three schools to verify data set is capturing and storing records as expected (during first week of fielding, or after at least three schools’ data have been collected and synced)

Data Review (Paper)

  • Verify that all anticipated schools are represented in the data set and frequencies of records by school match reported student participation rates (100% of schools)

Merging of Scanned Paper and Electronic Data Sets

  • Read scanned file into data set format that matches structure of electronic data set (100% of variables)

  • Confirm frequencies by mode (100% of schools)

  • Verify that all anticipated schools are represented in the data set and frequencies of records by school match reported student participation rates (100% of schools)
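The per-school record checks in the receipt-control and data-review steps above (matching received records against the counts expected from field staff reports) can be sketched as follows. This is a minimal illustration only; the school IDs, counts, and function name are hypothetical, not the production system.

```python
# Sketch of a receipt-control check: verify that the number of records
# received per school matches the expected counts reported by field staff.
# School IDs and counts below are hypothetical.
def find_count_mismatches(received: dict, expected: dict) -> dict:
    """Return {school_id: (received, expected)} for every discrepancy,
    including schools missing entirely from the received data."""
    mismatches = {}
    for school, exp in expected.items():
        got = received.get(school, 0)  # a missing school counts as 0 records
        if got != exp:
            mismatches[school] = (got, exp)
    return mismatches

expected = {"S001": 120, "S002": 98, "S003": 110}
received = {"S001": 120, "S002": 95}
print(find_count_mismatches(received, expected))
# {'S002': (95, 98), 'S003': (0, 110)}
```

Any nonempty result would trigger follow-up with the affected school before the data set is finalized.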



B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE

Expected Response Rates

Historically for the YRBS, the overall response rate (the product of the school response rate and the student response rate) has ranged from 60% to 71%, with a 16-cycle average of 66%. For the purposes of the 2021 and 2023 YRBS sample design, we have conservatively assumed an overall response rate of 63%, the average over the five most recent survey cycles. The addition of a $500 token of appreciation for each participating YRBS school (as suggested by OMB in 1999) has helped maintain stable school participation rates, and the $250 token of appreciation for each participating ABES school should help in reaching the expected school participation rates. The participation rates achieved by the YRBS result from the application of established procedures for maximizing school and student participation and minimizing nonresponse, as described below. These procedures will also be applied to the ABES, unless otherwise noted.
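The overall response rate computation described above is a simple product, illustrated below with hypothetical component rates (the 75%/84% split is an assumption for the example; only the 63% planning figure comes from this document).

```python
# Illustrative sketch: the overall response rate is the product of the
# school response rate and the student response rate (hypothetical inputs).
def overall_response_rate(school_rate: float, student_rate: float) -> float:
    """Return the overall response rate as a proportion."""
    return school_rate * student_rate

# Example: a 75% school rate and an 84% student rate yield a 63% overall
# rate, matching the conservative planning assumption for 2021 and 2023.
print(round(overall_response_rate(0.75, 0.84), 2))  # 0.63
```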

Methods for Maximizing Responses

To increase the likelihood of an affirmative decision to participate, we will:

(1) work through the SEA and/or state health agency to communicate state-level support

(2) indicate that the survey is sponsored by CDC and has support of Federal and state agencies

(3) for the YRBS, solicit endorsement from and convey to school districts and schools that the survey has the endorsement of many key national education and health associations, such as the American Academy of Pediatrics, American Association of School Administrators, Association of State and Territorial Health Officials, Council of Chief State School Officers, National Association of Secondary School Principals, National Association of State Boards of Education, National Catholic Educational Association, National Education Association, National PTA, and the National School Boards Association

(4) maintain a toll-free hotline to answer questions from school district and school officials, teachers, parents, and students throughout the process of recruiting schools and obtaining parental permission for student participation

(5) comply with district requirements in preparing written proposals for survey clearance

(6) convey a willingness to appear in person, if needed, to present the survey before a school board, research committee, or other local entity tasked with reviewing the survey

(7) offer schools a monetary token of appreciation of $500 for YRBS ($250 for ABES), consistent with previous recommendations from OMB, and implemented on the national YRBS since 2001.

Once a school has agreed to participate, we will collaborate with each school to determine the class selection method that best fits its school environment and meets the scientific protocol, and to schedule a survey administration date that is convenient for its school calendar. As indicated in A.16, when collecting data during the spring semester, it is highly desirable to complete data collection before the final 1-2 months of school. At that point in the academic year, schools are typically focused on testing, and attendance can be unstable, particularly among twelfth-grade students. To further encourage participation among students in selected classes, we will recommend that schools help advertise the survey through the principal’s newsletter, PTA meetings, and other established means of parental communication.

Methods for Handling Non-Response

We distinguish among six potential types of nonresponse problems: refusal to participate by a selected school district, school, teacher, parent, or student; and collection of incomplete information from a student. To minimize refusals at all levels, from school district to student, we will use a variety of techniques, emphasizing the importance of the survey and the value of the data gathered in allowing the continued monitoring of factors that influence youth health. All participating districts and schools will be notified when the survey results are published and the data are available for download from CDC’s website, which districts and schools may use in supporting grant applications.

Dealing with refusals from parents, teachers, and students requires different strategies. Parental permission form reminders (Attachments I4 and I5 for YRBS, I10 and I12 for ABES EDL schools, and I11 and I13 for ABES in-person schools) will be sent to parents who have not returned parental permission forms within an agreed-upon time period (e.g., 3 days); those who do not respond to the reminder will be sent a second and final reminder. The permission form will provide a telephone number at CDC that parents may call to have questions answered before agreeing to give permission for their child’s participation. Permission forms will be available in English, Spanish, and other languages as required based on the dominant languages spoken by parents in selected schools. Project staff will be available to answer questions from parents who remain uncertain about granting permission.

Teacher refusals to cooperate with the study are not expected to become a cause for concern because school leadership will already have agreed to participate. Refusals by students who have parental permission to participate are expected to be minimal. No punitive action will be taken against a nonconsenting student. Nonconsenting students will not be replaced. Data will be analyzed to determine if student nonresponse introduces any biases.

To minimize the likelihood of missing values on the questionnaire, students will be reminded in the instrument instructions, and verbally by the survey administrator (for the YRBS), to review their answers before submitting their paper booklet or electronic questionnaire. For the ABES, students can review their responses before they submit their questionnaire: they can start back at the first question and proceed all the way through, or they can jump directly to unanswered questions. Validation at the submission screen presents students with a list of any unanswered questions, which they can then choose to answer or leave blank before submitting.
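The submission-screen check described above, listing any unanswered questions so a student can answer them or submit with them blank, can be sketched as follows. This is an assumption-laden illustration, not the actual ABES implementation; the question IDs and response structure are hypothetical.

```python
# Minimal sketch of a submission-screen validation: collect the IDs of any
# questions with no recorded answer so they can be presented to the student.
# Question IDs and the response structure are hypothetical.
def unanswered_questions(responses: dict) -> list:
    """Return IDs of questions with no recorded answer."""
    return [qid for qid, answer in responses.items() if answer is None]

responses = {"Q1": "A", "Q2": None, "Q3": "C", "Q4": None}
print(unanswered_questions(responses))  # ['Q2', 'Q4']
```

The student would then either answer the listed questions or confirm submission with them left blank; no answers are imputed.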

For the optically scannable questionnaire booklets used for the 2021 YRBS, students will be prompted by the written survey instructions and reminded verbally through the survey administrator script to review their answers before turning in their booklet to verify that: (1) each question they wished to answer has been answered, (2) only one oval is filled in for each question, with the exception of the question on race/ethnicity, and (3) each response has been entered with a No. 2 pencil, fills the oval, and is dark. A No. 2 pencil will be provided to each survey participant to reduce the likelihood that responses will not scan properly, which would produce missing values. In addition, when completed questionnaires are visually inspected later at project headquarters, any oval that is lightly filled in will be darkened (unless it appears to be an erasure) and stray marks will be erased before the forms are scanned. Missing values for an individual student on the survey will not be imputed.

For the electronic surveys, students will be prompted within the survey application instructions and reminded verbally through the survey administrator script (for YRBS questionnaires administered via tablet) to review their answers to verify that each question they wished to answer has been answered before submitting their responses electronically. Missing values for an individual student on the survey will not be imputed.

B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN

YRBS questionnaire items were originally tested by the NCHS laboratories. The 1993 special issue of Public Health Reports on the development of the Youth Risk Behavior Surveillance System describes the development and testing process. A limited pretest of the questionnaire on nine respondents was conducted by the contractor in November 1989 in the Prince George's County, Maryland, school system, in accord with OMB guidelines. The pretest was conducted to:

  • Quantify respondent burden

  • Test survey administrator instructions and procedures

  • Verify the overall feasibility of the survey approach

  • Identify needed changes in the instruments or instructions to control/reduce burden

The pretest sharpened the articulation of certain survey questions and produced an empirical estimate of the survey burden.

The YRBS questionnaire has been used extensively in 16 prior national school-based surveys approved by OMB, as well as at the state and local levels. Further pilot testing in accord with OMB guidelines has been performed on new and potential questions.

B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA

Under OMB's prior review of the YRBS (OMB No. 0920-0493, expiration 9/30/2019), a Notice of Action was issued requesting that the study undergo an external peer review before the next package was submitted for approval. To ensure the continued scientific rigor of the sample design, best practices for recruitment, and efficient strategies for maximizing participation rates, a panel of four experts in survey methodology, school-based data collection, and health surveys was convened in April 2018. The panel commented on the YRBS methodology and offered recommendations for improvement. Specifically, the topics of discussion were frame development and sampling design, maximizing participation, the transition to a mixed-mode methodology, and the YRBS strategy for addressing emerging topics. A summary of the panel's recommendations and CDC's concurrence with those recommendations can be found in Attachment F.

Statistical aspects of the study have been reviewed by the individuals listed below.

  • Ronaldo Iachan, PhD

ICF

530 Gaither Road, Suite 500

Rockville, Maryland 20850

Phone: (301) 572-0538

E-mail: [email protected]

  • Richard (Lee) Harding, MS

ICF

530 Gaither Road, Suite 500

Rockville, Maryland 20850

Phone: (301) 572-0524

E-mail: [email protected]



Within the agency, the following individual will be responsible for receiving and approving contract deliverables and will have primary responsibility for data analysis:

  • Nancy Brener, PhD
    Team Lead, Survey Operations Team
    Division of Adolescent and School Health
    Centers for Disease Control and Prevention

1600 Clifton Road, NE

Mailstop US8-1

Atlanta, GA 30329

Phone: 404-718-8133
Email: [email protected]



The representative of the contractor responsible for conducting the planned data collection is:

  • Alice Roberts, MS

Project Director

ICF

530 Gaither Road, Suite 500

Rockville, Maryland 20850

Phone: (301) 572-0290

E-mail: [email protected]

1 Based on empirical results and simulations.
