NATIONAL YOUTH TOBACCO SURVEY
2018-2020
OMB No. 0920-0621, expires 01/31/2021
Revision
SUPPORTING STATEMENT: PART B
04/25/2018
Submitted by:
Ahmed Jamal, MBBS, MPH
Centers for Disease Control and Prevention
Office on Smoking and Health
Epidemiology Branch
4770 Buford Highway NE, MS-F79
Atlanta, GA 30341
Phone: 770-488-5077
Fax: 770-488-5848
E-mail: [email protected]
Centers for Disease Control and Prevention
Department of Health and Human Services
TABLE OF CONTENTS
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1. Respondent Universe and Sampling Methods
B.2. Procedures for the Collection of Information
B.3. Methods to Maximize Response Rates and Deal with Nonresponse
B.4. Tests of Procedures or Methods to be Undertaken
B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
REFERENCES
LIST OF ATTACHMENTS
Authorizing Legislation
B1. 60-Day Federal Register Notice
B2. Public Comment on the 60-Day Federal Register Notice
C. State Tobacco Control Reports that Cite National Youth Tobacco Survey Data
D. Publications from Prior Cycles of National Youth Tobacco Survey
E1. State-level Recruitment Scripts for the National Youth Tobacco Survey
E2. State-level Recruitment Script for the National Youth Tobacco Survey Supplemental Documents - State Letter of Invitation
F1. District-level Recruitment Scripts for the National Youth Tobacco Survey
F2. District-level Recruitment Script for the National Youth Tobacco Survey Supplemental Documents - District Letter of Invitation
G1. School-level Recruitment Scripts for the National Youth Tobacco Survey
G2. School-level Recruitment Script for the National Youth Tobacco Survey Supplemental Documents - School Letter of Invitation and NYTS Fact Sheet for Schools
G3. School-level Recruitment Script for the National Youth Tobacco Survey Supplemental Documents - Letter to Agreeing Schools
H1. Data Collection Checklist for the National Youth Tobacco Survey
H2. Data Collection Checklist for the National Youth Tobacco Survey Supplemental Documents - Letter to Teachers in Participating Schools
I1. National Youth Tobacco Survey Questionnaire
I2. National Youth Tobacco Survey Questionnaire Supplemental Documents - Parental Permission Form Distribution Script
I3. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form and Fact Sheet (English Version)
I4. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form and Fact Sheet (Spanish Version)
I5. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form Reminder Notice (English Version)
I6. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form Reminder Notice (Spanish Version)
I7. National Youth Tobacco Survey Questionnaire Supplemental Documents - Questionnaire Administration Script
I8. Summary of Changes in the NYTS Questionnaire from 2017 to 2018
J. IRB Approval Letter
K. Sample Table Shells
L. Detailed Sampling and Weighting Plan
M. National Youth Tobacco Survey Non-Response Analysis Report, 2017
N. Cognitive Testing Results, 2017
O. External Peer Review of the National Youth Tobacco Survey Sampling Methodology, October 31, 2017
P1. Youth Cognitive Interview Guide – Understanding Use of Electronic Vapor Devices (EVDS) among Youth
P2. Cognitive Interview Assent Form, Individuals under 18 years of age - Understanding Use of Electronic Vapor Devices (EVDS) among Youth.
P3. Cognitive Interview Consent Form, Individuals 18 years of age - Understanding Use of Electronic Vapor Devices (EVDS) among Youth.
P4. Cognitive Testing: Parental Permission for minor child to be in a cognitive interview on understanding use of electronic vapor devices among youth
P5. Cognitive Testing Youth Materials – Electronic Nicotine Delivery Systems Key Facts
P6. Cognitive Testing Youth Materials – Tools for Quitting
P7. Cognitive Testing: Participant Screener – Understanding Use of Electronic Vapor Devices (EVDS) among Youth.
LIST OF TABLES
Table A.4 Characteristics of selected surveys of school-going youth or inclusive of school-aged youth, United States
Table A.12.a Estimated Annualized Burden Hours
Table A.12.b Annualized Estimated Cost to Respondents
Table A.14 Estimated Annualized Study Cost
Table A.15 Annualized Estimates for the 2018 NYTS, with changes since previous OMB approval
Table B.1 Distribution of Schools by Urban Status and School Type
Table B.2.a Quality Assurance Measures Before, During, and After Data Collection
Table B.2.b Major Means of Quality Control
Table B.3 Historical NYTS Participation Rates
LIST OF FIGURES
Figure A.4.a Relationship between NYTS and Other Surveys that are Inclusive of School-Aged Youth (e.g., PATH and NSDUH)
Figure A.4.b Relationship between NYTS and Other School-Based Surveys (e.g., YRBSS and MTF)
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
Survey methods remain the same as in past cycles of NYTS administration. Although the NYTS will be conducted with scannable questionnaires in 2018, CDC is committed to reducing burden and improving information gathering via advances in information technology. This study will employ a repeat cross-sectional design to develop national estimates of tobacco use behaviors and exposure to pro- and anti-tobacco influences among students enrolled in grades 6-12.
As presented in this supporting justification, every effort has been made to maintain the methodology established in prior cycles of the NYTS (1999, 2000, 2002, 2004, 2006, 2009, 2011-2017) to permit comparability across cycles. Data are reported at the national level only; no school district or regional estimates will be produced.
The universe for the study will consist of students in 6th through 12th grade that attend public and private schools in the 50 U.S. States and the District of Columbia. Private schools will include both religious and non-religious schools.
The sampling frame for schools has been obtained from Market Data Retrieval (MDR) (formerly known as Quality Education Data, Inc., or QED). It has been augmented by combining it with the frames maintained by the National Center for Education Statistics (NCES). School-level data on enrollment by grade and minority race/ethnicity are available in the NCES data set.
Table B.1 displays the current U.S. distribution of eligible schools by urban status and type of school. This tabulation was computed over a frame of eligible schools with middle school and/or high school grades prepared using the latest MDR files that are the basis for the sampling frame (see footnote 1).
Table B.1 – Distribution of Schools by Urban Status and School Type
School Type | Rural | Urban | Total
Non-public | 4,264 | 6,605 | 10,869
Public | 29,526 | 21,896 | 51,422
Total | 33,790 | 28,501 | 62,291
A national probability sample will be selected that will support national estimates by grade, sex, and grade cross-tabulated by sex, for students enrolled in grades 6-12. The design will further support separate estimates of the characteristics of non-Hispanic white, non-Hispanic black, and Hispanic students by school level (middle and high school). The procedures for stratification and sample selection are consistent with those from previous cycles of NYTS. Additional details of the sampling plan are provided in Attachment L.
Sampling Frame and Stratification
For the 2018 NYTS, we will use a combination of sources to create the school frame in order to increase school coverage. Along with the MDR dataset, we will use two files from NCES: the Common Core Dataset (CCD), a national file of public schools, and the Private School Universe Survey Dataset (PSS), a national file of non-public schools. Combining multiple data sources increases the coverage of schools nationally.
The sampling frame, representing the 50 U.S. states and the District of Columbia, will be stratified by urban status and by racial/ethnic minority concentration. Urban status strata will be defined by Metropolitan Statistical Area (MSA) versus non-MSA designation. The sample will be structured into geographically defined units, called primary sampling units (PSUs), each consisting of one county or a group of small, contiguous counties. Table B.1 provides the distribution of eligible schools in the frame.
We will impose a school size threshold as an additional criterion for eligibility. By removing from the frame those schools with an aggregate enrollment of less than 25 students across eligible grades, we will improve efficiency and safeguard privacy. Attachment L demonstrates that the coverage losses are negligible in terms of eligible students as well as in terms of potential biases.
Selection of PSUs. A total of 85 PSUs will be selected with probability proportional to the student enrollment in the PSU. The PSUs will be allocated to the urban/non-urban strata in proportion to the total eligible student enrollment in the stratum. This approach will increase the sampling efficiency by generating a nearly self-weighting sample.
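As an illustration of the PPS selection described above, the sketch below draws 85 PSUs with probability proportional to enrollment using systematic PPS sampling. The PSU names and enrollments are hypothetical, and the production selection (including handling of certainty PSUs) follows the specification in Attachment L rather than this simplified routine.

```python
# Sketch of systematic probability-proportional-to-size (PPS) selection, with
# selection probability proportional to PSU student enrollment. PSU names and
# enrollments are hypothetical; very large (certainty) PSUs and other details
# are handled per Attachment L in the production design, not by this routine.
import random

def pps_systematic_sample(units, sizes, n):
    """Select n units with probability proportional to size (systematic PPS)."""
    total = sum(sizes)
    interval = total / n                               # sampling interval on the size scale
    start = random.uniform(0, interval)                # random start within the first interval
    selection_points = [start + i * interval for i in range(n)]
    selected, cum = [], 0.0
    points = iter(selection_points)
    point = next(points)
    for unit, size in zip(units, sizes):
        cum += size                                    # running cumulative size
        while point is not None and point <= cum:      # this unit spans a selection point
            selected.append(unit)
            point = next(points, None)
    return selected

# Hypothetical frame: 300 candidate PSUs with enrollments between 2,000 and 80,000
random.seed(2018)
psus = [f"PSU_{i:03d}" for i in range(300)]
enrollments = [random.randint(2_000, 80_000) for _ in psus]
print(len(pps_systematic_sample(psus, enrollments, n=85)))  # 85 PSU selections
```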
Selection of Schools. Schools will be classified by enrollment size as small, medium, or large. Small schools are those with fewer than 25 students in one or more eligible grades. The remaining schools are classified as medium if they have fewer than 50 students in any eligible grade; otherwise, they are considered large schools.
Among large schools, two schools will be selected in each sample PSU, one middle school and one high school, with probability proportional to the measure of enrollment size. In principle, a total of 170 large-school selections (85 high schools and 85 middle schools) will be made at the second stage from the 85 sample PSUs. Among medium schools, 10 high schools and 10 middle schools will be selected from a sub-sample of 10 PSUs. Similarly, among small schools, a separate random sample of 15 middle schools and 15 high schools will be taken from a sub-sample of 15 PSUs. A total of 220 schools will be selected.
Selection of Students. Classes are selected based on two specific scientific parameters to ensure a nationally representative sample. First, classes have to be selected in such a way that all students in the school have a chance to participate. Second, all classes must be mutually exclusive so that no student is selected more than once. In each school, once we have determined the type of class or time period from which classes will be selected, we randomly select the appropriate number of classes within each grade. To maintain acceptable school participation rates, it is essential that each school have input in the decision of which classes will be sampled in their school following one of the above approaches. Examples of class sampling frames that have been used in past cycles include all 2nd period classes or a required physical education class. As long as the scientific sampling parameters are met, we work with each school to identify a classroom sampling frame that will work best for each school. All students in a selected classroom will be selected for the study.
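The sketch below illustrates the within-school class selection just described, assuming the school has supplied a mutually exclusive class frame (hypothetical 2nd-period rosters keyed by grade). One class is drawn per grade here; two per grade would be drawn in the double-sampled schools described in the next paragraph.

```python
# Minimal sketch of within-school class selection from a mutually exclusive
# class frame. The frame below (hypothetical 2nd-period classes by grade) is
# for illustration; the actual frame is negotiated with each school.
import random

def select_classes(frame_by_grade, classes_per_grade=1, seed=None):
    """frame_by_grade: dict mapping grade -> list of class identifiers."""
    rng = random.Random(seed)
    return {grade: rng.sample(classes, classes_per_grade)
            for grade, classes in frame_by_grade.items()}

# Hypothetical middle school frame: all 2nd-period classes, by grade
frame = {
    6: ["6-A", "6-B", "6-C", "6-D"],
    7: ["7-A", "7-B", "7-C"],
    8: ["8-A", "8-B", "8-C", "8-D", "8-E"],
}
print(select_classes(frame, classes_per_grade=1, seed=42))
```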
To facilitate accurate prevalence estimates among racial/ethnic minority groups, the sampling design seeks to balance increased yields for minority students against overall precision. Prior cycles of the NYTS have successfully employed double class sampling to increase the number of non-Hispanic black and Hispanic students: in schools with high concentrations of racial/ethnic minority students, two classes per grade were selected, compared to one class per grade in other schools. The 2018 NYTS will again use double class selection in schools with high concentrations of racial/ethnic minority students.
Refusals. School districts, schools, or students who refuse to participate in the study will not be replaced in the sample. We will record the characteristics of schools that refuse along with reasons given for their refusal for analysis of potential study biases.
The NYTS is designed to produce key estimates accurate to within ±5 percentage points at the 95% confidence level. Estimates by grade, by sex, and by grade cross-tabulated by sex meet this standard. The same standard applies to estimates for racial/ethnic groups by school level (middle and high school).
The derivation of sample sizes is driven by these precision requirements for subgroup estimates, specifically for the smallest subgroups defined by grade and by sex. With a sample size of approximately 3,429 participants per grade (totals of 10,287 and 13,716 for the middle school and high school grades, respectively), the design will ensure the required precision for design effects as large as 3.0. As shown in Attachment L, subgroups of 1,500 students will achieve ±5 percentage point precision at the 95% confidence level.
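The arithmetic behind these precision targets can be checked with the standard formula for the half-width of a confidence interval under a design effect; the sketch below assumes a worst-case proportion of 0.5 and the design effect of 3.0 cited above.

```python
# Sketch of the precision arithmetic behind the subgroup sample sizes, assuming
# a worst-case proportion of 0.5 and a design effect (deff) of 3.0.
import math

Z_95 = 1.96  # critical value for a 95% confidence interval

def margin_of_error(n, deff=3.0, p=0.5):
    """Half-width of a 95% CI for a proportion under a complex design."""
    return Z_95 * math.sqrt(deff * p * (1 - p) / n)

def required_n(moe=0.05, deff=3.0, p=0.5):
    """Smallest n giving a 95% CI half-width of at most `moe`."""
    return math.ceil(deff * (Z_95 ** 2) * p * (1 - p) / moe ** 2)

print(f"MOE for n=1,500, deff=3.0: {margin_of_error(1_500):.3f}")  # ~0.044 (< 0.05)
print(f"MOE for n=3,429, deff=3.0: {margin_of_error(3_429):.3f}")  # ~0.029
print(f"Required n for +/-5 points at deff=3.0: {required_n()}")   # 1,153
```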
We propose to replicate key aspects of the sampling design utilized for the 2017 NYTS. Refinements typically occur in response to the changing demographics of the in-school population and to meet CDC’s policy needs. For example, increasing percentages of minority students will likely lead to more efficient sampling of minority students. In addition, the proposed design will more effectively oversample non-Hispanic black students by increasing the sampling intensity in those schools with high concentrations of non-Hispanic black students.
The anticipated total number of participating students is 24,000, as developed in Attachment L. We will randomly select 39 schools of the 85 large high schools and 39 schools of the 85 large middle schools into the double class sampling group. In other words, we will select two classes per grade in these schools (i.e., six classes in middle schools and eight classes in high schools) to ensure that target precision levels are met for racial/ethnic minority group estimates. Among the remaining large schools, only one class per grade level will be selected (46 high schools and 46 middle schools). Similarly, one class per grade level will be selected in medium schools. In small schools, that is, those that cannot support a full class selection at each grade, all students in all eligible grades are taken into the sample.
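As a rough consistency check on the expected yield, the calculation below combines the school counts and classes-per-grade stated above with assumed values for average class size, small-school enrollment, and student participation; these assumptions are for illustration only, and the formal yield derivation is in Attachment L.

```python
# Back-of-the-envelope yield check for the class allocation described above.
# School counts and classes per grade come from the text; the average class
# size (~28), small-school enrollment (~60 eligible students each), and the
# ~88% student participation rate (historical average) are assumptions.
HS_GRADES, MS_GRADES = 4, 3

classes = (
    39 * 2 * HS_GRADES + 46 * 1 * HS_GRADES    # large high schools (double / single class)
    + 39 * 2 * MS_GRADES + 46 * 1 * MS_GRADES  # large middle schools (double / single class)
    + 10 * 1 * HS_GRADES + 10 * 1 * MS_GRADES  # medium schools (one class per grade)
)
students_in_classes = classes * 28             # assumed average class size
students_in_small_schools = 30 * 60            # all students taken in 30 small schools (assumed size)
expected_participants = (students_in_classes + students_in_small_schools) * 0.88

print(classes)                        # 938 sampled classes
print(round(expected_participants))   # roughly 24,700, in line with the ~24,000 target
```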
The sample was designed to yield approximately 1,500 participating non-Hispanic black students per level and approximately 1,500 participating Hispanic students per level. The target numbers were achieved in the previous cycles of the NYTS and will be confirmed in the simulation studies that we perform to fine tune the sampling parameters prior to sample selection.
Sample data will be weighted by the inverse of the probability of case selection and adjusted for non-response. The resulting weights will be trimmed to reduce mean-squared error. Next, the strata weights will be adjusted to reflect true relative enrollments rather than relative weighted enrollment. Finally, the data will be post-stratified to match national distributions of middle and high school students by race/ethnicity and grade. Variances will be computed using linearization methods.
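A minimal sketch of this weighting sequence is shown below; the column names, trimming rule, and omission of the stratum-level enrollment adjustment are simplifying assumptions rather than the production specification in Attachment L.

```python
# Minimal sketch of the weighting sequence described above: base weights,
# nonresponse adjustment within weighting classes, weight trimming, and
# post-stratification to known enrollment totals. Column names and the
# trimming cap are illustrative assumptions only.
import pandas as pd

def weight_nyts(df: pd.DataFrame, pop_totals: dict) -> pd.DataFrame:
    # 1. Base weight: inverse of the overall probability of selection
    df = df.copy()
    df["w"] = 1.0 / df["selection_prob"]

    # 2. Nonresponse adjustment: divide by the response rate of the weighting class
    resp_rate = df.groupby("weighting_class")["responded"].transform("mean")
    df = df[df["responded"]].copy()
    df["w"] = df["w"] / resp_rate.loc[df.index]

    # 3. Trim extreme weights to reduce mean-squared error (simple cap for illustration)
    df["w"] = df["w"].clip(upper=df["w"].quantile(0.99))

    # 4. Post-stratify so weights reproduce known totals by race/ethnicity and grade
    cell_sums = df.groupby(["race_eth", "grade"])["w"].transform("sum")
    targets = [pop_totals[(r, g)] for r, g in zip(df["race_eth"], df["grade"])]
    df["w"] = df["w"] * (pd.Series(targets, index=df.index) / cell_sums)
    return df

# Usage (hypothetical): weighted = weight_nyts(respondent_df, pop_totals={("Hispanic", 9): 950_000})
```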
Confidence intervals vary depending upon whether an estimate represents the full population or a subset, such as a particular grade, sex, or racial/ethnic group. Within a grouping, they also vary depending on the level of the estimate and the design effect associated with the measure.
Based on prior NYTS cycles, as well as on the precision requirements that have driven the sampling design, we can expect the following subgroup estimates to be within ±5 percentage points at the 95% confidence level:
Estimates by grade, sex, and grade cross-tabulated by sex
Racial/Ethnic minority group estimates for non-Hispanic blacks and Hispanics cross-tabulated by school level
The former estimates will be derived from projected sample sizes of 3,428 participating students per grade, and therefore, approximately 1,714 by sex within grade. For the latter estimates, the anticipated number of participants in each minority group is at least 1,500 per school level. For conservative design effect scenarios (design effects as large as 3.0), estimates based on these subgroup sample sizes will be within +/- 5 percentage points at the 95% confidence level.
The NYTS data are used for trend analyses where data for successive cycles are compared with statistical testing techniques. Statistical testing methods are also used to compare subgroup prevalence rates (e.g., male versus female students) for each cycle of the NYTS. These tests will be performed with statistical techniques that account for the complex survey design.
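As a simplified illustration of such a design-aware comparison, the sketch below approximates the complex-design variance by inflating the simple-random-sampling variance with a design effect; the production analyses use Taylor-series linearization in survey software, and the prevalences shown are hypothetical.

```python
# Simplified sketch of a design-adjusted comparison of two prevalence estimates
# (e.g., male vs. female current use). The complex design is approximated here
# by inflating the simple-random-sampling variance with a design effect (deff);
# actual NYTS analyses use linearization methods.
import math

def design_adjusted_z(p1, n1, p2, n2, deff=3.0):
    """Two-sided z statistic and p-value for H0: p1 == p2 under deff inflation."""
    var = deff * (p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z = (p1 - p2) / math.sqrt(var)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical prevalences; the subgroup sizes of ~1,714 per sex within grade come from the text
z, p = design_adjusted_z(p1=0.12, n1=1_714, p2=0.09, n2=1_714)
print(f"z = {z:.2f}, p = {p:.3f}")
```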
The 2018 NYTS questionnaire (Attachment I1) contains 88 items. The first set of questions gathers demographic data. Most of the remaining questions address the following tobacco-related topics: tobacco use (cigarettes, smokeless tobacco, cigars, pipes, bidis, electronic vapor products, and hookah), knowledge and attitudes, media and advertising, minors’ access and enforcement, cessation, and environmental exposure to tobacco smoke. The questions are in a multiple-choice format and will be administered as an 8-page, optically scannable questionnaire booklet. Beginning in 2019, CDC hopes to administer the survey via a digitally based self-administered questionnaire, with a scannable questionnaire as a backup option.
Data will be collected by a small staff of professional data collectors who are specially trained to conduct the NYTS. The time during the school day in which the survey is administered varies by school. This decision is made in coordination with each school to ensure that the type of class or period of the day selected for sampling: 1) meets the scientific sampling parameters to ensure a nationally representative sample; and 2) results in the least burden/highest possible acceptability for the school. Each data collector will have direct responsibility for administering the survey to students. Data collectors will follow a questionnaire administration guide (Attachment I7).
Teachers will be asked to remain at the front or back of the classroom and not to walk around the room monitoring the aisles during survey administration because doing so could affect honest responses and compromise anonymity. Teachers also will be asked to identify students with parental consent to participate in the survey and to make sure non-participating students have appropriate alternate activities. The rationale for this is to increase the candor and comfort level of students.
The only direct responsibility of teachers in data collection is to distribute and follow up on parental permission forms sent out prior to the scheduled date of data collection in the school. Teachers are provided with a parental permission form distribution script (Attachment I2) to follow when distributing permission forms to students. The Data Collection Checklist (Attachment H1) is completed by teachers to track which students have received parental permission to participate in the data collection. The teachers receive instructions on completing the Data Collection Checklist in the “Letter to Teachers in Participating Schools” (Attachment H2). The data collector will utilize the information on the Data Collection Checklist to identify students eligible for a make-up survey administration; this information will be recorded by the data collector on the “Make-up List and Instructions” document (also included in Attachment H1).
At the start of the survey administration sessions, professionally trained NYTS data collectors will instruct students to not put their names anywhere on the paper and pencil survey instrument (if used) and remind them that their responses will be treated in an anonymous manner (Questionnaire Administration Script, Attachment I7). At the conclusion of the survey administration session, students will be instructed to place their completed surveys in an envelope and seal it. The sealed individual student envelopes will then be deposited into a classroom-specific envelope.
In general, our data collection procedures have been designed to ensure that:
Protocol is followed in obtaining access to schools
Everyday school activity schedules are disrupted minimally
Administrative burden placed on teachers is minimal
Parents give informed permission to participate in the survey
Anonymity of student participation is maintained, with no punitive actions against non-participants
Alternative activities are provided for nonparticipants
Control over the quality of data is maintained
CDC plans to move toward administering the survey electronically (digitally self-administered) after 2018. Anonymity and confidentiality for the electronic survey will be maintained through the same processes described above for the paper-and-pencil version. For the digitally based self-administered questionnaire, at the start of the survey administration session, professionally trained NYTS data collectors will remind students that their responses will be captured anonymously (Questionnaire Administration Script, Attachment I7). At the conclusion of the session, students will be instructed to hand their tablet to the data collector. The students’ data will immediately be uploaded to the cloud database and erased from the tablet. As the electronic administration is completed in each selected class, the classroom-specific tablet will be stored in a school-specific box.
This transition to a digitally self-administered survey will introduce skip patterns, so respondent burden will vary. Skip patterns are based on each respondent’s use of six tobacco products: cigarettes, cigars/cigarillos/little cigars, chewing tobacco/snuff/dip, e-cigarettes, hookah/waterpipe, and other tobacco products (roll-your-own cigarettes, pipes, snus, dissolvable tobacco products, and bidis). Using the 2018 NYTS questionnaire (88 questions), non-users of any tobacco product will be asked 47 questions; former users of one to six tobacco products will be asked between 56 and 76 questions; and current users of one to six tobacco products will be asked between 61 and 88 questions.
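The sketch below illustrates how such skip-pattern routing could be expressed; the six product groups come from the text, but the per-product item increments are hypothetical placeholders, since in the actual instrument many follow-up items are shared across products and the counts are not simply additive.

```python
# Illustrative sketch of skip-pattern routing by product-use status. The six
# product groups come from the text; the per-product item increments below are
# hypothetical placeholders (the actual totals of 47 / 56-76 / 61-88 questions
# reflect follow-up items shared across products, not a simple additive rule).
PRODUCTS = ["cigarettes", "cigars", "smokeless", "e-cigarettes", "hookah", "other"]
CORE_ITEMS = 47  # asked of every respondent, per the text

def items_for_respondent(status_by_product):
    """status_by_product maps product -> 'never' | 'former' | 'current'."""
    n_items = CORE_ITEMS
    for product in PRODUCTS:
        status = status_by_product.get(product, "never")
        if status == "former":
            n_items += 3   # hypothetical follow-up block for former users
        elif status == "current":
            n_items += 5   # hypothetical, larger block for current users
    return n_items

# Example: a current e-cigarette user who formerly smoked cigarettes
print(items_for_respondent({"e-cigarettes": "current", "cigarettes": "former"}))  # 55 under these assumptions
```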
CDC’s contractor will provide rental or purchase of electronic devices (i.e., laptops, tablets, or compatible devices) on which all survey items will be loaded and which students will use to complete the survey. The contractor shall also provide the necessary software for each electronic device so that survey items can be administered to each student. We also plan to conduct an electronic pilot study during this OMB cycle and will use its results to inform the planned move from paper-and-pencil to electronic data collection. If unforeseen circumstances or constraints prohibit administration of an electronic survey, a traditional paper-and-pencil survey will continue to be used.
All initial letters of invitation will be on CDC letterhead from the Department of Health and Human Services and signed by Corinne Graffunder, DrPH, MPH, Director of the Office on Smoking and Health, NCCDPHP at CDC. The procedures for gaining access to and support from states, districts, and schools will have three major steps:
First, support will be sought from State Education Agencies and State Departments of Health. The initial request will be accompanied by a study fact sheet and a list of all sampled districts and schools in their jurisdiction. States will be asked to provide general guidance on working with the selected school districts and schools and to notify school districts that they may anticipate being contacted about the survey.
Once cleared at the state level, an invitation packet will be sent to sampled school districts in the state. Districts will receive a list of schools sampled from within their district in the invitation packet and will be asked to provide general guidance on working with them and to notify schools that they may anticipate being contacted about the study. Telephone contact will be made with the office comparable to the district office (e.g., diocesan office of education), if there is one.
Once cleared at the school district level, selected schools will be invited to participate. Information previously obtained about the school will be verified, and the burden and benefits of participation in the survey will be presented. After a school agrees to participate, a tailored plan for data collection in the school will be developed (e.g., selecting classes and determining whether the survey will be administered to selected class sections simultaneously or serially). Well in advance of the agreed-upon survey administration date, schools will receive the appropriate number of parental permission forms and instructions. All materials needed to conduct the survey will be provided by the data collector visiting the school. Contact with schools will be maintained until all data collection activities have been completed.
Prior experience suggests that the process of working with each state’s health and education agencies, school districts, and schools will have unique features. Communication with each agency will recognize the organizational constraints and prevailing practices of the agency. Scripts for guiding these discussions may be found in Attachments E1 (state-level), F1 (district-level), and G1 (school-level). Copies of letters of invitation can be found in Attachment E2 (state-level), Attachment F2 (district-level), and Attachment G2 (school-level). Attachment G2 also contains the NYTS Fact Sheet for Schools. Attachment G3 contains a copy of the letter sent to school administrators once they have agreed to participate.
The permission form informs both the student and the parent about an important activity in which the student has the opportunity to participate. By providing adequate information about the activity, it helps ensure that permission will be informed. Copies of the permission form are contained in Attachments I3 (English version) and I4 (Spanish version). In accordance with the No Child Left Behind Act, the permission form indicates that a copy of the questionnaire will be available for review by parents at their child’s school.
A waiver of written student assent was obtained for the participation of children because: this research presents no more than minimal risk to subjects; parental permission is required for participation; the waiver will not adversely affect the rights and welfare of the students, who are free to decline to take part; and some students may perceive they are not anonymous if required to provide stated assent and sign a consent/assent document. Students are told, “Participating in this survey is voluntary and your grade in this class will not be affected, whether or not you answer the questions.” Completion of the survey implies student assent.
As a means to monitor the parental permission form process and to ensure questionnaires are completed only by students for whom permission has been obtained, teachers are asked to enter student names on the Data Collection Checklist, which is similar to a class roll (Attachment H1). Teachers can substitute any other information in place of student names (such as student ID numbers or letters) on the Data Collection Checklist as long as it allows them to determine individually which students received parental permission to participate. This information will be conveyed to the data collector on the survey administration day.
The Data Collection Checklist is an optional tool to assist in managing the parental permission and student assent process. It will be destroyed at the end of the study. No individually identifiable information (e.g., student name, class, or school) is collected on the NYTS; therefore, there is no way to connect students’ names to their response data.
NYTS is required by law to notify parents of students selected for NYTS surveys that their child has been selected and that student participation is voluntary. Schools may use various processes to obtain parental permission and various forms of notification (electronic, such as e-mail, or a hard-copy letter), either provided by the state or developed by the school. However, the notification shall include the following elements:
this school will be participating in NYTS and your child’s classroom may be/is selected to participate;
a brief description of the nature and importance of NYTS;
all responses are confidential and results will not be reported to or about individual students or schools; and
your child may be excused from participation for any reason, is not required to finish the survey, and is not required to answer any survey questions.
Quality assurance measures will be taken before, during, and after the survey to reduce the likelihood of human and technical error while ensuring enhanced internal and external validity of the survey (Table B.2.a).
Table B.2.a – Quality Assurance Measures Before, During, and After Data Collection
Before | During | After
Table B.2.b lists the major means of quality control. As shown, the task of collecting quality data begins with a clear and explicit study protocol and ends with procedures for the visual inspection and scanning of collected data. In between these activities, and subsequent to data collector training, measures must be taken to reinforce training, to assist field staff who express/exhibit difficulties completing data collection activities, and to check on data collection techniques. Because the ultimate aim is production of a high quality database and reports, various quality assurance activities will be applied during the data collection phase.
Table B.2.b – Major Means of Quality Control
Survey Step | Quality Control Procedures
Mailing to Districts and Schools |
Telephone Follow-up Contacts |
Previsit Logistics Verification |
Data Collector Training and Supervision of School Visits |
Receipt Control |
Manual Editing |
Computer Scanning (scannable questionnaire) |
Computer Data Collection (digitally based questionnaire) |
Non-Response Analyses
Non-response may occur at both the school and student (child) level. However, the non-response analysis completed by CDC is based on school characteristics, as differences between participating and non-participating students cannot be measured in the NYTS. We also analyze aggregate demographic and socioeconomic characteristics of the student population available at the school-level. Along with school and student characteristic non-response bias analyses, the NYTS non-response analysis also assesses the potential for item nonresponse bias.
CDC completes various analyses to assess non-response bias in the NYTS. First, CDC assesses whether non-response rates might pose a potential problem overall or for certain population subgroups. High levels of non-response would indicate that more intensive efforts are required to attain participation overall or for certain subgroups. Even if analyses do not suggest bias from non-response, efforts may still be made to reduce, or to adjust for, any residual bias that non-response may induce. For the NYTS, these analyses were used to identify lower-responding subgroups and to compensate for potential non-response bias in the weighting process through weighting class adjustments.
Second, CDC assesses the participation rates achieved in the current NYTS cycle in the context of historical participation rates at the student and school levels. The 2017 analysis indicated that response rates changed from 1999 to 2017; the major change was a decrease in school participation, while student participation rates changed little over this period.
Third, CDC contrasts participating and non-participating schools through the comparisons of school and student population characteristics. School-level non-response analyses for the NYTS assess differences in school participation rates by census region, school type (public vs. non-public), school size (large vs. small) and urban status (urban vs. rural). Exploratory non-response analyses also assessed potential differences in school participation rates by school enrollment changes, presence of a library or media center, and the student-to-computer ratio. Student population characteristics assessed for non-response biases include race/ethnicity distribution, per-student Title I spending, school affluence, school percent college bound, and school percent receiving free lunch.
Finally, CDC assesses item nonresponse for each survey question in the NYTS. Details on the methods and results of the 2017 NYTS non-response analysis are provided in Attachment M.
Across 12 cycles, the NYTS has maintained consistently high student and school response rates (Table B.3), averaging a 76% combined (school × student) response rate. Response rates are higher at each level individually: the school participation rate has averaged 84%, with a low of 73%, and the student participation rate has averaged approximately 90%, with a low of 87%.
Table B.3 – Historical NYTS Participation Rates
CDC’s recent nonresponse bias analysis suggested that the drop in response rate between 1999 and 2017 was due to declining response rates among non-public schools. In 2017, school type (public vs. non-public) was associated with school participation in the bivariate analysis; non-public schools responded at a significantly lower rate than public schools (56.0% vs. 78.8%). However, non-public schools make up a small percentage of all schools in the sample, so this difference is unlikely to lead to substantial bias. No other school-level differences in school participation were observed in bivariate analyses in 2017. Nevertheless, to mitigate such potential biases, the school non-response adjustments take school type and school size into account. Furthermore, no differences in school participation were noted by school-level student population characteristics (such as race/ethnicity, school affluence, or percentage of students who are college-bound) in 2017.
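For illustration, a bivariate check of this kind can be run as a chi-square test of participation by school type; the 56.0% and 78.8% rates are from the analysis above, but the school counts in the sketch are hypothetical, so the statistic itself is illustrative only.

```python
# Illustrative bivariate check of participation by school type, of the kind
# summarized above. The 56.0% / 78.8% participation rates are from the text;
# the school counts below are hypothetical, so the statistic is illustrative.
from scipy.stats import chi2_contingency

#                 participated, refused
table = [[17, 13],     # non-public schools (17/30 = 56.7%, close to the 56.0% in the text)
         [150, 40]]    # public schools (150/190 = 78.9%, close to the 78.8% in the text)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")  # p < 0.05 -> association
```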
Although the school (76.8%) and student (88.7%) participation rates were lower in 2017 than historical averages, NYTS participation rates traditionally have been relatively high compared to other federally funded, national, school-based, health-related surveys of high school students. For example, the widely cited Monitoring the Future survey (formerly known as the High School Senior Survey) achieves substantially lower participation rates. The participation rates established by the NYTS are the product of the application of proven and tested procedures for maximizing school and student participation.
As indicated in Section A.16.c, it is desirable to complete data collection before the final month of school (i.e., by mid-April to mid-May, depending on location). Many schools are very busy at that time with standardized testing and final exams; in addition, attendance can be very unstable, especially among twelfth grade students.
We distinguish among six potential types of nonresponse problems: refusal to participate by a selected school district, school, teacher, parent, or student; and collection of incomplete information from a student.
To minimize refusals at all levels, from school district to student, we will use a variety of techniques, emphasizing the importance of the survey. Given that the subject matter is tobacco, we expect that a few school districts or schools will need to place the issue of survey participation before the school board. To increase the likelihood of an affirmative decision, we will: (1) work through the state agencies to communicate their support of the survey; (2) indicate that the survey is being sponsored by CDC; (3) convey to the school district or school that the survey has the endorsement of many key national educational and health associations, such as the National PTA, American Medical Association, National Association of State Boards of Education, Council of Chief State School Officers, and the National School Boards Association; (4) maintain both a toll-free hotline and a dedicated e-mail account to answer questions from the school board; (5) offer a package of educational products to each participating school, as recommended by OMB in approving the 1998 YRBS in alternative schools (OMB No. 0920-0416, expiration 12/98) and implemented on the NYTS ever since; (6) comply with all requirements from school districts in preparing written proposals for survey clearance; (7) convey a willingness to appear in person, if needed, to present the survey before a school board, research committee, or other local entity tasked with reviewing the survey; and (8) offer schools a monetary incentive of $500.
The sampling plan does not allow for the replacement of schools that refuse to participate due to concern that replacing schools would introduce bias. All participating state departments of health and education, school districts, and schools also will have access to the published survey results.
Maximizing responses and dealing with refusals from parents, teachers, and students require different strategies. To maximize responses, we will recommend that schools help advertise the survey through the principal’s newsletter, PTA meetings, and other established means of communication. Reminders will be sent to parents who have not returned parental permission forms within an agreed-upon time period (e.g., three days); those who do not respond to the reminder will be sent a second and final reminder. The permission form will provide a telephone number at CDC that parents may call to have questions answered before agreeing to give permission for their child’s participation. Permission forms will be available in English, Spanish, and any other language spoken by a large percentage of parents in a given school district. Field staff will be available on location to answer questions from parents who remain uncertain about granting permission. Bilingual field staff will be used in locations with high Hispanic concentrations (e.g., California, Florida, New York City, and Texas).
Teacher refusals to cooperate with the study are not expected to be a problem because schools will already have agreed to participate. Refusals by students who have parental permission to participate are expected to be minimal. No punitive action will be taken against a nonconsenting student. Nonconsenting students will not be replaced. Data will be analyzed to determine if student nonresponse introduces any biases.
To minimize the likelihood of missing values, students will be reminded in writing in the questionnaire booklet and verbally by the survey administrator to review the optically scannable questionnaire before turning it in to verify that: (1) each question has been answered; (2) only one oval is filled in for each question, with the exception of questions instructing the respondent to choose one or more answers (e.g., the question on race asks the student to mark each race that applies); and (3) each response has been entered with a No. 2 pencil, fills the oval, and is dark. A No. 2 pencil will be provided to each survey participant to reduce the likelihood that responses will not scan properly, which would produce missing values. In addition, when completed questionnaires are visually inspected later at project headquarters, any oval that is lightly filled in will be darkened (unless it appears to be an erasure) and stray marks will be erased before the forms are scanned. Missing values for an individual student on the survey will not be imputed.
The NYTS core questionnaire items, those identified for use both nationally and at the state level, originally were subjected to cognitive analyses by RTI in 1999. This cognitive analysis directly affected the first NYTS questionnaire fielded in 1999. Cognitive analyses of a small number of new questions were conducted in the fall of 2003 to investigate potential sources of error. A limited pretest of the 2004 NYTS questionnaire was also conducted in August 2003. Cognitive testing was undertaken again prior to the 2006 NYTS; specifically, testing evaluated revisions to certain existing core survey questions and new items under consideration. In April 2005, a pretest of the 2006 NYTS questionnaire was conducted in accord with OMB guidelines. The pretests sharpened the articulation of certain survey questions and confirmed the existing empirical estimate of the survey burden. In 2012, cognitive testing was performed on 26 new questions added to the NYTS while keeping the overall length of the survey at 81 questions. In 2013, another round of cognitive testing was conducted, this time on the whole survey. For the 2015 cycle of the NYTS, cognitive testing was done on 11 new questions focused on electronic vapor products (e.g., electronic cigarettes, electronic cigars, vape pens, and electronic hookah); the new questions were tested, revisions were incorporated, and final question wording was established.
The current ICR includes a new line item in the burden table to support more robust testing of changes to the NYTS questionnaire prior to their implementation. Burden is specifically allocated to performing cognitive testing of new or modified questions that will provide better measures of tobacco products. The burden also includes testing of the questionnaires to confirm that they can be completed in 45 minutes.
Statistical aspects of the study have been reviewed by the individuals listed below.
Sean Hu, MD, DrPH, Senior Epidemiologist
Phone: 770-488-5845
Agency Responsibility
Within the agency, the following individual will be responsible for receiving and approving contract deliverables and will have primary responsibility for data analysis:
Ahmed Jamal, MBBS, MPH
Centers for Disease Control and Prevention
Office on Smoking and Health, Epidemiology Branch
4770 Buford Highway NE, MS-F79
Atlanta, GA 30341
Phone: 770-488-5077; Fax: 770-488-5848
E-mail: [email protected]
Responsibility for Data Collection
The representative of the contractor responsible for conducting the planned data collection will be designated by the contractor.
REFERENCES
CDC (2001). Youth Tobacco Surveillance–United States, 2000. MMWR; 50(SS-4).
CDC (2010). Tobacco Use Among Middle and High School Students—United States, 2000-2009. MMWR; 59(33):1063-1068.
CDC (2012a). Current Tobacco Use Among Middle and High School Students – United States, 2011. MMWR; 61(31): 581-585.
CDC (2012b). National Youth Tobacco Survey. Atlanta, GA: US Department of Health and Human Services, Centers for Disease Control and Prevention. Available at http://www.cdc.gov/tobacco/data_statistics/surveys/nyts.
CDC (2013a). Notes from the Field: Electronic Cigarette Use Among Middle and High School Students—United States, 2011-2012. MMWR; 62(35): 729-730.
CDC (2013b). Tobacco Product Use Among Middle and High School Students—United States, 2011 and 2012. MMWR; 62(45): 893-897.
CDC (2013c). Winnable Battles Progress Report- 2010-2015. Atlanta, GA: US Department of Health and Human Services, Centers for Disease Control and Prevention.
CDC (2014a). Best Practices for Comprehensive Tobacco Control Programs – 2014. Atlanta, GA: U.S. Department of Health and Human Services, CDC, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health.
CDC (2014b). Budget Request Summary- Fiscal Year 2015. Atlanta, GA: US Department of Health and Human Services, Centers for Disease Control and Prevention.
CDC (2014c). Winnable Battles. Retrieved from http://www.cdc.gov/winnablebattles/
CDC (2014d). Annual Performance Report and Performance Plan- Fiscal Year 2014. Atlanta, GA: US Department of Health and Human Services, Centers for Disease Control and Prevention.
FDA (2014). FDA Proposes to Extend Its Tobacco Authority to Additional Tobacco Products, Including E-cigarettes. FDA News Release, April 24, 2014. Available at: http://www.fda.gov/newsevents/newsroom/pressannouncements/ucm394667.htm
Frieden, T.R. (2010). A framework for public health action: The health impact pyramid. American Journal of Public Health, 100(4), 590-595.
Institute of Medicine (2011). Leading Health Indicators for Healthy People 2020: Letter Report. Washington, DC: The National Academies Press.
National Institute on Drug Abuse (2003). Youths’ Opportunities to Experiment Influence Later Use of Illegal Drugs. National Institute on Drug Abuse, 17(5).
National Institute on Drug Abuse (2014). Monitoring the Future national results on drug use: 1975-2013: Overview, Key Findings on Adolescent Drug Use. National Institute on Drug Abuse, National Institutes of Health. Ann Arbor, MI: Institute for Social Research, The University of Michigan.
U.S. Bureau of Labor Statistics (2014). May 2013 National Occupational Employment and Wage Estimates, United States. Retrieved from http://www.bls.gov/oes/current/oes_nat.htm
USDHHS (2010a). How Tobacco Smoke Causes Disease: The Biology and Behavioral Basis for Smoking-Attributable Disease: A Report of the Surgeon General. Atlanta, GA: U.S. Department of Health and Human Service, Public Health Service, CDC, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health, 2010.
USDHHS (2010b). Healthy People 2020. Washington, D.C.: U.S. Department of Health and Human Services. Available at: http://healthypeople.gov/2020/default.aspx
USDHHS (2012a). Ending the Tobacco Epidemic: Progress toward a Healthier Nation. Washington, DC: US Department of Health and Human Services, Office of the Assistant Secretary for Health.
USDHHS (2012b). Preventing Tobacco Use Among Youth and Young Adults: A Report of the Surgeon General. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health.
USDHHS (2014). The Health Consequences of Smoking- 50 years of Progress: A Report of the Surgeon General. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health.
USDHHS, NIH, & NCI (2007). NCI’s President’s Cancer Panel 2006-2007 Annual Report: Promoting Healthy Lifestyles: Policy, Program, and Personal Recommendations for Reducing Cancer Risk. U.S. Department of Health and Human Services, National Institutes of Health, National Cancer Institute.
USDHHS, NIH, & NIDA (2007). Director’s Report to the National Advisory Council on Drug Abuse. U.S. Department of Health and Human Services, National Institutes of Health, National Institute on Drug Abuse.
Footnote 1: We created a dichotomy of urban vs. non-urban schools using the Metro Status categorical variable available in these files.