
2022 National Youth Tobacco Survey (NYTS)

OMB: 0920-0621







Information Collection Request





Revision



NATIONAL YOUTH TOBACCO SURVEY, 2021 - 2023


OMB No. 0920-0621, expires 04/30/2021






SUPPORTING STATEMENT: PART B






Submitted by:

David Homa, PhD, MPH

Centers for Disease Control and Prevention

Office on Smoking and Health

Epidemiology Branch

4770 Buford Highway NE, MS-S107-7

Atlanta, GA 30341

Phone: 770-488-3626

Fax: 770-488-5848

E-mail: [email protected]








July 29, 2020


TABLE OF CONTENTS


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1. Respondent Universe and Sampling Methods


B.2. Procedures for the Collection of Information

B.3. Methods to Maximize Response Rates and Deal with Nonresponse

B.4. Tests of Procedures or Methods to be Undertaken


B.5. Response to 2016 Peer Review Panel Recommendations Regarding NYTS Sample


B.6. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


REFERENCES


LIST OF ATTACHMENTS


A1. Authorizing Legislation


A2. Federal Register Notice


A3. Federal Register Notice Comments and Agency Responses


B. State Tobacco Control Reports that Cite National Youth Tobacco Survey Data


C. Publications from Prior Cycles of the National Youth Tobacco Survey


D1. State-level Recruitment Script for the National Youth Tobacco Survey


D2. State-level Recruitment Script for the National Youth Tobacco Survey Supplemental Document – State Letter of Invitation


E1. District-level Recruitment Script for the National Youth Tobacco Survey


E2. District-level Recruitment Script for the National Youth Tobacco Survey Supplemental Document – District Letter of Invitation


F1. School-level Recruitment Script for the National Youth Tobacco Survey


F2. School-level Recruitment Script for the National Youth Tobacco Survey Supplemental Documents – School Letter of Invitation and NYTS Fact Sheet for Schools


F3. School-level Recruitment Script for the National Youth Tobacco Survey Supplemental Documents – Letter to Agreeing Schools


G1. Data Collection Checklist for the National Youth Tobacco Survey


G2. Data Collection Checklist for the National Youth Tobacco Survey Supplemental Documents – Letter to Teachers in Participating Schools


H1. National Youth Tobacco Survey Questionnaire


H2. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form Distribution Script


H3. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form (Active) and Fact Sheet (English Version)


H4. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form (Passive) and Fact Sheet (English Version)


H5. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form (Active) and Fact Sheet (Spanish Version)


H6. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form (Active) and Fact Sheet (Spanish Version)


H7. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form Reminder Notice (English Version)


H8. National Youth Tobacco Survey Questionnaire Supplemental Documents – Parental Permission Form Reminder Notice (Spanish Version)


H9. National Youth Tobacco Survey Questionnaire Supplemental Documents – Questionnaire Administration Script


I. Refusal Conversion and Nonresponse Cover Memos


J1. CDC IRB Approval Letter


J2. Contractor IRB Approval Letter


K. Sample Table Shells


L. Detailed Sampling and Weighting Plan


M1. Dear Teacher Letter, On-Campus Instruction, Active Permission


M2. Dear Teacher Letter, On-Campus Instruction, Passive Permission


M3. Dear Teacher Letter, Virtual/Distance Instruction, Active Permission


M4. Dear Teacher Letter, Virtual/Distance Instruction, Passive Permission


N. NYTS Student and Teacher Video Scripts


O. NYTS Online Teacher Landing Page-Class Enrollment Form


P. Summary of School Arrangements Forms




LIST OF TABLES


Table A.8: Consultants for 2021-2023 NYTS

Table A.10: Access Controls

Table A.12a: Estimated Annualized Burden Hours

Table A.12b: Annualized Estimated Cost to Respondents

Table A.14: Annualized Study Cost

Table A.15: Annualized Estimates of Respondents and Burden, 2021-23 NYTS

Table A.16: Schedule of Activities for 2021 NYTS

Table B.1: Distribution of Schools by Urban Status and School Type

Table B.2: Major Means of Quality Control


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


As of 2019, the NYTS is administered through electronic data collection. The questionnaire incorporates skip logic to tailor the questionnaire based on respondents' tobacco product use behaviors. Thus, respondents are not asked to respond to questions that do not apply to them, reducing overall respondent burden. Non-branded product images also are included in the electronic questionnaire to improve product recognition and recall. Overall, these changes improve validity of responses and enhance data quality. Findings from the 2018 NYTS Electronic Pilot study indicated that incorporating skip logic into the electronic survey reduced mean survey completion time by 15%, reduced the number of questions respondents needed to answer by 30%, and reduced the number of contradictory and inconsistent responses to zero, in comparison to an electronic survey with no skip logic (Hu et al., 2020). In the first full electronic NYTS administration conducted in 2019, the average survey completion time was approximately 12.5 minutes; however, students are given up to one class period (approximately 35-45 minutes) to complete the survey.


Data for the NYTS typically are collected in schools (classrooms) on provided tablets that do not require the use of WiFi or Internet access; data are stored locally on the tablet until they can be uploaded directly to a secure server (ICF MS SQL Server Database) via secure mobile hotspots. Thus, the main data collection procedure does not rely on a school's IT network. However, students absent on the day of survey administration are asked to complete a make-up survey using a virtual (web-based) version of the survey. This make-up survey is programmed to mimic the tablet-based survey.

Given the unprecedented circumstances presented by the COVID-19 pandemic, the 2021 NYTS will be administered electronically using a pre-programmed web-based survey. Instead of completing the survey on a provided tablet, participating students will access the survey through a URL using an internet-connected device. The web-based survey is designed to simulate the tablet-based survey. Similar to previous years, the web-based survey also will be used to obtain make-up surveys from eligible students who are absent on the initial date of data collection.


Other than the change in mode of administration and associated modification of data collection procedures, all other survey methods remain the same as in past cycles of NYTS administration; the 2021-2023 NYTS will be a continuation of the NYTS cycles that have taken place since 1999, employing the general sampling design framework used in the previous cycles. Specifically, this study will employ a repeat cross-sectional design to develop national estimates of tobacco product use behaviors and exposure to pro- and anti-tobacco influences among U.S. students enrolled in grades 6-12.



B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS


The universe for the study will consist of students in 6th through 12th grade who attend public and private schools in the 50 U.S. States and the District of Columbia. Private schools will include both religious and non-religious schools.

The sampling frame for schools has been obtained from Market Data Retrieval (MDR) (formerly known as Quality Education Data, Inc., or QED). It has been augmented by combining it with the frames maintained by the National Center for Education Statistics (NCES). School-level data on enrollment by grade and minority race/ethnicity are available in the NCES data set.


Table B.1 displays the current U.S. distribution of eligible schools by urban status and type of school. This tabulation was computed over a frame of eligible schools with middle school and/or high school grades, prepared using the latest MDR files that are the basis for the sampling frame. The sampling frame of eligible schools will not change due to administering the 2021 NYTS as a virtual survey.



Table B.1: Distribution of Schools by Urban Status and School Type



Table of School Type by Urban Status
(cell entries: frequency; percent of total; row percent; column percent)

School Type   | Non-Urban                    | Urban                        | Total
Non-Public    | 5,702; 8.01; 41.94; 14.54    | 7,893; 11.09; 58.06; 24.70   | 13,595; 19.10
Public        | 33,512; 47.09; 58.20; 85.46  | 24,065; 33.81; 41.80; 75.30  | 57,577; 80.90
Total         | 39,214; 55.10                | 31,958; 44.90                | 71,172; 100.00



B.2 PROCEDURES FOR COLLECTION OF INFORMATION

Statistical Methodology for Stratification and Sample Selection


A national probability sample will be selected that will support national estimates by grade, sex, and grade cross-tabulated by sex, for students enrolled in grades 6-12. The design will further support separate estimates of the characteristics of non-Hispanic white, non-Hispanic black, and Hispanic students by school level (middle and high school). The procedures for stratification and sample selection are consistent with those from previous cycles of NYTS. Additional details of the sampling plan are provided in Attachment L.


Sampling Frame and Stratification. For the 2021-2023 NYTS surveys, we will use a combination of sources to create the school frame in order to increase school coverage. Along with the MDR dataset, we will use two files from NCES: the Common Core of Data (CCD), a national file of public schools, and the Private School Universe Survey (PSS), a national file of non-public schools. The principle behind combining multiple data sources is to increase the coverage of schools nationally.


The sampling frame representing the 50 U.S. States and the District of Columbia will be stratified by urban status and by racial/ethnic minority concentration. The urban status strata, distinguishing urban and non-urban areas, will be defined by Metropolitan Statistical Area (MSA) versus non-MSA status. The sample will be structured into geographically defined units, called primary sampling units (PSUs), which consist of one county or a group of small, contiguous counties. Table B.1 provides the distribution of eligible schools in the frame.


We will impose a school size threshold as an additional criterion for eligibility. By removing from the frame schools with an aggregate enrollment of fewer than 40 students across eligible grades (grades 6-8 for middle schools; grades 9-12 for high schools), we will improve efficiency and safeguard privacy. Attachment K demonstrates that the coverage losses are negligible in terms of eligible students as well as in terms of potential biases.
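The enrollment threshold above amounts to a simple filter on the frame. The sketch below illustrates the rule; the school names, records, and dictionary layout are hypothetical, not the actual frame format.

```python
# Sketch of the frame-eligibility threshold described above: a school is
# retained only if its aggregate enrollment across eligible grades is at
# least 40 students. School names and records are hypothetical.

def is_eligible(enrollment_by_grade, threshold=40):
    """Return True if aggregate enrollment across eligible grades meets the threshold."""
    return sum(enrollment_by_grade.values()) >= threshold

# Hypothetical middle-school frame (eligible grades 6-8):
frame = {
    "School A": {"6": 30, "7": 25, "8": 28},  # 83 students -> retained
    "School B": {"6": 12, "7": 10, "8": 9},   # 31 students -> removed
}
retained = {name for name, enr in frame.items() if is_eligible(enr)}
```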


Selection of PSUs. A total of 100 PSUs will be selected with probability proportional to the student enrollment in the PSU. The PSUs will be allocated to the urban/non-urban strata in proportion to the total eligible student enrollment in the stratum. This approach will increase the sampling efficiency by generating a nearly self-weighting sample.
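Probability-proportional-to-size selection can be illustrated with a generic systematic PPS sketch. This is a standard textbook method shown for illustration, not the contractor's actual implementation; the PSU labels and enrollment counts are hypothetical.

```python
import random

def pps_systematic_sample(units, n, rng=None):
    """Systematic PPS sampling: select n units with probability
    proportional to size. `units` maps PSU id -> eligible enrollment.
    A unit larger than the sampling interval may be selected twice."""
    rng = rng or random.Random(0)
    ids = list(units)
    total = sum(units[u] for u in ids)
    interval = total / n
    start = rng.uniform(0, interval)   # random start within the first interval
    selected, upper, i = [], units[ids[0]], 0
    for k in range(n):
        p = start + k * interval
        while p >= upper:              # advance to the unit containing point p
            i += 1
            upper += units[ids[i]]
        selected.append(ids[i])
    return selected

# Hypothetical PSUs with eligible enrollment counts:
psus = {"PSU1": 100, "PSU2": 300, "PSU3": 50, "PSU4": 250, "PSU5": 200, "PSU6": 100}
sample = pps_systematic_sample(psus, n=4)
```

Because selection points are spaced one interval apart, larger PSUs are proportionally more likely to contain a point, which is what makes the resulting student sample nearly self-weighting.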


Selection of Schools (Secondary Sampling Units, SSUs). Schools will be classified by enrollment size as small, medium, or large. Small schools have fewer than 28 students in one or more eligible grades. The remaining schools are classified as medium if they have fewer than 56 students in any of the eligible grades; otherwise, they are considered large schools.
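The three size classes reduce to a rule on the smallest eligible-grade enrollment. The sketch below restates that rule; the grade labels are illustrative.

```python
# Sketch of the small/medium/large classification described above,
# driven by the smallest enrollment among a school's eligible grades.

def classify_school(enrollment_by_grade):
    smallest = min(enrollment_by_grade.values())
    if smallest < 28:        # one or more eligible grades under 28 students
        return "small"
    if smallest < 56:        # otherwise, some eligible grade under 56 students
        return "medium"
    return "large"           # all eligible grades have 56 or more students
```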


The number of SSUs (schools) was increased in 2021 to ensure a sufficient sample of participating schools and students to allow for precision in all subgroup estimates of interest, given the potential for lower school response rates due to COVID-19. As such, two large schools are selected for each of the 100 sample PSUs, one for each school level (middle [grades 6-8], high school [grades 9-12]). An additional large school (for each school level) is selected in a subsample of 60 PSUs (320 large schools, total). Additionally, 80 medium schools (one for each school level from 40 subsample PSUs) and 50 small schools (one for each school level from 25 subsample PSUs) are selected from a random sample of all PSUs. In total, approximately 450 secondary sampling units (SSUs), or schools, are selected overall (320 Large + 80 Medium + 50 Small). The PSU subsamples will be drawn as a simple random sample and schools will be drawn with probability proportional to the measure of eligible students enrolled in a school.
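The school counts above reduce to simple arithmetic, shown here as a check:

```python
# Composition of the 2021 SSU (school) sample described above.
levels = 2                              # middle school and high school
large_base = levels * 100               # one large school per level in all 100 PSUs = 200
large_supplement = levels * 60          # one extra large school per level in 60 PSUs = 120
large = large_base + large_supplement   # 320 large schools
medium = levels * 40                    # one medium school per level in 40 subsample PSUs = 80
small = levels * 25                     # one small school per level in 25 subsample PSUs = 50
total_ssus = large + medium + small     # 450 SSUs overall
```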


Physical schools will be classified as "whole" for high schools if they have all high school grades (9th through 12th), and whole for middle schools if they have all middle school grades (6th through 8th). Otherwise, they will be considered "fragment" schools. Fragment schools will be linked with other schools (fragment or whole) to form a linked school that has all grades present for a given level. Thus, the actual number of physical schools will be larger (than the 450 SSUs, above) after the disaggregation of SSUs into physical school buildings.


Selection of Students. Classes are selected based on two scientific parameters that ensure a nationally representative sample. First, classes must be selected in such a way that all students in each grade at the school have a chance to participate. Second, all classes must be mutually exclusive, so that no student can be selected more than once. In each school, once we have determined the type of class or time period from which classes will be selected, we randomly select the appropriate number of classes within each grade. Examples of class sampling frames used in past cycles include all 2nd-period classes or a required physical education class. To maintain acceptable school response rates, it is essential that each school have input into the decision of which classes will be sampled; as long as the scientific sampling parameters are met, we work with each school to identify the classroom sampling frame that works best for that school. All students in a selected class will be selected for the study.
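Random selection of classes within each grade from a mutually exclusive frame can be sketched as follows; the class labels, frame, and one-class-per-grade default are hypothetical illustrations.

```python
import random

def sample_classes(classes_by_grade, classes_per_grade=1, rng=None):
    """Randomly select classes within each grade from a mutually
    exclusive class frame (e.g., all 2nd-period classes), so every
    student in a grade has exactly one chance of selection."""
    rng = rng or random.Random(1)
    return {grade: rng.sample(classes, classes_per_grade)
            for grade, classes in classes_by_grade.items()}

# Hypothetical middle-school frame of 2nd-period classes:
frame = {"6": ["6A", "6B", "6C"], "7": ["7A", "7B"], "8": ["8A", "8B", "8C"]}
selected = sample_classes(frame)
```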


To facilitate accurate prevalence estimates among racial/ethnic minority groups, the sampling design always seeks to balance increasing yields for minority students with overall precision. Prior cycles of the NYTS have successfully employed double class sampling to increase the number of non-Hispanic black and Hispanic students. In previous NYTS cycles, schools with high racial/ethnic minority populations were subject to double class selection. More specifically, two classes per grade were selected in these schools, compared to one class per grade in other schools, to increase the number of racial/ethnic minority students sampled. The 2021-2023 NYTS will use double class selection among schools with high racial/ethnic minority populations.


Refusals. School districts, schools, or students who refuse to participate in the study will not be replaced in the sample. We will record the characteristics of schools that refuse along with reasons given for their refusal for analysis of potential study biases.


The statistical methodology for stratification and sample selection will not change due to administering the 2021 NYTS as a virtual survey.


Estimation and Justification of Sample Size


The NYTS is designed to produce key estimates accurate to within ±5 percentage points at the 95% confidence level. Estimates by grade, sex, and grade cross-tabulated by sex meet this standard. The same standard is used for the estimates for racial/ethnic groups by school level (middle and high school).


The derivation of sample sizes is driven by these precision levels for subgroup estimates, specifically for the smallest subgroups defined by grade and by sex. We propose to replicate key aspects of the sampling design utilized in prior cycles of the NYTS. Refinements typically occur in response to the changing demographics of the in-school population and to meet CDC’s policy needs. For example, increasing percentages of minority students will likely lead to more efficient sampling of minority students. In addition, the proposed design will more effectively oversample non-Hispanic black students by increasing the sampling intensity in those schools with high concentrations of non-Hispanic black students.


The anticipated total number of participating students is a minimum of 24,000, as developed in Attachment L. A participating-student count in excess of 24,000 was also used in the 2020 NYTS cycle to generate approximately equivalent effective sample sizes and precision levels overall. Specifically, the 2021 NYTS sampling design will aim to balance student yields by grade, with target sample sizes of approximately 3,453 participating students per grade (for a total of 24,172 students). This will also ensure the precision of estimates by individual grade (e.g., sex-by-grade subgroup estimates based on at least 1,700 students).

In calculating the sample sizes for the 2021 NYTS, we made our approach more robust by assuming a conservative combined response rate of 42.5% (50% school response rate x 85% student response rate), substantially lower than the historical overall response rate. These rates are closer to the more recent experience at both levels. The main reason is to account for higher levels of anticipated school refusals due to COVID-19 precautions in the 2020/2021 school environment. A secondary reason is that the student participation rate needs to be adjusted to account for a growing number of ineligible students. This augmentation will be implemented by selecting a supplemental school sample in subsample PSUs, as described above.

The 2021 sample will include 160 large SSUs per level (middle school and high school); the sample will also include 40 medium SSUs per level and 25 small SSUs per level, for a total of 225 SSUs selected per level, or 450 SSUs in total. Of the 160 large SSUs selected per level, 100 will be assigned double class sampling (i.e., the remaining 60 will have one class selected per grade). An SSU will be classified as "whole" for high schools if it has all high school grades (9th through 12th), and whole for middle schools if it has all middle school grades (6th through 8th); otherwise, it will be considered a "fragment" school. Fragment schools will be linked with other schools (fragment or whole) to form a linked school that has all grades present for a given level. We will link schools before sampling using an algorithm that links geographically proximate schools. Linked schools are treated as SSUs, with selection performed at the grade level. The linked schools yield a total of 509 schools selected at the secondary stage in the 2021 NYTS.
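The response-rate assumptions above imply the following arithmetic. The required-selection figure is a derived illustration from the stated rates, not a number stated in the sampling plan.

```python
# Worked figures from the sample-size assumptions above.
school_rr = 0.50
student_rr = 0.85
combined_rr = school_rr * student_rr      # 0.425, the conservative combined rate

target_participants = 24_000
# Students that must be selected to net 24,000 participants at the
# assumed combined response rate (a derived illustration):
required_selected = target_participants / combined_rr   # about 56,471

per_grade_target = 3_453                  # stated per-grade participation target
grades = 7                                # grades 6 through 12
```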


Estimation and Statistical Testing Procedures


Sample data will be weighted by the inverse of the probability of case selection and adjusted for non-response. The resulting weights will be trimmed to reduce mean-squared error. Next, the strata weights will be adjusted to reflect true relative enrollments rather than relative weighted enrollment. Finally, the data will be post-stratified to match national distributions of middle and high school students by race/ethnicity and grade. Variances will be computed using linearization methods.
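The weighting sequence just described (base weights, nonresponse adjustment, trimming, post-stratification) can be sketched on toy values. The functions, selection probability, response rate, and trimming cap below are illustrative assumptions, not the NYTS weighting specification.

```python
# Minimal sketch of the weighting steps described above, on toy values.

def base_weight(selection_prob):
    """Inverse of the probability of case selection."""
    return 1.0 / selection_prob

def nonresponse_adjust(weight, response_rate):
    """Inflate respondents' weights to cover nonrespondents in their cell."""
    return weight / response_rate

def trim(weight, cap):
    """Trim extreme weights to reduce mean-squared error."""
    return min(weight, cap)

def poststratify(weights, control_total):
    """Scale weights so their sum matches a known population control total."""
    factor = control_total / sum(weights)
    return [w * factor for w in weights]

# Toy case: selection probability 0.002, 85% response, trimming cap of 700.
w = trim(nonresponse_adjust(base_weight(0.002), 0.85), 700)
```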


Confidence intervals vary depending upon whether an estimate represents the full population or a subset, such as a particular grade, sex, or racial/ethnic group. Within a grouping, they also vary depending on the level of the estimate and the design effect associated with the measure.


Based on prior NYTS cycles, as well as on the precision requirements that have driven the sampling design, we can expect the following subgroup estimates to be within ±5 percentage points at the 95% confidence level:


  • Estimates by grade, sex, and grade cross-tabulated by sex


  • Racial/ethnic minority group estimates for non-Hispanic blacks and Hispanics cross-tabulated by school level


The former estimates will be derived from projected sample sizes of 3,428 participating students per grade, and therefore, approximately 1,700 by sex within grade. For the latter estimates, the anticipated number of participants in each minority group is at least 1,500 per school level. For conservative design effect scenarios (design effects as large as 3.0), estimates based on these subgroup sample sizes will be within +/- 5 percentage points at the 95% confidence level.
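The precision claim can be checked with the usual half-width formula under a design effect. This is a generic calculation for illustration, not the linearization variance estimator used in production.

```python
import math

def ci_half_width(n, deff=3.0, p=0.5, z=1.96):
    """95% CI half-width for a prevalence p under a complex design:
    the effective sample size is n / deff."""
    n_eff = n / deff
    return z * math.sqrt(p * (1 - p) / n_eff)

# 1,500 participants per racial/ethnic group per school level, deff 3.0:
hw_minority = ci_half_width(1500)   # about 0.044 -> within +/- 5 points
# 1,700 participants per sex-within-grade cell:
hw_sex_grade = ci_half_width(1700)  # about 0.041 -> within +/- 5 points
```

Using p = 0.5 is the conservative worst case, since p(1 - p) is largest there; any other prevalence gives a narrower interval.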


The NYTS data are used for trend analyses where data for successive cycles are compared with statistical testing techniques, when applicable. Statistical testing methods are also used to compare subgroup prevalence estimates (e.g., male versus female students) for each cycle of the NYTS. These tests will be performed with statistical techniques that account for the complex survey design. The 2019 NYTS will serve as the baseline year for assessing trends based on electronically-collected data, though potential changes in tobacco product use behaviors due to the COVID-19 pandemic in conjunction with the implementation of a completely web-based survey will also be taken into account when examining 2021 NYTS data with respect to prior years.



Survey Instrument


The 2021 NYTS questionnaire (Attachment H1) contains 166 items. The first set of questions gathers demographic data. Most of the remaining questions address the following tobacco-related topics: tobacco use (e-cigarettes, cigarettes, smokeless tobacco (chewing tobacco/snuff/dip), cigars (cigars, little cigars, cigarillos), hookah, roll-your-own tobacco, pipes, snus, dissolvable tobacco products, bidis, heated tobacco products, and nicotine pouches), knowledge and attitudes, media and advertising, minors' access and enforcement, cessation, and environmental exposure to tobacco smoke from combustible tobacco products and secondhand aerosol from e-cigarettes.


The questionnaire incorporates skip logic to tailor the questionnaire based on respondents' tobacco product use behaviors. Thus, respondents are not asked to respond to questions that do not apply to them, reducing overall respondent burden. Product images also are included in the electronic questionnaire to improve product recognition and recall. Given the efficiencies gained by transitioning to an electronic administration, previous "check all that apply" type questions related to flavored tobacco use and access to tobacco products are now asked separately for each specified tobacco product. This allows for differentiation in patterns of use for individual products.



Data Collection Procedures


ICF International, Inc. serves as the data collection contractor for NYTS (see Section B.6). The following describes the data collection procedures used for an on-campus administration of NYTS (the approach used through the 2020 NYTS). The section that follows describes modifications to these procedures for conducting the NYTS virtually in 2021 due to the COVID-19 environment.


On-campus Administration (Standard NYTS Data Collection Procedure)

Data will be collected by professional data collectors who are specially trained to conduct the NYTS. Data collectors for the electronic administration were hired based on their working comfort with the technology (tablets, personal hotspots, web applications) and their ability to transport and lift the equipment (up to 50 pounds). The time during the school day in which the survey is administered varies by school. This decision is made in coordination with each school to ensure that the type of class or period of the day selected for sampling: 1) meets the scientific sampling parameters to ensure a nationally representative sample; and 2) results in the least burden/highest possible acceptability for the school. Each data collector will have direct responsibility for administering the survey to students.


Teachers are asked to distribute and follow up on parental permission forms sent out prior to the scheduled date of data collection. Teachers are provided with a parental permission form distribution script (Attachment H2) to follow when distributing permission forms to students. Teachers can use the Data Collection Checklist (Attachment G1; instructions for completion provided in Attachment G2, “Letter to Teachers in Participating Schools”) to track which students have received parental permission to participate in the data collection. The data collector will utilize the information on the Data Collection Checklist to identify students eligible for a make-up survey administration; this information will be recorded by the data collector on the “Make-up List and Instructions” document (also included in Attachment G1). Data collectors will leave instructions and access codes for eligible non-participating students to complete make-up surveys. Teachers will provide instructions, pass out access codes, and oversee the completion of make-up surveys upon students’ return to class.


Once inside a classroom, data collectors communicate with the teacher on the status of permission form distribution, parental refusals, and completion of the Data Collection Checklist (DCC). Teachers are asked to identify students without parental consent to participate and to make sure those students have appropriate alternate activities.


The data collector reads verbatim to eligible students a prepared script (Attachment H9) that emphasizes anonymity and the voluntary nature of the survey. Data collectors distribute tablets and instruction cards providing students’ unique survey access code as prompted in the script. Once the script is read fully and materials are distributed, students follow the instructions on the cards to unlock the tablet, enter the survey application (using their access code), and begin answering.


As students finish, data collectors collect the tablets and instruction cards. In the process, they check that the survey is submitted and that the application is ready for the next student’s use. If the application is not on the submission screen, the tablet is returned to the student and the data collector gives the student instructions on how to close the survey. (Data collectors are instructed in this protocol in an effort to avoid seeing any student responses that might be visible on screen). Tablets are carefully counted to make sure that all are returned, and the instruction cards are stored securely so that they are not accidentally distributed again to a later class.


Instructions and materials are left for teachers to administer make-up surveys to eligible nonparticipants upon their return to class. Materials include a survey administration script, a make-up list showing names/identifiers of eligible students, instruction cards with unique survey access codes for each eligible student, and a troubleshooting guide for the web-based make-up survey. As noted previously, the web-based survey is programmed to mimic the tablet-based survey. Teachers are instructed that students need to complete the make-up survey at school using an internet-connected device (personal or school-provided). Upon submission, make-up surveys are automatically incorporated into the counts for the appropriate school and class in the Data Collection Management Application (DCMA), the contractor’s system used to monitor, track, and report on fielding activities.


Once data collection at a school is concluded, data collectors record in the DCMA the number of eligible students in each selected class (based on the documentation on the DCC) and the expected number of completed surveys (based on their observation in the classroom). If a class did not participate for any reason, this is also documented in the DCMA, and the class is tagged for follow-up to ensure its participation.


At the end of each day's data collection, data collectors sync the tablet data to ICF's secure server by connecting each tablet to the internet via a provided secure MiFi hotspot device. When this manual sync is complete, a screen appears with a timestamp noting the last successful sync. Data synchronization also can occur automatically, with the tablet "pushing" the data at preset intervals (usually 5 minutes) after connecting to a WiFi signal.


In general, data collection procedures have been designed to ensure that:

  • Protocol is followed in obtaining access to schools;

  • Everyday school activity schedules are disrupted minimally;

  • Administrative burden placed on teachers is minimal;

  • Parents give informed permission to participate in the survey;

  • Anonymity of student participation is maintained, with no punitive actions against non-participants;

  • Alternative activities are provided for nonparticipants;

  • Control over the quality of data is maintained.


Virtual Administration

When this data collection package was developed and submitted, CDC was anticipating being able to conduct on-campus data collection for the 2021 NYTS. The administration of the 2020 NYTS, however, was ended early in March 2020 due to COVID-19. As the year progressed, CDC anticipated several concerns due to the pandemic that could impact standard data collection procedures for 2021 NYTS. These include, but are not limited to:

  • Hybrid instructional models that include both on-campus and distance learning;

  • Small class sizes in order to maintain social distancing;

  • Restrictive visitor policies that may not allow data collectors to enter schools;

  • Policies that restrict pupils sharing supplies, inclusive of project tablets;

  • Travel restrictions or quarantine procedures (state-specific);

  • Perceived personal risks to data collectors;

  • Increased district and school resistance due to focus on adapting to “a new normal.”


As a result of the pandemic, modifications to standard NYTS data collection procedures have been incorporated to allow for virtual participation by eligible students from sampled schools and classrooms that are currently engaged in distance learning. This virtual data collection model is a proactive approach designed to anticipate remote/distance learning shifts within schools as they address COVID-19, allow for schools to participate without undue burden on the teacher or school, and ensure that study protocols are followed.


Supplemental changes to the 2021 data collection methods are described in the sections that follow:


Use of Technical Assistance Providers (Instead of Data Collectors)

On-campus data collectors will not be utilized for any data collections in 2021. Instead, off-site Technical Assistance Providers (TAPs) will be utilized to virtually support data collection activities in schools and to ensure that study protocols are followed. All TAPs will complete a required training covering topics similar to those covered for professional data collectors. TAPs will be trained to virtually support all components of a data collection, including, but not limited to:

  • Confirming receipt of survey administration materials;

  • Ensuring the distribution of parental permission forms;

  • Confirming accessibility of survey URL;

  • Answering questions from the designated school contact and/or teachers before, during, or after the survey administration;

  • Monitoring completion of survey administration in classrooms;

  • Following up on missing classes or missing classroom enrollment information;

  • Obtaining school level enrollment information and school incentive forms.


Instructional Videos for Students and Teachers

Schools and teachers will receive instructions on how to access important information about the survey administration procedures (Attachments F2, M1-M4). Additionally, two brief, high-quality videos will be produced for teachers (a “pre-survey” video and a “day-of-survey” video) to provide detailed instructions on the survey administration procedures (Attachment N) and on how to accurately record their classroom enrollment and participation information via an online Teacher’s Landing Page/Class Enrollment Form (Attachment O). Instructions for how to access these videos and additional information are provided on the Summary of School Arrangements (SSA) forms (Attachment P) as well as in a personalized “Dear Teacher” letter (Attachments M1-M4). These instructions differ slightly depending on the type of permission form used (active or passive) and the learning environment (on-campus or virtual). Teachers may reach out to the TAP assigned to the school if they need additional guidance or have any difficulties.


Verbal instructions previously communicated to students by the data collector prior to the start of the survey will be replaced by a brief, high-quality instructional video that will play after students log into the survey but before any questions are displayed (Attachment N). This will ease the burden on teachers and ensure that all students receive the same instructions for the survey administration. To maintain an environment conducive to taking the survey, students attending class on campus will be provided with earbuds so that they can hear the audio associated with the pre-survey instructional video.


Web-Based Survey

The 2021 NYTS will be administered electronically using a pre-programmed web-based survey. Participating students will access the survey using an internet-connected device. The web-based survey is designed to simulate the tablet-based survey. As in previous years, the web-based survey also will be used to obtain make-up surveys from eligible students who are absent on the initial date of data collection.


Allowances for Virtual/Distance Learning

For the 2021 administration, all eligible students within a selected class will be invited to participate, even those currently engaged in virtual/distance learning. Students will still be asked to take the survey during the scheduled “class” time. To facilitate analyses related to setting effect, students will be asked a question about where they are physically taking the survey (in school, at home, somewhere else). All students will log in to the survey using an access code; these access codes will vary depending on the instructional model at the school (on-campus vs. virtual).


Class Attendance Procedures

For students physically attending in the classroom, the standard data collection method, with modifications for virtual administration of the survey, will be used. The following steps will occur:

  • The designated school contact will receive the survey administration materials, packaged by class, and will distribute the materials to the teacher of each selected class.

  • On the day of the scheduled survey administration, the teacher will distribute a sign-in card to each eligible student containing a unique 5-digit student-level access code.

  • Each student will use an internet-connected device and will log in to the NYTS survey using the provided URL and their unique 5-digit student-level access code.

    • This access code is a single use code and is only intended and designed to be used by one respondent.

  • Make-up surveys will be conducted with eligible students who were absent on the day of survey administration using the unused, unique student-level access codes on the printed sign-in cards (those remaining from the provided survey administration materials for each respective class).


Virtual/Distance Learning Procedures

To accommodate eligible students in a virtual/distance learning environment or in any other case where students are not in a classroom together, supplemental data collection methods will be used. The following steps will occur:

  • The designated school contact will receive the survey administration materials, packaged by class, and will distribute the materials to the teacher of each selected class, either in hard copy or electronically, based on whether or not the teachers are physically in the school building or teaching remotely.

  • On the day of the scheduled survey administration, the teacher will provide the students with the survey URL and a 5-digit classroom-level access code.

  • Each student will use an internet-connected device and will log in to the NYTS web-based survey using the provided URL and the 5-digit classroom-level access code provided by their teacher.

    • As each student logs into the survey using the classroom-level access code, the system will assign a unique 5-digit student-level access code to each student.

    • The unique student-level access code will be displayed on the screen and students will be directed to write down their personalized access code.

  • Make-ups will be conducted with eligible students using the classroom-level access code. Technical Assistance Providers (TAPs; see below) will work with the school contact and teachers to determine when all make-ups have been completed and the classroom-level access code can be disabled.


The classroom-level access code is used in virtual learning environments to ease the administrative burden on the teacher (e.g., distributing printed, individual student sign-in cards is not possible). Additionally, this approach addresses privacy concerns by avoiding linking unique student-level access codes to individual students (such as through email or other electronic forms of communication). This model is only meant to supplement the current standard data collection model to account for these unique virtual learning environments.


In either learning environment, if a student’s internet connection is disrupted, or if they need to step away from the survey before submitting their responses, they may log out of the web survey application and log back in using their unique 5-digit student-level access code. This keeps their responses private and allows the student to pick up where they left off and complete the survey.
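As an illustration, the access-code scheme described above could be sketched as follows. The `AccessCodeRegistry` class, its method names, and the code format are assumptions for this sketch, not the actual NYTS survey software; it shows how printed single-use student codes, shared classroom codes that mint personal codes at login, and resumable sessions can coexist.

```python
import random

class AccessCodeRegistry:
    """Illustrative sketch of the NYTS access-code scheme (not the actual survey software)."""

    def __init__(self, seed=0):
        self._rng = random.Random(seed)
        self._issued = set()           # every 5-digit code handed out so far
        self._classroom_codes = set()  # shared codes for virtual classes
        self._disabled = set()         # classroom codes retired after make-ups
        self._sessions = {}            # student code -> saved responses (supports resume)

    def _new_code(self):
        # Generate an unused 5-digit code.
        while True:
            code = f"{self._rng.randint(0, 99999):05d}"
            if code not in self._issued:
                self._issued.add(code)
                return code

    def print_sign_in_cards(self, n_students):
        """In-person model: pre-print one single-use student-level code per eligible student."""
        return [self._new_code() for _ in range(n_students)]

    def create_classroom_code(self):
        """Virtual model: one shared code per class; students receive a personal code at login."""
        code = self._new_code()
        self._classroom_codes.add(code)
        return code

    def log_in(self, code):
        """Return the student-level code for this session, minting one if a classroom code is used."""
        if code in self._disabled:
            raise ValueError("this access code has been disabled")
        if code in self._classroom_codes:
            student_code = self._new_code()  # system assigns a personal code on the spot
        elif code in self._issued:
            student_code = code              # printed sign-in card, or a resumed session
        else:
            raise ValueError("unrecognized access code")
        self._sessions.setdefault(student_code, [])
        return student_code

    def disable_classroom_code(self, code):
        """Once make-ups are complete, the TAP has the shared classroom code disabled."""
        self._classroom_codes.discard(code)
        self._disabled.add(code)
```

Because the student-level code keys the saved session, a student whose connection drops can log back in with the same code and resume, while the shared classroom code can later be retired without affecting completed sessions.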



Obtaining Access to and Support from Schools


All initial letters of invitation to participate in the NYTS will be on CDC letterhead from the Department of Health and Human Services and signed by Linda Neff, PhD, MSPH, Chief of the Epidemiology Branch at the Office on Smoking and Health, National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP) at CDC. The procedures for gaining access to and support from states, districts, and schools will have three major steps:


  • First, support will be sought from State Education Agencies and State Departments of Health. The initial request will be accompanied by a study fact sheet and a list of all sampled districts and schools in their jurisdiction. States will be asked to provide general guidance on working with the selected school districts and schools and to notify school districts that they may anticipate being contacted about the survey.


  • Once cleared at the state level, an invitation packet will be sent to sampled school districts in the state. Districts will receive a list of schools sampled from within their district in the invitation packet and will be asked to provide general guidance on working with them and to notify schools that they may anticipate being contacted about the study. Telephone contact will be made with the office comparable to the district office (e.g., diocesan office of education), if there is one. Some districts require that a research proposal be submitted and approved to conduct scientific studies among their students. This practice is becoming more common; for the 2019 NYTS, 39 districts (representing 70 sampled schools) required research proposals. The format, length, and timeline for these proposals vary by district. For these districts, sampled schools cannot be contacted for individual participation until the research proposal is approved.


  • Once cleared at the school district level, selected schools will be invited to participate. Information previously obtained about the school will be verified. The burden and benefits of participation in the survey will be presented. After a school agrees to participate, a tailor-made plan for collection of data in the school will be developed (e.g., select classes, determine whether the survey will be administered to selected class sections simultaneously or in serial). Well in advance of the agreed upon survey administration date, schools will receive the appropriate number of parental consent forms and instructions. All materials needed to conduct the survey will be provided by the data collector visiting the school. Contact with schools will be maintained until all data collection activities have been completed.


Prior experience suggests the process of working with each state’s health and education agencies, school districts and schools will have unique features. Communication with each agency will recognize the organizational constraints and prevailing practices of the agency. Scripts for use in guiding these discussions may be found in Appendices D1 (state-level), E1 (district-level), and F1 (school-level). Copies of letters of invitation can be found in Attachment D2 (state-level); Attachment E2 (district-level); and Attachment F2 (school-level). Attachment F2 also contains the NYTS Fact Sheet for Schools. Attachment F3 contains a copy of the letter sent to school administrators once they have agreed to participate.


Standard recruitment procedures remain largely unchanged as a result of the pandemic; however, on-campus recruitment visits will not be possible due to COVID-19 precautions.



Informed Consent


The permission form informs both the student and the parent about an important activity in which the student has the opportunity to participate. By providing adequate information about the activity, it helps ensure that permission to participate will be informed. Copies of the active and passive permission forms are contained in Appendices H3 and H4 (English versions) and H5 and H6 (Spanish versions). In accordance with the No Child Left Behind Act, the permission forms indicate that a copy of the questionnaire will be available for review by parents at their child’s school.


A waiver of written student assent is obtained for the participation of children because this research presents no more than minimal risk to subjects; parental permission is required for participation. The waiver will not adversely affect the rights and welfare of the students because they are free to decline to take part, and it is thought that some students may perceive they are not anonymous if they are required to provide stated assent and sign a consent/assent document. Students are told “Participating in this survey is voluntary and your grade in this class will not be affected, whether or not you answer the questions.” Completion of the survey implies student assent.


As a means to monitor the parental permission form process and to ensure questionnaires are completed only by students for whom permission has been obtained, teachers are asked to enter student names on the Data Collection Checklist (similar to a class roll) (Attachment G1); for virtual administration, an online version will be made available. Teachers can substitute any other information in place of student names (such as student ID numbers or letters) on the Data Collection Checklist as long as it will allow them to individually determine which students received parental permission to participate. This information will be conveyed to the data collector on the survey administration day.


The Data Collection Checklist is an optional tool to assist in managing the parental permission and student assent process. It will be destroyed at the end of the study. No individually identifiable information is collected on the NYTS survey (e.g., student name, class, school, etc.), therefore there is no way to connect students’ names to their response data.


NYTS is required by law to notify parents of students selected for NYTS surveys that their child has been selected and that student participation is voluntary. Schools may use various processes to obtain parental permission and various forms of notification (electronic, such as email, or a hard-copy letter), either provided by the state or developed by the school. However, the notification shall include the following elements:


  • this school will be participating in NYTS and your child’s classroom may be/is selected to participate;

  • a brief description of the nature and importance of NYTS;

  • all responses are confidential, and results will not be reported to or about individual students or schools; and

  • your child may be excused from participation for any reason, is not required to finish the survey, and is not required to answer any survey questions.


For the 2021 virtual administration, parental permission is still required. Parental permission forms will be sent home with students who attend school on-campus. For those attending school virtually, an electronic version of the parental permission forms will be sent to the school; teachers will then send (via email) the permission form to the parent or guardian of each student in their class. Teachers will still be required to track permission forms as they come back, through the online version of the Data Collection Checklist used in the virtual administration. Only students who receive parental permission will be allowed to participate. As with previous administrations, no names will be collected as part of this procedure.


Quality Control


Table B.2 lists the major means of quality control. As shown, the task of collecting quality data begins with a clear and explicit study protocol, is supplemented with accurate programming of the NYTS questionnaire, and concludes with the regular submission of data records to a secure repository. In between these activities, and subsequent to data collector (or Technical Assistance Provider, TAP) training, measures must be taken to reinforce training, to assist field staff (or TAPs) who express/exhibit difficulties completing data collection activities, and to check on data collection techniques. Also, early inspection of a preliminary data set is necessary to ensure data integrity. Because the ultimate aim is production of a high-quality data set and reports, various quality assurance activities will be applied during the data collection phase.


As noted previously, the virtual data collection for 2021 will rely on Technical Assistance Providers (TAPs) instead of traditional on-campus data collection field staff. However, TAPs will still undergo thorough training prior to fielding the 2021 NYTS.


Table B.2: Major Means of Quality Control

Survey Step

Quality Control Procedures

Mailing to Districts and Schools

  • Validate district and school sample to verify/update contact information of district/diocese/school leadership (100%)

  • Check inner vs. outer label for agreement in correspondence (5% sample)

  • Verify that any errors in packaging were not systematic (100%)

Follow-up with School Contacts

  • Monitor early sample of recruitment inquiries to ensure that the recruiter follows procedures, elicits proper information, and has proper demeanor (10%)

  • Perform spot checks on recruiters’ class selection outcomes to confirm procedures were implemented according to protocol (10%)

Pre-Fielding Logistics

Verification

  • Review data collection procedures with school personnel in each school to ensure that all preparatory activities are performed properly in advance of survey fielding (e.g., distribution of permission forms) (100%)

Staff or TAP Training and Supervision of Survey Fielding

  • Issue quizzes during field staff/TAP training to ensure that key concepts are understood (daily during training)

  • Maintain at least one weekly telephone monitoring of all field staff/TAPs throughout data collection (100% of field staff/TAPs)

  • Reinforce training and clarify procedures through periodic conference calls with field staff (100% of field staff/TAPs)

  • Verify by telephone or email with a 10% sample of early schools that all data collection procedures are being followed

Questionnaire Programming and Testing

  • Ensure verbatim wording of displayed text to that of the analyst/programmer version of the questionnaire (100% of question and instructional text)

  • Use a variety of user profiles and behavior combinations to test correct and appropriate skip logic and routing through the questionnaire (100% of questions with programmed “triggers”)

  • Verify that any write-in responses are within prescribed ranges (100% of write-in questions)

  • Create “dummy data set” to verify that all entered responses are correctly captured in the data set as intended (minimum 50 records)

Receipt Control

  • Verify receipt of data from the field is occurring no later than 48 hours after data collection concludes (100% of schools)

  • Verify that the number of data records received in the database matches the number of expected records reported by field staff (100% of schools)

  • Capture date/time stamps and staff credentials in the centralized system for all transactions (100%)

Data Review

  • During fielding, extract records from at least three schools to verify data set is capturing and storing records as expected (during first week of fielding, or after at least three schools’ data have been collected and synced)
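As an illustration, the receipt-control checks in Table B.2 could be automated along the following lines. The record layout and field names here are assumptions for the sketch, not the actual NYTS data system; the function flags schools whose data arrive more than 48 hours after collection or whose record counts do not match the expected counts.

```python
from datetime import datetime, timedelta

def receipt_control_flags(schools):
    """Flag schools whose data arrived late (more than 48 hours after data
    collection concluded) or whose received record count does not match the
    count reported by field staff. Each school is assumed to be a dict with
    'id', 'collected_at', 'received_at' (datetimes), 'expected_records',
    and 'received_records'."""
    flags = []
    for school in schools:
        if school["received_at"] - school["collected_at"] > timedelta(hours=48):
            flags.append((school["id"], "late receipt"))
        if school["received_records"] != school["expected_records"]:
            flags.append((school["id"], "record count mismatch"))
    return flags

# Hypothetical example: school S02 uploaded three days late and is short two records.
example = [
    {"id": "S01", "collected_at": datetime(2021, 2, 1, 9, 0),
     "received_at": datetime(2021, 2, 2, 9, 0),
     "expected_records": 120, "received_records": 120},
    {"id": "S02", "collected_at": datetime(2021, 2, 1, 9, 0),
     "received_at": datetime(2021, 2, 4, 9, 0),
     "expected_records": 95, "received_records": 93},
]
```

In this example only S02 would be flagged, once for each failed check, mirroring the 48-hour and record-count criteria in the table above.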



B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE

Expected Response Rates


Across 16 cycles, the NYTS has maintained exceptional student and school response rates. We have averaged a 72.4% overall (school x student) response rate. The school response rate has averaged 80.8% with a low of 49.9%, and student response rate has averaged approximately 89.5% with a low of 85.9%. Notably, the lowest school response rate recorded for NYTS across these cycles (49.9%) corresponded to the 2020 NYTS, which occurred as a result of school closures due to the COVID-19 pandemic; however, the corresponding student response rate was high (87.4%), resulting in an overall response rate of 43.6%. In comparison, for 2019 NYTS, the school response rate was 77.2% and the student response rate was 85.8%, for an overall response rate of 66.3%.
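The overall rate cited above is simply the product of the school-level and student-level rates, which can be checked directly (note that products of the rounded component rates can differ from the published overall rate by a tenth of a point):

```python
def overall_response_rate(school_rate_pct, student_rate_pct):
    """Overall (school x student) response rate, in percent."""
    return school_rate_pct * student_rate_pct / 100.0

print(round(overall_response_rate(49.9, 87.4), 1))  # 2020 cycle -> 43.6
print(round(overall_response_rate(77.2, 85.8), 1))  # 2019 cycle -> 66.2 (published as
                                                    # 66.3, from unrounded components)
```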


NYTS response rates traditionally have been relatively high compared to other federally funded, national, school-based, health-related surveys of high school students. For example, the widely cited Monitoring the Future survey (formerly known as the High School Senior Survey) achieves substantially lower response rates. The response rates established by the NYTS are the product of the application of proven and tested procedures for maximizing school and student participation. For 2021, we assumed a more conservative overall response rate (student x school) of 42.5% in determining the sample size calculations, which is closer to the more recent experience at both levels. The main reason is to account for higher levels of anticipated school refusals due to COVID-19 precautions in the 2020/2021 school environment.


As indicated in Table A.16 in Supporting Statement A, it is desirable to complete data collection before the final month of school (i.e., by mid-April to mid-May, depending on location). Many schools are very busy at that time with standardized testing and final exams; in addition, attendance can be very unstable, especially among twelfth grade students.

Methods for Maximizing Responses and Handling Nonresponse


We distinguish among six potential types of nonresponse problems: refusal to participate by a selected school district, school, teacher, parent, or student; and collection of incomplete information from a student.


To minimize refusals at all levels--from school district to student--we will use a variety of techniques, emphasizing the importance of the survey. Given the subject matter is tobacco, we expect that a few school districts or schools will need to place the issue of survey participation before the school board. To increase the likelihood of an affirmative decision, we will: (1) work through the state agencies to communicate their support of the survey; (2) indicate that the survey is being sponsored by CDC; (3) convey to the school district or school that the survey has the endorsement of many key national educational and health associations, such as the National PTA, American Medical Association, National Association of State Boards of Education, Council of Chief State School Officers, and the National School Boards Association; (4) maintain both a toll-free hotline and dedicated email account to answer questions from the school board; (5) offer a package of educational products to each participating school, as recommended by OMB in approving the 1998 YRBS in alternative schools (OMB No. 0920-0416, expiration 12/98) and implemented on NYTS ever since; (6) comply with all requirements from school districts in preparing written proposals for survey clearance; (7) convey a willingness to appear in person, if needed, to present the survey before a school board, research committee, or other local entity tasked with reviewing the survey; and (8) offer schools a monetary incentive of $500.


When recruiters encounter district or school refusals to participate, they are encouraged to listen closely to the decision maker’s concern(s) and “leave the door open” with the districts and/or schools for additional contact later. NYTS staff will begin targeted outreach to refusing and unresponsive districts and schools during the data collection period. Initially, this targeted outreach primarily includes remailing the original invitation packet with a personalized cover memo and telephone follow-up by the original recruiter. Remailing presents a fresh request to which the decision maker can respond, since the original request has likely been lost or discarded. Recruiters are encouraged to identify relevant potential connections between NYTS content and district or school health-related initiatives via strategic plans and curriculum descriptions available on district and school websites and to include these in the memo and/or in their conversations with the decision maker. Sample memos utilized for re-mails can be found in Attachment I. Other methods include 1) contact by a different recruiter who provides a “new voice” to the decision maker or 2) on-campus recruitment visits by field staff.

The sampling plan does not allow for the replacement of schools that refuse to participate due to concern that replacing schools would introduce bias. All participating state departments of health and education, school districts, and schools also will have access to the published survey results.


Maximizing responses and dealing with refusals from parents, teachers, and students require different strategies. To maximize responses, we will recommend that schools help to advertise the survey through the principal’s newsletter, PTA meetings, and other established means of communication. Reminders (Attachments H7 and H8) will be sent to parents who have not returned parental permission forms within an agreed upon time period (e.g., three days); those who do not respond to the reminder will be sent a second and final reminder. The permission form will provide a telephone number at CDC that parents may call to have questions answered before agreeing to give permission for their child’s participation. Permission forms will be available in English, Spanish, and any other languages spoken by a large percentage of parents in a given school district. Field staff will be available on location to answer questions from parents who remain uncertain about granting permission. Bilingual field staff will be used in locations with high Hispanic concentrations (e.g., California, Florida, New York City, and Texas).


Teacher refusals to cooperate with the study are not expected to be a problem because schools will already have agreed to participate. Refusals by students who have parental permission to participate are expected to be minimal. No punitive action will be taken against a nonconsenting student. Nonconsenting students will not be replaced. Data will be analyzed to determine if student nonresponse introduces any biases.


Participation in the NYTS is completely voluntary, and students may skip any question they are not comfortable answering. However, to minimize the likelihood of missing values on the survey, particularly on questions on which skip patterns are contingent, the digital-based questionnaire has data range validations and prompts. Thus, students who skip questions on ever or current use of individual tobacco products or who give out-of-range answers will be reminded to provide an appropriate response in a pop-up in the digital-based questionnaire before proceeding further in the survey. In the 2021 NYTS survey (Attachment H1), programming instructions for missing or “not answered (NA)” responses to crucial questions are provided (see question 6, ever e-cigarette use, for an example). The prompt, or soft validation, reads, “You skipped this question. Please provide a response or use the arrows at the bottom of the screen to continue.” For questions that ask participants to specify a written numerical response (e.g., to enter the number of days they have used a product), programming instructions are provided to allow only a specific range of values. In the 2021 survey, please see question 9, current e-cigarette use, as an example of where programming ranges are employed in the NYTS. Additionally, once students finish the survey, a summary table will display a list of all unanswered question items before the student submits the survey. This provides students a chance to go back directly to each unanswered question item, if they choose to do so. In 2021, the pre-survey instructional video for students will provide an overview of how to properly navigate the survey and successfully submit their responses. In addition, teachers will be guided by TAPs to ensure that students are able to troubleshoot any technical issues.
Should NYTS revert to an on-campus mode of administration on tablets beyond 2021, data collectors will check to be sure that the survey is submitted.
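As a sketch, the soft-validation behavior described above could be implemented along these lines. The function names, the illustrative 0-30 day range, and the question IDs are assumptions; the skip prompt text is quoted from the specification above.

```python
SKIP_PROMPT = ("You skipped this question. Please provide a response or use "
               "the arrows at the bottom of the screen to continue.")

def validate_response(answer, valid_range=None):
    """Soft validation: return a prompt string to display, or None if the
    answer is accepted. `valid_range` is a (low, high) tuple for numeric
    write-in items (e.g., number of days a product was used)."""
    if answer is None or answer == "":
        return SKIP_PROMPT  # remind the student, but do not block them
    if valid_range is not None:
        low, high = valid_range
        try:
            value = int(answer)
        except ValueError:
            return f"Please enter a number between {low} and {high}."
        if not low <= value <= high:
            return f"Please enter a number between {low} and {high}."
    return None

def unanswered_summary(responses):
    """End-of-survey summary table: list the item IDs still unanswered."""
    return [qid for qid, answer in responses.items() if answer in (None, "")]
```

The key design point is that validation is "soft": a skipped question produces a reminder rather than a hard stop, and the end-of-survey summary gives students one final chance to revisit unanswered items before submitting.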



B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN


The NYTS core questionnaire items underwent cognitive analyses by RTI in 1999, during the initial NYTS cycle. Further cognitive analyses or pretests of the survey were conducted in 2003, 2004, 2005, 2012, 2013, and 2015, as described in previous Supporting Statements for NYTS. In 2017, 13 questions and response options were cognitively tested for inclusion on NYTS, focused primarily on e-cigarettes: rules regarding use in the home, reasons for use, types of e-cigarettes used, and how youth accessed e-cigarettes. In 2018, cognitive testing of 15 questions and response options was completed, again focused on e-cigarette use. Terminology, devices used, substances used, and reasons for use were all explored. Finally, cognitive testing was performed in 2019, assessing e-cigarette terminology, e-cigarette devices used, substances used in e-cigarettes (e.g., nicotine, marijuana, THC), reasons for using e-cigarettes, exposure to secondhand tobacco smoke and secondhand e-cigarette aerosol, exposure to e-cigarette posts in social media, injunctive and descriptive e-cigarette norms, and indicators for affluence or socioeconomic status (SES) of the student’s family. The findings of these testing activities are used to improve existing questions on the NYTS (e.g., whether users know what substances they are using in e-cigarettes) as well as to generate new questions (e.g., assessment of the student’s family’s SES).


The current ICR includes an updated line item in the burden table to support testing of changes to the NYTS questionnaire prior to their implementation. Burden is specifically allocated to performing testing of new or modified questions that will provide better measures of tobacco product use. The burden also includes pre-testing of the questionnaire to confirm that it can be completed in 45 minutes.



B.5. RESPONSE TO 2016 PEER REVIEW PANEL RECOMMENDATIONS REGARDING NYTS SAMPLE


In October 2016, OSH convened a peer review panel of external experts to assess and develop recommendations for updating and enhancing the NYTS sample. This review was mandated by OMB as a condition of renewal of the 2018-2020 NYTS package; an overview of the panel’s recommendations and the agency’s responses was published in that package. For the 2021-2023 package, OMB asked OSH to address the status of items where OSH concurred with the panel’s recommendations and planned to implement them in the future. This information is provided in the following paragraphs:



SAMPLING DESIGN


Recommendation: Given adolescent developmental trajectories associated with the take-up of tobacco use behaviors, it is conceivable that the tobacco use rates in any given school could be very different at some grade levels between the start and the conclusion of the Spring Semester. This is a potential concern that could be at least partially addressed via analyses of existing NYTS data that compares tobacco use prevalence, while introducing appropriate controls, based on the date that questionnaires are administered to students. Indeed, such analyses could be conducted with NYTS data pooled across multiple years, which might ensure adequate statistical power was available.


Response: These analyses have not been undertaken yet. OSH still concurs that these analyses could be informative but considers that the impact could be limited due to the short time frame in which the NYTS is actually fielded. Furthermore, OSH believes that tobacco use behaviors and patterns are likely to be more similar across this short time frame in the spring semester, compared to if the NYTS was conducted in the fall.



Recommendation: Consider using substitute schools when the originally sampled school is not able to participate but has students who should validly be included in the NYTS. Substitute schools can provide an effective means of reducing the bias due to school nonresponse, provided that each substitute school is quite closely matched to the school it’s replacing with regard to characteristics that are related to the original school’s selection.


Response: This recommendation was determined to be unfeasible due to the complexity of introducing a sampling-with-replacement strategy into the current sampling procedure. Given that refusals occur during the recruitment phase, after the sample is drawn, true sampling with replacement (i.e., a replacement is selected before the next unit is drawn) cannot be implemented.


Recommendation: Consider oversampling American Indian/Alaskan Native (AI/AN) students to allow for subgroup reporting for these students. Two potential approaches include: 1) including BIE students in the sample; and 2) oversampling high proportion American Indian schools in certain select states (i.e., Arizona, Minnesota, North Carolina, Oregon, Utah, Washington, and Wisconsin).


Response: OSH still concurs with this recommendation, but it has not been implemented, since this would involve both an increase in the sample size and increased funding to implement. OSH can confer with its contractor about the feasibility of doing an AI/AN oversample for the 2022 or 2023 NYTS.



RECRUITMENT


Recommendation: Focus groups conducted with administrators and, separately, with students could elucidate possible means of improving response rates.


Response: OSH still concurs with this recommendation, but it has not been implemented due to funding constraints.



Recommendation: Utilizing YRBS-like state-level contacts in recruitment may offer an opportunity to improve participation.


Response: OSH continues to consider this option, but implementation would depend on the ability to fund states for this activity. However, the NYTS contractor, ICF, has made other improvements to its recruitment procedures at the district and school levels to increase the likelihood of participation. Examples include increasing on-campus recruitment visits to districts and schools (see below) and, where possible, offering January administration dates to minimize mid- and late-semester conflicts for schools. Given competing surveys and other research activities in schools, districts nationwide increasingly require research proposals before approving participation. ICF found it necessary to proactively complete these proposals ahead of official recruitment activities, because many have deadlines before the beginning of the school year. For the 2019 NYTS cycle, for example, 39 districts, representing 70 schools (22% of all schools), required research proposals, up from only 18 proposals in 2018. ICF has also increased its on-campus recruitment efforts, which build upon the work of the recruitment staff. In 2019, data collection field staff were involved in roughly 100 on-campus recruitment visits during the data collection window. Following these visits, 22 districts (41% of visited districts) and 39 schools (63% of visited schools) agreed to participate in the NYTS. ICF plans to continue proactively completing district research proposals, as required, and building up its on-campus recruitment efforts in the future.



Recommendation: Consider developing a best practices guide (BPG) for schools to use as they ready their staff and students for NYTS data collection. The BPG goals are to obtain teacher buy-in and to motivate students to do their best. The NAEP program has developed such a guide for 12th grade schools that includes videos to introduce NAEP to teachers and students, PowerPoint presentations to share at faculty and student meetings, and resources schools can customize to share information about NAEP. Also, the NAEP BPG contains successful strategies for increasing student participation (e.g., ways to recognize and thank students, assembly announcements, school newsletter information, school newspaper article, etc.).


Response: OSH still concurs with this recommendation, but the ability to implement it is limited by staff and funding constraints. ICF, the data collection contractor for NYTS, has a brief protocol; one possibility is to investigate whether it could be expanded or enhanced.



Recommendation: NYTS should explore shifting some or all of its field staff training (whether it be pre-survey work, actual data collection, or close out activities) to a distance learning model. The NAEP program has successfully implemented distance learning over the recent past with significant cost savings to the program. One example: converting one day of on-campus field staff training into self-paced, multimedia training modules with follow-up phone conversations with supervisory field staff.


Response: OSH considers the current on-campus training program effective. With the transition to electronic survey administration starting in 2019, on-campus training has been imperative because the procedures differ from administering a paper-and-pencil survey. The 3-day on-campus training combines lecture, role plays, and hands-on practice. It is very important that interviewers have the opportunity for hands-on practice and interaction with the tablets themselves to learn how to sync the tablets, upload data to the secure server, and troubleshoot other equipment issues. Additionally, this on-campus training focuses on "finding your voice," in which interviewers participate in role-play activities to become comfortable with the data collection procedures and scenarios. The on-campus training also allows the NYTS contractor (ICF) to evaluate the suitability and preparedness of the field staff; in 2019, one hired data collector was determined not to be a good fit for this project and was subsequently let go and replaced. It would be difficult to replicate these activities and experiences in a distance learning model. However, after the electronic data collection procedures are well established, it may be possible to investigate the feasibility of distance learning for training data collectors, consulting with NAEP about their program.



Recommendation: The NYTS should consider developing, and having all field staff fill out, a session debriefing form that collects information such as attendance problems, student behavior, teacher and student reactions, and adequacy of seating and space. This information can help the NYTS learn more about the logistical and qualitative factors that may impact data collection and can be used to enhance procedures for future data collections. Additionally, a school coordinator debriefing interview can augment the field staff session debriefing form.


Response: Weekly debriefings with data collectors were held for the first several weeks that the 2019 NYTS was in the field. Additionally, following the 2019 NYTS fielding, the NYTS contractor held a final debriefing with data collection staff to gain insights and impressions about the electronic fielding overall. These debriefings were informative and are tentatively planned for the 2020 NYTS.

WEIGHTING


Recommendation: If possible, findings from the annual non-response bias studies done as part of NYTS should be summarized and included as part of its annual methodological report. In terms of the non-response bias analyses, it would be very useful to examine the degree to which school drop-out rates are associated with school response rates in the NYTS.



Response: Traditionally, information related to the school location (e.g., urban, non-urban) and student population that is investigated in the non-response analysis is not released to the public, due to concerns with confidentiality and privacy of the school and student respondents. However, the feasibility of providing a summary of the non-response bias study into the public-use NYTS methodology report is being investigated.



Recommendation: The NYTS should consider providing preliminary weights for pilot and field tests for two main reasons: 1) to review testing procedures, draft analyses, and get an early look at general results prior to the availability of the final weights (which allow for the reporting of unbiased results); and 2) to support the psychometric evaluation of pilot survey items as part of the process of determining which items should be included in the final survey. The integration of a preliminary weighting process may help NYTS achieve reporting within six months from the end of data collection.


Response: Transitioning to electronic administration of the NYTS has reduced the time needed to produce a final weighted analytic dataset. Based on the 2019 NYTS, the general expectation is that an analytic dataset can be produced within approximately 2 months after the end of data collection. The schedule for the release of public-use data will depend on agency needs and priorities, but 6 months after the end of data collection is feasible. OSH considers there to be less need for preliminary weights than in the past and thus has not implemented this recommendation.



B.6 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA

Statistical Review


Statistical aspects of the study have been reviewed by the individuals listed below.

Sean Hu, M.D., Dr.P.H.

Senior Epidemiologist

Centers for Disease Control and Prevention

Phone: 770-488-5845

E-mail: [email protected]

Ronaldo Iachan, Ph.D.

Technical Director

ICF International, Inc.

Email: [email protected]



Agency Responsibility


Within the agency, the following individuals will be responsible for receiving and approving contract deliverables and for having primary responsibility for data analysis:


Contract Deliverables

Sean Hu, MD, Dr.P.H.

Office on Smoking and Health, Epidemiology Branch, CDC
4770 Buford Highway NE, MS-S107-7
Atlanta, GA 30341
Phone: 770-488-5845; Fax: 770-488-5848

Email: [email protected]


Data Analysis

Ahmed Jamal, M.B.B.S., M.P.H.
Office on Smoking and Health, Epidemiology Branch, CDC
4770 Buford Highway NE, MS-S107-7
Atlanta, GA 30341
Phone: 770-488-5077; Fax: 770-488-5848
E-mail: [email protected]


Responsibility for Data Collection


The representatives of the contractor, ICF International, Inc., responsible for conducting the planned data collection are Alice Roberts (Project Director), Kate Flint, and Jill Trott, and others as designated by the contractor.


REFERENCES


CDC (2001). Youth Tobacco Surveillance–United States, 2000. MMWR; 50(SS-4).


CDC (2012). National Youth Tobacco Survey. Atlanta, GA: US Department of Health and Human Services, Centers for Disease Control and Prevention. Available at http://www.cdc.gov/tobacco/data_statistics/surveys/nyts.


CDC (2014). Best Practices for Comprehensive Tobacco Control Programs – 2014. Atlanta, GA: U.S. Department of Health and Human Services, CDC, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health.

CDC (2015). Arrazola RA, Singh T, Corey CG, et al. Tobacco Use Among Middle and High School Students — United States, 2011–2014. MMWR Morb Mortal Wkly Rep, 2015: 64(14);381-385.

CDC (2016). Centers for Disease Control and Prevention. CDC Winnable Battles: final report. Available at https://www.cdc.gov/winnablebattles/report/index.html.

CDC (2018a). Centers for Disease Control and Prevention. CDC's 6|18 initiative: accelerating evidence into action. Available at https://www.cdc.gov/sixeighteen/index.html.

CDC (2018b). Centers for Disease Control and Prevention. Fiscal year 2019: Justification of estimates for appropriation committees. Available at https://www.cdc.gov/budget/documents/fy2019/fy-2019-cdc-congressional-justification.pdf.

CDC (2019a). Gentzke AS, Creamer M, Cullen KA, et al. Vital Signs: Tobacco product use among middle and high school students — United States, 2011–2018. MMWR Morb Mortal Wkly Rep 2019;68:157–164. DOI: http://dx.doi.org/10.15585/mmwr.mm6806e1

CDC (2019b). Wang TW, Gentzke AS, Creamer MR, et al. Tobacco Product Use and Associated Factors Among Middle and High School Students —United States, 2019. MMWR Surveill Summ 2019;68(No. SS-12):1–22. DOI: http://dx.doi.org/10.15585/mmwr.ss6812a1

CDC (2020). Wang TW, Neff LJ, Park-Lee E, et al. E-cigarette Use Among Middle and High School Students – United States, 2020. MMWR Morb Mortal Wkly Rep 2020;69:1310-1312. DOI: http://dx.doi.org/10.15585/mmwr.mm6937e1

FDA (2014). FDA Proposes to Extend Its Tobacco Authority to Additional Tobacco Products, including e-cigarettes. FDA NEWS RELEASE. N.p., 24 Apr. 2014. Web. 9 May 2014. http://www.fda.gov/newsevents/newsroom/pressannouncements/ucm394667.htm

FDA (2018). Cullen KA, Ambrose BK, Gentzke AS, et al. Notes from the Field: Use of Electronic Cigarettes and Any Tobacco Product Among Middle and High School Students — United States, 2011–2018. MMWR Morb Mortal Wkly Rep 2018;67:1276–1277. DOI: http://dx.doi.org/10.15585/mmwr.mm6745a5

FDA (2019). Food and Drug Administration. Newly Signed Legislation Raises Federal Minimum Age of Sale of Tobacco Products to 21. Silver Spring, MD: US Department of Health and Human Services, Food and Drug Administration; 2019. https://www.fda.gov/tobacco-products/ctp-newsroom/newly-signed-legislation-raises-federal-minimum-age-sale-tobacco-products-21.

FDA-CDC (2019). Cullen KA, Gentzke AS, Sawdey MD, et al. E-cigarette use among youth in the United States, 2016-2019. JAMA 2019;322(21):2095-2103. Published online November 05, 2019. doi:10.1001/jama.2019.18387

FDA (2020). Center for Tobacco Products. Enforcement priorities for Electronic Nicotine Delivery Systems (ENDS) and other deemed products on the market without premarket authorization (revised). Silver Spring, MD: US Department of Health and Human Services, Food and Drug Administration; 2020. https://www.fda.gov/media/133880/download.

Hu SS, Gentzke A, Jamal A, et al. Feasibility of Administering an Electronic Version of the National Youth Tobacco Survey in a Classroom Setting. Prev Chronic Dis 2020;17:190294. DOI: https://doi.org/10.5888/pcd17.190294.

National Institute on Drug Abuse (2014). Monitoring the Future national results on drug use: 1975-2013: Overview, Key Findings on Adolescent Drug Use. National Institute on Drug Abuse, National Institutes of Health. Ann Arbor, MI: Institute for Social Research, The University of Michigan.

OSG (2018). Office of the Surgeon General. Surgeon General’s advisory on e-cigarette use among youth. December 18, 2018. Available at https://e-cigarettes.surgeongeneral.gov/documents/surgeon-generals-advisory-on-e-cigarette-use-among-youth-2018.pdf.

USBLS (2018). U.S. Bureau of Labor Statistics. May 2018 National Occupational Employment and Wage Estimates, United States. Available at http://www.bls.gov/oes/current/oes_nat.htm

USDHHS (2010). U.S. Department of Health and Human Services. Healthy People 2020. Washington, D.C.: Available at: http://healthypeople.gov/2020/default.aspx

USDHHS (2012). US Department of Health and Human Services, Ending the Tobacco Epidemic: Progress toward a Healthier Nation. Washington, DC: Office of the Assistant Secretary for Health.

USDHHS (2019). U.S. Department of Health and Human Services. Development of the national health promotion and disease prevention objectives for 2030. Available at https://www.healthypeople.gov/2020/About-Healthy-People/Development-Healthy-People-2030.

1 We created a dichotomy of urban vs. non-urban schools using the Metro Status categorical variable available in these files.



