
OMB SUPPORTING STATEMENT: Part B



SCHOOL HEALTH POLICIES AND PRACTICES STUDY


OMB No. 0920-0445

Reinstatement with Changes









Submitted by:


Division of Adolescent and School Health

National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention


Centers for Disease Control and Prevention

Department of Health and Human Services



Project Officer:


Nancy D. Brener, PhD

Division of Adolescent and School Health

National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention

Centers for Disease Control and Prevention

4770 Buford Highway, NE

Mailstop K-33

Atlanta, GA 30341-3717

Phone: 770-488-6184

Fax: 770-488-6156

E-mail: [email protected]


April 17, 2013




Table of Contents

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS

B.1.a Respondent Universe

B.1.b Schools in the 2014 Study

B.1.c Courses with Health or Physical Education Content in the 2014 Study

B.1.d School Districts in the 2016 Study

Table B.1.d Frame Number of School Districts from SHPPS 2012

B.2 PROCEDURES FOR COLLECTION OF INFORMATION

B.2.a Statistical Method for Stratification and Sample Selection

B.2.b Estimation and Justification of Sample Size

Table B.2.b1. Planned Sample Sizes for the Various Cycles

Table B.2.b2. Sample Sizes for the School Survey in 2014 (participating schools)

Table B.2.b4. School and Course Sample Sizes Needed in 2014 to Achieve Target Levels of Precision for Various Design Effect Scenarios

Table B.2.b5. Sample Sizes for the District Survey in 2016

Table B.2.b6: Design Effects and Standard Error for District Level Estimates

Table B.2.b7: School District Sample Sizes Needed to Achieve Target Levels of Precision for Various Design Effect Scenarios

B.2.c Weighting and Estimation Procedures

B.2.d Use of Less Frequent Data Collection to Reduce Burden

B.2.e Survey Questionnaires

Table B.2.e Distribution of SHPPS Data Collection Instruments across Components and Respondent Levels

B.2.f Obtaining Access to and Support from State Education Agencies (SEAs), School Districts, and Schools

B.2.g Data Collection Procedures

B.2.h Quality Control

Table B.2.h Major Means of Quality Control

B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE

B.3.a Expected Response Rates

B.3.b Methods for Maximizing Responses and Handling Nonresponse

B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN

B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA

B.5.a Statistical Review

B.5.b Agency Responsibility

B.5.c Responsibility for Data Collection

REFERENCES


APPENDICES


  A. Authorizing Legislation


  B. 60-Day Federal Register Announcement


  C. Justification of SHPPS in Terms of the Year 2020 Health Objectives for the Nation


  D. Consultants in Questionnaire Design

D-1 Content Panel Participants

D-2 National Reviewers


  E. Participant Notification Documents

E-1 School Participant Notification Document

E-2 Classroom Participant Notification Document

E-3 District Participant Notification Document


  F. Example Tables


  G. Complete Set of Study Questionnaires

G-1 School Health Education

G-2 School Physical Education and Activity

G-3 School Health Services

G-4 School Nutrition Services

G-5 School Healthy and Safe School Environment

G-6 School Mental Health and Social Services

G-7 School Faculty and Staff Health Promotion

G-8 Classroom Health Education

G-9 Classroom Physical Education and Activity

G-10 District Health Education

G-11 District Physical Education and Activity

G-12 District Health Services

G-13 District Nutrition Services

G-14 District Healthy and Safe School Environment

G-15 District Mental Health and Social Services

G-16 District Faculty and Staff Health Promotion


  H. Study Communications for SHPPS 2014

H-1 State Recruitment Script for SHPPS 2014

H-2 District Recruitment Script for SHPPS 2014

H-3 School Recruitment Script for SHPPS 2014

H-4 State Invitation Letter for SHPPS 2014

H-5 District Invitation Letter for SHPPS 2014

H-6 School Invitation Letter for SHPPS 2014

H-7 School- and Classroom-level Content Outlines



  I. Study Communications for SHPPS 2016

I-1 State Recruitment Script for SHPPS 2016

I-2 District Recruitment Script for SHPPS 2016

I-3 State Invitation Letter for SHPPS 2016

I-4 District Invitation Letter for SHPPS 2016

I-5 District-Level Content Outlines


  J. Fact Sheets

J-1 Fact Sheet for SHPPS 2014

J-2 Fact Sheet for SHPPS 2016

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS

The proposed study includes a school- and classroom-level data collection with a nationally representative sample of schools in 2014 and a district-level data collection with a nationally representative sample of school districts in 2016. The respondent universe for the 2014 school- and classroom-level study is all private and public elementary, middle, and high schools nationwide (i.e., in the 50 states and the District of Columbia). The respondent universe for the 2016 district-level study is all public school districts nationwide.

For both data collections, the sampling frame has been obtained from MDR, Inc. The MDR data encompass all school districts and both private and public schools and include the latest data from the Common Core of Data (CCD) maintained by the National Center for Education Statistics (NCES). MDR school-level files also include data on enrollment by grade and minority status. The school frame will be complemented by the CCD public schools as well as the Private School Survey (PSS) non-public schools at each level (elementary, middle, and high school). The district frame will be complemented by the CCD public school districts. This combined use of NCES and MDR files is similar to the approach used in previous cycles of SHPPS but increases reliance on the NCES files. The refinement is designed to improve coverage and to make the frame more consistent with the NCES-based data used for stratification and post-stratification.

One of the differences in the current sampling designs compared to previous SHPPS samples is in the first-stage stratification. Previous cycles used a stratification of ZIP Code areas by poverty and by urbanicity to develop four cross-strata along these two dimensions. The new stratification uses a classification developed for NCES for schools and districts, a classification that assigns one of 12 NCES Locale categories based on an urban-rural continuum that ranges from large cities (most urban) to remote rural areas (most rural).

Using the NCES classification directly will lead to several efficiencies that stem from two basic aspects of the sampling and weighting approaches. First, strata will be more homogeneous: schools will be more similar within a stratum than across strata. Homogeneous strata lead to improved precision, i.e., smaller sampling errors (variances and standard errors). Second, the weighting will link known NCES population totals to the strata, leading to more stable post-stratification factors. This improved post-stratification will lead to both smaller potential biases and smaller unequal weighting effects on survey variances.
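To make the post-stratification step concrete, the following minimal sketch (in Python, for illustration only) applies a ratio adjustment that scales weights so that weighted stratum totals match known NCES control totals. The stratum labels, counts, and weights are hypothetical and are not taken from the SHPPS frame or weighting specifications.

    # Illustrative post-stratification ratio adjustment; hypothetical data.
    nces_totals = {"city_large": 1430, "rural_distant": 3568}  # known frame counts

    sample = [  # responding units with their current (nonresponse-adjusted) weights
        {"stratum": "city_large", "weight": 19.8},
        {"stratum": "city_large", "weight": 20.4},
        {"stratum": "rural_distant", "weight": 19.6},
    ]

    # Weighted estimate of each stratum total from the sample.
    weighted_sums = {}
    for unit in sample:
        weighted_sums[unit["stratum"]] = weighted_sums.get(unit["stratum"], 0.0) + unit["weight"]

    # Post-stratification factor = known total / weighted estimate of that total.
    for unit in sample:
        factor = nces_totals[unit["stratum"]] / weighted_sums[unit["stratum"]]
        unit["final_weight"] = unit["weight"] * factor

Linking the adjustment to known NCES totals in this way is what stabilizes the factors: each factor is the ratio of a fixed population count to a survey estimate of that same count.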

B.1.a Respondent Universe


The 2014 study universe includes schools and required school courses with health and physical education content. Respondents will be personnel who have responsibility for one or more of the seven components of school health programs for which data collection instruments have been developed: health education, physical education and activity, health services, nutrition services, healthy and safe school environment, mental health and social services, and faculty and staff health promotion. Respondents for health and physical education courses will be teachers of those courses.


The 2016 study universe includes school districts. Similarly, respondents will be personnel who have responsibility for one or more of the seven components of school health programs for which data collection instruments have been developed: health education, physical education and activity, health services, nutrition services, healthy and safe school environment, mental health and social services, and faculty and staff health promotion.

B.1.b Schools in the 2014 Study


The universe of schools contains approximately 130,000 public and non-public schools. Schools will be stratified by school type (public/non-public) and level (elementary/middle/high). As in previous SHPPS cycles, schools eligible for the study will be public and non-public schools with any of grades 1 through 12. Kindergarten is a grade of interest, but schools that contain kindergarten and/or pre-kindergarten without any grade 1 or higher will be excluded. For SHPPS 2014, the minimum school enrollment threshold of 30 students used in SHPPS 2012 sampling will again be applied to make selection and data collection more effective.


Schools are excluded if they serve only non-eligible students or they exclusively serve students that are provided services of interest at another eligible school in the sample. Specifically, we will exclude:

  • Schools consisting only of grades lower than kindergarten or higher than twelfth

  • Alternative schools

  • Schools providing services to a “pull-out” population who are provided services in another eligible school

  • Schools run by the Defense Department or the Bureau of Indian Affairs

  • Schools with fewer than 30 students

The initial school sample will contain approximately 846 schools. Based on response rates from SHPPS 2006, we anticipate a school participation rate of 78 percent; i.e., 78 percent of eligible sampled schools are expected to participate. We refer to the expected number of eligible responding schools as the “school respondent sample size.” Please note that in computing the participation rate, the number of projected ineligibles has been excluded from the denominator. Sampled schools could be excluded because they have ceased to operate, changed their target population such that they no longer fall into our universe (e.g., regular school changed to a special education school) or changed the age group they serve (e.g., a school selected as a middle school is now an elementary school). Schools found to be ineligible during sample validation will be replaced by similar schools—same level and type—selected within the same primary sampling unit (PSU), or geographic grouping. If no such school is available in the same PSU, then a similar school will be selected from a neighboring PSU within the same state.

We anticipate that of 846 selected schools, approximately 25 will be found to be ineligible after sample validation and will not be replaced, and 821 will be eligible schools. The estimated percent ineligible (3%) and participation rate (78%) are based on our experience in fielding SHPPS 2006. Of the 821 eligible schools, 640 eligible schools are expected to participate.


Table B.1.b provides frame totals for the school strata for the 2014 study. Note that these counts are based on the frame of second-stage units (SSUs) after schools have been classified into the three school levels and, where necessary, split. Any school that falls into more than one of the level categories will be split conceptually into separate frame units, one in each of the level strata in which it appears. These are referred to as split schools. We anticipate that approximately one fourth of the schools in the frame will fall into more than one category and will be split. In this way, a single school building may contribute two units; e.g., a school that spans grades K-8 would contribute an elementary school SSU and a middle school SSU.


Table B.1.b. Population counts (number of schools) in the frame for each school stratum


School Type     Elementary Schools     Middle Schools     High Schools
Public          54,218                 26,583             22,284
Non-Public      16,672                 10,958             5,997
TOTAL           70,890                 37,541             28,281


Working with the principal or designated contact person at each participating school, we will seek to identify as prospective respondents the school staff member or members primarily responsible for delivering and/or coordinating each of the components of school health programs listed in Section B.1.a.

B.1.c Courses with Health or Physical Education Content in the 2014 Study


A probabilistic sample of all required courses and elementary school grades containing health education and/or physical education content will be drawn for inclusion in the study. For each such required course or grade, we will randomly select one teacher as a study respondent. The selection of these teachers is described in section B.2.a.2.

B.1.d School Districts in the 2016 Study


Unlike previous district samples for SHPPS, the district sample for the 2016 cycle no longer needs to provide a platform for a linked school sample. Because of this, the sampling design can be much simpler and more efficient than the design for previous SHPPS surveys. The improved statistical efficiency also implies that estimates of precision comparable to previous SHPPS surveys can be based on smaller samples.


The universe of school districts in the 50 states and the District of Columbia contains approximately 18,000 public school districts. The study will survey a stratified random sample of public school districts, stratified by urbanicity. The initial sample size will be 920 districts, of which we expect approximately 685 eligible districts to respond (see section B.2.b). As an example, a school district will be regarded as ineligible if it is a technical school district containing only vocational-technical schools that serve a pull-out population. Districts found to be ineligible during sample validation will be replaced by similar districts in a nearby PSU.


Based on response rates from SHPPS 2012, we anticipate a district participation rate of 76 percent; i.e., 76 percent of eligible sampled districts are expected to participate in the district-level survey. We refer to the expected number of eligible responding districts as the “district respondent sample size.” Please note that in computing the participation rate, the number of projected ineligibles has been excluded from the denominator. We anticipate that of 920 selected school districts, approximately 2% (18) will be found to be ineligible after sample validation and excluded from the number of prospective participants, leaving 902 eligible districts, of which 685 (76%) are expected to participate. Note that this discussion does not include 20 districts funded by CDC’s Division of Adolescent and School Health (DASH) that are included with certainty.


Through the office of the superintendent of each of these district-level entities, the district official primarily responsible for coordinating each of the components of school health programs listed in Section B.1.a will be identified as the respondent for that component. It is anticipated that occasionally the district superintendent will identify a local government official who does not work for the school district (e.g., a health services coordinator in the county health department) as the most knowledgeable respondent for a component.


Table B.1.d provides a summary of sampling frame statistics from the 2012 study, including the number of school districts in the frame as well as the numbers in each of the strata as described in Section B.2.b. The sampling frame will be updated with current data for the 2016 study.

Table B.1.d Frame Number of School Districts from SHPPS 2012


First-stage strata     Number of districts     Percent of all districts
Urban                  6,072                   51.3
Non-urban              7,012                   48.7
TOTAL                  13,084                  100.0

B.2 PROCEDURES FOR COLLECTION OF INFORMATION

B.2.a Statistical Method for Stratification and Sample Selection


The sampling design modifications introduced to the SHPPS 2012 design that have led to smaller design effects (DEFFs), and therefore precision gains when compared to previous cycles, have been retained in the 2014 and 2016 studies. School sample sizes have been re-calculated for the 2014 study based on stratification by school type (public/private) and level (elementary, middle, and high) only. Previous cycles included stratification by size (small/large), as well. Design effect estimates and response rates obtained from the most recent data collection, school-level in 2006 and district-level in 2012, ensure that precision requirements can be met in the most efficient manner.

We will discuss each stage of the sampling for schools for the 2014 study and districts for the 2016 study below. Section B.2.a.1 summarizes the sample sizes and gives the expected precision of survey estimates for schools in the 2014 study; section B.2.a.4 summarizes the sample sizes and gives the expected precision of survey estimates for districts in the 2016 study.


B.2.a.1 School Sample for 2014 Study


Schools eligible for the study will be public and non-public schools with any of grades 1 through 12 that have a total student enrollment of 30 or more students. Kindergarten is a grade of interest, but schools containing Kindergarten and/or pre-Kindergarten, but not first grade, will be excluded. This strategy will include schools that contain kindergarten in addition to higher grades. From a sample of 178 PSUs, corresponding to about 320 districts, a stratified random sample of 846 (initial sample size) public and non-public schools will be selected.


The frame of schools will be stratified by school level (elementary, middle, and high) and school type (public and non-public). School level will be defined based on the grades present in the school using the following mutually exclusive subgroups of eligible schools (recalling that these exclude schools that only offer Kindergarten):


Elementary: Schools with any grade 5 or under

Middle: Schools with grade 7 or 8, or only grade 6, or only grades 5 and 6

High: Schools with any of grades 10, 11, or 12, or only grade 9


Any school that falls into more than one of the level categories will be split conceptually into separate frame units, one in each of the level strata in which it appears. These are referred to as split schools. We anticipate that approximately one fourth of the schools in the frame will fall into more than one category and will be split.
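For illustration only, the sketch below (Python) encodes the level classification and split-school logic just described; kindergarten is coded as grade 0, and schools offering only kindergarten or pre-kindergarten are assumed to have already been excluded. It is a reading of the rules above, not the study's frame-construction code.

    def level_strata(grades):
        """Return the level strata to which a school (given its set of grades,
        with kindergarten coded as 0) contributes frame units (SSUs)."""
        strata = set()
        if any(g <= 5 for g in grades):                        # any grade 5 or under
            strata.add("elementary")
        if 7 in grades or 8 in grades or grades == {6} or grades == {5, 6}:
            strata.add("middle")                               # grade 7 or 8, only grade 6, or only grades 5-6
        if any(g in grades for g in (10, 11, 12)) or grades == {9}:
            strata.add("high")                                 # any of grades 10-12, or only grade 9
        return strata

    # A K-8 school is a split school: one elementary SSU and one middle SSU.
    print(sorted(level_strata(set(range(0, 9)))))   # ['elementary', 'middle']
    print(sorted(level_strata({9})))                # ['high']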


The sample will be selected in two stages, with PSUs selected at the first stage and schools selected at the second stage. A two-stage sampling design, with some degree of sample clustering, seems necessary for cost-efficiency reasons as data collectors will visit all the participating schools.

The PSUs will be selected with probability proportional to size (PPS), using as measure of size (MOS) the number of eligible schools in the PSU. Schools will be selected with equal probabilities within strata.
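The short Python sketch below illustrates one standard way to implement such a selection, systematic PPS sampling with the number of eligible schools as the measure of size. The PSU list is randomly generated for illustration; the study's actual selection software may implement PPS differently (e.g., handling certainty PSUs separately).

    import random

    def pps_systematic(psus, n):
        """Systematic PPS selection of n PSUs, with psu['mos'] as the measure of size."""
        total_mos = sum(p["mos"] for p in psus)
        interval = total_mos / n
        start = random.uniform(0, interval)
        points = [start + k * interval for k in range(n)]
        selections, cumulative, i = [], 0.0, 0
        for p in psus:
            cumulative += p["mos"]
            while i < len(points) and points[i] < cumulative:
                selections.append(p)   # a PSU whose MOS exceeds the interval can be hit more than once
                i += 1
        return selections

    # Hypothetical frame of PSUs; 178 PSUs are planned for the 2014 school sample.
    psus = [{"id": k, "mos": random.randint(20, 400)} for k in range(1000)]
    psu_sample = pps_systematic(psus, 178)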

The allocation of schools to each of the school strata is determined to satisfy the precision constraint that the half-width of 95 percent confidence intervals around estimated proportions be no greater than 0.05 (i.e., +/- 5 percentage points). These precision requirements are the same as the levels achieved in SHPPS 2006.


B.2.a.2 Course Selection for 2014 Study


A probabilistic sample of all required courses and elementary school grades containing health education and/or physical education content will be drawn for inclusion in the study. Therefore, sampling units (as well as analysis units) will be courses or elementary school classes, and they will be represented by selected teachers who will report the data for the course/class.


For each of the two content areas, Health Education and Physical Education and Activity, up to two teachers will be sampled randomly from among all eligible teachers, i.e., those who are currently teaching the course or have taught the course during the current school year and who are still members of this school’s staff. Note that for elementary schools, most regular classroom teachers will likely meet these criteria, and natural units are grades rather than courses as used for secondary schools. The differences in selection procedures for secondary vs. elementary are described below.


  1. Secondary Schools


The process involves several steps performed separately for classroom Health Education and Physical Education and Activity content areas within each sampled school. We describe the steps for Health Education, with a comparable process taking place for Physical Education and Activity.


1) Construct a list of all courses containing health instruction.

2) Select a random sample of two courses if the list contains more than two courses; otherwise, take all courses.

3) Identify the teachers linked to each selected course.

4) For each selected course, randomly select one teacher from the list of teachers in the prior step.

5) For the teacher/course pair, select one section from the course sections taught by the teacher.

For each school, we will carefully record the numbers involved in steps 1, 3, and 5, as these will be used for weighting the selected section up to the course and school levels (see the weighting sketch at the end of this subsection).


  2. Elementary Schools


A similar sequence of steps will be taken to select grades for Health Education and Physical Education and Activity (separately), and identify reporting units, within each sampled elementary school. Again, we describe the steps for Health Education, with a comparable process taking place for Physical Education and Activity.


  1. Identify all eligible grades at which health instruction is required.

  2. Randomly select two of these grades (unless the school contains only one, then take it).

  3. List all teachers providing instruction at each of these grades.

  4. Randomly select one teacher for each of these grades.

  5. If health instruction for a selected teacher’s class is provided by a specialist, interview the specialist about instruction for that particular class.
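To show how the counts recorded in steps 1, 3, and 5 of the secondary-school procedure feed into weighting, the hedged sketch below computes a within-school weight for the reporting section as the inverse of its selection probability. The function and example values are illustrative assumptions, not the study's documented weighting specification.

    def within_school_section_weight(n_courses, n_teachers_for_course,
                                     n_sections_for_teacher, n_courses_sampled=2):
        """Inverse selection probability of the reporting section (illustrative)."""
        p_course = min(n_courses_sampled, n_courses) / n_courses   # step 2: sample up to 2 of the listed courses
        p_teacher = 1.0 / n_teachers_for_course                    # step 4: one teacher per selected course
        p_section = 1.0 / n_sections_for_teacher                   # step 5: one section per selected teacher
        return 1.0 / (p_course * p_teacher * p_section)

    # Example: 4 health courses, 3 teachers for the selected course, 2 sections
    # taught by the selected teacher -> weight = (4/2) * 3 * 2 = 12.
    print(round(within_school_section_weight(4, 3, 2), 2))

A comparable calculation would apply to the elementary-school selection, with grades in place of courses.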

B.2.a.3 Observation Component for the 2014 Study


In each school that has vending machines that are accessible to students during the school day, up to five vending machines (both snack and beverage) will be randomly selected to undergo observations. For schools that report five or fewer vending machines that students can access, each vending machine will undergo observations.
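A minimal sketch of this selection rule follows; the list of machine identifiers is a placeholder.

    import random

    def select_machines(machine_ids, max_observed=5):
        """Observe all machines if there are five or fewer; otherwise take a simple random sample of five."""
        if len(machine_ids) <= max_observed:
            return list(machine_ids)
        return random.sample(machine_ids, max_observed)

    print(select_machines(["A", "B", "C", "D", "E", "F", "G"]))   # five of the seven machines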


B.2.a.4 District Sample for 2016 Study


All public school districts are eligible for inclusion in the 2016 study. We will select a district sample large enough to give district estimates of the desired precision. Approximately 920 districts will be selected; the 20 school districts funded by DASH will be included with certainty. It is anticipated that approximately 685 districts will participate among those found eligible for the study, for a participation rate of 76 percent.


The domains of interest are all districts combined (overall estimates) and districts grouped by urbanicity. The sample will be stratified by urban status using the 12 NCES Locale categories, and districts will be selected with equal probabilities within each stratum. By using proportional allocation to strata, we will also attain approximately equal probabilities overall and a nearly self-weighting sample of districts.
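The proportional allocation can be sketched as n_h = n * (N_h / N), where N_h is the number of frame districts in Locale stratum h. The Python fragment below illustrates the calculation for two strata using the frame counts shown later in Table B.2.b5; the study's exact rounding rules (and the handling of the 20 certainty DASH districts) may differ, so results will match the table only approximately.

    n_total = 920          # planned district sample size
    frame_total = 18259    # total districts on the frame (Table B.2.b5)

    # Frame counts for two example strata (from Table B.2.b5).
    frame_counts = {"City-Large": 1430, "Suburb-Large": 2891}

    for stratum, count in frame_counts.items():
        allocation = n_total * count / frame_total
        print(stratum, round(allocation))   # City-Large -> 72, Suburb-Large -> 146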


B.2.b Estimation and Justification of Sample Size


Table B.2.b1 summarizes the sample sizes and expected numbers of respondents for the school and course samples in 2014 and the district sample in 2016. The estimation and justification of sample sizes for the 2014 and 2016 studies are discussed separately below.

Table B.2.b1. Planned Sample Sizes for the Various Cycles


Year    Sampling Units    Number of Selected Units    Projected Eligible Units    Expected Participation Rate    Respondent Sample Size
2014    Schools           846                         821                         78%                            640
        Courses           1,240                       1,240                       96%                            1,229 Health Education
                          1,240                       1,240                       96%                            1,229 Physical Education
2016    Districts         920                         902                         76%                            685


Table B.2.b2 presents the planned sample allocation to school strata for the 2014 study, i.e., the expected number of participating schools in each of these strata. Allocation to the public and non-public school sub-stratum cells will be proportional within each of the school-level strata. The allocation in Table B.2.b2 uses the population school stratum proportions computed from Table B.1.b.


Table B.2.b2. Sample Sizes for the School Survey in 2014 (participating schools)


School Strata            School Sample Allocation
Elementary Schools       214
  Public                 164
  Private/Catholic       50
Middle Schools           213
  Public                 151
  Private/Catholic       62
High Schools             213
  Public                 168
  Private/Catholic       45




School sample sizes were derived to generate standard errors of 3% or less. The derivations were based on design effects between 1.3 and 2.0 for overall estimates, similar to those obtained for typical school-level estimates in SHPPS 2006. Table B.2.b3 presents design effects as well as standard errors for a number of school estimates using SHPPS 2006 data. The DEFF estimates were useful guides for the design of the 2014 SHPPS sample although we expect to achieve substantially lower DEFFs in the 2014 survey.

Subgroup estimates can also be computed with the same precision levels yet are premised on lower DEFFs (between 1.2 and 1.3). Lower DEFFs occur for subgroups as clustering effects are diluted.1


Table B.2.b3: Standard Errors and Design Effects (DEFFs) for Key School-level Estimates Computed using SHPPS 2006 Data


Estimate                                                               Overall    Elementary Schools    Middle Schools    High Schools

Schools with Tobacco Free Policies
  Percentage                                                           63.60%     65.40%                58.70%            66.10%
  Standard Error                                                       2.40%      3.40%                 3.10%             3.00%
  Design Effect                                                        2.4        1.7                   1.3               1.3

Schools with Required PE
  Percentage                                                           78.40%     69.30%                83.90%            95.10%
  Standard Error                                                       1.80%      3.10%                 2.40%             1.70%
  Design Effect                                                        1.9        1.6                   1.3               1.9

Schools with Required PE in Each of Their Grades
  Percentage                                                           26.40%     34.80%                20.50%            12.30%
  Standard Error                                                       2.00%      3.10%                 2.50%             2.30%
  Design Effect                                                        2.0        1.5                   1.3               1.6

Schools with Required HIV Prevention Instruction
  Percentage                                                           59.50%     39.10%                74.50%            88.40%
  Standard Error                                                       2.20%      3.70%                 3.10%             2.10%
  Design Effect                                                        1.7        1.6                   1.5               1.3

Schools with Required Nutrition and Dietary Behavior Instruction
  Percentage                                                           84.20%     84.60%                82.30%            86.30%
  Standard Error                                                       1.60%      2.40%                 2.70%             2.30%
  Design Effect                                                        1.8        1.3                   1.5               1.4

Schools with Required Alcohol or Other Drugs Prevention Instruction
  Percentage                                                           81.80%     76.50%                84.60%            91.80%
  Standard Error                                                       1.80%      2.70%                 2.50%             1.80%
  Design Effect                                                        2.0        1.2                   1.4               1.4

Schools That Had a School Nurse
  Percentage                                                           86.30%     87.00%                86.50%            84.30%
  Standard Error                                                       1.80%      2.40%                 2.70%             2.50%
  Design Effect                                                        2.5        1.7                   1.8               1.4

Subgroups of interest are those defined by school level—elementary, middle, and high schools—each expected to have approximately 213 participating schools.  These subgroup sample sizes are expected to provide subgroup estimates with standard errors of 3.4% or less for DEFF=1, and 3.7% or less for DEFF=1.2.  Therefore, subgroup estimates will meet precision levels with standard errors less than 3.7% for the range of design effects expected for subgroups, i.e., DEFF between 1.0 and 1.2.

Table B.2.b4 shows the school sample sizes needed to achieve the target precision levels.  Assuming DEFFs near 1.2, the table shows that subgroup samples of at least n=333 schools will be necessary to achieve standard errors of 3.0% and 187 schools for standard errors of 4%. Within second-stage strata—e.g., defined by school level—we expect DEFFs to be lower than 1.3 based on 2006 data.  Thus, the expected precision of estimates based on elementary schools, middle schools and high schools will be comparable to those of simple random samples of the same size (DEFFs near 1.0).


In addition, Table B.2.b4 shows that for the course sample to achieve standard errors of 2.5% or less, the sample size needs to be 800 for DEFF=2.0.


Table B.2.b4.  School and Course Sample Sizes Needed in 2014 to Achieve Target Levels of Precision for Various Design Effect Scenarios


Design Effect     Standard Error
                  2.5%      3.0%      4.0%
1.2               480       333       187
1.3               520       361       203
1.4               560       388       218
1.5               600       416       234
2.0               800       555       312
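For reference, the entries in Table B.2.b4 are consistent with the standard sample-size formula for a proportion at the conservative value p = 0.5, n = DEFF * p(1 - p) / SE^2. The short Python sketch below reproduces the table values up to integer rounding; it reflects a reading of the table, not the study's documented derivation.

    def required_n(deff, se, p=0.5):
        """Sample size for a proportion with design effect deff and target standard error se."""
        return deff * p * (1 - p) / se ** 2

    for deff in (1.2, 1.3, 1.4, 1.5, 2.0):
        sizes = [required_n(deff, se) for se in (0.025, 0.03, 0.04)]
        print(deff, [f"{n:.1f}" for n in sizes])
    # DEFF = 1.2 gives 480.0, 333.3, and 187.5, which Table B.2.b4 reports as
    # 480, 333, and 187 (the table rounds down to whole schools).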

Table B.2.b5 presents the planned sample allocation to district strata for the 2016 study, i.e., the number of districts to be selected within each of these strata.



Table B.2.b5. Sample Sizes for the District Survey in 2016


Stratum Label    Stratum Description    Population Count    Population Percent    Sample Districts
11               City-Large             1430                7.83                  72
12               City-Midsize           495                 2.71                  25
13               City-Small             796                 4.36                  41
21               Suburb-Large           2891                15.83                 146
22               Suburb-Midsize         394                 2.16                  20
23               Suburb-Small           305                 1.67                  16
31               Town-Fringe            374                 2.05                  19
32               Town-Distant           1441                7.89                  73
33               Town-Remote            1235                6.76                  62
41               Rural-Fringe           2484                13.60                 125
42               Rural-Distant          3568                19.54                 180
43               Rural-Remote           2803                15.35                 141
TOTAL                                   18259                                     920



The DEFF will be between 1.0 and 1.4 for the district sample because this roughly equal-probability sample will have minimal unequal weighting effects (due mostly to differential response rates by strata) and no clustering effects. Table B.2.b6 presents a range of example district estimates based on SHPPS 2012 data. This exhibit shows that even the unequal-probability sampling design used in SHPPS 2012 achieved low design effects (between 1.5 and 2.2).
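As a rough illustration of why an approximately self-weighting, unclustered sample keeps the design effect low, the sketch below applies Kish's approximation for the unequal-weighting component of the design effect, DEFF_w = 1 + CV^2(w) = n * sum(w_i^2) / (sum(w_i))^2. The weights are hypothetical, not SHPPS weights.

    def kish_deff(weights):
        """Kish's approximate design effect due to unequal weighting."""
        n = len(weights)
        return n * sum(w * w for w in weights) / sum(weights) ** 2

    # Nearly equal weights (equal-probability selection with modest nonresponse
    # adjustments in one stratum) keep the design effect close to 1.
    weights = [20.0] * 600 + [26.0] * 85
    print(round(kish_deff(weights), 2))   # about 1.01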

Table B.2.b6: Design Effects and Standard Error for District Level Estimates

Estimate                                     Percent    Standard Error    Design Effect
Districts with Tobacco Free Policies         55.40%     3.70%             2.2
Districts with Required HIV Instruction      95.60%     1.30%             1.5

Estimate                                                                  Elementary Schools    Middle Schools    High Schools

Districts with Required Nutrition and Dietary Behavior Instruction
  Percentage                                                              77.40%                85.10%            87.90%
  Standard Error                                                          2.80%                 2.40%             2.30%
  Design Effect                                                           1.7                   1.7               1.6

Districts with Required Alcohol or Other Drugs Prevention Instruction
  Percentage                                                              79.00%                89.70%            89.30%
  Standard Error                                                          2.70%                 2.10%             2.30%
  Design Effect                                                           1.7                   1.6               1.7

The derived sample sizes are premised on three empirically-based DEFF scenarios: DEFF=1.2, 1.3 and 1.4. Table B.2.b7 shows that for the conservative DEFF=1.4, approximately 560 completed district surveys would be necessary to generate estimates with at most a 2.5% standard error. The anticipated district sample sizes will then generate precise estimates within +/- 5 percentage points for a 95% confidence interval.

Table B.2.b7: School District Sample Sizes Needed to Achieve Target Levels of Precision for Various Design Effect Scenarios


Design Effect (DEFF)    2.0% Standard Error    2.5% Standard Error
1.2                     749.6                  480
1.3                     812.5                  520
1.4                     874.6                  560

B.2.c Weighting and Estimation Procedures


For both the 2014 and 2016 studies, the base weight for each sampled entity will be equal to the inverse of its probability of selection. Prior to data analysis, sampling statisticians will prepare sampling weights adjusted for non-response within strata. Final survey weights will reflect the probability of selection and non-response adjustments; these weights will be appropriate for national estimates and estimates within strata.
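A hedged sketch of these two weighting steps follows: a base weight equal to the inverse of the selection probability, then a nonresponse adjustment computed within each stratum. Field names and values are illustrative, not drawn from the SHPPS weighting specifications.

    def add_weights(units):
        """Attach base and nonresponse-adjusted weights to sampled units (illustrative)."""
        for u in units:
            u["base_w"] = 1.0 / u["p_select"]          # base weight = inverse selection probability

        # Within each stratum, inflate respondents' weights so they represent
        # the weighted total of all eligible sampled units in that stratum.
        for s in {u["stratum"] for u in units}:
            eligible = [u for u in units if u["stratum"] == s and u["eligible"]]
            respondents = [u for u in eligible if u["responded"]]
            adj = sum(u["base_w"] for u in eligible) / sum(u["base_w"] for u in respondents)
            for u in respondents:
                u["final_w"] = u["base_w"] * adj
        return units

    units = [
        {"stratum": "public_elementary", "p_select": 0.003, "eligible": True, "responded": True},
        {"stratum": "public_elementary", "p_select": 0.003, "eligible": True, "responded": False},
    ]
    add_weights(units)   # the responding unit's final weight is twice its base weight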


The estimation process will use statistical software developed for analyses of survey data arising from complex sampling designs (e.g., SUDAAN). These estimation procedures will appropriately account for the effects of non-response, unequal probability sampling, stratification, and clustering. Examples of tables that will be completed through analysis of the data are in Appendix F.


B.2.d Use of Less Frequent Data Collection to Reduce Burden


The planned data collections will occur once. School-level data collection will occur in 2014 in order to collect school, classroom, and vending machine data that could not be collected as part of the OMB-approved SHPPS 2012 (OMB no. 0920-0445, exp. 9/30/2012) study due to funding cuts.


District-level data collection will occur in 2016.

B.2.e Survey Questionnaires


The school study in 2014 involves the use of nine questionnaires designed to measure policies and practices at the school and classroom levels related to the following seven components of school health programs: health education, physical education and activity, health services, nutrition services, healthy and safe school environment, mental health and social services, and faculty and staff health promotion. The questionnaires are designed for computer-assisted personal interviewing (CAPI).

The school study also involves three data collections regarding 1) obtaining support from state education agencies in recruiting schools, 2) obtaining clearances from districts to approach schools selected from their district and 3) recruiting schools, identifying respondents, scheduling in-person interviews, randomly selecting class sections/grades for inclusion in the health education and physical education classroom-level components, and identification of vending machines for the observation component.


The district study in 2016 involves the use of seven questionnaires designed to measure policies and practices at the district level related to the following seven components of school health programs: health education, physical education and activity, health services, nutrition services, healthy and safe school environment, mental health and social services, and faculty and staff health promotion. The district questionnaires are designed for self-administered, web-based administration.


The district study also involves two data collections regarding 1) obtaining support from state education agencies in recruiting districts and 2) recruiting districts and working with the district contact to identify appropriate respondents.


Table B.2.e illustrates the distribution of the 21 data collection instruments across components and levels of jurisdiction. The complete set of questionnaires can be found in Appendix G. The state, district and school recruitment scripts for the 2014 study can be found in Appendix H-1, H-2, and H-3, respectively. The state and district recruitment scripts for the 2016 study can be found in Appendix I-1 and I-2, respectively.

Table B.2.e Distribution of SHPPS Data Collection Instruments across Components and Respondent Levels


Component                                   Number of Instruments
                                            State    District    School    Classroom    Total
Health Education                            -        1           1         1            3
Physical Education and Activity             -        1           1         1            3
Health Services                             -        1           1         -            2
Nutrition Services                          -        1           1         -            2
Healthy and Safe School Environment         -        1           1         -            2
Mental Health and Social Services           -        1           1         -            2
Faculty and Staff Health Promotion          -        1           1         -            2
Number of Questionnaires                    -        7           7         2            16
Recruitment Scripts                         2        2           1         -            5
Total Number of Instruments                 2        9           8         2            21

SHPPS 2014 and SHPPS 2016 will capitalize on earlier efforts to revise and review the questionnaires in preparation for fielding SHPPS 2012. CDC and the contractor conducted extensive reviews of the SHPPS 2006 questionnaires. Questions were deleted when the 2006 data showed a question had low yield and the resulting data were not useful to CDC. Minor modifications, such as changes in question wording, were made to improve clarity. The only change to the previously OMB-approved school-level questionnaires is updating references to the 2012 academic year to refer to the 2014 academic year. A new component of the SHPPS 2014 study is the inclusion of vending machine observations. This element, originally scheduled to occur as part of SHPPS 2012 but canceled due to budget cuts, will yield the only nationally representative dataset of snack and beverage offerings available to students through school vending machines.


The survey instruments used to collect district-level data in 2016 underwent a similar revision process as that used with the school-level instruments described above. In addition, question wording was revised because of a change in the mode of administration from CATI in 2006 to web-based in 2012. No further revisions to the OMB-approved SHPPS 2012 district-level questionnaires are anticipated for use in 2016.


B.2.f Obtaining Access to and Support from State Education Agencies (SEAs), School Districts, and Schools


All initial letters of invitation will be on CDC letterhead from the Department of Health and Human Services and signed by the Director of DASH/NCHHSTP at CDC.


The procedures for gaining access to and support from states, districts, and schools for the conduct of SHPPS 2014 will have three major steps:


  1. First, support will be sought from SEAs. The initial request will be accompanied by a study fact sheet and a list of all sampled districts and schools in the SEA’s jurisdiction. States will be asked to provide general guidance on working with the selected school districts and schools and to notify school districts that they may anticipate being contacted about the survey.


  2. Once cleared at the state level, an invitation packet will be sent to sampled school districts in the state. The invitation packet will include a list of the schools sampled from within the district, and districts will be asked to provide general guidance on working with those schools and to notify them that they may anticipate being contacted about the study. Telephone contact will be made with each selected school district office and diocesan office of education as a follow-up to the initial correspondence.


  3. Once cleared at the school district level, selected schools will be invited to participate. Information previously obtained about the school will be verified. The burden and benefits of participation in the survey will be presented. After a school agrees to participate, a tailor-made plan for collection of data in the school will be developed (e.g., identify respondents, determine the best and worst weeks during the spring semester for data collection, gather schedules for respondents, etc.). Contact with schools will be maintained until all data collection activities have been completed.


The procedures for gaining access to and support from states and districts for the conduct of SHPPS 2016 will have two major steps:


  1. Again, we will first seek support from SEAs. The initial request will be accompanied by a study fact sheet and a list of all sampled districts in the SEA’s jurisdiction. States will be asked to provide general guidance on working with the selected school districts and to notify school districts that they may anticipate being contacted about the survey.


  2. Once cleared at the state level, an invitation packet will be sent to sampled school districts in the state. Telephone contact will be made with the district superintendent or designee to elicit support and identify district-level respondents.


Prior experience suggests that the process of working with each state education agency, school district, and school will have unique features. Discussions with each education agency will recognize the organizational constraints and prevailing practices of the agency. Invitation letters to states, districts, and schools for the 2014 data collection; scripts for guiding discussions with states, districts, and schools; and school questionnaire content outlines are found in Appendix H. Invitation letters to states and districts for the 2016 data collection, scripts for guiding discussions with states and districts, and district questionnaire content outlines are contained in Appendix I. The study fact sheets are contained in Appendix J.


B.2.g Data Collection Procedures


For the school-level study, data collection will begin in February 2014, pending the completion of appropriate clearance processes. Data collection will be conducted in-person using computer-assisted personal interview technology (CAPI) on netbook computers.


For the district-level study, data collection will begin in October 2015, pending the completion of appropriate clearance processes. Data collection will be via web-based questionnaire technology.


School collection in 2014. Once a school has agreed to participate in the study, a project staff member will contact the principal or school administrator to identify respondents and schedule data collection activities. To assist with respondent identification, schools will be provided with content outlines (Appendix H-7). Respondent names and interview schedules will be stored in an online case management system; the schedule will later be verified and confirmed by the field interviewer who is assigned to that school. At each school, the interviewers will complete each of the seven school-level questionnaires with the respondents most knowledgeable about the specific component within that school. In addition, interviews will take place with teachers of sampled health education and physical education courses. Procedures for identifying course respondents are described in section B.2.a.2, as well as below. Procedures for identifying primary respondents for the seven school-level questionnaires are described below.


Through the office of the school administrator of each sampled school, the school staff member primarily responsible for delivering and/or coordinating each of these components of school health programs will be identified as follows:


  • Health Education. The school administrator will be asked to identify the lead health educator (sometimes a department chair) who can provide overall information about the organization of the school's health education program. Note that, while these procedures will apply to middle and high schools, they may have to be modified somewhat for use in elementary schools (both for Health Education and Physical Education) because there may not be a lead teacher for these subjects in elementary schools.


  • Physical Education and Activity. The school administrator will be asked to identify the lead physical educator who can provide overall information about the organization of the school’s physical education and activity program. The school administrator also will be asked to identify the individual most knowledgeable about the interscholastic sports program at the school.


  • Health Services. The respondent universe includes personnel responsible for a variety of health services activities at the school including student health records, immunization requirements, screenings, administering student medications, and other health services. The school administrator will be asked to identify the individual(s) who is most knowledgeable about the health services provided within or by the school. Respondents will include physicians, nurses, health aides, and other designated school staff.


  • Nutrition Services. The school administrator will be asked to identify the person primarily responsible for managing the planning, preparation, and provision of school nutrition services, usually the school food service manager.


  • Healthy and Safe School Environment. The school administrator of each school will be the respondent on questions related to the school environment and the school’s health policies and practices, including those related to prevention of violence, tobacco use, alcohol use, and illegal drug use. The school administrator will be offered the option of designating an assistant school administrator or someone else as the more appropriate respondent. The school administrator also will be asked to identify the person most knowledgeable about issues related to the physical school environment and health hazards.


  • Mental Health and Social Services. The school administrator will be asked to identify the individual(s) who is most knowledgeable about the mental health and social services provided by the school. Respondents will include guidance counselors, social workers, nurses, school administrators, and assistant principals.


  • Faculty and Staff Health Promotion. The school administrator will be asked to identify the person who is most knowledgeable about the health promotion services and activities provided by the school for faculty and staff. Respondents will include nurses, teachers, members of a school wellness council, guidance counselors, principals, and assistant principals.


Courses/Classes


In each middle or high school, up to two classroom teachers will be interviewed for required courses with health education and physical education content. Courses will be randomly selected from all required health education and physical education courses offered at a school. In elementary schools, we will interview both regular classroom teachers and specialists, if any, who teach health and/or physical education content. Up to two elementary classroom teachers and/or specialists will be randomly selected among those grades where instruction on health or physical education is required. See section B.2.a.2 for details.


Observation component


In each school that has vending machines that are accessible to students during the school day, up to five vending machines (both snack and beverage) will be randomly selected to undergo observations. Observations entail the use of digital photography to capture objective information about the snack and beverage options available to students in vending machines. See section B.2.a.3 for details.


Data collection process. After a school visit has been scheduled by a member of the central study staff, a confirmation letter will be sent approximately one to two weeks before the visit, followed by a telephone call from the field interviewer responsible for the school. When a school agrees to participate, a customized plan for the data collection at the school will be developed in consultation with the school administrator. Every effort will be made to minimize disruption of the school schedule by working around school and classroom commitments. The school- and classroom-level interviews will be conducted by specially trained interviewers. An average of two days will be spent collecting data at each school. Data will be collected using computer-assisted personal interviewing (CAPI) technology.


A group of approximately 25 data collectors will be employed to conduct the school- and classroom-level interviews. Before the interviewers are sent to the field, they will undergo an intensive training program. The training will cover the purposes of the study, use of the computer and digital camera, standard interviewing procedures, confidentiality requirements, and handling problematic situations (e.g., cancellations, reluctant respondents). Training will include both group instruction as well as paired mock interviews where interviewers practice interviewing one another with a prepared script.


For the observation component, data collectors will take four photographs per vending machine. The first photo will be of an “identity card” that provides the school ID, state, and vending machine ID number. The following three photos will each represent a third of the machine’s offerings. Data collectors will start at the top of the machine and move downward so that the entire contents of a machine will be represented. This process will be repeated for each vending machine.


Digital photographs will be transferred from the interviewers’ cameras directly to the interviewers’ netbook computers daily. The same software used to conduct the face-to-face interviews will be used to manage vending machine photographs and associate them with the correct school, thus reducing the potential for error.


Interviewers will transmit their completed interview data and digital photographs electronically daily. Because the interview data will have already been keyed into the interviewers’ netbook computers, data entry will not be necessary. Also, since the computer-assisted methodology will prevent interviewers from skipping questions in error, the need for any follow-up contact with school or classroom respondents will be minimal.


District collection in 2016. District contacts will receive content outlines (Appendix I-5) in an initial mailing during the recruitment phase to assist them with the identification of the most knowledgeable respondents for each of the seven questionnaire content areas. Telephone follow-up will occur two to three days following the mailing to address any questions the contact may have and, if it is convenient for the contact, elicit the names of the most knowledgeable respondents for each questionnaire content area. Due to the breadth of topics that fall under some of the content areas (e.g., Healthy and Safe School Environment), more than one respondent may be needed to complete a questionnaire. For content areas for which we anticipate this to be the case, contacts will be provided the opportunity to designate the most knowledgeable respondent for each of the questionnaire’s “modules.” Questionnaire modules are comprised of topics that are similar in content and could likely be addressed by one person with expertise on those topics. Procedures for identifying the most knowledgeable respondents for each questionnaire content area are described below.


Through the district contact, personnel most knowledgeable about each of these components of school health programs will be identified as follows:


  • Health Education. The district contact will be asked to identify the district health education coordinator, who can address questions about school health education standards, instructional content by school level, staffing and professional development, collaboration efforts, and program promotion and evaluation.


  • Physical Education and Activity. The district contact will be asked to identify the district physical education coordinator, who can provide information about physical education standards, instructional content by school level, physical education for students with disabilities, use of protective gear, assessment, physical activity and discipline, staffing and professional development, program promotion and evaluation, and interscholastic sports.


  • Health Services. The district contact will be asked to identify the district health services coordinator, who can provide information about student health records; required immunizations; screening and testing; administering student medications; funding for standard health services; collaboration efforts; provision of health services; staffing characteristics; and school-based health centers.


  • Nutrition Services. The district contact will be asked to identify the district school food authority director or district food service director, who can provide overall information about menu planning and food ordering; food preparation; collaboration, promotion, and evaluation; professional development; and food service and child nutrition requirements and recommendations.


  • Healthy and Safe School Environment. The district contact will be asked to identify the district school health coordinator who can address questions related to the district’s policies on the prevention of violence, tobacco use, and injuries; crisis preparedness, response, and recovery; foods and beverages available outside of the school meal programs; and transportation to and from school. The contact also will be asked to identify the individual responsible for the oversight of issues related to physical school environment and health hazards for the district.


  • Mental Health and Social Services. The district contact will be asked to identify the district mental health and social services coordinator, who can provide information about provision of services; collaboration, promotion, and evaluation; staffing characteristics; and professional development.


  • Faculty and Staff Health Promotion. The district contact will be asked to identify the individual most knowledgeable about the district’s health insurance, required examinations and screenings, health promotion activities and services, employee assistance programs, and planning and coordination.


Respondents will be mailed an information packet prior to data collection. This packet will contain a fact sheet, a content outline for each questionnaire for which the respondent has been identified as most knowledgeable, instructions on how to access the study website, and a unique study identifier that will allow the respondent to log in and complete the questionnaire(s) to which he or she has been assigned.


Data collection process. Once the most knowledgeable respondents have been identified for each of the questionnaire content areas, their names and contact information will be stored in an online case management system (CMS). During this process, a unique study identifier is generated for each respondent and linked to the questionnaire(s) for which that respondent has been identified. Each respondent is then assigned a randomly generated, unique access code for the web-based data collection system, which is linked to the respondent's identifier in the CMS. Once respondents have received their informational packet, they may access the website from any Internet-connected computer using their assigned access code and begin completing the questionnaire(s) they were assigned.


Each time a respondent advances to a new screen of questions, data is saved to the central repository.  This allows respondents to “break-off” a questionnaire and return to it at a later time without data loss.  Since the data will already be keyed into the web-based system, data entry will not be necessary.  Also, because the computer-assisted methodology will prevent respondents from skipping questions in error, the need for any follow-up contact with district respondents will be minimal.


B.2.h Quality Control


The task of collecting quality data begins with a clear and explicit study protocol and ends with procedures for the verification of collected data. In between these activities, and following data collector training, measures will be taken to reinforce training, to assist field staff who encounter problems, and to check on data collection techniques. Table B.2.h lists the major means of quality control.

Table B.2.h Major Means of Quality Control


Survey Step

Quality Control Procedures

Survey Programming

  • Conduct internal programming review of CAPI and web-based questionnaires to ensure accuracy of questionnaires (100%)

Pre-mail Contact with Schools and Districts

  • Discuss the goals and content of the study and of the specific questionnaires with contact at school and district levels to ensure that the most appropriate respondents are identified

Mail Out for Schools and Districts

  • Check inner vs. outer label for correspondence (5% sample)

  • Verify that any errors in packaging were not systematic (100%)

Telephone Follow-up Contacts

  • Monitor early sample of calls to ensure that the recruiter follows procedures, elicits proper information, and has proper demeanor (10%)

Identification of Most Knowledgeable Respondents

  • Explain the goals and content of the questionnaires with contact at schools and districts to ensure that the most appropriate respondents are identified

Interviewer Training and Supervision for School Interviews

  • Maintain at least one weekly telephone monitoring of all field staff throughout data collection (100% of field staff)

  • Reinforce training and clarify procedures through periodic field newsletters (100% of field staff)

  • Verify by telephone with a 10% sample of early schools that all data collection procedures are being followed

Netbook Computer Verification

  • Prior to each data collection, conduct netbook computer verification procedures to ensure netbook boots, questionnaires are loaded on the netbook, and interview loads after each start-up (100%)

  • Reload questionnaires on netbooks and ensure problems were not systematic (100%)

Digital Camera Verification

  • Conduct internal testing of digital camera functionality, including operation of camera, synch process with netbook, tagging photos with appropriate school, and acceptance into central repository (100%)

  • Ensure data collectors demonstrate appropriate and correct use of digital camera technology as a requirement of data collector training (100%)

  • Prior to each data collection, conduct digital camera verification procedures to ensure camera has adequate battery life, that camera mode is operational, and that there is sufficient memory (100%)

Receipt Control of School Interview and Vending Machine Data

  • Examine first 10 transmissions from each interviewer to ensure data integrity and quality of photographs

  • Review sample of transmissions from each interviewer throughout data collection to ensure data integrity (5% of transmissions)

Receipt Control of District Survey Data

  • Examine data submitted from first 10 districts to ensure data integrity

  • Review sample of submitted data from throughout data collection to ensure data integrity (10% of transmissions)
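
The sampling-based checks in Table B.2.h (for example, reviewing 5% of school transmissions or 10% of district submissions) could be drawn as a simple random sample. The sketch below is illustrative only; the transmission identifiers, sampling rate, and fixed seed are assumptions, not study specifications.

```python
import random

# Minimal sketch of drawing a review sample for receipt control,
# e.g., 5% of school transmissions or 10% of district submissions.
# Transmission IDs, the rate, and the seed are illustrative assumptions.

def qc_sample(transmission_ids: list, rate: float, seed: int = 2014) -> list:
    """Return a simple random sample of transmissions for quality review."""
    rng = random.Random(seed)                        # fixed seed for a reproducible audit trail
    n = max(1, round(rate * len(transmission_ids)))  # always review at least one
    return sorted(rng.sample(transmission_ids, n))

ids = [f"TX{i:04d}" for i in range(1, 201)]          # 200 hypothetical transmissions
print(qc_sample(ids, rate=0.05))                     # 10 transmissions selected for review
```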



B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE

B.3.a Expected Response Rates


Due to loss of funding for SHPPS 2012, the school-level data collection did not occur; however, the overall school-level participation rate for SHPPS 2006 was 77.9%. We have conservatively assumed a minimum response rate of 74% for each questionnaire component, but we anticipate a higher overall school response rate (80%) for SHPPS 2014 because of improvements in non-public school recruitment procedures. Specifically, prior to sending an invitation letter, non-public schools will be contacted by recruiters with expertise in gaining cooperation from such schools, who will provide information about the study. These recruiters have received specialized training in anticipating the types of concerns non-public schools may have about participating in research studies and in how to address those concerns. In addition, support for the study will be sought from national associations of Christian and Catholic schools. Classroom-level response rates for the Health Education and Physical Education and Activity questionnaires were 95% in 2006, and we assume a similar classroom response rate (96%) for 2014.


At the district level, SHPPS 2012 achieved an overall response rate of 76.7%. We assume a similar response rate will be achieved in SHPPS 2016, as we will again utilize a web-based methodological approach and will launch data collection activities immediately after recruitment activities.
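
The assumed component rates translate into expected numbers of completes in a straightforward way. The sketch below uses hypothetical sample sizes as placeholders only; the planned sample sizes appear in the tables in Section B.2.b, not here.

```python
# Hypothetical illustration of how the assumed response rates translate into
# expected numbers of completes. The sample sizes below are placeholders,
# not the planned SHPPS sample sizes (see Tables B.2.b1-B.2.b7).

sampled_schools    = 800    # hypothetical number of sampled eligible schools
classes_per_school = 2      # hypothetical number of sampled classes per school
sampled_districts  = 900    # hypothetical number of sampled eligible districts

school_rate    = 0.80       # anticipated school response rate, SHPPS 2014
classroom_rate = 0.96       # assumed classroom response rate, SHPPS 2014
district_rate  = 0.767      # district response rate achieved in SHPPS 2012

expected_schools   = round(sampled_schools * school_rate)
expected_classes   = round(expected_schools * classes_per_school * classroom_rate)
expected_districts = round(sampled_districts * district_rate)

print(expected_schools, expected_classes, expected_districts)  # 640 1229 690
```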

B.3.b Methods for Maximizing Responses and Handling Nonresponse


Several methods will be used to maximize responses to SHPPS 2014 and SHPPS 2016. These methods will emphasize the importance of the study, minimize the burden of participation, and maximize the reward of participation. Specific methods are described below.


Methods to Emphasize the Importance of the Study


  1. Strong support from national and state education and health organizations will be conveyed during the initial recruitment of sample members. Letters of support will emphasize the value of participation.

  2. State education agencies will be asked to write a letter of support for the study that will be used at the district and school levels. Similarly, written district support will be cited during the contacts with school personnel.

  3. CDC sponsorship of the study will be stressed in all communication with sample members. Correspondence with the sample members will be on CDC letterhead and signed by the Director of the Division of Adolescent and School Health.

  4. Project materials will emphasize the importance of the study for improving school health programs for youth. Materials will include fact sheets presenting data from SHPPS 2006 and SHPPS 2012 as well as the national health objectives that the study addresses.

  5. Sample members will be informed that early initial contact is being made to facilitate their participation. Similarly, sample members will be informed that recruiters will make repeated follow-up efforts to encourage participation due to the great importance that the data have to federal, state, and local health and education officials.

Methods to Minimize Response Burden


  1. An iterative process of review by experts and practitioners has ensured that every question included in the study is significant, thereby reducing the risk that sample members will spend time answering unnecessary questions.

  2. Use of CAPI in 2014 to conduct the interviews will reduce respondent burden by automatically navigating through complex logic and skip patterns.

  3. The use of web-based technology in 2016 will allow respondents to respond to the questionnaires at a time and place of their convenience from any Internet-connected computer.

  4. Questionnaires have been modularized so that more than one respondent can address the different topics covered in a single questionnaire. For example, for the Healthy and Safe School Environment questionnaire, one respondent may address general policies while another addresses questions on the physical school environment. This approach helps reduce the burden on any one respondent.

  5. Setting a minimum school enrollment of 30 students for inclusion in the sampling frame reduces the number of very small schools in the sample. In very small schools, respondents often wear many different “hats” and would therefore be selected to respond to multiple content areas, increasing their burden.


Methods to Reward Participation


  1. Schools will be directed to educational materials provided by CDC as an incentive for participation. Although these materials are available to the public upon request, schools might not be aware of their availability.

  2. Following the lead of other national, school-based studies, schools will be offered a monetary stipend in appreciation of their participation. Stipends will be awarded in the amount of $250.


Handling Nonresponse


A thorough sample validation will occur prior to commencement of recruitment. The main objective of the validation is to confirm that each school district and school still exists and fully meets all of our criteria to be considered eligible for SHPPS. Districts and schools found to be ineligible for SHPPS prior to the start of recruitment will be replaced as described above in Sections B.1.b and B.1.d. Districts and schools found to be ineligible after the start of recruitment will be replaced to the extent that is feasible.
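
A minimal sketch of the pre-recruitment eligibility check appears below. The record fields, the eligibility criteria shown, and the replacement step are simplified placeholders; the actual replacement rules are those described in Sections B.1.b and B.1.d, which are not reproduced here.

```python
# Minimal sketch of pre-recruitment sample validation. The record fields,
# eligibility criteria, and replacement logic are simplified placeholders;
# the actual rules follow Sections B.1.b and B.1.d.

def is_eligible(unit: dict) -> bool:
    """Confirm the school or district still exists and meets study criteria."""
    return unit.get("still_exists", False) and unit.get("enrollment", 0) >= 30

def validate_sample(sample: list, reserve: list) -> list:
    """Replace ineligible units from a reserve list to the extent feasible."""
    validated = []
    for unit in sample:
        if is_eligible(unit):
            validated.append(unit)
        elif reserve:
            validated.append(reserve.pop(0))   # simplified replacement rule
    return validated

sample  = [{"name": "School A", "still_exists": True,  "enrollment": 450},
           {"name": "School B", "still_exists": False, "enrollment": 300}]
reserve = [{"name": "School C", "still_exists": True,  "enrollment": 520}]
print([u["name"] for u in validate_sample(sample, reserve)])  # ['School A', 'School C']
```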


The secondary purpose of validation is to confirm all the information we have about a school to prepare the recruiters and data collectors for working with the school. This includes information about school size, address, name of principal, telephone and email addresses, and other information critical to planning to communicate with the school. Similar validation processes are followed with respect to districts.


The best approach to handling nonresponse is to avoid it whenever possible. The study contractor has more than 30 years of successful experience with national school-based, health-related data collections and more than 25 years of experience conducting computer-assisted surveys. Every effort will be made to encourage all sample members to participate in the study, and when study staff make personal contact with sample members, they will always strive to obtain participation and avoid refusals. Study staff will remain in contact with respondents who have agreed to participate and will monitor the completion of questionnaires. Follow-up with respondents who have agreed to participate but have not submitted a completed questionnaire will occur via telephone and/or email.
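
The monitoring of agreed-but-incomplete respondents described above could be implemented as a simple flagging rule. The field names and the 14-day threshold below are illustrative assumptions, not study specifications.

```python
from datetime import date

# Illustrative sketch of flagging respondents who agreed to participate but
# have not yet submitted a completed questionnaire. Field names and the
# 14-day threshold are assumptions for illustration.

def needs_follow_up(respondents: list, today: date, days: int = 14) -> list:
    """Return respondents due for telephone and/or email follow-up."""
    return [r for r in respondents
            if r["agreed"] and not r["completed"]
            and (today - r["agreed_on"]).days >= days]

roster = [
    {"name": "Respondent 1", "agreed": True, "completed": False, "agreed_on": date(2016, 2, 1)},
    {"name": "Respondent 2", "agreed": True, "completed": True,  "agreed_on": date(2016, 2, 1)},
]
print([r["name"] for r in needs_follow_up(roster, today=date(2016, 3, 1))])  # ['Respondent 1']
```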


When a staff member encounters a reluctant respondent, the case will be referred to a more senior staff member in an attempt to encourage participation in the study. In addition, study staff will encourage sample members to contact the project director and the study’s Federal project officer with questions and concerns that they may have. The project director and project officer will be available by telephone to answer these questions and concerns. These procedures have proved successful in several studies of this nature.


B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN


From November 2010 through January 2011, the contractor conducted a pretest to assess the clarity of new and modified questionnaire items. The pretest was conducted within OMB guidelines with volunteer district and school personnel and classroom teachers; between four and seven respondents were involved in the pretest of a given questionnaire. Respondents were selected to obtain a diverse group. At the school level, responding schools were both public and private, were from several regions of the country (e.g., East, Midwest), and varied widely in size and in the socio-economic status of the students served. At the district level, responding districts were from several regions of the country (e.g., South, Midwest), were from both urban and non-urban areas, and served students at various levels of socio-economic status.


To approximate the circumstances under which school- and classroom-level data collection will occur, pretests took place on-site at schools, because CAPI will be used to conduct the full school- and classroom-level surveys. District-level pretests took place by telephone, with respondents seated at an Internet-connected computer. Cognitive interviews were conducted to determine how respondents interpreted new and modified items; to evaluate the adequacy of response options, definitions, and other descriptions provided within the questionnaires; and to assess the appropriateness of specific terms or phrases. As a result of the pretests, respondent burden was reduced and the potential utility of survey results was enhanced through the elimination or clarification of questions. For example, questions that most respondents did not understand even after the interviewer provided clarification were deleted from the questionnaires. For other poorly understood questions, if the intent became clear once the interviewer provided definitions or examples, those definitions or examples were incorporated into the questions to improve clarity. When respondents could not differentiate between two similar questions, the questions were combined into a single question. Finally, when respondents noted that a response option they might have selected was not available, that response option was added to the questionnaire.


Empirical estimates of respondent burden were also obtained through the administration of each of the school, classroom, and district questionnaires in its entirety.

B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA

B.5.a Statistical Review

Statistical aspects of the study have been reviewed by:

Ronaldo Iachan, Ph.D., Senior Statistician

ICF International

11785 Beltsville Drive

Calverton, Maryland 20705

(301) 572-0538

[email protected]


William Robb, Ph.D., Senior Statistician

ICF International

40 Wall Street

New York, NY 10005

(646) 695-8182

[email protected]

B.5.b Agency Responsibility

Within the agency, the following individual will be responsible for receiving and approving contract deliverables and will have primary responsibility for data analysis:

Nancy Brener, Ph.D.

Division of Adolescent and School Health

Centers for Disease Control and Prevention

Atlanta, Georgia 30341

770-488-6184

[email protected]

B.5.c Responsibility for Data Collection


The representative of the contractor responsible for conducting the planned data collection is:


Rocco Russo, Ph.D.

ICF International

11785 Beltsville Drive

Calverton, Maryland 20705

(301) 572-0250

[email protected]

