
OMB SUPPORTING STATEMENT: Part B



SCHOOL HEALTH POLICIES AND PRACTICES STUDY 2012


OMB No. 0920-0445

Reinstatement with Changes









Submitted by:


Division of Adolescent and School Health

National Center for Chronic Disease Prevention and Health Promotion


Centers for Disease Control and Prevention

Department of Health and Human Services



Project Officer:


Nancy D. Brener, PhD

Division of Adolescent and School Health

National Center for Chronic Disease Prevention and Health Promotion

Centers for Disease Control and Prevention

4770 Buford Highway, NE

Mailstop K-33

Atlanta, GA 30341-3717

Phone: 770-488-6184

Fax: 770-488-6156

E-mail: [email protected]


April 15, 2011

Revised August 31, 2011



TABLE OF CONTENTS


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Respondent Universe and Sampling Methods
   a. Respondent Universe
   b. State Education Agencies
   c. School Districts
   d. Schools
   e. Courses with Health or Physical Education Content
   f. Statistical Methodology for Stratification and Sample Selection
   g. Estimation and Justification of Sample Size
   h. Weighting and Estimation Procedures

2. Procedures for Collection of Information
   a. Use of Less Frequent Data Collection to Reduce Burden
   b. Survey Questionnaires
   c. Obtaining Access to and Support from State Education Agencies, School Districts, and Schools
   d. Data Collection Procedures
   e. Quality Control

3. Methods to Maximize Response Rates and Deal with Nonresponse
   a. Expected Response Rates
   b. Methods for Maximizing Responses and Handling Nonresponse

4. Tests of Procedures or Methods to be Undertaken

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
   a. Statistical Review
   b. Agency Responsibility
   c. Responsibility for Data Collection

APPENDICES



  A. Authorizing Legislation

  B. 60-Day Federal Register Announcement

  C. Justification of SHPPS in Terms of the Year 2020 Health Objectives for the Nation

  D. Consultations in Questionnaire Design
     D-1 Content Panel Participants
     D-2 National Reviewers

  E. Participant Notification Documents
     E-1 State Participant Notification Document
     E-2 District Participant Notification Document
     E-3 School Participant Notification Document
     E-4 Classroom Participant Notification Document

  F. Example Tables

  G. Questionnaires
     G-1 State Health Education
     G-2 State Physical Education and Activity
     G-3 State Health Services
     G-4 State Nutrition Services
     G-5 State Healthy and Safe School Environment
     G-6 State Mental Health and Social Services
     G-7 District Health Education
     G-8 District Physical Education and Activity
     G-9 District Health Services
     G-10 District Nutrition Services
     G-11 District Healthy and Safe School Environment
     G-12 District Mental Health and Social Services
     G-13 District Faculty and Staff Health Promotion
     G-14 School Health Education
     G-15 School Physical Education and Activity
     G-16 School Health Services
     G-17 School Nutrition Services
     G-18 School Healthy and Safe School Environment
     G-19 School Mental Health and Social Services
     G-20 School Faculty and Staff Health Promotion
     G-21 Classroom Health Education
     G-22 Classroom Physical Education and Activity

  H. State and District Communications
     H-1 State Invitation Letter
     H-2 District Invitation Letters
     H-3 State Recruitment Script
     H-4 District Recruitment Scripts
     H-5 State-level Content Outlines
     H-6 District-level Content Outlines

  I. School Communications
     I-1 Invitation Letters
     I-2 Recruitment Scripts
     I-3 School-level Content Outlines
     I-4 Classroom-level Content Outlines

  J. Fact Sheet

  K. References



LIST OF TABLES



A.12.A  Total Burden Hours
A.12.B  Total Costs to Respondents
B.1.A   Frame Number of School Districts
B.1.B   Average Number of Districts per PSU in the First-stage Strata
B.1.C   Population Counts in the Frame for Each School Stratum
B.1.D   Planned Sample Sizes for the Various Stages
B.1.E   Sample Sizes for the District Survey
B.1.F   Sample Sizes for the School Survey
B.1.G   Standard Errors and Design Effects for Key Estimates Computed in the 2006 SHPPS
B.1.H   School District Sample Sizes Needed to Achieve Target Levels of Precision for Various Design Effect Scenarios
B.1.I   School Sample Sizes Needed to Achieve Target Levels of Precision for Various Design Effect Scenarios
B.2.A   Distribution of SHPPS Data Collection Instruments across Components and Respondent Levels
B.2.B   Major Means of Quality Control


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS


The sample design for SHPPS 2012 is identical to the one used in SHPPS 2006, except for an increase in the number of districts drawn in 2012. Based on the district response rate and percentage of ineligible districts obtained in 2006, this increase was deemed necessary to meet precision targets.


B.1.a Respondent Universe


The study universe includes state education agencies, local education agencies (school districts), schools, and school courses with health or physical education content. Respondents for states, districts, and schools will be personnel who have responsibility for one or more of the seven components of school health programs for which data collection instruments have been developed: health education, physical education and activity, health services, nutrition services, healthy and safe school environment, mental health and social services, and faculty and staff health promotion. Respondents for health and physical education courses will be teachers of those courses.


B.1.b State Education Agencies


The state education agencies (SEAs) of the 50 states and the District of Columbia constitute the universe for state-level data collection in SHPPS 2012. The study will survey all 51 of these agencies. Through the office of the chief state school officer in each SEA, the state officials primarily responsible for each of the following components of school health programs will be identified as respondents: health education, physical education and activity, health services, nutrition services, healthy and safe school environment, and mental health and social services. In some states, the chief state school officer may identify a state-level official who does not work for the SEA (e.g., a health services coordinator in the state health department) as the most knowledgeable respondent for a specified component.


B.1.c School Districts


The universe of school districts in the 50 states and the District of Columbia contains approximately 12,800 public school districts. Non-public schools that are not arranged in districts or the equivalent will not be represented in the district-level universe but will appear in the school-level universe and sample.


The study will survey a stratified random sample of public school districts. The frame of school districts will be stratified by urbanicity and poverty. The initial sample size will be 1,006 districts, of which we expect approximately 665 eligible districts to respond (see section B.1.g). A school district will be regarded as ineligible if, for example, it does not contain schools operating under its authority or serves only special schools outside our study universe. Based on response rates from SHPPS 2006, we anticipate a district participation rate of 75 percent; that is, 75 percent of eligible sampled districts are expected to participate in the district-level survey. We refer to the expected number of eligible responding districts as the “district respondent sample size.” Note that in computing the participation rate, the number of projected ineligibles is excluded from the denominator. We anticipate that of the 1,006 selected school districts, approximately 120 will be found to be ineligible and excluded from the number of prospective participants, leaving 886 eligible districts, of which 665 (75 percent of eligibles) are expected to participate. This discussion does not include the 20 DASH-funded districts, which are included with certainty.
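The arithmetic behind these figures can be checked with a minimal sketch; the counts and rates are those stated above, and the final value of 664.5 is reported in the text as approximately 665:

    # Minimal sketch checking the district sample arithmetic described above.
    initial_sample = 1006          # districts initially selected
    projected_ineligible = 120     # districts expected to be ineligible (about 12 percent)
    eligible = initial_sample - projected_ineligible       # 886 eligible districts
    participation_rate = 0.75                              # participation among eligibles
    expected_respondents = eligible * participation_rate   # 664.5, reported as approximately 665
    print(eligible, expected_respondents)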


Through the office of the superintendent of each of these district-level entities, the district official primarily responsible for coordinating each of the components of school health programs will be identified as respondents: health education, physical education and activity, health services, nutrition services, healthy and safe school environment, mental health and social services, and faculty and staff health promotion. It is anticipated that occasionally the district superintendent will identify a local government official who does not work for the school district (e.g., a health services coordinator in the county health department) who is the most knowledgeable respondent for that component.


Table B.1.A provides a summary of sampling frame statistics from the 2012 study, including the number of school districts in the frame as well as the numbers in each of the first-stage strata defined by poverty and urban status as described in Section B.1.f.


Table B.1.A Frame Number of School Districts

First-stage Stratum         Number of Districts    Percent of All Districts
Urban, Low Poverty                 3,447                    26.96
Urban, High Poverty                2,625                    20.53
Rural, Low Poverty                 2,952                    23.09
Rural, High Poverty                3,760                    29.41


Table B.1.B shows the variation in average number of districts per PSU across the four first-stage strata. The average numbers of districts per PSU are used to fine-tune the number of PSUs needed in the first-stage (first-phase) sample to generate the target number of districts per stratum and overall.


Table B.1.B Average Number of Districts per PSU in the First-stage Strata

First-stage Stratum         Stratum Count (PSUs)    Average Districts per PSU
Rural Poor                         1,197                    3.14119
Rural Non-poor                       890                    3.31685
Urban Poor                         1,416                    1.85381
Urban Non-poor                     1,904                    1.81040


Table B.1.C (below) provides additional details of the distribution of schools in the sampling frame.


B.1.d Schools


The universe of schools contains approximately 126,000 public and non-public schools. The study will survey a stratified random sample of public and non-public schools, selected within the geographic area covered by districts sampled in the prior stage of sampling. Schools will be stratified by school type (public vs. non-public), level (elementary vs. middle vs. high), and size (small vs. large).


The initial school sample will contain approximately 1,452 schools. Based on response rates from SHPPS 2006, we anticipate a school participation rate of 74 percent; i.e., 74 percent of eligible sampled schools are expected to participate. We refer to the expected number of eligible responding schools as the “school respondent sample size.” Please note that in computing the participation rate, the number of projected ineligibles has been excluded from the denominator. Schools could be excluded because they have ceased to operate, changed their target population such that they no longer fall into our universe (e.g., regular school changed to a special education school), or changed the age group they serve (e.g., a school selected as a middle school is now an elementary school). Schools found to be ineligible during sample validation will be replaced by similar schools—same level and type—selected within the same PSU. If no such school is available in the same PSU, then a similar school will be selected from a neighboring PSU within the same state.


We anticipate that of 1,452 selected schools, approximately 43 will be found to be ineligible, and 1,409 will be eligible schools. The estimated percent ineligible (2.9%) and participation rate (74%) are based on our experience in fielding SHPPS 2006. Of the 1,409 eligible schools, 1,043 (74 percent) eligible schools are expected to participate.


Table B.1.C provides frame totals for the school strata from the 2012 study, i.e., for the second-stage strata defined by school type, level and size described in Section B.1.f.


Table B.1.C Population Counts (Number of Schools) in the Frame for Each School Stratum (Second-stage Strata Defined by School Type by Level by Size)

School Type and Size       Elementary Schools    Middle Schools    High Schools
Public, Small                     19,841              7,742            7,588
Public, Large                     32,816             16,599           11,145
Non-Public, Small                 13,815              9,727            4,402
Non-Public, Large                    898                954              872
TOTAL                             67,370             35,022           24,007



Working with the principal or designated contact person at each participating school, we will seek to identify as prospective respondents the school staff member or members primarily responsible for delivering and/or coordinating each of the components of school health programs: health education, physical education and activity, health services, nutrition services, healthy and safe school environment, mental health and social services, and faculty and staff health promotion.


B.1.e Courses with Health or Physical Education Content


A probabilistic sample of all required courses and elementary school classes containing health education and/or physical education content will be drawn for inclusion in the study. For each such required course/class, we will randomly select one teacher as a study respondent. The selection of these teachers is described in section B.1.f.


B.1.f Statistical Methodology for Stratification and Sample Selection


The sampling design modifications introduced for the 2006 SHPPS, which yielded smaller design effects (DEFFs), and therefore precision gains, for district, school, and course sample estimates relative to the 2000 study, have been retained in the 2012 study. Sample sizes have been re-calculated for the 2012 study based on design effect estimates and response rates obtained from the 2006 study to ensure that precision requirements, unchanged from the 2006 study, can be met in the most efficient manner.

Each stage of the sampling and the construction of the composite measure of size (MOS) for the districts are discussed further below. Section B.1.g summarizes the sample sizes and gives the expected precision of survey estimates.


1) District Sample (First Stage of Sampling)


At the first stage of sampling, approximately 420 primary sampling units (PSUs) will be selected, encompassing 1,006 districts. These PSUs are geographical groupings of districts. This first-stage sample will be stratified by urban status and poverty, and selected with equal probabilities within strata. It is anticipated that approximately 665 districts will participate among those found eligible for the study, for a participation rate of 75 percent.


For sampling, districts with fewer than six schools will be combined to form first-stage sampling units (PSUs), and some very large districts, including the 20 school districts funded by DASH, will be included with certainty. The number of districts per PSU will not be known until this process is complete; thus, we cannot provide an exact sample size for the PSU draw. While the district sample size is fixed to ensure that the sample meets precision requirements, the PSU draw size is not. This number will be fixed following frame construction to ensure that the required number of districts is drawn.


Domains of interest for district-level estimates are all districts combined, the two levels of poverty, and the two levels of urbanicity. Districts will be allocated equally across the four district strata. We will select a district sample large enough to give district estimates of the desired precision, and we will sample schools from a sub-sample of the districts.


Stratification variables


District-level estimates from the study will be proportions of districts overall as well as proportions by categories of urbanicity and poverty.


The frame for the district sample will consist of all regular districts, excluding supervisory unions of school districts and special purpose districts (e.g., those for special education or vocational education only). As part of frame construction, non-public schools will be linked geographically to the public school district within whose area they are located, and those non-public schools will be included in the count of schools for the district.


The frame of districts will be stratified by two categories of urbanicity (urban versus non-urban) and by two categories of poverty level. The category definitions will follow those developed for the 2006 study.


The variables used will be obtained from the Census 2000 SF3 ZCTA (ZIP Code Tabulation Area) file. The median of the Percent Poor variable will be used to define poverty strata within each of the two urbanicity strata. The medians will be used to equalize the strata along each dimension.


The two poverty strata are defined by a low versus a high percentage of students living below the poverty level. The poverty level variable will be based on the percentage of school-age children living below the poverty level in the given ZIP Code area. The Percent Rural variable is computed as the ratio of the Rural total to the sum of the Urban and Rural totals, where the rural and urban totals are the numbers of persons living in the rural and urban portions of the ZIP Code, respectively.


Specifically, districts within ZIP Codes with values for this variable less than or equal to the median for the district universe will be classified as “low poverty” and districts within ZIP Codes with values greater than the median will be classified as “high poverty.” Similarly, the median of the Percent Rural variable will be used to classify ZIP Code areas, and hence districts, into rural and urban strata. Districts within ZIP Codes with values for this variable less than or equal to the median for the district universe will be classified as urban, and districts within ZIP Codes with values greater than the median will be classified as rural.
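As an illustration of this median-split rule, the minimal sketch below classifies a few hypothetical ZIP Code areas; the values and field layout are illustrative only, not drawn from the Census file:

    # Minimal sketch of the median-split stratification described above.
    # zip_areas holds illustrative (percent poor, percent rural) values for ZIP Code
    # areas; real values would come from the Census 2000 SF3 ZCTA file.
    from statistics import median

    zip_areas = {
        "30341": (12.0, 5.0),
        "59801": (21.5, 64.0),
        "10027": (30.2, 0.0),
        "66061": (6.8, 41.0),
    }

    poor_median = median(p for p, _ in zip_areas.values())
    rural_median = median(r for _, r in zip_areas.values())

    def classify(zip_code):
        """Return the (poverty, urbanicity) stratum for a district's ZIP Code area."""
        pct_poor, pct_rural = zip_areas[zip_code]
        poverty = "low poverty" if pct_poor <= poor_median else "high poverty"
        urbanicity = "urban" if pct_rural <= rural_median else "rural"
        return poverty, urbanicity

    for z in zip_areas:
        print(z, classify(z))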


Selecting the district sample

A stratified sample of districts will be selected in two phases:

  1. an equal probability sample for the district survey, and

  2. a PPS sub-sample as a platform for the school sample.


The MOS for selecting a sub-sample of districts is a composite size measure based on the desired school sampling fractions and the number of schools (public and non-public) in each of the second-stage (school) strata.

The sample sizes in each of the second-stage (school) strata will be allocated to the first-stage (district) strata in proportion to the number of schools in each of the first-stage strata. The 12 second-stage strata are the key school subgroups defined by school level (3 categories), school type (two categories), and school size (two categories).


Composite measures of size will be used to ensure that:

  • The targeted school sample sizes are approximately achieved for the school subpopulations of interest (i.e., the second-stage strata defined by level and size),

  • An approximately equal number of schools are selected per district (to help equalize the numbers of schools to be visited per district), and

  • School weights are approximately equal within strata.

2) School Sample (Second Stage of Sampling)


Schools eligible for the study will be public and non-public schools containing any of grades 1 through 12. Kindergarten is a grade of interest, but schools containing kindergarten and/or pre-kindergarten but not first grade will be excluded. Schools that contain kindergarten in addition to higher grades remain eligible.


From a sub-sample of 253 PSUs, corresponding to about 566 districts, a stratified random sample of 1,452 (initial sample size) public and non-public schools will be selected. The average number of districts per PSU depends on the stratum as shown earlier in Table B.1.B.


Stratification variables


The frame of schools within the identified sub-sample of districts will be stratified by school level (that is, elementary, middle, or high), school type (public and non-public), and by school size (small and large). School level will be defined based on the grades present in the school and school size will be based on school enrollment for relevant grades.


School level will be defined as the following mutually exclusive subgroups of eligible schools (recalling that these exclude schools that only offer Kindergarten):


Elementary: Schools with any grade 5 or under

Middle: Schools with grades 7 or 8, or only grade 6, or only grades 5 and 6

High: Schools with any of grades 10, 11, or 12, or only grade 9


Any school that falls into more than one of the level categories will be conceptually split into separate frame units, one in each of the level strata in which it appears. We anticipate that approximately one fourth of the schools in the frame will fall into more than one category and will be split.


The categories of school size will be based on school enrollment, and set to divide the population of schools into two approximately equal groups (within level-by-type cells). For purposes of computing enrollment, schools falling into more than one of the school level categories will be divided into enrollment groups using the following guidelines: grades K-5 will be classified as elementary, grades 6-8 will be classified as middle, and grades 9-12 will be classified as high school. A classification scheme developed for the 2006 study will be employed that, while adhering to these guidelines, prevents the creation of inappropriate “single grade” schools.


Using the current Quality Education Data file, school size categories are defined using stratum boundaries, or cutoffs, computed as the median enrollment in each cell defined by school level. The medians used as cutoffs are as follows:


343 for elementary schools

178 for middle schools

372 for high schools



For example, for elementary schools, small schools are those with enrollment of 343 or less, and large schools those with enrollment in excess of this cutoff. Schools with an enrollment of less than 30 will be dropped from the frame.
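A minimal sketch of this size classification, using the cutoffs quoted above and a few illustrative enrollment records (the actual frame would come from the Quality Education Data file):

    # Minimal sketch of the school size stratification described above.
    CUTOFFS = {"elementary": 343, "middle": 178, "high": 372}   # median enrollments quoted above
    MIN_ENROLLMENT = 30                                         # schools below this are dropped

    def size_stratum(level, enrollment):
        """Classify a school as 'small' or 'large' within its level, or drop it from the frame."""
        if enrollment < MIN_ENROLLMENT:
            return None                                         # dropped from the frame
        return "small" if enrollment <= CUTOFFS[level] else "large"

    # Illustrative records: (level, enrollment)
    for level, enrollment in [("elementary", 343), ("elementary", 344), ("middle", 25), ("high", 900)]:
        print(level, enrollment, size_stratum(level, enrollment))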


The school sample is based on a sub-sample of districts selected from the first-stage sample.


The composite MOS for selecting the probability-proportional-to-size (PPS) (sub)sample of districts will be computed as:

S(i) = Σ(h=1 to 4) Σ(j=1 to 12) f(h,j) N(h,i,j).

The sample size of schools to be selected from second-stage stratum j in district (h,i) is computed as:

n(h,i,j) = n f(h,j) N(h,i,j) / S(i),

where:

h = index for the first-stage (district) strata, h = 1, ..., 4
i = index for districts within district strata
j = index for the second-stage (school) strata, j = 1, ..., 12
N(h,i,j) = number of schools in first-stage stratum h, district i, second-stage stratum j
f(h,j) = n(h,j)/N(h,j) = desired sampling fraction for schools in first-stage stratum h, second-stage stratum j
n(h,j) = desired sample size of schools in first-stage stratum h, second-stage stratum j
N(h,j) = number of schools in first-stage stratum h, second-stage stratum j = Σ(i) N(h,i,j)
n = total desired school sample size.

The n(h,i,j) are the sample sizes of schools to be selected from each second-stage stratum in each sampled district i. They will generally be non-integer and will be randomly rounded. The n(h,i,j) have the property that the total in each sampled district i will be approximately equal, and equal to n/d, where d is the total number of districts in the sample (see Folsom, Potter, and Williams [4]). The n(h,i,j) also have the property that

Σ(h) Σ(i) n(h,i,j) = n(j), the desired second-stage sample size.
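The following minimal sketch, using hypothetical frame counts and sampling fractions, illustrates how the composite MOS links the PPS sub-sampling of districts to the within-district school allocation so that each sampled district yields approximately n/d schools (in the spirit of Folsom, Potter, and Williams; production computations may scale these quantities differently):

    # Minimal sketch of the composite-MOS allocation described above.
    # N[(h, i, j)] = schools in district i of first-stage stratum h, second-stage stratum j
    # f[(h, j)]    = desired overall school sampling fraction in cell (h, j)
    # All values are hypothetical.
    import random

    N = {(1, 1, 1): 40, (1, 1, 2): 10, (1, 2, 1): 20, (1, 2, 2): 30}
    f = {(1, 1): 0.10, (1, 2): 0.20}
    districts = [(1, 1), (1, 2)]
    d = 1                                   # number of districts to sub-sample (illustrative)

    def S(h, i):
        """Composite measure of size: S(i) = sum over j of f(h,j) * N(h,i,j)."""
        return sum(f[h, j] * N[h, i, j] for (hh, ii, j) in N if (hh, ii) == (h, i))

    total_mos = sum(S(h, i) for h, i in districts)     # equals n, the total desired school sample
    pi = {(h, i): d * S(h, i) / total_mos for h, i in districts}   # PPS selection probability

    def random_round(x):
        """Round x up with probability equal to its fractional part."""
        return int(x) + (random.random() < x - int(x))

    # Within a selected district, stratum-j schools are taken at rate f(h,j)/pi(h,i), so the
    # expected take is f(h,j)*N(h,i,j)/pi(h,i); the per-district totals come out to about n/d.
    for h, i in districts:
        takes = {j: f[h, j] * N[h, i, j] / pi[h, i] for j in (1, 2)}
        print((h, i), {j: random_round(t) for j, t in takes.items()},
              "expected district total:", round(sum(takes.values()), 1))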

Selecting the school sample


The sample allocation for the number of schools in each of the school strata, n(j), is determined to satisfy the variance constraint that 95 percent confidence intervals around estimated proportions be no wider than +/- 0.05. These precision requirements are the same as the levels achieved in the 2006 SHPPS.


3) Course Sample (Third Stage of Sampling)


Study staff will interview teacher respondents about required courses with health education and/or physical education content in each of the sampled schools. Therefore, sampling units (as well as analysis units) will be courses or elementary school classes, and they will be represented by selected teachers who will report the data for the course/class.


For each of the two content areas, Health Education and Physical Education, two teachers will be sampled randomly from among all eligible teachers, i.e., those who are currently teaching the course, have taught the course during the current school year, or taught the course as recently as the spring semester of the prior school year and are still members of the school’s staff. Note that in elementary schools most regular classroom teachers will likely meet these criteria, and, at least up through fourth grade, the natural sampling units are grades rather than the courses used for secondary schools. The differences in selection procedures for secondary versus elementary schools are described below.


  a) Secondary Schools


The process involves several steps performed separately for Physical Education (PE) and Health Education (HE) within each sampled school. We describe the steps for HE, with a comparable process taking place for PE.


  1. Construct a list of all courses containing health instruction.

  2. Select a random sample of 2 courses if the list contains more than 2 courses; otherwise, take all courses.

  3. Identify the teachers linked to each selected course.

  4. For each selected course, randomly select one teacher from the list of teachers in the prior step.

  5. For the teacher/course pair, select one section from the course sections taught by the teacher.

For each school, we will carefully record the counts involved in steps 1, 2, 4, and 5, as these will be used for weighting the selected section up to the course and school levels; a minimal sketch of these steps appears below.
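The sketch below walks through steps 1 through 5 for one hypothetical school; the course roster, teacher labels, and data structure are illustrative only:

    # Minimal sketch of the HE course/teacher/section selection (steps 1-5 above)
    # for one secondary school. The course roster is illustrative.
    import random

    # course -> {teacher -> [sections taught by that teacher]}
    he_courses = {
        "Health 7": {"Teacher A": ["Period 2", "Period 5"], "Teacher B": ["Period 3"]},
        "Health 8": {"Teacher C": ["Period 1"]},
        "Life Skills": {"Teacher D": ["Period 4", "Period 6"]},
    }

    courses = list(he_courses)                                        # step 1: list the courses
    selected_courses = random.sample(courses, min(2, len(courses)))   # step 2: up to 2 courses
    for course in selected_courses:
        teachers = list(he_courses[course])                           # step 3: teachers for the course
        teacher = random.choice(teachers)                             # step 4: one teacher at random
        section = random.choice(he_courses[course][teacher])          # step 5: one section at random
        # Record the counts used later to weight the section up to the course and school levels.
        print(course, teacher, section,
              {"courses": len(courses), "teachers": len(teachers),
               "sections": len(he_courses[course][teacher])})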


  b) Elementary Schools


A similar sequence of steps will be taken to select grades for PE and HE (separately), and identify reporting units, within each sampled elementary school. Again, we describe the steps for HE; the steps for PE are similar.


  1. Identify all eligible grades at which health instruction is required.

  2. Randomly select two of these grades (unless the school contains only 1).

  3. List all teachers providing instruction at each of these grades.

  4. Randomly select one teacher for each of these grades.

  5. If health instruction for a selected teacher’s class is provided by a specialist, interview the specialist about instruction for that particular class.


B.1.g Estimation and Justification of Sample Size

All 50 states and the District of Columbia will be included in SHPPS 2012 and we expect a 100 percent response rate at the state level. Therefore, the state responding sample size will be 51. Table B.1.D presents the sample sizes and expected number of respondents, by strata, for the district, school, and course samples.

Table B.1.D Planned Sample Sizes for the Various Stages

Sampling Units    Number of Selected Units    Projected Eligible Units    Participation Rate    Respondent Sample Size
Districts                 1,006                        886                       75%                       665
Schools                   1,452                      1,409                       74%                     1,043
Courses                     ---                        ---                       ---                     2,002 Health Education
                                                                                                          2,002 Physical Education


The number of participating courses for each subject (HE and PE), 2,002, was obtained by multiplying the number of courses per school (2) by the number of participating schools (1,043) and then applying the expected course participation rate of 96% to the product: 2 × 1,043 × 0.96 ≈ 2,002.


District sampling, phase I: An initial sample of 1,006 districts will be selected, and approximately 665 districts are expected to participate among those found eligible for the study, for a participation rate of 75 percent. The participation rate is computed from the number of eligible districts expected in the sample. Based on the 2006 SHPPS experience, twelve percent of districts are expected to be ineligible.


This sample will also provide estimates for the following subgroups of districts with precision requirements that are not as tight as the above:

  • Non-urban districts

  • Urban districts

  • Districts with low percentage of children under poverty

  • Districts with medium or high percentage of children under poverty.


District sampling, phase II: For the school sampling, a stratified sub-sample of 566 districts will be randomly selected from the overall district sample, distributed across the strata as shown in Table B.1.E; the school sample will be drawn from these districts. Non-public schools will be linked geographically to districts and selected from the same sample districts.


Table B.1.E Sample Sizes for the District Survey

District Strata            Districts in    Initial Sample    Expected Eligible          Sub-sample of Districts
                           Population      Size              Responding Districts1      for School Sample (Initial Sample)
Urban, low poverty            3,447            252                 167                            142
Urban, high poverty           2,625            251                 166                            141
Not urban, low poverty        2,952            252                 166                            142
Not urban, high poverty       3,760            251                 166                            141
Total                        12,784          1,006                 665                            566

1 Assumes a participation rate of 75% among an estimated 886 eligible districts.


Table B.1.F presents the planned sample allocation to school strata (second-stage strata), i.e., the number of schools to be selected within each of these strata.



Table B.1.F Sample Sizes for the School Survey

School Strata                        School Sample Allocation
Elementary Schools                            484
  Small                                       242
    Public                                    143
    Private/Catholic                           99
  Large                                       242
    Public                                    236
    Private/Catholic                            6
Middle Schools                                484
  Small                                       242
    Public                                    107
    Private/Catholic                          135
  Large                                       242
    Public                                    229
    Private/Catholic                           13
High Schools                                  484
  Small                                       242
    Public                                    153
    Private/Catholic                           89
  Large                                       242
    Public                                    224
    Private/Catholic                           18



The planned sampling design will achieve the levels of precision targeted for districts, schools and courses/classes with standard errors of 2.5% or less for all estimated percentages, corresponding to confidence intervals within +/- 5 percentage points (at the 95% confidence level). These levels are relaxed to +/- 6 percentage points for subgroup estimates at all levels.


To develop the sample sizes required to achieve these precision levels, we started with the district sample sizes, which tend to drive the sample sizes overall. The sample sizes at the other levels are derived in part from the minimal district sample size, because we plan to keep the same average numbers of sampled schools per district and courses/classes per school adopted in previous cycles.


Empirical Results: SHPPS 2006 Data as Guidance (Upper Bound on Variances)


Estimation precision for SHPPS 2012 is expected to be similar to that achieved in the 2006 SHPPS, and we use the SHPPS 2006 levels of precision to confirm this. The empirical approach uses the design effects (DEFFs) achieved for a range of key estimates computed with the 2006 SHPPS data. Table B.1.G presents the estimated standard errors and design effects attained for selected key estimates.



Table B.1.G Standard Errors and Design Effects (DEFFs) for Key Estimates Computed in the 2006 SHPPS

a) School-Level Estimates

                                                            Overall    Elementary    Middle     High
                                                                       Schools       Schools    Schools
Schools with Tobacco Free Policies
  Percentage                                                 63.6%       65.4%        58.7%      66.1%
  Standard Error                                              2.4%        3.4%         3.1%       3.0%
  Design Effect                                               2.4         1.7          1.3        1.3
Schools with Required PE
  Percentage                                                 78.4%       69.3%        83.9%      95.1%
  Standard Error                                              1.8%        3.1%         2.4%       1.7%
  Design Effect                                               1.9         1.6          1.3        1.9
Schools with Required PE in Each of Their Grades
  Percentage                                                 26.4%       34.8%        20.5%      12.3%
  Standard Error                                              2.0%        3.1%         2.5%       2.3%
  Design Effect                                               2.0         1.5          1.3        1.6
Schools with Required HIV Prevention Instruction
  Percentage                                                 59.5%       39.1%        74.5%      88.4%
  Standard Error                                              2.2%        3.7%         3.1%       2.1%
  Design Effect                                               1.7         1.6          1.5        1.3
Schools with Required Nutrition and Dietary Behavior Instruction
  Percentage                                                 84.2%       84.6%        82.3%      86.3%
  Standard Error                                              1.6%        2.4%         2.7%       2.3%
  Design Effect                                               1.8         1.3          1.5        1.4
Schools with Required Alcohol or Other Drugs Prevention Instruction
  Percentage                                                 81.8%       76.5%        84.6%      91.8%
  Standard Error                                              1.8%        2.7%         2.5%       1.8%
  Design Effect                                               2.0         1.2          1.4        1.4
Schools That Had a School Nurse
  Percentage                                                 86.3%       87.0%        86.5%      84.3%
  Standard Error                                              1.8%        2.4%         2.7%       2.5%
  Design Effect                                               2.5         1.7          1.8        1.4

b) District-Level Estimates

Estimate                                        Percent    Standard Error    Design Effect
Districts with Tobacco Free Policies             55.4%          3.7%              2.2
Districts with Required HIV Instruction          95.6%          1.3%              1.5

                                                            Elementary    Middle     High
                                                            Schools       Schools    Schools
Districts with Required Nutrition and Dietary Behavior Instruction
  Percentage                                                  77.4%        85.1%      87.9%
  Standard Error                                               2.8%         2.4%       2.3%
  Design Effect                                                1.7          1.7        1.6
Districts with Required Alcohol or Other Drugs Prevention Instruction
  Percentage                                                  79.0%        89.7%      89.3%
  Standard Error                                               2.7%         2.1%       2.3%
  Design Effect                                                1.7          1.6        1.7


Sample Sizes for Required Precision Levels


This section presents the sample sizes developed to achieve the levels of precision targeted for school districts, schools, and courses/classes reported by teachers in each category, Health Education and Physical Education. The precision is discussed in terms of standard errors, or equivalently, confidence intervals, for different design effect (DEFF) scenarios.


1) District Sample


The DEFF is expected to be between 1.5 and 1.8 for the district sample because this equal-probability sample will have minimal effects of clustering or unequal weighting. As shown in Table B.1.G, district-level estimates in the 2006 SHPPS achieved low design effects (roughly 1.5 to 2.2). The derived sample sizes (see Table B.1.H) are premised on three empirically based DEFF scenarios: DEFF=1.5, 1.6, and 1.7. Table B.1.H shows that for the most conservative scenario, DEFF=1.7, approximately 680 completed district surveys would be necessary to generate estimates with at most a 2.5% standard error.
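The sample sizes in Table B.1.H are consistent with the standard formula n = DEFF x p(1 - p) / SE^2 evaluated at the conservative p = 0.5; the minimal sketch below reproduces the table and also shows the correspondence between a 2.5% standard error and a 95% confidence interval of roughly +/- 5 percentage points:

    # Minimal sketch reproducing Table B.1.H from n = DEFF * p * (1 - p) / SE^2 at p = 0.5.
    def required_n(deff, se, p=0.5):
        """Sample size needed for a proportion p with standard error se and design effect deff."""
        return deff * p * (1 - p) / se ** 2

    for deff in (1.5, 1.6, 1.7):
        print(deff, [round(required_n(deff, se), 1) for se in (0.020, 0.025)])
    # Approximately: 1.5 -> 937.5 and 600; 1.6 -> 1000 and 640; 1.7 -> 1062.5 and 680,
    # which (after rounding) match the 937/600, 1000/640, and 1062/680 entries in Table B.1.H.

    print(round(1.96 * 2.5, 2))   # 4.9: a 2.5% SE gives a 95% CI half-width of about 5 points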


Table B.1.H School District Sample Sizes Needed to Achieve Target Levels of Precision for Various Design Effect Scenarios

Design Effect (DEFF)    2.0% Standard Error    2.5% Standard Error
1.5                            937                     600
1.6                          1,000                     640
1.7                          1,062                     680


2) School Sample and Course Sample


For the school sample, we expect larger design effects, DEFF between 2.0 and 2.5, as this sample will exhibit some effects of clustering and unequal weighting overall (albeit not within strata). Table B.1.G provides empirical evidence that the two-stage sampling design used to select schools in the 2006 SHPPS generated DEFFs of 2.5 or lower.


Table B.1.I shows the school sample sizes needed to achieve the target precision levels. To achieve standard errors of 2.5% or less, a sample of at least 1,000 schools would be needed for DEFF=2.5 (the most conservative scenario).


Table B.1.I also shows that subgroup samples of at least n=361 schools will be necessary to achieve standard errors of 3.0%, assuming DEFFs near 1.3. These sample sizes may be conservative for some subgroups. Within second-stage strata (e.g., those defined by school level), we expect DEFFs to be lower than 1.3 based on 2006 data. Thus, the expected precision of estimates based on elementary schools, middle schools, and high schools will be comparable to that of simple random samples of the same size (approximately n=267 for each subgroup).


In addition, Table B.1.I shows that for the course sample to achieve standard errors of 2.0% or less, the sample size needs to be 1,250 for DEFF=2.0 and 1,562 for DEFF=2.5.



Table B.1.I School Sample Sizes Needed to Achieve Target Levels of Precision for Various Design Effect Scenarios

Design Effect    2.0% Standard Error    2.5% Standard Error    3.0% Standard Error
1.2                     750                    480                    333
1.3                     812                    520                    361
1.4                     875                    560                    389
1.5                     937                    600                    417
2.0                   1,250                    800                    555
2.5                   1,562                  1,000                    694


B.1.h Weighting and Estimation Procedures


The base weight for each sampled entity will be equal to the inverse of its probability of selection (conditional at each stage of sampling). Prior to data analysis, sampling statisticians will prepare sampling weights adjusted for non-response within strata. Final survey weights will reflect the probability of selection and non-response adjustments; these weights will be appropriate for national estimates and estimates within strata.


The estimation process will use statistical software developed for analyses of survey data arising from complex sampling designs (e.g., SUDAAN). These estimation procedures will appropriately account for the effects of nonresponse, unequal probability sampling, stratification, and clustering. Examples of tables that will be completed through analysis of the data are in Appendix F.
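The weighting logic can be sketched as follows with a few hypothetical records; production estimation would be carried out in survey software such as SUDAAN:

    # Minimal sketch of base weights and within-stratum nonresponse adjustment.
    # Records are hypothetical: (stratum, selection probability, responded?).
    from collections import defaultdict

    sample = [
        ("urban-low", 0.10, True), ("urban-low", 0.10, False), ("urban-low", 0.10, True),
        ("rural-high", 0.05, True), ("rural-high", 0.05, True),
    ]

    # Base weight = inverse of the (conditional) probability of selection.
    records = [{"stratum": s, "base_w": 1.0 / p, "resp": r} for s, p, r in sample]

    # Nonresponse adjustment: within each stratum, inflate respondents' weights by
    # (sum of all base weights) / (sum of responding units' base weights).
    totals, resp_totals = defaultdict(float), defaultdict(float)
    for rec in records:
        totals[rec["stratum"]] += rec["base_w"]
        if rec["resp"]:
            resp_totals[rec["stratum"]] += rec["base_w"]

    for rec in records:
        if rec["resp"]:
            final_w = rec["base_w"] * totals[rec["stratum"]] / resp_totals[rec["stratum"]]
            print(rec["stratum"], round(final_w, 1))   # urban-low: 15.0 each; rural-high: 20.0 each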



B.2 PROCEDURES FOR COLLECTION OF INFORMATION


B.2.a Use of Less Frequent Data Collection to Reduce Burden


This is a one-time collection of information.


B.2.b Survey Questionnaires


The study involves the use of 22 questionnaires designed to measure policies and practices at the state, district, school and classroom levels related to the following seven components of school health programs: health education, physical education and activity, health services, nutrition services, healthy and safe school environment, mental health and social services, and faculty and staff health promotion. The state and district questionnaires are designed for self-administered web-based administration and the school and classroom questionnaires are designed for computer-assisted personal interviewing (CAPI).


The study also involves three recruitment data collections, one each for recruiting states, districts, and schools. These recruitment scripts involve working with state, district, and school contacts to identify appropriate respondents (at all levels), schedule in-person interviews (school and classroom levels), and randomly select class sections for inclusion in the health education and physical education components (classroom level).


Table B.2.A illustrates the distribution of the 25 data collection instruments across components and levels of jurisdiction. The complete set of questionnaires can be found in Appendix G. The state recruitment script can be found in Appendix H-3. The district recruitment script can be found in Appendix H-4. The school recruitment script can be found in Appendix I-2.


In preparation for SHPPS 2012, CDC and the contractor conducted extensive reviews of the SHPPS 2006 questionnaires. Questions were deleted when the 2006 data showed that a question had low yield and the resulting data were not useful to CDC. Minor modifications, such as changes to question wording, have been made to the SHPPS 2006 questionnaires to improve clarity. Question wording was also revised because of a change in the mode of administration: state- and district-level data collection in 2006 was conducted via computer-assisted telephone interviewing, whereas in 2012 this data collection will be self-administered via the Internet. In an effort to reduce redundancy in data collection efforts within CDC, state-level questionnaires have been revised so that they no longer collect data on state policies related to school health programs. Also, the state-level questions dealing with faculty and staff health promotion have been incorporated into the healthy and safe school environment questionnaire, thus reducing the number of state-level questionnaires. A new component of the SHPPS 2012 study is the inclusion of vending machine observations. This new element will yield the only nationally representative dataset of snack and beverage offerings available to students through school vending machines.

Table B.2.A Distribution of SHPPS Data Collection Instruments across Components and Respondent Levels

Component                                State    District    School    Classroom    Total
Health Education                           1         1           1          1          4
Physical Education and Activity            1         1           1          1          4
Health Services                            1         1           1         --          3
Nutrition Services                         1         1           1         --          3
Healthy and Safe School Environment        1         1           1         --          3
Mental Health and Social Services          1         1           1         --          3
Faculty and Staff Health Promotion        --         1           1         --          2
Recruitment Scripts                        1         1           1         --          3
Number of Questionnaires                   6         7           7          2         22
Total Number of Instruments                7         8           8          2         25


B.2.c Obtaining Access to and Support From State Education Agencies (SEAs), School Districts, and Schools


All initial letters of invitation will be on CDC letterhead from the Department of Health and Human Services and signed by Howell Wechsler, Ed.D., M.P.H., Director, DASH, NCCDPHP, CDC. The procedures for gaining access to and support from states, districts, and schools will have three major steps:


  • First, support and clearance will be sought from SEAs. The initial request will be accompanied by a study fact sheet and a list of all sampled districts and schools in the SEA’s jurisdiction. Following an initial mailing of the request packet, telephone contact with the Chief State School Officer or designee will be made to elicit support and identify state-level respondents. States will be asked to provide general guidance on working with the selected school districts and schools and to notify school districts that they may anticipate being contacted about the survey.


  • Once cleared at the state level, an invitation packet will be sent to sampled school districts in the state. It should be noted, however, that only a subsample of districts will contain sampled schools. Therefore, some districts will be approached solely as prospective respondents; other districts will be approached both as respondents and as a means of gaining access to sampled schools. Those districts that contain sampled schools will receive a list of these schools in the invitation packet and will be asked to provide general guidance on working with the selected schools and to notify schools that they may anticipate being contacted about the survey. Telephone contact will be made with the office comparable to the district office (e.g., diocesan office of education), if there is one.


  • Once cleared at the school district level, selected schools will be invited to participate. Information previously obtained about the school will be verified. The burden and benefits of participation in the survey will be presented. After a school agrees to participate, a tailor-made plan for collection of data in the school will be developed (e.g., identify respondents, determine the best and worst weeks during the spring semester for data collection, gather schedules for respondents, etc.). Contact with schools will be maintained until all data collection activities have been completed.


Prior experience suggests the process of working with each state education agency, school district, and school will have unique features. Discussions with each education agency will recognize the organizational constraints and prevailing practices of the agency. Letters to states and districts, scripts for use in guiding discussions with states and districts, and state and district questionnaire content outlines are found in Appendix H. Letters to schools, scripts for guiding discussions with school officials, and school and classroom questionnaire content outlines are contained in Appendix I. The study fact sheet is contained in Appendix J.



B.2.d Data Collection Procedures


Data collection will begin in October 2011 at the state and district levels, pending the completion of appropriate clearance processes. Data collection at these levels will be via web-based questionnaire technology. School- and classroom-level data collection will begin in February 2012, pending the completion of appropriate clearance processes. Data collection at these levels will be conducted in person using computer-assisted personal interview technology (CAPI).


State and district collection. State and district contacts will receive content outlines (Appendix H) in an initial mailing during the recruitment phase to assist them with the identification of the most knowledgeable respondents for each of the six questionnaire content areas. Telephone follow-up will occur two to three days following the mailing to address any questions the contact may have and, if it is convenient for the contact, elicit the names of the most knowledgeable respondents for each questionnaire content area. Due to the breadth of topics that fall under some of the content areas (e.g., Healthy and Safe School Environment), more than one respondent may be needed to complete a questionnaire. For content areas for which we anticipate this to be the case, contacts will be provided the opportunity to designate the most knowledgeable respondent for each of the questionnaire’s “modules.” Questionnaire modules are comprised of topics that are similar in content and could likely be addressed by one person with expertise on those topics. Procedures for identifying the most knowledgeable respondents for each questionnaire content area are described below.


Through the state contact, personnel most knowledgeable about each of these components of school health programs will be identified as follows:

  • Health Education. The state contact will be asked to identify the state health education coordinator, who can address questions about school health education standards; state assistance to districts and schools; instructional content by school level; certifications, licensure or endorsements offered by the state; professional development; and collaboration with outside organizations.


  • Physical Education and Activity. The state contact will be asked to identify the state physical education coordinator, who can provide overall information about physical education standards; state assistance to districts and schools; instructional content by school level; fitness testing; certifications, licensure, or endorsements offered by the state; professional development; collaboration efforts; and interscholastic sports.


  • Health Services. The state contact will be asked to identify the state health services coordinator, who can provide information about state assistance to districts and schools; funding for health services; collaboration efforts; reporting requirements; professional development; and school-based health centers.


  • Nutrition Services. The state contact will be asked to identify the state child nutrition or nutrition services director, who can provide overall information about state assistance to districts and schools, certifications and professional development, collaboration efforts, and program evaluation.


  • Healthy and Safe School Environment. The state contact will be asked to identify the state school health coordinator who can address questions related to state assistance to districts and schools; professional development; faculty and staff health promotion; reporting of school violence; crisis preparedness, response, and recovery; and school health coordination.


  • Mental Health and Social Services. The state contact will be asked to identify the state mental health and social services coordinator, who can provide information about state assistance to districts and schools; schools serving as Medicaid providers; collaboration efforts; program evaluation; and professional development.


Through the district contact, personnel most knowledgeable about each of these components of school health programs will be identified as follows:


  • Health Education. The district contact will be asked to identify the district health education coordinator, who can address questions about school health education standards, instructional content by school level, staffing and professional development, collaboration efforts, and program promotion and evaluation.


  • Physical Education. The district contact will be asked to identify the district physical education coordinator, who can provide information about physical education standards, instructional content by school level, physical education for students with disabilities, use of protective gear, assessment, physical activity and discipline, staffing and professional development, program promotion and evaluation, and interscholastic sports.


  • Health Services. The district contact will be asked to identify the district health services coordinator, who can provide information about student health records; required immunizations; screening and testing; administering student medications; funding for standard health services; collaboration efforts; provision of health services; staffing characteristics; and school-based health centers.


  • Nutrition Services. The district contact will be asked to identify the district school food authority director or district food service director, who can provide overall information about menu planning and food ordering; food preparation; collaboration, promotion, and evaluation; professional development; and food service and child nutrition requirements and recommendations.


  • Healthy and Safe School Environment. The district contact will be asked to identify the district school health coordinator who can address questions related to the district’s policies on the prevention of violence, tobacco use, and injuries; crisis preparedness, response, and recovery; foods and beverages available outside of the school meals program; and transportation to and from school. The contact also will be asked to identify the individual responsible for the oversight of issues related to physical school environment and health hazards for the district.


  • Mental Health and Social Services. The district contact will be asked to identify the district mental health and social services coordinator, who can provide information about provision of services; collaboration, promotion, and evaluation; staffing characteristics; and professional development.


  • Faculty and Staff Health Promotion. The district contact will be asked to identify the individual most knowledgeable about the district’s health insurance, required examinations and screenings, health promotion activities and services, employee assistance programs, and planning and coordination.


At both the state and district levels, respondents will be mailed an information packet prior to data collection. This packet will contain a fact sheet, a content outline for each questionnaire for which the respondent has been identified as most knowledgeable, instructions on how to access the study website, and a unique study identifier that will allow the respondent to log in and complete the questionnaire(s) to which he or she has been assigned.


Data collection process. Once the most knowledgeable respondents for each of the questionnaire content areas have been identified, their names and contact information will be stored in an online case management system (CMS). During this process, a unique study identifier is generated and linked to the questionnaire(s) for which the respondent has been identified. Each respondent is then assigned a randomly generated access code for the web-based data collection system, which is linked to the respondent’s unique identifier in the CMS. Once respondents have received their informational packet, they may access the website from any Internet-connected computer using their assigned access code and begin completing the questionnaire(s) they were assigned.


Each time a respondent advances to a new screen of questions, the data are saved to the central repository. This allows respondents to break off a questionnaire and return to it at a later time without data loss. Because the data will already be keyed into the web-based system, separate data entry will not be necessary. Also, because the computer-assisted methodology will prevent respondents from skipping questions in error, the need for any follow-up contact with state or district respondents will be minimal.
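A minimal, purely illustrative sketch of this save-per-screen and resume behavior (not the actual SHPPS web system or its data model):

    # Minimal illustrative sketch of per-screen saving and break-off resumption.
    responses = {}   # central repository: access code -> {question id: answer}

    def save_screen(access_code, screen_answers):
        """Persist one screen's answers; called each time a respondent advances a screen."""
        responses.setdefault(access_code, {}).update(screen_answers)

    def resume_point(access_code, questionnaire):
        """Return the first unanswered question so a break-off can be resumed without data loss."""
        answered = responses.get(access_code, {})
        return next((q for q in questionnaire if q not in answered), None)

    questionnaire = ["Q1", "Q2", "Q3", "Q4"]
    save_screen("A1B2-C3", {"Q1": "Yes", "Q2": "No"})   # respondent breaks off after one screen
    print(resume_point("A1B2-C3", questionnaire))       # -> Q3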


School collection. Once a school has agreed to participate in the study, a project staff member will contact the principal or school administrator to identify respondents and schedule data collection activities. Respondent names and interview schedules will be stored in the online case management system; the schedule will later be verified and confirmed by the field interviewer who is assigned to that school. At each school we will complete each of the seven school-level questionnaires with the respondents most knowledgeable about the specific component within that school. In addition, we will interview a sample of teachers of health education and physical education courses. Procedures for identifying course respondents are described in section B.1.f, as well as below. Procedures for identifying primary respondents for the seven school-level questionnaires are described below.


Through the office of the school administrator of each sampled school, the school staff member primarily responsible for delivering and/or coordinating each of these components of school health programs will be identified as follows:


  • Health Education. The school administrator will be asked to identify the lead health educator (sometimes a department chair) who can provide overall information about the organization of the school’s health education program. Note that, while these procedures will apply to middle and high schools, they may have to be modified somewhat for use in elementary schools (both for Health Education and Physical Education) because there may not be a lead teacher for these subjects in elementary schools.


  • Physical Education. The school administrator will be asked to identify the lead physical educator who can provide overall information about the organization of the school’s physical education and activity program. The school administrator also will be asked to identify the individual most knowledgeable about the interscholastic sports program at the school.


  • Health Services. The respondent universe includes personnel responsible for a variety of health services activities at the school including student health records, immunization requirements, screenings, administering student medications, and other health services. The school administrator will be asked to identify the individual(s) who is most knowledgeable about the health services provided within or by the school. Respondents will include physicians, nurses, health aides, and other designated school staff.


  • Nutrition Services. The school administrator will be asked to identify the person primarily responsible for managing the planning, preparation, and provision of school nutrition services, usually the school food service manager.


  • Healthy and Safe School Environment. The school administrator of each school will be the respondent on questions related to the school environment and the school’s health policies and practices, including those related to prevention of violence, tobacco use, alcohol and illegal drug use. The school administrator will be offered the option of designating an assistant school administrator or someone else as the more appropriate respondent. The school administrator also will be asked to identify the person most knowledgeable about issues related to the physical school environment and health hazards.


  • Mental Health and Social Services. The school administrator will be asked to identify the individual(s) who is most knowledgeable about the mental health and social services provided by the school. Respondents will include guidance counselors, social workers, nurses, school administrators, and assistant principals.


  • Faculty and Staff Health Promotion. The school administrator will be asked to identify the person who is most knowledgeable about the health promotion services and activities provided by the school for faculty and staff. Respondents will include nurses, teachers, members of a school wellness council, guidance counselors, principals, and assistant principals.


Courses/Classes


In each middle or high school, up to two classroom teachers will be interviewed for required courses with health education and physical education content. Courses will be randomly selected from all required health education and physical education courses offered at a school. In elementary schools, we will interview both regular classroom teachers and specialists, if any, who teach health and/or physical education content. Up to two elementary classroom teachers and/or specialists will be randomly selected among those grades where instruction on health or physical education is required. See section B.1.f for details.


Observation component


In each school that has vending machines accessible to students during the school day, up to five vending machines (both snack and beverage) will be randomly selected for observation. For schools that report five or fewer vending machines that students can access, each vending machine will be observed. Observations entail the use of digital photography to capture objective information about the snack and beverage options available to students in vending machines.


Data collection process. After a school visit has been scheduled by a member of the central study staff, a confirmation letter will be sent approximately one to two weeks before the visit, followed by a telephone call from the field interviewer responsible for the school. When a school agrees to participate, a customized plan for the data collection at the school will be developed in consultation with the school administrator. Every effort will be made to minimize disruption of the school schedule by working around school and classroom commitments. The school- and classroom-level interviews will be conducted by specially trained interviewers. An average of two days will be spent collecting data at each school. Data will be collected using computer-assisted personal interviewing (CAPI) technology.


A group of approximately 70 data collectors will be employed to conduct the school and classroom interviews. Before they are sent to the field, they will undergo an intensive training program. The training will cover the purposes of the study, use of the computer and digital camera, standard interviewing procedures, confidentiality requirements, and handling problematic situations (e.g., cancellations, reluctant respondents). Training will include both group instruction and paired mock interviews in which interviewers practice interviewing one another using a prepared script.


For the observation component, data collectors will take five photographs per vending machine. The first photo will be of an “identity card” that provides the school ID, state, and vending machine ID number. The following four photos will be of each quadrant of the machine, starting with the upper left quadrant, then upper right, then lower left, and finally lower right. This process will be repeated for each vending machine.


Digital photographs will be transferred from the interviewers’ cameras directly to the interviewers’ netbook computers daily. The same software used to conduct the face-to-face interviews will be used to manage vending machine photographs and associate them with the correct school, thus reducing the potential for error.
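
The sketch below illustrates one way the fixed photo sequence and the school/machine tagging described above could be represented; the identifier formats, file-naming scheme, and class structure are hypothetical assumptions and are not drawn from the contractor’s software.

```python
from dataclasses import dataclass

# Fixed capture order for each machine: identity card first, then the four
# quadrants in the order described above.
PHOTO_SEQUENCE = ["identity_card", "upper_left", "upper_right",
                  "lower_left", "lower_right"]

@dataclass
class MachinePhotoSet:
    school_id: str   # hypothetical identifier format
    state: str
    machine_id: str

    def expected_photos(self):
        """Return the five expected photo labels for this machine, in capture order."""
        return [f"{self.school_id}_{self.machine_id}_{i}_{label}.jpg"
                for i, label in enumerate(PHOTO_SEQUENCE, start=1)]

    def is_complete(self, received):
        """Check that every expected photo for this machine was received."""
        return set(self.expected_photos()).issubset(received)

# Example with hypothetical identifiers.
machine = MachinePhotoSet(school_id="SCH0042", state="GA", machine_id="VM03")
print(machine.expected_photos())
print(machine.is_complete(set()))  # False until all five photos arrive
```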


Interviewers will transmit their completed interview data and digital photographs electronically daily. Because the interview data will have already been keyed into the interviewers’ netbook computers, data entry will not be necessary. Also, since the computer-assisted methodology will prevent interviewers from skipping questions in error, the need for any follow-up contact with school or classroom respondents will be minimal.


B.2.e Quality Control


The task of collecting quality data begins with a clear and explicit study protocol and ends with procedures for the verification of collected data. In between these activities, and subsequent to data collector training, measures will be taken to reinforce training, to assist field staff who encounter problems, and to check on data collection techniques. Table B.2.B lists the major means of quality control; an illustrative sketch of how a transmission review sample might be drawn follows the table.


Table B.2.B Major Means of Quality Control


Survey Step: Survey Programming

  • Conduct internal programming review of web-based and CAPI questionnaires to ensure accuracy of the questionnaires (100%)

Survey Step: Pre-mail Contact with States, Districts, and Schools

  • Discuss the goals and content of the study and of the specific questionnaires with contacts at the state, district, and school levels to ensure that the most appropriate respondents are identified

Survey Step: Mail-Out for States, Districts, and Schools

  • Check inner vs. outer label for correspondence (5% sample)

  • Verify that any errors in packaging were not systematic (100%)

Survey Step: Telephone Follow-up Contacts

  • Monitor an early sample of calls to ensure that the recruiter follows procedures, elicits proper information, and has proper demeanor (10%)

Survey Step: Identification of Most Knowledgeable Respondents

  • Explain the goals and content of the questionnaires to contacts at the state, district, and school levels to ensure that the most appropriate sample members are identified

Survey Step: Receipt Control of State and District Survey Data

  • Examine data submitted from the first 10 states and first 10 districts to ensure data integrity

  • Review a sample of submitted data from throughout data collection to ensure data integrity (10% of transmissions)

Survey Step: Interviewer Training and Supervision for School Interviews

  • Conduct at least weekly telephone monitoring of all field staff throughout data collection (100% of field staff)

  • Reinforce training and clarify procedures through periodic field newsletters (100% of field staff)

  • Verify by telephone with a 10% sample of early schools that all data collection procedures are being followed

Survey Step: Netbook Computer Verification

  • Prior to each data collection, conduct netbook verification procedures to ensure that the netbook boots, the questionnaires are loaded, and the interview loads after each start-up (100%)

  • Reload questionnaires on netbooks and ensure that any problems were not systematic (100%)

Survey Step: Digital Camera Verification

  • Conduct internal testing of digital camera functionality, including operation of the camera, the sync process with the netbook, tagging of photos with the appropriate school, and acceptance into the central repository (100%)

  • Ensure that data collectors demonstrate appropriate and correct use of the digital camera as a requirement of data collector training (100%)

  • Prior to each data collection, conduct digital camera verification procedures to ensure that the camera has adequate battery life, that camera mode is operational, and that there is sufficient memory (100%)

Survey Step: Receipt Control of School Interview and Vending Machine Data

  • Examine the first 10 transmissions from each interviewer to ensure data integrity and quality of photographs

  • Review a sample of transmissions from each interviewer throughout data collection to ensure data integrity (5% of transmissions)
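
As an illustration of the review-sample steps in Table B.2.B (e.g., reviewing 5% or 10% of transmissions), the following sketch flags each incoming transmission for review with the stated probability. The transmission IDs, function name, and Bernoulli-style sampling approach are illustrative assumptions, not the contractor’s actual procedure.

```python
import random

def flag_for_review(transmission_ids, review_rate=0.10, seed=None):
    """Independently flag each incoming transmission for QC review with
    probability `review_rate` (e.g., 0.10 for a 10% review sample)."""
    rng = random.Random(seed)
    return [tid for tid in transmission_ids if rng.random() < review_rate]

# Example: hypothetical IDs for one week of interviewer transmissions.
transmissions = [f"TX{n:04d}" for n in range(1, 201)]
print(flag_for_review(transmissions, review_rate=0.10, seed=7))
```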


B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE


B.3.a Expected Response Rates


This study anticipates a participation rate of 100 percent at the state level. Each previous cycle of SHPPS has achieved 100% response rates at the state level, and it is anticipated that the use of web-based technology will make this level of response easily attainable. At the district level, SHPPS 2006 achieved an overall response rate of 74.5%. We have conservatively assumed a minimum response rate of 75% for each questionnaire component; however, a higher overall response rate (78%) is anticipated for this cycle due to the use of web-based technology and the accelerated launch of district data collection in Fall 2011. The overall school-level participation rate in 2006 was 77.9%. We have conservatively assumed a minimum response rate of 74% for each questionnaire component; however, a higher overall school response rate (80%) is anticipated for SHPPS 2012 due to improvements in non-public school recruitment procedures. Specifically, prior to sending an invitation letter, recruiters with expertise in gaining cooperation from non-public schools will contact these schools to provide information about the study. These recruiters have received specialized training in anticipating the types of concerns these schools may have about participating in research studies and how to address these concerns. In addition, support for the study will be sought from national associations of Christian and Catholic schools. Classroom-level response rates rose from 90% for both questionnaire content areas in 2000 to 95% in 2006. We assume a similar classroom response rate (96%) for 2012.
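
As a simple arithmetic illustration only (not a description of the study’s estimation or weighting methodology), one common convention approximates an overall rate across nested stages as the product of the stage-level rates. The sketch below applies that convention to the anticipated school and classroom rates cited above; the count of sampled schools is hypothetical.

```python
# Anticipated stage-level response rates cited in section B.3.a.
school_rate = 0.80      # anticipated school-level rate for SHPPS 2012
classroom_rate = 0.96   # anticipated classroom-level rate for SHPPS 2012

# Under the (assumed) multiplicative convention, the overall rate for
# classroom-level data collected within participating schools is:
overall_classroom_rate = school_rate * classroom_rate
print(f"Overall classroom-level rate: {overall_classroom_rate:.1%}")  # 76.8%

# Expected completed classroom interviews for a hypothetical sample of
# 800 schools with up to two selected classes per school:
sampled_schools = 800          # hypothetical
classes_per_school = 2
expected_completes = sampled_schools * classes_per_school * overall_classroom_rate
print(f"Expected completed classroom interviews: {expected_completes:.0f}")
```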


B.3.b Methods for Maximizing Responses and Handling Nonresponse


Several methods will be used to maximize responses to SHPPS 2012. These methods will emphasize the importance of the study, minimize the burden of participation, and maximize the reward of participation. Specific methods are described below.


Methods to Emphasize the Importance of the Study


  1. Strong support from national and state education and health organizations will be conveyed during the initial recruitment of sample members. Letters of support will emphasize the value of participation.

  2. State education agencies will be asked to write a letter of support for the study that will be used at the district and school levels. Similarly, written district support will be cited during the contacts with school personnel.

  3. CDC sponsorship of the study will be stressed in all communication with sample members. Correspondence with the sample members will be on CDC letterhead and signed by the Director of the Division of Adolescent and School Health.

  4. Project materials will emphasize the importance of the study for improving school health programs for youth. Materials will include fact sheets from SHPPS 2012 data as well as the important national health objectives that the study addresses.

  5. Sample members will be informed that early initial contact is being made to facilitate their participation. Similarly, sample members will be informed that recruiters will make repeated follow-up efforts to encourage participation due to the great importance that the data have to federal, state, and local health and education officials.


Methods to Minimize Response Burden


  1. An iterative process of review by experts and practitioners has ensured the significance of all questions included in the study, and thereby reduced the risk that sample members will spend time answering questions needlessly.

  2. The use of web-based technology will allow respondents to complete the questionnaires at a time and place of their convenience from any internet-connected computer.

  3. Use of CAPI to conduct the interviews will reduce respondent burden by automatically navigating through complex logic and skip patterns (an illustrative sketch of such a skip pattern follows this list).

  4. Questionnaires have been modularized to enable more than one respondent to address different topics covered in one questionnaire. For example, for the Healthy and Safe School Environment questionnaire, one respondent may address general policies, but another respondent may be needed to address questions on the physical school environment. This approach will help to reduce the burden on any one respondent.

  5. Setting a minimum school enrollment of 30 students for inclusion in the sampling frame reduces the number of very small schools in the sample; in such schools, individual staff members wear so many different “hats” that they would be selected to respond to multiple content areas, increasing their burden.
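
As referenced in item 3 above, the following minimal sketch shows how a skip pattern suppresses inapplicable follow-up items automatically; the two items shown are invented for illustration and are not SHPPS questionnaire items.

```python
# Hypothetical two-item skip pattern: the follow-up question is presented
# only when the gate question is answered "Yes", mirroring how CAPI
# suppresses inapplicable items automatically.

def items_to_administer(answers):
    """Return the questions that would actually be presented, honoring the skip rule."""
    presented = ["Q1: Is health education required in any grade? (Yes/No)"]
    if answers.get("Q1") == "Yes":
        presented.append("Q2: In how many grades is it required?")
    return presented

print(items_to_administer({"Q1": "No"}))   # Q2 is skipped automatically
print(items_to_administer({"Q1": "Yes"}))  # Q2 is presented
```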


Methods to Maximize the Reward of Participation


  1. Schools will be directed to educational materials provided by CDC as an incentive for participation. Although these materials are available to the public upon request, schools might not be aware of their availability.

  2. Respondents will be sent copies of study articles and directed to the project internet site, where they can obtain additional information about the study and contact information for study staff.


Handling Nonresponse


A thorough sample validation will occur prior to commencement of recruitment. The main objective of the validation is to confirm that each school still exists and fully meets all of our criteria to be considered eligible for SHPPS. Schools found to be ineligible for SHPPS prior to the start of recruitment will be replaced as described above in section B.1.d. Schools found to be ineligible after the start of recruitment will be replaced to the extent that is feasible.


The secondary purpose of validation is to confirm all of the information we have about a school so that recruiters and data collectors are prepared to work with the school. This includes school size, address, name of the principal, telephone numbers and email addresses, and other information critical to planning communication with the school. A similar validation process is followed for districts.


The best approach to handling nonresponse is to avoid it whenever possible. The study contractor has more than 30 years of successful experience in national school-based, health-related data collections and over 25 years of experience conducting computer-assisted surveys. Every effort will be made to encourage all sample members to participate in the study. Further, when study staff make personal contact with sample members, they will always strive to obtain participation and to avoid refusals. Study staff will remain in contact with respondents who have agreed to participate and will monitor the completion of web-based questionnaires. Follow-up with respondents who have agreed to participate but have not submitted a completed questionnaire will occur via telephone and/or email.


When a staff member encounters a reluctant respondent, the case will be referred to a more senior staff member in an attempt to encourage participation in the study. In addition, study staff will encourage sample members to contact the project director and the study’s federal project officer with questions and concerns that they may have. The project director and project officer will be available by telephone to answer these questions and concerns. These procedures have proved successful in several studies of this nature.


B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN


From November 2010 through January 2011, the contractor conducted a pretest to assess the clarity of new and modified questionnaire items. This pretest was conducted within OMB guidelines with volunteer state, district, and school personnel and classroom teachers. Between four and seven respondents were involved in the pretest of a given questionnaire. Respondents were selected with the purpose of obtaining a diverse group. At the state level, respondents were selected from states in several regions of the country (e.g., Northeast, West, Midwest) and from states with both rural (e.g., Alaska, Montana) and non-rural (e.g., Connecticut, Michigan) areas. At the district level, responding districts were from several regions of the country (e.g., South, Midwest), from both urban and non-urban areas, and served students at various levels of socio-economic status. At the school level, responding schools were both public and private, were from several regions of the country (e.g., East, Midwest), and varied widely in size and in the socio-economic status of the students served.


In an effort to approximate the circumstances under which state- and district-level respondents will participate, pretests took place by telephone in front of an internet-connected computer. School- and classroom-level pretests occurred on-site at schools because CAPI will be used to conduct the full school and classroom surveys. Cognitive interviews were conducted to determine how respondents interpreted new and modified items; to evaluate the adequacy of response options, definitions, and other descriptions provided within the questionnaires; and to assess the appropriateness of specific terms or phrases. As a result of the pretests, respondent burden was reduced and the potential utility of survey results was enhanced through the elimination or clarification of questions. For example, some questions were not understood by most respondents even after the interviewer provided clarification. Such questions were deleted from the questionnaires. For other questions that were poorly understood, if the intent of the question became clear when the interviewer provided definitions or examples, these definitions or examples were then incorporated into the questions to improve clarity. When respondents could not differentiate between two similar questions, such questions were combined into a single question on the questionnaire. Finally, when respondents noted that a response option they might have selected was not available to them, such response options were added to the questionnaire.


Empirical estimates of respondent burden were also obtained through the administration of each of the state, district, school, and classroom questionnaires in its entirety.


B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA

B.5.a Statistical Review

Statistical aspects of the study have been reviewed by:

  • Ronaldo Iachan, Ph.D., Senior Statistician

ICF Macro (Macro International Inc.)

11785 Beltsville Drive

Calverton, Maryland 20705

(301) 572-0538

[email protected]


  • William Robb, Ph.D., Senior Statistician

ICF Macro (Macro International Inc.)

126 College Street
Burlington, VT 05401 USA

(802) 863-9600

[email protected]


B.5.b Agency Responsibility

Within the agency, the following individual will be responsible for receiving and approving contract deliverables and will have primary responsibility for data analysis:

  • Nancy Brener, Ph.D.

Division of Adolescent and School Health

Centers for Disease Control and Prevention

Atlanta, Georgia 30341

770-488-6184

[email protected]


B.5.c Responsibility for Data Collection


The representative of the contractor responsible for conducting the planned data collection is:


  • David Cotton, Director

ICF Macro (Macro International Inc.)

3 Corporate Square NE

Suite 370

Atlanta, GA 30329

404/321-3211

[email protected]








* Table 9 shows that for these subgroup estimates, the overall DEFF is mostly between 1.3 and 1.6 (with a single exception).
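
For reference, the design effect relates a nominal sample size n to the effective sample size under the complex design; an illustrative calculation using a hypothetical nominal subgroup size of 600 and a DEFF of 1.5 (within the range cited above) is:

\[
n_{\mathrm{eff}} \;=\; \frac{n}{\mathrm{DEFF}} \;=\; \frac{600}{1.5} \;=\; 400
\]

That is, a DEFF between 1.3 and 1.6 implies an effective sample size between approximately 63% and 77% of the nominal sample size.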




