School Nutrition and Meal Cost Study (SNMCS) Supporting Statement Part B

School Nutrition and Meal Cost Study

OMB: 0584-0596

School Nutrition and Meal Cost Study (SNMCS)

Contract Number:
AG-3198-C-13-0001



OMB Supporting Statement
Part B



March 7, 2014



Project Officer: John Endahl


Office of Policy Support

Food and Nutrition Service/USDA

3101 Park Center Drive, Room 1004

Alexandria, VA 22302




PART B. STATISTICAL METHODS

B.1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

B.2. Describe the procedures for the collection of information including:

Statistical methodology for stratification and sample selection,

Estimation procedure,

Degree of accuracy needed for the purpose described in the justification,

Unusual problems requiring specialized sampling procedures, and

Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

B.3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

B.4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

B.5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

APPENDIX A. Summary of Data Collection Plan (Table)





PART B. STATISTICAL METHODS

B.1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

We will select nationally representative samples that provide unbiased and precise estimates at each level of analysis (SFAs, schools, students, parents, and meals) for the population and unbiased and moderately precise estimates for subgroups.1 Key subgroups include SFA and school size (enrollment), poverty level, urbanicity, FNS region,2 school type (elementary, middle, high), and school meal participants/nonparticipants. The universe for the SNMCS includes public school SFAs, the schools and students they serve, students’ parents, and meals served in the SFAs.3

Table B.1. SNMCS Respondent Universe

Sample Group | Estimated Size of Respondent Universe
Public SFAs | 15,126
Schools (K-12) | 94,683
Enrolled students (within schools in the National School Lunch Program) | 49,692,894

Source: FNS-742 Verification Summary Report, SY 2011-12.

The proposed final samples will include 502 unique SFAs, 1,200 schools, 2,400 students and their parents, 5,040 lunch plate waste observations, and 3,360 breakfast plate waste observations.4 Target completion rates are detailed in Appendix A. We will collect some data from all SFAs and schools in the sample; other data will be collected from only a large subsample of those units. This approach maximizes statistical precision and data quality while minimizing respondent burden. All sampled SFAs will participate in the SFA Director Survey, providing information on SFA policies and institutional and community characteristics. Principals and foodservice managers (FSMs) at sampled schools will complete surveys providing information on school policies and characteristics and on the characteristics of foodservice operations. In addition, FSMs in all sampled schools will complete a menu survey for the target week that will provide data to assess the nutritional quality of meals offered and served. A large sample of SFAs and schools will provide data for the cost study, and a separate sample of SFAs and schools within those SFAs will have students sampled for the student/parent interviews to assess meal program participation, client satisfaction, and students’ dietary intakes.

Specifically, the sampling approach will first randomly divide a sampling frame of all SFAs into three separate SFA sub-frames.5 SFAs will then be sampled from each sub-frame using methods based on the study objectives particular to that sample. The largest SFAs6 in the SFA frame will be included in two of the three groups but all other SFAs will be sampled in only one of the three groups:

  • Group 1 includes 106 SFAs but no schools. These SFAs will participate in the SFA Director Survey to provide the precision required for estimates of SFA characteristics and policies.

  • Group 2 comprises 100 SFAs and 300 schools. The Group 2 sample will include the 4 largest SFAs and 12 schools sampled from those largest SFAs, plus a sample of 96 other SFAs and 288 of their schools (3 per SFA). Group 2 SFAs and schools will participate in the SFA Director, FSM, and Principal Surveys and FSMs will complete the Basic Menu Survey. We will complete interviews with 2,400 students and their parents from these schools to provide information on meal program participation, client satisfaction, and students’ dietary intakes from school meals and over a full 24-hour period.

  • Group 3 includes 300 SFAs and 900 schools (3 per SFA) that will participate in the SFA Director, FSM, and Principal Surveys. Through the use of additional cost interviews, they will also provide data for the meal cost estimates, including completing the Expanded Menu Survey. The Group 3 sample will include the 4 largest SFAs and 12 of their schools, plus a sample of 296 other SFAs and 888 of their schools. Plate waste will be observed at a subsample of Group 3 schools: we will observe 5,040 NSLP lunches and 3,360 SBP breakfasts from 56 SFAs and 168 schools.

B.2. Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

a. Sample Selection, Estimation and Precision

Selecting the samples requires high-quality sampling frames at each stage. To select the samples of SFAs, we will construct a frame that combines data from the most recently available FNS-742 Verification Summary Report (VSR) list of SFAs, the Common Core of Data (CCD) “Local Education Agency (School District) Universe Survey,” and the Census Bureau’s Small Area Income and Poverty Estimates (SAIPE) file, which contains school district-level estimates of school-age children in poverty. We have recently successfully merged these files for other FNS studies, including SNDA-IV and APEC-II. The VSR and CCD files contain complementary information. If a local education agency (LEA) or district appears on the VSR, we will know it is (or was when the file was compiled) an SFA. The CCD contains more locating information than the VSR, has information that allows the elimination of some types of ineligible districts (such as those serving institutional populations), and will be useful for stratification based on urbanicity, racial composition, and whether or not the SFA serves only charter schools.
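As a rough illustration of this merge step, the sketch below assumes the three source extracts share a common LEA identifier and have been exported to CSV; the file names, column names, and exclusion codes are hypothetical placeholders, not the actual VSR, CCD, or SAIPE layouts.

```python
import pandas as pd

# Hypothetical file and column names; the actual VSR, CCD, and SAIPE layouts
# differ and would need to be mapped to a common LEA identifier first.
vsr = pd.read_csv("fns742_vsr_sy2011_12.csv", dtype={"lea_id": str})
ccd = pd.read_csv("ccd_lea_universe.csv", dtype={"lea_id": str})
saipe = pd.read_csv("saipe_district_poverty.csv", dtype={"lea_id": str})

# Start from the VSR (presence on the VSR indicates the LEA operates an SFA),
# then attach CCD locating/typology fields and SAIPE poverty estimates.
frame = (
    vsr.merge(ccd, on="lea_id", how="left", suffixes=("_vsr", "_ccd"))
       .merge(saipe[["lea_id", "pct_children_in_poverty"]], on="lea_id", how="left")
)

# Drop district types that are out of scope for the study universe, for example
# agencies serving only institutionalized populations (placeholder codes).
frame = frame[~frame["agency_type"].isin(
    ["institutional", "state_operated", "federal_operated"])]
```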

In some cases, multiple sampled SFAs may be managed by the same entity. This may lead to excessive burden for that entity and may limit the ability to disaggregate certain information for the individual SFAs it manages. If such cases arise, we will first determine whether the entity can provide information for all sampled SFAs under its control. If not, we would consider subsampling one SFA from the multiple SFAs originally sampled to reduce respondent burden. Alternatively, if the problem is primarily one of disaggregating information across SFAs, we would use the information provided for all sampled SFAs collectively and impute outcomes for individual SFAs. Once we determine which SFAs in the sample are subject to this kind of problem (if any), we will provide more detailed specifications for how we will handle subsampling and data collection from these SFAs.

The frame for selecting schools within SFAs will be the CCD school-level file.7 It contains enrollment figures, grades served, demographic information, and locating information. In some SFAs, the CCD may not be current due to recent school closures, mergers, or additions, or may have inadequate information for constructing the school sampling frame. We will investigate the extent to which these issues occur by randomly selecting a subsample of SFAs and gathering a list of schools and other information from those SFAs. If a sizeable percentage of these SFAs are not up to date in terms of school lists and/or other information required for constructing the school sampling frame, we will consider options for addressing the issue in more detail. Any schools newly formed since the CCD release that we identify will be given a chance of selection into the sample.

a.1. Sampling SFAs

After the sampling frame for SFAs has been prepared, we will select SFAs in three steps. First, we will identify the overall certainty SFA selections (discussed more below). Second, we will stratify the overall frame of remaining SFAs (less the certainty SFA selections) and use random selection methods to assign SFAs to the three sampling groups (or sub-frames). Third, we will select the three samples of SFAs—one from each of the three groups. We will select the Group 1 sample of SFAs—those that will participate only in the SFA Director Survey—using stratified random selection. For the student/parent survey (Group 2) we will sample SFAs with stratified probability proportionate to size (PPS) selection using enrollment as the measure of size (MOS) and SFA size as a main stratification variable;8 for the meal cost data collection (Group 3) we will sample SFAs with stratified PPS selection using the square root of enrollment as the MOS and SFA size as a main stratification variable.9

a. Selecting the Overall Certainty SFA Sample (Largest SFAs)

We will establish hypothetical sampling rates and determine which SFAs would be selected with certainty for Groups 2 and 3.10 We will then define the largest SFAs as those that would be hypothetical certainty selections for both the student/parent survey (Group 2) and the meal cost estimation (Group 3). These largest SFAs will be sampled with certainty and assigned to Groups 2 and 3. (After the sub-frames for Groups 2 and 3 are formed and SFA selection probabilities are determined for those groups, we may identify additional certainty selections to be made within one group or the other. However, these additional certainty selections will not be part of the “overall” certainty SFAs.)

b. Selecting the Three Groups of SFAs

After the largest SFAs (overall certainty selections) have been allocated to Groups 2 and 3, we will allocate all SFAs that serve charter schools only to Group 1. We will then use random sampling procedures to divide the remaining SFAs (excluding the largest certainty selections and those that serve charter schools only) on the overall frame among Groups 1, 2, and 3. The SFAs randomly assigned to a group will serve as the sampling frame for that group. The first step in dividing the overall frame will be to stratify the remaining SFAs on the overall frame by region, total enrollment, urbanicity, and poverty level.11 We will use explicit stratification (in which separate samples with fixed sample sizes are selected within each stratum) to ensure a minimum sample size for some strata, and implicit stratification (sorting before using a method such as sequential or systematic selection) for other variables.12 We will then use Chromy’s sequential selection (available in SAS PROC SURVEYSELECT) to organize the frame into three subsamples each containing approximately13 one-third of the remaining SFAs. These three subsamples will serve as the sampling frames for selecting the SFA samples for Groups 1 – 3.
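The sketch below illustrates the random division of the remaining SFAs into three sub-frames. It is a simplified stand-in for the Chromy sequential selection that will actually be run (in SAS PROC SURVEYSELECT): within each explicit stratum it sorts on the implicit stratification variables and then makes a systematic one-in-three assignment from a random start. The column names (fns_region, size_class, urbanicity, poverty_pct) are illustrative placeholders, and the special handling of certainty and charter-only SFAs described above is assumed to have happened first.

```python
import numpy as np
import pandas as pd

def assign_subframes(frame: pd.DataFrame, seed: int = 12345) -> pd.DataFrame:
    """Divide the remaining SFAs into three sub-frames (for Groups 1-3).

    Simplified stand-in for Chromy's sequential selection: within each
    explicit stratum, sort by the implicit stratification variables and
    assign SFAs 1-2-3 systematically from a random start, so each sub-frame
    receives roughly one-third of every stratum.
    """
    rng = np.random.default_rng(seed)
    pieces = []
    for _, stratum in frame.groupby(["fns_region", "size_class"], sort=True):
        stratum = stratum.sort_values(["urbanicity", "poverty_pct"]).copy()
        start = int(rng.integers(3))  # random offset for the 1-2-3 rotation
        stratum["subframe"] = [(start + i) % 3 + 1 for i in range(len(stratum))]
        pieces.append(stratum)
    return pd.concat(pieces)
```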

Sampling SFAs for Group 1. We will select the Group 1 sample of SFAs using a stratified random design (that is, selecting with equal probability within strata). As mentioned earlier, this method best serves the purpose of the Group 1 sample, which is to add observations to the SFA Director Survey so that SFA characteristics can be measured precisely. The primary stratification variable for selecting the Group 1 sample will be FNS region. We will also use implicit stratification on total enrollment, urbanicity, and poverty level to ensure proportionate representation in the sample of SFAs defined by these characteristics. We will then use Chromy’s sequential selection to choose the samples within explicit strata. The Group 1 sample will contain no certainty selections because we are selecting SFAs with equal probabilities.

Sampling SFAs for Group 2. The Group 2 SFA sample will include the four largest SFAs and a sample of 96 other SFAs that we will select using stratified PPS selection. We will identify any certainty selections beyond the 4 largest SFAs prior to selecting the final PPS sample. We will then stratify those not sampled with certainty, explicitly by SFA size (based on the number of schools) and region and implicitly by urbanicity and poverty level. In implementing PPS selection, we propose to use total enrollment as the MOS, because the primary objective of the Group 2 sample is to provide a sample of students that will yield precise estimates. By using PPS with enrollment as the MOS we can obtain a sample of students selected with close to equal probabilities of selection. This will lead to more precise estimates because it will reduce the loss of precision due to unequal analysis weights.14 We will select the sample using Chromy’s sequential selection procedure.

Sampling SFAs for Group 3. The Group 3 sample of SFAs will include the four largest SFAs and a sample of 296 other SFAs to be selected using stratified PPS sampling. We will identify any certainty selections beyond the 4 largest SFAs prior to selecting the final PPS sample. The MOS will be the square root of enrollment. This was the MOS used for the SLBCS-II study and is appropriate because of the two kinds of meal cost estimates that will be produced (one weighted by SFAs and the other by the number of reimbursable meals provided by SFAs). Stratification for the other (noncertainty) selections will be the same as for the Group 2 sample, as will be the selection procedure (Chromy within explicit strata).
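For illustration, the sketch below shows a basic systematic PPS selection of the kind described for Groups 2 and 3. It is a simplified stand-in for Chromy's sequential procedure, assumes certainty selections have already been removed, and uses placeholder frame and column names; the commented usage lines only indicate how the two measures of size would differ between the groups.

```python
import numpy as np
import pandas as pd

def pps_systematic(frame: pd.DataFrame, mos_col: str, n: int,
                   seed: int = 12345) -> pd.DataFrame:
    """Select n units with probability proportional to size (systematic PPS).

    Units whose size measure exceeds the sampling interval would be certainty
    selections and should be removed (and taken with probability 1) before
    calling this function.
    """
    rng = np.random.default_rng(seed)
    mos = frame[mos_col].to_numpy(dtype=float)
    interval = mos.sum() / n                 # sampling interval
    start = rng.uniform(0, interval)         # random start
    points = start + interval * np.arange(n)  # n equally spaced selection points
    cum = np.cumsum(mos)
    idx = np.searchsorted(cum, points)       # unit whose cumulative MOS covers each point
    return frame.iloc[idx]

# Group 2 uses total enrollment as the MOS; Group 3 uses its square root.
# group2_sample = pps_systematic(group2_frame.assign(mos=group2_frame["enrollment"]), "mos", 96)
# group3_sample = pps_systematic(group3_frame.assign(mos=np.sqrt(group3_frame["enrollment"])), "mos", 296)
```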

a.2. Sampling Schools

We will sample schools from both Group 2 and Group 3. For most SFAs, we will sample 3 schools per SFA overall. The sampling frame for each SFA will be the CCD file of schools. Strata for sampling will be based on school level (elementary, middle, and high schools). We will use PPS sampling for Groups 2 and 3, with enrollment as the MOS for Group 2 and the square root of enrollment as the MOS for Group 3.

As in the case of SFAs, our sampling strategies will allow for attrition (ineligibility and nonparticipation). Rather than select three schools (one school per level—elementary, middle, and high school) per SFA, we will select three pairs of schools (one from each level) per SFA and randomly assign one school to be the main selection and the other to be the back-up. If there are more than 6 schools total in an SFA but one or more levels (elementary, middle, or high school) have no schools or only one school, we will sample 3 schools for the main sample and 3 replacements. If there are 3 to 6 schools in an SFA we will sample 3 schools for the main sample and the rest will be replacements. When not enough schools are available to achieve the 6 total schools, we will select additional schools from other SFAs.15 This process may lead to a slight imbalance of schools in the final sample across levels (e.g., we may have slightly more elementary schools than we do middle schools). The sample sizes provided in Figure II.1 and elsewhere are stated in terms of sample schools participating in the study.
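A minimal sketch of the paired school selection within one SFA appears below. It is illustrative only: it assumes a placeholder school_level column, draws two schools per level with probability roughly proportional to the size measure, and omits the shortfall rules just described for SFAs with too few schools at a level.

```python
import numpy as np
import pandas as pd

def sample_school_pairs(schools: pd.DataFrame, mos_col: str,
                        seed: int = 12345) -> pd.DataFrame:
    """For one SFA, draw a main and a back-up school at each level present.

    Draws two schools per level (elementary, middle, high) weighted by the
    size measure, then randomly labels one as the main selection and the
    other as the back-up. Assumed columns: school_level and mos_col.
    """
    rng = np.random.default_rng(seed)
    pairs = []
    for _, grp in schools.groupby("school_level"):
        k = min(2, len(grp))  # a level with a single school gets no back-up
        chosen = grp.sample(n=k, weights=grp[mos_col], random_state=seed).copy()
        roles = ["main", "backup"][:k]
        rng.shuffle(roles)    # randomly decide which of the pair is the main selection
        chosen["role"] = roles
        pairs.append(chosen)
    return pd.concat(pairs)
```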

Both Groups 2 and 3 will include 12 schools from the four largest SFAs. Group 2 will also include 288 schools and Group 3 will include 888 schools.16 In each of the four largest SFAs we will select 6 schools and assign 3 to Group 2 (student/parent survey sampling) and 3 to Group 3 (cost study sampling).17 In selecting schools outside the 4 largest SFAs we will select a sample large enough to yield observations on up to 3 schools (unless the SFA has fewer than 3 schools): (1) if the SFA has schools at all levels (elementary, middle, and high school), one at each level; (2) if the SFA has at least 3 schools and has schools at two levels, 2 from one level and 1 from another; (3) if the SFA has at least 3 schools, all at one level, 3 at that level; and (4) if the SFA has only one or two schools, we will sample all of them.

a.3. Sampling Students

This section describes the procedures for selecting the national sample of students from Group 2 SFAs and schools for the student and parent interviews. We will sample students from the 300 schools recruited for this part of the study. Procedures for sampling these schools and the SFAs to which they belong were discussed earlier. Sampled students will be drawn from a sampling frame based on rosters for sampled schools (obtained from either the school or district records). We will sample students based on de-identified information if necessary and obtain consent in accordance with the districts’ requirements for selecting the student sample. When a student has been selected, we will obtain consent, determine eligibility, and conduct the student interview. We will then interview the parents of students for whom we have completed the student interview.

We will obtain completed interviews for at least 2,400 students and their parents in 300 sampled schools in 100 SFAs from Group 2, distributed equally (800 each) by school level (elementary, middle, and high school). To obtain 2,400 full completes (defined as interviews with both the child and the parent), the study will need an initial recruiting sample of 3,663; this assumes that 75 percent will give consent to participate (2,747) and that 96 percent18 of children whose parents consent will be eligible for the study (that is, in school on the target day and not in an ineligible group) and complete the in-school interview (2,637). Students on the sample list could be ineligible because they have left the school (moved, transferred, or dropped out); because they are in an ineligible group (such as special education students); or simply because they were absent on the specific target day. The rates noted here are informed by experiences on the SNDA-III and APEC studies. After students are interviewed, it is assumed that interviews will be conducted with 91 percent of the parents at all levels,19 leading to 2,400 completes for both student and parent interviews.

A second dietary recall is needed from a subsample of students to construct estimates of usual dietary intakes. For the second day of dietary recalls, the goal is to obtain completed interviews with 25 percent of children with full first-day completes. The plan is to attempt telephone interviews with about 708 students. At an 85 percent response rate, the study will obtain 600 second-day intakes (25 percent of the first-day sample).
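The sample-size cascade described in the two preceding paragraphs can be checked with simple arithmetic (an illustrative back-of-the-envelope calculation, not part of the sampling software):

```python
# Work backwards from the 2,400 target "full completes" (student + parent).
target_completes = 2400
consent_rate = 0.75            # parents consenting to the student interview
eligible_complete_rate = 0.96  # consented students eligible and interviewed in school
parent_rate = 0.91             # parent interviews completed after the student interview

initial_sample = target_completes / (consent_rate * eligible_complete_rate * parent_rate)
print(round(initial_sample))   # 3663 students in the initial recruiting sample

# Second-day dietary recalls: about 708 attempted telephone recalls at an
# 85 percent response rate yields roughly the 600 second-day intakes targeted.
print(round(708 * 0.85))       # 602
```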

a.4. Sampling Lunches and Breakfasts for the Plate Waste Study

We will collect data for the plate waste observations in a subsample of 56 of the Group 3 cost study SFAs. We will select the subsample using stratified random sampling, with enrollment, urbanicity, and poverty level as implicit stratification variables. Prior to sampling, we will restrict the SFAs eligible for selection into the plate waste study sample to those with 3 or more schools.20 The 4 largest SFAs will have a chance of selection for the plate waste study, but will not be selected for it with certainty. We will conduct the plate waste observations in the schools in those SFAs sampled for the cost study data collection, yielding 168 schools. Of these 168 schools, we expect that about 90 percent (about 151) will serve breakfasts. Within each school we will randomly select meal serving periods and lines and, within them, samples of lunches (and breakfasts in schools also providing school breakfasts) on a single day. We will observe plate waste on 5,040 lunches (an average of 30 per school in 168 schools) and on 3,360 breakfasts (an average of 22 or 23 per school in 151 schools).
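The plate waste targets follow directly from these counts (an illustrative arithmetic check only):

```python
lunch_observations = 30 * 168            # 30 lunches per school in 168 schools
breakfast_schools = round(0.90 * 168)    # about 90 percent of the 168 schools serve breakfast
breakfasts_per_school = 3360 / breakfast_schools

print(lunch_observations)                # 5040 lunch observations
print(breakfast_schools)                 # 151 schools serving breakfast
print(round(breakfasts_per_school, 1))   # about 22.3 breakfasts per school
```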

The final sample sizes are designed to yield estimates from sample data that meet precision standards at each level (SFA, school, student, parent, and meal). For binary variables (estimated percentages), the sample sizes are designed to produce a 95 percent confidence interval no greater than plus or minus 5 percentage points for the sample as a whole and no greater than plus or minus 10 percentage points for important subgroups. For continuous variables, we have designed the sample to produce confidence intervals no greater than +/- 5 (whole sample) or 10 percent (subgroups) of the mean. Key subgroups include SFA and school size (enrollment), poverty level, urbanicity, FNS region, school type (elementary, middle, high), and school meal participants/nonparticipants.

At each level, there might be more than one data collection instrument administered, each with many items measured. For example, all SFAs complete the SFA Director Survey, and those in Group 3 will participate in meal cost data collection. For schools, all are administered the FSM, Principal, and Menu Surveys; SFAs and schools in Group 3 are also interviewed to obtain data to estimate the costs of reimbursable meals. Students and parents in Group 2 SFAs and schools are administered surveys including dietary recalls. We will implement the Option 1 plate waste study in a subsample of Group 3 SFAs and schools to collect data regarding individual meals.

At the SFA level, we first estimated the sample sizes needed to estimate the cost of producing a reimbursable lunch (and breakfast): (1) with the SFA as the unit of analysis—that is, the full cost per reimbursable meal for the average SFA; and (2) with the reimbursable meal as the unit of analysis—that is, the SFA weighted by the number of reimbursable meals it provides. The SLBCS-II study estimated the mean and standard error of the cost of producing a reimbursable lunch (from a sample of 120 SFAs) as $2.36 and $0.09 with the SFA as the unit of analysis, and as $2.28 and $0.04 with the reimbursable lunch as the unit of analysis. From these results, we extrapolated that we would need a national sample of 268 SFAs to get a confidence interval of plus or minus $0.118 (5 percent of $2.36); however, to get subgroup confidence intervals of no greater than plus or minus $0.236 (10 percent), we would need a sample of 272 SFAs (68 per 25 percent subgroup). We propose to increase the sample to 300, which will allow precise estimates for one of the breakfast cost measures (in addition to other cost measures). We need a still larger sample size to measure SFA characteristics with the required precision. To get the desired subgroup precision would require a sample of 126 SFAs per subgroup, with an expected design effect (DEFF) due to weighting of about 1.3, which would increase the total number to 502 overall.21 Because we will sample the 4 largest SFAs (for Groups 2 and 3), 296 more SFAs for Group 3, and 96 more for Group 2, we allocated 106 SFAs to Group 1 to bring the total to 502.
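These sample-size figures can be approximately reproduced from the SLBCS-II estimates using standard normal-approximation formulas. The sketch below is illustrative only; it ignores finite population corrections and clustering and uses the half-width formula (1.96 times the standard error) implied by the precision standards above.

```python
import math

z = 1.96  # 95 percent confidence

# SLBCS-II lunch cost, SFA as the unit of analysis: mean $2.36, standard
# error $0.09, from a sample of 120 SFAs.
mean_cost, se, n_slbcs = 2.36, 0.09, 120
sd = se * math.sqrt(n_slbcs)  # implied standard deviation, about 0.99

def n_for_halfwidth(sd, halfwidth, deff=1.0):
    """Approximate n giving a 95 percent CI half-width of `halfwidth`."""
    return deff * (z * sd / halfwidth) ** 2

print(n_for_halfwidth(sd, 0.05 * mean_cost))  # ~268 SFAs for +/- 5 percent of the mean
print(n_for_halfwidth(sd, 0.10 * mean_cost))  # ~67, i.e., about 68 per 25 percent subgroup

# SFA characteristics: binary outcome with p = 0.50 as a conservative standard,
# +/- 10 percentage point half-width per subgroup, weighting design effect ~1.3.
p = 0.50
print(n_for_halfwidth(math.sqrt(p * (1 - p)), 0.10, deff=1.3))  # ~125, consistent with ~126 per subgroup
```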

For estimated means, we project that the 95 percent confidence interval for the Group 3 SFA sample as a whole will be 4.8 percent of the estimated cost per meal with the SFA as the unit of analysis, and 2.1 percent of the estimated cost per meal with the NSLP meal as the unit of analysis (Table B.2). For estimates of percentages, we project confidence intervals of plus or minus 5.0 percentage points for the SFA sample as a whole. For 25 percent subgroups, the meal cost precision will be plus or minus 9.5 percent of the mean. For SFA characteristics, the precision for subgroups will be plus or minus 10 percentage points.


Table B.2. Expected Precision in Terms of (Half Width) Confidence Intervals Expressed as Percentage Points for Estimated Characteristics and as Percent of the Mean for Estimates of Continuous Variables for the Sample as a Whole and for Subgroups

Level | Instrument/Data | Variable | Entire Sample | 25 Percent Subgroup
SFA | SFA Director Survey | Binary outcome (50% as a conservative standard) | 5.0% | 10.0%
SFA | Meal Cost | Mean Cost Per Lunch, SFA as Unit of Analysis | 4.8 | 9.5
SFA | Meal Cost | Mean Cost Per Lunch, Meal as Unit of Analysis | 2.1 | 4.3
SFA | Meal Cost | Mean Cost Per Breakfast, SFA as Unit of Analysis | 11.5 | 23.8
SFA | Meal Cost | Mean Cost Per Breakfast, Meal as Unit of Analysis | 5.0 | 10.0
School | Principal, FSM | Binary outcome (50% as a conservative standard) | 4.0 | 7.5
School | Menu Survey | Mean percent of calories from saturated fat (NSLP) | 1.4 | 2.9
Student* | Student Survey | Binary outcome (50% as a conservative standard) | 4.5 | 6.5
Student* | Dietary Recall | Number of calories consumed at lunch | 4.6 | 7.3
Student* | Dietary Recall | Percentage of calories from fat consumed at lunch | 3.3 | 5.1
Student* | Dietary Recall | Percentage of calories from saturated fat consumed at lunch | 3.9 | 6.1
Student* | Dietary Recall | Potassium consumed at lunch | 4.8 | 7.6
Student* | Dietary Recall | Grams of saturated fat consumed at lunch | 6.5 | 10.2
Student* | Dietary Recall | Sodium consumed at lunch | 5.7 | 9.0
Plate Waste | Lunch | Binary outcome (50% as a conservative standard) | 5.1 | 7.8
Plate Waste | Breakfast** | Binary outcome (50% as a conservative standard) | 4.8 | 7.5


* Student subgroups of approximately 25 percent or more include males, females, students by whether income is above 185 percent of FPL, and students by NSLP participation status; subgroups of approximately 33 percent include students by school level (elementary, middle, high), students with income below 185 percent of FPL who are NSLP participants, high school NSLP participants, and students with income above 185 percent of FPL who are NSLP participants.

** Design Effect is lower for Breakfast, so even with a smaller sample size, precision is about the same as for Lunch.

The sampling frames for students will be lists of enrolled students obtained from the sampled schools.

b. Data Collection Methods

To achieve the goals of the SNMCS, data must be collected on several substantive areas: (1) characteristics and environments of SFAs and schools participating in the school meal programs and foodservice operations; (2) food and nutrient content of reimbursable meals and snacks offered and served; (3) meal costs and revenues; and (4) student participation in school meal programs, satisfaction with meals (including plate waste), and dietary intake. We will conduct different data collection activities in the three groups of SFAs. This approach will provide the desired levels of precision while distributing response burden such that no SFA is responsible for providing every data element from all substantive areas.

The data collection approach is illustrated in the table in Appendix A. The table and text are organized by instrument rather than data collection group because many instruments will be used in both Groups 2 and 3. Most data will be collected between January and June 2015. This timing will condense collection of the data used to estimate student dietary intake and the nutrient content and cost of school meals to as narrow a period as possible; it will also provide consistency with data collection periods in SNDA-III, SNDA-IV, and SLBCS-II. Planning and previsit interviews for Group 2 and 3 SFAs will be conducted from September to December 2014, and follow-up interviews for Group 3 to collect final cost and revenue data for SY 2014–2015 will be completed in the fall and winter of SY 2015–2016.

The SNMCS will leverage many instruments or items used previously in SNDA, SLBCS, and other studies; it will also require the development of new instruments and items in order to address the full range of research questions.

B.3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

A wide range of methods will be used to maximize participation and reduce nonresponse in all aspects of data collection. We will undertake several activities to lay the groundwork for our intensive recruitment campaign: developing recruitment materials, securing endorsements, creating a data management system to support recruiting, obtaining institutional review board (IRB) approval, and training the recruitment team. We have prepared an informative set of recruitment materials that are included in our OMB submission. The materials describe the purpose of the study in a straightforward way that stresses the important role each participating SFA, school, and individual plays in the study’s success. We have developed a study logo that can be reprinted in other study literature or on stationery to brand all communication efforts.

Gaining national, regional, and State support for the SNMCS is critical to our success in securing participation. The School Nutrition Association has provided a letter of endorsement. USDA will also provide a letter of support from an appropriate official. Such letters will provide critical study support and recruitment leverage when reaching out to SFA directors.

Together, study contractors Mathematica and Abt will conduct a comprehensive training for all recruiters that will cover project details, anticipated challenges, and expectations. With a full understanding of the project and its goals within the current school foodservice environment, our recruiters will bring competence and professionalism to all communications with study participants. We will conduct half-day trainings for Group 2 and Group 3 recruiters via interactive webinar. Training will be conducted separately for each group to enable us to train recruiters on the specifics of each group’s data collection activities. At the conclusion of the training, each recruiter will receive his or her assigned SFAs.

  • We will begin the first outreach steps of our recruitment strategy by working with FNS to gain support at the regional and State levels. We will contact each FNS regional office regarding FNS’s contract with Mathematica, explain the importance of participation at all levels to the success of the study, and ask for their support in sharing this information with their States’ child nutrition (CN) directors.

  • Mathematica, Abt, and Agralytica will identify any sample overlap with other projects and use existing relationships to make our recruiting more efficient. Many SFAs, districts, and their staff have worked with our team on recent and ongoing studies. Our successful experience with past projects provides an opportunity to revisit these existing relationships during the SNMCS. For example, since we anticipate that some SFAs using foodservice management companies (FSMCs) will require additional encouragement to participate, we will dedicate Agralytica staff, who have a long history of working with FSMCs, to recruiting those SFAs.

  • Following our conversations with FNS Regional Offices and State CN directors, recruiters will begin sending communications to all sampled Group 2 and 3 SFA directors. The initial mailing will include an introductory letter from FNS, any letters of endorsement, the brochure, and an enclosure with contact information for sampled schools.

  • Recruiters will call to confirm receipt of the mailing, assess eligibility (for example, confirming that the SFA participates in the NSLP and that none of the sampled schools are residential facilities), describe study objectives, address any SFA concerns, explain the study timeline and participation requirements, confirm contact information for study schools,22 and inquire about basic school foodservice characteristics (for example, participation in the SBP and whether meals are prepared in an off-site kitchen). During this call, recruiters will also discuss incentives. Incentives will help us overcome the competing demands and time constraints that study participants face.

  • Student and parent recruitment will initially be coordinated through the district to streamline the process and establish a consistent approach across the district’s sampled schools. We will obtain student rosters for sample selection at the district level when feasible. Current Family Educational Rights and Privacy Act (FERPA) regulations permit the release of directory information, so we will request student and parent names, telephone numbers, and mail and email addresses, along with basic demographic information (gender, grade, and meal certification status) to monitor whether our completed sample is representative of the student universe from which it was drawn. We will randomly select a minimum of 16 students per school (possibly more depending on consent requirements and other restrictions placed on data collection procedures), which includes a number of reserve students to enable us to achieve 2,400 completed student and parent interviews and dietary recalls. Participation incentives will be actively promoted during student and parent recruitment to attract attention to the study and to convey the importance of participation.

B.4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

SNMCS data collection will rely largely on instruments and individual items that have been fielded in previous studies. As a result, pretesting was used selectively, in cases of substantial changes to content, mode, or methodology. For the Menu Survey and several staff surveys and interviews, the overall structure and approach of the instruments resemble those used in past studies. Therefore, we focused on testing only specific new or modified items to examine whether these items were clearly presented, were understood as intended by respondents, and were likely to produce quality data. The Cafeteria Observation Guide was tested in schools under typical data collection conditions to test the protocol for these observations as well as the instrument’s functionality. Tests of the 24-Hour Dietary Recall focused on new methodologies being added to the SNMCS, namely the Child’s Food Diary for parents of elementary students, administration of the second recalls by telephone, and portions of the Child Interview. All respondents received incentives comparable to those planned for the study.

We recruited 13 SFAs in five States to participate in pretesting SFA- and school-level instruments; no more than six participants completed any specific activity. FNS contacted State CN directors to introduce the pretest; we then drafted an email for the CN directors to send to the SFA directors describing the study and pretest goals and highlighting the importance of participation. Recruiters followed up with the SFA directors by telephone and email to describe testing activities in detail, select schools, and schedule times for the pretests. We sent pretest participants copies of the instruments to be tested and then conducted telephone interviews to get feedback. Testing of the Cafeteria Observation Guide was conducted on-site by staff trained in its administration.

Staff trained in administering the AMPM dietary recall interview pretested the 24-Hour Dietary Recall, Food Diary, and portions of the Child Interview with four elementary students and their parents by telephone.23 This pretesting focused on two key changes in methodology: (1) the use of Food Diaries to aid parents’ recall of elementary students’ intake; and (2) conducting second recalls by telephone with this age group. A convenience sample of student-parent dyads recruited through local sources was sent a Food Model Booklet, measuring cups, and a Child’s Food Diary with accompanying instructions. Each dyad completed the recall and a debriefing interview together.

We used participant (or in the case of the cafeteria observations, contractor staff) feedback to modify a limited number of questions and response options in these instruments. In the case of the SFA Director Survey, we revised the burden estimate based on participants’ experiences completing the questionnaire in full.

The Electronic Menu Survey (EMS) represents an important mode change from the hard copy menu surveys used in prior studies. Testing will be important to ensure its full functionality in real-world settings. A separate field test of the EMS is planned for later in 2014, allowing time for any resulting programming changes before the survey is fielded in January 2015.

B.5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

The information will be collected and analyzed by Mathematica Policy Research, Abt Associates, and Agralytica, Inc. The sampling procedures were developed by John Hall (telephone: (609) 275-2357) and Nick Beyler (telephone: (202) 250-3539) of Mathematica and David Judkins (telephone: (301) 347-5952) of Abt Associates, building on previous work on the SNDA-IV, SNDA-III, and SLBCS-II projects. The sampling plans were reviewed internally by Frank Potter (telephone: (239) 558-5956), senior statistician at Mathematica. Brian Richards (telephone: (202) 720-2518) from the National Agricultural Statistics Service (NASS) has also reviewed this supporting statement and provided comments that have been incorporated.

1 National estimates of characteristics (percentages) will have a 95 percent confidence interval of no more than plus or minus 5 percentage points and estimates of means should have a 95 percent confidence interval of no more than plus or minus 5 percent of the mean. The confidence intervals for key subgroups will not exceed plus or minus 10 percentage points or 10 percent of mean values, respectively. The samples we propose will meet the subgroup precision requirements for subgroups comprising roughly 25 percent of the population of SFAs, schools, or students.

2 These are the seven regions of the country that administer USDA’s food and nutrition programs.

3 SFAs serving only institutionalized populations or SFAs operated by states or the federal government will be excluded from the sampling universe. SFAs that serve charter schools only will be included for SFA-level analyses but charter schools will not be included in the sampling universe for school-level, student-level or meal-level analyses. Private schools and SFAs serving private schools only will be excluded from all sampling frames.

4 Sample sizes described in this chapter are stated in terms of numbers of participating SFAs, schools, students, and parents. The sizes of the samples selected will be expanded to allow for nonparticipation due to ineligibility or noncooperation.

5 One sub-frame (for Group 1) will include all the SFAs that operate for charter schools only, and the other two sub-frames (for Groups 2 and 3) will include only SFAs operating for public schools (and not those operating for charter schools only). More details about the approach for dividing the SFA sampling frame into 3 sub-frames are provided in the next section.

6 The largest SFAs and their schools will participate in the SFA Director, FSM, and Principal Surveys and will be asked both to provide data for the cost study and to participate in the student/parent interview data collection. Based on preliminary analyses, we expect there to be four large SFAs from which 24 schools (6 per SFA) will be sampled: half the schools participating in the meal cost component and half in the student/parent component (but no schools will participate in both). Once we have a sampling frame constructed for this study, we can confirm the exact number of large SFAs.

7 Specifically, a file from the Public Elementary/Secondary School Universe Study.

8 Prior to PPS sampling, we will stratify the Group 2 SFA frame so that smaller SFAs (those with fewer schools) are in one stratum and all other SFAs another. We may also stratify based on urbanicity and FNS region if the stratum sizes are large enough to allow for such stratification. SFAs in the “small SFA” strata will be under-sampled (relative to SFAs in other strata) so that we improve our chances of avoiding situations in which SFAs with very few schools are in the sample. More specific information about these strata (and how SFAs will be divided across strata) will be provided once the sampling frame has been constructed and we can look at the distributions of SFA sizes in Group 2.

9 Prior to PPS sampling, we will stratify the Group 3 SFA frame so that smaller SFAs (those with fewer schools) are in one stratum and all other SFAs are in another. We may also stratify based on urbanicity and FNS regions if the stratum sizes are large enough to allow for such stratification. SFAs in the “small SFA” strata will be under-sampled (relative to SFAs in other strata) so that we improve our chances of avoiding situations in which SFAs with very few schools are in the sample. More specific information about these strata (and how SFAs will be divided across strata) will be provided once the sampling frame has been constructed and we can look at the distributions of SFA sizes in Group 3.

10 In cases in which PPS methods are used, some sampling units (in this case SFAs) might have MOSs large enough that they are certain to be selected into the sample (that is, their probability of selection is 1.0).

11 Once we have the sampling frame, we will determine how to define these (and any other) stratification variables. For example, we will determine whether the poverty level variable should be continuous or categorical and, if categorical, how many categories should be used.

12 Variables for which proportionate representation is desired but exact sample size targets do not have to be defined.

13 This statement comes with a few caveats. The Group 1 frame will likely include more total SFAs because all SFAs that serve charter schools only will be placed in Group 1. Because we are sampling three times as many SFAs from Group 3 relative to Group 2, we may also consider allocating a larger number of SFAs to the Group 3 sub-frame. We will do so if equal allocation to sub-frames for Groups 2 and 3 leads to an excessive number of certainty selections in Group 3.

14 As we will discuss later, the basic weight for the student sample will be the sampling weight or inverse of each student’s probability of selection into the sample. Although there will be adjustments to the weights for nonresponse, starting with sampling weights that are close to equal will reduce the variability of the final analysis weights.

15 Selection of additional schools from other SFAs may occur even when SFAs with shortfalls have 6 or more schools if there are nonrespondent schools.

16 In most cases, 3 schools will be selected from each SFA. However, additional schools will be selected from larger sampled SFAs to balance out the fact that fewer than 3 schools will be selected from smaller sampled SFAs.

17 One issue with sampling schools in the largest SFAs is that the methods we propose for sampling schools differ between Groups 2 and 3. We considered splitting the school sampling frames in the largest SFAs (as we split the SFA frame) before selection. However, the anticipated costs and benefits of that approach led us to propose using the Group 2 procedures to sample all schools in the four largest SFAs. Our rationale for preferring Group 2 procedures over Group 3’s is that the Group 2 sample has fewer schools and would be more adversely affected by having schools chosen in these SFAs using methods that differed from those used in other SFAs.

18 98 percent of elementary and 95 percent of secondary students.

19 95 percent of elementary and 89 percent of secondary student parents.

20 SFAs that serve fewer than 3 schools are excluded because of the excessive burden that would be necessary for reaching the target number of plate waste observations in such SFAs (30 lunch observations per school and 22-23 breakfast observations per school).

21 Each of the 4 large SFAs contributes more to the precision than smaller SFAs, so not every subgroup will require 126 SFAs. Two will only need 125.

22 For any sampled school that has been closed, we will substitute a replacement school. For a subset of SFAs, we will ask the SFA director the names of any schools newly opened since we constructed the school sampling frame and give those schools an opportunity to be selected into the SFA’s school sample.

23 Our streamlined pretest approach for the recalls focuses on the youngest respondents who are likely to have the most cognitive difficulty completing the interview. We anticipate that addressing whatever challenges children encounter with the interview will be applicable to youths as well.

