Special Nutrition Program Operations Study (SNPOS)
Statement for Paperwork Reduction Act Submission
Part B: Collection of Information Employing Statistical Methods
April 21, 2011
Office of Nutrition Analysis
Food and Nutrition Service
United States Department of Agriculture
Project Officer: John Endahl
Telephone: 703-305-2127
SUPPORTING STATEMENT PART B
Special Nutrition Program Operations Study
B.1 Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The respondent universe for the proposed survey will include all school food authorities (SFAs) operating in public school districts in the United States and outlying territories that are required to submit form FNS-742 (SFA Verification Summary Data; 7 CFR Part 245, Determining Eligibility for Free & Reduced Price Meals; OMB #0584-0026, expiration date 3/31/2013) annually to the United States Department of Agriculture (USDA), Food and Nutrition Service (FNS). In general, SFAs that participate in the National School Lunch Program (NSLP) or School Breakfast Program (SBP) are included in the respondent universe, with the following exceptions:
SFAs that operate only in Residential Child Care Institutions that do not have daytime students;
SFAs that do not have students who are eligible for free/reduced-price lunch;
SFAs in some outlying territories that are not required to complete form FNS-742; and
Private schools that participate in the NSLP.
The 2009-10 FNS-742 database will be used to construct the SFA sampling frame, i.e., the universe file from which the respondent samples will be drawn. There are currently over 18,000 SFAs in the 2009-10 FNS-742 database. However, approximately 15,000 SFAs operating in public school districts meet the criteria above and will be included in the sampling frame. Note that the unit of analysis for the proposed study will be the SFA, which usually (but not always) coincides with a local education agency (LEA) as defined in the U.S. Department of Education’s Common Core of Data (CCD) Local Education Agency Universe Survey File maintained by the National Center for Education Statistics (NCES). Exceptions are SFAs that operate school food programs for multiple school districts and those operating individual schools (e.g., some public charter schools). In the 2009-10 FNS-742 database, about 89 percent of the eligible SFAs match a district (LEA) in the 2008-09 CCD universe file (see Table B1). However, the matched SFAs account for over 94 percent of the total student enrollment served by the SFAs in the frame.
Table B1. SFAs in the sampling frame, by CCD match status and enrollment size

CCD status of SFA | Enrollment size class1 | Number of SFAs | Total enrollment1 | Number of schools2
Matches school district (LEA) in CCD | Less than 1,000 | 6,532 | 2,770,985 | 12,424
 | 1,000 to 4,999 | 4,863 | 11,314,642 | 24,041
 | 5,000 to 24,999 | 1,530 | 15,229,936 | 24,283
 | 25,000 or more | 272 | 16,413,091 | 23,372
 | Subtotal | 13,197 | 45,728,654 | 84,120
Does not match LEA in CCD | Less than 1,000 | 1,101 | 305,233 | 2,124
 | 1,000 to 4,999 | 379 | 913,357 | 2,167
 | 5,000 to 24,999 | 107 | 1,057,970 | 1,818
 | 25,000 or more | 14 | 539,072 | 837
 | Subtotal | 1,601 | 2,815,632 | 6,946
All SFAs | Total | 14,798 | 48,544,286 | 91,066

1 Number of students with access to NSLP/SBP as reported in the 2009-10 FNS-742.
2 Counts of schools operating NSLP/SBP as reported in the 2009-10 FNS-742.
Expected Response Rates
The expected response rate is the proportion of SFA Directors who respond to the survey as a percentage of the total number of SFA Directors in the sample. We plan to sample 1,765 SFAs to obtain 1,500 completes; the expected response rate is 85 percent for the School Food Authority (SFA) Director Survey. The State Agency Child Nutrition (CN) Director survey will be conducted among all 56 state directors and will not involve any sampling. We expect at least a 95 percent response rate for the State Agency Child Nutrition Director survey.
Previous Data Collections and Response Rates
This is a new data collection. However, the expected response rates of 85 percent for SFA Directors and 95 percent for State Agency CN Directors are based on prior surveys involving SFA Directors and State Agency CN Directors.
B.2 Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
Below we describe the procedures for the collection of information including statistical methodology for stratification and sample selection, estimation procedure, and the degree of accuracy needed for the purpose described in the justification.
A goal of the sample design is to obtain a nationally representative sample of SFAs that will yield population estimates with a precision of ±5 percent at the 95 percent level of confidence for the overall SFA population and for specified subgroups of SFAs. Under simple random sampling, this translates to a sample size of 400 to 500 responding SFAs for each subgroup. For example, with three key subgroups of roughly equal size (e.g., one-way classifications defined by enrollment size of SFA or by poverty status based on the percent of students eligible for free/reduced price lunch), the total required sample size would range from 1,200 to 1,500 SFAs to meet the specified precision levels. In general, however, simple random sampling is not efficient for the multiple analytic objectives of the study. For example, while a simple random (or self-weighting) sample would be optimal for estimating the overall prevalence of SFAs reporting various types of food service practices or programs, it can be inefficient for estimating the numbers of students involved in these types of services or programs. A stratified sample design using variable rates that depend on the size of the SFA would better meet these conflicting objectives. Stratification not only helps to ensure that adequate sample sizes are obtained for important analytic subgroups of interest, but can also be effective in reducing the sampling errors of estimates that are correlated with enrollment size.
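To make the arithmetic behind the 400-to-500 figure explicit, the brief sketch below (illustrative only, not part of the study specification) applies the standard simple-random-sampling sample-size formula, n = z^2 p(1-p)/d^2, for a ±5 percentage point margin of error at 95 percent confidence and a conservative prevalence of 50 percent.

    from math import ceil

    def srs_sample_size(margin, p=0.5, z=1.96):
        """Simple-random-sampling size needed so that an estimated prevalence
        near p has the given margin of error (95% confidence half-width)."""
        return ceil(z ** 2 * p * (1 - p) / margin ** 2)

    print(srs_sample_size(0.05))  # about 385 responding SFAs per subgroup

The result, roughly 385 completed cases per subgroup, is consistent with the 400-to-500 range cited above, which allows some additional margin.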
A stratified sampling design employing varying sampling fractions will be used to select the SFA sample for the study. Such a design will generally inflate the standard errors of prevalence estimates as compared with simple random sampling, but is justifiable for the reasons mentioned above. A measure of the relative precision of a complex sample design is given by the design effect (DEFF), which is defined to be the ratio of the variance of an estimate based on the complex sample design to the hypothetical variance based on a simple random sample of the same size. A design effect of 1.00 means that the complex sample is roughly equivalent to a simple random sample in terms of sampling precision. A design effect less than 1.00 means that the sample is more precise than a simple random sample; this could occur, for example, in a stratified sample in which some SFAs are sampled at very high rates. Under the proposed design, we have estimated that the resulting design effects will range from slightly under 1.00 to slightly under 1.9 depending on the subgroup being analyzed, with an overall design effect less than 1.4. As indicated in Table B2, which summarizes the expected margins of error of a prevalence estimate under the proposed design for a range of sample sizes and design effects, a total sample size of 1,500 responding SFAs should be more than adequate to meet or exceed the ±5 percent precision requirement even for design effects as large as 1.5. For a subgroup consisting of 500 SFAs for which the design effect is 1.10 (this would be reasonable for subgroups defined by size of SFA, but may be larger for other subgroups), the expected level of precision for the subgroup would be at most ±4.9 percent (the margin of error would be smaller for prevalence estimates below or above 50 percent).
Table B2. Expected margins of error* for a prevalence estimate, by number of responding SFAs (n) and design effect (DEFF)

n | DEFF = 1.10 | DEFF = 1.25 | DEFF = 1.50
100 | 11.0% | 12.5% | 15.0%
200 | 7.8% | 8.8% | 10.6%
300 | 6.4% | 7.2% | 8.7%
400 | 5.5% | 6.3% | 7.5%
500 | 4.9% | 5.6% | 6.7%
600 | 4.5% | 5.1% | 6.1%
700 | 4.2% | 4.7% | 5.7%
800 | 3.9% | 4.4% | 5.3%
900 | 3.7% | 4.2% | 5.0%
1,000 | 3.5% | 4.0% | 4.7%
1,100 | 3.3% | 3.8% | 4.5%
1,200 | 3.2% | 3.6% | 4.3%
1,300 | 3.1% | 3.5% | 4.2%
1,400 | 2.9% | 3.3% | 4.0%
1,500 | 2.8% | 3.2% | 3.9%

* Entries correspond to 95% confidence limits for an estimated prevalence of approximately 50%. For estimated prevalence less than 50% or greater than 50%, the confidence limits will be smaller than those indicated in the table.
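As a check on how entries of this kind are produced, the sketch below reproduces the Table B2 values to the rounding shown. It is illustrative only: it assumes each entry is computed as roughly twice the simple-random-sampling standard error at a prevalence of 50 percent, with the tabled design-effect factor applied directly to that standard error.

    from math import sqrt

    def margin_of_error(n, deff_factor, p=0.5, z=2.0):
        """Approximate 95% margin of error for an estimated prevalence p:
        roughly 2 x the SRS standard error, scaled by the tabled factor."""
        return z * deff_factor * sqrt(p * (1 - p) / n)

    # Reproduce a few Table B2 entries (values in percent):
    for n in (100, 500, 1500):
        print(n, [round(100 * margin_of_error(n, f), 1) for f in (1.10, 1.25, 1.50)])
    # 100  -> [11.0, 12.5, 15.0]
    # 500  -> [4.9, 5.6, 6.7]
    # 1500 -> [2.8, 3.2, 3.9]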
Sample Stratification and Selection
As indicated in Section B.1, an SFA-level database derived from 2009-10 Verification Summary Report data (FNS form 742) will be used to construct the SFA sampling frame. In addition to a unique identifier (SFAID), the name of the SFA, and the state in which the SFA is located, the database includes information about the type of control of the SFA/school district (public or private), the number of schools participating in the NSLP/SBP, total enrollment in participating schools, and the number of students eligible for free or reduced-price lunch. This information, along with data from the most recent NCES Common Core of Data (CCD) LEA universe file, will be used to stratify SFAs for sampling purposes. Note that all known eligible SFAs, including those that cannot be matched to the current CCD file, will be included in the sampling frame. Although the nonmatched SFAs account for a small percentage of students with access to NSLP or SBP (see Table B1) and could be excluded from the sampling frame, we plan to include them to minimize potential coverage biases resulting from the inability to perfectly link SFAs in the FNS-742 database to the corresponding LEA in the CCD universe file.
The types of district-level variables that can be used either as explicit or implicit stratifiers include region (defined by the seven FNS regions), enrollment size class, a measure of poverty status defined by the percent of students eligible for free/reduced price lunch, minority status defined by the percent of non-white students served by the SFA, type of locale (e.g., central city, suburban, town, rural), and instructional level of schools served by the SFA (e.g., elementary schools only, secondary schools only, or both). Since many of these characteristics are related, it will not be necessary to employ all of them in stratification to account for the variation in SFAs. Thus, we propose to define explicit sampling strata based on three primary variables: SFA enrollment size, FNS region, and poverty status. Note that since type of locale, minority status, and instructional level will not be available for SFAs that are not matched to LEAs in the CCD file, the non-matched cases will be placed in a separate category for sampling purposes. The CCD variables will be used as implicit stratifiers (i.e., sorting variables) to ensure appropriate dispersion and representation in the sample. A stratified sample of 1,765 SFAs will be allocated to the strata in rough proportion to the aggregate square root of the enrollment of SFAs in each stratum. Such an allocation gives large SFAs relatively higher selection probabilities than smaller ones and is expected to provide acceptable sampling precision for both prevalence estimates (e.g., the proportion of SFAs with a specified characteristic) and numeric measures correlated with enrollment (e.g., the number of students in SFAs with access to various food services or programs). Prior to sample selection, SFAs in the sampling frame will be sorted by characteristics available from the CCD file, to the extent feasible, to induce additional implicit stratification. Within each primary stratum defined by size class, FNS region, and poverty status, SFAs will be selected systematically at rates that are roughly proportional to the mean of the square root of the enrollment of the SFAs in the stratum, as sketched below. Assuming an overall response rate of 85 percent, the initial sample of approximately 1,765 SFAs will yield about 1,500 completed questionnaires. Table B3 summarizes the proposed sample allocation and the expected sample yields by SFA enrollment size and poverty level.
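The sketch below illustrates the allocation and selection logic just described: strata receive sample in proportion to the aggregate square root of enrollment, and SFAs are then selected systematically within each stratum after sorting on implicit stratifiers. It is a simplified illustration only; the field names (enrollment, stratum, locale) and the equal-probability within-stratum selection are assumptions, not the study's final specification.

    import math
    import random
    from collections import defaultdict

    def allocate_sample(frame, total_n):
        """Allocate the total sample to strata in rough proportion to the
        aggregate square root of SFA enrollment in each stratum."""
        stratum_totals = defaultdict(float)
        for sfa in frame:
            stratum_totals[sfa["stratum"]] += math.sqrt(sfa["enrollment"])
        grand_total = sum(stratum_totals.values())
        return {h: max(1, round(total_n * t / grand_total))
                for h, t in stratum_totals.items()}

    def systematic_sample(stratum_sfas, n_h):
        """Select n_h SFAs systematically within a stratum after sorting on
        an implicit stratifier (here, locale) to spread the sample."""
        units = sorted(stratum_sfas, key=lambda u: u.get("locale", ""))
        interval = len(units) / n_h
        start = random.uniform(0, interval)
        return [units[min(int(start + k * interval), len(units) - 1)]
                for k in range(n_h)]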
Table B3. Proposed SFA sample allocation and expected yields, by poverty status and enrollment size

Percent eligible for free/reduced price lunch1 | Enrollment size class2 | Number of SFAs to be sampled | Expected number of responding SFAs3
Under 60 percent | Less than 1,000 | 275 | 234
 | 1,000 to 4,999 | 545 | 463
 | 5,000 to 24,999 | 337 | 286
 | 25,000 or more | 116 | 99
 | Subtotal | 1,273 | 1,082
60 percent or more | Less than 1,000 | 139 | 118
 | 1,000 to 4,999 | 170 | 145
 | 5,000 to 24,999 | 124 | 105
 | 25,000 or more | 59 | 50
 | Subtotal | 492 | 418
All SFAs | Total | 1,765 | 1,500

1 Calculated from the numbers of students eligible for free or reduced price lunch as reported in the 2009-10 FNS-742.
2 Number of students with access to NSLP/SBP as reported in the 2009-10 FNS-742.
3 Based on an 85% response rate. Note: See Table B4 for additional breakouts of the sample by type of locale, poverty status, and FNS region.
Expected Levels of Precision
Table B4 summarizes the approximate sample sizes and standard errors to be expected under the proposed design for selected subgroups. The standard errors in Table B4 reflect design effects ranging from 1.0 or less to 1.5, depending on the subgroup. The design effect primarily reflects the fact that, under the proposed stratified design, large SFAs will be sampled at relatively higher rates (i.e., have smaller sampling weights) than small SFAs. The standard errors in Table B4 can be converted to approximate 95 percent confidence bounds by multiplying the entries by 2. For example, an estimated proportion on the order of 20 percent (P = 0.20) for suburban SFAs will be subject to a margin of error of ±4.6 percent at the 95 percent confidence level. Similarly, an estimated proportion on the order of 50 percent (P = 0.50) for SFAs in the Northeast region will be subject to a margin of error of ±8.6 percent at the 95 percent confidence level.
Table B4. Expected sample sizes and approximate standard errors for selected SFA subgroups

Domain (subset) | Expected sample size* | Standard error† of an estimated proportion
 |  | P = 0.20 | P = 0.33 | P = 0.50
Total sample | 1,500 | 0.012 | 0.014 | 0.015
Community type (locale)
  Missing‡ | 128 | 0.041 | 0.049 | 0.052
  City | 225 | 0.036 | 0.043 | 0.046
  Suburban | 403 | 0.023 | 0.027 | 0.029
  Town | 250 | 0.027 | 0.031 | 0.033
  Rural | 494 | 0.020 | 0.023 | 0.024
Percent of students eligible for free/reduced price lunch
  Less than 30 | 394 | 0.023 | 0.027 | 0.028
  30 to 59.9 | 689 | 0.018 | 0.021 | 0.022
  60 or more | 418 | 0.024 | 0.028 | 0.030
FNS Region
  Mid-Atlantic | 173 | 0.035 | 0.041 | 0.043
  Midwest | 339 | 0.024 | 0.028 | 0.030
  Mountain | 173 | 0.035 | 0.041 | 0.043
  Northeast | 167 | 0.035 | 0.041 | 0.043
  Southeast | 196 | 0.033 | 0.039 | 0.041
  Southwest | 213 | 0.032 | 0.038 | 0.041
  Western | 240 | 0.034 | 0.040 | 0.042
SFA Enrollment Size
  Under 1,000 | 358 | 0.020 | 0.024 | 0.026
  1,000 to 4,999 | 607 | 0.015 | 0.018 | 0.019
  5,000 or more | 536 | 0.015 | 0.018 | 0.019

* Expected number of responding eligible SFAs, assuming a response rate of 85 percent. The standard errors given in this table are for illustration; actual standard errors will depend on the characteristics being estimated and may differ from those shown.
† Assumes an unequal weighting design effect ranging from 0.78 to 1.87, depending on subgroup.
‡ Includes SFAs in the FNS-742 database that do not match to any records on the CCD frame.
Estimation and Calculation of Sampling Errors
For estimation purposes, sampling weights reflecting the overall probabilities of selection and differential nonresponse rates will be attached to each data record providing usable SFA data. The first step in the weighting process will be to assign a base weight to each sampled SFA. The base weight is equal to the reciprocal of the probability of selecting the SFA for the study, which will vary by sampling stratum under the proposed stratified sample design. Next, the base weights will be adjusted for nonresponse within cells consisting of SFAs that are expected to be homogeneous with respect to response propensity. To determine the appropriate adjustment cells, we will conduct a nonresponse bias analysis to identify characteristics of SFAs that are correlated with nonresponse. The potential set of predictors to be used to define the adjustment cells will include SFA-level characteristics that are available from the FNS database and data from the most recent CCD file. Within these cells, a weighted response rate will be computed and applied to the SFA base weights to obtain the corresponding nonresponse-adjusted weights.
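The weighting steps above can be summarized in a small sketch. It is illustrative only: the field names (selection_prob, cell, responded) are assumptions, and the actual adjustment cells will be defined from the nonresponse bias analysis described above.

    from collections import defaultdict

    def nonresponse_adjusted_weights(sample):
        """Assign base weights (1 / selection probability) and inflate
        respondents' weights by the inverse of the weighted response rate
        within each nonresponse adjustment cell."""
        cell_total = defaultdict(float)  # sum of base weights, all sampled SFAs
        cell_resp = defaultdict(float)   # sum of base weights, respondents only
        for sfa in sample:
            sfa["base_weight"] = 1.0 / sfa["selection_prob"]
            cell_total[sfa["cell"]] += sfa["base_weight"]
            if sfa["responded"]:
                cell_resp[sfa["cell"]] += sfa["base_weight"]
        respondents = []
        for sfa in sample:
            if sfa["responded"]:
                adjustment = cell_total[sfa["cell"]] / cell_resp[sfa["cell"]]
                sfa["final_weight"] = sfa["base_weight"] * adjustment
                respondents.append(sfa)
        return respondents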
To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 100 subsamples or "replicates" will be formed in a way that preserves the basic features of the full sample design. A set of weights (referred to as “replicate weights”) will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and for each of the jackknife replicates. The variability of the replicate estimates is used to obtain the variance of the survey statistic. The replicate weights can be imported into variance estimation software (e.g., SAS, SUDAAN, WESVAR) to calculate standard errors of the survey-based estimates. In addition to the replicate weights, stratum and unit codes will be provided in the data files to permit calculation of standard errors using Taylor series approximations if desired. Note that while replication and Taylor series methods often produce similar results, jackknife replication has some advantages in reflecting statistical adjustments used in weighting such as nonresponse and poststratification (e.g., see Rust, K.F., and Rao, J.N.K., 1996. Variance estimation for complex surveys using replication techniques. Statistical Methods in Medical Research, 5: 283-310).
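For illustration, the sketch below shows the generic delete-a-group jackknife variance computation for a weighted estimate, given full-sample and replicate weights. The construction of the replicate weights themselves (how the 100 replicates are formed and how the weighting adjustments are repeated within each replicate) is omitted; this is a simplified illustration, not the study's production code.

    def weighted_mean(values, weights):
        """Weighted estimate (e.g., a proportion coded 0/1, or a mean)."""
        return sum(v * w for v, w in zip(values, weights)) / sum(weights)

    def jackknife_variance(values, full_weights, replicate_weights):
        """Delete-a-group jackknife variance of a weighted estimate:
        v = (R - 1) / R * sum over replicates of (theta_r - theta_full)^2,
        where each theta_r is computed with that replicate's weights."""
        theta_full = weighted_mean(values, full_weights)
        r = len(replicate_weights)
        thetas = [weighted_mean(values, w) for w in replicate_weights]
        return (r - 1) / r * sum((t - theta_full) ** 2 for t in thetas)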
B.3 Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
Overall response projections were presented earlier. Achieving this response rate involves locating the sample members and securing their participation. We estimate that 85 percent of the SFA Directors will complete either a self-administered paper survey or the web-administered survey. We also expect all State Agency Child Nutrition (CN) Directors to complete the surveys.
Below we describe procedures to be followed to maximize the number of sample members who complete the survey:
The letters inviting SFA Directors and State Agency CN Directors to participate in the surveys will be very carefully developed to emphasize the importance of this study and how the information will help the Food and Nutrition Service (FNS) to better understand and address current policy issues related to Special Nutrition Program (SNP) operations.
Before the SFAs are invited to participate in the study, the contractor will gain support from relevant associations representing organizations with an interest in the success of this study (e.g. School Nutrition Association) and Food Service Management Companies managing school meals programs on behalf of SFAs.
Designated FNS regional staff will serve as regional study liaisons and be kept closely informed of the project so that they will be able to answer questions from SFAs and encourage participation.
The contractor will have a toll free number that SFAs can call to ask any questions related to the study.
Sampled SFA Directors will have the option of completing the survey using the mode of their choice (hard copy or web). The State CN Directors will have the option of completing a hard copy survey or a telephone survey.
We will follow up by telephone with all sampled SFA Directors and State CN Directors who do not complete the survey within a specified period and urge them to complete the survey. At that point, if a State Director prefers to complete the survey over the telephone, a telephone interviewer will administer it. SFA Directors will not be given the option of completing a telephone survey because they need to gather data to complete the survey, and it is not practical to complete the SFA survey over the telephone.
Follow-up reminders will be sent either by email (if an email address is available) or by regular mail to respondents who have not mailed the survey or completed the web survey.
The following procedures will be used to maximize the completion rates for surveys that are administered by telephone:
Use a core of interviewers with experience working on telephone surveys, particularly interviewers who have proven their ability to obtain cooperation from a high proportion of sample members, to administer the survey over the telephone to State Agency CN Directors who do not complete the hard copy survey.
All telephone interviewers will complete training specific to this study.
Use call scheduling procedures that are designed to call numbers at different times of the day (between 8am and 6pm) and week (Monday through Friday), to improve the chances of finding a respondent at work.
Make every reasonable effort to obtain an interview at the initial contact, but allow respondents flexibility in scheduling appointments to be interviewed.
Conduct silent monitoring of interviews to identify and promptly correct behaviors that could be inviting refusals or otherwise contributing to low cooperation rates.
Leave a message on voice mail in order to let the respondent know the call was for a research study.
Provide a toll-free number for respondents to call to verify the study’s legitimacy or to ask other questions about the study.
Require a substantial number of unsuccessful call attempts to a number, without ever reaching someone, before considering whether to treat the case as “unable to contact.”
Implement refusal conversion efforts for first-time refusals and use interviewers who are skilled at refusal conversion and will not unduly pressure the respondent.
B.4 Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
The discussion below provides the results of the feasibility study and the pretest of the survey instruments.
Feasibility study. We contacted several of the SFA Directors and one State Agency CN Director who completed the questionnaires to discuss their responses to the questions they found difficult to answer. During the call we probed for clarity, appropriate use of definitions and terms, and the level of ease for respondents to gather the requested information. The feasibility study helped us understand whether the SFA and State Agency CN directors are able to provide the data as requested in the draft survey items. We determined that SFA Directors found it hard to report income and expenditure information broken down by all the categories identified in the SFA survey.
Pretest. Westat purposively selected 24 potential SFA pilot sites, taking into account their representation across all FNS regions, student enrollment ranging from small (between 1,000 and 2,500 students) to large (between 25,000 and 100,000 students), and poverty status (percent of students eligible for free/reduced price lunch). Our goal was to recruit nine SFAs to participate in the pretest. No potential SFA pilot sites received data collection instruments until they agreed to participate. Several SFA Directors declined to participate in the pretest due to time or staffing constraints. Although nine SFA Directors agreed to participate in the pretest, only seven of them were able to send in their completed paper version of the survey in time for the pretest. For the State Agency Child Nutrition Director Survey, we attempted to recruit four states, and three of them participated.
The purpose of the pretest was to test the questionnaire, focusing on (1) clarity of the wording, (2) availability of the information, and (3) response burden. Respondents reported that the time taken to complete the survey was far more than the original one-hour estimate. Based on this feedback, we shortened both surveys to reduce the burden on respondents. Additionally, respondents provided valuable feedback on question wording and identified questions that were hard to respond to; several questions were identified that could not be answered at all.
B.5 Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
The contractor, Westat, will conduct this study.
Name | Affiliation | Telephone Number
Juanita Lucas-McLean | Westat | 301-294-2866
Adam Chu | Westat | 301-251-4326
Cynthia Thomas | Westat | 301-251-4364
John Endahl | FNS/USDA | 703-305-2127
Jennifer Rhorer | NASS/USDA | 202-720-2616