OMB Submission
(Part B)
Community Eligibility Option Evaluation
Contract #:
GS-10F-00086K
Order #:
AG-3198-D-11-0074
Revised
July 26, 2012
Prepared for:
John Endahl
Office of Research and Analysis
Food and Nutrition Service/USDA
3101 Park Center Dr, Rm 1004
Alexandria, VA 22302
Prepared by:
Abt Associates Inc.
55 Wheeler Street
Cambridge, MA 02138
Community Eligibility Option Evaluation—OMB Clearance Package (Part B)
Table of Contents
B.1 Respondent Universe and Sampling Methods
B.2 Procedure for the Collection of Information
    Degree of Accuracy Needed: Precision, Statistical Power, and Minimum Detectable Differences
B.3 Methods to Maximize the Response Rates and to Deal with Nonresponse
B.4 Test of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
B.1 Respondent Universe and Sampling Methods

FNS anticipates an overall response rate of 83.4 percent. The respondent universe for the Community Eligibility Option Evaluation will include:
11 State Child Nutrition Directors;
51 State Department of Education staff familiar with how NSLP eligibility data are used beyond the specific purpose of qualifying students for the NSLP;
An estimated 3,500 LEA Food Service Directors in 7 States;
An estimated 3,621 School Administrators in 7 States; and
An estimated 10,863 school cafeteria managers in 7 States.
Several of these numbers are estimates because the sampling frame data from the participating States are not yet final and complete.
This evaluation entails multiple data collections at various jurisdictional levels. Samples for some components of the data collection are nested. Exhibit B.1 displays estimates for the size of the universe, the sampled portion, and the response rate for each data collection component.
The sampling frame for LEAs will consist of those LEAs in the 7 participating States that are either eligible and participating, eligible and not participating, or nearly eligible to participate in the CE Option. These three groups are defined by FNS, and participating States are required to report these data to FNS by July each year. We will use these data to construct the sampling frame. Schools within these LEAs constitute the universe of schools. The sampling frame for schools will be constructed from lists of participating, eligible non-participating, and near-eligible schools that the participating States are required to provide to FNS under program guidance.
Implementation study. A total of 1,400 LEAs will be invited to participate in the web survey. All LEAs that use the CE Option in School Year 2012-2013 will be invited to participate. Eligible non-participating and near-eligible LEAs will be invited to participate in proportion to their representation in the universe of LEAs. If the total number across all three groups exceeds 1,400, eligible non-participating and near-eligible LEAs will be systematically sampled with sorting by the number of enrolled students to achieve a sample size of 1,400.
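The systematic sampling step described above (a fixed skip interval applied to a frame sorted by enrollment) can be sketched as follows. This is an illustrative sketch only: the `frame` of LEA identifiers and enrollment counts is hypothetical, and the real frame would come from the State-reported lists.

```python
import random

def systematic_sample(frame, n, sort_key):
    """Select a systematic sample of n units from `frame`.

    Sorting by enrollment before applying a fixed skip interval spreads
    the sample across LEAs of all sizes (implicit stratification by size).
    """
    ordered = sorted(frame, key=sort_key)
    interval = len(ordered) / n             # fractional skip interval
    start = random.random() * interval      # random start in [0, interval)
    return [ordered[int(start + i * interval)] for i in range(n)]

# Hypothetical frame: (LEA id, enrollment) pairs for eligible
# non-participating LEAs, mirroring the 1,525 in Exhibit B.1
frame = [(i, random.randint(100, 20000)) for i in range(1525)]
sample = systematic_sample(frame, 475, sort_key=lambda lea: lea[1])
print(len(sample))  # 475
```

Because the skip interval exceeds 1, the selected indices are strictly increasing, so no LEA can be drawn twice.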
Impact study. The impact study will use a multi-stage design with several samples of LEAs and schools. The first stage will be the selection of LEAs for the Participation, Enrollment, Attendance, and Revenue (PEAR) component of the impact study. The second and third stage samples of LEAs for other components of the impact study will be subsamples of this PEAR sample, and samples of schools will be taken from these LEAs.
The goal of the impact study is to estimate the causal effects of the Community Eligibility Option on participating LEAs, which requires a comparison sample of non-participating LEAs. Ideally, we would randomly assign LEAs to participate in the option, but this is not feasible. An alternative would be to select a representative sample of eligible non-participating LEAs, but this could yield a comparison sample that differs substantially from the sample of participants, forcing us to rely entirely on multivariate regression methods to control for differences between the groups. For the PEAR survey, we will therefore select a representative sample of LEAs participating in the CE Option (the treatment group) and a purposively selected, matched comparison sample of non-participating LEAs.
Instead, our strategy in this evaluation is to select a comparison sample closely matched to the sample of participants, using propensity score matching to select the non-participating LEAs. The participating LEAs, along with their best-matched eligible non-participating LEAs, will form the PEAR sample. No sufficiently close match may exist for some participating LEAs; those LEAs would have to be excluded from the PEAR sample. If the matched sample of participating and eligible non-participating LEAs exceeds the specified sample size for data collection, a systematic sample will be selected with sorting by the number of enrolled students. Private schools will be excluded from the impact study because the more limited data available on their characteristics make suitable matches unlikely.
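As an illustration of the matching step, the sketch below applies greedy one-to-one nearest-neighbor matching on already-estimated propensity scores. The scores, caliper value, and LEA identifiers are hypothetical, and the study's actual propensity model and matching algorithm may differ; the point is the exclusion behavior described above, where a participant with no sufficiently close comparison is dropped.

```python
def match_nearest(participants, comparisons, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity score.

    `participants` and `comparisons` map LEA id -> propensity score
    (assumed already estimated, e.g. by a logistic regression of
    participation on LEA characteristics). Participants with no
    comparison LEA within `caliper` are dropped, mirroring the
    exclusion of unmatched participating LEAs from the PEAR sample.
    """
    available = dict(comparisons)
    pairs = []
    # Match the hardest cases first: participants with the highest scores
    for pid, p_score in sorted(participants.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - p_score))
        if abs(available[cid] - p_score) <= caliper:
            pairs.append((pid, cid))
            del available[cid]  # each comparison LEA used at most once
    return pairs

# Hypothetical scores: P3 has no comparison within the caliper, so it
# is excluded from the matched sample
participants = {"P1": 0.81, "P2": 0.42, "P3": 0.10}
comparisons = {"C1": 0.80, "C2": 0.44, "C3": 0.55, "C4": 0.90}
print(match_nearest(participants, comparisons))  # [('P1', 'C1'), ('P2', 'C2')]
```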
At the second stage, the sample of LEAs for the Costs and Staffing and Certification Record Review data collection will be an approximate 50 percent systematic subsample of the PEAR sample. Participating LEAs with fewer than 70 percent of their schools participating in the CE Option will be excluded from this sample in order to focus resources on estimating impacts in participating schools. We chose the 70 percent threshold to minimize spillover effects on participating schools while still allowing inclusion of sufficient numbers of LEAs in States with relatively few LEAs in which all schools participate in the CE Option. In addition, the sample for the Costs and Staffing data collection will exclude LEAs with fewer than 300 students, because such LEAs will not have enough certification records to meet the data collection requirements. Thus, estimates from the second and third stages will be representative of the subset of LEAs meeting two criteria: at least 300 students and (for participating LEAs) at least 70 percent of schools participating in the CE Option. Schools for Costs and Staffing interviews and for the Certification Record Review will be drawn from the Costs and Staffing subsample of LEAs. Within LEAs, schools will be stratified by grade level (elementary, middle, and high), with one school selected at random from each stratum present in the LEA. Approximately one-third of sampled schools will have School Administrator Cost Interviews. Within the selected schools, certification records will be systematically sampled for review.
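The within-LEA school selection (one randomly chosen school per grade-level stratum present in the LEA) can be sketched as follows; the school identifiers are hypothetical.

```python
import random

def select_schools(schools):
    """Select one school at random per grade-level stratum present.

    `schools` is a list of (school_id, level) tuples for one LEA, where
    level is 'elementary', 'middle', or 'high'. Strata absent from the
    LEA are simply skipped, as in the design described above.
    """
    strata = {}
    for school_id, level in schools:
        strata.setdefault(level, []).append(school_id)
    return {level: random.choice(ids) for level, ids in strata.items()}

# Hypothetical LEA with two elementary schools and one each of the others
lea_schools = [("S1", "elementary"), ("S2", "elementary"),
               ("S3", "middle"), ("S4", "high")]
picked = select_schools(lea_schools)
print(picked)
```

For an LEA with no middle schools, for example, the result would simply contain entries for the elementary and high strata only.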
At the third stage, the sample of LEAs for the Meal Counting and Claiming data collection will be an approximate 50 percent systematic subsample of the Costs and Staffing sample, after excluding LEAs that do not have at least one school in each of the three grade ranges. Schools for the Menu Survey, Cashier Observations, and Meal Counting and Claiming data collections will be drawn from the Meal Counting and Claiming subsample of LEAs, using the same stratification and sampling procedure as for the selection of schools for the Costs and Staffing and Certification Record Review data collection.
Exhibit B.1. Estimated Sizes of Universes, Sampled Portions, and Expected Response Rates for Implementation Survey and Impact Study Components
| Level | Data Collected | Stratum^a | Number in Universe | Number Sampled | Percent Sampled | Expected Number Responding | Expected Response Rate |
|-------|----------------|-----------|--------------------|----------------|-----------------|----------------------------|------------------------|
| States | State survey | | 51 | 51 | 100.0 | 45 | 88.2 |
| | Implementation | | 11 | 11 | 100.0 | 11 | 100.0 |
| LEAs | Implementation | EP | 450 | 450 | 100.0 | 360 | 80.0 |
| | | EN | 1,525 | 475 | 31.1 | 380 | 80.0 |
| | | NE | 1,525 | 475 | 31.1 | 380 | 80.0 |
| | | Total | 3,500 | 1,400 | 40.0 | 1,120 | 80.0 |
| | Participation, Enrollment, Attendance, and Revenue | EP | 450 | 150 | 33.3 | 120 | 80.0 |
| | | EN | 1,525 | 150 | 9.8 | 120 | 80.0 |
| | | Total | 1,975 | 300 | 15.2 | 240 | 80.0 |
| | Costs and Staffing (LEA level) | EP | 450 | 66 | 14.7 | 53 | 80.3 |
| | | EN | 1,525 | 66 | 4.3 | 53 | 80.3 |
| | | Total | 1,975 | 132 | 6.7 | 106 | 80.3 |
| | Meal Counting and Claiming | EP | 450 | 32 | 7.1 | 26 | 81.3 |
| | | EN | 1,525 | 32 | 2.1 | 26 | 81.3 |
| | | Total | 1,975 | 64 | 3.2 | 52 | 81.3 |
| Schools | Costs and Staffing Interviews (School Food Service Manager) and Certification Record Review^b | EP | 2,475 | 159 | 6.4 | 159 | 100.0 |
| | | EN | 8,388 | 159 | 1.9 | 159 | 100.0 |
| | | Total | 10,863 | 318 | 2.9 | 318 | 100.0 |
| | Cost and Staffing Interview for School Administrator^c | EP | 825 | 50 | 6.0 | 50 | 100.0 |
| | | EN | 2,796 | 50 | 1.8 | 50 | 100.0 |
| | | Total | 3,621 | 100 | 2.8 | 100 | 100.0 |
| | Menu Surveys and Cashier Observations^d | EP | 2,475 | 97 | 3.9 | 78 | 80.4 |
| | | EN | 8,388 | 97 | 1.2 | 78 | 80.4 |
| | | Total | 10,863 | 194 | 1.8 | 156 | 80.4 |
| | Meal Counting and Claiming | EP | 2,475 | 97 | 3.9 | 78 | 80.4 |
| | | EN | 8,388 | 97 | 1.2 | 78 | 80.4 |
| | | Total | 10,863 | 194 | 1.8 | 156 | 80.4 |
a Strata are defined as: "EP" = eligible, participating LEA; "EN" = eligible, non-participating LEA; "NE" = near-eligible LEA.
b Certification record sample in each school will include 38 students approved for free/reduced-price meals in both EP and EN schools, and 12 denied students in EN schools.
c We estimate that one in three schools has a role for school administrators in activities affected by implementation of the CE Option, so the universe and the sample for these interviews are approximately one-third of the universe and sample for the Cost and Staffing Interviews for School Food Service Managers.
d In each school, 60 lunches and 40 breakfasts will be observed.
B.2 Procedure for the Collection of Information

This is a one-time data collection effort with no unusual problems that require specialized sampling procedures. All respondents will be sent advance letters notifying them of the data collection activity, and interviews will be conducted by trained project staff (see Section A.2 for more details). Statistical methodology for stratification and sample selection was discussed in Section B.1. Estimation procedures and the degree of accuracy attainable with impact estimates are discussed below. (See Section A.16 for a discussion of estimates from the implementation study.)
The estimation procedures used in this evaluation are focused on estimating the impact of the CE Option on participating LEAs and schools. This will be accomplished by comparing outcomes at either a single point in time or across time between participating LEAs and a matched set of eligible nonparticipating LEAs. This is distinct from attempting to estimate the characteristics of an underlying population from a sample. (See the discussion of the matched comparison design in Section B.1.)
For all outcomes, the primary procedure for comparing outcomes between participating and eligible nonparticipating LEAs will use regression models to separate the impact of participation from the effects of LEA or school characteristics. Models will be appropriate to the data structure, which will include pretest and posttest data for participation, revenues, and presence of the SBP, but posttest data alone for other outcomes. Standard errors of estimates will be adjusted to account for clustering of observations (i.e., schools within LEAs and records within schools).
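As a stripped-down illustration of the pre/post comparison logic for outcomes with both pretest and posttest data, the sketch below computes a simple difference-in-differences estimate. The actual analysis will use regression models with covariates and clustered standard errors, and the participation rates shown are hypothetical.

```python
def diff_in_diff(pre_t, post_t, pre_c, post_c):
    """Difference-in-differences estimate of the participation impact:
    (change in treatment group) minus (change in comparison group).

    Each argument is a list of LEA-level outcomes (e.g. participation
    rates as fractions of enrollment); means here are unweighted.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(post_t) - mean(pre_t)) - (mean(post_c) - mean(pre_c))

# Hypothetical LEA participation rates before and after adoption
dd = diff_in_diff(pre_t=[0.70, 0.72], post_t=[0.80, 0.82],
                  pre_c=[0.71, 0.73], post_c=[0.74, 0.76])
print(round(dd, 3))  # 0.07
```

Netting out the comparison group's change removes secular trends that affect both groups, which is why a matched comparison sample is needed even when pretest data are available.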
Degree of Accuracy Needed: Precision, Statistical Power, and Minimum Detectable Differences

The evaluation has been designed to meet FNS' expectations for detection of differences in outcomes between participating and eligible non-participating LEAs at widely accepted levels of statistical significance and power. Exhibit B.2 shows the minimum detectable differences (MDDs) between groups at a 5 percent significance level and 80 percent power for selected outcome measures. These MDDs are estimated for t-tests to detect differences in means between participating and eligible non-participating LEAs. The planned regression models are expected to reduce the MDDs by 10 to 20 percent.
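The MDD calculation can be approximated as below, using the standard normal approximation for a two-sample test. The outcome standard deviation and sample sizes are illustrative assumptions only; the study's actual MDDs also reflect design effects from clustering and matching that are not shown here.

```python
from statistics import NormalDist
import math

def mdd(sd, n_per_group, alpha=0.05, power=0.80):
    """Minimum detectable difference for a two-sample comparison of
    means, via the normal approximation:

        MDD = (z_{1-alpha/2} + z_{power}) * sd * sqrt(2 / n)

    `sd` is the assumed common outcome standard deviation.
    """
    z = NormalDist().inv_cdf
    se = sd * math.sqrt(2.0 / n_per_group)      # SE of the difference in means
    return (z(1 - alpha / 2) + z(power)) * se

# Illustrative: a participation-rate outcome with an assumed sd of 0.12
# (12 percentage points) and 120 responding LEAs per group, as in the
# expected PEAR response counts
print(round(mdd(0.12, 120), 3))  # 0.043
```

Larger samples shrink the MDD in proportion to the square root of the per-group sample size, which is why the school-level components with smaller samples tolerate only larger detectable differences.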
Exhibit B.2. Minimum Detectable Differences Between Participating and Eligible Non-participating LEAs at a 5 Percent Significance Level and 80 Percent Power
| Measure | Estimated Average for Nonparticipating Schools | MDD at 5 Percent Significance Level with 80 Percent Power |
|---------|------------------------------------------------|-----------------------------------------------------------|
| Participation | 75 percent | 5.4 percentage points |
| Administrative costs (as percentage of total foodservice costs) | 8.2 percent | 3 percentage points |
| Certification error | 8.2 percent | 1.75 percentage points |
| Percentage of food energy from saturated fat (a measure of nutritional quality) | 10.8 percent | 1 percentage point |
| Cashier error rate | 3.1 percent | 1.5 percentage points |
B.3 Methods to Maximize the Response Rates and to Deal with Nonresponse

We do not anticipate a problem obtaining the necessary response rates. The major factor ensuring high response rates is that participation of States, LEAs, and schools in the evaluation is not voluntary. The Healthy, Hunger-Free Kids Act (HHFK) stipulates that "States, State educational agencies, local educational agencies, schools, institutions, facilities, and contractors participating in programs authorized under this Act and the Child Nutrition Act of 1966 (42 U.S.C. 1771 et seq.) shall cooperate with officials and contractors acting on behalf of the Secretary, in the conduct of evaluations and studies under those Acts" as a condition of receiving funding. We plan to reference HHFK in the advance letters that will be sent to States and LEAs before any data collection activity. We will ask States to send messages to LEAs advising them of the requirement to cooperate. Project staff will follow up with phone calls and emails as needed to recruit respondents. For LEAs that are non-responsive, we will ask the State CN directors to send follow-up emails reminding them of the requirement to cooperate with the study. The cooperation of an LEA in turn ensures the cooperation of the sampled schools within that LEA.
Every effort will be made to minimize the burden placed on respondents, both for the web surveys and during on-site data collection activities. The web surveys allow respondents to work on their own schedule, starting and stopping the survey as often as they need. Telephone interviews and on-site data collection activities will be conducted at times convenient for the respondents.
We will recruit LEAs and schools for the on-site data collection, prioritizing the third stage, which is the most intrusive. If one member of a matched pair refuses to participate, we will return to the pool for the next-best match. LEAs that do not participate in on-site data collection will still be included in the PEAR survey when possible.
B.4 Test of Procedures or Methods to be Undertaken

We conducted pretests of all data collection instruments for which OMB clearance is being requested. The objectives of the pretest were to evaluate: the ability of respondents to understand and respond to questions; methods of administering the survey instruments; appropriateness of response categories; and the length of time required to administer the survey instruments. All pretests were conducted with no more than 9 respondents per instrument. The three versions of the LEA Foodservice Directors Survey for the implementation study (for (1) participating LEAs, (2) eligible but non-participating LEAs, and (3) near-eligible LEAs) contained some of the same questions. Therefore, shortened versions of the first two surveys, including only those questions unique to each survey, were administered to some pretest respondents so that no more than 9 pretest participants answered any one question. Pretest summaries for each instrument are included in Appendix K.
Field procedures and on-site school instruments were tested by Abt staff at 6 LEAs and 6 schools in Massachusetts in early February 2012. The instruments tested included: Pre-Visit LEA Questionnaire (by telephone before site visits); Cost and Staffing Interview (including preparation forms); Certification Records Abstraction Form; Application Data Form; Pre-Visit School Information Questionnaire (by telephone before site visits when possible); Menu Survey; Meal and Cashier Observations; LEA Meal Count Verification Form; and School Meal Count Verification Form. The pretests included debriefings with respondents on burden and on any questions they had difficulty answering.
We conducted paper-version pretests of the web-based surveys for the Implementation Study (EP, EN, and NE surveys) and Impact Study (PEAR survey) in early February 2012. These surveys were sent to respondents by Federal Express with pre-addressed, pre-paid materials for returning surveys to Abt Associates. Respondents were also provided with instructions for recording the time they spent completing the surveys and for noting questions and instructions that were unclear and response options that were inadequate. After receiving completed surveys, project staff followed up with each respondent to discuss their experience completing the survey and noted any comments received on the draft instrument. The instruments were revised based on feedback obtained from respondents, including clarification of instructions, changes to some questions, and modifications to question wording. In addition, some questions were deleted or simplified to reduce the overall survey burden.
We conducted a State Education Agency (SEA) survey by telephone with eight States, two each in the West, Southwest, Mid-Atlantic, and Mountain Plains regions. An initial contact was made with each State to determine the appropriate respondent. The survey was then conducted with Directors of Child Nutrition, directors associated with financial departments, a Director of the Department of Education, and an Operations Director at the Department of Education. The survey and procedures were revised based on the pretest experience.
Where the pretest results indicated a need, respondent burden estimates for the data collection instruments were adjusted. Respondent burden for each data collection instrument, both as pretested and in its final version, is presented in Exhibit A.1.
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

| Name | Affiliation | Telephone Number |
|------|-------------|------------------|
| Chris Logan | Senior Associate, Abt Associates Inc. | 617.349.2821 |
| K.P. Srinath | Sampling Statistician, Abt Associates Inc. | 301.634.1836 |
| Joseph Harkness | Associate Scientist, Abt Associates Inc. | 301.347.5890 |
| Laura Peck | Principal Associate, Abt Associates Inc. | 301.347.5537 |
| Eleanor Harvill | Senior Analyst, Abt Associates Inc. | 301.347.5638 |
| Matthew Gregg | Statistician, USDA/National Agricultural Statistics Service | 202.720.3388 |