
Supporting Justification for OMB Clearance for the National School Lunch Program and School Breakfast Program Access, Participation, Eligibility, and Certification Study II (APEC-II Study)



Part B

Final

July 26, 2012

Project Officer: Reneé Arroyo-Lee Sing

Contract Number:

AG-3198-C-11-0001


Mathematica Reference Number:

40030.034



Submitted to:

U.S. Department of Agriculture

Food and Nutrition Service

3101 Park Center Drive

Alexandria, VA 22302

Project Officer: Reneé Arroyo-Lee Sing

Submitted by:

Mathematica Policy Research

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005

Project Director: Laura Castner

Supporting Justification for OMB Clearance for the National School Lunch Program and School Breakfast Program Access, Participation, Eligibility, and Certification Study II
(APEC-II Study)

Part B

Final

July 26, 2012

Eric Zeidman

Raquel af Ursin

John Hall

Laura Castner

Alicia Leonard





CONTENTS

PART B. COLLECTION OF INFORMATION USING STATISTICAL METHODS

1. Respondent Universe and Sampling Methods

2. Procedures for the Collection of Information

3. Methods to Maximize Response Rates

4. Test of Procedures

5. Individuals Consulted on Statistical Aspects of the Design

TABLES

B1.1 Respondent Universe, Samples, and Expected Response Rates (Including Main and CEO Samples)

B2.1 90 Percent Confidence Intervals About Mean Amount in Error

B2.2 90 Percent Confidence Intervals for Percentage Estimates of Case Error Due to Administrative Error, Assuming Administrative Error Rate Is 10 Percent

B5.1 Individuals Consulted on Data Collection or Analysis




ATTACHMENT A: TABLES AND FIGURES

ATTACHMENT B: SCHOOL DISTRICT & SFA CONTACT DOCUMENTS

ATTACHMENT C: SCHOOL FOOD AUTHORITY (SFA) SURVEY (SFA DISTRICT DIRECTOR QUESTIONNAIRE)

ATTACHMENT D: SFA REIMBURSEMENT CLAIMS DATA FORMS

ATTACHMENT E: APPLICATION DATA ABSTRACTION FORM

ATTACHMENT F: CERTIFIED AND DENIED APPLICANT SAMPLING FORM

ATTACHMENT G: CEO STUDENT SAMPLING

ATTACHMENT H: SCHOOL MEAL COUNT VERIFICATION FORMS

ATTACHMENT I: MEAL TRANSACTION OBSERVATION FORM

ATTACHMENT J: CHANGES IN STUDENT CERTIFICATION AND ENROLLMENT FORM

ATTACHMENT K: PARTICIPATION DATA PROTOCOL

ATTACHMENT L: HOUSEHOLD CONTACT DOCUMENTS

ATTACHMENT M: HOUSEHOLD SURVEY

ATTACHMENT N: MATHEMATICA CONFIDENTIALITY PLEDGE

ATTACHMENT O: STATE CONTACT DOCUMENTS

ATTACHMENT P: PUBLIC COMMENTS

ATTACHMENT Q: RESPONSE TO PUBLIC COMMENTS





PART B. COLLECTION OF INFORMATION USING STATISTICAL METHODS

1. Respondent Universe and Sampling Methods

Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.



The APEC-II study involves a multistage, clustered sample design and includes probability samples of school food authorities (SFAs), schools, students certified for free and reduced-price meals, and households that applied for and were denied free and reduced-price meal benefits in School Year (SY) 2012–13. Substantive data for the study will be obtained from the entities at each level of sampling. The respondent universe includes all public and private SFAs and schools participating in the NSLP and SBP that are located in the contiguous 48 states and the District of Columbia, their students certified for free and reduced-price meals, and households that applied for and were denied certification. This is referred to as the main sample. A supplemental sample of SFAs and schools participating in the Community Eligibility Option (CEO) will be selected from the universe of participants in the 7 participating States in SY 2012–13. All students in these schools will comprise the universe for a student sample. Additional detail on the study design, samples, and data collection can be found in Attachment A.

The units sampled at the first two stages—SFAs and schools—are important information units themselves, as well as being the means for facilitating access to, and creating efficient sampling frames of, units at each successive stage.

The need for separate estimates of erroneous payments for the NSLP and SBP affects much of the sample design. While over 85 percent of the schools participating in the NSLP also participate in the SBP, only about one-third as many eligible students receive free or reduced-price breakfasts as receive free or reduced-price lunches. Therefore, to achieve OMB precision standards for estimating the rate of erroneous payments for both the NSLP and SBP, the main sample (exclusive of students in CEO schools) calls for completed interviews with the parents of 3,835 students certified for free or reduced-price meals and with 585 households of denied applicants, from a total sample of 5,525 households. We anticipate that at least 1,843 of these sampled households will contain students who participate in the SBP and that interviews will be completed with 1,474 of them.

To produce estimates of erroneous payments in districts and schools participating in the CEO, we will supplement the main sample by collecting records data for 2,160 students in 135 participating schools in 45 SFAs within 5 selected States. These states will be randomly selected from among the 7 states participating in the CEO in SY 2012–13 as an efficient means to generate a representative sample within the project resources available to address the relevant research objectives. Statistical projections will be limited to SFAs and schools participating in the CEO within the 5 sampled states.

Table B1.1 shows the respondent universe, the initial sample sizes to be released for contact (including the main and CEO samples), expected response rates, and the target number of completed cases for each level of data collection, as well as the comparable response rates achieved in the first APEC study (APEC-I).

Table B1.1. Respondent Universe, Samples, and Expected Response Rates (Including Main and CEO Samples)

Respondent                            Universe       Initial   Expected        Target           APEC-I
                                                     Sample    Response Rate   Completed Cases  Response Rate

SFAs (for survey)                     19,000         175       95%             166              100%
Schools (for on-site observations)    98,500         435       100%            435              100%
Students (main sample for survey)     21,000,000 a   5,525     80%             4,420            83%

a The universe of students for the household survey includes students certified for meal program benefits and denied applicants.


Efforts similar to those undertaken in the first APEC study will be used to ensure similarly high response rates. A detailed discussion of these efforts is included in Section B3.

Sampling SFAs for the Main Sample. We will begin with an SFA sampling frame, or a list of SFAs in the contiguous United States, based on data from the Form FNS-742 (Verification Summary Report) file. After the initial SFA sample has been selected, it will be merged with the National Center for Education Statistics (NCES) Common Core of Data (CCD) district-level file1 to obtain locating and other information for public SFAs (including the NCES Local Education Agency ID that will be used to create the school-level frame). To obtain information for private SFAs, we will merge the sample with the NCES Private School Survey (PSS) files.

We will use a stratified, probability-proportional-to-size (PPS) design to select an initial sample of SFAs large enough to recruit 130 SFAs for the main national sample. The first level of stratification will be the States. This approach is motivated by the fact that the first stage of sampling (SFAs) must be completed before we know which districts or SFAs will be participating in the CEO; stratifying by State gives us maximum flexibility to adjust the sample within particular strata only. Each State that is large enough to have at least two SFAs in the final sample (i.e., having at least 2 percent of the estimated target population of students) will comprise its own stratum. The smaller States will be combined.

Within each State that is its own stratum, we will form strata based on prevalence of schools participating in the school meals program, the proportion of schools using Provision 2 or 3 (P 2/3),2 and the proportion of eligible students that are directly certified. Within the State, we will define certainty selections, if any, and in sampling SFAs not selected with certainty, we will implicitly stratify (sort based on the stratifying variables) the sample frame rather than use explicit stratification. A random, sequential selection from the sorted list of SFAs will produce a sample of SFAs that will have proportionate representation of the stratifying factors. States large enough to form their own stratum will be allocated a minimum of two selections. States too small to be allocated at least two selections (that is, they have less than 2 percent of the study population of students) will be grouped into one or more strata and will be randomly allocated 0, 1, or 2 SFAs in the national sample.3 Potential SFA replacements will also be selected in all strata for instances in which sampled SFAs are found ineligible or unable to participate in the study.

The main analytic variables of interest are at the student or meal level rather than the SFA level. Thus, the samples of SFAs will be selected with PPS. Because this study focuses on the precision of estimates regarding reimbursement errors for meals served to students, the appropriate measure of size (MOS) is the number of students eligible for free or reduced-price meals. In this way, we will set the probability of selection (from the frame) for each SFA such that if schools are selected PPS within SFAs and an equal number of students is sampled per school, the resulting sample will be approximately self-weighting. This will lead to greater precision for meal- and student-level estimates.
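For illustration only, the following Python sketch shows the kind of PPS systematic (sequential) selection from a sorted frame described above. The field names, the seed, and the handling shown here are assumptions for the sketch, not the production sampling system; in practice, certainty selections (SFAs whose MOS exceeds the selection interval) would be removed before the systematic pass.

    import random

    def pps_systematic(frame, mos_key, n_sample, sort_keys, seed=2012):
        """Select n_sample units PPS from an implicitly stratified frame.

        frame     : list of dicts, one per SFA
        mos_key   : field holding the measure of size
                    (e.g., count of students eligible for FRP meals)
        sort_keys : stratifying variables used to sort the frame
                    (implicit stratification)
        """
        rng = random.Random(seed)
        # Sorting by the stratifiers means a systematic pass yields
        # proportionate representation of each stratifying factor.
        ordered = sorted(frame, key=lambda u: tuple(u[k] for k in sort_keys))
        total_mos = sum(u[mos_key] for u in ordered)
        interval = total_mos / n_sample
        start = rng.uniform(0, interval)

        # Walk the cumulative MOS; a unit is selected when a selection
        # point (start + k * interval) falls inside its MOS range.
        selections, cum, k = [], 0.0, 0
        for unit in ordered:
            cum += unit[mos_key]
            while start + k * interval < cum:
                selections.append(unit)
                k += 1
        return selections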

While we will make every effort to ensure participation of all sampled SFAs and schools, some may be ineligible and some may refuse to take part. We expect that the greatest source of ineligibility for the base sample will be participation in the CEO.4 To account for ineligibility and non-cooperation, we propose random substitution of similar SFAs, selected at the same time as the main sample and released only if necessary. Because participation in the CEO will not be known at the time the base sample of SFAs is selected, the number of potential substitutes will be larger than in previous similar studies. We will select a sample slightly more than three times as large as desired and form triplets of SFAs belonging to adjacent zones (within explicit strata, if these are used). Two of each triplet will be randomly selected to serve as substitutes. For all substitutions, we will appropriately account for all sample releases in the weights and response rates.
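A minimal sketch of the triplet-based substitution, assuming the threefold oversample arrives in frame order so that consecutive units come from adjacent selection zones; the function name and data layout are illustrative.

    import random

    def form_triplets(oversample, seed=1):
        """Group a 3x PPS oversample into triplets of adjacent-zone SFAs,
        randomly designating one main selection and two substitutes each."""
        rng = random.Random(seed)
        triplets = [oversample[i:i + 3] for i in range(0, len(oversample), 3)]
        mains, substitutes = [], []
        for trio in triplets:
            pick = rng.randrange(len(trio))   # random main selection
            mains.append(trio[pick])
            substitutes.append([u for i, u in enumerate(trio) if i != pick])
        return mains, substitutes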

Sampling SFAs Participating in the CEO. The CEO sample will be selected in 5 of the 7 States where the CEO is being implemented in 2012–13. The frame for sampling SFAs for the CEO study will comprise SFAs in the 5 States that include at least one school participating in the CEO. Selection will be with PPS, with a sample large enough to recruit 45 SFAs. We will select a sample slightly larger than two times the number needed for the survey and will form pairs of SFAs, one of which will be randomly designated as the main selection and the other as the alternate. A small number of pairs will be kept in reserve to allow for instances in which both SFAs in a sampled pair decline to participate.

Sampling Main Schools. For each SFA selected into the initial sample, we will compile a sampling frame of schools to select the school sample. The frame for public schools will be the most recent school-level CCD. To give the schools not in the frame a chance to be selected, we will ask public SFAs in our sample to provide names, enrollment, and program participation data for schools that have come into existence since the last FNS-742 was compiled.

We will select samples that will yield, on average, 3 schools per SFA, for a total of 390 schools. We plan to stratify schools into two to four groups. In all SFAs, we will stratify on level (elementary versus middle and high schools). In SFAs that use Provision 2 or 3 or CEO, we will also stratify on those characteristics. We will use implicit rather than explicit stratification for other characteristics. As with SFAs, we will select a substitute sample for schools. To create the pool of substitutes, we will select samples twice as large as needed in each explicit stratum and randomly assign half to be substitutes in case of ineligibility or refusal to participate.

Sampling Schools for the CEO Study. The sampling of schools for the CEO sample will be mostly the same as for the base sample. In a CEO SFA, schools are grouped into claiming units, which may comprise the entire SFA or one or more groups of schools. In some SFAs, each claiming group is a single school. In those SFAs, sampling will be the same as for the base sample. However, the procedures may be different for other SFAs:

  • If the SFA is the claiming unit, we may sample schools within the SFA. However, if feasible, sampling students directly from the SFA would lead to more precise estimates.

  • If there is at least one multischool claiming unit that does not cover the SFA, we would sample claiming units at this stage; if a multischool unit is sampled, we would either sample a school within it or attempt to sample students directly.

A total of 135 schools will be included in the CEO sample, each providing samples of students used to produce estimates of certification error. On-site data collection related to noncertification error will be limited to a subset of 45 schools.

Sampling Main Students. Team leaders will visit sampled schools in the fall of 2012 to compile lists with the information needed for stratifying and selecting students, and select samples of students for the household survey, including students certified for free or reduced-price meals and students whose applications were denied. Newly certified students will similarly be sampled, but the sampling will be done centrally—no onsite visit is required—during the spring of 2013.

We will select enough students for the main sample to yield 4,420 household interviews: 3,835 with certified students and 585 with denied applicants. We expect that accomplishing this objective will require selecting a sample of 5,525 students whose households will be contacted for the survey. An additional 1,420 potential student replacements will be selected to ensure sufficient available sample in SFAs where household response rates vary. We anticipate that most of the replacements will not be released for data collection.
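The released sample size follows directly from the completion targets and the expected 80 percent response rate; a worked check (the rounding convention here is illustrative):

    targets = {"certified": 3835, "denied": 585}   # target completed interviews
    response_rate = 0.80

    releases = {grp: round(n / response_rate) for grp, n in targets.items()}
    # certified: 3835 / 0.80 = 4,794; denied: 585 / 0.80 = 731
    # total released: 5,525, matching the initial sample in Table B1.1
    print(releases, sum(releases.values()))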

Sampling CEO Students. In CEO schools, sampling will be done centrally from lists provided by SFAs (Attachment G). We will sample sufficient records to ultimately include 2,160 students in the study: a quarter directly certified in the prior year, half certified by application, and a quarter not certified (either denied applicants or students who did not apply for meal benefits) and paying full price for school meals. Students in the CEO sample will not be included in the student sample for the household survey or the collection of student records but will be matched against lists of SNAP and TANF recipients to determine the accuracy of CEO reimbursement rates.

Based on program regulations (Section 305 of the Healthy, Hunger-Free Kids Act), we will not need to obtain consent to obtain information related to students’ meal program applications and direct certification documents. This, however, does not apply to the paid students, who either did not apply for free or reduced-price meals or applied but were denied certification. If the relevant district does not agree that the new FERPA amendment, published on December 2, 2011, at 76 FR 75604-1, amending the Family Educational Rights and Privacy Act regulations, 34 C.F.R. Part 99 (New FERPA Amendment), applies to its district, parents of these sampled paid students (identified by identification numbers only) will be sent consent forms (Attachment L) by the district before any identifiable information is shared. Paid students will be initially sampled assuming an 80 percent consent rate.
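A sketch of the CEO student allocation implied by the text: 2,160 students split quarter/half/quarter across certification strata, with the paid-student stratum inflated for the assumed 80 percent consent rate. The rounding shown is illustrative.

    total = 2160
    shares = {"direct_cert": 0.25, "application": 0.50, "paid": 0.25}
    targets = {s: int(total * f) for s, f in shares.items()}
    # {'direct_cert': 540, 'application': 1080, 'paid': 540}

    consent_rate = 0.80                                  # paid students only
    released_paid = round(targets["paid"] / consent_rate)   # 675 sampled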

Sampling Cashier Transactions. We will randomly sample a day during the onsite data collection visit to observe cashier transactions. At each sampled school, field staff will enter the following data into a sampling program (separately for breakfast and lunch eating occasions): the number of periods the meals will be served, the number of serving lines per period, and the approximate number of students passing through the lines. The program will then provide a start value and selection rule (sample every “nth” tray thereafter). Field staff will observe a total of 50 breakfast and 50 lunch transactions at each school across two or more randomly selected serving lines.
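For illustration, a Python sketch of the start-value-and-interval logic the sampling program applies; the production tool is the program described above, so the function and parameter names here are assumptions.

    import math, random

    def tray_selection_rule(expected_trays, target=50, seed=None):
        """Return a random start and skip interval ('every nth tray')
        expected to yield about `target` observed transactions for one
        eating occasion.

        expected_trays : approximate number of students passing through
                         all serving lines across all periods (entered
                         on site by field staff)
        """
        rng = random.Random(seed)
        interval = max(1, math.floor(expected_trays / target))
        start = rng.randint(1, interval)   # first tray to observe
        return start, interval

    # Example: ~600 expected lunch trays -> interval of 12, random start
    start, nth = tray_selection_rule(600, target=50)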

2. Procedures for the Collection of Information

Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

Sampling Estimation and Precision. OMB specifications for statistical precision require a 90 percent confidence interval of ±2.5 percent around the national estimate of the percentage of erroneous payments.5,6 Table B2.1 presents the precision expected under the main sample design (the statistical methodology for stratification and sample selection is detailed in Section B1) for estimates of erroneous payments as a percentage of all meal reimbursements, expressed as the half-width of 90 percent confidence intervals. The confidence interval for the study’s estimate of the rate of overall erroneous payments is ±1.18 percentage points for the NSLP and ±1.64 for the SBP, combining estimates across the main and CEO samples.7 The precision of both estimates is within the OMB standard of ±2.5 percentage points. The greater-than-required power will be advantageous by providing greater precision for estimates based on subgroups (for instance, directly certified students), for analysis based on the main data set (such as modeling), and for making comparisons between APEC-II and APEC-I. Table B2.1 also includes precision estimates for subgroups based on CEO status. The confidence interval for the study’s estimate of the rate of overall erroneous payments for CEO is ±2.54 percentage points for the NSLP and ±3.07 for the SBP. The precision required by OMB is ±2.5 percent around the national estimate of the percentage of erroneous payments; there is no precision requirement for subgroups of districts and schools. Nonetheless, APEC-II includes sufficiently large samples of students in CEO-participating districts and schools to yield precise estimates of erroneous payment rates in the CEO component.

The study design will provide a sample of applicants from sampled schools with which to estimate the case error rate due to administrative error. This is the same student sample supporting the household survey (n = 4,420 completes). We will use this sample to estimate the prevalence of certification error due to administrative error separately for the NSLP and SBP. The estimates of case error rates due to administrative error are based on all applicants (and directly certified students), both approved and denied. Table B2.2 provides estimates of expected precision. For this analysis of case error due to administrative error only, the 90 percent confidence interval will be ±1.30 percentage points for the NSLP and ±1.82 percentage points for the SBP, assuming a case error rate due to administrative error near 10 percent.
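To make the precision arithmetic concrete, a simplified half-width calculation for a proportion-type estimate under the design-effect components described in the notes to Table B2.1 (a weighting design effect of 1.5 and a clustering design effect of 1 + (m − 1) × icc). The cluster size m used here is an assumption for illustration; the dollar-denominated error-rate estimates in the tables also reflect the variance of payment amounts, so this sketch is not the study’s full variance model.

    import math

    def ci_half_width(p, n, m_per_cluster, icc, deff_w=1.5, z=1.645):
        """Half-width of a 90% CI for a proportion p with sample size n,
        inflating the simple-random-sampling variance by design effects
        for unequal weighting (deff_w) and clustering (1 + (m-1)*icc)."""
        deff = deff_w * (1 + (m_per_cluster - 1) * icc)
        se = math.sqrt(deff * p * (1 - p) / n)
        return z * se

    # Illustrative only: case error near 10 percent, n = 4,420 completes,
    # an assumed average cluster size of 11 students, and icc = 0.032.
    hw = ci_half_width(p=0.10, n=4420, m_per_cluster=11, icc=0.032)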

Table B2.1. 90 Percent Confidence Intervals About Mean Amount in Error

Mean Amount in Error                        Sample Size   90 Percent Confidence Interval
                                            (Students)    for Payments in Error a

NSLP
Overall Erroneous Payment Rate, National
  Total                                     6,580         ±1.18
  Overpayment                               6,580         ±1.05
  Underpayment                              6,580         ±0.60
Overall Erroneous Payment Rate, Non-CEO
  Total                                     4,420         ±1.26
  Overpayment                               4,420         ±1.12
  Underpayment                              4,420         ±0.64
Erroneous Payment Rate, CEO
  Total                                     2,160         ±2.54
  Overpayment                               2,160         ±2.24
  Underpayment                              2,160         ±1.28

SBP b
Overall Erroneous Payment Rate, National
  Total                                     2,194         ±1.64
  Overpayment                               2,194         ±1.45
  Underpayment                               2,194         ±0.83
Overall Erroneous Payment Rate, Non-CEO
  Total                                     1,474         ±1.73
  Overpayment                               1,474         ±1.56
  Underpayment                              1,474         ±0.89
Erroneous Payment Rate, CEO
  Total                                     720           ±3.07
  Overpayment                               720           ±2.71
  Underpayment                              720           ±1.55

a In percentage points, assuming an error rate of 9.0 percent for the main study sample. The overpayment rate is assumed to be 7.1 percent and the underpayment rate 2.2 percent. For the CEO sample, the total error will be the sum of these, or 9.3 percent. Different levels of design effects are assumed for different estimates. The design effects have two components: a design effect of weighting (Deff_w) and a design effect of clustering (Deff_c). Deff_w is assumed to be 1.5, which is consistent with APEC-I. To estimate Deff_c, we assumed that the intracluster correlation (icc) for the main sample would be the same as in APEC-I (0.032). We assume that the icc for the CEO sample will be approximately 0.065; while we believe this is reasonable, there is no similar study on which to empirically estimate the icc for this subpopulation.

b Assumes one-third of sampled approved FRP students will participate in the SBP.


Table B2.2. 90 Percent Confidence Intervals for Percentage Estimates of Case Error Due to Administrative Error, Assuming Administrative Error Rate Is 10 Percent

Program   Sample Size   90% Confidence Interval

NSLP      4,420         ±1.30
SBP       1,474         ±1.82

We will use weights when analyzing the data. An initial adjustment factor, the sampling weight, adjusts for differences between sampled students, schools, or SFAs in their initial probabilities of selection. Subsequent adjustment factors will adjust for nonresponse; if necessary, we will also use a trimming factor to reduce the influence of extremely large weights (outliers). Sampling weights will be calculated for each SFA, school, and student included in the sample, as well as for the counting and claiming data. Unequal sample weights create a design effect that reduces precision; we will account for this design effect when estimating standard errors in the analysis, just as we did in estimating the precision levels reported here.
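A compact sketch of the weighting steps described above (base weight, cell-based nonresponse adjustment, trimming). The cell definitions, the trimming rule, and the assumption that every cell contains respondents are placeholders for this sketch, not the study’s actual weighting specification.

    def adjust_weights(cases, cells, cap_multiple=3.0):
        """cases: list of dicts with 'prob' (selection probability),
        'cell' (nonresponse-adjustment cell), and 'responded' (bool)."""
        # 1. Base sampling weight: inverse of the selection probability.
        for c in cases:
            c["w"] = 1.0 / c["prob"]

        # 2. Nonresponse adjustment: within each cell, spread the weight
        #    of nonrespondents over respondents (assumes each cell has
        #    at least one respondent).
        for cell in cells:
            members = [c for c in cases if c["cell"] == cell]
            resp = [c for c in members if c["responded"]]
            factor = sum(c["w"] for c in members) / sum(c["w"] for c in resp)
            for c in resp:
                c["w"] *= factor

        # 3. Trim outlier weights to a cap (here, a multiple of the
        #    median respondent weight); a simplified trimming rule.
        resp_all = [c for c in cases if c["responded"]]
        med = sorted(c["w"] for c in resp_all)[len(resp_all) // 2]
        for c in resp_all:
            c["w"] = min(c["w"], cap_multiple * med)
        return resp_all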

SFA Recruitment. SFA and school participation in the study is required under the Healthy, Hunger-Free Kids Act of 2010 (P.L. 111–296), Section 305: “States, State educational agencies, local educational agencies, schools, institutions, facilities, and contractors participating in programs authorized under this Act and the Child Nutrition Act of 1966 (42 U.S.C. 1771 et seq.) shall cooperate with officials and contractors acting on behalf of the Secretary, in the conduct of evaluations and studies under those Acts.” Stakeholder support for the study will be promoted through dissemination of study plans to child nutrition liaisons in each of the seven FNS regional offices and to State child nutrition directors. We also plan to approach the School Nutrition Association for letters of endorsement or support to provide to school districts.

SFA directors and superintendents in sampled districts will be sent recruiting materials ahead of a telephone recruiting call. (The recruitment protocol, which includes recruitment letters and suggested text for calling or speaking to the SFA Director or Superintendent, is included in Attachment B.) Once a district agrees to participate, Mathematica will develop and execute a Memorandum of Understanding (MOU).

SFA Data Collection Procedures. Westat will attempt to collect data from 175 SFA directors in the sampled school districts, using a self-administered questionnaire (Attachment C) covering the district’s administrative practices for the school meal programs and including quantitative questions that require looking up district and food service records. Surveys will be e-mailed during the 2012–13 school year, and we anticipate completing the field period by the end of June 2013.

Household Survey Procedures. Mathematica field interviewers will contact parents of sampled students to administer an in-person household interview, including income verification, and obtain permission to obtain student records. The household survey will be conducted in the main sample of 130 SFAs and 390 schools. Surveys are expected to be completed with 4,420 households—3,835 certified students, including 767 later in the school year (called “newly certified students”), and 585 denied applicants. Interviews will be conducted throughout the school year, but with most occurring during the first few months, when most applications are received and certification activities take place.

Selecting Samples of Students. Survey team leaders will visit school districts during the beginning of the school year and select samples of certified students and denied applicants while onsite.

  • Sampling Free or Reduced-Price-Approved Students. Team leaders will obtain lists of students approved to receive free or reduced-price meals at each study school at the time of their visit. They will count the total number of eligible free or reduced-price certified students and enter this information into Excel programs loaded onto their laptop computers. The program will select a sample of approximately 8 main selections, plus replacements, of free and reduced-price certified students for each study school. Later in the school year, a small sample of “newly certified” students will be selected from students who were not eligible for selection at the initial sampling visit. These newly certified students will be sampled centrally at Mathematica’s office.

  • Sampling Denied Applicants. We will define our denied applicant sample as applications submitted but not approved: either complete applications that were denied or incomplete applications. Field interviewers will stratify a school’s denied applicants into two groups: (1) denied applications that are complete and (2) those that are incomplete. Field interviewers will then select an average of 1.5 denied applicants per school across the two groups, plus replacements, using a sample allocation that selects relatively more complete-but-denied applications than incomplete applications (a 60:40 ratio applied across the full sample of schools), as illustrated in the sketch after this list.
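A worked check of the 60:40 allocation across the full school sample; the per-stratum rounding shown is illustrative.

    schools = 390
    per_school = 1.5
    total_denied = round(schools * per_school)           # 585 denied applicants

    complete_target = round(total_denied * 0.60)         # 351 complete but denied
    incomplete_target = total_denied - complete_target   # 234 incomplete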

Obtaining Household Contact Information. Team leaders will check the student roster (or obtain the application, if necessary) to get the names, addresses, and telephone numbers of the parents of each student selected for the survey (Attachment F). Mathematica will then use the information to create interviewing assignments and to generate letters that will be mailed to parents the week before home visits are made. Mathematica will discuss with each school district the application of the New FERPA Amendment to release of such directory information without parental consent. However, some school districts may have policies that do not permit the release of the names and addresses of students without receiving prior parental consent. Mathematica is prepared to work within these districts’ policies and limit receipt of contact information to households who have granted consent.

Contacting Parents. Parents will be sent advance letters printed on USDA letterhead the week before in-person contacts are made at sampled households. Further, as necessary for districts not applying the New FERPA Amendment, interviewers will also obtain parental consent for the release of student records and provide a copy to the respondent. All documents pertaining to parent contact and consent are included in Attachment L.

Conducting the Household Survey. From September through November 2012, Mathematica interviewers will complete in-person interviews with approximately 8 certified households and 1 to 2 denied applicants in each participating school, for a total of 3,068 certified students and 585 denied applicants. During the remainder of the school year (January to May 2013), we will complete interviews with approximately 2 newly certified applicants from each school during a second visit to the district, for a total of 767 newly certified students.

The household survey will be used to obtain an accurate measure of the household’s monthly income and family size at the time of application. Mathematica will use procedures that proved successful in APEC-I, asking about each source of income received by household members. We will also request documentation of income sources. The computer will compare the self-reported amounts against the information in the documents; should amounts differ, the interviewer will ask the respondent about the discrepancy to resolve it. At the end of the sequence, income will be summed across all adults and sources to derive a total monthly amount for the household. We will then ask respondents whether that total accurately reflects the household’s regular monthly income. If the answer is no, respondents will be asked which sources or household members differ, and by how much. Amounts will be adjusted to yield the appropriate monthly total for the time of application.
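A simplified sketch of the income-reconciliation logic just described: sum reported monthly amounts across household members and sources, and flag document discrepancies for the interviewer to resolve with the respondent. The data layout and tolerance are assumptions for illustration.

    def monthly_income(reports, documents, tolerance=1.0):
        """reports   : {(member, source): reported monthly amount}
        documents    : {(member, source): documented monthly amount}
        Returns the summed total and items needing interviewer follow-up."""
        discrepancies = []
        for key, reported in reports.items():
            documented = documents.get(key)
            if documented is not None and abs(documented - reported) > tolerance:
                # Interviewer probes the respondent and corrects the entry.
                discrepancies.append((key, reported, documented))
        total = sum(reports.values())
        return total, discrepancies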

Application and Direct Certification Data. Mathematica or Westat will visit SFAs and schools to collect the data that appears on the certification applications and direct certification documents for the samples of free and reduced-price approved students and denied applicants. Overall, this involves collecting data on 4,420 students from 390 sampled schools in the main and State samples. The 4,420 record abstractions will be made up of 3,835 approved free and reduced-price students and 585 students whose applications were denied. Subject to approval by schools, field staff will make copies of the application forms and direct certification documentation. When schools do not permit copying, the information will be hand-copied onto standardized data abstraction forms (Attachment E). Field staff team leaders will review these abstraction forms to ensure completeness. The application and direct certification documentation copies or completed abstraction forms will be sent to Westat’s central office. Data from photocopies of documents will be entered onto abstraction forms. Data from all abstraction forms will then be data-entered, followed by a quality control review.

Collecting Student-Level Records Data on NSLP and SBP Participation. Mathematica will collect data on individual meal program participation for sampled students in districts and schools that compile data by student. Where available, this information will be collected for 4,420 students in the free and reduced-price meal sample and denied applicants sample. We will obtain participation information covering the entire school year. Mathematica will request that SFAs provide this information in two waves: once for the first semester and then for the second semester. Mathematica central office staff will also contact SFAs just before the end of the school year and request any changes in sampled students’ certification status and enrollment.

Cashier Error Data Collection. To collect data on cashier error, Westat will station field staff near points of sale on a randomly sampled observation day during a target week and meal periods. Using hardcopy or electronic meal transaction forms (Attachment I), field staff will record the tray’s contents (foods and amounts taken); whether the tray was that of a student, a nonstudent, or an adult; and whether or not the cashier recorded the meal as reimbursable. Observers will follow consistent procedures to randomly sample point-of-sale and time combinations and to select interval samples of transactions. Recorded information will be coded by specially trained Westat staff using information on whether each school uses a food-based or a nutrient-based menu-planning approach and whether each school uses “offer versus serve” (a system aimed at reducing food waste and encouraging student choice). We plan to observe meal service operations at all 390 schools in the main sample and 45 schools from the CEO sample. We will collect data on 50 lunch transactions and (when relevant) 50 breakfast transactions per school.

Aggregation Error Data Collection. Westat will collect data on each stage of the meal reporting process for each sampled school for a target week (the full week completed prior to the school visit) and target month (the month prior to the school visit). These will be distributed across the school year. We will also collect data on the number of students in the meal-pricing categories (free or reduced-price), enrollment, daily attendance, and number of serving days, to help us assess the accuracy of the meal counts. All raw data on counting, consolidation, and claiming will be processed by Westat central office staff to determine the prevalence and amount of aggregation errors.

Collected data includes:

  • Daily Counts for Target Week – validated daily cashier and total meal counts

  • Monthly Counts – meal count reports by price category

  • District Reimbursement Claims for Sampled School – SFA records of meal counts and documentation of claims submitted to the State

  • District Consolidation and Claims Across All Schools – school-level meal counts and State-reported totals
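For illustration, one consistency check of the kind central office staff could apply to the counts listed above: comparing validated daily counts for the target week against the counts claimed for the same days and price categories. The data layout is an assumption for this sketch.

    def aggregation_discrepancies(daily_counts, claimed_counts):
        """daily_counts, claimed_counts: {(date, price_category): meals}.
        Returns cells where validated and claimed totals differ."""
        return {
            key: (daily_counts[key], claimed_counts.get(key, 0))
            for key in daily_counts
            if daily_counts[key] != claimed_counts.get(key, 0)
        }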

Data Collector Training. Mathematica will directly oversee training of all data collectors, including those of its subcontractors. Training will be designed specifically for the project and the data collection tasks for which each group will be responsible. Training will rely on written and presented materials, focusing on overall study goals, sensitivity issues relevant to the study population, instrument administration, best practices and quality control, achieving high response rates, respondent privacy, and data security.


3. Methods to Maximize Response Rates

Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.



Anticipated response rates are shown in Table B1.1 (see Section B.1). Response rates are based on those achieved in the first APEC study.8 For APEC-II, the expected response rate is higher than what was achieved in APEC-I because Section 305 of the Healthy, Hunger-Free Kids Act of 2010 makes participation in studies mandatory for districts that participate in the school meals programs.9 Thus, in addition to the methods listed below, which are comparable to those used in APEC-I, we will stress to reluctant districts that their participation is required and, if necessary, seek the assistance of the State child nutrition office and regional FNS offices in reinforcing the requirement. For this and other aspects of data collection, we will use a wide range of methods to maximize participation and reduce nonresponse:

  • SFAs will be recruited by trained, permanent, professional members of Mathematica’s staff with relevant experience working with school meals programs or other relevant stakeholders.

  • A letter from USDA (see Attachment O) will be sent to each State child nutrition director to build support for the study and encourage SFA directors to offer their full cooperation.

  • Reluctant SFAs will be referred to the project director or survey director for follow-up. As appropriate, State child nutrition staff or FNS staff may contact the most reluctant respondents to underscore the requirement of study participation.

  • Field interviewers will be sent onsite to perform student sampling and to conduct application data abstraction, and to collect and verify assorted counts and claim records required for the aggregation error data collection. This serves to significantly limit the burden placed on district and school staff in order to provide the data required for analysis.

  • Advance information will be sent to sampled households. A USDA advance letter will describe the importance of the study, the token of appreciation available for completing the survey, privacy protections, and the fact that receipt of benefits will be unaffected by study participation. A study brochure will include general information about the study and instructions on who to contact with questions or for additional information.

  • A $25 gift card will be provided to respondents after completion of the household survey, including document verification of income sources (an average burden of 45 minutes). APEC-I achieved an 83 percent response rate on the household survey using a comparable approach.

  • Data collectors will be qualified, well-trained professional interviewers. Project-specific training will emphasize achieving high response rates by focusing on sensitivity issues relevant to the study population (for example, immigration status, stigma associated with public assistance, and fear of being investigated), the privacy protections that respondents can be assured of, and refusal conversion techniques. A sufficient number of data collectors will be bilingual in Spanish to maximize response among non-English-speaking respondents. All study materials provided to households will be available in Spanish.

Based on the experience of APEC-I, we expect the planned methods of data collection to yield data accurate and reliable enough for the planned analyses and modeling, at acceptable response rates. The number of completed instruments will be the numerator in response rate calculations. A completed instrument will be defined as one in which all critical items for inclusion in the main analysis are complete and within valid ranges. All attempted respondents, excluding those determined to be ineligible, will be the denominator in response rate calculations.
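A minimal sketch of the response-rate definition stated above, checked against the Table B1.1 figures:

    def response_rate(completed, attempted, ineligible):
        """Completed instruments (all critical items valid) divided by all
        attempted cases, excluding those determined ineligible."""
        return completed / (attempted - ineligible)

    # Example with Table B1.1 figures: 4,420 completes of 5,525 released
    rate = response_rate(4420, 5525, 0)   # 0.80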

After data have been collected, a non-response analysis will be conducted, and the results will be used in constructing the weights used in analysis. At a minimum, the non-response analysis will examine which SFA, school, and student characteristics are correlated with non-response and use the results to define cells for non-response weighting adjustments. If any response rate falls below 80 percent, the analysis will also estimate the potential for bias and the ability of weighting adjustments to correct for that bias.

4. Test of Procedures

Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.



The household survey, which was significantly reduced in length from the first APEC study, was pretested in March 2012 with 4 respondents. The survey was administered in person in respondents’ homes, and respondents were given a $25 gift card as a token of appreciation. Respondents volunteered for the pretest following promotion of the study by a local non-profit organization. Each household was currently certified for free or reduced-price meals or had applied and been denied. Household burden estimates in the Part A supporting statement were derived from the average length of the pretests and include those pretest interviews.

Data collection instruments that are essentially unmodified from APEC-I were not pretested and instead rely on burden estimates and best practices from their fielding on the prior study.

5. Individuals Consulted on Statistical Aspects of the Design

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



Mathematica, Westat, and FNS staff consulted on statistical aspects of the design (see Table B5.1). The same staff will be responsible for the collection and analysis of the study’s data. Comments from the public and from the National Agricultural Statistics Service (NASS) were also considered.

Table B5.1. Individuals Consulted on Data Collection or Analysis

Name                      Title                               Telephone

Mathematica Staff (Contractor)
Laura Castner             Project Director                    202-484-3282
Phil Gleason              Senior Fellow                       315-781-8495
John W. Hall              Senior Statistician                 609-275-2357
Quinn Moore               Senior Researcher                   919-240-4879
Michael Ponza             Associate Director/Senior Fellow    510-830-3707
Eric Zeidman              Survey Researcher                   609-936-2784

Westat Staff (Subcontractor)
Janice Machado            Senior Study Director               301-294-2801
Mustafa Karakus           Senior Economist                    301-294-2874

FNS Staff
Reneé Arroyo-Lee Sing     FNS Project Officer                 703-305-2126
John Endahl               Senior Program Analyst              703-305-2122





1 Local Education Agency (School District) Universe Survey Data.

2 This is approximated by the proportion of schools operating under Provision 2 or 3 that are not in a base year.

3 A State whose size is between 0 and 1 percent of the total would be allocated either 0 or 1 SFA. A State with more than 1 but less than 2 percent would be allocated 1 or 2.

4 SFAs that have only CEO schools will be ineligible for the base sample. It is possible that an SFA on the FNS 742 would have no eligible students, but that is true of fewer than 1 percent of SFAs.

5 OMB states that “significant erroneous payments are defined as annual erroneous payments in a program exceeding both 2.5% of program payments and $10 million.” Programs and activities susceptible to such significant erroneous payments are to determine an annual estimated amount of erroneous payments, identify the reasons they are at risk of erroneous payments, and implement a plan to reduce them. OMB calls the first threshold the “error rate” and the second the “error amount.” We interpret this as meaning that the error rate is the ratio of two dollar-denominated sums: total annual erroneous payments divided by total annual payments. For the NSLP (or SBP), the error rate will equal the total dollar amount of erroneous payments made to certified students and denied applicants divided by total reimbursements for all reimbursable meals under the particular meal program. The study will also assess the prevalence of “case error”: the percentage of all applicants and directly certified students erroneously certified for benefits, or erroneously denied benefits for which they are eligible.

6 This is mathematically equivalent to the requirement that the confidence interval around the ratio of average error, as a percentage of average reimbursement per meal, be ±2.5 percentage points.

7 In making the overall combined estimate, we assume that reimbursements to CEO participants will comprise 8 percent of the national total. This assumption is derived from the distribution of reimbursements to States participating in CEO in FY2010.

8 SFAs in APEC-I were sampled in pairs in which the districts were carefully matched on characteristics, with one SFA randomly assigned to be the main selection and the other a replacement in case of non-response or ineligibility of the main selection. In APEC-I, 77 percent of districts agreed to participate (78 of 103 public SFAs). This rate is based on all SFAs ever released for recruitment efforts, including replacements for those that refused. All non-response at the district level was due to refusals to participate in the study. Because the replacements were statistically valid replacements for the main selections, we did not see any discernible bias in the SFA selection. We did adjust for nonresponse within the sampling weights.



9 SEC. 305. PROGRAM EVALUATION.

Section 28 of the Richard B. Russell National School Lunch Act (42 U.S.C. 1769i) is amended by adding at the end the following:

‘‘(c) COOPERATION WITH PROGRAM RESEARCH AND EVALUATION.—States, State educational agencies, local educational agencies, schools, institutions, facilities, and contractors participating in programs authorized under this Act and the Child Nutrition Act of 1966 (42 U.S.C. 1771 et seq.) shall cooperate with officials and contractors acting on behalf of the Secretary, in the conduct of evaluations and studies under those Acts.’’

