
Supporting Statement – Part B
for
OMB Control Number 0584-0530

Third Access, Participation, Eligibility and Certification Study Series (APEC III)





May 2017





Devin Wallace-Williams, PhD

Social Science Research Analyst

Office of Policy Support

Food and Nutrition Service

United States Department of Agriculture

3101 Park Center Drive

Alexandria, Virginia 22302

Phone: 703-457-6791

Email: [email protected]







Table of Contents

PART B. COLLECTION OF INFORMATION USING STATISTICAL METHODS


1. Respondent Universe and Sampling Methods

2. Procedures for the Collection of Information

3. Methods to Maximize the Response Rates and to Deal With Nonresponse

4. Test of Procedures or Methods to be Undertaken

5. Individuals Consulted on Statistical Aspects of the Design & Individuals Collecting and/or Analyzing Data


Table


B1-1 Respondent universe, samples, and expected response rates

B2-1 School-level weights to be constructed for analysis of non-certification errors

B2-2 Example of the effect of increased sample sizes on the confidence intervals of SFA- and school-level estimates

B2-3 Expected 90% confidence bounds around error rates estimated for APEC III

B5-1 Individuals consulted on data collection or analysis



Figure


B1-1 Overview of school sampling


LIST OF APPENDICES TO APEC III OMB SUPPORTING STATEMENT


Appendix


A Applicable Statutes, Regulations, and Reference Documents


A1. Richard B. Russell National School Lunch Act (as amended through P.L. 113–79, Enacted February 07, 2014)

A2. Improper Payments Information Act (IPIA) of 2002 (P.L. 107-300)

A3. 2009 Executive Order 13520—Reducing Improper Payments

A4. Improper Payments Elimination and Recovery Act (IPERA) of 2010 (P.L. 111-204)

A5. Improper Payments Elimination and Recovery Improvement Act (IPERIA) of 2012 (P.L. 112-248)

A6. Office of Inspector General (OIG) USDA’S FY 2014 Compliance with Improper Payment Requirements

A7. M-15-02 – Appendix C to Circular No. A-123, Requirements for Effective Estimation and Remediation of Improper Payments

A8. FNS FY 2014 Research and Evaluation Plan

A9. Healthy Hunger-Free Kids Act 2010 (P.L. 111-296)


B SFA/State Data Collection Forms


B1. SFA Request for E-Records (Non-CEP Schools for Household Sampling)

B2. SFA Reminder for E-Records (Non-CEP Schools for Household Sampling)

B3. SFA Request for E-Records (CEP Schools for ISP Data Abstraction)

B4. SFA Reminder for E-Records (CEP Schools for ISP Data Abstraction)

B5. SFA Pre-Visit Interview

B6. SFA Data Collection Visit Reminder Email

B7. Application Data Abstraction Form

B8. Round 2 & 3 Application Data Abstraction Scheduling Email

B9. SFA Reimbursement Claim Verification Form—Sampled Schools

B10. SFA Reimbursement Consolidation and Claim Verification Form―All Schools

B11. SFA Director Survey (web based)

B12. SFA Director Interview (by phone)

B13. SFA Meal Participation Data Request

B14. State Meal Claim Data Request


C School Data Collection Forms


C1. School Pre-Visit Interview

C2. School Data Collection Visit Reminder Email

C3. Meal Transaction Observation Form

C4. Example of Meal Transaction Sampling

C5. Cafeteria Manager Interview

C6. School Meal Count Verification Form

LIST OF APPENDICES TO APEC III OMB SUPPORTING STATEMENT (continued)


Appendix


D Household Data Collection Forms


D1. Household Survey Appointment Reminder Letter

D2. Household Survey Appointment Reminder Letter―Spanish

D3. Household Survey Income Worksheet

D4. Household Survey Income Worksheet―Spanish

D5. Household Survey

D6. Household Survey―Spanish

D7. Household Survey Income Source Show Card

D8. Household Survey Income Source Show Card―Spanish

D9. Household Survey Incentives Received Form

D10. Household Survey Incentives Received Form―Spanish

D11. Household Interview Appointment Reminder Letter

D12. Household Interview Appointment Reminder Letter―Spanish

D13. Household Interview

D14. Household Interview―Spanish

D15. Household Interview Incentives Received Form

D16. Household Interview Incentives Received Form―Spanish


E Summary of Public Comments


E1. School Nutrition Association Comments

E2. Food Research and Action Center Comments


F Response to Public Comments


F1. FNS Response to School Nutrition Association Comments

F2. FNS Response to Food Research and Action Center Comments


G National Agricultural Statistics Service (NASS) Comments


H Response to National Agricultural Statistics Service (NASS) Comments


I Household Survey Consent Forms


I1. Household Survey Consent Form

I2. Household Survey Consent Form―Spanish


J Westat Confidentiality Pledge


K Westat Federal-Wide Assurance


LIST OF APPENDICES TO APEC III OMB SUPPORTING STATEMENT (continued)


Appendix


L Westat APEC III IRB Approval Letters


M Westat Information Technology and Systems Security Policy and Best Practices


N APEC III Burden Table


O Sample Frame Development and Selection Process


O1. Sample Frame Development and Selection Procedures

O2. Template E-Letter from FNS Regional Liaison to State Child Nutrition Director

O3. State Follow Up Contact Guide

O4. SFA Study Notification and School Data Verification E-Letter

O5. SFA Follow Up Contact Guide (Study Notification and School Data Verification)

O6. SFA Automated Email―Receipt of Verification of School Data

O7. SFA Confirmation and Next Steps Email

O8. SFA School Sample Notification E-Letter

O9. School Study Notification E-Letter

O10. APEC III Fact Sheet (for SFAs and Schools)


P APEC III SFA Sample Selection Memo


Q APEC III School Sample Selection Memo


R Special School Weights for Non-Certification Errors


S Data Collection Summary


T Household Recruitment


T1. Household Survey Recruitment Letter

T2. Household Survey Recruitment Letter—Spanish

T3. Household Survey Brochure

T4. Household Survey Brochure—Spanish

T5. Household Survey Recruitment Contact Guide

T6. Household Survey Recruitment Contact Guide—Spanish

T7. Household Interview Recruitment Contact Guide

T8. Household Interview Recruitment Contact Guide—Spanish


U APEC III Cognitive Pretest Findings Report





PART B. COLLECTION OF INFORMATION
USING STATISTICAL METHODS


1. Respondent Universe and Sampling Methods

Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

Respondent Universe

The respondent universe for the Third Access, Participation, Eligibility, and Certification Study Series (APEC III) includes (a) School Food Authorities (SFAs) (and their corresponding 46 State child nutrition agencies), (b) schools (both Community Eligibility Provision (CEP) schools and non-CEP schools) within SFAs, and (c) students (households) who applied for, or were directly certified for, meal benefits in School Year (SY) 2017-2018 within the sampled schools. APEC III includes the same respondent universe as APEC II and will follow the same design described herein, with some enhancements (e.g., increased sample sizes and the addition of qualitative data).

Table B1-1 presents a summary of the universe, samples, expected response rate for each respondent type, and overall response rate. The efforts that have been (or will be) implemented to ensure a high response rate are described in response to Question 3 of this Supporting Statement Part B. The response rate for SFA participation has exceeded the minimum expected. With confirmation of participation from 96 percent of the SFAs, it is expected that the school response rate will meet or exceed the minimum because the SFAs direct and encourage the participation of their schools. The CEP student records are existing and required records; hence the response rate will be 100 percent. The response rate of 75 percent for the student/household sample is a conservative minimum that is expected to be exceeded based on APEC II response rates and targeted efforts to maximize response rates (described in Question 3). However, if the minimum response rate for the student/household sample is not met, we will conduct nonresponse bias analyses (described in Question 2). The overall response rate for APEC III is expected to be at least 84 percent.

Table B1-1. Respondent universe, samples, and expected response rates

Respondent | Universe¹ | Initial sample | Minimum expected response rate | Targeted completed cases | APEC II response rates²
SFAs³ | 17,854 | 336 | 81% | 275 |
  Non-CEP SFAs | 14,881 | 192 | 81% | 155 | 96%
  CEP SFAs | 2,973 | 144 | 83% | 120 | 100%
Schools | 93,990 | 782 | 80% | 626 | 95%
  Non-CEP Schools (Non-CEP SFAs) | 64,831 | 437 | 80% | 353 |
  Non-CEP Schools (CEP SFAs) | 11,759 | 111 | 80% | 85 |
  CEP Schools (CEP SFAs) | 17,400 | 234 | 80% | 188 |
Student/Households | 23,653,919 | 6,424 | 75% | 4,818 | 83%
  Non-CEP Student Households (Non-CEP SFAs) | 15,489,495 | 5,177 | 75% | 3,883 |
  Non-CEP Student Households (CEP SFAs) | 8,164,424 | 1,247 | 75% | 935 |
CEP School Student Records | 5,009,556 | 4,488 | 100% | 4,488 | N/A
TOTAL | 28,775,319 | 12,030 | 84% | 10,206 | 83%

¹ Based on data from the APEC II sampling frame drawn from FNS-742 data, including the FY 15 FNS-742 SFA File (version dated 2-22-2016) and CEP SY 15-16 National Elections Data, September 2015 (version dated 3-25-2016).

² Per Table II.3, APEC II Response Rates, APEC II Final Report Volume I, p. 29.

³ The sampling unit was the SFA and its State child nutrition agency (46 States).


Sampling Overview

The sampling plan for APEC III is a multistage stratified probability sampling design where the first-stage sampling units (FSUs) are composed of a nationally representative sample of SFAs, the second-stage sampling units (SSUs) are composed of stratified samples of schools within SFAs, and the third-stage sampling units (TSUs) are composed of students (households) within schools.

Appendix O1 describes the sample frame development and selection process. This was a multistep process that began with the use of FNS administrative data and public records from the National Center for Education Statistics (NCES). FNS regional directors and regional liaisons supported this process with study notification to State child nutrition directors. Later, sampled SFAs were notified and engaged to review, verify, and, if needed, update the sample frame data. Finally, SFAs and schools will be notified and informed that they will be contacted later for data collection. The study materials used during the sample frame development and selection process are included in Appendices O2–O10.

The sampling plan generally followed the approach used in APEC I and II but with additional SFA, school, and household samples, as shown in Figure B1-1. The APEC III SFA Sample Selection Memo (Appendix P) and the APEC III School Sample Selection Memo (Appendix Q) provide details on the sampling procedures and results. Because certification for school meals occurs differently in CEP and non-CEP schools, CEP schools were sampled separately. SFAs were divided into those with no CEP schools and those with at least one CEP school. Among SFAs with at least one CEP school, we sampled both CEP schools and non-CEP schools. For sampled SFAs implementing CEP districtwide, only CEP schools were sampled. The current allocation of the SFA sample is proportional to the number of certified students to be selected for APEC III in the two SFA categories (i.e., CEP and non-CEP SFAs). For the non-CEP SFAs, the number of students eligible to be selected for APEC III is the number of students approved for free or reduced-price meals. For the CEP SFAs, the number of students eligible to be selected for APEC III is the sum of (a) the number of students certified for free or reduced-price meals (i.e., the enrollment of the school multiplied by the identified student percentage (ISP) multiplied by 1.6); and (b) the number of students approved or certified for free or reduced-price meals in the non-CEP schools in the SFA.
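To illustrate the CEP certified-student count with hypothetical figures: a CEP school with an enrollment of 500 and an ISP of 50 percent would contribute 500 × 0.50 × 1.6 = 400 students to the count of students certified for free or reduced-price meals.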

Table B1-1 summarizes the target sample size of 626 schools for APEC III. The numbers refer to the desired number of schools (participating schools) for subsequent household/application sampling. To ensure that these sample sizes could be achieved, a larger sample was projected to be selected initially. For sampling purposes when developing the sampling plan, we assumed that the response rate among the selected schools would be at least 80 percent. Thus, approximately 782 schools were sampled to yield 626 participating schools; of the 782 sampled schools, approximately 548 are non-CEP schools and 234 are CEP schools. Note that the actual numbers selected differed slightly from those shown in Table B1-1 due to updated school data and the implementation of the one- or three-schools-per-SFA requirement described in the APEC III School Sample Selection Memo (Appendix Q). Figure B1-1 provides a summary of the numbers of schools sampled. Whereas Table B1-1 provides the summary of the sampling plan, Figure B1-1 provides the final results of school sample selection.

Figure B1-1. Overview of school sampling


NOTES

* Of the 336 SFAs that were sampled, 323 (96%) confirmed participation. This exceeded the expected response rate, resulting in a surplus of SFAs (323 instead of 275). As a result, a random sample of 302 of the 323 SFAs was selected for subsequent school sampling. The remaining SFAs could potentially be used to provide a reserve sample of schools in the unlikely event that one becomes necessary.

**These numbers reflect the original designation of the SFA type (non-CEP or CEP) during sampling. During SFA confirmation and verification, some SFAs were found to fall into a different category. The original sampling status was maintained, but we sampled according to their true status. As a result, some CEP schools are sampled from SFAs originally designated as non-CEP.


For the household survey, we expect a minimum response rate of 75 percent among the sampled student households. It is possible that the response rate will be slightly lower than in APEC II because in APEC III we will provide households with advance notice and guidance about the income documentation that will be requested during the survey. This approach should increase the rate at which households provide income documentation, which should in turn yield more accurate income data. However, the advance notification about income documentation requests may make some potential respondents more hesitant to participate, lowering the overall response rate. During recruitment these concerns will be addressed as part of refusal aversion, and sampled households will also be informed that they can still participate without providing income documentation. Thus, we will select 6,424 eligible applications in order to obtain 4,818 completed household surveys. The study anticipates a 100 percent response rate for CEP school student records because SFAs are required to maintain the documentation. Nonresponse bias analysis will be conducted to address statistical considerations regarding the response rate (described in Question 2).

2. Procedures for the Collection of Information

Describe the procedures for the collection of information, including:
  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

APEC III follows the same design and analysis methods as APEC II. In addition to being a replication study, APEC III includes design enhancements: (1) an increased sample of SFAs; (2) an increased sample of schools due to the increase in SFAs; (3) expansion of the SFA director survey; (4) the addition of an SFA director qualitative interview (for a sub-sample only); (5) the addition of a cafeteria manager interview (for a sub-sample only); (6) an increased sample of households due to the increase in the number of sampled schools; (7) expansion of the household survey to better understand reasons for errors; and (8) the addition of a household interview (with a sub-sample only). These enhancements are reflected in the discussion that follows.

Statistical Methodology for Stratification and Sample Selection

SFA Sample Selection. The APEC III SFA Sample Selection Memo (Appendix P) provides details on the sample selection of SFAs. The Form FNS-742 SFA Verification Collection Report Data (approved under OMB #0584-0594, Food Programs Reporting System (FPRS), expiration date 9/30/2019), which covers all SFAs in the United States, served as the frame for selecting the sample of SFAs. SFA-level data available from the FNS-742 frame included the number of schools participating in NSLP/SBP, the number of students with access to NSLP/SBP, and the total number of students certified for free or reduced-price lunch. A probability sample of CEP and non-CEP SFAs was selected from a sample frame derived from the Form FNS-742 SFA Verification Collection Report Data. To be eligible for selection, an SFA must have had at least one school participating in the NSLP or SBP. In addition, 2015 CEP elections data (provided by FNS) were used to develop a measure of size for SFAs and schools that implemented CEP.

Prior to selecting the SFAs, SFAs in the frame were assigned to one of two strata: (a) the CEP stratum, consisting of any SFA with at least one CEP school; and (b) the non-CEP stratum, consisting of all remaining SFAs. Within each stratum, prior to sample selection, SFAs were sorted by selected variables to achieve an implicit stratification of the sample. Sorting variables included FNS region, State, type of control (public/private), SFA enrollment, and percentage of students eligible for free or reduced-price meals. After sorting within each "CEP status" stratum separately, a sample of SFAs was selected with probability proportionate to size (PPS) where, depending on the status of the SFA, the sampling measure of size was either (a) the number of students eligible for free or reduced-price meals as reported in the FNS-742 frame (for the non-CEP SFAs) or (b) a composite measure of size developed from the estimated number of students in CEP schools and the number of students certified for free or reduced-price meals in non-CEP schools (for CEP SFAs). Because CEP SFAs can contain both CEP and non-CEP schools, the measure of size used for sampling purposes was a composite measure based on the appropriate measure of size for the two types of schools. Let M1 = an estimate of the number of students in the CEP schools and M2 = an estimate of the number of students eligible for free or reduced-price meals in the non-CEP schools. The composite measure of size used to select the CEP SFAs was computed as CMOS = r1·M1 + r2·M2, where r1 and r2 are composite weighting factors designed to yield approximately self-weighting samples of students (records) from each type of school while controlling the numbers of sampled student records per sampled CEP SFA to the extent feasible.
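For illustration only, the sketch below computes the composite measure of size for a hypothetical CEP SFA following the CMOS formula above; the school figures and the weighting factors r1 and r2 are assumed values, not study parameters.

```python
# Illustrative sketch (not the study's production code) of the composite
# measure of size for a hypothetical CEP SFA: CMOS = r1*M1 + r2*M2.
# All figures and the factors r1, r2 are assumptions for illustration.

def cep_school_mos(enrollment, isp):
    # Estimated students certified in a CEP school: enrollment x ISP x 1.6.
    return enrollment * isp * 1.6

def composite_mos(cep_schools, non_cep_certified_counts, r1, r2):
    m1 = sum(cep_school_mos(e, p) for e, p in cep_schools)  # M1
    m2 = sum(non_cep_certified_counts)                      # M2
    return r1 * m1 + r2 * m2

# Hypothetical SFA with two CEP schools and one non-CEP school:
cmos = composite_mos(cep_schools=[(500, 0.50), (300, 0.65)],
                     non_cep_certified_counts=[220],
                     r1=1.0, r2=1.0)
print(cmos)  # 400 + 312 + 220 = 932 when r1 = r2 = 1
```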

The initial sample of 192 non-CEP SFAs was selected systematically with probabilities proportionate to the number of students eligible for free or reduced-price meals from the sorted sampling frame using a random start. Similar to the non-CEP sample, the initial sample of 144 CEP SFAs was selected systematically with probabilities proportionate to an appropriate measure of size from the sorted sampling frame using a random start. After the 144 CEP SFAs were selected, those that were selected with probability less than 1 (the “noncertainty” SFAs) were sub-sampled. As indicated in Figure B1-1, both CEP and non-CEP schools were selected within the CEP SFA sample. The APEC-III SFA Sample Selection Memo (Appendix P) provides more details on the sample selection of non-CEP and CEP SFAs.
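The following minimal sketch, using a hypothetical frame, illustrates the systematic PPS selection with a random start described above; it assumes no unit's measure of size exceeds the sampling interval (such units would be certainty selections, handled separately in practice).

```python
import random
from itertools import accumulate

def systematic_pps(frame, n):
    """Systematic PPS selection with a random start.

    frame: list of (unit_id, measure_of_size) pairs, already sorted to
    impose the implicit stratification. Assumes no unit's measure of size
    exceeds the sampling interval.
    """
    cum = list(accumulate(mos for _, mos in frame))  # cumulative MOS bounds
    interval = cum[-1] / n                           # sampling interval
    start = random.uniform(0, interval)              # random start
    sample, j = [], 0
    for k in range(n):
        point = start + k * interval                 # equally spaced hits
        while cum[j] <= point:                       # find the unit whose
            j += 1                                   # MOS range contains it
        sample.append(frame[j][0])
    return sample

# Hypothetical frame of SFAs with free/reduced-price counts as the MOS:
frame = [("SFA_A", 1200), ("SFA_B", 800), ("SFA_C", 400), ("SFA_D", 1600)]
print(systematic_pps(frame, n=2))
```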

Sample Selection of Schools. We developed lists of schools from each selected (responding) SFA to form and stratify the sample frame of schools. In those cases where the SFA could be linked to the Common Core of Data (CCD) local education agency universe files maintained by NCES, an initial list of schools was prepared and included with the SFA study notification materials for verification and updating. Using these verified and/or updated lists, we selected schools as described below. The resulting probability sample of 777 schools from the participating SFAs will, after appropriately accounting for nonresponse in sample weighting, be nationally representative of schools participating in the NSLP/SBP in the 48 contiguous states and Washington, DC.

Even though the rate at which schools participate in the SBP is lower than the NSLP participation rate, substantially unbiased national estimates for each program type can be derived. Based on the standard errors reported for APEC II, we estimate that the APEC III sample will be sufficient to provide enough schools in SBP to meet the IPERA precision requirements. There is no need to oversample SBP schools.

The basic framework for the selection of schools differed between non-CEP and CEP schools. For those non-CEP SFAs in which three schools were selected, three "school type" strata defined by grade level were formed for sampling purposes: elementary, middle, and secondary (including schools with combined elementary and secondary grades). All schools in the SFA were assigned to one of these three strata; when schools in the SFA did not readily fall into one of these categories, the assignment was made to the school stratum that most closely matched the grade range covered by the school. As long as schools had a known chance of selection, no bias will arise in analyses, although the stratum assignments can affect the variability of weights of schools assigned to a particular stratum. After the schools were assigned to the appropriate grade-level stratum, the schools in the SFA were sorted by grade level and then by a measure of size defined by the number of students certified for free or reduced-price meals within grade level. A systematic sample of schools was selected from the sorted list of schools with probabilities proportionate to the measure of size. Similarly, both CEP and non-CEP schools were selected from participating CEP SFAs. For any SFA in which three schools were selected, the schools were sorted by the three grade-level strata, and within these strata schools were further sorted by the appropriate school-level measure of size. For the non-CEP schools in the SFA, the measure of size was defined by the number of students certified for free or reduced-price meals within grade level. For the CEP schools in the SFA, the measure of size was defined by the number of students certified for free or reduced-price meals (i.e., the enrollment of the school multiplied by the identified student percentage (ISP) multiplied by 1.6). A systematic sample of the specified number of schools was selected from the sorted list with probabilities proportionate to the school-level measure of size.

With some exceptions, the CEP SFAs were designated to supply either one or three sampled schools for the study where feasible. The number of SFAs assigned to each group depended on the number of schools of a particular type (i.e., CEP or non-CEP) in the SFA and the approximately optimal sampling rate for selecting schools within the SFA. Details about the allocation of schools to SFAs are described in the APEC III School Sample Selection Memo (Appendix Q). Once the number of schools to be selected was determined, the approach to selecting schools was as follows: first, prior to sampling, schools were sorted by CEP status. Within the CEP stratum, schools were sorted by grade-level category and by measure of size within grade level. The specified number of CEP schools was then selected from the sorted list with probabilities proportionate to the measure of size. The aim was to sample at least one CEP school from the CEP SFAs. The selection of the non-CEP schools was done in a similar fashion, except that sampling was done across all CEP SFAs. In this way, all non-CEP schools in CEP SFAs have an appropriate non-zero chance of selection.


Sample Selection of Students. The targeted number of sampled students (respondents) per participating school is approximately 11 in non-CEP schools. Assuming a 75 percent participation/eligibility rate, the sampling rate within schools will be determined so that an average of roughly 16 students will be sampled per school. Lists of study-eligible students will be created for each participating school, including a flag distinguishing between those approved for free or reduced-price meals and those whose application was denied. Prior to sample selection, we will sort the students by this certification status flag to ensure proportional representation on this characteristic within schools. An equal probability sample of students will then be selected from each non-CEP school at rates designed to achieve (to the extent feasible) an overall self-weighting sample of students among all non-CEP schools. Note that the non-CEP schools from which the students will be sampled can belong to either a non-CEP SFA or a CEP SFA.
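As a rough sketch of how self-weighting within-school rates can be derived: for a self-weighting design, the product of a school's overall selection probability and its within-school sampling rate should equal a common target student probability. The school selection probabilities below are hypothetical; only the Table B1-1 totals are taken from the study.

```python
# Illustrative sketch: within-school sampling rates for an approximately
# self-weighting student sample across non-CEP schools. school_prob is
# the school's overall selection probability (SFA probability x
# within-SFA probability); the values used below are hypothetical.

def within_school_rate(target_student_prob, school_prob):
    # Rate chosen so school_prob * rate = target_student_prob,
    # capped at taking every eligible student.
    return min(target_student_prob / school_prob, 1.0)

target = 5_177 / 15_489_495   # planned sample / universe (Table B1-1)
for school_prob in (0.002, 0.004):
    rate = within_school_rate(target, school_prob)
    print(f"school_prob={school_prob}: sample {rate:.2%} of eligible students")
```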

Note that, unlike APEC II, no effort will be made to limit sampling to one sampled student per household. There is no need to do so, and there are ways to limit, or even eliminate, instances in which more than one student is selected per household. In sampling student records within participating CEP schools, the targeted number of student records is 24 per school. Students will be assigned to two groups: (a) identified students, and (b) students who were not identified or certified for school meal benefits.

Finally, we note that the targeted student record sample sizes summarized in Table B1-1 refer to the total sample size across three rounds of data collection. Based on an analysis of application data by certification month from APEC II, we expect to allocate about 66 percent of the sampled households to the first data collection phase (August 2017 to November 2017), 32 percent to the second data collection phase (December 2017 to February 2018), and the remaining 2-3 percent to the third data collection phase (March 2018 to June 2018). This will ensure that the sample represents all the applications for the school year and will eliminate any potential seasonal biases.

Case Selection for Qualitative Interviews. APEC III will conduct three sets of qualitative interviews: interviews with SFA directors (Appendix B12), cafeteria managers (Appendix C5), and parents/guardians (Appendix D13/D14). The procedures for the qualitative interviews are summarized in the data collection procedures section of this Supporting Statement. The samples for the qualitative interviews will be selected from the pool of respondents in the main APEC III sample. For the 60 SFA director interviews, we will select equal numbers of SFA directors from small SFAs (1–999 students), medium SFAs (1,000–4,999 students), large SFAs (5,000–24,999 students), and very large SFAs (25,000+ students) to capture the range of viewpoints and practices. The sample for SFA director interviews will include both CEP and non-CEP SFAs. SFA directors will be contacted to explain the purpose of the interview (Appendix B12) and to provide the opportunity for other staff to participate, as there may be other staff more directly involved in the certification and claiming processes.

APEC III will conduct 60 semi-structured interviews with cafeteria managers of participating schools during data collection visits. Criteria for selecting cafeteria managers will include SFA size, school grade level, and CEP status. Qualitative analysis will be completed on a rolling basis. Interviews with cafeteria managers (Appendix C5) will offer additional insight on factors leading to meal claiming, counting, and reimbursable meal errors.

A random sample of 60 households will be contacted for the qualitative interviews (Appendix D13/D14): 30 households with application errors and 30 without application errors. Application errors will be identified by analyzing income and household size variables from two sources: (1) the application abstraction (Appendix B7); and (2) the household survey (Appendix D5/D6). Sampling will be stratified by the two groups (with and without errors) and conducted randomly on a rolling basis over the course of the year, and it will include only households with students in non-CEP schools.

Estimation Procedures

Weighting. The analysis of errors occurs at the student level for certification errors and the school level for meal claiming and aggregation errors. In addition, results from the SFA director survey will be used to derive nationally representative estimates of SFA and meal program characteristics. Thus, three sets of weights corresponding to the three levels of sampling and analysis (i.e., SFAs, schools, and students/households) will be constructed as described below. The construction of the required weights will be carried out sequentially starting with the SFA weights because the weights for each subsequent stage of analysis will build on the weights computed in the previous stage.

In general, sampling weights are required for analysis of complex survey data such as those obtained in APEC III to (a) reflect the probabilities of selection at the three stages of sampling (SFAs, schools, and students/households); (b) compensate for differential rates of nonresponse at the various stages of sampling; and (c) adjust for sampling variability and potential undercoverage through post stratification (calibration).

Because the weights to be constructed at each level are not independent from each other, we begin with a description of the methodology for constructing the SFA weights, followed by the school weights, and finally the student/household weights.

SFA Weights. An SFA-level weight is required for analysis of results from the SFA director survey such as those presented in Appendix D of the APEC II final report. Because the approach for selecting SFAs for APEC III differs from that used in APEC I and II, the procedures for weighting the SFA sample will be simplified. First, a base weight, W_hi, equal to the reciprocal of the probability of selection under the proposed sample design will be assigned to each sampled SFA i in sampling stratum h, i.e.,

W_hi = 1 / π_hi ,   (1)

where π_hi = the probability of selecting SFA i from stratum h. Note that the sampling strata denoted by h are defined by CEP status (SFAs with at least one CEP school versus those with no CEP schools), and π_hi is proportional to the estimated number of students eligible for free or reduced-price lunch within the stratum. The base weights defined above are statistically unbiased in the sense that the sum of the base weights, Σ_hi W_hi, summed across all of the sampled SFAs, provides an unbiased estimate of the number of SFAs in the country.

To the extent that any of the sampled SFAs do not participate in the study for any reason, the SFA base weights will be adjusted to compensate for the loss of eligible SFAs in the sample. The adjustment will be made within weighting cells in which the predicted propensity to respond to the survey is similar. We will use the Chi Square Automatic Interaction Detector (CHAID) to develop the weighting cells, using information available in the sampling frame for the sampled SFAs such as FNS region, type of control (public/private), enrollment size class, percentage of students eligible for free or reduced-price lunch (in categories), urbanicity (if available from CCD), grade span (if available), and possibly others. The CHAID algorithm provides an effective and efficient way of identifying the significant predictors of SFA nonresponse. The primary output from the CHAID analysis will be a set of K weighting cells (defined by a subset of the predictor variables entering into the analysis) with the property that the variation in expected response propensity across the weighting cells is maximized.

Within each of the K weighting cells determined by the CHAID analysis, an SFA-level adjustment factor, a_k, will be computed as

a_k = Σ_{i∈S_k} W_i / Σ_{i∈R_k} W_i ,   (2)

where W_i is the base weight from equation (1), the summation in the numerator extends over all of the sampled SFAs in weighting cell k (the set S_k), and the summation in the denominator extends over all of the responding SFAs in the weighting cell (the set R_k). The final (nonresponse-adjusted) weight, W*_i, for SFA i in weighting cell k to be used for analysis will then be computed as

W*_i = a_k · W_i .   (3)
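The following minimal sketch applies equations (1)–(3) to hypothetical data; the weighting cells are simply supplied here, whereas the study would derive them from the CHAID analysis.

```python
# Minimal sketch of equations (1)-(3) on hypothetical data. Weighting
# cells are supplied directly; the study derives them via CHAID.

def nonresponse_adjusted_weights(sfas):
    """sfas: list of dicts with 'prob' (selection probability), 'cell'
    (weighting cell id), and 'responded' (bool). Returns final weights
    for responding SFAs, keyed by position in the input list."""
    for s in sfas:
        s["base_w"] = 1.0 / s["prob"]                        # equation (1)
    factors = {}
    for k in {s["cell"] for s in sfas}:
        sampled = sum(s["base_w"] for s in sfas if s["cell"] == k)
        responding = sum(s["base_w"] for s in sfas
                         if s["cell"] == k and s["responded"])
        factors[k] = sampled / responding                    # equation (2)
    return {i: s["base_w"] * factors[s["cell"]]              # equation (3)
            for i, s in enumerate(sfas) if s["responded"]}

weights = nonresponse_adjusted_weights([
    {"prob": 0.010, "cell": "A", "responded": True},
    {"prob": 0.020, "cell": "A", "responded": False},
    {"prob": 0.005, "cell": "B", "responded": True},
])
print(weights)  # cell A's respondent carries the nonrespondent's share
```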


School Weights. Following procedures similar to those used in APEC I and II, we will construct nine separate sets of school-level weights. The first set of weights (referred to as the “general” school weights) will be used to develop national estimates of school-level characteristics and will also serve as the basis for constructing the student-level weights described in the next subsection. The remaining eight sets of school weights are specifically designed for analysis of the four types of non-certification error crossed by two meal types (lunch or breakfast).

General School Weights for Analysis of School Characteristics. To construct the general school weights, an initial weight, SW_ij, representing the overall probability of selecting the school for the sample will be computed for sampled school j in SFA i as

SW_ij = W*_i / π_{j|i} ,   (4)

where W*_i is the final SFA weight given by equation (3), and π_{j|i} = the (conditional) probability of selecting school j in SFA i in sampling stratum h. Note that if the school is so large that it is selected with certainty within the SFA, π_{j|i} = 1. Otherwise, π_{j|i} is proportional to the expected number of students eligible for free or reduced-price lunch in the school.

To compensate for nonresponding schools, a weighting adjustment similar to that described earlier for SFAs will be applied to the initial school weight given by equation (4). In this case, the types of school-level variables to be included in the CHAID analysis will include FNS region, CEP status of SFA, CEP status of school within SFA, type of control, and enrollment size of school (e.g., under 500, 500 to 999, and 1,000+). Based on the CHAID analysis, a set of K weighting cells will be specified. Let b_k denote the nonresponse adjustment factor for the kth (k = 1, 2, ..., K) weighting cell, defined by

b_k = Σ_{j∈S_k} SW_ij / Σ_{j∈R_k} SW_ij ,   (5)

where the summation in the numerator extends over all of the sampled schools in weighting cell k (the set S_k), whereas the summation in the denominator extends over all of the responding schools in the weighting cell (the set R_k). The final (nonresponse-adjusted) school weight, SW*_ij, for school j in SFA i in weighting cell k will then be computed as

SW*_ij = b_k · SW_ij .   (6)


Final Post Stratified School Weights. In APEC I and II, the nonresponse adjustment was accomplished indirectly through post stratification to "best estimates" of the numbers of study-eligible schools in SFAs. To implement such an adjustment, reliable independent estimates of the numbers of study-eligible schools are required to serve as control totals in post stratification. Assuming that such control totals are available from FNS administrative files (or can be estimated with high precision), let N_g = the known number of study-eligible schools in the population for poststratum g, where the poststrata are the four subgroups of schools defined by (1) private schools; (2) public schools with enrollment under 500; (3) public schools with enrollment between 500 and 999; and (4) public schools with enrollment of 1,000 or greater. Within poststratum g (g = 1, 2, ..., 4), a post stratification adjustment factor, f_g, will be computed as

f_g = N_g / Σ_{j∈R_g} SW*_ij ,   (7)

where the summation in the denominator extends over all of the responding schools in poststratum g (the set R_g). The final (poststratified) general school weight, SW**_ij, for school j in SFA i in poststratum g will then be computed as

SW**_ij = f_g · SW*_ij .   (8)


Special School Weights for Analysis of Non-Certification Errors. In addition to the general school weights, we will construct eight sets of special school weights for analysis of (a) four types of non-certification error: meal claiming error, POS aggregation error, school-to-SFA report aggregation error, and SFA-to-State-agency meal claim aggregation error, crossed by (b) two meal types (breakfast and lunch). The eight sets of weights to be constructed are indicated in Table B2-1, with more details provided in Appendix R (Special School Weights for Non-Certification Errors).

Table B2-1. School-level weights to be constructed for analysis of non-certification errors

Type of non-certification error | Lunch | Breakfast
Meal claiming error | ✓ | ✓
POS aggregation error | ✓ | ✓
School-to-SFA report aggregation error | ✓ | ✓
SFA-to-State meal claiming aggregation error | ✓ | ✓

Student (Household) Weights. Estimates of certification errors will be based on (a) responses to the household survey (Appendix D5/D6) in non-CEP schools; and (b) the results of the data abstraction audit of student-records in CEP schools (Appendix B3). The final student weights will include an adjustment for nonresponse and a post stratification adjustment to align the sample-based weighted estimates of dollar reimbursement amounts to the corresponding known population amounts available from FNS administrative files.

The starting point for constructing the required student weights is the assignment of an initial student weight to the responding students in the sample. For students selected from the non-CEP schools, the term "responding student" refers to the response status of the household in which the student resides, as it is information from the household survey that will be used to determine improper payments. For student records selected from CEP schools, improper payments will be determined directly from administrative records. The initial student weight, V_ijs, for student s in school j of SFA i will be computed as

V_ijs = SW**_ij / π_{s|ij} ,   (14)

where SW**_ij = the general school weight defined by equation (8) and π_{s|ij} is the (conditional) probability of selecting student s from school j in SFA i. The (within-school) probability of selecting a student for the sample will depend on whether the sampled school is a CEP or non-CEP school and, within the non-CEP schools, will also depend on the certification status of the student. The values of π_{s|ij} will be known at the time of sampling.

Next, a nonresponse adjustment will be applied to the initial weights given by equation (14) to compensate for sample losses due to incomplete student-level data. Note that for those student records in CEP schools that are selected for data abstraction, we expect no missing data (i.e., a 100 percent response rate). Thus, the nonresponse adjustment described here will apply only to students selected from the non-CEP schools. Weighting cells will be defined by type of control, CEP status of the SFA in which the school is located, and certification status of the student (i.e., approved for free or reduced-price lunch versus denied). These are broadly consistent with the variables used in APEC II to form nonresponse weighting cells; however, we will also consider other variables (e.g., derived from the household survey as appropriate). Within final weighting cell k, a nonresponse adjustment factor, c_k, will be computed as

c_k = Σ_{s∈S_k} V_s / Σ_{s∈R_k} V_s ,   (15)

where the summation in the numerator extends over all of the sampled students in weighting cell k (the set S_k), whereas the summation in the denominator extends over all of the "responding students" in the weighting cell (the set R_k; i.e., those for which the household provides sufficiently complete data for calculation of improper payments for the sampled student). The nonresponse-adjusted student weight, V*_ijs, for student s in (non-CEP) school j in SFA i in weighting cell k will then be computed as

V*_ijs = c_k · V_ijs .   (16)

Note that (16) applies only to students in non-CEP schools. In CEP schools,

V*_ijs = V_ijs .   (17)


The final step in student weighting will be to calibrate the nonresponse-adjusted weights given by (16) and (17) so that sample-based weighted estimates of total (annual) reimbursements equal the corresponding "known" amounts recorded in FNS data files. This adjustment will be made separately for NSLP and SBP and for CEP and non-CEP schools. Let T_cm = the total annual reimbursement amount derived from FNS administrative files for meal type m (breakfast or lunch) and type of school c (CEP versus non-CEP). The final calibrated weight for student s in school j in school type c and meal type m will be computed as

F_cmjs = V*_js · { T_cm / Σ_{j,s} V*_js · t_cmjs } ,   (17a)

where t_cmjs = the total annual reimbursement reported by responding student s in school j in school type c for meal type m, and the summation extends over all responding students in school type c for meal type m.
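A minimal sketch of the ratio calibration in equation (17a), using hypothetical figures:

```python
# Sketch of the ratio calibration in equation (17a): within a
# (school type, meal type) cell, scale the nonresponse-adjusted student
# weights so the weighted sum of reported reimbursements matches the
# control total from FNS administrative files. Figures are hypothetical.

def calibrate(weights, reported, control_total):
    weighted_sum = sum(w * r for w, r in zip(weights, reported))
    factor = control_total / weighted_sum        # T_cm / (sum of V* x t)
    return [w * factor for w in weights]

final_weights = calibrate(weights=[2500.0, 3100.0, 1900.0],
                          reported=[450.0, 520.0, 610.0],
                          control_total=4_000_000.0)
print(final_weights)
```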

Variance Estimation. In addition to the full sample weights described above, we will create and attach a series of replicate weights to each data record for variance estimation for study variables (e.g., error rates). Replication methods (e.g., jackknife procedures) provide a relatively simple and robust approach to estimating sampling variances for complex survey data. Although Taylor series methods can also be used to estimate variances, replication methods have the advantage of reflecting the statistical adjustments used in weighting, such as nonresponse adjustment and post stratification. Under the replication approach, we will form replicate weights by deleting selected cases from the full sample and adjusting the base weights of the retained cases accordingly. The entire weighting process developed for the full sample will then be applied separately to each replicate, resulting in a series of replicate weights. The replicate weights can be imported into variance estimation software (e.g., SAS) to calculate standard errors of the survey-based estimates and conduct significance tests. In addition to the replicate weights, we will provide stratum and unit codes in the data files to permit calculation of standard errors using Taylor series approximations if desired.
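For illustration, the sketch below implements a simplified delete-one jackknife of the general kind described; unlike the study's procedure, it does not repeat the nonresponse and post stratification adjustments within each replicate, and the units and estimator are hypothetical.

```python
# Simplified delete-one jackknife sketch. Each replicate drops one
# first-stage unit (SFA) and inflates the weights of the remaining units
# in the same stratum by n_h / (n_h - 1). In the study, the full
# weighting process would be re-run within each replicate.

from collections import defaultdict

def jackknife_variance(units, estimator):
    """units: list of (stratum, weight, data); estimator: function mapping
    a list of (weight, data) pairs to a point estimate."""
    full = estimator([(w, d) for _, w, d in units])
    by_stratum = defaultdict(list)
    for idx, (h, _, _) in enumerate(units):
        by_stratum[h].append(idx)
    var = 0.0
    for h, idxs in by_stratum.items():
        n_h = len(idxs)
        if n_h < 2:
            continue
        for drop in idxs:
            rep = [(w * n_h / (n_h - 1) if g == h else w, d)
                   for idx, (g, w, d) in enumerate(units) if idx != drop]
            var += (n_h - 1) / n_h * (estimator(rep) - full) ** 2
    return full, var ** 0.5   # point estimate and its standard error

# Example: weighted error rate across SFAs in two strata.
units = [("CEP", 10.0, 0.04), ("CEP", 12.0, 0.06), ("CEP", 9.0, 0.05),
         ("nonCEP", 20.0, 0.03), ("nonCEP", 18.0, 0.07), ("nonCEP", 22.0, 0.05)]
rate = lambda pairs: sum(w * d for w, d in pairs) / sum(w for w, _ in pairs)
print(jackknife_variance(units, rate))
```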

Degree of Accuracy (and Levels of Precision) Needed for the Purpose Described in this Justification

An important benefit of the sample size increases (compared to APEC II) is increased precision for study estimates. The precision goal for APEC III as required by IPERA is that 90 percent confidence bounds (or “margin of error”) for estimates of the improper payment rate (expressed as the ratio of the dollar amount in error to total reimbursements) be no more than ±2.5 percent.

Although key APEC II error rates generally met IPERA precision requirements, the margins of error achieved in the study were somewhat wider (less precise) than initially projected. The proposed sampling plan is expected to attain a more desirable level of precision for such estimates as a result of both the increased sample size and the inclusion of a substantial number of non-CEP schools in the CEP SFA sample. Increased precision will support more rigorous correlation analyses and improved precision of all point estimates. For instance, for estimates related to administrative certification errors and meal claiming errors, where information from sampled households is not required for estimation purposes, precision will be increased substantially. It is difficult to directly quantify gains in precision for SFAs and schools as they are sampled with probability proportionate to size rather than with equal probability, which adds to the variance due to the variability of sample weights. However, a sense of the improved precision can be obtained by considering equal probability samples of SFAs and schools.

For example, for the non-CEP sample at the SFA level, Table B2-2 indicates that for an equal probability sample of SFAs, the width of confidence intervals for estimated percentages of administrative errors would be reduced (i.e., improved) by about 20 percent compared to APEC II through the proposed increase in the SFA sample size. Because SFAs are sampled with probability proportionate to size, there will be a design effect associated with the variance of the weights, so the reduction will not be as substantial, but it will still be appreciable compared to what would have been obtained without the sample increase. For non-CEP schools, an equal probability sample in which the school sample size is increased by about 12 percent would result in a reduction of close to 10 percent in the width of confidence intervals for school-based estimated percentages related to meal claiming errors. Again, schools will not be selected with equal probability, so the reduction will not be as great as that indicated in Table B2-2, but it is nonetheless expected to be appreciable.


Table B2-2. Example of the effect of increased sample sizes on the confidence intervals of SFA- and school-level estimates

Sample size increase (non-CEP) | Estimated reduction in width of confidence intervals¹ | Type of error
54% more SFAs | ~20% | Administrative
18% more schools | ~10% | Meal claiming

¹ Assuming an equal probability sample of SFAs
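These figures follow from the 1/√n scaling of confidence-interval width for an equal probability sample: the proportional reduction from an f-fold relative increase in sample size is

1 − 1/√(1 + f) ,

so 54 percent more SFAs gives 1 − 1/√1.54 ≈ 0.19 (about 20 percent), and 18 percent more schools gives 1 − 1/√1.18 ≈ 0.08 (on the order of the ~10 percent shown above).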


The sampling precision of estimates of improper payment rates derived from the household survey and student application records will depend on the underlying standard deviation of the error rates (σ) among all applications in the population, as well as the design effects due to clustering (deff_c) and unequal weighting (deff_w). Based on estimates of standard errors reported in the APEC II final report, we have assumed that the values of σ will range from 0.25 to 0.75 for NSLP and from 0.10 to 0.70 for SBP, depending on whether the error rate being computed is an overpayment, underpayment, net error rate, or gross error rate. The design effect due to clustering is given approximately by the formula (e.g., see equation 2.23 of Skinner, Holt, and Smith, 1989):

deff_c = 1 + (m − 1)·n·ρ1 + (n − 1)·ρ2 ,

where m = the average number of sample schools per SFA, n = the average number of sample students (i.e., either households or student records) per school, ρ1 = the intraclass correlation within PSUs, and ρ2 = the intraclass correlation within schools. For sample planning purposes, we have assumed that ρ1 and ρ2 are both of the order of 0.02, which we believe is likely to be a conservative assumption. Finally, we assume an unequal weighting effect of deff_w = 1.40 for both NSLP and SBP, which is also a conservative estimate.
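As an illustration using assumed values (not final design parameters): with m = 2 sampled schools per SFA, n = 16 sampled students per school, and ρ1 = ρ2 = 0.02,

deff_c = 1 + (2 − 1)(16)(0.02) + (16 − 1)(0.02) = 1.62 ,

and combining with deff_w = 1.40 gives an overall design effect of about 1.62 × 1.40 ≈ 2.27; that is, the effective sample size is the nominal sample size divided by roughly 2.27.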

Based on the assumptions given above, estimates of the levels of precision to be expected under the proposed sample design are summarized in Table B2-3. For both NSLP and SBP, 90 percent confidence bounds around estimates of both gross and net error rates are expected to be within the ±2.5 percent IPERA requirement. Estimates of under- and over-payment error rates are similarly expected to meet the ±2.5 percent precision level.

Table B2-3. Expected 90% confidence bounds around error rates estimated for APEC III

(The three middle columns give sample sizes in households/student records.)

Program | Type of improper payment rate | Non-CEP schools | CEP schools | All schools | Standard error (SE)¹ | 90% conf. bounds¹
NSLP | Gross Improper Payment | 5,520 | 3,630 | 9,150 | 0.98% | ±1.61%
NSLP | Overpayment | 5,520 | 3,630 | 9,150 | 0.83% | ±1.36%
NSLP | Underpayment | 5,520 | 3,630 | 9,150 | 0.45% | ±0.74%
NSLP | Net Improper Payment | 5,520 | 3,630 | 9,150 | 0.91% | ±1.49%
SBP² | Gross Improper Payment | | | | |
SBP² | Overpayment | 2,484 | 1,634 | 4,118 | 1.23% | ±2.03%
SBP² | Underpayment | 2,484 | 1,634 | 4,118 | 0.99% | ±1.62%
SBP² | Net Improper Payment | 2,484 | 1,634 | 4,118 | 0.57% | ±0.94%

¹ See text for assumptions used to calculate standard errors and confidence bounds.

² Assumes that over 90 percent of NSLP schools also participate in SBP and that about 50 percent of NSLP students also participate in SBP (29% average SBP participation rate divided by 58% average NSLP participation rate = 50%). See Table D.5 of Appendix D of the APEC II revised draft report, which indicates that the average NSLP participation rate is 58 percent and the average SBP participation rate is 29 percent.


Nonresponse Bias Analysis. APEC III will make every effort to achieve as high a response rate as practicable with the available resources; however, nontrivial response losses can occur. As specified in the Standards and Guidelines for Statistical Surveys published by the Office of Management and Budget, a nonresponse bias analysis (NRBA) is required if the overall unit response rate for a survey is less than 80 percent. Based on the experience of APEC II, this is unlikely to be an issue for the data collection from SFAs or schools but may be an issue for the household survey with parents/guardians. For APEC III, the overall household survey response rate is the product of the response rates at each of the following three stages of sampling: SFAs, schools, and households. In general, compensation for nonresponse in sample surveys is handled by weight adjustments at each stage of selection.

The purpose of the NRBA is to assess the impact of nonresponse on the survey estimates and the effectiveness of the weight adjustments to dampen potential nonresponse biases. The types of analyses to be conducted to evaluate possible nonresponse biases will include the following:

  • Comparing characteristics of nonrespondents (or the total sample) to those of respondents using information available for both nonrespondents and respondents;

  • Modeling response propensity using multivariate analyses (see the sketch following this list);

  • Evaluating differences between unadjusted (i.e., base-) weighted estimates of selected sampling characteristics and the corresponding population (frame) parameter; and

  • Comparing unadjusted (base) weights (for the estimates of error/improper payments) with nonresponse-adjusted weights (for these estimates).
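As a hedged sketch of the response-propensity modeling mentioned above, one could fit a logistic regression of response status on variables available for both respondents and nonrespondents; the data and variable choices below are synthetic placeholders, not study specifications.

```python
# Hedged sketch of the response-propensity modeling step of the NRBA.
# All data and variable choices here are synthetic placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(0, 2, n),         # type of control (0 = public, 1 = private)
    rng.uniform(0, 1, n),          # pct eligible for free/reduced-price meals
    rng.integers(0, 2, n),         # CEP SFA indicator
])
responded = rng.integers(0, 2, n)  # observed response status (0/1)

model = LogisticRegression().fit(X, responded)
propensity = model.predict_proba(X)[:, 1]  # estimated response propensities
# Systematic propensity differences across frame subgroups would flag
# potential nonresponse bias and inform the weighting-cell adjustments.
```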

Unusual Problems Requiring Specialized Sampling Procedures

There are no unusual problems that require specialized sampling procedures for the sampling of SFAs, schools, and/or households. However, it is important to note that the largest portion of the household survey sample will be drawn from applications submitted between August and September 2017. As noted, two more rounds of smaller household sample selection will also be conducted to ensure that applications submitted after September are included in sampling. This is to ensure that household survey data are representative of applications submitted throughout the entire school year (see Appendix S Data Collection Summary for more details).

Any Use of Periodic (Less Frequent Than Annual) Data Collection Cycles to Reduce Burden

The data collection effort, with multiple visits to SFAs and schools, is planned to be done one time only during the 2017-2018 school year.

The data collection cycle consists of recruitment and data collection activities. Appendix S provides a summary of data collection procedures.

Recruitment Procedures

Recruitment activities include recruiting households to participate in the household survey component of the study. Household recruitment will include a recruitment packet sent via mail and a follow-up recruitment call from a data collector. Once a parent or guardian agrees to participate and has a scheduled appointment, a reminder letter, as well as the income worksheet (see Appendices D1–D4), will be sent via mail or email (if provided). Recruitment materials are found in Appendices T1–T8. A summary of the follow-up procedures for household survey recruitment is included in the Data Collection Summary (Appendix S).

Additional recruitment activities include recruiting a sub-sample of 60 SFA directors and a sub-sample of 60 households for an in-depth interview by phone. These will be a sub-sample of individuals who completed the SFA director survey or household survey, respectively. At the end of the household survey, respondents will be notified that they may be contacted for this follow-up interview.

Data Collection Procedures

Appendix S provides a summary of data collection activities and how the information will be used. Appendices B (B1–B14), C (C1–C6), and D (D1–D16) include the data collection forms for SFA/State, school, and household data collection, respectively. Data collection will include (a) phone pre-visit interviews with SFAs and schools; (b) in-person abstraction from income eligibility applications (or categorical eligibility/direct certification records) from SFA records; (c) electronic review of identified student records for CEP schools; (d) in-person/electronic abstraction of meal count and claiming records from States, SFAs, and schools; (e) meal observations; (f) in-person cafeteria manager interviews; (g) in-person household surveys; (h) a web-based SFA director survey; (i) an SFA director in-depth interview by phone (sub-sample); (j) a household in-depth interview by phone (sub-sample); and (k) electronic review of administrative data on meal participation (data on the number of meals served and claimed for sampled students).

Sampling of households and household surveys will take place three times during the study year to ensure coverage of applications submitted at different times during the year. At the appointed time, data collectors will travel to sampled households to conduct the in-person household survey as a computer assisted personal interview (CAPI) with each participant. The household in-depth interviews will be conducted via phone with a subset of 60 parents/guardians who completed the household survey and will include questions on general experience with the application process to identify any areas that were confusing to the respondent, whether the respondent used a paper or web-based application, and whether any school staff or other knowledgeable individuals were available to answer questions. The in-person household survey and household in-depth interviews will be conducted during phases 1, 2, and 3 of the data collection schedule.

Beginning in phase 2, SFAs and schools will be contacted by phone to complete the pre-visit interviews (which obtain information to prepare for the data collection visits) and to schedule the data collection visits. During phases 2 and 3 of data collection, SFA directors will also be asked to complete the web-based SFA Director Survey (Appendix B11), which will provide relevant SFA characteristics for later analyses and comparisons. An in-depth SFA Director Interview (Appendix B12) will be conducted with a subset of 60 SFA directors to garner a better understanding of how SFA policies, procedures, and characteristics affect errors, in addition to actions that would be most effective in reducing errors. Lastly, SFA meal participation data will be requested via the SFA Meal Participation Data Request (Appendix B13) for direct, electronic submission.

Finally, during phase 3 of data collection, all State-level data will be requested directly from the State via the State Meal Claim Data Request (Appendix B14) for direct, electronic submission. All data will be received and stored in a secure, password protected computer-based system.

SFA and school data collection will be conducted by trained field data collectors. Field data collectors will be recruited and hired from regions local to the sampled SFAs and schools. Training will include online home-study tutorials, in-person training, and home-based post-training exercises. All data collectors will be certified to conduct data collection.

The SFA Director Interview (Appendix B12) and household in-depth interview (Appendix D13/D14) are phone interviews that will be conducted by Westat researchers with experience in qualitative research methods. In addition, the interviewers will be trained on project objectives and the interview protocols. The interviewer will contact the SFA Director by email and/or phone to schedule the SFA Director Interview at a time that is convenient for him/her. At the end of the household survey (Appendix D5/D6), the interviewer will ask the respondent if he/she is willing to participate in a follow up telephone interview should they be randomly selected. Appendix S provides more details on the procedures for the Household In-Depth Interview.

3. Methods to Maximize the Response Rates and to Deal With Nonresponse

Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

As reported in Table B1-1, a minimum 80 percent response rate is anticipated from SFAs and schools and a minimum 75 percent response rate from parents/guardians for the household survey. We anticipate that the study will exceed these minimums among SFAs and schools because (1) response rates in APEC II were high; (2) participation in this study is part of Healthy, Hunger-Free Kids Act (HHFKA) requirements (Appendix A9); and (3) the schools will be directed and encouraged to participate by their SFAs. Additional steps to maximize response rates among SFAs and schools include working with the State child nutrition director and FNS regional liaisons to provide study notification and endorsement, encourage participation, and address any questions or concerns. Currently, 96 percent of the SFAs have already confirmed participation. As a result, the participation rate among eligible schools is expected to be comparable, since the SFA directors have agreed to facilitate the participation of their sampled schools when data collection begins.

For the household survey, we expect a minimum response rate of 75 percent among the sampled student households. Households will be provided with advance notice of, and guidance for, the income documentation that will be requested during the survey. If response rates fall below 75 percent, a nonresponse bias analysis will be conducted as described in the response to question B2.
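As a purely illustrative sketch of one basic check such an analysis can include (this is not the analysis plan specified in question B2, and the field names and values below are hypothetical), respondent households can be compared with the full sample on characteristics known from the sampling frame:

    def mean(values):
        vals = list(values)
        return sum(vals) / len(vals)

    def nonresponse_gap(cases, trait):
        """Respondent mean minus full-sample mean for a sampling-frame
        characteristic; a large gap flags potential nonresponse bias."""
        full = mean(case[trait] for case in cases)
        resp = mean(case[trait] for case in cases if case["responded"])
        return resp - full

    # Hypothetical sampled households with one frame characteristic
    cases = [
        {"responded": True,  "school_enrollment": 450},
        {"responded": True,  "school_enrollment": 510},
        {"responded": False, "school_enrollment": 820},
    ]
    print(nonresponse_gap(cases, "school_enrollment"))  # approximately -113.3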

Several steps will be taken to maximize response rates among sampled households, including the following:

  • Inclusion in the recruitment packet for sampled student households of information about the importance of the study, privacy protections, and incentives, along with the assurance that meal benefits will not be affected by study participation;

  • Use of qualified, trained, and experienced individuals to conduct recruitment and household interviews;

  • Use of experienced senior-level staff to recruit reluctant parents/guardians;

  • Use of data collection methods for the household survey that shift the burden to the data collector (i.e., computer assisted personal interview), minimizing data entry and writing for survey respondents; and

  • Use of a modest incentive payment to parents/guardians who complete the household survey and an additional incentive to those who are sampled for and complete the household in-depth interview.

Table A2-1 (see Question 2 of Supporting Statement Part A) provides a summary of the data collection methods, the types of respondents, and what will be collected. As shown in Table A2-1, most of the data collection from SFAs and schools requires access to existing records. As described in Question 3 of Part A, the burden of data collection in SFAs and schools shifts heavily to the field data collector once access to the records is provided. For the household survey component, the electronic collection of data using computer-assisted personal interviewing (CAPI) will be highlighted during recruitment to ease respondents' concerns about the burden of completing the survey. Together, these efforts will also help maximize response rates.

4. Test of Procedures or Methods to be Undertaken

Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

The household survey instrument is comparable to the survey instrument that OMB approved for the APEC II study in 2012 (Approval #0584-0530, NSLP/SBP Access, Participation, Eligibility, and Certification Study; discontinued 08/31/2015). Cognitive pretests of survey instruments were conducted in March and April 2016. These included the household survey, household in-depth interview, SFA director survey, SFA director in-depth interview, and cafeteria manager in-depth interview. The findings and recommendations were submitted as a deliverable to FNS and incorporated into the final study instruments (see Appendix U, APEC III Cognitive Pretest Findings Report).

Data collection instruments and procedures for records abstraction and meal observations were all developed from previously approved APEC II procedures, with minor modifications. It is important to note that only trained data collectors will use these data collection forms; school staff will not be asked to complete them.

5. Individuals Consulted on Statistical Aspects of the Design & Individuals Collecting and/or Analyzing Data

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Table B5-1 presents a summary of staff consulted on statistical aspects of the design. These staff will be responsible for the collection and analysis of the study’s data.

Table B5-1. Individuals consulted on data collection or analysis


Name                             Title                                  Phone number

Westat Staff (contractor)
Roline Milfort, Ph.D., PMP       Senior Study Director                  301-251-8229
Laurie May, Ph.D.                Vice President, Associate Director     301-517-4076
Mustafa Karakus, Ph.D.           Senior Economist                       301-294-2874
Adam Chu, Ph.D.                  Senior Statistician                    301-251-4326
Robert Fay, Ph.D.                Senior Statistician                    240-314-2318
Roger Tourangeau, Ph.D.          Senior Statistician                    301-294-2828
David Cantor, Ph.D.              Survey Methodologist                   301-294-2080
Melissa Rothstein, Ph.D.         Independent Technical Advisor          703-346-4484

Subcontractor/Consultants
Ted Macaluso, Ph.D.              President, Ted Macaluso, LLC           571-214-9658

FNS Staff
Devin Wallace-Williams, Ph.D.    Social Science Research Analyst        703-457-6791
John Endahl, Ph.D.               Senior Program Analyst                 703-305-2127

NASS Staff
Jennifer Rhorer                  Mathematical Statistician              202-720-2616


1 The household survey component of the study is referred to as National School Meals Study (NSMS).

2 The first stage of sampling for APEC III was at the SFA level. The final sample resulted in SFAs from only 46 States.

3 See Part A, Question 15.

4 The response rate was 96 percent whereas the minimum expected response rate was 81 percent. However, to remain within burden estimates, some of the sampled SFAs will be released.

5 The data sources were: a) FY15 FNS-742 SFA File (version dated 2-22-16) and b) CEP SY15-16 National Elections Data-September 2015 (version dated 3-25-16).

6 In CEP schools, all students receive free meals, and households do not submit an application for free or reduced-price meals.

7 The ISP is the proportion of identified students (students directly certified for free school meals through means other than an application, who are therefore not subject to verification) out of the total enrolled students. The ISP is multiplied by a factor of 1.6 to determine the total percentage of meals at the school or district that will be reimbursed at the Federal “free” rate. The 1.6 multiplier takes into account the provision that if the ISP is 62.5 percent or higher, then 100 percent of meals are claimed at the “free” rate (62.5% × 1.6 = 100%).
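Expressed as a formula (a restatement of the rule in this footnote, not an additional policy provision):

    \text{free claiming percentage} = \min\left(1.6 \times \text{ISP},\ 100\%\right)

For example, an ISP of 40 percent yields 1.6 × 40% = 64% of meals claimed at the free rate, while any ISP of 62.5 percent or above yields 100 percent.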

8 The sampled school counts were approximated when the sampling plan was developed. The actual number of sampled schools is slightly fewer due to updated school data received during the sample frame development. Specifically, 777 schools (instead of 782) were sampled, of which 547 (instead of 548) were non-CEP and 230 (instead of 234) were CEP.

9 This is an FNS requirement for APEC III because the prior study had a very low response rate for income documentation.

10 If respondents complete the household survey but do not provide income documentation, they will receive the $30 incentive but not the additional $20 incentive for income documentation.

11 In CEP schools, all students receive free meals, and households do not submit an application for free or reduced-price meals.

12 In the case of large SFAs like New York City and Los Angeles, the fraction of the 70 non-CEP schools to be selected from them will be proportional to their sizes, so it will be possible for more than two non-CEP schools to be sampled from the large (certainty) SFAs. The likelihood of sampling more than two non-CEP schools in the certainty CEP SFAs is enhanced by the fact that about 50 percent of the sampled CEP SFAs will contain only CEP schools (this percentage is based on preliminary tabulations of the FNS-742 sampling frame).
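As a rough illustration of this proportional allocation (the SFA names and enrollment figures below are hypothetical, and largest-remainder rounding is only one reasonable rounding rule, not necessarily the one the study will use):

    def allocate_schools(enrollments, total):
        """Allocate a fixed number of school selections across certainty
        SFAs in proportion to enrollment, with largest-remainder rounding."""
        grand = sum(enrollments.values())
        raw = {sfa: total * size / grand for sfa, size in enrollments.items()}
        alloc = {sfa: int(share) for sfa, share in raw.items()}
        leftover = total - sum(alloc.values())
        # Give the remaining selections to the largest fractional shares
        for sfa in sorted(raw, key=lambda s: raw[s] - alloc[s], reverse=True)[:leftover]:
            alloc[sfa] += 1
        return alloc

    # Hypothetical certainty SFAs sharing 10 of the 70 non-CEP selections
    print(allocate_schools({"New York City": 950000, "Los Angeles": 600000,
                            "Chicago": 350000}, 10))
    # {'New York City': 5, 'Los Angeles': 3, 'Chicago': 2}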

13 In order to meet the targeted number of completes summarized in Table B1-1, 11 household surveys must be completed in each of the 438 non-CEP schools (353 from non-CEP SFAs plus 85 from CEP SFAs): 438 × 11 = 4,818.

14 Sorting students by family prior to sample selection and selecting a systematic random sample of students from the sorted list will minimize the number of students selected from any one household. We can ensure that no more than one student per household is selected as long as the maximum number of students in a household is less than the systematic sampling interval (for instance, unless over half of the students in a school are to be selected, no households with two students in the school will have more than one selected for the study).
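As a minimal sketch of this selection step (illustrative only; the roster and identifiers are hypothetical, not the study's production sampling code):

    import random

    def systematic_sample(students, n):
        """Select a systematic random sample of n students from a list of
        (family_id, student_id) tuples. Sorting by family_id places
        siblings adjacently, so a sampling interval larger than the
        largest family size selects at most one student per household."""
        ordered = sorted(students)            # sorts by family_id first
        interval = len(ordered) / n           # fractional sampling interval
        start = random.uniform(0, interval)   # random starting point
        return [ordered[min(int(start + i * interval), len(ordered) - 1)]
                for i in range(n)]

    # Hypothetical roster: 250 families with one or two enrolled students
    roster = [("fam%03d" % f, "stu%03d-%d" % (f, k))
              for f in range(250) for k in range(1 + f % 2)]
    print(systematic_sample(roster, 11))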

15 In order to meet the targeted number of completes, approximately 24 records are required from each of the 187 CEP schools (187 × 24 = 4,488).

16 In CEP schools, households do not submit an application for free or reduced-price meals.

17 U.S. Department of Agriculture, Food and Nutrition Service, Office of Policy Support, Program Error in the National School Lunch Program and School Breakfast Program: Findings from the Second Access, Participation, Eligibility and Certification Study (APEC II) Volume 1: Findings by Quinn Moore, Judith Cannon, Dallas Dotter, Esa Eslami, John Hall, Joanne Lee, Alicia Leonard, Nora Paxton, Michael Ponza, Emily Weaver, Eric Zeidman, Mustafa Karakus, Roline Milfort. Project Officer Joseph F. Robare. Alexandria, VA: May 2015.

18 Skinner, C. J., Holt, D., and Smith, T. M. F. (Eds.). (1989). Analysis of Complex Surveys. New York: John Wiley and Sons.

19 Office of Management and Budget (September 2006). Standards and Guidelines for Statistical Surveys. Source: http://www.whitehouse.gov/sites/default/files/omb/inforeg/statpolicy/standards_stat_surveys.pdf

20 We assume and expect a 100 percent response rate at the State level.
