Supporting Statement for OMB Clearance for the School Meals Operations Study: Evaluation of the COVID-19 Child Nutrition Waivers and Child Nutrition Programs
Part B
Revision to OMB # 0584-0607, School Meals Operations Study (SMO)
Holly Figueroa
Social Science Research Analyst
Office of Policy Support
Food and Nutrition Service
United States Department of Agriculture
1320 Braddock Place
Alexandria, Virginia 22314
Phone: 703-305-2105
Email: [email protected]
TABLE OF CONTENTS
B1. Respondent Universe and Sampling Methods
B2. Procedures for the Collection of Information
B3. Methods to Maximize the Response Rates and to Deal with Nonresponse
B4. Test of Procedures or Methods to be Undertaken
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
TABLES
Table B1.1. SFA sample allocation by stratum
Table B1.2. Summary of Respondent Universe and Expected and Prior Response Rates
Table B2.2. SFA sample size for SY 2021-2022 (selected, released, completes, noncompletes, and reserves)
Table B2.3. Sample size estimates and expected margins of error for SFA sample and key subgroups
Table B4.1. Changes to State CN Director Survey based on respondent feedback
Table B4.2. Changes to SFA Director Survey based on respondent feedback
Table B5.1. Individuals consulted on statistical aspects of study design
APPENDICES
A Section 2202 of the Families First Coronavirus Response Act (FFCRA)
B Research Questions and Waiver List
C Section 28 of the Richard B. Russell National School Lunch Act and Section 305 of the Healthy, Hunger-Free Kids Act of 2010
D.1 State Agency Child Nutrition Director Survey: Fall 2021
D.2 Screenshots of State Agency Child Nutrition Director Survey: Fall 2021
D.3 State Agency Child Nutrition Director Survey: Summer 2022
D.4 Screenshots of State Agency Child Nutrition Director Survey: Summer 2022
E.1 FNS-10 Administrative Data Request for FY 2021 and FY 2022
E.2 FNS-418 Administrative Data Request for FY 2021 and FY 2022
E.3 FNS-44 Administrative Data Request for FY 2021 and FY 2022
F.1 School Food Authority Director Survey 2021-2022
F.2 Screenshots of School Food Authority Director Survey 2021-2022
G.1 Notification from USDA FNS to Regional Offices
G.2 Study support email (from FNS RO to SA)
G.3 SA Advance emails
G.4 Brochure
G.5 Telephone meeting advance email and call script
G.6 SA Invitation email
G.7 Reminder email
G.8 Telephone reminder script
G.9 SA Last chance post card
G.10 Study support email (from SA to SFA)
G.11 SFA Advance letter
G.12 SFA Invitation email
G.13 SFA Last chance post card
H.1 Public comment 1
H.2 Public comment 2
H.3 Public comment 3
H.4 Public comment 4
H.5 Public comment 5
H.6 Public comment 6
H.7 Public comment 7
H.8 Public comment 8
H.9 Public comment 9
H.10 Response to Public Comment 1
H.11 Response to Public Comment 2
H.12 Response to Public Comments 3 and 4
H.13 Response to Public Comment 5
H.14 Response to Public Comment 6
H.15 Response to Public Comment 8
H.16 Response to Public Comment 9
I.1 National Agricultural Statistics Service Comments
I.2 FNS Response to National Agricultural Statistics Service Comments
J Confidentiality Pledge
K Estimated Annualized Burden
L FNS-742 School Food Authority Verification Collection Report
M SMO Option Period 1 Pre-test Findings Memo
B1. Respondent Universe and Sampling Methods

Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
Three web-based surveys will be conducted, two with State Child Nutrition (CN) directors and a third with school food authority (SFA) directors. The respondent universe for the State Agency Child Nutrition Director Surveys (Appendices D.1-D.4) includes the 67 State CN directors who oversee the National School Lunch Program (NSLP), the School Breakfast Program (SBP), the NSLP Seamless Summer Option (SSO), the Summer Food Service Program (SFSP), and the Child and Adult Care Food Program (CACFP) in the 50 States, the District of Columbia, Guam, Puerto Rico, and the U.S. Virgin Islands.
The respondent universe for the School Food Authority Director Survey (Appendices F.1 and F.2) includes all SFAs operating in public and private school districts (including charters) in the United States and outlying territories that were required to submit form FNS-742, School Food Authority Verification Collection Report (OMB number 0584-0594, Food Programs Reporting System (FPRS), expiration date 07/31/2023) (Appendix L), to FNS in school year (SY) 2020-2021.[1] In general, all SFAs that participated in the NSLP or the SBP are included in the respondent universe except SFAs associated with Federally administered schools.
The sampling frame is based on the latest available FNS-742 file, supplemented with school district-level characteristics from the U.S. Department of Education’s National Center for Education Statistics (NCES) and district-level estimates of school-age children in poverty from the U.S. Census Bureau’s Small Area Income and Poverty Estimates file. This will result in a respondent universe of approximately 18,830 SFAs. Table B1.1 summarizes the distribution of eligible SFAs in the sampling frame by sampling stratum, which is based on SFA enrollment size and poverty status (defined as the percentage of students in the SFA that are eligible for free or reduced-price meals).
Table B1.1. SFA sample allocation by stratum
| Stratum | SFA size (student enrollment) | Poverty level (percentage approved for F/RP meals) | Total populationᵃ | Augmented sample selected | Initially released sample | Reserve sample | Expected survey completesᵇ |
|---|---|---|---|---|---|---|---|
| 1 | 1–499 | <60% | 5,342 | 237 | 191 | 46 | 153 |
| 2 | 1–499 | 60%+ | 3,092 | 221 | 179 | 42 | 143 |
| 3 | 500–2,499 | <60% | 4,525 | 414 | 335 | 79 | 268 |
| 4 | 500–2,499 | 60%+ | 2,094 | 197 | 159 | 38 | 127 |
| 5 | 2,500–4,999 | <60% | 1,290 | 178 | 144 | 34 | 115 |
| 6 | 2,500–4,999 | 60%+ | 561 | 54 | 44 | 10 | 35 |
| 7 | 5,000–99,999 | <60% | 1,361 | 178 | 144 | 34 | 115 |
| 8 | 5,000–99,999 | 60%+ | 538 | 53 | 43 | 10 | 34 |
| 9 | 100,000+ | -- | 27 | 27 | 27 | 0 | 22 |
| Total | | | 18,830 | 1,559 | 1,266 | 293 | 1,012 |
ᵃ The population numbers are based on a sampling frame file that combines FNS-742 and NCES data from school year 2017–2018, the last time an SFA survey was conducted under OMB number 0584-0607. An updated frame will be used when designing the sample for the school year 2021-2022 survey.
ᵇ The number of expected survey completes assumes an 80 percent response rate from the initially released sample. Additional SFAs in the augmented sample will be held in reserve as backups, randomly ordered within each stratum and released as needed to achieve the stratum’s target number of completes.
F/RP = free or reduced price.
The study team plans to sample SFAs with equal probability within each stratum. The targeted sample size will be allocated across strata disproportionately (relative to their population proportions) to allow for a sufficient number of SFAs in policy-relevant subgroups. SFAs that pretest the survey will be excluded from sample selection. A census will be taken of SFAs with 100,000 or more enrolled students (“certainty SFAs”), and enough other SFAs will be sampled to obtain 12 to 300 completed SFA surveys per stratum. The study team will select a sample large enough to allow for lower-than-anticipated response rates (an augmented sample). From this sample, the team will randomly subsample a first release that assumes an 80 percent response rate, with all remaining sampled SFAs held in reserve as backup SFAs. In each stratum, backup SFAs will be randomly ordered and released as needed to achieve the stratum’s target number of completes. The final sampling weights will then be calculated as the product of the initial sampling weight (the inverse of the probability of selection for the augmented sample) and the release adjustment (the number of cases selected for the augmented sample in each stratum divided by the number of released cases in that stratum).
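To make the weight computation concrete, the short sketch below (an illustration only, not the study’s production code) works through the two factors just described using the stratum 1 counts from Table B1.1, under the assumption that only the initially released cases are fielded.

```python
# Worked example of the final SFA sampling weight for stratum 1 of
# Table B1.1, assuming no backup (reserve) cases are released.
N_h = 5342         # stratum 1 population of SFAs
n_augmented = 237  # augmented sample selected in stratum 1
n_released = 191   # cases released (initial release only)

initial_weight = N_h / n_augmented             # inverse probability of selection
release_adjustment = n_augmented / n_released  # selected / released

final_sampling_weight = initial_weight * release_adjustment
print(round(final_sampling_weight, 2))  # 27.97: each released SFA represents
                                        # about 28 SFAs on the frame
```

The two factors collapse to the stratum population divided by the number of released cases, so releasing reserve cases simply lowers each released SFA’s weight within that stratum.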
To streamline survey data collection, the team also plans to collect disaggregated administrative data from the 67 State agency CN directors; these data are currently reported to FNS only in aggregate on forms FNS-10, Report of School Program Operations; FNS-418, Report of the Summer Food Service Program for Children; and FNS-44, Report of the Child and Adult Care Food Program (all approved under OMB number 0584-0594, Food Programs Reporting System (FPRS), expiration date 07/31/2023). No sampling or weighting is required for the state-level collections because the study will include a census of all 67 SAs. Given a universe of only 67 eligible States and territories, each with relatively distinct characteristics, no sample design could efficiently yield the comprehensive data on State policies and student meal service that a census will provide for understanding variation and localized concerns.
The nationally representative sample of SFAs (including charters) is expected to yield 1,012 completed web surveys, reflecting an expected response rate of 80 percent. This sample size balances the need for precise estimates with the goal of minimizing burden on SFA directors.
Achieving high response rates on the SA and SFA surveys and the administrative data request is critical to the study’s success. The Child Nutrition Program Operations Study II (CN-OPS II) (OMB number 0584-0607, expiration date 08/31/2022), the predecessor to the SMO Study, had survey response rates of 100 percent for the State CN survey and 78 percent for the SFA survey, averaged across three data collection years. In addition, the SMO Study had a response rate of 100 percent for the State CN survey and administrative data collection during the first year of data collection in SY 2020-2021. States are aware that their participation in this collection (both the survey and administrative data collection components) is mandatory, and they are already preparing to participate in the next two data collections. Therefore, the team anticipates again reaching 100 percent completion with the 67 State agency respondents for the web survey and the administrative data collection. With the SFA survey, while SFA directors will likely still be adjusting program operations due to the COVID-19 pandemic during SY 2021-2022, we expect that they will also be interested in reporting to FNS on the effects of the pandemic on their program operations and finances. This survey will be their primary opportunity to report directly to FNS on their unique pandemic experiences, and study recruitment materials will convey the importance of reporting this information to inform future program policy and assistance efforts. Further, the administrative data collection was added to the current study in part to minimize the burden of survey completion for SFAs, in contrast to previous versions of the study; SFAs will not be asked for information in the survey that is available through the administrative data. Therefore, while SFAs will be informed that their participation in the survey is mandatory, based on past experience with similar surveys, the unique situation brought on by the COVID-19 pandemic, and efforts to reduce burden, we anticipate reaching an 80 percent response rate for the SFA web survey.
Table B1.2. Summary of Respondent Universe and Expected and Prior Response Rates
| Respondent | Universe | Target completed cases | Expected response rate | CN-OPS II response rate, Year 1 | CN-OPS II response rate, Year 2 | CN-OPS II response rate, Year 3 | SMO response rate, SY 2020-2021** |
|---|---|---|---|---|---|---|---|
| State CN agencies* | 67 | 67 | 100% | 100% | 100% | 100% | 100% |
| School food authorities | 18,830 | 1,012 | 80% | 82% | 77% | 76% | NA |
| Total | 18,897 | 1,079 | 81% | 83% | 78% | 77% | 100% |
* The universe for CN-OPS II only included the 55 State CN agencies that oversee the NSLP and the SBP. Due to the unanticipated school closures related to COVID-19, other CN programs, including the SFSP and the CACFP, were used to provide meals to children beginning in March 2020. Thus, the State agencies that oversee the SFSP and the CACFP were added to the SY 2020-2021 universe, increasing the universe of State CN agencies to 67.
** Due to the COVID-19 pandemic and its resulting impacts on school districts nationwide, the SY 2020-2021 collection only included the state-level survey and administrative data collection (the SFA director survey was cancelled for SY 2020-2021).
The approach to achieving high response rates builds on prior FNS study experience. The team developed engaging recruiting materials, listed in Appendices G.1-G.13, to describe the study, including a variety of appeals to encourage participation. FNS headquarters will send a notification to FNS Regional Offices (Appendix G.1) to introduce the SMO Study data collection activities for SYs 2021-2022 and 2022-2023 and ask FNS Regional Office staff to inform SAs about the upcoming data collections. Before each state-level survey, all FNS Regional Offices will send the Study Support Email from FNS RO to SA (Appendix G.2) to build support for the study among States. The study team will also provide information to SAs on the various study components prior to each data collection using the SA Advance Email (Appendix G.3), the study brochure (Appendix G.4), and the Telephone Meeting Advance Email and Call Script (Appendix G.5). All State CN directors will be asked to send the Study Support Email from SA to SFA (Appendix G.10) to build study support among sampled SFAs. Prior to each survey, State CN directors will be sent the SA Invitation Email (Appendix G.6) and SFA directors will be sent the SFA Director Invitation Email (Appendix G.12) to request that each respondent use the unique enclosed link to access and complete the survey. To encourage survey completion, State CN directors will receive the Reminder Email (Appendix G.7) every 2-3 weeks (two reminder emails total per survey), a reminder call at 7 weeks using the Telephone Reminder Script (Appendix G.8), and the SA Last Chance Post Card (Appendix G.9) until the target of 67 respondents is reached for each survey. Similarly, SFA directors who have not yet completed the web survey will receive the Reminder Email (Appendix G.7) every 2-3 weeks (two reminder emails total), a reminder call at 7 weeks using the Telephone Reminder Script (Appendix G.8), and the SFA Last Chance Post Card (Appendix G.13) until the target of 1,012 respondents is reached.
A professionally trained survey support specialist will be available to assist respondents by phone or email during business hours. In addition, the web survey will allow respondents to save and exit at any point and return later to complete the survey.
The study team will calculate response rates using industry standards from the American Association of Public Opinion Research. Depending on item-completion patterns, the study team will classify partially completed surveys as either sufficiently completed to treat as a respondent, or insufficiently completed to treat as a nonrespondent. Response rates will be presented both unweighted and weighted by the release-adjusted sampling weights, and both overall and by key subgroups.
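As a minimal illustration of these tabulations (with made-up values, not study data, and without the detailed AAPOR disposition categories), the sketch below computes an unweighted and a weighted response rate from a file of released cases:

```python
# Hypothetical released-case records: (responded?, release-adjusted weight).
# "Responded" includes sufficiently completed partials, per the
# classification described above.
released = [(True, 27.97), (False, 27.97), (True, 13.99), (True, 11.63)]

unweighted_rr = sum(1 for r, _ in released if r) / len(released)
weighted_rr = sum(w for r, w in released if r) / sum(w for _, w in released)
print(f"Unweighted: {unweighted_rr:.0%}; weighted: {weighted_rr:.0%}")
```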
B2. Procedures for the Collection of Information

Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection;
Estimation procedure;
Degree of accuracy needed for the purpose described in the justification;
Unusual problems requiring specialized sampling procedures; and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
As detailed in Part A, Question 2, information will be collected via web surveys of State CN directors and SFA directors and state-level administrative data collections. The respondents will have 10 weeks to complete each survey and provide the administrative data, which allows time to plan their approach for completion. Respondents will receive reminder emails and calls from trained survey support personnel and administrative data liaisons. Respondents may also call and/or email professional survey support specialists to request help in completing their survey or with technical issues.
The SMO Study employs an efficient sample design for the SFA survey that will minimize the overall response burden for SFAs. In comparison to the previously approved CN-OPS II approach, SMO includes reduced target sample sizes for SFAs. Overall, the SMO SFA sample is estimated to include 1,266 SFAs, whereas the CN-OPS II sample included 2,188 SFAs. SFAs will be sampled with equal probability within each stratum. We plan to allocate the targeted sample size across strata disproportionately (relative to their population proportions) to allow for a sufficient number of SFAs in policy-relevant subgroups.
This sample design meets the study’s required precision levels of 5 percentage points (pp) overall and 10 pp for key subgroups. See Table B2.3 for margins of error by subgroup. Given the descriptive nature of this study, the SMO Study’s sample design is structured to ensure the desired level of precision for national estimates and estimates of key subgroups.
As described in Question B1, the team will stratify the sample by SFA size (based on student enrollment in five categories) and poverty level (based on the percentage of students approved for free or reduced-price meal benefits [F/RP]; less than 60 percent F/RP or greater than or equal to 60 percent F/RP) (Table B1.1). Although the study team plans to sample SFAs with equal probability within each stratum, the targeted sample size across strata will be allocated disproportionately (relative to their population proportions) to allow for a sufficient number of SFAs in policy-relevant subgroups. The study team will take a census of SFAs with 100,000 or more enrolled students (“certainty SFAs”) and sample enough other SFAs to obtain 12 to 300 completed SFA surveys per stratum. This approach will produce an augmented sample, from which the team will randomly subsample a first release of 1,266 SFAs that assumes an 80 percent response rate. All remaining SFAs from the augmented sample will be held in reserve as backup SFAs. In each stratum, backup SFAs will be randomly ordered and released as needed to achieve the stratum’s target number of completes, though the study team does not anticipate that this will be necessary.
When the sample frame is final, the study team will draw the sample. The stratum of “certainty” SFAs represents less than 1 percent of all SFAs but more than 15 percent of all students. The annual sample allocation for responding SFAs (aside from “certainty” SFAs) will be subject to the following restrictions: (1) no more than 80 percent of one-quarter of the total number of SFAs in the stratum[2] and (2) a minimum of 115 SFAs for strata with more than 575 SFAs.[3] The study team will implicitly stratify (that is, sort the frame within the SFA size and poverty level strata before sampling) by characteristics including urbanicity and charter-only status to help ensure proportional representation of SFAs with these characteristics in the sample, and then select SFAs with equal probability within each stratum using sequential sampling (a sketch of this selection step follows Table B2.2). Table B2.2 provides SFA sample sizes for SY 2021-2022.
Table B2.2. SFA sample size for SY 2021-2022 (selected, released, completes, noncompletes, and reserves)

| Sample type | SFA type | School Year 2021-2022 |
|---|---|---|
| Augmented sample selected | All SFAs | 1,559 |
| | Certainty SFAs | 27 |
| | Non-certainty SFAs | 1,532 |
| Initially released sample | All SFAs | 1,266 |
| | Certainty SFAs | 27 |
| | Non-certainty SFAs | 1,239 |
| Expected survey completes (80%) | All SFAs | 1,012 |
| | Certainty SFAs | 22 |
| | Non-certainty SFAs | 990 |
| Expected survey noncompletes (20%) | All SFAs | 254 |
| | Certainty SFAs | 5 |
| | Non-certainty SFAs | 249 |
| Reserve sample | All SFAs | 293 |
| | Certainty SFAs | 0 |
| | Non-certainty SFAs | 293 |
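The sketch below illustrates, under simplified assumptions, how the implicit stratification described above combines with a systematic (sequential) equal-probability draw within one explicit stratum. The field names are hypothetical, and the study’s actual sequential sampling algorithm may differ in its details.

```python
import random

def select_within_stratum(frame, n):
    """Systematic (sequential) equal-probability draw of n SFAs from one
    size-by-poverty stratum, after sorting on implicit stratifiers."""
    # Sorting by the implicit stratification variables spreads the draw
    # across urbanicity and charter-only status near-proportionally.
    ordered = sorted(frame, key=lambda sfa: (sfa["urbanicity"], sfa["charter_only"]))
    interval = len(ordered) / n
    start = random.random() * interval  # random start in [0, interval)
    return [ordered[int(start + i * interval)] for i in range(n)]
```

For example, select_within_stratum(stratum_1_frame, 237) would draw an augmented sample of the size shown for stratum 1 in Table B1.1.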
After sampling, the study team will conduct a thorough quality control (QC) review to check that unweighted counts match those in the sample design, overall and by stratum, and that counts weighted by the sampling weights (which differ from analysis weights and account for only the probability of selection) match those in the sample frame, overall and by stratum.
The web survey of State CN directors and the disaggregated administrative data collection will be conducted with a census of State CN directors. Because this study involves a census of SAs and a 100 percent response rate is expected, there is no need for sampling, weighting, or nonresponse adjustments at the state level. Rather, the state-level data will provide reliable answers to the study’s research questions that represent the full population.
Estimation Procedure.
After data collection is complete, the study team will construct analysis weights for the SFA sample. The first step is to update the initial sampling weight with final backup release information (as indicated in Part B, Question 1). The study team will again check that the sample counts using the final release-adjusted sampling weights match the sample frame counts, overall and by stratum. To adjust the responding sample for differential nonresponse patterns, the team will look at SFA characteristics that are (1) available for both respondent and nonrespondent SFAs (that is, available in the FNS-742 or NCES data) and (2) expected to be correlated with key outcomes. Using logistic regression with stepwise procedures, the team will develop a parsimonious but well-fitting model of the propensity for an SFA to respond and use the inverse of the resulting propensity scores to adjust the sampling weights of respondents (see the sketch below). The team will then benchmark the weight sums to the sampling frame and compare them to other external sources of SFA counts. The team will use calibration techniques (such as post-stratification or raking) to bring the weights in line with target SFA totals by key subgroups. Finally, the team will examine the weights for outliers and trim weights, if needed. As with the sampling process, the development of analysis weights will undergo rigorous QC review. During analysis, the team will account for design effects on the variance of estimates caused by unequal weighting by using the Taylor series linearization method, which uses the analysis weights described above, or by producing replicate weights.
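A conceptual sketch of the propensity-based adjustment step follows. The column names are hypothetical, the stepwise covariate selection is omitted for brevity, and the study’s actual model specification, calibration targets, and trimming rules may differ.

```python
# Conceptual sketch of the propensity-model nonresponse adjustment
# (hypothetical column names; stepwise covariate selection omitted).
import pandas as pd
import statsmodels.api as sm

def nonresponse_adjusted_weights(sample: pd.DataFrame) -> pd.Series:
    """sample: one row per released SFA, with frame covariates, the
    release-adjusted sampling weight, and a 0/1 response flag."""
    X = sm.add_constant(sample[["enrollment", "pct_frp", "urbanicity_code"]])
    propensity = sm.Logit(sample["responded"], X).fit(disp=0).predict(X)
    # Respondents inherit the weight of similar nonrespondents via the
    # inverse of the estimated response propensity.
    adjusted = sample["release_adj_weight"] / propensity
    return adjusted.where(sample["responded"] == 1)  # NaN for nonrespondents
```

In practice, this step would be followed by the calibration (post-stratification or raking) and trimming steps described above before the weights are finalized.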
Because state-level data will be collected from a census of State CN agencies, no estimation procedures are necessary for the state-level collection components.
Degree of Accuracy Needed for the Purpose Described in the Justification.
When using probability sampling methods and selecting from a frame with full coverage of the national population, the resulting sample (of a sufficiently large size) will be nationally representative and will provide unbiased estimates of the population after applying the corresponding sampling weights (adjusted for nonresponse). Similarly, estimates for key subgroups from that sample will be unbiased, in expectation, after applying weights. For the SFA survey, the sample size and the sample design affect the precision of these representative estimates. With 1,012 completed SFA surveys, margins of error (half-widths of 95 percent confidence intervals) are estimated at +/- 3.2 percentage points (pp) around a 50 percent outcome, after accounting for design effects due to unequal weighting (Table B2.3; a worked computation follows the table). Outcomes closer to 0 or 100 percent will have even better precision. For the key subgroups of SFA size and percentage of students approved for F/RP meals, margins of error are estimated to range from +/- 4.9 pp for small SFAs to 8.9 pp for medium SFAs. Other subgroups of interest, including urbanicity and charter status, will also be included in the sample, and based on our preliminary analysis, we anticipate that the margins of error for those strata will be approximately within this range.[4]
Table B2.3. Sample size estimates and expected margins of error for SFA sample and key subgroups
| Group | Population size | Sample sizeᵃ | Margin of errorᵇ (percentage points) |
|---|---|---|---|
| Total (national estimates) | 18,830 | 1,012 | 3.2 |
| SFA size (student enrollment) | | | |
| Very small (1–499) | 8,434 | 319 | 5.6 |
| Small (500–2,499) | 6,619 | 415 | 4.9 |
| Medium (2,500–4,999) | 1,851 | 127 | 8.9 |
| Large (5,000–99,999) | 1,899 | 129 | 8.1 |
| Certainty (100,000+) | 27 | 22 | 0 |
| Percentage of students approved for free or reduced-price meals | | | |
| Less than 40% | 8,673 | 408 | 5.0 |
| 40-59% | 3,861 | 257 | 6.3 |
| 60% or more | 6,296 | 347 | 5.4 |

ᵃ Target completes per period.
ᵇ Incorporates a finite population correction factor and a design effect due to nonresponse adjustments of 1.1, plus design effects due to disproportionate sampling across strata (also about 1.1, for overall estimates; varying for different subgroups).
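As a worked check of the national estimate in Table B2.3, the computation below applies the footnote-b assumptions (a design effect of about 1.1 for nonresponse adjustments, about 1.1 for disproportionate sampling, and a finite population correction); the small difference from the published 3.2 pp reflects rounding in those approximate inputs.

```python
import math

p, n, N = 0.50, 1012, 18830   # outcome proportion, completes, population
deff = 1.1 * 1.1              # unequal-weighting design effects (footnote b)
fpc = 1 - n / N               # finite population correction

moe = 1.96 * math.sqrt(deff * fpc * p * (1 - p) / n)
print(f"{100 * moe:.1f} pp")  # ~3.3 pp, vs. 3.2 pp published in Table B2.3
```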
The study team expects to be able to detect as significant (with α = 0.05 and 80 percent power) underlying differences of 10 to 12 pp between SFA subgroups of different sizes and a difference of 10 pp between SFAs with above/below 60 percent F/RP enrollment. When comparing estimates from all SFAs between one study year and another, the study should have sufficient power with this design to detect differences as small as 7.8 pp.
Unusual Problems Requiring Specialized Sampling Procedures.
We do not anticipate any unusual problems requiring any specialized sampling procedures.
Any use of Periodic (Less Frequent than Annual) Data Collection Cycles to Reduce Burden.
In SY 2021-2022, the State survey will be conducted twice, once in fall 2021 and once in summer 2022, and the administrative data collection will be conducted once, in spring 2022 (covering FY 2021). Also in SY 2021-2022, the SFA survey will be conducted once, in winter 2022. In SY 2022-2023, the State administrative data collection will be conducted once, in spring 2023 (covering FY 2022). Periodic (less frequent than annual) data collection cycles are therefore generally not applicable to this study. The two State surveys in SY 2021-2022 are being conducted to capture information on the statutory reporting requirements for the nationwide CN COVID-19 waivers used in SYs 2020-2021 and 2021-2022 separately, as soon after each school year’s waivers expire as is administratively feasible.
B3. Methods to Maximize the Response Rates and to Deal with Nonresponse

Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
The study is expected to achieve an 80 percent response rate for the SFA web survey and 100 percent for the SA web survey and administrative data request. This means that the data collected from States will represent the entire universe of State CN Directors, and rather than providing estimates to answer the research questions, we will be able to provide actual population totals. Thus, these data will provide reliable answers to the study’s research questions that represent the full population. Achieving the specified response rate involves contacting the States and selected SFAs, securing their participation in the study, and then offering support and completion reminders. The study team will use the following methods to maximize participation and reduce nonresponse:
FNS headquarters will notify (Appendix G.1) the FNS Regional Offices about SMO data collection activities for SYs 2021-2022 and 2022-2023 and ask FNS Regional Office staff to inform SAs about the upcoming data collections.
All FNS Regional Offices will send the Study Support Email from FNS RO to SA (Appendix G.2) to CN directors to encourage participation in the study. Also, the study team will copy the Regional Offices on communications with the SAs to promote participation and response.
The study team will send all State CN directors the SA Advance email (Appendix G.3) and Brochure (Appendix G.4) prior to each survey, which explain the purpose of the study and describe study activities. The emails will provide State CN directors with notice that they will be invited to respond to a survey and provide administrative data in order to fulfill their statutory reporting requirements on the nationwide FFCRA waivers. In addition, the email sent prior to the winter 2021-2022 SFA survey will indicate that the study team will soon contact selected SFAs and will ask State CN directors to promote cooperation by sending the Study Support Email from SA to SFA (Appendix G.10) to SFA directors.
The study team will hold telephone meetings with SAs to discuss the administrative data requests (Appendices E.1-E.3) using the Telephone Meeting Advance Email and Call Script (Appendix G.5). Because the SY 2021-2022 and SY 2022-2023 data requests will be generally the same as the request for SY 2020-2021, it is expected that SAs will be familiar with the request.
The study team will send sampled SFA directors the SFA Advance Letter (Appendix G.11) and Brochure (Appendix G.4), which explain the purpose of the study and describe study activities, to provide SFA directors with notice that they will be invited to respond to a survey.
The study team will send all State CN directors the SA Invitation Email (Appendix G.6) prior to each survey to invite them to complete the survey.
The study team will send all SFA directors the SFA Invitation Email (Appendix G.12) to invite them to complete the survey.
Recruiting materials were carefully developed to emphasize the following points, which may resonate with respondents:
The SA data collection is designed to gather information on statutory reporting requirements for the nationwide COVID-19 CN waivers.
State agencies’ full participation in the study will satisfy their statutory reporting requirements.
The SFA data collection is designed to gather information on how the pandemic affected SFA program operations and finances.
Data regarding the use and impact of the COVID-19 CN waivers will be used to inform FNS policy and procedures in future emergency situations.
Having updated information about CN program operations at the state and local levels will help FNS inform policy and budget decisions, future training and technical assistance, and future nationwide waivers offered during emergencies.
SMO has been designed to reduce participant burden by limiting the sample to the smallest possible number of SFAs needed to support the research and relying on administrative data whenever possible to avoid redundant information requests.
SFAs have been selected to participate as part of a nationally representative sample, which means that each SFA response will speak for many SFAs.
Designated FNS regional staff will serve as regional study liaisons and will be kept closely informed about the project so that they will be able to answer questions from CN directors and encourage participation.
A toll-free number and study email address will be provided to all participants so that CN directors and SFA directors can receive assistance with the study. Staff will be readily available to clarify survey questions and work with participants to resolve technical issues, such as difficulty logging on or advancing past pages. Personalized assistance bolsters the perceived legitimacy of the study and will encourage respondents to persist in completing the survey.
For each survey, the Reminder Email (Appendix G.7) will be sent to State CN directors and SFA directors every 2-3 weeks for a total of two reminder emails before the reminder phone call (Appendix G.8) at week 7.
The study team will follow up by telephone with all CN directors and SFA directors who do not complete the survey and urge them to complete the survey.
The study team will use call scheduling procedures that are designed to call numbers at different times of the day (between 8am and 6pm) and days of the week (Monday through Friday) to improve the chances of finding a respondent at work.
Toward the end of the field period for each survey, all SAs or SFAs who have yet to submit their surveys will be mailed the SA or SFA Last Chance Post Card (Appendices G.9 and G.13, respectively).
Because an 80 percent response rate (rather than the 100 percent anticipated for SAs) is expected for SFAs, a nonresponse bias analysis will be conducted to assess whether weighting appears to have mitigated the risk of nonresponse bias. Although nonresponse bias can rarely be measured directly, the study team plans to examine the set of available SFA characteristics to assess this risk. These characteristics include variables available on the FNS-742 file, such as size measures (number of schools and number of students) and the percentage of students approved for free or reduced-price lunch. We will also include variables about the associated school district from the NCES files, such as charter school status, geographic characteristics, the number of each type of school in the district (elementary, middle, and/or high), and the grades offered in the district. The team will show frequency distributions for these variables for (1) the frame, (2) the sample (weighted by the release-adjusted sampling weight), (3) the respondents (weighted by the release-adjusted sampling weight), and (4) the respondents with their full nonresponse-adjusted weights (see the sketch below). However, recognizing the limitations of the available data on SFA characteristics for analyzing nonresponse bias, the team will also conduct analyses comparing early responders to late responders, and both to nonrespondents, to assess the value of using this information when calculating nonresponse adjustments to the weights.
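A minimal sketch of the planned four-way distributional comparison is below, assuming a pandas data layout with hypothetical column names; the study’s actual tabulations may differ.

```python
# Sketch of the four weighted frequency distributions described above
# for one frame characteristic (hypothetical column names).
import pandas as pd

def nrba_table(frame: pd.DataFrame, sample: pd.DataFrame, var: str) -> pd.DataFrame:
    resp = sample[sample["responded"] == 1]
    def wshare(df, wt):  # weighted share of each category of `var`
        return df.groupby(var)[wt].sum() / df[wt].sum()
    return pd.DataFrame({
        "frame": frame[var].value_counts(normalize=True),
        "sample (release-adjusted wt)": wshare(sample, "release_adj_weight"),
        "respondents (release-adjusted wt)": wshare(resp, "release_adj_weight"),
        "respondents (nonresponse-adjusted wt)": wshare(resp, "final_weight"),
    })
```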
B4. Test of Procedures or Methods to be Undertaken

Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
The study team pretested the State and SFA CN director surveys in June 2021. The pretest of the SA web survey was conducted with 3 SAs that volunteered to participate in response to a request distributed through the FNS Regional Offices. These SAs varied across relevant characteristics, including State department type, CN programs overseen, and program participation. The pretest of the SFA survey was conducted with 6 SFAs that operate in the 3 pretest States and were available to participate within the pretest timeframe. These SFAs were selected to provide variation across key subgroups of interest, including size (number of schools served), urbanicity, and operational approaches used during the COVID-19 pandemic. Because the SMO administrative data collection in SY 2020-2021 involved the same data collection instruments that will be used in SYs 2021-2022 and 2022-2023, pretesting of those instruments was not conducted.
The study team conducted 60-minute telephone debriefing interviews with each pretest respondent to solicit feedback on the web survey. The interviews focused on asking respondents to identify and share concerns about unclear questions or response options, questions that took too long to answer, burden, and the flow of the survey. The team used cognitive methods to gauge respondents’ understanding of the intent of questions and response options, focusing on newly crafted or significantly revised survey items, or those that asked about topics that are complex or difficult to measure. Pretest findings are summarized in the SMO Option Period 1 Pre-test Findings Memo (Appendix M). Tables B4.1 and B4.2 summarize respondent feedback and the changes made to the surveys based on the pretest findings.
Table B4.1. Changes to State CN Director Survey based on respondent feedback
| Question | Respondent feedback | Survey changes |
|---|---|---|
| Overall | Respondents commented that the ability to export or review their answers prior to submission would help them ensure that the survey is accurate and complete. | We will look into adding this functionality. |
| Overall | Respondents said it would be helpful to have a copy of their responses to the Base Period survey to ensure consistent responses, assist with leadership changes, and provide baseline information for the comparison questions. | In the advance and invitation emails, we will indicate that SAs may request a copy of their responses to the Base Period survey. |
| Overall | Respondents requested a copy of the survey to facilitate coordination across staff and ease survey completion. | We will continue to offer copies of the survey. |
| Survey navigation | One respondent suggested changing the web navigation so respondents are brought to the beginning of the section each time they enter it, rather than picking up at their last unanswered question in that section. | We will make this change. |
| Multiple questions | One respondent commented that one challenge in completing the survey is that they are hesitant to select the “don’t know” response. Respondents also said it was challenging to provide approximate answers when they did not have actual data to support their responses. | We added instructions to the questions that read, ‘Please provide your best estimate. If you do not have this information, please select “Don’t know.”’ |
| Multiple questions | One respondent commented that their LPOs changed which CN programs they used throughout the school year, so providing numbers or estimates of proportions is challenging. They were not sure whether the survey was asking about any LPO that used it during the time period. | Where appropriate, we clarified that we are asking about use of the waiver or specific implementation methods at any time in the timeframe specified. |
| Multiple questions | One respondent noted that it was time consuming to select a response for every item in the Base Period survey. They suggested that having an option to indicate an item applied to all programs would help ease burden. | We simplified certain response options (from proportions to yes/no). For questions A8, V2, and V4, we added an “All” response option so respondents do not have to select each program separately. We also added the instruction, “If a factor contributed to operational challenges for all listed Child Nutrition Programs your State Agency administers, please select ‘All.’” |
| Waiver list, E1, E2, E3, E4, E15, E16, E18, E19 | One respondent noted that the closed enrolled area eligibility waivers are very different policy changes from the regular area eligibility waivers and suggested asking about them separately to avoid confusion. | We moved the questions about closed enrolled area eligibility waivers from E15-16 and E18-19 to questions E1-E4 and moved them to a separate grouping in the waivers list. |
| A2a, A2b, B2c, B2d, C1, D1 | One respondent commented that response option 2, “Estimated number of SFAs, data not provided by all SFAs,” was limiting and not entirely accurate for their situation. They suggested adding other reasons for providing an estimated number. | We removed “data not provided by all SFAs/sponsors/institutions” from the “estimated number” response options throughout. We also removed “data provided by all SFAs” from the “accurate number” response options. It is not necessary to add other reasons; whether the number is actual or estimated is all that is needed. |
| A3, A6a, A5b, B3, B4c, B5, C2, D2, E32, E36 | One respondent suggested that the response “about half” may be confusing and that a range of percentages clarifying how it should be interpreted would be helpful. | No change. We recommend leaving the scale as is to maintain consistency with the Base Period survey. As noted above, we added instructions to the questions that read, ‘Please provide your best estimate. If you do not have this information, please select “Don’t know.”’ |
| B4c item b | One respondent commented that the term “school buildings” is limiting, since the CACFP At-Risk Afterschool Program is not offered in school buildings in their State. | We added “or afterschool program” to the item so it reads, “Grab-and-go tables or kiosks in school or afterschool program buildings.” |
| C1, C2 | One respondent commented that the decision to use this waiver is made at the State level, so all SFAs use it. | We changed question C1 to ask, “Did your State Agency waive the identified Child Nutrition Programs reporting requirements for all SFAs in your State?” If respondents answer yes, we will have the total number of SFAs from the administrative data. If respondents select no, they will be asked, “You indicated that you did not waive the identified Child Nutrition Programs reporting requirements for all SFAs in your State. For what types of SFAs were the reporting requirements waived?” We also deleted question C2. |
| C3 | Respondents commented that the wording of this question was confusing because the waiver did not affect the information they reported, but rather waived the requirement to report specific elements. | We changed the question to ask, “Did your State Agency use the Child Nutrition Programs reporting requirements waiver for any of the following in SY 2020-2021?” |
| Section E and F summer 2021 questions | One respondent suggested defining “summer” with example months. | We added the note, “By summer 2020, we generally mean the months of May through September 2020.” |
| E2, E4, E6, E12, E16, E19, E23, E25, F2, F7 | One respondent expressed concerns about the “No change” response option, as they would not be able to definitively say there was no change in the number of LPOs that used the waiver. | We changed the response option from “No change” to “About the same number of LPOs used the waiver in SY 2020-2021.” |
| E2, E4, E6, E16, E19, E23, F2 | One respondent noted that they forgot the comparison time period when completing the grids. | We added a row that reads, “Compared to March through September 2020,” to remind respondents of the comparison period. |
| B5, E17, E17a, E20, E20a | One respondent explained that they thought these questions were about approaches used to identify area eligible sites, not about new approaches used because of the waiver. | We changed the questions to clarify that we are asking about “new meal sites made area eligible under the waiver.” |
| G4 | Respondents commented that they were unsure whether they would be able to answer this question because they may not have comprehensive information on how effective the CN Emergency Operational Costs reimbursement payments were in stabilizing the financial health of LPOs or making up for their lost revenues during the public health emergency in spring 2020. | We added a “don’t know” response option. |
| V4 | One respondent commented that their SA’s challenge was not insufficient funds but the rules around spending their SAE and SAF funds. | We split item d into two separate items: “d. Insufficient State administrative expense funds (SAE) or State administrative funds (SAF)” and “e. Rules regarding use of State administrative expense funds (SAE) or State administrative funds (SAF).” |
CACFP = Child and Adult Care Food Program; LPO= Local program operators; SAE=State administrative expense funds; SAF=State administrative funds; SFA= School Food Authority; SA=State Agency.
Table B4.2. Changes to SFA Director Survey based on respondent feedback
| Question | Respondent feedback | Survey changes |
|---|---|---|
| 1.1 | One respondent noted that there is a school in their district that serves special needs students beyond 12th grade. They were not sure, based on our instructions, whether that type of school meets the definition of “other.” | We added a clarification to include any schools with grades K—12 and/or K—13 in the definition of “Other” school types. |
| 1.2 | Respondents suggested it would be helpful to clarify the difference between the CACFP at-risk afterschool and CACFP outside-school-hours components and between the USDA DoD Fresh Fruit and Vegetable Program and the Fresh Fruit and Vegetable Program. One respondent found the NSLP acronym in the name for the Seamless Summer Option (SSO) confusing because the program also provides breakfasts. | We added definitions of the CACFP At-Risk Afterschool Meals and CACFP Outside-School-Hours Care components, as well as the USDA DoD Fresh Fruit and Vegetable Program and the Fresh Fruit and Vegetable Program. We removed the acronym “NSLP” from the Seamless Summer Option to avoid potential confusion. |
| 1.3-1.5 | Respondents suggested using acronyms for SFSP and SSO in this set of questions, because they are not used to seeing SFSP and SSO fully written out. | We replaced the full names for Summer Food Service Program and Seamless Summer Option with SFSP and SSO in questions 1.3 and 1.4. The full names of both programs are introduced in question 1.2. |
| 1.9 | One respondent suggested providing clarification for the response option “Conduct outreach” because outreach could mean providing information on the website or brochures, which is a separate response option. | We changed “Conducted outreach” to “Conducted other outreach” and moved this option to the bottom of the list. |
| 1.11 | Respondents suggested two new response options for this question, one to capture not having enough information to answer parent questions and another to capture that P-EBT is a new program to learn and implement. | We added two new response options to this question: “Insufficient information or guidance from State about benefits” and “Quickly implementing a new program.” |
| 1.13-1.16 | Respondents reported that the phrase “students attending full time in person (no virtual instruction was provided)” is unclear because some schools allowed students to attend in person (in the school building), but instruction was provided virtually, outside of the classroom. For example, a classroom teacher may provide instruction to a portion of the students in the classroom, while the remaining students attend virtually from another area of the building, such as the library, to allow for smaller class sizes and social distancing. | We changed the language in question 1.13 from “…did schools in your SFA have students attending full time in person (no virtual instruction was provided) for all or part of the year” to “…did schools in your SFA provide instruction to all students in school buildings for all or part of the year?” We also revised questions 1.14 through 1.16 to reflect this change to the language in 1.13. |
| 1.17 | One respondent asked if “full time in-person” also included hybrid, since some students were full time in person during the hybrid approach described above for question 1.13. | We revised the language in the question from “…did some or all students attend school in person…” to “…did some or all students receive instruction in school buildings (including full time in person and hybrid)…” |
| 1.18-1.19 | Several respondents requested clarification of the definition of bulk foods and were unsure whether methods used by the schools in their SFA met this definition if only one component of the meal (such as milk) is delivered in bulk. | We clarified the hover text definition for bulk food packages by adding “one or more items” to the definition. |
| 1.20 | One respondent suggested it would be helpful to have a link to the FNS waiver guidance. | We added a link to FNS’s webpage for the Child Nutrition COVID-19 Waivers. |
| 1.24 | Two respondents suggested adding a response option to capture items offered in bulk food packages that do not require preparation. | We added a “No preparation required” response option. |
| 1.25 | One respondent suggested “circle sheets” as another meal counting method used by their SFA. | We added “Circle sheets, or other form for meal counting” to the list of response options for this question and for the comparable question in section 4 (4.4). |
| 1.29 | One respondent was not sure whether the question was about access to food from a school garden for taste-testing purposes or about using food from a garden as part of a reimbursable meal. | We added “reimbursable” meals to the question to clarify. |
| 1.31 | One respondent suggested including response options for unclear and untimely guidance from FNS or the State as separate options. Another respondent suggested removing or separating “and using alternative meal delivery methods” from response option g, which was “staffing meal sites and using alternate meal delivery method.” They clarified that they had staffing issues, but not related to alternative meal sites. | We split out separate response options for unclear guidance from FNS and the State and untimely guidance from FNS and the State (four options total). We removed “…and using alternative meal delivery methods” from response option g to allow respondents to select this option if staffing any meal site was a factor that contributed to operational challenges. We also added the text “and using” to response option d to allow SFAs to report that using alternative meal sites was a factor that contributed to operational challenges. We made the same revisions to the comparable question in section 4 (4.12). |
| Section 2 overall | Respondents found this section easy to answer overall. One respondent referred to their financial records to answer questions in this section. | None. |
| 2.5 | One respondent suggested adding “CN General Fund” as a response option for additional funding sources. | We revised the “District general fund” option to “District or Child Nutrition general fund.” |
| 2.6 through 2.11 | Several respondents were not sure what the “Child Nutrition Emergency Operational Costs reimbursement payments” were and whether they applied to them. One respondent suggested changing the terminology to “lost revenue reimbursements” or providing a link for more information. | We added a link to the guidance document on FNS’s website. |
| Section 3 overall | None of the respondents who participated in the pretest currently have FSMC contracts. One respondent with prior experience working at an FSMC said the questions in this section should be straightforward for SFA directors with an FSMC contract to answer. | None. |
| Section 4 overall | Overall, respondents felt this section would be easy to answer in early 2022 and reflect the timeframe of October through December 2021. | None. |
| 4.5 | One respondent suggested adding an option to reflect operational changes that increased use of prepackaged or heat-and-serve meals and decreased meals prepared from scratch. | We added an option for “Increased use of heat and serve meals (versus prepared from scratch)” to this question. |
CACFP = Child and Adult Care Food Program; CN = Child Nutrition; FSMC = Food service management company; NSLP = National School Lunch Program; P-EBT = Pandemic Electronic Benefit Transfer; USDA DoD Fresh = USDA Department of Defense Fresh Fruit and Vegetable Program.
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Table B5.1 lists staff consulted on statistical aspects of the design. The same staff will be responsible for collecting and analyzing the study data.
Table B5.1. Individuals consulted on statistical aspects of study design
| Mathematica staff | Title | Phone |
|---|---|---|
| Rachel Frisk | Project Director | 202-552-6447 |
| Barbara Carlson | Senior Statistician | 617-674-8372 |
| Liz Gearan | Senior Researcher | 617-301-8978 |
| Andrew Gothro | Researcher | 202-250-3569 |
| Eric Grau | Senior Statistician | 609-945-3330 |
| Josh Leftin | Researcher | 202-250-3531 |
| Sarah Forrestal | Senior Survey Researcher | 609-945-6616 |
| Veronica Severn | Survey Analyst | 617-715-6931 |
| Liana Washburn | Research Analyst | 202-250-3551 |
| Eric Zeidman | Senior Survey Researcher | 609-936-2784 |

| USDA staff | Title | Phone |
|---|---|---|
| Holly Figueroa, FNS | Social Science Research Analyst | 703-305-2105 |
| Maggie Applebaum, FNS | Branch Chief, Special Nutrition Analysis Branch | 703-305-2578 |
| Janis Johnston, FNS | Acting Director, Office of Program Integrity | 703-305-2106 |
| Doug Kilburg | NASS Reviewer | 202-690-8640 |
[1] If the SY 2020-2021 FNS-742 data are incomplete due to the COVID-19 pandemic and resulting changes in program operations, we will use the SY 2019-2020 file.
[2] These constraints allow for SFA nonresponse and enable non-overlapping samples across the four potential data collection years included under the research contract between FNS and Mathematica, which covers SYs 2020-2021 through 2023-2024. Due to the COVID-19 pandemic, the SFA survey was dropped from the SY 2020-2021 data collection, resulting in three potential data collection years with an SFA survey.
[3] These thresholds were selected to ensure precision levels for important subgroups were less than +/- 10 percentage points, while also ensuring a total sample size of at least 1,000 SFAs per year.
[4] Upon OMB approval for the collection, the research team will merge the FNS-742 data with the NCES data in order to produce margin of error estimates for urbanicity and charter status.