
School Meals Operations Study: Evaluation of the School-based Child Nutrition Programs

OMB: 0584-0607


Supporting Statement for OMB Clearance for the School Meals Operations Study: Evaluation of the School-based Child Nutrition Programs



Part B



Revision to OMB # 0584-0607, School Meals Operations Study (SMO)

April 6, 2023



Darcy Güngör

Social Science Research Analyst

Email: [email protected]

Conor McGovern

Branch Chief

Email: [email protected]



Office of Policy Support

Food and Nutrition Service

United States Department of Agriculture

1320 Braddock Place

Alexandria, Virginia 22314

TABLE OF CONTENTS



TABLES



APPENDICES

A Section 2202 of the Families First Coronavirus Response Act (FFCRA)

B Research Questions

C Section 28 of the Richard B. Russell National School Lunch Act and Section 305 of the Healthy, Hunger-Free Kids Act of 2010

D.1 State Agency Child Nutrition Director Survey SY 2023-2024

D.2 Screenshots of State Agency Child Nutrition Director Survey SY 2023-2024

E.1 FNS-10 Administrative Data Request for FY 2023

E.2 FNS-418 Administrative Data Request for FY 2023

E.3 FNS-44 Administrative Data Request for FY 2023

F.1 School Food Authority Director Survey SY 2023-2024

F.2 Screenshots of School Food Authority Director Survey SY 2023-2024

G.1 Notification from USDA FNS to Regional Offices

G.2 Study support email from FNS Regional Offices to State Agencies

G.3 State Agency Advance Email

G.4 Brochure

G.5 Telephone Meeting Advance Email and Telephone Script

G.6 State Agency Invitation Email

G.7 Reminder Email

G.8 Telephone Reminder Script

G.9 State Agency Last Chance Post Card

G.10 Study Support Email from State Agencies to SFAs

G.11 SFA Advance Email

G.12 SFA Invitation Email

G.13 SFA Last Chance Post Card

H.1 Public Comment 1

H.2 Public Comment 2

H.3 Public Comment 3

H.4 Public Comment 4

H.5 Public Comment 5

H.6 Public Comment 6

H.7 Public Comment 7

H.8 Response to Public Comment 1

H.9 Response to Public Comment 2

H.10 Response to Public Comment 3

H.11 Response to Public Comment 6

H.12 Response to Public Comment 7

I National Agricultural Statistics Service Comments and FNS Response

J Confidentiality Pledge

K Estimated Annualized Burden

L FNS-742 School Food Authority Verification Collection Report

M SMO Option Period 2 Pre-test Findings Memo

B1. Respondent Universe and Sampling Methods

Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

Two web-based surveys will be conducted: one with State Child Nutrition (CN) Directors and one with school food authority (SFA) Directors. The respondent universe for the State Agency Child Nutrition Director Survey (Appendices D.1 and D.2) includes the 56 State CN Directors who oversee the National School Lunch Program (NSLP), the School Breakfast Program (SBP), and the NSLP Seamless Summer Option (SSO) in the 50 States, the District of Columbia, Guam, Puerto Rico, and the U.S. Virgin Islands.

The respondent universe for the School Food Authority Director Survey (Appendices F.1 and F.2) includes all SFAs operating in public and private school districts (including charters) in the United States and outlying territories that were required to submit form FNS-742 School Food Authority Verification Collection Report (OMB number 0584-0594 Food Programs Reporting System (FPRS), expiration date July 31, 2023) (Appendix L) to FNS in school year (SY) 2022-2023. In general, all SFAs that participated in the NSLP or the SBP are included in the respondent universe except SFAs associated with Federally-administered schools.

The sampling frame is based on the latest available FNS-742 file, supplemented with school district-level characteristics from the U.S. Department of Education’s National Center for Education Statistics (NCES) and district-level estimates of school-age children in poverty from the U.S. Census Bureau’s Small Area Income and Poverty Estimates file. This will result in a respondent universe of approximately 18,821 SFAs. Table B1.1 summarizes the distribution of eligible SFAs in the sampling frame by sampling stratum, which is based on SFA enrollment size and poverty status (defined as the percentage of students in the SFA who are eligible for free or reduced-price meals).

Table B1.1. SFA sample allocation by stratum

| Stratum | SFA size (student enrollment) | Poverty level (percentage approved for F/RP meals) | Total population^a | Augmented sample selected | Initially released sample | Reserve sample | Expected survey completes^b |
| ------- | ----------------------------- | -------------------------------------------------- | ------------------ | ------------------------- | ------------------------- | -------------- | --------------------------- |
| 1 | 1–499 | <60% | 5,337 | 372 | 298 | 46 | 238 |
| 2 | 1–499 | 60%+ | 3,090 | 216 | 173 | 42 | 138 |
| 3 | 500–2,499 | <60% | 4,524 | 314 | 251 | 79 | 200 |
| 4 | 500–2,499 | 60%+ | 2,094 | 180 | 144 | 38 | 115 |
| 5 | 2,500–4,999 | <60% | 1,289 | 180 | 144 | 34 | 115 |
| 6 | 2,500–4,999 | 60%+ | 561 | 55 | 44 | 10 | 35 |
| 7 | 5,000–99,999 | <60% | 1,361 | 180 | 144 | 34 | 115 |
| 8 | 5,000–99,999 | 60%+ | 538 | 54 | 43 | 10 | 34 |
| 9 | 100,000+ | -- | 27 | 27 | 27 | 0 | 22 |
| Total | | | 18,821 | 1,561 | 1,268 | 293 | 1,012 |

a The population numbers are based on a list of SFAs that combines data from the SY 2020-2021 FNS-742 file and SY 2019-2020 FNS-742 file, the last time an SFA survey was conducted under 0584-0607. An updated frame will be used when designing the sample for the SY 2022-2023 survey.

b The number of expected survey completes assumes an 80 percent response rate from the initially released sample. Additional SFAs in the augmented sample will be held in reserve as backup SFAs, which will be randomly ordered in each stratum and released as needed to achieve the stratum’s target number of completes.

F/RP = free or reduced price.

The study team plans to sample SFAs with equal probability within each stratum. The targeted sample size across strata will be allocated disproportionately (relative to their population proportions) to allow for a sufficient number of SFAs in policy-relevant subgroups. SFAs that pretest the survey will be excluded from sample selection. A census will be taken of SFAs with 100,000 or more enrolled students (“certainty SFAs”), and enough other SFAs will be sampled to obtain 12 to 300 completed SFA surveys per stratum. The study team will select a sample that allows for lower-than-anticipated response rates (an augmented sample). From this sample, the team will randomly subsample a first release that assumes an 80 percent response rate, with all remaining sampled SFAs held in reserve as backup SFAs. In each stratum, backup SFAs will be randomly ordered and released as needed to achieve the stratum’s target number of completes. The final sampling weights will then be calculated as the product of the initial sampling weight (the inverse of the probability of selection for the augmented sample) and the release adjustment (the number of cases selected for the augmented sample in each stratum divided by the number of released cases in that stratum).
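The weight construction described above can be illustrated with a short sketch. This is not the study’s production code; the function and the example values (stratum 1 of Table B1.1, assuming no reserve cases are released) are purely illustrative.

```python
def final_sampling_weight(stratum_pop, augmented_selected, released):
    """Final sampling weight = initial weight (inverse probability of
    selection into the augmented sample) x release adjustment
    (augmented cases selected / cases released)."""
    initial_weight = stratum_pop / augmented_selected
    release_adjustment = augmented_selected / released
    return initial_weight * release_adjustment  # simplifies to stratum_pop / released

# Illustrative: stratum 1 of Table B1.1, assuming no reserve releases
weight = final_sampling_weight(5337, 372, 298)
print(round(weight, 2))  # about 17.91
```

Note that the product algebraically reduces to the stratum population divided by the number of released cases, so the released subsample continues to represent the full stratum regardless of how many backup SFAs are released.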

To streamline survey data collection, the team also plans to collect disaggregated administrative data from 68 State agencies that are currently reported to FNS only in aggregate on forms FNS-10, Report of School Program Operations; FNS-418, Report of the Summer Food Service Program for Children; and FNS-44, Report of the Child and Adult Care Food Program (all approved under OMB# 0584-0594, Food Programs Reporting System (FPRS), expiration date 07/31/2023). No sampling or weighting is required for the state-level collections because the study will include a census of all 68 State agencies. Given a universe of only 68 eligible State agencies and Territories with relatively distinct characteristics, no efficient sample design could match the comprehensive data on State policies and student meal service that a census will yield for understanding variation and localized concerns.

Expected Response Rates

The nationally representative sample of 1,268 SFAs (including charters) is expected to yield 1,012 completed web surveys, a response rate of 80 percent. This balances the need for precise estimates against the need to minimize burden on SFA Directors.

Achieving high response rates on the SA and SFA surveys and the administrative data request is critical to the study’s success. The Child Nutrition Program Operations Study II (CN-OPS II) (OMB number 0584-0607, expiration date 08/31/2022), the predecessor to the SMO Study, had survey response rates of 100 percent for the State CN survey and 78 percent for the SFA survey, averaged across three data collection years. In addition, the SMO Study had a response rate of 100 percent for the State CN Director survey and administrative data collection during the first three years of data collection (SY 2020-2021, SY 2021-2022, and SY 2022-2023) and a response rate of 86 percent for the SFA Director survey fielded in SY 2021-2022. The team anticipates again reaching 100 percent completion with the 68 State agency respondents for the web survey and the administrative data collection. While SFA Directors will likely still be adjusting program operations due to the COVID-19 pandemic during SY 2023-2024, we expect that they will also be interested in reporting to FNS on their current operations. This survey will be their primary opportunity to report directly to FNS on their unique experiences, and study recruitment materials will convey the importance of reporting this information to inform future program policy and assistance efforts. Further, in contrast to previous versions of the study, the administrative data collection was added in part to minimize the burden of survey completion for SFAs: they will not be asked for information in the survey that is available through the administrative data, and the SFA survey is limited to 30 minutes in length. Therefore, while SFAs will be informed that their participation in the survey is mandatory, based on past experience with similar surveys and efforts to reduce burden, we anticipate reaching an 80 percent response rate for the SFA web survey.

Table B1.2. Summary of Respondent Universe and Expected and Prior Response Rates

| Respondent | Universe | Target completed cases | Expected response rate | CN-OPS II SY 15-16 | CN-OPS II SY 16-17 | CN-OPS II SY 17-18 | SMO SY 20-21 | SMO SY 21-22 | SMO SY 22-23** |
| ---------- | -------- | ---------------------- | ---------------------- | ------------------ | ------------------ | ------------------ | ------------ | ------------ | -------------- |
| State CN Agencies | 68* | 68 | 100% | 100% | 100% | 100% | 100% | 100% | 100% |
| School Food Authorities | 18,821 | 1,012 | 80% | 82% | 77% | 76% | NA | 86% | NA |
| Total | 18,889 | 1,079 | 81% | 83% | 78% | 77% | 100% | 100% | 100% |

* The universe for CN-OPS II only included the 55 State CN agencies that oversee the NSLP and the SBP. Due to the unanticipated school closures related to COVID-19, other CN Programs, including the SFSP and the CACFP, were used to provide meals to children beginning in March 2020. Thus, the State agencies that oversee the SFSP and the CACFP were added to the SY 2020-2021 universe, increasing the universe of State CN agencies to 68.

**This is the response rate for the State CN agency survey; administrative data collection for SY 22-23 is not yet complete.



The approach to achieving high response rates builds on prior FNS study experience. The team developed engaging recruiting materials, listed in Appendices G.1-G.13, to describe the study, including a variety of appeals to encourage participation. FNS headquarters will send a notification to FNS Regional Offices (Appendix G.1) to introduce the SMO Study data collection activities occurring in SY 2023-2024 and focusing on SY 2022-2023 operations, and ask FNS Regional Office staff to inform State agencies about the upcoming data collections. Before fielding the state-level survey, all FNS Regional Offices will send the Study Support Email from FNS Regional Offices to State Agencies (Appendix G.2) to build support for the study among States. The study team will also provide information to State agencies on the various study components prior to each data collection using the State Agency Advance Email (Appendix G.3), the study brochure (Appendix G.4), and the Telephone Meeting Advance Email and Telephone Script (Appendix G.5). All State CN Directors will be asked to send the Study Support Email from State Agencies to SFAs (Appendix G.10) to build study support among sampled SFAs. Prior to each survey, State CN Directors will be sent the State Agency Invitation Email (Appendix G.6) and SFA Directors will be sent the SFA Invitation Email (Appendix G.12) to request that each respondent use the unique enclosed link to access and complete the survey. To encourage survey completion, State CN Directors will receive the Reminder Email (Appendix G.7) every 2-3 weeks (estimate two reminder emails total), a reminder call at 7 weeks using the Telephone Reminder Script (Appendix G.8), and the State Agency Last Chance Post Card (Appendix G.9) until the target of 56 respondents is reached. 
Similarly, SFA Directors who have not yet completed the web survey will receive the Reminder Email (Appendix G.7) every 2-3 weeks (estimate two reminder emails total), a reminder call at 7 weeks using the Telephone Reminder Script (Appendix G.8), and the SFA Last Chance Post Card (Appendix G.13) until the target of 1,012 respondents is reached.

A professional, trained survey support specialist will be available to assist respondents by phone or email during business hours. Also, the web survey will allow respondents to save and exit the survey at any point, and then return to access and complete the survey later.

The study team will calculate response rates using industry standards from the American Association of Public Opinion Research. Depending on item-completion patterns, the study team will classify partially completed surveys as either sufficiently completed to treat as a respondent, or insufficiently completed to treat as a nonrespondent. Response rates will be presented both unweighted and weighted by the release-adjusted sampling weights, and both overall and by key subgroups.
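The unweighted versus weighted calculation described above can be sketched as follows. This is a simplification of the AAPOR rate definitions (it assumes partial completes have already been classified as respondents or nonrespondents, as the paragraph describes), and the dispositions and weights are hypothetical.

```python
def response_rates(cases):
    """cases: (is_respondent, sampling_weight) pairs, where partially
    completed surveys have already been classified as respondent or
    nonrespondent. Returns (unweighted rate, weighted rate)."""
    n_resp = sum(1 for responded, _ in cases if responded)
    w_resp = sum(w for responded, w in cases if responded)
    w_total = sum(w for _, w in cases)
    return n_resp / len(cases), w_resp / w_total

# Hypothetical: three respondents and one nonrespondent with unequal weights
unweighted, weighted = response_rates([(True, 10.0), (True, 5.0), (True, 5.0), (False, 20.0)])
print(unweighted, weighted)  # 0.75 0.5
```

As the example shows, the weighted rate can diverge from the unweighted rate when nonrespondents carry large release-adjusted sampling weights, which is why both versions will be reported.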

B2. Procedures for the Collection of Information

Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection;

  • Estimation procedure;

  • Degree of accuracy needed for the purpose described in the justification;

  • Unusual problems requiring specialized sampling procedures; and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

As detailed in Part A, Question 2, information will be collected via web surveys of State CN Directors and SFA Directors and state-level administrative data collections. The respondents will have 10 weeks to complete each survey and provide the administrative data, which allows time to plan their approach for completion. Respondents will receive reminder emails and calls from trained survey support personnel and administrative data liaisons. Respondents may also call and/or email professional survey support specialists to request help in completing their survey or with technical issues.

Statistical Methodology for Stratification and Sample Selection.

The SMO Study employs an efficient sample design for the SFA survey that will minimize the overall response burden for SFAs. In comparison to the previously approved CN-OPS II approach, SMO reduces the target sample sizes for SFAs: the SMO SFA sample is estimated to include 1,268 SFAs, whereas the CN-OPS II sample included 2,188 SFAs. SFAs will be sampled with equal probability within each stratum. We plan to allocate the targeted sample size across strata disproportionately (relative to their population proportions) to allow for a sufficient number of SFAs in policy-relevant subgroups.

This sample design meets the study’s required precision levels of 5 percentage points (pp) overall and 10 pp for key subgroups. See Table B2.3 for margins of error by subgroup. Given the descriptive nature of this study, the SMO Study’s sample design is structured to ensure the desired level of precision for national estimates and estimates of key subgroups.

As described in Question B1, the team will stratify the sample by SFA size (based on student enrollment in five categories) and poverty level (based on the percentage of students approved for free or reduced-price meal benefits [F/RP]; less than 60 percent F/RP or greater than or equal to 60 percent F/RP) (Table B1.1). Although the study team plans to sample SFAs with equal probability within each stratum, the targeted sample size across strata will be allocated disproportionately (relative to their population proportions) to allow for a sufficient number of SFAs in policy-relevant subgroups. The study team will take a census of SFAs with 100,000 or more enrolled students (“certainty SFAs”) and sample enough other SFAs to obtain 12 to 300 completed SFA surveys per stratum. This approach will produce an augmented sample, from which the team will randomly subsample a first release of 1,268 SFAs that assumes an 80 percent response rate. All remaining SFAs from the augmented sample will be held in reserve as backup SFAs. In each stratum, backup SFAs will be randomly ordered and released as needed to achieve the stratum’s target number of completes, though the study team does not anticipate that this will be necessary.

When the sample frame is final, the study team will draw the sample. The stratum of “certainty” SFAs represents less than 1 percent of all SFAs but more than 15 percent of all students. The annual sample allocation for responding SFAs (aside from “certainty” SFAs) will be subject to the following restrictions: (1) no more than 80 percent of one-quarter of the total number of SFAs in the stratum¹ and (2) a minimum of 115 SFAs for strata with more than 575 SFAs.² The study team will implicitly stratify (that is, sort the frame within the SFA size and poverty level strata before sampling) by characteristics including urbanicity and charter-only status to help ensure proportional representation of SFAs with these characteristics in the sample, and then select SFAs with equal probability within stratum using sequential sampling. Table B2.2 provides SFA sample sizes for SY 2023-2024 data collection, focusing on SY 2022-2023 operations.
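The combination of implicit stratification and sequential selection can be illustrated with a simplified sketch. The frame records and sort keys below are hypothetical, and the study’s production procedure may differ in detail; this shows only the core idea of sorting the stratum frame and then taking a fixed-interval sample with a random start.

```python
import random

def sequential_sample(stratum_frame, n, sort_key):
    """Sort the stratum frame by implicit-stratification characteristics
    (e.g., urbanicity, charter-only status), then take a fixed-interval
    (sequential) sample with a random start, which spreads the sample
    roughly proportionally across the sorted characteristics."""
    ordered = sorted(stratum_frame, key=sort_key)
    interval = len(ordered) / n
    start = random.uniform(0, interval)
    return [ordered[int(start + i * interval)] for i in range(n)]

# Hypothetical frame of (sfa_id, urbanicity) records for one stratum
frame = [(i, "urban" if i % 3 == 0 else "rural") for i in range(300)]
sample = sequential_sample(frame, 30, sort_key=lambda rec: rec[1])
```

Because the frame is sorted before the fixed-interval draw, each ordered characteristic is represented in the sample in close proportion to its share of the stratum, without creating explicit strata for every characteristic.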

Table B2.2. SFA sample size for SY 2023-2024 data collection, focusing on SY 2022-2023 operations (selected, released, completes, noncompletes, and reserves)

| Sample type | SFA type | School Year 2023-2024 Data Collection |
| ----------- | -------- | ------------------------------------- |
| Augmented sample selected | All SFAs | 1,559 |
| | Certainty SFAs | 27 |
| | Non-certainty SFAs | 1,532 |
| Initially released sample | All SFAs | 1,268 |
| | Certainty SFAs | 27 |
| | Non-certainty SFAs | 1,241 |
| Expected survey completes (80%) | All SFAs | 1,012 |
| | Certainty SFAs | 22 |
| | Non-certainty SFAs | 990 |
| Expected survey noncompletes (20%) | All SFAs | 254 |
| | Certainty SFAs | 5 |
| | Non-certainty SFAs | 249 |
| Reserve sample | All SFAs | 293 |
| | Certainty SFAs | 0 |
| | Non-certainty SFAs | 293 |



After sampling, the study team will conduct a thorough quality control (QC) review to check that unweighted counts match those in the sample design, overall and by stratum, and that counts weighted by the sampling weights (which differ from analysis weights and account for only the probability of selection) match those in the sample frame, overall and by stratum.

The web survey of State CN Directors and the disaggregated administrative data collection will each be conducted as a census of State agencies. Because this study involves a census and a 100 percent response rate is expected, there is no need for sampling, weighting, or nonresponse adjustments at the state level. Rather, the state-level data will provide reliable answers to the study’s research questions that represent the full population.

Estimation Procedure.

After data collection is complete, the study team will construct analysis weights for the SFA sample. The first step is to update the initial sampling weight with final backup release information (as indicated in Part B, Question 1). The study team will again check that the sample counts using the final release-adjusted sampling weights match the sample frame counts, overall and by stratum. To adjust the responding sample for differential nonresponse patterns, the team will look at SFA characteristics that are (1) available for both respondent and nonrespondent SFAs (that is, available in the FNS-742 or NCES data) and (2) expected to be correlated with key outcomes. Using logistic regression with stepwise procedures, the team will develop a parsimonious but well-fitting model of the propensity for an SFA to respond and use the inverse of the resulting propensity scores to adjust the sampling weights of respondents. The team will then benchmark the weight sums to the sampling frame and compare to other external sources of SFA counts. The team will use calibration techniques (such as post-stratification or raking) to bring the weights in line with target SFA totals by key subgroups. Finally, the team will examine the weights for outliers and trim weights, if needed. As with the sampling process, the development of analysis weights will undergo rigorous QC review. During analysis, the team will account for design effects on the variance of estimates caused by unequal weighting by using the Taylor series linearization method, which uses the analysis weights described above, or by producing replicate weights.
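The inverse-propensity idea behind the nonresponse adjustment can be illustrated with a weighting-class simplification. Here discrete cells stand in for the stepwise logistic regression model described above, and the cases and weights are hypothetical; this is a sketch of the adjustment principle, not the study’s estimation code.

```python
from collections import defaultdict

def nonresponse_adjust(cases):
    """cases: dicts with 'cell' (a weighting class built from frame
    characteristics correlated with response), 'weight' (release-adjusted
    sampling weight), and 'responded'. Respondent weights in each cell are
    inflated by the inverse of the cell's weighted response rate, so the
    adjusted respondent weights sum to the full-sample weight total."""
    w_all = defaultdict(float)
    w_resp = defaultdict(float)
    for c in cases:
        w_all[c["cell"]] += c["weight"]
        if c["responded"]:
            w_resp[c["cell"]] += c["weight"]
    return [
        {**c, "weight": c["weight"] * w_all[c["cell"]] / w_resp[c["cell"]]}
        for c in cases if c["responded"]
    ]

# Hypothetical cases in two weighting classes
cases = [
    {"cell": "small", "weight": 10.0, "responded": True},
    {"cell": "small", "weight": 10.0, "responded": False},
    {"cell": "large", "weight": 4.0, "responded": True},
]
adjusted = nonresponse_adjust(cases)
print(sum(c["weight"] for c in adjusted))  # 24.0, equal to the full-sample weight total
```

A propensity-score model generalizes this: instead of a handful of cells, each respondent’s weight is multiplied by the inverse of its modeled response propensity, after which calibration aligns the weight sums with frame totals.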

Because state-level data will be collected from a census of State CN agencies, no estimation procedures are necessary for the state-level collection components.

Degree of Accuracy Needed for the Purpose Described in the Justification.

When using probability sampling methods and selecting from a frame with full coverage of the national population, the resulting sample (of a sufficiently large size) will be nationally representative and will provide unbiased estimates of the population after applying the corresponding sampling weights (adjusted for nonresponse). Similarly, estimates for key subgroups from that sample will be unbiased, in expectation, after applying weights. For the SFA survey, the sample size and the sample design affect the precision of these representative estimates. With 1,012 completed SFA surveys, margins of error (half-widths of 95 percent confidence intervals) of +/– 3.2 percentage points (pp) are estimated around a 50 percent outcome, after accounting for design effects due to unequal weighting (Table B2.3). Outcomes closer to 0 or 100 percent will have even better precision. For the key subgroups of SFA size and percentage of students approved for F/RP meals, margins of error are estimated to range from +/– 4.9 pp for small SFAs to 8.9 pp for medium SFAs. Other subgroups of interest, such as urbanicity and charter status, will also be represented in the sample, and based on preliminary analysis, we anticipate that the margins of error for those subgroups will fall approximately within this range.³

Table B2.3. Sample size estimates and expected margins of error for SFA sample and key subgroups

| Group | Population size | Sample size^a | Margin of error^b (percentage points) |
| ----- | --------------- | ------------- | ------------------------------------- |
| Total (national estimates) | 18,821 | 1,012 | 3.2 |
| SFA size (student enrollment) | | | |
| Very small (1–499) | 8,427 | 319 | 5.6 |
| Small (500–2,499) | 6,618 | 415 | 4.9 |
| Medium (2,500–4,999) | 1,850 | 127 | 8.9 |
| Large (5,000–99,999) | 1,899 | 129 | 8.1 |
| Certainty (100,000+) | 27 | 22 | 0 |
| Percentage of students approved for free or reduced-price meals | | | |
| Less than 40% | 8,670 | 408 | 5.0 |
| 40-59% | 3,857 | 257 | 6.3 |
| 60% or more | 6,294 | 347 | 5.4 |

a Target completes per period.

b Incorporates a finite population correction factor and a design effect due to nonresponse adjustments of 1.1, plus design effects due to disproportionate sampling across strata (also about 1.1, for overall estimates; varying for different subgroups).
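The components named in footnote b can be checked with a short calculation. The combined design effect of about 1.21 (1.1 for nonresponse adjustments times roughly 1.1 for disproportionate sampling) is an approximation stated in the footnote, so this sketch reproduces the overall figure only to within rounding of the exact design effects (about 3.3 pp versus the published 3.2 pp).

```python
import math

def margin_of_error_pp(n, N, p=0.5, deff=1.21, z=1.96):
    """95 percent CI half-width in percentage points for a proportion p,
    with a finite population correction and an overall design effect."""
    fpc = 1 - n / N                        # finite population correction
    variance = deff * p * (1 - p) * fpc / n
    return 100 * z * math.sqrt(variance)

# Overall estimate: 1,012 completes from a frame of 18,821 SFAs
print(round(margin_of_error_pp(1012, 18821), 1))  # about 3.3
```

Subgroup margins in Table B2.3 follow the same formula but with subgroup-specific sample sizes, population sizes, and design effects, which is why they vary from 4.9 to 8.9 pp.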

The study team expects to be able to detect as significant (with α = 0.05 and 80 percent power) underlying differences of 10 to 12 pp between SFA subgroups of different sizes and a difference of 10 pp between SFAs with above/below 60 percent F/RP enrollment. When comparing estimates from all SFAs between one study year and another, the study should have sufficient power with this design to detect differences as small as 7.8 pp.

Unusual Problems Requiring Specialized Sampling Procedures.

We do not anticipate any unusual problems requiring any specialized sampling procedures.

Any use of Periodic (Less Frequent than Annual) Data Collection Cycles to Reduce Burden.

In SY 2023-2024, the State survey will be conducted in fall 2023 and focus on SY 2022-2023 operations. The administrative data will be collected in spring 2024 and cover FY 2023. The SFA survey will be conducted in SY 2023-2024 and focus on SY 2022-2023 operations. Because all components are collected annually, periodic (less frequent than annual) data collection cycles are not applicable.

B3. Methods to Maximize the Response Rates and to Deal with Nonresponse

Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

The study is expected to achieve an 80 percent response rate for the SFA web survey and 100 percent for the SA web survey and administrative data request. This means that the data collected from States will represent the entire universe of State CN Directors, and rather than providing estimates to answer the research questions, we will be able to provide actual population totals. Thus, these data will provide reliable answers to the study’s research questions that represent the full population. Achieving the specified response rate involves contacting the States and selected SFAs, securing their participation in the study, and then offering support and completion reminders. The study team will use the following methods to maximize participation and reduce nonresponse:

  • FNS headquarters will notify (Appendix G.1) the FNS Regional Offices about SMO data collection activities during SY 2023-2024, focusing on SY 2022-2023 operations, and ask FNS Regional Office staff to inform State agencies about the upcoming data collections.

  • All FNS Regional Offices will send the Study Support Email from FNS Regional Offices to State Agencies (Appendix G.2) to CN Directors to encourage participation in the study. Also, the study team will copy the Regional Offices on communications with the State agencies to promote participation and response.

  • The study team will send all State CN Directors the State Agency Advance Email (Appendix G.3) and Brochure (Appendix G.4) prior to fielding the survey; these materials explain the purpose of the study and describe study activities. The email will notify State CN Directors that they will be invited to respond to a survey and provide administrative data in order to fulfill their statutory reporting requirements on the nationwide FFCRA waivers. In addition, the email sent prior to the SFA survey will indicate that the study team will soon contact selected SFAs and will ask State CN Directors to promote cooperation by sending the Study Support Email from State Agencies to SFAs (Appendix G.10) to SFA Directors.

  • The study team will hold telephone meetings with State agencies to discuss the administrative data requests (Appendices E.1-E.3) using the Telephone Meeting Advance Email and Telephone Script (Appendix G.5). Because the FY 2023 administrative data requests will be generally the same as the most recent data collection, it is expected that State agencies will be familiar with the request.

  • The study team will send sampled SFA Directors the SFA Advance Email (Appendix G.11) and Brochure (Appendix G.4), which explain the purpose of the study and describe study activities, to provide SFA Directors with notice that they will be invited to respond to a survey.

  • The study team will send all State CN Directors the State Agency Invitation Email (Appendix G.6) prior to each survey to invite them to complete the survey.

  • The study team will send all SFA Directors the SFA Invitation Email (Appendix G.12) to invite them to complete the survey.

  • Recruiting materials were carefully developed to emphasize the following points, which may resonate with respondents:

  • The SY 2023-2024 SMO data collections, focusing on SY 2022-2023 operations, will address research questions in five primary topic areas: (1) school and site participation, (2) child participation, (3) program operations, (4) financial management, and (5) the nationwide CN COVID-19 Waiver to Allow Fiscal Action Flexibility for Meal Pattern Violations Related to COVID-19 Supply Chain Disruptions Impacting School Meals in School Year 2022-23.

  • Having updated information about CN Program operations at the state and local levels will help FNS inform policy and budget decisions, future training and technical assistance, and future nationwide waivers offered during emergencies.

  • The SA data collection is designed to gather information on statutory reporting requirements for the nationwide CN COVID-19 Waiver to Allow Fiscal Action Flexibility for Meal Pattern Violations Related to COVID-19 Supply Chain Disruptions Impacting School Meals in School Year 2022-23. State agencies’ full participation in the study will satisfy their statutory reporting requirements.

  • SMO has been designed to reduce participant burden by limiting the sample to the smallest possible number of SFAs needed to support the research and relying on administrative data whenever possible to avoid redundant information requests.

  • SFAs have been selected to participate as part of a nationally representative sample, which means that each SFA response will speak for many SFAs.

  • Designated FNS regional staff will serve as regional study liaisons and will be kept closely informed about the project so that they will be able to answer questions from CN Directors and encourage participation.

  • A toll-free number and study email address will be provided to all participants so that CN Directors and SFA Directors can receive assistance with the study. Staff will be readily available to clarify survey questions and work with participants to resolve technical issues, such as difficulty logging on or advancing past pages. Personalized assistance bolsters the perceived legitimacy of the study and will encourage respondents to persist in completing the survey.

  • For each survey, the Reminder Email (Appendix G.7) will be sent to State CN Directors and SFA Directors every 2-3 weeks for a total of two reminder emails before the reminder phone call (Appendix G.8) at week 7.

  • The study team will follow up by telephone with all CN Directors and SFA Directors who do not complete the survey and urge them to complete the survey.

  • The study team will use call scheduling procedures that are designed to call numbers at different times of the day (between 8am and 6pm) and days of the week (Monday through Friday) to improve the chances of finding a respondent at work.

  • Toward the end of the field period for each survey, all State agencies or SFAs who have yet to submit their surveys will be mailed the State Agency or SFA Last Chance Post Card (Appendices G.9 and G.13, respectively).

The team anticipates a 100% response rate for State agencies but an 80% response rate for SFAs, so a nonresponse bias analysis will be conducted to assess whether weighting appears to have mitigated the risk of nonresponse bias. Although nonresponse bias can rarely be measured directly, the study team plans to examine the set of available SFA characteristics to assess the risk of nonresponse bias. These characteristics include variables available on the FNS-742 file, such as size measures (number of schools and number of students) and percent of students with free or reduced-price lunch. The team will also include variables about the associated school district from the NCES files, such as charter school status, geographic characteristics, the number of each type of school in the district (elementary, middle, and/or high), and the grades offered in the district. The team will show frequency distributions for these variables for (1) the frame, (2) the sample (weighted by the release-adjusted sampling weight), (3) the respondents (weighted by the release-adjusted sampling weight), and (4) the respondents with their full nonresponse-adjusted weights. However, recognizing the limitations of the available data on SFA characteristics for analyzing nonresponse bias, the team will also compare early responders to late responders, and both to nonrespondents, to assess the value of using this information when calculating nonresponse adjustments to the weights.
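The weighted frequency comparisons described above can be illustrated with a minimal sketch. This is not the study team's actual code; the characteristic ("urbanicity"), weight names, and toy values are invented for illustration. The idea is that a gap between the sample's weighted distribution and the respondents' distribution under the same base weights signals a risk of nonresponse bias, and the nonresponse-adjusted weights should narrow that gap.

```python
# Purely illustrative sketch of the four weighted-distribution comparisons.
# All record fields and weight values below are hypothetical.

def weighted_distribution(records, weight_key):
    """Weighted frequency distribution of 'urbanicity' across records."""
    totals = {}
    total_weight = 0.0
    for rec in records:
        w = rec[weight_key]
        totals[rec["urbanicity"]] = totals.get(rec["urbanicity"], 0.0) + w
        total_weight += w
    return {k: v / total_weight for k, v in sorted(totals.items())}

# Toy sample: the rural respondent's nonresponse-adjusted weight (nr_wt)
# is inflated to also cover the rural nonrespondent; urban SFAs all
# responded, so their adjusted weights equal their base weights.
sample = [
    {"urbanicity": "urban", "base_wt": 10.0, "nr_wt": 10.0, "responded": True},
    {"urbanicity": "urban", "base_wt": 10.0, "nr_wt": 10.0, "responded": True},
    {"urbanicity": "rural", "base_wt": 15.0, "nr_wt": 30.0, "responded": True},
    {"urbanicity": "rural", "base_wt": 15.0, "nr_wt": 0.0,  "responded": False},
]

respondents = [r for r in sample if r["responded"]]

dist_sample = weighted_distribution(sample, "base_wt")          # full sample
dist_resp_base = weighted_distribution(respondents, "base_wt")  # respondents, base weights
dist_resp_adj = weighted_distribution(respondents, "nr_wt")     # respondents, adjusted weights

# dist_resp_base overrepresents urban SFAs relative to dist_sample (a bias
# risk); dist_resp_adj restores the sample's urban/rural shares, which is
# what the adjusted weights are intended to accomplish.
```

In practice the same tabulation would be run for each available FNS-742 and NCES characteristic, for the frame, the released sample, and the respondents under both weighting schemes.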

B4. Test of Procedures or Methods to be Undertaken

Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

The study team pretested the State and SFA CN Director surveys in March 2023. The pretest of the SA web survey was conducted with three State agencies that volunteered to participate. These SAs varied across relevant characteristics, including implementation of State-provided universal free meals. The pretest of the SFA survey was conducted with six SFAs that operate in four States and were available to participate within the pretest timeframe. These SFAs were selected to provide variation across key subgroups of interest, including size (number of schools served) and urbanicity. Because the SMO administrative data collection during the first three years of the study (SYs 2020-2021, 2021-2022, and 2022-2023) involved the same data collection instruments that will be used during SY 2023-2024, covering FY 2023, pretesting of those instruments was not conducted.

The study team conducted 30-minute telephone debriefing interviews with each pretest respondent to solicit feedback on the survey. The interviews asked respondents to identify and share concerns about unclear questions or response options, questions that took too long to answer, burden, and the flow of the survey. The team used cognitive methods to gauge respondents’ understanding of the intent of questions and response options, focusing on newly crafted or significantly revised survey items, or those that asked about topics that are complex or difficult to measure. Pretest findings are summarized in the SMO Option Period 2 Pre-test Findings Memo (Appendix M). Tables B4.1 and B4.2 summarize respondent feedback and the changes made to the surveys based on the pretest findings.

Table B4.1. Changes to State CN Director Survey based on respondent feedback

Questions

Respondents’ feedback

Survey changes

C1, C2, C3

One respondent noted that they could not easily access information for these questions but said the preparation note in the introduction will be helpful when they complete the survey in the fall.

No change. The preparation note indicates respondents will need to gather this data for the survey. The questions also include a “Don’t know” response option for respondents who cannot access the data or do not know how to respond.

C4

Respondents reported they returned FFVP funds because some schools are unable to use all their awarded funds due to staffing, supply chain, and other challenges that prevent them from being able to fully participate in the program.

We added the response option “Schools did not utilize all of their awarded funds” to a new question, C4, which addresses a new research question: “Why did States return FFVP funding to the USDA?”

D2, D4

One respondent reported that their State’s model of subsidizing universal free meals involves multiple components that made answering this question challenging.

No change. Respondents can select all subsidies that apply to their State and report any unique circumstances in the “Other, specify” text box.

E1

Two respondents were unsure how to respond to this question because their State’s policy is to implement the Federal policy.

We changed the response options to allow States with a Buy American policy to indicate “Yes, our State implements the Federal policy,” or “Yes, our State has a State-specific policy.”

E2, E3

One respondent suggested that Response Item 7 should be changed to “Encouragement for SFAs to order USDA foods in order to meet Buy American.”

No change. We do not recommend changing this response item as the question is about the components of the State’s Buy American policy.

E6

One respondent suggested including an option for exceptions when domestic foods are out of season.

No change. Seasonality can be captured in Items A and B: “The domestic food product is in inadequate supply,” and “The domestic food product is low quality.”

E9

Two respondents were unsure whether they should consider participation in LFS through Agricultural Marketing Service.

We added a clarification to the survey question, “If your State participated in the Local Food for Schools Cooperative Agreement Program (LFS), please do not consider this in your response.”

E11

One respondent was unsure what constituted training, and whether delivering a training is different from providing guidance.

We changed the question to ask about “training or guidance.”

CN = Child Nutrition; FFVP = Fresh Fruit and Vegetable Program; LFS = Local Food for Schools Cooperative Agreement Program; SFA = School Food Authority; USDA = U.S. Department of Agriculture.



Table B4.2. Changes to SFA Director Survey based on respondent feedback

Questions

Respondents’ feedback

Survey changes

Overall

One respondent said questions felt too brief, and they would have liked the opportunity to elaborate further in certain questions.

No change. Opportunities to provide more information are captured in the “Other, specify” response options for many questions. Adding additional short answer or write-in question formats could add to the burden and might shift the respondents’ focus away from the questions’ intention.

Survey navigation

One respondent commented it would be helpful if the electronic version of the survey had an option to save progress when partway through a section.

No change. Respondents will be able to save progress while taking the web version of the survey, as described in the instruction page.

Section B

One respondent suggested including a question about the impact of States’ models of financing universal free meals on participation, free lunch stigma, and meal service policies, such as Offer versus Serve.

No change. The State survey addresses subsidies and reimbursements to SFAs. The additional topics mentioned do not align with the research questions and adding a new question would modestly increase respondent burden.

D2, D3, D4, D5

Two respondents reported that meal prices in their SFA changed midyear, as their State allocated funding to provide reduced price meals at no cost. However, there was no response option to indicate this situation in the question.

We added the instruction, “If the price changed during the school year, please report the price charged for the majority of the school year.”

D6

One respondent selected “Increased lunch prices in all schools,” but reported that the price increase was not in response to the PLE provision. They were exempt from PLE because their operating balance was positive.

We italicized “in response to the Paid Lunch Equity provision” in the question, to make it clearer to respondents that their responses should consider the PLE provision.

E1

One respondent reported that some SFA directors might not know the specifics of the Keep Kids Fed Act of 2022.

We added a definition of the Keep Kids Fed Act of 2022 to the Glossary page.

E4

One respondent suggested including a question asking why SFAs did not receive Equipment Assistance Grants.

No change. This comment is not aligned with the research questions.

E11

One respondent reported it was difficult to estimate how much unpaid meal debt was recovered, because only a small percentage of households pay their unpaid meal debts.

We added the preparation note to the survey introduction, “You will be asked to report the total amount of money owed to your SFA because of unpaid meal charges and the amount that has been recovered since the end of SY 2022–2023.” Respondents can also report that they do not know how much money was recovered.

E12

One respondent reported that unpaid meal balances in their SFA are primarily paid through financial aid from their district, charitable donations, and the Local Education Agency.

We added a question asking about the proportion of unpaid meal balances SFAs recovered from household payments, district funding, charitable donations, state funding, and other sources.

F4

One respondent suggested changing Item D to “Using domestic commodities or products increases administrative burden when they are not available or cost more.”

No change. We do not recommend changing Item D, because Items A and C capture supply and cost challenges. Adding language to this item will change the meaning of the item.

F4a

One respondent commented that obtaining Buy American information from suppliers can be difficult and time consuming, and, although the suppliers are responsive, sometimes the information is not readily available.

We changed Item D to “Food suppliers are not responsive to requests or able to provide information about the percentage of U.S. content in end products,” to better encompass SFA experiences.

F5

Respondents indicated it would be helpful to have a definition or example of a threshold in the question, because SFA Directors might interpret this term differently.

We modified the question to include “For example, an SFA may use a threshold of 5 percent, meaning an exception to the Buy American provision is warranted if the cost of a domestic commodity or product is at least 5 percent more than the cost of a non-domestic commodity or product.”

F6

Two respondents reported their SFA is part of a buying group, so they are not always aware of whether their purchases include non-domestic commodities. Another respondent commented that “occasionally” should be a response option, as their SFA only occasionally procures food items under the exception to Buy American.

We added the instruction, “If your SFA procured any of these foods under an exception to the Buy American provision at any time in SY 2022–2023, please select ‘Yes.’” Respondents can select “Don’t know” if they do not have this information.

F10

One respondent commented that SFAs can use multiple units to track exceptions to Buy American, and a “Select all” question format would make responding to the question easier and make responses more accurate.

We changed the response format of Question F10 to “Select all that apply,” to allow respondents to select multiple units of measure.

F11

Some respondents reported difficulty reporting on the percentage of food procurements that were not domestic commodities or products.

We added a preparation note to the survey introduction, “You will be asked to report the percentage of food procurements that were not domestic commodities or products during SY 2022–2023.” Respondents are encouraged to report their best estimate and have the option to select “Don’t know.”

G3

Four respondents reported additional trainings or resources for determining ingredient costs and developing standardized recipes would be helpful.

We added two new response options: (1) “Determining ingredient costs or price per serving for standardized recipes” and (2) “Incorporating scratch cooking.”

G4

Respondents indicated online resources and materials are generally preferred; however, they prefer that new materials, and materials that will be distributed to schools, be provided as paper copies.

No change. We do not recommend changing this question as respondents did not report difficulty answering it.

G5

Two respondents indicated it would be helpful to encompass scratch and speed-scratch cooking in this question, as the schools in their SFA use both methods. Another respondent reported that schools in their SFA frequently use speed-scratch recipes, and they would have liked a way to indicate this in the survey.

We added use of speed-scratch cooking as a response option to Question G5 and created a new question, G6b, about how often SFAs use speed-scratch cooking.

G7

One respondent suggested splitting Item U into two response options because parents and students have different feelings about new recipes and menus.

We separated Item U into “Concern about students’ acceptance of new recipes or menus” and “Concern about parents’ acceptance of new recipes or menus.”

G8, G9

One respondent reported the frequency of offering nonanimal-based proteins at breakfasts and lunches differs by school level.

We changed Questions G8 and G9 to ask respondents to report how often plant-based meals were offered at each school level.

PLE = Paid Lunch Equity; SFA = School Food Authority; SY = school year.

B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Table B5.1 lists staff consulted on statistical aspects of the design. The same staff will be responsible for collecting and analyzing the study data.



Table B5.1. Individuals consulted on statistical aspects of study design

Mathematica staff

Title

Phone

Email

Kevin Conway

Project Director

609-750-4083

[email protected]

Barbara Carlson

Senior Statistician

617-674-8372

[email protected]

Andrew Gothro

Researcher

202-250-3569

[email protected]

Eric Grau

Senior Statistician

609-945-3330

[email protected]

Josh Leftin

Researcher

202-250-3531

[email protected]

Sarah Forrestal

Senior Survey Researcher

609-945-6616

[email protected]

Veronica Severn

Survey Researcher

617-715-6931

[email protected]

Liana Washburn

Researcher

202-250-3551

[email protected]

Eric Zeidman

Senior Survey Researcher

609-936-2784

[email protected]

USDA staff

Title

Phone

Email

Darcy Güngör, FNS

Social Science Research Analyst, Child Nutrition Evaluation Branch, Office of Policy Support


[email protected]

Conor McGovern, FNS

Branch Chief, Child Nutrition Evaluation Branch, Office of Policy Support


[email protected]

Susannah Barr, FNS

Social Science Research Analyst, Child Nutrition Evaluation Branch, Office of Policy Support


[email protected]

Maggie Applebaum, FNS

Deputy Associate Administrator, Child Nutrition Programs

703-305-2578

[email protected]

Janis Johnston, FNS

Director, Program Integrity & Innovation Division, Child Nutrition Programs

703-305-2106

[email protected]

Doug Kilburg

NASS Reviewer

202-690-8640

[email protected]



1 These constraints allow for SFA nonresponse and enable non-overlapping samples across the four potential data collection years included under the research contract between FNS and Mathematica, which covers SYs 2020-2021 through 2023-2024. Due to the COVID-19 pandemic, the SFA survey was dropped from the SY 2020-2021 and SY 2022-2023 data collections, resulting in two potential data collection years with an SFA survey.

2 These thresholds were selected to ensure precision levels for important subgroups were less than +/- 10 percentage points, while also ensuring a total sample size of at least 1,000 SFAs per year.

3 Upon OMB approval for the collection, the research team will merge the FNS-742 data with the NCES data in order to produce margin of error estimates for urbanicity and charter status.
