
Supporting Statement for

Revision to OMB # 0584-0607

Child Nutrition Program Operations Study II
(CN-OPS-II): Year 4



Part B





March 12, 2019



Holly Figueroa

Office of Policy Support

Food and Nutrition Service

United States Department of Agriculture

3101 Park Center Drive

Alexandria, VA 22302

Telephone: 703-305-2105

Email: [email protected]



Table of Contents

Part B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Respondent Universe and Sampling Method

B.2 Procedures for the Collection of Information

B.3 Methods to Maximize Response Rates and to Deal with Non-response

B.4 Tests of Procedures or Methods to be Undertaken

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data





Tables

B-1. Summary of Respondent Universe, Samples, and Expected Response Rates

B-2. SFA Universe and Sample Sizes and Estimated Design Effects by Stratum

B-3. Expected Precision Levels Based on the Proposed Sample Design for 1,750 Completed SFAs

B-4. Persons Consulted on Statistical Aspects of the Design

Appendices

A. Research Issues and Research Questions

B.1 CN Invitation Letter

B.2 CN Initial Follow-Up Email

B.3 CN Reminder Email

B.4 CN Director Telephone Reminder

B.5 CN Director Thank You Letter

B.6 State Email Notification of Selected SFAs

B.7 SFA Invitation Letter

B.8 Initial Follow-Up Email to SFA Directors

B.9 SFA Director Reminder Email

B.10 SFA Director Telephone Reminder

B.11 SFA Thank You Letter

C.1 State Child Nutrition Director Survey

C.2 State CN Director Survey (web version)

D.1 School Food Authority Director Survey

D.2 SFA Director Survey (web version)

E.1 60-Day Notice Comments

E.2 FNS Response to 60-Day Notice Comments

E.3 Comments from the National Agricultural Statistics Service (NASS)

E.4 FNS Response to NASS Comments

F.1 Section 28 of the Richard B. Russell National School Lunch Act

F.2 Section 305 of the Healthy Hunger-Free Kids Act of 2010

G 2M Research Employee Nondisclosure Agreement

H Annualized Burden and Cost of CN-OPS II Year 4

I CN-OPS II Year 4 Pretest Report

Part B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

The respondent universe for this information collection includes all 55 State Child Nutrition (CN) agencies and 14,854 public School Food Authorities (SFAs). As described below and in Supporting Statement B of the currently approved collection for the Child Nutrition Program Operations Study II (CN-OPS-II) (OMB Control No. 0584-0607, expiration date: 07/31/2020), in order to produce nationally representative samples while minimizing the number of SFAs that will be asked to participate in a survey in more than one year, the samples for all four study years were selected at the beginning of Year 1 (SY 2015–16). Thus, the respondent universe for the proposed Year 4 (SY 2018–19) surveys is derived from the 19,527 School Food Authorities (SFAs) that participated in the National School Lunch Program (NSLP) or the School Breakfast Program (SBP) and submitted form FNS-742 SFA Verification Collection Report Summary Data (FNS 742) (OMB # 0584-0594 Food Programs Reporting System [FPRS], expiration date 9/30/19) for SY 2014–15. The United States Department of Agriculture (USDA) Food and Nutrition Service (FNS) excluded 4,673 SFAs in the following categories:

  • SFAs that operate only in residential child care institutions that do not have daytime students;

  • SFAs that do not have students who are eligible for free or reduced price (FRP) meals; and

  • SFAs that operate only in private schools.

Thus, the universe of SFAs for this collection includes all 14,854 public SFAs with day students that participated in NSLP and/or SBP and submitted form FNS-742 for SY 2014–15. Table B-1 provides a summary of the respondent universe and expected response rates (described further below).

Table B‑1. Summary of Respondent Universe, Samples, and Expected Response Rates

Respondent | Universe | Initial Sample | Target Completed Cases | Response Rate: Year 1 (SY 15–16) | Year 2 (SY 16–17) | Year 3 (SY 17–18) | Year 4, Expected (SY 18–19)
States | 55 | 55 | 55 | 100% | 100% | 100% | 100%
SFAs | 14,854 | 2,188 | 1,750 | 82% | 77% | 76% | 80%
TOTAL | 14,909 | 2,243 | 1,805 | 82% | 78% | 77% | 80%



The FNS 742 database includes information on the location of the SFA, the number of students in the schools served by each SFA, and the number of students eligible to receive FRP meals in the schools served by each SFA. This information was used to create 10 strata based on both SFA size (number of students enrolled) and estimated percentage of students eligible for FRP meals (defined as high poverty [60 percent or more of enrolled students reported eligible for FRP meals] and low poverty [less than 60 percent of enrolled students reported eligible for FRP meals]). The universe was also stratified by the seven FNS regions such that each unit in the universe was coded into its respective region. The FNS 742 database was merged with the U.S. Department of Education’s Common Core of Data (CCD) Local Education Agency (LEA) Universe Survey File, which is maintained by the National Center for Education Statistics (NCES). Using the CCD data, the universe was then stratified by urbanicity—a classification derived by NCES that designates the location of school districts as urban, suburban, town, or rural areas. Most SFAs in the universe matched with the CCD data, but 584 did not. The unmatched SFAs were treated as a specific stratum in the urbanicity variable.

The explicit strata (SFA size and the estimated percentage of students eligible for FRP meals), including the number of SFAs and corresponding number of students in each stratum, are presented in the first five columns of Table B.2. This stratification plan allows FNS to establish samples that limit the number of SFAs required to complete the study in multiple years and produce estimates of population and sub-population parameters that meet the precision requirements of the study.

The sample design uses a combination of certainty selection and stratified probability proportionate-to-size (PPS) sampling to select the SFAs that will complete the survey in each year of the CN-OPS-II. The measure of size is the square root of the total number of students in the schools served by the SFA. The selected sample was then divided into four random subsets to create the SFA sample for each year, which minimizes the number of SFAs asked to participate in a survey in more than one year. Within each stratum, SFAs were sorted by FNS Region (defined by the seven FNS Regional offices) and by urbanicity status (urban, suburban, town, or rural) so that the selected sample was balanced on these additional implicit stratifying variables. As seen in Table B-2, all SFAs in strata 3–10 were selected with certainty for sampling in at least one year; PPS sampling was used in strata 1 and 2. A sketch of this selection procedure follows.
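To make the selection mechanics concrete, the sketch below illustrates certainty selection for strata 3–10, within-stratum systematic PPS selection for strata 1 and 2, and the random split into four annual subsets. This is a minimal illustration, not the study's production code: the file name, the column names (enrollment, stratum, fns_region, urbanicity), and the specific systematic PPS routine are assumptions, and the repeated participation of the largest SFAs noted in Table B-2 is simplified away.

```python
import numpy as np
import pandas as pd

def pps_systematic(frame: pd.DataFrame, n: int, seed: int = 12345) -> pd.DataFrame:
    """Systematic PPS selection of n SFAs, with measure of size equal to the
    square root of enrollment. Assumes the frame is pre-sorted by the implicit
    stratifiers (FNS region, urbanicity) and that no unit's size exceeds the
    sampling interval (very large SFAs sit in the certainty strata)."""
    rng = np.random.default_rng(seed)
    size = np.sqrt(frame["enrollment"].to_numpy(dtype=float))
    cum = np.cumsum(size)
    interval = cum[-1] / n
    start = rng.uniform(0, interval)
    hits = start + interval * np.arange(n)   # n equally spaced selection points
    rows = np.searchsorted(cum, hits)        # map each point to a frame row
    return frame.iloc[rows]

frame = pd.read_csv("fns742_frame.csv")      # hypothetical frame extract
frame = frame.sort_values(["stratum", "fns_region", "urbanicity"])

alloc = {1: 1567, 2: 4514}                   # 4-year PPS allocations (Table B-2)
parts = []
for stratum, grp in frame.groupby("stratum"):
    # Strata 3-10 are taken with certainty; strata 1 and 2 use PPS.
    parts.append(grp if stratum >= 3 else pps_systematic(grp, alloc[stratum]))

sample = pd.concat(parts)
# Divide the selected SFAs into four random, roughly equal annual subsets.
order = np.random.default_rng(2019).permutation(len(sample))
sample["survey_year"] = order % 4 + 1
```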

Table B‑2. SFA Universe and Sample Sizes and Estimated Design Effects by Stratum

Stratum | SFA Size (Students) | Poverty Level (<>60%)a | Universe: SFAs | Universe: Students | SFA Sample Selected to Support All 4 Years | Sample of SFAs Each Year | Expected SFA Respondents at 80% Response Rate | Design Effect: Point in Time | Design Effect: Year to Year
1 | 0–2,499 | High | 3,186 | 2,062,994 | 1,567 | 344 | 274 | 1.24 | 1.18
2 | 0–2,499 | Low | 7,810 | 6,587,739 | 4,514 | 988 | 790 | 1.14 | 1.09
3 | 2,500–4,999 | High | 421 | 1,457,686 | 421 | 92 | 74 | 0.83 | 0.78
4 | 2,500–4,999 | Low | 1,472 | 5,214,147 | 1,472 | 322 | 258 | 0.83 | 0.78
5 | 5,000–9,999 | High | 260 | 1,818,285 | 260 | 57 | 46 | 0.83 | 0.78
6 | 5,000–9,999 | Low | 793 | 5,514,825 | 793 | 173 | 139 | 0.83 | 0.78
7 | 10,000–99,999 | High | 256 | 5,940,334 | 256 | 56 | 45 | 0.83 | 0.78
8 | 10,000–99,999 | Low | 625 | 14,508,774 | 625 | 137 | 109 | 0.83 | 0.78
9 | 100,000–299,999 | All | 24 | 3,534,678 | 24b | 12 | 10 | 0.58 | 0.50
10 | 300,000+ | All | 7 | 4,330,908 | 7c | 7 | 6 | 0.14 | 0.11
Total | | | 14,854 | 50,970,370 | 9,939 | 2,188 | 1,750 | 1.161 | 1.103

Source: FNS 742 Database, SY 2014–15, and CCD 2014–15 LEA data.

a Percentage of enrolled students approved for FRP meals.

b Each of these SFAs was sampled to participate every 2 years, creating 48 selections in the 4-year sample.

c Each of these SFAs was sampled to participate in all 4 years, creating 28 selections in the overall sample.



As discussed in Section B.2, the precision requirements of the study call for 1,750 responses per year, or 7,000 over the 4-year study. The 4-year sample of 9,939 SFAs therefore includes reserve SFAs, because only 2,188 per year are needed to achieve 1,750 responses at an 80 percent response rate. A reserve SFA is part of the sample but is contacted only in special cases; for example, in earlier years some reserve SFAs were released to replace SFAs that had been selected for another study around the same time, minimizing the burden of FNS studies on selected SFAs. In addition, each year the SFA sample was matched to the FNS 742 database to identify newly formed SFAs, some of which were selected as reserve SFAs. The sampling weights were also adjusted so that the full sample was calibrated to the number of SFAs in the FNS 742 data for the target school year (SY 2018–19 for Year 4); a sketch of this calibration appears below.
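The calibration step can be summarized in a short sketch. This is a hypothetical illustration of stratum-level ratio calibration, assuming columns named weight and stratum and a series of target-year frame counts; it is not taken from the study's documentation.

```python
import pandas as pd

def calibrate_to_frame(sample: pd.DataFrame, frame_totals: pd.Series) -> pd.Series:
    """Ratio-adjust weights so that, within each stratum, they sum to the
    SFA count in the target-year FNS 742 frame (SY 2018-19 for Year 4)."""
    weighted_counts = sample.groupby("stratum")["weight"].sum()
    factors = frame_totals / weighted_counts      # one adjustment factor per stratum
    return sample["weight"] * sample["stratum"].map(factors)

# Hypothetical usage, with frame_totals indexed by stratum:
# sample["weight"] = calibrate_to_frame(sample, frame_totals)
```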

Expected Response Rates

Based on prior experience with SFA surveys conducted for other studies, the research team aims to achieve an SFA response rate of 80 percent. To reach this target, we will follow a multistep process: notifying key stakeholders about the study through well-established FNS communication channels, offering a user-friendly web interface for the survey, providing email and telephone support, and sending email and telephone reminders. The web survey also allows respondents to save their work, exit at any time, and finish the survey later. The research team therefore sampled 2,188 SFAs to obtain 1,750 completed surveys from SFA Directors. If the 80 percent target is not reached during the data collection period, FNS, through its Regional Offices, will ask State agencies to contact the unresponsive SFAs in their State and encourage participation. This step may extend the proposed data collection period, but in previous studies it has been effective in raising the response rate. The Year 2 and Year 3 surveys fell somewhat short of the 80 percent target, but an earlier start is planned for the Year 4 survey, which should make the target feasible. In any case, the study team will analyze the potential for nonresponse bias (described below) and make appropriate adjustments to the weights to minimize bias.

The Child Nutrition (CN) Director Survey will be conducted as a census of all 55 State CN directors (50 States, 4 U.S. Territories, and the District of Columbia) and will not involve sampling.1 We expect a 100 percent response rate for the CN Director Survey. The sample plan and data collection procedures are the same as those used in Years 1 through 3 of CN-OPS-II. The SFA Director response rate in Year 1 was 81.8 percent, and a 100 percent response rate was achieved from the 55 State CN directors. The Year 2 SFA Director response rate was 76.8 percent, and the Year 3 SFA Director response rate is estimated to be 76.3 percent.2

B.2 Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

The procedures and statistical methodology of this collection are consistent with those described in the previously approved information collection request; no changes have been made since the collection was last approved. The information will be collected via web surveys of State CN Directors and SFA Directors. Respondents will have at least 9 weeks to complete a survey, which allows time to plan their approach. Some respondents may ask other staff members to complete different sections of the survey, which is facilitated by forwarding the survey link. Respondents will receive reminder emails and calls from trained survey support personnel, whom they may also call or email for help in completing the survey or resolving technical issues. If desired, respondents can complete some questions by phone, because the support personnel have access to the respondent's survey during calls.

Statistical Methodology for Stratification and Sample Selection

FNS requires that the survey sample design result in a nationally representative sample of SFAs and students participating in NSLP and SBP. Additionally, FNS requires point-in-time national estimates of population totals and percentages with a precision of ±5 percentage points at the 95 percent level of confidence, and ±10 percentage points at the 95 percent level of confidence for subgroups of SFAs. The primary subgroups of interest are defined by SFA size (number of students), urbanicity (as defined by the CCD), poverty (percentage of students reported eligible for FRP meals), and geographic region (as defined by FNS). Additionally, FNS requires precision to detect differences between year-to-year estimates of ±10 percentage points at the 95 percent level of confidence. Under simple random sampling, these requirements translate to sample sizes of 400–500 responding SFAs for each subgroup; for example, with four key subgroups of roughly equal size, the total sample needed to meet the precision requirements would range from 1,750 to 2,000 SFAs. (A rough check of these figures appears below.)
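As a plausibility check on the 400–500 figure (our own back-of-envelope calculation, assuming a conservative proportion of p = 0.5 and 80 percent power for the year-to-year comparison, neither of which is stated above), the standard two-sample size formula for detecting a difference d between two independent annual estimates gives

$$
n \;=\; \frac{\left(z_{\alpha/2} + z_{\beta}\right)^{2}\,\bigl[p_1(1-p_1) + p_2(1-p_2)\bigr]}{d^{2}}
\;=\; \frac{(1.96 + 0.84)^{2}\,(0.25 + 0.25)}{(0.10)^{2}} \;\approx\; 392
$$

responding SFAs per subgroup per year; design effects modestly above 1 push this into the quoted 400–500 range.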

In general, however, simple random sampling is not efficient for the multiple analytic objectives of the study. For example, while a simple random (or self-weighting) sample would be optimal for estimating the overall proportion of SFAs reporting various types of food service practices or programs, it would be inefficient for estimating the number of students involved in these types of services or programs. A stratified sample design that also accounts for the size of the SFA, as measured by the number of students, is necessary to meet these competing objectives. Stratification not only helps to ensure that adequate sample sizes are obtained for important analytic subgroups of interest, but it can also be effective in reducing the sampling errors of estimates that are correlated with enrollment size.

As indicated in Section B.1, an SFA-level database derived from the 2014–15 Verification Collection Report data was used to construct the SFA sampling frame. In addition to a unique identifier, the name of the SFA, and the State in which the SFA is located, the database includes information about the type of the SFA/school district (public or private), the number of schools participating in NSLP/SBP, the total enrollment in participating schools, and the number of students eligible for FRP meals. This information, along with data from the 2014–15 CCD LEA universe file, was used to stratify SFAs for sampling purposes. All known eligible SFAs, including those that could not be matched to the CCD file, were included in the sampling frame.

The types of SFA/district-level variables that can be used either as explicit or implicit stratifiers include region (defined by the seven FNS regions),3 enrollment size, a measure of poverty status defined by the percentage of students eligible for FRP meals, minority status defined by the percentage of students who were non-white, type of locale or urbanicity (e.g., central city, suburban, town, rural), and instructional level of the schools served by the SFA (e.g., elementary schools only, secondary schools only, or both).4 Because many of these characteristics are related, it was not necessary to employ all of them in the stratification to account for variation among SFAs. Two variables were used to create the strata: SFA enrollment size and poverty status. A total sample of 2,188 SFAs was allocated to the strata as discussed in Section B.1.

Estimation Procedure

For estimation purposes, the sampling weights will reflect the overall probabilities of selection and of responding to the survey. The first step in the weighting process will be to assign a base weight to each sampled SFA. The base weight is equal to the reciprocal of the probability of selecting the SFA for the study, which will vary by sampling stratum under the stratified sample design. Next, the base weights will be adjusted for nonresponse within cells consisting of SFAs that are expected to be homogeneous with respect to response propensity. To determine the appropriate adjustment cells, we will conduct a nonresponse analysis to identify characteristics of SFAs that are correlated with nonresponse. The potential set of predictors to be used to define the adjustment cells will include SFA-level characteristics that are available from the FNS 742 database, and data from the most recent CCD file. Within these cells, a weighted response rate will be computed and applied to the SFA base weights to obtain the corresponding nonresponse-adjusted weights.
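A compact sketch of this two-step weighting follows. The column names (p_select, responded, nr_cell) are placeholders for illustration; the actual adjustment cells would come from the nonresponse analysis described above.

```python
import pandas as pd

def nonresponse_adjusted_weights(sample: pd.DataFrame) -> pd.Series:
    """Step 1: base weight = reciprocal of the selection probability.
    Step 2: within each adjustment cell, inflate respondents' weights by the
    inverse of the weighted response rate; nonrespondents receive zero."""
    base = 1.0 / sample["p_select"]
    cell = sample["nr_cell"]
    rate = (base * sample["responded"]).groupby(cell).sum() / base.groupby(cell).sum()
    return base * sample["responded"] / cell.map(rate)

# Hypothetical usage, with 'responded' coded 0/1:
# sample["final_weight"] = nonresponse_adjusted_weights(sample)
```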

To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, subsamples or “replicates” will be formed in a way that preserves the basic features of the full sample design. A set of weights (referred to as “replicate weights”) will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and for each of the jackknife replicates. The variability of the replicate estimates is used to obtain the variance of the survey statistic. The replicate weights can be imported into variance estimation software (e.g., SAS, STATA, SUDAAN, WESVAR) to calculate standard errors of the survey-based estimates. In addition to the replicate weights, stratum and unit codes will be provided in the data files to permit calculation of standard errors using Taylor series approximations if desired. Note that while replication and Taylor series methods often produce similar results, jackknife replication has some advantages when weights are adjusted to account for nonresponse.5
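The replication logic can be illustrated with the delete-one (JK1) variant shown below; the study's replicates would be formed to respect the stratified design, with the stratum factors conventionally folded into the replicate weights. This is a schematic sketch under those assumptions, not the contractor's implementation.

```python
import numpy as np

def weighted_mean(y: np.ndarray, w: np.ndarray) -> float:
    """A survey estimate computed under a given set of weights."""
    return float(np.sum(w * y) / np.sum(w))

def jk1_variance(theta_full: float, theta_reps: np.ndarray) -> float:
    """Delete-one jackknife variance: the spread of the replicate estimates
    around the full-sample estimate."""
    r = len(theta_reps)
    return (r - 1) / r * float(np.sum((theta_reps - theta_full) ** 2))

# Hypothetical usage with an n x R matrix of replicate weights, rep_w:
# theta_full = weighted_mean(y, w_full)
# theta_reps = np.array([weighted_mean(y, rep_w[:, r]) for r in range(rep_w.shape[1])])
# se = np.sqrt(jk1_variance(theta_full, theta_reps))
```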

Degree of Accuracy Needed for Purpose Described in Justification

Table B‑3. Expected Precision Levels Based on the Proposed Sample Design for 1,750 Completed SFAs

Domain | Population | Completed SFA Interviews | Precision for Point in Time | MDD Year-to-Year
Small | 10,996 | 1,064 | 3.2% | 4.5%
Medium | 2,946 | 516 | 3.9% | 5.4%
Large | 912 | 170 | 6.7% | 9.2%
City | 1,798 | 239 | 6.3% | 8.6%
Suburban | 3,287 | 477 | 4.4% | 6.0%
Town | 2,512 | 314 | 5.6% | 7.7%
Rural | 6,672 | 730 | 3.8% | 5.2%
Missing urbanicity | 584 | - | - | -
Poverty: High | 4,123 | 438 | 4.9% | 6.7%
Poverty: Low | 10,731 | 1,312 | 2.7% | 3.7%
Mid-Atlantic | 1,494 | 188 | 7.1% | 9.8%
Midwest | 3,798 | 432 | 4.8% | 6.7%
Mountain | 2,295 | 248 | 6.5% | 9.0%
Northeast | 1,641 | 198 | 7.0% | 9.7%
Southeast | 1,235 | 174 | 7.2% | 9.8%
Southwest | 2,239 | 249 | 6.4% | 8.8%
Western | 2,079 | 252 | 6.2% | 8.5%
Total/Overall | 14,854 | 1,750 | 2.5% | 3.5%

Source: FNS 2014–15 742 Database, CCD 2014–15 LEA data.

Note: Of the 14,854 SFAs in the FNS 742 database, we were unable to match 584 to the CCD, resulting in missing urbanicity information as shown in Table B.3.
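The table's half-widths are consistent with the textbook formula for a proportion that combines a design effect with a finite population correction. The check below is our own approximation, assuming the worst-case p = 0.5 and the overall design effect from Table B-2; it reproduces the overall row to within rounding.

```python
import math

def half_width(n: int, N: int, deff: float, p: float = 0.5, z: float = 1.96) -> float:
    """95% confidence half-width for an estimated proportion, with a
    design effect (deff) and finite population correction."""
    return z * math.sqrt(deff * p * (1 - p) / n * (1 - n / N))

# Overall row of Table B-3: 1,750 completes from 14,854 SFAs, deff ~ 1.161.
print(round(100 * half_width(1750, 14854, 1.161), 1))  # ~2.4, vs. 2.5% in the table
```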



Unusual Problems Requiring Specialized Sampling Procedures

We do not anticipate any unusual problems requiring any specialized sampling procedures.

Any Use of Periodic (Less Frequent Than Annual) Data Collection Cycles to Reduce Burden.

Data have been collected annually in SY 2015–16, SY 2016–17, and SY 2017–18, and will be collected again in SY 2018–19. Because collection occurs every year, periodic (less frequent than annual) data collection cycles are not applicable.

B.3 Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Overall response rate projections were presented in Section B.1. Achieving the specified response rate involves contacting the States and the selected SFAs, securing their participation in the survey, and then offering support and completion reminders, using the procedures described below. We estimate 80 percent of the sampled SFA Directors will complete the web-administered survey. We expect 100 percent of State CN Directors to complete their survey.

The procedures for maximizing the number of respondents to the surveys are:

  • The letters inviting SFA Directors and CN Directors to participate were carefully developed to emphasize the importance of this study and how the information will help FNS to better understand and address current policy issues related to CN program operations.

  • The current contact information will be used for initial correspondence and will be updated as needed throughout the data collection period to facilitate communication with the respondents.

  • Designated FNS regional staff will serve as regional study liaisons and will be kept closely informed about the project so that they will be able to answer questions from SFA and CN Directors and encourage participation.

  • A toll-free number and study email address will be provided to all participants so that SFA and CN Directors can receive assistance with the study.

  • CN Directors and sampled SFA Directors will have the option of completing the web-based survey via telephone with a trained survey support professional entering information into the web survey.

  • Periodic email reminders will be sent to sampled SFA and CN Directors who have not yet completed their respective surveys.

  • We will follow up by telephone with all sampled SFA and CN Directors who do not complete the survey within a specified period and urge them to complete the survey. We will emphasize that if the CN or SFA Director prefers to complete the survey or remaining sections of the survey over the telephone, a telephone interviewer will administer the web-based survey or remaining parts of the survey over the telephone.

  • We will use call scheduling procedures designed to call numbers at different times of the day (between 8 a.m. and 6 p.m.) and on different days of the week (Monday through Friday) to improve the chances of reaching a respondent at work.

If the final SFA Directors survey response rate drops below 80 percent, a nonresponse bias analysis will be conducted, and weighting adjustments to correct for potential biases will be performed as indicated in Section B.2.

B.4 Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

Two State CN Directors and five SFA Directors participated in the pretest of the survey instruments for CN-OPS-II Year 4. The pretest took place in August 2018. All pretest respondents completed paper versions of the instruments and were asked to record start and end times for specific sections of the survey to assess burden. The primary goal of the pretest, however, was to test new or substantially revised questions for clarity, phrasing, and appropriateness of response options.

Participants completed the paper survey, scanned it, and returned the scanned survey by email. Survey staff then conducted 30-minute debriefing interviews with each pretest respondent to solicit feedback, asking respondents to identify any questions and sections that were unclear and to recommend changes to question wording. Respondents provided substantial useful feedback, which was used to revise the instruments (see Appendices C and D). Changes included minor terminology updates (such as changing "school foodservices" to "school nutrition services" throughout the surveys), additions to response options, and reordering of questions to reduce respondent burden (for example, SFA Directors indicated that many schools did not make accommodations for special dietary needs or preferences, so questions were reordered to allow respondents to skip this set of questions if not relevant to their SFA). Additional pretest details concerning who was in the test sample, how they were chosen, the evaluation criteria for the test, and specific comments from pretest participants are summarized in the CN-OPS-II Year 4 Pretest Report in Appendix I.

B.5 Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Table B.4 presents a summary of individuals consulted on statistical aspects of the design. 2M Research (2M) and Mathematica Policy Research (Mathematica) will conduct data collection and analysis for this study, in coordination with FNS.

Table B‑4. Persons Consulted on Statistical Aspects of the Design

Name | Affiliation | Telephone Number | Email
Anne Gordon | 2M | 817-856-0891 | [email protected]
Michael Sinclair | Mathematica | 202-552-6439 | [email protected]
Eric Zeidman | Mathematica | 609-936-2784 | [email protected]
Charlotte Cabili | Mathematica | 202-238-3322 | [email protected]
Holly Figueroa | FNS/USDA | 703-305-2105 | [email protected]
Jennifer Rhorer | NASS/USDA | 202-720-3026 | [email protected]



1 In Years 2 and 3, Puerto Rico and the U.S. Virgin Islands were exempt due to Hurricane Maria. Nonetheless, the U.S. Virgin Islands completed the CN Director Survey in Year 3.

2 Final Year 3 response rates will be provided over the next few months as data analysis progresses.

3 The seven regions (and States) are: Northeast (CT, ME, MA, NH, NY, RI, VT), Mid-Atlantic (DE, DC, MD, NJ, PA, PR, VA, VI, WV), Southeast (AL, FL, GA, KY, MS, NC, SC, TN), Midwest (IL, IN, MI, MN, OH, WI), Southwest (AR, LA, NM, OK, TX), Mountain Plains (CO, IA, KS, MO, MT, NE, ND, SD, UT, WY), and Western (AK, AZ, CA, GU, HI, ID, NV, OR, WA).

4 Elementary school is defined as any school with any span of grades from kindergarten through grade 6, such as K–4, 4–6, or K–5. Middle or junior high school is defined as any school that has no grade lower than grade 6 and no grade higher than grade 9, such as grade 6 only, 6–7, or 6–9. High school is defined as any school that has no grade lower than grade 9 and continues through grade 12, such as grade 9 only, 9–10, or 9–12. Schools that do not fit these definitions, such as 6–12, K–8, or K–12, are categorized as “other.”

5 Rust, K. F., & Rao, J. N. K. (1996). Variance estimation for complex surveys using replication techniques. Statistical Methods in Medical Research, 5, 283–310.


