Child Nutrition Program Operations Study II (CN-OPS II)

OMB: 0584-0607

Part B: Collection of Information Using Statistical Methods


Supporting Justification for OMB Clearance for the Child Nutrition Program Operations Study II (CN-OPS II)



Part B




December 23, 2015










Office of Policy Support

Food and Nutrition Service

United States Department of Agriculture

3101 Park Center Drive

Alexandria, VA 22302

Project Officer: Devin Wallace-Williams

Telephone: 703-457-6791

Email: [email protected]


Table of Contents


Part B. Collections of Information Employing Statistical Methods


B.1 Respondent Universe and Sampling Methods

B.2 Procedures for the Collection of Information

B.3 Methods to Maximize the Response Rates and to Deal with Nonresponse

B.4 Test of Procedures or Methods to be Undertaken

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Tables

Table B1. Distribution of eligible SFAs in the 2014-15 FNS-742 universe file (sampling frame) by enrollment size and percent of students eligible for free/reduced price lunch

Table B2. Expected margins of error* for various sample sizes (n) and design effects (DEFF)

Table B3. Proposed sample sizes for the SFA survey

Table B4. Expected responding sample sizes and corresponding precision of an estimated proportion under proposed design for selected analytic domains


Appendices


A. Research Issues and Research Questions

B1. Email Notification to Regional Offices

B2. Invitation Letter to State Agency Child Nutrition Directors/Frequently Asked Questions (FAQs)/Web Survey Information Sheet

B3. Follow-Up E-Mail to State CN Directors

B4.1. CN Director Telephone Interviewer Script

B4.2. SFA Director Telephone Interviewer Script

B5. Thank You Letter to State CN Director for Completing the CN-OPS II Survey

B6. Email Notification to State Child Nutrition Directors, List of Participating SFAs, and Frequently Asked Questions (FAQs)

B7. Invitation Letter to SFA Directors, Frequently Asked Questions (FAQs), and Web Survey Information Sheet

B8. Follow-Up E-Mail to SFA Directors

B9. Thank You Letter to SFA Director for Completing the CN-OPS II Survey

C. State Child Nutrition Director Survey 2015-2016

D. School Food Authority (SFA) Director Survey 2015-2016

E.1. Comments from SNA

E.2. FNS Response to Comments

F.1. Section 28 of the Richard B. Russell National School Lunch Act, Amended Through February 2014

F.2. Healthy Hunger-Free Kids Act of 2010

G. Table of Estimated Burdens




PART B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.



The respondent universe for the proposed Year 1 survey includes all SFAs operating in public school districts in the United States and outlying territories that were required to submit form FNS-742, the SFA Verification Collection Report Summary Data (approved under OMB# 0584-0026, 7 CFR Part 245, Determining Eligibility for Free & Reduced Price Meals, expiration date 4/30/2016), to FNS in SY 2014-15. In general, all SFAs that participated in the National School Lunch Program (NSLP) or the School Breakfast Program (SBP) are included in the respondent universe, with the following exceptions:


  • SFAs that operate only in Residential Child Care Institutions that do not have daytime students;

  • SFAs that do not have students who are eligible for free/reduced-price lunch;

  • SFAs in some outlying territories that are not required to complete form FNS-742; and

  • Private schools that participate in the NSLP.


The SY 2014-15 FNS-742 database was used to construct the SFA sampling frame (i.e., the universe file) from which the respondent samples will be drawn. There were over 19,000 SFAs in the 2014-15 FNS-742 database. However, only the approximately 15,000 SFAs operating in public school districts were included in the sampling frame. Note that the unit of analysis for the study is the SFA, which usually (but not always) coincides with a local education agency (LEA), as defined in the U.S. Department of Education’s Common Core of Data (CCD) Local Education Agency Universe Survey File maintained by the National Center for Education Statistics (NCES). Exceptions include SFAs that operate school food programs for multiple school districts and those operating individual schools (e.g., some public charter schools). In the 2014-15 FNS-742 database, approximately 96 percent of the eligible SFAs matched a district (LEA) in the CCD universe file. Those that did not match remained in the sampling frame with an indicator denoting that they do not have associated CCD data. Table B1 summarizes the distribution of eligible SFAs in the sampling frame by enrollment size and poverty status based on the percentage of students eligible for free/reduced-price lunch.


Table B1. Distribution of eligible SFAs in the 2014-15 FNS-742 universe file (sampling frame) by enrollment size and percent of students eligible for free/reduced price lunch

Percent eligible for free/reduced price lunch | Enrollment size class1 | Number of SFAs
Under 60 percent   | 0-2,499              | 8,278
                   | 2,500-4,999          | 1,600
                   | 5,000-9,999          | 824
                   | 10,000-99,999        | 656
                   | 100,000-299,999      | 12
                   | Certainty (300,000+) | 2
60 percent or more | 0-2,499              | 2,610
                   | 2,500-4,999          | 326
                   | 5,000-9,999          | 210
                   | 10,000-99,999        | 216
                   | 100,000-299,999      | 10
                   | Certainty (300,000+) | 4
All SFAs           | Total                | 14,748

1 Number of students with access to NSLP/SBP as reported in the 2014-15 FNS-742.



Expected Response Rates


The response rate is the proportion of sampled SFAs that complete the SFA survey. Based on prior experience with SFA surveys conducted for other studies, we expect to achieve an SFA response rate of 80 percent. To achieve this rate, we will follow a multi-step process: notifying SFAs of the study through well-established FNS communication channels, providing a user-friendly web interface to the survey, offering email and telephone support, and sending email and telephone reminders. Additionally, the web survey allows respondents to save and exit at any time and then complete the survey later. The study team therefore plans to sample 2,188 SFAs to obtain 1,750 completed surveys from SFA directors. If 80 percent is not reached during the data collection period, FNS, through its Regional Offices, will ask State agencies to contact the unresponsive SFAs in their States and encourage participation. This process may extend the proposed data collection period, but in previous studies it has been effective in achieving the 80 percent response rate. Additionally, we will analyze the potential for nonresponse bias (described below) and make appropriate adjustments to the weights to minimize bias.
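The arithmetic linking the target number of completes to the selected sample size is:

\[
n_{\text{sampled}} \;=\; \left\lceil \frac{n_{\text{completes}}}{r} \right\rceil \;=\; \left\lceil \frac{1{,}750}{0.80} \right\rceil \;=\; 2{,}188.
\]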


The State Child Nutrition Director survey will be conducted as a census of all 55 State directors (covering all States, U.S. Territories, and the District of Columbia) and will not involve sampling. We expect a 100 percent response rate for the State Child Nutrition Director survey. Because the States differ substantially from one another, sampling States would require a complex design, which would increase the sample size; the burden savings from sampling would not justify the loss of information about the heterogeneity of the States. The overall response rate across the two surveys will be approximately 80.4 percent.



B.2 Describe the procedures for the collection of information, including:


  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


This section describes the procedures for the collection of information, including statistical methodology for stratification and sample selection, estimation procedure, and the degree of accuracy needed for the purpose described in the justification.


A goal of the survey sample design is to obtain a nationally representative sample of SFAs that will yield population estimates with a precision of ±5 percent at the 95 percent level of confidence for the overall SFA population and for specified subgroups of SFAs. Under simple random sampling, this translates to a sample size of 400-500 responding SFAs for each subgroup. For example, with four key subgroups of roughly equal size, the total sample size required to meet the specified precision levels would range from 1,750 to 2,000 SFAs. In general, however, simple random sampling is not efficient for the multiple analytic objectives of the study. For example, while a simple random (or self-weighting) sample would be optimal for estimating the overall prevalence of SFAs reporting various types of food service practices or programs, it would be inefficient for estimating the number of students involved in these types of services or programs. A stratified sample design using variable rates that depend on the size of the SFA better meets these competing objectives. Stratification not only helps to ensure that adequate sample sizes are obtained for important analytic subgroups of interest, but can also be effective in reducing the sampling errors of estimates that are correlated with enrollment size.
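For reference, the standard simple-random-sampling calculation behind the 400-500 figure (a textbook formula, not stated explicitly above) is:

\[
n_0 \;=\; \frac{z^{2}\,p(1-p)}{e^{2}} \;=\; \frac{1.96^{2}\times 0.5\times 0.5}{0.05^{2}} \;\approx\; 384,
\]

which rises to roughly 400-500 responding SFAs per subgroup once a modest design effect is allowed for.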


A stratified sampling design employing varying sampling fractions was used to select the SFA sample for the study. Such a design will generally inflate the standard errors of prevalence estimates as compared with simple random sampling, but is justifiable for the reasons mentioned above. A measure of the relative precision of a complex sample design is given by the design effect (DEFF), defined as the ratio of the variance of an estimate based on the complex sample design to the hypothetical variance based on a simple random sample of the same size. A design effect of 1.00 means that the complex sample is roughly equivalent to a simple random sample in terms of sampling precision. (A design effect less than 1.00 can sometimes occur if the sampling rates in some strata are very high, resulting in non-negligible finite population correction factors.) Under the proposed design, we have estimated that the resulting design effects will range from 0.17 to 1.46. As indicated in Table B2, which summarizes the expected margins of error of a prevalence estimate under the proposed design for a range of sample sizes and design effects, a total sample of 1,750 responding SFAs should be more than adequate to meet or exceed the ±5 percent precision requirement, even for a design effect as large as 1.46.
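In symbols, this definition is:

\[
\mathrm{DEFF}(\hat{\theta}) \;=\; \frac{\operatorname{Var}_{\text{complex}}(\hat{\theta})}{\operatorname{Var}_{\text{SRS}}(\hat{\theta})},
\]

where \(\hat{\theta}\) is the survey estimate of interest.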


Table B2. Expected margins of error* for various sample sizes (n) and design effects (DEFF)

      n | DEFF = 1.10 | DEFF = 1.25 | DEFF = 1.50
    100 |       11.0% |       12.5% |       15.0%
    200 |        7.8% |        8.8% |       10.6%
    300 |        6.4% |        7.2% |        8.7%
    400 |        5.5% |        6.3% |        7.5%
    500 |        4.9% |        5.6% |        6.7%
    600 |        4.5% |        5.1% |        6.1%
    700 |        4.2% |        4.7% |        5.7%
    800 |        3.9% |        4.4% |        5.3%
    900 |        3.7% |        4.2% |        5.0%
  1,000 |        3.5% |        4.0% |        4.7%
  1,100 |        3.3% |        3.8% |        4.5%
  1,200 |        3.2% |        3.6% |        4.3%
  1,300 |        3.1% |        3.5% |        4.2%
  1,400 |        2.9% |        3.3% |        4.0%
  1,500 |        2.8% |        3.2% |        3.9%

*Entries correspond to 95% confidence limits for an estimated prevalence of approximately 50%. For prevalence estimates below or above 50%, the margins of error will be smaller than those shown in the table.
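A minimal sketch of how entries like these can be reproduced (inferred from the tabled values themselves, not a formula stated in this document): each cell matches the simple-random-sampling margin of error for p = 0.5 with z ≈ 2, scaled by the tabled DEFF factor as a direct multiplier.

```python
import math

def table_b2_moe(n: int, deff: float, p: float = 0.5, z: float = 2.0) -> float:
    """Margin of error as tabled in Table B2: the SRS margin z*sqrt(p(1-p)/n)
    scaled by the tabled factor. (The conventional textbook scaling uses
    sqrt(DEFF); the direct multiplier is what reproduces these entries.)"""
    return deff * z * math.sqrt(p * (1 - p) / n)

# Spot-check a few cells against the table above.
for n, deff, expected in [(100, 1.10, "11.0%"), (500, 1.25, "5.6%"), (1500, 1.50, "3.9%")]:
    print(f"n={n:>5}, DEFF={deff:.2f}: {table_b2_moe(n, deff):.1%} (table: {expected})")
```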



Sample Stratification and Selection


As indicated in Section B.1, an SFA-level database derived from 2014-15 Verification Collection Report data was used to construct the SFA sampling frame. In addition to a unique identifier (SFAID), the name of the SFA, and the State in which the SFA is located, the database includes information about the control type of the SFA/school district (public or private), the number of schools participating in the NSLP/SBP, the total enrollment in participating schools, and the number of students eligible for free or reduced-price lunch. This information, along with data from the most recent CCD LEA universe file, where applicable, was used to stratify SFAs for sampling purposes. All known eligible SFAs, including those that could not be matched to the then-current CCD file, were included in the sampling frame; any unmatched SFAs that are selected for the sample will be treated as a separate category.


The types of SFA/district-level variables that can be used either as explicit or implicit stratifiers include region (defined by the seven FNS regions1), enrollment size, a measure of poverty status defined by the percent of students eligible for free/reduced-price lunch, minority status defined by the percent of students who are non-white, type of locale (e.g., central city, suburban, town, rural), and instructional level of schools served by the SFA (e.g., elementary schools only, secondary schools only, or both)2. Because many of these characteristics are related, it was not necessary to employ all of them in the stratification to account for variation in the SFAs. Two variables were used to create the strata: SFA enrollment size and poverty status. A total sample of 2,188 SFAs was allocated to the strata in proportion to the square root of the total enrollment of SFAs within each stratum (see the sketch following this paragraph). Such an allocation gives large SFAs relatively higher selection probabilities than smaller ones and provides acceptable sampling precision both for prevalence estimates (e.g., the proportion of SFAs with a specified characteristic) and for numeric measures correlated with enrollment (e.g., the number of students in SFAs with access to various food services or programs). Because CN-OPS II will include future studies, the sample design also attempts to minimize the number of SFAs that will be asked to participate each year.
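A minimal sketch of square-root allocation, assuming stratum enrollment totals are taken from the frame. The stratum names and enrollment figures below are illustrative placeholders, not the study's actual totals.

```python
import math

def sqrt_allocation(total_sample: int, stratum_enrollment: dict[str, float]) -> dict[str, int]:
    """Allocate a total sample to strata in proportion to the square root
    of each stratum's total enrollment, as described above."""
    roots = {h: math.sqrt(e) for h, e in stratum_enrollment.items()}
    scale = total_sample / sum(roots.values())
    # Note: rounding can leave the allocations' sum off the target by one
    # or two; a production allocation would repair this (e.g., by
    # largest-remainder adjustment).
    return {h: round(r * scale) for h, r in roots.items()}

# Hypothetical stratum enrollment totals (illustrative only).
enrollment = {"low-poverty/small": 4.0e6, "low-poverty/large": 9.0e6,
              "high-poverty/small": 1.0e6, "high-poverty/large": 4.0e6}
print(sqrt_allocation(2188, enrollment))
# Square roots are 2000, 3000, 1000, 2000, so the allocation is
# proportional to 2:3:1:2 rather than to raw enrollment (4:9:1:4).
```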




Table B3. Proposed sample sizes for the SFA survey

Percent eligible for free/reduced price lunch1 | Enrollment size class2 | Expected number of SFAs to be sampled | Expected number of responding SFAs3
Under 60 percent   | 0-2,499              |   980 |   784
                   | 2,500-4,999          |   400 |   320
                   | 5,000-9,999          |   206 |   165
                   | 10,000-99,999        |   164 |   131
60 percent or more | 0-2,499              |   232 |   186
                   | 2,500-4,999          |    82 |    65
                   | 5,000-9,999          |    53 |    42
                   | 10,000-99,999        |    54 |    43
                   | 100,000-299,999      |    11 |     9
                   | Certainty (300,000+) |     6 |     5
All SFAs           | Total                | 2,188 | 1,750

1 Calculated from the number of students eligible for free or reduced price lunch as reported in the 2014-15 FNS-742.

2 Number of students with access to NSLP/SBP as reported in the 2014-15 FNS-742.

3 Based on an 80% response rate. Note: See Table B4 for additional breakouts of the sample.



Expected Levels of Precision


Table B4 summarizes the approximate survey sample sizes and precision to be expected under the proposed design for selected subgroups. The precision figures in Table B4 reflect design effects ranging from 0.17 to approximately 1.5, depending on the subgroup. The design effect primarily reflects the fact that, under the proposed stratified design, large SFAs will be sampled at relatively higher rates (i.e., have smaller sampling weights) than small SFAs.
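One common way to quantify the design effect arising from unequal weighting is Kish's approximation, a standard survey-sampling formula offered here as a sketch rather than the study's exact method:

```python
def kish_deff(weights: list[float]) -> float:
    """Kish's unequal-weighting design effect:
    DEFF = n * sum(w_i^2) / (sum(w_i))^2, which equals 1.0 when all
    weights are equal and grows with weight variability."""
    n = len(weights)
    return n * sum(w * w for w in weights) / sum(weights) ** 2

# Equal weights give 1.0; mixing small weights (large SFAs sampled at
# higher rates) with large weights inflates the design effect.
print(kish_deff([1.0] * 10))             # 1.0
print(kish_deff([1.0] * 5 + [4.0] * 5))  # 1.36
```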



Table B4. Expected responding sample sizes and corresponding precision of an estimated proportion under proposed design for selected analytic domains

Domain (subset)                                           | Number of respondents completing survey* | Precision
Total sample                                              | 1,750 | 2.8%
Percent of students eligible for free/reduced price lunch |       |
  Less than 60                                            | 1,414 | 2.7%
  60 or more                                              |   336 | 5.8%
FNS Region                                                |       |
  Mid-Atlantic                                            |   193 | 7.3%
  Midwest                                                 |   430 | 5.1%
  Mountain Plains                                         |   243 | 7.0%
  Northeast                                               |   201 | 7.2%
  Southeast                                               |   178 | 7.2%
  Southwest                                               |   246 | 6.7%
  Western                                                 |   259 | 6.3%
SFA Enrollment Size                                       |       |
  Small                                                   |   970 | 3.7%
  Medium                                                  |   592 | 3.6%
  Large                                                   |   188 | 6.5%

* Expected number of responding eligible SFAs, assuming a response rate of 80 percent. Precision entries are 95 percent confidence half-widths (plus or minus the stated percentage) and assume unequal weighting design effects ranging from 0.17 to 1.46, depending on the subgroup.



Estimation and Calculation of Sampling Errors


For estimation purposes, sampling weights reflecting the overall probabilities of selection and differential nonresponse rates will be attached to each data record providing usable SFA data. The first step in the weighting process will be to assign a base weight to each sampled SFA. The base weight is equal to the reciprocal of the probability of selecting the SFA for the study, which will vary by sampling stratum under the proposed stratified sample design. Next, the base weights will be adjusted for nonresponse within cells consisting of SFAs that are expected to be homogeneous with respect to response propensity. To determine the appropriate adjustment cells, we will conduct a nonresponse bias analysis to identify characteristics of SFAs that are correlated with nonresponse. The potential set of predictors used to define the adjustment cells will include SFA-level characteristics available from the FNS database and data from the most recent CCD file. Within each cell, a weighted response rate will be computed, and the base weights of responding SFAs will be divided by that rate to obtain the corresponding nonresponse-adjusted weights.
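A minimal sketch of this two-step weighting, assuming each sampled SFA record carries its selection probability, a response indicator, and an adjustment-cell label. The field names here are illustrative, not the study's.

```python
from collections import defaultdict

def nonresponse_adjusted_weights(sfas: list[dict]) -> None:
    """Step 1: base weight = 1 / selection probability.
    Step 2: within each adjustment cell, divide responders' base weights
    by the cell's weighted response rate, so responders also carry the
    weight of similar nonresponders."""
    for s in sfas:
        s["base_w"] = 1.0 / s["p_select"]
    cells = defaultdict(list)
    for s in sfas:
        cells[s["cell"]].append(s)
    for members in cells.values():
        total = sum(s["base_w"] for s in members)
        responded = sum(s["base_w"] for s in members if s["responded"])
        rate = responded / total  # weighted response rate in the cell
        for s in members:
            s["final_w"] = s["base_w"] / rate if s["responded"] else 0.0

# Illustrative records: two cells with different response propensities.
sample = [
    {"p_select": 0.10, "cell": "small", "responded": True},
    {"p_select": 0.10, "cell": "small", "responded": False},
    {"p_select": 0.50, "cell": "large", "responded": True},
]
nonresponse_adjusted_weights(sample)
print([round(s["final_w"], 1) for s in sample])  # [20.0, 0.0, 2.0]
```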


To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, subsamples or "replicates" will be formed in a way that preserves the basic features of the full sample design. A set of weights (referred to as “replicate weights”) will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and for each of the jackknife replicates. The variability of the replicate estimates is used to obtain the variance of the survey statistic. The replicate weights can be imported into variance estimation software (e.g., SAS, STATA, SUDAAN, WESVAR) to calculate standard errors of the survey-based estimates. In addition to the replicate weights, stratum and unit codes will be provided in the data files to permit calculation of standard errors using Taylor series approximations if desired. Note that while replication and Taylor series methods often produce similar results, jackknife replication has some advantages in reflecting statistical adjustments used in weighting such as nonresponse and post-stratification (e.g., see Rust, K.F., and Rao, J.N.K., 1996. Variance estimation for complex surveys using replication techniques. Statistical Methods in Medical Research, 5: 283-310).
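A minimal sketch of delete-one-group jackknife variance estimation (a generic JK1 illustration under simplifying assumptions; the study's actual replicates would preserve the stratified design):

```python
import random

def jackknife_variance(units: list[tuple[float, float]], n_groups: int = 10) -> float:
    """Delete-one-group (JK1) jackknife variance for a weighted total.

    units: (value, weight) pairs. Each replicate drops one group and
    rescales the remaining weights by G/(G-1); the variance estimate is
    (G-1)/G times the sum over replicates of (t_r - t_full)^2."""
    groups = [i % n_groups for i in range(len(units))]  # systematic grouping
    full = sum(v * w for v, w in units)
    scale = n_groups / (n_groups - 1)
    total_sq = 0.0
    for g in range(n_groups):
        t_r = sum(v * w * scale for (v, w), grp in zip(units, groups) if grp != g)
        total_sq += (t_r - full) ** 2
    return total_sq * (n_groups - 1) / n_groups

random.seed(7)
data = [(random.random(), 1.0) for _ in range(200)]
print(jackknife_variance(data) ** 0.5)  # jackknife standard error of the total
```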

Unusual Problems Requiring Specialized Sampling Procedures


We do not anticipate any unusual problems requiring specialized sampling procedures.


Any Use of Periodic (Less Frequent Than Annual) Data Collection Cycles to Reduce Burden.


The data collection procedures will be conducted annually.


B.3 Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Overall response rate projections were presented earlier. Achieving the specified response rates involves locating sample members and securing their participation using the procedures described below. We estimate that 80 percent of the sampled SFA directors will complete the web-administered survey, and we expect 100 percent of State Child Nutrition directors to complete their survey.


Below we describe procedures to be followed to maximize the number of sample members who complete the survey:

  • The letters inviting SFA Directors and State Child Nutrition Directors to participate will be carefully developed to emphasize the importance of this study and how the information will help the Food and Nutrition Service (FNS) better understand and address current policy issues related to Special Nutrition Program (SNP) operations.

  • The current contact information will be used for all initial correspondence and be updated as needed throughout the data collection period to facilitate communication with the study team.

  • Designated FNS regional staff will serve as regional study liaisons and be kept closely informed of the project so that they will be able to answer questions from SFAs and States and encourage participation.

  • A toll free number and study email address will be provided so that SFAs and States can receive assistance with the study.

  • Sampled SFA directors and State Child Nutrition directors will have the option of completing the survey by telephone rather than on the web.

  • Periodic email reminders will be sent to sample members who have not yet completed the survey.

  • We will follow up by telephone with all sampled SFA and State Child Nutrition directors who do not complete the survey within a specified period and urge them to complete the survey. At that point, if the directors prefer to complete the survey or remaining sections of the survey over the telephone, a telephone interviewer will administer the survey or remaining parts over the telephone.


The following procedures will be used to maximize the completion rates for surveys that are administered by telephone:


  • Use a core of interviewers with experience working on telephone surveys, particularly interviewers who have proven their ability to obtain cooperation from a high proportion of sample members.

  • Conduct a telephone interviewer training session specific to this study.

  • Use call scheduling procedures that are designed to call numbers at different times of the day (between 8am and 6pm) and days of the week (Monday through Friday), to improve the chances of finding a respondent at work.

  • Provide a toll-free number and email help address for respondents to verify the study’s legitimacy or to ask other questions about the study.


B.4 Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


The draft State Agency (SA) and School Food Authority (SFA) Director surveys were pre-tested in mid-August 2015. The pre-test instruments included newly developed questions and questions that were substantially revised from previous FNS surveys. These were evaluated in terms of understandability (e.g., confusing wording or layout, failure to grasp the intent of a question) and length of time to answer. Additionally, all pre-test participants were debriefed by phone and given opportunities to offer general comments about the instruments. Two State Child Nutrition (CN) directors and six SFA directors from four States participated in the pre-test.


To arrange the pre-tests, we chose five States known to have SFAs that varied with respect to use of special provisions (e.g., Provisions 2 and 3, the Community Eligibility Provision) and use of direct verification. FNS asked the Mid-Atlantic and Southeast Regional Offices to email the State agencies in West Virginia, New Jersey, North Carolina, Virginia, and Kentucky about their potential involvement in the Child Nutrition Program Operations Study II (CN-OPS II). The email asked State CN directors whether they would be interested in participating in a pre-test of the State Child Nutrition Director Survey and whether they could recommend five SFA directors in their State to potentially participate in a pre-test of the SFA Director Survey. For each recommended SFA, we asked State CN directors to indicate whether the SFA participated in special provisions and/or used direct verification, two important survey topics that would allow for some diversity in selecting respondents. After FNS and the Regional Offices made this initial outreach, we contacted the State agencies via email and follow-up phone calls. We selected the New Jersey and Kentucky State CN directors to participate in the State Director Survey pre-test based on their ability to participate within the necessary timeframe. We then prioritized outreach to the specifically recommended SFAs to maximize the likelihood of recruiting SFAs that varied in size, participation in special provisions, and use of direct verification.

Child Nutrition Director Survey Pre-Test Findings

Both pre-test respondents said the questions in the CN Director survey were clearly worded and easy to answer without much input from other staff. During the debriefing, it was explained to respondents that we would be fielding the survey starting in February 2016. When asked if they thought this was an appropriate time of year to field the survey, both respondents expressed that they should be able to answer questions in the survey by then. Both pre-test respondents estimated it took them about one hour to complete the pre-test survey.


One key concern expressed by both pre-test respondents was that they were not able to answer the questions related to hiring standards, including the number of new SFA directors hired and the percentages of SFA directors meeting hiring requirements. Respondents said they did not have access to these data, even if they had records from administrative reviews. As a result, one question was dropped, while three others were revised to be yes/no questions. The other key concern pertained to the amount of time it would take to determine how long schools had participated in Provisions 1 and 2 and the Community Eligibility Provision (CEP). This question was therefore simplified so that it asks only about the number of schools operating under the CEP.


SFA Director Survey Pre-Test Findings

Pre-test participants for the SFA Director survey indicated that fielding the survey in February 2016 would be an appropriate time for them to answer the questions. This included questions related to the new professional standards rule that went into effect on July 1, 2015 as well as the numbers of students directly certified by October 31, 2015 and after October 31, 2015.

As a result of pre-testing, five questions were dropped and seventeen questions were modified. Most of the modifications added a definition or an example so that questions ask for the desired information more precisely. Two questions were modified to provide simpler choices for the respondents. With these revisions and deletions, we estimate that the survey will take two hours to complete. This estimate is based on the debriefing of pre-testers and on the experience of SFAs responding to similar questions in other studies.



B.5 Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



The Contractor, 2M Research Services, and its Subcontractor, Mathematica Policy Research, will conduct this study.





Name                   | Affiliation | Telephone Number | E-mail
Jim Murdoch            | 2M          | 817-856-0863     | [email protected]
Roderick Harrison      | 2M          | 202-266-9901     | [email protected]
Eric Zeidman           | Mathematica | 609-936-2784     | [email protected]
Charlotte Cabili       | Mathematica | 202-238-3322     | [email protected]
Michael Sinclair       | Mathematica | 202-552-6439     | [email protected]
Devin Wallace-Williams | FNS/USDA    | 703-457-6791     | [email protected]
Jennifer Rhorer        | NASS/USDA   | 202-720-2616     | [email protected]





1 The seven regions (and states) are: Northeast (CT, ME, MA, NH, NY, RI, VT), Mid-Atlantic (DE, DC, MD, NJ, PA, PR, VA, VI, WV), Southeast (AL, FL, GA, KY, MS, NC, SC, TN), Midwest (IL, IN, MI, MN, OH, WI), Southwest (AR, LA, NM, OK, TX), Mountain Plains (CO, IA, KS, MO, MT, NE, ND, SD, UT, WY), and Western (AK, AZ, CA, GU, HI, ID, NV, OR, WA).

2 Elementary school is defined as any school with any span of grades from kindergarten through grade 6. Middle or junior high school is defined as any school that has no grade lower than grade 6 and no grade higher than grade 9. High school is defined as any school that has no grade lower than grade 9 and continues through grade 12. Schools that do not fit these definitions are categorized as “other.”


