
Supporting Justification for OMB Clearance for the Child Nutrition Program Operations Study II (CN-OPS II)

OMB # 0584-0607



Part B



March 31, 2017










Office of Policy Support

Food and Nutrition Service

United States Department of Agriculture

3101 Park Center Drive

Alexandria, VA 22302

Project Officer: Devin Wallace-Williams

Telephone: 703-457-6791

Email: [email protected]


Table of Contents


Part B. Collections of Information Employing Statistical Methods

B.1 Respondent Universe and Sampling Methods

B.2 Procedures for the Collection of Information

B.3 Methods to Maximize the Response Rates and to Deal with Nonresponse

B.4 Test of Procedures or Methods to be Undertaken

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


Tables

Table B1. Universe and Sample Sizes and Estimated Design Effects by Stratum

Table B2. Expected Precision Levels Based on the Proposed Sample Design for 1,750 Completed SFAs




Part B. Collections of Information Employing Statistical Methods

B.1 Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

The respondent universe for the proposed Years 2, 3, and 4 (School Year (SY) 2016–17, SY 2017–18, and SY 2018–19) surveys is derived from the 19,527 School Food Authorities (SFAs) that participated in the National School Lunch Program (NSLP) or the School Breakfast Program (SBP) and submitted form FNS-742, SFA Verification Collection Report Summary Data (FNS-742) (approved under OMB # 0584-0594, Food Programs Reporting System (FPRS), expiration date 9/30/19), to FNS for SY 2014–15. FNS excluded 4,673 SFAs, leaving a universe of N = 14,854. The following SFAs were excluded:

  • SFAs that operate only in Residential Child Care Institutions that do not have daytime students;

  • SFAs that do not have students who are eligible for free/reduced price (FRP) lunch; and

  • SFAs that operate only in private schools.

In summary, the universe includes all 14,854 public SFAs with daytime students that participated in the NSLP or the SBP and submitted form FNS-742 to FNS for SY 2014–15.

The FNS-742 database includes information on the location of each SFA, the number of students in the schools it serves, and the number of those students eligible to receive FRP meals. This information was used to create 10 strata based on both SFA size (number of students enrolled) and estimated percentage of students eligible for FRP meals (high poverty: 60 percent or more FRP students; low poverty: less than 60 percent FRP students). The universe was also stratified by the seven FNS regions, with each unit coded into its respective region. The FNS-742 database was then merged with the U.S. Department of Education's Common Core of Data (CCD) Local Education Agency Universe Survey File, which is maintained by the National Center for Education Statistics (NCES). Using the CCD data, the universe was further stratified by urbanicity, an NCES classification that places school districts in urban, suburban, town, or rural areas. All but 584 SFAs in the universe matched to the CCD data; the unmatched units were treated as their own stratum within the urbanicity variable.
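To make the frame construction concrete, the following is a minimal pandas sketch of the merge and stratum assignment described above. The file names, the column names (sfa_id, enrollment, frp_students, lea_id), and the single-key match are illustrative assumptions, not the actual layouts of the FNS-742 or CCD files.

```python
import pandas as pd

# Hypothetical inputs: one row per SFA (FNS-742) and one row per LEA (CCD).
fns742 = pd.read_csv("fns742_sy2014_15.csv")
ccd = pd.read_csv("ccd_lea_universe.csv")

# Attach urbanicity from the CCD; unmatched SFAs (584 in the actual frame)
# become their own category within the urbanicity variable.
frame = fns742.merge(ccd[["lea_id", "urbanicity"]],
                     left_on="sfa_id", right_on="lea_id", how="left")
frame["urbanicity"] = frame["urbanicity"].fillna("unmatched")

# Poverty stratum: high poverty = 60 percent or more FRP-eligible students.
frp_share = frame["frp_students"] / frame["enrollment"]
frame["poverty"] = frp_share.ge(0.60).map({True: "high", False: "low"})

# Size stratum from the enrollment cut points used in Table B1.
frame["size_stratum"] = pd.cut(
    frame["enrollment"],
    bins=[0, 2_500, 5_000, 10_000, 100_000, 300_000, float("inf")],
    labels=["0-2,499", "2,500-4,999", "5,000-9,999",
            "10,000-99,999", "100,000-299,999", "300,000+"],
    right=False)
```

In the actual design, the two largest size classes are not split by poverty level, which yields the 10 strata shown in Table B1.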

The explicit strata (SFA size and the estimated percentage of students eligible for FRP meals), including the number of SFAs and corresponding number of students in each stratum, are presented in the first five columns of Table B1. This stratification plan allows FNS to draw samples that limit the number of SFAs required to complete the study in multiple years and to produce estimates of population and subpopulation parameters that meet the precision requirements of the study.



Table B1. Universe and Sample Sizes and Estimated Design Effects by Stratum

| Stratum | SFA Size (Students) | Poverty Levela | Universe: SFAs | Universe: Students | SFA Sample Selected to Support All 4 Years | Sample of SFAs Each Year | Expected SFA Respondents at 80% Response Rate | Design Effect: Point in Time | Design Effect: Year to Year |
| 1 | 0–2,499 | High | 3,186 | 2,062,994 | 1,567 | 344 | 274 | 1.24 | 1.18 |
| 2 | 0–2,499 | Low | 7,810 | 6,587,739 | 4,514 | 988 | 790 | 1.14 | 1.09 |
| 3 | 2,500–4,999 | High | 421 | 1,457,686 | 421 | 92 | 74 | 0.83 | 0.78 |
| 4 | 2,500–4,999 | Low | 1,472 | 5,214,147 | 1,472 | 322 | 258 | 0.83 | 0.78 |
| 5 | 5,000–9,999 | High | 260 | 1,818,285 | 260 | 57 | 46 | 0.83 | 0.78 |
| 6 | 5,000–9,999 | Low | 793 | 5,514,825 | 793 | 173 | 139 | 0.83 | 0.78 |
| 7 | 10,000–99,999 | High | 256 | 5,940,334 | 256 | 56 | 45 | 0.83 | 0.78 |
| 8 | 10,000–99,999 | Low | 625 | 14,508,774 | 625 | 137 | 109 | 0.83 | 0.78 |
| 9 | 100,000–299,999 | All | 24 | 3,534,678 | 24b | 12 | 10 | 0.58 | 0.50 |
| 10 | 300,000+ | All | 7 | 4,330,908 | 7c | 7 | 6 | 0.14 | 0.11 |
| Total | | | 14,854 | 50,970,370 | 9,939 | 2,188 | 1,750 | 1.161 | 1.103 |

Source: FNS-742 database and Common Core of Data (CCD) 2014–15 LEA data.

a Percentage of enrolled students reported eligible for FRP meals: high = 60 percent or more; low = less than 60 percent.

b Each of these SFAs was sampled to participate every 2 years, creating 48 selections in the 4-year sample.

c Each of these SFAs was sampled to participate in all 4 years, creating 28 selections in the overall sample.

The sample design uses a combination of certainty selection and stratified probability-proportionate-to-size (PPS) sampling of SFAs to select those that will complete the survey within the entire 4-year period of CN-OPS-II. The measure of size is the total number of students in the schools served by the SFA. This overall sample was then divided into four random subsets to create the SFA sample for each year, minimizing the number of SFAs asked to participate in a survey in more than one year. Within each stratum, SFAs were sorted by FNS Region (defined by the seven FNS Regional offices) and by urbanicity status (urban, suburban, town, or rural) so that the selected sample was balanced on these additional implicit stratifying variables. As seen in Table B1, all SFAs in strata 3–10 were selected for sampling in at least one year; PPS sampling was used in strata 1 and 2.
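To make the noncertainty selection concrete, the sketch below shows systematic PPS sampling within a single stratum, as used for strata 1 and 2. The column names (enrollment, region, urbanicity) are illustrative, and the sketch assumes no SFA is large enough to be hit twice (such units would be removed as certainty selections first); it is not the study's production code.

```python
import numpy as np
import pandas as pd

def pps_systematic(stratum: pd.DataFrame, n: int, seed: int = 2017) -> pd.DataFrame:
    """Systematic PPS selection of n SFAs, with enrollment as the size measure."""
    # Sorting by FNS region and urbanicity first makes those variables
    # implicit stratifiers, balancing the sample across them.
    df = stratum.sort_values(["region", "urbanicity"]).reset_index(drop=True)
    cum_size = df["enrollment"].cumsum().to_numpy(dtype=float)
    interval = cum_size[-1] / n                     # sampling interval
    start = np.random.default_rng(seed).uniform(0, interval)
    hits = start + interval * np.arange(n)          # n equally spaced points
    picks = np.searchsorted(cum_size, hits)         # unit spanning each point
    return df.iloc[picks]
```

Under this scheme, an SFA's selection probability is approximately n times its share of the stratum's total enrollment, which later becomes the basis of its base weight.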

As discussed in section B.2, the precision requirements of the study call for 1,750 responses per year, or 7,000 over the 4-year study. The 4-year sample of 9,939 SFAs includes reserve SFAs, because only 2,188 need to be fielded each year to yield 1,750 completed surveys at an 80 percent response rate. A reserve SFA is part of the sample but is contacted only in special cases, such as replacing an SFA found to be ineligible or releasing additional reserves to guarantee 1,750 responses.

Expected Response Rates

The response rate is the proportion of sampled SFAs that complete the SFA survey. Based on prior experience with SFA surveys conducted for other studies, the research team expects to achieve an SFA response rate of 80 percent. To reach this rate, we will follow a multi-step process: notifying SFAs of the study through well-established FNS communication channels, offering a user-friendly web interface for the survey, providing email and telephone support, and sending email and telephone reminders. The web survey also allows respondents to save their work, exit at any time, and complete the survey later. The research team therefore plans to sample 2,188 SFAs to obtain 1,750 completed surveys from SFA directors. If 80 percent is not reached during the data collection period, FNS, through its Regional offices, will ask State agencies to contact the unresponsive SFAs in their States and encourage participation. This process may extend the proposed data collection period, but in previous studies it has been effective in achieving the 80 percent response rate. Additionally, we will analyze the potential for nonresponse bias (described below) and make appropriate adjustments to the weights to minimize bias.

The Child Nutrition (CN) Director Survey will be conducted as a census of all 55 State CN directors (50 States, 4 U.S. Territories, and the District of Columbia) and will not involve sampling. We expect a 100 percent response rate for the CN Director Survey. The overall response rate for the study will be approximately 80.5 percent ((1,750 + 55)/(2,188 + 55)). The sample plan and data collection procedures are similar to those used in Year 1 of CN-OPS-II, in which the overall response rate was 81.1 percent, including a 100 percent response rate from the 55 State CN directors.

B.2 Describe the procedures for the collection of information including:

  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.

The information will be collected primarily through web surveys. Respondents will have at least 10 weeks to complete a survey, which allows time to plan their approach. Some respondents may ask other staff members to complete different sections of the survey, which is facilitated by forwarding the survey link. Respondents will receive reminder emails and calls from trained survey support personnel, and they may call or email those personnel for help completing the survey or with technical issues. If desired, respondents can complete some questions over the phone, because the trained support personnel can access the respondent's survey during phone calls via computer-assisted telephone interview software.

Degree of accuracy needed for the purpose described in the justification

FNS requires that the survey sample design result in a nationally representative sample of SFAs and of students participating in the NSLP and the SBP. Additionally, FNS requires population point-in-time estimates with a precision of ±5 percentage points at the 95 percent confidence level, and ±10 percentage points at the 95 percent confidence level for subgroups of SFAs. The primary subgroups of interest are defined by SFA size (number of students), urbanicity (as defined by the CCD), poverty (percentage of students eligible for FRP meals), and geographic region (as defined by FNS). FNS also requires precision sufficient to detect year-to-year differences of ±10 percentage points at the 95 percent confidence level. Under simple random sampling, these requirements translate to sample sizes of 400–500 responding SFAs for each subgroup; for example, with four key subgroups of roughly equal size, the total sample required to meet the precision targets would range from 1,750 to 2,000 SFAs. In general, however, simple random sampling is not efficient for the multiple analytic objectives of the study. While a simple random (or self-weighting) sample would be optimal for estimating the overall proportion of SFAs reporting various types of food service practices or programs, it would be inefficient for estimating the number of students involved in those services or programs. A stratified sample design that also accounts for SFA size, as measured by the number of students, is necessary to meet these competing objectives. Stratification not only helps ensure adequate sample sizes for important analytic subgroups, but can also reduce the sampling errors of estimates that are correlated with enrollment size.
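As a rough check on the 400–500 figure, the standard simple random sampling formulas at the most conservative p = 0.5 give the following for the ±5 percentage point point-in-time target and the ±10 percentage point year-to-year target (the 80 percent power term in the second formula is our illustrative assumption; the text states only the confidence level):

```latex
n_{\text{point}} = \frac{z_{\alpha/2}^{2}\, p(1-p)}{e^{2}}
                 = \frac{(1.96)^{2}(0.5)(0.5)}{(0.05)^{2}} \approx 385,
\qquad
n_{\text{diff}} = \frac{2\,(z_{\alpha/2}+z_{\beta})^{2}\, p(1-p)}{d^{2}}
                = \frac{2\,(1.96+0.84)^{2}(0.25)}{(0.10)^{2}} \approx 392.
```

Allowing for design effects above 1 and for nonresponse, both calculations land in the 400–500 range cited above.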

A stratified sampling design that employs varying sampling probabilities will be used to select the SFA sample for the study. This design will generally inflate the standard errors of estimates compared with simple random sampling, but it is justifiable for the reasons mentioned above. A measure of the relative precision of a complex sample design is the design effect (DEFF), defined as the ratio of the variance of an estimate under the complex sample design to the hypothetical variance under a simple random sample of the same size. The design effect is not a fixed characteristic of the sample; it differs from variable to variable. A DEFF of 1.00 for a variable means that the complex sample is roughly equivalent to a simple random sample in terms of precision. A DEFF less than 1.00 can occur when the sampling rates in some strata are very high, producing non-negligible finite population correction factors. A DEFF of 2 means the sample would need to be twice the size of a simple random sample to achieve the same precision, while a DEFF of 0.5 means it would need to be only half the size. Under the proposed design, we estimate that the resulting DEFFs will range from 0.17 to 1.46 across subgroups; because we expect to select most of the SFAs in some strata, the design effects for some subgroups will be less than one. As shown in Table B2, based on the expected margins of error for estimates of proportions for the population and for key subgroups, a total of 1,750 responding SFAs will be adequate to meet or exceed the ±5 percentage point precision requirement for point-in-time estimates. Similarly, the minimum detectable difference (MDD) is less than 10 percentage points for all subgroups, indicating that the responding sample will meet the precision requirement for year-to-year differences of ±10 percentage points across subgroups.
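The DEFF enters the precision calculations directly. Applying the overall design effects from Table B1 (1.161 point in time, 1.103 year to year) to the n = 1,750 completed surveys at the conservative p = 0.5 reproduces the Total/Overall row of Table B2:

```latex
\text{MOE} = z_{\alpha/2}\sqrt{\frac{\mathrm{DEFF}\; p(1-p)}{n}}
           = 1.96\sqrt{\frac{1.161 \times 0.25}{1750}} \approx 0.025,
\qquad
\text{MDD} = z_{\alpha/2}\sqrt{\frac{2\,\mathrm{DEFF}\; p(1-p)}{n}}
           = 1.96\sqrt{\frac{2 \times 1.103 \times 0.25}{1750}} \approx 0.035.
```

(Like the stated year-to-year requirement, the MDD expression here is based on the 95 percent confidence level alone, without a separate power term.)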


Table B2. Expected Precision Levels Based on the Proposed Sample Design for 1,750 Completed SFAs

| Domain | Population | Completed SFA Interviews | Precision for Point in Time | Minimum Detectable Difference (MDD) Year to Year |
| Small | 10,996 | 1,064 | 3.2% | 4.5% |
| Medium | 2,946 | 516 | 3.9% | 5.4% |
| Large | 912 | 170 | 6.7% | 9.2% |
| City | 1,798 | 239 | 6.3% | 8.6% |
| Suburban | 3,287 | 477 | 4.4% | 6.0% |
| Town | 2,512 | 314 | 5.6% | 7.7% |
| Rural | 6,672 | 730 | 3.8% | 5.2% |
| Missing urbanicity | 584 | – | – | – |
| Poverty: High | 4,123 | 438 | 4.9% | 6.7% |
| Poverty: Low | 10,731 | 1,312 | 2.7% | 3.7% |
| Mid-Atlantic | 1,494 | 188 | 7.1% | 9.8% |
| Midwest | 3,798 | 432 | 4.8% | 6.7% |
| Mountain Plains | 2,295 | 248 | 6.5% | 9.0% |
| Northeast | 1,641 | 198 | 7.0% | 9.7% |
| Southeast | 1,235 | 174 | 7.2% | 9.8% |
| Southwest | 2,239 | 249 | 6.4% | 8.8% |
| Western | 2,079 | 252 | 6.2% | 8.5% |
| Total/Overall | 14,854 | 1,750 | 2.5% | 3.5% |

Source: FNS-742 database and Common Core of Data (CCD) 2014–15 LEA data.

Of the 14,854 SFAs in the FNS-742 database, 584 could not be matched to the CCD, resulting in missing urbanicity information, as shown in Table B2.


Statistical Methodology for Sample Stratification and Selection

As indicated in Section B.1, an SFA-level database derived from the SY 2014–15 Verification Collection Report data was used to construct the SFA sampling frame. In addition to a unique identifier (SFAID), the name of the SFA, and the State in which the SFA is located, the database includes the type of SFA/school district (public or private), the number of schools participating in the NSLP/SBP, total enrollment in participating schools, and the number of students eligible for FRP meals. This information, along with data from the most recent CCD LEA universe file where applicable, was used to stratify SFAs for sampling purposes. All known eligible SFAs, including those that could not be matched to the most current CCD file, were included in the sampling frame.

The SFA/district-level variables that can serve as explicit or implicit stratifiers include region (defined by the seven FNS regions1), enrollment size, poverty status (percent of students eligible for FRP meals), minority status (percent of students who are non-white), type of locale or urbanicity (e.g., central city, suburban, town, rural), and instructional level of schools served by the SFA (e.g., elementary schools only, secondary schools only, or both).2 Because many of these characteristics are related, it was not necessary to employ all of them in the stratification to account for variation among SFAs. Two variables were used to create the strata: SFA enrollment size and poverty status. A total sample of 2,188 SFAs was allocated to the strata as discussed in section B.1.


Table B2 summarizes the approximate survey sample sizes and precision to be expected under the proposed design for selected subgroups. The standard errors in Table B2 reflect design effects ranging from 0.17 to approximately 1.5, depending on subgroup. The design effect primarily reflects the fact that under the proposed stratified design, large SFAs will be sampled at relatively higher rates (i.e., have smaller sampling weights) than small SFAs.

Estimation Procedure

For estimation purposes, the sampling weights will reflect the overall probabilities of selection and of responding to the survey. The first step in the weighting process will be to assign a base weight to each sampled SFA. The base weight is equal to the reciprocal of the probability of selecting the SFA for the study, which will vary by sampling stratum under the proposed stratified sample design. Next, the base weights will be adjusted for nonresponse within cells consisting of SFAs that are expected to be homogeneous with respect to response propensity. To determine the appropriate adjustment cells, we will conduct a nonresponse bias analysis to identify characteristics of SFAs that are correlated with nonresponse. The potential set of predictors to be used to define the adjustment cells will include SFA-level characteristics that are available from the FNS database, and data from the most recent CCD file. Within these cells, a weighted response rate will be computed and applied to the SFA base weights to obtain the corresponding nonresponse-adjusted weights.
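A minimal sketch of this two-step weighting follows; the file and column names (selection_prob, responded, adjustment_cell) are illustrative, not the study's actual variable names.

```python
import pandas as pd

sample = pd.read_csv("sfa_sample.csv")   # one row per sampled SFA

# Step 1: base weight = reciprocal of the probability of selection
# (constant within a sampling stratum under the stratified design).
sample["base_weight"] = 1.0 / sample["selection_prob"]

# Step 2: weighted response rate within each nonresponse-adjustment cell;
# `responded` is 1 for a completed survey and 0 otherwise.
resp_wt = sample["base_weight"] * sample["responded"]
cells = sample["adjustment_cell"]
rate = (resp_wt.groupby(cells).transform("sum")
        / sample["base_weight"].groupby(cells).transform("sum"))

# Respondents are inflated by 1/rate and nonrespondents drop to zero,
# so each cell's weighted total is preserved.
sample["final_weight"] = resp_wt / rate
```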

To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, subsamples or "replicates" will be formed in a way that preserves the basic features of the full sample design. A set of weights (referred to as “replicate weights”) will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and for each of the jackknife replicates. The variability of the replicate estimates is used to obtain the variance of the survey statistic. The replicate weights can be imported into variance estimation software (e.g., SAS, STATA, SUDAAN, WESVAR) to calculate standard errors of the survey-based estimates. In addition to the replicate weights, stratum and unit codes will be provided in the data files to permit calculation of standard errors using Taylor series approximations if desired. Note that while replication and Taylor series methods often produce similar results, jackknife replication has some advantages when weights are adjusted to account for nonresponse.3
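The sketch below illustrates the delete-one-group (JK1) flavor of jackknife variance estimation for a weighted mean. It is a simplified illustration: in practice, the replicate groups would be formed to respect the stratified design, and the nonresponse adjustment would be repeated within each replicate before the variance is computed.

```python
import numpy as np

def jackknife_se(y: np.ndarray, w: np.ndarray, group: np.ndarray) -> float:
    """Delete-one-group jackknife standard error of a weighted mean."""
    theta_full = np.average(y, weights=w)
    labels = np.unique(group)
    G = len(labels)
    theta_reps = np.empty(G)
    for r, g in enumerate(labels):
        # Replicate weights: zero out one group, reweight the rest.
        w_rep = np.where(group == g, 0.0, w * G / (G - 1.0))
        theta_reps[r] = np.average(y, weights=w_rep)
    # JK1 variance: ((G - 1) / G) * sum of squared deviations.
    var = (G - 1.0) / G * np.sum((theta_reps - theta_full) ** 2)
    return float(np.sqrt(var))
```

For example, with y a 0/1 indicator of an SFA practice and w the final weights, the function returns the standard error of the weighted proportion of SFAs reporting that practice.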

Unusual Problems Requiring Specialized Sampling Procedures

We do not anticipate any unusual problems requiring any specialized sampling procedures.

Any Use of Periodic (Less Frequent Than Annual) Data Collection Cycles to Reduce Burden.

The data collection procedures will be conducted annually in SY 2016–17, SY 2017–18, and SY 2018–19.


B.3 Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.

Overall response rate projections were presented earlier. Achieving the specified response rate involves contacting the selected SFAs, securing their participation in the survey, and then offering support and completion reminders, using the procedures described below. We estimate that 80 percent of sampled SFA directors will complete the web-administered survey, and we expect 100 percent of State CN directors to complete theirs.

The procedures for maximizing the number of respondents to the surveys are:

  • The letters inviting SFA directors and CN directors to participate were carefully developed to emphasize the importance of this study and how the information will help FNS to better understand and address current policy issues related to the Child Nutrition Program operations.

  • The current contact information will be used for all initial correspondence, and will be updated as needed throughout the data collection period to facilitate communication with the research team.

  • Designated FNS regional staff will serve as regional study liaisons, and will be kept closely informed of the project so that they will be able to answer questions from SFA and CN directors and encourage participation.

  • A toll-free number and study email address will be provided to all participants so that SFA and State directors can receive assistance with the study.

  • Sampled SFA directors and the CN directors will have the option of completing the web-based survey as a telephone survey.

  • Periodic email reminders will be sent to sampled SFAs who have not yet completed the survey.

  • We will follow up by telephone with all sampled SFA and CN directors who do not complete the survey within a specified period and urge them to complete the survey. At that point, if the directors prefer to complete the survey or remaining sections of the survey over the telephone, a telephone interviewer will administer the survey, or the remaining parts, over the telephone.

The following procedures will be used to maximize the completion rates for surveys that are administered by telephone:

  • Use a core of interviewers with experience conducting telephone surveys, particularly interviewers who have proven their ability to obtain cooperation from a high proportion of survey participants.

  • Conduct a telephone interviewer training session specific to this study.

  • Use call scheduling procedures that are designed to call numbers at different times of the day (between 8am and 6pm) and days of the week (Monday through Friday), to improve the chances of finding a respondent at work.

  • Provide a toll-free number and email help address for respondents to verify the study’s legitimacy, or to ask other questions about the study.

B.4 Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.

Pretest participants will differ across all years of the CN-OPS-II study and will be drawn from varying FNS regions to reduce burden on regions and States.

The draft CN and SFA Director Surveys for Year 2 (SY 2016–17) were pretested in October 2016. The pretest instruments included newly developed questions and questions heavily revised from the previous CN-OPS-II surveys. The pretest was conducted to ensure that questions are understandable, use language familiar to respondents, and are consistent with the concepts they aim to measure. The team also used the pretests to identify typical instrumentation problems (such as question wording and incomplete or inappropriate response categories), to measure response burden, and to confirm that there are no unforeseen difficulties in administering the instruments. All pretest participants were debriefed by phone and given opportunities to comment generally on the instruments. We developed debriefing protocols to guide these conversations; the protocols ask specific questions about the clarity of questions and the feasibility of providing answers. Three CN directors and five SFA directors participated in the Year 2 pretest.

To arrange the pretests, six States known to have had above-average response rates in the CN-OPS-II Year 1 survey were chosen. FNS asked the Midwest and Mountain Plains Regional offices to email the State agencies in Minnesota, Utah, Kansas, Indiana, Iowa, and Wisconsin about their potential involvement in CN-OPS-II. The email asked CN directors whether they would be interested in participating in a pretest of the CN Director Survey and whether they could recommend five SFA directors in their State to potentially participate in a pretest of the SFA Director Survey. For each recommended SFA, CN directors were asked to indicate whether the SFA would be using free or reduced-price meal applications in SY 2016–17. After FNS and the Regional offices made this initial outreach, the contractor contacted the nominated directors via email and follow-up phone calls. The Indiana, Kansas, and Minnesota State CN directors were chosen to participate in the CN Director Survey pretest based on their ability to participate within the necessary timeframe. The research team then prioritized outreach to the specifically recommended SFAs to improve the chance of recruiting SFAs that varied in size. Six SFAs were invited to participate in the pretest, and five responded.

CN Director Survey Pretest Findings

The three pretest respondents provided very helpful feedback on the pretest instrument. Using this feedback, we improved the flow of several sections and revised questions and instructions to improve clarity, construction, and accuracy. For example, at question 2.8, all respondents commented that they were unsure whether to include the summer meals programs in their response, and one respondent was unsure whether to include the Child and Adult Care Food Program (CACFP). In response, the research team added instruction text indicating that school meal operations include breakfast and lunch programs, summer meals, and CACFP. Another change made to improve clarity concerned question 3.7: respondents commented that the phrase "prior to the execution of the contract" was unclear, because they typically reviewed contracts during administrative review but reviewed processes and requests for proposals during procurement review. As a result, the phrase was removed from questions 3.7, 3.8, 3.16, 3.18, and 3.21. In Section 4, one respondent noted that FNS is rolling out a new electronic prototype application in December and that both a hard-copy prototype and an electronic prototype exist; the respondent recommended that the survey specify that this section does not ask about the electronic prototype. An instruction to that effect was added to the beginning of the section, and the research team added questions 4.5a, b, and c to ask about the State's policy for electronic applications.

Modifications were also made to reduce the time respondents would spend on calculations (such as the percentage of hours in training) and tabulations (such as enumerating the number of regional management companies working with SFAs in a State). The three pretest respondents spent very different amounts of time answering the questions; on average, the pretest survey took 150 minutes. To further reduce burden, we dropped questions concerning professional standards and plan to add them to the Year 3 survey. Considering all modifications, we estimate that the average burden will be 120 minutes.


SFA Director Survey Pretest Findings

Overall, pretest respondents reported that questions in most sections of the SFA Director pretest survey were clearly worded and easy to answer. Based on their feedback, two sections of the survey were extensively modified to reduce burden and better address the research questions. Questions in other sections were modified to improve the flow of the survey, include more response options, and reduce the time needed for background research, for example by including links to referenced documents and templates and adding hover text for some definitions. In Section 4, respondents expressed confusion about whether each question should total 100 percent or the combined responses should total 100 percent; the research team combined the questions into question 4.9, which asks respondents to estimate the percentages of their SFA's total purchases of goods and services for school meals procured during SY 2016–17. In Section 8, respondents indicated uncertainty about whether they use the USDA prototype application for free and reduced-price (F/RP) meals, because they use a State-issued application and/or made minor modifications to the USDA prototype. The research team therefore added a link to the USDA prototype application in question 8.2, reordered the questions to first ask whether respondents use a hard-copy F/RP meal application, and modified the response options in 8.2 to allow for a larger variety of responses. The team also added question 8.6 to ask where SFAs obtained their primary hard-copy application form.

Given these changes, FNS estimates that the SFA Director Survey will take approximately 120 minutes to complete.


B.5 Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

The Contractor, 2M Research Services (2M), and its Subcontractor, Mathematica Policy Research (Mathematica), will conduct this study.

| Name | Affiliation | Telephone Number | Email |
| Jim Murdoch | 2M | 817-856-0863 | [email protected] |
| Michael Sinclair | Mathematica | 202-552-6439 | [email protected] |
| Eric Zeidman | Mathematica | 609-936-2784 | [email protected] |
| Charlotte Cabili | Mathematica | 202-238-3322 | [email protected] |
| Devin Wallace-Williams | FNS/USDA | 703-457-6791 | [email protected] |
| Evan Schulz | NASS/USDA | 202-690-8640 | [email protected] |


1 The seven regions (and States) are: Northeast (CT, ME, MA, NH, NY, RI, VT), Mid-Atlantic (DE, DC, MD, NJ, PA, PR, VA, VI, WV), Southeast (AL, FL, GA, KY, MS, NC, SC, TN), Midwest (IL, IN, MI, MN, OH, WI), Southwest (AR, LA, NM, OK, TX), Mountain Plains (CO, IA, KS, MO, MT, NE, ND, SD, UT, WY), and Western (AK, AZ, CA, GU, HI, ID, NV, OR, WA).

2 Elementary school is defined as any school with any span of grades from kindergarten through grade 6, such as K–4, 4–6, or K–5. Middle or junior high school is defined as any school that has no grade lower than grade 6 and no grade higher than grade 9, such as grade 6 only, 6–7, or 6–9. High school is defined as any school that has no grade lower than grade 9 and continues through grade 12, such as grade 9 only, 9–10, or 9–12. Schools that do not fit these definitions, such as 6–12, K–8, or K–12, are categorized as "other."

3 Rust, K. F., & Rao, J. N. K. (1996). Variance estimation for complex surveys using replication techniques. Statistical Methods in Medical Research, 5, 283–310.


