
Special Nutrition Program Operations Study (SN-OPS)


Statement for Paperwork Reduction Act Submission

Revision of current OMB Number 0584-0562

Part B: Supporting Statement



December 20, 2013










Office of Policy Support

Food and Nutrition Service

United States Department of Agriculture

Project Officer: Allison Magness

Telephone: 703-305-2098

Table of Contents

Page

Introduction A-3

Part A: Justification A-5


A.1 Circumstances That Make the Collection of Information Necessary A-5

A.2 Purpose and Use of the Information A-10

A.3 Use of Information Technology and Burden Reduction A-13

A.4 Efforts to Identify Duplication and Use of Similar Information A-14

A.5 Impact on Small Businesses or Other Small Entities A-14

A.6 Consequences of Collecting the Information Less Frequently A-14

A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5 A-15

A.8 Comments in Response to Federal Register Notice and Efforts to Consult Outside Agency A-16

A.9 Explanation of Any Payment or Gift to Respondents A-16

A.10 Assurance of Confidentiality Provided to Respondents A-16

A.11 Justification for Sensitive Questions A-17

A.12 Estimates of Annualized Burden Hours and Costs A-17

A.13 Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers A-20

A.14 Annualized Cost to the Federal Government A-20

A.15 Explanation for Program Changes or Adjustments A-20

A.16 Plans for Tabulation and Publication and Project Time Schedule A-21

A.17 Reason(s) Display of OMB Expiration Date is Inappropriate A-25

A.18 Exceptions to Certification for Paperwork Reduction Act Submission A-25


Part B. Collections of Information Employing Statistical Methods B-3


B.1 Respondent Universe and Sampling Methods B-3

B.2 Procedures for the Collection of Information B-5

B.3 Methods to Maximize the Response Rates and to Deal with Nonresponse B-13

B.4 Test of Procedures or Methods to be Undertaken B-14

B.5 Individuals Consulted on Statistical Aspects and Individuals
Collecting and/or Analyzing Data B-17




Tables

Table A1. Estimates of respondent burden A-18

Table A2. Annualized cost to respondents A-19

Table A3. Data collection schedule A-22

Table B1. Distribution of eligible SFAs in the 2011-12 FNS-742 universe file B-4

Table B2. Expected margins of error* for various sample sizes (n) and
design effects (DEFF) B-7

Table B3. Proposed sample sizes for the SFA survey B-10

Table B4. Expected sample sizes and corresponding standard error of an
estimated proportion under proposed design for selected analytic domains B-11


Appendixes

A Crosswalk Between Topics And Questions Included
In The 2012-13 And 2013-14 State Child Nutrition Director Surveys A-1

B Crosswalk Between Topics And Questions Included
In The 2012-13 And 2013-14 SFA Director Surveys B-1

C Research Questions Year 3 C-1

D1 Email Notification to Regional Offices D1-1

D2 Invitation Letter to State CN Directors D2-1

D3.1 Follow-Up Email to State CN Directors D3.1-1

D3.2 Reminder Email to State CN Directors D3.2-1

D4 CN and SFA Director Telephone Interviewer Scripts D4-1

D5 Thank You Letter to State CN Director D5-1

D6 Email Notification to State CN Directors D6-1

D7 Invitation Letter to SFA Directors D7-1

D8.1 Follow-Up Email to SFA Directors D8.1-1

D8.2 Reminder Email to SFA Directors D8.2-1

D9 Thank You Letter to SFA Director D9-1

E State CN Director Survey Year 3 E-1

F SFA Director Survey Year 3 F-1





SUPPORTING STATEMENT B


Special Nutrition Program Operations Study



PART B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.



The respondent universe for the proposed Year 3 survey includes all SFAs operating in public school districts in the United States and outlying territories that were required to submit form FNS-742 (SFA Verification Summary Data, 7 CFR Part 245, Determining Eligibility for Free & Reduced Price Meals, OMB #0584-0026, expiration date 4/30/2016) to USDA-FNS in SY 2011-12. The Year 3 survey will use the same respondent universe and sampling frame as Year 2. In general, all SFAs that participated in the NSLP or SBP are included in the respondent universe, with the following exceptions:


  • SFAs that operate only in Residential Child Care Institutions that do not have daytime students;

  • SFAs that do not have students who are eligible for free/reduced-price lunch;

  • SFAs in some outlying territories that are not required to complete form FNS-742; and

  • Private schools that participate in the NSLP.



The SY 2011-12 FNS-742 database was used to construct the SFA sampling frame (i.e., the universe file) from which the respondent samples will be drawn. There were over 19,000 SFAs in the 2011-12 FNS-742 database. However, only the approximately 15,000 SFAs operating in public school districts were included in the sampling frame. Note that the unit of analysis for the study is the SFA, which usually (but not always) coincides with a local education agency (LEA) as defined in the U.S. Department of Education’s Common Core of Data (CCD) Local Education Agency Universe Survey File maintained by the National Center for Education Statistics (NCES). Exceptions are SFAs that operated school food programs for multiple school districts and those operating individual schools (e.g., some public charter schools). In the 2011-12 FNS-742 database, approximately 85 percent of the eligible SFAs matched a district (LEA) in the then-current CCD universe file. Table B1 summarizes the distribution of eligible SFAs in the sampling frame by enrollment size and categories of poverty status based on the percentage of students eligible for free/reduced-price lunch.


Table B1. Distribution of eligible SFAs in the 2011-12 FNS-742 universe file
(sampling frame) by enrollment size and percent of students
eligible for free/reduced price lunch

                       Percent of students eligible for free/reduced price lunch
SFA enrollment¹        Less than 30     30 to 59     60 or more      Total
Under 1,000                   1,146        3,474          3,299      7,919
1,000 to 4,999                1,421        2,441          1,400      5,262
5,000 to 24,999                 434          707            508      1,649
25,000 or more                   55          129            112        296
Total                         3,056        6,751          5,319     15,126

¹ Number of students with access to NSLP/SBP as reported in the 2011-12 FNS-742.



Expected Response Rates


The response rate is the proportion of sampled SFAs that complete the SFA survey. Based on experience with the previous two years of the SFA survey, we expect to achieve an SFA response rate of 80 percent. Thus, we plan to sample 1,875 SFAs to obtain 1,500 completed surveys with SFA Directors. The State Child Nutrition Director survey will be conducted among all 56 State directors (including all States, U.S. Territories, and the District of Columbia) and will not involve any sampling. We expect at least a 95 percent response rate for the State Child Nutrition Director survey.
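As an arithmetic check, the planned sample size follows directly from the target number of completed surveys and the assumed response rate:

```latex
n_{\text{sampled}} \;=\; \frac{n_{\text{completes}}}{\text{response rate}} \;=\; \frac{1{,}500}{0.80} \;=\; 1{,}875
```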



Previous Data Collections and Response Rates


This data collection is similar to the Year 1 and Year 2 data collections conducted in SY 2011-12 and SY 2012-13, respectively. The assumed 80 percent and 95 percent response rates for the SFA and State Child Nutrition Directors, respectively, are based on experience in the prior surveys involving SFA and State Child Nutrition Directors.



B.2 Describe the procedures for the collection of information including:


  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Below we describe the procedures for the collection of information including statistical methodology for stratification and sample selection, estimation procedure, and the degree of accuracy needed for the purpose described in the justification.


A goal of the survey sample design is to obtain a nationally representative sample of SFAs that will yield population estimates with a precision of ±5 percent at the 95 percent level of confidence for the overall SFA population and for specified subgroups of SFAs. Under simple random sampling, this translates to a sample size of 400-500 responding SFAs for each subgroup. For example, with three key subgroups of roughly equal size, the total required sample size would range from 1,200-1,500 SFAs to meet the specified precision levels. In general, however, simple random sampling is not efficient for the multiple analytic objectives of the study. For example, while a simple random (or self-weighting) sample would be optimal for estimating the overall prevalence of SFAs reporting various types of food service practices or programs, it would be inefficient for estimating the numbers of students involved in these types of services or programs. A stratified sample design using variable rates that depend on the size of the SFA would better meet these conflicting objectives. Stratification not only helps to ensure that adequate sample sizes are obtained for important analytic subgroups of interest, but can also be effective in reducing the sampling errors of estimates that are correlated with enrollment size.
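For reference, the 400-500 figure follows from the standard simple random sampling formula for a proportion, evaluated at the most conservative value p = 0.5:

```latex
n \;=\; \frac{z^2\,p(1-p)}{e^2} \;=\; \frac{1.96^2 \times 0.5 \times 0.5}{0.05^2} \;\approx\; 384
```

The quoted range of 400-500 responding SFAs per subgroup sits above this minimum, leaving headroom for modest departures from simple random sampling.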


A stratified sampling design employing varying sampling fractions was used to select the SFA sample for the study. Such a design will generally inflate the standard errors of prevalence estimates as compared with simple random sampling but is justifiable for the reasons mentioned above. A measure of the relative precision of a complex sample design is given by the design effect (DEFF), which is defined as the ratio of the variance of an estimate based on the complex sample design to the hypothetical variance based on a simple random sample of the same size. A design effect of 1.00 means that the complex sample is roughly equivalent to a simple random sample in terms of sampling precision. (A design effect less than 1.00 can sometimes occur if the sampling rates in some strata are very high, resulting in non-negligible finite population correction factors.) Under the proposed design, we have estimated that the resulting design effects will range from slightly under 1.00 to 1.50, depending on the subgroup being analyzed. As indicated in table B2, which summarizes the expected margins of error of a prevalence estimate under the proposed design for a range of sample sizes and design effects, a total sample of 1,500 responding SFAs should be more than adequate to meet or exceed the ±5 percent precision requirement even for a design effect as large as 1.5. For a subgroup consisting of 500 SFAs for which the design effect is 1.10 (e.g., this would be reasonable for subgroups defined by size of SFA, but may be larger for other subgroups), the expected level of precision for the subgroup would be at most ±4.9 percent (and would be smaller for prevalence estimates that differ from 50 percent).
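In symbols, for an estimate $\hat{\theta}$:

```latex
\mathrm{DEFF}(\hat{\theta}) \;=\; \frac{\operatorname{Var}_{\text{complex}}(\hat{\theta})}{\operatorname{Var}_{\text{SRS}}(\hat{\theta})},
\qquad
n_{\text{eff}} \;=\; \frac{n}{\mathrm{DEFF}},
```

so a complex sample of n respondents carries roughly the information of a simple random sample of effective size n_eff.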


Table B2. Expected margins of error* for various sample sizes (n) and design
effects (DEFF)

             Design effect (DEFF)
      n      1.10      1.25      1.50
    100     11.0%     12.5%     15.0%
    200      7.8%      8.8%     10.6%
    300      6.4%      7.2%      8.7%
    400      5.5%      6.3%      7.5%
    500      4.9%      5.6%      6.7%
    600      4.5%      5.1%      6.1%
    700      4.2%      4.7%      5.7%
    800      3.9%      4.4%      5.3%
    900      3.7%      4.2%      5.0%
  1,000      3.5%      4.0%      4.7%
  1,100      3.3%      3.8%      4.5%
  1,200      3.2%      3.6%      4.3%
  1,300      3.1%      3.5%      4.2%
  1,400      2.9%      3.3%      4.0%
  1,500      2.8%      3.2%      3.9%

*Entries correspond to 95% confidence limits for an estimated prevalence of approximately 50%. For estimated prevalences that differ from 50% in either direction, the confidence limits will be smaller than those indicated in the table.
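For readers who wish to check or extend Table B2, the printed entries can be reproduced with a short script. This is an illustrative sketch, not study code: the ±2×SE convention and p = 0.5 come from the table note, and the linear scaling by the tabulated DEFF values is the convention that matches the printed entries.

```python
import math

def margin_of_error(n, deff, p=0.5):
    """Approximate 95% margin of error (in percent) for an estimated
    proportion p based on n respondents. Uses the +/-2*SE convention
    from the table note, scaled by the tabulated design effect."""
    return 2 * math.sqrt(p * (1 - p) / n) * deff * 100

# Reproduce Table B2: rows n = 100, 200, ..., 1,500; columns DEFF = 1.10, 1.25, 1.50.
for n in range(100, 1600, 100):
    print(f"{n:>5,}", *(f"{margin_of_error(n, d):5.1f}%" for d in (1.10, 1.25, 1.50)))
```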



Sample Stratification and Selection


As indicated in Section B.1, an SFA-level database derived from 2011-12 Verification Summary Report data (form FNS-742) was used to construct the SFA sampling frame. In addition to a unique identifier (SFAID), the name of the SFA, and the State in which the SFA is located, the database includes information about the type of control of the SFA/school district (public or private), the number of schools participating in the NSLP/SBP, total enrollment in participating schools, and the number of students eligible for free or reduced-price lunch. This information, along with data from the most recent NCES Common Core of Data (CCD) LEA universe file where applicable, was used to stratify SFAs for sampling purposes. Note that all known eligible SFAs, including those that could not be matched to the then-current CCD file, were included in the sampling frame.


The types of SFA/district-level variables that can be used either as explicit or implicit stratifiers include region (defined by the seven FNS regions¹), enrollment size class, a measure of poverty status defined by the percent of students eligible for free/reduced-price lunch, minority status defined by the percent of students who were non-white, type of locale (e.g., central city, suburban, town, rural), and instructional level of schools served by the SFA (e.g., elementary schools only, secondary schools only, or both)². Since many of these characteristics are related, it was not necessary to employ all of them in the stratification to account for the variation in SFAs. Three variables were used to create the strata: SFA enrollment size, FNS region, and poverty status. Note that because type of locale, minority status, and instructional level were not available for SFAs that could not be matched to LEAs in the CCD file, the non-matched cases were placed in a separate category for sampling purposes. The CCD variables were used as implicit stratifiers (i.e., sorting variables) to ensure appropriate representation in the sample. A total sample of 1,865 SFAs was allocated to the strata in proportion to the aggregate square root of the enrollment of the SFAs in each stratum. Such an allocation gives large SFAs relatively higher selection probabilities than smaller ones and provides acceptable sampling precision for both prevalence estimates (e.g., the proportion of SFAs with a specified characteristic) and numeric measures correlated with enrollment (e.g., the number of students in SFAs with access to various food services or programs).
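A minimal sketch of the square-root allocation described above, under an assumed frame layout; the (stratum, enrollment) pairs and function name are illustrative, not the study's actual data structures.

```python
import math
from collections import defaultdict

def sqrt_allocation(frame, total_sample):
    """Allocate `total_sample` across strata in proportion to the
    aggregate square root of SFA enrollment within each stratum."""
    measure = defaultdict(float)
    for stratum, enrollment in frame:
        measure[stratum] += math.sqrt(enrollment)
    total = sum(measure.values())
    return {s: round(total_sample * m / total) for s, m in measure.items()}

# Toy frame: many small SFAs, a few large ones. Large SFAs receive a
# higher sampling rate, but less than proportional to raw enrollment.
frame = [("small", 500)] * 100 + [("large", 20000)] * 10
print(sqrt_allocation(frame, total_sample=100))  # e.g., {'small': 61, 'large': 39}
```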


During SY 2011-12, the base year (Year 1), 1,400 SFAs completed the SFA Directors’ Survey. To permit longitudinal analyses, in Year 2 all of the still-eligible SFAs selected in Year 1 (responding as well as nonresponding SFAs) were retained in the sample. Additionally, to achieve the desired total sample size of 1,500 respondents, the Contractor added a supplemental sample of 141 SFAs, bringing the total number of SFAs in the sample to 1,875. As with the procedures used to select the Year 1 sample, the supplemental sample was selected at rates that depended on the size of the SFA, with large SFAs selected at relatively higher rates than smaller ones. Table B3 summarizes the Year 2 expected numbers of SFAs to be sampled and the corresponding expected numbers of responding SFAs by percent eligible for free/reduced-price lunch and enrollment size class. For Year 3, these expectations are identical.



Table B3. Proposed sample sizes for the SFA survey

Percent eligible for          Enrollment           Expected number of    Expected number of
free/reduced price lunch¹     size class²          SFAs to be sampled    responding SFAs³
Under 60 percent              Less than 1,000             292                   234
                              1,000 to 4,999              579                   463
                              5,000 to 24,999             357                   286
                              25,000 or more              124                    99
                              Subtotal                  1,352                 1,082
60 percent or more            Less than 1,000             148                   118
                              1,000 to 4,999              181                   145
                              5,000 to 24,999             131                   105
                              25,000 or more               63                    50
                              Subtotal                    523                   418
All SFAs                      Total                     1,875                 1,500

¹ Calculated from the numbers of students eligible for free or reduced price lunch as reported in the 2011-12 FNS-742.

² Number of students with access to NSLP/SBP as reported in the 2011-12 FNS-742.

³ Based on an 80% response rate. Note: See Table B4 for additional breakouts of the sample.



Expected Levels of Precision


Table B4 summarizes the approximate survey sample sizes and standard errors to be expected under the proposed design for selected subgroups. The standard errors in table B4 reflect design effects ranging from 1.0 or less to 1.5 depending on subgroup. The design effect primarily reflects the fact that under the proposed stratified design, large SFAs will be sampled at relatively higher rates (i.e., have smaller sampling weights) than small SFAs. The standard errors in table B4 can be converted to 95 percent confidence bounds by multiplying the entries by 2. For example, an estimated proportion of the order of 20 percent (P = 0.20) for SFAs in which fewer than 30 percent of students are eligible for free/reduced price lunch will be subject to a margin of error of ±4.6 percent at the 95 percent confidence level. Similarly, an estimated proportion of the order of 50 percent (P = 0.50) for SFAs in the Northeast region will be subject to a margin of error of ±8.6 percent at the 95 percent confidence level.
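In symbols, both examples apply the same conversion:

```latex
\text{MOE}_{95} \;\approx\; 2 \times \mathrm{SE}:
\qquad 2 \times 0.023 = 0.046 \;(\pm 4.6\ \text{points}),
\qquad 2 \times 0.043 = 0.086 \;(\pm 8.6\ \text{points}).
```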


Table B4. Expected sample sizes and corresponding standard error of an estimated proportion under proposed design for selected analytic domains

                                               Expected        Standard error† of an estimated
                                               sample size*    proportion equal to …
Domain (subset)                                                P = 0.20    P = 0.33    P = 0.50
Total sample                                      1,500         0.012       0.014       0.015
Percent of students eligible for free/reduced price lunch
   Less than 30                                     394         0.023       0.027       0.028
   30 to 59.9                                       688         0.018       0.021       0.022
   60 or more                                       418         0.024       0.028       0.030
FNS Region
   Mid-Atlantic                                     173         0.035       0.041       0.043
   Midwest                                          339         0.024       0.028       0.030
   Mountain                                         173         0.035       0.041       0.043
   Northeast                                        167         0.035       0.041       0.043
   Southeast                                        195         0.033       0.039       0.041
   Southwest                                        213         0.032       0.038       0.041
   Western                                          240         0.034       0.040       0.042
SFA Enrollment Size
   Under 1,000                                      352         0.020       0.024       0.026
   1,000 to 4,999                                   608         0.015       0.018       0.019
   5,000 or more                                    540         0.015       0.018       0.019

* Expected number of responding eligible SFAs, assuming a response rate of 80 percent. The standard errors in this table are shown for illustration; actual standard errors will depend on the characteristics being estimated and may differ from those shown.

† Assumes an unequal weighting design effect ranging from 0.78 to 1.87, depending on the subgroup.



Estimation and Calculation of Sampling Errors


For estimation purposes, sampling weights reflecting the overall probabilities of selection and differential nonresponse rates will be attached to each data record providing usable SFA data. The first step in the weighting process will be to assign a base weight to each sampled SFA. The base weight is equal to the reciprocal of the probability of selecting the SFA for the study, which will vary by sampling stratum under the proposed stratified sample design, and also depend on whether the SFA was originally sampled in Year 1 or was selected for the supplemental sample. Next, the base weights will be adjusted for nonresponse within cells consisting of SFAs that are expected to be homogeneous with respect to response propensity. To determine the appropriate adjustment cells, we will conduct a nonresponse bias analysis to identify characteristics of SFAs that are correlated with nonresponse. The potential set of predictors to be used to define the adjustment cells will include SFA-level characteristics that are available from the FNS database and data from the most recent CCD file. Within these cells, a weighted response rate will be computed and applied to the SFA base weights to obtain the corresponding nonresponse-adjusted weights.
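A sketch of the two-step weighting described above. The tabular layout and column names (prob_selection, adj_cell, responded) are hypothetical stand-ins for the study's files, and the adjustment cells are taken as given rather than derived from a nonresponse bias analysis.

```python
import numpy as np
import pandas as pd

def nonresponse_adjusted_weights(df: pd.DataFrame) -> pd.DataFrame:
    """Step 1: base weight = reciprocal of the selection probability.
    Step 2: within each adjustment cell, inflate respondents' base
    weights by the reciprocal of the cell's weighted response rate."""
    df = df.copy()
    df["base_weight"] = 1.0 / df["prob_selection"]

    def adjust(cell):
        resp = cell["responded"].astype(bool)
        # Weighted response rate: respondents' base weights over all base weights.
        rate = cell.loc[resp, "base_weight"].sum() / cell["base_weight"].sum()
        cell["final_weight"] = np.where(resp, cell["base_weight"] / rate, 0.0)
        return cell

    return df.groupby("adj_cell", group_keys=False).apply(adjust)
```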


To properly reflect the complex features of the sample design, standard errors of the survey-based estimates will be calculated using jackknife replication. Under the jackknife replication approach, 100 subsamples or "replicates" will be formed in a way that preserves the basic features of the full sample design. A set of weights (referred to as “replicate weights”) will then be constructed for each jackknife replicate. Using the full sample weights and the replicate weights, estimates of any survey statistic can be calculated for the full sample and for each of the jackknife replicates. The variability of the replicate estimates is used to obtain the variance of the survey statistic. The replicate weights can be imported into variance estimation software (e.g., SAS, SUDAAN, WESVAR) to calculate standard errors of the survey-based estimates. In addition to the replicate weights, stratum and unit codes will be provided in the data files to permit calculation of standard errors using Taylor series approximations if desired. Note that while replication and Taylor series methods often produce similar results, jackknife replication has some advantages in reflecting statistical adjustments used in weighting such as nonresponse and poststratification (e.g., see Rust, K.F., and Rao, J.N.K., 1996. Variance estimation for complex surveys using replication techniques. Statistical Methods in Medical Research, 5: 283-310).
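A sketch of the jackknife computation for a weighted mean, assuming replicate weights are supplied as an (R × n) array; the (R−1)/R variance factor shown is one common convention (for R random groups), and the exact factor in practice depends on how the replicates are formed.

```python
import numpy as np

def jackknife_se(y, full_weights, replicate_weights):
    """Standard error of a weighted mean via jackknife replication,
    following the replicate-weight approach described above."""
    full_est = np.average(y, weights=full_weights)
    rep_ests = np.array([np.average(y, weights=w) for w in replicate_weights])
    R = len(rep_ests)  # e.g., R = 100 replicates as described above
    variance = (R - 1) / R * np.sum((rep_ests - full_est) ** 2)
    return float(np.sqrt(variance))
```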




B.3 Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.


Overall response rate projections were presented earlier. Achieving the specified response rate involves locating the sample members to secure participation using procedures described below. We estimate 80 percent of the sampled SFA Directors will complete the web-administered survey. We expect at least 95 percent of State Child Nutrition Directors to complete their survey.


Below we describe procedures to be followed to maximize the number of sample members who complete the survey:


  • The letters inviting SFA Directors and State Child Nutrition Directors to participate in the third year of the surveys will be carefully developed to emphasize the importance of the study and how the information will help the Food and Nutrition Service (FNS) better understand and address current policy issues related to Special Nutrition Program (SNP) operations.

  • Designated FNS regional staff will serve as regional study liaisons and be kept closely informed of the project so that they will be able to answer questions from SFAs and States and encourage participation.

  • The Contractor will have a toll-free number and a study email address so that SFAs and States can receive assistance with the study.

  • Sampled SFA Directors and State Child Nutrition Directors will have the option of completing the web-based survey as a telephone survey.

  • Periodic email reminders will be sent to sample members who have not yet completed the survey.

  • We will follow up by telephone with all sampled SFA and State Child Nutrition Directors who do not complete the survey within a specified period and urge them to do so. If at that point a Director prefers to complete the survey, or its remaining sections, over the telephone, a telephone interviewer will administer it.


The following procedures will be used to maximize the completion rates for surveys that are administered by telephone:


  • Use a core of interviewers with experience working on telephone surveys, particularly interviewers who have proven their ability to obtain cooperation from a high proportion of sample members.

  • Conduct a telephone interviewer training specific to this study.

  • Use call scheduling procedures that are designed to call numbers at different times of the day (between 8 a.m. and 6 p.m.) and days of the week (Monday through Friday), to improve the chances of finding a respondent at work.

  • Provide a toll-free number and email help address for respondents to verify the study’s legitimacy or to ask other questions about the study.


B.4 Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


A substantial number of the questions to be included in the third year of data collection were asked in Year 1 and/or Year 2. The Contractor conducted a pretest of the two surveys focused on content newly developed for Year 3. Three State Child Nutrition Directors and seven SFA Directors from within those three States completed a hard-copy version of their respective surveys in early January 2014 and provided feedback during telephone debriefing interviews. The interviews assessed (1) clarity of the wording, (2) availability of the information, and (3) response burden (SFA Directors only). Pretest participants’ feedback was used to revise the instruments and data collection plans; key changes are described below.

State Agency (SA) Survey. Overall, pretest participants felt the new content in the SA Survey addressed important topics. The major concern, expressed by all respondents, was the timing of the survey relative to the availability of the Administrative Review data being requested. They indicated that the review process is new and that they do not yet have all the systems in place to quickly and easily access the information needed for the questions in Section A. A further complication is the schedule for completing these reviews: pretest participants noted that many reviews will take place toward the end of the school year (April, May, and June). We are taking three steps to address this issue:

  • Delay administering the SA Survey until June 2014 to allow more reviews to be completed.

  • Revise the Section A questions to ask participants to provide the number of reviews completed as of June 1, 2014, and to estimate the number of reviews remaining.

  • Acknowledge the potential challenge SAs may have with Section A to minimize participants’ potential frustration.

The pretest participants provided valuable feedback on other questions and response options in the survey. We revised a small number of the SA Survey questions, with the key changes summarized below:

  • Questions A9-A10a: One participant explained that the dollar amount of fiscal action may not be known until the review is completed and closed. The question wording was refined to focus on reviews closed as of June 1, 2014 and to exclude all disregards (regardless of each SA’s disregard amount). We considered asking all SAs to provide the number of disregards using the $600 threshold for consistency, but felt that approach would make the question more difficult to answer for SAs with a different threshold. We also added a yes/no question to determine whether a State’s disregard for overclaims is less than $600; if yes, a follow-up question will determine the actual overclaim amount.

  • Questions C9a-C10a: We added food service management companies to the questions because the pretest participants all suggested they should be included. SAs review these contracts and provide prototype procurement documents or model contracts to SFAs.

SFA Director Survey. On average, SFA Directors took 1 hour and 5 minutes to complete the third-year survey, plus an additional hour looking up information needed to complete it. Average burden was therefore approximately 2 hours, a reduction of 1 hour from the Year 2 survey.

The other feedback we received from SFA Directors was minor. We used it to adjust question or response option wording, or add response options, in order to improve clarity and salience. Key changes are described below:

  • We added instructions in questions 5.5 and 5.6 for respondents who do not currently use a certain type of fruit or vegetable product: they should mark “use less often” if they used the product before implementing the new meal patterns, or “same frequency” if they did not use the product previously either.

  • SFA Directors were divided on question 10.3 over whether this information should be reported in terms of a none/some/all or an elementary/middle/high breakdown. Some noted the school-level breakdown could be difficult to provide if there are many “other” schools under their authority, making the question harder to answer. Other pretest participants said answering by school level would allow them to report this information more specifically than the none/some/all breakdown. Given that SFA Directors had difficulty reporting data separately by school level in prior rounds of SN-OPS, we retained the none/some/all format. This format is also consistent with the analysis goal of assessing the proportion of schools implementing Smarter Lunchroom techniques.



B.5 Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



The Contractor, 2M Research Services, and its subcontractor, Mathematica Policy Research, will conduct this study.


Name                  Affiliation    Telephone Number    E-mail
Jim Murdoch           2M             817-856-0863        [email protected]
Roderick Harrison     2M             202-266-9901        [email protected]
Anne Gordon           Mathematica    301-294-3943        [email protected]
Nicholas Beyler       Mathematica    202-250-3539        [email protected]
Allison Magness       FNS/USDA       703-305-2098        [email protected]
Jennifer Rhorer       NASS/USDA      202-720-2616        [email protected]



1 The seven regions (and states) are: Northeast (CT, ME, MA, NH, NY, RI, VT), Mid-Atlantic (DE, DC, MD, NJ, PA, PR, VA, VI, WV), Southeast (AL, FL, GA, KY, MS, NC, SC, TN), Midwest (IL, IN, MI, MN, OH, WI), Southwest (AR, LA, NM, OK, TX), Mountain Plains (CO, IA, KS, MO, MT, NE, ND, SD, UT, WY), and Western (AK, AZ, CA, GU, HI, ID, NV, OR, WA).

2 Elementary school is defined as any school with any span of grades from kindergarten through grade 6. Middle or junior high school is defined as any school that has no grade lower than grade 6 and no grade higher than grade 9. High school is defined as any school that has no grade lower than grade 9 and continues through grade 12. Schools that do not fit these definitions are categorized as “other.”

