
Evaluation of the Summer Food Service Program Enhancement Demonstrations

Supporting Statement and Tables

OMB Submission Part A–May 4, 2011



OMB Control # 0584-NEW

























Contact Information:

Chan Chanhatasilpa, Ph.D.

FNS Project Manager

USDA, Food and Nutrition Service

3101 Park Center Drive, 10th floor

Alexandria, VA 22302

703-305-2115

[email protected]




Table of Contents


PART A. Justification

A.1 Circumstances Making the Collection of Information Necessary

A.2 Purpose and Use of Information Collection

A.3 Use of Information Technology and Burden Reduction

A.4 Efforts to Identify Duplication and Use of Similar Information

A.5 Impact on Small Businesses or Other Small Entities

A.6 Consequences of Collecting Information Less Frequently

A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

A.8 Comments in Response to Federal Register Notice and Efforts to Consult Outside the Agency

A.9 Explanation of Any Payments or Gifts to Respondents

A.10 Assurance of Confidentiality Provided to Participants

A.11 Justification for Sensitive Questions

A.12 Estimates of Annualized Burden Hours and Costs

A.13 Estimates of Other Total Annual Cost Burden to Respondents or Recordkeepers

A.14 Estimates of Annualized Costs to the Federal Government

A.15 Explanation for Program Changes

A.16 Plans for Tabulation, Publication, and Project Time Schedule

A.17 Reasons Display of OMB Expiration Date is Inappropriate

A.18 Exemptions to Certification for Paperwork Reduction Act Submissions


References


Tables


A.8.1 Consultants from outside the agency and FNS

A.12.1a Estimated annualized respondent hour burden and annualized cost to respondents – individuals/households reporting

A.12.1b Estimated annualized respondent hour burden and annualized cost to respondents – state agencies reporting

A.12.1c Estimated annualized respondent hour burden and annualized cost to respondents – business (not-for-profit) (sponsors) reporting

A.12.1d Summary of annualized burden and cost to respondents

A.14.1 Estimated annualized cost to the Federal government

A.16.1 Project data collection and reporting schedule



A. Justification

A.1 Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.



This is a new information collection request. The mission of the Food and Nutrition Service (FNS) within the US Department of Agriculture (USDA) is to ensure that children and low-income families have access to food and a more healthful diet. Thus, the FNS administers nutrition assistance programs for children and needy families. Examples of FNS nutrition programs are the Women, Infants and Children (WIC) program; the National School Lunch Program (NSLP); the School Breakfast Program (SBP); and the Summer Food Service Program (SFSP).


The NSLP and SBP offer balanced meals at no cost or reduced cost to children living in households with limited resources. However, school-age children are more susceptible to food insecurity during the summer, when they do not have access to meals provided at school (Nord & Romig, 2006). To fill this gap, the SFSP, established as the Special Food Service Program for Children by the 1968 amendments to the National School Lunch Act (P.L. 90-302), ensures that low-income children continue to receive nutritious meals when school is not in session.1 Through the program, approved sponsors provide free meals to children in areas with significant concentrations of low-income households. Eligible sponsoring organizations include schools; camps; colleges and universities participating in the National Youth Sports Program (NYSP); units of Federal, state, or local government; and other community- or faith-based organizations. Sponsors receive Federal reimbursement from the USDA through their state administering agency to assist with the costs of preparing and serving meals at feeding sites.

Children who are eligible to receive meals at these sites are eligible for the four different types of demonstration projects that will be evaluated under this information collection.


Recent data indicate that about 19.4 million children receive free or reduced-price meals through the NSLP (USDA, 2010), but only about 3.2 million children – about 16 percent – receive meals during the summer. Moreover, rates of poverty and food insecurity are among the highest in the country for families living in rural America (US Bureau of the Census, 2007), yet fewer than one-third of all SFSP sites are located in rural communities (US Bureau of the Census, 2007).


Several factors are thought to account for the differences in participation rates between the NSLP and the SFSP. While the NSLP is available nationwide, the SFSP is available only in areas with high concentrations of low-income children. Moreover, because school attendance is mandatory, free and reduced-price school meals are readily accessible to children; transportation to school is provided to those who need it, but this is not the case for the SFSP (USDA, 2006). In an FNS-sponsored study of the SFSP and the needs of nonparticipating children, more than half of parents of eligible nonparticipating children cited lack of awareness of site locations in their area and transportation problems as reasons their children did not participate (USDA, 2006).


Another explanation for the dramatic difference between school-year and summer program participation is the shortage of SFSP sites. In a 2003 study of the SFSP, about 8 percent of sponsoring organizations did not offer the SFSP the following summer (USDA, 2003). In 2010, there were 34 summer food sites for every 100 school lunch programs (Boteach & Milam, 2010). Maintaining a large number of sites requires sponsors to continue offering the program, to expand the number of sites, and to recruit new sponsors. Households with nonparticipating SFSP-eligible children were more likely to be severely or moderately hungry. Moreover, a majority of parents of participating SFSP-eligible children reported that they relied on the program to provide breakfast (79 percent) and lunch (91 percent) for their children (USDA, 2006). This study also noted that all households with nonparticipating SFSP-eligible children would like their children to have access to a summer program that provides breakfast and lunch.


In its efforts to reach a greater number of SFSP-eligible children and facilitate food security in the summer, the USDA has developed and begun to implement four types of SFSP demonstrations. In Arkansas (Demonstration 1, Extended Operations), 88 sponsors operated the SFSP at 196 sites for 40 or more days longer than they usually operate their programs. In Mississippi (Demonstration 2, Enrichment Activity), 30 sites were selected to support enhancement activities designed to increase attendance at summer programs. These two demonstrations began in summer 2010 and will continue operations during summer 2011.


In March 2011, FNS selected grantees to implement two additional types of enhancement demonstrations – 3 grantees in the Home Delivery demonstration and 3 grantees in the “Backpack” demonstration. Both types of demonstrations (a total of 6 grantees) will begin operations in summer 2011 and continue in summer 2012. The Home Delivery Demonstration (Demonstration 3) will deliver up to two meals per day to the homes of eligible children in rural areas. This demonstration will operate only in rural areas, and only children identified by school districts as eligible for free or reduced-price school meals will be eligible to receive delivered meals. The Food Backpack Demonstration (Demonstration 4) will provide weekend and holiday meals to eligible children who are already participating in the SFSP. Children normally eligible to receive meals at SFSP sites (i.e., those age 18 and younger, graduated seniors over age 18, or those who have been determined by the state to have a mental or physical disability and who participate in a public or nonprofit school program established for people with disabilities) will be eligible to receive meals under the Food Backpack Demonstration. Each site can operate the Backpack demonstration for a different length of time, with varying start and end dates; similarly, eligible children can choose to participate for all or part of the period during which the SFSP is offered.


These demonstrations are authorized under the Agriculture, Rural Development, Food and Drug Administration, and Related Agencies Appropriations Act of 2010 (Pub. L. 111-80, Sec. 749(g)). The Act authorizes the USDA to carry out demonstration projects to develop and test methods of providing access to food for children in urban and rural areas in the summer when school is not in session, and to conduct an independent evaluation of these demonstrations.



A.2 Indicate how, by whom, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.



This is a one-time study. The data collected in this evaluation will be used by FNS to provide policymakers with information to make decisions about potential changes in Federal summer food nutrition programs for children. The specific goals of the evaluation are to assess the following:

  1. The impact of each SFSP enhancement demonstration model on participation and meal service;

  2. The food security status among recipients of the home delivered meals and backpack demonstrations;

  3. The “targeting accuracy” in the Meal Delivery and Backpack demonstrations;

  4. The process of project implementation in each SFSP enhancement demonstration; and

  5. The total and component specific costs of implementing and operating SFSP demonstrations.


Evaluation data will provide Congress with rigorous and timely information to make decisions about possible program changes during the next Child Nutrition reauthorization cycle. The evaluation will document the process and challenges of implementing the demonstrations and will provide valuable information should the demonstrations succeed.

This is a mixed-mode evaluation; data will be collected by telephone, in person, and electronically, including over the internet.

The primary objective of the SFSP enhancement demonstration projects is to increase participation in the SFSP and facilitate food security. Parents of participants in the Backpack and Meal Delivery demonstrations will be asked to complete up to four interviews at the following points in time: summer 2011, fall 2011, and twice in summer 2012. This frequency of data collection will allow FNS to assess changes in food security over time and to evaluate differences in food security between the summer demonstrations and the National School Lunch Program and between the two demonstrations.

All 8 state grantees will be interviewed once during the first year of data collection as part of a process evaluation. The purpose of these interviews will be to obtain a high level overview of demonstration project operations from the grantee perspective. In 2012, only two demonstrations will still be running (Meal Delivery and Backpack). Since operations are expected to change between 2011 and 2012, we will conduct site visits and key informant interviews for the 6 state grantees from the Meal Delivery and Backpack demonstrations in summer 2012. The 2012 interview will contain the same questions as in 2011.

Interviews will also be conducted with key informants in sponsoring organizations and sites. In 2011, sponsor and site staff/volunteers from Demonstrations 1, 2, and 4 and sponsors in Demonstration 3 will be interviewed.2 In 2012, sponsors and site staff/volunteers in the Backpack demonstrations and sponsors in the Meal Delivery demonstrations will be interviewed. The purpose of interviews with sponsors is to obtain an understanding of the implementation of demonstrations from the perspective of local agency officials. The purpose of interviewing staff/volunteers at sites is to understand demonstration project operations at the ground level where food is provided to the children.

In 2011, the 8 State grantees (including grantees from Demonstrations 1 and 2) will also be asked to complete a short cost instrument to collect information on start up operations and operations specific to their roles and responsibilities as a demonstration project grantee. The 6 state grantees for the Meal Delivery and Backpack demonstrations will be asked to complete the instrument again in the second year of the demonstration. An extended cost instrument will be fielded to sponsors in all four types of demonstration projects in 2011 and sponsors in the Meal Delivery and Backpack demonstration projects in 2012.

The impact portion of the evaluation will collect data on household food security (using a 30-day reference period) at four points in time – summer 2011, fall 2011, and twice in summer 2012. Thus, we will be able to draw conclusions about food security in the households of children who participate in the Backpack and Meal Delivery demonstrations in the summer compared to their household food security when they are back in school and participating in the NSLP in the fall. Results indicating that household food security in the summer was at least at the same level as in the fall would indicate a strong association with participation in the demonstrations. The design will not, however, show what household food security would have been in the absence of the demonstrations, nor will it support comparisons with the previous school year (spring 2011).


It was not possible to compare summer 2011 household food security with that in the 2010/2011 school year because identifying eligible children in the spring who would definitely participate in the demonstration projects in the summer was not possible. The fact that we measure these important outcomes at four points in time, however (including twice next summer), lends strength to our design and the ability to answer questions on comparisons of outcomes between demonstrations, over the two years of the demonstration (2011 and 2012), over time within the same demonstration period (summer 2012), relative to the NSLP during fall 2011, and relative to external benchmarks from the Current Population Survey (CPS) Supplement.3


Nevertheless, the issue of being able to attribute effects to these demonstrations is an important one and was seriously considered. Randomly assigning children to treatment and control in the Backpack or Meal Delivery demonstrations was rejected because the sampling frame could not be assembled until the participants arrive at the food sites to pick up their backpacks or delivered meals. The negative ramifications of providing backpacks or meals to some, but not all, of these children were thought to outweigh the benefits of randomization. The use of comparison sites was considered but also rejected due to the difficulty of identifying SFSP programs that would be sufficiently similar to demonstration programs absent the demonstrations.


Based on comparison to the school year baseline, this evaluation will be able to demonstrate the impact of the backpack and meal delivery demonstrations on household food security. It will also be able to provide distinctions between and comparisons among funding sources, administrative start up costs, operational costs, and benefit costs as well as descriptive analyses of the key processes and outputs associated with the implementation of the demonstrations.


Another possible limitation is that the use of a 30-day reference period, rather than a longer one, to measure household food security may result in lower incidence rates, which in turn will require larger minimum detectable differences (MDD) between groups. A longer reference period was rejected because of the short window of time in which to collect data during the summer. Food sites operate at varying points over the summer – some beginning right after school ends in May and others beginning as late as early July – and their end dates also vary, from June through August. Using a reference period longer than 30 days would risk confounding the effects of the demonstrations with other influences. Because the 30-day reference period is used in USDA’s yearly assessment of household food security in the CPS Supplement, we will be able to use these national data as benchmarks for the data we collect.
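To make the MDD consideration concrete, the sketch below computes an approximate minimum detectable difference for a comparison of two proportions. The prevalence values, sample sizes, and significance/power settings are illustrative assumptions only, not parameters of the evaluation design.

```python
from scipy.stats import norm

def minimum_detectable_difference(p, n1, n2, alpha=0.05, power=0.80):
    """Approximate MDD for a two-sided test of a difference between two proportions.

    p: assumed 30-day prevalence (e.g., of low or very low food security)
    n1, n2: sample sizes of the two groups being compared
    """
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    se = (p * (1 - p) * (1 / n1 + 1 / n2)) ** 0.5  # conservative: same p assumed in both groups
    return (z_alpha + z_beta) * se

# Illustrative values only: the MDD can be compared against the assumed prevalence
# to judge whether a given sample size supports the planned comparisons.
for prevalence in (0.10, 0.20, 0.30):
    print(prevalence, round(minimum_detectable_difference(prevalence, 400, 400), 3))
```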


A.3 Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also describe any consideration of using information technology to reduce burden.



FNS is committed to complying with the E-Government Act of 2002 by promoting the use of technology. In developing the data collection approach, the project team has considered opportunities to use information technology and automated data collection to reduce burden. On behalf of FNS, local field coordinators hired by the contractor will make use of the sponsoring and site organizations’ available technology when preparing the lists of SFSP participants from which the sample will be selected.

The research design also calls for the review and abstraction of data from demonstration project materials. We will request that demonstration projects provide these materials electronically whenever possible. FNS estimates approximately 80 percent of respondents will submit this information electronically.


A.4 Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purposes described in Item 2 above.



There are no similar data collection efforts available. Every effort has been made to avoid duplication. The data requirements for the evaluation of new demonstration projects are designed to test possible strategies for providing access to food for children who are not easily reached by the existing Summer Food Service Program. The approaches that will be evaluated represent new initiatives. We have carefully reviewed FNS reporting requirements, State administrative agency reporting requirements, and special studies by government and private agencies. It was concluded that no existing data sources can provide data needed to answer the study’s research questions to determine the effectiveness, cost, or implementation of the demonstrations.





A.5 If the collection of information impacts small businesses or other small entities (Item 5 of OMB Form 83-I), describe any methods used to minimize burden.



The information collection will involve SFSP demonstration sites that may represent small businesses or other small entities. To minimize burden to demonstration sites, the evaluation approach involves the use of local field coordinators to handle activities that are not part of normal day-to-day operations. Sites will not incur any costs associated with the use of field coordinators; the field coordinators will be hired and paid by the contractor responsible for the data collection effort.

Field coordinators will work closely with their assigned sites to work out the details of their support, and they will offer assistance in several key ways. They will prepare the lists of SFSP participants that will serve as the frame for the participant sample. Once the sample has been selected, field coordinators will facilitate data collection by preparing the packages of study materials that will be provided in the backpacks or accompany the meal delivery (Attachment E). Field coordinators will also be responsible for conducting in-person followup with nonrespondent households (Attachment C). FNS estimates that one percent of our respondents are small entities (i.e., fewer than 10 respondents).



A.6 Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.



Although some respondents will respond more than once during this study, this is a one-time data collection. The proposed frequency of data collection for each component is the minimum consistent with the evaluation objectives and the proposed analyses. In addition, this data collection is authorized under the Agriculture, Rural Development, Food and Drug Administration, and Related Agencies Appropriations Act of 2010 (Pub. L. 111-80, Sec. 749(g)), which authorizes the conduct of these demonstration projects and their evaluation.



A.7 Explain any special circumstances that would cause an information collection to be conducted in a manner:

* requiring respondents to report information to the agency more often than quarterly;

* requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;

* requiring respondents to submit more than an original and two copies of any document;

* requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;

* in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;

* requiring the use of a statistical data classification that has not been reviewed and approved by OMB;

* that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or

* requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.



There are no special circumstances. The collection of information is conducted in a manner consistent with guidelines in 5 CFR 1320.5.



A.8. If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Specifically address comments received on cost and hour burden. Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported. Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.



Federal Register Notice

A 60-day notice to solicit public comments was published in the Federal Register (Volume 76, No. 5, pages 1129-1130) on January 7, 2011. No public comments were received in response to this notice.

Outside Consultants

For this evaluation, the project team consulted with the key stakeholders and outside consultants listed in Table A.8.1. These individuals have expertise in child food assistance programs, especially the SFSP, and are providing guidance and support to the project in understanding how to coordinate and communicate with federal, state, and local agencies and identify key informants. They will also provide further feedback on the research design and data collection materials. Mark Nord of the USDA was also consulted on the food security module of the parent questionnaire. The National Agricultural Statistics Service (NASS) has reviewed this information collection, and we have taken their comments into consideration.



Table A.8.1 Consultants from outside the agency and FNS

Name | Title and Affiliation

Jerry Cater, Ph.D., RD | Consultant for school nutrition programs in Mississippi; Research Scientist (retired), Applied Research Division, National Food Service Management Institute, Hattiesburg, MS; (228) 860-9636

Katherine Yadrick, Ph.D., RD | Professor and Chair, Department of Nutrition and Food Systems, University of Southern Mississippi, Hattiesburg, MS; (601) 266-4479

Mark Nord, Ph.D. | Sociologist, Economic Research Service, USDA, Washington, DC; (202) 694-5433





A.9 Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.



In order to maximize response rates, many research studies offer compensation to participants. Numerous empirical studies have shown that incentives are effective in increasing response rates and that dollar amounts in the $5 to $20 range can increase participation among low-income households (Martinez-Ebers, 1997; Singer & Bossarte, 2006; Kulka, 1994; Mack et al., 1998). For these reasons, parents with contact information who complete the interview will receive $20 for each interview as an incentive. Those without a telephone number (initially, about 60 percent of the sample) will become part of the Take Phone Home Program (described in Part B) and be given a cell phone with 30 prepaid minutes (valued at between $20 and $30 for the cell phone and minutes) for the interview and an additional 50 prepaid minutes for their own use. Participants in the demonstration will be allowed to keep the cell phones after the interviews are complete. We believe these small incentives may induce many parents to participate; however, they are not large enough to be coercive.


No gifts or payments will be made to state grantees and demonstration sponsors for completing the cost interview or to participants in the key informant interviews.



A.10 Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.



Participants in the evaluation are subject to the assurances and safeguards as provided by the Privacy Act of 1974 (5 USC 552a), which requires the safeguarding of individuals against invasion of privacy. The Privacy Act also provides for the confidential treatment of records maintained by a Federal agency according to either the individual’s name or some other identifier.

The Privacy Act requires that before personal identifying information (such as an SSN or EIN) may be shared with other entities, a Privacy Act notice must first be published. FNS published such a Privacy Act notice (System of Records) to specify the uses to be made of the information in this collection. This notice, titled FNS-8 USDA/FNS Studies and Reports, was published in the Federal Register on March 31, 2000 (Volume 65, Number 63, pages 17251-17252).



Privacy and confidentiality will be protected throughout the data collection and processing operations, and respondent data will not be shared with anyone outside this study, except as otherwise required by law. We will separate names and other direct identifying information from electronic survey response data by storing them in separate files linked only by a nonmeaningful study ID. Files with identifying information, which will not be needed for analysis, will be stored in directories separate from the response data. Hardcopy study data will be kept in locked files and locked rooms when not in use. Access to participants’ personal data, whether in hardcopy or electronic format, will be limited to persons authorized and needing to use the data. All contractor staff, including all field staff, will be required to sign a confidentiality pledge (Attachment D).

Participants will be informed of the safeguards in place to protect their privacy and the extent to which confidentiality can be assured. Parents of SFSP participants will be provided with a letter introducing the evaluation and describing what participation entails (Attachment E). Both the letters and a verbal introductory script to the telephone parent survey (Attachment F) clearly inform the respondent that the information provided will remain confidential.

The evaluation plan was submitted to the Westat Institutional Review Board and approved on February 2, 2011 (Attachment G). The IRB reviewed all assurances of confidentiality provided and plans for document and data storage to ensure that the requirements of all assurances will be met.



A.11 Provide additional justification for any questions of a sensitive nature, such as sexual behavior and attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.



Some questions contained in the parent interview could be considered sensitive by some respondents. These include a question on household income, a series of items on food security, and several questions on whether the food given to children is being eaten by the intended recipients. All potentially sensitive questions are directly related to the planned analyses of outcome data and are essential to evaluate the effectiveness of the demonstrations.

As part of the introductory phone script (Attachment F), potential respondents will be informed that they may choose not to answer any question without penalty. The questions in the parent interview were reviewed by the contractor’s Institutional Review Board.

A.12 Provide estimates of the hour burden of the collection of information. The statement should:


* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.

* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.

* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included in Item 13.



The FNS is requesting approval for the evaluation of four types of Summer Food Service Program Enhancement demonstrations. Data collection activities will include four separate interviews with parents of demonstration project participants in the Meal Delivery and Backpack demonstrations (Attachment H); key informant interviews (Attachment J) with state officials (N=5), demonstration sponsors (N=28), and local sites (N=48) in Demonstrations 1, 2, and 4 and with state officials (N=3) and demonstration sponsors (N=4) in Demonstration 3; and the use of separate cost instruments that will be completed by demonstration state grantees and sponsors in all demonstrations. In addition, once they are developed, contractor staff will obtain demonstration recruitment and other project materials and extract key information on a data abstraction form.


Information will be collected in two consecutive years. Tables A.12.1a-d provide a summary of the annualized burden and cost to respondents. Cognitive testing of the backpack questionnaire questions was conducted with five individuals who implement a backpack program in Texas. The parent interview is expected to take about half an hour to complete; this estimate is based on simulated runs with evaluation project staff answering the questionnaire. The burden associated with the key informant interviews and completion of the extended cost instrument is estimated to average one hour each. The shorter cost instrument targeted to state grantees is expected to take about 45 minutes to complete.

Table A.12.1a Estimated annualized respondent hour burden and annualized cost to respondents – individuals/households reporting

Affected public: Individuals/Households (parents of SFSP participants) reporting

Columns: (a) Description of Collection Activity | (b) Instrument Type | (c) No. Respondents | (d) No. Responses per Respondent | (e) Total Annual Responses (c x d) | (f) Hours per Response | (g) Total Burden (e x f) | (h) Estimated Annual Burden Hours (g / 3 years) | (i) Hourly Wage Rate | Annual Cost to Respondent (h x i)

Pretest (expert providers of backpack program) | Survey | 5 | 1.00 | 5.00 | 1.000 | 5.00 | 1.7 | $27.00 | $45.00
Initial recruitment for backpack and meal delivery (a) | Initial turnout to pick up backpacks | 1,000 | 1.00 | 1,000.00 | 0.08 | 83.50 | 27.8 | $8.00 | $222.67
Parents interviewed once in summer 2011 or parents from supplemental sample interviewed once in summer 2012 (b) | Telephone interview | 213 | 1.00 | 213.00 | 0.50 | 106.50 | 35.5 | $8.00 | $284.00
Parents interviewed twice in 2011 only or parents from 2012 supplemental sample interviewed twice in 2012 (b) | Telephone interview | 612 | 2.00 | 1,224.00 | 0.50 | 612.00 | 204.0 | $8.00 | $1,632.00
Parents interviewed twice in 2011 and the same parents interviewed once in 2012 (b) | Telephone interview | 20 | 3.00 | 60.00 | 0.50 | 30.00 | 10.0 | $8.00 | $80.00
Parents interviewed twice in 2011 and twice in 2012 (b) | Telephone interview | 371 | 4.00 | 1,484.00 | 0.50 | 742.00 | 247.3 | $8.00 | $1,978.67
Non-responders | – | 175 | 1.00 | 175.00 | 0.05 | 8.77 | 2.9 | $9.00 | $26.30
Total Individual/Household Burden | – | 1,005 | 4.14 | 4,161.00 | 0.38 | 1,587.77 | 529.3 | $10.00 | $4,268.64

(a) Accounts for initial contact with parents to describe the study.

(b) Estimates are based on the sample size and response rate assumptions in Tables B.1.1 and B.1.2 of Part B and were derived as follows. Of the 804 responding parents in 2011, an expected 612 will also complete the second interview, leaving 192 who complete only one interview. Similarly, of the 412 responding parents in the 2012 supplemental sample, an expected 391 will complete the second interview, leaving 21 who complete only one interview. This brings the total number of parents completing just one interview to 213. Since only those parents who completed two interviews in 2011 are followed into 2012, the parents originally selected in 2011 who complete exactly two interviews are those who do not complete any interviews in 2012 (an expected 221). Since the number of parents in the 2012 supplemental sample expected to complete two interviews is 391, the total number of parents completing exactly two interviews is 612. Finally, note that only those parents originally selected in 2011 can complete three or four interviews; based on the assumptions of Tables B.1.1 and B.1.2, the expected numbers are 20 and 371, respectively.

Table A.12.1b Estimated annualized respondent hour burden and annualized cost to respondents – state agencies reporting

Affected public: State Agencies reporting

Columns: (a) Description of Recordkeeping Activity | (b) Instrument Type | (c) No. Respondents | (d) No. Responses per Respondent | (e) Total Annual Responses (c x d) | (f) Hours per Response | (g) Total Burden (e x f) | (h) Estimated Annual Burden Hours (g / 3 years) | (i) Hourly Wage Rate | Annual Cost to Respondent (h x i)

Key informants – state grantees from demos 3 & 4 (a) | In-person interview | 6 | 2.00 | 12.00 | 1.00 | 12.00 | 4 | $27.00 | $108.00
Key informants – state grantees from demos 1 & 2, interviewed in 2011 (a) | In-person interview | 2 | 1.00 | 2.00 | 1.00 | 2.00 | 0.7 | $27.00 | $18.00
State grantees (demos 3 & 4) (a) | Cost instrument – short | 6 | 2.00 | 12.00 | 0.75 | 9.00 | 3 | $27.00 | $81.00
State grantees (demos 1 & 2) (a) | Cost instrument – short | 2 | 1.00 | 2.00 | 0.75 | 1.50 | 0.5 | $27.00 | $13.50
State grantees (demos 3 & 4) (b) | Document review | 6 | 2.00 | 12.00 | 0.08 | 0.96 | 0.3 | $27.00 | $9.00
State grantees (demos 1 & 2) (b) | Document review | 2 | 1.00 | 2.00 | 0.08 | 0.16 | 0.1 | $27.00 | $1.00
Total State Agencies Burden | – | 8 | 5.25 | 42.00 | 0.61 | 25.62 | 8.54 | $27.00 | $230.58

(a) There is 1 state grantee for demo 1 and 1 state grantee for demo 2 (a total of 2 state grantees), and 3 state grantees each for demos 3 and 4 (a total of 6 state grantees). The total number of state grantees is 8.

(b) Demonstration state grantees will be contacted and asked to provide documents for review and data abstraction. The number of individuals contacted is based on the number of state grantees (AR, MS, 3 Backpack, 3 Meal Delivery). Document review and data abstraction will be performed by contractor staff; there will be no annual burden to respondents other than the time required to provide materials. Individuals representing state grantees in demonstrations 3 and 4 will be asked to provide documents at two points in time (summer 2011 and summer 2012). State grantees in demonstrations 1 and 2 will be contacted only once, during summer 2011, because these demonstrations will cease to operate after 2011.


Table A.12.1c Estimated annualized respondent hour burden and annualized cost to respondents – business (not-for-profit) (sponsors) reporting

Affected public: Business – Not-for-Profit (Sponsors) reporting

Columns: (a) Description of Recordkeeping Activity | (b) Instrument Type | (c) No. Respondents | (d) No. Responses per Respondent | (e) Total Annual Responses (c x d) | (f) Hours per Response | (g) Total Burden (e x f) | (h) Estimated Annual Burden Hours (g / 3 years) | (i) Hourly Wage Rate | Annual Cost to Respondent (h x i)

Sponsors and sites from demos 3 & 4 (a) | In-person interview | 60 | 2.00 | 120.00 | 1.00 | 120.00 | 40 | $27.00 | $1,080.00
Sponsors and sites from demos 1 & 2 (b) | In-person interview | 20 | 1.00 | 20.00 | 1.00 | 20.00 | 6.67 | $27.00 | $180.00
Sponsors and sites from demos 3 & 4 (c) | Cost instrument – extended | 20 | 2.00 | 40.00 | 1.00 | 40.00 | 13.30 | $27.00 | $360.00
Sponsors and sites from demos 1 & 2 (c) | Cost instrument – extended | 148 | 1.00 | 148.00 | 1.00 | 148.00 | 49.33 | $27.00 | $1,332.00
Demonstration sponsors from demos 3 & 4 (d) | Document review | 20 | 2.00 | 40.00 | 0.08 | 3.20 | 1.07 | $27.00 | $28.80
Demonstration sponsors from demos 1 & 2 (d) | Document review | 148 | 1.00 | 148.00 | 0.08 | 11.8 | 3.9 | $27.00 | $106.56
Total Business Burden | – | 168 | 3.07 | 516.00 | 0.67 | 343.04 | 114.35 | $27.00 | $3,087.36

(a) Demos 3 and 4 have a total of 20 sponsors. All 20 sponsors and up to 40 site staff/volunteers will be interviewed.

(b) Demos 1 and 2 are expected to have approximately 148 sponsors and hundreds of sites. The contractor will work with state grantees to identify a total of up to 20 sponsors and sites.

(c) Estimate is based on the expected number of sponsors in the AR and MS demonstrations (148 sponsors) and the number of sponsors in the 3 Backpack and 3 Meal Delivery demonstrations (20).

(d) Demonstration sponsors will be contacted and asked to provide documents for review and data abstraction. The number of individuals contacted is based on the number of sponsors (148 sponsors for demos 1 & 2 and 20 sponsors for demos 3 & 4). Document review and data abstraction will be performed by contractor staff; there will be no annual burden to respondents other than the time required to provide materials.







Table A.12.1d Summary of Annualized Burden and Cost to Respondents

Columns: Affected Public | Estimated No. of Respondents | No. of Responses per Respondent | Total Annual Responses | Estimated Total Hours per Response | Estimated Annual Total Burden (hours) | Estimated Annual Total Cost to Respondents

Individuals/Households | 1,005 | 4.14 | 4,161.00 | 0.38 | 529.26 | $4,268.64
State Agencies (grantees) | 8 | 5.25 | 42.00 | 0.61 | 8.54 | $230.58
Business Not-for-Profit (sponsors/demo sites) | 168 | 3.07 | 516.00 | 0.66 | 114.35 | $3,087.36
Total Burden Estimates | 1,181 | 4.00 | 4,719.00 | 0.14 | 652.14 | $7,586.58



The burden estimates for the interview guides and cost instruments were obtained by sharing them with a small number of stakeholders and program experts and soliciting their input. Project staff will be responsible for the document review and information abstraction, so respondent burden associated with this activity is limited to the amount of time it will take sponsors to collect and provide copies of the requested materials to the contractor. In most cases, delivery will be electronic. These materials are brochures, recruitment materials, and other project documents that demonstration sponsors and sites will be developing and will have readily available to them. Thus, we have estimated this burden at 5 minutes.


The total annualized cost to respondents is $7,586.58, as summarized in Table A.12.1d. Average hourly wage rates for the different types of respondents were calculated using an estimated 40-hour work week and weekly earnings data from the Current Population Survey released by the United States Department of Labor, Bureau of Labor Statistics (January 20, 2011). We used the usual weekly earnings for workers in management, professional, and related occupations to calculate an average hourly rate of $27 for key informants, demonstration sponsors, and state grantees. The parent hourly wage rate ($8) was based on the income-to-poverty ratio distribution among all National School Lunch Program participants (Newman & Ralston, 2006), which was used to estimate a weighted mean per-person household income per hour.
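To illustrate how the entries in Tables A.12.1a-d are derived, the short sketch below reproduces the burden and cost arithmetic for one row of Table A.12.1a (respondents x responses x hours per response, annualized over 3 years and costed at the hourly wage). The row values come from the table; the helper function itself is illustrative only.

```python
def annualized_burden(respondents, responses_per, hours_per_response,
                      hourly_wage, years=3):
    """Reproduce the burden-table arithmetic: columns (e), (g), (h), and annual cost."""
    total_responses = respondents * responses_per               # column (e)
    total_burden_hours = total_responses * hours_per_response   # column (g)
    annual_hours = total_burden_hours / years                   # column (h)
    annual_cost = annual_hours * hourly_wage                    # column (h) x (i)
    return total_responses, total_burden_hours, annual_hours, annual_cost

# Row from Table A.12.1a: parents interviewed twice in 2011 and twice in 2012.
print(annualized_burden(371, 4, 0.50, 8.00))
# -> (1484, 742.0, 247.33..., 1978.66...), matching the 1,484 responses,
#    742 burden hours, 247.3 annual hours, and $1,978.67 annual cost in the table.
```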


A.13 Provide estimates of the total annual cost burden to respondents or record keepers resulting from the collection of information (do not include the cost of any hour burden shown in Items 12 and 14). The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.


Respondents and recordkeepers will incur no capital or maintenance costs.


A.14 Provide estimates of annualized costs to the Federal government. Also, provide a description of the method used to estimate cost, which should include quantification of hours, operational expenses (such as equipment, overhead, printing, and support staff), and any other expense that would not have been incurred without this collection of information. Agencies may also aggregate cost estimates from Items 12, 13, and 14 in a single table.



Estimates of Annualized Costs to the Federal Government

Total costs include work performed by the evaluation contractor, Westat, and by USDA personnel. Westat is funded at a total cost of $1,475,315 over the 32-month contract period, for an annualized cost of $491,772. Westat will be responsible for data collection, data analysis, and report writing. USDA personnel will oversee the evaluation effort. This information collection also assumes 294 hours of Federal employee time for a GS-14, step 1, Lead Program Analyst at $50.41 per hour and 588 hours of Federal employee time for a GS-13, step 1, Program Analyst at $42.66 per hour, for a Federal staffing cost of $106,545.30 over the 32 months, or $39,904.62 on an annual basis. Federal employee pay rates are based on the Office of Personnel Management (OPM) 2011 General Schedule for the Washington, DC locality.
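The figures above can be cross-checked with a few lines of arithmetic. The sketch below reproduces the annual Federal staffing cost from the stated hours and hourly rates, and treats the contractor cost as annualized over the standard 3-year approval period (an assumption made here for illustration), which reproduces the totals shown in Table A.14.1.

```python
# Cross-check of the annualized cost figures cited above.
gs14_annual = 294 * 50.41   # Lead Program Analyst, GS-14 step 1
gs13_annual = 588 * 42.66   # Program Analyst, GS-13 step 1
usda_personnel = round(gs14_annual + gs13_annual)   # -> 39905 (about $39,904.62 per year)

# Assumption for illustration: contractor total annualized over 3 years.
westat = round(1_475_315 / 3)                        # -> 491772

print(usda_personnel, westat, usda_personnel + westat)  # -> 39905 491772 531677
```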



Table A.14.1 summarizes the estimated annualized cost to the Federal Government.



Table A.14.1 Estimated annualized cost to the Federal government


Annualized Cost

USDA Personnel

$39,905

Evaluation Contractor (Westat)

$491,772

Total

$531,677





A.15 Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.



This is a new data collection that will add a total of 621 burden hours annually to the FNS inventory as a result of program changes.



A.16 For collections of information whose results will be published, outline plans for tabulation and publication. Address any complex analytical techniques that will be used. Provide the time schedule for the entire project, including beginning and ending dates of the collection of information, completion of report, publication dates, and other actions.



The project timeline showing the dates of collection of the evaluation data and completion of planned reports is shown in Table A.16.1.


Table A.16.1: Project data collection and reporting schedule

Activity | Project Schedule
Finalize research design | Jan. 28 – July 11, 2011
Collect and analyze 2011 data:
  Build list sample and construct sampling frame | May 6 – June 17, 2011
  Collect parent data for summer 2011 | July 11 – Aug. 12, 2011
  Collect key informant data | July 11 – Aug. 31, 2011
  Collect parent data for school year 2011 | Sept. 30 – Nov. 25, 2011
  Collect cost data | March 1 – Dec. 16, 2011*
Draft and finalize Congressional status report on 2011 demonstrations | Sept. 12 – Oct. 10, 2011
Draft and finalize evaluation report on 2011 demonstrations | Feb. 27 – April 30, 2012
Revise research design and analysis plan for 2012 demonstrations | May 7 – June 4, 2012
Collect and analyze 2012 data:
  Collect parent data – Time 1 | July 1 – Aug. 15, 2012
  Collect parent data – Time 2 | Aug. 1 – Sept. 15, 2012
  Collect key informant data | June 1 – Aug. 1, 2012
  Collect cost data | Sept. 24 – Nov. 5, 2012
Draft and finalize Congressional status report on 2012 demonstrations | Oct. 8 – Nov. 5, 2012
Draft and finalize evaluation report on 2011–2012 demonstrations | April 1 – May 27, 2012

*The contractor will begin to collect 2010 administrative cost data in March 2011.



Prior to data analysis, and especially early in the first data collection period, the contractor will review the frequencies of all variables to identify any “red flags” (e.g., a high percentage of “don’t know” responses (DKs) or missing data; variability of responses). For some of the questions in which a parent is asked to name all children who brought home a backpack and then is asked questions about each child (e.g., contents of the backpack, where the food was stored, the frequency with which all the food was consumed, and which foods were shared with others in the household), it is possible that these questions will become cumbersome for parents with a large number of children in the family. Thus, we will examine the frequency of responses to these questions categorized by the number of children in the family to see whether there are differences in DKs and in variability between families with many and with few children. Variables with more than 15 percent DK or missing data would be considered a “red flag.”
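A minimal sketch of the kind of frequency review described above, assuming survey responses are available in a pandas DataFrame in which DK responses are coded as the string "DK" and missing values as NaN; the variable and column names are illustrative, not taken from the actual instruments.

```python
import pandas as pd

def flag_high_dk_or_missing(df: pd.DataFrame, threshold: float = 0.15) -> pd.Series:
    """Return the share of DK/missing values per variable, keeping only the 'red flags'."""
    dk_or_missing = df.isna() | df.eq("DK")
    rates = dk_or_missing.mean()                       # proportion per column
    return rates[rates > threshold].sort_values(ascending=False)

# Illustrative use with made-up data: one variable exceeds the 15 percent threshold.
example = pd.DataFrame({
    "food_stored_properly": ["yes", "no", "DK", "DK", None, "yes"],
    "backpack_contents_used": ["yes", "yes", "no", "yes", "yes", "no"],
})
print(flag_high_dk_or_missing(example))
```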


Data tabulation will occur separately for the outcome and process evaluations, and the results will then be integrated through a synthesis process. A description of the proposed analysis and synthesis approach is provided below.




Analysis of Outcome Data


Analysis of parent survey data will consist of weighting the data and then conducting analyses to compare the data over time, between the two demonstrations, between the summer and fall, and with national benchmarks on household food security.


Weighting the Survey Results. The first step of the weighting process is to calculate the base weight. The base weight is the reciprocal of the probability of selecting a household (family) for the survey which will vary depending on the size of the demonstration area, the number of sites to be sampled in the case of the Backpack demonstration, and the number of participants to be sampled. Since the sample of households will be determined by the selection of individual participants from lists provided by the demonstration sponsors or sites, the household selection probability will also depend on the number of participating youths in the household. For example, for the Backpack demonstration, if it is necessary to sample households indirectly through the selection of children, the base weight for a sampled household i from site j in demonstration area d is given by the formula:



W^{B}_{dji} = A_{dj} \cdot \frac{N_{dj}}{n_{dj}\, m_{dji}},


where A_{dj} is the inverse of the probability of selecting site j from demonstration area d; n_{dj} is the number of children selected from site j in demonstration area d; N_{dj} is the total number of eligible children served by site j in demonstration area d; and m_{dji} is the number of eligible children in household i served by site j in demonstration area d. Note that for the Meal Delivery demonstration, the formula for the base weight will not involve terms relating to the selection of sites. The base weight defined above is a household-level weight.
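A minimal computational sketch of this base-weight calculation, using the notation defined above; all input values in the example are made-up illustrations, not design parameters.

```python
def household_base_weight(p_site: float, n_sampled_children: int,
                          n_eligible_children: int, n_children_in_household: int) -> float:
    """Base weight for a household sampled indirectly through its children.

    p_site: probability of selecting the site (its inverse is A_dj)
    n_sampled_children: children sampled from the site (n_dj)
    n_eligible_children: eligible children served by the site (N_dj)
    n_children_in_household: eligible children in the household served by the site (m_dji)
    """
    a_dj = 1.0 / p_site
    return a_dj * n_eligible_children / (n_sampled_children * n_children_in_household)

# Illustrative values only: a site selected with probability 0.5, 40 of its
# 200 eligible children sampled, and a household with 2 eligible children.
print(household_base_weight(0.5, 40, 200, 2))  # -> 5.0
```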


The next step is to perform nonresponse adjustments to account for interview nonresponse. This step is important because the interview response rate for the population of interest (households participating in the SFSP enhancement demonstrations) is expected to be low. Although efforts will be made to achieve as high a response rate as practicable with the available resources, it may not be possible to achieve a response rate of 80 percent, in which case a nonresponse bias analysis (NRBA) will be conducted to assess the impact of nonresponse on the survey estimates. It is well known that low response rates threaten the validity of survey estimates if not handled properly. Without appropriate adjustments, the estimates could be severely biased. This threat stems from the fact that the response propensity is usually correlated with the characteristics of the sampled units. To counter this threat a nonresponse adjustment that reflects the differential response propensity of the sampled households will be applied to the base weights, thus reducing the potential bias resulting from low response rates.


Nonresponse adjustments will be made within cells defined by characteristics that are correlated with response propensity and are known for all sampled households. For example, information available in the sampling frame and other administrative data sources (e.g., size of household, number of children, employment status, and income) could be used for this purpose. We will use software such as the Chi-square Automatic Interaction Detector (CHAID) or similar software to form the nonresponse adjustment cells. CHAID is an effective algorithm for identifying cells that are homogeneous with respect to response propensity. The results of the CHAID analysis will be used to define a set of nonresponse adjustment cells within which the weighting adjustments will be calculated. For example, let k denote a weight adjustment cell. The nonresponse-adjusted weight for the ith responding household in cell k will be computed as:


W^{NR}_{ik} = W^{B}_{ik} \cdot \frac{\sum_{i' \in s_k} W^{B}_{i'k}}{\sum_{i' \in r_k} W^{B}_{i'k}},


where the sum of base weights in the numerator extends over the total sample s_k in cell k, and the sum in the denominator extends over the responding households r_k in cell k.
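A minimal sketch of this weighting-class adjustment, assuming a pandas DataFrame with one row per sampled household, a base weight column, a response indicator, and a cell label produced by a CHAID-style classification; all column names are illustrative.

```python
import pandas as pd

def nonresponse_adjust(df: pd.DataFrame,
                       weight_col: str = "base_weight",
                       cell_col: str = "nr_cell",
                       resp_col: str = "responded") -> pd.Series:
    """Weighting-class nonresponse adjustment within cells.

    Within each cell, respondents' base weights are inflated by the ratio of the
    cell's total base weight (all sampled households) to the respondents' total.
    Nonrespondents receive an adjusted weight of zero.
    """
    total_by_cell = df.groupby(cell_col)[weight_col].transform("sum")
    resp_total_by_cell = (df[weight_col] * df[resp_col]).groupby(df[cell_col]).transform("sum")
    factor = total_by_cell / resp_total_by_cell
    return df[weight_col] * df[resp_col] * factor

# Illustrative data: two cells, with one nonrespondent in cell "A".
sample = pd.DataFrame({
    "base_weight": [5.0, 5.0, 10.0, 10.0],
    "nr_cell": ["A", "A", "B", "B"],
    "responded": [1, 0, 1, 1],
})
print(nonresponse_adjust(sample).tolist())  # -> [10.0, 0.0, 10.0, 10.0]
```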


Data Analysis. For the Backpack and Home Delivery demonstrations, we will have detailed individual-level data on food outcomes, demographics, and household circumstances from the parent telephone survey. We will use these data to estimate the percentage of households that experience very low food security, or very low or low food security, during a 30-day reference period. Related to food security, we will tabulate the percentage of households that participate in other programs such as SNAP, TANF, and WIC. We will assess meal targeting by calculating the percentage of households in which the provided food was (mostly) consumed by the children for whom it was intended. We will weight the survey data, as described above, to produce outcome statistics (e.g., percentages, totals, or means) and their corresponding standard errors.


We will make comparisons between demonstration projects, over the years of the demonstrations, within the same demonstration period, relative to the National School Lunch Program, and to external national benchmarks. Analysis will consist of direct comparisons of outcome statistics and model-based comparisons of outcomes.


  • Comparisons of Outcomes Between Demonstrations. To compare the Meal Delivery and Backpack demonstrations, we will calculate the aggregate outcomes of interest (e.g., the percentage of participants who are food secure and the percentage of participants who consume most of the supplemental food themselves) for each demonstration and subtract one from the other to calculate the difference in outcomes between the demonstrations. We will test the differences for significance using their standard error estimates (a sketch of this direct comparison appears after this list). We will make the comparisons for years 2011 and 2012, and for the two years pooled.


We will also use a single equation outcomes model to compare the two demonstration projects. In this analysis the unit of observation is the individual child. We will model food security outcomes as a function of individual and community characteristics to account for differences in the profiles of participating children between the demonstration projects and over time that may confound demonstration outcomes.
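A minimal sketch of one way such an outcomes model could be specified, assuming individual-level records with a binary food security outcome and a demonstration indicator; the variable names, the made-up data, and the use of statsmodels logistic regression are illustrative assumptions, not the evaluation's specified model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative records: outcome is 1 if the household is food secure.
data = pd.DataFrame({
    "food_secure": [1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1],
    "backpack":    [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0],  # 1 = Backpack, 0 = Meal Delivery
    "hh_size":     [3, 5, 4, 2, 6, 5, 4, 3, 2, 6, 5, 4],
    "rural":       [0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1],
})

# Logistic model of food security on demonstration type plus covariates, so that
# differences in participant profiles do not confound the demonstration comparison.
model = smf.logit("food_secure ~ backpack + hh_size + rural", data=data).fit(disp=False)
print(model.params)
```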


  • Comparisons Over the Years of the Demonstration. To assess the progression of the demonstrations over their two years of operations, we will examine the direct difference in outcomes between children in 2011 and the same children in 2012. We will subtract the 2011 outcome estimate from the corresponding 2012 estimate. We will calculate these differences separately by demonstration project and for the two demonstrations pooled. We will test the differences for significance using their standard error estimates. Note that since many of the same households will be interviewed twice, in the summers of 2011 and 2012, variance estimates have to be adjusted for within-person correlation. Standard error calculations will also reflect attrition between the summers and sample replenishment.


  • Comparisons of Food Security Over Time Within the Same Demonstration Period. We will also examine the stability of our household food security measures over time within the same demonstration period. We will take two measures of food security for children participating in the demonstrations during the summer 2012. We will assess the variation in these food security measures by calculating the correlation between them. If there is a great deal of variation between the two measures, we will explore using regression analysis to determine how covariates, such as change in participation in other programs or the time of the month the measure was taken, are contributing to the variation.


  • Comparisons of Aggregate Outcome Measures Relative to the National School Lunch Program (NSLP). We will assess the direct difference in food security outcomes between children who participated in a demonstration in 2011 and the same children in the fall 2011 when they participated in the NSLP. From this comparison, we will be able to assess how close the demonstration projects come to achieving the outcomes of the school year program for participants. We will calculate this difference by subtracting children’s outcome estimate during the summer from their corresponding outcome during the fall of the 2011 school year. We will test the differences for significance using their standard error estimates. Note that since the same households will be interviewed twice, both in the summer and in the fall, variance estimates have to be adjusted for within-person correlation.


  • Comparison to External Benchmarks. We will use the Current Population Survey (CPS) Supplement for national benchmark comparisons of child food security status within the past 30 days. We will compare the food security status of demonstration participants to that of the 2010 CPS sample by household composition, race/ethnicity, household income, residence (e.g., location relative to metropolitan statistical areas), employment and labor force status of adults in the household, educational attainment of adults in the household, age of the oldest child, and number of children in the household.
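
A simplified sketch of this benchmark comparison is shown below. The benchmark rates would be tabulated separately from the CPS; the column names, group names, and dictionary structure here are placeholders only.

    # Illustrative sketch: compare each subgroup's food security rate in the
    # demonstration sample against an externally tabulated CPS benchmark rate.
    import numpy as np
    import pandas as pd
    from scipy import stats

    def compare_to_benchmark(df: pd.DataFrame, benchmarks: dict, group: str):
        """One-sample z-test of each subgroup's rate against its benchmark."""
        rows = []
        for level, sub in df.groupby(group):
            p_hat, n = sub["food_secure"].mean(), len(sub)
            p0 = benchmarks[level]                       # CPS benchmark rate
            se = np.sqrt(p0 * (1 - p0) / n)
            z = (p_hat - p0) / se
            rows.append({group: level, "rate": p_hat, "benchmark": p0,
                         "z": z, "p_value": 2 * stats.norm.sf(abs(z))})
        return pd.DataFrame(rows)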



Analysis of Implementation Process Data


The implementation process analysis will consist of a descriptive analysis of the key processes and outputs addressed in the key informant interviews (e.g., processes for providing demonstration benefits; timing and methods for informing parents and caretakers of the availability of benefits; demonstration procedures; outreach; community partnerships; design, delivery, timing, and effectiveness of sponsor training; roles and responsibilities; administrative controls; actions to maintain program integrity and prevent loss, theft, and improper issuance at the Federal, State, local agency, and provider levels; and challenges and their resolution). We will compare responses to similar questions administered to different categories of key respondents to validate responses and to capture impressions from a variety of perspectives. We will address these processes and outputs for each demonstration project and compare and contrast our findings.



Analysis of Cost Data


The cost data will provide economic cost estimates by cost component, demonstration, and funding source. We will further analyze the data to document distinctions between and comparisons among funding sources (grant funding, other government and private sources, in-kind funds and volunteers), administrative start-up costs, ongoing administrative costs of operation, and benefit costs. We will conduct analyses to provide total and average costs per meal delivered, as well as the range of costs. Because we will have cost data by sponsor, we will also explore whether the data support estimating the marginal cost of serving one additional child under the demonstrations.
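
The sketch below illustrates the cost-per-meal tabulation by demonstration and funding source, assuming a sponsor-level cost file with hypothetical column names (sponsor, demo, funding_source, cost, meals).

    # Illustrative sketch: total, average, and range of cost per meal by
    # demonstration and funding source, built from sponsor-level records.
    import pandas as pd

    def cost_per_meal(costs: pd.DataFrame) -> pd.DataFrame:
        """Summarize cost per meal by demonstration and funding source."""
        by_sponsor = (costs.groupby(["demo", "funding_source", "sponsor"], as_index=False)
                           .agg(total_cost=("cost", "sum"), meals=("meals", "sum")))
        by_sponsor["cost_per_meal"] = by_sponsor["total_cost"] / by_sponsor["meals"]
        return (by_sponsor.groupby(["demo", "funding_source"])["cost_per_meal"]
                          .agg(["mean", "min", "max"]))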


Focusing on common outcome measures across the demonstrations, the contractor will examine efficiency measures in terms of variation in cost per unit of outcome. We will document the cost differences across the four demonstrations and explore possible reasons for the variation. Additionally, we will examine the characteristics of sponsors within a demonstration that have the lowest average costs per unit of output to determine whether there are commonalities in their structure and attributes.
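
For example, one simple efficiency measure divides a demonstration's total cost by the number of food-secure participants; the figures in the sketch below are illustrative only and do not reflect actual demonstration data.

    # Illustrative sketch of a cost-per-outcome efficiency measure.
    def cost_per_outcome(total_cost: float, participants: int,
                         food_secure_rate: float) -> float:
        """Cost per food-secure participant under a given demonstration."""
        food_secure_children = participants * food_secure_rate
        return total_cost / food_secure_children

    # Hypothetical example: $250,000 spent, 1,000 participants, 60% food secure
    example = cost_per_outcome(250_000, 1_000, 0.60)   # ≈ $416.67 per food-secure child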


Final Synthesis


Using two types of triangulation (data triangulation and methodological triangulation), we will examine major themes across all data sources and methodologies to draw conclusions about the effectiveness and efficiency of the demonstration projects. Findings from the analysis of administrative data will be included in our final synthesis.




A.17 If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.



USDA is not seeking an exemption from display of the expiration date of OMB approval.



A.18 Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act Submissions," of OMB Form 83-I.



USDA is not requesting any exceptions to item 19 of OMB Form 83-I.



References

Boteach, M., & Milam, S. (2010). As school ends, hunger begins: Children need access to subsidized meals in summer too. Center for American Progress, June 9, 2010. http://www.americanprogress.org/issues/2010/06/summer_hunger.html. Accessed November 12, 2010.

Kulka, R. (1994). The use of incentives to survey ‘hard-to-reach’ respondents: A brief review of empirical research and current practice. Paper prepared for seminar on new directions in statistical methodology, Bethesda, MD.

Mack, S., Huggins, V., Keathley, D., & Sundukchi, M. (1998). Do monetary incentives improve response rates in the Survey of Income and Program Participation? Proceedings of the Section on Survey Methodology, American Statistical Association, 529-34.

Martinez-Ebers, V. (1997). Using monetary incentives with hard-to-reach populations in panel surveys. International Journal of Public Opinion Research, 9, 77-86.

Newman, C., & Ralston, K. (2006). Profiles of participants in the National School Lunch Program: Data from two national surveys. USDA Economic Information Bulletin, No. 17, August 2006. http://151.121.68.30/publications/eib17/eib17.pdf. Accessed April 7, 2011.

Nord, M., & Romig, K. (2006). Hunger in the summer: Seasonal food insecurity and the National School Lunch and Summer Food Service programs. Journal of Children and Poverty, 12(2), 141-158.

Singer, E. & Bossarte, R.M. (2006). Incentives for survey participation: When are they coercive? American Journal of Preventive Medicine, 31, 411-418.

US Bureau of the Census, Current Population Survey, Annual Social and Economic Supplements (2007), available at http://www.census.gov/hhes/www/poverty/histpov/hstpov8.xls; U.S. Department of Agriculture, Household Food Security in the United States, Economic Research Reports, No. 11 (2004), No. 29 (2005), No. 49 (2006), and No. 66 (2007) (Washington, DC: USDA, Economic Research Service).

USDA. (2003). Feeding Low-Income Children When School Is Out - The Summer Food Service Program: Anne Gordon and Ronette Briefel, Mathematica Policy Research, Inc., for the Food and Rural Economics Division, Economic Research Service, U.S. Department of Agriculture. Food Assistance and Nutrition Research Report No. 30, 2003.

USDA. (2006). Analysis of Summer Food Program and Food Needs of Nonparticipating Children: Final Report, Special Nutrition Program Report Series, No. CN-06-SFSP, Project Officer: Fred Lesnett. U.S. Department of Agriculture, Food and Nutrition Service, Office of Analysis, Nutrition, and Evaluation, Alexandria, VA: 2006.

USDA. (2010). Solicitation for State Agencies: Summer 2011 SFSP Home Delivery and Food Backpacks Demonstration Projects. Request for Applications. http://www.fns.usda.gov/ora/menu/DemoProjects/SummerFood/MealDelivery_RFA.pdf. Accessed November 15, 2010.



1 In 1975, a separate Child Care Food Program and a Summer Food Service Program were authorized by an amendment to the National School Lunch Act (P.L. 94-105).

2 Demonstration 3 (the Meal Delivery demonstration) is not expected to operate with the use of local sites. Given the close proximity between the end of the demonstration in Arizona and the beginning of the 2011/2012 school year, Arizona sponsors have indicated that in-person site visits will not be convenient for them. To acquire the same information gathered from the other Meal Delivery demonstration States, the interviews will be conducted by phone using identical interview protocols.


3 In the summer of 2011 and for the first summer 2012 measures, parent questionnaires will be administered at least 30 days after the beginning of the demonstration because of the 30-day retrospective period of the food insecurity measures. The 2011 summer measures will be compared to the fall 2011 measures to determine the differences in food insecurity among summer project participants and those same participants when they are back in school and receiving regular school meals. The second summer 2012 measures will be administered at least 30 days after the first 2012 measures. Differences between these two measures will allow us to examine the variability in summer food security estimates.

