Evaluation of Demonstration Projects to End Childhood Hunger (EDECH)
Contract Number: AG-3198-C-14-0019
OMB Supporting Statement
Part A: Justification
Project Officer: Danielle Berman
Office of Policy Support
U.S. Department of Agriculture
Food and Nutrition Service
3101 Park Center Drive
Alexandria, VA 22302
CONTENTS
PART A: JUSTIFICATION
A.1. Explanation of circumstances that make collection of data necessary
A.2. How the information will be used, by whom, and for what purpose
1. Use of the information
2. Study objectives
3. Data collection from project staff and State and local partner organizations
4. Participant data collection
A.3. Uses of improved information technology to reduce burden
A.4. Efforts to identify and avoid duplication
A.5. Efforts to minimize burden on small businesses or other entities
A.6. Consequences of less frequent data collection
A.7. Special circumstances requiring collection of information in a manner inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations
A.8. Federal Register comments and efforts to consult with persons outside the agency
A.9. Payments to respondents
A.10. Assurance of confidentiality
A.11. Questions of a sensitive nature
A.12. Estimates of respondent burden
A.13. Estimates of other annual costs to respondents
A.14. Estimates of annualized government costs
A.15. Changes in hour burden
A.16. Time schedule, publication, and analysis plans
1. Study schedule
2. Publication of study results
3. Plans for analysis
A.17. Display of expiration date for OMB approval
A.18. Exceptions to certification statement
References
Attachments
A1 Healthy, Hunger-Free Kids Act (HHFKA) of 2010, Section 23 [42 U.S.C. 1769d]
A2 Overview of evaluation approach
B Instruments and protocols
B.1 Implementation study staff interview guide
B.2.a Cost study instructions—start-up
B.2.b Cost study worksheets—start-up
B.2.c Cost study instructions—implementation
B.2.d Cost study worksheets—implementation
B.3.a Participant focus group discussion guide—English
B.3.b Participant focus group discussion guide—Spanish
B.4.a Participant in-depth interview topic guide—English
B.4.b Participant in-depth interview topic guide—Spanish
B.5.a Household survey, baseline—English
B.5.b Household survey, baseline—Spanish
B.6.a Household survey, 12-month follow-up—English
B.6.b Household survey, 12-month follow-up—Spanish
B.7.a Household survey, 18-month follow-up—English
B.7.b Household survey, 18-month follow-up—Spanish
C Contact materials
C.1.a Participant focus group telephone screener—English
C.1.b Participant focus group telephone screener—Spanish
C.2.a Frequently asked questions (for focus group recruitment)—English
C.2.b Frequently asked questions (for focus group recruitment)—Spanish
C.3.a Participant focus group confirmation letter/email—English
C.3.b Participant focus group confirmation letter/email—Spanish
C.4.a Participant focus group reminder script—English
C.4.b Participant focus group reminder script—Spanish
C.5.a Participant in-depth interview recruitment script—English
C.5.b Participant in-depth interview recruitment script—Spanish
C.6.a Participant in-depth interview confirmation letter/email—English
C.6.b Participant in-depth interview confirmation letter/email—Spanish
C.7.a Participant in-depth interview reminder script—English
C.7.b Participant in-depth interview reminder script—Spanish
C.8.a Study brochure—English
C.8.b Study brochure—Spanish
C.9.a Household survey advance letter—English
C.9.b Household survey advance letter—Spanish
C.10.a Household survey 12- and 18-month follow-up advance letter—English
C.10.b Household survey 12- and 18-month follow-up advance letter—Spanish
C.11.a Household survey reminder letter—English
C.11.b Household survey reminder letter—Spanish
C.12.a Household survey refusal conversion letter—English
C.12.b Household survey refusal conversion letter—Spanish
C.13.a Household survey field locating letter—English
C.13.b Household survey field locating letter—Spanish
D Consent forms
D.1.a Participant focus group consent form—English
D.1.b Participant focus group consent form—Spanish
D.2.a Participant interview consent form—English
D.2.b Participant interview consent form—Spanish
D.3.a Active consent for household survey—English
D.3.b Active consent for household survey—Spanish
D.4.a Passive consent for household survey—English
D.4.b Passive consent for household survey—Spanish
E Federal Register notice
F National Agricultural Statistics Service comments
G Confidentiality pledge
H Burden tables
H.1 Participant burden estimates
H.2 Alternate participant burden estimates assuming lower household survey response rates
I Federal Register comments
J Responses to Federal Register comments
A.1. Explanation of circumstances that make collection of data necessary

Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
In the Healthy, Hunger-Free Kids Act (HHFKA) of 2010 (Public Law 111-296), Congress added a new Section 23 [42 U.S.C. 1769d] to the Richard B. Russell National School Lunch Act to develop and evaluate innovative strategies to “reduce the risk of childhood hunger or provide a significant improvement to the food security status of households with children.” This section mandates research on the causes and consequences of childhood hunger and the testing of innovative strategies to end childhood hunger and food insecurity. In the HHFKA, Congress called for the evaluation of demonstration projects to end childhood hunger (henceforth denoted as the Evaluation of Demonstration Projects to End Childhood Hunger, or EDECH). EDECH will assess impacts, implementation, and costs in five demonstration projects. The data being collected under this submission are necessary to meet the congressionally mandated requirement. A copy of the statute is included in Attachment A1.
FNS/USDA reviewed applications and awarded cooperative agreements to five State and Tribal agencies in February 2015. The agencies’ demonstration projects are described below:
The Chickasaw Nation will implement the Packed Promise project, which will provide food as well as nutrition education materials through home delivery, plus vouchers for purchase of fresh fruits and vegetables to households with children who qualify for free school meals. The demonstration area includes rural counties in south-central Oklahoma.
The Commonwealth of Kentucky will implement the Ticket to Healthy Food SNAP Demonstration, which will compare control households with children that experience no change in benefits to treatment households with children that receive additional deductions in the calculation of Supplemental Nutrition Assistance Program (SNAP) net income. All treatment households will receive a fixed transportation deduction, and treatment households that report any earned income will also receive an enhanced earned income deduction equal to 10 percent of earned income. The demonstration area includes rural counties in eastern Kentucky.
The Nevada Division of Public and Behavioral Health will implement the Nevada Healthy Hunger Free Kids Project, which will compare (1) a group receiving an increase in SNAP benefits, (2) a group receiving an increase in SNAP benefits plus additional outreach, education, and case management, and (3) a control group. The demonstration area includes zip codes in an urban county in southern Nevada.
The Navajo Nation Division of Health will implement the Food Access Navigation Project, which will hire Food Access Navigators to evaluate assets and gaps in food access infrastructure. The project aims to increase the number of school breakfast and afterschool food programs by 30-50 percent and the number of summer food sites by 25 percent. The demonstration area includes rural tribal chapters in the eastern portion of Navajo Nation.
The Virginia Department of Education will implement the Virginia Hunger-Free Kids Act Demonstration Project, which will compare control schools to treatment schools that will (1) serve three meals a day to all children during the school year, (2) provide food backpacks for weekends and school breaks, and (3) extend an enhanced SNAP benefit or electronic benefits transfer (EBT) card during the summer months to children eligible for free or reduced-price school meals. Virginia has two demonstration areas: one urban site located in the central part of the State and one rural site located in southwest Virginia.
A.2. How the information will be used, by whom, and for what purpose

Indicate how, by whom, how frequently, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
1. Use of the information

The study is a new information collection. FNS/USDA will use the information gathered in the data collection activities described here to describe the five demonstration projects; to determine if the demonstration projects reduced food insecurity among children (referred to as child food insecurity) or household food insecurity for households with children; and to describe the relative effectiveness and cost-efficiency of the demonstration projects.
2. Study objectives

The evaluation has seven research objectives:
To describe each demonstration project in detail
To describe the processes involved in the implementation and operation of each demonstration project
To determine the impact of each demonstration project on the prevalence of food insecurity
To determine how impacts on food insecurity among children and households with children vary by relevant factors
To identify outcomes related to site-specific components of each demonstration
To determine the total and component costs of each demonstration project
To describe the relative effectiveness and cost-efficiency of the demonstration projects
Attachment A2 provides an overview of the evaluation approach, including the sample, data sources, and main outcomes for each of the study’s objectives.
3. Data collection from project staff and State and local partner organizations

To support the implementation study, the contractor will collect data from project staff and State and local partner organizations through in-person interviews. Qualitative data from interviews will describe each demonstration project (Objective 1) and the implementation and operations of each project (Objective 2). Findings related to implementation procedures, successes, and challenges will help the study team interpret the project outcomes (Objective 5).
Interview respondents. The contractor will conduct interviews with the State or Tribal agency director/manager and key local partners in each site. USDA has established cooperative agreements with the State or Tribal agencies (Chickasaw Nation, Kentucky, Navajo Nation, Nevada, and Virginia) to implement the demonstration projects. Lead agencies are partnering with other State and local public agencies and private organizations depending on the nature and scope of their demonstration projects. Lead and partner public agencies may include, for instance, entities responsible for administering SNAP; the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC); and/or school breakfast and lunch programs at the State, local, or Tribal level. Private partners may include EBT vendors that deliver enhanced SNAP benefits via EBT cards and/or community-based organizations that provide nutrition education as part of the intervention, help households enroll in benefit programs, or provide access to supplemental food. The contractor will conduct interviews with administrators or managers who oversee or guide the demonstration’s design and implementation, frontline staff who deliver services to participants, and administrators at organizations that partner with implementing agencies but do not provide intervention services directly.
Interview topics and timing. The contractor plans to conduct up to three rounds of in-person interviews with staff in each site, depending on the length of each site’s demonstration period. Three awardees will implement the project for 12 months; they will receive two in-person visits. Two awardees will operate for 12 to 24 months and will receive three visits. The first visit will occur in all sites during fall 2015, at the end of the planning period, just before the launch of each demonstration. These interviews will focus on activities that occurred during the planning period, including topics such as the vision or logic model for the project, planned project design, implementation plan, community context, and the planning process itself (Attachment B.1).
Interviews conducted in the second visits will cover similar topics but with a focus on activities and experiences during the demonstration period. Interviews will probe leadership and partner roles, staffing structures, recruitment and engagement strategies, specific services offered and received, deviations from plans, and respondents’ perceptions of challenges and successes, among other topics (Attachment B.1). The third visit will repeat the same set of interview topics with a focus on capturing changes that have occurred since the prior interviews.
The goal of the cost study is to understand the resources needed to implement and operate each demonstration and the costs of those resources, overall and for each household enrolled in the demonstration. In addition, the cost and impact data will ultimately be combined to assess the relative cost-effectiveness of each demonstration in reducing food insecurity among children and food insecurity of households with children (Objectives 6 and 7).
Assessing demonstration costs involves collecting comprehensive data from awardee staff and administrative data sources on the costs of implementing and operating each demonstration intervention. All interventions will incur costs in some or all of the following categories: (1) personnel/staff labor directly related to the intervention; (2) volunteer or donated labor; (3) nonlabor direct services or supplies (such as after-school suppers); (4) travel; and (5) indirect costs, such as accounting, space and facilities, and human resources. The contractor will use the instructions and instruments in Attachments B.2.a–d to ensure systematic data collection across these major cost categories during the start-up period and during the implementation period. The forms can be customized to account for key differences across interventions.
The two cost instruments are designed as Microsoft Excel workbooks that will capture (1) start-up costs (Attachments B.2.a–b) and (2) ongoing operational and administrative costs (Attachments B.2.c–d). The State, local, or Tribal agency director or manager (or a designated cost study liaison, such as a financial analyst) will coordinate completion of the workbooks at the agency level. As appropriate, private sector not-for-profit agency directors or managers may also complete workbooks (such workbooks would likely contain only worksheets or workbook pages 2–4). To avoid burdening agencies with small roles and for-profit contractors, their costs will be based on the awardees’ estimates of the costs or resources provided or on information from invoices they send the awardee.
Soon after the pre-implementation interviews, the evaluation contractor will provide cost study respondents with the start-up cost data workbook and the ongoing operations period workbook and train cost study respondents to complete them through an hour-long webinar. Cost study respondents will be asked to submit the start-up cost form within a month of the training and the operational cost form quarterly (to minimize recall errors). To avoid duplicating efforts and burdening respondents, the contractor will work with demonstration projects to obtain the cost data in whatever way is most convenient, including accepting administrative reports or copies of invoices that provide some or all of the needed information in a different format. The estimate of burden includes time for cost study respondents to answer questions via email or telephone about the materials they submit. Respondents will submit completed cost data workbooks via a secure file transfer protocol site, in case personally identifiable information is included.
The evaluation contractor will use administrative data to assess the costs of food benefits from the point of view of the Federal government. For interventions that supplement food benefits with a social service component, the contractor will use administrative data from the awardee to determine the level of participation in the social services and benefits received.
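To illustrate the per-household cost arithmetic the cost study supports, the following is a minimal sketch in which every category value, variable name, and the enrollment count are hypothetical placeholders; the actual data will come from the customized Excel workbooks (Attachments B.2.a–d) and administrative sources.

```python
# Minimal illustrative sketch; every figure below is hypothetical.
# Real cost data come from the Excel workbooks (Attachments B.2.a-d).

cost_categories = {
    "personnel_labor": 180_000.00,           # staff labor directly on the intervention
    "volunteer_donated_labor": 12_000.00,    # valued volunteer or donated time
    "nonlabor_direct_services": 250_000.00,  # e.g., after-school suppers, supplies
    "travel": 8_000.00,
    "indirect_costs": 45_000.00,             # accounting, space and facilities, HR
}

enrolled_households = 2_500  # hypothetical enrollment for one demonstration

total_cost = sum(cost_categories.values())
cost_per_household = total_cost / enrolled_households
print(f"Total cost: ${total_cost:,.0f}")
print(f"Cost per enrolled household: ${cost_per_household:,.2f}")
```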
Administrative data from demonstration projects will be used to describe project implementation (Objective 2) and to interpret impact estimates (Objectives 3 and 5) and costs (Objectives 6 and 7). The administrative data collected may include (1) household composition (size, income, number of children, and indicator of whether the household is new or returning to benefits); (2) project application timing and application disposition, including any denial reason; and (3) benefit information (program participation, level of benefits, and benefit use).
First, contractor staff will work with the State, local, or Tribal agency director/manager (or designee) to identify the agencies, staff, and data types relevant to the demonstration project. Second, the contractor will develop a comprehensive memorandum of understanding (MOU) for each State or Tribal organization specifying the data sources and variables to be shared for the evaluation, procedures to protect the data, and plans for developing and transferring a public use data file at the end of the evaluation. The timing and mode of transmission for data from each source will be determined in consultation between the contractor and demonstration projects. Third, the State, local, or Tribal agency director/manager will work with the contractor to ensure that the evaluation will obtain the necessary data according to the schedule agreed upon in the MOU.
Because the structure of each demonstration will vary, the exact approach to obtaining administrative data will also vary among projects. The contractor will coordinate across multiple agencies to receive different file types at varied frequencies to obtain the information needed for the intended analyses. Exhibit A.3.c illustrates file types and the frequency of expected collection.
Exhibit A.3.c. Administrative data files and frequency of collection for EDECH
Administrative data source | Frequency | Rationale for inclusion and notes
SNAP caseload files | Monthly | Monthly records enable us to check whether a household was enrolled in SNAP last month or recently; administrative data do not always clearly distinguish new from recertifying clients.
SNAP EBT data | Semiannually | Regular information on benefit usage enables us to examine changes in how households spend the benefits for which they qualify.
WIC | Biennial (April 2016 and April 2018) | Biennial WIC program and participant characteristics data provide a census of all cases.
NSLP/SBP | Semiannually (October, April) | The October file represents school districts’ best counts of program enrollment for the year and will capture many families; a midyear extract updates us on eligibility changes during the year. Participation data can be at the individual or aggregate school levels.
CACFP | Varies by State | States and localities have different systems for tracking CACFP enrollment and participation, so the contractor must work closely with local liaisons to arrange a data delivery plan that meets evaluation needs.
Intervention-specific administrative data | Varies by awardee | Awardees that provide services (e.g., backpacks of food) to treatment group members, rather than or in addition to changes to project benefits and/or access, will be expected to track provision of these services. These data enable us to assess the fidelity of project implementation, project and service take-up rates, and the nature and intensity of services that project participants receive.
CACFP = Child and Adult Care Food Program; NSLP = National School Lunch Program; SBP = School Breakfast Program.
4. Participant data collection

During the second and (when applicable) third site visits, the contractor will conduct two focus groups with the parents/guardians of families participating in the demonstrations. Participants will provide a firsthand account of the service components offered and received and of the implementation process (Objective 2). Their experiences will contribute to the interpretation of project impacts (Objective 5). Focus groups will discuss how participants learned of the project, their motivation to participate, the services they received, the implementing agencies they encountered, their experiences interacting with project staff, their perceptions of the usefulness of the project, their other thoughts on the project’s successes and challenges, and their suggestions for project improvement. Focus groups will be held in the evenings at convenient locations in the intervention areas, and discussions will last no longer than 90 minutes.
Focus group recruitment, consent, and incentives. The contractor will seek to collect participants’ contact information from each awardee and will inquire about convenient locations to host the focus groups. Staff will aim to recruit 20 to 25 participants from among households participating in the project, with the expectation that 8 to 12 will attend. Staff experienced in recruiting respondents will contact focus group participants directly by telephone to explain the study’s purpose, the topics to be discussed, incentives, and logistics (Attachments C.1-2). Staff will mail a confirmation letter (Attachment C.3) and make a reminder call to those who agreed to participate (Attachment C.4). The contractor will offer participants an incentive of $50 (see Section A.9 for justification). Before the focus group begins, staff will obtain active consent from each participant by distributing consent forms, explaining their content, and asking those who agree to participate to sign the forms indicating their consent (Attachment D.1).
The contractor will conduct in-person, 90-minute interviews with 80 survey respondents from the treatment group (16 in each demonstration project) to expand knowledge obtained from the survey. The objective is to understand the reasons participants are or are not food secure at follow-up and the pathways through which the projects might be helping participants. Interview topics will include household food consumption, social and community supports, coping strategies, and nutrition assistance, among others (Attachment B.4).
In-depth interview recruitment, consent, and incentives. The contractor will purposively select a sample of respondents based on their responses to the follow-up survey. The selection process will prioritize respondents representing food-secure and food-insecure households in order to enable comparisons. Only those who indicated on the survey that they would be willing to participate in an in-person interview will be recruited. Trained staff will contact interview participants by telephone to explain the purpose of the interviews, the topics to be discussed, logistics, and incentives, and to address any concerns (Attachment C.5). Staff will call, mail, and (if applicable) email those who agreed to participate to remind them of the upcoming interview (Attachments C.6 and C.7). The contractor will offer participants $50 as a token of appreciation (justified in Section A.9) and will obtain active consent by distributing a consent form, explaining its content, and asking those who agree to participate to sign the form indicating their consent (Attachment D.2).
The purpose of the baseline and follow-up telephone surveys is to determine the impact of the demonstration projects on the prevalence of food insecurity among children and food insecurity of households with children (Objectives 3 and 4), as well as the components of each demonstration that are related to outcomes such as participation in nutrition assistance programs, food shopping and spending, and dietary quality (Objective 5). Survey topics include household food security; sociodemographic characteristics; nutrition assistance program participation and supports; food expenditures and food access; food shopping and other related behaviors; and children’s diet quality. The contractor will conduct household surveys at baseline, approximately one to three months before full project operations begin. The survey will be administered by telephone to an adult member of the household who does most of the meal planning or food shopping for the household, referred to as the parent/guardian (Attachment B.5). The contractor will contact respondents for a follow-up survey 12 months after the baseline survey (Attachment B.6). For the one demonstration project with an implementation period lasting more than 12 months, the contractor will administer a second follow-up survey 18 months after baseline (Attachment B.7).
Notifying households and obtaining consent. Awardees or the contractor will obtain consent from eligible households. Exhibit A.4.c shows the steps to be taken by grantees and the contractor for the recruitment, consent, and advance survey activities for each of the demonstration projects. The procedures vary based on local requirements and schedules. Two awardees—Chickasaw and Navajo Nations—require active consent, in which participants must sign a form allowing the contractor to contact them to conduct the survey. The remainder allow passive consent, in which a letter and information about the data collection are provided to participants with an option to refuse participation in the study.
Exhibit A.4.c. Recruitment and consent process for EDECH data collection
Grantee | Recruitment and consent processes by grantee | Consent and advance survey processes by contractor
Chickasaw Nation | Grantee notifies households of the demonstration project. Grantee obtains written active consent for the demonstration and for sharing contact information with the contractor. Grantee confirms household eligibility. | Contractor selects evaluation sample and mails the baseline survey advance letter. Contractor obtains active verbal consent from the evaluation sample for the evaluation.
Kentucky | Grantee identifies eligible households from SNAP files and notifies them of the demonstration project. | Contractor mails the baseline survey advance letter and obtains passive consent from the evaluation sample.
Navajo Nation | Grantee notifies eligible households of the demonstration project and the evaluation, and obtains written active consent. | Contractor mails the baseline survey advance letter to the actively consented evaluation sample.
Nevada | Grantee identifies eligible households from SNAP files and notifies them of the demonstration project. | Contractor mails the baseline survey advance letter and obtains passive consent from the evaluation sample.
Virginia | Grantee notifies eligible households of the demonstration project and the evaluation, and obtains passive consent. | Contractor mails the baseline survey advance letter to the consented evaluation sample.
Among demonstration projects requiring active consent, participants will receive a study brochure (Attachment C.8) and consent form that they must sign indicating whether they agree to be contacted for the survey (Attachment D.3). Some of the data fields or information requested in the active consent letter are demonstration-specific and therefore may vary by active consent site. Among demonstration projects that allow passive consent, participants will receive a study brochure (Attachment C.8) and either a notification letter informing them of how to opt out (by toll-free number or email) if they choose not to participate in the study (Attachment D.4) or an advance letter describing opt-out procedures (Attachment C.9). The awardee will provide contact information for eligible participants to the contractor by uploading information to a secure file transfer protocol site.
Conducting the household survey. The contractor will send an advance letter to participants before the baseline (Attachment C.9) and follow-up (Attachment C.10) rounds of the survey to inform them that the survey is beginning (see also Exhibit A.4.c). To encourage completion of the surveys, a $30 incentive will be offered to households. Justification for the use of incentives in this low-income, hard-to-reach population is provided in Section A.9.
The contractor will attempt to reach respondents with a reminder letter (Attachment C.11) if multiple call attempts prove unsuccessful. The contractor will also send a letter to people who initially decline to complete the interview to emphasize the importance of the survey and ask them to reconsider participating (Attachment C.12). For the follow-up surveys, the contractor will mail an additional letter to participants to notify them that someone may try to contact them in person to complete the survey (Attachment C.13).
A.3. Uses of improved information technology to reduce burden

Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
The study strives to comply with the E-Government Act of 2002 (Public Law 107-347, 44 U.S.C. ch. 36) by using computer-assisted telephone interviewing (CATI) for the surveys of parents/guardians. By including programmed skip patterns and consistency and data range checks, this technology reduces the data entry errors that, in pencil-and-paper interviewing, often necessitate callbacks to respondents to clarify recorded responses. The study will collect all data for the household surveys electronically using CATI.
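The sketch below illustrates, in simplified form, the kinds of automated checks CATI enforces. The item wording, variable names, and value ranges here are hypothetical and are not drawn from the actual survey instruments (Attachments B.5–B.7).

```python
# Hypothetical illustration of CATI-style skip patterns, range checks, and
# consistency checks; not the actual instrument logic.

def ask(prompt, cast=str):
    return cast(input(prompt + " "))

# Range check: re-ask rather than record an implausible value.
num_children = ask("How many children under 18 live in the household?", int)
while not 0 <= num_children <= 15:
    num_children = ask("Please enter a number between 0 and 15:", int)

# Skip pattern: child food security items are administered only when relevant.
if num_children > 0:
    ask("(child food security module would be administered here)")

# Consistency check: contradictory answers are flagged for the interviewer
# in real time instead of requiring a callback to the respondent.
receives_snap = ask("Does anyone in the household receive SNAP? (y/n)") == "y"
snap_amount = ask("Monthly SNAP benefit amount in dollars:", float) if receives_snap else 0.0
if receives_snap and snap_amount <= 0.0:
    print("Consistency check: SNAP participation reported but amount is $0; please verify.")
```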
The contractor will also collect all planning and start-up costs and ongoing administration costs electronically through Excel workbooks, which are easily customizable and likely familiar to the staff who will be completing them. This format will enable the contractor to systematically collect data across major cost categories while limiting burden by accounting for key differences across intervention models, so that the information requested of State, local, or Tribal agency directors or managers and private sector not-for-profit agency directors or managers is pertinent to their demonstration projects.
It is not practicable to offer electronic reporting for the in-depth interviews with project staff, State and local partner organizations, and participants, or for the focus groups with participants, because of the qualitative, discussion-based nature of these activities. All notes will be recorded electronically.
A.4. Efforts to identify and avoid duplication

Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purpose described in item 2 above.
FNS has made every effort to avoid duplication. FNS has reviewed USDA reporting requirements, State administrative agency reporting requirements, and special studies by other government and private agencies. To our knowledge, there is no similar information available or being collected for the current time frame that could be used to evaluate the congressionally mandated demonstration projects.
A.5. Efforts to minimize burden on small businesses or other entities

If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
Information being requested or required has been held to the minimum required for the intended use. Although smaller State agencies, Indian Tribal Organizations, and for-profit and not-for-profit awardee partners are involved in this data collection effort, they deliver the same program benefits and perform the same functions as any other State agency or business partner. Thus, they maintain the same kinds of administrative information on file. We estimate that one small business or other small entity will serve as a partner to each awardee (five total). The same methods to minimize burden will be used with all such entities. To avoid burdening agencies with small roles and for-profit contractors, the evaluation contractor will exclude from the data collection those that receive minimal funding or resources from the awardee.
A.6. Consequences of less frequent data collection

Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
The data collection described in this document is essential for meeting the congressional mandate for an independent evaluation of the demonstration projects to end childhood hunger. There is currently no other effort that can address the research objectives of the proposed study. Without this information, FNS will not be able to produce the required annual Reports to Congress. Moreover, collecting data less frequently would jeopardize the impact evaluation, because the design requires both pre- and post-intervention household surveys. Conducting interviews at more than one point in the projects’ life cycle is also necessary to understand project implementation and maintenance.
A.7. Special circumstances requiring collection of information in a manner inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

Explain any special circumstances that would cause an information collection to be conducted in a manner:
requiring respondents to report information to the agency more often than quarterly;
Collecting some types of administrative data more often than quarterly is warranted to accomplish study objectives. For example, monthly records from SNAP application files best enable the study to track trends. State systems can overwrite historical data about a case when a new application is filed, so not collecting these data monthly would impair the ability to track trends in new applications over the study period. Exhibit A.3.c describes the rationale for including various types of administrative data and why some necessitate more frequent collection. File types, structures, and availability will vary by State and demonstration project, so the contractor will coordinate across multiple agencies to receive different file types at varied frequencies to obtain the information needed for the implementation and impact analyses.
requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
requiring respondents to submit more than an original and two copies of any document;
requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years;
in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.
There are no other special circumstances; information collection is consistent with 5 CFR 1320.5.
A.8. Federal Register comments and efforts to consult with persons outside the agency

If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency's notice, soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments.
8a. Federal Register notice and comments
A notice of the proposed information collection and an invitation for public comment were published in the Federal Register on December 29, 2014 (volume 79, number 248, pages 78027–78029; Attachment E). Two comments were received during the public comment period. The public comments and FNS’ responses to the comments are included in Attachments I and J.
8b. Consultations outside the agency
In addition to soliciting comments from the public, FNS consulted with the people listed in Exhibit A.8.a for their expertise in matters such as data sources and availability, research design, sample design, level of burden, and clarity of instructions for this collection.
Exhibit A.8.a. Consultations outside the agency
Name | Degree | Title | Organization | Telephone number
Alisha Coleman-Jensen | Ph.D. | Social Science Analyst, Economic Research Service | USDA | (202) 694-5456
Christian Gregory | Ph.D. | Agricultural Economist, Economic Research Service | USDA | (202) 694-5132
Shelley Ver Ploeg | Ph.D. | Economist, Economic Research Service | USDA | (202) 694-5372
Audra Zakzeski | M.S. | National Agricultural Statistics Service | USDA | (703) 877-8000
A.9. Payments to respondents

Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
Incentives for this information collection are planned for the participant focus groups, in-depth interviews, and household surveys, all of which are voluntary for respondents. For the household surveys, the incentives are an essential component of the multiple methods that will be used to minimize non-response bias, as described in Section B.3 of this information collection request, and they are especially critical because of the longitudinal design, in which household respondents will be followed for up to 18 months and re-contacted for a second or third follow-up survey.
As noted in Section A.2, a primary objective of this study is to ascertain the effectiveness of each individual project, which requires a sufficient level of statistical precision (see Part B, Section B.2.2) and thus a large enough sample of completed interviews to detect statistically significant impacts of the interventions. In three of the five intervention sites, the sample sizes are constrained by the number of schools or Tribal chapters and the number of eligible households with students. In the other two sites, the sample sizes are not quite as limited but are affected by the design and the schedule for project implementation; in these two SNAP sites, the baseline survey must be conducted before randomization to support the project implementation schedule, so the final completed interview sample sizes are a function of the number of cases completing the baseline survey during that timeframe. Given these limits on the available sample, the study must maximize each project’s response rate, aiming for a rate as close to 80 percent as possible, or higher, in part through the incentive program discussed in this section.
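As a rough illustration of why response rates matter given these fixed sample frames, the sketch below computes the completed interviews needed per arm to detect a hypothetical drop in food insecurity and shows how lower response rates inflate the number of households that must be fielded. The rates and parameters shown are placeholders, not the study’s actual power assumptions, which appear in Part B, Section B.2.2.

```python
# Hypothetical two-proportion power sketch: completed interviews needed per
# arm to detect a drop in food insecurity from 25% to 18% (placeholder rates).
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

effect = proportion_effectsize(0.25, 0.18)  # Cohen's h for the two rates
n_per_arm = NormalIndPower().solve_power(effect_size=effect, alpha=0.05,
                                         power=0.80, alternative="two-sided")
print(f"Completed interviews needed per arm: about {n_per_arm:.0f}")

# Lower response rates inflate the number of households that must be fielded.
for response_rate in (0.80, 0.60):
    fielded = n_per_arm / response_rate
    print(f"At a {response_rate:.0%} response rate, field about {fielded:.0f} households per arm")
```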
The target population for this study is low-income households with children at risk of childhood food insecurity. Households eligible for EDECH include those eligible for free school meals in Chickasaw Nation, free or reduced-price (FRP) school meals in Virginia, enrolled in schools in Navajo Nation, and those receiving SNAP benefits in Kentucky or Nevada. The EDECH population is similar to the study population in the evaluation of the Summer Electronic Benefits Transfer for Children (SEBTC; OMB Control Number 0584-0559, Discontinued March 31, 2014) conducted by FNS, USDA, in 2011-2013. SEBTC targeted households with children eligible for FRP school meals in 16 sites in 10 states. Among SEBTC households, about half (48 percent) were headed by one female adult, 70 percent were below poverty, 72 percent had at least one employed adult, and 62 percent were receiving SNAP benefits prior to the SEBTC intervention (Collins et al. 2013). Low-income parents’ work and child care schedules make it difficult to reach parents and guardians by telephone; incentives are essential to obtaining the response rates needed to support the EDECH impact analysis. In addition, the grantees for the EDECH study have stated that their populations may have nonworking telephones, or multiple cell phones that they alternate using, making them difficult to reach.
The survey incentives proposed for the EDECH study are based on the characteristics of the study population and on experience conducting telephone surveys with similar low-income populations in recent food security studies:
The USDA-sponsored study to assess the effect of the Supplemental Nutrition Assistance Program on Food Security (OMB Control Number 0584-0563, Discontinued September 19, 2011) offered a modest $2 pre-pay incentive and a $20 post-pay upon completion of the telephone interview and had a response rate of 56 percent at baseline and 67 percent at a six-month follow-up, indicating that a total incentive of $22 is not sufficient to achieve the target response rate of 80 percent in EDECH.
Site-specific baseline survey response rates in the USDA-sponsored 2012 SEBTC study (OMB Control Number 0584-0559, Discontinued March 31, 2014) ranged from 39 percent to 79 percent across 14 sites using a $25 incentive. The average unweighted response rate was 67 percent; the rate was 53 percent in passive consent sites and 75 percent in active consent sites (Briefel et al. 2013), indicating that $25 is not sufficient to achieve the target response rate for each demonstration project in EDECH. The lowest SEBTC baseline survey response rate was observed in an Indian Tribal Organization (ITO) using passive consent. EDECH has a mix of active and passive consent sites, and two of the five grantees are ITOs, which may pose additional challenges in reaching study populations living in rural Tribal areas.
Both of these recent food security studies, conducted in populations similar to EDECH’s, indicate that a $25 incentive is insufficient to reach the target response rate of 80 percent. In addition to the evidence cited above, $20 may be too little to offer to members of the EDECH control groups as an enticement to participate in the baseline and follow-up surveys, compared to the benefits those in the treatment groups will receive. For example, treatment households will receive an additional $40 to $60 in SNAP benefits (or equivalent) per eligible child per month in Nevada and Virginia, respectively. [Virginia SNAP benefits will be provided in the summer months only.]
To achieve the higher response rates desired to obtain reliable impact estimates, and given a compressed baseline data collection period (prior to projects beginning their interventions), the contractor plans to offer all households a $30 incentive. This approach will help to balance the use and cost of the incentives while expediting participation. The $30 incentive was set based on the evidence cited above that a $22-$25 incentive did not contribute to meeting the target 80 percent response rate in low-income households on SNAP or with children eligible for FRP school meals. Research has shown that incentives can minimize non-response bias to surveys without compromising the quality of the data (Singer and Kulka 2002; Singer et al. 1999; Singer and Ye 2013).
Mercer et al. (2015) conducted a meta-analysis of the dose-response association between incentives and response and found a positive relationship between higher incentives and response rates for household telephone surveys offering post-pay incentives. In an earlier meta-analysis, Singer et al. (1999) found that incentives in face-to-face and telephone surveys were effective at increasing response rates, with a one-dollar increase in incentive resulting in approximately a one-third of a percentage point increase in response rate, on average. Further, sufficient incentives can help obtain a high cooperation rate for both the baseline and follow-up surveys, so that less field interviewer effort will be needed at follow-up to locate sample members to complete the survey.
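As a rough illustration only, applying the Singer et al. (1999) average rather than an EDECH-specific estimate, moving from the $25 incentive used in SEBTC to the $30 proposed here would be expected to raise response by roughly:

($30 − $25) × one-third percentage point per dollar ≈ 1.7 percentage points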
The above discussion summarizes evidence for the effectiveness of incentives for reducing non-response bias, the response rates associated with offering lower incentive amounts to highly similar target populations, and the identification of several OMB-approved experiments that demonstrated or are testing the effectiveness of tiered incentives, all of which justify the proposed incentive amount. In addition, the justifications for offering incentives and the amounts to be offered address key Office of Management and Budget (OMB) considerations identified in its “Guidance on Agency and Statistical Information Collections” memorandum (Graham 2006):
Improved data quality. Three of the five sites use a cluster design and face constraints in meeting the precision requirements discussed in Part B, which are premised on an 80 percent response rate. To meet that target, intensive follow-up and an incentive are essential to obtaining the sample sizes needed for the impact analysis. If incentives are lower, it may not be feasible to meet the precision requirements, so Part B also includes information corresponding to a 60 percent response rate.
Incentives can also increase sample representativeness. Because incentives may be more salient to some sample members than to others, respondents who would otherwise not consider participating in the surveys may do so because of the incentive offer (Groves et al. 2000).
Using incentives to encourage survey participation will also improve data quality for the in-depth interviews. Interviewees will be recruited based on food security status and an expression of interest at the end of the 12-month follow-up survey; higher survey participation therefore ensures a larger pool of potential interviewees from which to select and recruit.
Improved coverage of specialized respondents or minority populations. The target populations are socially disadvantaged groups, namely low-income, rural, and Tribal households, all of which are considered hard-to-reach (Bonevski et al. 2014). In addition, households in the demonstration areas are specialized respondents because they are limited in number and difficult to recruit, and their lack of participation jeopardizes the impact study. Incentives may encourage greater participation among these groups.
Reduced respondent burden. As described above, the incentive amounts planned for EDECH are justified because they are commensurate with the costs of participation, which can include cellular telephone usage or travel to a location with telephone service, a known issue in the largely rural demonstration areas included in the study.
Complex study design. The household surveys collected for the impact study are longitudinal. Participants will be asked to complete up to three surveys over a period of up to 18 months. Incentives in amounts similar to those planned for this evaluation have been shown to increase response rates, decrease refusals and noncontacts, and increase data quality compared to a no-incentive control group in a longitudinal study (Singer and Ye 2013).
Past experience. The studies described above demonstrate the effectiveness of incentives for surveys of similar length fielded to similar low-income study populations.
Equity. The incentive amounts will be offered equally to all potential survey participants. The incentives will not be targeted to specific subgroups or to participants in only some of the demonstration areas, nor will they be used to convert refusals. Moreover, if incentives were offered only to the most disadvantaged households in the Tribal demonstration areas, the differing motivations to participate across projects would limit the planned descriptive comparisons across sites.
In summary, the planned incentives for the longitudinal household surveys are designed to promote response and high data quality and to reduce respondent burden associated with the surveys, which are similar in length to, and will be conducted with populations similar to, those in other OMB-approved information collections. While the proposed incentives may not achieve a response rate of 80 percent or higher, they offer an effective method for maximizing the obtainable sample sizes and the quality of the study’s findings.
The planned $50 incentives for the in-depth interviews and focus groups are also consistent with many of the key OMB considerations described above as well as other OMB-approved information collections. For example, $50 incentives are currently being offered to community members, including parents, participating in one-hour telephone interviews for the Evaluation of the Pilot Project for Canned, Frozen, or Dried Fruits and Vegetables in the Fresh Fruit and Vegetable Program for USDA/FNS (OMB Control Number 0584-0598, Expiration Date September 30, 2017). The study to assess the effect of Supplemental Nutrition Assistance Program on Food Security (OMB Control Number 0584-0563, Discontinued September 19, 2011) offered a $30 incentive to a target population similar to those to be studied in EDECH for completing a 90-minute in-depth interview.
Respondent burden. Most of the EDECH demonstration areas are rural. As a result, focus group participants will need to travel long distances to focus group facilities. Participants are also likely to incur child care costs for the time spent in the 90-minute discussions and traveling. In-depth interview participants will also likely incur child care costs while participating in a 90-minute interview. Therefore, the planned incentive amount is commensurate with the costs of participation for respondents.
Reduced study costs. The study to assess the effect of Supplemental Nutrition Assistance Program on Food Security experienced many missed interview appointments. Offering lower or no incentives for EDECH in-depth interviews or focus groups may increase the costs associated with recruiting the needed number of participants. It may also result in increased travel costs for contractor staff if the number of missed interview appointments is substantial, a problem compounded by the rural settings of the projects.
Past experience. The difficulties completing in-depth interviews for the study to assess the effect of the Supplemental Nutrition Assistance Program on Food Security provide evidence from past experience that a higher incentive amount for a similar population ($50 versus $30) is justified.
Equity. Similar to the justification provided above for the household surveys, offering larger incentives for the in-depth interview and focus group participants in only a subset of the demonstration areas would be inequitable and would limit the planned cross-site comparisons if the motivations to participate were inconsistent across projects.
A.10. Assurance of confidentiality

Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
All respondents’ information will be kept private and not disclosed to anyone but the analysts conducting this research, except as required by law. Demonstration project and partner staff and individual parents/guardians participating in this study will be assured that the information they provide will not be released in a form that identifies them. No identifying information will be attached to any reports or data supplied to USDA or any other researchers. The identities of the project directors from the States and Tribal agencies are known, because their information was included on applications to participate in the demonstration.
FNS published a system of records notice (SORN) titled FNS-8 USDA/FNS Studies and Reports in the Federal Register on April 25, 1991 (volume 56, pages 19078–19080), which discusses the terms of protections that will be provided to respondents.
All contractor staff are required to sign a confidentiality agreement (Attachment G). In this agreement, staff pledge to maintain the privacy of all information collected from the respondents and not to disclose it to anyone other than authorized representatives of the study. Issues of privacy will be discussed during training sessions with staff working on the project. Grantees will transfer records such as administrative data to the contractor using a secure file transfer protocol site, in case the files contain personally identifiable information.
A.11. Questions of a sensitive nature

Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
The household survey includes questions about income, race/ethnicity, whether anyone in the household has a physical, mental, or emotional limitation, and participation in nutrition and other assistance programs, which some survey respondents might consider sensitive. However, these questions are essential to measure some of the key outcomes of the demonstration projects and are needed as covariates or to form subgroups in the analyses. For example, questions about SNAP and WIC participation are critical because some projects aim to increase participation as a mechanism for reducing child food insecurity and food insecurity among households with children. Furthermore, these data are needed to control for participation in such programs when estimating the impact of a demonstration project. These questions have been used in other studies approved by OMB, including the evaluation of the Summer Electronic Benefits Transfer for Children (SEBTC) (OMB Control Number 0584-0559, Discontinued March 31, 2014) and the study to assess the effect of Supplemental Nutrition Assistance Program on Food Security (OMB Control Number 0584-0563, Discontinued September 19, 2011).
The contractor will obtain active or passive consent in all sites, as discussed in Section A.2, and will inform potential study participants that they may decline to answer any question.
A.12. Estimates of respondent burden

Provide estimates of the hour burden of the collection of information.
Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.
The public affected by this study includes State, local, and Tribal governments; private partner agencies; and individuals, including the parents and guardians of children in households participating in the demonstration projects. Attachment H.1 shows sample sizes, estimated burden, and estimated annualized cost of respondent burden for each part of the data collection and for all data collection combined. Estimated response times are based on similar instruments completed by the same types of respondents in FNS’s SEBTC evaluation. The annualized cost of respondent burden is the product of each type of respondent’s annual burden hours and average hourly wage rate. The total estimated burden across all data collection components is 31,625 hours. The total cost of respondent burden is $258,152.
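A minimal sketch of that computation follows; the respondent categories, hours, and wage rates shown are placeholders rather than the actual entries, which appear in Attachment H.1.

```python
# Hypothetical illustration of the burden-cost computation: annualized cost of
# respondent burden = annual burden hours x average hourly wage, summed over
# respondent types. Figures are placeholders, not the Attachment H.1 entries.
burden_rows = [
    # (respondent type, annual burden hours, average hourly wage in dollars)
    ("State/local/Tribal agency staff", 1_000, 30.00),
    ("Partner organization staff", 600, 25.00),
    ("Parents/guardians", 30_000, 7.25),
]

total_hours = sum(hours for _, hours, _ in burden_rows)
total_cost = sum(hours * wage for _, hours, wage in burden_rows)
print(f"Total burden: {total_hours:,} hours; annualized cost: ${total_cost:,.2f}")
```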
A.13. Estimates of other annual costs to respondents

Provide estimates of the total annual cost burden to respondents or record keepers resulting from the collection of information (do not include the cost of any hour burden shown in Items 12 and 14). The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.
No capital and start-up or ongoing operational and maintenance costs are associated with this information collection.
A.14. Estimates of annualized government costs

Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost and any other expense that would not have been incurred without this collection of information.
The total cost to the Federal government is $9,999,953 over a 50-month period, or $2,399,989 on an annualized basis. The largest cost to the Federal government is to pay a contractor $9,998,212 to conduct the study and deliver data files. This is based on an estimate of 106,592 hours, with a salary range of $36.86 to $396.72 per hour. This information collection also assumes a total of 40 hours of Federal employee time per year for a GS13 step 1 social science research analyst serving as the FNS project officer at $43.52 per hour, for a total of $1,740.80. Federal employee pay rates are based on the General Schedule of the Office of Personnel Management for 2015.
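For reference, the annualized figure prorates the 50-month total to a 12-month basis:

$9,999,953 × (12 ÷ 50) ≈ $2,399,989 per year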
A.15. Changes in hour burden

Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-I.
This is a new information collection that will add 31,625 burden hours and 182,381 annual responses to the OMB inventory as a result of program changes due to a new statute, Section 23 [42 U.S.C. 179d] of the HHFKA.
A.16. Time schedule, publication, and analysis plans

For collections of information whose results are planned to be published, outline plans for tabulation and publication.
1. Study schedule

Exhibit A.16.a shows the planned schedule for EDECH.
Exhibit A.16.a. Project schedule
Activity | Schedule
Identify, recruit, and consent sample | Begins within a week of OMB clearance
Conduct data collection | 9/1/15–9/1/17
Prepare interim evaluation report 1 | 12/1/15–4/30/16
Prepare interim evaluation report 2 | 7/1/16–11/30/16
Prepare interim evaluation report 3 | 7/1/17–11/30/17
Analyze data and prepare final evaluation reports | 4/30/17–11/30/18
Prepare data files and documentation | 6/30/18–11/15/18
Prepare journal manuscript | 8/31/18–11/30/18
Briefing on study results | 6/30/18–7/31/18
2. Publication of study results

The contractor will produce five final report volumes, one for each demonstration project, describing project implementation, costs, and impacts. Each project-specific final report will include appendices that describe the study methodology and provide technical details about the study design, sampling, weighting, response rates, data processing, and analysis. Because the demonstration projects operate for different lengths of time, the contractor will also prepare three interim reports, a mixture of project-specific and topical cross-project reports. An integrated final report will examine impacts, implementation, and cost-effectiveness across all demonstration projects. For each set of research questions, the contractor will synthesize findings across the demonstrations, using theme tables, side-by-side comparisons of costs and impacts, and special analyses that compare findings across a subset of the awardees. The final reports will be posted on the USDA FNS website (http://www.fns.usda.gov/ops/research-and-analysis).
3. Plans for analysis

Implementation study analyses. The implementation study data will consist of (1) qualitative data from in-person interviews with project staff, focus groups with participants, and in-person interviews with participants; and (2) administrative data from the demonstration projects. Analysis of the implementation study data will focus on understanding and documenting the specific components of each demonstration project, planning and implementation processes, staff and participant perceptions, and implications for the impact findings. Findings from the implementation analyses will be reported in project-specific profiles and cross-site descriptions. The contractor will triangulate data across multiple respondents and sources to strengthen reliability and will use the following tools to organize the data for analysis and reporting:
Qualitative coding of in-depth participant interviews (via Atlas.ti) will ensure analyses are systematic and comprehensive. Coding qualitative data by question, construct, and respondents’ characteristics (among other possible dimensions) and arraying results into tabular form facilitates identification of themes across various dimensions.
Thematic tables of staff interviews and focus groups, which display answers to key questions by respondent type, location, or another dimension, distill large amounts of information into tabular form to help identify key themes.
Narrative site-specific descriptions, which draw on data from all interviews and focus groups for a particular site (including findings from coded data and thematic tables), highlight themes, divergent perspectives, and examples. A structured template will guide the development of these descriptions. Findings from qualitative sources, especially participant interviews, will be supported with survey findings.
Site time lines and other graphical displays can provide clear documentation of site-specific activities and help to illustrate connections between implementation and impacts.
Descriptive statistics on participants’ characteristics and service use will be tabulated from administrative project data and management information system data to the extent such information is available for a given site.
The first goal of the cost study (Objective 6) is to assess the full costs of each intervention in a way that would be useful in replicating it in the future. In addition, to address Objective 7, the costs per household have to be measured in a way that is sufficiently comparable across the demonstrations to facilitate assessment of relative cost-effectiveness. To estimate costs, the contractor’s team will use the cost data worksheets to obtain information on quantities (for example, hours worked or meals served) and prices (for example, salaries or food costs) of the resources used to implement the interventions. The contractor will then compute total costs, as well as (1) costs per project component or activity; (2) costs attributable to various funding sources; (3) costs per household and per child, typically reported per month or other period; and (4) costs per stage of the demonstration. The contractor will also estimate costs of in-kind, donated, or volunteer resources, valuing them at market prices. Finally, the contractor will assess the sensitivity of the estimates to key assumptions and variations in determinants of costs. For example, one test could involve using national average salary estimates instead of those reported by each awardee.
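As a concrete illustration of this quantity-times-price approach, the following sketch computes total cost and cost per household per month from a handful of hypothetical worksheet entries. The resource categories, quantities, prices, and caseload figures are assumptions for illustration only, not values from the cost worksheets.

```python
# Minimal sketch of the cost study's quantity-times-price approach.
# Resource categories, quantities, and prices are hypothetical; actual
# data would come from the cost worksheets (Attachments B.2.a-B.2.d).
resources = [
    # (resource, quantity, unit price)
    ("staff hours", 10000, 25.00),     # hours worked x hourly salary
    ("meals served", 50000, 3.10),     # meals x food cost per meal
    ("volunteer hours", 2000, 12.00),  # in-kind resource, valued at a market wage
]

total_cost = sum(quantity * price for _, quantity, price in resources)

households = 1500  # hypothetical demonstration caseload
months = 12        # hypothetical period covered by the worksheets

cost_per_household_month = total_cost / (households * months)
print(f"Total cost: ${total_cost:,.2f}")
print(f"Cost per household per month: ${cost_per_household_month:,.2f}")
```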
Create study database and analysis files. The contractor will prepare restricted and public use SAS data files with variable names keyed to the question numbers of each instrument. The contractor will clean the data files by checking for consistency, missing values, outliers, and other problem values. Next, it will create and add constructed variables and sampling weights. Complete documentation—including the file structure, codebook, variable definitions and formulation, descriptions of editing and imputation procedures, and SAS code—will accompany each set of data files, to facilitate full replication of all analyses presented in the evaluation report.
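The cleaning steps described above can be illustrated with a brief sketch. Although the delivered analysis files will be SAS data sets, the example below uses Python's pandas library for illustration, and the variable names and plausible-range limits are hypothetical.

```python
import pandas as pd

# Sketch of the consistency, missing-value, and outlier checks described
# above. The variable names (Q1_income, Q2_hh_size) and plausible-range
# limits are hypothetical; the delivered files will be SAS data sets.
df = pd.DataFrame({
    "Q1_income": [1200.0, 2500.0, None, 250000.0],  # monthly income
    "Q2_hh_size": [3, 0, 4, 5],                     # household size
})

# Count missing values for each survey item.
missing_counts = df.isna().sum()

# Flag records outside assumed plausible ranges (missing values are
# flagged here as well, because .between() is False for NaN).
out_of_range = pd.DataFrame({
    "income": ~df["Q1_income"].between(0, 150000),
    "hh_size": ~df["Q2_hh_size"].between(1, 20),
})

print(missing_counts)
print(out_of_range)
```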
Impact study analyses. The primary outcome in these analyses will be food insecurity among children, as measured by USDA’s 30-day survey module. Key secondary outcomes will include other food security measures among households with children, including 30-day measures of: food insecurity among adults; food insecurity among the household (adults and/or children); and very low food security among children, adults, and the household. In addition, the study will examine other secondary outcomes, including food expenditures, shopping patterns, time spent on food shopping and meal preparation, dietary quality (in one project, Chickasaw Nation), and participation in other nutrition assistance and support programs.
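For illustration, scoring a food security scale amounts to counting affirmative responses and applying raw-score cutoffs. The sketch below assumes the conventional eight-item children's scale with cutoffs of two or more affirmatives (food insecurity among children) and five or more (very low food security among children); the actual classification will follow USDA's official scoring guidance for the 30-day module.

```python
# Sketch of classifying child food security status from the survey module.
# Assumes the 8-item children's scale with raw-score cutoffs of 2+
# (food insecurity among children) and 5+ (very low food security among
# children); verify against USDA's official 30-day scoring guidance.
def classify_child_food_security(responses):
    """responses: list of 8 booleans, True = affirmative answer."""
    raw_score = sum(responses)
    if raw_score >= 5:
        return "very low food security among children"
    if raw_score >= 2:
        return "food insecurity among children"
    return "food security among children"

print(classify_child_food_security(
    [True, True, False, False, False, False, False, False]))
```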
Before estimating the statistical models, the contractor will complete several descriptive analyses to provide an overview of the characteristics of the sample, assess the validity of the study’s evaluation designs, and describe trends over time in key study outcomes. This work will begin with the contractor preparing a set of descriptive tables showing the characteristics of the sample at each project site and a description of key outcomes in the participating households at baseline.
The contractor will compare the characteristics of each project’s participants with characteristics of the control or comparison households at each site to assess the validity of each demonstration’s evaluation design. For random assignment designs, the contractor will test whether randomization successfully produces treatment and control groups that are equivalent at baseline in terms of socioeconomic characteristics and baseline values of primary and secondary outcomes. To test if this is the case, the contractor will compare treatment and control means for each baseline variable. Likewise, for quasi-experimental designs the contractor will examine whether the project’s treatment sites are similar to comparison sites using an analogous set of statistical tests for baseline variables measured at both the household and site levels (such as geographic areas or SNAP offices).
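A minimal sketch of such a baseline-equivalence check appears below, using simulated data and a two-sample t-test as one plausible choice of statistical test; the study's actual test procedures are those described in Section B.

```python
import numpy as np
from scipy import stats

# Sketch of a baseline-equivalence check: compare treatment and control
# means for one baseline variable with a two-sample t-test. Data are
# simulated; the study would repeat this for each baseline characteristic
# and each baseline value of the primary and secondary outcomes.
rng = np.random.default_rng(0)
treatment = rng.normal(loc=2.1, scale=1.0, size=500)  # e.g., baseline raw score
control = rng.normal(loc=2.0, scale=1.0, size=500)

t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value is consistent with successful randomization on this variable.
```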
In addition to the formal models that generate estimates of project impacts (described later), it will be useful to present descriptive evidence of trends in outcomes among study participants, measured as changes from the baseline to the follow-up periods. For example, examining trends in food security levels among control or comparison group members would indicate whether outcomes changed in the study population over time even without the project. In addition to food insecurity and project participation, the contractor will describe trends in other outcomes, including food expenditures, shopping patterns, time spent on food shopping and meal preparation, and participation in other nutrition assistance and support programs.
The impact estimation approach will compare treatment group outcomes with a counterfactual estimate of what those outcomes would have been in the absence of the project. The method used to estimate this counterfactual will depend on the specific evaluation design developed for each project. See Section B for a description of the specific econometric models to be estimated.
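Under a random assignment design, the simplest version of this comparison is a regression-adjusted difference in means, in which the coefficient on a treatment indicator estimates the impact relative to the control-group counterfactual. The sketch below uses simulated data and is an illustration only; the actual specifications are those described in Section B.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of a regression-adjusted impact estimate under random assignment:
# the coefficient on the treatment indicator estimates the impact relative
# to the control-group counterfactual. Data are simulated.
rng = np.random.default_rng(1)
n = 1000
treated = rng.integers(0, 2, size=n)   # 1 = treatment group
baseline = rng.normal(size=n)          # a baseline covariate
outcome = 0.5 - 0.1 * treated + 0.3 * baseline + rng.normal(size=n)

X = sm.add_constant(np.column_stack([treated, baseline]))
model = sm.OLS(outcome, X).fit()
print(f"Estimated impact: {model.params[1]:.3f}")
```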
For each of the five demonstration projects, the contractor will also estimate impacts among subgroups of participants who might respond differently to interventions, such as an expansion in SNAP benefits or encouragement to enroll in nutrition assistance programs. In particular, the contractor plans to estimate the impact of each project on subgroups defined by the following types of characteristics:
Household structure (for example, presence of three or more children in the household, presence of more than one adult in the household)
Baseline food security status (for example, prevalence of food insecurity among children before the project)
Race/ethnicity of the household head
Education level of the household head (for example, less than high school, high school degree, or any postsecondary education)
Income (for example, less than or greater than 100 percent of the Federal poverty level)
The contractor will also examine whether impacts differ based on respondents’ participation in other nutrition assistance programs. For example, if an intervention was designed to encourage greater levels of participation in SNAP, the contractor would estimate the impacts of the project for subgroups defined by respondents’ participation status in other nutrition assistance programs.
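One standard way to carry out such subgroup analyses is to add a treatment-by-subgroup interaction to the impact model; the interaction coefficient then tests whether impacts differ across subgroups. The sketch below uses simulated data and is illustrative only.

```python
import numpy as np
import statsmodels.api as sm

# Sketch of a subgroup impact analysis using a treatment-by-subgroup
# interaction. Data are simulated; 'subgroup' could mark, for example,
# households with three or more children.
rng = np.random.default_rng(2)
n = 1000
treated = rng.integers(0, 2, size=n)
subgroup = rng.integers(0, 2, size=n)
outcome = 0.5 - 0.05 * treated - 0.1 * treated * subgroup + rng.normal(size=n)

X = sm.add_constant(np.column_stack([treated, subgroup, treated * subgroup]))
fit = sm.OLS(outcome, X).fit()
# Impact in the subgroup = treatment coefficient + interaction coefficient;
# the interaction coefficient tests whether impacts differ across subgroups.
print(f"Impact in subgroup: {fit.params[1] + fit.params[3]:.3f}")
print(f"Difference in impacts: {fit.params[3]:.3f}")
```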
The analysis will describe the characteristics of the sites implementing each project and examine differences across sites between the projects’ impact estimates. Specifically, the contractor will conduct significance tests to examine whether the impact estimates for each project are statistically distinguishable. Because the number of planned projects is small (i.e., five), the comparative analysis will be descriptive in nature and will be able to generate hypotheses only about which project designs or demonstration-site characteristics might be related to impacts on food insecurity among children.
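A simple version of such a test, sketched below with hypothetical impact estimates and standard errors, is a two-sample z-test on the difference between two independently estimated impacts.

```python
import math
from scipy import stats

# Sketch of testing whether two projects' impact estimates are
# statistically distinguishable. The estimates and standard errors
# below are hypothetical.
impact_a, se_a = -0.08, 0.03   # project A: impact estimate, standard error
impact_b, se_b = -0.02, 0.03   # project B

z = (impact_a - impact_b) / math.sqrt(se_a**2 + se_b**2)
p_value = 2 * (1 - stats.norm.cdf(abs(z)))
print(f"z = {z:.2f}, p = {p_value:.3f}")
```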
A.17. Display of expiration date for OMB approval

If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
The agency plans to display the expiration date for OMB approval of the information collection on all instruments.
A.18. Exceptions to certification statement

Explain each exception to the certification statement identified in Item 19, "Certification for Paperwork Reduction Act."
This study does not require any exceptions to the Certification for Paperwork Reduction Act (5 CFR 1320.9).
References

Bonevski, B., M. Randell, C. Paul, K. Chapman, L. Twyman, J. Bryant, I. Brozek, and C. Hughes. “Reaching the Hard-to-Reach: A Systematic Review of Strategies for Improving Health and Medical Research with Socially Disadvantaged Groups.” BMC Medical Research Methodology, 2014, vol. 14, no. 42.
Briefel, R., A. Collins, G. Rowe, A. Wolf, J.A. Klerman, C.W. Logan, C.S. Wulsin, A. Enver, C. Owens, J. Jacobson, and S. Bell. “Summer Electronic Benefits Transfer for Children (SEBTC) Demonstration: 2012 Congressional Status Report.” Submitted to the U.S. Department of Agriculture, Food and Nutrition Service. Cambridge, MA: Abt Associates, 2012.
Collins, A.M., R. Briefel, J.A. Klerman, G. Rowe, A. Wolf, C.W. Logan, A. Gordon, C. Wolfson, A. Enver, C. Owens, C. Cabili, and S. Bell. “Summer Electronic Benefits Transfer for Children (SEBTC) Demonstration: Evaluation Findings for the Full Implementation Year.” Final report submitted to the U.S. Department of Agriculture, Food and Nutrition Service. Cambridge, MA: Abt Associates, July 2013.
Graham, J.D. “Guidance on Agency Survey and Statistical Information Collections.” Washington, DC: Office of Management and Budget, January 20, 2006. Available at: http://www.whitehouse.gov/sites/default/files/omb/assets/omb/inforeg/pmc_survey_guidance_2006.pdf. Accessed January 2, 2015.
Groves, R.M., E. Singer, and A. Corning. “Leverage-Saliency Theory of Survey Participation: Description and an Illustration.” Public Opinion Quarterly, 2000, vol. 64, pp. 299–308.
Kovac, M.D. and J. Markesich. “Tiered Incentive Payments: Getting the Most Bang for Your Buck.” Presented at the Annual Conference of the American Association of Public Opinion Research, St. Pete Beach, FL, 2002.
Mercer, A., A. Caporaso, D. Cantor, and R. Townsend. “How Much Gets You How Much? Monetary Incentives and Response Rates in Household Surveys.” Public Opinion Quarterly, 2015, vol. 79, no. 1, pp. 105–129.
Singer, E., R.M. Groves, and A.D. Corning. “Differential Incentives: Beliefs About Practices, Perceptions of Equity, and Effects on Survey Participation.” Public Opinion Quarterly, 1999, vol. 63, pp. 251–260.
Singer, E., and R.A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues. Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press, 2002, pp. 105–128.
Singer, E., and C. Ye. “The Use and Effects of Incentives in Surveys.” The Annals of the American Academy of Political and Social Science, 2013, vol. 645, pp. 112–141.
1 In addition, increases in program benefits (such as SNAP benefits) will be costs in most of the demonstrations, but these are considered as part of the impact study.