Supporting Statement for Paperwork Reduction Act Submission
Evaluation of SNAP-Nutrition Education Practices, Wave II
OMB Control No. 0584-0554
Prepared for:
Sara Olson
U.S. Department of Agriculture
Food and Nutrition Service
3101 Park Center Drive
Alexandria, VA 22302
(703) 605-4013
Prepared by:
Altarum Institute and RTI International
Part A Justification
A.1 Explanation of Circumstances That Make Collection of Data Necessary
A.2 Purpose and Use of Information
A.3 Use of Information Technology for Burden Reduction
A.4 Efforts to Identify Duplication and Use of Similar Information
A.5 Impacts on Small Businesses or Other Small Entities
A.6 Consequences of Collecting Information Less Frequently
A.7 Special Circumstances Relating to the Guideline of 5 CFR 1320.5
A.8 Comments in Response to Federal Register Notice and Efforts to Consult Outside Agency
A.9 Explanation of Any Payment or Gift to Respondents
A.10 Assurance of Confidentiality Provided to Respondents
A.11 Justification for Sensitive Questions
A.12 Estimates of Hour Burden Including Annualized Hourly Costs
A.13 Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
A.14 Annualized Cost to the Federal Government
A.15 Explanation for Program Changes or Adjustments
A.16 Plans for Tabulation and Publication and Project Time Schedule
A.17 Reason(s) That Display of OMB Expiration Date Is Inappropriate
A.18 Exceptions to Certification for Paperwork Reduction Act Submissions
Part B Statistical Methods
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.3 Methods to Maximize Response Rates and Deal with Nonresponse
B.4 Tests of Procedures or Methods to Be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
List of Appendices
A Data Collection Instruments for Impact Evaluation
B Data Collection Instruments for Process Evaluation
C Assessment of Demonstration Project-Led Impact Evaluation Review Form
D Comments on Federal Register Notice and Responses to Comments
E Statistical Methods for the Impact Evaluation
F Data Collection Methods for Impact Evaluation
G Common Assumptions for Statistical Models of Parental Reports of Children’s Fruit and Vegetable Consumption in a Clustered, Experimental, or Quasi-experimental Design
H Cover Letters, Recruitment Letters, and Other Study Materials for Impact Evaluation
I Cover Letters, Recruitment Letters, and Other Study Materials for Process Evaluation
A.1–Explain the circumstances that make the collection of information necessary. Identify any legal or administrative requirements that necessitate the collection. Attach a copy of the appropriate section of each statute and regulation mandating or authorizing the collection of information.
BACKGROUND
This is a revision of a currently approved collection. The Food and Nutrition Service (FNS) of the U.S. Department of Agriculture (USDA) promotes optimal health and well-being of low-income individuals through improved nutrition and well-designed nutrition education efforts within the Supplemental Nutrition Assistance Program (SNAP). Under Section 17 of the Food and Nutrition Act of 2008 (7 U.S.C. 2026), the Secretary may undertake research that will help improve the administration and effectiveness of SNAP.
The nutrition assistance programs are a critical component to attaining FNS’s goals. FNS defines SNAP Education (SNAP-Ed, formerly called Food Stamp Nutrition Education) activities as those designed to increase the likelihood of healthy food choices by SNAP recipients and those who are eligible for but currently not participating in the program. As the largest of the Federal nutrition assistance programs, SNAP has a significant stake in ensuring that nutrition education works to meet these goals.
To identify the extent to which SNAP-Ed interventions can be linked to increasing healthy eating behaviors, FNS has funded a total of seven model SNAP-Ed projects that will have the resources necessary to both implement the intervention and provide the support to measure the impact of the intervention on client behavior. Through the Models of SNAP-Ed and Evaluation, Wave I (OMB Control No. 0584-0554; expiration date 1/31/2013), FNS funded and is currently evaluating four model projects. This proposed study, titled "Evaluation of SNAP Education Practices, Wave II," builds upon Wave I: it entails the evaluation of three additional model projects and represents an expansion of the ongoing FNS Wave I study. By evaluating SNAP-Ed projects that address SNAP Guiding Principles and support a rigorous evaluation model, FNS can gain an understanding of the effectiveness of the seven SNAP-Ed projects, provide nutrition educators with examples of evaluation methodologies that are both feasible and scientifically robust, and identify effective behavioral change interventions. The following are brief summaries of each of the three model demonstration projects:
University of Kentucky Cooperative Extension Service (UKCES) Literacy, Eating, and Activity for Primary School-Age Children. UKCES will conduct an intervention in schools using a curriculum that aims to increase fruit and vegetable consumption. The target audience for this intervention will be children in the first through third grades in eight public elementary schools in Perry and Laurel Counties. The intervention includes eight lessons that are centered on popular children’s storybooks.
Iowa Nutrition Network (INN) BASICS Program. INN will conduct an intervention in schools by using a multichannel approach. The target audience is third-graders in 22 public elementary schools in three school districts. The intervention includes a series of 12 lessons and aims to encourage children to choose fruits and vegetables for snacks and low-fat or fat-free dairy products at meals and snacks.
Michigan State University Extension (MSUE) Eat Smart, Live Strong. MSUE will implement Eat Smart, Live Strong, a 4-week educational curriculum developed by FNS, in 14 senior centers across the State, drawn from a variety of ethnically diverse rural and urban communities. The goal of this intervention is to encourage older adults to increase their fruit and vegetable consumption and participate in at least 30 minutes of moderate-intensity physical activity each day.
PURPOSE AND NEED
A key focus of FNS has been its efforts to ensure that nutrition education interventions are science based and have the intended effects on clients’ behavior. With independent evaluations of these three additional demonstration projects, FNS will be better equipped to determine whether SNAP-Ed nutrition education interventions can positively affect the nutrition and health behaviors of SNAP participants and those who are eligible but choose not to participate in SNAP. A 2006 systems review (Food Stamp Nutrition Education Systems Review; OMB Control No. 0584-0528; expiration date 3/31/2008) identified a number of issues related to the implementation of SNAP-Ed and the extent to which implementing agencies (IAs) and their local projects focus on client behaviors and evaluate project outcomes. In most cases, IAs reported that they lacked the expertise and funds needed to initiate and complete rigorous impact evaluations of their SNAP-Ed practices. As a result, it is difficult for SNAP-Ed implementers or FNS to determine which SNAP-Ed approaches are most successful in facilitating behavior change or, for that matter, whether SNAP-Ed efforts are effective at all. As in Wave I, each of the three demonstration projects included in Wave II will have its own independent evaluation component. By comparing each of the three demonstration project evaluations with the more rigorous evaluation to be conducted by FNS contractors, FNS will be able to provide logistically practical examples of project-level SNAP-Ed evaluation efforts that are methodologically robust. The widespread use of these methodologically robust evaluations by nutrition education implementers will not only help educators refine their evaluations to maximize the interventions’ intended effects but also provide FNS with a more general measure of the effectiveness of SNAP-Ed.
In addition, FNS and States that might model these SNAP-Ed interventions in the future need to understand why they were or were not successful, which is why the process evaluation component of this study is critical. Finally, FNS also needs to know which demonstration project-led evaluations were effective. If FNS plans to recommend evaluation methods or to cite these models as examples for replication, these methods must be examined in terms of rigor, quality, and practicality.
A.2—Indicate how, by whom, how frequently, and for what purpose the information is to be used. Except for a new collection, indicate the actual use the agency has made of the information received from the current collection.
STUDY PURPOSE
As with Wave I, the purpose of the current study is to determine whether any of the three selected projects can serve as good examples of SNAP-Ed delivery that (1) positively affect the nutrition and health behaviors of SNAP participants while adhering to FNS Guiding Principles, (2) exhibit the potential to serve as models of effective nutrition education intervention for large segments of the SNAP audience while requiring levels of resources that are manageable by a large percentage of SNAP-Ed-implementing agencies, and (3) provide methodologically robust yet logistically practical examples of project-level SNAP-Ed evaluation efforts.
Specifically, this study encompasses an independent impact evaluation, a process evaluation, and an assessment of the demonstration project-led impact evaluations. The independent impact evaluations will employ a pre/post with control/comparison design in order to examine the extent to which clients initiated or sustained the desired behaviors. The process evaluation examines the implementation of the demonstration projects and analyzes lessons learned during implementation—from the perspective of program administrators, direct educators, and the target audiences—to inform program modifications and potential expansions or replication of these models. The assessment of demonstration project-led impact evaluations will measure the quality of the evaluation and include a description of the strengths and weaknesses of each evaluation design and its implementation, as well as an evaluation of the soundness of the outcome measures. Results from the independent and demonstration project-led data collection efforts will be analyzed separately and then compared to examine the extent to which the results are similar, explain discrepancies, and produce an integrated report of findings and recommendations.
OVERVIEW OF STUDY DESIGN
FNS’s contractors for this project—Altarum Institute and its subcontractor RTI International—worked with FNS and each demonstration project to develop a rigorous evaluation approach tailored to the specific intervention being evaluated. The evaluation approach for each demonstration project ensures that the FNS evaluation does not contaminate the demonstration project’s intervention and/or impact evaluation; establishes causality between the interventions and the dietary behavioral outcomes within the limitations imposed by delivering a public nutrition education program; and incorporates an “intention-to-treat” approach, meaning that individuals who drop out of the intervention will be contacted in order to collect information on outcome measures as well as reasons for dropping out of the program.
Determining the effectiveness of the interventions and their potential for replicability, the primary objective of the process evaluation, will require a clear understanding of each intervention’s planning and implementation. Existing documentation will be used to obtain objective information for the process evaluation; qualitative methods will be used to gather more in-depth information on program implementation, as well as the perspectives of key players engaged in or exposed to the intervention (e.g., program staff, educators, target audience).
SUMMARY OF DATA COLLECTION METHODS
Table A.1-1 summarizes the research design and data collection methods for the impact evaluation for each demonstration project. The data collection instruments for Wave II, which have been tailored for each specific demonstration project, are similar to those used in Wave I and can be found in Appendix A.
Table A.1-1. Summary of the Research Design and Data Collection Methods for the Impact Evaluation
Demonstration Project | Research Design | Data Collection Method
INN | Use a quasi-experimental research design with schools purposively assigned to intervention and comparison groups. | Survey parents/caregivers of third-grade students pre- and post-intervention using a mail/telephone survey approach.
UKCES | Use an experimental research design with schools in Laurel and Perry Counties randomly assigned to the intervention group (n = 8) or the control group (n = 8). | Survey parents/caregivers of first- through third-grade students pre- and post-intervention using a mail/telephone survey approach.
MSUE | Use an experimental research design with senior centers randomly assigned to the intervention group (n = 14) or the control group (n = 15). | Conduct in-person interviews with participating seniors at pre-intervention. Use a mail/telephone survey approach for the post-intervention survey.
For the process evaluation data collection effort, the methods used and respondent types will vary for each intervention project. Table A.1-2 provides a summary of the data collection methods. The data collection instruments can be found in Appendix B.
Table A.1-2 Summary of Data Collection Methods for the Process Evaluation
Method (Demonstration Projects) | Purpose
In-depth, open-ended discussions with SNAP-Ed program- and partner-level staff (INN, UKCES, and MSUE) | Capture the experiences and perspectives of, as well as lessons learned by, personnel on the administrator and, in most cases, provider sides of the program. For most respondent types, these interviews will take place at both pre- and post-implementation.
Structured observations of the nutrition education classes (INN and UKCES) | Collect information related to environmental influences, observe participant interest in nutrition education lessons, and describe how implementation is or is not consistent with plans.
In-depth, post-intervention open-ended discussions with classroom teachers (INN and UKCES) | Assess attitudes about the importance of the nutrition messages of the intervention, perspectives on what worked well and what could be improved in the administration and delivery of the intervention, and the degree to which teachers incorporate SNAP-Ed messages into other classroom activities.
Post-intervention mail questionnaires with classroom teachers (INN and UKCES) | Collect information similar to that gathered through the in-depth discussions from the subset of teachers who are not involved in those discussions.
Post-intervention structured group interviews with participants or participants’ parents (INN, UKCES, and MSUE) | Capture the perspectives and level of satisfaction of program participants (MSUE) or their parents/caregivers (INN and UKCES) with the nutrition education program.
Abstraction of textual information and extraction of numerical program data from secondary data sources (INN, UKCES, and MSUE) | Objectively document the planned and actual intervention. Obtain data needed to quantify the total attendance and the average and range of attendance per intervention site, the average and range of nutrition education received by participants, and total and per-participant costs.
Some of the methods described in Table A.1-2 also will be used to capture information related to the assessment of the demonstration project-led evaluations. For example, the primary data sources for this assessment are (1) pre- and post-intervention, in-depth, open-ended discussions with the demonstration projects’ evaluation managers and (2) a review and abstraction or extraction of the 2012 Annual SNAP-Ed reports submitted to FNS (or a similar report describing the results of the demonstration project’s evaluation). Using these data sources, the contractor will complete a Demonstration Project-Led Evaluation Rating Form that rates each demonstration project’s impact evaluation on criteria such as a viable comparison strategy, sample size and sampling strategy, outcome measures, data collection, and data analysis. This form is provided in Appendix C.
USE OF THE INFORMATION
The results of the impact evaluation and assessment of the demonstration project-led evaluation, coupled with the process evaluation findings, will be used to (1) determine which, if any, of the three demonstration projects can serve as good examples of SNAP-Ed delivery that meet the previously described FNS criteria; (2) identify lessons learned in terms of the design, planning, and implementation process and provide recommendations to FNS on how these interventions could be improved to potentially enhance outcomes; and (3) determine which, if any, of the demonstration project-led assessments provide methodologically robust yet logistically practical examples of project-level SNAP-Ed evaluation efforts. In more general terms, the evaluation of the demonstration projects’ nutrition education interventions will provide evidence of the potential effectiveness of the SNAP-Ed program, while the assessment of the demonstration projects’ impact evaluations will generate models of practical yet robust evaluation techniques that can be adopted by a significant percentage of SNAP-Ed implementing agencies. By promulgating these impact evaluation techniques to SNAP-Ed nutrition educators, FNS will be able to obtain even broader evidence of the success of SNAP-Ed.
A.3—Describe whether, and to what extent, the collection of information involves the use of automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, e.g., permitting electronic submission of responses, and the basis for the decision for adopting this means of collection. Also, describe any consideration of using information technology to reduce burden.
FNS makes every effort to comply with the E-Government Act of 2002. For the impact evaluation, it will be necessary to collect information on outcome measures of interest from program participants. The use of the Internet to collect this information was considered but rejected because many low-income individuals do not have ready access to the Internet. Instead, a combination of in-person survey administration in a group setting (MSUE only) and mail/telephone surveys will be used.
A.4—Describe efforts to identify duplication. Show specifically why any similar information already available cannot be used or modified for use for the purpose described in item 2 above.
Every effort has been and will be made to avoid duplication of data collection. For the process evaluation, these efforts include (1) a thorough review of many extant documents, including Fiscal Year 2012 SNAP-Ed plans and reports, nutrition curricula, expenditure reports, school menus, and class attendance logs; and (2) adding a limited number of process-related questions to the impact evaluation instruments to avoid multiple contacts with the same respondents. For the impact evaluation, the INN and UKCES demonstration projects are surveying children, whereas the FNS evaluation surveys parents and caregivers, so there is no duplication. For the MSUE study, both MSUE and FNS are surveying older adults. To minimize respondent burden, MSUE will collect information on participant demographics and share that information with FNS.
A.5—If the collection of information impacts small businesses or other small entities, describe any methods used to minimize burden.
FNS estimates that all senior centers (MSUE) and schools (INN and UKCES) involved in the FNS evaluation qualify as small businesses. In all instances, the necessary steps have been taken to ensure that the burden on any organization, especially any small business, is minimized. For the impact evaluations for the INN and UKCES projects, school administrators will be asked only to assist with distributing and collecting study enrollment materials and engaging students and their caregivers in the study. For the process evaluation, none of the senior centers or schools that are serving as controls in the evaluation will be contacted, and in-depth interviews will be conducted only with the school principals and a subsample of classroom teachers in Iowa and Kentucky schools that are receiving SNAP-Ed interventions; brief questionnaires will be administered to the remainder of classroom teachers in these schools. Only program managers from 6 of the 14 senior centers receiving the SNAP-Ed intervention in Michigan will be interviewed.
A.6—Describe the consequence to Federal program or policy activities if the collection is not conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing burden.
If this data collection were not conducted, or were conducted less frequently, FNS would not be able to determine or improve the administration or effectiveness of SNAP. Because the study must measure changes in behavior, it is necessary to conduct pre- and post-intervention surveys to collect data on key outcome measures before and after the intervention (Shadish, Cook, & Campbell, 2002).
Onsite data collection for the process evaluation will take place at three points in time: prior to, during, and just after implementation of the nutrition education. Pre-implementation interviews will focus on the planned intervention design and implementation; post-implementation interviews will focus on experiences, lessons learned, and deviations from the planned implementation. Eliminating either data collection effort would significantly affect the ability to draw accurate conclusions about factors that may have contributed to the success or failure of the intervention. The "during" data collection period will be devoted specifically to observing nutrition education in action. This observation will provide an opportunity to draw conclusions about the degree to which nutrition education was implemented as planned and to document certain environmental factors that might not otherwise be reported but that could significantly affect the impact of the nutrition education.
A.7—Explain any special circumstances that would cause an information collection to be conducted in a manner: requiring respondents to report information to the agency more often than quarterly; requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it; requiring respondents to submit more than an original and two copies of any document; requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records for more than three years; in connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study; requiring the use of a statistical data classification that has not been reviewed and approved by OMB; that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or requiring respondents to submit proprietary trade secret, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information’s confidentiality to the extent permitted by law.
There are no special circumstances. The collection of information is conducted in a manner consistent with the guidelines in 5 CFR 1320.5.
A.8—If applicable, provide a copy and identify the date and page number of publication in the Federal Register of the agency’s notice, soliciting comments on the information collection prior to submission to OMB. Summarize public comments received in response to that notice and describe actions taken by the agency in response to these comments. Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting form, and on the data elements to be recorded, disclosed, or reported.
FEDERAL REGISTER NOTICE
In accordance with the Paperwork Reduction Act of 1995, an announcement of FNS’s intent to seek OMB approval to collect information for the Evaluation of SNAP-Nutrition Education Practices provided an opportunity for public comment. This announcement was published in the Federal Register on September 7, 2010 (75 FR 54295) and specified a 60-day period for comment ending November 8, 2010. A copy of the comments received and FNS’s responses to those comments are provided in Appendix D.
CONSULTATION WITH OUTSIDE AGENCIES
The Evaluation of SNAP-Nutrition Education Practices has been developed in consultation with both technical and substantive experts. Joanne Guthrie, Ph.D., of the USDA Economic Research Service (phone: 202-694-5373) participated in the selection of the demonstration projects and provided input on the study design. The data collection instruments, study plan, and sampling methodologies were all reviewed by the National Agricultural Statistics Service (NASS). NASS comments and FNS responses are included as Appendix J. Contact the NASS Survey Administration Branch for more information.
A.9—Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
Incentives will be offered to study participants to maximize the response rates (see Table A.9-1). The incentive amounts are the same as for Wave I of this study. Additionally, for the INN and UKCES evaluations, the contractor will work with selected schools to coordinate the data collection for the impact surveys. To acknowledge their assistance in distributing and collecting study enrollment materials and engaging students and their caregivers in the study, we will provide incentives to the school, classroom teachers, and a site coordinator. The incentive amounts are based on the contractor’s experience with collecting data in schools. School teachers in the intervention classrooms will also serve as key respondents for the process evaluation and be provided an incentive for completion of either an onsite interview or a submission of a written questionnaire (see Table A.9-1).
Table A.9-1. Incentives for the Impact and Process Evaluation Data Collection
Program and Respondent Type | Impact Evaluation: Prior to Survey | Impact Evaluation: Pre-survey | Impact Evaluation: Post-survey | Process Evaluation: Structured Group Interview
INN and UKCES | | | |
Pre-test with Spanish-speaking individuals | $50 | N/A | N/A | N/A
Parents/caregivers of study participants | N/A | $10 | $15 | $50
School* | $200 | N/A | N/A | N/A
Teacher** | $25 | N/A | N/A | N/A
Site coordinator† | $50 | N/A | N/A | N/A
Children (return of enrollment envelope)‡ | Token gift worth $1 | N/A | N/A | N/A
MSUE | | | |
Study participants | N/A | $10 | $15 | $50
N/A = Not applicable
*A check made out to the school will be provided to the principal.
**The teacher incentive is based upon the percentage of students returning the envelope with the enrollment form, not the number of caregivers who enroll or provide contact information.
† The principal will be asked to designate someone to coordinate the daily collection of the returned sealed envelopes for pickup by a field interviewer during the enrollment period.
‡ Children will receive token incentives (e.g., a pencil or eraser) for returning their caregiver’s enrollment form whether or not the caregiver agrees to participate and provide contact information for the mail/telephone impact survey.
A.10—Describe any assurance of confidentiality provided to respondents and the basis for the assurance in statute, regulation, or agency policy.
The individuals participating in this study will be notified that (1) their participation in the study is voluntary, (2) there will be no penalty if they choose not to participate, and (3) the information that they provide will not be released in a form that identifies them except as required by law. No identifying information will be attached to any reports or data supplied to FNS. All respondents to the impact and process evaluation data collections will be asked to provide informed consent before participating in the data collection. The informed consent form (included as part of Appendix H) will describe the precautions taken to protect participant contact information and survey or interview responses.
RTI’s Institutional Review Board (IRB) has reviewed and approved the survey protocols for the impact evaluation to ensure that human subjects are protected and that procedures to ensure privacy are adequate (included as Appendices K, L, and M). After reviewing the data collection instruments for the process evaluation, Social Solutions International, Inc., determined that this component of the study qualifies for IRB exemption under 45 CFR 46.101(b)(1) (included as Appendix N). Older adults (MSUE project) and parents/caregivers (INN and UKCES) who decide to participate in the study will read, sign, and return an informed consent form. The informed consent form, approved by RTI’s IRB, describes the precautions taken to protect participants’ contact information and survey responses. Specifically, participants will be informed that their names will be replaced with an identification number and that other personal information will be stored separately from their survey answers. FNS has published a Privacy Act Notice (system of records notice) covering personally identifiable information (PII) on individuals doing business with FNS in the Federal Register (Volume 65, pages 17251-52), specifying the uses to be made of the information in this collection. In addition, respondents to the impact and process evaluations will be provided oral and written notification that their data will be treated as private and released to the public only in the form of aggregate statistics.
A.11—Provide additional justification for any questions of a sensitive nature, such as sexual behavior or attitudes, religious beliefs, and other matters that are commonly considered private. This justification should include the reasons why the agency considers the questions necessary, the specific uses to be made of the information, the explanation to be given to persons from whom the information is requested, and any steps to be taken to obtain their consent.
The data collection instruments for this study do not contain questions of a sensitive nature.
A.12—Provide estimates of the hour burden of the collection of information. The statement should: (1) indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I; and (2) provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories.
The estimates of hour burden and cost to respondents are provided in Tables A.12-1 and A.12-2.
These tables show the affected public (State agencies, businesses, and individuals/households); sample sizes (total N = 4,717); estimated annual burden in hours (total, 2,133.07); estimated frequency of response (maximum of 2); and estimated total cost of respondent burden (total, $24,621.86). Estimates of burden include the time required to read advance letters (for both responders and nonresponders), complete surveys and interviews, coordinate activities, and respond to specific data collection queries.
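The burden and cost figures in the tables below follow a simple arithmetic pattern: total annual responses equal respondents multiplied by frequency of response; total hours equal responses multiplied by hours per response; and cost equals total hours multiplied by the hourly wage. The following sketch is illustrative only (the function name and structure are not part of the official submission); the figures in the example come from Table A.12-1a.

```python
# Illustrative sketch of the burden arithmetic used in Tables A.12-1a/b.
# Only the numeric inputs come from the tables; the code itself is ours.

def burden_row(respondents, frequency, hours_per_response, hourly_wage):
    """Return (total annual responses, total hours, cost) for one table row."""
    total_responses = respondents * frequency
    total_hours = total_responses * hours_per_response
    cost = round(total_hours * hourly_wage, 2)
    return total_responses, total_hours, cost

# Example row: INN demonstration project staff, in-depth open-ended
# discussions (5 respondents, 2 responses each, 0.75 hours per response,
# at an estimated hourly wage of $39.70).
responses, hours, cost = burden_row(5, 2, 0.75, 39.70)
print(responses, hours, cost)  # 10 7.5 297.75, matching Table A.12-1a
```

The same computation, applied row by row and summed, yields the INN totals reported in Table A.12-1a (289 responses, 186.21 hours, $6,888.51).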
Table A.12-1a. Reporting Estimates of Hour Burden and Cost to Program Developers and Providers for INN’s Demonstration Project1

Respondent | Instrument Type | Estimated Number of Respondents | Frequency of Response | Total Annual Responses | Response Burden in Hours | Estimated Total Hours | Estimated Hourly Wage | Estimated Cost to Respondents
Demonstration project staff | Key informant contact information* | 1 | 1 | 1 | 0.33 | 0.33 | $39.70 | $13.10
 | In-depth, open-ended discussions | 5 | 2 | 10 | 0.75 | 7.50 | $39.70 | $297.75
 | Demonstration project cost form* | 1 | 1 | 1 | 0.75 | 0.75 | $39.70 | $29.78
School principals | Introductory meeting | 33 | 1 | 33 | 0.5 | 16.50 | $40.43 | $667.10
 | Coordinating activities* | 33 | 1 | 33 | 1.5 | 49.50 | $40.43 | $2,001.29
 | In-depth, open-ended discussions* | 6 | 1 | 6 | 0.5 | 3.00 | $40.43 | $121.29
School food service directors | In-depth, open-ended discussions | 3 | 1 | 3 | 0.33 | 0.99 | $20.99 | $20.78
Direct educators | In-depth, open-ended discussions | 6 | 2 | 12 | 0.5 | 6.00 | $35.00 | $210.00
Classroom teachers | Coordinating activities | 66 | 1 | 66 | 1 | 66.00 | $35.00 | $2,310.00
 | Brief questionnaires* | 50 | 2 | 100 | 0.25 | 25.00 | $35.00 | $875.00
 | Brief interviews* | 16 | 1 | 16 | 0.5 | 8.00 | $35.00 | $280.00
Retail store produce managers | In-depth, open-ended discussions | 8 | 1 | 8 | 0.33 | 2.64 | $23.65 | $62.44
TOTAL for INN | N/A | 121 | N/A | 289 | N/A | 186.21 | N/A | $6,888.51
Table A.12-1b. Reporting Estimates of Hour Burden and Cost to Program Developers and Providers for MSUE’s Demonstration Project1

Respondent | Instrument Type | Estimated Number of Respondents | Frequency of Response | Total Annual Responses | Response Burden in Hours | Estimated Total Hours | Estimated Hourly Wage | Estimated Cost to Respondents
Demonstration project staff | Key informant contact information* | 1 | 1 | 1 | 0.33 | 0.33 | $37.40 | $12.34
 | In-depth, open-ended discussions | 3 | 2 | 6 | 0.75 | 4.50 | $37.40 | $168.30
 | Demonstration project cost form* | 1 | 1 | 1 | 0.75 | 0.75 | $37.40 | $28.05
Regional/local extension staff | In-depth, open-ended discussions | 5 | 2 | 10 | 0.75 | 7.50 | $37.40 | $280.50
Senior center program managers | In-depth, open-ended discussions | 6 | 1 | 6 | 0.75 | 4.50 | $28.00 | $126.00
Direct educators | In-depth, open-ended discussions | 19 | 2 | 38 | 0.5 | 19.00 | $55.00 | $1,045.00
TOTAL for MSUE | N/A | 33 | N/A | 62 | N/A | 36.58 | N/A | $1,660.19
Table A.12-1c. Reporting Estimates of Hour Burden and Cost to Program Developers and Providers for UKCES’ Demonstration Project2

Respondent | Instrument Type | Estimated Number of Respondents | Frequency of Response | Total Annual Responses | Response Burden in Hours | Estimated Total Hours | Estimated Hourly Wage | Estimated Cost to Respondents
Demonstration project staff | Key informant contact information* | 1 | 1 | 1 | 0.33 | 0.33 | $30.50 | $10.07
 | In-depth, open-ended discussions | 5 | 2 | 10 | 0.75 | 7.50 | $30.50 | $228.75
 | Demonstration project cost form* | 1 | 1 | 1 | 0.75 | 0.75 | $30.50 | $22.88
Family/consumer science agents | In-depth, open-ended discussions | 4 | 2 | 8 | 0.75 | 6.00 | $28.70 | $172.20
School principals | Introductory meeting | 16 | 1 | 16 | 0.5 | 8.00 | $28.70 | $229.60
 | Coordinating activities* | 16 | 1 | 16 | 1.5 | 24.00 | $28.70 | $688.80
 | In-depth, open-ended discussions* | 4 | 1 | 4 | 0.5 | 2.00 | $37.40 | $74.80
Classroom teachers | Coordinating activities | 64 | 1 | 64 | 0.75 | 48.00 | $22.54 | $1,081.92
 | Brief questionnaires* | 50 | 1 | 50 | 0.25 | 12.50 | $22.54 | $281.75
 | Brief interviews* | 12 | 1 | 12 | 0.25 | 3.00 | $22.54 | $67.62
Direct educators | In-depth, open-ended discussions | 4 | 2 | 8 | 0.5 | 4.00 | $30.50 | $122.00
TOTAL for UKCES | N/A | 93 | N/A | 190 | N/A | 116.08 | N/A | $2,980.38
Table A.12-2. Reporting Estimates of Hour Burden and Cost to Program Recipients³

Respondent | Instrument Type | Estimated Number of Respondents* | Frequency of Response | Total Annual Responses | Response Burden in Hours** | Estimated Total Hours | Estimated Hourly Wage | Estimated Cost to Respondents
INN—parents of nutrition education recipients | Pretest conducted with English-speaking individuals in North Carolina | 9 | 1 | 9 | 1 | 9.00 | $7.25 | $65.25
 | Pretest conducted with Spanish-speaking individuals | 3 | 1 | 3 | 1 | 3.00 | $7.25 | $21.75
 | Mail/telephone questionnaire, nonrespondents | 840 | 1 | 840 | 0.07 | 58.80 | $7.25 | $426.30
 | Mail/telephone questionnaire, respondents | 909 | 2 | 1,818 | 0.28 | 509.04 | $7.25 | $3,690.54
 | Structured group interview, nonrespondents* | 223 | 1 | 223 | 0.07 | 15.61 | $7.25 | $113.17
 | Structured group interview, respondents (3 interviews × 8 participants)* | 24 | 1 | 24 | 2 | 48.00 | $7.25 | $348.00
INN SUBTOTAL | N/A | 1,761 | N/A | 2,917 | N/A | 643.45 | N/A | $4,665.01
UKCES—parents of nutrition education recipients | Mail/telephone questionnaire, nonrespondents | 740 | 1 | 740 | 0.07 | 51.80 | $7.25 | $375.55
 | Mail/telephone questionnaire, respondents | 800 | 2 | 1,600 | 0.28 | 448.00 | $7.25 | $3,248.00
 | Structured group interview, nonrespondents* | 178 | 1 | 178 | 0.07 | 12.46 | $7.25 | $90.34
 | Structured group interview, respondents (4 interviews × 8 participants)* | 32 | 1 | 32 | 2 | 64.00 | $7.25 | $464.00
UKCES SUBTOTAL | N/A | 1,540 | N/A | 2,550 | N/A | 576.26 | N/A | $4,177.89
MSUE—senior nutrition education recipients | Pretest conducted with older adults in North Carolina | 9 | 1 | 9 | 1 | 9.00 | $7.25 | $65.25
 | Self-administered questionnaire, nonrespondents | 410 | 1 | 410 | 0.07 | 28.70 | $7.40 | $212.38
 | Self-administered questionnaire & follow-up mail/telephone questionnaire, respondents | 750 | 2 | 1,500 | 0.28 | 420.00 | $7.40 | $3,108.00
 | Structured group interview, nonrespondents* | 297 | 1 | 297 | 0.07 | 20.79 | $7.40 | $153.85
 | Structured group interview, respondents (6 interviews × 8 participants)* | 48 | 1 | 48 | 2 | 96.00 | $7.40 | $710.40
MSUE SUBTOTAL | N/A | 1,169 | N/A | 2,264 | N/A | 574.49 | N/A | $4,249.88
PROGRAM RECIPIENT GRAND TOTAL | N/A | 4,470 | N/A | 7,731 | N/A | 1,794.20 | N/A | $13,092.78

³These estimates are for the maximum number of respondents. For the impact data collection, this assumes that 65 percent will consent to providing contact information, an 80 percent response rate for the pre-intervention survey, and no attrition between pre- and post-data collection.
*Multiple instrument types for the same respondents; these respondents are counted in a previous row.
**The estimates of response time are based on Wave I, experience using similar instruments in other studies, and pilot testing of the impact instruments in August 2010. The hourly wage rates for respondents (SNAP-Ed program recipients) were assumed equivalent to the minimum wage. The minimum wage rate for each State was obtained from the U.S. Department of Labor, Employment Standards Administration Web site (http://www.dol.gov/esa/minwage/america.html).
A.13—Provide estimates of the total annual cost burden to respondents or record keepers resulting from the collection of information, (do not include the cost of any hour burden shown in items 12 and 14). The cost estimates should be split into two components: (a) a total capital and start-up cost component annualized over its expected useful life; and (b) a total operation and maintenance and purchase of services component.
There are no capital, start-up, or ongoing operation or maintenance costs for this information collection.
A.14—Provide estimates of annualized cost to the Federal government. Also, provide a description of the method used to estimate cost and any other expense that would not have been incurred without this collection of information.
The total cost to the Federal government for the evaluation design, instrument development, respondent recruitment, data collection, data analysis, and Federal government review and oversight of the Evaluation of SNAP-Nutrition Education Practices is $2,413,379.10. The period of performance for the study is September 2009 through September 2013 (4 years); therefore, the annualized cost is $603,344.78.
A.15—Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the OMB Form 83-1.
This submission reflects a program change of 765 burden hours. The change results from substantive revisions in this phase of the study as well as changes to the instruments (questions added and deleted). As a result of these changes, the numbers of respondents and responses increased.
A.16—For collections of information whose results are planned to be published, outline plans for tabulation and publication.
Plans for tabulation and publication are described in this section. Table A.16-1 provides the expected periods of performance for data collection, analysis, and reporting.
Table A.16-1. Anticipated Schedule for Data Collection, Tabulation, and Reporting
Activity | Period of Performance
Primary data collection for INN: pre-intervention survey | September 2011–October 2011
Primary data collection for INN: post-intervention survey | May 2012–July 2012
Primary data collection for UKCES: pre-intervention survey | September 2011–October 2011
Primary data collection for UKCES: post-intervention survey | January 2012–March 2012
Primary data collection for MSUE: pre-intervention survey | February 2012–April 2012
Primary data collection for MSUE: post-intervention survey | April 2012–July 2012
Data analysis | November 2011–March 2013
Preparation of final reports | October 2012–September 2013
ANALYSIS PLAN FOR IMPACT EVALUATION
The impact evaluation will measure the effect of exposure to each intervention on key outcome measures. For each program, we will assess whether exposure to the education program led to an increase in the consumption of fruits and vegetables. For UKCES and INN, we will solicit information from parents or adult caregivers; for MSUE, we will solicit self-reports from participants. For all demonstration programs, we will examine the following hypotheses:
H1: Individuals who participate in SNAP-Ed demonstration programs will increase their consumption of fruits and vegetables between baseline (pre-intervention) and follow-up (post-intervention) compared to similar individuals who did not participate in SNAP-Ed demonstration programs.
H2: Individuals who participate in SNAP-Ed demonstration programs (or their caregiver) will report greater availability of healthy foods (e.g., fruits) and less availability of unhealthy foods (chips, sweetened carbonated beverages) between baseline (pre-intervention) and follow-up (post-intervention) compared to similar individuals (or the parents of individuals) who did not participate in SNAP-Ed demonstration programs.
In addition, the following program-specific hypotheses will be addressed:
H3 (INN, UKCES): Caregivers of children participating in SNAP-Ed demonstration programs will report a greater increase in their children’s willingness to try new fruits between baseline (pre-intervention) and follow-up (post-intervention) than caregivers of children not participating in SNAP-Ed demonstration programs.
H4 (INN, UKCES): Caregivers of children participating in SNAP-Ed demonstration programs will report an increase in the variety of fruit and vegetables consumed by their children between baseline (pre-intervention) and follow-up (post-intervention) compared to caregivers of children not participating in SNAP-Ed demonstration programs.
H5 (INN): Caregivers of children participating in SNAP-Ed demonstration programs will report increased consumption of low-fat or fat-free milk by their children between baseline (pre-intervention) and follow-up (post-intervention) compared to caregivers of children not participating in SNAP-Ed demonstration programs.
All hypotheses will be tested through the specification of multivariable regression models that include a dichotomous treatment indicator and control for potentially confounding influences. Hypothesis tests are two-tailed and designed to control for Type-I and Type-II error. Part B, Estimation Procedures, provides additional information on the estimation, analysis, and hypothesis testing procedures for the impact evaluation of each demonstration project.
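The kind of model described above, a multivariable regression with a dichotomous treatment indicator, can be sketched as follows. This is a minimal illustration with synthetic data and hypothetical variable names, not the study’s actual estimation procedure, which is described in Part B.

```python
import numpy as np

# Minimal sketch: regress post-intervention fruit/vegetable servings on a
# dichotomous treatment indicator, controlling for the baseline measure and
# one covariate. All data below are simulated; variable names are illustrative.
rng = np.random.default_rng(0)
n = 500
treated = rng.integers(0, 2, n)      # 1 = SNAP-Ed participant, 0 = comparison
baseline = rng.normal(2.5, 1.0, n)   # baseline daily servings
age = rng.normal(40, 10, n)          # hypothetical demographic control

# Simulated outcome with a "true" treatment effect of 0.4 servings
followup = 0.5 + 0.8 * baseline + 0.4 * treated + 0.01 * age \
    + rng.normal(0, 0.5, n)

# Ordinary least squares via the normal equations
X = np.column_stack([np.ones(n), treated, baseline, age])
beta, *_ = np.linalg.lstsq(X, followup, rcond=None)
print(f"estimated treatment effect: {beta[1]:.2f} servings")
```

With adequate sample sizes, the coefficient on the treatment indicator recovers the simulated effect; in the actual evaluation, its two-tailed significance test corresponds to the hypothesis tests listed above.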
ANALYSIS PLAN FOR PROCESS EVALUATION
The process evaluation will be primarily qualitative, with data collected via questionnaires and interviews as well as abstracted and extracted from existing documentation. Quantitative data on nutrition education dose and reach, as well as budgetary information, will also be collected. We will gather and analyze this information using a case study approach. For each demonstration program, we will examine the following hypotheses:
H1: Demonstration programs will be implemented as originally planned.
H2: Deviations from the planned implementation will affect the demonstration programs’ ability to achieve the intended outcome(s).
H3: Exposure to nutrition education outside the SNAP-Ed interventions will affect the demonstration programs’ ability to achieve the intended outcome(s).
H4: Participants’ satisfaction with and the cultural appropriateness of the nutrition education messages will affect the demonstration programs’ ability to achieve the intended outcome(s).
ANALYSIS PLAN FOR ASSESSMENT OF DEMONSTRATION PROJECT-LED EVALUATIONS
The objectives of the assessment of the demonstration project-led evaluations are to (1) describe how each demonstration project evaluated the success of its intervention, (2) describe the results of each demonstration project’s evaluation and how they compare with the FNS evaluation, and (3) describe lessons learned about each demonstration project’s evaluation. We will use a case study approach to summarize the data collected in the pre- and post-intervention interviews with the demonstration project’s evaluation manager, our review of the 2012 annual SNAP-Ed reports from FNS (or similar report describing the results of the demonstration project’s evaluation), and our completion of the Demonstration Project-Led Evaluation Rating Form. Additionally, we will conduct analyses to compare the results of the FNS evaluation to the demonstration project’s evaluation. This analysis will compare the direction and magnitude of the intervention’s impact for the two evaluations. If there are differences, we will attempt to identify and explain the cause of these differences.
PUBLICATION OF RESULTS
The results of the impact and process evaluations and the assessment of the demonstration project-led evaluations for the Evaluation of SNAP-Nutrition Education Practices Study, Wave II will be provided in the form of a final report, Evaluation of SNAP-Nutrition Education Practices Study, Wave II: Final Report. Upon completion, FNS will make the final report and executive summary, Evaluation of SNAP-Nutrition Education Practices Study, Wave II: Executive Summary, available on the FNS Web site.
A.17—If seeking approval to not display the expiration date for OMB approval of the information collection, explain the reasons that display would be inappropriate.
FNS plans to display the expiration date for OMB approval on all instruments.
A.18—Explain each exception to the certification statement identified in Item 19 “Certification for Paperwork Reduction Act.”
There are no exceptions to the Certification for Paperwork Reduction Act (5 CFR 1320.9) for this study.
1 The estimates of response time are based on Wave I and experience using similar instruments in other studies. The hourly wage rates for respondents were obtained from (1) the application submitted by each demonstration project to FNS, which in some cases provides detail on salary or hourly wage rates for program staff members, and (2) the Bureau of Labor Statistics’ estimates for occupational wages (http://www.bls.gov/oes/current/oessrcst.htm). FNS anticipates a 95–100 percent response rate from program staff, education providers, and center and school administrators due to their high level of engagement in the intervention efforts. *Multiple instrument types for the same respondents; these respondents are counted in a previous row.
2 The estimates of response time are based on Wave I and experience using similar instruments in other studies. The hourly wage rates for respondents were obtained from (1) the application submitted by each demonstration project to FNS, which in some cases provides detail on salary or hourly wage rates for program staff members, and (2) the Bureau of Labor Statistics’ estimates for occupational wages (http://www.bls.gov/oes/current/oessrcst.htm). FNS anticipates a 95–100 percent response rate from program staff, nutrition education providers, and center and school administrators due to their high level of engagement in the intervention efforts. *Multiple instrument types for the same respondents; these respondents are counted in a previous row.