OMB Control Number: 0584-0554

Supporting Statement for Paperwork Reduction Act Submission

Models of SNAP-Ed and Evaluation









Prepared for:

Hoke Wilson

U.S. Department of Agriculture

Food and Nutrition Service

3101 Park Center Drive

Alexandria, VA 22302

703-305-2131

[email protected]



Prepared by:

Altarum Institute and RTI International

Table of Contents

Part B Statistical Methods

B.1 Respondent Universe and Sampling Methods

B.2 Procedures for the Collection of Information

B.3 Methods to Maximize Response Rates and Deal with Non-Response

B.4 Tests of Procedures or Methods to Be Undertaken

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data



List of Appendices

A Data Collection Instruments for Impact Evaluation

B Data Collection Instruments for Process Evaluation

C Assessment of IA-Led Impact Evaluation Review Form

D Comments on Federal Register Notice and Responses to Comments

E Statistical Methods for the Impact Evaluation

F Data Collection Methods for Impact Evaluation

G Common Assumptions for Statistical Models of Parental Reports of Children’s Fruit and Vegetable Consumption in a Clustered, Experimental or Quasi-Experimental Design

H Cover Letters, Recruitment Letters, and Other Study Materials for Impact Evaluation

I Cover Letters, Recruitment Letters, and Other Study Materials for Process Evaluation

J NASS Comments and FNS Responses

Part A

Justification

A.1 Explanation of Circumstances That Make Collection of Data Necessary

Background

The Food and Nutrition Service (FNS) of the U.S. Department of Agriculture promotes the optimal health and well-being of low-income individuals through improved nutrition and well-designed nutrition education efforts within the Supplemental Nutrition Assistance Program (SNAP). Under Section 17 of the Food and Nutrition Act of 2008 [7 U.S.C. 2026], the Secretary may undertake research that will help improve the administration and effectiveness of SNAP.

The nutrition assistance programs are a critical component of attaining FNS' goals. FNS defines SNAP-Ed (formerly called Food Stamp Nutrition Education) activities as those designed to increase the likelihood of healthy food choices by SNAP recipients and by those who are eligible for SNAP but not currently participating in the program. As the largest of the Federal nutrition assistance programs, SNAP has a significant stake in ensuring that nutrition education works to meet these goals.

To identify the extent to which SNAP-Ed interventions can be linked to increases in healthy eating behaviors, FNS funded four model SNAP-Ed projects that have the resources necessary both to implement an intervention and to measure its impact on client behavior. By evaluating SNAP-Ed projects that address the SNAP Guiding Principles and support a rigorous evaluation model, FNS can obtain estimates of the effectiveness of the four projects, provide nutrition educators with examples of evaluation methodologies that are both feasible and scientifically robust, and identify effective behavior change interventions. Brief summaries of the four model SNAP-Ed projects being evaluated follow.

  • Chickasaw Nation’s Eagle Adventure Diabetes Prevention Program. The Chickasaw Nation Nutrition Services (CNNS) will conduct an intervention in schools using a curriculum that builds upon a diabetes prevention program developed through the Indian Health Service and the Centers for Disease Control and Prevention. The target audience for this intervention will be children in the first through third grades in five public elementary schools in Pontotoc County, OK.

  • University of Nevada’s All 4 Kids Program. The University of Nevada will conduct an 8-week intervention in six Head Start centers in Las Vegas. The All 4 Kids intervention will be delivered in the Head Start classrooms of 3- and 4-year-old children and to their parents through monthly family events. Classroom teachers will also receive structured orientation training and support to encourage incorporation of the All 4 Kids messages into daily classroom activities.

  • New York State Department of Health’s (NYSDOH) Eat Well and Play Hard in Child Care Settings (EWPHCCS). The NYSDOH demonstration is a childcare center intervention targeted to 3- and 4-year-old children, their parents, and teachers at centers that participate in the Child and Adult Care Food Program (CACFP) and primarily serve low-income children and families. During the study period, EWPHCCS will be conducted by trained nutrition educators at an estimated 156 childcare centers across the State. At each center, over an 8-week period, registered dietitians (RDs) will provide six lessons for children in their classroom settings, six classes for their parents or caregivers, and two classes for staff.

  • Pennsylvania State University’s (PSU) Web-Based About Eating Program. PSU’s intervention is Web-based and directed at low-income women enrolled in SNAP. The goal is to promote eating competence through a self-paced online curriculum with multiple modules.

Purpose and Need

A key focus of FNS has been ensuring that nutrition education interventions are science based and have the intended effects on clients’ behavior. With independent evaluations of these four demonstration projects through this one-time study, FNS will be able to determine whether SNAP-Ed nutrition education interventions can positively impact the nutrition and health behaviors of SNAP participants and of those who are eligible but choose not to participate in the program. A 2006 systems review identified a number of issues related to the implementation of SNAP-Ed and the extent to which implementing agencies (IAs) and their local projects focus on client behaviors and evaluate project outcomes. In most cases, IAs reported that they lacked the expertise and funds needed to initiate and complete rigorous impact evaluations of their SNAP-Ed practices. As a result, it is difficult for SNAP-Ed implementers or FNS to determine which SNAP-Ed approaches are most successful in causing behavior change or, for that matter, whether SNAP-Ed efforts are effective at all. Each of the four demonstration projects will have its own independent evaluation component. By comparing each of the four project-led evaluations with the more rigorous evaluation to be conducted by FNS contractors, FNS will be able to provide logistically practical examples of project-level SNAP-Ed evaluation efforts that are methodologically robust. Widespread use of such methodologically robust evaluations by nutrition education implementers will not only help educators refine their evaluations to maximize the interventions’ intended effects but will also provide FNS with another, more general measure of the effectiveness of SNAP-Ed.

In addition, FNS and States that might model these SNAP-Ed interventions in the future need to understand why the interventions were or were not successful, which is why the process evaluation component of this study is critical. Finally, FNS also needs to know which grantee-led evaluations were effective. If FNS plans to recommend evaluation methods or to cite these models as examples for replication, those methods must be examined in terms of rigor, quality, and practicality.

A.2 Purpose and Use of the Information

Study purpose

The purpose of the current study is to determine whether any of the four selected projects can serve as good examples of SNAP-Ed delivery that meet the following criteria: (1) they positively impact the nutrition and health behaviors of SNAP participants while adhering to FNS Guiding Principles; (2) they exhibit the potential to serve as models of effective nutrition education interventions for large segments of the SNAP audience while requiring levels of resources that are manageable by a large percentage of SNAP-Ed implementing agencies; and (3) they provide methodologically robust yet logistically practical examples of project-level SNAP-Ed evaluation efforts.

Specifically, this study encompasses three evaluations: an independent impact evaluation, a process evaluation, and an assessment of the IA-led impact evaluations. The independent impact evaluations will employ a pre/post design with a control group to examine the extent to which clients initiated or sustained the desired behaviors. The process evaluation examines the implementation of the demonstration projects and analyzes lessons learned during implementation, from the perspectives of program administrators, direct educators, and the target audiences, to inform program modifications and potential expansion or replication of these models. The assessment of IA-led impact evaluations will measure the quality of each evaluation and include a description of the strengths and weaknesses of each evaluation design and its implementation, as well as an evaluation of the soundness of the outcome measures. Results from the independent and IA-led data collection efforts will be analyzed separately and then compared to examine the extent to which the results are similar, to explain discrepancies, and to produce an integrated report of findings and recommendations.

Overview of Study Design

FNS’s contractors for this project, Altarum Institute and its subcontractor RTI International, worked with FNS and each demonstration project to develop a rigorous evaluation approach tailored to the specific intervention being evaluated. The evaluation approach for each demonstration project ensures that the FNS evaluation does not contaminate the IA’s intervention and/or impact evaluation; establishes causality between the interventions and the dietary behavioral outcomes within the limitations imposed by delivering a public nutrition education program; and incorporates an “intention-to-treat” approach, meaning that individuals who drop out of the intervention will be contacted to collect information on outcome measures as well as their reasons for dropping out of the program.

Determining the effectiveness of the interventions and their potential for replicability, which is the primary objective of the process evaluation, will require a clear understanding of each intervention’s planning and implementation. Existing documentation will be used to obtain objective information for the process evaluation; qualitative methods will be used to gather more in-depth information on program implementation, as well as perspectives of key players engaged in or exposed to the intervention (e.g., program staff, educators, target audience).

Summary of Data Collection Methods

Impact Evaluation

Table A.1-1 summarizes the research design and data collection methods for the impact evaluation for each demonstration project. The data collection instruments can be found in Appendix A.

Table A.1-1. Summary of the Research Design and Data Collection Methods for the Impact Evaluation

Chickasaw Nation Nutrition Services (CNNS) Eagle Adventure Diabetes Prevention Program

Research design: Quasi-experimental design with intervention schools in Pontotoc County, OK (n = 5) matched to control schools in Bryan County, OK (n = 5) based on characteristics of the schools and students.

Data collection: Survey parents/caregivers of first- through third-grade students pre- and post-intervention, using a mail/telephone survey approach.

University of Nevada (UNV) All 4 Kids Program

Research design: Quasi-experimental design in which the 2 Head Start centers that previously received the intervention are purposively assigned to the intervention group and the remaining 10 centers are randomly assigned to the intervention or control group, for a total of 6 centers in each arm of the trial.

Data collection: Survey parents/caregivers of preschool children pre-intervention onsite at the Head Start center; survey parents/caregivers post-intervention using a mail/telephone survey approach.

New York State Department of Health (NYSDOH) Eat Well and Play Hard in Child Care Settings (EWPHCCS) Program

Research design: Experimental design with child care centers randomly assigned to the intervention group (n = 12) or the control group (n = 12); centers will be matched based on characteristics of the centers, and pairs of centers will be randomly selected; the design includes two strata: (1) New York City and (2) the rest of the State of New York.

Data collection: Survey parents/caregivers of preschool children pre- and post-intervention using a mail/telephone survey approach.

Pennsylvania State University (PSU) Web-Based About Eating Program

Research design: Experimental design in which study participants are randomly assigned to the intervention group (n = 145) or the comparison group (n = 145), with stratification for rural/urban location and Expanded Food and Nutrition Education Program (EFNEP) participation.

Data collection: Survey participants via the Internet pre- and post-intervention; program dropouts and nonrespondents to the post-intervention survey will be contacted by mail/telephone.
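The NYSDOH and PSU designs above rely on randomization within matched pairs or strata. As a minimal illustration of how such an assignment could be generated, the Python sketch below randomizes one center from each matched pair to each study arm, stratum by stratum. The center IDs, the number of pairs, and the fixed seed are hypothetical and for illustration only; the actual assignment procedures are documented in Part B.

```python
import random

def assign_matched_pairs(pairs, rng):
    """Send one center from each matched pair to the intervention arm
    and its partner to the control arm, chosen by a fair coin flip."""
    assignments = {}
    for center_a, center_b in pairs:
        if rng.random() < 0.5:
            assignments[center_a], assignments[center_b] = "intervention", "control"
        else:
            assignments[center_a], assignments[center_b] = "control", "intervention"
    return assignments

# A fixed seed makes the allocation reproducible, so it can be audited later.
rng = random.Random(2010)

# Hypothetical center IDs; the two lists stand in for the NYSDOH strata
# (New York City vs. the rest of the State).
nyc_pairs = [("NYC-01", "NYC-02"), ("NYC-03", "NYC-04")]
rest_of_state_pairs = [("ROS-01", "ROS-02"), ("ROS-03", "ROS-04")]

arms = {}
for stratum in (nyc_pairs, rest_of_state_pairs):
    arms.update(assign_matched_pairs(stratum, rng))
print(arms)
```

Because assignment occurs within matched pairs inside each stratum, each arm receives one member of every pair, preserving balance on the matching characteristics.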



For the process evaluation data collection effort, the methods used and respondent types will vary for each intervention project. Table A.1-2 provides a summary of the data collection methods. The data collection instruments can be found in Appendix B.

Table A.1-2. Summary of Data Collection Methods for the Process Evaluation

In-Depth, Open-Ended Discussions with SNAP-Ed Program- and Partner-Level Staff (CNNS, UNV, NYSDOH, and PSU)

Purpose: To capture the experiences and perspectives of, as well as lessons learned by, personnel on the administrator and, in most cases, provider sides of the program. For most respondent types, these interviews will take place both pre- and post-implementation.

Structured Observations of the Nutrition Education Classes (CNNS, UNV, and NYSDOH)

Purpose: To collect information related to environmental influences, observe participant interest in nutrition education lessons, and describe how implementation is or is not consistent with plans.

In-Depth, Post-Intervention Open-Ended Discussions with Classroom Teachers (UNV and NYSDOH)

Purpose: To assess attitudes about the importance of the nutrition messages of the intervention, perspectives on what worked well and what could be improved in the administration and delivery of the intervention, and the degree to which teachers incorporate SNAP-Ed messages into other classroom activities.

Post-Intervention Mail Questionnaires with Classroom Teachers (UNV and NYSDOH)

Purpose: To collect information similar to that gathered in the in-depth discussions from the subset of teachers who are not involved in those discussions.

Post-Intervention Structured Group Interviews with Participants’ Parents (CNNS, UNV, and NYSDOH)

Purpose: To capture the perspectives and level of satisfaction of parents/caregivers whose children participated in the nutrition education interventions and who, in the case of UNV and NYSDOH, are also a target audience of the intervention through parent classes or family events.

Email Questionnaire of Pilot Participants Post-Intervention (PSU)

Purpose: To capture any feasibility or access concerns that the pilot participants experienced. This information will provide insight into challenges that could be expected if other SNAP-Ed programs consider implementing a Web-based intervention and will serve as a point of reference when talking with Web developers.

Telephone Interview with Intervention Participants Post-Intervention (PSU)

Purpose: To capture more in-depth information about problems that arose in accessing the intervention, participants’ likes and dislikes regarding the intervention, and their perspectives on the relevance of the material.

Abstraction of Textual Information and Extraction of Numerical Program Data from Secondary Data Sources (CNNS, UNV, NYSDOH, and PSU)

Purpose: To objectively document the planned and actual intervention and to obtain the data needed to quantify total attendance and the average and range of attendance per intervention site, the average and range of nutrition education received by participants, and total and per-participant costs.



Some of the methods described in Table A.1-2 will also be used to capture information for the assessment of the IA-led evaluations. The primary data sources for that assessment are (1) pre- and post-implementation in-depth, open-ended discussions with the IAs’ evaluation managers and (2) review of and abstraction/extraction from the 2010 Annual SNAP-Ed reports from FNS (or similar reports describing the results of the IAs’ evaluations). Using these data sources, the contractor will complete an Implementing Agency-Led Evaluation Rating Form that rates each IA’s impact evaluation on criteria such as a viable comparison strategy, sample size and sampling strategy, outcome measures, data collection, and data analysis. This form is provided in Appendix C.

Use of the Information

The results of the impact evaluation and the assessment of the IA-led evaluations, coupled with the process evaluation findings, will be used to (1) determine which, if any, of the four demonstration projects can serve as good examples of SNAP-Ed delivery that meet the previously described FNS criteria; (2) identify lessons learned in the design, planning, and implementation process and provide recommendations to FNS on how these interventions could be improved to potentially enhance outcomes; and (3) determine which, if any, of the IA-led evaluations provide methodologically robust yet logistically practical examples of project-level SNAP-Ed evaluation efforts. In more general terms, the evaluation of the demonstration projects’ nutrition education interventions will provide evidence of the potential effectiveness of the SNAP-Ed program, while the assessment of the demonstration projects’ impact evaluations will generate models of practical yet robust evaluation techniques that can be adopted by a significant percentage of SNAP-Ed implementing agencies. By promulgating these impact evaluation techniques to SNAP-Ed nutrition educators, FNS will be able to obtain even broader evidence of the success of SNAP-Ed.

A.3 Use of Information Technology to Reduce Burden

FNS makes every effort to comply with the E-Government Act of 2002. For the impact evaluation, it will be necessary to collect information on outcome measures of interest from program participants. For the NYSDOH, CNNS, and UNV programs, use of the Internet to collect this information was considered but rejected because many low-income individuals do not have ready access to the Internet. Instead, a combination of in-person survey administration (UNV only) and mail/telephone surveys will be used. The PSU intervention is Web-based, so the pre- and post-intervention data collection will also be Web-based to minimize respondent burden. It is anticipated that 100% of respondents for this intervention will respond using the Internet; these respondents represent 14% of all study respondents. The Web address and screen shots will be provided to OMB once the contractor has finalized their development.

A.4 Efforts to Identify Duplication and Use of Similar Information

Every effort has been and will be made to avoid duplication of data collection for the process evaluation. These efforts include (1) a thorough review of many extant documents, including but not limited to FY 2010 SNAP-Ed Plans and Reports, nutrition curricula, expenditure reports, school menus, and class attendance logs or Web hits, and (2) adding a limited number of process-related questions to the impact evaluation instruments so as to avoid multiple contacts with the same respondents.

A.5 Impact on Small Businesses or Other Small Entities

FNS estimates that all 46 CACFP childcare centers (NYSDOH), Acelero Learning Centers (UNV), and schools in Pontotoc and Bryan Counties, OK (CNNS) involved in the FNS evaluation qualify as small businesses. In all instances, the necessary steps have been taken to ensure that the burden on any organization, but especially on small businesses, is minimized. For the impact evaluation, center and school administrators will be asked only to assist with distributing and collecting study enrollment materials and engaging students and their caregivers in the study. For the process evaluation, none of the 23 childcare centers or schools serving as controls will be contacted. In-depth interviews will be conducted with the center administrator and 3 classroom teachers at only 3 of the 12 CACFP childcare centers and 4 of the 6 Acelero Learning Centers receiving SNAP-Ed interventions; brief questionnaires will be administered to 3 classroom teachers at each of the remaining 9 CACFP childcare centers and 2 Acelero Learning Centers. Principals from all five schools receiving the SNAP-Ed intervention in Pontotoc County will be interviewed; however, no data will be collected from school teachers.

A.6 Consequences of Collecting the Information Less Frequently

The primary objective of the impact evaluation will be to measure the effect of exposure to each intervention on key outcome measures (e.g., intake of fruits and vegetables). Thus, it will be necessary to conduct pre- and post-surveys to collect data on key outcome measures before and after the intervention (Shadish, Cook, & Campbell, 2002).

Onsite data collection for the process evaluation will take place at three points in time: prior to, during, and just after implementation of the nutrition education. Pre-implementation interviews will focus on the planned intervention design and implementation, while post-implementation interviews will focus on experiences, lessons learned, and deviations from the planned implementation. Eliminating either of these data collection efforts would significantly impair the ability to draw accurate conclusions about factors that may have contributed to the success or failure of the intervention. The purpose of the “during” data collection period is specifically to observe nutrition education in action. This observation will provide an opportunity to draw conclusions about the degree to which nutrition education was implemented as planned and to document certain environmental factors that might not otherwise be reported but that could significantly affect the impact of the nutrition education.

A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances. The collection of information is conducted in a manner consistent with the guidelines in 5 CFR 1320.5.

A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

Federal Register notice

In accordance with the Paperwork Reduction Act of 1995, an announcement of the Food and Nutrition Service’s intent to seek OMB approval to collect information for the Models of SNAP-Ed and Evaluation Study provided an opportunity for public comment. This announcement was published in the Federal Register on May 8, 2009 (74 FR 21619), and specified a 60-day period for comment ending July 7, 2009. A copy of the comments received and FNS’ responses to those comments are provided in Appendix D.

Consultation with Outside Agencies

The Models of SNAP-Ed and Evaluation Study has been developed in consultation with both technical and substantive experts. Marilyn Townsend, PhD, of the University of California, Davis, reviewed the draft instruments for the impact evaluation, and Joanne Guthrie, PhD, of the Economic Research Service, USDA, participated in the selection of the demonstration projects and provided input on the study design. The data collection instruments, study plan, and sampling methodologies were all reviewed by the National Agricultural Statistics Service (NASS). NASS comments and FNS responses are included as Appendix J. Contact the NASS Survey Administration Branch for more information.

A.9 Explanation of Any Payment or Gift to Respondents

Incentives will be offered to study participants to maximize response rates for the pre- and post-intervention surveys and to minimize attrition between the two surveys (see Table A.9-1). For the CNNS, NYSDOH, and UNV evaluations, incentives will be provided to parents/caregivers who complete each survey: a $10 cash incentive for completing the pre-intervention survey and a $15 cash incentive for completing the post-intervention survey. Additionally, children who return the completed form with their parents’/caregivers’ contact information will receive a token incentive worth $1 to engage them in the study and to enhance their cooperation with study recruitment. Although the literature suggests that it may be more effective to provide a small, pre-paid incentive for mail surveys (Berk et al., 1987; Schewe and Cournoyer, 1976), we decided that a larger, promised incentive would be more effective than pre-paid incentives for this study for several reasons. First, we had concerns about sending cash home with children for the pre-intervention survey and decided to offer a token incentive to the child instead. Second, we thought it was important to be consistent between the pre- and post-intervention surveys and to offer a promised incentive for both. Third, we believed that offering a promised incentive for the post-intervention survey would help minimize attrition between the two surveys and that it would be beneficial to offer a larger incentive for the post-intervention survey because of its increased length. Finally, our study design employs a multi-mode approach with the option to respond by mail or telephone; offering only a pre-paid incentive would not compensate participants who complete the survey by phone. The incentive amounts ($10 pre- and $15 post-intervention) are consistent with what is typically offered for mail surveys of comparable length. Additionally, the contractor has received approval for the incentive amounts from its Institutional Review Board (IRB).

For PSU, participants will receive a $10 cash incentive for completing the pre-intervention survey and a $15 cash incentive for completing the post-intervention survey. We believe this level of compensation is required to maximize the response to each survey and to minimize attrition between the pre- and post-intervention surveys. The incentive amounts are consistent with what is typically offered for Internet surveys of comparable length. The contractor and the IA have received separate IRB approvals for the incentive amounts.

For the CNNS and NYSDOH evaluations, the contractor will work with selected schools/childcare centers to coordinate the data collection for the impact surveys. Based on the contractor’s experience conducting data collection in school environments, we believe it is important to offer incentives to the schools/centers and teachers to obtain buy-in for the study and to secure their cooperation in recruiting parents and caregivers. Incentives will be provided to the center/school, classroom teachers, and a site coordinator; the amounts are based on the contractor’s experience with collecting data in schools and childcare centers. The schools/centers are not part of the IA and thus will not receive any monetary funding (other than the incentives listed in Table A.9-1) for participating in the intervention or the evaluation study. Also, the control centers/schools will not receive the nutrition education intervention until after the study is completed, so they may be less engaged in the study; it is therefore important to offer some type of monetary incentive to secure their cooperation. The incentive to the school/center ($200/$125) is being offered to obtain buy-in for the study from the principal/center director and thus to help facilitate the contractor’s access to teachers. The incentive to the teachers ($25) acknowledges their assistance in distributing study enrollment materials and engaging students and their caregivers in the study during the 1-month study enrollment period. The incentive to the site coordinator ($50) acknowledges their assistance in collecting the completed enrollment materials and returning them to the contractor during the 1-month study enrollment period.

Childcare teachers in the intervention classrooms will also serve as key respondents for the process evaluation and will be provided an incentive for completing either an on-site interview or a written questionnaire (see Table A.9-1). The incentive amounts are based on the contractor’s experience with collecting data in childcare centers and acknowledge the time respondents spend completing the interview before or after their regular paid employment hours.

For the process evaluation’s structured group discussions with parents, we are providing a $50 incentive to each participant. This level is within the $50 to $75 range in current research practice for focus group participants, as recommended by Krueger and Casey (2009), and is less than the $75 rate offered in several other Federal OMB-approved research studies.

Table A.9-1. Incentives for the Impact and Process Evaluation Data Collection

PSU

Study participants: $10 for the pre-intervention survey; $15 for the post-intervention survey; $15 for the post-intervention process evaluation interview.

CNNS, NYSDOH, and UNV

Study participants (parents/caregivers): $10 for the pre-intervention survey; $15 for the post-intervention survey; $50 for the process evaluation structured group interview.

CNNS and NYSDOH

School (CNNS)*: $200, prior to the survey.

Center (NYSDOH)*: $125, prior to the survey.

Teacher**: $25, prior to the survey.

Site coordinator***: $50, prior to the survey.

Children (return of enrollment envelope)****: token gift worth $1, prior to the survey.

UNV and NYSDOH

Teacher: $15 for the post-intervention process evaluation interview; $10 for the post-intervention process evaluation questionnaire.

*A check made out to the school/center will be provided to the school principal or center director.

**The teacher incentive is based upon the percentage of students returning the envelope with the enrollment form, not the number of caregivers who enroll/provide contact information.

***The school principal or center director will be asked to designate someone to coordinate the daily collection of the returned sealed envelopes for pickup by a field interviewer during the enrollment period.

****Children will receive token incentives (e.g., a friendship bracelet, pen) for returning their caregiver’s enrollment form whether or not the caregiver agrees to participate and provide contact information for the mail/telephone impact survey.

A.10 Assurance of Privacy Provided to Respondents

Individuals participating in this study will be (1) notified that their participation is voluntary and that there will be no penalty if they choose not to participate and (2) assured that the information they provide will not be released in a form that identifies them, except as required by law. No identifying information will be attached to any reports or data supplied to FNS. All respondents to the impact and process evaluation data collections will be asked to provide informed consent before participating, and the informed consent form will describe the precautions taken to protect participant contact information and survey/interview responses.

RTI’s IRB will review the survey protocols to ensure that human subjects are protected and that procedures to ensure privacy are adequate. FNS will provide OMB with a copy of the protocols once IRB approval is obtained. For the CNNS, NYSDOH, and UNV projects, parents/caregivers who decide to participate in the study will read, sign, and return an informed consent form. For the PSU project, the introductory screen of the survey will provide the informed consent information. The informed consent, approved by RTI’s IRB, will describe the precautions taken to protect participants’ contact information and survey responses. Specifically, participants will be informed that their names will be replaced with an identification number and that other personal information will be stored separately from their survey answers. In addition, respondents to the impact and process evaluations will be provided oral and written assurances that their data will be treated as private and released to the public only in the form of aggregate statistics.

A.11 Justification for Sensitive Questions

The data collection instruments for this study do not contain questions of a sensitive nature.

A.12 Estimates of Hour Burden Including Annualized Hourly Costs

The estimates of hour burden and cost to respondents are provided in Tables A.12-1 and A.12-2.

Table A.12-1. Reporting Estimates of Hour Burden and Cost to Program Developers and Providers

Respondent | Instrument Type | Estimated Number of Respondents | Responses Annually per Respondent | Total Annual Responses | Response Burden in Hours | Estimated Total Hours | Estimated Hourly Wage | Estimated Cost to Respondents

CNNS

IA-level respondents | In-depth, open-ended discussions | 2 | 2 | 4 | 0.67 | 2.7 | $34.09 | $91.09
Direct Educators | In-depth, open-ended discussions | 1 | 2 | 2 | 0.50 | 1.0 | $34.09 | $34.09
School principals | In-depth, open-ended discussions | 5 | 2 | 10 | 0.50 | 5.0 | $41.43 | $207.15

UNV

IA-level respondents | In-depth, open-ended discussions | 4 | 2 | 8 | 0.67 | 5.3 | $42.14 | $225.20
Direct Educators | In-depth, open-ended discussions | 4 | 2 | 8 | 0.50 | 4.0 | $42.14 | $168.56
Head Start Site Directors | In-depth, open-ended discussions | 4 | 2 | 8 | 0.50 | 4.0 | $22.29 | $89.16
Head Start Classroom Teachers | In-depth, open-ended discussions | 12 | 1 | 12 | 0.50 | 6.0 | $12.80 | $76.80
Head Start Classroom Teachers | Mail questionnaire | 6 | 1 | 6 | 0.25 | 1.5 | $12.80 | $19.20

NYSDOH

IA-level respondents | In-depth, open-ended discussions | 7 | 2 | 14 | 0.67 | 9.4 | $38.28 | $357.99
Direct Educators | In-depth, open-ended discussions | 6 | 2 | 12 | 0.50 | 6.0 | $24.75 | $148.50
Childcare Center Site Directors | In-depth, open-ended discussions | 3 | 2 | 6 | 0.50 | 3.0 | $29.22 | $87.66
Childcare Center Classroom Teachers | In-depth, open-ended discussions | 9 | 1 | 9 | 0.50 | 4.5 | $12.80 | $57.60
Childcare Center Classroom Teachers | Mail questionnaire | 27 | 1 | 27 | 0.25 | 6.8 | $12.80 | $86.40

PSU

IA-level respondents | In-depth, open-ended discussions | 3 | 2 | 6 | 0.67 | 4.0 | $55.29 | $221.60
Recruiters | In-depth, open-ended discussions | 8 | 1 | 8 | 0.50 | 4.0 | $19.71 | $78.84
Web Developers | In-depth, open-ended discussions | 3 | 1 | 3 | 0.50 | 1.5 | $12.00 | $18.00

TOTAL | n/a | 104 | 26 | 143 | 8.17 | 68.6 | n/a | $1,967.84

The estimates of response time are based on experience using similar instruments in other studies. The hourly wage rates for respondents were obtained (1) from the application submitted by each IA to FNS, which in some cases provides detail on salary or hourly wage rates for program staff members, and (2) from the Bureau of Labor Statistics’ estimates for occupational wages (http://www.bls.gov/oes/current/oessrcst.htm). FNS anticipates a 95-100% response rate from program staff, nutrition education providers, and center and school administrators because of their high level of engagement in the intervention efforts.
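As a worked illustration of how the columns in Tables A.12-1 and A.12-2 combine (total annual responses = respondents x responses per respondent; total hours = total responses x burden per response; cost = total hours x hourly wage), the following minimal Python sketch reproduces the CNNS Direct Educators row of Table A.12-1. The use of Python is purely illustrative, and a few table totals reflect rounding at intermediate steps.

```python
# Worked check of one row of Table A.12-1 (CNNS, Direct Educators).
# Inputs are taken directly from the table; the formulas restate how
# the table columns combine.
respondents = 1            # estimated number of respondents
responses_per_year = 2     # pre- and post-implementation discussions
hours_per_response = 0.50  # response burden in hours
hourly_wage = 34.09        # estimated hourly wage ($)

total_responses = respondents * responses_per_year   # 2
total_hours = total_responses * hours_per_response   # 1.0
cost = total_hours * hourly_wage                     # $34.09

print(total_responses, total_hours, f"${cost:.2f}")
```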

Table A.12-2. Reporting Estimates of Hour Burden and Cost to Program Recipients

Respondent | Instrument Type | Estimated Number of Respondents* | Responses Annually per Respondent | Total Annual Responses | Response Burden in Hours** | Estimated Total Hours | Estimated Hourly Wage | Estimated Cost to Respondents

CNNS

Parents of nutrition education recipients | Mail/telephone questionnaire (impact) | 838 | 2 | 1,676 | 0.25 | 419.0 | $7.25 | $3,037.75
Parents of nutrition education recipients | Structured group interviews [3 interviews x 10 participants] (process) | 10 | 3 | 30 | 2.0 | 60.0 | $7.25 | $435.00

UNV

Parents of nutrition education recipients | One in-person interview [pre] and one mail/telephone questionnaire [post] (impact) | 600 | 2 | 1,200 | 0.25 | 300.0 | $7.55 | $2,265.00
Parents of nutrition education recipients | Structured group interviews [3 interviews x 10 participants] (process) | 10 | 3 | 30 | 2.0 | 60.0 | $7.55 | $453.00

NYSDOH

Parents of nutrition education recipients | Mail/telephone questionnaire (impact) | 786 | 2 | 1,572 | 0.25 | 393.0 | $7.25 | $2,849.25
Parents of nutrition education recipients | Structured group interviews [3 interviews x 10 participants] (process) | 10 | 3 | 30 | 2.0 | 60.0 | $7.25 | $435.00

PSU

Nutrition education participants | Internet questionnaire (impact) | 362 | 2 | 724 | 0.25 | 181.0 | $7.25 | $1,312.25
Nutrition education participants | Telephone interview (process) | 8 | 1 | 8 | 0.50 | 4.0 | $7.25 | $29.00
Nutrition education pilot participants | Email questionnaire (process) | 8 | 1 | 8 | 0.25 | 2.0 | $7.25 | $14.50

TOTAL | n/a | 2,692 | 13 | 5,278 | 7.75 | 1,479 | n/a | $10,830.75

* These estimates are for the maximum number of respondents. For the impact data collection, this assumes no attrition between the pre- and post-intervention data collections.

**The estimates of response time are based on experience using similar instruments in other studies as well as on pilot testing of the impact instruments in July 2009. The hourly wage rates for respondents (SNAP-Ed Program recipients) were assumed to be equivalent to minimum wage. The minimum wage rates for each state were obtained from the U.S. Department of Labor, Employment Standards Administration website (http://www.dol.gov/esa/minwage/america.htm).

A.13 Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers

There are no capital/start-up or ongoing operation/maintenance costs for this information collection.

A.14 Annualized Cost to Federal Government

The total cost to the Federal government for the evaluation design, instrument development, respondent recruitment, data collection, data analysis, and Federal government review and oversight of the Models of SNAP-Ed and Evaluation Study is $2,631,334. The period of performance for the study is September 2008 through September 2011 (3 years); therefore, the annualized cost is $877,111.
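Expressed as a calculation, the annualized figure is the total study cost spread evenly over the 3-year period of performance:

\[
\frac{\$2{,}631{,}334}{3\ \text{years}} \approx \$877{,}111\ \text{per year}
\]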

A.15 Explanation for Program Changes or Adjustments

This submission to OMB is for a new information collection; it will result in a total of 1,570 burden hours.

A.16 Plans for Tabulation and Publication and Project Time Schedule

Plans for tabulation and publication are described in this section. Table A.16-1 provides the expected periods of performance for data collection, analysis, and reporting.

Table A.16-1. Anticipated Schedule for Data Collection, Tabulation, and Reporting

Activity | Period of Performance
Primary data collection | January 2010 – November 2010
Data analysis | April 2010 – May 2011
Preparation of final reports | June 2011 – September 2011

Analysis Plan for Impact Evaluation

The impact evaluation will measure the effect of exposure to each intervention on key outcome measures. For each program, we will assess whether exposure to the education program led to an increase in the consumption of fruits and vegetables. For programs aimed at young children, we will solicit information from parents or adult caretakers; for demonstration programs that include adults, we will solicit self-reports. For all demonstration programs, we will examine the following hypotheses:

  • H1: Individuals who participate in SNAP-Ed demonstration programs will increase their consumption of fruits and vegetables between baseline (pre-intervention) and follow-up (post-intervention) as compared to similar individuals who did not participate in SNAP-Ed demonstration programs.

  • H2: Individuals who participate in SNAP-Ed demonstration programs (or their caregivers) will report greater availability of healthy foods (e.g., fruits) and less availability of unhealthy foods (e.g., chips, sweetened carbonated beverages) between baseline (pre-intervention) and follow-up (post-intervention) as compared to similar individuals (or the caregivers of individuals) who did not participate in SNAP-Ed demonstration programs.

In addition, the following program-specific hypotheses will be addressed:

  • H3 (CNNS, UNV, NYSDOH): Caregivers of children participating in SNAP-Ed demonstration programs will report a greater increase in their children’s willingness to try new fruits between baseline (pre-intervention) and follow-up (post-intervention) than will caregivers of children not participating in SNAP-Ed demonstration programs.

  • H4 (CNNS, UNV, NYSDOH): Caregivers of children participating in SNAP-Ed demonstration programs will report an increase in the variety of fruits and vegetables consumed by their children between baseline (pre-intervention) and follow-up (post-intervention) as compared to caregivers of children not participating in SNAP-Ed demonstration programs.

  • H5 (NYSDOH): Individuals who participate in SNAP-Ed demonstration programs will show an increased likelihood of drinking skim or 1% milk between baseline (pre-intervention) and follow-up (post-intervention) as compared to similar individuals who did not participate in SNAP-Ed demonstration programs.

All hypotheses will be tested through the specification of multivariable regression models that include a dichotomous treatment indicator and control for potentially confounding influences. Hypothesis tests are two-tailed and designed to control Type I and Type II error rates. Part B, Estimation Procedures, provides additional information on the estimation, analysis, and hypothesis testing procedures for the impact evaluation of each demonstration project. A sketch of the general form such a model could take is shown below.
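As an illustrative sketch only (the exact model specifications are given in Part B and Appendix G; the notation here is an assumption for exposition, not the final model), a difference-in-differences style regression for a clustered pre/post design could take the form

\[
y_{ijt} = \beta_0 + \beta_1 T_j + \beta_2 P_t + \beta_3 (T_j \times P_t) + \boldsymbol{\gamma}'\mathbf{x}_{ij} + u_j + \varepsilon_{ijt},
\]

where \(y_{ijt}\) is the outcome (e.g., cups of fruits and vegetables consumed) for individual \(i\) in site \(j\) at time \(t\); \(T_j\) is the dichotomous treatment indicator for site \(j\); \(P_t\) indicates the post-intervention period; \(\mathbf{x}_{ij}\) is a vector of potentially confounding covariates; \(u_j\) is a site-level random effect capturing the clustering of participants within schools and centers; and \(\varepsilon_{ijt}\) is the individual-level error. Under this notation, the two-tailed test of \(H_0\colon \beta_3 = 0\) corresponds to testing whether the intervention changed the outcome relative to the control group.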

Analysis Plan for Process Evaluation

The process evaluation will be primarily qualitative in nature, with data collected via questionnaires or interviews as well as through abstraction and extraction from existing documentation. Quantitative data related to nutrition education dose and reach, as well as budgetary information, will also be collected. We will gather and analyze this information using a case study approach. For each demonstration program, we will examine the following hypotheses:

  • H1: Demonstration programs will be implemented as originally planned.

  • H2: Deviating from the planned implementation will affect the demonstration programs’ ability to achieve the intended outcome(s).

  • H3: Exposure to nutrition education outside the SNAP-Ed interventions will affect the demonstration programs’ ability to achieve the intended outcome(s).

  • H4: Participants’ satisfaction with and the cultural appropriateness of the nutrition education messages will affect the demonstration programs’ ability to achieve the intended outcome(s).

Analysis Plan for Assessment of IA-led Evaluations

The objectives of the assessment of the IA-led evaluations are to (1) describe how each IA evaluated the success of its intervention, (2) describe the results of each IA’s evaluation and how they compare with the FNS evaluation, and (3) describe lessons learned about each IA’s evaluation. We will use a case study approach to summarize the data collected in the pre- and post-implementation interviews with each IA’s evaluation manager, our review of the 2010 Annual SNAP-Ed reports from FNS (or similar reports describing the results of the IAs’ evaluations), and our completion of the IA-Led Evaluation Rating Form. Additionally, we will conduct analyses comparing the results of the FNS evaluation to those of each IA’s evaluation, examining the direction and magnitude of the intervention’s impact across the two evaluations. If there are differences, we will attempt to identify and explain their causes.

Publication of Results

The results of the impact and process evaluations and the assessment of the IA-led evaluations will be provided in a final report, Models of SNAP-Ed and Evaluation Study: Final Report. Upon completion, FNS will make the final report and an executive summary, Models of SNAP-Ed and Evaluation Study: Executive Summary, available on its Web site.

A.17 Reason(s) Display of OMB Expiration Date is Inappropriate

FNS plans to display the expiration date for OMB approval on all instruments.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the Certification for Paperwork Reduction Act (5 CFR 1320.9) for this study.
