Personal Responsibility Education Program (PREP) Multi-Component Evaluation

OMB: 0970-0398

Attachment A

Overview of the PREP Evaluation



Overview of the PREP Multi-Component Evaluation and Data Collection Efforts Previously Approved by OMB

The goal of the PREP Multi-Component Evaluation is to document how programs funded through the PREP grants program are designed and implemented in the field and to assess selected PREP-funded programs’ effectiveness. The evaluation will include three primary, independent study components: (1) the Design and Implementation Study (DIS), (2) the Performance Analysis Study (PAS), and (3) the Impact and In-Depth Implementation Study (IIS). Each study component is described in more detail below.

Design and Implementation Study (DIS). States must decide how to distribute grant funds to local sub-awardees, what program models to authorize, and how to support implementation. Understanding the range of decisions made to meet the funding requirements and how implementation unfolds can contribute to stronger future program design for pregnancy prevention and other programs.

Documentation of states’ PREP design decisions and implementation experiences will be based on two rounds of data collection. In spring 2012, the contractor began a Design Survey. The contractor reviewed documents outlining state PREP program plans and conducted telephone interviews with state PREP officials to learn more about what programming decisions had been made, and why.

  • The Design Survey conducted as part of the DIS received approval on March 7, 2012 (OMB Control #0970-0398). The data collection instrument was a discussion guide used with state administrators in states that received PREP state grant funding (i.e., “PREP State-Level Coordinators”). The survey focused on issues such as how program and policy decisions were made at the state level, the context in which the state is operating, how PREP programs are being integrated with other efforts to reduce teen pregnancy in the state, the relationship between the state and PREP sub-awardees, and how states intend to meet the legislative requirements of the funding.

In spring/summer 2014, the team will begin an Implementation Survey, which will focus on program implementation, including training and technical assistance provided, monitoring activities to ensure that programs are being replicated with fidelity, adaptations made to fit local contexts, and youth enrollment and retention. For the Implementation Survey, the contractor will conduct interviews with state PREP officials and selected sub-awardees. The information collected about sub-awardees through the Design Survey (including program type, region, urbanicity, experience, population size served, target population, and setting) will inform the selection of sub-awardees to be included in the Implementation Survey.

Performance Analysis Study (PAS). All PREP grantees will be required to submit standard information on program structure and program delivery. The performance measures will be aligned, to the extent possible, with those used by other federally funded teenage pregnancy prevention programs, and will also reflect the unique features of the PREP funding. These measures will help ACF understand whether the PREP objectives are being met and whether technical assistance may be needed to support program improvement.

  • Collection of Performance Analysis Study data for state and tribal grantees was approved on March 12, 2013 (OMB Control #0970-0398). Data collection from state and tribal grantees is currently underway.

  • This current ICR pertains to the Performance Analysis Study for the Competitive PREP (CPREP) grantees, and therefore this study component is described in more detail in Part A of this package.

Impact and In-Depth Implementation Study (IIS). In four or five sites (a site could be an entire state, local sub-awardees within a state, or a CPREP grantee), this component of the evaluation will provide rigorous estimates of program effectiveness on key outcomes, such as rates of sexual initiation and abstinence, contraceptive use, and teen pregnancy, and will include a detailed look at program delivery.

  • The Field Data Collection conducted as part of the IIS was approved on November 6, 2011 (OMB Control #0970-0398) to inform site selection for this component of the evaluation. The field data collection involves observing program activities and interviewing a range of experts and persons involved with the programs about various aspects of existing prevention programs, topics the experts view as important to address through evaluation, and the feasibility of conducting a random assignment study in a particular site. These data will be used to inform decisions about the types of programs to be evaluated in the IIS and which sites can support a rigorous study.

In each selected site, the impact study will be based on a random assignment design. For sites that can support random assignment and are selected for the impact and implementation study, the evaluation team will work collaboratively with grantees to develop a plan for either randomly assigning individuals to a group that will receive the program or to a control group, or randomly assigning program sites (such as schools, clinics, or group homes) to deliver the program or to serve as control sites. To assess the impacts of each program, the evaluation team will administer surveys to both groups of youth shortly before the programs begin, again about 8-12 months after the program ends, and then one year after that.1

  • The baseline survey of the IIS was approved on March 12, 2013 (OMB Control #0970-0398).

  • This current ICR pertains to the follow-up surveys, which will be administered after the programs end.1

In addition to this analysis of program impacts, the IIS component of the evaluation will also involve an In-Depth Implementation Study in each site to document and assess program implementation. The evaluation team will make multiple visits to the selected sites, conduct interviews with stakeholders and program staff, hold focus groups with program participants, and review program documents. Team members will also observe program sessions to document the quality of delivery and fidelity to the program models.

  • This current ICR pertains to the In-Depth Implementation Study interviews, staff survey, and youth focus groups, which occur while the program is ongoing. This study component is described in more detail in Part A of this package.

1 If a program lasts more than one year, the first follow-up survey will occur during programming.
