Planning Local Evaluations as part of the Personal Responsibility Education Program (PREP): Promising Youth Programs (PYP)

Formative Data Collections for Policy Research and Evaluation

Template 1 Impact Evaluation Template

OMB: 0970-0356


PREP-PYP PREIS/TPREP Impact Evaluation Template MATHEMATICA POLICY RESEARCH


According to the Paperwork Reduction Act of 1995, an agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a valid OMB control number. The valid OMB control number for this collection is 0970-0356; this number is valid through 03/31/2018.  Public reporting burden for this collection of information is estimated to average 300 minutes, including the time for reviewing instructions, gathering and maintaining the data needed, and reviewing the collection of information. This collection of information is voluntary for individuals, but the information is required from Grantees.



PREIS / Tribal PREP Impact Evaluation Template

Note: Tribal PREP grantees would use this template if they plan to conduct rigorous impact evaluations. Some additional questions appropriate for tribal grantees would be added to the template.

Instructions: This template is intended to gather pertinent details about your evaluation, including research questions, study design, program details, sample characteristics, data collection plans, and other details related to the feasibility of the study.

Please complete this form to the best of your ability given the status of the evaluation plan, fleshing out and updating the plans laid out in your application, as applicable. The written plan will be used internally between Grantee, local evaluator, project officer, and your TA liaison as the basis for discussion on phone calls during the planning period. These discussions will be used to further develop the plans and provide additional clarification as needed to finalize a feasible design that meets the grant and study objectives. Grantee plans must be approved by your FYSB project officer before proceeding with implementation.

Impact Evaluation Overview



  1. Please list the research questions that will guide your impact evaluation. Please use concise language and frame these as questions rather than hypotheses.







  2. Please indicate on what public-facing registry the trial will be registered. (If not already determined, FYSB and OPRE encourage grantees to register their experiments at clinicaltrials.gov.)





  3. Are there any potential conflicts of interest (real or perceived) among the grantee organization, local evaluator, and/or curriculum developer? ___ yes ___ no



  4. If yes, please describe the conflicts and what steps will be taken to address them.







Program and Comparison Conditions

  5. Name of program:





  6. Please complete the table below to describe all the components of the program funded under the grant, including all Adulthood Preparation Subjects.

Column A: List each component that will be offered (including any group or individual sessions, service referrals, service learning, or other services).

Column B: For each component, describe the amount, duration, and intended dosage (e.g. 5 sessions over 3 weeks for a total of 15 hours of programming).

Column C: Briefly describe the content of each component.

Example provided in first row of the table.

A: Component | B: Amount, duration, intended dosage | C: Content
Classroom lessons | 5 sessions over 3 weeks for a total of 15 hours of programming | Lessons on contraceptive use and HIV prevention, decision making, and setting educational plans
 | | 
 | | 
 | | 

  7. For each component, please describe who will deliver or facilitate the component and the intended setting for offering the component.

Example provided in first row of the table.

A: Component | B: Who will deliver? | C: Setting
Classroom lessons | Trained facilitators – program staff who will travel between programs | After-school program
 | | 
 | | 
 | | 

  8. Has your logic model changed since the grant application? ___ yes ___ no

If yes, please provide a copy of the updated logic model.





  9. Is the program funded by this grant embedded as part of a larger set of services offered to youth (e.g., funding a new program offering in an after-school program)?

___ Yes ____ No

If yes, please describe how the funded program fits within the larger set of services.





  10. Please list any other services related to adolescent sexual health that are available to youth in the communities where your program will operate.





  11. Please describe the control/comparison group's experience.





  12. How will the comparison group's experience differ from the program group's?





  13. Are there any plans to offer the intervention to the control/comparison group at a future time point? If so, what is the timeline for those activities?





Youth Target Population and How They Will Be Enrolled and Retained



  14. Please describe the characteristics of the youth population you plan to serve with the program.

Age:



Grade level:



Race/ethnicity or tribe:



Gender:



Risk characteristics:



Other characteristics:







  15. What, if any, eligibility criteria will be used to identify youth for the program?







  16. Please provide the expected start and end dates for program and evaluation enrollment.


 | Program Enrollment | Evaluation Enrollment
Expected start date | | 
Expected end date | | 




Group Formation



  17. Will your evaluation use random assignment? ___ yes ___ no





If yes, will your evaluation randomly assign individuals or groups?

___ individuals ___ groups



  • If your evaluation will use random assignment (of individuals OR groups), please respond to #18-19:

  18. Please order the following activities to reflect the sequence in which they will occur in your study:

_____ obtaining informed consent from participants

_____ conducting baseline data collection

_____ conducting random assignment

_____ notifying participants of their assigned condition

_____ beginning to provide services



  19. Please describe the process of random assignment. (An illustrative stratification sketch follows part e below.)

a. Who will conduct random assignment?



b. Will groups be stratified in any way to ensure balance between treatment and control? If yes, what characteristics and methods will be used?



c. What strategies will be used to ensure there is no re-assignment or non-random assignment to condition?



d. What strategies will be used to prevent contamination between treatment groups (e.g., strategies to assign siblings or friends)?



e. How and when will participants be informed of their treatment status/assignment?
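For grantees considering part b, the snippet below is a minimal illustrative sketch, not a required or endorsed procedure, of one way an evaluator might carry out stratified individual-level random assignment. The roster fields, the stratifying variable ("school"), the seed value, and the example IDs are all hypothetical; the same logic applies to any stratifying characteristic.

```python
# Illustrative sketch only; all names (roster fields, the "school" stratum,
# the seed, and the example IDs) are hypothetical, not part of the PREP template.
import random
from collections import defaultdict

def stratified_assignment(roster, stratum_key="school", seed=20180331):
    """Randomly assign each youth to treatment or control within strata,
    splitting every stratum as evenly as possible."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible and auditable
    strata = defaultdict(list)
    for youth in roster:
        strata[youth[stratum_key]].append(youth["youth_id"])

    assignments = {}
    for ids in strata.values():
        rng.shuffle(ids)              # random order within the stratum
        cut = len(ids) // 2           # even split; control receives the extra youth if odd
        for youth_id in ids[:cut]:
            assignments[youth_id] = "treatment"
        for youth_id in ids[cut:]:
            assignments[youth_id] = "control"
    return assignments

if __name__ == "__main__":
    example_roster = [
        {"youth_id": "A01", "school": "North"},
        {"youth_id": "A02", "school": "North"},
        {"youth_id": "B01", "school": "South"},
        {"youth_id": "B02", "school": "South"},
    ]
    print(stratified_assignment(example_roster))
```

Whatever tool is used, documenting the seed, the date of assignment, and who ran the procedure helps demonstrate that no re-assignment occurred (part c).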





  • If your evaluation will randomly assign groups (clusters), please respond to #20:



  20. Will clusters be re-assigned to condition during the evaluation? ___ yes ___ no

If yes, when and under what circumstances?







  • If your evaluation will not use random assignment, please respond to #21:

  21. Please describe the methods you will use to match participants on key characteristics, what those characteristics will be, and how and when they will be obtained.









Sample Size, Recruitment, and Retention



  22. Please complete the table below to describe your expected enrollment into the evaluation sample, by treatment condition. (A worked example of how the rows combine follows the table notes.)

Row A: Indicate the number of cohorts to be enrolled over the evaluation period. If you have a program that enrolls continuously, enter N/A.

Row B: Indicate the number of groups (clusters) participating in each cohort. If this is an individual-level RCT, enter N/A.

Row C: Indicate the number of youth to be enrolled in each cluster, for each cohort. (An average estimate is acceptable.)



Unit | Treatment | Control
A. Number of cohorts to be enrolled | | 
B. Number of clusters to be enrolled per cohort | | 
C. Number of youth to be enrolled per cluster per cohort | | 

Notes: If there are more than two study groups, please add a column for each additional group and label it accordingly.
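As a purely hypothetical illustration of how the rows combine: a cluster design that enrolls 2 cohorts, with 10 clusters per cohort and an average of 20 youth per cluster, would contribute roughly 2 × 10 × 20 = 400 youth to that study condition. An individual-level RCT would simply multiply the number of cohorts by the number of youth enrolled per cohort.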



  23. If the number of expected youth to be enrolled varies between the program and evaluation, please describe why the enrollment levels vary.





  24. Please describe the strategies you will use to recruit implementation sites / partners. If you have already begun to recruit partners, please describe the status of those partnerships.







  25. Will your partners or implementation sites be responsible for recruiting study participants? ___ yes ___ no



If yes, please complete the table below by listing each partner/site and the number of youth they are expected to recruit. Add additional rows as needed.



Partner Organization | Enrollment Target
 | 
 | 
 | 

If your partners will be recruiting study participants, please respond to #26:

  26. Please describe how you will collaborate with partners to ensure they meet their enrollment targets.









  27. Please describe any anticipated challenges related to reaching the intended youth population.





  28. Please describe the strategies you will use to recruit participants into the study and how the strategies will address the recruitment challenges you anticipate.







  29. Please describe the procedures you will use to collect consent from study participants, or consent from their parent or guardian and assent from the participant (if needed).







  30. Please describe how you will engage youth and retain them in the program.







  31. Please describe the strategies you will use to track and retain youth enrolled in the study, including any incentives used and the data collected for maintaining contact with youth.







  32. What is the anticipated response rate for each round of data collection? For randomized controlled trials, use the randomly assigned sample as the denominator. For quasi-experimental studies, use the baseline sample as the denominator. (A worked example follows the list below.)

_____ Consent (n/a for QEDs, must be 100%)

_____ Baseline (Wave 1) (n/a for QEDs, must be 100%)

_____ Post-program (Wave 2)

_____ Short-term follow-up (Wave 3)

_____ Long-term follow-up (Wave 4)
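As a hypothetical illustration of the denominators: if 500 youth are randomly assigned and 425 of them complete the Wave 3 survey, the Wave 3 response rate is 425 / 500 = 85 percent, regardless of how many youth actually attended program sessions.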



  33. Please provide your power analyses for two key outcomes. Use the table below to report the assumptions used in your power calculations, as well as the resulting minimum detectable impact for your targeted outcomes. You can use the power calculator here to assist you. (An illustrative worked formula follows the two tables.)

Outcome 1

Name of outcome of interest


Is outcome binary or continuous?


Level of significance (typically 0.05)


# sides of test (ideally two-tailed)


Power (typically 80 percent)


Total number of individuals contributing to impact analysis (sample size after expected non-response for survey wave analyzed)


Probability of assignment to treatment group


If binary outcome, enter mean of outcome variable


If continuous outcome, enter the standard deviation of the outcome (>0)


Proportion of individual-level (or within-group) variance of outcome explained by covariates


For cluster RCTs: the intraclass correlation coefficient (ICC)


For cluster RCTs: proportion of group-level variance of outcome explained by covariates


Minimum detectable impact (MDI)


Minimum detectable effect size (MDES)




Outcome 2

Name of outcome of interest


Is outcome binary or continuous?


Level of significance (typically 0.05)


# sides of test (ideally two-tailed)


Power (typically 80 percent)


Total number of individuals contributing to impact analysis (sample size after expected non-response for survey wave analyzed)


Probability of assignment to treatment group


If binary outcome, enter mean of outcome variable


If continuous outcome, enter the standard deviation of the outcome (>0)


Proportion of individual-level (or within-group) variance of outcome explained by covariates


For cluster RCTs: the intraclass correlation coefficient (ICC)


For cluster RCTs: proportion of group-level variance of outcome explained by covariates


Minimum detectable impact (MDI)


Minimum detectable effect size (MDES)


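The following is an illustrative sketch of one common large-sample approximation for filling in the tables above, shown for an individual-level RCT with a two-tailed test. It is not a prescribed method, and the numeric values are hypothetical.

$$\mathrm{MDES} \approx \left(z_{1-\alpha/2} + z_{1-\beta}\right)\sqrt{\frac{1 - R^{2}}{p\,(1-p)\,N}}$$

Here N is the number of individuals contributing to the impact analysis, p is the probability of assignment to the treatment group, and R² is the proportion of outcome variance explained by covariates. With α = 0.05 (two-tailed), 80 percent power (z terms of roughly 1.96 and 0.84), N = 800, p = 0.5, and R² = 0.20, the MDES is approximately 2.80 × sqrt(0.80 / (0.25 × 800)) ≈ 0.18 standard deviations. The MDI is the MDES multiplied by the outcome's standard deviation for a continuous outcome, or by sqrt(μ(1 - μ)) for a binary outcome with mean μ. For cluster RCTs, the term under the square root is inflated to reflect the ICC and the number of clusters, which is why the tables request those inputs separately.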












Outcomes and Data Collection

  34. Please fill in the table below for each of your outcome measures. The table is pre-populated with eight core measures that all impact evaluations will be required to collect in all four waves of data collection. (More guidance on core measures will be provided to grantees in the near future.) (Note to OMB: OMB approval for the core measures will be sought under a different OMB package. ICR number is forthcoming.) Please add rows for any other outcomes to be evaluated as needed.

Outcome | Measure | Assessed at baseline? | Assessed post-program? | Assessed at short-term follow-up? | Assessed at long-term follow-up?
Core measures | | | | | 
Sexual initiation / activity (vaginal) | Ever engaged in sex | X | X | X | X
Recent sexual activity (vaginal) | Vaginal sex in past 3 months | X | X | X | X
Birth control (recent risky sexual activity) | Vaginal sex in past 3 months with contraceptives (excluding condoms) | X | X | X | X
Condom use (recent risky sexual activity) | Vaginal sex in past 3 months with condoms | X | X | X | X
Unprotected sex (recent risky sexual activity) | Unprotected sex (no contraceptives/condoms) in past 3 months | X | X | X | X
Recent sexual activity (oral) | Oral sex in past 3 months | X | X | X | X
Pregnancy | Ever been pregnant/caused pregnancy | X | X | X | X
Adult communication | Communication with caring adult | X | X | X | X
Other behavioral measures | | | | | 
 | | | | | 
 | | | | | 
Other non-behavioral measures – include sexual intentions, knowledge of pregnancy prevention strategies, or adulthood preparation subjects | | | | | 
 | | | | | 
 | | | | | 


  35. Please fill in the table below for each wave of data collection.

Example provided in first row of the table.

Wave of data collection | Timing of data collection (since end of intervention) | Method(s) of data collection (e.g., pen-and-paper survey, web-based survey, in-person interview, etc.) | Who will be responsible for data collection | Will methods and/or data collection procedures differ by study group?
Immediate post-program | Within two weeks of intervention's end | In-person group administration of a paper survey in schools (two attempts at each school), with web-based follow-up for non-respondents. We will send text messages to youth with a link to the survey. | Program and evaluation staff | Program staff administer the paper surveys in treatment schools; the evaluator contacts the control group to complete the web-based survey.
Baseline (Wave 1) | At baseline | | | 
Post-program (Wave 2) | | | | 
Short-term follow-up (Wave 3) | | | | 
Long-term follow-up (Wave 4) | | | | 




  36. Are data sharing agreements necessary? ___ yes ___ no

If yes, what is the status of those agreements?







  37. Please indicate which data you think will be the most challenging to collect and why, and describe what strategies you will put in place to address those challenges.







Implementation/Process Study



  38. Please list the research questions you will use for the implementation study. Please use concise language and frame these as questions rather than hypotheses.









  39. Please complete the following two tables.

In the first table, list all research questions described in question #38 in the column headings, then list all planned data sources in the rows under the data source heading. Then mark which data sources will be used to address which research questions.



Example provided in the first row of each table.

Data Source | Did youth receive the intended dose of the program? (example) | (additional research questions as column headings)
Attendance data | X | 
 | | 
 | | 
 | | 

In the second table, identify all data sources, who will provide the data for each source, and when and how data collection for the source will occur.



Data Source | Respondent | Timing/periodicity of data collection | Data collection mode
Attendance data | Program staff | After each group session | Self-administered
 | | | 
 | | | 
 | | | 


