Appendix C-Tribal PREP Analysis Plan Instructions


Formative Data Collections for ACF Program Support


OMB: 0970-0531


FYSB Personal Responsibility Education Program (PREP): Promising Youth Programs (PYP)

Instructions for Tribal PREP Implementation and Outcomes Evaluation Analysis Plan

This template is designed to help evaluators plan and write evaluation reports using high-quality methods. An analysis plan developed before any analyses begin demonstrates an evaluator's commitment to objectivity and to not being swayed by the data in hand; in other words, it reflects a pre-specified, systematic, and scientific approach. The analysis plan both assures funders, program staff, tribal communities, and potential skeptics that the outcomes and the approaches to analyzing them were selected in advance, and it justifies those approaches. The analysis plan also serves as a road map for the planned analyses, which is helpful if key project staff change. We encourage evaluators to share analysis plans with a colleague who can carefully review the methods and approaches.

Descriptive evaluations of Tribal PREP programs include implementation (or process) studies and outcomes studies. An implementation study aims to answer questions about how the program was delivered in a particular context. Generally, an outcomes study seeks to answer questions about whether and how the outcomes for program participants changed during the program.

Both studies might also investigate questions related to the cultural relevance or significance of the program, including any adaptations that were made to address the needs of the communities involved in the projects. To the extent that your project aims to answer questions like these, be sure to include the relevant information throughout the analysis plan. For example, if you plan to measure how well certain program adaptations were implemented, be sure the program description includes those adaptations.

The analysis plan template is organized as follows. Section A discusses the elements needed for the analysis plan for the implementation study. Section B discusses the elements needed for the analysis plan for the outcomes study. Within each section, subsections 1–3 revisit proposed research questions, program characteristics, and the study design. Subsection 4 provides the blueprint for the analyses that will be conducted for each evaluation. Your responses should reflect your approved evaluation plan and research questions but may include additional details and information. Highlight any changes from the approved evaluation plan and research questions. The general structure of the template should work for all grantees regardless of the details of their evaluations, but some grantees might need to adapt some subsections to fit their design.

Please email your analysis plan to your Federal Project Officer (FPO) and copy your rigorous evaluation technical assistance (RETA) liaison by [date]. For consistency, use this naming convention when submitting your analysis plan: [Grantee Name] Evaluation Analysis Plan. If you have made substantial changes to your evaluation (for instance, to the sample size, eligible population, or data collection plan) since your last abstract submission, submit an updated abstract as well, with tracked changes. The latest version of your abstract is available in your grantee SharePoint folder, in the Abstract subfolder. Your FPO and RETA liaison will review the analysis plan, along with the latest version of your abstract for context, and will return it to you with comments and suggested edits for revision. FYSB would like your FPO to be able to approve your analysis plan and updated abstract (if applicable) by [date] so that you can begin work on the final report.

A. Implementation evaluation

1. Research questions

The research questions articulate the main hypotheses of your implementation evaluation. Research questions can refer to implementation elements such as fidelity, dosage, quality of implementation, engagement, and context. Use the research questions from your approved evaluation plan and make note of any revisions to those questions. Present the research questions in a table, organized by implementation elements. Table 1 provides an example.

Implementation elements:

  • Fidelity: Whether the program (and each of its activities, if applicable) was delivered as intended

  • Dosage: The amount of programming (or activities, if applicable) youth received

  • Quality: How well the program was delivered to youth

  • Engagement: Whether youth engaged in program services, and if so, how engaged they were

  • Context: Other types of services available to program participants, external events, or unplanned adaptations that might have affected implementation.

Table 1. Examples of research questions for each implementation element

Fidelity
  • Were all intended program activities offered and for the expected duration?
  • What content did the youth receive?
  • Who delivered services to youth?
  • What were the unplanned adaptations to key program components?

Dosage
  • How often did youth participate in the program on average?

Quality
  • What was the quality of staff–participant interactions?

Engagement
  • How engaged were youth in the program?

Context
  • What other pregnancy prevention programming was available to study participants?
  • What external events affected implementation?



2. Program description

Use a table to clearly and succinctly summarize your program’s components. See Tables 2 and 3 for examples (sample text included in italics). If you are implementing different program models or curricula with different youth (for example, Curriculum A is offered to 6th graders and Curriculum B is offered to 8th graders), repeat the content in this section for each program model or curriculum being evaluated as part of your implementation study. You can use text from your approved evaluation plan or abstract to complete the tables. These definitions offer guidance on what to include in each section of the tables.

  • Intended program activities. List each of the key structural elements of the program (for example, group classes, one-on-one services) (see Table 2 for an example).

  • Intended program content. For each activity, list the topics to be covered and the resources/materials provided. Also, indicate whether specific content is based on a particular curriculum. Include any modifications planned before the program was implemented and the reasons for modification. Include a list of the adulthood preparation subjects (APS) topics covered in each activity.

  • Planned dosage and implementation schedule. Describe the number of sessions and the duration of each component of the program. Include the length of each session and how frequently sessions occur. Describe variation in the frequency or length of sessions across sites, if applicable.

  • Intended delivery (location and setting). Describe where the program component takes place.

  • Characteristics, education, and training of staff. Provide information on the required education, gender, cultural background, and hiring requirements of the providers/facilitators of each component, and on the training and technical assistance activities planned for providers to maintain fidelity.

Table 2. Description of intended program activities for [Program Name]

Workshop
  Curriculum and content: Youth Pride curriculum
    • Strategies to prevent HIV infection
    • Consequences of sex: Pregnancy
    APS topics: Adolescent development
  Dosage and schedule: 8 hours, one hour per week
  Delivery: Group workshop in classrooms

Informational text messages
  Curriculum and content: Reminders from lessons on strategies to prevent HIV infection
    APS topics: Adolescent development
  Dosage and schedule: 12 text messages sent over one year
  Delivery: Text messages sent by a centrally managed service with reminder facts from lessons



Table 3. Staff training and development to support program activities for [Program Name]

Workshop
  Education and initial training of staff: Facilitators are male and female, hold at least a bachelor's degree, and receive four days of initial training.
  Ongoing training of staff: Facilitators receive a half-day of semiannual refresher training in the program's curricula from study staff.

Informational text messages
  Education and initial training of staff: N/A
  Ongoing training of staff: N/A

3. Evaluation design

The evaluation design should explain the sample formation, including the consent process, data sources, and outcomes, for your implementation evaluation. Aim to provide a detailed account of your methods, sufficiently transparent so that someone could replicate them. Include the following sections:

a. Sample formation

Describe the sample you will use to answer each research question. For example, you might have interviewed staff about curriculum delivery or conducted focus groups of a subset of participants in the program. For each sample, provide any required characteristics for sample inclusion (for example, age, AI/AN status, attending a particular school, geographical area).

Provide this information for each of the samples used to address research questions for the implementation study.

b. Data sources

Describe the data sources you will use to answer the research questions (for example, youth surveys, fidelity protocols, attendance logs). Describe the timing and frequency of each data collection effort (for example, during all sessions, once a week, annually), and the party responsible for collecting the data. Use a table to link the information on data collection to the research questions. See Table 4 for an example of such a table (sample text included in italics). Finally, please provide a copy of your data collection instruments in an appendix to your analysis plan (at a minimum, provide the instruments that you used to collect the data you will use for your implementation study).

Table 4. Examples of data for addressing the research questions

Fidelity: Were all intended program activities offered and for the expected duration?
  Data source: Fidelity tracking log or protocol; session observations
  Timing/frequency of data collection: Every session for fidelity tracking; two times a year for session observations
  Party responsible for data collection: Program staff for fidelity tracking; study staff for session observations

Fidelity: What content did the youth receive?
  Data source: Fidelity tracking log or protocol; attendance logs; session observations
  Timing/frequency of data collection: Every session for fidelity tracking and attendance logs; two times a year for session observations
  Party responsible for data collection: Program staff for fidelity tracking and attendance logs; study staff for session observations

Fidelity: Who delivered services to youth?
  Data source: Staff applications; hiring records; training logs
  Timing/frequency of data collection: One time X months after start of implementation; annually
  Party responsible for data collection: Program staff

Fidelity: What were the unplanned adaptations to key program components?
  Data source: Adaptation request; work plan; 6-month progress report; annual progress report
  Timing/frequency of data collection: Annually; ad hoc
  Party responsible for data collection: Program staff; study staff

Dosage: How often did youth participate in the program on average?
  Data source: Workshop sessions and individual service contacts; attendance logs
  Timing/frequency of data collection: All sessions delivered
  Party responsible for data collection: Program staff

Quality: What was the quality of staff–participant interactions?
  Data source: Observations of interaction quality, using protocol developed by study staff
  Timing/frequency of data collection: X percent of sessions selected at random for observation
  Party responsible for data collection: Study staff

Engagement: How engaged were youth in the program?
  Data source: Observations of engagement, possibly using an engagement assessment tool; ratings from facilitator fidelity logs; engagement ratings from participant satisfaction surveys
  Timing/frequency of data collection: Y percent of sessions selected at random for observation
  Party responsible for data collection: Study staff

Context: What other pregnancy prevention programming was available to study participants?
  Data source: Interviews with staff from partnering agencies in the community; survey items on baseline and follow-up assessments; websites of other agencies in the community providing programming
  Timing/frequency of data collection: Once a year; ad hoc
  Party responsible for data collection: Study staff

Context: What external events affected implementation?
  Data source: Interviews with community/county representatives; list of site/school closures
  Timing/frequency of data collection: Once a year; ad hoc
  Party responsible for data collection: Study staff

c. Measurement

Describe the specific measures you will construct to address each research question. Use a table to link the description of the measures to the research questions. Table 5 presents an example of such a table (sample text included in italics). Define each measure being examined in the implementation evaluation using data sources included in Table 4. For each measure, briefly explain how you will construct it (that is, survey item source or, if multiple item sources, how those items will be combined). In addition, describe any targets you pre-specified and used, if applicable, to assess how well the program was implemented relative to program or developer standards.

Table 5. Examples of measures for addressing the research questions

Fidelity: Were all intended program activities offered and for the expected duration?
  Measures:
    • Total number of sessions delivered
    • Average session duration, calculated as the average of the recorded session lengths (in minutes)
  Targets:
    • 95 percent of groups to receive all 12 sessions
    • Average session duration will be at least 40 minutes

Fidelity: What content did the youth receive?
  Measures:
    • Total number of topics covered, calculated as the average of the total number of topics checked by each program facilitator in the daily fidelity tracking log or protocol
  Targets:
    • 95 percent of groups to receive 90 percent of the topics

Fidelity: Who delivered services to youth?
  Measures:
    • Number and type of staff delivering services to study participants, such as the number of session facilitators
    • Percentage of staff trained, calculated as the number of staff who were trained divided by the total number of staff who delivered the program
  Targets:
    • Three full-time health educators will deliver programming
    • All health educators to receive at least 20 hours of training each year

Fidelity: What were the unplanned adaptations to key program components?
  Measures:
    • List of unplanned adaptations, such as a change in setting, sessions added or deleted, and components cut
  Targets:
    • n/a

Dosage: How often did youth participate in the program on average?
  Measures:
    • Average number (or percentage) of sessions youth attended
    • Percentage of the sample attending the required or recommended proportion of sessions
    • Percentage of the sample that did not attend sessions at all
  Targets:
    • n/a
    • 75 percent of youth to attend 75 percent of the program sessions
    • Less than 5 percent of the sample receives none of the program

Quality: What was the quality of staff–participant interactions?
  Measures:
    • Percentage of sessions with high quality interactions, calculated as the percentage of observed interactions that study staff scored as “high quality”
  Targets:
    • 90 percent of observed sessions to be implemented with high quality (rated as a 3.5 out of 4 on the quality scale)

Engagement: How engaged were youth in the program?
  Measures:
    • Percentage of sessions with moderate participant engagement, calculated as the percentage of sessions in which study staff scored participants’ engagement as “moderately engaged” or higher
    • Average engagement rating, calculated as the average of engagement scale scores (ranging from 1–5, for example) across satisfaction surveys
  Targets:
    • 90 percent of observed sessions to be implemented with moderate to high engagement
    • n/a

Context: What other pregnancy prevention programming was available to study participants?
  Measures:
    • Percentage of the sample receiving pregnancy prevention programming from other providers, constructed from clients’ survey data on experiences outside of the current program
    • List of pregnancy prevention programming available to study participants outside of the current program, as described on the websites from other agencies in the community
  Targets:
    • Less than 20 percent of youth to receive formal content outside of the program

Context: What external events affected implementation?
  Measures:
    • Percentage and total number of anticipated study participants missed due to community issues, if any
  Targets:
    • n/a



4. Analysis methods

This section should lay out the approach you will use to answer the research questions, including how you will clean and analyze the data.

a. Data preparation

Describe how you will clean and prepare the data for analysis. For quantitative data, include information on how you will handle missing or inconsistent data. For qualitative data, include information on your approach to examining and summarizing data from interviews, focus groups, and observations. Provide the name of any software used to organize and code data. Provide details on the coding system you will use, who will code the data, and how you will ensure consistency across coders.
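
For quantitative implementation data, a minimal sketch of this kind of cleaning in Python (pandas) might look like the following. The file names, column names, and cleaning rules are hypothetical placeholders, not a required approach; substitute your own instruments and decision rules.

```python
import numpy as np
import pandas as pd

# Hypothetical input files and column names; replace with your own instruments.
attendance = pd.read_csv("attendance_log.csv")   # one row per youth per session
fidelity = pd.read_csv("fidelity_log.csv")       # one row per session

# Drop duplicate records (for example, a session logged twice by mistake).
fidelity = fidelity.drop_duplicates(subset=["site_id", "session_id"])

# Handle inconsistent values: session lengths outside a plausible range are set to missing.
implausible = ~fidelity["session_minutes"].between(10, 120)
fidelity.loc[implausible, "session_minutes"] = np.nan

# Document how much data is missing before deciding how to handle it.
missing_report = fidelity[["session_minutes", "topics_covered"]].isna().mean()
print("Share of sessions with missing values:\n", missing_report)

# One possible rule: keep sessions with a recorded length, and report how many are dropped.
analysis_fidelity = fidelity.dropna(subset=["session_minutes"])
print(f"Dropped {len(fidelity) - len(analysis_fidelity)} of {len(fidelity)} sessions.")
```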

b. Analytic approach

Describe how you will analyze measures to answer the research questions. For example, for quantitative data, describe whether you will calculate averages, percentages, and frequencies using the data collected and whether you will compare them to established targets for implementation. For qualitative data, this may include describing how you will identify common themes from interview data, or identify lessons learned from focus groups conducted with youth or staff.
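
As a purely illustrative sketch, quantitative implementation measures could be summarized and compared with pre-specified targets like those in Table 5. The files, column names, and targets below are hypothetical placeholders.

```python
import pandas as pd

attendance = pd.read_csv("attendance_log.csv")  # hypothetical: one row per youth per session
fidelity = pd.read_csv("fidelity_log.csv")      # hypothetical: one row per session

# Dosage: share of youth attending at least 75 percent of the 12 intended sessions.
sessions_per_youth = attendance.groupby("youth_id")["session_id"].nunique()
pct_attending_75 = (sessions_per_youth >= 0.75 * 12).mean() * 100
print(f"Youth attending at least 75% of sessions: {pct_attending_75:.1f}% (target: 75%)")

# Fidelity: average session duration against a 40-minute target.
avg_duration = fidelity["session_minutes"].mean()
print(f"Average session duration: {avg_duration:.1f} minutes (target: at least 40)")

# Fidelity: share of groups receiving all 12 sessions against a 95 percent target.
sessions_per_group = fidelity.groupby("group_id")["session_id"].nunique()
pct_full_dose_groups = (sessions_per_group == 12).mean() * 100
print(f"Groups receiving all 12 sessions: {pct_full_dose_groups:.1f}% (target: 95%)")
```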

B. Youth outcomes evaluation

1. Research questions

The research questions state the main hypotheses for the outcomes evaluation. Use the research questions from your approved evaluation plan and make note of any revisions to those questions. You can present the research questions in this section as a list. If you have different research questions for different samples or subgroups, make that clear in the presentation. For example, you might be interested in different outcomes for middle school youth in your study than for high school youth. Present the research questions in a table, organized by outcome. Table 6 provides an example.

Table 6. Examples of research questions for each outcome

Refusal skills: What percentage of youth reported an increase in refusal skills from pre-test to post-test?

Cultural connectedness: What percentage of youth reported an increase in their reported connection to their tribal community from pre-test to post-test?



2. Program description

Specify which program or programs described in Section A.2 are part of your outcomes evaluation, and for which samples.

3. Evaluation design

The evaluation design section should explain the design overview, target population, enrollment and consent process, data collection, and outcomes for your evaluation. Aim to provide a detailed account of your methods, sufficiently transparent so that someone could replicate them.

a. Overview of design

As part of the overview for this section, briefly describe the research design used in the outcomes evaluation (for example, comparing people over time using a pre-post approach with linked data or comparing groups over time using a pre-post approach with unlinked data).

b. Target population

Describe the characteristics of the youth to be served by the project, including the study setting, context, and eligibility criteria. Include information about the type and number of sites that you aimed to enroll (for example, the number of schools, classrooms, or clinics, and the target location and grade) and the total number of youth intended to enroll in the evaluation. Broadly describe the intended characteristics of the target sample (for example, youth at risk for teen pregnancy or STIs due to high rates of sexual initiation). Describe any required characteristics for sample inclusion (for example, age, grade level, attending a particular school, geographical area, AI/AN status). You can expand on text from your approved evaluation plan and from the “sample and setting” section of the approved evaluation abstract.

c. Enrollment and consent process

Describe the process for enrolling the sample and obtaining consent. Include information on recruitment and enrollment procedures, such as the number and type of staff who were responsible for recruiting youth and details about each type of recruitment strategy used (for example, flyers home to parents, referrals from social service agencies, direct outreach, advertisements), how you formed partnerships with the sites, how the sites identified eligible participants, and how you enrolled the youth into the evaluation and program. Also, include the time frame for recruitment and enrollment in the evaluation. In addition, describe the timing, process, and materials used during the consent process (such as the consent forms, incentives, and so on). Provide the start date of sample enrollment and the estimated date (month and year) for completing sample enrollment.

d. Data collection

This section should be a high-level overview of the data sources and data collection methods you will use for the research questions your study will address. Describe the timing of each data collection point (for example, at enrollment, at the first workshop X weeks after enrollment, at the final workshop Y weeks after the first workshop). Describe the modes and methods of collecting data at each data collection point (for example, in-person paper surveys, online surveys, phone interviews). Provide estimated dates (month and year) for starting and completing each planned data collection effort. Attach the survey instrument(s) as an appendix to the submitted plan.

Use text from your approved evaluation plan or evaluation abstract, and use a table to link data sources to research questions. See Table 7 for a sample table. You do not need to explain how you will construct specific measures in this section.



Table 7. Sources of data to address the research questions

Research question 1
  Data source: Program participants
  Timing of data collection: At the first lesson
  Mode of data collection: In-person hard-copy survey
  Start and end date of data collection: September 2018 through March 2020

Research question 4
  Data source: Program participants
  Timing of data collection: Six months after the last lesson
  Mode of data collection: Telephone interview
  Start and end date of data collection: September 2019 through March 2021



e. Outcomes

This section will discuss the specific outcome measures you will use to answer the research questions. For each outcome being examined in the outcomes evaluation, briefly explain how you will construct each outcome measure (that is, survey item source or, if multiple item sources, how those items will be combined). If you will construct measures from multiple questions, describe the survey items you will use to create each construct and how you will code them to create the measure. Use the outcomes provided in the “outcomes study” section of the approved evaluation plan.

Use the table shell provided in Table 8 to describe all measures that will answer the research questions. Describe how you are constructing each outcome measure and the time periods when you will measure outcomes. In the “outcome name” column, include the name of the outcome that will be used. In the “source item” column, provide the data source for the measure. In the “constructed measure” column, describe the outcome and how it is constructed. If the outcome is from a published measure, provide the name of the measure. In the “timing of measure” column, indicate when the outcome will be measured, relative to the end of the program.

Table 8. Outcomes used to answer the research questions

Refusal skills
  Source item: “Would you say that being in the program has made you more likely, about the same, or less likely to… resist or say no to peer pressure?”
  Constructed measure: Dichotomous variable coded as 1 if the youth answered “more likely” and 0 otherwise.
  Timing of measure: At post-test (immediately after the program ends)
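
As an illustration only, the recode described in Table 8 might be implemented as follows; the survey file and item name (q12_refusal) are hypothetical placeholders.

```python
import numpy as np
import pandas as pd

post = pd.read_csv("post_survey.csv")  # hypothetical post-test survey file

# Code refusal skills as 1 if the youth answered "more likely", 0 for any other answer,
# and missing when the item was skipped.
post["refusal_skills"] = np.where(
    post["q12_refusal"].isna(),
    np.nan,
    (post["q12_refusal"] == "more likely").astype(float),
)
```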



4. Analysis methods

This section should lay out the approach you will use to examine youth outcomes, including data preparation and statistical methods. Note that we provide a recommended approach to constructing your analytic sample. If you choose to adopt this approach, address the issues discussed below in your analysis plan. If you propose an alternative approach to constructing your sample, such as addressing youth nonresponse or attrition through multiple imputation or nonresponse weights, carefully articulate your proposed approach for each section that follows.

Recommended analytic approach

In order to look at changes over time in youth attitudes, intentions, or knowledge, we recommend limiting your analytic sample to youth who have responded to both the pre- and post-program survey. Using this approach, you can examine how a person’s outcomes have changed over time, controlling for factors that might be associated with those changes (for instance, age or sex).

You will also need to deal with item nonresponse, when youth have skipped questions that you plan to incorporate in your analyses. We recommend dropping those youth from your analyses as well (that is, a complete case analysis), unless they are a sizable percentage of the sample. If so, you could consider imputing data using a process such as multiple imputation. Feel free to discuss options with your RETA liaison.
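
The sketch below shows, with hypothetical file, identifier, and item names, one way to implement this recommendation in Python: linking the pre- and post-program surveys, keeping complete cases, and reporting how much of the enrolled sample remains. It is an example, not a required approach.

```python
import pandas as pd

pre = pd.read_csv("pre_survey.csv")    # hypothetical baseline survey, one row per youth
post = pd.read_csv("post_survey.csv")  # hypothetical post-program survey, one row per youth

# Keep only youth who completed both surveys by linking on a study ID.
linked = pre.merge(post, on="youth_id", suffixes=("_pre", "_post"))

# Complete-case analysis: drop youth missing any item used in the analysis.
# The _pre/_post suffixes come from the merge; the item names are placeholders.
analysis_items = ["refusal_skills_pre", "refusal_skills_post", "age_pre", "sex_pre"]
analytic = linked.dropna(subset=analysis_items)

# Report how much of the enrolled sample remains, to judge whether imputation is needed.
print(f"{len(analytic)} of {len(pre)} baseline respondents are in the analytic sample "
      f"({100 * len(analytic) / len(pre):.1f} percent).")
```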

In addition to conducting your outcome analysis, we recommend a nonresponse analysis to assess how well the analytic sample represents your original enrolled population. Compare the profile (for example, baseline demographics and risk characteristics) of those in your analytic sample to the full sample who responded to the baseline survey by assessing differences in baseline means using a t-test of mean differences.
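
For example, a minimal version of this nonresponse check, assuming hypothetical files and baseline measures, could use a two-sample t-test from scipy (one of several packages that can run it).

```python
import pandas as pd
from scipy import stats

pre = pd.read_csv("pre_survey.csv")            # hypothetical full baseline sample
analytic = pd.read_csv("analytic_sample.csv")  # hypothetical linked pre-post sample

# Compare baseline characteristics of the analytic sample with the full baseline sample.
for item in ["age", "ever_had_sex"]:           # hypothetical baseline measures
    t_stat, p_value = stats.ttest_ind(
        analytic[item].dropna(), pre[item].dropna(), equal_var=False
    )
    print(f"{item}: analytic mean = {analytic[item].mean():.2f}, "
          f"baseline mean = {pre[item].mean():.2f}, p = {p_value:.3f}")
```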

Note: If you are unable to link pre- and post-program survey data for an individual, we recommend you aggregate the data to the smallest cluster possible (such as classroom-year) and look at differences within that cluster over time. When using cluster-level data, it will be important to give the reader a sense of the proportion of the youth in the program who are in your analytic sample (for example, 90 percent of youth in the served classrooms provided pre- and post-program data). Instead of a nonresponse analysis, describe how the sample analyzed might reflect the population that received the program. For example, report on the percentage of the enrolled sample who took the pre-test and the percentage who took the post-test. In addition, estimate the number of youth who completed the post-test who were not enrolled in the program at the time of the pre-test, and conversely, the number of youth who left the program after completing the pre-test but before completing the post-test.
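
If pre- and post-program data cannot be linked, the cluster-level fallback described in this note might be sketched as follows; the classroom-year identifier and outcome item are hypothetical placeholders.

```python
import pandas as pd

pre = pd.read_csv("pre_survey_unlinked.csv")    # hypothetical: cluster ID plus outcome items
post = pd.read_csv("post_survey_unlinked.csv")

# Aggregate each survey wave to the smallest available cluster (here, classroom-year).
pre_means = pre.groupby("classroom_year")["refusal_skills"].mean().rename("pre_mean")
post_means = post.groupby("classroom_year")["refusal_skills"].mean().rename("post_mean")

# Compare cluster-level means over time.
cluster = pd.concat([pre_means, post_means], axis=1)
cluster["change"] = cluster["post_mean"] - cluster["pre_mean"]
print(cluster.describe())
```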

a. Data cleaning

Describe how you will clean the baseline and post-program data and prepare them for analysis. For example, if you find a person has reported sexual activity at baseline and reported never having had sex at follow-up, what rules will you use to reconcile these responses?
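
One possible reconciliation rule, offered only as an example (your plan should state and justify its own rule), is that a report of ever having had sex cannot revert to never; the follow-up response is recoded to match the baseline report. The file and variable names below are hypothetical.

```python
import pandas as pd

linked = pd.read_csv("linked_surveys.csv")  # hypothetical file with pre and post items

# Example rule: "ever had sex" cannot change from yes at baseline to no at follow-up.
# Flag the inconsistency, then recode the follow-up response to match the baseline report.
inconsistent = (linked["ever_sex_pre"] == 1) & (linked["ever_sex_post"] == 0)
print(f"{inconsistent.sum()} inconsistent responses flagged for recoding.")
linked.loc[inconsistent, "ever_sex_post"] = 1
```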

b. Analytic sample

Indicate whether you will use a complete case analysis. If you will not (see “Recommended analytic approach” above), describe how you will construct your analytic sample.

c. Missing data

If you will use a complete case analysis, please state that. If you will impute missing data, rather than dropping cases that are missing their entire survey or are missing selected measures you will analyze, describe how you will impute data and account for the imputation in your analysis.

d. Analytic approach

Describe how you will analyze outcome measures to answer the research questions. For instance, explain whether you will describe summary statistics (for example, means or percentages) of outcome measures at the follow-up(s); compare means and other indicators at baseline and the follow-up(s) (that is, a pre-post analysis); conduct correlational analyses of the associations between implementation variables (such as dosage/participation) and outcome measures; or conduct growth-curve analyses. Provide the following information:

  1. Model specification. Describe the type of model you will use to examine outcomes (for example, linear regression, logistic regression). Define the criteria you will use to assess the statistical significance of the findings (for example, “findings are considered statistically significant based on p < .05, two-tailed test”). Specify the statistical software package you will use. An illustrative model specification appears in the sketch after Table 10.

  2. Covariates. Using the table shell provided in Table 9, list all potential covariates (control variables in the regression) you plan to include in the models you will use to examine the outcomes of interest. If you have not determined the covariates yet, describe a plan for determining the covariates you will include in your analyses.

Table 9. Covariates included in outcomes analysis

Age
  Description of covariate: Age (in years) as of the baseline data collection



  3. Nonresponse analysis. If following the recommended approach, list the measures you will examine to determine how similar the sample with post-program survey data is to the full sample with pre-program survey data. Discuss the method you will use to assess the magnitude or statistical significance of the differences. If you will not be using a complete case analysis and instead take a different approach to defining your analytic sample, describe how you will assess how similar the analytic sample is to the original sample.

  4. Subgroup analysis. If you plan on conducting any subgroup analyses, list those subgroups and discuss any differences in your analytic approach, if applicable. See Table 10 for a sample table.

Table 10. Subgroup analyses in outcomes analysis

Girls
  Outcome: Refusal skills
  Deviation from full-sample analytic methods: Same as full-sample analysis
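
To illustrate item 1 above, the sketch below fits a logistic regression of a post-test outcome on its baseline measure and covariates using statsmodels. The data file, variable names, and the choices of model, covariates, and significance threshold are hypothetical assumptions; your specification should follow your approved plan.

```python
import pandas as pd
import statsmodels.formula.api as smf

analytic = pd.read_csv("analytic_sample.csv")  # hypothetical linked, complete-case sample

# Logistic regression of the post-test outcome on the baseline measure and covariates.
model = smf.logit(
    "refusal_skills_post ~ refusal_skills_pre + age + C(sex) + C(grade)",
    data=analytic,
).fit()
print(model.summary())

# Flag terms that are statistically significant at p < .05 (two-tailed).
significant = model.pvalues[model.pvalues < 0.05]
print("Statistically significant terms:\n", significant)
```

A subgroup analysis (item 4) could rerun the same specification on the relevant subset of the analytic sample, for example the records where the hypothetical sex variable equals “female.”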





