Strengthening the Implementation of Marriage and Relationship Programs (SIMR) - Analysis Plan Template

Formative Data Collections for ACF Program Support

Appendix F_HMRE_Implementation_Analysis Plan_Instructions

OMB: 0970-0531

Instructions for Implementation Study Analysis Plan Template for HMRE Award Recipients Conducting Impact Studies

Month, Day, Year

The Implementation Study Analysis Plan for Impact Studies

The Administration for Children and Families (ACF), Office of Family Assistance (OFA), requires that all Healthy Marriage and Relationship Education (HMRE) award recipients with local impact evaluations funded by OFA provide an implementation study analysis plan for their evaluations, in addition to an impact evaluation analysis plan (see the separate template and instructions). In any rigorous impact evaluation, it is important to tell a clear, succinct story about program implementation. The implementation findings (1) describe the services offered to and received by people in the treatment group and people in the control or comparison group, (2) contextualize the impact findings, and (3) generate hypotheses about why the program did or did not have a positive impact. Developing a structured implementation study analysis plan before examining the implementation data will foster an efficient and effective approach to analyzing the data and reporting the findings.

Please complete this implementation study analysis plan in addition to an impact evaluation analysis plan. This document provides instructions for completing the implementation study analysis plan. Award recipients must provide information on all required sections. Please do not structure your analysis plan to match the formatting of these instructions. Instead, please use the provided template (HMRE Implementation Analysis Plan Template).

Please email your implementation study analysis plan (together with your impact evaluation analysis plan) to your Federal Program Specialist (FPS) and copy your Evaluation Technical Assistant Partner (ETAP) liaison by [Insert DATE]. For consistency, please use this common file naming convention when submitting your implementation study analysis plan:

[HMRE Award Recipient Name] Implementation Analysis Plan.docx

Your FPS and ETAP liaison will review the analysis plan, provide comments and suggested edits, and return it to you for revision. Your FPS must approve the revised analysis plan.



Instructions for completing the implementation study analysis plan template

ACF expects that evaluators will complete the analysis plans, with program directors and program staff adding input as appropriate. Therefore, these instructions are mainly directed to evaluators and include a few technical terms. For many of the sections, evaluators can draw from the most recently approved evaluation plan.

For the implementation study analysis plan, please describe the research questions you aim to answer, the data you will use to answer those research questions, and the methods you will use to analyze the implementation data and describe the findings. To the extent possible, please use tables to briefly summarize the required implementation information.

The focus of this implementation study analysis plan is to measure the intervention services received by the intervention group and the alternative services and services similar to the intervention that were received by the control or comparison group during the evaluation period (from study enrollment through the final follow-up interview), based on the data you have collected through surveys and nFORM. The analysis plan should focus on the services evaluation participants received, which might be a subset of all services provided and populations served under the HMRE award. Please discuss any questions about the focus of the implementation study with your ETAP liaison and FPS.

A. Research questions

The research questions articulate the main hypotheses of your implementation study. Research questions can refer to implementation elements, such as fidelity, dosage, quality of implementation, engagement, and context, as the following examples show:

  • Fidelity: Were the intervention services and the control or comparison services (and each of their components, if applicable) delivered as intended?

  • Dosage: How much of the programming (or how many components, if applicable) did members of the intervention group and members of the control or comparison group receive?

  • Quality: How well were the services implemented or delivered to members of the intervention group and members of the control or comparison group?

  • Engagement: Did adults, couples, or youth in the intervention group and those in the control or comparison group engage in the provided services, and if so, how engaged were they?

  • Context: What other types of services are available to members of the intervention group and members of the control or comparison group? What external events or unplanned adaptations might have affected implementation of the intervention services and the control or comparison services as intended?

Table 1 lists examples (in italics) of research questions for each implementation element.

Table 1. Examples of research questions for each implementation element and study group


Intervention group questions

Fidelity

  • Were all intended intervention components offered and for the expected duration?

  • What content did the intervention group receive?

  • Who delivered services to the intervention group members?

  • What were the unplanned adaptations to key intervention components?

Dosage

  • How often did the intervention group participate in the intervention on average?

Quality

  • What was the quality of staff–participant interactions?

Engagement

  • How engaged were intervention group members in the intervention?

Context

  • What other HMRE programming was available to intervention group members?

  • What external events affected implementation?

Control or comparison group questions

Fidelity

  • If members of the control or comparison group received alternative services during the evaluation period: Were all intended service components offered to the control or comparison group and for the expected duration?

  • What content did the control or comparison group receive from any sources during the evaluation period?

  • Who delivered any services similar to the intervention to the control or comparison group during the evaluation period?

  • What were the unplanned adaptations to components offered to the control or comparison group as alternative services during the evaluation?

Dosage

  • How often did members of the control or comparison group participate in alternative services or in services similar to the intervention from other sources, on average?

Quality

  • What was the quality of staff–participant interactions in alternative services or in services similar to the intervention received from other sources, if known?

Engagement

  • How engaged were members of the control or comparison group in the alternative services, or in services similar to the intervention received from other sources, if known?

Context

  • What other HMRE programming was available to members of the control or comparison group?

B. Data sources

Describe the data sources you will use to answer the research questions (for example, nFORM, fidelity protocols, attendance logs). Describe the timing and frequency of each data collection effort (for example, during all sessions, once a week, annually), and the party responsible for collecting the data. Use a table to link the information on data collection to the research questions. Table 2 presents an example of such a table (sample text appears in italics).

Table 2. Examples of data for addressing the research questions

Fidelity: Were all intended intervention components offered and for the expected duration?

  • Data source: Workshop sessions in nFORM

  • Timing and frequency: All sessions delivered

  • Responsible party: Intervention staff

Fidelity: What content did the clients receive?

  • Data source: Fidelity tracking log or protocol; attendance logs; session observations

  • Timing and frequency: Every session for fidelity tracking and attendance logs; twice a year for session observations

  • Responsible party: Intervention staff for fidelity tracking and attendance logs; study staff for session observations

Fidelity: Who delivered services to clients?

  • Data source: Staff applications; hiring records; training logs

  • Timing and frequency: One time X months after start of implementation; annually

  • Responsible party: Intervention staff

Fidelity: What were the unplanned adaptations to key intervention components?

  • Data source: Adaptation request; work plan; six-month progress report; annual progress report

  • Timing and frequency: Annually; ad hoc

  • Responsible party: Intervention staff; study staff

Dosage: How often did clients participate in the intervention on average?

  • Data source: Workshop sessions and individual service contacts in nFORM; attendance logs

  • Timing and frequency: All sessions delivered

  • Responsible party: Intervention staff

Quality: What was the quality of staff–participant interactions?

  • Data source: Observations of interaction quality, using a protocol developed by study staff

  • Timing and frequency: X percent of sessions selected at random for observation

  • Responsible party: Study staff

Engagement: How engaged were clients in the intervention?

  • Data source: Observations of engagement, possibly using an engagement assessment tool; ratings from facilitator fidelity logs; engagement ratings from participant satisfaction surveys

  • Timing and frequency: Y percent of sessions selected at random for observation

  • Responsible party: Study staff

Context: What other HMRE programming was available to study participants?

  • Data source: Interviews with staff from partnering agencies in the community; survey items on baseline and follow-up assessments; websites of other agencies in the community providing HMRE programming

  • Timing and frequency: Once a year; ad hoc

  • Responsible party: Study staff

Context: What external events affected implementation?

  • Data source: Interviews with community or county representatives; list of site or school closures

  • Timing and frequency: Once a year; ad hoc

  • Responsible party: Study staff

Note: We use the word “clients” in this table for simplicity. These research questions should be adapted by replacing the term “clients” and specifying “intervention group members” and “control or comparison group members” to address the research questions posed in Table 1, separately by group.

C. Analysis approach

Describe the specific measures you will construct and how you will use those measures to answer the research questions. For example, describe whether you will calculate averages, percentages, and frequencies using the data you will collect for the implementation study. In addition, include information on your approaches to examining and summarizing qualitative data from interviews, focus groups, and observations. Use a table to link the description of the measures to the research questions. Table 3 presents an example of such a table (sample text appears in italics).

Table 3. Examples of measures for addressing the research questions

Fidelity: Were all intended intervention components offered and for the expected duration?

  • Total number of sessions delivered

  • Average session duration, calculated as the average of the recorded session lengths (in minutes)

Fidelity: What content did the clients receive?

  • Total number of topics covered, calculated as the average of the total number of topics checked by each intervention facilitator in the daily fidelity tracking log or protocol

  • Number of HMRE topics covered by other providers during the evaluation period, based on survey data

Fidelity: Who delivered services to clients?

  • Number and type of staff delivering services to study participants, such as the number of session facilitators and couples’ therapists

  • Percentage of staff trained, calculated as the number of staff who were trained divided by the total number of staff who delivered the intervention

Fidelity: What were the unplanned adaptations to key intervention components?

  • List of unplanned adaptations, such as a change in setting, sessions added or deleted, and components cut

Dosage: How often did clients participate in the intervention on average?

  • Average number (or percentage) of sessions clients attended

  • Percentage of the sample attending the required or recommended proportion of sessions

  • Percentage of the sample that did not attend any sessions

  • Participation in services similar to those offered by the HMRE program from other sources, and number of hours received, based on survey data

Quality: What was the quality of staff–participant interactions?

  • Percentage of sessions with high-quality interactions, calculated as the percentage of observed interactions that study staff scored as “high quality”

Engagement: How engaged were clients in the intervention?

  • Percentage of sessions with moderate participant engagement, calculated as the percentage of sessions in which study staff scored participants’ engagement as “moderately engaged” or higher

  • Average engagement rating, calculated as the average of engagement scale scores (ranging from 1–5, for example) across satisfaction surveys

  • Reports of level of engagement in the intervention or in similar HMRE services, based on survey data

Context: What other HMRE programming was available to study participants?

  • Percentage of the sample receiving HMRE programming from other providers, constructed from clients’ survey data on experiences outside of the current intervention

  • List of HMRE programming available to study participants outside of the current intervention, as described on the websites of other agencies in the community

Context: What external events affected implementation?

  • Percentage and total number of anticipated study participants not enrolled due to community issues, if any

  • Number of sites or schools closed as a result of weather events or policy changes (unrelated to the HMRE programming), if any

Note: Please adapt the questions to measure the intervention services received by the intervention group and alternative services and services similar to the intervention received by the control or comparison group during the evaluation period (enrollment through the final follow-up interview), based on the data you have collected through surveys and nFORM.
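To make the dosage measures concrete, the sketch below shows how the three example dosage measures in Table 3 could be computed from per-participant attendance counts. This is an illustration only, not part of the template: the attendance values, the session total, and the "recommended" threshold are all hypothetical, and actual nFORM exports will differ in structure.

```python
# Illustrative sketch: computing the example dosage measures in Table 3
# from hypothetical per-participant attendance counts.
# All values below are assumptions for illustration, not real program data.

sessions_offered = 10      # total workshop sessions delivered (hypothetical)
recommended_share = 0.8    # program-defined "recommended" dose (hypothetical)

# Hypothetical counts of sessions attended, one entry per participant
sessions_attended = [10, 8, 0, 5, 9, 0, 7, 10]

n = len(sessions_attended)

# Average number of sessions clients attended
avg_attended = sum(sessions_attended) / n

# Percentage of the sample attending the recommended proportion of sessions
pct_recommended = 100 * sum(
    1 for s in sessions_attended if s >= recommended_share * sessions_offered
) / n

# Percentage of the sample that did not attend any sessions
pct_none = 100 * sum(1 for s in sessions_attended if s == 0) / n

print(f"Average sessions attended: {avg_attended:.1f}")
print(f"Attending >= {recommended_share:.0%} of sessions: {pct_recommended:.1f}%")
print(f"Attending no sessions: {pct_none:.1f}%")
```

For the impact study context, these measures would be computed separately for the intervention group and for the control or comparison group, as the note above directs.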



Mathematica® Inc.
