Strengthening the Implementation of Marriage and Relationship Programs (SIMR) - Analysis Plan Template

Formative Data Collections for ACF Program Support

Appendix B_HMRE_Descriptive_Analysis Plan_Instructions

OMB: 0970-0531





Instructions for Descriptive Evaluation Analysis Plan Template for HMRE Award Recipients

Month, Day, Year




The Descriptive Evaluation Analysis Plan

The Administration for Children and Families (ACF), Office of Family Assistance (OFA) is requiring all Healthy Marriage and Relationship Education (HMRE) award recipients conducting descriptive local evaluations funded by OFA to provide an analysis plan for their evaluations. Descriptive evaluations can include an outcomes study, an implementation study (sometimes referred to as a process study), or both. Generally, an outcomes study seeks to answer questions about whether and how the outcomes for program participants change during the intervention. These changes in outcomes cannot be described as intervention impacts, because the study does not include a control group whose outcomes can be compared with those of the intervention group. An implementation study aims to answer questions about how the intervention was delivered in a particular context.

The descriptive analysis plan is a document that describes the program design; the research questions; and the analytic approach for the outcomes study, the implementation study, or both (depending on the studies the HMRE award recipient is conducting). For the outcomes study, the document also includes a description of the evaluation enrollment process, data collection procedures, and outcome measures; for the implementation study, it discusses the research questions and the data used to answer them. Developing a structured analysis plan before examining outcomes or implementation data will foster an efficient and effective approach to analyzing the data and reporting the findings. In addition to helping you document your approach for the descriptive analysis, this document is designed to help you detail information you can use in your final evaluation report or other dissemination products. It builds on and refines the most recently approved evaluation design plan.

The reason for developing an analysis plan before conducting any analysis is to demonstrate a team’s commitment to objectivity through a prespecified, systematic, and scientific approach. It also promotes transparency and credibility by showing that your team preselected the outcomes and analytic approaches for the evaluation, thus assuring ACF, program staff, and other interested parties that you have not focused on outcomes that happen to emerge as statistically significant.

The following sections provide definitions and instructions for completing the analysis plan for a descriptive evaluation. Award recipients must provide information on all required sections. Please do not structure your analysis plan to match the formatting of these instructions. Instead, please use the provided template (HMRE Descriptive Analysis Plan Template) for this analysis plan. OFA strongly encourages award recipients and their evaluators to share this analysis plan internally with their evaluation teams, and perhaps even with award recipient staff, so everyone understands it and has an opportunity to discuss key decisions. The analysis plan can be considered an agreement between award recipients and their evaluators regarding the evaluation’s research questions and the approaches the team will use to address those questions. The instructions are organized as follows:

  • Section A provides guidelines for describing the intended intervention.

  • Sections B and C provide a blueprint for describing the plans for outcomes and process or implementation evaluations, respectively, including the proposed research questions, the data that will be collected, and the methods that will be used to analyze the data and describe the findings.

These instructions have been created so all award recipients can fill out each section of the analysis plan regardless of the specifics of their evaluation. However, some award recipients might need to adapt some subsections to fit their approach.

Under the direction of ACF, your Evaluation Technical Assistance Partner (ETAP) will review your analysis plan to provide input and support as you draft it. Please email your analysis plan to your Federal Program Specialist (FPS) and copy your ETAP by [ENTER DATE]. For consistency, please use this common file naming convention when submitting your analysis plan:

[HMRE award recipient Name] Evaluation Analysis Plan.docx

Your FPS and ETAP liaison will review the analysis plan, provide comments and suggest edits, and return it to you for revisions. Your revised analysis plan must be approved by your FPS.





Instructions for completing the descriptive analysis plan template

ACF expects that evaluators will complete the analysis plan, with program directors and program staff adding input as appropriate. Therefore, these instructions are directed to the evaluators and include a few technical terms. For many of the sections, evaluators can draw from the most recently approved evaluation plan.

A. Description of the intended intervention(s)

Describe the intended experiences of those in the intervention condition(s) (that is, what the interventions aim to offer them). In particular, describe the following aspects of the intervention (also organized in Tables 1 and 2):

  1. Intended components. Describe all of the key elements of the intervention (group classes, workshops, one-on-one services, and others). If the intervention includes multiple components, please describe all of them. For example, “This is a multicomponent intervention in which parenting couples receive classes in relationship skills, workshops on economic stability topics, case management, and booster sessions.” If the intervention consists of adding services to a program, describe the program and all the additional services that are part of the intervention. If the intervention consists of providing services not related to a curriculum or program (for example, case management, counseling, or home visits), describe each of the services.

  2. Intended content. Describe the topics that each component of the intervention covers and the resources or materials provided. Also, indicate whether specific content is based on a particular healthy marriage or relationship education curriculum.

  3. Planned dosage and implementation schedule. Describe the number of sessions and the duration of each component of the intervention. Include the length and frequency of each session. Describe variation in the frequency or length of sessions across sites, if applicable—for example, “This is an eight-month workshop, with two-hour sessions that take place once a week.”

  4. Intended delivery. Describe where the intervention component is delivered and who delivers it.

  5. Focal population. Describe the characteristics of the population that each component of the intervention intends to serve, such as age, gender, marital status, and socioeconomic status. For example, “This component of the intervention is intended to be delivered to youth exiting the foster care system.”

  6. Education and initial training of staff. Provide information on the required education, gender, cultural background, and hiring requirements of the providers or facilitators of each component, and on the training and technical assistance that are offered to those providers before they begin to deliver a component.

  7. Ongoing training of staff. Provide information on the additional and ongoing training that is offered after the initial training to ensure staff maintain fidelity to established practices.

Please use tables to summarize the intervention’s characteristics. Summaries should be clear and succinct. See Tables 1 and 2 for examples, with sample text included in italics. If different sets of services are offered to different populations in your descriptive evaluation study, consider creating separate tables for each population.

Table 1. Description of intended intervention components, content, dosage and implementation schedule, delivery, and focal populations

| Components | Content | Dosage and schedule | Delivery | Focal population |
| --- | --- | --- | --- | --- |
| Relationship skills workshops | Healthy relationships curriculum: understanding partner’s perspectives; avoiding destructive conflict; and communicating effectively | A total of 20 hours, with two-hour sessions occurring twice weekly, or four-hour sessions for five consecutive Saturdays | Group lessons provided at the intervention’s facilities; two trained facilitators lead every session. | Married couples with low income |
| Economic stability workshops | Preparing a resume; interview and communication skills; appropriate work attire; financial literacy | Monthly two-hour workshops | Each workshop is led by one facilitator. | Individual members of a couple who need help with a job search |

Table 2. Staff education and training (initial and ongoing)

| Component | Education and initial training | Ongoing training |
| --- | --- | --- |
| Relationship skills workshops | Facilitators are male and female, hold at least a bachelor’s degree, and received four days of initial training. | Facilitators receive a half-day of semi-annual refresher training in the intervention’s curricula from study staff. |
| Economic stability workshops | Facilitators are male and female, hold at least a bachelor’s degree, and received two days of initial training. | Facilitators receive a half-day of semi-annual refresher training in the intervention’s curricula from study staff. |

B. Outcomes study

Describe the research questions, sample formation process, data collection procedures, outcome measures, and analytic approach for your outcomes study.

1. Research questions

Research questions for the outcomes study typically focus on how participating in the intervention is associated with HMRE outcomes, such as the status and quality of couples’ relationships, the quality of co-parenting or parenting, and economic stability and well-being. Outcomes can be measured in a variety of ways, such as surveys, direct assessments, and observations. A best practice is to clearly connect the outcomes and the time points when they are measured with the intervention’s logic model or theory of change. Examples of research questions that specify an outcome and a time point include:

  • What is the likelihood that couples are married at the end of the intervention?

  • What is the level of affection that couples feel toward each other at the end of the intervention?

  • What is the level of parents’ nurturing behavior and engagement in age-appropriate activities with children at the end of the intervention?

  • What is the employment status of parents at the end of the intervention, relative to their employment status at baseline?

2. Outcomes evaluation enrollment

Provide the name of the Institutional Review Board that approved the study and data collection plans, the date of approval, and the dates of any supplemental review approvals.

Describe how the members of the focal population become part of the sample you will use to answer these research questions. Provide the information for all members of the sample, including organizations from which they were recruited, all service locations, and eligibility criteria to be part of the study. Include the following:

  • Recruitment and study sample enrollment targets. Describe where participants were recruited, including agencies, schools, and all service locations. Provide the desired sample size for the study.

  • Sample eligibility criteria. Describe any required characteristics for sample inclusion (for example, age, marital status, involvement with the child support system, attending a particular school, geographical area, employment status, and ability to speak and understand particular languages used for intervention delivery).

  • Additional selection criteria. Describe any additional criteria for selecting the sample beyond the eligibility criteria. These might include criteria to limit the number of clients who need to be tracked (for example, randomly selecting eligible participants, or including only specific classrooms in participating schools).

  • Consent process for enrollment in the evaluation. Describe the timing, process, and materials used (such as the consent forms and incentives). Provide the start date of sample enrollment and the estimated date (month and year) for completing sample enrollment.

3. Data sources and data collection

Describe the data sources for the analyses—for example, surveys of intervention participants or administrative data. Describe the timing of each data collection point (for example, at enrollment, at the first workshop X weeks after enrollment, or at the final workshop Y weeks after the first workshop), and any other follow-up data collection time periods after the intervention ends. Describe the modes and methods of collecting data at each data collection point (in-person paper surveys, online surveys, phone interviews, and others). Provide estimated dates (month and year) for starting and completing each planned data collection effort. Finally, please provide a copy of your data collection instruments in an appendix to your analysis plan. (At a minimum, provide the instruments you used to collect data for your outcomes study.)

Summarize the information on each feature of data collection in a table (see Table 3 for an example; sample text appears in italics).

Table 3. Sources of data to address the research questions

| Data source | Timing of data collection | Mode of data collection | Start and end date of data collection |
| --- | --- | --- | --- |
| Intervention participants | At the first workshop (one month after enrollment) | In-person online survey | September 2021 through March 2024 |

4. Outcome measures

Describe the specific outcome measures you will use to answer the research questions. If you will construct measures from multiple questions, describe the survey items you will use to create each construct and how you will code them to create the measure, including your approach if one or more items are missing within a given composite.
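For teams that analyze their data in Python, the sketch below shows one way to score a multi-item composite with an explicit missing-item rule. The scale, the item names, and the “at least four of five items” threshold are illustrative assumptions, not requirements of this template.

```python
import pandas as pd

# Hypothetical five-item "level of affection" scale, each item scored 1-5.
ITEMS = ["affection_1", "affection_2", "affection_3", "affection_4", "affection_5"]

def score_affection(df: pd.DataFrame, min_items: int = 4) -> pd.Series:
    """Average the available items; set the composite to missing when a
    respondent answered fewer than min_items of the five items."""
    answered = df[ITEMS].notna().sum(axis=1)
    composite = df[ITEMS].mean(axis=1, skipna=True)
    return composite.where(answered >= min_items)

# Toy data: the third respondent answered only two items, so the
# composite is set to missing under the rule above.
survey = pd.DataFrame({
    "affection_1": [5, 4, None],
    "affection_2": [4, None, None],
    "affection_3": [5, 4, None],
    "affection_4": [3, 5, 2],
    "affection_5": [4, 4, 3],
})
survey["affection_scale"] = score_affection(survey)
print(survey["affection_scale"])
```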

Complete Table 4 (sample text appears in italics), describing all measures you will use to answer the research questions of the outcomes study. Describe how you are operationalizing the outcome measure and when you will measure outcomes. These outcomes should connect to your proposed primary research questions. Whenever applicable and possible, provide the properties of the outcome measures, such as reliability and internal consistency.
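Where internal consistency is reported, Cronbach’s alpha is a common summary statistic. The helper below is a minimal, self-contained illustration of the standard formula, computed on complete cases only; the item names and values are invented for the example.

```python
import pandas as pd

def cronbachs_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of scale items, using complete cases:
    alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)."""
    complete = items.dropna()
    k = complete.shape[1]
    item_variances = complete.var(axis=0, ddof=1).sum()
    total_variance = complete.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Toy illustration with three hypothetical items:
items = pd.DataFrame({
    "item_1": [5, 4, 3, 4, 2],
    "item_2": [4, 4, 2, 5, 1],
    "item_3": [5, 3, 3, 4, 2],
})
print(f"Cronbach's alpha = {cronbachs_alpha(items):.2f}")
```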

Table 4. Outcomes used to answer the research questions

| Research question | Outcome name | Description of the outcome measure | Source of the measure | Timing of measure |
| --- | --- | --- | --- | --- |
| RQ1 | Marital status | The outcome measure is a yes or no response taken directly from the question in the survey, “Are you married?” | nFORM exit survey | Post-test (immediately after the intervention ends) |
| RQ2 | Level of affection | The outcome measure is a scale (value range 1–5) calculated from both partners’ responses as the average of five survey items measuring support, intimacy, commitment, trust, and friendship. | Local follow-up survey | Three months after the intervention ends |

5. Analysis approach

The analysis plan should lay out the analytic approach you will use to examine outcomes and answer the research questions. The analytic approach also includes a description of how you define the analytic sample and your approaches to cleaning data and examining outcomes to answer the research questions, including your plans to deal with missing data. Note that ACF strongly discourages imputation of outcome data for analyses used to answer the primary research questions; therefore, the main analysis will need to use the sample of participants with complete outcome data and, at most, limited imputed baseline data.

  • Analytic sample. You will use the analytic sample to examine outcomes and answer the research questions. Describe how you will define the analytic sample (for each research question, if applicable). Clearly articulate what data you require for a person to be part of the analytic sample. For example, perhaps the analytic sample for the study will be people with complete baseline and outcome data for all variables of interest (that is, a complete-case sample). Alternatively, the analytic sample might be people who have complete outcome data but some missing baseline data, which will be imputed (following the guidance described below). As mentioned, imputing outcome data is generally not an acceptable practice; therefore, the primary analysis approach cannot include people with missing outcome data in the analytic sample. Imputed outcome data can be used for secondary or sensitivity analyses, at the HMRE award recipient’s discretion.

  • Data preparation. Describe how you will clean the baseline and follow-up data and prepare them for analysis. This includes whether and how you will merge or combine data from different sources. Describe in detail how you will handle missing data (such as the process to impute missing baseline data, or the process to impute outcome data as part of a sensitivity analysis), and your plans to identify and handle responses that are inconsistent or seem inaccurate, across both baseline and outcome (at post-test and follow-up) surveys. For example, if you are administering surveys to both members of a couple separately, describe the strategies you will use to verify that the answers are consistent, such as checking that both people report the same marital status.

  • Analytic approach. Describe how you will analyze outcome measures to answer the research questions. For example, explain whether you will compute summary statistics (for example, means, percentages) of outcome measures at the follow-up; compare means and other indicators at baseline and at follow-up (that is, a pre-post analysis); conduct correlational analyses between implementation variables (such as dosage or participation) and outcome measures; or conduct growth-curve analyses. Provide the following information:

    • Model specification. Describe the type of model you will use to estimate change in outcomes over time (linear regression, logistic regression, or other). Define the criteria you will use to assess the statistical significance of the findings (for example, “Findings are considered statistically significant based on p < 0.05, two-tailed test”). Specify the statistical software package you will use.

    • Covariates. List all potential covariates you plan to include in the models you will use to examine the outcomes of interest. If you have not yet determined the covariates, describe a plan for determining the covariates you will include in your analyses. (A minimal sketch of one possible pre-post specification follows this list.)
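To make these choices concrete, the sketch below walks through one possible complete-case, pre-post workflow in Python, using statsmodels for the regression. The file names, variable names, covariates, and couple identifier are all hypothetical placeholders; treat this as an illustration of the decisions described above, not a prescribed specification.

```python
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

# All file, variable, and covariate names below are hypothetical placeholders.
baseline = pd.read_csv("baseline.csv")    # one row per participant
followup = pd.read_csv("followup.csv")

# Data preparation: merge on a shared ID and form the complete-case
# analytic sample for the primary analysis (no imputed outcome data).
df = baseline.merge(followup, on="participant_id", suffixes=("_pre", "_post"))
analytic = df.dropna(subset=["affection_scale_pre", "affection_scale_post",
                             "age", "married_pre"])

# Example consistency check for couples surveyed separately: flag couples
# whose members report different marital status at baseline.
inconsistent = analytic.groupby("couple_id")["married_pre"].nunique().gt(1)

# Pre-post comparison: paired t-test of mean outcomes at baseline and follow-up.
t_stat, p_value = stats.ttest_rel(analytic["affection_scale_post"],
                                  analytic["affection_scale_pre"])

# Regression-adjusted alternative: model the follow-up outcome as a function
# of its baseline value and covariates, judging significance at p < 0.05
# (two-tailed) as an example criterion.
model = smf.ols(
    "affection_scale_post ~ affection_scale_pre + age + married_pre",
    data=analytic,
).fit()
print(model.summary())
```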

6. Attrition

Attrition is the number of people in the baseline sample for whom follow-up was not completed or who are missing outcome data. Provide the following information:

  • Plans to minimize sample attrition. Describe the approaches you used to maximize participation in the follow-up data collection—that is, the strategies you used to follow up with study participants and increase response rates for the outcomes data collection. For example, describe the number of attempts, time period, and mode of communication (phone calls, text messages, or other) for reaching out to study participants during follow-up. In addition, describe the incentives you provided.

  • Approach to report attrition. Describe the approach you will use to report attrition from the baseline sample (the sample of study participants who completed the baseline data collection). This could include your approach to reporting (1) the proportion of the sample that is missing each outcome measure and (2) the difference in means of baseline variables or measures between samples with and without outcomes data. These steps will help you explore whether systematic differences exist between those two samples.1 (One illustrative way to carry out this comparison appears in the sketch after this list.)
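Continuing the hypothetical merged data frame from the previous sketch, the example below illustrates the two reporting steps: the share of the baseline sample missing the outcome, and baseline mean comparisons between cases with and without outcome data (here via Welch t-tests, one reasonable choice among several).

```python
import pandas as pd
from scipy import stats

# Hypothetical variable names, continuing the merged baseline/follow-up
# data frame (`df`) from the previous sketch.
df["has_outcome"] = df["affection_scale_post"].notna()

# (1) Proportion of the baseline sample missing the outcome measure.
missing_rate = 1 - df["has_outcome"].mean()
print(f"Share of baseline sample missing the outcome: {missing_rate:.1%}")

# (2) Baseline mean differences between cases with and without outcome data.
for var in ["age", "affection_scale_pre", "married_pre"]:
    with_outcome = df.loc[df["has_outcome"], var].dropna()
    without_outcome = df.loc[~df["has_outcome"], var].dropna()
    t_stat, p_value = stats.ttest_ind(with_outcome, without_outcome,
                                      equal_var=False)
    print(f"{var}: {with_outcome.mean():.2f} vs. {without_outcome.mean():.2f} "
          f"(p = {p_value:.3f})")
```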

C. Implementation study

If you are conducting a process or implementation study, describe the research questions you aim to answer, the data you will use to answer the research questions, and the methods you will use to analyze the data and describe the findings. This information will be useful in contextualizing the analysis of outcomes, generating hypotheses about changes in outcomes, and informing intervention improvement.

1. Research questions

The research questions articulate the main hypotheses of your process or implementation study. Research questions can refer to implementation elements such as fidelity, dosage, quality of implementation, engagement, and context, as the following examples show:

  • Fidelity: Was the intervention (and each of its components, if applicable) delivered as intended?

  • Dosage: How much of the programming (or how many components, if applicable) did participants receive?

  • Quality: How well was the intervention implemented or delivered to participants?

  • Engagement: Did couples or parents engage in intervention services, and if so, how engaged were they?

  • Context: What other types of services are available to intervention participants? What external events or unplanned adaptations might have affected implementation of the intervention as intended?

Table 5 lists examples (in italics) of research questions for each implementation element.

Table 5. Examples of research questions for each implementation element

| Implementation element | Research question |
| --- | --- |
| Fidelity | Were all intended intervention components offered and for the expected duration? |
| Fidelity | What content did the clients receive? |
| Fidelity | Who delivered services to clients? |
| Fidelity | What were the unplanned adaptations to key intervention components? |
| Dosage | How often did clients participate in the intervention on average? |
| Quality | What was the quality of staff–participant interactions? |
| Engagement | How engaged were clients in the intervention? |
| Context | What other HMRE programming was available to study participants? |
| Context | What external events affected implementation? |

2. Implementation evaluation enrollment

Provide the name of the Institutional Review Board that approved the study and data collection plans, the date of approval, and the dates of any supplemental review approvals.

Describe the sample you will use to answer these research questions and whether and how the sample in the process or implementation study differs from the sample in the outcomes study. If not described elsewhere in the analysis plan, provide the following information for all clients in the sample, the agencies and schools from which they were recruited, and all service locations:

  • Describe any required characteristics for sample inclusion (for example, age, marital status, attending a particular school, geographical area, employment status, and ability to speak and understand particular languages used for intervention delivery).

  • Describe, in detail, the consent process for study enrollment if it differs from the outcomes evaluation. For example, if you conducted interviews or focus groups, please describe the process of obtaining consent from those participants. Describe the timing, process, and materials used (consent forms and incentives, for example). Provide the start date of sample enrollment and the estimated date (month and year) for completing sample enrollment.

3. Data sources and data collection

Describe the data sources you will use to answer the research questions (for example, nFORM surveys, fidelity protocols, and attendance logs). Describe the timing and frequency of each data collection effort (for example, during all sessions, once a week, or annually), and the party responsible for collecting the data. Use a table to link the information on data collection to the research questions. See Table 6 for an example table (sample text appears in italics). Finally, please provide a copy of your data collection instruments in an appendix to your analysis plan. (At a minimum, provide the instruments you used to collect data for your process or implementation study.)

Table 6. Examples of data for addressing the research questions

| Implementation element | Research question | Data source | Timing and frequency of data collection | Party responsible for data collection |
| --- | --- | --- | --- | --- |
| Fidelity | Were all intended intervention components offered and for the expected duration? | Workshop sessions in nFORM | All sessions delivered | Intervention staff |
| Fidelity | What content did the clients receive? | Fidelity tracking log or protocol; attendance logs; session observations | Every session for fidelity tracking and attendance logs; two times a year for session observations | Intervention staff for fidelity tracking and attendance logs; study staff for session observations |
| Fidelity | Who delivered services to clients? | Staff applications; hiring records; training logs | One time X months after start of implementation; annually | Intervention staff |
| Fidelity | What were the unplanned adaptations to key intervention components? | Adaptation request; work plan; six-month progress report; annual progress report | Annually; ad hoc | Intervention staff; study staff |
| Dosage | How often did clients participate in the intervention on average? | Workshop sessions and individual service contacts in nFORM; attendance logs | All sessions delivered | Intervention staff |
| Quality | What was the quality of staff–participant interactions? | Observations of interaction quality, using protocol developed by study staff | X percent of sessions selected at random for observation | Study staff |
| Engagement | How engaged were clients in the intervention? | Observations of engagement, possibly using an engagement assessment tool; ratings from facilitator fidelity logs; engagement ratings from participant satisfaction surveys | Y percent of sessions selected at random for observation | Study staff |
| Context | What other HMRE programming was available to study participants? | Interviews with staff from partnering agencies in the community; survey items on baseline and follow-up assessments; websites of other agencies in the community providing HMRE programming | Once a year; ad hoc | Study staff |
| Context | What external events affected implementation? | Interviews with community or county representatives; list of site or school closures | Once a year; ad hoc | Study staff |

4. Analysis approach

Describe the measures you will construct and your approaches to using those measures to answer the research questions. For example, describe whether you will calculate averages, percentages, and frequencies using the data you will collect for the study. In addition, include information on your approaches to examining and summarizing qualitative data from interviews, focus groups, and observations. Use a table to link the description of the measures to the research questions. Table 7 presents an example table (sample text appears in italics). A minimal sketch of how dosage measures like those in Table 7 might be computed follows this paragraph.
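As one illustration, the sketch below computes dosage measures of the kind listed in Table 7 from a long-format attendance log (one row per client per session attended). The file names, column names, total session count, and attendance threshold are all assumptions for the example.

```python
import pandas as pd

# Hypothetical inputs: an attendance log and an enrollment roster.
attendance = pd.read_csv("attendance_log.csv")   # columns: client_id, session_id
enrolled = pd.read_csv("enrollment.csv")["client_id"]

TOTAL_SESSIONS = 10   # total sessions offered (assumed)
THRESHOLD = 0.80      # recommended share of sessions to attend (assumed)

# Sessions attended per client, including enrolled clients who never attended.
sessions_per_client = attendance.groupby("client_id")["session_id"].nunique()
dosage = sessions_per_client.reindex(enrolled, fill_value=0)

avg_sessions = dosage.mean()
pct_meeting_threshold = (dosage >= THRESHOLD * TOTAL_SESSIONS).mean() * 100
pct_no_sessions = (dosage == 0).mean() * 100

print(f"Average sessions attended: {avg_sessions:.1f}")
print(f"Percent attending >= {THRESHOLD:.0%} of sessions: {pct_meeting_threshold:.1f}%")
print(f"Percent attending no sessions: {pct_no_sessions:.1f}%")
```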

Table 7. Examples of measures for addressing the research questions

| Implementation element | Research question | Measures |
| --- | --- | --- |
| Fidelity | Were all intended intervention components offered and for the expected duration? | Total number of sessions delivered; average session duration, calculated as the average of the recorded session lengths (in minutes) |
| Fidelity | What content did the clients receive? | Total number of topics covered, calculated as the average of the total number of topics checked by each intervention facilitator in the daily fidelity tracking log or protocol |
| Fidelity | Who delivered services to clients? | Number and type of staff delivering services to study participants, such as the number of session facilitators and couples’ therapists; percentage of staff trained, calculated as the number of staff who were trained divided by the total number of staff who delivered the intervention |
| Fidelity | What were the unplanned adaptations to key intervention components? | List of unplanned adaptations, such as a change in setting, sessions added or deleted, and components cut |
| Dosage | How often did clients participate in the intervention on average? | Average number (or percentage) of sessions clients attended; percentage of the sample attending the required or recommended proportion of sessions; percentage of the sample that did not attend any sessions |
| Quality | What was the quality of staff–participant interactions? | Percentage of sessions with high-quality interactions, calculated as the percentage of observed interactions that study staff scored as “high quality” |
| Engagement | How engaged were clients in the intervention? | Percentage of sessions with moderate participant engagement, calculated as the percentage of sessions in which study staff scored participants’ engagement as “moderately engaged” or higher; average engagement rating, calculated as the average of engagement scale scores (ranging from 1–5, for example) across satisfaction surveys |
| Context | What other HMRE programming was available to study participants? | Percentage of the sample receiving HMRE programming from other providers, constructed from clients’ survey data on experiences outside of the current intervention; list of HMRE programming available to study participants outside of the current intervention, as described on the websites from other agencies in the community |
| Context | What external events affected implementation? | Percentage and total number of expected study participants not enrolled due to community issues, if any; number of sites or schools that were closed as a result of initiatives in the county or school district (unrelated to the HMRE programming), if any |



1 For the 2015 HMRF award recipients, a webinar on nonresponse bias in descriptive evaluations presented the recommended approach to assessing nonresponse and considering this information when describing findings and implications. Teams may view a recording of the webinar to inform analysis planning.
