Strengthening the Implementation of Marriage and Relationship Programs (SIMR)

Formative Data Collections for ACF Program Support

Attachment C: SIMR Staff Interview Topic Guide


OMB: 0970-0531


Staff Interview Topic Guide for SIMR

Note to reviewers: This instrument includes a universe of questions relevant to a broad range of strategies that will be tested in the Strengthening the Implementation of Marriage and Relationship Programs (SIMR) project. The instrument will be tailored and shortened for each individual site and the strategy that it is testing.

INTERVIEWER NOTE: Based on your current understanding of the program and previous conversations with the program, use this topic guide to identify relevant topics that will then be tailored to the program. To do this, first identify topics on which we have not already collected information. Then use the identified topics to develop program-specific questions.

Not all programs will have staff in the roles identified in the table. To map staff roles in the table to specific programs, use the following definitions:

  • Program leaders and managers: Individuals responsible for the overall direction and management of the program with a high-level understanding of the program’s mission.

  • Program supervisors: Those who oversee program implementation, provide support to the frontline staff, and provide information to program leaders and managers.

  • Frontline staff: Staff responsible for serving parents, children, and families enrolled in the program; this can include frontline staff at a partner agency, if appropriate. Separate interviews will be conducted for staff (or small groups of staff) who work with different populations in the program.

Introduction and consent

Thank you for taking the time to speak with us today. We are from Mathematica, an independent research firm, and we are here to learn about your experiences using [strategy]. My name is [NAME] and my colleague is [NAME].

We are speaking today on behalf of the Strengthening the Implementation of Marriage and Relationship Programs project, which we call “SIMR” for short. SIMR is a study sponsored by the Administration for Children and Families within the U.S. Department of Health and Human Services. Through this project, [program] has been working with Mathematica and our partner, Public Strategies, to design and test strategies to address common implementation challenges.

We will be speaking today about your program’s use of [strategy]. We will ask you some questions about the strategy, including the training or materials you received, your use of and comfort with the strategy, participants’ responses, and suggested improvements. Before we start, I want to let you know that your participation in this interview is voluntary, and you may stop at any time. Do you consent to participate in this interview?

Providing information is voluntary, and all individual responses that are collected will be kept private to the extent permitted by law. We expect this discussion to take about 45 minutes. There are no right or wrong answers. We value the information you will share with us and want to make sure we capture it all by recording it. If you do not agree to the recording, you can still participate, and we will not record it, but we have someone who will take notes. Only the team that is working on the study will have access to them. We will destroy the recording and the transcription at the end of the study.

Do we have your permission to record the discussion?

Now, I am going to read a statement:

An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB number and expiration date for this collection are OMB #: 0970-0531, Expiration: 07/31/2022. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Robert Wood; [email protected].

Do you have any questions before we get started?

NOTE: Paperwork Reduction Act Statement: This collection of information is voluntary and will be used to gather information for the purpose of rapid-cycle learning activities to strengthen programs. Public reporting burden for this collection of information is estimated to average 45 minutes per response, including the time for reviewing instructions, gathering and maintaining the data needed, and reviewing the collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB number and expiration date for this collection are OMB #: 0970-0531, Exp: 07/31/2022. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Robert Wood; [email protected].

Topics to cover in interviews

Topic | Program leaders and managers | Program supervisors | Frontline staff

A. Feedback on training, guidance, or materials

Usefulness and clarity of training, guidance, or materials received as a part of SIMR

First, I’d like you to think about [training, guidance, or materials] you received as a part of [using strategy] for the SIMR project.

  1. As a result of the [training, guidance, or materials], how prepared did you feel to [use the strategy]?

  • If you did not feel fully prepared, what would you like more information, training, or guidance on?

  2. Talk about your reactions to the [training, guidance, or materials]. Were there parts of the [training, guidance, or materials] that you felt were particularly useful, relevant, or engaging? If so, what were they? Were there parts that you felt were not useful, relevant, or engaging? If so, what were they?

  • What suggestions do you have for doing things differently for future [training, guidance, or materials]? Probe for changes and/or additions.


Change(s) in knowledge and behaviors from receiving training, guidance, or materials

  3. Since [receiving training/guidance/materials], have you made changes in how you think about or perform [focus of training/guidance/materials]? What have you noticed?

  4. Since [receiving training/guidance/materials], have you noticed any changes in how [other program staff] think about or perform [focus of training/guidance/materials]? What have you noticed?

  5. What do you think was the most important thing you learned from the [training/guidance/materials]? Why do you think it is the most important?

  6. How much of the information in the [training/guidance/materials] was new for you? Please provide an example of something that was new, and something that you had learned before.

  • If applicable: Where did you learn it?



Confidence in skills gained from receiving training, guidance, or materials

  7. After [receiving training/guidance/materials], did you feel like the information you learned made you better prepared to do your job? Why or why not?

  8. Do you think that the [training/guidance/materials] has improved the way you do your job? Why or why not?



Overall satisfaction with training, guidance, or materials

  9. Overall, thinking about the [training/guidance/materials] that you received as a part of [using strategy] for the SIMR project, how satisfied are you with the [training/guidance/materials] provided? What makes you say that?

Suggested changes and/or additions to training, guidance, or materials



See A.2.

B. Use of strategy

Fidelity of implementation (i.e., did you use each part of the strategy, as developed?) (for program leaders and managers and supervisors: observations of fidelity)

  1. Walk me through how you [used the strategy]. (If strategy has multiple parts, ask about each part of the strategy)

  • Do you think you’ve been able to [use the strategy] as intended?

  • What about [strategy] was easy to implement as planned?

  • What about [strategy] was challenging to implement as planned?

Frequency of strategy use

  2. Over the past [time interval], have you [used the strategy]? How often?

  • [If staff member has not used the strategy] What has gotten in the way of [using strategy]?


Reasons for not using the strategy, if strategy was not implemented



See B.2.



Barriers and facilitators to implementation of strategy

  3. Overall, have you found any specific barriers that have made implementation challenging, or any factors that have helped you use [strategy]? Describe them.

  4. Are there parts of the strategy that are easier than others? Which ones, and why?

Comfort with strategy (for supervisors: perceived comfort)

  5. Is there anything about [strategy] that has felt unnatural or uncomfortable? If so, what?

  6. For supervisors: What feedback have you received about the strategy from the staff you supervise? What have they told you about their comfort using the strategy?


Extent to which strategy has made their job easier or improved their ability to complete their job successfully

  7. How different do you think [strategy] is from how you approached your job before? What is different, or not so different, about it?

  8. Would you say that [strategy] has made your job easier or harder? In what way?


C. Participant responsiveness




Observed participant response to strategy (e.g., improved relationships between participants and with staff, increased attendance, improved engagement, etc.)

  1. Over the past [time interval], have you seen any changes in the way that participants have reacted to [strategy]? (Use targeted changes to tailor this question. For example, improved relationships between participants and with staff, increased attendance, improved engagement)


Observed changes in participant skills/behaviors

  2. Over the past [time interval], have you observed any changes in participants’ [hypothesized changes in skills/behaviors]? What about other skills or behaviors? (Use targeted changes to tailor this question.)

  • What do you think might be behind these changes? In the past [time interval], have you been doing anything differently? If so, what?


D. Extent to which strategy affects recruitment, retention, or engagement




Perceptions of whether it improves service delivery, addresses participant needs, or achieves intended outcomes

  1. Overall, on a scale of one to five, how effective do you think [strategy] has been at [changing intended outcome], with one being not at all effective and five being extremely effective?

  • Why did you choose that rating?

Perceptions of whether strategy effectiveness differs across circumstances or populations

  2. Have there been any situations in which [strategy] has worked better or worse than in others?

  • What were these situations and how did the situations differ? (Probe on whether the situations involved different populations or settings)

Perception or observed changes of strategy’s effect on recruitment, enrollment, or engagement

  3. Over the past [time interval], what, if any, specific changes have you observed related to [focus of strategy]? How, if at all, do you think [strategy] contributed to these changes?

  • How can you tell? Can you think of an example that illustrates why it [is/isn’t] working the way it should be?

E. Data collection

How the program assesses fidelity and the role of staff in different positions

  1. What is the program doing to make sure that staff are [implementing the strategy] according to plan? (for example, conducting observations, providing support, checking in)

  • How are various staff in the program involved in assessing if the strategy is happening as planned?


Additional data collected outside of nFORM (e.g., incentives, attendance, recruitment) and how it’s used to identify areas for improvement

  2. What, if any, data does your program collect outside of nFORM (e.g., data on incentives, attendance, recruitment) to help track progress with [strategy]?

  • If additional data is collected, how do you use that data to identify areas for improvement?




Frequency that program shares and discusses data with partners

  3. Do you share or discuss any of the data you collect with partners? If yes:

  • What partners do you share or discuss data with?

  • What data is shared with them?

  • How often is the data shared or discussed with them?




Frequency of internal staff conversations around data (e.g., program leaders/managers discuss with supervisors, supervisors discuss with frontline staff)

  4. In the past [time interval], have you discussed [strategy] with your coworkers?

  • If yes, who did you talk with? (for example, program supervisors, frontline staff, someone else)

  • What was the nature of the conversation? Was it to get feedback on the implementation of [strategy] in general, provide information or guidance, or something else?

  • Did you discuss program data as a part of this conversation? (for example, trends in recruitment or retention)


Challenges collecting, entering, and/or processing data to assess success of strategy

  5. What data are you responsible for related to [strategy]? (for example, completing a self-assessment, maintaining a log or tracker, entering data into nFORM)

  • What challenges, if any, have you had with entering data and tracking implementation of the strategy? (for example, tracker isn’t intuitive, data entry is time consuming)

  • What other data or information would be helpful for knowing whether [strategy] is working or not?



Additional data program staff would like to collect or analyze to assess success of strategy

See E.5.

F. Suggested improvements overall

Parts of strategy that worked well and why

  1. Reflecting on [use of the strategy] in the past [time interval], what worked particularly well? Why?


Parts of strategy that didn’t work well and why

  2. What parts of [strategy] didn’t work so well, or need the most work? Why?


Desired or recommended adaptations to the strategy and why this change may improve the strategy

  3. How could we make [strategy] work better? What would you change? Why? (This could be adaptations to the strategy or changes to program policies/procedures to better accommodate the strategy)

Desired or recommended changes to program policies/procedures to better accommodate the strategy and why that may improve implementation of the strategy

  4. Are there other things that make it hard to do your job, or other things in your program that could be improved? Why? How would you improve them?

  • What changes to program policies or procedures would help you implement [strategy] better?

G. If partner organizations are involved in implementing the strategy

Frequency of communication with partner agency

  1. How often do you [meet/talk] with [relevant partner] about [focus of strategy]?

  • How does this communication usually occur? (Probe for: standing meeting, ad-hoc, phone, in-person, email)

Process of communication with partner agency



See G.1.

Nature and focus of communication related to the strategy

  2. What are some of the topics and issues that have come up in those conversations?

Satisfaction with partner agency regarding strategy or change

  3. On a scale of 1-5, where 1 is “very dissatisfied” and 5 is “very satisfied,” how would you rate working with [relevant partner] to [focus of strategy]? Why?

  • [If applicable] What suggestions do you have to improve the relationship?




