Family Self-Sufficiency Demonstration Development Grants Evaluation Support

Formative Data Collections for ACF Research

Instrument 1_FSSDD Discussion Guide

OMB: 0970-0356


FSSDD-GES Discussion Guide for Initial Calls with Grant Recipients

Discussion Guide to Identify and Develop Evaluation Support Activities to Conduct with Grant Recipients

Introduction and consent

Thank you so much for meeting with us today. As we have discussed, my colleague [NAME] and I are providing research and evaluation support to you as one of the Family Self-Sufficiency Demonstration Development (FSSDD) grant recipients, with funding from the Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) in the U.S. Department of Health and Human Services. The intention of the FSSDD Grants Evaluation Support is to build grant recipients’ research and evaluation capacity and help them generate evidence related to their interventions, which aim to improve the lives of children and families.

We have reviewed your grant application and publicly available information about your organization and the services you offer. The purpose of the discussion today is to help us collaboratively identify areas in which you could benefit from our help. We’ll have up to three discussions with you about this over the next couple of months, leading to a concrete research and evaluation support plan at the end of our discussions. We will ask you about your intervention, [only if already implementing: how you implement it], any challenges you have encountered, the types of research and evaluation you have engaged in already, and the research you are interested in conducting with our support. Your participation in this discussion is voluntary. Each of our three conversations will take about 60 minutes.

I would like to record our conversation today, so I don’t miss anything. Is it okay with you if I record the conversation? If you want me to turn the recorder off for any reason or at any time, just say so. The recording will only be accessed by evaluation support team members and will be stored on a secure drive at Mathematica. We’ll destroy the recording at the end of this project. [INTERVIEWER: TURN THE RECORDER ON]

Okay, I have now turned on the recorder. Now that I have the recorder on, I need to ask you again, is it okay if I record this conversation? [Interviewer: Get verbal consent to record after beginning to record.]


PAPERWORK REDUCTION ACT OF 1995 (Pub. L. 104-13) STATEMENT OF PUBLIC BURDEN:

The purpose of this information collection is to provide evaluation support to innovative interventions serving individuals, children, and families facing challenges to economic independence to expand the evidence base. Public reporting burden for this collection of information is estimated to average 60 minutes per response. This is a voluntary collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information subject to the requirements of the Paperwork Reduction Act of 1995, unless it displays a currently valid OMB control number. The OMB # is 0970-0356 and the expiration date is 02/29/2024. If you have any comments on this collection of information, please contact Annalisa Mastri at [email protected].







Module 1: Well-defined intervention informed by evidence

Key topics to cover in this module:

  • Why did the grant recipient pick this intervention to focus on for the FSSDD grant project? What problem(s) are they trying to solve through this intervention?

  • What sources of information (such as evidence-based or -informed practices and practice wisdom) did they draw from to develop their intervention?

  • Is the grant recipient clear about what the intervention is, what they are hoping to achieve, and how the intervention will help them achieve those outcomes?

  • Is the intervention defined and mapped out in a way that it could be replicated by another site?



In this set of questions, we’d like to talk about how you identified and designed your intervention. We’d like to hear about the problem you’re trying to solve through your intervention, what information you drew from to design or adapt your intervention, and what you’re hoping to achieve.

  1. What problem(s) or challenge(s) is your intervention seeking to address? How did you decide this was a problem or challenge you wanted to address? What are the goals for your intervention? Who is eligible for your intervention? Are there specific criteria they must meet to participate? [If not yet implementing: Who will be eligible? Are there specific criteria they will have to meet to participate?]

  2. Tell us a little more about how you developed or decided on implementing this intervention. What informed your thinking? To what extent, if any, did existing research or evidence play a part in the intervention design?

    1. Probe: Did you adapt an existing model? If so, how and why?

    2. Probe: Who was involved in developing or deciding on this intervention? For example, organizational leaders, community members, participants in other programs implemented at your organization, key partners, and funders.

  3. Describe your intervention’s services and activities.

    1. Probes: What services are offered to participants? [If not yet implementing: What services will be offered?] [interviewers will probe for frequency and intensity of services]

    2. Probes: Where are services delivered? [If not yet implementing: Where will services be delivered?] [interviewers will probe for location and mode, such as in-person, virtual, or hybrid]

    3. Note to interviewers: If the intervention is focused on centralizing services, ask whether and how this approach differs from the way they operated before.

  4. Do you use tools for your intervention, such as a curriculum, implementation manual, or worksheets, to help staff implement the intervention consistently? What are they? Who developed the tools? Are they tailored to the population(s) that you serve? What feedback have you received on these tools from staff who use them (Probes: ease of use, relevance, specificity, etc.)?

    1. [If not yet implementing: Do you have tools that you plan to use for your intervention, such as a curriculum, implementation manual, worksheets, etc., to help staff implement the intervention consistently? What are they?]

    2. [If not yet developed: Who will be involved in developing them? Will they be tailored to the population(s) that you serve? What feedback have you received on these tools from staff who will use them? (Probes: ease of use, relevance, specificity, etc.)]

  5. A theory of change or logic model depicts the relationship between your intervention’s resources and services and its intended effects (called outputs and outcomes). It’s important to have these relationships documented, because they demonstrate what your intervention is intended to do and how—in this case, how you intend to enhance economic outcomes and family well-being for people in your community.

    1. Do you have a theory of change or logic model developed?

      1. If so, do you think it captures your intervention activities and expected outcomes well, or could it use some improvement/updating? Does your theory of change or logic model include contextual factors that might affect implementation of the intervention?

      2. If not, do you have initial ideas for what your intervention’s theory of change or logic model would include?

  6. What are some of the outcomes you think are important?

    1. Probes: How many services do you think someone has to complete to see a difference in those outcomes?

Module 2: Implementation strength

Key topics to cover in this module:

  • Does the grant recipient have a well-defined, systematic plan for implementing the intervention? Does the plan address barriers and facilitators to change that might influence the intervention’s ability to achieve the intended outcomes?

  • Does the grant recipient effectively acknowledge or adapt to the external context (or have plans to do so, in the case of interventions not yet being implemented)?

  • If applicable, has the implementation plan been executed as intended? Do implementation strategies show promise in the ability of the grant recipients to sustain the intervention over time?



[Note: Instructions are included in this section on adapting questions for grant recipients not yet implementing their interventions.]

In this set of questions, we’d like to hear about how you implement your intervention, how you address challenges and/or adapt the intervention to your local context, and how you plan to sustain the intervention over time. This helps us understand your organization’s implementation experience with this intervention and the strengths and potential challenges to implementation. [If not yet implementing: In this set of questions, we’d like to hear about how you plan to implement your intervention, including how you plan to address challenges and adapt the intervention to your local context, and how you plan to sustain the intervention over time. This helps us understand the potential strengths and challenges to implementation.]

  1. How would you characterize the context and community in which your intervention operates? How does the context/community influence your intervention’s implementation and operations? [If not yet implementing: How would you characterize the context/community in which your intervention will operate? How do you anticipate the context/community will influence your intervention’s implementation and operations?]

    1. Probes: Think about how the labor market might influence implementation. For grant recipients implementing interventions nationally, probe on differences in context across sites.

  2. How does the intervention aim to meet the needs of people in your area? Are there any changes you would make to better fit their needs? [If not yet implementing: Are there any changes you plan to make to better fit their needs?]

  3. Do you have the necessary resources to implement the intervention well? Do you need to add or change any?

    1. Probes: Think about financial resources, staff/personnel, space, technology, partners.

  4. How are staff who provide intervention services selected? What kind of training and professional development do you provide before they start delivering this intervention? [If not yet implementing: How will you select staff who provide intervention services? What kind of training and professional development will you provide before they start delivering this intervention?]

  5. What ongoing management or monitoring takes place to make sure the intervention is on track? [If not yet implementing: Tell us about your plans for monitoring implementation to make sure the intervention is on track.]

    1. Probe: Do you have one or more data systems that support implementation? How do you use them to monitor implementation? [If not yet implementing: Do you have one or more data systems that will support implementation? How will you use them to monitor implementation?]

    2. Probe: Do you have a standard set of expectations for implementation of your intervention (fidelity standards)? Do you compare implementation to those standards? How (and how often)? [If not yet implementing: Will you compare implementation to those standards? How (and how often)?]

    3. What do you do when you identify implementation challenges? How do you address them? [If not yet implementing: How do you plan to identify and address implementation challenges?]

  6. How do you promote clear and consistent communication about your intervention within your organization [if applicable: and with your partners]? [If not yet implementing: How do you plan to promote clear and consistent communication about your intervention within your organization [if applicable: and with your partners]?]

    1. Probes: What systems or processes do you have in place? (e.g., regular meetings, community town halls) [SKIP if not yet implementing]

  7. Are organizational leaders, community members, key partners, and funders supportive of the intervention? In what ways do they show support? In what ways could support be strengthened?

Module 3: FSSDD grant project readiness for research and evaluation

Key topics to cover in this module:

  • What are the grant recipient’s goals for evaluation? What evaluation activities has the grant recipient conducted in the past and what challenges has it faced?

  • What type(s) of evaluation does the grant recipient want to conduct? Does the grant recipient meet the conditions necessary for the type of evaluation it wants to conduct? In what areas does it need assistance? Is there another type of evaluation that might better suit the grant recipient’s interests and capacity?



In this set of questions, we’d like to hear about your interests in evaluation. We’d like to hear about evaluations you’ve conducted in the past, your priorities for evaluation under the FSSDD grant project, and areas where you think we can provide assistance. This will allow us to tailor our evaluation support for you.

  1. Could you tell me some more about the research and evaluation activities you’ve undertaken so far? Do you have a set of research questions? An evaluation design? Is that something you would like help with? Have you worked with an external evaluator in the past? Are you working with one now?

  2. What has kept you from pursuing additional research and evaluation activities? Is there support for evaluation activities within your organization? Among your partners and the community? If yes, what does that support look like? Do staff have relevant knowledge and skills to conduct or manage research and evaluation activities?

    1. Probes: Lack of time, funding, infrastructure such as data management systems, and support from organizational leaders, community members, program participants

  3. Learning more about the proposed evaluation activities from the grant application: Let’s switch gears to talk about your FSSDD grant project. Tell me some more about your evaluation plans—the evaluation or evaluation activities you proposed pursuing in your grant application for funding. [interviewers should have a good sense of this already from the grant application; use that information and responses here to decide which evaluation design questions to ask (among questions 4 through 8)]

    1. [refer to research questions specified in grant application] The research questions in your application include [list]. Are these questions still your main research questions or have you made any changes to them since you submitted them in your application? How did you come up with them? To what extent did you involve intervention leaders, staff, participants, and community members in developing them?

    2. Do you have a specific type of study design in mind that you’d like to use to answer those research questions?

    3. Are there specific kinds of data you think you would need to gather to answer those research questions?

    4. Do you have an external evaluator you would work with for this evaluation (separate from the Mathematica/TAP evaluation support team)? If not, are you looking for or interested in partnering with an external evaluator?

  4. If formative or rapid-cycle evaluation:

    1. Have you conducted a formative or rapid-cycle evaluation before? (A formative evaluation helps to assess whether an intervention is feasible and identifies elements that might be changed or improved. A rapid-cycle evaluation is a type of formative evaluation that helps to quickly test operational changes to improve outcomes.) How did you go about doing that? Is there anything you’d like to do differently next time you conduct this type of study? Is that something you would like help with?

    2. What is the size of your sample? Is increasing the sample size something you’d like help with?

    3. What data do you think you’ll need?

      1. Note to interviewer: ask questions on data collection approaches, as appropriate (from questions 9, 10, and 11 in this section on survey, qualitative, and administrative data)

    4. Have you drawn conclusions for adaptations or improvements based on your analysis of past formative or rapid-cycle evaluations? Is there anything you’d like to do differently? Is that something you would like help with?

  5. If implementation or process evaluation:

    1. Have you conducted an implementation or process evaluation before? (“Implementation evaluation” and “process evaluation” are terms we use interchangeably for studies that document how an intervention is implemented and whether it was implemented according to its design.) How did you go about doing that? Is there anything you’d like to do differently? Is that something you would like help with?

    2. What data do you think you’ll need?

      1. Note to interviewer: ask questions about data collection approaches as appropriate (from questions 9, 10, and 11 in this section on survey, qualitative, and administrative data)

    3. How will you use the findings from this evaluation? How have you used findings from similar types of evaluations in the past?

  6. If outcomes or descriptive evaluation:

    1. Have you conducted an outcomes or descriptive evaluation before? (An outcomes or descriptive evaluation assesses changes in participants’ outcomes before and after they take part in the intervention.) How did you go about doing that? Is there anything you’d like to do differently? Is that something you would like help with?

    2. What is the size of your sample? Is increasing the sample size something you’d like help with?

    3. What data do you think you’ll need?

      1. Note to interviewer: ask questions about data collection approaches as appropriate (from questions 9, 10, and 11 in this section on survey, qualitative, and administrative data)

    4. How will you use the findings from this evaluation? How have you used findings from similar types of evaluations in the past?

  7. If randomized controlled trial (RCT) evaluation:

    1. Have you conducted an RCT before? (An RCT compares the outcomes of two groups of people—a group randomly assigned to receive the intervention and a group randomly assigned not to receive it.) How did you go about doing that? Is there anything you’d like to do differently? Is that something you would like help with?

    2. Are more participants eligible and interested in your intervention than can be served? If not, do you think you can increase the number of participants interested beyond the maximum number that can be served?

    3. If you randomly assigned participants to either receive your intervention or not, would participants in the two groups receive vastly different services? How so?

    4. Can you think of any challenges that might come up with random assignment?

      1. Do you think it can be done in a way that is not disruptive (or minimally so) to staff and participants?

      2. Do you think staff can make sure the two study groups are offered different services (that is, no one in the comparison group gets intervention services)?

      3. Is random assignment something you’d like help with?

    5. What is the size of your sample? (A sample for an RCT would be the group of people who are randomly assigned to receive or not receive the intervention.) Have you done an analysis to know what sample size you would need to detect an impact of your intervention? Is increasing the sample size something you’d like help with?

    6. Have you consulted with your staff, community members, and your partners about possibly conducting an RCT? Would they be supportive of an RCT?

    7. What data do you think you’ll need?

      1. Note to interviewer: ask questions about data collection approaches as appropriate (from questions 9, 10, and 11 in this section on survey, qualitative, and administrative data)

    8. How will you use the findings from this evaluation? How have you used findings from similar types of evaluations in the past?

  8. If quasi-experimental evaluation:

    1. Have you conducted a quasi-experimental evaluation before? (A quasi-experimental evaluation compares the outcomes of two similar groups of people—one that participated in the intervention and one that did not. Under this type of design, people in the study are not randomly assigned to receive the intervention or not.) How did you go about doing that? Is there anything you’d like to do differently? Is that something you would like help with?

    2. Is there a group of people who are not in your intervention but are similar to people in your intervention that could serve as a comparison group?

      1. Is there a common reason or reasons that people in your potential comparison group do not participate in your intervention?

      2. What services do the people in the potential comparison group receive in the community?

      3. Is finding a comparison group something you’d like help with?

    3. What is the size of your sample? (A sample for a quasi-experimental evaluation would be the group of participants and comparison group members for whom you have data on the outcomes that you want to learn more about.) Is increasing the sample size something you’d like help with?

    4. What data do you think you’ll need?

      1. Note to interviewer: ask questions about data collection approaches as appropriate (from questions 9, 10, and 11 in this section on survey, qualitative, and administrative data)

    5. How will you use the findings from this evaluation? How have you used findings from similar types of evaluations in the past?

  9. If qualitative data collection and analysis (e.g., interviews, focus groups):

    1. Have you developed qualitative data collection instruments, such as one-on-one interview and focus group protocols, before? How did you go about doing that? Is there anything you’d like to do differently? Is that something you would like help with?

    2. Have you coded and analyzed qualitative data before? How did you go about doing that? Is there anything you’d like to do differently? Is that something you would like help with?

  10. If survey data collection and analysis:

    1. Have you developed surveys before (and if yes, what type of survey was it: web, phone, paper, etc.)? How did you go about doing that? Is there anything you’d like to do differently? Do you typically develop questions yourself, use questions from other surveys, or something else? Is that something you’d like help with?

    2. When you develop new questions, how do you do that? Do you test them out on potential respondents to be sure they understand them?

    3. What are your response rates usually like?

    4. Have you analyzed similar data before? How did you go about doing that? Is there anything you’d like to do differently? Is that something you’d like help with?

    5. To what extent have you thought about how these data might reflect disparities or historical underinvestment in certain groups? Is that something you’d like help with thinking through?

  11. If administrative data collection and analysis (e.g., wage data or operations data):

    1. What kinds of things are you trying to measure with these data?

      1. Probes: Getting a sense of people’s starting points (baseline data)? Information about what services they receive while in the intervention? Short- or long-term outcomes?

    2. Can you easily access the data sources you need? If not, why not? Is that something you’d like help with? (For example, we can help put together requests to government entities for administrative data; please note that we cannot help with financial resources to obtain data.)

    3. Have you had or do you think you will have any challenges analyzing the data? What are they?

    4. To what extent have you thought about how these data might reflect historical underinvestment in certain groups? Is that something you’d like help with thinking through and analyzing?







DRAFT 06/06/22
