Feedback on ECE-RISE Pilot Project

Fast Track Generic Clearance for Collection of Qualitative Feedback on Agency Service Delivery

Interview Instrument 1. Interview for Project Operations Team

OMB: 0970-0401



Interview Instrument 1. Interview for Project Operations Team

Thank you for agreeing to speak with us today. My name is [NAME], and I’m joined by my colleague, [NAME]. We’re from the Urban Institute, a nonprofit social and economic policy research organization in Washington, DC.

1. (Who is leading the research and funding it) In partnership with Mathematica, we are leading the Child Care Evaluation and Capacity Building Center under a federal contract with the Administration for Children and Families (ACF) in the U.S. Department of Health and Human Services.

2. As part of the larger project, we are conducting a process evaluation to objectively document the benefits and challenges of the ECE-RISE capacity building pilot. The results of the process evaluation will provide ACF with information that can improve future capacity building efforts, enable CCDF Lead Agencies to contribute more to ACF evidence-building efforts, and improve their ability to seek evidence as they make child care policy and operational decisions.

3. (Requirements) Over the next hour and a half, we will be asking you a series of questions designed to gather in-depth information regarding the development, application process, implementation/participation, and outcomes of ECE-RISE. You may not know the answer to every question, and that is fine. If there are any questions that you don’t feel knowledgeable about or don’t feel comfortable answering, just let us know and we will move on. You can also let us know names and affiliations of other people who may be more informed on those topics, and we can follow up with them.

4. (Voluntary) I’ll remind you that this interview is voluntary. There will be no consequences if you decline or stop the interview. If you need to take a break at any time, please let us know.

5. (Consent to record) We’ll take notes during our discussion, but if it’s okay with you, we would also like to record this interview to help fill in our notes. If you would like me to stop recording at any time during the interview, please let me know. We will delete the recording once our analysis is complete.

6. (What we will do with the data) The information we gather during your interview will be paired with what we hear in other interviews and with other information from a document review. Ultimately, we will create a memo that will be shared only with our federal project officers at the Office of Planning, Research, and Evaluation (OPRE), which is the research arm of ACF.

7. (Privacy) We ask that you participate in a private setting, out of earshot and view of unauthorized persons, including family members. Please understand that, given the technical limitations of Zoom and similar internet platforms, we cannot guarantee the privacy of what is said. [If group interview: While we will maintain the privacy of what is said, we cannot control what other participants may say outside of the interview.] We will not identify you by name in our products. If we quote you in our study products or describe something you shared, we will never use your name and will attempt to attribute the quote in a way that does not identify you. However, because of your role on the ECE-RISE project and your regular interactions with some ACF staff, they may figure out who you are from the remarks that we report. Therefore, we cannot guarantee complete privacy of your participation or the views you express.

8. (Risks and benefits) If you share something about a challenge you experienced that is sensitive, there could be a risk of reputational harm because people might be able to identify you due to your role on the ECE-RISE project. Sharing challenges helps other people learn, but you have to decide how comfortable you are with what you share. You can ask us not to include some parts of the information in the memo that will be delivered to OPRE. Otherwise, there are minimal risks to participation in this interview. While there are no direct benefits to you, you may benefit from knowing your experiences with ECE-RISE could be used to inform and improve future ACF capacity building efforts.



Do you have any questions for me about the study?

Do we have your consent to proceed with our interview?

Do we have your permission to record?


[If the interviewee says yes] Thank you. If you are ready, I will start recording now.

[If the interviewee says no, the research assistant will be prepared to take close-to-verbatim notes.]



Background

I’d like to begin by learning about the work you did as part of the ECE-RISE project team.

  1. Can you please describe your role on the ECE-RISE project team?

    1. [Probe if not mentioned] When did you join the ECE-RISE project team? What were your primary tasks?

Pilot Project and Application Development

My next set of questions will be about the development of the ECE-RISE program, including development of the application and selection criteria.

  1. To start, why don't you walk me through the development of ECE-RISE? How was the program developed?

    1. What concepts, theories, data, or existing models contributed to the development of the program?

    2. Who was involved?

    3. In what ways, if any, did COVID-19 influence program development?

  2. What are the primary goals of ECE-RISE?

    1. Please describe the characteristics or needs of the agencies that were intended to be the focal agencies.

  3. Please describe the main components and activities of ECE-RISE as they were intended to occur.

    1. In what ways do each of these components and activities support the program’s goals?

  4. What was the expected level of effort for lead agency participation in the ECE-RISE program?

    1. Who, in terms of roles or types of positions, was expected to participate?

  5. How was the application and selection process developed?

    1. What concepts or requirements guided the development of the process?

    2. What were the key questions that the team intended to answer through the process?

  6. What were the site selection criteria and how were those criteria developed?

  7. What was the expected level of effort for agencies to complete the application?

Application Process

Next, I’d like to learn about the application and selection process.

  1. Please walk me through the application and selection process.

    1. Who was involved in reviewing applications? Who was involved in the selection decision?

    2. Did project team members have different perspectives on which agencies should be selected? If so, how were differences resolved?

    3. To what extent was the process carried out as planned? If changes were made, what changes were they and why?

  2. Why do you think only three lead agencies applied to participate in ECE-RISE?

    1. [Possible probes] Application process too cumbersome? Research and evaluation a low priority for agencies? Lack of agency capacity? Post-pandemic effects?

  3. From your perspective, what were the strengths of the application and selection process? What worked well?

  4. We know there was only funding to support the participation of two of the three agencies that applied. What are some of the key reasons that the Texas Workforce Commission (TWC) and the Mississippi Division for Early Childhood Care and Development (MDHS-DECCD) were selected and the Cowlitz Tribe was not selected to participate in ECE-RISE?

    1. How was this decision communicated to the Tribe, and what did you perceive their reaction to be?

  5. We know one of the selected agencies (MDHS-DECCD) chose to drop out of the ECE-RISE program before the official kickoff. What is your understanding of why they made this choice?

    1. From your perspective, was this something that the selection process could or should have predicted? If yes, why do you think it was missed or what should be changed? If no, why not?

Implementation of and Participation in the Project

My next set of questions is about how ECE-RISE was implemented with the Texas Workforce Commission, including things such as what activities the Lead Agency participated in, changes to the program, and the ECE-RISE team’s level of effort.

  1. Please describe the key components of ECE-RISE as carried out with the Texas Workforce Commission (TWC).

    1. What was the sequencing of activities and how was that sequencing determined?

      1. If we were to divide the implementation stage into roughly three time periods – start-up, primary implementation, and wind-down – would that make sense to you? If no, how would you suggest segmenting the implementation to reflect how TWC may have experienced it?

    2. What were the responsibilities of the ECE-RISE team in developing and carrying out the activities?

    3. What were the responsibilities of TWC for engaging in the activities?

  2. We know that ECE-RISE did not provide any funding to TWC. Since there was no funding contract, what was used to establish level-of-effort or participation expectations?

  3. To what extent was ECE-RISE implemented as envisioned?

    1. What changes, if any, were made to ECE-RISE’s implementation and why?

      1. [Probe if not mentioned] We know that the original design of ECE-RISE included peer learning activities in which the two participating sites would help each other.

        1. What changes did you make to adjust for having just one team?

        2. What were the pros and cons of having just one team?

      2. [Probe on other changes in implementation that are found during document review]

  4. What activities did TWC engage in and how often?

    1. To what extent was TWC involved in shaping the types, frequency, or sequencing of activities?

    2. What activities, if any, did TWC engage in differently than originally intended?

    3. Do you think the level of effort required of TWC was more or less than expected? Why do you say that?

    4. [Probe on specific activities or changes found during document review]

  5. What activities did the ECE-RISE project team engage in and how often? Please include both activities directly engaging with TWC and support activities that TWC may or may not be aware of; think about what you would communicate to someone preparing to replicate the program.

    1. How many ECE-RISE team members were regularly involved?

    2. What was the approximate level of effort for engagement?

    3. Were adjustments made to the level of effort of the ECE-RISE team at any point? If so, what kind of change was made and why was it made?

    4. [Probe on specific activities or changes found during document review]

  6. Were any other people engaged in supporting the work (beyond the ECE-RISE and TWC teams)? If yes, why and what did they do?

    1. [Probe on engagement of others found during document review if not mentioned]

Process Outcomes

Next, I’d like to learn more about your perspective on the extent to which TWC was able to meet its goals and grow its research and evaluation capacity through the ECE-RISE program.

  1. What were TWC’s goals for participating in ECE-RISE?

  2. From your perspective, in what ways did TWC meet their goals for the project? In what ways did they not meet their goals?

    1. Do you think the level of support the team provided to TWC was appropriate to help TWC achieve its goals? Why do you say that?

    2. From your perspective, what are some other things they learned or accomplished that were not part of their original goals?

  3. What were the key strengths of TWC that supported them in accomplishing their project goals?

    1. To what extent do you think the application process effectively identified these strengths? Why do you say that?

  4. What challenges or barriers made it difficult for TWC to accomplish their project goals?

    1. To what extent do you think the application process effectively identified these challenges and barriers? Why do you say that?

  5. In what ways has ECE-RISE increased TWC's research and evaluation capacity?

    1. What do you see as the evidence of that increased capacity?

      1. [Possible probes] changes to data collection systems, new data analysis personnel or tools, knowledge and skills related to using and interpreting data, changes in how data insights are reported or presented, knowledge of evaluation designs

      2. [Include as probes capacity indicators found during document review]

    2. What barriers remain that limit TWC’s capacity? Are these barriers persistent or have they changed over time?

      1. [Possible probes] staff turnover or vacancies, statutory/policy requirements or limits, lack of leadership support, insufficient funding

      2. [Include as probes barriers found during document review]

Lessons Learned

Finally, I’d like to understand your perspective on what worked and lessons learned from the ECE-RISE pilot.

  1. Overall, what do you think ECE-RISE did really well?

    1. What do you think didn’t work as well?

  2. If ACF were to implement a second ECE-RISE cohort or a program with similar goals, what changes would you recommend to the application process, selection process, or the program itself?

    1. What would you recommend keeping the same?

    2. [Possible probes if not mentioned] cohort size, types of activities, intensity of activities/participation, length of time for accomplishing the goal, resources available (time and dollars), specialty areas of the people providing the supports (e.g., researchers)

  3. What are some broader lessons learned about measuring and supporting research and evaluation capacity that you think might benefit future studies of, or projects to support, research and evaluation capacity?

  4. Is there anything we didn’t ask that you’d like to tell us about ECE-RISE’s development, implementation, or lessons learned?

Thank you for sharing your insights and experiences with us. We appreciate your time.

PAPERWORK REDUCTION ACT OF 1995 (Pub. L. 104-13) STATEMENT OF PUBLIC BURDEN: The purpose of this information collection is to help the government understand the benefits and challenges of a research and evaluation capacity building pilot and will be used to improve future capacity-building projects. Public reporting burden for this collection of information is estimated to average 90 minutes per respondent, including the time for reviewing instructions, gathering and maintaining the data needed, and reviewing the collection of information. This is a voluntary collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information subject to the requirements of the Paperwork Reduction Act of 1995, unless it displays a currently valid OMB control number. The OMB # is 0970-0401 and the expiration date is 06/30/2024. If you have any comments on this collection of information, please contact Teresa Derrick-Mills at [email protected].
