Case Studies of Child Care and Early Education Supply-Building and Sustainability Efforts

Formative Data Collections for ACF Research

Instrument 4. Data and Evaluation Staff Interview Protocol


OMB Control No.: 0970-0356

Expiration Date: 01/31/2027

Length of time for interview: 60 minutes


INSTRUMENT 4: DATA AND EVALUATION STAFF INTERVIEW PROTOCOL


The CCEE Supply Building research team will use this protocol to conduct interviews with the key staff member(s) in charge of collecting data and/or leading or overseeing evaluations of the strategies. These staff may work for the CCDF Lead Agency or a partner organization (such as an external research and evaluation team). Interviews may be conducted one-on-one or jointly.  

This protocol is a guide, not a script. All respondents may not be asked all questions. Interviewers will tailor questions to the specific strategies and roles and responsibilities of the respondents. Interviewers will add probes to further explore the responses provided. 


(Note: In the following section, the interviewer will not read aloud the words in parentheses.)


(Introduction) Thank you for agreeing to speak with us today. My name is [NAME], and I’m joined by my colleague, [NAME]. We’re from the Urban Institute, a nonprofit, nonpartisan research organization based in Washington, DC.


(Who is leading the research and funding it) The Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) contracted with the Urban Institute to lead a project called “Understanding Supply-Building and Sustainability Efforts of the Child Care and Early Education Market.”


(Purpose) As part of the larger project, we are conducting case studies in several states of supply-building or sustainability strategies we identified through a web scan or survey. The goal of the case studies is to document and share information about the strategies and to use that information to inform recommendations for future research on child care and early education supply-building and sustainability strategies.


(Requirements) Over the next hour, we will be asking you a series of questions designed to gather information regarding a strategy your state is implementing so we can learn more. You may not know the answer to every question, and that is fine. If there are any questions that you don’t feel knowledgeable about or don’t feel comfortable answering, just let us know and we will move on.


(Voluntary) This interview is voluntary. There will be no consequences if you decline or stop the interview. If you need to take a break at any time, please let us know.



Public reporting burden for this collection of information is estimated to average 60 minutes per response. This information collection is voluntary. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid Office of Management and Budget (OMB) control number. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to: Urban Institute, 500 L’Enfant Plaza SW, Washington, DC 20037.








(Consent to record) We’ll take notes during our discussion, but if it’s okay with you, we would also like to record this interview to help fill in our notes. If you would like me to stop recording at any time during the interview, please let me know. We will delete the recording once our analysis is complete.


(What we will do with the data) We will share the information that we gather during the interview with our federal project officers at the Office of Planning, Research, and Evaluation, which is the research arm of the Administration for Children and Families (we will reference the Administration for Children and Families as ACF going forward), as well as with staff at the federal Office of Child Care within ACF. Following the case studies, we will prepare and share with ACF written memos describing each strategy and our findings from interviews and focus groups. We may also use the information to develop a public report or brief that summarizes what we learned across all of the states that participate in case studies.


(Privacy) Importantly, we will identify your state in our memorandum to ACF. We will not identify you by name, but they will know we spoke with key staff involved in collecting data and/or leading or overseeing evaluations of the strategies. If we produce a report or brief that is available to the public, we will keep your identity and the identity of all individuals we interview private. In addition, we will name the states in the report or brief, but none of the information presented will be attributed to a particular state. Rather, we will describe themes across the states and strategies that participate in the case studies.


(Risks and benefits) There are no anticipated personal risks or benefits to participating in this research.


(OMB statement) An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. The OMB number and expiration date for this collection are OMB #: 0970-0356, Exp: 01/31/2027.


(Virtual interview/Zoom statement) We ask that you participate from a private setting, out of earshot and view of unauthorized persons, including family members. Please also understand that, given the technical limitations of Zoom and similar internet platforms, we cannot guarantee the confidentiality of what is said.


    • Do you have any questions?

    • Do we have your consent to proceed with our interview?

    • Do we have your permission to record?

(If the interviewee says yes) Thank you. If you are ready, I will start recording now.

(If the interviewee says no, the research assistant will be prepared to take close-to-verbatim notes.)


We know your state is implementing various strategies to build and sustain the supply of child care and education. In our interview today, we’d like to focus specifically on [NAME OF SELECTED STRATEGY].  

Before we begin, let me provide you with a brief ‘roadmap’ of what we’ll cover during the interview. We’ll start with questions about your background. Next, we will discuss data collection and whether you have a plan to collect and analyze information about the strategy's activities. Then, we will discuss your successes, challenges, and lessons learned implementing the strategy. Sound good?


Interviewee Background 

Let’s start with a couple background questions about you.  

  1. Please tell us about your current role, your responsibilities related to [STRATEGY], and how long you’ve been in this position. What role, if any, did you have in planning [STRATEGY]? What role, if any, have you played in implementing or monitoring implementation of [STRATEGY]?  


Data Collection and Use

My next set of questions is about information you collect about [STRATEGY] and whether you have or plan to evaluate [STRATEGY]. In the context of this interview, I am using the term “evaluation” to refer to a systematic process for collecting and analyzing information about the strategy's activities, characteristics, and outcomes.  

  1. To start, please describe your approach to collecting data on [STRATEGY]. At a high level, what types of information do you collect? What are the goals or objectives of these efforts?

    • Do you collect data about implementation of [STRATEGY]? For example, number of participants, characteristics of participants, service receipt or use of funds (if relevant), participant satisfaction, consistency of implementation or service delivery (if relevant).

    • Do you collect data about the intended outcomes of [STRATEGY]? For example, increase in [INSERT DESCRIPTION OF OUTCOMES IDENTIFIED IN OTHER INTERVIEWS].

  2. What kinds of information [are/were] being collected about the implementation of [STRATEGY]?

(Probe on whether data are collected on:

        1. Number of participants/take-up

        2. Information on characteristics of participants

        3. Provider satisfaction

        4. Parent satisfaction

        5. Intermediary organization satisfaction

        6. Consistency of implementation

        7. Expenditures

        8. Other).

  3. What kinds of information [are/were] collected to assess progress towards outcomes?

(Probe on whether data are collected on:

        1. Number of/increase in slots

        2. Number of/increase in slots for [reserved, priority populations]

        3. Number of/increase in providers

        4. Program quality/increase in slots in quality programs

        5. Other).

  4. What methods [are/were] used to collect information about implementation? What methods [are/were] used to collect information about outcomes?

(Probe on whether the agency collects progress reports; program administrative data; or surveys, interviews, and/or focus groups with administrators, agencies, other individuals involved in implementation, child care providers, early educators, and/or staff participating or involved in implementation, and families involved or who could be affected).

          • Why were these chosen?

(Probe on ease of data access, prioritization of hearing directly from providers/parents, legislative/regulatory requirement, data staff availability).


[FOR METHODS IDENTIFIED IN QUESTION 4 (PRIORITIZING ADMIN DATA FIRST, THEN QUANTITATIVE/QUALITATIVE DATA COLLECTED, AND PROGRESS REPORTS AS LOWEST PRIORITY) ASK THE FOLLOWING QUESTIONS]:

Next, I have a series of questions about the data you collect. I am going to ask the same series of questions about [METHODS] you described above. If we have time, we will also talk about [ADDITIONAL METHODS]. If you have documentation about the data that you are able to share with us, we can use that instead of discussing it now. Let me go over the list of questions and then please let me know if you are able to or would prefer to share documentation about the data with us rather than discussing it now.

[READ THE SERIES OF QUESTIONS AND THEN ASK WHETHER THE PARTICIPANT IS ABLE TO PROVIDE DOCUMENTATION ABOUT ANY DATA THAT CAN ADDRESS THE QUESTIONS. IF SO, DO NOT ASK THE QUESTIONS ABOUT THAT DATASET/METHOD].

  1. What timeframes do these data cover? Do they include baseline data, that is, data from before [STRATEGY] was implemented?

  2. How often are these data collected?

  3. Is documentation available about the data? If so, what is available?

(Probe on whether they have a data dictionary or other types of documentation available about variables, etc.).

  4. What is the unit of observation?

(Probe on whether data are available at the provider, family, household, or child level; or aggregated above original unit of observation).

  5. What is the sample size or number of records?

  6. What is the response rate/share of observations covered by the dataset?

(Probe on whether data cover all providers/families who are impacted by the strategy, whether the sample was randomly selected, or whether there may be some non-respondents or providers/families who are not observed in the data).

  7. Where are the data stored?

  8. Are data publicly available? Who/which organizations/types of individuals has/have access to the data? How does the state decide who/which organizations/types of individuals can access the data?

  9. How are data used and for what purpose? What, if any, analysis is conducted? Which agency or organization leads the analysis?

(Probe on whether data are used for reporting purposes to funders, for monitoring and/or to inform program improvement efforts, to inform ongoing planning, ongoing decisions about funding, and so on).


[ASK ONLY ONCE]:

  10. Has [STRATEGY] been evaluated or are there plans to evaluate it?

(If yes):

    1. Which agency or organization [is leading/led/will lead] the evaluation?

    2. What [does/did/will] the evaluation focus on?

    3. Are there any publications (internal or external) summarizing findings? Are you able to share them with us? [IF SO REQUEST COPIES/LINKS/ETC. TO REPORTS].


Lessons Learned

My last set of questions is about what has worked well and what challenges you have faced, what you have learned about [STRATEGY], and what you would like to learn about it in the future.  

  1. Thinking about your data collection and evaluation efforts to date, what has worked well? What has been most successful?  

  2. What issues or challenges have you faced?  

  3. What, if anything, would you have done differently? 

  4. What have you learned from the information collected and/or from the evaluation(s) about how [STRATEGY] is working?  

  5. Are there other questions or topics you would like to explore with the data but haven’t been able to? If so, what would you like to explore?  

  6. Are there topics you would like to learn about [STRATEGY]/questions you would like answered but don’t have the data to answer? If so, what are they?

  7. What advice about data collection and evaluation would you give another state interested in implementing [STRATEGY]?


Wrap Up 

Great! Those were all my questions. Is there anything else you’d like to share about your experiences that we didn’t discuss? 

Understanding the Supply Building and Sustainability Efforts of the Child Care and Early Education Market OMB Instruments: Case Studies of Supply Building and Sustainability Strategies


File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: DelGrosso, Tricia
File Created: 2025-05-29