Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Understanding Child Care Licensing Challenges, Needs, and Use of Data
Formative Data Collections for ACF Research
0970 - 0356
Supporting Statement
Part B
March 2021
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officers:
Ivelisse Martinez Beck, Ph.D.
The objectives of this information collection are to:
• Understand child care licensing administrators’ views of the key issues and challenges for the licensing unit in implementing the provisions of the reauthorized Child Care and Development Block Grant (CCDBG) Act of 2014;
• Understand the extent to which child care licensing administrators use data to make decisions regarding licensing;
• Identify pressing issues facing child care licensing staff that could be addressed via research;
• Develop ideas for data-related resources to help strengthen state/territory licensing systems;
• Identify formats for resources that would be most useful to licensing staff;
• Identify states/territories with which we might partner for future studies as part of the larger project, The Role of Licensing in Early Care and Education (TRLECE); and
• Learn about changes made to the licensing system in response to the COVID-19 pandemic, to inform future research questions and products.
This study is intended to present an internally valid description of child care licensing administrators’ experiences implementing the provisions of the reauthorized CCDBG Act of 2014 and responding to the COVID-19 pandemic, their use of data, and their needs for data-related resources. It is not intended to promote statistical generalization to other sites or service populations.
As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.
We plan to conduct interviews with the child care licensing administrator in each state and territory, as well as the District of Columbia (n = 56), and with a staff member who is knowledgeable about the child care licensing data system in each state, territory, and the District of Columbia (n = 56).
As explained in section A1 of SSA and in section B2 of this supporting statement, based on our prior work with child care licensing, we know that the issues faced by each state and territory are unique. The states and territories vary enormously in how their child care licensing system is organized, which providers are required to be licensed, their licensing regulations and monitoring responsibilities, where the licensing staff and licensing data are housed, and their ability to access and use data. For that reason, it is important to collect data from as many states and territories as possible, rather than from a sample, to fully explore this variation and understand the full landscape of the issues to inform our later project work and the Office of Planning, Research, and Evaluation’s (OPRE’s) research agenda. Qualitative interviews are ideal for situations like this, where the range of possible responses is unknown. Interviews will allow us to gather deep and nuanced information to meet the study objectives.
The information gathered will be purely descriptive. We will not collect evaluation information and the information collected will not be used to assess state systems or evaluate impact. Key limitations will be included in written products associated with this study.
We will attempt to collect data from each state/territory, and the state/territory will serve as the unit of analysis. In each state/territory, we will collect information from the child care licensing administrator and from a staff member knowledgeable about the state’s/territory’s licensing data system.
We plan a census data collection, gathering information from as many states/territories as possible, because of the variability in licensing systems across states and territories, the differential impact of COVID-19, and the importance of understanding each state/territory’s needs in order to develop useful resources to support their decision-making and to identify research priorities for the TRLECE project and for OPRE. For instance, we know that in some states and territories the child care licensing unit is in the same agency as the Child Care and Development Fund (CCDF) administrator, who oversees implementation of the CCDBG Reauthorization that includes licensing-related requirements; in other states, they are in separate agencies. States/territories vary in their requirements about who must be licensed and who is exempt from licensing. Some licensing units have access to a researcher while others do not. Data systems also vary widely, with some states/territories having old data systems that are challenging to link to other data systems and other states/territories having data systems that link information from various sources. Finally, there has been wide variation in how COVID-19 impacted states/territories and how their licensing systems responded, with some states/territories closing almost all licensed child care and others working to keep as much open as possible.
For this study, it is important to collect information from as many states and territories as possible to thoroughly understand this variation, which will help identify pressing research questions and create products that are useful to all states and territories. Selecting a sample of states/territories would not provide a complete picture of the child care licensing system, so requesting participation from all 56 states/territories is necessary.
Initial draft interview protocols were developed based on OPRE’s goals and objectives for the project. The team was careful to review other data sources that might include relevant information (e.g., the 2017 Child Care Licensing Study survey and NARA’s COVID-19 survey) to ensure there was no duplication. Once we had a working draft, we sought input from several groups of individuals and revised the draft after each consultation. The groups that provided input were: 1) a subgroup of the TRLECE technical expert panel (TEP) members, including researchers and national TA center staff who are familiar with current state/territory licensing efforts and who were formerly state licensing agency staff; 2) OCC staff; and 3) four pilot participants, including two former child care licensing administrators and two people knowledgeable about state/territory licensing data.
During pilot tests, we administered the entire protocol to get an estimate of burden. Then, we asked respondents for feedback on the questions (e.g., clarity) and discussed responses that were different from what we anticipated, because those might suggest a lack of clarity in the question. We also asked them for general feedback and suggestions for obtaining a high response rate and used their feedback to update the protocol and email text. From the first pilot interview, we learned that the draft was too long and included too many prompts. We then revised the tool and shared it with a second former state child care licensing administrator to get her input. Following that conversation, we revised the tool once again, engaged additional federal staff within OPRE and OCC, and then finalized the protocol. We then administered it, in full, to the second former state child care licensing administrator to estimate burden.
Throughout pilot testing, we did not ask the same question of more than 9 individuals. None of the pilot test respondents were TEP members who reviewed the earlier version of the interview protocol, nor were they current state licensing staff who will be invited to participate in the study.
Table 1 outlines how each study objective will be addressed in the Licensing Administrator and Data Systems interviews.
Table 1. Crosswalk of Study Objectives and Research Questions
Study Objectives | Licensing Administrator Interview Question Numbers | Data Systems Interview Question Numbers
Understand child care licensing administrators’ views of the key issues and challenges for the licensing unit in implementing the provisions of the reauthorized Child Care and Development Block Grant (CCDBG) Act of 2014 | 9, 28 | n/a
Understand the extent to which child care licensing administrators use data to make decisions regarding licensing | 18, 19, 20, 22, 28 | 11, 12, 13, 14
Identify pressing issues facing child care licensing staff that could be addressed via research | 12, 13, 15, 16, 17, 20, 29 | 8, 14, 18, 19
Develop ideas for data-related resources to help strengthen state/territory licensing systems | 18b, 24, 25, 26 | 11b
Identify formats for resources that would be most useful to licensing staff | 27 | n/a
Identify states/territories with which we might partner for future studies as part of the larger project, The Role of Licensing in Early Care and Education (TRLECE) | 13, 18, 21, 28 | 5, 6, 7, 9, 10, 11, 13
Learn about changes made to the licensing system in response to the COVID-19 pandemic, to inform future research questions and products | 10, 11, 14, 15a, 16a, 17a, 23 | 15, 16, 17
Two members of the contractor study team will take part in each interview: an interviewer and a notetaker.
We will begin recruitment for the interviews within one week of receiving OMB approval. OPRE will request a letter of support from the Office of Child Care to include in the recruitment outreach to licensing administrators. See APPENDIX D for a draft of the letter.
We will request an up-to-date contact list of licensing administrators from the federally funded National Center on Early Childhood Quality Assurance. We will contact each administrator via email to describe the study, the types of questions we will ask, and the amount of time it will take, and to request their participation (see Appendix A). The email will originate from Dr. Kelly Maxwell’s (Child Trends) email address because some of the respondents will know Kelly (personally or by reputation) and will therefore be more likely to open an email from her than from a generic or project email address. To facilitate scheduling, the email will provide a link taking respondents to an online platform (REDCap) where they can select the interview slot that works best for them. That link will also ask state licensing administrators for the name and contact information of the data systems person. The email will also provide respondents a way to contact Child Trends (via email or telephone) if they have questions or want to let us know that they will participate, and it will let them know that we will follow up via telephone and email if they do not respond. Table 2 outlines our plans to reach out to participants four times. The text of the emails and call scripts for the licensing administrator appears in APPENDIX A, and the text for the data systems person appears in APPENDIX B.
When respondents sign up for a time slot, we will also ask for their permission to continue to contact them by email, as required by Child Trends data security policies. State licensing administrators will also be asked for the name and contact information of the data systems person when they sign up for the interview. The scheduling platform will also provide an option for respondents to easily let us know that they do not want to participate so we can stop sending reminders. If they choose to not participate, they will be asked if there is anyone else in their state/territory who could potentially respond to our questions. Licensing administrators who opt out will still be asked for the name of the data systems person.
Table 2. Outreach Plans (see Appendices A and B for text)
Outreach | Timing | Mode
1 | Week 1 - Initial participation request: early in week 1, this email will provide a link to select a time (or opt out) and let them know that we will call them if we do not hear from them. | Email
2 | Week 1 (2 to 3 days after initial request) - First reminder: later in week 1, this email will again provide a link to select a time (or opt out) and let them know that we will call them if we do not hear from them. | Email
3 | Week 2 - Telephone call: during this phone call in week 2, we will give them the option of signing up for a slot during the call, selecting interview times on the scheduling platform, or opting out. | Telephone
4 | Week 3 - Final reminder: in week 3, we will send a final reminder about participation via email. | Email
The scheduling platform link will also allow them to let us know if they do not want to participate so we can stop sending reminders. If they opt out, the scheduling platform will then ask them to provide the name and contact information for someone else in their state who could potentially respond to our questions.
At the beginning of the interview, if the licensing administrator has not previously identified someone familiar with their licensing data system in the scheduling platform, we will ask the licensing administrator for the name and contact information for a staff member who can answer questions related to their licensing data system. We will ask the licensing administrator to let the data systems person know about the interview and that we will be reaching out. To schedule interviews with the data systems person, we will follow the same procedures outlined above for scheduling with the licensing administrator (i.e., four emails/phone calls with a link to the scheduling platform and an option to opt out; See Appendix B).
All interviews will take place via Microsoft Teams. Respondents will have the choice of calling in using a regular telephone number provided by Teams, or joining via the Teams software (with video on or off). All interview questions will be programmed into REDCap, and the notetaker will type notes into REDCap during the interview. Additionally, all interviews will be audio recorded using the Teams software, with respondent permission, so that notetakers can refer to them as needed if the notes are unclear or incomplete. Immediately following each interview, the recordings will be transferred from Teams to REDCap, where they will be securely stored. As soon as they are transferred, the recordings will be erased from Teams. Only permitted study staff will be able to access the interview notes and recordings in REDCap. The recordings will not be transcribed; their purpose is only to clarify notes. All interview data will be linked to participant IDs only. Access to the key that links personally identifiable information to IDs will be limited to study staff who need the information to complete the interviews.
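For illustration only, the sketch below shows the kind of separation described above, in which interview notes are stored under participant IDs and the key linking IDs to personally identifiable information (PII) is kept in a separate, access-restricted location. The file paths, field names, and functions are hypothetical and do not describe the actual REDCap configuration.

```python
# Hypothetical sketch of ID/PII separation; file paths and fields are illustrative only.
import csv
import uuid

def assign_participant_id(name: str, email: str,
                          key_path: str = "restricted/id_key.csv") -> str:
    """Create a participant ID and record the ID-to-PII link in a restricted key file."""
    participant_id = uuid.uuid4().hex[:8]
    with open(key_path, "a", newline="") as key_file:
        csv.writer(key_file).writerow([participant_id, name, email])
    return participant_id

def save_interview_notes(participant_id: str, notes: str,
                         notes_path: str = "shared/interview_notes.csv") -> None:
    """Store interview notes linked to the participant ID only (no PII)."""
    with open(notes_path, "a", newline="") as notes_file:
        csv.writer(notes_file).writerow([participant_id, notes])
```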
How are the data collection activities monitored for quality and consistency (e.g., interviewer training)?
The lead researcher for this activity and the Principal Investigator (PI) for the TRLECE project will train 2-4 additional Child Trends or ICF staff to conduct these interviews. As part of the training, the activity lead and PI will serve as the interviewer for the first several interviews, while an interviewer-in-training takes notes (essentially doing live transcription of responses). After each of these initial interviews, the team will debrief to ensure the interviewer-in-training understood why certain questions were probed. Once we are confident that the interviewer-in-training understands the study goals and interviewing techniques, they will begin conducting interviews with a research assistant taking notes.
While interviews are taking place, Child Trends will hold a weekly meeting with all interviewers (including the lead researcher and Principal Investigator for the TRLECE project) and notetakers to discuss issues that are arising, monitor progress and quality, and ensure consistency.
Our goal is to include as many states and territories as possible, plus the District of Columbia (n = 56). In a previous study that included interviews with CCDF administrators, about 75% of administrators responded. We expect licensing administrators to be especially interested in participating in these interviews to share their experiences with COVID-19 and to discuss issues and needs. Thus, we are planning for participation from 50 of the 56 potential states and territories, and we anticipate that two individuals will join from each participating state or territory, for a total of 100 participants in the Child Care Licensing Administrator interviews. We expect similar participation from data systems staff because they will be identified by licensing administrators, and again we expect that two individuals will join from each participating state or territory, for a total of 100 participants in the Data Systems interviews. The interviews are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion.
Participants will not be randomly sampled, and findings are not intended to be representative, so non-response bias will not be calculated. We will use information from the 2017 Child Care Licensing Survey (see SSA section A4 for a description of that study) to document and report on characteristics of non-responding states (e.g., number of licensed child care facilities, agency that houses child care licensing).
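As an illustration of how such a comparison might be tabulated, the sketch below groups states by response status and summarizes characteristics drawn from the 2017 study; the data file, the placeholder list of responding states, and the column names are assumptions made for this example only.

```python
# Hypothetical sketch; the data file and column names are illustrative only.
import pandas as pd

states = pd.read_csv("licensing_study_2017.csv")       # one row per state/territory
responding = {"AL", "AK", "AZ"}                         # placeholder set of responding states
states["responded"] = states["state"].isin(responding)

summary = states.groupby("responded").agg(
    n_states=("state", "count"),
    median_licensed_facilities=("n_licensed_facilities", "median"),
)
agency_mix = pd.crosstab(states["responded"], states["licensing_agency"])
print(summary, agency_mix, sep="\n\n")
```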
The information will not be used to generate population estimates, either for internal use or dissemination, and no policy decisions will be made based on these data. The information will describe the range of issues faced across state/territory licensing administrators and their most pressing needs for research and data.
All questions in the semi-structured interviews will be open-ended. We will code responses using qualitative analysis software (i.e., Dedoose), and 3-5 coders will identify and summarize themes. As described in section A4 of SSA, these interviews will build on information gathered as part of the 2017 Child Care Licensing Survey and the recent COVID-19 survey from the National Association for Regulatory Administration (NARA). During analysis, we will refer to those data sets for context and a high-level understanding of each state’s licensing system as we interpret the findings from this study.
Once we have completed half of the interviews, we will randomly select five to review to help us begin creating the coding structure for the open-ended questions. This approach will balance the need to start analysis quickly so that the information can be summarized in a timely fashion, with the need to ensure that the coding structure is based on a representative batch of interviews (i.e., not simply the first few). Determining the coding structure early on will also benefit interrater reliability for the coding team. The coding structure will be revised throughout the coding process to ensure that it captures themes that emerge as interviews continue. During this first phase, the two coders most knowledgeable about the topic will independently review the interview notes and make notes on themes and patterns. Coders will meet, along with the activity lead and/or PI, to discuss the possible themes and develop a coding structure for the interviews. The interview notes from these five will then be independently coded by the same two coders to confirm that the coding structure is distinctive and coherent. If the two coders disagree on several codes, they will meet with the activity lead and/or PI to refine the coding structure. If significant changes are made to the coding structure based on the reconciliation of the coding of the first five interviews, the team will randomly select up to five more interview notes to review, code, and discuss.
Once the coding structure has been developed, interrater reliability will be established among the full coding team, and the coding team will independently review and code all of the interview notes, referring to audio recordings as needed. The coders will meet at regular intervals to confirm agreement, and the activity lead or PI will serve as a “tie breaker” for consensus when agreement cannot be achieved. Instances of differing opinion, as well as the resolution, will be documented. During regular meetings with coders, the team will also discuss ideas for new codes based on their review of interviews. The team will add new codes and review previously coded interviews as needed. Within two months of completing data collection, the team will prepare a draft memo that describes the preliminary findings and tabulations, organized by the research questions and topics.
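The study does not specify a particular agreement statistic; as one illustration only, agreement between two coders on double-coded excerpts could be checked with Cohen's kappa, as sketched below with made-up codes.

```python
# Illustrative agreement check between two coders; the codes are made up,
# and the use of Cohen's kappa is an assumption, not a study requirement.
from sklearn.metrics import cohen_kappa_score

coder_a = ["data_use", "covid_response", "staffing", "data_use", "resources"]
coder_b = ["data_use", "covid_response", "staffing", "resources", "resources"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa across double-coded excerpts: {kappa:.2f}")
```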
We will analyze state and territory data separately to investigate whether the needs and challenges differ between the two groups. We also propose analyzing information across the following subgroups of states: states with a large versus small number of licensed providers; states with a higher versus lower proportion of children under age 13 who receive a child care subsidy; and states with different governance structures (e.g., licensing co-located with CCDF or subsidy administration; state- or county-administered CCDF).
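For illustration, the sketch below shows one way coded interview summaries could be compared across the planned subgroups; the file name, column names, and theme variable are hypothetical, and the actual groupings would be defined during analysis.

```python
# Hypothetical subgroup comparison; file and column names are illustrative only.
import pandas as pd

df = pd.read_csv("coded_interview_summaries.csv")   # one row per state/territory

# Split states at the median number of licensed providers
df["provider_group"] = pd.qcut(df["n_licensed_providers"], q=2, labels=["small", "large"])

# Share of states in each subgroup where a given theme was coded (1 = present, 0 = absent)
by_size = df.groupby("provider_group")["theme_data_linkage_challenges"].mean()
by_governance = df.groupby("colocated_with_ccdf")["theme_data_linkage_challenges"].mean()
print(by_size, by_governance, sep="\n\n")
```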
The data will be used to create an internal report for OPRE and its federal partners. The report will include prominent themes that emerged from open-ended questions, as well as considerations or suggestions from the Child Trends/ICF team based on their interpretation of the data. The report will also describe findings in light of variation in states (e.g., co-location of licensing with other ECE services).
Although the entire report will not be made public, it may be used to inform other future efforts, such as TRLECE or ACF research design documents, background materials for technical work groups, conceptual frameworks, and to contextualize research findings from follow-up data collections that have full PRA approval. We may also produce an informational report for OCC and its TA providers to support federally-funded TA. In sharing findings in these other contexts, we will describe the study methods and limitations with regard to generalizability and as a basis for policy.
There are three main benefits of this data collection:
• Improve TRLECE and OPRE’s future research and data collection efforts to ensure they address states’ and territories’ most pressing questions related to child care licensing.
• Ensure that research-based products that result from TRLECE and OPRE meet states’ and territories’ needs.
• Identify states that could partner with TRLECE for future research studies on child care licensing.
B8. Contact Person(s)
TRLECE PI
Co-Director for Early Childhood Research
Child Trends
1516 E Franklin St, Suite 205 | Chapel Hill, NC 27514
(919) 869-3251
INSTRUMENT A: TRLECE Licensing State Administrator Interview Protocol
INSTRUMENT B: TRLECE Data Systems Staff Interview Protocol
APPENDIX A: TRLECE Recruitment Email and Call Scripts for Licensing State Administrator
APPENDIX B: TRLECE Recruitment Email and Call Scripts for Data Systems Staff
APPENDIX C: TRLECE IRB Exemption Letter
APPENDIX D: TRLECE Draft letter of support from Office of Child Care (OCC)