
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Early Head Start-Child Care Partnerships Sustainability Study



OMB Information Collection Request

0970-0471





Supporting Statement

Part B



July 2021











Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers: Christine Fortunato, Amy Madigan, Sarah Blankenship, and Jenessa Malin


Part B

B1. Objectives

Study Objectives

The Early Head Start–Child Care Partnerships Sustainability Study (“the Sustainability Study”) is designed to describe whether and how partnerships between Early Head Start (EHS) programs and child care providers (referred to as “partnerships”) have been sustained and to identify which features of the partnerships support sustainability. Specifically, the study will examine how partnerships from the 2016 National Descriptive Study (NDS) of Early Head Start-Child Care Partnerships (EHS-CCPs) are faring and describe the characteristics of current partnerships (including those formed since the NDS was fielded, regardless of whether they are funded through an EHS-CCP grant).

Generalizability of Results

This study has two components: web-based surveys and semi-structured interviews. EHS program director and child care provider responses to the web-based surveys will be weighted to be nationally representative of all EHS program directors and child care providers funded through the first wave of EHS-CCP grants in 2015. The interviews are intended to present internally valid descriptions of child care providers whose partnerships with EHS programs have dissolved or been sustained over time, not to promote statistical generalization to other sites or populations.

Appropriateness of Study Design and Methods for Planned Uses

As noted in Supporting Statement A (SSA), this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.

B2. Methods and Design

Target Population

The target population for the web-based surveys is the universe of EHS programs that received 2015 EHS-CCP grants and a sample of child care providers. We will survey the universe of programs so that, combined with the NDS (which also surveyed the universe of programs), completions at two time points will support longitudinal analyses. As described in more detail below, child care providers selected for interviews will be chosen to illustrate partnerships that have been dissolved or sustained over time and are not meant to be representative of the population of child care providers.

Sampling

Web-based surveys. We will conduct a web-based survey of EHS program directors, with the universe of directors or director delegates from the 250 EHS-CCP grants awarded in 2015 as participants. From those programs, we expect 335 directors to survey, because some programs with delegates will require multiple respondents; however, our analysis will be conducted at the program level (n = 250). Of the 335 directors, we expect that 220 will respond.

The survey will ask directors about partnerships, and we will use this information to determine whether each child care provider (including both center-based providers and family child care providers) should receive the survey for providers in sustained partnerships or the survey for providers in dissolved partnerships (herein, sustained partnership providers and dissolved partnership providers). We will invite providers to participate in the survey on a rolling basis after we learn from the EHS program directors which type of survey each provider should receive. This web-based survey will include the universe of providers sampled for the NDS; we will ask all providers who were selected to participate in the NDS to participate in our survey.

The NDS, which surveyed child care providers from February through November 2016, had an 82 percent response rate from providers and an 88 percent response rate from EHS program directors. Providers were randomly sampled from a list of child care partners provided by EHS program directors. This resulted in a sample of 470 child care providers (302 center-based providers and 168 family child care providers). The sampling was done separately by provider type to ensure a robust sample of family child care providers, as most EHS programs had only center-based provider partners. Of the 470 providers asked to participate in the survey, 386 completed the survey (255 center-based providers and 131 family child care providers). Information from the NDS is nationally representative of partnership providers when weighted. Given that the response rate was over 80 percent, a nonresponse bias analysis was not conducted, in accordance with the Office of Management and Budget’s guidelines.

Our expected sample size (Table B.2) provides sufficient power to detect effects.1

  • For the survey of EHS program directors, a sample size of about 220 programs will enable us to detect changes of 0.13 standard deviations or larger over time. To put this minimum detectable change into context, on average, EHS programs reported having 7.6 partners in the 2016 survey, with a standard deviation of 8.8. With an expected sample size of 220 programs, we could detect a change of 1.2 partnership providers.

  • For child care providers in sustained partnerships, a sample size of 271 is large enough to detect changes of 0.12 standard deviations or larger over time. For dissolved partnership providers, a sample size of 105 is large enough to detect changes of 0.19 standard deviations or larger over time. In 2016, child care providers reported a total enrollment of children ages 3 or younger that was, on average, 21.9 children, with a standard deviation of 23.1. With an expected sample size of 271 sustained partnership providers, we could detect a change in enrollment of 2.8 children for this group. With an expected sample size of 105 dissolved partnership providers, we could detect a change in enrollment of 4.5 children for this group.

  • We will also analyze subgroup differences in point-in-time estimates by using t-tests to compare average responses across subgroups, including sustained versus dissolved partnership providers and family child care providers versus center-based providers. We could compare family child care and center-based providers within both sustained and dissolved partnership providers. For example, we could examine whether the number of services that these providers are currently offering differs. With an expected sample size of 376 child care providers, we expect 154 (36 percent) to be family child care providers (Del Grosso et al. 2018). Therefore, we could detect differences of 0.31 standard deviations or larger in the number of services.2

  • We could also compare sustained versus dissolved partnership providers. For example, we could examine differences in child care providers’ reports of partnership quality, based on whether the partnership was sustained or dissolved. A sample size of 376 child care providers would enable us to detect differences of 0.32 standard deviations or larger (the sketch following this list illustrates the calculation).
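To illustrate how these minimum detectable differences relate to the expected sample sizes, the sketch below applies a standard two-group minimum detectable effect formula. It is a simplified calculation under assumed conditions (80 percent power, a two-sided test at the 5 percent level, and no design effects from weighting or finite population correction), so its results approximate, but do not exactly reproduce, the figures cited above.

```python
# Minimal sketch: approximate minimum detectable effect (MDE), in standard deviation
# units, for comparing two subgroups. Assumes 80 percent power, a two-sided test at
# alpha = 0.05, and simple random sampling (no design effects or finite population
# correction), so results only approximate the study's reported values.
from scipy.stats import norm

def two_group_mde(n1, n2, alpha=0.05, power=0.80):
    """MDE in effect-size units for a difference between two group means."""
    z_alpha = norm.ppf(1 - alpha / 2)   # ~1.96
    z_power = norm.ppf(power)           # ~0.84
    return (z_alpha + z_power) * (1 / n1 + 1 / n2) ** 0.5

# Family child care vs. center-based providers (154 vs. 222 expected completes)
print(round(two_group_mde(154, 222), 2))   # 0.29, close to the 0.31 cited above

# Sustained vs. dissolved partnership providers (271 vs. 105 expected completes)
print(round(two_group_mde(271, 105), 2))   # 0.32, matching the value cited above
```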

Semi-structured interviews. The sampling frame for semi-structured interviews with child care providers whose partnerships have dissolved will be providers who respond to the dissolved partnership provider survey. The sampling frame for interviews with those in sustained partnerships will be providers who respond to the sustained partnership provider survey. For both sets of interviews, we will use a nonprobability, purposive sample from the survey to identify potential respondents. This type of sampling will not result in a sample that is representative of the population of child care providers in dissolved or sustained partnerships. Rather, we seek to identify a sample with variation on key characteristics.

Respondents for the semi-structured interviews will be selected using data from the EHS program director survey and the provider surveys. We will select providers from program director surveys that are completed early in the recruitment process. We will pair each selected provider with a backup. We will interview 48 providers in dissolved partnerships to allow for variation in provider type and in the proportion of infant/toddler slots that were funded through the partnership at the time of dissolution (we will collect this information from the dissolved partner provider survey). We will interview 24 providers in sustained partnerships to allow for variation in provider type and proportion of infant and toddler slots funded through the partnership. We will interview more providers in dissolved partnerships than in sustained partnerships because we expect a lower rate of survey response for those in dissolved partnerships and want to ensure the experiences and perspectives of dissolved partnership providers are reflected. For both the dissolved and sustained partnership providers we recruit, we will select enough center-based providers and family child care providers such that we can complete interviews with an equal number of each type of provider.

B3. Design of Data Collection Instruments

Development of Data Collection Instrument

Web-based surveys. Before measurement planning, we developed the five research questions listed in SSA Section A2. We then specified the constructs and subconstructs needed to answer each research question. Because of the longitudinal nature of our research questions and design, we began by mapping items from the NDS to these constructs and subconstructs. To the extent possible, we used the same items fielded in the NDS, with some items reworded or reframed. For example, questions about comprehensive EHS services provided to children and families will not be relevant to providers who are no longer partnering with the program, but we can ask a similarly structured question about whether the center or family child care provider offers the same services, either directly or through referrals to a community agency.

When we could not use existing items from the NDS to measure a construct, we looked to other publicly available items (for example, from the National Survey of Early Care and Education, OMB# 0970-0391, and the Head Start Family and Child Experiences Survey, OMB# 0970-0151). If we could not find an existing item for a construct, we worked to develop a new item. New items were a primary focus of our pre-test.

We pre-tested the program director survey with five EHS program directors and the provider surveys with eight child care providers (six sustained partnership providers and two dissolved partnership providers). For items that appear in multiple surveys, we used a tracking document to ensure that a single item was not pre-tested with more than nine respondents. The main objectives for the pre-test were to test participants’ ability to answer questions requiring recall, program directors’ understanding of and ability to report separately on different partnership types, the clarity of new questions (and continued relevance and clarity of NDS items), how questions functioned differently in the context of the COVID-19 pandemic, and overall survey flow. Beginning in September 2020, we identified EHS program directors to take part in the pre-test. We also used this recruitment step to identify pre-test participants for the provider surveys and interview protocol.

We conducted the pre-test orally by phone, with respondents following along with a copy of the survey we sent in advance. As we moved through the survey, we used a protocol to ask questions about the wording, clarity, and overall understanding of the new survey questions we developed.

Semi-structured interviews. The qualitative protocols are designed to help interviewers obtain a more robust sense of the stories behind the dissolved and sustained partnerships, using selected survey responses as starting points for discussion. From the survey, we will know the main reasons that the partnerships dissolved and whether these reasons were communicated by the EHS program and/or provider, as well as what is working well in sustained partnerships and why. The interviews will provide more detail on the survey responses. We will ask when any issues in the partnership emerged, how child care providers and EHS programs addressed the issues, who was involved in the process, and similar questions. In addition, the interviews will enable us to better understand providers’ experiences during the COVID-19 pandemic.

In the interviews with dissolved partnership providers, we will also probe on the services providers have continued or stopped offering since the partnership ended, asking whether any services ended right away or whether providers tried to sustain them for a time. From the NDS and sustainability surveys, we will know that some things changed (for example, services for enrolled children and families and professional development opportunities) but not as much about when things changed relative to the dissolution and how the changes were made. The interviews should help us fill out that picture. In the interviews with providers from sustained partnerships, we will probe on the champions of the partnership, staff turnover, and periods without enrollment slots.

We pre-tested the interview protocol by asking the interview questions to the same respondents who participated in the survey pre-test. We then debriefed to gather input on the questions, probes, and length. The interview pre-test was conducted with fewer than ten providers.

Aligning instruments with objectives. Table B.1 is a matrix that links research questions (as detailed in SSA Section A2), constructs, and data sources.

A check in the table denotes the source of each item. Research questions 1 and 4 will be addressed by all five of our data sources: the EHS program director survey, the web-based survey of sustained partnership providers, the web-based survey of dissolved partnership providers, the semi-structured interviews with sustained partnership providers, and the semi-structured interviews with dissolved partnership providers. The second research question will be addressed by the survey of dissolved partnership providers and the semi-structured interviews with a subset of these providers. The sustained partnership provider survey and semi-structured interviews, and to a lesser extent the EHS program director survey, will address research question 3.

Table B.1. Sustainability research questions, constructs, and data sources

Data sources (table columns): EHS program director survey,^a survey of sustained partnership providers,^a survey of dissolved partnership providers,^a semi-structured interviews with dissolved partnership providers,^a and semi-structured interviews with sustained partnership providers.

Research questions and constructs (table rows):

1. Are partnerships sustained? How and why do partnerships change over time?

  • Update on EHS programs’ service offerings (primarily focusing on partnership offerings)
  • Experiences with partnerships
  • Update on child care partners from 2016 (active [sustained or new], dissolved)
  • Factors that helped sustain partnerships
  • Reasons that partnerships dissolved

2. After partnerships end, what are the characteristics of the child care providers and the services they offer?

  • Operation and enrollment status
  • Enrollment capacity, group size
  • Hours of operation
  • Staff characteristics and staff supports
  • Environment
  • Delivery of comprehensive services
  • Sources of funding that support service delivery
  • Participation in quality improvement initiatives

3. For partnerships that are sustained, how do features of the partnerships change over time?

  • Funding and resource allocation (changes over time)
  • Enrollment (changes over time)
  • Hours of operation
  • Staff characteristics and staff supports
  • Environment
  • Delivery of select comprehensive services (types, structure, reach)
  • Quality monitoring
  • Participation in quality improvement initiatives
  • Partnership agreements/plans and development processes (changes over time)
  • Relationship quality and communication between partners
  • EHS-CCP leader

4. What factors support or impede the partnerships’ sustainability? Are there differences between sustained and dissolved partnerships in their program structures/characteristics, initial partnership quality, or other features?

  • Funding and resource allocation (as a facilitator of or barrier to sustainability)
  • Partnership agreements/plans and development processes (as a facilitator of or barrier to sustainability)^b
  • Approaches to decision making and monitoring (as a facilitator of or barrier to sustainability)
  • Collaboration quality (as a facilitator of or barrier to sustainability)
  • Communication processes (as a facilitator of or barrier to sustainability)
  • System-level inputs (as facilitators of or barriers to sustainability)

5. Do any of the factors associated with Research Questions 1–4 differ between partnerships with center-based providers and those with family child care providers?

a Sources of data are identified at the construct level; a data source identified for a construct might not reflect all of the subconstructs listed with it.

b We will analyze this using data from the NDS as a predictor.

EHS = Early Head Start.

B4. Collection of Data and Quality Control

Web-based surveys. We will conduct initial recruitment by mail and email, with reminder emails, telephone calls, and letters for sample members who have not yet responded to the survey (Appendix B). Along with mail invitations, we will include a joint letter from the Office of Head Start and Office of Child Care describing the purpose of the study (Appendix D). Additionally, we will include a small non-monetary gift with the invitations. Respondents will have the option to complete their survey online or over the phone through computer-assisted telephone interviewing (CATI). We will train telephone interviewers, supervisors, and monitors on the provider surveys. In this training, we will provide an overview of the study and how the provider surveys are related to the other study components, review how to answer questions from respondents, demonstrate how to navigate the CATI instrument, and give trainees a chance to practice administering the interview in pairs. Given the complexity of some program arrangements, a member of our research team will conduct phone surveys with EHS program directors who choose to complete the survey by telephone.

Survey data monitoring. We will review all completed surveys for missing responses and review all partial surveys for follow-up with respondents. We will conduct a preliminary data review after the first 10 to 20 completions to confirm that the programmed instruments are working as expected and to check for inconsistencies in the data. We will build soft checks into the surveys to alert respondents to potential inconsistencies while they are responding.

Telephone interview monitoring: web-based surveys. For provider surveys completed over the phone, professional survey operations center monitors will monitor the telephone interviewers and observe all aspects of interview administration. Each interviewer will have his or her first interview monitored and get feedback, and for ongoing quality assurance, we will monitor 10 percent of all subsequent telephone interviews (standard practice). Monitors will also listen to interviews conducted by interviewers who have had problems during a previous monitoring session. Monitors will provide feedback to interviewers, highlighting positive aspects and providing constructive feedback and coaching on aspects that should be improved. If an interviewer collects incorrect information, we will conduct a callback to rectify any errors and protect the integrity of the data. In our experience, the need for such callbacks is extremely rare.

Semi-structured interviews. Data will be collected by the research team at Mathematica. We will conduct initial recruitment by email and then send follow-up emails and make telephone calls to providers who do not respond to initial outreach (Appendix C). Interviews will be conducted by telephone. We will use staff who have experience working with child care providers (including family child care providers) in other studies, are trained in qualitative interviewing, and have expertise in conducting semi-structured telephone interviews. We will train interviewers in a webinar that will cover (1) procedures for contacting and scheduling interviews, (2) the interview protocol and how to tailor it to every interview, and (3) proper note-taking in the interview notes template.

Semi-structured interview monitoring. The interview lead will participate in each interviewer’s first interview and provide feedback afterwards. The lead will also hold weekly meetings with interviewers to track their progress, answer questions, and troubleshoot problems. These meetings will also give interviewers an opportunity to share with the team any successful strategies they have used.



B5. Response Rates and Potential Nonresponse Bias

Response Rates

Unit response rates. For the web-based survey, to calculate unit response rates (URR), we will estimate the ratio of the weighted number of completed surveys to the weighted number of eligible sample members. We will use the following formula:

URR = C / (C + R + O + NC)

where C is the weighted number of completed surveys, R is the weighted number of refused surveys, O is the weighted number of eligible sample units who did not respond for reasons other than refusal, and NC is the weighted number of eligible sample units whom we were unable to contact.

Item response rates. For the web-based survey, to calculate item response rates (IRR), we will estimate the ratio of the number of respondents who provided an appropriate response to an item to the number of respondents eligible to answer it. We will use the following formula:

IRR = I / (A - V)

where, for each item x, I is the number of respondents who provided an appropriate response, A is the number of respondents in the survey, and V is the number of respondents with a valid skip.
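As a concrete illustration of these two formulas, the sketch below computes both rates from hypothetical counts; the values are illustrative only, not study data.

```python
# Minimal sketch of the response rate formulas defined above, using hypothetical counts.
def unit_response_rate(C, R, O, NC):
    """URR = weighted completes / (completes + refusals + other nonresponse + noncontacts)."""
    return C / (C + R + O + NC)

def item_response_rate(I, A, V):
    """IRR for an item = respondents answering the item / (all respondents - valid skips)."""
    return I / (A - V)

# Illustrative values only
print(unit_response_rate(C=220, R=10, O=12, NC=8))   # 0.88
print(item_response_rate(I=200, A=220, V=10))        # ~0.95
```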

Expected response rates. As a longitudinal follow-up to a previously fielded survey, the study has a specific and finite pool of respondents as the denominator for our response rates.3 We expect a web-based survey response rate of 75 to 88 percent (Table B.2). These expected response rates mirror the corresponding response rates from the NDS. The expected response rate for the dissolved partnership provider survey is rounded down from the lowest response rate in the NDS (family child care providers, who had a 78 percent response rate).

Table B.2. Expected sample size and response rates

Respondent | Expected completed 2022 surveys/expected sample size | Expected response rate | Margin of error^a
2015 cohort of program directors from EHS-CCP grantees | 220/250^b | 88 percent | 6.8 percentage points
Child care providers sampled in 2016 survey | 376/470 | 80 percent | 5.2 percentage points
Child care providers sampled in 2016 survey: Sustained partnerships | 271/330^c | 82 percent | 6.1 percentage points
Child care providers sampled in 2016 survey: Dissolved partnerships | 105/140 | 75 percent | 9.9 percentage points

a The margin of error shows the confidence interval around an outcome of 50 percent, in percentage points (a calculation sketch follows these table notes).

b Some delegates might need to complete roster information, resulting in more than 250 respondents surveyed. We used a sample size of 250 for analytic planning because data will be analyzed at the EHS grantee level.

c We estimate that about 70 percent of partnerships will be sustained, and 30 percent will be dissolved. To arrive at this estimate, we linked the sampled providers from the 2016 survey with Head Start Enterprise System (HSES) data on partnerships as of February 1, 2019, using the grant number and the provider’s name. Sustained partnerships are those in which the same child care provider from 2016 also appeared in the HSES data. Dissolved partnerships are those in which the same child care provider did not appear in the HSES data. We accounted for slight variations in the names to ensure that we identified as many matches as possible.
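For reference, the margins of error in Table B.2 can be approximated with the standard formula for a proportion of 50 percent. The sketch below is an assumption about the calculation, not a documented study formula: it uses a 95 percent confidence level and ignores design effects and the finite population correction, which is likely why its results fall slightly below the values in the table.

```python
# Minimal sketch (an assumed formula, not necessarily the one used for Table B.2):
# margin of error around a 50 percent outcome, using the normal approximation for a
# 95 percent confidence interval, with no design effects or finite population correction.
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of the confidence interval for a proportion, in percentage points."""
    return 100 * z * sqrt(p * (1 - p) / n)

for label, n in [("Program directors", 220), ("All providers", 376),
                 ("Sustained partnerships", 271), ("Dissolved partnerships", 105)]:
    print(label, round(margin_of_error(n), 1))
# Prints roughly 6.6, 5.1, 6.0, and 9.6 percentage points; the slightly larger values
# in Table B.2 (6.8, 5.2, 6.1, 9.9) likely reflect design effects or a different formula.
```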

Response rate monitoring. We will use reports generated from our database and survey instruments to actively monitor response rates for each instrument. Using these reports, we will tailor reminder email and telephone efforts, working to identify challenges and solutions to obtaining expected response rates.

Addressing lower-than-expected response rates. We plan to be flexible with data collection modes, when possible. For example, although we hope that respondents will complete the web survey (and will try to facilitate this by providing a QR code in the mailed invitations), we will increase our phone dial-outs as data collection progresses. We can also leverage the relationship between EHS programs and sustained partnership providers by asking EHS program directors to follow up with sustained partnership providers who have not responded to the survey. Finally, we recognize that dissolved partnership providers will probably be more difficult to reach. We have planned a locating effort for these providers, including searches of databases such as Accurint and related follow-up calls. These efforts will be conducted by experienced survey operations center staff and monitored by survey operations center monitors.

Semi-structured interviews. The interviews are not designed to produce statistically generalizable findings. Response rates will not be calculated or reported.

Nonresponse

Throughout our analysis of the web-based survey, we will use weights to account for survey nonresponse. As in the NDS, we will weight survey responses to be representative of the programs that received 2015 EHS Expansion and EHS-CCP grants. When respondents report on all current partners, EHS program director survey data will represent the current partnerships of all 2015 EHS-CCP grant recipients.

When marginal response rates are below 80 percent, we will conduct a nonresponse bias analysis to compare distributions of respondent characteristics to those of nonrespondents using any information available for both types of sample members. We will then compare the distributions for respondents when weighted using the nonresponse-adjusted weights to see if the weights mitigate any observed differences.
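The nonresponse weighting described above could take several forms. As one hedged illustration, the sketch below shows a simple weighting-class adjustment in which respondents' base weights are inflated within classes (here, provider type) so that respondents also represent the nonrespondents in their class. The data, variable names, and class definition are hypothetical, not the study's documented procedure.

```python
# Minimal sketch (hypothetical data and class definition, not the study's documented
# procedure): a weighting-class nonresponse adjustment.
import pandas as pd

# One row per sampled provider; base_weight is the sampling weight
frame = pd.DataFrame({
    "provider_type": ["center", "center", "center", "fcc", "fcc", "fcc"],
    "base_weight":   [1.5, 1.5, 1.5, 2.0, 2.0, 2.0],
    "responded":     [1, 1, 0, 1, 0, 0],
})

# Adjustment factor per class = total base weight / base weight of respondents
totals = frame.groupby("provider_type")["base_weight"].sum()
resp_totals = frame[frame["responded"] == 1].groupby("provider_type")["base_weight"].sum()
adj = (totals / resp_totals).rename("adj_factor")

frame = frame.join(adj, on="provider_type")
frame["nr_adjusted_weight"] = frame["base_weight"] * frame["adj_factor"] * frame["responded"]
print(frame[["provider_type", "responded", "nr_adjusted_weight"]])
```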

We hope to avoid some item-level missing data by providing a “don’t know” option, particularly for items that respondents might not know or remember. We will consider imputation where appropriate. Given that participants in the semi-structured interviews will not be randomly sampled and the findings are not intended to be representative, we will not calculate nonresponse bias. Provider characteristics will be documented and reported in written materials associated with the data collection.

B6. Production of Estimates and Projections

Survey estimates produced by this study will be for official external release.

All survey analyses will be conducted using the final analysis weights so that the estimates can be generalized to the target population. Documentation for the restricted-use analytic files will include instructions, descriptive tables, and coding examples to support the proper use of weights and variance estimation by secondary analysts.

Semi-structured interviews. The data will not be used to generate population estimates, either for internal use or dissemination.

B7. Data Handling and Analysis

Data Handling

Once the electronic instruments are programmed, Mathematica will use a random data generator (RDG) to check the questionnaire skip logic, validations, and question properties. The RDG produces a test data set of randomly generated survey responses. The process runs all programmed script code and follows all skip logic in the questionnaire, simulating real interviews. This process allows any coding errors to be addressed before data collection.
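As a simplified illustration of this kind of testing (the instrument, items, and routing below are hypothetical, and this is not Mathematica's RDG), random responses can be generated so that they follow the programmed skip logic, and the resulting test data can then be checked automatically.

```python
# Minimal sketch (hypothetical items and routing): generate random survey responses
# that follow skip logic, then verify the routing held in the simulated data.
import random

def simulate_case(rng):
    case = {"in_partnership": rng.choice(["yes", "no"])}
    if case["in_partnership"] == "yes":
        case["n_funded_slots"] = rng.randint(0, 20)               # asked only of sustained partners
    else:
        case["year_dissolved"] = rng.choice([2019, 2020, 2021])   # asked only of dissolved partners
    return case

rng = random.Random(0)
test_data = [simulate_case(rng) for _ in range(1000)]

# Confirm the skip logic: the funded-slot item appears if and only if the provider is in a partnership
assert all(("n_funded_slots" in c) == (c["in_partnership"] == "yes") for c in test_data)
print("Skip-logic check passed on", len(test_data), "simulated cases")
```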

During and after data collection, Mathematica staff responsible for each instrument will edit the data when necessary. The survey team will develop a document for data editing to identify when survey staff edit a variable, noting the current value, the new value, and the reason why the value was edited. A programmer will read the specifications from these documents and update the data file. All data edits will be documented and saved in a designated file. We anticipate that most data edits will correct interviewer coding errors identified during frequency review (for example, filling in missing data with “M” or clearing out “other (specify)” verbatim data when the response has been back-coded). This process will continue until all data are clean for each instrument.

Data Analysis

Longitudinal analyses will focus on changes in constructs from the 2016 NDS Grantee and Delegate Agency Director Survey and Child Care Partner Survey to the 2022 surveys. We will use t-tests to compare responses from the 2016 surveys to responses from the 2022 surveys for items that are consistent across the surveys. These t-tests will reveal whether changes over time are significantly different from zero. Therefore, our samples for the longitudinal analyses will include only respondents who completed their 2016 and 2022 surveys. The constructs we propose to examine in longitudinal analyses include changes in the composition of partnerships (the number of partners in sustained, dissolved, and new partnerships and the provider types), changes in the characteristics of EHS programs and child care providers in both sustained and dissolved partnerships (enrollment, services, quality supports, and program activities), and changes made to features to support partnerships.
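As a hedged illustration of this longitudinal comparison (using simulated values rather than study data, and ignoring the survey weights that the actual analysis will apply), a paired t-test on respondents who completed both waves might look like the following.

```python
# Minimal sketch (simulated data, unweighted): paired t-test of change in a construct
# measured in both the 2016 and 2022 surveys, limited to cases completing both waves.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
partners_2016 = rng.poisson(7.6, size=220)                     # e.g., number of partners in 2016
partners_2022 = partners_2016 + rng.integers(-3, 2, size=220)  # hypothetical change by 2022

t_stat, p_value = stats.ttest_rel(partners_2022, partners_2016)
print(f"mean change = {np.mean(partners_2022 - partners_2016):.2f}, p = {p_value:.3f}")
```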

Point-in-time analyses will focus on constructs new to the 2022 surveys that were not possible to measure in the 2016 surveys. These analyses will include the full sample of respondents completing 2022 surveys. The constructs we propose to examine in point-in-time analyses include (1) the reasons for dissolving the partnership, from the perspectives of both EHS program directors and dissolved partnership providers; (2) the current funding sources for those partners; and (3) features that support partnerships, from the perspective of EHS program directors, focusing on all of a program’s current child care partners—those in partnerships sustained since the 2016 survey as well as new partners. Furthermore, we will explore whether we can examine alignment between the responses of program directors and providers to similar questions.

For point-in-time analyses, we will calculate percentages, medians, or means and standard deviations of analytic variables of interest as well as percentages of EHS programs and child care providers in various categories. This work will also include analyses of subgroup differences and comparisons of sustained and dissolved partnership providers, both of which will involve t-tests.

Multivariate analysis. We may also conduct multivariate regression analyses to predict whether partnerships were sustained or dissolved based on multiple characteristics. For example, we could predict whether partnerships were sustained based on partnership quality, controlling for program size.
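One plausible specification for such a model (the document does not commit to a particular estimator, and the data and variable names below are hypothetical) is a logistic regression predicting sustained status from 2016 partnership quality while controlling for program size.

```python
# Minimal sketch (simulated data, illustrative specification only): logistic regression
# predicting whether a partnership was sustained.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "sustained": rng.integers(0, 2, size=376),       # 1 = sustained, 0 = dissolved (hypothetical)
    "quality_2016": rng.normal(3.5, 0.8, size=376),  # hypothetical partnership quality scale
    "program_size": rng.poisson(60, size=376),       # hypothetical funded enrollment
})

model = smf.logit("sustained ~ quality_2016 + program_size", data=df).fit(disp=False)
print(model.params)  # coefficients; exponentiating them gives odds ratios
```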

Qualitative data analysis. In addition to the quantitative longitudinal and point-in-time analyses, we will conduct qualitative analyses of data from semi-structured interviews with sustained and dissolved partnership providers. We will draft notes from these interviews in a standard template and then import them into qualitative data analysis software (NVivo). Using a coding protocol developed by the qualitative analysis lead, the research team will then systematically review and assess the data by classifying it, or coding it, to the research questions, constructs, and subconstructs. We will initially code five interviews using the coding protocol to ensure that the codes and definitions are clear, the codes can be applied systematically, and the list of codes is comprehensive. We may adjust the coding protocol after the first five interviews. Coders will be trained to apply the codes to the interview text and to identify themes that are not included in the coding protocol. During training, the coders will code an interview, and the qualitative analysis lead will review the codes for accuracy. After the training is completed and the data are coded, the team will retrieve and sort the data linked to specific research questions and constructs and identify themes in the data.

Because the semi-structured phone interviews will be conducted with a subset of the respondents who completed the survey for dissolved partnerships, we will be able to integrate the qualitative analysis from the interviews with the quantitative data to enhance our understanding of the survey data. For example, we can use the qualitative findings on when issues arose between providers and EHS programs and how they were handled to provide context for the descriptive findings on the factors that impeded the sustainability of the partnership. We can also turn to the qualitative findings to understand unanticipated results.

Data Use

Mathematica will produce several publications based on analysis of data from the Sustainability Study:

  • A report with descriptive tables of findings from all surveys. The intention is to quickly produce findings that Federal agencies can use.

  • Two to three briefs, including one each focused on research questions 2 and 3, that draw on the descriptive tables and interviews, provide additional narrative explanation of the findings, and address research questions 4 and 5. The briefs will be accessible to a broad audience, using graphics and figures to communicate key findings.

  • Restricted-use data files and documentation, which will be available for secondary analysis.

The data file documentation will include a data user’s guide to inform and assist researchers who might want to use the data for future analyses. The manual will include information on (1) the background of the study, including its conceptual framework; (2) the sample design, such as the number of study participants, response rates, and weighting procedures; (3) the data collection procedures, instruments, and measures; and (4) the data preparation and structure of the study’s data files, including data entry, frequency review, data edits, and creation of data files.

If we decide to archive qualitative data, we will ensure that individual respondents cannot be identified.

B8. Contact Person(s)

Mathematica is conducting this project under Contract No. HHSP233201500035I. A team from Mathematica developed the plans for statistical analyses for this study. To complement the study team’s knowledge and experience, the team also consulted with a group of outside experts, as described in Section A8 of SSA.

The following individuals at the Administration for Children and Families and Mathematica are leading the study team:

Amy Madigan, Ph.D.
Project Officer
Office of Planning, Research, and Evaluation

[email protected]

Sarah Blankenship, Ph.D.
Child Care Program Specialist
Office of Planning, Research, and Evaluation

[email protected]

Christine Fortunato, Ph.D.
Senior Social Science Research Analyst
Office of Planning, Research, and Evaluation

[email protected]

Jenessa Malin, Ph.D.
Social Science Research Analyst
Office of Planning, Research, and Evaluation

[email protected]

Cheri Vogel, Ph.D.
Project Director
Mathematica
[email protected]

Patricia Del Grosso, M.S.
Project Director
Mathematica

[email protected]

Yange Xue, Ph.D.
Co-Principal Investigator
Mathematica

[email protected]

Barbara Carlson, M.A.
Senior Statistician
Mathematica

[email protected]

Sara Skidmore, B.A.
Survey Director
Mathematica

[email protected]

Sara Bernstein, Ph.D.
Co-Principal Investigator
Mathematica

[email protected]



Attachments

Instruments

Instrument 1. EHS Program Director Survey

Instrument 2. Sustained Partnership Provider Survey

Instrument 3. Dissolved Partnership Provider Survey

Instrument 4. Dissolved Partnership Provider Semi-Structured Interview Protocol

Instrument 5. Sustained Partnership Provider Semi-Structured Interview Protocol

Appendices

Appendix A. Comments Received on 60-Day Federal Register Notice

Appendix B. Supplemental materials for surveys

Appendix C. Supplemental materials for the provider interviews

Appendix D. Joint Office of Head Start and Office of Child Care letter of support


References

Del Grosso, P., Thomas, J., Makowsky, L., Levere, M., Fung, N., & Paulsell, D. (2019). Working Together for Children and Families: Findings from the National Descriptive Study of Early Head Start-Child Care Partnerships, OPRE Report # 2019-16, Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

1 The numbers in Table A.3 in Part A represent the maximum possible number of anticipated completes. The numbers in Table B.2 in Part B represent our best assessment of the response rates we actually expect.

2 Detecting differences between family child care and center-based providers in sustained versus dissolved partnerships, rather than across both sustained and dissolved partnerships, would require large differences between these groups (0.37 to 0.56 standard deviations).

3 Response rates for the NDS were 88 percent for grantee directors, and 82 percent for providers.
