
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Assessing the Implementation and Cost of High Quality Early Care and Education: Field Test



OMB Information Collection Request

0970 - 0499





Supporting Statement

Part A

August 2019

Updated February 2021


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Ivelisse Martinez-Beck, Senior Social Science Research Analyst and

Child Care Research Team Leader

Meryl Barofsky, Senior Social Science Research Analyst













Part A




Executive Summary


  • Type of Request: This Information Collection Request (ICR) is for a reinstatement with changes. We are requesting 15 months of approval.


  • Progress to Date: This ICR builds on earlier work of ACF’s Assessing the Implementation and Cost of High Quality Early Care and Education (ECE-ICHQ) project. The original request supported the creation of measures of center-based early care and education (ECE) implementation and costs, and the information collection for that effort has been completed.


  • Timeline: The timeline for the original request was met. This request builds on the original ICR by further testing and modifying the measures created under that request.


  • Previous Terms of Clearance: There were no previous terms of clearance.


  • Summary of changes requested: The current request is to field test refined instruments based on measures developed in previous phases of the study. This ICR will allow ACF to (1) validate key program implementation measures or further improve their psychometric properties using classroom observations, and (2) test preliminary associations between implementation, cost, and quality measures. The proposed collection largely reflects the previously approved study methodology, with three changes:

    • Some measures have been updated based on results of the previous data collection;

    • A classroom observation has been added to test the validity of the measures; and

    • More centers in more states will be targeted for recruitment in order to have a sufficient sample size to further validate the measures.

  • As a first step for the field test, the study will conduct a feasibility study to assess data collection following the COVID-19 pandemic. The study will first visit a subsample of centers from previous phases of the study to understand how their contexts have shifted because of the pandemic.

We do not intend for this information to be used as the principal basis for public policy decisions.





A1. Necessity for Collection

The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services seeks approval to collect information to further inform the development of measures of high quality early care and education program implementation and costs. This information collection is part of the project, Assessing the Implementation and Cost of High Quality Early Care and Education (ECE-ICHQ).

Study background

States and the federal government have increased financial support to improve the quality of early care and education (ECE) services for children from birth to age five. However, there is a lack of evidence on how to effectively target funds to increase ECE quality. ACF's Office of Planning, Research, and Evaluation (OPRE) contracted with Mathematica and consultant Elizabeth Davis of the University of Minnesota to conduct the ECE-ICHQ project to create an instrument that measures the implementation of key functions supporting quality in center-based ECE settings and the associated costs.1

Since the fall of 2014, the ECE-ICHQ study team has developed a conceptual framework (see Attachment A); conducted a review of the literature (Caronongan et al. 2016); consulted with a technical expert panel; collected and summarized findings from Phase 1 of the study (completed under ACF's generic clearance 0970-0355); and collected and summarized findings from Phase 2 of the study (completed under 0970-0499). This information collection request is to field test the revised instruments, which are based on the measures developed in previous phases of the study and updated to include information about the COVID-19 pandemic. The study will validate the measures using observational measures of quality and administrative data from Quality Rating and Improvement Systems (QRIS). We will also use the measures to examine preliminary associations between cost and quality.

Legal or administrative requirements that necessitate the collection

There are no legal or administrative requirements that necessitate this collection. ACF is undertaking the collection at the discretion of the agency.


A2. Purpose

Purpose and Use

The purpose of this information collection is to field test instruments using measures developed in previous phases of the study and updated to include information about the COVID-19 pandemic. The goals are to (1) refine the implementation measures to further improve their psychometric properties, and (2) test potential associations between implementation, cost, and quality measures.

The information will be used for two main purposes. First, the information about the methods of creating the tools will be documented (in technical reports and journal articles) to assist the field in understanding the measures development process. Second, the information will help ACF produce valid tools to measure how centers use resources to support high-quality early care and education and identify any preliminary associations between cost and quality. Final reports, presentations, and possibly journal articles will be avenues for dissemination. Data from the field test may be archived at the Child and Family Data Archive at the University of Michigan for future research and analyses by qualified researchers.

The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker, and is not expected to meet the threshold of influential or highly influential scientific information.  

Research Questions or Tests

The research questions for the full ECE-ICHQ study are listed below. The field test proposed in this ICR focuses on refining the implementation measures and on initial testing for observed associations between implementation, cost, and quality measures.

Questions focused on ECE centers:

  • Are differences in center characteristics, contexts, and conditions related to implementation and costs?

  • What key center-level and classroom-level functions do center-based ECE providers pursue, and what implementation activities support each function?

  • What are the costs associated with the implementation of key functions?

  • How do staff members use their time in support of key functions within the center?

Questions focused on the purpose and relevance of the measures for policy and practice:

  • How can implementation and cost data be aligned to produce relevant and useful evidence to inform decisions about implementation activities and key functions likely to lead to quality improvement?




Study Design

The field test will build on earlier data collection efforts: Phase 1 (completed under ACF's generic clearance 0970-0355) and Phase 2 (completed under 0970-0499) of the study. During the field test, we will collect data from 80 centers in five states. We will start with a smaller sample of centers from Phase 2 to test the feasibility of the measures in the current COVID-19 context. We will collect data through telephone interviews, electronic cost workbooks, web-based time-use surveys, and classroom observations. Table A.1 summarizes each data collection activity by respondent and format.


Table A.1. Data collection activities for the ECE-ICHQ field test

Center recruitment call (Instrument 1)
  Respondents: Site administrator or center director; umbrella organization administrator (as applicable)
  Format: Telephone
  Estimated time to complete: 20 minutes
  Purpose: Discuss the study, recruit centers, and obtain agreements as needed

Center engagement call (Instrument 2)
  Respondents: Site administrator or center director
  Format: Telephone
  Estimated time to complete: 30 minutes
  Purpose: Collect information about the characteristics of the center

Implementation interview (Instrument 3)
  Respondents: Site administrator or center director; education specialist; umbrella organization administrator (as applicable)
  Format: Telephone
  Estimated time to complete: 3 hours
  Purpose: Gather information about what a center does to support quality early care and education

Cost workbook (Instrument 4)
  Respondents: Financial manager at site; financial manager of umbrella organization (as applicable)
  Format: Excel workbook; telephone and email follow-up
  Estimated time to complete: 8 hours
  Purpose: Collect information on all costs for the center for the previous 12 months

Staff rosters for time-use survey (Instrument 5)
  Respondents: Site administrator or center director
  Format: CADE on the web*
  Estimated time to complete: 15 minutes
  Purpose: Collect a list of potential respondents for the time-use survey

Time-use survey (Instrument 6)
  Respondents: Site administrator or center director; education specialist; lead and assistant teachers
  Format: Web
  Estimated time to complete: 15 minutes
  Purpose: Collect information on teaching and administrative staff time use that will help transform labor hours into costs associated with the key functions

Classroom rosters for observations (Instrument 7)
  Respondents: Site administrator or center director
  Format: CADE on the web*
  Estimated time to complete: 30 minutes
  Purpose: Collect information required for classroom sampling for the classroom observation

Classroom observation
  Respondents: Not applicable
  Format: CADE with tablet computer*
  Estimated time to complete: No burden imposed for the classroom observation
  Purpose: Collect information on observed classroom quality

* CADE = computer-assisted data entry







Other Data Sources and Uses of Information

This data collection is one component of the information that will be used. We intend to access administrative data from states about centers' quality rating and improvement system (QRIS) ratings. Accessing these data will impose no respondent burden and will allow selection of sites and further validation of the measures.


A3. Use of Information Technology to Reduce Burden

Using feedback collected in Phase 1 of measurement development, the study team altered the approach to data collection for Phase 2 by relying on telephone interviews rather than in-person data collection and offering a web-based version of the time-use survey. We will continue this approach in the proposed field test. As in Phase 2 of measurement development, a cost workbook will be provided in an electronic spreadsheet format that respondents can complete at their own pace and submit electronically. The time-use survey will be available in a web-based application or hard copy form to accommodate the preferences and schedules of center staff. Study team members will provide individualized telephone and email follow-up as necessary.



A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency

None of the study instruments will ask for information that can be reliably obtained from alternative data sources in a format that assigns costs to key functions. No comparable data have been collected on the costs of key functions associated with providing quality services at the center level for ECE centers serving children from birth to age 5.

Furthermore, the design of the study instruments ensures no duplication of data collected through each instrument. Each center will complete one cost workbook and one implementation interview; these have been developed to be complementary to obtain necessary information with the least burden to respondents.



A5. Impact on Small Businesses

The team will recruit small ECE centers (those serving fewer than 100 children and having fewer than five classrooms) to participate. To minimize the burden on these centers, the study team will carefully schedule telephone interviews with the directors and managers at times that are most convenient for them, and when it will not interfere with the care of children. For example, the team will schedule interviews with directors in the early mornings or late afternoons when there are fewer children at the center. The team will not interview teachers; teachers will be able to complete the time use surveys (via web) when it is convenient for them. Respondents will be able to complete the cost workbook at their own pace, at times convenient to their schedules.


A6. Consequences of Less Frequent Collection

This is a one-time data collection.


A7. Now subsumed under 2(b) above and 10 (below)



A8. Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency's intention to request an OMB review of this information collection activity. This notice was published on June 18, 2019, Volume 84, Number 117, pages 28305-28306, and provided a sixty-day period for public comment. A copy of this notice is attached as Attachment G. During the notice and comment period, no substantive comments were received.

Consultation with Experts Outside of the Study

In designing the ECE-ICHQ, the team drew on a pool of experts (See Table A.2) to complement the knowledge and experience of the study team. To ensure the representation of multiple perspectives and areas of expertise, the expert consultants included program administrators, policy experts, and researchers. Collectively, the study team and external experts have specialized knowledge in measuring child care quality, cost-benefit analysis, time-use analysis, and implementation associated with high quality child care.

Study experts have provided input to help the team (1) define what ECE-ICHQ will measure; (2) identify elements of the conceptual framework and the relationships between them; and (3) make key decisions about the approach, sampling, and methods of Phase 1 of the study. Select members of the expert panel also reviewed findings from Phases 1 and 2 of measurement development and gave input on revisions to the data collection process and tools for the field test that would reduce the burden on respondents, improve the accuracy of data collection, and support development of systematic measures of implementation and costs across a range of ECE centers.

Table A.2. ECE-ICHQ technical expert panel members

Experts consulted for initial study design and Phase 1 (2014-2016), with affiliations at the time of consultation:

  Melanie Brizzi, Office of Early Childhood and Out of School Learning, Indiana Family and Social Services Administration (no longer in this position)
  Rena Hallam, Delaware Institute for Excellence in Early Childhood, University of Delaware
  Lynn Karoly, RAND Corporation
  Mark Kehoe, Brightside Academy (no longer in this position)
  Henry Levin, Teachers College, Columbia University
  Katherine Magnuson, School of Social Work, University of Wisconsin–Madison
  Tammy Mann, The Campagna Center
  Nancy Marshall, Wellesley Centers for Women, Wellesley College
  Allison Metz, National Implementation Research Network, Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill
  Louise Stoney, Alliance for Early Childhood Finance

Experts consulted for Phase 2 and the field test (2017-2019), with affiliations at the time of consultation:

  Margaret Burchinal, Frank Porter Graham Child Development Institute, University of North Carolina
  Rena Hallam, Delaware Institute for Excellence in Early Childhood, University of Delaware
  Lynn Karoly, RAND Corporation
  Nancy Marshall, Wellesley Centers for Women, Wellesley College



A9. Tokens of Appreciation

The multi-part, nested structure of this proposed data collection and analysis plan requires a high level of participation from center staff in the time-use survey. To support a successful data collection, the team will provide a $20 gift card to each staff member who completes the time-use survey. This gift card amount is slightly higher than the amount offered in Phase 2 because the team will not be able to send field staff to centers to follow up with staff, and staff will need to access and complete a web-based version of the survey, which they cannot complete during regular work hours because they do not have access to computers or phones during that time. In Phase 2, in-person follow-up and paper surveys, in combination with other techniques to improve response rates described in Supporting Statement B, supported a 90 percent response rate among center staff.



A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing

Personally Identifiable Information

To enable the distribution of time-use surveys, this study will collect names and email addresses of center staff. Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual's personal identifier.

Assurances of Privacy

Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. As specified in the contract, the Contractor will comply with all Federal and Departmental regulations for private information.

Data Security and Monitoring

The study team (Mathematica) has developed a data safety and monitoring plan that assesses all protections of respondents’ personally identifiable information. Mathematica will ensure that all of its employees and consultants who perform work under this contract are trained on data privacy issues and comply with the above requirements. Upon hire, every Mathematica employee signs a Confidentiality Pledge stating that any identifying facts or information about individuals, businesses, organizations, and families participating in projects conducted by Mathematica are private and are not for release unless authorized.

As specified in OPRE's contract, Mathematica will use Federal Information Processing Standard (currently, FIPS 140-2) compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. Mathematica will securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the Federal Information Processing Standards. Mathematica will (1) ensure that this standard is incorporated into the company's property management and control system; and (2) establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology requirements and other applicable federal and departmental regulations. In addition, Mathematica must submit a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records and for protecting any paper records, field notes, or other documents that contain sensitive or personally identifiable information to ensure secure storage and limits on access.
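
As a purely illustrative sketch, and not a description of Mathematica's actual systems, the example below shows one way a sensitive data file could be encrypted at rest with AES-256-GCM, an algorithm approved for use within FIPS 140-2 validated cryptographic modules, using the Python cryptography package. The file names are hypothetical, and in practice keys would be generated and managed by a validated module under a formal key-management procedure.

```python
# Illustrative only: encrypting a file with AES-256-GCM using the "cryptography" package.
# Whether a deployment meets FIPS 140-2 depends on using a validated cryptographic module
# and approved key management, not on this application-level code alone.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(plaintext_path: str, ciphertext_path: str, key: bytes) -> None:
    """Encrypt the contents of plaintext_path and write nonce + ciphertext to ciphertext_path."""
    nonce = os.urandom(12)  # 96-bit nonce, unique for each encryption operation
    with open(plaintext_path, "rb") as f:
        data = f.read()
    ciphertext = AESGCM(key).encrypt(nonce, data, None)
    with open(ciphertext_path, "wb") as f:
        f.write(nonce + ciphertext)

# Hypothetical usage: in practice the key would come from a managed key store,
# not be generated ad hoc alongside the data.
key = AESGCM.generate_key(bit_length=256)
encrypt_file("time_use_responses.csv", "time_use_responses.csv.enc", key)
```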

A restricted use data set will be created based on this data collection. Disclosure analyses will be done prior to releasing the data file, and masking of data will occur to ensure privacy of respondents. The data will be archived at the Child and Family Data Archive at the University of Michigan for future research and analyses by qualified researchers.



A11. Sensitive Information 2

Calculating accurate estimates of center costs requires collecting information on staff compensation and other center operating costs. The study team will explain the importance of this information to respondents and will ask sites to report salary information only by staff title, not personal name.



A12. Burden

Explanation of Burden Estimates

Newly requested information collections

Table A.3 summarizes the estimated reporting burden and costs for each of the study tools included in this information collection request. The estimates include time for respondents to review instructions, search data sources, complete and review their responses, and transmit or disclose information. Figures are estimated as follows:

  1. Center recruitment call (Instrument 1). Based on Phases 1 and 2, the study team expects to reach out to 800 centers to secure the participation of the 80 centers necessary for this study. We anticipate the recruitment call with center directors will take about 20 minutes. The team anticipates that for three-quarters of the centers that agree to participate (75 centers), it will also need to speak with an administrator of a larger umbrella organization with which the center is affiliated to obtain full agreement for the center's participation in the study. This discussion will be similar to the center recruitment call and will take about 20 minutes, on average.

  2. Center engagement call (Instrument 2). The study team expects about 100 centers to agree to participate. When a center has agreed to participate, recruiters will use the second part of the recruitment and engagement call script, which is estimated to take about 30 minutes. Based on Phases 1 and 2, the study team assumes that 20 percent may withdraw after this step.

  3. Implementation interview (Instrument 3). The team will conduct the three-hour implementation interview with the center director at each of the 80 centers. Based on the experience in Phase 2, the team anticipates that in one-quarter of the centers (20 centers), additional respondents will be involved in parts of the interview. On average, the team estimates that additional respondents in the 20 centers will be involved in up to 3 hours of interview time. The additional respondents could include an assistant center director, education program manager or specialist, or executive staff from an umbrella organization (such as a Head Start grantee, or corporate office of a chain).

  4. Electronic cost workbook (Instrument 4). The financial manager at each center or umbrella organization will be the primary person to complete the cost workbook with support from the data collection team as necessary. In Phase 2, 11 centers had more than one respondent for the cost workbook.

Given the experience in Phase 2, the study team estimates that it will take 8 hours, on average, for respondents at each center to complete the cost workbook by assembling records, entering data, and responding to follow-up communication. The estimated average assumes some variation among centers in the extent to which respondents complete the workbook independently or with the assistance of the study team. The team further assumes that respondents in all centers will participate in follow-up communication to confirm the information provided and review portions of the workbook with members of the study team.

  5. Staff rosters for time-use survey (Instrument 5). Field staff will work with a center administrator to obtain a roster with contact information for all the staff targeted for the time-use survey in a center. The team expects it will take about 15 minutes for the center administrator to provide the information to complete the roster.

  6. Time-use survey (Instrument 6). The study team will target the time-use survey to an average of 16 staff per center (1 or 2 administrators, up to 14 teaching staff) at each of the 80 centers, for a total of 1,280 center staff. The team plans on an 87.5 percent response rate (1,120 respondents) and expects the time-use survey to take 15 minutes to complete.

  7. Classroom rosters for observations (Instrument 7). Field staff will work with a center administrator to collect information required to select classrooms for observation. The team expects it will take about 30 minutes for the center administrator to provide the information. (The arithmetic behind these burden estimates is illustrated in the sketch following this list.)
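
The arithmetic behind these figures is straightforward; the minimal sketch below recomputes the expected respondent counts and annual burden hours from the assumptions stated in items 1 through 7. It is illustrative only: the variable names are ours, and every input simply restates an assumption from the text or Table A.3.

```python
# Illustrative recomputation of the burden estimates described above.
# All inputs restate assumptions from items 1-7; nothing here is new data.

centers_contacted = 800          # centers contacted to recruit the sample
centers_agreeing = 100           # centers expected to agree to participate
centers_participating = 80       # centers completing the field test
umbrella_recruitment_calls = 75  # three-quarters of agreeing centers require an umbrella-organization call

staff_targeted_per_center = 16   # 1-2 administrators plus up to 14 teaching staff
time_use_response_rate = 0.875   # planned response rate for the time-use survey
time_use_respondents = round(centers_participating * staff_targeted_per_center * time_use_response_rate)

# Annual burden hours = respondents x responses per respondent x hours per response
burden_hours = {
    "Center recruitment call (directors)": centers_contacted * 1 * 0.33,
    "Center recruitment call (umbrella admins)": umbrella_recruitment_calls * 1 * 0.33,
    "Center engagement call": centers_agreeing * 1 * 0.50,
    "Implementation interview (directors)": centers_participating * 1 * 3,
    "Implementation interview (additional staff)": 20 * 1 * 3,
    "Electronic cost workbook": centers_participating * 1 * 8,
    "Staff rosters for time-use survey": centers_participating * 1 * 0.25,
    "Time-use survey": time_use_respondents * 1 * 0.25,
    "Classroom rosters for observations": centers_participating * 1 * 0.50,
}

print(time_use_respondents)               # 1,120 expected time-use survey respondents
print(round(sum(burden_hours.values())))  # about 1,619 total annual burden hours
```

The totals agree with Table A.3 up to rounding (the umbrella-organization recruitment calls round from 24.75 to 25 hours).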

Estimated Annualized Cost to Respondents

Table A.3. Total burden requested under this information collection

Each entry lists: total/annual number of respondents; number of responses per respondent; average burden hours per response; annual burden hours; average hourly wage; total annual cost.

  Center recruitment call, center director: 800; 1; .33; 264; $25.96; $6,853.44
  Center recruitment call, umbrella organization administrator: 75; 1; .33; 25; $25.96; $649.00
  Center engagement call: 100; 1; .50; 50; $25.96; $1,298.00
  Implementation interview protocol, center director: 80; 1; 3; 240; $25.96; $6,230.40
  Implementation interview protocol, additional center staff: 20; 1; 3; 60; $25.96; $1,557.60
  Electronic cost workbook: 80; 1; 8; 640; $25.96; $16,614.40
  Staff rosters for time-use survey: 80; 1; .25; 20; $25.96; $519.20
  Time-use survey: 1,120; 1; .25; 280; $17.89; $5,009.20
  Classroom rosters for observations: 80; 1; .50; 40; $25.96; $1,038.40

  Estimated annual burden total: 1,619 hours; estimated total annual cost: $39,769.64

Total annual cost

The team based average hourly wage estimates for deriving total annual costs on data from the Bureau of Labor Statistics, Occupational Employment Statistics (2018). For each instrument included in Table A.3, the team calculated the total annual cost by multiplying the annual burden hours by the average hourly wage.

The mean hourly wage of $25.96 for education administrators of preschool and child care centers or programs (occupational code 11-9031) is used for center directors, education managers, and financial managers and applies to all data collection tools except the time-use survey. The mean hourly wage of $16.54 for preschool teachers (occupational code 25-2011) is used for teachers and assistants. For the time-use survey, the study team calculated a weighted average hourly wage based on 2 staff per center (an administrator and an education specialist) at $25.96 and 12 child care staff per center at $16.54, for an average of $17.89.
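
The minimal sketch below, which is illustrative only, recomputes this weighted average wage and shows the cost calculation (annual burden hours multiplied by the average hourly wage) for two of the instruments in Table A.3.

```python
# Illustrative check of the wage and cost calculations described above.

admin_wage = 25.96    # BLS OES 2018 mean hourly wage, education administrators (11-9031)
teacher_wage = 16.54  # BLS OES 2018 mean hourly wage, preschool teachers (25-2011)

# Weighted average wage for the time-use survey: 2 administrative staff and
# 12 teaching staff per center.
time_use_wage = round((2 * admin_wage + 12 * teacher_wage) / 14, 2)
print(time_use_wage)  # 17.89

# Total annual cost for an instrument = annual burden hours x average hourly wage.
print(round(280 * time_use_wage, 2))  # 5009.2 for the time-use survey (280 burden hours)
print(round(640 * admin_wage, 2))     # 16614.4 for the electronic cost workbook (640 burden hours)
```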


A13. Costs

Field testing is key to the development of valid, reliable, and practical data collection protocols. With OMB approval, the study team will offer each participating center an honorarium of $500 in recognition of the time and expertise that center staff contribute to the field test. Within each center, staff will (1) participate in interviews, (2) complete the cost workbook, (3) complete the staff roster to support time-use surveys, (4) complete classroom rosters to inform classroom observations, and (5) allow observations of their classroom activities. The honorarium is intended both to encourage centers' initial participation and to recognize their efforts to coordinate a timely and complete data collection.

In Phase 2 of measurement development, the study team provided $350 to each participating center, as approved by OMB. The study team recommends increasing the center honorarium to $500 for this field test, to reflect that the field test will also include logistical support of classroom observations.


A14. Estimated Annualized Costs to the Federal Government

The total/annual cost for the data collection activities under this current request will be $2,660,731.  This includes direct and indirect costs of data collection.

Estimated costs by category:

  Instrument Development and OMB Clearance: $206,116
  Field Work: $2,247,806
  Publications/Dissemination: $206,809
  Total/Annual costs over the request period: $2,660,731



A15. Reasons for changes in burden

This request is for additional information collection under OMB #0970-0499 to validate the measures created under earlier collections.


A16. Timeline

Table A.4 shows the schedule for the field test. The field test report, expected in June 2022, will present findings based on data collected from the 80 centers in the field test. Methodological findings of interest from Phase 2 may also be included.

Table A.4. Field test schedule

  Field test data collection: March 2021 to October 2021 (actual dates dependent on OMB approval)
  Field test report: June 2022
  Data available for secondary analysis: August 2022


A17. Exceptions

No exceptions are necessary for this information collection.



Attachments



ATTACHMENT A: ECE-ICHQ CONCEPTUAL FRAMEWORK

ATTACHMENT B: ADVANCE MATERIALS

ATTACHMENT C: EMAIL AND LETTER TO SELECTED CENTERS

ATTACHMENT D: IMPLEMENTATION INTERVIEW EMAIL

ATTACHMENT E: COST WORKBOOK EMAIL

ATTACHMENT F: TIME-USE SURVEY OUTREACH

ATTACHMENT G: FEDERAL REGISTER NOTICE

INSTRUMENT 1: CENTER RECRUITMENT CALL SCRIPTS

INSTRUMENT 2: CENTER ENGAGEMENT CALL SCRIPT

INSTRUMENT 3: IMPLEMENTATION INTERVIEW PROTOCOL

INSTRUMENT 4: COST WORKBOOK

INSTRUMENT 5: TIME-USE SURVEY ROSTER

INSTRUMENT 6: TIME-USE SURVEY

INSTRUMENT 7: CLASSROOM ROSTERS FOR OBSERVATIONS





1 The ECE-ICHQ conceptual framework includes six key functions: (1) instruction and caregiving; (2) workforce development; (3) leadership activities, planning, and evaluation; (4) center administration; (5) child and family support; and (6) instructional planning, coordination, and child assessment.

2 Examples of sensitive topics include (but are not limited to): social security number; sex behavior and attitudes; illegal, anti-social, self-incriminating and demeaning behavior; critical appraisals of other individuals with whom respondents have close relationships, e.g., family, pupil-teacher, employee-supervisor; mental and psychological problems potentially embarrassing to respondents; religion and indicators of religion; community activities which indicate political affiliation and attitudes; legally recognized privileged and analogous relationships, such as those of lawyers, physicians and ministers; records describing how an individual exercises rights guaranteed by the First Amendment; receipt of economic assistance from the government (e.g., unemployment or WIC or SNAP); immigration/citizenship status.


