OPRE Evaluation: The Early Head Start Family and Child Experiences Survey (Baby FACES)—2020 [Nationally-representative descriptive study]

OMB: 0970-0354


Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



The Early Head Start Family and Child Experiences Survey (Baby FACES)—2020/2022



OMB Information Collection Request

0970 - 0354





Supporting Statement

Part A

MARCH 2022


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officer: Amy Madigan, Ph.D.










Part A


Executive Summary

  • Type of Request: This Information Collection Request is to continue data collection for an additional two years. The current expiration date for this OMB number is October 31, 2021. This is an extension request with no changes.


  • Progress to Date: ACF’s Baby FACES study periodically collects nationally representative information about Early Head Start (EHS) programs, their staff, and the families they serve to inform program planning and technical assistance and to enable research. Like Baby FACES 2018, Baby FACES 2020/2022 will collect detailed information about centers, staff, and families through interviews, self-administered questionnaires, observations of classrooms, and administrative data sources. While Baby FACES 2018 took an in-depth look at center-based classrooms, Baby FACES 2020/2022 will focus on how home visits and classrooms support infant–toddler development through responsive relationships. OMB approved the 2009, 2018, and 2020 Baby FACES data collections under this control number (0970-0354). Data collection for Baby FACES 2009 and 2018 is complete. See Table B.1 in supporting statement B for the full sample sizes and response rates for Baby FACES 2018, which took place in winter and spring 2018.

  • Timeline: The 2020 schedule for the project was impacted by the COVID-19 pandemic, and data collection was postponed by two years. This request for Baby FACES 2020/2022 seeks approval to complete data collection as described in the prior approved information collection request under this control number. Data collection for Baby FACES 2020/2022 is scheduled to begin in fall 2021.


  • Previous Terms of Clearance: There were no terms of clearance included in the Notice of Action (NOA) for the Baby FACES 2020 materials.


  • Summary of changes requested: This is a request for non-substantive changes to materials and data collection modes due to ongoing restrictions resulting from the COVID-19 pandemic. This request will support the continued collection of data for Baby FACES 2020/2022.



A1. Necessity for Collection

The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval to continue to collect descriptive information for the Early Head Start Family and Child Experiences Survey 2020/2022 (Baby FACES 2020/2022). The goal of this information collection is to provide updated nationally representative data on Early Head Start (EHS) programs, staff, and families to guide program planning, technical assistance, and research. Baby FACES is the only source for in-depth information on Early Head Start program operations nationally. The information collected aligns with and allows hypothesis testing of the relationships specified in the EHS conceptual framework (Appendix A).

Study Background

ACF’s Baby FACES study periodically collects nationally representative information about Early Head Start (EHS) programs, staff, and families to guide program planning, technical assistance, and research. Baby FACES 2009 included a sample of 89 programs and nearly 1,000 children from two birth cohorts (newborns and 1-year-olds), following them annually throughout their enrollment in the program (2009‒2012).

For 2018, Baby FACES was redesigned to collect repeated cross-sectional data. The Baby FACES 2018 and 2020/2022 data collections offer the first nationally representative information about teachers, home visitors, and classroom/home visit quality in the EHS program. Available administrative data do not provide the depth or richness necessary to answer key research questions. By linking information about staff and service quality to information about activities in the sampled programs, we will be able to examine associations between program processes, support of staff, and staff relationships with children and families.

Legal or Administrative Requirements that Necessitate the Collection

There are no legal or administrative requirements that necessitate the collection. ACF is undertaking the collection at the discretion of the agency.

A2. Purpose

Purpose and Use

The overarching purpose of the Baby FACES studies is to provide knowledge about EHS children and families, and the EHS programs and staff who serve them. The Baby FACES collection of information on EHS programs extends the work of the Family and Child Experiences Survey (FACES)1, which serves a similar purpose for Head Start programs. The ongoing series of Baby FACES data collections aims to maintain up-to-date core information on EHS over time while also focusing on areas of timely topical interest. The Baby FACES studies began with the longitudinal Baby FACES 2009 and continued with the redesigned, cross-sectional Baby FACES 2018 and Baby FACES 2020/2022.

The findings from Baby FACES 2018 and 2020/2022 will provide information about program processes and how program supports are associated with intermediate and longer term outcomes and contribute to ACF’s evidence-based planning, training and technical assistance, management, and policy development efforts. This information is particularly timely given the implementation of new Head Start Program Performance Standards that require grantees to implement program and teaching practices aligned with the Head Start Early Learning Outcomes Framework. A restricted use data set and data documentation will enable secondary research use of the data.

Previously Approved Requests

Baby FACES 2009 (approved October 2008). The 2009 study was designed to produce nationally representative information on EHS services offered to families, training and credentials of staff, and the quality of services provided. The study also described the EHS population, examining changes over time in child and family functioning and possible associations with aspects of the program and services they received. Baby FACES 2009, which concluded in 2015, provided rich descriptive information on the EHS program, families’ participation in it, and the amount and quality of services provided (see Vogel et al. 2011, 2015a, and 2015b).

Baby FACES 2018 (approved September 2017). For 2018, Baby FACES was reconceptualized as a repeated cross-sectional study. The descriptive information Baby FACES 2018 collected allowed ACF to answer new questions about the full age range of children participating in EHS; the characteristics of and professional development supports for the EHS classroom teachers and home-visitors; and how EHS services support infant-toddler development through responsive relationships. In particular, Baby FACES 2018 provided an in-depth look at the processes and teacher-child relationships in EHS center-based classrooms. It also provided information on EHS-Child Care Partnership grantees, which will inform a separate sub-study with EHS Partnership grantees (we will submit an additional information collection request for this).

Responsive relationships are those in which caregivers are respectful of infants and toddlers and interact with them by reading their cues and responding in a way that makes them feel heard and valued. Examples include talking to infants and toddlers, asking questions, responding to their verbal and non-verbal cues, and using strategies to engage children. These relationships are critical to infants’ and toddlers’ development and learning.

Relationship-based approaches to supporting infant-toddler development are approaches that support relationships between caregivers and the infants and toddlers in their care. They are based on caregivers’ being sensitive to the child’s cues and responding contingently to them, and thereby helping to support their physical-motor, social-emotional, language, and cognitive development.

Baby FACES 2020 (approved October 2019) was designed to build upon and extend information from 2018 with a new nationally representative cross-section of programs, and their associated centers, home visitors, teachers, children, and families. The descriptive information gathered through Baby FACES 2020 would allow ACF to examine national-level changes in center-based service provision and quality between Baby FACES 2018 and 2020. Additionally, Baby FACES 2020 would collect new information about home visiting quality and the parent–child relationships associated with home visiting. When combined with information from ACF’s FACES study, which describes Head Start programs and the children they serve (ages 3 to 5), Baby FACES 2020 would fill out the birth to 5 age spectrum. Data collection for Baby FACES 2020 was halted after three weeks in the field due to COVID-19.

In 2021, we made a second attempt to collect data for the approved Baby FACES 2020 request. That request, approved in September 2020, (1) increased the burden to reflect that we conducted data collection with 19 programs in 2020 before we had to stop and (2) revised some data collection procedures to allow flexibility around programs’ changes and requirements related to COVID-19. This effort was postponed before it began given ongoing restrictions due to the COVID-19 pandemic.

Current Request

Baby FACES 2020/2022 will continue to collect data for the approved Baby FACES 2020 request. The only changes to materials are to dates referenced to make them accurate for the updated timeline, adapting paper invitations to email with embedded links, and adding QR codes to materials to make it easier for respondents to access web versions of the instruments.

Research Questions or Tests

Working collaboratively with ACF and the Baby FACES technical work group (see section A.8), Mathematica developed a broad conceptual framework for EHS that hypothesizes how and why program services are expected to lead to positive outcomes for infants and toddlers and their families (see Appendix A). The conceptual framework depicts hypothesized pathways from inputs into EHS program operation to the program’s goals of improving outcomes for children and families.

The overarching research question for both Baby FACES 2018 and Baby FACES 2020/2022 is:
How do EHS services support infant/toddler growth and development in the context of nurturing, responsive relationships? Baby FACES 2018 focused on EHS classrooms, while Baby FACES 2020/2022 will collect in-depth information on home visits.

Table A.1 lists high-level research questions that align with the study’s conceptual framework, regarding program processes, program functioning, and classroom/home visit processes hypothesized to be associated with responsive relationships, enhanced infant/toddler outcomes, and family well-being. Baby FACES 2020/2022 will address three types of research questions: (1) descriptive (for example, what is relationship quality in EHS?); (2) associations with relationship quality (for example, how are home visit processes associated with relationship quality in EHS?); and (3) mediators of hypothesized associations.

Detailed lists of specific research questions for the center-based and home-based questionnaires are in Appendix A (Tables A.1 and A.2, respectively). The research questions in those tables map to the research question numbers in the conceptual sub-frameworks in Appendix A (Figures A.2 and A.3). These questions address gaps in the research literature identified at the conclusion of Baby FACES 2009 (Xue et al. 2015).

Table A.1. Research questions for Baby FACES 2018 and 2020/2022

Service characteristics

How do EHS classrooms and home visits support infant/toddler growth and development in the context of nurturing, responsive relationships?

  • What is the quality of relationships between EHS children and their caregivers (e.g., parents and teachers) and relationships between parents and their home visitors?

  • How does EHS support these relationships in classrooms and home visits?

  • How are these relationships associated with the development of infants/toddlers in EHS?

  • What is the quality of home visiting and how does it vary within a home visitor across different families?

Program processes and functioning

How do program-level processes and functioning support the development of nurturing, responsive relationships in classrooms and home visits?

  • How do program leadership, planning, culture, staff training, technical assistance, and other characteristics support quality and the development of responsive relationships between children and their caregivers and between parents and home visitors?

Infant/toddler outcomes and family well-being

How are EHS infants and toddlers faring in key domains of development and learning (e.g., language and social-emotional development)? How are EHS families functioning (e.g., social/economic well-being, family resources and competencies)?

  • What do parent–child relationships and home environment look like among EHS families?

  • How are parent–child relationships and family well-being associated with the development of infants/toddlers in EHS?


Study Design

Baby FACES 2009 was the first nationally representative descriptive study of EHS programs. Using a longitudinal cohort design, it included a sample of 89 programs and nearly 1,000 children from two birth cohorts (newborns and 1-year-olds) and followed them annually throughout their enrollment in the program (2009‒2012). Baby FACES 2018 employed a cross-sectional approach that included a nationally representative sample of 137 EHS programs, 871 classrooms and teachers, 611 home visitors, and 2,868 children and families.

Baby FACES 2020/2022 will continue the cross-sectional sample of ECE programs established in 2018, capturing descriptive data on EHS programs, centers, home visitors, classrooms, teachers, families, and children at a single point in time. The study will involve collecting quantitative information at each of these levels to enable nationally representative estimates and the testing of hypothesized associations across study levels.

Universe of Data Collection Efforts

Data collection instruments for Baby FACES 2020/2022 measure similar constructs to those used in Baby FACES 2018, with revisions to individual items or measures based upon their performance in 2018.

To reflect 2020/2022’s focus on in-depth measurement of home visiting, we include an in-home, observation-based measure of the parent–child relationship, as well as observation-based measures of home visit quality. The instruments and forms (Instruments 1–10) are annotated to identify sources of questions from prior studies and new questions developed for Baby FACES 2020/2022 (Appendix A). Appendix A also lists the research questions, constructs, and measures, and the instruments in which each measure appears.

Below we describe the data collection instruments/sources of information in the current request:

Classroom/home visitor sampling form from EHS staff (Instrument 1). Respondents: EHS staff (On-Site Coordinators or Center Directors). Mode: CADE. We will ask staff at each sampled EHS program to provide information in this form, listing all of the centers and home visitors, along with characteristics such as the number of classrooms (for centers) and the size of the caseload and whether they provide services to pregnant women (for home visitors).

Child roster form from EHS staff (Instrument 2). Respondents: EHS staff (On-Site Coordinators or Center Directors). Mode: CADE. After sampling centers, classrooms, and home visitors, we will ask EHS program staff to provide information on the child roster form, listing all children in the sampled classrooms and all children receiving services from the sampled home visitors. Information from this form will be used to select EHS-funded families for inclusion in the study.

Parent consent form (Instrument 3). Respondents: Parents. Mode: Paper with web option. After sampling children, we will ask each child’s parent to fill out and sign a form giving their consent to participate in the study.

Parent survey (Instrument 4). Respondents: Parents. Mode: CATI. We will ask parents about child and family socio-demographic characteristics; child and family health and well-being; household activities, routines, and climate; and parents’ relationships with EHS staff and their engagement with and experiences in the program. This will provide information at the child/family level that will be important for understanding linkages and associations among family characteristics, program experiences, and outcomes.

Parent Child Report (Instrument 5). Respondents: Parents. Mode: Web or paper SAQ. The Parent Child Report will collect information from parents about their child’s language and social-emotional development; parenting stress; parents’ perceptions of their relationship with their child; social support; household drug and alcohol use; and household income.2

Staff survey (Teacher survey and Home Visitor survey) (Instruments 6a and 6b). Respondents: Teachers and home visitors. Mode: Web SAQ or CATI. These surveys will provide information about the staff development and training their program offers, the curricula and assessments they use, the organizational climate of their program, languages spoken, and their health and background information. In addition, teachers will provide information about the characteristics of their classrooms, the routines they use, and the languages spoken in their classrooms. We will link the information gathered in the teacher survey to observed quality in the classroom. We will report data gathered from the staff surveys descriptively as well as in analyses examining associations among different sample levels and moderators. Field staff who are on-site for data collection will administer the paper surveys in person.

Staff Child Report (Instruments 7a and 7b). Respondents: Teachers and home visitors. Mode: Web or paper SAQ. These reports gather information on each child’s language and social-emotional development, developmental screenings and referrals, the staff member’s perceived relationship with the child’s parents, and the family’s engagement with the program. In addition, teachers will report on their perceptions of their relationship with the child, and home visitors will provide information about the services they offered to families in the past four weeks (including topics and activities covered, referrals, alignment of visit content to planned goals, and frequency and modes of communication). Field staff will collect the paper forms before they leave the program site.

Program director survey (Instrument 8). Respondents: Program directors. Mode: Web SAQ. This survey gathers information about program goals, plans, program decision making, training and professional development, staff supports, and use of data. The survey will also ask program directors to provide information about home visiting curricula and home visitor professional development, parent involvement, and program processes for supporting responsive relationships.

Center director survey (Instrument 9). Respondents: Center directors. Mode: Web SAQ. This survey will gather information about aspects of the center such as the use of curricula in classrooms, organizational climate, staff qualifications, and teacher professional development.

Parent–child interaction (Instrument 10). Respondents: Parents and children. Mode: Paper data entry. For children over 12 months who receive home-based services, we will use a parent–child interaction task in which we will ask parents and their children to interact with one another in a book reading and a semi-structured free-play task with toys. Staff will video record the interaction, which will subsequently be coded for attributes such as sensitivity, positive regard, stimulation of cognitive development, intrusiveness, detachment, negative regard, and quality of the relationship.

Classroom observations. We will use a classroom observation tool to capture teacher-child relationships: the Quality of Caregiver-Child Interactions for Infants and Toddlers (Q‑CCIIT) measure (Atkins-Burnett et al. 2015). The Q-CCIIT is a measure developed under contract with ACF (OMB #0970-0513). As in Baby FACES 2018, we will use the Q-CCIIT for Baby FACES 2020/2022 to advance knowledge about the quality of EHS classrooms and expand information about the validity of the measure. The Q-CCIIT assesses the quality of child care settings for infants and toddlers in center-based settings and family child care homes—specifically, how a given caregiver interacts with a child or group of children in nonparental care. The Q-CCIIT measures caregivers’ support for social-emotional, cognitive, and language and literacy development, as well as areas of concern (such as harshness, ignoring children, and health and safety issues). At the end of the observation, observers will complete the Structural Features and Practices form, in which they rate the room arrangement of the classroom; note the presence of a variety of materials and activities for children; and indicate whether information for parents is posted anywhere in the setting, whether a quiet space is available to children, whether a separate napping area (with cribs, cots, or mats) is available in the classroom, and the nature of transitions between activities in the classroom. There is no burden to study participants associated with the observations. We will conduct the observations either in person or remotely, depending on local conditions related to the pandemic.

Home visit observations. For families with children receiving home-based services, we will capture the quality of the interactions between home visitors and families by conducting home visit observations using the Home Visitor Practices subscale from the Home Visit Rating Scales 3rd edition (HOVRS-3) and the Home Visit Content and Characteristics Form. The HOVRS was initially developed from field-based descriptions of successful home visits and is supported by home visiting research in multiple disciplines. The four home visiting practice scales include indicators of relationship building with families, responsiveness to family strengths, facilitation of parent-child interaction, and collaboration with parents. The Home Visit Content and Characteristics Form is an observational measure that documents the content of the home visit (e.g., topics discussed) and its characteristics (e.g., who was present, the level of distraction from TV, and so on). Study staff will accompany home visitors to visits to study families. These observations do not impose any burden on respondents. We will conduct the observations either in person or remotely, depending on local conditions related to the pandemic.

Other Data Sources and Uses of Information

The sample of ECE programs will be drawn using data from the most recent Head Start Program Information Report (PIR)3, using administrative data on program characteristics as explicit and implicit stratification variables. We describe this approach in detail in Supporting Statement Part B. During data analysis, we will incorporate program characteristics data from the PIR, including program size, location, population served, and percentage of children who have a medical home. There is no burden to study participants associated with using PIR data for Baby FACES.
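The combination of explicit and implicit stratification described above can be sketched in general terms. The following is an illustrative sketch only, not the study's actual procedure: it assumes a simple list-of-records frame with hypothetical PIR-style variables (`region` as the explicit stratum; `urbanicity` and `enrollment` as implicit sort variables) and an equal allocation. The real design, allocation, and probabilities of selection are specified in Supporting Statement Part B.

```python
# Illustrative sketch of explicit + implicit stratification.
# Explicit stratification: draw separately within each stratum.
# Implicit stratification: sort within strata before a systematic draw,
# which spreads the sample across the sorted characteristics.
# All variable names and the allocation below are hypothetical.
import random

def systematic_sample(frame, n):
    """Select n records via equal-probability systematic sampling."""
    interval = len(frame) / n
    start = random.random() * interval  # random start in [0, interval)
    picks = [int(start + i * interval) for i in range(n)]
    return [frame[p] for p in picks]

def draw_sample(frame, explicit_var, implicit_vars, allocation):
    """Draw a stratified systematic sample from a list of dict records."""
    sample = []
    for stratum, n in allocation.items():
        members = [r for r in frame if r[explicit_var] == stratum]
        # Sorting by the implicit stratification variables before the
        # systematic draw is what makes the stratification "implicit".
        members.sort(key=lambda r: tuple(r[v] for v in implicit_vars))
        sample.extend(systematic_sample(members, n))
    return sample

# Hypothetical frame of 200 programs with PIR-style characteristics
frame = [
    {"program_id": i, "region": "Northeast" if i % 2 else "South",
     "enrollment": 40 + i, "urbanicity": i % 3}
    for i in range(200)
]
sample = draw_sample(frame, "region",
                     ["urbanicity", "enrollment"],
                     {"Northeast": 10, "South": 10})
```

Because the draw is systematic within sorted strata, each stratum contributes exactly its allocated number of programs, and the selections are spread across the distribution of the implicit variables.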

A3. Use of Information Technology to Reduce Burden

The data collection will use a variety of information technologies to reduce the burden of participating on respondents. Program director surveys, center director surveys, teacher surveys, home visitor surveys, and Staff Child Reports will be offered as web-based surveys. Parent surveys will be administered using computer-assisted telephone interviewing (CATI) to reduce respondent burden and data entry errors. Parents will have the option to access an electronic version of the consent form (all paper consent packets will include log-in information to complete the electronic form) and have the option to complete the Parent Child Report online or by paper. Study staff will collect missing consent forms during in-person data collection visits by accessing the electronic version of the consent forms on tablets. Staff surveys (teacher and home visitor surveys) will be administered by web, with on-site data collectors scheduling times for staff to complete them on a field staff member’s tablet or via telephone if needed.

A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency

Wherever possible, we will use existing administrative information from PIR about EHS program characteristics to prevent duplication, minimize burden, and increase efficiency. No study instruments ask for information that is available from alternative data sources, including administrative data.

A5. Impact on Small Businesses

Most of the EHS programs and child care centers included in the study will be small organizations, including community-based organizations and other nonprofits. We will minimize burden for respondents by restricting the length of survey interviews as much as possible, conducting survey interviews on-site or via telephone at times that are convenient to the respondent, and providing some instruments in a web-based format.

A6. Consequences of Less Frequent Collection

No nationally representative information has been collected on EHS classrooms, home visitors, families, or children since the conclusion of Baby FACES 2018. In the past three years, EHS has undergone program expansion and other policy changes that warrant measurement to describe the status of implementation efforts.

A7. Now subsumed under 2(b) above and 10 (below)

A8. Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on July 21, 2021, Volume 86, Number 137, page 38490, and provided a 60-day period for public comment. ACF did not receive any comments.

Consultation with Experts Outside of the Study

We consulted with experts to complement our team’s knowledge and experience (Table A.2). Consultants included researchers with expertise in EHS and child care more broadly, child development, family engagement, and classroom and home visit processes. We also engaged experts with specialized knowledge and skills in the areas of home visit quality and parent–child interactions relevant to this work.

Table A.2. Baby FACES 2020/2022 technical work group members and outside experts

Name

Affiliation

Rachel Chazan Cohen

Department of Human Development and Family Science, University of Connecticut

Mary Dozier

University of Delaware

Anne Duggan

Department of Population, Family and Reproductive Health, Bloomberg School of Public Health, Johns Hopkins University

Beth Green

Portland State University

Erika Lunkenheimer

Pennsylvania State University

Anne Martin

Columbia University

Carla Peterson

Iowa State University

Lori Roggman

Utah State University

Daniel Shaw

University of Pittsburgh

Catherine Tamis-LeMonda

New York University


A9. Tokens of Appreciation

We are not requesting any changes to previously approved tokens of appreciation.

Given the complex study design and nested analysis plan for Baby FACES 2020/2022, respondents’ participation in the study activities is key to ensuring the study’s success. High levels of participation among the sampled EHS programs, staff, and families are essential to help ensure that estimates are nationally representative and to increase comparability of data with that collected in Baby FACES 2018.

Similar studies of low-income young families, such as FACES (OMB control number 0970-0151), include tokens of appreciation to participating families and children as part of an overall successful strategy to increase data quality in a complex study design. FACES 2014 used a tiered approach to tokens of appreciation for its parent survey, lowering the base amount to $15, relative to FACES 2009, with add-ons for a potential of $25 total. There were lower response rates to the FACES 2014 parent survey than seen in previous FACES studies. The study team conducted a nonresponse bias analysis of key child-level characteristics and found significant differences between children whose parents responded to the parent survey at baseline (fall 2014) and those whose parents did not. Specifically, parents of children without disabilities, English speakers, and those with unlimited cell phone minutes were less likely to respond. Further, those in programs reporting 20 percent or less Black child enrollment, and those with more than 50 percent White child enrollment, were more likely to respond than those with children in other types of programs. This experience raises concerns about nonresponse bias when a token of appreciation is not offered.

Similarly, the Project LAUNCH Cross-Site Evaluation (OMB control number 0970-0373) did not offer a token of appreciation to respondents completing the web-based parent survey. The study team found that early respondents (pre-token of appreciation) were not representative of their communities: minorities, individuals with lower incomes, and those who worked part time or were unemployed were underrepresented. Following OMB approval of a $25 post-pay token of appreciation after data collection had started, completion rates and representativeness both improved (LaFauve et al. 2018).

Table A.3. Structure of gifts of appreciation for Baby FACES 2020/2022 and prior rounds

| Baby FACES component | Respondent | 2020/2022: Length of activity | 2020/2022: Token of appreciation | 2018: Length of activity | 2018: Token of appreciation | 2018: Response rate (%) | 2009: Token of appreciation | 2009: Response rate (%) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Parent survey | Parent | 32 minutes | $20 | 32 minutes | $20 | 81.9 | $35 | 79.6 |
| Parent Child Report (PCR) | Parent | 20 minutes | $5 | 15 minutes | $5 | 88.0 | (PCR administered in the home after child assessment; parent–child interaction part of in-home visit) | 86 |
| Parent–child interaction (and in-home observation of home visitor) | Parent | 10 minutes for parent–child interaction; up to 90 minutes in the home | $35 plus children’s book ($7 value) | n.a. | n.a. | n.a. | | 83.7 |
| Staff survey | Teachers and home visitors | 30 minutes | Children’s book ($10 value) | 30 minutes | Children’s book ($10 value) | 97.5 | Children’s book ($5 value) | 98.7 |
| Staff Child Report | Teacher or home visitor | 15 minutes per sampled child | $5 per report | 15 minutes per sampled child | $5 per report | 94.4 | $5 per report | 96.2 |

n.a. = not applicable.


Table A.3 lists tokens of appreciation for programs, staff, and families participating in Baby FACES 2020/2022 data collection. The tokens of appreciation were approved under OMB control number 0970-0354 on September 15, 2020 and we request an extension of that approval. For comparison, the table also reports approved gift amounts and response rates from prior rounds of Baby FACES.





A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing

Personally Identifiable Information

This collection requests personally identifiable information (PII), such as names, dates of birth, due dates, and contact information. All electronic data will be stored on a secure network drive at Mathematica offices and will never be in ACF's possession; data will be backed up on secure servers for 60 days for disaster recovery purposes. Sixty days after the primary data files are securely deleted, the backed-up data will be automatically and securely deleted, as required by the contract (i.e., “The Contractor shall dispose of the primary data and files created during the course of the study in accordance with specifications provided by ACF”). These plans are described in more detail in a data security plan, also required by the contract. Systems will be accessible only to project staff through individual logins and passwords.



The hard copy data collection instruments (staff child reports, parent child reports, and classroom and home visit observation booklets) will temporarily include teacher/home visitor/child names because respondents need to know about whom they are providing information when completing these instruments. Field staff will be trained to safeguard hard copy documents containing PII that are shared between team members. All hard copy documents will be inventoried and sent to and from the field using the FedEx shipping service. FedEx shipments are logged and tracked from package pick-up to delivery, including the name of the person who received the package. We will also use our sample management system to track hard copy documents sent to and from the field. Hard copy materials will be stored in locked cabinets during the study. Following the end of the project, and when no longer required, hard copy materials and other physical media containing sensitive data will be destroyed using a cross-cut shredder.



Following data collection, Mathematica will remove all PII from the instruments, and the de-identified data will be exported for analysis. Neither analysis staff nor ACF will have access to any PII; only de-identified data will be available. Once the analysis is complete, all electronic databases will be deleted, and, as mentioned above, after 60 days the data will no longer be retrievable.

Information will not be maintained in a paper or electronic system from which it is actually or directly retrieved by an individual's personal identifier.


Assurances of Privacy

Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. The consent statement that all study participants will receive provides assurances that the research team will protect the privacy of respondents to the fullest extent possible under the law, that respondents’ participation is voluntary, and that they may withdraw their consent at any time without any negative consequences.

As specified in the contract signed by ACF and Mathematica (referred to as the Contractor in this section), the Contractor shall protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. The Contractor developed a Data Safety Plan that assesses all protections of respondents' PII and submitted it to ACF on October 30, 2015. The Contractor shall ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract/subcontract are trained on data privacy issues and comply with the above requirements. All of the Contractor's staff sign the Contractor's confidentiality agreement when they are hired.

Due to the sensitive nature of part of this research (see A.11 for more information), the evaluation has obtained a Certificate of Confidentiality, attached in Appendix B. The Certificate of Confidentiality helps assure participants that their information will be kept private to the fullest extent permitted by law. Further, all materials to be used with respondents as part of this information collection, including consent statements and instruments, have been approved by the Health Media Lab Institutional Review Board (the Contractor’s IRB).

Data Security and Monitoring

As specified in the evaluator's contract, the Contractor shall use Federal Information Processing Standard (currently FIPS 140-2, Security Requirements for Cryptographic Modules, as amended) compliant encryption to protect all instances of sensitive information during storage and transmission. The Contractor shall securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with that standard. The Contractor shall ensure that this standard is incorporated into the Contractor's property management/control system and establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable Federal and Departmental regulations. In addition, the Contractor must submit a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records, and for protecting any paper records, field notes, or other documents that contain sensitive data or PII, ensuring secure storage and limits on access.

If we need to conduct remote observations of classrooms or home visits, we will do so with secure, encrypted video, using a livestream option so that observations are coded in real time without observers entering the classroom or videos being stored.

For each round of the study, we will create a de-identified restricted use data file and a data user’s guide to inform and assist researchers who would like to use the data in future analyses.

A11. Sensitive Information

To achieve its primary goal of describing the characteristics of the children and families EHS serves, we ask parents and staff (teachers and home visitors) a limited number of sensitive questions. Responses to these items collected during Baby FACES 2009 and 2018 were used to describe the EHS population, their needs, parent outcomes, and families’ circumstances over time. Sensitive questions for parents include potential feelings of depression, use of services for emotional or mental health problems, reports of family violence or substance abuse, household income, and receipt of public assistance. Staff will only be asked about symptoms of depression.

The invitation to participate in the study will inform parents and staff that the survey will ask sensitive questions (these materials are in Appendix C). The invitation will also inform parents and staff that they do not have to answer questions that make them uncomfortable and that the responses they provide will not be reported to program staff.



A12. Burden

Explanation of Burden Estimates

Table A.4 presents the current request for data collection activities: forms that enable sampling of classrooms, home visitors, and families; surveys of sampled EHS staff and families; and observation of parent–child interactions during home visits. The estimates include time for respondents to review instructions, search data sources, complete and review their responses, and transmit or disclose the information. This information collection request covers a period of two years. There are no remaining approved burden hours from the Baby FACES 2009 or 2018 data collections. We have updated the burden estimates to reflect data collection over the next two years; burden related to data collected prior to cancelation of the spring 2020 data collection was removed. We expect the total annual burden to be 1,972 hours for all of the instruments in the current information collection request.

  • Classroom/home visitor sampling form from EHS staff (Instrument 1). For each selected center, a member of the Baby FACES study team will request a list of all Early Head Start (EHS) classrooms from EHS staff (typically the on-site coordinator or center director), for a total of 407 classrooms and home visitor programs. We expect it will take approximately 10 minutes for the EHS staff member to complete this sampling form.

  • Child roster form from EHS staff (Instrument 2). For each selected classroom or home visitor caseload, a Baby FACES study team member will request from EHS staff (typically the on-site coordinator) the names, dates of birth, and enrollment dates of each child or family enrolled in the selected classroom or home visitor caseload. We will identify the sibling groups in the sampling program, and the sampling program will then randomly drop all but one member of each sibling group, leaving one child per family. We expect this form to be completed 252 times and that it will take about 20 minutes for EHS staff to provide the requested information.

  • Parent consent form (Instrument 3). We will ask parents of all 2,495 selected children to provide their consent via a parent consent form. We expect it will take parents about 10 minutes to complete the form.

  • Parent survey (Instrument 4). We will conduct a 32-minute telephone survey with parents of sampled children. We expect responses from a total of 2,084 parents across the 123 programs, about 16.9 per program.

  • Parent Child Report (Instrument 5). The Parent Child Report is a 20-minute self-administered questionnaire that we expect 2,008 parents of sampled children to complete.

  • Staff survey (Teacher survey and Home Visitor survey) (Instruments 6a and 6b). We will conduct 30-minute in-person staff surveys with 609 classroom teachers and 706 home visitors.

  • Staff Child Report (Instruments 7a and 7b). The Staff Child Report is a 15-minute self-administered survey that asks home visitors to report on all of their sampled children and a subsample of teachers to report on their sampled families; in total, 1,046 staff will complete 2,230 Staff Child Reports.

  • Program director survey (Instrument 8). The 30-minute program director survey will be administered via the web with the option of in-person follow-up for those who do not respond on the web. We expect 120 program directors to participate in this survey.

  • Center director survey (Instrument 9). The 30-minute center director survey will be web-based with the option of in-person follow-up for those who do not respond on the web. We expect 294 center directors to complete this survey.

  • Parent–child interaction (Instrument 10). For children over 12 months old who receive home-based services, we will use a 10-minute parent–child interaction task. We expect 996 families to complete the task.
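The sibling-group step described for the child roster form above (randomly dropping all but one member of each sibling group so that one child per family remains) can be sketched as follows. This is an illustrative sketch only, not the study's actual sampling program; the `family_id` field and `one_child_per_family` helper are assumed names:

```python
import random

def one_child_per_family(roster, seed=None):
    """Randomly keep one child per sibling group, keyed by family_id."""
    rng = random.Random(seed)
    by_family = {}
    # Group roster entries into sibling groups by family identifier
    for child in roster:
        by_family.setdefault(child["family_id"], []).append(child)
    # Choose one child at random from each family's sibling group
    return [rng.choice(siblings) for siblings in by_family.values()]

roster = [
    {"name": "Child A", "family_id": 1},
    {"name": "Child B", "family_id": 1},  # sibling of Child A
    {"name": "Child C", "family_id": 2},
]
sample = one_child_per_family(roster, seed=0)
# Exactly one child is retained per family, so the sample has two children
```

Whatever implementation the sampling program uses, the key property is the same: after this step, each family contributes at most one sampled child.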


Table A.4. Total burden requested under this information collection

| Instrument | No. of respondents (total over request period) | No. of responses per respondent | Avg. burden per response (hours) | Total burden (hours) | Annual burden (hours) | Average hourly wage | Total annual respondent cost |
|---|---|---|---|---|---|---|---|
| Classroom/home visitor sampling form (from EHS staff) | 407 | 1 | 0.17 | 69 | 35 | $35.65 | $1,229.93 |
| Child roster form (from EHS staff) | 252 | 1 | 0.33 | 83 | 42 | $35.65 | $1,497.48 |
| Parent consent form | 2,495 | 1 | 0.17 | 424 | 212 | $19.80 | $4,197.60 |
| Parent survey | 2,084 | 1 | 0.53 | 1,105 | 553 | $19.80 | $10,939.50 |
| Parent Child Report | 2,008 | 1 | 0.33 | 663 | 332 | $19.80 | $6,563.70 |
| Staff survey (Teacher survey and Home Visitor survey) | 1,317 | 1 | 0.50 | 659 | 330 | $35.65 | $11,766.68 |
| Staff Child Report | 1,046 | 2.13 | 0.25 | 557 | 279 | $35.65 | $9,928.53 |
| Program director survey | 120 | 1 | 0.50 | 60 | 30 | $35.65 | $1,069.50 |
| Center director survey | 294 | 1 | 0.50 | 147 | 74 | $35.65 | $2,620.28 |
| Parent–child interaction | 996 | 1 | 0.17 | 169 | 85 | $19.80 | $1,673.10 |
| Total | | | | | 1,972 | | $51,486.30 |


Estimated Annualized Cost to Respondents

We expect the total annual cost to be $51,486.30 for all of the instruments in the current information collection request.

Average hourly wage estimates for deriving total annual costs are based on Current Population Survey data for the first quarter of 2021 (Bureau of Labor Statistics 2021). For each instrument included in Table A.4, we calculated the total annual cost by multiplying the annual burden hours and the average hourly wage.

For program directors, center directors, and staff (teachers and home visitors), we used the median usual weekly earnings for full-time wage and salary workers age 25 and older with a bachelor’s degree or higher ($35.65 per hour). For parents, we used the median usual weekly earnings for full-time wage and salary workers age 25 and older with a high school diploma or equivalent and no college experience ($19.80). We divided weekly earnings by 40 hours to calculate hourly wages.
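As a check on the arithmetic, the per-instrument figures in Table A.4 reconcile when annual hours are computed as the unrounded total burden hours divided by the two-year request period. A minimal sketch using three rows from the table:

```python
# Reproduce the Table A.4 cost arithmetic for a few instruments.
# Annual cost = (total burden hours / 2-year request period) x hourly wage;
# all figures below are taken directly from the table.
rows = [
    # (instrument, total_burden_hours, hourly_wage)
    ("Parent consent form", 424, 19.80),
    ("Parent survey", 1105, 19.80),
    ("Program director survey", 60, 35.65),
]
for name, total_hours, wage in rows:
    annual_hours = total_hours / 2  # two-year request period
    print(f"{name}: {annual_hours:g} hours/year -> ${annual_hours * wage:,.2f}")
```

Note that the table's annual-hours column is rounded for display while the costs are computed from the unrounded values: for instance, the parent survey cost ($10,939.50) equals 552.5 annual hours times $19.80, although the table displays 553.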

A13. Costs

We are not requesting any changes to the previously approved honoraria.

The study team will offer each participating program an honorarium of $250 in recognition of the time and expertise that center staff contribute to the data collection, including their assistance in scheduling data collection site visits and gathering parent consent forms. The honorarium is intended both to encourage centers' initial participation and to recognize their efforts to coordinate a timely and complete data collection.

The honorarium approved for Baby FACES 2020 matches the site payments approved for Baby FACES 2018.

A14. Estimated Annualized Costs to the Federal Government

| Cost category | Estimated costs |
|---|---|
| Instrument development and OMB clearance | $21,395 |
| Field work | $3,467,209 |
| Publications/dissemination | $238,736 |
| Total costs over the request period | $3,727,340 |
| Annual costs | $1,863,670 |



A15. Reasons for changes in burden

We have updated the burden estimates to reflect data collection over the next two years; burden related to data collected prior to cancelation of the spring 2020 data collection was removed. There are no changes to the data collection materials or study design.

A16. Timeline

Table A.5 contains the timeline for the data collection and reporting activities. Recruitment will begin in fall 2021. Data collection is expected to occur through spring 2022. Mathematica will produce several publications based on analysis of data from Baby FACES 2020/2022 (See Supporting Statement B, B7).



Table A.5. Schedule for Baby FACES 2020/2022 data collection and reporting

| Activity | Timing |
|---|---|
| Recruitment | |
| Program recruitment | Fall 2021 |
| Data collection | |
| Parent survey (by telephone) | Spring/summer 2022 |
| Program and center director surveys | Spring 2022 |
| On-site classroom observations and staff surveys | Spring 2022 |
| In-home visits for home visit observations and parent–child interactions | Spring 2022 |
| Analysis | |
| Data processing and analysis for data tables | Spring/summer 2022 |
| Data processing and analysis for final report | Winter 2021/spring 2022 |
| Reporting | |
| Data tables | Fall 2022 |
| Final report on the 2020 data collection | Spring 2023 |
| Briefs on specific topics | Spring/summer 2023 |
| Restricted-use data file | Spring 2023 |


A17. Exceptions

No exceptions are necessary for this information collection.



Attachments

Appendices

Appendix A. Conceptual Frameworks and Research Questions

Appendix B. NIH Certificate of Confidentiality

Appendix C. Advance Materials

Appendix D. Brochure

Instruments

Instrument 1. Classroom/home visitor sampling form from Early Head Start staff

Instrument 2. Child roster form from Early Head Start staff

Instrument 3. Parent consent form

Instrument 4. Parent survey

Instrument 5. Parent Child Report

Instrument 6a. Staff survey (Teacher survey)

Instrument 6b. Staff survey (Home Visitor survey)

Instrument 7a. Staff Child Report (Teacher)

Instrument 7b. Staff Child Report (Home Visitor)

Instrument 8. Program director survey

Instrument 9. Center director survey

Instrument 10. Parent–child interaction

References

Bureau of Labor Statistics. “Usual Weekly Earnings of Wage and Salary Workers: Fourth Quarter 2018.” USDL-19-0077. Washington, DC: Bureau of Labor Statistics, January 2019.

Bureau of Labor Statistics. “Usual Weekly Earnings of Wage and Salary Workers: First Quarter 2021.” USDL-21-0655. Washington, DC: Bureau of Labor Statistics, April 2021.

Horm, D., D. Norris, D. Perry, R. Chazan-Cohen, and T. Halle. “Developmental Foundations of School Readiness for Infants and Toddlers: A Research to Practice Report.” OPRE Report No. 2016-07. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2016.

LaFauve, K., K. Rowan, K. Koepp, and G. Lawrence. “Effect of Incentives on Reducing Response Bias in a Web Survey of Parents.” Presented at the American Association of Public Opinion Research Annual Conference, Denver, CO, May 16–19, 2018.

Mack, S., V. Huggins, D. Keathley, and M. Sundukchi. “Do Monetary Incentives Improve Response Rates in the Survey of Income and Program Participation?” In Proceedings of the American Statistical Association, Survey Research Methods Section, 1998, pp. 529–534.

Martin, E., and F. Winters. “Money and Motive: Effects of Incentives on Panel Attrition in the Survey of Income and Program Participation.” Journal of Official Statistics, vol. 17, no. 2, 2001, pp. 267–284.

Office of Management and Budget, Office of Information and Regulatory Affairs. “Questions and Answers When Designing Surveys for Information Collections.” Washington, DC: Office of Management and Budget, 2006.

Singer, E., N. Gebler, T. Raghunathan, J. V. Hoewyk, and K. McGonagle. “The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys.” Journal of Official Statistics, vol. 15, no. 2, 1999, pp. 217–230.

Singer, E., and R.A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, pp. 105–128. Washington, DC: National Academy Press, 2002.

Singer, E., J. Van Hoewyk, and M.P. Maher. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly, vol. 64, no. 2, 2000, pp. 171–188.

Sosinsky, L., K. Ruprecht, D. Horm, K. Kriener-Althen, C. Vogel, and T. Halle. “Including Relationship-Based Care Practices in Infant-Toddler Care: Implications for Practice and Policy.” Brief prepared for the Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2016.

Vogel, Cheri A., Kimberly Boller, Yange Xue, Randall Blair, Nikki Aikens, Andrew Burwick, Yevgeny Shrago, Barbara Lepidus Carlson, Laura Kalb, Linda Mendenko, Judy Cannon, Sean Harrington, and Jillian Stein. “Learning As We Go: A First Snapshot of Early Head Start Programs, Staff, Families, and Children.” OPRE Report No. 2011-7. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, February 2011.

Vogel, Cheri A., Pia Caronongan, Jaime Thomas, Eileen Bandel, Yange Xue, Juliette Henke, Nikki Aikens, Kimberly Boller, and Lauren Bernstein. “Toddlers in Early Head Start: A Portrait of 2-Year-Olds, Their Families, and the Programs Serving Them.” OPRE Report No. 2015-10. Washington, DC: Administration for Children and Families, U.S. Department of Health and Human Services, 2015a.

Vogel, Cheri A., Pia Caronongan, Yange Xue, Jaime Thomas, Eileen Bandel, Nikki Aikens, Kimberly Boller, and Lauren Murphy. “Toddlers in Early Head Start: A Portrait of 3-Year-Olds, Their Families, and the Programs Serving Them.” OPRE Report No. 2015-28. Washington DC: Office of Planning, Research, and Evaluation, and Princeton, NJ: Mathematica Policy Research, April 2015b.

Xue, Yange, Kimberly Boller, Cheri A. Vogel, Jaime Thomas, Pia Caronongan, and Nikki Aikens. “Early Head Start Family and Child Experiences Survey (Baby FACES) Design Options Report.” OPRE Report No. 2015-99. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, September 2015.



1 The Family and Child Experiences Survey (FACES) information collection is approved under OMB #0970-0151.

3 The PIR is an administrative data system for the Head Start program as a whole that includes data collected annually from all programs. Head Start programs collect the information as approved under OMB control number 0970-0427.



