The Early Head Start Family and Child
Experiences Survey (Baby FACES)—2018
OMB Information Collection Request
0970-0354

Supporting Statement
Part A
July 2017

Submitted By:
Office of Planning, Research and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer:
Amy Madigan, Ph.D.


CONTENTS
A1. Necessity for the Data Collection
    Study Background
    Legal or Administrative Requirements that Necessitate the Collection
A2. Purpose of Survey and Data Collection Procedures
    Overview of Purpose and Approach
    Research Questions
    Study Design
    Universe of Data Collection Efforts
A3. Improved Information Technology to Reduce Burden
A4. Efforts to Identify Duplication
A5. Involvement of Small Organizations
A6. Consequences of Less Frequent Data Collection
A7. Special Circumstances
A8. Federal Register Notice and Consultation
    Federal Register Notice and Comments
    Consultation with Experts Outside of the Study
A9. Incentives for Respondents
A10. Privacy of Respondents
A11. Sensitive Questions
A12. Estimation of Information Collection Burden
    Burden Hours
    Total Annual Cost
A13. Cost Burden to Respondents or Record Keepers
A14. Estimate of Cost to the Federal Government
A15. Change in Burden
A16. Plan and Time Schedule for Information Collection, Tabulation and Publication
    Analysis Plan
    Time Schedule and Publication
A17. Reasons Not to Display OMB Expiration Date
A18. Exceptions to Certification for Paperwork Reduction Act Submissions
REFERENCES


TABLES
A.1  Research questions for Baby FACES 2018 and 2020
A.2  Baby FACES 2018 technical work group members and outside experts
A.3  Baby FACES 2018 gift of appreciation structure compared to Baby FACES 2009
A.4  Total burden requested under this information collection
A.5  Baby FACES 2018 schedule for data collection


Contents of OMB Information Collection Request for Baby FACES 2018
Supporting Statement Part A
Supporting Statement Part B

Appendices
Appendix A. 60-Day Federal Register Notice
Appendix B. Comments Received on 60-Day Federal Register Notice
Appendix C. Conceptual Frameworks and Research Questions
Appendix D. Mathematica Confidentiality Pledge
Appendix E. Advance Materials
Appendix F. Brochure
Appendix G. Screen Shots
Appendix H. Baby FACES 2009 Instrument Information

Attachments (Study Instruments)
Attachment 1. Classroom/home visitor sampling form from Early Head Start staff
Attachment 2. Child roster form from Early Head Start staff
Attachment 3. Parent consent form
Attachment 4. Parent survey
Attachment 5. Parent Child Report
Attachment 6a. Staff survey (Teacher survey)
Attachment 6b. Staff survey (Home Visitor survey)
Attachment 7a. Staff Child Report (Teacher)
Attachment 7b. Staff Child Report (Home Visitor)
Attachment 8. Program director survey
Attachment 9. Center director survey


A1. Necessity for the Data Collection

The Administration for Children and Families (ACF) at the U.S. Department of Health and
Human Services (HHS) seeks approval to collect descriptive information for the Early Head
Start Family and Child Experiences Survey 2018 (Baby FACES 2018). The goal of this
information collection is to provide updated nationally representative data on Early Head Start
(EHS) programs, staff, and families to guide program planning, technical assistance, and
research. The data collection for this request will be in 2018 (upon OMB approval of the
request). We anticipate another information collection with a new cross-sectional sample of EHS
programs in 2020 (Baby FACES 2020); however, we expect to focus on different research areas
for that collection. Therefore, this request is for data collection that will occur in 2018; we will
submit a separate request for the information collection for 2020. The 60-day Federal Register
Notice for this request appears in Appendix A.
Study Background
In October 2015, OPRE awarded a contract to Mathematica Policy Research to carry out
Baby FACES 2018 and Baby FACES 2020. The proposed data collection builds upon a prior
study (Baby FACES 2009; also under OMB 0970-0354) that longitudinally followed two cohorts
of children through their experience in the EHS program. We learned a great deal about program
participation over time and about services received by children and families. However, the
earlier design did not allow for national-level estimates of service quality, nor inferences about
children who enter the program after 15 months of age. To fill these knowledge gaps and to
answer additional questions about how programs function, the Baby FACES 2018 design will
include a cross-section of a nationally representative sample of programs, centers, home visitors,
teachers, classrooms, and children and families (including pregnant women). This will allow
nationally representative estimates at all levels at a point in time and will include the entire age
span of enrolled children. It will also aid ACF with planning, training and technical assistance,
management, and policy, which is particularly important given the recent implementation of the
new Head Start Program Performance Standards and adoption of the Head Start Early Learning
Outcomes Framework.
Baby FACES 2018 is guided by a comprehensive conceptual framework for the EHS
program that the study team developed in consultation with content experts. This conceptual
framework illustrates how program processes and activities are expected to lead to high quality
service delivery and enhanced family and infant/toddler outcomes for the program overall. The
study team also developed finer-grained “sub-frameworks” to identify specific research
questions of interest for Baby FACES 2018 to explore. The conceptual framework and sub-frameworks are in Appendix C (Figures 1-3) of this request.
Legal or Administrative Requirements that Necessitate the Collection
There are no legal or administrative requirements that necessitate the collection. ACF is
undertaking the collection at the discretion of the agency.


A2. Purpose of Survey and Data Collection Procedures

Overview of Purpose and Approach
The overarching purpose of the Baby FACES studies is to provide knowledge about EHS
children and families, and the EHS programs and staff who serve them. The Baby FACES
collection of information on EHS programs extends the work of the Family and Child
Experiences Survey (FACES), which has a similar purpose for Head Start programs. The
ongoing series of Baby FACES data collections is aimed at maintaining up-to-date core
information on EHS over time while also focusing on different areas of interest. The Baby
FACES studies began with Baby FACES 2009. That was a longitudinal descriptive study that
followed children and families through participation in the program and focused on whether
child and family wellbeing and outcomes changed over time. Baby FACES 2018 and future
requests have been redesigned to provide cross-sectional descriptive information and a point in
time picture of EHS across all participants, with a particular focus on understanding the
processes in EHS core services (classrooms and home visits) that support infant and toddler
development; namely, nurturing relationships between children and caregivers. With this new
focus, Baby FACES 2018 will take a more in-depth look at classrooms, while Baby FACES 2020 will take a more in-depth look at home visiting.
Previously Approved Request
Baby FACES 2009 (0970-0354, approved 10/21/2008) was the first nationally
representative descriptive study of EHS programs. Using a longitudinal cohort design, it included
a sample of 89 programs and nearly 1,000 children from two birth cohorts (newborns and 1-year-olds) and followed them annually throughout their enrollment in the program (2009‒2012).
Areas of focus for this study were to understand the services offered to families, training and
credentials of staff, and the quality of services provided. Another aim was to describe the EHS
population and to use the longitudinal design to examine changes over time in child and family
functioning, along with possible associations between these changes and aspects of the program
and services received.
Baby FACES 2009, which concluded in 2015, provided rich descriptive information on the
EHS program, families’ participation in it, and the amount and quality of services provided (see
Vogel et al. 2011, Vogel et al. 2015a, and Vogel et al. 2015b).
Current Request
Baby FACES 2018 builds on prior and current studies being conducted by ACF. It will build
on descriptive information from Baby FACES 2009; the 2018 round allows ACF to gather
nationally representative information about EHS families and programs. It will also provide
information on EHS-Child Care Partnership grantees, which will be sampled for Baby FACES
2018. Complementing Baby FACES 2009 and Baby FACES 2018, the separately conducted Family and Child Experiences Survey (FACES) (OMB #0970-0151) looks at Head Start programs and the children they serve (ages three to five), filling out the birth-to-five age spectrum.
The study team will carry out a descriptive study that includes a nationally representative
sample of EHS programs, centers, home visitors, teachers, classrooms, and children and families (including pregnant women) and answers new questions about how EHS programs function. The
current request aims to provide an overall picture of how EHS programs are serving children and
families with a special focus on how classrooms and home visits support infant-toddler
development through responsive relationships. The study will address this goal through the
collection of rich information using interviews, self-administered questionnaires, classroom
observations, and administrative data sources. This approach will allow ACF to capture
important information about EHS services, families, and children across all service options (i.e.,
center-based, home-based, mixed), as well as in-depth information about how EHS classrooms
and teacher-child relationships support infant/toddler development.1 Specifically, we propose the
following data collection activities, which we would carry out in winter and spring 2018 (after
OMB approval):

• Three initial activities will facilitate sampling and prepare for the data collection:
  - A classroom/home visitor sampling form from EHS staff (Attachment 1), which we will use to sample centers, classrooms, teachers, and home visitors2
  - A child roster form from EHS staff (Attachment 2), which we will use to sample children and their parents as well as pregnant women
  - A parent consent form (Attachment 3), which we will use to request consent for the parent and his or her child to participate in the study
• A parent survey (Attachment 4) that gathers information about child and family socio-demographic characteristics; parents' health and well-being; household activities, routines, and climate; parents' relationships with EHS staff; and parents' engagement with and experiences in the program
• A self-administered Parent Child Report (Attachment 5) that will provide information on sampled children's language and social-emotional development, child health and well-being, parenting stress, parents' perceptions of their relationship with their child, and social support. Pregnant women will not be asked to complete the Parent Child Report but will report on their perceptions of social support in the parent survey.
• A staff survey of teachers (teacher survey; Attachment 6a) and home visitors (home visitor survey; Attachment 6b) sampled from centers and programs, respectively. Teachers and home visitors will provide information about the staff development and training provided by their program, curricula and assessments they use, the organizational climate of their program, languages spoken by the children and families they work with, and their health and background information. In addition, teachers will provide information about the characteristics of and routines used in their classrooms and their beliefs about infant and toddler development.
• A Staff Child Report for each sampled child completed by either his or her assigned teacher (Staff Child Report-Teachers; Attachment 7a) or home visitor (Staff Child Report-Home Visitors; Attachment 7b). Teachers and home visitors will provide information on children's language and social-emotional development, developmental screenings and referrals, perceived relationship with the child's parents, and the family's engagement with the program. In addition, teachers will report on their perceptions of their relationship with the child, and home visitors will provide information about their provision of services to families in the past four weeks (including topics and activities covered, referrals, alignment of visit content to planned goals, and frequency and modes of communication). Home visitors will also complete a briefer version for pregnant women.
• A program director survey (Attachment 8) to understand program functioning and how programs support the quality of EHS services, including program goals, plans, decision-making processes, training, and professional development, among others
• A center director survey (Attachment 9) to understand use of curriculum, organizational climate, staff qualifications, and similar topics related to how centers support quality of EHS classrooms
• Observations of quality in classrooms, which will be conducted directly by study team staff visiting the classroom and will not impose burden on participating EHS staff beyond the forms described above
• Linking data collected at the program level to administrative data provided by programs in the Program Information Report (PIR) to reduce burden on program staff

1 Future information requests will include observations of EHS home visits and the parent-child relationship.
2 The classroom/home visitor sampling form is used after EHS programs are sampled using publicly available administrative data. See Supporting Statement Part B for a complete description of the sampling process.

Future Request
We intend that Baby FACES 2018 will be part of a repeated cross-section of nationally
representative samples of programs, centers, home visitors, teachers, children and families. We
plan to conduct a second data collection in spring 2020 (Baby FACES 2020) to collect
descriptive information at another point in time and to examine change over time at an aggregate
level. Although Baby FACES 2020 will be similar in many ways to the current request, it will
take a more in-depth look at home visiting quality and the parent-child relationships that home
visiting fosters. Both the current and future requests will describe how EHS programs are serving
children and families by collecting detailed information through interviews, self-administered
questionnaires, observations, and administrative data sources. However, while Baby FACES
2018 will collect some information about home visits in programs with home-based services, it
will take a more in-depth look at how classrooms in programs with center-based services support
infants and toddlers. Accordingly, a key purpose of Baby FACES 2020 will be to gather more in-depth information on home visits, including their quality and how they foster parent-child
relationships.
Research Questions
Working collaboratively with ACF and the Baby FACES technical work group (see Table
A.2), we developed the broad conceptual framework for EHS that depicts how and why program
services are expected to lead to positive outcomes for infants and toddlers and their families (see
Appendix C). The building blocks of the conceptual framework for EHS include multiple layers:
the resources, assets, contributions, and information available to achieve program goals (inputs);
the plans and activities, services, and processes designed to achieve program goals (activities);
the direct, tangible results of program efforts, such as level of service delivery and participation (outputs); and the benefits of program participation for children and families (enhanced
outcomes). The conceptual framework shows the pathways from inputs for operating EHS
programs to program goals of achieving enhanced outcomes for children and families.
Guided by the broad conceptual framework for EHS, we then developed two sub-frameworks that guide the redesign of Baby FACES (from longitudinal in 2009 to cross-sectional in 2018-2020) and serve as a road map for the topics of interest in Baby FACES 2018
and 2020 (see Appendix C). In developing these sub-frameworks, we identified constructs that
are considered to be most important to capture to answer study questions. ACF’s priorities for
Baby FACES 2018 and Baby FACES 2020 are the processes in classrooms and home visits that
support responsive relationships: teacher-child relationships, staff-parent relationships, and
parent-child relationships. The overarching research question for both Baby FACES 2018 and
Baby FACES 2020 is: How do EHS services support infant/toddler growth and development in
the context of nurturing, responsive relationships? Baby FACES 2018 will focus on a more in-depth look at classrooms and Baby FACES 2020 will focus on a more in-depth look at home
visits.
Table A.1 lists high-level research questions that align with the broad conceptual framework
and particularly the sub-frameworks to examine program processes, program functioning, and
classroom/home visit processes that lead to responsive relationships and ultimately enhanced
infant/toddler outcomes and family well-being. Detailed lists of the specific research questions
for the center-based and home-based options are in Appendix C (Tables 1 and 2, respectively).
The research questions in those tables map to the research question numbers in the conceptual
sub-frameworks in Appendix C (Figures 2 and 3).
Table A.1. Research questions for Baby FACES 2018 and 2020

Service characteristics
How do EHS classrooms and home visits support infant/toddler growth and development in the context of nurturing, responsive relationships?
- What is the quality of relationships between EHS children and their caregivers (e.g., parents and teachers) and relationships between parents and their home visitors?
- How does EHS support these relationships in classrooms and home visits?
- How do these relationships relate to the development of infants/toddlers in EHS?

Program processes and functioning
How do program-level processes and functioning support the development of nurturing, responsive relationships in classrooms and home visits?
- How do program leadership, planning, culture, staff training, technical assistance, etc., support quality and the development of responsive relationships between children and their caregivers and between parents and home visitors?

Infant/toddler outcomes and family well-being
How are EHS infants and toddlers faring in key domains of development and learning (e.g., language and social-emotional development)? How are EHS families functioning (e.g., social/economic well-being, family resources and competencies)?
- What do parent-child relationships and home environment look like among EHS families?
- How are parent-child relationships and family well-being associated with the development of infants/toddlers in EHS?


These questions address gaps in the research literature that we identified at the end of Baby
FACES 2009 (Xue et al. 2015). The unique developmental characteristics of infants and toddlers
require a focus on developing and supporting relationships between young children and
caregivers because sensitive and responsive relationships with caregivers are critical for the
healthy development of young children (Horm et al. 2016). Thus, relationship-based care
practices are a priority area for practice and policy in child care settings for infants and toddlers
(Sosinsky et al. 2016). At the center of relationship-based care practices in EHS are the supports
for parents, teachers/home visitors, and children to build relationships with one another.
Therefore, in Baby FACES 2018 and 2020, we will focus on teacher-child relationships, parent-staff relationships, and parent-child relationships and how program processes and functioning
support the development of these nurturing, responsive relationships in classrooms and home
visits. We will also address questions of how these relationships relate to outcomes for
infants/toddlers in EHS and their families.
The prior study enabled us to examine nationally representative estimates of the programs
and two different age cohorts (newborns and one-year-olds at study enrollment) and addressed different questions about the longitudinal experiences of families in programs. The current design
allows for nationally representative measures at all levels (program, center, classroom,
teacher/home visitor, and child/family). It also enables us to examine program processes that are
hypothesized to enhance family and child outcomes and to observe the strength of associations
between processes and outcomes.
Study Design
Baby FACES is a nationally representative, descriptive study of EHS providing rich
information to guide program planning, technical assistance, and research. It describes the key
characteristics of families served in EHS, investigates what services are offered and their quality,
describes how EHS children and families are faring, and explores associations between the type
and quality of services and child and family well-being.
Baby FACES 2009 was a longitudinal study that followed nearly 1,000 children in two
cohorts (newborns and one-year-olds) through their experience in the EHS program. The
longitudinal design included a sample of 89 programs and a census of children in the two birth
cohorts within each. It collected data from programs, parents, and staff (teachers and home
visitors of study children) as well as classroom and home visit quality and services offered by
programs and received by families. It focused on measuring how children and families fared over time and on the type, frequency, and quality of the services received during enrollment.
Baby FACES 2018 and 2020 will shift the focus to examine the core services offered by the
program and how program processes support relationships (for example, between home visitors
and parents, between parents and children, and between teachers and children) that are
hypothesized to lead to improved child and family outcomes. To address these questions, the
study will employ a cross-sectional approach, capturing descriptive data on EHS programs,
centers, home visitors, classrooms and teachers and the families, children, and pregnant women
at a single point in time. The descriptive study will involve collecting quantitative information at
all of these levels using nationally representative samples that will allow national-level estimates
as well as exploration of associations across different levels. Baby FACES 2009 did not sample at the teacher and home visitor level, and therefore could not report nationally representative
findings at this level. The current request covers the Baby FACES 2018 data collection. A future
request will cover Baby FACES 2020.
The design includes surveys of program directors, center directors, teachers, home visitors,
and parents. It also includes observations of EHS center-based classroom quality, home visit
quality (via observation in Baby FACES 2020), and parent and staff reports of child
development. The surveys are quantitative and use established scales with known validity and
reliability whenever possible. Gathering information from the perspective of the staff member
about children’s development and about the relationship with the family will be critical to
address research questions about the associations between the staff-family relationship, family
engagement with the program, and outcomes.
This will be the first time nationally representative information will be available about teachers/classrooms and home visitors in the EHS program. The administrative data currently available for EHS programs do not provide the depth or richness needed to answer the research questions; almost no information is available at the national level about EHS teachers/classrooms and home visitors. We need information about staff and service quality that is linked to the sampled programs to address the research questions and understand how program processes support staff and their relationships with children and families.
Because there are few instruments to measure the constructs of interest at the program or
center level, we have worked extensively with experts to identify potential measures and, in
some cases, to develop items tapping these constructs. This effort fills a gap in the knowledge
base about EHS program processes and will answer questions about relationships between
program characteristics and other levels of the conceptual framework.
The study design will enable us to describe the program overall and at each of the levels
noted above. It will not enable us to draw conclusions about causal associations (or the direction
of associations) between and among various program processes, classroom/home visit quality,
and outcomes. However, it will allow us to explore hypotheses about the strength of association
among these components. In the next section, we briefly describe each data collection instrument
we plan to use. Part B of the Supporting Statement contains more detailed information about the
design of the sample.
Our objective is a nationally representative sample of 140 programs. Based on test samples of PIR data, we expect 88 percent of programs to provide center-based services and 67 percent to provide home-based services, with 55 percent providing both. This will result in a sample of 123 programs offering center-based services and 94 offering home-based services (77 providing both). We will sample an average of 4 centers per program to achieve 493 centers in total, and an average of 1.7 classrooms per center for a total of 840 classrooms (and teachers). We will sample an average of 6.7 home visitors per program, or 630 in total. Within the sample of home visitors in each program, we will select three from which to sample children. We expect to sample a total of 3,465 children in this manner (three children per classroom or home visitor). These figures do not take into account the response rates we expect (and thus may differ slightly from figures in Table A.4), which we discuss further in Supporting Statement Part B.
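To make the arithmetic behind these targets explicit, the short calculation below is a minimal sketch (not the study's actual sampling program) that derives the program-, center-, classroom-, and home-visitor-level counts from the design parameters stated above; because each step rounds, the results land within a few units of the cited figures.

```python
# Minimal sketch (not the study's sampling code): derive expected Baby FACES 2018
# sample sizes from the design parameters stated above. Each step rounds, so the
# results fall within a few units of the cited figures (123/94/77 programs,
# 493 centers, 840 classrooms, 630 home visitors).

N_PROGRAMS = 140                 # target number of sampled programs
SHARE_CENTER_BASED = 0.88        # share of programs offering center-based services (test PIR samples)
SHARE_HOME_BASED = 0.67          # share offering home-based services
SHARE_BOTH = 0.55                # share offering both options

CENTERS_PER_PROGRAM = 4.0        # average centers sampled per center-based program
CLASSROOMS_PER_CENTER = 1.7      # average classrooms (and teachers) sampled per center
HOME_VISITORS_PER_PROGRAM = 6.7  # average home visitors sampled per home-based program

center_programs = round(N_PROGRAMS * SHARE_CENTER_BASED)           # ~123
home_programs = round(N_PROGRAMS * SHARE_HOME_BASED)               # ~94
both_programs = round(N_PROGRAMS * SHARE_BOTH)                     # ~77

centers = round(center_programs * CENTERS_PER_PROGRAM)             # ~493
classrooms = round(centers * CLASSROOMS_PER_CENTER)                # ~840
home_visitors = round(home_programs * HOME_VISITORS_PER_PROGRAM)   # ~630

print(center_programs, home_programs, both_programs)
print(centers, classrooms, home_visitors)
```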


Universe of Data Collection Efforts
Previously Approved Data Collection Requests
Baby FACES 2009 was a descriptive study of EHS programs with a representative sample
of programs and children in two age cohorts: perinatal (pregnant women within two months of
their due date and their infants up to two months after birth) and age 1 (children between 10 and 14 months of age), with both cohorts followed until children reached age 3 or left the
program. Baby FACES 2009 included interviews each spring over four years with program
directors, teachers, and home visitors assigned to study children, as well as with parents. The
interviews gathered information about a wide range of topics, including descriptive data about
programs, children’s and families’ demographic characteristics and well-being, family needs and
how they change over time, and services received. Baby FACES 2009 also conducted
observations of the quality of classrooms and home visits annually starting when the child was
one year old. When children were ages 2 and 3, it also conducted observations of parent-child
interactions and direct assessments of children's auditory comprehension, receptive language (age 3
only), and pre-verbal communication/early language development. Other components of the data
collection included measuring the child’s height and weight, and observational assessment of the
child’s social/emotional development and home environment. Staff completed a weekly report
on the services offered to and received by study families throughout children’s enrollment in the
program, including the number of classroom days offered and attended by children and the
number of home visits offered and received by families. Appendix H includes a table of all the
instruments included in Baby FACES 2009.
Current Data Collection Request
Baby FACES 2018 builds on the work done for Baby FACES 2009. We developed new and
revised instruments to reflect the changes between the two studies in research questions
(including the new emphasis on learning more about how EHS programs support responsive
relationships between children and caregivers) and study design (moving from a longitudinal to a
cross-sectional design). The instruments also reflect lessons learned from Baby FACES 2009.
Appendix C (Tables 3 and 4) includes a crosswalk between the research questions, the constructs
of interest, the measures used, and the survey instrument(s) that will capture them. To the extent
possible, we drew on survey items used in Baby FACES 2009 and other prior studies and
standardized measures of particular constructs. The survey instruments and forms (Attachments
1-9) are annotated to identify sources of questions from existing studies as well as questions we
developed for this study. Next, we briefly describe each of the instruments included in this
information request.
Program Information Report (PIR). The PIR is an administrative data system for the
Head Start program as a whole that includes data collected annually from all programs. We will
use the most recent PIR to select a sample of programs by deriving an initial sample frame from
the PIR and then by using program characteristics from the PIR as explicit and implicit
stratification variables (we describe this approach in detail in Section B1 of Supporting
Statement Part B). We will also use program characteristics from the PIR as data in the analysis,
including program size, location, population served, and percentage of children who have a
medical home. There is no burden to study participants associated with the collection of PIR data for Baby FACES (the information is already collected by Head Start programs as approved under
OMB #0970-0427).
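To illustrate the selection approach described above (explicit strata drawn from PIR characteristics, with implicit stratification achieved by sorting the frame and selecting systematically within each stratum), the sketch below uses a hypothetical frame with made-up variable names (region, enrollment) and an illustrative allocation; the actual frame construction, stratifiers, and selection probabilities are described in Supporting Statement Part B.

```python
# Illustrative sketch only (hypothetical frame, variable names, and allocation;
# not the study's actual sample selection program). It shows explicit stratification
# by a program characteristic plus implicit stratification via sorting and
# systematic selection within each stratum.
import random

def systematic_sample(units, n):
    """Select n units at a fixed interval through an ordered list, with a random start."""
    if n <= 0 or not units:
        return []
    interval = len(units) / n
    start = random.uniform(0, interval)
    positions = [int(start + i * interval) for i in range(n)]
    return [units[min(p, len(units) - 1)] for p in positions]

# Hypothetical PIR-derived frame: one record per EHS program.
frame = [
    {"program_id": i,
     "region": random.choice(["NE", "S", "MW", "W"]),   # explicit stratum (illustrative)
     "enrollment": random.randint(40, 400)}             # implicit stratifier (sort key)
    for i in range(1, 1101)
]

allocation = {"NE": 30, "S": 40, "MW": 35, "W": 35}     # illustrative allocation totaling 140

sample = []
for stratum, n in allocation.items():
    members = [p for p in frame if p["region"] == stratum]
    members.sort(key=lambda p: p["enrollment"])         # implicit stratification by program size
    sample.extend(systematic_sample(members, n))

print(len(sample), "programs selected")
```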
Classroom/home visitor sampling form from EHS staff (Attachment 1). We will ask
staff at each sampled EHS program to fill out this form, listing all of the centers and home
visitors, along with characteristics such as the number of classrooms (for centers) and size of
caseload and whether they provide services to pregnant women (for home visitors).
Child roster form from EHS staff (Attachment 2). After sampling centers, classrooms,
and home visitors, we will ask EHS program staff to fill out the child roster form, listing all
children in the sampled classrooms and all children and pregnant women receiving services from
the sampled home visitors. Information from this form will be used to select families for
inclusion in the study.
Parent consent form (Attachment 3). After sampling children and pregnant women, we
will ask each child’s parent and each pregnant woman to fill out and sign a form indicating their
consent to participate in the study.
Parent survey (Attachment 4). We will conduct a 30-minute telephone survey interview
with parents of sampled children or with pregnant women. We expect responses from a total of
2,310 parents of children across the 140 programs, about 16.5 per program. We will ask parents
about child and family socio-demographic characteristics; their health and well-being; household
activities, routines, and climate; their relationships with EHS staff and their engagement with and
experiences in the program. This will provide information at the child/family level that will be
important for understanding linkages and associations among family characteristics, program
experiences, and outcomes.
Parent Child Report (Attachment 5). The Parent Child Report is a 15-minute self-administered questionnaire, available in paper form, that we expect 2,310 parents of sampled
children to complete. The Parent Child Report will collect information about their child’s
language and social-emotional development; their child’s health and well-being; parenting stress;
parents’ perceptions of their relationship with their child; and social support.3
Staff survey (Teacher survey and Home Visitor survey) (Attachments 6a and 6b). We
will conduct 30-minute in-person staff surveys with 798 teachers (teacher survey) and 599 home
visitors (home visitor survey). The surveys will provide information about the staff development
and training offered by their program, curricula and assessments they use, the organizational
climate of their program, languages spoken by the children and families they work with, and
their health and background information. In addition, teachers will also provide information
about the characteristics and routines used in their classrooms and their beliefs about infant and
toddler development. We will link the information gathered in the teacher survey to observed
quality in the classroom. We will report data gathered from the staff surveys descriptively as well
as in analyses examining associations among different sample levels and moderators. Field staff
who are on-site for data collection will administer the paper surveys in person.
3 Pregnant women sampled for the study will not be asked to complete the Parent Child Report. However, they will report on their perceptions of social support in the parent survey.


Staff Child Report (Attachments 7a and 7b). The Staff Child Report is a 15-minute self-administered survey that is available on the web and in paper form. It will ask teachers to report
on all of their sampled children and a subsample of home visitors to report on their sampled
families, which will total 1,097 staff completing 2,742 Staff Child Reports. These reports focus
on each child’s language and social-emotional development, developmental screenings and
referrals, perceived relationship with the child’s parents, and the family’s engagement with the
program. In addition, teachers will report on their perceptions of their relationship with the child,
and home visitors will provide information about their provision of services to families in the
past four weeks (including topics and activities covered, referrals, alignment of visit content to
planned goals, and frequency and modes of communication). Home visitors will complete a
briefer version for pregnant women that excludes the reports of the child’s development. Field
staff will collect the paper forms before they leave the program site.
Program director survey (Attachment 8). The 30-minute program director survey will be
administered via the web with the option of in-person follow-up for those who do not respond on
the web. This survey will document program goals, program decision-making, staff supports, and
use of data. Program directors will also be asked to provide information about home visiting
curricula and home visitor professional development, parent involvement, and program processes
for supporting responsive relationships. We expect 140 program directors to participate in this
survey.
Center director survey (Attachment 9). The 20-minute center director survey will be web-based with the option for in-person follow-up for those who do not respond on the web. This
survey will document aspects of the center such as use of curricula in classrooms, and teacher
professional development. We expect 493 center directors to complete this survey.
Classroom observations. We will use two classroom observation tools to capture teacher-child relationships: the Quality of Caregiver-Child Interactions for Infants and Toddlers
(Q-CCIIT) measure (Atkins-Burnett et al. 2015) and the infant and toddler versions of the
Classroom Assessment Scoring System (CLASS), the CLASS-Toddler (La Paro et al. 2012) and
the CLASS-Infant (Jamison et al. 2014). The Q-CCIIT is a new measure developed under
contract with ACF and the CLASS-Infant and Toddler are downward extensions of the widely
used preschool version of the CLASS measure. We will use both the CLASS and the Q-CCIIT
for Baby FACES 2018 to advance scientific knowledge and expand information about the
validity of both measures. These observations do not impose any burden on respondents.
The Q-CCIIT assesses the quality of child care settings for infants and toddlers in center-based settings and family child care homes—specifically, how a given caregiver interacts with a
child or group of children in nonparental care. The Q-CCIIT measures caregivers’ support for
social-emotional, cognitive, and language and literacy development, as well as areas of concern
(such as harshness, ignoring children, and health and safety issues). There is no burden to study
participants associated with the collection of data using the Q-CCIIT.
The CLASS-Toddler and the CLASS-Infant measure the quality of teacher-child
interactions in toddler and infant classrooms in center-based settings and family child care
homes. The toddler version includes two domain scores: (1) Engaged Support for Learning
(facilitation of learning and development, quality of feedback, and language modeling); and (2) Emotional and Behavioral Support (positive and negative climate, teacher sensitivity, regard
for children’s perspectives, and behavior guidance). The infant version includes only one domain
score—Responsive Caregiving (sensitivity, language stimulation, scaffolding, and relational
climate). There is no burden to study participants associated with the collection of data using the
CLASS-Toddler or Infant.
Future Data Collection Request
Because the overall purpose of both Baby FACES 2018 and Baby FACES 2020 is to
describe how EHS programs are serving children and families, our future request will likely
include instruments to collect data from individuals at the same levels as the current request,
including parents, teachers, home visitors, program directors, and center directors. The
instruments will likely measure the same constructs, but with revisions to the items based on
lessons learned from Baby FACES 2018. We will also revise the instruments to reflect the
change of in-depth focus from classrooms in Baby FACES 2018 to home visiting in Baby
FACES 2020. As a result of this change in focus, in Baby FACES 2020 we plan to add a measure of the parent-child relationship based on an observation of the interaction between the parent and child conducted in the child's home. We also plan to add appropriate
measures of home visit quality. We expect to select these measures based on lessons from other
government efforts currently underway, such as the Mother and Infant Home Visiting Program
Evaluation (MIHOPE) (OMB #0970-0402).
A3. Improved Information Technology to Reduce Burden

The data collection will use a variety of information technologies to reduce the burden of participation on respondents. The program director and center director surveys, as well as the Staff Child Reports, will include a web-based mode as an option for completing the survey. We will conduct parent surveys using computer-assisted telephone interviewing. We will not include a web-based mode for the parent survey or Parent Child Report because studies of similar populations have found low response rates via the web. We will conduct staff surveys (teacher and home visitor surveys) in person as part of the on-site data collection; as a result, a web-based mode is not necessary for these surveys.
A4. Efforts to Identify Duplication

There is no other current or planned effort to collect nationally representative, descriptive
information about EHS programs, centers, classrooms, teachers, home visitors, or the
children/families they serve. None of the study instruments asks for information that is available
from alternative data sources, including administrative data. We will use existing administrative
information as much as possible, primarily for constructing the sample frame and for some basic
program characteristics. Specifically, PIR data provide information about EHS programs at the
program-level only (no classroom/teacher, home visitor, child or family level data are available
through PIR or other administrative data sources). We will use the PIR data for sampling and to
obtain basic descriptive information about programs’ structural characteristics and enrollment.
None of the program-level information to be collected under this request (e.g., through the
program director survey) duplicates data collected in PIR or other administrative data sources.
Additionally, the study instruments are designed to minimize duplication of data collected across instruments; data are duplicated only in cases in which we need the perspective of more than one type of respondent to answer specific research questions. For example, we ask questions about
the perceived parent-staff relationship from both the parent and the associated staff member
(teacher or home visitor). We also ask parents and staff to report on children’s language and
social-emotional development. Parent and staff ratings draw on children’s behaviors and
interactions with familiar adults in different contexts (such as in the home for home visitor
ratings, the classroom for teacher ratings, and multiple contexts for parents). Collecting data
from both parents and staff will capture a more complete picture of children’s development.
A5. Involvement of Small Organizations

Most of the EHS programs and child care centers included in the study will be small
organizations, including community-based organizations and other nonprofits. We will minimize
burden for respondents by restricting the length of survey interviews as much as possible,
conducting survey interviews on-site or via telephone at times that are convenient to the
respondent, and providing some instruments in a web-based format.
A6. Consequences of Less Frequent Data Collection

No nationally representative information has been collected on EHS classrooms, home
visitors, families, or children since the conclusion of Baby FACES 2009 data collection in 2012. Baby FACES
2018 will take place six years after the last round of data collection. During this period, EHS has
experienced major changes, including an expansion of the program, implementation of new
program performance standards, and other policy changes.
A7. Special Circumstances

There are no special circumstances for this data collection.
A8. Federal Register Notice and Consultation

Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of
Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29,
1995), ACF published a notice in the Federal Register announcing the agency’s intention to
request an OMB review of this information collection activity. This notice was published on
April 14, 2017, Volume 82, Number 71, page 18000, and provided a 60-day period for public
comment. A copy of this notice is attached as Appendix A. The comments received during the
notice and comment period and the ways they were addressed are included in Appendix B.
Consultation with Experts Outside of the Study
We consulted with experts to complement our team’s knowledge and experience (Table
A.2). Consultants included researchers with expertise in EHS and child care more broadly, child
development, family engagement, and classroom and home visit processes. We also engaged
experts with specialized knowledge and skills in the areas of research design and data collection
methods/measurement relevant to this work.


Table A.2. Baby FACES 2018 technical work group members and outside experts

| Name | Affiliation |
| Catherine Ayoub | Harvard Medical School, Massachusetts General Hospital, Boston Children’s Hospital |
| Sandra Barrueco | Department of Psychology, Catholic University of America |
| Margaret Burchinal | Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill |
| Rachel Chazan Cohen | Department of Curriculum and Instruction, College of Education and Human Development, University of Massachusetts Boston |
| Anne Duggan | Department of Population, Family and Reproductive Health, Bloomberg School of Public Health, Johns Hopkins University |
| James Elicker | Department of Human Development and Family Studies, College of Health and Human Sciences, Purdue University |
| Brenda Jones Harden | Department of Human Development and Quantitative Methodology, College of Education, University of Maryland |
| Carolyn Hill | MDRC; McCourt School of Public Policy, Georgetown University |
| Diane Horm | Early Childhood Education Institute, University of Oklahoma-Tulsa |
| Ursula Johnson | Children’s Learning Institute, University of Texas Health Science Center at Houston |
| Jon Korfmacher | Herr Research Center for Children and Social Policy, Erikson Institute |
| Peter Mangione | Center for Child and Family Studies, WestEd |
| Virginia Marchman | Department of Psychology, Stanford University |
| Christine McWayne | Eliot-Pearson Department of Child Study and Human Development, School of Arts and Sciences, Tufts University |
| Sonia Middleton | Early Head Start Program, CentroNía of Washington DC |
| Helen Raikes | Child, Youth and Family Studies, University of Nebraska-Lincoln |
| Claire Vallotton | Human Development and Family Studies, Michigan State University |
| Martha Zaslow | Office for Policy and Communications, Society for Research in Child Development |

A9. Incentives for Respondents

Participation in Baby FACES 2018 will place some burden on program staff, families, and
children. To offset this burden and to acknowledge respondents’ efforts in a respectful way, we
will provide nominal monetary tokens of appreciation to respondents based on the ones used
effectively in Baby FACES 2009. We propose to offer program staff and families gifts of
appreciation for their participation in data collection activities (Table A.3). For comparison, the
table also shows incentives and response rates by instrument in Baby FACES 2009. We will also
provide programs with $250 to acknowledge the program’s overall participation in the study, and
the efforts of program staff to assist in scheduling the data collection visits and gathering parent
consent forms. Program directors can use the $250 at their discretion; for example, they may
choose to share it with study centers.


Table A.3. Baby FACES 2018 gift of appreciation structure compared to Baby FACES 2009

| Baby FACES component | Respondent | Length of activity | Baby FACES 2018: Gift of appreciation | Baby FACES 2009: Gift of appreciation | Baby FACES 2009: Response rate (percent) |
| Parent survey | Parent | 30 minutes | $20 | | 79.6 |
| Parent Child Report | Parent | 15 minutes | $5 | $35 | |
| Staff survey | Teachers and home visitors | 30 minutes | Children’s book ($10 value) | Children’s book ($5 value) | 98.7 |
| Staff Child Report | Teacher or home visitor | 15 minutes per sampled child | $5 per report | $5 per report | 96.2 |
| Data collection site visit | Program | | $250 | $250 | 86, 100 |

Taking into consideration OMB guidance (2006) on providing gifts of appreciation, we
propose to provide participants with these gifts of appreciation for the following reasons:
1. They should increase response rates and mitigate nonresponse bias. The knowledge that
they will receive a gift for completion will likely increase respondents’ probability of
completing the data collection activities. Research has shown that incentives for respondents
are effective in increasing response rates (see meta-analysis by Singer et al. 1999). More
recently, Goldenberg et al. (2009) found that monetary incentives increased response rates
and data quality over no incentive. Those receiving the incentive were less likely to say
“don’t know” or refuse to answer individual items. Others have found that incentives
significantly increase response rates overall, but particularly with those who had previously
refused (Zagorsky and Rhoton 2008). Singer and Kulka (2002) examined a number of
studies that showed that incentives reduce differential response rates and hence the potential
for nonresponse bias. This was true particularly for low-income and minority populations,
which resemble populations served by EHS. For example, in their meta-analysis, Singer et
al. (1999) found that in three studies, using incentives was useful in achieving higher
response rates from respondents who may otherwise be underrepresented in surveys, such as
those from low income and minority populations. Other studies have also found that
incentives are more effective in recruiting as well as retaining low-income and minority
populations (Mack et al., 1998; Martin et al., 2000; Singer et al., 2000).

2. They can ensure nationally representative estimates. The participation of respondents in
the study activities is key to ensuring the quality of the information gathered. High levels of
participation among the target population of EHS programs, staff, and families are essential
to ensure that estimates are nationally representative.

3. They have worked well with similar populations in the past. The gift structure we are
proposing is very similar to the one used in Baby FACES 2009 (0970-0354) and in earlier
rounds of the Head Start Family and Child Experiences Survey (0970-0151). Baby FACES
2009 achieved very high response rates across multiple rounds of data collection―some
families participated in four annual rounds of data collection. Because we have historically
achieved high response rates with low income and minority populations, we believe that using a similar approach to incentives is the best way to achieve high response rates in the
current study.
A10. Privacy of Respondents

The information we collect will be kept private to the extent permitted by law. The consent
statement that all study participants will receive includes assurances that the research team will
protect the privacy of respondents to the fullest extent possible under the law, that respondents’
participation is voluntary, and that they may withdraw their consent at any time without any
negative consequences.
As specified in the contract signed by ACF and Mathematica (referred to as the Contractor
in this section), the Contractor shall protect respondent privacy to the extent permitted by law
and will comply with all Federal and Departmental regulations for private information. The
Contractor developed a Data Safety Plan that assesses all protections of respondents’ personally
identifiable information (PII) and submitted it to ACF on October 30, 2015. The Contractor shall
ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor
who perform work under this contract/subcontract are trained on data privacy issues and comply
with the above requirements. All of the Contractor’s staff sign the Contractor’s confidentiality
agreement when they are hired; a copy of the agreement, called the Confidentiality Pledge, is
attached as Appendix D.
Due to the sensitive nature of part of this research (see A.11 for more information), the
evaluation will obtain a Certificate of Confidentiality. The study team has applied for this
Certificate and will provide it to OMB upon receipt. The Certificate of Confidentiality helps
assure participants that their information will be kept private to the fullest extent permitted by
law. Further, all materials to be used with respondents as part of this information collection,
including consent statements and instruments, will be submitted to the New England Institutional
Review Board (the Contractor’s IRB) for approval.
Data security. As specified in the evaluator’s contract, the Contractor shall use Federal
Information Processing Standard (currently, FIPS 140-2) compliant encryption (Security
Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive
information during storage and transmission. The Contractor shall securely generate and manage
encryption keys to prevent unauthorized decryption of information, in accordance with the
Federal Processing Standard. The Contractor shall ensure that this standard is incorporated into
the Contractor’s property management/control system and establish a procedure to account for
all laptop computers, desktop computers, and other mobile devices and portable media that store
or process sensitive information. Any data stored electronically will be secured in accordance
with the most current National Institute of Standards and Technology (NIST) requirements and
other applicable Federal and Departmental regulations. In addition, the Contractor must submit a
plan for minimizing, to the extent possible, the inclusion of sensitive information on paper
records and for the protection of any paper records, field notes, or other documents that contain
sensitive data or PII, ensuring secure storage and limits on access.
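
To make the encryption requirement concrete, the sketch below shows one way a file containing sensitive information could be encrypted with AES-256-GCM, an algorithm approved for use in FIPS 140-2 validated modules, using the Python cryptography package. This is an illustrative sketch only, not the Contractor's actual implementation; the file names, function names, and key handling are hypothetical, and in practice keys would be generated and stored through a managed, validated cryptographic module.

```python
# Minimal sketch (not the Contractor's actual implementation): encrypting a
# file of sensitive information with AES-256-GCM, a FIPS-approved algorithm.
# Key generation, storage, and rotation would be handled by a managed key
# store in practice; file names below are hypothetical.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM


def encrypt_file(plaintext_path: str, ciphertext_path: str, key: bytes) -> None:
    """Encrypt one file; the 12-byte nonce is stored alongside the ciphertext."""
    aesgcm = AESGCM(key)            # key must be 16, 24, or 32 bytes (32 = AES-256)
    nonce = os.urandom(12)          # unique nonce for each encryption
    with open(plaintext_path, "rb") as f:
        data = f.read()
    ciphertext = aesgcm.encrypt(nonce, data, None)
    with open(ciphertext_path, "wb") as f:
        f.write(nonce + ciphertext)  # prepend nonce so the file can be decrypted later


def decrypt_file(ciphertext_path: str, key: bytes) -> bytes:
    """Return the decrypted contents of an encrypted file."""
    with open(ciphertext_path, "rb") as f:
        blob = f.read()
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)


# Example usage (hypothetical file names):
# key = AESGCM.generate_key(bit_length=256)
# encrypt_file("parent_survey_responses.csv", "parent_survey_responses.enc", key)
```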
Information will not be maintained in a paper or electronic system from which it is
actually or directly retrieved by an individual's personal identifier.


A11. Sensitive Questions

To achieve the study's primary goal of describing the characteristics of the children and families
served by EHS, we will ask parents and staff (teachers/home visitors) a few sensitive
questions. Topics of sensitive questions for parents include potential feelings of depression, use
of services for emotional or mental health problems, reports of family violence or substance
abuse, household income, and receipt of public assistance. Of these topics, we will ask staff
only about symptoms of depression. We used this information in Baby FACES 2009 reports to
describe the EHS population, their needs, parent outcomes, and how families fared over
time. The invitation to participate in the study will inform parents and staff that the survey will
ask sensitive questions (these materials are in Appendix E). The invitation will also inform
parents and staff that they do not have to answer questions that make them uncomfortable and
that none of the responses they provide will be reported back to program staff. We will ask
parents to sign a consent form, agreeing that they will participate in the study and permitting
their teacher or home visitor to complete a Staff Child Report. We will not conduct any activities
involving the parent until she or he signs the consent form. Because we will conduct the staff
survey in person, the first part of the survey will include a consent form indicating that the
teacher or home visitor agrees to participate in the study.
A12. Estimation of Information Collection Burden

Burden Hours
Table A.4 presents the current request to cover data collection activities related to sampling
classrooms, home visitors, and families as well as completing surveys with sampled EHS staff
and families. The estimates include time for respondents to review instructions, search data
sources, complete and review the responses, and transmit or disclose information. This
information collection request covers a period of two years.
Table A.4. Total burden requested under this information collection

Instrument | Total number of respondents | Annual number of respondents | Number of responses per respondent | Average burden hours per response | Annual burden hours | Average hourly wage | Total annual cost
Classroom/home visitor sampling form (from EHS staff) | 587 | 294 | 1 | 0.17 | 50 | $31.65 | $1,582.50
Child roster form (from EHS staff) | 587 | 294 | 1 | 0.33 | 97 | $31.65 | $3,070.05
Parent consent form | 2,887 | 1,444 | 1 | 0.17 | 245 | $17.50 | $4,287.50
Parent survey | 2,310 | 1,155 | 1 | 0.5 | 578 | $17.50 | $10,115.00
Parent Child Report | 2,310 | 1,155 | 1 | 0.25 | 289 | $17.50 | $5,057.50
Staff survey (Teacher survey and Home Visitor survey) | 1,397 | 699 | 1 | 0.5 | 350 | $31.65 | $11,045.85
Staff Child Report | 1,097 | 549 | 2.5 | 0.25 | 343 | $31.65 | $10,855.95
Program director survey | 140 | 70 | 1 | 0.5 | 35 | $31.65 | $1,107.75
Center director survey | 493 | 247 | 1 | 0.33 | 82 | $31.65 | $2,595.30
Estimated Annual Burden Total | -- | -- | -- | -- | 2,069 | -- | $49,717.40


Total Annual Cost
We expect the total annual burden to be 2,069 hours, at a total annual cost of $49,717.40, across
all of the instruments in the current information collection request.
Average hourly wage estimates for deriving total annual costs are based on Current
Population Survey data for the third quarter of 2016 (Bureau of Labor Statistics 2016). For each
instrument included in Table A.4, we calculated the total annual cost by multiplying the annual
burden hours and the average hourly wage.
For program directors, center directors, and staff (teachers and home visitors), we used the
median usual weekly earnings for full-time wage and salary workers age 25 and older with a
bachelor’s degree or higher ($31.65 per hour). For parents, we used the median usual weekly
earnings for full-time wage and salary workers age 25 and older with a high school diploma or
equivalent and no college experience ($17.50). We divided weekly earnings by 40 hours to
calculate hourly wages.
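
As a simple check on this arithmetic, the sketch below reproduces the cost calculation for two rows of Table A.4. The weekly earnings amounts shown are those implied by the hourly rates above rather than figures quoted in this document, and small discrepancies from exact products reflect rounding of annual burden hours.

```python
# Illustrative check of the Table A.4 arithmetic:
# hourly wage = median usual weekly earnings / 40, and
# total annual cost = annual burden hours x average hourly wage.


def hourly_wage(median_weekly_earnings: float) -> float:
    """Hourly wage derived by dividing median usual weekly earnings by 40."""
    return median_weekly_earnings / 40


def annual_cost(annual_burden_hours: float, wage: float) -> float:
    """Total annual cost for one instrument."""
    return round(annual_burden_hours * wage, 2)


# Weekly earnings of $1,266 and $700 are the amounts implied by the hourly
# rates in the text ($31.65 and $17.50); they are not quoted directly there.
staff_wage = hourly_wage(1266.00)    # 31.65
parent_wage = hourly_wage(700.00)    # 17.50

# Parent survey: 1,155 annual respondents x 1 response x 0.5 hours ~ 578 hours.
print(annual_cost(578, parent_wage))  # 10115.0 -> $10,115.00 in Table A.4
# Staff Child Report: 549 respondents x 2.5 responses x 0.25 hours ~ 343 hours.
print(annual_cost(343, staff_wage))   # 10855.95 -> $10,855.95 in Table A.4
```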
A13. Cost Burden to Respondents or Record Keepers

There are no additional costs to respondents.
A14. Estimate of Cost to the Federal Government

The total cost for the data collection activities under this current request will be $8,317,434.
This amount includes costs for new data collection activities under this request. Annual costs to
the Federal government will be $4,158,717 for the proposed data collection under this OMB
clearance number (0970-0354). There are no remaining costs from the Baby FACES 2009 data
collections, which are complete but were also under OMB clearance number 0970-0354.
A15. Change in Burden

This is an additional information collection request under OMB #0970-0354.
A16. Plan and Time Schedule for Information Collection, Tabulation and Publication

Analysis Plan
The instruments included in this OMB package will yield data that we will analyze using
quantitative methods. These approaches will enable us to make nationally representative
estimates about EHS programs, centers, classrooms, teachers and home visitors, and families,
children, and pregnant women. We will carefully link the research questions guiding the study
with the data collected, constructs measured, and analyses undertaken. Baby FACES 2018
includes three categories of research questions:
1. Descriptive. We will address descriptive questions about relationship quality in EHS,
classroom features and practices, home visit processes, program processes and functioning
that support responsive relationships, and the outcomes of infants and toddlers and families
served by EHS.


2. Associations with relationship quality. We will examine associations of relationship
quality in EHS with classroom features and practices, home visit processes, and program
processes and functioning, along with associations of teacher-child and parent-child
relationships with infant/toddler outcomes.

3. Mediators. We will study the mechanisms for these associations by examining elements
that may mediate them.

We can answer many research questions by calculating the means and percentages of
classrooms, teachers and home visitors, programs, or children and families grouped into various
categories and comparing these averages across subgroups. We can perform hierarchical linear
modeling for more complex analyses of associations between relationship quality and program,
classroom, and home visit processes as well as program, teacher, and home visitor
characteristics. We will conduct similar analyses to examine the associations of relationship
quality and classroom and home visit processes with children’s outcomes. We will use
structural equation modeling to conduct mediation analyses that examine the mechanisms
underlying these associations.
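
As an illustration of the hierarchical linear modeling described above, the sketch below fits a two-level model with children nested within classrooms, relating a child outcome to an observed relationship-quality score. The variable names and input file are hypothetical placeholders rather than the study's actual analysis variables, and the real models would also incorporate survey weights and additional covariates.

```python
# Illustrative two-level model (children nested in classrooms) of the type
# described above, using statsmodels. Variable names and the input file are
# hypothetical; the study's actual models would add covariates and weights.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per sampled child.
df = pd.read_csv("analysis_file.csv")

# A random intercept for classroom captures the clustering of children within
# classrooms; relationship_quality is a classroom-level observed score.
model = smf.mixedlm(
    "child_outcome ~ relationship_quality + child_age_months + program_approach",
    data=df,
    groups=df["classroom_id"],
)
result = model.fit()
print(result.summary())
```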
Weighting. Using analysis weights will enable us to compute unbiased estimates based on
sample survey responses from the study population. Weights take into account both the
probability of selection into the sample and differential response patterns that might exist in the
respondent sample. We plan to construct weights at the program, center, home visitor, classroom,
and child levels. Supporting Statement Part B provides details about our plans for creating
weights.
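
The sketch below illustrates the general logic of these weights for a single level of the sample: a base weight equal to the inverse of the selection probability, adjusted within weighting cells to account for nonresponse. The column names are hypothetical placeholders; the actual weighting procedures are documented in Supporting Statement Part B.

```python
# Illustrative child-level weight construction: base weight = 1 / probability
# of selection, then inflated within cells to account for nonresponse.
# Column names are hypothetical placeholders.
import pandas as pd

frame = pd.read_csv("sampled_children.csv")   # one row per sampled child

# Base weight reflects the probability of selection into the sample.
frame["base_weight"] = 1.0 / frame["selection_probability"]

# Nonresponse adjustment: within each weighting cell (for example, program
# type by child age group), inflate respondents' weights so they also carry
# the weight of nonrespondents in that cell.
cell_totals = frame.groupby("weighting_cell")["base_weight"].transform("sum")
resp_totals = (
    frame.assign(resp_wt=frame["base_weight"] * frame["responded"])
    .groupby("weighting_cell")["resp_wt"]
    .transform("sum")
)
frame["analysis_weight"] = frame["base_weight"] * (cell_totals / resp_totals)
frame.loc[frame["responded"] == 0, "analysis_weight"] = 0.0
```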
Time Schedule and Publication
Table A.5 contains the timeline for the data collection and reporting activities. Recruiting
will begin in fall 2017, after obtaining OMB approval. Data collection will follow and is
expected to occur from February through June 2018. Mathematica will produce several
publications based on analysis of data from Baby FACES 2018:


• We will prepare a set of tables describing findings from all surveys. The intention is to
quickly produce findings that Federal agencies can use.

• We will prepare a final report that includes the information from the descriptive tables,
along with more narrative explanation of the findings. The format of the report will be
accessible to a broad audience and will use graphics and figures to communicate key
findings.

• We will produce briefs on specific topics of interest to the government. These briefs will be
focused and accessible to a broad audience.


Table A.5. Baby FACES 2018 schedule for data collection

Activity | Timing
Recruitment^a
  Program recruitment | Fall 2017
Data collection
  On-site visits to programs to obtain consent | Winter/Spring 2018
  Parent survey (by telephone) | Winter/Spring 2018
  Program and center director surveys | Spring 2018
  On-site classroom observations and staff surveys | Spring 2018
Analysis
  Data processing and analysis for data tables | Spring/Summer 2018
  Data processing and analysis for final report | Winter 2018/Spring 2019
Reporting
  Data tables | Fall 2018
  Final report on the 2018 data collection | Spring 2019
  Briefs on specific topics | Spring/Summer 2019

^a After obtaining OMB approval.

A17. Reasons Not to Display OMB Expiration Date

All instruments will display the expiration date for OMB approval.
A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.


REFERENCES

Atkins-Burnett, Sally, Shannon Monahan, Louisa Tarullo, Yange Xue, Elizabeth Cavadel,
Lizabeth Malone, and Lauren Akers. “Measuring the Quality of Caregiver-Child
Interactions for Infants and Toddlers (Q-CCIIT).” OPRE Report 2015-13. Washington, DC:
Office of Planning, Research and Evaluation, Administration for Children and Families, U.
S. Department of Health and Human Services, January 2015.
Bureau of Labor Statistics. “Usual Weekly Earnings of Wage and Salary Workers: Third Quarter
2016.” USDL-16-2025. Washington, DC: Bureau of Labor Statistics, October 2016.
Goldenberg K. L., D. McGrath, and L. Tan. “The Effects of Incentives on the Consumer
Expenditure Interview Survey.” In Joint Statistical Meetings Proceedings, pp. 5985–5999.
Alexandria, VA: American Statistical Association, 2009.
Horm, D., D. Norris, D. Perry, R. Chazan-Cohen, and T. Halle. “Developmental Foundations of
School Readiness for Infants and Toddlers: A Research to Practice Report.” OPRE report #
2016-07, Washington, DC: Office of Planning, Research and Evaluation, Administration for
Children and Families, U.S. Department of Health and Human Services, 2016.
Jamison, K. R., Cabell, S. Q., LoCasale-Crouch, J., Hamre, B. K., & Pianta, R. C. (2014).
CLASS–Infant: An observational measure for assessing teacher–infant interactions in
center-based child care. Early Education and Development, 25(4), 553-572.
La Paro, K. M., Hamre, B. K., & Pianta, R. C. (2012). Classroom assessment scoring system
(CLASS) manual, toddler. Paul H. Brookes Publishing Company.
Mack, S., Huggins, V., Keathley, D., & Sundukchi, M. (1998). Do monetary incentives improve
response rates in the Survey of Income and Program Participation? In Proceedings of the
American Statistical Association, Survey Research Methods Section (pp. 529–534).
Martin, E., & Winters, F. (2001). Money and motive: effects of incentives on panel attrition in
the survey of income and program participation. Journal of Official Statistics, 17(2), 267.
Office of Management and Budget, Office of Information and Regulatory Affairs. “Questions
and Answers When Designing Surveys for Information Collections.” Washington, DC:
Office of Management and Budget, 2006.
Singer E., N. Gebler, T. Raghunathan, J. V. Hoewyk, and K. McGonagle. “The Effect of
Incentives In Interviewer-Mediated Surveys.” Journal of Official Statistics, vol. 15, no. 2,
1999, pp. 217–230.
Singer, E., and R. A. Kulka. “Paying Respondents for Survey Participation.” In Studies of
Welfare Populations: Data Collection and Research Issues, edited by Michele Ver Ploeg,
Robert A. Moffitt, and Constance F. Citro, pp. 105–128. Washington, DC: National
Academy Press, 2002.


Singer, E., Van Hoewyk, J., & Maher, M. P. (2000). Experiments with incentives in telephone
surveys. Public Opinion Quarterly, 64(2), 171-188.
Sosinsky, L., K. Ruprecht, D. Horm, K. Kriener-Althen, C. Vogel, and T. Halle. “Including
Relationship-Based Care Practices in Infant-Toddler Care: Implications for Practice and
Policy.” Brief prepared for the Office of Planning, Research and Evaluation, Administration
for Children and Families, U.S. Department of Health and Human Services. Washington
DC, 2016.
Vogel, Cheri A., Kimberly Boller, Yange Xue, Randall Blair, Nikki Aikens, Andrew Burwick,
Yevgeny Shrago, Barbara Lepidus Carlson, Laura Kalb, Linda Mendenko, Judy Cannon,
Sean Harrington, and Jillian Stein. “Learning As We Go: A First Snapshot of Early Head
Start Programs, Staff, Families, and Children.” OPRE Report No. 2011-7. Washington, DC:
Office of Planning, Research and Evaluation, Administration for Children and Families,
U.S. Department of Health and Human Services, February 2011.
Vogel, Cheri A., Pia Caronongan, Jaime Thomas, Eileen Bandel, Yange Xue, Juliette Henke,
Nikki Aikens, Kimberly Boller, and Lauren Bernstein. “Toddlers in Early Head Start: A
Portrait of 2-Year-Olds, Their Families, and the Programs Serving Them.” OPRE Report
No. 2015-10. Washington, DC: Administration for Children and Families, U.S. Department
of Health and Human Services, 2015a.
Vogel, Cheri A., Pia Caronongan, Yange Xue, Jaime Thomas, Eileen Bandel, Nikki Aikens,
Kimberly Boller, and Lauren Murphy. “Toddlers in Early Head Start: A Portrait of 3-Year-Olds,
Their Families, and the Programs Serving Them.” OPRE Report No. 2015-28.
Washington, DC: Office of Planning, Research and Evaluation, and Princeton, NJ:
Mathematica Policy Research, April 2015b.
Xue, Yange, Kimberly Boller, Cheri A. Vogel, Jaime Thomas, Pia Caronongan, and Nikki
Aikens. “Early Head Start Family and Child Experiences Survey (Baby FACES) Design
Options Report.” OPRE Report No. 2015-99. Washington, DC: Office of Planning,
Research and Evaluation, Administration for Children and Families, U.S. Department of
Health and Human Services, September 2015.
Zagorsky, J. L., and P. Rhoton. “The Effects of Promised Monetary Incentives on Attrition In a
Long-Term Panel Survey.” Public Opinion Quarterly, vol. 72, no. 3, 2008, pp. 502–513.


