
Head Start Family and Child Experiences Survey (FACES 2014-2018)

OMB: 0970-0151


Head Start Family and Child Experiences Survey (FACES 2014–2018) OMB Supporting Statement for Data Collection

Part A: Justification

May 7, 2014

Update November 25, 2014



CONTENTS

A. JUSTIFICATION

A.1. Circumstances Making the Information Collection Necessary

A.2. Purpose and Use of the Information Collection

A.3. Use of Improved Information Technology

A.4. Efforts to Identify Duplication and Use of Similar Information

A.5. Impact on Small Businesses or Other Small Entities

A.6. Consequences of Not Collecting Information or Collecting Information Less Frequently

A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A.9. Explanation of Any Gift to Respondents

A.10. Assurance of Privacy Provided to Respondents

A.11. Justification for Sensitive Questions

A.12. Estimates of Annualized Burden Hours and Costs

A.13. Estimates of Other Total Cost Burden to Respondents and Record Keepers

A.14. Cost to the Federal Government

A.15. Explanation for Program Changes or Adjustments

A.16. Plans for Tabulation and Publication and Project Time Schedule

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

REFERENCES



APPENDICES

APPENDIX A: Authorizing Statutes

APPENDIX B: Conceptual Models

APPENDIX C: Study Introduction Materials

APPENDIX D: Classroom Observation

APPENDIX E: OMB History

APPENDIX F: OMB Public Comments

APPENDIX G: Confidentiality Pledge

APPENDIX H: Advance Materials

APPENDIX I: Screenshots

APPENDIX J: Spring 2015 Advance Materials


TABLES

A.1 FACES 2014–2018 Core Instruments, Sample Size, Type of Administration, and Periodicity

A.2 FACES 2014–2018 Family Engagement Expert Panel Members

A.3 FACES 2014–2018 Proposed Token of Appreciation Structure

A.4 Approved Estimated Information Gathering Annual Response Burden and Approved Information Gathering Annual Cost

A.5 Interim Response Rates for Approved Information Requests

A.6 Estimated Current Annual Response Burden and Current Annual Cost

A.7 Estimated Future Annual Response Burden and Future Annual Cost

FIGURES

A.1 FACES 2014–2018 Study Structure

ATTACHMENTS

ATTACHMENT 1: Classroom Sampling Form from Head Start Staff

ATTACHMENT 2: Child Roster Form from Head Start Staff

ATTACHMENT 3: Head Start Core Child Assessment

ATTACHMENT 4: Head Start Core Parent Survey

ATTACHMENT 5: Head Start Fall Supplemental Parent Survey

ATTACHMENT 6: Head Start Core Teacher Child Report

ATTACHMENT 7: Head Start Spring Supplement Parent Survey

ATTACHMENT 8: Head Start Core Teacher Survey

ATTACHMENT 9: Head Start Core Program Director Survey

ATTACHMENT 10: Head Start Core Center Director Survey

ATTACHMENT 11: Head Start Parent Qualitative Interview (Family Engagement)

ATTACHMENT 12: Head Start Staff Qualitative Interview (FSS Engagement)

ATTACHMENT 13: Head Start Staff (FSS) Roster Form

ATTACHMENT 14: Early Care and Education Providers Survey for Plus Study (5E-Early Ed Pilot)

ATTACHMENT 15: Early Care and Education Providers Survey for Plus Study (FPTRQ)




A. JUSTIFICATION

The Office of Planning, Research and Evaluation (OPRE), Administration for Children and Families (ACF), U.S. Department of Health and Human Services (HHS), is collecting data for the Head Start Family and Child Experiences Survey (FACES). FACES 2014–2018 features a new “Core Plus” study design. Through this design, FACES will provide data on a set of key indicators more rapidly and with greater frequency than in past rounds of FACES (Core studies), and will allow for studying more complex issues and topics in greater detail and with increased efficiency (Plus studies). The overall design of the FACES 2014–2018 Core and the procedures that are used to select and recruit the sample and conduct data collection are, for the most part, similar to those used in FACES 2009 (OMB number 0970-0151).

The proposed FACES design includes multiple components, as noted above, and therefore will involve multiple information collection requests. The current information collection request covers data collection activities for FACES 2014–2018 spring 2015, including selecting classrooms in additional programs; conducting classroom observations; surveying teachers, center directors, and program directors; and interviewing parents and staff for FACES Plus studies. Previous requests approved the FACES 2014–2018 sampling plans for Head Start programs, centers, classrooms, and children; the procedures for recruiting programs and selecting centers in 2014 and contacting them again in 2016; and the fall 2014 data collection activities, including selecting classrooms and children for the study, conducting child assessments and parent interviews, and obtaining Head Start teacher reports on children's development.

A.1. Circumstances Making the Information Collection Necessary

a. Background

ACF has contracted with Mathematica Policy Research (Mathematica) and its subcontractors, Juárez and Associates and Educational Testing Service, under contract number HHSP23320095642WC/HHSP2337052T, to collect information on Head Start Performance Measures. FACES 2014–2018 extends a previously approved data collection program (OMB number 0970-0151) to a new sample of Head Start programs, families, and children. FACES 2014–2018, similar to previous FACES rounds, will collect information from a national probability sample of Head Start programs to ascertain what progress Head Start has made toward meeting program performance goals. There are two legislative bases for the FACES data collection: the Government Performance and Results Act of 1993 (P.L. 103-62), requiring that the Office of Head Start (OHS) move expeditiously toward development and testing of Head Start Performance Measures, and the Improving Head Start for School Readiness Act of 2007 (P.L. 110-134), outlining requirements on monitoring, research, and standards for Head Start (Appendix A). FACES provides the mechanism for collecting data on nationally representative samples of programs, children, and families served by Head Start in order to provide OHS, other federal government agencies, local programs, and the public with valid and reliable national information.

b. Overview of the Study

In 2014, FACES enters its 17th year of serving as a source of timely, periodic, contextualized data about the national Head Start program and its participants. OPRE and OHS engaged in a comprehensive redesign process to renovate FACES for improved effectiveness and efficiency. Enhanced flexibility and responsiveness are central features of the new design, so that FACES can serve as a fluid data collection system that meets the evolving policy and programmatic needs of Head Start. Built on a foundation constructed to report on key characteristics and indicators of programs, classrooms, and child outcomes (Core studies), FACES 2014–2018 also provides the opportunity for several types of integrated Plus studies. These could include topical studies and special studies of greater complexity. More explicitly than past rounds of FACES, the Core Plus study design meets the need for a systems change perspective―one designed to measure an interconnected system in which decisions at one level act as drivers or inhibitors at the next level. It also embodies a continuous program-improvement ethic—the elements measured are those that Head Start has the capacity to change and refine over time. Thus, FACES 2014–2018 represents a major step toward supporting the development of improved services at all levels of the Head Start program.

Approximately 230 Head Start programs and 460 Head Start centers will be selected to participate in FACES 2014–2018. The Core will include a nationally representative sample of 180 programs; an additional 50 programs may be selected for Plus studies. As presented in Figure A.1, the Core Plus design features two Core studies—the Classroom + Child Outcomes Core and the Classroom Core—and Plus studies to include additional survey content of policy or programmatic interest to be determined.

Figure A.1. FACES 2014–2018 Study Structure


[Figure A.1 depicts the study timeline from fall 2013 through spring 2018. The Core row shows design periods in 2013–2014 and 2015–2016, the Classroom + Child Outcomes study in fall 2014 and spring 2015, and the Classroom study in spring 2017, with a reporting period following each Core study. The Plus row shows a topical module and/or special study (topics TBD) accompanying each Core study, also followed by reporting. Key indicators are reported within three months of each Core data collection.]
The Classroom + Child Outcomes Core will occur in fall 2014 and spring 2015. At both time points, FACES will assess the school readiness skills of 2,400 Head Start children from 60 of the 180 programs, survey their parents, and ask the children's teachers to rate children's social and emotional skills (see Table A.1). In spring 2015, the number of programs in the FACES sample will increase from the 60 used to collect data on children's school readiness outcomes to all 180 programs, for the purpose of conducting observations in 720 Head Start classrooms. Surveys of program directors, center directors, and teachers will also be conducted in the spring. The Classroom + Child Outcomes Core therefore collects child-level data along with program and classroom data from 60 programs, while program and classroom data only are gathered from the additional 120 programs. In spring 2017, the Classroom Core will be conducted in all 180 programs, focusing on program and classroom data collection only.

Table A.1. FACES 2014–2018 Core Instruments, Sample Size, Type of Administration, and Periodicity

Instrument | Sample Size | Type of Administration | Fall 2014 | Spring 2015 | Spring 2017

Classroom + Child Outcomes Core
Classroom sampling form from Head Start staff | 180 | CADE on the web | X | X |
Child roster form from Head Start staff | 60 | CADE on the web | X | |
Direct child assessment^a | 2,400 | CAPI with tablet computer | X | X |
Head Start teacher child rating^a | 2,400 | Web with paper option | X | X |
Parent survey^a | 2,400 | Web/CATI | X | X |
Head Start classroom observation | 720 | CADE with tablet computer | | X |
Head Start teacher survey | 720 | Web with paper option | | X |
Program director survey | 180 | Web with paper option | | X |
Center director survey | 360 | Web with paper option | | X |

Classroom Core
Classroom sampling form from Head Start staff | 180 | CADE on the web | | | X
Head Start classroom observation | 720 | CADE with tablet computer | | | X
Head Start teacher survey | 720 | Web with paper option | | | X
Program director survey | 180 | Web with paper option | | | X
Center director survey | 360 | Web with paper option | | | X

a Information gathered from 60 programs; all other components are collected from all 180 programs.

CAPI = Computer-assisted personal interviewing; CATI = Computer-assisted telephone interviewing; CADE = Computer-assisted data entry

The goal of both Core studies is to describe (1) the quality and characteristics of Head Start classrooms, programs, and staff for specific program years; (2) the changes or trends in the quality and characteristics of classrooms, programs, and staff over time; and (3) the factors or characteristics that predict differences in classroom quality. The Classroom + Child Outcomes Core study also adds a focus on describing (4) the school readiness skills and family characteristics of Head Start children for specific program years; (5) the changes or trends in children’s outcomes and family characteristics over time; and (6) the factors or characteristics at multiple levels that predict differences in children’s outcomes. Across the two Core studies, several types of questions will be addressed (see Appendix B for the FACES conceptual frameworks), to include the following:

  • What are the characteristics and observed quality of Head Start classrooms? Are these improving over time?

  • What are the characteristics and qualifications of Head Start teachers and management staff? Are these changing over time?

  • What are the characteristics of Head Start programs? Are these changing over time?

  • Does classroom quality vary by characteristics of programs, teachers, or classrooms?

  • What are the demographic characteristics and home environments of children and families served by Head Start? Are these changing over time?

  • What are the average school readiness skills of the population of Head Start children in fall and spring of the Head Start year? How do Head Start children compare with children of similar ages in the general population1?

  • What is the association between observed classroom quality and children’s school readiness skills? Between child and family characteristics and children’s school readiness skills?

In spring 2015, FACES will include a Plus topical module focused on family engagement. This Plus feature will be conducted within the 60 programs participating in child-level data collection in the Classroom + Child Outcomes Core study. Within each of these 60 programs, we will randomly select three family services staff (FSS) from among those working in the two sampled centers. We will also select a subsample of six parents per program (within the two sampled centers), for a total of 360 parents and 180 FSS. The topical module will include one-hour interviews with this random subsample of parents and FSS. There will be an additional 5 minutes of parent survey content for all 2,400 parents participating in the child-level data collection (i.e., the Head Start spring parent supplement survey, Attachment 7). There will also be an additional 5 minutes of teacher survey content for all 240 teachers participating in the child-level data collection (i.e., the Family and Provider Teacher Relationship Questionnaire [FPTRQ]; Attachment 15).

Although the experiences and participation of families have always played a central role in Head Start, recent years have seen a growing emphasis on developing and using strategies to make parent and family engagement activities systematic and integrated within Head Start programs. Several activities contribute to this goal, including development of the Office of Head Start's Parent, Family, and Community Engagement Framework; the provision of resources by the National Center for Parent, Family, and Community Engagement; and the piloting of instruments focused on parent engagement and parent-staff relationships in Head Start. The Family Engagement Plus Study will provide information about the engagement and service provision experiences of Head Start families. It will also provide information about the direct providers of services to parents and families, whose voices have not been captured in national studies. With the exception of a case study component in the 1997 cohort, FACES has not collected in-depth qualitative data on the experiences of families participating in Head Start programs or the staff who provide family support services to them.

The family engagement study will explore several questions:

  • What does family engagement look like in Head Start?

  • How do FSS work with families, and what program supports do they receive?

  • How are comprehensive services provided in Head Start?

  • Do family engagement and/or service provision differ by family characteristics?

  • What changes do families identify as a result of Head Start?

  • What are the background characteristics of FSS?

Additionally, in spring 2015, FACES will include a Plus study to pilot a new measure of program functioning. This Plus feature will be conducted within the 120 programs participating in classroom-only-level data collection. The 480 classroom teachers participating will be asked to complete the pilot version of the Five Essentials Measurement System for Early Education (5E-Early Ed) educator survey.

A.2. Purpose and Use of the Information Collection

Major study activities to address the FACES 2014–2018 research questions will include:

  • Selecting a nationally representative sample of Head Start programs, recruiting them to participate in the study, gathering information from those programs to develop a center sampling frame, and selecting a nationally representative sample of Head Start centers (approval granted in previous package, OMB Approval Number 0970-0151, approved on April 7, 2014)

  • Sampling classrooms within those centers (approval granted in previous package, OMB Approval Number 0970-0151, approved on July 7, 2014)

  • Sampling children and recruiting families of Head Start enrollees to participate in the study (approval granted in previous package, OMB Approval Number 0970-0151, approved on July 7, 2014)

  • Collecting data from children and families, Head Start staff, and Head Start classrooms (approval granted in previous package for child and family data, OMB Approval Number 0970-0151, approved on July 7, 2014)

  • Collecting data as part of potential Plus studies to include topical studies and special studies of greater complexity

  • Analyzing and reporting findings (approval granted in previous package for fall 2014, OMB Approval Number 0970-0151, approved on July 7, 2014)

The overall design of FACES 2014–2018—the sampling plan, instruments, procedures, and data analysis plan—draws from the design of FACES 2009 and earlier rounds, but we propose some changes in approach and instruments. Like previous rounds, FACES 2014–2018 uses a multi-stage sample design with four stages: (1) Head Start programs, (2) centers within programs, (3) classrooms within centers, and (4) children within classrooms. We describe sampling procedures more fully in Section B.1 and data collection procedures more fully in Section B.2.
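The nesting of the four sampling stages can be sketched in a few lines of code. The following is a hypothetical illustration only: the frame sizes, stage sample sizes, and use of simple random draws are assumptions for demonstration, not the study's actual selection procedure (which uses probability-based methods described in Part B).

```python
import random

random.seed(2014)

def draw(frame, k):
    """Simple random sample of k units (or all units, if fewer)."""
    return random.sample(frame, min(k, len(frame)))

# Toy frame: 10 programs, each with 4 centers of 3 classrooms of 15 children.
programs = [
    {"id": p,
     "centers": [{"id": (p, c),
                  "classrooms": [{"id": (p, c, r),
                                  "children": list(range(15))}
                                 for r in range(3)]}
                 for c in range(4)]}
    for p in range(10)
]

# Four nested stages: programs -> centers -> classrooms -> children.
sampled_children = []
for program in draw(programs, 4):                        # stage 1: programs
    for center in draw(program["centers"], 2):           # stage 2: centers
        for classroom in draw(center["classrooms"], 2):  # stage 3: classrooms
            sampled_children += [(classroom["id"], kid)
                                 for kid in draw(classroom["children"], 10)]

# 4 programs x 2 centers x 2 classrooms x 10 children = 160 sampled children
print(len(sampled_children))  # 160
```

Each stage subsamples only within units retained at the previous stage, which is what makes the design multi-stage rather than a single draw from a national child frame.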

We will use the data collected as part of the FACES 2014–2018 Core to provide descriptions of the characteristics, experiences, and outcomes for children and families served by Head Start and to observe the relationships among family and program characteristics and outcomes. Findings from FACES 2014–2018 will provide information on Head Start Performance Measures and help guide OHS, national and regional training and technical assistance providers, and local programs in supporting policy development and program improvement.

  1. Ongoing Approved Information Collection Requests

In fall 2014, the field enrollment specialists (FESs) visited each sampled center to gather information to select the sample of classrooms (previously approved) and, for the 60 programs involved in the child-level data collection, the child sample (previously approved).2 For these 60 programs, visits will occur three weeks before the scheduled date of the fall 2014 data collection. FESs will work with center staff and the on-site coordinator (OSC), a liaison between the program and the study team, to distribute consent materials to parents of selected children. Consent materials include a consent letter and form (Appendix C.1 and C.2), a set of frequently asked questions (study FAQ) (Appendix C.3), and a study brochure (Appendix C.4). FESs will also distribute study FAQs to teachers of selected classrooms. Finally, FESs will provide centers with study flyers (Appendix C.5) for staff to display during the weeks prior to the data collection visit to remind staff and parents about the upcoming data collection visit. For the remaining 120 programs, FES visits (focusing on only classroom sampling) will occur at the start of the classroom observation week in spring 2015. This same procedure will occur with all programs in spring 2017.

Direct child assessments (previously approved) in fall 2014 and spring 2015, as well as teacher ratings (previously approved), will document children’s cognition and general knowledge, language use and emerging literacy, social and emotional development, approaches to learning, and physical development. Parent surveys will obtain data on parent’s and children’s activities, experiences with health care, and parents’ feelings and attitudes about themselves (previously approved).

  2. Current Information Collection Requests

In spring 2015 we will continue with the previously approved activities described above: child assessments, teacher ratings, and parent surveys to examine the school readiness skills of Head Start children as well as their family backgrounds. The current information collection request covers instruments associated with the spring 2015 data collection only, which will include sampling additional Head Start classrooms (Attachment 1) and examining program functioning and classroom quality. The latter includes program director (Attachment 9), center director (Attachment 10), and teacher (Attachment 8) surveys that will provide data on staff employment and educational backgrounds, program goals and philosophy, and curriculum and classroom activities.3 A subsample of teachers will be asked to complete a second survey about their center's climate, professional development, teaching, family engagement, and program leadership (Attachment 14, under the "Early Care and Education Providers Survey for Plus Study" master burden request), for the purpose of examining a new measure's psychometric properties for potential future FACES use in gathering descriptive information on program functioning.

Additionally, a Plus topical module is planned for studying family engagement; it will involve all parents completing additional survey items as part of the Core survey (i.e., the Head Start spring supplement parent survey, Attachment 7) and interviews with a subsample of Head Start staff (FSS; Attachment 12) and parents (Attachment 11). The spring supplement survey will ask questions about parent-staff relationships and communication and about community support. Interviews with a subsample of Head Start parents and staff will include modules on parent involvement in Head Start and on program outreach and engagement practices, along with qualitative items on various family engagement topics. Staff will be selected from a roster of all FSS in the program (focusing on those in the two selected centers) gathered from the On-Site Coordinator (Attachment 13). (See Part B for a description of the sampling approach for staff and parents for the Plus study.) Consent will be gathered for individuals participating in the interviews (Appendix J.5 and J.7). All 240 teachers selected for FACES in those programs will be asked to complete a set of items about parent-staff relationships (Attachment 15, under the "Early Care and Education Providers Survey for Plus Study" master burden request).

The primary goal of the family engagement module is to highlight patterns regarding practices and experiences overall and for key subgroups—for exploratory and hypothesis-generating purposes. Any suggestive findings will be used to help generate hypotheses about family engagement efforts and service provision and to inform future research efforts. For example, while program performance standards and policies mandate the types of family engagement efforts that are initiated (the “what” of engagement), we are currently limited in our knowledge of the “how”—the ways in which staff perform the day-to-day work of engaging with families, their successes and challenges, and the ways in which they individualize their practices. By providing suggestive information about the mechanisms of program efforts and identifying potential areas of strength and need, findings from the data can point to areas for study in future FACES data collections.

  3. Future Information Collection Requests

Future information collection requests will cover remaining components of the FACES study. Head Start staff or parents may be selected for other Plus topical modules or special studies that would involve qualitative interviews or supplemental surveys for additional content. For Plus studies, the study team may collect data (for future collection requests) through direct child assessments, web-based surveys, or telephone interviews, depending on the nature of the study. Quantitative or qualitative data collection methods may be used.

The instruments to support the Core study at the program and classroom levels and the Plus studies anticipated for future submission were described in the first Federal Register notice for the FACES 2014–2018 data collection, published in the Federal Register, Volume 79, pp. 11445-11446 on February 28, 2014 (Reference number FR 2014-04032). We will submit these future requests directly to OMB and allow for a 30-day public comment period under the Paperwork Reduction Act prior to use when these materials are fully developed.

A.3. Use of Improved Information Technology

The proposed data collection builds on the techniques that reduced burden in FACES 2009 while adding enhancements to further reduce burden. As in FACES 2009, the study team will administer child assessments using computer-assisted personal interviewing (CAPI) to facilitate the routing and calculation of basal and ceiling rules, thereby lessening the amount of time required to administer the assessments and reducing burden on the child. To further enhance the assessment experience for the child and reduce assessment time, we will also present the child with assessment images on a second tablet screen (separate from the computer screen viewed by the assessor) rather than on an easel.

Parent surveys will be web-based or administered using computer-assisted telephone interviewing (CATI). With the introduction of web-based surveys to a low-income population, we plan to conduct an experiment to understand how response rates and costs are affected by this new option. In particular, we are interested in whether a web survey is cost-effective compared to a telephone-administered survey with a low-income population, and whether parents' choice of a web survey is a function of how the option is introduced to them. A program's parents will be randomly assigned in the fall to one of two groups for completing the parent survey: (1) a web-first group or (2) a choice group. The web-first group will receive a web-based survey initially, with CATI follow-up after three weeks. The choice group will have the option of either web-based or CATI administration from the start of data collection. Please see Part B, Section B.2, Data Collection Procedures, for more details.

We will give Head Start teachers the option of completing their Head Start Teacher Child Report (TCR) forms on the web or on paper. Head Start teachers, program directors, and center directors will have the option of completing their spring survey on the web or on paper.
Plus study interviews with parents and staff will be administered by interviewers using semi-structured paper-pencil guides or CATI.
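The mode experiment described above amounts to a within-program random assignment of parents to the two groups. A minimal sketch follows; the function name, the 50/50 split, and the per-program parent count are illustrative assumptions, not study specifications.

```python
import random

random.seed(15)

def assign_modes(parent_ids):
    """Randomly split one program's sampled parents into the two
    experimental arms: 'web_first' (web survey with CATI follow-up
    after three weeks) and 'choice' (web or CATI from the start)."""
    ids = list(parent_ids)
    random.shuffle(ids)          # randomize order before splitting
    half = len(ids) // 2
    return {"web_first": ids[:half], "choice": ids[half:]}

# One hypothetical program with 40 sampled parents.
arms = assign_modes(range(40))
print(len(arms["web_first"]), len(arms["choice"]))  # 20 20
```

Randomizing within each program, rather than across the whole sample, keeps program-level characteristics balanced across the two arms, which is what allows response rates and costs to be compared between them.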

A.4. Efforts to Identify Duplication and Use of Similar Information

There is no evidence of other studies that offer comprehensive information on program quality, child outcomes, services, and characteristics of Head Start staff, children, and families. Previous cohorts of FACES would not have captured new program initiatives or changes to the population served by Head Start in the past few years.

Although we identified and adapted many useful survey items from other studies for use in FACES, none of those studies has collected comparable data on a nationally representative sample of Head Start children and families. No available studies combine the four sources of primary data (staff surveys, classroom observations, and, if part of child-level data collection, child assessments and parent surveys) that will be collected in FACES 2014–2018. There is also no other source of detailed child-level information that may be used to describe changes in the population served by Head Start over time. Moreover, unlike studies such as Head Start CARES, which evaluates interventions through a randomized trial, FACES captures information on children attending the full population of Head Start centers.

A.5. Impact on Small Businesses or Other Small Entities

No small businesses are impacted by the data collection in this project.

A.6. Consequences of Not Collecting Information or Collecting Information Less Frequently

From the start of FACES in 1997 through the most recent round in 2009, FACES has been fielded at three-year intervals as a descriptive study of the population served by Head Start and to monitor program performance, examining both continuity and change. During the FACES redesign process, stakeholders expressed a desire for more timely data (Moiduddin et al. 2012). FACES 2014–2018 will help to enhance the timeliness and accessibility of information by collecting classroom and program data every two years and child-level data every four years. This periodicity is necessary to examine trends and changes over time. Each round of data collection occurs within a single program year and so could not be conducted less frequently.

A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances requiring deviation from these guidelines.

A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

The first Federal Register notice for the FACES 2014–2018 data collection was published in the Federal Register, Volume 79, pp. 11445–11446, on February 28, 2014 (Reference number FR 2014-04032). Two substantive public comments and three requests to see the study instruments were received during the 60-day comment period. Draft instruments were sent as requested. The first comment, from a retired Head Start teacher, emphasized the importance of including rural programs in the study. A response to the commenter noted that FACES is designed to be a nationally representative study and includes programs in both urban and rural areas; additionally, the current information request includes a larger sample of programs than in the past, which will allow additional analyses across program types. The second comment, from the executive director of the National Head Start Association, proposed recommendations to help document the two-generation work Head Start does with families. A response to the commenter noted that the recommended topics are under consideration for inclusion in spring 2015 or Plus study data collection materials (see Appendix F for a copy of this public comment and response). The second Federal Register notice for the FACES 2014–2018 data collection was published in the Federal Register, Volume 79, pp. 27620–27621, on May 14, 2014 (Reference number FR 2014-11054). No comments or requests were received during that comment period. Copies of the 60-day notice and previous 30-day notice are included in Appendix E.

Previous rounds of FACES and the FACES redesign involved many individuals and organizations. The new FACES Core Plus study design and content reflect the redesign project, which gathered information from key stakeholders, examined programmatic and policy priorities, and reviewed study design and measurement strategies. The redesign project held two expert panel meetings—one on research priorities and one on methods—which led to the design options. For FACES 2014–2018, we will engage outside experts on particular topics as they emerge. We will obtain their feedback through written products, telephone conversations, or webinars. To date, we have consulted experts concerning the measurement of family engagement in the parent survey. Members of the family engagement expert panel are listed below in Table A.2.

Table A.2. FACES 2014–2018 Family Engagement Expert Panel Members

Member Name       Affiliation
Oscar Barbarin    Tulane University
Juliet Bromer     Herr Research Center for Children and Social Policy, Erikson Institute
Toni Porter       Innovation, Policy and Research, Bank Street College of Education
Joshua Sparrow    Harvard University
Heather Weiss     Harvard University


A.9. Explanation of Any Gift to Respondents

Previously reviewed and approved

Participation in FACES will place some burden on program staff, families, and children. To offset this burden, we have developed a structure for respondents to receive tokens of appreciation that is based on the one used effectively in FACES 2009 and that acknowledges respondents' efforts in a respectful way. Table A.3 presents the proposed structure. The token of appreciation values for teachers and parents have been modified since FACES 2006 and FACES 2009. The token of appreciation for teachers completing a teacher child report (TCR) is higher than the amount used in prior rounds. In FACES 2006 and FACES 2009, teachers received $5 for each TCR they completed, plus an additional $2 per form if they completed it on the web. The differential was meant to encourage teachers' use of the web option, since web-based surveys contain built-in range and logic checks and branching instructions, effectively eliminating most of the errors inherent in paper instruments. In both studies, a majority of teachers opted for the web option; therefore, we do not believe a differential amount is needed in FACES 2014–2018. Teachers will receive $10 for each TCR they complete, a $3 increase over the amount provided for web completions in FACES 2009, in recognition of the fact that teachers in FACES 2014 are being asked to complete both parts of the teacher survey (teacher background and classroom information, as well as the TCRs) online; previously, only the TCRs were completed online. Parents will receive a token of appreciation for each survey they complete. In FACES 2006 and 2009, parents received $35 after completing the parent interview in person or by phone. Because the length of the parent survey has been reduced from 60 minutes to 20 minutes (plus the 5-minute parent supplement survey), the amount has been reduced as well.
FACES 2014 also now uses a tiered approach, with small additional amounts offered for web and early completion, to reflect the lower costs associated with web completion and the reduction in the number of follow-ups required. We believe that increasing the number of surveys completed by web rather than by phone will lower the overall data collection cost.

Current request – Spring 2015

In spring 2015, parents would receive a token of appreciation of $25 for completing the Family Engagement Study parent interview. Additionally, teachers completing the 5E-Early Ed pilot survey would receive a token of appreciation of $20. As with the Teacher Child Reports, completing this additional survey requires effort beyond the typical work day.

Table A.3. FACES 2014–2018 Proposed Token of Appreciation Structure Compared to Structure of Prior Rounds

FACES Component | Respondent | FACES 2006 Token of Appreciation | FACES 2009 Token of Appreciation | FACES 2014–2018 Token of Appreciation
Data collection site visit | Program in child-level data collection | Fall and Spring: $500 | Fall: $500; Spring: $250 | Fall: $500; Spring: $250
Data collection site visit | Program in class/program-only data collection | NA | NA | Spring: $250 (to include FES visit for classroom sampling)
Teacher child report | Teacher | Fall and Spring: $7 per web form; $5 per paper form | Fall and Spring: $7 per web form; $5 per paper form | Fall and Spring: $10 per form
Parent survey | Parent | Fall and Spring: $35 | Fall and Spring: $35 | Fall and Spring: $15 (additional $5 if completed within 3 weeks of receiving survey; additional $5 if completed on the web)
Child assessment | Child | Fall and Spring: Children's book (valued at $10) | Fall and Spring: Children's book (valued at $10) | Fall and Spring: Children's book (valued at $10)
Family engagement parent interview | Parent | NA | NA | Spring: $25
5E-Early Ed pilot educator survey | Teacher | NA | NA | Spring: $20

NA = Not Applicable


A.10. Assurance of Privacy Provided to Respondents

Respondents will receive information about privacy protections before they are asked to participate in the study. The study team will repeat this information at the start of each survey and interview. All interviewers and data collectors will be knowledgeable about privacy procedures and will be prepared to describe them in detail or to answer any related questions respondents raise.

We have crafted carefully worded consent forms (Appendix C.1 and C.2; Appendix J.5 and J.7) that explain in simple, direct language the steps we will take to protect the privacy of the information each sample member provides. We will assure parents both as they are recruited and before each wave of data collection that their responses and their child’s assessment scores will not be shared with the Head Start program staff or the program. We will assure both parents and staff that their responses will be reported only as part of aggregate statistics across all participants. ACF will obtain signed, informed consent from all parents before their participation and obtain their consent to assess their children. The FACES study FAQ and brochure (Appendix C.3, C.4) make it clear that parents may withdraw their consent at any time.

To further ensure privacy, the study team will remove personal identifiers that could be used to link individuals with their responses from all completed questionnaires and store the hard copy questionnaires under lock and key at the study team offices. The study team has extensive corporate administrative and security systems to prevent the unauthorized release of personal records, including state-of-the-art hardware and software for encryption that meets federal standards; physical security, including limited key card access and locked data storage areas; and other methods of data protection (for example, requirements for regular password updating). Mathematica secures individually identifiable and other sensitive project information and strictly controls access to sensitive information on a need-to-know basis. Data on tablet computers will be secured through hard drive encryption that meets federal standards, as well as through operation and survey system configuration and a password. Any computer files that contain this information will also be locked and password-protected. Survey, interview and data management procedures that ensure the security of data and privacy of information will be a major part of training. Additionally, Mathematica will require its entire staff to sign a confidentiality statement (Appendix G).

We are obtaining a National Institutes of Health certificate of confidentiality to help ensure the privacy of study participants. We are in the process of applying for the Institutional Review Board clearance needed before applying for the certificate. Additionally, OPRE is currently in the process of publishing a System of Records Notice (SORN) and a Privacy Impact Assessment (PIA).

A.11. Justification for Sensitive Questions

To achieve its primary goal of describing the characteristics of the children and families served by Head Start, we will be asking parents a few sensitive questions, including some aimed at assessing feelings of depression. We have used this information in past FACES reports to describe the Head Start population and staff and to examine child outcomes and change in those outcomes over time. Parents will also be asked about household income. The sensitive questions obtain important information for understanding behaviors and family needs, and previous rounds of FACES have used them. The invitation will inform participating parents that the survey will ask sensitive questions (Appendices H.2 through H.5). The invitation will also inform parents that they do not have to answer questions that make them uncomfortable and that none of the responses they provide will be reported back to program staff.

A.12. Estimates of Annualized Burden Hours and Costs

The proposed data collection does not impose a financial burden on respondents, and respondents will not incur any expense other than the time spent participating.

  1. Approved Information Collection Requests Ongoing

The total burden remaining for the previously approved instruments that will continue to collect information is estimated to be 2,515 hours annually. Table A.4 lists these instruments, which cover the time program directors and on-site coordinators spend reviewing materials and speaking with a study team member about the centers in their Head Start program, as well as the time center directors spend reviewing study materials. It also includes data collection activities for fall 2014 in the 60 programs participating in child-level data collection: child assessments, parent surveys, and teacher child reports. Final response rates are provided in Table A.5. The parent response rate of 77 percent falls below our expected target of 86 percent. The parent survey experiment (described in Section A.3) included a three-week delay before study staff began actively contacting parents to complete the survey by phone. This delay could have adversely affected the response rate, especially in the later weeks of the data collection period. All consented parents are contacted in the spring, even if they did not complete the fall survey. To remediate the fall response rate issues in the spring data collection, we plan to release fall nonrespondent cases first, allowing more time to contact these cases and complete data collection for them. We are also considering shortening the interval between when a parent is invited to complete the survey and when active calling begins.

Table A.4. Approved Estimated Information Gathering Annual Response Burden and Approved Information Gathering Annual Cost

Instrument | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Estimated Annual Burden Hours | Average Hourly Wage^a | Total Annual Cost
Telephone script for program directors | 230 | 77 | 2 | 1.00 | 154 | $27.53 | $4,239.62
Telephone script for on-site coordinators | 230 | 77 | 2 | 0.75 | 116 | $27.53 | $3,193.48
Letter for center director | 460 | 154 | 2 | 0.08 | 25 | $27.53 | $688.25
Classroom sampling form from Head Start staff | 120 | 40 | 1 | 0.17 | 7 | $28.28 | $192.27
Child roster form from Head Start staff | 120 | 40 | 1 | 0.33 | 13 | $28.28 | $373.23
Head Start core parent consent form | 2,400 | 800 | 1 | 0.17 | 136 | $16.20 | $2,203.20
Head Start core child assessment | 2,400 | 800 | 2 | 0.75 | 1,200 | n.a. | n.a.
Head Start core parent survey | 2,400 | 800 | 2 | 0.33 | 528 | $16.20 | $8,553.60
Head Start fall parent supplement survey | 2,400 | 800 | 1 | 0.08 | 64 | $16.20 | $1,036.80
Head Start core teacher child report | 240 | 80 | 20 | 0.17 | 272 | $28.28 | $7,690.80
Estimated Total | | | | | 2,515 | | $28,171.25

^a Average hourly wage is based on the most recent Current Population Survey weekly earnings available at the time of the original request.
n.a. = not applicable


Table A.5. Final Response Rates for Approved Information Requests

Data Collection | Expected Response Rate | Final Response Rate
Head Start program | 100% | 90%
Head Start center^a | 100% | 100%
Head Start core parent consent form^b | 90% | 95%
Head Start core child assessment^c | 92% | 95%
Head Start core parent survey^c | 86% | 77%
Head Start fall parent supplement survey^c | 86% | 77%
Head Start core teacher child report^c | 93% | 98%

^a Among participating programs
^b Among eligible children
^c Among eligible, consented children


  2. Current Information Collection Requests

Table A.6 presents the current request to cover data collection activities related to parents of Head Start children, Head Start teachers, and Head Start staff. We expect the total annual burden to be 556 hours for all of the instruments in the current data collection.



Table A.6. Estimated Current Annual Response Burden and Current Annual Cost

Instrument | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Estimated Annual Burden Hours | Average Hourly Wage | Total Annual Cost
(1) Classroom sampling form from Head Start staff | 360 | 120 | 1 | 0.17 | 20 | $27.45 | $549.00
(7) Head Start spring parent supplement survey | 2,400 | 800 | 1 | 0.08 | 64 | $16.65 | $1,065.60
(8) Head Start core teacher survey | 720 | 240 | 1 | 0.50 | 120 | $27.45 | $3,294.00
(9) Head Start core program director survey | 180 | 60 | 1 | 0.50 | 30 | $27.45 | $823.50
(10) Head Start core center director survey | 360 | 120 | 1 | 0.42 | 50 | $27.45 | $1,372.50
(11) Head Start parent qualitative interview (Family Engagement) | 360 | 120 | 1 | 1.00 | 120 | $16.65 | $1,998.00
(12) Head Start staff qualitative interview (FSS Engagement) | 180 | 60 | 1 | 1.00 | 60 | $27.45 | $1,647.00
(13) Head Start staff (FSS) roster form | 60 | 20 | 1 | 0.17 | 3 | $27.45 | $82.35
(J-5) Head Start parent engagement interview consent form | 360 | 120 | 1 | 0.17 | 20 | $16.65 | $333.00
(J-7) Head Start staff engagement interview consent form | 180 | 60 | 1 | 0.17 | 10 | $27.45 | $274.50
(14) Early care and education providers survey for Plus study (5E-Early Ed pilot) | 480 | 160 | 1 | 0.33 | 53 | $27.45 | $1,454.85
(15) Early care and education providers survey for Plus study (FPTRQ) | 240 | 80 | 1 | 0.08 | 6 | $27.45 | $164.70
Estimated Total | | | | | 556 | | $13,059.00

  3. Total Burden Hour Request

The total burden to continue use of already approved information collection in addition to the new request is 3,071 hours per year over three years.

  4. Future Information Collection Requests

Table A.7 presents future data collection activities related to the spring 2017 program and classroom components as well as to potential Plus study activities. We expect the total annual burden to be 2,334 hours for all of the instruments in the future data collection. The estimated future burden reflects the original 60-day master request remaining after accounting for the current information requests.

Table A.7. Estimated Future Annual Response Burden and Future Annual Cost

Instrument | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Estimated Annual Burden Hours | Average Hourly Wage | Total Annual Cost
Classroom sampling form from Head Start staff | 360 | 120 | 1 | 0.17 | 20 | $27.45 | $549.00
Head Start core teacher survey | 720 | 240 | 1 | 0.50 | 120 | $27.45 | $3,294.00
Head Start core program director survey | 180 | 60 | 1 | 0.50 | 30 | $27.45 | $823.50
Head Start core center director survey | 360 | 120 | 1 | 0.42 | 50 | $27.45 | $1,372.50
Head Start parent qualitative interview | 441 | 147 | 1 | 1.00 | 147 | $16.65 | $2,447.55
Head Start staff qualitative interview | 420 | 140 | 1 | 1.00 | 140 | $27.45 | $3,843.00
Head Start parent consent form for Plus study | 1,350 | 450 | 1 | 0.17 | 77 | $16.65 | $1,282.05
Head Start child assessment for Plus study | 1,350 | 450 | 2 | 0.75 | 675 | n.a. | n.a.
Head Start parent survey for Plus study | 1,350 | 450 | 2 | 0.33 | 297 | $16.65 | $4,945.05
Head Start parent supplemental survey for Plus study | 1,350 | 450 | 2 | 0.08 | 72 | $16.65 | $1,198.80
Head Start teacher child report for Plus study | 150 | 50 | 20 | 0.17 | 170 | $27.45 | $4,666.50
Head Start teacher survey for Plus study | 150 | 50 | 2 | 0.50 | 50 | $27.45 | $1,372.50
Head Start program director survey for Plus study | 50 | 17 | 2 | 0.50 | 17 | $27.45 | $466.65
Head Start center director survey for Plus study | 100 | 33 | 2 | 0.42 | 28 | $27.45 | $768.60
Early care and education administrators survey for Plus study | 600 | 200 | 2 | 0.50 | 200 | $27.45 | $5,490.00
Early care and education providers survey for Plus study | 722 | 241 | 2 | 0.50 | 241 | $27.45 | $6,615.45
Estimated Total | | | | | 2,334 | | $39,135.15

n.a. = not applicable

  5. Estimates of Annualized Costs

To compute the total estimated annual cost, we multiplied the total burden hours by the average hourly wage for each adult participant, based on median weekly wages from the Bureau of Labor Statistics, Current Population Survey estimates (second quarter of 2014). The results appear in Tables A.6 (current requests) and A.7 (future requests). For teachers, program directors, center directors, Head Start staff, and other early care and education program staff, we used the median salary for full-time employees over age 25 with a bachelor's degree ($27.45 per hour). For parents, we used the median salary for full-time employees over the age of 25 who are high school graduates with no college experience ($16.65 per hour).
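As an illustration only (not part of the study's processing systems), the burden-and-cost arithmetic described above can be sketched in a few lines; the figures below are taken from two rows of Table A.6, and hours are rounded to whole hours as in the tables.

```python
# Sketch of the annualized burden and cost computation described above:
# annual burden hours = annual respondents x responses per respondent x hours per response,
# rounded to whole hours; annual cost = burden hours x average hourly wage.

TEACHER_WAGE = 27.45  # BLS CPS median, bachelor's degree, age 25+
PARENT_WAGE = 16.65   # BLS CPS median, high school graduate, age 25+

instruments = [
    # (name, annual respondents, responses each, hours per response, wage)
    ("Head Start core teacher survey", 240, 1, 0.50, TEACHER_WAGE),
    ("Head Start spring parent supplement survey", 800, 1, 0.08, PARENT_WAGE),
]

for name, n, k, hours, wage in instruments:
    burden = round(n * k * hours)
    cost = burden * wage
    print(f"{name}: {burden} hours, ${cost:,.2f}")
```

For these two rows the sketch reproduces the table's entries (120 hours and $3,294.00; 64 hours and $1,065.60).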

A.13. Estimates of Other Total Cost Burden to Respondents and Record Keepers

Not applicable.

A.14. Cost to the Federal Government

The total cost for the Core data collection related to the instruments within this current request is $2,474,114. These costs include the sampling, data collection, data processing, and analysis. The total cost for the Family Engagement Plus Study in this current request is $628,551. The total cost of the 5E Early Ed pilot survey is $120,150. Therefore, the total cost of data collection related to the instruments within this current request is $3,222,815. Including the previously approved costs (OMB Number 0970-0151, approved July 7, 2014), the total cost of data collection to date would be $9,342,222 or $3,114,074 annually.

A.15. Explanation for Program Changes or Adjustments

The changes made to the FACES 2014–2018 data collection reflect OPRE’s and OHS’s interest in ensuring that FACES provides timely and ongoing information about Head Start program performance, including program improvement efforts, program quality, and outcomes for children and families. As detailed above, unlike FACES 2009, FACES 2014–2018 will consist of three Core waves of data collection—fall 2014, spring 2015, and spring 2017—encompassing the Classroom + Child Outcomes Core and the Classroom Core, with Plus studies being a part of those waves or at different time points to be determined. At the program level, the sample size in FACES 2014–2018 is larger than the sample size of FACES 2009, allowing for more powerful analysis of program quality (180 programs rather than 60 programs). At the child level, the Classroom + Child Outcomes Core sample will represent all children enrolled in Head Start at baseline, not just those entering the program for the first time, as in FACES 2009. Previously, FACES followed the newly enrolled children through one or two years of Head Start and then through the spring of kindergarten. The FACES Core design will focus more on the children’s experiences and outcomes during their time in Head Start. FACES 2014–2018 will also differ from FACES 2009 in the mode and length of parent and Head Start staff surveys (more web-based and shortened surveys) to reduce respondent burden and support reporting on key indicators.

The current information collection request includes additional data collection activities for FACES 2014–2018. Specifically: spring 2015 data collection, including selecting classrooms in additional programs; conducting classroom observations; surveying teachers, center directors, and program directors; and interviewing parents and staff for FACES Plus studies.

A.16. Plans for Tabulation and Publication and Project Time Schedule

a. Analysis Plan

The analyses will aim to (1) describe Head Start programs and classrooms; (2) describe children and families served by Head Start, including children’s outcomes; (3) relate classroom and program characteristics to classroom quality; and (4) relate family, classroom, and program characteristics to children’s outcomes. Analyses will employ a variety of methods, including cross-sectional and longitudinal approaches, descriptive statistics (means, percentages), simple tests of differences across subgroups and over time (t-tests, chi-square tests), and multivariate analysis (regression analysis, hierarchical linear modeling [HLM]). For all analyses, we will calculate standard errors that take into account multilevel sampling and clustering at each level (program, center, classroom, child) as well as the effects of unequal weighting. We will use analysis weights, taking into account the complex multilevel sample design and nonresponse at each stage.
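To make the weighting and clustering point concrete, the sketch below uses invented program identifiers, weights, and scores (not FACES data) to compute a design-weighted mean and a cluster-robust standard error, collapsing weighted residuals to the primary sampling unit (here, the program) in the spirit of Taylor-series linearization.

```python
# Hypothetical illustration of a design-weighted estimate with a
# cluster-robust standard error (clustering at the program level).
import math
from collections import defaultdict

# (program id, analysis weight, child outcome score) -- invented values
records = [
    ("P1", 1.2, 88), ("P1", 0.8, 92), ("P2", 1.0, 75),
    ("P2", 1.5, 81), ("P3", 0.9, 95), ("P3", 1.1, 79),
]

total_w = sum(w for _, w, _ in records)
mean = sum(w * y for _, w, y in records) / total_w

# Sum weighted residuals within each cluster (program); the variance
# comes from between-cluster variability, so unequal weights and
# clustering both inflate the standard error relative to a simple mean.
cluster_sums = defaultdict(float)
for pid, w, y in records:
    cluster_sums[pid] += w * (y - mean)
k = len(cluster_sums)
var = (k / (k - 1)) * sum(s ** 2 for s in cluster_sums.values()) / total_w ** 2
se = math.sqrt(var)
print(f"weighted mean = {mean:.2f}, clustered SE = {se:.2f}")
```

In production, FACES-style analyses would use survey software that implements the full multistage design; this sketch only shows why naive standard errors would be too small.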

Cross-sectional Analyses. Descriptive analyses will provide information on characteristics at a single point in time, overall and by various subgroups. For example, for questions on the characteristics of Head Start programs, classrooms, or teachers (for example, average quality of classrooms or current teacher education levels) and the characteristics of Head Start children and families (for example, family characteristics or children’s skills at the beginning of the Head Start year), we will calculate averages (means) and percentages. We will also examine differences in characteristics (for example, children’s outcomes or classroom quality), by various subgroups. We will calculate averages and percentages, and use t-tests and chi-square tests to assess the statistical significance of differences between subgroups.

Changes or Trends over Time. Analyses will also examine changes or trends in characteristics over time, overall and by various subgroups. For questions about changes in children's outcomes during a year of Head Start, we will calculate the average differences in outcomes from fall to spring for all children and for selected subgroups (for example, children who are dual language learners). We will use a similar approach for changes in family characteristics during the year. Outcomes that have been normed on broad populations of preschool-age children (for example, the Woodcock-Johnson III Letter-Word Identification or the Peabody Picture Vocabulary Test, 4th Edition) will be compared with the published norms to judge how Head Start children compare with other children their age in the general population and how they have progressed relative to national norms.
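The fall-to-spring change computation can be sketched with a paired t-test on invented scores (not FACES data); for brevity this illustration ignores the survey weights and clustering discussed above.

```python
# Hypothetical fall and spring assessment scores for the same children;
# a paired t-test asks whether the mean fall-to-spring gain differs from zero.
import math

fall =   [82, 90, 75, 88, 70, 95, 80, 85]
spring = [89, 94, 80, 91, 78, 97, 84, 92]

gains = [s - f for f, s in zip(fall, spring)]
n = len(gains)
mean_gain = sum(gains) / n
sd = math.sqrt(sum((g - mean_gain) ** 2 for g in gains) / (n - 1))  # sample SD
t = mean_gain / (sd / math.sqrt(n))  # t statistic with n - 1 degrees of freedom
print(f"mean gain = {mean_gain:.2f}, t = {t:.2f} with {n - 1} df")
```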

To examine changes in classroom or program-level characteristics across years, we will use t-tests and chi-square tests for simple comparisons—one year versus another. We will use trend analysis for examining whether child outcomes or family, classroom, or program characteristics are changing across multiple years and rounds of FACES. To compare children’s outcomes across prior and current FACES cohorts, we will employ a regression framework to examine the relationships between children’s outcomes and the year in which the outcomes were measured, controlling for child and family characteristics.

Multivariate Analyses. We will use multiple approaches for questions relating characteristics of the classroom, teacher, or program to children’s outcomes at single points in time, changes during a year in Head Start, or relationships among characteristics of classrooms, teachers, programs, and classroom quality. Many of the questions can be addressed by estimating hierarchical linear models that take into account that children are nested within classrooms that are nested within centers within programs. Analyses examining whether there are policy-relevant thresholds or cut points of classroom quality will also use HLMs to account for the clustering of children within classrooms and of classrooms within programs.
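The nesting rationale can be made concrete with a small variance-decomposition sketch (invented data): the intraclass correlation (ICC) estimates the share of outcome variance lying between classrooms, which is the quantity that makes multilevel modeling necessary; this uses the one-way ANOVA estimator for a balanced design rather than a full HLM.

```python
# Hypothetical nested data: children's scores grouped by classroom.
# One-way ANOVA estimator of the intraclass correlation (ICC).
classrooms = {
    "A": [78, 82, 85, 80],
    "B": [90, 94, 88, 92],
    "C": [70, 74, 72, 68],
}

n = 4                      # children per classroom (balanced design)
k = len(classrooms)
grand = sum(sum(v) for v in classrooms.values()) / (n * k)

ss_between = n * sum((sum(v) / n - grand) ** 2 for v in classrooms.values())
ss_within = sum((x - sum(v) / n) ** 2 for v in classrooms.values() for x in v)
ms_between = ss_between / (k - 1)
ms_within = ss_within / (k * (n - 1))

var_between = (ms_between - ms_within) / n   # classroom-level variance component
icc = var_between / (var_between + ms_within)
print(f"ICC = {icc:.2f}")
```

A high ICC (as in this toy example) means children in the same classroom are much more alike than children in different classrooms, so single-level regression would badly understate uncertainty.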

For the first time, children returning for a second year of Head Start will be included in the FACES sample. Gains for children in their second year can now be measured directly. However, gains across two years in the program will be synthetically estimated, piecing together the gains for the first- and second-year children, assuming two groups of children in one program year can adequately represent one group of children in two consecutive years of the Head Start program. To the extent this assumption does not hold, it can be addressed in the analysis by controlling for key covariates. We can also devise a weighting scheme that models, based on children's characteristics, which children are likely to stay in Head Start through their second year.

Plus Topical Module Analysis. Interviews with parents and Head Start staff for the family engagement Plus study will be conducted with semi-structured paper-and-pencil guides and audio recorded for transcription and later coding. Analysis of the collected data will involve qualitative coding for themes or patterns, as well as descriptive analysis of survey data. Analyses will be conducted to identify themes and patterns overall and for key subgroups. Data will be coded by trained staff using ATLAS.ti.

Plus Pilot Survey Analysis. The 5E-Early Ed educator survey data collected through FACES will be used to test the measurement characteristics of this set of new measures. We will use Rasch analysis to assess the reliability of the measures and to modify the set of items and the response categories to maximize reliability (at least 0.80) while keeping the set of items as small as possible. We will examine item fit statistics (calculated as aggregations of individual residuals across people within items) to determine how well the survey responses fit the Rasch model. Individual item fit statistics greater than about 1.3 indicate possible multidimensionality or other violation of principles of good measurement. We will calculate item difficulties to confirm that the order of the items agrees with our understanding of the concept. The data will be analyzed to identify areas for improvement, such as misfitting items and poorly constructed response scales. We will also examine differential item functioning among subgroups of teachers to verify that the items are uniformly applicable across all groups (for example, by age or teaching experience).
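The item-fit check described above can be sketched with invented person abilities, item difficulties, and responses (not the pilot data): under the dichotomous Rasch model the expected response is exp(b - d) / (1 + exp(b - d)), and an item's infit mean-square is the information-weighted average of its squared standardized residuals, which simplifies to the ratio below.

```python
# Hypothetical Rasch item-fit illustration: compute each item's infit
# mean-square from a small invented response matrix. Values near 1.0
# indicate good fit; values above about 1.3 flag possible misfit.
import math

abilities = [-1.0, -0.3, 0.2, 0.8, 1.5]   # person parameters (logits) -- invented
difficulties = [-0.5, 0.0, 0.7]           # item parameters (logits) -- invented
# responses[person][item]: 1 = endorsed/correct, 0 = not -- invented data
responses = [
    [1, 0, 0],
    [1, 1, 0],
    [1, 0, 1],
    [1, 1, 0],
    [1, 1, 1],
]

infits = []
for j, d in enumerate(difficulties):
    num = den = 0.0
    for i, b in enumerate(abilities):
        p = 1.0 / (1.0 + math.exp(-(b - d)))  # Rasch expected score
        num += (responses[i][j] - p) ** 2     # squared residual
        den += p * (1.0 - p)                  # information weight
    infits.append(num / den)                  # infit mean-square for item j
    print(f"item {j}: infit mean-square = {infits[-1]:.2f}")
```

In practice the pilot analysis would use dedicated Rasch software to estimate the parameters jointly; this sketch only shows how the fit statistic is formed once parameters are in hand.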

b. Time Schedule and Publications

We plan to produce the following products after each of the three waves of data collection (fall 2014, spring 2015, and spring 2017): (1) a set of descriptive tables on key indicators, (2) a report, and (3) two issue briefs.

The fall 2014 report will describe the population of children enrolled in Head Start that year. Likely topics include children's demographic and language characteristics; children's cognitive, social-emotional, and physical well-being; and family economic well-being at the beginning of the program year. The spring 2015 report will include data on children, teachers, classrooms, and programs. It will include children's data from the fall and spring to examine development across the program year, both for the full sample and for subgroups based on, for example, family background or length of children's Head Start experience. The spring 2017 report will include data on programs and classrooms as well as from teachers. Using data from the director surveys and the program information report, it will provide a profile of Head Start programs—both structural features and approaches to implementing particular policies and processes. It will present information on classroom quality and teacher practices, summarize quality at the program level, and examine whether classroom quality varies by characteristics of programs, teachers, or classrooms. Classroom + Child Outcomes Core issue briefs will examine topics introduced in the fall 2014 and spring 2015 reports in greater depth or for particular subgroups. Classroom Core issue briefs will focus on specific topics related to program quality and services or on classroom quality over time. We will also prepare a report focused on family engagement, highlighting patterns and themes from the family engagement questions included in (1) the spring 2015 parent supplement survey and (2) interviews with a subsample of Head Start staff and parents.
The report will address the study questions and provide descriptive information about what is happening in programs around family engagement and service provision, the background characteristics of family services staff (FSS), how families and staff work together, and how practices and experiences may differ across families (or staff). Results from the psychometric analyses of the 5E-Early Ed pilot educator survey may be developed into scholarly publications or technical manuals.

At the end of each Core study in spring 2015 and spring 2017, we will also produce the following products: (1) public use files and (2) technical reports/user's guides that detail the study design, analysis methods, nonresponse and nonresponse bias, and the psychometric properties of the measures. We will follow a similar approach to documentation and reporting for Plus studies, integrating the Plus study documents and data with the Core documents and data when they occur in the same data collection period. The 5E-Early Ed pilot educator survey to be conducted in spring 2015 will not be included in those files.

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

The OMB number and expiration date will be displayed at the top of the cover page or first Web page for each instrument used in the study. For CATI or CAPI instruments, we will display this information on the introduction screens.

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this data collection.


REFERENCES

Administration on Children, Youth, and Families. “Charting our Progress: Development of the Head Start Program Performance Measures.” Washington, DC: U.S. Department of Health and Human Services, 1995.

Love, John M., Louisa B. Tarullo, Helen Raikes, and Rachel Chazan-Cohen. “Head Start: What Do We Know About Its Effectiveness? What Do We Need to Know?” In Blackwell Handbook of Early Childhood Development, edited by Kathleen McCartney and Deborah Phillips. Malden, MA: Blackwell Publishing, 2006.

Moiduddin, Emily, Julia Lyskawa, Louisa Tarullo, Jerry West, and Elizabeth Cavadel. “FACES Redesign: Stakeholder Input on Information Needs.” Final report submitted to the Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. Washington, DC: Mathematica Policy Research, January 19, 2012.

1 Head Start children will be compared to publisher normative data for same-age children.

2 The FES will ask Head Start staff (typically the On-Site Coordinator) for a list of all Head Start-funded classrooms and for each classroom, the teacher’s first and last names, the classroom session type (morning, afternoon, full day, or home visitor), and the number of Head Start children enrolled.

3 In the spring waves of data collection, classroom observations will assess the quality of equipment, materials, teacher-child interactions, and instructional practices in the classroom. No burden is associated with the observation, and thus we will not discuss it further in this package; see Appendix D for the components of the classroom observation.
