Pre-testing of Evaluation Surveys


OMB: 0970-0355


Redesign of the Head Start Family and Child Experiences Survey (FACES 2012)

Supporting Statement Part A Justification for the Study

November 13, 2013


CONTENTS

A. JUSTIFICATION

A1. Necessity for the Data Collection

A2. Purpose of Survey and Data Collection Procedures

A3. Improved Information Technology to Reduce Burden

A4. Efforts to Identify Duplication

A5. Involvement of Small Organizations

A6. Consequences of Less Frequent Data Collection

A7. Special Circumstances

A8. Federal Register Notice and Consultation

A9. Incentives for Respondents

A10. Privacy of Respondents

A11. Sensitive Questions

A12. Estimation of Information Collection Burden (Newly Requested Information Collections)

A13. Cost Burden to Respondents or Record Keepers

A14. Estimate of Cost to the Federal Government

A15. Change in Burden

A16. Plan and Time Schedule for Information Collection, Tabulation, and Publication

A17. Reasons Not to Display OMB Expiration Date

A18. Exceptions to Certification for Paperwork Reduction Act Submissions

REFERENCES




ATTACHMENTS

A.1 Program Recruitment Screener

A.2 Classroom Selection Form

A.3 Components of the Direct Child Assessment

A.4 FACES Pilot Study Web Specs

A.5 Cognitive Interviewing Protocol


TABLES

A.1 FACES DLL Study Pre-test Design

A.2 Members of the FACES Redesign Expert Panel on DLL Children

A.3 Total Burden Requested Under This Information Collection



A. JUSTIFICATION

The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (DHHS) seeks approval for pilot data collection activities to support the Head Start Family and Child Experiences Survey (FACES), a recurring source of information about the children and families served by Head Start at the national level. ACF requests permission to (1) enroll Head Start programs and participants (children and parents) into the pilot study, (2) administer and evaluate a set of vocabulary and language measures with children who are dual language learners (DLLs), and (3) pilot test a parent survey of young children’s development and behavioral health. We will analyze and evaluate the measures and procedures involved with assessing DLL children and surveying their parents, with the goal of expanding and improving measurement of DLL children’s development in future rounds of FACES and in other studies. Findings based on the information collected will be compiled in a report to be used for internal purposes only and will not be released to the public. Mathematica Policy Research is the contractor for the study.

A1. Necessity for the Data Collection

The purpose of the FACES Pilot Study is to evaluate and refine the protocols used to assess the development of DLL children both directly (through child assessments) and indirectly (through a survey of their parents). Growth in the population of Hispanic/Latino preschoolers and DLLs in the United States generally (Hernandez 2006), and among children served by Head Start in particular (West and Hulsey 2009; Hulsey et al. 2011), has implications for the assessment procedures used in FACES and other national studies of young children. As a part of the FACES Redesign Project, a two-year effort encompassing a systematic review of all aspects of the FACES design, methods, and data collection procedures, Mathematica convened an expert panel to consider optimal approaches to assessing DLL children in future rounds of FACES. The panel recommended that ACF pilot additional measures suitable for assessing the vocabulary skills of DLL children more efficiently, as well as measures that offer a broader picture of children’s language development. Additionally, finding or developing brief, reliable measures of children’s development, behavioral health, and family context that parents can complete through different data collection modes is important to ACF and other federal agencies (for example, the Centers for Disease Control and Prevention). Given the changing nature of the Head Start population and the ways in which parents choose to respond to surveys, it is especially important to evaluate how such measures perform for Spanish-speaking populations and when administered via the web or telephone.

1. Study Background

The FACES Pilot Study will help ACF improve the assessment battery used with DLL children and assess measures and procedures for surveying the parents of such children. It will provide important information about different vocabulary and language measures and assessment approaches that will help guide decisions about how FACES assesses DLL children in the future. The findings from comparisons of reports of children’s development completed by phone or on the web by Spanish- and English-speaking parents have implications for increasing the efficiency of future rounds of FACES and for ensuring that such measures are equally reliable for DLL and non-DLL children.

2. Legal or Administrative Requirements That Necessitate the Collection

There are no legal or administrative requirements that necessitate the data collection activities.

A2. Purpose of Survey and Data Collection Procedures

1. Overview of Purpose and Approach

Mathematica will pre-test a battery of vocabulary and language measures to determine whether FACES can use a single conceptual measure of children’s receptive vocabulary and whether we can include a broader measure of children’s language. The goal of the pre-test is to identify measures that (1) are reliable and valid for assessing DLL children and (2) reduce the receptive vocabulary assessment burden on children. Currently, Spanish-speaking children participating in FACES are administered three vocabulary assessments, including two measures of their English and Spanish receptive vocabulary. We will include the following measures in the pre-test: (1) preLAS 2000 (Pre-language Assessment Scales; Duncan and DeAvila 2002) Art Show, a language screener that assesses expressive English language proficiency; (2) the Peabody Picture Vocabulary Test 4 (PPVT-4; Dunn and Dunn 2006), which assesses receptive vocabulary; (3) the Auditory Comprehension subscale of the Preschool Language Scale 5 (PLS-5; Zimmerman et al. 2011), which assesses language comprehension; and (4) the Receptive One-Word Picture Vocabulary Test (ROWPVT-4)1, which assesses receptive vocabulary. The first two measures are currently part of the FACES battery and the latter two would be new additions.

Parents of the children will be asked to complete the Survey of Well-Being of Young Children (SWYC). Because limited information is available on the use of the SWYC with low-income, Spanish-speaking populations, we will pre-test both its English and Spanish versions with Head Start parents. The SWYC, which was developed by the Floating Hospital for Children at Tufts Medical Center, includes brief questionnaires to assess three domains of children’s functioning: (1) developmental, (2) social/emotional, and (3) family context. We will pre-test the SWYC to assess its reliability and to examine the feasibility of administering it via web and telephone to English- and Spanish-speaking Head Start parents.

2. Research Questions

The pilot study features two components: (1) a child assessment component and (2) a parent survey component.

  1. The assessment component will involve pre-testing a battery of language/vocabulary measures and is designed to answer the following research questions:

  a. Can a single conceptually scored measure of receptive vocabulary be used to assess both English- and Spanish-speaking children?

  b. How do children’s receptive vocabulary scores derived from a conceptually scored measure compare to scores based on English-only and Spanish-only measures? How strongly are both sets of receptive vocabulary scores associated with children’s scores on other measures of expressive vocabulary and language development?

  c. Can a conceptual score of children’s receptive vocabulary be derived from independently administered English and Spanish vocabulary assessments, and will it be comparable to the conceptual score based on the publisher’s standard administration?

  d. How much time is saved by using a conceptually scored measure rather than two independent measures? How much time does the introduction of a broader language measure add to the assessment battery?

  2. The parent survey component of the pre-test will involve pre-testing the SWYC via web and telephone with English- and Spanish-speaking Head Start parents. The research questions to be answered with this pilot study component include the following:

  a. Is the parent survey a reliable measure that captures variability in children’s developmental progress, behavioral health, and family context?

  b. Does it capture variability in the developmental progress, behavioral health, and family context of English-speaking and Spanish-speaking children alike?

  c. Is it feasible to use the parent survey in future cohorts of FACES children? Which of the two possible modes (telephone interview or web-based administration) is more suitable for use in FACES?

3. Study Design

We will invite 480 children and their parents in 16 Head Start centers (eight programs) to participate in this pre-test, with the expectation that 450 will participate. We will select programs and centers to ensure a good distribution of English- and Spanish-speaking children, and sample 30 children from each center, ensuring that 3-, 4-, and 5-year-olds are equally represented and that we have a total of 100 Spanish-speaking children and 50 English-speaking children within each age group. Children will complete a battery of vocabulary and language measures that will last approximately one hour. All 450 children will be administered the preLAS 2000 Art Show in English, the PPVT-4, and the Auditory Comprehension subscale of the PLS-5 (Table A.1). We will divide the pre-test sample into two groups for the purposes of administering the ROWPVT-4/ROWPVT-4 SBE. The first group (n=225) will take the ROWPVT-4/ROWPVT-4 SBE using the publisher’s standard approach: children from English-speaking homes will be administered the ROWPVT-4, and children from Spanish-speaking homes will be administered the ROWPVT-4 SBE. For this group of children, test administration commences in the language in which children indicate they are most comfortable but can be switched in the course of administration, if necessary. (Non-DLL children will be included in this group and will receive the test in English.) The second group (n=225) will complete the ROWPVT-4 SBE items in Spanish or in English (the two sets of items will be administered independently).

Table A.1. FACES DLL Study Pre-test Design

Measure | Sample

preLAS 2000 Art Show (English) | All children (N=450)

PPVT-4 | All children (N=450)

PLS-5 Auditory Comprehension | All children (N=450)

ROWPVT-4*/ROWPVT-4 SBE (Conceptually Scored) | One-half of children (N=225)

ROWPVT-4 SBE (English**) | Remaining one-half of children (N=225)

ROWPVT-4 SBE (Spanish) | Remaining one-half of children (N=225)


*English-speaking children would be administered the ROWPVT-4 and Spanish-speaking children the ROWPVT-4 SBE.

**English-speaking children would be administered the ROWPVT-4 SBE items in English.

Parents of children who participate in the vocabulary and language assessment will be asked to complete the SWYC. We will randomly assign parents of half of the Spanish-speaking and half of the English-speaking children to complete the survey via the web and the other half via Computer-Assisted Telephone Interview (CATI). We will conduct 20 cognitive interviews with parents who complete the survey via the web and 20 who complete it via CATI. We will conduct approximately half of the cognitive interviews with Spanish-speaking parents.
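To make the assignment scheme concrete, the following is a minimal illustrative sketch (not the study’s actual sampling program) of stratified random assignment: within each home-language group, half of the sampled children go to the publisher’s standard, conceptually scored ROWPVT-4/ROWPVT-4 SBE administration and half to the independent administration, and their parents are split between the web and CATI modes. The field names (child_id, home_language) are hypothetical placeholders.

```python
import random

def assign_groups(children, seed=2013):
    """Stratified random assignment of children to ROWPVT administration
    groups and of their parents to SWYC survey modes.

    Within each home-language stratum, the first half (after shuffling) is
    assigned to the conceptually scored administration and the second half to
    the independent administration; parents alternate between web and CATI so
    that each mode covers about half of each stratum.
    """
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    assignments = {}
    for language in ("English", "Spanish"):
        stratum = [c for c in children if c["home_language"] == language]
        rng.shuffle(stratum)
        half = len(stratum) // 2
        for i, child in enumerate(stratum):
            assignments[child["child_id"]] = {
                "rowpvt_group": "conceptual" if i < half else "independent",
                "swyc_mode": "web" if i % 2 == 0 else "CATI",
            }
    return assignments

# Hypothetical roster: 300 Spanish-speaking and 150 English-speaking children
roster = [{"child_id": i, "home_language": "Spanish" if i < 300 else "English"}
          for i in range(450)]
groups = assign_groups(roster)
```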

4. Universe of Data Collection Efforts

The FACES Pilot Study includes the following data collection instruments, which we include in Attachments A.1 through A.5:

  1. Program Recruitment Script and Screener (Attachment A.1)

  2. Classroom Selection Form (Attachment A.2)

  3. Assessment Battery (Attachment A.3)

  4. SWYC Instrument (Attachment A.4)

  5. SWYC Cognitive Interviewing Protocol (Attachment A.5)

We include supporting documents (Program Recruitment Advance Letter, Parent Letter and Consent Form, SWYC Email Invitation, SWYC CATI Advance Letter, and SWYC Sample Screens) as attachments in Part B. All forms and procedures are based on ones used successfully in FACES 2009. The Program Recruitment Advance Letter, which contains a brief overview of the study goals and activities, will serve as the initial mode of contact for inviting programs to take part in the pilot study. Guided by the Program Recruitment Script, a team of two Mathematica recruiters will place a follow-up call, during which programs will be invited more formally to participate in the study. During this call, recruiters will describe the study purpose, provide an overview of study activities, confirm the program’s interest in participating, and administer the Program Recruitment Screener. The screener will provide program-level information on the families served by the program as well as descriptive information about the programs participating in the pilot. We will ask programs to designate an On-Site Coordinator (OSC), who will serve as the key contact and study liaison for coordinating the collection of consent forms and the scheduling of assessment data collection visits. The OSC will be either the program director or someone appointed by the director. OSCs will complete the Classroom Selection Form for each selected center to identify classrooms eligible to participate in the pre-test. Data collected on classroom characteristics will include the number and ages of children and the percentage of DLL children. All supporting documents to be used with Spanish-speaking families, including the Letter and Consent Form for Parents and the SWYC Invitation E-mail, will be translated by a certified Mathematica translator.

A3. Improved Information Technology to Reduce Burden

Mathematica will use Computer-Assisted Interviewing (CAI) to the extent possible when conducting this pre-test. We will use Computer-Assisted Personal Interviewing (CAPI) for two of the four child assessments (preLAS 2000 Art Show and PPVT-4)2, and the parent survey will be administered via a web instrument and CATI. For this one-time pre-test, it would not be cost-effective to program the two additional assessment measures (ROWPVT-4 and PLS-5) as CAPI instruments, so we will use paper (i.e., a score sheet) and pencil to administer them.3 The use of CAPI for the child assessments will facilitate routing children through the assessments and calculating basal and ceiling rules, thereby lessening the amount of time required to administer the assessments and reducing burden on the child. The use of the web and CATI for the parent survey will reduce respondent burden by facilitating routing and skip patterns and by providing the respondent with an instrument specific to the child’s age.
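As an illustration of the bookkeeping that CAPI automates, the sketch below implements a generic basal/ceiling (discontinue) rule. The specific thresholds (a basal after three consecutive correct responses, a ceiling after four consecutive errors) are placeholders for illustration only, not the publishers’ actual rules for any of the measures named above.

```python
def administer(items, get_response, basal_run=3, ceiling_run=4):
    """Administer items in order, tracking runs of correct and incorrect
    responses, and stop once the (illustrative) ceiling rule is met."""
    consecutive_correct = 0
    consecutive_errors = 0
    basal_established = False
    raw_score = 0
    for item in items:
        correct = get_response(item)      # True/False entered by the assessor
        if correct:
            raw_score += 1
            consecutive_correct += 1
            consecutive_errors = 0
            if consecutive_correct >= basal_run:
                basal_established = True
        else:
            consecutive_errors += 1
            consecutive_correct = 0
            if basal_established and consecutive_errors >= ceiling_run:
                break                     # ceiling reached; discontinue testing
    return raw_score

# Example with a simulated child who answers the first ten items correctly
score = administer(list(range(30)), lambda item: item < 10)  # score == 10
```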

A4. Efforts to Identify Duplication

The FACES Pilot Study will not be duplicative of other studies. The assessment component is designed to identify suitable language and vocabulary measures that may replace the measures currently in use on FACES, with the aim of reducing burden. To do this, we will compare children’s performance on several measures currently used in FACES with their performance on two new measures. The pre-test of the survey component involves assessing the feasibility of using a new surveillance instrument in future rounds of FACES.

A5. Involvement of Small Organizations

Not applicable. No small businesses are impacted by the data collection in this project.

A6. Consequences of Less Frequent Data Collection

Not applicable. This is a one-time data collection.

A7. Special Circumstances

There are no special circumstances for the proposed data collection efforts.

A8. Federal Register Notice and Consultation

1. Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13; PRA) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity.

The first Federal Register notice for ACF’s generic clearance for information gathering was published in the Federal Register, Volume 76, page 34078 on June 10, 2011. The agency did not receive any comments in response to the Federal Register notice for the generic clearance. The second Federal Register notice was published in the Federal Register, Volume 76, page 53682 on August 29, 2011.

2. Consultation with Experts Outside of the Study

Table A.2 lists the members of the FACES Redesign expert panel on dual language learners. The panel was convened in May 2013.

Table A.2. Members of the FACES Redesign Expert Panel on DLL Children

Member | Affiliation | Areas of Expertise

Ellen Bialystok | York University | Effect of bilingualism on children’s language and cognitive development, models of metalinguistic awareness and second-language acquisition

Ed De Avila | Language Assessment Scale (LAS) | Language proficiency assessment, developmental psychology

Linda Espinosa | National Task Force on Early Childhood Education for Hispanics Technical Advisory Group | Effective educational services for DLL children, school achievement patterns for language-minority children

Allison Fuligni | California State University; UCLA Center for Improving Child Care Quality | Child development, social development, early childhood education, parenting, and research methods in child development

Fred Genesee | McGill University | Bilingualism, bilingual first-language acquisition in populations with and without impairments

Claude Goldenberg | Stanford University | Second-language reading, bilingual education/ESL, Latino concerns in education

Erika Hoff | Florida Atlantic University | Factors in children's early language experiences and development that predict successful oral language and preliteracy outcomes

Elizabeth Peña | University of Texas at Austin | Dynamic assessment, development of assessment protocols for bilinguals

Mariela Páez | Boston College; Jumpstart | Early childhood education, bilingual language development, professional development for the education of DLL children

Catherine Snow | Harvard University | Language acquisition, bilingualism, bilingual education, language-literacy relations

Patton Tabors | Harvard University | Child language and literacy development

In addition to the DLL experts, we are collaborating with SWYC developers Ellen C. Perrin and R. Christopher Sheldrick. We will continue to consult with Dr. Perrin and Dr. Sheldrick as we prepare the web and telephone versions of the SWYC instrument.

A9. Incentives for Respondents

With OMB approval, we will offer children a book that costs under $5 for completing the assessment. We will offer participants $15 for completing the SWYC either on-line or by phone, and an additional $20 for participating in the cognitive interview. We will distribute $200 gift cards to participating programs or donate an equivalent amount to program activities, based on the program’s policies. These amounts were determined based on the estimated burden to participants and are consistent with those offered in prior Head Start studies using similar methodologies and data collection instruments (such as FACES and Baby FACES).

A10. Privacy of Respondents

The study will comply with government regulations for securing and protecting paper records, field notes, or other documents that contain sensitive or personally identifiable information. The study will assign a unique identification number to programs, centers, children, and parents to facilitate the linking of information across data sources for analytic purposes.

Parents will receive information about privacy protections when they consent to participate in the pilot. We have crafted carefully worded consent forms that explain in simple, direct language the steps we will take to protect the privacy of the information each sample member provides. The study will provide assurances of privacy to each parent as he or she is recruited for the pilot data collection. The consent form makes it clear that parents may withdraw their consent at any time or refuse to answer any items in the questionnaire or interview. Parents will be assured that their responses will not be shared with the Head Start program staff, their child’s primary caregiver, or the program, and that their responses will be reported only as part of aggregate statistics across all participating families. We will not share any information with any Head Start staff member. Moreover, no scale scores from direct child assessments will be reported back to programs.

To further ensure privacy, personal identifiers that could be used to link individuals with their responses will be removed from all completed questionnaires and stored under lock and key at the research team offices. Data on laptop computers will be protected by a FIPS 140-2 certified encryption system. Any computer files that contain this information also will be locked and password protected. Interview and data management procedures that ensure the security of data and privacy of information will be a major part of interviewer training.

Program directors and on-site coordinators will be asked a small set of questions about the centers in their Head Start programs. Program directors will provide the name and address of each center in their program, as well as each center’s hours of operation and an estimate of the number of DLL children it serves. OSCs will be asked to confirm the name and address of each center chosen to participate in the pre-test, provide the name and contact information for the center’s director, and identify the center’s hours of operation. The same procedures that were used in FACES 2009 (OMB number 0970-0151) will be used to ensure the privacy of the information provided by program directors and OSCs.

A11. Sensitive Questions

The SWYC Instrument includes items about tobacco, alcohol, and substance use in the home; parents’ depressive symptoms; and family tension. Some parents may consider these questions to be sensitive in nature; all parents will be told that they can skip questions they prefer not to answer.

A12. Estimation of Information Collection Burden (Newly Requested Information Collections)

The proposed data collection does not impose a financial burden on respondents nor will respondents incur any expense other than the time spent participating.

The estimated annual burden for study respondents is listed in Table A.3. The total annual burden is expected to be 185 hours for all of the instruments.

Table A.3. Total Burden Requested Under This Information Collection

Instrument | Total Number of Respondents | Annual Number of Respondents (annualized over the 3-year generic clearance period) | Number of Responses per Respondent | Average Burden Hours per Response | Annual Burden Hours | Average Hourly Wage | Total Annual Cost

Program Recruitment Script and Screener | 10 | 3 | 1 | 0.33 | 1 | $22.01 | $22.01

Classroom Selection Form | 8 | 3 | 1 | 0.33 | 1 | $22.01 | $22.01

Direct Child Assessment | 450 | 150 | 1 | 1.00 | 150 | N/A | N/A

Parent Survey | 450 | 150 | 1 | 0.17 | 26 | $22.01 | $572.26

Cognitive Interview | 40 | 13 | 1 | 0.50 | 7 | $22.01 | $154.07

Estimated Total | | | | | 185 | | $770.35


Total Annual Cost

To compute the total estimated annual cost, the annual burden hours for adult respondents (that is, excluding the hours for the direct child assessment) were multiplied by the average hourly wage for parents. We used $22.01 per hour, the average hourly wage reported by the Bureau of Labor Statistics, Current Employment Statistics Survey, 2012.
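The arithmetic behind Table A.3 can be reproduced directly from the table’s own figures. The following is a minimal sketch (not part of the clearance package) that recomputes the annual burden hours and the total annual cost shown above; it assumes, consistent with the table, that annual burden hours are rounded up to whole hours and that the direct child assessment carries no wage cost.

```python
import math

# (instrument, annual respondents, responses per respondent, hours per response, hourly wage)
rows = [
    ("Program Recruitment Script and Screener",   3, 1, 0.33, 22.01),
    ("Classroom Selection Form",                  3, 1, 0.33, 22.01),
    ("Direct Child Assessment",                 150, 1, 1.00, None),  # child time is not costed
    ("Parent Survey",                           150, 1, 0.17, 22.01),
    ("Cognitive Interview",                      13, 1, 0.50, 22.01),
]

total_hours = 0
total_cost = 0.0
for name, respondents, responses, hours_per_response, wage in rows:
    hours = math.ceil(respondents * responses * hours_per_response)  # whole annual burden hours
    total_hours += hours
    if wage is not None:
        total_cost += hours * wage

print(total_hours)           # 185 annual burden hours
print(round(total_cost, 2))  # 770.35 total annual cost
```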

A13. Cost Burden to Respondents or Record Keepers

Not applicable. There are no additional costs to respondents; they spend only their time to participate in the study.

A14. Estimate of Cost to the Federal Government

The annual cost to the federal government of contacting the 10 Head Start programs, recruiting participants, conducting child assessments and parent interviews, analyzing the collected data, summarizing findings in response to the study’s research questions, and developing a final assessment battery is estimated to be $76,549, including direct and indirect costs and fees.

A15. Change in Burden

This is an additional request under the pre-testing generic clearance (0970-0355).

A16. Plan and Time Schedule for Information Collection, Tabulation, and Publication

There are no plans for publishing the data gathered from the FACES Pilot Study. The data that are collected are for internal use only. Findings from the scoring, analysis, and tabulation of data will be shared only with ACF staff. The assessment package and all training materials developed for this pre-test may be shared with others. The web version of the SWYC will be designed to be shared with others. The information collected will be for internal use only; however, information might be included as a methodological appendix or footnote in a report containing data from a larger data collection effort.

The pilot study will take place over a three-month period, commencing upon OMB approval and ending by March 2014. Recruitment and data collection activities are slated to occur prior to March 2014. All analysis and reporting activities will take place in March 2014.

The analysis process will include (1) scoring and analysis of the measures in the assessment battery, (2) analysis of the parent survey data, and (3) comparison of scores on the child vocabulary and language assessments to relevant at-risk indicators derived from the parent survey.

We will develop scores (i.e., raw scores and normative scores) for the vocabulary and language measures by following the publishers’ procedures. We will examine the distributional properties of each score (means, ranges, standard deviations, and the percentage of cases with zero or perfect scores) for the full pilot study sample and for subgroups defined by age and language. We will also examine the internal consistency of the items that are used to form the scores. Following these initial analyses, we will examine correlations between the scores from the four assessment measures to answer the specified research questions. For example, we will compare correlations between the PPVT-4, the PLS-5, and the ROWPVT-4 SBE. A high correlation between scores on the ROWPVT-4 SBE and the PPVT-4, and between the ROWPVT-4 SBE and the PLS-5, would provide support for using the ROWPVT-4 SBE as the single conceptual measure of Spanish-speaking children’s receptive vocabulary in future rounds of FACES and for phasing out the PPVT-4 and the Test de Vocabulario en Imagenes Peabody (TVIP; Dunn et al. 1986) that are currently in use. We will also examine correlations between ROWPVT-4/ROWPVT-4 SBE scores administered in different ways to assess whether the independent English and Spanish administrations yield scores that are comparable to the publisher’s standard administration. If we find that they do, we would have the ability to report on children’s overall receptive language skills as well as their skills in each of the two languages (English and Spanish).
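As a minimal sketch of these descriptive and correlational steps (not the study’s actual analysis code), the example below computes distributional summaries for the full sample and for age/language subgroups, along with the correlation matrix among assessment scores. The column names (for example, ppvt4_raw) are hypothetical placeholders for the scored measures described above.

```python
import pandas as pd

SCORE_COLS = ["ppvt4_raw", "pls5_ac_raw", "rowpvt4_sbe_raw"]  # hypothetical score columns

def describe_scores(df, score_cols=SCORE_COLS, group_cols=("age_group", "home_language")):
    """Means, standard deviations, and ranges for the full sample and for
    age/language subgroups."""
    overall = df[score_cols].agg(["mean", "std", "min", "max"])
    by_group = df.groupby(list(group_cols))[score_cols].agg(["mean", "std"])
    return overall, by_group

def score_correlations(df, score_cols=SCORE_COLS):
    """Pairwise Pearson correlations among the assessment scores (for example,
    between the conceptually scored ROWPVT-4 SBE and the PPVT-4 or PLS-5)."""
    return df[score_cols].corr(method="pearson")

# Hypothetical usage once the scored pilot data are available:
# scores = pd.read_csv("pilot_scores.csv")
# overall, by_group = describe_scores(scores)
# corr_matrix = score_correlations(scores)
```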

Analysis of the parent survey data will be conducted separately for each administration mode (telephone versus web) and will focus on the psychometric properties of the SWYC. Specifically, the analysis will examine the range of responses and scores, correlations of scores with age and language, and internal consistency (including correlations between individual items and the total score), and will use exploratory factor analysis to determine whether the survey comprises one or multiple factors. These analyses will be performed using the data from the full sample and separately by language group and administration mode.
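The internal-consistency and exploratory factor analysis steps could look like the sketch below (again illustrative, with hypothetical column names such as swyc_item_1, language, and mode); the factor analysis uses scikit-learn’s FactorAnalysis as a stand-in for whatever estimation method is ultimately chosen.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def item_total_correlations(items: pd.DataFrame) -> pd.Series:
    """Corrected item-total correlations (each item vs. the sum of the remaining items)."""
    totals = items.sum(axis=1)
    return pd.Series({col: items[col].corr(totals - items[col]) for col in items.columns})

def factor_loadings(items: pd.DataFrame, n_factors: int = 2) -> np.ndarray:
    """Loadings (items x factors) from a simple exploratory factor analysis."""
    fa = FactorAnalysis(n_components=n_factors, random_state=0)
    fa.fit(items.to_numpy())
    return fa.components_.T

# Hypothetical usage, repeated within each language-by-mode cell:
# swyc = pd.read_csv("swyc_responses.csv")
# for (language, mode), cell in swyc.groupby(["language", "mode"]):
#     items = cell.filter(like="swyc_item_")
#     print(language, mode, round(cronbach_alpha(items), 2))
```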

We will examine correlations between scores on the direct child assessments of vocabulary and language and indicators of at-risk status based on parent reports. In general, we would expect children who are at risk to perform more poorly on these measures of development than other children. Again, we will examine these relationships for English- and Spanish-speaking children and parents and by administration mode.

Analysis of the pre-test data will serve three purposes: (1) identifying language and vocabulary measures that are reliable and valid for DLL children and that reduce the receptive vocabulary assessment burden on children, (2) assessing the appropriateness of the SWYC for future use with English-speaking and Spanish-speaking Head Start parents, and (3) comparing the feasibility of administering the SWYC via web and phone. We will produce two products based on the analysis: (1) a report to ACF that summarizes the results of the pre-test and (2) a final web version of the SWYC with built-in functions that permit scoring and converting results into portable document format (PDF) files.

The study’s final report will be designed as an internal document for ACF and will discuss the following:

  • How well the measures, instruments, and data collection methods worked with children who speak only or predominantly one language or the other

  • Associations between the four language/vocabulary measures and recommendations for assessing language/vocabulary in future FACES cohorts

  • Whether the assessments and parent survey performed comparably across ages and languages

  • The psychometric properties of the English and Spanish versions of the SWYC, including a comparison of the two administration modes (i.e., web and phone)

  • A summary of revisions made to the SWYC based on the pilot and recommendations for its use for large-scale data collection

  • Considerations needed before moving forward with large-scale data collection

A17. Reasons Not to Display OMB Expiration Date

All instruments will display the OMB approval number and expiration date.

A18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.



REFERENCES

Brownell, R. Receptive One-Word Picture Vocabulary Test, Spanish-Bilingual Edition (Manual). Novato, CA: Academic Therapy Publications, 2012.

Duncan, Sharon, and Edward DeAvila. “Pre-Language Assessment Scales [PreLAS2000].” Monterey, CA: CTB-McGraw Hill, 2002.

Dunn, Lloyd, and Doug Dunn. Peabody Picture Vocabulary Test, Fourth Edition. Circle Pines, MN: American Guidance Service, 2006.

Dunn, L.M., E.R. Padilla, D.E. Lugo, and L.M. Dunn. Test de Vocabulario en Imagenes Peabody. Circle Pines, MN: American Guidance Service, 1986.

Hernandez, D. “Young Hispanic Children in the U.S.: A Demographic Portrait Based on Census 2000.” Tempe, AZ: Arizona State University, June 26, 2006.

Hulsey, L. K., Aikens, N., Kopack, A., West, J., Moiduddin, E., and Tarullo, L. (2011). Head Start Children, Families, and Programs: Present and Past Data from FACES. OPRE Report 2011-33a. Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Martin, N.A., and R. Brownell. Receptive One-Word Picture Vocabulary Test, Fourth Edition (ROWPVT-4). Novato, CA: Academic Therapy Publications, 2011.

Sheldrick, R.C., B.S. Henson, S. Merchant, E.N. Neger, J.M. Murphy, and E.C. Perrin. “The Preschool Pediatric Symptom Checklist (PPSC): Development and Initial Validation of a New Social-Emotional Screening Instrument.” Academic Pediatrics, vol. 12, no. 5, September-October 2012a, pp. 456−67. PMID: 22921494.

Sheldrick, R.C., E. Neger, and E.C Perrin. “Concerns about Development, Behavior and Learning Among Parents Seeking Pediatric Care.” Journal of Developmental and Behavioral Pediatrics, vol. 33, no. 2, February 2012b, pp. 156−160.

West, J., and L. Hulsey. “Who is Served by Head Start? Changes in the 3-Year-Old Population.” Presentation at the Society for Research in Child Development Biennial Meeting, Denver, April 3, 2009.

Zimmerman, Irla Lee, Violette G. Steiner, and Roberta Evatt Pond. Preschool Language Scale, Fifth Edition (PLS-5). San Antonio, TX: Pearson Assessments, 2011.





1 We will use both the English edition (ROWPVT-4; Martin and Brownell 2011) and the Spanish-Bilingual Edition (ROWPVT-4 SBE; Brownell 2012) for the FACES Pilot Study.

2 Programmed versions of the preLAS 2000 Art Show and the PPVT-4 have been used in FACES 2009 and several other Mathematica studies.

3 Children will respond to items displayed to them on an easel and the interviewer will record responses either on the computer or on a score sheet.
