Att_Amendment to PEELS.PartB.OMB Supporting Statement

Pre-Elementary Education Longitudinal Study (PEELS) (SC)

OMB: 1850-0809


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Sampling Methods


To address undercoverage of the sample in one geographic region, a supplemental sample was added in Wave 2. Twenty-four additional districts were sampled, and 15 (63%) were recruited for the supplemental sample. The final Wave 1 participation rate was 27 percent for districts in the main sample (189 of 709 districts) and 59 percent for the nonresponse sample (19 of 32 districts). When the supplemental sample is combined with these, the overall district participation rate is 29 percent. The results of the nonresponse study referenced in the 2004 Supporting Statement were submitted to OMB in fall 2004.
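
The district-level rates above follow directly from the counts quoted; the short Python sketch below simply reproduces that arithmetic. How the 29 percent overall figure pools the three samples is our assumption, since the text does not spell out the formula.

    # Arithmetic behind the district participation rates quoted above.
    main_recruited, main_sampled = 189, 709    # Wave 1 main sample
    nr_recruited, nr_sampled = 19, 32          # nonresponse study sample
    supp_recruited, supp_sampled = 15, 24      # Wave 2 supplemental sample

    main_rate = main_recruited / main_sampled          # 26.7%, reported as 27 percent
    nr_rate = nr_recruited / nr_sampled                # 59.4%, reported as 59 percent
    supp_rate = supp_recruited / supp_sampled          # 62.5%, reported as 63 percent

    # Pooling all three samples (our assumption about how the overall figure was
    # computed) gives 223 / 765 = 29.2%, reported as 29 percent.
    overall_rate = (main_recruited + nr_recruited + supp_recruited) / (
        main_sampled + nr_sampled + supp_sampled)

    print(f"main {main_rate:.1%}, nonresponse {nr_rate:.1%}, "
          f"supplemental {supp_rate:.1%}, overall {overall_rate:.1%}")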


The 15 supplemental districts provided lists of children meeting the criteria for the three PEELS age cohorts as defined in Wave 1. A supplemental sample of 544 children was selected. We received enrollment forms for 433 children. Of those, 289 children were eligible for PEELS, and 144 were ineligible. We received signed consent forms from 198 families (68% of known eligible children). This was 85 percent of the 233 families anticipated.


A total of 3,104 children and their families have been recruited for PEELS when the Wave 1 sample (main and nonresponse) and the supplemental sample are combined.


2. Sample Design Procedures


Similar procedures were used to obtain lists of eligible children from the main and supplemental districts. Site Coordinators working in these local education agencies (LEAs) provided enrollment lists of children who met the age criteria for the three cohorts, had individualized education programs (IEPs), and were enrolled in preschool or kindergarten as of March 1, 2003. Site Coordinators were responsible for completing enrollment forms for all children selected from the lists and for recruiting the families of children they determined to be eligible.


Exhibit 3 projects the expected numbers of respondents for Waves 3 and 4 based on the actual Wave 2 response rates (which were higher than the 2004 projections) and the assumed 5 percent yearly attrition rate.


Exhibit 3. Number of Study Participants, Interviews, and Assessments by Wave

Number participating in Wave 1: 2,906
Number participating in Wave 2: 3,104
Yearly attrition rate: 5%
Number of parent interviews in Wave 1: 2,802
Number of parent interviews in Wave 2: 2,893
Number of parent interviews in Wave 3: 2,748
Number of parent interviews in Wave 4: 2,611
Number of assessments in Wave 1: 2,792
Number of assessments in Wave 2: 2,932
Number of assessments in Wave 3: 2,786
Number of assessments in Wave 4: 2,646

Notes: Numbers shown are actual for Waves 1 and 2; Wave 3 and 4 figures are projections. An additional 197 children were added to the study in Wave 2.
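
The Wave 3 and 4 figures in Exhibit 3 follow from the actual Wave 2 counts and the 5 percent yearly attrition assumption. The sketch below (Python) reproduces that projection arithmetic; the exact rounding convention is assumed, and the computed values agree with the exhibit to within one case.

    # Sketch of the projection arithmetic behind the Wave 3 and 4 rows of Exhibit 3:
    # each later wave is assumed to retain 95 percent of the prior wave's count.
    ATTRITION = 0.05

    def project(wave2_count, waves=2):
        """Carry a Wave 2 count forward under the 5 percent yearly attrition rate."""
        projections, current = [], float(wave2_count)
        for _ in range(waves):
            current *= 1 - ATTRITION
            projections.append(round(current))
        return projections

    # Parent interviews: 2,893 in Wave 2 -> about 2,748 (Wave 3) and 2,611 (Wave 4)
    print(project(2893))
    # Assessments: 2,932 in Wave 2 -> about 2,785 (Wave 3) and 2,646 (Wave 4);
    # Exhibit 3 shows 2,786 for Wave 3, a one-case difference in rounding.
    print(project(2932))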



3. Maximizing Response Rates


Maximizing the number of sample members for whom data are collected has two key aspects: minimizing the number of sample members lost through attrition, and completing data collection with as many of the retained sample members as possible.


To maintain the number of LEAs participating in Wave 2, we contacted the districts that recruited families in Wave 1, confirmed their continuing participation, and confirmed the name of a returning or new Site Coordinator in each district. We will do the same for all districts prior to the start of Waves 3 and 4 data collection.


To minimize sample attrition over the waves of data collection, IES plans to use aggressive tracking mechanisms to maintain accurate and up-to-date contact information for sample members. Site Coordinators will receive an incentive for returning the CSR in Waves 3 and 4. For each child enrolled in the study, the Site Coordinator will confirm that the participating child is still enrolled at the school, provide the name of the child’s current teacher, and/or identify the school where the child has transferred. In addition, the parent interviews include information that will facilitate tracking of parents/guardians, such as additional work and home telephone numbers for the respondents, location information for one or more friends or relatives who would know where the family had moved, and e-mail addresses.


Maximizing the number of sample members for whom data are collected can be achieved in several ways. For the parent interview, which is administered through computer-assisted telephone interviewing (CATI), the following procedures are used to maximize the completion rate:

  • Provide a toll-free number for respondents to call to verify the study’s legitimacy or to ask other questions about the study. Those without phones in their homes also can call this number from any location and have the interview conducted at that time.

  • Require many unsuccessful call attempts to a number before considering whether to treat the case as “unable to contact.”

  • Draw a core of interviewers with experience working on telephone surveys of households, particularly interviewers who have proven their ability to obtain cooperation from a high proportion of sample members.

  • Require all interviewers to successfully complete training specific to this study, including discussions of how to avoid inviting a refusal, approaches that will help in addressing questions respondents are likely to ask, and how to counter objections.

  • Use call scheduling procedures that are designed to call numbers at different times of the day and week, to improve the chances of finding a respondent at home.

  • Make every reasonable effort to obtain an interview at the initial contact, but allow respondents flexibility in scheduling appointments to be interviewed.

  • Closely supervise interviewers during data collection.

  • Implement refusal conversion efforts for first-time refusals and use interviewers who are skilled at refusal conversion.

  • Conduct silent monitoring of interviews to identify and promptly correct behaviors that could be inviting refusals or otherwise contributing to low cooperation rates.

  • Leave a message when an answering machine is encountered repeatedly, letting the respondent know the call is part of a research study rather than a marketing effort.


To increase response rates for questionnaires in Waves 1 and 2, we sent reminder postcards, remailed questionnaires, and called nonrespondents on a fixed follow-up schedule tied to the date the initial questionnaire was mailed. In addition, postage-paid, pre-addressed envelopes were included with all mailings to facilitate return of completed forms. Incentives for teachers, principals, program directors, and district officials (see Section A, item 9) also contributed to improved response rates. The same protocol will be followed for Waves 3 and 4, except that the Elementary School Principal and Early Childhood Program Director Questionnaires, which presented the greatest challenges to achieving adequate response rates, will be dropped.


To address concerns about nonresponse bias, OSEP funded a comprehensive nonresponse study in which Westat selected a sample of 32 LEAs that did not participate in Wave 1. Twenty-five of those LEAs (78%) originally agreed to participate in the study, and 23 ultimately recruited one or more families. The sampling procedures, instruments, and data collection procedures were identical for the main and nonresponse study participants, so any differences between the two samples can be attributed to differences in the characteristics of the subpopulations they represent. Combining the main and nonresponse study samples yields unbiased estimates because, together, the two samples represent the whole population. Statistical tests comparing these unbiased combined estimates with the PEELS main-sample estimates revealed no systematic nonresponse bias.
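
The logic of this check can be illustrated with a small sketch: the weighted estimate from the combined (main plus nonresponse study) sample serves as the approximately unbiased benchmark, and the main-sample estimate is compared against it. The Python fragment below uses made-up scores and weights and a deliberately simple z-style comparison; it is not the actual Westat procedure, and a production analysis would also account for the overlap between the main and combined samples and use the study's actual survey weights.

    # Illustrative sketch only: hypothetical scores and weights, not PEELS data.
    import math

    def weighted_mean_and_se(values, weights):
        """Weighted mean with a crude standard error based on an effective sample size."""
        total_w = sum(weights)
        mean = sum(v * w for v, w in zip(values, weights)) / total_w
        var = sum(w * (v - mean) ** 2 for v, w in zip(values, weights)) / total_w
        n_eff = total_w ** 2 / sum(w ** 2 for w in weights)
        return mean, math.sqrt(var / n_eff)

    # Hypothetical assessment scores and sampling weights
    main_scores, main_weights = [98, 102, 95, 110, 101], [1.2, 0.8, 1.0, 1.1, 0.9]
    nr_scores, nr_weights = [96, 104, 99], [1.5, 1.4, 1.6]

    main_mean, main_se = weighted_mean_and_se(main_scores, main_weights)
    comb_mean, comb_se = weighted_mean_and_se(main_scores + nr_scores,
                                              main_weights + nr_weights)

    # A large |z| would suggest systematic nonresponse bias in the main sample.
    z = (main_mean - comb_mean) / math.sqrt(main_se ** 2 + comb_se ** 2)
    print(f"main = {main_mean:.1f}, combined = {comb_mean:.1f}, z = {z:.2f}")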


4. Testing Instrumentation


The assessments proposed for Wave 4 are off-the-shelf instruments with published psychometric data, so no additional testing of those instruments is anticipated. The shortened version of the parent interview will be tested extensively by Westat to ensure that item displays, skip patterns, and data storage operate correctly. This testing will involve several staff members from the Westat Telephone Research Center, PEELS computer programmers, and analysts working through specified respondent scenarios designed to cover each possible skip route. Because no new items are proposed, we do not anticipate testing the instruments with parents; the proposed items have worked well in the previous three waves of data collection.
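
As an illustration of how scenario-based testing can confirm that every skip route is exercised, the sketch below encodes a small routing map and walks each test scenario through it; the item names, answers, and routing rules are hypothetical, not the actual PEELS parent interview.

    # Illustrative sketch: hypothetical items and routing rules, not the PEELS instrument.
    # Each item maps an answer to the next item; None marks the end of the section,
    # and "*" means any answer follows the same route.
    ROUTING = {
        "A1": {"yes": "A2", "no": "A4"},   # a "no" answer skips items A2-A3
        "A2": {"*": "A3"},
        "A3": {"*": "A4"},
        "A4": {"*": None},
    }

    def walk(scenario):
        """Return the sequence of items a respondent with these answers would see."""
        path, item = [], "A1"
        while item is not None:
            path.append(item)
            answer = scenario.get(item, "*")
            item = ROUTING[item].get(answer, ROUTING[item].get("*"))
        return path

    # Specified respondent scenarios, each designed to exercise one route
    scenarios = [
        {"A1": "yes"},   # full route: A1 -> A2 -> A3 -> A4
        {"A1": "no"},    # skip route: A1 -> A4
    ]
    covered = set()
    for s in scenarios:
        covered.update(walk(s))
    assert covered == set(ROUTING), f"untested items: {set(ROUTING) - covered}"
    print("all items reached by the test scenarios")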


5. Individuals Consulted on Statistical Issues


Persons involved in statistical aspects of the design include staff of the government’s design contractors, SRI International, Research Triangle Institute, and Westat. Those consulted at these organizations are listed below.

SRI:

Dr. Harold Javitz, Senior Statistician


Westat:

Dr. Hyunshik Lee

Dr. Annie Lo

Dr. Frank Jenkins


In addition, all aspects of the design, sampling plan, and instrumentation were reviewed by the original PEELS Technical Work Group (TWG) and consultants.



Attachment A


Corresponding Items from the Elementary School Principal, Early Childhood Program Director, and QED Files

Table A-1. Elementary School Principal Questionnaire Items and Corresponding QED Items


Questionnaire item number—item summary | QED match | Comments

A1 – School type (e.g., regular, special education, magnet, charter, alternative) | Exact |
A2 – School type (e.g., public, private, residential/boarding, home school) | Similar | QED does not identify residential/boarding schools or home schools
A3 – Grade levels taught | Exact |
A4 – Total school enrollment | Exact |
A5 – Pre-K enrollment | Exact |
A6 – Metropolitan status | Similar | Categories differ based on variations in the definition of medium-sized city and large city
A7 – School designated as in need of improvement or low-performing | None |
B1 – Racial/ethnic breakdown of students | Similar | QED groups Asian and Native Hawaiian or Other Pacific Islander into one category; PEELS uses two categories
B2 – Number of students identified as ELL | Similar | QED identifies schools that have classes to assist students identified as ELL
B3 – Percentage of students from low-income families | Exact | QED variable is continuous and could be recoded to match B3 categories or left as continuous
B4 – Number of expulsions, out-of-school suspensions, in-school suspensions, incidents of violence | None |
B5 – Categorical breakdown of students with IEPs | None |
C1 – Number of personnel | Similar | QED provides a total number of full-time classroom teachers, not a breakdown by personnel type
C2 – Percentage of teachers: fully credentialed, in their first year, with less than 3 years teaching experience | None |
C3 – Services, resources, programs offered by school | Similar | QED collects data on before/after school, extended day, gifted and talented, mentoring programs, and English as a second language
C4 – Service options for special education students | None |
D1 – Educational philosophy | None |
D2 – Years services first provided to preschool children with disabilities | None |
D3 – Way children with and without disabilities brought together | None |
E1 – Formal and systematic written procedures for providing alternatives to students not yet receiving services | None |
E2 – Meetings and teams involved in process for E1 | None |
E3 – Resources available to general education teachers | None |
E4 – Accommodations, modifications, supports, and learning aids provided to students with disabilities | None |
E5 – Participants in IEP or 504 plan development and review | None |
E6 – School’s practice regarding mandated standardized tests for students with disabilities | None |
E7 – Process for deciding which standardized tests are given to students with disabilities | None |
E8 – How scores of special education students were treated | None |
E9 – How students with disabilities were addressed in school’s academic content standards | None |
E10 – Alternative services for students who are expelled and/or suspended | None |
E11 – School policy on promotion of students performing poorly | None |
F1 – Forms of communication between parents and staff | None |
F2 – Opportunities that promote parent involvement | None |
F3 – Supports provided to support transition into kindergarten or elementary school | None |

Additional QED variables of interest: instructional dollars spent per pupil, excluding teacher salaries, and Orshansky poverty index


Table A-2. Early Childhood Program Director Questionnaire Items and Corresponding QED Items

Questionnaire item number—item summary | QED match | Comments

A1 – School type (e.g., public education agency, public agency - other, private nonprofit, private for-profit) | Similar | QED has an item called center type with the following categories: day care, Head Start, Montessori, and preschool
A2 – Head Start grantee | Exact |
A3 – Program size (e.g., single site, part of larger agency, part of multi-service agency) | None |
A4 – Parents charged fee for services | None |
A5 – Sliding scale based on parent income | None |
A6 – Waivers/alternative sources of payment for some parents | None |
A7 – Metropolitan status | Similar | Categories differ based on variations in the definition of medium-sized city and large city
A8 – License or accreditation | None |
A9 – Types of programs/classrooms offered | Similar | QED item: Day Care Special Characteristics, with the following categories:
    B = Before/After School (regular day care for school-age children)
    C = Before/After School, independent program at a school
    D = Before/After School, regular care and physically handicapped care
    E = Before/After School, regular care and mentally handicapped care
    F = Migrant Head Start
    I = Native American Head Start
    P = Physically Handicapped Care (exclusively)
    M = Mentally Handicapped Care (exclusively)
    V = Parent/Child Head Start
A10 – Educational philosophy | None |
A11 – Years in operation | None |
A12 – Years program served children with disabilities | None |
A13 – Agency established for specific purpose of providing services to children with disabilities | None |
A14 – Way children with and without disabilities brought together | None |
A15 – Forms of communication between parents and staff | None |
A16 – Opportunities that promote parent involvement | None |
A17 – Transition of children with disabilities from EI | None |
A18 – Supports provided to support transition from EI | None |
A19 – Supports provided to support transition into kindergarten or other preschools | None |
A20 – Head Start grantee, provider of special education and related services | Exact |
A21 – Services provided to children with IEPs | None |
A22 – Location of services identified in A21 | None |
A23 – Types of personnel employed | None |
B1 – Pre-K enrollment | Exact |
B2 – Categorical breakdown of students with IEPs | None |
B3 – Percentage of students from low-income families | Exact | QED variable is continuous and could be recoded to match B3 categories or left as continuous
B4 – Number of students identified as ELL | Similar | QED identifies schools that have classes to assist students identified as ELL
B5 – Racial/ethnic breakdown of students | Similar | QED groups Asian and Native Hawaiian or Other Pacific Islander into one category; PEELS uses two categories
C1 – Number of personnel | None |
C2 – Number of personnel providing direct services | Similar | QED provides a total number of full-time classroom teachers
C3 – Number of personnel providing direct services to children with IEPs | None |
C4 – Number of personnel providing direct services to children with IEPs who left in last 12 months | None |
C5 – Number of unfilled staff positions | None |
C6 – Employee benefits | None |
C7 – Preparation of whole staff to work with preschoolers with disabilities | None |
C8 – Preparation of special education and related services staff to work with preschoolers with disabilities | None |
C9 – How staff come together outside of IEP meetings | None |
D1 – Educational degree(s) of respondent | None |
D2 – Professional license(s) or certificate(s) held by respondent | None |
D3 – Disability status of respondent’s immediate family members | None |
D4 – Sex/gender of respondent | None |
D5 – Race of respondent | None |
D6 – Hispanic origin of respondent | None |
D7 – Age of respondent | None |

Attachment B: Wave 4 Assessments


Assessment | Description

Woodcock-Johnson III - Letter-Word Identification (included in Waves 1-3) | Measures word identification skills
Woodcock-Johnson III - Applied Problems (included in Waves 1-3) | Measures ability to analyze and solve math problems
Woodcock-Johnson III - Calculation | Measures ability to perform mathematical computations
Woodcock-Johnson III - Passage Comprehension | Measures reading ability and understanding of written material in context
Peabody Picture Vocabulary Test III (PPVT III) (included in Waves 1-3) | Measures receptive language ability
Dynamic Indicators of Basic Early Literacy Skills (DIBELS) - Oral Reading Fluency | Measures accuracy and fluency with connected text
Language Assessment Scales – Oral (LAS-O) | Measures listening and speaking abilities; to be administered only to the nine children who took the Spanish version of the assessment in previous waves, in order to measure their English language proficiency

Attachment C: Data Collection and Reporting Schedule

Table B-1. Data Collection and Reporting Schedule, 2006, by Month

Activity | Month(s)
Conduct Wave 3 Parent Interviews | 1-6
Conduct Wave 3 Teacher Questionnaires | 1-6
Conduct Wave 3 Child Assessments | 1-6
Conduct Wave 3 Principal/Program Director Questionnaires# |
Release Wave 1 Overview Report | 2
Post Wave 1 Static Web Tables | 2
Submit Wave 2 Overview Report | 6
Deliver Wave 2 CATI Report | 5
Deliver Wave 2 Mail Questionnaire Report | 5
Deliver Wave 2 Assessment Report | 6
Finalize Thematic Reports | 4
Put DAS into Operation | 11
Complete Wave 3 Methods Report | 11

# Only to new principals/program directors, i.e., those with PEELS children enrolled for the first time.



Table B-2. PEELS Data Collection and Reporting Schedule, 2007, by Month

Activity | Month(s)
Conduct Wave 4 Parent Interviews | 1-6
Conduct Wave 4 Teacher Questionnaires | 1-6
Conduct Wave 4 Child Assessments | 1-6
Mail Newsletter | 3
Submit Wave 3 Overview Report* |
Deliver Wave 3 CATI Report | 5
Deliver Wave 3 Mail Questionnaire Report | 5
Deliver Wave 3 Assessment Report | 6
Finalize Thematic Reports | 4
Complete Wave 4 Methods Report | 11

# Only to new principals/program directors, i.e., those with PEELS children enrolled for the first time.
* As approved in Annual Analysis and Reporting Plan



Table B-3. PEELS Data Collection and Reporting Schedule, 2008, by Month

Activity | Month(s)
Mail Newsletter | 3
Update DAS | 4
Submit Wave 4 Overview Report* |
Deliver Wave 4 CATI Report | 5
Deliver Wave 4 Mail Questionnaire Report | 5
Deliver Wave 4 Assessment Report | 6
Finalize Thematic Reports | 4

* As approved in Annual Analysis and Reporting Plan



Table B-4. PEELS Data Collection and Reporting Schedule, 2009, by Month

Activity | Month(s)
Conduct Wave 5 Parent Interviews | 1-6
Conduct Wave 5 Teacher Questionnaires | 1-6
Conduct Wave 5 Child Assessments | 1-6
Mail Newsletter | 3
Update DAS | 4
Submit Wave 4 Overview Report* |
Deliver Wave 4 CATI Report | 5
Deliver Wave 4 Mail Questionnaire Report | 5
Deliver Wave 4 Assessment Report | 6
Finalize Thematic Reports | 4
Complete Wave 5 Methods Report | 11

# Only to new principals/program directors, i.e., those with PEELS children enrolled for the first time.
* As approved in Annual Analysis and Reporting Plan



Table B-5. PEELS Data Collection and Reporting Schedule, 2010, by Month

Activity | Month(s)
Update DAS | 4
Submit Wave 5 Overview Report* |
Deliver Wave 5 CATI Report | 5
Deliver Wave 5 Mail Questionnaire Report | 5
Deliver Wave 5 Assessment Report | 6
Finalize Thematic Reports | 4

* As approved in Annual Analysis and Reporting Plan.

