



Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011)


Spring First-Grade and Fall Second-Grade National Data Collections



OMB Clearance Package

# 1850-0750 v.10




Supporting Statement

Part A





Prepared by

National Center for Education Statistics

U.S. Department of Education

July 25, 2011

Revised November 23, 2011



Table of Contents

Chapter Page


A Justification A-1


A.1 Circumstances Making Collection of Information Necessary A-1


A.1.1 Purpose of This Submission A-1

A.1.2 Legislative Authorization A-3

A.1.3 Prior Related Studies A-3

A.1.4 ECLS-K:2011 Study Design for the Spring First-Grade Data Collection A-4

A.1.5 ECLS-K:2011 Study Design for the Fall Second-Grade Data Collection A-6


A.2 Purposes and Uses of the Data A-7


A.2.1 Research Issues Addressed in the ECLS-K:2011 A-8


A.3 Use of Improved Information Technology A-14

A.4 Efforts to Identify Duplication A-16

A.5 Method Used to Minimize Burden on Small Businesses A-17

A.6 Frequency of Data Collection A-18

A.7 Special Circumstances of Data Collection A-18

A.8 Consultants Outside the Agency A-18


A.9 Provision of Payments or Gifts to Respondents A-23


A.9.1 School Incentive A-23

A.9.2 School Administrator A-24

A.9.3 Teachers A-24

A.9.4 School Coordinators A-25


A.10 Assurance of Confidentiality A-25

A.11 Sensitive Questions A-29

A.12 Estimated Response Burden A-32

A.13 Estimates of Cost A-38

A.14 Annualized Cost to the Federal Government A-38

A.15 Reasons for Changes in Response Burden and Costs A-38

A.16 Publication Plans and Time Schedule A-39

Contents (continued)

Chapter Page


A.17 Approval for Not Displaying the Expiration Date for OMB Approval A-40

A.18 Exceptions to the Certification Statement A-40


B Collection of Information Employing Statistical Methods B-1


B.1 Universe, Sample Design, and Estimation B-1


B.1.1 Universe and Sample Design B-1

B.1.2 Precision Requirements and Sample Sizes B-2

B.1.3 Sample Design for Spring First Grade B-4

B.1.4 Sample Design for Fall Second Grade B-5


B.2 Procedures for the Collection of Information B-6


B.2.1 Spring First-Grade Data Collection B-6

B.2.2 Fall Second-Grade Data Collection B-12


B.3 Methods to Secure Cooperation, Maximize Response Rates, and Deal with Nonresponse B-16


B.3.1 Gaining Cooperation from a Variety of Sources B-17

B.3.2 Methods to Maximize Response Rates B-18


B.4 Individuals Responsible for Study Design and Performance B-23


C The ECLS-K:2011 Spring First-Grade and Fall Second-Grade Questionnaires C-1


C.1 Introduction C-1

C.2 ECLS-K:2011 Parent Interview C-2


C.2.1 Spring First-Grade Parent Interview C-3

C.2.1.1 Spring First-Grade Parent Interview: Research Questions C-3

C.2.1.2 Spring First-Grade Parent Interview: Construct Coverage C-4


C.2.2 Fall Second-Grade Parent Interview C-16

C.2.2.1 Fall Second-Grade Parent Interview: Research Questions C-16

C.2.2.2 Fall Second-Grade Parent Interview: Construct Coverage C-17

Contents (continued)

Chapter Page



C.3 School Administrator Questionnaire C-20


C.3.1 School Administrator Questionnaires: Research Questions C-21

C.3.2 School Administrator Questionnaire: Construct Coverage C-21


C.4 General Classroom Teacher Questionnaires C-28


C.4.1 Spring First-Grade General Classroom Teacher Questionnaires C-29

C.4.1.1 Spring First-Grade General Classroom Teacher Questionnaires: Research Questions C-29

C.4.1.2 Spring First-Grade General Classroom Teacher Questionnaires: Construct Coverage C-29

C.4.2 Fall Second-Grade General Classroom Teacher Questionnaire C-38

C.4.2.1 Fall Second-Grade General Classroom Teacher Questionnaire: Research Questions C-39

C.4.2.2 Fall Second-Grade General Classroom Teacher Questionnaire: Construct Coverage C-39


C.5 Special Education Teacher Questionnaires C-40

C.5.1 Special Education Teacher Questionnaires: Research Questions C-40

C.5.2 Special Education Teacher Questionnaires: Construct Coverage C-41


References C-43



Tables


A-1 Federal agency consultants for ECLS-K and ECLS-K:2011 A-20

A-2 Other organization consultants for ECLS-K A-21

A-3 ECLS-K:2011 First TRP meeting attendee list (November 2008) A-21

A-4 ECLS-K:2011 Second TRP meeting attendee list (March 2011) A-22

A-5 ECLS-K:2011 CRP member list A-22

A-6 Respondent burden chart for the national spring first-grade data collection, the fall second-grade data collection, and previously cleared data collection activities A-34

A-7 Estimated respondent burden for future clearance package related to recruitment for the spring third-grade data collection, sample tracking for the spring third-grade and spring fourth-grade data collections, and the spring second-grade national data collection A-37

A-8 Study costs per year A-38

A-9 Operational schedule for ECLS-K:2011 data collection activities A-41


Exhibits


A-1 Examples of important developments for the ECLS-K:2011 A-8

A-2 Confidentiality Pledge A-28



Appendixes


A Respondent Materials

B Parent Interviews

C General Classroom Teacher Questionnaires

D Special Education Teacher Questionnaires

E School Administrator Questionnaire

F Child Hearing Screening Questions

G Links Between Instrument Items, Covered Constructs, and Related Research Questions


A.1 Circumstances Making Collection of Information Necessary

A.1.1 Purpose of This Submission

The Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011), sponsored by the National Center for Education Statistics (NCES) within the Institute of Education Sciences (IES) of the U.S. Department of Education (ED), is a survey that focuses on children’s early school experiences beginning with kindergarten and continuing through the fifth grade. It includes the collection of data from parents, teachers, school administrators, and nonparental care providers, as well as direct child assessments. Like its sister study, the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 (ECLS-K),1 the ECLS-K:2011 is exceptionally broad in its scope and coverage of child development, early learning, and school progress, drawing together information from multiple sources to provide rich data about the population of children who were kindergartners in the 2010-11 school year. Fall and spring collections in the kindergarten year were conducted for NCES by Westat, with the Educational Testing Service (ETS) as the subcontractor developing the child assessments. Clearances for studying the ECLS-K:2011 cohort were granted for the fall 2009 field test data collection, fall 2010 and spring 2011 kindergarten national data collections, and fall 2011 first-grade national data collection (OMB No. 1850-0750).

This submission requests OMB’s approval for (1) a spring 2012 first-grade national data collection; (2) a fall 2012 second-grade data collection with the same 30 percent subsample for which data were collected in the fall 2011 first-grade collection; and (3) a 60-day Federal Register notice waiver for the next OMB clearance package, to be submitted in June 2012, for the spring 2013 second-grade data collection, recruitment for the spring 2014 third-grade data collection, and tracking students for the spring 2014 third-grade and spring 2015 fourth-grade data collections.2 The respondent materials for the spring 2012 first-grade collection, included in this package, are updated versions of the approved fall 2010 kindergarten and fall 2011 first-grade materials. The included fall 2012 second-grade materials are adapted from the approved fall 2011 first-grade materials. This submission also includes carry-over burden from the last approved package (OMB# 1850-0750 v.9) for the activities that will not be completed by the time the current package is expected to be approved.

A summary of the past, current, and anticipated future requests for OMB clearance for the ECLS-K:2011 under OMB number 1850-0750 is provided below:

Tracking

  • Fall 2010 kindergarten: NA

  • Spring 2011 kindergarten: Package 4 (1850-0750 v.8, approved 2/10/2010)

  • Fall 2011 first grade (30 percent subsample): Package 5 (1850-0750 v.9, approved 5/12/2011)

  • Spring 2012 first grade: Package 5 (1850-0750 v.9, approved 5/12/2011)

  • Fall 2012 second grade (30 percent subsample): Package 5 (1850-0750 v.9, approved 5/12/2011)

  • Spring 2013 second grade: Package 5 (1850-0750 v.9, approved 5/12/2011)

  • Spring 2014 third grade: Package 7 (1850-0750 v.11)

  • Spring 2015 fourth grade: Package 7 (1850-0750 v.11)

  • Spring 2016 fifth grade: Package 8 (1850-0750 v.12)

Field Test

  • Fall 2010 kindergarten: Packages 1-3 (1850-0750 v.5-7, approved 3/20/2009 – 9/18/2009)

  • Spring 2011 kindergarten: Packages 1-3 (1850-0750 v.5-7, approved 3/20/2009 – 9/18/2009)

  • Fall 2011 first grade (30 percent subsample): Packages 1-3 (1850-0750 v.5-7, approved 3/20/2009 – 9/18/2009)

  • Spring 2012 first grade: Packages 1-3 (1850-0750 v.5-7, approved 3/20/2009 – 9/18/2009), plus NCES Generic Package 1850-0803 v.43 and v.51

  • Fall 2012 second grade (30 percent subsample): Packages 1-3 (1850-0750 v.5-7, approved 3/20/2009 – 9/18/2009), plus NCES Generic Package 1850-0803

  • Spring 2013 second grade: Packages 1-3 (1850-0750 v.5-7, approved 3/20/2009 – 9/18/2009), plus NCES Generic Package 1850-0803 v.43 and v.51

  • Spring 2014 third grade: NCES Generic Package 1850-0803

  • Spring 2015 fourth grade: NCES Generic Package 1850-0803

  • Spring 2016 fifth grade: NCES Generic Package 1850-0803

Recruitment

  • Fall 2010 kindergarten: Package 4 (1850-0750 v.8, approved 2/10/2010)

  • Spring 2011 kindergarten: Package 4 (1850-0750 v.8, approved 2/10/2010)

  • Fall 2011 first grade (30 percent subsample): Package 5 (1850-0750 v.9, approved 5/12/2011)

  • Spring 2012 first grade: Package 5 (1850-0750 v.9, approved 5/12/2011)

  • Fall 2012 second grade (30 percent subsample): Package 5 (1850-0750 v.9, approved 5/12/2011)

  • Spring 2013 second grade: Package 5 (1850-0750 v.9, approved 5/12/2011)

  • Spring 2014 third grade: Package 7 (1850-0750 v.11)

  • Spring 2015 fourth grade: Package 8 (1850-0750 v.12)

  • Spring 2016 fifth grade: Package 9 (1850-0750 v.13)

Full-Scale Data Collection

  • Fall 2010 kindergarten: Package 4 (1850-0750 v.8, approved 2/10/2010)

  • Spring 2011 kindergarten: Package 4 (1850-0750 v.8, approved 2/10/2010)

  • Fall 2011 first grade (30 percent subsample): Package 5 (1850-0750 v.9, approved 5/12/2011)

  • Spring 2012 first grade: Package 6* (1850-0750 v.10)

  • Fall 2012 second grade (30 percent subsample): Package 6* (1850-0750 v.10)

  • Spring 2013 second grade: Package 7 (1850-0750 v.11)

  • Spring 2014 third grade: Package 8 (1850-0750 v.12)

  • Spring 2015 fourth grade: Package 9 (1850-0750 v.13)

  • Spring 2016 fifth grade: Package 10 (1850-0750 v.14)

* Current package.


The spring 2012 first-grade data collection includes questions on a topic new to the ECLS-K:2011: Response to Intervention (RtI) practices that might be used in schools and classrooms throughout the United States. Cognitive interviews were conducted in August 2011 to determine whether these survey items, most of which have been adapted from other studies of schools with established RtI practices, can be used in a study such as ECLS-K:2011 with a national sample of schools that may not have established RtI practices. The final versions of these items are included in both the general classroom teacher and school administrator questionnaires (found in appendices C and E respectively).

As part of the fall 2012 second-grade collection covered by this package, we are also requesting clearance to conduct hearing screenings. This package describes the procedures that will be used to conduct the screenings and includes related respondent materials in appendix A. Additionally, a short set of questions that will be asked of children to ensure that they can be safely screened (e.g., whether they currently experience ear pain) is included in appendix F.

Lastly, we are currently examining the feasibility of offering a web-based version of the school administrator questionnaire to schools that have participated in previous rounds. The questions in a web-based version of the questionnaire would be identical to those included in the hard-copy questionnaire submitted with this package, but would have some school-level information (e.g., grades served) prefilled with data collected in a prior round. The prefilled information would allow administrators to simply confirm its accuracy, and only make changes where necessary, thereby reducing overall administrator respondent burden. No information pertaining to a particular individual (e.g., administrator age) would be prefilled. Should NCES proceed with a web-based questionnaire, further details would be submitted to OMB as a change request in January 2012.

A.1.2 Legislative Authorization

ECLS-K:2011 is conducted by NCES in close consultation with other offices and organizations within and outside the U.S. Department of Education. ECLS-K:2011 is authorized by law under the Education Sciences Reform Act of 2002 (20 U.S. Code Section 9543):


  “The Statistics Center shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including -- (7) conducting longitudinal and special data collections necessary to report on the condition and progress of education;”

The Education Sciences Reform Act of 2002 (20 U.S. Code Section 9573) further states that:


All collection, maintenance, use, and wide dissemination of data by the Institute, including each office, board, committee, and center of the Institute, shall conform with the requirements of section 552a of title 5, the confidentiality standards of subsection (c) of this section, and sections 1232g and 1232h of this title.


A.1.3 Prior Related Studies

The ECLS-K:2011 is part of a longitudinal studies program. The two prior ECLS studies pertain to two cohorts—a kindergarten cohort and a birth cohort. Together these cohorts provide the range and breadth of data required to more fully describe and understand children’s education experiences, early learning, development, and health in the late 1990s, 2000s, and 2010s.


The birth cohort of the Early Childhood Longitudinal Study (ECLS-B) followed a national sample of children, born in the year 2001, from birth through kindergarten entry. The ECLS-B focused on the characteristics of children and their families that influence children’s school readiness and first experiences with formal schooling, as well as children’s early health and in- and out-of-home experiences.


The ECLS‑K followed a nationally representative cohort of children from kindergarten through eighth grade. The base year data were collected in the fall and spring of the 1998-99 school year, when the sampled children were in kindergarten. A total of 21,260 kindergartners throughout the nation participated by having a child assessment and/or parent interview conducted during that school year. Five more waves of data were collected: in fall and spring of the 1999-2000 school year when most, but not all, of the base year children were in first grade; in the spring of the 2001-02 school year when most, but not all, of the base year children were in third grade; in the spring of the 2003-04 school year when most, but not all, of the base year children were in fifth grade; and in the spring of the 2006-07 school year when most, but not all, of the base year children were in eighth grade.3


A.1.4 ECLS-K:2011 Study Design for the Spring First-Grade National Data Collection

The sample for the national ECLS-K:2011 is a representative sample of children who attended kindergarten in 2010-11 across the country. In the fall of 2010, children were selected using a multistage probability design. In the first stage, 90 primary sampling units (PSUs) that are counties or groups of counties were selected with probability proportional to size (PPS). In the second stage, public and private schools offering kindergarten were selected within the sampled PSUs, also with PPS, with private schools oversampled. The third-stage sampling units were children in kindergarten or children of kindergarten age in ungraded schools or classrooms. Children were selected within each sampled school using equal probability systematic sampling, with a higher sampling rate for Asian and Pacific Islanders (APIs) so as to achieve a minimum required sample size for APIs.
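
The multistage selection can be illustrated with a simplified sketch. The following Python fragment is purely illustrative: the frame variables (for example, sizes and is_api) and the sampling rates are hypothetical, and the actual ECLS-K:2011 sample was drawn with production survey sampling systems rather than code of this kind.

    import random

    def pps_systematic_sample(units, sizes, n):
        # Select n units with probability proportional to size (PPS) using
        # systematic sampling over a randomly ordered, cumulated measure of size.
        order = list(range(len(units)))
        random.shuffle(order)
        interval = float(sum(sizes)) / n
        start = random.uniform(0, interval)
        picks, cumulative, hits = [], 0.0, 0
        for k in order:
            cumulative += sizes[k]
            while hits < n and start + hits * interval <= cumulative:
                picks.append(units[k])      # very large units can be selected more than once
                hits += 1
        return picks

    def systematic_sample(units, rate):
        # Equal-probability systematic sampling of units at a fixed rate.
        interval = 1.0 / rate
        position = random.uniform(0, interval)
        picks = []
        for i, unit in enumerate(units):
            if i + 1 >= position:           # crossed the next selection point
                picks.append(unit)
                position += interval
        return picks

    # Third stage within a sampled school: children sampled at a higher rate
    # when they are Asian or Pacific Islander (API), as described above.
    def sample_children(roster, base_rate, api_rate):
        api = [c for c in roster if c["is_api"]]
        other = [c for c in roster if not c["is_api"]]
        return systematic_sample(api, api_rate) + systematic_sample(other, base_rate)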


The base-year (i.e., kindergarten) data were collected in the fall of the 2010-11 school year and are currently being collected in the spring of the 2010-11 school year. The spring 2011 data collection will end in early August 2011. The spring first-grade data collection will be conducted in spring 2012, when most, but not all, of the base-year children will be in first grade. (It is expected that some children will be retained in kindergarten for the 2011-12 school year.)


Similar to the national kindergarten data collections, the spring national first-grade data collection will include direct child assessments, height and weight measurements, parent interviews, and school administrator and teacher questionnaires (both regular classroom and special education teachers). As in the base year of the ECLS-K:2011, computer-assisted interviewing (CAI) will be the mode of data collection for the child assessment and the parent interviews. School administrator and teacher data will be collected via self-administered questionnaires.


Cognitive Assessments. As in the kindergarten and fall first-grade data collections for the ECLS-K:2011, a direct cognitive assessment will be used in the spring 2012 first-grade collection. This will be the same assessment used in the fall 2011 first-grade collection.4 The cognitive assessment will measure the domains of reading, mathematics, science, and executive functioning. It will be administered directly to the sampled children through a one-on-one assessment employing age- and grade-appropriate items. The structure of the ECLS-K:2011 first-grade cognitive assessment will be two-stage, the same as the ECLS-K:2011 base-year assessment. That is, for the cognitive assessments in reading, mathematics, and science,5 all children first will be administered a routing test. Performance on the routing test will determine which one of three second-stage tests is appropriate for the child’s skill level; the child will then be administered that second-stage assessment form. The executive function tasks (i.e., Numbers Reversed and the Dimensional Change Card Sort) are not two-stage assessments. In addition to the cognitive assessment, the ECLS-K:2011 direct child assessments will include measures of the children’s height and weight.
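
To make the two-stage structure concrete, the short sketch below shows the routing logic in schematic form. The cut scores and form labels are hypothetical placeholders; the actual routing rules for the ECLS-K:2011 assessments are set by ETS from field-test item statistics and are not specified here.

    # Hypothetical cut scores per domain: (low/middle boundary, middle/high boundary).
    CUT_SCORES = {"reading": (7, 13), "mathematics": (6, 12), "science": (5, 10)}

    def route_second_stage(domain, routing_score):
        # Map the number of routing items answered correctly to one of the
        # three second-stage forms for that domain.
        low_cut, high_cut = CUT_SCORES[domain]
        if routing_score < low_cut:
            return "low-difficulty second-stage form"
        if routing_score < high_cut:
            return "middle-difficulty second-stage form"
        return "high-difficulty second-stage form"

    # Example: a child answering 9 reading routing items correctly would be
    # administered the middle-difficulty reading form.
    print(route_second_stage("reading", 9))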


A majority of items in the two-stage ECLS-K:2011 reading and mathematics assessments will be the same as those used in the ECLS-K kindergarten/first-grade assessment in order to enable researchers to conduct cross-cohort analyses. While a science assessment was fielded in the ECLS-K, it was first fielded in third grade, so a new assessment appropriate for younger children was developed for the ECLS-K:2011. Science items were administered in the kindergarten and first-grade waves of the ECLS-K as part of a general knowledge assessment; some of these items have been included in the ECLS-K:2011 kindergarten and first-grade science assessments. The science assessment for first grade will be a two-stage assessment, similar to the science assessments fielded in the third-, fifth-, and eighth-grade rounds of the ECLS-K.


Parent Interviews. A parent interview will be administered to one parent/guardian of each child in the ECLS-K:2011 study. The interviews will be conducted in English and Spanish. For parents who speak neither English nor Spanish, home or community interpreters, when available, will be used to administer the English-language version, translating it into the parent’s native language during the interview. The parent instrument will ask about family structure, family literacy practices, parental involvement in school, nonparental care arrangements, household composition, family income, parent education levels, and other demographic indicators. Parents will also be asked to report on their children’s level of physical functioning, health, and disability status. The parent interview includes the same types of questions (in terms of topics and format) that have been previously fielded in the ECLS-K, earlier rounds of the ECLS-K:2011, and other NCES studies (e.g., the ECLS-B, the National Household Education Surveys Program (NHES), the Education Longitudinal Study of 2002 (ELS:2002), and the National Education Longitudinal Study of 1988 (NELS:88)).


Teacher Questionnaires. Teachers of sampled children will complete the teacher questionnaires. Two versions of the teacher questionnaire will be fielded: a first-grade version for teachers of sampled children who are in first grade or higher and a kindergarten version for teachers of sampled children who are in kindergarten. The instruments include questions about the teachers’ own background and education, class materials, teaching practices, and specific information about the topics and skills taught in the classroom. These questionnaires provide information on the types of materials being used to teach the ECLS-K:2011 students, what and how they are being taught, the characteristics of their classrooms, and the background and experience of their teachers.


Teachers also will be asked to complete child-specific questionnaires about each of the sampled children in their classroom. The questionnaires will contain items about children’s skills in the areas of language and literacy, mathematics, science, and executive functioning; children’s social skills and behaviors; and information about program placements and special services that each child may receive. These data obtained from teachers can be compared to the results of direct assessments administered to the sampled children. As results from additional years of assessments become available, a picture of children’s skills over time can be developed and tentative conclusions can be drawn about children’s progression in school.


Special education teachers will also be asked to complete questionnaires for ECLS-K:2011 students with special needs, defined as having an Individual Education Plan (IEP) on file at the school. These questions will be useful in examining special education curricula and the services being received by children with disabilities.


School Administrator Questionnaire. This questionnaire will be completed by the school administrators in the schools attended by the children in the study. There will be two versions of the school administrator questionnaire: one for continuing schools and one for “transfer” schools (i.e., schools that are added to the ECLS-K:2011 as sampled students transfer from their originally sampled school to a new school). In order to reduce respondent burden, the administrator questionnaire for continuing schools will not contain questions about characteristics that are unlikely to change from year to year and for which we obtained responses in the kindergarten year. This instrument includes a broad range of questions about the school setting, policies, and practices at both the school level and in specific grades, as well as questions about the school administrator and the teaching staff. These items will help researchers understand the school contexts for ECLS-K:2011 students. Comparisons can be made between children attending different types of schools, including public and private schools (with private schools being further identified as religious or nonreligious); rural, urban, and suburban schools; and schools of different sizes. Data from this questionnaire can be merged with data from the child assessments and teacher questionnaires. Linking these data will allow researchers to determine the degree to which educational outcomes of various groups of children are associated with the differences in the schools that the children attend.

A.1.5 ECLS-K:2011 Study Design for the Fall Second-Grade Data Collection

The fall second-grade (fall 2012) data collection will be a follow-up data collection with the children in the 30 percent subsample of the ECLS-K:2011 schools (n = approximately 600 schools, including “transfer” schools), or approximately 6,000 children, who were sampled for the fall first-grade data collection conducted in fall 2011. Similar to that data collection, a primary purpose of this collection is to obtain information about children’s summer experiences to examine summer learning, summer learning loss, and the transition between grades. The collection will include a child assessment, a teacher questionnaire, and a parent interview. The collection will also include a screening of children’s hearing, which has been described in previous clearance requests and was field tested before the ECLS-K:2011 kindergarten collections, but for which clearance to include it in the national data collection has not previously been requested.

A primary purpose of the follow-up rounds of data collection for the ECLS-K:2011 is to allow for examination of change over time within the ECLS-K:2011 cohort, as well as to allow for comparisons of the experiences, skills, and knowledge of this cohort and the cohort of children in kindergarten in 1998-99 (i.e., the ECLS-K). Therefore, the instruments for the fall 2012 second-grade collection have been developed from the instrumentation used in the ECLS-K:2011 fall first-grade collection (and, due to the ECLS-K:2011 study design, the ECLS-K).

A.2 Purposes and Uses of the Data

The ECLS-K:2011 will provide rich data sets that are generally designed to serve two purposes: descriptive and explanatory. It will provide descriptive data at a national level related to (1) children’s status at entry into kindergarten and at different points in children’s elementary school careers, (2) children’s transition into school and into the later elementary grade levels, and (3) children’s school progress through the fifth grade. Additionally, it will provide rich data that will enable researchers to test hypotheses about how a wide range of child, family, school, classroom, nonparental care and education provider, and community characteristics relate to experiences and success in school.


In addition to the descriptive objectives mentioned above, the data will describe the diversity of young children with respect to demographic characteristics such as race/ethnicity, language, and school readiness. Such information is critical for establishing policies that are sensitive to this diversity. The longitudinal nature of the study will enable researchers to study cognitive, socioemotional, and physical growth, as well as relate trajectories of growth and change to variation in home, school, and before- and after-school care setting experiences in the elementary grades. Summer learning or learning loss, which can have a considerable impact on children’s educational progress, can also be examined with data collected in the fall 2012 second-grade data collection. Ultimately, the ECLS-K:2011 data set will be used by policymakers, educators, and researchers to consider the ways in which children are educated in our nation’s schools and to develop effective approaches to education. It will be particularly valuable to policymakers, as the ECLS-K:2011 is being launched a dozen years after the inception of the original ECLS-K. Analyses of the two cohorts will provide valuable information about the influences of changing policy and demographic environments on children’s early learning and development.


A.2.1 Research Issues Addressed in the ECLS-K:2011

Today’s early education environment differs from that of the past in numerous ways. Examples of the many changes that have occurred within schools and within the larger society in recent years are presented in Exhibit A-1 and include changes at the policy, state, school, family, and societal levels. The ECLS-K and ECLS-B have been used by numerous researchers to examine many of these topics. The widespread use of ECLS data is a testament to the importance of the ECLS program. At the same time, both prior studies leave gaps in the research questions we may answer with the data, which is perhaps inevitable because changes in policy, research, and society are often difficult to anticipate. We seek to preserve the strengths of the earlier studies by retaining much of the same content, while incorporating appropriate modifications so that the ECLS-K:2011 can be used to answer some of these recently emerging questions while at the same time allowing for the study of a new cohort of children growing up in new circumstances. Below, we discuss some of the more important developments that are particularly relevant to the design of the ECLS-K:2011.


Exhibit A-1. Examples of important developments relevant to the ECLS-K:2011


Policy changes

  • Passage of ESEA 2002 and pending reauthorization of ESEA

  • Race to the Top Program

  • Passage of Patient Protection and Affordable Care Act

  • Passage of the 1996 Personal Responsibility and Work Opportunity Reconciliation Act (“welfare reform”)

  • Higher standards for teacher qualifications

Changes in schools and challenges to schools

  • Growth in school choice and increasing number of charter schools

  • Growth in integrated pre-kindergarten through grade 3 schools (Pre-K-3)

  • Change in curricular focus due to ESEA 2002

  • Re-segregation of schools due to residential patterns and decline in court mandated busing

  • Stress on school systems to adapt to decreasing student populations (in the North) or increasing numbers of students (in the Sunbelt)

Demographic/Economic changes

  • Growth of Hispanic child population

  • Growth in English language learners (ELL) in schools, especially at young ages

  • Migration of population from Rustbelt to Sunbelt states

  • Extension of suburban sprawl

  • Continued high levels of single-parent families, maternal employment, and nonparental child care

  • Continued high rates of births to older mothers

  • Global recession and financial crisis beginning 2007/2008

Child health

  • Epidemic of obesity and associated rise in diabetes

  • Rise in incidence of:

      • Allergies

      • Asthma

      • Autism

      • Attention deficit/hyperactivity disorder

      • Learning disabilities

Scientific developments

  • Advances in neuroimaging techniques (e.g., fMRIs) that have led to advances in our understanding of the development of children’s learning, memory, attention, and language

  • Advances in neurological research and emphasis on executive function

  • Emerging research showing the trainability of cognitive processes (e.g., Rueda et al., 2005)

Technological changes

  • Increase in:

      • Use of video games even for very young children

      • TV programs aimed at children

      • Cell phones and texting

      • Internet usage

      • Social network sites

      • Video-sharing websites (e.g., YouTube, Google Video)

  • Evolving and new technologies (e.g., MP3 players with video capability, smart phones, lightweight and pocket computers, more powerful PCs)



A.2.1.1 Developments in Early Education Policy

A major change in early education occurred when the Elementary and Secondary Education Act (ESEA) was reauthorized as the No Child Left Behind Act (NCLB) and signed into law in early 2002. The reauthorization of ESEA, including proposals for major substantive changes, is an issue of considerable interest among policymakers. ESEA in both its current and potentially amended forms has affected and will continue to affect children’s progress through school. ESEA 2002 affects families, classrooms, teachers, schools, and school districts throughout the country. It has clear expectations for student achievement; mandates annual assessments of all children in grades 3 through 8 to measure progress toward state-defined goals; and has strong reporting requirements for schools, districts, and states. Under ESEA 2002 there are consequences when schools and school districts do not make Adequate Yearly Progress (AYP). The Department of Education’s blueprint for reauthorization (U.S. Department of Education, 2010) proposes retaining similar consequences but moves away from the AYP system in favor of one that tracks individual student performance over time. Under this proposed new system, schools, districts, and states will be required to meet performance targets based on overall and subgroup achievement and growth. These performance targets are not yet defined, but those states, districts, and schools that meet or surpass all of their targets will be rewarded financially or through preferential advantages in grant competitions.


The set of ECLS studies will be an important resource to researchers and policymakers seeking to understand the consequences of changing education policy for young children’s cognitive and social development. Cross-cohort comparisons of the ECLS studies will provide important insights into not only the influence of ESEA 2002 and the reauthorization of ESEA in particular, but also the influence of other policies and societal changes on children’s lives. ECLS-K children entered school before the advent of ESEA 2002, though ESEA 2002 was enacted in their middle elementary school years and they likely experienced some of the effects of the law, such as mandatory testing, by the end of the study. ECLS-B children entered school as states, districts, and schools were adjusting to meet the requirements of ESEA 2002 and to develop the required systems to demonstrate AYP. ECLS-K:2011 children entered school after educational systems complied with ESEA 2002 requirements. The ECLS-K:2011 cohort will progress through school as ESEA is reauthorized and as states, districts, and schools readjust from ESEA 2002 to whatever requirements and changes the reauthorization of ESEA holds.


Another policy initiative is Race to the Top, an incentive program designed to stimulate reforms in state and local district K-12 education policy. Race to the Top is funded by the Education Recovery Act and is part of the American Recovery and Reinvestment Act of 2009 (ARRA). Both ESEA and Race to the Top emphasize the importance of having well-qualified teachers in all classrooms; for example, ESEA 2002 requires that all teachers of core subjects have a bachelor’s degree, full state certification, and demonstrated competence in each core academic subject they teach. The Department of Education’s blueprint for reauthorization calls for states to define standards for “effective teacher,” “effective principal,” “highly effective teacher,” and “highly effective principal.” These new terms will be based significantly on student growth but will also include other measures, like classroom observations. Until states transition to these new standards, a more flexible version of ESEA 2002’s “Highly Qualified Teachers” provision will remain in place.


Two programs, the Teacher Quality Partnership and the Teacher Incentive Fund, are intended to improve the quality of new teachers and reward current teachers based on student performance. Both of these programs are similar to programs that are proposed in the ESEA reauthorization blueprint, which calls for increased funding for professional development for both new and current teachers and for performance-based incentives based on an individual student growth model system that is proposed to take the place of AYP. Also included in the ESEA reauthorization blueprint is proposed funding for states and districts to develop teacher and administrator evaluation systems and to strengthen traditional and alternative pathways into teaching, including competitive grants for the recruitment, preparation, placement, and induction of teachers for high-need schools and subjects.


The blueprint for ESEA reauthorization also calls for a focus on curriculum standards in science, technology, engineering, and mathematics (STEM). One of the criticisms of No Child Left Behind has been its emphasis on reading and mathematics at the expense of other subjects. Under the blueprint, high-need districts will be given support in science and mathematics, and states will need to develop comprehensive, evidence-based plans to provide high-quality STEM instruction. Grants and partnerships, like Race to the Top, will be available to provide professional development and other resources in these subjects. States that adopt standards in common with each other, utilize technology to address student learning challenges, cooperate with groups who have STEM experience, or pledge to prepare more students from underrepresented groups in STEM careers will be given priority for these resources and grants. Data from the ECLS-K:2011 can be used to examine the degree to which school practices align with the ESEA policy initiatives and how they relate to children’s cognitive development.


Another policy-related issue that can be examined with the ECLS-K:2011 data collections beginning with the spring first-grade data collection is the extent to which schools and teachers are using Response to Intervention strategies and practices. Response to Intervention (RtI) is a system for general, remedial, and special education. It integrates assessment, evidence-based intervention, and student monitoring within a multi-tiered system designed to maximize student achievement and reduce behavior problems by tailoring the type and intensity of interventions based on individual student performance. RtI can also be used to identify students in need of additional services, such as special education classroom instruction.


There has been a growing interest in RtI since the Individuals with Disabilities Education Improvement Act (IDEA) was reauthorized in 2004. Prior to this, schools primarily used an IQ-achievement discrepancy to identify children with learning disabilities. The reauthorization of IDEA allows RtI to be used as an alternative method for identifying children in need of additional educational interventions and allows districts to use a portion of their special education funding for early intervention services for all students.


Many school districts are adopting RtI models and there is a great deal of interest in understanding how RtI is being implemented across the country. RtI models are relatively new and there is much variability in how they are conceptualized and implemented. The effectiveness of RtI may depend upon a number of factors such as the quality and types of interventions that are available, the training of the individual administering the interventions, and the intensity of the interventions.


A.2.1.2 School Readiness

Educational policymakers and researchers continue to debate the most appropriate ways to promote school readiness. Most experts agree that school readiness is a multifaceted phenomenon and encompasses several domains of child development. In addition to cognitive development and pre-academic skills (e.g., letter and number recognition, emerging literacy), school readiness is conceptualized as involving the whole child, including health and physical well-being, language acquisition, social and emotional development, and interest in and enthusiasm for learning. It is therefore important for ECLS-K:2011, like ECLS-K and ECLS-B, to capture all of these domains to fully understand how children’s early learning and development are affected by shifts in policy and by other changes in children’s lives.


One effect of ESEA 2002 is a change in curricular emphasis in the early grades. ESEA 2002 emphasizes evidence-based early literacy activities that stress the development of specific literacy skills. ESEA 2002 includes two initiatives, Reading First and Early Reading First, which seek to lay the foundation for future school success by stressing the following five skills to enable children to become proficient readers:


  • Phonemic awareness: the ability to hear and identify sounds in spoken words;

  • Phonics: the relationship between the letters of written language and the sounds of spoken language;

  • Vocabulary: the words students must know to communicate effectively;

  • Fluency in reading: the capacity to read text accurately and quickly; and

  • Comprehension: the ability to understand and gain meaning from what is read.

ESEA 2002 and these reading programs view literacy as a learned skill that requires coherent skill-based instruction using scientifically proven curricula provided by highly qualified teachers. By ensuring that the ECLS-K:2011 assessments and teacher questionnaires measure these skills, the ECLS-K:2011 can be used to examine children’s emerging literacy and cognitive development since the passage of ESEA 2002. The focus of ESEA 2002 on early literacy skills has essentially shifted discussions of school readiness from the range of domains mentioned above to two: language development, and cognition and general knowledge. It will be important to examine the trajectories of other important dimensions of school readiness, such as social competence, approaches to learning, or other indicators of socioemotional development, in light of this shift.


A.2.1.3 Executive Functioning

New research in the cognitive and neurological sciences is providing important insights into developmental processes associated with school readiness. Of particular interest is new research on the importance of executive functioning for learning and academic achievement (e.g., Blair and Razza, 2007; Posner and Rothbart, 2006). “Executive functioning” refers to a set of interdependent processes that work together to accomplish purposeful, goal-directed activities and include working memory, attention, inhibitory control, and other self-regulatory processes. Executive functioning processes work to regulate and orchestrate cognition, emotion, and behavior to help a child to learn in the classroom. For example, executive control, which is associated with the prefrontal cortex, involves the ability to allocate attention, to hold information in working memory, and to withhold an inappropriate response (Casey et al., 2000). Not only are these cognitive and behavioral processes predictive of reading and math achievement (Blair and Razza, 2007), but there is also emerging research that indicates that some of these cognitive processes are trainable (Rueda et al., 2005; Klingberg et al., 2005) and can be improved upon in regular public school classrooms without costly interventions (Diamond et al., 2007).


Many other cognitive processes are necessary for learning and achievement. For example, learning, whether it involves reading comprehension, solving applied mathematics problems, or something else, involves the interaction between working memory and long-term memory and the formation of linkages between the two. The ECLS-K:2011 will be strengthened by obtaining measures (direct or indirect) that capture specific learning issues such as attention problems, memory problems, inability to withhold inappropriate responses, and language issues. In particular, little attention has been paid to differences in these areas across racial/ethnic subgroups or between low-income and other children (Noble et al., 2005). The ECLS-K:2011 will provide information to allow for the investigation of such differences.


A.2.1.4 Demographic Changes

In addition to changing policies and approaches to early education and research, the U.S. is also experiencing financial and economic turmoil. The current recession, the associated high unemployment rate, and tightened state and local budgets have direct impacts on district and school budgets. A recent study noted that because the American Recovery and Reinvestment Act funding that staved off some personnel and budget cuts from 2008 to 2010 will soon run out, states and districts will be faced with serious budgetary issues that will likely affect both school personnel and services offered (Mead, Vaishnav, Porter, and Rotherham, 2010). Cuts in school budgets and in teaching staff may affect children’s early experiences at school. Additional questions in the ECLS-K:2011 School Administrator Questionnaire about staff additions or contractions in the past year, staff burden, and class sizes will provide data to examine the effects of these cuts. The current economic climate may also affect children’s home lives, and this can be investigated with ECLS-K:2011 data.


Beyond these economic challenges, the U.S. is also experiencing demographic shifts in the composition of its population. Continued high immigration rates, a relatively young immigrant population, high fertility rates among immigrant Hispanic women, and low fertility rates among the native-born population mean that a substantial fraction of the child population has one or more immigrant parents. In 2006, approximately one in every four births was to a foreign-born mother (Martin et al., 2009). Sixty-one percent of these births were to women of Hispanic origin (Martin et al., 2009). The demographic shift is especially evident in the school-age population. In 2007, 20 percent of children ages 5 to 17 spoke a language other than English at home (Planty et al., 2009). Of those speaking a language other than English at home, 72 percent spoke Spanish, 13 percent spoke another Indo-European language, 11 percent spoke an Asian or Pacific Island language, and 4 percent spoke some other language at home (Planty et al., 2009). Language barriers are not the only challenge for many of these children. Many children, especially those with parents from Mexico and Central America, have parents with lower levels of education, larger families, and lower family income than the parents of native-born children (Larsen, 2004). Additionally, families from other cultures may have different normative expectations for how they should interact with schools and teachers. ECLS-K:2011 will enable researchers to examine how schools and teachers are meeting the needs of these students and their families and how effective those efforts are.


A.2.1.5 Summer Learning

A main purpose of the fall second-grade data collection is to obtain information that will enable researchers to study factors that are associated with children’s learning during the summer months between first and second grades. Studies find that the gap in achievement between disadvantaged children and advantaged children widens during the summer months (Alexander, Entwisle, and Olson, 2007; Burkam et al., 2004). The widening gap could be due to greater gains by advantaged compared to disadvantaged students or a loss in learning that is greater among disadvantaged students or not present for advantaged students. Researchers using a sample of Baltimore students found that disadvantaged children experienced summer losses in what they had learned during the school year while advantaged children did not experience losses to the same extent (Alexander, Entwisle, and Olson, 2007). They also found evidence that summer learning loss cumulates over time to the detriment of disadvantaged children (Alexander, Entwisle, and Olson, 2007). Other research suggests that the extent of losses in learning varies by grade and by subject area. For example, according to the Center for Summer Learning at Johns Hopkins University, all students lose on average approximately 2.6 months of grade level equivalency in mathematical computation over the summer months. The Center for Summer Learning researchers speculate that all students, regardless of socioeconomic level, experience this loss because they are equally unlikely to practice math skills outside of formal class settings. In contrast, the researchers found that family income was important in predicting the extent of summer reading loss: low-income students experienced losses in reading comprehension and word recognition while middle-income students experienced slight gains in reading performance over the summer (Center for Summer Learning, 2007).


An important factor in examining summer learning is family resources. Parents of non-poor or advantaged children tend to be better educated, have higher incomes, and have more prestigious occupations than parents of children living in households below the poverty line (Farkas, 2006; Duncan and Magnuson, 2005). These parents are also more likely to be married, provide higher quality care arrangements, live in better neighborhoods with better schools and other community resources, and encounter fewer family stressors (Farkas, 2006). All of these factors have been shown to be associated with children’s early learning (Farkas, 2006; Duncan and Magnuson, 2005). The family resources to which children living in households above the poverty line have access also enable their parents to provide outside enrichment activities during the summer months. Data from the ECLS-K:2011 can be used to examine the summer learning experiences of children with varying levels of access to resources as well as their summer learning trajectories.
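
To illustrate how consecutive spring and fall assessments support summer-learning analyses, a minimal sketch follows. The scores and variable names are hypothetical; actual analyses of ECLS-K:2011 data would use the study’s assessment scale scores and sampling weights and would typically adjust for the number of days between assessments.

    def summer_change(spring_score, fall_score):
        # Positive values suggest a summer gain; negative values suggest a loss.
        return fall_score - spring_score

    # Hypothetical reading scale scores for two children.
    children = [
        {"id": "A", "spring_grade1": 58.2, "fall_grade2": 61.0},
        {"id": "B", "spring_grade1": 64.5, "fall_grade2": 62.9},
    ]
    for child in children:
        print(child["id"], round(summer_change(child["spring_grade1"], child["fall_grade2"]), 1))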


A.3 Use of Improved Information Technology

When feasible, available technology will be used to improve data quality and reduce respondent and school burden.


The ECLS-K:2011 parent interviews and child assessments will be conducted using computer-assisted interviewing (CAI). Using CAI will increase data collection efficiency by permitting preloads of available data about the sampled schools and children, on-line editing of information as it is entered (e.g., correcting data entry errors caught through range and logic checks or correction of information provided in a previous round of data collection), and routing of respondents through complex question branching—all of which also reduce respondent burden by producing faster interviews and reduce the need to recontact respondents to obtain missing information (as would happen, for example, when a field interviewer not using CAI does not follow a skip pattern correctly and items that should be asked are skipped). Field interviewers will conduct in-person interviews with parents who do not have telephones; these interviews will also be conducted using CAI on laptop computers. The CAI system has important features that will improve the quality of the data and reduce the burden on respondents, as follows:


  • Initial Contact: The CAI system will guide the ECLS-K:2011 field interviewer in making contact with the parent at the phone number or address provided by the school and with the child at the school and will include prompts to help the interviewer identify the correct respondent.

  • Routing the Direct Child Assessment: The CAI system will be programmed so the initial routing tests at the beginning of the reading, mathematics, and science cognitive assessment subtests will be scored by the computer and the appropriate second-stage tests corresponding to the child’s ability level will be administered. The benefits of such a two-stage assessment are increased adaptiveness, reduced burden for the child, and increased precision of measurement because the interviewers do not need to score the routing test and select the appropriate second-stage test themselves.

  • Skip Patterns: The CAI system automatically guides interviewers through the complex skip patterns in the parent interviews, thereby reducing respondent burden, reducing potential for interviewer error, and shortening the interview administration time. The respondent will not be asked inapplicable questions and the interviewers do not need to spend time determining which questions to ask.

  • Copying Responses: The CAI system will be programmed to copy responses from one item to another and from one round to another to prevent unnecessary repetition of questions and to aid in respondents’ recall. For example, information that is provided by the respondent early in the interview may be useful later in the interview; such information can be displayed on the screen or used as a wording fill for relevant questions to assist the respondent. Additionally, information from the previous waves of data collection can be copied to the current wave’s interview and be verified by the respondent, eliminating the need to collect the data again.

  • Time Intervals: The CAI system also provides automated time and date prompts that are very useful in longitudinal studies to assist respondents in remembering specific time periods. The interview can also provide the specific time frame for the interval between the previous and the current wave of data collection, to help respondents recollect information without repeating what they had given at the previous data collection period.

  • Receipt Control: The CAI system will provide for automatic updates to the interview status of study participants and will be used to produce status reports that allow timely and ongoing monitoring of the survey’s progress.

The use of a CAI system for the ECLS-K:2011 is critical because of the intricate and sometimes difficult skip patterns that are part of complex survey instruments and because of the longitudinal nature of the data collection in which the same respondent might be interviewed at multiple time points. Without CAI, the ECLS-K:2011 instruments would be difficult to administer over repeated measurement periods, and respondent burden would be increased.
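
As a rough illustration of the skip-pattern behavior described above, the fragment below shows how a CAI instrument presents follow-up items only when they apply and can carry preloaded data from a prior round. The question identifiers, wording, and branching rule are invented for illustration and do not reproduce the actual ECLS-K:2011 parent interview specification.

    def ask(question_id, text):
        # In a production CAI system this would be an interviewer screen with
        # range and logic checks; input() stands in for that here.
        return input(question_id + ". " + text + " ").strip().lower()

    def child_care_module(preload):
        # 'preload' carries data from a prior round (e.g., the child's name),
        # so it can be confirmed rather than collected again.
        responses = {}
        responses["CC1"] = ask("CC1", "Does " + preload["child_name"] +
                               " receive regular care from someone other than a parent?")
        if responses["CC1"] == "yes":
            # Follow-up items are asked only when applicable; otherwise the
            # respondent never sees them and no recontact is needed.
            responses["CC2"] = ask("CC2", "About how many hours per week?")
            responses["CC3"] = ask("CC3", "Is that care provided in your home?")
        return responses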


A computer-based data management system will be used to manage the sample. The sample management system uses encrypted data transmission and networking technology to maintain timely information on respondents in the sample, including contact, tracking, and case completion data. This system will be particularly important as children move from one school to another over the course of the ECLS-K:2011 study. The use of technology for sample management will maximize tracking efforts, which should have a positive effect on the study’s ability to locate movers and achieve acceptable response rates.


A.4 Efforts to Identify Duplication

The ECLS-K:2011 will not be duplicative of other studies. The original ECLS-K is the only other study to collect as detailed and extensive information as the ECLS-K:2011 for a cohort of young children and to follow them throughout elementary school. The ECLS-K:2011 will extend the information obtained by the ECLS-K to a new cohort, will open up possibilities to investigate new research questions, and will allow important comparisons to be made between two kindergarten cohorts attending school a dozen years apart. In addition, the ECLS-K:2011 will collect data during the children’s second and fourth grade years, which the original ECLS-K did not.


A literature search was conducted to identify and review research studies with the same study purpose and goals as those proposed for the ECLS-K:2011. To be included in the search, the research had to be (1) a survey-based study of a population with a sample of 1,000 or more, (2) longitudinal in design, and (3) focused on children’s cognitive development in the elementary, middle, and/or secondary grades. Although similar studies were found, they were generally confined to limited geographic areas (e.g., Baltimore, Maryland; Greensboro, North Carolina) or, in the case of studies conducted on the national level (e.g., Prospects, Children of the National Longitudinal Survey of Youth [NLSY Child Supplement]), were not based on probability samples of kindergartners. For example, Prospects began with first graders and targeted Title I recipients. NLSY79’s Child Supplement targeted the children of female sample members of a household-based 1979 sample of 14- to 21-year-olds. The Head Start Family and Child Experiences Survey (FACES), which is similar to the ECLS-K:2011 in terms of the included content and components, has followed several cohorts of children from preschool through early elementary school. However, FACES has not followed the progress of children in school beyond kindergarten or first grade, and the samples are limited to children served by Head Start. Studies such as the National Education Longitudinal Study of 1988 (NELS:88) and the Education Longitudinal Study of 2002 (ELS:2002) began with students in the middle and high school grades. Another major finding of the literature review was that most studies used group-administered achievement tests, which, for young children, can be less reliable than individually administered assessments. Individually administered assessments, like those used in the ECLS-K:2011, allow the assessor to establish rapport and offer motivation and supportive conditions so that each child performs to the best of his or her ability.


In the past 25 years there have been a few large-scale studies that have looked at summer programs, summer child care, and/or summer learning, but they were either not national studies and were confined to limited geographic areas (e.g., the Beginning School Study (1982-2002) conducted in the Baltimore City Schools) or were national studies but were not based on nationally representative samples of kindergartners that were followed longitudinally (e.g., the National Survey of Families (1999), the Child Development Supplement (CDS) to the Panel Study of Income Dynamics (2002), and the Current Population Survey (CPS) (1996)). These studies are also not from as recent a cohort of children as the ECLS-K:2011 children are, so the issues that they measured (e.g., participation in child care during the summer) may have changed since the earlier data were collected.

 

Summer programs, summer child care, and/or summer learning between kindergarten and first grade were examined in the fall first-grade data collection. While the fall second-grade data collection will be conducted with the same subsample as the fall first-grade data collection, the fall second-grade data collection is not duplicative. It will add a longitudinal element to the fall data collections, allowing the child assessments to be used to examine children’s summer learning in consecutive summers. We will be able to examine whether children’s learning differs in the summer after kindergarten compared with the summer after first grade. Obtaining fall second-grade data will also allow an examination of the summer experiences children have at home after most have completed first grade, which could be considered more formal schooling than kindergarten. Whether completion of first grade changes the activities parents do with their children to prepare them for second grade, or the use of summer school, tutoring, or services for children who may have been identified in first grade as needing additional help, can be examined using the fall second-grade data. Lastly, having a fall second-grade data collection will also allow for analyses of children’s growth between the fall and spring of their second-grade year.


A.5 Method Used to Minimize Burden on Small Businesses

Private elementary schools, both not-for-profit and proprietary, have been drawn into the sample. These schools will benefit from the study’s burden-reducing strategies (e.g., instruction packets for participants, toll-free help lines, and prepaid business return envelopes), which were designed for all types of schools.


A.6 Frequency of Data Collection

This submission describes and requests approval for the spring first-grade data collection, the fall second-grade subsample data collection, and the spring second-grade field test. The base year data collection began in fall 2010 and continued in spring 2011. One of the main goals of the ECLS-K:2011 is to measure change in children’s cognitive growth and noncognitive status, as well as changes in the contextual characteristics (i.e., family, classroom, school, and community factors) that can affect growth. The spring first-grade, fall second-grade, and spring second-grade data collections are three of the periodic follow-ups that will collect information to be compared with baseline (kindergarten) information, thereby allowing for analyses of change in school children and their environments.


For the second-grade year, beginning-of-the-school-year data collection is needed in order to obtain baseline data on children at the very beginning of their exposure to the influences of the second-grade year. Through direct and indirect assessments, the second-grade fall data collection will provide measures of the skills, attributes, and knowledge of the subsample of children as they re-enter school and begin a new school year. The second-grade fall data collection will also provide measures of summer learning and/or learning loss. The data collected at the end of the school year in the spring second-grade data collection will be used to examine changes in children after they have experienced nearly a year of second grade. After this second-grade year, the study design calls for follow-up collections each spring from third through fifth grade. This frequency of data collection is linked to the rate of change that is expected for children of this age and the desire to capture information about children as critical events and transitions are occurring, rather than measuring these events retrospectively. Without data collection follow-ups, the study of children’s cognitive, socioemotional, and physical development is hindered. Assuming the first- and second-grade collections are as successful as the kindergarten collections have been to date, future clearance requests will be submitted for the follow-up collections in later grades.


A.7 Special Circumstances of Data Collection

No special circumstances for this information collection are anticipated.



A.8 Consultants Outside the Agency

NCES consulted with a range of outside agencies over the life of the ECLS-K, and such input also has informed the ECLS-K:2011 study design and instrumentation, since both draw heavily from the ECLS-K. During the early development of the ECLS-K, project staff met with representatives from a wide range of federal agencies with an interest in the care and well-being of children (see Table A-1). The goal of this activity was to identify policy and research issues and data needs. Similar consultation with federal agencies has occurred and continues for the ECLS-K:2011.


For the ECLS-K, project staff also consulted several other organizations (see Table A-2) that have an interest in the care, well-being, and education of young children. The goal of this activity was to obtain additional perspectives on policy and research issues and data needs.


Several of the early consultations with government agencies have resulted in interagency agreements funding supplemental questions or sections in the study instruments. Similar to its predecessor, the ECLS-K:2011 represents a collaborative effort by education and health and human services agencies. NCES supports the development of the core design of the ECLS-K:2011. Partner agencies continuing to support the inclusion of the supplemental questions or sections that enrich the ECLS-K:2011 include the Economic Research Service of the U.S. Department of Agriculture, the National Center for Special Education Research in the Institute of Education Sciences of the U.S. Department of Education, and the Administration for Children and Families in the U.S. Department of Health and Human Services. New agency partners to the ECLS-K:2011 include the National Institute on Deafness and Other Communication Disorders and the National Eye Institute, both at the National Institutes of Health in the U.S. Department of Health and Human Services. The National Institute on Deafness and Other Communication Disorders is sponsoring the hearing screening data collection. Table A-1 lists the Federal agency consultants for the ECLS-K and ECLS-K:2011, and Table A-2 lists other organization consultants for the ECLS-K.


In preparation for the ECLS-K:2011 collections, the data collection contractor assembled expert panels, a Technical Review Panel (TRP) and Content Review Panels (CRPs), to review and comment on issues related to the development of the study and survey instruments. The members of the panels include experts in research, policymaking, and practice in the fields of early childhood education and development, elementary education, health, research methodology, special populations, and assessment.


There have been two meetings of the TRP. The first was a 2-day meeting held in November 2008. The meeting focused on major design and content issues, such as study periodicity, the benefits of including an assessment of science in kindergarten, the assessment of executive functioning and possible measures for it, and the content of a Spanish-language assessment for native Spanish speakers who are English language learners. The TRP members also provided suggestions for specific questionnaire items to be included in the instruments in the full-scale collection. Table A-3 lists the ECLS-K:2011 TRP members present at the first meeting.


The second TRP meeting was a 2-day meeting held in March 2011. The meeting focused on content for the first- and second-grade non-assessment instruments, including suggestions for specific questionnaire items to be included in the instruments in the second-grade data collection. Table A-4 lists the ECLS-K:2011 TRP members present at the second meeting.


Table A-1. Federal agency consultants for ECLS-K and ECLS-K:2011

Diane Schilder1

General Accounting Office


Cindy Prince,1 Emily Wurtz1

National Education Goals Panel


Andy Hartman1

National Institute for Literacy


Mary Queitzsch,1 Larry Suter1

National Science Foundation


Michael Ruffner,1 Bayla White,1

Brian Harris-Kojetin1

Office of Management and Budget


John Endahl,1 Jeff Wilde,1 Joanne Guthrie,

Victor Oliviera1

U.S. Department of Agriculture


Don Hernandez1

U.S. Department of Commerce

Bureau of the Census

Marriage and Family Statistics


Tim D’Emillio

U.S. Department of Education, OELA


Naomi Karp,1 Dave Malouf,1 Ivor Pritchard,1

Marsha Silverberg1

U.S. Department of Education, IES


Pia Divine,1 Esther Kresh,1 Ivelisse Martinez-Beck, Ann Rivera

U.S. Department of Health and Human Services

Administration for Children, Youth, and Families


Gerry Hendershot,1 John Kiley,1 Michael Kogan1, Mitchell Loeb, Patricia Pastor

U.S. Dept. of Health and Human Services

NCHS




Howard Hoffman

National Institute on Deafness and Other Communication Disorders

National Institutes of Health, U.S. Dept. of Health and Human Services


Mary Frances Cotch

National Eye Institute

National Institutes of Health, U.S. Dept. of Health and Human Services

Tom Bradshaw,1 Doug Herbert1

National Endowment for the Arts


Jeffrey Thomas1

National Endowment for the Humanities


Patricia McKee

U.S. Department of Education

OESE Compensatory Education Programs


Cathie L. Martin1

U.S. Department of Education, OIE


Scott Brown,1 Louis Danielson,1 Glinda Hill,1

Lisa Holden-Pitt,1 Kristen Lauer,1

Marlene Simon-Burroughs,1 Larry Wexler

U.S. Department of Education, OSEP


Lisa A. Gorove1

U.S. Department of Education

OUS, Budget Service, ESVA


Elois Scott1

U.S. Department of Education

OUS, PES, ESED


Richard Dean1

U.S. Department of Education

OVAE, Adult Literacy


Jaquelyn Buckley

U.S. Department of Education

IES, NCSER


Jeff Evans,1 Sarah Friedman,1 Christine Bachrach,1

Peggy McCardle1

U.S. Department of Health and Human Services

NICHD, Center for Population Research


Martha Moorehouse,1 Anne Wolf1

U.S. Department of Health and Human Services

Office of Assistant Secretary for Planning & Evaluation, Children and Youth Policy


Katrina Baum1

Bureau of Justice Statistics

Department of Justice


Meredith A. Miceli

U.S. Department of Education
Office of Special Education Programs

1 Consultant for the ECLS-K only. Affiliation listed is the affiliation at the time input on the study was provided.



Table A-2. Other organization consultants for ECLS-K


Lynson Bobo

Project Associate

Resource Center on Educational Equity

Council of Chief State School Officers


Susan Bredekamp

Barbara Willer

National Association for the Education of Young Children


Mary Jo Lynch

American Library Association

Office of Research and Statistics

Keith W. Mielke

Children’s Television Workshop


June Million, Sally McConnell, Louanne Wheeler

National Association of Elementary School Principals


Evelyn Moore

Erica Tollett

National Black Child Development Institute


Thomas Schultz

Director, Center for Education Services for Young Learners

National Association of State Boards of Education



Table A-3. ECLS-K:2011 First TRP meeting attendee list (November 2008)


Karl Alexander

Department of Sociology

Johns Hopkins University


Jim Bauman

Center for Applied Linguistics

Washington, DC


Maureen Black

Growth and Nutrition Department

University of Maryland Medical Center


Joanne Carlisle

School of Education

University of Michigan


Janet Fischel

State University of New York at Stony Brook & University Medical Center

Fred Morrison

Department of Psychology

University of Michigan


Charlotte Patterson

Department of Psychology

University of Virginia


Robert Pianta

The Center for Advanced Teaching and Learning

University of Virginia


Kit Viator

Massachusetts Department of Education



Table A-4. ECLS-K:2011 Second TRP meeting attendee list (March 2011)


Karl Alexander

Department of Sociology

Johns Hopkins University


Jim Bauman

Center for Applied Linguistics

Washington, DC


Joanne Carlisle

School of Education

University of Michigan


Robert Crosnoe

Department of Sociology

University of Texas at Austin

David Dickinson

Department of Teaching and Learning

Vanderbilt University


Rolf Grafwallner

Maryland Public Schools


Greg Roberts

The Meadows Center for Preventing Educational Risk

University of Texas at Austin


Deborah Stipek

School of Education

Stanford University




To date, eight CRP meetings have been held: reading (May 2009), mathematics (May 2009), science (May 2009), English language learners (August 2009), executive function (November 2009 and March 2011), socioemotional development (March 2011), and teacher practices (March 2011). For each of these content areas, panel members provided critical review of the instruments proposed for inclusion in the national data collections. The meetings focused on the appropriateness and adequacy of specific instruments by considering features such as domain coverage, age appropriateness, and technical quality. Table A-5 lists the ECLS-K:2011 CRP members.


Table A-5. ECLS-K:2011 CRP member list


Reading Panel

Susan Conrad

Independent consultant, assessment development


Gloria Johnston

National University

Alba Ortiz

University of Texas at Austin


Barbara Wasik

Temple University

Mathematics Panel

Doug Clements

State University of New York, Buffalo

Donna Compano

Independent consultant, assessment development, math facilitator, elementary teacher

Lizanne DeStefano

University of Illinois at Urbana-Champaign


Leah Parker

Journeys Academy, Gifted Education Specialist

Science Panel

Christie Bean

JJ Ciavarra Elementary School


Kathy DiRanna

University of California - Irvine


Angela Eckhoff

Clemson University

Christine Y. O’Sullivan

Science Consultant

Michael Padilla

Clemson University

English Language Learners Panel

Jamal Abedi

University of California at Davis


Catherine Crowley

Teachers College

Eugene E. García

Arizona State University

Vera Gutierrez-Clellen

San Diego State University

Executive Function Panel

Clancy Blair

New York University


Adele Diamond (March 2011 meeting only)

University of British Columbia

Megan McClelland

Oregon State University


Philip Zelazo

University of Minnesota

Socioemotional Development Panel

Pamela Cole

The Pennsylvania State University


Rick Fabes

Arizona State University

Ross Thompson

University of California, Davis


Carlos Valiente

Arizona State University

Teacher Practices Panel

Stephanie Al Otaiba

Florida State University


Hilda Borko

Stanford University

Carol Connor

Florida State University


Barbara Wasik

University of North Carolina



A.9 Provision of Payments or Gifts to Respondents

Obtaining high response rates is critical for all longitudinal studies. At the start of a longitudinal data collection, it is essential to establish the goodwill of respondents and to demonstrate that we value their participation in the study. Goodwill can be established through well-designed respondent materials that inform respondents about the goals of the study and their role in it, rapport between field staff and respondents, professionalism among the field staff, and a small token incentive. The incentive plan for the ECLS-K:2011 is similar to the approach approved by OMB for use in the ECLS-K and in the base year (i.e., the kindergarten collections) of the ECLS-K:2011. The plan is designed to help respondents recognize the merits of the study and thereby encourage high response rates.


A.9.1 School Incentive

High levels of school participation are integral to the success of the study. Without a school’s cooperation, there can be no school, teacher, or child data collection activity for that facility. NCES recognizes that administrators will assess the burden level before agreeing to participate. To offset the perceived burden, NCES intends to continue using strategies that have worked successfully in the past for the kindergarten rounds of the ECLS-K:2011, the ECLS-K, and other major NCES studies (High School and Beyond, the National Education Longitudinal Study of 1988, and the Education Longitudinal Study of 2002). It is important to provide schools with an incentive because the study asks a great deal of them, including allowing field interviewers to be in the school for up to 3 days, providing a contact person and space for the children to be assessed, removing children from their normal classes while they are tested, and helping to obtain information about the school and the children. Given the many demands and outside pressures that schools face, it is essential that they see that we understand the burden we are placing on them and that we value their participation. As was done for the ECLS-K:2011 kindergarten data collection, we propose to remunerate each school $200. An honorarium check in the amount of $200 will be mailed to each school at the end of the spring first-grade data collection, along with a note thanking the school for its participation.6


A.9.2 School Administrator

To build response rates, we propose to remunerate school administrators in appreciation for their completing the school administrator questionnaire. In the ECLS-K, the field period had to be extended in both kindergarten and first grade to achieve response rates for the school administrator questionnaire that met NCES’s goals. Providing school administrators with an incentive will reduce the potential need to extend the field period and help avoid delays in data delivery. We will offer school administrators a $25 incentive in the spring first-grade collection, the same amount that was given to school administrators during the spring kindergarten round of the ECLS-K:2011; the incentive will be attached to the school administrator questionnaire. In the eighth-grade round of the ECLS-K, we offered school administrators a $25 incentive and achieved a response rate of 93.3 percent for the school administrator questionnaire; we hope for similar success with an ECLS-K:2011 incentive. (As of this submission, we do not have final response rates for the ECLS-K:2011 kindergarten data collection rounds.)


A.9.3 Teachers

In the base year of the ECLS-K:2011, teachers received $7 per child-level questionnaire because they were acting as data collectors, recording their observations of their ECLS-K:2011 kindergartners on the questionnaires. A check for the incentive was attached to the package of instruments the teacher received in the fall and in the spring. For the spring first-grade and fall second-grade data collections of the ECLS-K:2011, we propose that classroom and special education teachers again be offered $7 per child-level questionnaire. On average, we expect that general classroom teachers will have 6 sampled children in their first-grade and second-grade classrooms, resulting in a total remuneration of $42 per round for participating in each of those data collections. A check for the incentive will be attached to the package of instruments the teacher receives.


NCES began the practice of providing the teacher incentive at the time of questionnaire distribution in the fifth-grade round of the ECLS-K; teachers responded positively to this method, as evidenced by their completing questionnaires on time, resulting in high response rates. We also attribute the high questionnaire response rates achieved in the eighth-grade ECLS-K collection (school administrator at 93.3%; teacher questionnaire at 95.5%; special education teacher questionnaire at 94.2%) in part to the provided incentives. Given our experience with ECLS-K and other school-based, longitudinal studies with high institutional and respondent burden, NCES believes that remuneration is a necessary component of a successful ECLS-K:2011 data collection.


A.9.4 School Coordinators

School coordinators act as the study liaison with their school and, as such, they play a very important role in the ECLS-K:2011. They helped to enroll children in the study and will continue their role beyond the base year by communicating necessary information to parents, notifying teachers and encouraging their participation, arranging the assessment logistics (e.g., space to conduct the assessments), and collecting hard-copy teacher and school administrator questionnaires.7 For this reason, school coordinators will be offered a $25 incentive in each round of data collection. The $25 checks will be attached to the packets mailed to the coordinators in the spring first-grade data collection and then again in the fall second-grade collection. The study offered the same incentive to the school coordinators during the ECLS-K:2011 kindergarten data collection.


A.10 Assurance of Confidentiality

The ECLS-K:2011 plan for ensuring the confidentiality of the project participants conforms with the following federal regulations and policies: the Privacy Act of 1974 (5 U.S.C. 552a), Privacy Act Regulations (34 CFR Part 5b), the Education Sciences Reform Act of 2002 (20 U.S. Code Section 9573), the Computer Security Act of 1987, NCES Restricted-Use Data Procedures Manual, and the NCES Standards and Policies.


All adult respondents who are participating in research under this clearance are informed that the information they provide will be protected from disclosure to the fullest extent allowable under law (20 U.S. Code Section 9573) and that their participation is voluntary. All adult respondents receive an introductory letter that explains NCES’s and the contractor’s adherence to policies on disclosure. This information also appears on the cover of each of the study’s self-administered questionnaires. This information was provided to parents, as the guardians of their children, when their cooperation was sought during the base year of the study. (Recruitment materials were submitted in a previous clearance package, OMB# 1850-0750 v.9.)


Since early spring 2010 (when preparations for the kindergarten data collections began), information about the protection of data from disclosure has been conveyed to state, district, and other school officials at the time their cooperation for the study was sought. As sampled children move to new schools, this information will be provided to the states and districts in which those schools are located, if necessary (i.e., if there are no participating schools in those states and districts already). New schools in the study will receive the letter developed for schools to which sampled children transfer that can be found in Appendix A of this clearance request, as well as the general study materials that were approved in the previous OMB clearance package submitted on 2/2/10 (see Appendix H of that package).


During any in-person or telephone interviewing, respondents will be asked if they received the study’s letter about the upcoming data collection. If the respondent does not recall the letter, the interviewer will summarize the key elements of the data protection assurances: namely, that data will be combined to produce statistical reports; that no data will be published that link the respondent to his/her responses; that participation is voluntary; and that a federal statute protects the data from disclosure to the fullest extent allowable under law (20 U.S. Code Section 9573).


All contractor staff members working on the ECLS-K:2011 project or having access to the data (including monitoring of interviews and assessments) are required to sign the NCES Affidavit of Nondisclosure and a Confidentiality Pledge (Exhibit A-2). They also are required to complete mandatory training on data confidentiality and the safe handling of data. The contractor will keep the original notarized affidavits on file and submit PDF copies of all affidavits to NCES quarterly. In addition, contractor staff will complete background screening in compliance with ACS Directive (OM:5-101).


During the course of data collection, interviewers will be equipped with laptop computers, which store any necessary preloaded data as well as the information collected on a given day during the data collection round (interviewers transmit interview and assessment data daily to the contractor’s home office via a secure, encrypted internet transmission). The interviewers will be instructed to keep the computers and any hard-copy case materials in a secure place in their homes when they are not being used. When in the field collecting interview or assessment data, interviewers are instructed to keep all materials and the computer in their possession at all times. When driving to or from appointments, interviewers will keep the computer and all materials locked out of sight, so as not to provide an inviting opportunity for theft. The interviewers will be instructed to transmit the electronic data for a case to a central database on the same day the case is completed. Data transmitted electronically will be encrypted during transmission.


The laptop configuration will be designed with security and confidentiality considerations in mind. In order to access any of the applications, the interviewer must enter a project-specific password and an interviewer identification code, both of which are checked against encrypted versions of the same data; if the password or interviewer identification code is entered incorrectly repeatedly, the interviewer is “locked out” of the application. All data files will be encrypted on the computer hard disk.


In the event of a hardware failure in the field, the home office will swap the interviewer’s laptop for a new one. The contractor will maintain a supply of “hot spares,” i.e., laptop computers loaded with all necessary ECLS-K:2011 software, which require only the specific interviewer’s identification code and assignment before being sent out.


All mailing of respondent materials, laptops, and hard-copy case materials used by assessors to manage their workload will be done using Federal Express, which has a sophisticated tracking system designed to locate any misdirected packages. All packages will require the recipient’s signature for delivery. To the extent practical, the study name and logo will not be included on hard-copy materials used by field staff to record school or respondent information. In the event of a loss of hard-copy materials, this procedure would make it more difficult for someone who finds the materials to associate a school or respondent with the study.

Finally, all computer assisted interviewing (CAI) applications will have an audit trail of the case data on the hard disk. This is so that if the main data files are corrupted, the data can be reconstructed from the audit trails.


After data collection, all personally identifiable data will be stored on a secure server and password protected with access limited to authorized project staff. Personally identifiable data will also be protected through the coding of responses so that no one individual respondent can be identified (specifically or by deduction) through reported variables in the public access data files. NCES will monitor the conduct of the contractor to ensure that the confidentiality of the data is not breached.



Exhibit A-2. Confidentiality Pledge


EMPLOYEE OR CONTRACTOR’S ASSURANCE OF CONFIDENTIALITY OF SURVEY DATA


Statement of Policy


{Contractor} is firmly committed to the principle that the confidentiality of individual data obtained through {Contractor} surveys must be protected. This principle holds whether or not any specific guarantee of confidentiality was given at time of interview (or self-response), or whether or not there are specific contractual obligations to the client. When guarantees have been given or contractual obligations regarding confidentiality have been entered into, they may impose additional requirements which are to be adhered to strictly.


Procedures for Maintaining Confidentiality


1. All {Contractor} employees and field workers shall sign this assurance of confidentiality. This assurance may be superseded by another assurance for a particular project.

2. Field workers shall keep completely confidential the names of respondents, all information or opinions collected in the course of interviews, and any information about respondents learned incidentally during field work. Field workers shall exercise reasonable caution to prevent access by others to survey data in their possession.

3. Unless specifically instructed otherwise for a particular project, an employee or field worker, upon encountering a respondent or information pertaining to a respondent that s/he knows personally, shall immediately terminate the activity and contact her/his supervisor for instructions.

4. Survey data containing personal identifiers in {Contractor} offices shall be kept in a locked container or a locked room when not being used each working day in routine survey activities. Reasonable caution shall be exercised in limiting access to survey data to only those persons who are working on the specific project and who have been instructed in the applicable confidentiality requirements for that project.

Where survey data have been determined to be particularly sensitive by the Corporate Officer in charge of the project or the President of {Contractor}, such survey data shall be kept in locked containers or in a locked room except when actually being used and attended by a staff member who has signed this pledge.

5. Ordinarily, serial numbers shall be assigned to respondents prior to creating a machine-processible record and identifiers such as name, address, and Social Security number shall not, ordinarily, be a part of the machine record. When identifiers are part of the machine data record, {Contractor’s Manager of Data Processing} shall be responsible for determining adequate confidentiality measures in consultation with the project director. When a separate file is set up containing identifiers or linkage information which could be used to identify data records, this separate file shall be kept locked up when not actually being used each day in routine survey activities.

6. When records with identifiers are to be transmitted to another party, such as for keypunching or key taping, the other party shall be informed of these procedures and shall sign an Assurance of Confidentiality form.

7. Each project director shall be responsible for ensuring that all personnel and contractors involved in handling survey data on a project are instructed in these procedures throughout the period of survey performance. When there are specific contractual obligations to the client regarding confidentiality, the project director shall develop additional procedures to comply with these obligations and shall instruct field staff, clerical staff, consultants, and any other persons who work on the project in these additional procedures. At the end of the period of survey performance, the project director shall arrange for proper storage or disposition of survey data including any particular contractual requirements for storage or disposition. When required to turn over survey data to our clients, we must provide proper safeguards to ensure confidentiality up to the time of delivery.

8. Project directors shall ensure that survey practices adhere to the provisions of the U.S. Privacy Act of 1974, and any additional relevant laws that are specified in the contract, with regard to surveys of individuals for the Federal Government. Project directors must ensure that procedures are established in each survey to inform each respondent of the authority for the survey, the purpose and use of the survey, the voluntary nature of the survey (where applicable), and the effects on the respondents, if any, of not responding.

PLEDGE


I hereby certify that I have carefully read and will cooperate fully with the above procedures. I will keep completely confidential all information arising from surveys concerning individual respondents to which I gain access. I will not discuss, disclose, disseminate, or provide access to survey data and identifiers except as authorized by {Contractor}. In addition, I will comply with any additional procedures established by {Contractor} for a particular contract. I will devote my best efforts to ensure that there is compliance with the required procedures by personnel whom I supervise. I understand that violation of this pledge is sufficient grounds for disciplinary action, including dismissal. I also understand that violation of the privacy rights of individuals through such unauthorized discussion, disclosure, dissemination, or access may make me subject to criminal or civil penalties. I give my personal pledge that I shall abide by this assurance of confidentiality.


Signature


NCES understands the legal and ethical need to protect the privacy of the ECLS-K:2011 survey respondents and, with the contractor, has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the ECLS-K:2011 data when preparing the data files for researchers’ use. This analysis will ensure that NCES has fully complied with the confidentiality provisions contained in 20 U.S. Code, Section 9573. To protect the privacy of respondents as required by 20 U.S. Code, Section 9573, respondents with high disclosure risk will be identified, and a variety of masking strategies will be used to ensure that individuals may not be identified from the data files. These masking strategies include:


  • Swapping data on both the public- and restricted-use files;

  • Omitting key identification variables such as name, address, telephone number, and school name and address from both the public- and restricted-use files (though the restricted-use file will include NCES school ID that can be linked to other NCES databases to identify a school);

  • Omitting key identification variables such as state or ZIP Code from the public-use file;

  • Collapsing categories or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files; and

  • “Topcoding” and “bottomcoding”8 continuous variables in public-use files.


A.11 Sensitive Questions

The ECLS-K:2011 is a voluntary study, and no persons are required to respond to the interviews and questionnaires or to participate in the assessments. In addition, respondents may decline to answer any question they are asked. This voluntary aspect of the survey is clearly stated in the advance letter mailed to adult respondents, the study brochure,9 and the instructions of the hard-copy questionnaires, and it is stressed in interviewer training to ensure that interviewers both communicate it to participants and follow these guidelines. Additionally, assessors are trained that children may refuse to participate at the time of the assessment visit and that the children’s wishes are to be respected.


The following describes the general nature of the national data collection instruments that will be used during the spring first-grade and fall second-grade data collections, as well as topics that may be sensitive for some respondents.


School Administrator Questionnaires. The items in the School Administrator Questionnaire are not of a sensitive nature and should not pose sensitivity concerns to respondents.


Teacher Questionnaires. The information collected in the child-level questionnaires could be regarded as sensitive, because the teacher is asked to supply information about children’s social skills (including ability to exercise self-control, interact with others, resolve conflict, and participate in group activities); problem behaviors (e.g., fighting, bullying, arguing, anger, depression, low self-esteem, impulsiveness); and learning dispositions (e.g., curiosity, self-direction, inventiveness). Because schools often emphasize different skills and concepts, teachers also will be asked to rate the child’s performance in the curricular areas and domains that are included in the cognitive assessments (e.g., language skills; quantitative skills; knowledge of the physical, social, and biological worlds). The purpose of the teacher ratings of children is both to extend the range of domains assessed (e.g., by gathering information about socioemotional development and adaptation to school) and to deepen our understanding of domains by tapping them in multiple ways (e.g., by gathering information on cognitive development that will complement results of the direct assessment).


Within the set of questions about the teacher’s views on school readiness, school climate, and school environment, there are some questions that could be deemed sensitive by some teachers. Teachers may feel that rating statements regarding their satisfaction with their work (e.g., I really enjoy my present teaching job) are sensitive in nature. These items are included because prior research (e.g., Perrachione, Rosser, & Peterson, 2008; Luekens, Lyter, & Fox, 2004; Rhodes, Nevill, & Allen, 2004) indicates that teacher satisfaction may be associated with relevant constructs such as staff retention and stability. Additionally, there is an item asking teachers whether they meet current criteria for being considered “highly qualified” according to the provisions of ESEA/NCLB. This question will inform research about whether and how having a “highly qualified” teacher, as defined by law, is related to positive experiences and outcomes for children. Prior to their participation, teachers will be informed and assured that their information will be protected from disclosure except as required by law and that their responses will not be shared with their employers or the parents of their students.


Direct Cognitive Assessments. The direct cognitive assessments are essential in determining children’s performance levels at the time they start school each year and changes in their performance as they progress through school. Because schools often use different standards in their own assessments of children and a uniform set of assessment instruments and procedures is needed for the ECLS-K:2011, school-developed assessments cannot be used in the ECLS-K:2011. The items to be included in the ECLS-K:2011 direct cognitive assessments undergo a sensitivity review and are not themselves sensitive in nature. However, direct assessments of children do raise certain concerns about the assessment procedures to be used. Of primary concern is the length of the assessments. The cognitive assessments are designed to be administered on average within a 60-minute time period. NCES has developed instruments appropriate to the ages of the participating children, and every effort will be made to staff the study with field assessors who have prior experience in working with children. Issues specific to working with children also figure prominently in assessor training.


Parent Interviews. Several topics that will be addressed in the spring first-grade parent interview could be sensitive in nature for some respondents. Questions about family income, child-rearing and disciplinary practices, children’s disabilities, children’s receipt of tutoring, children’s behavior, parents’ and children’s country of origin, household food sufficiency, welfare use, and contact with a child’s nonresidential parent will be included in the parent interview.


In the fall second-grade parent interview, questions about how many weeks the child was away from home during the summer or where the child was when he/she was away from home could be sensitive depending on whether the reason the child was away was perceived as negative by the respondent (e.g., a child custody agreement). Also, if a parent did not do many activities with the child over the summer (and thinks that he/she should have), the parent could feel negatively about questions on these topics (e.g., if the parent reports that the school gave the child a book list to read over the summer, but the child did not read any books from it). Also in the fall second-grade parent interview, questions about required or suggested summer school, therapy services received during the summer, and participation in summer special education programs could also be considered sensitive by some parents.


These types of topics were included in the ECLS-K, however, and very few parents objected to them. Results from the ECLS-K showed that there were very low levels of missing data in the parent interviews for all items, including the ones mentioned here that are planned for inclusion in the ECLS-K:2011. For example, in ECLS-K Round 2 (i.e., the spring kindergarten wave), response rates for sensitive items such as parent income (94.4%) and marital satisfaction (99.7%) were in the mid-to-high 90s.


Prior research indicates that the topics in the parent interview are correlated with children’s achievement and help to predict children’s preparedness for and success in school. Collecting data on these topics will allow researchers to go beyond descriptive analyses of variation in children’s performance by basic background characteristics such as race/ethnicity and sex. Researchers will be able to test hypotheses about how a wide range of family characteristics relate to early success in school. Therefore, it is important to include questions on the sensitive topics listed above in the parent interviews. Like other study participants, parents will be told that they can refuse to answer any question they wish.


Additionally, because it is imperative that respondents can be found at a later date for follow-up collections in a longitudinal study, the ECLS-K:2011 interview protocol requests locating information from parents. The locating information includes names, addresses, telephone numbers, and email addresses of individuals who would always know the whereabouts of the respondents. Such information may appear sensitive to respondents who may be leery about providing contact information for people they know; again, they will have the option to refuse to answer these questions.


A.12 Estimated Response Burden

The estimated respondent burden for the national spring first-grade and fall second-grade data collections is summarized here and in Table A-6. Included in these estimates, where appropriate, is the time that a respondent would need to gather and compile the data and the clerical time needed to fill out the form.
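
As an illustration of how the burden figures in Table A-6 are derived (a worked example only, using the Spring Teacher Child-level Questionnaire [TQC] row shown later in the table), the number of respondents is the sample size multiplied by the expected response rate, the number of responses is the number of respondents multiplied by the number of instruments per respondent, and the total burden hours are the number of responses multiplied by the hours per instrument:

\begin{align*}
\text{respondents} &= 2{,}710 \times 0.90 = 2{,}439\\
\text{responses} &= 2{,}439 \times 6 = 14{,}634\\
\text{total hours} &= 14{,}634 \times 0.33 \approx 4{,}829
\end{align*}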


The spring first-grade national data collection includes direct cognitive assessments with children; parent interviews; teacher- and classroom-level self-administered questionnaires completed by regular classroom teachers; child-level self-administered questionnaires completed by classroom teachers; teacher-level self-administered questionnaires completed by special education teachers; child-level self-administered questionnaires completed by special education teachers for children receiving special education services; and school administrator self-administered questionnaires.


The total number of respondents for the spring first-grade national data collection (i.e., school administrators, teachers, school coordinators, and parents) is 20,249.10 The spring first-grade parent, teacher, school coordinator, and school administrator respondent burden translates into an estimated cost of $667,429 for 19,357 hours.11 The time children will spend completing the assessments has not been included in the estimated burden.


The fall second-grade national data collection includes direct cognitive assessments and hearing screenings with children, parent interviews, and classroom teacher child-level questionnaires. The total number of respondents for the fall second-grade national data collection (i.e., teachers, school coordinators, children, and parents) is 12,375.12 The fall second-grade teacher, school coordinator, and parent respondent burden translates into an estimated cost of $97,750 for 2,835 hours.12 There are an additional 1,350 burden hours calculated for children in the hearing screening. The time children will spend completing the direct assessments has not been included in the estimated burden.


Table A-6 also includes the previously cleared burden for recruitment and tracking through the spring of second grade, as well as data collection for the fall of first grade. This burden is being carried over from the previously approved package because the data collection, tracking, and recruitment activities will continue beyond the date by which this current submission is expected to be cleared. Total burden hours for the fall first-grade data collection, spring first-grade data collection, fall second-grade data collection, and tracking and recruitment through spring second grade are estimated to be 49,128. This translates into an estimated cost of $1,693,933.12
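
As a check on the total, the 49,128 burden hours equal the sum of the four subtotals reported in Table A-6 (spring first grade, fall second grade, fall first grade, and recruitment and tracking):

\[
19{,}357 + 4{,}185 + 2{,}808 + 22{,}778 = 49{,}128 \text{ hours}
\]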


Table A-7 outlines respondent burden for a future clearance request covering spring second-grade national data collection, tracking and recruitment for spring third grade, and tracking for spring fourth grade. The processes and procedures for respondent tracking are primarily internal and involve little contact with respondents. The table below includes 5 minutes per parent respondent to read the birthday cards we send to children to keep in touch with them and, if necessary, to fill out a change of address card and return it to the data collection contractor. Recruitment burden time includes the time necessary to read study materials sent to parents, teachers, and school administrators; time during which teachers would discuss the study with a data collection staff member; and time the school administrator will take discussing the study with a school recruiter attempting to secure the school’s participation. Burden for the spring second-grade data collection is associated with the parent telephone interview and the teacher and school administrator self-administered questionnaires. Because the parent study participants are expected to be the same across rounds, it would not be accurate to calculate a total sample or total number of respondents as a simple sum of the sample sizes and respondents for each round. Instead, to calculate a total, the table below uses the maximum estimated sample size or number of respondents across all rounds. Specifically, the largest number of parents is expected to be contacted during the spring second-grade national data collection. This is the number used for parents in the calculation of total sample size and total number of respondents.


Table A-6. Respondent burden chart for the national spring first-grade data collection, the fall second-grade data collection, and previously cleared data collection activities

Respondent type | Sample n | Response rate/selection rate | Number of respondents | Hours per instrument | Number of instruments per respondent | Number of responses | Total hours

SPRING FIRST GRADE
Spring Direct Assessment | 16,262 | .90 | 14,636 | 1.00 | 1 | 14,636 | 14,636
Spring Parent Interview | 16,262 | .90 | 14,636 | 0.75 | 1 | 14,636 | 10,977
Spring School Administrator Questionnaire (SAQ) | 1,313 | .90 | 1,182 | 1.00 | 1 | 1,182 | 1,182
Spring Teacher Questionnaire (TQA) | 2,710 | .90 | 2,439 | 0.50 | 1 | 2,439 | 1,220
Spring Teacher Child-level Questionnaire (TQC) | 2,710 | .90 | 2,439 | 0.33 | 6 | 14,634 | 4,829
Spring Special Education Teacher Questionnaire (SPA) | 900 | .90 | 810 | 0.50 | 1 | 810 | 405
Spring Special Education Teacher Child-level Questionnaire (SPB) | 900 | .90 | 810 | 0.33 | 1.9 | 1,539 | 508
School Coordinator assistance (1) | 1,313 | .90 | 1,182 | 0.20 | NA | 1,182 | 236
Spring First Grade Subtotal | 22,498 (2) | NA | 810 (3) | NA | NA | 36,422 (4) | 19,357 (5)

FALL SECOND GRADE
Direct Assessment | 6,000 | .90 | 5,400 | 1.00 | 1 | 5,400 | 5,400
Hearing Screening | 6,000 | .90 | 5,400 | 0.25 | 1 | 5,400 | 1,350
Parent Interview | 6,000 | .90 | 5,400 | 0.25 | 1 | 5,400 | 1,350
Teacher Child-level Questionnaire (TQC) | 1,000 | .90 | 900 | 0.25 | 6 | 5,400 | 1,350
School Coordinator assistance (1) | 750 | .90 | 675 | 0.20 | NA | 675 | 135
Fall Second Grade Subtotal | 13,750 (2) | NA | 6,975 (3) | NA | NA | 16,875 (4) | 4,185 (5)

FALL FIRST GRADE (previously cleared)
Fall Direct Assessment | 6,000 | .90 | 5,400 | 1.00 | 1 | 5,400 | 5,400
Fall Parent Interview | 6,000 | .90 | 5,400 | 0.25 | 1 | 5,400 | 1,350
Fall Teacher Child-level Questionnaire (TQC) | 1,000 | .90 | 900 | 0.25 | 6 | 5,400 | 1,350
School Coordinator assistance (1) | 600 | .90 | 540 | 0.20 | NA | 540 | 108
Fall First Grade Subtotal | 7,600 (2) | NA | 1,440 (3) | NA | NA | 11,340 (4) | 2,808 (5)

RECRUITMENT AND TRACKING (previously cleared)
Tracking for Spring First Grade
Parent | 12,630 | 100% | 12,630 | .084 | 1 | 12,630 | 1,061
School Coordinator | 1,313* | 100% | 1,313* | 1.00 | 1 | 1,313* | 1,313*
Tracking for Fall Second Grade (6)
Parent | 6,000 | 100% | 6,000 | .084 | 1 | 6,000 | 504
School Coordinator | 750* | 100% | 750* | 1.00 | 1 | 750* | 750*
Tracking for Spring Second Grade
Parent | 8,636 | 100% | 8,636 | .084 | 1 | 8,636 | 725
School Coordinator | 2,031 | 100% | 2,031 | 1.00 | 1 | 2,031 | 2,031
Recruitment for Spring First Grade
Parent | 16,262* | 100% | 16,262* | .25 | 1 | 16,262* | 4,066*
Teacher | 2,710* | 100% | 2,710* | .50 | 1 | 2,710* | 1,355*
School Administrator | 1,313* | 100% | 1,313* | 1.00 | 1 | 1,313* | 1,313*
Recruitment for Fall Second Grade (6)
Parent | 6,000 | 100% | 6,000 | .25 | 1 | 6,000 | 1,500
Teacher | 1,000 | 100% | 1,000 | .50 | 1 | 1,000 | 500
School Administrator | 750* | 100% | 750* | 1.00 | 1 | 750* | 750*
Recruitment for Spring Second Grade
Parent | 14,636 | 100% | 14,636 | .25 | 1 | 14,636 | 3,659
Teacher | 2,439 | 100% | 2,439 | .50 | 1 | 2,439 | 1,220
School Administrator | 2,031 | 100% | 2,031 | 1.00 | 1 | 2,031 | 2,031
Recruitment and Tracking Subtotal | 30,599 (7) | NA | 28,849 (7) | NA | NA | 78,501 | 22,778

Study Total (8) |  |  | 38,074 (9) |  |  | 143,138 | 49,128

* Estimates have been updated from previous submission based on the sample sizes obtained in the kindergarten rounds of data collection.

NA Not applicable

1 School coordinators are school staff members who help organize the logistics for the assessment visit. They do not complete a study instrument.

2 Total sample n represents the total sample size with no duplication on the number of listed instruments each respective respondent is asked to complete. One teacher completes both TQA and TQC. One special education teacher completes both SPA and SPB. The sample of students taking the direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting. The sample of students for the hearing screening is included in the estimates for fall second grade because the students will be asked a short set of questions during the screening.

3 Total number of respondents represents the total number of respondents with no duplication on the number of listed instruments each respective respondent is asked to complete. One teacher completes both TQA and TQC. One special education teacher completes both SPA and SPB. The sample of students taking the direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting. The sample of students for the hearing screening is included in the estimates for fall second grade because the students will be asked a short set of questions during the screening.

4 Total number of responses represents the total number of respondents * the total number of instruments they fill out. The responses to the direct assessment are not included in this count because it is not subject to the Paperwork Reduction Act reporting.

5 The sample of students taking the direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting.

6 Reflects a smaller sample due to a planned subsampling.

7 The total sample size represents the maximum total possible. It is expected that the parent respondent will be the same at all rounds, so the largest n for parents (recruitment for spring first grade) is used in the calculation of the total. Sample sizes for teachers, school administrators, and school coordinators at each round all contribute to the total.
8 Includes previously approved burden and new burden for which approval is currently being sought.

9 Shaded numbers contribute to the calculation of the total. Total represents the maximum total possible with no duplication. It is expected that the parent respondent will be the same at all rounds, so the largest n for parents (recruitment for spring first grade) is used in the calculation of the total. Sample sizes for school staff in the spring first-grade and fall second-grade national data collections who are also included in the recruitment and tracking activities for these rounds do not contribute to the total to avoid counting the same person twice. For example, the school coordinator n for spring first-grade recruitment (1,313) contributes to the total but the school coordinator n for the spring first-grade data collection (1,182) does not. One teacher completes both TQA and TQC. One special education teacher completes both SPA and SPB. The sample of students taking the direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting. The sample of students for the hearing screening is included in the estimates for fall second grade because the students will be asked a short set of questions during the screening.

NOTE: Information in the table that appears in gray text (i.e., burden for the fall first-grade data collection and all tracking and recruitment activities) pertains to activities and burden that were approved in a previously cleared package. It is included here because these activities will continue after this current submission is cleared. As noted above, some of these estimates have been revised based on the sample sizes obtained in the kindergarten rounds of data collection.

Table A-7. Estimated respondent burden for future clearance package related to recruitment for the spring third-grade data collection, sample tracking for the spring third-grade and spring fourth-grade data collections, and the spring second-grade national data collection

Respondent type | Sample n | Response rate | Number of respondents | Hours per instrument | Number of instruments per respondent | Number of responses | Total hours

Tracking for Spring Third Grade
Parent | 13,534 | 100% | 13,534 | .084 | 1 | 13,534 | 1,137
School Coordinator | 3,211 | 100% | 3,211 | 1.00 | 1 | 3,211 | 3,211
Tracking for Spring Fourth Grade
Parent | 12,515 | 100% | 12,515 | .084 | 1 | 12,515 | 1,051
School Coordinator | 4,011 | 100% | 4,011 | 1.00 | 1 | 4,011 | 4,011
Recruitment for Spring Third Grade
Parent | 13,534 | 100% | 13,534 | .25 | 1 | 13,534 | 3,384
Teacher | 2,256 | 100% | 2,256 | .50 | 1 | 2,256 | 1,128
School Administrator | 3,211 | 100% | 3,211 | 1.00 | 1 | 3,211 | 3,211
Spring Second Grade National Data Collection
Spring Direct Assessment | 14,636 | .90 | 13,172 | 1.00 | 1 | 13,172 | 13,172
Spring Parent Interview | 14,636 | .90 | 13,172 | 0.75 | 1 | 13,172 | 9,879
Spring School Administrator Questionnaire (SAQ) | 2,031 | .90 | 1,828 | 1.00 | 1 | 1,828 | 1,828
Spring Teacher Questionnaire (TQA) | 2,439 | .90 | 2,195 | 0.50 | 1 | 2,195 | 1,098
Spring Teacher Child-level Questionnaire (TQC) | 2,439 | .90 | 2,195 | 0.33 | 6 | 13,170 | 4,346
Spring Special Education Teacher Questionnaire (SPA) | 900 | .90 | 810 | 0.50 | 1 | 810 | 405
Spring Special Education Teacher Child-level Questionnaire (SPB) | 900 | .90 | 810 | 0.33 | 1.9 | 1,539 | 508
School Coordinator assistance (1) | 2,031 | .90 | 1,828 | 0.20 | NA | 1,828 | 366
Study Total | 34,726 (2) | NA | 32,884 (3) | NA | NA | 86,814 (4) | 35,563

NA Not applicable

1 School coordinators are school staff members who help organize the logistics for the assessment visit. They do not complete a study instrument.

2 Total sample size represents the maximum total possible. It is expected that the parent respondent will be the same at all rounds, so the largest n for parents (spring second-grade national data collection) is used in the calculation of the total. Total sample n also represents the total sample size with no duplication on the number of listed instruments each respective respondent is asked to complete. One teacher completes both TQA and TQC. One special education teacher completes both SPA and SPB. The sample of students taking the direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting.

3 Total number of respondents counts each respondent only once, even when that respondent is asked to complete more than one of the listed instruments. It is expected that the parent respondent will be the same at all rounds, so the largest number of parent respondents across the listed activities (13,534) is counted once in the total. The sample of students taking the direct assessment is not included in this count because it is not subject to Paperwork Reduction Act reporting requirements.

4 Total number of responses represents the total number of respondents multiplied by the number of instruments each respondent completes. The sample of students taking the direct assessment is not included in this count because it is not subject to Paperwork Reduction Act reporting requirements.
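
As an illustration of how the entries in table A-7 combine, the Spring Teacher Child-level Questionnaire (TQC) row works out as follows:

2,439 teachers sampled × 90% expected response rate ≈ 2,195 respondents
2,195 respondents × 6 instruments per respondent = 13,170 responses
13,170 responses × 0.33 hours per instrument ≈ 4,346 total burden hours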

A.13 Estimates of Cost

There are no costs to respondents to participate beyond the time needed for school staff to serve as a liaison between the study and the school, for parents to answer the interview questions, for teachers and school administrators to complete the questionnaires, and for children to participate in the assessments. No equipment, printing, or postage charges will be incurred by the participants.


A.14 Annualized Cost to the Federal Government

This information collection activity has been developed in performance of NCES contract ED-IES-10-C-0048. The period of performance for this ECLS-K:2011 contract, which includes the sample tracking procedures through the spring second-grade data collection and the spring first-grade, fall second-grade, and spring second-grade national data collections, runs from August 2010 through August 2014. The total cost to the government for contractor and subcontractor costs is $30,051,907. This estimate includes sample tracking activities, all data collection activities from spring first grade through spring second grade, design enhancements, and data file delivery and documentation. The total cost of the study has increased relative to the costs described in the last approved OMB package for the fall first-grade data collection because the contract option to conduct the hearing screenings in the fall second-grade data collection is being exercised. Table A-8 provides the study costs by year of the contract.


Table A-8. Study costs per year


Year | Amount
2010 | $477,599
2011 | $2,280,231
2012 | $14,704,916
2013 | $11,645,183
2014 | $943,978
Total | $30,051,907



A.15 Reasons for Changes in Response Burden and Costs

The burden requested for this collection is higher than the burden last approved under OMB# 1850-0750 because the scope of the two requests differs. The last approval covered the fall first-grade data collection (conducted with a subsample of the base-year sample) and recruitment and tracking from fall first grade through spring second grade. This request covers the spring first-grade data collection (conducted with the full sample), the fall second-grade data collection (conducted with the subsample), the carryover of burden for the fall first-grade data collection (subsample), and recruitment and tracking from spring first grade through spring second grade.


There are also some changes to the recruitment and tracking burden shown in table A-6 above compared with the burden reported in the previous OMB submission, reflecting differences between the numbers of parents, teachers, and school administrators we previously estimated we would attempt to recruit for the spring first-grade data collection and the numbers we are now estimating (table A-6). In the previously approved clearance package, we estimated we would recruit 18,630 parents; we now estimate 16,262. The reduction reflects a smaller student sample size in the kindergarten rounds of data collection than was expected. In the previously approved clearance package, we estimated we would recruit 3,105 teachers; we now estimate 2,710 regular classroom teachers and 900 special education teachers, for a total of 3,610. The prior estimate did not include special education teachers, and the change also reflects an expected reduction based on the smaller-than-expected student sample size. Lastly, in the previously approved clearance package, we estimated we would recruit 1,082 school administrators; we now estimate 1,313. This higher estimate reflects a slightly larger-than-expected school sample size and greater-than-expected sample dispersion. In addition, there is one difference between the estimates provided in the previous clearance request for recruitment for fall second grade and the current estimates in table A-6: in the previously approved package, we estimated we would recruit 600 school administrators; we now estimate 750, to account for greater sample dispersion.


A.16 Publication Plans and Time Schedule

Publications relevant to the data collection will be part of the reports resulting from the spring first-grade, fall second-grade, and second-grade field test data collections. Data files with data from the spring first-grade and fall second-grade collections will be produced and made available to researchers in a public-use format. Restricted-use data files will also be produced from the spring first-grade and fall second-grade collections. Researchers who are approved by NCES's data confidentiality office for a restricted-use license can access the restricted-use data files, which include more sensitive items and items that pertain to smaller numbers of children (e.g., information about the presence of specific disabilities). To be approved for a restricted-use license, researchers must demonstrate that they have a research question that cannot be answered with the public-use data and that they have the infrastructure to keep the data secure to prevent loss or unauthorized use. Codebooks and user’s manuals will be produced for use with the public- and restricted-use data files. All data will be merged at the child level. Data files will include all instrument variables (except those that gather directly identifying information, such as the names of household members) and any relevant associated variables, such as composites or assessment scores. Data will be released through Electronic Code Book (ECB) software that allows users to create customized data files for standard statistical software packages (SPSS, SAS, and Stata) and to view codebook information. A file record layout will also be provided so that analysis packages other than SPSS/PC, SAS/PC, and Stata/PC (e.g., analysis packages for Apple computers) can be used to analyze the ECLS-K:2011 data.


The ECLS-K:2011 reports and publications will include detailed methodological reports describing all aspects of the data collection effort and psychometric reports outlining the properties of the assessment instruments, as well as reports that describe the population of children who were kindergartners in the 2010-11 school year as they progress through school.


The operational schedule for the ECLS-K:2011 spring first-grade and fall second-grade data collections is shown in table A-9, along with the operational schedule for the fall first-grade data collection and the tracking and recruitment activities through the spring of second grade, which were approved in the most recently submitted clearance request.

A.17 Approval for Not Displaying the Expiration Date for OMB Approval

No exemption from the requirement to display the expiration date for OMB approval of the information collection is being requested for the ECLS-K:2011.



A.18 Exceptions to the Certification Statement

No exceptions to the certification statement identified in item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I apply to the ECLS-K:2011.


Table A-9. Operational schedule for ECLS-K:2011 data collection activities


Activity | Start date | End date

ECLS-K:2011 Spring First-Grade Data Collection
Identify and subsample movers | 8/29/2011 | 12/15/2011
Print/program assessment | 4/15/2011 | 7/20/2011
Print/program questionnaires | 9/1/2011 | 12/20/2011
Train data collectors | 3/1/2012 | 3/16/2012
Spring data collection | 2/24/2012 | 7/15/2012
Process data | 3/15/2012 | 8/15/2012
Construct data files, user’s manual | 8/15/2012 | 10/25/2013
Methodology/psychometric reports | 8/6/2011 | 1/11/2014

ECLS-K:2011 Fall Second-Grade Data Collection
Print/program assessment | 4/15/2012 | 7/20/2012
Print/program questionnaires | 3/1/2012 | 7/20/2012
Train data collectors | 6/1/2012 | 8/16/2012
Fall data collection | 8/9/2012 | 12/30/2012
Process data | 9/15/2012 | 1/15/2013
Construct data files, user’s manual | 8/15/2012 | 10/25/2013
Methodology/psychometric reports | 8/6/2011 | 1/11/2014

ECLS-K:2011 Fall First-Grade Data Collection
Select school sample | 12/1/2010 | 1/15/2011
Print/program assessment | 4/15/2011 | 7/20/2011
Prepare/print/program questionnaires | 3/1/2011 | 7/20/2011
Train data collectors | 6/1/2011 | 8/16/2011
Fall data collection | 8/9/2011 | 12/30/2011
Process data | 9/15/2011 | 1/15/2012
Construct data files, user’s manual | 8/15/2011 | 10/25/2012
Methodology/psychometric reports | 8/6/2010 | 1/11/2013

Sample Tracking for First-Grade Data Collection
Mail birthday cards | 6/1/2011 | 6/1/2012
Preassessment call | 8/9/2011 | 12/20/2011
Tracking movers and updating field management system | 8/9/2011 | 12/20/2011
Parent, teacher, school administrator, school coordinator mailings | 2/15/2012 | 4/16/2012
Spring first-grade data collection | 3/15/2012 | 6/30/2012

Sample Tracking for Second-Grade Data Collection
Mail birthday cards | 6/1/2012 | 6/1/2013
Preassessment call | 8/9/2012 | 12/20/2012
Tracking movers and updating field management system | 8/9/2012 | 12/20/2012
Parent, teacher, school administrator, school coordinator mailings | 2/15/2013 | 4/16/2013
Spring second-grade data collection | 3/15/2013 | 6/30/2013

NOTE: Information in the table that appears in gray text (i.e., the fall first-grade data collection and all tracking and recruitment activities) pertains to activities and burden that were approved by OMB in the previous package. It is included here because the burden for these activities is being carried over, since the activities will continue after this submission is cleared.


1 Throughout this package, reference is made to the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99. For ease of presentation, it will be referred to as the ECLS-K. The new study for which this submission requests approval is referred to as the ECLS-K:2011.

2 Because the ECLS-K:2011 is a longitudinal study with many data collections spaced relatively close together (one to two data collections per year, every year, from kindergarten through fifth grade), NCES anticipates submitting a clearance package in 2012 to request approval for additional rounds of the study. In order to meet the tight development and data collection schedules, and because the recruitment materials, interviews, and questionnaires that will be used in the next submissions are not expected to change significantly from those already cleared, we request a waiver of the 60-day Federal Register notice for the next OMB clearance package, which will be submitted in June of 2012. In that package, we will seek clearance for (1) the Spring 2013 second-grade data collection; (2) recruitment for the Spring 2014 third-grade data collection; (3) tracking students for the Spring 2014 third-grade and Spring 2015 fourth-grade data collections; and (4) a 60-day Federal Register notice waiver for the Spring 2014 third-grade data collection and recruitment for the Spring 2015 fourth-grade data collection. The spring 2013 second-grade collection will include a child assessment, teacher questionnaires for both regular classroom and special education teachers, a school administrator questionnaire, and a parent interview. The instruments for this collection will be updated to be grade-appropriate from the instruments used in the kindergarten 2010-11 and spring 2012 first-grade collection rounds. The recruitment procedures and materials will be used to (a) contact districts, schools, and parents to remind them of the next waves of the ECLS-K:2011 study, and (b) recruit new schools to which ECLS-K:2011 sample children have transferred. The recruitment and tracking materials and methods will be updated to be grade-appropriate from those approved for the kindergarten, first-grade, and second-grade collections. The estimated burden for the spring 2013 second-grade data collection and for recruitment and tracking for the Spring 2014 third-grade and Spring 2015 fourth-grade data collections is provided in table A-7.

3 At each follow-up stage, a small percentage of children had been retained in a grade at some point prior to the wave of interest and therefore were in a grade lower than the target grade for that follow-up. In addition, a small number of children were found to have advanced to a higher grade. These off-grade students were not excluded from the study.

4 There may be one change in the assessment protocol instituted for the fall second-grade data collection. Specifically, NCES is exploring the possibility of including a computerized version of the Dimensional Change Card Sort (DCCS) task, which measures children's executive functioning and was administered in a non-computerized (i.e., physical) version in the kindergarten and first-grade rounds. In the physical version, children sort cards into trays based on rules provided to them by the assessor. The task is identical in the computerized version, except that the instructions are provided by the computer and children sort the cards on the computer screen. The reason for this switch is that capturing response time becomes more important for measuring cognitive flexibility as children get older. We will submit a clearance request for a small field test of the computerized DCCS separately under 1850-0803, NCES’s cognitive laboratory clearance. The field test will occur in February and March of 2012.

5 In kindergarten, the science assessment had just one stage.

6 Remuneration will not be provided to schools into which study children have transferred between kindergarten and first grade if those schools are not attended by at least four ECLS-K:2011 study children.

7 The school coordinator will often be the same school staff member from the kindergarten data collection. If that person is not available, then a new staff member will be identified by the school administrator to act as a liaison to the study.

8 Topcoding and bottomcoding refer to the process of recoding outlier values to an acceptable end value. For instance, everyone with a personal income higher than $200,000 may be recoded to "$200,001 or more" to eliminate the outliers. (An illustrative sketch of this recoding appears after these notes.)

9 The study brochure was approved in a previous OMB clearance package (OMB No. 1850-0750).

10 Schools are asked to assign a staff member to help coordinate the assessment activities at the school; these school coordinators are counted in the total number of respondents, and their burden hours are counted, but they do not complete any study instruments.

11 An hourly rate of $34.48 was used to translate teacher response time into a dollar amount. This rate is based on the National Compensation Survey. See U.S. Department of Labor (2007). National Compensation Survey: Occupational Wages in the United States, June 2006.

12 Schools are asked to assign a staff member to help coordinate the assessment activities at the school; these school coordinators are counted in the total number of respondents, and their burden hours are counted, but they do not complete any study instruments. Children are included as respondents for this round of data collection because they will be asked a short set of questions at the beginning of the hearing screening.
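
The following minimal sketch illustrates the kind of recoding described in note 8. It is offered only as an illustration: the Python code, the function name, and the thresholds (which mirror the income example in note 8) are hypothetical and do not represent any actual ECLS-K:2011 data processing specification.

# Illustrative sketch of topcoding and bottomcoding (hypothetical thresholds).
# Values above the top code are collapsed into a single end category
# (reported as "$200,001 or more"); values below the bottom code are raised to it.
def topcode_bottomcode(values, top=200_000, bottom=0):
    recoded = []
    for v in values:
        if v > top:
            recoded.append(top + 1)   # e.g., reported as "$200,001 or more"
        elif v < bottom:
            recoded.append(bottom)
        else:
            recoded.append(v)
    return recoded

# Example: the $350,000 outlier is recoded to 200,001; the negative value is raised to 0.
print(topcode_bottomcode([45_000, 350_000, -5_000, 120_000]))
# Output: [45000, 200001, 0, 120000]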



