
Early Childhood Longitudinal Study, Kindergarten Class of 2010-11

OMB: 1850-0750


Part A. Justification



A.1 Circumstances Making Collection of Information Necessary

A.1.a Purpose of This Submission

The Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011) is a survey that focuses on children’s early school experiences beginning with kindergarten and continuing through the fifth grade. It includes interviews with parents, teachers, school administrators, and nonparental care providers, as well as direct child assessments. Like its sister study, the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 (ECLS-K),1 the ECLS-K:2011 is exceptionally broad in its scope and coverage of child development, early learning, and school progress, drawing together information from multiple sources to provide rich data about the population of children who will be kindergartners in the 2010-11 school year. As with the original ECLS-K, the ECLS-K:2011 is sponsored by the National Center for Education Statistics (NCES) within the Institute of Education Sciences (IES) of the U.S. Department of Education. Collections in the kindergarten year will be conducted for NCES by Westat, with the Educational Testing Service (ETS) as subcontractor.


ECLS-K:2011 is the third in an important series of longitudinal studies of young children sponsored by the U.S. Department of Education (ED) that examine child development, school readiness, and early school experiences. It shares many of the same goals as its predecessors, the ECLS-K and the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B), but also advances research possibilities by providing updated information and addressing recent changes in education policy:


  • Like its predecessors, ECLS-K:2011 will provide a rich and comprehensive source of information on children’s early learning and development, transitions into kindergarten and beyond, and progress through school for a new cohort of children;

  • ECLS-K:2011 will provide data relevant to emerging policy-related domains not measured fully in previous studies; and

  • Coming more than a decade after the inception of the ECLS-K, ECLS-K:2011 will allow cross-cohort comparisons of two nationally representative kindergarten classes experiencing different policy, educational, and demographic environments.

Clearances for studying the first ECLS-K cohort were granted in 1996 for the kindergarten data collection (OMB No. 1850-0719), in 1998 for the first grade through fifth grade data collections (OMB No. 1850-0750), and subsequently for the spring 2006 field test data collection with eighth and tenth grade students and their teachers and the spring 2007 national data collection for eighth graders (OMB No. 1850-0750). The current submission provides a summary of the results of the ECLS-K:2011 field test approved on March 20, 2009 (OMB No. 1850-0750 v.5) and seeks approval for the full-scale kindergarten year data collections. The changes between the approved field test OMB clearance package and this package for the full-scale collection are outlined in the attached ECLS-K 2011 Memo Outlining FS Changes.doc and ECLS-K 2011 Memo Outlining FS Changes Attachment.xlsx files. This submission, for which the 60-day Federal Register notice was waived in the previous clearance, describes the final procedures and instruments planned for the full-scale kindergarten data collections of the ECLS-K:2011. It also seeks a waiver of the 60-day Federal Register notice for the next two anticipated OMB clearance packages described below.


Because the ECLS-K:2011 is a longitudinal study with many data collections spaced relatively close together (one or two data collections per year, every year, from kindergarten through fifth grade), NCES anticipates submitting another clearance package during 2010 to request approval for additional components of the study. In order to meet the tight development and data collection schedules, and because the recruitment materials and background questionnaires that will be used in the submission outlined below are not expected to change significantly from those already cleared, we request a waiver of the 60-day Federal Register notice for the following submission:

  1. The Fall 2011 first grade national data collection to be conducted with a subsample of the total ECLS-K:2011 sample, and recruiting and tracking of respondents for the first and second grade collections. Similar to the fall first grade data collection in the ECLS-K (1998-99), this collection will include children in a 30 percent subsample of the ECLS-K:2011 schools (n = approximately 300), or approximately 6,000 children. A primary purpose of this collection will be to obtain information about children’s summer experiences to examine summer learning and summer learning loss. The collection will include a child assessment, teacher questionnaires, and a parent interview. The collection also will include vision and hearing screenings, which were field tested in the base year field test. NCES will use the recruitment materials already cleared for the kindergarten collection on 3/20/09, updating them only for the first grade collections, as well as the tracking methods described in the cleared kindergarten package. Similarly, the instruments for the Fall 2011 first grade collection will be updated from the instruments used in the kindergarten 2010-11 rounds as well as the ECLS-K (1998-99) fall first grade collection (available at http://nces.ed.gov/ecls/kinderinstruments.asp) in order to ensure that the items are grade-appropriate and capture information on summer learning. Work is currently underway to develop optimal protocols for the hearing and vision screenings. That submission will outline the revised protocols and any changes to previously approved respondent materials that may be necessary. The revised protocols will be a subset of those used in the field test; we will not propose anything that was not previously approved and tested. Also, as part of the change in protocol, the screening time will be reduced from the 30 minutes originally proposed just for hearing to 15 minutes for both vision and hearing; thus, the burden per respondent will not be greater than the burden previously approved.


The estimated burden for these activities is given in table A-6. With regard to the recruitment materials and burden of recruitment for the Fall 2011 and Spring 2012 first grade data collections, this package will describe the procedures for, and include the materials that will be used in, (a) contacting states and districts to remind them of the next waves of the ECLS-K:2011 study, (b) reminding schools and parents, and (c) recruiting new schools to which ECLS-K:2011 kindergartners have transferred. The package also will request approval for tracking students for the Fall 2011 and Spring 2012 first grade data collections and the second grade data collections (i.e., the optional Fall 2012 and planned Spring 2013 collections). The recruitment materials will be very similar, and the tracking procedures identical, to those approved for the kindergarten data collections.


A.1.b Legislative Authorization

ECLS-K:2011 is conducted by NCES in close consultation with other offices and organizations within and outside the U.S. Department of Education. ECLS-K:2011 is authorized by law under the Education Sciences Reform Act of 2002 (P.L. 107-279), section 153 (7):


“The Statistics Center shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including -- (7) conducting longitudinal and special data collections necessary to report on the condition and progress of education;”

Section 153 of the Education Sciences Reform Act of 2002 further states that:


“all collection, maintenance, use, and wide dissemination of data by the Institute, including each office, board, committee, and Center of the Institute, shall conform with the requirements of section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, as amended, and sections 1232g and 1232h of this title” [which protect the confidentiality rights of individual respondents with regard to the data collected, reported, and published under this title to the fullest extent allowable under law]. (Section 153)

A.1.c Prior Related Studies

The ECLS-K:2011 is part of a longitudinal studies program. The two prior ECLS studies pertain to two cohorts—a kindergarten cohort and a birth cohort. Together these cohorts provide the range and breadth of data required to more fully describe and understand children’s health and early learning, development, and education experiences in the late 1990s and 2000s.


The birth cohort (ECLS-B) followed a national sample of children, born in the year 2001, from birth through kindergarten entry. The ECLS-B focused on the characteristics of children and their families that influence children’s school readiness and first experiences with formal schooling, as well as children’s early health and in- and out-of-home experiences.


The ECLS‑K followed a nationally representative cohort of children from kindergarten through eighth grade. The base year data were collected in the fall and spring of the 1998-99 school year when the sampled children were in kindergarten. A total of 21,260 kindergartners throughout the nation participated by having a child assessment and/or parent interview conducted during that school year. Five more waves of data were collected: in fall and spring of the 1999-2000 school year when most, but not all, of the base year children were in first grade; in the spring of the 2001-02 school year when most, but not all, of the base year children were in third grade; in the spring of the 2003-04 school year when most, but not all, of the base year children were in fifth grade; and in the spring of the 2006-07 school year when most, but not all, of the base year children were in eighth grade.2



A.1.d ECLS-K:2011 Study Design

National Data Collection

The sample for the ECLS-K:2011 will be a nationally representative sample of children attending kindergarten in 2010-11. The sample will include children in kindergarten for the first time and children repeating kindergarten. In the fall of 2010, children will be selected using a multistage probability design. In the first stage, 90 primary sampling units (PSUs) that are counties or groups of counties will be selected with probability proportional to size (PPS). In the second stage, public and private schools offering kindergarten will be selected, also with PPS and with an oversampling of private schools. The third-stage sampling units will be children in kindergarten or of kindergarten age (approximately 5 years old) in ungraded schools or classrooms. Children will be selected within each sampled school using equal probability systematic sampling, with a higher sampling rate for Asians, Native Hawaiians, and Other Pacific Islanders, who will be oversampled as one group so as to achieve a minimum required sample size for them. Further discussion of sampling issues can be found in section B.1, Universe, Sample Design, and Estimation.
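
To illustrate the multistage design described above, the following sketch (in Python) shows probability-proportional-to-size selection of PSUs and schools followed by within-school selection with a higher rate for an oversampled subgroup. It is a simplified, hypothetical illustration only: the measures of size, oversampling factors, and sampling rates shown are placeholders and are not the study’s actual sampling parameters.

    import numpy as np

    rng = np.random.default_rng(2011)

    def pps_systematic(measure_of_size, n_select):
        """Systematic PPS selection: units with larger measures of size have
        proportionally higher selection probabilities (illustrative only)."""
        cum = np.cumsum(np.asarray(measure_of_size, dtype=float))
        interval = cum[-1] / n_select
        points = rng.uniform(0, interval) + interval * np.arange(n_select)
        return np.searchsorted(cum, points)  # indices of the selected units

    # Stage 1: select 90 PSUs (counties or county groups) PPS by a size measure.
    psu_size = rng.integers(1_000, 50_000, size=3_000)  # hypothetical measures of size
    sampled_psus = pps_systematic(psu_size, 90)

    # Stage 2: within a sampled PSU, select schools PPS, oversampling private
    # schools by inflating their measure of size (the factor is a placeholder).
    school_enrollment = rng.integers(20, 120, size=200)
    is_private = rng.random(200) < 0.25
    sampled_schools = pps_systematic(school_enrollment * np.where(is_private, 2.5, 1.0), 10)

    # Stage 3: within a school, select children with equal probability, applying
    # a higher rate to the oversampled group (a Bernoulli approximation of
    # systematic selection, for brevity; rates are placeholders).
    def sample_children(n_children, in_oversampled_group, base_rate=0.25, oversample_rate=0.60):
        rates = np.where(in_oversampled_group, oversample_rate, base_rate)
        return np.flatnonzero(rng.random(n_children) < rates)

    sampled_children = sample_children(60, rng.random(60) < 0.05)
    print(len(sampled_psus), len(sampled_schools), len(sampled_children))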


The national kindergarten data collection will include direct child assessments, height and weight measurement, parent interviews, school administrator and teacher (both regular classroom and special education teachers) questionnaires, and wrap-around early care and education provider questionnaires.3 Data will be collected twice, once in the fall and once in the spring, though not all components will be included in each collection (for example, the school administrator questionnaire will be included only in the spring).4 As in the ECLS-K, computer-assisted interviewing (CAI) will be the mode of data collection for the child assessment and the parent interview. School administrator, teacher, and child care provider data will be collected via self-administered questionnaires.


Direct Child Assessments. As in ECLS-K, a direct cognitive assessment will be used in ECLS-K:2011. The assessment measures the cognitive domains of reading, mathematics, and executive functioning (described further below in section A.2.1.c) in both fall and spring of the kindergarten year using age- and grade-appropriate items. A brief assessment of children’s science knowledge will be included in the spring of the kindergarten year. The cognitive assessment will be administered directly to the sampled children on an individual basis. The structure of the ECLS-K:2011 kindergarten assessment will be two-stage, the same as the ECLS-K base year assessment. A majority of items in the ECLS-K:2011 reading and mathematics assessments will be the same as those used in the ECLS-K base year assessment in order to enable researchers to conduct cross-cohort analyses. While a science assessment was fielded in the ECLS-K, it was first fielded in third grade, so a new assessment appropriate for younger children needed to be developed for the ECLS-K:2011. Assessment of executive functioning is a new feature of the ECLS-K:2011, compared to ECLS-K; the ECLS-K:2011 will employ assessments of executive functioning that have been developed and tested by other researchers. In addition, the ECLS-K:2011 child assessments will include measures of the children’s height and weight.


All children, regardless of home language, will first be administered two subscales of the preLAS2000 to act as a warm-up for the cognitive assessments and to assess their level of basic English proficiency. The subscales of the preLAS2000 were fielded in the ECLS-K and assess children’s listening comprehension, understanding of spoken English, expressive language, and vocabulary. All children will then be administered an assessment of English basic reading skills (EBRS). Children who achieve a minimum score on the preLAS2000 will continue with the remaining reading items and the math assessment in English. Children taking the cognitive assessments in reading and math in English will be administered a routing test (the science assessment in spring 2011 will be a single-stage test composed of approximately 15-20 items). The EBRS comprises the first portion of the reading routing test. Performance on the routing test will determine which one of three second-stage tests will be appropriate for the child’s skill level. Spanish-speaking language minority children (those children with a primary home language that is not English) who do not achieve a minimum score on the preLAS2000 will be administered a short test of their basic reading skills in Spanish (SBRS) and a math assessment in Spanish, and they will have their height and weight measured. These children will not be administered the science or executive function assessments. Non-Spanish-speaking language minority children who do not achieve a minimum score on the preLAS2000 will receive no direct assessments other than height and weight measurement.
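
The assessment flow just described follows a branching protocol that can be summarized in a short sketch. The Python function below is a hypothetical illustration of that routing logic; the component labels, the home-language check, and the cutoff value are placeholders rather than the study’s operational specifications.

    def route_kindergarten_assessment(home_language, prelas_score, prelas_cutoff=16):
        """Illustrative sketch of the kindergarten assessment routing described
        above; component names and the cutoff are hypothetical placeholders."""
        # Administered to every sampled child regardless of home language.
        components = ["preLAS2000 (two subscales)", "EBRS", "height/weight"]

        if prelas_score >= prelas_cutoff:
            # Children meeting the screener minimum continue with the remaining
            # English reading items (routing test plus one of three second-stage
            # forms), English math, science (spring only), and executive function.
            components += ["English reading (routing + second stage)", "English math",
                           "science (spring only)", "executive function"]
        elif home_language == "Spanish":
            # Spanish-speaking language-minority children below the cutoff.
            components += ["SBRS", "Spanish math"]
        # Non-Spanish-speaking language-minority children below the cutoff receive
        # no further direct assessments beyond height and weight measurement.
        return components

    # Example: a Spanish-speaking child who does not reach the screener cutoff.
    print(route_kindergarten_assessment("Spanish", prelas_score=3))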


The science assessment, Spanish basic reading skills assessment, and English basic reading skills assessment were all tested in the field test conducted in fall 2009 because these are new components in the ECLS studies. (The language screener was previously fielded in the ECLS-K.) The results from the field test, which are described in section A.1.e, were favorable; as a result, these components will be included in the national data collection. Screenings of children’s hearing and vision also were tested in the field test. The field test results and protocols for the national data collection will be detailed in a future clearance package.


Parent Interviews. A parent interview will be administered in fall and spring to all participating parents/guardians of the children in the ECLS-K:2011 study. The interviews will be developed in English and Spanish; professional interpreters will be used to administer the English-language version to parents who speak neither English nor Spanish (i.e., the interpreters will translate from English to the parent's native language during the interview). Across the two waves of kindergarten data collection, the parent instruments will cover the following topics: family structure, family literacy practices, parental involvement in school, nonparental care arrangements, food security, parent health and well-being, household composition, languages used in the household, family income and receipt of public assistance, parent education levels, parent employment, and other demographic indicators. Parents will also be asked to report on their children’s level of physical functioning and activity, socioemotional functioning, health, and disability status. The parent interview includes the same types of questions that have been previously fielded in the ECLS-K and other NCES studies (e.g., ECLS-B, the National Household Education Surveys Program (NHES), the Education Longitudinal Study of 2002 (ELS:2002), and the National Education Longitudinal Study of 1988 (NELS:88)).


Teacher Questionnaires. The general classroom teachers of sampled children will be asked to complete the classroom teacher questionnaire. The instrument includes questions about the teacher’s own background and education, class materials, teaching practices, and specific information about the topics and skills taught in the classroom. The questionnaire provides information on the types of materials being used to teach the nation’s kindergartners, what and how they are being taught, the characteristics of their classrooms, and the background and experience of their teachers.


Classroom teachers also will be asked to complete child-level questionnaires for each of the sampled children in their classroom. The questionnaires will contain items about children’s skills in the areas of language and literacy, mathematics, science, and executive functioning; children’s social skills and behaviors; and program placements and special services that each child may receive. These data obtained from teachers can be compared to the results of direct assessments administered to the sampled kindergartners. They also provide broader information about children’s skills and behaviors than that which can be ascertained through the 60-minute direct child assessment. As results from additional years of collection become available, a picture of children’s skills over time can be developed using both teacher reports and direct cognitive assessment results.


The rating scales of language and literacy and mathematics that will be included in the ECLS-K:2011 child-level questionnaires were used in the ECLS-K. Two scales tapping the executive functioning skills of inhibitory control and attentional focusing will be included from the Child Behavior Questionnaire (Rothbart et al., 2001), a widely used teacher-rating instrument. With the addition of a direct science assessment, it was necessary to develop a similar rating scale for science that was appropriate for kindergartners. The science rating scale was tested in the fall 2009 field test (described in section A.1.e) and found to be suitable for the national data collection. It should be noted that the items that ask teachers to rate children in their classroom on social skills, problem behaviors, learning dispositions, and attentional focusing and inhibitory control, and those that ask teachers to describe their relationship with sampled children, are not listed in this package because they are copyright protected.


The special education teachers of children who have an Individualized Education Program (IEP) on file at the school also will be asked to complete questionnaires. Like the general classroom teachers, special education teachers will be asked to complete a self-administered questionnaire about their own background and education. They also will be asked to complete a child-level questionnaire for each sampled child receiving special education services. Questions asked of these teachers will be useful in examining special education curricula and the services being received by children with identified disabilities.


School Administrator Questionnaire. This questionnaire will be completed by participating school administrators in the schools attended by the ECLS-K:2011 children. This instrument includes a broad range of questions about the school setting, policies, and practices at both the school level and in kindergarten, as well as questions about the principal/school administrator and the teaching staff. These items will help researchers understand the school contexts for kindergarten children throughout the nation. Data from the school administrator questionnaire will support comparisons between children attending different types of schools (including public and private schools, with private schools further identified as religious or nonreligious; rural, urban, and suburban schools; and schools of different sizes), thereby allowing researchers to determine the degree to which educational outcomes of various groups of children are associated with differences in the schools that the children attend. The questionnaire is similar to the previous ECLS-K school administrator questionnaire, although it has been shortened in response to comments regarding the questionnaire’s length.


Wrap-Around Early Care and Education Provider Questionnaire (WECEP). Self-administered WECEP questionnaires will be completed by the nonparental caregivers for the primary wrap-around care arrangement (home- or center-based) of sampled children who have before- and/or after-school care and education arrangements for at least 5 hours per week. For center-based arrangements, certain questions about the center and staffing will be asked of the center director/administrator, rather than the primary care provider. Administration time is designed to be 10 minutes for the administrator questionnaire and 20-30 minutes for the caregiver questionnaires. Topics covered include the activities offered to children during the hours they are in care, purposes of the program (e.g., to improve children's academic skills, to provide recreational activities), characteristics of other children in the care setting, curricula used, caregiver beliefs and attitudes, the learning environment of the care arrangement, caregiver/teacher background, and, in the case of center-based arrangements, center staffing and services.


A.1.e ECLS-K:2011 Field Test

The ECLS-K study has informed the approach for the ECLS-K:2011. By design, the ECLS-K:2011 data collection instruments are in large part a collection of items used in the ECLS-K to allow comparisons between the two cohorts of kindergartners. A field test was conducted for the ECLS-K:2011 in fall 2009 to test some changes that were made to the study procedures and instrumentation, most particularly in the child assessment. These included:

  1. A new domain, science, for the direct child assessment. Items for a new kindergarten, first-grade, and second-grade science assessment were field tested, because the previous ECLS-K study did not field a science assessment until the third grade.

  2. New items developed for reading and math assessments, to have more items be solely ECLS-K:2011 items (as opposed to using copyrighted items) and to incorporate more higher-level items for these younger children to accurately measure the increased level of knowledge and skills observed in children at kindergarten entry in the ECLS-B.

  3. A new measure of English basic reading skills for ELL children. In the ECLS-K (1998-99), children who failed an English language skill test (preLAS2000) were routed into a Spanish math assessment and no reading assessment score could be developed for them.

  4. A new measure of Spanish basic reading skills for Spanish-speaking ELL children.


A new teacher rating scale for science skills and knowledge, which was also tested in the field test, was proposed for inclusion in the teacher questionnaires. Finally, vision and hearing screenings were proposed to be included as part of the child assessment and therefore were tested.


Field tests with two different groups of children were conducted. The English field test served as the primary vehicle for: (1) estimating the psychometric parameters of all items in the kindergarten through second grade assessment battery item pool, (2) producing psychometrically sound and valid direct and indirect cognitive assessment instruments, and (3) assessing the feasibility of screening children’s vision and hearing for the national collection. A second field test (Spanish field test) was conducted to obtain valid assessments for both an English reading score for Spanish-speaking children and an assessment of their basic reading skills (e.g., letter recognition and sounds) in Spanish.


For the English field test, a purposive sample of 37 elementary schools representing different levels of urbanicity across five geographic areas participated. The participating schools included public and private schools (both religious and nonreligious) that are not sampled for the national study. A total of 2,978 children (905 English-speaking kindergartners, 846 English-speaking first graders, 818 English-speaking second graders, and 409 English-speaking third graders) participated in the English field test. All participating English-speaking children were administered a direct assessment that included a reading subtest and either a math or science subtest. (Reading passages consume assessment time, thus reducing the number of items that can be administered; therefore, a reading subtest was administered to all children to maximize the number of reading items field tested.) Subsets of children had their vision and/or hearing screened. As part of the English field test, 242 teachers (approximately two teachers each from kindergarten, first, and second grade in each school) were asked to complete individual rating scales containing items about science knowledge and skills for five anonymous children in their classroom.


For the Spanish field test, a purposive sample of 36 elementary schools across five geographic areas participated. The five geographic areas were selected based on the percentage of the population of students in schools in that area that were Hispanic (50% or higher). The participating schools included public and private schools (both religious and nonreligious) that are not sampled for the national study. A sample of 1,115 Spanish-speaking kindergartners completed a direct assessment of their basic reading skills in English and Spanish.


English Field Test Data Collection Procedures. Children were administered the direct cognitive assessments by trained assessors in a one-on-one setting. Hard-copy field test assessment easels were used by assessors to present the assessment items to children, and assessors recorded children’s responses in a paper-and-pencil format using a separate score sheet.


In order to test many different items without overburdening the children, several versions of the field test assessment easels were developed from items divided into four reading forms, two math forms, and two science forms. These forms were spiraled such that each child in the field test received one of the four reading forms and one of the two math or two science forms (e.g., reading 1 and math 1, reading 2 and math 2, reading 3 and science 1, reading 4 and science 2, etc.). Children were administered only as many items as they could complete in a 60-minute time period, regardless of whether they completed both domains or only one.
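
The spiraling of forms can be made concrete with a short sketch. The code below pairs the four reading forms with the two math and two science forms, following the pattern given in the example above, and rotates the combinations across children; the assignment function itself is an illustrative assumption rather than the study’s operational procedure.

    from itertools import cycle

    # Four reading forms paired with the math and science forms, following the
    # pattern in the text (reading 1 + math 1, reading 2 + math 2,
    # reading 3 + science 1, reading 4 + science 2).
    FORM_COMBINATIONS = [
        ("reading 1", "math 1"),
        ("reading 2", "math 2"),
        ("reading 3", "science 1"),
        ("reading 4", "science 2"),
    ]

    def assign_forms(child_ids):
        """Spiral the form combinations across children in order so that each
        combination is used about equally often (illustrative only)."""
        rotation = cycle(FORM_COMBINATIONS)
        return {child: next(rotation) for child in child_ids}

    for child, forms in assign_forms(["C001", "C002", "C003", "C004", "C005"]).items():
        print(child, forms)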


Some of these field test procedures are different from those that will be used in the national data collection. While hard-copy assessment easels will be used in the national collection, the interviewer will enter children’s responses into a computer, rather than use separate hard-copy score sheets. Also, children will receive complete assessments in all domains (reading, math, and science) with no imposed time limit, though the total assessment time for all domains is expected to take 60 minutes.


In addition, in the field test all participating kindergarten through third grade children were eligible for a vision screening, and a subset of children whose parents gave explicit consent took part in a hearing screening. All vision and hearing screenings were conducted after the cognitive assessment was completed. The order of the screenings alternated between participating children according to their random field test identification number: if the identification number was even, the vision screening was administered first; if the identification number was odd, the hearing screening was administered first. The screenings were conducted by a group of field staff trained specifically on administering the hearing and vision screenings. Health screening specialists trained the staff; the training covered not only the use of the specialized screening equipment and the screening procedures, but also general issues relating to screening child participants. The hearing screening consisted of a few questions to determine whether the child had any conditions that might be affecting his or her hearing on the day of the assessment (e.g., a cold), a visual inspection of the ears, and a hearing test. An otoscope was used to visually inspect the ears, and children’s hearing was measured using a tympanometer/audiometer. The vision screening assessed characteristics of the eyes (e.g., whether the child needed glasses, whether there were differences between the eyes, whether the child had a lazy eye) using an autorefractor. Visual acuity also was tested using an electronic visual acuity (EVA) tester.
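
The alternation rule for the screening order follows directly from the identification-number parity described above; the small helper below is a minimal sketch of that rule.

    def screening_order(field_test_id):
        """Even identification numbers receive the vision screening first;
        odd identification numbers receive the hearing screening first."""
        if field_test_id % 2 == 0:
            return ["vision screening", "hearing screening"]
        return ["hearing screening", "vision screening"]

    print(screening_order(1042))  # ['vision screening', 'hearing screening']
    print(screening_order(2015))  # ['hearing screening', 'vision screening']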


Additional data in the field test were collected from two teachers in each of kindergarten, first grade, and second grade in each field test school. These teachers were asked to complete the child science rating forms. Each teacher was asked to choose five children from his or her class to rate using the teacher rating scale for science skills and knowledge: the highest-achieving student, the lowest-achieving student, and three students with average achievement. Teachers were asked to select children who matched these criteria and to think about those children’s skills and knowledge in science when completing the forms. Each teacher was given $7 for each child rating form he or she completed.


English Field Test Results. Analysis of the field test operations and data focused heavily on the new aspects of the study compared to what was fielded in the ECLS-K. Below we describe the types of analyses that were conducted and changes to the study design based on the results of the field test.


  • Direct assessment of science knowledge. The Educational Testing Service (ETS) analyzed the data obtained from the field test to determine item quality and potential for use in the early grade test forms. As part of the analysis, ETS reviewed all of the available information for each item, including difficulty, science framework specifications, psychometric characteristics, the possibility of linking within the ECLS-K:2011 and across to the ECLS-K, assessor feedback, and the possibility of measuring gain in subsequent years. (For more detailed information about the analysis of science items, please see the memo about the analysis, which can be found in appendix A.) Based on the field test analysis findings, assessing children’s science knowledge in the fall of kindergarten is not advisable. Although some items identified as being appropriate for kindergartners did function well, the number of items that did so was limited, and the items covered only a subset of the science domains that are included in the content framework underlying the design of the assessment. Specifically, the content framework calls for the inclusion of items to measure children’s knowledge in the domains of life science, physical science, earth science, and scientific inquiry. The items that did have acceptable performance were predominantly life science items, with only a few successful physical science items and almost no earth science items. This limited domain coverage would make it impossible to select a set of items for a full-scale kindergarten science assessment consistent with the test framework. For the other grade levels, there were an adequate number of items in each category that functioned well; thus, we will develop and field two-stage assessments in first, second, and third grade.


Because there was a strong recommendation from the Technical Review Panel (TRP) members to begin assessing science knowledge as early as possible, and the ECLS-K:2011 is uniquely able to inform research on the development of science knowledge and skills from a very young age, we will use the items that did function well for kindergartners, along with some of the easier items identified as being appropriate for first grade, to develop a limited, approximately 15-20-item single-stage assessment for the spring kindergarten collection. This shorter assessment will permit measurement on a limited set of items appropriate for spring kindergarten for the entire sample. It will serve as an early data point for science knowledge and skills that can be scaled, or calibrated, with science assessment data from subsequent rounds of collection in order to provide a picture of the growth in children’s science knowledge and skills over time.


  • Teacher rating scale for science skills and knowledge. Generally, average item scores were in the middle of the possible range and showed good variation, with teachers using the entire range of scores. Two first- and second-grade items tapping science skills or knowledge (Demonstrates understanding of physical science concepts and Demonstrates understanding of earth and space science concepts) had higher frequencies of being marked as not yet taught to the rated student, which may not be surprising at the beginning of the school year. Moreover, it is meaningful to know which skills are or are not taught in different grades and/or different schools. Data across all three grades showed very strong internal consistency reliability, as well as the expected differences across the achievement levels of the students teachers were rating.


We will field the kindergarten, first grade, and second grade science academic rating scale (ARS) instruments in the ECLS-K:2011. The kindergarten items are included in the fall and spring teacher child-level questionnaires being submitted with this package in appendix D. To reduce burden on respondents, the number of examples listed under each item will be reduced to two.

  • Feasibility test of hearing and vision screening. As described above, the field test experiences and data are still being analyzed and discussed. It is already clear that the protocols took too long, as a significant number of children grew impatient during the screenings; work is being done to shorten the protocols significantly for the national collection. In addition, background noise in schools prevented accurate testing of hearing at lower frequencies, so the set of frequencies at which hearing data can be reliably collected is being determined. The fall first grade OMB clearance package will describe the field test results further, along with changes to the data collection protocols based on those results.

More detailed reports of the English field test results can be found in Appendix A.


Spanish Field Test Data Collection Procedures. Like the direct cognitive assessments in the English field test collection, the Spanish field test direct assessments were conducted one-on-one by bilingual assessors using hard-copy assessment easels and separate score sheets. In the Spanish field test, Spanish-speaking kindergartners received the English basic reading skills assessment and the Spanish basic reading skills assessment. Together, the Spanish basic reading skills and English basic reading skills assessments took about 30 minutes (approximately 15 minutes each).


Spanish Field Test Results. Based on the results of the Spanish field test, the following plans for the national data collection were made:


  • Assessment of English basic reading skills.


A goal of the Spanish field test was to obtain an English Basic Reading Skills (EBRS) assessment which, as a stand-alone subset of the full reading assessment items, could be used to accurately measure the low-level English reading ability of kindergarten English Language Learners. Many of the EBRS items functioned comparably for the children in the Spanish field test and the children in the English field test, indicating that we can confidently administer an EBRS subset of the reading items to all children in the study and that this subset of items will permit reliable estimates of English reading ability for those who have very low English reading ability as indicated by their performance on the EBRS. Having the EBRS assessment as a feature of the ECLS-K:2011 reading assessments will allow us to accurately measure early reading skills and ability for the full range of expected kindergarten English reading abilities, including the emerging abilities of the ELL children who would have been excluded from the reading assessment in the ECLS-K (1998-99). Based on the field test results, we plan to:


  1. Administer the preLAS2000 language screener to children identified by the school as coming from a home where English is not the primary language (use of the preLAS2000 test will be determined after final review of field test results).

  2. Administer the EBRS items to all study children.

  3. After the EBRS, Spanish-speaking children with low EBRS scores will be routed into the SBRS. Pending final evaluation of the field test data, preLAS2000 screening scores may be used to help with this routing decision. Consideration of how to route all children with low EBRS scores into second-stage reading assessments is still pending final evaluation of the field test data. Children with higher EBRS scores will be routed into second-stage English reading assessments.


  • Assessment of Spanish basic reading skills.


Another goal of the Spanish field test was to obtain a valid Spanish Basic Reading Skills (SBRS) measure for the Spanish-speaking children in the study who do not pass the EBRS. This Spanish assessment, which was a careful translation of the EBRS, worked well with the Spanish-speaking field test children and the psychometric properties of the items indicate that we will be able to reliably estimate the basic reading ability of our Spanish-speaking children in their preferred language. The recommendations are to:


  1. Administer the SBRS to all children who fail the EBRS (this decision might be made in conjunction with preLAS2000 scores, pending decisions about the use of that assessment for prescreening) and whom the school or teacher identifies as Spanish-speaking.

  2. Have all children who completed the SBRS move on to the Spanish mathematics assessment and possibly second stage reading assessments pending decisions on how to route children based on EBRS scores.


More detailed reports of the Spanish field test results can be found in Appendix A.




A.2 Purposes and Uses of the Data

The ECLS-K:2011 will provide a rich data set that is designed to serve two purposes: descriptive and explanatory. It will provide descriptive data at a national level related to (1) children’s status at entry into kindergarten and at different points in their elementary school careers, (2) children’s transition into school and into the later elementary grade levels, and (3) children’s school progress through the fifth grade. Additionally, it will provide a rich data set that will enable researchers to test hypotheses about how a wide range of child, family, school, classroom, nonparental care and education provider, and community characteristics relate to experiences and success in school.


In addition to the descriptive objectives mentioned above, it will also be the goal of the data collection to describe accurately the diversity of young children with respect to demographic characteristics such as race/ethnicity, language, and school readiness. Such information is critical for establishing policies that are sensitive to this diversity. The longitudinal nature of the study will enable researchers to study cognitive and physical growth and socioemotional status, as well as relate trajectories of growth and change to variation in home and school experiences in the elementary grades. Ultimately, the ECLS-K:2011 data set will be used by policymakers, educators, and researchers to consider the ways in which children are educated in our nation’s schools and to develop more effective approaches to education. It will be particularly valuable to policymakers, as the ECLS-K:2011 is being launched a dozen years after the inception of the original ECLS-K. Analyses of the two cohorts will provide valuable information about the influences of changing policy and demographic environments on children’s early learning and development.



A.2.1 Research Issues Addressed in the ECLS-K:2011

Today’s early education environment differs from that of the past in numerous ways. Examples of the many changes that have occurred within schools and within the larger society in recent years are presented in Exhibit A-1 and include changes at the policy, societal, state, and school levels. Numerous researchers have used the ECLS-K and ECLS-B to examine many of these topics. The widespread use of ECLS data is a testament to the importance of these two studies. At the same time, both studies have gaps that are perhaps inevitable because changes in policy, research, and society are often difficult to anticipate. The strengths of these earlier studies will be preserved in the ECLS-K:2011 by retaining much of the same content, while incorporating appropriate modifications so that ECLS-K:2011 can be used to study a new cohort of children growing up in today’s circumstances.



A.2.1.a Developments in Early Education Policy

Two important education policy developments since the ECLS-K and ECLS-B were designed are the 2002 reauthorization and amendment to the Elementary and Secondary Education Act (ESEA) of 1965, commonly known as the No Child Left Behind Act (NCLB), and state-level efforts aimed at establishing state-funded pre-kindergarten programs. These policies have the potential to dramatically affect children’s experiences prior to school entry, school readiness, and progress through school.


ESEA affects families, classrooms, teachers, schools, and school districts throughout the country. It has clear expectations for student achievement; mandates annual assessments of all children in grades 3 through 8 to measure progress toward state-defined goals; has strong reporting requirements for schools, districts, and states; and has consequences when schools and school districts do not make Adequate Yearly Progress (AYP). Title I schools and districts that do not make AYP for 2 consecutive years are identified for improvement and are to receive technical assistance to help them improve. Districts must offer all students in identified Title I schools the option of transferring to a school that has not been identified for improvement with transportation provided by the district. If a Title I school misses AYP for a third year, districts must offer low-income students the option of supplemental educational services from a state-approved provider. If a Title I school misses AYP for a fourth year, districts must take corrective actions, which may include replacing school staff or implementing a new curriculum. Cross-cohort comparisons of the three ECLS studies (ECLS-B, ECLS-K, and ECLS-K:2011) will provide important insights into the influence of ESEA on children’s lives. ECLS-K children entered school before the advent of ESEA. ECLS-B children entered school as states, districts, and schools were adjusting to meet the requirements of ESEA, for example by developing the required systems to demonstrate AYP. ECLS-K:2011 children will be entering school after educational systems have had time to comply with ESEA requirements.

Exhibit A-1. Examples of important developments for the ECLS-K:2011


Policy changes

  • Reauthorization and amendment to ESEA

  • Rise in state-funded pre-kindergarten programs

  • Passage of the 1996 Personal Responsibility and Work Opportunity Reconciliation Act (“welfare reform”)

  • Higher standards for teacher qualifications

Changes in schools and challenges for schools

  • Growth in school choice and increasing number of charter schools

  • Growth in integrated pre-kindergarten through grade 3 schools (Pre-K-3)

  • Change in curricular focus due to ESEA

  • Re-segregation of schools due to residential patterns and decline in court-mandated busing

  • Stress on school systems as they adapt to decreasing student populations (in the North) or increasing numbers of students (in the Sunbelt)

Demographic changes

  • Growth of Hispanic population

  • Growth in number of English language learners (ELL) in schools, especially in the early grades

  • Migration of population from Rustbelt to Sunbelt states

  • Extension of suburban sprawl

Child health

  • Increase in rates of obesity

  • Rise in incidence of:

      • Allergies

      • Asthma

      • Attention deficit hyperactivity disorder

      • Autism

      • Learning disabilities

Scientific developments in neuroscience

  • Advances in neuroimaging techniques (e.g., fMRIs) that have led to advances in our understanding of the development of children’s learning, memory, attention, and language

  • Advances in neurological research and emphasis on executive functioning

  • Emerging research showing the trainability of cognitive processes (e.g., Rueda et al., 2005)

Technological changes

  • Increase in:

      • Video game usage, even for very young children

      • Internet usage

      • TV programs aimed at children


At the state level, policymakers have been continuing efforts begun in the 1990s to develop state-funded pre-kindergarten programs for young children. These efforts are gaining additional impetus from the requirements of ESEA. Some states seek universal pre-kindergarten programs for all children; others seek to develop programs that target low-income children as a way to ensure that they have the same access as more advantaged children to early education and learning activities that will enable them to be ready for school. There are wide variations in state policies for preschool and in the numbers of children enrolled in preschool programs. The ECLS-K:2011 will collect basic information about preschool programs attended by the sampled children. Parents will be asked about the type of preschool program their child attended (Head Start, other public preschool, private preschool) and the years the child attended. Researchers will be able to link information about state preschool and early childhood policies to each child’s record, adding contextual information to an already rich data set. Although this information could not be used to make statements about specific states, it would enable researchers to examine whether state policies are associated with children’s transition into kindergarten and success in kindergarten and elementary school.


ESEA and state preschool efforts both emphasize the importance of using highly qualified teachers in the classroom. This emphasis on qualified teachers is exemplified by the bill that passed Congress in November 2007 requiring that by the year 2013 all federally funded Head Start teachers have at least an associate’s degree and that at least half have a bachelor’s degree. Eighteen of the 38 states currently funding pre-kindergarten programs require the lead teacher in every classroom to have a bachelor’s degree, and 20 require all lead teachers to have specialized training in pre-kindergarten education (Barnett, et al., 2006). ESEA has parallel requirements for K-12 schools. For example, it requires that all teachers of core subjects have a bachelor’s degree, full state certification, and demonstrated competence in each core academic subject they teach. The ECLS-K:2011 will enable researchers to examine the qualifications of the cohort members’ teachers, both in kindergarten and across time.



A.2.1.b School Readiness

Educational policymakers and researchers continue to debate the most appropriate ways to promote school readiness. Most experts agree that school readiness is a multi-faceted phenomenon that encompasses several domains of child development. In addition to cognitive development and pre-academic skills (e.g., letter and number recognition, emerging literacy), school readiness is conceptualized as involving the whole child, including health and physical well-being, language acquisition, social and emotional development, and interest in and enthusiasm for learning. It is therefore important for ECLS-K:2011, like ECLS-K and ECLS-B, to capture information related to all of these domains to more fully understand how children’s early learning and development are being affected by shifts in policy and by other changes in children’s lives. For example, one effect of ESEA is a change in curricular emphasis in the early grades. ESEA emphasizes evidence-based early literacy activities that stress the development of specific literacy skills. Two recent initiatives, Reading First and Early Reading First, seek to lay the foundation for future school success by stressing the following five skills to enable children to become proficient readers:


  • Phonemic awareness: the ability to hear and identify sounds in spoken words;

  • Phonics: the relationship between the letters of written language and the sounds of spoken language;

  • Vocabulary: the words students must know to communicate effectively;

  • Fluency in reading: the capacity to read text accurately and quickly; and

  • Comprehension: the ability to understand and gain meaning from what is read.

ESEA and these reading programs view literacy as a skill that must be learned through coherent, skill-based instruction using scientifically proven curricula delivered by highly qualified teachers. By ensuring that the assessments and teacher questionnaires measure these emphasized skills, ECLS-K:2011 can be used to assess the impact of ESEA on children’s emerging literacy and cognitive development.



A.2.1.c Executive Functioning

New research in the cognitive and neurological sciences is providing important insights into developmental processes associated with school readiness. Of particular interest is new research on the importance of executive functioning for learning and academic achievement (e.g., Blair and Razza, 2007; Posner and Rothbart, 2006). “Executive functioning” refers to a set of interdependent processes that work together to accomplish purposeful, goal-directed activities and include working memory, attention, inhibitory control, and other self-regulatory processes. Executive functioning processes work to regulate and orchestrate cognition, emotion, and behavior to enable a child to learn in the classroom. For example, executive control, which is associated with the prefrontal cortex, involves the ability to allocate attention, to hold information in working memory, and to withhold an inappropriate response (Casey, et al., 2000). Not only are these cognitive and behavioral processes predictive of reading and math achievement (Blair and Razza, 2007), but there is also emerging research that indicates that some of these cognitive processes are trainable (Rueda, et al., 2005; Klingberg, et al., 2005) and can be improved upon in regular public school classrooms without costly interventions (Diamond, et al., 2007).


Many other cognitive processes are necessary for learning and achievement. For example, learning, whether it involves reading comprehension, solving applied mathematics problems, or something else, involves the interaction between working memory and long-term memory and the formation of linkages between the two. ECLS-K:2011 will be strengthened by administering measures (direct and indirect) that capture specific learning issues such as attention problems, memory problems, inability to withhold inappropriate responses, and language issues. In particular, little attention has been paid to differences in these areas across racial/ethnic subgroups or between low-income and other children (Noble, et al., 2005). ECLS-K:2011 will provide information to allow for the investigation of such differences.



A.2.1.d Demographic Changes

In addition to changing policies and approaches to early education and research, the U.S. is also undergoing demographic shifts in the composition of its population towards an increasingly diverse society. Continued high immigration rates, a relatively young immigrant population, high fertility rates among Hispanic women, and low fertility rates among the native-born population mean that a substantial fraction of the child population has one or more immigrant parents. In 2004, approximately one in every four births was to a foreign-born mother (Martin, et al., 2006). Sixty percent of these births were to women of Hispanic origin (Martin, et al., 2006). The demographic shift is especially evident in the school-aged population. In 2005, 20 percent of children ages 5 to 17 spoke a language other than English at home (U.S. Department of Education, 2007 Indicator 6). Of those speaking a language other than English at home, 72 percent spoke Spanish, 14 percent spoke another Indo-European language, 11 percent spoke an Asian or Pacific Island language, and 4 percent spoke some other language at home (U.S. Department of Education, 2007 Indicator 6). Language barriers are not the only challenge for many of these children. Many, especially those with parents from Mexico and Central America, come from homes with lower parental education, larger families, and lower family income than native-born children (Larsen, 2004). Families from other cultures may have different normative expectations for how they should interact with schools and teachers. ECLS-K:2011 will enable researchers to examine how schools and teachers are meeting the needs of these children and their families, how effective those efforts are, and how involved such families are in the school community.



A.3 Use of Improved Information Technology

Where feasible, available technology will be used to improve data quality and reduce respondent and school burden.


The ECLS-K:2011 parent interviews and child assessments will be conducted using computer-assisted interviewing (CAI). Using CAI will increase data collection efficiency by permitting preloads of available data about the sampled schools and children, online editing, and complex question branching, all of which also reduce respondent burden through faster interviews and reduce the need to recontact respondents to obtain missing information (for example, as would be necessary if a field interviewer did not follow a skip pattern correctly and items that should have been asked were not). Field interviewers will complete interviews with parents who do not have telephones by making in-person visits; these interviews will also be conducted using CAI on laptop computers. The CAI system has important features that will improve the quality of the data and reduce the burden on respondents, as follows:


  • Initial Contact: The CAI system will guide the ECLS-K:2011 field interviewer in making contact with the parent at the phone number or address provided by the school or with the child at the school and will include prompts to help the interviewer identify the correct respondent.

  • Routing the Direct Assessment: The CAI system will be programmed so that the initial routing test at the beginning of each assessment subtest is scored online by the computer and the appropriate second-stage test (i.e., the one corresponding to the child’s performance on the routing test) is administered immediately (a brief sketch of this logic appears after this list). The benefits of such a two-stage instrument are increased adaptiveness, reduced burden for the child, and increased precision of measurement, because the interviewers do not need to score the routing test and select the appropriate second-stage test themselves.

  • Skip Patterns: The CAI system automatically guides interviewers through the complex skip patterns in the parent interviews, thereby reducing respondent burden and potential for interviewer error and shortening the questionnaire administration time. This is because the respondent will not be asked inapplicable questions and the interviewers do not need to spend time determining which questions to ask.

  • Copying Responses: The CAI system will be programmed to copy responses from one item to another and from one instrument to another to prevent unnecessary repetition of questions and to aid in respondents’ recall. For example, information that is provided by the respondent early in the interview may be useful later in the interview; such information can be displayed on the screen or used as a wording fill for relevant questions to assist the respondent. Most importantly, information from the previous wave of data collection can be copied to the subsequent wave and verified, eliminating the need to collect the data again.

  • Time Intervals: The CAI system also provides automated time and date prompts, which are particularly useful in longitudinal studies for helping respondents remember specific time periods. The interview can also reference the specific time frame between the previous and the current wave of data collection, helping respondents recall events from that interval without re-reporting information they provided in the previous wave.

  • Receipt Control: The CAI system will provide for automatic receipt control and will be used to produce status reports that allow timely and ongoing monitoring of the survey’s progress.

The use of a CAI system for the ECLS-K:2011 is critical because of the intricate and sometimes difficult skip patterns that are part of complex survey instruments and because of the longitudinal nature of the data collection, in which the same respondent may be interviewed at multiple time points. Without CAI, the ECLS-K:2011 instruments would be difficult to administer over repeated measurement periods, and respondent burden would be increased.
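To make the routing step concrete, the following minimal Python sketch shows how a CAI application might score a routing test and immediately select a second-stage form. The item names, answer key, cut scores, and form labels are hypothetical illustrations; the actual ECLS-K:2011 routing rules are specified by the assessment developers and are not reproduced here.

# Illustrative sketch of two-stage routing in a CAI assessment.
# Cut scores, item names, and form labels below are hypothetical.

def score_routing_test(responses, answer_key):
    """Count correct answers on the routing test."""
    return sum(1 for item, answer in responses.items()
               if answer_key.get(item) == answer)

def select_second_stage_form(routing_score, cut_scores=(6, 12)):
    """Map a routing score to a low-, middle-, or high-difficulty form."""
    low_cut, high_cut = cut_scores
    if routing_score < low_cut:
        return "low-difficulty form"
    elif routing_score < high_cut:
        return "middle-difficulty form"
    return "high-difficulty form"

# The CAI system scores the routing test as soon as it is finished and
# immediately presents the matching second-stage form.
answer_key = {"item1": "B", "item2": "D", "item3": "A"}
responses = {"item1": "B", "item2": "C", "item3": "A"}
score = score_routing_test(responses, answer_key)
print(select_second_stage_form(score))   # a score of 2 maps to the low-difficulty form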


A computer-based data management system will be used to manage the sample. The sample management system uses encrypted data transmission and networking technology to maintain timely information on respondents in the sample, including contact, tracking,5 and case completion data. This system will be particularly important as children move from one school to another, both between the fall and spring of kindergarten and over the course of the ECLS-K:2011 study. The use of technology for sample management will maximize tracking efforts, which should have a positive effect on the study’s ability to locate movers and achieve acceptable response rates.



A.4 Efforts to Identify Duplication

The ECLS-K:2011 will not be duplicative of other studies. The original ECLS-K is the only other study to collect information as detailed and extensive as that planned for the ECLS-K:2011 for a cohort of young children and to follow those children throughout elementary school. The ECLS-K:2011 will extend the information obtained by the ECLS-K to a new cohort, will open up possibilities to investigate new research questions, and will allow important comparisons to be made between two kindergarten cohorts entering school a dozen years apart. In addition, it will collect data during the children’s second and fourth grade years, which the original ECLS-K did not.6


In preparation for the launch of the original ECLS-K, a review of other early childhood studies was conducted. At that time, the review found that a few studies had focused on children’s early learning environments (e.g., the Office of Policy and Planning’s National Transition Study, NCES’s National Household Education Surveys Program), on parent and family involvement in education (the National Household Education Surveys Program), and on the structure of elementary schools (e.g., NCES’s Schools and Staffing Survey), or had evaluated specific programs (e.g., PES’s Longitudinal Evaluation of School Change and Performance; the Chapter 1 Prospects study). However, these studies either did not provide the longitudinal child-level data needed to study the relationships between school experiences and child developmental outcomes and growth, or were concerned primarily with only certain segments of the child population.


More recently, a literature search was conducted to identify and review research studies with the same purpose and goals as those proposed for the ECLS-K:2011. To be included in the review, a study had to be (1) a survey-based study of a population with a sample of 1,000 or more, (2) longitudinal in design, and (3) focused on children’s cognitive development in the elementary, middle, and/or secondary grades. Although similar studies were found, they were generally confined to limited geographic areas (e.g., Baltimore, Maryland; Greensboro, North Carolina) or, for the studies conducted on the national level (e.g., Prospects, Children of the National Longitudinal Survey of Youth [NLSY Child Supplement]), were not based on probability samples of kindergartners. For example, Prospects began with first graders and targeted Title I recipients. The NLSY79 Child Supplement targets the children of female sample members of a household-based 1979 sample of 14- to 21-year-olds. The Head Start Family and Child Experiences Survey (FACES), which is similar to the ECLS-K:2011 in terms of its content and components, has followed several cohorts of children from preschool through early elementary school. However, FACES has not followed the progress of children in school beyond kindergarten or first grade, and its samples are limited to children served by Head Start. NELS:88 and ELS:2002 began with students in the middle and high school grades. Another major finding of the review was that most studies used group-administered achievement tests, which can produce unreliable data for young children. The ECLS-K:2011, however, will administer computerized adaptive child assessments in a one-on-one setting.


A.5 Method Used to Minimize Burden on Small Businesses

The respondents for ECLS-K:2011 will include teachers and school administrators. Private, not-for-profit, and proprietary elementary schools may be drawn into the sample. To reduce the perceived burden, the contractor will provide assistance to these schools as needed. These proprietary and nonprofit schools also will benefit from the study’s other burden-reducing strategies (e.g., instruction packets, toll-free help lines, and prepaid business return envelopes), which were designed for all types of schools.



A.6 Frequency of Data Collection

This submission describes and requests approval for the base year data collection of the ECLS-K:2011, which will take place in fall 2010 and spring 2011. One of the main goals of the ECLS-K:2011 is to measure change in children’s cognitive growth and noncognitive status, as well as changes in the contextual characteristics (i.e., school, classroom, family, and community factors) that can affect growth. To measure change, baseline information must be collected and then compared with data from periodic follow-ups timed to the rates of change expected for schoolchildren and their environments.


For the national data collection, beginning-of-the-school-year data collection is needed to obtain baseline data on children at the very beginning of their exposure to the influences of the school environment. Through direct and indirect assessments, the baseline fall collection will provide measures of the skills, attributes, and knowledge of children as they enter school for the first time. The data collected at the end of the year will be used to examine changes in children after they have experienced nearly a year of kindergarten. Currently, the study design calls for follow-up collections in the fall of first grade and then each spring from first through fifth grade. This frequency of data collection is linked to the rate of change that is expected for children of this age and the desire to capture information about children as critical events and transitions are occurring, rather than measuring these events retrospectively. Without data collection follow-ups, the study of children’s cognitive and social development is hindered. Assuming successful kindergarten collections, future clearance requests will be submitted for the follow-up collections in later grades.



A.7 Special Circumstances of Data Collection

No special circumstances for this information collection are anticipated.



A.8 Consultants Outside the Agency

NCES has sought consultation with a range of outside agencies over the life of the ECLS-K, and their input also has informed the ECLS-K:2011 study design and instrumentation, since both draw heavily from the ECLS-K. During the early development of the ECLS-K, project staff met with representatives from a wide range of Federal agencies with an interest in the care and well-being of children. The goal of this activity was to identify policy and research issues and data needs. Similar consultation with Federal agencies has occurred and continues for the ECLS-K:2011. See Table A-1 for the representatives consulted for the ECLS-K and ECLS-K:2011.


Project staff also consulted several other organizations (see Table A-2) that have an interest in the care, well-being, and education of young children. The goal of this activity again was to identify policy and research issues and data needs.


Several of the early consultations with government agencies have resulted in interagency agreements funding supplemental studies. Like its predecessor, the ECLS-K:2011 represents a collaborative effort by education and health and human services agencies. The National Center for Education Statistics supports the development of the core design of the ECLS-K:2011. Partner agencies continuing to support supplemental studies that enrich the ECLS-K:2011 include the Economic Research Service of the U.S. Department of Agriculture, the National Center for Special Education Research in the Institute of Education Sciences of the U.S. Department of Education, and the Administration for Children and Families in the U.S. Department of Health and Human Services. New agency partners for the ECLS-K:2011 include the National Institute on Deafness and Other Communication Disorders and the National Eye Institute at the National Institutes of Health in the U.S. Department of Health and Human Services.


In preparation for the ECLS-K:2011, the design contractor assembled expert panels (a Technical Review Panel (TRP) and Content Review Panels (CRPs)) to review and comment on issues related to the development of the study and survey instruments. The members include experts in research, policy making, and practice in the fields of early childhood education and development, elementary education, health, research methodology, special populations, and assessment. Table A-3 lists the ECLS-K:2011 TRP members. The TRP has had one 2-day meeting, held in November 2008. The meeting focused on major design and content issues, such as periodicity, the benefits of including an assessment of science in kindergarten, the assessment of executive functioning and possible measures for it, and the content of a Spanish language assessment for native Spanish speakers who are English language learners. The TRP members also provided suggestions for specific questionnaire items to be included in the instruments for the full-scale collection. There also have been five CRP meetings: reading (May 2009), mathematics (May 2009), science (May 2009), executive function (November 2009), and English/Spanish Basic Reading Skills (August 2009). For each of these domains, panel members provided critical review of the assessment instruments proposed for inclusion in the field test and national data collections. The meetings focused on the appropriateness and adequacy of the specific instruments, considering features such as domain coverage, age appropriateness, technical quality, and the relationship of assessment items to elementary school curricula. Table A-4 lists the ECLS-K:2011 CRP members.

Table A-1. Federal agency consultants for ECLS-K and ECLS-K:2011

Diane Schilder1 | General Accounting Office
Cindy Prince,1 Emily Wurtz1 | National Education Goals Panel
Andy Hartman1 | National Institute for Literacy
Mary Queitzsch,1 Larry Suter1 | National Science Foundation
Michael Ruffner,1 Bayla White,1 Brian Harris-Kojetin1 | Office of Management and Budget
John Endahl,1 Jeff Wilde,1 Joanne Guthrie, Victor Oliviera1 | U.S. Department of Agriculture
Don Hernandez1 | U.S. Department of Commerce, Bureau of the Census, Marriage and Family Statistics
Tim D’Emillio | U.S. Department of Education, OELA
Naomi Karp,1 Dave Malouf,1 Ivor Pritchard,1 Marsha Silverberg1 | U.S. Department of Education, IES
Pia Divine,1 Esther Kresh,1 Ivelisse Martinez-Beck, Ann Rivera | U.S. Department of Health and Human Services, Administration for Children, Youth, and Families
Gerry Hendershot,1 John Kiley,1 Michael Kogan,1 Mitchell Loeb, Patricia Pastor | U.S. Department of Health and Human Services, NCHS
Howard Hoffman | National Institute on Deafness and Other Communication Disorders, NIH, U.S. Dept. of Health and Human Services
Mary Frances Cotch | National Eye Institute, NIH, U.S. Dept. of Health and Human Services
Tom Bradshaw,1 Doug Herbert1 | National Endowment for the Arts
Jeffrey Thomas1 | National Endowment for the Humanities
Patricia McKee | U.S. Department of Education, OESE Compensatory Education Programs
Cathie L. Martin1 | U.S. Department of Education, OIE
Scott Brown,1 Louis Danielson,1 Glinda Hill,1 Lisa Holden-Pitt,1 Kristen Lauer,1 Marlene Simon-Burroughs1 | U.S. Department of Education, OSEP
Lisa A. Gorove1 | U.S. Department of Education, OUS, Budget Service, ESVA
Elois Scott1 | U.S. Department of Education, OUS, PES, ESED
Richard Dean1 | U.S. Department of Education, OVAE, Adult Literacy
Jaquelyn Buckley | U.S. Department of Education, IES, NCSER
Jeff Evans,1 Sarah Friedman,1 Christine Bachrach,1 Peggy McCardle1 | U.S. Department of Health and Human Services, NICHD, Center for Population Research
Martha Moorehouse,1 Anne Wolf1 | U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning & Evaluation, Children and Youth Policy
Katrina Baum1 | Bureau of Justice Statistics, Department of Justice
Meredith A. Miceli, Ph.D. | U.S. Department of Education, Office of Special Education Programs

1 Consultant for the ECLS-K only. Affiliation listed is the affiliation at the time input on the study was provided.


Table A-2. Other organization consultants for ECLS-K


Mary Jo Lynch, Ph.D. | American Library Association, Office of Research and Statistics
Keith W. Mielke, Ph.D. | Children’s Television Workshop
Lynson Bobo, Project Associate | Resource Center on Educational Equity, Council of Chief State School Officers
Evelyn Moore, Erica Tollett | National Black Child Development Institute
Susan Bredekamp, Barbara Willer | National Association for the Education of Young Children
June Million, Sally McConnell, Louanne Wheeler | National Association of Elementary School Principals
Thomas Schultz, Director, Center for Education Services for Young Learners | National Association of State Boards of Education




Table A-3. ECLS-K:2011 TRP member list


Karl Alexander | Department of Sociology, Johns Hopkins University
Jim Bauman | Center for Applied Linguistics, Washington, DC
Maureen Black | Growth and Nutrition Department, University of Maryland Medical Center
Joanne Carlisle | School of Education, University of Michigan
Janet Fischel | State University of New York at Stony Brook & University Medical Center
Fred Morrison | Department of Psychology, University of Michigan
Charlotte Patterson | Department of Psychology, University of Virginia
Robert Pianta | The Center for Advanced Teaching and Learning, University of Virginia
Kit Viator | Massachusetts Department of Education



Table A-4. ECLS-K:2011 CRP member list


Reading Panel
Gloria Johnston | Education National University
Alba Ortiz | University of Texas at Austin
Barbara Wasik | Temple University
Susan Conrad | Independent consultant, assessment development

Math Panel
Doug Clements | State University of New York, Buffalo
Lizanne DeStefano | University of Illinois at Urbana-Champaign
Leah Parker | Journeys Academy, Gifted Education Specialist
Donna Compano | Independent consultant, assessment development, math facilitator, elementary teacher

Science Panel
Michael Padilla | Clemson University
Angela Eckhoff | Clemson University
Kathy DiRanna | University of California - Irvine
Christine Y. O’Sullivan | Science Consultant
Christie Bean | JJ Ciavarra Elementary School

English Language Learner Panel
Vera Gutierrez-Clellen | San Diego State University
Catherine Crowley | Teachers College
Eugene E. García | Arizona State University
Jamal Abedi | University of California at Davis

Executive Function Panel
Philip Zelazo | University of Minnesota
Clancy Blair | New York University
Megan McClelland | Oregon State University



A.9 Provision of Payments or Gifts to Respondents

Obtaining high response rates is critical for all longitudinal studies. At the start of a longitudinal data collection, it is essential to establish the good will of respondents and to demonstrate that we value their participation in the study. Good will can be established through well-designed respondent materials that inform respondents about the goals of the study and their role in it, through field staff who establish rapport with respondents and conduct themselves professionally, and through a small token incentive. The incentive plan for the ECLS-K:2011 is similar to the approach approved by OMB for use in the ECLS-K. The plan is designed to help respondents recognize the merits of the study and thereby encourage high response rates.



A.9.a School Incentive

High levels of school participation are integral to the success of the study. Without a school’s cooperation, there can be no school, teacher, or child data collection activity for that facility. NCES recognizes that administrators will assess the burden level before agreeing to participate. To offset the perceived burden, NCES intends to continue its use of strategies that have worked successfully on three other major NCES studies (High School and Beyond, the National Education Longitudinal Study of 1988, and the Education Longitudinal Study of 2002) and their in-school follow-up studies and that were also used in later collections of the ECLS-K with OMB’s approval. It is important to provide schools with an incentive because the study asks a lot of them: allowing field interviewers to be in the school for up to 3 days, providing a contact person and space for the children to be assessed, removing children from their normal classes while they are tested, and providing information about the school and the children. Given the many demands and outside pressures that schools face, it is essential that they see that we understand the burden we are placing on them and that we value their participation. We propose to remunerate each school $200. An honorarium check in the amount of $200 will be mailed to each school at the end of the spring data collection, along with a note thanking the school for its participation.



A.9.b School Administrator

To build response rates, we propose to remunerate school administrators in appreciation for completing the school administrator questionnaire. In the ECLS-K, the field period had to be extended in both kindergarten and first grade to build response rates for the school administrator questionnaire adequate to meet NCES’s goals. Providing school administrators with an incentive will reduce the potential need to extend the field period and help avoid delays in data delivery. We will offer school administrators a $25 incentive, the same amount given to school administrators during the eighth-grade round of the ECLS-K conducted in 2007; the incentive will be attached to the school administrator questionnaire during the spring kindergarten data collection. In the eighth-grade round, a response rate of 93.3 percent was achieved for the school administrator questionnaire.



A.9.c Teachers

In the base year of the ECLS-K, teachers received $5 per child for completing the child-level questionnaire because they were acting as data collectors, recording their observations of their kindergartners on questionnaires. A check was mailed to them upon receipt of the completed questionnaires. Beginning with the third-grade round of collection, teachers were offered $7 per child-level questionnaire to adjust for inflation. For the base year data collection of the ECLS-K:2011, classroom and special education teachers also will be offered $7 per child-level questionnaire. On average, teachers will have 6 kindergartners selected from their classrooms for the two rounds of kindergarten data collection. Thus, a teacher with 6 pupils would receive $7 for each child for each round of data collection, for a total remuneration of $84 for participating in both kindergarten rounds. A check for the incentive will be attached to the package of instruments the teacher receives in the fall and in the spring. NCES began providing the teacher incentive up front in the fifth-grade round of the ECLS-K; teachers appreciated this practice and responded by completing their questionnaires on time, resulting in high response rates. Given the unusual burden of the ECLS-K and our experience in other school-based, longitudinal studies with high institutional and respondent burden, NCES believes that remuneration must be a part of data collection for a study such as this. We attribute the high questionnaire response rates achieved in the eighth-grade ECLS-K collection (school administrator at 93.3%; teacher questionnaire at 95.5%; special education teacher questionnaire at 94.2%) in part to the provided incentives.



A.9.d School Coordinators

School coordinators act as the study liaison with the school and, as such, they play a very important role in the ECLS-K:2011. They help to enroll children in the study, notify parents and obtain consent as necessary, notify teachers, arrange the assessment logistics for fall and spring (e.g., space to conduct the assessments), and collect teacher, school administrator, and special education teacher questionnaires in fall and spring. For this reason, school coordinators will be offered a $25 incentive. The $25 checks will be attached to the “Welcome” letters mailed to the coordinators in the fall. The study offered the same incentive to the school coordinators during the eighth grade round of ECLS-K data collection conducted in 2007.



A.9.e Wrap-Around Early Care and Education Providers

With parental permission, home- and center-based nonparental before- and/or after-school caregivers will be asked to complete self-administered questionnaires. As mentioned above, center directors will be asked to answer some questions about the center and staffing for children with center-based care arrangements. WECEP providers will be offered an honorarium that depends on the number of sampled children in their care, because burden increases with the number of children in care. The honorarium will be included with the packet of questionnaires that is mailed to the provider. Center-based providers with one sampled child will be offered a $15 honorarium; center-based providers with two to five children will be offered a $20 honorarium; center-based providers with six to ten children will be offered a $30 honorarium; and center-based providers with more than ten children will be offered a $35 honorarium. The honorarium structure is slightly different for home-based providers, as they have fewer questionnaires to complete and are expected to have fewer sampled children than center-based providers. Home-based providers with one sampled child will be offered a $10 honorarium; home-based providers with more than one child will be offered a $15 honorarium. This incentive structure is consistent with the incentive structure used for teachers, which also provides honoraria based on the number of study children teachers are asked to provide information about.
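For reference, the honorarium tiers described above can be laid out in a short sketch. The dollar amounts come directly from the text; the function and its arguments are simply an illustrative way of expressing the tiers, not part of the study's materials.

# Illustrative sketch of the WECEP honorarium tiers described above.
def wecep_honorarium(provider_type, n_sampled_children):
    """Return the honorarium (in dollars) for a WECEP provider."""
    if n_sampled_children < 1:
        return 0
    if provider_type == "home":
        return 10 if n_sampled_children == 1 else 15
    if provider_type == "center":
        if n_sampled_children == 1:
            return 15
        elif n_sampled_children <= 5:
            return 20
        elif n_sampled_children <= 10:
            return 30
        return 35
    raise ValueError("provider_type must be 'home' or 'center'")

# Examples: a center-based provider with 4 sampled children receives $20;
# a home-based provider with 3 sampled children receives $15.
print(wecep_honorarium("center", 4))  # 20
print(wecep_honorarium("home", 3))    # 15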



A.10 Assurance of Confidentiality

The ECLS-K:2011 plan for ensuring the confidentiality of the project and participants conforms to the following federal regulations and policies: the Privacy Act of 1974 (5 U.S.C. 552a), Privacy Act Regulations (34 CFR Part 5b), the Education Sciences Reform Act of 2002 (P.L. 107-279, Sec. 183), the Computer Security Act of 1987, the NCES Restricted-Use Data Procedures Manual, and the NCES Standards and Policies.


All respondents who participate in research under this clearance will be informed that the information they provide will be protected from disclosure to the fullest extent allowable under law and that their participation is voluntary. This information will be provided to parents as the guardians for their children. All respondents receive an introductory letter that explains NCES’s and the contractor’s adherence to policies on disclosure (see Appendix H for the teacher letter and the parent letters). The parent consent form also includes an explanation of NCES’s and the contractor’s adherence to policies on disclosure. This responsibility to protect data from disclosure also is conveyed to state, district, and other school officials at the time their cooperation is sought.


During any in-person or telephone interviewing, respondents will be asked if they received the study’s introductory letter. If the respondent does not recall the letter, the interviewer will summarize the key elements of the data protection assurances: that data will be combined to produce statistical reports; that no data will be published that link the respondent to his/her responses; that participation is voluntary; and that a federal statute protects the data from disclosure to the fullest extent allowable under law (P.L. 100-297, Title I, Part C, Sec. 183, as amended).


All contractor staff members working on the ECLS-K:2011 project or having access to the data (including monitoring of interviews and assessments) are required to sign the NCES Affidavit of Nondisclosure (Exhibit A-2) and a Confidentiality Pledge (Exhibit A-3). They also are required to complete mandatory training on data confidentiality and the safe handling of data. The contractor will keep the original notarized affidavits on file and submit PDF copies of all affidavits to NCES quarterly. In addition, these staff will complete background screening in compliance with ACS Directive OM:5-101.


During the course of data collection, interviewers will be equipped with laptop computers, which store any necessary preloaded data as well as the information collected during the interviewing for that round. Interviewers will be instructed to keep the computers and any hard-copy case materials in a secure place in their homes when they are not being used. When an interviewer is in the field collecting interview or assessment data, he or she will be instructed to keep all materials and the computer in his or her possession at all times. When the interviewer is driving to or from appointments, the computer and all materials will be locked out of sight, so as not to provide an inviting opportunity for burglary. Interviewers will be instructed to transmit the electronic data for a case to a central database on the same day the case is completed. Any data transmitted electronically will be encrypted during transmission.



Exhibit A-2. NCES Affidavit of Nondisclosure


Affidavit of Nondisclosure


____________________________________ __________________________________

(Job Title) (Date Assigned to Work with NCES Data)


____________________________________

(Organization, State or Local Agency Name)



____________________________________ ________________________________

(Organization or Agency Address) (NCES Database or File Containing

Individually Identifiable Information*)



I, __________________________________ , do solemnly swear (or affirm) that when given access to the subject NCES database or file, I will not -


(i) use or reveal any individually identifiable information furnished, acquired, retrieved or assembled by me or others, under the provisions of Section 183 of the Education Sciences Reform Act of 2002 (P.L. 107-279) and Title V, subtitle A of the E-Government Act of 2002 (P.L. 107-347) for any purpose other than statistical purposes specified in the NCES survey, project or contract;


(ii) make any disclosure or publication whereby a sample unit or survey respondent (including students and schools) could be identified or the data furnished by or related to any particular person or school under these sections could be identified; or


(iii) permit anyone other than the individuals authorized by the Commissioner of the National Center for Education Statistics to examine the individual reports.


___________________________________

(Signature)


[The penalty for unlawful disclosure is a fine of not more than $250,000 (under 18 U.S.C. 3571) or imprisonment for not more than five years (under 18 U.S.C. 3559), or both. The word "swear" should be stricken out when a person elects to affirm the affidavit rather than to swear to it.]


City/County of _________________ Commonwealth/State of ________________ .

Sworn to and subscribed before me this _______________ day of

_______________, 20________ . Witness my hand and official Seal.


____________________________________

(Notary Public/Seal) My commission expires__________________ .



* Request all subsequent follow-up data that may be needed. This form cannot be amended by NCES, so access to databases not listed will require submitting additional notarized Affidavits. Form last revised 02/08/07


Exhibit A-3. Confidentiality Pledge


EMPLOYEE OR CONTRACTOR’S ASSURANCE OF CONFIDENTIALITY OF SURVEY DATA


Statement of Policy


{Contractor} is firmly committed to the principle that the confidentiality of individual data obtained through {Contractor} surveys must be protected. This principle holds whether or not any specific guarantee of confidentiality was given at time of interview (or self-response), or whether or not there are specific contractual obligations to the client. When guarantees have been given or contractual obligations regarding confidentiality have been entered into, they may impose additional requirements which are to be adhered to strictly.


Procedures for Maintaining Confidentiality


1. All {Contractor} employees and field workers shall sign this assurance of confidentiality. This assurance may be superseded by another assurance for a particular project.

2. Field workers shall keep completely confidential the names of respondents, all information or opinions collected in the course of interviews, and any information about respondents learned incidentally during field work. Field workers shall exercise reasonable caution to prevent access by others to survey data in their possession.

3. Unless specifically instructed otherwise for a particular project, an employee or field worker, upon encountering a respondent or information pertaining to a respondent that s/he knows personally, shall immediately terminate the activity and contact her/his supervisor for instructions.

4. Survey data containing personal identifiers in {Contractor} offices shall be kept in a locked container or a locked room when not being used each working day in routine survey activities. Reasonable caution shall be exercised in limiting access to survey data to only those persons who are working on the specific project and who have been instructed in the applicable confidentiality requirements for that project.

Where survey data have been determined to be particularly sensitive by the Corporate Officer in charge of the project or the President of {Contractor}, such survey data shall be kept in locked containers or in a locked room except when actually being used and attended by a staff member who has signed this pledge.

5. Ordinarily, serial numbers shall be assigned to respondents prior to creating a machine-processible record and identifiers such as name, address, and Social Security number shall not, ordinarily, be a part of the machine record. When identifiers are part of the machine data record, {Contractor’s Manager of Data Processing} shall be responsible for determining adequate confidentiality measures in consultation with the project director. When a separate file is set up containing identifiers or linkage information which could be used to identify data records, this separate file shall be kept locked up when not actually being used each day in routine survey activities.

6. When records with identifiers are to be transmitted to another party, such as for keypunching or key taping, the other party shall be informed of these procedures and shall sign an Assurance of Confidentiality form.

7. Each project director shall be responsible for ensuring that all personnel and contractors involved in handling survey data on a project are instructed in these procedures throughout the period of survey performance. When there are specific contractual obligations to the client regarding confidentiality, the project director shall develop additional procedures to comply with these obligations and shall instruct field staff, clerical staff, consultants, and any other persons who work on the project in these additional procedures. At the end of the period of survey performance, the project director shall arrange for proper storage or disposition of survey data including any particular contractual requirements for storage or disposition. When required to turn over survey data to our clients, we must provide proper safeguards to ensure confidentiality up to the time of delivery.

8. Project directors shall ensure that survey practices adhere to the provisions of the U.S. Privacy Act of 1974, and any additional relevant laws that are specified in the contract, with regard to surveys of individuals for the Federal Government. Project directors must ensure that procedures are established in each survey to inform each respondent of the authority for the survey, the purpose and use of the survey, the voluntary nature of the survey (where applicable), and the effects on the respondents, if any, of not responding.

PLEDGE


I hereby certify that I have carefully read and will cooperate fully with the above procedures. I will keep completely confidential all information arising from surveys concerning individual respondents to which I gain access. I will not discuss, disclose, disseminate, or provide access to survey data and identifiers except as authorized by {Contractor}. In addition, I will comply with any additional procedures established by {Contractor} for a particular contract. I will devote my best efforts to ensure that there is compliance with the required procedures by personnel whom I supervise. I understand that violation of this pledge is sufficient grounds for disciplinary action, including dismissal. I also understand that violation of the privacy rights of individuals through such unauthorized discussion, disclosure, dissemination, or access may make me subject to criminal or civil penalties. I give my personal pledge that I shall abide by this assurance of confidentiality.


Signature

The laptop configuration will be designed with security and confidentiality considerations in mind. In order to access any of the applications, the interviewer must enter a project-specific password and an interviewer identification code, both of which are checked against encrypted versions of the same data; if the password or interviewer identification code is entered incorrectly, the interviewer is “locked out” of the application. All data files will be encrypted on the computer hard disk.
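As an illustration of the kind of credential check described above, the following minimal Python sketch compares an entered password and interviewer identification code against stored hashed values and locks the application after repeated failures. The hashing scheme, attempt limit, and identifiers are assumptions made for illustration only, not the contractor's actual implementation.

# Minimal sketch of a CAI login check: entered credentials are hashed and
# compared against stored hashes; repeated failures lock the application.
import hashlib
import hmac

# Stored values would be provisioned on the laptop; the literals here are placeholders.
STORED_PASSWORD_HASH = hashlib.sha256(b"project-password").hexdigest()
STORED_INTERVIEWER_HASH = hashlib.sha256(b"FI-0421").hexdigest()
MAX_ATTEMPTS = 3  # assumed attempt limit for illustration

def credentials_valid(password, interviewer_id):
    """Compare hashed inputs against the stored hashes."""
    return (hmac.compare_digest(hashlib.sha256(password.encode()).hexdigest(),
                                STORED_PASSWORD_HASH)
            and hmac.compare_digest(hashlib.sha256(interviewer_id.encode()).hexdigest(),
                                    STORED_INTERVIEWER_HASH))

def login(attempts):
    """Return 'unlocked' on a valid attempt, 'locked out' after too many failures."""
    for attempt, (password, interviewer_id) in enumerate(attempts, start=1):
        if credentials_valid(password, interviewer_id):
            return "unlocked"
        if attempt >= MAX_ATTEMPTS:
            return "locked out"
    return "locked out"

print(login([("wrong", "FI-0421"), ("project-password", "FI-0421")]))  # unlocked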


In the event of a hardware failure in the field, the home office will swap the interviewer’s laptop for a new one. The contractor will maintain a supply of “hot spares,” i.e., laptop computers loaded with all necessary ECLS-K:2011 software, which require only the specific interviewer’s identification code and assignment before being sent out.


All mailings of respondent materials, laptops, and hard-copy case materials will be done using Federal Express, which has a sophisticated tracking system designed to locate any misdirected packages; all packages will require the recipient’s signature for delivery. To the extent practical, the study name and logo will not be included on hard-copy materials used by field staff to record school or respondent information. In the event of a loss of hard-copy materials, this practice would make it more difficult for someone who finds the materials to associate a school or respondent with the study.


Finally, all CAI applications will maintain an audit trail of the case data on the hard disk so that, if the main data files are corrupted, the data can be reconstructed from the audit trails.


After data collection, all personally identifiable data are stored on a secure server and password protected with access limited to authorized project staff. Personally identifiable data are also protected through the coding of responses so that no one individual respondent can be identified (specifically or by deduction) through reported variables in the public access data files. NCES monitors the conduct of the contractor to ensure that the confidentiality of the data is not breached.


NCES understands the legal and ethical need to protect the privacy of the ECLS-K:2011 survey respondents and, with the contractor, has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the ECLS-K:2011 data when preparing the data files for researchers’ use. This analysis will ensure that NCES has fully complied with the confidentiality provisions contained in P.L. 100-297. To protect the privacy of respondents as required by P.L. 100-297, respondents with high disclosure risk will be identified, and a variety of masking strategies will be used to ensure that individuals cannot be identified from the data files. These masking strategies include swapping data and omitting key identification variables such as name, address, telephone number, and school name and address from both the public- and restricted-use files (though the restricted-use file will include the NCES school ID, which can be linked to other NCES databases to identify a school); omitting key identification variables such as state or ZIP Code from the public-use file; collapsing categories or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files; and “topcoding”7 continuous variables in public-use files.
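To illustrate one of the masking strategies listed above, the following minimal Python sketch topcodes a continuous variable before release. The percentile threshold and the example values are hypothetical; the actual thresholds for the ECLS-K:2011 files will be determined by the disclosure analysis described above.

# Illustrative sketch of topcoding a continuous variable for a public-use file.
def topcode(values, percentile=99):
    """Replace values above the given percentile with the percentile value."""
    ordered = sorted(values)
    # Nearest-rank percentile: index of the cutoff observation.
    cutoff_index = max(0, int(round(percentile / 100.0 * len(ordered))) - 1)
    cutoff = ordered[cutoff_index]
    return [min(v, cutoff) for v in values]

# Hypothetical income values; the extreme value is replaced by the cutoff.
incomes = [18000, 42000, 55000, 61000, 75000, 90000, 450000]
print(topcode(incomes, percentile=90))  # 450000 is topcoded to 90000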



A.11 Sensitive Questions

The ECLS-K:2011 is a voluntary study, and no persons are required to respond to the questionnaires or to participate in the assessments. In addition, respondents may decline to answer any question they are asked. This voluntary aspect of the survey is clearly stated in the advance letter mailed to respondents, the study brochure, and the instructions in the hard-copy questionnaires, and it is stressed in interviewer training.


The following describes the general nature of the national data collection instruments.


School Administrator Questionnaires. These are not of a sensitive nature and should not pose a problem to respondents.


Teacher Questionnaires. The information collected in the child-level questionnaires could be regarded as sensitive, because the teacher is asked to supply information about children’s social skills (including ability to exercise self-control, interact with others, resolve conflict, and participate in group activities); problem behaviors (e.g., fighting, bullying, arguing, anger, depression, low self-esteem, impulsiveness); and learning dispositions (e.g., curiosity, self-direction, inventiveness). Because schools often emphasize different skills and concepts, teachers also will be asked to rate the child’s performance in the curricular areas and domains that are included in the cognitive assessments (e.g., language skills; quantitative skills; and knowledge of the physical, social, and biological worlds). The purpose of the teacher ratings of children is both to extend the range of domains assessed (e.g., by gathering information about socioemotional development and adaptation to school) and to deepen our understanding of domains by tapping them in multiple ways (e.g., by gathering information on cognitive development that will complement results of the direct assessment).


Within the questions about the teacher’s views on school readiness, school climate, and school environment, there is one set of questions that could be deemed sensitive by some teachers. Teachers may feel that rating statements regarding their satisfaction with their work (e.g., I really enjoy my present teaching job) are sensitive in nature. These items are included because prior research (e.g., Perrachione, Rosser, & Peterson, 2008; Luekens, Lyter, & Fox, 2004; Rhodes, Nevill, & Allen, 2004) indicates that teacher satisfaction may be associated with relevant constructs such as staff retention and stability. Additionally, there is an item asking teachers whether they meet current criteria for being considered “highly qualified” according to the provisions of ESEA/NCLB. This question will inform research about whether and how having a “highly qualified” teacher, as defined by law, is related to positive experiences and outcomes for children. Prior to their participation, teachers will be informed and assured that their information will be protected from disclosure to the fullest extent allowable under law and that their responses will not be shared with their employers or the parents of their students.


Direct Cognitive Assessments and Questionnaires. The direct cognitive assessments are essential in determining children’s performance levels at the time they enter school and changes in their performance as they progress through school. Because schools often use different standards in their own assessments of children, a uniform set of assessment instruments and procedures is needed for the ECLS-K:2011. The items to be included in the direct cognitive assessments are not themselves sensitive in nature. However, direct assessments of children do raise certain concerns about the assessment procedures to be used. Of primary concern is the length of the assessments. The cognitive assessments are designed to be administered within a 60-minute time period, on average. NCES has developed instruments appropriate to the ages of the participating children, and every effort will be made to staff the study with field assessors who have prior experience in working with children to conduct the direct assessments. Issues specific to working with children will also figure prominently in assessor training.


Parent Questionnaires. Several topics that will be addressed in the parent questionnaire could be sensitive in nature for some respondents. Questions about family income, child-rearing and disciplinary practices, parents’ judgments about their children’s academic skills and abilities, parents’ mental well-being, household food sufficiency, marital satisfaction, and contact with a child’s nonresidential parent will be included in the parent questionnaire.


Prior research indicates that each of these topics is correlated with children’s achievement and helps to predict children’s preparedness for and success in school. Collecting data on these topics will allow researchers to go beyond descriptive analyses of variation in children’s performance by basic background characteristics such as race/ethnicity and sex. Researchers will be able to test hypotheses about how a wide range of family characteristics relate to early success in school. Therefore, it is important to include questions on the sensitive topics listed above in the parent questionnaires. Like other study participants, parents will be told that they can refuse to answer any question they wish.


Results from previous rounds of data collection showed very low levels of missing data in the parent interviews for all items, including the ones mentioned here. For example, in ECLS-K Round 2 (i.e., the spring kindergarten wave), item response rates for sensitive items such as parent income (94.4%) and marital satisfaction (99.7%) were in the mid-to-high 90s.


Additionally, because it is imperative that respondents can be found at a later date for follow-up collections in a longitudinal study, the ECLS-K:2011 interview protocol requests locating information from parents to be used to contact them for later rounds of the study. The locating information includes names, addresses, and telephone numbers of individuals who would always know the whereabouts of the respondents.


Wrap-Around Care and Education Provider (WECEP). Within the questions about wrap-around care and education, there is one set of questions that could be deemed sensitive by some child care providers. Child care providers may feel that rating statements regarding their satisfaction with their work (e.g., I really enjoy my present teaching job/child care position) are sensitive in nature. These items are included because prior research (e.g., Berk, 1985; Goelman et al., 2000; Bollin, 1993) indicates that caregiver satisfaction may be associated with relevant constructs such as the quality of care that children receive, staff retention and stability, and child care resources. Prior to their participation, child care providers will be informed and assured that their information will be protected from disclosure to the fullest extent allowable under law and that their responses will not be shared with their employers or the parents of the children in their care.



A.12 Estimated Response Burden

The estimated respondent burden for the national kindergarten data collection is summarized here and in Table A-5. Included in these estimates, where appropriate, are the time that a respondent would need to gather and compile the data and the clerical time needed to fill out the form.


The estimated response burden for the future clearance package related to the fall first grade collection and first and second grade study recruitment and tracking is outlined in Table A-6. To estimate this burden, we looked to the experiences to date for the kindergarten waves of this study and the fall first grade collection of the ECLS-K.

Table A-5. National data collection respondent burden chart for base year (kindergarten) fall and spring collections

Respondent type | Sample n | Response rate/selection rate | Number of respondents | Hours per instrument | Number of instruments per respondent | Total hours | Total number of responses
Fall Direct Assessment | 21,600 | .90 | 19,440 | 1.00 | 1 | 19,440 | 19,440
Fall Parent Interview | 21,600 | .90 | 19,440 | 0.50 | 1 | 9,720 | 19,440
Fall Teacher Questionnaire (TQA) | 3,600 | .90 | 3,240 | 0.50 | 1 | 1,620 | 3,240
Fall Teacher Child-level Questionnaire (TQC) | 3,600 | .90 | 3,240 | 0.33 | 6 | 6,415 | 19,440
Fall School Coordinator assistance1 | 900 | .90 | 810 | 6.00 | NA | 4,860 | NA
Spring Direct Assessment | 21,600 | .90 | 19,440 | 1.00 | 1 | 19,440 | 19,440
Spring Parent Interview | 21,600 | .90 | 19,440 | 0.60 | 1 | 11,664 | 19,440
Spring School Administrator Questionnaires (SAQ) | 900 | .90 | 810 | 1.00 | 1 | 810 | 810
Spring Teacher Questionnaire (TQB) | 3,600 | .90 | 3,240 | 0.50 | 1 | 1,620 | 3,240
Spring Supplemental Teacher Questionnaire for new teachers2 | NA | NA | 259 | 0.17 | 1 | 44 | 259
Spring Teacher Child-level Questionnaire (TQC) | 3,600 | .90 | 3,240 | 0.33 | 6 | 6,415 | 19,440
Spring Special Education Teacher Questionnaire (SPA) | 900 | .90 | 810 | 0.50 | 1 | 405 | 810
Spring Special Education Teacher Child-level Questionnaire (SPB) | 900 | .90 | 810 | 0.33 | 2 | 535 | 1,620
Spring School Coordinator assistance1 | 900 | .90 | 810 | 6.00 | NA | 4,860 | NA
(WECEP) Caregiver Questionnaire - Child Level | 2,700 | .90 | 2,430 | 0.20 | 4 | 1,944 | 9,720
(WECEP) Home-based Caregiver Questionnaire | 2,700 | .90 | 2,430 | 0.13 | 1 | 316 | 2,430
(WECEP) Center-based Center Directors Questionnaire | 2,700 | .90 | 2,430 | 0.30 | 1 | 729 | 2,430
(WECEP) Center-based Caregiver Questionnaire | 2,700 | .90 | 2,430 | 0.13 | 1 | 316 | 2,430
Study Total for Base Year Collections | 36,0003 | NA | 32,6594 | NA | NA | 52,273 | 104,749

NA Not applicable

1 School coordinators are school staff members who help organize the logistics for the assessment visit. They do not complete a study instrument.

2 There is no set sample for this questionnaire, as it is to be completed only by teachers who did not participate in the fall round of collection, and this number is unknown. The number of respondents estimated in the table is based on the percentage of teachers who were new to the study in the spring kindergarten collection of the ECLS-K (8 percent of all participating teachers).

3 Total sample n represents the total sample size with no duplication on the number of listed instruments each respective respondent is asked to complete. One teacher completes both TQA and TQC. One special education teacher completes both SPA and SPB. The WECEP caregivers complete the WECEP child level questionnaire. The total sample n is the sum of the fall parent interview, fall TQA, fall school coordinator assistance, spring SAQ, spring SPA, WECEP home-based caregiver questionnaire, WECEP center-based caregiver questionnaire, and the WECEP center director questionnaire. The sample of the students taking direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting.

4 Total number of respondents represents the total number of respondents with no duplication on the number of listed instruments each respective respondent is asked to complete. One teacher completes both TQA and TQC. A subsample of the teachers who complete TQA and TQC also will complete the supplemental questionnaire for new teachers in the spring. One special education teacher completes both SPA and SPB. The WECEP caregivers complete the WECEP child level questionnaire. The total number of respondents is the sum of the fall parent interview, fall TQA, fall school coordinator assistance, spring SAQ, spring SPA, spring supplemental teacher questionnaire, WECEP home-based caregiver questionnaire, WECEP center-based caregiver questionnaire, and the WECEP center director questionnaire. The sample of the students taking direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting.

The national data collection includes direct cognitive assessments with children; parent interviews; regular classroom teacher self-administered questionnaires; classroom teacher child-level questionnaires; special education teacher self-administered questionnaires; special education teacher child-level questionnaires for children receiving special education services; school administrator self-administered questionnaires; and collection of data from nonparental before- and/or after-school care providers.


The total number of respondents for the national data collection (i.e., school administrators, teachers, parents, nonparental before- and/or after-school care providers, and school coordinators), without duplication, included in the estimate is 32,659.8 The fall and spring kindergarten teacher, parent, school administrator, child care provider, and school coordinator respondent burden translates into a cost of $1,101,915 for 52,273 hours.9 Neither the sample of children nor the time children will spend completing the assessments is included in the estimated burden, response, and respondent numbers, because the direct assessment is not subject to Paperwork Reduction Act reporting.
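As a worked check of the burden arithmetic, each row's total hours in Table A-5 equal the number of respondents multiplied by the hours per instrument and the number of instruments per respondent. The sketch below reproduces two rows and derives the average hourly valuation implied by the stated totals; that implied rate is an illustration computed from the figures above, not a value quoted from the supporting cost footnote.

# Worked check of the Table A-5 burden arithmetic.
def row_hours(respondents, hours_per_instrument, instruments):
    """Total hours for one table row: respondents x hours x instruments."""
    return round(respondents * hours_per_instrument * instruments)

print(row_hours(3240, 0.33, 6))    # Fall teacher child-level questionnaire -> 6,415
print(row_hours(19440, 0.60, 1))   # Spring parent interview -> 11,664

# Implied average valuation of respondent time, derived from the stated totals.
total_hours = 52273
total_cost = 1101915
print(round(total_cost / total_hours, 2))  # roughly $21.08 per respondent hour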


Table A-6 below outlines respondent burden for a future clearance request covering fall first grade data collection and first and second grade tracking and recruitment. The processes and procedures for respondent tracking are primarily internal and involve little contact with respondents. The table below includes 5 minutes per parent respondent to read the birthday cards we send to children to keep in touch with them and, if necessary, to fill out a change of address card and return it to the data collection contractor. Recruitment burden time includes the time necessary to read study materials sent to parents, teachers, and school administrators; time during which teachers would attend a pre-assessment visit meeting; and time the school administrator will take discussing the study with a school recruiter attempting to secure the school’s participation. Burden for the fall first grade data collection reflects the teacher self-administered questionnaires, parent telephone interview, and vision and hearing screenings. Because the study participants are expected to be the same across rounds, with the exception of teachers, it would not be accurate to calculate a total sample or total number of respondents as a simple sum of the sample sizes and respondents for each round. Instead, to calculate a total, the table below uses the maximum estimated sample size or number of respondents across all rounds. For example, the largest number of parents is expected to be contacted during recruitment for the spring first grade collection. This is the number used for parents in the calculation of total sample size and total number of respondents.

Table A-6. Respondent burden chart for future clearance package related to recruitment and tracking for first and second grade, and fall first grade national data collection

Respondent type | Sample n | Response rate/selection rate | Number of respondents | Hours per instrument | Number of instruments per respondent | Total hours | Total number of responses1

Tracking for Fall First Grade2
Parent | 6,000 | NA | 6,000 | .084 | 1 | 504 | NA

Tracking for Spring First Grade
Parent | 18,630 | NA | 18,630 | .084 | 1 | 1,565 | NA

Tracking for Fall Second Grade
Parent | 6,000 | NA | 6,000 | .084 | 1 | 504 | NA

Tracking for Spring Second Grade
Parent | 14,636 | NA | 14,636 | .084 | 1 | 1,229 | NA

Recruitment for Fall First Grade2
Parent | 6,000 | NA | 6,000 | .25 | 1 | 1,500 | NA
Teacher | 1,000 | NA | 1,000 | .50 | 1 | 500 | NA
School Administrator | 300 | NA | 300 | 1.00 | 1 | 300 | NA

Recruitment for Spring First Grade
Parent | 18,630 | NA | 18,630 | .25 | 1 | 4,658 | NA
Teacher | 3,105 | NA | 3,105 | .50 | 1 | 1,553 | NA
School Administrator | 932 | NA | 932 | 1.00 | 1 | 932 | NA

Recruitment for Fall Second Grade2
Parent | 6,000 | NA | 6,000 | .25 | 1 | 1,500 | NA
Teacher | 1,000 | NA | 1,000 | .50 | 1 | 500 | NA
School Administrator | 300 | NA | 300 | 1.00 | 1 | 300 | NA

Recruitment for Spring Second Grade
Parent | 14,636 | NA | 14,636 | .25 | 1 | 3,659 | NA
Teacher | 2,439 | NA | 2,439 | .50 | 1 | 1,220 | NA
School Administrator | 732 | NA | 732 | 1.00 | 1 | 732 | NA

Fall First Grade Data Collection2
Fall Direct Assessment | 6,000 | .90 | 5,400 | 1.00 | 1 | 5,400 | 5,400
Fall Hearing and Vision Screenings | 6,000 | .90 | 5,400 | .25 | 1 | 1,350 | 5,400
Fall Parent Interview | 6,000 | .90 | 5,400 | 0.50 | 1 | 2,700 | 5,400
Fall Teacher Questionnaire (TQA) | 1,000 | .90 | 900 | 0.50 | 1 | 450 | 900
Fall Teacher Child-level Questionnaire (TQC) | 1,000 | .90 | 900 | 0.33 | 6 | 1,782 | 5,400

Study Total for Portions of First and Second Grade Collections | 25,1063 | NA | 25,1064 | NA | NA | 27,438 | 11,7005

NA Not applicable

1 Only responses to survey instruments fielded during data collection are included in this column.

2 Reflects a smaller sample due to a planned subsampling.

3 Total sample n represents the total sample size, with no duplication on the number of listed instruments each respective respondent is asked to complete. The total sample size represents the maximum total possible. It is expected that the parent respondent will be the same at all rounds, so the largest n for parents (recruitment for spring first grade) is used in the calculation of the total. Similarly, the largest n’s for teachers and school administrators are used in the calculation of the total. Therefore, the total is the sum of parents (recruitment for spring first grade), teachers (recruitment for spring first grade), teachers (recruitment for spring second grade), and school administrators (recruitment for spring first grade). Teachers are counted separately for first and second grade because it is anticipated that children will not have the same teacher for both grades. The sample of the students taking the direct assessment is not included in this count because it is not subject to Paperwork Reduction Act reporting.

4 Total number of respondents represents the total number of respondents with no duplication on the number of listed instruments each respective respondent is asked to complete. The number of respondents represents the maximum total possible. It is expected that the parent respondent will be the same at all rounds, so the largest n for parents (recruitment for spring first grade) is used in the calculation of the total. Similarly, the largest n’s for teachers and school administrators is used in the calculation of the total. Therefore, the total is the sum of parents-recruitment for spring first grade, teacher- recruitment for spring first grade, teacher- recruitment for spring second grade, and school administrator- recruitment for spring first grade. Teachers are counted separately for first and second grade because it is anticipated that children will not have the same teacher for both grades. The sample of the students taking direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting.

5 Total number of responses represents the total number of respondents * the total number of instruments they fill out, with no duplication on the number of listed instruments each respective respondent is asked to complete. One teacher completes both TQA and TQC. The sample of the students taking direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting.

NOTE: These are estimates based on current plans for recruiting and tracking for first and second grade, as well as the fall first grade data collection. Estimates based on final data collection protocols for the spring first grade and spring second grade collections will be detailed in a future package submission. Activities for first grade will occur in the 2011-2012 school year and the summer before that school year. Activities for second grade will occur in the 2012-2013 school year and the summer before that school year.
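For readers who wish to trace the arithmetic behind Table A-6, the short Python sketch below reproduces a few representative rows and the study total for respondents. It is illustrative only and is not part of the study instruments or protocols; all figures are taken directly from the table. The study total of 27,438 burden hours equals the sum of the row-level burden hours excluding the 5,400 hours for the child direct assessment, which, as the table footnotes explain, is not subject to Paperwork Reduction Act reporting.

```python
# Illustrative check of the burden arithmetic in Table A-6 (not a study instrument).
# Each entry: (label, sample n, response rate or None, hours per instrument,
# instruments per respondent). A rate of None means the tracking or recruitment
# activity has no response/selection stage.
rows = [
    ("Parent tracking, spring first grade",    18_630, None, 0.084, 1),
    ("Parent recruitment, spring first grade", 18_630, None, 0.25,  1),
    ("Fall parent interview",                   6_000, 0.90, 0.50,  1),
    ("Fall teacher child-level questionnaire",  1_000, 0.90, 0.33,  6),
]

for label, n, rate, hours, instruments in rows:
    respondents = round(n * rate) if rate is not None else n
    burden_hours = round(respondents * hours * instruments)
    print(f"{label}: {respondents:,} respondents, {burden_hours:,} hours")

# The study totals for sample size and respondents use the largest expected n per
# respondent type rather than a sum across rounds, because (except for teachers)
# the same people participate in every round.
total_respondents = 18_630 + 3_105 + 2_439 + 932  # parents + grade 1 teachers + grade 2 teachers + administrators
print(f"Study total respondents: {total_respondents:,}")  # 25,106, as shown in Table A-6
```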



A.13 Estimates of Cost

There are no costs to respondents beyond the time needed to answer the questionnaires or interviews, for teachers to complete the rating data in the child-level questionnaire, and for children to participate in the cognitive assessments. Participants will incur no equipment, printing, or postage charges.



A.14 Annualized Cost to the Federal Government

This information collection activity has been developed in performance of NCES contract ED-04-CO-0059/0023. The period of performance for this ECLS-K:2011 contract, which includes the kindergarten through second grade field test and the kindergarten national data collections,10 runs from May 2008 through April 2013. The total contractor and subcontractor cost to the government is $26,646,959. This estimate covers two kindergarten data collections, one field test, design enhancements, and data file delivery and documentation. Table A-7 provides the study costs by year of the contract.


Table A-7. Study costs per year

Year | Amount
2008 | $387,531
2009 | $3,127,469
2010 | $10,098,492
2011 | $10,790,615
2012 | $2,052,074
2013 | $190,778
Total | $26,646,959
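As a simple check, and purely for illustration, the yearly amounts in Table A-7 can be summed to confirm the total contract cost cited in section A.14:

```python
# Illustrative only: the yearly contract costs in Table A-7 sum to the total
# contractor and subcontractor cost reported in section A.14.
costs_by_year = {
    2008: 387_531,
    2009: 3_127_469,
    2010: 10_098_492,
    2011: 10_790_615,
    2012: 2_052_074,
    2013: 190_778,
}
total = sum(costs_by_year.values())
assert total == 26_646_959
print(f"Total contract cost: ${total:,}")
```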



A.15 Reasons for Changes in Response Burden and Costs

The last approved submission included burden hours for the field test and estimated burden hours for the full-scale kindergarten data collection. This submission requests burden hours for the full-scale kindergarten data collection only and excludes the purely cognitive child assessment, resulting in a substantial reduction in total burden hours.

The increase in cost relative to the first clearance request for the field test and kindergarten collections reflects options for additional data collection instruments and protocols that, although described in the previous OMB clearance package, had not been fully exercised at the time of that submission. These options include (1) collecting data from the special education teachers of sampled children; (2) collecting data from the wrap-around child care and education providers of sampled children; (3) adding food security items to the parent interview; (4) administering an assessment of mathematics and basic reading skills in Spanish to Spanish-speaking children; and (5) conducting a feasibility test of vision and hearing screenings. All of these options have been exercised since the previous OMB clearance package.


A.16 Publication Plans and Time Schedule

Publications relevant to the data collection will be part of the reports resulting from the base-year data collections. Data files from the national kindergarten collections will be produced and made available to researchers in a public-use format. Researchers who are approved by NCES's data confidentiality office for a restricted-use license can access restricted-use data files, which include more sensitive items and items that pertain to smaller numbers of children (e.g., information about the presence of specific disabilities). To be approved for a restricted-use license, researchers must demonstrate that they have a research question that cannot be answered with the public-use data and that they have the infrastructure to keep the data secure and prevent loss or unauthorized use. Codebooks and user manuals will be produced for both types of data files. All data will be merged at the child level. Data files will include all instrument variables (except those that gather directly identifying information, such as the names of household members) and any relevant associated variables, such as composites or assessment scores. Data will be released through Electronic Code Book (ECB) software that allows users to view codebook information and create customized data files for standard statistical software packages (SPSS, SAS, and Stata). A record layout will also be provided so that analysis packages other than SAS/PC, SPSS/PC, and Stata/PC (e.g., analysis packages for Apple computers) can be used to analyze the ECLS-K:2011 data.
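Purely as an illustration of how the record layout could be used outside of the ECB software, the sketch below reads a flat data file with a fixed-width layout in Python. The file name, column positions, and variable names shown are hypothetical placeholders, not the actual ECLS-K:2011 layout; a researcher would substitute the positions and names documented in the released record layout.

```python
# Hypothetical example of reading an ECLS-K:2011-style flat file from its record layout.
# The file name, column positions, and variable names below are placeholders; the actual
# values would come from the record layout distributed with the data files.
import pandas as pd

colspecs = [(0, 8), (8, 12), (12, 18)]        # hypothetical (start, end) positions per variable
names = ["CHILD_ID", "AGE_MONTHS", "SCORE"]   # hypothetical variable names

data = pd.read_fwf("eclsk2011_child.dat", colspecs=colspecs, names=names)
print(data.describe())
```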


The ECLS-K:2011 publications will include detailed methodological reports describing all aspects of the data collection effort and the psychometric properties of the assessment instruments, as well as reports describing the population of children who are kindergartners in the 2010-11 school year.


The operational schedule for the ECLS-K:2011 national study is shown in Table A-8.


Table A-8. Operational schedule for ECLS-K:2011 national study

Activity | Start date | End date

ECLS-K:2011 National Data Collection
  Select school sample | 7/15/2008 | 4/28/2009
  Print/program assessment | 3/1/2010 | 7/20/2010
  Print/program questionnaires | 4/1/2010 | 7/20/2010
  Train data collectors | 6/1/2010 | 8/16/2010
  Fall data collection | 8/9/2010 | 12/30/2010
  Process data | 9/15/2010 | 1/15/2011
  Spring data collection | 2/24/2011 | 7/15/2011
  Process data | 3/15/2011 | 8/15/2011
  Construct data files, user’s manual | 8/15/2011 | 10/25/2012
  Methodology/psychometric reports | 8/6/2010 | 1/11/2013

A.17 Approval for Not Displaying the Expiration Date for OMB Approval

No exemption from the requirement to display the expiration date for OMB approval of the information collection is being requested for the ECLS-K:2011.

A.18 Exceptions to the Certification Statement

No exceptions to the certification statement identified in item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I apply to the ECLS-K:2011.



1Throughout this package, reference is made to the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99. For ease of presentation, it will be referred to as the ECLS-K. The new study for which this submission requests approval is referred to as the ECLS-K:2011.

2At each follow-up stage, a small percentage of children had been retained in a grade at some point prior to the wave of interest and therefore were in a grade lower than the target grade of that follow-up stage. In addition, a small number of children were found to be advanced to a higher grade.

3Wrap-around care and education is nonparental care and education that a child receives outside of regular school hours.

4Table A-5 provides detail about which components will be included in each round of full-scale collection.

5 Tracking is the process of locating respondents over time.

6The ECLS-K had collections in kindergarten, first grade, third grade, fifth grade, and eighth grade.

7 Topcoding refers to the process of recoding outlier values to some acceptable end value. For instance, everyone with a personal income higher than $100,000 may be recoded to $100,000 to eliminate the outliers.

8Some instruments are completed by the same person. Specifically, it is expected that the same person will complete the fall teacher questionnaire part a, the fall teacher child-level questionnaire part c, the spring teacher questionnaire part b, and the spring teacher child-level questionnaire part c. It is also expected that the same person will complete the spring special education teacher questionnaires parts a and b, and that the same person will complete the parent interview in fall and spring. The number of respondents is counted only once in the total for each of these three sets of survey instruments. Additionally, schools are asked to assign a staff member to help coordinate the assessment activities at the school; these school coordinators are counted in the total number of respondents, and their burden hours are counted, but they do not complete any study instruments.

9 An hourly rate of $21.08 was used to translate teacher, parent, and school administrator response time into a dollar amount. This rate is based on the National Compensation Survey. See U.S. Department of Labor (2007). National Compensation Survey: Occupational Wages in the United States, March 1995.

10 A separate clearance request will be submitted for the fall first grade collection at a later date.



