
Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011)


Spring Fourth-Grade National Data Collection

and Fifth-Grade Recruitment


OMB Clearance Package

OMB# 1850-0750 v.16




Supporting Statement

Part A



Prepared by

National Center for Education Statistics

U.S. Department of Education




Table of Contents

Section Page


A.1 Circumstances Making Collection of Information Necessary A-1


A.1.1 Purpose of This Submission A-1

A.1.2 Legislative Authorization A-1

A.1.3 Prior Related Studies A-1

A.1.4 ECLS-K:2011 Study Design for the Spring Fourth-Grade National Data Collection A-2

A.1.5 Cognitive Interviews and Timing Tests of the Fourth-Grade Non-Assessment Instruments A-8


A.1.5.1 Cognitive Interviews for the Fourth-Grade Subject-Specific TQCs and Parent Interview Items A-9


A.1.5.2 Timing Tests of the TQ, Subject-Specific TQCs, and Parent Interview A-13


A.2 Purposes and Uses of the Data A-16


A.2.1 Research Issues Addressed in the ECLS-K:2011 A-17


A.2.1.1 Developments in Early Education Policy A-17

A.2.1.2 School Readiness A-20

A.2.1.3 Executive Functioning A-20

A.2.1.4 Demographic Changes A-21


A.3 Use of Improved Information Technology A-22

A.4 Efforts to Identify Duplication A-24

A.5 Method Used to Minimize Burden on Small Businesses A-25

A.6 Frequency of Data Collection A-25

A.7 Special Circumstances of Data Collection A-25

A.8 Consultants Outside the Agency A-25

A.9 Provision of Payments or Gifts to Respondents A-30


A.9.1 School Incentive A-31

A.9.2 School Administrator A-32

A.9.3 Teachers A-32

A.9.4 School Coordinators A-33


A.10 Assurance of Confidentiality A-33

A.11 Sensitive Questions A-37

A.12 Estimated Response Burden A-42

A.13 Estimates of Cost to Respondents A-43

A.14 Cost to the Federal Government A-43

A.15 Reasons for Changes in Response Burden and Costs A-43

A.16 Publication Plans and Time Schedule A-43

A.17 Approval for Not Displaying the Expiration Date for OMB Approval A-45

A.18 Exceptions to the Certification Statement A-45

Contents (continued)

Appendixes Page


A Child Questionnaire

B Parent Interview Specifications

C Teacher Background Questionnaire

D Teacher Subject-Specific Questionnaires

E Special Education Teacher Questionnaires

F School Administrator Questionnaires

G Respondent Materials

H Links Between Instrument Items, Covered Constructs, and Related Research Questions


Tables


A-1 Characteristics of fourth-grade teachers participating in the cognitive interviews

A-2 Characteristics of parents of fourth-grade children participating in the cognitive interviews

Exhibits


A-1 Examples of important developments relevant to the ECLS-K:2011 A-18

A-2 Confidentiality pledge A-35


A.1 Circumstances Making Collection of Information Necessary

A.1.1 Purpose of This Submission

The Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011), sponsored by the National Center for Education Statistics (NCES) within the Institute of Education Sciences (IES) of the U.S. Department of Education (ED), is a survey that focuses on children’s early school experiences beginning with kindergarten and continuing through the fifth grade. It includes the collection of data from parents, teachers, school administrators, and nonparental care providers, as well as direct child assessments. Like its sister study, the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 (ECLS-K),1 the ECLS-K:2011 is exceptionally broad in its scope and coverage of child development, early learning, and school progress, drawing together information from multiple sources to provide rich data about the population of children who were kindergartners in the 2010-11 school year. Data collections to date have been conducted for NCES by Westat, with the Educational Testing Service (ETS) as the subcontractor developing the child assessments. Clearances for studying the ECLS-K:2011 cohort were granted for the fall 2009 field test data collection, fall 2010 and spring 2011 kindergarten national data collections, fall 2011 and spring 2012 first-grade national data collections, fall 2012 and spring 2013 second-grade national data collections, and the spring 2014 third-grade national data collection (OMB No. 1850-0750). Several generic clearance requests for testing various components of the study have also been approved (OMB 1850-0803).


This submission requests OMB’s approval for the spring 2015 fourth-grade national data collection and sample recruitment for the spring 2016 fifth-grade national data collection. This submission also includes carry-over burden from the last approved national data collection package (OMB# 1850-0750 v.15) for the activities that will not be completed by the time this package is expected to be approved.

A.1.2 Legislative Authorization

The ECLS-K:2011 is conducted by NCES in close consultation with other offices and organizations within and outside the U.S. Department of Education. The ECLS-K:2011 is authorized by law under the Education Sciences Reform Act of 2002 (20 U.S. Code Section 9543): “The Statistics Center shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including -- (7) conducting longitudinal and special data collections necessary to report on the condition and progress of education;”

A.1.3 Prior Related Studies

The ECLS-K:2011 is part of a longitudinal studies program. The two prior ECLS studies pertain to two cohorts—the kindergarten class of 1998-99 cohort and a birth cohort. Together these cohorts provide the range and breadth of data required to more fully describe and understand children’s education experiences, early learning, development, and health in the late 1990s, 2000s, and 2010s.


The birth cohort of the Early Childhood Longitudinal Study (ECLS-B) followed a national sample of children born in the year 2001, from birth through kindergarten entry. The ECLS-B focused on the characteristics of children and their families that influence children’s school readiness and first experiences with formal schooling, as well as children’s early health and in- and out-of-home experiences.


The ECLS‑K followed a nationally representative cohort of children from kindergarten through eighth grade. The base-year data were collected in the fall and spring of the 1998-99 school year, when the sampled children were in kindergarten. A total of 21,260 kindergartners throughout the nation participated by having a child assessment and/or parent interview conducted during that school year. Five more waves of data were collected: in fall and spring of the 1999-2000 school year when most, but not all, of the children who participated in the base year were in first grade; in the spring of the 2001-02 school year when most, but not all, of the children who participated in the base year were in third grade; in the spring of the 2003-04 school year when most, but not all, of the children who participated in the base year were in fifth grade; and in the spring of the 2006-07 school year when most, but not all, of the children who participated in the base year were in eighth grade.2


A.1.4 ECLS-K:2011 Study Design for the Spring Fourth-Grade National Data Collection

The sample for the ECLS-K:2011 is a representative sample of children across the country who attended kindergarten in 2010-11. The sample was selected using a multistage probability design. In the first stage, 90 primary sampling units (PSUs) that are counties or groups of counties were selected with probability proportional to size (PPS). In the second stage, public and private schools offering kindergarten or educating 5-year-olds in an ungraded setting were selected, also with PPS. The third-stage sampling units were children in kindergarten or children of kindergarten age in ungraded schools or classrooms. Children were selected within each sampled school using equal probability systematic sampling, with a higher sampling rate for Asian and Pacific Islanders (APIs) so as to achieve a minimum required sample size for APIs.
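The within-school child selection described above — equal-probability systematic sampling, with a higher rate applied to the Asian and Pacific Islander stratum — can be sketched as follows. This is an illustrative sketch only: the school roster, stratum labels, and sampling rates are made up and are not the study's actual sampling parameters.

```python
import random

def systematic_sample(units, rate):
    """Equal-probability systematic sample: pick every (1/rate)-th unit
    after a random start within the first sampling interval."""
    interval = 1.0 / rate
    position = random.random() * interval   # random start in [0, interval)
    picks = []
    while position < len(units):
        picks.append(units[int(position)])
        position += interval
    return picks

# Children within one hypothetical school, with an API stratum that is
# sampled at a higher rate than the rest (rates here are illustrative).
children = [("c%02d" % i, "API" if i % 10 == 0 else "non-API") for i in range(40)]
api = [c for c, g in children if g == "API"]
non_api = [c for c, g in children if g == "non-API"]
sample = systematic_sample(non_api, 0.25) + systematic_sample(api, 0.75)
```

Sampling each stratum separately at its own rate is what produces the required minimum API sample size while keeping selection within each stratum equal-probability.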


The base-year (i.e., kindergarten) data were collected in the fall and spring of the 2010-11 school year. The fall first-grade data collection was conducted in fall 2011 when most, but not all, of the sampled children were in first grade. The spring first-grade data collection was conducted in spring 2012 when most, but not all, of the sampled children were in first grade. The fall second-grade data collection was conducted in fall 2012 when most, but not all, of the sampled children were in second grade, and the spring second-grade data collection was conducted in spring 2013 when most, but not all, of the sampled children were in second grade. The spring third-grade data collection was conducted in spring 2014 when most, but not all, of the sampled children were in third grade.3


Similar to the previous years’ spring data collections, the national spring fourth-grade data collection will include direct child assessments, height and weight measurements, parent interviews, and school administrator and teacher questionnaires. As in all prior rounds of data collection, computer assisted interviewing (CAI) will be the mode of data collection for the child assessment and the parent interviews. Children will also complete an audio-CASI (computer assisted self-interview) version of a child questionnaire, as they did in the third-grade data collection. Also as in the past, school administrator and teacher data will be collected via hard-copy self-administered questionnaires.


Cognitive Assessments. As in the previous data collections for the ECLS-K:2011, a direct cognitive assessment will be administered in the spring 2015 fourth-grade collection. The cognitive assessment will include the domains of reading, mathematics, science, and executive functioning. It will be administered directly to the sampled children through a one-on-one assessment employing age- and grade-appropriate items. The structure of the ECLS-K:2011 fourth-grade reading, mathematics, and science assessments will be two-stage, the same as in previous rounds of the ECLS-K:2011.4 That is, for the cognitive assessments in reading, math, and science, all children will first be administered a routing test. Performance on the routing test will determine which one of three second-stage tests (low, middle, or high difficulty) will be appropriate for the child’s demonstrated skill level; the child will then be administered the appropriate second-stage assessment form. The executive function measures (i.e., the Numbers Reversed, Dimensional Change Card Sort, and Flanker Inhibitory Control and Attention Test tasks) are not two-stage assessments.
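The routing logic described above amounts to a simple threshold rule on the routing-test score. The sketch below uses hypothetical cut scores; the operational cut points for each domain are set psychometrically and are not stated here.

```python
def select_second_stage_form(routing_score, cut_low, cut_high):
    """Map a routing-test score to one of three second-stage forms.

    cut_low and cut_high are hypothetical cut scores, not the study's
    actual psychometrically determined cut points."""
    if routing_score < cut_low:
        return "low"
    elif routing_score < cut_high:
        return "middle"
    else:
        return "high"

# Illustrative use with made-up cut scores on a 20-item routing test:
form = select_second_stage_form(11, cut_low=8, cut_high=14)  # "middle"
```

The two-stage design concentrates second-stage items near each child's demonstrated skill level, which improves measurement precision without lengthening the assessment.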


Though new items were developed for inclusion in the fourth-grade ECLS-K:2011 reading, mathematics, and science assessments, a majority of items in the assessments will be the same as those included in the assessments from the earlier rounds of the ECLS-K:2011 and from the ECLS-K assessments. Items from earlier rounds of the ECLS-K:2011 are included to allow for the measurement of growth or gains in knowledge and skills as children age. Items from the ECLS-K are included in order to enable researchers to conduct cross-cohort analyses using the assessment data.


The spring fourth-grade cognitive assessment, like the spring second- and third-grade cognitive assessments, will use a computerized version of the Dimensional Change Card Sort (DCCS) task, which measures children’s executive functioning (specifically, cognitive flexibility). Although the task was administered in a hard-copy format in the kindergarten and first-grade rounds, a switch to the computerized version was made in the spring second-grade data collection because the electronic mode allows the assessment to capture response time, which is not possible using the physical card version. Capturing response time becomes more important for assessing cognitive flexibility as children get older.


The Numbers Reversed task, the second assessment of executive function included in the ECLS-K:2011 child assessment, is identical to the Numbers Reversed task included in the previous grade data collections. This task assesses the child’s working memory. It is a backward digit span task that requires the child to repeat an orally presented sequence of numbers in the reverse order in which the numbers are presented. For example, if presented with the sequence “3…5,” the child would be expected to say “5…3.” Children are given five 2-number sequences. If the child gets three consecutive 2-number sequences incorrect, then the Numbers Reversed task ends. If the child is successful with sequences of two numbers, the child is then given five 3-number sequences. The sequences become increasingly longer, up to a maximum of eight numbers, until the child either gets three consecutive number sequences incorrect or completes all number sequences.
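The administration and discontinue rules described above (five sequences at each length, lengths of two through eight numbers, stopping after three consecutive errors) can be sketched as follows. The generated digit sequences and the scoring variable are placeholders for illustration, not actual test items or the task's published scoring.

```python
import random

def administer_numbers_reversed(respond, rng=None):
    """Sketch of the discontinue rule: five sequences at each length from
    2 through 8 digits; the task ends once three consecutive responses
    are incorrect. `respond(seq)` returns the child's attempted reversal.
    Generated digit sequences are placeholders, not actual test items."""
    rng = rng or random.Random(0)
    consecutive_wrong = 0
    correct_total = 0
    for length in range(2, 9):            # sequence lengths 2..8
        for _ in range(5):                # five sequences per length
            seq = [rng.randint(1, 9) for _ in range(length)]
            if respond(seq) == list(reversed(seq)):
                consecutive_wrong = 0
                correct_total += 1
            else:
                consecutive_wrong += 1
                if consecutive_wrong == 3:
                    return correct_total  # discontinue rule triggered
    return correct_total

# A child who reverses every sequence correctly completes all 35 items;
# a child who gives no response stops after the first three sequences.
perfect_score = administer_numbers_reversed(lambda s: list(reversed(s)))
no_response_score = administer_numbers_reversed(lambda s: [])
```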


An additional executive function task is planned for the fourth-grade data collection. The Flanker Inhibitory Control and Attention Test (commonly referred to just as the Flanker) requires the child to focus on a given stimulus while inhibiting attention to stimuli flanking it. For the fourth-grade assessment, arrows will be used as the stimulus and “flankers.” Children will be asked to look at the middle arrow in a row of arrows shown on the computer screen and indicate in which direction that middle arrow is pointing by pressing either the right or left arrow key on the computer’s keyboard. Sometimes the stimulus is pointing in the same direction as the flankers and sometimes in the opposite direction. Scoring is based on a combination of accuracy and reaction time.


Child Questionnaire. Prior to the start of the direct cognitive assessment, children will be asked to complete a self-administered, computerized questionnaire. As in the third-grade data collection, the Child Questionnaire (CQ) will be administered on a computer using audio-CASI, a software system that reads the instructions and questionnaire items to the child, while the same text is displayed on a laptop’s screen. Children will choose answers to the questions by selecting responses directly on the touch-sensitive screen of the laptop. After answering a question, the child will click on a “next” button and continue to the next question in the self-administered questionnaire.


Questions are drawn from various published and unpublished scales. (Appendix A includes the programmer specifications for the audio-CASI child questionnaire, which indicate the exact instructions to be provided to the child and the items that will be administered.) The CQ consists of 35 statements and questions, which children will respond to using a variety of response scales and sets. Based on recommendations from a Socioemotional Content Review Panel (CRP) that was convened in October 2012 and a Technical Review Panel (TRP) that was convened in November 2013, the fourth-grade CQ will include items measuring children’s engagement in school, peer support, feelings of loneliness, social anxiety, peer victimization, and media usage. Children will be asked to indicate how often they feel certain emotions or experience certain behaviors. At the request of staff from the National Institute of Child Health and Human Development, the CQ also includes questions about pet ownership; if children own pets, they will be asked, for example, about time spent playing with the pet and whether the pet is present during homework time.


Data from the national administration of the CQ will enable researchers to compare students’ self-ratings of their peer relationships and school engagement to the students’ performance on assessment items in the reading, math, and science domains.


Physical Measurements. In addition to the child questionnaire and the cognitive assessment, the ECLS-K:2011 direct child assessments will include measures of the children’s height and weight, to assess children’s physical growth and development. These measurements have been taken in all previous rounds of the ECLS-K:2011.


Parent Interviews. A parent interview will be administered to one parent/guardian of each child in the ECLS-K:2011 study. (Appendix B includes the programmer specifications for the parent interview, which indicate the items that will be administered.) The interviews will be developed in English and then translated into Spanish. For parents who speak neither English nor Spanish, home and community interpreters will be used when available to administer the English-language version to parents, translating the English version to the parent’s native language during the interview. The spring fourth-grade parent interview includes the same types of questions (in terms of topics and format) that have been previously fielded in the ECLS-K, earlier rounds of the ECLS-K:2011, and other NCES studies (e.g., the ECLS-B, the National Household Education Surveys Program (NHES), the Education Longitudinal Study of 2002 (ELS:2002), and the National Education Longitudinal Study of 1988 (NELS:88)). More specifically, the parent instrument will ask about parent involvement with the school; family structure; the home environment; whether children take care of themselves before or after school; nonresident parents; discipline; communication with the child; the parent’s psychological well-being and health; household food security; parent education; parent employment; welfare and other public transfers; and household income. Parents will also be asked to report on their children’s executive function, physical activity, health, and disabilities.


In addition, parents whose children are in a school participating in the 2015 fourth-grade National Assessment of Educational Progress (NAEP) data collection will be asked a longer battery of questions (found in Appendix B) on parent education and household income that have been asked of the ECLS-K:2011 parents in past rounds. NAEP is asking fourth-graders questions to ascertain family socioeconomic status, and current information from parents is needed for comparison to child-reported information. Clearance for the child questions is being requested by NAEP as part of their request for clearance for the 2015 NAEP data collection (Wave 3 submittal).


Based on recommendations from the TRP that was convened in November 2013, there are some new items included in the fourth-grade parent interview that are intended to measure parents’ use of a computer or other electronic device to communicate with or get information from the child’s school; parent help with homework; parent reports of the child’s grades; child’s school avoidance; family monitoring of the amount of time the child spends online and what the child looks at online; children’s friendships; neighborhood safety; parent-child relationship conflict; and life stress. Items that were newly created or that were not previously used in other large-scale studies were included in the spring 2014 cognitive interviews to ensure that the question wording and intent were clear to respondents. (See section A.1.5 for additional information on the parent interview cognitive interviews.)


Teacher Questionnaires. As in previous rounds, teachers of sampled children will be asked to complete hard-copy questionnaires. However, the design and distribution of these questionnaires will be different from past rounds, in which each child’s general classroom teacher completed one background questionnaire, one curriculum questionnaire, and child-specific questionnaires for each sampled child in her classroom. As children move into the upper elementary grades, it becomes more common for children to have different teachers for at least a few subject areas, such as reading and language arts, mathematics, and/or science and social studies. Information about how study schools organize students for fourth-grade instruction, which was collected in the third-grade data collection, shows that there is considerable variation in how instruction is organized. For example, in some schools students have different teachers for all subjects; in other schools, students are pulled out of the general classroom only for science but remain in their homeroom for reading and mathematics; while in other schools students are taught all subjects by one general classroom teacher.


In order to accommodate this variation in organization for instruction, for the spring 2015 fourth-grade data collection, the same approach successfully used in the fifth-grade round of the ECLS-K will be followed to collect the teacher questionnaire data. All children will have their reading teacher identified, and that teacher will be asked to complete a questionnaire. To reduce the burden on teachers, half of the sampled children will be randomly assigned to have their mathematics teacher complete questionnaires, while the other half will have their science teacher complete questionnaires. Thus, every child will have a reading teacher and either a mathematics or a science teacher identified for him/her.
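The split described above (every sampled child's reading teacher, plus the mathematics teacher for a random half of the sample and the science teacher for the other half) might be implemented along these lines. The child identifiers and random seed are illustrative, not the study's actual assignment procedure.

```python
import random

def assign_subject_teachers(child_ids, seed=2015):
    """Return {child: subjects} where every child gets 'reading' and a
    random half of the sample gets 'mathematics', the rest 'science'."""
    rng = random.Random(seed)          # fixed seed for a reproducible split
    shuffled = list(child_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {
        child: ("reading", "mathematics" if i < half else "science")
        for i, child in enumerate(shuffled)
    }

assignments = assign_subject_teachers(["child%02d" % i for i in range(10)])
```

Shuffling and splitting at the midpoint guarantees an exact half/half division of the sample, which a per-child coin flip would not.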


All identified teachers will receive a self-administered teacher-level questionnaire (“TQ”). The TQ includes questions about the teachers such as their views on the school climate, evaluation methods used for reporting to parents, and their background and education. It also includes questions about time children spend in group activities and in lessons in general subject areas, as well as occurrences of recess.


Three additional subject-specific child-level questionnaires will also be distributed to the identified reading, mathematics, and science teachers. Each of these questionnaires (“TQC”) will include two sections:


  • Part 1: Child-level questions. The questions in the child-level section ask the teacher to rate the child on academic and social skills, school engagement, and classroom behaviors. Because each child’s reading teacher will complete a child-level questionnaire, the reading TQC contains the majority of the child-level questions, while the mathematics and science TQCs contain only a few child-level questions specifically related to that subject. There are also questions in all three TQCs asking for child-specific instructional information (for example, instructional group placement and additional services the child receives).

  • Part 2: Classroom questions. The questions in the classroom section pertain to the reading, mathematics, or science class in which the sampled student is taught. Specifically, teachers are asked to indicate how much time is spent on specific skills and activities and to answer questions on instruction and grading practices, behavioral issues, and homework assignments.

To further reduce burden on teachers, one “key child” will be identified for each subject and class. Teachers will be asked to complete all items in both the child-level and the classroom sections of the TQC only for the designated key child; for the remainder of the sampled children in the reading, math, or science class a teacher will only need to complete the questions in the child-level section of the TQC. (See appendix C for the TQ and appendix D for the TQC questionnaires.) Teachers who teach multiple sections of a subject (for example, advanced and remedial sections of math) will have a key child identified for each of those sections, meaning that these teachers will complete the classroom questions about each section of the subject that is taught to at least one ECLS-K:2011 student.


The content of the various teacher questionnaires is much the same as it is in the spring 2014 third-grade teacher questionnaires, but it has been reorganized across the TQ and the three subject-specific TQCs. Questions were also added based on discussions with the November 2013 TRP, including items on peer relationships, school liking and avoidance, and the types of instructional activities that teachers use in each of the three subject areas.


Data obtained from teachers in the TQ can be used to address research questions about the relationships between certain classroom and teacher characteristics and children’s academic and social development. In addition, data from the TQ can also be compared to data from prior rounds of data collection in this study and in the ECLS-K.


Data obtained from teachers in the TQC can be used in several ways, for example: to examine teacher-reported measures of cognitive and social development and compare with prior rounds of data collection; to address research questions about relationships among classroom and teacher characteristics, child-specific participation in instruction and school-based services, and children’s academic and social development; and in comparison to the results of direct assessments administered to the sampled children. As results from additional years of assessments become available, a picture of children’s skills over time can be developed.


Special education teachers and related service providers will be asked to complete questionnaires for ECLS-K:2011 students with an Individualized Education Program (IEP) on file at the school. The information obtained through these questionnaires will be useful in examining special education curricula and the services being received by children with disabilities (see appendix E for the Special Education Teacher questionnaires).


School Administrator Questionnaires. The School Administrator Questionnaire (SAQ) will be completed by the school administrators in the schools attended by the children in the study.5 There will be two versions of the questionnaire: one for schools that completed an SAQ in a prior round of the study (“continuing schools”) and one for any school that did not previously complete the SAQ, either because the school is a new school into which an ECLS-K:2011 student has transferred or because the school did not complete the SAQ in any previous study round (“new schools”). In order to reduce respondent burden, the administrator questionnaire for continuing schools omits questions included in the SAQ in previous rounds about characteristics that are unlikely to change from year to year. The SAQ instrument includes a broad range of questions about the school setting, policies, and practices at both the school level and in specific grades, as well as questions about the school administrator and the teaching staff. The questionnaire remains much the same as it was in the third-grade data collection, although a few items about communication with parents through online formats and implementation of the Common Core Standards were added to the fourth-grade version. In order to ensure that the wording of these items was clear and appropriate for school administrators, they were included in the spring 2014 cognitive interviews with teachers. See section A.1.5 for more information about these cognitive interviews.


These items will help researchers understand the school contexts for ECLS-K:2011 students. Comparisons can be made between children attending different types of schools, including public and private schools (with private schools being further identified as religious or nonreligious); rural, urban, and suburban schools; and schools of different sizes. Data from this questionnaire can be used with data from the child assessments and teacher questionnaires to investigate the degree to which educational outcomes of various groups of children are associated with the differences in the schools that the children attend (see appendix F for the SAQ questionnaires).


A.1.5 Cognitive Interviews and Timing Tests of the Fourth-Grade Non-Assessment Instruments

Cognitive interviews were conducted in the spring of 2014 to test items to be considered for inclusion in the subject-specific teacher questionnaires (TQC) and parent interview in the fourth-grade data collection. In addition, timing tests were conducted in order to determine how long it takes respondents to complete the teacher background questionnaire (TQ), each subject-specific TQC questionnaire, and the parent interview. The cognitive interviews and timing tests were approved in February 2014 (OMB# 1850-0803 v.95).


Most items from the teacher questionnaires and parent interview have been fielded over multiple rounds of the ECLS-K and ECLS-K:2011 and, therefore, were not tested in these cognitive interviews. The new items that were tested in the cognitive interviews were proposed for inclusion in the instruments to address issues that were recently identified by the 2013 ECLS-K:2011 Technical Review Panel (TRP) as important areas for research. These items were developed using feedback from the TRP and experience developing and fielding the ECLS-K fifth-grade and ECLS-K:2011 third-grade instruments.


In addition, three questions that were proposed for the SAQ instruments were included in the teacher cognitive interviews. These questions focused on the use of online tools by the school to communicate with parents. Because school administrators were not included in the pilot testing, and it was thought that teachers would be aware of and use these online tools, it was decided to add these questions to the teacher cognitive interview protocol to obtain teacher feedback on the items.


A.1.5.1 Cognitive Interviews for the Fourth-Grade Subject-Specific TQCs and Parent Interview Items

Sample and Data Collection Procedures. The cognitive interview sample was a purposive sample, but attempts were made to recruit teachers from among elementary schools with various characteristics, e.g., from public (including charter) and private schools and rural and urban schools located throughout the country. Efforts were also made to recruit parents nationwide, from single- and dual-parent households and with various educational backgrounds.


A national database that includes hundreds of thousands of households across the country was used to identify potential cognitive interview respondents. The database included information such as the presence of children in the household, age of the children, occupation, and other demographic characteristics of the adult household members such as age, gender, and area of residence. Information about children in the household was used to identify households that were likely to have a fourth-grader. Information about occupation was used to identify members of the database who were elementary school teachers. In order to identify and recruit a sufficient number of teachers, this database information was supplemented with a purchased list of the names and home telephone numbers of fourth-grade teachers across the nation. Once recruited, teachers were assigned to complete a cognitive interview focused on reading, math, or science, based on the subject(s) they taught. All respondents were paid $25 for their participation.


Twenty-five parents and twenty-five teachers completed cognitive interviews. Details about the sample appear in the tables below.


Table A-1. Characteristics of fourth-grade teachers participating in the cognitive interviews

Characteristic                                                    Teachers (n=25)

Subject Area
    Reading/Language Arts                                          8
    Science                                                        8
    Math                                                           9
Organization of Students for Instruction
    Students stay with the same teacher for all three subjects    16
    Students switch teachers for different subjects                9
Academic Level of Students
    Below grade-level                                              3
    On average                                                    18
    Above grade-level                                              4
School Type
    Public                                                        24
    Private – Religious                                            1
Region
    Northeast                                                      3
    South                                                         13
    Midwest                                                        8
    West                                                           1


Table A-2. Characteristics of parents of fourth-grade children participating in the cognitive interviews

Characteristic                                                    Parents (n=25)

Gender of Respondent
    Male                                                           8
    Female                                                        17
Gender of Fourth Grader
    Male                                                          19
    Female                                                         6
Household Structure
    Child lives with two parents                                  19
    Child lives with one parent                                    6
Child has Contact with Parent who Lives Outside the Home
    Yes                                                            6
    No                                                             0
Child has at Least one Nonparental Childcare Arrangement
    Yes                                                           13
    No                                                            12
Type of Childcare Arrangement
    Child care center or before-/after-school program              8
    Relative care                                                  4
    Other                                                          1
Child has a Physical, Learning, or Emotional Issue
    Yes                                                            9
    No                                                            16
Parent Education Level
    Some College or less                                           6
    College or beyond                                             19
Region
    Northeast                                                      9
    South                                                          7
    Midwest                                                        6
    West                                                           3



After agreeing to participate in a cognitive interview, teachers were sent a confirmation email containing the details of their interview appointment. They were also mailed a letter explaining the purpose of the cognitive interviews and a copy of the fourth-grade subject-specific questionnaire to review prior to the cognitive interview.6 The cover letter instructed teachers to take 5 minutes to briefly review the entire questionnaire before the interview. While the interviews focused only on selected items, reviewing the entire questionnaire provided some additional context for the respondent. Parents were only sent the confirmation email; they were not asked to review any materials prior to the interview.


The cognitive interviews with teachers and parents occurred in March and April 2014 and were conducted as one-on-one telephone interviews between a respondent and an experienced qualitative interviewer. During the cognitive interviews, trained interviewers followed a prewritten protocol that asked respondents to answer selected draft instrument questions as well as some follow-up cognitive interview questions on specific aspects of the tested items. Interviewers were able to deviate from the protocol in order to address specific issues or anomalies in the respondents’ verbal reports.


The items tested in the teacher cognitive interviews included those from a measure of school liking; items capturing the teacher’s evaluation of the child’s peer group and socioemotional development; items on the frequency of electronic communication from the school to parents; items on the frequency with which certain subject-specific skills and concepts are taught in the classroom; and items on the frequency with which certain instructional methods are used in the classroom. At the conclusion of the interview, interviewers asked the teachers for their opinion on the length of the questionnaire. Cognitive interviews with teachers lasted about an hour on average.


The items that were cognitively tested with parents included items on frequency of computer use, family rules for internet use, and an evaluation of the child’s peer group. These interviews lasted approximately an hour.


Results of the Cognitive Interviews. Based on the interviews, changes were made to items in both the TQCs and the parent interview, and some items were cut from the instruments altogether.


Results from the Cognitive Interviews with Teachers. As a result of the cognitive interviews with teachers, the skills items (which ask teachers to indicate the frequency with which numerous skills that align with the Common Core State Standards and Next Generation Science Standards are taught in their classes) were significantly revised across the three subject-specific questionnaires. For example, in reporting the frequency with which particular skills were covered in their class, teachers were confused about whether they were to consider only the examples listed under each skill category (following the word “including”) or whether they were also to think of other activities or topics taught that might be part of the skill category. In addition, some of the skill categories were too broad, contained examples that teachers found confusing, or contained examples that were less relevant to a fourth-grade curriculum than other examples that could have been used. The wording of the skills items was revised to address these concerns.


Another focus of the cognitive interviews was the activity items, in which teachers were asked to indicate the frequency with which the class engaged in certain teaching and learning activities. For the most part, the teachers in all of the subjects understood these items and found most of them to be relevant to the fourth-grade classroom. The items were edited somewhat to remove non-relevant activities. For example, many teachers said they did not have “assigned” vocabulary words in their reading and language arts classes but that they did work on building students’ vocabularies based on learning about unfamiliar words encountered in reading passages or other lessons.


In order to keep the cognitive interviews to a reasonable length, some questions were asked of only a subset of respondents. With regard to the electronic communication with parents items that will be fielded in the School Administrator Questionnaire, math and science teachers indicated that school administrators can be expected to know about school communications with all parents as a group, but not necessarily about communications with individual parents, because the latter type of communication tends to be teacher-specific. Thus, school administrators will be asked to report only on school communication intended for “all parents.” An initial question will ask about electronic and non-electronic communication to all parents, with examples given to explain non-electronic. The second and third questions will ask whether the school has used an online tool or website available to the general public and/or a restricted online tool or website available only to parents and selected school personnel. A fourth item will ask about the types of information provided through the restricted-access site.


Reading and science teachers were asked to give feedback on the child’s peer group items, which ask teachers to rate how well statements such as “This is a good group of kids” and “Some of the kids are a bad influence on this child” apply to a child’s peer group. For the most part, teachers found these items easy to answer because they feel they know the peer groups with whom their students associate most often.


Reading and math teachers were asked to rate a child’s social cognition in school (example statements include “Understands others’ feelings” and “Is aware of the effects of his/her behavior on others”). Teachers had some difficulty deciding what would constitute “poor,” “average,” or “good” social cognition; the stem of the item was therefore revised to include an instruction to compare the individual child “to a typical child in his or her grade.”


Finally, math and science teachers were asked to answer items comprising a school liking scale, in which teachers are asked things such as how often a child “likes to come to school.” While teachers noted similarities between some of the subitems, most teachers were able to distinguish among the items (i.e., they understood the different aspects of school liking that each item was trying to tap). Thus, all the subitems are included in the final instrument, although in response to feedback from the respondents the response options were made less wordy.


Results from the Cognitive Interviews with Parents. Parent respondents were asked to evaluate two items concerning family rules on usage of the Internet. After analysis of the cognitive interview results, the items were changed to ask about monitoring rather than rules, because the interviews indicated that parents were doing a great deal of monitoring of children’s Internet use even though they did not always have explicit rules. Also, to broaden what parents consider in these questions, the word “Internet” was changed to “online.” In addition, help text was added to clarify both questions.


Questions about the children’s peer group were also tested in the parent cognitive interviews. These items did not test well and were deleted from the final instrument. Parents found the items confusing, ambiguous, and difficult to answer. For example, it was unclear to respondents whether the peer group referred to in the questions was the group of students with whom the child spends time in school, outside of school, or both. Similarly, many respondents were unsure whether to consider close friends, all friends, and/or acquaintances when responding to the questions. Respondents found the items asking about the frequency with which their child’s friends get in trouble difficult to answer and not entirely age-appropriate.


Two items on participation in academic activities outside of school were evaluated in the parent cognitive interviews. Although parents found the items easy to answer, they did have some questions about the examples of the activities provided. Clarifying help text was added to these questions to be used when fielded in the spring fourth-grade interview.


Finally, respondents were asked to indicate the frequency with which they argue with their children. Although parents did not necessarily articulate that this question was confusing or difficult to answer, parents had very different interpretations of what constituted “larger, more significant arguments” versus “minor arguments.” They reported that minor arguments are much more frequent than larger arguments, but were unsure whether to think about these smaller conflicts when responding to the item. This confusion could lead to under-reporting of the frequency of conflict; thus, a sentence will be added to the question to clarify that both minor and more significant arguments should be included when answering the item.


A.1.5.2 Timing Tests for the TQ, Subject-Specific TQCs, and Parent Interview

Sample and Data Collection Procedures. The sample for the timing tests was also purposive, although efforts were made to obtain respondents from different areas of the country. Efforts were also made to recruit teachers from a variety of school types but, as shown in Table A-3 below, only public school teachers participated. Recruitment of parents of fourth-grade children was done through an ad posted on the data collection contractor’s internal website. Teachers were also recruited through an ad on this website. Seventeen parents and thirteen teachers completed the timing tests. The tables below show the characteristics of the participating respondents.


Table A-3. Characteristics of teachers participating in the timing tests


                                                                Teachers (n=13)
Gender of Teachers
    Male                                                                      4
    Female                                                                    9
Grade Taught7
    3rd grade                                                                 3
    4th grade                                                                 7
    5th grade                                                                 3
Subject Area
    Reading/Language Arts                                                     5
    Science                                                                   3
    Math                                                                      5
Organization of Students for Instruction
    Students stay with the same teacher for all three subjects                9
    Students switch teachers for different subjects                           4
Academic Level of Students
    Below grade-level                                                         1
    On average                                                               11
    Above grade-level                                                         1
School Type
    Public                                                                   13
    Private – Religious                                                       0


Table A-4. Characteristics of parents participating in the timing tests


                                                                 Parents (n=17)
Gender of Respondent
    Male                                                                      2
    Female                                                                   15
Gender of Fourth Grader
    Male                                                                      8
    Female                                                                    9
Household Structure
    Child lives with two parents                                             15
    Child lives with one parent                                               2
Child has Contact with Parent who Lives Outside the Home
    Yes                                                                       2
    No                                                                        0
Child has at Least one Nonparental Childcare Arrangement
    Yes                                                                       5
    No                                                                       12
Type of Childcare Arrangement
    Child care center or before-/after-school program                         4
    Relative care                                                             0
    Non-relative care                                                         1
Child has a Physical, Learning, or Emotional Issue
    Yes                                                                       7
    No                                                                       10
Parent Education Level
    Some College or less                                                      3
    College or beyond                                                        14
Region
    Northeast                                                                11
    South                                                                     0
    Midwest                                                                   5
    West                                                                      1


After agreeing to participate in the timing test, teachers were mailed a letter explaining the purpose of the timing test, a copy of the TQ and applicable TQC, and a Questionnaire Timings Form. The cover letter instructed teachers to complete both questionnaires and track the time it took to complete each section of the questionnaires on the Questionnaire Timings Form. After completion of the questionnaires and the forms, teachers were asked to mail the materials to Westat. Parents who agreed to participate in the timing test were administered the entire parent interview by telephone. Timings for each section and for the entire interview were recorded by the interviewer.


Results of the Timing Tests. Table A-5 below summarizes the average amount of time needed to complete the teacher and parent instruments. On average, respondents completed the TQ in slightly less than 13 minutes. Respondents who completed the entire reading TQC (that is, both parts 1 and 2) reported an average completion time of about 26 minutes; the entire math TQC took respondents an average of 13 minutes to complete, and the science TQC took about 12 minutes. The shorter timings for the math and science TQCs are not unexpected, as they contain fewer child-specific questions (see section A.1.4 for more detail on the content of the subject-specific questionnaires). The table also shows the estimated time to complete the questionnaire for a second, additional child, for whom a teacher would complete only part 1. To obtain the estimated timing for the second child’s questionnaire, the sum of the timings for the sections/questions that would not be completed for the second child was subtracted from the total questionnaire timing. The last column in Table A-5 reflects the total amount of time that a teacher with two sampled children linked to him/her can expect to spend completing both questionnaires.
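The burden arithmetic described above can be sketched as follows. This is an illustration only; the 14-minute "once-only" figure for the reading TQC is inferred from the Table A-5 values (26 minutes total minus 12 minutes for an additional child) rather than reported directly.

```python
# Illustrative sketch of the timing-test burden arithmetic.
# The once_only_minutes value below (14 for the reading TQC) is an
# inference from Table A-5, not a figure reported in the text.

def additional_child_minutes(total_minutes, once_only_minutes):
    """Estimated time for a second child's questionnaire: the total
    questionnaire time minus the sections/questions completed only once."""
    return total_minutes - once_only_minutes

def two_child_total(key_child_minutes, additional_minutes):
    """Total expected burden for a teacher with two sampled children."""
    return key_child_minutes + additional_minutes

# Reading and Language Arts TQC, using the Table A-5 figures:
reading_additional = additional_child_minutes(26, 14)    # 12 minutes
reading_total = two_child_total(26, reading_additional)  # 38 minutes
```

The same subtraction applies to the math and science TQCs, whose smaller additional-child timings (2 and 1 minutes) reflect their fewer child-specific questions.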


On average, the parent interview required just over 45 minutes to complete, with timings ranging from a low of 37 minutes to a high of 58 minutes. The characteristics of the child and the household contributed to the variation in interview length. Also, the interviewer conducted the parent interview using a hard-copy instrument rather than computer-assisted personal interviewing (CAPI), as is done in the national data collection; it is estimated that administering the interview on paper took an extra two to three minutes, given the complexity of some of the skip patterns.


Based on the timing test data, deletions were made to the parent interview so that it is now expected to run about 35 minutes. In the child care section, questions about before- and after-school care will not be asked in this round of the study; however, questions about whether children take care of themselves before or after school were retained. Other deleted questions concerned communication from the school about the child, whether the parent received a school profile online, parental warmth, and peer victimization.


Table A-5. Estimated Mean Length of Administration based on results of the fourth-grade timing tests


                                                  Mean Length of Administration
                                                 For Key    For Additional
                                                 Child      Child             Total
Teacher Background Questionnaire (TQ)                                         12.5 minutes
Subject-specific Child-Level Questionnaire (TQC)
    Reading and Language Arts TQC                26 minutes 12 minutes        38 minutes
    Mathematics TQC                              13 minutes  2 minutes        15 minutes
    Science TQC                                  12 minutes  1 minute         13 minutes
Parent Interview                                                              45 minutes


A.2 Purposes and Uses of the Data

The ECLS-K:2011 will provide rich data sets that are generally designed to serve two purposes: descriptive and explanatory. It will provide descriptive data at a national level related to (1) children’s status at entry into kindergarten and at different points in children’s elementary school careers, (2) children’s transition into school and into the later elementary grade levels, and (3) children’s school progress through the fifth grade. Additionally, it will provide rich data that will enable researchers to test hypotheses about how a wide range of child, family, school, classroom, nonparental care, education provider, and community characteristics relate to experiences and success in school.


In addition to the descriptive objectives mentioned above, the data will describe the diversity of young children with respect to demographic characteristics such as race/ethnicity, language, and school readiness. Such information is critical for establishing policies that are sensitive to this diversity. The longitudinal nature of the study will enable researchers to study cognitive, socioemotional, and physical growth, as well as relate trajectories of growth and change to variation in home, school, and before- and after-school care setting experiences in the elementary grades. Summer learning or learning loss, which can have a considerable impact on children’s educational progress, can also be examined with data collected in the fall 2011 and fall 2012 data collections. Ultimately, the ECLS-K:2011 data set will be used by policymakers, educators, and researchers to consider the ways in which children are educated in our nation’s schools and to develop effective approaches to education. It will be particularly valuable to policymakers, as the ECLS-K:2011 is being launched a dozen years after the inception of the ECLS-K. Analyses of the two cohorts will provide valuable information about the influences of changing policy and demographic environments on children’s early learning and development.


A.2.1 Research Issues Addressed in the ECLS-K:2011

Today’s early education environment differs from that of the past in numerous ways. Examples of the many changes that have occurred within schools and within the larger society in recent years are presented in exhibit A-1 and include changes at the policy, state, school, family, and societal levels. ECLS-K and ECLS-B data have been used by numerous researchers to examine many of these topics, and the widespread use of ECLS data is a testament to the importance of the ECLS program. At the same time, both prior studies leave gaps in the research questions that can be answered with the data, which is perhaps inevitable because changes in policy, research, and society are often difficult to anticipate. The ECLS-K:2011 seeks to preserve the strengths of the earlier studies by retaining much of the same content while incorporating appropriate modifications. This approach allows the ECLS-K:2011 data to be used to answer some of these recently emerging questions, while also allowing for the study of a new cohort of children growing up in new circumstances and for comparisons with the earlier cohorts. Below, we discuss some of the important developments that are particularly relevant to the design of the ECLS-K:2011.


A.2.1.1 Developments in Early Education Policy

A major change in early education occurred when the Elementary and Secondary Education Act (ESEA) was reauthorized as the No Child Left Behind Act (NCLB) and signed into law in early 2002. ESEA 2002 set clear expectations for student achievement, mandated annual assessments of all children in grades 3 through 8 to measure progress toward state-defined goals, and imposed strong reporting requirements for schools, districts, and states. ESEA 2002 aimed to narrow or eliminate achievement gaps in education and called for accountability and higher standards for achievement. In 2010, President Barack Obama released his Blueprint for Reform for the reauthorization of ESEA, which is awaiting congressional action as of the date this clearance request is being submitted. In the meantime, the Secretary of Education and President Obama have granted some states flexibility in meeting some of the NCLB requirements in exchange for meaningful reform at the state and local levels.


Among the requirements for receiving a waiver from NCLB requirements are the state’s development of criteria for evaluating teacher and principal performance beyond a focus on single standardized test scores and the creation of systems for teacher and principal development. These criteria can include observation, peer review, and feedback from parents and students, as well as student growth rates, but must also set new performance targets for improving student achievement and closing achievement gaps. In addition, states must develop accountability systems to identify and reward high-performing schools and to identify and intervene in the lowest performing schools to help them improve student performance.



Exhibit A-1. Examples of important developments relevant to the ECLS-K:2011

Policy changes

Passage of the Elementary and Secondary Education Act (ESEA) 2002

President Obama’s 2010 Blueprint for Reform proposal for reauthorization of ESEA

Recent U.S. Department of Education invitation to states to apply for flexibility in meeting specific ESEA requirements in exchange for meaningful reform at the state and local levels

Race to the Top

The Common Core State Standards Initiative

The Next Generation Science Standards

E-Government Act of 2002, promoting use of the web and web-based applications to provide access to and enhance delivery of government services


Economic challenges

Global recession and financial crisis beginning 2007/2008

American Recovery and Reinvestment Act of 2009

State and local budget constraints and cuts

Sequestration of a portion of federal funds across all administrative departments in 2013


Changes in schools and challenges to schools

Growth in school choice and increasing number of charter schools

Increased use of technology and the Internet in schools

Increased use of mobile devices and “bring your own device” policies

New technologies allow different types of classroom interactions (e.g., remote personal response systems, social networking, digital textbooks)

Blended learning where in-person instruction and technology-delivered information are combined

Differentiated instruction

Segmentation by subject in elementary school

Value-added assessments

Teacher salary and tenure reform, including incorporating measures of teacher effectiveness

Training teachers to use technology effectively and to become online educators

Growth of Hispanic, Asian, and multi-race child population

Growth in English language learners (ELL) in schools, especially at young ages

Use of data management systems to track and monitor student achievement and behavior and the use of data-driven decision making

The increased use of “response to intervention” approaches in the general education setting and for the determination of eligibility for special education

Increased focus on preventing problem behavior


Child health

Epidemic of obesity and associated rise in diabetes

Rise in incidence of:

  • Allergies

  • Asthma

  • Autism

  • Attention deficit/hyperactivity disorder

Decline in incidence of:

  • Specific learning disabilities

Scientific developments

Advances in neuroimaging techniques (e.g., fMRIs) that have led to advances in our understanding of the development of children’s learning, memory, attention, and language

Advances in neurological research and emphasis on executive function

Emerging research showing the trainability of cognitive processes (e.g., Rueda, et al., 2005)

Recent developments in cognitive science and learning theory


Several of the recent reform proposals reflect a movement from application of uniform proficiency goals to measurement of individual growth in students’ achievement. This shift would call for new types of assessments that are not just cross-sectional measures, but ones that can detect individual student growth over time.


The recent adoption and state-by-state implementation of the Common Core State Standards in English language arts and mathematics represents another significant change in the education policy environment that can be examined with the data collected by the ECLS-K:2011. The Common Core State Standards Initiative, launched in 2010 by state policy leaders in the National Governors Association and the Council of Chief State School Officers, seeks to create common standards that align curricula, college and career readiness, and state tests to the highest standards around the country. Forty-four states and the District of Columbia have adopted the common core standards. Recent comparisons with the state standards being replaced indicate that the common core standards are more challenging than most individual state standards (Carmichael, Martino, Porter-Magee, and Wilson 2010). Beginning with the spring first-grade data collection, the ECLS-K:2011 included items about instruction of language arts and mathematics in the classroom-level teacher questionnaire to reflect the appropriate grade-level standards as described by the Common Core State Standards.


In addition to changing policies and approaches to early education and research, the United States is still facing economic challenges that will affect the Federal budget in the coming years. The deep recession and the associated high unemployment rate and tightened state and local budgets have direct impacts on districts and schools. Reduced services and staff may well affect children’s experiences in school. Beginning with the spring first-grade data collection, the school administrator questionnaire included questions asking about actions that may have occurred as a result of changes in funding, such as staff additions or contractions in the past year, changes in staff burden and salaries, adjustments in class sizes, and increases in family poverty (that is, in the proportion of students eligible for free or reduced-price lunch). These items were included because the current economic climate may also affect children’s home lives if the family experienced changes in their economic circumstances or if friends and family members did. Researchers have studied the effect of the recession on child well-being and found many adverse effects, including an increase in the number of households classified as “food insecure” (Sell et al., 2010). According to the “NSLP Fact Sheet” of the Food and Nutrition Service, USDA,8 the number of students enrolled in the National School Lunch Program, i.e., those receiving free or reduced-price lunch, continues to increase. For example, in 1990, over 24 million children participated in the program, while in 2011, that number was greater than 31.8 million children.


A.2.1.2 School Readiness

Education policymakers and researchers continue to debate the most appropriate ways to promote school readiness. Most experts agree that school readiness is a multifaceted phenomenon that encompasses several domains of child development. In addition to cognitive development and pre-academic skills (e.g., letter and number recognition, emerging literacy), school readiness is conceptualized as involving the whole child, including health and physical well-being, language acquisition, social and emotional development, and interest in and enthusiasm for learning. It is therefore important for the ECLS-K:2011, like the ECLS-K and the ECLS-B, to capture all of these domains to fully understand how children’s early learning and development are affected by shifts in policy and by changes in children’s lives.


One effect of ESEA 2002 is a change in curricular emphasis in the early grades. ESEA 2002 emphasizes evidence-based early literacy activities that stress the development of specific literacy skills. ESEA 2002 includes two initiatives, Reading First and Early Reading First, which seek to lay the foundation for future school success by stressing the following five skills to enable children to become proficient readers:

  • Phonemic awareness: the ability to hear and identify sounds in spoken words;

  • Phonics: the relationship between the letters of written language and the sounds of spoken language;

  • Vocabulary: the words students must know to communicate effectively;

  • Fluency in reading: the capacity to read text accurately and quickly; and

  • Comprehension: the ability to understand and gain meaning from what is read.

ESEA 2002 and these reading programs view literacy as a learned skill that requires coherent skill-based instruction using scientifically supported curricula provided by highly qualified teachers. By ensuring that the ECLS-K:2011 assessments and teacher questionnaires measure these skills, the ECLS-K:2011 can be used to examine children’s emerging literacy and cognitive development since the passage of ESEA 2002. The focus of ESEA 2002 on early literacy skills has essentially shifted discussions of school readiness from the range of domains mentioned above to two: (1) language development and (2) cognition and general knowledge. It will be important to examine the trajectories of other important dimensions of school readiness, such as social competence, approaches to learning, and other indicators of socioemotional development, in light of this aforementioned shift.


A.2.1.3 Executive Functioning

Recent research in the cognitive and neurological sciences is providing important insights into developmental processes associated with school readiness. Of particular interest is research on the importance of executive functioning for learning and academic achievement (e.g., Blair and Razza, 2007; Posner and Rothbart, 2006). “Executive functioning” refers to a set of interdependent processes that work together to accomplish purposeful, goal-directed activities, including working memory, attention, inhibitory control, and other self-regulatory processes. Executive functioning processes work to regulate and orchestrate cognition, emotion, and behavior to help a child learn in the classroom. For example, executive control, which is associated with the prefrontal cortex, involves the ability to allocate attention, to hold information in working memory, and to withhold an inappropriate response (Casey et al., 2000). Not only are these cognitive and behavioral processes predictive of reading and math achievement (Blair and Razza, 2007), but emerging research also indicates that some of these cognitive processes are trainable (Rueda et al., 2005; Klingberg et al., 2005) and can be improved upon in regular public school classrooms without costly interventions (Diamond et al., 2007).


Many other cognitive processes are necessary for learning and achievement. For example, learning, whether it involves reading comprehension, solving applied mathematics problems, or something else, involves the interaction between working memory and long-term memory and the formation of linkages between the two. The ECLS-K:2011 will be strengthened by obtaining direct and indirect measures that capture specific learning issues such as attention problems, memory problems, inability to withhold inappropriate responses, and language issues. In particular, little attention has been paid to differences in these areas across racial/ethnic subgroups or between low-income and other children (Noble et al., 2005). The ECLS-K:2011 will provide information to allow for the investigation of such differences.


A.2.1.4 Demographic Changes

The United States is also experiencing demographic shifts in its population. Ours is becoming an increasingly diverse society (Frey, 2011). Recent analyses of decennial census data show that from 2000 to 2010, the growth in the nation’s child population was due primarily to increases in the Hispanic, Asian, and other groups who are not White, Black, or American Indian (Frey 2011). The demographic shift is especially evident in the school-aged population. In 2009, 21 percent of children ages 5 to 17 (or 11.2 million children) spoke a language other than English at home and 5 percent spoke English with difficulty. Of those speaking English with difficulty, 73 percent spoke Spanish, 13 percent spoke an Asian or Pacific Island language, 10 percent spoke an Indo-European language other than Spanish, and 4 percent spoke some other language at home (Aud et al., 2011).


Language is not the only challenge for many of these children, particularly those born outside the United States. Many children who immigrate to the United States, especially those with parents from Mexico and Central America, come from larger families, families in which the parents have less education, and families with lower income than those of the native-born (Larsen, 2004). Also, families from other cultures may have different normative expectations for how they should interact with schools and teachers. The ECLS-K:2011 will enable researchers to examine how schools and teachers are meeting the needs of these students and their families and to measure the effectiveness of those efforts.


A.3 Use of Improved Information Technology

When feasible, available technology will be used to improve data quality and reduce respondent and school burden. The ECLS-K:2011 parent interviews and child assessments will be conducted using computer-assisted interviewing (CAI). Using CAI will increase data collection efficiency by permitting preloads of available data about the sampled schools and children, on-line editing of information as it is entered (e.g., correcting data entry errors caught through range and logic checks or correction of information provided in a previous round of data collection), and routing of respondents through complex question branching—all of which also reduce respondent burden by producing faster interviews and reducing the need to recontact respondents to obtain missing information (which would occur, for example, if a field interviewer not using CAI does not follow a skip pattern correctly and items that should be asked are not). Parent interviews are primarily conducted by telephone; however, field interviewers will make in-person visits to complete interviews with parents who do not have telephones or who are difficult to reach by telephone. These in-person interviews will also be conducted using CAI on laptop computers. The CAI system has important features that will improve the quality of the data and reduce the burden on respondents, as follows:


  • Initial Contact: The CAI system will guide the ECLS-K:2011 field interviewer in making contact with the parent at the correct phone number or address and with the child at the school and will include prompts to help the interviewer identify the correct respondent.

  • Routing the Direct Child Assessment: The CAI system will be programmed so that the initial routing tests at the beginning of the reading, mathematics, and science cognitive assessment subtests are scored by the computer and the appropriate second-stage tests corresponding to the child’s ability level are administered. The benefits of such a two-stage assessment are increased adaptiveness, reduced burden for the child, and increased precision of measurement, because the assessors do not need to score the routing test and select the appropriate second-stage test themselves. In addition, there typically are some skip rules programmed into the CAI for reading and math that will skip children to a set of questions on a different topic, or to the next domain, if they are struggling and have responded to several questions incorrectly. For the executive function Numbers Reversed task, the CAI system accurately determines where the task ends depending on the child’s performance. As mentioned above, the computerized version of the executive function Dimensional Change Card Sort (DCCS) and Flanker tasks allows the assessment to accurately capture response time, which becomes more important in these particular assessments as children get older.

  • Skip Patterns: The CAI system automatically guides interviewers through the complex skip patterns in the parent interviews, thereby reducing respondent burden, reducing potential for interviewer error, and shortening the interview administration time. The respondent will not be asked inapplicable questions and the interviewers do not need to spend time determining which questions to ask.

  • Copying Responses: The CAI system will be programmed to copy responses from one item to another and from one round to another to prevent unnecessary repetition of questions and to aid in respondents’ recall. For example, information that is provided by the respondent early in the interview may be useful later in the interview; such information can be displayed on the screen or used as a wording fill for relevant questions to assist the respondent. Additionally, information from the previous waves of data collection can be copied to the current wave’s interview and be verified by the respondent, eliminating the need to collect the data again.

  • Time Intervals: The CAI system also provides automated time and date prompts that are very useful in longitudinal studies to assist respondents in remembering specific time periods. The interview can also provide the specific timeframe for the interval between the previous and the current wave of data collection, to help respondents provide information without repeating information they had given at the previous data collection period.

  • Receipt Control: The CAI system will provide for automatic updates to the interview status of study participants and will be used to produce status reports that allow timely and ongoing monitoring of the survey’s progress.

The use of a CAI system for the ECLS-K:2011 is critical because of the intricate and sometimes difficult skip patterns that are part of complex survey instruments and because of the longitudinal nature of the data collection in which the same respondent might be interviewed at multiple time points. Without CAI, the ECLS-K:2011 instruments would be difficult to administer over repeated measurement periods, and respondent burden would be increased.
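The two-stage routing logic described above can be illustrated with a short sketch. The item counts, cut scores, and form names below are hypothetical stand-ins for illustration only, not the actual ECLS-K:2011 scoring specifications.

```python
# Hypothetical sketch of two-stage adaptive routing: the computer scores a
# first-stage routing test and selects a second-stage form matched to the
# child's demonstrated ability level. Thresholds and form names are
# illustrative assumptions, not the actual ECLS-K:2011 rules.

def score_routing_test(responses, answer_key):
    """Count correct answers on the first-stage routing test."""
    return sum(1 for r, k in zip(responses, answer_key) if r == k)

def select_second_stage(routing_score, cut_low=8, cut_high=15):
    """Map the routing score to a second-stage form (illustrative cuts)."""
    if routing_score < cut_low:
        return "low-difficulty form"
    if routing_score < cut_high:
        return "middle-difficulty form"
    return "high-difficulty form"

answer_key = ["b", "a", "d", "c", "a"] * 4          # 20-item routing test
responses  = ["b", "a", "d", "c", "b"] * 4          # child's answers
score = score_routing_test(responses, answer_key)   # 16 correct
print(select_second_stage(score))                   # prints "high-difficulty form"
```

Because the scoring and form selection are automated, the assessor never needs to tally the routing test by hand, which is the source of the precision and burden benefits noted above.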


As in the spring 2014 third-grade data collection, the child questionnaire will be administered using audio computer-assisted self-interview (audio-CASI) technology. With this format, the items and response options are presented to the child on a touchscreen and the child enters his or her own responses by touching the screen. The responses are then saved on the laptop and will be transmitted along with the data from the assessments. There are several advantages to using an audio-CASI version of the child questionnaire. This format provides more privacy to children as they answer questions that may be sensitive for them, and administration is more standardized because all children hear the items read to them in exactly the same way with the recording. Also, electronic capture of responses reduces processing time and the potential for data entry error.


A computer-based data management system will be used to manage the sample. The sample management system uses encrypted data transmission and networking technology to maintain timely information on respondents in the sample, including contact, tracking, and case completion data. This system is particularly important as children move from one school to another over the course of the ECLS-K:2011 study. The use of technology for sample management will maximize tracking efforts, which should have a positive effect on the study’s ability to locate movers and achieve acceptable response rates.


The ECLS-K:2011 Message Center, which was first used in the spring 2014 third-grade round, will be used again for the spring 2015 fourth-grade round. The Message Center is a secure website accessed with a username and password assigned to specific users, namely field staff (field managers, school recruiters, and team leaders) and participating school coordinators. The list of children enrolled in each school who are participating in the study will be sent to the school coordinator from the data collection contractor’s home office as an attachment to a secure message. This method will make it more convenient, as compared to communicating via telephone, for school coordinators to access the list of participating children. The Message Center will also greatly enhance the security of this list, as it provides a method for sharing confidential personal identifying information between schools and field staff in a secure environment. Because of the nature of the system, the list cannot be printed or forwarded to other school staff. If the school coordinator is amenable, the message system can also be used for other types of sensitive communication between the school coordinator and the field staff (for example, when informing the field staff that a child has moved to a new school).


A.4 Efforts to Identify Duplication

The ECLS-K:2011 will not be duplicative of other studies. The ECLS-K is the only other study to collect information as detailed and extensive as that collected by the ECLS-K:2011 for a cohort of young children and to follow the children throughout elementary school. The ECLS-K:2011 extends the information obtained by the ECLS-K to a new cohort, opens up possibilities to investigate new research questions, and allows important comparisons to be made between two kindergarten cohorts attending school a dozen years apart. In addition, the ECLS-K:2011 has collected data during the children’s second-grade year and plans to collect information during their fourth-grade year, which the ECLS-K did not.


A literature search was conducted to identify and review research studies with the same study purpose and goals as those proposed for the ECLS-K:2011. To be included in the search, the research had to be (1) a survey-based study of a population with a sample of 1,000 or more, (2) longitudinal in design, and (3) focused on children’s cognitive development in the elementary, middle, and/or secondary grades. Although similar studies were found, they were generally confined to limited geographic areas (e.g., Baltimore, Maryland; Greensboro, North Carolina) or, in the case of studies conducted on the national level (e.g., Prospects, Children of the National Longitudinal Survey of Youth [NLSY Child Supplement]), were not based on probability samples of kindergartners. For example, Prospects began with first graders and targeted Title I recipients. NLSY79’s Child Supplement targeted the children of female sample members of a household-based 1979 sample of 14- to 21-year-olds. The Head Start Family and Child Experiences Survey (FACES), which is similar to the ECLS-K:2011 in terms of the content and components included, has followed several cohorts of children from preschool through early elementary school. However, FACES has not followed the progress of children in school beyond kindergarten or first grade, and the samples are limited to children served by Head Start. The NICHD Study of Early Child Care and Youth Development focused on similar child development outcome areas (social, emotional, intellectual, and language development, health, and physical growth), but did not include the same depth of information about the child’s school experiences as does the ECLS-K:2011. The NICHD sample was recruited from hospitals shortly after the birth of the children, and the study’s main focus was on early child care, including maternal care and the relationship between that care and children’s developmental outcomes.
Studies such as the National Education Longitudinal Study of 1988 (NELS:88) and Education Longitudinal Study of 2002 (ELS:2002) began with students in the middle and high school grades. Another major finding of the literature review was that most studies used group-administered achievement tests, which, for young children, can be less reliable than individually administered assessments. Individually administered assessments, like those used in the ECLS-K:2011, allow the assessor to establish rapport and offer motivation and supportive conditions so that each child performs to the best of his or her ability.


A.5 Method Used to Minimize Burden on Small Businesses

Private elementary schools, both not-for-profit and proprietary, have been drawn into the sample. These schools will benefit from the study’s burden-reducing strategies (e.g., instruction packets for participants, toll-free help lines, and prepaid business return envelopes), which were designed for all types of schools.


A.6 Frequency of Data Collection

This submission describes and requests approval for the spring fourth-grade data collection, which will occur in the spring of 2015. The first data collection for the study began in the fall of 2010, and additional data collections have occurred in spring 2011, fall 2011, spring 2012, fall 2012, spring 2013, and spring 2014. One of the main goals of the ECLS-K:2011 is to measure children’s cognitive, socioemotional, and physical growth and development, as well as changes in the contextual characteristics (i.e., family, classroom, school, and community factors) that can affect growth. The spring fourth-grade data collection is one of the periodic follow-ups that will collect information to be compared to baseline (kindergarten) information, thereby allowing for analyses of change for children and their environments.


After this fourth-grade year, the study design calls for one more follow-up collection in the spring of the fifth-grade year. This frequency of data collection is linked to the rate of change that is expected for children of this age and the desire to capture information about children as critical events and transitions are occurring, rather than measuring events and transitions retrospectively. Without data collection follow-ups, the study of children’s cognitive, socioemotional, and physical development is hindered. Assuming the fourth-grade collection is as successful as the previous collections have been to date, a clearance request will be submitted in the future for the follow-up collection in fifth grade.


A.7 Special Circumstances of Data Collection

No special circumstances for this information collection are anticipated.


A.8 Consultants Outside the Agency

NCES consulted with a range of outside agencies over the life of the ECLS‑K, and such input has also informed the ECLS-K:2011 study design and instrumentation, since they draw heavily from the ECLS-K. During the early development of the ECLS-K, project staff met with representatives from a wide range of federal agencies with an interest in the care and well-being of children (see Table A-6). The goal of this activity was to identify policy and research issues and data needs. Similarly, consultation with federal agencies has occurred and continues for the ECLS-K:2011. Several of the early consultations with government agencies resulted in interagency agreements funding questions, sections of or full study instruments, and components of the child assessments (specifically, the hearing evaluations).


Project staff have also consulted several other organizations (see Table A-7) that have an interest in the care, well-being, and education of young children. The goal of this activity was to obtain additional perspectives on policy and research issues and data needs. While most of this consultation occurred during the design and conduct of the ECLS-K, there was also some outside consultation during the design of the ECLS-K:2011.


Similar to its predecessor, the ECLS-K:2011 represents a collaborative effort by education and health and human services agencies. NCES supports the development of the core design of the ECLS-K:2011. Partner agencies supporting the inclusion of the supplemental questions or sections of the study instruments that enrich the ECLS-K:2011 by providing expert input or funding (or both) have included the Economic Research Service of the U.S. Department of Agriculture, the National Center for Special Education Research in the Institute of Education Sciences of the U.S. Department of Education, the Administration for Children and Families in the U.S. Department of Health and Human Services, and the National Institute on Deafness and Other Communication Disorders and the National Eye Institute, both at the National Institutes of Health in the U.S. Department of Health and Human Services. Table A-6 lists the Federal agency consultants for the ECLS-K and ECLS-K:2011 and Table A-7 lists other organization consultants for the ECLS-K.


In preparation for the ECLS-K:2011 collections, the data collection contractor assembled expert panels, specifically a Technical Review Panel (TRP) and Content Review Panels (CRPs), to review and comment on issues related to the development of the study and survey instruments. The members of the panels included experts in research, policy making, and practice in the fields of early childhood education and development, elementary education, health, research methodology, special populations, and assessment.


There have been three TRP meetings. The first was a 2-day meeting held in November 2008. The meeting focused on major design and content issues, such as study periodicity, the benefits of including an assessment of science in kindergarten, the assessment of executive functioning, and the content of a Spanish language assessment for native Spanish speakers who are English language learners. The TRP members also provided suggestions for specific questionnaire items to be included in the instruments in the full-scale national data collection. Table A-8 lists the ECLS-K:2011 TRP members present at the first meeting.


The second TRP meeting was a 2-day meeting held in March 2011. The meeting focused on content for the first- and second-grade non-assessment instruments, including suggestions for specific questionnaire items to be included in the instruments in the second-grade data collection. Table A-9 lists the ECLS-K:2011 TRP members present at the second meeting.


The most recent TRP was a 2-day meeting held in November 2013. The discussion focused on the development of the fourth-grade instruments, as well as looking ahead to the fifth-grade data collection. Panel members recommended study constructs and specific items for inclusion in the parent interview, child questionnaire, and teacher and school administrator questionnaires. Table A-10 lists the ECLS-K:2011 TRP members present at the third meeting.


Table A-6. Federal agency consultants for ECLS-K and ECLS-K:2011

Diane Schilder1

General Accounting Office

Cindy Prince,1 Emily Wurtz1

National Education Goals Panel

Andy Hartman1

National Institute for Literacy

Mary Queitzsch,1 Larry Suter1

National Science Foundation

Michael Ruffner,1 Bayla White,1 Brian Harris-Kojetin1

Office of Management and Budget

John Endahl,1 Jeff Wilde,1 Joanne Guthrie, Victor Oliveira1

U.S. Department of Agriculture

Don Hernandez1

U.S. Department of Commerce

Bureau of the Census

Marriage and Family Statistics

Tim D’Emillio

U.S. Department of Education, OELA

Naomi Karp,1 Dave Malouf,1 Ivor Pritchard,1 Marsha Silverberg1

U.S. Department of Education, IES


Pia Divine,1 Esther Kresh,1 Ivelisse Martinez-Beck, Ann Rivera

U.S. Department of Health and Human Services

Administration for Children, Youth, and Families


Gerry Hendershot,1 John Kiley,1 Michael Kogan,1 Mitchell Loeb, Patricia Pastor

U.S. Dept. of Health and Human Services

National Center for Health Statistics

Howard Hoffman

U.S. Dept. of Health and Human Services

National Institute on Deafness and Other Communication Disorders, National Institutes of Health

Mary Frances Cotch

U.S. Dept. of Health and Human Services

National Eye Institute, National Institutes of Health

Christa Themann, William Murphy

Centers for Disease Control and Prevention

National Institute for Occupational Safety and Health


Michael Planty, Jenna Truman

U.S. Department of Justice

Bureau of Justice Statistics

Tom Bradshaw,1 Doug Herbert1

National Endowment for the Arts

Jeffrey Thomas1

National Endowment for the Humanities

Patricia McKee

U.S. Department of Education

OESE Compensatory Education Programs

Cathie L. Martin1

U.S. Department of Education, OIE

Scott Brown,1 Louis Danielson,1 Glinda Hill,1 Lisa Holden-Pitt,1 Kristen Lauer,1 Marlene Simon-Burroughs,1 Larry Wexler

U.S. Department of Education, OSEP

Jon Jacobson

U.S. Department of Education, NCEE

Lisa A. Gorove1

U.S. Department of Education

OUS, Budget Service, ESVA

Elois Scott1

U.S. Department of Education

OUS, PES, ESED

Richard Dean1

U.S. Department of Education

OVAE, Adult Literacy

Jacquelyn Buckley

U.S. Department of Education

IES, NCSER

Jeff Evans,1 Sarah Friedman,1 Christine Bachrach,1 Peggy McCardle1

U.S. Department of Health and Human Services

NICHD, Center for Population Research


Jim Griffin and Regina Bures

U.S. Department of Health and Human Services

NICHD, National Institutes of Health

Martha Moorehouse,1 Anne Wolf1

U.S. Department of Health and Human Services

Office of Assistant Secretary for Planning & Evaluation, Children and Youth Policy

Katrina Baum1

Department of Justice

Bureau of Justice Statistics

Meredith A. Miceli

U.S. Department of Education

Office of Special Education Programs

1 Consultant for the ECLS-K only.

NOTE: Affiliation listed is the affiliation at the time input on the study was provided.


Table A-7. Other organization consultants for ECLS-K and ECLS-K:2011

Lynson Bobo

Project Associate

Resource Center on Educational Equity

Council of Chief State School Officers

Susan Bredekamp, Barbara Willer

National Association for the Education of Young Children

Jane Clarenbach

National Association for Gifted Children

Mary Jo Lynch

American Library Association

Office of Research and Statistics

Keith W. Mielke

Children’s Television Workshop

June Million, Sally McConnell, Louanne Wheeler

National Association of Elementary School Principals

Evelyn Moore, Erica Tollett

National Black Child Development Institute

Thomas Schultz

Director, Center for Education Services for Young Learners

National Association of State Boards of Education

Larry Suter

Independent Education Consultant, Formerly of NSF and NCES

NOTE: Affiliation listed is the affiliation at the time input on the study was provided. Italicized text used for consultation that occurred for the ECLS-K:2011. All other consultations occurred for the ECLS-K.


Table A-8. ECLS-K:2011 first TRP meeting attendee list (November 2008)

Karl Alexander

Department of Sociology

Johns Hopkins University

Jim Bauman

Center for Applied Linguistics

Washington, DC

Maureen Black

Growth and Nutrition Department

University of Maryland Medical Center

Joanne Carlisle

School of Education

University of Michigan

Janet Fischel

State University of New York at Stony Brook & University Medical Center

Fred Morrison

Department of Psychology

University of Michigan

Charlotte Patterson

Department of Psychology

University of Virginia

Robert Pianta

The Center for Advanced Teaching and Learning

University of Virginia

Kit Viator

Massachusetts Department of Education

NOTE: Affiliation listed is the affiliation at the time input on the study was provided.


Table A-9. ECLS-K:2011 second TRP meeting attendee list (March 2011)

Karl Alexander

Department of Sociology

Johns Hopkins University

Jim Bauman

Center for Applied Linguistics

Washington, DC

Joanne Carlisle

School of Education

University of Michigan

Robert Crosnoe

Department of Sociology

University of Texas at Austin

David Dickinson

Department of Teaching and Learning

Vanderbilt University

Rolf Grafwallner

Maryland Public Schools

Greg Roberts

The Meadows Center for Preventing Educational Risk

University of Texas at Austin

Deborah Stipek

School of Education

Stanford University

NOTE: Affiliation listed is the affiliation at the time input on the study was provided.


Table A-10. ECLS-K:2011 third TRP meeting attendee list (November 2013)

Robert Bradley

Family & Human Dynamics Research Institute

Arizona State University

Carol Connor

Department of Psychology

Arizona State University

Robert Crosnoe

Department of Sociology and Population Research Center

University of Texas at Austin

David Dickinson

Department of Teaching and Learning

Vanderbilt University

George Farkas

School of Education

University of California, Irvine

Gary Ladd

Sanford School of Social and Family Dynamics

Arizona State University

Megan McClelland

Hallie E. Ford Center for Healthy Children & Families

Oregon State University

Greg Roberts

The Meadows Center for Preventing Educational Risk

University of Texas at Austin

Judy Snow

State Assessment Director

Montana Office of Public Instruction

NOTE: Affiliation listed is the affiliation at the time input on the study was provided.


To date, ten CRP meetings have been held: reading (May 2009), mathematics (May 2009), science (May 2009), English language learners (August 2009), executive function (November 2009, March 2011, and December 2012), socioemotional development (March 2011 and October 2012), and teacher practices (March 2011). For each of these specific content areas, panel members provided critical review of the instruments for inclusion in the national data collections. The meetings focused on the appropriateness and adequacy of specific instruments by considering features such as domain coverage, age appropriateness, and technical quality. Table A-11 lists the ECLS-K:2011 CRP members.


Table A-11. ECLS-K:2011 CRP member list, by panel

Reading Panel

Susan Conrad

Independent consultant, assessment development

Gloria Johnston

School of Education, National University

Alba Ortiz

University of Texas at Austin

Barbara Wasik

Temple University

Mathematics Panel

Doug Clements

State University of New York, Buffalo

Donna Compano

Independent consultant, assessment development, math facilitator, elementary teacher

Lizanne DeStefano

University of Illinois at Urbana-Champaign

Leah Parker

Journeys Academy, Gifted Education Specialist

Science Panel

Christie Bean

JJ Ciavarra Elementary School

Kathy DiRanna

University of California, Irvine

Angela Eckhoff

Clemson University

Christine Y. O’Sullivan

Science Consultant

Michael Padilla

Clemson University

English Language Learners Panel

Jamal Abedi

University of California, Davis

Catherine Crowley

Teachers College

Eugene E. García

Arizona State University

Vera Gutierrez-Clellen

San Diego State University

Executive Function Panel

Clancy Blair

New York University

Adele Diamond (March 2011 meeting only)

University of British Columbia

Lisa Jacobson (December 2012 meeting only)

Kennedy Krieger Institute

Megan McClelland

Oregon State University

Philip Zelazo

University of Minnesota

Socioemotional Development Panel

Pamela Cole (March 2011 meeting only)

The Pennsylvania State University

Rick Fabes

Arizona State University

Karen Bierman (October 2012 meeting only)

The Pennsylvania State University

Allan Wigfield (October 2012 meeting only)

University of Maryland

Ross Thompson (March 2011 meeting only)

University of California, Davis

Carlos Valiente (March 2011 meeting only)

Arizona State University

Dorothy Espelage (October 2012 meeting only)

University of Illinois

Teacher Practices Panel

Stephanie Al Otaiba

Florida State University

Hilda Borko

Stanford University

Carol Connor

Florida State University

Barbara Wasik

University of North Carolina

NOTE: Affiliation listed is the affiliation at the time input on the study was provided.


A.9 Provision of Payments or Gifts to Respondents

Obtaining high response rates is critical for all longitudinal studies. At the start of a longitudinal data collection, it is essential to establish the good will of respondents and to demonstrate that we value their participation in the study. Good will can be established through well-designed respondent materials that inform respondents about the goals of the study and their role in it, rapport and professionalism on the part of the field staff, and a small token incentive. The same incentive plan that was approved by OMB for the spring 2014 third-grade ECLS-K:2011 data collection is proposed for the spring 2015 fourth-grade data collection. The plan is designed to help respondents recognize the merits of the study and thereby encourage high response rates.


As described below, we propose to provide monetary incentives to school staff, as has been done in prior rounds of data collection for the ECLS-K:2011. Parents and children will not receive any significant incentive, monetary or otherwise. As in the past, children will be given ECLS-K:2011 pencils with the sun logo that they use for the math portion of the assessment. In the third-grade round, we also gave children a green rubber bracelet with the study logo printed on it. For the fourth-grade round, we propose offering participating children a multi-ink pen with the sun logo. Also as in the third-grade round, we plan to send a set of ECLS-K:2011 post-it notes with the sun logo (included in Appendix G) with the parent letter discussing the new round of data collection. The study is now entering its fifth year, and both parents and children have been asked to participate several times, some as many as seven times. These tokens of appreciation are being sent as a small gesture in an effort to maintain enthusiasm for and a positive attitude about the study. The parent response rates have consistently been lower than desired (between 67 percent and about 80 percent), so another goal of providing these small tokens of appreciation is to maintain the participation of parents who have participated consistently in the past and encourage the participation of those who have not.


A.9.1 School Incentive

High levels of school participation are integral to the success of the study. Without a school’s cooperation, there can be no school, teacher, or child data collection activity at that facility. NCES recognizes that administrators will assess the study’s burden level before agreeing to participate. To offset the perceived burden, NCES intends to continue its use of strategies that have worked successfully in the past for the ECLS-K:2011, the ECLS-K, and other major NCES studies (High School and Beyond, the National Education Longitudinal Study of 1988, and the Education Longitudinal Study of 2002). It is important to provide schools with an incentive because the study asks a lot of them, including allowing field staff to be in their schools for up to 3 days, providing a contact person and space for the children to be assessed, removing children from their classes while they are assessed, and obtaining information about the school, the teachers, and the children.9 Given the many demands and outside pressures that schools face, it is essential that they see that we understand the burden we are placing on them and that we value their participation. As was done for the other ECLS-K:2011 data collections, we propose to provide each school with $200. An honorarium check in the amount of $200 will be mailed to each school at the end of the spring fourth-grade data collection, along with a note thanking the school for its participation.10

A.9.2 School Administrator

To build response rates for the school administrator questionnaire, we propose to remunerate school administrators. In the ECLS-K, no incentive was provided for administrators until the third-grade round of data collection, and the field period had to be extended (for both the kindergarten and first-grade rounds) to obtain response rates for the school administrator questionnaire that were closer to the desired rate of 85 percent or higher. Providing school administrators with an incentive will reduce the potential need to extend the field period and help avoid delays in data delivery. We will offer school administrators a $25 incentive in the spring fourth-grade collection, the same amount that was given to school administrators in the previous rounds of the ECLS-K:2011; the incentive will be attached to the questionnaire given to the school administrator to complete. In the spring second-grade round of the ECLS-K:2011, we offered school administrators a $25 incentive, and a completion rate of 91 percent was achieved for the school administrator questionnaire.11


A.9.3 Teachers

In the base-year, first-, and second-grade collections of the ECLS-K:2011, teachers received $7 per child-level questionnaire because they were asked to provide a significant amount of information about each study child based on their observations of these students. A check for the incentive was attached to the package of instruments the teacher received each fall and spring. For the spring third-grade collection of the ECLS-K:2011, OMB approved a change in the incentive structure to the model that was used in later rounds of the ECLS-K. We propose to continue this incentive structure for the fourth-grade data collection as well because teachers are again being asked to provide a significant amount of key information about the study children’s school experiences and outcomes. General classroom teachers will still be given $7 per subject-specific TQC, along with an additional $20 associated with the teacher-level TQ. Also consistent with the third-grade collection, special education teachers will receive $7 for each child-level questionnaire and $20 for the teacher-level questionnaire. We are proposing to use the same incentive structure for all teachers, regardless of the specific questionnaires they are being asked to complete, to protect against any perception of unfairness that might result if teachers within a school talk to one another about the amount they have received for a specific questionnaire.


Based on what occurred in the ECLS-K, we expect that teachers will have on average two sampled children linked to them, resulting in a total remuneration of $34 ($7 for each of the two subject-specific TQCs, plus $20 for the teacher-level TQ). The estimate for special education teachers is the same. A check for the incentive will be attached to the package of instruments each teacher receives.
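The incentive structure described above reduces to simple arithmetic. The sketch below illustrates the calculation; the constant and function names are illustrative, not taken from the study materials:

```python
# Incentive structure described above: $7 per child-level (subject-specific)
# questionnaire plus $20 for the single teacher-level questionnaire.
# Names are illustrative, not from the ECLS-K:2011 materials.
PER_CHILD_QUESTIONNAIRE = 7    # dollars per subject-specific TQC (or SPB)
PER_TEACHER_QUESTIONNAIRE = 20  # dollars for the teacher-level TQ (or SPA)

def teacher_incentive(n_linked_children: int) -> int:
    """Total incentive for a teacher linked to n_linked_children sampled children."""
    return n_linked_children * PER_CHILD_QUESTIONNAIRE + PER_TEACHER_QUESTIONNAIRE

# With the expected average of two sampled children per teacher:
print(teacher_incentive(2))  # 34, matching the $34 figure above
```

The same formula applies to general classroom and special education teachers, consistent with the uniform structure described above.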


A.9.4 School Coordinators

School coordinators act as the study liaison between study staff and their school and, as such, they play a very important role in the ECLS-K:2011.12 They communicate necessary information to parents, notify teachers and encourage their participation, arrange the assessment logistics (e.g., space to conduct the assessments), and collect hard-copy teacher and school administrator questionnaires. For this reason, school coordinators will be offered a $25 incentive for providing assistance to the study in the spring fourth-grade data collection.13 The $25 checks will be attached to the packets mailed to the coordinators at the start of data collection. The study offered the same incentive to the school coordinators during the ECLS-K:2011 kindergarten, first-, second-, and third-grade data collections.


A.10 Assurance of Confidentiality

The ECLS-K:2011 plan for protecting confidentiality of the project participants conforms with the following federal regulations and policies: the Privacy Act of 1974 (5 U.S.C. 552a), Privacy Act Regulations (34 CFR Part 5b), the Education Sciences Reform Act of 2002 (20 U.S. Code Section 9573), the Computer Security Act of 1987, the NCES Restricted-Use Data Procedures Manual, and the NCES Standards and Policies.


All adult respondents who are participating in research under this clearance are informed that the information they provide will be protected from disclosure except as required by law (20 U.S. Code Section 9573) and that their participation is voluntary. All adult respondents receive an introductory letter that explains NCES’s and the contractor’s adherence to policies on disclosure.14 Also, this information appears on the cover of each of the study self-administered questionnaires. This information was provided to parents as the guardians for their children when their cooperation was sought during the base year of the study.


Since early spring 2010 (when recruitment for the kindergarten data collections began), information about the protection of data from disclosure has been conveyed to state, district, and other school officials at the time their cooperation for the study was sought. As sampled children move to new schools, this information will be provided to the districts in which those schools are located, if necessary (i.e., if there are no participating schools in those states and districts already). New schools in the study will receive the letter developed for schools to which sampled children transfer that can be found in Appendix G of this clearance request, as well as the study brochure that was approved in May 2010 (see Appendix H of OMB No. 1850-0750 v. 8).


Respondent letters to parents summarize the data protection assurances: namely, that data will be combined to produce statistical reports; that no data will be published that link respondents to their responses; that participation is voluntary; and that a federal statute protects the data from disclosure except as required by law (20 U.S. Code Section 9573).


All contractor staff members working on the ECLS-K:2011 project or having access to the data (including monitoring of interviews and assessments) are required to sign an NCES Affidavit of Nondisclosure and a Confidentiality Pledge. They also are required to complete mandatory training on data confidentiality and the safe handling of data. The contractor will keep the original notarized affidavits on file and submit PDF copies of all affidavits to NCES quarterly. In addition, contractor staff will complete background screening in compliance with ACS Directive (OM:5-101).


During the course of data collection, interviewers will be equipped with laptop computers that store any necessary preloaded data as well as the information collected on a given day during the data collection round. Interviewers will be instructed to keep the computers and any hard-copy case materials in a secure place in their homes when they are not being used. When in the field collecting interview or assessment data, interviewers are instructed to keep all materials and the computer in their possession at all times. When driving to or from appointments, interviewers will keep the computer and all materials locked out of sight, so as not to provide an inviting opportunity for burglary. Interviewers will be instructed to transmit the electronic data for a case to a central database on the same day the case is completed; data transmitted electronically will be encrypted during transmission. The laptop configuration is designed with security and confidentiality considerations in mind. In order to access any of the applications, the interviewer must enter a project-specific password and an interviewer identification code, both of which are checked against encrypted versions of the same data; if the password or identification code is entered incorrectly repeatedly, the interviewer is “locked out” of the application. All data files will be encrypted on the computer hard disk. In the event of a hardware failure in the field, the home office will swap the interviewer’s laptop for a new one. The contractor will maintain a supply of “hot spares,” i.e., laptop computers loaded with all necessary ECLS-K:2011 software, which require only the specific interviewer’s identification code and assignment before being sent out.


All mailing of respondent materials, laptops, and hard-copy case materials used by assessors to manage their workload will be done using Federal Express, which has a sophisticated tracking system designed to locate any misdirected packages. All packages will require the recipient’s signature for delivery. To the extent practical, the study name and logo will not be included on hard copy materials used by field staff to record school or respondent information. In the event of a loss of hard copy materials, this procedure would make it more difficult for someone who finds the materials to associate a school or respondent with the study.

Exhibit A-2. Confidentiality pledge


EMPLOYEE OR CONTRACTOR’S ASSURANCE OF CONFIDENTIALITY OF SURVEY DATA


Statement of Policy


{Contractor} is firmly committed to the principle that the confidentiality of individual data obtained through {Contractor} surveys must be protected. This principle holds whether or not any specific guarantee of confidentiality was given at time of interview (or self-response), or whether or not there are specific contractual obligations to the client. When guarantees have been given or contractual obligations regarding confidentiality have been entered into, they may impose additional requirements which are to be adhered to strictly.


Procedures for Maintaining Confidentiality


1. All {Contractor} employees and field workers shall sign this assurance of confidentiality. This assurance may be superseded by another assurance for a particular project.

2. Field workers shall keep completely confidential the names of respondents, all information or opinions collected in the course of interviews, and any information about respondents learned incidentally during field work. Field workers shall exercise reasonable caution to prevent access by others to survey data in their possession.

3. Unless specifically instructed otherwise for a particular project, an employee or field worker, upon encountering a respondent or information pertaining to a respondent that s/he knows personally, shall immediately terminate the activity and contact her/his supervisor for instructions.

4. Survey data containing personal identifiers in {Contractor} offices shall be kept in a locked container or a locked room when not being used each working day in routine survey activities. Reasonable caution shall be exercised in limiting access to survey data to only those persons who are working on the specific project and who have been instructed in the applicable confidentiality requirements for that project.

Where survey data have been determined to be particularly sensitive by the Corporate Officer in charge of the project or the President of {Contractor}, such survey data shall be kept in locked containers or in a locked room except when actually being used and attended by a staff member who has signed this pledge.

5. Ordinarily, serial numbers shall be assigned to respondents prior to creating a machine-processible record and identifiers such as name, address, and Social Security number shall not, ordinarily, be a part of the machine record. When identifiers are part of the machine data record, {Contractor’s Manager of Data Processing} shall be responsible for determining adequate confidentiality measures in consultation with the project director. When a separate file is set up containing identifiers or linkage information which could be used to identify data records, this separate file shall be kept locked up when not actually being used each day in routine survey activities.

6. When records with identifiers are to be transmitted to another party, such as for keypunching or key taping, the other party shall be informed of these procedures and shall sign an Assurance of Confidentiality form.

7. Each project director shall be responsible for ensuring that all personnel and contractors involved in handling survey data on a project are instructed in these procedures throughout the period of survey performance. When there are specific contractual obligations to the client regarding confidentiality, the project director shall develop additional procedures to comply with these obligations and shall instruct field staff, clerical staff, consultants, and any other persons who work on the project in these additional procedures. At the end of the period of survey performance, the project director shall arrange for proper storage or disposition of survey data including any particular contractual requirements for storage or disposition. When required to turn over survey data to our clients, we must provide proper safeguards to ensure confidentiality up to the time of delivery.

8. Project directors shall ensure that survey practices adhere to the provisions of the U.S. Privacy Act of 1974, and any additional relevant laws that are specified in the contract, with regard to surveys of individuals for the Federal Government. Project directors must ensure that procedures are established in each survey to inform each respondent of the authority for the survey, the purpose and use of the survey, the voluntary nature of the survey (where applicable), and the effects on the respondents, if any, of not responding.

PLEDGE


I hereby certify that I have carefully read and will cooperate fully with the above procedures. I will keep completely confidential all information arising from surveys concerning individual respondents to which I gain access. I will not discuss, disclose, disseminate, or provide access to survey data and identifiers except as authorized by {Contractor}. In addition, I will comply with any additional procedures established by {Contractor} for a particular contract. I will devote my best efforts to ensure that there is compliance with the required procedures by personnel whom I supervise. I understand that violation of this pledge is sufficient grounds for disciplinary action, including dismissal. I also understand that violation of the privacy rights of individuals through such unauthorized discussion, disclosure, dissemination, or access may make me subject to criminal or civil penalties. I give my personal pledge that I shall abide by this assurance of confidentiality.


Signature

In addition, as in the third-grade data collection round, a secure message system will be used to share materials containing sensitive information (e.g., children’s names) between the field staff and school staff. In previous rounds of the ECLS-K:2011, the list of participating children was sent separately from all other study materials via Federal Express and contained no study-identifying information. With the secure message system, this list of participating children will be shared electronically, rather than in hard copy. The system does not allow the list to be printed or forwarded to other staff, enhancing the confidentiality of the materials.


Finally, all computer assisted interviewing (CAI) applications will have an audit trail of the case data on the hard disk, so that if the main data files are corrupted, the data can be reconstructed from the audit trails.


After data collection, all personally identifiable information will be stored on a secure server and password protected with access limited to authorized project staff. Personally identifiable data will also be protected through the coding of responses so that no one individual respondent can be identified (specifically or by deduction) through reported variables in the public access data files. NCES will monitor the conduct of the contractor to ensure that the confidentiality of the data is not breached.


NCES understands the legal and ethical need to protect the privacy of the ECLS-K:2011 survey respondents and, with the contractor, has extensive experience in developing data files for release that meet the Government’s requirements to protect individually identifiable data from disclosure. The contractor will conduct a thorough disclosure analysis of the ECLS-K:2011 data when preparing the data files for researchers’ use. This analysis will ensure that NCES has fully complied with the confidentiality provisions contained in 20 U.S. Code, Section 9573. To protect the privacy of respondents as required by 20 U.S. Code, Section 9573, respondents with high disclosure risk will be identified, and a variety of masking strategies will be used to ensure that individuals may not be identified from the data files. These masking strategies include:

  • Swapping data on both the public- and restricted-use files;

  • Omitting key identification variables such as name, address, telephone number, and school name and address from both the public- and restricted-use files (though the restricted-use file will include NCES school ID that can be linked to other NCES databases to identify a school);

  • Omitting key geographic identification variables such as state or ZIP Code from the public-use file;

  • Collapsing categories or developing categories for continuous variables to retain information for analytic purposes while preserving confidentiality in public-use files; and

  • “Topcoding” and “bottomcoding”15 continuous variables in public-use files.
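As an illustration of the last strategy, topcoding replaces values above a cutoff with the cutoff itself, and bottomcoding does the reverse at the low end. The sketch below is generic, with hypothetical threshold values; it is not the actual disclosure-limitation procedure or the cutoffs used for the ECLS-K:2011 files:

```python
# Generic top/bottomcoding of a continuous variable. The thresholds and data
# are hypothetical, not actual ECLS-K:2011 disclosure cutoffs or values.
def topcode(values, ceiling):
    """Replace any value above `ceiling` with `ceiling`."""
    return [min(v, ceiling) for v in values]

def bottomcode(values, floor):
    """Replace any value below `floor` with `floor`."""
    return [max(v, floor) for v in values]

incomes = [12_000, 85_000, 450_000]   # hypothetical reported values
masked = topcode(incomes, 250_000)    # extreme value is no longer identifying
print(masked)  # [12000, 85000, 250000]
```

The extreme observation retains an analytically useful value while no longer standing out as a potential identifier.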

A.11 Sensitive Questions

The ECLS-K:2011 is a voluntary study, and no persons are required to respond to the interviews and questionnaires or to participate in the assessments. In addition, respondents may decline to answer any question they are asked. This voluntary aspect of the survey is clearly stated in the advance letter mailed to adult respondents, the study brochure,16 and the instructions in hard-copy questionnaires, and it is stressed in interviewer training to ensure that interviewers both communicate this to participants and follow these guidelines. Additionally, assessors are trained that children may refuse to participate at the time of the assessment visit and that staff are to respect the children’s wishes. Also, prior to the start of the child questionnaire, children are instructed that if they wish to skip a particular question, they should touch the “Next” button without choosing a response. The following describes the general nature of the national data collection instruments that will be used during the spring fourth-grade data collection, as well as topics that may be sensitive for some respondents.


School Administrator Questionnaires. The items in the School Administrator Questionnaire are not of a sensitive nature and should not pose sensitivity concerns to respondents.


Teacher Questionnaires. The information collected in the subject-specific questionnaires could be regarded as sensitive, because the teacher is asked to provide information about children’s social skills (including ability to exercise self-control, interact with others, resolve conflict, and participate in group activities); problem behaviors (e.g., fighting, arguing, anger, depression, low self-esteem, impulsiveness); learning dispositions (e.g., curiosity, self-direction, inventiveness); liking or avoiding school; relationships with peers; and experiences with peer victimization, both as a victim and as the aggressor. A study of bullying, a construct closely related to peer victimization, by the National Institute for Child Health and Human Development (NICHD) found that 16 percent of middle school students reported being bullied (Nansel et al., 2001). Fewer studies have been done with younger children, but those that have been published suggest that bullying is experienced by many children and is related to negative outcomes. Glew et al.’s (2005) study of third through fifth graders found that 22 percent of children were classified as victims, bullies, or both. Victims, and children who were both bullies and victims, had lower achievement scores and were more likely to feel like they did not belong at school compared to bystanders (Glew et al. 2005). Kochenderfer and Ladd (1996) found a relation between victimization and school adjustment outcomes, with victimization related to children’s loneliness and desire to avoid school. Given these findings and the current White House anti-bullying initiative, having the ECLS-K:2011 collect information about peer victimization for this national sample of elementary school children would be useful.


Within the set of questions about the teacher’s views on school climate and the school environment, there are some questions that could be deemed sensitive by some teachers. Teachers may feel that rating statements regarding their satisfaction with their work (e.g., “I really enjoy my present teaching job”) are sensitive in nature. These items are included because prior research (e.g., Perrachione, Rosser, & Peterson, 2008; Luekens, Lyter, & Fox, 2004; Rhodes, Nevill, & Allen, 2004) indicates that teacher satisfaction may be associated with relevant constructs such as staff retention and stability. Prior to their participation, teachers will be informed and assured that their information will be protected from disclosure except as required by law and that their responses will not be shared with their employers or the parents of their students. Also, teachers will be given an envelope in which they can place and seal their completed questionnaire before returning it to the school coordinator.


Direct Cognitive Assessments. The direct cognitive assessments are essential in determining children’s performance levels as they progress through school. Because schools often use different standards in their own assessments of children, and because the study requires a uniform set of assessment instruments and procedures, school-developed assessments cannot be used in the ECLS-K:2011. The items to be included in the ECLS-K:2011 reading, math, and science assessments undergo a sensitivity review and are not themselves sensitive in nature. Similarly, the executive function assessment is not sensitive in nature. However, direct assessments of children do raise certain concerns about the assessment procedures to be used. Of primary concern is the length of the assessments. The cognitive assessments, while untimed, are designed to be administered on average within a 60-minute time period. The child questionnaire is designed to be administered in 10 minutes, and measurement of height and weight adds another 5 minutes to the total child assessment time. NCES has developed instruments appropriate to the ages of the participating children, and every effort will be made to staff the study with field assessors who have prior experience in working with children. Issues specific to working with children also figure prominently in assessor training so that the field staff can respond appropriately to children who may become upset or frustrated by the assessment.


Child Questionnaire (CQ). Some of the questions contained in the child questionnaire may be deemed sensitive, particularly those related to fear of negative evaluation, social distress, and peer victimization. These types of items were added to the child questionnaire at the recommendation of the October 2012 CRP and November 2013 TRP. Based on CRP and TRP recommendations, items measuring fear of negative evaluation that came from a longer social anxiety scale were included in the third-grade data collection; in one study, children reporting a high level of fear of negative evaluation using items from this scale self-reported lower perceived social acceptance and lower global self-worth (La Greca and Stone 1993). The TRP also recommended adding the social distress items from a loneliness scale (Asher, Hymel, and Renshaw 1984) because the scale taps feelings of peer rejection and connectedness to the school social environment.


The child-reported items measuring peer victimization were used in third grade. These items mirror the items that were fielded in the second- and third-grade parent and teacher instruments (and are included in the fourth-grade teacher instruments), thus allowing researchers to analyze the relationship between children’s own reports of peer victimization and the perceptions of peer victimization as reported by adults. Members of the CRP recommended this approach and suggested the items, which are adapted from a scale developed for this age group (Espelage and Holt 2001). As with other respondents, children will be told that they can skip any question(s) they do not wish to answer as part of the instructions for completing the questionnaire.


Parent Interviews. Several topics that will be addressed in the spring fourth-grade parent interview could be sensitive in nature for some respondents. Questions about family income, child-rearing and disciplinary practices, children’s disabilities, children’s receipt of tutoring, children’s country of origin, and contact with a child’s nonresidential parent will be included in the parent interview. All of these questions have been asked in earlier versions of the ECLS-K:2011 and will provide another time point in the study for information on these topics.


New topics in the spring fourth-grade interview that could be sensitive for some respondents are questions about the child’s friends, whether the child avoids school, and parent-child conflict. These questions were included in the cognitive interviews with parents described above in section A.1.5.1. Parents were asked if any of the questions made them uncomfortable or included a sensitive topic or issue for the parent to answer about his or her child. No parents indicated that they felt the questions were too sensitive to answer. Questions on most of these topics were included in the ECLS-K, and very few parents objected to them. Results from the ECLS-K showed that there were very low levels of missing data in the parent interviews for all items, including the ones mentioned here that are planned to be included in the ECLS-K:2011. For example, in the spring kindergarten round of the ECLS-K, response rates for sensitive items such as family income and marital satisfaction were in the mid- to high 90s (94.4 percent and 99.7 percent, respectively).


Prior research indicates that the topics in the parent interview are correlated with children’s achievement and help to predict children’s preparedness for and success in school. Collecting data on these topics will allow researchers to go beyond descriptive analyses of variation in children’s performance by basic background characteristics such as race/ethnicity and sex. Researchers will be able to test hypotheses about how a wide range of family characteristics relate to early success in school. Therefore, it is important to include questions on the sensitive topics listed above in the parent interviews. Like other study participants, parents will be told that they can refuse to answer any question they wish.


Additionally, because it is imperative that respondents can be found at a later date for follow-up collections in a longitudinal study, the ECLS-K:2011 interview protocol requests locating information from parents. The locating information includes name, address, telephone number, email address, and contact information for an individual who would always know the whereabouts of the respondent. Such information may appear sensitive to respondents who may be leery about providing contact information for people they know; again, they will have the option to refuse to answer these questions.


Table A-12. Estimated respondent burden for the national spring fourth-grade data collection, previously cleared fourth-grade tracking and recruitment activities, previously cleared tracking for the spring fifth-grade data collection, and recruitment for the spring fifth-grade data collection

| Respondent type | Sample n | Response rate/selection rate | Number of respondents | Hours per instrument | Instruments per respondent | Number of responses | Total hours¹ |
|---|---|---|---|---|---|---|---|
| Spring Fourth-Grade National Data Collection | | | | | | | |
| Direct Assessment | 12,456 | .90 | 11,210 | 1.33 | 1 | 11,210 | 14,909 |
| Child Questionnaire | 12,456 | .90 | 11,210 | 0.17 | 1 | 11,210 | 1,906 |
| Parent Interview | 12,456 | .90 | 11,210 | 0.58 | 1 | 11,210 | 6,502 |
| NAEP overlap parent interview section² | 1,600 | .90 | 1,440 | 0.03 | 1 | 1,440 | 43 |
| School Administrator Questionnaires (SAQ) | 3,314 | .90 | 2,983 | 1.00 | 1 | 2,983 | 2,983 |
| Teacher-level Questionnaire (TQ) | 8,355 | .90 | 7,520 | 0.21 | 1 | 7,520 | 1,579 |
| Teacher Child-level Reading Questionnaire (TQR), key child | 6,228 | .90 | 5,605 | 0.50³ | 1 | 5,605 | 2,803 |
| Teacher Child-level Reading Questionnaire (TQR), additional child | 6,228 | .90 | 5,605 | 0.20 | 1 | 5,605 | 1,121 |
| Teacher Child-level Mathematics Questionnaire (TQM), key child | 3,114 | .90 | 2,803 | 0.22³ | 1 | 2,803 | 617 |
| Teacher Child-level Mathematics Questionnaire (TQM), additional child | 3,114 | .90 | 2,803 | 0.03 | 1 | 2,803 | 84 |
| Teacher Child-level Science Questionnaire (TQS), key child | 3,114 | .90 | 2,803 | 0.20³ | 1 | 2,803 | 561 |
| Teacher Child-level Science Questionnaire (TQS), additional child | 3,114 | .90 | 2,803 | 0.02 | 1 | 2,803 | 56 |
| Special Education Teacher-level Questionnaire (SPA) | 1,326 | .90 | 1,193 | 0.25 | 1 | 1,193 | 298 |
| Special Education Teacher Child-level Questionnaire (SPB) | 1,454 | .90 | 1,309 | 0.25 | 1 | 1,309 | 327 |
| School Coordinator Assistance⁴ | 3,314 | .90 | 2,983 | 0.20 | NA | 2,983 | 597 |
| Tracking for Spring Fourth-Grade | | | | | | | |
| Parent | 12,456 | 1.0 | 12,456 | .084 | 1 | 12,456 | 1,046 |
| School Coordinator | 3,314 | 1.0 | 3,314 | 1.00 | 1 | 3,314 | 3,314 |
| Recruitment for Spring Fourth-Grade | | | | | | | |
| Parent | 12,456 | 1.0 | 12,456 | 0.25 | 1 | 12,456 | 3,114 |
| Teacher | 7,954 | 1.0 | 7,954 | 0.50 | 1 | 7,954 | 3,977 |
| School Administrator | 3,314 | 1.0 | 3,314 | 1.00 | 1 | 3,314 | 3,314 |
| Tracking for Spring Fifth-Grade | | | | | | | |
| Parent | 11,406 | 1.0 | 11,406 | .084 | 1 | 11,406 | 958 |
| School Coordinator | 3,787 | 1.0 | 3,787 | 1.00 | 1 | 3,787 | 3,787 |
| Recruitment for Spring Fifth-Grade | | | | | | | |
| Parent | 11,406 | 1.0 | 11,406 | 0.25 | 1 | 11,406 | 2,852 |
| Teacher | 8,355 | 1.0 | 8,355 | 0.50 | 1 | 8,355 | 4,178 |
| School Administrator | 3,490 | 1.0 | 3,490 | 1.00 | 1 | 3,490 | 3,490 |
| Study Total | - | - | 55,073⁵ | - | - | 140,208⁶ | 49,507⁷ |

NA Not applicable

1 Calculations are based on rounded numbers.

2 In the fourth-grade data collection, a subsample of parents will be asked an additional set of questions about their current employment and education level. These questions are being asked in order to link information collected from ECLS-K:2011 parent respondents and ECLS-K:2011 child respondents who also participate in the 2015 National Assessment of Educational Progress (NAEP) study. Since only a subsample of parents will receive these additional questions, the additional burden on these parents has been broken out separately.

3 The teacher burden for the subject-specific questionnaires is reported at the child level to accurately reflect the teacher burden. Based on the results of the spring 2014 teacher timings tests, the assumption of the burden for reading teachers is that it will take them an average of 30 minutes to complete the questionnaire for the key child and 12 minutes to complete the questionnaire for an additional child (for a total of 42 minutes to complete two reading questionnaires). The assumption for mathematics teachers is that it will take them an average of 13 minutes for the key child’s questionnaire and 2 minutes for an additional child’s questionnaire (for a total of 15 minutes for two mathematics questionnaires). The assumption for science teachers is that it will take them an average of 12 minutes for the key child’s questionnaire and 2 minutes for an additional child’s questionnaire (for a total of 13 minutes for two science questionnaires).

4 School coordinators are school staff members who help organize the logistics for the assessment visit. They do not complete a study instrument.

5 Total number of respondents represents the total number of respondents with no duplication on the number of listed instruments each respective respondent is asked to complete. Shaded numbers do not contribute to the calculation of the total. For the spring fourth-grade activities (recruitment, tracking, and data collection), the largest n across these activities is used for the school coordinator, school administrator, and regular classroom teacher. It is expected that the parent respondent will be the same at all rounds, so the largest n for parents (tracking for spring fourth-grade) is used in the calculation of the total. Teachers will complete the TQ and subject-specific TQCs; reading teachers of all sampled children will complete questionnaires, half the sampled children’s mathematics teachers will complete questionnaires, and half the children’s science teachers will complete questionnaires. One special education teacher completes both SPA and SPB. The sample of students taking the direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting.

6 Total number of responses represents the total number of respondents multiplied by the total number of instruments they complete. The sample of students taking the direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting.

7 The sample of students taking the direct assessment is not included in this count because it is not subject to the Paperwork Reduction Act reporting.

NOTE: Rows for tracking and recruitment for the spring fourth-grade data collection (shown in gray text in the original table) pertain to activities and burden that were approved in a previously cleared package. They are included here because these activities will continue after this current submission is cleared. Shaded numbers do not contribute to calculation of the totals.


A.12 Estimated Response Burden

Table A-12 outlines the estimated respondent burden for data collection activities for which this submission is requesting approval (the national spring fourth-grade data collection and recruitment for the spring fifth-grade data collection). Included in these estimates, where appropriate, is the time that a respondent would need to gather and compile the data and the clerical time needed to fill out the form. The spring fourth-grade national data collection includes direct cognitive assessments and self-administered questionnaires with children; measurements of children’s height and weight; parent interviews; self-administered teacher-level questionnaires; reading, math, and science teacher child-level self-administered questionnaires; special education teacher teacher-level self-administered questionnaires; special education teacher child-level self-administered questionnaires; and school administrator self-administered questionnaires.


Table A-12 also outlines respondent burden for recruitment for the spring fifth-grade data collection. The table includes 5 minutes per parent respondent for reading the birthday cards the study sends to children to keep in touch with them and their families. Recruitment burden time includes the time needed to read study materials sent to parents, teachers, and school administrators; the time teachers spend discussing the study with a data collection staff member; and the time the school administrator spends discussing the study with a school recruiter attempting to secure the school’s participation.


The total number of respondents across all of the data collection activities listed in table A-12 (i.e., school administrators, teachers, school coordinators, and parents) is estimated to be 55,073.17 Because the parent study participants are expected to be the same across rounds, it would not be accurate to calculate a total sample or total number of respondents as a simple sum of the sample sizes and respondents for each round. Instead, to calculate a total, table A-12 uses the maximum estimated sample size or number of respondents across all rounds. Specifically, the largest number of parents involved in the activities delineated in table A-12 is expected during recruitment for the spring fourth-grade national data collection; this number is used for parents in the calculation of the total sample size and total number of respondents. Similarly, for the spring fourth-grade activities (recruitment, tracking, and data collection), the largest sample size and number of respondents across these activities is used for the school coordinator, school administrator, and regular classroom teacher. The estimated respondent burden across all these activities translates into a cost of $1,315,401 for 49,507 hours.18 The time children will spend completing the Child Questionnaire has been included in the estimated burden; the time children will spend completing the cognitive assessments has not.


A.13 Estimates of Cost to Respondents

There are no costs to the respondents to participate beyond the time needed for school coordinators to act as a liaison with the school, for parents to answer the interview questions, for teachers and school administrators to complete the questionnaires, and for the children to participate in the assessments. No equipment, printing, or postage charges will be incurred by the participants.


A.14 Cost to the Federal Government

Tracking and recruitment activities for the fourth-grade data collection are being carried out under NCES contract ED-IES-12-C-0037 with Westat. The period of performance for this ECLS-K:2011 contract, which includes the sample tracking procedures through the spring fifth-grade data collection as well as the spring third-grade and spring fourth-grade national data collections, runs from June 2012 through June 2017. The total cost to the Government for contractor and subcontractor costs for this contract is $28,342,921. This cost estimate includes sample tracking activities, a pilot test of the third- through fifth-grade direct child assessments, all data collection activities from spring third grade through spring fourth grade, design enhancements, and data file delivery and documentation. Table A-13 provides the study costs by year of the contract for the third- and fourth-grade data collections. Because this clearance request covers the fourth-grade national data collection and recruitment for the fifth-grade data collection, the estimated cost for the activities covered by this clearance is approximately half of the contract total, at $14.1 million.


Table A-13. Study costs per year of the contract for the third- and fourth-grade data collections

Year        Amount
2012        $152,645
2013        $1,703,307
2014        $12,966,104
2015        $12,425,111
2016        $984,531
2017        $115,243
Total       $28,346,921


A.15 Reasons for Changes in Response Burden and Costs

The burden requested for this collection is lower than the burden last approved under OMB# 1850-0750 for two reasons: the hearing evaluation component that was included in the spring third-grade data collection is not being included in fourth grade, and tracking and recruitment are being conducted for one fewer round of data collection.


A.16 Publication Plans and Time Schedule

Publications relevant to the data collection will be part of the reports resulting from the spring fourth-grade data collection. A data file with data from the fourth-grade collections will be produced and made available to researchers in a public-use format. Also produced from the fourth-grade collections will be a restricted-use data file. Researchers who are approved by NCES’s data confidentiality office for a restricted-use license can access restricted-use data files, which include more sensitive items and items that pertain to smaller numbers of children (e.g., information about the presence of specific disabilities). To be approved for a restricted-use license, researchers must demonstrate that they have a research question that cannot be answered with the public-use data and that they have the infrastructure to keep the data secure to prevent loss or unauthorized use. Codebooks and user’s manuals will be produced for use with the public- and restricted-use data files. All data will be merged at the child level. Data files will include all instrument variables (except for those that gather directly identifying information, such as the names of household members) and relevant associated variables, such as derived variables and assessment scores. Data will be released through Electronic Codebook (ECB) software that allows users to create customized data files in standard statistical software packages (SPSS, SAS, and Stata) and to view codebook information. A file record layout will also be provided so that analysis packages other than SPSS/PC, SAS/PC, and Stata/PC (e.g., analysis packages for Apple computers) can be used to analyze the ECLS-K:2011 data.


The ECLS-K:2011 reports and publications will include detailed methodological reports describing all aspects of the data collection effort and psychometric reports outlining properties of the study instruments, as well as reports that describe the population of children who were kindergartners in the 2010-11 school year as they progress through school.


The operational schedule for the ECLS-K:2011 spring fourth-grade data collection is shown in table A-14. Table A-14 also shows the operational schedule for the tracking and recruitment activities in the spring of fourth grade, as well as tracking for fifth grade, which were approved in previous clearance requests (1850-0750 v.12-15).


Table A-14. Operational schedule for ECLS-K:2011 data collection activities

Activity                                                                Start date    End date
Sample Tracking and Recruitment for Spring Fourth-Grade Data Collection
  Mail birthday cards                                                   6/1/2014      6/1/2015
  Pre-assessment call                                                   8/11/2014     12/19/2014
  Tracking movers and updating field management system                  8/11/2014     12/19/2014
  Parent, teacher, school administrator, school coordinator mailings    2/15/2015     4/16/2015
ECLS-K:2011 Spring Fourth-Grade Data Collection
  Identify and subsample movers1                                        8/30/2013     12/16/2013
  Print/program assessment                                              7/25/2014     12/2/2014
  Print/program questionnaires                                          10/23/2014    1/7/2015
  Train data collectors                                                 3/10/2015     3/16/2015
  National data collection                                              3/17/2015     7/17/2015
  Process data                                                          3/17/2015     8/14/2015
  Construct data files, develop user’s manual                           8/14/2015     7/11/2016
  Methodology/psychometric reports2                                     10/19/2015    12/13/2017
Sample Tracking and Recruitment for Spring Fifth-Grade Data Collection
  Mail birthday cards                                                   6/1/2015      6/1/2016
  Pre-assessment call                                                   8/10/2015     12/18/2015
  Tracking movers and updating field management system                  8/10/2015     12/18/2015
  Parent, teacher, school administrator, school coordinator mailings    2/16/2016     4/15/2016

1 Activities for identifying and subsampling movers were approved in a previous OMB package.

2 The methodology and psychometric reports will include descriptions of both the third-grade and fourth-grade rounds of data collection.

NOTE: Information in the table that appears in green text (i.e., sample tracking and recruitment for fourth-grade data collection) pertains to activities and burden that were approved by OMB in a previous package. It is included here because burden for these activities is being carried over since the activities have not yet been completed.


A.17 Approval for Not Displaying the Expiration Date for OMB Approval

No exemption from the requirement to display the expiration date for OMB approval of the information collection is being requested for the ECLS-K:2011.


A.18 Exceptions to the Certification Statement

No exceptions to the certification statement apply to the ECLS-K:2011.

1Throughout this package, reference is made to the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99. For ease of presentation, it will be referred to as the ECLS-K. The new study for which this submission requests approval is referred to as the ECLS-K:2011.

2At each follow-up stage, a small percentage of children had been retained in a grade at some point prior to the wave of interest and therefore were in a grade lower than the target grade of that follow-up stage. In addition, a small number of children were found to be advanced to a higher grade. These off-grade students were not excluded from the study.

3Since the study began, some children have been retained in a grade and some children have been advanced to a grade higher than the modal grade of the study’s children. While the study refers to the data collection rounds by the modal grade for most children in the cohort at the time of data collection, children are still included in the study even if they are in grades other than the modal grade due to retention or advancement.

4In kindergarten, the science assessment had just one stage.

5The questions about school characteristics may be completed by a designee, but the study requests that the administrator complete the section about his/her own characteristics and background.

6 The subject-specific questionnaire that was mailed to the respondent depended on the subject(s) taught and type of cognitive interview the respondent was assigned to during recruitment.

7 To ensure that enough respondents were recruited, and to reflect the fact that not all children in the study sample are currently in fourth grade, third- and fifth-grade teachers were included in the timing tests.

8www.fns.usda.gov/cnd/lunch/aboutlunch/nslpfactsheet.pdf as of 5/21/2013.

9 In a few cases, a field staff member may need to visit the school on an additional day or two, for example if a child is absent on the date(s) of the assessment and a make-up day is scheduled.

10Remuneration will not be provided to schools into which study children have transferred because most of those schools have only one study child. Because only a few children will be assessed in most of these transfer schools, the burden on the school is minimal. For example, fewer field staff will visit the school, a smaller assessment space can be used, and likely only one classroom and fewer teachers are asked for assistance. School administrators, teachers, and (if applicable) special education teachers will still be remunerated for the completion of the hard-copy questionnaires.

11Because final reconciliation of the spring third-grade data collection has not yet been completed, the response rate for the spring second-grade round is provided here.

12The school coordinator will often be the same school staff member from a previous round of data collection. If that person is not available, then a new staff member will be identified by the school administrator to act as a liaison to the study.

13Remuneration will not be provided to school coordinators in schools into which study children have transferred since kindergarten if those schools are not attended by at least four ECLS-K:2011 study children.

14Spring fourth-grade recruitment materials were approved in a previous clearance package – OMB# 1850-0750 v.15.

15Topcoding and bottomcoding refer to the process of recoding outlier values to some acceptable end value. For instance, everyone with a personal income higher than $200,000 may be recoded to $200,001 or more to eliminate the outliers.

16The study brochure was approved in a previous OMB clearance package (OMB No. 1850-0750 v. 8).

17Schools are asked to assign a staff member to help coordinate the assessment activities at the school; these school coordinators are counted in the total number of respondents and their burden hours are counted. However, school coordinators do not complete any study instruments as part of their role as coordinator.

18An hourly rate of $26.57 was used to translate teacher response time into a dollar amount. This rate is based on the National Compensation Survey. See U.S. Department of Labor (2007). National Compensation Survey: Occupational Wages in the United States, May 2011.

File type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
File title: Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011)
Author: Amy Dygan
