Volume I:
Request for Clearance for the Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011) Pilot Tests of the Third-, Fourth-, and Fifth-Grade Direct Child Assessment, Child Questionnaire, and Online School Administrator Questionnaire
OMB# 1850-0803 v.75
December 21, 2012
The Early Childhood Longitudinal Study, Kindergarten Class of 2010-11 (ECLS-K:2011) is a survey that focuses on children’s early school experiences beginning with kindergarten and continuing through the fifth grade. It includes the collection of data from parents, teachers, school administrators, and nonparental care providers, as well as direct child assessments. Like its sister study, the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 (ECLS-K),1 the ECLS-K:2011 is exceptionally broad in its scope and coverage of child development, early learning, and school progress, drawing together information from multiple sources to provide rich data about the population of children who were kindergartners in the 2010-11 school year. As with the ECLS-K, the ECLS-K:2011 is sponsored by the National Center for Education Statistics (NCES) within the Institute of Education Sciences (IES) of the U.S. Department of Education (ED). Fall and spring data collections in the study children’s kindergarten and first-grade years, as well as the fall second-grade year, were conducted for NCES by Westat. The Educational Testing Service (ETS) serves as the subcontractor developing the child assessments. Clearances for studying the ECLS-K:2011 cohort have been granted under OMB clearance number 1850-0750 for the fall 2009 field test data collection, the fall 2010 and spring 2011 kindergarten national data collections, the fall 2011 and spring 2012 first-grade national data collections, and the fall 2012 second-grade national data collection. Additional clearances have been received for the spring first-grade cognitive interviews (OMB No. 1850-0803 v. 43), cognitive interviews to test Response to Intervention (RtI) questions (OMB No. 1850-0803 v. 51), a field test of a computerized version of the Dimensional Change Card Sort (OMB No. 1850-0803 v. 60), and cognitive interviews to test items for the spring second-grade teacher questionnaire and parent interview and to finalize the parent interview through timing tests (OMB No. 1850-0803 v. 64).
The ECLS-K:2011 is the third in an important series of longitudinal studies of young children sponsored by the U.S. Department of Education that examine child development, school readiness, and early school experiences. It shares many of the same goals as its predecessors, the ECLS-K and the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B), but also provides updated information and includes new content to address changes in education policy since the earlier studies were conducted.
Like its predecessors, the ECLS-K:2011 will provide a rich and comprehensive source of information on children’s early learning and development, transitions into kindergarten and beyond, and progress through school for a new cohort of children.
The ECLS-K:2011 will provide data relevant to emerging policy-related domains not measured in previous studies.
Coming more than a decade after the inception of the ECLS-K, the ECLS-K:2011 will allow cross-cohort comparisons of two nationally representative kindergarten classes experiencing different policy, education, and demographic environments.
This is a request for clearance to conduct a pilot test to develop the direct cognitive assessments and child questionnaire for administration in the spring data collections of 2014, 2015, and 2016, when most study children are expected to be in the third, fourth, and fifth grades, respectively. The pilot test will serve as the primary vehicle for: (1) estimating the psychometric parameters of all items in the assessment battery item pool, (2) producing psychometrically sound and valid direct cognitive assessment instruments for use in the aforementioned data collections, and (3) assessing the feasibility of fielding an audio computer-assisted self-interview (audio-CASI) child questionnaire in the national data collection.2 In addition, clearance to test the functionality of an online School Administrator Questionnaire (SAQ) is being requested (the identical paper version of this questionnaire was approved in October 2012). If this test is successful, the online SAQ would be offered as an alternative to the ECLS-K:2011 paper SAQ beginning in the spring 2014 data collection.
Purpose of the Pilot Test of the Direct Cognitive Assessments. The pilot test of cognitive assessment items for the third, fourth, and fifth grades will focus primarily on examining the psychometric characteristics of new reading, math, and science items developed for the ECLS-K:2011 by ETS. The other items to be tested have been used in the previously-administered ECLS-K:2011, ECLS-K, Education Longitudinal Study of 2002 (ELS:2002), or National Education Longitudinal Study of 1988 (NELS:88) assessments. These items are included in the pilot test assessment to evaluate whether they function differently for a different cohort. Pilot testing the newly-developed and previously-fielded items together provides the opportunity to examine item characteristics with item difficulties calibrated on the same scale, even though the items are derived from different sources. Also, feedback on how children respond to the assessment items and suggestions for improving the assessment items will be obtained from assessors.
Purpose of the Pilot Test of the Child Questionnaire. As mentioned above, the purpose of the pilot test of the child questionnaire is to assess the feasibility of administering an audio computer-assisted self-interview (audio-CASI) child questionnaire (CQ) in the national data collection. The Self-Description Questionnaire (SDQ; Marsh 1992) will be the basis for a portion of the CQ; other questions are drawn from the National Institutes of Health’s Toolbox for the Assessment of Neurological and Behavioral Function and other published scales. (Appendix A includes the programmer specifications for the audio-CASI child questionnaire, which indicate the exact instructions to be provided to the child and the items that will be administered.) The CQ consists of 49 statements, to which children will respond using various 4- and 5-point rating scales. Items include those measuring children’s interest in reading, mathematics, and science; relationships with peers; feelings of loneliness; occurrences of peer victimization; and overall happiness with different aspects of life (e.g., attention from parents or hobbies and free time activities). Depending on the rating scale, children are asked to indicate how true each statement is for them or how often they feel certain emotions or experience certain behaviors.
The child questionnaire will be a new component in the ECLS-K:2011, but a similar instrument (the SDQ in its entirety) was successfully used in the ECLS-K third- and fifth-grade data collections as a hard-copy self-report instrument. The previously-used child questionnaire instrument was revised for the ECLS-K:2011 based on the recommendations of a Socioemotional Content Review Panel (CRP) that was convened in October 2012. Changes made to the instrument include:
Rather than asking children about their interest and competence in “all school subjects,” items have been edited to specifically refer to science.
Questions on externalizing and internalizing problem behaviors were dropped.
Items already tested and validated in other studies were added to the ECLS-K:2011 CQ instrument to measure children’s social awareness, feelings of loneliness, fear of negative evaluation, peer victimization, and life satisfaction (Asher, Hymel, and Renshaw 1984; La Greca and Stone 1993; Zimmer-Gembeck, Geiger, and Crick 2005; Crick and Grotpeter 1995; NIH Toolbox for the Assessment of Neurological and Behavioral Function 2012).
Data from the CQ will enable researchers to compare students’ self-ratings of competence in various school subjects to the students’ performance on assessment items in the reading, math, and science domains. A child’s self-concept is important for success in school; children who believe that they have the ability to succeed in a particular situation are more likely to be successful. Research has shown that a child’s perception of academic competence predicts reading and mathematics achievement (e.g., Kirsch et al. 2002). Students’ beliefs about their competence are linked to how engaged they are in the classroom. Students who are engaged in learning are often involved in their work, persevere despite obstacles, and show pride in their accomplishments. A child’s beliefs about the likelihood of success can also influence the child’s emotional state. For example, a student who believes that he or she will not be successful may be anxious or fearful, which may result in less engagement in the classroom.
Based on recommendations obtained during the CRP meeting held in October 2012, additional items measuring children’s social distress, prosocial behavior, and life satisfaction were added to the CQ. CRP members advised that self-reports of children’s social distress are more important to capture in the ECLS-K:2011 than self-reported measures of children’s behavior problems (which were used in the ECLS-K) for a variety of reasons, including differences in children’s ability to provide valid self-reports on these two topics. The CRP recommended assessment of social distress globally, with items measuring loneliness, fear of negative evaluation (social anxiety), and perception of victimization, three constructs that tend to correlate with one another. In response to this recommendation, a 3-item loneliness scale was added; Parker and Asher (1993) have reported that this 3-item scale correlates with a longer scale measuring social disaffection. Gest, Welsh, and Domitrovich (2005) have also found that the loneliness scale is associated with measures of aggression, peer rejection, and teacher-student relationship quality. Items measuring fear of negative evaluation were selected from a longer social anxiety scale; in one study, children reporting a high level of fear of negative evaluation on items from this scale self-reported lower perceived social acceptance and lower global self-worth (La Greca and Stone 1993). The peer victimization items that were added mirror the items that are currently being fielded in the second-grade parent and teacher instruments, thus allowing researchers to analyze the relationship between children’s own report of peer victimization and their experiences as reported by parents and teachers.
Items on children’s prosocial behavior and satisfaction with friends and family were added in response to CRP concerns that positive affect and orientation were not tapped by the other items proposed for the CQ; panel members noted a need to balance the more negatively-toned items with more positive ones in a child self-report.
In the ECLS-K, the child questionnaire was on paper. The assessor read the items and response categories to the child and the child marked his or her answer on the form. At the end of the assessment, the assessor keyed the child’s responses into a laptop.
For this pilot test, an audio-CASI version of the child questionnaire will be tested. With this format, the items and response options are presented to the child on a laptop with a touch-sensitive screen, and the child enters his or her own responses by touching the screen with a stylus. Generally, self-administered procedures evoke a greater sense of privacy, which leads to more self-disclosure (Sudman and Bradburn 1974; Tourangeau and Smith 1996; Turner et al. 1998). Because the respondent controls the pace of the question-answer process, he or she has more time to process the questions being asked and give more accurate answers, which is even more critical in surveying special populations such as children (De Leeuw and Collins 1997; Turner et al. 1998).
To accommodate the variation in children’s reading ability levels, item text and response options will be audio recorded and “read” to the child, who will be listening to the recording through headphones. Headphones will be used to make it easier for the child to hear the item text, to limit distractions from other children in the assessment area, and to enhance the feeling of privacy (De Leeuw, Hox, Kef, and Van Hattum 1997). Only the child will be able to hear the question being asked, and after an answer is provided it will disappear from the screen.
Thus, there are several advantages to using an audio-CASI version of the child questionnaire compared to using a hard-copy paper-and-pencil version. First, as described above, it would provide more privacy to children as they answer questions that may be sensitive for them. Second, administration would be more standardized, as all children would hear the items read to them in exactly the same way with the recording. Finally, since responses are entered directly by the child, the potential for data entry error that can result when the assessor enters responses initially recorded on the hard copy is reduced.
The pilot test will obtain feedback from assessors on the child questionnaire. Specifically, feedback collected from assessors will include the following:
How children respond to the CQ audio-CASI, including the use of the headphones, the stylus, and the overall instructions;
Various behaviors children may exhibit during the CQ task that indicate boredom, frustration, or lack of understanding about the task; and
Ways in which training and administration of the CQ for the national data collection can be improved, should the audio-CASI be implemented.
Prior to formal pilot testing of the child questionnaire, feasibility testing will be conducted with a convenience sample of up to 9 children (third- and fourth-graders) to examine the clarity and effectiveness of the assessor instructions, to observe children’s reactions to the audio-CASI application, and to identify any software or hardware issues with the audio-CASI application. The children to be included in this convenience sample will be recruited from Westat’s network of parents in the Washington, DC metropolitan area. Westat’s network of parents includes parents whose families have participated in past Westat pilot tests and is supplemented with parents recruited through advertisements in print and online sources.
Purpose of the Pilot Test of the Online School Administrator Questionnaire. The spring 2013 pilot tests will also include a component testing the use of an online School Administrator Questionnaire (SAQ). Currently, the ECLS-K:2011 SAQ is a paper questionnaire that includes questions about school and administrator characteristics. It is administered to all school administrators in the study’s sampled schools each spring. Some of the information administrators are asked to provide annually changes little, if at all, from year to year (for example, the type of school or the number of days in the school year). In the first- and second-grade data collections, this issue was addressed by having separate questionnaires for administrators who had previously completed an SAQ and those who had not, with the returning school administrator questionnaire excluding questions asking for information that was unlikely to change. One purpose of the pilot test is to evaluate another method for reducing burden on those administrators who have completed the SAQ in the past. This method involves fielding an online questionnaire and pre-filling some information about school characteristics that was collected from administrators in past rounds. Rather than entering new responses in blank fields, administrators would instead review the pre-filled information and update or correct it as necessary. The pilot test will also be used to test the general functionality of the online instrument and whether collecting this information electronically is preferred by administrators.
Thus, the focus of this pilot test of the online SAQ is three-fold. It will:
Explore usability issues with the technology and procedures for an online questionnaire,
Evaluate the effects of an online format on perceived respondent burden, and
Evaluate administrators’ general preferences for an online questionnaire compared to a paper version.
The data collected will be used to determine the feasibility of an online SAQ and, if appropriate, to inform the development of a third-grade online SAQ for use in spring 2014, as well as the online SAQs to be used in future rounds of data collection.
Study Authorization. NCES has contracted with Westat to conduct the spring third- and fourth-grade data collections for the ECLS-K:2011, including the child assessment, child questionnaire, and online school administrator questionnaire pilot test activities. The request to conduct the national spring third-, fourth-, and fifth-grade data collections will be submitted to OMB at a later date. The ECLS-K:2011 collections are authorized under 20 U.S. Code section 9543.
Data collection for the pilot test will begin in April 2013 and will be completed in June 2013. Training and data collection protocols (including all response rate maximization procedures) developed for the full-scale survey will be used during the conduct of the pilot test.
Pilot Test Samples. The pilot test sample will be composed of approximately 60 schools (primarily kindergarten through sixth-grade schools, although if needed to obtain the appropriate sample of sixth-graders, some middle schools may be included) in 5 geographic regions to achieve a total sample of 2,940 children. This total includes 120 third-graders and 120 fourth-graders for the child questionnaire, plus 450 third-graders, 900 fourth-graders, 900 fifth-graders, and 450 sixth-graders for the cognitive assessment.3 These sample sizes have been calculated to provide at least 700 responses for each item in the cognitive assessment. The goal is to recruit 9 third-graders, 18 fourth-graders, and 18 fifth-graders in each elementary school, and 18 sixth-graders in each K-6 elementary school. Completed cognitive assessments are needed for only 450 sixth-graders, but completing 18 sixth-grade assessments when possible will help achieve the target number of completed sixth-grade assessments even if some participating schools do not teach sixth grade. If the desired number of sixth-graders cannot be achieved in enough K-6 schools, additional sixth-graders attending middle schools will be recruited into the study to ensure the target number of sixth-grade participants.
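As a quick arithmetic check, the per-grade targets above sum to the stated total of 2,940 children. The sketch below is purely illustrative and simply restates the figures from the text; it is not part of the study design or its software:

```python
# Per-grade sample targets taken directly from the text above.
cognitive_targets = {"grade 3": 450, "grade 4": 900, "grade 5": 900, "grade 6": 450}
cq_targets = {"grade 3": 120, "grade 4": 120}  # child questionnaire component

cognitive_total = sum(cognitive_targets.values())  # cognitive assessment children
cq_total = sum(cq_targets.values())                # child questionnaire children
overall_total = cognitive_total + cq_total         # total pilot test sample

print(cognitive_total, cq_total, overall_total)  # 2700 240 2940
```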
In a subset of 10 schools, 240 additional third- and fourth-graders will be recruited to participate in the feasibility test of the audio-CASI child questionnaire: in each school in the subset, a random sample of 12 third-graders and 12 fourth-graders will participate. As with the overall pilot test sample, the subsample will be purposive, but attempts will be made to ensure that it includes schools from different locales and sectors and that the child sample is racially/ethnically diverse.
Participation in the child assessment or child questionnaire component will be limited to English-speaking children from schools that are not part of the ECLS-K:2011 national study and have not participated in any prior ECLS-K:2011 field test, pilot test, or cognitive laboratory activities. Children will not be eligible to participate if they require accommodations such as a sign language interpreter, Braille, a health care aide or assistive device, or other special arrangements or assistance. The pilot test sample will be a purposive sample, but attempts will be made to include schools from different locales and sectors (i.e., public, non-religious private, and parochial schools). Special efforts will be made to include schools with economic diversity as well as with high percentages of Black, Hispanic, and Asian/Pacific Islander children so as to maximize racial/ethnic diversity within the sample.
The pilot test of the online SAQ will be conducted with school administrators from schools that have agreed to participate in the child assessment and child questionnaire pilot tests. However, because it is anticipated that not all school administrators whose schools are participating in child pilot tests will agree to personally participate in the online SAQ component, the online SAQ pilot test sample will be supplemented as needed with additional school administrators from schools in districts that have agreed to participate in the child pilot tests. The targeted sample size for the online SAQ component is 60 school administrators.
Data Collection Methods for the Child Pilot Test Components. The general data collection methods used in the child pilot tests will be largely the same as those that have been used successfully for the data collection rounds of ECLS-K:2011 that have been conducted to date. The assessment visit at each school is expected to take approximately 4 days. The exact number of days for the assessment visit will depend on several factors, including the number of participating children at the school, any restrictions on the assessment schedule (e.g., if assessments can only be conducted in the morning), and the number of assessments that can be conducted simultaneously within the space made available by the school. The number of participating children at each school will vary depending on which grades are taught at the school, the number of qualifying children in each targeted grade (this includes all students in third through sixth grades who are English proficient and would not require accommodations and excludes students whose Individualized Education Program indicates they cannot participate in standardized assessments), and the number of children for whom parental consent is obtained. The study goal is to administer assessments to 9 third-graders, 18 fourth-graders, 18 fifth-graders, and 18 sixth-graders per school. For schools participating in the child questionnaire pilot test, the goal is to complete the child questionnaire with an additional 12 third-graders and 12 fourth-graders. However, data may be collected from additional students in some schools to meet the pilot test’s overall goals related to total sample size. The length of the assessment visit will be worked out with the school prior to the assessment team visiting the school.
The assessment team that visits each school will include a team leader and three assessors. The assessment team will arrive at the school on the appointed first day of assessments and, following any of the school’s required check-in procedures, immediately contact the school coordinator.4 The team leader will introduce the assessors to the school coordinator. The procedures to be used during the on-site data collection period will be discussed with the school coordinator to ensure there is a common understanding of those procedures.
The team leader and assessors will be taken by school personnel to the assessment area(s), where they will remove potential distractions as much as possible, arrange furniture (as allowed by the school) to establish a comfortable environment for conducting the assessment, and set up their assessment materials.
Once the assessment areas have been set up and assessors are ready to begin work, the school coordinator will introduce the ECLS-K:2011 team members to the teacher(s) whose students will be assessed. The teacher, in turn, will introduce the assessors to the class. Assessors will then escort the sampled children to the assessment areas, one-by-one, and conduct the child assessment. In the 10 schools that are part of the audio-CASI pilot test subset, children will receive either the child assessment or the child questionnaire; which instrument they receive will be determined by random assignment.
Format and Administration of the Child Assessment Forms. The assessment forms will be administered as one-on-one direct assessments with questions presented on an easel. Unlike the direct cognitive assessments in the full-scale data collections, which are computer-assisted, in the pilot test assessors will record children’s answers using paper (i.e., a score sheet) and pencil. In order to test many different items without overburdening the children, the pilot test assessment easels will be developed from items divided into four reading forms, two math forms, and two science forms. These forms will be spiraled such that each child will receive one of the four versions of the reading assessment and one of the two versions of either the math or the science assessment (e.g., reading 1 and math 1, reading 2 and math 2, reading 3 and science 1, reading 4 and science 2, etc.). The administration order of the subject areas will be counterbalanced (i.e., some easels present the reading items before the math or science items, whereas other easels present the reading items after the math or science items) to guard against practice and fatigue effects. Administration of the child assessment is expected to take approximately one hour.
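The spiraling and counterbalancing described above can be sketched as follows. This is a hypothetical illustration only: the exact rotation of form pairs and the alternation rule for subject order are assumptions made for the sketch, not the study’s actual assignment procedure.

```python
from itertools import cycle

# Hypothetical spiral: the four reading forms paired with the two math
# and two science forms, as in the example pairings given in the text.
combinations = [
    ("Reading 1", "Math 1"), ("Reading 2", "Math 2"),
    ("Reading 3", "Science 1"), ("Reading 4", "Science 2"),
]

def assign_forms(n_children):
    """Assign each child a form pair, cycling through the spiral and
    alternating which subject area is administered first (an assumed
    counterbalancing rule for illustration)."""
    assignments = []
    for i, (reading, other) in zip(range(n_children), cycle(combinations)):
        # Even-indexed children get reading first; odd-indexed children
        # get the math/science form first.
        order = (reading, other) if i % 2 == 0 else (other, reading)
        assignments.append(order)
    return assignments

# Example: form assignments for the first four children in a school
print(assign_forms(4))
```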
Administration of the Child Questionnaire. In the pilot test, the audio-CASI child questionnaire will present the item text and response options on a laptop equipped with headphones and a monitor that has stylus input capabilities. Children will listen to the task instructions, item text, and response options using headphones while following along on the screen. Children will use a stylus to touch their response option on the screen, skip questions they do not wish to answer, and change answers if they wish to do so. Administration of the child questionnaire is expected to take approximately 15 minutes.
After completing the assessment or the child questionnaire, the child will be returned to the classroom by the assessor and the next child will be assessed.
Collection of Assessor Feedback. Staff will be trained to record their observations about children’s behaviors and responses to both the assessment and the child questionnaire. Each assessor will keep a general diary of pilot test experiences, including notes on participants’ reactions to the assessment items. In addition, assessors will complete a separate diary specific to the CQ, in which they will record observations on the audio-CASI and any difficulties the children had interacting with the computer program. These observations will be used to prepare the pilot test report and revise training and data collection procedures for the national data collection as needed.
Data Collection Methods for the Online SAQ Pilot Test. Due to the longitudinal nature of the national study, many ECLS-K:2011 school administrators participate in multiple rounds of data collection. However, when children change schools, their new schools are recruited into the study and, as a result, new administrators are added to the sample each round. To mirror the two types of administrators (“returning” and “new”) in the national study, pilot test administrator respondents will be randomly placed into one of two treatment groups of 30 school administrators each:
Returning: Administrators in this group will first be asked to complete the paper version of the SAQ that is used in the national data collection. When that SAQ has been completed and returned to Westat, the administrators will then be asked to complete the same questionnaire online. Responses to a selected subset of questions from the paper version will be uploaded into the online version of the SAQ, so that when the administrator views the online SAQ, these responses will be pre-filled and available for updating. Each administrator will be sent an invitation to complete the online questionnaire that contains a secure link to the online SAQ. At the end of the online SAQ, returning administrators will be asked some questions soliciting their feedback on the online system. Specifically, the questions ask about the administrator’s experiences with the paper and online forms, such as opinions on the online format of the survey, the level of effort needed to complete the questionnaire with pre-filled responses, and the ease of use of the online system. Administrators who provide responses to the debriefing questions that would benefit from follow-up (e.g., responses that are unclear or ambiguous, responses that indicate difficulty using the online system) will be called by an ECLS-K:2011 staff member. In those follow-up calls, respondents will be asked to clarify and explain their responses and experiences.
New: Administrators in this group will be asked to complete only the online SAQ, without any pre-filled responses, thereby replicating the experience of administrators who are new to the ECLS-K:2011 in any given study round. They will be given a version of the feedback questions that does not include questions about the paper SAQ or issues related to pre-filled responses. Like the returning administrators, new administrators who provide responses to the debriefing questions that would benefit from follow-up will be called by an ECLS-K:2011 staff member and asked to clarify and explain their responses.
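The random placement into the two treatment groups described above can be sketched as follows. The function name and the use of a seeded shuffle are illustrative assumptions; the text specifies only that the 60 administrators are randomly placed into two groups of 30.

```python
import random

def assign_treatment_groups(administrator_ids, seed=None):
    """Randomly split a list of administrator IDs into two equal-size
    treatment groups, mirroring the "returning"/"new" design above."""
    rng = random.Random(seed)           # seeded for reproducibility
    shuffled = list(administrator_ids)  # copy; leave the input unchanged
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"returning": shuffled[:half], "new": shuffled[half:]}

# Example: 60 pilot test administrators split 30/30
groups = assign_treatment_groups(range(1, 61), seed=42)
print(len(groups["returning"]), len(groups["new"]))  # 30 30
```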
The debriefing questions to be administered to both the returning and new administrator groups can be found in Appendix B. The ECLS-K:2011 national study’s second-grade SAQ, which can be found in Appendix C (this instrument was approved in October 2012 for use as a paper version), will be used in this pilot test. Because the pilot is not intended to test the content or understanding of the items themselves, it is not necessary to use the exact instrument that will be used in third, fourth, or fifth grade.
As noted above, for administrators in the returning treatment group, responses to some questions from the paper SAQ will be pre-filled in the same questions in the online SAQ. The pre-populated data will be school-level or other selected information that does not contain Personal Identifying Information (PII). The items chosen for pre-filling are considered to be largely stable over time.
On March 11, 2011, an ECLS-K:2011 Content Review Panel (CRP) met and recommended the use of the SDQ in the ECLS-K:2011, largely due to its success in the ECLS-K and the fact that the SDQ is a well-known and widely-used measure.
In preparation for the third- and fourth-grade data collections, a CRP meeting with a different panel of experts was held on October 22, 2012, to review and comment on issues related to the measures of socioemotional development being considered for use in the ECLS-K:2011. The focus of the meeting was to discuss the child questionnaire to be tested as an audio-CASI instrument and the best ways to measure socioemotional development in the remaining years of the study. During the CRP meeting, experts made recommendations to add some new items measuring important socioemotional constructs to the child questionnaire. The child questionnaire items proposed for the third-, fourth-, and fifth-grade pilot test reflect the recommendations made by the most recent CRP. See table 1 for the list of the March 2011 CRP members and table 2 for the list of October 2012 CRP members.
Table 1. Members for the 2011 Socioemotional Content Review Panel

Name | Affiliation | Expertise
Pamela Cole | The Pennsylvania State University | Emotional development in early childhood; emotion regulation
Richard Fabes | Arizona State University | Children’s adaptation to school, emotional development, peer relationships
Ross Thompson | University of California, Davis | Early personality and socioemotional development in the context of close relationships
Carlos Valiente | Arizona State University | Children’s stress and coping; roles of parenting and temperament in children’s social and academic competence
Table 2. Members of the 2012 Socioemotional Content Review Panel

| Name | Affiliation | Expertise |
| --- | --- | --- |
| Karen Bierman | The Pennsylvania State University | Peers, aggression, intervention, interagency school readiness consortium, child interviews/child-report measures |
| Dorothy Espelage | University of Illinois, Urbana-Champaign | Bullying, school-based prevention |
| Richard Fabes | Arizona State University | Children's adaptation to school, emotional development, peer relationships; March 2011 CRP member |
| Allan Wigfield | University of Maryland, College Park | Development and socialization of motivation and self-concept; gender differences; achievement motivation; self-regulation and learning; motivation for literacy |
Recruiting. Recruitment of schools and school administrators will begin upon receiving OMB clearance. All respondent materials described below can be found in Appendix D. Districts (and dioceses) with selected schools will be sent a letter and summary sheet that describe the pilot test. Project staff in charge of school recruitment will conduct a follow-up call with the districts to answer any questions they may have about the pilot test, review the list of schools in their district selected to participate, and confirm that the study staff may contact the schools.
School administrators will then be sent a letter and flyer that describe the study and inform them that an ECLS-K:2011 representative (i.e., the school recruiter) will contact them to discuss their own and their school's participation in the pilot test, including answering any questions they may have. School recruiters will then follow up with a call to the school administrator to request his/her and the school's cooperation and to ask him/her to appoint a school coordinator for the child pilot test activities. School administrators may decline to participate personally in the online SAQ while agreeing to have their school's students participate in the child assessment and child questionnaire pilot tests. Because of this, additional school administrators may be recruited from schools in the pilot test districts that are not being recruited for the child components. A letter specific to these school administrators' recruitment can be found in Appendix D, and the recruiter scripts to be used during the school administrator recruitment calls can be found in Appendix E.
After gaining cooperation from the school administrator for the school to participate in the child components, the school recruiter will call the staff member appointed as the school coordinator. The recruiter will collect information about the school, schedule the assessment dates and locations, discuss procedures for recruiting children for the pilot test (i.e., how to contact parents to secure consent for their children’s participation), and determine the consent requirements of the school. The school coordinator will then receive a packet of materials including information packets for parents and forms for tracking students for whom the school has received parental consent.
The school recruiter will follow up with a call to review these materials with the school coordinator. The school coordinator will be asked to identify any children who should be excluded from the pilot test due to accommodation needs (e.g., sign language interpreter, Braille, health care aides or assistive devices, setting and scheduling issues) or limited English proficiency, and will inform the school recruiter how many children in each grade level are eligible to participate in the pilot test.
Parents of all eligible children will receive a letter and flyer describing the study activities and their children’s role in the study. Parents will also receive either an active or passive consent form, depending on the requirements of the school. Active consent requires the parent to sign and return a permission form in order for his or her child to participate in the pilot test. In contrast, passive consent involves notifying parents of the pilot test activities but only requires the return of the permission form if a parent does not want his or her child to participate in the pilot test.
The school coordinator will also be asked to provide classroom and basic demographic information—grade level, sex, race, and date of birth—for each child for whom parental consent is obtained. This information will be recorded by the coordinator on grade-level specific Parent Permission Tracking Forms, which the assessment team members will use when in the schools to identify children who have permission to participate in the pilot test. The demographic information for each participating child will also be recorded on a child administration record (CAR), the form that assigns a child identification number to each participating child just before the assessment is conducted. This two-step process for documenting children’s demographic data ensures that the assessment team has demographic data available for all children whose parents have provided consent for them to participate in the pilot test, but that the study staff only retains this information for children who actually participate in the pilot test. The Parent Permission Tracking Forms are left at the school with the school coordinator. The child’s name, although initially written on the CAR, is cut off and left with the school coordinator so no personally identifiable information about participating children remains on the CAR when it is removed from the school. The team leader then ships each completed CAR to the Westat home office via FedEx.
Given the main purposes of the child components of the pilot test (i.e., to examine the psychometric properties of the cognitive assessment items and to test the feasibility of implementing the audio-CASI child questionnaire instrument in the national data collection), a purposive sample, rather than a random sample, can be appropriately employed, because the resulting data need not be representative of a national population of children. Therefore, purposive sampling will be used for the pilot test, such that the sample will consist of those children for whom parental consent to participate has been received. Although there are targeted numbers of children needed for each grade at each school, efforts will be made to include as many children as possible who have signed consent forms or whose parents have not opted out by the time of data collection. The maximum number of participating children from each school will be determined by factors such as the number of permitted assessment days and any scheduling constraints during the assessment visit.
While it is expected that an average of 9 third-graders, 18 fourth-graders, 18 fifth-graders, and 18 sixth-graders will participate in the cognitive assessment at each school and that an additional 12 third-graders and 12 fourth-graders will participate in the child questionnaire at selected schools, it is likely that some schools will have a higher number of eligible children in each grade, while others may have fewer eligible children. Thus, by including in the pilot test all children in a school for whom consent is obtained, the total target number of complete assessments or questionnaires will be reached. However, if parental consent is obtained for more children than can be assessed in the number of days agreed upon by the school coordinator and the school recruiter, not all of the children with consent will participate in the pilot test.
For the 60 school administrators participating in the online SAQ component, a welcome email will be sent to the administrators within 5 days of recruitment to confirm that the email address on file for the administrator is correct, to give further details about the pilot test, and to alert “returning” administrators that a paper SAQ is being sent to them via FedEx. Additional correspondence with the school administrators will depend on whether the administrator has been randomly selected for the “returning” or “new” treatment group.
Returning administrators: Within a few weeks of the welcome email being sent, the paper SAQ will be mailed to the 30 school administrators in the "returning" group. The package will also include a letter with detailed instructions for completing the paper SAQ, a return FedEx mailing envelope and label, and the honorarium. Administrators will be asked to complete and return the paper SAQ in a timely manner. As questionnaires are returned, they will be receipted and entered into a database. These data will then be used to pre-fill selected questions in the online SAQ. Once the paper questionnaire has been receipted and the online SAQ questions have been pre-filled, an email will be sent to the administrator inviting him/her to complete the online SAQ. The body of the email will include a secure, respondent-specific link needed to begin the survey. Returning administrators will be asked to go online to complete the questionnaire, which will include both reviewing the pre-filled items and completing the items that have not been pre-filled. At the end of the online SAQ, debriefing questions designed to collect information on ease of use, burden, and suggestions for improvement will be asked (Appendix B includes the debriefing questions). Administrators who indicate difficulties with the online forms or who provide vague answers to the open-ended questions will be asked to clarify their responses in a brief follow-up call with an ECLS-K:2011 staff member.
New administrators: Within a few weeks of the welcome email being sent, a second email will be sent to the 30 administrators in the "new" treatment group inviting them to complete the online SAQ. The body of the email will include a secure, respondent-specific link needed to begin the survey. A letter containing the link will also be mailed to these school administrators, along with the honorarium. As with the returning administrators, debriefing questions will be administered at the end of the online SAQ, and an ECLS-K:2011 staff member will conduct follow-up calls as needed.
A reminder email will be sent to all administrators who have not responded to the online SAQ invitation after 1 week. Once the online questionnaire is completed, a thank you email acknowledging the completion of the questionnaire will be sent. The correspondence sent to school administrators can be found in Appendix D.
Honoraria. We propose to remunerate schools $7 for each child who participates in the study. This incentive is necessary to recruit schools, which have many competing priorities, and high levels of school participation are integral to the success of the pilot test. It is important to provide schools with an incentive because the study asks a great deal of their time and effort: allowing assessors to be in the school for approximately 4 days, identifying a school coordinator, providing space for the children to be assessed, removing children from their normal classes while they are assessed, and providing information about the school and the children, as noted above. Given the many demands and outside pressures that schools face, it is essential that they see that we understand the burden we are placing on them and that we value their participation. Children, their parents, teachers, school administrators, and school coordinators will not receive an individual incentive for the child components, although children will be able to keep the ECLS-K:2011 pencil that they used during the assessment.
We also propose to remunerate administrators for their participation in the online SAQ component. "Returning" administrators will receive an honorarium of $40, mailed with the paper SAQ, for completing two questionnaires and the debriefing questions, with telephone follow-up as necessary. "New" administrators will receive an honorarium of $25, to be mailed with the welcome letter sent soon after recruitment, for completing the online questionnaire and the debriefing questions, with telephone follow-up as necessary.
School administrators and parents will receive materials describing the study that clearly indicate that their participation is voluntary. Furthermore, parents will receive consent forms that they must return to the school either to allow their child to participate or to provide notification that they do not want their child to participate, depending on the method of consent the school requires. These materials, which can be found in Appendix D, also include the required language about disclosure and legal use of any data collected through this study that has appeared on other ECLS-K:2011 respondent materials.
No directly identifying personally identifiable information (PII) will be removed from the school, and demographic and school information will not be maintained after the assessment analyses are completed. Project staff are thoroughly trained to leave any materials with directly identifying PII at the school in a designated envelope. The school coordinator keeps the materials in the envelope at the end of each assessment day until the team returns the following day. At the end of the field period, the materials with PII are destroyed.
School administrators will return hard-copy questionnaires via FedEx, with ECLS-K:2011 staff tracking receipt of packages. School administrator online data collection will be hosted on NCES’s secure servers, with respondent access to the questionnaires password protected.
Table 3 shows the expected burden for the pilot test. Direct child assessments will be conducted with 450 third-graders, 900 fourth-graders, 900 fifth-graders, and 450 sixth-graders for a total of 2,700 children. The response burden for children participating in the direct child assessments is expected to be about 60 minutes, for a total of 2,700 burden hours. However, because the assessment is not subject to Paperwork Reduction Act (PRA) reporting, this sample of children is not included in the calculation of total sample size or number of respondents or responses, and the time children will spend completing the assessments has not been included in the calculation of total burden hours.
The audio-CASI child questionnaire will be conducted with 120 third-graders and 120 fourth-graders for a total of 240 children. The response burden for children participating in the child questionnaire is expected to be about 15 minutes, for a total of 60 burden hours. The burden hours included for the school coordinators and parents are for recruitment activities. For school administrators, burden for recruitment and for participation in the online SAQ component are shown in separate rows in table 3. The sample size for parents is estimated based on experiences with parent recruitment in the ECLS-K:2011 2009 field test. Recruitment experiences in past ECLS-K:2011 field test and national data collections also informed the burden estimates in the table for recruitment of administrators and school coordinators.
It is estimated that “returning” administrators will need approximately 60 minutes to complete the paper SAQ and 60 minutes to complete the online SAQ, with an additional 15 minutes estimated to complete the debriefing questions. “New” administrators are estimated to need approximately 60 minutes to complete the online SAQ, with an additional 15 minutes estimated to complete the debriefing questions. It is anticipated that 10 school administrators will provide responses necessitating telephone follow-ups. The follow-up calls are expected to take approximately 10 minutes per respondent.
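These per-administrator estimates correspond to the hours-per-instrument figures used in the burden table; the conversion from minutes to hours is shown below (a small illustrative calculation, not part of the submission materials):

```python
# Convert the per-administrator burden estimates (in minutes) to the
# hours-per-instrument figures used in the burden table.
returning = (60 + 60 + 15) / 60  # paper SAQ + online SAQ + debriefing questions
new = (60 + 15) / 60             # online SAQ + debriefing questions
follow_up = round(10 / 60, 2)    # telephone follow-up call

print(returning, new, follow_up)  # 2.25 1.25 0.17
```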
Table 3. Expected burden for the pilot test

| Respondent type | Number of respondents | Total number of responses | Hours per instrument | Total hours |
| --- | --- | --- | --- | --- |
| Child (direct child assessment) | 2,700 | 2,700 | 1.00 | 2,700 |
| Child (child questionnaire) | 240 | 240 | 0.25 | 60 |
| Parent (recruitment) | 11,739 | 11,739 | 0.25 | 2,935 |
| Administrator¹ (recruitment) | 120 | 120 | 1.00 | 120 |
| School Coordinator | 60 | 60 | 1.00 | 60 |
| School Administrator (paper and web survey) | 30 | 30 | 2.25 | 68 |
| School Administrator (only web survey) | 30 | 30 | 1.25 | 38 |
| School Administrator (telephone follow-up on debriefing questions) | 10 | 10 | 0.17 | 2 |
| Study Total | 12,159¹ | 12,229 | NA | 3,283 |

NA: Not applicable.
¹ To avoid over-counting, only the total number of participating administrators (n = 120) contributes to the total number of respondents.
NOTE: The sample of students taking the direct assessment is not included in the table totals because it is not subject to Paperwork Reduction Act (PRA) reporting.
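The totals above follow directly from the row entries. The sketch below (illustrative only; row values are taken from the table) reproduces the respondent, response, and burden-hour totals, excluding the direct child assessment per the table note:

```python
# Burden rows from table 3: respondent count and hours per instrument.
# The direct child assessment row is excluded because it is not subject
# to PRA reporting (see the table note).
rows = {
    "Child (child questionnaire)": (240, 0.25),
    "Parent (recruitment)": (11739, 0.25),
    "Administrator (recruitment)": (120, 1.00),
    "School Coordinator": (60, 1.00),
    "School Administrator (paper and web survey)": (30, 2.25),
    "School Administrator (only web survey)": (30, 1.25),
    "School Administrator (telephone follow-up)": (10, 0.17),
}

# Every row contributes to the response total.
total_responses = sum(n for n, _ in rows.values())

# The 30 + 30 + 10 administrator SAQ and follow-up responses come from the
# 120 administrators already counted under recruitment, so they are not
# counted again in the respondent total.
total_respondents = total_responses - (30 + 30 + 10)

# Burden hours, rounding each row to whole hours as the table does.
total_hours = sum(round(n * h) for n, h in rows.values())

print(total_respondents, total_responses, total_hours)  # 12159 12229 3283
```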
The estimated cost to the government to conduct the pilot test is $1,027,023.
Recruitment for the pilot test is expected to begin by the week of January 15, 2013, and to be completed by April 4, 2013. Assessments will begin in mid-April 2013, following the team leader and assessor trainings, and will be completed in mid- to late June 2013. The exact end date for data collection will depend on when the targeted numbers of child assessments, child questionnaires, and online SAQs are completed. The results of the pilot test will first be summarized in a memorandum in July 2013; a report summarizing the findings will be finalized in November 2013.
Asher, S. R., Hymel, S., and Renshaw, P.D. (1984). Loneliness in children. Child Development, 55(4), 1456-1464.
Crick, N.R., and Grotpeter, J.K. (1995). Relational aggression, gender, and social psychological adjustment. Child Development, 66, 710-722.
de Leeuw, E.D., and Collins, M. (1997). Data collection method and survey quality: An overview. In L. Lyberg et al. (Eds.), Survey Measurement and Process Quality. New York: Wiley.
de Leeuw, E.D., Hox, J.J., Kef, S., and van Hattum, M. (1997). Overcoming the problems of special interviews on sensitive topics: Computer assisted self-interviewing tailored for young children and adolescents. In 1997 Sawtooth Software Conference Proceedings (pp. 1-14). Sequim, WA: Sawtooth Software Inc.
Gest, S.D., Welsh, J.A., and Domitrovich, C.E. (2005). Behavioral predictors of changes in social relatedness and liking school in elementary school. Journal of School Psychology, 43, 281-301.
Kirsch, I., de Jong, J., Lafontaine, D., McQueen, J., Mendelovits, J., and Monseur, C. (2002). Reading for Change: Performance and Engagement Across Countries, Results from PISA 2000. Paris: Organisation for Economic Co-operation and Development.
La Greca, A.M., and Stone, W.L. (1993). Social anxiety scale for children-revised: Factor structure and concurrent validity. Journal of Clinical Child Psychology, 22(1), 17-27.
Marsh, H.W. (1992). Self Description Questionnaire (SDQ) I: A theoretical and empirical basis for the measurement of multiple dimensions of preadolescent self-concept. An interim test manual and research monograph. Macarthur, New South Wales, Australia: University of Western Sydney, Faculty of Education.
Parker, J.G., and Asher, S.R. (1993). Friendship and friendship quality in middle childhood: Links with peer group acceptance and feelings of loneliness and social dissatisfaction. Developmental Psychology, 29(4), 611-621.
NIH Toolbox for the Assessment of Neurological and Behavioral Function (NIH Toolbox) Emotion Battery. (2012). Domain-Specific Life Satisfaction Survey (Supplemental Measure). Retrieved from http://www.nihtoolbox.org/Pages/default.aspx
Sudman S., and Bradburn, N.M. (1974). Response effects in surveys: A review and synthesis. Chicago: Aldine.
Tourangeau, R., and Smith, T.W. (1996). Asking sensitive questions: The impact of data collection, question format, and question context. Public Opinion Quarterly, 60, 275-304.
Turner, C.F., Ku, L., Rogers, S.M., Lindberg, L.D., Pleck, J.H., and Sonenstein, F.L. (1998). Adolescent sexual behavior, drug use, and violence: Increased reporting with computer survey technology. Science, 280, 867-873.
Zimmer-Gembeck, M.J., Geiger, T.C., and Crick, N.R. (2005). Relational and physical aggression, prosocial behavior, and peer relations. Journal of Early Adolescence, 25(4), 421-452.
1 Throughout this package, reference is made to the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99. It is referred to as the ECLS-K. The new study for which this submission requests approval is referred to as the ECLS-K:2011.
2 If the results of the pilot test of the audio-CASI child questionnaire are unsatisfactory, a pencil and paper version of the child questionnaire will be fielded in the national data collection using the procedures that were successfully implemented in the ECLS-K third- and fifth-grade data collection rounds.
3 The pilot test is designed to develop child assessments for third-, fourth-, and fifth-graders. As mentioned above, a particular purpose of the pilot test is to gather information for developing test parameters for the ECLS-K:2011 child assessments. A valid assessment must contain at least some items at a more difficult level for the highest functioning children. Sixth-graders are included in the pilot test sample to represent high performing fifth-graders. Second-graders are not included in the sample to represent low performing third-graders as we have information from the earlier ECLS-K:2011 data collections on item performance for those ability levels.
4 Schools will be asked to assign a staff member to be the school coordinator to help coordinate the assessment activities at the school. The school administrator will appoint the school coordinator at the time of school recruitment (January-March). The school coordinator may be a school secretary, a teacher, or the school administrator him/herself.