International Early Learning Study (IELS) 2018

Main Study




OMB# 1850-0936 v.5



Supporting Statement Part B






National Center for Education Statistics (NCES)

U.S. Department of Education

Institute of Education Sciences

Washington, DC







May 2018






TABLE OF CONTENTS

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 Respondent Universe

B.2 Procedures for the Collection of Information

B.2.a Respondent Recruitment

B.2.b IELS Data Collection

B.3 Maximizing Response Rates

B.4 Purpose of Field Test and Data Uses

B.5 Individuals Consulted on Study Design


COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

Part B of this submission presents information on the International Early Learning Study (IELS) 2018 main study recruitment and data collection.

B.1 Respondent Universe

The IELS assesses children as they transition to primary school. For international comparability, the U.S. study will focus on five-year-old students in public and private schools that offer kindergarten. The sample of age-eligible students in a given school will be drawn from across all grades in the school, not just kindergarten. In particular, a large proportion of schools that offer kindergarten also offer pre-kindergarten, and it is likely that some pre-kindergarten students will meet the age definition. These students will be eligible for inclusion. Similarly, the occasional first-grade student who meets the age definition will also be included in the list for sampling. The expectation is that children of this age will be assessed near the beginning of the school year. This is an important transition point for U.S. children, and it is also the point at which it is feasible to reach at least 95 percent of U.S. children through a school-based sampling approach. The universe for the selection of schools for IELS is public and private schools with kindergarten in all 50 states and the District of Columbia. Within sampled schools, students will be selected for participation by drawing a random sample of children who are 5 years old during the assessment period.

School Sample. The main study sample was based on a formal probability sample designed to yield a representative sample of 3,000 children age 5 enrolled in schools that contain kindergarten. The U.S. sampled 200 schools.

Schools were selected for recruitment from the most recent available universe files: the 2015-16 Common Core of Data (CCD) for the public school list and the 2015-16 Private School Universe Survey (PSS) for the private school list. Schools under the jurisdiction of the Department of Defense (DoD) and the Bureau of Indian Education (BIE) were also included in the frame.

Stratification for the main study sample was both explicit and implicit. Public and private schools constituted distinct sampling strata. Further stratification included geographic region (Northeast, Midwest, South, West), locale (city, suburb, town, rural), and socioeconomic status, as measured by the percentage of students eligible for free or reduced-price lunch.

Student Sample. The OECD IELS main study requirements call for a minimum sample of 3,000 students, and we estimate that approximately 3,060 target-age children will participate. In each school in the main study, 19 students in the target population will be randomly sampled from a list of all target-age students provided by the school, with the expectation that about 16 will participate. Depending on each school’s policy, active (explicit) parental consent, passive (implicit) parental consent, or notification-only materials will be distributed for the sampled students within each school.
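To illustrate the selection steps described above, the sketch below draws a systematic school sample within explicit public/private strata, with implicit stratification achieved by sorting the frame by region, locale, and percent free or reduced-price lunch, and then draws a simple random sample of 19 age-eligible students within a school. This is a minimal illustration only: the field names are hypothetical, and the actual IELS school and student selection is carried out under the OECD international sampling specifications and software, which may, for example, use probability-proportional-to-size selection rather than the equal-probability selection shown here.

import random

def select_schools(frame, n_schools=200, seed=2018):
    """Illustrative systematic school selection with explicit public/private
    strata and implicit stratification via frame sorting (hypothetical field
    names; not the actual IELS selection algorithm)."""
    rng = random.Random(seed)
    selected = []
    for sector in ("public", "private"):
        stratum = [s for s in frame if s["sector"] == sector]
        # Implicit stratification: sort by region, locale, and percent of
        # students eligible for free or reduced-price lunch.
        stratum.sort(key=lambda s: (s["region"], s["locale"], s["pct_frl"]))
        # Allocate the 200-school sample proportionally across the strata.
        n_take = max(1, round(n_schools * len(stratum) / len(frame)))
        interval = len(stratum) / n_take
        start = rng.uniform(0, interval)
        selected.extend(stratum[int(start + k * interval)] for k in range(n_take))
    return selected

def sample_students(age_eligible_roster, n=19, seed=2018):
    """Simple random sample of 19 age-eligible students within a school."""
    rng = random.Random(seed)
    return rng.sample(age_eligible_roster, min(n, len(age_eligible_roster)))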

B.2 Procedures for the Collection of Information

This section presents information on the recruitment and data collection procedures for the IELS main study. Gaining schools’ and students’ cooperation in voluntary research is increasingly challenging, so employing effective strategies for gaining the cooperation of schools is of paramount importance. Recruitment of main study states, districts, and private schools began in fall 2017 and ran concurrently with the field test district and school recruitment. Main study school recruitment will continue through summer 2018. Data collection will be conducted from October through December 2018.

B.2.a Respondent Recruitment

Based on recruitment knowledge gained during the 2017 IELS field test, the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), and other NCES studies, states and districts will be notified about the study and encouraged to allow project recruiters to contact the sampled schools in their jurisdictions. School staff often want to be sure district cooperation is in place before agreeing to participate in the study.

Table 1 summarizes the timing of the various recruitment stages. Details on the proposed approach for state, district, and school recruitment are discussed below. See Appendices A-E for all of the respondent materials referenced below. The main study recruitment was last approved in September 2017 (OMB # 1850-0936 v.3-4); however, some of the respondent materials contained in Appendices A and B have been updated to reflect respondent feedback received during the field test, and parent materials have been updated to better align with the teacher materials.

Table 1. Summary of IELS Recruitment Contacts and Timings

Type of Respondent for Recruitment | Timing of Contact
States with sampled districts/schools | September – December 2017
Districts with sampled schools | September 2017 – March 2018
Sampled schools | January – June 2018
School coordinator contact | Spring 2018 – Fall 2018
Parents and teachers of sampled students | Fall 2018

State Recruitment. From September through December 2017, state education agencies (SEAs) in states that contain sampled main study schools were mailed a package that included the state letter and IELS brochure. We have been working with the NAEP State Coordinators (NSCs) in the state recruitment effort. Some NSCs have sent, or plan to send, the state letter themselves, as well as follow up to answer any questions. Where NSCs did not wish to be involved in the recruitment effort, the Westat home office has mailed the state package. Note that these packages were sent to SEAs for the main study only. Because the field test used a purposive sample, states with especially engaged NSCs who urged cooperation were selected, and thus such a package was not needed.

District Recruitment. From September 2017 through March 2018, advance packages were mailed to main study district superintendents. Each package contained an introductory letter, including a list of the sampled schools in the district’s jurisdiction, and the IELS brochure. The district mailings came from the NSCs or Westat, depending on the NSCs’ preference. Shortly after the mailing, the district superintendent was contacted by phone to ensure they received the package, answer any questions they may have had, review the list of schools sampled in the district, confirm key information about the schools (e.g., presence of kindergarten, enrollment size), and get permission to contact the schools in their districts. Any requirements regarding parent consent were also identified. Information collected during this call was used to confirm which schools in the district are eligible for participation in the study, and to obtain contact and other information helpful in school recruitment.

During the recruitment period, the study staff have been prepared to respond to requirements such as research applications or meetings to provide more information about the study. If a district chose not to participate, the recruiter documented all concerns listed by the district so that a strategy could be formulated for refusal conversion attempts.

Based on experience from previous NCES studies, some NSCs wanted to talk to district staff themselves, while others mailed the package but did no further recruitment, and still others did not want to be involved in district recruitment at all. In cases where the NSCs did not wish to follow up, Westat’s recruiters worked directly with the districts.

In the main study, 28 districts with sampled schools are designated as “special handling districts” (those requiring a research application). Contacting special handling districts began with updating district information based on what was gleaned from online sources, followed by calls to verify where to send the completed research application forms and, if necessary, to collect contact information for this process. During the call, recruiters also asked about the amount of time the districts spend reviewing similar research applications.

School Recruitment. Once the district agreed that their sampled schools could be contacted, the school mailings were triggered. Most schools were contacted in early 2018, although some contacts will carry through to spring 2018. These mailings, sent from the Westat office, contained an introductory letter offering the school a $200 check as a thank-you for participation, the IELS brochure, and the school administrator Frequently Asked Questions (FAQs). Note that a revised version of the FAQs for school administrators is included in Appendix A. Once OMB approves this submission, the new version will be used going forward. Shortly after the mailing, the NSCs or Westat recruiters phone the school administrator to discuss the study, gain cooperation, and assign a school coordinator (SC) to work with the IELS field staff to manage the data collection in their school. The school coordinator then acts as the liaison between study staff and their school.

In cases where recruitment proves more difficult, school recruiters are provided with email templates, which they edit and use as needed to urge participation. These email templates, included in Appendices A and B, are used with principals who are difficult to reach, school staff who are still considering participation, staff who express a concern and need follow-up, and principals who may require additional appreciation for agreeing to participate. As mentioned in Part A, a second-tier incentive has been added, which will provide refusing schools with an $800 incentive, rather than the standard $200. This approach will only be used when necessary. Letters that will be used to communicate this additional incentive to designated schools have been included in Appendix A.

School Coordinator Contact. In early fall 2018, after permission to participate in the study has been granted by the school administrator, Westat team leaders will call the SC to discuss the IELS activities at the school, including gathering student rosters, distributing consent materials to parents of sampled students, and setting an assessment date. The team leaders will also gather information about what type of parental consent procedures need to be followed at the school; hours of operation, including early dismissal days, school closures/vacations, and dates for standardized testing; and any other considerations that may impact scheduling student assessments (e.g., planned construction periods, school reconfiguration, or planned changes in leadership). They will review the MyIELS website and how it will be used throughout the field period. Each SC will also be provided with the SC Handbook (see Appendix E). If the SC would like more information about IELS, he or she will be directed to the website or, if preferred, receive a mailing of the school administrator FAQs and the study brochure. See the next section for a description of the SC’s responsibilities during the data collection period.

Westat will also email the school coordinator MyIELS website registration information (the text of the email appears in Appendix A). The MyIELS website will contain the IELS respondent materials, teacher and parent PowerPoint presentations that can be used at parent back-to-school nights as applicable, as well as e-filing instructions and templates, and other assessment logistical information. Each school will receive a unique ID that will support multiple school users. Each user must, however, provide their own contact information and set up their own unique account. This registration process has been used across all of NCES’ international studies, and also in NAEP.

If an SC is designated in the 2017-2018 school year, the initial SC contact will occur in spring 2018. Most of the SC contacts will occur at the beginning of the 2018-2019 school year. Because a lag occurs between the first SC contact and the main study data collection activities, an email (see Appendix A) will be sent to any identified main study SC during summer 2018 to remind them of the study and the upcoming tasks to be completed prior to the assessments. This email was not needed for the field test, as the first contact with the SCs occurred shortly before e-filing and the assessment activities.

Once the School Coordinator has successfully e-filed in the fall of 2018, an email will be sent to notify them that the Child and Teacher Tracking Forms are available. The email will also include a School Coordinator responsibilities fact sheet. This document was developed to give a brief but thorough overview of the duties the SC will need to complete prior to, during, and after the IELS administration. A special focus is on strategies SCs can use to remind parents and teachers to complete the online questionnaires.

In the main study, additional contacts with the School Coordinator are planned (see Appendices A and B) to further engage and assist the SCs in their responsibilities, including a reminder about their pre-assessment call with the IELS team leader and reminders to secure consent and participation from parents and teachers.

To incentivize SCs and encourage their continued cooperation, Westat will mail each SC $100 of their incentive after e-filing has been completed, and the remaining $100 with the thank-you letter at the end of the data collection. A thank you email will also be sent to School Coordinators, to let them know that their incentive is on its way.

Parent and Teacher Recruitment. After the student sample has been selected in fall 2018, parents and teachers will be recruited and asked to complete online surveys. Teacher names and email addresses will have been collected during the sampling phase, as the international sampling software requires schools to provide a roster of teachers as well as their students. Collecting both pieces of information at the same time allows for each student to be linked directly to his/her teacher. Edit checks will be run during sampling to ensure that all students have been linked to a teacher and that all teachers have students linked to them. If unlinked students or teachers are found, field staff will clarify and correct the issue with the SC.
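As a simple illustration of the linkage edit checks described above, the sketch below flags sampled students who are not linked to a listed teacher and listed teachers with no linked students. The record layout and field names are hypothetical; the actual checks are run within the international sampling software.

def check_student_teacher_links(students, teachers):
    """Return students without a valid teacher link and teachers with no
    linked students, for follow-up with the school coordinator.
    (Hypothetical record layout; actual checks run in the sampling software.)"""
    teacher_ids = {t["teacher_id"] for t in teachers}
    linked_ids = {s.get("teacher_id") for s in students}

    unlinked_students = [s for s in students
                         if s.get("teacher_id") not in teacher_ids]
    unlinked_teachers = [t for t in teachers
                         if t["teacher_id"] not in linked_ids]
    return unlinked_students, unlinked_teachers

Any records returned by such a check would be resolved with the SC before consent materials and teacher questionnaires are distributed.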

After sampling has been completed and the correct teacher names and email addresses have been collected, the Westat team leaders will talk with the SC about distributing study information to the teacher respondents. SCs will be mailed a package containing a set of materials bundled for each sampled teacher. Each teacher will receive an IELS brochure, a study introduction letter (also available in digital format), the teacher FAQs, a list of students for whom they will need to complete questionnaires, and a check to serve as an incentive for completing the questionnaires. Teachers will receive $20, plus an additional $7 for every Part B questionnaire they are asked to complete. The SC will be asked to distribute the materials to the appropriate teachers. The SC will serve as the study’s main point of contact for teachers, and will answer any questions from them about the study and assessment logistics. They will also inform the teachers of the assessment date. Shortly after sampling, Westat will email the teachers a direct, unique link to the teacher questionnaire.

In addition, because the sampled children themselves will not be directly recruited, their parents will be asked to give permission for their child’s participation. Information about schools’ procedures for obtaining consent for students to participate in the study will have been gathered during school recruitment. Schools generally require one of two types of consent: implicit or explicit. Both types of consent require that parents be notified that their children have been selected for the study. With implicit consent, the school does not require verbal or written consent for a student to participate in the study – parents are asked only to notify the appropriate person if they do not want their child to participate. With explicit consent, children may participate only if their parents provide written or oral consent for them to do so. In a few cases, schools may choose simply to notify the parents of the study and their child’s participation.

Proactive parent recruitment will be focused on maximizing the number of parents who: (1) provide consent and (2) complete the parent questionnaire. Much time was spent analyzing the field test parent response rates and talking informally with parents about barriers to study participation. Based on this feedback, a robust parent recruitment campaign was established to obtain and maintain parents’ engagement in the study throughout the field period. Parent contact materials (see Appendix B) include:

  • An e-mail or text messaging blast to notify parents about IELS and that informational materials will be forthcoming;

  • Revised parent materials that are more concise, address parents’ concerns, and come from the school coordinator rather than Westat or NCES;

  • Scheduled follow up notifications, in the form of email or text message, to include general reminders about providing consent and completing the questionnaire; and

  • A Halloween card, used as a light-hearted reminder of the study.

The new and revised materials reflect parents’ requests for electronic communications (email or text), for advance notification before hard-copy materials are sent home, and for materials from the school rather than an unknown entity. Additional contacts were added to keep the parents’ interest in the study strong.

After sampling, the SCs can text or email the parents of selected students about the study and alert them to the fact that hard-copy materials will soon come home in their children’s backpacks. Informational parent packets will be assembled at Westat and mailed to the SCs to distribute to parents of sampled students. Each packet will consist of a study introduction/welcome letter containing MyIELS registration details, the study brochure, parent FAQs, and a $20 loadable Visa reward card. The reward card will be loaded upon completion of the parent questionnaire. These materials will be distributed to parents in the way each school believes to be most appropriate and effective (e.g., sending the materials home with students; the school or district sending the materials directly to parents; and/or field staff contacting parents directly by mail, email, and/or phone). Alternate parent letters and consent forms are also available for the SC to use if he/she would prefer to use hard-copy consent forms, rather than direct parents to the MyIELS website to provide consent.

A one-page tip-sheet containing various strategies for SCs to use to follow up with parents about consent and questionnaires will also be provided. While talking with the SCs, the team leaders will discuss these strategies, as well as procedures for obtaining contact information for the sampled children’s parents. Ideally, the school will provide parents’ email addresses, telephone/cell phone numbers, and mailing addresses. The email address and telephone number will be used for consent follow-up and questionnaire nonresponse follow-up. The mailing address will be used to send a thank-you letter once a parent or legal guardian completes the parent questionnaire. SCs will have the option of uploading the parent contact information as an Excel file, entering the information into MyIELS, or telling the information to the IELS team leader over the phone, with the team leader then entering the data into MyIELS. This method was used successfully in ECLS-K:2011.

We learned during the IELS field test that some SCs may be reluctant to provide parental contact information. The team leaders will work with the schools to identify the best method for following up with parents. The approach to following up on consent will also be tailored to the school’s preferences. In most cases, we anticipate the SC will be the main contact for the parents and will follow-up on missing consent forms and questionnaires. Westat field staff will be available to assist in whatever way is needed.

Role of the MyIELS Website. We plan to use the MyIELS website to stay in touch with school administrators, SCs, and parents over the course of the school recruitment period. Parents and teachers are not required to register individually for the website, though parents will be encouraged to visit the website, update their contact information, provide consent, and follow a hyperlink to the questionnaire site. Alternatively, they may go directly to the questionnaire site and complete their parent questionnaire. The goal is to make it as frictionless as possible to get parents and teachers into the system and obtain their responses. While we typically do not direct districts to the website, provisions are in place for an account to be created for them if requested.

B.2.b IELS Data Collection

Main study data collection is planned for October-December 2018. Prior to the data collection, Westat field staff will complete various pre-assessment activities, as outlined in this section.

Pre-assessment Activities

As discussed, main study school recruitment began in January 2018 and will continue through summer 2018, during which time Westat school recruiters contact the SCs to introduce the study, briefly review their responsibilities, and set an assessment date. However, most of the pre-assessment activities – e-filing, distribution of parent consent materials, outreach to teachers, etc. – will not be completed until the fall of 2018. These activities must occur after the start of the 2018-2019 school year in order to have the appropriate-aged children participate in the study.

During the pre-assessment period, Westat field staff will discuss the SC’s role in the study and ask the SC to do the following:

  • gather a roster of children born within the appropriate date range, along with other needed information such as the children’s grade, gender, and teacher, and e-file it using the appropriate template;

  • after the sample is drawn, distribute parent study materials;

  • arrange for an assessment date and location;

  • notify teachers of the study and distribute study materials;

  • welcome the assessment team on assessment day; and

  • work with the teachers and assessment team to help the assessment days run smoothly.

Full details about the SC’s role and responsibilities will be posted to MyIELS and include the following:

  • Set assessment dates (the assessment is conducted over two days), making sure to avoid conflicts with any special events in the school’s calendar, such as a field trip or school holiday.

  • Arrange for an assessment location, with the goal to identify locations that provide as little distraction as possible, protect the privacy of the children, and are as non-disruptive to the school routine as possible.

  • Discuss the sampling procedures with IELS staff and receive assistance as needed in the e-filing process.

  • Work with IELS staff to distribute and collect parent consent forms.

  • Ensure that teachers have been notified of the study and are able to complete the questionnaire. Work with IELS staff to follow-up on missing or incomplete questionnaires.

As part of the pre-assessment activities, Westat will email the SC instructions for e-filing, including preparing an electronic file of students and teachers and submitting it using a standardized template. IELS help desk and field staff are resources for the SC to use while completing the sampling process. Once the sample has been drawn, the SC will be able to view the selected students on the MyIELS website. The field staff can then work with the SC to distribute parent consent and notify the appropriate teachers.

Throughout these pre-assessment activities, IELS staff will establish a positive and cooperative working relationship with school personnel and the school community. Westat’s field staff will communicate with the SC using email and phone. We expect that completion of the pre-assessment activities will necessitate multiple contacts.

Data Collection

Child Assessment. The direct child assessment will be a one-on-one untimed assessment composed of tasks that assess a variety of skills, with electronic data capture via tablet (about 60 minutes per child, conducted over a two-day period). Direct assessment items will assess the following domains: language/emergent literacy, mathematics/numeracy, executive function/self-regulation, and social-emotional skills (prosocial behavior, empathy, and trust).

Generally, the assessment team that visits the school will include a Westat team leader and three assessors. All of the field staff will be well-trained and experienced working with children of this age. The assessment team will arrive at the school on the appointed first day of assessments and, following the school’s required check-in procedures, immediately contact the SC. The team leader will introduce the assessors to the SC. The procedures to be used during the on-site data collection period will be discussed with the SC to ensure there is a common understanding of those procedures. The team leader will also confirm that all sampled children are still enrolled in the school as of the assessment day and determine which children are at school on the assessment day.

On the day of the IELS assessment, the team leader and assessors will be taken by school personnel to the assessment area(s), where they will remove potential distractions as much as possible and establish a comfortable environment for conducting the assessment. They will set up the assessment materials and log into the child assessment app on the tablets that they will bring with them. All field staff will be provided with backup batteries, cords, etc., to ensure that data collection activities are not disrupted by equipment problems.

Once the assessment areas have been set up and assessors are ready to begin work, the SC will introduce the IELS team members to the teacher(s) whose children will be assessed. The teacher(s), in turn, will introduce the assessors to the class. Assessors will then escort the sampled children to the assessment areas, one-by-one.

When the assessor and study child arrive at the assessment space, the assessor will introduce the child to the task and begin the practice items. After completing the cognitive assessments, the assessor will return the child to his/her classroom and the next sampled child will be assessed. These procedures will again be followed on day 2 of the data collection, with the same children completing the second part of their assessment.

It is expected that some children will be absent from school when the assessments are scheduled. Certain days throughout the field period will be designated as days on which select field staff can conduct make-up assessments. Attempts will be made to conduct a make-up assessment at some point during the field period for all children absent on their school’s assessment days.

Parent and Teacher Questionnaires. Contextual information will be collected from parents/guardians and teachers on the relationship between children’s learning and development and on important demographic, social, economic, and education variables.

The parent questionnaire will ask for information about the child and the home, parents’ perspectives on their children’s social and emotional skills, the sampled child’s early childhood education and care participation, and home learning environment. The teacher questionnaire will ask teachers to provide information about their own background and education, and on each study child’s knowledge and skills. Both questionnaires will be available electronically and in hard copy. The final national versions of the questionnaires are provided in Appendix D.

As described above, parents and teachers will be given links to their questionnaires as part of their welcome materials. In the field test, parents received a postcard in their packet that contained registration information for MyIELS. We heard from parents that they would like to see all of the information in one place, and so we have opted to include registration information in the parent welcome letter for the main study. All of this information will be packaged in a manila envelope with an IELS sticker, given that parents indicated that they preferred a less flashy-looking notification package. In addition to this notification package, the SC will be provided with parent- and teacher-specific PowerPoint presentations about the study to be used during parent and/or teacher information nights, as appropriate. A long version and a short version of each PowerPoint presentation are available, to accommodate the timing needs of the SCs. The long version of each presentation is provided in Appendix C. The shortened version is a subset of the longer version.

Parents will be directed to the website to register, provide consent electronically, enter their gift card number so it can be activated once the questionnaire is complete, and complete their questionnaire. We have also included print-ready materials, in the event that a school or parent prefers to complete a hard copy consent form. Parent questionnaires will also be available in hard copy upon request, and will include a self-addressed return envelope. Teachers will be emailed their link to the questionnaire and will not need to register on MyIELS, unless they would like more information about the study.

Throughout the field period, field staff and the SC will monitor the return of questionnaires and, working in conjunction, will follow up with non-responders as needed. A ‘window-is-closing’ non-response follow-up effort will be utilized, gently reminding respondents to complete their questionnaires. The two reminders before the last, sent near the end of the data collection window, are designed to establish a deadline effect, and will be followed with an extension email. This campaign-style approach is designed to provide soft reminders across the data collection window, while creating a sense of urgency to respond. If an email address or mobile phone number is not available, the reminder text can be printed and sent home to parents in the sampled students’ backpacks or distributed to teachers.

As noted above, a more integrated approach to parent consent and questionnaire follow-up has been created for the main study, including:

  • an initial email or text message will be sent notifying parents that IELS information is coming, followed by the hard-copy parent informational package;

  • a first reminder email or text to be sent approximately one week after the initial invitation to all parents who have not yet started the questionnaire (status = not started);

  • a custom, hard copy Halloween card that includes an end date will be sent, two weeks after the initial invitation, to all parents who have yet to complete the questionnaire;

  • in early November, a second email or text notification will be sent to all non-responding parents reminding them of the upcoming deadline;

  • in mid-to-late November, a final reminder will be sent via email or text to non-respondents;

  • on the day of the deadline (November 30), an extension will be sent to non-respondents, extending the completion deadline to December 7 (the last scheduled day of data collection); and

  • a thank you to all respondents will be sent at the close of data collection.

Every effort has been made to include Spanish-speaking parents, including the use of a Spanish-language parent questionnaire and Spanish-language versions of the consent and other communication materials (updated translated materials will be provided as a change request by mid-August 2018). A Spanish-language version of the child assessment is also planned.

After Assessment Day. Once the data collection is completed, thank you letters (and incentive checks for schools and SCs) will be distributed to responding parents, teachers, SCs, and school administrators. A parent thank you email and text message are also available if these methods are preferred. The letter will point respondents who wish to learn more about the study to the NCES IELS website. Once results become available, an email may be sent to principals, School Coordinators, parents, and teachers, alerting them to the link to the results. Text for these emails is provided in Appendix B.

B.3 Maximizing Response Rates

Studies have increasingly experienced challenges in obtaining the cooperation of districts and schools. Loss of instructional time, competing demands (such as district and state testing requirements), lack of teacher and parent support, and increased demands on principals impede gaining permission to conduct research in schools. The IELS recruitment teams were trained to communicate clearly to districts, dioceses, private school organizations, schools, teachers, and parents the benefits of participating in the IELS main study, as appropriate, and what participation will require in terms of student and school personnel time. Recruiters were trained to address concerns that districts and schools may have about participation, while simultaneously communicating the value of the study and the school’s key role in developing instruments that ensure high-quality data focusing on young kindergarten students.

Our approach to maximizing school and student response rates in the main study includes the following:

  • Assigning personal recruiters for specific schools;

  • Using experienced recruiters;

  • Developing persuasive written materials;

  • Avoiding refusals by focusing on strategies to solve problems or meet obstacles to participation;

  • Incentives for schools, school coordinators, parents, and teachers (see Section A9); and

  • Contact with schools and school coordinators at set intervals throughout the year preceding the assessment.

These approaches are based on recommendations from an NCES panel and experience with previous administrations of international assessments such as PISA and TIMSS.

B.4 Purpose of Field Test and Data Uses

Participation in the field test is an international requirement for taking part in the IELS main study. The main focus of the field test was to collect enough assessment data to perform reliable tests of the items. However, during the field test, procedures for conducting the main study, including recruitment methods for obtaining school and student participation, were also evaluated. This information was used to: (a) determine the final main study design and (b) improve our recruiting strategies and materials for the main study.

The U.S. IELS field test data collection occurred from November 6 through December 15, 2017. In order to meet the international data collection schedule for fall 2017, field test recruiting activities began in September 2017. For the field test, the IELS tablet-based assessment comprised four domains: emerging literacy, emerging numeracy, empathy and trust, and self-regulation.

The assessment items were administered to children using Samsung Galaxy touch screen tablets, with the corresponding questions read aloud via audio from the application to the child, who listened with or without the use of supplied headphones. The children were asked to respond to questions in a variety of ways, including tapping on a picture or dragging and dropping items on the tablet’s screen.

Each assessment was administered one-on-one. That is, a trained assessor sat next to each sampled child, monitoring the child’s progress and answering questions as needed. While there was often more than one child in an assessment space, each child was matched with an assessor who monitored the assessment during the entire assessment session. The play-based assessment was untimed and self-administered, although the assessor was available to respond to questions, demonstrate how to answer items if needed, and intervene if technical issues with the hardware or software arose.

One parent or guardian of each student was asked to complete a brief online survey. Hard copy questionnaires were also available at the end of the administration window. The questionnaire included demographic questions as well as questions about the child’s early learning environment and experiences, social skills, and participation in child care. Parents were given a welcome packet at the beginning of the field period, with informational materials, a description of the type of consent the school wished parents to provide for their child’s participation in the study, and a postcard with a unique username and password to access the study website, MyIELS. MyIELS served as a study portal containing additional informational materials for respondents, as well as a page for parents to provide consent for their child’s participation and contact information. If parents did not wish to register and provide consent on the MyIELS portal, they were also provided with a direct, unique link to the questionnaire in the welcome packet.

Teachers of sampled children were asked to complete an online questionnaire, which included questions about their professional background (about 3 minutes to complete) and the skills and abilities of the children selected for the study (about 5 minutes per child, with an average of four children per teacher). The survey did not need to be completed in a single session. Teachers were emailed their link to the questionnaire and did not need to register on the MyIELS portal, unless they wanted more information about the study. Hard copy versions of the questionnaire were also available.

The original sample contained 30 schools, each with two replacement schools, for a total of 90 schools. Seven of the schools in the original sample did not participate, due to refusal or school ineligibility. Six replacement schools were assessed in their place. A total of 29 schools in 18 districts were recruited for the IELS field test. Of those schools, 24 were public schools, four were Catholic, and one was private, non-Catholic. Recruitment results by school type are listed in Table 2.

Table 2. Number of Schools by Type

School type | Original | Original – Not participating | Replacement | Total participating
Total schools | 30 | 7 | 6 | 29
Public | 25 | 5 | 4 | 24
Catholic | 3 | 0 | 1 | 4
Private-other | 2 | 2 | 1 | 1

Of the 29 schools sampled, 11 schools chose explicit parent consent, 10 schools chose implicit parent consent, and eight schools chose parent notification only (see Table 3).

Table 3. Parent consent type by school type

School type | Explicit | Implicit | Notification | Total
Total schools | 11 | 10 | 8 | 29
Public | 7 | 10 | 7 | 24
Catholic | 3 | 0 | 1 | 4
Private-other | 1 | 0 | 0 | 1

A total of 537 children were sampled for the IELS field test; 14 children were excluded from the assessment due to special needs or other reasons. Participation in Day 1 and Day 2 of the assessment is detailed below. Of the 523 eligible children, 462 children (88 percent) completed Day 1 of the assessment and 460 (88 percent) completed Day 2 of the assessment. Four children per day were assessed with accommodations. The absentee rate was comparable on both days, with 33 children (six percent) absent for Day 1 and 35 children (seven percent) absent for Day 2 of the assessment. Overall, there were 28 (five percent) parent consent refusals. Participation rates for the cognitive assessment are detailed in Table 4.

Table 4. Cognitive assessment participation rates

Category | N | %
Total sampled children | 537 | 100
Out-of-Scope | 14 | 3
   Excluded because of special needs | 2 | 14
   Left school (Assess? column in Child Assessment Grid = 4 or 5) | 9 | 64
   Non-English speakers/homeschooled (Assess? column in Child Assessment Grid = n) | 3 | 21
In-Scope | 523 | 97
PENDING | 12 | 2
   Parent consent pending (Assess? column in Child Assessment Grid = 0) | 12 | 100
DAY 1 – COMPLETE | 462 | 88
   Participated (C) | 458 | 99
   Participated with accommodation (SA) | 4 | 0.9
DAY 1 – NONRESPONSE | 61 | 11
   Absent (A) | 33 | 54
   Parent refused (P) | 28 | 46
DAY 2 – COMPLETE | 460 | 88
   Participated (C) | 456 | 99
   Participated with accommodation (SA) | 4 | 0.9
DAY 2 – NONRESPONSE | 63 | 12
   Absent (A) | 35 | 56
   Parent refused (P) | 28 | 44
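As a quick arithmetic check, the headline rates reported above can be reproduced from the counts in Table 4. The denominator used here (the 523 in-scope children for completion, absence, and refusal rates) is an assumption that is consistent with the figures reported in the text.

total_sampled = 537
out_of_scope = 14
eligible = total_sampled - out_of_scope  # 523 in-scope children

day1_complete, day2_complete = 462, 460
absent_day1, absent_day2 = 33, 35
parent_refusals = 28

# Rates are computed over the 523 in-scope children, as described in the text.
print(f"Day 1 completion: {day1_complete / eligible:.0%}")   # ~88%
print(f"Day 2 completion: {day2_complete / eligible:.0%}")   # ~88%
print(f"Day 1 absences:   {absent_day1 / eligible:.0%}")     # ~6%
print(f"Day 2 absences:   {absent_day2 / eligible:.0%}")     # ~7%
print(f"Parent refusals:  {parent_refusals / eligible:.0%}") # ~5%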

A total of 537 parents were given access to the IELS parent questionnaires. Of the 523 eligible parents, 236 (45 percent) completed the questionnaire; 100 percent of parent questionnaires were completed online. Parent participation is detailed in Table 5.

A total of 111 teachers were sampled with their students. Of the 111 teachers surveyed, 104 (94 percent) returned Part A of their questionnaire. All teacher questionnaire responses were completed online. Of the 523 Part B questionnaires for eligible children, 483 (92 percent) were completed. The questionnaire response rates are detailed in Table 5.

Table 5. Parent and teacher questionnaire response rates

Respondent | Total | Total Responses Received | % of Total
Teacher, Part A | 111 | 104 | 94
Teacher, Part B | 523 | 483 | 92
Parent | 523 | 236 | 45

Child Assessment Findings

The majority of feedback from the field staff indicated that while children really enjoyed the assessments, some children did experience fatigue at some point during the assessment. Some children asked for breaks, showed physical signs of fatigue or exhaustion, indicated that an activity or domain was too long, asked to skip an activity, or asked to stop the assessment entirely. The length of each assessment day may contribute to such issues. During the field test, children completed the Emergent Literacy and Emergent Numeracy domains on one day of assessment and the Self-Regulation and Empathy and Trust domains on the other day. This second pairing tended to take significantly more time. Assessors noted that children were often most engaged and enthusiastic during the day of testing when the assessment involved action-oriented games and activities, but the content of the Empathy and Trust and Self-Regulation domains seemed to be more cognitively taxing for the children. Analysts examined the possibility of reordering the combinations of domains to help mitigate children’s fatigue and determined that the pairings should be changed. For the main study, the literacy and self-regulation tasks will be administered on day 1 and the numeracy and empathy items will be administered on day 2. Note that the trust domain was found to be unreliable and was removed from the child assessment.

Some of the observers and field staff noted that the instructions for the assessment domains were not sufficiently clear to the children. The children asked the assessors clarification questions (e.g., “How do I do this?” “Do I drag the beads?”) because they did not understand the action they were being asked to perform. As a result, children often expressed frustration and confusion with some questions and sometimes made a selection without listening to the question, and thus did not complete the activity correctly. To address this challenge, we recommended that OECD build one or more screens into the assessment at the beginning of each domain that prompt assessors to demonstrate the functionality of activities and to emphasize new or different functionalities when they apply. For example, the demonstration should include how to use the tablet, how to use the “drag and drop” functionality, how to advance or return to pages (where appropriate), and common mistakes to avoid. Developing and implementing scripted demonstrations would minimize the impact of potential experience gaps by ensuring all children have baseline knowledge of how to use the tablet, how to navigate throughout the assessment, and how to complete each activity. To the extent possible, these changes are being implemented by the OECD.

In addition, there was confusion among the field staff about which items were practice items, which allowed for scaffolding, and which were test items. The field staff were conscious of not coaching the children, but the lack of clarity on how to distinguish the different item types, especially in the Empathy and Trust and Self-Regulation domains, was problematic. Clearer distinctions between practice and test items will be made in the main study; in addition, the different item types will be emphasized during the main study training.

Parent Questionnaire Findings

One of the most significant data challenges encountered during the IELS field test was the low response rate on the parent questionnaires (approximately 44 percent). Several factors contributed to this low response rate: the study team experienced delays in receiving the parent questionnaire instrument, and data were not immediately available regarding which parents had completed their questionnaire. Consequently, the first wave of data collection was already underway before the team could conduct non-response follow-up and distribute the hard copy parent questionnaire. Because several sites had already completed their assessments when the project team received final approval by OECD of the print questionnaires, the team had to mail the questionnaires to some parents in the final weeks of data collection, after their children had been assessed. As a result of this delay, parents were instructed to complete the questionnaires and return them within one week, likely too short a timeline for many parents and thus a further contributor to the low response rate.

To address these challenges, we recommended to OECD that each parent receive a comprehensive packet of study materials four weeks before their child’s assessment date. Updated parent materials appear in Appendix B.

Relatedly, we believe that the revised approach to the parental incentive process, described in this submission, may further improve response rates. During the field test, Westat distributed a $20 check to each parent whose name was provided by the SC or who provided his or her name by completing the online consent module. However, this approach has two key disadvantages: (1) it prevents incentive delivery to parents whose names are not available; and (2) it requires parents to cash checks, which can pose an additional burden to participants, particularly those of low socioeconomic status who are less likely to have bank accounts. Rather than using checks, as in the field test, in the main study we will use loadable debit cards, which will be included in the comprehensive packet of study materials described above. Immediately upon completing the online survey, parents will be redirected from the survey to another site to activate their debit card. For parents who complete and return the paper version of the questionnaire, the IELS study team will activate the debit card upon its receipt. This approach is expected to be a significant improvement over the use of checks because debit cards do not require parent names and can be used immediately upon activation, without requiring a bank account.

Furthermore, the parent questionnaire, at an estimated 30 minutes for completion and 18 pages when printed, is lengthy and perhaps daunting for respondents. We recommended that OECD review the questions to ensure they tightly align with study constructs and research goals, deleting any questions that prove extraneous, as a shorter questionnaire may help increase parent response rates. We also recommended re-ordering the questionnaire so that more factual questions (e.g., attendance in day care) appear at the beginning, with more personal, and potentially sensitive, questions appearing later. For example, we recommended moving the questions on whether the child was born in the US and, if not, how long he or she has been residing in the US to the back of the questionnaire, or deleting them altogether because of their sensitivity. For the main study questionnaire, the question on the child’s location of birth was moved further toward the back of the questionnaire, and the question on the length of residency in the US was deleted. Although it was not possible to adopt all of our recommended revisions, in order to keep the parent questionnaire standardized across participating countries, all of our recommendations were acknowledged by OECD and, to the extent possible, a number of them were incorporated into the main study parent questionnaire.

B.5 Individuals Consulted on Study Design

Overall direction for the IELS is provided by Dr. Stephen Provasnik, NCES Team Lead for International Assessments, and Ms. Mary Coleman, Project Officer, at NCES, U.S. Department of Education.

Key international colleagues include:

  • Overall project direction: Dr. Sacha DeVelle, Australian Council for Educational Research;

  • Survey design: Dr. Wolfram Schultz, Australian Council for Educational Research; and

  • Study Operations: Ms. Julianne Henke, IEA Data Processing Center.
