Middle Grades Longitudinal Study of 2017-18 (MGLS:2017)
Operational Field Test (OFT) and Recruitment for Main Study Base-year
Supporting Statement Part B
Carried over OMB# 1850-0911 v.10
Newly submitted OMB# 1850-0911 v.15
National Center for Education Statistics
U.S. Department of Education
Institute of Education Sciences
Washington, DC
July 2016
Carried over OMB# 1850-0911 v.10 (approved in March 2016)
Newly submitted OMB# 1850-0911 v.15
Middle Grades Longitudinal Study of 2017–18 (MGLS:2017)
Recruitment for
2017 Operational Field Test (OFT)
OMB# 1850-0911 v.10
Supporting Statement Part B
APPROVED in March 2016
National Center for Education Statistics
U.S. Department of Education
Institute of Education Sciences
Washington, DC
September 2015
Revised December 2015
Revised March 2016
Revised July 2016 (only verb tenses have been revised)
B. Collection of Information Employing Statistical Methods
B.2 Procedures for the Collection of Information
B.3 Maximizing Participation
B.4 Purpose of Field Test and Data Uses
B.5 Individuals Responsible for Study Design and Performance
Part B of this submission presents information on the collection of information employing statistical methods for the recruitment for the Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) Operational Field Test (OFT). A separate OMB clearance request for the OFT data collection, informed in part by the Item Validation Field Test (IVFT), is being submitted.
The MGLS:2017 Operational Field Test (OFT) will be conducted during the 2016-17 school year, with data collection scheduled to begin in January 2017. The OFT will be conducted in ten geographic locations that adequately simulate the diversity of the 50 States and the District of Columbia. The OFT may include up to two quasi-nationally representative samples of students in the United States. The Sample 1 universe includes students enrolled in grade 6 and attending general education schools, while Sample 2, if drawn, would include students in general education schools in three focal disability categories (autism, specific learning disability, and emotional disturbance) who are enrolled in grade 6 or are of equivalent age in an ungraded classroom. Sample 2 will be used if Sample 1 proves insufficient to meet the target student sample yields for the three focal disability categories.
The MGLS:2017 OFT employs a multi-stage sampling design, with schools selected in one stage, and then students selected within schools. Schools have been selected using probability proportional to size sampling within school sampling strata, with some schools selected with certainty in order to ensure that all desired types of schools are included (see below).
Students will be selected using simple random sampling within student sampling strata within schools. The school frame has been constructed from the 2013-2014 Common Core of Data (CCD 2013-14) and the 2013-2014 Private School Universe Survey (PSS 2013-14) and includes 3,301 schools that report offering sixth-grade instruction and are located within ten metropolitan statistical areas (MSAs). The following types of schools have been excluded from the sampling frame:
Department of Defense Education Activity schools and Bureau of Indian Education schools,
schools for juvenile delinquents,
schools that report no sixth-grade enrollment and do not report to EDFacts,
schools that report no sixth-grade enrollment and report no students between the ages of 11 and 13 in the three focal disability groups,
special education schools, and
schools included in the Item Validation Field Test (IVFT).
Schools were stratified by MSA and by a High/Low Prevalence designation derived from the total number of students in the three focal disability groups. Schools are classified as High Prevalence if the total number of students in the three focal disability groups exceeds the 95th percentile for that total across all schools, not just those in the ten MSAs used for the OFT sampling. One hundred twenty-five schools have been sampled, and 103 of the 125 were selected for initial data collection. The 22 schools not selected for initial data collection will serve as a reserve sample in the event that a higher-than-expected proportion of schools decline to participate or are found to be ineligible for the OFT.
The allocation of the 125 schools to the school sampling strata proceeded by first determining an allocation of 103 schools across the strata such that approximately half of the sample consists of high prevalence schools, half consists of low prevalence schools, and approximately ten sample schools fall in each MSA. Twenty-two additional reserve schools were distributed across the school sampling strata to preserve these goals as well. The school strata and a sample allocation that meets these goals are shown in table 1.
Table 1. OFT School Sample Allocation

| School Region | Prevalence | Schools on Frame | Schools Among 103 | Certainty: Private, Rural, Town (30) | Certainty: Stratum Frame Exhausted (17) | Non-certainty Schools Among 103 (56) | Reserve Sample (22) | PPS Sample of 78 | School Sample Size (125) |
|---------------|------------|------------------|-------------------|--------------------------------------|-----------------------------------------|--------------------------------------|---------------------|------------------|--------------------------|
| A | High Prevalence | 76 | 7 | 2 | 0 | 5 | 2 | 7 | 9 |
| A | Low Prevalence | 395 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| B | High Prevalence | 22 | 7 | 0 | 0 | 7 | 2 | 9 | 9 |
| B | Low Prevalence | 293 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| C | High Prevalence | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 |
| C | Low Prevalence | 87 | 6 | 1 | 0 | 5 | 2 | 7 | 8 |
| D | High Prevalence | 5 | 5 | 0 | 5 | 0 | 0 | 0 | 5 |
| D | Low Prevalence | 170 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| E | High Prevalence | 5 | 5 | 0 | 5 | 0 | 0 | 0 | 5 |
| E | Low Prevalence | 305 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| F | High Prevalence | 40 | 7 | 0 | 0 | 7 | 2 | 9 | 9 |
| F | Low Prevalence | 566 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| G | High Prevalence | 14 | 7 | 0 | 0 | 7 | 2 | 9 | 9 |
| G | Low Prevalence | 572 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| H | High Prevalence | 12 | 7 | 0 | 0 | 7 | 2 | 9 | 9 |
| H | Low Prevalence | 497 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| I | High Prevalence | 4 | 4 | 0 | 4 | 0 | 0 | 0 | 4 |
| I | Low Prevalence | 148 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| J | High Prevalence | 2 | 2 | 0 | 2 | 0 | 0 | 0 | 2 |
| J | Low Prevalence | 87 | 5 | 3 | 0 | 2 | 2 | 4 | 7 |
| Total | | 3,301 | 103 | 30 | 17 | 56 | 22 | 78 | 125 |
The first step in the sampling process involves the selection of thirty schools with certainty in order to ensure that at least ten private schools, ten schools in rural areas, and ten schools in towns are included in the sample of 103 schools to be initially fielded for data collection. The allocation of these thirty schools to the twenty school sampling strata is provided in table 1. The second step requires identifying those school sampling strata where the sample allocation of the 125 schools equals the number of schools in the frame. As shown in table 1, there are five such strata, containing seventeen schools in all; all seventeen were selected with certainty. Forty-seven of the 103 schools, or approximately 46 percent of the sample, were therefore selected with certainty. The third step involves selecting 78 schools using stratified probability proportional to size sampling; the distribution of these 78 schools across the school sampling strata is provided in table 1. The fourth step entails selecting, via simple random sampling within school strata, 56 of the 78 probability-proportional-to-size schools for inclusion in the initial set of 103 schools released for data collection. The sample allocation of these 56 schools is shown in table 1.
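The probability-proportional-to-size selection used in the third step can be illustrated with a minimal systematic PPS sketch. This is an illustration only, not the study's actual selection algorithm; the field names, size measures, and the random-ordering approach are assumptions.

```python
import random

def pps_systematic_sample(frame, n, size_key="mos"):
    # Systematic probability-proportional-to-size selection on a randomly
    # ordered frame. Assumes no unit's size measure exceeds the sampling
    # interval (in the design, such schools are taken with certainty first).
    units = list(frame)
    random.shuffle(units)
    total = sum(u[size_key] for u in units)
    interval = total / n
    start = random.uniform(0, interval)
    targets = [start + k * interval for k in range(n)]
    selected, cum, t = [], 0.0, 0
    for u in units:
        cum += u[size_key]
        while t < n and targets[t] < cum:
            selected.append(u)
            t += 1
    return selected

# One hypothetical stratum of 8 schools; size measures are illustrative only.
stratum = [{"id": i, "mos": m} for i, m in enumerate([5, 12, 3, 20, 8, 15, 6, 9])]
sample = pps_systematic_sample(stratum, 3)
```

Each school's selection probability is proportional to its size measure, so schools with more focal-disability students are more likely to enter the sample, which is what keeps the student-level weights approximately equal within domains.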
The size measure used for the probability proportional to size selection of 78 schools was constructed using the overall sampling rates for students in the following four student categories:
Autism,
Emotional Disturbance,
Specific Learning Disability, and
Other
combined with the total number of students in each of those categories at a given school. In other words, the size measure for a given school i may be written as follows:

MOS_i = Σ_j r_j N_ij, for j = 1, …, 4,

where r_j is the sampling rate for the jth student category and N_ij is the number of students in the jth category within school i. The sampling rate, r_j, equals the number of students to sample from the jth category divided by the number of students in the jth category across all schools in the sampling frame. The sampling rate for autistic students equals 188/7,706 (.024), the sampling rate for students with emotional disturbance equals 188/5,115 (.037), the sampling rate for students with specific learning disability equals 188/42,603 (.004), and the sampling rate for other students equals 1,188/280,424 (.004). The denominator of a given rate corresponds to the number of students in that domain across all schools in the sampling frame. The numerator of a given rate equals the number of students required to be sampled in order to achieve 1,120 participating students, including 120 autistic students, 120 students with emotional disturbance, 120 students with specific learning disability, and 760 other students.
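The sampling rates and the school size measure can be sketched directly from the figures above (a minimal illustration; the category labels, helper name, and example enrollment counts are hypothetical):

```python
# Sampling rates from the OFT design: target sampled counts divided by
# frame totals for each student category (figures taken from the text).
rates = {
    "autism": 188 / 7706,                        # ≈ .024
    "emotional_disturbance": 188 / 5115,         # ≈ .037
    "specific_learning_disability": 188 / 42603, # ≈ .004
    "other": 1188 / 280424,                      # ≈ .004
}

def school_size_measure(counts):
    # MOS_i = sum over categories j of r_j * N_ij for one school;
    # `counts` maps each category to its enrollment N_ij.
    return sum(rates[j] * n for j, n in counts.items())

# Hypothetical school with 600 sixth graders:
example = {"autism": 4, "emotional_disturbance": 3,
           "specific_learning_disability": 25, "other": 568}
mos = school_size_measure(example)
```

Because the disability-category rates are several times larger than the "other" rate, a school's size measure, and hence its selection probability, rises sharply with its focal-disability enrollment.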
The unequal weighting effect is designed to be one within each of the four student domains (autism, specific learning disability, emotional disturbance, and other) within each school stratum among the schools selected using probability proportional to size sampling. A design effect of one means that the precision of an estimate is equivalent to the precision of an estimate derived via simple random sampling. The degree to which certainty schools will reduce the precision of estimates depends upon the degree to which certainty schools participate, the degree to which students in all schools participate, the degree to which student enrollment counts match expected counts, and the set of non-certainty schools that are sampled.
Within participating schools, students will be stratified into the four student categories defined above, and a simple random sample of students will be selected from each student sampling stratum. The number of students sampled per stratum will vary by school because the within-school allocation to strata depends upon the number of students in each of the four student sampling strata. The process of determining the student sample allocation follows the procedure outlined in section 2 of Folsom et al. (1987). The number of sampled students per student domain will generally not be equal and will vary across schools. The process outlined in Folsom et al. will also be followed for the full-scale collection. Approximately 34 students will be sampled from each of the anticipated 50 participating schools.
For OFT Sample 1, schools were selected for recruitment from the CCD 2013-14 to develop the public school list and from the PSS 2013-14 to develop the private school list. In Sample 1, once schools are selected and recruited, students enrolled in grade 6 will be selected from student rosters that schools will be asked to provide. For OFT Sample 2, districts would be contacted to provide rosters of students enrolled in grade 6 or of equivalent age in an ungraded classroom along with their disability codes. Students in the autism, specific learning disability, and emotional disturbance categories would be sampled from the provided rosters, and their schools would be contacted for recruitment of the specified students. Alternately, if districts prefer, they may identify a school or schools in their district which contain grade 6 students in one or more of the focal disability groups and MGLS:2017 staff will invite the school(s) to participate and request a student list with disability codes from the school(s).
During the 2017 administration of the OFT, students will participate in the full protocol as designed for implementation in the main study base year, including student assessments; a student survey; and surveys of the students’ parents, math teachers, special education teachers (as applicable), and school administrators. In the two subsequent OFT follow-ups planned for 2018 and 2019, when most students will be in grades 7 and 8, respectively, student tracing and tracking activities will be conducted to test approaches, materials, and procedures needed to locate the main study sample students as they move and change schools. The OFT follow-ups will also allow for gaining cooperation from base-year and newly-identified schools and collecting data from students, parents and school staff.
Students
The specified desired yield for the OFT is 1,120 students enrolled in grade 6. Included in the 1,120 cases is a sample of 120 students in each of the three focal disability categories. Depending on each school's policy, either active (explicit) or passive (implicit) parental consent materials (appendices OFT1-F to G) will be distributed to the sampled students, with an estimated 80 percent rate of granted consents. Among those with granted consent, an estimated 80 percent of students are expected to be present and take the assessments. Thus, to achieve a yield of 1,120 assessed students, the parents of approximately 1,750 students will need to be contacted for consent (1,750 × 0.8 × 0.8 = 1,120).
Schools
The MGLS:2017 main study design estimates that approximately 20–25 students from each school will participate, which is similar to the number of students who participated in each school in the ECLS-K:2011 and the HSLS:09. Therefore, for the OFT to mimic procedures in the main study, assuming an average participation of 22 students in each school and a target yield of 1,120 students total, the study will need to obtain the participation of approximately 50 schools. Based on an estimated 97 percent school eligibility rate and an estimated school response rate of 50 percent, the MGLS:2017 OFT would need to draw a sample of approximately 103 schools.
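The student and school yield arithmetic in the two paragraphs above works backward from the targets through each assumed rate, and can be reproduced as:

```python
# Working backward from the OFT yield targets using the response-rate
# assumptions stated in the text.
consent_rate = 0.80        # parents expected to grant consent
assessment_rate = 0.80     # consented students expected to be assessed
target_students = 1120
parents_to_contact = target_students / (consent_rate * assessment_rate)  # 1,750

students_per_school = 22   # assumed average participation per school
schools_needed = target_students / students_per_school  # ≈ 51, i.e., ~50
eligibility_rate = 0.97
school_response_rate = 0.50
schools_to_sample = 50 / (eligibility_rate * school_response_rate)  # ≈ 103
```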
Also, if in the participating Sample 1 schools we are not able to identify a sufficient number of students in each of the three focal disability categories to meet the sample yield targets, a sample of supplementary schools where more than 80 percent of the students have been identified as having an IEP will be recruited. In such a case, we will initially seek to identify these schools in or near districts that already contain schools that have agreed to participate. If we need to recruit additional supplemental schools, we will choose schools that allow the most cost effective field administration, given the geographic deployment of the field assessment staff.
School Recruitment Approach
Gaining schools’ cooperation in voluntary research is increasingly challenging. For example, in 1998–99 the Early Childhood Longitudinal Study had a weighted school-level response rate of 74 percent,6 whereas 12 years later, the complementary ECLS-K:2011 study had a weighted school-level response rate of 63 percent.7 Additionally, there is evidence that response rates may be lower for schools that serve older students, as in the High School Longitudinal Study of 2009, which had a weighted school-level response rate of 56 percent.8 Therefore, effective strategies for gaining the cooperation of schools are of paramount importance. Attempts are being made to solicit a letter of endorsement from the respective state education agencies to include in recruitment materials sent to districts and schools. Schools will then be recruited both directly and at the district level.
State Endorsement. To encourage district and school participation in the study, the respective state education agencies have been contacted to inform them about the study and to request a letter of endorsement (appendix OFT1-B). The state testing coordinator and, where applicable, the middle grades coordinator at the state level are copied on the state letter. Within 3 days of sending the letter to the state, senior recruitment staff contact the state superintendent, state testing coordinator, and, where applicable, the middle grades coordinator to discuss and secure support for the study. Endorsement letters received from a state will be included in all mailings to districts and schools within that state.
School Districts and Diocesan Recruitment. After state contacts have been completed, whether or not an endorsement letter is received, school districts of sample public schools and dioceses of sample Catholic schools will receive a mailing about the study. The district introductory information packet will include a cover letter (appendix OFT1-C), brochure (appendix OFT1-H), and sheet of Frequently Asked Questions (FAQs) (appendix OFT1-I). Three days after mail delivery of the packet, a recruiter will make a call to secure the district’s cooperation, answer any questions the superintendent or other district staff may have, review the list of schools sampled in the district as well as other schools serving grade 6 or serving students of equivalent age in an ungraded classroom, confirm key information about the schools (e.g., grades served, size of enrollment, enrollment of students with disabilities), and discuss obtaining the students’ IEP information that is necessary for drawing the OFT student sample. Information collected during this call will be used to confirm which schools in the district are eligible for participation in the study, and to obtain contact and other information helpful in school recruitment.
The study staff will be prepared to respond to requirements such as research applications or meetings to provide more information about the study. If a district chooses not to participate, the recruiter will document all concerns listed by the district so that a strategy can be formulated for refusal conversion attempts.
In addition to obtaining permission to contact the selected schools, districts will also be asked about the best way to gather student rosters to enable MGLS:2017 staff to recruit enough students in the three focal disability categories. The purpose of this question is to ask not only about identifying students in the three focal disability categories at the selected school(s) in the district, but also to inquire about obtaining this information for all schools in the district serving grade 6 or equivalent age. With the districts’ permission, additional schools from the district may be added to the study for the sole purpose of collecting data from students with one of the three focal disabilities.
Recruitment of Public and Catholic Schools. Upon receipt of district or diocesan approval to contact the sample public or Catholic schools, respectively, an introductory information packet will be sent via FedEx that includes a cover letter (appendix OFT1-D), a colorful recruitment-oriented brochure (appendix OFT1-H), and a sheet of Frequently Asked Questions (FAQs) about the study (appendix OFT1-I) with links for accessing the MGLS:2017 recruitment website. Within three business days of the information packet delivery (confirmed via FedEx), a school recruiter will follow up with a phone call to secure the school’s cooperation and answer any questions the school may have. During this call, the recruiter will establish who from the school’s staff will serve as the school coordinator for the study. The MGLS:2017 study team will then work with the school coordinator to schedule OFT activities at the school, including gathering student rosters, distributing consent materials to parents of sample students, and arranging the onsite assessments. In early communications, the recruiter will also gather information about what type of parental consent procedures need to be followed at the school; any requirements for collecting data on the IEP status of students and student-teacher rosters; hours of operation, including early dismissal days, school closures/vacations, and dates for standardized testing; and any other considerations that may impact scheduling student assessments (e.g., planned construction periods, school reconfiguration, or planned changes in leadership). The study recruitment team will meet regularly to discuss recruitment issues and develop strategies for refusal conversion on a school-by-school basis.
Private and Charter School Recruitment. If a private or charter school selected for the field test operates under a higher level governing body such as a diocese, a consortium of private schools, or a charter school district, we will use the district-level recruitment approach with the appropriate higher level governing body. If a private or charter school selected for the field test does not have a higher level governing body, the school recruitment approach outlined above will be used.
Collection of Student Rosters. Once a school or district has agreed to participate, the MGLS:2017 contractor, RTI International (RTI), will gather student rosters from the district or with the assistance of the school coordinator from the school. A complete roster of all students eligible for sampling will be requested, including key student characteristics, such as: name; ID number; month and year of birth; grade level; gender; race/ethnicity; and IEP status with disability code(s), when applicable. Each of these characteristics is important for sampling purposes, but we will work with schools that are unable to provide all of the information to obtain the key information available. Based on this information the student sample will be drawn. As part of the roster collection, RTI will also request from the school coordinator or designated district personnel the following information for each student eligible for sampling: student’s parent and/or guardian contact information (e.g., mailing address; landline phone number; cell phone number; e-mail address); student’s math teacher; and student’s special education teacher, when applicable. Schools and districts usually find it easiest, and therefore most efficient, to supply all of the desired information one time for all of their students. However, should it be problematic for any school or district to provide the parent and teacher information on the complete roster, RTI will gather that information as a second step for the sampled students only. If the school and/or district is unwilling to provide parent contact information, RTI will work with the school and/or district to determine the best way to contact parents (e.g., the school coordinator or designated district personnel would facilitate contacting parents and/or mail the required materials to parents using the contact information they have on file).
Schools and districts will be provided with a template and secure transfer options to deliver the rosters (appendices OFT1-S and T). The data quality of the student rosters will be then evaluated by:
reviewing and assessing the quality and robustness of student and parent information available at each school, including contact information for parents;
reviewing and assessing the quality of the data on student-teacher linkages;
reviewing and assessing the quality of the data on IEP status;
addressing any incompleteness or irregularities in the roster file;
requesting additional information as needed from the school coordinator or designated district personnel; and
(re)verifying that the sampled students are currently in attendance in the school.
Parent Recruitment
Information about schools’ procedures for obtaining consent for students to participate in the study will have been gathered during school recruitment. Schools generally require one of two types of consent: implicit or explicit. Both types of consent require that parents be notified that their children have been selected for the study. With implicit consent, the school does not require verbal or written consent for a student to participate in the study – parents are asked only to notify the appropriate person if they do not want their child to participate (appendix OFT1-F). With explicit consent, children may participate only if their parents provide written or oral consent for their children to do so (appendix OFT1-F). In the field test, proactive parent recruitment will be focused on maximizing the number of parents (1) returning signed explicit consent forms and (2) completing the parent survey.
After the student sample is drawn within a school, the initial communication with parents, consisting of introductory and consent materials (appendices OFT1-F to I), will be distributed to parents in the way each school believes to be most appropriate and effective (e.g., sending the materials home with students; the school or district sending the materials directly to parents; and/or trained MGLS recruitment staff contacting parents directly by mail, email, and/or phone). The initial materials will introduce the study, explain the study's purpose and the importance of student and parent participation, describe what is involved in participation, and specify the consent procedure being used by the school. The materials will include a consent-seeking letter to all parents plus a consent form where explicit consent is required (appendix OFT1-F), a colorful recruitment-oriented brochure (appendix OFT1-H), and a sheet of FAQs about the study (appendix OFT1-I) with links for accessing the MGLS:2017 recruitment website (appendix OFT1-J). Additionally, in schools using explicit consent, the parental consent form for the student's participation, which will be included in the initial communication materials, will ask parents to provide their contact information (appendix OFT1-G). Parent data collection will entail web-based self-administration with nonresponse follow-up by computer-assisted telephone interviewing.
Maximizing School Participation
Studies increasingly experience challenges in obtaining the cooperation of districts and schools. Loss of instructional time, competing demands (such as district and state testing requirements), lack of teacher and parent support, and increased demands on principals impede gaining permission to conduct research in schools. MGLS:2017 recruitment teams will be trained to communicate clearly to districts, dioceses, private school organizations, schools, teachers, parents, and students the benefits of participating in the field test and what participation will require in terms of student and school personnel time. MGLS staff will utilize conferences to inform middle grades professionals about the study and its field tests and to increase MGLS name recognition.
As part of the strategy to maximize response rates among school districts and schools during the recruitment process, RTI has established partnerships with organizations such as the Association for Middle Level Education (AMLE) and the National Forum to Accelerate Middle-Grades Reform (the Forum). These organizations will actively promote the value of the study to their constituencies as will a number of middle-grades education researchers who will participate in the recruitment effort.
Representatives from these organizations have committed to provide outreach to the middle grades community in general via information in newsletters and related communications. These communications will include information about the importance of the study, what study participation entails, and the benefits of the study to the middle grades community.
Recruiters will be trained to address concerns that districts and schools may have about participation, while simultaneously communicating the value of the study and the school’s key role in developing instruments that ensure high-quality data focusing on middle-grade students. Early engagement of districts and school administrators will be important. The study will also offer monetary and non-monetary incentives to schools as described in Part A of this submission, which have proven in other NCES studies to increase school participation rates.
Along with offering monetary and non-monetary incentives, our plan for maximizing district, school administrator, and parent engagement includes the following:
Experienced recruiters. The recruiting team will include staff with established records of successfully recruiting school districts and schools. To maximize district approval, senior staff will make the initial district telephone contacts. Their familiarity with the study and its future impact, as well as their experience in working with districts to overcome challenges to participation, will be crucial to obtaining district approval. Recruiters contacting schools will be equally adept at working with school administrators and providing solutions to the many obstacles associated with student assessments, including conflicts related to scheduling and assessment space, minimizing interruption to instructional time, and obtaining teacher and parent buy-in.
Persuasive written materials. Key to the plan for maximizing participation is developing informative materials and professional and persuasive requests for participation. The importance of the study will be reflected in the initial invitations from NCES (appendices OFT1-C to G) sent with a comprehensive set of FAQs (appendix OFT1-I) and a colorful recruitment-oriented brochure describing the study (appendix OFT1-H). Reviewing these study materials should provide districts and school administrators with a good understanding of the study’s value, the importance of the field test, and the data collection activities required as part of the study. A full understanding of these factors will be important both to obtain cooperation and to ensure that schools and districts accept the data collection requests that follow.
Persuasive electronically accessible materials. In addition to written materials, we will develop a recruitment-focused website which, drawing heavily on the written materials, will present clear and concise information about the study and convey the critical importance of participating in it. AMLE and the Forum will provide an outreach service, asking for support of the study, offering updates to their constituencies on the progress of the study, and making available information on recent articles and other material relevant to education in the middle grades.
Buy-in and support at each level. During district recruitment, the study team will seek not just permission to contact schools and obtain student rosters but also to obtain support from the district. This may take the form of approval of a research application and a letter from the district’s superintendent encouraging schools to participate. Active support from a higher governing body or organization, such as a district or a diocese, encourages cooperation of schools. Similarly, when principals are interested in the research activity, they are more likely to encourage teacher participation and provide an effective school coordinator.
Avoiding refusals. MGLS recruiters will work to avoid direct refusals by focusing on strategies to solve problems or meet obstacles to participation faced by district or school administrators. They will endeavor to keep the door open while providing additional information and seeking other sources of persuasion.
Incentive Experiment. As described in Part A, the approximately 100 eligible OFT schools will be randomly assigned to one of three baseline incentives: $200, $400, or a $400 non-monetary equivalent. The IVFT and OFT school incentive experiment data will be combined for analysis, increasing the analytic sample size to approximately 350 eligible sample schools, or approximately 116 schools per incentive condition. To control for field test membership, a variable indicating the field test to which the school belonged will be included along with an interaction term.
For a power of 0.80, a confidence level of 95 percent, and 116 cases within each condition, a 15.6 percentage point difference in response rate should be detectable as statistically significant (e.g., 68.0 percent vs. 83.6 percent). The formula is provided below.
n = (Zα/2 + Zβ)² × (p1(1 − p1) + p2(1 − p2)) / (p1 − p2)²
Where Zα/2 is the critical value of the normal distribution at α/2 (e.g., for a confidence level of 95 percent, α is 0.05 and the critical value is 1.96), Zβ is the critical value of the normal distribution at β (e.g., for a power of 80 percent, β is 0.2 and the critical value is 0.84), and p1 and p2 are the expected sample proportions of the two groups.
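The arithmetic behind the detectable-difference example can be sketched in a few lines of Python; the `required_n` helper below is ours, not part of the study’s tooling, and simply re-derives the example above.

```python
from math import ceil
from statistics import NormalDist

def required_n(p1, p2, alpha=0.05, power=0.80):
    """Per-group n for detecting a difference between two proportions,
    using the formula above: n = (Za/2 + Zb)^2 * (p1q1 + p2q2) / (p1 - p2)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # 1.96 for a 95 percent confidence level
    z_beta = z.inv_cdf(power)           # 0.84 for 80 percent power
    return ((z_alpha + z_beta) ** 2
            * (p1 * (1 - p1) + p2 * (1 - p2))
            / (p1 - p2) ** 2)

# 68.0 percent vs. 83.6 percent: the required n per condition
n = ceil(required_n(0.680, 0.836))  # within the ~116 schools available
```

This confirms that roughly 115 cases per condition suffice for the 15.6 percentage point example, consistent with the approximately 116 schools available in each incentive group.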
Maximizing Parent Participation
In preparation for the national study and to improve the chances of obtaining higher parent participation in MGLS:2017, a test of a responsive design approach to parent recruitment is proposed. The responsive design approach will be used to identify cases for nonresponse follow-up interventions such that the responding sample is as representative as possible of the population (i.e., sixth graders) and thereby reduce the risk of potential nonresponse bias. This approach will also be used to determine the optimal baseline incentive offer for the national study.
The parent incentive experimental conditions and plans for their implementation are described in Part A. One of three incentive amounts ($0, $10, or $20) will be offered to parents at baseline. Approximately one-third of the way through the OFT data collection, parents will be randomly assigned to receive an offer of either a $10 increase to the baseline incentive or no increase. About two-thirds of the way through the data collection period, one additional incentive boost will be offered to bring the total incentive amount to $40. Parents in the control group, who did not receive an incentive offer at baseline, will not receive an incentive offer at any point during the data collection period.
Parents of students with disabilities will be treated separately, given the analytic importance of this population for the study. We propose to offer all parents of students with disabilities $20 at baseline and an additional $10 at each of the one-third and two-thirds points in the data collection period, not to exceed $40 total per parent.
The parents of approximately 1,750 students will be contacted in the OFT and approximately 4,938 in the IVFT. The parents of the approximately 1,188 OFT students who do not have a primary IEP designation of autism, emotional disturbance, or specific learning disability will be randomly assigned to three baseline incentive amounts: $0, $10, and $20, with twenty percent (n=238) of the parents assigned to the $0 incentive, forty percent (n=475) to the $10 incentive, and forty percent (n=475) to the $20 incentive. The 4,938 IVFT parents will be randomly assigned to the same three incentive conditions, with 1,646 assigned to each. Consequently, across both field tests, 1,884 parents will be randomly assigned to the $0 incentive, 2,121 to the $10 incentive, and 2,121 to the $20 incentive.
For a power of 0.80, a confidence level of 95 percent, and 2,121 cases within each condition, a 4.0 percentage point difference in response rates should be detectable as statistically significant in this experiment (e.g., 68.0 percent vs. 72.0 percent). The formula is provided below.10
n = (Zα/2 + Zβ)² × (p1(1 − p1) + p2(1 − p2)) / (p1 − p2)²
Where Zα/2 is the critical value of the normal distribution at α/2 (e.g., for a confidence level of 95 percent, α is 0.05 and the critical value is 1.96), Zβ is the critical value of the normal distribution at β (e.g., for a power of 80 percent, β is 0.2 and the critical value is 0.84), and p1 and p2 are the expected sample proportions of the two groups.
For a power of 0.81, with 1,884 cases in one group and 2,121 cases in the other, a 4.1 percentage point difference in response rates should be detectable as statistically significant using Pearson’s chi-square test (e.g., 68.0 percent vs. 72.1 percent).11
However, the OFT and IVFT have a clustered design with students nested in schools. Assuming an approximate design effect of 4, similar to the design effect reported for parent respondents in HSLS:09,12 which also had a clustered design with students nested in schools, the effective sample sizes would be approximately 530 cases (2,121/4) and 471 cases (1,884/4) per condition. For a power of 0.80, a confidence level of 95 percent, and 530 cases within each condition, this experiment should be able to detect approximately a 7.8 percentage point difference in response rates as statistically significant (e.g., 68.0 percent vs. 75.8 percent). For a power of 0.81, with 471 cases in one group and 530 cases in the other, an 8.0 percentage point difference in response rates would be detectable as statistically significant using Pearson’s chi-square test (e.g., 68.0 percent vs. 76.0 percent).13
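The effect of the design-effect adjustment can be checked by reapplying the footnoted two-proportion formula to the effective sample size. This is a sketch under the stated assumptions; the `required_n` helper is ours.

```python
from statistics import NormalDist

def required_n(p1, p2, alpha=0.05, power=0.80):
    # Two-proportion sample-size formula from the footnote above.
    z = NormalDist()
    za, zb = z.inv_cdf(1 - alpha / 2), z.inv_cdf(power)
    return (za + zb) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2

design_effect = 4.0                  # approximation borrowed from HSLS:09
n_effective = 2121 / design_effect   # ~530 effective cases per condition

# A 7.8 point difference (68.0 vs. 75.8 percent) is detectable with ~530
# effective cases, while a 7.0 point difference (68.0 vs. 75.0) is not.
detectable = required_n(0.680, 0.758) <= n_effective
not_detectable = required_n(0.680, 0.750) > n_effective
```

In other words, clustering roughly halves the precision of the experiment: the minimum detectable difference grows from about 4 points to about 8 points.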
A main goal of the OFT is to better understand the recruitment strategies necessary for a large-scale nationally representative effort to obtain the targeted sample yield of grade 6 general education students and students with disabilities, and the subsequent tracing and tracking strategies necessary to maintain the student sample from the base year (when sample students will be in grade 6) to the first follow-up (when most of the students will be in grade 7) and the second follow-up (when most of the students will be in grade 8). As described in Part A section A.9, the OFT will include: 1) a school incentive experiment to determine whether different types and/or levels of incentives can significantly improve participation rates, 2) a student incentive experiment to determine whether, and which, appreciation tokens are effective in securing the participation of middle school students, and 3) a responsive design approach to nonresponse follow-up for the parent interview to better understand how to achieve the desired response rates in the main study.
The following individuals at NCES are responsible for the MGLS:2017: Carolyn Fidelman, Gail Mulligan, Chris Chapman, and Marilyn Seastrom. The following individuals at RTI are responsible for the study: Dan Pratt, Debbie Herget, Steven Ingels, and David Wilson, along with subcontractor staff: Sally Atkins-Burnett (Mathematica) and Michelle Najarian (ETS).
Middle Grades Longitudinal Study of 2017-18 (MGLS:2017)
Operational Field Test (OFT) and Recruitment for Main Study Base-year
OMB# 1850-0911 v.15
Supporting Statement Part B
National Center for Education Statistics
U.S. Department of Education
Institute of Education Sciences
Washington, DC
July 2016
Revised June 2017
B. Collection of Information Employing Statistical Methods 12
B.1 Universe, Sample Design, and Estimation 12
B.2 Procedures for the Collection of Information 17
B.3 Methods to Secure Cooperation, Maximize Response Rates, and Deal with Nonresponse 25
B.4 Test of Methods and Procedures 30
B.5 Individuals Responsible for Study Design and Performance 33
Part B of this submission presents information on the collection of information employing statistical methods for the Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) Operational Field Test (OFT) data collection and tracking, and recruitment for the MGLS:2017 Main Study Base-year.
The universe and sample design for the OFT were fully described in the previous clearance submission (OMB# 1850-0911 v.10). This submission focuses on the sampling universe and sampling design for the Main Study Base-year.
The MGLS:2017 Main Study Base-year will be conducted during the 2017-18 school year, with data collection scheduled to begin in January 2018. The Main Study Base-year is designed to select a nationally representative sample of schools offering grade 6 instruction and to select a nationally representative sample of students enrolled in grade 6, including students whose primary Individualized Education Program (IEP) classification is Autism (AUT), Emotional Disturbance (EMN), or Specific Learning Disability (SLD) who are being educated in an ungraded setting and are age-equivalent (aged 11 to 13) to grade 6 students. The MGLS:2017 Base-year school population excludes the following types of schools:
Department of Defense Education Activity schools and Bureau of Indian Education schools,
alternative education schools, and
special education schools.14
The MGLS:2017 Main Study Base-year will employ a multi-stage sampling design with schools selected in the first stage and students selected, within schools, at the second stage. Schools will be selected using probability proportional to size sampling within school sampling strata.
Students will be selected using simple random sampling within student sampling strata within schools. The school frame will be constructed from the 2013-14 Common Core of Data (CCD 2013-14) and the 2013-14 Private School Universe Survey (PSS 2013-14) and will include approximately 48,000 schools that report offering sixth-grade instruction. Schools included in the OFT will be excluded from the frame and, therefore, will not be eligible for the Main Study Base-year due to the OFT tracking activities that will be conducted in parallel with the Main Study Base-year. While the set of schools that will participate in the OFT is not yet known and, therefore, the number of schools in the sampling frame for the Main Study Base-year cannot yet be precisely stated, there are 48,501 schools that are otherwise eligible for the Main Study Base-year.
The sample design calls for information on sixth-grade enrollment, overall and by race and ethnicity, and counts of students whose primary IEP designation is AUT, EMN, or SLD to be used in the sampling process. EDFacts will be used to determine, for each school in the sampling frame, the number of students between the ages of 11 and 13 whose primary IEP designation is AUT, EMN, or SLD. In order for schools to be sampled, sixth-grade enrollment, overall and by race and ethnicity, and counts of students whose primary IEP designation is AUT, EMN, or SLD must be available.
There are some schools for which some of the necessary information is missing but imputation can be done to include them in the proposed sampling process. An estimated 2,973 of these 48,501 schools will need imputation15 of grade 6 enrollment or EDFacts focal disability counts. For the purposes of describing the proposed sampling process in this request, the 45,528 schools with complete (non-imputed) information are used for reporting the number of schools, overall and by sampling stratum, and for estimating the number of students in each of the student sampling strata across the universe of eligible schools. After imputation, there will be more schools in the Main Study Base-year sampling frame than are noted here. As a result, adjustments may need to be made to the school sample allocation across school sampling strata, and the overall student sampling rate, as well as the student sampling rates within student sampling strata, may need to be lowered. The use of imputation does not affect the proposed sampling process, and we anticipate minimal changes, if any, to the school sample allocation or to the subsequent student sampling rates actually used in the Main Study Base-year.
There are some schools with missing data for which imputation is not advisable, due to a concern that misestimating enrollment counts may give a higher probability of selection to these schools than warranted. For this reason, the following schools will be excluded from the sampling frame:
schools that report overall sixth-grade enrollment but do not report enrollment by race and ethnicity (n=9), and
schools that report no sixth-grade enrollment16 and report having no enrolled students between the ages of 11 and 13 in the three focal disability groups17 or do not report information on students with disabilities to EDFacts (n=1,578).
The 45,528 schools with complete (non-imputed) information in the sampling frame will be explicitly stratified by the cross-classification of the following characteristics:
school type (public, Catholic, other private),
region (Northeast, Midwest, South, West), and
prevalence of students with disabilities (high/low).18
The prevalence indicator is defined using the number of students in two of the three focal disability groups noted above. Schools will be classified as having a high prevalence of students with disabilities (i.e., high prevalence schools) if the total number of students whose primary IEP designation is AUT or EMN exceeds 17. The number of SLD students will not be factored into the stratification because the number of students classified as SLD generally far exceeds the number classified as either EMN or AUT; factoring it in would set a threshold above which schools could have very few EMN or AUT students. The number of SLD students will also be excluded from the high/low prevalence determination because sufficient numbers of SLD students are expected in the sample without oversampling. The threshold of 17 was determined by identifying the 95th percentile of the total number of students whose primary designation is AUT or EMN across all 45,528 schools.
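The high/low prevalence classification can be illustrated with a short sketch. The counts below are invented, and the nearest-rank percentile convention is our assumption; the actual threshold of 17 was derived from the real 45,528-school frame.

```python
def prevalence_threshold(aut_emn_counts, pct=0.95):
    """95th percentile of combined AUT + EMN counts (nearest-rank method)."""
    ordered = sorted(aut_emn_counts)
    idx = max(0, round(pct * len(ordered)) - 1)
    return ordered[idx]

# Made-up per-school AUT + EMN counts, NOT MGLS frame values.
counts = [0, 1, 2, 2, 3, 4, 5, 6, 8, 10, 12, 14, 15, 16, 17, 18, 20, 25, 30, 40]
threshold = prevalence_threshold(counts)                # 95th percentile cutoff
high_prevalence = [c for c in counts if c > threshold]  # schools above it
```

By construction, roughly 5 percent of schools land in the high prevalence stratum, which is why that stratum can be oversampled without dominating the design.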
Prior to selection of the school sample, schools will be sorted by locale (city, suburban, town, rural), school configuration (PK/KG/1-06, PK/KG/1-08, PK/KG/1-12, 05-08, 06-08, other), median income of the zip code in which a school resides, and school size measure within each of the explicit school strata so that approximate proportionality across locales, school configurations, and median zip code incomes is preserved. Including the school size measure in the sort makes it possible to freshen the school sample. The school sample will be freshened in the first half of 2017, before the start of base-year data collection, because schools will be selected about a year before data collection begins to allow sufficient time for recruitment. New schools will be identified through review of more recent CCD and PSS files, if available, or through web searches and conversations with school, district, and diocesan contacts. Newly identified schools will be inserted into the sorted sampling frame in such a fashion as to preserve the original sort ordering. Using a half-open interval rule19, we will identify schools to be added to the initial school sample. Any such new schools will be required to exist within school districts and dioceses represented by the initial school sample.
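The half-open interval rule can be sketched as follows. The helper and school identifiers are hypothetical; the logic is the standard rule that each frame school opens the half-open interval running up to the next frame school in sort order, and a newly identified school falling in that interval joins the sample exactly when the school opening the interval was sampled.

```python
def freshened_additions(sampled_ids, new_schools):
    """new_schools: (new_id, predecessor_id) pairs, where predecessor_id
    is the frame school after which the new school sorts. A new school
    falls in the half-open interval opened by its predecessor, so it is
    added to the sample iff that predecessor was sampled."""
    return [new_id for new_id, pred_id in new_schools if pred_id in sampled_ids]

frame_sample = {"B", "D"}                      # sampled frame schools
new = [("N1", "A"), ("N2", "B"), ("N3", "D")]  # newly identified schools
added = freshened_additions(frame_sample, new)
```

Here N2 and N3 sort immediately after sampled frame schools, so they enter the sample; N1 follows an unsampled school and does not. This gives schools opened after frame construction a selection probability tied to their neighbors’ probabilities.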
Declining response rates are a concern for any research study, and some recent school-based NCES longitudinal studies achieved response rates lower than a desired target of 75 percent. For example, the school response rate for the High School Longitudinal Study of 2009 (HSLS:09) was 56 percent and the school response rate for the Early Childhood Longitudinal Study Kindergarten Class of 2010-11 (ECLS-K:2011) was 63 percent. For the first MGLS:2017 field test, the Item Validation Field Test (IVFT), the overall school participation rate was approximately 25 percent. However, we expect a higher response rate in the Main Study Base-year based on numerous differences between the IVFT and the Main Study. For example, the recruitment window for the IVFT was greatly compressed compared to the planned recruitment window for the Main Study Base-year (3 versus 12 months, respectively). Also, the burden on the schools for the IVFT was substantially higher, because the IVFT included up to all students in sixth, seventh, and eighth grades whereas the Main Study Base-year will target, on average, a yield of 22 students per school.
Nevertheless, to be conservative, the MGLS:2017 Base-year sampling plan is designed to be flexible so that the study can achieve sample size targets even if eligibility and response rates are lower than anticipated. The proposed school sampling process is designed to achieve 900 participating schools (740 public, 80 Catholic, and 80 other private) distributed over 16 school sampling strata. We plan to select 3,710 schools using stratified probability proportional to size sampling, from which an initial simple random sample of 1,236 schools will be selected within school strata. This subset of 1,236 schools will comprise the initial set of schools that will be released for recruitment in January of 2017. The remaining schools will comprise a reserve sample to be released if participation among schools in the initial sample is too low to meet sample size targets. The numbers of participating schools among the 1,236 released schools will be monitored by school stratum and, if the number of participating schools in a given stratum is less than the yield goals for that stratum, then additional schools may be released for that stratum from the reserve set of schools. The reserve sample will be ordered randomly within each stratum and released in waves by strata, as necessary, until there are 900 participating schools. This procedure will enable the study to achieve within-stratum sample size targets, given expected stratum-specific variation in eligibility and participation rates. The desired numbers of participating schools by the margins of the school stratification characteristics are shown in table 1.
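Stratified probability-proportional-to-size selection within a stratum is commonly implemented systematically. The sketch below is illustrative only: the size measures are invented, and certainty selections for units larger than the skip interval are not handled.

```python
import random

def pps_systematic(sizes, n, seed=0):
    """Systematic PPS: walk a cumulative-size list with a random start
    and a fixed skip interval, picking the unit covering each target."""
    total = sum(sizes)
    interval = total / n
    start = random.Random(seed).uniform(0, interval)
    picks, cum, i = [], 0.0, 0
    for k in range(n):
        target = start + k * interval
        while cum + sizes[i] <= target:
            cum += sizes[i]
            i += 1
        picks.append(i)
    return picks

sizes = [0.95, 0.40, 2.10, 0.75, 1.30, 0.50]  # hypothetical school size measures
sample = pps_systematic(sizes, 2)             # larger schools are favored
```

Sorting the frame before selection (as described above) makes systematic PPS implicitly stratify on the sort variables, which is why the locale, configuration, and income sort matters.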
Table 1. Main Study Base-year School Participation Goals, by School Stratification Characteristics
| | | Public | Catholic | Other private | Total |
| Total | | 740 | 80 | 80 | 900 |
| Region | Northeast | 122 | 19 | 16 | 157 |
| | Midwest | 162 | 28 | 15 | 205 |
| | South | 278 | 19 | 33 | 330 |
| | West | 178 | 14 | 16 | 208 |
| Prevalence of students with disabilities | High | 128 | NA | NA | 128 |
| | Low | 612 | 80 | 80 | 772 |
NA: Not applicable. No explicit participation goals are established for Catholic and Other private schools in the High prevalence stratum. Catholic and Other private schools with school grade configurations of 05-08 and 06-08 are classified as Other configuration for the purposes of sampling. Catholic and Other private schools are all classified as Low prevalence for purposes of sampling, as no focal disability counts are available for them.
The 16 school strata along with the corresponding stratum-specific participation goals, frame counts, total selected school sample (n=3,710), initial school sample (n=1,236), and reserve sample (n=2,474) are shown in table 2.
Table 2. Main Study Base-year School Sample Allocation
| School Type | Census Region | Prevalence | Participation Goals | School Frame Count | Total Selected School Sample | Initial School Sample | School Reserve Sample |
| Public | Northeast | High | 17 | 250 | 70 | 23 | 47 |
| Public | Northeast | Low | 105 | 4,721 | 433 | 144 | 289 |
| Public | Midwest | High | 35 | 450 | 144 | 48 | 96 |
| Public | Midwest | Low | 127 | 7,262 | 524 | 175 | 349 |
| Public | South | High | 50 | 613 | 206 | 69 | 137 |
| Public | South | Low | 228 | 9,166 | 940 | 313 | 627 |
| Public | West | High | 26 | 307 | 107 | 36 | 71 |
| Public | West | Low | 152 | 8,196 | 627 | 209 | 418 |
| Catholic | Northeast | Low | 19 | 1,069 | 78 | 26 | 52 |
| Catholic | Midwest | Low | 28 | 1,697 | 115 | 38 | 77 |
| Catholic | South | Low | 19 | 970 | 78 | 26 | 52 |
| Catholic | West | Low | 14 | 770 | 58 | 19 | 39 |
| Other Private | Northeast | Low | 16 | 2,060 | 66 | 22 | 44 |
| Other Private | Midwest | Low | 15 | 2,277 | 62 | 21 | 41 |
| Other Private | South | Low | 33 | 3,768 | 136 | 45 | 91 |
| Other Private | West | Low | 16 | 1,952 | 66 | 22 | 44 |
| Total | | | 900 | 45,528 | 3,710 | 1,236 | 2,474 |
The size measure used for the probability proportional to size selection of 3,710 schools will be constructed using the overall sampling rates for students in the following seven student categories:
Autism (AUT),
Emotional Disturbance (EMN),
Specific Learning Disability (SLD),
Asian, non-Hispanic (non-SLD, non-EMN, non-AUT),
Hispanic (non-SLD, non-EMN, non-AUT),
Black, non-Hispanic (non-SLD, non-EMN, non-AUT), and
Other race, non-Hispanic (non-SLD, non-EMN, non-AUT)
combined with the total number of students in each of those seven categories at a given school. In other words, the size measure for a given school i in school stratum h may be written as follows:
S_hi = Σj f_hj × N_hij
Where f_hj is the sampling rate for the jth student category in the hth school stratum and N_hij is the number of students in the jth category within school i in the hth school stratum. The sampling rate, f_hj, equals the number of students to sample from the jth category in the hth school stratum divided by the number of students in the jth category across all schools in the hth school stratum. The sampling rates for the seven student categories listed above will vary across the school strata; for example, a rate of 0 is used for autistic students at Catholic schools, while an overall rate of .033 is used for autistic students at public schools. The student sampling rates by school strata are provided in table 3. Because private schools do not report focal disability counts to EDFacts, the school sampling process assumes that no students in the focal disability categories are enrolled in private schools. The sampling plan does not rely on sampling focal disability students from private schools in order to achieve the desired number of participating students in each of the three focal disability categories. In practice, however, students in the focal disability categories who are enrolled in sampled private schools will be sampled.
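The size-measure arithmetic can be sketched directly. The rates below are the Table 3 rates for the Public/South/Low prevalence stratum; the per-school student counts are invented for illustration.

```python
# f_hj: Table 3 sampling rates for the Public / South / Low prevalence stratum.
rates = {"AUT": 0.034, "EMN": 0.034, "SLD": 0.005,
         "Asian": 0.009, "Hispanic": 0.005, "Black": 0.003, "Other": 0.005}

# N_hij: hypothetical student counts for one school in that stratum.
counts = {"AUT": 3, "EMN": 2, "SLD": 20, "Asian": 10,
          "Hispanic": 40, "Black": 30, "Other": 60}

# S_hi = sum over categories j of f_hj * N_hij
size_measure = sum(rates[j] * counts[j] for j in rates)
```

Because the size measure is the school’s expected number of sampled students under the stratum rates, selecting schools proportional to it keeps student selection probabilities approximately constant within each student category and stratum.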
Table 3. Aggregate Student Sampling Rates Used for School Selection
| School Type | Census Region | Prevalence | Overall | AUT | EMN | SLD | Asian | Hispanic | Black | Other |
| Public | Northeast | High | 0.009 | 0.037 | 0.064 | 0.004 | 0.008 | 0.004 | 0.003 | 0.005 |
| | Northeast | Low | 0.006 | 0.031 | 0.031 | 0.005 | 0.009 | 0.005 | 0.003 | 0.005 |
| | Midwest | High | 0.010 | 0.040 | 0.080 | 0.004 | 0.010 | 0.005 | 0.005 | 0.005 |
| | Midwest | Low | 0.006 | 0.024 | 0.024 | 0.005 | 0.009 | 0.005 | 0.004 | 0.005 |
| | South | High | 0.009 | 0.042 | 0.102 | 0.004 | 0.006 | 0.003 | 0.003 | 0.005 |
| | South | Low | 0.006 | 0.034 | 0.034 | 0.005 | 0.009 | 0.005 | 0.003 | 0.005 |
| | West | High | 0.009 | 0.049 | 0.118 | 0.004 | 0.008 | 0.004 | 0.003 | 0.005 |
| | West | Low | 0.006 | 0.026 | 0.025 | 0.005 | 0.008 | 0.005 | 0.003 | 0.005 |
| Catholic | Northeast | Low | 0.017 | NA | NA | NA | 0.024 | 0.024 | 0.024 | 0.014 |
| | Midwest | Low | 0.017 | NA | NA | NA | 0.022 | 0.022 | 0.022 | 0.015 |
| | South | Low | 0.016 | NA | NA | NA | 0.026 | 0.027 | 0.027 | 0.011 |
| | West | Low | 0.017 | NA | NA | NA | 0.025 | 0.025 | 0.024 | 0.011 |
| Other private | Northeast | Low | 0.011 | NA | NA | NA | 0.011 | 0.010 | 0.011 | 0.011 |
| | Midwest | Low | 0.009 | NA | NA | NA | 0.010 | 0.009 | 0.009 | 0.009 |
| | South | Low | 0.012 | NA | NA | NA | 0.012 | 0.012 | 0.012 | 0.012 |
| | West | Low | 0.017 | NA | NA | NA | 0.024 | 0.024 | 0.024 | 0.014 |
NA: Not applicable. No student sampling rates are applied for the three focal disability groups at Catholic and Other private schools, as focal disability counts are not available for these schools.
The sampling plan is designed to produce constant weights within each of the seven student domains (autism, specific learning disability, emotional disturbance, Asian non-Hispanic (non-SLD, non-EMN, non-AUT), Hispanic (non-SLD, non-EMN, non-AUT), Black non-Hispanic (non-SLD, non-EMN, non-AUT), and other non-Hispanic (non-SLD, non-EMN, non-AUT)) within each school stratum. When weights are constant within a given student domain and school stratum, there is no increase in the design effect due to unequal weights for estimates produced for the given student domain and school stratum.
Within participating schools, students will be stratified into the seven student categories defined above, and a simple random sample of students will be selected from each student sampling stratum. Approximately 29 students will be sampled from each of the anticipated 900 participating schools. However, the number of students sampled per student stratum will vary by school because the within-school allocation depends on the number of students in each of the seven student sampling strata. The process of determining the student sample allocation follows the procedure outlined in section 2 of Folsom et al. (1987).20
Once schools are selected and recruited, students enrolled in grade 6 will be selected from student rosters that schools will be asked to provide. The student sample sizes were determined by the requirement that at least 782 students in each of the seven student domains21 participate in the second follow-up of MGLS:2017. The 782 requirement was determined by evaluating the minimum required sample size that would be able to measure a relative change of 20 percent in proportions between any pair of the MGLS:2017 study rounds (Main Study Base-year in 2018, first follow-up, and second follow-up). Several assumptions were used to conduct this evaluation, as noted below.
Two-tailed tests with significance of alpha = 0.05 were used to test differences between means and proportions with required power of 80 percent.
A proportion of p = .30 was used to calculate sample sizes for tests of proportion.
Design effect is 2.0.
Correlation between waves is 0.6.
McNemar’s test using Connor’s approximation was used to determine the minimum sample size needed to meet the precision requirement under the previously stated assumptions. The Proc Power procedure available in SAS software22 was used to determine the minimum sample size.
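As a check on the 782 figure, the stated assumptions can be combined with Connor’s (1987) approximation. This is our reconstruction, not the study’s code, and SAS Proc Power’s exact conventions may differ slightly; the joint-probability step assumes the 0.6 correlation applies to the paired binary outcomes.

```python
from math import ceil, sqrt
from statistics import NormalDist

z = NormalDist()
za, zb = z.inv_cdf(0.975), z.inv_cdf(0.80)  # two-tailed alpha 0.05, power 0.80

p1, p2, rho = 0.30, 0.36, 0.6  # p = .30 with a 20 percent relative change
# Joint probability of success in both waves, given correlation rho.
p11 = rho * sqrt(p1 * (1 - p1) * p2 * (1 - p2)) + p1 * p2
p10, p01 = p1 - p11, p2 - p11     # discordant-cell probabilities
psi, delta = p10 + p01, p01 - p10  # discordant total and difference

# Connor's approximation for McNemar's test, then the design effect of 2.0.
n_paired = ((za * sqrt(psi) + zb * sqrt(psi - delta ** 2)) / delta) ** 2
n_required = ceil(n_paired) * 2
```

Under these assumptions the unclustered McNemar requirement is 391 cases, and applying the design effect of 2.0 reproduces the minimum of 782 respondents per student domain.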
The minimum number of students to sample from each of the seven student categories in the 2018 Main Study Base-year, along with the assumptions used to derive those numbers, are provided in table 4.
Estimates of the minimum number of students to sample in the 2018 Main Study Base-year were derived by adjusting the 782 to account for a variety of factors including estimates of student response in grades 6, 7, and 8 as well as other factors, including the extent to which Main Study Base-year participating schools agree to participate in the first and second follow-up studies and the extent to which students are expected to move between schools between grades 6 and 7 and between grades 7 and 8.
Because of different assumptions regarding student response rates and mover rates, the number of grade 6 students to sample varies across the student categories. In order to achieve the required minimum of 5,474 grade 8 respondents (782 in each of the seven student categories), a total of 13,987 students must be sampled in grade 6. Following the assumptions specified in table 4, we estimate that 10,334, or approximately 74 percent, of a sample of 13,987 students would respond in grade 6. This minimal total sample of 13,987 students, however, involves substantial oversampling of students in two of the three focal disability categories and substantial undersampling of Hispanic (non-SLD, non-EMN, non-AUT), Black, non-Hispanic (non-SLD, non-EMN, non-AUT), and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT) students. In order to reduce the impact of disproportionate sampling on national estimates and on estimates that compare or combine student categories, the sample sizes for these three student categories were increased.
Therefore, for the Main Study Base-year the plan is to sample 29 students, on average, within each of 900 participating schools for a total of 26,100 sample students and, assuming the grade 6 eligibility and response rates shown in table 4, to produce approximately 20,322 participating grade 6 students. The distribution of the grade 6 student sample and estimates of the number of participating students in each of grades 6, 7, and 8 are provided in Table 5.
Table 4. Minimum Sample Sizes and Associated Sample Design Assumptions for Student Sampling Categories
| Assumption | Each non-focal disability student category | SLD | AUT | EMN |
| Grade 6 inflated student sample size | 1,509 | 2,455 | 2,748 | 2,748 |
| Grade 6 student eligibility rate | 97% | 97% | 97% | 97% |
| Grade 6 student response rate | 85% | 75% | 67% | 67% |
| Grade 7 inflated student sample size | 1,244 | 1,786 | 1,786 | 1,786 |
| Grade 7 school retention rate | 96% | 96% | 96% | 96% |
| Grade 6 to 7 move rate | 30% | 30% | 30% | 30% |
| Grade 7 mover follow rate | 80% | 100% | 100% | 100% |
| Grade 7 non-mover response rate | 92% | 75% | 75% | 75% |
| Grade 7 mover response rate | 60% | 45% | 45% | 45% |
| Grade 8 inflated student sample size | 941 | 1,132 | 1,132 | 1,132 |
| Grade 8 school retention rate | 96% | 96% | 96% | 96% |
| Grade 7 to 8 move rate | 15% | 15% | 15% | 15% |
| Grade 8 mover follow rate | 80% | 100% | 100% | 100% |
| Grade 8 non-mover response rate | 92% | 75% | 75% | 75% |
| Grade 8 mover response rate | 70% | 55% | 55% | 55% |
| Grade 8 minimum number of respondents | 782 | 782 | 782 | 782 |
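The attrition chain behind Table 4 can be verified with a short sketch; the `next_wave` helper is ours, and the rates are taken from the non-focal disability column of the table.

```python
def next_wave(n, retention, move_rate, follow_rate, resp_stay, resp_move):
    """Expected respondents in the next grade: apply school retention,
    split movers from non-movers, follow movers at follow_rate, and
    apply the separate response rates."""
    retained = n * retention
    movers = retained * move_rate
    stayers = retained - movers
    return stayers * resp_stay + movers * follow_rate * resp_move

g6 = 1509 * 0.97 * 0.85                           # eligibility, then response
g7 = next_wave(g6, 0.96, 0.30, 0.80, 0.92, 0.60)  # grade 7 respondents
g8 = next_wave(g7, 0.96, 0.15, 0.80, 0.92, 0.70)  # grade 8 respondents
```

Rounding g6, g7, and g8 reproduces the 1,244, 941, and 782 figures in the non-focal column, confirming that the table’s inflated sample sizes are simply the respondent counts carried forward through these rates.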
Table 5. Final Student Sample Sizes and Expected Minimum Student Participation by Grade
| Assumption | non-SLD, non-EMN, non-AUT | | | | SLD | AUT | EMN | Total |
| | Hispanic | Asian, non-Hispanic | Black, non-Hispanic | Other race, non-Hispanic | | | | |
| Grade 6 inflated student sample size | 3,786 | 1,509 | 1,868 | 10,986 | 2,455 | 2,748 | 2,748 | 26,100 |
| Grade 6 student eligibility rate | 97% | 97% | 97% | 97% | 97% | 97% | 97% | — |
| Grade 6 student response rate | 85% | 85% | 85% | 85% | 75% | 67% | 67% | — |
| Grade 6 expected participants | 3,122 | 1,244 | 1,540 | 9,058 | 1,786 | 1,786 | 1,786 | 20,322 |
| Grade 7 school retention rate | 96% | 96% | 96% | 96% | 96% | 96% | 96% | — |
| Grade 6 to 7 move rate | 30% | 30% | 30% | 30% | 30% | 30% | 30% | — |
| Grade 7 mover follow rate | 80% | 80% | 80% | 80% | 100% | 100% | 100% | — |
| Grade 7 non-mover response rate | 92% | 92% | 92% | 92% | 75% | 75% | 75% | — |
| Grade 7 mover response rate | 60% | 60% | 60% | 60% | 45% | 45% | 45% | — |
| Grade 7 expected participants | 2,361 | 941 | 1,165 | 6,852 | 1,132 | 1,132 | 1,132 | 14,715 |
| Grade 8 school retention rate | 96% | 96% | 96% | 96% | 96% | 96% | 96% | — |
| Grade 7 to 8 move rate | 15% | 15% | 15% | 15% | 15% | 15% | 15% | — |
| Grade 8 mover follow rate | 80% | 80% | 80% | 80% | 100% | 100% | 100% | — |
| Grade 8 non-mover response rate | 92% | 92% | 92% | 92% | 75% | 75% | 75% | — |
| Grade 8 mover response rate | 70% | 70% | 70% | 70% | 55% | 55% | 55% | — |
| Grade 8 expected participants | 1,963 | 782 | 969 | 5,697 | 782 | 782 | 782 | 11,757 |
Note: SLD=Specific Learning Disability. AUT=Autism. EMN=Emotional Disturbance. The non-focal disability student categories are Asian, non-Hispanic (non-SLD, non-EMN, non-AUT); Hispanic (non-SLD, non-EMN, non-AUT); Black, non-Hispanic (non-SLD, non-EMN, non-AUT); and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT).
MGLS:2017 will rely on a set of complementary instruments to collect data from several types of respondents, providing information on the outcomes, experiences, and perspectives of students; their families and home lives; their teachers, classrooms, and instruction; and the school settings, programs, and services available to them. These instruments will be used when the students are in grades 6, 7, and 8 to allow for the analysis of change and growth across time. At each round of data collection in the Main Study, students’ mathematics and reading skills, socioemotional development, and executive function will be assessed. Students will also complete a survey that asks about their engagement in school, out-of-school experiences, peer group relationships, and identity development. Parents will be asked, through an online survey or over the telephone, about their background, family resources, and involvement with their child’s education and school. Students’ mathematics teachers will complete a two-part survey: in part 1, they will be asked about their background and classroom instruction; in part 2, they will be asked to report on the academic behavior, mathematics performance, and classroom conduct of each study child in their classroom(s). For students receiving special education services, their special education teacher or provider will also complete a survey similar in structure to the two-part mathematics teacher instrument, consisting of a teacher-level questionnaire and a student-level questionnaire, but with questions specific to the special education experiences of, and services received by, the study child. School administrators will be asked to report on school programs and services, as well as on school climate. Finally, a facilities observation checklist, consisting of questions about the school buildings, classrooms, and campus security, will be completed by field data collection staff.
The school recruitment approach for the OFT was fully described in the appended Part B (OMB# 1850-0911 v. 10). Below, the methodological descriptions focus on collecting OFT data, recruitment of Main Study Base-year school districts and schools, and tracking and recruitment of OFT students in preparation for a planned follow-up collection with the OFT sample (OFT2), for which OMB approval will be sought in a later request.
Operational Field Test (OFT) Data Collection Approach
The OFT will include student surveys and direct assessments, as well as surveys for students’ parents, math teachers, special education teachers (as applicable), and school administrators. The student surveys and direct assessments will take place in the school setting and be administered using Chromebooks, tablet-like computers with touchscreen capability and attached keyboards, brought into the school by the study. This portion of data collection is referred to as the student session. In order to complete the survey and direct assessment, study staff will work with schools to identify and utilize locations for administration that minimize distractions for the student and disruption to the school routine. The parent, mathematics teacher, special education teacher, and school administrator surveys will have an internet option, a telephone option, and a paper-and-pencil option, so respondents will have the choice to complete the survey in a variety of school and non-school settings. Initially, the surveys will be released in internet-based form. In order to access the internet-based surveys, parents, teachers, and school administrators will receive an email with links and instructions for completing the survey (described in more detail below).
Planning for the Data Collection Visit. During the recruitment of the school, a school staff person will be identified as a school coordinator for the study. This person provides the student roster in the fall or winter before the student session to facilitate student sampling. About four weeks prior to the scheduled student session, the school coordinator receives the list of students sampled for the study and copies of the parental permission forms to send home with the students.
A trained test administrator (TA) will work with the school coordinator to verify that the students selected for the sample are still enrolled, and, if not already provided, to identify each student’s mathematics teacher and, if applicable, the student’s special education teacher or person who provides special education services. The TA will also work with the school coordinator to establish the following:
The schedule for data collection (i.e., days the study will be collecting data in the school, start time and end time of the school day, bell schedule for the transition between classes, and window of time during which students will be assessed during the school day);
Any accommodations that may be needed for students, particularly those with IEPs;
A location in the school setting to accommodate the data collection (i.e., determining the optimal space for the study);
A plan for distributing permission forms and tracking response;
The required logistical information for field staff entering the school (e.g., security and parking procedures); and
The school’s preferred technique for having students arrive at and return from the study space (e.g., this may involve field staff going to classrooms to collect students, students being notified by their teacher to report to the study space, and/or students returning to a specific class on their own when finished with the assessment and survey).
The TA will visit the school prior to the student session to ensure that the logistics are arranged in preparation for a successful data collection. While at the school, the TA will complete a school facilities observation checklist (estimated to take the TA 45 minutes on average to complete). Completion of this instrument does not require the involvement of any school staff, though, if required by the school, a school staff person may accompany the TA while the checklist is being completed. The TA will complete the checklist either online or using a paper-and-pencil copy of the instrument. The facilities checklist gathers information such as: the general upkeep of the school, including the presence of graffiti, trash, or broken windows (inside classrooms, entrance and hallways, and restrooms); noise level; security measures; and the general condition of the neighborhood/area around the school (Appendix OFT1-V). This checklist may be completed during the pre-session visit and/or on the day of the student session.
Student Survey and Assessments. Student surveys and direct assessments will be administered in 90-minute group sessions during the school day. The TA will be responsible for administering the student surveys and direct assessments. A test administrator assistant (TAA) may accompany the TA to help with equipment setup and also to conduct a second session if more than one session is scheduled concurrently. The student survey and direct assessment data collection will generally be carried out as follows:
The TA (and TAA, if applicable) will arrive at the school on the appointed day of assessments 90 minutes prior to when the first assessment session begins, following all school security protocols for entering the school, and seeking out the school coordinator who will be expecting the study team field staff per the arrangements made during the planning of the data collection visit;
The TA (and TAA, if applicable) will be escorted by school staff to the designated location for the surveys and direct assessments;
The TA (and TAA, if applicable) will set up the survey and assessment space, verifying that the tablet computers are in working order and setting them to the appropriate start screen; and
Once students arrive in the designated assessment space, the TA will read a script to introduce the survey and direct assessments, and help the students log in to begin. If required by the school, the students will be asked to sign an assent form (appendix OFT1-D4). Otherwise, a survey script that provides information about the study and the students’ participation will be read to the group of student participants by the TA. Alternatively, an audio-recording of the script may be used (appendix OFT1-D3).
Accommodations will be provided to students who need them to the greatest extent possible. As previously mentioned, the TA will work with the school coordinator to determine any accommodations that may need to be provided. Possible accommodations include, but are not limited to, small group sessions, one-on-one sessions, having an aide present, and read-alouds. For students who need read-alouds, the student will wear headphones and listen to an audio-recorded version of the instruments. Alternatively, a TA or TAA trained in administering read-alouds will use a provided script to guide the session as he or she reads the instruments aloud to the student. Students who require read-alouds will not participate in the reading assessment.
English Language Learners will be able to participate in certain aspects of the study. English Language Learners who participate in English-language state assessments will be offered the full complement of student instruments. English Language Learners whose primary language is not Spanish and who do not participate in state assessments will be excluded from the 6th-grade student session; however, data will be collected from their parents, teachers, and school administrators. English Language Learners whose primary language is Spanish and who do not participate in state assessments will be administered the Spanish-translated version of the student survey and the executive function instrument; data will be collected from their parents, teachers, and school administrators.
Parent Questionnaire. The parent questionnaire is expected to take an average of 40 minutes to complete and will feature a multi-mode approach, with self-administered internet-based questionnaires and a telephone interview follow-up for respondents not completing the questionnaire online. A follow-up, paper-and-pencil version will also be offered.
Parents will have been informed of their participation in the study and of any incentive they might receive at the time of student recruitment (see section A.9 for information on incentives), and parent contact information will have been collected through the school or on the consent form (appendix OFT1-F). The parent data collection will generally be carried out as explained below.
Parent respondents will receive a letter and/or an email (appendices OFT1-G) that announces the launch of the survey and provides a link to the online instrument.
Upon completion of the survey, parents will receive a thank you letter and incentive.
For nonresponding parents, follow-ups will include reminder emails, letters, or postcards with information repeating the instructions on how to access the survey. Emails will be sent approximately every 10 days and letters will be sent approximately every 3 weeks.
For parents who have not responded after 2 weeks, telephone calls will be initiated to encourage participation. The study team interviewer placing the telephone call may offer to complete the survey with the parent over the phone at that moment or schedule a time to follow up with the parent and complete the survey later. A paper-and-pencil version will also be offered during these follow-ups.
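The nonresponse follow-up cadence described above (reminder emails roughly every 10 days, reminder letters roughly every 3 weeks, and telephone follow-up beginning after 2 weeks) can be sketched as a simple schedule generator. This is an illustrative sketch only; the function name and the fixed 6-week window are assumptions, and the actual follow-up operation would be driven by case-management systems.

```python
from datetime import date, timedelta

def followup_schedule(launch: date, weeks: int = 6):
    """Sketch of the parent nonresponse follow-up cadence: emails roughly
    every 10 days, letters roughly every 3 weeks, and telephone follow-up
    beginning after 2 weeks. Intervals are approximations from the plan,
    not exact study specifications."""
    end = launch + timedelta(weeks=weeks)
    events = []
    d = launch + timedelta(days=10)
    while d <= end:  # reminder emails ~every 10 days
        events.append((d, "reminder email"))
        d += timedelta(days=10)
    d = launch + timedelta(weeks=3)
    while d <= end:  # reminder letters ~every 3 weeks
        events.append((d, "reminder letter"))
        d += timedelta(weeks=3)
    # telephone prompting begins ~2 weeks after launch
    events.append((launch + timedelta(weeks=2), "begin telephone follow-up"))
    return sorted(events)
```

A case that completes the survey at any point would simply be dropped from the remaining scheduled contacts.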
Mathematics Teacher and Special Education Teacher Surveys. The mathematics teacher and special education teacher surveys are internet-based, self-administered surveys with a paper-and-pencil option available. Each survey consists of two parts: part 1, a teacher-level questionnaire, and part 2, a series of teacher student reports (TSRs). For mathematics teachers, part 1 (teacher survey) is expected to take approximately 20 minutes to complete, and part 2 (teacher student report) is expected to take approximately 10 minutes to complete for each student. For special education teachers, part 1 (teacher survey) is expected to take approximately 10 minutes to complete, and part 2 (teacher student report) is expected to take approximately 25 minutes to complete for each student. The mathematics and special education teacher survey data collection will generally be carried out as follows.
Each sampled student’s mathematics teacher and, for students identified as having an IEP, the student’s special education teacher or the person who provides special education services will receive a letter and/or email (appendix OFT1-E) announcing the launch of the survey and providing a link to the survey.
Upon completion of the survey, teachers will receive a thank you letter and incentive.
While at the school to conduct the student sessions, TAs will leave hand-written notes in the teachers’ mailboxes reminding them to complete their survey and thanking them if they have already participated.
For nonresponding mathematics or special education teachers, follow-ups will include reminder emails, letters, or postcards with information repeating instructions on how to access the survey. Emails will be sent approximately every 10 days and letters will be sent approximately every 3 weeks.
For teachers who have not completed their internet-based surveys after approximately 3 weeks, additional follow-ups will be used, including but not limited to, a telephone call encouraging teachers to complete their surveys or an offer to provide a paper-and-pencil version of the survey.
School Administrator Questionnaire. The school administrator questionnaire will be web-based and self-administered, with a paper-and-pencil option available. It will take the administrator (generally, the principal or principal’s designee) approximately 40 minutes to complete. The school administrator data collection will generally be carried out as follows.
School administrators will receive a letter and/or an email announcing the launch of the survey with a link to the survey.
Upon completion of the survey, school administrators will receive a thank you email (appendix OFT1-D2) including information about how the school will receive its incentive.
While at the school to conduct the student sessions, TAs will ask to meet with the school administrator to thank him or her for the school’s participation and remind the administrator to complete the survey if he or she has not done so already. TAs unable to meet with the school administrator personally will leave hand-written notes in the school administrator’s mailbox reminding the administrator to complete the survey or thanking him or her if he or she has already participated.
For nonresponding school administrators, follow-ups will include reminder emails, letters, or postcards with information repeating instructions on how to access the survey. Emails will be sent approximately every 10 days and letters will be sent approximately every 3 weeks.
For school administrators who have not completed their internet-based survey after approximately 3 weeks, additional follow-ups will be used, including but not limited to, a telephone call encouraging school administrators to complete their survey or an offer to provide a paper-and-pencil version of the survey.
Main Study Base-year School Recruitment Approach
Gaining schools’ cooperation in voluntary research is increasingly challenging. For example, in 1998–99 the Early Childhood Longitudinal Study had a weighted school-level response rate of 74 percent,[23] whereas 12 years later, the complementary ECLS-K:2011 study had a weighted school-level response rate of 63 percent.[24] Additionally, there is evidence that response rates may be lower for schools that serve older students, as in the High School Longitudinal Study of 2009, which had a weighted school-level response rate of 56 percent.[25] Therefore, effective strategies for gaining the cooperation of schools are of paramount importance. Recruitment activities will commence one year prior to the start of data collection. Attempts will be made to solicit a letter of endorsement from the respective state education agencies to include in recruitment materials sent to districts and schools. Schools will then be recruited both directly and at the district level.
State Endorsement. To encourage district and school participation in the study, their respective state education agencies will be contacted to inform them about the study and to request a letter of endorsement (appendix MS1-B). The state testing coordinator and, where applicable, the middle grades coordinator at the state level will be copied on the state letter. Within 3 days of sending the letter to the state, senior recruitment staff will contact the state superintendent, state testing coordinator, and, where applicable, the middle grades coordinator to discuss and secure support for the study. Endorsement letters received by the state will be included in all mailings to districts and schools within the state.
School Districts and Diocesan Recruitment. After state contacts have been completed, whether or not an endorsement letter is received, school districts of sample public schools and dioceses of sample Catholic schools (if district or diocese affiliation exists) will receive a mailing about the study. The district introductory information packet will include a cover letter (appendix MS1-C), a colorful recruitment-oriented brochure (appendix MS1-H), and a sheet of Frequently Asked Questions (FAQs) about the study (appendix MS1-I). Three days after mail delivery of the packet, a recruiter will make a call to secure the district’s cooperation and answer any questions the superintendent or other district staff may have. The staff person working with us from the school district will be asked to sign an affidavit of nondisclosure prior to receiving the list of schools sampled in the district. Once the signed nondisclosure affidavit is received, we will discuss the sampled schools, confirm key information about the schools (e.g., grades served, size of enrollment, enrollment of students with disabilities), and discuss obtaining the students’ IEP information that is necessary for drawing the Main Study Base-year student sample. Information collected during this call will be used to confirm which schools in the district are eligible for participation in the study, and to obtain contact and other information helpful in school recruitment.
The study staff will be prepared to respond to requirements such as research applications or meetings to provide more information about the study. If a district chooses not to participate, the recruiter will document all concerns listed by the district so that a strategy can be formulated for refusal conversion attempts.
In addition to obtaining permission to contact the selected schools, districts will also be asked about the best way to gather student rosters to identify students in the three focal disability categories at the selected school(s) in the district to enable MGLS:2017 staff to recruit enough students in the three focal disability categories.
Recruitment of Public and Catholic Schools. Upon receipt of district or diocesan approval to contact the sample public or Catholic schools, respectively, an introductory information packet will be sent via overnight express courier that includes a cover letter (appendix MS1-D) and the same colorful recruitment-oriented brochure (appendix MS1-H) and sheet of Frequently Asked Questions (FAQs) about the study (appendix MS1-I) that were sent to school districts and dioceses, with links for accessing the MGLS:2017 recruitment website. Three business days after the information packet delivery (confirmed via package tracking), a school recruiter will follow up with a phone call to secure the school’s cooperation and answer any questions the school may have. During this call, the recruiter will establish who from the school’s staff will serve as the school coordinator for the study. In the fall of 2017, the MGLS:2017 study team will then work with the school coordinator to schedule Main Study Base-year activities at the school, including gathering student rosters, distributing consent materials to parents of sample students, and arranging the onsite assessments. In early communications, the recruiter will also gather information about what type of parental consent procedures need to be followed at the school; any requirements for collecting data on students’ IEP status and on students’ teacher and math course information; hours of operation, including early dismissal days, school closures/vacations, and dates for standardized testing; and any other considerations that may impact the scheduling of student assessments (e.g., planned construction periods, school reconfiguration, or planned changes in leadership). The study recruitment team will meet regularly to discuss recruitment issues and develop strategies for refusal conversion on a school-by-school basis.
Private and Charter School Recruitment. If a private or charter school selected for the Main Study Base-year operates under a higher-level governing body such as a diocese, a consortium of private schools, or a charter school district, we will use the district-level recruitment approach with the appropriate higher-level governing body. If a private or charter school selected for the Main Study Base-year does not have a higher-level governing body, the school recruitment approach outlined above will be used.
Collection of Student Rosters. Once a school or district has agreed to participate, the data collection staff will gather student rosters from the district or, with the assistance of the school coordinator, from the school. A complete roster of all students eligible for sampling will be requested, along with information on key student characteristics, such as: name; school or district ID number; month and year of birth; grade level; gender; race/ethnicity; and IEP status with disability code(s), when applicable. Each of these characteristics is important for sampling purposes, but for schools that are unable to provide all of the information, we will work with them to obtain whatever key information is available. Based on this information, the student sample will be drawn. As part of the roster collection, the study will also request from the school coordinator or designated district personnel the following information for each student eligible for sampling: the student’s parent and/or guardian contact information (e.g., mailing address; landline phone number; cell phone number; e-mail address); the student’s math teacher (including course name and period or section number); and the student’s special education teacher, when applicable. Schools and districts usually find it easiest, and therefore most efficient, to supply all of the desired information one time for all of their students. However, should it be problematic for any school or district to provide the parent and teacher information on the complete roster, the data collection team will gather that information as a second step for the sampled students only.
If the school and/or district is unwilling to provide parent contact information for the sampled students, the team will work with the school and/or district to determine the best way to contact parents (e.g., the school coordinator or designated district personnel would facilitate contacting parents and/or mail the required materials to parents using the contact information they have on file).
Schools and districts will be provided with a template and secure transfer options to deliver the rosters (see appendices MS1-S and MS1-T for student rostering materials). The data quality of the student rosters will then be evaluated by:
reviewing and assessing the quality and robustness of student and parent information available at each school, including contact information for parents;
reviewing and assessing the quality of the data on student-teacher linkages;
reviewing and assessing the quality of the data on IEP status;
addressing any incompleteness or irregularities in the roster file;
requesting additional information as needed from the school coordinator or designated district personnel; and
(re)verifying that the sampled students are currently in attendance in the school.
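The review steps above amount to a record-level completeness check on the roster file. A minimal sketch of such a check is shown below; the field names, the parent-contact fields, and the IEP flag are illustrative assumptions, not the study's actual file layout.

```python
# Hypothetical roster quality review. Field names are illustrative
# assumptions only; the MGLS:2017 roster template may differ.
REQUIRED = ["name", "student_id", "birth_month_year", "grade",
            "gender", "race_ethnicity"]

def review_roster(rows):
    """Flag roster records (list of dicts) that are missing key sampling
    fields, parent contact information, or disability codes for students
    flagged as having an IEP."""
    issues = []
    for i, row in enumerate(rows):
        missing = [f for f in REQUIRED if not row.get(f)]
        if missing:
            issues.append((i, "missing fields: " + ", ".join(missing)))
        # at least one way to reach the parent/guardian
        if not (row.get("parent_phone") or row.get("parent_email")
                or row.get("parent_address")):
            issues.append((i, "no parent contact information"))
        # IEP status should carry a disability code
        if row.get("iep") and not row.get("disability_codes"):
            issues.append((i, "IEP flagged but no disability code"))
    return issues
```

Flagged records would then be resolved with the school coordinator or designated district personnel, per the steps above.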
Parent Recruitment. Information about schools’ procedures for obtaining consent for students to participate in the study will have been gathered during school recruitment. Schools generally require one of two types of consent: implicit or explicit (appendix MS1-F). Both types of consent require that parents be notified that their children have been selected for the study. With implicit consent, the school does not require verbal or written consent for a student to participate in the study; parents are asked only to notify the appropriate person if they do not want their child to participate. With explicit consent, children may participate only if their parents provide written or oral consent for them to do so. In the Main Study Base-year, proactive parent recruitment will be focused on maximizing the number of parents (1) returning signed explicit consent forms and (2) completing the parent survey. Because implicit consent does not require a verbal or written response from parents, these parents will not be contacted about consent forms.
After the student sample is drawn within a school, the initial communication with parents consisting of introductory and consent materials will be distributed to parents in the way each school believes to be most appropriate and effective (e.g., sending the materials home with students; the school or district sending the materials directly to parents; and/or trained MGLS:2017 recruitment staff contacting parents directly by mail, email, and/or phone). The initial materials will introduce the study, explain the study’s purpose and the importance of student and parent participation, describe what is involved in participation, and specify the consent procedure that is being used by their school. The materials will include a consent-seeking letter to all parents plus a consent form of the type specified by the school (appendix MS1-F), a colorful recruitment-oriented brochure (appendix MS1-H), and a sheet of FAQs about the study (appendix MS1-I) with links for accessing the MGLS:2017 recruitment website (website text in appendix MS1-J). Additionally, in schools using explicit consent, the parental consent form for the student’s participation, which will be included in the initial communication materials, will ask parents to provide their contact information.
Parents will also receive an invitation to participate in the parent questionnaire (appendix MS1-G). Parent data collection will entail web-based self-administration with nonresponse follow-up by computer-assisted telephone interviewing or the option for completing a paper-and-pencil version.
OFT Tracking
Critical to the longitudinal nature of the study is sample member tracking. Tracking will occur for those students in the sample for whom data were collected from the student, parent, math teacher, or special education teacher during the grade 6 collection. The planned three-tiered approach to tracking the MGLS:2017 OFT sample will include an enrollment status update at the school-level, panel maintenance activities with parents, and database tracing.
School Enrollment Status Update. The purpose of the school enrollment status update is to check the enrollment status of the sampled students in each MGLS:2017 Operational Field Test (OFT) school one year after the initial OFT data collection. We anticipate that many of the students will continue to be enrolled in the school they attended during the initial OFT (referred to as the OFT base-year school), but a subset will have transferred to a new school, started home schooling, or graduated and moved on to a higher-level school (e.g., from elementary school to middle school or junior high school). OFT schools will be asked to provide enrollment status information in the fall of 2017 in advance of a planned follow-up data collection in the winter/spring of 2018. Collecting this information is necessary to maintain current records and to gather the information from the school while that information is still available.
The schools that participated in the 2017 OFT will be asked to review the list of eligible sampled students from the base-year OFT. We expect that many students will remain at the base-year OFT school, but for those who have left the school, schools will be asked to provide the students’ last date of attendance, current school status (transfer, home schooling, etc.), last known address and phone number, and, for transfer students, the name, city, and state of the student’s new school if they are known. We anticipate that it will take 20 minutes, on average, to provide this information through a secure website set up for this purpose.
To initiate this contact, the school principal from each school will receive a lead letter that explains the purpose of the planned follow-up field test and that includes a user name, password, and secure website address. Appendix OFT2-B contains the letter to be sent to sampled schools. The letter will prompt the principal or designee to log into the study website. Upon logging in, the principal or designee must confirm he or she is the intended recipient of the letter by answering an identification verification question, such as “What is the school phone number?”, and then reset the password for the account. There is no access to any information until the password is reset using a strong password. A test of the password’s strength is built into the password change application. The users then proceed to a screen where they verify the current enrollment of sampled students and provide any updated information they may have on MGLS:2017 students who are no longer enrolled. Appendix OFT2-C includes the instructions to users, and Appendix OFT2-D provides sample screenshots of the enrollment list update application.
If a user has to stop and continue updating later, he or she must log in again using the newly created password. If the user forgets the new password, he or she must contact the MGLS:2017 help desk to have it reset.
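The document does not specify the rules used by the built-in password-strength test; a minimal sketch of one plausible rule set (length plus character-class requirements, an assumption on our part) is:

```python
import re

def is_strong_password(pw: str) -> bool:
    """Illustrative strength test: at least 8 characters, with upper- and
    lowercase letters, a digit, and a non-alphanumeric character. The
    actual rules in the enrollment update application are not specified
    in this document."""
    return (len(pw) >= 8
            and re.search(r"[A-Z]", pw) is not None
            and re.search(r"[a-z]", pw) is not None
            and re.search(r"\d", pw) is not None
            and re.search(r"[^A-Za-z0-9]", pw) is not None)
```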
A follow-up email will be sent two weeks after the lead letter to all non-respondents. School Enrollment List Update non-respondents will be categorized into two groups:
Group One: Have not changed their password or initiated the process at all – they will receive an email with the same study ID, password, and URL prompting them to change the password and initiate the enrollment update process, just like the letter.
Group Two: Have started the update but have not "submitted" it – they will get an email prompting them to continue and reminding them that if they have forgotten their password, they can contact the help desk to have it reset.
After the two-week period, the recruitment team will begin to contact the school via telephone to follow up on the enrollment status update and to begin to coordinate the logistics of the in-school student data collection for the sampled students who remain at the school. The OFT First Follow-up (OFT2) data collection is scheduled to begin in January 2018.
As the enrollment status updates are received and processed, students who are no longer attending the base-year school will be identified. Destination schools will be contacted if four or more study students have enrolled at the school. Appendices OFT2-F through OFT2-H provide the communication materials that will be sent to the school districts and schools that are newly identified for the study. If fewer than four students transfer to a particular school, or if a student becomes homeschooled, attends a virtual school, or is otherwise not enrolled at school, those students will be contacted separately to participate via the Web.
The OFT2 data collection is scheduled to begin in January 2018. Approval for the data collection protocols will be requested in a separate package. The letter to the school administrator to initiate the tracking activities (Appendix OFT2-B1) describes the upcoming OFT2 data collection, which will consist of a 75-minute student survey and assessment and a 40-minute school administrator survey.
Parent/Student Address Update. In addition to the school-level update, we plan to directly contact the parents of eligible sampled students during later rounds of the OFT to update our address database. A mailing (OFT2-I) will be sent to the parent or guardian of each sample student asking that the parent or guardian log onto our website and update their contact information. If we have an email address for the parent, the materials will be sent via email as well (OFT2-J). For data security reasons, no personally identifiable information will be preloaded onto the website for this address update. In addition to updating contact information, parents will be asked if their child will be at the same school that he/she attended in the spring of 2017, or if his/her school enrollment status has changed. This provides two sources of information in preparation for the OFT2 collection. The address update will take approximately 5 minutes to complete. See appendix OFT2-K for an example of what information will be on the website for the parent to update. To maximize response, a hardcopy version (OFT2-L) of the same form will be sent to non-respondents 3 weeks after the mailing with the address update website is sent. An email reminder will be sent at this time as well.
Tracing. Batch tracing will be conducted about 30 days prior to the start of the OFT2 data collection. Batch database searches will be used to confirm or update parents’ contact information, conserving resources for the data collection activities. Throughout the data collection period, for parents whom we are unable to reach due to a lack of, or out-of-date, contact information, RTI tracing specialists will use proprietary databases to conduct intensive tracing activities. A locator database will be maintained for the study, and all newly identified information will be loaded into the locator database regularly to be used for current and future data collection efforts.
OFT Data Collection
The data collection plan approaches the school as a community. We aim to establish rapport with the whole community—principals, teachers, parents, and students. The school community must be approached with respect and sensitivity to achieve high levels of participation. In addition to sound data collection approaches, the study will also offer monetary and nonmonetary incentives to schools (described in Part A, Section A.9), which have proven in other NCES studies to increase school participation rates. Along with offering incentives, our plan for maximizing district, school administrator, and parent engagement and increasing response rates for the OFT data collection includes various strategies, described below.
The data collection plan attempts to minimize the time that any one student, parent, or teacher will be asked to spend completing survey instruments. For example, the student survey and direct assessment were designed to take approximately 90 minutes per student. The parent interview was designed to take an average of 40 minutes. The mathematics teacher survey was designed to take approximately 20 minutes for the teacher-level questions and approximately 10 minutes (per student) for the teacher-reported questions about students. The special education teacher survey was designed to take approximately 10 minutes for the teacher-level questions and approximately 25 minutes (per student) for the teacher-reported questions about students. The items considered for inclusion in the surveys were reviewed to ensure that only items that functioned well and that are needed to address important research questions were included; repetitious questions were excluded. For the assessments, the number of items was kept to the minimum needed to measure knowledge and skills in each domain with sufficient precision.
Internet-based surveys and other computer-assisted methods will be used to collect data from parents, teachers, and school administrators, while offering alternative modes for nonrespondents so that respondents can complete the survey in the mode that is most convenient for them or with which they feel most comfortable. The OFT will provide NCES with important information on the percent of parents who choose to respond using the different modes, item completion rates in each mode, and the level of effort that is required to obtain a response.
Use experienced test administrators. The test administration team will include staff with established records of successfully conducting data collection and/or working in schools. Experienced test administrators understand how to relate to school staff and students and meet their needs while still effectively and accurately maintaining the integrity of the study design. Test administrators will demonstrate flexibility in working with the school and assume as much of the burden as possible while conducting the student sessions. These good faith actions on the part of the test administrators help to maximize response from schools that may have limited time and resources to coordinate the effort.
An incentive will be offered to schools, parents, and teachers to encourage their participation and thank them for their time (for more information on incentives see Part A, Section A.9).
A variety of communication materials (advance letters, email invitations, a study summary, and follow-up letters) will be sent to respondents to communicate the importance of the study and of their participation, and how their participation will inform education policy for the future. Providing multiple types of materials and sending them via multiple modes increases the chance that the intended recipient receives and reads the materials. For example, some individuals may be more likely to open an email than an envelope received via U.S. Mail, or vice versa. Some may be more likely to respond to a visually pleasing study summary as opposed to a formal letter, or vice versa. The variety of contact materials will help maximize the coverage of contact, which in turn will maximize response.
Contact will be maintained with respondents using various approaches and through multiple attempts. By staying in contact with reluctant respondents to determine their primary reasons for not responding, the data collection staff can be flexible and proactive. Direct contact with respondents by phone after unsuccessful email and hardcopy invitations can often break through resistance and help to increase cooperation and response rates. Experience with each of these modes during the OFT will help to inform the design and data collection protocol for the Main Study.
Main Study Base-year Recruitment
Maximizing School Participation. District and school participation rates in school-based studies have been declining steadily over time. District and school personnel understand the value of the research but have many reasons for refusing to participate in these voluntary studies, which impose considerable burden on their part. Studies increasingly experience challenges in obtaining the cooperation of districts and schools. Loss of instructional time, competing demands (such as district and state testing requirements), lack of teacher and parent support, and increased demands on principals impede gaining permission to conduct research in schools. MGLS:2017 recruitment teams will be trained to communicate clearly to districts, dioceses, private school organizations, schools, teachers, parents, and students the benefits of participating in the Main Study and what participation will require in terms of student and school personnel time. MGLS:2017 staff will use conferences to inform middle grades professionals about the study and to increase MGLS:2017 name recognition.
As part of the strategy to maximize response rates among school districts and schools during the recruitment process, we have established partnerships with organizations such as the Association for Middle Level Education (AMLE), the National Forum to Accelerate Middle-Grades Reform (the Forum), the National Center for Education, Research and Technology (NCERT), and the School Superintendents Association (AASA). These organizations will actively promote the value of the study to their constituencies, as will a number of middle-grades education researchers who will participate in the recruitment effort.
Representatives from these organizations have committed to provide outreach to the middle grades community in general via information in newsletters and related communications. These communications will include information about the importance of the study, what study participation entails, and the benefits of the study to the middle grades community.
We have initiated a communication and outreach approach to publicize MGLS:2017 in public settings, particularly conferences. This has included recent presentations at two NCERT meetings and the AASA National Conference on Education, and has enabled us to connect with local and state education officials to raise awareness and to engage their direct support of the study with their constituencies (e.g., endorsement letters; direct contact with their staff). It has also enabled us to meet one-on-one with some state and district officials who are part of the main study sample.
As part of this broad communication effort, we are conducting webinars that are open to the public and that are being publicized by AMLE, AASA, and NCERT, as well as by other interested parties. These webinars describe the importance of the study, provide some information about participation, and allow attendees to ask questions. The webinars will be conducted from April 2017 through the end of the main study data collection period, as needed. While open to the public, they may include specific invitations to district and school personnel. The webinar materials are included in Appendix MS1-J2.
We are also implementing the following enhancements for Main Study recruitment to encourage participation:
The in-school data collection will begin on January 9, 2018 for schools using implicit permission and January 16, 2018 for schools requiring written consent. In many schools/districts the January dates avoid mandatory testing, among other spring term activities (roughly one-quarter of the OFT participating schools selected January sessions, even with OFT data collection starting on January 24).
Main study data collection will be conducted from January through July 2018. In-school student data collection will take place from January through June 2018, and staff and parent survey collection from January through July 2018. The inclusion of June 2018 as part of the available dates for in-school sessions will enable some schools to participate after their high-stakes testing is finished. Staff and parent surveys will continue through July 2018 to allow them sufficient time to respond, given that teacher and parent lists are submitted on a flow basis throughout the in-school data collection period.
Based on the results of the IVFT and OFT school-incentive experiments, we will offer each school (if allowed by the district/school) a choice of $400 as a check, a gift card, or goods/services for participation (the IVFT and OFT had three treatment conditions for the experiment: $200; $400; or the $400 equivalent in goods/services).
To provide a tangible connection between the school’s participation and study findings and to respond to districts’/schools’ desire for data, we will also offer each school a report reflecting its aggregated MGLS:2017 assessment results as compared to national and sub-national results (where possible).
We will offer personnel of participating schools and districts training in analyzing and learning from MGLS:2017 data (to take place after data collection ends) as a professional development and continuing education opportunity.
We have included in this submission minor revisions to our communication materials to emphasize more explicitly the value and uniqueness of this middle-grades study and what may be learned as a result. We also make reference to what we give back to districts and schools (e.g., school-level reports).
To encourage submission of parental consent forms in schools requiring explicit consent for student participation and to engender goodwill and enthusiasm with the school, we will offer the students an in-school pizza party (or other food provision per school’s preference) to motivate returning the consent forms. Such an offer has the potential to reduce burden on the school staff while increasing student participation. We found that districts and schools with explicit consent requirements are sometimes hesitant to participate, anticipating low student participation, and that an incentive to students for returning the form can boost participation and alleviate those concerns.
Students will be using earbuds to complete the audio portion of the student assessment. Students will be allowed to keep the earbuds after participation, in addition to the already approved token incentives (e.g., keychain, pen). These tokens are mentioned in the parent consent materials.
We will conduct in-person recruitment visits to a limited and targeted subset of districts and schools to encourage their participation and explain the value of the study.
Recruiters will be trained to address concerns that districts and schools may have about participation, while simultaneously communicating the value of the study and the school’s key role in developing instruments that ensure high-quality data focusing on middle-grade students. Early engagement of districts and school administrators will be important. Along with what is described above, our plan for maximizing district, school administrator, and parent engagement includes the following:
More Intensive Refusal Conversion. The MGLS:2017 Main Study sampling plan is designed to achieve 900 participating schools distributed across 16 sampling strata. A sample of 3,710 schools was selected for MGLS:2017, and under the assumption that 70 percent of schools would agree to participate, a subsample of 1,236 of those 3,710 schools was selected for initial recruitment. The 2,474 schools not selected for initial recruitment constitute a reserve from which additional schools will be selected for recruitment if the desired number of participating schools is not achieved in one or more of the 16 sampling strata.
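As an illustrative sketch of this reserve-release arithmetic (the `reserve_release` helper and the example stratum numbers are assumptions for illustration, not part of the study's actual procedures; only the overall totals come from the text):

```python
import math

# Figures from the sampling plan described above.
TOTAL_SAMPLED = 3710
INITIAL_RECRUITMENT = 1236
RESERVE = TOTAL_SAMPLED - INITIAL_RECRUITMENT  # 2,474 reserve schools

def reserve_release(target, participating, assumed_rate=0.70):
    """Hypothetical helper: how many reserve schools to release in one
    stratum, assuming the same 70 percent participation rate used to
    size the initial subsample. Rounds up so the stratum is not left short."""
    shortfall = max(target - participating, 0)
    return math.ceil(shortfall / assumed_rate)

# Example: a stratum 14 schools short of its target would release
# ceil(14 / 0.70) = 20 reserve schools.
```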
A more intensive refusal conversion strategy will be employed during the recruitment of the initial set of 1,236 sampled schools to reduce the degree to which sampling of reserve schools is required and to help obtain the target number of students in the study’s focal disability groups set for the base year. The set of pending refusal schools will be reviewed and a subset of districts and schools will be selected to receive interventions designed to increase the likelihood of their participation.
During our recently completed Operational Field Test (OFT), school-level response was lower for schools classified as having a higher prevalence of students in the focal disability groups. The study sampling design identifies schools as either high prevalence or low prevalence with respect to the number of students in the focal disability groups. High prevalence schools are public schools where the expected number of students whose primary IEP designation is autism or emotional disturbance exceeds the 95th percentile (17 students) across all schools in the MGLS:2017 Main Study sampling frame. Only 13 of 51 high prevalence public schools (25.5 percent) in our initial OFT sample agreed to participate, compared with 17 of 36 low prevalence public schools (47.2 percent) (p < .036). Most of the refusals among high prevalence schools came at the district level, where 25 districts, representing 32 high prevalence schools, declined participation. At the district level, in our initial OFT sample, 11 of the 38 districts with high prevalence schools (28.9 percent) allowed us to contact their schools about the study, whereas 20 of 32 districts with low prevalence schools (62.5 percent) allowed us to contact their schools (p < .005). Furthermore, OFT districts that declined participation for their schools had, on average, a higher percentage of high prevalence schools (32 percent) than did OFT districts that allowed us to contact their schools for participation (20 percent).
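The participation rates quoted above follow directly from the underlying counts; a minimal recomputation (a sketch, using only numbers from the text):

```python
# Participation counts reported above: (participating/allowed, total sampled).
high_prev_schools   = (13, 51)  # high prevalence public schools
low_prev_schools    = (17, 36)  # low prevalence public schools
high_prev_districts = (11, 38)  # districts with high prevalence schools
low_prev_districts  = (20, 32)  # districts with low prevalence schools

def rate(pair):
    """Percentage, rounded to one decimal place as in the text."""
    count, total = pair
    return round(100 * count / total, 1)

# rate(high_prev_schools) -> 25.5; rate(low_prev_schools) -> 47.2
# rate(high_prev_districts) -> 28.9; rate(low_prev_districts) -> 62.5
```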
Given the rarity in many schools of students in two of the three focal disability groups – namely, students whose primary IEP designation is autism or emotional disturbance – these refusal conversion strategies will be targeted to achieve higher response among public schools classified as high prevalence. Of the 1,236 initially sampled schools, 1,017 are public schools, which identify students with IEPs (the remaining 219 sampled schools are private schools, for which counts of students with the focal disabilities are unavailable). The 1,017 public schools come from 839 public school districts. The initial school sample includes 176 high prevalence schools, which reside in 144 districts. Those districts contain an additional 56 sampled low prevalence schools, for a total of 232 sampled schools in the 144 districts with at least one high prevalence school (out of the total of 1,017 sampled public schools in 839 districts).
In the OFT, schools were randomly assigned to one of three school-level participation incentives: a $200 monetary incentive, a $400 monetary incentive, or a non-monetary equivalent of $400. For high prevalence schools whose district participated, the $400 incentive level yielded 9 participating schools of 11 (82 percent), while the $200 level yielded 4 participating schools of 9 (44 percent). Given the apparent effectiveness of the additional $200 in increasing response, we propose providing a boost of an additional $200 to pending refusal schools. For each additional high prevalence school that agrees to participate, we estimate gaining on average an additional 8 students with an autism or emotional disturbance IEP designation.
Refusals at the district level prohibit the study from contacting the school or schools in the district's jurisdiction to discuss participation in the study; thus, a single district refusal can affect the participation of multiple schools. This is even more damaging to the study when the refusing district contains high prevalence schools, that is, schools where the total number of students with a primary IEP designation of autism or emotional disturbance exceeds 17. Although the $200 boost will be provided to schools, because the district is a gatekeeper and was a significant factor in nonresponse during the OFT, we propose to offer the additional $200 in districts that contain high prevalence schools to maximize their approval of the research and improve school participation rates. We will let each refusing district know about the higher incentive offer as part of our district refusal conversion efforts. This is intended to encourage the district to open the door for us to contact schools directly, where we have a greater likelihood of gaining cooperation. Upon OMB approval of this plan, we will identify all pending refusal districts containing high prevalence schools on an ongoing basis. We will offer all schools in those districts, including low prevalence schools, an additional $200 monetary incentive or its $200 equivalent in goods or services – for a total of $600 for the targeted schools – to achieve district participation. Offering consistent incentive levels across schools in a district avoids treating schools in the same district differentially.
A financial incentive does not guarantee participation, as districts and schools are increasingly protective of their instructional time and reluctant to participate in voluntary studies. However, our recent experience demonstrates that many schools facing budget reductions find an increase in the incentive attractive and that it encourages their participation in the study. The proposed increase of $200 is moderate compared with other studies, and we are concerned that a lesser offer may not be sufficient to offset the districts' reasons for declining to participate in the study.
We will also offer site visits and teleconferences or videoconferences (which may include school administrators in addition to district officials) to explain the details of the study, to address any concerns that the districts and schools may have, and to improve the likelihood of district participation. We will compare the districts that receive the intensified effort (including the nonmonetary measures) with districts whose schools received $400 to evaluate its effectiveness.
Experienced recruiters. The recruiting team will include staff with established records of successfully recruiting school districts and schools. To maximize district approval, senior staff will make the initial district telephone contacts. Their familiarity with the study and its future impact, as well as their experience in working with districts to overcome challenges to participation, will be crucial to obtaining district approval. Recruiters contacting schools will be equally adept at working with school administrators and providing solutions to overcome the many obstacles associated with student assessments, including conflicts related to scheduling and assessment space, minimizing interruption to instructional time, and obtaining teacher and parent buy-in.
Persuasive written materials. Key to the plan for maximizing participation is developing informative materials and professional, persuasive requests for participation. The importance of the study will be reflected in the initial invitations from NCES (appendices MS1-B to D), sent with a comprehensive set of FAQs (appendix MS1-I), a colorful recruitment-oriented brochure describing the study (appendix MS1-H), and a brief one-page flyer providing quick facts about the study and explaining how MGLS:2017 differs from other assessments (appendix MS1-H). Reviewing these materials should give districts and school administrators a good understanding of the study's value, the importance of MGLS:2017, and the data collection activities required as part of the study. A full understanding of these factors will be important both to obtain cooperation and to ensure that schools and districts accept the data collection requests that follow.
Persuasive electronically accessible materials. In addition to written materials, information about the study will be available on the study website (text in appendix MS1-J). The website will draw heavily on the written materials, will present clear and concise information about the study, and will convey the critical importance of taking part in the study.
Outreach. As mentioned briefly above, AMLE and the Forum will provide an outreach service, asking for support of the study, offering updates to their constituencies on the progress of the study, and making available information on recent articles and other material relevant to education in the middle grades. In addition, project staff will attend conferences, such as the AMLE annual conference, to promote the study.
Buy-in and support at each level. During district recruitment, the study team will seek not just permission to contact schools and obtain student rosters but also to obtain support from the district. This may take the form of approval of a research application and a letter from the district’s superintendent encouraging schools to participate. Active support from a higher governing body or organization, such as a district or a diocese, encourages cooperation of schools. Similarly, when principals are interested in the research activity, they are more likely to encourage teacher participation and provide an effective school coordinator.
Avoiding refusals. MGLS:2017 recruiters will work to avoid direct refusals by focusing on strategies to solve problems or overcome obstacles to participation faced by district or school administrators. They will endeavor to keep the door open while providing additional information and seeking other ways to persuade school districts and schools to participate.
Maximizing Parent Participation.
To improve the chances of obtaining higher parent participation in MGLS:2017, a responsive design approach will be used in the Main Study Base-year to identify cases for nonresponse follow-up interventions such that the responding sample is as representative as possible of the population (i.e., sixth-graders), thereby reducing the risk of nonresponse bias. The approach chosen for the Main Study will be informed by the results of the IVFT and OFT incentive experiments. Results from the OFT parent incentive experiment in particular, which involves providing a baseline incentive followed by a boost incentive for nonrespondents, will be used to determine the optimal baseline incentive offer and potential incentive boosts in the Main Study.
OFT Tracking and OFT First Follow-Up (OFT2) Data Collection
A major challenge in a longitudinal study is maintaining contact with sample members for subsequent rounds of data collection. For the OFT, we will track the student’s enrollment status and update the parents’ contact/locating data. The goal of this collection is to inform the procedures necessary for successful tracking of students and parents after the Main Study Base-year data collection.
Staff will be trained to communicate with schools and parents. Additionally, the MGLS:2017 study team proposes to take the following steps to increase response rates as part of the OFT tracking:
The data collection plan minimizes the time that any one school administrator or parent will be asked to spend in completing tracking instruments. For example, the enrollment status update instrument was designed to take 20 minutes, and the parent contact update about 5 minutes.
Multiple methods will be used to collect data from parents and school administrators. Schools will be offered online systems to easily update students' enrollment, and parents will be sent hardcopy forms with an online option to complete the contact information update. Using multiple methods allows respondents to complete the requested activities using the mode that is most convenient for them or with which they feel most comfortable. The OFT will provide NCES with important information on the percent of parents who choose to respond using the different modes and the level of effort that is required to obtain a response.
A variety of communication materials (advance letters, email invitations, a study summary, and follow-up letters) will be sent to respondents to communicate the importance of the study and of their participation. Providing multiple types of materials and sending them via multiple modes increases the chance that the intended recipient receives and reads the materials. For example, some individuals may be more likely to open an email than an envelope received via U.S. Mail, or vice versa. Some may be more likely to respond to a visually pleasing study summary as opposed to a formal letter, or vice versa. The variety of contact materials will help maximize the coverage of contact, which in turn will maximize response.
Contact will be maintained with respondents using various approaches and through multiple attempts. By staying in contact with reluctant respondents to determine their primary reasons for not responding, the data collection staff can be flexible and proactive. Direct contact with respondents by phone after unsuccessful email and hardcopy invitations can often break through resistance and help to increase cooperation and response rates. Experience with each of these modes during the OFT will help to inform the design and data collection protocol for the Main Study Base-year.
Batch and intensive tracing activities will be used as a low-cost, quick-turnaround approach to ensuring current contact information to facilitate reaching parents for the OFT data collection.
Contact materials to the schools will describe both the tracking and data collection activities to fully inform school administrators about the activities to be conducted in the 2017-18 school year.
Students no longer enrolled at their base, destination, or transfer school may be asked to participate via Web to facilitate participation by students who otherwise would be unable to participate in school.
MGLS:2017 is designed to include two field tests: an Item Validation Field Test (IVFT) and an Operational Field Test (OFT). The IVFT was conducted in the winter/spring 2016 and is the basis for informing decisions about the methods and procedures for the OFT and Main Study.
One of the main goals of the IVFT effort was to provide data needed to evaluate a battery of student assessments (in the areas of mathematics and reading achievement, and executive functions) and survey instruments for use in the OFT and Main Study. To that end, a number of analyses were performed on the IVFT data in order to determine if items needed revision or removal.
The properties of the survey items were examined using frequencies, mean, median, mode, standard deviation, skew, kurtosis, and histograms. Differences in response patterns were examined overall and by grade level. If the survey items were intended to be part of a scale, reliability, dimensionality, and item-to-total correlations were examined. Additionally, bivariate correlations with preliminary mathematics assessment, reading assessment, and executive function information were examined. Finally, the timing required to answer items was reviewed to remove or revise any items that needed an inordinate amount of time to complete. Based on these findings, in combination with consideration of construct importance, decisions were made to revise some items and to remove others.
The purpose of the IVFT was also to provide data to establish the psychometric properties and item performance of the items in the mathematics item pool. These data will be used to construct a two-stage mathematics assessment that will be fielded in the OFT. In addition, the IVFT provided data on the performance of the reading assessment and the executive function tasks. These data will be used to confirm the performance of the reading assessment and to select executive function tasks to be fielded in the OFT.
The IVFT also provided an opportunity to develop study policies and procedures that could be further tested in the OFT for eventual use in the Main Study. However, the two field tests are quite different. The IVFT included students in multiple grades, though not necessarily from a representative sample, and tested a large number of items to determine the item pool for the longitudinal study. A main goal of the OFT is to better understand the recruitment strategies necessary for a large-scale nationally representative study to obtain the targeted sample yield of grade 6 general education students and students with disabilities, and the subsequent tracing and tracking strategies necessary to maintain the student sample from the base year (when sample students will be in grade 6) to the first follow-up (when most of the students will be in grade 7) and the second follow-up (when most of the students will be in grade 8). The OFT provides an opportunity to further test the procedures that worked effectively in the IVFT and subsequently learn from the OFT how to best implement the Main Study.
Two incentive experiments were conducted in the IVFT to inform decisions about the optimal baseline incentive offer for the OFT and Main Study. These experiments were a school-level incentive experiment and a parent-level incentive experiment.
School-level Incentive Experiment. For the school-level incentive experiment, schools were randomly assigned to one of three incentive treatment groups: an offer of a $200 monetary incentive, a $400 monetary incentive, or goods and services with a value of up to $400. Schools assigned to the goods and services group could choose from a list of options, including a $400 gift card for use toward school supplies, payment for professional development, a library of middle level publications, registration for the AMLE annual meeting, or a two-year membership to AMLE. Two hundred and fifty schools were sampled to achieve a yield of 58 schools for the IVFT. Because the IVFT was not designed to be a nationally representative sample, results of the school-level incentive experiment from both the IVFT and the OFT will be analyzed to inform recruitment procedures for the Main Study Base-year.
Parent-level Incentive Experiment. Parent assignment to one of three incentive levels was performed at the school level prior to the start of school recruitment. All parents at a given school were offered the same incentive amount: no incentive, $20, or $40. Because the incentive level was assigned at the school level, the number of parents in each group is unequal. Parent contact information was received from schools on a flow basis, and invitations to complete the parent survey were sent as the contact information became available. The parent data collection took place between March 21 and May 31, 2016. This field period was abbreviated compared with those of the OFT and Main Study Base-year, which will begin data collection in late January and continue through May and July, respectively.
Table 6 shows parent participation by incentive offer. For the IVFT, 3,222 parents were asked to complete a parent interview. A total of 1,336 parents were not offered an incentive to complete the survey; 813 parents were offered a $20 incentive; and 1,073 parents were offered a $40 incentive. As shown in Table 6, there was a 33.5 percent response rate for parents not offered an incentive, a 42.8 percent response rate for parents in the $20 group, and a 40.5 percent response rate for parents in the $40 group.
Table 6. Parent Participation by Incentive Offer

Incentive | Selected N | Completes N | Completes Percent | Partial Completes N | Partial Completes Percent
Total | 3,222 | 1,231 | 38.21 | 150 | 4.66
$0 | 1,336 | 448 | 33.53 | 64 | 4.79
$20 | 813 | 348 | 42.80 | 33 | 4.06
$40 | 1,073 | 435 | 40.54 | 53 | 4.94
Table 7 further breaks down the parent incentive experiment by disability group. Of the parents in the focal disability groups, 200 were in the $0 incentive group, 130 were in the $20 incentive group, and 151 were in the $40 incentive group. These sample sizes are potentially too small to establish whether one incentive amount is more effective than another. Descriptively, however, as with the estimates for all parents presented in Table 6, $20 may be the amount worth considering. While these data are not adequately powered to justify a definitive conclusion, the Emotional Disturbance parent group shows the lowest response rate at all three incentive levels, whereas the Autism and Specific Learning Disability parent groups seem to follow a trend more in line with parents of students without disabilities. For this reason we have proposed a different incentive approach for the Emotional Disturbance parent group (see Part A, Table 3 and its accompanying text).
Table 7. Parent Incentive by Disability Group

Incentive | Selected N | Completes N | Completes Percent | Partial Completes N | Partial Completes Percent
Total | 481 | 175 | 36.38 | 14 | 2.91
$0 | 200 | 56 | 28.00 | 6 | 3.00
  Specific learning disability | 71 | 16 | 22.54 | 4 | 5.63
  Autism | 49 | 25 | 51.02 | 1 | 2.04
  Emotional disturbance | 80 | 15 | 18.75 | 1 | 1.25
$20 | 130 | 63 | 48.46 | 4 | 3.08
  Specific learning disability | 64 | 34 | 53.13 | 2 | 3.13
  Autism | 43 | 21 | 48.84 | 1 | 2.33
  Emotional disturbance | 23 | 8 | 34.78 | 1 | 4.35
$40 | 151 | 56 | 37.09 | 4 | 2.65
  Specific learning disability | 63 | 23 | 36.51 | 2 | 3.17
  Autism | 45 | 19 | 42.22 | 1 | 2.22
  Emotional disturbance | 43 | 14 | 32.56 | 1 | 2.33
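The power limitation noted for these subgroup comparisons can be illustrated with a standard two-proportion sample-size formula. This is a simplified sketch; the study's own power calculations used SAS Proc Power and a two-proportion sample size calculator, as noted in the footnotes:

```python
import math

def n_per_group(p1, p2, z_alpha=1.960, z_beta=0.8416):
    """Approximate per-group sample size needed to detect a difference
    between two response rates with a two-sided z-test (defaults:
    alpha = .05 two-sided, 80 percent power)."""
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Detecting the small overall $20 vs. $40 gap from Table 6 (42.80% vs.
# 40.54%) would require several thousand parents per group, far more than
# the 813 and 1,073 parents actually available.
n_small_gap = n_per_group(0.4280, 0.4054)
```

By contrast, a 20-point gap (such as 28 percent vs. 48 percent) needs fewer than 100 parents per group under the same assumptions, which is why the overall incentive-versus-no-incentive contrasts are detectable while the $20 vs. $40 contrast is not.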
Table 8 shows the response rate by incentive type for all parents who received an invitation to participate in the survey, and specifically combines the $20 and $40 incentive groups to allow for comparison against the no incentive group.
Table 8. Parent Response by Incentive Type

Incentive offer | Selected N | Completes | Not completes | Response rate
No incentive offer | 1,336 | 448 | 888 | 33.53
$20 offer | 813 | 348 | 465 | 42.80
$40 offer | 1,073 | 435 | 638 | 40.54
$20 or $40 offer | 1,886 | 783 | 1,103 | 41.52
Table 9 provides a comparison of the incentive treatments to support the proposed incentive values described in Part A.9.
Table 9. Chi-square/p-value results

Comparison | Chi-square value | P-value
No incentive vs. incentive | 21.11 | .000004 (significant)
No incentive vs. $20 | 18.63 | .000016 (significant)
No incentive vs. $40 | 12.59 | .00039 (significant)
$20 vs. $40 | 0.98 | .32 (not significant)
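The chi-square values in Table 9 can be reproduced from the counts in Table 8 with a Pearson chi-square test on a 2x2 table (equivalently, a two-proportion z-test). A minimal sketch using only the Python standard library:

```python
import math

def chi_square_2x2(x1, n1, x2, n2):
    """Pearson chi-square test (1 df, no continuity correction) comparing
    two response rates x1/n1 and x2/n2, as in the Table 9 comparisons."""
    pooled = (x1 + x2) / (n1 + n2)
    diff = x2 / n2 - x1 / n1
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    chi2 = (diff / se) ** 2
    # With 1 df, chi-square is the square of a standard normal deviate,
    # so the two-sided p-value comes from the normal tail via erfc.
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value

# No incentive (448/1,336) vs. any incentive (783/1,886), per Table 8:
chi2, p = chi_square_2x2(448, 1336, 783, 1886)
# chi2 is approximately 21.11, matching the first row of Table 9
```

The $20 vs. $40 comparison computed the same way yields a chi-square of about 0.98 (p ≈ .32), consistent with the non-significant result in the table.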
The following individuals at the National Center for Education Statistics (NCES) are responsible for MGLS:2017: Carolyn Fidelman, Gail Mulligan, Chris Chapman, and Marilyn Seastrom. The following individuals at RTI are responsible for the study: Dan Pratt, Debbie Herget, and David Wilson, along with subcontractor staff: Sally Atkins-Burnett (Mathematica) and Michelle Najarian (ETS).
1 Sixth-grade enrollment is reported as 0 or was not reported to the CCD 2013-14 or PS 2013-14.
2 A school reports zero students or does not report the number of students in any of the three focal disability groups.
3 This mimics the cutoff planned for the national data collection, in order to gauge the degree to which high-prevalence schools agree to participate.
4 Folsom, R.E., Potter, F.J., and Williams, S.R. (1987). Notes on a Composite Size Measure for Self-Weighting Samples in Multiple Domains. Proceedings of the Section on Survey Research Methods of the American Statistical Association, 792-796.
5 One hundred twenty-five schools will be drawn to allow for a reserve pool of 22 schools to account for a lower-than-estimated school participation rate.
6 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
7 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
8 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
9 Retrieved from http://www.select-statistics.co.uk/sample-size-calculator-two-proportions.
10 Retrieved from http://www.select-statistics.co.uk/sample-size-calculator-two-proportions.
11 Calculated using SAS Proc Power. https://support.sas.com/documentation/cdl/en/statugpower/61819/PDF/default/statugpower.pdf
12 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
13 Calculated using SAS Proc Power. https://support.sas.com/documentation/cdl/en/statugpower/61819/PDF/default/statugpower.pdf
14 A special education school is a public elementary/secondary school that focuses on educating students with disabilities and adapts curriculum, materials, or instruction for the students served.
15 Imputation is necessary in order to be able to include eligible schools in the sampling process, which helps ensure the sampling frame is more representative of the population of eligible schools. Imputation will be used for grade 6 enrollment when grade 6 enrollment is missing. Imputation will be used for focal disability counts when focal disability counts are missing. We note that schools in Wyoming and Iowa do not report to EDFacts so they would not be able to be represented in the Main Study Base-year sample without imputing focal disability counts. If both grade 6 enrollment and focal disability counts are missing, imputation will not be used and these schools will be excluded from the frame.
16 Sixth-grade enrollment is reported as 0 or was not reported to the CCD 2013-14 or PS 2013-14.
17 A school reports zero students or does not report the number of students in any of the three focal disability groups.
18 For sampling purposes, all private schools will be classified as low prevalence schools because private schools do not report to EDFacts.
19 See, for example, Kish, L. (1965). Survey Sampling. New York: John Wiley & Sons, p. 56.
20 Folsom, R.E., Potter, F.J., and Williams, S.R. (1987). Notes on a Composite Size Measure for Self-Weighting Samples in Multiple Domains. Proceedings of the Section on Survey Research Methods of the American Statistical Association, 792-796.
21 The seven student domains are as follows: Autism (AUT); Emotional Disturbance (EMN); Specific Learning Disability (SLD); Asian, non-Hispanic (non-SLD, non-EMN, non-AUT); Hispanic (non-SLD, non-EMN, non-AUT); Black, non-Hispanic (non-SLD, non-EMN, non-AUT); and Other race, non-Hispanic (non-SLD, non-EMN, non-AUT).
22 SAS Institute Inc. 2008. SAS/STAT® 9.2 User’s Guide. Cary, NC: SAS Institute Inc.
23 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
24 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
25 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.