Middle Grades Longitudinal Study of 2017–18 (MGLS:2017)
Recruitment for
2017 Operational Field Test (OFT)
OMB# 1850-0911 v.10
Supporting Statement Part B
National Center for Education Statistics
U.S. Department of Education
Institute of Education Sciences
Washington, DC
September 2015
Revised December 2015
Revised March 2016
B. Collection of Information Employing Statistical Methods
B.2 Procedures for the Collection of Information
B.3 Maximizing Participation
B.4 Purpose of Field Test and Data Uses
B.5 Individuals Responsible for Study Design and Performance
Part B of this submission presents information on the collection of information employing statistical methods for the recruitment for the Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) Operational Field Test (OFT). A separate OMB clearance request for the OFT data collection will be submitted in 2016. Please note that aspects of the methodology described in this document may be subject to change based on the Item Validation Field Test (IVFT). Any revisions will be submitted to OMB for approval as a change request as soon as available.
The MGLS:2017 Operational Field Test (OFT) will be conducted during the 2016-17 school year, with data collection scheduled to begin in January 2017. The OFT will be conducted in ten geographic locations that adequately simulate the diversity of the 50 States and the District of Columbia. The OFT may include up to two quasi-nationally representative samples of students in the United States. The Sample 1 universe will include students enrolled in grade 6 and attending general education schools, while Sample 2, if drawn, would include students in general education schools in three focal disability categories (autism, specific learning disability, and emotional disturbance) who are enrolled in grade 6 or are of equivalent age in an ungraded classroom. Sample 2 will be used if Sample 1 proves insufficient to meet the target student sample yields for the three focal disability categories.
The MGLS:2017 OFT will employ a multi-stage sampling design, with schools selected in one stage, and then students selected within schools. Schools will be selected using probability proportional to size sampling within school sampling strata, with some schools selected with certainty in order to ensure that all desired types of schools are included (see below).
Students will be selected using simple random sampling within student sampling strata within schools. The school frame will be constructed from the 2013-2014 Common Core of Data (CCD 2013-14) and the 2013-2014 Private School Universe Survey (PSS 2013-14) and will include 3,301 schools that report offering sixth-grade instruction and are located within ten metropolitan statistical areas (MSAs). The following types of schools will be excluded from the sampling frame:
Department of Defense Education Activity schools and Bureau of Indian Education schools,
schools for juvenile delinquents,
schools that report no sixth-grade enrollment and do not report to EDFacts,
schools that report no sixth-grade enrollment and report no students between the ages of 11 and 13 in the three focal disability groups,
special education schools, and
schools included in the Item Validation Field Test (IVFT).
Schools will be stratified by MSA and by a High/Low Prevalence designation derived from the total number of students in the three focal disability groups. Schools will be classified as High Prevalence if the total number of students in the three focal disability groups exceeds the 95th percentile for that total across all schools, not just those schools in the ten MSAs used for the OFT sampling. One hundred twenty-five schools will be sampled, and 103 of the 125 schools will be selected for initial data collection. The 22 schools not selected for initial data collection will be used as a reserve sample in the event that a higher-than-expected proportion of schools decline to participate or are found to be ineligible for the OFT.
The allocation of the 125 schools to the school sampling strata proceeded by first determining an allocation of 103 schools across the school sampling strata such that approximately half of the sample consists of high prevalence schools, approximately half consists of low prevalence schools, and approximately ten sample schools fall in each MSA. Twenty-two additional reserve schools were distributed across the school sampling strata so as to preserve these goals as well. The school strata and a sample allocation that meets these goals are shown in table 1.
Table 1. OFT School Sample Allocation

| School Region | Prevalence | Schools on Frame | Schools Among 103 Schools | Certainty Private, Rural, and Town Schools (30) | Certainty Take-All Schools (17) | Non-Certainty Schools Among 103 Schools | Reserve Sample | PPS Sample of 78 | School Sample Size (125) |
|---|---|---|---|---|---|---|---|---|---|
| A | High Prevalence | 76 | 7 | 2 | 0 | 5 | 2 | 7 | 9 |
| A | Low Prevalence | 395 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| B | High Prevalence | 22 | 7 | 0 | 0 | 7 | 2 | 9 | 9 |
| B | Low Prevalence | 293 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| C | High Prevalence | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 1 |
| C | Low Prevalence | 87 | 6 | 1 | 0 | 5 | 2 | 7 | 8 |
| D | High Prevalence | 5 | 5 | 0 | 5 | 0 | 0 | 0 | 5 |
| D | Low Prevalence | 170 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| E | High Prevalence | 5 | 5 | 0 | 5 | 0 | 0 | 0 | 5 |
| E | Low Prevalence | 305 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| F | High Prevalence | 40 | 7 | 0 | 0 | 7 | 2 | 9 | 9 |
| F | Low Prevalence | 566 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| G | High Prevalence | 14 | 7 | 0 | 0 | 7 | 2 | 9 | 9 |
| G | Low Prevalence | 572 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| H | High Prevalence | 12 | 7 | 0 | 0 | 7 | 2 | 9 | 9 |
| H | Low Prevalence | 497 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| I | High Prevalence | 4 | 4 | 0 | 4 | 0 | 0 | 0 | 4 |
| I | Low Prevalence | 148 | 5 | 3 | 0 | 2 | 1 | 3 | 6 |
| J | High Prevalence | 2 | 2 | 0 | 2 | 0 | 0 | 0 | 2 |
| J | Low Prevalence | 87 | 5 | 3 | 0 | 2 | 2 | 4 | 7 |
| Total | | 3,301 | 103 | 30 | 17 | 56 | 22 | 78 | 125 |
The first step in the sampling process involves the selection of thirty schools with certainty in order to ensure that at least ten private schools, ten schools in rural areas, and ten schools in towns are included in the sample of 103 schools to be initially fielded for data collection. The allocation of these thirty schools to the twenty school sampling strata is provided in table 1. The second step requires identifying those school sampling strata where the sample allocation of the 125 schools equals the number of schools in the frame. As shown in table 1, there are five such sampling strata, containing seventeen schools in all; all seventeen will be selected with certainty. Forty-seven of the 103 schools, or approximately 46 percent of the sample, will thus be selected with certainty. The third step involves selecting 78 schools using stratified probability proportional to size sampling. The distribution of these 78 schools across the school sampling strata is provided in table 1. The fourth step entails drawing a simple random sample, within school strata, from the 78 schools selected using probability proportional to size in order to designate 56 of the 78 schools for inclusion in the initial set of 103 schools released for data collection. The sample allocation of these 56 schools is shown in table 1.
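The certainty and probability proportional to size (PPS) steps above can be sketched in code. The sketch below is illustrative only: the school records, stratum labels, and allocation numbers are hypothetical, and the systematic PPS routine assumes no single school's size measure exceeds the sampling interval (a school that large would instead be taken with certainty).

```python
import random

def pps_systematic(schools, n, seed=12345):
    """Systematic probability-proportional-to-size selection of n schools.
    Assumes no single size measure ('mos') exceeds the sampling interval."""
    rng = random.Random(seed)
    total = sum(s["mos"] for s in schools)
    interval = total / n
    start = rng.uniform(0, interval)
    picks, cum, k = [], 0.0, 0
    for s in schools:
        cum += s["mos"]
        # Select the school whose cumulative MOS range contains the
        # next selection point, start + k * interval.
        while k < n and start + k * interval < cum:
            picks.append(s)
            k += 1
    return picks

def select_schools(frame, allocation):
    """Stratified selection: strata whose allocation equals the number of
    frame schools are take-all (certainty); the rest are sampled PPS."""
    selected = []
    for stratum, n in allocation.items():
        schools = [s for s in frame if s["stratum"] == stratum]
        if n >= len(schools):
            selected.extend(schools)          # certainty (take-all) stratum
        else:
            selected.extend(pps_systematic(schools, n))
    return selected
```

In the OFT design, the five take-all strata contribute the seventeen certainty schools, while the remaining strata are sampled PPS within stratum.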
The size measure used for the probability proportional to size selection of 78 schools will be constructed using the overall sampling rates for students in the following four student categories:
Autism,
Emotional Disturbance,
Specific Learning Disability, and
Other
combined with the total number of students in each of those categories at a given school. In other words, the size measure for a given school i may be written as follows:

MOS_i = Σ_j (r_j × N_ij),

where r_j is the sampling rate for the jth student category and N_ij is the number of students in the jth category within school i. The sampling rate, r_j, equals the number of students to sample from the jth category divided by the number of students in the jth category across all schools in the sampling frame. The sampling rate for autistic students equals 188/7,706 (.024), the sampling rate for students with emotional disturbance equals 188/5,115 (.037), the sampling rate for students with specific learning disability equals 188/42,603 (.004), and the sampling rate for other students equals 1,188/280,424 (.004). The denominator of a given rate for a given domain corresponds to the number of students in that domain across all schools in the sampling frame. The numerator of a given rate equals the number of students required to be sampled in order to achieve 1,120 participating students, including 120 autistic students, 120 students with emotional disturbance, 120 students with specific learning disability, and 760 other students.
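Under this definition the size measure is a simple weighted count. The sketch below uses the rates stated above; the example school's enrollment counts are hypothetical.

```python
# Sampling rates from the text: target sample size for each student
# category divided by that category's total across the sampling frame.
rates = {
    "autism": 188 / 7706,                         # ~.024
    "emotional_disturbance": 188 / 5115,          # ~.037
    "specific_learning_disability": 188 / 42603,  # ~.004
    "other": 1188 / 280424,                       # ~.004
}

def size_measure(counts):
    """MOS_i = sum over categories j of r_j * N_ij for one school,
    where counts maps each category to its enrollment N_ij."""
    return sum(rates[j] * counts.get(j, 0) for j in rates)

# Hypothetical school: these enrollment counts are illustrative only.
mos = size_measure({"autism": 2, "emotional_disturbance": 1,
                    "specific_learning_disability": 10, "other": 300})
```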
The unequal weighting effect is designed to be one within each of the four student domains (autism, specific learning disability, emotional disturbance, and other) within each school stratum among the schools selected using probability proportional to size sampling. An unequal weighting effect of one means that the precision of an estimate is equivalent to the precision of an estimate derived via simple random sampling. The degree to which certainty schools will reduce the precision of estimates depends upon the degree to which certainty schools participate, the degree to which students in all schools participate, the degree to which student enrollment counts match expected counts, and the set of non-certainty schools that are sampled.
Within participating schools, students will be stratified into the four student categories defined above, and a simple random sample of students will be selected from each student sampling stratum. The number of students sampled per student stratum will vary by school because the within-school student allocation to strata depends upon the number of students in each of the four student sampling strata. The process of determining the student sample allocation follows the procedure outlined in section 2 of Folsom et al. (1987). The number of sampled students per student domain will generally not be equal and will vary across schools. The process outlined in Folsom et al. will also be followed for the full-scale collection. Approximately 34 students will be sampled from each of the anticipated 50 participating schools.
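Once per-stratum targets are set for a school, the selection step itself reduces to stratified simple random sampling. The sketch below is a simplified illustration: the roster and the per-stratum targets are hypothetical, and in the actual study the per-school targets come from the Folsom et al. (1987) allocation procedure rather than being fixed in advance.

```python
import random

def sample_students(roster, targets, seed=2017):
    """Stratified simple random sample within one school.
    roster: list of (student_id, category) pairs.
    targets: category -> number to sample (hypothetical values here)."""
    rng = random.Random(seed)
    by_category = {}
    for sid, cat in roster:
        by_category.setdefault(cat, []).append(sid)
    sampled = []
    for cat, n in targets.items():
        pool = by_category.get(cat, [])
        # Take everyone if the stratum is smaller than its target.
        sampled.extend(rng.sample(pool, min(n, len(pool))))
    return sampled
```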
For OFT Sample 1, schools will be selected for recruitment from the CCD 2013-14 to develop the public school list and from the PSS 2013-14 to develop the private school list. In Sample 1, once schools are selected and recruited, students enrolled in grade 6 will be selected from student rosters that schools will be asked to provide. For OFT Sample 2, districts would be contacted to provide rosters of students enrolled in grade 6 or of equivalent age in an ungraded classroom along with their disability codes. Students in the autism, specific learning disability, and emotional disturbance categories would be sampled from the provided rosters, and their schools would be contacted for recruitment of the specified students. Alternately, if districts prefer, they may identify a school or schools in their district which contain grade 6 students in one or more of the focal disability groups and MGLS:2017 staff will invite the school(s) to participate and request a student list with disability codes from the school(s).
During the 2017 administration of the OFT, students will participate in the full protocol as designed for implementation in the main study base year, including student assessments; a student survey; and surveys of the students’ parents, math teachers, special education teachers (as applicable), and school administrators. In the two subsequent OFT follow-ups planned for 2018 and 2019, when most students will be in grades 7 and 8, respectively, student tracing and tracking activities will be conducted to test the approaches, materials, and procedures needed to locate the main study sample students as they move and change schools. The OFT follow-ups will also allow for testing procedures for gaining cooperation from base-year and newly identified schools and for collecting data from students, parents, and school staff.
Students
The specified desired yield for the OFT is 1,120 students enrolled in grade 6, including 120 students in each of the three focal disability categories. Depending on each school's policy, either active (explicit) or passive (implicit) parental consent materials (appendices F–G) will be distributed to the sampled students, with an estimated 80 percent rate of granted consents. Among those with granted consent, an estimated 80 percent of students are expected to be present and take the assessments. Thus, to achieve a yield of 1,120 assessed students, the parents of approximately 1,750 students will need to be contacted for consent (1,750 × 0.8 × 0.8 = 1,120).
Schools
The MGLS:2017 main study design estimates that approximately 20–25 students from each school will participate, which is similar to the per-school participation in the ECLS-K:2011 and the HSLS:09. Therefore, for the OFT to mimic main study procedures, assuming an average of 22 participating students per school and a target yield of 1,120 students in total, the study will need to obtain the participation of approximately 50 schools. Based on an estimated 97 percent school eligibility rate and an estimated school response rate of 50 percent, the MGLS:2017 OFT would need to draw a sample of approximately 103 schools.
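The student and school targets above follow from straightforward yield arithmetic, using the rates stated in the text:

```python
# Student side: ~80 percent consent rate times ~80 percent
# assessment-day presence rate yields the parents to contact.
consent_rate, present_rate = 0.80, 0.80
target_assessed = 1120
parents_to_contact = target_assessed / (consent_rate * present_rate)  # 1,750

# School side: ~50 participating schools are needed at ~22 students
# each; with 97 percent eligibility and a 50 percent response rate,
# the initial draw must be correspondingly larger.
participating_schools = 50
eligibility_rate, school_response_rate = 0.97, 0.50
schools_to_sample = participating_schools / (eligibility_rate * school_response_rate)
```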
If the participating Sample 1 schools do not yield a sufficient number of students in each of the three focal disability categories to meet the sample yield targets, a sample of supplementary schools in which more than 80 percent of the students have been identified as having an IEP will be recruited. In such a case, we will initially seek to identify these schools in or near districts that already contain schools that have agreed to participate. If we need to recruit additional supplementary schools, we will choose schools that allow the most cost-effective field administration, given the geographic deployment of the field assessment staff.
School Recruitment Approach
Gaining schools’ cooperation in voluntary research is increasingly challenging. For example, in 1998–99 the Early Childhood Longitudinal Study had a weighted school-level response rate of 74 percent,6 whereas 12 years later, the complementary ECLS-K:2011 study had a weighted school-level response rate of 63 percent.7 Additionally, there is evidence that response rates may be lower for schools that serve older students, as in the High School Longitudinal Study of 2009, which had a weighted school-level response rate of 56 percent.8 Therefore, effective strategies for gaining the cooperation of schools are of paramount importance. Attempts will be made to solicit a letter of endorsement from the respective state education agencies to include in recruitment materials sent to districts and schools. Schools will then be recruited both directly and at the district level.
State Endorsement. To encourage district and school participation in the study, their respective state education agencies will be contacted to inform them about the study and to request a letter of endorsement (appendix B). The state testing coordinator and, where applicable, the middle grades coordinator at the state level will be copied on the state letter. Within 3 days of sending the letter to the state, senior recruitment staff will contact the state superintendent, state testing coordinator, and, where applicable, a middle grades coordinator to discuss and secure support for the study. Endorsement letters received by the state will be included in all mailings to districts and schools within the state.
School Districts and Diocesan Recruitment. After state contacts have been completed, whether or not an endorsement letter is received, school districts of sample public schools and dioceses of sample Catholic schools will receive a mailing about the study. The district introductory information packet will include a cover letter (appendix C), brochure (appendix H), and sheet of Frequently Asked Questions (FAQs) (appendix I). Three days after mail delivery of the packet, a recruiter will make a call to secure the district’s cooperation, answer any questions the superintendent or other district staff may have, review the list of schools sampled in the district as well as other schools serving grade 6 or serving students of equivalent age in an ungraded classroom, confirm key information about the schools (e.g., grades served, size of enrollment, enrollment of students with disabilities), and discuss obtaining the students’ IEP information that is necessary for drawing the OFT student sample. Information collected during this call will be used to confirm which schools in the district are eligible for participation in the study, and to obtain contact and other information helpful in school recruitment.
The study staff will be prepared to respond to requirements such as research applications or meetings to provide more information about the study. If a district chooses not to participate, the recruiter will document all concerns listed by the district so that a strategy can be formulated for refusal conversion attempts.
In addition to obtaining permission to contact the selected schools, districts will also be asked about the best way to gather student rosters to enable MGLS:2017 staff to recruit enough students in the three focal disability categories. The purpose of this question is to ask not only about identifying students in the three focal disability categories at the selected school(s) in the district, but also to inquire about obtaining this information for all schools in the district serving grade 6 or equivalent age. With the districts’ permission, additional schools from the district may be added to the study for the sole purpose of collecting data from students with one of the three focal disabilities.
Recruitment of Public and Catholic Schools. Upon receipt of district or diocesan approval to contact the sample public or Catholic schools, respectively, an introductory information packet will be sent via FedEx that includes a cover letter (appendix D), a colorful recruitment-oriented brochure (appendix H), and a sheet of Frequently Asked Questions (FAQs) about the study (appendix I) with links for accessing the MGLS:2017 recruitment website. Within three business days of the information packet delivery (confirmed via FedEx), a school recruiter will follow up with a phone call to secure the school’s cooperation and answer any questions the school may have. During this call, the recruiter will establish who from the school’s staff will serve as the school coordinator for the study. The MGLS:2017 study team will then work with the school coordinator to schedule OFT activities at the school, including gathering student rosters, distributing consent materials to parents of sample students, and arranging the onsite assessments. In early communications, the recruiter will also gather information about what type of parental consent procedures need to be followed at the school; any requirements for collecting data on the IEP status of students and student-teacher rosters; hours of operation, including early dismissal days, school closures/vacations, and dates for standardized testing; and any other considerations that may impact scheduling student assessments (e.g., planned construction periods, school reconfiguration, or planned changes in leadership). The study recruitment team will meet regularly to discuss recruitment issues and develop strategies for refusal conversion on a school-by-school basis.
Private and Charter School Recruitment. If a private or charter school selected for the field test operates under a higher level governing body such as a diocese, a consortium of private schools, or a charter school district, we will use the district-level recruitment approach with the appropriate higher level governing body. If a private or charter school selected for the field test does not have a higher level governing body, the school recruitment approach outlined above will be used.
Collection of Student Rosters. Once a school or district has agreed to participate, the MGLS:2017 contractor, RTI International (RTI), will gather student rosters from the district or with the assistance of the school coordinator from the school. A complete roster of all students eligible for sampling will be requested, including key student characteristics, such as: name; ID number; month and year of birth; grade level; gender; race/ethnicity; and IEP status with disability code(s), when applicable. Each of these characteristics is important for sampling purposes, but we will work with schools that are unable to provide all of the information to obtain the key information available. Based on this information the student sample will be drawn. As part of the roster collection, RTI will also request from the school coordinator or designated district personnel the following information for each student eligible for sampling: student’s parent and/or guardian contact information (e.g., mailing address; landline phone number; cell phone number; e-mail address); student’s math teacher; and student’s special education teacher, when applicable. Schools and districts usually find it easiest, and therefore most efficient, to supply all of the desired information one time for all of their students. However, should it be problematic for any school or district to provide the parent and teacher information on the complete roster, RTI will gather that information as a second step for the sampled students only. If the school and/or district is unwilling to provide parent contact information, RTI will work with the school and/or district to determine the best way to contact parents (e.g., the school coordinator or designated district personnel would facilitate contacting parents and/or mail the required materials to parents using the contact information they have on file).
Schools and districts will be provided with a template and secure transfer options to deliver the rosters (appendices S and T). The data quality of the student rosters will then be evaluated by:
reviewing and assessing the quality and robustness of student and parent information available at each school, including contact information for parents;
reviewing and assessing the quality of the data on student-teacher linkages;
reviewing and assessing the quality of the data on IEP status;
addressing any incompleteness or irregularities in the roster file;
requesting additional information as needed from the school coordinator or designated district personnel; and
(re)verifying that the sampled students are currently in attendance in the school.
Parent Recruitment
Information about schools’ procedures for obtaining consent for students to participate in the study will have been gathered during school recruitment. Schools generally require one of two types of consent: implicit or explicit. Both types of consent require that parents be notified that their children have been selected for the study. With implicit consent, the school does not require verbal or written consent for a student to participate in the study; parents are asked only to notify the appropriate person if they do not want their child to participate (appendix F). With explicit consent, children may participate only if their parents provide written or oral consent for them to do so (appendix G). In the field test, proactive parent recruitment will be focused on maximizing the number of parents (1) returning signed explicit consent forms and (2) completing the parent survey.
After the student sample is drawn within a school, the initial communication with parents, consisting of introductory and consent materials (appendices F–I), will be distributed to parents in the way each school believes to be most appropriate and effective (e.g., sending the materials home with students; the school or district sending the materials directly to parents; and/or trained MGLS recruitment staff contacting parents directly by mail, email, and/or phone). The initial materials will introduce the study, explain the study's purpose and the importance of student and parent participation, describe what is involved in participation, and specify the consent procedure being used by the school. The materials will include a consent-seeking letter to all parents plus a consent form where explicit consent is required (appendices F–G), a colorful recruitment-oriented brochure (appendix H), and a sheet of FAQs about the study (appendix I) with links for accessing the MGLS:2017 recruitment website (appendix J). Additionally, in schools using explicit consent, the parental consent form for the student's participation, which will be included in the initial communication materials, will ask parents to provide their contact information (appendix G). Parent data collection will entail web-based self-administration with nonresponse follow-up by computer-assisted telephone interviewing.
Maximizing School Participation
Studies increasingly experience challenges in obtaining the cooperation of districts and schools. Loss of instructional time, competing demands (such as district and state testing requirements), lack of teacher and parent support, and increased demands on principals impede gaining permission to conduct research in schools. MGLS:2017 recruitment teams will be trained to communicate clearly to districts, dioceses, private school organizations, schools, teachers, parents, and students the benefits of participating in the field test and what participation will require in terms of student and school personnel time. MGLS staff will utilize conferences to inform middle grades professionals about the study and its field tests and to increase MGLS name recognition.
As part of the strategy to maximize response rates among school districts and schools during the recruitment process, RTI has established partnerships with organizations such as the Association for Middle Level Education (AMLE) and the National Forum to Accelerate Middle-Grades Reform (the Forum). These organizations will actively promote the value of the study to their constituencies as will a number of middle-grades education researchers who will participate in the recruitment effort.
Representatives from these organizations have committed to provide outreach to the middle grades community in general via information in newsletters and related communications. These communications will include information about the importance of the study, what study participation entails, and the benefits of the study to the middle grades community.
Recruiters will be trained to address concerns that districts and schools may have about participation, while simultaneously communicating the value of the study and the school’s key role in developing instruments that ensure high-quality data focusing on middle-grade students. Early engagement of districts and school administrators will be important. The study will also offer monetary and non-monetary incentives to schools as described in Part A of this submission, which have proven in other NCES studies to increase school participation rates.
Along with offering monetary and non-monetary incentives, our plan for maximizing district, school administrator, and parent engagement includes the following:
Experienced recruiters. The recruiting team will include staff with established records of successfully recruiting school districts and schools. To maximize district approval, senior staff will make the initial district telephone contacts. Their familiarity with the study and its future impact, as well as their experience in working with districts to overcome challenges to participation, will be crucial to obtaining district approval. Recruiters contacting schools will be equally adept at working with school administrators and providing solutions to the many obstacles associated with student assessments, including conflicts related to scheduling and assessment space, minimizing interruption to instructional time, and obtaining teacher and parent buy-in.
Persuasive written materials. Key to the plan for maximizing participation is developing informative materials and professional and persuasive requests for participation. The importance of the study will be reflected in the initial invitations from NCES (appendices C-G) sent with a comprehensive set of FAQs (appendix I) and a colorful recruitment-oriented brochure describing the study (appendix H). Reviewing these study materials should provide districts and school administrators with a good understanding of the study’s value, the importance of the field test, and the data collection activities required as part of the study. A full understanding of these factors will be important both to obtain cooperation and to ensure that schools and districts accept the data collection requests that follow.
Persuasive electronically accessible materials. In addition to written materials, we will develop a recruitment-focused website which, drawing heavily on the written materials, will present clear and concise information about the study and convey the critical importance of participating in it. AMLE and the Forum will provide an outreach service, asking for support of the study, offering updates to their constituencies on the progress of the study, and making available information on recent articles and other material relevant to education in the middle grades.
Buy-in and support at each level. During district recruitment, the study team will seek not just permission to contact schools and obtain student rosters but also to obtain support from the district. This may take the form of approval of a research application and a letter from the district’s superintendent encouraging schools to participate. Active support from a higher governing body or organization, such as a district or a diocese, encourages cooperation of schools. Similarly, when principals are interested in the research activity, they are more likely to encourage teacher participation and provide an effective school coordinator.
Avoiding refusals. MGLS recruiters will work to avoid direct refusals by focusing on strategies to solve problems or meet obstacles to participation faced by district or school administrators. They will endeavor to keep the door open while providing additional information and seeking other sources of persuasion.
Incentive Experiment. As described in Part A, the approximately 100 eligible OFT schools will be randomly assigned to one of three baseline incentives: $200, $400, or a $400 non-monetary equivalent. The IVFT and OFT school incentive experiment data will be combined for analysis, increasing the analytic sample size to approximately 350 eligible sample schools, or approximately 116 schools per incentive condition. To control for field test membership, a variable indicating the field test to which the school belonged will be included in the model along with an interaction term.
For a power of 0.80, a confidence level of 95 percent, and 116 cases within each condition, a difference in response rates of 15.6 percentage points (e.g., 68.0 percent vs. 83.6 percent) should be detectable as statistically significant. The formula is provided below.9
n = (Z_α/2 + Z_β)² × (p₁(1 − p₁) + p₂(1 − p₂)) / (p₁ − p₂)²

where Z_α/2 is the critical value of the normal distribution at α/2 (e.g., for a confidence level of 95 percent, α is 0.05 and the critical value is 1.96); Z_β is the critical value of the normal distribution at β (e.g., for a power of 80 percent, β is 0.2 and the critical value is 0.84); and p₁ and p₂ are the expected sample proportions of the two groups.
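Plugging the stated values into this formula confirms the detectable difference. The helper below is a direct transcription of the formula; the hard-coded critical values match the examples in the text.

```python
def required_n_per_group(p1, p2, z_alpha_2=1.96, z_beta=0.84):
    """Per-group sample size needed to detect p1 vs. p2 as
    statistically significant (normal-approximation formula)."""
    return ((z_alpha_2 + z_beta) ** 2
            * (p1 * (1 - p1) + p2 * (1 - p2))
            / (p1 - p2) ** 2)

# With roughly 116 schools per condition, a 15.6 percentage-point gap
# (68.0 vs. 83.6 percent) is detectable.
n_needed = required_n_per_group(0.680, 0.836)
```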
Maximizing Parent Participation
In preparation for the national study and to improve the chances of obtaining higher parent participation in MGLS:2017, a test of a responsive design approach to parent recruitment is proposed. The responsive design approach will be used to identify cases for nonresponse follow-up interventions such that the responding sample is as representative as possible of the population (i.e., sixth graders) and thereby reduce the risk of potential nonresponse bias. This approach will also be used to determine the optimal baseline incentive offer for the national study.
The parent incentive experimental conditions and plans for their implementation are described in Part A. One of three incentive amounts ($0, $10, or $20) will be offered to parents at baseline. Approximately one-third of the way through the OFT data collection, parents will be randomly assigned to receive an offer of either a $10 increase to the baseline incentive or no increase. About two-thirds of the way through the data collection period, one additional incentive boost will be offered to bring the total incentive amount to $40. Parents who did not receive an incentive offer at baseline (the control group) will not receive an incentive offer at any point during the data collection period.
Parents of students with disabilities will be treated separately because of the analytic importance of this population for the study. We propose to offer all parents of students with disabilities $20 at baseline and an additional $10 at each of the one-third and two-thirds points in the data collection period, not to exceed $40 total per parent.
The parents of approximately 1,750 students will be contacted in the OFT and approximately 4,938 in the IVFT. The parents of the approximately 1,188 OFT students who do not have a primary IEP designation of autism, emotional disturbance, or specific learning disability will be randomly assigned to three baseline incentive amounts ($0, $10, or $20), with 20 percent (n=238) of the parents assigned to the $0 incentive, 40 percent (n=475) assigned to the $10 incentive, and 40 percent (n=475) assigned to the $20 incentive. The 4,938 IVFT parents will be randomly assigned to the same three incentive conditions, with 1,646 assigned to each. Consequently, across both field tests, 1,884 parents will be randomly assigned to the $0 incentive, 2,121 to the $10 incentive, and 2,121 to the $20 incentive.
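The per-condition counts above follow directly from the stated totals and split percentages, as the brief check below illustrates (a sketch for verification only, using the figures given in the text):

```python
# OFT parents without a focal-disability IEP designation: 20/40/40 percent split
oft = 1188
oft_counts = {"$0": round(oft * 0.20), "$10": round(oft * 0.40), "$20": round(oft * 0.40)}

# IVFT parents: equal thirds across the same three conditions
ivft_each = 4938 // 3  # 1,646 per condition

combined = {k: v + ivft_each for k, v in oft_counts.items()}
print(oft_counts)   # {'$0': 238, '$10': 475, '$20': 475}
print(combined)     # {'$0': 1884, '$10': 2121, '$20': 2121}
```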
For a power of 0.80, a confidence level of 95 percent, and 2,121 cases within each condition, a 4.0 percentage point difference in response rate should be detectable as statistically significant in this experiment (e.g., 68.0 percent vs. 72.0 percent), using the formula provided above.10
For a power of 0.81, with 1,884 cases in one group and 2,121 cases in the other, a 4.1 percentage point difference in response rate should be detectable as statistically significant using Pearson’s chi-square test (e.g., 68.0 percent vs. 72.1 percent).11
However, the OFT and IVFT have a clustered design, with students nested within schools. Therefore, assuming an approximate design effect of 4, similar to the design effect reported for parent respondents in HSLS:09,12 which also had a clustered design with students nested within schools, the effective sample sizes for the conditions would be approximately 530 cases (2,121/4) and 471 cases (1,884/4). For a power of 0.80, a confidence level of 95 percent, and 530 cases within each condition, this experiment should be able to detect approximately a 7.8 percentage point difference in response rate as statistically significant (e.g., 68.0 percent vs. 75.8 percent). For a power of 0.81, with 471 cases in one group and 530 cases in the other, an 8.0 percentage point difference in response rate would be detectable as statistically significant using Pearson’s chi-square test (e.g., 68.0 percent vs. 76.0 percent).13
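The design-effect adjustment can be reproduced the same way: divide the nominal per-condition n by the assumed design effect of 4, then invert the two-proportion sample-size formula for the effective n. The sketch below is our own illustration (not the SAS PROC POWER computation cited in the footnotes) and lands close to the 7.8 percentage point figure reported above.

```python
from statistics import NormalDist

def min_detectable_diff(n_eff, p1, alpha=0.05, power=0.80):
    """Minimum detectable difference (p2 - p1) for n_eff cases per group,
    inverting the standard two-proportion sample-size formula by bisection."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)

    def required_n(p2):
        return z**2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p2 - p1) ** 2

    lo, hi = p1 + 1e-6, 1 - 1e-9
    for _ in range(200):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if required_n(mid) > n_eff else (lo, mid)
    return hi - p1

deff = 4
n_eff = 2121 // deff  # 530 effective cases per condition
print(round(min_detectable_diff(n_eff, 0.68), 3))  # approximately 0.077
```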
A main goal of the OFT is to better understand the recruitment strategies necessary for a large-scale nationally representative effort to obtain the targeted sample yield of grade 6 general education students and students with disabilities, and the subsequent tracing and tracking strategies necessary to maintain the student sample from the base year (when sampled students will be in grade 6) through the first follow-up (when most of the students will be in grade 7) and the second follow-up (when most of the students will be in grade 8). As described in Part A, section A.9, the OFT will include: 1) a school incentive experiment to determine whether different types and/or levels of incentives can significantly improve participation rates; 2) a student incentive experiment to determine whether, and which, appreciation tokens are effective in securing the participation of middle school students; and 3) a responsive design approach to nonresponse follow-up on the parent interview to better understand how to achieve the desired response rates in the main study.
The following individuals at NCES are responsible for the MGLS:2017: Carolyn Fidelman, Gail Mulligan, Chris Chapman, and Marilyn Seastrom. The following individuals at RTI are responsible for the study: Dan Pratt, Debbie Herget, Steven Ingels, and David Wilson, along with subcontractor staff: Sally Atkins-Burnett (Mathematica) and Michelle Najarian (ETS).
1 Sixth-grade enrollment is reported as 0 or was not reported to the CCD 2013-14 or PS 2013-14.
2 A school reports zero students or does not report the number of students in any of the three focal disability groups.
3 The reason for this is to mimic the cutoff that is planned to be used for the national data collection in order to gauge the degree to which high prevalence schools agree to participate.
4 Folsom, R.E., Potter, F.J., and Williams, S.R. (1987). Notes on a Composite Size Measure for Self-Weighting Samples in Multiple Domains. Proceedings of the Section on Survey Research Methods of the American Statistical Association, 792-796.
5 One hundred twenty-five schools will be drawn to allow for a reserve pool of 22 schools to account for lower than estimated school-participation rate.
6 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
7 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
8 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
9 Retrieved from http://www.select-statistics.co.uk/sample-size-calculator-two-proportions.
10 Retrieved from http://www.select-statistics.co.uk/sample-size-calculator-two-proportions.
11 Calculated using SAS Proc Power. https://support.sas.com/documentation/cdl/en/statugpower/61819/PDF/default/statugpower.pdf
12 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
13 Calculated using SAS Proc Power. https://support.sas.com/documentation/cdl/en/statugpower/61819/PDF/default/statugpower.pdf