This document is being carried over from the November 2015 clearance
Middle Grades Longitudinal Study of 2017–18 (MGLS:2017)
2016 Item Validation Field Test (IVFT) Data Collection
OMB# 1850-0911 v.4
Supporting Statement Part A
National Center for Education Statistics
U.S. Department of Education
Institute of Education Sciences
Washington, DC
June 2015
Revised September 2015
Revised March 2016
A.1 Importance of Information
A.2 Purposes and Uses of Data
A.3 Improved Information Technology (Reduction of Burden)
A.4 Efforts to Identify Duplication
A.5 Minimizing Burden for Small Entities
A.6 Frequency of Data Collection
A.8 Consultations Outside NCES
A.9 Payments or Gifts to Respondents
A.10 Assurance of Confidentiality
A.13 Total Annual Cost Burden
A.14 Annualized Cost to Federal Government
A.15 Program Changes or Adjustments
A.16 Plans for Tabulation and Publication
A.17 Display OMB Expiration Date
A.18 Exceptions to Certification Statement
Note:
Because the recruitment activities for the Item Validation Field Test (IVFT) approved under OMB# 1850-0911 v.3, v.5, and v.7 will not be completed by the time this request is approved, those clearance materials are included as supplemental materials with this submission. The approved burden associated with IVFT recruitment is therefore carried over and included in the total burden requested in this submission.
The Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) is the first study sponsored by the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the U.S. Department of Education, to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6–8). In preparation for the main study, the data collection instruments and procedures must be field tested.
This package requests clearance to conduct the Item Validation Field Test (IVFT) for the MGLS:2017, for which the data collection is scheduled to begin in January 2016. The primary purpose of the IVFT is to determine the psychometric properties of items and the predictive potential of assessment and survey items so that valid, reliable, and useful assessment and survey instruments can be composed for the main study.
Part A of this submission presents information on the basic design of the IVFT. Part B presents information on the collection of information employing statistical methods. Part C provides general content and item justifications for the MGLS:2017 student, parent, math teacher, special education teacher, and school administrator surveys. Appendices A through T (already approved in the recruitment package, OMB# 1850-0911 v. 3, 5, and 7) provide field test recruitment and student roster collection materials, consisting of letters to state and district officials, school principals, and parents, as well as text for an MGLS:2017 brochure, frequently asked questions, and website. Appendix U provides the survey items in survey specification format for the student, parent, math teacher, special education teacher, and school administrator surveys. Appendix V provides the facilities checklist to be completed by MGLS:2017 staff, and Appendix W provides communication materials that will be used during data collection with school administrators, teachers, students, and parents.
MGLS:2017 is the first study sponsored by NCES to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6–8). A study of the middle grades will complement NCES’s plans for implementing a multi-cohort sequence for its longitudinal studies series. This means that the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), the MGLS:2017, and the High School Longitudinal Study of 2020 (HSLS:2020) will synchronize and, within a given 10-year span, collect the full range of data on students’ school experiences as they transition from elementary school into high school. The federal government is uniquely positioned to undertake the needed comprehensive large-scale longitudinal study of a nationally representative sample of middle grade youth that includes measures of known critical influences on adolescents’ academic and socioemotional trajectories. NCES is authorized to conduct the MGLS:2017 under the Education Sciences Reform Act of 2002 (20 U.S. Code, Section 9543).
MGLS:2017 will be conducted with a nationally representative sample of students enrolled in sixth grade during the 2017–18 school year, with the base-year data collection taking place from January through March of 2018, possibly extending through June 2018. Annual follow-ups are planned for winters of the 2018-19 and 2019-20 school years, when most of the students in the sample will be in grades 7 and 8, respectively. The MGLS:2017 will provide a rich descriptive picture of the academic experiences and development of students during these critical years and will allow researchers to examine associations between contextual factors and student outcomes. There is a wealth of research highlighting the importance of mathematics and literacy skills for success in high school and subsequent associations with later education and career opportunities. Thus, the study will focus on student achievement in these areas, along with measures of student socioemotional well-being and other outcomes. The study will also include a sample of students with different types of disabilities that will provide descriptive information on their outcomes, educational experiences, and special education services.
The MGLS:2017 will rely on a set of longitudinal and complementary instruments to collect data across several types of respondents to provide information on the outcomes, experiences, and perspectives of students across grades 6, 7, and 8; their families and home lives; their teachers, classrooms, and instruction; and the school settings, programs, and services available to them. At each wave of data collection in the main study, students’ mathematics and reading skills, socioemotional development, and executive function will be assessed. Students will also complete a survey that asks about their engagement in school, out-of-school experiences, peer group relationships, and identity development. Parents will be asked about their background, family resources, and involvement with their child’s education and their school. Students’ mathematics teachers will complete a two-part survey: In part 1, they will be asked about their background and classroom instruction. In part 2, they will be asked to report on the academic behavior, mathematics performance, and classroom conduct of each study child in their classroom. For students receiving special education services, their special education teacher or provider will also complete a survey questionnaire similar in structure to the two-part mathematics teacher instrument, consisting of a teacher-level questionnaire and student-level questionnaire, but with questions specific to the special education experiences and services of the study child. School administrators will be asked to report on school programs and services, as well as on school climate.
In short, the MGLS:2017 will provide data on the development and learning that occur during students’ middle grade years (grades 6–8) and that are predictive of future success, along with the individual, social, and contextual factors that are related to successful development. A key goal of the study is to provide researchers and policymakers with the information they need to better understand the school and nonschool influences associated with mathematics and reading success, socioemotional health, and positive life development during the middle grade years and beyond. To support the development of the study, the MGLS:2017 is conducting two field tests, the IVFT beginning in January 2016, followed by the Operational Field Test (OFT) that will begin in January 2017.
The study’s success is dependent on the development of reliable, valid measures. The goal of the IVFT is to collect data to support examination of the mathematics assessment, reading assessment, executive function assessment, student survey, parent survey, and school staff surveys. The IVFT will provide the data needed to determine the psychometric properties of items and the predictive potential of assessment and survey items so that valid, reliable, and useful assessment and survey instruments can be composed for the main study. As the focus of the IVFT is the analyses of the psychometric properties of the survey items and assessments, the IVFT requires a large, diverse field test sample, though not a nationally representative one.
Gaining schools’ cooperation in voluntary research is increasingly challenging. The OFT will be used to test materials and procedures revised based on the results of the IVFT and to gain a deeper understanding of effective recruitment strategies that lead to higher response rates and thus better data quality. The OFT will include a responsive design approach for non-responders and will allow NCES to tighten assessment and survey timing, so as to maximize the overall functionality of the assessments and surveys while minimizing the time it takes respondents to complete them. The OFT results will inform modifications to the main study materials and procedures. With the focus of the OFT on recruitment strategies, tactics for retention of the sample within the study, and the operational administration of the surveys and assessments, the OFT will require a close to nationally representative sample.
The IVFT will take place in January through June 2016. Its purpose is to evaluate a battery of student assessments (i.e., mathematics and reading skills and executive function) and survey instruments (i.e., student survey, parent survey, and school staff surveys) for use in the MGLS:2017 OFT and later in the MGLS:2017 main study. The IVFT will collect data for a sample of children enrolled in 5th through 8th grade as of January 2016 and will provide the much-needed information to establish the validity and reliability of the direct assessments and surveys.
Field Test Components
The IVFT includes the following components: student assessments and student survey, parent survey, math teacher survey, special education teacher survey, and school administrator survey.
Student Assessments and Student Survey. Students will participate in assessments and a survey, designed to take a total of approximately 90 minutes per student.
Mathematics Assessment. The MGLS:2017 main study mathematics assessment will be a 30-minute, two-stage adaptive assessment that students will take on a tablet computer. The focus will be on domains of mathematics that are most likely to be the central focus of middle school learning now and in the future: the Number System, Ratios and Proportional Relationships, Expressions and Equations, and Functions. To ensure that the study is sensitive to the variation in students’ mathematics ability, the assessment will include items with appropriately varying cognitive demand. The MGLS:2017 mathematics assessment will provide valuable information about the development of middle grade students’ knowledge of mathematics and their ability to use that knowledge to solve problems, moving toward stronger reasoning and understanding of more advanced mathematics.
Reading Assessment. The MGLS:2017 reading assessment will use a two-stage adaptive assessment design consisting of a brief routing block (first stage: approximately 10 minutes) followed by a skill-based block (second stage: approximately 20 minutes), for a total of 30 minutes. The routing block will include items that measure foundational components of reading that are important for comprehension: Vocabulary, Morphological Awareness, and Sentence Processing. Performance on the routing block will direct students to one of three types of skill-based reading blocks (reading components, basic comprehension, or scenario-based comprehension) within the second stage.
The second-stage basic components skill block will be used to gather more information on the foundational reading component skills, including those measured in the first stage as well as word recognition and decoding skills. The basic components block will also capture information about students’ efficiency at basic reading comprehension and ability to comprehend short passages. The second-stage basic comprehension skill block is designed to gather information about students’ efficiency at basic reading comprehension and their ability to comprehend short passages. This skill-based block will measure comprehension in a traditional design where unrelated passages and corresponding questions are presented. The second-stage scenario-based comprehension skill block is designed to gather information about students’ ability to comprehend informational text and reason more deeply about text and to apply what they learn from passages. The scenario-based block will include a scenario or a purpose for reading (e.g., preparing for a classroom discussion or creating a website on a topic).
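To make the two-stage routing concrete, here is a minimal illustrative sketch in Python; the block names follow the description above, but the score cutoffs and the function itself are hypothetical placeholders rather than the study's actual routing rules, which will be informed by the IVFT results.

```python
def route_second_stage(routing_score: float) -> str:
    """Illustrative routing from the first-stage block to a second-stage block.

    The cutoffs below are hypothetical; operational thresholds would be set
    from IVFT item statistics, not hard-coded values like these.
    """
    LOW_CUTOFF = 0.35    # assumed proportion-correct boundaries (placeholders)
    HIGH_CUTOFF = 0.70
    if routing_score < LOW_CUTOFF:
        return "reading components"            # foundational skills block
    if routing_score < HIGH_CUTOFF:
        return "basic comprehension"           # short, unrelated passages
    return "scenario-based comprehension"      # purpose-driven reading tasks

# Example: a student answering 80 percent of routing items correctly would be
# directed to the scenario-based comprehension block under these placeholders.
print(route_second_stage(0.80))
```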
Executive Function Measures. Executive function, a set of capacities and processes originating in the prefrontal cortex of the brain, permits individuals to self-regulate, engage in purposeful and goal-directed behaviors, and conduct themselves in a socially appropriate manner. Self-regulation is needed for social success, academic and career success, and good health outcomes. Executive function includes capacities such as shifting (cognitive and attention flexibility), inhibitory control, and working memory. Four different executive function measures will be included in the field tests: Stop Signal (inhibitory control), 3-Back with verbal stimulus (working memory), 2-Back with nonverbal stimulus (working memory), and the Hearts and Flowers task (shifting or cognitive flexibility).
Student Survey. The purpose of the student survey is to collect information on students’ attitudes and behaviors; out-of-school time use; and family, school, and classroom environments. The student survey will also serve as a source for information about socioemotional outcomes having to do with social relationships, support, and school engagement.
Parent Survey. The parent survey will take 30 minutes to complete via a self-administered web-based questionnaire; a telephone interview follow-up will be available for respondents who do not complete the questionnaire via the Web. The parent survey will focus on supplementing the information collected from students and teachers about the students’ educational experiences and on learning about parents’ expectations for their children’s academic attainment in high school and beyond. It will also collect information about family involvement in the children’s education and about family characteristics that are key predictors of academic achievement and other student outcomes.
Mathematics Teacher Survey/Teacher Student Report. The mathematics teacher survey will consist of two parts: a teacher survey and a series of teacher student reports (TSRs). Both the mathematics teacher survey and the TSR will be web-based, self-administered surveys, with a phone interview option available. The mathematics teacher survey is expected to take approximately 20 minutes to complete, and the TSR will take 10 minutes for each student who is rated. The mathematics teacher survey will collect data on potential classroom-level correlates of students’ mathematics achievement as well as school-level services and factors such as special programs, school climate, and instructional leadership.
Teacher responses to the TSR will capture information specific to the sampled student and his or her mathematics class. It will provide information on the classroom attendance and performance of individual students, which will augment direct student assessments, transcript information, and student and parent reports. The TSR will also serve as an additional source for data on student socioemotional outcomes related to regulation, school engagement, and externalizing behaviors. In the web version of this instrument, teachers will be given a list of the students for whom they should complete a TSR and will click on each student’s name to launch the TSR for that specific student. If a teacher opts not to complete the web-based survey, a follow-up phone interview will be conducted.
Special Education Teacher Survey/Teacher Student Report. Like the mathematics teacher survey, the special education teacher/service provider survey will consist of two parts. The first part consists of the teacher questionnaire, which asks questions about the teacher’s background and experiences working with students with disabilities. The second part consists of the TSR, which contains specific questions about special education services and other contextual variables for sampled children with an Individualized Education Program (IEP), as well as ratings of individual academic and life skills (the special educator rating scale, SPERS).
The special education teacher survey will be web based and self-administered, with a phone interview option available. The first part of the survey will take approximately 10 minutes to complete, and the second part will take about 25 minutes for each student who is rated. In the second part of the web version of this instrument, teachers will be given a list of the students for whom they should complete the survey and will click on each student’s name to launch this part for that specific student.
School Administrator Survey. The school administrator survey will be web based and self-administered, with a telephone option available, and will take the administrator (generally, the principal or principal’s designee) approximately 20 minutes to complete. The school administrator survey will collect information about a school’s characteristics and staffing (specifically, the school’s structure and climate, including safety, organization, and support). It will also collect information on the student population, student conduct, academic culture, course offerings, and extended learning opportunities (e.g., extracurricular activities, summer school, or supports for struggling students).
Administration of Assessments and Survey Components
In the IVFT, students’ parents, math teachers, special education providers (as applicable), and school administrators will be asked to complete surveys as described above. To keep student participation to approximately 90 minutes and gain as much information on as many assessment and survey items as possible, the IVFT will employ a spiral design in which not all students will receive the same assessments and survey items. Table 1 below presents a summary of the student assessment and survey booklet spiral design.
Table 1. Item Validation Field Test (IVFT) Student Assessment and Survey Spiral Design

Booklet 1 | Math assessment | Demographic items | Reading assessment (two-stage) | Executive function task: Stop Signal | Theories of Intelligence (general)
Booklet 2 | Math assessment | Executive function task: Hearts & Flowers | No reading assessment | Demographic items | Student questionnaire
Booklet 3 | Math assessment | Executive function task: 3-Back | No reading assessment | Demographic items | Student questionnaire
Booklet 4 | Math assessment | Demographic items | Reading assessment (two-stage) | Executive function task: 2-Back | No student questionnaire
Booklet 5 | Math assessment | Demographic items | Reading assessment (two-stage) | No executive function task | No student questionnaire
Booklet 6 | Math assessment | Demographic items | Reading assessment (two-stage) | No executive function task | No student questionnaire
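The following minimal sketch illustrates one common way a booklet spiral like the one in Table 1 can be implemented, cycling the six forms across a student list; it is offered only as an illustration and is not the study's actual assignment procedure.

```python
from itertools import cycle

def spiral_booklets(student_ids, n_booklets=6):
    """Assign booklet forms 1..n_booklets cyclically across a roster so each
    form is administered to roughly the same number of students."""
    forms = cycle(range(1, n_booklets + 1))
    return {student: next(forms) for student in student_ids}

# Example: eight students in one session; forms repeat after every six students.
print(spiral_booklets([f"S{i:02d}" for i in range(1, 9)]))
```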
School Recruitment Approach
The student sample for the IVFT, while not required to be nationally representative for psychometric analysis, will include students in the typical age range found in grades 5–8 in the United States; these students will likely demonstrate a range of ability on the constructs being measured by the MGLS:2017 item pool. The sample will also include a subset of students from three focal disability groups (learning disability, autism, and emotional disturbance) who are able to take standardized tests using accommodations. Schools will be recruited both directly and potentially at the district level.
Where feasible, available technology will be used to reduce burden and improve efficiency and accuracy. Web-based surveys and other computer-assisted methods will be used to collect data from students, parents, teachers, and school administrators. Specifically, the student assessments and surveys will be administered via a tablet computer. The parent, teacher, and school administrator surveys will all be offered as web-based surveys.
The MGLS:2017 will not be duplicative of other studies. While NCES longitudinal studies have contributed to our understanding of the factors that influence student success and failure in school, the middle grades (grades 6–8) are noticeably absent from the studies conducted to date. A majority of nationally representative longitudinal studies have focused on high school students and on the transition from secondary to postsecondary education: e.g., the High School and Beyond Longitudinal Study (HS&B) and the Education Longitudinal Study of 2002 (ELS:2002). The Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), and the National Education Longitudinal Study of 1988 (NELS:88) collected data on students in grade 8, but neither included a data collection in grades 6 or 7. The ECLS-K:2011 will not follow students beyond grade 5, and the High School Longitudinal Study of 2009 (HSLS:09) began with a national sample of students in grade 9. Thus, there is little information at the national level about the learning that occurs during grades 6–8 and about the rates of learning for different groups of students who may experience diverse school environments and opportunities.
The MGLS:2017 is unique in that it will assess students’ mathematics and reading achievement, as well as other student outcomes (e.g., executive function and socioemotional development), for the same group of students over a 3-year period. In addition to the ECLS-K and NELS:88, other national studies have assessed some of these outcomes for students in grade 8, including the National Assessment of Educational Progress (NAEP) and the Trends in International Mathematics and Science Study (TIMSS). These studies, however, are cross-sectional and do not include repeated measures of achievement or assess multiple subjects and areas of development for the same sample of students. Therefore, they cannot answer questions about students’ growth in mathematics and reading over the middle grade years, about differences in the rates of growth for different populations (e.g., differences by sex, by race/ethnicity, and for students attending public and private schools), and about the school and nonschool factors that may facilitate or hinder this growth. Nor can they explore questions about the relationships between student achievement and other school outcomes and executive functions (e.g., working memory, attention, and inhibitory control) that work to regulate and orchestrate cognition, emotion, and behavior to enable a student to learn in the classroom. MGLS:2017 will also be unique in its focus on obtaining a sample of students in three disability categories that can be studied on their own or compared to general education students over the three middle school years.
Other adolescent development studies have been conducted, but they often do not include a grade 6 sample. For example, the youngest children in the National Longitudinal Study of Adolescent to Adult Health (Add Health) and the Maryland Adolescent Development in Context Study (MADICS) were in grade 7 at baseline. Many of these studies collected data on local samples, had a primary focus on family and child processes, and were started in the 1990s: e.g., MADICS and the Michigan Study of Adolescent and Adult Life Transitions (MSALT). As such, they do not provide a contemporary picture of U.S. students in grades 6–8.
Although small entities are not part of this study, in general, burden will be minimized wherever possible. During district and school recruitment, we will minimize burden by training recruitment staff to make their contacts as straightforward and concise as possible. The recruitment letters and materials (e.g., the study description and FAQs) are designed to be clear, brief, and informative. In addition, contractor staff will conduct all test administration and will assist with parental notification, sampling, and other study tasks as much as possible within each school.
The MGLS:2017 IVFT is a one-time data collection that will take place in January through June 2016.
There are no special circumstances involved with the recruitment and data collection for the IVFT.
As part of the MGLS:2017 design contract, content experts were consulted in the development of the assessments and questionnaires. These experts are listed by name, affiliation, and expertise in table 2.
Table 2. Members of the MGLS:2017 Content Review Panels

Name | Affiliation | Expertise

Mathematics Assessment Content Review Panel (June 18–19, 2013)
Tom Loveless | Brookings Institution | Policy, math curriculum
Linda Wilson | Formerly with Project 2061 | Math education, math assessment, middle school assessment, author of NCTM Assessment Standards for School Mathematics and NAEP math framework, teacher
Kathleen Heid | University of Florida | Math education, use of technology, teacher knowledge, NAEP Grade 8 Mathematics Standing Committee member
Edward Nolan | Montgomery County Schools, Maryland | Math curriculum and standards, large-scale assessment of middle grade students
Lisa Keller | University of Massachusetts, Amherst | Psychometrics, former math teacher
Paul Sally | University of Chicago | Math education, mathematics reasoning, mathematically talented adolescents
Margie Hill | University of Kansas | Co-author of Kansas mathematics standards, former NAEP Mathematics Standing Committee member, former district math supervisor

Executive Function Content Review Panel (July 18, 2013)
Lisa Jacobson | Johns Hopkins University; Kennedy Krieger Institute | Development of executive functioning skills, attention, neurodevelopmental disorders, and parent and teacher scaffolding
Dan Romer | University of Pennsylvania | Adolescent risk taking
James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning

Socioemotional-Student-Family Content Review Panel (July 25–26, 2013)
James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning
Russell Rumberger | University of California, Santa Barbara | School dropouts, ethnic and language minority student achievement
Tama Leventhal | Tufts University | Family context, adolescence, social policy, community and neighborhood indicators
Susan Dauber | Bluestocking Research | School organization, educational transitions, urban education, parent involvement and family processes
Scott Gest | Pennsylvania State University | Social networking, social skills, longitudinal assessment of at-risk populations
Kathryn Wentzel | University of Maryland | Social and academic motivation, self-regulation, school adjustment, peer relationships, teacher-student relationships, family-school linkages
Richard Lerner | Tufts University | Adolescent development and relationships with peers, families, schools, and communities

School Administrator Content Review Panel (August 16, 2013)
Susan Dauber | Bluestocking Research | School organization, educational transitions, urban education, parent involvement and family processes
George Farkas | University of California, Irvine | Schooling equity and human resources
Jeremy Finn | State University of New York at Buffalo | School organization, school dropouts
Edward Nolan | Montgomery County Schools, Maryland | Large urban school system administrator
Tom Loveless | Brookings Institution | Policy, math curriculum

Reading Assessment Content Review Panel (April 14, 2014)
Donna Alvermann | University of Georgia | Adolescent literacy, online literacy, codirector of the National Reading Research Center (funded by the U.S. Department of Education)
Joseph Magliano | Northern Illinois University | Cognitive processes that support comprehension, the nature of memory representations for events depicted in text and film, strategies to detect and help struggling readers
Sheryl Lazarus | University of Minnesota | Education policy issues related to the inclusion of students with disabilities in assessments used for accountability purposes, student participation and accommodations, alternate assessments, technology-enhanced assessments, teacher effectiveness, large-scale assessments, school accountability, research design (including cost analyses), data-driven decision making, rural education, the economics of education

Disabilities Content Review Panel (April 29, 2014)
Jose Blackorby | SRI International | Autism, specific learning disabilities, special education, curriculum design, alternate student assessment, large-scale studies of students with disabilities, codirector of the Special Education Elementary Longitudinal Study (SEELS)
Lynn Fuchs | Vanderbilt University | Specific learning disabilities, student assessment, mathematics curriculum, psychometric models
Mitchell L. Yell | University of South Carolina | Autism, emotional and behavior disorders, specific learning disabilities, pre-K–12 instruction and curriculum, special education, evidence-based intervention
Sheryl Lazarus | University of Minnesota | Special education policy, inclusion of students with disabilities in assessments, accommodations, alternate assessments, technology-enhanced assessments, large-scale assessments, school accountability, research design (including cost analyses)
Martha Thurlow | University of Minnesota | Specific learning disabilities, reading assessment, alternate student assessment, early childhood education, special education, curriculum, large-scale studies
Diane Pedrotty Bryant | University of Texas, Austin | Educational interventions for improving the mathematics and reading performance of students with learning disabilities, the use of assistive technology for individuals with disabilities, interventions for students with learning disabilities and who are at risk for educational difficulties

Expert Meeting, Middle Grades Experts (January 23, 2015)
Nancy Flowers | University of Illinois at Urbana-Champaign | Program evaluation, large-scale data collection, research methods
Deborah Kasak | National Forum to Accelerate MG Reform | Education policy, school reform, Schools to Watch
Doug MacIver | Johns Hopkins University | School reform, adolescent engagement, learning and achievement
Margaret McLaughlin | University of Maryland | Special education policy, students with disabilities
Steve Mertens | Illinois State University | Teacher preparation, school reform, evaluation
Karen Swanson | Mercer University | Curriculum and instruction, transformative education, faculty professional development

Expert Meeting, Students with Disabilities (April 2, 2015)
Jose Blackorby | SRI International | Autism, specific learning disabilities, special education, curriculum design, alternate student assessment, large-scale studies of students with disabilities, codirector of the Special Education Elementary Longitudinal Study (SEELS)
Jacqueline Buckley | Institute of Education Sciences, National Center for Special Education Research | Large-scale studies of students with disabilities
Richelle Davis | Special Education and Rehabilitative Services, Office of Special Education Programs | Large-scale studies of students with disabilities
Lindsey Jones | National Council for Learning Disabilities | Large-scale studies of students with disabilities
Margaret McLaughlin | University of Maryland | Special education policy, students with disabilities
Kim Sprague | Institute of Education Sciences | Large-scale studies of students with disabilities
Jim Weindorf | National Council for Learning Disabilities | Large-scale studies of students with disabilities
High levels of school participation are critical to the success of the IVFT. School administrator, mathematics teacher, special education teacher, parent, and student data collection activities are contingent on school cooperation. NCES recognizes that the burden level of the study is one of the factors that school administrators will consider when deciding whether to participate. To offset the perceived burden of participation, NCES intends to continue to use strategies that have worked successfully in other major NCES studies (e.g., ECLS-K, ECLS-K:2011, HS&B, NELS:88, and ELS:2002), including offering both monetary and non-monetary incentives. Table 3 summarizes the proposed incentive amount for each instrument and activity along with their estimated administration times; a brief justification for each incentive amount follows table 3.
Table 3. Item Validation Field Test (IVFT) Instruments and Proposed Incentive Amounts

Instrument/Activity | Administration Time* | Field Test Incentives
Student Assessments and Survey (Math, Reading, Executive Function, and Student Survey) | 90 minutes | No monetary incentive
Parent Survey | 30 minutes | None, $20, or $40
Mathematics Teacher
Teacher Survey | 20 minutes | $20
Teacher Student Report | 10 minutes per student | $7 per TSR
Special Education Teacher
Teacher Survey | 10 minutes | $20
Teacher Student Report | 25 minutes per student | $7 per TSR
School Administrator Survey | 20 minutes | No monetary incentive
School Participation
School Coordinator (logistics, on-site visit, consent forms, administrative records, etc.) | 6 hours for consent assistance; 2 hours to schedule assessments; 2 hours to set up web access and coordinate computer labs; 6 hours to provide administrative records | $200, $400, or $400 in materials or services for the school; $150 for the coordinator
*Note that the assessment administration time may be longer for students with disabilities.
Students
There is no monetary incentive in the IVFT for students.
Parents
Parent survey response rates have declined over the past decade. The ECLS-K:2011 baseline (fall 2010) parent survey response rate (74 percent)1 was more than 10 percentage points lower than the parent survey response rate in the corresponding 1998 wave of the ECLS-K (85 percent).2 Additionally, the 9th grade parent survey response rate for the HSLS:09 baseline was 68 percent.3 The MGLS:2017 parent survey is a key component of the data being collected. To improve the chances of obtaining higher parent participation rates in a school-based design, we will work with school personnel to recruit sample students’ parents into the MGLS:2017 and will conduct an experiment in the IVFT to determine the effect of different levels of monetary incentives on parent participation.
In the IVFT, an experiment will be used to determine how offering parents of middle grade students a $0, $20, or $40 incentive for completing the parent questionnaire affects response rates and the cost and length of nonresponse follow-up. The experiment will also evaluate whether parents of children with disabilities, who may be more reluctant to engage in this study and who may require more frequent and extensive nonresponse follow-up, are influenced differently by the offer of a monetary incentive than are parents of students without disabilities.
For parent incentives, each school will be randomly assigned to one of the three experimental conditions. Therefore, all parents asked to complete the parent survey within a school will be assigned to the same condition. Parents in one-third of the schools will be asked to complete the parent survey, but will not be offered a monetary incentive for doing so; parents in another one-third of the schools will be offered $20 to complete the survey; and parents in the remaining one-third of the schools will be offered $40 to complete the survey. We will monitor the response rate in each group and document the level of effort needed to obtain the response rates achieved under the different incentive/no-incentive options. All groups will receive similar reminders and other modes of follow-up contact. The number of contact attempts to achieve the final response rates will be measured to compute the potential resource savings, if any, of each incentive payment relative to no incentive payment.
For the IVFT, as shown in Part B, section B.1, we plan for 3,950 participating students. Assuming an 80 percent response rate from students, we will need to obtain parent consent for 4,938 students. Therefore, within the IVFT, we will be seeking parent surveys from the parents of 4,938 students. Assuming these 4,938 cases are split approximately equally across schools and conditions, there will be approximately 1,646 cases within each incentive level.
As stated, the 9th grade parent survey response rate for the HSLS:09 baseline was 68 percent. For a power of 0.80, a confidence level of 95 percent, and 1,646 cases within each condition, this experiment should be able to detect a 4.5 percentage point difference in response rates as statistically significant (e.g., 68.0 percent vs. 72.5 percent). The formula is provided below.4
n = (Z_(α/2) + Z_β)² × [p₁(1 − p₁) + p₂(1 − p₂)] / (p₁ − p₂)²
where Z_(α/2) is the critical value of the normal distribution at α/2 (e.g., for a confidence level of 95 percent, α is 0.05 and the critical value is 1.96), Z_β is the critical value of the normal distribution at β (e.g., for a power of 80 percent, β is 0.2 and the critical value is 0.84), and p₁ and p₂ are the expected sample proportions of the two groups.
However, the IVFT has a clustered design with students nested in schools. Assuming a design effect of approximately 4, similar to that reported by HSLS:09 for parent respondents5 (which also had a clustered design with students nested in schools), the effective sample size within any condition would be approximately 412 cases (1,646/4). For a power of 0.80, a confidence level of 95 percent, and 412 cases within each condition, this experiment should be able to detect approximately an 8.5 percentage point difference in response rates as statistically significant (e.g., 68.0 percent vs. 76.5 percent).
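As an illustration only, the following sketch (in Python, with function names of our choosing) reproduces the arithmetic above: the consent target implied by an 80 percent student response rate, the three-way split across incentive conditions, and the minimum detectable difference from the quoted two-proportion formula, with and without the assumed design effect of 4.

```python
from scipy.stats import norm

def required_n(p1, p2, alpha=0.05, power=0.80):
    """Per-condition n from the two-proportion formula quoted above."""
    z_a = norm.ppf(1 - alpha / 2)   # 1.96 for a 95 percent confidence level
    z_b = norm.ppf(power)           # 0.84 for 80 percent power
    return (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2

def min_detectable_diff(p1, n_effective, step=0.001):
    """Smallest increase over p1 whose required n fits within n_effective."""
    p2 = p1 + step
    while required_n(p1, p2) > n_effective:
        p2 += step
    return p2 - p1

consented = 3950 / 0.80          # ~4,938 students for whom consent is sought
n_per_condition = consented / 3  # ~1,646 parent cases per incentive condition

# Unclustered: ~4.5 percentage points detectable from a 68 percent base rate.
print(round(min_detectable_diff(0.68, n_per_condition), 3))

# With a design effect of 4 (students nested in schools), the effective n is
# ~412 per condition and the detectable difference grows to roughly the
# 8.5 percentage points cited in the text.
print(round(min_detectable_diff(0.68, n_per_condition / 4), 3))
```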
Teachers
The incentive proposed for students’ teachers is $20 per teacher survey, plus $7 per teacher student report (TSR). These amounts are consistent with the amounts used in current NCES studies, such as the ECLS-K:2011. While it is estimated that the mathematics teacher survey will take longer to complete (20 minutes) than the special education teacher survey (10 minutes), the reverse is true for the individual student reports. The individual student reports will require approximately 10 minutes per student to complete for mathematics teachers and 25 minutes per student for special education teachers (including 5 minutes for an indirect assessment of the student’s skills, the SPERS). We are proposing to use the same incentive structure for all teachers, regardless of the specific questionnaires they are being asked to complete, to protect against any perception of unfairness that might result if teachers within a school talk to one another about the amount they have received for a specific questionnaire.
Schools and School Coordinators
As part of IVFT school recruitment, we propose to conduct an incentive experiment in which each school will be randomly assigned to one of three experimental conditions. Given the many demands and outside pressures that schools already face, it is essential that they see that MGLS:2017 staff understand the additional burden being placed on school staff when requesting their participation. The study asks for many kinds of information and cooperation from schools, including a student roster with basic demographic information (e.g., date of birth, sex, and race/ethnicity); information on students’ IEP status; math and special education teacher and parent contact information; permission for field staff to be in the school for up to a week; space for administering student assessments; permission for students to leave their normal classes for the duration of the assessments; and information about the students’ teachers and parents. For sample students with disabilities, on average, five students in each school will be selected based on disability category, and many will require accommodations and different assessment settings, such as individual administration and smaller group sessions. Working with the data collection contractor to assess these students will place even more of a burden on the participating schools.
In Condition 1, the baseline condition, we will offer schools a $200 incentive for participation. This amount is consistent with the amount offered for participation in other NCES studies, such as the ECLS-K, ECLS-K:2011, TIMSS, and the Program for International Student Assessment (PISA). However, based on previous difficulties in recruiting schools for the originally approved MGLS:2017 field test recruitment, and the general decline in school participation in NCES longitudinal studies over the years, we also propose to test offering one-third of the sample schools $400 (Condition 2), and one-third of schools a choice of one of several non-monetary incentives equivalent to $400 (Condition 3). The list of the non-monetary incentive choices is provided in Table 4.
Table 4. Non-Monetary Incentive Choices for Schools in Experimental Condition 3

Incentive | Value
Registration for Association for Middle Level Education (AMLE) or Regional Annual Meeting | $400
Two-Year School Membership in AMLE | $400
Membership in Regional ML Organization plus Subscriptions to Professional Journals | $400
Professional Development Webinar | $400
School Supplies | $400
Library of Middle Level Publications | $400
The school incentive experiment, with the same three experimental conditions, will be repeated during the MGLS:2017 Operational Field Test (OFT), which will be conducted in January through June 2017 and which will follow the same recruitment procedures as the IVFT.
The purpose of the IVFT is to test the instruments on at least 1,200 students in each of grades 6 through 8, 350 students in grade 5, and at least 200 respondents in each of three disability groups: specific learning disability, autism, and emotional disturbance. To achieve this goal, at least 58 schools must participate in the IVFT. A convenience sample of about 250 schools will be selected for the IVFT, from which the 58 participating schools will be recruited. This not only assures the attainment of the requisite number of participating schools but also provides increased power for the previously proposed school incentive experiments. The larger school sample accounts for the challenge of securing school participation for the IVFT, given the brevity of the period between the start of recruitment and the start of IVFT data collection (September 2015 to January 2016). As originally proposed, schools will be randomly assigned to one of three incentive treatments: $200, $400, or $400 in materials or supplies.
The 250 schools selected for the IVFT will all be recruited at the same time. Of the schools that agree to participate, 58 representing a diversity of demographics will be selected for the IVFT. All schools that agree to participate will receive their assigned incentive regardless of whether they are selected, which will enable us to fully carry out the incentive experiment with all sampled schools. Schools included in this sample of 250 for the IVFT are considered Tier 1 schools.
A study of this nature has not previously been undertaken and it is unknown whether 58 schools will be sufficient to attain the desired yield of students in each of the grades and disability groups. If it is determined that additional schools, beyond the 58, are needed to achieve the desired student yield within each of the subgroups, additional schools, referred to as Tier 2 schools, will be recruited to participate in only the student component of the study. For the purpose of the IVFT, collecting data from school staff and parents in the 58 participating sample schools should be sufficient to inform the operational field test and main study questionnaire testing. Thus, only students will be assessed in Tier 2 schools beyond the initial 58 participating schools to achieve the desired yield targets.
Tier 2 schools will be identified through a variety of means, including the following activities:
School officials (and district officials, if applicable) may respond positively to volunteer participation requests made by middle grades research and policy community organizations and representatives, including the Association for Middle Level Education (AMLE) and the National Forum to Accelerate Middle Grades Reform (the Forum). MGLS:2017 study representation (including the NCES project officer and RTI associate project director) and visibility (an exhibit booth and study update presentation) at the AMLE annual conference in October 2015 will provide information about the study and may give schools and districts a mechanism to express their interest.
Project personnel may identify volunteer schools through networking, based on professional and personal relationships with various school and district officials.
District officials who agree to have their sampled schools participate in the study may offer to include additional schools in their district if needed.
School officials at Tier 2 schools may suggest additional schools that might be potential Tier 2 volunteers.
The opportunity for their students to participate in field-testing assessments for a national study is sometimes of considerable interest to school officials, and securing such “as needed” volunteer schools will safeguard the success of the IVFT. For these as-needed volunteer schools, depending on the school configurations and the needs of the IVFT, participation may also be restricted to a subset of grades (e.g., one school may volunteer to have only its 5th-graders participate and another may ask that only 8th-graders be included). Thus, Tier 2 schools will be considered an as-needed reserve pool of schools, and their participation in the IVFT will depend on the student yield overall and by various categories (e.g., grade level, disabilities oversample, school characteristics, and student characteristics).
It is estimated that 250 Tier 1 schools will be recruited in the IVFT in order to yield the 50 to 58 schools that will participate in each data collection. The IVFT and OFT school incentive experiment data will be combined for analysis, increasing the analytic sample size to approximately 375 sample schools. To control for field test membership, a variable indicating the field test to which the school belonged will be included along with an interaction term.
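One way the pooled incentive analysis described above could be specified is sketched below: a logistic model of school participation on incentive condition, a field-test indicator, and their interaction. The variable names, the statsmodels formula interface, and the simulated data (including the assumption of roughly 125 OFT schools implied by the pooled total of about 375) are our own illustrative choices, not the study's analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical pooled sample: 250 IVFT schools plus ~125 OFT schools, each
# randomly assigned to one of the three incentive conditions; participation
# is simulated here purely so the sketch runs end to end.
n = 375
schools = pd.DataFrame({
    "condition": rng.choice(["cash_200", "cash_400", "inkind_400"], size=n),
    "field_test": np.repeat(["IVFT", "OFT"], [250, 125]),
    "participated": rng.binomial(1, 0.25, size=n),
})

# Participation modeled on incentive condition, field test membership, and
# their interaction, mirroring the controls described in the text.
model = smf.logit("participated ~ C(condition) * C(field_test)", data=schools)
print(model.fit().summary())
```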
School coordinators will be offered a $150 monetary incentive. They play an especially important role in the study and are critical to its success. The coordinator in each participating school will coordinate logistics with the data collection contractor; compile and supply to the data collection contractor a list of eligible students for sampling; communicate with teachers, students, and parents about the study to encourage their participation; distribute and collect parental consent forms; and assist the test administrator in ensuring that the sampled students attend the testing sessions. As described above for schools that agree to participate but are not selected for participation, the school coordinators in these schools will also receive the incentive for the work performed prior to learning that their school would not be selected (e.g., providing student list for sampling and coordinating other logistics for the data collection).
NCES is authorized to conduct this study under the Education Sciences Reform Act of 2002 (20 U.S. Code, Section 9543). By law, the data provided by schools, staff, parents, and students may be used only for statistical purposes and may not be disclosed or used in identifiable form for any other purpose except as required by law (20 U.S. Code, Section 9573). The laws pertaining to the collection and use of personally identifiable information will be clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and informational materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential. A list of middle grade students with IEPs will be requested from school districts and/or schools under the FERPA exception to the general consent requirement that permits disclosures to authorized representatives of the Secretary for the purpose of evaluating Federally supported education programs (34 CFR §§ 99.31 (a)(3)(iii) and 99.35). This information will be securely destroyed when no longer needed for the purposes specified in 34 CFR § 99.35.
The confidentiality plan developed for the MGLS:2017 requires that all contractor and subcontractor personnel and field workers who will have access to individual identifiers sign confidentiality agreements and notarized nondisclosure affidavits. The plan also requires that all personnel receive training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses. NCES understands the legal and ethical need to protect the privacy of the MGLS:2017 respondents and has extensive experience in developing data files that meet the government’s requirements to protect individually identifiable data from disclosure. The data files, accompanying software, and documentation will be delivered to NCES by the data collection contractor at the end of the project. Neither names nor addresses will be included in any data file.
The MGLS:2017 field test is a voluntary study, and no persons are required to respond to the questionnaires or to participate in the assessments. In addition, respondents may decline to answer any question they are asked. This voluntary aspect of the study is clearly stated in the advance letter mailed to adult respondents, in other study materials such as the Frequently Asked Questions, and in the instructions on web and hardcopy questionnaires, and it is reiterated by field staff and telephone interviewers in their contacts with respondents. Data collection staff training also stresses the voluntary nature of participation to ensure that all staff communicate it to participants and follow the guidelines. Additionally, students may refuse to participate during the assessments, and study field staff are trained to respect students’ wishes.
The items found in the school administrator and teacher (mathematics and special education) surveys are not of a sensitive nature and should not pose sensitivity concerns to respondents. However, to achieve the study’s primary goal of describing the development, academic outcomes, and characteristics of middle grades students, we will be asking parents a few questions that could be viewed as sensitive by some respondents. Questions about family income, disciplinary practices, their child’s disabilities, and problems their child may be having at school are included in the parent survey. These types of questions have been asked in many large-scale studies of school-age children, including the ECLS-K, ECLS-K:2011, and HSLS:09. These questions are central to describing the middle grades population and to examining the variability in students’ development, mathematics and reading achievement, and other student outcomes.
The student questionnaire includes a few questions that could be sensitive for some students. Questions about internalizing attitudes or behaviors, perceptions of competencies in mathematics, and school and class attendance are included in this self-report survey. Students are also asked to self-report their race/ethnicity and sex, which could be sensitive questions for students at this age. The questions that are included in the student survey have been asked in other studies of adolescents and the responses to these questions have been found to help explain why some students do better than others in school and are more engaged in learning.
Table 5 shows the expected burden for districts, schools, and parents during the IVFT. As shown in Part B, we anticipate contacting approximately 250 Tier 1 schools to reach the approximately 58 schools needed for participation, and contacting the parents of approximately 6,172 students to yield approximately 3,950 participating students. In order to draw samples of students with disabilities, we may need to obtain student records information from up to four districts. We anticipate needing to contact up to 12 districts to gain participation from four.
Table 5. Data Collection Burden Estimates

Item Validation Field Test (IVFT) | Sample Size | Expected Response Rate | Number of Respondents | Number of Responses | Average Burden Time (minutes) | Total Burden (hours) | Respondent Average Hourly Wage¹ | Estimate of Respondent Labor Cost

Students and Parents
Student Survey | 6,172 | 64% | 3,950 | 3,950 | 20 | 1,317 | $7.25 | $9,548
Student Assessment² | 6,172 | 64% | 3,950 | 3,950 | 70 | 4,608 | ‒ | ‒
Students' Parents | 6,172⁴ | 64% | 3,950⁴ | 3,950 | 30 | 1,975 | $22.71 | $44,852

Students' Math Teachers
Teacher-level, teacher characteristics | 522 | 92% | 480 | 480 | 13 | 104 | $27.70 | $2,881
Teacher-level, classroom characteristics | 522* | 92% | 480* | 480 | 7 | 56 | $27.70 | $1,551
Teacher report on student | 522* | 92% | 480* | 3,950 | 10 | 658 | $27.70 | $18,227

Students' Special Education Teachers
Teacher-level survey | 174 | 92% | 160 | 160 | 10 | 27 | $28.65 | $774
Teacher report on student | 174* | 92% | 160* | 552 | 25 | 230 | $28.65 | $6,590

School Administrators and Coordinators
Students' school administrators | 58 | 99% | 57 | 57 | 20 | 19 | $44.13 | $838
School coordinator | 58 | 100% | 58 | 58 | 720 | 696 | $26.94 | $18,750

TOTAL for data collection activities | - | - | 4,705 | 13,637 | - | 5,082 | - | $104,011
Approved total for recruitment³ | - | - | 6,454 | 6,454 | - | 1,424 | - | $40,800
Total for all IVFT activities | - | - | 11,159 | 20,091 | - | 6,506 | - | $144,811
1 The average hourly earnings in the 2014 National Compensation Survey sponsored by the Bureau of Labor Statistics (BLS) are $22.71 for parents, $27.70 for middle school teachers, $28.65 for middle school special education teachers, $44.13 for education administrators, and $26.94 for educational guidance counselors. If a mean hourly wage was not provided, it was computed assuming 2,080 hours per year. The exception is the student wage, which is based on the federal minimum wage. Source: BLS Occupational Employment Statistics (http://data.bls.gov/oes/). Occupation codes: All employees (00-0000); Middle school teachers (25-2022); Middle school special education teachers (25-2053); Education administrators (11-9032); Educational guidance counselors (21-1012); accessed on June 18, 2015.
2 Burden associated with student assessments is included here for informational purposes. It is not included in the total burden calculations because, unlike the other burden presented here, it is not subject to the Paperwork Reduction Act (PRA).
3 Recruitment activities for the IVFT will not be completed at the time this request will be approved, and thus the approved burden affiliated with the IVFT recruitment is being carried over and is included in the total requested in this submission.
4 The number of parent respondents is already included in the recruitment number of respondents.
* The same respondent group as above, not double counted in the total number of respondents.
‒Not applicable.
The burden time estimates are based on the maximum reasonable expected burden per respondent (a worked cross-check of the resulting burden totals follows this list):
Student assessments and surveys will take approximately 90 minutes in total; within the 90 minutes, the student survey portion will take approximately 20 minutes.
The parent survey will take approximately 30 minutes.
The first part of the mathematics teacher survey (the teacher part) is expected to take approximately 20 minutes to complete, and the second part (the teacher student reports) will take approximately 10 minutes for each student. The teacher-level survey burden estimates assume on average 9 math classes per school (3 per grade). With an estimated 58 schools needed for the IVFT, this yields approximately 522 6th-, 7th-, and 8th-grade mathematics teachers (58 × 9).
The first part of the special education teacher survey (the teacher part) is expected to take approximately 10 minutes to complete, and the second part (the teacher student reports) will take approximately 25 minutes for each student. The teacher-level survey burden estimates assume on average 3 special education teachers per school (1 per grade). With an estimated 58 schools needed for the IVFT, this yields approximately 174 special education teachers (58 × 3).
The school administrator survey will take approximately 20 minutes to complete.
The school coordinator will on average spend up to 4 hours per day, per assessment day supporting study activities. The burden estimates assume 3 assessment days.
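As a cross-check on Table 5 (a minimal sketch of our own, not part of any study system), the following recomputes the total data collection burden hours and labor cost from the response counts, per-response minutes, and hourly wages shown above; the 70-minute student assessment is omitted, as it is from the table's PRA burden totals.

```python
# (responses, minutes per response, hourly wage) for each burden row in Table 5;
# the student assessment is excluded from PRA burden (Table 5, footnote 2).
rows = {
    "Student survey":                    (3950, 20, 7.25),
    "Parent survey":                     (3950, 30, 22.71),
    "Math teacher: teacher items":       (480, 13, 27.70),
    "Math teacher: classroom items":     (480, 7, 27.70),
    "Math teacher: student reports":     (3950, 10, 27.70),
    "Special education teacher survey":  (160, 10, 28.65),
    "Special education student reports": (552, 25, 28.65),
    "School administrator survey":       (57, 20, 44.13),
    "School coordinator":                (58, 720, 26.94),
}

total_hours = sum(n * minutes / 60 for n, minutes, _ in rows.values())
total_cost = sum(n * minutes / 60 * wage for n, minutes, wage in rows.values())

print(round(total_hours))  # ~5,082 hours, matching Table 5
print(round(total_cost))   # ~$104,000, in line with Table 5's $104,011
```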
There are no respondent costs other than the cost associated with response time burden.
The estimated cost to the federal government for contractor and subcontractor work to conduct all aspects of the IVFT is $3,635,433.
The apparent increase in burden from the last approved package is due to the fact that this request includes burden for both recruitment and data collection activities associated with the MGLS:2017 IVFT, which will begin in January 2016, while the previous approval covered only the recruitment portion of these activities, scheduled to begin in September 2015, for which the burden is being carried over in this request.
The results from the IVFT will be presented in a single field test report released approximately 6 months after the completion of the field test.
Table 6. Schedule for Item Validation Field Test (IVFT)

Activity | Start Date | End Date
Recruitment of schools and districts | September 2015 | March 2016
Recruitment of students and parents through parent consent requests | October 2015 | May 2016
IVFT data collection | January 2016 | June 2016
Field test report | ‒ | December 2016
The OMB expiration date will be displayed on all recruitment materials.
No exceptions to the certification statement are requested or required.
1 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
2 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
3 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
4 Retrieved from http://www.select-statistics.co.uk/sample-size-calculator-two-proportions.
5 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.