





Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) Main Study First Follow-up (MS2) Data Collection






OMB# 1850-0911 v.27





Supporting Statement Part A










National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC






August 2019

revised March 2020

second revision May 2020





Preface

The Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) is the first study conducted by the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the U.S. Department of Education, to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6–8). In preparation for the national data collection, referred to as the Main Study (MS), the data collection instruments and procedures were field tested.

An Item Validation Field Test (IVFT) was conducted from January through May 2016 to determine the psychometric properties of assessment and survey items and the predictive potential of items so that valid, reliable, and useful assessment and survey instruments could be developed for the Main Study. The MGLS:2017 Operational Field Test (OFT) Base Year (OFT1) data collection was conducted from January through May 2017 to test the near-final instruments and recruitment and data collection procedures and materials in preparation for the MGLS:2017 Main Study Base Year (MS1). The MS1 data collection took place from January to August 2018, and the OFT First Follow-up (OFT2) data collection took place from February to May 2018. The primary purpose of OFT2 was to obtain information on recruiting, particularly for students in three focal IDEA-defined disability groups: specific learning disability, autism, and emotional disturbance; obtain a tracking sample that can be used to study mobility patterns in subsequent years; and test protocols, items, and administrative procedures.

Originally, NCES planned for MGLS:2017 to conduct annual main study follow-up data collections, first beginning in January 2019 and next beginning in January 2020, when most of the students in the sample would be in grades 7 and 8, respectively. However, participation rates in the base year fell substantially short of targets, and analyses of the respondent sample sizes indicated that the number of participants was inadequate to meet the precision requirements for several key subgroups of students by the end of the study. In September 2018, OMB approved a revision to the MGLS:2017 follow-up data collection plan and procedures to meet the overall study goal of obtaining data on the progress of students starting in grade 6 and ending in grade 8 in general education schools (OMB# 1850-0911 v.20). Specifically, the approval was to: (1) drop the originally planned seventh grade round of data collection and conduct the Main Study First Follow-up (MS2) data collection in January-July 2020 (when most sample students will be in the eighth grade), (2) notify participating districts and schools of this change in data collection schedule, (3) discontinue the procedures designed to oversample students in specific IDEA-defined disability groups, and (4) conduct MS2 and OFT Second Follow-up (OFT3) tracking activities. The MS2 recruitment, which began in January 2019, was approved in December 2018, with the latest update approved in May 2019 (OMB# 1850-0911 v.21-23). The MS2 data collection, to be conducted from January through July 2020 (when most sample students will be in the eighth grade), was approved in November 2019, with the latest update in April 2020 (OMB# 1850-0911 v.24-26). The current request is to extend the end date for the student data collection to July 31, 2020 and to add an email communication to sample members.

Part A of this submission presents information on the basic design of MS2. Part B discusses the statistical methods employed, and Part C provides content and item justifications for the MGLS:2017 student, parent, math teacher, special education teacher, and school administrator questionnaires, as well as the facilities observation checklist. Appendices A-S provide MS2 communication materials (those unchanged from the last approved version are marked to that effect). In general, “MS2” in an appendix title indicates that the material was used across all stages of MS2, while “MS2B” in an appendix title denotes activities beginning in fall 2019, including MS2 tracking and recruitment activities and MS2 data collection. Appendices T-V provide (a) the student roster forms that will be used during MS2 tracking and recruitment and (b) the MS2 data collection instruments. Appendix W contains study email communications.

A. Justification

A.1 Importance of Information

As a study of the middle grades, MGLS:2017 will complement NCES’s plans for implementing a multi-cohort sequence for a longitudinal studies series. By aligning the Early Childhood Longitudinal Study Kindergarten Class of 2010–11 (ECLS-K:2011), MGLS:2017, and the next High School Longitudinal Study (HSLS), NCES will be able to collect, within a 10-year span, a full range of data on students’ school experiences as the students enter and then transition from elementary school into high school. Given its portfolio and experience in national longitudinal education studies, NCES is uniquely positioned to undertake this comprehensive, large-scale, longitudinal study of a nationally representative sample of middle grade youth that includes measures of known critical influences on adolescents’ academic and socioemotional trajectories. NCES is authorized to conduct MGLS:2017 by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543) and to collect students’ education records from education agencies or institutions for the purposes of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§99.31(a)(3) and 99.35).

MGLS:2017 will rely on a set of longitudinal and complementary instruments to collect data across several types of respondents to provide information on the outcomes, experiences, and perspectives of students for grades 6 through 8; their families and home lives; their teachers, classrooms, and instruction; and the school settings, programs, and services available to them. At each wave of data collection in the main study, students’ mathematics and reading skills, socioemotional development, and executive function will be assessed. Students will also complete a survey that asks about their engagement in school, out-of-school experiences, peer group relationships, and identity development. Parents will be asked about their background, family resources, and involvement with their child’s education and their school. Students’ mathematics teachers will complete a two-part survey. In part 1, they will be asked about their background and classroom instruction. In part 2, they will be asked to report on the academic behavior, mathematics performance, and classroom conduct of each study child in their classroom. For students receiving special education services, their special education teacher or provider will also complete a survey similar in structure to the two-part mathematics teacher instrument, consisting of a teacher-level questionnaire and a student-level questionnaire, but with questions specific to the special education experiences of and services received by the sampled student. School administrators will be asked to report on school programs and services, as well as on school climate.

With data collection occurring in two rounds beginning in the winter/spring 2018 and finishing in 2020, MGLS:2017 will provide rich descriptive data on academic experiences, development, and learning that occur during these critical, middle grade years (grades 6–8), and on the individual, social, and contextual factors that are related to development and future success, thereby allowing researchers to examine associations between various factors and student outcomes. A wealth of research highlights the importance of mathematics and literacy skills for success in high school and subsequent associations with later education and career opportunities. Thus, MGLS:2017 will focus on student achievement in these areas, along with measures of student socioemotional well-being and other outcomes. The study will also collect data on educational experiences, outcomes, and special education services of students with disabilities as a group. A key goal of the study is to provide researchers and policymakers with the information they need to better understand the school and non-school influences associated with mathematics and reading success, socioemotional health, and positive life development during the middle grade years and beyond.

To support the development of the study, MGLS:2017 has conducted two field tests: the IVFT was conducted from February through May 2016 and was followed by OFT1, which took place from January through May 2017. The goal of the IVFT was to evaluate and inform the development of reliable, valid measures, while OFT1 focused on testing MGLS:2017 Base Year materials and procedures and on refining the recruitment techniques to obtain the needed nationally representative sample and better data quality. The MS1 data collection took place from January to August 2018, and the OFT2 data collection from February to May 2018. OFT2 provided an opportunity to further refine the surveys and assessments and to test the procedures for recruiting schools, tracking students, and collecting student data in and out of the school setting.

Due to insufficient school participation in MS1 in 2018, MGLS:2017 must undergo design changes, including two significant ones: discontinuation of the targeted (oversample) recruitment of students in the autism and emotional disturbance subgroups and elimination of the grade 7 data collection. In addition, the MS2 school and student sample will be augmented to achieve sufficient sample sizes to meet precision requirements.

A.2 Purposes and Uses of Data

MGLS:2017 will provide nationally representative data on students’ transitions from elementary school to the middle grades, their preparation for the transition into high school, and their academic, social, and interpersonal growth during the middle grades. MGLS:2017 will culminate in a rich data set that can be used by researchers, educators, and policymakers to examine family and educational factors related to student achievement. In addition to studying students in the middle grades more generally, educators and policymakers will also be able to use the resulting data to examine the effectiveness of services provided to students in three focal disability groups. The longitudinal nature of the study will allow for analyses of changes in young people’s lives and of how their connections with their communities, schools, teachers, families, and peers affect these changes.

The study is guided by a conceptual framework that emphasizes the complex interrelationships that help shape students’ development and learning, ultimately supporting their academic success and positive development for success in life. MGLS:2017 is designed around a framework of research questions, including:

  1. How do students develop cognitively (with respect to executive function and academic achievement), socially, and emotionally in the middle grades? What school and nonschool factors are associated with that development?

  2. What school and home environment factors are associated with students’ cognitive development and executive function?

  3. What school and home environment factors are associated with students’ regulation and engagement, social skills and behaviors, externalizing problem behaviors, and academic performance?

  4. What is the nature of students’ identity development (including aspirations, peer relationships, and goals) across the middle grades? How does identity development influence school engagement and motivation?

  5. What school and home environment factors are related to the academic success of students with various risk factors often associated with lower academic achievement, such as poverty and low parent education?

  6. What are students’ experiences making the transition from elementary to middle grades? How do parents, teachers, and schools support this transition, as well as the transition from middle grades to high school?

  7. What school and home environment supports are available to middle grade students for setting education pathways and pursuing career goals?

The purpose of MGLS:2017 is to provide data that support the exploration of research interests across disciplines, which will in turn deepen the knowledge base and inform policy and practice. In addition, MGLS:2017 will provide education researchers with data that are currently unavailable: nationally representative longitudinal data focusing specifically on the middle grades.

The study design includes direct measurement of students during a student session that includes the following assessments and surveys:

Reading. The MGLS:2017 reading assessment will provide valuable information about the reading achievement of students in grades 6-8, focusing on the development of reading comprehension and the ability to integrate information from different sources. It is anticipated that these skills will be essential in various content areas as students move into high school.

Mathematics. The mathematics assessment is designed to measure growth toward algebra readiness in anticipation of the demands students will encounter in high school mathematics coursework. The mathematics assessment will provide valuable information about the development of middle grades students’ knowledge of mathematics and their ability to use that knowledge to solve problems, moving toward stronger reasoning and understanding of more advanced mathematics.

Executive Function. Executive function, a set of capacities and processes originating in the prefrontal cortex of the brain, permits individuals to self-regulate, engage in purposeful and goal-directed behaviors, and conduct themselves in a socially appropriate manner. Self-regulation is needed for social success, academic and career success, and good health outcomes. Executive function includes capacities such as shifting (cognitive and attention flexibility), inhibitory control, and working memory.

Student Survey. The purpose of the student survey is to collect information on students’ attitudes and behaviors, out-of-school time use, and family, school, and classroom environments. The student survey will also serve as a source for information about socioemotional outcomes having to do with social relationships and support and academic engagement. These data augment the information collected from the mathematics, reading and executive function assessments to provide a deeper understanding of the social and contextual factors related to students’ academic and non-academic outcomes.

Height and Weight. Measuring students’ height and weight provides data to assess body mass index as an indicator of obesity, pubertal timing (i.e., growth spurt), and eating disorders.

Student data will be supplemented by data collected from students’ parents, teachers, and school administrators:

Parent Survey. The purpose of the parent survey is to collect information about: 1) family involvement in their child’s education and 2) family characteristics that are key predictors of academic achievement and other student outcomes.

Mathematics Teacher Survey. The purpose of the mathematics teacher survey is to gather information on the teaching and mathematics classroom context for use in understanding students’ development and mathematics learning during the middle grades. Teachers also rate the sampled students on their math ability.

Special Education Teacher Survey. The purpose of the special education teacher survey is to gather information on the teaching and classroom context for students with disabilities during the middle grades and to learn more about services offered in schools.

School Administrator Survey. The purpose of the school administrator survey is to provide context for school factors that influence student development, motivation, and mathematics learning.

Facilities Observation Checklist. The facilities observation checklist for the school setting will be used to document the condition of the physical plant and the availability of resources. This information will be collected by field staff and will complement the School Administrator Survey.

Further detail on the assessment and survey content is found in Part C. For more information on the data collection from different types of respondents see Part B.

A.3 Use of Improved Information Technology (Reduction of Burden)

Where feasible, available technology will be used to reduce burden and improve efficiency and accuracy. For example, if districts can provide information linking students to their mathematics teachers or students with disabilities to their special education teachers electronically, we will use this information rather than asking for it at the school level. The burden of recruitment on districts and schools will be minimal, with most information gathered over the telephone. Districts will primarily be asked to provide confirmation of data gathered from other sources, including school universe files and district and school websites. Our collection of student lists will accommodate whatever format districts and schools find to be the least burdensome. The study will utilize the information in any format in which it is provided.

The student assessments and survey will be completed on a Chromebook, a tablet-like computer with touchscreen capability and an attached keyboard. The computerized assessment is made possible by connecting the Chromebooks to an independent local area network (LAN) hosted on a laptop computer set up at the school by study field staff. All equipment is provided by the study, and neither the school’s internet access nor any internet access in general is required for the computerized administration of the student session.

Students who are unable to participate in school will have the opportunity to participate at home via Web or to complete the survey by phone and the assessments via Web. Similarly, students who complete part of the session in school will have the opportunity to complete the remainder of the session at home via Web.

The parent and school staff questionnaires will be fielded as web surveys. Web surveys will also be conducted with MS2 students who: (a) are in schools that only allow an “out-of-school” data collection, (b) left their MS1 school and do not attend a school with 4 or more student sample members, or (c) missed the in-school session. Using this data collection mode will allow for automatic routing of respondents through the surveys, which contain some instances of complex question branching. The automatic routing reduces respondent burden by producing faster interviews. The respondent will not be asked inapplicable questions and will not need to spend time determining which questions to answer. Also, electronic capture of responses reduces processing time and the potential for data entry error.
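To illustrate the branching mechanism in general terms, the following minimal sketch shows how automatic routing can skip inapplicable questions; the question identifiers, wording, and branching rules are hypothetical and do not represent the actual MGLS:2017 instruments.

```python
# Illustrative sketch of automatic survey routing (skip logic).
# Question IDs, wording, and branching rules are hypothetical,
# not the actual MGLS:2017 instrument specification.

QUESTIONS = {
    "Q1": {"text": "Do you participate in any school clubs?",
           "next": lambda a: "Q2" if a == "yes" else "Q3"},
    "Q2": {"text": "How many hours per week do you spend on clubs?",
           "next": lambda a: "Q3"},
    "Q3": {"text": "How often do you read for fun outside of school?",
           "next": lambda a: None},
}

def run_survey(get_answer):
    """Walk the routing graph, asking only applicable questions."""
    responses = {}
    qid = "Q1"
    while qid is not None:
        question = QUESTIONS[qid]
        answer = get_answer(question["text"])
        responses[qid] = answer
        qid = question["next"](answer)  # branch based on the response
    return responses

# Example: a respondent who answers "no" to Q1 is never shown Q2.
print(run_survey(lambda text: "no"))
```

In this kind of design, the routing logic lives in the instrument specification, so the respondent never sees a question that does not apply to them.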

The website for data collection will reside on NCES’s SSL-encrypted servers. On a nightly basis, the data collection contractor, RTI, will download interview data, in batches, to its Enhanced Security Network (ESN) via a secure web service. Once in the ESN, data will be cleaned and undergo quality analysis.
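As a rough, hypothetical sketch of such a nightly batch transfer over an encrypted channel (the endpoint URL, authentication scheme, and file layout below are placeholders, not a description of the study's actual web service):

```python
# Illustrative sketch of a nightly batch transfer over an encrypted channel.
# The endpoint, authentication scheme, and file layout are hypothetical.
import datetime
import pathlib
import requests

BATCH_ENDPOINT = "https://example.org/api/interview-batches"  # placeholder URL
STAGING_DIR = pathlib.Path("/secure/esn/staging")             # placeholder path

def pull_nightly_batches(api_token: str) -> None:
    """Download the previous day's interview data batches over HTTPS."""
    target_date = (datetime.date.today() - datetime.timedelta(days=1)).isoformat()
    response = requests.get(
        BATCH_ENDPOINT,
        params={"date": target_date},
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=60,
        verify=True,  # enforce certificate validation on the encrypted connection
    )
    response.raise_for_status()
    out_file = STAGING_DIR / f"interviews_{target_date}.json"
    out_file.write_bytes(response.content)  # cleaning and quality checks happen after landing
```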

A computer-based data management system will be used to manage the sample. The sample management system uses encrypted data transmission and networking technology to maintain timely information on respondents in the sample, including contact, tracking, and case completion data. This system will be particularly important as students move from one school to another over the course of the study. The use of technology for sample management will maximize tracking efforts, which should have a positive effect on the study’s ability to locate movers and achieve acceptable response rates.

A.4 Efforts to Identify Duplication

MGLS:2017 will not be duplicative of other studies. While NCES longitudinal studies have contributed to our understanding of the factors that influence student success and failure in school, no NCES study has yet collected data across the middle grades (grades 6–8). A majority of nationally representative longitudinal studies have focused on high school students and on the transition from secondary to postsecondary education: e.g., the High School and Beyond Longitudinal Study (HS&B) and the Education Longitudinal Study of 2002 (ELS:2002). The Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K) and the National Education Longitudinal Study of 1988 (NELS:88) collected data on students in grade 8, but neither included a data collection in grades 6 and 7. The ECLS-K:2011 does not plan to follow students beyond grade 5, and the High School Longitudinal Study of 2009 (HSLS:09) began with a national sample of students in grade 9. Thus, there is little information at the national level about the learning that occurs during grades 6–8 and about the rates of learning for different groups of students who may experience diverse school environments and opportunities.

MGLS:2017 is unique in that it will assess students’ mathematics and reading achievement, as well as other student outcomes (e.g., executive function and socioemotional development), for the same group of students over a 3-year period. In addition to ECLS-K and NELS:88, other national studies have assessed some of these outcomes for students in grade 8, including the National Assessment of Educational Progress (NAEP) and the Trends in International Mathematics and Science Study (TIMSS). These studies, however, are cross-sectional and do not include repeated measures of achievement or assess multiple subjects and areas of development for the same sample of students. Therefore, they cannot answer questions about students’ growth in mathematics and reading over the middle grade years, about differences in the rates of growth for different populations (e.g., differences by sex, by race/ethnicity, and for students attending public and private schools), and about the school and non-school factors that may facilitate or hinder this growth. Nor can they explore questions about the relationships between student achievement and other school outcomes and executive functions (e.g., working memory, attention, and inhibitory control) that work to regulate and orchestrate cognition, emotion, and behavior to enable a student to learn in the classroom.

Other adolescent development studies have been conducted, but they often do not include a grade 6 sample. For example, the youngest children in the National Longitudinal Study of Adolescent Health (Add Health) and the Maryland Adolescent Development in Context Study (MADICS) were in grade 7 at baseline. Many of these studies collected data on local samples, had a primary focus on family and child processes, and were started in the 1990s: e.g., MADICS and the Michigan Study of Adolescent and Adult Life Transitions (MSALT). As such, they do not provide a contemporary picture of U.S. students in grades 6–8.

A.5 Minimizing Burden for Small Entities

Burden will be minimized wherever possible. During district and school recruitment, we will minimize burden by training recruitment staff to make their contacts as straightforward and concise as possible. The recruitment letters and materials (e.g., the study description and FAQs) are designed to be clear, brief, and informative. In addition, contractor staff will conduct all test administration and will assist with parental notification, sampling, and other study tasks as much as possible within each school.

A.6 Frequency of Data Collection

The MGLS:2017 MS1 data collection took place from January through August 2018. Tracking activities for OFT2 occurred from August 2017 through May 2018, and data collection from February to May 2018. Tracking activities for OFT3 will occur from September 2018 through May 2019, and for MS2 from September 2018 through May 2020 (in multiple rounds). The MS2 data collection will occur from January through July of 2020.

A.7 Special Circumstances

There are no special circumstances involved with this study.

A.8 Consultations outside NCES

Content experts have been consulted in the development of the assessments and questionnaires. These experts are listed by name, affiliation, and expertise in table 1.

Table 1. Members of the MGLS:2017 Content Review Panels

Name

Affiliation

Expertise

Mathematics Assessment Content Review Panel (June 18–19, 2013)

Tom Loveless

Brookings Institution

Policy, mathematics curriculum

Linda Wilson

Formerly with Project 2061

Mathematics education, mathematics assessment, middle school assessment, author of NCTM Assessment Standards for School Mathematics and NAEP math framework, teacher

Kathleen Heid

University of Florida

Mathematics education, use of technology, teacher knowledge, NAEP Grade 8 Mathematics Standing Committee member

Edward Nolan

Montgomery County Schools, Maryland

Mathematics curriculum and standards, large-scale assessment of middle grade students

Lisa Keller

University of Massachusetts, Amherst

Psychometrics, former mathematics teacher

Paul Sally

University of Chicago

Mathematics education, mathematics reasoning, mathematically talented adolescents

Margie Hill

University of Kansas

Co-author of Kansas mathematics standards, former NAEP Mathematics Standing Committee member, former district math supervisor

Executive Function Content Review Panel (July 18, 2013)

Lisa Jacobson

Johns Hopkins University; Kennedy Krieger Institute

Development of executive functioning skills, attention, neurodevelopmental disorders, and parent and teacher scaffolding

Dan Romer

University of Pennsylvania

Adolescent risk taking

James Byrnes

Temple University

Self-regulation, decision making, cognitive processes in mathematics learning

Socioemotional-Student-Family Content Review Panel (July 25–26, 2013)

James Byrnes

Temple University

Self-regulation, decision making, cognitive processes in mathematics learning

Russell Rumberger

University of California, Santa Barbara

School dropouts, ethnic and language minority student achievement

Tama Leventhal

Tufts University

Family context, adolescence, social policy, community and neighborhood indicators

Susan Dauber

Bluestocking Research

School organization, educational transitions, urban education, parent involvement and family processes

Scott Gest

Pennsylvania State University

Social networking, social skills, longitudinal assessment of at-risk populations

Kathryn Wentzel

University of Maryland

Social and academic motivation, self-regulation, school adjustment, peer relationships, teacher-student relationships, family-school linkages

Richard Lerner

Tufts University

Adolescent development and relationships with peers, families, schools, and communities

School Administrator Content Review Panel (August 16, 2013)

Susan Dauber

Bluestocking Research

School organization, educational transitions, urban education, parent involvement and family processes

George Farkas

University of California, Irvine

Schooling equity and human resources

Jeremy Finn

State University of New York at Buffalo

School organization, school dropouts

Edward Nolan

Montgomery County Schools, Maryland

Large urban school system administrator

Tom Loveless

Brookings Institution

Policy, math curriculum

Reading Assessment Content Review Panel (April 14, 2014)

Donna Alvermann

University of Georgia

Adolescent literacy, online literacy, codirector of the National Reading Research Center (funded by the U.S. Department of Education)

Joseph Magliano

Northern Illinois University

Cognitive processes that support comprehension, the nature of memory representations for events depicted in text and film, strategies to detect and help struggling readers

Sheryl Lazarus

University of Minnesota

Education policy issues related to the inclusion of students with disabilities in assessments used for accountability purposes, student participation and accommodations, alternate assessments, technology-enhanced assessments, teacher effectiveness, large-scale assessments, school accountability, research design (including cost analyses), data-driven decision making, rural education, the economics of education

Disabilities Content Review Panel (April 29, 2014)

Jose Blackorby

SRI International

Autism, specific learning disabilities, special education, curriculum design, alternate student assessment, large-scale studies of students with disabilities, codirector of the Special Education Elementary Longitudinal Study (SEELS)

Lynn Fuchs

Vanderbilt University

Specific learning disabilities, student assessment, mathematics curriculum, psychometric models

Mitchell L. Yell

University of South Carolina

Autism, emotional and behavior disorders, specific learning disabilities, pre-K–12 instruction and curriculum, special education, evidence-based intervention

Sheryl Lazarus

University of Minnesota

Special education policy, inclusion of students with disabilities in assessments, accommodations, alternate assessments, technology-enhanced assessments, large-scale assessments, school accountability, research design (including cost analyses)

Martha Thurlow

University of Minnesota

Specific learning disabilities, reading assessment, alternate student assessment, early childhood education, special education, curriculum, large-scale studies

Diane Pedrotty Bryant

University of Texas, Austin

Educational interventions for improving the mathematics and reading performance of students with learning disabilities, the use of assistive technology for individuals with disabilities, interventions for students with learning disabilities and who are at risk for educational difficulties

Technical Review Panel 1 & 2 (May 10, 2016; May 16, 2017)

Grace Kao

University of Pennsylvania

Dr. Kao’s research interests center on the explanation of immigrant, racial, and ethnic disparities in education outcomes. Her work has used quantitative analyses of nationally representative data on students and parents (including NCES data sets as well as AddHealth).

Margaret McLaughlin

University of Maryland

Dr. McLaughlin’s research focuses on special education policy, particularly use of large-scale data in policy research including investigation of the impact of education reform on students with disabilities and special education programs.

Lisa Jacobson

Kennedy Krieger Institute

Dr. Jacobson specializes in clinical pediatric neuropsychology. Her research interests include cognitive and behavioral aspects of disorders related to attention and executive functions. She is interested in how children’s developing executive functions interact with developmental contexts both at home and school.

Brian Rowan

University of Michigan

Dr. Rowan’s research has focused on the organization and management of schooling, paying special attention to the measurement and improvement of teaching quality. His current research includes a randomized field trial of an early grades reading intervention, an evaluation of a high school instructional improvement program, and a study of online high schools in Florida.

Oscar Barbarin

University of Maryland

Dr. Barbarin’s research has focused on the social and familial determinants of ethnic and gender achievement gaps beginning in early childhood. An additional focus is Dr. Barbarin’s concern with socioemotional and academic development, particularly of boys of color.

James P. Byrnes

Temple University

Dr. Byrnes’s interests include the modeling of academic achievement, decision-making and risk-taking, development of mathematical expertise, gender differences in achievement, and critical thinking about neuroscientific research.

Dan Romer

Adolescent Communication Institute, Annenberg Public Policy Center

Dr. Romer has studied social influences on adolescent health with particular attention to the social transmission of risky behavior. He is currently studying a cohort of adolescents in Philadelphia to understand the risk factors that underlie early use of drugs and other threats to healthy development. His interests include the relationship between risk behavior and Executive Function.

Jeremy Finn

University at Buffalo

Dr. Finn’s research interests include school organization and class size, student engagement, disengagement, and dropping out, students at risk, and using quantitative methods to study policy issues.

Lynn Newman

SRI International

Dr. Newman has experience in education and social science research in disability policy and human services. She has expertise in quantitative and qualitative methodologies and large-scale, longitudinal studies, particularly with respect to school experiences and transitions of youth with disabilities.

Technical Review Panel 3 (January 24, 2019)

Grace Kao

Yale University

Dr. Kao’s research interests center on the explanation of immigrant, racial, and ethnic disparities in education outcomes. Her work has used quantitative analyses of nationally representative data on students and parents (including NCES data sets as well as AddHealth).

Margaret McLaughlin

University of Maryland, College Park

Dr. McLaughlin’s research focuses on special education policy, particularly use of large-scale data in policy research including investigation of the impact of education reform on students with disabilities and special education programs.

Lisa Jacobson

Kennedy Krieger Institute

Dr. Jacobson specializes in clinical pediatric neuropsychology. Her research interests include cognitive and behavioral aspects of disorders related to attention and executive functions. She is interested in how children’s developing executive functions interact with developmental contexts both at home and school.

Oscar Barbarin

University of Maryland, College Park

Dr. Barbarin’s research has focused on the social and familial determinants of ethnic and gender achievement gaps beginning in early childhood. An additional focus is Dr. Barbarin’s concern with socioemotional and academic development, particularly of boys of color.

James P. Byrnes

Temple University

Dr. Byrnes’s interests include the modeling of academic achievement, decision-making and risk-taking, development of mathematical expertise, gender differences in achievement, and critical thinking about neuroscientific research.

Dan Romer

Adolescent Communication Institute, Annenberg Public Policy Center

Dr. Romer has studied social influences on adolescent health with particular attention to the social transmission of risky behavior. He is currently studying a cohort of adolescents in Philadelphia to understand the risk factors that underlie early use of drugs and other threats to healthy development. His interests include the relationship between risk behavior and Executive Function.

Jennifer Yu

SRI International

Dr. Yu specializes in the development, implementation, and evaluation of educational and health-related supports and services for K-postsecondary students with disabilities, with a focus on technologies and programs that improve the learning and quality of life for students with autism, learning disabilities, and mental/behavioral health issues.

Tom Hoffer

NORC

Dr. Hoffer specializes in study design, instrument development, data analysis, and report writing on education projects. He has several years of experience on projects in each of the main institutional areas of education: elementary, middle school, high school, college and graduate school, and the nexus of formal education and the labor force.


A.9 Payments or Gifts to Respondents

High levels of school participation are critical to the success of each phase of the study. School administrator, mathematics teacher, special education teacher, parent, and student data collection activities are contingent on school cooperation. NCES recognizes that the burden level of the study is one of the factors that school administrators will consider when deciding whether to participate. To offset the perceived burden of participation, NCES intends to continue to use strategies that have worked successfully in other NCES studies (e.g., ECLS-K, ECLS-K:2011, HS&B, NELS:88, and ELS:2002), including offering both monetary and non-monetary incentives to be given to respondents after they participate in the data collection activities, for example upon completion of a survey. Because the roster and enrollment status update collections were viewed as burdensome by school coordinators in MS1 and OFT2, we will split the school coordinator incentive moving forward to pay part of the incentive upon completion of the roster or enrollment status form (after all quality check (QC) issues are resolved) and the remainder of the incentive after all data collection activities for the round are completed.

Table 2 summarizes the incentive amount planned for each instrument and activity along with their estimated administration times. A brief justification for each incentive amount follows table 2. Incentive information is provided for MS2 and was previously approved (OMB# 1850-0911 v23).

Table 2. MS2 Tracking Activities, Data Collection Instruments, and Incentive Amounts

Instrument/Activity | Administration Time* | MS2** Incentives

MS2 Tracking, Recruitment, and Data Collection

Student return of parent consent forms (explicit consent schools only) | 10 minutes | Food event at school (e.g., pizza, bagels, etc.) sponsored by the study

Student Assessments and Survey – In-school administration (Mathematics, Reading, Executive Function, Height, Weight, & Survey) | 90 minutes | Earbuds and pencil used during assessment and a certificate for 2 hours of community service from the U.S. Department of Education

Student Assessments and Survey – Out-of-school administration (Mathematics, Reading, Survey) | 45 minutes | $20 plus a certificate for 2 hours of community service from the U.S. Department of Education

Student Assessments and Survey – Out-of-school administration (Mathematics, Reading, Survey) | 75 minutes | $20 plus a certificate for 3 hours of community service and a $20 donation to Save the Children’s special fund to help kids affected by the COVID-19 outbreak

Parent Panel Maintenance | 5 minutes | $10

Parent Survey | 35 minutes | $20 to $40 (one parent per student)

Mathematics Teacher: Teacher Survey | 20 minutes | $20

Mathematics Teacher: Teacher Student Report | 7 minutes per student | $7 per student

Special Education Teacher: Teacher Survey | 10 minutes | $20

Special Education Teacher: Teacher Student Report | 20 minutes per student | $7 per student

School Administrator Survey | 40 minutes | $25

School Participation | (not applicable) | $400 or $400 in goods and services (for schools allowing MS2 student group administration in school)

School Coordinator (logistics, on-site visit, consent forms, administrative records, etc.) | 6 hours to provide administrative records (roster or enrollment status update); 6 hours for consent assistance; 2 hours to schedule assessments; 2 hours to coordinate session logistics | $150 for coordinator ($50 after the roster or enrollment status update passes QC and the remaining $100 after all data collection activities are completed at the school)

*Note that the assessment administration time may be longer for students with accommodations.

** Final incentive amounts were determined based on the outcome of the field tests and main study base year.


Students

Main Study First Follow-up (MS2): Students in the MS1 sample or the MS2 augmentation sample (described in Part B.1) who are participating in MS2 at school (most of whom will be in grade 8) will be allowed to keep the earbuds and pencil used during the MS2 assessment. Students who have left their Base Year school and are unable to participate in school (e.g., because fewer than 4 MGLS:2017 sampled students are enrolled at the school to which they transferred; see Part B.2 for additional detail) will be invited to participate via web outside of school. These students will not receive earbuds and pencils. Instead, students who opt to complete the 45-minute session will receive $20 plus a 2-hour certificate of community service from the U.S. Department of Education, while students who choose the 75-minute session will receive $20 plus a 3-hour certificate of community service from the U.S. Department of Education, and we will make a $20 donation to Save the Children’s special fund to help kids affected by the COVID-19 outbreak. The out-of-school data collection is used so that students in the base-year sample may still participate regardless of their educational situation in subsequent rounds or in the event of school closures due to COVID-19. These students are critical because they may differ from students who participate in school. The monetary incentive offered to these students is designed to encourage them to incur the burden of participating in the study on their out-of-school time.

Parents

Main Study First Follow-up (MS2): Parent survey response rates have declined over the past decade. The ECLS-K:2011 baseline (fall 2010) parent survey response rate was more than 10 percentage points lower (74 percent)3 than the parent survey rate in the corresponding 1998 wave of the ECLS-K (85 percent).4 Additionally, the ninth-grade parent survey response rate for the HSLS:09 baseline was 68 percent.5 The MGLS:2017 parent survey is a key component of the data being collected. In MS1, a differential incentive was offered to parents of students with Emotional Disturbance (EMN). Because we are no longer oversampling students with EMN, we will offer the same incentive to all MS2 parents as was used for MS1 parents of students without EMN – $20 with a $10 boost for nonresponse offered mid-way through the data collection period. In addition, we will offer a $10 incentive for parents to complete the second panel maintenance materials in the fall of 2019. About 21 percent of parents participated in the 2018 panel maintenance activity and, of those, about 2 percent of parents were base-year nonrespondents. The ELS:2002/12 experience found that a $10 incentive for panel maintenance response increased response by about 5 percent. Moreover, parents who responded to the panel maintenance request were more likely to participate in the survey than those who did not participate in panel maintenance. The addition of a $10 incentive for the fall 2019 panel maintenance is expected to increase response for panel maintenance while also increasing response to the MS2 parent survey.

Teachers

Main Study First Follow-up (MS2): In MS2, as in MS1, the incentive for students’ teachers will be $20 per teacher survey, plus $7 per teacher student report (TSR). These amounts are consistent with the amounts used in other NCES studies, such as the ECLS-K:2011. For the mathematics teacher, it is estimated that the teacher survey will take 20 minutes to complete, and the teacher student report will take 7 minutes per student to complete. For the special education teacher, it is estimated that the teacher survey will take 10 minutes to complete, and the teacher student report will take 20 minutes per student to complete. The teacher student report is expected to take longer for the special education teacher because it includes an additional indirect assessment of the student’s skills that is not included in the mathematics teacher’s teacher student report. We will use the same incentive structure for all teachers, regardless of the specific questionnaires they are being asked to complete, to protect against any perception of unfairness that might result if teachers within a school talk to one another about the amount they have received for a specific questionnaire.

School Administrators

Main Study First Follow-up (MS2): School administrator data are critical to the success of the study, and each administrator’s survey responses provide valuable contextual information for each sampled student at the school. We will offer $25 to the school administrator or his/her designee for completing the school administrator survey, which is equivalent to the school administrator incentive on ECLS-K:2011. This incentive will help offset declining administrator response rates (HSLS achieved about 94 percent response compared to about 81 percent achieved thus far on MGLS:2017).

Schools

Main Study First Follow-up (MS2): Main study schools will be contacted at two points prior to the winter/spring 2020 in-school MS2 data collection: once in fall 2018 and again in the fall of 2019. A school-level incentive of $400 or $400 in goods and services will be given to MS2 schools conducting an in-school MS2 student data collection after the 2020 data collection activities have been completed. This incentive level is consistent with that offered during OFT2.

School Coordinators

Main Study First Follow-up (MS2): School coordinators from MS1 schools will be asked to complete enrollment status updates at two points in time (fall 2018 and fall 2019) in advance of the winter/spring 2020 MS2 data collection. Those in the augmentation sample will be asked to provide a student roster. Both the enrollment status update and roster submission were viewed by schools as very burdensome in both OFT2 and MS1 and, for some sample schools, this burden prevented their participation in the study. We will offer a school coordinator incentive of $50 per roster or enrollment status update. School coordinators from schools that participated in MS1 will receive the $50 for providing the enrollment status update in the fall of 2018 and another $50 for providing an update in fall 2019. School coordinators from augmentation schools will receive $50 for providing student rosters. All of the provided enrollment and roster information will need to pass QC checks prior to payment of the incentives. After the MS2 data collection activities have been completed, the school coordinator will receive an additional $100 for their assistance on the study. This incentive amount is in line with the total incentive provided to school coordinators in MS1 and the field tests. School coordinators play an especially important role in the study and are critical to its success. The coordinator in each participating school will coordinate logistics with the data collection contractor; compile and supply to the data collection contractor a list of eligible students for sampling for the MS2 augmentation sample and the enrollment status update for MS2; communicate with teachers, students, and parents about the study to encourage their participation; distribute and collect parental consent forms; and assist the test administrator in ensuring that the sampled students attend the testing sessions.

A.10 Assurance of Confidentiality

Data security and confidentiality protection procedures have been put in place for MGLS:2017 to ensure that RTI International and its subcontractors comply with all privacy requirements, including:

  1. The Statement of Work of this contract (ED-IES-15-O-5016);

  2. Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232(g));

  3. Privacy Act of 1974 (5 U.S.C. §552a);

  4. Privacy Act Regulations (34 CFR Part 5b);

  5. Computer Security Act of 1987;

  6. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  7. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  8. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  9. Foundations of Evidence-Based Policymaking Act of 2018, Title III, Part B, Confidential Information Protection;

  10. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  11. The U.S. Department of Education Incident Handling Procedures (February 2009);

  12. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  13. NCES Statistical Standards; and

  14. All new legislation that impacts the data collected through the contract for this study.

Furthermore, RTI International will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) Circulars, and the National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, as described at the website: http://nces.ed.gov/statprog/2012/.

By law (20 U.S.C. §9573), a violation of the confidentiality restrictions is a felony, punishable by imprisonment of up to 5 years and/or a fine of up to $250,000. The MGLS:2017 procedures for maintaining confidentiality include notarized nondisclosure affidavits obtained from all personnel who will have access to individual identifiers; personnel training regarding the meaning of confidentiality; controlled and protected access to computer files; built-in safeguards concerning status monitoring and receipt control systems; and a secure, staffed, in-house computing facility. MGLS:2017 follows detailed guidelines for securing sensitive project data, including, but not limited to: physical/environment protections, building access controls, system access controls, system login restrictions, user identification and authorization procedures, encryption, and project file storage/archiving/destruction.

MGLS:2017 will use additional security measures to protect the web Parent Survey from unauthorized access in the form of security questions based on data previously collected on the participants. These questions will take a form commonly associated with credit check “pick lists.” A survey entrant will be asked (a) to select their child’s name from a list of otherwise fictitious names and (b) to identify their child’s school from a list. If they answer correctly, they will move on to the Parent Survey. If their answer does not match the MGLS:2017 record, they will be asked to contact the study for further assistance. The web survey will also be programmed to prevent backtracking to areas of the survey with personally identifiable information (PII). This measure is intended to prevent unauthorized access to PII within in-progress surveys.
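As a rough illustration of this kind of pick-list check, the sketch below implements the verification logic in Python; the record fields, names, and messages are hypothetical and are not drawn from the study’s actual systems.

```python
# Illustrative sketch of a pick-list style access check.
# Record fields, names, and messages are hypothetical.
import random

def build_name_picklist(true_child_name: str, decoy_names: list[str]) -> list[str]:
    """Mix the sampled child's name with otherwise fictitious names."""
    options = decoy_names + [true_child_name]
    random.shuffle(options)
    return options

def verify_entrant(record: dict, chosen_name: str, chosen_school: str) -> bool:
    """Allow entry to the Parent Survey only if both answers match the study record."""
    return (chosen_name == record["child_name"] and
            chosen_school == record["school_name"])

# Example with a hypothetical study record.
record = {"child_name": "Jordan Smith", "school_name": "Lincoln Middle School"}
picklist = build_name_picklist(record["child_name"],
                               ["Alex Rivera", "Sam Lee", "Taylor Brown"])
if verify_entrant(record, chosen_name="Jordan Smith",
                  chosen_school="Lincoln Middle School"):
    print("Proceed to the Parent Survey.")
else:
    print("Please contact the study for further assistance.")
```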

NCES has a secure data transfer system, which uses Secure Socket Layer (SSL) technology, allowing the transfer of encrypted data over the Internet. The NCES secure server will be used for all administrative data sources. All data transfers will be encrypted.

The Department has established a policy regarding the personnel security screening requirements for all contractor employees and their subcontractors. The contractor must comply with these personnel security screening requirements throughout the life of the contract, including several requirements that apply to each employee working on the contract for 30 days or more. Among these requirements is that each person working on the contract must be assigned a position risk level. The risk levels are high, moderate, and low, based upon the level of harm that a person in the position could cause to the Department’s interests. Each person working on the contract must complete the requirements for a “Contractor Security Screening.” Depending on the risk level assigned to each person’s position, a follow-up background investigation by the Department will occur.

The Family Educational Rights and Privacy Act (FERPA) (34 CFR Part 99) allows the disclosure of personally identifiable information from students’ education records without prior consent for the purposes of MGLS:2017 according to the following excerpts: 34 CFR §99.31 asks, “Under what conditions is prior consent not required to disclose information?” and explains in 34 CFR §99.31(a) that “An educational agency or institution may disclose personally identifiable information from an education record of a student without the consent required by §99.30 if the disclosure meets one or more” of several conditions. These conditions include, at 34 CFR §99.31(a)(3):

The disclosure is, subject to the requirements of §99.35, to authorized representatives of--

(i) The Comptroller General of the United States;

(ii) The Attorney General of the United States;

(iii) The Secretary; or

(iv) State and local educational authorities.

MGLS:2017 is collecting data under the Secretary’s authority. Specifically, NCES, as an authorized representative of the Secretary of Education, is collecting this information for the purpose of evaluating a federally supported education program. Any personally identifiable information is collected with adherence to the security protocol detailed in 34 CFR §99.35:

(a)(1) Authorized representatives of the officials or agencies headed by officials listed in §99.31(a)(3) may have access to education records in connection with an audit or evaluation of Federal or State supported education programs, or for the enforcement of or compliance with Federal legal requirements that relate to those programs.

(2) The State or local educational authority or agency headed by an official listed in §99.31(a)(3) is responsible for using reasonable methods to ensure to the greatest extent practicable that any entity or individual designated as its authorized representative—

(i) Uses personally identifiable information only to carry out an audit or evaluation of Federal- or State-supported education programs, or for the enforcement of or compliance with Federal legal requirements related to these programs;

(ii) Protects the personally identifiable information from further disclosures or other uses, except as authorized in paragraph (b)(1) of this section; and

(iii) Destroys the personally identifiable information in accordance with the requirements of paragraphs (b) and (c) of this section.

(b) Information that is collected under paragraph (a) of this section must—

(1) Be protected in a manner that does not permit personal identification of individuals by anyone other than the State or local educational authority or agency headed by an official listed in §99.31(a)(3) and their authorized representatives, except that the State or local educational authority or agency headed by an official listed in §99.31(a)(3) may make further disclosures of personally identifiable information from education records on behalf of the educational agency or institution in accordance with the requirements of §99.33(b); and

(2) Be destroyed when no longer needed for the purposes listed in paragraph (a) of this section.

(c) Paragraph (b) of this section does not apply if:

(1) The parent or eligible student has given written consent for the disclosure under §99.30; or

(2) The collection of personally identifiable information is specifically authorized by Federal law.

By law, the data provided by schools, staff, parents, and students may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). The laws pertaining to the collection and use of personally identifiable information will be clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and informational materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential. This information will also be included in any research application required by school districts.

A list of eighth-grade students will be requested from school districts and/or schools under the FERPA exception to the general consent requirement that permits disclosures to authorized representatives of the Secretary for the purpose of evaluating Federally supported education programs (34 CFR §§ 99.31(a)(3)(iii) and 99.35). In turn, for the follow-ups, schools will be asked to confirm the enrollment of student sample members. Both the enrollment list and the enrollment update lists will be securely destroyed when no longer needed for the purposes specified in 34 CFR §99.35.

a. Contact letters and emails

NCES is authorized to conduct MGLS:2017 by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543) and to collect students’ education records from education agencies or institutions for the purposes of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§ 99.31(a)(3)(iii) and 99.35). The data are being collected for NCES by RTI International, a U.S.-based nonprofit research organization. All of the information [{you/ you and your child/respondents/schools/ schools and students} provide/ provided by schools, staff, students, and parents] may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). The collected information will be combined across respondents to produce statistical reports.

b. Data collection instruments/website

NCES is authorized to conduct the MGLS:2017 by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543), and to collect students’ education records from education agencies or institutions for the purpose of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§ 99.31(a)(3)(iii) and 99.35). The data are being collected for NCES by RTI International, a U.S.-based nonprofit research organization. Participation is voluntary. You may skip questions you do not wish to answer; however, we hope that you will answer as many questions as you can. All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form, for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). The collected information will be combined across respondents to produce statistical reports.

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this voluntary information collection is 1850-0911. Approval expires M/D/2022. The time required to complete this information collection is estimated to average [40 minutes per response for school administrators, 35 minutes for parents, 20 minutes for math teacher-level information and 7 minutes per study student, 10 minutes for special education teacher-level information and 20 minutes per study student, and up to 90 minutes for students][20 minutes for the electronic form or 5 minutes for the paper form], including the time to review instructions, gather the data needed, and complete and review the information collection. If you have any comments concerning the accuracy of the time estimate, suggestions for improving this survey, or any comments or concerns regarding the status of your individual submission of this survey, please write directly to: The Middle Grades Longitudinal Study of 2017-18 (MGLS:2017), National Center for Education Statistics, Potomac Center Plaza, 550 12th St., SW, Room 4002, Washington, DC 20202.

A.11 Sensitive Questions

MGLS:2017 is a voluntary study, and no persons are required to respond to the questionnaires or to participate in the assessments. In addition, respondents may decline to answer any question they are asked. The voluntary nature of the study is clearly stated in the advance letter mailed to adult respondents, in other study materials such as the Frequently Asked Questions, and in the instructions on web and hardcopy questionnaires, and it is reinforced by field staff and telephone interviewers in their contacts with respondents. It is also emphasized in training to ensure that all data collection staff communicate the voluntary nature of the study to participants and follow the guidelines. Additionally, students may refuse to participate during the assessments, and study field staff are trained to respect students’ wishes. The following describes the topics in each instrument that may be sensitive for some respondents.

Schools. In MS2, a roster of all grade 8 students will be requested from each school in the augmentation sample or from its school district. In addition, the MS1 schools will be asked to verify the enrollment status of each student sampled for MS1 and to provide new school information for those no longer enrolled (see Part B.2). Schools may have concerns about providing this information without first obtaining permission from parents to do so. The disclosure is permitted under FERPA’s exception to the general consent requirement that permits disclosures to authorized representatives of the Secretary for purposes of evaluating Federally supported education programs (34 CFR §§ 99.31(a)(3)(iii) and 99.35). This information will be securely destroyed when no longer needed for the purposes specified in 34 CFR §99.35. All district and school personnel facilitating the conduct of the study and developing the sampling frame will be informed of the privacy and confidentiality protocols required for the study, including those having to do with the sample lists of schools and students.

School Administrator. The items in the School Administrator Questionnaire are not of a sensitive nature and should not pose sensitivity concerns to respondents.

Math Teacher. The information collected in the teacher student report could be regarded as sensitive because the teacher is asked to provide information about a student’s academic skills, social skills (including classroom behavior and peer relationships), problem behavior (including anger, manipulation, and disobedience), and experience with peer victimization, both as a victim and as an aggressor.

Special Education Teacher. As with the math teacher survey, information collected in the teacher student report may be regarded as sensitive. Each special education teacher is asked to provide information on a student’s special education status, IEP goals, and services received. The survey also includes questions on the teacher’s expectations for the student, and the student’s academic and life skills.

Parent. To achieve the study’s primary goal of describing the development, academic outcomes, and characteristics of middle grades students, we will be asking parents some questions that could be viewed as sensitive in nature by some respondents. Questions about family income, disciplinary practices, neighborhood safety, their child’s disabilities, and problems their child may be having at school, including experience with peer victimization, are included in the parent survey. Additionally, parents are asked if their child ever: got involved with the wrong kinds of people; used drugs or alcohol; got in trouble with the police; and ran away.

The types of questions included in the staff and parent surveys have been asked in many large-scale studies of school-age children including the ECLS-K, ECLS-K:2011, and HSLS:09. These questions are central to describing the middle grades population and to examining the variability in students’ development, mathematics and reading achievement, and other student outcomes.

Student. The student questionnaire includes a few questions that could be sensitive for some students. Questions about internalizing attitudes or behaviors, perceptions of competencies in mathematics, and school and class attendance are included in this self-report survey. Students are asked about negative behaviors of their peers, and about their relationship with their parents. Students are also asked to self-report their race/ethnicity and sex, which could be sensitive questions for students at this age. The questions that are included in the student survey have been asked in other studies of adolescents and the responses to these questions have been found to help explain why some students do better than others in school and are more engaged in learning.

The in-school session for MS2 will also include a height/weight measurement of participating students. Care will be taken to ensure the privacy of this information, and as with all components of the study, participation in the height/weight measurement is voluntary.

A.12 Estimates of Burden

Burden estimates for all MS2 tracking, recruitment, and data collection activities are shown in this section.

Main Study:

MS2 Tracking/Recruitment (2019): For the augmentation sample, we will contact 455 school districts to achieve 144 participating districts. Some districts require completion of a research application before they will allow schools under their jurisdiction to participate in the study; for those districts we anticipate about 2 hours for IRB staff approval and about 2 hours for each of an estimated 5 panelists per district. Of the 650 eligible sampled schools, we anticipate that 206 will participate. We estimate that it will take 20 minutes on average for school administrators to review the materials and either agree or decline to participate, plus an additional 4 hours for schools that decide to participate. The school coordinator will spend, on average, up to 6 hours preparing the student roster, which includes providing student information and the associated parent and teacher information.

For students’ parents, we estimate that it will take up to 10 minutes to review the recruitment materials and either consent or refuse to participate (on behalf of their student and themselves). The provision of student rosters and the parents’ consent forms will serve as sources for parents’ contact information, which during the data collection period can be used for nonresponse follow-up.

The Tracking/Recruitment: Enrollment Status Update portion of table 3 shows the expected burden for MS2B enrollment status and tracking activities, which are scheduled to occur in fall 2019 to prepare for the winter/spring 2020 data collection. We estimate that it will take 30 minutes on average for MS1 districts to decide about participation in MS2 and 120 minutes for districts to which students have moved (“mover districts”). A subset of districts will require IRB review of the study to determine participation prior to providing enrollment status; we estimate this IRB approval burden to be similar to that described above, with about 2 hours for IRB staff approval and about 2 hours for each of an estimated 5 panelists per district. For schools, we estimate 20 minutes on average for school staff to provide the enrollment status of sampled students, and 5 minutes on average for parents to provide updated contact information.


MS2 Data Collection: Students who participate in school will complete a 90-minute session consisting of math and reading questions, a survey, two brain games, and height and weight measurements. Students who participate outside of school will spend about 45 minutes because they will not complete the brain games, the second-stage portions of the reading or math assessment, or the height and weight measurements. Parents will complete a 35-minute survey, though an abbreviated (20-minute) survey will be offered to nonresponding parents about five weeks prior to the end of data collection, and an even shorter (5-minute) survey will be sent two weeks later to any parents who still have not responded. Mathematics teachers will be asked to spend about 20 minutes completing a survey about their background and classroom practices, plus an additional 7 minutes per student on whom they are asked to report. Special education teachers or service providers will complete a 10-minute survey on their background plus a 20-minute survey for each student on whom they are asked to report. On average, math teachers will complete about 6.382 student reports per teacher, while special education teachers will complete about 1.636 reports each. The school administrator survey is estimated to take 40 minutes. School coordinators will spend about 12 hours verifying the enrollment of the student sample and assisting with the preparation of each student session.
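
The burden figures in table 3 below follow directly from the per-respondent times described above: the number of respondents is the sample size multiplied by the expected response rate, total burden hours are the number of responses multiplied by the median burden time and divided by 60, and the burden time cost is the total hours multiplied by the average hourly wage (for rows with multiple responses per respondent, such as teacher student reports, hours are based on the number of responses). The minimal sketch below illustrates this arithmetic for the student survey row; the function and variable names are illustrative only, and results for other rows may differ from table 3 by small amounts because of intermediate rounding.

```python
# Minimal sketch of the burden arithmetic behind table 3 (function and variable
# names are illustrative, not part of the study documentation).
import math

def burden_row(sample_size, response_rate, minutes_per_response, hourly_wage):
    """Return (respondents, total burden hours, burden time cost) for one table row."""
    respondents = math.ceil(sample_size * response_rate)  # respondents rounded up, per table note
    burden_hours = round(respondents * minutes_per_response / 60)
    burden_cost = round(burden_hours * hourly_wage)
    return respondents, burden_hours, burden_cost

# Student survey row: 22,975 sampled students, 85% expected response rate,
# 20-minute survey, federal minimum wage of $7.25 per hour.
print(burden_row(22_975, 0.85, 20, 7.25))  # (19529, 6510, 47198)
```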

Table 3. MS2 Burden Estimates

MGLS:2017 Activity | Sample Size | Expected Response Rate | Number of Respondents | Number of Responses | Median Burden Time (minutes) | Total Burden (hours) | Estimated Respondent Average Hourly Wage¹ | Estimated Respondent Burden Time Cost

MS2B Tracking/Recruitment (Fall 2019)

Recruitment: Augmentation Sample
Nonparticipating Districts | 455 | 68% | 310 | 310 | 20 | 104 | $46.85 | $4,873
Participating Districts |  | 32% | 146 | 146 | 260 | 633 | $46.85 | $29,657
District IRB staff study approval | 43 | 100% | 43 | 43 | 120 | 86 | $46.85 | $4,030
District IRB panel study approval² | 215 | 100% | 215 | 215 | 120 | 430 | $46.85 | $20,146
Nonparticipating eligible schools (MS2 augmentation sample) | 647 | 68% | 440 | 440 | 20 | 147 | $46.85 | $6,887
Participating schools (MS2 augmentation sample) |  | 32% | 207 | 207 | 260 | 897 | $46.85 | $42,025
Nonparticipating eligible schools (MS1 non-participating schools attempted for MS2) | 185 | 85% | 157 | 157 | 20 | 52 | $46.85 | $2,436
Participating schools (MS1 non-participating schools attempted for MS2) |  | 15% | 28 | 28 | 260 | 121 | $46.85 | $5,669
School Coordinator (roster) | 234 | 100% | 234 | 234 | 360 | 1,404 | $28.18 | $39,565
Students’ parents | 5,800³ | 85% | 4,930 | 4,930 | 10 | 822 | $24.98 | $20,534

Tracking/Recruitment: Enrollment Status Update
Base Year districts | 400 | 100% | 400 | 400 | 30 | 200 | $46.85 | $9,370
Mover districts | 325 | 100% | 325 | 325 | 120 | 650 | $46.85 | $30,453
District IRB staff study approval | 132 | 100% | 132 | 132 | 120 | 264 | $46.85 | $12,369
District IRB panel study approval² | 660 | 100% | 660 | 660 | 120 | 1,320 | $46.85 | $61,842
School staff at Base Year schools | 568 | 95% | 540 | 540 | 20 | 180 | $46.85 | $8,433
School staff at mover schools | 1,100⁴ | 95% | 1,045 | 1,045 | 20 | 349 | $46.85 | $16,351

Tracking: Locating Update
Parents (MS1)⁵ | 16,812 | 20% | 3,363 | 3,363 | 5 | 281 | $24.98 | $7,020

MS2B 2019 Total |  |  | 13,175 | 13,175 |  | 7,940 |  | $321,660

MS2 Data Collection

Students and Parents
Student Survey | 22,975 | 85% | 19,529 | 19,529 | 20 | 6,510 | $7.25 | $47,198
Student Assessment⁶ | 22,975 | 85% | 19,529 | 19,529 | 70 | 22,784 |  |
Students’ parents | 22,975⁷ | 85%⁸ | 19,529⁹ | 19,529 | 35¹⁰ | 11,392 | $24.98 | $284,573

Students’ mathematics teacher
Teacher survey | 3,600¹¹ | 85%¹² | 3,060 | 3,060 | 20 | 1,020 | $29.91 | $30,509
Teacher student report | 22,975¹³ | 85%¹² | 3,060⁹ | 19,529¹⁰ | 7 | 2,279 | $29.91 | $68,165

Students’ special education teacher
Teacher survey | 2,025¹¹ | 85%¹² | 1,722 | 1,722 | 10 | 287 | $30.98 | $8,892
Teacher student report | 3,315¹⁴ | 85%¹² | 1,722⁹ | 2,818¹⁴ | 20 | 940 | $30.98 | $29,121

School administrators and coordinators
Students’ school administrators¹⁵ | 1,900 | 85%¹² | 1,615 | 1,615 | 40¹⁶ | 1,077 | $46.85 | $50,457
School coordinator¹⁷ | 805 | 100% | 805 | 805 | 720 | 9,660 | $28.18 | $272,219

MS2 Data Collection Total |  |  | 46,260 | 68,607 |  | 33,165 |  | $791,134

Total Requested | - | - | 59,435 | 81,782 | - | 41,105 | - | $1,112,794

Note: Numbers of respondents have been rounded up to the nearest whole number.

1 The average hourly earnings derived from the May 2018 Bureau of Labor Statistics (BLS) Occupational Employment Statistics are $24.98 for parents, $46.85 for education administrators, $28.18 for educational guidance counselors (used for the school coordinator role), $29.91 for middle school teachers (used for mathematics teachers), and $30.98 for middle school special education teachers. If a mean hourly wage was not provided, it was computed assuming 2,080 hours per year. The exception is the student wage, which is based on the federal minimum wage. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/; occupation codes: All employees (00-0000); Education Administrators (11-9032); Middle School Teachers (25-2022); Middle School Special Education Teachers (25-2053); accessed on May 9, 2019.

2 Assumes 5 members on each district IRB panel.

3 Of the 6,163 students sampled, we expect about six percent to be ineligible due to not being enrolled in the sixth grade two years prior.

4 This estimate is for school coordinators at mover schools. It includes schools that students left after grade 6 because the schools end in grade 6, and other schools from which students moved since their participation in MGLS:2017 Base Year (when they were in grade 6). It has been updated to reflect the results of MS2A tracking activities.

5 The number of parents included in the MS2B locating update is based on the number of eligible sampled students in MS1.

6 Burden associated with student assessments is shown here for informational purposes. It is not included in the total burden calculations because, unlike the other burden presented here, it is not subject to the Paperwork Reduction Act (PRA).

7 The number of parent respondents reflects the total of the 16,812 in the MS1 eligible sample and the 6,163 in the augmentation sample.

8 The targeted response rate is higher than that achieved in MS1 data collection. Steps that have been and will be taken to maximize response rate include the following: between-round locating update parent contacts, including a $10 incentive for providing information updates; the ability to begin contacting most parents from the start of data collection, which was not possible in MS1 for parents associated with schools that provided lists late in data collection; and in-person field follow-up efforts outside of school by SFs.

9 The same respondent group as above, not double counted in the total number of respondents.

10 This represents the estimated median burden time for the full survey. The estimated time for the abbreviated survey is 20 minutes and the estimated time for the mini-survey is 5 minutes. The estimated distribution is 70% for the full survey, 12% for the abbreviated survey, and 3% for the mini-survey.

11 Assumes 3 math teachers per school and 1.1 math teachers per transfer school; assumes 2 special education teachers per school and 1 special education teacher in 38 percent of mover schools.

12 The targeted response rate is higher than that achieved in MS1 data collection. Steps that have been and will be taken to maximize response rate include the following: multiple between-round communications with school staff to get student enrollment updates and to remind school personnel about the upcoming data collection; the ability to begin contacting most staff from the start of data collection, which was not possible in MS1 with schools that provided lists late in data collection; and prompting by SFs when they are at the schools.

13 Teachers will be asked to complete student-level reports regardless of the students’ participation, so this estimate accounts for 85% of the sampled students for math teachers.

14 Teachers will be asked to complete student-level reports regardless of the students’ participation, so this estimate accounts for 85% of students in the sample with a special education teacher or provider. The number of student-level reports includes 2,699 students with IEPs from MS1 plus an estimated 616 additional students identified in MS2.

15 Includes administrators at MS2 participating schools conducting in-school sessions as well as administrators at other schools attended by MGLS:2017 students in spring 2020.

16 This represents the estimated median burden time for the full survey. The estimated time for the abbreviated survey is 20 minutes. The estimated distribution is 75% for the full survey and 10% for the abbreviated survey.

17 Includes only school coordinators at MS2 participating schools that will conduct in-school sessions, whether originally sampled for MGLS:2017 or newly identified through tracking activities.
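
As a cross-check, the section totals and the total burden requested in table 3 can be reproduced by summing the individual rows. The short sketch below (figures copied from the table; list names are illustrative) verifies the MS2B 2019 tracking/recruitment total, the MS2 data collection total, and the overall 41,105 burden hours requested; student assessment hours are excluded, per footnote 6.

```python
# Cross-check of the table 3 burden-hour totals by summing its rows; all figures
# are copied directly from the table, and student assessment hours are excluded
# per footnote 6.
tracking_hours = [104, 633, 86, 430, 147, 897, 52, 121, 1404, 822,
                  200, 650, 264, 1320, 180, 349, 281]
collection_hours = [6510, 11392, 1020, 2279, 287, 940, 1077, 9660]

assert sum(tracking_hours) == 7_940       # MS2B 2019 total burden hours
assert sum(collection_hours) == 33_165    # MS2 data collection total burden hours
assert sum(tracking_hours + collection_hours) == 41_105  # total burden hours requested
```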


A.13 Total Annual Cost Burden

There are no respondent costs other than the cost associated with response time burden.

A.14 Annualized Cost to Federal Government

As shown in table 4, the estimated cost to the federal government for contractor and subcontractor work to conduct all aspects of MS2 is $8,835,569. These figures include costs for planning, instrument development, recruitment, data collection, data analysis, and reporting.

Table 4. Contract Costs for MS2 Tracking, Recruitment, and Data Collection

Main Study First Follow-up Tracking, Recruitment, and Data Collection (MS2) | $7,775,947
MS2 Incentives | $1,069,622
Total | $8,835,569



A.15 Program Changes or Adjustments

The apparent increase in burden from the last approved submission reflects the fact that the last approval included only recruitment and tracking activities for MS2 and OFT3, whereas this submission includes the MS2 tracking activities and the MS2 data collection.

A.16 Plans for Tabulation and Publication

The results from the field tests will be presented in a field test report that will include an overview of the study; purposes of the IVFT, OFT1, and OFT2; sample design and methodologies employed; recruitment and data collection results; and recommendations for the main study. The MGLS:2017 methodology report will provide a description of the study design, sample design, training and data collection approaches, and data collection results. The MGLS:2017 psychometric report will provide a detailed accounting of the design and framework for all of the assessments, the item development process and results, and analyses related to the assessment implementation and results. The MGLS:2017 descriptive report will provide a limited set of statistical analyses. All MGLS:2017 results from the national data collections and all reports related to the field tests and national data collections will be made available on the NCES website. A schedule for OFT1, MS1, OFT2, OFT3, and MS2 is provided in table 5 (gray font delineates activities already concluded, and bold font delineates activities requested in this submission).

Table 5. Schedule for OFT1, MS1, OFT2, OFT3, and MS2

Activity | Start date | End date
OFT1 recruitment of schools and districts | April 2016 | March 2017
OFT1 recruitment of students and parents through requesting parent consent | January 2017 | May 2017
OFT1 data collection | January 2017 | May 2017
OFT1 & IVFT Report | June 2017 | December 2017
MS1 recruitment of schools and districts | February 2017 | April 2018
MS1 recruitment of students and parents through requesting parent consent | January 2018 | May 2018
MS1 data collection | January 2018 | August 2018
OFT2 tracking and recruitment | August 2017 | May 2018
OFT2 data collection | February 2018 | May 2018
OFT3 tracking and recruitment | September 2018 | May 2019
MS2 district and school notification | September 2018 | December 2018
MS2 tracking | September 2018 | May 2020
MS2 recruitment | January 2019 | May 2020
MS2 data collection | January 2020 | July 2020


A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all materials.

A.18 Exceptions to Certification Statement

No exceptions to the certification statement are requested or required.

1 Le Grange, D., Doyle, P. M., Swanson, S. A., Ludwig, K., Glunz, C., & Kreipe, R. E. (2012). Calculation of Expected Body Weight in Adolescents with Eating Disorders. Pediatrics, 129(2), e438–e446.

2 University of Chicago Medical Center. (2012, January 4). Calculating Weight in Children with Eating Disorders - Experts Urge BMI Method. Retrieved October 29, 2017, from https://www.disabled-world.com/health/eating-disorders/bmi-method.php

3 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

4 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

5 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.


