Middle Grades Longitudinal Study of 2017-18 (MGLS:2017)

Operational Field Test (OFT) and Recruitment for Main Study Base-year







OMB# 1850-0911 v.15





Supporting Statement Part A










National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC






July 2016


Revised June 2017




Preface

The Middle Grades Longitudinal Study of 2017-18 (MGLS:2017) is the first study conducted by the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the U.S. Department of Education, to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6–8). In preparation for the national data collection, referred to as the Main Study, the data collection instruments and procedures must be field tested.

This package requests clearance to conduct three components of the study: (1) the MGLS:2017 Operational Field Test (OFT) data collection; (2) the recruitment of schools for the Main Study Base-year; and (3) the tracking of OFT students and associated recruitment of schools in preparation for the first follow-up OFT data collection.

An Item Validation Field Test (IVFT) was conducted in the winter/spring of 2016 (OMB# 1850-0911 v. 3,4,5,7,8,9) to determine the psychometric properties and predictive potential of assessment and survey items so that valid, reliable, and useful instruments could be constructed for the Main Study. The MGLS:2017 OFT data collection will begin in January 2017, at which time recruitment of Main Study Base-year schools will also commence. Tracking of students and associated recruitment of schools for the OFT first follow-up data collection is scheduled to begin in the late summer of 2017. The primary purposes of the OFT are to: (a) obtain information on recruiting, particularly for students in three focal IDEA-defined disability groups: specific learning disability, autism, and emotional disturbance; (b) obtain a tracking sample that can be used to study mobility patterns in subsequent years; and (c) test protocols and administrative procedures.

Part A of this submission presents information on the basic design of the OFT and Main Study, Part B discusses the statistical methods employed, and Part C provides content and item justifications for the MGLS:2017 student, parent, math teacher, special education teacher, and school administrator surveys, as well as the facilities observation checklist. Appendices OFT1-A through J and MS1-A through J provide the OFT and Main Study recruitment materials, respectively, consisting of a brochure, frequently asked questions, content of a study website, and letters to state and district officials, school principals, and parents. Appendices OFT1-K through R and MS1-K through R provide content summaries of the proposed assessments and surveys, and Appendices OFT1-S and T and MS1-S and T provide the student roster collection materials. Appendices OFT1-U and V provide the OFT survey instruments. Appendices OFT2-A through L provide OFT contact materials for schools and parents to track OFT student participants from Base-year to First Follow-up. Because OFT recruitment will still be ongoing at the time this request is approved, the burden and materials from the MGLS:2017 Recruitment for the 2017 OFT request (OMB# 1850-0911 v.6,9,10) are carried over in this submission. In the Appendix labeling, OFT1 refers to OFT Base-year materials, OFT2 to OFT First Follow-up materials, and MS1 to Main Study Base-year materials.

A. Justification

A.1 Importance of Information

As a study of the middle grades, MGLS:2017 will complement NCES’s plans for implementing a multi-cohort sequence for a longitudinal studies series. By aligning the Early Childhood Longitudinal Study Kindergarten Class of 2010–11 (ECLS-K:2011), MGLS:2017, and the next High School Longitudinal Study (HSLS), NCES will be able to collect, within a 10-year span, a full range of data on students’ school experiences as the students enter and then transition from elementary school into high school. Given its portfolio and experience in national longitudinal education studies, NCES is uniquely positioned to undertake this comprehensive, large-scale, longitudinal study of a nationally representative sample of middle grade youth that includes measures of known critical influences on adolescents’ academic and socioemotional trajectories. NCES is authorized to conduct MGLS:2017 by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543) and to collect students’ education records from education agencies or institutions for the purposes of evaluating federally supported education programs under the Family Educational Rights and Privacy Act (FERPA, 34 CFR §§99.31(a)(3) and 99.35).

MGLS:2017 will rely on a set of longitudinal and complementary instruments to collect data across several types of respondents to provide information on the outcomes, experiences, and perspectives of students across grades 6, 7, and 8; their families and home lives; their teachers, classrooms, and instruction; and the school settings, programs, and services available to them. At each wave of data collection in the main study, students’ mathematics and reading skills, socioemotional development, and executive function will be assessed. Students will also complete a survey that asks about their engagement in school, out-of-school experiences, peer group relationships, and identity development. Parents will be asked about their background, family resources, and involvement with their child’s education and their school. Students’ mathematics teachers will complete a two-part survey. In part 1, they will be asked about their background and classroom instruction. In part 2, they will be asked to report on the academic behavior, mathematics performance, and classroom conduct of each study child in their classroom. For students receiving special education services, their special education teacher or provider will also complete a survey similar in structure to the two-part mathematics teacher instrument, consisting of a teacher-level questionnaire and a student-level questionnaire, but with questions specific to the special education experiences of and services received by the sampled student. School administrators will be asked to report on school programs and services, as well as on school climate.

With data collection occurring in three rounds beginning in winter/spring 2018 and finishing in 2020, MGLS:2017 will provide rich descriptive data on the academic experiences, development, and learning that occur during the critical middle grade years (grades 6–8), and on the individual, social, and contextual factors that are related to development and future success, thereby allowing researchers to examine associations between various factors and student outcomes. A wealth of research highlights the importance of mathematics and literacy skills for success in high school and subsequent associations with later education and career opportunities. Thus, MGLS:2017 will focus on student achievement in these areas, along with measures of student socioemotional well-being and other outcomes. The study will also collect data on the educational experiences, outcomes, and special education services of students with different types of disabilities, with a particular focus on students with a specific learning disability, autism, and/or emotional disturbance. A key goal of the study is to provide researchers and policymakers with the information they need to better understand the school and non-school influences associated with mathematics and reading success, socioemotional health, and positive life development during the middle grade years and beyond.

To support the development of the study, MGLS:2017 is conducting two field tests: the IVFT was conducted from February through May 2016 and will be followed by the OFT, which will take place from January through May 2017. Recruitment for the OFT is currently underway (OMB# 1850-0911 v. 6,9,10).

The goal of the IVFT was to evaluate and inform the development of reliable, valid measures. With these measures now in nearly final form, the OFT will focus on testing MGLS:2017 materials and procedures, which have been revised based on the results of the IVFT, and on refining the recruitment techniques needed to obtain a nationally representative sample and improve data quality. Gaining schools' cooperation in voluntary research is increasingly challenging. The OFT will include a responsive design approach for nonresponding parents. The OFT is also an opportunity to finalize standardized protocols for test administration. It will allow NCES to estimate the timing of the assessments and surveys to ensure that the length of the final instruments used in the Main Study is consistent with the time communicated to respondents.

A.2 Purposes and Uses of Data

MGLS:2017 will provide nationally representative data related to students’ transitions from elementary school and preparations for transitions into high school, as well as their academic, social, and interpersonal growth during the middle grades. MGLS:2017 will culminate in a rich data set that can be used by researchers, educators, and policymakers to examine family and educational factors related to student achievement. In addition to studying students in the middle grades more generally, educators and policymakers will also be able to use the resulting data to examine the effectiveness of services provided to students in three focal disability groups. The longitudinal nature of the study will allow for analyses of changes in young people’s lives and of how their connections with their communities, schools, teachers, families, and peers affect these changes.

The study is guided by a conceptual framework that emphasizes the complex interrelationships that help shape students’ development and learning, ultimately supporting their academic success and positive development for success in life. MGLS:2017 is designed around a framework of research questions, including:

  1. How do students develop cognitively (including executive function and academic achievement), socially, and emotionally in the middle grades? What school and nonschool factors are associated with that development?

  2. What school and home environment factors are associated with students’ cognitive development and executive function?

  3. What school and home environment factors are associated with students’ regulation and engagement, social skills and behaviors, externalizing problem behaviors, and academic performance?

  4. What is the nature of students’ identity development (including aspirations, peer relationships, and goals) across the middle grades? How does identity development influence school engagement and motivation?

  5. What school and home environment factors are related to the academic success of students with various risk factors often associated with lower academic achievement, such as poverty and low parent education?

  6. What are students’ experiences making the transition from elementary to middle grades? How do parents, teachers, and schools support this transition, as well as the transition from middle grades to high school?

  7. What school and home environment supports are available to middle grade students for setting education pathways and pursuing career goals?

The purpose of MGLS:2017 is to provide data that support the exploration of research interests across disciplines, which will in turn deepen the knowledge base and inform policy and practice. In addition, MGLS:2017 will provide education researchers with data that are currently unavailable: nationally representative longitudinal data focusing specifically on the middle grades.

The study design includes direct measurement of students during a student session that includes the following assessments and surveys:

Reading. The MGLS:2017 reading assessment will provide valuable information about the reading achievement of students in grades 6-8 with a focus on reading comprehension.

Mathematics. The mathematics assessment is designed to measure growth toward algebra readiness in anticipation of the demands students will encounter in high school mathematics coursework. The mathematics assessment will provide valuable information about the development of middle grades students’ knowledge of mathematics and their ability to use that knowledge to solve problems, moving toward stronger reasoning, and understanding of more advanced mathematics.

Executive Function. Executive function, a set of capacities and processes originating in the prefrontal cortex of the brain, permits individuals to self-regulate, engage in purposeful and goal-directed behaviors, and conduct themselves in a socially appropriate manner. Self-regulation is needed for social success, academic and career success, and good health outcomes. Executive function includes capacities such as shifting (cognitive and attention flexibility), inhibitory control, and working memory.

Student Survey. The purpose of the student survey is to collect information on students’ attitudes and behaviors, out-of-school time use, and family, school, and classroom environments. The student survey will also serve as a source for information about socioemotional outcomes having to do with social relationships and support and academic engagement. These data augment the information collected from the mathematics, reading and executive function assessments to provide a deeper understanding of the social and contextual factors related to students’ academic and non-academic outcomes.

Height and Weight. Measuring students’ height and weight provides data to assess body mass index as an indicator of obesity, pubertal timing (i.e., growth spurt), and eating disorders.
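For reference, body mass index is derived from the measured height and weight using the standard formula (shown here for completeness; it is not itself an item administered to students):

    \text{BMI} = \frac{\text{weight (kg)}}{[\text{height (m)}]^{2}} \approx 703 \times \frac{\text{weight (lb)}}{[\text{height (in)}]^{2}}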

Student data will be supplemented by data collected from students’ parents, teachers, and school administrators:

Parent Survey. The purpose of the parent survey is to collect information about: 1) family involvement in their child’s education and 2) family characteristics that are key predictors of academic achievement and other student outcomes.

Mathematics Teacher Survey. The purpose of the mathematics teacher survey is to gather information on the teaching and mathematics classroom context for use in understanding students’ development and mathematics learning during the middle grades. Teachers also rate the sampled students on their math ability.

Special Education Teacher Survey. The purpose of the special education teacher survey is to gather information on the teaching and classroom context for students with disabilities during the middle grades.

School Administrator Survey. The purpose of the school administrator survey is to provide context for school factors that influence student development, motivation, and mathematics learning.

Facilities Observation Checklist. The facilities observation checklist for the school setting will be used to document the condition of the physical plant and the availability of resources. This information will be collected by field staff and will complement the School Administrator Survey.

Further detail on the assessment and survey content is found in Part C. For more information on the data collection from different types of respondents see Part B.

A.3 Use of Improved Information Technology (Reduction of Burden)

Where feasible, available technology will be used to reduce burden and improve efficiency and accuracy. For example, if districts can provide information linking students to their mathematics teachers or students with disabilities to their special education teachers electronically, we will use this information rather than asking for it at the school level. The burden of recruitment on districts and schools will be minimal, with most information gathered over the telephone. Districts will primarily be asked to provide confirmation of data gathered from other sources, including school universe files and district and school websites. Our collection of student lists will accommodate whatever format districts and schools find to be the least burdensome. The study will utilize the information in any format in which it is provided.

The student assessments and survey will be completed on a Chromebook, a tablet-like computer with touchscreen capability and an attached keyboard. The computerized assessment is made possible by connecting the Chromebooks to an independent local area network (LAN) housed on a laptop computer set up at the school by study field staff. All equipment is provided by the study, and neither the school's internet access nor any internet access in general is required for the computerized administration of the student session.

The parent and school staff questionnaires will be fielded as web surveys. Using this data collection mode will allow for automatic routing of respondents through the surveys, which contain some instances of complex question branching. The automatic routing reduces respondent burden by producing faster interviews. The respondent will not be asked inapplicable questions and will not need to spend time determining which questions to answer. Also, electronic capture of responses reduces processing time and the potential for data entry error.

A computer-based data management system will be used to manage the sample. The sample management system uses encrypted data transmission and networking technology to maintain timely information on respondents in the sample, including contact, tracking, and case completion data. This system will be particularly important as students move from one school to another over the course of the study. The use of technology for sample management will maximize tracking efforts, which should have a positive effect on the study’s ability to locate movers and achieve acceptable response rates.

A.4 Efforts to Identify Duplication

MGLS:2017 will not be duplicative of other studies. While NCES longitudinal studies have contributed to our understanding of the factors that influence student success and failure in school, no NCES study has yet collected data across the middle grades (grades 6–8). A majority of nationally representative longitudinal studies have focused on high school students and on the transition from secondary to postsecondary education: e.g., the High School and Beyond Longitudinal Study (HS&B) and the Education Longitudinal Study of 2002 (ELS:2002). The Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K) and the National Education Longitudinal Study of 1988 (NELS:88) collected data on students in grade 8, but neither included a data collection in grades 6 and 7. The ECLS-K:2011 does not plan to follow students beyond grade 5, and the High School Longitudinal Study of 2009 (HSLS:09) began with a national sample of students in grade 9. Thus, there is little information at the national level about the learning that occurs during grades 6–8 and about the rates of learning for different groups of students who may experience diverse school environments and opportunities.

MGLS:2017 is unique in that it will assess students’ mathematics and reading achievement, as well as other student outcomes (e.g., executive function and socioemotional development), for the same group of students over a 3-year period. In addition to ECLS-K and NELS:88, other national studies have assessed some of these outcomes for students in grade 8, including the National Assessment of Educational Progress (NAEP) and the Trends in International Mathematics and Science Study (TIMSS). These studies, however, are cross-sectional and do not include repeated measures of achievement or assess multiple subjects and areas of development for the same sample of students. Therefore, they cannot answer questions about students’ growth in mathematics and reading over the middle grade years, about differences in the rates of growth for different populations (e.g., differences by sex, by race/ethnicity, and for students attending public and private schools), and about the school and non-school factors that may facilitate or hinder this growth. Nor can they explore questions about the relationships between student achievement and other school outcomes and executive functions (e.g., working memory, attention, and inhibitory control) that work to regulate and orchestrate cognition, emotion, and behavior to enable a student to learn in the classroom. MGLS:2017 will also be unique in its inclusion of oversamples of students on the autism spectrum or who have emotional disturbance. These oversamples will allow those students, as well as students in the largest IDEA-defined category, specific learning disability, to be studied as separate groups and be compared to general education students over the three middle level years.

Other adolescent development studies have been conducted, but they often do not include a grade 6 sample. For example, the youngest children in the National Longitudinal Study of Adolescent Health (Add Health) and the Maryland Adolescent Development in Context Study (MADICS) were in grade 7 at baseline. Many of these studies collected data on local samples, had a primary focus on family and child processes, and were started in the 1990s: e.g., MADICS and the Michigan Study of Adolescent and Adult Life Transitions (MSALT). As such, they do not provide a contemporary picture of U.S. students in grades 6–8.

A.5 Minimizing Burden for Small Entities

Burden will be minimized wherever possible. During district and school recruitment, we will minimize burden by training recruitment staff to make their contacts as straightforward and concise as possible. The recruitment letters and materials (e.g., the study description and FAQs) are designed to be clear, brief, and informative. In addition, contractor staff will conduct all test administration and will assist with parental notification, sampling, and other study tasks as much as possible within each school.

A.6 Frequency of Data Collection

The base-year data collection for the OFT for MGLS:2017 is expected to take place from January through May of 2017. Tracking activities for the OFT will occur starting in the fall of 2017. The Main Study data collection is scheduled for January through July of 2018 with first and second follow-up data collections planned for 2019 and 2020, respectively.

A.7 Special Circumstances

There are no special circumstances involved with this study.

A.8 Consultations Outside NCES

Content experts have been consulted in the development of the assessments and questionnaires. These experts are listed by name, affiliation, and expertise in table 1.

Table 1. Members of the MGLS:2017 Content Review Panels

Name | Affiliation | Expertise

Mathematics Assessment Content Review Panel (June 18–19, 2013)

Tom Loveless | Brookings Institution | Policy, mathematics curriculum
Linda Wilson | Formerly with Project 2061 | Mathematics education, mathematics assessment, middle school assessment, author of NCTM Assessment Standards for School Mathematics and NAEP math framework, teacher
Kathleen Heid | University of Florida | Mathematics education, use of technology, teacher knowledge, NAEP Grade 8 Mathematics Standing Committee member
Edward Nolan | Montgomery County Schools, Maryland | Mathematics curriculum and standards, large-scale assessment of middle grade students
Lisa Keller | University of Massachusetts, Amherst | Psychometrics, former mathematics teacher
Paul Sally | University of Chicago | Mathematics education, mathematics reasoning, mathematically talented adolescents
Margie Hill | University of Kansas | Co-author of Kansas mathematics standards, former NAEP Mathematics Standing Committee member, former district math supervisor

Executive Function Content Review Panel (July 18, 2013)

Lisa Jacobson | Johns Hopkins University; Kennedy Krieger Institute | Development of executive functioning skills, attention, neurodevelopmental disorders, and parent and teacher scaffolding
Dan Romer | University of Pennsylvania | Adolescent risk taking
James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning

Socioemotional-Student-Family Content Review Panel (July 25–26, 2013)

James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning
Russell Rumberger | University of California, Santa Barbara | School dropouts, ethnic and language minority student achievement
Tama Leventhal | Tufts University | Family context, adolescence, social policy, community and neighborhood indicators
Susan Dauber | Bluestocking Research | School organization, educational transitions, urban education, parent involvement and family processes
Scott Gest | Pennsylvania State University | Social networking, social skills, longitudinal assessment of at-risk populations
Kathryn Wentzel | University of Maryland | Social and academic motivation, self-regulation, school adjustment, peer relationships, teacher-student relationships, family-school linkages
Richard Lerner | Tufts University | Adolescent development and relationships with peers, families, schools, and communities

School Administrator Content Review Panel (August 16, 2013)

Susan Dauber | Bluestocking Research | School organization, educational transitions, urban education, parent involvement and family processes
George Farkas | University of California, Irvine | Schooling equity and human resources
Jeremy Finn | State University of New York at Buffalo | School organization, school dropouts
Edward Nolan | Montgomery County Schools, Maryland | Large urban school system administrator
Tom Loveless | Brookings Institution | Policy, math curriculum

Reading Assessment Content Review Panel (April 14, 2014)

Donna Alvermann | University of Georgia | Adolescent literacy, online literacy, codirector of the National Reading Research Center (funded by the U.S. Department of Education)
Joseph Magliano | Northern Illinois University | Cognitive processes that support comprehension, the nature of memory representations for events depicted in text and film, strategies to detect and help struggling readers
Sheryl Lazarus | University of Minnesota | Education policy issues related to the inclusion of students with disabilities in assessments used for accountability purposes, student participation and accommodations, alternate assessments, technology-enhanced assessments, teacher effectiveness, large-scale assessments, school accountability, research design (including cost analyses), data-driven decision making, rural education, the economics of education

Disabilities Content Review Panel (April 29, 2014)

Jose Blackorby | SRI International | Autism, specific learning disabilities, special education, curriculum design, alternate student assessment, large-scale studies of students with disabilities, codirector of the Special Education Elementary Longitudinal Study (SEELS)
Lynn Fuchs | Vanderbilt University | Specific learning disabilities, student assessment, mathematics curriculum, psychometric models
Mitchell L. Yell | University of South Carolina | Autism, emotional and behavior disorders, specific learning disabilities, pre-K–12 instruction and curriculum, special education, evidence-based intervention
Sheryl Lazarus | University of Minnesota | Special education policy, inclusion of students with disabilities in assessments, accommodations, alternate assessments, technology-enhanced assessments, large-scale assessments, school accountability, research design (including cost analyses)
Martha Thurlow | University of Minnesota | Specific learning disabilities, reading assessment, alternate student assessment, early childhood education, special education, curriculum, large-scale studies
Diane Pedrotty Bryant | University of Texas, Austin | Educational interventions for improving the mathematics and reading performance of students with learning disabilities, the use of assistive technology for individuals with disabilities, interventions for students with learning disabilities and who are at risk for educational difficulties

Technical Review Panel (May 10, 2016)

Grace Kao | University of Pennsylvania | Dr. Kao's research interests center on the explanation of immigrant, racial, and ethnic disparities in education outcomes. Her work has used quantitative analyses of nationally representative data on students and parents (including NCES data sets as well as Add Health).
Margaret McLaughlin | University of Maryland | Dr. McLaughlin's research focuses on special education policy, particularly the use of large-scale data in policy research, including investigation of the impact of education reform on students with disabilities and special education programs.
Lisa Jacobson | Kennedy Krieger Institute | Dr. Jacobson specializes in clinical pediatric neuropsychology. Her research interests include cognitive and behavioral aspects of disorders related to attention and executive functions. She is interested in how children's developing executive functions interact with developmental contexts both at home and at school.
Brian Rowan | University of Michigan | Dr. Rowan's research has focused on the organization and management of schooling, paying special attention to the measurement and improvement of teaching quality. His current research includes a randomized field trial of an early grades reading intervention, an evaluation of a high school instructional improvement program, and a study of online high schools in Florida.
Oscar Barbarin | University of Maryland | Dr. Barbarin's research has focused on the social and familial determinants of ethnic and gender achievement gaps beginning in early childhood. An additional focus is Dr. Barbarin's concern with socioemotional and academic development, particularly of boys of color.
James P. Byrnes | Temple University | Dr. Byrnes's interests include the modeling of academic achievement, decision-making and risk-taking, development of mathematical expertise, gender differences in achievement, and critical thinking about neuroscientific research.
Dan Romer | Adolescent Communication Institute, Annenberg Public Policy Center | Dr. Romer has studied social influences on adolescent health with particular attention to the social transmission of risky behavior. He is currently studying a cohort of adolescents in Philadelphia to understand the risk factors that underlie early use of drugs and other threats to healthy development. His interests include the relationship between risk behavior and executive function.
Jeremy Finn | University at Buffalo | Dr. Finn's research interests include school organization and class size; student engagement, disengagement, and dropping out; students at risk; and using quantitative methods to study policy issues.
Lynn Newman | SRI International | Dr. Newman has experience in education and social science research in disability policy and human services. She has expertise in quantitative and qualitative methodologies and large-scale, longitudinal studies, particularly with respect to school experiences and transitions of youth with disabilities.


A.9 Payments or Gifts to Respondents

High levels of school participation are critical to the success of each phase of the study. School administrator, mathematics teacher, special education teacher, parent, and student data collection activities are contingent on school cooperation. NCES recognizes that the burden level of the study is one of the factors that school administrators will consider when deciding whether to participate. To offset the perceived burden of participation, NCES intends to continue to use strategies that have worked successfully in other NCES studies (e.g., ECLS-K, ECLS-K:2011, HS&B, NELS:88, and ELS:2002), including offering both monetary and non-monetary incentives to be given to respondents after they participate in the data collection activities, for example upon completion of a survey. Table 2 summarizes the proposed incentive amount for each instrument and activity along with their estimated administration times; a brief justification for each incentive amount follows table 2. Incentive information is provided for the OFT and Main Study Base Year and OFT First Follow-up data collection activities.

Table 2. OFT and Main Study Instruments and Proposed Incentive Amounts

Instrument/Activity | Administration Time* | Operational Field Test and Main Study** Incentives

OFT and Main Study Base Year Data Collection

Student return of parent consent forms (explicit consent schools only) | 10 minutes | Food event at school (e.g., pizza, bagels) sponsored by the study
Student Assessments and Survey (Mathematics, Reading, Executive Function, and Student Survey) | 90 minutes | Earbuds used during assessment, plus choice of one item (average value $0.50 each), e.g.: 1) mechanical pencil, 2) mobile device screen cleaner, 3) suncatcher, or 4) slap bracelet
Parent Survey | 40 minutes | $0 to $40 ($20 to $50 for parents of students with emotional disturbance); for details, please see the description below
Mathematics Teacher: Teacher Survey | 20 minutes | $20
Mathematics Teacher: Teacher Student Report | 10 minutes per student | $7 per student
Special Education Teacher: Teacher Survey | 10 minutes | $20
Special Education Teacher: Teacher Student Report | 25 minutes per student | $7 per student
School Administrator Survey | 40 minutes | No monetary incentive
School Participation: School Coordinator (logistics, on-site visit, consent forms, administrative records, etc.) | 6 hours for consent assistance; 2 hours to schedule assessments; 2 hours to set up web access and coordinate computer labs; 6 hours to provide administrative records | $200 or $400 by check, or $400 in materials or services for the school (OFT Base Year); $400 or $600 by check or in materials or services for the school (Main Study Base Year); $150 for the coordinator

OFT Follow-up Data Collection

Student return of parent consent forms (explicit consent schools only) | 10 minutes | Food event at school (e.g., pizza, bagels) sponsored by the study
Student Assessments and Survey (Mathematics, Reading, Executive Function, and Student Survey), in-school administration | 75 minutes | Earbuds used during assessment (no additional token incentive)
Student Assessments and Survey (Mathematics, Reading, Executive Function, and Student Survey), out-of-school administration | 75 minutes | $20
School Administrator Survey | 40 minutes | No monetary incentive
School Participation: School Coordinator (logistics, on-site visit, consent forms, administrative records, etc.) | 6 hours for consent assistance; 2 hours to schedule assessments; 2 hours to set up web access and coordinate computer labs; 6 hours to provide administrative records | $200 by check or $200 in materials or services for the school; $150 for the coordinator

*Note that the assessment administration time may be longer for students with disabilities.

** Final incentive amounts were determined based on the outcome of the field tests.


Students

Operational Field Test (OFT): In the OFT, we plan a simple experiment on the token of appreciation for student participation. To build goodwill toward the study on the part of students, we will offer students a choice of one of four items valued at $0.25 to $1 each. These items will not be branded with an MGLS:2017 logo so as not to identify the student as having participated in this particular study. NCES has experience providing tokens of appreciation to elementary and high school students, but we are less familiar with what would be an attractive token for middle grades students.

Main Study (MS): A choice of up to four token incentive items will be offered to students in the Main Study data collection. In addition, students will be allowed to keep the earbuds used during the assessment. For students at schools where explicit parental permission is required (i.e., return of parental consent form), students who return the form by a set date will be offered a food event sponsored by the study (e.g., pizza, bagels, etc.).

Operational Field Test Follow-Up (OFT2): Students participating in school will be allowed to keep the earbuds used during the assessment. Students who have left their base school and are unable to participate at their destination or transfer school (e.g., if fewer than 4 study students are enrolled there) will be invited to participate via the web outside of school and will be offered $20 for their participation. The monetary incentive offered to these students is designed to encourage them to incur the burden of participating in the study on their own time.

Parents

Parent survey response rates have declined over the past decade. The ECLS-K:2011 baseline (fall 2010) parent survey response rate was more than 10 percentage points lower (74 percent)1 than the parent survey rate in the corresponding 1998 wave of the ECLS-K (85 percent).2 Additionally, the ninth-grade parent survey response rate for the HSLS:09 baseline was 68 percent.3 The MGLS:2017 parent survey is a key component of the data being collected.

To improve the chances of obtaining higher parent participation rates in a school-based design, we will work with school personnel to recruit sample students’ parents for MGLS:2017. In the Main Study, we plan to use a responsive design approach to identify cases for nonresponse follow-up interventions such that the responding sample will be as representative as possible of the population (i.e., sixth-graders) and thereby reduce the risk of potential nonresponse bias. A simulation of the responsive design approach will be tested in the OFT, as described below. Results of this approach and recommendations for the main study will be provided in a separate package.

Operational Field Test (OFT): During the OFT, we will test varying incentive amounts at the onset of parent interview collection to determine the optimal baseline incentive offer for the Main Study. To better understand effective non-response follow-up, we will also test increased incentive offers among pending nonrespondents at two different points in data collection. A separately tailored incentive approach will be utilized for parents of students with the emotional disturbance IEP classification based on the results of the IVFT in which those parents responded at a lower rate than other parents (see section B.4). The parent incentive experimental conditions are shown in table 3. Although in the Main Study responsive design methods will be employed to select pending nonresponding cases for targeted interventions, in the OFT, where the number of cases will be relatively small, increased incentive amounts will be determined by random assignment. In order to inform Main Study procedures, data from the OFT experiment will be analyzed with an approach that simulates responsive design.

Baseline (Phase 1)

In the OFT, parents of students who do not have an emotional disturbance IEP designation will be randomly assigned to one of three baseline incentive groups (phase 1):

  • Cases in group A (20 percent of the cases) will receive no incentive during the entire data collection period.

  • Cases in group B (40 percent of the cases) will be offered $10 at the start of data collection.

  • Cases in group C (40 percent of the cases) will be offered $20 at the start of data collection.

During phase 1, parents will be asked to complete the online questionnaire and telephone prompting will begin about three weeks after the initial contact is made with the parent.

Non-response Follow-up (Phase 2 and Phase 3)

Approximately one-third of the way through the OFT data collection (phase 2; about 3 weeks after telephone prompting begins), nonresponding cases in groups B and C will be randomly assigned to a treatment group ($10 incentive boost) or a control group (no incentive boost).

Approximately two-thirds of the way through the OFT data collection (phase 3), one more incentive boost experiment will be implemented among remaining nonresponding cases in groups B and C. The treatment group will receive an increase in incentive of whatever amount brings their total incentive to equal $40, while the control group will not receive a boost from the previous offer.

The number of cases will not be sufficient to allow for assignment and experimentation based on responsive design modeling; therefore, the incentive boosts for pending nonrespondents (up to a total offer of $40) will be assigned before phases 2 and 3 using a random-assignment experimental design. After the end of data collection, a responsive design simulation will be conducted to observe the potential effectiveness of the non-response follow-up interventions. This analytic approach will inform the Main Study procedures by simulating the responsive design model proposed for the Main Study. This simulation approach was employed successfully in the second follow-up field test data collection for the High School Longitudinal Study of 2009 (HSLS:09).
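To make the assignment logic concrete, the following minimal Python sketch simulates the baseline assignment and the phase 2 and phase 3 boosts described above. The group proportions and dollar amounts are taken from the text; the function names, data structures, and the 50/50 treatment/control split within each boost phase are illustrative assumptions, not study specifications.

    import random

    def assign_baseline(parent_ids):
        """Randomly assign parents of students without an EMN IEP to the baseline
        groups described above: A (20%, $0), B (40%, $10), C (40%, $20)."""
        offers = {}
        for pid in parent_ids:
            draw = random.random()
            if draw < 0.20:
                offers[pid] = {"group": "A", "offer": 0}
            elif draw < 0.60:
                offers[pid] = {"group": "B", "offer": 10}
            else:
                offers[pid] = {"group": "C", "offer": 20}
        return offers

    def phase2_boost(offers, nonrespondents, p_treatment=0.5):
        """About one-third of the way through collection, randomly assign pending
        nonrespondents in groups B and C a $10 boost (treatment) or no boost (control)."""
        for pid in nonrespondents:
            case = offers[pid]
            if case["group"] in ("B", "C") and random.random() < p_treatment:
                case["offer"] += 10

    def phase3_boost(offers, nonrespondents, p_treatment=0.5):
        """About two-thirds of the way through collection, raise treated cases'
        cumulative offer to $40; control cases keep their previous offer."""
        for pid in nonrespondents:
            case = offers[pid]
            if case["group"] in ("B", "C") and random.random() < p_treatment:
                case["offer"] = 40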

We will develop a model to retrospectively identify the nonrespondents most likely to contribute to bias as of the beginning of phase 2. We will then compare response rates, as of the end of phase 2, between those nonrespondents who were offered an additional incentive and those who were not, in order to examine the degree to which the incremental incentive applied in phase 2 increased response among this group. The same model will be used to identify the nonrespondents most likely to contribute to bias as of the beginning of phase 3, and the analogous comparison of response rates will be made as of the end of phase 3. Additional details of the retrospective analysis model plan are included in Part B of this submission.
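For context, a standard deterministic expression for the nonresponse bias of a respondent mean, which is the quantity such follow-up interventions aim to reduce, is

    \operatorname{bias}(\bar{y}_{r}) \approx \frac{n_{nr}}{n}\,(\bar{y}_{r} - \bar{y}_{nr}),

where \bar{y}_{r} and \bar{y}_{nr} are the respondent and nonrespondent means and n_{nr}/n is the nonresponse rate.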

Parents of Students with Emotional Disturbance

It is expected that parents of students with emotional disturbance may be a more challenging group from which to obtain participation than parents whose children do not have that disability. In the IVFT, parents of students with emotional disturbance had a lower participation rate than other parents (see section B.4 for details). While the study will be inclusive of most students with disabilities, students in each of the three focal disability groups (emotional disturbance, autism, and specific learning disability) comprise an analytically critical population because the study is designed to allow students in these groups to be analyzed separately. Given the relatively lower IVFT response rate among parents of students with emotional disturbance, we will implement a separate treatment for parents of students with that disability. We will randomly assign these parents to an offer of a $20 incentive or a $30 incentive at the outset of data collection and increase the incentive by $10 (a cumulative offer of $30 or $40) one-third of the way through data collection and by another $10 (a cumulative offer of $40 or $50) two-thirds of the way through data collection. Providing a differential treatment for analytically critical populations was used successfully in HSLS:09 with students who had reportedly ever dropped out of school.

Table 3. OFT Parent Incentive Experimental Conditions

Experiment Group | Phase 1: Baseline Incentive | Phase 2: One-third of the way through data collection | Phase 3: Two-thirds of the way through data collection

A: Parents of Students without Emotional Disturbance (EMN) IEP | $0 (no offer) | $0 (no boost) | $0 (no boost)
B: Parents of Students without EMN IEP | $10 | $10 (no boost) or $20 ($10 boost) | If at $10: $10 (no boost) or $40 ($30 boost); if at $20: $20 (no boost) or $40 ($20 boost)
C: Parents of Students without EMN IEP | $20 | $20 (no boost) or $30 ($10 boost) | If at $20: $20 (no boost) or $40 ($20 boost); if at $30: $30 (no boost) or $40 ($10 boost)
D: Parents of Students with EMN IEP | $20 | $30 | $40
E: Parents of Students with EMN IEP | $30 | $40 | $50

NOTE: Amounts shown are the cumulative incentive offers in effect at each phase.


Main Study: The outcome of the OFT parent incentive experiment will determine the starting and highest incentive values for parents in the Main Study. We will develop a responsive design model to select targeted nonresponding cases for additional intervention to minimize bias in the final participating sample. The responsive design model will be presented in the Main Study data collection package to be submitted to OMB in summer 2017.

Teachers

Operational Field Test (OFT) and Main Study: The incentive proposed for students' teachers is $20 per teacher survey, plus $7 per teacher student report (TSR). This incentive would be applied in both the OFT and Main Study. These amounts are consistent with the amounts used in current NCES studies, such as the ECLS-K:2011. For the mathematics teacher, it is estimated that the teacher survey will take 20 minutes to complete, and the teacher student report will take 10 minutes per student to complete. For the special education teacher, it is estimated that the teacher survey will take 10 minutes to complete, and the teacher student report will take 25 minutes per student to complete. The teacher student report is expected to take longer for the special education teacher because it includes an additional indirect assessment of the student's skills that is not included in the mathematics teacher's teacher student report. We are proposing to use the same incentive structure for all teachers, regardless of the specific questionnaires they are being asked to complete, to protect against any perception of unfairness that might result if teachers within a school talk to one another about the amount they have received for a specific questionnaire. Teachers will not be asked to complete a survey in the OFT2.
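As an illustrative example of how these amounts combine (hypothetical numbers, not a figure from the study), a mathematics teacher with five sampled students in his or her classes would be offered $20 + (5 × $7) = $55 in total.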

Schools

Operational Field Test (OFT): As part of OFT school recruitment, we are conducting an incentive experiment. Each school has been randomly assigned to one of three experimental conditions. Given the many demands and outside pressures that schools already face, it is essential that they see that MGLS:2017 staff understand the additional burden being placed on school staff when requesting their participation. The study asks for many kinds of information and cooperation from schools, including a student roster with basic demographic information (e.g., date of birth, sex, and race/ethnicity); information on students' IEP status, math and special education teachers, and parent contact information; permission for field staff to be in the school for up to a week; space for administering student assessments; permission for students to leave their normal classes for the duration of the assessments; and information about the students' teachers and parents. Sampled students with disabilities may require accommodations and different assessment settings, such as individual administration and smaller group sessions, which will add to the time the study spends in schools and may require additional assistance from school staff to ensure that these students are accommodated appropriately.

One of the key questions for the OFT is whether sufficient numbers of students in the focal disability groups can be selected for the study. Gaining cooperation from schools that have more students in those disability groups will be important for the success of MGLS:2017. Therefore, the sample of 103 schools will be classified by the number of students within the school in the focal disability groups (autism, emotional disturbance, and specific learning disability): (1) "higher" – schools with 17 or more sixth-grade [or age-based equivalent] students in these three groups, versus (2) "lower" – schools with fewer than 17 sixth-grade [or age-based equivalent] students in these three groups. The sample design assumes a 3 percent school ineligibility rate; a sampled school is ineligible if it has closed or does not enroll students in grade 6.

Within the two school types (schools with “higher” or “lower” counts of students in the three disability groups), each school will be randomly assigned to one of three experimental conditions. In Condition 1, the baseline condition, we will offer one third of the sample schools a $200 incentive for participation. This amount is consistent with the amount offered for participation in other NCES studies, such as the ECLS-K, ECLS-K:2011, TIMSS, and the Program for International Student Assessment (PISA). However, based on previous difficulties in recruiting schools for the originally approved MGLS:2017 field test recruitment (to have been conducted in 2015), and the general decline in school participation in NCES longitudinal studies over the years,4 we will test offering one third of the sample schools $400 (Condition 2), and one third of schools a choice of one of seven non-monetary incentives equivalent to $400 (Condition 3). The list of the non-monetary incentive choices is provided in table 4.
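As an illustration only, the stratified random assignment described above could be carried out along the lines of the following minimal Python sketch; the 17-student threshold and the three incentive conditions come from the text, while the function names and data structures are hypothetical.

    import random

    CONDITIONS = ["$200 check", "$400 check", "$400 non-monetary equivalent"]

    def classify_school(n_focal_students):
        """Classify a school by its count of sixth-grade (or age-equivalent) students
        with autism, emotional disturbance, or a specific learning disability."""
        return "higher" if n_focal_students >= 17 else "lower"

    def assign_conditions(school_focal_counts):
        """Within each stratum ('higher'/'lower'), shuffle the schools and cycle
        through the three conditions so that roughly one third receive each offer."""
        assignments = {}
        for stratum in ("higher", "lower"):
            members = [s for s, n in school_focal_counts.items()
                       if classify_school(n) == stratum]
            random.shuffle(members)
            for i, school in enumerate(members):
                assignments[school] = CONDITIONS[i % 3]
        return assignments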

The school incentive experiment, with the same three experimental conditions [(1) $200, (2) $400, or (3) a non-monetary incentive equivalent to $400], was also used during the MGLS:2017 IVFT, which was conducted from January through May 2016. The IVFT had a larger sample (250 schools) and did not subdivide the schools by the number of students in the disability groups of interest.

Table 4. Non-Monetary Incentive Choices for Schools in Experimental Condition 3

Incentive | Value

Registration for Association for Middle Level Education (AMLE) or Regional Annual Meeting | $400
Two-Year School Membership in AMLE | $400
Membership in Regional Middle Level Organization plus Subscriptions to Professional Journals | $400
Professional Development Webinar | $400
School Supplies | $400
Library of Middle Level Publications | $400


Recruitment of schools for the OFT began in April 2016 and will continue until approximately April 2017. Results from the IVFT recruitment effort will be reviewed in summer 2016 and may inform recruitment efforts for OFT sample schools that by that point have not yet agreed to participate in the OFT. Administrators at non-participating IVFT schools will be asked to complete a brief survey via web or phone about their reasons for not participating in the study and what about the study plan and/or incentives, if anything, would have changed the school’s mind. A change request package with a revised OFT recruitment plan may be submitted in the fall of 2016 if we learn from the IVFT debriefing questionnaire that a particular incentive may be effective for gaining school cooperation in the study. The debriefing questionnaire will also be used in the OFT to inform the main study (appendix OFT1-W.4.b). As with the IVFT, the debriefing questionnaire will include the following questions of nonparticipating schools:

  • For what reasons did your school decide not to participate in MGLS:2017?

  • Your school was offered [incentive] to participate in the study. Would a different incentive have changed your mind about participating? In other words, would you have agreed to let your school participate in MGLS:2017, if the incentive was a different amount or type of incentive?

  • [If yes] How much or what would the incentive need to be for your school to decide to participate? [record comments]

  • Is there anything else, not mentioned above, we could have done to enable your school to participate?

The results of the school incentive experiment in the OFT will be combined with the results of the experiment in the IVFT. With the combined 350 schools across the two field test samples, we will be able to observe the potential impact of the different incentive treatments to inform school recruitment strategies in the Main Study.
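As one simple illustration of how the combined experiment might be examined (a sketch only; the counts below are placeholders rather than study results, and the particular analysis is not one specified in this submission), participation rates across the three incentive conditions could be compared with a chi-square test of independence:

    # Hypothetical illustration using placeholder counts.
    from scipy.stats import chi2_contingency

    # Rows: conditions ($200 check, $400 check, $400 non-monetary equivalent);
    # columns: participated, declined.
    observed = [
        [60, 57],
        [70, 47],
        [65, 51],
    ]
    chi2, p_value, dof, expected = chi2_contingency(observed)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")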

Main Study (MS): A school-level incentive for the Main Study of $400 or non-monetary equivalent to $400 in materials or services was approved in April 2017 (1850-0911 v.13) based on the results of the IVFT and OFT.

As described in more detail in the Supporting Statement Part B of this submission, we will offer an additional $200 (for a total of $600) to schools associated with districts that initially decline to participate and that have one or more schools that are designated as having “higher” counts of students in the focal disability groups.

OFT Follow-up (OFT2): OFT2 consists of a shorter student session and fewer staff surveys than OFT1. However, there is still considerable burden on the school to provide tracking information and to coordinate student sessions. We will offer schools $200 in cash or cash equivalent to encourage their participation in OFT2.

School Coordinators

Operational Field Test (OFT), Main Study (MS), and OFT Follow-up (OFT2): School coordinators will be offered a $150 monetary incentive. They play an especially important role in the study and are critical to its success. The coordinator in each participating school will coordinate logistics with the data collection contractor; compile and supply to the data collection contractor a list of eligible students for sampling (for OFT and Main Study) and enrollment status update information for OFT2; communicate with teachers, students, and parents about the study to encourage their participation; distribute and collect parental consent forms; and assist the test administrator in ensuring that the sampled students attend the testing sessions.

A.10 Assurance of Confidentiality

NCES is authorized to conduct this study by the Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9543). By law, the data provided by schools, staff, parents, and students may be used only for statistical purposes and may not be disclosed or used in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151). The laws pertaining to the collection and use of personally identifiable information will be clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and informational materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential. This information will also be included in any research application required by school districts. A list of sixth-grade students with IEPs will be requested from school districts and/or schools under the FERPA exception to the general consent requirement that permits disclosures to authorized representatives of the Secretary for the purpose of evaluating Federally supported education programs (34 CFR §§ 99.31(a)(3)(iii) and 99.35). This information will be securely destroyed when no longer needed for the purposes specified in 34 CFR §99.35.

The confidentiality pledge was updated during the course of the OFT, as reflected in the submission documents (Appendices MS1-C through MS1-T, Appendices OFT2-A through OFT2-L, and the note for Appendices OFT1-T through OFT1-W), to add the Cybersecurity Enhancement Act of 2015 provision. The revised pledge reads: "All of the information you provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151)." The OFT Base-year (OFT1) materials and the Main Study (MS1) Endorsement Request Letter, State Letter, and Sample Endorsement Letter were not updated because they became operational, and had already been used, prior to the implementation of the revised pledge. All other aspects of the Main Study (MS1) and recruitment for the OFT follow-up (OFT2) now include the revised pledge.

The confidentiality plan developed for MGLS:2017 requires that all contractor and subcontractor personnel and field workers who will have access to individual identifiers sign confidentiality agreements and notarized nondisclosure affidavits. The plan also requires that all personnel receive training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses. NCES understands the legal and ethical need to protect the privacy of the MGLS:2017 respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The data files, accompanying software, and documentation will be delivered to NCES by the data collection contractor at the end of the project. Neither names nor addresses will be included in any data file.

To protect personally identifiable information (PII), data security and confidentiality protection procedures have been put in place for MGLS:2017 to ensure that RTI International and its subcontractors comply with the following security, privacy, and confidentiality laws, regulations, and requirements:

  1. The Statement of Work of this contract (ED-IES-15-O-5016);

  2. Family Educational Rights and Privacy Act (FERPA) of 1974 (20 U.S.C. §1232g);

  3. Privacy Act of 1974 (5 U.S.C. §552a);

  4. Privacy Act Regulations (34 CFR Part 5b);

  5. Computer Security Act of 1987;

  6. U.S.A. Patriot Act of 2001 (P.L. 107-56);

  7. Education Sciences Reform Act of 2002 (ESRA 2002, 20 U.S.C. §9573);

  8. Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA);

  9. E-Government Act of 2002, Title V, Subtitle A;

  10. Cybersecurity Enhancement Act of 2015 (6 U.S.C. §151);

  11. The U.S. Department of Education General Handbook for Information Technology Security General Support Systems and Major Applications Inventory Procedures (March 2005);

  12. The U.S. Department of Education Incident Handling Procedures (February 2009);

  13. The U.S. Department of Education, ACS Directive OM: 5-101, Contractor Employee Personnel Security Screenings;

  14. NCES Statistical Standards; and

  15. All new legislation that impacts the data collected through the inter-agency agreement for this study.

Furthermore, RTI International will comply with the Department’s IT security policy requirements as set forth in the Handbook for Information Assurance Security Policy and related procedures and guidance, as well as with the IT security requirements in the Federal Information Security Management Act (FISMA), Federal Information Processing Standards (FIPS) publications, Office of Management and Budget (OMB) circulars, and National Institute of Standards and Technology (NIST) standards and guidance. All data products and publications will also adhere to the revised NCES Statistical Standards, described at http://nces.ed.gov/statprog/2012/.

A.11 Sensitive Questions

MGLS:2017 is a voluntary study; no persons are required to respond to the questionnaires or to participate in the assessments, and respondents may decline to answer any question they are asked. The voluntary nature of the study is clearly stated in the advance letter mailed to adult respondents, in other study materials such as the Frequently Asked Questions, and in the instructions on the web and hardcopy questionnaires. Field staff and telephone interviewers also stress this point in their contacts with respondents, and data collection staff are trained both to communicate the voluntary nature of the study to participants and to follow these guidelines. Additionally, students may decline to participate during the assessments, and field staff are trained to respect students’ wishes. The following describes the topics in each instrument that may be sensitive for some respondents.

Schools. A roster of all students in grade 6 will be requested from each school or its school district, including IEP and disability information. Schools may have concerns about providing this information without first obtaining permission from parents. The disclosure is permitted under FERPA’s exception to the general consent requirement, which permits disclosures to authorized representatives of the Secretary for the purpose of evaluating Federally supported education programs (34 CFR §§ 99.31(a)(3)(iii) and 99.35). This information will be securely destroyed when no longer needed for the purposes specified in 34 CFR §99.35. All district and school personnel facilitating the conduct of the study and developing the sampling frame will be informed of the privacy and confidentiality protocols required for the study, including those pertaining to the sample lists of schools and students. The collection of these data is necessary to facilitate the oversampling of students in two of the three focal disability groups: autism and emotional disturbance. Schools that opt not to provide IEP and disability information may still participate; in that case, disability information will be requested as part of the parent and teacher surveys.

School Administrator. The items in the School Administrator Questionnaire are not of a sensitive nature and should not pose sensitivity concerns to respondents.

Math Teacher. The information collected in the teacher student report could be regarded as sensitive because the teacher is asked to provide information about a student’s academic skills, social skills (including classroom behavior and peer relationships), problem behavior (including anger, manipulation, and disobedience), and experience with peer victimization, both as a victim and as an aggressor.

Special Education Teacher. As with the math teacher survey, information collected in the teacher student report may be regarded as sensitive. Each special education teacher is asked to provide information on a student’s special education status, IEP goals, and services received. The survey also includes questions on the teacher’s expectations for the student, and the student’s academic and life skills.

Parent. To achieve the study’s primary goal of describing the development, academic outcomes, and characteristics of middle grades students, we will be asking parents a few questions that could be viewed as sensitive in nature by some respondents. Questions about family income, disciplinary practices, their child’s disabilities, and problems their child may be having at school, including experience with peer victimization, are included in the parent survey.

The types of questions included in the staff and parent surveys have been asked in many large-scale studies of school-age children including the ECLS-K, ECLS-K:2011, and HSLS:09. These questions are central to describing the middle grades population and to examining the variability in students’ development, mathematics and reading achievement, and other student outcomes.

Student. The student questionnaire includes a few questions that could be sensitive for some students. Questions about internalizing attitudes or behaviors, perceptions of competencies in mathematics, and school and class attendance are included in this self-report survey. Students are also asked to self-report their race/ethnicity and sex, which could be sensitive questions for students at this age. The questions that are included in the student survey have been asked in other studies of adolescents and the responses to these questions have been found to help explain why some students do better than others in school and are more engaged in learning.

The in-school session will also include a height/weight measurement of participating students. Care will be taken to ensure the privacy of this information, and as with all components of the study, participation in the height/weight measurement is voluntary.

A.12 Estimates of Burden

Burden estimates for the OFT data collection, Main Study recruitment, and OFT tracking and recruitment activities are shown in this section.

Operational Field Test: The OFT Recruitment portion of table 5 shows the expected burden for districts, schools, and parents during the OFT recruitment activities. For the OFT, we anticipate collecting data within 50 schools, from approximately 1,120 participating students, their parents, and their teachers. Because OFT recruitment is still ongoing, burden for this activity is carried over from the prior OMB Recruitment package (OMB# 1850-0911 v.10) and included in the burden estimates below.

As shown in Part B, we anticipate (a) contacting approximately 103 schools (with an estimated 97 percent school-eligibility rate and 50 percent participation rate) to reach the approximately 50 schools needed for participation, and (b) contacting the parents of approximately 1,750 students (with an estimated 200 students in each of the focal disability groups) to yield approximately 1,120 participating students. In order to draw samples of students with disabilities, we may need to obtain student records information from up to four districts.
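
For illustration only, the yield arithmetic behind these recruitment targets can be reproduced with the following minimal sketch; the 97 percent eligibility and 50 percent participation rates are the planning assumptions stated above, and the 64 percent parent consent rate is the planning assumption shown in table 5, not an observed result.

```python
# Illustrative sketch of the OFT recruitment yield arithmetic (planning assumptions only).
schools_contacted = 103
eligibility_rate = 0.97            # estimated share of contacted schools that are eligible
school_participation_rate = 0.50   # estimated share of eligible schools that participate

eligible_schools = round(schools_contacted * eligibility_rate)               # ~100
participating_schools = round(eligible_schools * school_participation_rate)  # ~50

parents_contacted = 1_750
parent_consent_rate = 0.64         # estimated share of contacted parents who consent

participating_students = round(parents_contacted * parent_consent_rate)      # ~1,120

print(participating_schools, participating_students)
```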

We estimate that it will take 20 minutes on average for school and district administrators to review the materials and either agree or decline to participate, and also for all schools and districts to debrief on reasons why schools or districts chose to participate or not to participate in the OFT. The debrief may be completed via phone or email and will ask school administrators what factors led to the decision to participate or not to participate in MGLS:2017. For example, the debrief may ask whether content, incentives, time, and/or other factors impacted the decision. The results will be used to inform recruitment efforts for the main study. For those participating, we estimate an additional 4 hours for the provision of student rosters, including information about students for sampling, contact information for their parents, and their math and special education teachers (see Appendices OFT1-S and OFT1-T).

For students’ parents, we estimate that it will take up to 10 minutes to review the recruitment materials and either consent or refuse to participate (on behalf of their student and themselves). The provision of student rosters and the parents’ consent forms will serve as sources for parents’ contact information, which during the data collection period can be used for nonresponse follow-up.

The OFT Data Collection portion of table 5 shows the expected burden for the OFT data collection. The burden time estimates are based on the maximum reasonable expected burden per respondent:

  • The student session, which includes the student assessments and survey, will be approximately 90 minutes. Within the 90 minutes, the student survey portion will take approximately 20 minutes.

  • The parent survey will take approximately 40 minutes.

  • The mathematics teacher will complete a teacher survey, expected to take approximately 20 minutes, and a teacher student report for each sampled student in his or her class, expected to take approximately 10 minutes per student. The teacher-level survey burden estimates assume on average three 6th grade math teachers per school (with an average of approximately 12 sampled students per teacher); with an estimated 50 schools needed for the OFT, this means approximately 150 mathematics teachers.

  • The special education teacher will complete a teacher survey, expected to take approximately 10 minutes, and a teacher student report for each sampled student in his or her class, expected to take approximately 25 minutes per student. The special education teacher-level survey burden estimates assume on average one special education teacher per school (with an average of roughly 8 sampled students per teacher); with an estimated 50 schools needed for the OFT, this means approximately 50 special education teachers. (A sketch following this list illustrates how these per-respondent assumptions roll up into the table 5 hour totals.)

  • The school administrator survey will take approximately 40 minutes to complete.

  • The school coordinator will spend, on average, up to 4 hours per day, per assessment day, supporting study activities. The burden estimates assume one assessment day.
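
For illustration only, the following minimal sketch shows how the per-respondent times listed above combine with the expected numbers of responses in table 5 to produce the table 5 burden-hour totals; all counts are the planning assumptions already stated, and small differences can arise from rounding.

```python
# Illustrative sketch: per-respondent burden times rolled up into table 5 hour totals.
def burden_hours(responses: int, minutes_per_response: float) -> int:
    # Total burden (hours) = number of responses x average minutes per response / 60.
    return round(responses * minutes_per_response / 60)

print(burden_hours(1_120, 20))   # student surveys                            -> ~373 hours
print(burden_hours(1_120, 40))   # parent surveys                             -> ~747 hours
print(burden_hours(138, 20))     # mathematics teacher surveys                -> ~46 hours
print(burden_hours(1_610, 10))   # mathematics teacher student reports        -> ~268 hours
print(burden_hours(46, 10))      # special education teacher surveys          -> ~8 hours
print(burden_hours(368, 25))     # special education teacher student reports  -> ~153 hours
print(burden_hours(49, 40))      # school administrator surveys               -> ~33 hours
```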

The OFT Tracking portion of table 5 shows the expected burden for the OFT enrollment status and tracking activities. The estimates of response burden for these proposed activities are based on tracking experiences in HSLS:09.

As discussed in Part B, we anticipate contacting all of the (approximately 50) OFT base-year participating schools, and contacting the parents of the (approximately 1,120) OFT base-year participating students. In addition, we will contact an estimated 25 schools to which students have moved (“mover schools”) for OFT enrollment status update activities.

We estimate that it will take 20 minutes on average for school staff to provide enrollment status of sampled students and 5 minutes on average for parents to provide updated contact information.

We project that approximately 95 percent of schools will provide enrollment status and 20 percent of parents will provide updated contact information.
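
As a simple illustration, the projected tracking respondent counts shown in table 5 follow directly from these rates; the sketch below uses only the figures stated above.

```python
# Illustrative sketch of the projected OFT tracking respondent counts (table 5).
print(round(50 * 0.95))      # base-year schools providing enrollment status  -> 48
print(round(25 * 0.95))      # mover schools providing enrollment status      -> 24
print(round(1_120 * 0.20))   # parents providing updated contact information  -> 224
```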

Main Study: The Main Study Recruitment portion of table 5 shows the expected burden for districts, schools, and parents during the Main Study recruitment activities. For the Main Study Base-year, we anticipate collecting data within 900 schools. As described in Part B, we expect overall school participation of 60 percent, and thus the school sampling process will include a reserve sample. Assuming a 60 percent school participation rate, we will need to contact approximately 1,500 schools to yield approximately 900 participating schools.

The student sampling process is designed to achieve approximately 20,322 grade 6 participants (Part B, Table 4); to obtain this yield, we plan to sample approximately 26,100 students (Part B). In order to draw samples of students with disabilities, we estimate needing to request student records from up to four districts.

We estimate that it will take 20 minutes on average for school and district administrators to review the materials and either agree or decline to participate, and also for all schools and districts to debrief on reasons why schools or districts chose to participate or not to participate in the Main Study. For those participating, we estimate an additional 4 hours for the provision of student rosters, including information about students for sampling, contact information for their parents, and their math and special education teachers (see Appendices MS1-S and MS1-T). For students’ parents, we estimate that it will take up to 10 minutes to review the recruitment materials and either consent or refuse to participate (on behalf of their student and themselves). The provision of student rosters and the parents’ consent forms will serve as sources for parents’ contact information, which during the data collection period can be used for nonresponse follow-up.

The total response burden estimate for district IRB approvals (in the special handling districts that require completion of a research application before they will allow schools under their jurisdiction to participate in a study) is based on an estimated 120 minutes for IRB staff approval and 120 minutes per panelist for approval by the district’s IRB panel, which is estimated to average 5 panelists.
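
For illustration only, the Main Study recruitment and district IRB figures above can be reproduced with the following minimal sketch; the 95 percent parent response rate is the planning assumption shown in table 5.

```python
# Illustrative sketch of the Main Study recruitment and district IRB burden arithmetic.
participating_schools = round(1_500 * 0.60)            # ~900 schools at a 60% participation rate

parents_contacted = 26_100
parent_respondents = round(parents_contacted * 0.95)   # ~24,795 parents reviewing consent materials

irb_districts = 263                                    # special handling districts requiring IRB review
panelists_per_district = 5
irb_staff_hours = round(irb_districts * 120 / 60)                           # 526 hours
irb_panel_hours = round(irb_districts * panelists_per_district * 120 / 60)  # 2,630 hours

print(participating_schools, parent_respondents, irb_staff_hours, irb_panel_hours)
```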



Table 5. Burden Estimates for OFT Recruitment, Data Collection, and Tracking, and for Main Study Recruitment¹

| Respondent group | Sample Size | Expected Response Rate | Number of Respondents | Number of Responses | Average Burden Time (minutes) | Total Burden (hours) | Estimated Respondent Average Hourly Wage¹ | Estimated Respondent Burden Time Cost |

OFT Recruitment
| Nonparticipating districts | 12 | 67% | 8 | 8 | 20 | 3 | $44.13 | $132 |
| Participating districts |  | 33% | 4 | 4 | 260 | 17 | $44.13 | $750 |
| Nonparticipating eligible schools | 103² | 50%² | 50 | 50 | 20 | 17 | $44.13 | $750 |
| Participating schools |  | 50% | 50 | 50 | 260 | 217 | $44.13 | $9,576 |
| Students’ parents | 1,750 | 64% | 1,120 | 1,120 | 10 | 187 | $22.71 | $4,247 |
| Approved Total for OFT Recruitment³ | - | - | 1,232 | 1,232 | - | 441 | - | $15,455 |

OFT Data Collection: Students and Parents
| Student Survey | 1,750 | 64% | 1,120 | 1,120 | 20 | 373 | $7.25 | $2,704 |
| Student Assessment⁴ | 1,750 | 64% | 1,120 | 1,120 | 70 | 1,307 | - | - |
| Students’ parents | 1,750⁵ | 64% | 1,120⁵ | 1,120 | 40 | 747 | $22.71 | $16,964 |

OFT Data Collection: Students’ mathematics teacher
| Teacher survey | 150 | 92% | 138 | 138 | 20 | 46 | $27.70 | $1,274 |
| Teacher student report | 150* | 92% | 138* | 1,610⁶ | 10 | 268 | $27.70 | $7,424 |

OFT Data Collection: Students’ special education teacher
| Teacher survey | 50 | 92% | 46 | 46 | 10 | 8 | $28.65 | $229 |
| Teacher student report | 50* | 92% | 46* | 368⁷ | 25 | 153 | $28.65 | $4,383 |

OFT Data Collection: School administrators and coordinators
| Students’ school administrators | 50 | 99% | 49 | 49 | 40 | 33 | $44.13 | $1,456 |
| School coordinator | 50 | 100% | 50 | 50 | 720 | 600 | $26.94 | $16,164 |

OFT Tracking: Enrollment Status Update
| School staff | 50 | 95% | 48 | 48 | 20 | 16 | $44.13 | $706 |
| Mover schools | 25⁸ | 95% | 24 | 24 | 20 | 8 | $44.13 | $353 |

OFT Tracking: Locating Update
| Parents | 1,120 | 20% | 224 | 224 | 5 | 19 | $22.71 | $431 |

| Total for all OFT activities | - | - | 2,707 | 6,029 | - | 2,712 | - | $67,543 |

Main Study Recruitment
| Nonparticipating districts | 1,050 | 30% | 315 | 315 | 20 | 105 | $44.13 | $4,634 |
| Participating districts |  | 70% | 735 | 735 | 260 | 3,185 | $44.13 | $140,554 |
| District IRB staff study approval | 263 | 100% | 263 | 263 | 120 | 526 | $44.13 | $23,212 |
| District IRB panel study approval | 263 × 5 | 100% | 1,315 | 1,315 | 120 | 2,630 | $44.13 | $116,062 |
| Nonparticipating eligible schools | 1,500 | 40% | 600 | 600 | 20 | 200 | $44.13 | $8,826 |
| Participating schools |  | 60% | 900 | 900 | 260 | 3,900 | $44.13 | $172,107 |
| Students’ parents | 26,100 | 95% | 24,795 | 24,795 | 10 | 4,133 | $22.71 | $93,860 |
| Total for Main Study Recruitment | - | - | 28,923 | 28,923 | - | 14,679 | - | $559,255 |

| Total Requested | - | - | 31,630 | 34,952 | - | 17,391 | - | $626,798 |

1 Average hourly earnings from the 2014 National Compensation Survey sponsored by the Bureau of Labor Statistics (BLS) are $22.71 for parents, $27.70 for middle school teachers, $28.65 for middle school special education teachers, $44.13 for education administrators, and $26.94 for educational guidance counselors. Where a mean hourly wage was not provided, it was computed assuming 2,080 work hours per year. The exception is the student wage, which is based on the federal minimum wage. Source: BLS Occupational Employment Statistics (http://data.bls.gov/oes/); occupation codes: All employees (00-0000), Middle school teachers (25-2022), Middle school special education teachers (25-2053), Education administrators (11-9032), and Educational guidance counselors (21-1012); accessed June 18, 2015.

2 The OFT will start with a sample of 103 schools, though it is estimated that three percent of schools will not be eligible for the study. The response rate is based on the eligible sample of 100 schools.

3 Recruitment activities for the OFT will not be completed by the time this request is approved, and thus the approved burden affiliated with the OFT recruitment is carried over and included in the total requested in this submission.

4 Burden associated with student assessments is shown here for informational purposes. It is not included in the total burden calculations because, unlike the other burden presented here, it is not subject to the Paperwork Reduction Act (PRA).

5 The number of parent respondents is already included in the recruitment number of respondents.

6 Teachers will be asked to complete student-level reports regardless of the students’ participation, so this estimate accounts for 92% of the sampled students.

7 The number of student-level reports assumes an average of 8 students with IEPs per school.

8 This estimate includes schools to which students moved because their base-year school ends in grade 6, as well as other schools to which students have moved since their participation in the MGLS:2017 OFT Base-year (when they were in 6th grade).

* Same respondents as the teacher survey row above; not double counted in the total number of respondents.
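
For readers verifying table 5, the following minimal sketch (illustrative only, not part of the burden request) recomputes a few representative rows using the conventions above: burden hours equal responses times average minutes divided by 60, and burden time cost equals burden hours times the hourly wage in note 1. Small differences may arise from rounding.

```python
# Illustrative recomputation of selected table 5 rows (burden hours and estimated cost).
rows = [
    # (row label, number of responses, average minutes per response, hourly wage)
    ("OFT student survey",          1_120,  20,  7.25),
    ("OFT parent survey",           1_120,  40, 22.71),
    ("OFT school coordinators",        50, 720, 26.94),
    ("MS participating districts",    735, 260, 44.13),
    ("MS participating schools",      900, 260, 44.13),
]
for label, responses, minutes, wage in rows:
    hours = round(responses * minutes / 60)   # total burden hours
    cost = round(hours * wage)                # estimated respondent burden time cost
    print(f"{label}: {hours:,} hours, ${cost:,}")
```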

A.13 Total Annual Cost Burden

There are no respondent costs other than the cost associated with response time burden.

A.14 Annualized Cost to Federal Government

As shown in table 6, the estimated cost to the federal government for contractor and subcontractor work to conduct all aspects of the OFT Base-year, Main Study Base-year, and OFT First Follow-up sample tracking and recruitment is $14,987,326. These figures include costs for planning, instrument development, recruitment, data collection, data analysis, and reporting. The cost estimate for the Main Study Base-year portion associated with sampling and recruitment activities is $1,302,292. The total cost for the activities requested in this submission is $5,423,951.
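
As a quick arithmetic check (illustrative only), the cost figures cited above and shown in table 6 reconcile as follows.

```python
# Illustrative check of the contract cost figures in table 6.
oft_base_year           = 3_805_003
ms_sampling_recruitment = 1_302_292
ms_other_costs          = 9_563_375
oft_followup_tracking   = 316_656

total_contract = oft_base_year + ms_sampling_recruitment + ms_other_costs + oft_followup_tracking
requested_here = oft_base_year + ms_sampling_recruitment + oft_followup_tracking

print(total_contract)   # 14,987,326: total contract cost
print(requested_here)   # 5,423,951: cost of activities requested in this submission
```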

Table 6. Contract Costs for Operational Field Test, Main Study Base-year, and OFT Sample Tracking¹

| Activity | Cost |
| Operational Field Test Base-year | $3,805,003 |
| Main Study Base-year | $10,865,667 |
|   Main Study Base-year – Sampling and Recruitment | $1,302,292 |
|   Main Study Base-year – Other Costs (Data Collection, Reporting) | $9,563,375 |
| Operational Field Test First Follow-up Sample Tracking and Recruitment | $316,656 |
| Total | $14,987,326 |

1 Contract costs include one-fifth of the cost of the management task in the Operational Field Test Base-year and two-fifths of the cost of the management task in the Main Study Base-year.

A.15 Program Changes or Adjustments

The increase in burden from the last approved package reflects the fact that the total burden requested in this submission is the sum of the burden estimates for OFT recruitment, OFT data collection, OFT tracking, and Main Study recruitment, whereas the last approved burden covered only OFT recruitment and the IVFT.

A.16 Plans for Tabulation and Publication

The results from the OFT will be presented in a field test report that will be prepared approximately 6 months after the completion of the field test and subsequently made available in 2019 as an appendix in the MGLS:2017 Data File Documentation report. The field test report will include an overview of the study, purposes of the IVFT and OFT, sample design and methodologies employed, recruitment and data collection results, and recommendations for the main study. A schedule for the OFT, Main Study, and OFT tracking and recruitment is provided in table 7.

Table 7. Schedule for OFT, Main Study, and OFT Tracking

| Activity | Start date | End date |
| Recruitment of schools and districts for OFT | April 2016 | March 2017 |
| Recruitment of students and parents (parent consent requests) for OFT | January 2017 | May 2017 |
| OFT Data Collection | January 2017 | May 2017 |
| Field Test Report | June 2017 | December 2017 |
| Recruitment of schools and districts for Main Study | January 2017 | April 2018 |
| Recruitment of students and parents (parent consent requests) for Main Study | January 2018 | May 2018 |
| Main Study Data Collection | January 2018 | July 2018 |
| Tracking and recruitment for OFT follow-up | August 2017 | May 2018 |
| OFT Follow-Up Data Collection | January 2018 | May 2018 |


A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all materials.

A.18 Exceptions to Certification Statement

No exceptions to the certification statement are requested or required.

1 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

2 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

3 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

4 For example, in 1998–99, the Early Childhood Longitudinal Study had a weighted school-level response rate of 74 percent, whereas 12 years later, the complementary ECLS-K:2011 study had a weighted school-level response rate of 63 percent.


