
Middle Grades Longitudinal Study of 2017–18 (MGLS:2017)

Recruitment for

2017 Operational Field Test (OFT)








OMB# 1850-0911 v.6





Supporting Statement Part A










National Center for Education Statistics

U.S. Department of Education

Institute of Education Sciences

Washington, DC








September 2015

Revised November 2015




Preface

The Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) will be the first study sponsored by the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the U.S. Department of Education, to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6–8). In preparation for the main study, the data collection instruments and procedures must be field tested. This submission describes the recruitment of schools, school districts, and parents to participate in the MGLS:2017 Operational Field Test (OFT), for which the data collection is scheduled to begin in January 2017. A separate OMB clearance request for the OFT data collection will be submitted in May 2016.

The primary purposes of the OFT are to obtain information on recruiting, particularly for the targeted disability groups; to obtain a tracking sample that can be used to study mobility patterns in subsequent years; and to test protocols and administrative procedures.

Part A of this submission presents information on the basic design of the OFT; Part B discusses the collection of information employing statistical methods; Appendices A through J provide field test recruitment materials consisting of letters to state and district officials, school principals, and parents, as well as text for an MGLS:2017 brochure, frequently asked questions, and website. Because some schools and districts require submission of a research application that requests detailed information on the proposed assessments and surveys, Appendices K through R provide content summaries of the proposed assessments and surveys, and Appendices S and T provide the student roster collection materials. Appendix U provides the current drafts of the survey instruments. The MGLS:2017 contract covering the OFT through the main study’s seventh grade data collection was recently awarded to RTI International (a trade name of the Research Triangle Institute). The subsequent submission for the MGLS:2017 OFT data collection will include updated assessments and survey instruments based on results from the Item Validation Field Test (IVFT) that will be conducted in early 2016, concurrently with recruitment for the OFT. Because the MGLS:2017 IVFT recruitment and data collection will still be ongoing at the time this request is approved, the burden and materials from the MGLS:2017 Recruitment for 2016 IVFT request (OMB# 1850-0911 v.3, 5, and 7) and from the MGLS:2017 IVFT Data Collection (OMB# 1850-0911 v.4) are being carried over in this submission.

A. Justification

A.1 Importance of Information

The MGLS:2017 will be the first study sponsored by NCES to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6–8). A study of the middle grades will complement NCES’s plans for implementing a multi-cohort sequence for its longitudinal studies series: the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), the MGLS:2017, and the High School Longitudinal Study of 2020 (HSLS:2020) will together collect, within a given 10-year span, the full range of data on students’ school experiences as they transition from elementary school into high school. The federal government is uniquely positioned to undertake the needed comprehensive large-scale longitudinal study of a nationally representative sample of middle grade youth that includes measures of known critical influences on adolescents’ academic and socioemotional trajectories. NCES is authorized to conduct the MGLS:2017 under the Education Sciences Reform Act of 2002 (20 U.S. Code, Section 9543).

The MGLS:2017 will be conducted with a nationally representative sample of students enrolled in sixth grade during the 2017–18 school year, with the baseline data collection taking place from January through June of 2018. Annual follow-ups are planned for the winters of the 2018–19 and 2019–20 school years, when most of the students in the sample will be in grades 7 and 8, respectively. The MGLS:2017 will provide a rich descriptive picture of the academic experiences and development of students during these critical years and will allow researchers to examine associations between contextual factors and student outcomes. There is a wealth of research highlighting the importance of mathematics and literacy skills for success in high school and subsequent associations with later education and career opportunities. Thus, the study will focus on student achievement in these areas, along with measures of student socioemotional well-being and other outcomes. The study will also include a sample of students with different types of disabilities, providing descriptive information on their outcomes, educational experiences, and special education services.

The MGLS:2017 will rely on a set of longitudinal and complementary instruments to collect data across several types of respondents to provide information on the outcomes, experiences, and perspectives of students across grades 6, 7, and 8; their families and home lives; their teachers, classrooms, and instruction; and the school settings, programs, and services available to them. At each wave of data collection in the main study, students’ mathematics and reading skills, socioemotional development, and executive function will be assessed. Students will also complete a survey that asks about their engagement in school, out-of-school experiences, peer group relationships, and identity development. Parents will be asked about their background, family resources, and involvement with their child’s education and their school. Students’ mathematics teachers will complete a two-part survey. In part 1, they will be asked about their background and classroom instruction. In part 2, they will be asked to report on the academic behavior, mathematics performance, and classroom conduct of each study child in their classroom. For students receiving special education services, their special education teacher or provider will also complete a survey similar in structure to the two-part mathematics teacher instrument, consisting of a teacher-level questionnaire and student-level questionnaire, but with questions specific to the special education experiences of and services received by the study child. School administrators will be asked to report on school programs and services, as well as on school climate.

In short, the MGLS:2017 aims to provide data on the development and learning that occur during students’ middle grade years (grades 6–8) and that are predictive of future success, along with the individual, social, and contextual factors that are related to successful development. A key goal of the study is to provide researchers and policymakers with the information they need to better understand the school and nonschool influences associated with mathematics and reading success, socioemotional health, and positive life development during the middle grade years and beyond. To support the development of the study, the MGLS:2017 is conducting two field tests, the IVFT beginning in January 2016, followed by the Operational Field Test (OFT) that will begin in January 2017.

The study’s success is dependent on the development of reliable, valid measures. The goal of the IVFT is to collect data to support evaluation of the mathematics assessment, reading assessment, executive function assessment, student survey, parent survey, and school staff surveys. The IVFT will provide the data needed to determine the psychometric properties of items and the predictive potential of assessment and survey items so that valid, reliable, and useful assessment and survey instruments can be composed for the main study. As the focus of the IVFT is the analyses of the psychometric properties of the survey items and assessments, the IVFT requires a large, diverse field test sample, though not a nationally representative one.

Gaining schools’ cooperation in voluntary research is increasingly challenging. The OFT will be used to test materials and procedures revised based on the results of the IVFT and to gain a deeper understanding of effective recruitment strategies that lead to higher response rates and thus better data quality. The OFT will include a responsive design approach for nonresponding parents. The OFT is also an opportunity to finalize our standardized protocols for test administration. It will allow NCES to tighten assessment and survey timing so as to maximize the overall functionality of the assessments and surveys while minimizing the time it takes respondents to complete them. With its focus on recruitment strategies, tactics for retaining the sample in the study, and the operational administration of the surveys and assessments, the OFT will inform the main study and give the MGLS team small-scale practice in obtaining a nationally representative sample.

A.2 Purposes and Uses of Data

The OFT data collection will take place from January through June 2017. Unlike the cross-sectional, one-time administration of the IVFT, the OFT is partially longitudinal in nature. The purpose of the OFT is to recruit an approximately nationally representative sample of sixth graders; field a close-to-final version of the sixth grade assessments and surveys; and then track these sixth graders across the next 2 years, whether they stay within their sixth grade school or move to another school. Given that NCES has never followed a middle grades cohort and student mobility patterns can have a substantial impact on the data collection plan for the seventh and eighth grades, the OFT sample will be followed up in the winters of 2018 and 2019. As stated, the OFT is not currently designed to field the assessments and surveys past the sixth grade collection in 2017. The OFT will be used to better understand the recruitment strategies necessary for a large-scale nationally representative effort, the response rates that can be expected during recruitment and associated with a sixth grade data collection, and the effort involved in the subsequent tracking of the sample from the base year (when the children will be in the sixth grade) to the first follow-up (when most of the children will be in the seventh grade) and the second follow-up (when most of the children will be in the eighth grade).

Field Test Components

The OFT will include a sample that approximates a nationally representative sample to support a deeper understanding of the main study’s operations in terms of recruitment of the sample and operational considerations in fielding the assessments and surveys (see Part B, section B.1 for sample details). The OFT includes the following components: student assessments, student height and weight measurement, student survey, parent survey, math teacher survey, special education teacher survey, school administrator survey, and facilities observation checklist.

Student Assessments and Student Survey. Students will participate in assessments and a survey, designed to take approximately 90 minutes per student.

  • Mathematics Assessment. The MGLS:2017 mathematics assessment will be a 30-minute, two-stage adaptive assessment that students will take on a tablet computer. The focus will be on domains of mathematics that are most likely to be the central focus of middle school learning now and in the future: the Number System, Ratios and Proportional Relationships, Expressions and Equations, and Functions. To ensure that the study is sensitive to the variation in students’ mathematics ability, the assessment will include items with appropriately varying cognitive demand. The MGLS:2017 mathematics assessment is designed to assess high school algebra readiness and will provide valuable information about the development of middle grade students’ knowledge of mathematics and their ability to use that knowledge to solve problems, moving toward stronger reasoning and understanding of more advanced mathematics.

  • Reading Assessment. The MGLS:2017 reading assessment will use a two-stage adaptive assessment design consisting of a brief routing block (first stage: approximately 10 minutes) followed by a skill-based block (second stage: approximately 20 minutes), for a total of 30 minutes. The routing block will include items that measure foundational components of reading that are important for comprehension: Vocabulary, Morphological Awareness, and Sentence Processing. Performance on the routing block will direct students to one of three types of skill-based reading blocks (basic components, basic comprehension, or scenario-based comprehension) within the second stage.

The second-stage basic components skill block will be used to gather more information on the foundational reading component skills, including those measured in the first stage as well as word recognition and decoding skills. The second-stage basic comprehension skill block is designed to gather information about students’ efficiency at basic reading comprehension and their ability to comprehend short passages. This skill-based block will measure comprehension in a traditional design where unrelated passages and corresponding questions are presented. The second-stage scenario-based comprehension skill block is designed to gather information about students’ ability to comprehend informational text, reason more deeply about text, and apply what they learn from passages. The scenario-based block will include a scenario or a purpose for reading (e.g., preparing for a classroom discussion or creating a website on a topic).

  • Executive Function Measures. Executive function, a set of capacities and processes originating in the prefrontal cortex of the brain, permits individuals to self-regulate, engage in purposeful and goal-directed behaviors, and conduct themselves in a socially appropriate manner. Self-regulation is needed for social success, academic and career success, and good health outcomes. Executive function includes capacities such as shifting (cognitive and attention flexibility), inhibitory control, and working memory. Four different executive function measures will be included in the field tests: Stop Signal (inhibitory control), 3-Back with verbal stimulus (working memory), 2-Back with nonverbal stimulus (working memory), and the Hearts and Flowers task (shifting or cognitive flexibility).

  • Student Height and Weight Measurement. Measuring students’ height and weight provides data for computing body mass index, an indicator of obesity, and for examining pubertal timing (i.e., growth spurt) and eating disorders.

  • Student Survey. The purpose of the student survey is to collect information on students’ attitudes and behaviors; out-of-school time use; and family, school, and classroom environments. The student survey will also serve as a source for information about socioemotional outcomes having to do with social relationships, support, and school engagement.

Parent Survey. The parent survey will take 30 minutes to complete via a self-administered web-based questionnaire; a telephone interview follow-up will be available for respondents who do not complete the questionnaire via the web. The parent survey will focus on supplementing the information collected from students and teachers about the students’ educational experiences and on learning about parents’ expectations for their children’s academic attainment in high school and beyond. It will also collect information about family involvement in the children’s education and about family characteristics that are key predictors of academic achievement and other student outcomes.

Mathematics Teacher Survey/Teacher Student Report. The mathematics teacher survey will consist of two parts: a teacher survey and a series of teacher student reports (TSRs). Both the mathematics teacher survey and the TSR will be web-based, self-administered surveys, with a phone interview option available. The mathematics teacher survey is expected to take approximately 20 minutes to complete, and the TSR will take 5 to 10 minutes for each student that is rated. The mathematics teacher survey will collect data on potential classroom-level correlates of students’ mathematics achievement as well as school-level services and factors such as special programs, school climate, and instructional leadership.

The TSR will capture information specific to the sampled student and his or her mathematics class. It will provide information on the classroom attendance and performance of individual students, which will augment direct student assessments and student and parent reports. The TSR will also serve as an additional source for data on student socioemotional outcomes related to regulation, school engagement, and externalizing behaviors. In the web version of this instrument, teachers will be given a list of the students for whom they should complete a TSR and will click on each student’s name to launch the TSR for that specific student. If a teacher opts not to complete the web-based survey, a follow-up phone interview will be conducted.

Special Education Teacher Survey/Teacher Student Report. Like the mathematics teacher survey, the special education teacher/service provider survey will consist of two parts. The first part consists of the teacher questionnaire, which asks questions about the teacher’s background and experiences working with students with disabilities. The second part contains the TSR, which contains specific questions about special education services and other contextual variables for sampled children with an Individualized Education Program (IEP), as well as ratings of individual academic and life skills (the special educator rating scale, SPERS).

The special education teacher survey will be web based and self-administered, with a phone interview option available. The first part of the survey will take approximately 10 minutes to complete, and the second part will take about 25 minutes for each student. In the second part of the web version of this instrument, teachers will be given a list of the students for whom they should complete the survey and will click on each student’s name to launch this part for that specific student.

School Administrator Survey. The school administrator survey will be web based and self-administered, with a telephone option available, and will take the administrator (generally, the principal or principal’s designee) approximately 20 minutes to complete. The school administrator survey will collect information about a school’s characteristics and staffing (specifically, the school’s structure and climate, including safety, organization, and support). It will also collect information on the student population, student conduct, academic culture, course offerings, and extended learning opportunities (e.g., extracurricular activities, summer school, or supports for struggling students).

Facilities Observation Checklist. The facilities observation checklist for the school setting will be used to document the condition of the physical plant and the availability of resources. This information will be collected by field staff and will complement the School Administrator Survey.

Administration of Assessments and Survey Components

Similar to the IVFT, students’ parents, math teachers, special education providers (as applicable), and school administrators will be asked to complete surveys as described above. However, unlike the IVFT, the OFT will not employ a spiral design for the student assessments and student survey items. The purpose of the OFT is to, as closely as possible, mirror the main study approach; therefore, each student will receive quasi-final versions of the math assessment (30 minutes); reading assessment (30 minutes); two of the executive function assessments (10 minutes); and the student survey (20 minutes) (approximately 90 minutes total).

School Recruitment Approach

The school sample for the OFT will simulate a nationally representative sample, and the student sample will include students in grade 6 in general education schools in the United States. These students will likely demonstrate a range of ability on the constructs being measured by the MGLS item pool. The sample will also include a subset of students from the three focal disability groups (learning disability, autism, and emotional disturbance) who are in general education schools and are able to take standardized tests, using accommodations if necessary. Schools will be recruited both directly and at the district level.

A.3 Improved Information Technology (Reduction of Burden)

Where feasible, available technology will be used to reduce burden and improve efficiency and accuracy. For example, if districts can provide information linking students to their mathematics teachers or students with disabilities to their special education teachers electronically, we will use this information rather than asking for it at the school level. The burden of recruitment on districts and schools will be minimal, with most information gathered over the telephone. Districts will primarily be asked to provide confirmation of data gathered from other sources, including school universe files and district and school websites. Our collection of student lists will accommodate whatever format districts and schools find least burdensome; the study will use the information in whatever format it is provided.

A.4 Efforts to Identify Duplication

The MGLS:2017 will not be duplicative of other studies. While NCES longitudinal studies have contributed to our understanding of the factors that influence student success and failure in school, the middle grades (grades 6–8) are noticeably absent from the studies conducted to date. A majority of nationally representative longitudinal studies have focused on high school students and on the transition from secondary to postsecondary education: e.g., the High School and Beyond Longitudinal Study (HS&B) and the Education Longitudinal Study of 2002 (ELS:2002). The Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), and the National Education Longitudinal Study of 1988 (NELS:88) collected data on students in grade 8, but neither included a data collection in grades 6 and 7. The ECLS-K:2011 will not follow students beyond grade 5, and the High School Longitudinal Study of 2009 (HSLS:09) began with a national sample of students in grade 9. Thus, there is little information at the national level about the learning that occurs during grades 6–8 and about the rates of learning for different groups of students who may experience diverse school environments and opportunities.

The MGLS:2017 is unique in that it will assess students’ mathematics and reading achievement, as well as other student outcomes (e.g., executive function and socioemotional development), for the same group of students over a 3-year period. In addition to the ECLS-K and NELS:88, other national studies have assessed some of these outcomes for students in grade 8, including the National Assessment of Educational Progress (NAEP) and the Trends in International Mathematics and Science Study (TIMSS). These studies, however, are cross-sectional and do not include repeated measures of achievement or assess multiple subjects and areas of development for the same sample of students. Therefore, they cannot answer questions about students’ growth in mathematics and reading over the middle grade years, about differences in the rates of growth for different populations (e.g., differences by gender, by race/ethnicity, and for students attending public and private schools), and about the school and nonschool factors that may facilitate or hinder this growth. Nor can they explore questions about the relationships between student achievement and other school outcomes and executive functions (e.g., working memory, attention, and inhibitory control) that work to regulate and orchestrate cognition, emotion, and behavior to enable a student to learn in the classroom. The MGLS:2017 will also be unique in its focus on obtaining a sample of students in three disability categories that can be studied on their own or compared to general education students over the three middle level years.

Other adolescent development studies have been conducted, but they often do not include a grade 6 sample. For example, the youngest children in the National Longitudinal Study of Adolescent Health (Add Health) and the Maryland Adolescent Development in Context Study (MADICS) were in grade 7 at baseline. Many of these studies collected data on local samples, had a primary focus on family and child processes, and were started in the 1990s: e.g., MADICS and the Michigan Study of Adolescent and Adult Life Transitions (MSALT). As such, they do not provide a contemporary picture of U.S. students in grades 6–8.

A.5 Minimizing Burden for Small Entities

Burden will be minimized wherever possible. During district and school recruitment, we will minimize burden by training recruitment staff to make their contacts as straightforward and concise as possible. The recruitment letters and materials (e.g., the study description and FAQs) are designed to be clear, brief, and informative. In addition, contractor staff will conduct all test administration and will assist with parental notification, sampling, and other study tasks as much as possible within each school.

A.6 Frequency of Data Collection

The main activities of the OFT for the MGLS:2017 are expected to take place in January through June of 2017. Limited follow-ups (as described in section A.2) are planned for the winters of 2018 and 2019. The follow-ups will focus solely on student tracing and tracking and will not reassess students or readminister any survey components.

A.7 Special Circumstances

There are no special circumstances involved with the recruitment.

A.8 Consultations Outside NCES

Content experts have been consulted in the development of the assessments and questionnaires. These experts are listed by name, affiliation, and expertise in table 1.

Table 1. Members of the MGLS:2017 Content Review Panels

Name | Affiliation | Expertise

Mathematics Assessment Content Review Panel (June 18–19, 2013)
Tom Loveless | Brookings Institution | Policy, math curriculum
Linda Wilson | Formerly with Project 2061 | Math education, math assessment, middle school assessment, author of NCTM Assessment Standards for School Mathematics and NAEP math framework, teacher
Kathleen Heid | University of Florida | Math education, use of technology, teacher knowledge, NAEP Grade 8 Mathematics Standing Committee member
Edward Nolan | Montgomery County Schools, Maryland | Math curriculum and standards, large-scale assessment of middle grade students
Lisa Keller | University of Massachusetts, Amherst | Psychometrics, former math teacher
Paul Sally | University of Chicago | Math education, mathematics reasoning, mathematically talented adolescents
Margie Hill | University of Kansas | Co-author of Kansas mathematics standards, former NAEP Mathematics Standing Committee member, former district math supervisor

Executive Function Content Review Panel (July 18, 2013)
Lisa Jacobson | Johns Hopkins University; Kennedy Krieger Institute | Development of executive functioning skills, attention, neurodevelopmental disorders, and parent and teacher scaffolding
Dan Romer | University of Pennsylvania | Adolescent risk taking
James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning

Socioemotional-Student-Family Content Review Panel (July 25–26, 2013)
James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning
Russell Rumberger | University of California, Santa Barbara | School dropouts, ethnic and language minority student achievement
Tama Leventhal | Tufts University | Family context, adolescence, social policy, community and neighborhood indicators
Susan Dauber | Bluestocking Research | School organization, educational transitions, urban education, parent involvement and family processes
Scott Gest | Pennsylvania State University | Social networking, social skills, longitudinal assessment of at-risk populations
Kathryn Wentzel | University of Maryland | Social and academic motivation, self-regulation, school adjustment, peer relationships, teacher-student relationships, family-school linkages
Richard Lerner | Tufts University | Adolescent development and relationships with peers, families, schools, and communities

School Administrator Content Review Panel (August 16, 2013)
Susan Dauber | Bluestocking Research | School organization, educational transitions, urban education, parent involvement and family processes
George Farkas | University of California, Irvine | Schooling equity and human resources
Jeremy Finn | State University of New York at Buffalo | School organization, school dropouts
Edward Nolan | Montgomery County Schools, Maryland | Large urban school system administrator
Tom Loveless | Brookings Institution | Policy, math curriculum

Reading Assessment Content Review Panel (April 14, 2014)
Donna Alvermann | University of Georgia | Adolescent literacy, online literacy, codirector of the National Reading Research Center (funded by the U.S. Department of Education)
Joseph Magliano | Northern Illinois University | Cognitive processes that support comprehension, the nature of memory representations for events depicted in text and film, strategies to detect and help struggling readers
Sheryl Lazarus | University of Minnesota | Education policy issues related to the inclusion of students with disabilities in assessments used for accountability purposes, student participation and accommodations, alternate assessments, technology-enhanced assessments, teacher effectiveness, large-scale assessments, school accountability, research design (including cost analyses), data-driven decision making, rural education, the economics of education

Disabilities Content Review Panel (April 29, 2014)
Jose Blackorby | SRI International | Autism, specific learning disabilities, special education, curriculum design, alternate student assessment, large-scale studies of students with disabilities, codirector of the Special Education Elementary Longitudinal Study (SEELS)
Lynn Fuchs | Vanderbilt University | Specific learning disabilities, student assessment, mathematics curriculum, psychometric models
Mitchell L. Yell | University of South Carolina | Autism, emotional and behavior disorders, specific learning disabilities, pre-K–12 instruction and curriculum, special education, evidence-based intervention
Sheryl Lazarus | University of Minnesota | Special education policy, inclusion of students with disabilities in assessments, accommodations, alternate assessments, technology-enhanced assessments, large-scale assessments, school accountability, research design (including cost analyses)
Martha Thurlow | University of Minnesota | Specific learning disabilities, reading assessment, alternate student assessment, early childhood education, special education, curriculum, large-scale studies
Diane Pedrotty Bryant | University of Texas, Austin | Educational interventions for improving the mathematics and reading performance of students with learning disabilities, the use of assistive technology for individuals with disabilities, interventions for students with learning disabilities and who are at risk for educational difficulties


A.9 Payments or Gifts to Respondents

High levels of school participation are critical to the success of the OFT. School administrator, mathematics teacher, special education teacher, parent, and student data collection activities are contingent on school cooperation. NCES recognizes that the burden level of the study is one of the factors that school administrators will consider when deciding whether to participate. To offset the perceived burden of participation, NCES intends to continue to use strategies that have worked successfully in other major NCES studies (e.g., ECLS-K, ECLS-K:2011, HS&B, NELS:88, and ELS:2002), including offering both monetary and non-monetary incentives. Table 2 summarizes the proposed incentive amount for each instrument and activity along with their estimated administration times; a brief justification for each incentive amount follows table 2.

Table 2. Operational Field Test (OFT) Instruments and Proposed Incentive Amounts

Instrument/Activity | Administration Time* | Field Test Incentives

Student Assessments and Survey (Math, Reading, Executive Function, and Student Survey) | 90 minutes | Choice of, e.g., (1) mechanical pencil, (2) mobile device screen cleaner, (3) suncatcher, or (4) slap bracelet (average value $0.50 each)

Parent Survey | 30 minutes | $0 to $40 (for details, please see the description in the text below)

Mathematics Teacher: Teacher Survey | 20 minutes | $20

Mathematics Teacher: Teacher Student Report | 10 minutes per student | $7 per TSR

Special Education Teacher: Teacher Survey | 10 minutes | $20

Special Education Teacher: Teacher Student Report | 25 minutes per student | $7 per TSR

School Administrator Survey | 20 minutes | No monetary incentive

School Participation: School Coordinator (logistics, on-site visit, consent forms, administrative records, etc.) | 6 hours for consent assistance; 2 hours to schedule assessments; 2 hours to set up web access and coordinate computer labs; 6 hours to provide administrative records | $200, $400, or $400 in materials or services for the school; $150 for the coordinator

*Note that the assessment administration time may be longer for students with disabilities.


Students

In the OFT, we plan a simple experiment on the token of appreciation for student participation. To build goodwill toward the study among students, we will offer each student a choice of one of four items valued at $0.25 to $1 each. These items will not be branded with an MGLS:2017 logo, so that they do not identify the student as having participated in this particular study. NCES has experience providing tokens of appreciation to elementary and high school students but is less familiar with what would be an attractive token for middle grades students. By giving OFT students the choice of one of four items, we can determine which two would likely be most attractive in the national data collection.



Parents

Parent survey response rates have declined over the past decade. The ECLS-K:2011 baseline (fall 2010) parent survey response rate (74 percent)1 was more than 10 percentage points lower than the rate in the corresponding 1998 wave of the ECLS-K (85 percent).2 Additionally, the ninth-grade parent survey response rate for the HSLS:09 baseline was 68 percent.3 The MGLS:2017 parent survey is a key component of the data being collected.

To improve the chances of obtaining higher parent participation rates in a school-based design, we will work with school personnel to recruit sample students’ parents for the MGLS:2017. In the main study, we plan to use a responsive design approach to identify cases for nonresponse follow-up interventions such that the responding sample will be as representative as possible of the population (i.e., sixth graders) and thereby reduce the risk of potential nonresponse bias.

During the OFT, we will test varying incentive amounts at the onset of parent interview collection to determine the optimal baseline incentive offer for the main study. To better understand effective nonresponse follow-up, we will also test increased incentive offers among pending nonrespondents at two different points in data collection. A separately tailored incentive approach will be used for parents of students with disabilities. The parent incentive experimental conditions are shown in table 3. Although in the main study responsive design methods will be employed to select pending nonresponding cases for targeted interventions, in the OFT, where the number of cases will be relatively small, increased incentive amounts will be determined by random assignment. To inform main study procedures, data from the OFT experiment will be analyzed with an approach that simulates responsive design.

Baseline (Phase 1)

In the OFT, parents of students who are not in the disability sample will be randomly assigned to one of three baseline incentive groups (phase 1):

  • Cases in group A (20 percent of the cases) will receive no incentive for the entire data collection period.

  • Cases in group B (40 percent of the cases) will be offered $10 at the start of data collection.

  • Cases in group C (40 percent of the cases) will be offered $20 at the start of data collection.

During phase 1, parents will be asked to complete the questionnaire online; telephone prompting will begin about three weeks after the initial contact is made with the parent.
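The phase 1 allocation just described can be sketched in a few lines of code. This is an illustrative sketch only: the function name, case IDs, and seed are hypothetical, not the study's actual sampling systems.

```python
import random

# Illustrative sketch of the phase 1 random assignment: 20 percent of
# non-disability-sample cases to group A ($0), 40 percent to group B ($10),
# and 40 percent to group C ($20). All names and the seed are hypothetical.
def assign_baseline_groups(case_ids, seed=2017):
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    assignments = {}
    for case_id in case_ids:
        draw = rng.random()
        if draw < 0.20:
            assignments[case_id] = ("A", 0)   # no incentive offer
        elif draw < 0.60:
            assignments[case_id] = ("B", 10)  # $10 baseline offer
        else:
            assignments[case_id] = ("C", 20)  # $20 baseline offer
    return assignments

groups = assign_baseline_groups(range(1000))
```

With per-case independent draws, realized group sizes vary around the 20/40/40 targets; a production system could instead shuffle a fixed-quota list if exact proportions were required.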

Nonresponse Follow-up (Phase 2 and Phase 3)

Approximately one-third of the way through the OFT data collection (phase 2; about 3 weeks after telephone prompting begins), nonresponding cases in groups B and C will be randomly assigned to a treatment group ($10 incentive boost) or a control group (no incentive boost).

Approximately two-thirds of the way through the OFT data collection (phase 3), one more incentive boost experiment will be implemented among remaining nonresponding cases in groups B and C. The treatment group will receive an increase in incentive of whatever amount brings their total incentive to equal $40, while the control group will not receive a boost from the previous offer.

Because the number of cases will not be sufficient to allow for assignment and experimentation based on responsive design modeling, the incentive boosts for pending nonrespondents (for a total offer of up to $40) will instead be determined by random assignment before phases 2 and 3. After the end of data collection, a responsive design simulation will be conducted to assess the potential effectiveness of the nonresponse follow-up interventions. This analytic approach will inform the main study procedures by simulating the responsive design model proposed for the main study; it was employed successfully in the second follow-up field test data collection for the High School Longitudinal Study of 2009 (HSLS:09). The details of the retrospective analysis model plan will be included in the MGLS:2017 OFT Data Collection clearance request package to be submitted to OMB in 2016.

At the end of the data collection period, we will simulate the responsive design process based on the OFT results. We will develop a model to retrospectively identify the nonrespondents most likely to contribute to bias as of the beginning of phase 2, and then compare end-of-phase-2 response rates between those nonrespondents who were offered an additional incentive and those who were not, to examine the degree to which the phase 2 incentive boost increased response among them. We will use the same model to identify the nonrespondents most likely to contribute to bias as of the beginning of phase 3 and make the analogous comparison of end-of-phase-3 response rates.
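Assuming each flagged nonrespondent's record carries a treatment flag and a final response status (the field names below are hypothetical illustration, not study variables), each phase comparison reduces to a simple response-rate contrast:

```python
# Minimal sketch of the treatment-vs-control response-rate comparison made
# at the end of phase 2 and again at the end of phase 3. The dictionary keys
# ("boosted", "responded") are hypothetical, chosen for illustration only.
def response_rates(nonrespondent_cases):
    rates = {}
    for label, arm in (("treatment", True), ("control", False)):
        arm_cases = [c for c in nonrespondent_cases if c["boosted"] is arm]
        # share of the arm that responded by the end of the phase
        rates[label] = sum(c["responded"] for c in arm_cases) / len(arm_cases)
    return rates

# Toy data: 2 boosted cases (1 responded), 3 control cases (1 responded).
example = [
    {"boosted": True,  "responded": True},
    {"boosted": True,  "responded": False},
    {"boosted": False, "responded": True},
    {"boosted": False, "responded": False},
    {"boosted": False, "responded": False},
]
rates = response_rates(example)
```

In the retrospective analysis this contrast would be computed only among the cases the bias model flags, which is what distinguishes the simulation from a plain experimental comparison.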

Parents of Students with Disabilities

It is expected that parents of students with disabilities will be a more challenging group from which to obtain participation than parents whose children do not have a disability. Students with disabilities comprise an analytically critical population for this study. Therefore, we will implement a separate treatment for parents of students with disabilities. We will offer a $20 incentive for this group at the outset of data collection and increase the incentive by $10 (a cumulative offer of $30) one-third of the way through data collection and another $10 (a cumulative offer of $40) two-thirds of the way through data collection. Providing a differential treatment for analytically critical populations was used successfully on HSLS:09 with students who had reportedly ever dropped out of school.

Table 3. OFT Parent Incentive Experimental Conditions

Experiment Group | Phase 1: Baseline Incentive | Phase 2: One-third of the way through data collection | Phase 3: Two-thirds of the way through data collection

A: Non-disability Sample | $0 (no offer) | $0 (no boost) | $0 (no boost)

B: Non-disability Sample | $10 | $10 (no boost) | $10 (no boost) or $40 ($30 boost)

B: Non-disability Sample | $10 | $20 ($10 boost) | $20 (no boost) or $40 ($20 boost)

C: Non-disability Sample | $20 | $20 (no boost) | $20 (no boost) or $40 ($20 boost)

C: Non-disability Sample | $20 | $30 ($10 boost) | $30 (no boost) or $40 ($10 boost)

Disability Sample | $20 | $30 | $40


Teachers

The incentive proposed for students' teachers is $20 per teacher survey, plus $7 per teacher student report (TSR). These amounts are consistent with those used in current NCES studies, such as the ECLS-K:2011. While the mathematics teacher survey is estimated to take longer to complete (20 minutes) than the special education teacher survey (10 minutes), the reverse is true for the individual student reports: approximately 10 minutes per student for mathematics teachers and 25 minutes per student for special education teachers (including 5 minutes for an indirect assessment of the student's skills, the SPERS). We propose to use the same incentive structure for all teachers, regardless of the specific questionnaires they are asked to complete, to protect against any perception of unfairness that might result if teachers within a school discuss the amounts they received for a specific questionnaire.

Schools

As part of the OFT school recruitment, we propose to conduct an incentive experiment in which each school will be randomly assigned to one of three experimental conditions. Given the many demands and outside pressures that schools already face, it is essential that MGLS:2017 staff demonstrate an understanding of the additional burden being placed on school staff when requesting their participation. The study asks schools for many kinds of information and cooperation, including a student roster with basic demographic information (e.g., date of birth, sex, and race/ethnicity); information on students' IEP status, math and special education teachers, and parent contact information; permission for field staff to be in the school for up to a week; space for administering student assessments; permission for students to leave their normal classes for the duration of the assessments; and information about the students' teachers and parents. On average, five students with disabilities in each school will be selected based on disability category, and many will require accommodations and different assessment settings, such as individual administration and smaller group sessions. Working with the data collection contractor to assess these students will place even more of a burden on the participating schools.

One of the key questions for the OFT is whether sufficient numbers of students in the focal disability groups can be selected for the study. Gaining cooperation from schools that have more students in those disability groups will be important for the success of MGLS:2017. Therefore, the school sample of 103 schools will be classified by the number of students within the school in the focal disability groups (autism, emotional disturbance [ED], and specific learning disability [SLD]): (1) "higher" schools, with 17 or more such sixth-grade [or age-based equivalent] students, versus (2) "lower" schools, with fewer than 17 such sixth-grade [or age-based equivalent] students. The sample assumes a 3 percent school ineligibility rate; a sampled school would be deemed ineligible if it has closed or does not contain students in grade 6.

Within the two school types (schools with "higher" or "lower" counts of students in those disability groups), each school will be randomly assigned to one of three experimental conditions. In Condition 1, the baseline condition, we will offer one-third of the sample schools a $200 incentive for participation. This amount is consistent with the amount offered for participation in other NCES studies, such as the ECLS-K, ECLS-K:2011, TIMSS, and the Program for International Student Assessment (PISA). However, based on previous difficulties in recruiting schools for the originally approved MGLS field test recruitment, and the general decline in school participation in NCES longitudinal studies over the years,4 we propose also to test offering one-third of the sample schools $400 (Condition 2), and one-third of schools a choice of a non-monetary incentive equivalent in value to $400 (Condition 3). The list of non-monetary incentive choices is provided in table 4.

Table 4. Non-Monetary Incentive Choices for Schools in Experimental Condition 3

Incentive | Value

Registration for Association for Middle Level Education (AMLE) or Regional Annual Meeting | $400

Two-Year School Membership in AMLE | $400

Membership in Regional ML Organization plus Subscriptions to Professional Journals | $400

Professional Development Webinar | $400

School Supplies | $400

Library of Middle Level Publications | $400

The school incentive experiment, with the same three experimental conditions [(1) $200 or (2) $400 or (3) non-monetary incentive equivalent to $400], will also be used during the MGLS:2017 IVFT, which will be conducted in January through June 2016. The IVFT has a larger sample (250 schools) and does not subdivide the schools by number of students in disability groups of interest.

Recruitment of schools for the OFT will begin in January 2016 and continue until approximately April 2017. Recruitment for the IVFT is currently ongoing. Results from the IVFT recruitment effort will be reviewed in spring 2016 and may inform recruitment efforts for OFT sample schools that by that point have not yet agreed to participate in the OFT. Administrators at nonparticipating IVFT schools will be asked to complete a brief questionnaire about their reasons for not participating in the study and what, if anything, about the study plan and/or incentives would have changed the school's decision. A change memo to the OFT recruitment plan may be submitted in 2016 if we learn from the IVFT debriefing questionnaire that a particular incentive may be effective for gaining school cooperation in the study. The debriefing questionnaire will also be used in the OFT to inform the main study. As with the IVFT, the debriefing questionnaire (appendix W) will include the following questions for nonparticipating schools:

  • For what reasons did your school decide not to participate in MGLS:2017?

  • Your school was offered [incentive] to participate in the study. Would a different incentive have changed your mind about participating? In other words, would you have agreed to let your school participate in MGLS:2017 if the incentive had been a different amount or type?

  • [If yes] How much or what would the incentive need to be for your school to decide to participate? [record comments]

  • Is there anything else, not mentioned above, we could have done to enable your school to participate?

The results of the school incentive experiment in the OFT will be combined with the results of the experiment in the IVFT. With the combined 350 schools across the two field test samples, we will be able to observe the potential impact of the different incentive treatments to inform school recruitment strategies for the national study.

School Coordinators

School coordinators will be offered a $150 monetary incentive. They play an especially important role in the study and are critical to its success. The coordinator in each participating school will coordinate logistics with the data collection contractor; compile and supply to the data collection contractor a list of eligible students for sampling; communicate with teachers, students, and parents about the study to encourage their participation; distribute and collect parental consent forms; and assist the test administrator in ensuring that the sampled students attend the testing sessions.

A.10 Assurance of Confidentiality

NCES is authorized to conduct this study under the Education Sciences Reform Act of 2002 (20 U.S. Code, Section 9543). By law, the data provided by schools, staff, parents, and students may be used only for statistical purposes and may not be disclosed or used in identifiable form for any other purpose except as required by law (20 U.S. Code, Section 9573). The laws pertaining to the collection and use of personally identifiable information will be clearly communicated in correspondence with states, districts, schools, teachers, students, and parents. Letters and informational materials will be sent to parents and school administrators describing the study, its voluntary nature, and the extent to which respondents and their responses will be kept confidential. A list of middle grade students with IEPs will be requested from school districts and/or schools under a FERPA exception (34 CFR Part 99.31). This information will be used for sampling purposes only and will be securely destroyed once student samples are drawn.

The confidentiality plan developed for the MGLS:2017 requires that all contractor and subcontractor personnel and field workers who will have access to individual identifiers sign confidentiality agreements and notarized nondisclosure affidavits. The plan also requires that all personnel receive training regarding the meaning of confidentiality, particularly as it relates to handling requests for information and providing assurance to respondents about the protection of their responses. NCES understands the legal and ethical need to protect the privacy of the MGLS:2017 respondents and has extensive experience in developing data files for release that meet the government’s requirements to protect individually identifiable data from disclosure. The data files, accompanying software, and documentation will be delivered to NCES by the data collection contractor at the end of the project. Neither names nor addresses will be included in any data file.

A.11 Sensitive Questions

The recruitment effort does not involve gathering information considered to be of a sensitive nature. A list of middle grade students with IEPs will be requested from school districts, or schools where appropriate, under FERPA exception (34 CFR Part 99.31). This information will be used for sampling purposes only and will be destroyed once student samples are drawn. All district and school personnel facilitating the conduct of the study and developing the sampling frame will be informed of the privacy and confidentiality protocols required for the study, including those having to do with the sample lists of schools and students.

A.12 Estimates of Burden

Table 5 shows the expected burden for districts, schools, and parents during the OFT recruitment activities. For the OFT, we anticipate collecting data within 50 schools, from approximately 1,120 participating students, their parents, and their teachers. Table 6 shows the approved IVFT burden being carried over.

As shown in Part B, we anticipate contacting approximately 103 schools (with an estimated 97 percent school-eligibility rate and 50 percent participation rate) to reach the approximately 50 schools needed for participation, and contacting the parents of approximately 1,750 students to yield approximately 1,120 participating students. In order to draw samples of students with disabilities, we may need to obtain student records information from up to four districts.
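The yields above follow from straightforward multiplication; rounding to whole schools and students reproduces the stated targets (a quick arithmetic check, not part of the clearance calculations):

```python
# Arithmetic behind the OFT recruitment targets stated above.
schools_sampled = 103
eligible_schools = round(schools_sampled * 0.97)        # 3% ineligible -> ~100
participating_schools = round(eligible_schools * 0.50)  # 50% participate -> 50

parents_contacted = 1750
participating_students = round(parents_contacted * 0.64)  # 64% consent -> 1,120
```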

We estimate that it will take 20 minutes on average for school and district administrators to review the materials and either agree or decline to participate, and also for all schools and districts to debrief on reasons why schools or districts chose to participate or not to participate in the OFT. The debrief may be completed via phone or email and will ask school administrators what factors led to the decision to participate or not to participate in MGLS:2017. For example, the debrief may ask whether content, incentives, time, and/or other factors impacted the decision. The information will be used to inform recruitment efforts for the main study. For those participating, we estimate an additional 4 hours for the provision of student rosters, including information about students for sampling, contact information for their parents, and their math and special education teachers (see Appendices S and T). For students’ parents, we estimate that it will take up to 10 minutes to review the recruitment materials and either consent or refuse to participate (on behalf of their student and themselves). The provision of student rosters and the parents’ consent forms will serve as sources for parents’ contact information, which during the data collection period can be used for nonresponse follow-up.

Table 5. Operational Field Test (OFT) Recruitment Burden Estimates for Schools and Parents

Recruitment | Sample Size | Response Rate | Number of respondents and responses | Average burden time (minutes) | Total burden (hours) | Respondent average hourly wage** | Estimate of respondent burden time cost

Nonparticipating districts | 12 | 67% | 8 | 20 | 3 | $44.13 | $132

Participating districts | | 33% | 4 | 260 | 17 | $44.13 | $750

Nonparticipating eligible schools | 103* | 50%* | 50 | 20 | 17 | $44.13 | $750

Participating schools | | 50% | 50 | 260 | 217 | $44.13 | $9,576

Students' parents | 1,750 | 64% | 1,120 | 10 | 187 | $22.71 | $4,247

Total | - | - | 1,232 | - | 441 | - | $15,455

* The OFT will start with a sample of 103 schools, though it is estimated that three percent of schools will not be eligible for the study. The response rate is based on the eligible sample of 100 schools.

**The average hourly earnings of parents in the 2014 National Compensation Survey sponsored by the Bureau of Labor Statistics (BLS) is $22.71, and of education administrators is $44.13. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/; occupation codes: All employees (00-0000) and Education Administrators (11-9032); accessed on June 18, 2015.
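The burden-hour and cost figures in table 5 follow a single formula: respondents multiplied by minutes, divided by 60 and rounded to whole hours, then multiplied by the hourly wage. A quick check on two rows (an illustrative sketch, not part of the clearance calculations; the helper name is hypothetical):

```python
# Reproduce two table 5 rows: hours = respondents x minutes / 60 (rounded),
# cost = hours x average hourly wage (rounded to whole dollars).
def burden(respondents, minutes_each, hourly_wage):
    hours = round(respondents * minutes_each / 60)
    return hours, round(hours * hourly_wage)

schools = burden(50, 260, 44.13)    # "Participating schools" row
parents = burden(1120, 10, 22.71)   # "Students' parents" row
```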



Table 6. IVFT Burden Estimates1

Item Validation Field Test (IVFT) | Sample Size | Expected Response Rate | Number of Respondents | Number of Responses | Average burden time (minutes) | Total burden (hours) | Respondent average hourly wage2 | Estimate of respondent burden time cost

Students and Parents

Student Survey | 6,172 | 64% | 3,950 | 3,950 | 20 | 1,317 | $7.25 | $9,548

Student Assessment3 | 6,172 | 64% | 3,950 | 3,950 | 70 | 4,608 | - | -

Students' parents | 6,1725 | 64% | 3,9504 | 3,950 | 30 | 1,975 | $22.71 | $44,852

Students' math teachers

Teacher-level, teacher characteristics | 522 | 92% | 480 | 480 | 13 | 104 | $27.70 | $2,881

Teacher-level, classroom characteristics | 522* | 92% | 480* | 480 | 7 | 56 | $27.70 | $1,551

Teacher report on student | 522* | 92% | 480* | 3,950 | 10 | 658 | $27.70 | $18,227

Students' special education teachers

Teacher-level survey | 174 | 92% | 160 | 160 | 10 | 27 | $28.65 | $774

Teacher report on student | 174* | 92% | 160* | 552 | 25 | 230 | $28.65 | $6,590

School administrators and coordinators

Students' school administrators | 58 | 99% | 57 | 57 | 20 | 19 | $44.13 | $838

School coordinator | 58 | 100% | 58 | 58 | 720 | 696 | $26.94 | $18,750

TOTAL for data collection activities | - | | 4,705 | 13,637 | - | 5,082 | - | $104,011

Approved Total for recruitment4 | - | | 6,454 | 6,454 | - | 1,424 | - | $40,800

Total for all IVFT activities | - | - | 11,159 | 20,091 | - | 6,506 | - | $144,811

1 Because the MGLS:2017 IVFT recruitment and data collection will still be ongoing at the time this request is approved, the burden and materials from the MGLS:2017 Recruitment for 2016 IVFT request (OMB# 1850-0911 v.3, 5, and 7) and from the MGLS:2017 IVFT Data Collection (OMB# 1850-0911 v.4) are being carried over in this submission. Table 6 shows the approved IVFT burden being carried over.

2 The average hourly earnings of parents in the 2014 National Compensation Survey sponsored by the Bureau of Labor Statistics (BLS) is $22.71, of middle school teachers is $27.70, of middle school special education teachers is $28.65, of education administrators is $44.13, and of educational guidance counselors is $26.94. If mean hourly wage was not provided, it was computed assuming 2,080 hours per year. The exception is the student wage, which is based on the federal minimum wage. Source: BLS Occupational Employment Statistics, http://data.bls.gov/oes/; occupation codes: All employees (00-0000); Middle school teachers (25-2022); Middle school special education teachers (25-2053); Education Administrators (11-9032); and Educational guidance counselors (21-1012); accessed on June 18, 2015.

3 Burden associated with student assessments is included here for informational purposes. It is not included in the total burden calculations because, unlike the other burden presented here, it is not subject to the Paperwork Reduction Act (PRA).

4 Recruitment activities for the IVFT will not be completed at the time this request will be approved, and thus the approved burden affiliated with the IVFT recruitment is being carried over and is included in the total requested in this submission.

5 The number of parent respondents is already included in the recruitment number of respondents.

* The same respondent group as above, not double counted in the total number of respondents.

- Not applicable.


The total burden requested in this submission is a sum of burden estimates for OFT Recruitment and for IVFT Recruitment and Data Collection (table 7).

Table 7. Total Burden Estimates for OFT Recruitment and IVFT Recruitment and Data Collection

Data Collection | Number of Respondents | Number of Responses | Total burden (hours) | Estimate of respondent burden time cost

IVFT recruitment and data collection (carried over) | 11,159 | 20,091 | 6,506 | $144,811

OFT recruitment | 1,232 | 1,232 | 441 | $15,455

Total requested | 12,391 | 21,323 | 6,947 | $160,266


A.13 Total Annual Cost Burden

There are no respondent costs other than the cost associated with response time burden.



A.14 Annualized Cost to Federal Government

The estimated cost to the federal government for contractor and subcontractor work to conduct all aspects of the OFT is $2,815,487.

A.15 Program Changes or Adjustments

The apparent increase in burden from the last approved package is due to the fact that this request includes burden for MGLS:2017 OFT recruitment and the carried over burden for MGLS:2017 IVFT recruitment and data collection activities, while the previous approval was only for MGLS:2017 IVFT recruitment and data collection.

A.16 Plans for Tabulation and Publication

The results from the OFT will be presented in a report released approximately 6 months after the completion of the field test.

Table 8. Schedule for Operational Field Test (OFT)

Activity | Start date | End date

Recruitment of schools and districts | January 2016 | March 2017

Recruitment of students and parents through requesting parent consent from parents | January 2017 | May 2017

OFT Data Collection | January 2017 | June 2017

Field Test Report | June 2017 | December 2017


A.17 Display OMB Expiration Date

The OMB expiration date will be displayed on all recruitment materials.

A.18 Exceptions to Certification Statement

No exceptions to the certification statement are requested or required.

1 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2012). Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), User’s Manual for the ECLS-K:2011 Kindergarten Data File and Electronic Codebook (NCES 2013-061). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

2 Tourangeau, K., Nord, C., Lê, T., Sorongon, A.G., Hagedorn, M.C., Daly, P., and Najarian, M. (2001). Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), User’s Manual for the ECLS-K Base Year Public-Use Data Files and Electronic Codebook (NCES 2001-029). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

3 Ingels, S.J., Pratt, D.J., Herget, D.R., Burns, L.J., Dever, J.A., Ottem, R., Rogers, J.E., Jin, Y., and Leinwand, S. (2011). High School Longitudinal Study of 2009 (HSLS:09). Base-Year Data File Documentation (NCES 2011-328). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

4 For example, in 1998–99, the Early Childhood Longitudinal Study had a weighted school-level response rate of 74 percent, whereas 12 years later, the complementary ECLS-K:2011 study had a weighted school-level response rate of 63 percent.


