Volume 1 MGLS 2017 Cog Labs 2014 Reading & Disability

NCES Cognitive, Pilot, and Field Test Studies System

OMB: 1850-0803


National Center for Education Statistics


Middle Grades Longitudinal Study of 2016–17 (MGLS:2017) Cognitive Interviews

Set 2


Volume 1

Supporting Statement





OMB# 1850-0803 v. 103








May 12, 2014

Justification

The Middle Grades Longitudinal Study of 2016–17 (MGLS:2017) will be the first study sponsored by the National Center for Education Statistics (NCES), part of the Institute of Education Sciences (IES) in the U.S. Department of Education (ED), to follow a nationally representative sample of students as they enter and move through the middle grades (grades 6, 7, and 8). The purpose of this submission is to conduct cognitive interviews to refine the assessments and questionnaires for the MGLS:2017 field test—specifically, the newly added reading assessment and the new content added to the parent survey to address topics specific to students with disabilities. Prior to the cognitive interviews, all items were reviewed by topical experts. The primary goal of the cognitive laboratory work is to ensure that the reading assessment, the new parent survey content, and the accommodations are appropriate for the middle grades population and, in particular, for students with disabilities. We aim to gain a clear understanding of participants’ comprehension, retrieval, judgment, and response strategies when asked selected items, and to use this information to improve item wording and format. We also aim to determine whether appropriate accommodations are provided for the student instruments.

Study Description

The data collected for MGLS:2017, through repeated measures of key constructs, will provide a rich descriptive picture of the experiences and lives of all students during these critical school years and will allow researchers to examine associations between contextual factors and student outcomes. Because mathematics and literacy skills are important for preparing students for high school and because they are associated with later education and career opportunities, the study is placing a focus on instruction and student growth in these areas. The study’s emphasis on inclusiveness involves oversampling students who represent several categories of the Individuals with Disabilities Education Act (IDEA). A key goal of the study is to better understand the supports that students need for academic success, high school readiness, and positive life outcomes, including high school graduation, college and career readiness, and healthy lifestyles. The study will track the progress students make in reading and mathematics and their developmental trajectories as they transition from elementary school to middle school and from middle school to high school. The study will also identify factors in their schools, classrooms, homes, and communities that may help explain differences in achievement and development and may contribute to their academic successes, as well as other outcomes both during the middle grades and beyond. Baseline data will be collected from a nationally representative sample of approximately 15,000 to 20,000 sixth graders in the spring of 2017, with annual follow-ups in spring 2018 and spring 2019, when most of the students in the sample will be in grades 7 and 8, respectively. The national sample will yield between 1,800 and 2,400 students with disabilities, assuming that 12 percent of sixth-grade students have one or more types of disabilities. The sample will be designed to yield at least 475 students in each of at least three disability categories (autism, emotional disturbance, and specific learning disability)1 in order to provide researchers with adequate power to analyze results for those groups. This will require oversampling students in the autism and emotional disturbance categories because of their low incidence in the overall student population (each group represents approximately one percent of all students).
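The yield arithmetic above can be illustrated with a short calculation. This is a hedged sketch: the 12 percent and 1 percent figures are the approximate incidence rates cited in the text, and the function name is ours for illustration, not part of the study design.

```python
# Illustrative sketch of the sample-yield arithmetic described above.
# Incidence rates are the approximate figures cited in the text.

def expected_yield(total_sample: int, incidence_pct: int) -> int:
    """Expected number of sampled students in a subgroup, given its incidence."""
    return total_sample * incidence_pct // 100

# About 12 percent of sixth graders are assumed to have a disability, so a
# 15,000- to 20,000-student sample yields roughly 1,800 to 2,400 such students.
print(expected_yield(15_000, 12), expected_yield(20_000, 12))  # 1800 2400

# Autism and emotional disturbance each occur in roughly 1 percent of students,
# so even the larger sample yields only about 200 students per category
# without oversampling -- well short of the 475-student analytic target.
print(expected_yield(20_000, 1))  # 200
```

The gap between the roughly 200 students expected per low-incidence category and the 475-student target is what motivates the oversampling described above.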

At each wave of data collection in the national study, students’ reading achievement, mathematics achievement, and executive function will be assessed. Students will also complete a survey that asks about their engagement in their school, out-of-school experiences, their peer group, and their identity and socioemotional development. Parents will be interviewed about their background, family resources, and parental involvement. Teachers will complete a two-part survey that asks about their background and classroom instruction, and then asks them to report on each student’s academic behaviors, mathematics performance, and conduct. For students with disabilities, we will also ask their special education teacher to complete a two-part survey. First, these teachers will be asked about their background and their role at the school. Next, they will be asked questions on topics such as the sampled students’ primary disability, special education and related services, and the students’ instructional environment. School administrators will be asked to report on school supports and services, as well as school climate. Student information will be abstracted from school records, and field staff will complete an observation checklist on school facilities and resources.

The study design requires a set of longitudinal and complementary instruments across multiple participants that provides information on the outcomes, experiences, and perspectives of students across grades 6, 7, and 8; their families and home lives; their teachers, classrooms, and instruction; and the school settings, supports, and available services. NCES has contracted with Decision Information Resources, Inc. (DIR) and its subcontractors, Mathematica Policy Research (Mathematica) and Educational Testing Service (ETS), to design and field test the instruments and procedures for MGLS:2017.

Members of the MGLS:2017 design project are developing a set of direct student assessments and questionnaires for students, parents, teachers, and school administrators, as well as developing and evaluating procedures for selecting the study sample. The project team will gain the cooperation of schools, teachers, parents, and students, and will administer and evaluate the study instruments. The design project has three phases of activity. In phase 1 (Instrument Development), the design team worked with NCES to identify and develop measures for the study’s key constructs. A critical component of this activity was soliciting and incorporating input from content and technical experts. The products from phase 1 were (1) a large mathematics item pool; (2) a draft set of executive function assessments; (3) a draft reading assessment; (4) drafts of student, parent, teacher, and school administrator questionnaires; and (5) a draft facilities checklist. Phase 2 (Testing and Evaluation) began with cognitive laboratories for each study instrument; it includes the second round of cognitive testing focused on the newly added reading assessment and the new content added to the parent survey to address topics specific to students with disabilities (the subject of this OMB package), and will conclude with a large field test in spring 2015 to obtain psychometric information. The field test will be conducted in 50 schools, with more than 4,100 students in grades 5 through 8, as well as 450 mathematics teachers, 150 special education teachers, and more than 800 parents. Phase 3 (Analysis and Revision) will use psychometrics to identify items, scales, and measures for the mathematics, reading, and executive function assessments, along with the nonassessment instruments for the national study. The procedures used in the field test to select and recruit participants, work with school personnel, and administer each of the instruments will be scrutinized, and recommendations for the national study will be finalized.

This request for clearance is submitted under the NCES generic clearance agreement (OMB #1850-0803) to conduct cognitive laboratory activities to inform the design of the direct assessment and nonassessment instruments being developed for MGLS:2017. The cognitive interviews will be conducted by DIR and Mathematica.

An earlier request to conduct cognitive laboratory activities for the parent questionnaire, the mathematics teacher questionnaire, the mathematics teacher student report, the school administrator questionnaire, the student mathematics assessment, the student executive function assessment, and the student questionnaire was approved on March 18, 2014 (OMB# 1850-0803). A request to field-test the instruments in February 2015 will be submitted to OMB at a later date. The MGLS:2017 collections are authorized by law under the Education Sciences Reform Act of 2002 (ESRA), 20 U.S.C. § 9543.

The purpose of this request is two-fold: 1) to conduct cognitive laboratory activities for the student reading assessment, and 2) to conduct cognitive laboratory activities to test newly developed questionnaire content related to topics of interest for students with disabilities and their families. This second purpose, which builds on earlier cognitive laboratory work, involves conducting interviews with parents of students with disabilities and also testing our current content in the student mathematics assessment, the student executive function assessment, and the student questionnaire with students from each of the three disability categories.

Purpose of the Cognitive Laboratory Work

Cognitive laboratory work will be conducted to ensure that clear instructions and well-designed items are used in the following instruments: the Student Mathematics Assessment, the Student Reading Assessment, the Student Executive Function Assessment, the Student Questionnaire, and the Parent Questionnaire. This work involves students and parents participating in one-on-one interviews. Interviewers will observe all students as they complete their assessments and questionnaire, watching for signs of frustration or discomfort and noting their comprehension of the task at hand. Staff conducting the cognitive laboratories will be trained to be particularly attuned to the needs of students with disabilities, using interviewing practices that take careful note of student reactions in completing the tasks. Furthermore, we want to use the cognitive laboratories to observe the reactions of students, particularly students with disabilities, to the items. This will provide information on whether we need to adapt procedures to better support these students in completing the assessments and questionnaire. Next, we provide the criteria used to select the items for cognitive laboratory testing, followed by the cognitive laboratory testing goals. We present this first for the reading assessment work, followed by the disability content work.

Reading assessments. For the reading assessment cognitive laboratory work, several criteria were used to select items:

  • Timing of the skill block;

  • Platform for delivery of the assessment;

  • Experience using computer interface;

  • Understanding of the task instructions; and

  • Items requiring timing information.

Although many of the items in the MGLS:2017 reading assessment will be drawn from assessments developed by ETS through the Reading for Understanding initiative, the item pool development process identified some issues relevant for the purposes of this study. For some domains of reading, we identified relevant existing items that had not been used with the target population. For example, some reading assessment items had been previously administered to elementary or high school students, but not specifically to middle grade students. In other instances, we identified items from existing instruments that fit within one of the domains; however, the study samples on which they had been used were not as ethnically, geographically, and socioeconomically diverse as the MGLS:2017 sample will be. Further, although all items in the reading assessment have been administered to large, diverse samples, some of the combinations of items that will be included in MGLS:2017 have not been tested. Thus, the timing of these new combinations of items will be investigated. Next, we highlight particular goals for the cognitive laboratory work to be conducted for the reading assessment.

An objective of the cognitive laboratory work for the reading assessment is to acquire good timing estimates for the new combinations of items. All items taken from the Study Aid and Reading Assistant (SARA) and the Global, Integrated Scenario-Based Assessment (GISA) have been extensively tested in large, diverse samples. However, combining items into a first-stage routing block based on SARA items and second-stage, skill-based blocks based on either additional SARA items or GISA items is novel to MGLS:2017. Thus, it is important to get good timing estimates prior to the field test. All SARA and GISA items to be used in the cognitive laboratories and the field test will be selected based on item properties, such as difficulty and discriminability information known from previous large-scale administrations of these assessments.

Students in the cognitive laboratory for the reading assessment items will complete the routing block and one of the skill-based blocks. They will complete these blocks online on a computer to capture accurate timing information of single items as well as entire blocks. Once they have completed the blocks, they will complete an interview to ensure that the items selected have clear instructions. More details on the cognitive laboratory activities for the reading assessment are included in subsequent sections.

For all students, including students with disabilities, we are also interested in looking at the readability as well as comprehensibility of the items. We will be observing the students’ behavior and looking for responses that may suggest stress, frustration, or confusion. We will follow up with them to better understand items that may be problematic. We will also gain timing information on test items and learn about the types of accommodations that may be required in the field test.

Disability content. For the disability content cognitive laboratory work, new content will be added to the questionnaire for parents of students with disabilities. Several criteria were used to select items:

  • Items from prior studies that have been modified;

  • Items that have not been used on diverse samples;

  • New topic areas requiring exploration to determine the level of item development needed;

  • Complexity of language; and

  • Items requiring timing information.

The new content for parents of students with disabilities consists of items that have previously been used to collect data on middle school students with the same types of disabilities on which MGLS:2017 focuses. All items selected for cognitive laboratory work will be tested with a diverse group of parents to ensure that they are comprehensible and carry the same meaning across a range of participants. By testing draft items and getting feedback about topic areas, we will be able to refine the items for the field test, where they will undergo further evaluation using larger and even more diverse samples.

In addition to this new content, we also seek to conduct cognitive laboratories on the existing mathematics assessment, executive function assessment, and student questionnaire with students with disabilities. Therefore, the following text describes the cognitive laboratory focus for students with disabilities.

  • Student Mathematics Assessment. The goal for testing students with disabilities on this assessment is to ensure readability and understanding of items. An additional goal is to learn more about potential specific accommodations needed by these students, which will inform our field test procedures. As with the cognitive laboratory work that is being conducted on this assessment with typically developing students, these assessment items will undergo two types of cognitive laboratory work with students who are diagnosed with autism, emotional disturbance, or a specific learning disability. First, students will complete a subset of assessment items and be interviewed one-on-one. The cognitive interview will help ensure that the items selected have instructions that are clearly articulated, that the mathematical task that the item is intended to inform is understandable, and that the format in which the item is presented does not create any unintended challenges. Second, for students with disabilities, we will be particularly interested in looking at the readability as well as comprehensibility of the questions. Many of the items are application items and include text designed to be readable at a third-grade level (with the exception of some mathematics words). By better understanding the literacy level of the items, we will be better able to gauge the potential confound between the measurement of mathematics and the reading ability of the students, which may be of particular importance for students with disabilities.

  • Student Executive Function Assessment. The primary goal of cognitive laboratory work with students in the three disability groups for the executive function tasks is similar to the goals for typically developing students: to ensure that the selected tasks and directions are clear and comprehensible to middle grade students across diverse backgrounds and literacy levels. We will also collect information on the timing of the task administration. Specific to students with disabilities, we will carefully observe the students as they participate, looking for signs of frustration or confusion. We will also learn about the accommodations that are needed by these students to better inform and prepare for our field test.

  • Student Questionnaire. We intend to test the student questionnaire with up to 15 students with disabilities. The cognitive interviews will help to inform us about how students with disabilities fare regarding the understanding of the questions and the response options. Students with disabilities will receive the same student questionnaire as students without disabilities, but the cognitive interviews will provide an important opportunity to closely observe both groups of students for confusion or frustration with the items. We additionally plan to capture timing estimates to determine if students with disabilities will require more time to complete the questionnaire than their peers without disabilities. Finally, the cognitive interviews will provide the chance to learn about the types of accommodations that could be needed in order for certain students to complete the questionnaire.

  • Parent Questionnaire. For the cognitive laboratory work involving the parent questionnaire, in addition to items that will be asked of all parents, we will test newly added content that is specific to parents of students with disabilities. These items ask respondents about the types of services their children receive inside and outside of school, as well as their satisfaction with these services. During the cognitive interviews, we will focus on item wording (across cultural or linguistic groups), gaining information on specific services that are not currently included in the MGLS list of services, and whether these parents appear sensitive about responding to questions on their child’s IEP and services. Together, these topics will help to inform the final item selection and wording.

Design

The cognitive laboratory work for the MGLS:2017 reading assessment instrument and the cognitive work to be conducted with students with disabilities and their parents consist of different data collection activities. Table 3 presents a summary of the different cognitive laboratory activities we are proposing to conduct, as described in this package, and the estimated respondent burden associated with them.

All cognitive interview activities will be conducted in English by researchers from Mathematica and DIR, who have been involved in the development of the item pool and the cognitive laboratory protocols. Staff involved in the cognitive interviews will participate in a specialized training prior to the commencement of the cognitive laboratory work. Additionally, all staff leading the work have extensive experience using cognitive interviewing techniques for federal studies. Training sessions will be given by these project leads via either webinars or in-person sessions lasting up to four hours. The training will include a step-by-step review of the protocols and a review of specific observation and probing techniques. There will be a special session on working with students with various disabilities, which will include information on what to expect and how to handle situations that arise, as well as using the observation form to watch for students’ reactions to the tasks. Training will involve interactive activities to check for understanding, as well as a review of the survey and study materials that will be used as part of the cognitive lab process to ensure that all staff are familiar and comfortable with them. The staff conducting the reading cognitive interviews will include four interviewers (one project leader and three junior members of the project team). The staff conducting the cognitive interviewing of students with disabilities, as well as of the new content added specifically about these students to the parent, school administrator, and teacher questionnaires, will include five interviewers (one project leader and four junior members of the project team). Note that, in addition to the training mentioned above, cognitive interviewers working specifically with students with disabilities have been identified based on their prior experience working with this study population and their families.

The outcomes of the cognitive interviews will be reviewed on a continuous basis, revisions to the items will be made as necessary, and revised items will be used with subsequent participants. Deciding beforehand how many revisions will be needed is problematic, but we assume that at least one round of revisions will be needed after we have completed about one-third of the scheduled cognitive interviews. To monitor the progress of the cognitive interviews and efficiently incorporate feedback being received in the field, the design team will hold weekly debriefings during the cognitive laboratory work. The following is a description of the data collection methods for each instrument.

Student Mathematics Assessment. We will work individually with a total of up to 20 students, focusing on the three primary disability categories (autism, emotional disturbance, and specific learning disability) but also recruiting students with other types of disabilities that we expect will come into the sample during the field test and national study, for example, students with speech or language impairments and students with ADD/ADHD. Each cognitive laboratory will take approximately 60 minutes, and students will test items in the Mathematics Assessment using paper-and-pencil forms. In many cases, assessments will be conducted at one of the Mathematica corporate offices (located in Chicago; Princeton, NJ; Washington, DC; Oakland, CA; Cambridge, MA; and Ann Arbor, MI). When travel time is greater than 30 minutes, we will identify a community space in which to meet (such as a private room at a local library) and will travel to the student.

During the cognitive interview, students will complete one block of 15 to 20 test items, using paper and pencil, in the presence of a researcher, and will then be interviewed following a standard protocol (Volume 3). The use of paper and pencil will allow us to instruct students to circle or highlight words that they find difficult to understand as they work through the series of items. This also allows us to capture feedback from the students without interrupting their thought processes as they complete the assessment. Although the field test will be computerized, the cognitive aspects of the items will be the same. We will closely observe students’ verbal and nonverbal behaviors as they work through the tasks, and note any accommodations that they require. Based on parental responses to our screening questions, we will know what to expect in terms of accommodations that the student will need. While some accommodations may not alter our approach (for example, if the student will be wearing a hearing aid), others will, such as a student requiring frequent breaks. After students have completed the items, we will ask them whether the questions and instructions are clear, how familiar the tasks are, and whether they understand the questions (even if they do not know how to answer them). In particular, we will probe students about word problems with multiple sentences by asking them (1) what they think the item is asking and (2) what information they think the item provides. We will also pay close attention to students’ understanding of varied item presentations (for example, students’ ability to recognize fractions, equations, and graphs in multiple formats). Information from these interviews will help evaluate the language and literacy demands of the items, the ease of understanding the questions, and any ambiguities in the questions or response options. We will also be looking at the need for multiple breaks during the assessment and the average time per item for these students so that we can plan for appropriate staffing during the field test.

With the permission of students and their parents, interviews will be recorded for note-taking purposes. A digital recorder placed on the table between the student and the interviewer will be used to record the audio. During the interview, the interviewer will note pertinent aspects of the interview process, such as the student’s level of motivation and any special circumstances that might affect the interview. As needed, recordings will be analyzed afterward—for example, if it is determined that additional information is required beyond what was captured in the protocol or if additional members of the development team need to hear a student’s tone and complete response to a question. We will keep the paper copies of the students’ assessments to review for any indications of confusing sentences or words.

Student Reading Assessment. The cognitive laboratory work for the reading assessment will be conducted individually with 20 students total: 10 typically developing students and 10 students with disabilities. For students with disabilities, we hope to recruit students from the three groups that will be oversampled in the MGLS:2017 field test: (1) autism, (2) emotional disturbance, and (3) specific learning disability. Students will complete the reading assessment online at one of the Mathematica corporate offices. When the travel time is greater than 30 minutes, we will identify a community space in which to meet (such as a private room at a local library) and will travel to the student. The cognitive laboratory will take approximately 60 minutes for each student.

To begin, students will complete a first-stage routing block, which will take approximately 10 minutes. Based on their performance on the routing block, students will then be assigned to a second-stage, skill-based block designed to take approximately 20 minutes. While the students are completing both blocks, they will be observed by the interviewer. The interviewer will note aspects of the students’ interactions with the online assessment. For example, we will ask the interviewer to note whether students can navigate through the screens accurately, whether they scroll to view a passage on items where scrolling is possible, whether they pause on direction screens or skip them, and so on. The interviewer will also provide information on student demeanor, such as level of motivation or indications of stress or confusion, and provide comments on any special circumstances that might affect the interview.

After the students have completed each section of the online portion of the cognitive laboratory (i.e., the routing block, Block 1, and one of the second-stage, skill-based blocks, either Block 2a or Block 2b), an interview will be conducted with the student. During the interview, the interviewer will ask the students whether the questions and instructions were clear, how familiar the tasks were, and whether they understood the questions being asked (even if they did not know how to answer them). Students will be provided with printed copies of the online materials in case they would like to discuss some items specifically. The interviews will be recorded for note-taking purposes with the permission of the students and their parents. A digital recorder placed on the table between the students and the interviewer will be used to record the audio.
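The two-stage routing described above can be sketched in a few lines. This is a hypothetical illustration only: the cut score, the number of routing items, and the rule mapping routing performance to Block 2a versus Block 2b are assumptions made for exposition, not the study’s actual routing algorithm.

```python
# Hypothetical sketch of two-stage adaptive routing: score the first-stage
# routing block (Block 1), then assign a second-stage, skill-based block.
# The 50-percent cut score and the block mapping are illustrative assumptions.

ROUTING_CUT = 0.5  # assumed proportion-correct threshold

def assign_second_stage_block(item_scores: list[int]) -> str:
    """Return the skill-based block for a student's Block 1 item scores (0/1)."""
    proportion_correct = sum(item_scores) / len(item_scores)
    # Assumed mapping: lower performers receive Block 2a, higher performers 2b.
    return "Block 2a" if proportion_correct < ROUTING_CUT else "Block 2b"

# Example: a student who answers 7 of 10 routing items correctly
print(assign_second_stage_block([1, 1, 1, 0, 1, 1, 0, 1, 1, 0]))  # Block 2b
```

In the actual study, the routing decision would rest on the item difficulty and discrimination estimates from prior SARA administrations rather than a simple proportion-correct cut.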

Student Executive Function Assessment. The cognitive interviews for the executive function assessment proposed in this package will be conducted with up to 20 students across the three disability categories of interest, as well as with students with other types of disabilities that we expect will come into the sample during the field test and national study, for example, students with speech or language impairments and students with ADD/ADHD. The MGLS:2017 executive function tasks will measure inhibitory control, cognitive flexibility, and working memory. We will program four tasks on a laptop or tablet for the students to complete. As with the mathematics and reading assessment cognitive interviews, we expect most students will travel to Mathematica corporate offices (located in Chicago; Princeton, NJ; Washington, DC; Oakland, CA; Cambridge, MA; and Ann Arbor, MI) to complete the executive function cognitive laboratory activity.

After explaining the purpose of the activities and the student’s participation, we will provide the executive function assessment to the student and observe him or her using a standard protocol (Volume 3). These assessments will have directions and practice items at the beginning that will be written as well as recorded for the student to hear through headphones. With the parent’s and student’s permission, video from the interviews will be recorded for note-taking purposes. An unobtrusive camera, likely placed on the laptop and focused on the student, will be used. Cognitive lab interviewers will be trained to set up the camera appropriately, record the assessment, and review the recording to supplement their notes on student performance. As with the audio recordings, the video recordings will be analyzed afterward if it is determined that additional information is required beyond what was captured in the interviewer’s notes during the cognitive laboratory session. The interviewer will take notes during the cognitive laboratory session and will fill in any gaps using the video recording after the interviews. The interviewer will pay special attention to students who require any accommodations (based on parental responses to the screening questions), have any trouble getting started with the tasks, have difficulty understanding the directions, or have difficulty staying focused on the tasks throughout the session. As with the other assessments, we will also be looking at the need for multiple breaks during the assessment and the average time per item for these students so that we can plan for appropriate staffing during the field test. After the students complete the assessment, the interviewer will debrief them about their experiences to identify any problems with administration, procedures, or directions for the executive function tasks. Each task takes approximately 5 minutes to complete. Together with the debriefing questions, this cognitive activity will last about 60 minutes.

Student Interview. For the student interview we will use the same protocol that will be used for the student assessments, which will capture whether students require any accommodations, have any trouble getting started with the tasks, have difficulty understanding the directions, or have difficulty staying focused on the tasks throughout the session. We plan to include up to 15 students with disabilities, focusing efforts on the disability categories of autism, emotional disturbance, and specific learning disability, but also including students with other disabilities such as ADD/ADHD, physical disabilities, or speech and language impairments. The focus will be to assess items from the Student Questionnaire with a group of students with disabilities. As with the mathematics assessment, reading assessment, and executive function assessment cognitive interviews, the student questionnaire cognitive interviews will be conducted in person in a Mathematica corporate office or, when travel time is more than 30 minutes, an interviewer will travel to the student and conduct the interview in a community space (such as a private room at a local library). With the parent's and participant's permission, audio and video from the interviews will be recorded for note-taking purposes. During the interview, the interviewer will note pertinent aspects of the interview process, such as the student's level of motivation and any special circumstances that might affect the interview. Recordings will be analyzed afterward if it is determined that additional information is required beyond what is captured within the protocol.

The student interview will follow a standard protocol (Volume 3), taking about 60 minutes. We will examine Student Questionnaire items by conducting semi-structured interviews with probes to make sure students understand the questions. The interview will be interactive. For example, we will use questioning techniques after individual items so that we get sufficient feedback about new topics that have not previously been asked of a diverse group of middle grades students. The semi-structured interviews and probing will also help inform our decisions on modifying the wording of pre-existing items. For example, our cognitive laboratory work will include items on students' academic expectations and their perceptions of parenting behaviors. We will ask students about the meaning of these items and what factors they considered in answering them. We will also probe students on issues observed by the interviewer, such as signs of stress or confusion, so that we may better understand the source of any negative reaction. As with the assessment cognitive interviews, we will also learn what types of accommodations students with disabilities need in order to complete the questionnaire.

Parent Interviews. We will conduct up to 10 one-on-one interviews lasting approximately 30 minutes each with parents of students in grades 6, 7, or 8 who have autism, emotional disturbance, or specific learning disabilities. We will test items and topics from the Parent Questionnaire, including some items that are being tested with parents of typically developing students, as well as new items that ask parents of students with disabilities about their satisfaction with the services their children receive at school and their children's receipt of services outside of school. Parents will be sent the topics and items beforehand so that they can read the instructions and be familiar with what will be covered (estimated at 5 minutes of prep time). These interviews will be conducted by telephone or in person. With participants' permission, audio from the interviews will be recorded for note-taking purposes. Recordings will be analyzed afterward if it is determined that additional information is required beyond what is captured within the protocol.

The parent interview will be conducted using a standard protocol (Volume 3) and will include specific survey items followed by probes, which will examine the extent to which parents interpret the new items in the same way as one another. Parent responses will inform our decisions on potential modifications to item wording. We will also include a list of "observation" items to be used by the interviewer to capture parental sensitivity to the questions, as observed over the phone or in person, and any reluctance on the part of the parent to discuss topics related to their child's IEP, such as services the child receives outside of school.

Consultants Outside the Agency

The MGLS:2017 design team has sought the expertise of content area specialists to gather input on the various measures. We conducted a series of Content Review Panels (CRPs) to seek input on the key constructs to measure and to identify possible item sources and assessment design. Between July 2013 and April 2014, we convened five panels for the following study components: mathematics assessment, executive function tasks, socioemotional-student-family questionnaires, reading assessment, and students with disabilities. Most relevant to this OMB submission are the Reading Assessment Panel and the Students with Disabilities Panel.

The MGLS:2017 Reading Assessment Content Review Panel met via webinar on April 14, 2014 to provide input on the assessment design and the targeted constructs. The panel was very supportive of the reading assessment design and proposed content. The panel provided input on the utility of the timed MAZE task in the skill-based blocks to ensure that even the lowest ability group is demonstrating comprehension of connected text longer than a sentence. The panel supported the inclusion of items focused on word recognition and decoding for the lowest ability group, but suggested reducing the number of these items to allow more time for comprehension-focused items. The panel also provided input on the construction of the scenario-based, skill-based blocks for the moderate and high ability groups. The panel suggested that a deeper focus on fewer tasks may be better for engaging middle school students with the material, in contrast to sampling more broadly across many different tasks. The discussion with the panel supported the use of three different topics—one for each scenario-based block for better topic coverage—rather than using the same topic across each scenario-based block with different item types. The input from the panel will shape the final design and content for the reading assessment.

The MGLS:2017 Students with Disabilities Oversample Panel met via webinar on April 29, 2014 to discuss a variety of issues related to the students with disabilities component of the study. The panel provided feedback on the key research questions and the proposed instrument content related to students with disabilities, and helped to identify gaps both in the research questions and in the instrument coverage of critical topical areas for this group. The panel also reviewed the current sampling plan and the approach to working with districts and schools, helping identify potential challenges and solutions related to the sampling and collection of data on students in the three main disability categories of interest. In addition, the panel weighed in on the overall approach for the various assessments, their content, and the accommodations we expect to use. The panel supported the set of research questions but suggested that we think carefully about some definitions, such as what is meant by "service provider," and the varying types of service models we can expect to find in schools. The panel also suggested conducting a careful review of the current accommodations proposed, ensuring that we highlight the accommodations and supports available to all students via computer.

Table 1 shows the experts in attendance at each of the five CRP meetings, including the reading assessment and students with disabilities panels. The instruments being evaluated in the cognitive interviews reflect recommendations made by these various CRPs.

Table 1. Members of the Content Review Panels

Name | Affiliation | Expertise

Mathematics Assessment CRP (June 18-19, 2013)
Tom Loveless | Brookings Institution | Policy, math curriculum
Linda Wilson | Formerly with Project 2061 | Math education, math assessment, middle school assessment, author of Assessment Standards for School Mathematics (NCTM) and NAEP math framework, teacher
Kathleen Heid | University of Florida | Math education, use of technology, teacher knowledge, NAEP Grade 8 mathematics standing committee member
Edward Nolan | Montgomery County Schools | Math curriculum and standards, large-scale assessment of middle-grade students
Lisa Keller | UMass Amherst | Psychometrics, former math teacher
Paul Sally | University of Chicago | Math education, mathematics reasoning, mathematically talented adolescents
Margie Hill | University of Kansas | Co-author of Kansas math standards, former NAEP Math Standing Committee member, former district math supervisor

Executive Function CRP (July 18, 2013)
Lisa Jacobson | Johns Hopkins University; Kennedy Krieger Institute | Executive functioning, attention, neurodevelopmental disorders, parent and teacher scaffolding of executive functioning skills
Dan Romer | University of Pennsylvania | Adolescent risk taking
James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning

Socioemotional-Student-Family CRP (July 25-26, 2013)
James Byrnes | Temple University | Self-regulation, decision making, cognitive processes in mathematics learning
Russell Rumberger | University of California, Santa Barbara | School dropouts, ethnic and language minority student achievement
Tama Leventhal | Tufts University | Family context, adolescence, social policy, community and neighborhood indicators
Susan Dauber | Bluestocking Research | School organization, educational transitions, urban education, parent involvement and family processes
Scott Gest | Pennsylvania State University | Social networking, social skills, and longitudinal assessment of at-risk populations
Kathryn Wentzel | University of Maryland | Social and academic motivation, self-regulation, school adjustment, peer relationships, teacher-student relationships, family-school linkages
Richard Lerner | Tufts University | Adolescent development and relationships with peers, families, schools, and communities

Reading Assessment CRP (April 14, 2014)
Donna Alvermann | University of Georgia | Adolescent literacy, online literacy, co-director of the National Reading Research Center, which is funded by ED
Joseph Magliano | Northern Illinois University | Cognitive processes that support comprehension, the nature of memory representations for events depicted in text and film, strategies to detect and help struggling readers
Sheryl Lazarus | University of Minnesota | Education policy issues related to the inclusion of students with disabilities in assessments used for accountability purposes, student participation and accommodations, alternate assessments, technology-enhanced assessments, teacher effectiveness, large-scale assessments, school accountability, research design (including cost analyses), data-driven decision making, rural education, and the economics of education

Disabilities CRP (April 29, 2014)
Jose Blackorby | SRI International | Autism, specific learning disabilities, special education, curriculum design, alternate student assessment, large-scale studies of students with disabilities, co-director of the Special Education Elementary Longitudinal Study (SEELS)
Lynn Fuchs | Vanderbilt University | Specific learning disabilities, student assessment, mathematics curriculum, psychometric models
Mitchell L. Yell | University of South Carolina | Autism, emotional and behavior disorders, specific learning disabilities, preK–12 instruction and curriculum, special education, evidence-based intervention
Sheryl Lazarus | University of Minnesota | Special education policy, inclusion of students with disabilities in assessments, accommodations, alternate assessments, technology-enhanced assessments, large-scale assessments, school accountability, research design (including cost analyses)
Martha Thurlow | University of Minnesota | Specific learning disabilities, reading assessment, alternate student assessment, early childhood education, special education, curriculum, large-scale studies
Diane Pedrotty Bryant | University of Texas, Austin | Educational interventions for improving the mathematics and reading performance of students with learning disabilities, the use of assistive technology for individuals with disabilities, interventions for students with learning disabilities and students at risk for educational difficulties

Recruiting and Paying Respondents

Recruiting

DIR and Mathematica will recruit participants using networks of professional contacts (including CRP members), community organizations, disability communities, and charter or private schools to help achieve the desired cultural, linguistic, grade, and geographic diversity in the samples. Students across the disability groups of interest will also be represented in the samples and will be diverse with regard to their differentiation within each disability category. Where possible, we will attempt to streamline recruitment and optimize interview time. In particular, students and parents will be recruited together, as we contact parents to obtain consent for their minor children. Student assessments and interviews will be conducted in person. The target locations (Chicago; Princeton, NJ; Washington, DC; Oakland, CA; Cambridge, MA; and Ann Arbor, MI) will allow the study to obtain a diverse sample in close proximity to Mathematica offices. Parents will be interviewed over the phone or in-person.

Students. To ensure a diverse sample for the reading assessment, students will be recruited by DIR and Mathematica staff to achieve a mix of (1) sixth, seventh, and eighth grade students; (2) gender; (3) race and ethnicity (Black, Asian, White, Hispanic); and (4) socioeconomic background. For the disability-related cognitive laboratory work, our primary focus will be to recruit across the three disability groups of principal interest (autism, emotional disturbance, and specific learning disability), but we will also include students with other disabilities such as ADD/ADHD, physical disabilities, or speech and language impairments. We will use local disability communities and organizations and our networks of professional contacts, including CRP members, as recruiting resources. Although the sample will include a mix of student characteristics, the results will not explicitly measure differences by those characteristics.

DIR and Mathematica staff will begin recruitment by using existing contacts to reach out directly to parents and other adults in the community who work with children (for example, through churches, sports programs, and disability and community-based organizations). If needed, the recruiters may also ask members of the CRPs for referrals (members across the CRPs work with both typically developing students and students with disabilities) or place advertisements in print or online sources. The recruitment team will use a variety of methods (e-mails, letters, and phone calls) to make the initial contacts.

Interested participants will be screened to ensure the mix of characteristics described above. Both DIR and Mathematica will use the same basic recruitment screener, customized as necessary for the specific population being recruited. When recruiting student participants, staff will first speak to the parent of the minor before starting the screening process. During this conversation, the parent will be informed of the objectives, purpose, participation requirements, and activities involved in the student data collection effort. The parent screening is followed by screening of the student.

Once parents agree to allow their child to participate, they will be asked to complete and sign a written consent form prior to scheduling the cognitive interview. Parents of students with disabilities will be asked several questions about their child's disability and strategies for working with the child; for example, how to tell if their child is getting upset and what the best response is if that occurs. After confirmation that student participants are qualified, willing, and available to participate in the research project, they will receive a confirmation e-mail or letter. Only after DIR or Mathematica has obtained written consent from the parent will a student be allowed to participate in the cognitive interview session. See Volume 4 for recruitment, consent forms, confirmation, and thank you letters.

Parents of Students with Disabilities. Staff will determine a parent’s interest in participating in the parent cognitive interview at the same time as they are speaking with parents about their child’s participation. As with students, the goal for the parent interview cognitive testing is to have a sample that includes as much diversity as possible; however, priority will be placed on conducting interviews with parents of students of each of the three disability groups of interest (autism, emotional disturbance, or a specific learning disability). If a sufficient number of these parents agree to participate, efforts will be made to conduct interviews with parents of sixth, seventh, and eighth grade students, and to the extent possible, of varying socioeconomic backgrounds and race and ethnicities (Black, Asian, White, Hispanic). Finally, if possible, study staff will attempt to recruit a balance of mothers and fathers. If chosen, parents will be asked to sign a separate consent form indicating their voluntary participation in the parent cognitive laboratory interview.

Incentives

To attract individuals to participate in the cognitive laboratory activities and to thank them for their time and effort, students will receive a $25 gift card for participating; parents will receive a $25 gift card for accompanying a student to the activities; and parents of students with disabilities who complete the parent cognitive interview will receive an additional $15 gift card. Table 2 provides the cognitive laboratory incentive amount for each activity.

Table 2. Reading and Disability Cognitive Laboratory Incentives, by Activity

Activity (number of participants) | Incentive
Student Interview (15), Mathematics Assessment (20), Reading Assessment (20), and Executive Function Assessment (20) | $25 for student + $25 for accompanying parent
Parent Interview (10) | $15




Assurance of Confidentiality

At the beginning of the cognitive interview, respondents will be informed that their participation is voluntary and that their answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (Education Sciences Reform Act of 2002, 20 U.S.C. § 9573). No personally identifiable information will be maintained after the interview analyses are completed. With respondent permission, audio and video from the interviews will be recorded for use in later analyses. If the respondent (or the student's parent) indicates that he or she does not want to be recorded (either audio or video), only written notes will be taken. The recordings and notes will be destroyed at the conclusion of the MGLS:2017 design contract.

Estimate of Hour Burden

Table 3 shows the expected burden for the cognitive laboratory activities. We anticipate contacting approximately 150 parents during recruitment to yield the final sample of respondent interviews to test the parent survey with its added disability content, and to test the student mathematics assessment, reading assessment, executive function tasks, and student questionnaires for students with disabilities. The estimated burden for recruitment is 10 minutes on average, for a total of approximately 25 burden hours. The estimates of the number of parents who will need to be contacted about their participation and of the recruitment time are based on similar cognitive laboratory studies (for example, TIMSS, NAEP, and ECLS-K). For all of the student-level testing, we plan to focus on students in the three disability categories of interest (although we will also seek and include students with other disabilities) across grades 6, 7, and 8. The total respondent burden for the MGLS:2017 cognitive laboratory activities described in this OMB package is approximately 106 hours.
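The burden totals follow directly from the respondent counts and per-activity times reported in Table 3. As a quick arithmetic check (a sketch only; the figures are taken from the table, and per-activity hours are rounded to whole hours as in the table):

```python
# Burden arithmetic behind Table 3: (number of respondents, minutes per respondent).
activities = {
    "Recruitment": (150, 10),
    "Mathematics Assessment": (20, 60),
    "Reading Assessment": (20, 60),
    "Executive Function Tasks": (20, 60),
    "Student Questionnaire": (15, 60),
    "Parent Questionnaire": (10, 35),
}

total_hours = 0.0
for name, (respondents, minutes) in activities.items():
    hours = respondents * minutes / 60  # burden in hours for this activity
    total_hours += hours
    print(f"{name}: {hours:.0f} hours")

# The parent questionnaire contributes 10 * 35 / 60 ≈ 5.8 hours, which the
# table rounds to 6; the rounded grand total matches the reported 106 hours.
print(f"Total burden: {round(total_hours)} hours")
```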


Table 3. Respondent Burden Chart for MGLS:2017 Cognitive Laboratory Activities

Activity (Number of Items/Topics per Participant) | Subpopulations Represented (grades 6, 7, and 8) | Mode | Number of Respondents | Length (mins) | Total Burden (hrs)
Recruitment | — | Mail, e-mail, telephone | 150* | 10 | 25
Student Mathematics Assessment (15 to 20 items) | Up to 20 students across three disability categories | In-person, hard copy assessment | 20 | 60 | 20
Student Reading Assessment (routing and skill-based blocks) | Up to 10 typically developing students; up to 10 students with disabilities | In-person, computer assessment | 20 | 60 | 20
Student Executive Function Tasks (4 tasks) | Up to 20 students across three disability categories | In-person, computer assessment | 20 | 60 | 20
Student Questionnaire (8 survey items/item series) | Up to 15 students across three disability categories | In-person interview | 15 | 60 | 15
Parent Questionnaire (4 survey items/item series) | Up to 10 parents of students across three disability categories | In-person or telephone interview, hard copy topic form | 10 | 35 | 6
Study Total | 235 responses | — | 160 | N/A | 106


* This estimate assumes an attrition rate of approximately 50 percent from initial contact to participation, yielding 75 student participants.


Cost to the Federal Government

The estimated cost to the government to conduct the MGLS:2017 cognitive laboratory activities described in this OMB submission is $115,775.

Project Schedule

Table 4 provides the schedule of milestones and deliverables for this round of cognitive interviews. Recruitment for the interviews is expected to begin in June 2014, and interviews need to be completed by July 2014. The results of the cognitive laboratory activities will be summarized in a report scheduled for August 2014; brief interim reports will also be submitted during the cognitive laboratory process.

Table 4. Schedule of MGLS:2017 Reading and Disability Cognitive Laboratory Milestones and Deliverables

Milestone/Deliverable | Date
Recruitment begins | May 2014
Data collection, coding, and analysis | June–July 2014
Cognitive laboratory report completed | August 2014


1 The sample yield and number of disability categories planned for oversampling have been revised since the last cognitive interviews clearance because, after consultation with the Office of Special Education Programs (OSEP), NCES decided to focus on three of the five disability categories originally identified for the study, thereby reducing the planned sample yield for the national study. The category of "Speech or Language Impairment" was dropped because this disability is much more prevalent in younger children and is less of an issue in the middle grades, and the category of "ADD/ADHD" was dropped because it is not a reportable category under IDEA, so identifying students with this disability would be problematic.


