

Impact Evaluation of Training in
Multi-Tiered Systems of Support for Reading in Early Elementary School

OMB Clearance Request: Data Collection Instruments, Part B

Data Collection

December 2021




1000 Thomas Jefferson Street, NW
Washington, DC 20007-3835
202.403.5000

www.air.org

Copyright © 2021 American Institutes for Research. All rights reserved.







Part B. Supporting Statement for Paperwork Reduction Act Submission

This package requests clearance from the Office of Management and Budget (OMB) to conduct initial data collection activities for the Impact Evaluation of Training in Multi-Tiered Systems of Support for Reading in Early Elementary School (the MTSS-R Study). The Institute of Education Sciences (IES), within the U.S. Department of Education (ED), awarded the MTSS-R Study contract to the American Institutes for Research (AIR) and its partners, Instructional Research Group (IRG) and School Readiness Consulting (SRC), in September 2018. The purpose of this evaluation is to provide information for policy makers, administrators, and educators on the effectiveness of two MTSS-R approaches in improving classroom reading instruction and students’ reading skills. In addition, the evaluation will examine implementation challenges and costs of the two MTSS-R approaches.

This package provides a detailed discussion of all evaluation activities. However, the package only requests clearance for the school staff surveys, interviews, and Tier I and Tier II post-observation interviews, which will take place from the spring of SY2021–22 through SY2023–24. A previous package submitted in June of 2020 and cleared in December of 2020 (OMB Control Number: 1850-0953) requested clearance for the data collection activities that occur in the fall of 2021 (parent consent forms for student participation in data collection activities, district records requests to identify students in the sample, and district cost interviews).

Collections of Information Employing Statistical Methods

  1. Respondent Universe and Sampling Methods

The Impact Evaluation of Training in Multi-Tiered Systems of Support for Reading in Early Elementary School (the MTSS-R Study) will examine the impact of two approaches to MTSS-R on instructional practices and student reading skills in elementary schools. The study will include a purposive sample of approximately 150 elementary schools across 9 school districts. We anticipate that approximately 900 teachers, 450 reading interventionists, and 22,500 students in the 150 schools will participate in the study.

Site selection. The study does not employ random sampling of districts or schools. Instead, districts and schools are being recruited according to characteristics required by the study design. To identify potential schools and districts, the study team and ED have developed the following eligibility guidelines:

Schools are eligible if they:

  • Have a low overall level of MTSS-R implementation.

  • Are not planning to train staff to improve MTSS-R implementation during the 2021–22, 2022–23, or 2023–24 school years.

  • Have access to core reading program materials for Tier I.

  • Have at least 40 students per grade in Grades 1 and 2.

  • Have at least 20% of students in Grades 1 and 2 who struggle with reading.

Districts are eligible if they have at least nine eligible schools.

To determine initial eligibility, the study team first identified 337 districts nationwide that had nine or more elementary schools with 40 or more students per grade in Grades 1 and 2 and that had at least 20% of third-grade students performing below basic on the state reading test. We excluded Hawaii because of cost concerns. From this initial list of districts, 9 school districts and a total of approximately 150 schools are being recruited for the study on the basis of their current and planned practices (e.g., districts already implementing a rigorous MTSS-R approach will not be selected), interest, and ability to participate.
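For illustration only, the initial district screen can be expressed as a simple filter over district and school data files. The following is a minimal Python sketch under one reading of the criteria (with the below-basic threshold applied at the district level); the file and column names are hypothetical placeholders rather than the study's actual data elements.

    import pandas as pd

    # Hypothetical inputs; file and column names are placeholders.
    schools = pd.read_csv("elementary_schools.csv")    # one row per elementary school
    districts = pd.read_csv("district_profiles.csv")   # one row per district

    # Schools with 40 or more students in each of Grades 1 and 2.
    large_enough = schools[(schools["enroll_g1"] >= 40) & (schools["enroll_g2"] >= 40)]

    # Districts with nine or more such schools ...
    counts = large_enough.groupby("district_id").size().reset_index(name="n_schools")
    screened = districts.merge(counts, on="district_id", how="inner")

    # ... and with at least 20% of Grade 3 students below basic on the state reading test.
    initial_list = screened[(screened["n_schools"] >= 9)
                            & (screened["pct_g3_below_basic"] >= 0.20)]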

Sampling Methods

The paragraphs below describe the sampling plans for each data collection.

  a. Data collections that do not require clearance

Study-administered student tests. The study team will administer the Woodcock Reading Mastery Test to all students (excluding students whose parents have not consented) in the fall of first grade. We will use the Woodcock Reading Mastery Subtests to identify the subgroup of students at risk for reading difficulties in each cohort based on a common cutoff value across all schools. In the spring of each year of MTSS-R implementation, we will test a stratified random sample of students within each classroom who were part of the fall baseline testing and are still enrolled in a study school in the spring. Within each of the tested classrooms, we will select a total of 10 students across two strata:

  • Five students randomly selected from the at-risk subgroup, representing students who are likely to be referred to Tier II intervention. If a classroom includes five or fewer at-risk students, all at-risk students will be sampled.

  • Five students randomly selected from the not-at-risk subgroup, representing students who are not likely to be referred to Tier II intervention but may be affected by changes in Tier I core reading instruction introduced as part of the MTSS-R intervention.

We plan to sample students within each classroom to balance the sample of students across classes, so that the estimated effects are based on students taught by all Grade 1 and 2 teachers of reading in the sampled schools. Randomly sampling students within classrooms will also facilitate supplemental analyses that examine the association between instruction and student outcomes. Prior to sampling, we will examine the distribution of student baseline test scores across classrooms. If there are not enough students in the at-risk subgroup in some classes, we will consider sampling at the grade level within each study school instead of within each classroom. We will finalize the sampling process for outcome testing to ensure that the total number of students tested averages 10 students per classroom per grade per school (5 at risk and 5 not at risk).
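A minimal sketch of this within-classroom sampling step is shown below (Python). The data frame, column names, and seed are hypothetical, and the final procedure may instead sample at the grade level as noted above.

    import pandas as pd

    SEED = 12345  # placeholder seed for reproducibility

    def sample_classroom(students: pd.DataFrame, per_stratum: int = 5) -> pd.DataFrame:
        """Sample up to 5 at-risk and 5 not-at-risk students from one classroom.

        `students` holds the fall-tested, still-enrolled students in a single
        classroom, with a boolean `at_risk` flag based on the common baseline cutoff.
        """
        drawn = []
        for flag in (True, False):
            stratum = students[students["at_risk"] == flag]
            n = min(per_stratum, len(stratum))  # take all if five or fewer are available
            drawn.append(stratum.sample(n=n, random_state=SEED))
        return pd.concat(drawn)

    # `eligible_students` is a hypothetical frame of fall-tested, still-enrolled students.
    spring_test_sample = (
        eligible_students.groupby("classroom_id", group_keys=False).apply(sample_classroom)
    )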

Tier I and Tier II observations. During the spring of the second study year, the study team will observe all first- and second-grade teachers in each of the 150 study schools. Each teacher will be observed once, and half of those observed will be randomly selected to be observed a second time (stratifying the sample by school and grade). In addition, we will observe a random sample of two reading interventionist groups (one in Grade 1, one in Grade 2), which will be followed by a post-observation interview. Interventionist groups will be observed only once.
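A compact sketch of the second-observation draw follows (Python; identifiers are illustrative): within each school-by-grade stratum, roughly half of the observed teachers are selected at random for a repeat observation.

    import pandas as pd

    def flag_second_observations(teachers: pd.DataFrame, seed: int = 2023) -> pd.DataFrame:
        """Mark roughly half of each school-by-grade stratum for a second observation."""
        out = teachers.copy()
        out["second_obs"] = False
        for _, stratum in out.groupby(["school_id", "grade"]):
            chosen = stratum.sample(n=len(stratum) // 2, random_state=seed).index
            out.loc[chosen, "second_obs"] = True
        return out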

  b. Data collections under the previous clearance request

Parent consent forms. In the fall of 2021, the study team will distribute parent consent forms to all parents of first-grade students (i.e., Cohort 1 students) in participating schools. The study team will do the same in the fall of 2022 (i.e., for Cohort 2 students). The forms will ask for active consent in districts that require it, and will allow parents to opt their child out of data collection activities in districts that do not require active consent.1

District cost interviews. In the fall and spring of the first study year, and in the spring of the second and third study years, the study team will interview the head of each district’s early elementary curriculum/literacy and student services departments.

District records requests. The study team will request the following data:

(1) Class rosters and demographic information for students. We will request roster and demographic data for all Cohort 1 and Cohort 2 students (whose parents have not opted them out) in the fall and spring of SY2021–22 (Cohort 1 only), the fall and spring of SY2022–23 (both cohorts), and the fall and spring of SY2023–24 (Cohort 2 only).

(2) Staff position and demographic information. We will request directory and demographic information for all first-grade and second-grade teachers and interventionists serving first-grade and second-grade students in participating schools, as well as for district central office staff supporting implementation, in the spring of 2022, 2023, and 2024.

(3) Students’ intervention status and screening and progress monitoring data. We will request information on students’ intervention status, as well as screening and progress monitoring data, on all Cohort 1 and Cohort 2 students in the fall, winter, and spring of SY2021–22 (Cohort 1 only), SY2022–23 (both cohorts), and SY2023–24 (Cohort 2 only).

(4) Student achievement data. We will also request information on third-, fourth-, and fifth-grade achievement in the spring of 2024 (Cohort 1 students’ third-grade achievement), spring of 2025 (Cohort 1 students’ fourth-grade achievement and Cohort 2 students’ third-grade achievement), spring of 2026 (Cohort 1 students’ fifth-grade achievement and Cohort 2 students’ fourth-grade achievement), and spring of 2027 (Cohort 2 students’ fifth-grade achievement).

  c. Data collections under the current clearance request

Teacher surveys. The study team will survey all Grade 1 and 2 teachers in participating schools in the spring of 2022, 2023, and 2024.

Reading interventionist surveys. In each study school, the study team will survey all school staff that provide supplemental reading intervention to students in Grades 1 and 2, in the spring of 2022, 2023, and 2024.

MTSS-R team leader surveys. The study team will survey one person in each school who is familiar with the school’s practices for screening and progress-monitoring in the spring of 2022, 2023, and 2024. In most schools, this will be a reading specialist or school administrator who serves as the MTSS-R team leader.

MTSS-R team leader interviews. In the spring of 2022 and 2024, we will interview the MTSS-R team leader in each school implementing one of the two MTSS-R approaches.

Tier I and Tier II post-observation interviews. During the spring of the second study year, the study team will observe all first- and second-grade teachers in each of the 150 study schools. Each observation will be followed by a 5-minute interview related to the teacher’s use of data to inform the lesson. Each teacher will be observed once, and half of those observed will be randomly selected to be observed a second time (stratifying the sample by school and grade). In addition, we will observe a random sample of two reading interventionist groups (one in Grade 1, one in Grade 2), which will be followed by a post-observation interview. Interventionist groups will be observed only once.

  2. Procedures for Data Collection

This section describes our procedures for each data collection.

  a. Data collections that do not require clearance

Procedures for testing students. Students whose parents have provided consent will be administered the Woodcock Reading Mastery Test individually and in person. The study team will send teams of testers to schools, and the teams will plan to complete each school’s testing within two days for each administration. The study team will work with schools and districts in advance to obtain additional school staff support to facilitate the testing.

Procedures for the Tier I and Tier II observations. Classroom teacher and interventionist observations will be conducted in person.

  b. Data collections under the previous clearance request

Procedures for collecting parent consent forms. The procedures will differ depending on whether districts require active informed parent consent or waive the active consent requirement; two of the likely participating districts require active parental consent.

Procedure for districts that require active consent: The study team will work with districts requiring active consent to identify one person per school—a parent consent liaison—to compile consent forms as they are returned and conduct targeted outreach to ensure that forms are returned. The parent consent liaison will be compensated for his or her time with a stipend. Specifically, we will implement the following steps:

  • Consent forms will be included in the information packet parents receive at the beginning of the year, with clear instructions to return the form along with other necessary paperwork.

  • One week after parents receive the consent forms, the consent liaisons will send a second copy of the form home for students whose parents have not yet returned a consent form, using “backpack mail.”

  • One week later, the parent consent liaison will begin reaching out to parents who have still not returned the form, with the help of classroom teachers.

  • The parent consent liaisons will compile the returned forms twice a week for pick-up and will share return rates by classroom with a local data collection consultant employed by the study team. The local data collection consultant will pick up the consent forms twice a week from the schools, enter consent information into a database, make copies of the consent forms, and send the original forms to the study team.

Procedure for districts that waive active consent: As in the districts that require active consent, the consent forms, which allow parents to opt their child out of data collection activities, will be included in the information packet parents receive at the beginning of the year, with clear instructions for parents to return the form within two weeks if they want to opt their child out of data collection. Likewise, the parent consent liaisons in districts that waive active consent will compile returned forms for the local data collection consultant to pick up. After three weeks, the data collection consultant will enter information about opt-outs into a database, make copies of the consent forms, and send the original forms to the study team.

Procedures for the district cost interviews. The district interviews will be conducted by phone. As with the MTSS-R team leader interviews, one study team member will lead the interview, and a second team member will take notes.

Procedures for collecting district records. District data requests will be included in the Data Sharing Agreement section of the Memorandum of Understanding (MOU) that will be negotiated with the district during the recruitment process. Eight weeks prior to needing the data, the study team will send relevant district staff the data request and will offer to meet to discuss the request. Data will be transferred to AIR using a secure file transfer protocol site.

  c. Data collections under the current clearance request

Procedures for the teacher, reading interventionist, and MTSS-R team leader surveys. The surveys will be group-administered electronically in each school’s computer lab. Administration dates will be established in each district in cooperation with the district’s main point of contact. Prior to each data collection, the teachers, reading interventionists, and MTSS-R team leaders will be notified (by e-mail and through notifications posted in staff lounges) about the timing and purpose of the surveys. The notifications will inform staff that they can complete the survey at another time, using the link provided in the survey e-mail, if they cannot attend the group administration.

Procedures for the MTSS-R team leader interviews. MTSS-R team leaders will be e-mailed a link to an electronic form to identify a time for the interview and to provide a phone number for the study team to call. MTSS-R staff who have not scheduled the interview after two reminder e-mails will be contacted by phone or with the assistance of their district’s main point of contact. When the interview has been scheduled, the study team will send electronic reminders. The interviews will be conducted by phone, with one study team member leading the interview and a second team member taking notes.

Procedures for the Tier I and Tier II post-observation interviews. The post-observation interviews will take place in person immediately following the observation.

  3. Estimation Procedures

This section describes our data analysis procedures to address the study’s research questions. The analyses fall into five general categories: (1) impact analyses that estimate the effects of each of the two MTSS-R approaches on instructional practices and student reading outcomes; (2) implementation analyses that describe implementation of MTSS-R related supports, infrastructure, and procedures in the treatment schools; (3) implementation and service contrast analyses that estimate differences in MTSS-R related supports, infrastructure, and procedures between treatment schools and control schools; (4) a cost-effectiveness analysis; and (5) nonexperimental descriptive and correlational analyses related to supplemental study briefs.

Impact analyses. We will conduct intent-to-treat (ITT) analyses to estimate the impact of each MTSS-R approach on reading instructional practices (RQ1) and student reading outcomes (RQ2).

Impact on Reading Instructional Practices (RQ1). To address RQ1.1, we will estimate a three-level hierarchical linear model with observations nested within teachers nested within schools, as described in Exhibit 1. We will use the model to estimate the MTSS-R impact on each of the two primary measures of Tier I instruction: degree of explicit instruction and degree of differentiated instruction. To address RQ1.2, we will estimate a similar model, with Tier II sections nested within schools. In addition to estimating the effect of each MTSS-R approach relative to the control schools, we will conduct a post hoc analysis to test whether the MTSS-R impact differs between the two approaches by comparing the $\gamma_A$ and $\gamma_B$ estimates from the impact model (Model 1).

Exhibit 1. Statistical Model for Estimating the Impact of MTSS-R on Reading Instructional Practices (RQ1)

Model 1:

$$Y_{ijk} = \beta_0 + \beta_1 X_{jk} + \beta_2 G_{jk} + \beta_3 W_k + \beta_4 Block_k + \gamma_A TrtA_k + \gamma_B TrtB_k + u_k + r_{jk} + e_{ijk}$$

where

  • $Y_{ijk}$ is the instruction outcome of observation i for teacher j in school k;

  • $X_{jk}$ is a vector of teacher characteristics (e.g., years of experience), including student composition of the teacher’s classroom (e.g., proportion of English learner [EL] students and mean baseline reading score) and class size;

  • $G_{jk}$ is a dichotomous indicator for whether the teacher has Grade 1 or Grade 2 students;

  • $W_k$ is a vector of school background characteristics, including school size and the school’s student composition (e.g., proportion of EL students and mean baseline reading score);

  • $Block_k$ is a vector of dichotomous indicators for randomization blocks;

  • $TrtA_k$ is a dichotomous indicator for whether school k was randomly assigned to implement MTSS-R using Approach A ($TrtA_k = 1$) or not ($TrtA_k = 0$);

  • $TrtB_k$ is a dichotomous indicator for whether school k was randomly assigned to implement MTSS-R using Approach B ($TrtB_k = 1$) or not ($TrtB_k = 0$);

  • $u_k$ is the school-level residual;

  • $r_{jk}$ is the teacher-level residual; and

  • $e_{ijk}$ is the observation-level residual.

The main parameters of interest are $\gamma_A$ and $\gamma_B$, which represent the impact of MTSS-R implemented using the two alternative approaches A and B, respectively, estimated as the precision-weighted average impact of MTSS-R across blocks.
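For illustration only, a three-level specification of this kind could be estimated along the following lines. This is a minimal Python/statsmodels sketch in which the analysis file, variable names, and estimation options are placeholders; the study’s analyses may use different software and settings.

    import statsmodels.formula.api as smf

    # `obs_data` is a hypothetical observation-level analysis file: one row per
    # classroom observation, with teacher, school, block, and treatment-arm
    # identifiers attached. Column names are placeholders.
    model = smf.mixedlm(
        "explicit_instruction ~ approach_a + approach_b + grade2"
        " + years_experience + prop_el + mean_baseline + class_size"
        " + school_size + school_prop_el + school_mean_baseline + C(block)",
        data=obs_data,
        groups="school_id",                            # school-level random intercept
        vc_formula={"teacher": "0 + C(teacher_id)"},   # teacher intercepts nested within schools
    )
    fit = model.fit()
    print(fit.summary())  # coefficients on approach_a and approach_b are the impact estimates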

Impact on Student Reading Outcomes (RQ2). To estimate the impact of MTSS-R on students’ Grade 2 reading outcomes, we will pool students in Cohort 1 (who will be in Grade 2 in Implementation Year 2) and Cohort 2 (who will be in Grade 2 in Implementation Year 3), estimating a three-level model with students nested within teachers (at the time of Grade 2 testing) and schools, as described in Exhibit 2. To address each of RQ2’s subquestions, we will estimate separate models for the following student samples:

  • students identified by the study team as at risk for reading difficulties at baseline (RQ2.1),

  • students in policy-relevant high-need populations at baseline (students with disabilities and English learners; RQ2.2), and

  • all students in the baseline sample (RQ2.3).

In addition to estimating the effect of each MTSS-R approach relative to the control schools, we will conduct a post hoc analysis to test whether the MTSS-R impact differs between the two approaches by comparing the $\gamma_A$ and $\gamma_B$ estimates from the impact model (Model 2).

Exhibit 2. Statistical Model for Estimating the Impact of MTSS-R on Student Reading Outcomes (RQ2)

Model 2:

$$Y_{ijk} = \beta_0 + \beta_1 Z_{ijk} + \beta_2 C_{ijk} + \beta_3 X_{jk} + \beta_4 G_{jk} + \beta_5 W_k + \beta_6 Block_k + \gamma_A TrtA_k + \gamma_B TrtB_k + u_k + r_{jk} + \varepsilon_{ijk}$$

where

  • $Y_{ijk}$ is the test score (Woodcock Reading Mastery Subtest) for student i taught by teacher j in school k;

  • $Z_{ijk}$ is a vector of student background characteristics (e.g., gender, special education status, EL status, and baseline test score);

  • $C_{ijk}$ is a dichotomous indicator, coded 1 if student i is in Cohort 2 and 0 if the student is in Cohort 1; and

  • $\varepsilon_{ijk}$ is a student-level residual term.

The other terms are defined in the same way as in Model 1 (Exhibit 1).

Implementation analyses. We will conduct descriptive analyses of the fidelity of training delivery (RQ3), presenting summary statistics on the extent to which the frequency, duration, content, and attendance for trainings and supports in the treatment schools were aligned with the providers’ intent, based on observations of trainings and provider technical assistance logs. In addition, we will describe how treatment schools implement MTSS-R infrastructure and procedures (RQ5) by presenting summary statistics on the frequency, duration, and content of school team meetings; team meeting participation; measurement of MTSS-R implementation fidelity; and the systematic collection and use of data for screening and progress monitoring, based on MTSS-R meeting minutes.

Service and implementation contrast analyses. We will conduct analyses to estimate differences in the training and supports provided in treatment and control schools (RQ4), as well as differences in the infrastructure and procedures in treatment and control schools (RQ6). We will test for differences between treatment and control schools using a simplified version of the model described in Exhibit 2, based on data from the teacher and interventionist surveys, where respondents to the survey are nested within schools.

Cost-effectiveness analyses. The study’s approach to cost and cost-effectiveness (RQ7) is based on the Resource Cost Model. Costs captured through the data collections will be compiled in the CostOut tool developed at Teachers College, Columbia University. The resource cost database for the study will contain the costs of provider, coach, and district staff time and other resources used to support districts and schools in implementing Approach A or Approach B, as well as the cost of implementation support provided by the study team. We will report total costs over the three years of implementation, annual costs, and per-pupil costs, by MTSS-R approach. We will also report specific components of the cost (e.g., the cost due to coaching).

We will then link the costs with the results of the impact analysis to provide cost-effectiveness estimates for each MTSS-R approach. Specifically, the estimates will measure the effectiveness of the MTSS-R approaches in terms of the cost per additional unit of improvement in student achievement.
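As a simple illustration of this final step, the cost-effectiveness ratio for each approach can be computed as the incremental per-pupil cost divided by the estimated impact on the reading outcome. The figures in the Python sketch below are placeholders, not study estimates.

    def cost_effectiveness_ratio(incremental_cost_per_pupil: float,
                                 impact_estimate: float) -> float:
        """Cost per additional unit of improvement (e.g., one effect-size unit)."""
        return incremental_cost_per_pupil / impact_estimate

    # Placeholder inputs for illustration only: $400 per pupil and a 0.10 SD impact
    # imply a cost of $4,000 per pupil per 1 SD gain.
    print(cost_effectiveness_ratio(incremental_cost_per_pupil=400.0,
                                   impact_estimate=0.10))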

Analyses related to supplemental briefs. We will conduct descriptive and correlational analyses to address the research questions related to the supplemental briefs.

  4. Methods to Maximize Response Rates

This section describes our approach to maximizing response rates for each data collection. For all data collections, the study team will work closely with districts from the site selection phase onward, communicating the importance of the data collections and including support for data collection as part of the MOU.

  a. Data collections that do not require clearance

Procedures to maximize student testing response rates. Schools will be informed of the student testing procedures and timeline prior to joining the study. Testing will take place over approximately six weeks; testing days will be scheduled in collaboration with the study’s main point of contact in the district and with school administrators in each participating school. The study team will minimize disruption to the schools by sending teams of testers who can complete the student testing over two instructional days. Makeup days for students who are absent during the testing days will be planned in advance, with ample time to ensure that all enrolled students whose parents have consented (or have not opted them out of data collection) are tested.

Procedures to maximize Tier I and Tier II observation response rates:

  • Teachers and interventionists will be informed of the observations during the staff orientation. Reminder e-mails will be sent one month in advance of the observation and will include information about the observation. Teachers and reading interventionists will be informed of the week that observations will take place as well as plans for makeup observations.

  • The study team will work with a school-specific data collection liaison to collect information on the timing of reading blocks and reading intervention groups to establish an efficient schedule of observations (minimizing the time observers are in each school). In addition, the study team will work with each district’s main contact to determine days on which observations should occur, avoiding days when schools are less likely to have reading blocks (e.g., because of field trips). Observation schedules will incorporate time for makeup observations for teachers or interventionists who are away (e.g., due to illness).

  b. Data collections under the previous clearance request

Procedures to maximize parent consent form return rates. In addition to the steps described in section 2b of this clearance request, the study team will take the following steps to maximize parental consent:

  • The consent forms will be brief, with one page of information and one page for signatures.

  • The consent forms will be written in clear, accessible language, explaining the importance of the study, the data collections students would be involved in, and the benefits and risks to participation.

  • Consent forms will be available in both English and Spanish. We will also consider other translations if a participating district indicates that a significant percentage of parents in the participating schools typically receive school materials in other languages.

Procedures to maximize district cost interview and district records request response rates. Study team members will be responsible for maintaining contact with the districts, ensuring that district leaders are reminded of the interview and extant data requests in advance and understand the importance of these data collections. Interviews will be scheduled a month in advance, with reminders sent a week before.

  c. Data collections under the current clearance request

Procedures to maximize teacher, reading interventionist, and MTSS-R team leader survey response rates. The study team will use the following methods to maximize the response rates for these surveys:

  • The surveys will be brief (approximately 30 minutes) and will cover only crucial constructs, to reduce the risk that staff will refuse the survey because of length or burden.

  • All surveyed teachers and reading interventionists will be offered a $30 incentive for completion, conditional on OMB approval.

  • The surveys will be group-administered electronically in the school’s computer lab. Staff will be notified in advance by e-mail about the timing of the survey administration.

  • Nonresponders will be given the opportunity to take the survey individually, either electronically or via hard copy. The web survey will be optimized to display appropriately on a smartphone web browser.

Procedures to maximize MTSS-R team leader interview response rates. To maximize response rates for the interviews, the study team will use the following methods:

  • The interview with each MTSS-R team leader will be scheduled in advance by e-mail or phone, depending on the team leader’s preference. The study team will use electronic invitations with the call-in information, interview topics, and date and time.

Procedures to maximize Tier I and Tier II post-observation interview response rates:

  • Teachers and interventionists will be informed of the Tier I and Tier II post-observation interviews during the staff orientation. Reminder e-mails will be sent one month in advance of the observation and will include information about the interview process (i.e., an observer will visit to observe a lesson and will conduct a 5-minute structured interview at the end of the observation).

  5. Testing Data Collection Processes and Instruments

The teacher, reading interventionist, and MTSS-R team leader surveys will be piloted during the 60-day comment period (fewer than 10 respondents per instrument) and will be revised on the basis of feedback to ensure that the questions are as clear and simple as possible for respondents to complete. Pilot-test subjects will include volunteer staff from nonstudy schools in districts that have signed on to the study. A think-aloud, or cognitive lab, format will be used for pilot-testing: The respondents will be asked to complete the draft instrument, explain their thinking as they construct their responses, and identify the following:

  • Questions or response options that are difficult to understand

  • Questions in which none of the response options is an accurate description of a respondent’s circumstances

  • Questions that call for a single response but more than one is appropriate

  • Terms that are not defined or are unclear

  • Questions for which the information requested is unavailable

  6. Individuals Consulted on Statistical and Methodological Aspects of Data Collection

This project will be conducted by AIR, together with its partners IRG and SRC, under contract to ED. The data collection strategy and instruments were developed by Michael Garet, Anja Kurki, and Seth Brown of AIR. Jordan Rickles of AIR will lead the development of the design report, analysis plan, and the study’s impact analyses. Project staff will also draw on the experience and expertise of a network of outside experts who are members of AIR’s technical working group (TWG). The TWG members are listed in Exhibit 3.

Exhibit 3. Technical Working Group Members

David Francis (University of Houston)
Elizabeth Tipton (Northwestern University)
Julie Washington (Georgia State University)
Lynne Vernon-Feagans (University of North Carolina)
Matthew Burns (University of Minnesota)
Michael Conner (Middletown Public Schools, Connecticut)
Michael Coyne (University of Connecticut)
Nathan Clemens (University of Texas at Austin)
Nicole Patton Terry (Florida State University)
Stephanie Al Otaiba (Southern Methodist University)
Sylvia Linan-Thompson (University of Oregon)
Yaacov Petscher (Florida State University)


The organizations responsible for data collection activities are as follows:

American Institutes for Research: Seth Brown, 781-373-7034

Instructional Research Group: Joseph Dimino, 714-826-9600

School Readiness Consulting: Laura Hawkinson, 877-447-0327, ext. 709

ABOUT AMERICAN INSTITUTES FOR RESEARCH

Established in 1946, with headquarters in Washington, D.C., American Institutes for Research (AIR) is an independent, nonpartisan, not-for-profit organization that conducts behavioral and social science research and delivers technical assistance both domestically and internationally. As one of the largest behavioral and social science research organizations in the world, AIR is committed to empowering communities and institutions with innovative solutions to the most critical challenges in education, health, workforce, and international development.

1000 Thomas Jefferson Street NW
Washington, DC 20007-3835
202.403.5000

www.air.org

LOCATIONS

Domestic

Washington, D.C.

Arlington, VA

Atlanta, GA

Austin, TX

Baltimore, MD

Cayce, SC

Chapel Hill, NC

Chicago, IL

Columbus, OH

Frederick, MD

Honolulu, HI

Indianapolis, IN

Metairie, LA

Naperville, IL

New York, NY

Rockville, MD

Sacramento, CA

San Mateo, CA

Waltham, MA

International

Egypt

Honduras

Ivory Coast

Kyrgyzstan

Liberia

Tajikistan

Zambia

1 Of the nine likely participating districts, one requires active parental consent; the remaining eight districts have waived active parental consent. Parents in these eight districts will be informed about the study and study-related student testing and will have the opportunity to opt their child out of the student testing (see Appendix A for the parent consent forms).
