Evaluation of the Early Warning and Intervention Monitoring System

OMB: 1850-0904

Supporting Justification for OMB Clearance of an Evaluation of the Early Warning and Intervention Monitoring System Under the Regional Educational Laboratory Program (REL)

OMB Clearance Request, Part B



March 2014


Submitted to:

U.S. Department of Education
Institute of Education Sciences
555 New Jersey Ave. NW, Rm. 308
Washington, DC 20208

Submitted by:

American Institutes for Research
1000 Thomas Jefferson Street NW
Suite 200
Washington, DC 20007-3835






1120 East Diehl Road, Suite 200

Naperville, IL 60563-1486

866-730-6735

www.relmidwest.org

This publication was prepared for the Institute of Education Sciences (IES) under contract ED-IES-12-C-0004 by Regional Educational Laboratory Midwest, administered by American Institutes for Research. The content of the publication does not necessarily reflect the views or policies of IES or the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. government. The publication is in the public domain. Authorization to reproduce in whole or in part for educational purposes is granted.









Supporting Justification for OMB Clearance of an Evaluation of the Early Warning and Intervention Monitoring System Under the Regional Educational Laboratory Program (REL)


OMB Clearance Request, Part B


Study Background

The U.S. Department of Education (ED) requests clearance for the recruitment materials and data collection protocols under the OMB generic clearance agreement (OMB Number [IES to complete]) for activities related to the Regional Educational Laboratory (REL) Program. ED, in consultation with American Institutes for Research (AIR), is planning a two-part evaluation of the Early Warning and Intervention Monitoring System (EWIMS), consisting of an impact study and an implementation study. OMB approval is being requested for multimode data collection and analysis involving schools, students, and staff members in public schools in Ohio, Michigan, and Indiana. The impact study consists of data collection from the state education agencies (SEAs) in Ohio, Michigan, and Indiana and from participating districts and schools. The implementation study consists of data collection from participating schools. Specifically, in this OMB clearance package, ED is requesting clearance for the following data collection approaches:

  • Recruitment materials for all participating districts and schools

  • Extant administrative records data collections from SEAs and from districts and schools within Ohio, Michigan, and Indiana

  • The transfer of data from treatment schools to the evaluation team via populated Early Warning System (EWS) tools

  • Pilot testing of survey and interview protocols

  • A Web-based survey of school leaders in treatment and control schools

  • An interview with one school administrator

The implementation study will include additional forms of data collection (satisfaction survey and monthly logs of EWIMS data team meetings). However, ED is not seeking approval for these measures of implementation because they are part of the typical EWIMS intervention and present no burden to participants.

As detailed more fully in the project description that follows, this impact study (designed as a cluster randomized controlled trial) will focus on student outcomes spanning multiple domains of school success (student risk status, scores on graduation tests, persistence and progress in school, and being on track at the end of ninth grade) and will examine whether the EWIMS model has an impact on intermediate outcomes in schools, including schools’ data culture and data-informed allocation of dropout prevention interventions. The implementation study will focus on schools’ experience with implementation, including the extent to which schools faithfully implement the EWIMS model and the interventions provided to students identified as at risk by the EWS tool.

The purpose of the project is to assess the implementation and impact of EWIMS, a data tool and process for implementing a system of data-driven decision making. Developed by the National High School Center, EWIMS provides a means of systematically and reliably identifying students at risk for dropping out of high school. The proposed study is a two-year school-level randomized controlled trial (RCT) to examine the impact of implementing EWIMS on school processes and student outcomes.

Impact Study

The focus of the impact study will be to assess the effectiveness of EWIMS on student and school outcomes. For that reason, the study will address the following research questions:

  1. What is the impact of the EWIMS model on outcomes for students in schools, including:

    a. indicators of student risk?

    b. scores on graduation tests?

    c. persistence and progress in school?

    d. predicted probability of on-time graduation?

In addition, REL Midwest will conduct exploratory research analyses for both students and schools.

  2. What is the impact of EWIMS on outcomes for subgroups of students, including:

    a. students who receive free or reduced-price lunch (FRPL)?

    b. students who are English language learners (ELLs)?

    c. students with Individualized Education Plans (IEPs)?

    d. students with baseline indicators of risk for attendance, course failures, and/or behavior?

  3. What is the impact of EWIMS on other school outcomes, including:

    a. data-informed allocation of dropout prevention interventions for students?

    b. school data culture (including the context, supports for data use, working with data, and responses to data)?
The exploratory student subgroup analyses (research questions 2a–2d) will assess whether the impact of EWIMS on student outcomes differs for key subgroups: students who receive FRPL, students who are ELLs, students with IEPs, and students with initial risk (one or multiple risk factors) in the fall 2013 semester, preceding random assignment and implementation. We acknowledge that some students may be included in more than one of these analyses because they may belong to more than one subgroup.

The exploratory school-level impact questions are intended to understand how early adoption of EWIMS may cause initial changes in how schools use data to identify at-risk students and respond with interventions.

To minimize burden on participating districts and schools, this study draws heavily on extant data to address the study’s research questions outlined earlier in this statement. Student- and school-level baseline and outcome data for the impact study will be obtained from multiple sources, including the SEAs, the EWS tool, and school and district administrative data.

Implementation Study

REL Midwest also will conduct an implementation study to describe treatment schools’ experiences with adoption and early implementation of EWIMS and the extent to which the intervention differs from business as usual in control schools. The implementation study will address the following research questions:

  1. To what extent do treatment schools faithfully implement the EWIMS model?

  2. To what extent does business-as-usual practice in control schools include the use of data for identifying at-risk students (treatment contrast)?

  3. What are the specific interventions provided to students identified as “at risk” by the EWIMS model in treatment schools?

The remainder of Part B addresses the following: the respondent universe and sampling methods (though no sampling will be used for this randomized controlled trial); procedures for maximizing response rates; descriptions of tests, procedures, and methods; and contact information for statistical consultants and key staff.

1. Respondent Universe and Sampling Methods

This section describes the respondent universe and sampling methods for recruitment.

To test the impact of EWIMS on student outcomes, ED’s contractor will recruit high schools in Ohio, Michigan, and Indiana and randomly assign them to use or not use the EWIMS model. The recruitment strategy will focus on both districts and schools: ED’s contractor will recruit both top-down (district, then school) and bottom-up (school, then district). The strategy targets schools that are relatively large (e.g., at least 150 Grade 9 students) and that demonstrate a need for improved graduation rates and student outcomes (e.g., graduation rates between 25 and 95 percent, with particular emphasis on schools with graduation rates between 50 and 90 percent). The cutoff of 150 Grade 9 students was selected to establish sufficient statistical power to achieve a minimum detectable effect size of 0.25 or smaller. The sample will consist of regular public schools serving Grades 9–12 that indicate their commitment to implement the EWIMS model and their willingness to adhere to all study requirements (i.e., random assignment and completion of data collection activities). Although EWIMS can be implemented in both middle schools and high schools, this study will focus on high schools because some key intended outcomes of the EWIMS model, such as whether students are on track for graduation in terms of course credits and grades, are relevant only for high school students. Moreover, recruiting and studying both middle school and high school outcomes would exceed available resources.

ED’s contractor will target schools with at least 150 Grade 9 students and estimated graduation rates between 25 and 95 percent to ensure that the study can feasibly detect an impact on student outcomes. Taking these two inclusion criteria together, there is an estimated pool of 688 potential schools for recruitment in the EWIMS impact study, as shown in Table 1. Based on a power analysis, ED expects to need 72 schools to provide adequate power for detecting effects on outcomes.

Table 1. Potential Respondent Universe of Public Schools in Ohio, Michigan, and Indiana

                                                            State   Total n   Treatment   Control
All school districts in Ohio, Michigan, and Indiana          OH       609        --          --
                                                             MI       549        --          --
                                                             IN       366        --          --
All public schools serving Grades 9–12**                     OH       858        --          --
                                                             MI       672        --          --
                                                             IN       381        --          --
Schools with Grade 9 enrollment greater than 150**           OH       369        --          --
                                                             MI       284        --          --
                                                             IN       178        --          --
Schools with graduation rates between 25 and                 OH       601        --          --
95 percent*                                                  MI       659        --          --
                                                             IN       297        --          --
Schools with both Grade 9 enrollment greater than            OH       321        --          --
150 and graduation rates between 25 and 95 percent           MI       225        --          --
                                                             IN       142        --          --
Schools in study sample                                      OH        20        10          10
                                                             MI        26        13          13
                                                             IN        26        13          13
Total                                                                  72        36          36

Source: *Ohio Department of Education (2010–11), Michigan Department of Education (2011–12), Indiana Department of Education (2011–12); **Common Core of Data (2010–11).

Degree of Accuracy Needed

Because this study is focused on examining policy-relevant impacts for at-risk students, some continuous outcomes (e.g., number of course failures) will be examined as binary indicators (e.g., 1=failed one or more courses, 0=did not fail any courses) for impact analyses. This decision is based on extensive research documenting the predictive power of these binary “flags” for on-time graduation (Allensworth & Easton, 2005, 2007; Balfanz, Herzog, & Mac Iver, 2007; Neild & Balfanz, 2006; Silver, Saunders, & Zarate, 2008). Further, some of these binary indicators are used explicitly within the EWS tool to flag at-risk students (e.g., attendance flags for students who miss more than 10 percent of school days). A comprehensive list of the binary variables that will be used in the impact models to examine the impact of EWIMS on student outcomes is provided in Table 2, along with information about how these variables will be coded and our corresponding assumptions for the power analyses.

Power analyses for binary student-level outcomes were conducted using equations derived from Optimal Design Plus Empirical Evidence (Version 3.0) (Raudenbush, Spybrook, Congdon, Liu, Martinez, Bloom, & Hill, 2011). To estimate the number of schools needed to have adequate statistical power (0.80) for detecting differences between students in treatment and control schools, the power analyses assumed an alpha of 0.05 (two-tailed). ED’s contractor estimated power for a constant-effects blocked cluster random assignment design with the treatment occurring at Level 2. Following Bloom, Hill, Black, and Lipsey (2008), the study is powered to detect a minimum detectable effect size (MDES) of roughly 0.25 for student outcomes; an effect of this size is generally considered necessary for an intervention to have educational significance. To date, no studies document the existence or relative size of the impact of implementing an early warning system on student outcomes, so the suggested MDES of 0.25 is a conservative estimate for this study. Specifically, the study is powered with enough schools and students within schools to detect effects of 0.25 or smaller for all student outcomes. Given the expected low incidence of students flagged as no longer in school, the enrollment/persistence outcome requires a minimum of 72 schools and an average of 150 students per school to detect a difference between treatment and control schools of 0.25 or less, using a standardized Cox index for binary measures. Including 72 schools and 150 students per school provides sufficient power to detect effects ranging from 0.09 to 0.17 for the other binary outcomes of interest. It is important to note that the estimated MDES for each outcome is sensitive to the assumptions about the 95 percent plausible interval of school means (in proportions); the assumed intervals for each outcome are detailed in Table 2. Based on these analyses, the study requires approximately 72 schools, with half assigned to each condition, to detect an MDES of 0.25 (or smaller) for each outcome variable of interest.

Table 2. Power Analyses: Minimum Detectable Effect Sizes for Each Binary Student Outcome

(Control and Treatment columns give the expected mean proportion by condition; Lower and Upper columns give the 95 percent plausible interval of school means, in proportions.)

Student Outcome                                      MDES (Cox Index)   Control   Treatment   Lower   Upper
Attendance                                               -0.128          0.150      0.125     0.100   0.200
  (1 = missed more than 10% of school days; 0 = did not miss more than 10%)
Fail Any Course                                          -0.098          0.250      0.221     0.200   0.300
  (1 = failed one or more courses; 0 = did not fail any courses)
Fail Any Core Course                                     -0.108          0.200      0.173     0.150   0.250
  (1 = failed one or more core courses; 0 = did not fail any core courses)
Behavior                                                 -0.174          0.100      0.077     0.050   0.150
  (1 = suspended one or more times; 0 = not suspended)
Predicted Probability of Graduation                      -0.108          0.200      0.173     0.150   0.250
  (1 = predicted probability less than 0.80; 0 = predicted probability greater than or equal to 0.80)
GPA                                                      -0.089          0.300      0.270     0.250   0.350
  (1 = GPA less than or equal to 2.0; 0 = GPA greater than 2.0)
Grade Retention/Promotion                                -0.128          0.150      0.125     0.100   0.200
  (1 = not promoted to next grade; 0 = promoted to next grade)
Enrollment/Persistence                                   -0.244          0.050      0.034     0.020   0.100
  (1 = no longer enrolled in school; 0 = still enrolled in school)
Credits Earned                                           -0.128          0.150      0.125     0.100   0.200
  (1 = insufficient credits earned; 0 = sufficient credits earned)
“On Track” at the End of Ninth Grade                     -0.108          0.200      0.173     0.150   0.250
  (1 = off track at the end of ninth grade; 0 = on track at the end of ninth grade)

The treatment group mean proportions presented in Table 2 represent the smallest departure from the control group mean proportion that ED’s contractor expects to be able to detect with 72 schools, 150 students per school, and the assumed 95 percent plausible interval of school means (in proportions). ED’s contractor conducted extensive sensitivity analyses examining how expanding the 95 percent plausible interval affects the number of schools needed to conduct the study. These analyses demonstrate that 72 schools provide sufficient power for enrollment/persistence, and more than adequate power for the other binary outcomes, even with minor deviations to the plausible interval.
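For reference, the Cox index used in Table 2 converts a difference in proportions to an effect size by taking the treatment–control difference in log-odds and dividing by 1.65. The following sketch is illustrative only (it is not part of the analysis plan); it reproduces several of the MDES values in Table 2 from the assumed mean proportions:

    from math import log

    def cox_index(p_treatment: float, p_control: float) -> float:
        """Cox index: difference in log-odds between conditions, divided by 1.65."""
        logit = lambda p: log(p / (1 - p))
        return (logit(p_treatment) - logit(p_control)) / 1.65

    # Assumed mean proportions from Table 2 (control, treatment).
    outcomes = {
        "Attendance":             (0.150, 0.125),
        "Fail Any Course":        (0.250, 0.221),
        "Behavior":               (0.100, 0.077),
        "Enrollment/Persistence": (0.050, 0.034),
    }
    for name, (p_c, p_t) in outcomes.items():
        print(f"{name:<24} MDES (Cox) = {cox_index(p_t, p_c):.3f}")

    # Output matches Table 2:
    # Attendance               MDES (Cox) = -0.128
    # Fail Any Course          MDES (Cox) = -0.098
    # Behavior                 MDES (Cox) = -0.174
    # Enrollment/Persistence   MDES (Cox) = -0.244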

To determine the MDES for continuous outcomes (e.g., standardized achievement scores), ED’s contractor conducted a power analysis using Optimal Design Plus Empirical Evidence (Version 3.0) (Raudenbush et al., 2011). Specifically, ED’s contractor examined the MDES for a cluster-randomized trial of person-level outcomes (Level 1) with treatment at Level 2. These analyses conservatively assumed (a) 72 schools (36 treatment, 36 control); (b) 150 students per grade; (c) an intraclass correlation of 0.15 (based on prior research conducted by the research team examining these continuous outcomes); (d) a Level 1 covariate explaining 70 percent of the variation in the Level 1 outcomes; (e) an alpha of 0.05; and (f) power of 0.80. Based on these analyses, the study is powered to detect an MDES of 0.15 for continuous outcomes, as illustrated in Figure 1.



Figure 1. Minimum Detectable Effect Size for Continuous Student-Level Outcomes
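The continuous-outcome MDES can be approximated with the standard two-level cluster-randomized-trial formula. The sketch below is illustrative; it additionally assumes that the baseline covariate explains 70 percent of the variance at the school level as well as the student level (our assumption, not stated above), under which it reproduces an MDES of approximately 0.15:

    from math import sqrt
    from scipy.stats import t

    J, n = 72, 150          # schools; students per school
    P = 0.5                 # proportion of schools assigned to treatment
    rho = 0.15              # intraclass correlation
    R2_1 = R2_2 = 0.70      # variance explained by the covariate (the Level 1
                            # value is stated in the text; Level 2 is assumed)
    alpha, power = 0.05, 0.80

    df = J - 2
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    se = sqrt(rho * (1 - R2_2) / (P * (1 - P) * J)
              + (1 - rho) * (1 - R2_1) / (P * (1 - P) * J * n))
    print(f"MDES = {multiplier * se:.2f}")   # ~0.15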

ED’s contractor also will examine two school-level outcomes: data-informed allocation of dropout prevention interventions for students and school data culture. These outcomes will be tested using intent-to-treat, linear regression models at the school level. The study is not sufficiently powered to detect school-level effects as small as 0.25. For example, using G*Power (Buchner, Erdfelder, & Faul, 1997) and assuming an alpha of 0.05, power of 0.80, and 72 schools, a linear regression model with four predictors (three blocking strata indicators and one treatment indicator) at the school level is powered to detect an effect size of 0.85 or larger. Given this large MDES, school-level analyses will be considered exploratory.
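To make the planned school-level model concrete, here is a minimal sketch of the intent-to-treat regression described above; the file and variable names are hypothetical:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical analysis file: one row per school, with its random-assignment
    # block, treatment indicator (1 = treatment, 0 = control), and outcome score.
    schools = pd.read_csv("school_outcomes.csv")

    # Intent-to-treat regression; C(block) expands the blocking strata into
    # indicator variables, so the treatment coefficient is the ITT effect.
    fit = smf.ols("data_culture_score ~ treatment + C(block)", data=schools).fit()
    print(fit.summary())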

2. Description of Procedures for the Collection of Information

ED’s contractor, REL Midwest, will manage data collection and ensure quality and timeliness. The draft materials and instruments for recruiting, screening, and negotiating final memoranda of understanding (MOUs) for participation in the study are included in Attachments A-1 through A-9. In addition, the draft instruments for data collection for the impact and implementation studies are included in Attachments B-1 through B-3.

The most common source of data for this study (including all estimates of student-level impact) will be extant data collected by the SEAs, school districts, and schools. Primary data collection for the impact and implementation studies will include only a Web-based school-level survey measuring data-informed use of dropout prevention interventions for students, school data culture, and the treatment contrast.

Data Collection for School and District Recruitment

We will contact all schools that meet the two inclusion criteria—at least 150 Grade 9 students and graduation rates between 25 and 95 percent—to determine whether they are currently implementing or planning to implement any new programming that might conflict with or too closely mirror the EWIMS model. Only schools that are not currently using, or planning to use, an early warning system (EWS) tool and process similar to EWIMS will ultimately be invited to participate in the study. The two components of the early warning system process that disqualify a school are flagging students and assigning them to interventions on the basis of systematic data review in more than one domain (attendance, discipline, and academics/course failures), with or without a data team.

Schools that have a sophisticated student information system (SIS) but do not actually use the tool to flag or assign at-risk students to interventions will still be eligible. Also, schools that only flag students (generate student-level reports of risk) but do not assign students to interventions or monitor progress will still be eligible. Finally, schools that have teams that review data not specific to dropout prevention (e.g., teams that meet about benchmark assessment scores or school improvement teams) will still be eligible to participate in the study.

Also, participation in the EWIMS impact study does not preclude schools from adopting new EWS-like programs during the course of the 1.5-year implementation. The study will not impose any restrictions on the dropout prevention practices that make up the “business as usual” control group. However, as an incentive for control schools to participate, the implementation team will provide them with the same implementation support in Year 3 as provided to treatment schools in Years 1 and 2. This delayed treatment will also help mitigate the risk that control schools will adopt an intervention similar to the one being tested in this study. Members of the study’s technical working group (TWG) agree that this approach will limit the risk of control schools adopting a similar EWS during the 1.5-year implementation period, because schools are unlikely to invest in a new tool when they are promised a complete tool and accompanying technical assistance and professional development beginning in the 2015–16 school year.

The members of the REL Midwest Dropout Prevention Research Alliance will be actively involved in recruitment. Our recruitment strategy will focus on both school districts and the high schools within districts. Recruitment will be facilitated by clear materials and communication about the nature of the study and the requirements for participation, including complying with random assignment. Based on prior experience recruiting schools for school-level random assignment studies (e.g., Access to Algebra I: The Effects of Online Mathematics for Grade 8 Students, Teacher and Leader Evaluation Systems Study), ED’s contractor does not anticipate random assignment to be a barrier to participation. To incentivize participation, the implementation team will provide control schools with the same implementation support in Year 3 as provided to treatment schools in Years 1 and 2.

ED’s contractor will involve the state, districts, and alliance members in recruitment to garner local support for the study. Based on the willingness of all invited schools and school districts to participate in similar projects currently underway via REL Midwest and continued interest expressed by additional districts and schools, ED’s contractor anticipates that a wide variety of schools will be interested in receiving the EWIMS model as part of the study. The process for recruitment is outlined in Table 3.

District and school recruitment will occur simultaneously. ED acknowledges that districts can act as a gateway for access to interested and eligible schools; however, some schools (typically in smaller districts) may also have the autonomy to agree to participation without district approval. Therefore, in larger districts recruitment will begin at the district level, and in smaller districts ED’s contractor may first recruit schools and then follow up with districts when necessary. This process includes contacting districts to garner district buy-in for the study and conducting prescreening interviews at the school level. The following sections describe each step in the recruitment process: (1) identifying the pool of schools to be screened, (2) contacting districts to gain district support for the study, (3) conducting school-level screening interviews, (4) prioritizing schools for recruitment, (5) recruiting eligible schools, and (6) negotiating final agreements.

Table 3. Recruitment Process and Respondent Universe for the EWIMS Impact Study

Steps in the Recruitment Process                                                   OH     MI     IN
Identify pool of schools eligible to participate in the study in each state       321    225    142
District first contact (e-mail and/or phone) with districts containing
  eligible high schools in each state                                              250    190    114
Follow-up for non-responding districts                                             100     76     46
District screening                                                                  80     61     36
First school contact (e-mail and/or phone) to eligible schools in each state      321    225    142
Follow-up for non-responding schools                                               128     90     57
School screening/interview (~50% of those eligible at first contact)              161    113     71
School visit (face-to-face or virtual) with highest qualifying schools
  that are interested in study participation                                        30     40     30
Negotiate final agreements (memoranda of understanding [MOUs])
  with participating schools                                                        20     26     26
Conduct random assignment (study schools)                                           20     26     26
Identify survey respondents within schools (school leaders)                         20     26     26

Note. If there are issues with identifying and successfully recruiting schools that meet all of the specified qualifications, ED’s contractor will consider relaxing the criteria for inclusion in negotiation with the Institute of Education Sciences (IES).

(1) Identifying the pool of schools to be screened. To identify the pool of eligible and interested schools, ED’s contractor will use the Common Core of Data to pinpoint schools in Ohio, Michigan, and Indiana that serve Grades 9 through 12, have at least 150 Grade 9 students, and have graduation rates between 25 and 95 percent. ED’s contractor will then reach out to the estimated pool of 688 schools to gauge interest and the presence or absence of an early warning system similar to EWIMS. After the pool of schools has been identified based on these criteria, ED’s contractor will send an informational e-mail to each school (see Attachment A-1 for the text of this e-mail). This e-mail will include notification that the school meets the initial eligibility requirements for participation in the study and will contain documents that briefly describe the study, along with a letter from IES endorsing the study (see Attachment A-2 for these documents). After ED’s contractor sends the e-mail, a recruitment team member will call each school to inform it about the study and ask it to participate in a short telephone interview. The recruitment team comprises REL staff who will subsequently serve on the study or implementation team following random assignment.

(2) Contacting districts to gain district support for the study. ED’s contractor will identify the districts in which the 688 eligible schools are located and begin a wide outreach effort involving e-mails and follow-up phone calls to district-level staff. This “first touch” of recruitment at the district level will facilitate communications with interested and eligible schools (see Attachment A-3 for the text of this e-mail). After the pool of districts representing eligible schools has been identified, ED’s contractor will send an informational e-mail to each district (see Attachment A-1 for the text of this e-mail). This e-mail will include notification that the district meets the initial eligibility requirements for participation in the study and will contain documents that briefly describe the study and ED’s contractor, along with a letter from IES endorsing the study (see Attachment A-3 for these documents). After the e-mail is sent, a recruitment team member will call each district to inform it about the study and ask it to participate in a telephone interview. The district-level screening protocol that will guide this interview is presented in Attachment A-6.

(3) Conducting district- and school-level screening interviews. The district- and school-level screening protocols are designed to verify school size, graduation rate, and use of an early warning system. The protocols also allow for early termination of the interview if a school, or all schools within a district, does not meet these eligibility criteria. ED’s contractor will prepopulate the screeners for schools based on the Common Core of Data but will confirm these data at the beginning of the screening interviews. ED’s contractor estimates that there are 554 districts with eligible high schools and 688 eligible high schools across the three states. Of those, ED’s contractor estimates that 332 districts and 413 high schools will respond to an initial contact (a 60 percent response rate) and that 283 districts and 330 schools will complete the screener (85 percent of districts and 80 percent of schools that responded to initial communication, response rates based on ED’s contractor’s prior experience recruiting for randomized controlled trials). The items on the screening protocols are mapped to the constructs they measure (Table 4). The district and school screening protocols will be administered by project staff through telephone interviews; drafts can be found in Attachments A-5 and A-6.

Table 4. Mapping of Recruitment Screening Protocol Items to Topics

School Screening Protocol

Construct                                     Items
School Characteristics                        1–11
Data-Driven Dropout Prevention Efforts        12–26
Major Initiatives in the School               27–29
Research Review Policies                      30

District Screening Protocol

Construct                                     Items
District and School Characteristics           1–2
Data-Driven Dropout Prevention Efforts        3–9


(4) Prioritizing schools for recruitment. Among the 330 schools that ED’s contractor expects to complete the screener, it is anticipated that about 280 will pass the initial screening based on extant data; some will be more appropriate candidates for the study than others. ED’s contractor will prioritize its recruitment efforts in schools with:

  • Moderate graduation rates. Although many high schools have graduation rates above 90 percent, ED’s contractor will focus recruitment on schools that have at least 150 Grade 9 students and graduation rates between 50 and 90 percent. Schools with lower graduation rates (i.e., 25 to 49 percent) may not have enough available interventions for students identified through the EWIMS process, and schools with higher graduation rates (i.e., 91 to 95 percent) may not have enough at-risk students to be included in the evaluation. For this reason, ED’s contractor will focus first on recruiting the schools most likely to benefit from the EWIMS intervention, and secondarily on schools (with graduation rates between 25 and 49 percent or between 91 and 95 percent) that could benefit from EWIMS but that face additional challenges or have a more limited need for dropout prevention.

  • Greater interest in the study. Schools that signal greater interest in participating to alliance members or ED’s contractor will be given higher priority. For example, some schools will make a greater effort to include multiple administrators in initial meetings to discuss the opportunity and/or may be more responsive to e-mail and phone communication from ED’s contractor. Whenever possible, ED’s contractor will also draw on its prior work and knowledge of specific districts to gauge district interest.

  • Feasibility of implementation. Districts that have fewer competing initiatives for dropout prevention programming, and specifically the use of an early warning system, will be given higher priority. ED’s contractor will also determine whether schools have the capacity to assemble an EWIMS team to upload and analyze student-level data and risk indicators.

Considering these criteria while canvassing the total 688 eligible schools will allow REL staff to prioritize the schools during recruitment and help ED’s contractor efficiently use resources to conduct site visits at eligible and interested schools.

(5) Recruiting eligible schools. Once ED’s contractor has finalized the pool of potentially eligible schools, face-to-face recruitment site visits with key staff will begin. ED’s contractor will begin recruitment by targeting schools that are considered most eligible given the three priority areas listed above. The recruitment team members, with input from district personnel, will either organize meetings for clusters of schools or will visit each eligible and potentially interested campus separately.

A school’s eligibility to participate in the study will be reconfirmed in a meeting with the principal by using the criteria listed in the previous section (size, graduation rates, not currently using or planning to use an EWS tool and system, and interest). The implementation team’s experience with implementing EWIMS has shown that an important benefit of the study to schools is not just the EWS tool but also the supportive technical assistance to implement the EWIMS process. Although treatment schools will receive these supports in 2014, control schools too will receive these supports beginning in fall 2015. Therefore, each school that agrees to participate will receive the full EWIMS implementation (tool, technical assistance) at no cost to the school. ED’s contractor believes this delayed treatment model will facilitate school recruitment.

(6) Negotiating final agreements. After successful recruitment meetings with the 72 study schools, ED’s contractor will enter into final agreements through memoranda of understanding (MOUs) that detail the roles and responsibilities of study participation for participating schools and their districts. Drafts of these final agreements are included in Attachments A-7 and A-8. If necessary, project staff will make additional phone calls or visits to build consensus and obtain commitment from principals. Principals’ signatures are expected to be gathered shortly after the districts’ MOUs are obtained, to allow random assignment of schools immediately following OMB clearance. The timeline for recruiting schools is ambitious; however, ED’s contractor has substantial experience conducting successful large-scale recruitment efforts on extremely tight timelines.

(7) Conducting random assignment. Once the exclusion criteria are applied, schools will be assigned to treatment and control conditions within blocks based on graduation rates and school size. To conduct blocked random assignment, ED’s contractor will create clusters of schools using matching techniques (e.g., Mahalanobis distance), potentially based on their graduation rates, size, district, and baseline data culture, and will randomly assign half of the schools in each block to treatment and half to control. Again, the key variables to consider during blocking are:

  • graduation rates,

  • school size,

  • district,

  • and, potentially, baseline data culture and/or the sophistication of schools’ baseline data systems.


ED and its contractor will finalize the blocking approach post-recruitment and before random assignment. The matching techniques that will be used to conduct blocked random assignment may also take into account information about schools’ home district and baseline data capacity, using data collected during the screening process.
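As an illustration only, the following sketch shows one way the blocked random assignment described above could be carried out, assuming pair-sized blocks and three illustrative covariates; the actual blocking variables and block sizes will be finalized as noted above:

    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    rng = np.random.default_rng(2014)   # fixed seed makes the assignment auditable

    # Illustrative covariates for 72 recruited schools (one row per school).
    X = np.column_stack([
        rng.uniform(0.25, 0.95, 72),    # graduation rate
        rng.integers(150, 600, 72),     # Grade 9 enrollment
        rng.normal(0.0, 1.0, 72),       # baseline data-culture score
    ])

    # Pairwise Mahalanobis distances between schools.
    D = squareform(pdist(X, metric="mahalanobis", VI=np.linalg.inv(np.cov(X.T))))

    # Greedily pair each school with its nearest unmatched neighbor, then flip
    # a coin within each pair: one school to treatment (1), one to control (0).
    unmatched = list(range(len(X)))
    assignment = np.empty(len(X), dtype=int)
    while unmatched:
        i = unmatched.pop(0)
        j = min(unmatched, key=lambda k: D[i, k])
        unmatched.remove(j)
        coin = rng.integers(2)
        assignment[i], assignment[j] = coin, 1 - coin

    print(assignment.sum(), "treatment schools of", len(X))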

(8) Identifying respondents for the school surveys. After random assignment and during data collection, ED’s contractor will also strategically identify school leaders/administrators to complete the Web-based school surveys. In each school ED’s contractor will identify the most relevant individual to complete the school survey on data-informed use of dropout prevention interventions for students and school data culture. In treatment schools ED’s contractor will sample the lead individual from the EWIMS data team. In control schools ED’s contractor will recruit the principal, assistant principal, counselor, or other school leader, depending on their level of involvement with data use and dropout prevention initiatives in their school.


Data Sources for Impact and Implementation Studies


To understand the effects of the EWIMS model and schools’ experiences with implementation, this project will consist of an impact study and an implementation study. The following sections describe the data collection for each of these separately. Table 5 provides a summary of each data element collected in the schools and associated data sources.


Table 5. Data Sources, Data Elements, and Timeline for Collection

Data Source                     Data Element(s)                                   Timeline for Data Collection
SEA                             School administrative data                        Baseline–March 2014
                                Student administrative data                       March 2014, June 2014, Jan. 2015, June 2015
District                        Student administrative data                       March 2014, June 2014, Jan. 2015, June 2015
School administrators           Annual Web-based survey                           May 2014, May 2015
School-level EWIMS data teams   Populated EWS tools                               March/April 2014, June 2014, Aug. 2014, Jan. 2015, March 2015, June 2015
                                Participation in EWIMS trainings or convenings    March/April 2014, June 2014, Aug. 2014, Jan. 2015, March 2015, June 2015
                                Satisfaction with EWIMS trainings or convenings   March/April 2014, June 2014, Aug. 2014, Jan. 2015, March 2015, June 2015
                                EWIMS data team monthly logs                      Monthly, March 2014–June 2014 and Aug. 2014–June 2015
                                Interview of one EWIMS team member¹               May 2014, May 2015

Note. Bolded data elements are data collection instruments involving burden of study participants.

1. Items adapted from the California Comprehensive Assistance Center (CCAC).

All student-level data will be collected through extant sources; there is no student-level primary data collection burden on schools. As detailed in Table 5, student-level extant data will be obtained from the SEAs, districts, and schools. Procedures for transferring the data are outlined in Attachment B-3. When feasible, student administrative data will be collected from the SEAs to reduce the burden on participating schools.

Because some of the outcome data (e.g., attendance, course performance) are not available from SEAs, REL Midwest will ask districts and schools to compile the data in four scheduled data pulls (March 2014, June 2014, January 2015, and June 2015). Compiling and transmitting all required student-level administrative data is expected to take district staff approximately 16 hours in Year 2 and 32 hours in Year 3. ED’s contractor is not offering an incentive to districts or schools for compiling extant administrative data because most of the data are readily available in district and school information systems and the burden is minimal.

Impact Study Measures

The impact study will assess the effects of the EWIMS model on student outcomes, including indicators of student risk (e.g., attendance, number of course failures, credits earned, behavior). Other outcome measures include measures of student success that are not in the EWS tool, including performance on state assessments (including graduation tests), persistence and progress in school, and predicted probability of on-time graduation. Data collection activities will focus on students in Grades 9 and 10 during Year 1 (2013–14) and Grades 9, 10, and 11 during Year 2 (2014–15). All variables for the impact study will come from school and district data sources. The impact study will obtain the same extant data for all participating schools and use these data to construct the outcome measures.

Baseline Measures. In March 2014, at the beginning of implementation, REL Midwest will obtain student-level covariate data from SEAs for students in treatment and control schools. Data will include demographic characteristics (e.g., race/ethnicity, gender, free or reduced-price lunch [FRPL], individualized education program [IEP] and English language learner [ELL] status, and parents’ education); prior academic achievement (e.g., GPA, course failures from the previous year, state mathematics and reading scores); and previous-year attendance rates. School-level baseline data from SEAs will include high school graduation rates, average state achievement scores in mathematics and reading, and the percentage of FRPL students.

Outcome Measures. The impact evaluation is designed to determine if the EWIMS model has an impact on student outcomes over time, in four domains: (1) student risk status, (2) scores on state assessments, (3) persistence and progress in school, and (4) predicted probability of on-time graduation. The study will also examine if the EWIMS model has an impact on intermediate outcomes in schools, including the ways in which they allocate dropout prevention programming and their school data culture. The student-level outcome measures include the following:

  • Student Risk Status indicators are binary variables reflecting the presence or absence of validated risk indicators, including attendance (e.g., missing 10 percent or more of instructional time), course performance (e.g., one or more course Fs), GPA (e.g., 2.0 or lower), behavior, and “on track” at the end of ninth grade (a composite measure that classifies students as on track or off track to graduate based on Grade 9 course credits earned and core course failures). We will use the national EWIMS tool cutoff scores for indicators of risk. Treatment schools will be expected to work directly with these same data for implementation purposes on a monthly basis, and these risk indicators will be available for all treatment students within the EWS tool. The evaluation contractor will calculate comparable indicator flags for students in control schools based on extant data from the states, districts, and schools that are used to populate the EWS tool in treatment schools.

  • Scores on state assessments will be collected from all three states. Our approach is to use standardized test scores in both mathematics and English Language Arts at the end of Grade 10. Each participating state collects standardized test scores for its Grade 10 students: the Ohio Graduation Test in Ohio, the ACT Plan in Michigan, and the Acuity Algebra I and English 10 end-of-course assessments in Indiana. We will collect these three sets of scores and use z-score standardization to examine patterns in mathematics and English Language Arts achievement for students across all three states.

Also, all participating states expect to administer new Common Core–aligned assessments in the spring of 2015 (Smarter Balanced or the Partnership for Assessment of Readiness for College and Careers [PARCC]). Standardized scores for students in participating schools that complete these assessments in 2015 will also be collected from the SEAs.

  • Persistence and progress in school includes a measure of persistence reflecting whether students remain enrolled or have withdrawn for reasons other than transfer to another district, including dropping out. ED’s contractor will examine persistence on an annual basis, using state-level administrative data to identify the pool of all possible codes for transfer and dropout. Because the state-level data are consistent across all districts and schools, this will reduce the variability in possible codes that identify transfer and dropout. The second measure, of progress, reflects whether students have been promoted from grade to grade each year and how many credits they have earned; it will be constructed from school or district administrative data. This domain also includes students’ credit accumulation by semester.

  • Predicted probability of on-time graduation will be a composite variable that incorporates data from multiple indicators of risk for each student in both control and treatment schools. This value is a model-based estimate of each individual student’s probability of on-time graduation, derived by applying the coefficient estimates from the 4.1.01 validation analyses as weights for each risk factor. First, each student will be assigned a binary indicator of having or not having each indicator of risk. Next, the model-based estimate of the intercept, or average logit, will be used as a starting point for each student. The weight of each indicator will then be added for students with risk factors, yielding individual student logits for on-time graduation that are based on their numbers of risk factors. Finally, logits will be transformed into a predicted probability of on-time graduation for each student. For example, the equation for calculating the predicted logit for an individual student could be:

Student Predicted Logit = (Estimated Intercept from 4.1.01) + (Binary Absence Flag * Absence Coefficient from 4.1.01) + (Binary Course Failure Flag * Course Failure Coefficient from 4.1.01) + (Binary Behavior Flag * Behavior Coefficient from 4.1.01) + (Binary Algebra Failure Flag * Algebra Failure Coefficient from 4.1.01)

The logits can then be transformed to predicted probabilities using the following equation:

Predicted Probability = 1 / (1 + exp(-logit))

Thus, each student would have an individual predicted probability of graduation that is based on multiple indicator flags and the relative weight of each flag derived from the analyses in 4.1.01. This predicted probability reflects the likelihood of graduation within four years based on continuous and dichotomous measures of the indicator variables, and will be derived from the validation analyses conducted in 4.1.01. ED’s contractor will calculate predicted probabilities at the end of Grades 9 and 10 after Year 1, and at the end of Grades 9, 10, and 11 after Year 2. The predicted probability will then be the outcome in a model that tests the impact of EWIMS on students’ likelihood of on-time graduation.
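As an illustration, the sketch below carries out this computation for one student, using hypothetical weights in place of the coefficient estimates from the 4.1.01 validation analyses:

    from math import exp

    # Hypothetical intercept and risk-factor weights standing in for the
    # coefficient estimates from the 4.1.01 validation analyses.
    intercept = 2.0
    coef = {"absence": -0.9, "course_failure": -1.1,
            "behavior": -0.6, "algebra_failure": -0.8}

    # Binary risk flags for one illustrative student.
    flags = {"absence": 1, "course_failure": 1,
             "behavior": 0, "algebra_failure": 0}

    logit = intercept + sum(coef[k] * flags[k] for k in coef)
    probability = 1 / (1 + exp(-logit))   # inverse-logit transform
    print(f"Predicted probability of on-time graduation: {probability:.2f}")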

In addition to the student-level outcomes outlined above, the study will conduct impact analyses on the following school-level outcomes:

  • Data-informed allocation of dropout prevention programs and interventions will measure how schools use data for dropout prevention in both treatment and control schools. The goal of this measure is to tap the theorized intermediate school-level outcome of the EWIMS model, whereby data are used to inform which students need interventions and how to assign them to specific programs. ED’s contractor will collect the school-level data used to construct this measure through annual Web-based surveys of a school administrator at each site, identified by the evaluation team during recruitment as knowledgeable about the school’s dropout prevention and intervention services. ED’s contractor plans to collect this information at the school level rather than the individual student level because student-level information about the types of services and microinterventions in which students participate is not routinely or systematically recorded in most schools. If the EWIMS model is implemented with fidelity in treatment schools, ED’s contractor will have these detailed student-level data for students in treatment schools and will leverage them to answer implementation research question 3 (What are the specific interventions provided to students identified as “at risk” by the EWIMS model in treatment schools?). However, because these data are otherwise not available for students in control schools, ED’s contractor will not attempt to collect them at the student level; doing so could dilute the service contrast between the treatment and control conditions. Therefore, ED’s contractor will rely on school-level surveys to gauge the effects of the EWIMS model on whether and how both treatment and control schools provide dropout prevention supports for students.

School staff who participate in the annual Web-based survey will receive a $30 stipend in the form of a gift card, which is aligned with Analytical and Technical Support for Advancing Education Evaluation: How to Put Together an OMB Supporting Statement, Appendix E (Sloan, Ingels, & Burghardt, 2012). This $30 incentive applies to responses on one survey that measures three key constructs, two of which are outcomes for the impact study (data-informed allocation of dropout prevention programs and interventions and school data culture), and one of which is for the implementation study, described in the next section (treatment contrast). The incentives will be administered twice throughout the study, accompanying each administration of the survey (May 2014, May 2015).

Attachment B-1 includes a draft of the survey that will be used to collect this information. The items are designed to solicit information about the type and frequency of dropout prevention programs and strategies to which students may be exposed within and outside of school. The items are based on the constructs presented in Dropout Prevention: A Practice Guide (Dynarski et al., 2008) and Approaches to Dropout Prevention: Heeding Early Warning Signs with Appropriate Interventions (Kennelly & Monrad, 2007). They inquire about students’ access to online credit recovery, mentorship programs (including Check & Connect), tutoring, internship or school-related work prep programs (e.g., Job Corps, MAAC Project, career and technical education classes), college prep programs (e.g., AVID, African American and Latino Student Summit, Gear Up), and school- and community-based extracurricular activities (e.g., sports, band, clubs or organizations, student government). The latter questions (items 2–6) inquire about schools’ implementation of microinterventions, including the names of specific programs, participation rates, grade-level availability, and the student selection process.

ED’s contractor will operationalize the data collected from the online survey into one summary variable that captures schools’ proclivity to engage in data-informed allocation of dropout prevention programs and interventions. First, ED’s contractor will create a score that summarizes the extent to which schools use data to assign students to different types of dropout prevention interventions and programs (as measured by the school survey), including:

  • Targeted academic interventions

  • Targeted behavioral interventions

  • Attendance/truancy interventions

  • Online content recovery programs

  • Mentoring programs

  • Internship or school-related work-prep programs

  • College prep

ED’s contractor will test the impact of EWIMS on the proportion of students in a school that participate in one of the above dropout prevention interventions or programs, as well as how often that school uses data to inform the allocation of dropout prevention interventions or programs. Table 6 provides a map of the items included in the draft protocol and the constructs measured in the survey found in Attachment B-1.

Table 6. Mapping Web-Based School Survey Items to Constructs

Data-Informed Allocation of Dropout Prevention

Construct                                         Items
Academic interventions                            1
Behavioral interventions                          2
Attendance/truancy interventions                  3
Online credit or content recovery                 4
Student mentoring programs                        5
Internship or school-related work preparation     6
College preparation programs                      7

  • School data culture is another school-level outcome hypothesized to change as a function of implementing EWIMS. School data culture encompasses the following key dimensions of school data use:

  1. Context includes district and school goals for data use and the professional climate around data use in schools.

  2. Supports for data use include the presence or absence of structured time to review data, training and professional development around data use, collaboration around data, data coaching, and principal leadership.

  3. Barriers to data use include commonly mentioned barriers to incorporating data review into educational practice, such as lack of time, lack of training, or lack of comfort with data review.

ED’s contractor will use the aforementioned annual Web-based survey to measure school data culture in both treatment and control schools. School administrators from treatment and control schools will be invited to respond to this school-level survey. This survey includes items adapted from the Bill & Melinda Gates Foundation–funded Urban Data Study (Faria et al., 2012). While the Faria et al. (2012) survey measured specific data use practices around benchmark assessments, ED’s contractor adapted these items for use with early warning system data structures. Attachment B-2 includes a draft protocol with specific items to be used in the EWIMS impact study. Table 7 provides a map of the items included in the draft protocol and the constructs measured in the survey found in Attachment B-2.


Table 7. Mapping Web-Based School Survey Items to Constructs

School Data Culture

Construct                                          Items
Context
  School data context                              1, 2
Supports for data use
  Principal leadership and support for data use    3, 4
  Structured time to review data                   5
  Professional development for data use            6
  Staff capacity/human resources                   7
  Collaboration around data                        8, 9
Barriers to data use                               10
Demographics                                       11–13



Implementation Study Measures

The Dropout Prevention Research Alliance and practitioners more broadly will benefit from a clear understanding of what it takes to implement the EWIMS model and, whether or not it shows an impact, from clear documentation of the implementation process. The implementation study will focus on the following questions: (1) To what extent do treatment schools faithfully implement the EWIMS model? (2) To what extent does business-as-usual practice in control schools include the use of data for identifying at-risk students (treatment contrast)? (3) What are the specific interventions provided to students identified as “at risk” by the EWIMS model in treatment schools (i.e., Step 5 of the EWIMS process)? (4) What are the key components of high-quality implementation? This information is of critical interest to alliance members, who may use it to inform the design or implementation of EWIMS models.

Measures of Fidelity. REL Midwest will collect EWIMS data tools from treatment schools twice yearly (January and June) as a primary means for assessing the degree to which schools are faithfully implementing the EWIMS model. The data to be collected to measure fidelity of implementation include:

  • The extent to which schools upload data into the EWS tool (percentage of student demographic and administrative data records uploaded, measured by the EWS tool)

  • The extent to which identified students have documented interventions recorded in the tool (measured by the EWS tool)

  • Participation in EWIMS trainings/convenings (measured by attendance sheets from trainings collected by the implementation team and submitted to the evaluation team)

  • Satisfaction with EWIMS trainings/convenings (measured by satisfaction surveys)

  • Ways in which each treatment school implements the intervention (measured by interviews with an EWIMS team member)


To document implementation, the evaluation team will have access to populated tools and will determine whether student data are regularly loaded, and whether students identified as at risk within the EWS tool also have documented interventions recorded in the tool. ED’s contractor will calculate the percentage of students with complete data in the tool in comparison with the extant data available for those students collected from the SEAs, districts, and schools.

ED’s contractor will also collect information about participation in EWIMS-related trainings and convenings, which will be used to address the first research question of the implementation study (To what extent do treatment schools faithfully implement the EWIMS model?). First, the implementation team will share attendance and sign-in sheets for all EWIMS trainings and convenings. Second, the implementation team will facilitate the collection of satisfaction surveys, similar to the process by which college professors receive course satisfaction surveys: one person attending the training will volunteer to collect all satisfaction surveys; place them in a self-addressed, stamped, and sealed envelope; sign the back of the envelope; and mail the surveys to the evaluation team. The satisfaction surveys will be adapted from those developed by the California Comprehensive Assistance Center (CCAC), which documented implementation of an EWIMS pilot project conducted as a collaboration between the National High School Center and the California Department of Education. The satisfaction surveys will be administered after each technical assistance activity (i.e., technical and implementation trainings, community of practice, on-site support, and responsive technical assistance) and will gauge participants’ satisfaction with the quality, relevance, and utility of the service provided. The satisfaction surveys are part of the typical implementation of the EWIMS intervention and therefore are not included in our estimates of burden for treatment schools.

In addition, the evaluation team will collect qualitative information via interviews about the ways in which each treatment school implements the EWIMS model. The interviews are part of the typical implementation of the EWIMS intervention and therefore are not included in our estimates of burden for treatment schools.

Measures of Treatment Contrast. To understand the extent to which business-as-usual practices in control schools include the use of data for identifying students at risk of dropping out, ED’s contractor will use the aforementioned Web-based survey to collect information about the presence or absence of an early warning data system in control schools. Survey items will ask about current use of an EWS tool or other systematic use of data to identify students who are at risk of academic failure, disengagement, and/or dropping out of high school. If schools are using data for these purposes, follow-up questions will explore the types of indicators used (including the thresholds and time frames applied), the ability to customize indicators, and who has access to the data. Attachment B-1 includes a draft protocol of the items measuring the treatment contrast. Table 8 maps the items in the draft protocol to the constructs measured in the survey found in Attachment B-1.





Table 8. Mapping Treatment Contrast Survey Items to Constructs

| Treatment Contrast Survey Construct | Items |
|---|---|
| Use of an early warning system or tool | 1–4 |
| Use of attendance data to flag and assign students to interventions | 5–10 |
| Use of course failure and credit deficiency data to flag and assign students to interventions | 11–14 |
| Use of other data to flag and assign students to interventions | 15 |
| Presence of an EWS/data use team | 16–19 |
| Commitment to EWS process | 20, 21 |

Measures of Specific Intervention Information. The EWIMS model is a school-level macrointervention through which schools use data to efficiently allocate dropout prevention and academic support interventions to targeted students. Its overall impact will be driven in part by the effectiveness of the specific interventions assigned to students and by the appropriateness of the “fit” between student needs and assigned interventions. Although ED’s contractor will directly assess neither the “fit” between students and microinterventions nor the impact of microinterventions on student outcomes, it will collect information on the specific microinterventions allocated to students in treatment schools. To collect these data, ED’s contractor will use the populated EWS data tools from each treatment school to record the microinterventions assigned to students. Information on specific interventions assigned to students will be collected only in treatment schools, as noted above in the description of school-level measures of allocation of dropout prevention programming. However, ED’s contractor will collect a school-level measure of allocation of dropout prevention programming in each control school to understand whether control schools offer students interventions.10

Estimation Procedures


Student-Level Impact Analyses. ED’s contractor will conduct student-level impact analyses at the end of Year 2.11 These analyses will assess the effects of EWIMS on measures in the four student outcome domains (indicator flags, academic achievement, persistence and progression, and predicted probability of on-time graduation), and exploratory analyses will examine the impact of EWIMS on subgroups of students and on two school-level outcomes. These analyses align with the research questions about the impact of EWIMS stated above.

The analytic strategy for questions about impacts on student-level outcomes will compare students in schools randomly assigned to receive the EWIMS intervention with students in control-group schools. The data for this study are hierarchical (students are nested within schools), so units at the same level are not statistically independent. To model the effects of student- and school-level factors and to adjust for the non-independence of observations within clusters, this study will use hierarchical linear modeling to estimate the treatment effect on the student-level outcomes of interest. In all analyses of student-level outcomes, ED’s contractor will estimate intent-to-treat two-level models, with students at Level 1 and schools at Level 2. ED’s contractor will define the intent-to-treat student sample as all students enrolled on the 20th day of school. Some students within the at-risk populations may not enroll during the first week of school but do enroll within the first 20 school days. This cut-off date is also policy relevant because schools must report their total enrollment to the state on the 20th day of school to inform annual state funding. For all binary student-level outcomes, ED’s contractor will use a hierarchical generalized linear model with a Bernoulli sampling distribution and logit link function; for all continuous student-level outcomes, it will use hierarchical linear models. Models will be run separately by year and cohort.
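To make the specification concrete, the sketch below shows how such two-level models could be fit in Python with statsmodels. All variable names (school_id, treatment, the blocking dummies, and the covariates) are hypothetical placeholders. The random-intercept linear model corresponds to the continuous-outcome HLM described above; for binary outcomes, a population-averaged GEE logit is used here as a simple stand-in for the unit-specific Bernoulli HGLM named in the text, so it is an approximation rather than the study’s exact estimator.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("analytic_file.csv")  # hypothetical student-level file

covs = ("C(state) + grad_rate_block + size_block "
        "+ female + frpl + ell + iep + prior_read + prior_math")

# Continuous outcome: two-level model as a linear mixed model with a random
# intercept for school; the treatment indicator enters at the school level.
hlm = smf.mixedlm(f"test_score ~ treatment + {covs}",
                  data=df, groups=df["school_id"]).fit()
print(hlm.summary())

# Binary outcome: a population-averaged GEE logit with exchangeable
# within-school correlation, standing in for the Bernoulli HGLM in the text.
gee = smf.gee(f"ontrack ~ treatment + {covs}",
              groups="school_id", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(gee.summary())
```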


Most of the student-level outcome variables are dichotomous. For models with binary outcomes, ED’s contractor will use a logit link function to model the log-odds of a particular outcome (e.g., being on track or off track, persisting). ED’s contractor will assume a constant treatment effect across blocks and include dummy blocking variables at Level 2 (state, school graduation rate, and school size dummies). At the student level, ED’s contractor will include baseline demographic and prior achievement characteristics, including race/ethnicity, gender, FRPL status, ELL status, IEP status, and prior standardized reading and mathematics test scores.12 The treatment indicator will be included at Level 2 to indicate that a school was randomly assigned to the treatment group. These student- and school-level variables are expected to be correlated with the outcomes of interest and will be used as covariates in all analytic models for this study to improve the precision of the impact estimates. The models will estimate both the probability and the odds of showing a particular characteristic or indicator of risk for students in the treatment and control groups.
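Written out in our own notation (not the document’s), the benchmark binary-outcome model takes roughly the following two-level form, where p_ij is the probability that student i in school j shows the outcome, the X are student-level covariates, T_j is the school-level treatment indicator, the B are blocking dummies, and u_0j is a school random effect:

```latex
\begin{aligned}
\text{Level 1 (students):}\quad
  & \log\!\left(\frac{p_{ij}}{1 - p_{ij}}\right)
    = \beta_{0j} + \sum_{k} \beta_{k} X_{kij} \\
\text{Level 2 (schools):}\quad
  & \beta_{0j} = \gamma_{00} + \gamma_{01} T_{j}
    + \sum_{m} \delta_{m} B_{mj} + u_{0j},
  \qquad u_{0j} \sim N(0,\, \tau_{00})
\end{aligned}
```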

Exploratory Student Subgroup Analyses. To test for a differential impact of EWIMS for subgroups of students, ED’s contractor will use the same impact models outlined above for the student outcomes, adding a cross-level interaction between the treatment status of participating schools and an indicator for membership in each subgroup tested. ED’s contractor will conduct these analyses for student impact research questions 1a through 1d. Four key subgroups will be examined: students who are English language learners (ELLs), students with Individualized Education Programs (IEPs), students who receive free or reduced-price lunch (FRPL), and students flagged as at risk on one or more risk indicators calculated from participating schools’ administrative data for the semester preceding implementation (fall 2013). ED’s contractor will therefore categorize students as “initially at risk” (or not) based on data from the fall 2013 marking period preceding implementation and will retain this categorization throughout the remainder of the study for each cohort. The subgroup will not be redefined later, because changes in flagged status in subsequent semesters may themselves reflect the impact of the intervention (e.g., flagged students who receive high-quality microinterventions may be less likely to be flagged in later semesters).
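A minimal sketch of the subgroup specification, continuing the hypothetical variable names used above: the cross-level interaction is simply the product of the school-level treatment indicator and a student-level subgroup flag (ELL status in this example).

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("analytic_file.csv")  # hypothetical student-level file

# Cross-level interaction: school-level treatment x student-level ELL flag.
model = smf.gee(
    "ontrack ~ treatment * ell + C(state) + grad_rate_block + size_block "
    "+ female + frpl + iep + prior_read + prior_math",
    groups="school_id", data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()

# The treatment:ell coefficient tests for a differential impact for ELLs.
print(model.params["treatment:ell"], model.pvalues["treatment:ell"])
```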

Because the theory of action includes no a priori hypotheses about differential impacts of EWIMS for the four subgroups, which will be tested in four separate exploratory analyses, ED’s contractor will adjust for multiple comparisons to guard against alpha inflation. Specifically, the p-value associated with the cross-level interaction coefficient in each of the four subgroup analyses must be less than .0125 to be considered statistically significant (adjusting for four post hoc exploratory tests). In addition, students in one subgroup may also belong to other subgroups tested in separate exploratory analyses. For this reason, ED’s contractor will report the number and percentage of students in each subgroup who are also included in each of the other three subgroups. Understanding this overlap across subgroups will be important for interpreting any differential impact of EWIMS on one or more subgroups of students.
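The .0125 threshold is the standard Bonferroni correction, alpha/m = .05/4 = .0125. Equivalently, the four interaction p-values could be adjusted directly, as in this sketch with made-up p-values:

```python
from statsmodels.stats.multitest import multipletests

# Hypothetical p-values for the four subgroup interaction tests.
pvals = [0.030, 0.009, 0.250, 0.048]

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="bonferroni")
# Equivalent to requiring unadjusted p < .05/4 = .0125.
print(reject)  # [False  True False False]
print(p_adj)   # each p multiplied by 4, capped at 1.0
```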

Exploratory School-Level Impact Analyses. To analyze the impact of EWIMS on school-level outcomes and answer research questions 3a and 3b, ED’s contractor will estimate intent-to-treat linear regression models for the school-level outcomes of interest. These analyses are treated as exploratory because they are underpowered to detect effects and because school-level outcomes are not the primary focus of the evaluation. The models will include the strata (blocking) variables and an indicator for whether a school was randomly assigned to the treatment or control group.
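With one row per school, these exploratory models reduce to ordinary least squares; the sketch below assumes a hypothetical school-level file and outcome name.

```python
import pandas as pd
import statsmodels.formula.api as smf

schools = pd.read_csv("school_file.csv")  # hypothetical: one row per school

# Intent-to-treat OLS on a school-level outcome, with blocking dummies.
ols = smf.ols(
    "data_culture_score ~ treatment + C(state) + grad_rate_block + size_block",
    data=schools,
).fit()
print(ols.summary())
```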

Sensitivity Analyses. ED’s contractor will also conduct a series of sensitivity analyses to test the robustness of the estimates from the benchmark impact models. These analyses include fixed-effects models with school dummy variables, models with no covariates, and models that use the continuous or ordinal data underlying the binary indicators in the primary analyses (e.g., using all values of GPA rather than the binary indicator of above 2.0 vs. 2.0 or lower).
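For example, the GPA sensitivity check replaces the binary above-2.0 indicator with the underlying continuous measure (column names again hypothetical):

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analytic_file.csv")  # hypothetical student-level file

# Benchmark outcome: binary indicator for GPA above 2.0.
df["gpa_above_2"] = (df["gpa"] > 2.0).astype(int)

# Sensitivity model: same specification, continuous GPA as the outcome.
sens = smf.mixedlm("gpa ~ treatment + C(state) + grad_rate_block + size_block",
                   data=df, groups=df["school_id"]).fit()
print(sens.summary())
```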

Study Milestone Timeline

This array of data will permit ED’s contractor to develop a deep understanding of program implementation and of how the intervention differs from business as usual in control schools. The data collections described above are scheduled as follows:


Table 9. Schedule of Activities

| Activity | Expected Date |
|---|---|
| Draft Office of Management and Budget (OMB) package | July 2013 |
| Documentation of institutional review board approval | April 2013 |
| Submit 60-day FRN | July 2013 |
| Submit 30-day FRN | October 2013 |
| Draft proposal accepted by ED | March 2013 |
| Final proposal approved by ED | October 2013 |
| Expected OMB clearance date | March 2014 |
| Complete school recruitment | March 2014 |
| Obtain signed district/school memoranda of understanding from all participating schools | March 2014 |
| Complete random assignment of participating schools | March 2014 |
| Academic year (AY) 1, 2013–14 | |
| Collect baseline data from participating districts | March 2014 |
| Pilot the Web-based survey and interview protocols | March 2014 |
| Treatment schools implement EWIMS with Grades 9 and 10 | March 2014–June 2014 |
| Collect EWS tool data from treatment schools (AY1, quarter 3) | March/April 2014 |
| Collect EWS tool data from treatment schools (AY1, quarter 4) | June 2014 |
| Collect end-of-year student-level data from participating districts | June 2014 |
| Conduct interviews with EWIMS teams at treatment schools | May 2014 |
| Administer survey measure of treatment contrast at participating schools | May 2014 |
| Academic year (AY) 2, 2014–15 | |
| Treatment schools implement EWIMS with Grades 9, 10, and 11 | August 2014–June 2015 |
| Collect EWS tool data from treatment schools (AY2, quarter 1) | November 2014 |
| Collect midyear student-level data from participating districts | January 2015 |
| Collect EWS tool data from treatment schools (AY2, quarter 2) | January 2015 |
| Collect EWS tool data from treatment schools (AY2, quarter 3) | March 2015 |
| Collect EWS tool data from treatment schools (AY2, quarter 4) | June 2015 |
| Collect end-of-year student-level data from participating districts | June 2015 |
| Conduct interviews with EWIMS teams at treatment schools | May 2015 |
| Administer survey measure of treatment contrast at participating schools | May 2015 |
| Academic year (AY) 3, 2015–16 | August 2015 |
| Control schools implement EWIMS | August 2015–June 2016 |
| Submit Making an Impact report (first draft) | December 2015 |
| Making an Impact report accepted by ED | TBD |

3. Description of Procedures for Maximizing Response Rates

ED is committed to obtaining complete data for this evaluation. Because the evaluation relies heavily on administrative data, ED’s contractor does not expect response rates to be an issue and anticipates a 95 percent response rate for district/school and state archival record requests. A key to obtaining complete administrative data is tracking each data component from the SEAs, districts, and schools, with e-mail and telephone follow-up to the appropriate parties to resolve missing or delayed data files. Also, because ED’s contractor will have direct access to EWS tools in treatment schools, response rates will be 100 percent for all applicable data; the completeness of those data will vary, however, with the degree to which treatment schools use and implement EWIMS. All administrative data files and direct instrument responses received as part of this evaluation will be reviewed for consistency and completeness. If a data file has too many missing values, or if an instrument in the implementation study has too few completed items to count as a response, AIR staff will seek more complete responses by e-mail or phone.

Based on its extensive experience administering surveys in a variety of schools, districts, and states, ED’s contractor expects the response rate for the survey to be at least 85 percent. Although ED’s contractor expects high response rates for the data collections unique to this evaluation (schools volunteer for the study in order to receive EWIMS at no cost, and instruments are developed with sensitivity to respondent burden), nonresponse follow-up will be performed to ensure adequate response rates. Reminder e-mails and phone calls will go to individual respondents when responses to the Web-based surveys are not received, and AIR staff will follow up directly with respondents if face-to-face or reflection interviews are not completed.

Additional steps will be taken to maximize response rates for the data collected for this study. For example, sampled respondents will receive advance communications explaining their role and an assurance of confidentiality. Respondents will also be given a contact number for reaching ED’s contractor with questions and will be informed of the incentives they will receive (beyond the incentive received by all control schools). Specifically, the study plan includes the following incentives:

  • $30 gift card for school administrators who participate in the annual Web-based survey (May 2014, May 2015)

  • $30 gift card for interview participants from EWIMS data teams at each of the 36 treatment schools (May 2014, May 2015)

Finally, ED’s contractor will work with IES to determine the best analytic approach for addressing missing data on key outcomes and covariates in the impact models estimating the effect of EWIMS on student and school outcomes. In the proposed approach, ED’s contractor will drop from the impact models any covariate with more than 5 percent missing data and will listwise delete cases with missing data on outcome measures. Upon collecting the data, ED’s contractor will examine the percentage of missing cases on key outcomes and covariates and work with IES to determine whether listwise deletion remains the best approach or whether missing data should be imputed (using mean imputation or multiple imputation) to minimize bias associated with attrition or nonresponse.
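A minimal sketch of the proposed decision rules, with hypothetical column names: covariates missing for more than 5 percent of cases are dropped from the model, and cases missing an outcome are listwise deleted.

```python
import pandas as pd

df = pd.read_csv("analytic_file.csv")  # hypothetical analytic file
covariates = ["female", "frpl", "ell", "iep", "prior_read", "prior_math"]

# Rule 1: drop covariates with more than 5 percent missing data.
kept = [c for c in covariates if df[c].isna().mean() <= 0.05]

# Rule 2: listwise-delete cases missing the outcome.
analytic = df.dropna(subset=["ontrack"])
print(f"Covariates retained: {kept}; analytic N = {len(analytic)}")
```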


4. Description of Tests, Procedures, and Methods

Because all student-level outcome data used to estimate the impact of EWIMS on students will come from state-, district-, or school-level administrative records and/or directly from the EWS tool, there is no need to test procedures or instruments for those data. However, ED’s contractor will invite internal AIR staff with expertise in leveraging partnerships with states and districts to collect extant data to confirm that the data collection protocols are appropriate and do not pose unnecessary burden on state or district staff.

To measure school-level outcomes, including data-informed use of dropout prevention interventions for students and school data culture, ED’s contractor will use a Web-based survey. The survey items will be reviewed by AIR colleagues who were formerly employed as teachers and/or school administrators or who have relevant content expertise. These “critical colleagues” will look for three things during their review: (1) whether the questions are clear, understandable (free of research jargon), and answerable; (2) whether the questions actually assess the intended constructs (e.g., the degree to which schools are implementing EWIMS as intended or implementing an EWIMS-like intervention); and (3) whether the number and type of questions are suitable for a 60-minute survey (e.g., not redundant, focused enough to elicit clear answers).

Next, ED’s contractor will pretest the Web-based survey with five school leaders who are currently implementing EWIMS in their schools, recruited from the network of schools currently implementing EWIMS. Participants will be asked to complete the Web-based survey using a cognitive think-aloud protocol so that any issues with item sets can be identified and corrected before the first wave of surveys is administered in spring 2014.

The implementation study also involves face-to-face data collection through interviews with the data team. ED’s contractor will again draw on its network of experts within AIR: the interview protocols will be reviewed by these critical colleagues for clarity, face validity, and brevity. ED’s contractor will also pilot the interview protocols with nine teachers/school leaders who are currently implementing EWIMS in their schools, again recruited from the network of schools currently implementing EWIMS. Participants will be asked to complete the interview and provide feedback on the length and appropriateness of the items so that any issues with item sets can be corrected before the first interviews are conducted. School-level staff who participate in the pilot testing will receive a one-time $30 stipend in the form of a gift card, consistent with Analytical and Technical Support for Advancing Education Evaluation: How to Put Together an OMB Supporting Statement, Attachment E.


5. Contact Information for Statistical Consultants

ED’s contractor consulted the following experts internally within AIR concerning the methodology, study design, and data collection approach/burden:

  • Dr. Hans Bos, Vice President, AIR (650-843-8110)

  • Dr. Chris Brandt, Senior Researcher, AIR (630-649-6649)

  • Dr. Dean Gerdeman, Principal Researcher, AIR (202-403-6223)

  • Dr. Mengli Song, Principal Researcher, AIR (202-403-5267)

The following individuals were consulted on the statistical, data collection, and analytic aspects of the EWIMS evaluation study through REL Midwest’s Technical Working Group (TWG).



Margaret Burchinal, Ph.D.

Senior Scientist

Frank Porter Graham Child Development Institute

Research Professor, Department of Psychology

Adjunct Professor, Department of Biostatistics

University of North Carolina

515 Oakcrest Drive

Chapel Hill, NC 27516-9638

Ph: 919-966-5059

Fax: 919-962-5771

E-Mail: [email protected]



Thomas Cook, Ph.D.

Joan and Sarepta Harrison Chair in Ethics and Justice

Professor of Sociology, Psychology, Education and Social Policy

Faculty Fellow, Institute for Policy Research

Northwestern University

2040 Sheridan Road

Evanston, IL 60208

Ph: 847-491-3776

E-Mail: [email protected]



Sara Goldrick-Rab, Ph.D.

Associate Professor of Educational Policy Studies and Sociology

Senior Scholar, Wisconsin Center for the Advancement of Postsecondary Education

University of Wisconsin

211 Education Bldg.

1000 Bascom Mall

Madison, WI 53706

Ph: 608-265-2141

E-Mail: [email protected]



Larry Hedges, Ph.D.

Board of Trustees Professor of Statistics and Social Policy

Faculty Fellow, Institute for Policy Research

Northwestern University

2006 Sheridan Road, EV 4070

Evanston, IL 60208

Ph: 847-491-8899

E-Mail: [email protected]



James J. Kemple, Ed.D.

Executive Director

Research Alliance for New York City Schools

Research Professor, Steinhardt School of Culture, Education, and Human Development

New York University

726 Broadway, 756

New York, NY 10003

Ph: 212-998-5463

Fax: 212-995-4049

E-Mail: [email protected]



Brian Rowan, Ph.D.

Burke A. Hinsdale Collegiate Professor and School of Education Research Professor

Institute for Social Research

University of Michigan

610 E. University Ave.

Room 4112

Ann Arbor, MI 48109-1259

Ph: 734-615-0286

E-Mail: [email protected]



Barbara Schneider, Ph.D.

John A. Hannah Chair and University Distinguished Professor

College of Education and Department of Sociology

Michigan State University

Erickson Hall

620 Farm Lane, Room 516

East Lansing, MI 48824

Ph: 517-432-0188

E-Mail: [email protected]

ED’s contractor also consulted with the REL Midwest Dropout Prevention Research Alliance members listed below to gather feedback on the design and measures to be used in the study.

  • Teresa Brown, Assistant Superintendent at Indiana Department of Education (317-232-0524)  

  • Leisa Gallagher, Director of the Reaching & Teaching Struggling Learners Initiative at the Michigan Department of Education (517-908-3921)

  • Jeremy Herr, Principal, McComb High School, McComb Local Schools (419-293-3286)

  • Laurie Kruszynski, Data Coordinator, Scott High School, Toledo Public Schools (419-671-4000)

  • Cherie Mourlam, Assistant Superintendent, Washington Local Schools (419-473-8222)

  • Mike O’Shea, Springfield High School (419-867-5633)

  • Melissa Ramirez, Assistant Principal, Findlay City Schools (419-425-8257)

  • Amy Szymanksi, Consultant, State Support Team Region 1 (419-833-6771)

  • Jay Wollenburg, Principal, Ohio Virtual Academy (866-339-9071)

  • Sue Zake, Executive Director, Ohio Department of Education (419-720-8999)


Key staff who will be directly involved with the data collection and analyses include:

  • Dr. Ann-Marie Faria, Senior Researcher, AIR (202-403-5356)

  • Dr. Jessica Heppen, Principal Researcher, AIR (202-403-5488)

  • Dr. Mindee O’Cummings, Principal Researcher, AIR (202-403-5254)

  • Dr. Nicholas Sorensen, Senior Researcher, AIR (312-283-2318)




References

Allensworth, E., & Easton, J. (2005). The on-track indicator as a predictor of high school graduation. Chicago: Consortium on Chicago School Research. Retrieved from http://ccsr.uchicago.edu/publications/track-indicator-predictor-high-school-graduation

Allensworth, E., & Easton, J. Q. (2007). What matters for staying on-track and graduating in Chicago public high schools: A close look at course grades, failures and attendance in the freshman year. Chicago: Consortium on Chicago School Research. Retrieved from http://ccsr.uchicago.edu/publications/what-matters-staying-track-and-graduating-chicago-public-schools

Balfanz, R., Herzog, L., & Mac Iver, D. J. (2007). Preventing student disengagement and keeping students on the graduation path in urban middle-grades schools: Early identification and effective interventions. Educational Psychologist, 42(4), 223–235.

Bloom, H. S., Hill, C., Black, A., & Lipsey, M. W. (2008). Performance trajectories and performance gaps as achievement effect-size benchmarks for educational interventions. New York: MDRC.

Dynarski, M., Clarke, L., Cobb, B., Finn, J., Rumberger, R., & Smink, J. (2008). Dropout prevention: A practice guide (NCEE 2008-4025). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Retrieved from http://ies.ed.gov/ncee/wwc/pdf/practiceguides/dp_pg_090308.pdf

Faria, A. M., Heppen, J., Li, Y., Stachel, S., Jones, W., Sawyer, K., et al. (2012). Charting success: Data use and student achievement in urban schools. Washington, DC: Council of the Great City Schools.

Goldschmidt, P., & Wang, J. (1999). When can schools affect dropout behavior? A longitudinal multilevel analysis. American Educational Research Journal, 36, 715–738.

Kennelly, L., & Monrad, M. (2007). Approaches to dropout prevention: Heeding early warning signs with appropriate interventions. Washington, DC: American Institutes for Research, National High School Center. Retrieved from http://www.betterhighschools.org/docs/NHSC_ApproachestoDropoutPrevention.pdf

Mac Iver, M. A., & Mac Iver, D. J. (2010). Keeping on track in ninth grade and beyond: Baltimore’s ninth graders in 2007–08. Baltimore: Baltimore Education Research Consortium. Retrieved from http://baltimore-berc.org/pdfs/3A%20Final%20report_06-15-10.pdf

Neild, R. C., & Balfanz, R. (2006). An extreme degree of difficulty: The educational demographics of the urban neighborhood high school. Journal of Education for Students Placed at Risk, 11(2), 123–141.

Neild, R. C., Stoner-Eby, S., & Furstenberg, F. (2008). Connecting entrance and departure: The transition to ninth grade and high school dropout. Education and Urban Society, 40(5), 543–569.

Raudenbush, S. W., Spybrook, J., Congdon, R., Liu, X., Martinez, A., Bloom, H., & Hill, C. (2011). Optimal Design Plus Empirical Evidence (Version 3.0) [Computer software]. Available at http://www.wtgrantfoundation.org/resources/consultation-service-and-optimal-design

Silver, D., Saunders, M., & Zarate, E. (2008). What factors predict high school graduation in the Los Angeles Unified School District? (California Dropout Research Project Report 14). Santa Barbara, CA: University of California. Retrieved from http://www.hewlett.org/uploads/files/WhatFactorsPredict.pdf

Sloan, M., Ingels, J., & Burghardt, J. (2012). Analytical and technical support for advancing education evaluation: How to put together an OMB supporting statement. Available at http://relpacific.wikispaces.com/file/view/OMB%20guidance%20compiled%205-30-12.pdf/350956206/OMB%20guidance%20compiled%205-30-12.pdf

Attachment A. Recruitment Materials

Attachments A-1 through A-9 include recruitment materials for districts and schools to be used in the EWIMS impact study.

A-1 includes the advance letter e-mail text. A-2 includes the two-page study overviews. A-3 includes a letter from IES endorsing the study. A-4 includes follow-up text for schools that are nonresponsive during recruitment. The advance e-mail letter, the two-page study overviews, and the IES endorsement letter will be sent to principals at each school and superintendents at each district. A-5 and A-6 include screening protocols for schools and districts. These protocols will be administered to a relevant stakeholder within each school or district identified by an initial point of contact; for example, the study team will first contact principals at each school and superintendents at each district, though these high-level administrators might ask another staff member (e.g., a dropout prevention specialist) to complete the screening interview. A-7 and A-8 include memoranda of understanding to be signed between districts and REL Midwest (at AIR) and between schools and REL Midwest (at AIR) to formalize study participation and commitment. A-9 includes a memorandum of understanding to be signed between each SEA and REL Midwest (at AIR).



A-1. Advance Letter (E-Mail Text)

Dear < principal or superintendent name>,

I am contacting you on behalf of the U.S. Department of Education to discuss an opportunity for <district or school name> to participate in a study aimed at preventing high school dropout. This study would provide access to the National High School Center’s Early Warning and Intervention Monitoring System (EWIMS), along with high-quality professional development and technical support, at no cost. The EWIMS process involves using data to identify at-risk students early in high school and assigning struggling students to interventions that get them back on track for on-time graduation. Conducted by REL Midwest at American Institutes for Research (AIR), the project will examine the impact of implementing EWIMS on a host of critical student outcomes (validated indicators of dropout, course performance, persistence and progress in school, and likelihood of on-time graduation). In addition, the project will examine the impact of EWIMS on how schools use data to allocate limited dropout prevention resources.

Attached please find a document that briefly describes the project. Our initial records indicate that your district/school may be eligible to participate in this project, funded by the U.S. Department of Education’s Institute of Education Sciences. I would like to schedule a brief phone call with you to tell you more about the project, ascertain your interest in taking part, and discuss whether this might be a great opportunity for <district or school name>.

Would you be able to touch base briefly by phone on <day>, <month> <date>? Please feel free to reach me by email or at 312-283-2318.

Thank you and I look forward to speaking with you!

<name of project outreach recruiter>


<name of project outreach recruiter>

Project Outreach

On-Time Graduation Project

Midwest Regional Educational Laboratory

American Institutes for Research

20 North Wacker Drive, Suite 1231, Chicago, IL 60606

t: 312.283.2318 | www.air.org


Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.


A-2. Study Overview for Schools and Districts



Ohio

Purpose of the Project

One in four students in the United States fails to graduate from high school, and graduation rates are significantly lower for students who are racial or ethnic minorities, economically disadvantaged, migrant and/or limited-English proficient, or receiving special education services. However, strong foundational research highlights critical indicators in ninth and tenth grade that powerfully predict whether students are “on track” for high school graduation. These indicators, which typically focus on student engagement measures (attendance) and course performance (grades, credits earned), can be used as part of an early warning system to flag at-risk students early, assign appropriate interventions, and get students back on track for graduation.

As part of Ohio’s Comprehensive Continuous Improvement Plan, the state has committed, through adoption of the Ohio Improvement Process, to promoting student success through data-driven decision making, targeted programming, ongoing student monitoring, and evaluation of improvement process effectiveness. Use of an early warning system is one strategy that schools can use within the Ohio Improvement Process.

One such early warning system is the Early Warning and Intervention Monitoring System (EWIMS) developed by the National High School Center. The EWIMS model encourages educators to use data to consider and provide appropriate interventions to students, and provides a means for monitoring student progress over time. At the core of EWIMS is the use of simple tools that encourage educators to routinely examine indicators that identify students as “off track” and take action.

Despite increasingly widespread implementation of early warning systems by states, districts, and schools, there have been no rigorous studies testing the impact of using an early warning system to improve student outcomes such as staying in school, progressing in school, and probability of graduating. There is also little research examining how using an early warning system can shape a school’s culture for data use (including how schools allocate their limited dropout prevention resources).

To address this gap, the On-Time Graduation Project will provide the first rigorous test of the impact of an early warning system. This project, funded by the U.S. Department of Education’s Institute of Education Sciences, is being conducted by the Midwest Regional Educational Laboratory at the American Institutes for Research.

The project will examine the impact of EWIMS on (1) student outcomes, including student risk status for dropout, scores on graduation tests, persistence and progress in school, and likelihood of on-time graduation; and (2) school outcomes, including how schools allocate dropout prevention interventions for students and their data-use culture. Participating in this high-profile, large-scale project will give your school and district an opportunity to access these innovative resources at no cost and help inform educational policy in Ohio and at the national level.

Project Approach

The project will take place in approximately 70 high schools in Ohio during the 2013–14 and 2014–15 school years. All schools that participate in the project will receive the early warning system at no cost. Participating schools will be assigned by lottery to receive access to the EWIMS model, including the tool and high-quality professional development for implementation, in the 2013–2014 school year or the fall of 2015 (following the completion of the project). Schools assigned to receive the EWIMS model in the fall of 2015 will continue “business as usual” practices to identify at-risk students and allocate dropout prevention resources through 2014–15.

The project will examine student outcomes for all students in Grades 9 and 10 during the 2013–14 school year and all students in Grades 9 through 11 during the 2014–15 school year for schools that implement EWIMS in 2013–2014 and those that do not implement EWIMS until the fall of 2015. All student outcome data will be collected from school or district administrative data, the early warning system tool, or the Ohio Department of Education. There will be no primary student data collection for this project. To understand how EWIMS may impact schools, all participating schools will be asked to complete an annual Web-based survey about data use practices, and schools implementing EWIMS during the two-year project may be asked to participate in interviews about their experiences using the tool.

EWIMS Intervention

The EWIMS process is designed to identify students who are at risk of dropping out of school and to support and monitor at-risk students through school-wide strategies and targeted interventions. EWIMS is currently in use in 67 districts in six states, and the tool has been downloaded more than 20,000 times from the National High School Center’s website.

EWIMS Tool. At the heart of the EWIMS model is an early warning data tool used to flag students as “at risk” based on attendance, course performance (grades, credits, GPA), and behavior indicators. The tool enables users to identify students who are at risk of dropping out of school, record assignments to available interventions, and monitor students’ response to those interventions. The tool provides a number of reports accessible to users, including:

  • School-Level Reports: Graphical reports that show trends in student risk status across the school

  • Student-Level Reports: Lists of students, grouped by indicators of risk and/or assigned intervention programs, that include summary information

  • Detailed Student Reports: Simple reports that can be generated for each individual student and show student information, indicators of risk status, and any assigned interventions

  • Student-Level Intervention Summary Reports: Lists of students, their indicators of risk (flagged or not flagged), and the number and types of interventions to which each student was assigned

EWIMS Implementation Process. In addition to the tool, the National High School Center has devised a seven-step EWIMS implementation process to support implementation. The process guides users to make informed decisions about how to use data to support at-risk students and how to continue to monitor their progress over time. In addition to focusing on individual students, the process guides users to examine the success of specific supports or interventions and to examine possible systemic issues (school climate) that may relate to dropout trends.


EWIMS Seven-Step Implementation Cycle

[Figure: diagram of the seven-step EWIMS implementation cycle]

Initial Criteria for Participation

The project will include approximately 70 high schools in Ohio. To qualify, schools must (1) have at least 150 ninth-grade students; (2) have a graduation rate between 25 and 95 percent; and (3) not already be implementing an early warning system tool for using data to flag at-risk students.

Project Timeline

Through March 2014, the project team will discuss participation with districts and schools that meet the initial criteria and conduct on-site or virtual meetings with school principals, guidance counselors and dropout prevention coordinators. Participating schools will sign memoranda of understanding in January 2014 and be assigned by lottery to receive access to the EWIMS model, including the tool and high-quality professional development on the implementation process, in March 2014 (treatment group) or the 2015–2016 school year (“business as usual” control group). Training for EWIMS implementation in treatment schools will begin in early 2014. Data collection activities will continue throughout but not beyond the 2014–15 school year.

For Additional Information

For more information or to begin a conversation about partnering on this project, contact Dr. Nicholas Sorensen ([email protected] or 312-283-2318) or Dr. Mindee O’Cummings ([email protected] or 202-403-5254).



Indiana

On-Time Graduation Project

Purpose of the Project

One in four students in the United States fails to graduate from high school on time and graduation rates are significantly lower for students who are racial or ethnic minorities, economically disadvantaged, migrant and/or limited-English proficient, or receiving special education services. However, strong foundational research highlights critical indicators in ninth and tenth grade that powerfully predict whether students are “on track” for high school graduation. These indicators, which typically focus on student engagement measures (attendance) and course performance (grades, credits earned), can be used as part of an early warning system to flag at-risk students early, assign appropriate interventions, and get students back on track for graduation.

Schools in Indiana take many different approaches to increasing on-time graduation. Use of an early warning system is one strategy schools can use to help meet the Department of Education’s recent commitment to improve graduation rates to at least 90 percent by the year 2012, as outlined in the state’s ESEA waiver request.

One such early warning system is the Early Warning and Intervention Monitoring System (EWIMS) developed by the National High School Center. The EWIMS model encourages educators to use data to consider and provide appropriate interventions to students, and provides a means for monitoring student progress over time. At the core of EWIMS is the use of simple tools that encourage educators to routinely examine indicators that identify students as “off track” and take action.

Despite increasingly widespread implementation of early warning systems by states, districts, and schools, there have been no rigorous studies testing the impact of using an early warning system to improve student outcomes such as staying in school, progressing in school, and probability of graduating. There is also little research examining how using an early warning system can shape a school’s culture for data use including how schools allocate their limited dropout prevention resources.

To address this gap, the On-Time Graduation Project will provide the first rigorous test of the impact of an early warning system. This project, funded by the U.S. Department of Education’s Institute of Education Sciences, is being conducted by the Midwest Regional Educational Laboratory at the American Institutes for Research.

The project will examine the impact of EWIMS on (1) student outcomes including student risk status for dropout, scores on graduation tests, persistence and progress in school and likelihood of on-time graduation; and (2) school outcomes including how schools allocate dropout prevention interventions for students and their data-use culture. Participating in this high-profile, large-scale project will give your school and district an opportunity to access these innovative resources at no cost and help inform educational policy in the Midwest and at the national level.

Project Approach

The project will take place in approximately 70 high schools in the Midwest region, including Indiana, during the 2013–14 and 2014–15 school years. All schools that participate in the project will receive the early warning system at no cost. Participating schools will be assigned by lottery to receive access to the EWIMS model, including the tool and high-quality professional development for implementation, in the 2013–2014 school year or the fall of 2015 (following the completion of the project). Schools assigned to receive the EWIMS model in the fall of 2015 will continue “business as usual” practices to identify at-risk students and allocate dropout prevention resources through 2014–15.

The project will examine student outcomes for all students in Grades 9 and 10 during the 2013–14 school year and all students in Grades 9 through 11 during the 2014–15 school year, both for schools that implement EWIMS in 2013–2014 and for those that do not implement EWIMS until the fall of 2015. All student outcome data will be collected from school or district administrative data, the early warning system tool, or the Indiana Department of Education. There will be no primary student data collection for this project. To understand how EWIMS may impact schools, all participating schools will be asked to complete an annual Web-based survey about data use practices, and schools implementing EWIMS during the two-year project may be asked to participate in interviews about their experiences using the tool.



EWIMS Intervention

The EWIMS process is designed to identify students who are at risk of dropping out of school and support and monitor at-risk students through school-wide strategies and targeted interventions. EWIMS is currently in use in 67 districts in six states, and the tool has been downloaded more than 20,000 times from the National High School Center’s website.

EWIMS Tool. At the heart of the EWIMS model is an early warning data tool used to flag students as “at risk” based on attendance, course performance (grades, credits, GPA), and behavior indicators. The tool enables users to identify students who are at risk of dropping out of school, record assignments to available interventions, and monitor students’ response to those interventions. The tool provides a number of reports accessible to users, including:

  • School-Level Reports: Graphical reports that show trends in student risk status across the school

  • Student-Level Reports: Lists of students, grouped by indicators of risk and/or assigned intervention programs, that include summary information

  • Detailed Student Reports: Simple reports that can be generated for each individual student and show student information, indicators of risk status, and any assigned interventions

  • Student-Level Intervention Summary Reports: Lists of students, their indicators of risk (flagged or not flagged), and the number and types of interventions to which each student was assigned

EWIMS Implementation Process. In addition to the tool, the National High School Center has devised a seven-step EWIMS implementation process to support implementation. The process guides users to make informed decisions about how to use data to support at-risk students and how to continue to monitor their progress over time. In addition to focusing on individual students, the process guides users to examine the success of specific supports or interventions and to examine possible systemic issues (school climate) that may relate to dropout trends.





EWIMS Seven-Step Implementation Cycle

[Figure: diagram of the seven-step EWIMS implementation cycle]

Initial Criteria for Participation

The project will include approximately 70 high schools in the Midwest. To qualify, schools must (1) have at least 150 ninth-grade students; (2) have a graduation rate between 25 and 95 percent; and (3) not already be implementing an early warning system tool for using data to flag at-risk students.

Project Timeline

Through March 2014, the project team will discuss participation with districts and schools that meet the initial criteria and conduct on-site or virtual meetings with school principals, guidance counselors and dropout prevention coordinators. Participating schools will sign memoranda of understanding in January 2014 and be assigned by lottery to receive access to the EWIMS model, including the tool and high-quality professional development on the implementation process, in March 2014 (treatment group) or the 2015–2016 school year (“business as usual” control group). Training for EWIMS implementation in treatment schools will begin in early 2014. Data collection activities will continue throughout but not beyond the 2014–15 school year.

For Additional Information

For more information or to begin a conversation about partnering on this project, contact Dr. Nicholas Sorensen ([email protected] or 312-283-2318) or Dr. Mindee O’Cummings ([email protected] or 202-403-5254).

Michigan

On-Time Graduation Project

Purpose of the Project

One in four students in the United States fails to graduate from high school and graduation rates are significantly lower for students who are racial or ethnic minorities, economically disadvantaged, migrant and/or limited-English proficient, or receiving special education services. However, strong foundational research highlights critical indicators in ninth and tenth grade that powerfully predict whether students are “on track” for high school graduation. These indicators, which typically focus on student engagement measures (attendance) and course performance (grades, credits earned), can be used as part of an early warning system to flag at-risk students early, assign appropriate interventions, and get students back on track for graduation.

As part of Michigan's Dropout Challenge, the state is challenging schools to use early warning indicators to identify students, especially students in transition years, for targeted programming and monitoring to keep students on track towards graduation and post-secondary success. Use of an early warning system is one strategy that schools can implement within the Dropout Challenge.

One such early warning system is the Early Warning and Intervention Monitoring System (EWIMS) developed by the National High School Center. The EWIMS model encourages educators to use data to consider and provide appropriate interventions to students, and provides a means for monitoring student progress over time. At the core of EWIMS is the use of simple tools that encourage educators to routinely examine indicators that identify students as “off track” and take action.

Despite increasingly widespread implementation of early warning systems by states, districts, and schools, there have been no rigorous studies testing the impact of using an early warning system to improve student outcomes such as staying in school, progressing in school, and probability of graduating. There is also little research examining how using an early warning system can shape a school’s culture for data use including how schools allocate their limited dropout prevention resources.

To address this gap, the On-Time Graduation Project will provide the first rigorous test of the impact of an early warning system.



This project, funded by the U.S. Department of Education’s Institute of Education Sciences, is being conducted by the Midwest Regional Educational Laboratory at the American Institutes for Research.

The project will examine the impact of EWIMS on (1) student outcomes including student risk status for dropout, scores on graduation tests, persistence and progress in school and likelihood of on-time graduation; and (2) school outcomes including how schools allocate dropout prevention interventions for students and their data-use culture. Participating in this high-profile, large-scale project will give your school and district an opportunity to access these innovative resources at no cost and help inform educational policy in Michigan and at the federal level.

Project Approach

The project will take place in approximately 70 high schools during the 2013–14 and 2014–15 school years. All schools that participate in the project will receive the early warning system at no cost. Participating schools will be assigned by lottery to receive access to the EWIMS model, including the tool and high-quality professional development for implementation, in the 2013–2014 school year or the fall of 2015 (following the completion of the project). Schools assigned to receive the EWIMS model in the fall of 2015 will continue “business as usual” practices to identify at-risk students and allocate dropout prevention resources through 2014–15.

The project will examine student outcomes for all students in Grades 9 and 10 during the 2013–14 school year and all students in Grades 9 through 11 during the 2014–15 school year, both for schools that implement EWIMS in 2013–2014 and for those that do not implement EWIMS until the fall of 2015. All student outcome data will be collected from school or district administrative data, the early warning system tool, or the Michigan Department of Education. There will be no primary student data collection for this project. To understand how EWIMS may impact schools, all participating schools will be asked to complete an annual Web-based survey about data use practices, and schools implementing EWIMS during the two-year project may be asked to participate in interviews about their experiences using the tool.

EWIMS Intervention

The EWIMS process is designed to identify students who are at risk of dropping out of school and support and monitor at-risk students through school-wide strategies and targeted interventions. EWIMS is currently in use in 67 districts in six states, and the tool has been downloaded more than 20,000 times from the National High School Center’s website.

EWIMS Tool. At the heart of the EWIMS model is an early warning data tool used to flag students as “at risk” based on attendance, course performance (grades, credits, GPA), and behavior indicators. The tool enables users to identify students who are at risk of dropping out of school, record assignments to available interventions, and monitor students’ response to those interventions. The tool provides a number of reports accessible to users, including:

  • School-Level Reports: Graphical reports that show trends in student risk status across the school

  • Student-Level Reports: Lists of students, grouped by indicators of risk and/or assigned intervention programs, that include summary information

  • Detailed Student Reports: Simple reports that can be generated for each individual student and show student information, indicators of risk status, and any assigned interventions

  • Student-Level Intervention Summary Reports: Lists of students, their indicators of risk (flagged or not flagged), and the number and types of interventions to which each student was assigned

EWIMS Implementation Process. In addition to the tool, the National High School Center has devised a seven-step EWIMS implementation process to support implementation. The process guides users to make informed decisions about how to use data to support at-risk students and how to continue to monitor their progress over time. In addition to focusing on individual students, the process guides users to examine the success of specific supports or interventions and to examine possible systemic issues (school climate) that may relate to dropout trends.





EWIMS Seven-Step Implementation Cycle

[Figure: diagram of the seven-step EWIMS implementation cycle]

Initial Criteria for Participation

The project will include approximately 70 high schools in the Midwest. To qualify, schools must (1) have at least 150 ninth-grade students; (2) have a graduation rate between 25 and 95 percent; and (3) not already be implementing an early warning system tool for using data to flag at-risk students.

Project Timeline

Through March 2014, the project team will discuss participation with districts and schools that meet the initial criteria and conduct on-site or virtual meetings with school principals, guidance counselors and dropout prevention coordinators. Participating schools will sign memoranda of understanding in January 2014 and be assigned by lottery to receive access to the EWIMS model, including the tool and high-quality professional development on the implementation process, in March 2014 (treatment group) or the 2015–2016 school year (“business as usual” control group). Training for EWIMS implementation in treatment schools will begin in early 2014. Data collection activities will continue throughout but not beyond the 2014–15 school year.

For Additional Information

For more information or to begin a conversation about partnering on this project, contact Dr. Nicholas Sorensen ([email protected] or 312-283-2318) or Dr. Mindee O’Cummings ([email protected] or 202-403-5254).



A-3. Letter From IES Endorsing the Study





UNITED STATES DEPARTMENT OF EDUCATION

INSTITUTE OF EDUCATION SCIENCES

National Center for Education Evaluation and Regional Assistance

Month XX, 2013

Dear District/Principal,

I am writing to introduce you to a new project funded by the U.S. Department of Education (ED) through the Regional Educational Laboratory (REL) program called the On-Time Graduation Project. This project is aimed at preventing high school dropout by using data to identify at-risk students early in high school and assigning students to interventions that help get them back on track for on-time graduation. The study will be conducted by REL Midwest at American Institutes for Research (AIR). For this study, ED is seeking more than 70 high schools in the Midwest to participate. In addition to being part of a high-profile, innovative study, participating districts will receive free access to the EWIMS model, developed by the National High School Center at AIR, including the tool and training and technical support for implementation.

Improving current graduation rates is a focal point for states across the Midwest. Districts and schools in the Midwest are increasingly interested in using an early warning system to identify students who are off track for graduation as early as possible. For this reason, REL Midwest formed the Dropout Prevention Research Alliance, focused on improving graduation outcomes and reducing persistent disparities in graduation and dropout rates among student subgroups. The goal of the alliance is to expand the implementation of early warning systems and to conduct an efficacy study testing their impact on student outcomes and school processes. The proposed study is a two-year randomized controlled trial (RCT) examining the impact of implementing an early warning system on school processes and student outcomes. The project responds to a need expressed by members of REL Midwest’s Dropout Prevention Research Alliance for clear information about the efficacy of early warning systems.

On behalf of the U.S. Department of Education, I encourage your participation, as many districts and schools are needed to generate clear and usable results from this evaluation. Thank you in advance for your consideration. If you have any questions about the study, please feel free to contact AIR’s project directors, Dr. Ann-Marie Faria ([email protected], 202-403-5356) or Dr. Nicholas Sorensen ([email protected], 312-283-2318). In addition, if I can provide any assistance to you in this matter, please feel free to contact me by phone at 202-219-1674 or email at [email protected]. Thank you for your time, and we look forward to speaking with you.

Sincerely,

Chris Boccanfuso, Ph.D.

Contracting Officer’s Representative, REL Midwest



A-4. Nonresponse Follow-Up (E-Mail Text)


Dear <principal or superintendent name>,


I am contacting you on behalf of the U.S. Department of Education to discuss an opportunity for <district or school name> to participate in a study that would provide access to the National High School Center’s Early Warning and Intervention Monitoring System (EWIMS), including training and technical support, at no cost.


This project, conducted by the Midwest Regional Educational Laboratory at American Institutes for Research (AIR) and funded by the U.S. Department of Education’s Institute of Education Sciences, is aimed at preventing high school dropout by using data to identify at-risk students early in high school and assign struggling students to interventions that get them back on track for eventual graduation.


The timeline for recruiting districts and schools to participate in this opportunity is extremely tight. Would you be able to touch base briefly by phone on <day>, <month> <date> to discuss whether this might be a good opportunity for <district or school name>?


Please feel free to reach me by email or at 312-283-2318.


Thank you and I look forward to speaking with you!


<name of project outreach recruiter>




<name of project outreach recruiter>

Project Outreach

On-Time Graduation Project

Midwest Regional Educational Laboratory

American Institutes for Research

20 North Wacker Drive, Suite 1231, Chicago, IL 60606

t: 312.283.2318 | www.air.org


Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.






A-5. Telephone School Screening Protocol/Interview

Paperwork Burden Statement

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid Office of Management and Budget (OMB) control number. The valid OMB control number for this information collection is XXXX-XXXX. The time required to complete this information collection is estimated to average 30 minutes per response. This information collection is voluntary. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202–4651. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: Christopher Boccanfuso, U.S. Department of Education, Institute of Education Sciences, Room 506D, 555 New Jersey Ave. NW, Washington, DC 20208-5500.

Draft School Screening Protocol

School Screening Protocol

Introductory script: Thank you for the opportunity to meet with you today and discuss the Impact of an Early Warning and Intervention Monitoring System (EWIMS) on Student and School Outcomes Study, funded by the Institute of Education Sciences within the U.S. Department of Education. We are very excited about your school’s interest in participating in the study and are looking forward to talking more about whether the study might be a good opportunity for your school. Participation in this study is voluntary for schools. Today we will first ask you questions about how your school works with students who are at risk of not graduating on time, and then we will ask some questions about your school’s data use practices. By data use practices, we mean things like the professional climate around using data; supports for data use, such as professional development and/or structured time to review data; the frequency of data use; and the use of data to track progress.

Required Introductory Talking Points

  • This project will provide participating schools access to the National High School Center’s Early Warning and Intervention Monitoring System (EWIMS), which will allow you to:

    • Import data from your student information system and flag students early in Grade 9 or 10 using evidence-based indicators of risk

    • Assign those students to interventions of your choosing and monitor their progress over time

    • Examine which programs, interventions, or support strategies appear to be most effective in helping students get back on track



  • You will receive high-quality professional development to help you use the tool and implement a seven-step systematic process for using data to make decisions about how to help at-risk students get back on track for on-time graduation



  • All of this is provided at no cost



  • Because this is a research project, in March, we will use a lottery to divide all participating schools into two groups. The first group will receive EWIMS and the PD beginning in March of 2014. The second group will receive EWIMS and the PD in the fall of 2015.

  • The goal of this interview today is for us to get a sense of what you’re currently doing with data and how you are using or thinking about data to help at-risk students. Some of the questions may seem redundant, but this is simply to make sure we are getting as much detail as possible.



  • We know that most schools are not yet implementing a lot of these practices; the goal today is to make sure that what we would be bringing to your school would have added value for you and that it could produce a positive impact on students in your school.

School Name: Principal Name:

Email: Telephone:

School Characteristics

  1. How many 9th grade students are enrolled in fall 2013?



  2. What is the structure of your school?

    • Grade structure? (e.g., 7–12, 9–12)

    • Campus structure? (e.g., multiple campuses, 9th grade campus separate)

    • Would you consider your school to be an alternative education setting? (e.g., magnet, community/charter, vocational/technical school)



  3. Do you have any alternative programs within your school? (e.g., magnet, vocational/technical school, credit recovery program)

  4. How many grading periods do you have at your school? (i.e., times at which students are awarded an actual credit, not just marking periods with progress reports)

    • Full year (once per year)

    • Semester (twice per year)

    • Trimester (three times per year)

    • Quarterly (four times per year)



  5. What kind of grading scale do you use?

    • A traditional 4.0 or 5.0 point scale, or A through F?

    • Do you use competency-based grades?



  6. What was your school’s graduation rate for the 2012–13 academic year?

  7. What was your school’s graduation rate for the 2011–12 academic year?

  8. What is the source of the graduation rate? (e.g., report card, DOE graduation rate)

  9. Are you a Race to the Top (RttT) school? Can you describe your engagement with RttT? (only applicable to Ohio)

    • Yes, RttT School/District

    • No, not involved with RttT


  10. What types of computers are used in your school by staff? (check all that apply)

    • Mac

    • PC

    • iPads or other tablets


  11. What version of Microsoft Office do you currently have installed on your computers?

    • PC – MS Office 2013

    • PC – MS Office 2010

    • PC – MS Office 2007

    • PC – MS Office 2003

    • PC – MS Office XP (2002)

    • PC – MS Office 2000

    • PC – MS Office 97

    • PC – MS Office 95

    • Mac – MS Office 2011

    • Mac – MS Office 2008

    • Mac – MS Office 2004

    • Mac – MS Office v. X

    • Mac – MS Office 2001

    • Mac – MS Office 98


Data-Driven Dropout Prevention Efforts

Next we would like to talk more about whether and how your school is currently using data to identify students who may be at risk of not graduating on time.

For each of the following questions, the interviewer will allow the interviewee to respond and then select the most appropriate response option.

  12. Does your high school have an early warning system tool that identifies students who may be at risk of not graduating?

    • Yes, we have an early warning system tool used to identify students who may be at risk of not graduating in my high school.

If yes, what is the name of the system or tool?

    • No, my high school does not currently have an early warning system tool to identify students who may be at risk of not graduating.

    • I'm not sure.



  13. Does your high school use a student information system (SIS) or another data tool to identify students who may be at risk of not graduating from high school on time?

    • Yes, we have a tool to identify students who may be at risk of not graduating in my high school.

    • No, my high school does not currently have a SIS or other data tool to identify students who may be at risk of not graduating in my high school. IF NO, SKIP TO QUESTION 15.

    • I'm not sure.

If yes, please specify:

  14. Tell me about the indicators in your tool or SIS.

    • What are the indicators?

    • What are the cut-offs based on—does the tool use evidence-based or locally validated indicators?

IF THE SCHOOL DOES NOT HAVE AN EWS OR SIS TOOL THAT FLAGS AT-RISK STUDENTS:

  15. Does your school review student attendance data to determine which students may be at risk (i.e., missing more than X days per year)?

    • No

    • Yes

If yes:

    • For which grades are you examining student attendance data (circle all that apply)?

      1. Grade 9

      2. Grade 10

      3. Grade 11

      4. Grade 12



    • How often and when do you review attendance data?



    • How do you determine which students are at risk (i.e., what is the cut point for when students have too many absences)?



    • How did you make the decision to use that cut point (e.g., district policy, state policy, based on research, experience at your school)?



    • What happens with students whose absences exceed the threshold? ___________________



    • What are the strategies or interventions that you use to support these students?



    • Are these strategies required or suggested to students?



    • At what grade levels are they offered or required?



Note: It is important to get a strong sense of whether the school is going beyond notification practices (often required by law) to working with the student and his/her parents to identify the root cause of the attendance challenge.


  16. Does your school review student course performance data (including course failures, credit deficiencies) to determine which students may be at risk?

    • No

    • Yes

If yes:

    • For which grades are you examining course performance data (circle all that apply)?

      1. Grade 9

      2. Grade 10

      3. Grade 11

      4. Grade 12



    • How often and when do you review these data?



    • How do you determine which students are at risk (i.e., how many course failures or what is the credit deficiency that would identify which students are at risk)?



    • How did you make the decision to use that cut point (e.g., district policy, state policy, based on research, experience at your school)?



    • What happens with students whose course failures or credit deficiencies exceed the threshold?



    • What are the strategies or interventions that you use to support these students?



    • Are these strategies required or suggested to students?



    • At what grade levels are they offered or required?

Note: It is important to get a strong sense of whether the strategies to address credit deficiencies are targeted primarily toward upper grades (11 and 12) or also to lower grades (9 and 10). It is also important to understand whether interventions (e.g., credit recovery) are required of Grade 9 and 10 students or just encouraged (e.g., summer credit recovery). We want to know whether the school is making an active effort to identify students with course failures in Grade 9 and 10, and going beyond notification (e.g., informing students or parents of course performance) to require (not just encourage) these students to recover credit early in high school.

  17. What other kinds of student data do you look at, and for which grades? (check all that apply below)

      Data source                  N/A   Grade 9   Grade 10   Grade 11   Grade 12
      Behavior referrals           [ ]   [ ]       [ ]        [ ]        [ ]
      Behavior suspensions         [ ]   [ ]       [ ]        [ ]        [ ]
      Grade point average (GPA)    [ ]   [ ]       [ ]        [ ]        [ ]
      State assessment results     [ ]   [ ]       [ ]        [ ]        [ ]
      Other: _____________         [ ]   [ ]       [ ]        [ ]        [ ]
      Other: _____________         [ ]   [ ]       [ ]        [ ]        [ ]
      Other: _____________         [ ]   [ ]       [ ]        [ ]        [ ]
      Other: _____________         [ ]   [ ]       [ ]        [ ]        [ ]

For each additional data source:

    • How often and when do you review these data?



    • How do you determine which students are at risk (i.e., where do you draw the cut point)?



    • How did you make the decision to use that cut point (e.g., district policy, state policy, based on research, experience at your school)?



    • What are the strategies or interventions that you use to support these students?



    • Are these strategies required or suggested to students?



    • At what grade levels are they offered or required?



  18. In what other ways do you assign students to interventions or support programming?



    • We use demographic data (such as free or reduced-price lunch status).

    • We use teacher recommendations or referrals.

Specify:

    • We use other types of data.

Specify:



  19. Can you please tell us about any other interventions, programs, or strategies you currently have in place to support students?



  20. Does your school have a team or group of individuals that focuses on students who are identified as at risk of not graduating from high school?

    • Yes, we have a dedicated school-based team.

    • Yes, we have a school-based team that focuses on students who are identified as at risk of not graduating from high school, but it is part of another team.

    • Yes, the district has a team that focuses on students who are identified as at risk of not graduating from high school.

    • No, there is no team at the district or high school levels.

    • Other (please specify)



  21. How often does your school review data to identify students at risk of not graduating?

    • 4 times per year (once per quarter)

    • 3 times per year

    • 2 times per year

    • 1 time per year

    • Other

Specify:

    • I'm not sure.



  22. Do teachers in your school have access to student data that identify students at risk of not graduating?

    • Yes, and all teachers access it regularly

    • Yes, and some teachers access it regularly

    • Yes, and some teachers access it occasionally

    • Yes, but very few teachers actually access it

    • Yes, but no teachers actually access it

    • No, teachers do not have access to this type of data

    • Other (please specify)



  23. Please describe the nature of collaboration around student data in your school.

Probes:

    • Are these collaborations generally formal (scheduled meeting times) or informal?

    • Who participates in these meetings?

    • What is the focus of these meetings? (data analysis, sharing/discussing student work, lesson planning, ways to improve or modify instruction, focus on individual students, focus on problem content areas, etc.)

    • What are the goals of these meetings? (Identify students who need additional support? Identify school or class-level instructional issues? Create instructional plans?)

    • Do teachers review other data (e.g., interim assessment data, curriculum-based assessments, teacher-created assessments) along with data that identify students at risk of not graduating?



  24. Does your school have a system to monitor students' progress in interventions/supports to which they are assigned?

    • Yes (please specify)

    • No

    • Unsure



  25. Has your school received any professional development on data use (in-service, pre-service, ongoing coaching)?

If yes:

    • Who delivered the PD (Is this delivered by district and/or school staff)?

    • Is the training mandatory?

    • What has been the focus (content) of the professional development sessions?

    • What was the duration of the PD? (One-time training? Is ongoing training provided or available?)

    • What is the mode of delivery of the PD? (Is it face to face? Online? Video?)

    • What materials have been provided to support your and/or your staff’s learning? (Are there any sample materials that we can take a look at?)

If no:

    • What type of professional development would be most useful?



  26. Is there anything else you would like to share with us about how your school approaches dropout prevention interventions and strategies?

Major Initiatives at Your School or District

  27. What are the top three initiatives at your school currently? These could be school-, district-, or state-driven initiatives that your school prioritizes.

1.

2.

3.

  28. What are the top three initiatives planned for the next two years? (e.g., new curriculum, current or new focus for PD)

1.

2.

3.

  29. Do you have any plans to develop or implement an early warning system within your school or as part of a larger district initiative before the fall of 2015?

    • Yes

    • No

    • Maybe

If yes or maybe:

    • What do you anticipate developing or implementing before fall of 2015?

District Policies About Research

  30. What are your district’s policies around participating in research projects?

    • Is there a formal research review process in your district that we would need to follow before we move forward?

    • Who ultimately needs to sign off on participation in research projects for your school?





A-6. Telephone District Screening Protocol/Interview

Paperwork Burden Statement

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid Office of Management and Budget (OMB) control number. The valid OMB control number for this information collection is XXXX-XXXX. The time required to complete this information collection is estimated to average 30 minutes per response. This information collection is voluntary. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202–4651. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: Christopher Boccanfuso, U.S. Department of Education, Institute of Education Sciences, Room 506D, 555 New Jersey Ave. NW, Washington, DC 20208-5500.

Draft District Screening Protocol

Introductory script: Thank you for the opportunity to meet with you today and discuss the Impact of an Early Warning and Intervention Monitoring System (EWIMS) on Student and School Outcomes Study, funded by the Institute of Education Sciences within the U.S. Department of Education. We are very excited about your district’s interest in the study and are looking forward to talking to you about the potential participation of high schools within your district. Participation in this study is voluntary for schools. Today we will ask you questions about how your district supports schools in working with students who are at risk of not graduating on time.

District Name: District Contact:

Email: Telephone:

District Characteristics

  1. How many high schools (serving Grades 9–12) do you have in your district?

What are the names of those high schools, and what were their graduation rates for the 2012–13 academic year? (Note: prepopulated with names, if available)



  2. What methodology was used to calculate the graduation rate?





Data-Driven Dropout Prevention Efforts

  3. Does your district promote the use of a student information system, an early warning system (EWS), or another data tool to identify students who may be at risk of not graduating from high school on time? [interviewer will allow the interviewee to describe their process and will select the most appropriate response]

    • Yes, we have a district-wide tool to identify students who may be at risk of not graduating, and we use the tool in all schools.

    • Yes, we have a district-wide tool to identify students who may be at risk of not graduating, and we use the tool in some schools.

If yes, please specify which schools:

    • No, the district does not promote and/or high schools do not use a tool to identify students who may be at risk of not graduating.

    • I'm not sure.

  4. Can you please describe the tool?



  5. Does your district have an early warning system team or group of individuals that focuses on students who are identified as at risk of not graduating from high school? [interviewer will allow the interviewee to describe their process and will select the most appropriate response]

    • Yes, the district has a team that focuses on students who are identified as at risk of not graduating from high school.

    • No, there is no team at the district or high school levels.

    • Other



If yes, specify who serves on this team:



  6. Does your district have a policy or practice on how schools should assign at-risk students to supports and/or interventions?



Specifically, how are data used within this process (e.g., how often and to what extent)?

Does this vary by school? If so, how?


  7. Does your district have a system for schools to monitor individual high school students' progress in interventions/supports to which they are assigned?

    • Yes

    • No

    • Unsure

If "yes," please specify which high schools use it:

  8. If your district monitors students' progress in interventions/supports to which they are assigned, what data are used in this process? [interviewer will allow the interviewee to describe their process and will check all that apply]

  • Attendance

  • State assessment results

  • Course performance

  • Data from students' classroom teachers

  • Core course performance

  • Data from other adults who interact with the student (e.g., counselors, coaches, etc.)

  • Credits earned

  • Behavior referrals

  • Behavior suspensions

  • Information from the at-risk (flagged) student

  • Information from at-risk (flagged) students' parent/guardian

  • Other

If "other," please specify:

  9. Is there anything else you would like to share with us about how your district approaches dropout prevention interventions and strategies?



A-7. Final Agreement Form (School MOU)


School Roles and Responsibilities:
Early Warning and Intervention Monitoring System Study



Dear <principal name>,

On behalf of American Institutes for Research (AIR) (www.air.org) and its partners, we welcome you to the Early Warning and Intervention Monitoring System (EWIMS) study. We are excited about this project. Its purpose is to examine the impact of EWIMS on (1) student outcomes, including student risk status for dropout, scores on state assessments, persistence and progress in school, and likelihood of on-time graduation; and (2) school outcomes, including how schools allocate dropout prevention interventions for students and their data-use culture. This document contains an overview of the study and a brief description of the intervention, followed by a description of the roles and responsibilities for your school and for the study team, including the benefits of participation and the project timeline. Please review the contents of this document and sign the last page to indicate your agreement to participate. Return the signed last page to Dr. Nicholas Sorensen ([email protected], fax 312-288-7601).

Overview

The primary goal of the EWIMS study is to evaluate whether implementing an early warning and intervention monitoring system for identifying students at risk of dropping out, and using this system to assign students to dropout prevention interventions, will improve student outcomes, including student risk status for dropout, scores on graduation tests, persistence and progress in school, and likelihood of on-time graduation. In addition, this study will examine the impact of implementing EWIMS on school outcomes, including how schools allocate dropout prevention resources for students and their data-use culture. Study results will yield valuable information for the state of <state name> and for districts and schools across the country about the viability and benefits of using an early warning system to prevent high school dropout and help struggling students get back on track for eventual graduation. The study is being funded by the U.S. Department of Education, Institute of Education Sciences, and will be conducted from March 2014 through the spring of 2016.

We look forward to working with <High School> as a partner in this project!

The EWIMS Model

The EWIMS model, developed by the National High School Center at AIR, is a multistep process intended to encourage systematic and comprehensive implementation within schools. The process is based on a combination of research on data use in schools and the National High School Center’s experience working with states, districts, and schools implementing early warning systems.

At the heart of the EWIMS process is an early warning data tool used to flag students as “at risk” based on attendance, course performance (grades, credits, grade point average [GPA]), and behavior indicators. The tool enables schools to identify students who are at risk of dropping out of school, record assignments to available interventions, and monitor students’ response to those interventions.
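For illustration only, the following minimal Python sketch shows the kind of threshold-based flagging logic such a tool applies. The thresholds, field names, and student records below are hypothetical assumptions for demonstration purposes; they are not the EWS tool’s actual settings, which schools customize during setup.

    # Illustrative sketch only: thresholds, field names, and records are
    # assumed for demonstration and are not the EWS tool's actual settings.

    ATTENDANCE_CUTOFF = 0.10   # assumed: flag if more than 10% of days missed
    COURSE_FAILURE_CUTOFF = 1  # assumed: flag at one or more core course failures
    GPA_CUTOFF = 2.0           # assumed: flag if GPA is at or below 2.0

    def flag_student(record):
        """Return the list of risk indicators this student record triggers."""
        flags = []
        if record["days_absent"] / record["days_enrolled"] > ATTENDANCE_CUTOFF:
            flags.append("attendance")
        if record["core_course_failures"] >= COURSE_FAILURE_CUTOFF:
            flags.append("course performance")
        if record["gpa"] <= GPA_CUTOFF:
            flags.append("GPA")
        return flags

    # Two hypothetical student records imported from a student information system.
    students = [
        {"id": "A101", "days_absent": 12, "days_enrolled": 90,
         "core_course_failures": 0, "gpa": 3.1},
        {"id": "A102", "days_absent": 4, "days_enrolled": 90,
         "core_course_failures": 2, "gpa": 1.8},
    ]

    for student in students:
        flags = flag_student(student)
        if flags:
            print(student["id"], "flagged for:", ", ".join(flags))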



Beyond the data tool itself, the National High School Center has devised a seven-step process to support EWIMS implementation. The process guides users to make informed decisions about how to use data to support at-risk students and how to continue to monitor their progress over time. In addition to focusing on individual students, the process guides users to examine the success of specific supports or interventions and to examine possible systemic issues (e.g., school climate) that may relate to dropout trends.

Figure A-1. Early Warning Intervention Monitoring System Implementation Process


As shown in Figure A-1, the steps are intended to be cyclical. At the core of this data-driven decision-making process, the steps focus users on key indicators that identify which students are showing signs of risk of dropping out of high school, and they guide users to go beyond the indicator data, drawing on other relevant information to connect at-risk students to dropout prevention or academic support interventions. The EWIMS model does not prescribe specific interventions for schools to implement. Instead, the model is designed to allow schools flexibility to decide which interventions they believe are most effective for their students’ needs. Ideally, the EWIMS model allows users to identify students accurately and provide supports and interventions of your school’s choosing to at-risk students, resulting in improved outcomes for students, including higher attendance rates and improvement in academic performance leading toward successful graduation.

This Study

Despite increasingly widespread implementation of early warning systems by states, districts, and schools, there have been no rigorous studies testing the impact of using an early warning system to improve student outcomes such as staying in school, progressing in school, and graduating. There is also little research examining how using an early warning system can shape a school’s culture for data use—increased data-driven decision making (assigning interventions to students) and processes and professional development to support using data to improve teaching and learning. This study will address these gaps and provide the first rigorous test of the impact of an early warning system. The study will:

  • Identify a sample of eligible and interested schools in <state name>. The study team will conduct outreach to schools that meet initial eligibility criteria to confirm eligibility and discuss interest in participating in the EWIMS study. The study will include approximately 70 high schools in the Midwest. To qualify, schools must (1) have at least 150 ninth-grade students; (2) have a graduation rate between 25 and 95 percent; and (3) not already be implementing an early warning system tool for using data to flag at-risk students.

  • Use a lottery to randomly assign half of the participating schools to implement EWIMS in March 2014 and the other half to implement in fall 2015. Half of participating schools will be randomly assigned to receive access to the EWIMS model in March 2014 (including the tool and technical support for implementation), and the other half will conduct “business as usual” for identifying at-risk students until fall 2015, when they will receive the same resources and supports. (An illustrative sketch of this kind of lottery appears after this list.)

  • Implement EWIMS. All schools will implement the seven-step EWIMS process either in March 2014 or the fall of 2015 (depending on random assignment by lottery). The seven steps for implementation are as follows:

    • Step 1—Establish Roles and Process. In this step, the composition of the EWIMS team is established; team members then determine the frequency and duration of meetings and develop a shared vision or focus for the team’s work.

    • Step 2—Use the EWS tool. In this step, the school-based EWIMS teams are trained on the use and purpose of the tool itself. This step includes initial customization of the tool settings and import of student demographic and administrative data, as well as ongoing refreshes of the administrative data in the tool and the running of the automated and custom lists and reports available within the tool.

    • Step 3—Analyze EWS data. In Step 3, EWIMS teams focus their attention on student- and school-level data, based on the indicators available in the tool. This data review process is intended to identify areas of focus and further investigation.

    • Step 4—Interpret EWS data. Step 4 guides teams to bring in additional data (external to the tool) to provide more context and a fuller picture to inform the EWIMS team’s consideration of specific needs of individuals or groups of flagged students. Unlike Step 3, which is focused on the indicator flags themselves (i.e., the data in the tool), this step addresses root causes of why students might be identified as at risk for one or more indicators. The implementation team will provide training to identify root causes that focus on acquiring additional formal (e.g., administrative records) and informal (e.g., from teacher, family, and student) input. This training will occur face to face and will last for one hour.

    • Step 5—Assign and Provide Interventions. In this step, EWIMS team members make informed decisions about the allocation of available resources and strategies to support students identified as at risk of dropping out of high school. The EWIMS team matches individual students to specific interventions after having gathered information about (1) potential root causes for individual flagged students (Step 4) and (2) the available dropout prevention and academic and behavioral support programs in the school, district, and community, which are locally determined.

    • Step 6—Monitor Students. In this step, EWIMS teams continue to examine student indicators at regular intervals to continually identify students who show signs of being at risk. The teams will use the same indicators to closely monitor already-identified students who were assigned to interventions for progress in school and risk status. This step provides critical ongoing feedback about additional student- and school-level needs and apparent successes.

    • Step 7—Evaluate and Refine the EWIMS. Through active and structured reflection, EWIMS team members assess whether students are responding to assigned interventions, revise their specific strategies or general approach as needed, and determine how resources are allocated to improve support for students. This step encourages EWIMS teams to make course corrections to all parts of the EWIMS implementation. As implied by the cyclical depiction of the seven-step process, this step (as well as the other six) reflects an ongoing process of continual improvement.

  • Evaluate the effects of EWIMS on student and school outcomes. The study will examine student outcomes for all students in grades 9 and 10 during the 2013–14 school year and all students in grades 9 through 11 during the 2014–15 school year. All student outcome data will be collected from school or district administrative data, the EWS tool, or the <State Department of Education>. In addition, all participating schools will be asked to complete an annual Web-based survey about data use practices, and schools randomly assigned to implement EWIMS in March 2014 may be asked to participate in interviews about their experiences using the tool.
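To make the lottery concrete, the following minimal Python sketch randomly splits a placeholder list of participating schools into a March 2014 implementation group and a fall 2015 delayed-implementation group. The school identifiers, the fixed seed, and the simple unblocked design are assumptions for demonstration; the study team’s actual randomization procedure may differ (for example, by blocking schools on state or graduation rate).

    # Illustrative sketch of school-level random assignment; school IDs, the
    # seed, and the unblocked design are assumptions for demonstration only.
    import random

    schools = [f"School_{i:02d}" for i in range(1, 71)]  # ~70 participating schools

    rng = random.Random(20140301)  # fixed seed so the lottery is reproducible
    shuffled = schools[:]
    rng.shuffle(shuffled)

    half = len(shuffled) // 2
    treatment = sorted(shuffled[:half])  # implement EWIMS in March 2014
    control = sorted(shuffled[half:])    # "business as usual" until fall 2015

    print(len(treatment), "treatment schools;", len(control), "control schools")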



School Roles and Responsibilities

We look forward to partnering with <High School Name> for this exciting project! More detailed information on the responsibilities of participating schools follows. Final determination of school eligibility requires willingness to adhere to the study guidelines and responsibilities. Please note that your school’s participation in this project is voluntary. <High School Name> will not be penalized in any way for not participating and you may discontinue participation at any time without penalty.

  • Maintain a sustained commitment to participate in the study. To evaluate the impact of EWIMS, it is critical that schools agree to adhere to the study guidelines and timelines associated with random assignment by lottery to implement EWIMS in March 2014 or fall of 2015. All schools are expected to implement the EWIMS model as designed (including all seven steps of the implementation cycle).

  • Adhere to the results of the lottery/random assignment process. It is essential that the groupings that result from the lottery remain intact over the course of the study. Schools assigned by lottery to implement the EWIMS model in March 2014 will serve as the treatment group. Their counterparts assigned to implement EWIMS in fall of 2015 will serve as a control group from March 2014 through the spring of 2015. Schools randomly assigned to the control group must continue with “business as usual” practices for identifying at-risk students and assigning dropout prevention interventions until EWIMS implementation begins in the fall of 2015. Students in control schools should continue to receive any services that would be offered to them in the absence of the study. As for students in the “treatment” group, no “typical” services should be withheld. Please refer any questions or concerns from parents or school staff about this to the study team.

  • Participate in all data collection activities. Participating schools should provide school-level information, including high school graduation rates, average state achievement scores in reading and mathematics, and demographics (e.g., percentage of students receiving free or reduced-price lunch).

Administrative student-level data collection in participating schools will focus on all students in grades 9 and 10 during the 2013–14 school year and all students in grades 9, 10, and 11 during the 2014–15 school year. The study team will obtain as much administrative data as possible from the <State Department of Education (DOE)> and the school district. However, all participating schools should provide the study team access to the following administrative data for students should these data be unavailable through other sources:

    • Demographic information (e.g., race/ethnicity, gender, free or reduced-price lunch [FRPL], individualized education program [IEP], and English language learner [ELL] status, and parents’ education)

    • Grade point average (GPA)

    • State test scores

    • Attendance rates

    • Course grades in core academic courses by semester

    • Credits earned by semester

    • Disciplinary information (e.g., suspensions)

    • Enrollment information (e.g., whether students are enrolled or have left school for reasons other than transfer to another district, including dropping out)

    • Grade promotion

In addition to administrative records, one administrator at each school should complete an annual Web-based survey assessing how schools use data to allocate dropout prevention resources to students.

Finally, all schools assigned by lottery to the treatment group (implementing EWIMS in March 2014) should participate in data collection efforts focused on implementation. Specifically, the project team will collect data on attendance and satisfaction with EWIMS training sessions and meetings. EWIMS teams at schools assigned to the treatment group will be required to submit their EWS tool securely to the study team and may be asked to participate in interviews about their implementation experience.

EWIMS implementation (either in March 2014 or fall of 2015) will require the following:

  • Develop an EWIMS team within your school. A diverse, well-informed EWIMS team within your school is essential to the success of this process. The EWIMS team may be established as a new team or may build on or be integrated into existing teams (school improvement team, response to intervention team, student support team). It is not necessary to create an entirely new team for EWIMS work, but an existing team that takes on the responsibility to use the tool for dropout prevention efforts should include a broad representation of staff within the school (e.g., principals, representatives from feeder elementary/middle schools, guidance counselors, teachers, specialists). The EWIMS team is responsible for identifying students who are at risk and ensuring that their individual needs are met through school-based interventions. In most cases, this team is not directly responsible for delivering interventions to students; rather, its focus should be on helping students navigate the school systems to access appropriate and needed services.

  • Participate in EWIMS professional development. The EWIMS team will receive professional development on the EWIMS process and tool capabilities, and subsequently be given adequate time to implement the EWIMS process. The professional development activities include the following:

  • One two-hour training on how to use the tool (e.g., uploading data) for the individual who will manage data entry (also a member of the EWIMS data team). The project team anticipates that these training sessions will be held on site at each participating school.

  • Full-day, in-person regional training (estimated to be no more than 100 miles from any participating school) on the seven-step EWIMS process and model. The project will cover mileage for up to five building faculty/staff and potentially $500 for substitute teachers if the school elects to have one or two teachers join the EWIMS data team.

  • Two two-hour webinars

    • Reviewing data and monitoring progress over time (all five team members)

    • Evaluating and refining the EWIMS process (all five team members)

  • Monthly one-hour conference calls for a community of practice of all participating schools (minimum one person per team must participate)

  • Import student data into the EWS tool. A member of the EWIMS team is responsible for entering, or importing, data into the EWS tool, which enables the EWIMS team to use the data to flag students as at risk. Participating schools should upload attendance data at the 20- or 30-day mark and after every grading period. Course performance, GPA, and behavioral data (optional) should be uploaded after every marking period.

  • Produce reports of at-risk students and assign students to appropriate interventions or services. The tool houses information about the interventions assigned to each student and documents students’ transitions in and out of each intervention and their ultimate response to the intervention(s) (i.e., for each student flagged, did the assigned intervention(s) reduce the number of flags calculated during subsequent grading periods?); a minimal sketch of this flag-count comparison follows this list. The EWS tool does not prescribe specific interventions for students based on the type or number of indicators; rather, it relies on EWIMS teams to make data-driven decisions, within their own local context of available interventions, to match students with interventions.

  • Conduct EWIMS monthly team meetings that are organized and documented. An agenda for each meeting should be prepared at the end of the prior meeting, and at least some agenda items should be routine, such as a review of the data from the tool, actions taken for individual or groups of students, a review of previous meetings’ action items (ongoing or completed), new action items, and communication with staff and leadership. Notes should be taken at each meeting and include action items assigned to specified individuals to accomplish. Agenda, meeting notes, and a faculty/staff sign-in sheet should be kept on file to provide a record of the team’s work.

  • Communicate with individuals and groups outside of the EWIMS team. Information on flagged students, intervention effectiveness, and team-identified needs to support students should be routinely reported to and discussed with school and district leadership. Teachers should receive regular updates about students in their classes who are displaying indicators of risk, as well as input about supports available to them to use with these students. Last, students and their parents should be engaged in the conversation about their risk status and the plans to ensure that they are able to get back on track for graduation. Although the EWIMS team may not be directly responsible for meetings with individual students and their parents (i.e., delivering the individual interventions), the team should be in a position to prompt such meetings or to share information routinely about student progress and the early warning signs of risk. Of critical note, the team should share the knowledge of students’ risk with sensitivity, ensuring that identification is used to prompt action and support, not to assign labels that carry stigma.

  • Solicit feedback from stakeholders. Feedback from administrators, teachers, staff, students, and parents can help the EWIMS team uncover underlying causes for students displaying indicators of risk. This information may help the EWIMS team match students to appropriate interventions and supports.

  • Monitor progress. The EWIMS team should monitor progress as it strives to improve educational outcomes for students during a single school year and over the course of multiple school years. The team should be responsible for presenting progress reports to key stakeholders, including principals, staff, district leadership, the local board of education, and parents.
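As a concrete picture of the flag-count comparison noted above, the following minimal Python sketch compares each student’s number of risk flags across grading periods. The flag counts and the simple “fewer flags” decision rule are hypothetical assumptions for demonstration; EWIMS teams would interpret such changes alongside other information about the student.

    # Illustrative sketch of monitoring response to intervention via flag
    # counts; the data and the "fewer flags" rule are assumptions only.

    # Number of risk flags per grading period for two hypothetical students.
    flag_history = {
        "A102": [3, 2, 1],  # flags in grading periods 1, 2, and 3
        "A107": [2, 2, 3],
    }

    for student, counts in flag_history.items():
        baseline, latest = counts[0], counts[-1]
        if latest < baseline:
            status = "fewer flags; student appears to be responding"
        elif latest == baseline:
            status = "no change in flags; continue monitoring"
        else:
            status = "more flags; revisit the intervention assignment"
        print(f"{student}: {baseline} -> {latest} flags ({status})")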

Study Team Role and Responsibilities

The project team is composed of researchers from the Midwest Regional Educational Laboratory at AIR. The major responsibilities for the study team are as follows:

  • Obtain necessary approvals from review boards (federal, district, and organizational Institutional Review Boards) and comply with the research protocols in place.

  • Provide access to the EWS tool and training and technical support for implementation.

  • Collect data for the study. The majority of the data for this study will be administrative records transmitted from the district, thus minimizing the data collection burden on participating schools. The study team also will conduct an annual Web-based survey of all schools and collect all implementation data from schools assigned to the treatment group (EWS tool data, interviews with EWIMS team members).

  • Assure confidentiality. The study team will collect data only for the purposes of this study and will not use or allow the use of the data for evaluating individual participants, schools, or districts.

    • Each participant will be assigned a study-specific identification number, in place of their names. A data file that links each participant with their identification number will be kept in a password-protected file that only the study team can access.

    • The published analysis of the results will aggregate results across all schools and will not include results that have been disaggregated by school or district.

    • All members of the study team are required to complete a comprehensive training course that addresses current federal government standards and sign federal data confidentiality agreements.

  • Analyze data and produce reports. The study team will be responsible for aggregating information about the effectiveness of EWIMS on student and school outcomes. The study team expects that the final report will be released in 2016, pending the federal review process, and will ensure that participating schools receive this report.

Timeline


Table A-1 presents the major tasks of the project as they were described previously.

Table A-1. Major Tasks of the Project

Tasks                                                               Dates

EWIMS Implementation
  Treatment schools implement EWIMS with Grades 9 and 10            March 2014–June 2014
  Treatment schools implement EWIMS with Grades 9, 10, and 11       August 2014–June 2015
  Control schools implement EWIMS                                   August 2015–June 2016

Data Collection
  Collect administrative records from the state and district        March 2014–June 2015
  Collect EWS tool data from treatment schools                      March 2014–June 2015
  Conduct annual Web-based survey                                   May 2014 and May 2015
  Conduct interviews with EWIMS team members in treatment schools   June 2014 and June 2015

Analysis and Reporting
  Draft and submit final report                                     December 2015



EWIMS implementation is aligned with the academic calendar. The school-based EWIMS teams meet monthly, with other critical activities occurring prior to school beginning, after the first 20 or 30 days of school, shortly after the end of each grading period, and at the end of the academic year. Table A-2 details expected key activities of EWIMS implementation over the course of an academic year.

Table A-2. Schedule and Key Activities for Early Warning Intervention and
Monitoring System Implementation

Year 1 Activities

Schedule and Key Activities (aligned to the Early Warning Intervention and Monitoring System [EWIMS] implementation steps)

March/April 2014


  • Forming/designating an EWIMS team (Step 1)

  • Setting up the early warning system (EWS) Tool (Step 2)

  • Convening monthly EWIMS team meetings (Step 1)

  • Importing or entering students’ absences, course failures, and behavior information (e.g., referrals and suspensions, by grading period) (Step 2)

  • Reviewing and interpreting student- and school-level reports (Steps 3 and 4)

  • Identifying and implementing student interventions (Step 5)

  • Monitoring students’ responses to existing interventions in which they are participating (Step 6)

  • Revising students’ intervention assignments, as needed (Steps 5 and 6)

At the end of the school year (~June 2014)


  • Updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)

  • Importing or entering students’ absences, course failures, and behavior information (e.g., referrals and suspensions), by grading period, if applicable (Step 2)

  • Reviewing and interpreting student- and school-level data (Steps 3 and 4)

  • Identifying and implementing new student interventions (Step 5)

  • Monitoring students’ responses to existing interventions in which they are participating (Step 6)

  • Revising students’ intervention assignments for summer and for the next academic year, if needed (Steps 5 and 6)

  • Evaluating the EWIMS process, using student- and school-level reports, and revising as necessary (Step 7)

  • Exporting student data to (1) prepare the EWS tool for the next school year and/or (2) share data with the high school(s) of students who are transitioning to high school.


Year 2 Activities

Schedule and Key Activities (aligned to the Early Warning Intervention and Monitoring System [EWIMS] implementation steps)

At the beginning of the school year (~August 2014)


  • Reconvening the EWIMS team meetings (Step 1)

  • Importing or entering student information and, if available, incoming risk indicator data into the EWS Tool (Step 2)

  • Reviewing and interpreting student needs based on data from the previous year (e.g., review the Overage Student Report) (Steps 3 and 4)

  • Verifying student information, especially enrollment status, and updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)

  • Reviewing incoming risk indicators or previous year data, including any additional information (e.g., bridge program participation, summer school participation, prior course performance), to review and interpret student needs (Steps 3 and 4)

  • Identifying and implementing student interventions or supports based on incoming risk indicator information, if available (Step 5)

After the first 20 or 30 days of the school year (~October 2014)


  • Updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)

  • Importing students’ absences (Step 2)

  • Reviewing and interpreting student- and school-level reports (Steps 3 and 4)

  • Identifying and implementing student interventions (Step 5)

  • Monitoring students’ initial response to interventions (Step 6)

  • Revising students’ intervention assignments, as needed (Steps 5 and 6)

After the midyear grading period (~February 2015)




  • Updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)

  • Importing or entering students’ absences, course failures, and behavior information (e.g., referrals and suspensions, by grading period) (Step 2)

  • Reviewing and interpreting student- and school-level reports (Steps 3 and 4)

  • Identifying and implementing student interventions (Step 5)

  • Monitoring students’ responses to existing interventions in which they are participating (Step 6)

  • Revising students’ intervention assignments, as needed (Steps 5 and 6)

At the end of the school year (~June 2015)




  • Updating student roster to reflect new enrollees, transfers in and out, and so forth (Step 2)

  • Importing or entering students’ absences, course failures, and behavior information (e.g., referrals and suspensions), by grading period, if applicable (Step 2)

  • Reviewing and interpreting student- and school-level data (Steps 3 and 4)

  • Identifying and implementing new student interventions (Step 5)

  • Monitoring students’ responses to existing interventions in which they are participating (Step 6)

  • Revising students’ intervention assignments for summer and for the next academic year, if needed (Steps 5 and 6)

  • Evaluating the EWIMS process, using student- and school-level reports, and revising as necessary (Step 7)

  • Exporting student data to (1) prepare the EWS tool for the next school year and/or (2) share data with the high school(s) of students who are transitioning to high school.


Benefits of Participation

There are many benefits of participation for your high school. Critical indicators in ninth and tenth grade that powerfully predict whether students are “on track” for high school graduation can be used as part of an early warning system to flag at-risk students early, assign appropriate interventions, and get students back on track. Participating in this high-profile, large-scale study will give your school and district an opportunity to access the Early Warning and Intervention Monitoring System (developed by the National High School Center at AIR) at no cost. The EWIMS model, currently in use in 67 districts in six states, includes both an Excel-based tool and training and technical support for implementation. Your participation in this study will also play an important role in informing educational policy focused on dropout prevention in <state name> and at the federal level.


Questions or Comments

If you have any questions or comments about the study or the opportunity it provides for your school, please feel free to contact Dr. Nicholas Sorensen ([email protected] or 312-283-2318) or Dr. Mindee O’Cummings ([email protected] or 202-403-5254).





Signatures of Commitment

Return via fax (312-288-7601) or e-mail ([email protected])

The following people have read this document detailing the study and agree to the roles, responsibilities, and conditions of participation on behalf of <High School Name> and the study team.



<insert name> <insert title>

District Representative Signature Printed Name Title Date



Principal

Principal Signature Printed Name Title Date



Jessica Heppen Co-Principal Investigator

Principal-Investigator Printed Name Title Date



Mindee O’Cummings Co-Principal Investigator

Principal-Investigator Printed Name Title Date



Ann-Marie Faria Project Director

Project Director Printed Name Title Date



Nicholas Sorensen Deputy Project Director

Project Director Printed Name Title Date


Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.

A-8. Final Agreement Form (District MOU)



District Roles and Responsibilities:
Early Warning and Intervention Monitoring System Study



Dear <superintendent>,

On behalf of American Institutes for Research (AIR) (www.air.org) and its partners, we welcome you to the Early Warning and Intervention Monitoring System (EWIMS) study. We are excited about this project. Its purpose is to examine the impact of EWIMS on (1) student outcomes, including student risk status for dropout, scores on state assessments, persistence and progress in school, and likelihood of on-time graduation; and (2) school outcomes, including how schools allocate dropout prevention interventions for students and their data-use culture. This document contains an overview of the study and a brief description of the intervention, followed by a description of the roles and responsibilities for your district and for the study team, including the benefits of participation and the project timeline. Please review the contents of this document and sign the last page to indicate your agreement to participate. Return the signed last page to Dr. Nicholas Sorensen ([email protected], fax 312-288-7601).

Overview

The primary goal of the EWIMS study is to evaluate whether implementing an early warning and intervention monitoring system for identifying students at risk of dropping out, and using this system to assign students to dropout prevention interventions, will improve student outcomes, including student risk status for dropout, scores on graduation tests, persistence and progress in school, and likelihood of on-time graduation. In addition, this study will examine the impact of implementing EWIMS on school outcomes, including how schools allocate dropout prevention resources for students and their data-use culture. Study results will yield valuable information for the state of <state name> and for districts and schools across the country about the viability and benefits of using an early warning system to prevent high school dropout and help struggling students get back on track for eventual graduation. The study is being funded by the U.S. Department of Education, Institute of Education Sciences, and will run from January 2014 through the spring of 2016.

We look forward to working with <district name> as a partner in this project!




The EWIMS Model

The EWIMS model, developed by the National High School Center at AIR, is a multistep process intended to encourage systematic and comprehensive implementation within schools. The process is based on a combination of research on data use in schools and the National High School Center’s experience working with states, districts, and schools implementing early warning systems.

At the heart of the EWIMS process is an early warning data tool used to flag students as “at risk” based on attendance, course performance (grades, credits, grade point average [GPA]), and behavior indicators. The tool enables schools to identify students who are at risk of dropping out of school, record assignments to available interventions, and monitor students’ response to those interventions.



Beyond the development of the data tool, the National High School Center has devised a seven-step process to support EWIMS implementation. The process guides users to make informed decisions about how to use data to support at-risk students and how to continue to monitor their progress over time. In addition to focusing on individual students, the process guides users to examine the success of specific supports or interventions and to examine possible systemic issues (e.g., school climate) that may relate to dropout trends.

Figure A-2. Early Warning and Intervention Monitoring System Implementation Process

As shown in Figure A-2, the steps are intended to be cyclical. At the core of this data-driven decision-making process, the steps focus users on key indicators that identify which students are showing signs of risk of dropping out of high school and guide users to draw on the indicator data and other relevant information to connect at-risk students to dropout prevention or academic support interventions. The EWIMS model does not prescribe specific interventions for schools to implement. Instead, the model is designed to allow schools flexibility to decide which interventions they believe are most effective for their students' needs. Ideally, the EWIMS model allows users to identify students with accuracy and provide supports and interventions of the schools' choosing to at-risk students, resulting in improved outcomes for students, including higher attendance rates and improvements in academic performance leading toward successful graduation.

This Study

Despite increasingly widespread implementation of early warning systems by states, districts, and schools, there have been no rigorous studies testing the impact of using an early warning system on student outcomes such as staying in school, progressing in school, and graduating. There is also little research examining how using an early warning system can shape a school's culture for data use, including increased data-driven decision making (e.g., assigning interventions to students) and the processes and professional development that support using data to improve teaching and learning. This study will address these gaps and provide the first rigorous test of the impact of an early warning system. The study will:

  • Identify a sample of eligible and interested schools in <state name>. The study team will conduct outreach to schools that meet initial eligibility criteria to confirm eligibility and discuss interest in participating in the EWIMS study. The study will include approximately 70 high schools in the Midwest. To qualify, schools must (1) have at least 150 ninth-grade students; (2) have a graduation rate between 25 and 95 percent; and (3) not already be implementing an early warning system tool for using data to flag at-risk students.

  • Use a lottery to randomly assign half of the participating schools to implement EWIMS in March 2014 and the other half to implement it in the fall of 2015. Half of the participating schools will be randomly assigned to receive access to the EWIMS model in March 2014 (including the tool and technical support for implementation), and the other half will be randomly assigned to conduct “business as usual” for identifying at-risk students until the fall of 2015, when they will receive the same resources and supports for implementing EWIMS. (An illustrative sketch of such a lottery appears after this list.)

  • Implement EWIMS. All schools will implement the seven-step EWIMS process either in March 2014 or the fall of 2015 (depending on random assignment by lottery). The seven steps for implementation are as follows:

    • Step 1—Establish Roles and Process. In this step, the composition of the EWIMS team is established; team members then determine the frequency and duration of meetings and develop a shared vision or focus for the team's work.

    • Step 2—Use the EWS tool. In this step, the school-based EWIMS teams are trained on the use and purpose of the tool itself. This step includes customizing the tool settings and importing student demographic and initial administrative data at the outset, as well as refreshing the administrative data in the tool and running the automated and custom lists and reports available within the tool on an ongoing basis.

    • Step 3—Analyze EWS data. In Step 3, EWIMS teams focus their attention on student- and school-level data, based on the indicators available in the tool. This data review process is intended to identify areas of focus and further investigation.

    • Step 4—Interpret EWS data. Step 4 guides teams to bring in additional data (external to the tool) to provide more context and a fuller picture to inform the EWIMS team's consideration of the specific needs of individuals or groups of flagged students. Unlike Step 3, which is focused on the indicator flags themselves (i.e., the data in the tool), this step addresses root causes of why students might be identified as at risk on one or more indicators. The implementation team will provide training on identifying root causes, focused on acquiring additional formal (e.g., administrative records) and informal (e.g., teacher, family, and student) input. This training will occur face to face and will last for one hour.

    • Step 5—Assign and Provide Interventions. In this step, EWIMS team members make informed decisions about the allocation of available resources and strategies to support students identified as at risk of dropping out of high school. The EWIMS team matches individual students to specific interventions after having gathered information about (1) potential root causes for individual flagged students (Step 4) and (2) the available dropout prevention and academic and behavioral support programs in the school, district, and community, which are locally determined.

    • Step 6—Monitor Students. In this step, EWIMS teams continue to examine student indicators at regular intervals to continually identify students who show signs of being at risk. The teams will use the same indicators to closely monitor already-identified students who were assigned to interventions for progress in school and risk status. This step provides critical ongoing feedback about additional student- and school-level needs and apparent successes.

    • Step 7—Evaluate and Refine the EWIMS. Through active and structured reflection, EWIMS team members assess whether students are responding to assigned interventions, revise their specific strategies or general approach as needed, and determine how resources are allocated to improve support for students. This step encourages EWIMS teams to make course corrections to all parts of the EWIMS implementation. As implied by the cyclical depiction of the seven-step process, this step (as well as the other six) reflects an ongoing process of continual improvement.

  • Evaluate the effects of EWIMS on student and school outcomes. The study will examine student outcomes for all students in grades 9 and 10 during the 2013–14 school year and all students in grades 9 through 11 during the 2014–15 school year. All student outcome data will be collected from school or district administrative data, the EWS tool, or the <State Department of Education>. In addition, all participating schools will be asked to complete an annual Web-based survey about data use practices, and schools randomly assigned to implement EWIMS in March 2014 may be asked to participate in interviews about their experiences using the tool.
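The lottery referenced above can be illustrated with a short sketch. This is a hypothetical illustration only; the study team's actual assignment procedure (for example, any blocking or stratification) is not specified in this document.

```python
# Hypothetical illustration of random assignment by lottery; the study
# team's actual procedure may differ (e.g., it may block or stratify).
import random

def assign_by_lottery(school_ids, seed=2014):
    """Randomly split schools into a treatment group (EWIMS in March 2014)
    and a delayed-treatment control group (EWIMS in fall 2015)."""
    rng = random.Random(seed)  # fixed seed makes the lottery reproducible
    shuffled = rng.sample(school_ids, k=len(school_ids))
    half = len(shuffled) // 2
    return {"treatment": shuffled[:half], "control": shuffled[half:]}

# With approximately 70 participating schools, each group receives 35.
groups = assign_by_lottery(["school_%02d" % i for i in range(1, 71)])
print(len(groups["treatment"]), len(groups["control"]))  # 35 35
```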


District Roles and Responsibilities

We look forward to partnering with <District Name> for this exciting project! More detailed information on the responsibilities of participating districts follows. Please note that your district's participation in this project is voluntary. You will not be penalized in any way for not participating, and you may discontinue participation at any time without penalty.

  • Maintain a sustained commitment to participate in the study. To evaluate the impact of EWIMS, it is critical that participating districts agree to adhere to the study guidelines and timelines associated with random assignment by lottery to implement EWIMS in March 2014 or the fall of 2015.

  • Support school recruitment. Participating districts should actively support the study team's efforts to recruit high schools by communicating with school leadership, endorsing district participation, and encouraging school-level participation.

  • Support the use of random assignment. It is essential that the district support the use of random assignment of schools to receive access to and implement EWIMS in March 2014 or the fall of 2015. All schools will receive access to EWIMS including the tool and training and technical support for implementation; random assignment will only determine whether schools receive the “treatment” in March 2014 or a “delayed treatment” in the fall of 2015. Schools assigned by lottery to implement the EWIMS model in March 2014 will serve as the treatment group. Their counterparts assigned to implement EWIMS in fall of 2015 will serve as a control group from March 2014 through the spring of 2015. Schools randomly assigned to the control group must continue with “business as usual” practices for identifying at-risk students and assigning dropout prevention interventions until EWIMS implementation begins in the fall of 2015. Students in control schools should continue to receive any services that would be offered to them in the absence of the study. As for students in the “treatment” group, no “typical” services should be withheld.

  • Provide access to student-level administrative records. Administrative student-level data collection in participating schools will focus on all students in grades 9 and 10 during the 2013–14 school year and all students in grades 9, 10, and 11 during the 2014–15 school year. The study team will obtain as much administrative data as possible from the <State Department of Education (DOE)>. However, all participating schools should provide the study team access to the following student-level administrative data should these data be unavailable from <DOE> (a hypothetical file layout illustrating these elements appears after the list):

    • Demographic information (e.g., race/ethnicity, gender, free or reduced-price lunch [FRPL] status, individualized education program [IEP] status, English language learner [ELL] status, and parents' education)

    • Grade point average (GPA)

    • State test scores

    • Attendance rates

    • Course grades in core academic courses by semester

    • Credits earned by semester

    • Disciplinary information (e.g., suspensions)

    • Enrollment information (e.g., whether students are enrolled or have left school for reasons other than transfer to another district, including dropping out)

    • Grade promotion
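For illustration, the sketch below shows one hypothetical layout for such a student-level data file. The field names and the CSV format are assumptions made for the example; the actual data request will specify its own pre-determined format.

```python
# Hypothetical layout for a student-level data request file. Field names
# and the CSV format are illustrative assumptions, not the study's actual
# pre-determined format.
import csv

FIELDS = [
    "student_id",            # study-specific ID, not a name
    "race_ethnicity", "gender", "frpl_status", "iep_status",
    "ell_status", "parent_education",
    "gpa", "state_test_score", "attendance_rate",
    "core_grades_sem1", "core_grades_sem2",
    "credits_sem1", "credits_sem2",
    "suspensions", "enrollment_status", "grade_promotion",
]

with open("student_records_request.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    # A district would append one row per student in grades 9-10 (2013-14)
    # or grades 9-11 (2014-15).
```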

Study Team Roles and Responsibilities

The project team is composed of researchers from the Midwest Regional Educational Laboratory at AIR. The major responsibilities for the study team are as follows:

  • Obtain necessary approvals from review boards (federal, district, and organizational institutional review boards) and comply with the research protocols in place.

  • Provide access to the EWS tool and training and technical support for implementation to participating schools.

  • Collect data for the study. The majority of the data for this study will be administrative records transmitted from the district, thus minimizing the data collection burden on participating schools. The study team will provide a detailed data request to participating districts in a predetermined format that will minimize burden on district staff.

  • Conduct surveys and interviews. The study team will conduct an annual Web-based survey of all participating schools and collect all implementation data from schools assigned to the treatment group (EWS tool data, interviews with EWIMS team members).

  • Assure confidentiality. The study team will collect data only for the purposes of this study and will not use or allow the use of the data for evaluating individual participants, schools, or districts.

    • Each participant will be assigned a study-specific identification number in place of their name. A data file that links each participant with their identification number will be kept in a password-protected file that only the study team can access. (A minimal sketch of this linking approach appears after this list.)

    • The published analysis of the results will aggregate results across all schools and will not include results that have been disaggregated by school.

    • All members of the study team are required to complete a comprehensive training course that addresses current federal government standards and sign federal data confidentiality agreements.

  • Analyze data and produce reports. The study team will be responsible for aggregating information about the effectiveness of EWIMS on student and school outcomes. The study team expects that the final report will be released in 2016, pending the federal review process, and will ensure that participating schools receive this report.
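The identification-number approach described above can be sketched as follows. This is a minimal illustration, assuming a simple CSV crosswalk; the study's actual file formats, storage locations, and access controls are governed by its data security procedures.

```python
# Minimal sketch of assigning study-specific IDs and keeping the linking
# file separate from analysis data. Illustrative assumptions only; the
# study's actual formats and access controls will differ.
import csv
import secrets

def build_crosswalk(participant_names):
    """Map each participant to a random study-specific ID."""
    return {name: "P" + secrets.token_hex(4) for name in participant_names}

crosswalk = build_crosswalk(["Jane Doe", "John Smith"])

# The crosswalk lives in its own restricted, password-protected location;
# analysis files carry only the study-specific IDs.
with open("id_crosswalk.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "study_id"])
    writer.writerows(crosswalk.items())
```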




Timeline


Table A-3 presents the major tasks of the project as they were described previously.


Table A-3. Major Tasks of the Project

EWIMS Implementation

  • Treatment schools implement EWIMS with Grades 9 and 10: March 2014–June 2014

  • Treatment schools implement EWIMS with Grades 9, 10, and 11: August 2014–June 2015

  • Control schools implement EWIMS: August 2015–June 2016

Data Collection

  • Collect administrative records from the state and district: March 2014–June 2015

  • Collect EWS tool data from treatment schools: March 2014–June 2015

  • Conduct annual Web-based survey: May 2014 and May 2015

  • Conduct interviews with EWIMS team members in treatment schools: June 2014 and June 2015

Analysis and Reporting

  • Draft and submit final report: December 2015

Benefits to Participation

There are many benefits of participation for districts in <state name>. Critical indicators in ninth and tenth grade that powerfully predict whether students are “on track” for high school graduation can be used as part of an early warning system to flag at-risk students early, assign appropriate interventions, and get students back on track. Participating in this high-profile, large-scale study will give your district an opportunity to access the Early Warning and Intervention Monitoring System (developed by the National High School Center at AIR) at no cost. The EWIMS model, currently in use in 67 districts in six states, includes both an Excel-based tool and training and technical support for implementation. Your participation in this study will also play an important role in informing educational policy focused on dropout prevention in <state name> and at the federal level.

Questions or Comments

If you have any questions or comments about the study or the opportunity it provides for your district, please feel free to contact Dr. Nicholas Sorensen ([email protected] or 312-283-2318) or Dr. Mindee O’Cummings ([email protected] or 202-403-5254).



Signatures of Commitment


The <District Name> staff person named below has agreed to serve as the primary point of contact for school recruitment and data collection during the course of the study.

Name:

Position:

Telephone Number:

E-Mail:



Return via fax (312-288-7601) or e‑mail ([email protected])

The following people have read this document detailing the study and agree to the roles, responsibilities, and conditions of participation on behalf of <District Name> and the study team.

<insert name> <insert title>

District Representative Signature Printed Name Title Date

Jessica Heppen Co-Principal Investigator

Principal Investigator Printed Name Title Date

Mindee O’Cummings Co-Principal Investigator

Principal Investigator Printed Name Title Date

Ann-Marie Faria Project Director

Project Director Printed Name Title Date

Nicholas Sorensen Deputy Project Director

Deputy Project Director Printed Name Title Date



Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.


A-9. Final Agreement Form (State MOU)


Memorandum of Understanding

By and Between the <State Department of Education> and __________



This agreement is entered into by the <State Department of Education (“DOE”)> and _____________ (“Researcher”) for the purpose of sharing information between the parties in a manner consistent with the Family Educational Rights and Privacy Act of 1974 (“FERPA”). The information will be used by researchers at _____________ to conduct evaluative studies designed to improve instruction for children in the state of <name of state>. Topics of these studies will include: ___________. In order to complete these studies and in order to have a positive impact on the instruction of children, the Researcher requires the use of student data from the <DOE>.


The Family Educational Rights and Privacy Act (FERPA) describes circumstances under which State Educational Agencies (SEAs) are authorized to release data from an education record. This information can be disclosed to organizations conducting studies on behalf of SEAs, provided that Federal, State, or local law authorizes the evaluation in question.


  1. PARTIES. The <DOE> is an SEA that is authorized to receive information from local educational agencies (“LEAs”) subject to FERPA, as authorized by 34 CFR Section 99.31. Researcher desires to conduct studies on behalf of <DOE> for the purpose of improving instruction in <state name> public schools in accordance with the Scope of Work Agreement attached hereto as Appendix A. The parties wish to share data collected by the <DOE> regarding education in <state name>, none of which will allow the identification of individual students.


  2. COMPLIANCE WITH FERPA. To effect the transfer of data subject to FERPA, Researcher agrees to:


  a. In all respects comply with the provisions of FERPA. For purposes of this agreement, “FERPA” includes any amendments or other relevant provisions of federal law, as well as all requirements of Chapter 99 of Title 34 of the Code of Federal Regulations. Nothing in this agreement may be construed to allow either party to maintain, use, disclose or share student information in a manner not allowed by federal law or regulation.


  b. Use the data shared under this agreement for no purpose other than research authorized under Section 99.31(a)(3)(iv) or 99.31(a)(6) of Title 34 of the Code of Federal Regulations. Researcher further agrees not to share data received under this MOU with any other entity without the <DOE’s> approval. Researcher agrees to allow the Office of the State Auditor, subject to FERPA restrictions, access to data shared under this agreement and any relevant records of Researcher for purposes of completing authorized audits of the parties. Researcher shall be liable for any audit exception that results solely from its acts or omissions in the performance of this agreement. <DOE> shall be liable for any audit exception that results solely from its acts or omissions in the performance of this agreement. In the event that the audit exception results from the acts or omissions of both parties, the financial liability for the audit exception shall be shared by the parties in proportion to their relative fault.


  c. Require all employees, contractors, and agents of any kind to comply with all applicable provisions of FERPA and other federal laws with respect to the data shared under this agreement. Researcher agrees to require and maintain an appropriate confidentiality agreement from each employee, contractor, or agent with access to data pursuant to this agreement. Nothing in this paragraph authorizes sharing data provided under this Agreement with any other entity for any purpose other than completing Researcher’s work authorized under this Agreement.


  d. Maintain all data obtained pursuant to this agreement in a secure computer environment and not copy, reproduce, or transmit data obtained pursuant to this agreement except as necessary to fulfill the purpose of the original request. All copies of data of any type, including any modifications or additions to data from any source that contains information regarding students, are subject to the provisions of this agreement in the same manner as the original data. The ability to access or maintain data under this agreement shall not under any circumstances transfer from Researcher to any other institution or entity.


  e. Not disclose any data obtained under this agreement in a manner that could identify an individual student to any other entity in published results of studies as authorized by this agreement, nor attempt to infer or deduce the identity of any student or teacher based on data provided by <DOE>, nor claim to have identified or deduced the identity of any student based on data provided by <DOE>.


  f. Not provide any data obtained under this agreement to any party ineligible to receive data protected by FERPA or prohibited from receiving data from any entity.


  g. Provide to the <DOE> a list of specific research studies, updated annually, for which the data are being used, and notify the <DOE> in advance of any new project or research question Researcher proposes to address. This list of research studies will identify linkages of all data possessed by Researcher under this agreement and covered by FERPA to specific research studies. Further, it will include the fixed ending date for use of all data linked to each project.


  h. Provide to the <DOE> any materials designed for public dissemination, based in whole or in part on data obtained under this agreement, at least ten days prior to public release.


  i. Destroy all data obtained under this agreement, within the time frame established in Appendix A, Section II, when it is no longer needed for the purpose for which it was obtained. Nothing in this agreement authorizes either party to maintain data beyond the time period reasonably needed to complete the purpose of the request. All data no longer needed shall be destroyed or returned to the <DOE> in compliance with 34 CFR Section 99.35(b)(2). Researcher agrees to require all employees, contractors, or agents of any kind to comply with this provision.

  3. DATA REQUESTS.


  a. The <DOE> may decline to comply with a request if it determines that providing the data in the manner requested would violate FERPA and/or would not be in the best interest of current or former students in <state name> public schools. All requests shall include a statement of the purpose for which the data are requested and an estimation of the time needed to complete the project for which the data are requested. Data requests may be submitted by post, electronic mail, or facsimile.


  b. Researcher agrees that <DOE> makes no warranty concerning the accuracy of the student data provided.

  4. AUTHORIZED REPRESENTATIVE. Researcher shall designate in writing a single authorized representative able to request data under this agreement. The authorized representative shall be responsible for transmitting all data requests and maintaining a log or other record of all data requested and received pursuant to this agreement, including confirmation of the completion of any projects and the return or destruction of data as required by this agreement. The <DOE> or its agents may upon request review the records required to be kept under this section.


  5. RELATED PARTIES. Researcher represents that it is authorized to bind to the terms of this contract, including confidentiality and destruction or return of student data, all related or associated institutions, individuals, employees, or contractors who may have access to the data or may own, lease, or control equipment or facilities of any kind where the data is stored, maintained, or used in any way by Researcher. This Agreement takes effect only upon acceptance by an authorized representative of <DOE>, by which that institution agrees to abide by its terms and return or destroy all student data upon completion of the research for which it was intended or upon the termination of its current relationship with Researcher.


  6. TERM. This agreement takes effect upon signature by the authorized representative of each party and will remain in effect until <DATE>. The parties further understand that the <DOE> may cancel this agreement at any time for reasonable cause, upon thirty-day written notice. Notice of such cancellation shall be sent or otherwise delivered to the persons signing this agreement. The <DOE> specifically reserves the right to immediately cancel this agreement upon discovery of non-compliance with any applicable federal or state laws, rules, or regulations. Further, the <DOE> specifically reserves the right to immediately cancel this agreement should the <DOE>, in its sole discretion, determine that student information has been released in a manner inconsistent with this agreement, has not been maintained in a secure manner, or that substantially similar data access has become generally available for research purposes through any other mechanism approved by the <DOE>. In the event of immediate cancellation, a notice specifying the reasons for cancellation shall be sent as soon as possible after the cancellation to the persons signing the agreement.

  7. BREACH AND DEFAULT. Upon breach or default of any of the provisions, obligations, or duties embodied in this agreement, the parties may exercise any administrative, contractual, equitable, or legal remedies available, without limitation. The waiver of any occurrence of breach or default is not a waiver of such subsequent occurrences, and the parties retain the right to exercise all remedies mentioned herein.

  8. AMENDMENT. This agreement may be modified or amended provided that any such modification or amendment is in writing and is signed by the parties to this agreement. It is agreed, however, that any amendments to laws, rules, or regulations cited herein will result in the correlative modification of this agreement, without the necessity of executing a written amendment.

  9. ASSIGNMENT OF RIGHTS. Neither this agreement, nor any rights, duties, or obligations described herein, shall be assigned by Researcher without the prior express written consent of <DOE>.

  10. ENTIRETY OF AGREEMENT. All terms and conditions of this agreement are embodied herein and in the Scope of Work Agreement attached hereto as Appendix A. No other terms and conditions will be considered a part of this agreement unless expressly agreed upon in writing and signed by both parties.



Entered into by



____________________________ _____

Date

Authorized DOE Representative




____________________________ _____

Date

Chief Research Officer




_____________________________ _____

[Signature] Date

[Printed Name and Title]

[Name of Organization]



Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.






Attachment B. Data Collection Instruments

Attachments B-1 through B-3 include data collection instruments that will be used in the EWIMS impact study.

B-1 includes a draft protocol of the school-level survey that measures the treatment contrast and data-informed allocation of dropout prevention interventions for students. B-2 includes a draft protocol of the school data culture survey. B-3 includes a protocol for collecting data stored in the EWS tool in treatment schools.




B-1. Draft Protocol—School-Level Survey
(Treatment Contrast and Data-Informed Allocation of Dropout Prevention Interventions for Students)

Paperwork Burden Statement

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid Office of Management and Budget (OMB) control number. The valid OMB control number for this information collection is XXXX-XXXX. The time required to complete this information collection is estimated to average 60 minutes per response. This information collection is voluntary. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202–4651. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: Christopher Boccanfuso, U.S. Department of Education, Institute of Education Sciences, Room 506D, 555 New Jersey Ave. NW, Washington, DC 20208-5500.

Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.

This survey will be administered to both treatment and control schools to estimate the impact of EWIMS on school-level outcomes.

The purpose of this draft school-level survey is twofold. The first section documents current usage of an early warning system tool, or the use of data more generally, to identify students who are at risk of academic failure, disengagement, and/or dropping out of high school; it will be used to measure the treatment contrast in the EWIMS impact study. Measures are adapted from the REL Appalachia survey in an attempt to coordinate data collection across RELs on similar EWS implementation studies.

The second section collects consistent information from treatment and control schools about the types of dropout interventions they are implementing and whether and how they use student data to allocate those interventions. This section will be used to answer school-level research question 2a: What is the impact of EWIMS on data-informed allocation of dropout prevention interventions for students? Items are based on input from Dropout Prevention Research Alliance members, as well as a collaboration between REL Appalachia and REL Midwest to develop similar tools for EWS implementation studies. The study team plans to continue to review and populate the response options prior to administration, in collaboration with alliance members, to reflect the most up-to-date local intervention options.

These items are still in draft form and will be finalized with input from REL Midwest’s Dropout Prevention Alliance.

Section I: Treatment Contrast—Use of an Early Warning System

  1. Is your high school using an early warning system (EWS) to identify students who may be at risk of not graduating from high school on time?

      • Yes, we are implementing an EWS in my high school

If yes, what is the name of your early warning system? ________________________.

      • No, my high school is not currently implementing an EWS (IF NO, SKIP TO QUESTION 5)

  2. Please tell us when you began implementing an early warning system in your school, and the degree of implementation in each school year. For each time period, select the degree of implementation from the drop down menu (a).

  • Before the 2010–11 school year

  • 2010–11 school year

  • 2011–12 school year

  • 2012–13 school year

  • 2013–14 school year

  • Other (Please specify)

(a) Drop down menu: Beginning stages, In progress, Mastered the tool and process

  3. How do you use the early warning system? (Please check all that apply)

      • To identify students at risk based on attendance

      • To identify students at risk based on course failure

      • To identify students at risk based on behavior

      • To target resources to support off-track students before they drop out

      • To assign students to intervention(s)

      • To monitor student progress or students’ response to intervention(s)

      • To examine patterns and identify school climate issues

      • Other (please specify) _____________________________________________________

  4. What kinds of reports does your early warning system generate? (Please check all that apply)

      • Student-level reports that show lists of students and risk status

      • School-level, or graphical, reports that show trends in student risk status across the school or time

      • Reports (student or school level) for key subgroups (e.g., students with disabilities) of students

      • Intervention-level reports that allow us to monitor the effectiveness of our interventions

      • Other (please specify) _____________________________________________________



~~~~~~~~~~~~~~~~~~~~~~Skip Pattern from Question 1 ~~~~~~~~~~~~~~~~~~~~~~~~~~~



The following questions ask about how your school uses data to identify students who may be at risk for not graduating on time.

  5. Do teachers in your school have access to student data that identifies students at risk of not graduating?

    • Yes, and all teachers access it regularly

    • Yes, and some teachers access it regularly

    • Yes, and some teachers access it occasionally

    • Yes, but very few teachers actually access it

    • Yes, but no teachers actually access it

    • No, teachers do not have access to this type of data

    • Other (please specify) _____________________

  6. Does your school review student attendance data to determine which students may be at risk (i.e., missing more than # days per year)?

    • Yes

    • No (IF NO, SKIP TO QUESTION 9)


  7. For which grades do you examine student attendance data? (Please check all that apply.)

    • Grade 9

    • Grade 10

    • Grade 11

    • Grade 12

  8. How often do you review these data?

    • Daily

    • Weekly

    • Monthly

    • 4 times per year (once per quarter)

    • 3 times per year

    • 2 times per year

    • 1 time per year

  9. How often do you use attendance data to assign students to specific interventions or support services?

    • Never

    • Rarely

    • Sometimes

    • Always

  10. Tell us a little bit about how you use attendance data in your school:

  11. Does your school review student course performance data (including course failures, credit deficiencies) to determine which students may be at risk?

    • Yes

    • No (IF NO, SKIP TO QUESTION 15)

  12. For which grades do you examine course performance data (including course failures, credit deficiencies)? (Please check all that apply.)

    • Grade 9

    • Grade 10

    • Grade 11

    • Grade 12

  13. How often do you review these data?

    • Daily

    • Weekly

    • Monthly

    • 4 times per year (once per quarter)

    • 3 times per year

    • 2 times per year

    • 1 time per year

  14. Tell us a little bit about how you use course performance data (including course failures, credit deficiencies) in your school:


  15. What other kinds of student data do you look at, and for which grades? (Please check all that apply.) For each type of data below, mark either “We do not use” or each grade in which you use it (Grade 9, Grade 10, Grade 11, Grade 12):

  • Behavior referrals

  • Behavior suspensions

  • Grade point average (GPA)

  • State assessment results

  • Other (please specify; up to four write-in rows)




The following questions ask about the structure of data review in your high school, focusing on the group of individuals who use student data to identify students who may be at risk of not graduating from high school on time. These individuals may be part of a distinct early warning system team or part of another team that is responsible for examining these data.

  16. Does your school have a team or group of individuals that reviews student data to support students who are identified as at risk of not graduating from high school (e.g., building- or teacher-level teams, Student Success Teams, Data Review Teams)?

    • Yes, we have a dedicated school-based team.

If yes, what is the goal or purpose of this team? ___________________________ _________________________________________________________________

    • No, we do not have a dedicated school-based team.

    • Other (please specify)




  17. How often does your team meet to review student data?

    • Weekly

    • Monthly

    • 4 times per year (once per quarter)

    • 3 times per year

    • 2 times per year

    • 1 time per year

  18. How often does your team meet to assign students at risk to support and/or interventions?

    • Weekly

    • Monthly

    • 4 times per year (once per quarter)

    • 3 times per year

    • 2 times per year

    • 1 time per year

  19. Describe the members of the team at the high school level (if applicable). (Please check all that apply.)

      • School principal

      • Assistant principal

      • Representative from feeder middle schools

      • Guidance counselors

      • Content area teachers

      • Special education teachers

      • English language learner instructors

      • District office representative

      • Community service providers

      • Community members

      • Other (please specify) _____________________________________________________

  20. To what extent do you agree with the following statement: “Implementing an early warning system will lead to improved graduation rates.”

  • Strongly disagree

  • Disagree

  • Agree

  • Strongly agree

  21. How often does your school reflect on the process of identifying and assigning students at risk to interventions?

    • More than twice a year

    • Twice a year

    • Once a year

    • We do not have this opportunity


Section II: Data-Informed Allocation of Dropout Prevention Interventions for Students

  1. Does your school offer targeted academic interventions (e.g., tutoring, reading remediation, study skills) for students?

      • Yes

    • No (IF NO, SKIP TO QUESTION 3)

  2. Please name the targeted academic interventions that the school offers. In the table below, indicate which programs are available, the number of students who participated during the school year, and the grade level of students who participated.

Table columns (one row per program):

  • Program name: Write in

  • Focus of the intervention: Drop down menu (reading, mathematics, science, tutoring, remediation, study skills, other please specify)

  • How is this program delivered?: Drop down menu (individually to students, via small group, via large group, schoolwide, other please specify)

  • Who delivers this program?: Drop down menu (teachers, administrators, guidance department, community members, parents, peers [other students], other please specify)

  • # of participating students: Drop down menu (from 1 to 999)

  • Grade level(s) of students: Drop down menu (from grade 9 to grade 12)

  • Who refers students for this program?: Drop down menu allowing checking all that apply (administrator, counselor, parent, teacher, data team, student [self], other please specify)

  • What criteria are used to select students for participation in targeted academic interventions?: Drop down menu allowing checking all that apply (course failure [e.g., failing Algebra], grade point average [GPA], number of credits, student attendance [e.g., missing a number of days of school or class], student behavior [e.g., office referrals, suspensions], no criteria, other please specify)

  • What data do you use to determine whether an intervention/support is working?: Drop down menu allowing checking all that apply (attendance, behavior, course performance, core course performance, GPA, credits earned, behavior referrals, behavior suspensions, no criteria, other please specify)

  3. Does your school offer targeted behavior interventions (e.g., social skill training, character education) for students?

      • Yes

      • No (IF NO, SKIP TO QUESTION 5)

  4. Please name the targeted behavior interventions at the school. In the table below, indicate which programs are available, the number of students who participated during the school year, and the grade level of students who participated.

Table columns (one row per program):

  • Program name: Write in

  • Focus of the intervention: Drop down menu (social skill training, character education, other please specify)

  • How is this program delivered?: Drop down menu (individually to students, via small group, via large group, schoolwide, other please specify)

  • Who delivers this program?: Drop down menu (teachers, administrators, guidance department, community members, parents, peers [other students], other please specify)

  • # of participating students: Drop down menu (from 1 to 999)

  • Grade level(s) of students: Drop down menu (from grade 9 to grade 12)

  • Who refers students for this program?: Drop down menu allowing checking all that apply (administrator, counselor, parent, teacher, data team, student [self], other please specify)

  • What criteria are used to select students for participation in targeted behavior interventions?: Drop down menu allowing checking all that apply (course failure [e.g., failing Algebra], grade point average [GPA], number of credits, student attendance [e.g., missing a number of days of school or class], student behavior [e.g., office referrals, suspensions], no criteria, other please specify)

  • What data do you use to determine whether an intervention/support is working?: Drop down menu allowing checking all that apply (attendance, behavior, course performance, core course performance, GPA, credits earned, behavior referrals, behavior suspensions, no criteria, other please specify)

  5. Does your school offer targeted attendance/truancy interventions (e.g., attendance monitor) for students?

    • Yes

    • No (IF NO, SKIP TO QUESTION 7)

  6. Please name the targeted attendance/truancy interventions that the school offers. In the table below, indicate which programs are available, the number of students who participated during the school year, and the grade level of students who participated.

Table columns (one row per program):

  • Program name: Write in

  • Focus of the intervention: Drop down menu (attendance monitoring, contacting truancy officers, automated contact with families, conferences with families and students, other please specify)

  • How is this program delivered?: Drop down menu (individually to students, via small group, via large group, schoolwide, other please specify)

  • Who delivers this program?: Drop down menu (teachers, administrators, guidance department, community members, parents, peers [other students], other please specify)

  • # of participating students: Drop down menu (from 1 to 999)

  • Grade level(s) of students: Drop down menu (from grade 9 to grade 12)

  • Who refers students for this program?: Drop down menu allowing checking all that apply (administrator, counselor, parent, teacher, data team, student [self], other please specify)

  • What criteria are used to select students for participation in targeted attendance/truancy interventions?: Drop down menu allowing checking all that apply (course failure [e.g., failing Algebra], grade point average [GPA], number of credits, student attendance [e.g., missing a number of days of school or class], student behavior [e.g., office referrals, suspensions], no criteria, other please specify)

  • What data do you use to determine whether an intervention/support is working?: Drop down menu allowing checking all that apply (attendance, behavior, course performance, core course performance, GPA, credits earned, behavior referrals, behavior suspensions, no criteria, other please specify)

  7. Does your school offer credit or content recovery (e.g., online programs such as Apex, K12, Plato, or another district program)?

      • Yes

    • No (IF NO, SKIP TO QUESTION 9)

  8. Please name the credit or content recovery programs that your school offers. In the table below, indicate which programs are available, the number of students who participated during the school year, and the grade level of students who participated.

Table columns (one row per program):

  • Program name: Write in

  • Focus of the intervention: Drop down menu (reading, mathematics, science, tutoring, remediation, study skills, other please specify)

  • How is this program delivered?: Drop down menu (individually to students, via small group, via large group, schoolwide, other please specify)

  • Who delivers this program?: Drop down menu (teachers, administrators, guidance department, community members, parents, peers [other students], other please specify)

  • # of participating students: Drop down menu (from 1 to 999)

  • Grade level(s) of students: Drop down menu (from grade 9 to grade 12)

  • Who refers students for this program?: Drop down menu allowing checking all that apply (administrator, counselor, parent, teacher, data team, student [self], other please specify)

  • What criteria are used to select students for participation in credit or content recovery programs?: Drop down menu allowing checking all that apply (course failure [e.g., failing Algebra], grade point average [GPA], number of credits, student attendance [e.g., missing a number of days of school or class], student behavior [e.g., office referrals, suspensions], no criteria, other please specify)

  • What data do you use to determine whether an intervention/support is working?: Drop down menu allowing checking all that apply (attendance, behavior, course performance, core course performance, GPA, credits earned, behavior referrals, behavior suspensions, no criteria, other please specify)

  9. Does your school offer student mentoring programs (e.g., Check & Connect, Check In/Check Out)?

      • Yes

    • No (IF NO, SKIP TO QUESTION 11)

  10. Which of the following mentoring programs does the school offer or do students have access to in the community? In the table below, indicate which programs are available, the number of students who participated during the school year, and the grade level of students who participated.

Table columns (one row per program):

  • Program name: Write in

  • Focus of the intervention: Drop down menu (academic mentoring, school adjustment mentoring [e.g., 9th grade transition program, reentry from adjudication], career mentoring, project-based and community-based mentoring, group-specific mentoring, peer mentoring, other please specify)

  • How is this program delivered?: Drop down menu (individually to students, via small group, via large group, schoolwide, other please specify)

  • Who delivers this program?: Drop down menu (teachers, administrators, guidance department, community members, parents, peers [other students], other please specify)

  • # of participating students: Drop down menu (from 1 to 999)

  • Grade level(s) of students: Drop down menu (from grade 9 to grade 12)

  • Who refers students for this program?: Drop down menu allowing checking all that apply (administrator, counselor, parent, teacher, data team, student [self], other please specify)

  • What criteria are used to select students for participation in mentoring programs?: Drop down menu allowing checking all that apply (course failure [e.g., failing Algebra], grade point average [GPA], number of credits, student attendance [e.g., missing a number of days of school or class], student behavior [e.g., office referrals, suspensions], no criteria, other please specify)

  • What data do you use to determine whether an intervention/support is working?: Drop down menu allowing checking all that apply (attendance, behavior, course performance, core course performance, GPA, credits earned, behavior referrals, behavior suspensions, no criteria, other please specify)

Pre-filled program rows: Check & Connect, Check In Check Out, Gear Up (plus blank rows for additional programs).

  11. Does your school offer a student internship or school-related work-preparation program (such as Job Corps, MACC Project, or career and technical education classes or programs)?

      • Yes

      • No (IF NO, SKIP TO QUESTION 13)

  12. Which of the following student internship or school-related work-preparation programs does the school offer? In the table below, indicate which programs are available, the number of students who participated during the school year, and the grade level of students who participated.

Table columns (one row per program):

  • Program name: Write in

  • Focus of the intervention: Drop down menu (internship, work-prep, other please specify)

  • How is this program delivered?: Drop down menu (individually to students, via small group, via large group, schoolwide, other please specify)

  • Who delivers this program?: Drop down menu (teachers, administrators, guidance department, community members, parents, peers [other students], other please specify)

  • # of participating students: Drop down menu (from 1 to 999)

  • Grade level(s) of students: Drop down menu (from grade 9 to grade 12)

  • Who refers students for this program?: Drop down menu allowing checking all that apply (administrator, counselor, parent, teacher, data team, student [self], other please specify)

  • What criteria are used to select students for participation in student internship or work-preparation programs?: Drop down menu allowing checking all that apply (course failure [e.g., failing Algebra], grade point average [GPA], number of credits, student attendance [e.g., missing a number of days of school or class], student behavior [e.g., office referrals, suspensions], no criteria, other please specify)

  • What data do you use to determine whether an intervention/support is working?: Drop down menu allowing checking all that apply (attendance, behavior, course performance, core course performance, GPA, credits earned, behavior referrals, behavior suspensions, no criteria, other please specify)

Pre-filled program rows: Job Corps; AmeriCorps; Post-Secondary Enrollment Options program at Kent State University; Career and technical education program(s) (e.g., culinary arts); Career and technical education class(es) that are not part of a program; Other—please describe (plus blank rows for additional programs).

  13. Does your school offer student college preparation programs (such as AVID or Gear Up)?

      • Yes

      • No (IF NO, SKIP TO QUESTION 15)

  14. Which of the following college preparation programs does the school offer? In the table below, indicate which programs are available, the number of students who participated during the school year, and the grade level of students who participated.

Table columns (one row per program):

  • Program name: Write in

  • Focus of the intervention: Drop down menu (general college prep, financial aid assistance, application process assistance, support for college entrance exams, academics, other please specify)

  • How is this program delivered?: Drop down menu (individually to students, via small group, via large group, schoolwide, other please specify)

  • Who delivers this program?: Drop down menu (teachers, administrators, guidance department, community members, parents, peers [other students], other please specify)

  • # of participating students: Drop down menu (from 1 to 999)

  • Grade level(s) of students: Drop down menu (from grade 9 to grade 12)

  • Who refers students for this program?: Drop down menu allowing checking all that apply (administrator, counselor, parent, teacher, data team, student [self], other please specify)

  • What criteria are used to select students for participation in college preparation programs?: Drop down menu allowing checking all that apply (course failure [e.g., failing Algebra], grade point average [GPA], number of credits, student attendance [e.g., missing a number of days of school or class], student behavior [e.g., office referrals, suspensions], no criteria, other please specify)

  • What data do you use to determine whether an intervention/support is working?: Drop down menu allowing checking all that apply (attendance, behavior, course performance, core course performance, GPA, credits earned, behavior referrals, behavior suspensions, no criteria, other please specify)

Pre-filled program rows: Early College High School, AVID, Gear Up, Talent Development (plus blank rows for additional programs).

  15. In what other ways do you assign students to interventions or support programming?

    • Demographics (such as free or reduced-price lunch)

    • Teacher recommendations or referrals

    • Other types of data (please specify)



B-2. Draft Protocol—School Data Culture Survey

This survey will be administered to both treatment and control schools to estimate the impact of EWIMS on school-level outcomes.

This attachment provides additional information on the draft protocol to measure School Data Culture. Table B-1 presents the reliability of the scales and subscales measuring principals' reports of the Key Dimensions of School Data Culture. Cronbach's alpha was calculated for each subscale and scale. It is generally accepted that Cronbach's alphas above 0.70 indicate adequate internal consistency among the items in a scale.

The reliability statistics of each scale and the number of survey items in each scale are shown below. The alpha statistic for all scales and subscales measuring School Data Culture (Context, Supports for Data Use, and Barriers to Data Use) is above 0.70, suggesting adequate reliability. Although ED’s contractor will revise items to fit the needs of the EWIMS evaluation, these Cronbach’s alphas are promising for the internal consistency of the revised School Data Culture Scale.
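For reference, Cronbach's alpha for a scale of k items is alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score). The sketch below computes it on simulated Likert responses; the simulated data and the function are illustrative assumptions and do not reproduce the study's actual analysis.

```python
# Standard Cronbach's alpha computation; illustrative only (simulated data,
# no handling of missing responses as a real analysis might require).
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 4-point responses: 212 respondents, 9 items (mirroring the
# State, District, and School Data Culture subscale in Table B-1).
rng = np.random.default_rng(0)
latent = rng.normal(size=(212, 1))              # shared trait across items
noise = rng.normal(scale=0.8, size=(212, 9))
responses = np.clip(np.round(latent + noise + 2.5), 1, 4)
print(round(cronbach_alpha(responses), 2))
```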



Table B-1. Reliability Estimates of Principals' Key Dimensions of Data Use Scales and Subscales

Key Dimension scale or subscale (survey item N): Cronbach's alpha, by response type

Context (32 items): reading instruction-specific 0.87; math instruction-specific 0.87

  • Assessment/Instructional Context (23 items): reading instruction-specific 0.82; math instruction-specific 0.81

  • State, District, and School Data Culture (9 items): general principal responses 0.84

Supports for Data Use (51 items): general principal responses 0.92

  • Data Infrastructure (8 items): general principal responses 0.82

  • Organizational Supports (31 items): general principal responses 0.90

  • Staffing/Human Resources (12 items): general principal responses 0.93

Barriers to Data Use (13 items): general principal responses 0.76



Sample size: N = 212 Principals.

Note: The Supports for Data Use scale and Barriers to Data Use scale did not contain survey items that were reading or mathematics specific.

School-Level Instructional Response is a subscale specific to the Principal Survey.



Paperwork Burden Statement

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid Office of Management and Budget (OMB) control number. The valid OMB control number for this information collection is XXXX-XXXX. The time required to complete this information collection is estimated to average 60 minutes per response. This information collection is voluntary. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202–4651. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: Christopher Boccanfuso, U.S. Department of Education, Institute of Education Sciences, Room 506D, 555 New Jersey Ave. NW, Washington, DC 20208-5500.

Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.


Survey Items

  1. How much do you agree or disagree with the following statements about your school's priorities for using data?


    Response options: Strongly disagree, Disagree, Agree, Strongly agree

    a. This school has clear goals for using data to improve student outcomes.

    b. The school's data use policies help us address student needs at our school.

    c. This school has adequate resources (e.g., time, staff, money) to facilitate teachers' use of data.

    d. This school has adequate professional development to give teachers the skills to access, interpret, and make decisions about data.

  2. Now consider the professional climate in your school. To what extent do you agree or disagree with each of the following statements?


    Strongly disagree

    Disagree

    Agree

    Strongly agree

      a. Educators in this school are continually learning and seeking new ideas.

      b. Educators are engaged in systematic analysis of student data.

      c. Educators in this school work hard to match students with interventions that will meet their individual needs.

      d. Assessment of student performance leads to changes in programming.

      e. Educators in this school regularly examine student data.

  3. To what extent do you agree or disagree with the following statements about your school leadership team (e.g., principal, assistant principals, other key administrators)?

    The school leadership team

    Strongly disagree

    Disagree

    Agree

    Strongly agree

    a. Encourages teachers to make decisions based on data.

    b. Facilitates conversations about using student data.

    c. Commits adequate resources to help teachers interpret and use the student data.

    d. Places too much emphasis on using the student data.

  4. Please read the following response options and pick the one that best reflects the degree of support for data use provided by the school leadership team:

      • Not at all supportive: Does not make the student data a priority. There is limited discussion of the data with staff.

      • Not very supportive: Occasional support for the student data use in faculty and staff discussions, but administrators do not see using the student data as central to the school’s mission.

      • Supportive: Administration is supportive of teachers’ efforts, speaks positively about the student data with staff, problem-solves obstacles to using the student data, and uses the system themselves.

      • Very supportive: Administration is a “cheerleader” for the student data, effectively supports staff use of the system, and sees it as central to school mission.

  5. About how often does your school have scheduled meeting time (e.g., in staff meetings, in data team meetings) to do each of the following?


    About once a week

    1–2 times per month

    1–2 times per quarter

    Once a year

    My school does not provide time for this

    a. Review student data (e.g., state test scores, student work, curriculum-based unit tests)

    b. Discuss individual student achievement

    c. Discuss student achievement by subgroup (e.g., students with disabilities, ELL/LEP, gender, race/ethnicity)

    d. Discuss and share instructional strategies

    e. Assign students to interventions

    f. Discuss students’ progress in interventions

  6. In the last 12 months, how much professional development was offered to your staff that focused on using data to inform educational decisions?


    No PD provided on this topic

    Minor emphasis

    Moderate emphasis

    Major emphasis

    a. Linking student data to classroom practice

    b. Analyzing student data

    c. Identifying strengths or weaknesses of their own teaching

    d. Incorporating student data into lesson planning

    e. Using data to target interventions for low-performing students

    f. Using data to target interventions for high-performing students

    g. Understanding how data can be used to guide instruction

  7. How many teachers at your school have the following characteristics?



None

Some

About half

Most

All

  a. The ability to use data from student assessments

  b. The ability to analyze trends in individual student performance over time

  c. The ability to analyze trends in classroom-level performance over time

  d. The ability to translate data into knowledge about student strengths and weaknesses

  e. The ability to make instructional changes based on data


  8. This question concerns how teachers interact with each other in your school. Please indicate the extent to which your teachers do each of the following:



Not at all

To a slight extent

To a moderate extent

To a great extent

  a. Meet together to look at trends in the data (or analyze data)

  b. Share ideas about using data to improve teaching with other teachers

  c. Share and discuss student work with other teachers

  d. Discuss particular lessons that were not very successful

  e. Discuss beliefs about teaching and learning



  9. Please indicate the extent to which each of the following statements describes your school.



Not at all

To a slight extent

To a moderate extent

To a great extent

  a. Teachers and administrators work together to review student data.

  b. Decisions about students’ enrollment in interventions and supports are made as a team.

  c. Teachers work together to identify their students’ strengths and weaknesses.

  d. Administrators and teachers work together to support their at-risk students.

  e. Teachers prefer to review data on their students independently.

  f. At this school, data review is a team effort.


  10. To what extent do the following factors hinder your ability to use student data to inform instruction and interventions?


Not at all

To a minor extent

To a moderate extent

To a great extent

  a. Lack of time to study and think about available data

  b. Lack of time to collaborate with others in analyzing and interpreting data

  c. Lack of professional development or training on how to use the dashboards

  d. Personal discomfort with data analysis

  e. Lack of technology (e.g., access to a computer with a reliable Internet connection)

  f. Insufficient amount of data

  g. Data provided too late for use

  h. The data in the system are inaccurate

  i. Pacing guides are too rigid to allow reteaching or adapting instruction in response to data

  j. Other (please specify)



  11. Which of the following best describes your role at your school?

  • Principal

  • Assistant principal

  • Guidance counselor

  • Other school administrator (please specify) _______________________________



  12. How long have you been teaching? If this is your first year of teaching, please select “1.”

  • Total number of years: _____ (drop-down menu: 1 through 25 or more)



  13. What is the highest level of education you have completed?

  • Bachelor's degree

  • Master's degree

  • Professional degree (Ed.D., Ph.D.)

  • Other (please specify)_________________________________________________

B-3. Protocol for Collecting Data Stored in the EWS Tool
in Treatment Schools


According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid Office of Management and Budget (OMB) control number. The valid OMB control number for this information collection is XXXX-XXXX. The time required to complete this information collection is estimated to average 30 minutes per response. This information collection is voluntary. If you have any comments concerning the accuracy of the time estimate(s) or suggestions for improving this form, please write to: U.S. Department of Education, Washington, DC 20202–4651. If you have comments or concerns regarding the status of your individual submission of this form, write directly to: Christopher Boccanfuso, U.S. Department of Education, Institute of Education Sciences, Room 506D, 555 New Jersey Ave. NW, Washington, DC 20208-5500.



To facilitate the transfer of EWS tools from treatment schools to the EWIMS evaluation team, we will use a secure FTP (SFTP) procedure. The table below describes the timeline of activities between treatment schools and the evaluation team during the two years of the study.


| Action | Method/Mode | Key School Staff | Key REL Midwest Staff | Date |
| AIR will establish a secure FTP site for data sharing | Create the secure FTP | N/A | Ann-Marie Faria | January 2014 |
| AIR will e-mail secure FTP log-in information to treatment schools | E-mail communication | | | January 2014, June 2014, January 2015, June 2015 |
| Treatment schools will upload populated EWS tools to the secure FTP | Secure FTP upload | Data teams in treatment schools | Ann-Marie Faria | January 2014, June 2014, January 2015, June 2015 |





Establishing a secure system to share data (SFTP). One key aspect of transferring populated EWS tools between schools and the evaluation team is ensuring a secure transfer of data between agencies. AIR will create a secure FTP site and share the log-in information with each treatment school. A draft of the directions that AIR will share with each treatment school is included below.
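For illustration only, the minimal Python sketch below (using the paramiko library) shows the underlying transfer step: uploading a populated EWS tool to an SFTP server. The host name, credentials, and file paths are hypothetical placeholders, not actual study values; in practice, schools will use the browser-based client described in the guide that follows, with the credentials e-mailed by AIR.

import paramiko

# All values below are hypothetical placeholders, not actual study credentials.
HOST = "sftp.example.org"
PORT = 22
USERNAME = "treatment_school_account"
PASSWORD = "password_from_air_email"
LOCAL_FILE = "ews_tool_populated.xlsx"
REMOTE_PATH = "/ewims_uploads/ews_tool_populated.xlsx"

# Open an SSH transport to the server and start an SFTP session over it.
transport = paramiko.Transport((HOST, PORT))
transport.connect(username=USERNAME, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)

try:
    # Upload the populated EWS tool to the project directory on the server.
    sftp.put(LOCAL_FILE, REMOTE_PATH)
finally:
    sftp.close()
    transport.close()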


Web Hosting Services “SFTP Client Guide” (Public Information)

DESCRIPTION:

This SOP outlines the procedures to use Web Services’ SFTP service.

PROCEDURE:

A. Open the email received from the WS Engineer containing access credentials for the SFTP site.



B. Follow Steps 1-7 in the email:

1. Open your internet browser and type http://sftp.airws.org or http://sftp.astprojects.org in the address field, depending on which website you were given in the email.


a. Verify whether Java is installed/enabled in your browser.

b. If Java is not installed/enabled in your browser, click “Install”.

c. Accept the license agreement, and choose all defaults.


d. Reboot your computer, and repeat step #1.



2. Click the “Connect Button” on the webpage:





3. A dialog box will pop up, asking for a username and password. Refer to the email for these credentials. Type the given credentials EXACTLY as they appear in the email. Click OK.


[Screenshots: the credentials email and the log-in dialog box]


4. Upon a successful connection you should see the following on the webpage:

[Screenshot: successful connection page]


The figure above shows a message confirming that you have successfully connected to the SFTP server. On the right, it also shows a list of virtual directories on the remote system and a field for typing the directory path. On the left is the directory listing for your local machine.



5. Refer to step #5 in the email for the name of the specific directory created for your project. Type this name EXACTLY as it appears in the instructions, or find it in the directory listing and navigate manually using your mouse.




6. Double-click the subfolder contained in the virtual directory. In this subfolder, you can upload or download files or create more folders, depending on the read/write permissions granted for your SFTP account.



7. Use the arrows between the two windows to move files back and forth between your local machine and the SFTP folder in the remote server. Press the “right” arrow button to upload a highlighted file from your machine to the SFTP folder, and the “left” arrow button to download a file from the SFTP folder to your machine.




Click the “Disconnect” button when finished.


1 Data culture is a school’s general approach toward using data to inform educational decision making and includes contextual factors (e.g., the assessment and instructional context), supports for data use (e.g., professional development or structured time to review data), working with data (e.g., frequency and depth of data use), and responses to data (e.g., assignment of interventions to students).


2 We are basing our school selection on graduation rates, which we know do not correspond directly to dropout rates; dropout rates often underestimate the number of students who actually drop out. According to NCES, the Ohio graduation rate in 2008–09 was 79.6 percent, while the dropout rate was 4.2 percent. Documented dropout is a low-incidence event.

3 The MDESs, as well as the expected mean proportions by condition, are based on the contractor’s prior research on dropout prevention, including the IES-funded Check & Connect Impact Evaluation and the previously mentioned REL Midwest Dropout Prevention Alliance suite of studies.

4 Analyses of continuous outcomes include standardized achievement scores and GPA. These analyses will control for prior standardized achievement and/or GPA, which typically account for more than 70 percent of the variance in these respective outcomes.

5 ED’s contractor is currently conducting a series of interrelated projects for the Regional Educational Laboratory (REL) Midwest’s Dropout Prevention Research Alliance that examine different aspects of implementing an early warning system to achieve the Alliance’s shared goals of improving graduation outcomes and reducing persistent disparities in graduation and dropout rates among student subgroups. To date, this work has primarily focused on the state of Ohio and includes projects focused on (1) validating early warning indicators; and (2) providing technical assistance to support the implementation of early warning systems. The current study—evaluating the impact of using early warning systems on school and student outcomes—will expand this work to other states in the region, including Michigan and Indiana.

6 Although it is acknowledged that schools can vary in how they code and record behavior referrals and suspensions, these variables are often predictive of student graduation and/or dropout (e.g., Balfanz, Herzog, & Mac Iver, 2007; Goldschmidt & Wang, 1999; Mac Iver & Mac Iver, 2010; Neild, Stoner-Eby, & Furstenberg, 2008). If the results of the Year 1 REL Midwest 4.1.01 study, Local Validation of Early Warning System Indicator Study, demonstrate inconsistency in the recording and coding of behavior data, the evaluation team will consider dropping this as an outcome measure.

7 The coefficients will be derived from analyses conducted under the previously mentioned study conducted by REL Midwest that validates early warning indicators based on local Ohio data. We will use the findings from the validation study to inform the weights for each indicator in the RCT.

8 If district-specific analyses are available for any schools that participated in both the 4.1.01 study and the randomized controlled trial, the contractor will use the most specific weights (coefficients) in the equations that predict each student’s probability of on-time graduation.

9 That is, collecting this information for all students in control schools could constitute an intervention in itself.

10 Because asking school staff to document student-level service contrast information for control schools is likely too much of a burden, and too closely resembles the actual implementation of EWIMS, the contractor will collect these data in control schools only at the school level (through Web-based surveys of school administrators).

11 We will conduct the main impact analyses on student outcomes at the end of the 2014–2015 school year. However, we also plan to estimate the same models as exploratory analyses at the end of Year 1 (summer 2014). These are not confirmatory analyses because we do not hypothesize an impact of EWIMS on student outcomes after only six months of implementation.

12 Because we will not impute missing data, student test scores will be used as a covariate in impact analyses only if less than five percent of the data are missing.

13 The goal is not to create yet another “team” with functions that may or may not overlap with functions of other already existing teams. Rather, integration with existing team structures (if functionally operational) is optimal.

14 The goal is not to create yet another “team” with functions that may or may not overlap with functions of other already existing teams. Rather, integration with existing team structures (if functionally operational) is optimal.
