Impact Evaluation of Data-Driven Instruction Professional Development for Teachers

Supporting Statement Part A:
Justification for the Study

August 26, 2015



Contract Number:

ED-IES-12-C-0086

Mathematica Reference Number:

40166.710

Submitted to:

Institute of Education Sciences

IES/NCEE

U.S. Department of Education

555 New Jersey Avenue, NW

Washington, DC 20208

Project Officer: Erica Johnson

Submitted by:

Mathematica Policy Research

600 Alexander Park

Suite 100

Princeton, NJ 08540

Telephone: (609) 799-3535

Facsimile: (609) 799-0005

Project Director: Phil Gleason






CONTENTS

PART A: JUSTIFICATION

1. Circumstances Necessitating the Collection of Information

2. Purposes and Uses of Data

3. Use of Technology to Reduce Burden

4. Efforts to Avoid Duplication of Effort

5. Methods to Minimize Burden on Small Entities

6. Consequences of Not Collecting Data

7. Special Circumstances

8. Federal Register Announcement and Consultation

9. Payments or Gifts

10. Assurances of Confidentiality

11. Additional Justification for Sensitive Questions

12. Estimates of Hours Burden

13. Estimates of Cost Burden to Respondents

14. Estimates of Annual Costs to the Federal Government

15. Reasons for Program Changes or Adjustments

16. Plan for Tabulation and Publication of Results

17. Approval Not to Display the OMB Expiration Date

18. Explanation of Exceptions

REFERENCES



Appendix A: Teacher Assignment Data Request

Appendix B: Teacher Survey Questionnaire and Accompanying Letter

Appendix C: Teacher Log and Accompanying Letter

Appendix D: Principal Survey Questionnaire and Accompanying Letter

Appendix E: Student Records Request

Appendix F: Confidentiality Pledge

Appendix G: Reminder Email and Call Scripts






TABLES

1. Schedule of Major Study Activities for Data-Driven Instruction Evaluation

2. Research Questions and Data Collection Method

3. Experts Consulted

4. Estimated Response Time for Data Collection Activities

5. Data-related Activities, by Source

6. Teacher and Principal Intermediate Outcome Measures, by Source



Figures

Figure 1: Logic Model for Planned DDI Intervention



SUPPORTING STATEMENT FOR PAPERWORK
REDUCTION ACT SUBMISSION

This OMB package requests clearance for data collection activities for a rigorous evaluation of data-driven instruction (DDI) in 104 schools from 12 school districts that have been recruited for the study. Data-driven instruction involves the use of student assessment data to help teachers adapt their instruction and, ultimately, improve student achievement. The study’s intervention plan will build school capacity for DDI by: (1) helping schools set up structures and processes that enable teachers and other school staff to efficiently carry out data-driven instruction, and (2) training and coaching teachers in the skills needed to understand student data and implement improved instructional strategies to address student needs. By participating in professional development and ongoing DDI activities, teachers are expected to develop the knowledge and skills needed to help them adapt and improve their instructional strategies based on student data, and ultimately, improve student achievement.


We plan to collect student records and teacher-assignment data from participating districts and schools, and conduct a teacher survey, teacher logs, and a principal survey. The Institute of Education Sciences (IES) within the Department of Education (ED) has contracted with Mathematica Policy Research and its partners Abt Associates and Evidence-Based Education Research & Evaluation to conduct the evaluation, and Public Consulting Group–Focus on Results to provide technical assistance to schools implementing the DDI program.

The evaluation’s main objectives are to understand how DDI is implemented and to rigorously estimate the impact of a comprehensive DDI program on student achievement and teacher and principal practices. The implementation component will use information collected from the technical assistance (TA) team, a teacher survey and logs, and a principal survey. For the impact evaluation, the experimental design involves randomly assigning schools within a district to either a treatment or control group. The treatment schools will implement a comprehensive DDI program, with technical assistance provided by an organization that works with schools to implement such instruction, and the control schools will not implement new DDI initiatives during the study years. Student outcomes will include students’ achievement on math and reading state assessments. Teacher and principal outcomes will include teachers’ and principals’ use of data, teachers’ instructional strategies, and the extent and nature of teacher collaboration.

This OMB clearance request concentrates on materials that will be used to collect principal, teacher, and student data. Included in this request are the following: a teacher assignment data request (Appendix A), a teacher survey questionnaire and an accompanying letter (Appendix B), a form and letter for the administration of teacher logs (Appendix C), a principal survey questionnaire and accompanying letter (Appendix D), a student records request (Appendix E), and a confidentiality pledge (Appendix F).

PART A: JUSTIFICATION

1. Circumstances Necessitating the Collection of Information

a. Statement of Need for a Rigorous Evaluation of DDI

The specific piece of legislation authorizing this evaluation is Title II, Part A of the Elementary and Secondary Education Act (ESEA), Sections 2121-2123 as amended by No Child Left Behind (NCLB) (20 USC 6621-6623). The Title II legislation provides grants to states and districts to increase student academic achievement. State and local education agencies can use Title II funding to offer professional development activities that “provide training on how to understand and use data and assessments to improve classroom practice and student learning” (U.S. Department of Education, 2013). The federal government is also actively promoting DDI through other initiatives, including Race to the Top and School Improvement Grants.

A growing number of schools across the country have adopted DDI programs. Although there are no comprehensive national data on the current number of schools implementing DDI—in part due to differing definitions of this intervention—DDI providers report implementing DDI programs in hundreds of districts and tens of thousands of schools.1 One provider reports that its DDI materials and training are used in 20 percent of K–12 school districts nationwide (Cordray et al. 2012).

Despite the growing interest in data use across the country, there is limited rigorous evidence on the effects of comprehensive DDI, especially for programs that feature intensive on-site support for teachers and other school staff. The few rigorous evaluations of DDI typically find no statistically significant impacts on student achievement.2 Several of these experimental studies have had methodological limitations, however, and the DDI interventions they examined did not provide substantive supports at the school and teacher levels (Konstantopoulos et al. 2013; Cordray et al. 2012; Carlson et al. 2011; Slavin et al. 2013). In general, the studies have focused on interventions that did not include teacher collaboration, an important best practice (Hamilton et al. 2009; Lachat and Smith 2005), or intensive professional development and on-site coaching and other supports specifically designed to help teachers fully understand student data and adapt their instruction accordingly. In fact, the studies found that the interventions they examined had little or no effect on instructional practices (Cordray et al. 2012) or that DDI teachers were actually less likely to use differentiated instruction, an important DDI teaching practice (Williams et al. 2013; Konstantopoulos et al. 2013). Thus, the existing literature is limited in the information it can provide on the potential impact of a comprehensive DDI intervention.3

This study would provide important, new experimental evidence on the impact of a comprehensive DDI program on student achievement. We will also gather valuable information on how DDI is implemented in schools.

b. Study Design and Research Questions

The study team will evaluate the effectiveness of the DDI program provided by Focus on Results using a random assignment design. This will be based on a sample of 104 schools from 12 school districts, with schools randomly assigned to either a treatment group that implements the study-provided DDI program or a control group that does not.

Under the intervention, training and setting up the infrastructure to support DDI activities will occur early in the intervention period in the treatment schools. Treatment schools will learn how to set up the structures and implement the procedures appropriate to data-driven instruction through ongoing professional development and technical assistance. They will be given time to receive training gradually and to practice key data-driven instruction activities before moving to full DDI implementation. The study implementation team will support and monitor implementation during both the initial infrastructure-building phase and the full implementation phase.

The implementation and impact evaluation study will address the following research questions:

  • What is the impact of DDI professional development on students’ achievement on state assessments in math and reading?

  • What is the impact of DDI professional development on teachers’ and principals’ access to and use of data to inform instructional support, planning, and practice?

  • How does DDI professional development affect teachers’ use of instructional strategies, such as differentiated instruction?

  • How do DDI-related activities differ in treatment versus control schools?

Because the primary research question focuses on the impact of DDI on student achievement, the study will examine impacts on grades in which state assessments are administered, grades four and five. Targeting fourth and fifth grades will also focus intervention resources more intensively on a subset of participating schools’ teachers, while allowing us to examine the effects of DDI on achievement across two different grade levels.



c. The Data-Driven Instruction Program Provided by Focus on Results

The Focus on Results plan for DDI is based on a comprehensive framework for training and supporting school leadership and teachers on using data to inform instruction and improve student achievement. The DDI program is designed to be implemented in schools with regular interim assessments and a data system that produces reports on proficiency levels at the student, classroom, teacher, and school levels (overall and for key student subgroups). Key DDI components and how they are expected to contribute to achieving intermediate and final outcomes are depicted in Figure 1. The overall logic of the DDI program is that a core set of training and technical assistance inputs inform and guide various activities by school leaders and individual teachers. These inputs and activities lead to teachers increasing their use of student data to guide instruction and altering their instructional strategies, which in turn leads to improved student achievement.


Figure 1: Logic Model for Planned DDI Intervention

Participating schools assigned to the treatment will set up new structures to support the implementation of DDI. School leaders and individual teachers will engage in activities designed to help staff learn to analyze data from interim assessments and other sources to identify areas for improvement, jointly formulate improvement goals and priorities, implement evidence-based changes in instructional practices, and monitor progress toward achieving the agreed-upon goals. The following people and teams of people will be involved in the intervention in treatment schools:

  • Data coach. Participating schools will hire a half-time data coach to lead and monitor the implementation of DDI activities on a day-to-day basis. Trained and supported by Focus on Results consultants, the data coach will analyze student data, conduct strategic planning with the principal and other school leaders, and work directly with teachers—individually and in groups (teacher collaboration teams). These activities will help teachers learn to use available data effectively to formulate instructional improvement goals, identify relevant best practices, and implement such practices. The coach will create and maintain a DDI resource room with materials to help school leaders and teachers analyze data and research possible instructional practices. The coach also will develop a set of data displays, available in the resource room and throughout the school, to help school leaders and teachers monitor student progress.

  • Principal. As a school’s main instructional leader, the principal plays a critical role in establishing a data-driven culture in the school and building support and momentum for DDI implementation. After receiving initial training, the school’s principal and data coach together plan for DDI implementation; they also meet weekly to review progress, troubleshoot issues encountered, and adjust plans for DDI implementation as needed. Principals work to support the initiative by emphasizing the targeted instructional focus and practices during regular classroom visits, follow-up feedback to teachers, and interactions with the instructional leadership team and data coach.

  • Instructional Leadership Team. Each participating school will establish an instructional leadership team, whose primary role is to lead the school’s DDI implementation effort. The team will include the school’s principal, data coach, grade-level chairs (and/or department chairs, if instruction is departmentalized), and other formal or informal school leaders (for example, assistant principal, other teachers, or IEP specialist). This team is charged with analyzing school-wide data and, based on this analysis, identifying areas of instructional focus for school-wide DDI improvement efforts (including specific instructional practices linked with these areas), and setting and monitoring goals for increased student achievement. Since DDI implementation relies on a train-the-trainer approach to build capacity within each school, this team also has the important responsibility to take what it learns from ongoing professional development sessions with Focus on Results and pass it on to teacher collaboration teams (described next), to guide their efforts. The instructional leadership team is also charged with addressing more concrete issues, such as ensuring that teacher collaboration teams have the time and space to meet on a regular basis.

  • Teacher Collaboration Teams. Teachers in each participating grade or department will form a professional learning community to work collaboratively, under the leadership of the data coach, to do much of the on-the-ground work of DDI. Each team will hold regular meetings (at least every other week) to determine what the data (both formal interim assessments and informal assessments) suggest regarding student learning needs, collectively set grade/department-specific goals and monitor progress toward goals, and identify instructional strategies that could better help them meet the needs of students scoring at the lowest levels.

  • Individual Teacher Activities. Drawing on their work within the teacher collaboration teams, as well as one-on-one assistance from the data coach, each teacher will implement agreed-upon instructional strategies in support of both the school-wide and grade-level improvement goals. Each teacher will also consider her students’ proficiency levels with respect to different standards (based on assessment data and student work) and make decisions about how best to address the students’ identified learning needs. After implementing the instructional strategies designed to meet these learning needs, teachers will assess the success of their efforts based on feedback from the coach as well as evidence from subsequent student work and assessment data.

By collaborating on teams and participating in the activities described above, teachers are expected to develop the knowledge and skills needed to help them adapt and improve their instructional strategies based on student data. The relevant instructional strategies for a given teacher will depend on the specific learning needs of that teacher’s students, as indicated by the student data. DDI does not require that teachers implement any one specific instructional approach. Nonetheless, based on Focus on Results’ experiences and best practices identified by DDI experts (Hamilton et al. 2009), we expect teachers in treatment schools to be more likely than teachers in control schools to adopt certain strategies or use them more frequently. For example, DDI should lead to greater use of differentiated instruction, more frequent review and adjustment of students’ small group assignments, and greater integration of evidence-based practices into all aspects of teaching.

Schools selected to implement DDI for the study will receive resources, training, technical assistance, and ongoing support from consultants from Focus on Results to help them implement DDI. Capacity-building inputs will include (1) an initial training for school principals and data coaches, (2) professional development training for instructional leadership teams, and (3) other technical assistance support for data coaches and instructional leadership teams during DDI implementation. The following training and technical assistance activities will take place in treatment schools:

  • Initial Training. A two-day orientation and training session ensures that data coaches and principals are prepared to begin implementing DDI in treatment schools. Activities include both general information about DDI and its implementation and specific training tailored to the activities each group will be expected to perform at their schools. Participants also are provided an opportunity to network and exchange ideas with data coaches and principals at other study schools implementing DDI.

  • Ongoing professional development for instructional leadership teams. Each school’s leadership team will receive a total of six professional development sessions to support the team in guiding their school’s adoption of DDI and provide training and skills that can be passed on to teachers at the school. Each session focuses on a particular set of activities related to the implementation of DDI, such as identifying an instructional focus, looking at student data and setting goals, using best practices in teachers’ classrooms, and monitoring their progress. Members of these instructional leadership teams will then train teachers and other staff in their schools using the skills and knowledge they learn from the professional development sessions.

  • Other technical assistance. Instructional leadership teams will receive customized technical assistance to help address questions and needs that arise as schools implement data-driven instruction. Each participating school will be visited a total of eight times by Focus on Results consultants to monitor DDI implementation and provide customized technical assistance. Technical assistance also will be delivered during monthly phone calls with the data coach, and additional phone calls or e-mails, as needed.

d. Data Collection Plan

To address the study research questions, the study team will collect and analyze data from teachers, principals, and district staff. A brief description of each data source and data collection activity is provided below.

  • Teacher and Principal Assignment Data: We will collect teacher assignment data from participating schools in winter 2016. We will ask schools to identify teachers currently teaching fourth and fifth grades, provide contact information for these teachers (such as school e-mail address), and indicate whether the teacher taught at the school in fall 2014 (Appendix A).4 The study team will attempt to identify principals at study schools from public sources (such as school websites). If that information is not publicly available, we will ask schools to provide principal assignment information.

  • Teacher Survey: We will administer a web-based survey in spring 2016 to 500 teachers (Appendix B). Teachers will also have the option of completing the survey by hard copy or by phone, if they prefer. The survey will provide information on professional development and training received by teachers (particularly related to DDI topics), school data culture, instructional planning and collaboration, teachers’ access and use of data to guide instruction, teachers’ instructional strategies, and teachers’ demographic and background characteristics.

  • Teacher Logs: We will administer web-based logs to teachers during the 2015-2016 school year (Appendix C). The same set of 500 teachers who will be asked to complete the teacher survey will also be asked to complete these logs at two points during the year, each time reporting on their activities over two consecutive days. The logs will provide information on teachers’ day-to-day activities, including planning activities (individually and in collaboration with other teachers) and instructional strategies in the classroom.

  • Principal Survey: We will administer a hard-copy survey to all 104 study principals in spring 2016 (Appendix D). The survey will focus on topics such as school-wide leadership activities, school data culture, school-wide access to and use of data, and principals’ demographic and background characteristics. Principals will have the option to provide answers to a trained interviewer over the phone.

  • Student Records Collection: In summer 2016, we will collect student outcome measures (for example, math and reading test scores from state assessments) for the year of full implementation (2015-2016) as well as for two prior years (2014-2015 and 2013-2014, if available). We will also collect student demographic and background data, such as gender, race/ethnicity, and eligibility for free or reduced-price lunch; student attendance and disciplinary measures (if available); and student enrollment data for fall 2014 and winter 2016 (Appendix E).

e. Study Activities and Time Line

In Table 1, we show the timing of the major study activities.

Table 1. Schedule of Major Study Activities for Data-Driven Instruction Evaluation

  • Collect Teacher Assignment Data*: Winter/Spring 2016
  • Conduct Teacher Survey: Winter/Spring 2016
  • Conduct Teacher Logs: Fall/Winter 2015 and Winter/Spring 2016
  • Conduct Principal Survey: Winter/Spring 2016
  • Collect Student Records Data: Summer/Fall 2016
  • Prepare Report: Summer 2017

*Schools will be asked to provide principal assignment data only if the information cannot be obtained from publicly available sources.



2. Purposes and Uses of Data

The main purpose of this evaluation is to estimate the impacts of DDI on student achievement and other outcomes, as well as to document implementation of DDI programs and activities. In Table 2, we list the study’s research questions and the data collection methods that will support the answers.

Table 2. Research Questions and Data Collection Method

Research Questions

Data Collection Method


1. What is the impact of DDI professional development on students’ achievement on state assessments in math and reading?

  • Student records data


2. What is the impact of DDI professional development on teachers’ and principals’ access to and use of data to inform instructional support, planning, and practice?

  • Teacher survey

  • Teacher logs

  • Principal survey

  3. How does DDI professional development affect teachers’ use of instructional strategies, such as differentiated instruction?

  • Teacher survey

  • Teacher logs


4. How do DDI-related activities differ in treatment versus control schools?


  • Teacher survey

  • Principal survey

  • Technical assistance documents



  • Teacher and Principal Assignment Data: Teacher assignment data will be used to randomly select approximately 500 fourth and fifth grade teachers to complete the teacher survey and logs. Principal assignment data will be used to identify the principals in the 104 study schools to determine the sample for the principal survey. In late 2015 (following OMB approval), we will collect information on teacher and principal assignments as of fall 2014, which will be used to examine whether the DDI intervention led to a treatment-control difference in educators’ mobility between fall 2014 (following random assignment) and the 2015-2016 school year.

  • Teacher Survey and Logs: Teacher survey and log data will be used to analyze teacher intermediate outcomes and to measure important aspects of the treatment-control contrast. Intermediate outcomes that will be examined include the impact of DDI on teachers’ access to and use of data to guide instruction and teachers’ instructional strategies (such as the use of differentiated instruction). We will also examine treatment-control differences in the training teachers receive and other activities surrounding the use of data in schools.

  • Principal Survey: The principal survey will enable a better understanding of the treatment-control contrast in schools’ training and activities surrounding the use of data, as well as shed light on some aspects of DDI implementation. It will examine topics such as whether principals have established school-wide goals, their expectations for school-wide data use, whether the school has an instructional leadership team and the activities it undertakes, and the degree to which school leaders review and discuss student data.

  • Student Records Collection: We will use existing district test score data to estimate the impact of DDI on student achievement, the key outcome of interest. Information on students’ demographic and socioeconomic characteristics and their achievement test scores prior to the study school year will be used to describe the students in the study and to develop more precise impact estimates.

In the process of implementing the intervention and providing technical assistance to treatment schools, we will also obtain information on aspects of implementation from the evaluation team’s technical assistance provider (Focus on Results), such as materials from their professional development sessions.

We will present the study findings in a report that will include information from both the implementation and impact analyses. The data collected by this evaluation will also be available as restricted-use files, serving as a valuable resource for researchers.

3. Use of Technology to Reduce Burden

The data collection plans are designed to obtain reliable information in an efficient way that minimizes respondent burden. When feasible, we will gather information from existing data sources, using the most efficient methods available.

A web-based survey and web-based logs will be the primary modes of data collection for teachers. Respondents will also have the option of completing a self-administered hard copy questionnaire or providing answers to a trained interviewer over the phone. The web-based survey and logs will enable respondents to complete them at a location and time of their choice, and their automatic editing features will reduce the number of response errors. Using email to follow up with nonrespondents will also offer an additional convenient option for respondents because email reminders will include a link to the survey website and a username-password combination, as well as an attached PDF of the survey if respondents choose to complete it in hard copy.

The principal survey will be administered as a hard copy questionnaire. Principals will also have the option to provide answers to a trained interviewer over the phone.

4. Efforts to Avoid Duplication of Effort

The study seeks to provide new evidence on the impact of a comprehensive DDI program, as well as factors and context that might affect DDI effectiveness. The experimental studies that have been conducted do not examine a comprehensive DDI program that incorporates intensive professional development and coaching or regular teacher collaboration. Moreover, much of the existing work examines branded interventions using proprietary assessments, and cannot shed light on the effectiveness of a general DDI approach.

5. Methods to Minimize Burden on Small Entities

No small businesses or entities will be involved as respondents.

6. Consequences of Not Collecting Data

The data collection plan described in this submission is necessary for ED to conduct a rigorous national evaluation of DDI and to understand the effectiveness of this education reform strategy. The study represents an important step toward developing a systematic and rigorous evaluation agenda in the area of school reform. Without the data collected in this evaluation, states, districts, schools, and policymakers will not know whether their considerable investment in DDI is improving student learning, whether additional investment in this strategy is merited, what challenges districts may encounter in implementing a comprehensive DDI program, and how to overcome those challenges.

The consequences of not collecting specific data are outlined below:

  • Teacher and Principal Assignment Data: Without teacher assignment data, the study will not be able to identify the pool of fourth and fifth grade teachers from which a random sample of teachers will be chosen to complete the teacher survey and logs. Without principal assignment data, the study will not be able to identify the principals who will be asked to complete the principal survey.5

  • Teacher Survey and Logs: Without the data from the teacher survey and logs, we will not be able to examine the impact of DDI on teacher-level outcomes such as their review and use of student data, collaboration, and instructional strategies. In addition, we will not be able to examine treatment-control differences in the training they receive and other activities surrounding the use of data in schools.

  • Principal Survey: Without the principal survey data, we will not understand or be able to describe treatment and control differences in school contextual factors (such as the school culture pertaining to data use), or school-wide data-use activities.

  • Student Records Collection: Without the student records data, we would have to administer assessments to students instead of using their district test scores to measure student achievement. Without the data on student characteristics, we will not be able to fully describe the study sample and verify the effectiveness of the random assignment.

7. Special Circumstances

There are no special circumstances associated with this data collection.

8. Federal Register Announcement and Consultation

a. Federal Register Announcement

The 60-day notice to solicit public comments was published in Volume 80, pages 34150-34151 of the Federal Register on June 15, 2015. A correction to the Docket ID Number was published in Volume 80, page 35641 on June 22, 2015. Two public comments were received.

b. Consultations Outside of the Agency

In formulating the study design, DDI intervention, surveys, and teacher logs, the study team sought input from experts in DDI, evaluation methodology, and education policy. Table 3 lists the experts who were consulted.

c. Unresolved Issues

None.

Table 3. Experts Consulted

  • Geoffrey Borman, Professor of Education, University of Wisconsin-Madison (evaluation methodology; education policy)

  • Russell Gersten, Executive Director, Instructional Research Group (education policy; reading and math instruction)

  • Laura Hamilton, Associate Director, RAND (data-driven instruction; evaluation methodology)

  • Ken Koedinger, Professor of Human Computer Interaction and Psychology, Carnegie Mellon University (education policy; educational technologies; experimental methods)

  • Spyros Konstantopoulos, Professor of Measurement and Quantitative Methods, Michigan State University (quantitative and experimental methods; education research; data-driven instruction)

  • Ellen Mandinach, Senior Research Scientist, WestEd (data-driven decision making)

  • Jon Supovitz, Director, Consortium for Policy Research in Education, University of Pennsylvania (data-driven decision making; teacher and principal professional development in using data)

  • Leslie Nabors Olah, Managing Research Scientist, Educational Testing Service (data-driven decision making; teacher professional development in using interim assessment data)

  • Jeff Wayman, President, Wayman Services, LLC (data-driven instruction and decision making; program evaluation)

  • Martin West, Associate Professor of Education, Harvard University (data-driven instruction; evaluation methodology)


9. Payments or Gifts

Incentives have been proposed for the teacher surveys and logs and the principal surveys to partially offset respondents’ time and effort in completing the surveys. We propose offering a $20 incentive to both teachers and principals after completion to compensate them for the 30 minutes required to complete the questionnaire.

We also propose a $15 incentive to teachers in study schools for each completed teacher log in order to compensate them for the 15 minutes we expect each log to take to complete. These proposed amounts are within the incentive guidelines outlined in the March 22, 2005 memo, “Guidelines for Incentives for NCEE Evaluation Studies,” prepared for OMB.

Incentives are also proposed because high response rates are needed to make the survey findings reliable, and we are aware that teachers are the targets of numerous requests to complete surveys on a wide variety of topics from state and district offices, independent researchers, and ED. Although some districts will have solicited buy-in from teachers to participate in the evaluation, our recent experience with numerous teacher surveys supports our view that obtaining teacher buy-in does not guarantee teachers will devote the time it takes to complete a survey or log, and monetary incentives increase the likelihood of cooperation of school staff.

10. Assurances of Confidentiality

The data collection efforts, including student records, teacher survey, teacher logs, and principal survey, will be conducted in accordance with all relevant regulations and requirements. These include the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, and all letters, surveys, and data collection documents will contain the following confidentiality paragraph:

“The Education Sciences Reform Act of 2002, Title I, Part E, Section 183, prohibits disclosure of individually identifiable information as well as making the publishing or communicating of individually identifiable information by employees or staff a felony. Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific school, district, or individual. Any willful disclosure of such information for nonstatistical purposes, except as required by law, is a class E felony.”

In addition, the project director will ensure that all individually identifiable information about students and their academic achievement, as well as information regarding individual schools, remains confidential in accordance with Section 552a of Title 5, United States Code; the confidentiality standards of subsection (c) of Section 183; and Sections 444 and 445 of the General Education Provisions Act.

Subsection (c) of Section 183, referenced above, requires the director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.” The study will also adhere to requirements of subsection (d) of Section 183 prohibiting disclosure of individually identifiable information as well as making the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.

Mathematica and its subcontractors will protect the confidentiality of all information for the study and use it for research purposes only. When reporting results, data will be presented in aggregate form only, such that individuals and institutions will not be identified. A statement to this effect will be included with all requests for data including letters, surveys, and data collection documents. All members of the study team with access to the data will be trained and certified on the importance of confidentiality and data security. All data will be kept in secured locations and identifiers will be destroyed as soon as they are no longer required.

The following safeguards are routinely employed by Mathematica to carry out confidentiality assurances during the study:

  • All Mathematica employees sign a confidentiality pledge (Appendix F) emphasizing its importance and describing their obligation.

  • Identifying information is maintained on separate forms and files, which are linked only by sample identification number.

  • Access to hard-copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.

  • Computer data files are protected with passwords and access is limited to specific users.

  • Especially sensitive data are maintained on removable storage devices that are kept physically secure when not in use.

11. Additional Justification for Sensitive Questions

As part of our request to schools for student records data, we will ask for information on student disciplinary measures, such as suspensions. These data will be valuable for measuring a set of outcomes that could plausibly be influenced by the data-driven instruction intervention being studied. Since the intervention aims to help teachers better understand student needs and improve their instruction, it is possible that the changes teachers make to their instruction and classroom practices could influence student behavior.

12. Estimates of Hours Burden

Table 4 provides an estimate of time burden for the data collection activities. The total of 779 burden hours for data collection involves collecting student records from districts, teacher assignment data from schools, surveying principals, and surveying and obtaining logs from teachers. These estimates are based on our experience collecting data for other evaluation studies.

The total of 779 burden hours covers all three years of data collection approval and includes the following efforts: an annualized total of 32 hours (96/3) for the 12 districts to collect and assemble administrative records on students participating in the evaluation; an annualized total of 9 hours (104*.25/3) for school staff to provide teacher assignment data (and principal assignment data, if not publicly available); an annualized total of 15 hours (88*.5/3) for 88 principals (85 percent of the 104 principals in the sample) to complete a 30-minute principal survey in the spring of 2016; an annualized total of 71 hours (425*.5/3) for 425 teachers (85 percent of the 500 teachers in the sample) to complete a 30-minute teacher survey in the spring of 2016; and an annualized total of 133 hours (400*4*.25/3) for 400 teachers (80 percent of the sample) to complete 4 rounds of teacher logs of 15 minutes each.
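For reference, the arithmetic above can be reproduced with the following minimal sketch; the half-up rounding rule is inferred from the reported figures and is an assumption, not part of the approved package.

# Sketch: reproduce the burden-hour arithmetic described above (assumed rounding: half up to whole hours).
def round_half_up(x):
    return int(x + 0.5)

rows = {
    # label: (number of respondents, unit response time in hours)
    "district student records": (12, 8),
    "school teacher assignment data": (104, 0.25),
    "principal survey": (88, 0.5),
    "teacher survey": (425, 0.5),
    "teacher logs (4 logs x 15 minutes)": (400, 1.0),
}

total_hours = {k: round_half_up(n * t) for k, (n, t) in rows.items()}
annual_hours = {k: round_half_up(n * t / 3) for k, (n, t) in rows.items()}

print(total_hours)                 # per-row totals, e.g. teacher survey: 213, teacher logs: 400
print(sum(total_hours.values()))   # 779 total burden hours
print(sum(annual_hours.values()))  # 260 annualized burden hours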

Table 4. Estimated Response Time for Data Collection Activities

Districts
  • Student records collection (a): 12 targeted respondents; 100% expected response rate; 12 respondents; 8 hours per response; 96 total hours; 32 total annual hours (hrs/3)

Schools
  • Teacher assignment data (b): 104 targeted respondents; 100% expected response rate; 104 respondents; 0.25 hours per response; 26 total hours; 9 total annual hours

Principals
  • Surveys: 104 targeted respondents; 85% expected response rate; 88 respondents; 0.5 hours per response; 44 total hours; 15 total annual hours

Teachers
  • Surveys: 500 targeted respondents; 85% expected response rate; 425 respondents; 0.5 hours per response; 213 total hours; 71 total annual hours
  • Logs: 500 targeted respondents; 80% expected response rate; 400 respondents; 1 hour per respondent (c); 400 total hours; 133 total annual hours

Overall Total: 1,029 respondents; 779 total hours; 260 total annual hours

NOTE: Reporting on an annualized data collection over 3 years. Annual number of respondents is 343 [(12+104+88+425+400)/3].

(a) In some cases, we might request these data from schools.

(b) If principal assignment data are not publicly available, schools will be asked to provide this information for principals.

(c) Reflects 4 completed logs for each respondent.



13. Estimates of Cost Burden to Respondents

There are no additional respondent costs associated with this data collection beyond the burden estimated in section 12 above.

14. Estimates of Annual Costs to the Federal Government

The total cost to the federal government of carrying out this study is $9,697,224, to be expended over the study period of four years. The estimated average annual cost of the study is $2,424,306.

15. Reasons for Program Changes or Adjustments

This is a new data collection.

16. Plan for Tabulation and Publication of Results

Below we discuss our plans for tabulating data for the final report to address the research questions and for publishing the results.

a. Tabulation Plans

Our tabulation plans include implementation and impact analyses aligned to the research questions. Nonexperimental analyses will describe implementation and the association of specific features of DDI programs with DDI impacts. Random assignment of schools within districts to a treatment group that will implement a comprehensive DDI program or to a control group that will not is an ideal design for assessing overall effectiveness of DDI. Our primary impact analysis will exploit this experimental design to provide rigorous estimates of the impact of DDI on student achievement and other student outcomes. Additional experimental analyses are designed to estimate the impact of a comprehensive DDI program on teacher outcomes, such as teachers’ access to and use of student data and teachers’ use of instructional strategies.

i. Implementation analysis

The implementation analyses have two main objectives. The first objective is to describe the DDI activities implemented in treatment schools and the extent to which they were implemented with fidelity. The second objective is to describe differences in treatment and control schools’ implementation of DDI-related activities. The analyses will rely upon information from the data coach weekly activity logs and the Focus on Results consultant activity logs, as well as from the principal survey, teacher survey, and teacher logs.

Fidelity of DDI implementation. The implementation analysis will describe DDI activities undertaken by treatment schools and the extent to which they were implemented with fidelity to the intervention plan. As a part of this analysis, we will review log entries from January 2015 through spring 2016. Because it may take time for treatment schools to set up necessary school structures and roll out intervention activities, we expect some variance in implementation at different points in time during the intervention. Thus, we will examine fidelity of implementation during the course of the intervention period as well as at its end.

To assess the extent to which treatment schools implemented DDI with fidelity, it will be important to quantify the data and summarize it uniformly. The DDI intervention aims to build school capacity in two ways, by (1) helping schools set up structures and activities that enable school staff to carry out DDI, and (2) training and coaching teachers in the skills needed to use and interpret student data. Implementation fidelity measures will therefore examine the training and coaching that treatment schools receive and the degree to which treatment schools fully implement the anticipated DDI structures and activities.

In describing the fidelity of implementation of a particular DDI component, we will focus on the activities we expect to have occurred if the component has been implemented. We will measure both whether the activity occurred and, if appropriate, the level of participation in the activity. To measure the fidelity of implementation of the instructional leadership team meetings, for example, we will measure whether a school holds monthly instructional leadership team meetings among all key staff, attendance at these meetings among the staff, and the key activities taking place at these meetings. These activities would include establishing school-wide performance goals, establishing areas of instructional focus, and examining student data.
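For illustration only, the sketch below shows one way such occurrence and participation measures could be combined into a uniform component-level score. The component inputs, thresholds, and equal weighting are hypothetical and are not the study's adopted scoring rules.

# Hypothetical sketch of a component-level fidelity score (illustrative inputs and equal weights).
def component_fidelity(meetings_held, meetings_expected,
                       avg_attendance_rate,
                       key_activities_done, key_activities_expected):
    """Average of three equally weighted indicators: meeting frequency, attendance, and activity coverage."""
    frequency_score = min(meetings_held / meetings_expected, 1.0)
    attendance_score = avg_attendance_rate  # proportion of key staff attending, between 0 and 1
    activity_score = key_activities_done / key_activities_expected
    return (frequency_score + attendance_score + activity_score) / 3

# Example: a school holding 7 of 9 expected leadership team meetings,
# with 85 percent attendance, covering 2 of 3 key activities.
print(round(component_fidelity(7, 9, 0.85, 2, 3), 2))  # 0.76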

Our examination of implementation fidelity will be accompanied by descriptive text that summarizes our findings, describes changes in implementation over the course of the intervention, and provides a range of examples of how schools implemented DDI with fidelity.

We will also describe treatment schools’ experiences and the challenges they faced in implementing DDI. This information will be especially important if the DDI approach is found to be effective and other districts wish to replicate the program. We will examine school-level factors that may influence DDI implementation, such as the level of engagement of the school principal and the challenges faced when implementing DDI activities (such as achieving consistent participation and strong engagement of teachers in teacher collaboration team meetings).


Implementation of DDI-related activities in treatment and control schools. As depicted in the logic model (Figure 1), the DDI intervention is expected to lead the principal, data coach, instructional leadership team, teacher collaboration teams, and individual teachers in treatment schools to engage in numerous data-focused activities. These include (1) activities undertaken by school leaders in directing data use, (2) school-wide communications about data use, (3) professional development and support for the principal and teachers on data use, and (4) collaboration among teachers to review student data and share instructional strategies. We will use data from the principal survey, teacher survey, and teacher logs to examine whether implementation of the DDI intervention leads to a treatment-control difference in these activities (Table 5).


ii. Impact analysis

The impact analysis will rigorously assess the effectiveness of a comprehensive DDI program. Calculating the statistical significance of the impacts requires that the nested structure of the data—with students clustered in schools—be incorporated in the analysis. Due to clustering, the variance of the impact estimates is larger than it would have been if each individual student were randomly assigned to DDI. Below we describe our approach to calculating outcomes of student achievement and intermediate outcomes on teacher and principal practices.
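For reference, the standard design-effect approximation illustrates the loss of precision from clustering (this formula is included here for context and is not itself part of the study's analysis plan):

\[
\operatorname{Var}_{\text{school-assigned}}(\hat{\beta}) \;\approx\; \operatorname{Var}_{\text{student-assigned}}(\hat{\beta}) \times \bigl[\,1 + (m - 1)\rho\,\bigr],
\]

where m is the average number of sampled students per school and ρ is the intraclass correlation of the outcome among students within schools.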


Student final outcomes. Student achievement will be the study’s primary outcome, measured using spring 2016 math and reading scores on state standardized tests. Specifically, this main impact analysis will examine the effect of DDI on:

  • Math achievement among fourth and fifth graders in a school that implemented data-driven instruction

  • ELA achievement among fourth and fifth graders in a school that implemented data-driven instruction


Table 5. Data-related Activities, by Source

Activities undertaken by school leaders in directing data use
  • Frequency of leadership team meetings (principal survey)
  • Leadership team members (principal survey)
  • Leadership team activities (e.g., setting achievement and priority learning goals, monitoring progress, planning professional development activities) (principal survey)
  • Degree to which school leaders ensure that teachers have the time and resources needed to analyze and interpret student data (principal survey; teacher survey)

School-wide communications about data
  • Frequency of communication on student achievement goals and results (principal survey; teacher survey)
  • Frequency of communication on priority learning goals (principal survey; teacher survey)
  • Frequency of communication on expectations for and actual use of data (principal survey; teacher survey)
  • Use of data displays in classrooms and other public areas (principal survey; teacher survey)

Professional development and support for the principal, teachers around data use
  • Amount of professional development/training activities this school year (principal survey; teacher survey)
  • Topics covered by trainings (analyzing student data, establishing priority learning goals for the school, individualizing student learning goals, tracking progress toward goals, using evidence-based instructional strategies) (principal survey; teacher survey)
  • Availability of on-site coaching/support for data use (principal survey; teacher survey)
  • Data coach activities (principal survey)
  • Frequency of, and topics addressed during, individual coaching (teacher survey; teacher logs)
  • Frequency of classroom observations and feedback (teacher survey; teacher logs)

Collaboration among teachers around data use and sharing instructional strategies
  • Frequency of, and amount of time spent on, teacher collaboration (teacher survey; teacher logs)
  • Teacher collaboration activities (analyzing student data, setting common learning goals for students, sharing effective instructional practices, jointly modifying lesson plans, monitoring implementation and results of instructional changes) (teacher survey; teacher logs)



To assess the impact of data-driven instruction on student achievement, we will use a place-based impact estimation strategy that compares outcomes for students in treatment schools to those of students in control schools using spring 2016 state test scores. This implies that the estimated impacts could reflect effects of data-driven instruction on student achievement as well as any effects on student mobility. Given the nature of the intervention, we do not expect that the data-driven instruction intervention would affect students’ mobility patterns during the study period. However, we will test this empirically through an analysis of impacts on student mobility based on the student sample in fall 2014 (following randomization) and spring 2016 (at the conclusion of the intervention).

The main impact analysis based upon spring 2016 test scores will estimate an impact model in which the overall impact estimate is based on treatment-control differences in the outcome of interest within each stratum (matched school pair). Although a simple treatment-control difference in mean outcomes will yield an unbiased estimate of the impact of DDI, the precision of estimates can be improved by controlling for baseline characteristics that may influence the outcomes of interest but are not related to the treatment itself. For all the student outcomes, we will control for baseline student and school covariates. If available from state or district records, specific student-level covariates will include prior years’ (spring 2015 and, if available, spring 2014) math and reading test scores (in z-score units), prior years’ student attendance, prior years’ student suspensions, gender, race/ethnicity, eligibility for free or reduced-price lunch, English language learner status, and special education status. We also anticipate including school-level aggregates of these variables and other relevant school characteristics in cases where individual student-level data are not available.

Accordingly, we will estimate student impacts using the following model:

(1)   y_ijk = δ′α_k + β·T_jk + γ′X_ijk + u_jk + ε_ijk

where y_ijk is the outcome of individual student i in school j within stratum k; α_k is a vector of stratum (matched pair) indicators (fixed effects) included to control for differences across strata in average student, teacher, principal, and school characteristics; T_jk is a treatment indicator that equals one if the school was assigned to DDI and zero otherwise; X_ijk is a vector of baseline individual student characteristics; u_jk is a school-specific random error term; ε_ijk is an individual-level random error term; and β, γ, and δ are parameters (or vectors of parameters) to be estimated.

The estimate of β represents the overall impact of DDI on the student outcome of interest. We will estimate the model with ordinary least squares (OLS) using standard errors that account for school-level clustering.
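For illustration, a minimal sketch of this estimation approach is shown below. The file and variable names (student_analysis_file.csv, test_z, treatment, stratum, school_id, and the baseline covariates) are hypothetical placeholders, not the study's actual data layout.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level analysis file: one row per student with outcome,
# treatment indicator, matched-pair stratum, school identifier, and baseline covariates.
df = pd.read_csv("student_analysis_file.csv")  # placeholder file name

# Stratum fixed effects enter via C(stratum); baseline covariates improve precision.
model = smf.ols(
    "test_z ~ treatment + C(stratum) + baseline_test_z + frl + ell + sped",
    data=df,
)

# Cluster standard errors at the school level because schools, not students, were randomized.
results = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

print(results.params["treatment"])  # estimated impact of DDI (beta in model (1))
print(results.bse["treatment"])     # cluster-robust standard error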

We will consider student achievement in reading and math to be separate domains, across which the impact of data-driven instruction might differ. Given that assessments differ across states, grade levels, and subject areas, we will standardize the raw achievement scale scores by converting them to z-scores. We will calculate the z-scores by subtracting the state mean score from the raw scale score and dividing by the standard deviation of the state scores.
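In notation, this standardization can be written as follows (the symbols are introduced here only to make the calculation explicit):

\[
z_{i} \;=\; \frac{x_{i} - \mu_{sgm}}{\sigma_{sgm}},
\]

where x_i is student i's raw scale score and μ_sgm and σ_sgm are the mean and standard deviation of scale scores in the student's state s, grade g, and subject m.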

We will also estimate impacts on other student-level outcomes, such as attendance and suspensions, if possible. We will request student-level data on these outcomes from districts or states, but if the data are not available or are unreliable, we will estimate impacts on these outcomes measured at the school (or, if available, grade) level. Impacts on a school-level outcome y_jk can be estimated using model (1) above but including school-level averages of individual-level covariates rather than the individual-level covariates themselves.

The estimation of overall impacts on student achievement may mask differences in impacts across subgroups of students. For example, DDI may prove more or less effective at boosting student achievement for students with different baseline characteristics; it may, for example, raise achievement among lower performing students to a greater degree than it does among higher performing students. We will provide a subgroup analysis of the impacts of DDI for student groups based on their level of baseline achievement, focusing on impacts for particularly low-achieving students as well as for students at moderate to high baseline achievement levels. To estimate impacts for student subgroups, we will create a version of the model that interacts a subgroup indicator with the treatment indicator; the coefficient on the subgroup-treatment indicator interaction term will represent the impact estimate for the subgroup.
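One parameterization consistent with this approach (the notation is ours; the study's exact specification may differ) replaces the single treatment term in model (1) with treatment-by-subgroup interactions for mutually exclusive subgroups, so that each interaction coefficient directly gives that subgroup's impact:

\[
y_{ijk} \;=\; \delta'\alpha_k \;+\; \sum_{s} \beta_s \bigl( T_{jk} \times S^{s}_{ijk} \bigr) \;+\; \theta' S_{ijk} \;+\; \gamma' X_{ijk} \;+\; u_{jk} \;+\; \varepsilon_{ijk},
\]

where S^s_ijk equals one if student i belongs to subgroup s (for example, low versus moderate-to-high baseline achievement) and zero otherwise, and β_s is the impact of DDI for subgroup s.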

Teacher and principal intermediate outcomes. The core set of professional development and technical assistance inputs under DDI are intended to help teachers and principals use data to improve instruction, which in turn would lead students to realize higher achievement gains. We will use responses from teacher surveys, teacher logs, and the principal survey to examine these intermediate outcomes. Teachers and principals offer different vantage points from which to assess the extent to which schools engage in data use activities. For example, principals may be able to provide detailed information on school leadership activities, while teachers may be able to provide detailed information on the frequency and content of teacher collaboration activities. The surveys will provide useful information on the frequency of activities over an extended period of time (such as how often a teacher attended professional development during the school year), while teacher logs will provide a one-day snapshot of teacher activities that occur relatively frequently (such as lesson planning and collaboration based on analysis of data).

Table 6 lists the measures and their sources that will be used to estimate impacts on intermediate outcomes. The first two sets of items listed in the table measure intermediate outcomes related to teachers’ access to and use of data to guide instruction. The last two sets of items capture information on teachers’ instructional strategies.

To assess the impact of DDI on teachers’ use of data and instructional strategies, we will compare outcomes for teachers in treatment schools to those of teachers in control schools. Our impact model for teacher outcomes will take a similar approach to that taken in the student outcomes model, calculating treatment-control differences among teachers. For all teacher outcomes, we will control for teacher covariates derived from the survey. Teacher covariates will include years of teaching experience, gender, race/ethnicity, teacher certification, and an indicator for a master’s degree.


Similar to our approach in the student model, we will estimate the model with ordinary least squares (OLS) using standard errors that account for school-level clustering, and we will compute the overall average impact of DDI by taking a weighted average of the coefficients on the treatment indicators. Each district-specific impact will be weighted by the number of study schools in each district.
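Written out, the overall teacher-level impact described above is a school-count-weighted average of the district-specific estimates (the notation is ours):

\[
\hat{\beta} \;=\; \sum_{d=1}^{12} \frac{J_d}{J}\,\hat{\beta}_d, \qquad J \;=\; \sum_{d=1}^{12} J_d \;=\; 104,
\]

where \(\hat{\beta}_d\) is the estimated impact of DDI on the teacher outcome in district d and J_d is the number of study schools in district d.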


Table 6. Teacher and Principal Intermediate Outcome Measures, by Source

Measure | Teacher Survey | Teacher Logs | Principal Survey

Access to student-level data
Access to interim assessment results | X | | X
Access to summative assessment results | X | | X
Access to student background characteristics, attendance, and school behavior information | X | | X
Barriers to data use (usable format, technology, tools) | | |

Use of student-level data
Frequency of use by type (summative assessment results, interim assessments, formative assessments, samples of student work, student characteristics) | X | X | X
Purposes of data use (understand student needs, set learning goals, monitor progress toward goals, differentiate instruction, revise lesson plans) | X | | X
Understanding of instructional changes to make based on data | X | | X

Differentiated Instruction
Placing students in small groups based on student data | X | |
Providing small group instruction | X | X |
Providing individualized instruction | X | X |
Identifying and referring students in need of pull-out services or other intensive interventions | X | X |
Changing instructional group assignments of students based on student data | X | X |

Whole-Class Instruction
Providing additional instruction in areas where students are struggling | X | X |
Identifying evidence-based instructional changes to help address students’ needs | | X |
Using new instructional strategies to teach challenging concepts to students | | X |


Different types of teachers may be differentially affected by the implementation of DDI. For example, less experienced teachers may be more (or less) at ease than more experienced teachers in using data to inform instruction. Less experienced teachers may also benefit more from the information on student performance that the data provide, because they have less classroom experience to draw on in understanding their students’ needs. The study will therefore also examine impacts on teacher outcomes separately for subgroups of teachers defined by their level of experience. As in the student subgroup analysis, we will estimate subgroup impacts by creating a version of the model that interacts a subgroup indicator with the treatment indicator.

Our approach to analyzing principal outcomes will similarly compare principal survey responses regarding the structures and professional development activities in treatment and control schools. However, while the teacher and student models will be estimated at the classroom and student levels, respectively, the principal model will be estimated at the school level. We will adapt the student-level model presented in equation (1) to estimate impacts on principal-reported measures. For all principal survey analyses, we will control for principal covariates, such as years of experience, gender, race/ethnicity, and an indicator for a master’s degree.

iii. Nonexperimental analysis

School contextual factors. Contextual factors, such as student characteristics or principal and teacher background characteristics, may aid in the interpretation of the impact of DDI in treatment schools relative to control schools. We will therefore examine how school contextual factors are related to impacts. Examples of contextual factors include:


  • School characteristics. School math and ELA proficiency measured at baseline, the percentage of students eligible for free or reduced-price meals, the percentage who are English language learners, the percentage enrolled in special education programs, and the racial/ethnic composition of the school.

  • Teacher and principal characteristics. Education, experience, and background characteristics.


The study will use descriptive analyses and regression analyses to examine how impacts are related to school contextual factors. We will use the descriptive analysis to identify conditions and practices that are candidates for regression analyses of impacts.


Correlational analyses of impacts and implementation fidelity. The impact of DDI on student achievement may differ across treatment schools depending on their fidelity of implementation of DDI. To explore how implementation fidelity, school characteristics, and other contextual factors may influence student impacts, we will conduct a correlational analysis that examines the relationship between estimated impacts of DDI and key features of treatment schools (or of matched pairs of treatment and control schools when data are available for the control schools).


For example, we may examine whether the qualifications of the data coaches hired at each treatment school are related to impacts on student achievement. Because data coaches play a central role in supporting the DDI intervention at each school, the prior skills and knowledge they bring to the work may influence the degree to which teachers are supported in analyzing student data and using them to improve instruction, which may in turn influence student outcomes. Understanding the degree to which the success of the DDI intervention hinges on the qualifications of the data coach may prove helpful in developing effective DDI interventions in the future. In this example, we would correlate each coach’s data proficiency score (from an exercise developed by Focus on Results to assess coaches’ initial skills in analyzing student assessment data) with the estimated impact for that coach’s school.
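A minimal sketch of this correlational step is shown below. The school-level data and the variable names (coach_proficiency, impact_estimate) are entirely illustrative and do not reflect study data or results.

```python
import pandas as pd
from scipy import stats

# Illustrative school-level file: one row per treatment school, with the
# estimated achievement impact for that school and its coach's data
# proficiency score. These values are fabricated for illustration only.
schools = pd.DataFrame({
    "school_id": [101, 102, 103, 104, 105, 106],
    "coach_proficiency": [2.5, 3.8, 1.9, 3.1, 4.2, 2.8],
    "impact_estimate": [0.05, 0.12, -0.02, 0.08, 0.15, 0.04],
})

# Correlate coach proficiency with the school's estimated impact. Any
# association found this way is descriptive, not causal.
r, p_value = stats.pearsonr(schools["coach_proficiency"], schools["impact_estimate"])
print(f"Correlation between coach proficiency and estimated impact: r = {r:.2f}, p = {p_value:.2f}")
```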

Because schools cannot be randomly assigned to specific DDI contexts, this correlational analysis will be nonexperimental. We will stress that any significant relationships between impacts and DDI features or contextual factors might not be causal and might reflect the influence of other unobserved factors.

b. Publication Plans

We will prepare one report presenting the results of the implementation and outcomes analyses. The projected release date of this report is summer 2018.

The report will be written in a style and format accessible to policymakers and research-savvy practitioners and will comply fully with the standards set by the National Center for Education Statistics.

17. Approval Not to Display the OMB Expiration Date

The study will display the OMB expiration date.

18. Explanation of Exceptions

No exceptions are being sought.

REFERENCES

Carlson, D., G. Borman, and M. Robinson. “A Multi-State District-Level Cluster Randomized Trial of the Impact of Data-Driven Reform on Reading and Mathematics Achievement.” Educational Evaluation and Policy Analysis, vol. 33, 2011, pp. 378–398.

Cordray, D., G. Pion, C. Brandt, A. Molefe, and M. Toby. “The Impact of the Measures of Academic Progress (MAP) Program on Student Reading Achievement.” (NCEE 2013-4000). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education, 2012.

Datnow, A., V. Park, and P. Wohlstetter. “Achieving with Data: How High-Performing School Systems Use Data to Improve Instruction for Elementary Students.” Los Angeles: Center for Educational Governance, University of Southern California, 2007.

Hamilton, L., R. Halverson, S. Jackson, E. Mandinach, J. Supovitz, and J. Wayman. “Using Student Achievement Data to Support Instructional Decision Making.” NCEE 2009-4067. Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education, 2009. Available at http://ies.ed.gov/ncee/wwc/publications/practiceguides/. Retrieved DATE.

Henderson, S., A. Petrosino, S. Guckenburg, and S. Hamilton. “Measuring How Benchmark Assessments Affect Student Achievement.” Issues and Answers Report, REL 2007–No. 039. Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northeast and Islands, 2007.

Konstantopoulos, S., S. Miller, A. van der Ploeg, C. Li, and A. Traynor. “The Impact of Indiana’s System of Benchmark Assessments on Mathematics Achievement.” Educational Evaluation and Policy Analysis, vol. 35, no. 4, 2013, pp. 481–499.

Lachat, M., and S. Smith. “Practices That Support Data Use in Urban High Schools.” Journal of Education for Students Placed at Risk, vol. 10, no. 3, 2005, pp. 333–349.

Quint, J., S. Sepanik, and J. Smith. “Using Student Data to Improve Teaching and Learning: Findings from an Evaluation of the Formative Assessments of Student Thinking in Reading (FAST-R) Program in Boston Elementary Schools.” New York, NY: MDRC, 2008.

Shaw, S., and J. Wayman. “Third-Year Results from an Efficacy Study of the Acuity Data System.” 2012.

Slavin, R., A. Cheung, G. Holmes, N. Madden, and A. Chamberlain. “Effects of a Data-Driven District Reform Model on State Assessment Outcomes.” American Educational Research Journal, vol. 50, no. 2, 2013, pp. 371–396.

U.S. Department of Education. “Sec. 2121. Allocations to local education agencies.” Available at [http://www2.ed.gov/policy/elsec/leg/esea02/pg22.html/]. Accessed November 6, 2013.

Williams, R., A. Swanlund, S. Miller, S. Konstantopoulos, and A. van der Ploeg. “Making Sense of Unanticipated Results: Instructional Differentiation and the Indiana Diagnostic Assessment Study.” Society for Research on Educational Effectiveness (SREE) Spring Conference, 2013.




1 This information is based on conversations with representatives of nine large DDI providers: Achievement Network, Acuity, Curriculum Associates, Data Wise, Education for the Future, Northwest Evaluation Association, Public Consulting Group-Focus on Results, TERC, and Wireless Generation.

2 There have been several nonexperimental examinations of DDI (Shaw and Wayman 2012; Henderson et al. 2007; Quint et al. 2008; Datnow et al. 2007), but because schools or teachers using DDI may systematically differ in other ways from schools or teachers not choosing to use DDI, these non-experimental results do not support rigorous causal inference.

3 In addition to the studies listed here, there is an ongoing evaluation of the Achievement Network DDI intervention. That intervention includes the administration of a particular set of interim assessments along with supports to participating schools to aid them in interpreting and using the assessment results. A report on this study is forthcoming in spring 2015; therefore, at this time, the detailed characteristics of the evaluation design and the findings are not known.

4 Teacher assignment data will be collected following approval of the OMB clearance package.

5 Schools will only be asked to provide principal assignment data if this information is not publicly available.

