Alternative Student Outcomes for Growth Measures Case Studies

OMB: 1850-0901

OMB Supporting Statement Part B: Alternative Student Growth Measures for Teacher Evaluation: Case Studies of Early Adopters


August 6, 2013

Appendix B: Study Summary

The Issue: Teacher Evaluation Using Alternative Measures of Student Growth

In school districts that have sought to measure the effectiveness of teachers in raising student achievement, statistical methods that aim to measure growth or value-added are typically applied to students’ scores on statewide standardized tests. To permit evaluation of teachers in grades and subjects not tested through state exams, and to derive a more comprehensive picture of teacher effectiveness, some school districts have begun to use alternative student outcome measures in value-added models (VAMs) and other student growth models. These alternative measures include end-of-course curriculum-based assessments, nationally normed standardized assessments, and Student Learning Objectives (SLOs), which are specific growth targets for a teacher’s own set of students, typically set by individual teachers and approved by principals.


Various school districts are adopting alternative student outcomes for measuring growth, both in low-stakes contexts for instructional purposes and in high-stakes contexts for teacher evaluation. Many more districts will need to adopt them soon because of changes in state-level evaluation systems that require a measure of student growth. Little is yet known about the features and uses of these measures, the process of implementation, or the challenges encountered during implementation.

The Study

As a partner of the Mid-Atlantic Regional Educational Laboratory (REL Mid-Atlantic), which is housed at ICF International and funded by the U.S. Department of Education (ED), Mathematica Policy Research has been contracted to conduct a study examining the implementation of alternative measures of student growth used to evaluate teachers. We are seeking to identify nine districts to participate in the study. The key research questions are:

  1. What student outcome measures other than state standardized test scores are currently being used in growth measures to assess teacher performance?

  2. How have school districts implemented the data collection and analysis necessary for growth measures based on alternative student outcomes, and what obstacles have they encountered?

  3. How are the alternative measures being used for purposes other than teacher evaluation?

  4. How much weight does each alternative measure receive in a teacher’s overall evaluation, and how does the weight vary by grade and subject? How does the distribution of scores on the alternative measures compare to the distribution of scores on growth measures based on state assessments or other measures of performance?

  5. What are the perceived benefits and drawbacks of using growth models (including VAM) based on each type of alternative outcome: SLOs, end-of-course curriculum-based assessments, and nationally normed assessments? What costs (notably in terms of time and effort) do they impose on teachers, principals, and districts?

How the Study Works

This study aims to fill the gap in information available to districts and policymakers on measures of student growth that do not rely on state standardized tests. It will do so through qualitative case studies of up to nine districts that are using alternative measures of student achievement growth in teacher performance ratings. The study’s scope will encompass three categories of alternative student growth measures: (1) end-of-course curriculum-based assessments used in growth models, (2) nationally normed assessments, such as the Iowa Test of Basic Skills, used in growth models, and (3) Student Learning Objectives (SLOs).


The case studies will examine what alternative outcome measures are used, how the alternative growth measures are implemented, challenges and obstacles in implementation, how the measures are being used, and, where possible, the distribution of teacher performance on the measures, as compared with the distribution of teacher performance on conventional value-added measures that are based on state assessments. Districts participating in the study will not be identified in published reports.

Benefits of Participating in the Study

Through their participation in the study, districts can make an important contribution to policymakers’ understanding of alternative measures of student growth as tools for measuring teacher performance in multiple contexts. By providing information on the implementation process, the effectiveness of the measures in differentiating teacher performance, and the perceived costs and benefits of the measures from stakeholder perspectives, this study has the potential to inform states and districts in the REL Mid-Atlantic region (and throughout the country) as they decide which measures are promising for use in evaluation, which are of doubtful value, and how to move forward with implementation. Participation in the study is voluntary.

Study Participation Requirements

In the fall of the 2013-14 school year, participating districts will assist the study team with scheduling a site visit and/or a series of telephone interviews with school and district staff. The study team will conduct one-on-one interviews with up to ten staff members—including at least one district administrator, two or three principals, two or three teachers, and one teachers’ union/association representative. During these interviews, the study team will gather information on district- and school-level implementation of alternative growth measures, applications of the measures, the distribution of teacher performance on the measures, and the perceived costs and benefits of the measures from the perspectives of those interviewed.

The Study Team

Mathematica Policy Research, Inc., a nonpartisan policy research firm, conducts research and surveys for federal and state governments, foundations, and private-sector clients. Mathematica’s studies of education initiatives and other programs have informed national policymakers for more than 35 years. Mathematica strives to improve public well-being by bringing the highest standards of quality, objectivity, and excellence to the information collection and analysis it provides to its clients. Mathematica has offices in New Jersey, California, Illinois, Massachusetts, Michigan, and Washington, DC. See www.mathematica-mpr.com.

To Find Out More

Contact Mathematica’s project director, Brian Gill, by phone at (617) 301-8962 or by email at [email protected].

Data Confidentiality

Responses to the data collection activities will be used for research purposes only. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific district, school, or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law.

Mathematica follows the confidentiality and data protection requirements of the Institute of Education Sciences (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). We will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information on respondents will be linked to their institution but not to any individually identifiable information. No individually identifiable information will be maintained by the study team. All institution-level identifiable information will be kept in secured locations and identifiers will be destroyed as soon as they are no longer required.

