Alternative Student Outcomes for Growth Measures Case Studies

OMB: 1850-0901


OMB Supporting Statement Part B: Alternative Student Growth Measures for Teacher Evaluation: Case Studies of Early Adopters


August 6, 2013


SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT

This submission is a request for approval of data collection activities that will be used to support the Mid-Atlantic Regional Educational Laboratory (REL) Alternative Student Outcomes for Growth Measures Case Studies. The study is being funded by the Institute of Education Sciences (IES), U.S. Department of Education (ED), and is being implemented by ICF International and its subcontractor, Mathematica Policy Research.

This study aims to fill the gap in information available to districts and policymakers on measures of student growth that do not use state standardized tests, through qualitative case studies of eight school districts that are using alternative measures of student achievement growth in teacher performance ratings. The case studies will address which alternative outcome measures are used, how the alternative growth measures are implemented, what challenges and obstacles arise in implementation, and how the measures are being used. Where possible, the study team will examine the distribution of teacher performance on the measures, as compared with the distribution of teacher performance on conventional value-added measures that are based on state assessments. The study team will conduct semi-structured interviews with district administrators leading teacher evaluation or effectiveness efforts, teacher representatives (such as union leaders), teachers (including both classroom teachers and instructional coaches), and principals. The data collected will be summarized and analyzed using a case study approach.

This submission requests approval to recruit districts for the study and conduct in-person and telephone interviews with staff in participating districts.

PART B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

  1. Respondent Universe and Sampling Methods

A nationally representative sample of districts is not feasible for this study. The current documentation of the universe of districts implementing and piloting alternative measures of student growth is not sufficiently robust to permit the study team to identify every district in the universe within the study’s proposed timeline. However, the study team consulted a variety of sources to identify an initial pool of districts for recruitment that had been implementing alternative student growth measures for at least one school year. The study team used a literature review and the web-based database of the National Council on Teacher Quality to identify an initial pool of 13 districts using alternative student growth measures to estimate teacher effectiveness. For each of these districts, we reviewed information on the district’s website to determine the measures in use, the initial year of implementation, and the purpose of the measure. We identified 11 districts that had been implementing an alternative growth measure for at least one year and that used the measure in teacher evaluations or performance-related compensation systems.

Because the study’s goal is to provide as much detailed information as possible on the implementation of alternative growth measures, and because of the qualitative case study approach, the selection of districts for recruitment was designed to maximize the variety of growth measures included in the study. Districts were grouped by the type of alternative assessment applied in growth models, using the following three categories: (1) end-of-course curriculum-based assessments to which statistical growth models are applied; (2) nationally normed assessments, such as the PSAT or ITBS, to which statistical growth models are applied; and (3) Student Learning Objectives, which do not involve the application of statistical growth models and instead are intended to account for growth implicitly, because they are selected separately for each teacher’s students. Sorting potential case study districts into these categories ensured that the final study sample includes at least one district implementing each type of assessment. The team ranked the districts within each type based on the year the measure was first implemented, preferring districts that had been implementing the measure longer because they were more likely to have progressed beyond a pilot stage, would have more experience implementing the measure, and would have better insight into the medium- and long-term benefits and challenges of the measure. For the three highest-ranked districts within each type, we e-mailed the relevant district contact and followed up with a phone call within two business days. In two districts we were unable to obtain permission to conduct research within our timeline; in each case, we contacted the next-highest-ranked district (prioritizing so that at least one district used each type of measure).

The study team successfully recruited eight districts from the sorted and ranked pool of potential districts. The sample size was determined based on the timeline and resources available as well as on the size of the total universe of districts actively implementing alternative outcome growth measures. The sample includes districts with experience using each of the three types of alternative growth measures.

Data will be collected via interviews (described below) from up to 10 staff in each district, including the district staff person with the greatest responsibility for the program, two principals, three classroom teachers, and one union/association representative. In larger districts, the study team will conduct additional interviews. In districts with dedicated instructional leaders (e.g., instructional coaches and master teachers), the study team will also conduct one or two additional interviews with instructional leaders to supplement the interviews with classroom teachers and to gain a broader perspective on how implementing the measures affects teachers. The number of instructional leaders interviewed will depend on the size of the district.

  2. Procedures for the Collection of Information

The study team began the recruitment effort by mailing districts an introductory package, which included the following two documents:

  • Notification letter. The one-page notification letter describes the importance of studying the implementation of alternative measures of student growth, provides an overview of the study design, summarizes the benefits of participating, and notes that a study team member will follow up by telephone to discuss the study in more detail (Appendix A).

  • Study summary. The two-page summary describes the purpose of the study and the benefits of participation, identifies the study team, and provides contact information for the project director and the ED project officer. It also discusses the activities required of participating districts and schools (Appendix B).

The study team sent the notification letter and study summary to each district’s superintendent and either the director of human resources or the director of curriculum and instruction (whichever administrator was determined to be the appropriate contact regarding the implementation of student growth measures) via FedEx to highlight the importance of the documents. A Mathematica team member then followed up with the director of human resources or the director of curriculum and instruction to begin discussing the study. The study team scheduled a call to describe the study, explain the benefits of participation, discuss confidentiality procedures, and secure participation (Appendix C).

After receiving OMB approval to begin data collection in the sample districts, the study team will work with the district administrators to recruit the principals, teachers, union representatives, and instructional leaders to interview. Copies of the interview protocols for district administrators, principals, teachers, and union representatives are located in Appendixes D, E, F, and G, respectively.

To determine which principals and teachers will be interviewed, the study team will first solicit suggestions from key district administrators on the principals who are most familiar with the alternative student growth measure and the implementation process. The study team will also investigate which schools were leaders in developing or implementing the alternative student growth measure and which schools may be struggling with implementation. If possible, the study team will select principals from both types of schools to obtain multiple perspectives on the implementation process. Similarly, the study team will solicit from principals suggestions of teachers who may have played a more active role in the design or implementation of the alternative student growth measure and who are more knowledgeable about the measure’s applications. Because response bias is a concern, and because the small number of respondents prevents the study team from selecting a representative sample of all subjects and grades in a district, the team will focus on selecting teacher and principal respondents who will be most credible and accurate in their reports on the implementation, application, and effectiveness of the alternative measures of student growth. Teachers who were most active in piloting or implementing these measures are likely to be the most knowledgeable and credible sources of information on implementation. To ensure a range of viewpoints, the team will also solicit suggestions of teachers who struggled with implementation. Based on these suggestions, the study team will reach out to teachers who were active in piloting and implementing the measures, as well as to those who struggled with implementation. The study team will not exceed ten respondents in a given district.

  3. Methods to Maximize Response Rates and Deal with Nonresponse

Mathematica has developed multiple strategies to maximize response rates while minimizing burden on respondents. The study team has found in prior work that the following techniques contribute significantly to a high completion rate: establishing positive relationships with respondents and district staff; sending advance letters; and establishing efficient and flexible scheduling. To help alleviate districts’ concerns about data privacy, all information request documents will include a statement on our adherence to confidentiality and data collection requirements (Education Sciences Reform Act of 2002, Title I, Part E, Section 183).

The study team will adhere to any data collection requirements that districts may have, such as preparing research applications and seeking institutional review board (IRB) approvals. Now that the team has successfully recruited the eight districts needed for the study, it anticipates a 100 percent response rate for district and staff telephone interviews.

School-level staff (principals, teachers, and union/association representatives) will be recruited based on initial information provided by district administrators, as described in the previous section. The study team will work with district administrators to contact school-level staff and schedule interviews with staff who are willing to participate in the study. Therefore, the study team does not anticipate significant nonresponse. However, if a school-level staff member declines to participate in an interview, the study team will identify another individual in the same respondent category based on input from district administrators. The study team will continue to recruit additional school-level staff members until the team meets the target thresholds by respondent type for each district: at least two school principals, at least three classroom teachers, and at least one teachers’ union/association district representative.

  4. Tests of Procedures or Methods to Be Undertaken

The study team piloted the interview protocol with one district administrator in each district in June 2013. The pilot assessed the content, clarity, and wording of individual questions; respondent burden time; and the use of probes. It also served as an initial collection of data for the study. The study team has analyzed the interview data collected during the pilot and produced a preliminary report based on this analysis (expected release in February 2014). After the pilot was completed, the principal investigators determined that the interview protocols did not require revision.



  5. Individuals Consulted on Statistical Aspects of the Design and Collecting and/or Analyzing Data

The following individuals were consulted on the statistical aspects of the study design and on the data collection and analysis:

Name                   Title                                                  Telephone Number
Brian Gill             Senior Fellow, Mathematica                             617-301-8962
Claudia Gentile        Associate Director, Mathematica                        609-275-2379
Patricia Del Grosso    Researcher, Mathematica                                540-961-1585
Moira McCullough       Researcher, Mathematica                                617-301-8965
Robert Boruch          Professor, University of Pennsylvania                  215-898-0409
Laura Hamilton         Senior Behavioral Scientist, RAND                      412-683-2300, x4403
Christopher Hulleman   Research Associate Professor, University of Virginia   434-924-6998
Andrew Porter          Professor, University of Pennsylvania                  215-898-7014
Christopher Rhoads     Assistant Professor, University of Connecticut         860-486-3321


