Evaluation of Moving High-Performing Teachers to Low-Performing Schools

OMB: 1850-0861


Contract No.: ED-04-0112/0007

MPR Reference No.: 6398





Evaluation of Moving High-Performing Teachers to Low-Performing Schools


Part A: Supporting Statement for Paperwork Reduction Act Submission


August 22, 2008


















Submitted to:


U.S. Department of Education

Institute of Education Sciences

555 New Jersey Avenue, NW

Washington, DC 20208

Telephone: (202) 208-7169


Project Officer: Elizabeth Warner



Submitted by:


Mathematica Policy Research, Inc.

600 Maryland Ave. S.W., Suite 550

Washington, DC 20024-2512

Telephone: (202) 484-9220

Facsimile: (202) 863-1763


Project Director: Steve Glazerman




CONTENTS

Part A: Supporting Statement for Paperwork Reduction Act Submission

A. JUSTIFICATION

1. Circumstances Necessitating the Collection of Information
2. Purposes and Uses of the Data
3. Use of Technology to Reduce Burden
4. Efforts to Avoid Duplication
5. Methods to Minimize Burden on Small Entities
6. Consequences of Not Collecting Data
7. Special Circumstances
8. Federal Register Announcement and Consultation
   a. Federal Register Announcement
   b. Consultations Outside the Agency
   c. Unresolved Issues
9. Payments or Gifts
10. Assurances of Confidentiality
11. Additional Justification for Sensitive Questions
12. Estimates of Hours Burden
13. Estimates of Cost Burden to Respondents
14. Estimates of Annual Costs to the Federal Government
15. Reasons for Program Changes or Adjustments
16. Plans for Tabulation and Publication of Results
   a. Tabulating Descriptive Information
   b. Estimating Teacher Value Added
   c. Estimating Impacts of the Master Teacher Residency Program
17. Approval to Not Display the OMB Expiration Date
18. Explanation of Exceptions

Part B: Supporting Statement for Paperwork Reduction Act Submission

B. Collection of Information Employing Statistical Methods

1. Respondent Universe and Sampling Methods
2. Statistical Methods for Sample Selection and Degree of Accuracy Needed
   a. Evaluation of Impact on Student Achievement
   b. Candidate Survey
3. Methods to Maximize Response Rates
4. Pilot Testing
5. Individuals Consulted on the Statistical Aspects of the Design

APPENDICES



APPENDIX A: COVER LETTER FOR MASTER TEACHER RESIDENCY PROGRAM CANDIDATES TO ACCOMPANY CANDIDATE SURVEY

APPENDIX B: PRETEST MATERIALS

B1. Pretest Draft for the Master Teacher Residency Program Candidate Survey

B2. Pretest Debriefing Protocol


APPENDIX C: DRAFT RECRUITMENT MATERIALS

C1. District Recruitment Protocol

C2. Principal Recruitment Letter

C3. Information Sheet For Principals

C4. Letter of Introduction and Invitation to Apply to Program for Eligible Teacher Candidates


APPENDIX D: CONFIDENTIALITY PLEDGE

APPENDIX E: REFERENCES

TABLES

A-1 SCHEDULE OF MAJOR STUDY ACTIVITIES

A-2 BURDEN ESTIMATES BY RESPONDENT AND DATA REQUEST



Part A: Supporting Statement for Paperwork Reduction Act Submission

This OMB package requests clearance to recruit school districts for an upcoming evaluation testing the effect of teacher incentives designed to move high-performing teachers to targeted low-performing schools (hence the evaluation's title, "Moving High-Performing Teachers to Low-Performing Schools"). The evaluation aims to estimate the impact of the high-performing teachers on the low-performing schools to which they transfer. The evaluation design is a randomized experiment in which the researchers will randomly assign schools that have a teaching vacancy in targeted grades and subjects to an intervention group or a control group. High-performing teachers will be offered bonuses for transferring to and remaining in the intervention schools for two years. Control schools will fill their teaching vacancies as they normally would if they were not part of a study. We will compare student achievement and other outcomes between the intervention and control schools to estimate the impact of the intervention.

In addition to the clearance request for recruiting, we are requesting clearance to collect student records data from those recruited districts and administer a data collection form to a group of 61 teachers participating in a pilot study that will be conducted for the 2008-09 school year. We refer to this teacher data collection form as the “Candidate Survey.”

This request is the first of two. A future request will seek clearance to collect additional teacher and principal survey data associated with the evaluation. With the exception of the Candidate Survey covered in this request, we are not requesting any clearance for data collection forms during the pilot study that is currently taking place. The pilot study involves one district and will allow us to pretest three additional surveys—one of teachers, one of principals, and one of district human resources staff—on a sample of 9 or fewer individuals each.

We are submitting the package in two stages because site identification and recruitment must begin before all the data collection instruments are developed and pretested, and because implementing the Candidate Survey with a full sample of 61 teachers will allow us to learn the appropriate lessons from the pilot study before moving on to the planned full study. The draft letter requesting teacher participation in the Candidate Survey appears in Appendix A, the draft Candidate Survey to be used in the pretest in Appendix B, recruitment materials in Appendices C1-C4, and MPR's internal confidentiality pledge in Appendix D. References appear in Appendix E.

A. JUSTIFICATION

1. Circumstances Necessitating the Collection of Information

The specific legislation authorizing this data collection is Section 9601 of the No Child Left Behind Act of 2001 (NCLB). The law stipulates that federal funds are to be used to evaluate activities authorized under the Act. NCLB, which reauthorized the Elementary and Secondary Education Act of 1965 (ESEA), emphasizes the importance of teacher quality in improving student achievement. Title II, Part A of ESEA, the Improving Teacher Quality State Grants program, provides nearly $3 billion a year to states to prepare, train, and recruit high-quality teachers. The purpose of Title II, Part A is to help states and local school districts ensure that all students have effective teachers. One allowable use of Title II, Part A funds is developing merit-based performance systems and strategies that provide differential and bonus pay for teachers in high-need academic subjects, such as mathematics, and for teachers in high-poverty schools and districts.

Research shows that high-quality teachers are critical to raising student achievement (Rivkin et al. 2005; Rockoff 2004; Rowan et al. 2002), yet the schools most in need of effective strategies for improving achievement often have difficulty attracting and retaining high-quality teachers (Carroll et al. 2000; Lankford et al. 2002; Roellke 2007). On average, the result is that the least experienced and lowest-quality teachers teach the neediest, lowest-achieving students. Increasingly, districts and schools are experimenting with teacher compensation reform as one mechanism to address this maldistribution of teacher quality.

In recent years, multiple districts have implemented or considered implementing various forms of merit pay to improve teacher quality in low-performing schools. However, policymakers lack rigorous research evidence about the forms of merit pay that are successful in improving the quality of teachers assigned to students. The National Center for Education Evaluation within the U.S. Department of Education’s Institute of Education Sciences has contracted with Mathematica Policy Research, Inc. (MPR) and its subcontractors to develop and conduct a test of an approach to merit-based bonuses for teachers who transfer to low-performing schools. Throughout this package this approach is called the Master Teacher Residency Program (MTRP).1 It aims to encourage high-performing teachers (“Master Teachers”) to transfer to low-performing schools by identifying teachers with consistently high performance (“value added” as measured by test score growth of their students, adjusting for the background characteristics of the students) and offering them recruitment and retention bonuses.

The evaluation effort will first assess the feasibility of the MTRP by mounting a pilot for the 2008-2009 school year with eight low-performing schools in one school district. Next, the team will recruit approximately 10 districts to implement the intervention on a larger scale starting in 2009-2010. A detailed study timeline is given below.

The evaluation design of the full study is a randomized trial. Within each district, MPR will identify eligible schools and randomly assign half of them to a treatment group that receives the intervention and half to a control group that does not. “Receiving the intervention” means that the school may hire one of the Master Teacher candidates, who would then be eligible for the transfer incentive.

As part of the evaluation, MPR will collect data on treatment and control group schools (teachers and principals) and collect school records to estimate the impact of the intervention on teacher and student outcomes. As mentioned above, this request for OMB clearance is the first of two submissions, covering one pilot year survey and the district recruitment for the full study. The second submission will cover the rest of the full study data collection, which comprises the teacher candidate survey, a survey of newly hired teachers, and a principal survey. A more detailed data collection plan is described below.

The study’s central research question is the following: What impacts do Master Teachers have on student achievement when they are placed in low-performing schools? The study will examine the impact of the intervention on increasing student achievement within classrooms, grade levels, and schools. Several additional research questions are also important for policymaking:

  • What is the overlap between high-performing teachers and low-performing schools? In other words, how serious is the unequal distribution of teacher talent?

  • How responsive to incentives are high-performing teachers?

  • What factors influence career decisions of high-performing teachers?

  • Who fills teaching vacancies in low-performing schools in the absence of incentives?

In combination with random assignment of schools, the survey data and school records data will be used to answer these questions.

Study Timeline. The pilot study will be implemented for the school years 2008-2009 and 2009-2010 based on fall 2008 teaching vacancies.

The full-scale study will be implemented for the school years 2009-10 and 2010-11 and include district recruitment, identification of high-performing teachers, and random assignment of low-performing schools with fall 2009 vacancies (see Table A-1). A report describing MTRP implementation and presenting the first-year impacts will be prepared in summer 2011. A second report on impacts during the intervention’s first and second years and the retention rates of Master Teachers will be prepared in summer 2012.

Table A-1
Schedule of Major Study Activities

  • Pilot Candidate Survey (n=61): Fall 2008
  • Recruit districts for full study: Fall 2008 through Spring 2009
  • Identify high-performing teachers for full study: Spring 2009
  • Identify low-performing schools with fall 2009 vacancies: Spring-Summer 2009
  • Conduct school random assignment: Summer 2009
  • Conduct Candidate Survey (n=600): Fall 2009
  • Conduct New Hire Survey (n=200 teachers for fall 2009; n=40 replacement teachers for fall 2010): Spring 2010 and Spring 2011
  • Conduct Principal Survey (n=160): Spring 2010 and Spring 2011
  • Conduct Human Resources Survey (n=10): Spring 2010 and Spring 2011
  • Collect student records data for impact analysis: Summer 2010 and Summer 2011
  • Prepare Year 1 report: Summer 2011
  • Prepare Year 2 report: Summer 2012


Data Collection Plan. The study includes several complementary data collection efforts that support answers to the study's research questions. A brief description of each data collection activity is provided below. Only two items below are part of this clearance request (they are also part of the full-scale study): we request clearance to collect districts' student achievement records linked to teachers, in order to identify high-performing teachers, and to pilot the Candidate Survey (see Appendix B) with all of the estimated 61 candidates in the pilot district. Other forms to be used in the study will be developed and submitted in the full-scale study clearance package along with pretest results and estimated burden time for each.

  • Candidate Survey. In addition to administering this instrument in the pilot study, we will administer it for the full study in fall 2009. It will help us describe the backgrounds of teachers identified as high performing (MTRP candidates) and learn more about the factors that affected their willingness to participate, as well as their experiences during the hiring process. At this point, we are only requesting clearance to field this instrument in early fall 2008, after OMB approval is received.

  • New Hires Survey. This survey will be administered in spring 2010 in the full study to all teachers who fill one of the vacancies in intervention or control schools. The survey will tell us about teachers’ experiences at their new schools and will collect information about teacher professional characteristics and other factors that may affect their students’ achievement. This instrument is not in this package. We will pretest the instrument this year as part of the pilot study with no more than 9 respondents and request clearance for the full study in a subsequent clearance request.

  • Principal Survey. A principal survey will be administered in early spring 2010 and 2011 in the full study to obtain data from principals in intervention and control schools about recruitment and hiring as well as their assessments of the teachers hired in the study’s target grades and any redistribution of resources related to the arrival of the new hire. This instrument is not in this package. We will pretest the instrument this year as part of the pilot study with no more than 9 respondents and request clearance for the full study in a subsequent package.

  • Human Resource Personnel Survey. An interview protocol will be developed and administered to ask one senior human resources (HR) staff member in each study district about their district’s role in teacher hiring, recruitment, and transfers. The interviews will be conducted in spring 2010 and spring 2011 in the full study. This instrument is not in this package. We will pretest the instrument this year as part of the pilot study with no more than 9 respondents and request clearance for the full study in a subsequent package.

Program Application. Eligible teachers will be asked to complete an on-line application for consideration for the program. The primary purpose of the application is to enroll the teacher in the program for hiring consideration by principals and HR personnel. We estimate that a teacher may spend up to 20 minutes completing the application on-line and have included this time in the burden estimate.

Student Records. Student achievement is both the key element used to identify high-performing teachers and the critical outcome for the evaluation. Student demographic data will serve as an important control variable in the analyses. In the full study, in addition to test scores, we will collect data such as student age, race/ethnicity, English language proficiency, disability status, eligibility for school lunch programs, and mobility status.

  • Student Records for Identification of High-Performing Teachers (requesting clearance as part of recruitment for the full study). As districts are recruited (and by spring 2009), we will collect four years of test score data (for the 2004-2005 through the 2007-2008 school years) and three years of enrollment and demographic data (for the 2005-2006 through the 2007-2008 school years) for all students in each district in order to conduct the value-added analysis to identify high-performing teachers. Details of the value-added analysis are discussed below in Section A16.

  • Student Records for Measuring Student Achievement Outcomes. In the full study, we will collect data in targeted classrooms at two additional points in order to obtain student achievement outcome data: summer 2010 for the 2009-2010 school year and summer 2011 for the 2010-2011 school year.

2. Purposes and Uses of the Data

The primary purpose of the evaluation is to estimate the impacts of high-performing teachers on student achievement in low-performing schools. The research team will also study the implementation of the MTRP. A pilot, which follows one cohort of teachers for two years, will be conducted for the 2008-2009 and 2009-2010 school years to assess the feasibility of expanding the study to approximately 10 districts. The full study is planned for implementation in years 2009-10 and 2010-11.

The information collected will fill several gaps in knowledge about teacher quality. It will inform policy decisions about strategies for addressing the uneven distribution of teacher quality and potentially raising student achievement by inducing high-performing teachers to move to and remain in low-performing schools. For example, the study will provide new evidence on the degree to which teacher quality is inequitably distributed across schools. The study will also be able to combine value-added data with more readily observed proxies for teacher quality to see if they are correlated and will then map them to school characteristics to assess the degree of skewness in the distribution of teacher quality.

Findings will be presented in two reports. In addition, the data collected by the evaluation will be submitted to ED as restricted-use data files that will serve as a valuable resource for other researchers to further study these issues.

3. Use of Technology to Reduce Burden

The data collection plan was designed to obtain reliable information in an efficient way that minimizes respondent burden. Consistent with that goal, information will be gathered from existing data sources where feasible. Existing data sources will include test scores for school-administered tests. This information will be obtained in the form of computer files provided by the school district.

The Candidate Survey will be mailed to respondents to complete and return, with telephone followup for nonrespondents. We considered other modes of survey administration, such as a computer-assisted telephone interview (CATI) or a web-based survey. However, because the sample size is small relative to the fixed cost of advanced data collection methods, the cost of developing a computer-assisted survey outweighs the benefits. Respondents also may find a mail questionnaire less burdensome, because a computer-assisted interview typically must be conducted when the respondent has access to a telephone or computer, and access to telephones and private access to computers is uneven.

4. Efforts to Avoid Duplication

No other survey or evaluation has been conducted of these sample groups for this purpose, and no equivalent sources of data exist for the study. The study will avoid duplication of data collection efforts by using student test scores and background information from existing district testing programs and administrative records.

5. Methods to Minimize Burden on Small Entities

The primary entities for the study are districts and teachers. Burden is minimized for all respondents by requesting only the minimum data required to meet the study’s objectives. The data requirements were determined by careful consideration of the information needed to meet the study’s objective and will be reviewed by the study’s Technical Working Group (TWG) before the OMB package for the full study is submitted in December 2008.

6. Consequences of Not Collecting Data

The data collection plan described in this submission is necessary for conducting ED's Evaluation of Moving High-Performing Teachers to Low-Performing Schools and, consistent with the goals of NCLB, for addressing the uneven distribution of teacher quality and raising student achievement by requiring that all students be taught core subjects by highly qualified teachers.

7. Special Circumstances

There are no special circumstances associated with this data collection.

8. Federal Register Announcement and Consultation

a. Federal Register Announcement

The 60-day Federal Register Notice was published in the Federal Register, Vol. 73, No. 118, page 34715, on June 18, 2008.


One comment was received and has been addressed.


Only data that can contribute significantly to the evaluation will be gathered. The student records data to be gathered in the pilot year will be used to identify high-performing teachers who are eligible to participate in the program. The districts participating in the study volunteer to be part of the study and are fully apprised of the data requirements when they are recruited. In addition, the study will provide, as needed, technical assistance to districts in order to obtain the needed data. The teacher Candidate Survey will be kept to under 30 minutes, and, pending OMB approval, teachers will receive a $25 incentive payment for their participation. A pretest with 9 respondents will be conducted prior to fielding the survey with the larger pilot sample, to ensure that the instrument can be completed within 30 minutes.


The full-scale study will adhere to all of the burden-minimizing practices established for the pilot.



b. Consultations Outside the Agency

During the preparation of the design and implementation of the pilot study, the study will seek input from its TWG, which will include a number of the nation’s leading experts in areas relevant to this study. Throughout the study, the study team also will consult with the TWG on other issues that would benefit from its input.

Members of the TWG include:

Dale Ballou (Vanderbilt University)

Brad Jupp (Denver Public Schools)

Tom Kane (Harvard Graduate School of Education)

Rob Meyer (University of Wisconsin Center for Education Research)

Tony Milanowski (University of Wisconsin Center for Education Research)

Jeff Smith (University of Michigan)

Louise Sundin (formerly of the Minneapolis Federation of Teachers)

Jake Vigdor (Duke University)


c. Unresolved Issues

None.

9. Payments or Gifts

For fall 2008, we plan to administer the pilot version of the Candidate Survey with an estimated 61 teachers in the one school district participating in the pilot study (during 2008-2009). We propose offering a $25 incentive payment for completion of the pretest survey. This proposed amount is within the incentive guidelines as outlined in the memo, “Guidelines For Incentives For NCEE Evaluation Studies,” prepared for OMB March 22, 2005.

The incentives have been proposed for the Candidate Survey to offset anticipated reluctance among candidates, a majority of whom may not see any benefit in participating in the survey. In fact, the candidates whose survey responses are of greatest interest include those who chose not to engage with the MTRP and those who were rejected from MTRP teaching positions for which they applied. These two groups are less likely to complete surveys, but their perspectives are key to learning all that we can from the pilot phase of the study. A second reason to use respondent incentives for the Candidate Survey is to mimic the planned conditions of the full-scale study as closely as possible, in order to gain the full benefit of the pilot. We therefore propose to use the same respondent incentives for which we plan to request clearance for the Candidate Survey administration in the full-scale study.



10. Assurances of Confidentiality

The data collection efforts that are the focus of this clearance package will be conducted in accordance with all relevant regulations and requirements. These include the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires "[a]ll collection, maintenance, use, and wide dissemination of data by the Institute" to "conform with the requirements of section 552 of Title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h)." These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment. In addition, for student information, the Director will ensure that all individually identifiable information about students, their academic achievements, and their families, and information with respect to individual schools, remains confidential in accordance with section 552a of Title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act.

Subsection (c) of section 183, referenced above, requires the Director of IES to "develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data."

The study will also adhere to requirements of subsection (d) of section 183 prohibiting disclosure of individually identifiable information as well as making the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.

MPR and its subcontractors NTRT and Optimal will protect the confidentiality of all information for the study and will use it for research purposes only. No information that identifies any study participant will be released. Further, personally identifiable data will not be entered into the analysis file; data records will contain a numeric identifier only. When reporting the results, data will be presented only in aggregate form, such that individuals and institutions will not be identified. A statement to this effect will be included with all requests for data. The teacher survey will include a reminder about confidentiality protection in compliance with the legislation. When data are collected through telephone or in-person follow-up interviews, respondents will be reminded about the confidentiality protections, the voluntary nature of the survey, and their right to refuse to answer individual questions. Further, no individually identifiable information will be maintained by the study team. All members of the study team having access to the data will be trained and certified on the importance of confidentiality and data security. All data will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required.

The following safeguards are routinely employed by MPR to carry out confidentiality assurances during the study:

  • All employees at MPR sign a confidentiality pledge (Appendix D) emphasizing its importance and describing their obligation.

  • Access to sample selection is limited to those who have direct responsibility for providing and maintaining sample locating information. At the conclusion of the research, these data are destroyed.

  • Identifying information is maintained on separate forms and files, which are linked only by sample identification number.

  • Access to the file linking sample identification numbers with the respondents’ ID and contact information is limited to a small number of individuals who have a need to know this information.

  • Access to the hard copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.

  • Computer data files are protected with passwords, and access is limited to specific users. Especially sensitive data are maintained on removable storage devices that are kept physically secure when not in use.

The Privacy Act of 1974 applies to this collection. MPR will make certain that all surveys are held strictly confidential, as described above, and that in no instance will responses be made available except in tabular form. Under no condition will information be made available to school personnel. District and school staff responsible for assisting MPR in the data collection will be fully informed of MPR’s policies and procedures regarding confidentiality of the data.

A System of Records notice has been prepared and submitted.

11. Additional Justification for Sensitive Questions

We do not anticipate that any of the data collection forms will contain items considered to be of a sensitive nature.


12. Estimates of Hours Burden

Table A-2 below reports the estimated burden hours for 61 teachers to complete the pilot Candidate Survey. District staff hours to prepare the student records data file are also provided and are based upon our experience with the pilot site.


TABLE A-2

BURDEN ESTIMATES BY RESPONDENT AND DATA REQUEST

Respondent/Data Request              Number of      Unit Response    Total Response
                                     Respondents    Time (hours)     Time (hours)

Teacher Candidates
  Candidate Survey                        61             0.5               30.5
  Teacher Applicants (estimated)          20             0.3                6.0

District Staff
  Records File Preparation               10            120            1,200.0

Total                                    91                           1,236.5

A total of 30.5 teacher burden hours is estimated for implementing the Candidate Survey in 2008, based on a projected 30 minutes for each of the 61 candidates to complete the survey alone. We estimate that up to 20 eligible teachers will also complete a program application, adding 6 hours. District staff hours to prepare the student records are estimated at 120 hours for each of 10 districts, for a total of 1,200 hours. Thus, the total burden for data collection is 1,236.5 hours.

13. Estimates of Cost Burden to Respondents

There are no start-up costs for respondents.

14. Estimates of Annual Costs to the Federal Government

The estimated cost of the study is $11,692,524 and the estimated annual costs to the federal government are $2,338,504.

15. Reasons for Program Changes or Adjustments

This is a new data collection, resulting in a program change of 1,236.5 hours for data collection.

16. Plans for Tabulation and Publication of Results

Our discussion of tabulation and publication plans focuses on the analyses we will conduct based on information gathered from the pilot study. We also include a brief discussion of the analyses planned for the full study reports. Our tabulation plans for both the pilot and full study phases include (a) tabulating descriptive information on teachers, schools, and the implementation of the MTRP; (b) estimating teacher value added; and (c) estimating impacts of the MTRP.

Tabulation Plans. Our plans for tabulations include three sets of analyses. Some of these analysis plans are included for reference only, because they include data that will be collected after a future request for clearance is approved.

a. Tabulating Descriptive Information

To identify challenges and develop strategies for refining the intervention, the evaluation will describe the implementation of the MTRP. Through surveys of principals and human resource personnel administered in the full study, we will examine schools' and school districts' roles in teacher hiring, recruitment, and transfer processes. Data gathered from the Candidate Survey in both the pilot and full study phases will enable us to assess teacher responsiveness to outreach and recruiting and to identify factors that influence teachers' decisions to apply for and pursue a position in a low-performing school. The New Hires Survey administered in the full study will add information about job search experiences, allowing us to characterize the process of filling vacancies in low-performing schools. We will also use data from all of the surveys to describe stakeholders' attitudes about and satisfaction with the MTRP.

Other data tabulations conducted in the full study will provide important context for the impact evaluation. We will combine data from our two-stage identification of high-performing teachers with publicly available data on schools to describe the distribution of teacher quality. Though the primary focus of the evaluation is on estimating impacts on student achievement, we will also tabulate data to describe the broader effects of the MTRP on schools and school districts, examining other outcome measures such as principals' subjective ratings of newly hired teachers' performance, as well as retention rates tracked by the study.

We will also synthesize data from various sources (school characteristics, teacher surveys, HR surveys) in the full study to provide information critical for interpreting impact estimates. For example, assessing the background characteristics of teachers in treatment and control schools obtained from the New Hire Surveys will enable us to describe and interpret the treatment/control contrast. This assessment of teacher characteristics may also guide us in defining important subgroups. Analyzing Principal Surveys will provide information on resource allocations within schools that will be key to understanding any distributional effects that may contribute to the net impact on student achievement.

b. Estimating Teacher Value Added

In both the pilot and the full study, the credibility of the MTRP depends on our ability to identify high-performing teachers. Our approach relies on a critical first stage of selecting only those teachers with a proven track record of raising student achievement. We identify teachers with such a track record by using several years of achievement data to estimate teachers’ value-added—the unique contribution that a teacher makes to student achievement growth in a typical year. Using the estimates of each teacher’s value-added, we will identify a list of high-performing teachers from the upper tail of the performance distribution. We will then target those teachers for the next stage of teacher recruitment.

The value-added model of teacher performance is based on a student achievement growth model which controls for students’ prior achievement and student background. Specifically, the model will be a regression of a student’s test score in a given subject and year on the student’s previous-year score, background characteristics, and a teacher effect:

\[ Y_{ijt} = \lambda Y_{ij,t-1} + \beta X_{it} + \delta_j T_{ijt} + \tau_t + \varepsilon_{ijt} \qquad (1) \]

where \(Y_{ijt}\) is the year-\(t\) test score for student \(i\) in the subject taught by teacher \(j\); \(Y_{ij,t-1}\) is the previous-year test score; \(X_{it}\) is a set of student characteristics included as control variables; \(T_{ijt}\) is a teacher dosage variable indicating the fraction of the year the student was taught by teacher \(j\); \(\tau_t\) is a fixed year effect;2 \(\varepsilon_{ijt}\) is a random error term; and \(\lambda\), \(\beta\), the \(\delta_j\), and the \(\tau_t\) are parameters to be estimated. Our goal is to estimate the coefficients \(\delta_j\) on the teacher dosage variables to measure each teacher's value added.
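To make the estimation concrete, the following is a minimal sketch of how a regression like equation (1) could be estimated. It is illustrative only: the file name, variable names (post, pre, frl, ell), and the choice of Python with statsmodels are our assumptions rather than the study's actual production code, and the sketch simplifies by treating dosage as a full year for every student-teacher link.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical student-level file: one row per student-year-teacher link.
    # post = year-t score, pre = prior-year score, frl/ell = background
    # controls, year = school year, teacher = teacher identifier.
    df = pd.read_csv("student_records.csv")

    # Equation (1): regress the current score on the prior score, student
    # characteristics, year fixed effects, and teacher indicators. The
    # coefficients on C(teacher) are the value-added estimates (delta_j),
    # measured relative to an omitted reference teacher. A fuller version
    # would scale each teacher indicator by the dosage variable.
    fit = smf.ols("post ~ pre + frl + ell + C(year) + C(teacher)", data=df).fit()

    # Rank teachers by their estimated effects.
    teacher_effects = fit.params.filter(like="C(teacher)")
    print(teacher_effects.sort_values(ascending=False).head(10))

Pooling several years of data, as described below, amounts to stacking additional student-year rows into the same analysis file before fitting the model.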

Volatility in Test Scores. Because teachers necessarily work with a small sample of students each year, random one-time events that affect scores but are unrelated to true achievement can be a problem. Such volatility can undermine attempts to estimate meaningful rankings of teacher performance (Kane and Staiger 2002). To account for this imprecision, we will follow Kane and Staiger's recommendation to aggregate test score information over several years when estimating teacher effectiveness. Specifically, we will pool at least three years of student learning gains in estimating the value-added model.

Errors in Variables and Attenuation Bias. Another concern with value-added models is the well-documented attenuation bias that results from including an explanatory variable (the pretest) that is measured with error (Meyer 1992). While the bias in the coefficient on the pretest is known to be attenuating (that is, biased toward zero), the bias in the teacher effects, which are the main parameters of interest, is of unknown direction.

We will deal with measurement error by estimating the reliability of the pretest and using it to adjust the diagonal elements of the cross-product matrix formed to compute the regression coefficients, removing the estimated measurement-error variance.3 As a test of the robustness of this approach, and of the estimates to assumptions about errors in variables in general, we will recompute the teacher value-added estimates using gain scores (post-test minus pretest) and errors-in-variables regressions with different values of the reliability estimate, each time examining how our list of top teachers changes. If a teacher initially identified as high performing fails to remain in that group under alternative models, we will examine the specific patterns of achievement gains and student characteristics to determine whether the problem lies with the base model or the alternative models and make any necessary corrections.
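As an illustration of this type of diagonal adjustment (the study's actual computation may instead use a packaged routine such as Stata's eivreg, per footnote 3), here is a minimal sketch under the assumption of a single mismeasured pretest column; all names and the simulated data are hypothetical:

    import numpy as np

    def eiv_regression(X, y, pretest_col, reliability):
        """Errors-in-variables OLS with one mismeasured column (the pretest).

        reliability = Var(true pretest) / Var(observed pretest), so the
        measurement-error variance is (1 - reliability) * Var(observed).
        The correction removes that error variance from the pretest's
        diagonal element of the cross-product matrix X'X.
        """
        n = X.shape[0]
        err_var = (1.0 - reliability) * np.var(X[:, pretest_col], ddof=1)
        XtX = X.T @ X
        XtX[pretest_col, pretest_col] -= n * err_var
        return np.linalg.solve(XtX, X.T @ y)

    # Simulated check: true slope 0.8, pretest observed with reliability 0.85.
    rng = np.random.default_rng(0)
    n = 5000
    x_true = rng.normal(size=n)
    w = x_true + rng.normal(scale=np.sqrt(1 / 0.85 - 1), size=n)
    y = 0.5 + 0.8 * x_true + rng.normal(scale=0.5, size=n)
    X = np.column_stack([np.ones(n), w])
    print(eiv_regression(X, y, pretest_col=1, reliability=0.85))  # slope near 0.8

Re-running the last line with different reliability values is exactly the robustness exercise described above: it shows how sensitive the coefficients, and hence the teacher rankings, are to the assumed degree of measurement error.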

Sampling Error. There will always be some margin of error in the estimation of teachers’ value-added, even with several years of data. We aim to determine the cutoffs for high performance in such a way that minimizes the probability of including a teacher who might have a high value-added score by pure chance. We will use the estimated standard error for each teacher effect to characterize the uncertainty with which the teacher falls in the high performing category. One way to use this information is to apply an empirical Bayes or “shrinkage” estimator that replaces each teacher effect with a weighted average of the mean teacher effect and the observed one, with the weights being a function of the standard error of measurement. Less precisely estimated teacher effects (those based on less data) will “shrink” closer to the mean teacher effect.
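A minimal sketch of such a shrinkage estimator appears below. The method-of-moments estimate of the between-teacher variance is one simple choice among several, and the numbers are hypothetical:

    import numpy as np

    def eb_shrink(estimates, std_errors):
        """Empirical Bayes ('shrinkage') estimator for teacher effects.

        Each raw effect is replaced by a precision-weighted average of
        itself and the mean effect; noisier estimates shrink further
        toward the mean.
        """
        est = np.asarray(estimates, dtype=float)
        var = np.asarray(std_errors, dtype=float) ** 2
        grand_mean = est.mean()
        # Between-teacher variance: observed spread minus average noise.
        tau_sq = max(est.var(ddof=1) - var.mean(), 0.0)
        weights = tau_sq / (tau_sq + var)
        return weights * est + (1.0 - weights) * grand_mean

    # The third teacher's effect is estimated imprecisely (SE = 0.25),
    # so it shrinks most strongly toward the mean effect.
    print(eb_shrink([0.30, 0.10, 0.45, -0.05], [0.05, 0.05, 0.25, 0.05]))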

c. Estimating Impacts of the Master Teacher Residency Program

The full study will estimate the impacts of the MTRP on student achievement in the treatment schools by exploiting the random assignment of schools to treatment conditions. Randomly assigning low-performing schools to either a treatment group that has the opportunity to hire a participant in the MTRP or to a control group not able to hire through the MTRP allows us to attribute differences in outcomes between the treatment and control schools to their participation in the MTRP. Random assignment ensures that there are no systematic differences between treatment and control schools prior to starting MTRP; we will use statistical hypothesis tests (for example, a t-test for the difference in two means) to account for chance differences between treatment and control schools. The difference between average outcomes among students in treatment and control schools is a simple unbiased estimator of the impact of the intervention.

We hypothesize that placing a high-performing teacher in a low-performing school can result in three fundamental effects: direct effects of high-performing teachers on students in their own classrooms, indirect effects of Master Teachers on students in other classrooms, and distributional effects resulting from principals redirecting students or resources due to the presence of Master Teachers.

  • Direct Effects on Student Achievement. Student test scores will be used to examine whether Master Teachers raise the achievement of the students in their own classroom relative to the achievement that would have been attained (the “counterfactual”) had the students been taught by (a) another teacher in the same school, or (b) whomever would have been hired by the school had the MTRP not been in existence (represented by the control teacher group).

  • Indirect Effects on Student Achievement. Student test scores and teacher and principal surveys will be used to examine whether the potential benefits of a Master Teacher in residence may spill over to colleagues and affect students in other classrooms where teachers collaborate on lesson planning and curriculum design or where Master Teachers provide information, mentoring, or other support of colleagues.

  • Distributional Effects on School Resources. Principals may assign Master Teachers the hardest-to-teach students or redirect mentoring or supervisory time they would normally devote to newly hired teachers toward other teachers in the school. The presence of distributional effects would make it difficult to distinguish between true direct and indirect effects. A principal survey will ask detailed questions about the allocation of school resources.

The sum of these effects is the net impact of the MTRP on student achievement. By comparing average outcomes for students of Master Teachers in treatment schools to average outcomes for students of their counterparts (new hires) in control schools, we estimate the direct effect, plus some distributional effects, of Master Teachers. By comparing average achievement growth for the whole treatment school (or grade level within the school) to that of the whole control school (or grade level), we obtain an unbiased estimate of the net impact of the Master Teachers.

Building upon this simple comparison of means, we will compute regression-adjusted estimates of the impacts of the MTRP. Using regression procedures increases the statistical precision of the impact estimates by enabling us to account for student, teacher, and school characteristics other than MTRP status that could affect the outcome.
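A minimal sketch of such a regression-adjusted estimator follows; the file and variable names are hypothetical, and standard errors are clustered at the school level because schools, not students, are the unit of random assignment:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical analysis file: score = achievement outcome, treat = 1 if
    # the student's school was assigned to the MTRP treatment group,
    # pre/frl/ell = baseline covariates, school = school identifier.
    df = pd.read_csv("impact_analysis_file.csv")

    # The coefficient on `treat` is the regression-adjusted estimate of the
    # net impact of the MTRP on student achievement.
    fit = smf.ols("score ~ treat + pre + frl + ell", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["school"]}
    )
    print(fit.params["treat"], fit.bse["treat"])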

Plans for Publication of Results. Two reports will be prepared under the full-scale study. The first report, to be submitted to ED in July 2011, will describe implementation, including challenges in identifying high-performing teachers willing to move to low-performing schools, and will examine the impacts of these teachers on student achievement during the intervention’s first year. The second report will be submitted to ED in July 2012. It will address impacts on student achievement during the intervention’s first and second years and will report on the retention rates of Master Teachers relative to new hires in control schools.

17. Approval to Not Display the OMB Expiration Date

The study will display the OMB expiration date.

18. Explanation of Exceptions

No exceptions to the certification statement are being sought.


1 The name of the intervention and the terminology used to refer to its participants will be updated and possibly customized for different school districts. We use MTRP and Master Teacher throughout the document for consistency and clarity.

2 For simplicity we present the model here for a single grade level, but we will estimate a pooled model that puts all grade levels on a common scale and includes year-by-grade fixed effects instead of just year effects.

3 This can be accomplished in Stata by using the eivreg command.

