Supporting Statement A
OMB: 1850-0802

U.S. Department of Education

Evaluation of the Impact of Teacher
Induction Programs

Office of Management and Budget
Statement for Paperwork Reduction Act Submission
Part A: Justification
Contract ED-04-CO-0112/0001

February 26, 2008

CONTENTS

PART A. JUSTIFICATION

1.  Explanation of Circumstances That Make Collection of Data Necessary
2.  How, by Whom, and for What Purpose Information Is to Be Used
3.  Use of Improved Information Technology to Reduce Burden
4.  Efforts to Identify and Avoid Duplication
5.  Efforts to Minimize Burden on Small Businesses and Other Entities
6.  Consequences of Less-Frequent Data Collection
7.  Special Circumstances Regarding Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations
8.  Federal Register Comments and Persons Consulted Outside the Agency
9.  Payments to Respondents
10. Assurances of Confidentiality
11. Questions of a Sensitive Nature
12. Estimates of Respondent Burden
13. Estimate of the Cost Burden to Respondents
14. Estimates of Annualized Government Costs
15. Change in Hour Burden
16. Time Schedule, Publication, and Analysis Plan
17. Display of Expiration Date for OMB Approval
18. Exceptions to Certification Statement

References

APPENDIX A: MENTOR QUESTIONNAIRE
APPENDIX B: COVER LETTER FOR THE TEACHER BACKGROUND QUESTIONNAIRE
APPENDIX C: TEACHER BACKGROUND QUESTIONNAIRE
APPENDIX D: CONSENT FORM FOR ACCESS TO COLLEGE ENTRANCE EXAM SCORES
APPENDIX E: PARENTAL NOTIFICATION LETTER
APPENDIX F: COVER LETTER FOR CLASSROOM OBSERVATIONS
APPENDIX G: CLASSROOM OBSERVATION TEACHER INTERVIEW PROTOCOL
APPENDIX H: COVER LETTER FOR THE INDUCTION ACTIVITIES TEACHER QUESTIONNAIRE
APPENDIX I: INDUCTION ACTIVITIES TEACHER QUESTIONNAIRE
APPENDIX J: COVER LETTER FOR THE TEACHER MOBILITY QUESTIONNAIRE
APPENDIX K: MOBILITY QUESTIONNAIRE

This package represents a request for a short extension of 9 months for data collection
instruments previously approved by OMB (OMB Control No. 1850-0802, approval notice dated
August 16, 2005). The clearance initially granted was for a period of 3 years, with an expiration
date of August 31, 2008. Data collection for the final administration of the teacher retention
survey (Appendix I) is planned to begin in October 2008, and therefore an extension of the clearance is needed. Because the design for and burden of the final round of data collection were included in the original package, this current package is identical in content to the package
approved by OMB. (Minor changes in wording have been made to the section headings to
reflect the current OMB headings.)

PART A. JUSTIFICATION

This request for OMB clearance addresses data collection activities for the Evaluation of the
Impact of Teacher Induction Programs. Teacher induction refers to a program of services
provided to novice teachers, typically in their first year. These services often include multiple
forms of instructional and emotional support during the critical first year, such as working with a
mentor, participating in professional development workshops, and obtaining structured feedback
on classroom practices. This study is designed to test rigorously whether the use of a high-intensity teacher induction program improves teacher retention rates, teacher practices, and
student achievement.

Through qualitative and quantitative data collection, the study will compare the effectiveness of high-intensity teacher induction programs with that of lower-intensity programs, which are the norm in many school districts nationwide.
Three reasons motivate this rigorous study of the impacts of high-intensity teacher induction
programs. First, research evidence suggests that the single most important factor in student
achievement is the quality of the classroom teacher (Mayer et al. 2002). In response to this
evidence, the No Child Left Behind (NCLB) Act of 2001 calls on state and local educators to
increase the numbers of highly qualified teachers in our nation’s public schools. At the same
time, some states are mandating the use of induction for novice teachers, and several proposals
for the Higher Education Act include funds for such programs. In response, the percentage of
novice public school teachers who participated in such a program increased from 51 percent in
1990-1991 to 83 percent in 1999-2000 (Smith and Ingersoll 2003).
Second, the need for this study also stems from a growing body of evidence related to
teacher turnover. About 14 percent of teachers leave the profession after one year, and subsequent years also have high exit rates (Ingersoll 2003). High turnover rates limit the stock
of experienced teachers, who have greater impact on student achievement than those with less
experience (Sanders and Rivers 1996). Frequent turnover, especially in districts with high
poverty rates, also requires that thousands of dollars be spent to recruit, hire, and train a
replacement for each departing teacher. The Alliance for Excellent Education (2004) estimates
the annual cost of teacher attrition to be $2.6 billion nationwide.
Third, the need for this study stems from a lack of scientifically based information on
whether more intensive, and hence more expensive, induction programs are the most appropriate
type of program to implement. States and local districts, which invest substantial funding in
induction programs, do not have a sound understanding of the worthiness of their investments.
Considerable consensus exists about the potential value of components such as intensive,
structured mentoring by experienced and carefully selected expert teachers; formative
assessments of teaching practices; ongoing professional development workshops; and a clear
focus on the instructional aspects of teaching. Nevertheless, only about one percent of novice
teachers participate in a program with such elements (Smith and Ingersoll 2004). Policymakers
and educators need better evidence to understand whether a comprehensive, or “high-intensity,”
teacher induction model is an effective use of resources.
To inform this debate, the Institute of Education Sciences (IES) of the U.S. Department of
Education (ED) has funded the Evaluation of the Impact of Teacher Induction Programs. The
study will compare the benefits and costs of the programs to examine whether high-intensity
teacher induction programs lead to higher teacher retention rates, better teacher practices, and
higher student achievement, and whether such programs are worthwhile investments.
To do so, the study will randomly assign schools to receive either the district's current low-intensity induction program (the control group) or one of two high-intensity programs (the treatment group). Use of random assignment ensures scientifically valid estimates of the impacts of the high-intensity teacher induction programs on outcomes, compared with those of lower-intensity programs.
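To make the design concrete, the short sketch below shows one way a within-district, school-level random assignment of this kind could be carried out. It is an illustrative example only: the district and school names are invented, and it is not the study's actual assignment procedure.

```python
import random

# Hypothetical example: schools nested in districts. Within each district, half
# of the participating schools are assigned to the high-intensity induction
# program (treatment) and half to the district's existing low-intensity
# program (control).
random.seed(12345)  # a fixed seed makes the assignment reproducible and auditable

districts = {
    "District 01": [f"School {i:02d}" for i in range(1, 21)],
    "District 02": [f"School {i:02d}" for i in range(1, 21)],
}

assignment = {}
for district, schools in districts.items():
    shuffled = random.sample(schools, k=len(schools))   # random order within district
    cutoff = len(shuffled) // 2
    for school in shuffled[:cutoff]:
        assignment[(district, school)] = "treatment"    # high-intensity program
    for school in shuffled[cutoff:]:
        assignment[(district, school)] = "control"      # business-as-usual induction

print(sum(v == "treatment" for v in assignment.values()), "treatment schools")
```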
Two organizations will provide high-intensity programs—Educational Testing Service
(ETS) and the New Teacher Center (NTC)—to increase confidence that impact estimates are not
dependent on the specific aspects of a particular provider. ETS and NTC are two prominent
providers of high-intensity teacher induction in the United States, so including both will boost
the study’s credibility and broaden the possible applicability of its findings. An analysis that
pools the results from the two programs is reasonable, because the two models selected are quite
similar in their structure, focus, and content. Nevertheless, implementing each model in about
half the districts does provide an opportunity to study the effects of each one separately, though
the study is not designed to permit a direct comparison of the impacts of one program to the
other. In addition, the study will include two benefit-cost analyses. The first will compare the
direct financial costs of the high-intensity programs with the direct financial benefits arising
from reduced teacher turnover. The second will examine the cost-effectiveness of the high-intensity programs in affecting teacher practices, student outcomes, and the number and types of
teachers who are retained.


1. Explanation of Circumstances That Make Collection of Data Necessary

Introduction
Section 9601 of the NCLB Act stipulates that federal funds are to be used to evaluate
programs that the Act authorizes. NCLB, which reauthorized the Elementary and Secondary
Education Act of 1965 (ESEA), emphasizes the importance of teacher quality in improving
student achievement. Title II, Part A of ESEA—the Improving Teacher Quality State Grants
program—provides nearly $3 billion a year to states to prepare, train, and recruit high-quality
teachers. The purpose of Title II, Part A is to help states and local school districts ensure that all
students have effective teachers. The impact evaluation is thus essential to determining whether
state and local efforts to implement high-intensity teacher induction programs are having a
measurable impact on teacher retention patterns, teacher practices, and student achievement.
2. How, by Whom, and for What Purpose Information Is to Be Used

The main purpose of the impact evaluation is to determine the effectiveness of high-intensity
induction programs in terms of teacher retention rates, teacher practices, and student
achievement. The study will also shed light on the nature of teacher induction services typically
provided in the selected districts and the characteristics of new teachers who participate in these
services.
The data collected for the study will be used to address research questions in six areas:
(1) characteristics of new teachers when they enter the teaching profession, (2) induction services
received by novice teachers, (3) teacher retention, (4) classroom practices, (5) student
achievement, and (6) benefits and costs of implementing the high-intensity induction programs.
In each of these areas, the following questions will be explored:


1. Baseline Characteristics of Novice Teachers. What are the characteristics of
novice teachers when they begin teaching, such as their professional and personal
background characteristics? To what degree do they feel prepared to handle various
aspects of teaching? What are their expectations for teaching as a career?
2. Induction Services Received by Novice Teachers. What are the types and
intensities of teacher induction activities in different induction programs for novice
teachers? What forms of support are provided in such areas as pedagogy and
classroom management? Who are the mentors who provide this support? What are
teachers’ levels of satisfaction with teaching?
3. Teacher Mobility. How does high-intensity teacher induction affect new teachers’
mobility patterns and, more specifically, the retention rates for districts? Do teachers
who leave a particular school transfer to another school within the same district,
transfer to another school district, transition into another type of position in the
education field, or leave the profession entirely? What reasons account for teachers’
leaving the schools where they begin their careers? What are the characteristics of
teachers who are retained compared with those of teachers who leave the school,
district, or profession?
4. Classroom Practices. How does teacher induction affect new teachers’ classroom
practices? Do the high-intensity programs positively affect the quality of novice
teachers’ planning and preparation, classroom management, and instructional
techniques?
5. Student Achievement and Other Student Outcomes. Does high-intensity teacher
induction ultimately result in improved student achievement? Does high-intensity
induction reduce the incidence or severity of disciplinary actions?
6. Benefits and Costs. Do the benefits of increased retention rates associated with high-intensity induction programs outweigh the financial costs associated with
implementing such programs? What are the benefits in addition to increased
retention?
The collection of information to address these questions will permit analyses that can inform
the policy debate on appropriate strategies for helping new teachers make the transition into the
profession and also helping them to remain high-quality, effective teachers. Each piece of the
data collection package will provide vital information toward developing a policy framework for
future decisions regarding teacher induction. The intended audiences for the study’s results are
ED, state education policymakers, and state and local induction program and school district staff.


Conceptual Framework for the Study. Many factors can distinguish novice teachers from one another. To understand the contribution of teacher induction models to teacher retention,
classroom practices, and student performance, it is important to account for differences in
teachers’ personal and professional background characteristics, in addition to differences in the
content and intensity of the teacher induction programs themselves. A conceptual framework for
the study is depicted in Figure 1.


FIGURE 1
CONCEPTUAL FRAMEWORK FOR THE EFFECTS OF TEACHER INDUCTION PROGRAMS ON TEACHER, SCHOOL, AND STUDENT OUTCOMES

[Figure 1 is a diagram linking explanatory factors to outcomes across four columns: (A) context (local area, school, classroom, and teacher characteristics); (B) induction program components and their intensity (orientation, assessment, professional development workshops, mentoring/peer coaching, small group activities, and observation); (C) intermediary variables (credentials, integration, attitudes, and professional practice); and (D) teacher and student outcomes (mobility patterns, professional practice, and student performance).]

This framework indicates core areas for exploration under the research questions posed in
each of the topical areas listed above. The framework highlights the important linkages between
explanatory factors and outcomes. First, Column A includes the contexts of local communities,
schools, classrooms, and teachers, including such characteristics as neighborhood demographics,
the degree of administrative financial support, the percentage of a classroom’s students with
special needs or special education status, and teachers’ employment history. Second, Column B,
induction program components, includes factors such as the quality, duration, and frequency of
induction activities, including orientation, assessment, professional development workshops,
mentoring/peer coaching, small group activities, and observations. Third, Column C, intermediary variables, indicates the intermediate effects that these program components might
have on teachers’ attainment of additional credentials, integration and socialization in their
school communities, and attitudes about teaching. Finally, Column D, teacher and student
outcomes, shows the longer-term effects of an induction program. Teacher outcomes include
increased retention rates and improvement of instructional practices. Student outcomes include
improved academic achievement and a reduction in behavioral problems related to attendance,
tardiness, and disciplinary incidents.
a. Structure of the Data Collection Effort

To address the study's research questions, the evaluator, Mathematica Policy Research, Inc.
(MPR), will utilize a number of different data collection methods. Data collection instruments
will include a mentor background survey, a baseline teacher survey, a consent form requesting
permission for the evaluator to collect teachers' college entrance exam scores, a classroom observation
protocol, a teacher induction activities survey, and a teacher retention survey.1 The study also
will include collection of aggregated student records data and a review of program documents.

1 Formally, the baseline teacher survey is called the Background Survey and the teacher retention survey is called the Mobility Survey.

Data will be collected from up to 400 different, geographically dispersed schools, and each
data collection activity will be uniformly administered. Figure 2 displays a timeline for the data
collection activities. A brief description of each data collection activity is provided below.

FIGURE 2
DATA COLLECTION TIMELINE

[Figure 2 is a timeline of data collection activities running from 8/05 through fall 2008: the mentor background survey (8/05); random assignment (10/05); the baseline teacher survey and collection of teachers' consent for SAT/ACT scores; the induction activities survey (three administrations during the first school year); classroom observations; collection of student records; and the mobility survey (fall 2006, fall 2007, and fall 2008).]

Notes:
The bold portion of the timeline, from 9/05 to 6/06, indicates the induction program period.
Items above the timeline apply only to those in the Treatment Group.
Items below the timeline apply to both treatment and control teachers.

Instruments are included in accompanying appendices, and the matrix presented in Figure 3 displays the role of each activity in providing information that is relevant to the conceptual framework.

FIGURE 3
DATA SOURCES AND DATA COLLECTION METHODS

                                        Data Collection Methods
Topic Areas                             Survey     Observation     External Data     Document Review
Beginning Teacher Outcomes
  Credentials                           TB, TR
  Integration/Socialization             TB, TR
  Attitudes                             TB, TR
  Mobility patterns                     TR
Professional practice components
  Planning and preparation                         C
  Classroom environment                            C
  Instruction                                      C
Student Outcomes
  Academic achievement                                             S
  Behavior                                                         S
Induction Program Components
  Assessment                            TI                                           D
  Orientation                           TI                                           D
  Professional development workshops    TI                                           D
  Mentoring/peer coaching               TI                                           D
  Mentor selection                      M                                            D
  Mentor support                                                                     D
  Mentor training                                                                    D
  Small group activities                TI                                           D
  Observation                           TI                                           D
Context
  Local area conditions                                            CCD, Cen
  School characteristics                                           CCD, S
  Classroom characteristics                                        S
  Teacher characteristics               TB                         SAT/ACT

Key: Data Sources
C        Classroom Observations
CCD      Common Core of Data (NCES)
Cen      U.S. Census
D        Program Description
S        School Records
SAT/ACT  Teacher SAT/ACT Consent
TB       Baseline Teacher Survey
TI       Teacher Induction Activities Survey
TR       Teacher Retention Survey
M        Mentor Background Survey

b. Mentor Background Survey

In summer 2005, at the time of the initial mentor training sessions, a background survey will be administered to the mentors selected for both the NTC and ETS induction programs. Topics
will include their professional and personal background characteristics. The survey takes about
10 minutes to complete and appears in Appendix A.
c. Baseline Teacher Survey

In October 2005, a baseline survey will be administered to the treatment and control
teachers. A cover letter will briefly summarize the study, explain its purpose, and assure
teachers that the confidentiality of the requested information will be maintained. Topics to be
covered are the teacher’s professional credentials, perceptions of the teaching profession, and
personal background characteristics, many of which (marital status, spouse’s occupation and
relocation history, number of young children, and salary at the start of the first year) may affect
retention. The survey will then ask teachers to provide their name, Social Security number, the
grade they are teaching, and contact information for follow-up. Teachers will receive the survey
by mail at their school, along with a letter asking that they complete it within two weeks and
return it in the pre-addressed, postage-paid envelope included in the survey packet. The survey
takes about 30 minutes to complete. The cover letter to teachers and the baseline teacher survey
appear in Appendix B and Appendix C, respectively.
d. Teacher ACT/SAT Scores
Teachers with different levels of academic ability may demonstrate different levels of effectiveness, regardless of their participation in induction activities. Therefore, it will be important to control for differences in their academic ability. All treatment and control group
teachers will be asked to give the College Board or ACT permission to release their college
entrance exam scores for the study. The collection of these test scores will provide an objective
measure of teachers’ cognitive ability and will place no additional burden on teachers. It will be
made clear to teachers that they may decline to provide access to their scores. Appendix D
displays the consent form, which will be included in the baseline teacher survey packet.
e. Student Records Data

Improvements in teacher quality are ultimately intended to result in improvements
in student achievement and other student outcomes. We will collect information on student
outcomes by obtaining school records data, aggregated to the classroom level (Table 1). Student
records data will be collected during summer 2006 and summer 2007 for study classrooms in
both treatment and control schools; these data will include scores from standardized tests that the
districts already plan to administer, as well as attendance and behavioral incidents such as
tardiness and disciplinary actions. Because aggregated student records data do not require
identification of individual students, active parental consent will not be required. Appendix E is
the notification letter that explains what is planned. Permission and procedures for accessing
these data will be discussed with each district at the time of their recruitment into the study.
Agreement to obtain the school records will be included in the memorandum of understanding
with each district.
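Because districts may find it easier to deliver student-level extracts than precomputed averages, the sketch below shows one way classroom-level aggregates of the kind listed in Table 1 could be produced from a student-level file. The column names are hypothetical and do not come from any district's actual data system.

```python
import pandas as pd

# Hypothetical student-level extract; column names are illustrative only.
students = pd.DataFrame({
    "classroom_id":  ["C1", "C1", "C1", "C2", "C2"],
    "math_score":    [210,  225,  None, 198,  240],
    "read_score":    [215,  230,  220,  None, 245],
    "days_attended": [170,  165,  172,  160,  171],
    "suspensions":   [0,    1,    0,    0,    2],
})

# Aggregate to the classroom level so that no individual student is identifiable.
aggregates = students.groupby("classroom_id").agg(
    n_students=("classroom_id", "size"),
    mean_math=("math_score", "mean"),
    n_valid_math=("math_score", "count"),
    mean_read=("read_score", "mean"),
    n_valid_read=("read_score", "count"),
    mean_days_attended=("days_attended", "mean"),
    total_suspensions=("suspensions", "sum"),
).reset_index()

print(aggregates)
```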
f. Classroom Observation Protocol

A key hypothesis of the evaluation is that high-intensity teacher induction will lead to
improvements in teachers’ instructional practices, which ultimately will affect student
achievement. Because classroom practices are difficult to quantify, the impact evaluation will
include classroom observations conducted by trained observers.
These classroom observations will be conducted to gain firsthand knowledge of each study
teacher’s approach to teaching in terms of pedagogical practices and classroom management (see Figure 3).

TABLE 1
SCHOOL RECORDS DATA ITEMS
Data Item
School name/identifier
Teacher identification number (Provided by MPR)
Classroom identifier
Grade level (supplied by MPR, to verify)
Number of students in class
Classroom Average
Score on mathematics test
Number with valid math score
Score on reading test
Number with valid reading score
Days enrolled (or average daily enrollment)
Days attended (or average daily attendance)
Days tardy (or average daily tardy rate)
Suspensions (occurrences)
Days suspended
Expelled
Disciplined (other, if available)
Number or Percentage of Students
Retained in grade
Promoted to next grade
With promotion contingent on summer school/retest
Eligible for free school lunch program
Eligible for reduced price lunch
African American
Hispanic or Latino
English language learners
Classified as having special needs, such as those with an Individual Education Plan
Note: The initial request for school records data will include these data items. We expect to
work with each school district to determine which data items are available. If
appropriate, we also will discuss whether alternative formats for the data items can
more easily be provided to us.


Each treatment and control teacher from the 400 schools in our sample will be
observed twice, on consecutive days, in late spring 2006, before schools close for the summer.
Site visitors will be trained to complete a classroom observation protocol developed by the
Vermont Institutes. Prior to each classroom observation, 10-minute semistructured interviews
will be conducted with each teacher. These interviews will address the teacher’s goals and
objectives for the lesson to be observed.
Appendix F contains a cover letter that will be sent to each teacher to confirm arrangements
for the classroom observations, and Appendix G contains the protocol for this 10-minute pre-observation teacher interview. The observations themselves require no interaction with the
teachers. The protocol for the classroom observations (the Vermont Classroom Observation
Tool) is a proprietary document and is therefore not included in this document.
g. Teacher Induction Activities Survey

It will be important to understand the differences in the services delivered by the high- and low-intensity programs. Information about services delivered by programs operated at different
intensity levels will be useful for interpreting impacts and for identifying any district that needs
technical assistance to strengthen adherence to its high-intensity program model. Furthermore,
information about services received by control group teachers will be useful for characterizing
what would have happened in the absence of the high-intensity programs.
To make these retrospective self-reports more accurate, a teacher induction activities
survey will be administered to both treatment and control teachers at three points (October 2005,
January 2006, and April 2006). Since the nature of induction activities may change often during
the school year, surveying three times will reduce any difficulties teachers may have in recalling
induction activities. Survey items will include questions applicable to activities delivered by
both the high-intensity programs and the “business as usual” (low-intensity) programs in
participating districts. The survey will ask questions about the focus of the induction activities,
the duration of each activity, the extent to which participants thought that each activity was
useful, and which additional types of help teachers would like to receive from mentors (topics 12
through 17 in Figure 3). Teachers will receive the surveys by mail, along with a letter requesting
completion of the surveys within two weeks. Teachers will be asked to return the survey in a
pre-addressed, postage-paid envelope that will be included in the survey packet. Completion
time for each survey is estimated to be 20 minutes. The cover letter to teachers and the teacher
induction activities survey appear in appendices H and I, respectively.
h. Teacher Retention Survey
In the fall of 2006, 2007, and 2008, the teacher retention surveys, which will concentrate on
the mobility of teachers to different schools, districts, or professions, will be administered. Items
will include the teacher’s current place of employment (the original school, a different school
within the same district, a different school in another district, or a temporary or permanent
nonteaching job), the timing of the change in employment, job satisfaction, the reason(s) for
leaving last year’s school, and the reason(s) for leaving the teaching profession, if applicable
(topic 4 in Figure 3). Completion time for each survey is 20 minutes, and teachers will receive
the survey by mail, along with a letter requesting completion of the survey within two weeks.
Teachers will be asked to return the survey in a pre-addressed, postage-paid envelope that will be
included in the survey packet. The most recent contact information (home address, home phone
number, cell phone number, email address, and Social Security number) that they provide in the
baseline teacher survey, as well as locating software, will be used to follow up with teachers who
move from a particular school. The cover letter to teachers and the teacher retention survey
appear in appendices J and K, respectively.
i. Document Review

A document review of materials supplied by the two high-intensity induction program providers will be conducted to supplement the information collected through the teacher
induction activities survey. Data collected will focus on assessment, orientation, professional
development workshops, mentoring/peer coaching, small group activities, and teacher
observations (topics 12 through 17 in Figure 3). These materials will include items such as
training agendas and materials, curriculum guides, and assessment tools. This information will be
collected directly from the two participating high-intensity induction program providers.
j. Data to Measure Benefits and Costs

The benefit-cost analysis will not involve additional systematic data collection. Published
data and data collection activities already mentioned will provide the information needed to
estimate benefits and costs of teacher induction.
The Induction Activities survey will indicate the time spent in mentoring, orientation,
professional development, and other activities among beginning teachers in both the treatment
and control groups. We will combine this information with administrator and teacher salary data
gathered from public sources to compute the value of time spent by all those involved in
induction efforts. For the treatment programs, we can compute unit cost information that includes materials and activities not reflected in the Induction Activities Questionnaire, using the providers' detailed contract information. For the control programs, districts can provide us with budget data that indicate the cost of the district's own induction services.


We will use published estimates of the costs of hiring and separation (including advertising,
recruiting, interviewing, administrative processing, and severance pay) to determine the cost of
replacing a teacher. We will consider a broader range of benefits of induction, including student
achievement and behavior and teacher satisfaction, in the cost-effectiveness analysis that will
complement the benefit-cost analysis. All this information will be gathered through existing data
collection efforts.
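As a rough illustration of the comparison described above, the sketch below combines a per-teacher induction cost with the expected savings from avoided replacements. Every number in it is a placeholder for values that the study would draw from the survey, salary, contract, and budget sources listed in this section; it is not a study estimate.

```python
# Hypothetical benefit-cost illustration. All numbers are placeholders; the
# actual analysis will draw on the Induction Activities survey, published
# salary data, program contract information, and district budget data.

hours_in_induction = 40          # teacher time spent in induction activities (placeholder)
teacher_hourly_rate = 22.46      # mean hourly earnings figure used elsewhere in this package
program_cost_per_teacher = 4000  # provider fees, materials, mentor time (placeholder)

induction_cost = program_cost_per_teacher + hours_in_induction * teacher_hourly_rate

replacement_cost = 10000         # published estimate of hiring/separation costs (placeholder)
attrition_reduction = 0.05       # change in probability a teacher must be replaced (placeholder)

turnover_savings = replacement_cost * attrition_reduction
net_cost = induction_cost - turnover_savings

print(f"Cost per treated teacher: ${induction_cost:,.0f}")
print(f"Expected turnover savings per teacher: ${turnover_savings:,.0f}")
print(f"Net cost (cost minus savings): ${net_cost:,.0f}")
```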
3. Use of Improved Information Technology to Reduce Burden

The data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Where feasible, information will be gathered from existing data sources, such as
program and school records, using straightforward reporting forms or preexisting documents.
Districts (and schools, when appropriate) will have the option of delivering school records data
electronically, filling out a straightforward reporting form manually, or submitting hard-copy
documents that already exist.
In other cases, necessary data can be obtained only from school staff or teachers. Every
effort will be made to reduce burden and maximize efficiency of the process. The baseline
teacher survey and the induction activities survey will include a toll-free telephone number and
email address so that teachers can easily contact researchers with questions. Mail and telephone
followup will be conducted for nonresponse. These procedures are all designed to minimize
burden on respondents.
4. Efforts to Identify and Avoid Duplication

There is much interest in obtaining an accurate assessment of how high-intensity induction
programs affect teacher behaviors and, thus, student achievement. To date, however, no studies
of this kind have been conducted.2 This impact evaluation thus will be an important contribution
to the policy debate. Its rigorous methodological design, incorporating random assignment of
schools, will ensure that highly credible evidence about the impact of high-intensity teacher
induction models on teacher retention, classroom practices, and student performance is obtained.
In most cases, the evaluation will gather data on baseline and outcome measures that will
not require duplication of effort. For example, the evaluation will collect information on teacher
induction program activities only from the treatment and control group novice teachers and not
from the mentors. In contrast, the study will need to collect data on teacher performance from
more than one source, since measuring this is challenging and complex. The inclusion of
classroom observations of all teachers—which will afford the opportunity to observe teaching
practices firsthand—will enrich our understanding of teacher practices and our interpretation of
the study’s findings. In addition, teacher performance will be further measured by examining
student achievement through aggregated standardized test scores.
5. Efforts to Minimize Burden on Small Businesses and Other Entities

Although both districts and schools will be involved in the impact evaluation, the burden
that each of these types of entities will incur should be minimal, particularly given the potential
benefits they will have the opportunity to receive. Districts and schools that agree to participate
in the study will need to work with either NTC or ETS to implement a high-intensity induction
program, and work with evaluators to provide school records data. Principals of these schools
will need to allow evaluators access to the teachers and their classrooms. Importantly, these
burdens will be mitigated by the opportunity that the districts and schools will gain from receiving high-intensity induction services, which have the potential to increase teacher retention, improve the quality of teaching by novice teachers, and produce better student outcomes.

2 The Teacher Follow-Up Survey, administered by the National Center for Education Statistics, asks a few questions about induction practices. However, it has a one-year followup only.

Participants will be asked to provide only the minimum information required to meet the
study objectives. The burden will be minimized through the careful specification of information
needs and the restriction of questions to information that is generally available to participants. In
addition, all data collection will be coordinated by trained staff so as to minimize the burden on
school staff.
6. Consequences of Less-Frequent Data Collection

In the absence of the impact evaluation, IES will not be able to detect differences in teacher
retention rates, classroom practices, or student achievement stemming from differences in
intensity levels of teacher induction programs. Only the most basic information addressing
the value of and approach to effective teacher induction is currently available, and much of that
information is methodologically suspect. Nevertheless, thousands of new teachers are hired
every year and make a transition into teaching with little or no scientifically based knowledge of
which types of support teachers need to remain in the profession and be effective in the
classroom.
The impact evaluation will fill this gap in policy-relevant knowledge, using a study design containing several components. Because high-intensity teacher induction programs have
multiple objectives (to increase teacher retention, improve classroom practices, and bolster
student achievement), the data collection plan is diverse. Nevertheless, it has been designed to
allow us to answer questions of policy importance with minimal burden to sample members.


7. Special Circumstances Regarding Collection of Information in a Manner Inconsistent with Section 1320.5(d)(2) of the Code of Federal Regulations

There are no special circumstances involved with this data collection.

8. Federal Register Comments and Persons Consulted Outside the Agency

a. Federal Register Announcement

A 60-day notice to solicit public comments was published in the Federal Register. No public comments have been received as a result of this notice.

b. Consultations Outside the Agency
During preparation of the data collection plan for this evaluation, professional counsel was
sought from a number of people. Early in the study planning, input was solicited from a broad
range of researchers who are members of the Technical Working Group under contract to design
the impact evaluation and to provide ongoing input throughout the evaluation. Their counsel has
continually been sought on numerous issues. These people include:

• Carol Bartell, California State University at Los Angeles, 323-343-4300
• Larry Hedges, University of Chicago, 773-256-6275
• Hamilton Lankford, State University of New York at Albany, 518-442-4743
• Rebecca Maynard, University of Pennsylvania, 215-898-3558
• Sandra Odell, University of Nevada at Las Vegas, 702-895-3232
• Jeff Smith, University of Maryland, 301-405-3532
• Todd Stinebrickner, University of Western Ontario, 519-661-2111

c. Unresolved Issues

None.


9. Payments to Respondents

In March 2005, NCEE submitted a paper to OMB outlining the Guidelines for Incentives for
NCEE Evaluation Studies. The incentives proposed for the Evaluation of the Impact of Teacher
Induction Programs conform to the incentives discussed within this paper.
The Evaluation of the Impact of Teacher Induction Programs is one that employs
randomization of schools. With a random assignment design, it is critical to maintain the
integrity of the treatment and control groups and ensure equivalence of the groups. This study’s
ability to detect effects of high-intensity induction programs will be compromised to the extent
there is attrition of either the treatment or control group teachers, and especially if there is
differential attrition. If a significant portion of either the treatment or control group teachers
declines to participate, it will not be possible to conduct meaningful analyses based on “intent to
treat,” since it is not possible to add new members to either group. To the extent that members
of the treatment or control group are lost from the study, the findings are biased, and study funds
are wasted.
To encourage response and acknowledge that participation is not without some burden, we
plan to offer payment to teachers for completing the surveys and participating in classroom
observations. We will offer:
1. $30 for the Baseline (Background) questionnaire (a 25-minute survey and 5-minute permission form, administered once)
2. $20 for the Induction Activities questionnaire (a 20-minute survey, administered three times during the first school year)
3. $20 for the Retention (Mobility) questionnaire (a 20-minute survey, administered once in each of the subsequent school years)
4. $25 per classroom observation (we will observe each teacher twice during the spring of the first school year)


The maximum amount a teacher could be paid over four years is $200. The target population for this study, novice teachers in self-contained elementary school classrooms, is reported to be the object of numerous requests to complete surveys. Collective bargaining agreements in many districts do not allow teachers to complete surveys during school time. Incentives are therefore needed to encourage teachers to complete the surveys. This is particularly true for teachers in the control group, who do not receive any of the potential benefits of the high-intensity induction program, but are asked to complete the surveys and have
their classrooms observed. These teachers receive burden from the data collection without
receiving any potential benefit from the treatment.
Providing a $30 incentive for the Baseline questionnaire near the start of the school year will help to ensure that we get the highest response rates possible on critical items that will be used to control for background characteristics and to define subgroups in our analyses, and that teachers provide the contact information needed to administer all subsequent surveys successfully.
Providing the $20 incentive for each completion of the Induction Activities questionnaire is
essential given that the questionnaire will be administered three times during the 2005-2006
school year and high response rates during each administration are necessary to ensure
documentation of the contrast in induction services received by teachers in the treatment and
control groups. Providing the incentive to teachers in the treatment and control groups will help
to ensure that we get equivalent response rates from teachers in both groups without
compromising the quality of the data in any way. Teachers in the treatment group could be
encouraged to complete these surveys by their mentors and, thus, not need an incentive to do so,
but this could bias the actual responses provided and we do not want to risk such an outcome.
The classroom observations, which will provide us with data for one of our key outcome variables, need to be conducted during a fairly narrow window of time, so that teachers are all
observed at close to the same point in time near the end of the school year. However, many
teachers may feel uneasy about their classroom practices being observed and rated. Providing
teachers with an incentive to cooperate with the scheduling and conduct of these observations
will help to prevent large gaps in the timing of the observations, which would
compromise the usefulness of these data.
This impact evaluation requires a lengthy field period, with data collection in four consecutive years. Providing compensation for completion of the Retention questionnaires will help us obtain high response rates on another core outcome measure. The Retention questionnaire is a key data collection that is particularly at risk for low response rates, because novice teachers tend to have high mobility rates. Teachers are therefore unlikely to be retained in the control group, and perhaps in the treatment group if the high-intensity program does not prove to be effective in curbing mobility. Teachers who leave the school or profession
will have no incentive to continue to complete the surveys, and may be lost from the sample if an
incentive is not offered. In addition, regardless of whether the teacher remains in the school or
profession after the first year, achieving high response rates will be harder to do in the follow-up
years when the teachers are not receiving induction activities. By compensating teachers for
completing these mail questionnaires, we will reduce the need for the more expensive approach
of using field interviewers to go to the sample members’ schools or homes to attempt interviews.
10. Assurances of Confidentiality
All data collection activities will be conducted in full compliance with ED regulations. Data
collection activities will be conducted in compliance with the Privacy Act of 1974, P.L. 93-579, 5 USC 552a; the “Buckley Amendment,” the Family Educational Rights and Privacy Act of 1974, 20 USC 1232g; the Freedom of Information Act, 5 USC 552; and related regulations, including but not limited to 41 CFR Part 1-1 and 45 CFR Part 5b and, as appropriate, the Federal common rule or
ED’s final regulations on the protection of human research participants. This is to maintain the
confidentiality of data obtained on private persons and to protect the rights and welfare of human
research subjects as contained in ED regulations. Each self-administered instrument will include
a reminder on the protection of confidentiality. Where data are collected through interviewer-administered interviews—for instance, with teachers who do not complete a self-administered
version and are interviewed by telephone—interviewers will remind respondents of the
confidentiality protections provided, as well as their right not to answer questions. All data
collectors and interviewers will be knowledgeable about confidentiality procedures and will be
prepared to describe them in full detail, if necessary, or to answer any related questions from
respondents.
MPR has a long history of protecting confidentiality and privacy of records and considers it
a critical aspect of the scientific and legal integrity of any study. The integrity the company
brings to protecting data confidentiality and privacy extends to every aspect of survey operations
and data handling in the field for the impact evaluation. MPR plans to use its ongoing, longstanding techniques, which have proven effective in the past. Every data collector will be
required to sign a pledge to protect the confidentiality of respondent data. The pledge indicates
that any violation or unauthorized disclosure may result in legal action or other sanctions by
MPR. A copy of this pledge will be kept on file and will be available upon request.
Specific Procedures to Maintain Confidentiality
MPR removes personal identifying information from respondents’ data as soon as practical.
Should MPR use a linking methodology, it is secured to prevent unauthorized linkage of the
respondent information and the personal identifiers. Hard-copy questionnaires completed by
teachers and mentors are returned to MPR in pre-addressed, postage-paid envelopes. However,
identifying information (such as contact sheets and locating information used by field
interviewers) is sent separately when possible.
To protect confidential data stored on hard-copy media, MPR keeps these materials in
controlled-access areas and locked rooms. When not in use, hard copies, floppy disks, and
computer tapes are also stored in these areas. In addition, we use log sheets to track and record
access to the confidential information and maintain this log as part of the project’s
documentation and records. Important raw data and intermediate and final analytical files are
copied to cartridge and assigned an expiration date or disposed of in accordance with the contract
requirement or data use agreement. Paper documents are then shredded.
A privacy impact assessment was conducted and the Privacy Act System of Records Notice
was published in the Federal Register on June 17, 2005.
11. Questions of a Sensitive Nature
School-based disciplinary events among students of sampled teachers can be considered
sensitive information. School records will be collected on such events as absenteeism, tardiness,
suspension, expulsion, and promotion among all the students of sampled teachers. However, the
student record data will be provided in aggregate form and linked to each teacher, and individual
students will not be identifiable.
The teacher questionnaire will contain background questions on sample members’ income,
marital status, education, race, ethnicity, age, household composition, and home ownership.
Some teachers may consider this information sensitive. However, data on these topics are
important to collect because of their strong relationship to teacher outcomes, such as retention.
Obtaining Social Security numbers is also important so that we can locate sample members if
they move and so that we can obtain college entrance exam data, which is also expected to be a
strong predictor of outcomes. Questions used to obtain this potentially sensitive information
have been asked frequently in other surveys and have been successfully pretested for this study.
In addition, we will request that teachers voluntarily sign a consent form to release their SAT and
ACT scores—further information that some teachers may consider sensitive.
12. Estimates of Respondent Burden
Table 2 provides an estimate of time burden. The total reporting burden for this data
collection effort is 3,066 hours. Most of these hours are for administering three types of surveys:
(1) a baseline teacher survey, which will take 30 minutes; (2) three teacher induction activities
surveys, each of which will take 20 minutes; and (3) three teacher retention surveys, each of
which will take 20 minutes. Additional time is included for the 10-minute mentor background survey, the 10-minute teacher interviews that precede classroom observations, and the extraction of records data (about 20 hours per school district).
TABLE 2
BURDEN IN HOURS TO RESPONDENTS

                                   Average Burden                        Total       Estimated Total
                                   Hours per          Number of         Burden      Burden Costs
Data Collection Activity           Respondent         Completions       Hours       (Dollars)a
Baseline survey                    .50                  960               480        10,781
Induction survey                   .33                2,880               950        21,337
Retention survey                   .33                2,735               903        20,281
Mentor survey                      .17                   40                 7           157
Pre-observation interviews         .17                1,920               326         7,322
Extraction of student records      20                    20               400         8,984
Total                                                                   3,066        68,862

a These estimated costs are based on an estimate derived from the National Compensation Survey of $22.46 as the mean hourly earnings of elementary school teachers in 2003.

The numbers of teacher survey completions are calculated as follows. Survey completion
estimates are based on a sample of 20 districts, 20 schools per district, and 2.4 teachers per
school (yielding a total of 960 teachers included in the study). The baseline survey and the
induction surveys are completed in the 2005-2006 school year. We anticipate a 100 percent
response rate for these surveys, so we expect to obtain 960 baseline surveys and 2,880 (960
teachers × 3 surveys/teacher) induction surveys. The number of survey completes that we will
achieve for the retention surveys depends on our expected response rate with sample members.
We have assumed a 97 percent response rate in the 2006-2007 school year, which will yield 931
(960 teachers × 0.97 response rate) survey completes for the first retention survey. We anticipate
achieving 94 percent response rates for the retention surveys conducted in the 2007-2008 and
2008-2009 school years, which will yield 902 (960 × 0.94) survey completes each for the second
and third retention surveys.
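The completion counts and burden figures above follow from simple arithmetic. The sketch below recomputes them from the sample sizes and response-rate assumptions stated in this section and from the per-response times and $22.46 hourly rate shown in Table 2.

```python
# Recompute survey completions, burden hours, and burden costs from the
# assumptions stated in this section.
teachers = 20 * 20 * 2.4                 # 20 districts x 20 schools x 2.4 teachers = 960

baseline = round(teachers)               # 100 percent response
induction = round(teachers) * 3          # three administrations, 100 percent response
retention = round(teachers * 0.97) + 2 * round(teachers * 0.94)   # 931 + 902 + 902 = 2,735

hours = {
    "Baseline survey": round(baseline * 0.50),
    "Induction survey": round(induction * 0.33),
    "Retention survey": round(retention * 0.33),
    "Mentor survey": round(40 * 0.17),
    "Pre-observation interviews": round(round(teachers) * 2 * 0.17),
    "Extraction of student records": 20 * 20,   # 20 hours per district, 20 districts
}

hourly_rate = 22.46                      # mean hourly earnings from the Table 2 footnote
for activity, burden in hours.items():
    print(f"{activity}: {burden:,.0f} hours, ${burden * hourly_rate:,.0f}")
print(f"Total: {sum(hours.values()):,.0f} hours")
```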
We expect to complete background surveys with all mentors included in the study—these
are mentors who are working with NTC or ETS in providing induction services to teachers in the
treatment schools. Since they will all be present for the initial training session (as a condition of
their being hired for the position), there should be no problem in achieving a 100 percent
response rate with this group.
One way that we will examine the impact of induction program participation on teacher
practices is to conduct classroom observations. MPR will observe all teachers (960) twice in
spring 2006 (yielding 1,920 observations). Classroom observations will be conducted to gain
firsthand knowledge of each study teacher’s approach to teaching in terms of the teacher’s
content knowledge, pedagogical practices, and classroom management. Prior to each classroom
observation, the site visitor will conduct a 10-minute semistructured interview with each teacher
to understand the teacher’s goals for the class, to obtain copies of handouts, and to determine the
teacher’s preferences on seating and other logistical issues so that the observation is as minimally
disruptive as possible. The observations themselves require no interaction with the teachers
and thus will impose minimal burden.


Student records, containing standardized test scores, attendance, and disciplinary
information, will be provided in aggregate form for teachers’ classrooms, so that individual
students cannot be identified. Based on experience obtaining similar data for other research
studies, and assuming that district staff will be able to provide these data in an extract of their
files, we anticipate that the average burden will be 20 hours per school district.
13. Estimate of the Cost Burden to Respondents
There are no direct costs to individual participants.
14. Estimates of Annualized Government Costs
The estimated cost to the federal government of designing the Evaluation of the Impact of
Teacher Induction Programs; designing and administering all data collection instruments;
collecting other data, such as student records; processing and analyzing all the data; and
preparing reports summarizing the results is $4,470,553. All activities will take place over five
years (from fall 2004 to fall 2009). Thus, the average annual cost of the evaluation activities
described within this package is $894,111. This estimate is based on MPR’s previous experience
in management of other research and data collection activities of this type.
15. Change in Hour Burden
This is a request for an extension in the time needed to complete the final year of data
collection for an existing data collection and therefore does not require any changes in hour
burden.
16. Time Schedule, Publication, and Analysis Plan
Our discussion of tabulation and publication plans focuses on the analyses we will conduct
and the reports we will produce. In Section 16.1, we discuss our approach to analyses, including
plans to (1) tabulate descriptive information gathered on teachers’ characteristics, school
districts, and induction services; (2) estimate impacts of the high-intensity induction programs;
(3) examine the types of teachers who stay in teaching as a result of the high-intensity program;
and (4) conduct analyses of program benefits and costs. Section 16.2 discusses the reports that
will be provided, and Section 16.3 discusses the schedule for the work.
1) Tabulation Plans
This section describes the four sets of analyses listed above.
a) Tabulating Descriptive Information Gathered on Teachers’ Characteristics, School
Districts, and Induction Services. To provide a context for the study, and specifically for the
impact and benefit-cost analyses, the evaluation will describe the characteristics of the school
districts, mentors, schools, and teachers included. Through the three periodic induction activities
surveys, we will also be able to assess adherence to the high-intensity program models in the
treatment schools, as well as whether any contamination of the control group is occurring, such
as if the induction services that should be delivered by control schools begin to mimic the
services offered through the high-intensity programs in the treatment schools.
Using the baseline survey data and publicly available data, we will describe the baseline
characteristics of teachers in the treatment and control groups, as well as the schools and
communities in which they teach. Doing so serves three purposes. First, it will guide us in
defining important subgroups. Second, it will facilitate interpretation of impact estimates if we
find different results between simple comparisons of treatment-control group differences and
regression-adjusted impact estimates. (Impact estimation is described in detail in the following
section.) Third, we will be able to understand how the teachers and school districts that
participated in the study differ from teachers and schools nationwide.


b) Estimating Impacts of the High-Intensity Induction Programs. The main use of the data
will be to compare outcomes for teachers in the high-intensity teacher induction programs (the
treatment group) to those for teachers in low-intensity induction programs (the control group).
The teacher surveys, classroom observations, and school records will provide evidence of the
effect of the program at the end of the induction year and during the subsequent three years. By
randomly assigning schools to the two conditions (the high-intensity group and the low-intensity
group) at the outset of the study, we will be able to attribute differences (“impacts”) to the
introduction of high-intensity teacher induction. Impacts can be estimated by simply computing
the average difference in outcomes between treatment and control teachers in each district, then
computing the average of those district-level impacts.
In practice, we will refine this simple comparison of means by using regression methods to
compute the impact estimates. Research shows that the outcomes of interest to the study are
strongly related to characteristics of teachers and their schools (Hanushek 2004). We will adjust
for these characteristics when computing impacts by including them in an appropriately specified
regression model, thereby improving the precision of the impact estimates.3
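As a sketch of the estimation approach just described, the example below fits a regression of a teacher outcome on a treatment indicator and a baseline covariate, with standard errors clustered at the school level to reflect the school-level random assignment. The data are simulated and the variable names are illustrative; this is not the study's actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data for illustration only: teachers nested in schools, with schools
# randomly assigned to treatment (high-intensity induction) or control.
rng = np.random.default_rng(0)
n_schools, teachers_per_school = 200, 3
school = np.repeat(np.arange(n_schools), teachers_per_school)
treatment = np.repeat(rng.integers(0, 2, size=n_schools), teachers_per_school)
baseline_score = rng.normal(size=n_schools * teachers_per_school)   # illustrative covariate
school_effect = np.repeat(rng.normal(scale=0.5, size=n_schools), teachers_per_school)
outcome = 0.2 * treatment + 0.4 * baseline_score + school_effect + rng.normal(size=school.size)

df = pd.DataFrame({"outcome": outcome, "treatment": treatment,
                   "baseline_score": baseline_score, "school": school})

# Regression-adjusted impact estimate; standard errors are clustered by school
# because schools, not individual teachers, were randomly assigned.
impact = smf.ols("outcome ~ treatment + baseline_score", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]})
print(impact.params["treatment"], impact.bse["treatment"])
```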
3 The regression methods will fully account for the sampling and random assignment design. For example, teachers are clustered within schools, which means that comparisons of groups of teachers will involve observations that are not independent of one another. The standard errors, which describe the level of uncertainty associated with the impact estimates, will be computed in a way that recognizes the non-independence of teachers in the same school.

In addition to computing the overall impacts of the high-intensity programs, we will
examine impacts for policy-relevant subgroups of teachers. One of the most important
subgroups is the program provider, whether ETS or NTC. Findings of impacts on other
subgroups, defined by district, school, and teacher characteristics, can provide important
information on how to interpret aggregate results and target the high-intensity induction
programs toward those areas and persons most likely to benefit from them. We will also
examine impacts for subgroups defined by characteristics of the low-intensity programs that exist
in the districts to determine whether aspects of a district’s preexisting induction program are
related to the effectiveness of the high-intensity programs. Additional subgroups will be defined
using data collected as part of the baseline teacher survey and through public-use data sets that
contain information about districts and schools, such as ED’s Common Core of Data (CCD).
However, we will not estimate impacts separately for each district, because the number of
teachers available for such estimates would be too small to yield meaningful results.
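To make the estimation approach concrete, the fragment below is a minimal sketch, not the study's actual specification: it regresses a teacher outcome on a treatment indicator and a baseline covariate, and it clusters the standard errors at the school level so that the non-independence of teachers within the same school is reflected in the reported uncertainty. The variable names and simulated data are hypothetical. A subgroup impact could be examined in the same framework by adding an interaction between the treatment indicator and a subgroup indicator (for example, the program provider).

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200

# Hypothetical analysis file: one row per teacher, with a school identifier,
# a treatment indicator assigned at the school level, and a baseline covariate.
school_id = rng.integers(0, 20, size=n)
treatment = (school_id % 2 == 0).astype(int)      # schools, not teachers, are randomized
df = pd.DataFrame({
    "school_id": school_id,
    "treatment": treatment,
    "years_experience": rng.integers(0, 3, size=n),
    "outcome": rng.normal(0, 1, size=n) + 0.2 * treatment,
})

# Regression-adjusted impact estimate: the coefficient on `treatment`.
# Clustering by school recognizes that teachers in the same school do not
# provide independent observations.
model = smf.ols("outcome ~ treatment + years_experience", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(result.params["treatment"], result.bse["treatment"])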
Effects on Retention. Teacher retention, a key study outcome, can be defined in various
ways. (See Figure 4). Broadly speaking, we can refer to groups of teachers as stayers, movers,
and leavers. A new teacher can stay in his or her original school throughout the follow-up period
(a stayer) or leave the original school to go to a new one (a mover). The new school could be in
the same district or in a new one, or it could be nonpublic. The original and new schools could
have the same types of students (as measured by characteristics such as poverty rates or dropout
rates) or different types. Finally, the teacher may leave the teaching profession altogether (a
leaver).
To provide a comprehensive understanding of the impact of the high-intensity programs on
teachers’ probabilities of staying, moving, and leaving, we will compute impacts for all the
definitions of retention described above. Such computation is important, because the
implications of each type of transition are different depending on one's perspective. For
example, an increase in between-school (within-district) mobility can hurt individual principals,
who must hire replacements, but this movement may benefit the district by placing a teacher in
an environment that allows that person to teach effectively. For example, someone who is a poor
match for a specific school may be better off in a new school, and the other staff at both schools
also may benefit. Also, the desirability of any given teacher's remaining in the classroom
depends on the teacher's effectiveness or potential for effectiveness in the future. We also will
examine the effect of the high-intensity programs on persistence. For example, we will examine
how a high-intensity program affects a teacher's likelihood of remaining in his or her original
school throughout the three-year follow-up period.

FIGURE 4
VARIOUS TYPES OF TEACHER TRANSITIONS

[Figure 4 is a diagram of the transitions a new teacher can make from the original school: remaining at the original school; moving to a new school in the same district, serving either the same or a different type of students; moving to a new school in a new district, serving either the same or a different type of students; or leaving teaching.]
Teacher retention will be measured through follow-up surveys administered to all treatment
and control teachers in fall 2006, fall 2007, and fall 2008. The followup is necessary to track
mobility in the critical early years of a teacher's career, when most transitions are likely to occur.
The surveys are described in detail in Section A.2.
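As an illustrative sketch only, with hypothetical field names standing in for the actual mobility survey items, the follow-up responses could be mapped into the stayer, mover, and leaver categories of Figure 4 along these lines:

# Illustrative only: still_teaching, same_school, same_district, and
# same_student_type are hypothetical stand-ins for the mobility survey items.
def classify_transition(still_teaching: bool, same_school: bool,
                        same_district: bool, same_student_type: bool) -> str:
    """Map one teacher's follow-up responses to a transition category."""
    if not still_teaching:
        return "leaver"
    if same_school:
        return "stayer"
    district = "same district" if same_district else "new district"
    students = ("same type of students" if same_student_type
                else "different type of students")
    return "mover ({}, {})".format(district, students)

# Example: a teacher who moved within the district to a school serving a
# different type of students.
print(classify_transition(True, False, True, False))
# mover (same district, different type of students)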
Effects on Teacher Practices and Student Outcomes. Professionals in any field are likely
to feel greater job satisfaction, and hence be less likely to quit, if they believe they are doing a
good job. Teachers who are more successful in managing their classes and instructing their
students may feel more confident in their abilities and experience greater job satisfaction, thereby
leading to greater retention. Furthermore, recent studies have begun to find relationships
between teacher quality and student achievement, which suggests that students may also benefit
from improved teacher practices (Wenglinsky 2002; Hanushek et al. 1998). The study will
examine whether the high-intensity programs affect teacher quality by analyzing teacher
practices and student outcomes.
We plan to collect information about teacher practices and student outcomes through direct
observations of the classrooms and through the collection of school records. (These data
collection efforts are described in detail in Section A.2.) The observations will be conducted in
the spring of 2006, toward the end of the intervention year, and the school records will be
collected both in the summer of 2006, after the end of the induction year, and in the summer of
2007, after the second year.
c) Examining the Types of Teachers Who Stay as a Result of the Program. Higher rates of
teacher retention benefit school districts through lower turnover costs and can benefit students by
increasing the overall experience level of teachers. However, the benefit of increased teacher
retention to students also depends on the characteristics of the teachers retained, especially
compared with those of the teachers who would have replaced them. Put differently, having a
high-intensity induction program may affect the types of teachers in the school. Whether or not
that effect is desirable depends on the types of teachers being retained.
To examine the types of teachers who stay as a result of a high-intensity program, we will
use information from the baseline teacher survey and college entrance exam scores. These data
will make it possible to describe the qualifications of teachers who stay and leave, in terms of
their credentials, preparation, general education, and cognitive ability. We will also be able to
characterize the types of teachers who leave and stay in terms of their demographic and
household characteristics, their self-reported career expectations and job satisfaction, and their
teaching practices. Another dimension along which we can characterize stayers and movers is
the average test score gains of their students in the first year of the study.
d) Comparing the Benefits Versus Costs of the Program. Teacher induction programs have
the potential to benefit school districts by reducing costs associated with teacher turnover and by
improving children’s education. They also have the potential to retain high-quality teachers in
poor urban schools, where children’s need for quality teachers is highest. To determine whether
the costs of a high-intensity program are justified, ideally we would consider all of its
potential benefits.
However, because of the many possible indirect benefits of an induction program,
conducting a comprehensive benefit-cost analysis is challenging in this setting. While it is
possible to calculate the direct financial benefits to a school district in dollar terms, the other
benefits are difficult to assess in those terms. For example, teacher induction programs may
increase the average experience level of teachers by increasing retention rates, which may
improve student achievement, which may in turn improve student outcomes—such as lifetime
earnings. Higher retention rates may also affect the cohesiveness of a school’s staff and the
overall school environment.
Given these challenges of analysis, we will conduct two less-comprehensive, but still useful,
analyses of costs and benefits. The first analysis compares the direct financial costs associated
with a high-intensity induction program and the direct financial benefits to a school district of
reducing teacher turnover. This analysis takes into account the recruiting and training costs of
hiring a replacement after a teacher leaves. It does not account for any beneficial effects that a
high-intensity program has on students, staff cohesiveness, labor market dynamics, or other
secondary factors that are not measured through the data collected for the study.
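In arithmetic terms, the first analysis reduces to comparing the per-teacher cost of the high-intensity program with the expected savings from avoided replacements. The sketch below uses invented placeholder figures solely to show the form of the calculation; none of the numbers is a study estimate.

# All figures are invented placeholders; the actual values will come from the
# study's cost and impact data.
program_cost_per_teacher = 4000.0       # direct cost of high-intensity induction
replacement_cost_per_teacher = 12000.0  # recruiting and training a replacement
control_turnover_rate = 0.20            # share of control teachers who leave
treatment_turnover_rate = 0.15          # share of treatment teachers who leave

# Expected direct savings: fewer departures times the cost of each replacement.
turnover_reduction = control_turnover_rate - treatment_turnover_rate
expected_savings = turnover_reduction * replacement_cost_per_teacher
net_direct_benefit = expected_savings - program_cost_per_teacher

print("Expected savings per teacher:   $%.0f" % expected_savings)
print("Net direct benefit per teacher: $%.0f" % net_direct_benefit)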
The second analysis will examine the cost-effectiveness of the high-intensity programs in
affecting many outcomes—including teacher practices, the types of teachers retained, the ability
of schools serving at-risk populations to retain high-quality teachers, and student achievement.
Though the benefits of affecting these outcomes are difficult to quantify in dollar terms, many
educators and policymakers will find it useful to know the costs associated with these important
outcomes.
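A cost-effectiveness calculation of this kind amounts to dividing the incremental cost of the high-intensity program by its estimated impact on a given outcome. The sketch below again uses invented placeholder values only to show the form of the ratio.

# Invented placeholder values; actual costs and impact estimates will come
# from the study.
incremental_cost_per_teacher = 4000.0  # high-intensity cost minus low-intensity cost
impact_on_outcome = 0.10               # e.g., estimated test score impact, in
                                       # standard deviation units

# Incremental cost per unit of improvement in the chosen outcome.
cost_effectiveness_ratio = incremental_cost_per_teacher / impact_on_outcome
print("Cost per standard deviation of improvement: $%.0f" % cost_effectiveness_ratio)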
2) Publication Plans
The central tasks during the last three years of the study are to analyze the data and write
one report and two briefs about results. The report will contain a description of all aspects of
program implementation, monitoring, and technical assistance that occurred. It also will report
on the first-year impacts of the high-intensity induction programs. The first brief will describe,
in detail, all costs and effort associated with implementing the induction programs, as well as the
second-year effects of the programs. The costs will be presented on both a per-teacher and a per-district basis. The second brief will present third-year effects and the benefit-cost analyses.
MPR will submit the draft report about first-year effects to ED in February 2007. A revised
version, which addresses the comments of ED and the expert panel, will be delivered in April
2007, while a final version that incorporates minor editorial revisions will be delivered in May
2007. The draft of the first brief, about second-year effects, will be delivered in February 2008,
while a final version that addresses ED’s comments will be delivered in March 2008. Likewise,
draft and revised versions of the second brief, about third-year effects and benefit-cost analyses,
will be delivered in February and March 2009, respectively.


We also will prepare both public- and private-use data files, along with supporting
documentation. The private-use file will contain all the data collected for and used by the
evaluation, including personal identifiers of teachers, in case ED would like to conduct further
followup of the teachers in the study. The public-use file will contain all the data in the private-use file, except the personal identifiers. It will enable other researchers, outside of ED, to
conduct their own work and to replicate the study’s findings. Both files, along with their
documentation, will be submitted to ED by August 2009.
3) Schedule
The full timeline for the evaluation (shown in Table 3) calls for design and district selection
activities between October 2004 and August 2005. Implementation of the high-intensity
induction programs, as well as baseline and induction activities data collection, will occur during
the 2005-2006 school year. We will collect outcomes data on teacher practices in spring 2006,
student achievement in summer 2006, and teacher retention in fall 2006, fall 2007, and fall 2008.
The report that describes program implementation and presents the first-year impacts will
be provided in spring 2007. The briefs on second- and third-year effects of the program will be
provided in spring 2008 and spring 2009.
17. Display of Expiration Date for OMB Approval
Approval not to display the expiration date for OMB approval is not requested.
18. Exceptions to Certification Statement
No exceptions to the certification statement are requested or required.


REFERENCES
Alliance for Excellent Education. “Tapping the Potential: Retaining and Developing High-Quality New Teachers.” Washington, DC: Alliance for Excellent Education, 2004.
Benner, A.D. The Cost of Teacher Turnover. Austin, TX: Texas Center for Educational
Research, 2000.
Hanushek, Eric A. “Some Simple Analytics of School Quality.” National Bureau of Economic
Research Working Paper no. 10229. Cambridge, MA: NBER, 2004.
Hanushek, Eric A., John F. Kain, and Steven G. Rivkin. “Teachers, Schools, and Academic
Achievement.” National Bureau of Economic Research Working Paper no. 6691.
Cambridge, MA: NBER, 1998.
Ingersoll, R.M. “Is There Really a Teacher Shortage?” Philadelphia, PA: University of
Pennsylvania, Center for the Study of Teaching and Policy and the Consortium for Policy
Research in Education, 2003.
Mayer, Daniel, John Mullens, and Mary Moore. “Monitoring School Quality: An Indicators
Report.” Report prepared for the U.S. Department of Education, National Center for
Education Statistics. NCES 2001-030. Washington, DC: U.S. Department of Education,
Office of Educational Research and Improvement, December 2000.
Sanders, W.L., and J.C. Rivers. “Research Progress Report: Cumulative and Residual Effects of
Teachers on Future Student Academic Achievement.” Knoxville, TN: University of
Tennessee Value-Added Research and Assessment Center, 1996.
Smith, T.M., and R.M. Ingersoll. “What Are the Effects of Induction and Mentoring on
Beginning Teacher Turnover?” American Educational Research Journal, vol. 41, no. 2,
summer 2004.
Smith, T.M., and R.M. Ingersoll. “Reducing Teacher Turnover: What Are the Components of
Effective Induction?” Paper presented at the annual meeting of the American Educational
Research Association, Chicago, IL, 2003.
Wenglinsky, Harold. “The Link Between Teacher Classroom Practices and Student Academic
Performance.” Education Policy Analysis Archives, vol. 10, no. 12, February 2002.


