Evaluation of Secondary Math Teachers from Two Highly Selective Routes to Alternative Certification

OMB: 1850-0866

Contract No.: ED-04-CO-0112 (09)
MPR Reference No.: 6522-530

An Evaluation of Secondary Math Teachers From Two Highly Selective Routes to Alternative Certification: Addendum

Part B: Supporting Statement for Paperwork Reduction Act Submission

April 23, 2009
Revised July 31, 2009

Submitted to:
Institute of Education Sciences
IES/NCEE
U.S. Department of Education
555 New Jersey Avenue, NW
Washington, DC 20208
Project Officer:
Stefanie Schmidt

Submitted by:
Mathematica Policy Research, Inc.
P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Facsimile: (609) 799-0005
Project Director:
Sheena McConnell

CONTENTS

PART B: SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Respondent Universe and Sampling Methods
2. Statistical Methods for Sample Selection and Degree of Accuracy Needed
3. Methods to Maximize Response Rates and Deal with Nonresponse
4. Tests of Procedures and Methods to Be Undertaken
5. Individuals Consulted on Statistical Aspects of the Design

REFERENCES


APPENDICES

APPENDIX A: TEACHER CONTACT FORM
APPENDIX B: TEACHER MATH ASSESSMENT
1. Teacher Letter
2. Teacher Consent Form
APPENDIX C: PARENT CONSENT
1. Passive Consent Letter for Middle School Students
2. Passive Consent Letter for High School Students
3. Active Consent Letter for Middle and High School Students
4. Active Consent Form for Middle School Students
5. Active Consent Form for High School Students
APPENDIX D: STUDENT RECORDS DATA REQUEST
1. District Letter
2. District Student Records Data Request Form

APPENDIX E: TEACHER SURVEY
1. Teacher Letter
2. Teacher Survey
APPENDIX F: STUDENT ASSENT FOR MATH ASSESSMENT
APPENDIX G: HSAC PROGRAM ADMINISTRATOR INTERVIEW PROTOCOL


EXHIBITS

Exhibit 1: MINIMUM DETECTABLE EFFECT SIZES


PART B: SUPPORTING STATEMENT FOR PAPERWORK
REDUCTION ACT SUBMISSION

This package requests clearance for data collection activities to support a rigorous
evaluation of secondary math teachers who have entered teaching through highly selective routes
to alternative certification (HSAC). This evaluation is being conducted by the Institute of
Education Sciences (IES), U.S. Department of Education (ED); it is being implemented by
Mathematica Policy Research, Inc. (MPR) and its partners—Chesapeake Research Associates
LLC and Branch Associates.
The objective of the evaluation is to estimate the impact on secondary student math
achievement of teachers who obtain certification via HSAC routes compared with teachers who
receive certification through traditional or less selective alternative certification routes. The
evaluation design is an experiment in which the researchers will randomly assign secondary
school students to a treatment or control group. The treatment group will be taught by an HSAC
teacher and the control group will be taught by a non-HSAC teacher. Both teachers must teach
the same math class at the same level under the same general conditions at the same school. We
will compare student math achievement between the treatment and control groups to estimate the
impact of HSAC teachers.
This is the second submission of a two-stage clearance request. The package was submitted
in two stages because the study schedule required that district and school recruitment begin
before all the data collection instruments were developed and tested. The first stage package
requested approval for recruitment of schools, a teacher background form, a pilot of the student
assessment, and the random assignment of students. In this package, we are requesting approval
for:
- A teacher survey and collection of teacher contact information
- A teacher math content knowledge assessment—the Praxis math subject test—to be administered to teachers who were not required to take this test for certification
- A form for all teachers—whether they took the Praxis math subject test to obtain certification or just for this study—to provide consent for the Educational Testing Service (ETS) to release their scores on this assessment to the study team
- Parent/guardian consent forms for the administration of a math assessment to high school students and the collection of school records on middle and high school students
- Collection of school records data on student characteristics and scores on state or district math assessments
- A student math assessment and students’ assent for taking the assessment
- A protocol for semi-structured interviews of HSAC program administrators

This package provides a detailed discussion of the procedures for these data collection
activities and copies of the forms and instruments.
B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Respondent Universe and Sampling Methods

The respondent universe for the study will consist of secondary school math teachers from
two HSAC programs (Teach For America [TFA] and the Teaching Fellows programs and other
similar programs affiliated with The New Teacher Project [TNTP]), non-HSAC teachers of the
same courses in the same schools, and the students in these courses. The sample will be selected
in four stages. MPR will (1) identify districts with TFA or TNTP secondary math teachers, (2)
identify the schools within these districts that employ secondary math teachers from TFA or
TNTP, (3) select at least one HSAC and one non-HSAC secondary math teacher who are
teaching the same course in the same school, and (4) randomly assign students between the
classrooms taught by HSAC and non-HSAC teachers and include all students in these
classrooms in the research sample.
The study will include a total of 450 “classroom matches,” each match consisting of a math
class taught by an HSAC teacher and one taught by a non-HSAC teacher in the 2009-2010
school year, for a total of at least 900 classrooms. All classes in the match must be for the same
subject (for example, Algebra I), at the same level (for example, honors, remedial, or regular),
and must be taught under the same circumstances (for example, English language learners must
be evenly distributed across the classrooms rather than clustered with one teacher or the other).
Furthermore, it must be possible for researchers to assign students randomly between classes in
the match with no disruption to or involvement in school scheduling procedures; this will
typically be the case when all sections of the match are taught concurrently.1 The same teacher
may be in more than one classroom match if he or she teaches more than one eligible class
during the school day. Assuming each teacher in the sample teaches an average of three study
classes, we anticipate a total sample of 150 HSAC teachers and 150 non-HSAC teachers. We
anticipate that the study will require the participation of approximately 112 schools in 20 districts.
Assuming 20 students per classroom, the study will include approximately 18,000 students.
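The class, teacher, and student counts above follow from simple arithmetic; a minimal sketch (Python, using only figures stated in the text) makes the calculation explicit.

```python
# Sketch of the sample-size arithmetic stated above; the per-teacher and
# per-class averages are the study's stated assumptions.
matches = 450                 # classroom matches
classes = matches * 2         # one HSAC and one non-HSAC class per match
teachers = classes // 3       # assuming three study classes per teacher
students = classes * 20      # assuming 20 students per class
print(classes, teachers, students)  # 900 300 18000
```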

1 Alternatively, if the school assigns students to “teams” in which students take all courses together throughout the school day and the target course is taught by an HSAC teacher in one team and a non-HSAC teacher in the other team, students can be randomly assigned to teams regardless of whether the courses are taught concurrently, with no involvement in the school’s scheduling procedures.


2. Statistical Methods for Sample Selection and Degree of Accuracy Needed

a. Sample Selection

Ideally, we would randomly sample teachers from the entire universe of secondary school
HSAC math teachers for the evaluation. Impact estimates would then generalize to all secondary
school HSAC math teachers. However, random sampling is not possible because the evaluation
will necessarily be limited to schools in which the experimental design is feasible—those with
eligible classroom matches. Thus, we propose to draw a purposive sample designed to meet a
specified statistical standard of precision. Although we will not be able to generalize to all HSAC
math teachers, we will obtain valid estimates of the impacts of the set of HSAC math teachers
who meet our eligibility requirements.
Selection of School Districts. Rosters from TFA and TNTP will indicate the districts that
have hired teachers from these programs. Because we are collecting a purposive sample, we will
not randomly sample these districts. Instead, we will prioritize our recruiting efforts to focus first
on the districts with the most HSAC math teachers and later on those with fewer.
Selection of Schools. From the school districts selected, we will contact schools to invite them to participate in the study and to determine whether they anticipate having an eligible
classroom match in the 2009-2010 school year.
Selection of Teachers. Teachers who teach a class in a classroom match will be included in the study. These math teachers will include both HSAC and non-HSAC teachers. In each school, we will include as many eligible classroom matches as possible in order
to maximize the statistical power of the study.
Selection of Students. All students who are signed up for classes in a classroom match and who can
take an assessment (if in high school) will be included in the study sample. These students will
be randomly assigned to either a classroom taught by an HSAC teacher or a classroom taught by
a non-HSAC teacher within the match.
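To illustrate the mechanics of this step, here is a minimal sketch of within-match random assignment in Python; the even split and the function name are illustrative assumptions rather than the study's exact procedure, which may need to respect class-size targets set by the school.

```python
import random

def assign_within_match(student_ids, seed=None):
    """Randomly split one classroom match's students between the class
    taught by the HSAC teacher and the class taught by the non-HSAC
    teacher. An even split is assumed here for simplicity."""
    rng = random.Random(seed)
    shuffled = list(student_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"hsac": shuffled[:half], "non_hsac": shuffled[half:]}

# Example: 40 students signed up for the matched course yield two classes of 20.
groups = assign_within_match([f"S{i:03d}" for i in range(40)], seed=12345)
```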
b. Estimation Procedures
To estimate the impact of HSAC teachers on secondary student math achievement for the
full evaluation, we will treat each classroom match as a separate “mini-experiment.” For each
classroom match, we will compare the average outcome math assessment score of students
randomly assigned to the class taught by the HSAC teacher to the average score of those
assigned to the non-HSAC teacher—the difference in average scores will provide an estimate of
the HSAC teacher’s impact in that particular classroom match. We will then average the impact
estimates across all classroom matches in the study to come up with an overall estimate of the
HSAC teachers’ impact on secondary student math achievement.
Primary Impact Analysis. Due to random assignment, the differences in mean outcomes in
each classroom match will provide an unbiased estimate of the impact of HSAC teachers.
However, the precision of the estimates can be improved by controlling for student-level baseline
characteristics that may explain some of the differences in achievement, such as sex, race,
free/reduced price lunch eligibility, special education status, whether the student is an English
language learner, and prior math achievement. We will therefore estimate the following model of
student math achievement for student i in classroom match j:
(1) Yij = αPj + βXij + δTij + εi

where Yij is the outcome math test score of student i in classroom match j; Pj is a vector of classroom match indicators; Xij is a vector of student-level baseline characteristics; Tij is an indicator for whether the student was in the HSAC teacher’s class in classroom match j; εi is a random-error term that represents the influence of unobserved factors on the outcome; and α, β, and δ are vectors of parameters to be estimated. Because the randomization is done within classroom matches within schools, and schools may differ from each other in student composition, the model includes the vector of classroom match indicators, Pj, to control for differences in average student characteristics between classroom matches and schools. If a sufficient number of classroom matches contain three teachers instead of two, the estimated standard errors will account for the clustering of students within classrooms.
The vector δ represents the experiment-level impacts of the HSAC teachers in each
classroom match that can then be aggregated to estimate the overall HSAC impact. The simplest
and perhaps most intuitively appealing way to aggregate these impacts is to calculate an equally
weighted average of the classroom match-level impacts. In this way, each classroom match will
have an equal influence on the overall impact estimate. As a specification check, we will also
explore alternative weighting schemes that have the potential to provide greater statistical
efficiency and test the robustness of the findings, including giving greater weight to more
precisely estimated classroom match-level impacts and weighting proportionally to the size of
the sample in each classroom match.
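The sketch below illustrates this estimation and aggregation approach using Python and statsmodels. The data frame and its column names (score, match, hsac, baseline) are hypothetical stand-ins for the variables in Equation (1); the alternative weighting schemes mentioned above would reuse the same match-level estimates with different weights.

```python
import numpy as np
import statsmodels.formula.api as smf

def estimate_overall_impact(df):
    """Fit Equation (1) with classroom-match fixed effects (Pj) and
    match-specific HSAC indicators (Tij), then average the match-level
    impact estimates (the delta vector) with equal weights."""
    model = smf.ols("score ~ C(match) + C(match):hsac + baseline", data=df).fit()
    deltas = [est for name, est in model.params.items() if ":hsac" in name]
    return float(np.mean(deltas))
```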
Subgroup Analyses. In addition to estimating the overall impact of HSAC teachers on
secondary student math achievement, we will conduct a limited number of subgroup analyses.
Specifically, we will separately estimate the impact of TFA and TNTP teachers, middle and high
school HSAC teachers, and novice and experienced HSAC teachers. To calculate subgroup
impacts, the classroom match-level impact estimates will be aggregated for each relevant
subgroup. For example, to calculate the subgroup impacts for high school and middle school
teachers, the impact estimates from experiments in high schools will be aggregated separately
from those from the experiments in middle schools. While we will test the statistical significance
of the impact for each subgroup, we will not test the significance of differences between
subgroups (for instance, between TFA and TNTP teachers), as the sample will not provide
adequate statistical power for these comparisons.
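Under equal weighting, subgroup aggregation reduces to averaging the match-level estimates within each subgroup, as in this minimal sketch (the impact estimates and subgroup labels are hypothetical inputs):

```python
def subgroup_impact(match_impacts, match_labels, label):
    """Equally weighted average of match-level impact estimates for the
    matches carrying the given subgroup label (e.g., 'TFA' or 'TNTP').
    match_impacts and match_labels are dicts keyed by classroom match ID."""
    vals = [d for m, d in match_impacts.items() if match_labels[m] == label]
    return sum(vals) / len(vals)
```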
Non-Experimental Analysis. If we find that HSAC teachers are more effective than non-HSAC teachers, policymakers will want to understand why they are more effective. To
shed light on this, we will investigate whether there are particular observable teacher
characteristics that are correlated with the impacts. Because the effects of the teacher
characteristics cannot be separated from the HSAC recruiting model experimentally, we will rely
on non-experimental methods for this exploratory analysis.


For the non-experimental analysis, we will estimate variations of Equation 1 that introduce
within-experiment differences in teacher characteristics:
(2) Yij = αPj + βXij + δTij + γCij + εi

where Cij represents a vector of observable characteristics of student i's teacher, γ is a vector of
parameters to be estimated, and all other variables are defined as above. Since these models
include classroom match-level fixed effects, the coefficients in vector γ represent the correlations
between the within-match differences in teacher characteristics and the within-match differences
in student outcomes. These exploratory analyses will be guided in large part by differences
between HSAC and non-HSAC teachers that are observed through the teacher survey and that
have been hypothesized to influence student achievement. For example, HSAC teachers are often
perceived to be different from non-HSAC teachers in their subject knowledge, the selectivity of
their undergraduate colleges, and their experience, all of which have been connected to student
achievement in prior research (Clotfelter et al. 2007). Therefore, using data from the teacher
survey and teacher math knowledge assessments (if the option is exercised), we will examine
how the differences between the HSAC teachers and the non-HSAC teachers along these
dimensions are correlated with student outcomes.
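A sketch of Equation (2) in the same statsmodels style is below; the teacher-characteristic columns (praxis_score, years_experience) are hypothetical examples of Cij variables drawn from the teacher survey and assessment.

```python
import statsmodels.formula.api as smf

def estimate_characteristic_correlations(df):
    """Fit Equation (2): Equation (1) plus teacher characteristics Cij.
    With the match fixed effects C(match) included, the returned gamma
    coefficients reflect within-match associations between teacher
    characteristics and student outcomes."""
    model = smf.ols(
        "score ~ C(match) + hsac + baseline + praxis_score + years_experience",
        data=df,
    ).fit()
    return model.params[["praxis_score", "years_experience"]]
```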
Non-Response and Crossovers. Although we will take steps to minimize the amount of
missing data, some student non-response for this evaluation is inevitable. This non-response may
lead to biased impact estimates if the non-response is correlated with math achievement and
whether the student was assigned to an HSAC teacher. To address this, we will use propensity
score matching and create non-response weights that appropriately weight those for whom we
have outcome math test scores, so that the weighted sample of students with nonmissing data is
representative of the full sample. In addition, some students who are assigned to an HSAC
teacher may cross over into a class with a non-HSAC teacher or vice versa. Including crossover
students might bias the impact estimates by attributing the performance of the HSAC teacher to a
non-HSAC teacher and vice versa. We can adjust the estimates for these crossovers using the
students’ assignment status as an instrumental variable for having an HSAC teacher (Angrist et
al. 1996).
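One standard way to construct such weights is inverse-probability-of-response weighting based on an estimated response propensity, sketched below. This is a common variant of the propensity-score approach, not necessarily the study's exact procedure, and the column names are assumptions.

```python
import numpy as np
import statsmodels.api as sm

def nonresponse_weights(df, covariates):
    """Estimate each student's probability of having an outcome test score
    from baseline covariates, then weight respondents by the inverse of
    that probability so they represent the full randomized sample.
    df must contain a 0/1 'responded' indicator."""
    X = sm.add_constant(df[covariates])
    propensity = sm.Logit(df["responded"], X).fit(disp=0).predict(X)
    return np.where(df["responded"] == 1, 1.0 / propensity, 0.0)
```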
c. Degree of Accuracy Needed

The study is designed to achieve a minimum detectable effect (MDE) of 0.10 standard
deviations in student math test scores. This target MDE is based on considerations of policy
relevance and attainability, balanced against the costs of data collection. It is lower than MDEs
from similar studies at the elementary school level because test score gains tend to be lower at
the middle and high school levels. Estimates of average annual gains in effect sizes from
nationally-normed math tests across grade levels presented by Hill et al. (2007) indicate that a
0.10 standard deviation effect of HSAC teachers on test scores would be equivalent to roughly a
third of a year of schooling for children in grades 6-10, a policy-relevant effect by most
standards. Furthermore, previous research has estimated effects of HSAC teachers as high as
0.11 standard deviations (Boyd et al. 2006; Kane et al. 2006), suggesting that an HSAC impact
of 0.10 might be attainable.


Exhibit 1 displays MDE sizes for the full sample and for subgroups of teachers. The MDEs
are based on an assumed sample of 112 schools, one-third providing four teachers for the study
and the rest providing two teachers, for a total of 300 teachers (150 HSAC and 150 non-HSAC
teachers). We assume each teacher on average teaches in three separate classroom matches, for a
total of 450 classroom matches, or 900 classes. We further assume each class has an average of
20 students, for a total of 18,000 students.
For all calculations, we assume a 5 percent level of statistical significance and an 80 percent
level of statistical power. Based on the previous experimental study of TFA (Decker et al. 2004),
we assume a “crossover rate” (students switching from the treatment to the control classroom or
vice versa) of 5 percent and a sample attrition rate of 10 percent. Also, consistent with the
previous experimental TFA study, we assume a teacher-level intracluster correlation (ICC) of
0.15 to account for the correlation of outcomes among students taught by the same teacher, as well as a correlation between
treatment and control group outcomes within a school of 0.50. We assume that control variables
in the impact model—in particular baseline test scores—explain 50 percent of the variance in
the test score outcome measure (that is, R2 = 0.50).
EXHIBIT 1

MINIMUM DETECTABLE EFFECT SIZES

Subgroup size                Minimum Detectable Effect    Sample Size (students/teachers)
100 percent (full sample)    0.10                         18,000/300
75 percent                   0.11                         13,500/225
50 percent                   0.14                         9,000/150
30 percent                   0.18                         6,000/100

Note: The minimum detectable effects were calculated using the following formula:

MDE = 2.486 × sqrt[(1 − R²) × (2ρ/T + 2(1 − ρ)/N)]

where R² (= .50) is the regression R-squared value estimated from previous studies, T is the number of treatment (control) group teachers, N is the total number of students in the treatment (control) group classrooms (assuming 20 students per class), ρ (= .15) is the between-classroom variance as a percentage of the total variance of the outcomes based on previous similar studies, and sample attrition is 10 percent.
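The sketch below evaluates the formula in Python. The note does not spell out exactly how the 10 percent attrition and 5 percent crossover rates enter the calculation, so the adjustments shown (attrition shrinking both teacher and student counts; crossover inflating the result by 1/(1 - 2 * 0.05)) are assumptions, chosen because they approximately reproduce the values in Exhibit 1.

```python
from math import sqrt

def mde(T, N, r2=0.50, icc=0.15, attrition=0.10, crossover=0.05):
    """Minimum detectable effect per the note's formula, with assumed
    adjustments for attrition and crossovers (see lead-in text)."""
    T_eff = T * (1 - attrition)        # effective teachers per arm
    N_eff = N * (1 - attrition)        # effective students per arm
    base = 2.486 * sqrt((1 - r2) * (2 * icc / T_eff + 2 * (1 - icc) / N_eff))
    return base / (1 - 2 * crossover)  # inflate for treatment/control crossovers

# Rows of Exhibit 1: (teachers per arm, students per arm)
for T, N in [(150, 9000), (112.5, 6750), (75, 4500), (45, 2700)]:
    print(round(mde(T, N), 2))  # 0.10, 0.11, 0.14, 0.18
```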

d. Unusual Problems Requiring Specialized Sampling Procedures
We do not anticipate any unusual problems that require specialized sampling procedures.
e. Use of Periodic Data Collection Cycles to Reduce Burden

All of the data collection activities that we are requesting clearance for in this package will
occur only once.


3. Methods to Maximize Response Rates and Deal with Nonresponse

Teacher Contact Form. At the beginning of the school year, we will administer a teacher contact form that requires minimal effort to complete (Appendix A). In the states where we will administer a teacher assessment, we will personally distribute the contact and consent forms to the teachers and request that they complete them at that time. In the other states, we will mail the forms to the teachers at their schools, and the teachers will have the option of returning the forms by mail or fax.
The contact information collected by this form will help us contact teachers who leave the
school during the school year so we can ask them to complete the teacher survey in the spring.
To maximize response rates, we will offer teachers $5 for each completed contact form. Based
on previous projects, we anticipate a 95 percent response rate for the teacher contact form. On
the Impact Evaluation of Teacher Induction Programs, teacher contact information was collected
as part of a 30-minute teacher survey that achieved a 94 percent response rate. We expect a
similar response rate for the teacher contact form for this study, which will take less than 5
minutes to complete.
Teacher Math Assessment and Consent Form. We will administer the Praxis Middle
School Mathematics (0069) test to teachers in grades 6-8 in states that do not require this test (or
the 0061) for certification. We will administer the Praxis Content Knowledge in Mathematics
(0061) test to teachers in grades 9-12 in the states that do not require teachers to take these tests
for certification. To encourage teachers to take the teacher math assessment, we will send an
invitation letter on ED letterhead (Appendix B). The letter will highlight the importance and
purpose of the teacher math assessment and emphasize our commitment to maintaining data
confidentiality. We will follow up with telephone calls to the teachers at the school to confirm
their participation. A payment of $120 will be offered to compensate teachers for the time (two
hours) and effort to take the test. The assessment will be scheduled after school hours at the school or at a site within the district to minimize travel time for the teachers.
In the states that require teachers to pass these Praxis math subject tests for certification, we will collect from ETS the study teachers’ test scores from when they took the test to obtain certification. These teachers will receive a letter on ED letterhead requesting that they complete and return the enclosed contact form and consent form to release their scores to the
study team (Appendix B). The letter will offer a $5 incentive upon receipt of the completed
forms.
We expect that we will be able to collect existing Praxis math subject test scores for 15
percent of the teachers, based on the proportion of study teachers that is employed in states that
require the test for certification. Therefore, we will seek to administer the test to 85 percent of
the teachers. Drawing upon prior experience, we expect that 90 percent of these teachers will
complete the teacher math assessment. The Evaluation of Early Elementary School Mathematics
Curricula achieved a 96 percent response rate for a one-hour teacher assessment that was
administered at the school during curriculum training sessions. Since training will not be coupled
with taking the Praxis assessment in this study, we anticipate a lower response rate. In total, we
expect that we will administer the test to 76.5 percent (90 percent of 85 percent) of the study
teachers.


Parent Consent. High participation rates for the student math assessment and student
records data collection will depend on high rates of parent consent for each student’s
participation in the study (Appendix C).
All students for whom active or passive consent is required will be asked to take home to their parents or guardians a two-sided consent form and/or notification letter, one side in English and the other in Spanish (English copies are presented in Appendix C). These documents will
be translated into other languages as needed. The documents will inform parents and guardians
that their child’s classroom has been selected for a national study of HSAC teachers, that
participation in the study is voluntary, and that it will involve schools/districts providing
demographic and test score data for their child. For the high school students, the consent
documents will indicate that students will be asked to complete a math assessment in class at the
end of the school year. The consent documents will also specify that the information collected
will be kept confidential and will be reported only in aggregate; they will also provide a toll-free
telephone number for parents to call to ask questions about the study. If possible, a letter from
the school principal supporting the study will be sent along with the consent form. The language
will be clear and nontechnical. The consent documents are modeled after documents we have
used in other evaluations.
Before the beginning of the school year, the school principal will be called to alert him or
her to the need to distribute the consent forms. The forms will be sent via Federal Express to the
school, with clear instructions for their distribution. The school will be asked to send the parental
consent forms in the “first-day” packages distributed to parents if possible.
Teachers will be asked to collect the signed forms, and the school will be provided with
postage-paid Federal Express packages to return the completed forms. Schools will be asked if
they would be willing to have the students hand-address an envelope with the consent form so
that the material can be directly mailed to the parent. Before the end of the first week of school,
we will call the school and the teachers to remind them to encourage students to return the forms.
Further calls will be made as needed. If the rate of return of consent forms is low, a member of
the evaluator’s data collection team will visit the school to talk personally to the principal,
teacher, and class. This study member may also attend school events that are frequented by
parents, such as back-to-school nights or parent-teacher nights. At these events, study members
can talk about the study and directly ask parents to complete the form. Additional consent
packets will be distributed to students/parents as needed. We will discuss with the school the
possibility of sending the consent forms with the students’ report cards or other school materials
requiring parent signature (such as class syllabi).
To explore whether incentives are effective in increasing the rate at which students return
consent forms in active consent districts, we propose to conduct an incentive experiment. The
experiment will investigate the effectiveness of two types of incentives. The first incentive is $25
offered to classrooms that collect signed consent forms for at least 95 percent of their students.
The second incentive is a $5 gift card offered to each student who returns the signed consent
form. Both types of incentives will be paid on the basis of returned forms, regardless of whether
the parent provided consent. We based our decision to provide a $5 gift card to students on two
studies. In the Impact Evaluation of Mandatory Random Study Drug Testing (OMB Approval
#1850-0818), students received a movie ticket ($7 value) for the return of a completed consent form, regardless of the consent status. In the Evaluation of the Youth Transition Demonstration
Projects (OMB Approval #0960-0687), youth received a $10 Target gift card or Metrocard if
they returned the consent form, regardless of consent status.
Schools in districts that require active consent will be randomly assigned to one of three
groups:
1. Treatment 1: Classroom receives a $25 incentive if 95 percent or more of the consent
forms are returned and individual students are offered a $5 gift card if they return the
consent form.
2. Treatment 2: There is no classroom incentive; individual students are offered a $5 gift
card if they return the consent form.
3. Control: There is neither a classroom incentive nor a student financial incentive for returning the form.
In each group of schools, we will document the percentage of forms returned each week
starting with the week the forms are sent home with the students. Other procedures used to
encourage the return of the forms not involving financial incentives will be similar in each group.
Comparisons across groups of the number of forms returned each week will provide estimates of
whether student or classroom incentives are effective and whether offering both student and
classroom incentives is more effective than either student or classroom incentives alone. A
power analysis concluded that we will be able to detect a difference by group of 14 percentage
points or more in the rates at which the forms are returned, a difference much lower than found
in previous experiments (Thompson 1984). The results of the experiment will be documented
and presented to OMB.
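The inputs behind the 14-percentage-point figure are not reported here, but a two-proportion power calculation of the general kind involved can be sketched with statsmodels; the 50 percent baseline return rate below is a purely hypothetical input.

```python
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# How many units per arm would be needed to detect a 14-percentage-point
# gap in return rates (hypothetically, 50% vs. 64%) at alpha=.05, power=.80?
effect = proportion_effectsize(0.64, 0.50)
n_per_arm = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.80, alternative="two-sided"
)
print(round(n_per_arm))  # approximate sample size per experimental group
```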
School Records Data. To minimize burden on the district and maximize the likelihood of
obtaining the data, during the initial phases of recruiting we will ask each district how
administrative records data are stored, how we can obtain permission for collecting this
information, and the contact person we should work with to obtain the data. For districts where
these data are stored at the school level, we will provide the school a letter of approval from the
district and a letter that will describe the type of data requested and include a toll-free number for
school staff to call if they have questions (Appendix D). We will accept either electronic data files or hard-copy lists. We assume that we will be able to obtain records for 95 percent of the
participating students.
Teacher Survey. To maximize response to the teacher survey, we will send teachers a letter
that will describe the study and provide instructions to complete the survey online at their
convenience or to request a paper self-administered questionnaire (Appendix E). We will send
out reminder emails and make reminder telephone calls. If necessary, MPR staff will follow up
with nonrespondents and administer the survey over the telephone at the teachers’ convenience.
To increase the response rate, we will offer $30 for each completed survey. Drawing on our
experience with similar surveys, we expect a 90 percent response rate for the teacher survey. For
the Evaluation of the Impact of Teacher Induction Programs, we achieved response rates
between 85 percent and 94 percent for six teacher surveys ranging from 20 to 30 minutes in length.

The 30-minute teacher survey administered for the Impact Evaluation of Teacher Preparation
Models achieved a 94 percent response rate.
High School Student Math Assessment. Students will be asked for their assent to complete
the test either through a paper form or on the first screen of the computerized math assessment
(Appendix F). Students who do not assent to the test will not take the test. We expect that nearly
all students will assent to take the test. Because the test is presented on a computer, it will be novel to the students, and because it is adaptive, it will not be so challenging as to be frustrating. To express our appreciation for participation in the student math assessment, we will
offer a $5 gift card to participating students.
Some students in our study will transfer to other schools within the district, and others will
relocate outside the school district. Student mobility will be tracked through the use of multiple
classroom roster checks in each school. Schools will be asked to provide their current rosters for
the classrooms of sampled teachers three times during the school year. These will be crosschecked against the study sample in each classroom. Follow-up telephone calls with the
appropriate school or district administrator(s) will help determine the location of those no longer
enrolled in the study class. We will attempt to test students who leave the study classroom but
remain in the same school when we assess their former classmates. For students who leave the
study school but remain in the same school district, every attempt will be made to test them in
their new schools at about the same time as those in their original cohort. We expect a 90 percent
response rate for the student assessment, the same response rate achieved for the student math
assessments administered for the Impact Evaluation of Teacher Preparation Models.
HSAC Program Administrator Interviews. To maximize response for the interviews with
the program directors, we will call the HSAC program administrators to introduce the study, explain the purpose of the study and the interviews, and describe the topics to be covered in the
interviews (Appendix G). Immediately after the call, we will email the study summary to them to
provide more information (presented in the first submission of this OMB clearance request). A
few days later, we will call the program administrators to address their questions and arrange a
time for the interview.
4. Tests of Procedures and Methods to Be Undertaken

The teacher contact form, teacher consent form, and teacher survey were modeled on
instruments used in a previous study, the Impact Evaluation of Teacher Preparation Models. The
school records data collection form was modeled on forms developed for the Impact Evaluation
of Charter School Strategies and Impact Evaluation of Teacher Induction Programs. The teacher
contact form, teacher consent form, and school records data request form will not be pretested
for this study as they were used effectively in previous studies for similar purposes. The teacher
assessment (the Praxis math subject tests) will not be pretested in light of their established use as a teacher assessment.
We conducted a pretest of the teacher survey with six middle and high school math teachers
with a variety of routes to certification. The pretest did not result in any substantive changes to
the survey and confirmed the burden to be 30 minutes or less.


The four student math assessments (General Math, Algebra I, Algebra II, and Geometry)
will be piloted in spring 2009 to ensure that there is sufficient time in one class period to obtain a
precise measure of a student’s achievement and to resolve any logistical issues in administering
the student assessment for the evaluation. Each version will be tested with 40 students from low-income districts. A request for clearance for this pilot study was included in the first stage of the
OMB package.
The parent consent forms will not be pretested as they were modeled on consent forms that
were successfully used for the Impact Evaluation of Teacher Preparation Models and the Impact
Evaluation of Teacher Induction Programs. The student assent statement will not be pretested as
it was modeled upon assent statements used in previous studies.
The HSAC program interview protocol will not be formally pretested as it is considered a
guideline for discussions, rather than a highly structured interview guide. However, after the first
two sets of interviews, the research team will hold a debriefing to discuss how the protocols are
working and make any necessary modifications. Additionally, our interviewers will be trained to
understand the kind of information we need to collect and the meaning of particular questions, so
they can communicate that effectively to interviewees.
5. Individuals Consulted on Statistical Aspects of the Design
The following individuals were consulted on the statistical aspects of the study:

Name               Title                      Telephone Number
Melissa Clark      Senior Researcher, MPR     609-750-3193
Philip Gleason     Senior Fellow, MPR         315-781-8495
John Deke          Senior Researcher, MPR     609-275-2230

The following individuals will be responsible for the data collection and analysis:

Name               Title                                                   Telephone Number
Sheena McConnell   Associate Director of Research and Senior Fellow, MPR   202-484-4518
Timothy Silva      Senior Researcher, MPR                                  202-484-5267
Melissa Clark      Senior Researcher, MPR                                  609-750-3193
Kathy Sonnenfeld   Survey Researcher, MPR                                  609-275-2293
Eric Zeidman       Survey Researcher, MPR                                  609-936-2784

REFERENCES

Angrist, Joshua D., Guido W. Imbens, and Donald R. Rubin. “Identification of Causal Effects
Using Instrumental Variables.” Journal of the American Statistical Association, vol. 91,
1996, pp. 444-472.
Boyd, Donald, Pamela Grossman, Hamilton Lankford, Susanna Loeb, and James Wyckoff.
“How Changes in Entry Requirements Alter the Teacher Workforce and Affect Student
Achievement.” Education Finance and Policy, vol. 1, no. 2, Spring 2006, pp. 178-216.
Clotfelter, Charles T., Helen F. Ladd, and Jacob Vigdor. “Teacher Credentials and Student
Achievement in High School: A Cross-Subject Analysis with Student Fixed Effects.”
National Bureau of Economic Research Working Paper no. 13617, November 2007.
Decker, Paul T., Daniel P. Mayer, and Steven Glazerman. “The Effect of Teach for America on
Students: Findings from a National Evaluation.” Princeton, NJ: Mathematica Policy
Research, Inc., 2004.
Hill, Carolyn J., Howard S. Bloom, Alison Rebeck Black, and Mark W. Lipsey. “Empirical
Benchmarks for Interpreting Effect Sizes in Research.” MDRC Working Papers on Research
Methodology, New York, NY: MDRC, 2007.
Kane, Thomas J., Jonah E. Rockoff, and Douglas O. Staiger. “What Does Certification Tell Us
About Teacher Effectiveness? Evidence from New York City.” National Bureau of
Economic Research Working Paper No. 12155, Washington, DC: National Bureau of
Economic Research, April 2006.
Thompson, Teresa. “A Comparison of Methods of Increasing Parental Consent Rates in Social
Research.” The Public Opinion Quarterly, vol. 48, no. 4, 1984.


