OMB: 1850-0865

Contract No.: ED-04-CO-0112 (09)

MPR Reference No.: 6522-520





An Evaluation of Secondary Math Teachers From Two Highly Selective Routes to Alternative Certification


Part A: Supporting Statement for Paperwork Reduction Act Submission


February 9, 2009

Revised May 22, 2009
















Submitted to:


Institute of Education Sciences

IES/NCEE

U.S. Department of Education

555 New Jersey Avenue, NW

Washington, DC 20208


Project Officer:

Stefanie Schmidt


Submitted by:


Mathematica Policy Research, Inc.

P.O. Box 2393

Princeton, NJ 08543-2393

Telephone: (609) 799-3535

Facsimile: (609) 799-0005


Project Director:

Sheena McConnell

CONTENTS

PART A: SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION

A. JUSTIFICATION

1. Circumstances Necessitating Collection of Information
2. Purposes and Uses of Data
3. Use of Technology to Reduce Burden
4. Efforts to Avoid Duplication of Effort
5. Methods of Minimizing Burden on Small Entities
6. Consequences of Not Collecting Data
7. Special Circumstances
8. Federal Register Announcement and Consultation
9. Payment or Gift to Respondents
10. Confidentiality of the Data
11. Additional Justification for Sensitive Questions
12. Estimates of Hour Burden
13. Estimate of Total Annual Cost Burden to Respondents or Recordkeepers
14. Estimates of Annualized Cost to the Federal Government
15. Reasons for Program Changes or Adjustments
16. Tabulation, Publication Plans, and Time Schedules
17. Approval Not to Display the Expiration Date for OMB Approval
18. Exception to the Certification Statement

REFERENCES

APPENDICES

APPENDIX A: DISTRICT NOTIFICATION PACKAGE
APPENDIX B: DISTRICT TELEPHONE CALL PROTOCOL
APPENDIX C: DISTRICT VISIT PROTOCOL
APPENDIX D: PRINCIPAL NOTIFICATION LETTER
APPENDIX E: SCHOOL TELEPHONE CALL GUIDE
APPENDIX F: SCHOOL VISIT GUIDE
APPENDIX G: TEACHER BACKGROUND FORM
APPENDIX H: CONSENT FORM FOR PILOTING STUDENT ASSESSMENT
APPENDIX I: REQUEST FOR CLASSROOM ROSTER

EXHIBITS

1 DATA COLLECTION PLAN
2 DIRECT AND INDIRECT BURDEN IN HOURS ON SCHOOLS PARTICIPATING IN THE STUDY
3 BURDEN IN HOURS TO RESPONDENTS
4 TIMELINE FOR THE STUDY



PART A: SUPPORTING STATEMENT FOR PAPERWORK
REDUCTION ACT SUBMISSION

This package requests clearance to recruit school districts and schools for a rigorous evaluation of secondary math teachers who have entered teaching through highly selective routes to alternative certification (HSAC). This evaluation is being conducted by the Institute of Education Sciences (IES), National Center for Education Evaluation (NCEE), U.S. Department of Education (ED); it is being implemented by Mathematica Policy Research, Inc. (MPR) and its partners—Chesapeake Research Associates LLC and Branch Associates.


The objective of the evaluation is to estimate the impact on secondary student math achievement of teachers who obtain certification via HSAC routes compared with teachers who receive certification through traditional or less selective alternative certification routes. The evaluation design is an experiment in which the researchers will randomly assign secondary school students to a treatment or control group. The treatment group will be taught by an HSAC teacher and the control group will be taught by a non-HSAC teacher. Both teachers must teach the same math class at the same level under the same general conditions at the same school. We will compare student math achievement between the treatment and control groups to estimate the impact of HSAC teachers.


The package will be submitted in two stages because the study schedule requires that district and school recruitment begin before all the data collection instruments are developed and tested. In this package, we are requesting approval for recruitment, a teacher background form, a spring 2009 pilot of the high school student math assessment, and the random assignment of students. This package also provides an overview of the study, including its design and data collection procedures.


An addendum to this package, submitted at a later date, will request clearance for the remaining data collection for the evaluation, including the consent forms. The addendum will also provide a detailed discussion of the data collection activities and copies of the instruments and consent forms.

A. JUSTIFICATION

1. Circumstances Necessitating Collection of Information

a. Statement of Need for a Rigorous Evaluation of HSAC Teachers

The specific legislation authorizing this data collection is Section 9601 of the Elementary and Secondary Education Act of 1965 (ESEA), which permits ESEA program funds to be used to evaluate activities authorized under the act. The No Child Left Behind Act of 2001 (NCLB), which reauthorized ESEA, emphasizes the importance of teacher quality in improving student achievement. Title II, Part A of ESEA (the Improving Teacher Quality State Grants program) provides nearly $3 billion a year to states to prepare, train, and recruit high-quality teachers, especially in the areas of mathematics and science. The purpose of Title II, Part A is to help states and local school districts improve student achievement through strategies for improving teacher quality and increasing the number of highly qualified teachers. One allowable use of Title II, Part A funds is "carrying out programs that establish, expand, or improve alternative routes for state certification of teachers and principals, especially in the areas of mathematics and science." Teachers who have not yet obtained full state certification can meet the highly qualified teacher requirements of NCLB if they are participating in an alternative route to certification program and demonstrate satisfactory progress toward full state certification (Title I: Improving the Academic Achievement of the Disadvantaged, Final Regulations, 34 CFR Part 200.56, December 2, 2002).


In response to the increasing teacher shortages faced by many school districts, 47 states have established alternative routes to certification that allow teachers to begin teaching before completing all of the training required for certification. While most teachers still follow traditional certification routes, an increasing number are entering the profession through alternative routes. By some estimates, about one-third of the teachers hired in any given year enter the profession via alternative certification routes (Feistritzer and Chester 2002).


Many alternative certification programs are not very selective and accept most of their applicants. However, some are highly selective, requiring applicants to undergo challenging interviews and a rigorous screening process and rejecting many or even most of their applicants. Teachers from these highly selective alternative certification programs have often been seen as a way to fill teacher shortages, especially in the areas of math and science (Ingersoll 2003; Boyd et al. 2006). The number of new teachers entering teaching through HSAC programs has grown rapidly since the founding of Teach For America, the first HSAC program, in 1990.


Despite the rising number of HSAC teachers, policymakers lack rigorous research evidence about the effectiveness of HSAC teachers in improving student achievement, particularly at the secondary level. To date, there has been one experimental study of HSAC teachers at the elementary level (Decker et al. 2004). Several well-implemented nonexperimental studies suggest that students of HSAC teachers at the secondary level perform at least as well as, and sometimes slightly better than, students of traditionally certified teachers on mathematics achievement tests (Boyd et al. 2006; Kane et al. 2006; Xu et al. 2008). However, the nonexperimental methods used by these studies leave open the possibility that any observed differences in student achievement may be due to factors other than the HSAC teachers.


This evaluation is thus essential to determining whether efforts to place high-quality alternatively certified teachers in classrooms are, in fact, having a measurable impact on student achievement. The study aims to fill this knowledge gap by focusing on secondary teachers from the two largest and best-known HSAC programs: (1) Teach For America (TFA) and (2) the Teaching Fellows programs (and similar programs operating under other names) fostered by The New Teacher Project (TNTP). TFA recruits recent graduates of some of the nation's most prestigious colleges. TNTP-affiliated programs focus on highly accomplished people who began their careers in other fields but want to become teachers.


The study focuses on math at the secondary level for four reasons. First, secondary math teacher shortages are widespread, so there is a high demand for HSAC teachers in this area. Second, the United States lags behind many other industrialized countries in secondary math achievement, suggesting a need for evidence on ways to enhance math achievement at this level. Third, some have argued that HSAC teachers are most effective at teaching the more advanced technical courses, such as secondary math. Fourth, a previous rigorous study found that TFA elementary teachers produced greater achievement gains in math than other teachers in the same grades and schools, but there were no differences in reading (Decker et al. 2004).

b. Research Questions

The primary research question of the evaluation is:

  • What is the impact on student math achievement of secondary school HSAC math teachers compared with non-HSAC teachers?

The evaluation also will address the following secondary research questions:

  • What is the impact of secondary math TFA teachers compared with non-HSAC teachers? What is the impact of secondary math TNTP-affiliated teachers compared with non-HSAC teachers?

  • What is the impact of middle school HSAC math teachers? What is the impact of high school HSAC math teachers?

  • To what extent do HSAC teachers differ in their educational backgrounds, experience, and math content knowledge from other math teachers in the same schools?

  • How does the impact of HSAC teachers vary with their educational backgrounds, experience, and math content knowledge?

  • How do HSAC programs recruit, train, and support secondary school math teachers?

c. Study Design

To answer the primary and secondary research questions, this study will use an experimental design in which students in the same school are randomly assigned to either a class that is taught by an HSAC teacher or a class that is taught by a non-HSAC teacher. The teachers in this “classroom match” must teach the same subject at the same level under the same general circumstances (for example, the same number of teachers in the classroom) in the same school. Students randomly assigned to an HSAC teacher comprise the treatment group; those randomly assigned to a non-HSAC teacher comprise the control group.


Random assignment is considered the “gold standard” for social policy evaluations because it, more than any other approach, minimizes the chance that any observed differences in outcomes between the study groups are due to unmeasured, pre-existing differences between members of the groups being studied. To determine whether an experimental evaluation of HSAC teachers would be feasible, MPR staff visited 28 purposively selected schools and concluded that under certain circumstances random assignment of students to HSAC and non-HSAC teachers was possible and that it was feasible to conduct an experimental evaluation of HSAC teachers (Clark et al. 2008).


The ability of the study to detect policy-relevant differences between the treatment and control groups depends, in large part, on the sample sizes. The study aims to include 450 classroom matches, or about 900 classes. Assuming an average of 20 students per class, the study will include approximately 18,000 students. We expect that these matching classrooms will include about 150 pairs of teachers (300 teachers), 112 schools, and 20 districts.
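To illustrate how these sample sizes translate into statistical precision, the sketch below computes a minimum detectable effect for a student-level random assignment design. The power multiplier, the covariate R-squared, and the decision to ignore clustering of students within classrooms are illustrative assumptions rather than study parameters; accounting for clustering would make the detectable effects somewhat larger.

```python
import math

def mde_effect_size(n_students, treat_share=0.5, r_squared=0.5, multiplier=2.8):
    """Minimum detectable effect in student-level standard deviation units.

    multiplier ~2.8 corresponds to 80 percent power in a two-tailed test at
    the 5 percent level; r_squared is the share of outcome variance explained
    by baseline covariates such as prior test scores. Both are assumptions.
    Clustering of students within classrooms is ignored, so these values are
    best read as lower bounds.
    """
    var_impact = (1.0 - r_squared) / (n_students * treat_share * (1.0 - treat_share))
    return multiplier * math.sqrt(var_impact)

print(round(mde_effect_size(18_000), 3))  # full sample of about 18,000 students
print(round(mde_effect_size(9_000), 3))   # a half-sample subgroup, e.g., middle schools only
```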


To examine the separate impacts of TFA and TNTP-affiliated programs, we will aim to include roughly equal numbers of teachers from both types of programs. To examine the separate impact of middle and high school HSAC teachers, we will aim to include roughly equal numbers of middle and high school teacher matches.


Districts, schools, and classes/teachers will be selected purposively based on the feasibility of their participation in an experimental evaluation and their willingness to participate. All districts that expect to employ secondary math teachers from TFA or TNTP programs in the study school year (2009-2010) are eligible to participate in the study. We will prioritize districts with HSAC programs that have been in operation for three years or more, and districts with a larger number of HSAC secondary math teachers. These districts are likely to include Baltimore, Miami, New York, Philadelphia/Camden, the San Francisco Bay Area, and Washington D.C.


A school is eligible to participate in the study if it: (1) is a public secondary school (and so contains at least one of the grades 6-12) and (2) will have at least one set of two matching classrooms—one taught by an HSAC teacher and one taught by a non-HSAC teacher—in the 2009-2010 school year and it is possible to randomly assign students to the classes. To participate in the study, teachers of both classes must teach the same math class at the same level under the same general circumstances. We expect that it will be feasible to randomly assign the students to the classes most frequently when the classes are taught during the same class period, as this is the least disruptive to schools’ schedules. For example, a match could be formed if there was a first period Algebra I class taught by an HSAC teacher and a first period Algebra I class taught by a non-HSAC teacher. Schools will be prioritized, like districts, to maximize recruiting success, targeting the largest schools and those identified with the most potentially eligible HSAC teachers.
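The matching rule described above can be made concrete with a small sketch that scans hypothetical master-schedule records for same-school, same-period, same-course pairs of HSAC and non-HSAC classes. The record layout and names are invented for illustration; in practice, recruiters will identify matches from schools' actual schedules.

```python
from collections import defaultdict
from itertools import product

# Hypothetical schedule records: (school, period, course, teacher, is_hsac).
schedule = [
    ("School A", 1, "Algebra I", "Smith", True),
    ("School A", 1, "Algebra I", "Jones", False),
    ("School A", 2, "Geometry",  "Lee",   True),   # no same-period partner: no match
]

def find_matches(schedule):
    """Group classes by (school, period, course); a classroom match requires
    at least one HSAC-taught and one non-HSAC-taught class in the group."""
    groups = defaultdict(lambda: {"hsac": [], "non_hsac": []})
    for school, period, course, teacher, is_hsac in schedule:
        groups[(school, period, course)]["hsac" if is_hsac else "non_hsac"].append(teacher)
    return {key: list(product(g["hsac"], g["non_hsac"]))
            for key, g in groups.items() if g["hsac"] and g["non_hsac"]}

print(find_matches(schedule))
# {('School A', 1, 'Algebra I'): [('Smith', 'Jones')]}
```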

d. Recruitment of Districts and Schools

Because we expect that some districts will not agree to participate, or will not agree in time to meet the study schedule, we will begin by contacting 40 districts. To identify districts with HSAC teachers, we will request from TFA and TNTP programs the names of current HSAC program participants and alumni who are teaching secondary math, by region, district, and school.


After prioritization of the districts, we will begin our recruiting efforts by sending an introductory package mailing to the selected district superintendents. The mailing will include a notification letter, study summary, and letters of support from TFA and TNTP program officials to inform them about the study and future contacts by the recruiting team (Appendix A). Follow-up calls to the districts will be made to ascertain the appropriate contact person and to arrange an in-person meeting with the relevant district staff (Appendix B). During the in-person meetings, we will discuss the study purpose and procedures in more detail, determine the necessary research approval requirements, and discuss the responsibilities of participating schools as well as the district’s responsibility to provide student records (Appendix C).


School recruitment will begin when districts grant us permission to begin contacting schools directly. We will contact those schools that we expect will employ HSAC math teachers during the study school year. Principals will receive a notification letter (Appendix D), study summary, and letters of support for the study from TFA and TNTP. Following the mailing, we will call the schools to conduct an initial telephone screening (Appendix E), followed by an in-person meeting to discuss the details of the study’s eligibility requirements, random assignment of students, data collection activities, and the study timeline (Appendix F). During the meeting, we will confirm teacher eligibility by asking principals to request that potentially eligible teachers complete teacher background forms (Appendix G).

e. Data Collection Needs

To address the study’s research questions, data will be required on students, teachers, schools, and HSAC programs.


Students. The key outcome of interest for this evaluation is the students' math achievement at the end of the 2009-2010 school year. For middle school students, we will collect the spring 2010 test scores from state- or district-administered math assessments rather than administering an assessment. Because not all high school students take state- or district-administered math assessments, and the tests that are administered to high school students are often not well aligned with the course material they are taught, high school students will be administered an in-class adaptive, computerized math assessment in spring 2010. Each student will take the assessment for the math course he or she is taking: General Math, Algebra I, Algebra II, or Geometry. We will conduct a pilot of the student math assessments in spring 2009 to test the logistics of administering a computer-adaptive assessment and, for each of these assessments, to determine the number of questions students can complete within a class period, how the number of questions answered affects the precision of students' scores, and whether an assessment may be too difficult or too easy (resulting in floor or ceiling effects).


Information on students’ demographic and socioeconomic characteristics and on their math test scores prior to the study school year will be used both to describe the students in the study and to develop more precise impact estimates. These data will be obtained from students’ school records.


Teachers. To examine how HSAC and non-HSAC teachers differ, teachers will be administered a survey in spring 2010 to collect information about their educational and professional background and the training and support they receive over the 2009-2010 school year. To ensure that we are able to administer the teacher survey to teachers who have left the school during the school year, we will administer a teacher contact form in fall 2009. The contact form will collect mailing address, telephone number, and e-mail address information from the teachers.


Because one of the key differences between HSAC and non-HSAC teachers may be their knowledge of the subject matter they teach, we will conduct an assessment of teacher math content knowledge in fall 2009.


HSAC Programs. To understand how HSAC programs prepare people for teaching, the team will conduct semistructured interviews with the administrators of all the programs attended by teachers in the study.

f. Data Collection Activities

A brief description of each data collection activity (in chronological order) is provided below and summarized in Exhibit 1. Only three data collection activities are part of this clearance request: administration of the teacher background form, the pilot of the high school student math assessment, and the collection of classroom rosters for random assignment. Instruments for the study's other data collection activities will be developed, tested, and submitted later as part of the addendum to this clearance request, along with the parental consent forms.


Data collection activities that are part of this clearance request include:


Administration of the Teacher Background Form. Principals will be asked to request that teachers who are potentially eligible for the study complete a teacher background form (Appendix G). Any teachers who replace study teachers during the 2009-2010 school year will also be asked to complete the form. This form asks teachers about the route they took to certification and the number of years they have been teaching. It will be used for two purposes: to check the eligibility of study teachers and to ensure that we have key information on all study teachers.


Pilot of Student Math Assessment. In the full study, at the end of the 2009-2010 school year, high school students will be asked to take a Northwest Evaluation Association (NWEA) general math or end-of-course math test, depending on the course they are taking. The NWEA assessment is a computer-adaptive test. The spring 2009 pilot will help us refine administration procedures to ensure that students can easily understand the test instructions and procedures and that the laptops work well, and to determine the appropriate number of questions that students can complete within one class period.


We will administer each of the four assessments (General Math, Algebra I, Algebra II, and Geometry) to approximately 40 students in two New Jersey high schools (160 students in total). To pilot the assessments on a group of students similar to those who will be included in the evaluation, we will select high schools in low-income districts. All parents of students in the pilot study will be given a letter describing the assessment and notifying them that students can opt out of the assessment if they wish (Appendix H).


Request for Classroom Roster and Random Assignment. Schools will be asked to submit to MPR initial classroom rosters of students assigned to the classes of the HSAC and non-HSAC teachers in the classroom matches. An example of this request is presented in Appendix I. MPR will pool the students in each classroom match and randomly assign them to the HSAC and non-HSAC teachers' classes. The reshuffled classroom rosters will then be sent back to the schools.
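As a minimal sketch of this step, the code below pools a hypothetical two-class roster and splits it evenly at random; the study's actual procedure may handle uneven class sizes and the small number of exempted students differently.

```python
import random

def randomly_assign(pooled_roster, seed=2009):
    """Shuffle the pooled students from a classroom match and split them
    between the HSAC and non-HSAC teacher's class. A fixed seed makes the
    draw reproducible and auditable."""
    rng = random.Random(seed)
    students = list(pooled_roster)
    rng.shuffle(students)
    half = len(students) // 2
    return {"hsac_class": students[:half], "non_hsac_class": students[half:]}

combined = ["student_%02d" % i for i in range(1, 41)]  # two classes of 20 students
assignment = randomly_assign(combined)
print(len(assignment["hsac_class"]), len(assignment["non_hsac_class"]))  # 20 20
```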


Schools will also be asked to send rosters of students who enter the matching study classrooms after the initial rosters are sent (late-entering students).


Schools will also be asked to send updated classroom rosters three times: in fall 2009, in the first week of classes in spring 2010, and again later in spring 2010. These rosters will be used to monitor the integrity of random assignment and the extent to which students leave or are added to classes.
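Conceptually, each roster check is a comparison of the current class list against the list produced by random assignment, as in the sketch below (student names are invented).

```python
def roster_check(assigned, observed):
    """Flag students who have left the class and students who have joined it
    since random assignment, so crossovers and churn can be followed up."""
    assigned_set, observed_set = set(assigned), set(observed)
    return {"left": sorted(assigned_set - observed_set),
            "joined": sorted(observed_set - assigned_set)}

assigned = ["ana", "ben", "carla", "dev"]
observed = ["ana", "carla", "dev", "emma"]   # ben left; emma joined
print(roster_check(assigned, observed))       # {'left': ['ben'], 'joined': ['emma']}
```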


Data collection activities for which we will request clearance in an addendum clearance request include:

  • Teacher Contact Form. This form will request detailed contact information for each teacher in the study. This will increase the likelihood that teachers who have left a school will respond to the teacher survey administered in the spring.

  • Teacher Math Content Knowledge Assessment. In fall 2009, we will ask most study teachers to take a Praxis math subject test. These math subject tests were developed by the Educational Testing Service (ETS) specifically to assess the math knowledge of teachers. Study middle school teachers will be asked to take the Middle School Mathematics (0069) test and study high school teachers will be asked to take the Content Knowledge in Mathematics (0061) test. In states that require teachers to take these tests to obtain certification, we will not ask teachers to retake the test, but will ask ETS for the study teachers’ scores on the test when they took it to obtain certification. We will ask for consent from all study teachers to obtain from ETS their scores on these tests.

  • Teacher Survey. Teachers will be asked to complete a survey in spring 2010, at the end of the study school year. This survey will ask about the college they attended, their college major and minor, any math-related coursework, and previous math-related work experience; the content and timing of certification-related coursework; training, mentoring, and coaching experiences during the school year; and student teaching experience. Teachers will have the option of completing the teacher survey online or using a self-administered paper questionnaire.

  • Student Math Assessment. High school students will be asked to take an NWEA math test at the end of the 2009-2010 school year.

  • Student Records. We will request standardized math test scores for all students for spring 2006 through spring 2010. We will also request data on student characteristics in the 2009-2010 school year, including sex, race/ethnicity, date of birth, grade, whether repeating a grade, whether they are eligible to receive free or reduced-price school lunch, whether they are an English language learner (ELL), and whether they have an individual education plan (IEP) or 504 plan. We will request these data first from the state or district; we will collect from the school any data that are unavailable from the state or district.

  • HSAC Program Directors Interviews. Semi-structured interviews will be conducted with HSAC regional program directors in spring 2010. These interviews will collect information on the strategies used to recruit, select, place, train, and support secondary math teachers.

EXHIBIT 1

DATA COLLECTION PLAN

Schedule: Spring 2009
Activity: Pilot study of student math assessment. Students will participate in pilot administration of the NWEA General Math, Algebra I, Algebra II, and Geometry assessments.
Respondent: High school students in General Math, Algebra I, Algebra II, and Geometry classes
Mode: Computer-adaptive assessment

Schedule: Spring-Summer 2009; Fall 2009-Spring 2010
Activity: Teacher background form. Request teachers (or principals on behalf of teachers) to complete the form during the school recruitment visit; also request information from any replacement teacher throughout the 2009-2010 school year.
Respondent: Teachers (or principals)
Mode: Hard copy

Schedule: Spring-Summer 2009
Activity: Classroom rosters. Obtain classroom rosters of students in order to randomly assign students to either HSAC or non-HSAC classrooms.
Respondent: School staff
Mode: Electronic or hard copy

Schedule: Fall 2009 (first two weeks of fall semester)
Activity: List of late-enrolling students. Obtain names of students who enroll in school after initial random assignment has been conducted.
Respondent: School staff
Mode: Electronic or hard copy

Schedule: Fall 2009
Activity: Teacher contact form. Obtain personal contact information from study teachers to enable contact if a teacher leaves the study school prior to spring data collection.
Respondent: Teachers
Mode: Hard copy

Schedule: Fall 2009
Activity: Consent forms for school records data collection and for testing (high schools only). School records: consent not required; obtain passive consent if the district requires consent. High school math assessment: request passive consent, or active consent if required by the district.
Respondent: Parents and legal guardians of students
Mode: Hard copy

Schedule: Fall 2009
Activity: Teacher math assessment and consent form to release scores to MPR. Request that middle school teachers take the Praxis Middle School Mathematics (0069) test and that high school teachers take the Praxis Content Knowledge in Mathematics (0061) test. We will obtain existing scores in states where teachers are required to take these tests for certification, and will obtain consent to obtain all study teachers' scores from ETS.
Respondent: Teachers
Mode: Hard-copy assessment

Schedule: Fall 2009, early spring 2010, and spring 2010
Activity: Classroom roster checks, at three points throughout the school year.
Respondent: School staff
Mode: Electronic or hard copy

Schedule: Spring 2010
Activity: High school student math assessment. Conduct math assessment with high school students.
Respondent: Students
Mode: NWEA computer-adaptive assessment

Schedule: Spring 2010
Activity: Teacher survey. Collect data on training and support received by teachers during the school year.
Respondent: Teachers
Mode: Web, hard copy, telephone reminder

Schedule: Spring 2010 (initial request); Summer 2010 (collect data)
Activity: Student records data collection. Collect spring 2006-2010 math standardized test score data and student characteristics data for school year 2009-2010.
Respondent: District or school staff
Mode: Electronic or hard copy

Schedule: Spring 2010
Activity: HSAC program administrator interviews. Collect information on HSAC programs' recruiting, selection, placement, training, and support strategies for secondary school math teachers.
Respondent: HSAC program administrators
Mode: Telephone semi-structured interviews

NWEA = Northwest Evaluation Association



g. Analysis

The study will estimate overall impacts and impacts for subgroups, including TFA teachers, TNTP teachers, middle school teachers, and high school teachers. We will investigate the extent to which differences in effectiveness are correlated with differences in educational background, experience, and math content knowledge between the HSAC and non-HSAC teachers. To understand how the impact of HSAC teachers varies with their characteristics, we will also estimate impacts separately for teachers with particular characteristics, such as a given range of years of experience.
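One plausible form for the impact estimator, shown here only as a sketch on simulated data, is a regression of spring scores on a treatment indicator with classroom-match fixed effects and a baseline score; the study's actual specification may differ (for example, in its weighting or treatment of clustering).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_matches, per_class = 50, 20

rows = []
for m in range(n_matches):
    for hsac in (1, 0):                       # the two classes in a match
        pre = rng.normal(size=per_class)      # prior-year score from records
        post = 0.1 * hsac + 0.7 * pre + rng.normal(scale=0.7, size=per_class)
        rows += [{"match_id": m, "hsac": hsac, "pre_score": p, "post_score": q}
                 for p, q in zip(pre, post)]
df = pd.DataFrame(rows)

# Match (randomization-block) fixed effects plus the baseline score; the
# coefficient on `hsac` recovers the simulated impact of 0.1.
model = smf.ols("post_score ~ hsac + pre_score + C(match_id)", data=df).fit()
print(round(model.params["hsac"], 3), round(model.bse["hsac"], 3))
```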

h. Study Timeline

The study is expected to be completed in four years. The experimental evaluation will be implemented in the school year 2009-2010. Reports will be available in spring 2012.

2. Purposes and Uses of Data

Information will be collected by Mathematica Policy Research, Inc. and its partners, Chesapeake Research Associates LLC and Branch Associates, under contract with ED [contract number ED-04-CO-0112 (09)].


The information collected by the teacher background forms will be used to identify eligible HSAC teachers for the study and to obtain information on their teaching experience and certification. The pilot of the student math assessments will provide data that will allow us to determine the appropriate number of questions students can complete within a class period, how the number of questions answered affects the precision of students’ scores, whether an assessment may lead to floor or ceiling effects, and to identify any logistical difficulties with the administration of the assessment.


The study findings as a whole will be used to inform the efforts of national, state, and local policymakers, districts, schools, and parents to improve student outcomes. This information will help guide school districts in their teacher hiring decisions. The study results may also provide policymakers with information on how to improve secondary math achievement in the United States, which lags behind other industrialized countries in this area. Math achievement can have a meaningful impact on the future economic well-being of students, with research confirming a correlation between student achievement on standardized tests at the secondary level and post-high school earnings (Murnane et al. 1995; Murnane et al. 2001; Lazear 2003; Deke and Haimson 2006). Knowledge of HSAC teachers' effectiveness will help teacher preparation and certification program developers design programs that have the best chance of improving student outcomes.


Findings will be presented in a final report in spring 2012. In addition, the data collected by the evaluation will be submitted to ED as restricted-use data files that will serve as a valuable resource for other researchers to further examine this issue.

3. Use of Technology to Reduce Burden

The evaluation will use a combination of mechanical and electronic technology to collect data. For each data collection task, we have selected the form of technology that enables us to obtain reliable information in an efficient way that minimizes respondent burden.


During the recruiting process, we will use the teacher background form to collect data to confirm the certification route of potentially eligible teachers. To minimize burden, we will deliver the hard-copy form in person to the teachers or principals during the school visit. Because the form is very short, requiring less than five minutes to complete, respondents can quickly complete the form during the visit and give it to the researchers in person, if they so choose. They will also have the option of faxing or mailing the form back to us.


Given the brevity of the school screening protocol, we can accurately and easily administer it over the telephone. Visits, however, are necessary for the districts and schools to ensure that they fully understand the study requirements. To minimize burden, schools will have the option of providing classroom rosters in electronic or hard-copy format.


The student math assessment administered in the pilot is a computer-adaptive test. We have selected this assessment to minimize burden on the students. Adaptive tests have been found to be more efficient (taking less time to complete and providing more precise estimates) and decrease the possibility of floor or ceiling effects (Rock and Pollack 2002). The assessment adapts to the student’s ability level, presenting more or less difficult questions depending on the student’s performance, until the student’s achievement level is precisely estimated.
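The toy simulation below illustrates this adaptive logic under a one-parameter (Rasch) model: each administered item adds information, the ability estimate is updated after every response, and testing stops once the estimate is precise enough or the item budget is exhausted. This is a generic illustration with invented values, not NWEA's algorithm.

```python
import math
import random

def rasch_p(theta, b):
    """Probability of a correct answer for ability theta and difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def adaptive_test(true_theta, item_bank, max_items=25, se_target=0.35, seed=1):
    """Administer the unused item closest in difficulty to the current
    ability estimate, nudge the estimate after each response, and stop
    when the standard error falls below se_target or items run out."""
    rng = random.Random(seed)
    theta, info, used = 0.0, 0.0, set()
    for _ in range(max_items):
        item = min((i for i in range(len(item_bank)) if i not in used),
                   key=lambda i: abs(item_bank[i] - theta))
        used.add(item)
        correct = rng.random() < rasch_p(true_theta, item_bank[item])
        p_hat = rasch_p(theta, item_bank[item])
        info += p_hat * (1.0 - p_hat)                # Fisher information accumulates
        theta += (correct - p_hat) / max(info, 0.5)  # crude Newton-style update
        if 1.0 / math.sqrt(info) < se_target:
            break
    return round(theta, 2), len(used), round(1.0 / math.sqrt(info), 2)

bank = [b / 5.0 for b in range(-20, 21)]              # 41 items, difficulty -4 to +4
print(adaptive_test(true_theta=1.2, item_bank=bank))  # (estimate, items used, SE)
```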

4. Efforts to Avoid Duplication of Effort

No other experimental evaluation of HSAC math teachers at the secondary school level has been conducted. To date, there has been only one experimental evaluation of HSAC teachers that studied teachers at the elementary level (Decker et al. 2004). Although there have been nonexperimental studies (such as Boyd et al. 2006; Kane et al. 2006; Xu et al. 2008), the findings are mixed and the nonexperimental methods used by these studies leave open the possibility that observed differences in student achievement might be due to underlying differences between the students taught by the HSAC and non-HSAC teachers rather than to true causal effects of the HSAC teachers themselves.

To the extent possible, we will use existing data for the study rather than duplicate data collection efforts. The information collected from the pilot study of the student assessment, the class rosters, and the teacher background forms is not available elsewhere.

5. Methods of Minimizing Burden on Small Entities

The primary small entities for this study are the districts and schools in which the study teachers teach. During recruiting, we will minimize burden by training recruitment staff to make their contacts as straightforward and concise as possible. The recruitment mailings and presentations are designed to be clear, brief, and informative. We will include all relevant staff at the district- and school-level meetings so that the district superintendents and principals will not be required to convey the information individually to their staff members. At the district level, we will attempt to arrange a meeting that includes representatives of the superintendent’s office, human resources office, and research approval office; the top official for secondary schools; the top official for math instruction; and officials who can discuss the availability of student records. For the school-level presentations, we will offer to meet with the school principal, key individuals responsible for course scheduling, and, at the principal’s discretion, the teachers who might be included in the study. We will use a two-stage screening process, using the school screening guide and teacher background form, to quickly eliminate schools and teachers who are not eligible for the study so they will not receive further contact from the recruiting team. The telephone screener guide and teacher background form were designed to be as short as possible.

To avoid imposing an additional time commitment and travel requirements on the students participating in the pilot of the student assessment, we will conduct the pilot during their regular math classes. To expedite the pilot, a trained proctor and test administrator will explain directions and answer questions. A computer technician will be present to set up the equipment prior to the start of the class and resolve any technical difficulties during the pilot.

6. Consequences of Not Collecting Data

The full data collection plan described in this submission is necessary for conducting this evaluation, which is consistent with the goals of NCLB to raise student achievement by requiring that all teachers in core academic subjects be highly qualified. Despite the increasing use of HSAC teachers, there have been very few experimental studies on the effectiveness of HSAC teachers. Thousands of new teachers are hired every year from HSAC programs with little or no scientifically based evidence on whether these programs produce teachers who are likely to be effective in the classroom. In the absence of this evaluation, ED will not be able to gauge HSAC teachers’ effects on student achievement. This study will thus be an important contribution to the policy debate. Its rigorous methodological design incorporating random assignment of students will ensure that highly credible evidence about the impact of HSAC teachers on student achievement is obtained.


The consequences of not collecting specific data items are discussed below.

  • Without the information collected on the teacher background forms, we could not verify that teachers meet the eligibility requirements for the study.

  • Without a pilot of the student math assessment, we might include an inappropriate number of questions on the assessment and could face unexpected logistical issues when administering the assessment during the full study.

  • Without the classroom rosters, we cannot conduct or monitor random assignment.

7. Special Circumstances

There are no special circumstances involved with the recruitment and data collection.

8. Federal Register Announcement and Consultation

a. Federal Register Announcement

A 60-day notice to solicit public comments was published in the Federal Register on December 19, 2008. No comments were received.

b. Consultations outside the Agency

Professional counsel was sought from a number of experts during the feasibility study for this evaluation. In January 2008, MPR convened a meeting of a Technical Working Group, consisting of a broad range of researchers, to provide input on study design issues and the data collection plan. These individuals were:

  • Brian Jacob, Walter H. Annenberg Professor of Education, Gerald R. Ford School of Public Policy, University of Michigan

  • John Pane, Senior Scientist, RAND Corporation

  • Michael Petrilli, Vice President for National Programs and Policy, Thomas B. Fordham Institute

  • Jeffrey Smith, Professor of Economics, University of Michigan

  • James Wyckoff, Professor of Economics, Rockefeller College, University at Albany

  • Paul Decker, President, Mathematica Policy Research, Inc.

  • John DeFlaminis, Practice Professor of Education, University of Pennsylvania Graduate School of Education

c. Unresolved Issues

There are no unresolved issues.

9. Payment or Gift to Respondents

We plan to provide payments or gifts to: (1) schools and students that participate in the pilot of the student assessments; and (2) schools that participate in the full study. We discuss each in turn.


a. Schools and Students Participating in the Pilot of the Student Assessments

To express our appreciation for participation in the pilot study of the student math assessment, we propose offering $250 to participating schools and a $5 gift to participating students. The school-level compensation is necessary because the schools participating in the pilot study, which will not be employers of TFA or Teaching Fellows teachers, will likely have little interest in the findings of the study. The schools will experience a high degree of burden because they will need to allocate valuable class time for the student assessments, and will need to designate physical space for the contractor to set up the laptops for the assessments. The $5 gift to students participating in the pilot of the assessment is necessary because NCEE has found in other studies, such as the DC Choice study, that getting secondary students to take assessments seriously is a challenge, and we hope the payment will help ameliorate that problem.

b. Schools that Participate in the Full Study

Schools that participate in the full study will be offered both nonfinancial and financial compensation.


Nonfinancial Compensation. Many school administrators hunger for information about their students and teachers as well as evidence-based policy recommendations. Hence, offering this type of information may be an effective recruitment tool.


When providing schools information from the study, it is paramount that we preserve the confidentiality of the study participants. Hence, we cannot offer schools student-level data on the results of the student math assessment that we will administer in the high schools as this would violate the students’ confidentiality. Providing student scores aggregated by teacher may discourage teachers, and hence schools, from agreeing to participate in the study.


We will, however, offer high schools information on how their students performed in the aggregate on each math test, in comparison with students in other schools within the district and with students nationally. If the school wishes, we could provide the mean and quartiles of scores for students who take each test (Algebra I, Algebra II, Geometry, and General Math) in their school, along with the mean and quartiles of scores for all students in the district taking that test for our study (as long as there are more than two study schools in the district). This would give school administrators information on how well students in their school are performing compared with students in other, similar schools. We could also provide the national norms for scores on each test by grade level that are constructed by NWEA. These data could be provided with the school and district data so that school administrators can compare how well their students are performing relative to students in the same grade level nationwide.
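Producing these school and district summaries is a straightforward aggregation, as the sketch below shows on a handful of invented Algebra I scores.

```python
import pandas as pd

# Invented score file: one row per tested student.
scores = pd.DataFrame({
    "school": ["A", "A", "A", "B", "B", "B", "C", "C"],
    "test":   ["Algebra I"] * 8,
    "score":  [221, 235, 240, 210, 244, 228, 251, 219],
})

stats = ["mean", "25%", "50%", "75%"]
print(scores.groupby("school")["score"].describe()[stats])  # per-school summary
print(scores["score"].describe()[stats])                    # district reference
```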


We also will offer to notify schools when the results of the study are made public. By sending them a short summary of the results and a link to the study report, the participating schools can be among the first to know about the study findings.


Financial Compensation. We plan to offer a $500 payment to middle schools that participate in the study, and $1,000 to high schools. The $500 differential between the middle and high school compensation takes into consideration the additional burden of the student math assessments in high schools, which will not occur in middle schools. The $1,000 per-school compensation is comparable to the OMB-approved $1,000 payment for control schools in the Math Professional Development study, in which the control schools received no intervention but were subject to the burdens of student testing and other data collection. This payment is required because of the burden this study will place on schools, the absence of any significant nonfinancial incentives related to the study, and the limited number of schools that are eligible to participate. These factors are elaborated on below.


Despite our efforts to minimize burden, the study will impose some burden on schools. We will take all possible steps to make participation as easy for schools as we can. Examples of ways we plan to reduce burden include:


  • Conducting in-person recruitment visits at a time that is convenient for school administrators. While it would be less costly to conduct recruitment by telephone, the evaluator plans to conduct the meetings in person because it is easier for the school administrator to ask questions, share and review the master schedules, and ask other staff members to attend the meeting.

  • If desired by the school, the evaluators are willing to talk with individual teachers or groups of teachers about the study when they are visiting the school or at a later date.

  • Each school will be linked with one person employed by the evaluator who will be the point of contact for addressing any questions that the school administrators or teachers have about the study.

  • The school administrator will be asked to identify a study liaison—a staff person at the school who will interact with the evaluator’s point of contact.

  • The schools will be allowed to exempt a small number of students from random assignment to accommodate exceptional circumstances such as a parent’s request for a specific teacher.

  • We will not be administering an assessment in middle schools. Instead, we will use the spring district math assessments. We need to administer an assessment in high schools because the spring district math assessment is not consistently administered after grade 8 and the assessments that are administered may not be well aligned with the math course taken by the students.

  • The student assessment in the high schools can be conducted during the student’s scheduled math class in the student’s classroom. The school will not need to provide any materials or equipment for the test.


Our experience in visiting schools is that school administrators are very busy, have many demands on their time, and do not welcome additional tasks. Even with all the steps taken by the evaluator to reduce the burden on the schools, the study will unavoidably place some burden on school staff. Exhibit 2 presents the expected burden on the schools. This table shows both the direct burden of the study on the schools (the time to collect and transmit classroom rosters) and the time schools will spend on other tasks for the study that are not directly related to data collection but still impose an indirect burden.






EXHIBIT 2

DIRECT AND INDIRECT BURDEN IN HOURS ON SCHOOLS PARTICIPATING IN THE STUDY

Activity | Number of Respondents | Responses per Respondent | Total Responses | Average Burden Hours per Response | Total Burden Hours

Direct Burden
Classroom rosters | 112 | 32 | 3,600 | 0.25 | 900
Roster of students enrolled late | 112 | 8 | 900 | 0.08 | 75
Total Direct Burden on Schools | | | | | 975

Indirect Burden
Meetings with evaluators | 112 | 1 | 112 | 2.25 | 252
Discussions within school | 112 | 1 | 112 | 3.00 | 336
Distributing student consent forms | 112 | 1 | 112 | 0.80 | 90
Collecting student consent forms in active consent districts | 34 | 1 | 34 | 0.80 | 27
Distributing teacher consent and contact forms | 112 | 1 | 112 | 0.27 | 30
Scheduling teacher assessment | 86 | 1 | 86 | 1.00 | 86
Coordinating teacher assessments | 86 | 1 | 86 | 0.10 | 9
Scheduling student testing | 56 | 1 | 56 | 1.80 | 101
Coordinating on day of test | 56 | 1 | 56 | 1.00 | 56
Arranging for student testing makeups | 56 | 1 | 56 | 1.00 | 56
Total Indirect Burden on Schools | | | | | 1,043

Total Burden on Schools | | | | | 2,018



The direct burden on the schools includes time spent creating and sending class rosters, first for random assignment and then for the evaluator to track movement of students in and out of classes and to verify that the treatment and control groups have been maintained. Each school will, on average, provide 8 classes (900 classes/112 schools). Since each class is requested to provide a roster four times during the study year (the roster for random assignment and the rosters for the fall, early spring, and late spring roster checks), each school would submit 32 rosters (8 classes x 4 rosters per class). Assuming 0.25 hours is required to prepare each roster, each school would spend a total of 8 hours preparing these rosters (8 classes x 4 rosters/class x 0.25 hours/roster). Additionally, each class will be requested to provide a roster of students who enroll during the first two weeks of the fall semester. Assuming it will require 0.08 hours to prepare the roster of these late-enrolling students, each school will spend 0.64 hours on the preparation of these rosters (8 classes x 1 roster x 0.08 hours/roster). In total, each school will spend approximately 9 hours preparing the full class rosters (8 hours) and the roster of late-enrolling students (about 1 hour). We estimate the total burden of these activities is 975 hours.
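The per-school arithmetic in that paragraph can be restated compactly (figures rounded as in the text; Exhibit 2 builds its 975-hour total from the total response counts):

```python
classes_per_school = round(900 / 112)               # about 8 classes per school
roster_hours = classes_per_school * 4 * 0.25        # four roster rounds: 8.0 hours
late_roster_hours = classes_per_school * 1 * 0.08   # late-enrollee roster: 0.64 hours
print(roster_hours + late_roster_hours)             # about 9 hours per school
# Exhibit 2 builds the totals from response counts: 3,600 rosters x 0.25 hours
# plus 900 late-enrollee rosters x 0.08 hours, reported there as 900 + 75 = 975.
```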


In addition to the burden of sending the student rosters both for random assignment and to check on the validity of random assignment, the study will also impose an indirect burden on the school. Specifically, the schools will need to participate in the following activities.


  • Meetings with Evaluators. The evaluation contractor will first discuss the study with a school administrator by telephone. In this call, the principal will be asked some brief questions, including whether the school may be interested in the study. We expect this will involve about 0.25 hours for each school. The evaluator will then arrange an in-person meeting that lasts about one hour and may include other school staff; we expect this to take about 1.5 person-hours on average. Finally, once the schedule for the 2009-2010 school year has been set, the evaluator will discuss the random assignment process with the school administrator, which we expect to take about 0.5 hours. These meetings will take 2.25 hours in total (0.25 + 1.5 + 0.5) per school, for a total of 252 hours.

  • Internal School Discussions. The school administrator will need to discuss the study with the staff responsible for scheduling and with the participating teachers. While the amount of time needed for this discussion will vary with the number of participating teachers and with whether the person responsible for scheduling attended the meeting with the evaluation contractor, on average we expect this to take about 2 hours per school. The school administrator may also need to consider how the school can alter its schedule or teaching assignments to accommodate the study, which may involve further contact with the evaluation contractor; on average, we expect this to take about 1 hour. In total, these discussions will take about 3 hours (2 + 1 hours) per school, for a total of 336 hours.

  • Distributing Letters and Consent Forms to Students. Letters notifying parents/guardians of the study will be distributed to all students in the study. Consent forms will be distributed to all high school students and to students in middle schools if requested by the district. The forms will be sent to the study liaison at the school, who will distribute them to each class in the study (about eight per school). We expect that this will take about 0.10 hours per class, or about 0.80 hours in total per school. We anticipate that these forms will be distributed in 112 schools and will total 90 hours.

  • Collecting Student Consent Forms. In districts that require active consent, school personnel will need to collect the signed consent forms and send them to the evaluator. Additional forms may need to be distributed to students who lose their forms. We expect this to take about 0.10 hours per class or about 0.80 hours in total per school (we expect about 8 classes per school). We anticipate that these forms will need to be collected in 60 percent of the high schools or 34 schools (we expect half the schools to be high schools), and will total 27 hours.



  • Distributing Teacher Consent and Contact Forms. As with the student consent forms, teacher consent and contact forms will be sent to the study liaison at the school, who will distribute them to each of the teachers in the study. These forms need to be distributed to all 300 teachers in the study, or 2 to 3 per school. We expect that this will take about 0.10 hours per teacher, or about 0.27 hours in total per school, for a total of 30 hours overall.

  • Scheduling the Teacher Assessment. The evaluator will call the schools in districts in which we will administer the teacher assessment to arrange for the most convenient time and place for the evaluator to administer the assessment. We estimate that teacher assessments will be conducted in 86 schools. We expect that scheduling the assessment will involve about 1.0 hour of the study liaison’s time for a total of 86 hours.

  • Coordinating the Teacher Assessment. The study liaison will need to meet with the evaluator briefly on the day the teacher assessment is administered. We expect this will take about 0.10 hours per school for about 9 hours across all 86 schools.

  • Scheduling the Student Assessment. The evaluator will call the study liaison in each school to schedule the student assessments. The liaison is likely to need to check with each teacher about the best days for the assessment. We expect this will take about 0.10 hours per class, or about 0.80 hours per school. An additional hour will be spent answering questions from the evaluator, for a total of 1.80 hours per school. Student testing will occur only in the high schools, which will be about half the study schools, or about 56 schools. In total, this will involve 101 hours.

  • Coordinating on the Day of the Test. Although the evaluator will administer the test, the study liaison will need to meet the evaluator on the day of the test and direct the assessment team to the classrooms. We expect this will take about 1.0 hour for each of the 56 high schools for an overall total of 56 hours.

  • Arranging for Student Make-up Test. The evaluator will administer tests to students who were absent on the day the student assessment is administered. The evaluator will need to discuss with the school study liaison the best day, time, and location for these make-up tests. We expect this will take about 1.0 hour for each of the 56 schools for an overall total of 56 hours.


We estimate this indirect burden to the schools will total 1,043 hours. Adding in the direct burden of sending classroom rosters, we estimate the total burden on the schools will be 2,018 hours (Exhibit 2). On average, this amounts to about 18 hours per school. As we propose to pay schools $750 on average ($500 to middle schools and $1,000 to high schools), this is equivalent to about $42 per hour of the school liaison's and administrator's time.


Participating schools do not directly benefit from the study. Unlike many other NCEE studies, none of the study schools will be offered an intervention, such as a curriculum or professional development program, as compensation for participating in the study. Although a district may make use of these alternative routes, the number of teachers that any one school hires in any one year is small in comparison with its overall staff size, so learning from the study is of less direct interest to any single principal. In addition, NCEE has no direct leverage to require or even promote study participation, since this is not an evaluation of a program directly funded by ED.


The number of eligible schools is limited. Given the sample size requirements of the study and the current estimated number of schools and districts that TFA and TNTP serve, if the rate of agreement to participate is not high, we will not be able to achieve the required study sample within a single year.


10. Confidentiality of the Data

All data collection activities will be conducted in full compliance with ED regulations to maintain the confidentiality of data obtained on private persons and to protect the rights and welfare of human research subjects as contained in ED regulations.


The contractor will follow the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires “[a]ll collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of Section 552 of Title 5, United States Code, the confidentiality standards of Subsection (c) of this section, and Sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment.


In addition, the contractor will ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools shall remain confidential in accordance with Section 552a of Title 5, United States Code; the confidentiality standards of Subsection (c) of this section; and Sections 444 and 445 of the General Education Provision Act.


Subsection (c) of Section 183 referenced above requires the Director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.”


Subsection (d) of Section 183 referenced above prohibits disclosure of individually identifiable information and makes any publication or communication of such information by employees or staff a felony.


MPR and its subcontractors will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Further, personally identifiable data will not be entered into the analysis file and data records will contain a numeric identifier only. When reporting the results, data will be presented only in aggregate form so that individuals and institutions will not be identified. A statement to this effect will be included with all requests for data. The teacher contact form and survey will include a reminder about confidentiality protection in compliance with the legislation. When data are collected through telephone interviews, respondents will be reminded about the confidentiality protections, the voluntary nature of the survey, and their right to refuse to answer individual questions. Further, no individually identifiable information will be maintained by the study team. All members of the study team having access to the data will be trained on the importance of confidentiality and data security. All data will be kept in secured locations and identifiers will be destroyed as soon as they are no longer required.


The following safeguards will be employed by MPR to carry out confidentiality assurances during the study:

  • All MPR employees sign a confidentiality pledge that emphasizes the importance of confidentiality and describes their obligations.

  • Access to identifying information on sample members is limited to those who have direct responsibility for providing and maintaining sample locating information. At the conclusion of the research, these data are destroyed.

  • Identifying information is maintained on separate forms and files, which are linked only by sample identification number.

  • Access to the file linking sample identification numbers with the respondents’ ID and contact information is limited to a small number of individuals who have a need to know this information.

  • Access to the hard-copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.

  • Computer data files are protected with passwords, and access is limited to specific users. Especially sensitive data are maintained on removable storage devices that are kept physically secure when not in use.

The Privacy Act of 1974 applies to this collection. MPR will make certain that all surveys are held strictly confidential, as described above, and that in no instance will responses be made available except in tabular form. Under no condition will information be made available to school personnel. District and school staff responsible for assisting MPR in the data collection will be fully informed of MPR’s policies and procedures regarding confidentiality of the data.


In addition, the following verbatim language will appear on all letters, brochures, and other study materials:

Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.

11. Additional Justification for Sensitive Questions

The test score data collected in the student assessment pilot could be considered sensitive if misused, for example, if teachers improperly used pilot scores to determine student grades. We will protect the confidentiality of these data by removing personal identifiers and restricting access to the study team.

12. Estimates of Hour Burden

Exhibit 3 provides an estimate of the time burden. The total reporting burden for the data collection covered by this clearance request is 1,185 hours. We expect to need 450 teachers to complete teacher background forms during school recruiting in order to identify 300 teachers who are eligible to participate in the evaluation. We plan to include 160 students in the pilot study of the student math assessment. We will ask about 112 schools to provide classroom rosters for 8 classes each, four times during the year (32 responses per school). In addition, schools will need to provide a list of late-entering students for each class.

EXHIBIT 3

BURDEN IN HOURS TO RESPONDENTS


Activities                            Number of     Responses per   Total       Average Burden    Total Burden
                                      Respondents   Respondent      Responses   Hours per         Hours
                                                                                Response

Schools
  Classroom rosters                   112           32              3,600       0.25              900
  Roster of students enrolled late    112           8               900         0.08              75

Teachers
  Teacher background form^a           450           1               450         0.08              37

Students/Parents
  Parental consent form for pilot     160           1               160         0.08              13
  Pilot of student assessment         160           1               160         1.00              160

Total                                                                                             1,185

^a Assumes that 150 of the teachers who complete the background form will not meet the criteria for the study. Principals may serve as proxy respondents for the teacher background form.



13. Estimate of Total Annual Cost Burden to Respondents or Recordkeepers

There are no direct costs to individual participants.

14. Estimates of Annualized Cost to the Federal Government

The estimated cost to the federal government for the study—including recruiting districts and schools, designing and administering all data collection instruments, processing and analyzing the data, and preparing reports—is $8,087,800. Recruiting, data collection, and reporting activities will be carried out over approximately four years (fall 2008 to summer 2012). Thus, the average annual cost of the study is $2,021,950.

15. Reasons for Program Changes or Adjustments

This is a new project.

16. Tabulation, Publication Plans, and Time Schedules

We discuss tabulation plans, publication plans, and the timeline for the study in the following subsections.

a. Tabulation Plans

We will examine data obtained from the pilot of student assessments to determine the appropriate number of questions students can complete within a class period, how the number of questions answered affects the precision of students’ scores, and whether an assessment may produce floor or ceiling effects.
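
To make the floor/ceiling check concrete, the following is a minimal sketch in Python; the column name (score) and the use of the assessment’s minimum and maximum scale scores as bounds are illustrative assumptions, not the study’s actual data layout:

    import pandas as pd

    def floor_ceiling_rates(scores: pd.Series, min_score: float, max_score: float) -> dict:
        # Share of pilot examinees at the extremes of the score range; a large
        # share at either end would suggest a floor or ceiling effect.
        return {
            "floor": float((scores <= min_score).mean()),
            "ceiling": float((scores >= max_score).mean()),
        }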


To estimate the impact of HSAC teachers on secondary student math achievement for the full evaluation, we will treat each classroom match as a separate “mini-experiment.” For each classroom match, we will compare the average end-of-year math assessment score of students randomly assigned to the class taught by the HSAC teacher with the average score of those assigned to the non-HSAC teacher; the difference in average scores provides an estimate of the HSAC teacher’s impact in that particular classroom match. We will then average the impact estimates across all classroom matches in the study to produce an overall estimate of the HSAC teachers’ impact on secondary student math achievement.
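
As an illustration of this mini-experiment logic, the sketch below computes the match-level differences in mean scores and their equally weighted average; the data frame layout and column names (match, hsac, score) are hypothetical:

    import pandas as pd

    # One row per student: 'match' identifies the classroom match, 'hsac' flags
    # random assignment to the HSAC teacher's class, and 'score' is the
    # end-of-year math assessment score.
    def match_level_impacts(df: pd.DataFrame) -> pd.Series:
        means = df.groupby(["match", "hsac"])["score"].mean().unstack("hsac")
        return means[1] - means[0]  # HSAC mean minus non-HSAC mean, per match

    def overall_impact(df: pd.DataFrame) -> float:
        return float(match_level_impacts(df).mean())  # equally weighted average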


Primary Impact Analysis. Because of random assignment, the difference in mean outcomes within each classroom match provides an unbiased estimate of the impact of HSAC teachers. However, the precision of the estimates can be improved by controlling for student-level baseline characteristics that may explain some of the differences in achievement, such as sex, race, free/reduced-price lunch eligibility, special education status, English language learner status, and prior math achievement. We will therefore estimate the following model of student math achievement for student i in classroom match j:

Yij = λ′Pj + β′Xij + δ′(Pj × Tij) + εij                (1)

where Yij is the outcome math test score of student i in classroom match j, Pj is a vector of classroom match indicators, Xij is a vector of student-level baseline characteristics, Tij is an indicator for whether the student was in the HSAC teacher’s class in classroom match j, εij is a random-error term that represents the influence of unobserved factors on the outcome, and λ, β, and δ are vectors of parameters to be estimated. Because randomization is done within classroom matches within schools, and schools may differ from each other in student composition, the model includes the vector of classroom match indicators, Pj, to control for differences in average student characteristics across classroom matches and schools. If a sufficient number of classroom matches contain three teachers instead of two, the estimated standard errors will account for clustering of students within classrooms.
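
For illustration only, Equation 1 could be estimated along the following lines in Python with statsmodels; the covariate names (pretest, female, frpl) and the classroom identifier are placeholders rather than the study’s actual variable names:

    import statsmodels.formula.api as smf

    # df: one row per student, as in the earlier sketch. C(match) adds the
    # classroom match fixed effects; C(match):hsac yields one HSAC coefficient
    # (an element of delta) per classroom match.
    model = smf.ols(
        "score ~ C(match) + C(match):hsac + pretest + female + frpl",
        data=df,
    )
    # Cluster standard errors on classroom for matches with three teachers.
    result = model.fit(cov_type="cluster", cov_kwds={"groups": df["classroom"]})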


The vector δ represents the experiment-level impacts of the HSAC teachers in each classroom match that can then be aggregated to estimate the overall HSAC impact. The simplest and perhaps most intuitively appealing way to aggregate these impacts is to calculate an equally weighted average of the classroom match-level impacts. In this way, each classroom match will have an equal influence on the overall impact estimate. As a specification check, we will also explore alternative weighting schemes that have the potential to provide greater statistical efficiency and test the robustness of the findings, including giving greater weight to more precisely estimated classroom match-level impacts and weighting proportionally to the size of the sample in each classroom match.
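
A brief sketch of these alternative aggregation schemes, assuming the match-level impact estimates, their standard errors, and the match sample sizes have already been computed (all inputs are illustrative):

    import numpy as np

    def aggregate_impacts(deltas, ses, n):
        # deltas: match-level impact estimates; ses: their standard errors;
        # n: number of students in each classroom match.
        deltas, ses, n = map(np.asarray, (deltas, ses, n))
        return {
            "equal": float(np.mean(deltas)),                               # benchmark
            "precision": float(np.average(deltas, weights=1.0 / ses**2)),  # inverse variance
            "sample_size": float(np.average(deltas, weights=n)),           # proportional to n
        }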


Subgroup Analyses. In addition to estimating the overall impact of HSAC teachers on secondary student math achievement, we will conduct a limited number of subgroup analyses. Specifically, we will separately estimate the impact of TFA and TNTP teachers, middle and high school HSAC teachers, and novice and experienced HSAC teachers. To calculate subgroup impacts, the classroom match-level impact estimates will be aggregated for each relevant subgroup. For example, to calculate the subgroup impacts for high school and middle school teachers, the impact estimates from experiments in high schools will be aggregated separately from those from the experiments in middle schools. While we will test the statistical significance of the impact for each subgroup, we will not test the significance of differences between subgroups (for instance, between TFA and TNTP teachers), as the sample will not provide adequate statistical power for these comparisons.
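
Under the same assumptions, subgroup impacts would simply re-aggregate the match-level estimates within each subgroup; the data frame and flag names below (match_df, high_school, tfa) are hypothetical:

    import pandas as pd

    # match_df: one row per classroom match, with the impact estimate 'delta'
    # and boolean subgroup indicators for the match's school level and program.
    def subgroup_impact(match_df: pd.DataFrame, flag: str) -> dict:
        return {
            flag: float(match_df.loc[match_df[flag], "delta"].mean()),
            "not_" + flag: float(match_df.loc[~match_df[flag], "delta"].mean()),
        }

    # Example: subgroup_impact(match_df, "high_school") or subgroup_impact(match_df, "tfa")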


Non-Experimental Analysis. If we find that HSAC teachers are more effective than non-HSAC teachers, policymakers will want to understand the reasons they are more effective. To shed light on this, we will investigate whether there are particular observable teacher characteristics that are correlated with the impacts. Because the effects of the teacher characteristics cannot be separated from the HSAC recruiting model experimentally, we will rely on non-experimental methods for this exploratory analysis.


For the non-experimental analysis, we will estimate variations of Equation 1 that introduce within-experiment differences in teacher characteristics:

Yij = λ′Pj + β′Xij + γ′Cij + εij                (2)

where Cij represents a vector of observable characteristics of student i's teacher, γ is a vector of parameters to be estimated, and all other variables are defined as above. Since these models include classroom match-level fixed effects, the coefficients in vector γ represent the correlations between the within-match differences in teacher characteristics and the within-match differences in student outcomes. These exploratory analyses will be guided in large part by differences between HSAC and non-HSAC teachers that are observed through the teacher survey and that have been hypothesized to influence student achievement. For example, HSAC teachers are often perceived to differ from non-HSAC teachers in their subject knowledge, the selectivity of their undergraduate colleges, and their experience, all of which have been connected to student achievement in prior research (Clotfelter et al. 2007). Therefore, using data from the teacher survey and teacher math knowledge assessments (if that option is exercised), we will examine how differences between HSAC and non-HSAC teachers along these dimensions are correlated with student outcomes.
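
As a sketch, Equation 2 might be estimated as follows; the teacher-characteristic variable names (math_know, selectivity, yrs_exper) are illustrative stand-ins for measures from the teacher survey and math knowledge assessment:

    import statsmodels.formula.api as smf

    # Match fixed effects absorb between-match differences, so the coefficients
    # on the teacher characteristics (gamma) capture within-match covariation
    # between those characteristics and student outcomes.
    eq2 = smf.ols(
        "score ~ C(match) + math_know + selectivity + yrs_exper"
        " + pretest + female + frpl",
        data=df,
    ).fit()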


Non-Response and Crossovers. Although we will take steps to minimize the amount of missing data, some student non-response is inevitable in this evaluation. Non-response may bias the impact estimates if it is correlated both with math achievement and with whether the student was assigned to an HSAC teacher. To address this, we will use propensity score matching to create non-response weights for students with outcome math test scores, so that the weighted sample of students with nonmissing data is representative of the full sample. In addition, some students who are assigned to an HSAC teacher may cross over into a class with a non-HSAC teacher, or vice versa. Including crossover students could bias the impact estimates by attributing the performance of the HSAC teacher to a non-HSAC teacher and vice versa. We can adjust the estimates for these crossovers by using the students’ assignment status as an instrumental variable for having an HSAC teacher (Angrist et al. 1996).
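
For the crossover adjustment, the following is a minimal sketch of the assignment-as-instrument (Wald) estimator for a single classroom match; the column names (assigned, taught_by_hsac, score) are hypothetical:

    import pandas as pd

    def wald_impact(df: pd.DataFrame) -> float:
        # 'assigned' = 1 if randomized to the HSAC teacher; 'taught_by_hsac' = 1
        # if actually taught by the HSAC teacher; 'score' = outcome test score.
        g = df.groupby("assigned")[["score", "taught_by_hsac"]].mean()
        itt = g.loc[1, "score"] - g.loc[0, "score"]                  # intent-to-treat
        takeup = g.loc[1, "taught_by_hsac"] - g.loc[0, "taught_by_hsac"]
        return float(itt / takeup)  # effect of actually having an HSAC teacher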

b. Publication Plans

During the third year of the study, we will prepare the draft of the final report, which will address each research question. The report will be written in a style and format accessible to policymakers and research-savvy practitioners. A draft will be delivered to ED in June 2011; a revised draft that addresses ED’s comments will be delivered in August 2011. The final report, which will address all of the peer-review comments, will be delivered by April 2012.

c. Schedule

The timeline for the evaluation is shown in Exhibit 4.

17. Approval Not to Display the Expiration Date for OMB Approval

Approval not to display the expiration date for OMB approval is not requested.

18. Exception to the Certification Statement

No exceptions to the certification statement are requested or required.


EXHIBIT 4


TIMELINE FOR THE STUDY


Activity                                                     Time Period

Recruit districts and schools                                Fall 2008-spring 2009
Administer teacher background form                           Fall 2008-spring 2009
Pilot of student assessment                                  Spring 2009
Conduct random assignment                                    Spring 2009-fall 2009
School year during which teachers are evaluated              2009-2010
Conduct teacher math content assessment (at ED’s option)     Fall 2009
Collect consent forms for testing                            Fall 2009
Collect teacher contact form                                 Fall 2009
Administer high school math assessment                       Spring 2010
Conduct HSAC program director interviews                     Spring 2010
Administer teacher survey                                    Spring 2010
Collect school records data                                  Summer 2010
Draft report                                                 June 2011
Final report                                                 April 2012


ED = U.S. Department of Education.

HSAC = highly selective routes to alternative certification.






REFERENCES

Angrist, Joshua D., Guido W. Imbens, and Donald R. Rubin. “Identification of Causal Effects Using Instrumental Variables.” Journal of the American Statistical Association, vol. 91, 1996, pp. 444–472.

Boyd, Donald, Pamela Grossman, Hamilton Lankford, Susanna Loeb, and James Wyckoff. “How Changes in Entry Requirements Alter the Teacher Workforce and Affect Student Achievement.” Education Finance and Policy, vol. 1, no. 2, Spring 2006, pp. 178–216.

Clark, Melissa, Sheena McConnell, Kristen Hallgren, Daniel Player, and Alison Wellington. “Evaluating Highly Selective Programs That Provide Alternative Routes to Teacher Certification: Feasibility and Design Issues.” Princeton, NJ: Mathematica Policy Research, Inc., March 28, 2008.

Clotfelter, Charles T., Helen F. Ladd, and Jacob Vigdor. “Teacher Credentials and Student Achievement in High School: A Cross-Subject Analysis with Student Fixed Effects.” National Bureau of Economic Research Working Paper no. 13617, November 2007.

Decker, Paul T., Daniel P. Mayer, and Steven Glazerman. “The Effect of Teach for America on Students: Findings from a National Evaluation.” Princeton, NJ: Mathematica Policy Research, Inc., 2004.

Deke, John, and Joshua Haimson. “Valuing Student Competencies: Which Ones Predict Postsecondary Educational Attainment and Earnings and for Whom?” Princeton, NJ: Mathematica Policy Research, Inc., 2006.

Feistritzer, Emily, and David Chester. “Alternative Teacher Certification: A State-by-State Analysis 2002.” Washington, DC: National Center for Education Information, 2002.

Ingersoll, Richard M. “Is There Really a Teacher Shortage?” Philadelphia: Consortium for Policy Research in Education, University of Pennsylvania, 2003.

Kane, Thomas J., Jonah E. Rockoff, and Douglas O. Staiger. “What Does Certification Tell Us About Teacher Effectiveness? Evidence from New York City.” National Bureau of Economic Research Working Paper no. 12155, April 2006.

Lazear, Edward P. “Teacher Incentives.” Swedish Economic Policy Review, vol. 10, no. 3, 2003, pp. 179–214.

Murnane, Richard J., John B. Willett, and Frank Levy. “The Growing Importance of Cognitive Skills in Wage Determination.” Review of Economics and Statistics, vol. 77, no. 2, May 1995, pp. 251–266.

Murnane, Richard J., John B. Willett, M. Jay Braatz, and Yves Duhaldeborde. “Do Different Dimensions of Male High School Students’ Skills Predict Labor Market Success a Decade Later? Evidence from the NLSY.” Economics of Education Review, vol. 20, no. 4, August 2001, pp. 311–320.

Northwest Evaluation Association. “Technical Manual for the NWEA Measures of Academic Progress and Achievement Level Tests.” Portland, Oregon: Northwest Evaluation Association, 2003.

Rock, Donald A., and Judith Pollack. “Early Childhood Longitudinal Study - Kindergarten Class of 1998-99 (ECLS-K): Psychometric Report for Kindergarten Through First Grade.” Washington, DC: National Center for Education Statistics, August 2002.

Xu, Zeyu, Jane Hannaway, and Colin Taylor. “Making a Difference? The Effects of Teach For America in High School.” Washington, DC: Urban Institute, March 2008.




1 The length of a class period differs across schools, lasting anywhere from 45 to 90 minutes. Our goal is to administer the student assessment in less than 45 minutes.
