
Study of Teacher Residency Programs

Response to OMB Questions

OMB: 1850-0880




Contract Number:

ED-IES-10-C-0001

Mathematica Reference Number:

06748-510

Submitted to:

Institute of Education Sciences

U.S. Department of Education

555 New Jersey Avenue, NW

Washington, DC 20208

Project Officer: Melanie Ali

Submitted by:

Mathematica Policy Research

600 Maryland Avenue, SW

Suite 550

Washington, DC 20024-2512

Telephone: (202) 484-9220

Facsimile: (202) 863-1763

Project Director: Philip Gleason

Responses to OMB Questions on A Study of Teacher Residency Programs

January 27, 2011





The questions we received from OMB on 9/29/2010 regarding our information collection package for the Teacher Residency Program (TRP) Study, along with our responses, are presented below. An important piece of context for understanding our responses is that the study design currently being planned differs from the study's original design, which was reflected in the initial draft of the OMB package reviewed by OMB. In particular, a key part of the original design was an impact study intended to measure the impact of having a TRP teacher on students' achievement levels. We had planned to estimate this impact using an experimental design, with students randomly assigned to either TRP or non-TRP teachers. However, we determined that an experimental design was not feasible. Thus, in conjunction with Allison Cole and Rita Zota of OMB, as well as with the program office, we changed the research question from one that called for us to measure the impact of having a TRP teacher on student achievement to one that examines student outcomes among the students in TRP teachers' classrooms. In other words, we shifted from conducting an impact study to conducting an outcomes study.

In this outcomes study, we plan to use value-added methods to estimate value-added scores for the TRP teachers in our sample, using value-added scores for other teachers in the same districts to benchmark the TRP teachers' outcomes (but without attempting to measure the impact of having a TRP teacher). One implication of this change in the study design is that several of the questions listed below are no longer relevant under the current design.
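For illustration only, and not as the study's exact specification, a typical value-added model of this kind regresses a student's current achievement on prior achievement, observed student characteristics, and teacher indicators:

$$Y_{ijt} = \beta_0 + \beta_1 Y_{i,t-1} + X_{it}'\gamma + \mu_j + \varepsilon_{ijt}$$

where $Y_{ijt}$ is the achievement of student $i$ taught by teacher $j$ in year $t$, $Y_{i,t-1}$ is the student's prior-year achievement, $X_{it}$ is a vector of observed student characteristics, $\mu_j$ is teacher $j$'s value-added effect, and $\varepsilon_{ijt}$ is a residual. Under this framing, each TRP teacher's estimated $\hat{\mu}_j$ would be compared against the distribution of estimates for other teachers in the same districts, which serves as the benchmark described above.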



  1. What other approaches to increasing the sample size (besides recruiting from outside ED grantees) did you consider? For example, did you consider collecting data on multiple cohorts of grantees?

When we were exploring an experimental design for the study, we first considered recruiting programs that are not ED grantees, as noted in the question. We also considered including teachers who are beyond their second year as the teacher of record, as well as future cohorts of teachers (those who become teachers of record in fall 2012 or beyond). However, since we are no longer planning to implement an experimental design, we do not anticipate difficulty including a sufficient number of TRP and non-TRP teachers in our outcomes study.

  2. Pg. 4 discusses not requiring an exact match on the level of teacher experience. Can you discuss why you think this is acceptable?

Since we are no longer conducting an impact study requiring a match of TRP and non-TRP teachers, this issue is no longer relevant for our design.

  3. Why do you plan to collect implementation/applicant data from only a subset of grantees? Is there a way to use data the Department is already collecting (for example, through annual performance reports) as a data source?

The study is structured to sequentially narrow the sample to smaller groups of greater analytical interest. We first survey all programs, in part to provide context for the study and also to assess how comparable the programs in the outcomes study are to other TRPs. This exercise will allow us to provide a rough assessment of the generalizability of the outcomes estimates. Once we have collected basic information from all TRPs, we will narrow the pool to focus on those that are most likely to be included in the outcomes analysis. As we narrow the pool, we seek to minimize response burden and conserve project resources by collecting more detailed information only from programs still likely to be included in the analysis. For example, although we survey all programs, we survey residents and mentors only from a subset of 15 programs under consideration for the outcomes study. The program director interviews will also be conducted only with this group. An even smaller subset of 8 programs will be included in the outcomes study itself; for these programs, in addition to the data collection activities described above, we will also conduct the teacher of record survey, collect student administrative data, collect teacher employment verification data, and conduct the teacher mobility survey.

While we expect the outcomes study to include only 8 programs, we believe it will be useful to collect more detailed information from 15 programs (including the 8 in the outcomes study) for several reasons. Focusing on 15 programs allows us to provide descriptive information on the characteristics of programs and their participants for a broader set of TRPs than just those included in the outcomes study. It also helps us put the results of the outcomes study into a somewhat broader context, by comparing the program and participant characteristics of the 8 outcomes-study programs with those of the larger set. Finally, there are important practical and logistical reasons for the tiered approach to data collection. While we want to collect detailed information on residents and mentors from the programs that ultimately end up in the outcomes study, we will not be able to identify these 8 programs with certainty at the time the resident and mentor surveys are conducted. At that time, we will be able to identify promising candidates for the outcomes study, but we will not be able to verify that the administrative data available in each district will permit the value-added analysis we wish to perform, and we may not be able to confirm the district's willingness to cooperate with the research effort. By collecting data from the broader set of 15 programs, we increase the chances that the data will be available for the programs that ultimately participate in the outcomes study.

The level of detail and quality of the data in the annual performance reports is likely to vary significantly across grantees, and the information may not be timely enough for this study's needs. A systematic data collection conducted as part of the study is therefore preferable.

  4. Why will the director, teacher, and mentor interviews/surveys only be completed for the impact sample?

As explained above, the program survey will be conducted with all programs; the director interview, mentor survey, and resident survey (and applicant data collection) will be conducted with the subset of programs considered potentially most appropriate for the outcomes study (not the impact study, since we are no longer doing an impact analysis); and the teacher of record survey will be conducted with the programs actually selected for the outcomes study. The outcomes study will include roughly half of the programs given careful consideration for it, and those programs in turn represent roughly half of all programs. We felt this data collection structure was the most efficient way to collect common data for a broad set of programs while collecting more detailed data for the programs in the outcomes study, to help us interpret the outcomes results.

  5. For the second year of data collection, the 2012-2013 school year, there will be no new recruitment of TRP teachers. Is it correct that the TRP teachers recruited during the 2011-2012 school year will be the only treatment teachers throughout the entirety of the experiment, while control teachers may change? What is NCEE's estimate of TRP teacher attrition after year one? How does the power analysis take this anticipated attrition into account? Will the sample sizes in each year be enough to generate experimental estimates of effects on achievement by year?

Since we are no longer conducting an impact study, the issue raised in this question is no longer relevant.

  6. Will you control for funding sources of TRPs in your model (TQP grantee vs. non-grantee)?

Since we are no longer conducting an impact study, the issue raised in this question is no longer relevant.

  7. What percentage of districts do you expect will require active consent for student data? Is this a problem that you expect may cause significant sample attrition if most districts that you recruit require active consent?

Under the revised design, our outcomes analysis will not require identifiable student records; instead, we will collect de-identified student records data. Thus, we do not anticipate needing parental consent in any of the districts.

  8. Do you propose to offer any incentives to the district, school, and/or teachers to persuade them to participate in the study? Will any of this information be shared during the recruitment phase? If so, please provide your incentive plan now.

We propose offering a $25 gift card to resident teachers and a $20 gift card to mentor teachers who complete the survey. We propose offering teachers of record a $25 gift card for the teacher of record survey and a $20 gift card for the mobility survey. The size of the incentive payments is based on guidelines we have used before: $1 per minute of expected burden. Since we are asking teachers of record to provide information multiple times, we feel it is necessary to offer an incentive slightly higher than the $1 per minute estimate to achieve the desired response rate on the mobility survey. These amounts are consistent with incentives approved for similar teacher surveys in ED impact evaluations. For example, in the OMB-approved Evaluation of the Impact of Teacher Induction Programs study (OMB Control Number 1850-0802), teachers received $30 for a baseline survey that averaged about 30 minutes to complete. We believe the gift cards are an efficient way to obtain response rates of at least 85 percent from resident teachers and teachers of record, and 90 percent from mentor teachers.

The study will not give incentives to TRPs for completing the interview and survey, or to districts for providing student administrative records and teacher employment data.

During recruitment, we will inform TRPs and districts that we have proposed offering these incentives. However, we will not promise any incentives prior to receiving clearance from OMB.

  9. What will recruiters tell districts and schools about optional assessments and the associated commitments that may be required? Will this information be included in the MOUs?

The optional assessment is no longer part of the design, so it will not be mentioned.

  10. Has ED figured out yet what recruiters will say to potential non-grantee TRP programs about providing a program-level statistic as an incentive to participating?

Recruiters will tell non-grantee TRPs under consideration for the study (and grantees, for that matter) that the study team will be able to share program-level aggregated data after the report is released, so long as doing so does not compromise the confidentiality of the data sources.

  11. Would it be possible to estimate impacts on retention rates for a third year?

If this question is about determining whether study teachers returned to their schools/districts in fall 2014, that is not part of the current plan, and no resources are budgeted for it. However, because the initial sample will include teachers in their first or second year of teaching, data on teacher employment in fall 2013 will tell us the retention rates for these groups in their third and fourth years of teaching, respectively.

  12. We would like to see a report describing the results of your recruitment activities along with the recruitment data prior to or in conjunction with the next study submission to OMB.

ED will be happy to share with OMB the results of this study’s recruitment efforts, as soon as the information is available.

  13. Please cite the Education Sciences Reform Act confidentiality section in A10, consistent with what is in the letters.

The text below was added to A10 (and will be included in all requests for data):

The contractor follows the confidentiality and data protection requirements of IES (The Education Sciences Reform Act of 2002, Title I, Part E, Section 183). The contractor will protect the confidentiality of all information collected for the study and will use it for research purposes only. The reports prepared for the study will summarize findings across the sample and will not associate responses with a specific program, district, school, or individual. We will not provide information that identifies any study participant to anyone outside the study team, except as required by law.

  14. There seems to be a missing sentence in the TRP grantee letter indicating the purpose of the letter, something like "I am contacting you to let you know that we are launching an evaluation of…."

The following sentence was added to the TRP grantee letter: "I am writing to inform you that we are in the initial stages of the study."

  15. Education is spelled wrong in the TRP school letter.

The package no longer contains the school letter because the new study design no longer requires the recruitment of schools for an impact evaluation.

www.mathematica-mpr.com


Improving public well-being by conducting high-quality, objective research and surveys

Princeton, NJ; Ann Arbor, MI; Cambridge, MA; Chicago, IL; Oakland, CA; Washington, DC


Mathematica® is a registered trademark of Mathematica Policy Research


