Evaluation of the Quality Teaching for English Learners (QTEL) Program
OMB: 1850-0842


MEMORANDUM



To: Amy Feldman
Ray Valdivieso

From: Neal Finkelstein, Thomas Hanson

CC: Gary Estes
Nikola Filby
Hans Bos
Raquel Sanchez

Date: October 19, 2007

Re: 200707-1850-005: QTEL questions


Below are our responses to written comments from OMB that we received from IES on October 10, 2007. We would be happy to discuss these responses with you by telephone conference if that would be helpful.


  1. What were the results of the 2005 study in New York City?


The New York City field study examined the short-term effects of QTEL professional development on teacher practice, knowledge, and attitudes. The study provided important insights into how instructional capacity for working effectively with English language learners might be developed through high-quality professional development. However, the short duration of the study limits generalization of the results. Treatment group teachers attended two to three days of professional development workshops in January 2005 and received two days of on-site coaching support during the spring semester. Follow-up survey data and classroom observation data were collected in May 2005.


The NYC QTEL intervention activities appear to have had substantial beneficial impacts on teachers’ pedagogical content knowledge, but relatively few measurable impacts on instructional practices and teacher attitudes. Again, the short duration of the intervention may have limited impacts on these dimensions. Table 1 presents adjusted posttest means by experimental condition, treatment/control group differences, and associated p-values for all of the teacher outcome variables. All of these estimates come from random effects linear regression models that adjust for covariates measured prior to the intervention. The treatment/control group differences in posttest outcomes (the “Difference” column) capture the intervention effects.




Table 1. Adjusted Treatment/Control Group Differences in Posttest Outcomes
(Intent-to-Treat Impact Estimates)

Outcome                                        Treatment   Control   Difference   p-value   S.D.
Pedagogical Content Quiz (% Correct)              65.6       49.7      15.9**       <.01     20.1

Observation Outcomes
  Average Observational Rating                    2.58       2.49        .09         .65      .80
  Academic Rigor–disciplinary knowledge           2.70       2.71       -.01         .96      .86
  Academic Rigor–higher order thinking            2.75       2.64        .11         .64      .94
  High Support                                    2.74       2.44        .30         .14      .98
  High Expectations                               2.42       2.33        .09         .76     1.11
  Lang. Focus–metalinguistic knowledge            2.31       2.23        .08         .73     1.00
  Lang. Focus–academic language practice          2.46       2.56       -.10         .60      .91
  Quality Interactions–sustained recip. talk      2.59       2.37        .22         .34      .98

Survey Outcomes
  Instructional practices consistent w/ QTEL      2.70       2.75       -.05         .47      .33
  Accommodated EL instruction practices           2.44       2.46       -.02         .83      .47
  Student-centered practices                      2.70       2.66        .04         .54      .28
  Teacher-directed student practices              1.83       2.02       -.19#        .06      .41
  Teaching Emphases                               3.66       3.65        .01         .90      .33
  Review of basic facts                           3.55       3.59       -.04         .72      .57
  Simplification of communication                 3.33       3.69       -.36**       <.01     .66
  Challenging students beyond lang. prof.         2.47       2.35        .12         .49      .92
  Student awareness of expectations               2.29       2.26        .03         .82      .66
  Students work hard in classes                   3.20       3.09        .11         .55      .76
  Barriers to Teaching                            2.26       2.29       -.03         .82      .66
  Teachers can reach difficult students           3.07       2.82        .24         .29      .80
  Teachers can ensure high achievement            3.34       3.31        .03         .79      .50
  Performance assessed with high standards        2.17       2.19       -.02         .95      .90
  All students capable of learning material       3.19       2.97        .22         .23      .71
  ELL students make significant progress          3.30       3.26        .04         .74      .62
  ELL improvement in literacy                     3.66       3.63        .03         .84      .83
  Enjoyment of teaching                           3.69       3.69        .01         .97     1.17







Notes: Analytic sample consists of a maximum of 171 teachers in 38 schools with posttest data. All results come from random intercept regression models that control for teacher licensure, education, years of teaching experience, serving as a “push-in” teacher, prior EL professional development experiences, subjects taught (ELA and ESL), race/ethnicity (African American, Asian, Latino, White, Multi-ethnic), region, and matching group.

** p < .01 * p < .05 # p < .10


Table 1 shows that QTEL program participation was associated with a 16 percentage point increase in scores on the pedagogical content knowledge quiz, which represents an effect size of approximately .79 standard deviation units (15.9 / 20.1). Program participation was also associated with self-reported declines in teacher-directed student practices (p = .06) and in simplification of communication (p < .01), findings that are consistent with the QTEL model. However, no program impacts were detected for the observational outcomes or for 16 of the 19 constructs assessed by the surveys. It is unclear whether the absence of such impacts is due to the low reliability of the observational instrument, the short duration of the intervention, low levels of implementation fidelity, or a simple failure of the intervention to produce beneficial impacts.
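
For concreteness, the following is a minimal illustrative sketch, not the study’s actual analysis code, of how an adjusted treatment/control difference and its effect size could be estimated with a random-intercept model in Python (statsmodels). The file name, variable names, and covariates shown (teacher_posttest.csv, quiz_pct, treatment, school_id, and so on) are hypothetical placeholders.

# Illustrative sketch only (not the study's analysis code). Assumes a
# teacher-level file with hypothetical columns: quiz_pct (posttest % correct),
# treatment (1 = QTEL school, 0 = control), school_id, and baseline covariates.
import pandas as pd
import statsmodels.formula.api as smf

teachers = pd.read_csv("teacher_posttest.csv")  # hypothetical analysis file

# Random-intercept model: posttest outcome regressed on treatment status and
# baseline covariates, with a school random intercept to account for the
# clustering of teachers within schools. Additional covariates would be added
# to the formula in the same way.
model = smf.mixedlm(
    "quiz_pct ~ treatment + licensure + years_experience + prior_el_pd",
    data=teachers,
    groups="school_id",
)
result = model.fit()

itt_impact = result.params["treatment"]  # adjusted treatment/control difference
# Effect size: adjusted difference divided by the outcome standard deviation
# (for the quiz, Table 1 reports a difference of 15.9 and an S.D. of 20.1,
# so 15.9 / 20.1 is roughly .79 S.D.).
effect_size = itt_impact / teachers["quiz_pct"].std()
print(f"ITT impact: {itt_impact:.1f} points; effect size: {effect_size:.2f} SD")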


  2. To what extent is the QTEL program currently being implemented in the Western region?  Has this intervention been implemented in any other locations outside the Western region?


In California, the QTEL program has been implemented in Chula Vista and in the Eastside Union High School District in San Jose (2005 to the present). A variation of the intervention currently under study has been implemented in New York City since 2004; as discussed above, a pilot study of QTEL was conducted in NYC middle schools. The QTEL program is also part of a three-year project in the Austin Independent School District (Texas) that began in July 2007.


  3. Currently, if the program is implemented in a district, are all schools required to participate?  Are teachers required to participate, or is their participation voluntary?


Each district volunteered a specific set of schools to participate in the study. Schools were then randomized into treatment and control groups within each district. All ELA and ESL teachers in the treatment schools are invited to participate in the QTEL professional development program. Teacher participation in evaluation activities is completely voluntary; teachers can opt out of participation at any time.
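
As a purely illustrative sketch of the assignment procedure, not the actual code used for the study, within-district random assignment of volunteered schools could look like the following. The district and school names, the even treatment/control split, and the seed value are hypothetical.

# Illustrative sketch of within-district random assignment (hypothetical names).
import random

schools_by_district = {
    "District A": ["School 1", "School 2", "School 3", "School 4"],
    "District B": ["School 5", "School 6", "School 7", "School 8"],
}

random.seed(20071019)  # fixed seed so the assignment can be reproduced

assignments = {}
for district, schools in schools_by_district.items():
    shuffled = random.sample(schools, k=len(schools))  # random order within district
    half = len(shuffled) // 2
    for school in shuffled[:half]:
        assignments[school] = "treatment"
    for school in shuffled[half:]:
        assignments[school] = "control"

for school, group in sorted(assignments.items()):
    print(f"{school}: {group}")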


  4. Is any evaluation planned or ongoing of the San Jose implementation?


The James Irvine Foundation has funded a two-year implementation of QTEL in Eastside Union High School District schools, with a possible extension to additional years. QTEL’s work there includes four comprehensive high schools and a charter high school. There is no current commitment for an external evaluation of the project. Program staff will be exploring other possible funding sources to support an evaluation, but at this time only implementation data are being collected: participation in professional development activities, satisfaction with PD events, and coaching records that document changes in teacher implementation.

  5. Why is this study's duration 3 years?  For what duration will the professional development and coaching occur?


The professional development and coaching occur over a three-year period; the study’s duration is five years. The professional development and coaching are staggered so that, roughly, 6th and 7th grade teachers receive PD/coaching during the first half of the intervention period and 8th grade teachers receive PD/coaching during the second half. This maximizes the exposure of the first cohort of 6th grade students to the intervention over the three-year implementation period. Data collection takes place during the three years of program implementation. QTEL is a long-term intervention aimed at equipping middle school ELA and ESL teachers to provide challenging tasks and to scaffold ELL student learning in order to advance the development of academic English fluency. The appropriate research design for evaluating such a program is a long-term research study.


  6. How will you obtain student outcome data?


As stated in the “Sample District MOU” document, school districts will provide student outcome data from electronic databases for all the students in the participating schools from each district. There is no burden to students or parents.


    a. Is it for all students at the schools in the study or just the ELL students?  If the former, why?


We will obtain data for all of the students in the participating schools. We request data for all students, rather than for ELL students only, (1) to examine intervention impacts on the performance of students in the general population and (2) to examine differential impacts on ELL and non-ELL students.


    b. Will this be individual-level data or aggregate?  If the former, what consent/assent procedures do you have planned and what is the associated burden on students and parents?


We plan to obtain individual-level data. We have obtained IRB approval for passive consent from parents because student personal information will be replaced with numeric identifiers by the district before the data are submitted to BPA. There is no burden to students or parents.
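
As a minimal sketch of the kind of de-identification step described above (the file and column names are hypothetical placeholders, and the actual procedures are determined by each district):

# Illustrative sketch only: a district replaces student personal information
# with numeric identifiers before the file is transmitted to BPA.
# File and column names are hypothetical.
import pandas as pd

records = pd.read_csv("district_student_records.csv")

# Assign an arbitrary numeric study ID and drop direct identifiers; the
# crosswalk between study ID and student identity never leaves the district.
records["study_id"] = range(1, len(records) + 1)
deidentified = records.drop(columns=["student_name", "date_of_birth", "address"])

deidentified.to_csv("deidentified_for_bpa.csv", index=False)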


    c. What is the number of students for whom you will analyze data? A couple of places cite 50,000, of which 12,500 are ELL, and one place cites 16,000.


Because the number of students varies from year to year, as does the percentage who are ELLs, we cannot estimate the total number of students with much precision. However, at this point we expect to collect standardized test score data for approximately 50,000 students over the course of the study. Of these, we expect that approximately 12,500, or 25 percent, will be ELLs.


  7. Please cite the ESRA statute when providing assurances of confidentiality.


We thought that the ESRA statute was cited in section 10 of Supporting Statement A. Will you please clarify so that we can make corrections? The ESRA statute has been added to the consent form (see attached).


  8. Given the level of confidentiality you are promising, it is not advisable to send both the user ID and password to teachers in the same email.


Agreed – thank you. We will send the user ID and the password to teachers in separate e-mail messages.


  9. What is the cost of the intervention package?


The total cost of the intervention package over the entire five-year period of the project is approximately $3.2 million.


  10. Given that REL West also developed the intervention, what are they doing to ensure there is no bias in their evaluation of it?


REL West has contracted with Berkeley Policy Associates (BPA) to conduct the impact evaluation of QTEL. BPA is a nationally respected education research firm with substantial experience conducting randomized trials in education settings. REL West staff are not involved in any data collection, measurement, or analysis work. The WestEd QTEL developers and BPA have developed a cooperative working relationship to facilitate contact with the schools and teachers participating in the study. The Co-Directors for Research at REL West have been involved from the outset of the study design in affirming and emphasizing the clear distinction between the responsibilities of the QTEL program staff and the independent research responsibilities of the BPA team.


  11. Was there a description of the QTEL intervention somewhere (other than the few sentences on page 2 of Part A)?  If not, please provide a detailed description of the intervention.


The professional development is intended to equip middle school ELA and ESL teachers to provide challenging tasks and scaffold student learning to advance the development of academic English fluency. Participating teachers in treatment schools attend seven full-day professional development sessions to build the understanding and pedagogical knowledge needed to implement new tools and processes for the academic and linguistic development of adolescent ELLs. The intervention was developed and will be delivered by WestEd’s Dr. Walqui and her team. It consists of three components (see Table 2): professional development institutes, individualized coaching, and collaborative implementation support. The professional development sessions are offered to 6th and 7th grade teachers in summer 2007, to 7th and 8th grade teachers in summer 2008, and to 8th grade teachers in summer 2009. ELA and ESL teachers in treatment schools also engage in individual coaching cycles, receiving assistance with developing academically and linguistically rigorous lessons that implement the principles, tools, and processes of QTEL. Each coaching cycle consists of a one-on-one lesson planning meeting, observation of the lesson’s implementation, and a debriefing. Participating teachers receive four to six coaching cycles each year. Coaching is staggered so that 6th grade teachers receive coaching in 2007/08, 7th grade teachers in 2008/09, and 8th grade teachers in 2009/10. This maximizes the exposure of the first cohort of 6th grade students to the intervention.


Table 2: Key Intervention Components by Year

Professional Development Institutes
  • 2007/08: All 6th and 7th grade ELA and ESL teachers, plus school site and district administrators; 4 days in June and 3 days in August; led by QTEL professional developers
  • 2008/09: All 7th and 8th grade ELA and ESL teachers, plus school site and district administrators; 4 days in June and 3 days in August; led by QTEL professional developers
  • 2009/10: All 8th grade ELA and ESL teachers, plus school site and district administrators; 4 days in June and 3 days in August; led by QTEL professional developers

Coaching and In-Classroom Support
  • 2007/08: 6th grade participants; four to six individualized cycles per teacher
  • 2008/09: 7th grade participants; four to six individualized cycles per teacher
  • 2009/10: 8th grade participants; four to six individualized cycles per teacher

Collaborative Implementation Support
  • 2007/08, 2008/09, and 2009/10: Four to six after-school study sessions each year for all ELA and ESL teachers



  12. Are there any concerns about the treatment and control schools being from the same district?


We are not concerned that teachers in the two research groups are in the same school district. We have agreements with the districts to minimize teacher movement between schools and other possible crossover situations. The benefits of within-district assignment outweigh the costs: a district is much more likely to cooperate fully with a study if some of its schools receive the program than if it is excluded from the program altogether. Within-district assignment also greatly reduces data collection cost and burden, since most data are maintained centrally. Lastly, random assignment within each district automatically equalizes many important contextual factors across the research groups.


  13. How will the evaluation address teachers who participate in professional development but do not change their teaching practices or attitudes?


We are very interested in including teachers who fail to change their teaching practices in the study. Only with these teachers fully included can we estimate the true impact of QTEL. We do not propose to treat these teachers or their students differently in the study, except perhaps in supplemental non-experimental "dose-response" analyses of student outcomes, which would be secondary to the overall impact analysis.


  14. Incentives: Please clarify what incentives IES proposes to give to the various groups in the study.

    a. $30 for a 20-30 minute survey seems excessive. Please provide a rationale (and please note that incentives are not meant to compensate participants for their time according to their average hourly rate).


The $30 incentive for completing the teacher knowledge test is designed to ensure high response rates from the teachers involved in the study. The teacher knowledge test, which has a high level of difficulty, takes about 60 minutes to complete. Past work has demonstrated that teachers need to view the incentive as fair in exchange for agreeing to take the time to respond thoughtfully.


    b. The incentives in the supporting statement do not seem to match up with the incentives listed in the consent forms. Please clarify.


Our reading indicates that the incentive structure on the supporting statement and that described on the teacher consent form are consistent. We have attached the teacher consent form in case there is a discrepancy in the documents.


    c. Since teachers will be observed during the normal course of their teaching duties, an additional incentive amount for classroom observation does not seem warranted. Also, the supporting statement on page 7 says the amount is $50, while the table on page 9 says $30. Please clarify.


The rationale for the $50 is that teachers assist with the videotaping over a 3-day/5-day period. The compensation is a sign of appreciation for the respondents’ time, commensurate with the value of that time. We believe it is essential to the success of this data collection effort to provide a sufficient rate for videotaping classroom practices.


Note, also, that the $30 listed on page 9 is the estimated hourly rate for teachers that we used for calculating the cost burden. This is not an incentive.


    d. The table on page 9 of supporting statement part A says that $50 will be provided for student archived data collection. Please clarify who the recipients of this $50 incentive are and provide a justification for the $50 amount.


The $50 listed for providing archived data is the estimated hourly rate for calculating the cost burden. This is not an incentive – there will be no recipient.


  15. The focus groups do not appear to be listed on Exhibit 4. Please clarify when these will be taking place.


Focus groups will take place in March-April of each program year (2007-08, 2008-09, and 2009-10).


  16. Previous IES studies have demonstrated impacts that are significant but small. Are there plans to use the data to see which schools/teachers attained particularly good outcomes and particularly poor outcomes, and then follow up with qualitative data techniques to figure out what the performing schools were doing differently to attain the good outcomes and what the non-performing schools were doing to attain the poor outcomes? Or to follow up on results obtained in quantitative analyses that don't quite make sense or were unexpected?


The primary analysis focuses on estimating intent-to-treat impacts for the entire sample. The sample size is not adequate for estimating separate impacts for participating districts. As described in Part A of the Supporting Statement, we do plan to conduct subgroup analyses to expand what can be learned from the study. For example, by dividing the sample of teachers by the amount of prior teaching experience they have, we can present separate estimates of the program effect for more experienced and less experienced teachers. However, because the design does not involve random assignment of schools or teachers to different types of implementation practices, particularly practices observed after random assignment, we believe that focusing on schools or teachers that achieved particularly good or bad outcomes will not provide much analytical leverage. Such analyses would be purely descriptive and could lead to false inferences regarding the effectiveness of QTEL.
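
For illustration only, a subgroup analysis of the kind described above could be specified as in the sketch below; this is not the planned analysis code, and the variable names and the five-year experience cutoff are hypothetical placeholders.

# Illustrative sketch of a subgroup analysis: interact the treatment indicator
# with a prior-experience indicator to obtain separate impact estimates for
# less and more experienced teachers. Variable names and the cutoff are
# hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

teachers = pd.read_csv("teacher_posttest.csv")
teachers["experienced"] = (teachers["years_experience"] >= 5).astype(int)

model = smf.mixedlm(
    "quiz_pct ~ treatment * experienced + licensure",
    data=teachers,
    groups="school_id",
)
result = model.fit()

impact_less_experienced = result.params["treatment"]
impact_more_experienced = (
    result.params["treatment"] + result.params["treatment:experienced"]
)
print(f"Estimated impact, less experienced teachers: {impact_less_experienced:.2f}")
print(f"Estimated impact, more experienced teachers: {impact_more_experienced:.2f}")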


  17. Other than administrative ease, why is the MOU with the District rather than the schools?  What are the principals' or other school officials' roles in this study?  In other school-wide RCTs, we have seen more emphasis on "selling" the study and recruiting at the school level since they are more able to reach out directly to facilitate or encourage teachers to participate in all aspects of the study.  Relatedly, other studies have used MOUs to emphasize the benefits of the study in concert with the requirements, so it appears more stand-alone and balanced. Would these strategies be useful to QTEL?


We rely on district MOUs because the agreements primarily concern district level data requirements. We did not think an MOU was the best way to encourage buy-in at the school level. Instead we will contact schools directly to schedule data collection activities and call on the district for support as needed.


  18. Some of the questions on the teacher surveys (both control and treatment) seem inflammatory, especially question 19. Can ED clarify how these questions will be used and whether they have been validated? Is there also some reason why ED is asking so many of these "attitudinal" questions?


These questions are designed to measure teachers’ attitudes toward ELLs. They are included because the intervention attempts to change teachers’ attitudes toward working with ELLs. In addition, teachers’ attitudes toward ELLs may predict outcomes at the teacher level (i.e., teachers who are more positive about working with ELLs may be more motivated to learn and improve their practice than those with less positive attitudes). Some of the items were used in prior research on QTEL, and some come from other studies of teacher attitudes toward ELLs. This combination of items has not been previously validated. Since these items do not relate directly to our research questions, we are willing to cut them if necessary.
