Justification Memo

Evaluation of Secondary Math Teachers from Two Highly Selective Routes to Alternative Certification

OMB: 1850-0866

MEMORANDUM




600 Maryland Ave., SW, Suite 550

Washington, DC 20024-2512

Telephone (202) 484-9220

Fax (202) 863-1763

www.mathematica-mpr.com



TO: Stefanie Schmidt


FROM: Sheena McConnell, Kathy Sonnenfeld, Nancy Duda, and Daniel Player

DATE: 6/16/2010

HSAC-373

SUBJECT: Report on the Incentive Experiment for An Evaluation of Secondary Math Teachers from Two Highly Selective Routes to Alternative Certification (OMB # 1850-0866)


In August 2009, OMB approved the data collection for An Evaluation of Secondary Math Teachers from Two Highly Selective Routes to Alternative Certification (OMB# 1850-0866). To learn more about the use of incentives, OMB requested that we conduct an experiment on the effectiveness of providing incentives for the return of parent consent forms in school districts that require active parental consent for collecting data on students. This memo documents the findings of that experiment to date.


The Study

The purpose of the overall study is to estimate the impact on secondary student math achievement of teachers who obtain certification via highly selective alternative routes to certification (HSAC) compared with teachers who obtain certification through traditional or less selective alternative certification routes. The study involved identifying math classes of the same subject (such as algebra) and level (such as honors) that occurred at the same time in the same school under the same general conditions, with one class taught by an HSAC teacher and another class taught by a non-HSAC teacher. For each of these “matched classes,” students were randomly assigned to a treatment class taught by an HSAC teacher or a control class taught by a non-HSAC teacher. The impact of the HSAC teachers will be estimated by the difference in math achievement of students in the treatment and control groups averaged across all the matches. Results on district math tests will be used to measure the math achievement of middle school students. An evaluator-administered math achievement test will be used to measure the math achievement of high school students.
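As a rough sketch (not the study's actual estimation code), the matched-class design implies an estimator along the following lines. The function name, the input layout, and the simple unweighted average across matches are our assumptions; the actual analysis may weight matches or adjust for covariates.

```python
# Illustrative sketch of the matched-class impact estimator described above.
# Names and the unweighted average across matches are assumptions.

def estimate_hsac_impact(matches):
    """Average the treatment-control gap in mean scores across matched classes.

    `matches` is a list of (treatment_scores, control_scores) pairs,
    one pair per matched class.
    """
    gaps = []
    for treatment_scores, control_scores in matches:
        gap = (sum(treatment_scores) / len(treatment_scores)
               - sum(control_scores) / len(control_scores))
        gaps.append(gap)
    # Impact estimate: mean within-match difference in math achievement
    return sum(gaps) / len(gaps)

# Two hypothetical matched classes
matches = [([72, 68, 80], [70, 65, 77]), ([55, 60], [58, 52])]
print(estimate_hsac_impact(matches))
```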


Four active consent school districts in the study were included in the experiment.1 By active consent, we mean that the parent/guardian must notify the evaluator that they consent to the data collection. About half of the students in our research sample require active consent.

Achieving a high rate of consent for the study is important. The rate of return of signed consent forms sets the upper limit for the overall response rate—further attrition may occur if students do not take the math assessment—so a high consent return rate is critical to achieving a high response rate for the study overall. A low rate of returned consent forms would reduce the sample size and compromise our ability to produce precise impact estimates.
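To make the precision concern concrete, a back-of-the-envelope sketch follows; all the numbers in it are hypothetical, and it is not the study's power analysis. The standard error of a treatment-control difference shrinks with the square root of the group size, so every unreturned consent form inflates it.

```python
import math

# Hypothetical illustration: the standard error of a treatment-control
# difference in mean scores grows as the consent rate, and hence the
# analyzable sample, shrinks.

def se_of_difference(n_per_group, sd=1.0):
    """Standard error of the difference between two equal-sized group means."""
    return sd * math.sqrt(2.0 / n_per_group)

full_sample = 1000  # hypothetical students per group before consent losses
for consent_rate in (1.00, 0.75, 0.50):
    n = int(full_sample * consent_rate)
    print(f"consent rate {consent_rate:.0%}: n = {n}, SE = {se_of_difference(n):.4f}")
```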


The Incentive Experiment

The purpose of the incentive experiment was to test whether offering incentives for the return of consent forms would lead to higher response rates. Schools were randomly assigned to one of three groups:



  1. Treatment 1: Individual students were offered a gift card worth $5 if they returned a signed consent form, irrespective of whether the parent/guardian had provided consent. In addition, if 95 percent or more of a class’s consent forms were returned (again, irrespective of whether consent was granted), the class teacher received $25 to be used for purchases that benefit the class, such as school supplies or books.

  2. Treatment 2: Individual students were offered a gift card worth $5 if they returned a signed consent form, irrespective of whether the parent/guardian provided consent. The class was not offered an incentive.

  3. Control: Neither students nor classes were offered incentives for the return of consent forms.

By design, the 31 schools in the active consent districts were randomly divided as evenly as possible among the three research groups (see Table 1). Middle and high schools were fairly evenly distributed across the groups, with each group having three to four middle schools and six or seven high schools. By chance, the number of classrooms differed—the Treatment 2 schools had fewer classrooms on average than the Treatment 1 or Control schools. The number of students was also lowest in Treatment 2, because the number of classrooms was lower and there were fewer students on average in each class.




Table 1. Number of Schools, Classrooms, and Students in the Incentive Experiment

                      Treatment 1        Treatment 2       Control
                      ($5 to student,    ($5 to student)   (No Incentives)    Total
                      $25 to class)
Schools                    11                 10                 10              31
Classrooms                 48                 35                 53             136
Students                1,272                799              1,317           3,388



In the fall of 2009, school liaisons in the active consent districts were sent consent forms for each student in a study classroom. Teachers were asked to distribute the consent forms to students and to encourage them to ask their parents/guardians to sign and return the consent forms to the teacher. The consent forms were accompanied by a letter (both the consent forms and the letters were in English and Spanish) that described the main study (not the consent experiment), assured confidentiality, asked for the form to be returned, and described the incentives (when applicable). The forms were collected by the teacher and sent back to the evaluator in a postage-paid package. We tracked the amount of time between when consent forms were first distributed to schools and when they were returned.


The outcome of the experiment depends both on the differing incentives and on the efforts made by data collection staff to increase response rates, for example, by calling the school. Efforts were not necessarily equal across incentive groups. Because the data collectors' goal was to achieve a high overall consent rate, they put more effort into obtaining consent forms from schools with initially low rates of return. Therefore, we tracked the activities we conducted for each school. These activities fall into four main groups:


  • Sending Additional Packages of Consent Forms. About four weeks after the original mailing, we sent another package of consent forms to the school to be distributed to students again, because some teachers, students, or parents may have mislaid the consent forms sent earlier. Additional packages of consent forms were sent to schools in which the rate of consent was low. On average, we sent 3.7 packages of consent forms to each school, including the original mailing.

  • Contacting Schools. Telephone calls were made to the study liaison and, in some cases, to teachers to remind them to return consent forms. Emails were also sent. On average, 8.2 contacts (by telephone or email) were made with the schools in the experiment.

  • Visiting Schools. Data collection staff in the school districts visited schools to explore the reasons for low return rates and to pick up signed consent forms. In some schools, data collectors were able to attend parent/teacher conferences or other school events that parents attended, to prompt or collect responses. On average, schools were visited 1.9 times.

  • Seeking Verbal Rather than Written Response. Some schools provided us with the telephone numbers of parents. For these schools, we were able to call the parents directly and ask for a response over the telephone. In a few other schools, a teacher or parent would make the calls with a data collection staff member present. This meant that the telephone numbers did not have to be given out; parents were also more likely to respond to a trusted member of their community. Of the 2,291 consents received, 273 (12 percent) were obtained verbally.


The time spent on these activities varied with the number of schools, classrooms, and students. We estimated that a contact with a school took on average 0.25 hours, and that a visit took 2.5 hours plus an additional 0.08 hours per classroom. We estimated that each attempt to obtain a verbal consent took an average of 0.08 hours per student. The time spent sending a package of consent forms to each school also varied by incentive group. Each package took only about 0.15 hours to compile and send to a Control school, but it took longer (0.5 hours per school) to send a package to a Treatment 1 or 2 school because the package included school-specific gift cards and instructions on how to use them.
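For illustration, these per-activity time estimates can be combined into a simple per-school effort calculation. The function and the example school below are hypothetical; only the hour figures come from the estimates above.

```python
# Sketch combining the per-activity time estimates quoted above.
# The hour constants come from the memo; everything else is hypothetical.

HOURS_PER_CONTACT = 0.25           # telephone call or email to a school
HOURS_PER_VISIT = 2.5              # base time for a school visit...
HOURS_PER_VISIT_CLASSROOM = 0.08   # ...plus this much per classroom
HOURS_PER_VERBAL_ATTEMPT = 0.08    # per student, per verbal consent attempt
HOURS_PER_PACKAGE = {"control": 0.15, "treatment": 0.5}  # mailing one package

def school_effort_hours(group, packages, contacts, visits, classrooms, verbal_attempts):
    """Estimated staff hours spent on consent follow-up for one school."""
    return (packages * HOURS_PER_PACKAGE[group]
            + contacts * HOURS_PER_CONTACT
            + visits * (HOURS_PER_VISIT + HOURS_PER_VISIT_CLASSROOM * classrooms)
            + verbal_attempts * HOURS_PER_VERBAL_ATTEMPT)

# Hypothetical treatment school: 4 packages, 8 contacts, 2 visits,
# 5 classrooms, and 60 verbal consent attempts
print(school_effort_hours("treatment", 4, 8, 2, 5, 60))  # -> 14.6 hours
```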


The extent of our efforts, by experimental condition, is described in the findings section, below. We did not track the efforts that individual teachers or other school staff might have made to increase the response rates of their students.



Findings of the Incentive Experiment

The key finding is that the incentives that we offered did not affect the rate at which parents returned the consent form or gave consent (Table 2). Overall, we found that 76 percent of all parents returned the consent form. The percentage of parents who returned a consent form was highest under Treatment 2 (81 percent) and lowest under Treatment 1 (73 percent), but none of the differences between the three groups were statistically significant. Only 4 of the 48 classrooms under Treatment 1 received the $25 as a reward for having 95 percent or more of parents returning their consent forms.



Table 2. Parent Responses to the Active Consent Request

                              Treatment 1          Treatment 2       Control
                              ($5 to student,      ($5 to student)   (No Incentives)   Total
                              conditional $25
                              to class)
Number responding                  932                  645               987          2,564
Percent responding                  73%                  81%               75%            76%
Number consenting                  824                  583               884          2,291
Percent consenting                  65%                  73%               67%            68%
Number refusing                    108                   62               103            273
Percent refusing                     8%                   8%                8%             8%
Number not responding              340                  154               330            824
Percent not responding              27%                  19%               25%            24%
Average time to response            65                   64                61             63
(calendar days)

Note: No statistically significant differences were found between Treatment 1 and Control, between Treatment 2 and Control, or between Treatment 1 and Treatment 2 at the p < 0.05 level.
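As an illustration of the comparisons summarized in the note above, the Treatment 1 versus Control difference in return rates can be checked with a simple two-proportion z-test. This sketch is our approximation only; the study's actual tests may have accounted for the clustering of students within schools and classrooms.

```python
# Illustrative two-proportion z-test of Treatment 1 vs. Control response
# rates from Table 2; it ignores the clustering of students within schools.
from statsmodels.stats.proportion import proportions_ztest

returned = [932, 987]    # consent forms returned: Treatment 1, Control (Table 2)
students = [1272, 1317]  # students in each group (Table 1)

z_stat, p_value = proportions_ztest(returned, students)
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")  # p is well above 0.05
```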



The incentives did not affect the rate at which parents actively refused consent (the parent returned the form with an indication that the parent did not give consent). In all three groups, about 8 percent of the parents refused consent (Table 2).


The incentives also did not seem to affect the time taken to receive the consent form back. On average, it took 63 calendar days from the day we sent the consent forms to the school to the day we received a signed consent form, slightly more days for the treatment schools and slightly fewer days for the control schools (Table 2).


Even though there was no statistically significant difference in the response or consent rates, the rates could reflect differential follow-up efforts. Our analysis shows this likely was not the case (Table 3).




Table 3. Effort Taken in Obtaining Response

                                        Treatment 1          Treatment 2       Control
Activity                                ($5 to student,      ($5 to student)   (No Incentives)   Total
                                        conditional $25
                                        to class)
Packages of Forms Sent to School
  Number per school                          3.5                  3.7               3.9            3.7
  Number per response                        0.07                 0.12              0.09           0.09
  Hours spent per response                   0.11*                0.13*             0.04           0.09

Contacts Made with School
  Number per school                          6.5                  8.2               9.9            8.2
  Number per response                        0.13                 0.24              0.27           0.22
  Hours spent per response                   0.03                 0.06              0.07           0.05

Visits Made to School
  Number per school                          1.9                  1.8               2.0            1.9
  Number per response                        0.04                 0.05              0.05           0.05
  Hours spent per response                   0.10                 0.13              0.15           0.12

Attempts to Obtain Consent Verbally
  Number per school                         62                   91                59             73
  Number per response                        1.35                 2.53              0.86           1.57
  Hours spent per response                   0.11                 0.20              0.07           0.13

Total hours spent per response               0.34                 0.52              0.33           0.39

*Different from the Control group at p < .05.


We found no statistically significant differences between incentive groups in the number of times packages of consent forms were sent to schools, the number of contacts with schools, or the number of visits to schools. Somewhat more packages of consent forms were sent to schools in the Control group, and more contacts with and visits to schools in the Control group were made, but these differences were not statistically significant.

Overall, we found no statistically significant differences across the incentive groups in the time spent per returned consent form. The time spent sending packages of consent forms was higher in the treatment groups, but this reflects the additional time needed to prepare each package in these groups (which included gift cards) rather than a need to send more packages.

Given that the effort put into obtaining responses was fairly similar across the three experimental groups, it seems fair to conclude that the lack of differences in response and consent rates reflects the ineffectiveness of the incentives themselves. This ineffectiveness could be interpreted in two ways. First, it could be that incentives play little role in whether parents receive the consent form, sign it, and return it. Second, it could be that both incentives tested in this experiment were too low, and that higher incentives would be more effective. Another experiment would be needed to distinguish between these explanations.











cc: Tim Silva


1 The four districts are Miami-Dade County, Houston, Chicago, and New York City.

An Affirmative Action/Equal Opportunity Employer
