Beginning Teacher Longitudinal Study (BTLS) 2009-2012

Results of Incentive Experiment

OMB: 1850-0868


MEMORANDUM

TO: Shelly Wilkie Martinez, OMB

THROUGH: Kashka Kubzdela, NCES

FROM: Freddie Cross and Kathryn Chandler, NCES

SUBJECT: Results of BTLS Incentives Experiment (OMB# 1850-0868 v.1)

DATE: December 21, 2010



This memo summarizes the results of the incentive experiment conducted as part of the 2009-10 Beginning Teacher Longitudinal Study (BTLS), approved on October 20, 2009, under OMB# 1850-0868 v.1. To boost response rates on the new BTLS, NCES gave non-contingent cash incentives to study participants in advance of administering the survey instrument. Because an optimal incentive amount had not been determined, NCES included an experimental design to test the effects of differential amounts on response rates. The memo reviews the literature on the effectiveness of incentives on survey response, explains the research questions and methodology, defines the population of analysis, and concludes that $20 cash incentives were more effective than $10 incentives in boosting final response rates, as well as early response rates before the start of the telephone follow-up operation. The memo also includes recommendations for the next wave of the BTLS.

Literature on effectiveness of incentives

Offering incentives for a survey respondent's time is one way to increase survey response rates. Monetary or gift incentives have been shown to significantly increase response rates in surveys (Szelényi, Bryant, and Lindholm 2005; Brick, Hagedorn, Montaquila, Roth, and Chapman 2006). Research also shows that an incentive as small as five dollars can effectively increase response rates (VanGeest, Wynia, Cummins, and Wilson 2001; Halpern, Ubel, Berlin, and Asch 2002). However, it is unclear what incentive amount is the most cost-efficient means of increasing the response rates of beginning teachers. In addition, a cash incentive may produce a greater number of early responses before the scheduled starting date for telephone follow-up calls, which would reduce the number of calls needed and therefore the follow-up cost of the survey. Finally, research has shown that prepaid monetary incentives are more effective than the promise of payment after the survey is completed (Hopkins, Hopkins, and Schon 1988; Skinner, Ferrell, and Pride 1984).

Research Questions and Methodology

To test the effectiveness of different cash incentive amounts, the 2009-10 administration of the BTLS contained an experiment measuring the impact on survey completion, completion date, and completeness of the survey responses. The sampled cases in the BTLS cohort were randomly assigned to one of two experimental groups: a $10 incentive group or a $20 incentive group. Teachers were mailed a letter with the cash incentive three days before they received the email with the link to the online BTLS instrument, so they should have received the two correspondences at about the same time.
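
As an illustration of the design, a minimal sketch of the random assignment follows. The actual Census assignment procedure is not documented in this memo; the function name, case IDs, and seed are hypothetical.

```python
import random

def assign_incentive_groups(case_ids, seed=2009):
    """Randomly split cases into two equal-size incentive groups.

    Illustrative only: the seed is arbitrary and the real assignment
    procedure used by Census is not described in this memo.
    """
    rng = random.Random(seed)
    shuffled = list(case_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]  # ($10 group, $20 group)

# Footnote 1: N = 1,994 cases were originally assigned equally.
group_10, group_20 = assign_incentive_groups(range(1994))
```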

The following research questions were explored:

  1. Is twenty dollars more effective than ten dollars in increasing the number of interviews?

  2. Does a larger cash incentive amount increase the number of interviews completed before the start of the scheduled telephone follow-up date (February 1)?

  3. Does a larger cash incentive amount increase the proportion of complete surveys among study interviews?

Comparisons were made between the two incentive groups on the number of interviews, the number of interviews completed before the telephone follow-up date (2/1/10), and the number of complete surveys, using chi-square tests of association between incentive amount and each outcome variable.
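
As an illustration, the early-interview comparison can be reproduced from the counts reported in the Experiment Results section below. This is a minimal sketch assuming scipy is available; passing correction=False requests the uncorrected Pearson statistic, which matches the values reported in this memo.

```python
from scipy.stats import chi2_contingency

# 2x2 table: rows are incentive groups; columns are interview
# before 2/1/10 (yes, no). Counts are from Table 1.
early = [[474, 965 - 474],   # $10 group
         [544, 964 - 544]]   # $20 group

chi2, p, dof, _expected = chi2_contingency(early, correction=False)
print(f"chi-square({dof}) = {chi2:.4f}, p = {p:.4f}")
# chi-square(1) = 10.3463, p = 0.0013
```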

The 2009-10 BTLS data were primarily collected through a web instrument with telephone follow-up. The first item and several subsequent items in the instrument were designed as required questions; that is, a person could not proceed through the survey without answering them. These required questions were used to determine respondents' teaching status (current teacher vs. former teacher; stayers vs. movers), which, in turn, determined the paths respondents took through the survey. In addition to the web instrument, BTLS participants had the option to complete the survey over the phone by calling a toll-free number. During the telephone follow-up period (starting in February), study participants who hadn't responded to the web instrument were called and offered the opportunity to answer the questions over the phone. The log data produced by the web instrument during data collection contained dates and the following indicators of completion:

  • complete (respondent/interviewer reached the last screen),

  • partial-complete – with required items (respondent/interviewer completed the required items),

  • partial-complete – without required items (respondent/interviewer didn’t complete the required items), or

  • opened with no answers (respondent/interviewer didn’t answer any questions).

Based on the actual data collected, a Final ISR (FI) file was created containing information on case status: whether a case was an interview (i.e., a respondent), a nonrespondent, or out-of-scope. Both complete surveys and partial-complete surveys with the required items answered were considered study interviews in BTLS processing because they contain key information on teachers' status. The analysis below is conducted using the FI file.
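
The following sketch illustrates that classification rule. The status codes and function name are hypothetical, not the actual layout of the web log data or the FI file.

```python
# Hypothetical completion-status codes for the four log-data outcomes.
COMPLETE = "complete"
PARTIAL_WITH_REQUIRED = "partial_with_required"
PARTIAL_WITHOUT_REQUIRED = "partial_without_required"
OPENED_NO_ANSWERS = "opened_no_answers"

def is_study_interview(completion_status: str) -> bool:
    """Complete cases and partial completes with the required items
    both count as study interviews, because both contain the key
    teaching-status information."""
    return completion_status in (COMPLETE, PARTIAL_WITH_REQUIRED)
```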

Data and Analysis

All first-year public school teachers who responded to the 2007-08 Schools and Staffing Survey (SASS) are included in the BTLS sample, and their SASS responses constitute the Wave 1 data of the BTLS. In 2008-09, these same teachers were asked to complete the longitudinal version of the Teacher Follow-up Survey (TFS); their responses constitute the Wave 2 data of the BTLS. Prior to the 2009-10 BTLS data collection, a total of 1,976 current or former teachers1 were randomly assigned to two groups: a $10 incentive group (group 1) and a $20 incentive group (group 2). Group 1 consisted of 982 people and group 2 of 994. However, after the incentives were mailed out, three people were deemed out-of-scope (OOS) and 44 people did not receive the incentives due to an undeliverable address (UAA). Because these people were either ineligible for the BTLS or were never "treated," they were excluded from the analysis. As a result, the sample for the following analysis includes 1,929 current or former teachers who started teaching in 2007 or 2008.

Experiment Results

Table 1 shows that among the 1,929 BTLS participants who actually received the incentive, 965 received 10 dollars and 964 received 20 dollars. Forty-nine percent (474 current or former teachers) of the 10-dollar incentive group and 56 percent (544 current or former teachers) of the 20-dollar incentive group completed the survey, or the required items of the survey, by the end of January, before the telephone follow-up period. The chi-square test shows a significant relationship between the number of early study interviews and the incentive amount (chi-square with one degree of freedom = 10.3463, p = .0013). By the end of the data collection, 86 percent of the participants (826 current or former teachers) in the 10-dollar incentive group and 90 percent (865 current or former teachers) in the 20-dollar incentive group were counted as study interviews. The chi-square test shows a significant relationship between the number of final study interviews and the incentive amount (chi-square with one degree of freedom = 7.6216, p = .0058).

Table 2 shows the percentage of complete surveys among the study interviews in the BTLS Wave 3 data collection. Altogether, 1,691 of the 1,929 current and former teachers who received a cash incentive were considered study interviews. Among them, 97 percent completed the survey, meaning the respondent reached the last page of the web instrument. However, the chi-square test shows no significant association between the completeness of the BTLS survey and the incentive amount (chi-square with one degree of freedom = 0.0286, p = .8658).

Cost Analysis

Telephone follow-up costs $39 per case on average, as estimated by Census. This section takes both the cost of the incentives and the cost of telephone follow-up into consideration and calculates the cost per respondent. Table 3 shows the number of cases given incentives, the cost of the incentives, the number of cases followed up, the cost of the follow-up effort, the number of respondents, and the actual cost per respondent. The last column shows that the $20 incentive group had a higher average cost per respondent than the $10 incentive group ($41 vs. $35). However, after taking into account the savings from the reduced telephone follow-up effort, the difference is only six dollars rather than ten. In addition, although more cases in the $10 group received telephone follow-up, this group still had significantly fewer responses than the $20 group.
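
The per-respondent figures can be approximately reconstructed from the counts reported above. This back-of-the-envelope sketch assumes that telephone follow-up was attempted for every case without an interview before February 1; the actual follow-up counts appear in Table 3.

```python
FOLLOWUP_COST = 39  # Census estimate: dollars per followed-up case

def cost_per_respondent(n_cases, incentive, early_interviews, final_interviews):
    """Incentive cost plus follow-up cost, divided by final respondents."""
    total = n_cases * incentive + (n_cases - early_interviews) * FOLLOWUP_COST
    return total / final_interviews

print(f"$10 group: ${cost_per_respondent(965, 10, 474, 826):.0f}")  # ~$35
print(f"$20 group: ${cost_per_respondent(964, 20, 544, 865):.0f}")  # ~$41
```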



Conclusions and Recommendations

In summary, the larger incentive amount (20 dollars) is associated with both a higher early survey response rate and a higher final response rate. As discussed earlier, a higher early response rate also reduces the number of telephone follow-up cases, which can offset some of the cost of the larger incentive. At the same time, the incentive amount is not associated with the completeness of the survey. The results presented above are limited in that the current design does not support comparisons for any subgroups other than incentive amount.

Given that the group receiving the $20 incentive maintained both higher early and higher final response rates than the group receiving the lower incentive, we recommend using a $20 incentive for the BTLS Wave 4 data collection to maintain high response rates. The projected costs and response rates for two different incentive amounts are listed in Table 4 below.

The incentive proposed for BTLS Wave 4 will allow for comparisons between Wave 3 and Wave 4, so that we will be able to determine whether the same incentive amount maintains the same early and final response rates and whether an increased amount produces higher early or final response rates. The impact of the same incentive on different subgroups (e.g., former vs. current teachers) will also be investigated.





References

Brick, J.M., Hagedorn, M.C., Montaquila, J., Brock Roth, S., Chapman, C. (2006). Impact of Monetary Incentives and Mailing Procedures: An Experiment in a Federally Sponsored Telephone Survey (NCES 2006-066). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

Halpern, S. D., Ubel, P. A., Berlin, J. A., and Asch, D. A. (2002) “Randomized Trial of $5 versus $10 Monetary Incentives, Envelope Size, and Candy to Increase Physician Response Rates to Mailed Questionnaires” Medical Care 40(9), 834-839.

Hopkins, K. D., Hopkins, B.R. and Schon, I. (1988) “Mail Surveys of Professional Populations: The Effects of Monetary Gratuities on Return Rates.” Journal of Experimental Education 56(4), 173-75.

Skinner, S. J., Ferrell, O.C. and Pride, W. M. (1984) “Personal and Nonpersonal Incentives in Mail Surveys: Immediate versus Delayed Inducements” Journal of the Academy of Marketing Science 12(1), 106-14.

Szelényi, K., Bryant, A. N., and Lindholm, J. A. (2005) "What Money Can Buy: Examining the Effects of Prepaid Monetary Incentives on Survey Response Rates Among College Students." Educational Research and Evaluation 11(4), 385-404.

VanGeest, J. B., Wynia, M. K., Cummins, D.S., and Wilson, I.B. (2001) "Effects of Different Monetary Incentives on the Return Rate of a National Mail Survey of Physicians." Medical Care 39(2), 197-201.

1 The treatment groups were originally assigned equally on N = 1,994. However, after the groups were assigned, Census removed study refusals and OOS teachers from the experiment, creating unequal treatment groups.


