Survey of SNAP and Work

OMB Control Number: 0584-NEW

Expiration Date: xx/xx/20xx

Appendix T - Incentives and Response Rates



Incentives and Response Rates

Postpaid Incentives. Research unequivocally demonstrates that incentives increase response rates. Singer and Ye1 completed a systematic review of articles, published after 2002, on the use of incentives to enhance response rates. They found that (a) incentives increase response rates for all modes of administration; (b) response rates continue to improve as incentive amounts increase; and (c) monetary incentives are more effective than promised gifts. Incentives likely influence response rates either by facilitating contact with the potential respondent or by stimulating cooperation. In a meta-analysis of monetary incentives, Mercer and colleagues2 reported a 10-percentage-point increase in response rates for mail surveys when participants were paid a $2 prepaid incentive and a 6-percentage-point increase for phone surveys when participants were offered a $20 postpaid incentive. Similarly, research indicates that postpaid incentives improve responses to mail and interviewer-administered surveys.10,11 For example, Cantor et al.3 reported an effect of 9.1 percentage points when offering a $20 postpaid incentive (compared to no incentive).


There has been extensive research on the relative effectiveness of different incentive amounts. Higher incentive amounts increase response rates, although the relationship between incentive amounts and response rates is not linear:4 the marginal increase in response rate decreases as incentive amounts increase. For example, Mercer et al. found that while response rates increased with higher incentive amounts, the effect tapered off at about $20 (in 2012 dollars).5 Cantor et al. (2008) obtained similar results.6 A survey of unemployed and dislocated workers found that incentives of $50 and $75 resulted in significantly higher response rates than a $25 incentive; however, the $50 incentive was more cost-effective than the $75 incentive.

Early Response Incentives. Survey literature suggests that incentives can be effective at increasing early response.7 One study found that offering a $100 early response incentive, compared to a $50 incentive, increased the response rate during the cutoff period from 20 percent to 29 percent.8 Similarly, another study with disconnected youth found that those offered a $40 incentive had 38 percent higher odds of completing their survey within the first four weeks than those offered a $25 incentive. An early response incentive has the potential to reduce overall data collection costs by shortening the data collection period and driving more responses to the more cost-effective web mode.



Prepaid Incentives. Other literature documents the effectiveness of prepaid incentives.9 A review by Cantor, O’Hare, and O’Connor10 of incentive experiments in telephone surveys found consistently significant effects for prepaid incentives of $1 to $5, with increases in response rate of 2.2 to 12.1 percentage points.11 A recent experiment conducted by FNS for the SNAP Barriers Study found that a $2 prepaid incentive combined with a $20 postpaid incentive increased the response rate by 5.8 percentage points compared to a $20 postpaid incentive alone.12 In a meta-analysis of 40 studies of multimode surveys with varying incentive amounts, Messer and Dillman13 found that offering a $5 prepaid incentive in a web-mail design yielded a significant increase in response rates.

The Food and Nutrition Service (FNS) study Nutrition Assistance in Farmers Markets: Understanding the Shopping Patterns of SNAP Participants (FMCS)14 (OMB Control Number: 0584-0564; Expiration Date: November 30, 2014) involved survey data collection from SNAP participants, with respondent burden comparable to the burden proposed for this study. The FMCS included an incentive experiment to examine the impact of differential incentives on survey completion rates among SNAP participants. The estimated burden for completing the one-time survey was 25 minutes. Survey completion rates ranged from 42.5 to 48.9 percent, with the highest response rate in the group offered a $5 prepaid incentive plus a $20 incentive upon survey completion. Response rates were approximately 6 percentage points higher in the $20 post-completion incentive group than in the $10 post-completion incentive group.


Most directly relevant to the current data collection, there is evidence that a prepaid incentive can increase response rates in sequential multimode designs similar to the one used for this study.15 An experiment conducted by the Department of Labor examined the use of a prepaid incentive in a survey of low-wage workers. The survey included web and telephone modes, with an early response incentive for completing the survey on the web. The control group was offered a $40 incentive to complete the survey on the web and a $30 incentive to complete it by telephone; the treatment group was offered $35 to complete on the web and $25 to complete by telephone, plus a $5 prepaid incentive. The $5 prepaid incentive significantly increased the response rate and reduced more costly locating efforts, making it cost-effective. The increase in response rate was largely due to higher response in the web mode.

Incentives and Nonresponse Bias

Several studies have also found that incentives are effective at changing the composition of the sample and potentially reducing nonresponse bias. Offering incentives can increase participation among low-income respondents and those who are less interested in the research.16,17 For example, Singer et al. (2000) found that a $5 prepaid incentive brought a disproportionate number of low-education respondents into the sample.18 One study using data from the Health and Retirement Study (HRS) found that offering a $100 incentive per individual (or $200 per couple) to respondents who initially refused brought in respondents who had about 25 percent higher net worth and 16 percent higher income than those who never refused. Another experiment,19 examining the impact of providing incentives to telephone survey nonrespondents on sample composition and data quality in the New York Adult Tobacco Survey, found that offering an incentive to individuals who refused to participate led to an increased proportion of respondents who were over age 55, did not have a college degree, and were not employed. Many of these subpopulations are represented in the SNAP universe.

Studies also suggest that incentives may reduce nonresponse bias by bringing in respondents to whom the research topic is not salient or not of interest. For example, Groves et al. (2000) found that while individuals more involved in their community were more likely to respond to a survey about issues facing the community, a $5 prepaid incentive increased response among those who were not involved.20 The incentive was more effective among those for whom the topic was less salient: it increased the response rate in the “low community involvement” group by 42 percentage points and in the “high community involvement” group by 16 percentage points. This suggests that without the incentive, the sample would have been biased toward individuals who were more interested in the topic, and survey estimates of community involvement would have been biased upward.


We propose to provide survey respondents a cash incentive of $20 upon completion of the survey. A monetary incentive serves as a token of our appreciation and can be used to offset expenses such as cellular phone airtime or internet connectivity charges. To drive respondents to the most cost-efficient mode, the web survey, and thereby increase the efficiency of data collection, we propose to provide an additional $20 if respondents complete the survey within the first four weeks. We believe that the early and later incentives of $40 and $20, respectively, strike a good balance between encouraging cooperation and the efficient use of project resources. Moreover, these amounts are in line with other recent FNS studies:

  • Most directly relevant, the USDA-FNS SNAP E&T Registrant and Participant Survey (OMB control number: 0584-0339, expiration date: 1/31/2021) involved a survey of SNAP E&T participants and work registrants with a 30-minute burden. Respondents were offered a $40 early response incentive for completing the survey on the web and a $20 incentive for completing it by telephone.

  • The USDA-FNS Evaluation of Food Insecurity Nutrition Incentives (FINI) (OMB control number: 0584-0616, expiration date: 11/30/2019) included baseline and follow-up surveys with SNAP participants. The estimated burden was 20 minutes for each survey, and the incentive was $20.

  • The USDA-FNS study Evaluation of SNAP Employment and Training Pilots (OMB control number: 0584-0604, expiration date: 1/31/2019) surveyed participants in the treatment and control groups after 12 months of participation and again after 36 months. The estimated burden for each survey was about 30 minutes. A $30 incentive was approved for completing the 12-month follow-up survey and a $40 incentive for completing the 36-month follow-up survey; there was no baseline survey.




1 Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. The Annals of the American Academy of Political and Social Science, 645(1), 112–141.

2 Mercer, A., Caporaso, A., Cantor, D., & Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79, 105–129.

3 Cantor, D., Wang, K., & Abi-Habib, N. (2003). Comparing promised and pre-paid incentives for an extended interview on a random digit dial survey. Proceedings of the American Statistical Association, Survey Research Section.

4 Mercer et al. (2015).

5 Mercer et al. (2015).

6 Cantor, D., O’Hare, B., & O’Connor, K. (2008). The use of monetary incentives to reduce non-response in random digit dial telephone surveys. In J. M. Lepkowski, C. Tucker, J. M. Brick, E. de Leeuw, L. Japec, P. J. Lavrakas, M. W. Link, & R. L. Sangster (Eds.), Advances in Telephone Survey Methodology (pp. 471–498). New York: Wiley.

7 LeClere, F., Plummer, S., Vanicek, J., Amaya, A., & Carris, K. (2012). Household early bird incentives: Leveraging family influence to improve household response rates. American Statistical Association Joint Statistical Meetings, Section on Survey Research.

8 Coopersmith, J., Vogel, L. K., Bruursema, T., & Feeney, K. (2016). Effects of incentive amount and type on web survey response rates. Survey Practice, 9(1).

9 Singer, E., van Hoewyk, J., & Maher, M. P. (2000). Experiments with incentives in telephone surveys. Public Opinion Quarterly, 64, 171–188; and Singer, E., Groves, R. M., & Corning, A. D. (1999). Differential incentives: Beliefs about practices, perceptions of equity, and effects on survey participation. Public Opinion Quarterly, 63, 251–260.

10 Cantor et al. (2008).

11 Cantor et al. (2008).

12 Gearing, M. (2017). Assessment of the Barriers that Constrain the Adequacy of Supplemental Nutrition Assistance Program (SNAP) Allotments: Results of Incentive Experiment. Memo submitted to FNS.

13 Messer, B., & Dillman, D. (2011). Surveying the general public over the internet using address-based sampling and mail contact procedures. Public Opinion Quarterly, 75, 429–457.

14 Karakus, M., MacAllum, K., Milfort, R., & Hao, H. (2014). Nutrition Assistance in Farmers Markets: Understanding the Shopping Patterns of SNAP Participants. Prepared by Westat for the U.S. Department of Agriculture, Food and Nutrition Service, October 2014.

15 Hock, H., Anand, P., Mendenko, L., DiGiuseppe, R., & McInerney, R. (2015, May). The effectiveness of prepaid incentives in a mixed-mode survey. Presentation at the 70th Annual Conference of the American Association for Public Opinion Research, Hollywood, FL. http://www.aapor.org/AAPOR_Main/media/AnnualMeetingProceedings/2015/G2-3-Mendenko.pdf

16 Groves, R. M., Couper, M. P., Presser, S., Singer, E., Tourangeau, R., Acosta, G., & Nelson, L. (2006). Experiments in producing nonresponse bias. Public Opinion Quarterly, 70(5), 720–736.

17 Singer, E., & Kulka, R. A. (2002). Paying respondents for survey participation. In M. Ver Ploeg, R. A. Moffitt, & C. F. Citro (Eds.), Studies of Welfare Populations: Data Collection and Research Issues (pp. 105–128). Washington, DC: National Academy Press.

18 Singer et al. (2000).

19 Currivan, D. (2005). The impact of providing incentives to initial telephone survey refusers on sample composition and data quality. Paper prepared for the American Association for Public Opinion Research Annual Meeting, Miami, FL.

20 Groves, R., Singer, E., & Corning, A. (2000). Leverage-saliency theory of survey participation: Description and illustration. Public Opinion Quarterly, 64(3), 299–308.
