Appendix U. Incentives and Response Rates
A. Household survey incentives.
FNS is requesting a pre-paid survey incentive of $5 and a post-survey incentive of a $35 Visa gift card for participants completing the 35-minute survey instrument. Research has found that offering monetary incentives increases survey response rates and that higher post-pay incentives are associated with higher response rates. Incentives also help reduce nonresponse bias in surveys without compromising the quality of the data.1,2,3 Studies also indicate that pre-paid incentives can be more effective than post-paid incentives in promoting survey response, particularly in reducing refusal rates among located participants and in lending credibility to the promised post-pay incentive.2,4 Post-pay incentives encourage participants to stay engaged until they have completed the survey and help offset the expense and time associated with completion, which is particularly relevant for longer survey instruments. The proposed combination of pre- and post-paid incentives has been shown to be particularly effective in improving overall survey response.1-5 Below, we describe the literature surrounding the pre-paid and post-paid incentive approach. We also describe a proposed experiment to determine the best way to present sample members with the $5 pre-paid incentive.
Post-paid Incentives. Research indicates that offering monetary incentives helps improve survey response rates and mitigate nonresponse bias across different respondent populations, particularly among low-income respondents, those residing in rural areas, and those receiving federal nutrition assistance benefits.2,6 On the Project LAUNCH Cross-Site Evaluation (OMB number 0970-0373, expired October 31, 2019), the study did not initially offer a token of appreciation to parents who completed a web-based survey; OMB later approved a $25 post-pay token of appreciation. The team found that early respondents (those completing before incentives were offered) were not representative of their communities: minorities, individuals with lower incomes and education levels, and those who worked part-time or were unemployed were underrepresented. Completion rates and representativeness both improved after the incentive was added.7
A meta-analysis by Singer et al. found that incentives in face-to-face and telephone surveys were effective at increasing response rates, with each one-dollar increase in incentive resulting in approximately a one-third of a percentage point increase in response rate, on average.2 Experiments conducted by the U.S. Census Bureau with the Survey of Income and Program Participation (SIPP) in 2014 compared the effects of $10, $20, and $40 post-pay incentives on survey response. The team found that a $40 post-pay incentive increased response rates by 3.0 percentage points compared to the $0 control group, while a $20 incentive increased response rates by 1.1 percentage points.8 The $40 incentive also increased response rates among subgroups of low-income respondents. FNS proposes to provide survey respondents with a $35 Visa gift card as a post-pay incentive upon completion of the survey.
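As a back-of-the-envelope illustration of the Singer et al. rule of thumb cited above (roughly one-third of a percentage point of response rate per incentive dollar, on average), the expected gain for a given incentive amount can be sketched as follows. This is purely illustrative; the function name and the linear extrapolation are assumptions, and actual effects vary by mode, population, and design.

```python
# Illustrative sketch of the Singer et al. rule of thumb:
# ~1/3 of a percentage point of response rate per $1 of incentive.
# Linear extrapolation is an assumption, not a prediction for this study.

PP_PER_DOLLAR = 1 / 3  # approximate percentage points gained per $1


def expected_gain_pp(incentive_dollars: float) -> float:
    """Approximate expected response-rate gain, in percentage points."""
    return incentive_dollars * PP_PER_DOLLAR


if __name__ == "__main__":
    for amount in (20, 35, 45):
        print(f"${amount} incentive -> ~{expected_gain_pp(amount):.1f} pp gain")
```

Under this rough rule, the proposed $35 post-pay incentive would correspond to an average gain on the order of 11 to 12 percentage points, though the SIPP experiment cited above suggests realized gains can be considerably smaller.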
Pre-paid Incentives. Small pre-paid incentives (between $1 and $5) have been shown to significantly increase response rates for telephone surveys.2,9,10,11 Messer and Dillman also suggest that a pre-paid incentive mailed with advance letter materials may help motivate respondents to complete the survey by web, helping overcome barriers associated with hard-copy mailings that attempt to push participants to a web mode.11 A review of the literature by Cantor, O’Hare, and O’Connor found that pre-paid incentives between $1 and $5 in a telephone survey were associated with response rate increases of 2.2 to 12.1 percentage points compared to no offered incentive.9 Given this literature, FNS proposes a $5 pre-paid incentive for the participant survey to maximize response rates while minimizing costs.
Combination of Pre- and Post-Paid Incentives. A combined approach that includes both a pre- and a post-paid incentive can help studies achieve maximum response rates while remaining cost effective.1,5 Cantor et al. found that combining a pre- and post-paid incentive increased overall tracing and contact rates, particularly among sample members for whom the team did not have an active telephone number.9 This project uses an address-based sample supplemented with SNAP administrative records; we will not have phone numbers for many sampled households and are less likely to have phone numbers for SNAP nonparticipants. The FNS study Nutrition Assistance in Farmers Markets: Understanding the Shopping Patterns of SNAP Participants (FMCS)12 (OMB Control Number: 0584-0564; Expiration Date: November 30, 2014) collected data from SNAP participants and included an incentive experiment examining the impact of differential incentives on completion rates for a 25-minute survey, comparing combinations of a $5 or $10 pre-paid incentive with a $10 or $20 post-paid incentive. The combination of a $5 pre-paid and $20 post-paid incentive produced the highest response rate (49.3%); the $5 pre-paid and $10 post-paid group produced the lowest (42.5%).
Post-Pay Amounts. Additionally, FNS proposes increasing the post-pay incentive amount from $35 to $45 among subpopulations of nonrespondents in subsequent releases to reduce nonresponse bias. The current study design includes three separate sample releases throughout data collection, with sample members randomly assigned to a release. After the first release, the study team will review response rates by county to determine whether there are significant differentials between SNAP participants and SNAP nonparticipants. We hypothesize that response rates may differ between the two groups given that nonparticipants are not receiving FNS benefits. If a response rate differential exists within individual counties, such that SNAP nonparticipants are less likely to respond, the team would increase the post-pay incentive among SNAP nonparticipants randomly selected for the second or third release. This approach uses a preexisting characteristic of survey nonresponders (SNAP participation status) to maximize the benefit of incentive payments.
For these subgroups, FNS proposes a post-paid incentive of $45, instead of $35, to further encourage survey response and ultimately reduce nonresponse bias. The literature shows a positive relationship between higher post-pay incentives and higher response rates.13,14,15 However, it is not always cost effective or beneficial to increase incentive amounts for the entire sample: not all participants require an increased incentive to participate,16 and simply increasing response rates does not necessarily reduce nonresponse bias in survey estimates.17 Research suggests that differential incentives are most effective in reducing nonresponse bias when they are offered to otherwise underrepresented study groups.1,18 The study’s randomized multiple-release sample design offers the study team a unique opportunity to learn about the sample population during the first release and to tailor the data collection approach in subsequent releases based on known sample characteristics. If SNAP nonparticipants prove less likely to participate during the first sample release, increasing the post-pay incentive for these sample members may increase response rates and reduce nonresponse bias in the most cost-effective way. We will include subgroup analyses (including income within each state) when reviewing effects on completion rates to confirm that the differential incentive did not have a disproportionate impact by income, which could introduce additional bias.
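The county-level review described above amounts to comparing two response proportions within each county. A minimal sketch of that decision rule follows; the function names, the pooled two-proportion z-test, and the 1.96 threshold are illustrative assumptions, not the study's specified analysis plan.

```python
import math


def two_prop_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """Pooled two-proportion z statistic, e.g. comparing responding
    SNAP participants (x1 of n1) vs. nonparticipants (x2 of n2)
    within a county."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled response rate
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se


def boost_incentive(part_resp: int, part_n: int,
                    nonpart_resp: int, nonpart_n: int,
                    z_crit: float = 1.96) -> bool:
    """Return True if nonparticipants respond at a significantly lower
    rate, which would trigger the $45 post-pay incentive for
    nonparticipants in the later releases (hypothetical rule)."""
    z = two_prop_z(part_resp, part_n, nonpart_resp, nonpart_n)
    return z > z_crit


# Example: in one county's first release, 120 of 240 participants but
# only 90 of 250 nonparticipants responded -> boost the incentive.
```

In practice the study team would likely apply survey weights and multiple-comparison considerations across counties; the sketch shows only the core comparison.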
Incentive Experiment. To further the literature and better understand how to efficiently deliver pre-paid incentives, the study team proposes an experiment investigating the best way to present sample members with the $5 pre-paid incentive sent in the initial advance letter mailing. DeBell et al.19 found that displaying a $5 pre-paid cash incentive through the window of a sealed envelope increased the survey response rate by more than 4 percentage points compared to envelopes where cash was not visible, with no evidence of additional theft. Building on that research, the experiment will test three approaches to the advance letter envelope: (1) pre-paid cash visible with the $5 denomination showing, (2) pre-paid cash visible without the denomination showing, and (3) pre-paid cash not visible through the window. In the two visible-cash conditions, the cash will be purposefully affixed to the letter to control how much of the $5 bill can be seen through the envelope window; in the not-visible condition, the cash will be affixed so that it cannot be seen. The study team will field this experiment during the first county sample release, and the results will dictate the approach for the remaining releases to maximize response rates.
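The envelope experiment reduces to a simple random assignment of the first-release sample across three arms, then selecting the arm with the highest completion rate. The sketch below illustrates that logic; the arm labels, seeded shuffle, and round-robin assignment are illustrative assumptions rather than the study's specified procedure.

```python
import random

# Hypothetical labels for the three envelope conditions.
ARMS = ("cash_visible_value_showing",
        "cash_visible_value_hidden",
        "cash_not_visible")


def assign_arms(sample_ids, seed: int = 2024) -> dict:
    """Randomly assign each first-release sample member to one of the
    three envelope conditions in (nearly) equal proportions."""
    rng = random.Random(seed)  # seeded for a reproducible assignment
    ids = list(sample_ids)
    rng.shuffle(ids)
    return {sid: ARMS[i % len(ARMS)] for i, sid in enumerate(ids)}


def best_arm(completions_by_arm: dict, n_by_arm: dict) -> str:
    """Pick the arm with the highest completion rate; this arm's
    envelope treatment would be used for the remaining releases."""
    return max(n_by_arm, key=lambda a: completions_by_arm[a] / n_by_arm[a])
```

A round-robin over a shuffled list guarantees arm sizes differ by at most one, which keeps the three comparisons balanced even for small first-release counties.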
B. In-depth interview (IDI) incentives.
FNS plans to offer a $50 post-pay incentive for the 120-minute in-depth interviews. This amount is consistent with the literature as well as other OMB-approved information collections. For example, $50 incentives were offered to participants in a 120-minute in-depth interview on the Evaluation of Demonstrations to End Childhood Hunger for USDA/FNS (OMB Control Number 0584-0603, Expiration Date 08/31/2018) and for case study interviews conducted on the Evaluation of Supplemental Nutrition Assistance Program (SNAP) Employment and Training Pilots (OMB Control Number 0584-0604, Expiration Date 11/30/2022). An incentive of $50 was also offered to community members participating in one-hour telephone interviews for the Evaluation of the Pilot Project for Canned, Frozen, or Dried Fruits and Vegetables in the Fresh Fruit and Vegetable Program for USDA/FNS (OMB Control Number 0584-0598, Expiration Date September 30, 2017). The study assessing the effect of the Supplemental Nutrition Assistance Program on food security (OMB Control Number 0584-0563, Discontinued September 19, 2011) offered a $30 incentive for completing a 90-minute in-depth interview.
1 Singer E, Ye C. The use and effects of incentives in surveys. The ANNALS of the American Academy of Political and Social Science. 2013;645(1):112–41.
2 Singer, Eleanor, et al. "The effect of incentives on response rates in interviewer-mediated surveys." Journal of official statistics 15.2 (1999): 217.
3 Singer, E., and R.A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues. Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press, 2002, pp. 105–128.
4 Mercer A, Caporaso A, Cantor D, Townsend R. How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly. 2015;79(1):105–29.
5 Beydoun, Hind, Audrey F. Saftlas, Kari Harland, and Elizabeth Triche. 2006. Combining conditional and unconditional recruitment incentives could facilitate telephone tracing in surveys of postpartum women. Journal of Clinical Epidemiology 59 (7): 732–38.
6 Bonevski, Billie, et al. "Reaching the hard-to-reach: a systematic review of strategies for improving health and medical research with socially disadvantaged groups." BMC medical research methodology 14.1 (2014): 1-29.
7 Lafauve, K., K. Rowan, K. Koepp, and G. Lawrence. “Effect of Incentives on Reducing Response Bias in a Web Survey of Parents.” Presented at the American Association of Public Opinion Research Annual Conference, Denver, CO, May 16–19, 2018.
8 Westra, Ashley, Mahdi Sundukchi, and Tracy Mattingly. "Designing a multipurpose longitudinal incentives experiment for the Survey of Income and Program Participation." Proceedings of the 2015 Federal Committee on Statistical Methodology (FCSM) Research Conference, available at https://www.reginfo.gov/public/do/DownloadDocument?objectID=77072601 (last accessed June 9, 2022). 2015.
9 Cantor, David, Barbara O’Hare, and Kathleen O’Connor. 2008. The use of monetary incentives to reduce non-response in random digit dial telephone surveys. In Advances in telephone survey methodology, eds. James M. Lepkowski, Clyde Tucker, J. Michael Brick, Edith de Leeuw, Lilli Japec, Paul J. Lavrakas, Michael W. Link, and Roberta L. Sangster, 471–98. New York, NY: Wiley.
10 Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. "Experiments with incentives in telephone surveys." Public Opinion Quarterly 64.2 (2000): 171-188.
11 Messer, Benjamin L., and Don A. Dillman. "Surveying the general public over the internet using address-based sampling and mail contact procedures." Public opinion quarterly 75.3 (2011): 429-457.
12 Karakus, Mustafa, MacAllum, Keith, Milfort, Roline and Hao, Hongsheng. Nutrition Assistance in Farmers Markets: Understanding the Shopping Patterns of SNAP Participants. Prepared by Westat for the U.S. Department of Agriculture, Food and Nutrition Service, October 2014. https://www.fns.usda.gov/snap/nutrition-assistance-farmers-markets-understanding-shopping-patterns-snap-participants
13 Singer E, Ye C. The use and effects of incentives in surveys. The ANNALS of the American Academy of Political and Social Science. 2013;645(1):112–41.
14 Singer, Eleanor, et al. "The effect of incentives on response rates in interviewer-mediated surveys." Journal of official statistics 15.2 (1999): 217.
15 Singer, E., and R.A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues. Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro. Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academy Press, 2002, pp. 105–128.
16 Groves RM, Singer E, Corning A. Leverage-saliency theory of survey participation: description and an illustration. The Public Opinion Quarterly. 2000;64(3):299–308.
17 Groves RM. Nonresponse rates and nonresponse bias in household surveys. Public opinion quarterly. 2006;70(5):646–75.
18 Lepkowski JM, Mosher WD, Groves RM, et al. Responsive design, weighting, and variance estimation in the 2006–2010 National Survey of Family Growth. National Center for Health Statistics. Vital Health Stat 2(158). 2013.
19 DeBell M, Maisel N, Edwards B, Amsbary M, Meldener V. Improving Survey Response Rates with Visible Money. Journal of Survey Statistics and Methodology. 2020;8(5):821–31.