Food Security Status and Well-Being of Nutrition Assistance Program (NAP) Participants in Puerto Rico

OMB: 0584-0674
Appendix H. Use of Incentives for the Study

The study team is requesting the use of incentives for the household survey, in-depth interviews, and concept mapping. The initial survey invitation would include a $5 prepaid cash incentive, and respondents would be offered a $40 postparticipation incentive. The subgroup of survey respondents who participate in a 60-minute in-depth interview would be offered a $50 postparticipation incentive. Concept-mapping participants would be offered a $50 honorarium per meeting.

Section A discusses factors supporting the use of incentives. Section B discusses incentives for the household survey, and Section C discusses incentives for the in-depth interviews. Section D discusses the provision of honoraria for the concept-mapping participants.

  A. Factors Supporting the Use of Incentives

This section provides details on the decision to provide incentives to survey respondents and in-depth interview participants. Sections 1 through 4 discuss general respondent burden for the household survey and summarize the literature on prepaid and postparticipation incentives, incentive amounts and response rates, and the use of incentives to reduce nonresponse bias.

  1. Respondent Burden

Individuals sampled for the study will need to take time outside normal working hours to complete the survey and/or in-depth interview—time that could be used for other important nonwork activities, such as childcare, household maintenance, or grocery shopping. Individuals may face financial burdens, such as the expense of childcare, as a result of completing the survey or in-depth interview during nonworking hours. There may be other opportunity costs, such as needing to purchase rather than prepare meals for themselves or household members as a result of time spent completing the survey or in-depth interview. Participants who complete the survey via the web or phone or call the study hotline with questions may incur expenses for cellular smartphone airtime or internet connectivity charges. Individuals who complete the in-depth interview may need to travel to the interview location if it occurs outside their home.

  2. Prepaid and Postparticipation Tokens of Appreciation

An incentive is essential to obtaining the sample sizes needed to produce reliable and unbiased estimates because it encourages individuals to provide complete data on critical study instruments. Dillman and Singer each found that incentives reduce the effort required to locate hard-to-reach study participants and lower overall survey costs and the time needed to achieve target completion rates, without affecting data quality.1, 2 In a systematic review, Singer and Ye found that incentives improve response rates for all modes of survey administration, that increasing incentive amounts continues to improve response rates, and that monetary incentives are more effective than promised gifts.3 Higher response rates and larger numbers of completed surveys will improve data quality and the research team’s ability to answer the study research questions.

Singer, Groves, and Corning and Singer, Van Hoewyk, and Maher document the effectiveness of prepaid incentives.4, 5 A review by Cantor et al. of incentive experiments in telephone surveys found consistently significant effects for prepaid incentives of $1 to $5, with response rate increases of 2.2 to 12.1 percentage points.6 Gearing found that a $2 prepaid incentive combined with a $20 postparticipation incentive increased the response rate by 5.8 percentage points compared with a $20 postparticipation incentive alone.7 In a meta-analysis of 40 studies, Messer and Dillman found that a $5 prepaid incentive yielded a significant increase in response rates on multimode surveys using varying incentive amounts.8

In a survey of Supplemental Nutrition Assistance Program (SNAP) participants, Karakus et al. conducted an experiment to examine the impact of differential incentive amounts on survey completion rates.9 The authors found the highest response rate in the group offered a $5 prepaid and a $20 postparticipation incentive. Response rates were approximately 6 percentage points higher in the $20 postparticipation incentive group than in the $10 postparticipation incentive group.

In a survey of low-wage workers, Hock et al. also conducted an incentive experiment in a multimode survey design.10 The survey included web and telephone instruments, with an early response incentive for respondents to complete the survey on the web. The control group was offered a $40 incentive to complete the survey on the web and a $30 incentive to complete the survey on the phone, with no prepaid incentive. The treatment group was offered $35 to complete the survey on the web and $25 to complete it on the phone, plus a $5 prepaid incentive for both modes. Similar to Karakus et al.,11 Hock et al. found that the $5 prepaid incentive significantly increased the response rate and reduced more costly locating efforts, making it cost-effective.12 The increase in the response rate was driven largely by higher response in the web mode.

Mercer et al. reported a 10 percentage-point increase in response rates for mail surveys when participants were given a $2 prepaid incentive and a 6 percentage-point increase for phone surveys when participants were offered a $20 postparticipation incentive.13 Similarly, Gearing and Messer and Dillman each found that postparticipation incentives improved response to mail and interviewer-administered surveys.14, 15 Cantor et al. found response rates increased by 9.1 percentage points when a $20 postparticipation incentive was offered, compared with no incentive.16 Fredrickson et al. found a $10 incentive increased response rates by 20 percentage points among Medicaid recipients.17 Overall, the literature summarized here demonstrates that prepaid and postparticipation incentives increase response rates among survey participants.

  3. Token of Appreciation Amounts and Response Rates

Research indicates the amount of the incentive matters: larger incentive amounts are associated with higher response rates, for both prepaid and postparticipation incentives. For example, Han et al. compared the effects of $5 and $2 prepaid incentives on the response rate to a screener.18 They found the group that received the $5 prepaid incentive had an initial response rate of 43 percent, compared with 36 percent for the group that received the $2 prepaid incentive. Similarly, Griffin et al. found groups that received $5 and $2 prepaid cash incentives had response rates of 58 and 48 percent, respectively.19
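
To make contrasts like these concrete, the sketch below applies a standard two-proportion z-test to the Han et al. percentages. The group sizes (1,000 per arm) are illustrative assumptions, not values reported by the study.

    # Two-proportion z-test for the $5 vs. $2 prepaid contrast.
    # The 43% and 36% response rates are from Han et al.; the group
    # sizes (n = 1,000 per arm) are assumed for illustration only.
    from math import sqrt

    p1, n1 = 0.43, 1000  # $5 prepaid group (assumed n)
    p2, n2 = 0.36, 1000  # $2 prepaid group (assumed n)

    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    print(f"difference = {p1 - p2:.0%}, z = {z:.2f}")  # z ~ 3.2, p < .01

Under these assumed sample sizes, a 7 percentage-point gap is well beyond chance variation, which is why experiments of this size can detect even modest incentive effects.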

Higher postparticipation incentives are also associated with higher response rates. The U.S. Census Bureau has experimented with incentives on the Survey of Income and Program Participation since 1996. Westra et al. conducted the most recent experiment in 2014, comparing $10, $20, and $40 incentive amounts with a $0 control group.20 Overall, the $20 incentive increased Wave 1 response rates compared with the control group, and the $40 incentive increased Wave 1 response rates compared with both the control group and the $20 incentive. These findings held for the subgroup of respondents with low incomes, with response rates of 71 percent, 73 percent, and 77 percent for the $0, $20, and $40 groups, respectively. Brick et al. observed a pronounced effect in their experiment comparing $10 and $5 incentives.21 The response rate for the $10 group was 26 percent, compared with 19 percent for the $5 group.

The National Survey of Family Growth included an incentive experiment in which a $20 postparticipation incentive was contrasted with a $40 postparticipation incentive. The response rate was 62 percent for those offered $20 and 72 percent for those offered $40. Those receiving the higher incentive were also less likely to express objections to or reluctance about participating than those receiving the lower amount.22

The Medical Expenditure Panel Survey included an incentive experiment with $30, $50, and $70 postparticipation incentive amounts. The results showed the composite response rate across all rounds of data collection was higher for the $50 and $70 postparticipation incentive groups than for the $30 group. The difference between the $70 and $50 postparticipation incentive groups was also statistically significant, with composite response rates of 71.1 percent and 66.7 percent, respectively.23

The National Household Survey on Drug Abuse included an experiment comparing the impact of $20 and $40 postparticipation incentives with a $0 control on measures of respondent cooperation, data quality, survey costs, and population substance use estimates. Overall, the $40 postparticipation incentive resulted in a significantly higher response rate than the $20 postparticipation incentive (83 percent versus 79 percent), and the $20 postparticipation incentive resulted in a significantly higher response rate than the $0 control (79 percent versus 69 percent).24

Although higher incentive amounts increase response rates, Mercer et al. found the relationship between incentive amount and response rate is not linear: the marginal increase in the response rate shrinks as the incentive amount grows.25 In a survey of unemployed and dislocated workers, Gemmill et al. found that $50 and $75 postparticipation incentives resulted in significantly higher response rates than a $25 postparticipation incentive.26 However, the $50 incentive was more cost-effective than the $75 incentive.
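
The cost-effectiveness logic can be illustrated with a simple calculation. In the sketch below, every input (sample size, response rates, and follow-up cost) is an assumption chosen to reproduce the qualitative pattern Gemmill et al. describe, not a figure from their study.

    # Illustrative cost-per-complete comparison across incentive amounts.
    # All inputs are assumed: higher incentives lift response with
    # diminishing returns, while each nonrespondent incurs locating costs.
    SAMPLE = 1000
    FOLLOWUP_COST = 100  # assumed cost per nonrespondent follow-up

    assumed_rates = {25: 0.55, 50: 0.65, 75: 0.68}  # incentive -> response rate
    for incentive, rate in assumed_rates.items():
        completes = int(SAMPLE * rate)
        total = incentive * completes + FOLLOWUP_COST * (SAMPLE - completes)
        print(f"${incentive}: {completes} completes, ${total / completes:.2f} each")
    # Under these assumptions the $50 arm has the lowest cost per
    # complete ($103.85, vs. $106.82 at $25 and $122.06 at $75).

Because the response-rate gain from $50 to $75 is small under these assumptions, the extra incentive spending outweighs the follow-up costs it avoids, which is the mechanism behind a mid-range amount being most cost-effective.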

Similarly, a recent experimental study by Kelly et al. randomized participants from an online panel into one of five versions of a recruitment ad: no incentive, a nonmonetary incentive, $25, $50, or $75.27 Results indicated that incentive amounts mattered; the respondents offered $75 were more willing to participate in a qualitative interview than those offered $25, but there were no differences in willingness to participate between the groups offered $50 and $75.

  4. Tokens of Appreciation and Nonresponse Bias

Several studies have found the use of incentives is effective at changing the composition of the sample and potentially reducing nonresponse bias. Lafauve et al. demonstrated that web survey completion rates and respondent representativeness improved after incentives were introduced.28 Before incentives were offered, minorities, individuals with lower incomes and education levels, and those who worked part time or were unemployed were underrepresented among survey respondents.

Groves et al. and Singer and Kulka found that incentives can increase participation among respondents with low incomes and those who are less interested in the research.29, 30 Singer et al. found that a $5 prepaid incentive brought a disproportionate number of low-education respondents into the sample.31 Currivan conducted an experiment examining the impact of providing incentives to telephone survey nonrespondents on sample composition and data quality in the New York Adult Tobacco Survey.32 This study found that offering an incentive to individuals who refused to participate in the survey led to increased proportions of respondents who were over age 55, did not have a college degree, and were not employed.

Studies also suggest incentives may reduce nonresponse bias by bringing in respondents for whom the research topic is not salient or of interest. For example, Groves et al. found that while individuals more involved in their community were more likely to respond to a survey about issues facing the community, a $5 prepaid incentive increased response rates among those who were not involved.33 The incentive was more effective among individuals for whom the topic was less salient: it increased the response rate in the low-community-involvement group by 42 percentage points and in the high-community-involvement group by 16 percentage points. This finding suggests that without the incentive the sample would have been biased toward individuals more interested in the topic, and estimates of community involvement from the survey would have been biased upward. The proposed use of incentives in this study is therefore important for reducing potential nonresponse bias.
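
A worked example may help make this bias mechanism concrete. The sketch below combines the 42- and 16-percentage-point gains reported by Groves et al. with assumed baseline response rates (20 and 50 percent) and an assumed 50/50 population split; only the two gains come from the study.

    # How an incentive can reduce nonresponse bias (illustration).
    # Baseline response rates and the 50/50 population split are
    # assumptions; the +42 and +16 point gains are the effects
    # reported by Groves et al.
    base = {"low": 0.20, "high": 0.50}   # assumed baseline response rates
    gain = {"low": 0.42, "high": 0.16}   # reported incentive effects
    share = {"low": 0.50, "high": 0.50}  # assumed population shares

    boosted = {g: base[g] + gain[g] for g in base}
    for label, rates in [("no incentive", base), ("with incentive", boosted)]:
        resp = {g: share[g] * rates[g] for g in rates}
        est_high = resp["high"] / (resp["low"] + resp["high"])
        print(f"{label}: estimated high-involvement share = {est_high:.1%}")
    # no incentive:   ~71.4% (vs. a true 50%) -> estimate biased upward
    # with incentive: ~51.6% -> bias largely removed under these assumptions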

  B. Household Survey

The use of prepaid and postparticipation incentives will help increase survey response rates, improve the efficiency of data collection, and, most importantly, reduce nonresponse bias. Providing survey participants with a monetary incentive reduces nonresponse bias and improves survey representativeness, especially in low-income populations.34, 35, 36, 37 Tokens of appreciation also improve survey response rates and the number of completed surveys.

Having an adequate number of completed surveys is essential to obtaining reliable estimates. Tokens of appreciation are an essential component of the multipronged approaches used to minimize nonresponse bias, especially in studies of hard-to-reach, low-income households such as those with children, older individuals, those residing in rural areas, and those receiving nutrition assistance benefits.38 Tokens of appreciation reduce the effort required to locate hard-to-reach study participants and lower overall survey costs and the time needed to achieve target completion rates, without affecting data quality.39, 40

Based on the empirical evidence summarized above, the study team is requesting that all sampled individuals and households receive a $5 prepaid cash incentive with the invitation letter and that individuals and households that complete the survey receive a $40 postparticipation incentive provided as a gift card or money on a cash app. The incentive can offset expenses, such as cellular smartphone airtime or internet connectivity charges, that participants may incur.

The total incentive of $45 strikes a balance between encouraging cooperation and efficiently using project resources and is appropriate given the time to complete the survey (up to 40 minutes). Moreover, this amount is in line with other recent Food and Nutrition Service (FNS) studies:

  • The USDA-FNS Evaluation of Food Insecurity Nutrition Incentives (OMB control number: 0584-0616, expiration date: 11/30/2019) included baseline and follow-up surveys with SNAP participants, with a $2 prepaid and a $20 postparticipation incentive for each survey, for a possible total incentive of $44.

  • The USDA-FNS SNAP Employment and Training Registrant and Participant Survey (OMB control number: 0584-0339, expiration date: 1/31/2021) offered respondents a $40 early response incentive for completing the survey on the web and a $20 incentive for completing it on the phone.

  • The USDA-FNS study Evaluation of SNAP Employment and Training Pilots (OMB control number: 0584-0604, expiration date: 1/31/2019) provided a $30 incentive for the first follow-up survey and a $40 incentive for the second follow-up survey.

Within a week of survey completion by web, paper, or phone, the home office staff will mail thank-you letters with incentives. Alternatively, field data collectors will provide incentives in person to respondents from the area probability sample at the time of survey pickup. Field data collectors will obtain signatures from respondents confirming receipt.

  C. In-Depth Interviews With Low-Income Households

While the survey will provide estimates of food security status among a representative sample of households in Puerto Rico, the in-depth interviews will provide critical insight into the social context of food security among low-income households and the coping strategies they use to avoid hunger. The study team is requesting approval to offer interviewees a $50 incentive for their participation in this qualitative data collection effort. This amount is recommended based on several considerations described above. The in-depth interviews will last approximately 60 minutes, an estimated 20 minutes longer than the survey. The in-depth interview also requires the participant to schedule an appointment with the interviewer. The greater time commitment and the need to meet the interviewer at a designated time and place make the in-depth interview more burdensome to participants than the survey.

As discussed, the $50 postparticipation incentive can offset costs associated with participation, such as childcare that may be needed while the participant completes the interview, travel to the interview location if it occurs outside of the participant’s home, and cellular phone and data usage costs associated with scheduling the interview. The incentive also offsets the “opportunity cost” associated with participation; respondents may need to forgo other sources of income to participate in the interview.41

The incentive will be provided in person to in-depth interview participants upon completion of the interview in the form of a gift card or money on a cash app.

  D. Honoraria for Concept-Mapping Participants

Concept-mapping participants will have expertise in policy issues related to NAP and food security in Puerto Rico, familiarity with one or more stages of Puerto Rico’s food and nutrition system, and/or ability to represent the perspectives of key stakeholders in that system. The study team is requesting approval for a $50 honorarium for each 90-minute virtual meeting ($100 total honorarium). The honoraria are intended to encourage participation, as described in section A, and recognize the participants’ unique expertise.

1 Dillman, D. (2000). Mail and internet surveys: The tailored design method (2nd ed.). John Wiley & Sons.

2 Singer, E. (2006). Introduction: Nonresponse bias in household surveys. Public Opinion Quarterly, 70(5), 637–645.

3 Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. Annals of the American Academy of Political and Social Science, 645(1), 112–141.

4 Singer, E., Groves, R. M., & Corning, A. D. (1999). Differential incentives: Beliefs about practices, perceptions of equity, and effects on survey participation. Public Opinion Quarterly, 63(2), 251–260.

5 Singer, E., Van Hoewyk, J., & Maher, M. P. (2000). Experiments with incentives in telephone surveys. Public Opinion Quarterly, 64(2), 171–188.

6 Cantor, D., O’Hare, B. C., & O’Connor, K. S. (2007). The use of monetary incentives to reduce nonresponse in random digit dial telephone surveys. In J. M. Lepkowski, C. Tucker, J. M. Brick, E. D. de Leeuw, L. Japec, P. J. Lavrakas, M. W. Link, & R. L. Sangster (Eds.), Advances in Telephone Survey Methodology (pp. 471–498). John Wiley & Sons.

7 Gearing, M. (2017). Assessment of the barriers that constrain the adequacy of Supplemental Nutrition Assistance Program (SNAP) allotments: Results of incentive experiment [Memo submitted to FNS]. Westat.

8 Messer, B., & Dillman, D. (2011). Surveying the general public over the internet using address-based sampling and mail contact procedures. Public Opinion Quarterly, 75, 429–457.

9 Karakus, M., MacAllum, K., Milfort, R., & Hao, H. (2014). Nutrition assistance in farmers markets: Understanding the shopping patterns of SNAP participants. Westat. https://www.fns.usda.gov/sites/default/files/FarmersMarkets-Shopping-Patterns.pdf

10 Hock, H., Priyanka, A., Mendenko, L., DiGiuseppe, R., & McInerney, R. (2015). The effectiveness of prepaid incentives in a mixed-mode survey. Presentation at Annual Conference of the American Association for Public Opinion Research, Hollywood, FL. http://www.aapor.org/AAPOR_Main/media/AnnualMeetingProceedings/2015/G2-3-Mendenko.pdf

11 Karakus, M., MacAllum, K., Milfort, R., & Hao, H. (2014). Nutrition assistance in farmers markets: Understanding the shopping patterns of SNAP participants. Westat. https://www.fns.usda.gov/sites/default/files/FarmersMarkets-Shopping-Patterns.pdf

12 Hock, H., Priyanka, A., Mendenko, L., DiGiuseppe, R., & McInerney, R. (2015). The effectiveness of prepaid incentives in a mixed-mode survey. Presentation at Annual Conference of the American Association for Public Opinion Research, Hollywood, FL. http://www.aapor.org/AAPOR_Main/media/AnnualMeetingProceedings/2015/G2-3-Mendenko.pdf

13 Mercer, A., Caporaso, A., Cantor, D., & Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79, 105–129.

14 Gearing, M. (2017). Assessment of the barriers that constrain the adequacy of Supplemental Nutrition Assistance Program (SNAP) allotments: Results of incentive experiment [Memo submitted to FNS]. Westat.

15 Messer, B., & Dillman, D. (2011). Surveying the general public over the internet using address-based sampling and mail contact procedures. Public Opinion Quarterly, 75, 429–457.

16 Cantor, D., Wang, K., & Abi-Habib, N. (2003). Comparing promised and prepaid incentives for an extended interview on a random digit dial survey. Presentation at Annual Meeting of the American Association for Public Opinion Research, Nashville, TN.

17 Fredrickson, D. D., Jones, T. I., Molgaard, C. A., Carman, C. G., Schukman, J., Dismuke, S. E., & Ablah, E. (2005). Optimal design features for surveying low-income populations. Journal of Health Care for the Poor and Underserved, 16, 677–690.

18 Han, D., Montaquila, J. M., & Brick, J. M. (2013). An evaluation of incentive experiments in a two-phase address-based sample mail survey. Survey Research Methods, 7(3), 207–218.

19 Griffin, J. M., Simon, A. B., Hulbert, E., Stevenson, J., Grill, J. P., Noorbaloochi, S., & Partin, M. R. (2011). A comparison of small monetary incentives to convert survey non-respondents: A randomized control trial. BMC Medical Research Methodology, 11(1), 1–8.

20 Westra, A., Sundukchi, M., & Mattingly, T. (2015). Designing a multipurpose longitudinal incentives experiment for the Survey of Income and Program Participation. In Proceedings of the 2015 Federal Committee on Statistical Methodology (FCSM) Research Conference.

21 Brick, J. M., Brick, P. D., Dipko, S., Presser, S., Tucker, C., & Yuan, Y. (2007). Cell phone survey feasibility in the US: Sampling and calling cell numbers versus landline numbers. Public Opinion Quarterly, 71(1), 23–39.

22 Nhein, T. (2015). Gemini incentive structure review. U.S. Department of Labor.

23 Agency for Healthcare Research and Quality. (2010). Respondent payment experiment with MEPS panel 13. https://meps.ahrq.gov/data_files/publications/rpe_report/rpe_report_2010.shtml

24 Eyerman, J., & Bowman, K. (2002). 2001 national household survey on drug abuse: Incentive experiment: Combined quarter 1 and quarter 2 analysis. Prepared for the Substance Abuse and Mental Health Services Administration, Contract No. 283-98-9008. RTI.

25 Mercer, A., Caporaso, A., Cantor, D., & Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79, 105–129.

26 Gemmill, E., Nemeth, P., Schochet, P., & Berk, J. (2009). Logos and dollars: How procedural and incentive payment changes can increase response rates. Presentation at Annual Conference of the American Association for Public Opinion Research, Hollywood, FL. https://www.mathematica.org/download-media?MediaItemId={78268504-024B-4CAF-AC20-7C0390FAEEF2}

27 Kelly, B., Margolis, M., McCormack, L., LeBaron, P. A., & Chowdhury, D. (2017). What affects people’s willingness to participate in qualitative research? An experimental comparison of five incentives. Field Methods, 29(4), 333–350. https://doi.org/10.1177/1525822X17698958

28 Lafauve, K., Rowan, K., Koepp, K., & Lawrence, G. (2018). Effect of incentives on reducing response bias in a web survey of parents. Presentation at Annual Conference of the American Association for Public Opinion Research, Denver, CO.

29 Groves, R. M., Couper, M. P., Presser, S., Singer, E., Tourangeau, R., Acosta, G., & Nelson, L. (2006). Experiments in producing nonresponse bias. Public Opinion Quarterly, 70(5), 720–736.

30 Singer, E., & Kulka, R. A. (2002). Paying respondents for survey participation. In M. Ver Ploeg, R. A. Moffitt, & C. F. Citro, Committee on National Statistics, Division of Behavioral and Social Sciences and Education (Eds.), Studies of welfare populations: Data collection and research issues. Panel on data and methods for measuring the effects of changes in social welfare programs (pp. 105–128). National Academy Press.

31 Singer, E., Van Hoewyk, J., & Maher, M. P. (2000). Experiments with incentives in telephone surveys. Public Opinion Quarterly, 64(2), 171–188.

32 Currivan, D. (2005). The impact of providing incentives to initial telephone survey refusers on sample composition and data quality. Presentation at Annual Meeting of the American Association for Public Opinion Research, Miami, FL.

33 Groves, R., Singer, E., & Corning, A. (2000). Leverage-saliency theory of survey participation: Description and an illustration. Public Opinion Quarterly, 64(3), 299–308.

34 Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In R. Groves, D. Dillman, J. Eltinge, & R. Little (Eds.), Survey nonresponse (pp. 163–177). Wiley.

35 James, T. L. (1997). Results of the wave 1 incentive experiment in the 1996 survey of income and program participation. Proceedings of the Survey Research Section of the American Statistical Association, 834–839.

36 Groves, R., Fowler, F., Couper, M., Lepkowski, J., Singer, E., & Tourangeau, R. (2009). Survey methodology (2nd ed.). Wiley.

37 Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. Annals of the American Academy of Political and Social Science, 645(1), 112–141.

38 Bonevski, B., Randell, M., Paul, C., Chapman, K., Twyman, L., Bryant, J., Brozek, I., & Hughes, C. (2014). Reaching the hard-to-reach: A systematic review of strategies for improving health and medical research with socially disadvantaged groups. BMC Medical Research Methodology, 14(42).

39 Dillman, D. (2000). Mail and internet surveys: The tailored design method (2nd ed.). John Wiley & Sons.

40 Singer, E. (2006). Introduction: Nonresponse bias in household surveys. Public Opinion Quarterly, 70(5), 637–645.

41 U.S. Department of Health and Human Services, Office for Human Research Protections, Secretary’s Advisory Committee on Human Research Protections. (2019).

