Study of Oncology Indications in Direct-to-Consumer Television Advertising


(OMB Control Number 0910-0885)


Change Request (83-C)


August 11, 2020


Because of COVID-19 concerns, we plan to conduct the cognitive interviews remotely rather than in person. This change required some revisions to the screener and consent form; because the revised documents are similar in length to the originals, the burden estimate has not changed. This change request affects only the cognitive interviews with 18 participants; the remaining data collection is unaffected. Please see below for the revised incentive amount and justification.


Amount and Justification for the Proposed Incentive

We propose an incentive amount of $50 for the 60-minute remote interviews. This amount is based on participants spending approximately 90 minutes on this effort: online and phone screening (5 minutes), testing the platform (10 minutes), the interview itself (60 minutes), and logging in 15 minutes early to confirm that the technology is working. This token of appreciation is intended to provide enough incentive to participate in the study rather than another activity, improve the quality of the study data, reduce the chance of cancellations or insufficient recruitment, recognize the burden of childcare costs, and convey appreciation for contributing to this important activity.1 Incentives must be high enough to offset the burden placed on respondents with respect to their time and the cost of participation.

The Bureau of Labor Statistics (BLS) calculated that the average hourly wage of employees, including benefits, in March 2020 was $37.73.2 At that hourly rate, compensation for 90 minutes is approximately $57. Although the incentive is a token of appreciation rather than a wage, this estimate represents what participants would earn if they spent the same amount of time working at a job, which helps gauge an appropriate incentive amount.
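For reference, the approximately $57 figure follows directly from applying the BLS hourly compensation rate to the 90-minute (1.5-hour) time commitment; the calculation below is illustrative and rounded to the nearest dollar.

\[
\$37.73/\text{hour} \times 1.5\ \text{hours} = \$56.60 \approx \$57
\]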

Given the current pandemic, it is worth emphasizing that virtually all qualitative research is currently being conducted remotely. The convenience typically associated with virtual (remote) sessions therefore no longer sets this study apart, and participants do not derive any special benefit from joining a remote (versus an in-person) interview. They still have to coordinate childcare, set aside time, find a private and quiet location, and so on. For example:


  • Participants are required to join the interview from a quiet location with no distractions, which may require childcare or other special accommodations during that time. BLS calculated that the average hourly wage of childcare workers in May 2018 was $11.65.3

  • The cognitive interviews will be conducted online, so participants must have a computer and broadband Internet access to participate; participating will use at least 60 minutes of data on their Internet plans.

The proposed incentive amount is significantly below the market rate for an effort of this type. Recruiting firms and researchers determine market rates for research participation based on what comparable studies in the field are offering and what rate will persuade the required population to participate. Shugoll (the subcontractor conducting recruitment) and other vendors estimate that studies currently being conducted with similar populations and levels of effort in this market pay incentives of $100-$175. For example, it is not uncommon for companies to pay $150 for 45-minute remote interviews.4 On the lower end of the scale, one online market research firm suggests paying participants at least twice their hourly wage (for this study, that would be about $75 for a one-hour online interview; see the illustrative calculation below).5 In sum, the proposed incentive is in accordance with standard practice and is based on RTI’s experience recruiting similar populations, the amount of time participants spend in the study, what is required of them, recent consultation with market research firms, and OMB-approved incentives on recent FDA projects.
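As an illustrative check on that lower-end benchmark, doubling the BLS hourly compensation rate cited above for a one-hour interview yields roughly the $75 figure:

\[
2 \times \$37.73/\text{hour} \times 1\ \text{hour} = \$75.46 \approx \$75
\]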

In reviewing OMB’s guidance on the factors that may justify provision of incentives to research participants, we have determined that the following principles apply:


  1. The incentive amount will help reduce costs.

OMB’s guidance states that “If prior or similar surveys have devoted considerable resources to nonresponse follow-up, it may be possible to demonstrate that the cost of incentives will be less than the costs of extensive follow-up.”6

Existing evidence demonstrates that monetary incentives have a robust, positive effect on survey participation across different survey modes and target populations.7,8,9 Empirical studies have established that larger incentives (e.g., $100) perform significantly better than smaller incentives (e.g., $50, $20).10,11 Singer et al.7 found that each dollar of incentive paid resulted in a one-third of a percentage point increase in response rate compared with a no-incentive condition.

In our experience and that of other researchers, offering lower incentives necessitates over-recruitment at higher rates, lengthens the recruiting period, and can add unnecessary costs to the budget. Consequences of insufficient incentives include increased time and cost of recruitment, increased “no-show” rates, and an increased probability of cancelled or postponed interviews. Offering an incentive below $50 may therefore result in higher recruiting costs and greater burden to the public because of the need for additional recruitment.

  2. An incentive will improve data quality by improving validity and reliability.

OMB’s guidance states that a “justification for requesting use of an incentive is improvement in data quality. For example, agencies may be able to provide evidence that, because of an increase in response rates, an incentive will significantly improve validity and reliability to an extent beyond that possible through other means.”6

It will be critical to maximize the number of people who respond to ensure a good mix of gender, race, and education among the interview participants. Using incentives to bring in a cross-section of consumers can reduce nonresponse bias if those participants (e.g., people less interested in the topic, men, members of minority groups, higher-income individuals) give different responses and feedback than those who would participate without incentives.12 For example, research has found that incentives are necessary to attract a reasonable cross-section of participants, reflecting diversity in age, income, and education.13 In particular, incentives can reduce nonresponse bias for groups defined by low income, low educational attainment, gender, and non-white race.2,14,15 Additionally, leverage-salience theory argues that monetary incentives can help recruit people who otherwise might not be motivated to respond (e.g., people who do not care about the topic, lack altruistic motives for responding, or have competing obligations)16,17 or who are typically less likely to participate in research.18


  3. Similar incentives were previously approved under recent OMB packages.

According to item 76 in the Memorandum for the President’s Management Council, past experience can be used to justify a higher incentive: “Agencies may be able to justify the use of incentives by relating past survey experience, results from pretests or pilot tests, or findings from similar studies.”6

Not only is the proposed incentive of $50 significantly lower than the market rate, but it is also consistent with what OMB has approved in recent years for remote interviews or focus groups.

Below are higher incentives that have also been approved for virtual studies of 60 minutes or less.

  • FDA Boxed Warnings Study (OMB exempt because it falls under the 21st Century Cures Act; 2020)

    • $50 incentive for 30-minute remote interviews (which would translate to $100 for a 60-minute interview)

  • CFPB’s Consumer Response Intake Form Improvement Study, Second Iteration (Intake Form Improvement Study II) (OMB Control number 3170-0042; 2017)

    • $75 for 60-minute remote interviews

There are also examples of 90-minute remote focus groups that had an incentive of $75 (which would translate to $50 for 60 minutes) and 60-minute remote interviews with an incentive of $50. These studies include:

  • FDA Biosimilars Patient Study (OMB control number 0910-0695; 2019)

    • $75 incentive for 90-minute virtual focus groups

  • FDA Studies to Enhance FDA Communications Addressing Opioids and Other Potentially Addictive Pain Medications (OMB control number 0910-0695; 2016)

    • $75 incentive for 90-minute virtual focus groups

  • School Survey on Crime and Safety (SSOCS) 2016 and 2018 (OMB control number 1850-0761; 2015)

    • $50 incentive for 60-minute remote interviews

  4. This incentive is consistent with incentives used in remote interviews conducted by the contractor (RTI) and its recruitment vendor.

OMB’s guidance states that agencies may justify the use of incentives by “relating past survey experience.”6


The contractor (RTI) and its recruitment vendor are experts in their field. RTI has consulted with several research firms with experience recruiting for and hosting qualitative research across multiple markets (Schlesinger Group, L&E Research, Shugoll, Focus Pointe Global, Plaza Research, Fieldwork), including those engaged for the current study. All of the contacted research firms have extensive experience working on government-funded studies and understand the processes for working within the parameters of these studies, including incentive parameters. In their experience, an incentive of $75 is the most efficient way to recruit 18 consumers for a 60-minute remote interview. Nonetheless, the recruitment vendor has identified ways to increase the viability of the project at a $50 incentive by extending the recruitment period, allowing weekend interviews, and not restricting where participants live.



1 Russell, M. L., Moralejo, D. G., & Burgess, E. D. (2000). Paying research subjects: Participants’ perspectives. Journal of Medical Ethics, 26(2), 126-130.

2 U.S. Bureau of Labor Statistics (BLS), “Civilian workers by occupational and industry group,” March 2020, Table 2, total compensation for civilian workers: http://www.bls.gov/ncs/ (visited July 21, 2020).

3 BLS, Occupational Outlook Handbook, Childcare Workers, on the Internet at https://www.bls.gov/ooh/personal-care-and-service/childcare-workers.htm (visited July 21, 2020).

4 Lai, H. C., & Wirasinghe, R. (2017, May). Applied Research for Advertising Products: Tactics for Effective Research. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (pp. 1144-1151).

6 Office of Management and Budget. (2006). Questions and Answers When Designing Surveys for Information Collections. Washington, DC: Office of Information and Regulatory Affairs. Retrieved from: https://obamawhitehouse.archives.gov/sites/default/files/omb/inforeg/pmc_survey_guidance_2006.pdf

7 Singer, E., Van Hoewyk, J., Gebler, N., & McGonagle, K. (1999). The effect of incentives on response rates in interviewer-mediated surveys. Journal of Official Statistics, 15(2), 217.

8 Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 112-141.

9 Church, A. H. (1993). Estimating the effect of incentives on mail survey response rates: A meta-analysis. Public Opinion Quarterly, 57(1), 62-79.

10 Hsu, J. W., Schmeiser, M. D., Haggerty, C., & Nelson, S. (2017). The effect of large monetary incentives on survey completion: Evidence from a randomized experiment with the survey of consumer finances. Public Opinion Quarterly, 81(3), 736-747.

11 Church, A. H. (1993). Estimating the effect of incentives on mail survey response rates: A meta-analysis. Public Opinion Quarterly, 57(1), 62-79.

12 Castiglioni, L., & Pforr, K. (2017). The effect of incentives in reducing non-response bias in a multi-actor survey. Presented at the 2nd annual European Survey Research Association Conference, Prague, Czech Republic.

13 Willis, G. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage.

14 Groves, R. M., Dillman, D. A., Eltinge, J., & Little, R. A. (2001). Survey Nonresponse. New York, NY: Wiley.

15 Singer, E. & Kulka, R. A. (2002). Paying Respondents for Survey Participation. In M. Ver Ploeg, R. A. Moffitt, & C. F. Citro (Eds.), Studies of Welfare Populations: Data Collection and Research Issues, Washington, D.C.: National Academy Press.

16 Groves, R. M., Presser, S., & Dipko, S. (2004). The role of topic interest in survey participation decisions. Public Opinion Quarterly, 68(1), 2-31.

17 Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 112-141.

18 Guyll, M., Spoth, R., & Redmond, C. (2003). The effects of incentives and research requirements on participation rates for a community-based preventive intervention research study. Journal of Primary Prevention, 24(1), 25-41.
