General Market Campaign: Wave 3 Online Quantitative Study of Reactions to Rough-Cut Advertising Designed to Prevent Youth Tobacco Use
0910-0810
Supporting Statement: Part B
B. STATISTICAL METHODS
Respondent Universe and Sampling Methods
The one-time actual burden figures are listed in Part A.
The primary outcome of this study will be based on a non-random sample of up to 1,292 youth ages 13-17 who are cigarette experimenters or at-risk non-triers. The study uses a cross-sectional design, and participants will be recruited in person from malls across the United States. The screening criteria are based on age, smoking status, and intention to smoke in the future. Because this study is part of formative research for campaign development and planning, these methods are not intended to generate nationally representative samples or precise estimates of population parameters. The sample is designed primarily to provide information on the perceived effectiveness of four advertisements for a third wave of FDA’s The Real Cost (TRC) campaign and to identify any potential unintended consequences of viewing the ads.
Sampling Methods
This study will recruit and screen potential participants via mall intercepts across the United States. Recruiters will partner with 30-50 mall facilities across the nation to recruit a diverse population of potential study participants from all racial/ethnic and socio-economic backgrounds. All participants will be recruited through in-mall intercepts. Eligible participants will be required to self-identify, via the screener form (Attachment C), as having experimented with cigarettes or as being at risk of experimenting with cigarettes in the future. Only participants aged 13-17 will be included in the study. Individuals who have participated in a research study within the past 6 months will be excluded. Participants who indicate that a member of their immediate family or a close friend works for the tobacco industry will also be excluded from the study. Other demographic questions contained in the screener (e.g., race/ethnicity and education) will not be used as inclusion/exclusion criteria, but these data will be included in the final data set for analysis purposes. Participants will not be stratified by race/ethnicity or other demographic characteristics, but demographic questions will be used to ensure that enrollment includes a diverse population of youth. Researchers will not inform ineligible individuals that they are being excluded as a result of anything related to their demographic profile or tobacco use behavior, and researchers will never turn away youth who ask to complete a screener.
Specific cigarette use status inclusion and exclusion criteria are listed below; an illustrative sketch of this decision rule follows the list.
Youth who indicate in the Screener that they satisfy the criteria of an “experimenter” – that is, have smoked at least one puff of a cigarette but have smoked no more than 99 cigarettes in their lifetime – will qualify for study participation.
Youth who indicate in the Screener that they satisfy the criteria of an “at-risk non-trier” – that is, they have never used cigarettes in their lifetime, not even one puff of a cigarette, but answered with an affirmative response to any of the susceptibility questions (i.e., did not answer "definitely not" to all questions) – will qualify for study participation.
Youth who respond that they have never used cigarettes, not even taken a puff of a cigarette, and respond "definitely not" to all questions assessing susceptibility to future smoking will be defined as "non-at-risk non-trier" and be excluded from participation.
Youth who respond that they have smoked more than 99 cigarettes in their lifetime will be designated as established users because they have crossed the threshold of experimenter and will be excluded from participation.
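These criteria amount to a simple decision rule, sketched below for illustration only. The field names (puffed_cigarette, lifetime_cigarettes, susceptibility_answers) are hypothetical and do not correspond to the item wording in the approved screener (Attachment C).

```python
# Illustrative sketch of the cigarette-use screening rule described above.
# Field names are hypothetical; the approved screener (Attachment C) defines
# the actual item wording and response options.

def classify_smoking_status(puffed_cigarette: bool,
                            lifetime_cigarettes: int,
                            susceptibility_answers: list[str]) -> str:
    """Return the cigarette-use category used for eligibility screening."""
    if puffed_cigarette:
        # At least one puff: experimenter (<= 99 lifetime cigarettes)
        # versus established user (> 99 lifetime cigarettes).
        return "experimenter" if lifetime_cigarettes <= 99 else "established user"
    # Never puffed: at risk only if any susceptibility item was answered
    # with something other than "definitely not".
    if all(answer == "definitely not" for answer in susceptibility_answers):
        return "non-at-risk non-trier"
    return "at-risk non-trier"


def is_eligible(status: str) -> bool:
    """Only experimenters and at-risk non-triers qualify for participation."""
    return status in {"experimenter", "at-risk non-trier"}
```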
During the screening process, potential participants will be asked for personal information, including their email address, their parent or guardian’s email address, and sensitive questions about their cigarette use behavior. This information will be used to determine eligibility and to contact potential participants and/or their parents or guardians to provide the parental opt-out, administer the incentive, and administer the survey.
Participants will receive a youth assent form, which they will need to complete before enrollment in the study; they will also provide an email address for a parent or guardian. Using the email address provided by the participant, parents or guardians will be emailed a blank copy of the youth assent form and a copy of the parental opt-out form, which includes instructions for how to un-enroll their child from the study. Parents or guardians will have 24 hours to opt their child out of the study, via telephone or email, before their child is emailed a link to the study. A reminder email will be sent after 12 hours; additionally, if the opt-out email “bounces back,” indicating that the email address is invalid, the youth will not be allowed to participate in this study. If a parent or guardian contacts the study team within 24 hours of screening to opt their child out of the study, the child will be removed from the list of potential participants. If no opt-out is indicated, eligible participants will be emailed a link to the study 24 hours after completing the screener. The email will contain a link that will take the participant directly to the study; survey links are unique and can only be used one time. Qualified participants who do not complete the study within 48 hours will receive a second email reminding them about the study that also contains a link to the study. Attachment F includes the scripts for these emails.
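The timing rules above reduce to a few simple checks at the points where emails would be sent. The sketch below is a minimal illustration under assumed record fields and function names; it does not describe the contractor’s actual fielding system.

```python
# Minimal sketch of the opt-out and reminder timing described above.
# Record fields and function names are hypothetical; they do not describe
# the contractor's actual fielding system.
from datetime import datetime, timedelta

OPT_OUT_WINDOW = timedelta(hours=24)    # parents may opt out within 24 hours
OPT_OUT_REMINDER = timedelta(hours=12)  # opt-out reminder sent after 12 hours
SURVEY_REMINDER = timedelta(hours=48)   # survey reminder if not completed in 48 hours


def should_send_optout_reminder(screened_at: datetime, now: datetime,
                                opted_out: bool) -> bool:
    """A reminder goes to the parent 12 hours into the 24-hour window."""
    return not opted_out and OPT_OUT_REMINDER <= now - screened_at < OPT_OUT_WINDOW


def may_send_survey_link(screened_at: datetime, now: datetime,
                         opted_out: bool, email_bounced: bool) -> bool:
    """The survey link is sent only after the 24-hour window closes, with no
    opt-out received and no bounce of the parental email."""
    return (now - screened_at >= OPT_OUT_WINDOW
            and not opted_out and not email_bounced)


def should_send_survey_reminder(link_sent_at: datetime, now: datetime,
                                completed: bool) -> bool:
    """One reminder email is sent if the survey is not completed within 48 hours."""
    return not completed and now - link_sent_at >= SURVEY_REMINDER
```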
Parental opt-out methodology has been used by CTP for the past 3 years on several IRB- and OMB-approved studies. We queried the IRB about this approach, and it did not raise any ethical concerns. Because this study includes only youth ages 13-17, the opt-out procedure is used only for participants who are 13 years old or older. We have examined and observed similar proportions of parental permission in cases where parental consent is required and in cases where we use the 24-hour opt-out. Furthermore, we have seen parents of 13- to 17-year-old youth opt their children out of studies, indicating that the waiting period works. Over the past three years, we have not received any complaints from participants, parents, or IRB administrators about using an opt-out approach.
Further, due to the target population of this study, traditional written parental consent procedures would screen out the very subjects most appropriate for the aims of this study. Many youth who smoke or are at risk for smoking are unlikely to seek out parental consent or have parents who provide written consent for their children’s participation in prevention programs, making the evaluation of such programs problematic (Levine, 1995; Pokorny et al., 2001; Unger et al., 2004; Severson and Ary, 1983). Demonstrating this point, there is consistent evidence of quantifiable differences in the characteristics of youth who participate in smoking cessation research when traditional written consent is required compared to when parental consent is waived, including differences in participant demographics and smoking history. For instance, Kearney et al. (1983) found that an explicit written consent procedure produced a sample that was approximately half the size of the eligible population and over-represented White students while under-representing Black and Asian American students. Anderman et al. (1995) found differences between 9th- and 12th-grade students with and without written parental consent for a sensitive health survey: participants with written consent were more likely to be White, live in two-parent households, and have a grade point average of “B” or above, and cigarette smoking was also less prevalent in the written consent group. Severson and Ary (1983) found that youth participants whose parents provided consent were more likely to be nonsmokers than participants without consent.
Because obtaining written consent for at-risk youth would result in a sample with different characteristics than the target group, a 24-hour parental opt-out approach is being requested for participants (and has already received IRB approval).
Sample Size
To determine the total sample size needed for this study, an a priori power analysis was conducted based on a single advertisement-exposure group and a single non-advertisement-viewing group, and the resulting sample size was scaled to account for testing a total of four advertisements. With four advertisement groups and one control group, the study will include five groups. Assuming a two-tailed test, a small effect size (Cohen’s d = 0.20), and an alpha of 0.05, the required sample size to achieve a power of 0.80, which is generally considered adequate in social science research, is N=1,230 (246 participants in each of the five groups). Because study enrollment will occur simultaneously in multiple malls and geographic locations, it is possible that over-enrollment will occur. Accordingly, a five percent buffer (n=62) will be incorporated into the anticipated N, so that final enrollment will fall between N=1,230 (to achieve adequate power) and a maximum of N=1,292. Based on prior waves of copy testing for The Real Cost campaign, it is anticipated that roughly three youth will need to be screened for every one study participant. Thus, it is estimated that 3,876 potential respondents will need to be screened for this study.
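For reference, the buffer and screening arithmetic above can be reproduced directly, and the power claim can be checked approximately. The sketch below is illustrative only: it assumes, as a simplification of this sketch rather than a statement of the study’s actual computation, that the five-group design is powered as a one-way comparison with a small standardized effect (Cohen’s f = 0.10, the analogue of d = 0.20 for a two-group contrast).

```python
# Illustrative check of the sample-size figures described above.
# Assumption (of this sketch, not stated in the text): the five-group design
# is powered as a one-way comparison with a small effect, Cohen's f = 0.10,
# which corresponds to d = 0.20 for a two-group contrast.
import math
from statsmodels.stats.power import FTestAnovaPower

per_group = 246
groups = 5
n_powered = per_group * groups                      # 1,230

# Approximate power achieved at N = 1,230 under the assumed small effect size.
achieved_power = FTestAnovaPower().power(
    effect_size=0.10, nobs=n_powered, alpha=0.05, k_groups=groups)
print(f"Approximate power at N={n_powered}: {achieved_power:.2f}")  # about 0.81

# Five percent over-enrollment buffer and 3:1 screening ratio.
buffer = math.ceil(n_powered * 0.05)                # 62
n_max = n_powered + buffer                          # 1,292
screened = n_max * 3                                # 3,876
print(buffer, n_max, screened)
```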
Procedures for the Collection of Information
Qualified participants will be randomly assigned to either view an advertisement or not view an advertisement and will complete the questionnaire form (Attachment D). Participants selected to view an advertisement will first be provided with a set of questions about their exposure to tobacco use and their own cigarette use. Following these questions, participants will view an advertisement and will then be prompted to complete a questionnaire designed to assess whether the advertisement provides an understandable and engaging message about the harms of cigarette use. Advertisements will average 30 seconds in length. Participants in the ad-viewing group will be randomly assigned to view one of a total of four advertisements being assessed in this study.
Following completion of the advertisement questionnaire, participants will be provided with general questions about their attitudes and beliefs about the harms of cigarette use as well as additional demographic questions. The questions targeting general attitudes and beliefs about the harms of cigarette use will be used to assess potential unintended consequences.
Participants selected not to view any advertisements will only be asked questions about their exposure to tobacco use and their own cigarette use and about their attitudes and beliefs around the harms of cigarette use, followed by additional demographic questions. Participants who do not view any advertisements are included to provide a comparison group for assessing unintended consequences.
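For illustration, the assignment scheme described above (four advertisement conditions plus a no-ad control, assigned with equal probability) could be sketched as follows; the survey platform’s actual randomization module is not specified in this document, and the condition labels below are hypothetical.

```python
# Illustrative sketch of random assignment to the five study conditions
# (four advertisements plus a no-ad control); the survey platform's actual
# randomization module is not specified in this document.
import random

CONDITIONS = ["ad_1", "ad_2", "ad_3", "ad_4", "control"]


def assign_condition(rng: random.Random) -> str:
    """Assign a qualified participant to one condition with equal probability."""
    return rng.choice(CONDITIONS)


rng = random.Random(2024)  # arbitrary seed, for a reproducible illustration
assignments = [assign_condition(rng) for _ in range(1230)]
```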
All of the above-mentioned research activities will be conducted on a password-protected website. Table 5 indicates the variables to be assessed during the questionnaire and the participant groups that will be exposed to these survey items.
Table 5. Structure of the Copy Testing Process and Questionnaire
Action or Variable | Description | Presented to Ad-Viewing Participants | Presented to Control Participants
Ad exposure | Each of the ad-viewing participants will view an advertisement. | X |
Tobacco use and peer tobacco use | All participants will respond to items on household tobacco use, peer cigarette use, and participant cigarette use. | X | X
Perceived ad effectiveness | Ad-viewing participants will respond to items that assess perceived ad effectiveness, presented immediately following the ad. | X |
Tobacco-related knowledge, attitudes, and beliefs | All participants will respond to items that assess knowledge, attitudes, and beliefs about tobacco. | X | X
Unusual Problems Requiring Specialized Sampling Procedures
No specialized sampling procedures are involved.
Use of Periodic Data Collection Cycles to Reduce Burden
This is a one-time survey data collection effort.
Methods to Maximize Response Rates
General Methods to Reduce Non-Response & Drop-Off
Several features of this study have been designed to maximize participant response rates and questionnaire completion.
Incentives: As participants often have competing demands on their time, incentives are used to encourage participation in research. Numerous empirical studies have shown that incentives can significantly increase response rates in cross-sectional surveys and reduce attrition in longitudinal surveys (e.g., Abreu & Winters, 1999; Castiglioni, Pforr, & Krieger, 2008; Jäckle & Lynn, 2008; Shettle & Mooney, 1999; Singer, 2002). This study will use an incentive totaling $20 per participant to provide enough motivation to participate in the study rather than in another activity. Participants who do not take the survey will not receive the incentive. In accordance with IRB requirements, participants who begin the study but have to leave for personal reasons, or because they become uncomfortable, will still receive the incentive.
Reminders: Qualified participants who do not complete the study within 48 hours of being emailed the study link will receive a second email reminding them about the study. These reminder messages will also include the unique link to the survey, enabling youth to easily complete the questionnaire. These reminders are intended to decrease non-response by ensuring that youth have the necessary information to complete the questionnaire and by encouraging youth who do not initially complete the questionnaire to do so before the conclusion of data collection.
Parental Opt-Out: A parental opt-out approach will be utilized. Due to the target population of this study, traditional written parental consent procedures would discourage participation among the very subjects most appropriate for the aims of this study. Many youth who smoke or are at risk for smoking are unlikely to seek out parental consent or have parents who provide written consent for their children’s participation in prevention programs (Levine, 1995; Pokorny et al., 2001; Unger et al., 2004; Severson and Ary, 1983). Demonstrating this point, there is consistent evidence of quantifiable differences in the characteristics of youth who participate in smoking cessation research when traditional written consent is required compared to when parental consent is waived, including differences in participant demographics and smoking history (Kearney et al., 1983; Anderman et al., 1995; Severson and Ary, 1983). Utilizing a parental opt-out approach will remove a barrier that might discourage the target audience from returning to complete the questionnaire, thereby reducing non-response. Parental opt-out methodology has been used by CTP for the past 3 years on several IRB- and OMB-approved studies. We queried the IRB about this approach, and it did not raise any ethical concerns. Because this study includes only youth ages 13-17, the opt-out procedure is used only for participants who are 13 years old or older. We have examined and observed similar proportions of parental permission in cases where parental consent is required and in cases where we use the 24-hour opt-out. Furthermore, we have seen parents of 13- to 17-year-old youth opt their children out of studies, indicating that the waiting period works. Over the past three years, we have not received any complaints from participants, parents, or IRB administrators about using an opt-out approach.
Online Completion: Participants will be emailed a link to complete the questionnaire online on their own device. Because youth can complete the questionnaire on their own time and on their own devices, study participation will be more convenient and youth will be more apt to complete the survey. This will also decrease time and costs related to recruitment. This technology also permits participants to complete the instruments in privacy. Providing participants with a methodology that improves privacy makes the reporting of potentially embarrassing or stigmatizing behaviors (e.g., tobacco use) less threatening and enhances response validity and response rates.
Test of Procedures or Methods
The campaign contractor, FCB, and their subcontractor, Marketing Workshop, will conduct rigorous internal testing of the electronic survey instruments prior to their fielding. Trained researchers will review the screeners and questionnaire to verify that instrument skip patterns are functioning properly, delivery of campaign media materials is working properly, and that all survey questions are worded correctly and are in accordance with the instrument approved by OMB.
Individuals Consulted on Statistical Aspects and Information Collection
The following individuals inside the agency have been consulted on the design of the copy testing plan, survey development, or intra-agency coordination of information collection efforts:
Tesfa Alexander
Office of Health Communication & Education
Center for Tobacco Products
Food and Drug Administration
10903 New Hampshire Avenue
Silver Spring, MD 20993-0002
Phone: 301-796-9335
E-mail: [email protected]
Gem Benoza
Office of Health Communication & Education
Center for Tobacco Products
Food and Drug Administration
10903 New Hampshire Avenue
Silver Spring, MD 20993
Phone: 240-402-0088
E-mail: [email protected]
Michael Murray
Office of Health Communication & Education
Center for Tobacco Products
Food and Drug Administration
10903 New Hampshire Avenue
Silver Spring, MD 20993
Phone: 301-796-4234
E-mail: [email protected]
Matthew Walker
Office of Health Communication & Education
Center for Tobacco Products
Food and Drug Administration
10903 New Hampshire Avenue
Silver Spring, MD 20993
Phone: 240-402-3824
E-mail: [email protected]
The following individuals outside of the agency have been consulted on questionnaire development.
Dimas Adiwiyoto
Account Director
FCB
100 West 33rd Street
New York, NY 10001
Phone: 212-885-3283
Email: [email protected]
David Cortés
Senior Strategic Planner
FCB
100 West 33rd Street
New York, NY 10001
Phone: 212-885-3743
Email: [email protected]
Mark Hall
Senior Strategic Planner
FCB
100 West 33rd Street
New York, NY 10001
Phone: 212-885-3372
Email: [email protected]
Xiaoquan Zhao
Department of Communication
George Mason University
Robinson Hall A, Room 307B
4400 University Drive, 3D6
Fairfax, VA 22030
Phone: 703-993-4008
E-mail: [email protected]
References
Abreu, D. A., & Winters, F. (1999). Using monetary incentives to reduce attrition in the survey of income and program participation. In Proceedings of the Survey Research Methods Section (pp. 533-538).
Anderman, C., Cheadle, A., Curry, S., Diehr, P., Shultz, L., & Wagner, E. (1995). Selection bias related to parental consent in school-based survey research. Evaluation Review, 19, 663–674.
Asch, D. A., Christakis, N. A., & Ubel, P. A. (1998). Conducting physician mail surveys on a limited budget: a randomized trial comparing $2 bill versus $5 bill incentives. Medical Care, 36(1), 95-99.
Beebe, T. J., Davern, M. E., McAlpine, D. D., Call, K. T., & Rockwood, T. H. (2005). Increasing response rates in a survey of Medicaid enrollees: the effect of a pre-paid monetary incentive and mixed modes (mail and telephone). Medical Care, 43(4), 411-414.
Castiglioni, L., Pforr, K., & Krieger, U. (2008, December). The effect of incentives on response rates and panel attrition: Results of a controlled experiment. In Survey Research Methods (Vol. 2, No. 3, pp. 151-158).
Coughlin, S. S., Aliaga, P., Barth, S., Eber, S., Maillard, J., Mahan, C., & Williams, M. (2013). The effectiveness of a monetary incentive on response rates in a survey of recent US veterans. Survey Practice, 4(1).
Dirmaier, J., Harfst, T., Koch, U., & Schulz, H. (2007). Incentives increased return rates but did not influence partial nonresponse or treatment outcome in a randomized trial. Journal of Clinical Epidemiology, 60(12), 1263-1270.
Dykema, J., Stevenson, J., Kniss, C., Kvale, K., Gonzalez, K., & Cautley, E. (2012). Use of monetary and nonmonetary incentives to increase response rates among African Americans in the Wisconsin Pregnancy Risk Assessment Monitoring System. Maternal and Child Health Journal, 16(4), 785-791.
Gajic, A., Cameron, D., & Hurley, J. (2012). The cost-effectiveness of cash versus lottery incentives for a web-based, stated-preference community survey. The European Journal of Health Economics, 13(6), 789-799.
Graham, A. L., Milner, P., Saul, J. E., & Pfaff, L. (2008). Online advertising as a public health and recruitment tool: comparison of different media campaigns to increase demand for smoking cessation interventions. Journal of Medical Internet Research, 10(5), e50.
Han, D., Montaquila, J. M., & Brick, J. M. (2012). An Evaluation of Incentive Experiments in a Two-Phase Address-Based Sample Mail Survey. In Proceedings of the Survey Research Methods Section of the American Statistical Association.
Jäckle, A., & Lynn, P. (2008). Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias. Techniques d’enquête, 34(1), 115-130.
Kearney, K.A., Hopkins, R.H., Mauss, A.L., & Weisheit, R.A. (1983). Sample bias resulting from a requirement for written parental consent. Public Opinion Quarterly, 47, 96-102.
Lane, T.S., Armin, J., & Gordon, J. S. (2015). Online recruitment methods for web-based and mobile health studies: A review of the literature. Journal of Medical Internet Research, 17(7), e183.
Levine, R.J. (1995) Adolescents as research subjects without permission of their parents or guardians: Ethical considerations. Journal of Adolescent Health, 17(5), 289-297.
Messer, B. L., & Dillman, D. A. (2011). Surveying the general public over the internet using address-based sampling and mail contact procedures. Public Opinion Quarterly, 75(3), 429-457.
Montaquila, J. M., Brick, J. M., Williams, D., Kim, K., & Han, D. (2013). A study of two-phase mail survey data collection methods. Journal of Survey Statistics and Methodology, 1(1), 66-87.
Pew Research Center. (2015). Teens, Social Media & Technology Overview 2015. Washington, DC: Pew Research Center.
Pokorny, S. B., Jason, L. A., Schoeny, M. E., Townsend, S. M., & Curie, C. J. (2001). Do participation rates change when active consent procedures replace passive consent. Evaluation Review, 25(5), 567-580.
Racial and Ethnic Approaches to Community Health Across the U.S. (REACH U.S.) Evaluation (OMB Control Number 0920-0805). Report on Incentive Experiments. www.reginfo.gov/public/do/DownloadDocument?objectID=30165501. Accessed February 1, 2016.
Severson, H.H., & Ary, D.V. (1983). Sampling bias due to consent procedures with adolescents. Addictive Behaviors, 8(4), 433–437.
Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. Survey nonresponse, 51, 163-177.
Shettle, C., & Mooney, G. (1999). Monetary incentives in US government surveys. Journal of Official Statistics, 15(2), 231.
Stevenson, J., Dykema, J., Kniss, C., Black, P., & Moberg, D. P. (2011). Effects of mode and incentives on response rates, costs and response quality in a mixed mode survey of alcohol use among young adults. In annual meeting of the American Association for Public Opinion Research, May, Phoenix, AZ.
Ulrich, C.M., Danis, M., Koziol, D., Garrett-Mayer, E., Hubbard, R., & Grady, C. (2005). Does it pay to pay? A randomized trial of prepaid financial incentives and lottery incentives in surveys of nonphysician healthcare professionals. Nursing Research, 54(3), 178-183.
Unger, J.B., Gallaher, P., Palmer, P.H., Baezconde-Garbanati, L., Trinidad, T.R., Cen, S., & Johnson, C.A. (2004). No news is bad news: Characteristics of adolescents who provide neither parental consent nor refusal for participation in school-based survey research. Evaluation Review, 28(1), 52-63.
Warriner, K., Goyder, J., Gjertsen, H., Hohner, P., & McSpurren, K. (1996). Charities, no, lotteries, no, cash, yes: Main effects and interactions in a Canadian incentives experiment. Paper presented at the Survey Non-Response Session of the Fourth International Social Science Methodology Conference, University of Essex, Institute for the Social Sciences, Colchester, UK.