Information Collection Request
Reinstatement with Change
Extended Evaluation of the National Tobacco Prevention and Control
Public Education Campaign
OMB Control No. 0920-1083
Supporting Statement: Part B
Program Official/Contact
Rebecca Murphy, PhD, MPH
Office on Smoking and Health
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
4770 Buford Highway, NE MS F-79
Atlanta, Georgia 30341
OFFICE: 770-488-8964
FAX: 770-488-5939
E-mail: [email protected]
October 17, 2017
TABLE OF CONTENTS
B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
B1. Respondent Universe and Sampling Methods
B2. Procedures for the Collection of Information
B3. Methods to Maximize Response Rates and Deal with Nonresponse
B4. Tests of Procedures or Methods to be Undertaken
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or
Analyzing Data
ATTACHMENTS
A-1. Public Health Service Act
A-2. Family Smoking Prevention and Tobacco Control Act
A-3. PPHF
B. 60-Day Federal Register Notice
C. Screening & Consent Questionnaire Screenshots—English and Spanish
D-1. Waves A-E Smoker Survey Screenshots—English and Spanish
D-2. Changes to Waves A-E Smoker Survey
E-1. Waves A-E Nonsmoker Survey Screenshots—English and Spanish
E-2. Changes to Waves A-E Nonsmoker Survey
F. KnowledgePanel Recruitment Procedures
G. RTI IRB Approval Notice
H. GfK Privacy and Security Procedures
I-1. ABS Sample Invitation Letter
I-2. KnowledgePanel Email Invitation and Reminders
I-3. ABS Sample Postcard Reminders
J. Source of Respondents for Each Wave of Data Collection
Section B: COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
B.1 Respondent Universe and Sampling Methods
Five waves of information collection (Waves A-E) will be conducted over approximately 16-18 months within a 2-year clearance period to facilitate evaluation of Phases 6 and 7 of CDC’s National Tobacco Prevention and Control Public Education Campaign (The Campaign) and to help inform the evaluation of potential future phases of The Campaign. The target population for the surveys consists of non-institutionalized adults age 18 and over residing in the United States. The total universe of this population is estimated at approximately 235 million adults. The sample will be divided into smoker and nonsmoker groups; within this universe, there are an estimated 38 million smokers and 197 million nonsmokers.
The participants for these surveys will be recruited from two sources: (1) an online longitudinal cohort of adult smokers and nonsmokers, sampled randomly from postal mailing addresses in the United States (address-based sample, or ABS); and (2) the existing GfK KnowledgePanel, an established long-term online panel of U.S. adults. The ABS-sourced sample of smokers and nonsmokers will serve as the core sample upon which estimates of key outcomes will be made. ABS-sourced participants will make up approximately 65% of the total sample across smokers and nonsmokers (the remaining 35% will come from KnowledgePanel). The use of the ABS-sourced sample will increase the coverage of the core sample and will alleviate possible concerns about “panel conditioning.” This sample will be recruited by GfK using recruitment methods nearly identical to those used to recruit KnowledgePanel (see Attachment F). The GfK KnowledgePanel (KP) will be used in combination with the ABS-sourced cohort to support larger sample sizes that allow for more in-depth subgroup analysis (e.g., by race/ethnicity, income, and education), which is a key objective for CDC. All online surveys, regardless of sample source, will be conducted via the GfK KnowledgePanel Web portal for self-administered surveys.
Power analyses were conducted to determine the number of respondents necessary to detect anticipated changes in outcomes as a function of campaign exposure. These analyses were informed by previous studies of earlier phases of The Campaign. In addition, we examined the existing evaluation literature to determine the expected effect size for the outcome of making a quit attempt (Farrelly et al., 2012; McAfee et al., 2013). Based on these analyses, we have powered the study to detect an underlying odds ratio of 1.18 between campaign exposure and the likelihood of a quit attempt. Previous media evaluations of earlier phases of The Campaign have quantified impacts of similar magnitude on the likelihood of a quit attempt. We have conservatively powered the sample to detect this effect with approximately 85% power among smokers in the sample; the sample is slightly overpowered to guard against the possibility of smaller effect sizes as The Campaign matures. For nonsmokers, we estimated statistical power based on data from RTI’s 2012 Tips evaluation (McAfee et al., 2013), which showed that The Campaign was associated with increased communication about the dangers of smoking, with an estimated odds ratio of 1.16 between Campaign exposure and the likelihood of nonsmokers’ communication. We estimate that we will have approximately 80% power to detect an effect of this magnitude based on a maintained nonsmoker cohort size of at least 2,000 respondents per wave. These required sample sizes reflect the total cohort size for smokers and nonsmokers after attrition and replenishment at each wave. Our power analyses also account for an estimated intracluster correlation of 0.008 across markets, variation in market-level sample sizes, and a variance inflation factor of approximately 1.58, estimated from earlier waves of survey data. Table B.1.1 summarizes the detailed power calculations for smokers and nonsmokers.
Table B.1.1. Sample Power and Detectable Effects on Outcomes Among Smoker (n=4,000) and Nonsmoker (n=2,000) Cohorts
Odds Ratio for Relationship Between Campaign Exposure and Likelihood of Quit Attempt | Effect Size (Pre-Post % Point Change in Outcome) | Estimated Power for Smokers (n = 4,000) | Estimated Power for Nonsmokers (n = 2,000)
1.18 | 3.8% | 85.3% | ----
1.17 | 3.7% | 83.4% | ----
1.16 | 3.6% | 81.3% | 79.5%
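To make the design-effect adjustment in the power calculations above concrete, the sketch below computes the variance inflation factor implied by the reported intracluster correlation of 0.008 and the resulting effective sample size for the 4,000-smoker cohort. The number of media markets and the coefficient of variation of market-level sample sizes are illustrative assumptions chosen to be consistent with the reported factor of approximately 1.58; they are not taken from the supporting statement.

```python
# Illustrative design-effect and effective-sample-size calculation based on the
# parameters reported above: an intracluster correlation of 0.008 across media
# markets and a variance inflation factor of approximately 1.58. The number of
# markets and the coefficient of variation (CV) of market-level sample sizes are
# assumed values chosen to be consistent with the reported factor; they are not
# taken from the supporting statement.
icc = 0.008          # intracluster correlation across media markets (reported)
n_smokers = 4000     # minimum smoker cohort size per wave (reported)
n_markets = 60       # assumed number of media markets (illustrative)
cv_markets = 0.32    # assumed CV of market-level sample sizes (illustrative)

m_bar = n_smokers / n_markets                        # average respondents per market
deff = 1 + (m_bar * (1 + cv_markets**2) - 1) * icc   # design effect with unequal clusters
n_effective = n_smokers / deff                       # effective sample size for power

print(f"average market sample size: {m_bar:.1f}")
print(f"design effect (variance inflation factor): {deff:.2f}")
print(f"effective sample size: {n_effective:.0f}")
```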
Planned sample sizes are based on the minimum sample sizes required to detect Campaign effects on quit attempts among smokers and on communications about the dangers of smoking among nonsmokers, as determined via the power analyses above. We estimate that a minimum total annualized sample size of 4,000 smokers per survey wave is required to detect The Campaign’s effect on quit attempts. Based on previous data collections involving similar populations, we anticipate a retention rate of approximately 62% between Waves A and B among the ABS smokers recruited at Wave A. This retention rate requires an initial sample size of approximately 6,500 smokers at Wave A to yield 4,000 smokers at Wave B. Once panelists are enrolled, wave-by-wave retention is expected to be approximately 65% at Waves C and D. This retention rate will yield approximately 2,600 retained smokers at each of Waves C and D, requiring approximately 1,400 new respondents for replenishment at each of these waves to meet the minimum sample size requirement of 4,000 smokers. A similar process of retention and replenishment will be applied to the nonsmoker sample to yield a minimum cohort size of 2,000 respondents per wave. See Attachment J for a detailed summary of sample sizes, retention rates, and planned sample replenishment by survey wave.
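The retention and replenishment arithmetic described above can be summarized in a few lines. This is a minimal sketch using the retention rates and minimum cohort sizes stated in the text; rounding to whole respondents is our own simplification.

```python
# Minimal sketch of the smoker-cohort retention and replenishment arithmetic
# described above. The retention rates (62% from Wave A to Wave B, 65% at later
# waves) and the 4,000-smoker minimum per wave are taken from the text; rounding
# to whole respondents is our own simplification.
target = 4000                        # minimum smokers required at each follow-up wave
wave_a_sample = 6500                 # initial Wave A sample, approximately target / 0.62

retained_wave_b = round(wave_a_sample * 0.62)   # about 4,030 retained at Wave B
retained_wave_c = round(target * 0.65)          # about 2,600 retained at Wave C
replenish_wave_c = target - retained_wave_c     # about 1,400 new respondents at Wave C
retained_wave_d = round(target * 0.65)          # about 2,600 retained at Wave D
replenish_wave_d = target - retained_wave_d     # about 1,400 new respondents at Wave D

print(retained_wave_b, retained_wave_c, replenish_wave_c,
      retained_wave_d, replenish_wave_d)
```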
It should be noted that while the sample recruitment procedures are designed to maximize the representativeness of this national sample, the limitations associated with online data collection require that all results from this information collection be reported with appropriate caution and interpretation. Specifically, although all participants (ABS-sourced and KnowledgePanel-sourced) must be invited to participate and cannot volunteer on their own, there may be systematic differences between individuals who choose to join Internet panels and those who decline to participate in such studies on an ongoing basis. Therefore, evaluation results must be interpreted with appropriate caution regarding our ability to generalize the findings to the national population of smokers and nonsmokers.
Information collection will be conducted in English and Spanish. The approximate distribution of responses, by wave, smoking status, and language, is provided in Table B.1.2.
Table B.1.2. Estimated No. of Responses, by Wave, Smoking Status, and Language

Information Collection, by Wave and Smoker/Nonsmoker Status | English | Spanish | Total
Screening | 23,750 | 1,250 | 25,000
Wave A Smoker | 6,175 | 325 | 6,500
Wave B Smoker | 3,800 | 200 | 4,000
Wave C Smoker | 3,800 | 200 | 4,000
Wave D Smoker | 3,800 | 200 | 4,000
Wave E Smoker | 3,800 | 200 | 4,000
Wave A Nonsmoker | 2,375 | 125 | 2,500
Wave B Nonsmoker | 1,900 | 100 | 2,000
Wave C Nonsmoker | 1,900 | 100 | 2,000
Wave D Nonsmoker | 1,900 | 100 | 2,000
Wave E Nonsmoker | 1,900 | 100 | 2,000
Total | 55,100 | 2,900 | 58,000
B.2 Procedures for the Collection of Information
The ABS and KP samples use probability-based non-stratified sampling frames consisting of all mailing addresses in the U.S. Postal Service’s Delivery Sequence File. To increase the sample yield for households with smokers, the ABS sample uses auxiliary data on demographics and other household characteristics correlated with smoking to identify a subset of the ABS frame (approximately 1/3) that contains households with a higher probability of having a smoker residing in the household.
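As a purely hypothetical illustration of this kind of frame flagging, the sketch below ranks addresses by a modeled smoking-propensity score and marks roughly the top third. The variable names, toy scores, and cutoff rule are assumptions for illustration and do not describe GfK’s actual procedure or predictors.

```python
# Hypothetical sketch of flagging the higher-smoking-propensity subset of an
# address-based sampling (ABS) frame, as described above. The auxiliary
# variables, the propensity scores, and the frame layout are illustrative only.
import pandas as pd

def flag_high_propensity(frame: pd.DataFrame, propensity_col: str = "smoking_propensity",
                         top_fraction: float = 1/3) -> pd.DataFrame:
    """Mark roughly the top third of addresses by modeled smoking propensity."""
    cutoff = frame[propensity_col].quantile(1 - top_fraction)
    frame = frame.copy()
    frame["high_propensity"] = frame[propensity_col] >= cutoff
    return frame

# Toy frame: in practice the scores would come from a model built on demographics
# and other household characteristics correlated with smoking.
toy_frame = pd.DataFrame({"address_id": range(9),
                          "smoking_propensity": [0.1, 0.4, 0.2, 0.7, 0.3,
                                                 0.6, 0.2, 0.5, 0.1]})
print(flag_high_propensity(toy_frame)["high_propensity"].mean())  # roughly 1/3 flagged
```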
ABS-sourced participants will be contacted initially by an advance letter inviting them to participate. GfK KnowledgePanel participants will be contacted initially and invited to participate by email (see Attachment I-2). All participants who respond to the initial survey invitation will be asked to complete an online screening and consent questionnaire (Attachment C) to verify eligibility and assign the respondent to the appropriate survey instrument (smoker or nonsmoker). All surveys will be conducted with ABS-sourced and KP-sourced participants via the GfK KnowledgePanel Web portal for self-administered surveys. Once the survey invitations have been distributed to the households and respondents in the sampling frame, surveys will be accessible to respondents at any time of day for a designated period.
Participants can complete each survey only once. This information collection will rely on self-administered Web surveys completed on computers at home or in another location convenient to the respondent. The surveys will be fielded in English and Spanish and will occur from approximately August 2017 through February 2019 (the timeframe may shift depending on receipt of OMB approval and other factors). All participants will be re-contacted for follow-up at subsequent survey waves, with new participants enrolled to replenish the sample and maintain sample size.
ABS-Sourced Participants
Recruitment of the ABS-sourced sample will parallel recruitment methods used for the existing KnowledgePanel. Persons residing at randomly sampled addresses will be invited to join the study via a series of mailings. Specifically, the ABS sample will be sent a letter in advance (Attachment I-1) that describes the study, the length of commitment to the cohort, available incentives, and the overall purposes of the study. CDC will be prominently identified as the sole sponsor of the survey effort in all recruitment materials to encourage study cooperation.
Invited households that receive the letter in advance will be able to join the study by going to a designated Website where the study screener can be accessed and completed by adults 18 and over. After initially accepting the invitation, respondents will then complete an online consent and screening survey (Attachment C) to initiate their cohort tenure. The consent and screening survey will require a PIN that will be supplied to the respondent in the advance letter. Households that do not respond to the advance letter will be mailed up to 2 postcard reminders about the study (Attachment I-3). Each postcard will contain brief information about the study, will remind invitees of the importance of responding, and will provide the aforementioned PIN and Website for accessing the survey online.
KnowledgePanel Participants
Sampled KP participants will receive email notification that the survey is available for completion. Nonrespondents will receive two e-mail reminders requesting their participation in the survey. See Attachment I-2 for study email notifications for the KnowledgePanel sample. The email notifications contain links to the online consent and survey screening questionnaire that is used to determine study eligibility (Attachment C). Informed consent will be sought from participants for participation in the Web survey. Participants will consent by selecting the appropriate link on the Web screen. A detailed description of KnowledgePanel recruitment methodology is provided with this submission (Attachment F).
Sample Weighting
All data collected for this study will be weighted for analysis. GfK will weight all data to facilitate separate analysis of the ABS-sourced and KP-sourced samples as well as analysis of the combined samples. Weights for the sample will be calculated using a standard post-stratification weighting procedure that adjusts for survey non-response as well as non-coverage. This weighting procedure also applies a standard post-stratification adjustment based on national distributions of age, gender, race/ethnicity, and education among smokers from the most recent (2010–2011) Tobacco Use Supplement of the U.S. Census Bureau’s Current Population Survey (CPS). Benchmark distributions for Internet access used in this weighting adjustment are obtained from the most recent (October 2009) CPS supplemental survey measuring Internet access.
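The sketch below illustrates the general idea of a post-stratification (raking) adjustment of the kind described above, using toy respondent data and assumed benchmark margins. It is not GfK’s weighting implementation, which also incorporates non-response and non-coverage adjustments and additional benchmark dimensions.

```python
# Minimal raking (iterative proportional fitting) sketch to illustrate the kind
# of post-stratification adjustment described above. The benchmark proportions
# and respondent records are toy values for illustration only.
import pandas as pd

def rake(df, weight_col, margins, iterations=25):
    """Adjust weights so weighted margins match the benchmark proportions."""
    w = df[weight_col].astype(float).copy()
    for _ in range(iterations):
        for var, targets in margins.items():
            weighted = w.groupby(df[var]).sum() / w.sum()   # current weighted margin
            factors = pd.Series(targets) / weighted         # adjustment per category
            w = w * df[var].map(factors)
    out = df.copy()
    out[weight_col] = w
    return out

# Toy example: rake a small respondent file to assumed benchmark margins.
respondents = pd.DataFrame({
    "age_group": ["18-34", "18-34", "35-54", "55+", "55+", "35-54"],
    "gender":    ["F", "M", "F", "M", "F", "M"],
    "weight":    [1.0] * 6,
})
benchmarks = {
    "age_group": {"18-34": 0.30, "35-54": 0.35, "55+": 0.35},
    "gender":    {"F": 0.52, "M": 0.48},
}
raked = rake(respondents, "weight", benchmarks)
print(raked.groupby("age_group")["weight"].sum() / raked["weight"].sum())
```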
Quality Control
To ensure data quality, RTI and CDC will conduct rigorous testing of the online survey instrument prior to fielding. RTI and CDC researchers will have access to an online test version of the instrument that will be used to verify that skip patterns are functioning properly, that delivery of campaign media materials is working properly, and that all survey questions are worded correctly and consistent with the instrument approved by OMB. In addition, each data collection wave will begin with an initial “soft launch” period in which study invitations are delivered to only a small fraction of sampled households. Data from the soft launch will be analyzed for consistency and accuracy and tested for missing data prior to larger releases of study invitations. In addition, the collected data will be analyzed to detect and remove any records that exhibit fraudulent completion patterns such as speeding, “straight-lining” (i.e., answering all questions the same), and survey satisficing.
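As an illustration of the completion-pattern checks described above, the sketch below flags likely speeders and straight-liners in a toy file of survey completes. The column names, grid items, and time threshold are assumptions rather than the study’s actual cleaning rules.

```python
# Illustrative data-quality checks of the kind described above: flagging likely
# "speeders" and "straight-liners" in completed survey data. Variable names,
# thresholds, and grid-item columns are assumptions for this sketch.
import pandas as pd

def flag_suspect_completes(df: pd.DataFrame, duration_col: str,
                           grid_cols: list, min_minutes: float = 5.0) -> pd.DataFrame:
    """Add indicator columns for speeding and straight-lining."""
    out = df.copy()
    out["flag_speeder"] = out[duration_col] < min_minutes
    # Straight-lining: identical answers across every item in a grid battery.
    out["flag_straightline"] = out[grid_cols].nunique(axis=1) == 1
    out["flag_any"] = out["flag_speeder"] | out["flag_straightline"]
    return out

# Toy usage with two grid items: the second respondent speeds, the third straight-lines.
completes = pd.DataFrame({
    "duration_minutes": [18.2, 3.1, 22.5],
    "grid_q1": [2, 4, 3],
    "grid_q2": [5, 1, 3],
})
print(flag_suspect_completes(completes, "duration_minutes", ["grid_q1", "grid_q2"]))
```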
B.3 Methods to Maximize Response Rates and Deal with Nonresponse
Multiple methods will be used in this information collection to ensure maximal response rates, deal with non-response, and maximize long-term longitudinal retention. Participant retention is particularly important in this information collection because the later follow-up surveys are used to track longer-term cigarette abstinence among smokers who initially report quitting as a result of The Campaign. This will be essential to properly estimating the impact of The Campaign on long-term successful quitting. Hence, long-term cohort maintenance will be critical to the success of the evaluation.
The following procedures will be used to maximize response rates in this study:
Remuneration Plan:
Our remuneration plan acknowledges that ABS-sourced participants may need an additional incentive because they may not have Internet access in their homes. The remuneration for KP participants is consistent with the panel’s customary bonus-point system, which provides cash-equivalent points that can be redeemed for merchandise.
Participants recruited to the ABS-sourced longitudinal cohort will be provided $20 for each survey they complete. An additional $20 per survey (for a total of $40) will be offered to ABS-sourced respondents who do not have Internet access and must seek out public computers or other means of Internet access to complete the online surveys. This additional remuneration for non-Internet households is meant to encourage their participation and appropriately acknowledge respondents’ time and effort.
Participants recruited from the KP sample will be provided a remuneration of 15,000 redeemable KP bonus points, which is equivalent to $15.
Prompted Reminder System:
Postcard reminders (Attachment I-3) will be sent to all individuals sampled via ABS who do not respond to the initial advance letter. In addition, email reminders (Attachment I-2) will be sent to all sampled KP participants who do not complete their assigned survey within a given period of time after it is assigned. A second round of email reminders will be sent to KP non-responders who do not complete the survey after the initial email reminder is delivered.
Technical Assistance:
GfK will provide a toll-free telephone number to all sampled individuals and invite them to call with any questions or concerns about any aspect of the study.
GfK data collection staff will work with RTI project staff to address concerns that may arise.
RTI and GfK will maintain ongoing communication consisting of regular and ad hoc conference calls to identify and resolve barriers to full participation.
Given that the expected response rate is less than 80%, several procedures will be undertaken to analyze nonresponse bias. Specifically, we will perform analyses of nonresponse at each survey wave to assess the potential for biases arising from nonresponse. Potential non-response bias will be assessed in two ways: (1) differential non-response among qualified respondents who do not consent to or otherwise respond to the survey, and (2) differential wave-by-wave cohort retention. For the former, we will assess the extent to which qualified respondents who do not consent to the study are systematically different from those who agree to participate. This will be accomplished using existing variables that are known prior to survey consent and response. For longitudinal non-response, patterns in wave-by-wave attrition will be assessed by estimating the relationship between the odds of completing follow-up surveys and demographic and other characteristics of all potential respondents at the initial wave. These analyses will help determine whether cohort dropout rates are disproportionate across various types of participants.
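A minimal sketch of the attrition analysis described above follows: a logistic regression of follow-up completion on baseline characteristics, from which odds ratios indicate whether dropout is disproportionate across participant types. The variable names and simulated data are illustrative only; the actual models would use the demographic and other baseline measures available for all initial-wave respondents.

```python
# Sketch of the wave-by-wave attrition analysis described above: a logistic
# regression of follow-up completion on baseline characteristics. The variables
# and simulated records are illustrative placeholders, not study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
baseline = pd.DataFrame({
    "completed_followup": rng.integers(0, 2, n),   # 1 = completed the next wave
    "age": rng.integers(18, 80, n),
    "female": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
})

model = smf.logit("completed_followup ~ age + female + smoker", data=baseline).fit(disp=0)
print(np.exp(model.params))   # odds ratios for completing the follow-up survey
```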
B.4 Tests of Procedures or Methods to be Undertaken
As part of previous data collections supporting evaluation of The Campaign, survey pretesting procedures have been implemented. These have included small-sample pretests (fewer than 9 participants) that involved administration of the main survey instrument as well as additional items to assess survey functionality. These previous pilot tests included two primary assessments: (1) assessment of technical aspects and functionality of the survey instrument, and (2) identification of areas of the survey that were unclear or difficult to understand. The pretest data also included diagnostic information on average time of survey completion, survey completion patterns (e.g., whether there are any concentrations of missing data), and other aspects related to the proper functioning of the survey. These pretesting procedures did not result in any major changes to the instruments included in the previously approved related ICRs (0920-0923, exp. 3/31/2017; 0920-1083, exp. 9/30/2017). Based on the similarity between these previously tested instruments and the planned instruments referenced in this ICR (see Attachments C-E), we do not anticipate any substantial problems with survey functionality.
In addition to analyzing these previous pretesting data, RTI and CDC will conduct rigorous testing of the online survey instrument prior to fielding. RTI and CDC researchers will have access to an online test version of the instrument that will be used to verify that skip patterns are functioning properly, that delivery of campaign media materials is working properly, and that all survey questions are worded correctly and consistent with the instrument approved by OMB.
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Primary responsibility for methodological design, data collection, and data analysis rests with Kevin Davis and Jennifer Duke of RTI International, whose contact information is listed below.
Kevin C. Davis, MA
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: (919) 541-5801
Email: [email protected]
Jennifer Duke, PhD
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: (919) 485-2269
Email: [email protected]
Individuals consulted at CDC on the information collection design are listed below.
Diane Beistle
Office on Smoking and Health
Centers for Disease Control and Prevention
4770 Buford Highway NE, Mailstop F79
Atlanta, GA 30341
Phone: (770) 488-5066
Email: [email protected]
Bob Rodes, MS, MBA, MEd
Office on Smoking and Health
Centers for Disease Control and Prevention
4770 Buford Highway NE, Mailstop F79
Atlanta, GA 30341
Phone: (770) 488-5748
Email: [email protected]
Brian King, PhD
Office on Smoking and Health
Centers for Disease Control and Prevention
4770 Buford Highway NE, Mailstop F79
Atlanta, GA 30341
Phone: (770) 488-5107
Email: [email protected]
Israel Agaku, DMD, PhD
Office on Smoking and Health
Centers for Disease Control and Prevention
4770 Buford Highway NE, Mailstop F79
Atlanta, GA 30341
Phone: (770) 488-5138
Email: [email protected]
Rebecca Murphy, PhD, MPH
Office on Smoking and Health
National Center for Chronic Disease Prevention and Health Promotion
Centers for Disease Control and Prevention
4770 Buford Highway, NE MS F-79
Atlanta, Georgia 30341
770-488-8964
FAX:770-488-5939
E-mail: [email protected]
References
Abreu, D.A., & Winters, F. (1999). Using monetary incentives to reduce attrition in the Survey of Income and Program Participation. Proceedings of the Survey Research Methods Section of the American Statistical Association.
Chang, L., & Krosnick, J.A. (2009). National surveys via RDD telephone interviewing versus the Internet: Comparing sample representativeness and response quality. Public Opinion Quarterly, 74(4):641-678.
Department of Labor, Bureau of Labor Statistics (2016). Employer Costs for Employee Compensation Historical Listing, National Compensation Survey: March 2004 – June 2016. Available at: http://www.bls.gov/ncs/ect/sp/ececqrtn.pdf
Farrelly, M.C., Duke, J.C., Davis, K.C., Nonnemaker, J.M., Kamyab, K., Willett, J.G., & Juster, H.R. (2012). Promotion of smoking cessation with emotional and/or graphic antismoking advertising. American Journal of Preventive Medicine, 43(5):475-482.
McAfee, T., Davis, K.C., Alexander, R.L., Jr., Pechacek, T.F., & Bunnell, R. (2013). Effect of the first federally funded US antismoking national media campaign. Lancet, 382(9909):2003-2011.
Shettle, C., & Mooney, G. (1999). Monetary incentives in U.S. government surveys. Journal of Official Statistics, 15:231-250.
Southwell, B., Barmada, C., Hornik, R., et al. (2002). Can we measure encoded exposure? Validation evidence from a national campaign. Journal of Health Communication, 7:445-453.
Yeager, D.S., Krosnick, J.A., Chang, L., et al. (2011). Comparing the accuracy of RDD telephone surveys and Internet surveys conducted with probability and non-probability samples. Public Opinion Quarterly, 75(4):709-747.