
U.S. Food and Drug Administration

Evaluation of the Food and Drug Administration’s General Market Youth Tobacco Prevention Campaigns


OMB Control Number 0910-0753

B. Statistical Methods

  1. Respondent Universe and Sampling Methods


The Cohort 2 evaluation consists of a longitudinal survey of a probability sample of approximately 8,000 youth assessed at baseline and three follow-up waves for the national (non-trier and experimenter) campaign. This longitudinal design allows us to calculate baseline-to-follow-up changes in campaign-targeted outcomes for each study participant. We hypothesize that if the campaign is effective, the baseline-to-follow-up changes in outcomes should be larger among individuals exposed to the campaign more frequently (i.e., dose-response effects). Eligible youth are aged 11 to 16 at baseline and 13 to 20 by the end of data collection. For the Cohort 2 evaluation, age is the only screening criterion. The survey is being conducted by RTI.

For the Cohort 2 sample, we began by selecting a sample of 100 Primary Sampling Units (PSUs) with probability proportional to the number of 11- to 17-year-olds. Our PSUs are Public Use Microdata Areas (PUMAs). PUMAs are created for the dissemination of Census public use microdata from the American Community Survey but can also serve as PSU clusters (McMichael & Chen, 2015). The area frame of PUMAs covers the entire lower 48 states plus the District of Columbia. Our Secondary Sampling Units (SSUs) are postal carrier routes; each carrier route is the cluster of addresses a mail carrier delivers to in one day. We selected between 400 and 500 SSUs with probability proportional to the number of 11- to 17-year-olds. For the third and final stage, we selected addresses from the Computerized Delivery Sequence file (CDS) leased from Compact Information Systems (CIS), taking an approximately equal number of addresses from each SSU (approximately 85–106 per SSU). Exhibit 1 details our response assumptions for Cohort 2.
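To make the three-stage design concrete, the following sketch illustrates probability-proportional-to-size (PPS) selection at each stage. It is a minimal illustration only: the frame counts, PUMA and route identifiers, and the simplified draw-by-draw PPS routine are hypothetical stand-ins, not the study's actual PUMA, carrier-route, or CDS frames or its selection software.

```python
import numpy as np

rng = np.random.default_rng(20993)

def pps_without_replacement(ids, size_measure, n):
    """Draw n units with probability proportional to size_measure.
    Simplified draw-by-draw scheme; a production design would use systematic PPS."""
    size_measure = np.asarray(size_measure, dtype=float)
    return rng.choice(ids, size=n, replace=False, p=size_measure / size_measure.sum())

# Stage 1: 100 PSUs (PUMAs), PPS to counts of 11- to 17-year-olds (synthetic counts).
puma_ids = np.arange(2300)                                   # roughly the number of PUMAs
puma_youth = rng.integers(2_000, 40_000, size=puma_ids.size)
psus = pps_without_replacement(puma_ids, puma_youth, 100)

# Stage 2: 4-5 carrier routes (SSUs) per selected PSU, again PPS to youth counts.
ssus = {}
for psu in psus:
    route_ids = [f"{psu}-{r}" for r in range(60)]            # hypothetical routes
    route_youth = rng.integers(20, 400, size=len(route_ids))
    ssus[psu] = pps_without_replacement(route_ids, route_youth, int(rng.integers(4, 6)))

# Stage 3: an approximately equal number of addresses (85-106) per SSU, sampled
# from a placeholder list standing in for the Computerized Delivery Sequence file.
addresses = {route: rng.choice(1_000, size=int(rng.integers(85, 107)), replace=False)
             for routes in ssus.values() for route in routes}

print(len(psus), "PSUs,", sum(len(v) for v in ssus.values()), "SSUs,",
      sum(len(a) for a in addresses.values()), "sampled addresses")
```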

Exhibit 1. Addresses and the Associated Assumptions to Yield the Needed Number of Completes

Activity                            | National Sample Cohort 2 (All Youth)
------------------------------------|-------------------------------------
Selected addresses                  | 42,510
Correctly geocoded housing units    | NA
Occupied housing units              | 36,134 (85%)
Screened households                 | 27,100 (75%)
Eligible households                 | 11,111 (41%)
Eligible persons                    | 11,111 (100%)
Baseline completes                  | 8,000 (72%)
Wave 2 (1st follow-up) completes    | 6,400 (80%)
Wave 3 (2nd follow-up) completes    | 5,120 (80%)
Wave 4 (3rd follow-up) completes    | 4,096 (80%)
Wave 5 (4th follow-up) completes    | 3,277 (80%)

Note: The 50% response rate at the first time point is a product of the person completion rate and the household screening rate (72% * 70%).
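For readers checking the yield assumptions, the short sketch below reproduces the cascade in Exhibit 1 from its stated rates; rounding to whole units at each step is an assumption made for illustration, since the table reports rounded figures.

```python
# Reproduce the expected-yield cascade in Exhibit 1 from its stated rates.
selected = 42_510
occupied = round(selected * 0.85)    # ~36,134 occupied housing units
screened = round(occupied * 0.75)    # ~27,100 screened households
eligible = round(screened * 0.41)    # ~11,111 eligible households (one eligible person each)
baseline = round(eligible * 0.72)    # ~8,000 baseline completes

completes = [baseline]
for _ in range(4):                   # four follow-up waves at 80% retention each
    completes.append(round(completes[-1] * 0.80))
print(completes)                     # approximately [8000, 6400, 5120, 4096, 3277]
```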

  2. Procedures for the Collection of Information

B.2.2 Outcome Evaluation Follow-Up Data Collection Waves


Data collection at baseline and the first follow-up has been completed for Cohort 2. This section therefore describes data collection procedures for the final two follow-up surveys. Surveys are conducted at approximately 8-month intervals. This design will produce data for the same youth over a 2-year period or longer and will provide a more accurate and thorough understanding of tobacco initiation, prevalence, and cessation among the campaign’s target audience of youth aged 12 to 17. Eligible youth were aged 11 to 16 at the baseline survey and will be 13 to 20 at the final survey wave. Because the cohort ages over this period, the data collected throughout the study will reflect information from youth aged 11 to 20. The follow-up surveys will be conducted largely in person (70%), with the remainder (30%) conducted via a Web-based survey, during periods when it is safe to collect data in person. When it is not safe to collect data in person, such as during the COVID-19 pandemic, we will collect data via a Web-based survey only.

Panel maintenance letters will be sent out in advance of follow-up data collections to update contact information to the degree possible (Attachment 15_E2a). Before interviewers make any in-person contact with youth participants and their parents, lead letters with Web-survey log-in credentials will be sent to respondents to invite them to participate in the follow-up by Web or in person. These advance letters will inform parents and youth about the study’s purpose and background, explain the survey procedures, and provide information to the respondent on participating via the Web or with an interviewer in their home. The letters will provide the Web address for the online version of the survey and the user ID and password each sample member will need to access the survey. Respondents who provided an e-mail address in the baseline survey will also receive an e-mail invitation to complete each follow-up survey via the Web. The follow-up lead letter and text for the follow-up e-mail invitations are shown in Attachments 10_E2a, 19_E2d, 20_E2d, and 21_E2d. Participation via the Web will provide flexibility and convenience to participants who can complete the survey online. Completion of the Web-based survey will be tracked closely during each follow-up wave to identify respondents who need to have an in-person interview scheduled in their home. Parents will also receive fact sheets and a copy of the parent permission form before the start of each follow-up wave of data collection.


If an interviewer is unable to locate the participating family during data collection, the interviewer will request that the case be sent to interactive tracing through the project control system. Interactive tracing is conducted by one of RTI’s tracing specialists, who reviews the case contact information in the control system and searches additional resources and databases for more current contact information to locate the parent. Once the parent is located, the updated contact information is shared with the interviewer through the project control system so that the interviewer can attempt to complete the associated case(s).


The youth surveys include the same set of items at baseline and follow-up, with the exception of items regarding each campaign and its materials (e.g., television ads, print materials), which will vary over the course of the campaigns. Minor revisions to surveys may be necessary given the media development process and the possibility of changes in campaign implementation, but every effort has been made to minimize the possibility of instrument changes. The youth survey instrument includes measures of demographics; tobacco use behavior; intentions to use tobacco; self-efficacy; cessation intentions; cessation behaviors; tobacco-related attitudes, beliefs, and risk perceptions; social norms; media use and awareness; environmental questions; and measures of awareness of and exposure to the campaign materials (see Attachment 2_E2c). There is no parent survey at follow-up.


We will contact approximately 10% of households to conduct a telephone verification survey (Attachment 12_E2b). The purpose of the telephone verification survey is to complete quality control checks on the in-home interview process.
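As one way to picture this quality-control step, the sketch below draws a simple random 10% subsample of completed in-home cases for telephone verification. The case identifiers and the simple-random selection rule are assumptions for illustration; the project's actual verification sampling rules may differ (for example, by stratifying on interviewer or region).

```python
import random

# Hypothetical completed in-home cases; IDs are placeholders, not project data.
random.seed(753)
completed_cases = [f"case_{i:05d}" for i in range(5_600)]

# Draw ~10% of completes for the telephone verification survey.
k = round(0.10 * len(completed_cases))
verification_sample = random.sample(completed_cases, k=k)
print(f"{len(verification_sample)} of {len(completed_cases)} cases flagged for verification")
```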

  3. Methods to Maximize Response Rates and Deal with Nonresponse

The ability to obtain the cooperation of potential respondents in the baseline survey and maintain their participation across all survey waves will be important to the success of this study.


At follow-up waves one and two, youth respondents were offered a $25 incentive to complete the survey online during an early release period that ran for approximately three weeks. Subsequently, youth respondents were offered a $20 incentive to complete the survey either online or in person. For waves three and four, we plan to increase the incentives to $30 for completion of the survey during the early release period, and $25 for completion following the early release period. Studies suggest that this incentive approach can increase response rates and reduce costs and nonresponse. In addition, the study will use procedures designed to maximize respondent participation. E-mail reminders will be sent to encourage participants to complete the survey via the Web and remind them about the option of having an interviewer visit their homes to complete the survey. Data collection procedures will begin with assignment of sample dwelling units (SDUs) to specific interviewers at the start of data collection. When assigning cases, supervisors will take into account which interviewers are in closest proximity to the study households, basic information such as demographics and size of each sampled area, and interviewer skill sets. Supervisors will assign cases to interviewers in ways designed to maximize production.


When interviewers transmit their data from completed household screenings and interviews, the data will be summarized in daily reports posted to a Web-based case management system accessed by field supervisors and RTI’s data collection managers. On a daily basis, supervisors will use these reports to review response rates, production levels, and records of call information. This information will allow supervisors to determine each interviewer’s progress toward weekly production goals, when interviewers should attempt further contacts with SDUs, and how to handle challenging situations such as households that initially refuse to participate or households where the interviewer has been unable to contact anyone. Supervisors will discuss information and challenges with their interviewers each week. When feasible, cases will be transferred to other interviewers with different skill sets to assist with converting initial refusals into participating households. Cases may also be transferred among interviewers to improve production in areas where the original interviewer is not meeting response rate goals.


As noted in Section B.2, interviewers will use various notifications (Attachment 13_E2a) to communicate with potential respondents, including a “Sorry I Missed You” card, refusal letters tailored to specific refusal reasons, and an “Unable-to-Contact” letter. When interviewers have been unable to gain access to one or more SDUs due to an access barrier, such as a locked gate or door person, Controlled Access Letters will be sent to the appropriate person or organization to obtain assistance in gaining access to these SDUs. Interviewers may also use the Question and Answer fact sheet (Attachment 11_E2b) to encourage participation.

  4. Test of Procedures or Methods to be Undertaken

Prior to launching the baseline survey, we fielded an eight-case pretest of the survey instrument. The pretest survey was identical to the instrument being used in the Cohort 2 evaluation and approved by OMB, with the exception of a few additional questions to assess the overall clarity of instrument questions and respondents’ opinions on aspects of the survey that were unclear. The purpose of the pretest was twofold: (1) to assess technical aspects and functionality of the survey instrument and (2) to identify areas of the survey that were unclear or difficult to understand. We reviewed diagnostic data on average time of survey completion, survey completion patterns (e.g., are there any concentrations of missing data?), and other aspects related to the proper functioning of the survey. We also examined data on pretest measures used to assess the clarity of item wording and ease of understanding.
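As an illustration of the kind of diagnostic review described above, the sketch below computes item-level missingness to flag concentrations of missing data. The item names and toy records are hypothetical, not the actual pretest data or variable names.

```python
import pandas as pd

# Toy pretest records with hypothetical item names (the real pretest had eight cases).
pretest = pd.DataFrame({
    "q1_age":      [13, 15, None, 14, 16, 12, 15, 14],
    "q8_ever_use": ["no", "no", "yes", None, None, "no", "yes", "no"],
    "q9_days_30":  [None, None, 2, None, None, None, 1, None],
})

# Share of missing responses per item; unusually high values warrant a wording review.
item_missing = pretest.isna().mean().sort_values(ascending=False)
print(item_missing)
```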


In addition to the pretest survey, RTI conducted rigorous internal testing of the online survey instrument prior to its fielding in the first follow-up. Evaluators reviewed the online test version of the instrument to verify that skip patterns functioned properly, that delivery of campaign media materials worked properly, and that all survey questions were worded correctly and in accordance with the instrument approved by OMB.
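One way such a skip-pattern review could be supplemented with an automated check is sketched below. The gate item and dependent item names are hypothetical, not the instrument's actual variables, and the rule shown is only an example of the kind of consistency being verified.

```python
from typing import Optional

def skip_pattern_ok(ever_use: str, days_used_30: Optional[int]) -> bool:
    """A gated frequency item should be blank when the gate item is 'no',
    and answered otherwise."""
    if ever_use == "no":
        return days_used_30 is None
    return days_used_30 is not None

# Toy test records: the third one violates the skip pattern.
test_records = [
    {"ever_use": "no",  "days_used_30": None},
    {"ever_use": "yes", "days_used_30": 5},
    {"ever_use": "no",  "days_used_30": 3},
]
for rec in test_records:
    print(rec, "->", "OK" if skip_pattern_ok(**rec) else "VIOLATION")
```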

  5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The following individuals inside the agency have been consulted on the design and statistical aspects of this information collection as well as plans for data analysis:


Tesfa Alexander

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 301-796-7745

E-mail: [email protected]


Gem Benoza

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 240-397-3723

E-mail: [email protected]


Janine Delahanty

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 240-402-9705

E-mail: [email protected]


Alexandria Smith

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 240-402-2192

E-mail: [email protected]


Debra Mekos

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 301-796-8754

E-mail: [email protected]


Ollie Ganz

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Ave

Silver Spring, MD 20993

Phone: 240-402-5389

E-mail: [email protected]


The following individuals outside the agency have been consulted on the questionnaire development, statistical aspects of the design, and plans for data analysis:


Xiaoquan Zhao

Department of Communication

George Mason University

Robinson Hall A, Room 307B

4400 University Drive, 3D6

Fairfax, VA 22030

Phone: 703-993-4008

E-mail: [email protected]


The following individuals will conduct data collection and analysis:


Matthew Farrelly

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-6852

E-mail: [email protected]


Jennifer Duke

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-485-2269

E-mail: [email protected]

Jane Allen

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-597-5115

E-mail: [email protected]


Kevin Davis

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-5801

E-mail: [email protected]


James Nonnemaker

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-7064

E-mail: [email protected]








References


Abreu, D. A., & Winters, F. (1999). Using monetary incentives to reduce attrition in the survey of income and program participation. Proceedings of the Survey Research Methods Section of the American Statistical Association.

Castiglioni, L., Pforr, K., & Krieger, U. (2008). The effect of incentives on response rates and panel attrition: Results of a controlled experiment. Survey Research Methods, 2(3), 151–158.

Centers for Disease Control and Prevention. (2012). Youth Risk Behavior Surveillance–United States, 2011. Morbidity and Mortality Weekly Report, 61(4), 1–162.

Davis, K. C., Nonnemaker, J., Duke, J., & Farrelly, M. C. (2013). Perceived effectiveness of cessation advertisements: The importance of audience reactions and practical implications for media campaign planning. Health Communication, 28(5), 461–472. doi:10.1080/10410236.2012.696535

Davis, K. C., Uhrig, J., Bann, C., Rupert, D., & Fraze, J. (2011). Exploring African American women’s perceptions of a social marketing campaign to promote HIV testing. Social Marketing Quarterly, 17(3), 39–60.

Dillard, J. P., Shen, L., & Vail, R. G. (2007). Does perceived message effectiveness cause persuasion or vice versa? Seventeen consistent answers. Human Communication Research, 33, 467–488.

Dillard, J. P., Weber, K. M., & Vail, R. G. (2007). The relationship between the perceived and actual effectiveness of persuasive messages: A meta-analysis with implications for formative campaign research. Journal of Communication, 57, 613–631.

Farrelly, M. C., Davis, K. C., Haviland, M. L., Messeri, P., & Healton, C. G. (2005). Evidence of a dose-response relationship between “truth” antismoking ads and youth smoking prevalence. American Journal of Public Health, 95(3), 425–431. doi: 10.2105/AJPH.2004.049692

Jäckle, A., & Lynn, P. (2008). Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias. Survey Methodology, 34(1), 105–117.

Janega, J. B., Murray, D. M., Varnell, S. P., Blitstein, J. L., Birnbaum, A. S., & Lytle, L. A. (2004). Assessing the most powerful analysis method for schools intervention studies with alcohol, tobacco, and other drug outcomes. Addictive Behaviors, 29(3), 595–606.

McMichael, J., & Chen, P. (2015). Using census public use microdata areas (PUMAs) as primary sampling units in area probability household surveys. In JSM Proceedings, Survey Research Methods Section, pp. 2281–2288. Alexandria: American Statistical Association.

Murray, D. M., & Blitstein, J. L. (2003). Methods to reduce the impact of intraclass correlation in group-randomized trials. Evaluation Review, 27(1), 79–103.

Murray, D. M., & Short, B. J. (1997). Intraclass correlation among measures related to tobacco-smoking by adolescents: Estimates, correlates, and applications in intervention studies. Addictive Behaviors, 22(1), 1–12.

Shettle, C., & Mooney, G. (1999). Monetary incentives in U.S. government surveys. Journal of Official Statistics, 15, 231–250.

Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey Nonresponse (pp. 163–177). New York, NY: Wiley.

Snyder, L. B., Hamilton, M. A., Mitchell, E. W., Kiwanuka-Tondo, J., Fleming-Milici, F., & Proctor, D. (2004). A meta-analysis of the effect of mediated health communication campaigns on behavior change in the United States. Journal of Health Communication, 9, 71–96.

Substance Abuse and Mental Health Services Administration (SAMHSA). (2012). Results from the 2011 National Survey on Drug Use and Health: Summary of national findings. NSDUH Series H-44, HHS Publication No. (SMA) 12-4713. Rockville, MD: Substance Abuse and Mental Health Services Administration.

U.S. Department of Health and Human Services (USDHHS). (2006). The health consequences of involuntary exposure to tobacco smoke: A report of the Surgeon General. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, Coordinating Center for Health Promotion, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health.

Wakefield, M. A., Spittal, M. J., Yong, H-H., Durkin, S. J., & Borland, R. (2011). Effects of mass media campaign exposure intensity and durability on quit attempts in a population-based cohort study. Health Education Research, 26(6), 988–997.



