Evaluation of the Food and Drug Administration's 'Fresh Empire' Multicultural Youth Tobacco Prevention Campaign.

OMB: 0910-0788



U.S. Food and Drug Administration

Food and Drug Administration’s Evaluation of the Fresh Empire Campaign on Tobacco (EFECT)


OMB Control No. 0910-0788


SUPPORTING STATEMENT

A. Justification

  1. Circumstances Making the Collection of Information Necessary

The 2009 Family Smoking Prevention and Tobacco Control Act (Tobacco Control Act) (Pub. L. 111–31) amended the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to grant FDA authority to regulate the manufacture, marketing, and distribution of tobacco products to protect public health and to reduce tobacco use by minors. Section 1003(d)(2)(D) of the FD&C Act (21 U.S.C. 393(d)(2)(D)) supports the development and implementation of FDA public education campaigns related to tobacco use. Accordingly, FDA is currently developing and implementing a youth-targeted public education campaign to help prevent tobacco use among multicultural youth and thereby reduce the public health burden of tobacco. The campaign features events, advertisements on television and radio and in print, digital communications including videos and social media, and other forms of media. For the purpose of this OMB package, each of these campaign elements will be referred to as “advertisements” or “ads.”

The objective of the evaluation is to measure the effectiveness of the Center for Tobacco Products' (CTP's) Fresh Empire campaign, which is designed to reduce tobacco use among multicultural youth aged 12 to 17. FDA's Fresh Empire youth tobacco prevention campaign focuses on reducing tobacco use among youth who affiliate with a Hip Hop peer crowd, predominantly African American, Hispanic, and Asian/Pacific Islander youth. The goal of the proposed information collection is to evaluate the effectiveness of these efforts in affecting specific cognitive outcomes related to tobacco use that are targeted by the campaign.

This study is designed to measure awareness of and exposure to FDA’s Fresh Empire youth tobacco prevention campaign and assess its impact on outcome variables of interest. The first data collection period was in mid to late 2015. The post-campaign data collection began approximately 6 months following the launch of the campaign. The data collection was originally scheduled to end approximately 24 months after the launch of the campaign; however, FDA requests OMB approval to add two additional waves of data collection with existing youth in the study, such that data collection will end approximately 48 months after the launch of the campaign. This design, which includes cross-sectional data collection with an embedded longitudinal cohort, will facilitate analysis of relationships between individuals’ exposure to campaign activities and pre- to post-campaign changes in outcomes of interest between campaign and comparison cities. Research studies have demonstrated that receptivity to advertisements is causally antecedent to actual ad effectiveness (e.g., Davis et al., 2013; Davis, Uhrig, et al., 2011; Dillard, Shen, & Vail, 2007; Dillard, Weber, & Vail, 2007). We hypothesize that if the campaign is effective, the pre- to post-campaign changes in outcomes should be larger among individuals in campaign cities compared to individuals in comparison cities. Furthermore, the differences should be more pronounced for youth in campaign cities exposed to the campaign more frequently (i.e., dose-response effects).

The primary method to recruit youth for the pre-test survey was to send a brief mail screener to households in campaign and comparison cities. However, because the target audience represents a relatively small proportion of youth, we complemented this approach by recruiting youth through social media. Youth recruited through mail or social media will become members of the longitudinal panel. The pre-test survey includes measures of tobacco-related beliefs, attitudes, intentions, and behaviors. The outcome post-test survey includes measures of audience awareness of and exposure to the campaign advertisements, as well as the aforementioned outcome variables of interest. The post-test questionnaire is presented in Attachment 1. The brief mail screener used to identify multicultural youth for the outcome pre-test survey is presented as Attachment 2; the mail screener will also be used to recruit new youth aged 12 to 17 to complete the fourth post-test survey. Attachment 3 contains the content of the web screener that will be used to identify eligible youth recruited using social media.

  2. Purpose and Use of the Information Collection

The information obtained from the data collection activities is collected from individuals and used to inform FDA, policy makers in the United States, prevention practitioners, and researchers about the extent of multicultural youth’s exposure to the campaign’s activities and the extent to which exposure to these activities is associated with changes in targeted outcomes. While not exhaustive, the list below illustrates a range of purposes and uses for the proposed information collection:


    • Provide critical data on the reach of the campaign among multicultural youth in targeted cities, particularly with estimates of the proportion of the population that was exposed to the campaign.

    • Understand the influence of the campaign on targeted beliefs and attitudes.

    • Inform FDA, policy makers, and other stakeholders on the impact of the campaign overall.

    • Inform the public about the impact of the campaign.

    • Inform future programs that may be designed for similar purposes.


To achieve these goals, data collection consists of a pre-test survey and post-test surveys with youth in the target audience. The post-test surveys are conducted among those youth who participated previously (the "Longitudinal Cohort"), with new participants recruited to offset attrition (the "Cross-Sectional Refresher Sample"). Eligible youth are initially aged 12 to 17 and affiliate with a Hip Hop peer crowd; youth in the embedded longitudinal cohort may reach age 18 during the course of the evaluation. The sample is predominantly African American, Hispanic, and Asian/Pacific Islander. The Fresh Empire campaign targets 44 cities. Data collection will occur in 15 campaign-targeted cities and 15 similar ("comparison") cities. Collecting data in a subset of cities helps manage the costs of data collection without compromising statistical power (i.e., too much clustering reduces effective sample sizes). The embedded longitudinal cohort reduces cost as well as respondent burden. The outcome study relies primarily on a mail screener survey to identify eligible youth, followed by in-person data collection. We supplement this approach by recruiting youth through social media: we advertise on social media and invite youth 13 to 17 years old to complete the screening survey online. Consistent with the federal Children's Online Privacy Protection Act, we do not contact youth under 13 online. We then ask eligible youth to provide contact information for their parents/guardians so that we can obtain permission for completing the outcome survey online. In post-campaign survey rounds, youth 15 to 17 years old recruited through social media do not require parental permission to participate in the survey; for youth 13 to 14 years old, we continue to require parental permission.


To ensure that the youth who participate in the outcome evaluation are members of the target audience, i.e., multicultural youth who affiliate with a Hip Hop peer crowd, eligible youth will be identified using the same screening method used by the agency implementing the Fresh Empire campaign—Rescue. This is accomplished by presenting photos of males and females representing various peer crowds. The images are displayed in two arrays stratified by gender. Respondents are asked to rank order the three images depicting individuals who best represent their friend group and the three images that least represent it in each array. Survey participants are categorized as members of the Hip Hop peer crowd based on this exercise. Eligible youth will be contacted and invited to complete the outcome survey.


The outcome surveys are self-administered on laptop computers provided by field interviewers or are completed online. The pre-test survey had a sample size of 2,194, with about half of the sample from 15 campaign-targeted cities and half from comparison cities. The total sample for the post-test surveys will be approximately 10,500, with an equal number of surveys in campaign and comparison cities. We have estimated the proportion of pre-test participants expected to complete successive post-test surveys and supplement that longitudinal sample with new cross-sectional participants to meet our target total sample size. This design permits an analysis of trends in outcomes between youth in targeted and comparison cities.


The original plan called for recruiting up to 500 participants for the pre-test surveys through the social media platforms Twitter and Facebook. In actuality, fewer participants than anticipated were recruited through social media for the pre-test survey. Our social media recruitment efforts have been more successful beginning with the first post-test survey. Of the 10,500 post-test surveys, approximately 3,500 will be completed by youth recruited through social media.

  3. Use of Improved Information Technology and Burden Reduction

Use of an embedded longitudinal cohort will markedly reduce burden relative to a design consisting solely of cross-sectional surveys. In addition, this outcome study will rely on a mail-based screener and in-person computer-based outcome surveys for pre- and post-test data collection. Screening eligible youth by mail and recruiting them in person provides a number of methodological advantages, including efficiency in identifying this hard-to-reach population, increased accuracy in measurement of key variables of interest, and reduced burden on study participants. First, computerized administration permits the instrument designer to incorporate into the questionnaire routings that might be overly complex or not possible in a paper-based survey. The laptop computer and online survey used to collect youth data can be programmed to implement complex skip patterns and to fill in specific wordings based on the respondent's previous answers; interviewer and respondent errors caused by faulty implementation of skip instructions are virtually eliminated. Second, computerized administration increases the consistency of the data. The computer can be programmed to identify inconsistent responses and attempt to resolve them through respondent prompts. This approach reduces the need for most manual and machine editing, thus saving time and money. In addition, respondent-resolved inconsistencies are likely to yield more accurate data than inconsistencies resolved using editing rules. FDA estimates that 18% of respondents will use electronic means to fulfill the agency's request.


The self-administered mail screener (see Attachment 2) is programmed using TeleForm, a machine-readable data format, so that survey responses can be captured automatically by a TeleForm reader, which obviates the need for manual data entry. Using this technology, the majority of surveys can be read electronically; those that cannot be scanned will be coded by a data processor.


The computer-assisted self-interview technology for the outcome survey permits greater expediency with respect to data processing and analysis (e.g., a number of back-end processing steps, including coding and data entry, are minimized). Data are transmitted electronically within 48 hours. These efficiencies save time due to the speed of data transmission, as well as receipt in a format suitable for analysis. Finally, this technology permits respondents to complete the interview in privacy. Providing the respondent with a methodology that improves privacy makes reporting of potentially embarrassing or stigmatizing behaviors (e.g., tobacco use) less threatening and enhances response validity and response rates.


The mail screener and in-person computerized survey sample will be supplemented by a sample of respondents who are recruited through social media. These respondents will be recruited through social media platforms, such as Facebook and Instagram, and led to an online screener for the study (see Attachment 3). Respondents will be invited to complete the screener using a web survey programmed and hosted on RTI’s servers. This web screener will have the advantage of immediately notifying respondents if they are eligible for the full study. In addition, use of social media as a recruitment tool will cast a wider net to identify additional, eligible study respondents who are members of this hard-to-reach population.


Eligible respondents will be routed to the full web survey and given a unique ID to enter the survey. Respondents will be able to quit the survey at any time and resume where they left off upon reentry. Respondents will also be emailed a link to resume the survey and contact information for asking questions; they will receive reminders to complete the survey and a virtual gift card upon completion.


Administration of the survey using web methods will help to contain costs, allowing for a sample that is geographically diverse without driving up interviewer costs for travel during data collection.

  4. Efforts to Identify Duplication and Use of Similar Information

FDA’s Evaluation of the Fresh Empire Campaign on Tobacco (EFECT) is new. To date, there has been no in-depth evaluation of this campaign in a real-world setting, and there are no existing data sources that contain measures of awareness of and exposure to the campaign. This information collection therefore does not duplicate previous efforts. In designing the proposed data collection activities, we have taken several steps to ensure that this effort does not duplicate ongoing efforts and that no existing data sets would address the study questions. We have carefully reviewed existing data sets to determine whether any of them are sufficiently similar or could be modified to address FDA’s need for information on the effectiveness of the campaign with respect to reducing youth tobacco-related outcomes. We investigated the possibility of using existing data to examine our research questions, such as data collected as part of ongoing national surveillance systems, evaluations of current or past state-level campaigns for youth, the National Youth Tobacco Survey, and the Youth Risk Behavior Surveillance System. Due to the timing of the campaign and the specificity of the target population, none of these existing data sources can meet the campaign’s data collection needs: none includes the necessary in-depth survey questions on awareness of individual ads and other campaign materials, and none contains all of the outcome variables specific to the campaign’s messages.

  5. Impact on Small Businesses or Other Small Entities

Respondents in this study will be members of the general public and specific subpopulations, not business entities. No impact on small businesses or other small entities is anticipated.

  6. Consequences of Collecting the Information Less Frequently

Respondents to this collection of information will answer just one survey during each data collection period. While there are no legal obstacles to reducing burden, any lack of information needed to evaluate the Fresh Empire campaign may impede the federal government’s efforts to improve public health. Without the information collection requested for this evaluation study, it would be difficult to determine the value or impact of the campaign on the lives of the people it is intended to serve—multicultural youth. Failure to collect these data could reduce effective use of FDA’s program resources to benefit youth in the United States. Careful consideration has been given to how frequently the campaign’s intended audience should be surveyed for evaluation purposes. We believe that the proposed outcome study design will provide sufficient data to evaluate the campaign effectively.

  7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances for this collection of information that would require the data collection to be conducted in a manner inconsistent with 5 CFR 1320.5(d)(2). The data collection activities fully comply with the guidelines in 5 CFR 1320.5.

  8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

In accordance with 5 CFR 1320.8(d), FDA published a 60-day notice for public comment in the Federal Register on December 26, 2017 (82 FR 61003). FDA received one comment; however, the comment was not PRA-related.


The following individuals inside the agency have been consulted on the design of the campaign evaluation plan, audience questionnaire development, or intra-agency coordination of information collection efforts:


April Brubach

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

9200 Corporate Boulevard

Rockville, MD 20850

Phone: 301-796-9214

E-mail: [email protected]


Gem Benoza

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-402-0088

E-mail: [email protected]


David Portnoy

Office of Science

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 301-796-9298

E-mail: [email protected]


Matthew Walker

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-402-3824

E-mail: [email protected]


Leah Hoffman

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-743-1777

E-mail: [email protected]


Janine Delahanty

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-402-9705

E-mail: [email protected]


Ollie Ganz

Office of Health Communication & Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20903

Phone: 240-402-5389

E-mail: [email protected]


The following individuals outside the agency have been consulted on questionnaire development. Additionally, input on the design of this study has been solicited and received from FDA, including FDA participation in meetings with OMB:


Matthew Farrelly

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-6852

E-mail: [email protected]


Jennifer Duke

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-485-2269

E-mail: [email protected]


Jane Allen

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-597-5115

E-mail: [email protected]


Youn Lee

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-485-5536

E-mail: [email protected]


Amy Henes

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-7293

E-mail: [email protected]


Jamie Guillory

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-316-3725

E-mail: [email protected]


Patricia LeBaron

RTI International

230 W Monroe Suite 2100

Chicago, IL 60606

Phone: 312-777-5204

E-mail: [email protected]


Azucena Derecho

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-7231

E-mail: [email protected]


Stephen King

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-8094

Email: [email protected]


Pamela Rao

Akira Technologies, Inc.

1747 Pennsylvania Ave NW Suite 600

Washington, DC 20002

Phone: (202) 517-7187

Email: [email protected]


Xiaoquan Zhao

Department of Communication

George Mason University

Robinson Hall A, Room 307B

4400 University Drive, 3D6

Fairfax, VA 22030

Phone: 703-993-4008

E-mail: [email protected]


Jeff Jordan

Rescue

3436 Ray Street

San Diego, CA 92104

Phone: 619-231-7555 x 150

Email: [email protected]


Mayo Djakaria

Rescue

660 Pennsylvania Avenue SE, Suite 400

Washington, DC 20003

Phone: 619-231-7555 x 120

Email: [email protected]


Dana Wagner

Rescue

660 Pennsylvania Ave SE, Suite 400

Washington, DC 20003

Phone: 619-231-7555 x 331

Email: [email protected]

  9. Explanation of Any Payment or Gift to Respondents

Participants who complete all waves of the study could receive up to $167 in cash or virtual gift cards. All households that receive the mail screener will receive a nominal incentive of a $2 bill to encourage them to complete and return this brief screener. The lead letter indicates that the $2 bill is intended for the potential youth participant, but that the adult recipient of the letter may keep the $2 if there are no eligible youth in the household. A meta-analysis of studies examining the use of incentives in mail surveys showed that, compared with no incentives, pre-paid incentives and promised incentives increase participation in mail surveys by 19% and 8%, respectively (Church, 1993). More recent studies confirm these findings (e.g., Montaquila et al., 2013; Brick et al., 2012; Beebe et al., 2005).


Youth invited to participate in the outcome evaluation surveys will receive incentives. Youth participants will be offered a $25 incentive for completion of the pre- and post-test surveys. We estimate that the pre-campaign survey will take 30 minutes to complete, and the post-test survey will take up to 45 minutes. The incentives are intended to recognize the time burden placed on participants; to encourage their cooperation in subsequent post-test surveys, which will reduce both respondent burden and cost; and to convey appreciation for contributing to this important study. The incentives are similar to those offered for most surveys of this type. Both surveys will take less than an hour to complete, and thus design protocols call for the same incentive amount. Numerous empirical studies have shown that incentives can significantly increase response rates in cross-sectional surveys and reduce attrition in longitudinal surveys (e.g., Abreu & Winters, 1999; Castiglioni, Pforr, & Krieger, 2008; Jäckle & Lynn, 2008; Shettle & Mooney, 1999; Singer, 2002; Singer & Ye, 2013). The decision to use incentives for this study is based on the need to promote continued participation by this hard-to-reach and specific population of multicultural youth who affiliate with a Hip Hop peer crowd (Beebe et al., 2005).


During the third, fourth, and fifth post-test data collection periods, we plan to offer an additional $5 “early bird” incentive to longitudinal respondents who originally completed an in-person survey, to encourage them to complete the survey online.


Several studies have shown that early bird incentives can improve response rates. In one study, individuals who received an early bird incentive were 1.8 times more likely to complete the survey within the first 7 days of data collection and 1.69 times more likely to ever complete the survey (LeClere, Plummer, Vanicek, Amaya, & Carris, 2012). Another study showed that an early bird incentive significantly increased the response rate in the first two weeks of data collection (29.7% vs. 20.1%) (Coopersmith, Vogel, Bruursema, & Feeney, 2016). Biemer et al. (2017) used an experimental design to test the effectiveness of incentivizing mode choice when web and mail modes were offered at the same time. Individuals in the experimental group were offered an additional $10 if they completed the survey via the web; the control group was offered the standard incentives. Incentivizing mode choice increased the overall response rate by 3.98 percentage points (42.78% vs. 38.80% for the experimental and control groups, respectively) and increased the proportion of respondents who completed via the web (from 28% to 64%).

ExPECTT, the evaluation of FDA’s public education campaign on tobacco use among youth (The Real Cost), promised an additional $5 incentive to participants who completed the survey online before a specified early bird date. Study staff found that this was an extremely effective method of facilitating timely data collection and promoting online completion of the surveys, which significantly reduced data collection costs. By eliminating the need for an interviewer to visit the household, this practice eliminated the costs of employing field staff and of travel to the households. This method also reduced participant burden by not requiring participants to complete the questionnaire on a specific day and time scheduled with an interviewer; in other words, online completion of the survey allows the participant greater flexibility. This method has also been used for the EFECT, ExPECTT, RESPECT, and RuSTEC campaign evaluations. For Follow-up 1 of RuSTEC, 79% of completes occurred during the early bird period, while 58% of the RuSTEC web responses for Follow-up 2 occurred during the early bird period. For the ExPECTT campaign evaluation, 85% of Follow-up 1 participants, 81% of Follow-up 2 participants, 79% of Follow-up 3 participants, and 80% of Follow-up 4 participants completed the survey during the early bird period. For EFECT Follow-up 3, 75% of longitudinal web respondents completed during the early bird period.


Respondents who are recruited through social media (such as Facebook and Instagram) and who complete the outcome survey online will receive, via email, a link to a $25 virtual gift card (e.g., Visa or Amazon) upon completion of the survey.


A more detailed justification for the use of incentives is provided in Attachment 4. The use of modest incentives is expected to enhance survey response rates without biasing responses; a smaller incentive would not appear sufficiently attractive to participants. We also believe that the incentives will result in higher data validity, as participants will become more engaged in the survey process. This will also enhance overall response to the pre-test and post-test surveys and reduce attrition at follow-up within the embedded longitudinal cohort. The use of incentives will help ensure that pre-test data collection is completed in a timely manner and potentially reduce the number of follow-up visits needed to contact non-respondents. Use of incentives within the embedded longitudinal cohort will reduce attrition, which in turn will reduce respondent burden and the cost of post-test surveys. The specific amount of the proposed incentive is based on several previous projects conducted by RTI, including a survey used to evaluate FDA’s general market tobacco prevention education campaign (ExPECTT) and the National Survey of Child and Adolescent Well-Being, which found that use of similar incentives increased response rates among youth.

  10. Assurance of Privacy Provided to Respondents

In developing this study, CTP consulted the FDA Privacy Officer to identify potential risks to the privacy of participants and other individuals whose information may be handled by or on behalf of FDA in the performance of this study. Prior to consulting the Privacy Officer, CTP had intentionally designed the study to minimize privacy risks in keeping with the Fair Information Practice Principles (FIPPs) and applying controls selected from the National Institute of Standards and Technology (NIST), Special Publication 800-53, Security and Privacy Controls for Federal Information Systems and Organizations. CTP has also identified privacy compliance requirements and coordinated with FDA’s Privacy Officer to ensure responsible offices in CTP satisfy all requirements. The FDA Privacy Office is currently reviewing the Privacy Impact Assessment; once complete, FDA will submit the document to OMB.


PII Collection

As part of this study, RTI International, the contractor acting on behalf of FDA, is collecting and maintaining personally identifiable information (PII) about participants who complete the mail screener, the online screener, and the in-person and online questionnaires. Parents are asked to provide their name and phone number on the mail screener for the purpose of providing permission for their child to participate in the study. Youth completing the mail screener provide their first name, age, gender, and race/ethnicity. Youth who complete the online screener are asked to provide gender, race/ethnicity, zip code (to verify residence in a data collection city), and date of birth (to determine eligibility). Eligible participants are asked to provide their first name; parent name and phone number if they are aged 13 to 14 (so that RTI can contact the parent to request parental permission); email address; cell phone number; and family member email address and/or cell number. IP address is also collected for all participants completing the online screener. PII or potential PII about youth collected in both the online and in-person questionnaires includes date of birth, grade in school, race/ethnicity (asked only of new participants), and family member contact information. For the in-person questionnaire, the first names and ages of other siblings are also collected (in order to screen other potential participants), and parent name and phone number are collected for quality control purposes so that the study team can verify that the interview took place, if needed. For the online questionnaire, zip code is collected to verify residence in a data collection city, email address is collected so that the respondent can receive the virtual gift card for completing the survey, and IP address is collected.


Addresses for the mail screener are obtained from RTI’s address-based sampling frame, which is used to identify households likely to have eligible youth. The foundation of the address-based sampling frame is acquired from the U.S. Postal Service Computerized Delivery Sequence file and then is enhanced by appending ancillary information from public and private sources to better characterize households. Addresses of participants enrolled in the study are maintained so that they can be invited to participate in future rounds of the study.


RTI assigns each respondent a randomly generated unique case identification number which is printed on the mail screener or assigned to the case once screened as eligible online. Once a participant is selected, the case ID and password can be used to access the study online.


Privacy Act Applicability

The information collection is not subject to the Privacy Act of 1974. Hence, no Privacy Act Statement is required to be displayed on the form, website, mobile application, or other point at which individuals submit their information.


Data Minimization

The PII collected or used for this study is limited to the minimum necessary to achieve the authorized purpose and produce a valid study. The purpose of the study is to evaluate the Fresh Empire public education campaign to reduce and prevent tobacco use, which CTP conducts in support of its mandate to positively affect public health with regard to tobacco. The PII is necessary to determine respondent eligibility, contact parents for parental permission when needed, invite participants to participate in future waves of the study, and distribute incentives.


Likewise, any potentially sensitive information gathered from respondents in association with their PII is limited to that which is essential for the study, such as tobacco use and home tobacco environment. Items such as media use and sensation seeking are collected because they are established risk factors for tobacco use in youth.


FDA has minimized the risk of unnecessary access, disclosure, use, or proliferation of PII about respondents. FDA and other parties involved in the study maintain study records containing PII only as long as required (for 3 years after final payment of the contract, in accordance with FAR Subpart 4.7). RTI International will use a case identification number to identify participants. Access to PII is restricted by role to personnel who must access this information. Sensitive records are kept in a secure location until destruction occurs. RTI has in place standard operating procedures, based on RTI policy, to ensure the security and privacy of recorded information during all phases of the destruction process, including pickup and transport of records from RTI's locations to the destruction site. Non-identifiable or de-identified data (i.e., responses to the study, but without any PII) will be sent by the contractor to FDA. No PII will be sent to or be accessible by FDA at any time. Field data collectors and field supervisors sign a detailed data collection agreement when they are hired onto the project. This agreement states, among other things, that they agree to treat as confidential all information obtained during interviews or in the course of completing their project-related activities.


Participants who complete the online survey provide their email address so they can receive a virtual gift card incentive. RTI study staff provide an encrypted file to the incentive provider, Creative Group Inc., containing participants' email addresses and case IDs so that Creative Group can contact participants to deliver the compensation. RTI does not share this information with CTP, and Creative Group Inc. does not have access to any other PII or study data. RTI shares the case ID, password, first name, and mailing address of longitudinal participants with the print vendor, Glover Printing, so that participants can be invited to continue with the study in follow-up waves. This information is sent to Glover via encrypted files. RTI does not share this information with CTP, and the print vendor does not have access to any other PII or study data.


RTI International will not share PII gathered via this collection with any other individuals or entities.


Notice and Transparency

All subjects are provided notice regarding the collection and use of the information they submit. The purpose of the study and the intended use of the information collected is described on the first page of the mail screener. Parents must write in their name and phone number before giving the mail screener to their child to complete. In both the mail and the online screener, youth are told that information collected in the screener will determine their eligibility for the study and must provide assent/consent before completing the mail screener or the online screener. Youth participants who complete either the in-person or online evaluation questionnaires must first read and accept an electronic informed assent form before they can complete the questionnaire. Longitudinal youth who have turned 18 during the study must read and accept an electronic informed consent form. Study materials and website pages are clearly branded as FDA products.


Individual Participation and Control

Participation in the evaluation of the Fresh Empire campaign is entirely voluntary. Participants may choose not to join the study and are free to withdraw at any time from the in-person and online study, including during the course of responding to a questionnaire, without incurring any negative repercussions. For all youth assent and youth consent forms, affirmative assent or consent is obtained by clicking an “accept” button below the electronic assent text.


Third-Party Accountability

RTI is held accountable for complying with privacy and security procedures (including reporting data breaches) by its contract with FDA, which requires that RTI comply with 45 CFR part 46 and with the Contractor's current Federal-wide Assurance (FWA) on file with the Office for Human Research Protections (OHRP), Department of Health and Human Services. The Contractor agrees to provide certification at least annually that the Institutional Review Board has reviewed and approved the procedures involving human subjects in accordance with 45 CFR part 46 and the Assurance of Compliance. RTI also has an established protocol for privacy breaches under which the Project Director notifies RTI's IRB and CTP, which, in turn, notifies RIHSC. In addition, RTI has an Incident Response and Breach Notification Plan in place that activates first responders when an incident occurs and, as required by law, a breach notification policy with respect to protected health information. RTI subcontractors are accountable via contract terms for all data they handle, use, share, and maintain as part of this survey.


Data Security

RTI International's data security procedures for the Federal Information Processing Standards (FIPS) Low network, the RTI network on which the data from the evaluation will be stored, have been reviewed by a FedRAMP-certified third-party organization and deemed acceptable. That organization issued an Authorization to Operate (ATO) for the FIPS Low network.

RTI's Institutional Review Board (IRB) will review and approve the permission, consent, and assent forms (Attachments 5, 5a, 5b, 5c, 5d, 5e, 5f, 5g, 5i) for the outcome evaluation survey. These forms include language for parental permission and adolescent assent, or youth consent. The IRB's primary concern is protecting respondents' rights, including maintaining the privacy of respondent information to the fullest extent of the law.

Concern for privacy and protection of respondents’ rights will play a central part in the implementation of the outcome evaluation study and will receive the utmost emphasis. Interviewers will be thoroughly educated in methods for maximizing a respondent’s understanding of the government’s commitment to privacy to the fullest extent of the law. Several procedures ensure that respondents’ rights are protected. First, the interviewer introduces himself or herself and the study to parents or guardians of eligible youth respondents using the parental permission scripts and the Study Description (Attachments 5, 5a, 5c and 6). As part of the process for obtaining informed assent or consent, youth respondents are given a Study Description (Attachment 6), which includes information on their rights as study participants. Specifically, the Study Description states that respondents’ answers will be used only by authorized personnel for statistical purposes and cannot be used for any other purpose. Parental permission is obtained from the youth’s parent or guardian; subsequently, youth assent is requested. In post-campaign survey rounds, youth who have turned 18 do not require parental permission and can provide their own consent. Signed consent and assent are waived in this study.

After obtaining informed assent or consent, interviewers make every attempt to secure an interview setting in the respondent’s home that is as private as possible. In addition, the interview process, by design, includes techniques to afford privacy for the respondent. The self-administered portion of the interview maximizes privacy by giving control of the interview directly to the respondent. This allows the respondent to read the questions directly from the computer screen and then key his or her own responses into the computer via the keyboard.

At least every 48 hours, data are electronically transferred to RTI’s servers via secure encrypted data transmission. Once the data are securely transmitted from the field to RTI, cases and all associated information are removed from the laptop. Names, email addresses, phone numbers, and mailing addresses are never transmitted to FDA/CTP. Only authorized RTI staff will have access to this information on a need-to-know basis.

Security for respondents of the Web-based surveys will be assured in a number of ways:

    • We will obtain parental permission for all eligible youth screened online prior to the pre-test outcome survey, and for 13- to 14-year-old eligible youth prior to post-test outcome surveys, in full compliance with COPPA's revised standards. Each respondent will remain anonymous and will be known only by a unique alphanumeric identifier; respondents will be asked to provide their email address to receive the incentive.

    • Participants will log onto the secure server hosted by RTI using a unique identifier and password.

    • Respondents will be provided with information about the privacy of their data before they encounter the first survey item.

    • Respondents will be required to freely provide their assent or consent to participate before they encounter the first survey item.

    • Respondents will have the option to decline to respond to any item in the survey for any reason.

All those who handle or analyze data will be required to adhere to RTI's standard data security policies.

To ensure data security, all RTI project staff are required to adhere to strict standards. RTI maintains restricted access to all data preparation areas (i.e., receipt and coding). All data files on multi-user systems are under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. No respondent identifiers will be contained in reports to FDA, and results will only be presented in aggregate form.

Implementation of data security systems and processes occur as part of the survey data collection. Data security provisions involve the following:

    • All data collection activities are conducted in full compliance with FDA regulations to maintain the privacy of data obtained from respondents and to protect the rights and welfare of human research subjects. Respondents receive information about privacy protections as part of the informed consent process.

    • All data collectors are trained on privacy procedures and are prepared to describe them in full detail, if necessary, or to answer any related questions raised by respondents. Training includes procedures for safeguarding sample member information in the field, including securing hardcopy case materials and laptops in the field, while traveling, and in respondent homes, and protecting the identity of sample members.

    • All field interviewers sign a privacy agreement that emphasizes the importance of respondent privacy and describes their obligations.

    • All field staff laptop computers are equipped with encryption software so that only the user or RTI administrators can access any data on the hard drive even if the hard drive is removed and linked to another computer.

    • Laptops use the Microsoft Windows operating system and require multiple valid login IDs and passwords to access any applications or data.

    • All data transferred to RTI servers from field staff laptops are encrypted and transferred via a secure (SSL) broadband connection or, optionally, a secure telephone (land) line. Similarly, all data entered via the Web-based survey system are encrypted, as responses are submitted to a website with an SSL certificate applied. Data are passed through a firewall at RTI and then collected and stored on a protected network share on the RTI network. Only authorized RTI project staff members have access to the data on the secure network share.

    • Respondents recruited through social media (such as Facebook and Instagram) also access the survey with a unique ID and password and complete the survey on a secure server. Respondents who are part of the longitudinal cohort have the option to complete post-test outcome surveys online, following the same secure procedures.

All respondents are assured that the information they provide is maintained in a secure manner and will be used only for the purpose of this research. Respondents are assured that their answers will not be shared with family members and that their names will not be reported with responses provided. Respondents are told that the information obtained from all surveys will be combined into a summary report so that details of individual questionnaires cannot be linked to a specific participant.

Respondents participate on a voluntary basis. The voluntary nature of the information collection is described in the introductory section of the consent process (Attachments 5, 5a, 5b, 5c, 5d, 5e, 5f, 5g, 5i) and the lead letters (Attachments 8 and 8a).

  11. Justification for Sensitive Questions

The majority of questions asked will not be of a sensitive nature, and there will be no request for a respondent's Social Security number. However, it will be necessary to ask some questions that may be considered sensitive in order to assess specific health behaviors, such as cigarette smoking. These questions are essential to the objectives of this information collection. Questions concerning lifestyle (e.g., current smoking behavior, attempts to quit smoking) and some demographic information, such as race, ethnicity, and income, could be considered sensitive, but not highly sensitive. To address any concerns about inadvertent disclosure of sensitive information, respondents will be fully informed of the applicable privacy safeguards. The informed consent process (see Attachments 5, 5a, 5b, 5c, 5d, 5e, 5f, 5g, 5i) will apprise respondents that these topics will be covered during the survey. This study includes a number of procedures and methodological characteristics that will minimize potential negative reactions to these types of questions, including the following:


    • Respondents will be informed that they need not answer any question that makes them feel uncomfortable or that they simply do not wish to answer.

    • Surveys are self-administered and maximize respondent privacy without the need to verbalize responses.

    • Participants will be provided with a specific toll-free phone number (linking directly to the RTI IRB Office) to call if they have a question or concern about sensitive issues.


Finally, as with all information collected, these data will be presented with all identifiers removed.

  12. Estimates of Annualized Burden Hours and Costs

12a. Annualized Burden Hour Estimate

Information will be initially collected through interviews involving youth ages 12 to 17. Those youth will then be asked to participate in subsequent rounds. The sample will be predominantly African American, Hispanic, and Asian/Pacific Islander. Information will be collected prior to and following the campaign’s launch. To better understand youth’s awareness of and receptivity to campaign materials as the campaign evolves, we will collect data starting 6 months after the campaign launches and ending approximately 48 months following the campaign’s launch. Statistical power estimates provide guidance on reasonable expectations for observing statistically significant change in outcomes of interest as detailed in Section B.1.

A mail-based screener is one of the methods used to identify eligible youth (Attachment 2). Parents or guardians will be asked to provide permission and their contact information on this form (burden described below). For the pre-launch survey, the five-minute screener was completed by youth in 13,816 households for a total of 1,151 burden hours. We did not use the mail-based screener for several post-test screening surveys because we were able to rely on social media recruitment, described in further detail below. The mail-based screener will be used again during the fourth post-test survey to recruit new youth aged 12 to 17 to ensure that the sample composition is similar across rounds of data collection. For the fourth post-test survey, the five-minute screener will be completed by 9,869 youth for a total of 822 burden hours. This method will not be used during the fifth post-test survey, for which new participants will be recruited only via social media. The total number of responses for youth completing the mail screener and assent/consent process will be 23,685 over the course of the evaluation, with a total burden of 1,973 hours.

We will also recruit youth through social media (such as Facebook and Instagram) as a secondary strategy to recruit youth aged 13 to 17. An online version of the screener described above will be used to identify eligible youth (included in Attachment 3). Eligible youth will be asked to provide their parents' or guardians' contact information. The pre-test survey required parental permission for all participants. For the post-campaign outcome surveys, newly recruited 15- to 17-year-old youth will not require parental permission and will therefore not be asked to provide their parents' or guardians' contact information. For newly recruited youth aged 13 to 14, we will continue to require parental permission. The five-minute screener was completed by approximately 8,000 youth for the pre-test survey, for a total of 666 burden hours. For the first and second post-test surveys, approximately 24,000 online screeners (2,000 hours) were completed. An additional 4,000 youth will complete the screener during each of the fourth and fifth post-test surveys, for a total of 8,000 additional youth respondents and 666 additional burden hours. The total number of youth completing the online screener will be 40,000, and the total burden will be 3,332 hours over the course of the study.

The process of parents and guardians providing permission for eligible youth will take approximately 1 minute. Parental permission during the pre-test mail screening was provided by 13,816 parents for a total burden of 229 hours. As noted, fewer participants were recruited through social media platforms than anticipated for the pre-test survey; no more than 520 adults were contacted for permission, for a total of 9 burden hours. For the fourth post-test survey, the 1-minute parental mail permission will be completed by 9,869 households, for an additional 164 hours for parents or guardians. To date, approximately 6,000 adults have provided permission for eligible youth recruited online, for a total of 100 burden hours. For the fourth and fifth post-test surveys, an additional 700 adults will be contacted to provide permission for eligible youth recruited online, for a total of 11 additional burden hours. This is a conservative estimate, as not all eligible youth will require parental permission. The total number of parental permissions will be 30,905 over the course of the study, for a total of 513 hours.

For the pre-test survey, 2,194 youth completed the questionnaire with an estimated burden of 30 minutes per respondent, for an annualized total of 1,097 hours. During the first post-test outcome survey, 2,404 youth (1,722 longitudinal and 682 cross-sectional) completed the survey, a larger sample size than anticipated due to successful social media recruitment efforts. During the second post-test outcome survey, a total of 2,255 youth completed the survey: 1,752 longitudinal cases and 503 cross-sectional cases (204 of which were removed from the analytic sample based on Hip Hop score). For the third post-test outcome survey, 2,100 youth (1,365 longitudinal and 735 cross-sectional) are expected to take the survey. For the fourth and fifth post-test outcome surveys, 2,100 youth are expected to complete the survey at each wave. Based on earlier response rates and longitudinal respondents aging out of the eligibility criteria (turning 18 or older), we expect to recruit a larger number of cross-sectional respondents than in previous waves. We estimate that approximately 600 longitudinal youth and 1,500 cross-sectional youth will participate in each of the fourth and fifth post-test surveys (4,200 total). For the post-test surveys, the estimated burden is 45 minutes per respondent, for a total of 8,220 burden hours (5 waves of longitudinal and 5 waves of cross-sectional respondents). The number of respondents completing the post-test surveys, including those originally recruited, will be 6,039 (4,530 hours) for the embedded longitudinal cohort and 4,920 (3,690 hours) for the new cross-sectional respondents.

This data collection will take place in 2015, 2016, 2017, 2018, and 2019. Thus, the target number of completed responses for all respondents is 107,743, an increase of 32,638 since the previous approval. The annualized response burden is now estimated at 15,135 hours, an increase of 4,813 hours. OMB approval for this extension is requested for 3 years. Exhibit 1 provides details about how this estimate was calculated. The Web self-administered surveys will be designed to maximize ease of response (at home on personal computers or mobile devices) and thus decrease respondent burden.

Exhibit 1. Estimated Annual Burden Hours^a

Type of Respondent | Number of Respondents | Number of Responses per Respondent | Total Annual Responses | Average Burden per Response (hours) | Total Hours
Youth Mail Screener – Outcome Survey | 23,685 | 1 | 23,685 | 0.0833 (5 minutes) | 1,973
Cross-Sectional Refresher Sample, Youth Assent/Consent Process and Post-Tests 1–5 – Outcome Survey | 4,920 | 1 | 4,920 | 0.75 (45 minutes) | 3,690
Youth Pre-Test Outcome Survey | 2,194 | 1 | 2,194 | 0.50 (30 minutes) | 1,097
Longitudinal Cohort Youth Assent/Consent Process and Post-Tests 1–5 – Outcome Survey | 6,039 | 1 | 6,039 | 0.75 (45 minutes) | 4,530
Youth Online Screener and Assent/Consent – Outcome Survey | 40,000 | 1 | 40,000 | 0.0833 (5 minutes) | 3,332
Adult Parental Permission Process – Outcome Survey | 30,905 | 1 | 30,905 | 0.0166 (1 minute) | 513
Total | 107,743 | — | 107,743 | — | 15,135

a There are no capital costs or operating and maintenance costs associated with this collection of information.
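
The totals in Exhibit 1 can be cross-checked arithmetically. The sketch below is a Python illustration only, not part of the study protocol; it recomputes each row as total annual responses times average burden per response, allowing a one-hour difference per row because hours are floored per wave in the text before being summed.

```python
# Illustrative cross-check of Exhibit 1 (not part of the study protocol).
# Each row: (total annual responses, average burden per response in hours).
rows = {
    "youth_mail_screener":       (23_685, 0.0833),
    "cross_sectional_posttests": (4_920,  0.75),
    "youth_pretest":             (2_194,  0.50),
    "longitudinal_posttests":    (6_039,  0.75),
    "youth_online_screener":     (40_000, 0.0833),
    "parental_permission":       (30_905, 0.0166),
}
reported_hours = {
    "youth_mail_screener": 1_973, "cross_sectional_posttests": 3_690,
    "youth_pretest": 1_097, "longitudinal_posttests": 4_530,
    "youth_online_screener": 3_332, "parental_permission": 513,
}
for name, (responses, burden) in rows.items():
    # Per-wave flooring in the text can shift a row's total by up to an hour.
    assert abs(responses * burden - reported_hours[name]) <= 1, name

total_responses = sum(r for r, _ in rows.values())  # 107,743
total_hours = sum(reported_hours.values())          # 15,135
```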

12b. Annualized Cost Burden Estimate


Respondents participate on a purely voluntary basis and, therefore, are subject to no direct costs other than the time to participate. There are also no start-up or maintenance costs. RTI has conducted many smoking-related surveys of similar length among youth. Based on diagnostic data from these prior surveys, we estimate that data collection, whether in person or on the web, will take approximately 30 minutes per respondent for the pre-test outcome survey and 45 minutes for the post-test surveys. According to the U.S. Department of Labor (DOL) Bureau of Labor Statistics, the average hourly wage in 2013 was $8.19 for ages 16 to 19. Thus, assuming an average hourly wage of $8.19 for youth respondents and an hourly wage of $24.75 for adults, the estimated total cost to participants will be $132,451. The estimated value of respondents' time for participating in the information collection is summarized in Exhibit 2.


Exhibit 2. Estimated Annual Cost

Type of Respondent | Activity | Annual Burden Hours | Hourly Wage Rate | Total Cost
Youth aged 12 to 17 in the United States | Mail Screener – Outcome Survey | 1,973 | $8.19 | $16,159
Cross-Sectional Youth Refresher Sample, aged 12 to 17 | Assent/Consent Process and Post-Tests 1–5 – Outcome Survey | 3,690 | $8.19 | $30,221
Youth aged 12 to 17 in select media markets | Pre-Test and Assent/Consent Process – Outcome Survey | 1,097 | $8.19 | $8,984
Longitudinal Youth Cohort, aged 13 to 18 | Assent/Consent Process and Post-Tests 1–5 – Outcome Survey | 4,530 | $8.19 | $37,101
Youth aged 13 to 17 in the United States in select media markets | Online Screener – Outcome Survey | 3,332 | $8.19 | $27,289
Adults 18 and older in the United States | Parental Permission Process – Outcome Survey | 513 | $24.75 | $12,697
Revised Total | | 15,135 | | $132,451
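
Each row of Exhibit 2 follows from burden hours multiplied by the hourly wage, rounded to whole dollars. The sketch below is an illustrative Python check using the wage rates stated in the text, not part of the study protocol.

```python
# Illustrative cross-check of Exhibit 2: cost = burden hours x hourly wage.
YOUTH_WAGE, ADULT_WAGE = 8.19, 24.75  # wage rates stated in the text

burden_hours = [
    (1_973, YOUTH_WAGE),  # mail screener
    (3_690, YOUTH_WAGE),  # cross-sectional refresher, post-tests 1-5
    (1_097, YOUTH_WAGE),  # pre-test
    (4_530, YOUTH_WAGE),  # longitudinal cohort, post-tests 1-5
    (3_332, YOUTH_WAGE),  # online screener
    (513,   ADULT_WAGE),  # parental permission
]
costs = [round(hours * wage) for hours, wage in burden_hours]
total_cost = sum(costs)  # 132,451
```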

  13. Estimates of Other Total Annual Costs to Respondents and/or Recordkeepers/Capital Costs

There are no capital, start-up, operating, or maintenance costs associated with this information collection.

  14. Annualized Cost to the Federal Government

This information collection is funded through a contract with RTI. The actual cost under the original contract was $3,591,502. A new contract to extend the study has been awarded in the amount of $5,368,058. The estimated costs attributable to this data collection now total $8,996,243 (Exhibit 3), an increase in the cost to the government of $968,125. This total includes contract-funded activities occurring before and after the data collection itself, such as project planning and data analysis, as well as coordination with FDA and its media contractor, evaluation plan development, instrument development, RTI IRB review, reporting, and project management. This information collection will occur from 2015 through 2019.

Exhibit 3. Itemized Cost to the Federal Government

Government Personnel | Time Commitment | Average Annual Salary | Total
GS-13 | 25% | $73,846 | $18,462
GS-14 | 15% | $87,263 | $13,089
GS-15 | 5% | $102,646 | $5,132
Total Salary Costs | | | $36,683
Contract Cost | | | $8,959,560
Total | | | $8,996,243
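
Exhibit 3's figures can likewise be reproduced: each personnel line is the time commitment multiplied by the average annual salary, and the contract cost is the sum of the original contract and the extension. The Python sketch below is for illustration only, assuming that rounding to whole dollars.

```python
# Illustrative cross-check of Exhibit 3 (assumes total = commitment x salary).
personnel = {
    "GS-13": (0.25, 73_846),
    "GS-14": (0.15, 87_263),
    "GS-15": (0.05, 102_646),
}
salary_costs = {grade: round(pct * salary)
                for grade, (pct, salary) in personnel.items()}
total_salary = sum(salary_costs.values())  # 36,683
contract_cost = 3_591_502 + 5_368_058      # original contract + extension
grand_total = total_salary + contract_cost # 8,996,243
```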

  15. Explanation for Program Changes or Adjustments

FDA requests OMB approval to extend the evaluation of FDA's multicultural youth tobacco public education campaign and to add two additional waves of data collection. Continued evaluation is necessary to determine the campaign's impact on outcomes of interest. Adding the two waves increases the target number of completed responses for all respondents to 107,743, an increase of 32,638 responses, and the annualized response burden to 15,135 hours, an increase of 4,813 hours.

  16. Plans for Tabulation and Publication and Project Time Schedule

Data from this information collection will be used to estimate awareness of and exposure to the campaign among multicultural youth. These estimates will take the form of self-reported ad recognition and recall that assess basic exposure as well as frequency of ad exposure. These estimates will also be calculated separately for each specific campaign advertisement.


Data from this information collection will also be used to examine statistical associations between exposure to the campaign and pre-post changes in specific outcomes of interest for campaign and comparison groups. We will conduct two primary types of analyses. The first will focus on aggregate changes in outcomes from the pre- to post-campaign periods between the campaign and comparison cities. The second will focus on individual changes in outcomes as a function of campaign exposure, which will vary within and across campaign and comparison cities. The embedded longitudinal cohort may also permit some longitudinal analysis. The primary outcomes of interest among youth will be awareness of the campaign as well as tobacco-related beliefs, attitudes, intentions, and behaviors. We hypothesize that changes in outcomes will be larger among individuals with more frequent campaign exposure (i.e., dose-response effects).


In addition to relying on self-reported exposure, we will also use measures of market-level campaign intensity, constructed from available data on campaign activities, including traditional and digital advertising and local campaign events. These data will be merged with the survey data to provide an additional measure of campaign exposure among study participants. This will allow us to analyze the relationship between the market-level delivery of the campaign and actual levels of awareness in each sample that is collected. It will also facilitate further analyses of the relationship between exogenous market-level measures of campaign dose and changes in the aforementioned outcome variables of interest.


The reporting and dissemination mechanism will consist of three primary components: (1) summary statistics (in the form of PowerPoint presentations and other briefings) on individual awareness of and reactions to the campaign, (2) a comprehensive evaluation report summarizing findings from this information collection, and (3) at least two peer-reviewed journal articles that document the relationships between campaign exposure and changes in the aforementioned outcomes of interest. The key events and reports to be prepared are listed in Exhibit 4.


Exhibit 4. Project Schedule

Project Activity | Date
Pre-test data collection | July – October 2015
Post-test data collection | April 2016 – June 2019
Preparation of analytic data file | Approximately 4 weeks after completion of data collection
Data analysis | Approximately 5 – 12 weeks after completion of each analytic data file
Report writing and dissemination | Approximately 12 – 16 weeks after completion of each analytic data file

  17. Reason(s) Display of OMB Expiration Date is Inappropriate

Not applicable. All data collection instruments will display the expiration date for OMB approval of the information collection.

  18. Exceptions to Certification for Paperwork Reduction Act Submissions

Not applicable. There are no exceptions to the certification statement.

References

Abreu, D. A., & Winters, F. (1999). Using monetary incentives to reduce attrition in the survey of income and program participation. Proceedings of the Survey Research Methods Section of the American Statistical Association.

Biemer, P. P., Murphy, J., Zimmer, S., Berry, C., Deng, G., & Lewis, K. (2017). Using Bonus Monetary Incentives to Encourage Web Response in Mixed-Mode Household Surveys. Journal of Survey Statistics and Methodology, 1–22. https://doi.org/10.1093/jssam/smx015.

Castiglioni, L., Pforr, K., & Krieger, U. (2008). The effect of incentives on response rates and panel attrition: Results of a controlled experiment. Survey Research Methods, 2(3), 151–158.

Centers for Disease Control and Prevention. (2012). Youth Risk Behavior Surveillance–United States, 2011. Morbidity and Mortality Weekly Report, 61(4), 1–162.

Coopersmith, J., Vogel, L. K., Bruursema, T., & Feeney, K. (2016). Effects of Incentive Amount and Type of Web Survey Response Rates. Survey Practice, 9(1).

Davis, K. C., Nonnemaker, J., Duke, J., & Farrelly, M. C. (2013). Perceived effectiveness of cessation advertisements: The importance of audience reactions and practical implications for media campaign planning. Health Communication, 28(5), 461–472. doi:10.1080/10410236.2012.696535

Davis, K. C., Uhrig, J., Bann, C., Rupert, D., & Fraze, J. (2011). Exploring African American women’s perceptions of a social marketing campaign to promote HIV testing. Social Marketing Quarterly, 17(3), 39–60.

Dillard, J. P., Shen, L., & Vail, R. G. (2007). Do perceived message effectiveness cause persuasion or vice versa? Seventeen consistent answers. Human Communication Research, 33, 467–488.

Dillard, J. P., Weber, K. M., & Vail, R. G. (2007). The relationship between the perceived and actual effectiveness of persuasive messages: A meta-analysis with implications for formative campaign research. Journal of Communication, 57, 613–631.

Farrelly, M. C., Davis, K. C., Haviland, M. L., Messeri, P., & Healton, C. G. (2005). Evidence of a dose-response relationship between “truth” antismoking ads and youth smoking prevalence. American Journal of Public Health, 95(3), 425431. doi: 10.2105/AJPH.2004.049692

Jäckle, A., & Lynn, P. (2008). Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias. Survey Methodology, 34(1), 105–117.

Janega, J. B., Murray, D. M., Varnell, S. P., Blitstein, J. L., Birnbaum, A. S., & Lytle, L. A. (2004). Assessing the most powerful analysis method for schools intervention studies with alcohol, tobacco, and other drug outcomes. Addictive Behaviors, 29(3), 595–606.

LeClere, F., Plummer, S., Vanicek, J., Amaya, A., & Carris, K. (2012). Household early bird incentives: Leveraging family influence to improve household response rates. American Statistical Association Joint Statistical Meetings, Section on Survey Research, 4156–4165.

Murray, D. M., & Blitstein, J. L. (2003). Methods to reduce the impact of intraclass correlation in group-randomized trials. Evaluation Review, 27(1), 79–103.

Murray, D. M., & Short, B. J. (1997). Intraclass correlation among measures related to tobacco-smoking by adolescents: Estimates, correlates, and applications in intervention studies. Addictive Behaviors, 22(1), 1–12.

Shettle, C., & Mooney, G. (1999). Monetary incentives in U.S. government surveys. Journal of Official Statistics, 15, 231–250.

Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey Nonresponse (pp. 163–177). New York, NY: Wiley.

Snyder, L. B., Hamilton, M. A., Mitchell, E. W., Kiwanuka-Tondo, J., Fleming-Milici, F., & Proctor, D. (2004). A meta-analysis of the effect of mediated health communication campaigns on behavior change in the United States. Journal of Health Communication, 9, 71–96.

Substance Abuse and Mental Health Services Administration (SAMHSA). (2012). Results from the 2011 National Survey on Drug Use and Health: Summary of national findings. NSDUH Series H-44, HHS Publication No. (SMA) 12-4713. Rockville, MD: Substance Abuse and Mental Health Services Administration.

U.S. Department of Health and Human Services (USDHHS). (2006). The health consequences of involuntary exposure to tobacco smoke: A report of the Surgeon General. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, Coordinating Center for Health Promotion, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health.

Wakefield, M. A., Spittal, M. J., Yong, H-H., Durkin, S. J., & Borland, R. (2011). Effects of mass media campaign exposure intensity and durability on quit attempts in a population-based cohort study. Health Education Research, 26(6), 988–997.
