Evaluation of the Food and Drug Administration’s Point-of-Sale Campaign

OMB: 0910-0851


B. Statistical Methods

The one-time actual burden figures listed in Part A, Item 12, and Table 1 of this supporting statement have been divided by 3 to provide annual burden estimates. These figures are listed in parentheses as “annualized” throughout this part of the supporting statement.

B.1 Respondent Universe and Sampling Methods


The primary outcome study will consist of a longitudinal survey of 4,282 adults. This longitudinal design allows us to calculate changes over time in campaign-targeted outcomes for each study participant. We hypothesize that, if the campaign is effective, outcomes will differ between participants in the treatment and control groups, and that these differences will be larger among treatment-group participants who are exposed to the campaign more frequently (i.e., dose-response effects). Eligible adults will be aged 25 to 54 at screening and 27 to 56 by the end of data collection. Screening criteria are age, current cigarette smoking, full-time residence in a selected household in one of the 30 U.S. counties, and not being on active military duty. We started by identifying 37 U.S. counties with sufficient media capabilities (“media buy”) to support the campaign and a sufficient population of smokers to evaluate the campaign’s efficacy. From these, we randomly selected 30 counties.


From each of the 30 counties, we will randomly select approximately 15 postal carrier routes, resulting in 450 carrier routes or address clusters. We obtained these clusters from RTI’s Address Based Sampling frame, which we acquired from the U.S. Postal Service Computerized Delivery Sequence file; this file contains all mail delivery points in the U.S. To obtain the 2,475 Wave 4 interviews (825 annualized), we will randomly select 104,541 locatable addresses from the selected carrier routes within the selected counties (locatable addresses are those suitable for in-person data collection; post office boxes are not considered locatable). Based on prior experience with similar sample designs, we expect approximately 52% of the selected units to be occupied and to complete the screener by mail, in person, or both, resulting in 53,909 screeners.
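To make the multistage selection concrete, the sketch below (in Python) mirrors the county, carrier-route, and address stages described above. The county identifiers, route labels, and per-route address counts are hypothetical placeholders rather than the actual ABS frame; the sketch illustrates only the structure of the draw.

```python
# Illustrative sketch of the multistage address selection described above.
# County identifiers, route labels, and addresses-per-route figures are
# hypothetical placeholders, not the actual sampling frame.
import random

random.seed(2017)

# Stage 1: 30 evaluation counties drawn at random from the 37 qualifying counties.
qualifying_counties = [f"county_{i:02d}" for i in range(1, 38)]
selected_counties = random.sample(qualifying_counties, 30)

# Stage 2: ~15 postal carrier routes per selected county (450 clusters total).
routes = {
    county: [f"{county}_route_{j:02d}" for j in random.sample(range(100), 15)]
    for county in selected_counties
}

# Stage 3: locatable addresses sampled from the selected routes.
# Roughly 104,500 addresses across 450 routes is about 232 per route.
TARGET_ADDRESSES = 104_541
addresses_per_route = TARGET_ADDRESSES // (30 * 15)

sample = [
    (county, route, f"address_{k:04d}")
    for county, county_routes in routes.items()
    for route in county_routes
    for k in range(addresses_per_route)
]

print(len(selected_counties), "counties,",
      sum(len(r) for r in routes.values()), "routes,",
      len(sample), "addresses selected")
```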


The expected 53,909 screenings are based on an anticipated total of 27,651 mail screenings (9,217 annualized respondents) and 26,258 in-person screenings (8,753 annualized respondents). Of the total mail and in-person screenings (53,909), we expect approximately 10% of the households that complete the screener to contain one or more current smokers between the ages of 25 and 54, resulting in approximately 5,170 households. This estimate is based on 10.65% of households sampled in the 2014-2015 Tobacco Use Supplement to the Current Population Survey (TUS-CPS) having a smoker between the ages of 25 and 54. Of the households that meet the inclusion criteria, we expect approximately 80% to be willing to complete the Wave 1 questionnaire, resulting in 4,282 participants (1,427 annualized). We will sample only one participant per household.

We will also complete a telephone verification questionnaire with a subsample of participants during each wave of data collection to check interviewer accuracy and ensure that all incentives are distributed. For telephone verification for the field screener and questionnaire 1, we will select for verification 10% of those who complete the Wave 1 questionnaire (4,282; 428 verifications) and 10% of those who complete the in-person screener but do not complete the Wave 1 questionnaire, either because of ineligibility or declining to participate (26,258-4,282=21,968; 2,197 verifications), resulting in 2,626 participants (875 annualized) completing telephone verification 1.

We expect that 54% of those who complete the Wave 1 questionnaire will download the smartphone app and complete the three questionnaires, resulting in 2,308 participants (769 annualized) for the app-based portion of data collection. Because of the incentives, the initial in-person data collection, and the option for participants to complete the Waves 2, 3, and 4 evaluation questionnaires online, we expect 80% of those interviewed at Wave 1 to respond to Wave 2, 85% of those interviewed at Wave 2 to respond to Wave 3, and 85% of those interviewed at Wave 3 to respond to Wave 4, yielding approximately 3,426 completes (1,142 annualized) for Wave 2, 2,912 (971 annualized) for Wave 3, and 2,475 (825 annualized) for Wave 4. We anticipate that 10% of Wave 2 completers will complete telephone verification questionnaire 2, resulting in 343 participants (114 annualized). Ten percent of Wave 3 completers will complete telephone verification questionnaire 3, yielding 291 participants (97 annualized), and 10% of Wave 4 completers will complete telephone verification questionnaire 4, yielding 247 participants (82 annualized). Table 6 outlines this progression of one-time actual numbers for the campaign evaluation. Audio recordings of interviews (which will be conducted with approximately 10% of participants) are not included in this table because they will be conducted with a subset of the Waves 1, 2, 3, and 4 respondents.


Table 6. Addresses and the Associated Assumptions to Yield the Needed Number of Completes

Activity                             Adult current smokers
Selected addresses                   104,451 (100%)
Total mail and field screenings       53,909 (52%)
Eligible persons                       5,353 (10%)
Wave 1 completes                       4,282 (80%)
Telephone verification 1               2,625 (10%)*
App-based questionnaires (3)           2,308 (54%)
Wave 2 completes                       3,426 (80%)
Telephone verification 2                 343 (10%)
Wave 3 completes                       2,912 (85%)
Telephone verification 3                 291 (10%)
Wave 4 completes                       2,475 (85%)
Telephone verification 4                 247 (10%)

* This figure is 10% of those who complete the Wave 1 questionnaire (10% of 4,282=428) and 10% of those who complete the in-person screener but do not complete the Wave 1 questionnaire (10% of 21,968=2,197).

The 54% response rate for the app-based questionnaires is based on the percentage of Americans with smartphones (77%) times the anticipated percentage of participants who will download the app and complete the three app-based questionnaires (70%).
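The response-rate assumptions above can be chained together to reproduce the approximate yields in Table 6. The short sketch below applies those rates starting from the expected pool of eligible persons in Table 6; rounding at each step is our own simplification, so results may differ from the table by a unit.

```python
# Sketch of the response-rate arithmetic behind Table 6. Rates are taken from
# the text above; intermediate rounding is an assumption.
eligible   = 5_353                       # persons screening eligible (Table 6)
wave1      = round(eligible * 0.80)      # 80% complete Wave 1          -> ~4,282
app_sample = round(wave1 * 0.77 * 0.70)  # smartphone x app completion  -> ~2,308 (54%)
wave2      = round(wave1 * 0.80)         # 80% of Wave 1 respond at Wave 2 -> ~3,426
wave3      = round(wave2 * 0.85)         # 85% of Wave 2 respond at Wave 3 -> ~2,912
wave4      = round(wave3 * 0.85)         # 85% of Wave 3 respond at Wave 4 -> ~2,475

for label, n in [("Wave 1", wave1), ("App-based", app_sample),
                 ("Wave 2", wave2), ("Wave 3", wave3), ("Wave 4", wave4)]:
    print(f"{label:<10}{n:>7,}")
```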



For the purposes of estimating statistical power for the national sample, we assume that the test statistic evaluating campaign impact will involve a two-tailed hypothesis test with a Type I error rate of 0.05 and a Type II error rate of 0.20, yielding 80% statistical power. Our estimates include an intraclass correlation coefficient (ICC) of 0.007 to account for the geographic clustering of respondents. To some extent, these factors are offset by parameters that will serve to reduce variation; those parameters include a person-time correlation of 0.765 at the cluster and individual levels. These parameter estimates are available in the published literature and are supported by our experience conducting similar studies (Murray & Short, 1997; Murray & Blitstein, 2003; Janega et al., 2004; Farrelly et al., 2005). Statistical models will be used to assess differences between the treatment and control counties in motivation to quit smoking among adults aged 25 to 54, the primary outcome of the media campaign. We anticipate a 4.5-percentage-point difference between the treatment and control counties in the percentage of smokers who report that they want to quit smoking. This difference could result in lower smoking prevalence in the treatment counties compared with the control counties. This expectation is reasonable and similar to the change observed as a result of a previous adult mass media tobacco campaign (McAfee et al., 2013). Based on these parameters, we anticipate data collection will include complete and repeated measures on approximately 2,475 adults in the target age range across all four longitudinal data collections.
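As a rough illustration of how these parameters interact, the sketch below applies a standard two-sample comparison of proportions, inflates the required sample size by a simple design effect for county-level clustering, 1 + (m - 1) x ICC, and credits the person-time correlation as a variance reduction. This is a back-of-the-envelope approximation only; the assumed 50% baseline proportion of smokers who want to quit and the single-design-effect adjustment are our simplifications, not the study's actual mixed-model power calculation.

```python
# Back-of-the-envelope power sketch: two-sample comparison of proportions,
# inflated for clustering (DEFF = 1 + (m - 1) * ICC) and deflated for the
# person-time correlation. Illustrative only; the 50% baseline is an assumption.
from scipy.stats import norm

alpha, power = 0.05, 0.80
icc, r       = 0.007, 0.765
n_counties, n_completes = 30, 2_475
m = n_completes / n_counties                 # average completes per county cluster

p_control   = 0.50                           # assumed baseline "want to quit" proportion
p_treatment = p_control + 0.045              # hypothesized 4.5-percentage-point lift

z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
var = p_control * (1 - p_control) + p_treatment * (1 - p_treatment)
n_per_group = (z ** 2) * var / (p_treatment - p_control) ** 2

deff = 1 + (m - 1) * icc                     # inflation for geographic clustering
n_adjusted = n_per_group * deff * (1 - r)    # credit for repeated measures on persons

print(f"Unadjusted n per group: {n_per_group:,.0f}")
print(f"Design effect: {deff:.2f}; adjusted n per group: {n_adjusted:,.0f}")
```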


In addition to the primary outcome evaluation, we will conduct three app-based questionnaires with all members of the longitudinal sample who complete the Wave 1 questionnaire, have a smartphone, and agree to download the smartphone app, resulting in an anticipated sample of 2,308 (769 annualized) adults. The app will track passive exposure to the media campaign by recording every time a participant visits a convenience store, including stores that feature campaign materials in treatment counties and stores on a list of comparable convenience stores in control counties. In addition, the app will send participants a brief questionnaire three times over 18 months (approximately every 6 months). The app-based questionnaires will provide additional data points for cigarette smoking, tobacco purchasing behavior, intention to quit smoking, and campaign awareness. However, these questionnaires are primarily designed to check that the app is functioning, specifically the passive exposure monitoring feature. App-based questionnaires will be staggered to occur between Waves 1 through 4 of the outcome evaluation questionnaires.

During the Wave 1 interview, all adults will be provided with information on how to download and use the app. Participants will also be advised of the privacy of their data and asked to provide their consent to participate before downloading the app. The consent form will include information about potential data charges that may result from use of the app. All data collected by the app will be associated with the e-mail address that the participant selects to log into the app; this address will also be used to send a $5 electronic gift card, or $5 worth of electronic points redeemable from an online vendor, upon completion of each app-based questionnaire. The app will not collect names, phone numbers, or physical addresses; it will collect e-mail addresses only for reimbursement. Either a case identification number or the participant's e-mail address will be used to link app-based participant data with outcome evaluation questionnaire data. Data obtained from the app will be stored on a secure cloud-based server managed by the app vendor and then securely transmitted to RTI's secure network server. Once data collection is complete, these data will be scrubbed from all of the app vendor's servers and stored only on secure RTI servers.
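Although the app vendor's geofencing implementation is not specified in this document, the sketch below illustrates one common way passive store-visit detection can work: each location ping is compared against a list of store coordinates, and an entry or exit event is logged when the device crosses a distance threshold. The store coordinates, radius, and ping data shown are hypothetical.

```python
# Illustrative sketch of passive store-visit detection via a distance check
# against known store coordinates. Coordinates, radius, and pings are
# hypothetical; this is not the app vendor's actual geofencing implementation.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

STORES = {"store_001": (35.9049, -78.8637)}   # hypothetical campaign store
RADIUS_M = 50                                  # assumed geofence radius

def detect_events(pings, state):
    """Yield (timestamp, store_id, 'enter'/'exit') events from timestamped pings."""
    for ts, lat, lon in pings:
        for store_id, (s_lat, s_lon) in STORES.items():
            inside = haversine_m(lat, lon, s_lat, s_lon) <= RADIUS_M
            if inside and not state.get(store_id):
                state[store_id] = True
                yield ts, store_id, "enter"
            elif not inside and state.get(store_id):
                state[store_id] = False
                yield ts, store_id, "exit"

pings = [("2018-01-05T10:00", 35.9100, -78.8700),   # away from the store
         ("2018-01-05T10:12", 35.9049, -78.8636),   # inside the geofence
         ("2018-01-05T10:25", 35.9070, -78.8660)]   # back outside
print(list(detect_events(pings, state={})))
```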

B.2 Procedures for the Collection of Information

B.2.1 Screening and Outcome Evaluation Data Collection Wave 1


This section describes the procedures for screening and Wave 1 data collection. Screening will be conducted by mail and in person by field interviewers at respondents’ homes. To be eligible, adults must be aged 25 to 54, be current smokers at the time of the screening, not be serving on active military duty, and be full-time residents in one of the selected households located in the 30 identified U.S. counties.


A mail screener (Attachment 1a) will be sent to all identified households with a cover letter (Attachment 5a) that briefly explains the purpose of the study. The cover letter will request the cooperation of an adult aged 18 or older in each household to identify potentially eligible adults living in the household. This letter will be printed on project-specific letterhead with the signatures of FDA’s Study Sponsor and RTI’s Project Director. The mail screener materials will contain a $2 prepaid incentive (a $2 bill) and a pre-addressed, postage-paid envelope for returning the screener. One week after the initial mailing, we will send all households a postcard (Attachment 5b). The postcard will be on FDA letterhead and will thank those who have already completed and returned the screener and remind those who have not to do so.


Approximately one month after the first screener is mailed, half of the households that have not returned it will be randomly selected to receive a second mail screener; this approach increases response to the screener while managing the cost of an additional mailing. The second screener packet will include a mail screener identical to the first mailing (Attachment 1a) and a cover letter that is identical to the first screener’s cover letter except that it instructs household members who have already completed one screener not to complete another (Attachment 5c). This mailing will not contain an incentive but will contain a pre-addressed, postage-paid envelope identical to the one included in the first mail screener packet.


Households that return a mail screener indicating that one or more household members is eligible for participation, as well as a subsample of households that do not return the mail screener, will be visited by trained staff.

Upon arrival at each household, the interviewer will ask to speak to an adult resident aged 18 or older.


After identifying an adult resident to speak to, the interviewer will show the adult resident a copy of the paper mail screener mailed to the household. Then, using a tablet computer, the interviewer will administer a field screener to determine the first names of all eligible participants in the household (adult current smokers between the ages of 25 and 54; see Attachment 1b). It is necessary to have all households complete the field screener, regardless of whether they returned the mail screener, because we are sampling counties in primarily urban areas (and urban populations are highly mobile), resulting in a high probability that household composition will change between return of the mail screener and the arrival of the in-person interviewer. This is consistent with our approach of using the mail screener primarily to eliminate ineligible households. After the interviewer enters the answers to the in-person screening questions into the tablet computer, the computer will randomly select one eligible adult for participation in the Wave 1 outcome evaluation questionnaire. If the selected individual is available, the interviewer will confirm the person’s eligibility and proceed with the Wave 1 evaluation questionnaire. If the selected individual is not available, the interviewer will leave a card so that the eligible adult can contact him or her to return and complete the interview (Attachment 2a). If no one is home during the initial visit to the household, the interviewer will also have the option to leave a card (see Attachment 6a) to inform the residents that the interviewer plans to visit the household at a different time. Further visits will be made as soon as feasible after the initial visit to conduct the Wave 1 evaluation questionnaire with the selected participant. Interviewers will make at least four additional visits beyond the initial visit to each household to complete the screening process and, if at least one adult is selected for an interview, up to another four visits to complete interviews with selected adults.
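The within-household selection step can be summarized in a few lines: the tablet program screens household members against the eligibility criteria and randomly selects one eligible adult. The sketch below is illustrative only; the member records and field names are hypothetical.

```python
# Minimal sketch of within-household random selection of one eligible adult
# (current smoker aged 25-54, not on active military duty). Member data and
# field names are hypothetical.
import random

def eligible(member):
    return (25 <= member["age"] <= 54
            and member["current_smoker"]
            and not member["active_military"])

def select_participant(household_members, rng=random):
    candidates = [m for m in household_members if eligible(m)]
    return rng.choice(candidates) if candidates else None

household = [
    {"name": "A", "age": 41, "current_smoker": True,  "active_military": False},
    {"name": "B", "age": 29, "current_smoker": True,  "active_military": False},
    {"name": "C", "age": 63, "current_smoker": True,  "active_military": False},
]
print(select_participant(household))   # one of A or B, chosen at random
```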


If the interviewer is unable to contact an adult aged 18 or older at the household after repeated attempts, the field supervisor may send an unable-to-contact letter (see Attachment 6b) to reiterate information provided in the mail screener cover letter and to ask for participation in the study. If the interviewer is still unable to contact anyone at a household, the interviewer might send an additional call-me letter (see Attachment 6b) that requests that the residents call the field supervisor to set up a screening appointment.


When a potential respondent refuses to complete the household screening procedures, the interviewers will rely on their training and experience to accept the refusal in a positive manner. This technique will reduce the potential for creating an adversarial relationship between the residents and the interviewer that could preclude future visits. The supervisor might also request a refusal letter (see Attachment 6b) be sent to the residence that is tailored to the specific concerns expressed by the potential respondent and asks him or her to reconsider participating in the study. Refusal letters will also include the supervisor’s telephone number in case the potential respondent has questions or would like to set up an appointment with the interviewer. Unless the respondent calls the supervisor or RTI’s office to refuse participation in the study, one further attempt to enlist the household’s cooperation will be made by specially selected interviewers with experience in addressing initial refusals. These trained interviewers will be selected based on their proximity to the case to minimize travel costs.


If the randomly selected adult is available and agrees to complete the Wave 1 evaluation questionnaire, the participant will read the consent form (Attachment 3a, 2a). Then, the participant will self-administer the Wave 1 evaluation questionnaire (Attachment 2a).


For each adult selected, the interviewer will follow these steps:


    • Obtain electronic consent from the adult. The consent form, which will appear as the first visible screen on the laptop, will be designed to communicate the goals and procedures to adults aged 25 to 54 (Attachment 3a). The participant will read the text of the consent document and click a check box (Attachment 2a) if he or she consents to participate.

    • If the participant is selected for audio recording (for quality control purposes) and the participant consents to it, the interviewer will record parts of the interview.

    • Next, the participant will complete the Wave 1 evaluation questionnaire in a prescribed and uniform manner. The interviewer will show the participant how to use the laptop, and the participant will self-administer the questionnaire using the laptop.

    • At the end of the Wave 1 questionnaire, the interviewer will administer the locator module to obtain contact information for the respondent.

    • Then, the interviewer will offer the participant the option to participate in the optional app-based portion of the study.

    • If the participant has a smartphone and agrees to participate, the interviewer will ask the participant to complete the separate consent form for the app-based portion of the study (Attachment 3b).

    • Then, the laptop will display the instructions for downloading the app (Attachment 4b).

    • The participant will then download the app before the interviewer leaves the home. Once the app is downloaded, the participant will enter either a case identification number or e-mail address and a password into the app to create a unique login. The interviewer will be available to answer any questions about the download instructions, but participants will be directed to contact the app vendor directly if technical difficulties occur.


The selected eligible adult respondent will be offered $25.00 in cash upon completion of the Wave 1 questionnaire in appreciation of the time required to complete it.


All interview data will be transmitted at least every 48 hours via secure, encrypted data transmission to RTI’s offices, where the data will be processed and prepared for analysis, reporting, and data file delivery. Upon transmission to RTI, all data will be automatically wiped from the data collection devices used in the field.


B.2.2 App-based Data Collection


This section describes the procedures for the app-based data collection. Geolocation and app-based questionnaire data will be collected through a smartphone app for all survey cohort members who choose to participate. The app will use geolocation information from the participant’s phone to collect the date, time, and location data whenever a participant enters or exits a convenience store. The app will also administer three brief questionnaires over 18 months (approximately every 6 months). All geotracking and app-based questionnaires will be conducted by a digital health and behavior firm. Data will be stored on the app vendor’s cloud-based server and securely transmitted to RTI’s secure network server. Sample size is estimated to be approximately 2,308 for all three waves of app-based data collection. This figure is based on the assumption that 77% of participants who complete the Wave 1 evaluation questionnaire will have a smartphone (Smith, 2017) and that 70% of those with smartphones will download the app and complete the three app-based questionnaires.


The app-based questionnaires contain the same questions at all three time points. Participants complete the questionnaires in the app on their personal smartphones. E-mail addresses or case identification numbers and passwords will be used to create an account for each participant. App-based notifications will alert participants when it is time to complete an app-based questionnaire (Attachment 4d). If the participant agrees, he or she may also receive reminders via e-mail and/or text message (Attachment 4d). No name, physical address, or phone number is recorded by the app; e-mail addresses will be used only for reimbursement. E-mail addresses and/or case identification numbers will be used to link the data from the app to data obtained from the Waves 1 through 4 evaluation questionnaires. Data entered into the app-based questionnaires are uploaded to the app vendor’s secure cloud and transferred securely to RTI’s server. Upon completion of each questionnaire, the participant will be sent electronic points or an electronic gift card valued at approximately $5 per questionnaire (up to $15 over the course of the study). Instructions for redeeming this incentive will be e-mailed to participants (Attachment 4d).


The primary purpose of the app is to track passive exposure to the campaign (see Attachment 3b). For this reason, the app will collect the date and time whenever the participant enters and leaves convenience stores. The app will use geolocation technology to detect visits to the convenience stores. We will not collect any data on the purchases the participant makes within the convenience store.


Geolocation data from the app will be compared to participant self-reported exposure to the campaign to determine campaign reach, the accuracy of self-reported exposure, and the relationship between campaign exposure and awareness.
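One simple way such a comparison could be summarized is to treat the app's geolocation record as the reference and compute the sensitivity and specificity of self-reported exposure against it, as in the hypothetical sketch below. The data frame and column names are illustrative, not part of the study's analysis plan.

```python
# Illustrative comparison of app-recorded store visits with self-reported
# campaign exposure. The data frame and column names are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "participant_id":   [101, 102, 103, 104, 105],
    "app_store_visits": [4, 0, 2, 0, 7],       # visits logged by the app
    "self_report_seen": [1, 0, 0, 1, 1],       # reported seeing campaign materials
})

df["app_exposed"] = (df["app_store_visits"] > 0).astype(int)

true_pos  = ((df["self_report_seen"] == 1) & (df["app_exposed"] == 1)).sum()
false_neg = ((df["self_report_seen"] == 0) & (df["app_exposed"] == 1)).sum()
true_neg  = ((df["self_report_seen"] == 0) & (df["app_exposed"] == 0)).sum()
false_pos = ((df["self_report_seen"] == 1) & (df["app_exposed"] == 0)).sum()

sensitivity = true_pos / (true_pos + false_neg)   # self-report catches app-detected exposure
specificity = true_neg / (true_neg + false_pos)   # self-report agrees when no visits logged
print(f"Self-report sensitivity: {sensitivity:.2f}, specificity: {specificity:.2f}")
```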


B.2.3 Outcome Evaluation Data Collection Waves 2, 3, and 4


After the Wave 1 evaluation questionnaire (Attachment 2a), three additional outcome evaluations will be conducted (Attachment 2b). This design allows the same adults to be followed over time and provides the data needed to address the study’s goals. The four outcome evaluation questionnaires will be administered to the same adults over 24 months (approximately 7 months apart). The longitudinal study design will provide an accurate and thorough understanding of campaign exposure, tobacco use prevalence, and cessation among the campaign’s target audience of adults aged 25 to 54. We may periodically send the participants who complete the Wave 1 questionnaire a panel maintenance letter to ensure that we still have their correct contact information (Attachment 9c).


For the Waves 2, 3, and 4 evaluation questionnaires, participants will be offered the opportunity to complete the questionnaire in person or online (Attachments 9a and 9d). Participants who do not complete the questionnaire online will be visited by interviewers to complete the questionnaire in person. Participants will receive an advance letter before the start of each of the Waves 2, 3, and 4 data collections (see Attachment 9a). These advance letters will remind participants about the study’s purpose and background, explain the study procedures, and tell the respondent how to complete the questionnaire online. Each participant will be given the URL (web address) for completing the questionnaire online and will be provided with his or her own username and password. Participants will be informed that those who complete the questionnaire online before in-person data collection begins will receive a $5 “early bird” incentive in addition to the $25 incentive offered to all participants for completing each questionnaire. For questionnaires completed online, the incentive will be delivered by check. The participant will be informed that an interviewer will contact him or her to schedule an in-person interview if he or she does not complete the questionnaire online. An appointment card will be sent to households that schedule in-person administration of the questionnaire by an interviewer (Attachment 9b). The purpose of this card is simply to remind the participant of when the field interviewer will visit his or her household to complete the questionnaire. At the completion of in-person questionnaires, the incentive will be delivered in cash.


The Waves 2, 3, and 4 questionnaires will include a set of items very similar to those in the Wave 1 questionnaire, with additional skip patterns for participants who quit smoking between Waves 1 and 2. In addition, demographic characteristics assessed at Wave 1 that do not vary over time will not be assessed again in the Waves 2, 3, and 4 questionnaires. The questionnaires will also include a few questions for participants in the app-based portion of the study to ensure that participants have not changed phones or turned off the app. Participants who have elected not to participate in the app-based portion of the study will again be offered the opportunity to participate in this portion of the study. Minor revisions to the questionnaires may be necessary given the media development process and the possibility of changes in campaign implementation, but every effort will be made to minimize changes to the evaluation questionnaire. The evaluation questionnaire includes measures of demographics; tobacco use behavior; motivation and intentions to quit smoking; cessation behaviors; tobacco-related attitudes, beliefs, and risk perceptions; social norms; campaign awareness; media use and awareness; and environmental questions. Participants who complete the Waves 2, 3, and 4 questionnaires in person will receive their incentive in cash. Those who complete them online will receive a check accompanied by a form letter (Attachment 5d) in the mail after completing the questionnaire.


B.2.4 Telephone Verification Questionnaires


To prevent interviewer fraud, a subsample of participants (approximately 10%) will be asked to complete a telephone verification questionnaire (Attachment 13) during each of the four waves of data collection. A trained interviewer will administer the telephone verification questionnaire by phone to ensure that participants received their incentive and that interviewers followed all other appropriate procedures (for example, displaying proper identification). When interviewer falsification is suspected and it is not possible to reach the respondent by phone, these verification interviews may occur in person. No incentive will be offered for completion of these verification questionnaires.


B.2.5 Audio Recording of Interviews


As an additional quality control step, we will seek the respondents’ permission to record small portions of the interview. With participants’ permission, we will audio record approximately 10% of in-person interviews. This will allow us to assess the authenticity of the interview and the interviewer’s compliance with project protocols.

B.3 Methods to Maximize Response Rates and Deal with Nonresponse

The ability to obtain the cooperation of potential respondents for the Wave 1 questionnaire and maintain their participation across all data collection waves is important to the success of this study. In preparation for launching the Wave 1 data collection, we will review procedures for enlisting respondent cooperation across a wide range of surveys, incorporate best practices from those surveys into the data collection procedures, and adapt the procedures through continuous improvement across the survey waves.


In addition to the $25 incentive for each evaluation questionnaire, the $5 additional incentive offered to participants who complete the Waves 2, 3, and 4 questionnaires online before in-person data collection begins, and the $5 incentive for each app-based questionnaire, the study will use procedures designed to maximize respondent participation. Data collection will begin with the assignment of households to specific interviewers. When assigning cases, supervisors will take into account which interviewers are in closest proximity to the work, interviewer skill sets, and basic information such as the demographics and size of each sampled area. Supervisors will assign cases to interviewers using methods that maximize production while controlling costs.


When interviewers transmit their data from completed household screenings and evaluation questionnaires, the data will be summarized in daily reports posted to a Web-based case management system accessed by field supervisors and RTI’s data collection managers. On a daily basis, supervisors will use these reports to review response rates, production levels, and record of call information. This information will allow supervisors to determine each interviewer’s progress toward weekly production goals, when interviewers should attempt further contacts with households, and how to handle challenging situations such as households that initially refuse to participate or households where the interviewer has been unable to contact anyone. Supervisors will discuss information and challenges with their interviewers each week. When feasible, cases will be transferred to other interviewers with different skill sets to assist with converting initial refusals into participating households. Cases might also be transferred among interviewers to improve production in areas where the original interviewer is not meeting response rate goals.


As noted in Section B.2, interviewers will use a Sorry I Missed You Card (Attachment 6a) and Notifications (Attachment 6b) when needed to contact respondents and encourage participation. To assist efforts to convert households that initially refuse to participate, refusal letters (Attachment 6b) tailored to specific refusal reasons will be used. Similarly, an unable-to-contact letter (Attachment 6b) will be sent to a household if the interviewer has been unable to contact an adult resident after multiple attempts. When interviewers have been unable to gain access to one or more households because of an access barrier, such as a locked gate or a doorperson, controlled access letters (Attachment 6b) will be sent to the appropriate person or organization to obtain assistance in gaining access to these households.

B.4 Test of Procedures or Methods to be Undertaken

Whenever possible, the evaluation questionnaires contain items that have been used and validated in other large surveys. In addition, RTI has consulted numerous experts during questionnaire development. RTI will also conduct rigorous internal testing of the evaluation questionnaire prior to fielding. Testers will verify that instrument skip patterns function properly, that images of media campaign materials display correctly, and that all questionnaire items are worded correctly and in accordance with the instrument approved by OMB.

B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The following individuals inside the agency have been consulted on the design and statistical aspects of this information collection as well as plans for data analysis:


Natalie Gibson

Office of Health Communication and Education

Center for Tobacco Products

U.S. Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 240-402-4095

E-mail: [email protected]


Janine Delahanty

Office of Health Communication and Education

Center for Tobacco Products

U.S. Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 240-402-9705

E-mail: [email protected]


Chaunetta Jones

Office of Health Communication and Education

Center for Tobacco Products

U.S. Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 240-402-0427

E-mail: [email protected]


April Brubach

Office of Health Communication and Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 301-796-9214

E-mail: [email protected]


Gem Benoza

Office of Health Communication and Education

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 240-402-0088

E-mail: [email protected]


David Portnoy

Office of Science

Center for Tobacco Products

Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 301-796-9298

E-mail: [email protected]


Katherine Margolis

Office of Science

Center for Tobacco Products

U.S. Food and Drug Administration

10903 New Hampshire Avenue

Silver Spring, MD 20993

Phone: 240-402-6766

E-mail: [email protected]


The following individuals outside of the agency have been consulted on the questionnaire development, statistical aspects of the design, and plans for data analysis:


James Nonnemaker

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-7064

E-mail: [email protected]



Matthew Farrelly

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-6852

E-mail: [email protected]



Joseph McMichael

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-485-5519

E-mail: [email protected]



Susan Pedrazzani

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-6744

E-mail: [email protected]



Ashley Amaya

RTI International

701 13th Street NW #750

Washington, DC 20005

Phone: 202-728-2486

E-mail: [email protected]



Lauren Dutra

RTI International

2150 Shattuck Avenue, Suite 800

Berkeley, CA 94704

Phone: 510-665-8297

E-mail: [email protected]


Pamela Rao

Akira Technologies, Inc.

1747 Pennsylvania Ave NW Suite 600

Washington, DC 20002

Phone: 202-517-7187

E-mail: [email protected]


Xiaoquan Zhao

Department of Communication

George Mason University

Robinson Hall A, Room 307B

4400 University Drive, 3D6

Fairfax, VA 22030

Phone: 703-993-4008

E-mail: [email protected]


Mark Hall

FCB New York

100 West 33rd Street

New York, NY 10001

Phone: 212-885-3372

E-mail: [email protected]


Suzanne Santiago

FCB New York

100 West 33rd Street

New York, NY 10001

Phone: 212-885-3651

E-mail: [email protected]



The following individuals will conduct data collection and analysis:


Matthew Farrelly

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-6852

E-mail: [email protected]


Lauren Dutra

RTI International

2150 Shattuck Avenue, Suite 800

Berkeley, CA 94704

Phone: 510-665-8297

E-mail: [email protected]


James Nonnemaker

RTI International

3040 Cornwallis Road

Research Triangle Park, NC 27709

Phone: 919-541-7064

E-mail: [email protected]

References


Abreu, D. A., & Winters, F. (1999). Using monetary incentives to reduce attrition in the survey of income and program participation. Proceedings of the Survey Research Methods Section of the American Statistical Association.

Castiglioni, L., Pforr, K., & Krieger, U. (2008). The effect of incentives on response rates and panel attrition: Results of a controlled experiment. Survey Research Methods, 2(3), 151–158.

Farrelly, M. C., Davis, K. C., Haviland, M. L., Messeri, P., & Healton, C. G. (2005). Evidence of a dose-response relationship between “truth” antismoking ads and youth smoking prevalence. American Journal of Public Health, 95(3), 425–431. doi: 10.2105/AJPH.2004.049692

Farrelly, M.C., Nonnemaker, J., Davis, K.C., Hussin, A. (2009). The influence of the national truth campaign on smoking initiation. American Journal of Preventive Medicine, 36(5): 379-384.

Farrelly, M.C., Duke, J.C., Davis, K.C., Nonnemaker, J.M., Kamyab, K., Willett, J.G., Juster, H.R. (2012). Promotion of smoking cessation with emotional and/or graphic antismoking advertising. American Journal of Preventive Medicine, 43(5), 475-482.

Jäckle, A., & Lynn, P. (2008). Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias. Survey Methodology, 34(1), 105–117.

Janega, J. B., Murray, D. M., Varnell, S. P., Blitstein, J. L., Birnbaum, A. S., & Lytle, L. A. (2004). Assessing the most powerful analysis method for schools intervention studies with alcohol, tobacco, and other drug outcomes. Addictive Behaviors, 29(3), 595–606.

McAfee, T., Davis, K.C., Alexander, R.L., Pechacek, T.F., & Bunnell, R. (2013). Effect of the first federally funded US antismoking national media campaign. Lancet, 382(9909), 2003–2011.

Murray, D. M., & Blitstein, J. L. (2003). Methods to reduce the impact of intraclass correlation in group-randomized trials. Evaluation Review, 27(1), 79–103.

Murray, D. M., & Short, B. J. (1997). Intraclass correlation among measures related to tobacco-smoking by adolescents: Estimates, correlates, and applications in intervention studies. Addictive Behaviors, 22(1), 1–12.

Shettle, C., & Mooney, G. (1999). Monetary incentives in U.S. government surveys. Journal of Official Statistics, 15, 231–250.

Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey Nonresponse (pp. 163–177). New York, NY: Wiley.

Smith, A. (2017). Record shares of Americans now own smartphones, have home broadband. Pew Research Center. http://www.pewresearch.org/fact-tank/2017/01/12/evolution-of-technology/. January 12, 2017. Accessed August 10, 2017.

U.S. Department of Health and Human Services (USDHHS). (2014). The health consequences of smoking – 50 years of progress: A report of the Surgeon General. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Office on Smoking and Health.





