
National Panel of Tobacco Consumer Studies

OMB: 0910-0815


TOBACCO USER PANEL

SUPPORTING STATEMENT PART A

0910-0815










TABLE OF CONTENTS

Section

A. Justification

A.1 Circumstances Making the Collection of Information Necessary

A.2 Purposes and Use of the Information Collection

A.2.1 Overview of the Design

A.2.2 Purpose of the Panel

A.2.3 Information Elements and Data Sources

A.3 Use of Improved Information Technology and Burden Reduction

A.4 Efforts to Identify Duplication and Use of Similar Information

A.5 Impact on Small Businesses or Other Small Entities

A.6 Consequences of Collecting the Information Less Frequently

A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A.8.1 Federal Register Announcements and Comments

A.8.2 Consultation Within the Research Community

A.9 Explanation of Any Payment or Gift to Respondents

A.10 Assurance of Privacy Provided to Respondents

A.10.1 Procedures for Protecting Data Collected from Participants

A.10.2 Additional Privacy Concerns Associated with On-line Data Collection

A.10.3 Privacy Procedures for Mail Survey Participants

A.10.4 Privacy Concerns for Participants Using Loaned Tablet Computers

A.11 Justification for Sensitive Questions

A.12 Estimates of Annualized Burden Hours and Costs

A.12.1 Annualized Hour Burden Estimate

A.12.2 Annualized Cost Burden Estimate

A.13 Estimates of Other Total Annual Cost Burden to Respondents or Recordkeepers

A.14 Annualized Cost to the Federal Government

A.15 Explanation for Program Changes or Adjustments

A.16 Plans for Tabulation and Publication and Project Time Schedule

A.17 Reason(s) Display of OMB Expiration Date Is Inappropriate

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

References




Exhibits

Number

Exhibit A.9-1. Incentive Type and Amount

Exhibit A.12-1. Estimated Annualized Response Burden for Panel Members

Exhibit A.12-2. Estimated Annual Reporting Burden

Exhibit A.16-1. Panel Project Schedule




LIST OF ATTACHMENTS

1. Questionnaires: English-Language Versions

2. Questionnaires: Spanish-Language Versions

3. Respondent Materials: English-Language Versions

4. Respondent Materials: Spanish-Language Versions

5. Summary of the Sample Design for the Panel

6. Incentives


A. Justification

On June 22, 2009, the President signed the Tobacco Control Act (Pub. L. 111-31) into law. The Tobacco Control Act granted the U.S. Food and Drug Administration (FDA) authority to regulate the manufacture, marketing, and distribution of tobacco products to protect the public health generally and to reduce tobacco use by minors. Section 201 of the Tobacco Control Act, which amends Section 4 of the Federal Cigarette Labeling and Advertising Act (15 U.S.C. 1333), requires FDA to issue "regulations that require color graphics depicting the negative health consequences of smoking to accompany the label statements specified in subsection (a)(1)." FDA also can assert authority over other tobacco products and require similar label statements. FDA's Center for Tobacco Products (CTP) obtained clearance to establish the panel to conduct experimental and observational studies with a national sample of tobacco users drawn from across the sociodemographic spectrum, in order to assess consumers' responses to tobacco marketing, warning statements, product labels, and other communications about tobacco products. The data collected will be used to inform FDA's regulatory authority over tobacco products.

A.1 Circumstances Making the Collection of Information Necessary

FDA's Center for Tobacco Products (CTP) established a high-quality, national panel of about 4,000 tobacco users. The panel includes individuals who agree to participate in up to 8 studies over a 3-year period to assess consumers' responses to tobacco marketing, warning statements, product labels, and other communications about tobacco products. The data collected will be used to inform FDA's regulatory authority over tobacco products. For purposes of panel member recruitment and retention, the collection of experimental and observational studies that will be conducted with the panel is referred to as the National Panel of Tobacco Consumer Studies (TCS).

CTP established the panel of consumers because currently existing panels have a number of significant limitations. First, many existing consumer panels are drawn from convenience samples that limit the generalizability of study findings (Baker et al., 2010). Second, although at least two probability-based panels of consumers exist in the United States, they are not designed to represent the sociodemographic spectrum of tobacco users. Furthermore, there is a concern that responses from tobacco users in these panels may be biased (i.e., panel bias) due to panel conditioning effects (e.g., Coen, Lorch, and Piekarski, 2005; Nancarrow and Cartwright, 2007). That is, the subsample of tobacco users in these panels may be called upon to complete surveys so frequently that their responses may no longer be similar to those of tobacco users who have not participated in so many surveys on this topic. A 2012 study of the Knowledge Networks panel, for example, indicated that the median number of prior studies completed by smokers was 107, and that the number of prior smoking-related studies they had completed ranged from 0 to 19, with an average of 2.5 (Cobb, Lawrence, & Gross, 2012). Panel conditioning has been associated with repeated measurement on the same topic (e.g., Kruse et al., 2009), panel tenure (e.g., Coen, Lorch, and Piekarski, 2005), and frequency of the survey request (e.g., Nancarrow and Cartwright, 2007). This issue is of particular concern for tobacco users, who represent a minority of the members in these panels and so may be more likely to be selected for participation in experiments and/or surveys related to tobacco products. Given the limitations of the existing Web-based panels, it is important to develop a new panel of tobacco users that balances the need to conduct experiments against the need to limit the number of tobacco-related studies per year so as not to bias study results.

FDA proposed a multi-modal study, with a preference for Web-based administration where feasible, because Web surveys can include multimedia, such as images of tobacco product packages, tobacco advertising, new and existing warning statements and labels, and potential reduced harm claims in the form of labels and print advertisements. Establishing a primarily Web-based panel of tobacco users through in-person probability-based recruitment of eligible adults and limiting the number of times individuals participate in tobacco-related studies will reduce the likelihood of bias in this data collection.

Data collection activities involve initial implementation and testing of procedures for panel recruitment and management, mail and in-person household screening, in-person recruitment of tobacco users, enrollment of selected household members, and administration of a baseline survey, following all required informed consent procedures for panel members. Panel members are asked to participate in up to 8 experimental and observational studies over the 3-year panel commitment period. The first of these studies (Study 1) is included in this information collection request; approval for the remainder of the studies will be sought in future requests. This clearance is primarily for the purposes of the design and implementation of the panel. With the exception of this first study, individual studies will likely be processed through separate clearance requests.

A.2 Purposes and Use of the Information Collection

A.2.1 Overview of the Design

The panel is designed to establish a primarily Web-based panel of 4,000 adult tobacco users, aged 18 and older, in housing units and in non-institutionalized group quarters in the 50 states and the District of Columbia. The sample is designed to allow in-depth analysis of subgroups of interest and, to the extent possible, to provide insight into tobacco users more generally.

The 2014 National Health Interview Survey (NHIS), a nationally representative sample, found that 17% of U.S. adults were current cigarette smokers, and 22% were current cigarette smokers and/or current users of other tobacco products (e.g., cigars, smokeless, and/or other tobacco products) (NHIS, 2014). In 2014, the NHIS found that tobacco users were mostly male (62%). Almost half of tobacco users were 26 to 49 years of age (48%), and 36% were 50 years of age or older. The majority of tobacco users were White (73%). In terms of educational attainment, 27% had a high school education, and 23% had some college education. About 40% of tobacco users reported an annual household income below $35,000 (NHIS, 2014).

For this panel, the young adult population (aged 18-25) is oversampled, while tobacco users ages 26 and older are undersampled. This will allow us to achieve the target sample sizes in four domains formed by age group (18-25, 26+) and socioeconomic status (SES) (low SES, non-low SES) and to conduct more in-depth study of these groups of tobacco users. The primary reason to oversample young adults is that they are at a point in their lives when their tobacco use habits are not fully established, and they may respond differently to tobacco regulation than older, more established smokers. Without oversampling, the number of young adult tobacco users in the panel would be relatively small. According to the 2014 NHIS, only 16% of smokers and tobacco users are between 18-25 years of age (NHIS, 2014 public use data). This is generally consistent with what other Federal surveys suggest (e.g., 17.1% of 18- to 24-year-olds are current smokers according to the CPS-Tobacco Use Supplement [2010-2011] (TUS-CPS, 2011); 19.2% of 18- to 25-year-olds are current cigarette smokers according to the ASPE Health System Measurement Project [2013] (HSMP, 2013); 16.8% of 18- to 24-year-olds are current daily smokers according to the National Adult Tobacco Survey [2009-2013] (NATS, 2014)). In addition to this oversample, smokeless tobacco users identified during screening are assigned higher probabilities of selection than other tobacco users. Supporting Statement Part B (Section B.1) details the panel sample design. Exhibit B.1-1 provides the sample sizes in each of the sample domains of interest.

Given FDA's preference to establish a primarily Web-based panel in order to include multimedia images in the experimental and observational studies, an important consideration is the level of Internet access that can be assumed among the recruited panel members. The Pew Research Center reports that as of May 2013, 76% of U.S. adults used the Internet at home and an additional 9% of adults used the Internet but lacked home access (Zickuhr, 2013). Despite this high coverage rate, a significant proportion of the population will not have access to the Internet for one reason or another. Lack of Internet use at home has been associated with certain demographic characteristics. The Current Population Survey found that in 2011, 42% of Hispanics, 44% of Blacks, 28% of those 55 and older, and 63% of those with less than a high school education lived in households that did not use or have access to the Internet at home (U.S. Census Bureau, 2013).

There is little available data to estimate the prevalence of Internet access among U.S. adult tobacco users. Arriving at an accurate estimate of the proportion of the total population of tobacco users who are unable to participate in an online panel is therefore challenging. The 2014 Health Information National Trends Survey (HINTS), however, included questions about current cigarette use and access to the Internet. These data suggest that 79% of current smokers who responded (defined as those who smoke every day or some days) have some access to the Internet (HINTS 4, Cycle 4 public use data). The 21% of current smokers who do not have access tend to be older (39% are between 50-64 years of age, 25% are 65 and older), less educated (41% have less than a high school education and 41% have only a high school education), and have lower annual household income (68% are below $35,000).

We recognize that, in addition to lacking access to the Internet, a proportion of sampled adult tobacco users may be averse to participating in online studies. This argues for a multi-modal study, offering modes of participation in the panel studies other than the Internet. One of our main objectives is to build a sample of panel responders that is, to the best of our ability, free of bias and reflective of the socioeconomic, demographic, and geographic characteristics of U.S. tobacco users. As such, we want to avoid losing the segment of the population that does not use the Internet. Excluding them from the panel would significantly bias results; as shown above, this group of respondents differs from the main population along several important dimensions, including age and socioeconomic status.

To minimize the potential for coverage and nonresponse bias, we proposed two strategies to facilitate enrollment and participation in the panel. First, we offer a mail survey option for those eligible adults who would like to participate in the panel but are unwilling or unable to do so online. Based on the statistics above, we expect such participants to be older, less tech-savvy adults who do not use computers, do not access the Internet, or do so infrequently and with discomfort. As shown in Exhibit B.1-1, we plan to enroll 2,960 panel members age 26 and older, including 1,184 in the low-SES domain and 1,776 in the non-low SES domain. Using demographic data on current tobacco users from the 2014 NHIS (NHIS, 2014 public use data), we estimate that between 35.6% and 37.33% of our panel members will be 50 years of age or older, including 442 in the low-SES domain and 662 in the non-low SES domain. Assuming we experience Internet access rates similar to those from the 2014 HINTS, we estimate that 51% (225) of low-SES panel members age 50 and older would be unlikely to ever go online to participate in the panel. Another 21% (139) of panel members age 50 and older in the non-low SES domain would also be unlikely to participate online. Considering the participation needs of these 364 older adults, we plan to enroll 400 mail mode participants; however, we acknowledge that these estimates are based on approximations and that our final number of panel members who cannot participate online could be different. We will enroll a maximum of 800 mail mode participants if we find a higher percentage of panel members express a preference for this mode. Mail mode panelists will participate in the same interviewer-administered enrollment interview, but the baseline survey will also be interviewer-administered using the Web version rather than self-administered by the panelist. Subsequent experimental and observational studies will be administered via mail. However, these panelists will always be given the option to transition to the Web survey environment if they become more comfortable with the use of computers and/or the Internet and prefer to switch to the online mode.
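For illustration, the arithmetic behind the mail mode estimate can be reproduced from the figures cited above. The following is a minimal Python sketch; the percentages are rounded approximations from the 2014 NHIS and 2014 HINTS public use data, so the resulting counts are planning figures rather than exact values.

low_ses_26_plus = 1184          # planned low-SES enrollees age 26 and older
non_low_ses_26_plus = 1776      # planned non-low SES enrollees age 26 and older

# Roughly 37.3% of panel members age 26+ are expected to be 50 or older (2014 NHIS approximation).
low_ses_50_plus = round(low_ses_26_plus * 0.373)          # about 442
non_low_ses_50_plus = round(non_low_ses_26_plus * 0.373)  # about 662

# Share of each domain's 50+ members unlikely to go online (2014 HINTS approximation).
offline_low_ses = round(low_ses_50_plus * 0.51)           # about 225
offline_non_low_ses = round(non_low_ses_50_plus * 0.21)   # about 139

offline_total = offline_low_ses + offline_non_low_ses     # about 364, rounded up to 400 planned mail mode slots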

A second strategy to facilitate enrollment and participation is to provide a means to join the Internet panel for those who would like to participate online but do not have access to the Internet. To maximize the number of online participants in the experimental and observational studies, we plan to move as many panelists as possible to Web data collection by offering the loan of a Web-enabled tablet computer to a subset of the sampled adults who do not have the means to participate online but would otherwise be very capable Web survey respondents. We expect these to be younger, more "tech-knowledgeable" adults who do not have the financial means to participate online (e.g., no computer, smartphone, or other available device, or no Internet service). As the mail mode would provide coverage for 10% of our panel members, or about half of the 21% of tobacco users the 2014 HINTS data suggest would not go online, the offer of a tablet loan is intended to facilitate panel participation for the remaining 10% of panelists who are unlikely to have Internet access. We plan to enroll a maximum of 400 panelists to use study-provided tablets while in the panel; thus, the total number of panelists participating over the Web is expected to be 3,600 (out of the 4,000 enrolled). As noted above, the final number of mail mode participants may be higher (maximum of 800) depending on mode preferences expressed at enrollment.

It is important to note that the sample selection is independent of the mode of data collection. That is, we first draw a random sample from all addresses on our address-based sample frame. In the process of recruitment, we identify those who are either unable or unwilling to participate in an online panel and provide them with the option of an alternative mode to avoid biasing the panel. Section B.3.2 provides additional details about the procedures for nonresponse bias assessment and the proposed strategy to weight results to address differences in mode of survey administration and the oversampling of young adults, and to adjust for deviations from the original design due to factors such as variable nonresponse.

As noted above, panel members are asked to participate in up to 8 experimental and observational studies during their 3-year commitment to the panel. Additionally, in non-study months, they may receive other forms of contact to maintain their interest and engagement in the panel.

Attrition from the sample is expected, and the sample design provides for a quarterly in-person panel replenishment effort, using the same sampling and data collection design described above, to replace panel members who choose to end their involvement in the panel.

A.2.2 Purpose of the Panel

The overall purpose of the data collection is to collect information from a national sample of tobacco users to provide data that may be used to develop and support FDA's policies related to tobacco products, including their labels, labeling, and advertising. Data will be collected from the panel primarily through the use of randomized experimental designs. In the future, FDA may submit ICRs under a separate clearance mechanism that use other methods, such as surveys, interviews, or online group discussions. As discussed in Section A.1, existing panels of tobacco users are not appropriate for this purpose for one or more reasons. The project establishes a panel of tobacco users who are asked to participate in up to 8 experimental and observational studies over a 3-year period. Oversampling of young adults (18-25) is another key feature of this study. Smoking initiation has increased among young adults (Lantz, 2003), and this age group has the highest smoking prevalence rate in the United States (Schiller, Lucas, & Peregoy, 2012). As a result, information about how proposed regulations might impact them is essential to continued decreases in tobacco use among Americans. Another key feature of this study is the oversampling of adults who use smokeless tobacco products and the inclusion of panelists who use cigars.

A nationally representative sample is not necessary for conducting the experiments; rather, the need is for a sample that is sufficiently varied with respect to the major sociodemographic characteristics of tobacco users. Although we use probability methods to recruit the panel of tobacco users, the final panel may not be able to produce results that are representative of the population of tobacco users in the U.S. Of particular concern are the complex relationships among tobacco use, age, income, race/ethnicity, education, and geography (both location and urban/rural). As such, whenever the results are presented, CTP will clearly describe the sociodemographic and geographic characteristics of the sample that responded to a given survey, explicitly characterizing potential limitations in generalizability. It is likely that, for at least some studies, we may be limited to describing the results as sufficiently varied to reflect the general characteristics of smokers in the U.S. Such a description should be sufficient for documenting the trends and patterns of interest to CTP in the context of this information collection. Consistent with obligations under HHS' and OMB's Information Quality Guidelines, CTP will assess the quality of the information generated for each regulatory or policy purpose under consideration.

A.2.3 Information Elements and Data Sources

The data elements in the initial set of surveys used for establishing the panel were driven primarily by the need for quality baseline data to benchmark future experimental and observational studies and to accurately characterize the tobacco use of panel members. The surveys also took into account the methodological and administrative factors relevant to collecting data in a cost-effective manner that does not burden respondents unduly and that adequately addresses the requirements of a diverse, multicultural population of interest. Relevant factors taken into account included the overall length and complexity of each survey and the presentation of individual questions, response sets, and respondent instructions in both Web and paper self-administration environments (e.g., minimizing use of grids or other complex question formats). All surveys were also translated into Spanish.

Instrument Development Process

Four questionnaires—a mail screening questionnaire, field screening questionnaire, enrollment questionnaire, and baseline questionnaire—were developed to support screening and recruitment of the panel and collection and maintenance of participant contact information, demographic data, and other background information pertinent to panel management and analysis. The questionnaires were drafted using existing survey items from the National Health Interview Survey (NHIS) and the Tobacco Use Supplement in the Current Population Survey (TUS-CPS) as the source for items on tobacco use. Use of previously tested and fielded survey items mitigates the need for extensive pretesting of the questionnaires. In addition, an interviewer observation questionnaire was developed.

The first experimental and observational study (Study 1) is designed to be self-administered shortly after panel members are enrolled. As with the screener and baseline surveys, the Study 1 questionnaire was developed primarily using items from existing surveys. The panel instruments, including the Study 1 questionnaire, are described in detail in the sections that follow. Further details are provided in Section B.2.3.

Baseline Questionnaire

The baseline survey collects a detailed history of the panel member’s use of cigarettes, cigars or little cigars, and smokeless tobacco products. Panel members are asked how frequently they use each tobacco product and whether they intend to quit within the next 30 days. They are also asked questions to assess their level of addiction to nicotine, their general health status, and use of other tobacco products, including electronic cigarettes, pipes, and water pipes.

At the conclusion of the baseline survey, and upon leaving the panel member's home, the interviewer completes a brief interviewer observation questionnaire on his/her tablet computer to document perceptions of the panel member's recruitment experience, comfort level with the Web baseline survey and with computers in general, and likelihood of remaining in the panel. This information, coupled with the baseline survey items on comfort with the computer, will be used to identify panel members at greater risk of attrition or who may need increased levels of technical support while in the panel. These panelists will receive more targeted or more frequent support while in the panel, in particular during the initial weeks and months following enrollment. The interviewer observation questionnaire is also used to capture information collected by interviewers during their post-enrollment follow-up call to those panelists using a loaned tablet.

Experimental and Observational Studies

Periodic self-administered Web (or mail) surveys will be the mechanism for collecting experimental and observational data desired by FDA. As noted above, up to 8 studies will be conducted with panel members during the initial 3-year panel period. Study 1 is included in this clearance request in order to engage panel members in their first substantive study within the first few months of their panel enrollment. The remaining studies will be handled in separate clearance requests. To minimize burden, each of these studies will require no more than 20 minutes for panel members to complete.

Study 1 focuses on purchasing behavior, tobacco brands, and use of coupons and price promotions for tobacco products. The goal of this study is to collect information about participants’ tobacco product brand loyalty and more accurate measures of their tobacco product consumption. One reason that tobacco companies use coupons and promotions is to promote brand switching. Study 1 includes items about receipt and use of coupons and promotions to assess susceptibility to brand switching among participants. Tobacco product consumption is a self-reported measure, and thus it may be under-reported. To better assess the accuracy of self-reported consumption behaviors, the study also includes detailed questions about product purchases. The research literature and focus groups conducted under the contractor’s Consumer Behaviors BPA show that brand loyalty moderates consumer attitudes about and intentions to try or purchase other tobacco product brands (OMB Control No. 0910-0497). As a result, it is important to quantify the brand loyalty of panel participants in order to examine how it affects panelist choices in planned experimental studies. The Study 1 questionnaire includes items from the CPS-TUS, the National Adult Tobacco Survey (NATS), and the Online Smokers Survey that the contractor has administered for the State of Florida since 2009, as well as several new measures that examine product purchasing.

A.3 Use of Improved Information Technology and Burden Reduction

The panel instruments, including the field screener, enrollment survey, baseline survey, and experimental and observational study instruments, are programmed for computer-assisted data collection. Computer-assisted interviewing (CAI) technology affords well-known improvements and efficiencies in the collection of survey data. The technology permits more complex routings than a paper-and-pencil mode of data collection. It allows for on-screen cueing of respondents, delivery of media images, and consistency checks during self-administration or by the field interviewer, and it produces high-quality back-end data that reduce the costs associated with data cleaning and data analysis.

The contractor uses its mobile field system (via interviewer tablet computers) to conduct all counting and listing and field screening operations. This includes identification of dwelling units that were not part of the sampling frame using Check for Housing Units Missed (CHUM) protocols. The mobile field system is also used for the administration of the enrollment survey and interviewer observation survey that are deployed on the interviewer's tablet computer. This system enables the ready creation of instruments for deployment and the easy output of codebooks and data at the back end. Use of mobile technology enhances the quality of data, for example, by allowing behind-the-scenes GPS capture to verify sampled addresses, while improving the efficiency of the doorstep screening operation. Survey data on the tablet are encrypted, and both the tablet and the mobile field system are password protected. An integrated field management system supports field staff data transmissions, time reporting, and assignment of cases.

The contractor’s Hatteras Web authoring system is used for the panel member’s baseline and Study 1 instruments. Like the mobile field system, the Hatteras survey engine supports all aspects of survey deployment, data output, and codebook generation. A Hatteras Web page can display a wide array of fonts, colors, and images, including videos with superior resolution. Use of both systems reduces user burden and creates efficiencies, both for project staff and panel members. Hatteras is also used to support the collection of panel member survey data in alternative modes, including entry of completed mail questionnaires. Additionally, any in-person or telephone data collection undertaken as part of nonresponse follow-up efforts for Study 1 or subsequent experimental and observational studies can also be supported by the mobile field system and/or Hatteras.

Access to the panel member Web surveys is controlled through a project Web portal hosted by the contractor. A two-tiered security approach is used for accessing the surveys and transmitting the data. An ID and password are required for a panel member to enter a Web survey; Secure Sockets Layer (SSL) certification ensures that only encrypted data flow over the Internet.

A control system is the central component of all the activities that take place with the panel. Data maintained in the control system database provide a record of the panel operations, including sampling, screening and recruiting, data collection, panel member communications (mailings, e-mailings, text messaging, automated telephone prompting), panel member tracing, fulfillment operations (incentive and questionnaire mailings, mail survey receipt and data entry), helpdesk operations, and data processing. This centralized repository of information creates efficiencies in generating sample disposition reports, monitoring data quality, and managing the flow of information among the contractor, the self-administered interviews, and field operations.

A.4 Efforts to Identify Duplication and Use of Similar Information

Three commercial Web-based panels include smokers, but none of these panels meets the rigorous requirements needed to inform FDA's regulatory authority over tobacco products. For example, Harris Interactive includes smokers, but there is limited participation by disadvantaged populations that may be of interest to FDA. Many tobacco control investigators use the GfK Knowledge Networks panel for survey research, as it is built from an address-based sample and includes many difficult-to-reach populations such as young adults, cell-phone-only households, and ethnic/racial minorities. However, there are significant concerns that the smokers in this panel may be biased by conditioning effects because they participate in a relatively high number of tobacco-related studies. These effects may be particularly pronounced among the small number of disadvantaged populations because the gap in smoking-related information about them places them in high demand for surveys. Of particular note, however, is that in our own experience we have found that these commercial panels cannot easily recruit the number of cigar smokers or smokeless tobacco users that the planned studies may require.

Existing longitudinal surveillance studies of tobacco users, such as FDA's Population Assessment of Tobacco and Health (PATH), are not appropriate for the planned experimental or observational studies. PATH is intended to characterize the natural history of tobacco use uptake, cessation, and relapse and the associated health consequences without conducting experiments that may influence participants' behavior. Subjecting PATH participants to experiments could influence their behavior, and the PATH study would then no longer be able to claim that it provides a representative picture of tobacco use and health for the U.S.

The survey items in the panel instruments are standard measures used to characterize participant demographics, smoking status, and level of addiction to tobacco products. These items are included in many national surveillance systems to monitor trends in tobacco use. However, their inclusion in the panel questionnaires is nonduplicative of these surveys; rather, they are included to identify tobacco users to be recruited to the panel and to measure potential covariates that may be needed to account for nonresponse in future studies.

A.5 Impact on Small Businesses or Other Small Entities

There is no impact on small businesses or other small entities. No small businesses are involved in this study.

A.6 Consequences of Collecting the Information Less Frequently

By design, the panel is being established to support up to 8 experimental and observational studies of adult tobacco users over a 3-year period to assess consumers' responses to tobacco marketing, warning statements, product labels, and other communications about tobacco products. Given the length of commitment, it is critical to the overall success of the panel, especially in minimizing attrition, to maintain frequent contact with panelists to ensure their continued interest and participation, to address technical or other issues they may have, and to maintain accurate locator information that will facilitate longitudinal contact and tracking of movers. Thus, additional contacts, involving other forms of communication with panel members, are planned for this purpose.

A.7 Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

None.

A.8 Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

A.8.1 Federal Register Announcements and Comments

In accordance with 5 CFR 1320.8(d), FDA published a 60-day notice for public comment in the FEDERAL REGISTER of October 16, 2014 (79 FR 62160). FDA received three comments; however, only two were related to the PRA. Within those submissions, FDA received multiple comments, which the agency has addressed below.

(Comment) One comment asked FDA for the opportunity to review the data collection plans and instruments including the sample design, data collection methodology and panel performance evaluation plan.

(Response) All the instruments and background documents, including our plan for evaluating panel performance, have been uploaded to the docket for easy access. The documents include the data collection plans and methodology (Supporting Statement Part A), copies of the survey instruments used to screen and recruit panel members, as well as the first experimental and observational study (Study 1), and the proposed sample design (Supporting Statement Part B).

(Comment) One comment asked FDA to provide additional details about the proposed sample design and FDA’s approach to issues such as nonresponse of subjects and conditioning effects.

(Response) The proposed sample design is described in detail in Supporting Statement Part B. Briefly, we propose a multi-stage area sample based on an address-based sampling (ABS) frame. The selection probabilities (both single and joint) will be measurable at each stage, as will the overall selection probability.

The issues of nonresponse and conditioning effects are real challenges, but they should be considered separately from the sample design. These are issues faced in the field once the sample has been selected and contacted. We have proposed several strategies for reducing nonresponse in the recruitment of panel members, the primary one being in-person recruitment, which we believe will lead to significantly higher recruitment rates than we would achieve if we contacted sample members via mail, telephone, or Web. We will describe our plans to reduce nonresponse bias in future individual studies as part of the OMB submissions for those studies. We consider the issue of conditioning effects as part of our overall panel management plan, which is described in Supporting Statement Part A.

(Comment) One comment stated that FDA suggests that not every panelist will be eligible to participate in every study to minimize the potential for "conditioning" effects. However, this approach to participation is inconsistent with the requirement that every individual in the population has a non-zero probability of being in the sample. FDA will need to make trade-offs to balance these two interests. FDA could consider drawing data from similar respondents, as long as FDA knows that there are no important hidden differences between the respondents that may affect their responses.

(Response) We will draw the original sample with known, non-zero, and, to the extent possible, equal probabilities. The same will apply to any additional samples drawn for the panel to replace attrition. Furthermore, any subsample drawn from the panel for specific studies will also have known probabilities of selection. We will devise a strategy for spreading the survey-taking load over all panel members to avoid excessive burden on any single member or group of members. We will implement this strategy by randomly selecting each subsample while keeping track of each member's survey-taking activity. As the number and frequency of survey requests for a given member increases, that member's probability of selection will decrease, a strategy that we will implement using probability proportional to size (PPS) sampling. This strategy will lead to known and measurable selection probabilities for each specific subsample.
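For illustration only, the following minimal Python sketch shows the general idea of drawing a study subsample with size measures that shrink as a member's prior survey-taking grows. It is a simplified approximation of PPS selection without replacement, not the actual selection algorithm, and the member identifiers and counts used here are hypothetical.

import random

def draw_subsample(panel, n):
    # panel: list of (member_id, prior_study_count) pairs tracked in the control system.
    # The size measure shrinks as prior participation grows, so heavily surveyed
    # members are less likely to be drawn for the next study.
    remaining = [(member_id, 1.0 / (1 + prior)) for member_id, prior in panel]
    chosen = []
    for _ in range(min(n, len(remaining))):
        total = sum(size for _, size in remaining)
        r = random.uniform(0, total)
        upto = 0.0
        for idx, (member_id, size) in enumerate(remaining):
            upto += size
            if r <= upto:
                chosen.append(member_id)
                del remaining[idx]
                break
        else:
            # Guard against floating-point rounding at the upper boundary.
            chosen.append(remaining.pop()[0])
    return chosen

# Example: members "A" and "B" have never been surveyed; "C" has completed 4 studies,
# so "C" is the least likely of the three to be selected for the next study.
print(draw_subsample([("A", 0), ("B", 0), ("C", 4)], 2))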

(Comment) One comment stated FDA should consider whether in some instances collecting fresh data from new samples of tobacco product users over time may provide better results.

(Response) Our proposed approach includes replenishment of the sample over time to address attrition from the panel. As such, the panel will include tobacco users with varying tenure lengths on the panel. We will be in a position to restrict a specific study subsample to the more recent panel members, if desired, and more generally, the panel will allow FDA to specify the composition of the sample with respect to tenure.

(Comment) One comment said FDA should consider inclusion of non-tobacco users or users of specific tobacco categories (e.g., e-cigarette users, moist smokeless tobacco users) in the sample to support comparative analyses between users and non-users or subgroup analyses.

(Response) FDA considered including non-tobacco users early in the planning process. However, the planned experimental and observational studies will examine issues specific to the tobacco-using population, especially those with lower socio-economic status. This includes the underlying demographics of users as well as their knowledge, attitudes, practices, behaviors, and reactions to various tobacco-related stimuli. Other existing data sources, including survey panels, support research with non-users. Moreover, limiting the panel to users reduces the overall public burden. Once the panel is firmly established, we may consider its expansion.

(Comment) One comment stated FDA should also consider how well the sample of 4,000 adult tobacco users will support the planned investigations.

(Response) The sample size of 4,000 was chosen after a careful review of power and subclass analysis requirements on the one hand and the budgetary implications on the other. We concluded that a sample size of 4,000 tobacco users represents a good balance, at least for the first iteration of the panel.

We should also mention that the young adult population (aged 18-25) and the low-income population (combined household income less than $30,000) will be oversampled, allowing for more in-depth study of these two groups of tobacco users. We also include a screening feature that will result in oversampling of smokeless tobacco users.

(Comment) One commenter stated that FDA suggests that the approach includes a "3-year panel commitment period." FDA should consider developing and sharing its plan for keeping or removing panelists. For example, will FDA keep or remove a panelist if he/she decides to quit using tobacco products? Also, how will FDA monitor whether incentives are influencing a panelist's responses or behavior? These are only examples of issues that could arise; therefore, a thoughtful panel management plan is needed.

(Response) We agree that a detailed and well-designed panel management plan is needed to make the panel successful. The literature on panel maintenance is growing, but there is still much to be learned about optimal strategies for maintaining a healthy and productive panel. Supporting Statement Part A outlines our plans for panel management, including retention and nonresponse follow-up strategies, planned incentive experiments, monitoring of panel conditioning, and evaluation of the effects of various panel maintenance strategies on substantive responses. Continual monitoring is planned to study these and other important aspects of the panel's health. We will also keep a close eye on individual panelists, their participation patterns, and their nonresponse patterns to identify potential problems requiring intervention.

FDA considered removing panel members who report they have stopped using tobacco products. Because of recidivism rates, however, it was decided to retain all enrolled panel members regardless of changes in their tobacco use patterns. Subsampling of panelists may be implemented for specific experimental and observational studies that are intended solely for current users of one or more specific tobacco products.

(Comment) One commenter stated FDA should consider establishing mechanisms to evaluate on a continuous basis the performance of the panel as well as the data derived from it. For example, data from the panel on measures such as current or past 30-day cigarette smoking might be compared against the most recent data from national surveys and other published reports.

(Response) We agree that benchmarking the panel sample characteristics (demographic, socioeconomic, and tobacco use) against other national data sources is extremely important. We will continuously check that our panel matches known underlying population characteristics. However, we will also monitor how the panel compares with the target population with respect to known patterns of behavior surrounding tobacco use. Differences will not necessarily suggest problems with the panel, but they will stimulate further investigation and explanation.

(Comment) One commenter asked the FDA to provide copies of the survey instruments for public comment.

(Response) Copies of the survey instruments used to screen and recruit panel members, as well as the first experimental and observational study (Study 1), have been uploaded to the docket.

(Comment) One commenter strongly supported FDA's proposed collection of information, stating that this panel is of great utility and that the proposed probability-based panel will serve as a flexible tool, giving FDA the opportunity to conduct diverse studies.

(Response) FDA agrees with this comment and believes the panel will be a valuable tool for conducting new experimental studies.

A.8.2 Consultation Within the Research Community

To inform the design of the panel recruitment and retention strategies, the contractor engaged the services of a Web survey panel expert in the research community. The consultant participated in discussions with the contractor to review focus group findings (Focus groups conducted under OMB Control No. 0910-0497) and provide feedback on strategies for recruiting and engaging panel members long-term. This included guidance on 1) the feasibility of providing tablet computers to panelists as part of a study incentive protocol rather than a loan, and potential challenges with this approach; 2) panelist use of personal computing devices to complete Web surveys; 3) cash-based incentive options both at enrollment and throughout the panel period; 4) the need for Internet service provision by the study to enroll some panelists; 5) length of the panel commitment period; and 6) panel maintenance strategies, including short surveys and other forms of contact with the panel. The consultant also provided feedback on the most significant challenges in Web-enabling survey respondents and keeping them engaged long-term, and the need for alternative survey modes for panel members who will not participate online in order to minimize coverage and nonresponse bias.

Consultant contact information is provided below.

Scott Crawford

Founder, Chief Executive Officer

Survey Sciences Group, LLC

950 Victors Way, Suite 50

Ann Arbor, Michigan 48108

Ph. 734-527-2150

A.9 Explanation of Any Payment or Gift to Respondents

The multi-year longitudinal design with multiple surveys can pose a burden to respondents, while the self-administered modes of data collection and relatively sensitive topic limit the ability to motivate sample members and encourage participation. Along with other features of the study, these factors create a substantial risk of nonresponse and attrition bias in estimates of tobacco use (e.g., Seltzer, Bosse and Garvey, 1974; Vestbo and Rasmussen, 1992; Cunradi et al., 2005), if left unaddressed. A comprehensive incentive strategy was requested to recruit and maintain the 4,000-member panel, given the length of the panel commitment, the need for panelists to have frequent yet easy access to the Web survey application, and the planned frequency of contacts (up to 8 experimental and observational studies) during their time in the panel.

Regardless of the nature of the online panel, the recruitment strategy typically involves some combination of various motivators for participation, such as incentives, importance of self-expression, fun, ease of panel participation, and social comparison (Baker et al., 2010). Incentives have been viewed as the primary motive for joining a panel; for example, Poynter and Comley (2003) report a mix of motivators, with incentives having the most impact (59%), followed by curiosity (42%), enjoyment in doing surveys (40%), and importance of expressing one's own views (28%). More importantly, those participating because of an offered incentive may differ from other participants, for example, by having lower interest in the topic (e.g., Groves, Singer and Corning, 2000; Groves, Presser and Dipko, 2004; Groves et al., 2006); offering an incentive can therefore reduce nonresponse bias due to lack of interest and motivation.

Two key concerns in longitudinal panel maintenance are panel attrition and panel conditioning. To combat panel attrition and increase the likelihood of participation at each survey request during the life of the panel, panel members are offered incentives contingent on survey completion (Baker et al., 2010). Furthermore, minimizing burden by limiting the number of survey requests can support panel retention while also minimizing panel conditioning, which is associated with repeated measurement on the same topic and with the frequency of survey requests. However, a limited number of survey requests can also induce nonresponse due to lack of engagement. Given the significant investment made during the recruitment stage and the high cost of replacing panel members (due to in-person recruitment and screening), we have developed a sound incentive strategy to keep recruited panelists engaged throughout the life of the panel.

We conducted a review of existing longitudinal surveys in terms of panel maintenance strategies and, specifically, incentives (see Attachment 6). Incentive amounts ranged from no incentive to $100; however, it is difficult to know what amounts will work best for a particular sample and a specific survey topic. To aid in the identification of an appropriate incentive model, focus groups segmented by age (18-25 vs. 26 and older) and socioeconomic status (less than $30,000 vs. $30,000 or more per year) were conducted by the contractor during the design phase (OMB Control No. 0910-0497). The sessions, which involved a total of 44 adult tobacco users (7 of whom were monolingual Spanish speakers), were designed to inform decisions related to length of time in the panel, frequency and nature of contacts, recruitment strategies, incentives, and panel maintenance strategies that would ensure the longevity of the panel. The sessions also explored the feasibility of offering a tablet computing device, rather than cash, as the primary incentive for joining the panel and participating in the planned studies. The results of the focus groups suggested that potential participants preferred to use their personal computing devices (e.g., smartphones, tablets/laptops, computers) to complete the online surveys. Participants also expressed a preference for receiving a larger cash incentive rather than a tablet computing device for participating in the panel surveys.

The requested incentive protocol, based on findings in the survey literature, focus groups (OMB Control No. 0910-0497), and discussions with survey researchers outside of the study team, includes the following:

  • To minimize the initial screening cost, we mail a paper screener to all sampled households to determine whether there is an eligible tobacco user in the household. A $2 prepaid incentive is enclosed with this initial mailing to maximize response rates and reduce the number of households requiring a more expensive in-person screening visit. This approach is consistent with other large federal surveys (e.g., the National Household Education Survey, U.S. Department of Education, as part of the transition from a telephone to a mail mode of administration; the National Survey of Early Care and Education, Administration for Children and Families) that have experimented with a mail screener that includes a small prepaid incentive (typically $2 or $5) and have reported on its effectiveness in increasing screener response rates.

  • Provision of a one-time $35 enrollment incentive, paid by the interviewer upon the panelist's completion of both the enrollment and baseline surveys. The goal of this incentive is to engage the potential panelist, provide a token of appreciation for his/her participation in the enrollment and baseline surveys (an estimated interview burden of 20 minutes, plus interviewer training on website login), and serve as proof that future promised incentives are paid upon survey completion.

  • A $15 promised incentive, payable upon completion of each experimental and observational study. The goal of this incentive is to maximize participation in each study. Each study instrument is expected to take approximately 20 minutes to complete, on average.

We believe the requested incentive strategy, summarized in Exhibit A.9-1, is reasonable for recruiting a 4,000 member panel of tobacco users for the 3-year period, maintaining their interest and active participation long-term, thereby minimizing attrition, and achieving the necessary response rates to support the planned analyses for each of the 8 experimental and observational studies. Additional documentation in support of the proposed incentive strategy is provided in Attachment 6.

Exhibit A.9-1. Incentive Type and Amount

Type of Incentive | Participant | Amount/Value
Mail screener incentive | All sample members | $2 one time
Enrollment incentive | All panel members | $35 one time
Experimental and observational study cash incentive | All panel members | $15/study; up to $120 total, covering 8 studies



Over the 3-year panel period, panel members will have the opportunity to receive a maximum of $155 in incentives if they enroll and complete all 8 planned studies.

A.10 Assurance of Privacy Provided to Respondents

The contractor’s Institutional Review Board (IRB) has reviewed and approved the panel protocols and consent forms (see Attachments 3-25, 3-26, 4-25, and 4-26). The IRB’s primary concern is protecting respondents’ rights, one of which is maintaining the privacy of respondent information to the fullest extent of the law and in accordance with 45 CFR 46.103(f). The IRB will review any amendments to the study protocol before the requested changes are implemented, and conduct annual continuing reviews.

This data collection is not covered by the Privacy Act and does not require a SORN because the federal government will never have access to any personally identifiable information received during the establishment and implementation of the panel. Instead, the government will only receive de-identified datasets. All personally identifiable information is handled by the contractor that establishes and maintains the panel. The contractor assigns a unique 8-digit identification number to each sample member and the contractor uses this number to maintain linkages between the survey data files and control system files that the contractor maintains. The contractor removes the following sensitive data to produce the datasets to be delivered to the government: a) names, addresses, telephone numbers, and email addresses for panel members, b) dates of birth for panel members, c) names, ages, and relationships of all household members, and d) names, addresses, telephone numbers, and email addresses of contact persons.

All data collection activities are conducted in full compliance with FDA regulations to maintain the privacy of data obtained from respondents and to protect the rights and welfare of human research subjects as contained in their regulations. Respondents receive information about privacy protections as part of the informed consent process.

A.10.1 Procedures for Protecting Data Collected from Participants

The procedures that are used to maintain privacy for the panel in-person data collection are summarized below:

  • All project staff, including fulfillment personnel, sign a privacy pledge that emphasizes the importance of nondisclosure and describes their obligations.

  • All field data collectors are trained on privacy procedures and are prepared to describe them in full detail, if necessary, or to answer any related questions raised by sample members. Training includes procedures for safeguarding sample member information in the field, including securing hardcopy case materials and tablet computers in the field, while traveling, and in respondent homes, and protecting the identity of sample members.

  • Hardcopy documents containing personally identifiable information (PII) are stored in locked files and cabinets. Discarded hardcopy material containing PII is securely shredded.

  • Hardcopy consent forms and case folders for completed field cases are receipted and securely stored at the contractor’s Research Operations Center (ROC), which uses a keyless card-controlled entry system for controlled access.

  • Responses to all screening, CHUM, enrollment, and interviewer observation surveys are entered directly into the Android tablet computing device provided to each field interviewer. The data entered are encrypted before being written to the local database on each tablet. In the unlikely event the tablet is stolen or otherwise compromised, the tables holding the survey data would be unreadable.

  • GPS data collected on the field interviewer’s tablet during the screening and enrollment process are used for quality control purposes only to verify the interviewer’s location in relation to the sampled address. These data are not used in analyses of the substantive data or included on deliverable data files.

  • The Android tablet, the contractor's MOBILE FS system on the tablet, and any field supervisor laptops used for administrative tasks are all password protected with unique user logins.

  • Field supervisor laptops have whole-disk encryption to protect the hard drive. The associated Check Point FDE software is FIPS 140 compliant. File transfers are done through an FTP site using Secure Sockets Layer (SSL) to protect data in transit. The FTP site is specific to the project and requires project-specific credentials. Data files are encrypted using FIPS 140 certified libraries prior to sending.

  • All data transferred to the contractor's servers from field staff Android tablets, including CARI files, media files (audio, photo, and video), and survey data, are encrypted on the fly using AES-256 with a 'secure random' public key hashed using SHA-256 and a private key. These files are transmitted back to the contractor using Secure Sockets Layer (SSL) over HTTPS. A batch process decrypts these files after receipt on the contractor's private network, where they are stored on secure contractor servers. The survey data are stored in SQL Server databases on those servers. Only authorized project staff members are able to access them on the secure network share or databases. Access requires passwords and the enabling of user access by contractor IT security personnel. An illustrative sketch of this encryption pattern appears after this list.


  • A unique 8-digit identification number is assigned to each sample member and used to maintain linkages between survey data files and control system files.

  • Following receipt from the field, PII is stored only on the contractor’s password-protected, secure servers. Only authorized project members have access to PII for research sample members.

  • Reports and data files provided to the research community will not include any individually identifying information.

  • As noted above, all precautions are taken against inadvertent disclosure. Project directories and files containing data, and files of identifiers and contact data, are protected through the use of encryption and passwords.
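The bullets above describe an encrypt-before-transmission workflow for field data. The following minimal Python sketch illustrates that general pattern under stated assumptions; it is not the contractor's implementation, and the function name, simplified key handling, and use of the third-party "cryptography" package (version 3.1 or later) are illustrative assumptions only.

    # Minimal illustrative sketch (not the contractor's implementation) of
    # encrypting a survey data payload with AES-256 before transmission over HTTPS.
    # Assumes the third-party "cryptography" package; key handling is simplified.
    import hashlib
    import os

    from cryptography.hazmat.primitives import padding
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def encrypt_payload(plaintext: bytes):
        """Encrypt a payload with AES-256-CBC using a SHA-256-hashed random key."""
        key = hashlib.sha256(os.urandom(32)).digest()  # 32 bytes = 256-bit key
        iv = os.urandom(16)                            # random initialization vector
        padder = padding.PKCS7(128).padder()
        padded = padder.update(plaintext) + padder.finalize()
        encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
        ciphertext = encryptor.update(padded) + encryptor.finalize()
        return key, iv, ciphertext

    if __name__ == "__main__":
        key, iv, blob = encrypt_payload(b"example survey responses")
        # The ciphertext would then be uploaded over HTTPS/SSL and decrypted
        # by a batch process on the contractor's private network.
        print(len(blob))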

A.10.2 Additional Privacy Concerns Associated with On-line Data Collection

Panel member privacy concerns regarding use of the Internet for participation typically relate to three issues:

  • Disclosure of subjects’ PII by the researchers to others outside the study;

  • Use of electronic information to gather additional PII without the subject’s knowledge or consent; and

  • Electronic breach of security allowing unrelated third parties access to subjects’ PII.

Plans to minimize the potential for risk and to address each of these three issues are described below.

All study consent forms provide participants with advance notice of what data are collected and the measures taken to protect their privacy. These measures include using approved encryption and other methods to physically and electronically secure data, collecting only the minimum amount of information necessary to conduct the study, not disclosing this information to anyone outside the research team, and destroying data as soon as possible after the study has been completed. The data are collected only for the stated purpose and are not used subsequently for any other purpose.

Panelists access the panel website using their unique 8-digit identification code. They are required to create a unique password to access their Web surveys and, in the event of a break-off, to resume surveys at a later date. At their initial login, panelists are also required to select and answer one of five security questions, which is used if the panel member requests a password reset during the course of the panel period. Responses entered through the Web-based survey are encrypted in transit because the panel website has an SSL certificate applied. Like the mobile instrument survey data, the Web survey data reside in SQL Server databases on secure contractor servers. Only authorized project staff have access.

The type of Web browser and operating system used by a panel member cannot be used to identify an individual. Panel member access to the Web survey system requires a unique panel member identifier and password, as noted above. In addition, the panel member is reassured that the researchers do not gather any other information aside from the survey answers and the electronic information already described. The Web site does not place session cookies, persistent cookies, or any other type of tracking or monitoring software on panel members’ computers, tablets, or smartphones. There is no tracking or monitoring of panel members’ Internet behavior in this information collection.
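As context for the login and password-reset flow described above, the following is a purely hypothetical sketch, not the panel website's code; the class and method names, the PBKDF2 parameters, and the case-normalization of security answers are all assumptions made for illustration.

    # Hypothetical sketch of the panelist login and security-question reset flow
    # described above. Not the panel website's implementation; names and
    # parameters are assumptions.
    import hashlib
    import hmac
    import os

    def _hash(secret: str, salt: bytes) -> bytes:
        return hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 100_000)

    class PanelistAccount:
        def __init__(self, panelist_id: str, password: str, question: str, answer: str):
            self.panelist_id = panelist_id      # unique 8-digit identification code
            self.salt = os.urandom(16)
            self.pw_hash = _hash(password, self.salt)
            self.question = question            # one of the five security questions
            self.answer_hash = _hash(answer.strip().lower(), self.salt)

        def verify_login(self, password: str) -> bool:
            return hmac.compare_digest(self.pw_hash, _hash(password, self.salt))

        def reset_password(self, answer: str, new_password: str) -> bool:
            # A reset is honored only when the stored security answer matches.
            if hmac.compare_digest(self.answer_hash, _hash(answer.strip().lower(), self.salt)):
                self.pw_hash = _hash(new_password, self.salt)
                return True
            return False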

A.10.3 Privacy Procedures for Mail Survey Participants

The privacy of responses from mail survey participants is treated in the same manner as responses from Web survey participants.

  • Project staff take all necessary precautions to ensure the secure transport of study materials to participants and the secure transport, processing, and storage of participant data.

  • All study materials, while in possession of the contractor, are assembled, processed, and stored at the contractor’s ROC, a controlled-access facility equipped to support sensitive, large-scale mail survey efforts. Access to the building is by keyless card-controlled entry.

  • All staff who come in contact with private project materials have signed privacy pledges and have been trained on all project security procedures.

  • Electronic files containing sensitive data created in the process of preparing printed materials for mailouts (e.g., mail-merge data files, print files) are deleted by staff as soon as all associated mailings or printings have been completed.

  • Any printed sensitive materials not used, such as test printouts or batches of materials with printing problems for which reprinting is required, are securely shredded immediately.

  • Mailings for mail survey participants are assembled by project staff who have signed privacy pledges.

  • After participants complete a mail survey, they return the completed form, identified only by the Case ID, to the contractor in a standard Business Reply Envelope. All returned mailings and forms received by the contractor are sent directly to the Survey Support Department (SSD) at the ROC and stored in a secure area. The SSD area is locked at all times, and a supervisor is present whenever work is being performed there. At SSD, a Document Control Clerk is assigned responsibility for processing, filing, and maintaining all project materials. Incoming materials are stored in a locked file cabinet after processing and then shredded at the end of the project.

  • After receipt by the contractor, completed mail survey forms are scanned using Teleform. Panel member names are not printed on paper survey forms; instead, forms are labeled with the panelist’s study ID.

  • Completed mail survey forms are securely shredded at the end of the project, following data delivery to FDA.

A.10.4 Privacy Concerns for Participants Using Loaned Tablet Computers

The privacy of responses from panel members using the loaned tablet computer is treated in the same manner as responses from Web survey participants. In addition:

  • The Web-enabled tablet computer loaned to a subset of panel members is provided as a tool for accessing the panel website to participate in panel surveys online. No survey data are collected or stored locally on the device. Additionally, the device is not used to track the panel member’s location or to collect data from the device about non-study usage.

  • Panel members receive detailed written instructions by mail for the packaging and return of loaned tablets to the contractor when their panel participation ends. This includes shipping boxes and overnight postage-paid shipping labels.

  • Upon return, loaned devices are inventoried and receipted, wiped clean of any data that might have been stored on them, and restored to their factory settings. Panel members are reassured that no attempts are made to gather any other information from the device aside from the survey answers and electronic information already described.

A.11 Justification for Sensitive Questions

The panel field screener and enrollment surveys contain items about current employment status and basic demographic information including age, gender, date of birth, race/ethnicity, educational attainment, and marital status. Federal regulations governing the administration of these questions, which might be viewed as sensitive due to personal or private information, require (a) clear documentation of the need for such information as it relates to the primary purpose of the study, (b) provisions to respondents that clearly inform them of the voluntary nature of participation in the study, and (c) assurances that responses may be used only for statistical purposes, except as required by law (20 U.S.C. § 9573).

The collection of data related to current employment status and basic demographic information is essential for subsequent analyses, which include examination of the demographic characteristics of survey nonrespondents and of panel members who leave the study over time. These data will also be used to accurately characterize and/or subset panel members for inclusion in the experimental and observational studies, and for the descriptive and other analyses described in Section A.16.

Respondents are advised of the voluntary nature of participation and their right to refuse to answer any question during the informed consent process.

A.12 Estimates of Annualized Burden Hours and Costs

A.12.1 Annualized Hour Burden Estimate

Exhibit A.12-1 contains the estimated annualized response burden for panel members, and Exhibit A.12-2 presents the estimated annual reporting burden for each data collection activity. Burden was estimated using data from timed readings of each instrument, including the mail and field screeners, the enrollment survey, the baseline survey, and the Study 1 questionnaire. To compute the total estimated annual cost, the total burden hours were multiplied by the average hourly wage for adult participants, based on the Bureau of Labor Statistics Current Employment Statistics Survey, 2011; the resulting estimates are presented in Exhibit A.12-1.

A.12.2 Annualized Cost Burden Estimate

Exhibit A.12-1. Estimated Annualized Response Burden for Panel Members

Type of Respondent   | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs
2-year Panel Member  | 3,534              | $22.88           | $80,857.92
3-year Panel Member  | 900                | $22.88           | $20,592.00
Total                |                    |                  | $101,449.92
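As a worked check of the arithmetic in Exhibit A.12-1 (total burden hours multiplied by the $22.88 average hourly wage), the short hypothetical sketch below reproduces the cost figures; it is illustrative only and not part of the information collection.

    # Worked check of the respondent cost arithmetic in Exhibit A.12-1:
    # total respondent cost = total burden hours x average hourly wage ($22.88).
    HOURLY_WAGE = 22.88
    burden_hours = {"2-year Panel Member": 3534, "3-year Panel Member": 900}

    grand_total = 0.0
    for respondent_type, hours in burden_hours.items():
        cost = hours * HOURLY_WAGE
        grand_total += cost
        print(f"{respondent_type}: {hours:,} hours x ${HOURLY_WAGE} = ${cost:,.2f}")
    print(f"Total: ${grand_total:,.2f}")  # $101,449.92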

Exhibit A.12-2. Estimated Annual Reporting Burden

Activity/Respondent | Number of Respondents | Responses per Respondent | Total Annual Responses+ | Avg. Burden Hours per Response | Avg. Annual Burden Hours+
Household Screening Respondent1 | 35,885 | .33 | 11,842 | .13 | 1,539
Panel Member Enrollment Survey | 4,000 | .33 | 1,320 | .25 | 330
Panel Member Baseline Survey | | .33 | 1,320 | .25 | 330
Study 1 (Experimental/Observational Study) | | .33 | 1,320 | .33 | 436
Panel Replenishment Household Screening Respondent1 | 20,570 | .50 | 10,285 | .13 | 1,337
Panel Replenishment Enrollment Survey3 | 2,800 | .33 | 924 | .25 | 231
Panel Replenishment Baseline Survey3 | | .33 | 924 | .25 | 231
TOTAL | 63,255 | | | | 4,434

+ Amounts are rounded to the nearest whole number.

1 Includes both mail and field screening. Of the total screening respondents, we expect 25% will respond only in the mail screening (household deemed ineligible), 65% will respond only in the field screening (mail screening nonrespondents), and the remaining 10% will respond in both the mail screening and the field screening. The latter includes eligible households from the mail screening that are subsequently field-screened to sample the panel member, and the 10% quality control sample of households whose mail screening ineligibility is verified through in-person screening. Assumes an estimated 10,285 household screening respondents during yearly panel replenishment (20,570 total).

3 Assumes 1,400 additional panel members will be recruited annually (2,800 total) as part of the panel replenishment effort. Replenishment panel members replace original panel members and become part of the 4,000-member panel that receives experimental/observational and panel maintenance surveys.
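For reference, the figures in Exhibit A.12-2 follow from multiplying the number of respondents by the number of responses per respondent, and then by the average burden hours per response, rounding to whole numbers. The hypothetical sketch below re-derives a few rows as an arithmetic check; it is illustrative only and not part of the information collection.

    # Worked check of selected rows in Exhibit A.12-2:
    #   total annual responses = respondents x responses per respondent
    #   annual burden hours    = total annual responses x hours per response
    # Amounts are rounded to the nearest whole number, as in the exhibit.
    rows = [
        # (activity, respondents, responses per respondent, hours per response)
        ("Household Screening Respondent", 35_885, 0.33, 0.13),
        ("Panel Member Enrollment Survey", 4_000, 0.33, 0.25),
        ("Panel Replenishment Household Screening Respondent", 20_570, 0.50, 0.13),
    ]
    for activity, respondents, responses_per, hours_per in rows:
        total_responses = round(respondents * responses_per)
        burden_hours = round(total_responses * hours_per)
        print(f"{activity}: {total_responses:,} responses, {burden_hours:,} burden hours")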

A.13 Estimates of Other Total Annual Cost Burden to Respondents or Recordkeepers

There are no capital or operating and maintenance costs associated with this collection. There are no direct monetary costs to individual participants other than their time to participate in the study.

A.14 Annualized Cost to the Federal Government

The estimated annual cost to the government for each year of this contract is $3,343,615, which is one-fifth of the $16,718,075 total cost to the Federal government of establishing the panel under the 5-year contract to RTI International and its subcontractors. These costs include questionnaire design and programming; design and implementation of the initial sample of 36,390 addresses and up to 12,091 reserve sample addresses; eligibility screening of an estimated 35,885 sampled households; recruitment of 4,000 adult panel members; collection of data from panel members, including enrollment, baseline, and Study 1 survey data; panel replenishment tasks, including screening and recruitment of 1,400 additional panel members annually; data processing and analysis; and preparation of reports and data files. Panel member incentive costs are included in this estimate.

A.15 Explanation for Program Changes or Adjustments

FDA is requesting non-substantive changes to (1) revise respondent materials to support planned panel maintenance activities; (2) revise the topical study 1 questionnaire on tobacco brands and purchasing behaviors; and (3) update the estimated burden for panel replenishment and include a reserve sample. The average annual burden hours will increase by 668 hours, from 3,766 to 4,434 hours.

A.16 Plans for Tabulation and Publication and Project Time Schedule

A.16.1 Study Schedule

Exhibit A.16-1 provides a schedule of the major activities for the panel project. A 3-year clearance is requested given the long-term nature of the Panel and the plan to enroll panel members for a 3-year period.

A.16.2 Publication and Reporting Plans

The key findings of Study 1 will be summarized in presentations and/or written reports and disseminated to target audiences within the public health community (including researchers and policymakers) within approximately one year after the completion of data collection. As described in Section A.2.3, Study 1 focuses on purchasing behavior, tobacco brands, and the use of coupons and price promotions for tobacco products. The analysis and reporting of the key findings will examine participants’ tobacco product brand loyalty and the accuracy of self-reported data on tobacco product consumption, and will be informed by descriptive data (e.g., demographic characteristics and tobacco use indicators) collected in the baseline survey for each panel member.

Publication and Reporting Plans for each subsequent observational or experimental study are described in the information collection request for that study.

Exhibit A.16-1. Panel Project Schedule

Activity | Start date | End date
Select address sample for panel implementation | June 2016 | July 2016
Recruit and train field staff for panel implementation | August 2016 | January 2017
Conduct field enumeration activities in selected areas | November 2016 | December 2016
Recruit and enroll initial cohort of panel members | September 2016 | August 2017
Conduct nonresponse follow-up and troubleshooting (flow basis) | October 2016 | End of Panel
Provide reports of panel recruitment and maintenance activities (flow basis, during active recruiting phases) | October 2016 | End of Panel
Conduct Study 1, first experimental or observational study | February 2018 | March 2018
Conduct panel replenishment (quarterly) | April 2018 | End of Panel
Conduct analysis and reporting of Study 1 findings | April 2018 | September 2018



A.16.3 Analysis Plans

The data analyses conducted in conjunction with the 6 to 8 studies involving the established panel will focus on addressing the key research questions for each of the experimental and observational studies. For example, an analysis of an experimental study of potential warning labels could compare differences in tobacco-related beliefs, attitudes, and behavioral intentions between the treatment and control groups, overall and for subgroups of tobacco users as appropriate. Depending on the experimental design, the analyses could help determine which of various proposed statements/labels are most effective.
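As a generic illustration of such a treatment-versus-control comparison (not the project's actual analysis code), a categorical outcome could be cross-tabulated by experimental condition and tested with a chi-square test; the column names and data in the sketch below are hypothetical.

    # Generic illustration (not the project's analysis code) of a treatment vs.
    # control comparison for a categorical outcome. Column names such as
    # "condition" and "intends_to_quit" are hypothetical.
    import pandas as pd
    from scipy.stats import chi2_contingency

    # Hypothetical analysis file with one record per panel member in a study.
    df = pd.DataFrame({
        "condition": ["treatment", "treatment", "control", "control", "treatment", "control"],
        "intends_to_quit": ["yes", "yes", "no", "yes", "no", "no"],
    })

    # Cross-tabulate the outcome by experimental condition and test for independence.
    table = pd.crosstab(df["condition"], df["intends_to_quit"])
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(table)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")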

Analysis of the study data may involve one or more of the following methods and approaches:

  • Descriptive analyses to describe the panel on individual variables, including summary statistics such as the number and percentage of panel members reporting a specific response or behavior, or distributions of scores for continuous variables (e.g., measures of central tendency, standard deviation, skewness).

  • Bivariate analyses and cross-tabulations to examine the relationships between variables, including panel member demographic characteristics and tobacco use data; for example, this may include examining the relationship between type of tobacco product used and race/ethnicity, or between level of addiction and self-classification as a tobacco user.

  • Evaluation of experiments implemented during a study.

  • Analyses that identify subsets of panel members who may be of unique analytical value for the experimental or observational studies, for example, panel members who regularly use more than one type of tobacco product or who are highly addicted and self-identify as non-tobacco users.

  • Analyses that ensure the quality and representativeness of the panel, for example, comparisons of collected data to national benchmarks.

  • Tests of the relationships between variables (e.g., means tests, regression analyses, chi-square tests) to monitor changes in key panel member characteristics, such as tobacco product used, level of addiction, and quit attempts, in order to identify changes in panel characteristics that might influence future studies.

  • Analyses that evaluate nonresponse bias and inform the weighting strategy to adjust for deviations from the original design due to factors such as variable nonresponse.

The analyses conducted will yield tables, figures, and various summary statistics for potential inclusion in presentations and/or written reports disseminated for each study.

A.17 Reason(s) Display of OMB Expiration Date Is Inappropriate

The OMB number and expiration date are displayed on the survey website where questionnaires are launched, on all mail survey instruments, and on the consent forms.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

There are no exceptions to the certification.



1 The question is “Do you ever go online to access the Internet or World Wide Web, or to send and receive e-mail?”

2 The question is “Do you ever go online to access the Internet or World Wide Web, or to send and receive e-mail?”

3 35.6% is the percent of smokers and tobacco users, while 37.3% is the percent of cigarette smokers only.

4 The design provides for screening all nonresponding households in a face-to-face mode and selecting a 10% random sample of those who report ineligibility to be screened by an interviewer during a face-to-face visit.
