Superimposed Text in Direct-to-Consumer Promotion of Prescription Drugs
OMB Control No. 0910-NEW
SUPPORTING STATEMENT Part A
A. Justification
1. Circumstances Making the Collection of Information Necessary
Section 1701(a)(4) of the Public Health Service Act (42 U.S.C. 300u(a)(4)) authorizes FDA to conduct research relating to health information. Section 1003(d)(2)(C) of the Federal Food, Drug, and Cosmetic Act (the FD&C Act) (21 U.S.C. 393(d)(2)(C)) authorizes FDA to conduct research relating to drugs and other FDA-regulated products in carrying out the provisions of the FD&C Act.
The proposed study seeks to extend previous research on the effects of superimposed text (supers) in advertising to today's direct-to-consumer (DTC) pharmaceutical promotion. Although earlier research on the effects of supers in other consumer settings suggests that altering text size can influence consumer comprehension of information, it is unclear whether these findings extend to DTC promotion of prescription drugs and whether they still apply more than 20 years later, when promotional materials are viewed on modern technologies (e.g., tablets). Moreover, other factors, such as text/background contrast, may also influence both the understanding of the superimposed information [1] and the effects of text size. The proposed research seeks to update these earlier findings and to answer new questions concerning the presentation of supers.
Part of FDA's public health mission is to ensure the safe use of prescription drugs; therefore, it is important that the information provided in DTC promotion is clear and understandable for consumer audiences, avoids deceptive or misleading claims, and achieves "fair balance" in the presentation of benefits and risks. For example, varying presentation formats, including type size, bulleting, amount of white space, and use of "chunking" or headlines, can all influence consumer perceptions of information [2]. A systematic review of presentation formats in prescription drug labeling found that these "clear communication" characteristics positively influenced consumers' comprehension of information and prescription drug behaviors (i.e., adherence) [3]. In one randomized controlled study, young and older adults were presented with 12 otherwise identical over-the-counter drug bottles whose container labels varied along several dimensions, one of which was text size (7 vs. 10 point). While younger participants performed equally well with both font sizes, older adults had significantly reduced recall and comprehension when exposed to the smaller text size [4]. Another study found that both young and older populations preferred the larger text size, and that patients read labels with larger font more rapidly and accurately than labels with smaller font [5]. Although these studies were specific to prescription drug container labels, it is plausible that the effects of font sizes would be applicable to drug promotion.

[1] Hall RH, Hanna P. The impact of web-page text-background colour combinations on readability, retention, aesthetics, and behavioural intention. Behav Inform Technol. 2004;23:183-95.
[2] Baur C, Prue C. The CDC Clear Communication Index is a new evidence-based tool to prepare and review health information. Health Promot Practice. 2014;15:629-37.
[3] Shrank W, Avorn J, Rolon C, Shekelle P. Effect of content and format of prescription drug labels on readability, understanding, and medication use: A systematic review. Ann Pharmacother. 2007;41:783-801.
Some early research in the late 1980s and 1990s examined the size of text information in advertising contexts outside of prescription drugs [6]. These studies generally found that text size was associated with comprehension, such that larger text sizes increased understanding of the material (and, conversely, smaller text sizes interfered with comprehension). For example, Foxman and colleagues [7] found that whereas "small" text size (less than ½ inch) was associated with comprehension for 59% of respondents, "large" text size (greater than ½ inch) was associated with comprehension for 79% of respondents. Studies by other researchers [8] found similar patterns, such that increasing the text size of supers generally corresponded with increased comprehension.
We know of no studies that have examined other commonly varied factors, such as text/background contrast, that may interact with text size to influence comprehension. Early research on text readability determined that the contrast between text and background has a consistent but small effect. Specifically, while the contrast of color has a small effect [9], the contrast in brightness, or luminance, makes the largest difference [10]. These studies showed that black text on a white background results in the highest readability [11], but that the effects of other color contrasts are unclear [12]. Some studies have demonstrated that contrast interacts with text size, such that contrast becomes a more important discriminator as text size decreases [13].
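For illustration, luminance contrast of the kind discussed above can be quantified with the WCAG 2.x contrast-ratio formula. This is one common metric, not necessarily the measure that will be used to define the study's contrast conditions, and the colors below are hypothetical examples.

```python
# Illustrative only: quantifying text/background luminance contrast with the
# WCAG 2.x contrast-ratio formula (values range from 1:1 to 21:1).

def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as (R, G, B) in 0-255."""
    def linearize(channel):
        c = channel / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between a text color and a background color."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# White text on a black banner: 21.0, the maximum possible ratio.
print(contrast_ratio((255, 255, 255), (0, 0, 0)))
# White text over a mid-gray scene (hypothetical low-contrast case): about 4.
print(contrast_ratio((255, 255, 255), (128, 128, 128)))
```

Black text on a white background yields the maximum 21:1 ratio, which is consistent with the readability findings cited above.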
[4] Wogalter MS, Vigilante WJ. Effects of label format on knowledge acquisition and perceived readability by younger and older adults. Ergonomics. 2003;46:327-344.
[5] Smither JAA, Braun CC. Readability of prescription drug labels by older and younger adults. J Clin Psychol Med S. 1994;1:149-59.
[6] Foxman ER, Muehling DD, Moore PA. Disclaimer footnotes in ads: Discrepancies between purpose and performance. J Public Policy Mark. 1988;7:127-37; Murray NM, Manrai LA, Manrai AK. Public policy relating to consumer comprehension of television commercials: A review and some empirical results. J Consum Policy. 1993;16:145-170; Manrai LA, Manrai AK, Murray N. Comprehension of info-aid supers in television advertising for social ideas: Implications for public policy. J Bus Res. 1994;30:75-84.
[7] Foxman ER, Muehling DD, Moore PA. Disclaimer footnotes in ads: Discrepancies between purpose and performance. J Public Policy Mark. 1988;7:127-37.
[8] Murray NM, Manrai LA, Manrai AK. Public policy relating to consumer comprehension of television commercials: A review and some empirical results. J Consum Policy. 1993;16:145-170; Manrai LA, Manrai AK, Murray N. Comprehension of info-aid supers in television advertising for social ideas: Implications for public policy. J Bus Res. 1994;30:75-84.
[9] Hill A, Scharff L. Readability of computer displays as a function of colour, saturation, and background texture. In D. Harris (Ed.), Engineering Psychology and Cognitive Ergonomics (Vol. 4). Ashgate, Aldershot, United Kingdom.
[10] Shieh K-K, Lin C-C. Effects of screen type, ambient illumination, and color combination on VDT visual performance and subjective preference. Int J Ind Ergonom. 2000;26:527-36.
[11] Tinker MA, Paterson DG. Studies of typographical factors influencing speed of reading. VII. Variations in color of print and background. J Appl Psychol. 1931;15:471-9.
[12] Hall RH, Hanna P. The impact of web-page text-background colour combinations on readability, retention, aesthetics, and behavioural intention. Behav Inform Technol. 2004;23:183-95.
[13] Legge GE, Rubin GS, Luebker A. Psychophysics of reading. V. The role of contrast in normal vision. Vision Res. 1987;27:1165-77.
The earlier research on supers is limited in its applicability to today's DTC promotion in several ways. None of these studies specifically focused on prescription drug promotion; rather, they explored the effects of superimposed text in a variety of social and consumer advertising contexts. Another limitation is that these earlier studies were conducted with populations (i.e., undergraduate students) that are not representative of today's prescription drug users. It is not clear whether the effects of supers would translate to older adult populations, who represent the greatest proportion of prescription drug users [14]. Perhaps most importantly, it is unknown whether the effects of supers would be found today, considering the prevalent use of modern technologies, including large (40+ inch) TV screens and personal tablets. Our proposed study seeks to address these unanswered questions regarding the use of supers in prescription drug promotion.
General Research Questions
1. Do the size of the superimposed text, the contrast behind the superimposed text, and/or the device type influence the noticeability, recall, and perceived importance of the super information?
2. Do the size of the superimposed text, the contrast behind the superimposed text, and/or the device type influence the recall of and attitudes toward the promoted drug?
3. Are there any interaction effects among any combination of independent variables?
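For illustration, the research questions above imply a fully crossed factorial structure. The sketch below assumes three text sizes, two contrast levels, and two device types (12 cells); the actual design is specified in Part B of this supporting statement.

```python
# A sketch of the factorial structure implied by the research questions,
# assuming a fully crossed 3 x 2 x 2 design. Factor levels are illustrative
# assumptions; the study's actual conditions are documented in Part B.
from itertools import product

text_sizes = ["small", "medium", "large"]
contrasts = ["high", "low"]
devices = ["TV", "tablet"]

conditions = list(product(text_sizes, contrasts, devices))
print(len(conditions))  # 12 experimental cells under these assumptions
for size, contrast, device in conditions:
    print(f"{size} text / {contrast} contrast / {device}")
```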
2. Purpose and Use of the Information Collection
The purpose of this project is to investigate how different presentations of superimposed text in
video DTC promotion influence the communication of benefit and risk information. To our
knowledge, no studies have comprehensively examined the size and contrast of superimposed
text in the DTC arena. Part of FDA’s public health mission is to ensure the safe use of
prescription drugs; therefore, it is important to communicate the risks and benefits of prescription
drugs to consumers as clearly and usefully as possible. This study will inform FDA of key
variables in the communication value of superimposed text.
3. Use of Improved Information Technology and Burden Reduction
Automated information technology will be used in the collection of information for this study.
One hundred percent (100%) of participants will self-administer the survey via a computer,
which will record responses and provide appropriate probes when needed. In addition to its use
in data collection, automated technology will be used in data reduction and analysis. Burden will
be reduced by recording data on a one-time basis for each participant, and by keeping the written
parts of surveys to less than 30 minutes in both the pretests and main study.
4. Efforts to Identify Duplication and Use of Similar Information
Although the literature revealed a rich background on which to base the current research, we
found no studies that have examined the issues we propose to study.
[14] Kaufman DW, Kelly JP, Rosenberg L, Anderson TE, Mitchell AA. Recent patterns of medication use in the ambulatory adult population of the United States: The Slone survey. J Amer Med Assoc. 2002;287:337-344.
5. Impact on Small Businesses or Other Small Entities
No small businesses will be involved in this data collection.
6. Consequences of Collecting the Information Less Frequently
The proposed data collection is one-time only. There are no plans for successive data
collections.
7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
There are no special circumstances for this collection of information.
8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the
Agency
In accordance with 5 CFR 1320.8(d), FDA published a 60-day notice in the Federal Register
of March 9, 2016 (81 FR 12503) requesting public comment on the proposed collection of
information. FDA received 10 comments total. Six comments were outside the scope of the
proposed research (“Ban DTC”), leaving four substantive comments.
1. Abbvie
a. Comment: Mobile users can change font size and viewing size; this should be incorporated into the study.
Response: Although the font size of certain text (such as newspaper articles) or the size of closed-captioning text can be changed on a tablet, supers within a developed video cannot be manipulated. Participants will be allowed to hold the tablet as they normally would, but it is important to establish experimental control over many user settings to avoid threats to internal validity. Thus, font and viewing size will be standardized for this study.
b. Comment: Recommend looking at use of TV and mobile devices concurrently, as some people use them this way.
Response: This is a good suggestion for future research, but it is out of scope for the current study.
2. Lilly
a. Comment: Generally supportive; research objectives and study approach are
reasonable.
Response: Thank you.
b. Comment: Recommend showing supers in black box at bottom of the screen and
not superimposing them over moving, contrasting color field to mimic common
practices in television commercial advertising.
Response: Our high contrast condition indeed presents the supers in white font on
a black background at the bottom of the screen. Our low contrast condition shows
lettering over the moving scenes because not all advertisements show their supers
in a black banner.
c. Comment: Lilly requests that, when the agency reports the results of the study, it clarify how the text sizes and levels of contrast were developed.
Response: We used cognitive interviews and will use the pretest to make these
determinations. We will be sure to include this information when we report the
results of the study.
d. Comment: Recommend qualitative pre-test instead of quantitative pretest.
Response: We fulfilled this suggested purpose with a set of nine cognitive
interviews that were conducted in April.
e. Comment: Request clarity about quota sampling and other techniques we may
plan to use to ensure a diverse sample. Also suggest groups of at least 50 in each
cell for analysis purposes.
Response: As this study is not intended to be nationally representative, we will not
employ strict quota sampling procedures. However, we will work closely with our
recruitment firms to monitor recruitment and ensure that our sample is diverse
with regard to factors including race, education, age and gender. Further, selection
of our three U.S. cities for data collection (Los Angeles, Cincinnati and Tampa)
was purposive to help achieve diversity on these factors.
To answer the second part of the comment, we are aware of no statistical or
research standard that specifies that groups must contain 50 individuals. However,
we conducted power analyses and determined that, in order to have enough power for the proposed statistical tests, our planned sample will exceed this number per experimental cell.
f. Comment: Recommends replacing the pre-test question about the importance of
the text information (Question 5) with a question such as “how noticeable or
legible was the text information?”
Response: We agree that the noticeability and legibility of the text information is
important, and we have other questions that address this. We are specifically
interested in the perceived importance of the text information as a moderator
variable.
g. Comment: Recommends removing semantic differential questions (Question 9)
and essentially any questions that ask about perceptions because it is a pretest.
Response: Our pretest study is not designed to test the main study questionnaire.
Rather, the main purposes of the pretest are to (1) test consumer perceptions of
superimposed-text size with the aim of choosing perceptibly different levels of
size (small, medium, large) for use in the main study; and (2) test our planned
procedures for implementation of the intervention (TV and tablet) and in-person
data collection. However, to make the most use of our resources, we also plan to
test the properties of certain main study survey items (e.g., means, ranges, etc.) to
ensure the utility of the items for use in the main study.
h. Comment: Calls out an inconsistency in terms of how many times participants will
view the ad.
Response: Thank you for noting that discrepancy. Participants will view the ad
once. We have corrected all materials to reflect this change. We note that Lilly
recommends showing it twice. We agree that if the goal is to learn about user
experience (preferences and such, or trying to improve the presentation) then two
or more viewings makes sense. However, our goal is to test differences in
cognitive processing based on the varied size/contrast presentations of the supers.
Thus, we do not want to artificially enhance the scrutiny participants pay to the ad
above and beyond the experimental situation. For example, small supers may
interfere with cognitive processing as hypothesized, but this interference may be
overcome upon a second viewing. In a real-world viewing situation, consumers rarely see an ad twice in a row.
i. Comment: Question 12: attributes are very similar and will be duplicative.
Response: The three survey items for questions #12 (attitudes towards the ad) are
conceptually similar and will be used as a multi-item scale. Conventionally, three
items is the minimum recommended to assess inter-item reliability.
j. Comment: Question 12 and 14: Suggest bolding or underlining “drug” or “ad” in
these questions to differentiate them for participants.
Response: We agree and have added language to the survey items to better make
this distinction. For items specific to attitudes towards the drug we now begin the
item with “Overall, DRUG X is…” whereas items about the ad begin with
“Overall, the ad was…”
k. Comment: Would be interesting to include an open-ended question about whether
any additional information could have or should have been provided in the ad,
such as accessibility to the drug, information about the disease, etc.
Response: These are great ideas and would provide additional information about
various communication issues relevant to DTC television advertising. However,
we regret that we must make difficult choices about what to include in this study, and these issues fall outside the scope of the current research questions.
3. Merck
a. Comment: FDA's execution may not yield useful data. For example, the study examines TV and tablet use, but people may be viewing promotion on mobile devices.
Response: We agree that the ways in which people view their media are
multiplying and that we have not captured all of them. However, rather than
simply study superimposed text on a television screen, we opted to add an
examination of viewing on a tablet, which is an increasingly popular option for
viewing shows. We regret that we do not have the opportunity to explore viewing
on all possible new technologies, but we believe that the current study will offer
insights above and beyond the television screen.
b. Comment: Prior to the implementation of results from individual studies on the
content, format, and presentation of information in DTC advertisements on
television, FDA should conduct research on the combination of all of the
individual factors.
Response: This comment is outside the scope of the present project. It is not
directed at the improvement of the study and does not appear to require the
abandonment of the current study.
4. GlaxoSmithKline (GSK)
a. Comment: Allowing participants to view the TV at the distance they usually view
it and to interact with the tablet the way they ordinarily do would better reflect a
real-world experience.
Response: We agree that these details are important to consider when conducting valid research. We must balance the trade-off between experimental control and real-world generalizability. We have attempted to do this by setting up the television and chair in the room at the average distance that people tend to sit from their televisions in their living rooms and by instructing participants to wear glasses or contact lenses if needed. Television viewing is a more fixed experience than viewing on newer technologies. We also agree that allowing individuals to hold the tablet or place it on a table as they normally would is appropriate for both experimental control and ecological generalizability.
b. Comment: Including a medium contrast instead of just a high and low contrast
may be informative.
Response: We appreciate this comment; we considered this issue when designing the study. We decided to use only high and low contrast because our main variable of interest in this particular study is the size of the text. Thus, we are devoting resources to determining multiple text sizes to test in order to gain a fuller appreciation of the role of text size in DTC promotion. We have found in past studies that identifying a medium level is difficult (e.g., OMB Control No. 0910-0695), and we chose in this study to focus on size rather than contrast. That said, we do feel that contrast is valuable enough to add as a variable of interest, so we are planning to devote two conditions to it.
c. Comment: It would be useful if the questionnaire is posted along with the notice
on regulations.gov.
Response: We are happy to provide the questionnaire to anyone who requests it.
d. Comment: Suggests an FDA-Industry working group might be helpful in the
furtherance of this research.
Response: This is an intriguing idea and may have merit after we obtain empirical
data that is specifically applicable to DTC promotion. Without this data, it is
unclear what this working group would contribute. We will consider this idea in
further detail upon interpretation of results.
External Reviewers
In addition to public comment, OPDP solicited peer-review comments from researchers
in fields relevant to the communication of DTC prescription drug information. We received
responses and incorporated the thoughts of the following individual:
Dr. Cynthia Baur, Senior Advisor, Health Literacy, Office of the Associate Director for Communication, Centers for Disease Control and Prevention
9. Explanation of Any Payment or Gift to Respondents
Incentive rates will vary according to the industry standards by location. Participants
completing the pretest or main study in the Los Angeles market will receive a $75 cash incentive
for completing a 25-minute in-person interview. Participants recruited from the Cincinnati or
Tampa markets will receive a $40 cash incentive for a 25-minute in-person interview. The two
research facilities with which we will partner have confirmed that these incentive amounts
represent the minimum required to compensate for travel expenses and a relatively brief onsite
time commitment, and would be the appropriate standard for the 25-minute interviews in these
locations. These incentive amounts will ensure that we are able to attract a reasonable cross-section of consumers aged 18 or older. Using market-rate incentives confers several benefits:
1. Reduce survey costs: Recruiting with market-rate incentives is cost-effective. Our experience indicates that using the market rate for broad demographic recruitment results in very few no-shows. Prior research corroborates our past experience, finding that monetary incentives at similar market rates increase participation rates [15]. Incentives are cost-effective because the cost of a no-show is greater than the cost of paying the incentive. When participants fail to show up in large numbers, the cost of the study increases significantly because it means paying for multiple researchers, equipment, and a rented facility that sit idle. Further, lower participation rates will likely affect the project timeline because participant recruitment will take longer and, therefore, data collection will be slower.
2. Improve data quality: Because providing a market-rate incentive tends to increase response rates, it also improves data quality. Previous research suggests that providing incentives may help reduce sampling bias by increasing participation rates among individuals who are typically less likely to participate in research (such as those with lower education; e.g., Guyll et al., 2003). Furthermore, there is some evidence that using incentives can reduce nonresponse bias in some situations by bringing in a more representative set of respondents [16]. This may be particularly effective in reducing nonresponse bias due to topic saliency [17].
3. Offset burden on respondents: In-person studies require participants to report to a specific location at a specific scheduled time. Participants must arrange for child care if they have children (because children cannot be brought to the test session), and they must arrange for transportation and/or parking, even for test sessions of short duration. Therefore, providing a market-rate incentive should help to offset participant burden.

[15] Guyll M, Spoth R, Redmond C. The effects of incentives and research requirements on participation rates for a community-based preventive intervention research study. J Primary Prevent. 2003;24(1):25-41.
[16] Castiglioni L, Pforr K. The effect of incentives in reducing non-response bias in a multi-actor survey. Presented at the 2nd annual European Survey Research Association Conference, Prague, Czech Republic, June 2007; Singer E. The use of incentives to reduce nonresponse in household surveys. In Groves RM, Dillman DA, Eltinge JL, Little RJ (Eds.), Survey Nonresponse. 2002:163-178. University of Michigan Institute for Social Research. Retrieved from http://www.isr.umich.edu/src/smp/Electronic; Singer E. Nonresponse bias in household surveys. Public Opinion Quarterly. 2006;70(5):637-645.
[17] Groves R, Couper M, Presser S, Singer E, Tourangeau R, Acosta G, Nelson L. Experiments in producing nonresponse bias. Public Opinion Quarterly. 2006;70(5):720-736.
10. Assurance of Confidentiality Provided to Respondents
All participants will be provided with an assurance of privacy to the extent allowable by
law. See Appendix A for the consent form.
No personally identifiable information will be sent to FDA. All information that can
identify individual respondents will be maintained by the subcontractor in a form that is separate
from the data provided to FDA. The information will be kept in a secured fashion that will not
permit unauthorized access. Confidentiality of the information submitted is protected from
disclosure under the Freedom of Information Act (FOIA) under sections 552(a) and (b) (5 U.S.C.
552(a) and (b)), and by part 20 of the agency’s regulations (21 CFR part 20). These methods
will all be approved by FDA’s Institutional Review Board (Research Involving Human Subjects
Committee, RIHSC) prior to collecting any information.
All participants will be assured that the information will be used only for research
purposes and will be kept private to the extent allowable by law. The experimental instructions
will include information explaining this to respondents. The pretest and main study instructions
and consent forms will include information explaining to respondents that their information will
be kept confidential. Participants will be assured that their answers to screener and survey
questions will not be shared with anyone outside the research team and that their names will not
be reported with responses provided. Participants will be told that the information obtained from
all of the surveys will be combined into a summary report so that details of individual
questionnaires cannot be linked to a specific participant. All electronic data will be maintained in
a manner consistent with the Department of Health and Human Services’ ADP Systems Security
Policy as described in the DHHS ADP Systems Manual, Part 6, chapters 6-30 and 6-35. All data
will also be maintained in a manner consistent with the FDA Privacy Act System of Records #09-10-0009 (Special Studies and Surveys on FDA Regulated Products). Upon final delivery of data
files to RTI and completion of the project, Schlesinger and L&E Research will destroy all study
records, including data files, upon request.
11. Justification for Sensitive Questions
This data collection will not include sensitive questions. The complete list of questions is
available in Appendix B.
12. Estimates of Annualized Burden Hours and Costs
For both the pretests and main study, the questionnaire is expected to last no more than
30 minutes. This will be a one-time (rather than annual) collection of information. FDA
estimates the burden of this collection of information as follows:
Table 1. – Estimated Annual Reporting Burden

Activity | No. of Respondents | No. of Responses per Respondent | Total Annual Responses | Average Burden per Response (in hours) | Total Hours
Pretesting: number to complete the screener (assumes 50% eligible) | 338 | 1 | 338 | 0.08 (5 minutes) | 27
Pretesting: number of completes | 240 | 1 | 240 | 0.42 (25 minutes) | 101
Main study: number to complete the screener (assumes 50% eligibility) | 1,785 | 1 | 1,785 | 0.08 (5 minutes) | 143
Main study: number of completes | 1,272 | 1 | 1,272 | 0.42 (25 minutes) | 534
Total hours | | | | | 805
These estimates are based on FDA’s and the contractor’s experience with previous consumer
studies.
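The total-hours figures in the table follow directly from the rounded per-response burden estimates. The short calculation below simply reproduces that arithmetic.

```python
# Arithmetic behind the burden table: total hours = total annual responses x
# average burden per response (using the rounded hour figures from the table),
# rounded to the nearest whole hour.
rows = [
    ("Pretest screener", 338, 0.08),       # 0.08 hours (5 minutes)
    ("Pretest completes", 240, 0.42),      # 0.42 hours (25 minutes)
    ("Main study screener", 1785, 0.08),
    ("Main study completes", 1272, 0.42),
]
total = 0
for label, responses, hours_each in rows:
    hours = round(responses * hours_each)
    total += hours
    print(f"{label}: {hours} hours")
print(f"Total: {total} hours")  # 805 hours, matching the table
```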
13. Estimates of Other Total Annual Costs to Respondents and/or Recordkeepers/Capital
Costs
There are no capital, start-up, operating or maintenance costs associated with this
information collection.
14. Annualized Cost to the Federal Government
The total estimated cost to the Federal Government for the collection of data is $883,425
($294,475 per year for three years). This includes the costs paid to the contractors to manipulate
the stimuli, program the study, draw the sample, collect the data, and create and analyze a
database of the results. The contract was awarded as a result of competition. Specific cost
information other than the award amount is proprietary to the contractor and is not public
information. The cost also includes FDA staff time to design and manage the study, to analyze
the resultant data, and to draft a report ($72,000; 8 hours per week for three years).
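The figures above are related by simple arithmetic. The sketch below reproduces the annualization and, as an inference not stated in the document, the hourly rate implied by the FDA staff estimate.

```python
# Annualizing the total cost and checking the FDA staff-time figure. The
# implied hourly rate is an inference from the stated numbers, not a figure
# given in the supporting statement.
total_cost = 883_425
years = 3
print(total_cost / years)  # 294,475 per year, as stated

staff_cost = 72_000
staff_hours = 8 * 52 * years  # 8 hours/week over three years = 1,248 hours
print(round(staff_cost / staff_hours, 2))  # implied rate of roughly $58/hour
```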
15. Explanation for Program Changes or Adjustments
This is a new data collection.
16. Plans for Tabulation and Publication and Project Time Schedule
Conventional statistical techniques for experimental data, such as descriptive statistics, analysis
of variance, and regression models, will be used to analyze the data. See Part B for detailed
information on the design, hypotheses, and analysis plan. The Agency anticipates disseminating
the results of the study after the final analyses of the data are completed, reviewed, and cleared.
The exact timing and nature of any such dissemination have not been determined, but may include
presentations at trade and academic conferences, publications, articles, and Internet posting.
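For illustration, a minimal sketch of the kind of factorial analysis described above is shown below using simulated placeholder data. The variable names and data are hypothetical; the actual design, measures, and analysis plan are documented in Part B.

```python
# A minimal sketch of a three-way factorial ANOVA of the type described above,
# using statsmodels with simulated placeholder data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 1272  # planned number of main-study completes (Section 12)
df = pd.DataFrame({
    "text_size": rng.choice(["small", "medium", "large"], size=n),
    "contrast": rng.choice(["high", "low"], size=n),
    "device": rng.choice(["TV", "tablet"], size=n),
})
df["recall_score"] = rng.normal(size=n)  # placeholder outcome variable

# Three-way ANOVA with all interactions among the independent variables.
model = smf.ols("recall_score ~ C(text_size) * C(contrast) * C(device)",
                data=df).fit()
print(anova_lm(model, typ=2))
```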
Table 2. – Project Time Schedule

Task | Estimated Number of Weeks after OMB Approval
Pretest completed | 16 weeks
Main study data collected | 40 weeks
Final methods report completed | 40 weeks
Final results report completed | 62 weeks
Manuscript submitted for internal review | 66 weeks
Manuscript submitted for peer-review journal publication | 70 weeks
17. Reason(s) Display of OMB Expiration Date is Inappropriate
No exemption is requested.
18. Exceptions to Certification for Paperwork Reduction Act Submissions
There are no exceptions to the certification.