Rapid Message Testing & Message Development System
Contact:
Rudith Vice
National Center for Emerging and Zoonotic Infectious Diseases
Centers for Disease Control and Prevention
1600 Clifton Road, NE
Atlanta, Georgia 30333
Phone: (470) 553-3567
Email: [email protected]
1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
3. Methods to Maximize Response Rates and Deal with Nonresponse
4. Tests of Procedures or Methods to be Undertaken
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
In general, the Rapid Message Testing & Message Development System will not employ statistical methods to sample respondents, except in the case of applying statistical weights to survey data. This introductory section justifies the decisions for the data collection methods that the System will employ.
The methods are:
Individual in-depth (cognitive) interviews of 72 individuals using probing techniques to develop a deeper understanding of how respondents interpret a message.
Focus groups of eight or fewer respondents. The anticipated data collections will be small in scale because groups are intended to inform an iterative process of developing health communication campaign messages, not to be generalized to a specified respondent universe.
Online surveys of 1,000 individuals.
These audience-specific techniques rely not on statistical power but on the theoretical premise that language is interpreted through shared cultural knowledge and frameworks.1, 2 To increase the likelihood that a message will be noticed, to avoid miscommunication, and to guard against insensitivity in specialized communication to sub-cultural groups, the proposed data gathering techniques provide “…a ‘window’ on a particular worldview” (Priest, p. 114).
By incorporating qualitative and quantitative elements in various mixtures, these techniques allow the flexibility for in-depth, individualized probing, can be feasible in time-sensitive situations, and have worked well in message development for commercial advertising.3
Individual In-depth Interviews (Cognitive Interviews)
Cognitive interviewing is a technique used to probe a respondent to identify a deeper understanding of a particular response or answer to a question. Questions are usually open-ended and give the respondent an opportunity to fully express their thoughts without being influenced by the constraints of a closed-ended question. Cognitive interviews can be used successfully in a variety of settings with diverse populations, including experts on certain topics who may provide nuances and insights that would not be captured by closed-ended survey questions. Additionally, the interaction inherent in cognitive interviewing can improve the interviewer's understanding of the respondent's cognitive processes and the decision-making behind a given response.
Online Focus Groups
Eight or fewer participants will be recruited for each group. Focus groups will be conducted over the internet with a professional moderator who will guide a real-time discussion but will also allow the direction of the discussion to take its own course. For online focus groups, participants will share aloud, and all other participants will be able to hear the conversation. Participants may be physically located in different parts of the country even within one group.
Online Surveys
Potential respondents will be primarily drawn from national panels of thousands of individuals who have opted in to participate in surveys on various topics. Panels are maintained by vendors working on behalf of NCEZID and its contractors. Panel members have provided a basic profile of information about themselves, their work, and their households as part of their agreement to join the panel. Surveys will be delivered to respondents based on their initial likelihood to qualify for the parameters of the questionnaire. Many will qualify, but others will screen out based on answers to initial questions, either because they do not meet the specific criteria or because quotas for certain characteristics have already been reached.
Statistical methods will be used to apply weights to data in cases where the survey collects responses from a sample that is meant to resemble the general public. Prior to collecting data, such surveys will have quotas put in place to ensure that the sample is generally representative of an accurate mix of demographic variables, such as age, sex, ethnicity and race, region, income, and educational attainment. After data are collected, statistical weighting may be employed to bring the sample into alignment with known population parameters, based on Census targets. Using such quotas and weighting is standard in the communications research industry when testing messages.
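As an illustration of the post-stratification weighting described above (the demographic categories and Census-style targets below are hypothetical, not the actual weighting plan), each cell's weight can be sketched as the target population share divided by the observed sample share:

```python
def poststratify(sample_counts, population_shares):
    """Compute a post-stratification weight for each demographic cell.

    sample_counts: dict mapping cell -> number of respondents in that cell
    population_shares: dict mapping cell -> target population share
    Returns dict mapping cell -> weight = target share / observed share.
    """
    n = sum(sample_counts.values())
    return {cell: population_shares[cell] / (sample_counts[cell] / n)
            for cell in sample_counts}

# Hypothetical example: a sample of 1,000 that slightly over-represents women
counts = {"men": 450, "women": 550}
targets = {"men": 0.49, "women": 0.51}  # assumed targets for illustration
weights = poststratify(counts, targets)
# Men are weighted up (0.49 / 0.45), women down (0.51 / 0.55); the
# weighted cell totals then sum back to the original sample size of 1,000.
```

In practice, weighting on several variables at once is typically done with iterative raking rather than a single cross-tabulated cell, but the proportional logic is the same.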
The surveys in this System will not employ other statistical sampling techniques, such as random digit dial phone surveying, that are necessary to be able to generalize results to the larger population. The surveys in this System are not meant to be generalized, as panel members may inherently differ from non-members in the population due to their self-selection into web-based panel pools. For the purposes of this System, panel sampling is the fastest, most efficient, and most cost-effective way of performing rapid message testing and related message development activities.
For all methods described, broad dimensions of interest are:
Comprehension – Is the message or topic clearly understood? Is the audience able to identify and recount the intended main messages? Is the intended information presented in a manner that makes it effective and actionable for the intended audience?
Appeal – How much does the audience like the presentation of the message? What elements do they especially like? Which do they dislike?
Relevance – Do respondents perceive the messages as relevant to themselves personally, as well as to their peers? In other words, does the message resonate with their personal perceptions and experience?
Believability – Is the message and/or its source perceived as credible? Does it portray the message realistically and convincingly?
Acceptability – Is there anything in the message that is perceived as offensive or unacceptable to either the primary or secondary audiences?
Behavioral Intent – Do respondents think they will act as a result of seeing/hearing the message?
Participation will be encouraged by:
Stressing the value of data collection as a means of helping to understand the characteristics of the health problem or prevention message.
Keeping the interviews and questionnaires short.
Providing the opportunity for respondents to contact the contractor to receive assurance regarding the authenticity of the survey.
Any adult, non-incarcerated, non-institutionalized member of a target audience for any of NCEZID's numerous messages could be a respondent in a study falling within the purview of the System. In practice, the majority of respondents will be drawn from research panels of individuals who have opted to participate in studies of various kinds. Any member of the public can join such a panel, provided they share a basic profile of information with the panel vendor. In some cases, a particular audience may not be feasible for recruitment from panels, due primarily to their scarcity (e.g., Physician Assistants whose patients are primarily members of Hispanic populations). In these cases, vendors may recruit for interviews or focus groups from industry lists, direct outreach, or referrals—all common and often necessary techniques for certain niche audiences.
Surveys under this System will include typical samples of 1,000 general public respondents, with quotas and weights to ensure the sample resembles the population at large. Such a sample size is intended to allow for subgroup analysis (for example, simple comparisons of men to women, Hispanic to non-Hispanic adults, or senior men to senior women). This subgroup analysis is often very helpful in determining the impact of messages on different segments of the population. Samples smaller than 1,000 would risk reducing certain subgroups to sizes that make analysis difficult or unwise; at the same time, larger samples are not necessary because the incremental precision they would add is typically not required to ascertain the impact of a message or communications tool being tested.
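To give a rough sense of the precision these sample sizes imply, the maximum margin of error at 95% confidence can be computed as below. This is purely illustrative: it assumes simple random sampling, which opt-in panel samples only approximate, and the subgroup size of 250 is a hypothetical example.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Maximum margin of error for a proportion at 95% confidence,
    assuming simple random sampling (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

full_sample = margin_of_error(1000)  # about 0.031, i.e., +/- 3.1 points
subgroup = margin_of_error(250)      # a subgroup of 250: about +/- 6.2 points
```

The doubling of the margin of error for a quarter-size subgroup is the practical reason samples much smaller than 1,000 make subgroup comparisons unwise.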
Data will primarily be collected by NCEZID communications research contractors, including Weber Shandwick and KRC Research, the parties that have consulted on the development of the instruments under review in this System package.
Data Collection
Qualitative (Interviews and Online Focus Groups)
Contractors will lead the recruitment activities and the data collection activities. Procedures for data collection will follow communications research industry standards.
For each individual data collection activity under this System, contractors will first select the necessary questions from the approved screening and recruitment question bank (Attachment 3). The selected questions will form the screening questionnaire to be asked of potential interview or focus group respondents. The screening questionnaire will also include quotas for certain criteria and the number of participants. The selection of both questions and quotas will be done in partnership with NCEZID. Once approved, the contractor will work with a sub-contracted panel provider to conduct the screening among potential qualified candidates in their panel. The panel vendor will screen until all participant slots are filled with qualified candidates. The contractor will review candidate criteria to confirm accuracy. Once approved, the panel provider will require participants to complete a consent form (Attachment 5). Then, the panel provider will work with the contractor to schedule participants for interviews or focus groups. Participants will be given information about the sponsor of the study, their rights, and guidelines on what to expect. They will also be told the date and time of the activity in advance.
Around the same time as the screening questionnaire is being developed, contractors will draw questions from the message and content testing question banks (Attachments 4A, 4B, or 4C), again in consultation with NCEZID. The selected questions will make up the interview guide or moderator guide. After participants are scheduled and instruments approved, contractors will conduct the focus groups or interviews. Interviewers and moderators are trained qualitative researchers with extensive professional market research industry backgrounds; they will inform participants about their status as independent interviewers.
Quantitative (Online Surveys)
Contractors will lead data collection activities. In surveys, the screening criteria are built into the questionnaire itself, along with quotas for subgroups. Questions will be drawn from the same question banks referenced previously (Attachments 3, 4A, 4B, or 4C). Once approved and programmed, the survey will be distributed by sub-contracted panel providers to members of their panels (described in the introduction to this Supporting Statement) based on the likelihood of their qualification for the study. To ensure efficient recruitment and participation of required subgroups from within the panel sample pool, proportionate stratification will be employed so that the survey reaches each of the subgroups required by the survey's quota plan (e.g., quotas for specific shares of ethnicity or race categories or income bands). All potential respondents will first complete the screening questions, and, if qualified, will then proceed to the survey's content questions. The communications research contractor will oversee all phases of the data collection, from initial quality review of partial data to monitoring of quotas and tabulation of results.
Statistical Methods
Many of the studies under the System will lead to descriptive topline reports that will not involve inferential statistics. However, in the case of surveys, basic descriptive statistical analyses will be conducted with software for manipulation and tabulation of survey data. For example, mean, standard deviation, and standard error of the mean might be calculated for numerically coded response categories such as Strongly Agree, Agree Somewhat, Disagree Somewhat, and Strongly Disagree.
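For instance, if the four agreement categories are coded 4 (Strongly Agree) through 1 (Strongly Disagree), the descriptive statistics named above reduce to standard formulas; the responses below are invented for illustration:

```python
import math
import statistics

# Hypothetical responses coded 4 = Strongly Agree ... 1 = Strongly Disagree
responses = [4, 3, 3, 4, 2, 3, 4, 1, 3, 3]

mean = statistics.mean(responses)         # average agreement score
sd = statistics.stdev(responses)          # sample standard deviation
sem = sd / math.sqrt(len(responses))      # standard error of the mean
```

The standard error of the mean shrinks with the square root of the number of respondents, which is why it is reported alongside the mean when comparing messages.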
In the case of data collection activities that involve interviews, focus groups, and online surveys, the following procedures have proven effective in previous studies and can be used to increase responses:
Interviewers will participate in thorough training and background sessions. Training topics will include study objectives, question-by-question reviews of data collection instruments, strategies for engaging respondents, role playing, and techniques for fostering respondent cooperation and survey completion.
Experienced, highly trained staff will moderate all focus groups and conduct all interviews.
To maximize participation in focus groups, respondents will be contacted multiple times leading up to the session. Each will be sent a calendar invitation, and each will participate in a brief “tech check” to ensure they are comfortable and equipped to participate with the required web-based video technology. If a participant does not have access to a webcam, one will be provided to them.
To maximize participation in interviews, each participant will be sent a calendar invitation and a reminder of the upcoming meeting. A phone interview may be arranged in case of non-response or difficulty with web-based interview platforms.
During the initial recruitment and screening for focus groups and interviews, all potential participants will be informed of the parameters and requirements of the planned activity (length, technology requirements, etc.). Those who are not able to meet these parameters and requirements will not be scheduled for participation.
All participants in focus groups and interviews may be contacted by the recruiting vendor in the case of “no shows” to attempt to include them. Interviews can be rescheduled, and late-arriving focus group participants can be added after the group has started at the discretion of the moderator.
The nature of panel surveys helps boost response rates: participants have already opted in to participate in studies, and those who decline to participate can typically be replaced quickly and efficiently.
The majority of the procedures to be used by the System are already well-established, and the online panel approach has been used with increasing frequency. Studies using this method have passed scientific peer review,4 and this method of data collection is now used by major commercial research firms.
In the case of surveys, contractors will typically perform a "soft launch" by collecting data from a small percentage of planned respondents (typically 10% or less). Once the soft launch data collection is complete, the survey will be paused while the data are reviewed to ensure that programming logic is working correctly and that no questions or instructions are being misinterpreted. Once data are reviewed, data collection will resume as part of the "full launch."
Though NCEZID may use different contractors at different points under this System, Weber Shandwick is the lead contractor that has led the development of the instruments and approach that make up this package as described here, and Weber Shandwick’s research arm, KRC Research, will lead many of the initial data collection activities. KRC’s role will include day-to-day supervision and planning, sample design, instrument development based on approved questions in question banks (Attachments 3, 4A, 4B, or 4C), survey testing, quality control, focus group and interview moderation, data analysis, and reporting. Sub-contractors to KRC will be responsible for sampling and recruiting, qualitative or quantitative platform hosting, transcriptions and translations (as necessary), and data tabulation.
1 Priest, S.H. (1996) Doing Media Research: An Introduction. Thousand Oaks, California: Sage Press.
2 Glaser, B. & Strauss, A. (1967) The Discovery of Grounded Theory. Chicago: Aldine Press.
3 Sen, A.K. (1982) Bank uses mall-intercept interviews to test ad concepts, Marketing News, Chicago, 15(15), p. 20.
4 Schlenger, W.E., Caddell, J.M., Ebert, L., Jordan, K., Rourke, K.M., Wilson, D., Thalji, L., Dennis, M., Fairbank, J.A., & Kulka, R.A. (2002) Psychological reactions to terrorist attacks: Findings from the National Study of Americans' reactions to September 11, JAMA, 288(5), 581-588.