Supporting Statement A for Request for a Revision Clearance:
COLLABORATING CENTER FOR QUESTIONNAIRE DESIGN AND EVALUATION RESEARCH
OMB No. 0920-0222
Expiration Date: 09/30/2024
Contact Information:
Amanda Titus, B.S.
Behavioral Scientist, Collaborating Center for Questionnaire Design and Evaluation Research
Division of Research and Methodology
National Center for Health Statistics/CDC
3311 Toledo Road, Room 5451
Hyattsville, MD 20782
301-458-4579
December 1, 2022
Table of Contents
A. Justification
A.1. Circumstances Making the Collection of Information Necessary
A.2. Purpose and Use of Information Collection
A.3. Use of Improved Information Technology and Burden Reduction
A.4. Efforts to Identify Duplication and Use of Similar Information
A.5. Impact on Small Businesses or Other Small Entities
A.6. Consequences of Collecting the Information Less Frequently
A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A.9. Explanation of Any Payment or Gift to Respondents
A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents
A.11. Institutional Review Board (IRB) and Justification for Sensitive Questions
A.12. Estimates of Annualized Burden Hours and Costs
A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
A.14. Annualized Cost to the Federal Government
A.15. Explanation for Program Changes or Adjustments
A.16. Plans for Tabulation and Publication and Project Time Schedule
A.17. Reason(s) Display of OMB Expiration Date Is Inappropriate
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions
LIST OF ATTACHMENTS
ATTACHMENT A: Public Health Service Act
ATTACHMENT B: 60-day Federal Register Notice
ATTACHMENT C: Nondisclosure affidavit
Template 1: Federal Employee Version
Template 2: Contractor Version
ATTACHMENT D: Sample screening script
Template 1: Respondent recruited from newspaper
advertisement
Template 2: Respondent recruited from CCQDER Database
ATTACHMENT E: Informed consent templates for CCQDER interviews
Template 1: Adults
Template 2: Adults offsite
Template 3: Focus Groups
Template 4: Virtual
Template 5: Parental/Guardian
ATTACHMENT F: Special Consent for Expanded Use of Video and Audio Recordings
ATTACHMENT G: Special Consent for Expanded Use of Video and Audio Recordings
for Individual Respondents of Discussion Groups
ATTACHMENT H: Sample recruitment flyer
ATTACHMENT I: Sample newspaper advertisement
ATTACHMENT J: Sample CCQDER Voice mail script
ATTACHMENT K: Sample of a 2017 advance letter sent to NHIS respondents
ATTACHMENT L: Sample Respondent data collection sheet
ATTACHMENT M: Detailed explanation of cognitive interviewing procedure read by interviewer to respondent
ATTACHMENT N: Thank You Letter
Goal of the study: It is the goal of the Collaborating Center for Questionnaire Design and Evaluation Research (CCQDER) to evaluate questions for optimal design as well as to provide documentation supporting the validity of NCHS and other agencies’ information collections.
Intended use of the resulting data: CCQDER obtains information about the interpretive processes used by respondents to formulate answers to survey questions. Findings are used 1) to ensure question comparability across respondent groups, for example, those across different gender, race/ethnicity, income, and education groups; 2) to correct any identified problematic questions, for example, those which are vague or ambiguous, cannot be answered readily or accurately by the respondent, or otherwise contribute to the non-sampling errors of the survey; and 3) to provide data usage documentation regarding the phenomena considered by respondents, that is, the specific construct measured by individual questions.
Methods to be used to collect data: Data collection includes a mix of qualitative and quantitative methodologies, including cognitive interviewing, focus groups, usability testing, ethnography, and survey field tests/pilot interviews (in-person/telephone/web).
Subpopulation to be studied: Because a primary research goal involves assessing comparability among potential survey respondents, analysis focuses on relevant respondent groups, for example, those across different gender, race/ethnicity, income, and education groups.
How data will be analyzed: Depending on project needs and data collection methods, a variety of analytic methods may be used and will be explained in the individual information collection requests under this clearance. In general, however, qualitative data will be analyzed using the constant comparative method to identify the presence and comparability of interpretive patterns. Quantitative data will be analyzed using estimation- and correlation-based techniques including hypothesis testing, latent class analysis, and studies of differential item functioning. Regardless of analysis techniques used, findings under this generic clearance serve methodological purposes and are not intended for the production of population estimates.
Supporting Statement A
Collaborating Center for Questionnaire Design and Evaluation Research
A three-year OMB clearance (2022-2025) revision is requested for “NCHS Collaborating Center for Questionnaire Design and Evaluation Research” (currently approved under the title of “NCHS Questionnaire Design Research Laboratory”) (OMB No. 0920-0222, Exp. Date 09/30/2024). This revision seeks an increase in burden hours as well as inclusion of specific ongoing research projects, such as research on incentives for qualitative studies, and the addition of a new quantitative data collection system, the Research and Development Survey (RANDS). Furthermore, the revision updates language pertaining to the various methodologies, including a focus on comparability and the study of intersectionality in survey question design.
A. JUSTIFICATION
1. Circumstances Making the Collection of Information Necessary
In 1984, the Committee on National Statistics conducted a two-part seminar on the Cognitive Aspects of Survey Methodology (CASM) under a grant from the National Science Foundation (NSF). The seminar examined cognitive-related methodological studies that might lead to improvements in questionnaire and interviewing procedures employed in scientific surveys. Following the seminar, the NSF provided funding to NCHS to investigate how relevant knowledge and techniques in cognitive science could be applied to improve health surveys. The project, begun in 1984, was called Laboratory-Based Studies of the Cognitive Aspects of Survey Methodology (CASM), and used cognitive psychological methods to study the survey interviewing process. In its final report, NCHS concluded that it is feasible and efficient for a Federal statistical agency to conduct laboratory research on the cognitive aspects of survey questionnaires.
Subsequently, NCHS applied cognitive research techniques being tested under the grant to develop the 1987 NHIS supplement (OMB No. 0920-0214, Exp. Date 12/31/2023), a comprehensive set of questions on knowledge, attitudes, and practices regarding cancer risk factors. Because this effort proved successful, NCHS created the Collaborating Center for Questionnaire Design and Evaluation Research (CCQDER) to provide such testing for NCHS surveys on a regular basis, as well as to continue more general research on the survey response process, questionnaire design, and pretesting methodology.
In October 2009, NCHS held a Question Evaluation Methods Workshop to examine various question evaluation methods as well as to discuss the impact of question design and evaluation on survey data quality. Broad consensus determined that measurement error in the Federal statistical enterprise requires renewed consideration. Federal statistical agencies have a fundamental obligation to produce valid and reliable data, and more attention to question evaluation and documentation is required. Furthermore, it was established that validation of measures is a particularly complex methodological problem that requires a mixed-method approach. While quantitative methods are essential for understanding the magnitude and prevalence of error, they remain dependent on the interpretive ability of cognitive interviewing. Unlike any other question evaluation method, cognitive interviewing can portray the interpretive processes that ultimately produce survey data. When practiced as a qualitative methodology, cognitive interviewing reveals these processes as well as the actual phenomena being measured. Additionally, the method allows for the examination of interpretive patterns across various respondent groups, such as race, gender, and education, to assess comparability. Consequently, cognitive interviewing methodology—when conducted as a qualitative methodology identifying patterns of interpretation—is an effective method for ensuring the validity of statistical data, and the documented findings from these studies represent tangible evidence of how the question performs.
In 2015, CCQDER began the first of a series of web surveys (referred to as the Research and Development Survey, or RANDS, series) to assess the utility of web panels as a way of augmenting the primarily qualitative program with quantitative methodologies, specifically, with the use of embedded, close-ended probe questions. The quantitative data coupled with the larger sample sizes (in contrast to traditional cognitive interviewing studies) allows for comparing the performance of alternate questions as well as quantitatively assessing their performance across respondent subgroups. If representative (or at least understood in the ways they are not), web panels could become an important platform for investigating question response as a socio-cultural process as well as conceptualizing error quantitatively, significant advancements for the field of survey measurement.
Initial assessments of web-panel surveys for use in question evaluation, particularly the use of embedded probing questions, have proven successful.1 Consequently, there is growing interest among CDC and other federal programs in using such platforms for question design-related research. Given its leadership in this area, CCQDER intends to continue RANDS as an ongoing program for methodological research.
In addition to question evaluation, it is the mission of CCQDER to provide documentation supporting the validity of NCHS survey data. Thus, the final product of each study conducted under the auspices of this clearance will consist of a clearly documented and publicly available report. Such documentation serves not only NCHS and other federal programs, but also supports data users in their application and interpretation of data. Therefore, all completed CCQDER testing reports are made publicly accessible on Q-Bank (http://wwwn.cdc.gov/qbank), an interagency, online searchable database that houses question evaluation studies.
Data collection for this project is authorized under 42 U.S.C. 242k (Section 306 of the Public Health Service Act). A copy of the legislation is provided in Attachment A. CDC is requesting terms of clearance identical to previous submissions. CDC will submit individual collections under this generic three-year clearance to OMB. It is requested that OMB continue to provide feedback on the individual collections within 10 working days of the submission.
Overview of the Data Collection System
CCQDER uses a range of empirically based methodologies to evaluate individual questions, questionnaires, and self-administered forms for various agencies, including NCHS, other CDC programs, federal agencies, and academic and professional institutions. The purpose of applied studies is to identify and correct question design flaws with the goal of improving data quality. Additionally, CCQDER conducts methodological research on the cognitive and interpretive aspects of survey methodology more generally. The purpose of this research is to enhance understanding of the question response process, to develop better standards for questionnaire design, and to improve data collection procedures. Ultimately these studies produce generalizable knowledge that improves the quality of data collection instruments both inside and outside of NCHS.
For both applied and methodological initiatives, CCQDER data systems can be characterized as either qualitative or mixed-method—though methodologies for specific studies are addressed in individual collection requests under this generic clearance.
Qualitative Data Collection
Qualitative methods include cognitive interviewing, focus groups, ethnography, and usability testing.
Cognitive Interviewing. The primary data collection method under this generic clearance is cognitive interviewing. Cognitive interviewing studies offer detailed depictions of meanings and processes used by respondents to answer questions—processes that ultimately produce survey data. The interview structure consists of respondents first answering a draft survey question and then providing textual information to reveal the processes involved in answering the test question. Specifically, cognitive interview respondents are asked to describe how and why they answered the question as they did. Through the interviewing process, patterns of interpretation are identified as well as various types of question-response problems (e.g., interpretive errors and recall accuracy) that would not normally be identified in a traditional survey interview.
Focus Groups. While cognitive interviewing studies are best suited for question evaluation, focus groups (or group interviews of 5-10 individuals) can be used to discuss general concepts or themes upon which to develop question topics or choices of terminology. Under this clearance, focus groups will largely be used to understand the general range of terminology. The method relies on the group thinking that emerges from discussions to uncover vernacular terms and epistemologies that can then be translated into survey questions. It can also be used to test the usability of survey materials (such as introduction letters and follow-up notices) and statistical products (such as growth charts and guidance documents).
Ethnography. Ethnography is the observation of social practices and interactions. In practice this methodology includes a range of unstructured (participant observation, such as survey interviewer follow-alongs and site visits), semi-structured (such as ethnographic interviews wherein the interviewer has a range of topics to cover but no specific order or question wording), and structured (such as collecting freelist and pile sorting data) techniques. For instance, when studying how surveys should list opioids in a questionnaire, CCQDER previously used a series of pile sorts and follow-up ethnographic interviews to develop a folk taxonomy of opioids. These findings allowed CCQDER to suggest survey item changes that better reflected how people perceive and interact with opioid pain killers.
Usability Testing. Usability testing examines how people react to and interact with documents such as self-response survey questionnaires, survey materials, and statistical products. As conducted by CCQDER, usability testing typically involves interviewers observing respondents or participants interacting with a document and then asking probing questions about their actions. For instance, CCQDER has previously examined the usability of NCHS’ National Post-Acute and Long-Term Care Study questionnaire by observing adult day care and residential care center directors as they filled out the survey. Doing so allowed NCHS to develop better survey instructions and guidance for respondents.
Respondents of qualitative studies are not selected through a random process, but rather for specific characteristics, such as race, health status, or some other attribute relevant to the questions being examined or the topic of discussion. Because the goal of these methodologies is to identify interpretive patterns rather than to make estimates or causal statements, a purposive sample, rather than a randomly drawn one, is used. Specific sample descriptions will be addressed in individual collection requests under this generic clearance.
Unless deemed otherwise by the NCHS Ethical Review Board, all interviews will be video and/or audio recorded to ensure quality of analysis as well as to support transparency for each study. Interviews conducted offsite or virtually may also be audio and/or video recorded.
To ensure a data-driven, systematic analysis across all qualitative information collections and to provide public access to study reports, CCQDER developed a series of web applications. The applications allow for traditionally more complex comparative analyses as well as provide an audit trail to back study conclusions. Those applications include: Q-Video, Q-Notes, Q-Notes Plus, and Q-Bank.
Q-Video is a digitized video/audio application that captures, stores, and indexes the video and audio of cognitive interviews in a digitized database for the purpose of searching individual questions and conducting analysis.
Q-Notes is a data entry and analysis application developed to assist in a systematic analysis of cognitive interviewing studies. Q-Notes provides interviewers and analysts real-time access to interview data and allows interviews to be conducted in multiple geographical regions so that comparability can be examined for multi-national and multi-lingual surveys. Q-Notes is currently used by various academics and statistical agencies around the world both for their own projects and to collaborate with other agencies internationally.
Q-Notes Plus is an extension of Q-Notes, where the recorded interview is embedded within the application so that the findings can be traced to the original source.
Q-Bank (a product of an interagency collaboration and now maintained by NCHS) is a database of evaluated questions from Federal surveys that links each question to the final report of the evaluation study. Questions are searchable by survey title and question topic (e.g., income, demographics, chronic health conditions). In addition, users can search for keywords within individual questions. Q-Bank is intended to support question-design methodologists and survey administrators as well as data users in the interpretation and application of data.
Mixed-Method Data Collection: Research and Development Survey (RANDS)
CCQDER’s primary vehicle for mixed method data collection is its Research and Development Survey (RANDS) series, which uses recruited, statistically sampled, commercial survey panels as its sample source. RANDS began in 2015 under a previous round of this generic clearance as an attempt to 1) integrate quantitative and mixed method question evaluation into CCQDER’s workflow and advance the study of measurement error and 2) understand more generally the strengths and limitations of commercial survey panels, and how they could be used to support NCHS’ wider statistical work.
A primary objective for measurement research is to assess how recruited web-based platforms can be used to augment NCHS’ question evaluation program, which historically has primarily used cognitive interviewing methodology and qualitative analyses. Importantly, the purpose of investigating the use of recruited web panels is not to simply add another method to the program’s repertoire, but rather to assess how cognitive interviewing methodology and recruited web panel data collection might be strategically integrated to advance question evaluation methodology as a whole.
This approach is consistent with increasing calls from within the field of question evaluation for mixed-method design. Strictly quantitative methods of question evaluation, which use metrics such as item non-response and missing rates, only signal the potential of response error and cannot explain source or cause. Other more sophisticated quantitative methods (e.g., item response theory, latent class analysis, and multi-trait multi-method analysis) assess measurement quality by examining relationships between variables, but do not explicitly identify the phenomena captured by a single question, relying instead on the theoretical concept of latency. Cognitive interviewing studies, on the other hand, can distinguish the specific phenomena captured by a question, though results are not generalizable to the population. That is, while able to identify patterns of interpretation as well as reasoning as to why those patterns exist, cognitive interviewing studies cannot determine the extent to which those patterns are likely to occur within a survey sample or within the various socio-cultural groups of that sample. RANDS, therefore, attempts to bridge this divide by strategically integrating quantitative survey-based (i.e., close-ended web probing, experimental design, assessments of repeated measures) methods, qualitative survey-based (i.e., open-ended web probing) methods, and qualitative analysis derived from CCQDER’s traditional question evaluation studies. In doing so, it attempts to answer specific questions such as: “How much error or specific patterns of interpretation (as they are identified through cognitive interviewing) actually occur within a survey sample or population?” and “Are there specific groups of respondents who are significantly more likely to produce error?”2
Beyond this measurement error work, RANDS also provides NCHS’ mathematical statisticians the ability to investigate statistical methods for the purpose of integrating recruited web panel data with that from federal health surveys.3 In the end, such integration could be useful in producing intermediate estimates of topics fielded periodically or for producing more precise estimates for small sub-populations or geographic areas. Additionally, it is possible that panels might be used to collect detailed information about a specific health topic (e.g., an entire survey on disability or access to health care) or to produce preliminary estimates for emerging high priority outcomes not included on a national population health survey (e.g., opioid use or measles outbreaks).
To accomplish both objectives of RANDS, NCHS determined that the series would use recruited (as opposed to opt-in), statistically-sampled, commercial web panels. Such panels allow NCHS to conduct each individual survey relatively quickly, while still providing some level of sample quality as well as the sampling-related metadata required to fully research statistical calibration. The first two rounds of RANDS (conducted under a previous round of this clearance in 2015 and 2016) used the Gallup Organization’s Gallup Panel—a recruited panel based primarily on a dual frame (landline and cell phone) RDD survey with some address-based sampling recruitment. Starting with the third round of RANDS (in 2018) and continuing to the present (including three rounds of RANDS during COVID-19, which were conducted under separate emergency clearances, OMB Control #s 0920-1298 and 0920-1323), NCHS has been using NORC and the University of Chicago’s AmeriSpeak Panel, which is recruited using a mail-back survey sent to an address-based sample with in-person non-response follow-up. Prior to this current clearance, nine rounds of RANDS have been conducted or approved:
Table 1: Rounds of RANDS Completed or Approved Prior to the Requested Clearance
Round | Panel | Modes | Total Completes | Response Rate
RANDS 1 | Gallup | Web Only | 2,304 | 23.5%1
RANDS 2 | Gallup | Web Only | 2,408 | 30.1%1
RANDS 3 | AmeriSpeak | Web Only | 2,646 | 18.1%2
RANDS during COVID-19 Round 1 | AmeriSpeak | Web and Phone | 6,800 | 23.0%2
RANDS during COVID-19 Round 2 | AmeriSpeak | Web and Phone | 5,981 | 20.3%2
RANDS 4 | AmeriSpeak | Web and Phone | 3,442 | 14.0%2
RANDS during COVID-19 Round 3 | AmeriSpeak | Web and Phone | 5,458 | 11.8%2
RANDS 5 | AmeriSpeak | Web and Phone | 6,896 | 11.1%
RANDS 6 | AmeriSpeak | Web and Phone | 3,135 | 12.9%
RANDS 7 | AmeriSpeak | Web and Phone | NA3 | NA3
NOTES: 1Gallup only provided completion rates and not weighted cumulative response rates that include the initial recruitment effort. 2NORC reported weighted cumulative response rates, which include the initial recruitment effort, panel retention, and survey completion rates. 3RANDS 7 was approved under the previous round of this generic clearance and is scheduled to be complete by November of 2022.
Items of Information to be Collected
For the qualitative research studies, data collected include:
Contact information, such as names, email addresses and phone numbers, for potential respondents
Basic demographic information and study-specific information to determine study eligibility
Respondent answers to specific study questions under evaluation
Narrative text (in the form of video/audio recordings) explicating respondent rationales for answers. Typically, the text includes various experiences, impressions, or opinions that respondents considered to formulate an answer.
For RANDS data collection, items include:
Respondent answers to survey questions along a wide range of topics, including demographics, chronic conditions, and access to care
Respondent answers to open or close-ended follow-up probe questions relating to interpretations of previous questions
Information in Identifiable Form
Information in identifiable form is collected for linkage of various CCQDER forms (informed consent documentation and respondent demographics) and audio and video recordings. All items have been routinely approved and collected in the past. The identifiable information includes:
Name
Mailing Address
Email Address
Phone Number
Employment Status
Audio Identifier (digital voice recording)
Photographic Identifier (digital video image)
Access to personal information is restricted to CCQDER staff and designated agents with signed Designated Agent Agreements for statistical purposes only.
2. Purpose and Use of Information Collection
NCHS and the Collaborating Center for Questionnaire Design and Evaluation Research (CCQDER) conduct question evaluation studies, for both applied and methodological purposes, with particular focus on question design, measurement, comparability, and error.
The purpose and use of collecting this information fall into four categories:
Development and testing of specific survey items and questionnaires
Research on the cognitive and interpretive aspects of survey methodology, including the study of comparability and the impact of intersectionality in question design
Research on human-computer interfaces and usability
Studies of the optimal design and presentation of statistical graphical and textual material
Development and cognitive testing of specific survey items and questionnaires:
Activities for the development and evaluation of specific survey items remain largely the same, with the exception of new evaluation projects and requirements for conducting virtual interviews due to the COVID-19 pandemic, both discussed more fully below. As the number of testing projects for CDC and other federal agencies has increased over the past three years, it is fully expected that demand will continue to increase. It is appropriate that CCQDER perform these activities, as it is currently the only federal facility performing question evaluation studies for DHHS survey questionnaires. However, because requests may arrive with little advance notice, it is not possible to specify the nature of these projects.
Some examples of previous and potential collections under this clearance include:
National Health Interview Survey (NHIS) (OMB No. 0920-0214, Exp. Date 12/31/2023): The NHIS collects annual data on health status and limitations, use of health care, AIDS testing, family resources, health insurance, access to care, injury, health behaviors and functioning. Personal interviews are conducted in approximately 43,000 households including about 106,000 persons. The CCQDER has routinely conducted studies of various modules under 10-day packages since 1999, including mental health, alternative health, disability, insurance, strengths and difficulties services, cancer screening questions, complementary and alternative medicine, oral health, children’s mental health, voice, swallowing, speech and language, sexual identity, health insurance, cancer control, occupational health, diabetes primary prevention, and cognitive disability.
Projected: In 2022-25, CCQDER anticipates cognitive interviewing studies for the addition of new NHIS modules, including COVID-related topics, gender and sexual identity, telemedicine, and access to care. Additionally, a large cognitive interviewing study is in planning to provide validation and construct documentation for existing NHIS questions. Such documentation serves NCHS data users in their application and interpretation of data.
Pregnancy Risk Assessment Monitoring System (PRAMS) (OMB No. 0920-0654): PRAMS is a surveillance project of the Centers for Disease Control and Prevention (CDC) and state health departments. PRAMS collects state-specific, population-based data on maternal attitudes and experiences prior to, during, and immediately following pregnancy. The CCQDER has conducted cognitive testing on PRAMS questionnaires under 10-day packages in 1999, 2001, 2003, 2007, 2014, 2016 and 2022.
Projected: In 2022-25, CCQDER expects to evaluate new questions, as well as questions that will be proposed as expansions and refinements to those already identified in PRAMS questionnaires during 2018-2022.
National Health and Nutrition Examination Survey (NHANES) (OMB No. 0920-0950, Exp. Date 4/30/2023): NHANES collects annual data about health and diet in the United States. The survey consists of two parts: an in-home interview and a health examination. CCQDER has conducted studies of various modules under 10-day packages since 1999, including a brochure designed to be used by field interviewers to convert survey refusals and various modules for the in-home interview, including sexual identity, physical activity and pain, positive prostate specific antigen (PSA), hypertension and pre-hypertension, audio-CASI sensitive questions, creatine & lifestyle questions, second-hand smoke, health care utilization, smoking, alcohol intake, and second-hand e-cigarettes.
Projected: In 2022-25, CCQDER anticipates cognitive interviewing studies for the addition of new modules, including COVID-related topics.
National Survey of Family Growth (NSFG) (OMB No. 0920-0314, Exp. Date 12/31/2024): The National Survey of Family Growth (NSFG) is a multipurpose survey based on personal interviews with a national sample of men and women 15-49 years of age in the civilian non-institutionalized population of the United States. Its main purpose is to provide reliable national data on marriage, divorce, contraception, infertility, and the health of adults and infants in the United States. CCQDER has conducted cognitive testing of various NSFG modules under 10-day packages since 1999.
Projected: In 2022-25, CCQDER expects to evaluate newly proposed questions as well as revised questions resulting from previous testing.
Division of Health Care Statistics Surveys (DHCS) (various clearances): The NCHS Division of Health Care Statistics includes surveys that are designed to answer key questions of interest to health care policy makers, public health professionals, and researchers. These can include factors that influence use of health care resources, quality of health care, including safety, and disparities in health care services provided to population subgroups in the United States. CCQDER has conducted cognitive testing of DHCS surveys and modules under 10-day packages including the National Ambulatory Medical Care Survey (NAMCS) and National Hospital Ambulatory Medical Care Survey (NHAMCS) Patient Record Evaluation Study; 2011 Physician Workflow Electronic Health Records (EMR) Supplement; 2012 Asthma Management Supplement; the National Survey of Long-Term Care Providers; 2015 NAMCS Feasibility Study, 2015 National Electronic Health Records Survey (NEHRS), and the 2016 NAMCS Culturally and Linguistically Appropriate Services (CLAS) Supplement.
Projected: In 2022-25, CCQDER anticipates cognitive interviewing studies for the addition of new modules, including impact of COVID-19 on health care systems.
Traditionally, CCQDER conducts in-person cognitive interviews in its NCHS facility. However, due to the COVID-19 pandemic, for the past two years interviewing has moved to an on-line, virtual environment. To make the shift to a remote work environment, new protocols were developed for recruitment, interview set-up (e.g., respondent instructions regarding technology requirements, accepting a Zoom invitation, logging into the app), giving consent, and sending and receiving post-interview incentives. Additionally, new protocols for managing PII were developed, specifically the identification of a secured and CDC-approved platform for conducting interviews, the storage and use of respondent contact information and interview recordings, and the transport of recordings to NCHS and their ultimate upload onto Q-Video (CCQDER's video storage database of cognitive interviews, described earlier in this package). Because CCQDER staff were previously using Q-Notes (CCQDER's on-line data entry and analysis tool, also described earlier), the ability to conduct projects collaboratively and systematically was not hindered by the change to virtual interviewing. The primary hindrance, rather, was the inability to view others' recorded interviews; when conducting analyses, the sole source of data to draw upon was others' interview notes. To compensate for this deficit, interviewers were required to write transcripts of their interviews as opposed to summary notes, a tedious and time-consuming task.
CCQDER is currently assessing whether virtual interviewing should be, at least in part, incorporated into its post-pandemic work. For example, it is likely that virtual interviewing would be the optimal method for interviewing physicians or other highly educated professionals who may have less time to travel to NCHS, and who also tend to be less moved by incentives. However, use of virtual interviews most likely eliminates the possibility of interviewing less educated respondents with little or no technological capability. And, while virtual interviewing expanded the pool of potential respondents beyond the local area, it is not clear how respondent requirements (e.g., access to a computer or cell phone, technological savviness, inclination to participate virtually, etc.) might reduce the size and quality of samples.
To inform these decisions, CCQDER is conducting several studies on the impact of virtual interviews, burden, and incentive amounts.
Research on virtual interviewing. To further understanding of how features of virtual interviewing and burden might impact recruitment and successful data collection, CCQDER is conducting, and will continue to conduct, follow-up interviews with both respondents and interviewers. For those respondents who agree to be re-interviewed, CCQDER will conduct short, 5-minute qualitative follow-up debriefings, covering both the motivation for participation, such as incentives, and any difficulties respondents experienced during the interview. Follow-up discussions are not recorded, and no additional incentives are provided to respondents. Additionally, interviewers are asked to provide feedback after conducting their interviews, making note of any technical issues that may have arisen during the interview, providing their overall impression of the virtual format, and offering conclusions about the success of the interview in terms of data quality.
Research on incentive amounts. Because virtual platforms are a new medium for conducting cognitive interviews, it is unclear how burden impacts sample quality. It may be that a lesser amount of $25 is a sufficient participation incentive for a virtual environment, as the burden of getting to the appointment, parking, etc., is eliminated. On the other hand, the technological requirements of navigating Zoom, having a computer/tablet, and adequate Wi-Fi might impose their own burden, requiring incentive amounts similar to those of in-person interviews. For example, in the 2020 RANDS during COVID-19 testing study, four participation requirements were identified that appeared to impact respondent participation and that may (or may not) be tempered by a higher level of remuneration. They include whether the potential respondent:
Has email
Is willing to provide home address for FedEx remuneration
Has required technology (computer or smart phone with video-capacity, good internet connection)
Has the ability to connect to Zoom and practice the connection prior to the interview
To investigate the effectiveness and necessity of remuneration in the context of this new format, CCQDER has begun, and will continue, to examine how various unique features of virtual interviewing and levels of remuneration might impact recruitment and the successful completion of an interview. Depending on sample criteria and feasibility, experiments will be conducted within the individual cognitive interviewing studies; each specific package will indicate whether the incentive experiment will be included, as well as the details of the study. For each cognitive interviewing study, three levels of remuneration will be offered (split roughly equally among the sample): one below the standard amount ($25), one at the standard amount ($50), and one above the standard amount ($60). Because most studies recruit specific types of respondents, some being more difficult to recruit than others, it is necessary to conduct this experiment across numerous projects.
The table below presents the metrics collected to gain a better understanding of the impact of incentive amount within a virtual environment.
Table 2: Metrics Collected to Study Impact of Virtual Interviewing
| Project Name                 |     |     |     |
| Sample Characteristics       |     |     |     |
|                              | $25 | $50 | $60 |
| Total Incoming Calls         |     |     |     |
| Total Incoming Emails        |     |     |     |
| Total Website Visits         |     |     |     |
| Percent that left voicemails |     |     |     |
| Percent who took screener    |     |     |     |
| Interviews Scheduled         |     |     |     |
| Percent No-Show              |     |     |     |
| Reasons for dropping out     |     |     |     |
|   No email                   |     |     |     |
|   Unwilling to give address  |     |     |     |
|   Cannot access Zoom         |     |     |     |
|   Does not want to take time |     |     |     |
|   Other                      |     |     |     |
Research on the Cognitive and Interpretive Aspects of Survey Methodology
The second major purpose of CCQDER’s data collection is to conduct research on the cognitive and interpretive aspects of survey methodology.
Some examples of this methodological research include:
Research to examine the ways in which social structure and intersectionality impact data quality
The CCQDER will explicitly contribute to NCHS’ efforts to increase equity in the data that it collects by examining the ways in which social structure (e.g., gender, race, education, income) and the intersectionality of those constructs at the individual level impact the question response process and data quality.
Central to the study of cognitive and interpretive processes, CCQDER studies set out to understand the ways in which social context impacts the question response process, thereby impacting comparability across relevant respondent groups. Specifically, CCQDER follows the socio-cultural approach as laid out in Miller and Willis (2016) and Miller et al. (2014)4.
This approach recognizes that the four-stage model of question response (i.e., comprehension, retrieval, judgment, and response) developed within the field of psychology is an individual-centric understanding and does not fully account for the fact that these cognitions are informed by experience and meaning derived from social context. Therefore, to fully understand question performance, it is necessary to study how underlying social structures (and the intersectionality of those structures) inform meaning and shape the question response process. As such, the central research question within the socio-cultural approach is one of comparability: Do questions mean the same to all socio-cultural and linguistic groups represented in a survey? Are data elicited from questions capturing the same phenomena across all groups of respondents? Is the quality of responses similar across all respondent groups?
Both qualitative and mixed-method information collections under this generic package will examine the ways in which social structure and intersectionality impact data quality.
Qualitative Information: Cognitive Interviews, Focus Groups, and Usability Tests. Under this generic clearance, qualitative studies (i.e., cognitive interviewing, usability testing and focus group studies) are set within the socio-cultural approach and are specifically designed to address comparability. The design pertains to each stage of the research process, including data collection, analysis and documentation, and is detailed in Miller et al.’s book, Cognitive Interviewing Methodology: An Interpretive Approach for Survey Question Evaluation. For data collection, a purposive sample design, based on the constant comparative method, is implemented, and interviews are defined by narrativity and the collection of ‘respondent experience’ as opposed to structured probe-questions designed to identify ‘question problems.’ Furthermore, the principle of reflexivity is invoked for both interviewing procedures and analysis. Specifically, for analysis, a 5-stage process is utilized in which interpretive patterns are first identified across respondents and ultimately compared across relevant respondent groups, with a particular focus on race and ethnicity, education, income, and gender. Final conclusions are publicly documented and reveal the phenomena (including unintended constructs) captured by each survey question, variations across respondent groups, and explanations for those variations.
Mixed-Method: Research and Development Survey (RANDS). CCQDER’s recent expansion into more regularly using mixed-method question evaluation techniques via its Research and Development Survey (or RANDS) program also offers an opportunity to explore comparability—as defined above—within a quantitative design. RANDS samples are obtained from a recruited, commercial survey panel. Doing so allows NCHS to collect large samples with the statistical power to analyze intersectional groups’ survey outcomes. Analysis of RANDS question evaluation data (such as that from web probes and embedded experiments) produces estimates of interpretive patterns and magnitudes of difference across groups, indicating how differential measurement error may impact final estimates derived from those evaluated items.
Additionally, the possibility of oversampling intersectional groups exists if needed for a specific study. If, for example, a cognitive interviewing study indicates that low-income respondents of a particular racial group understand survey items differently than members of other subgroups, that intersectional group can be oversampled to allow for the statistical power needed to conduct wording or other experiments.
Research on appropriateness of response scales: An important determinant of survey data quality is that questions include appropriate response scales. Response scales must have clear meanings to respondents and must allow for adequate expression of their experience. An emerging body of research suggests that seemingly trivial variations in response scales (e.g., using a scale from 1 to 10 as opposed to a scale from -5 to +5) can significantly affect response distributions. CCQDER has begun preliminary research on the meanings of vague quantifiers (such as often, sometimes, and rarely) and the benefits of certain scales over others (e.g., seven-point scales over feeling thermometers). CCQDER staff will continue this line of inquiry, conducting cognitive interviewing studies of alternative response scales, as well as RANDS split-ballot experimentation.
Research on cognitive and interpretive aspects of nonresponse: Nonresponse creates numerous analytic difficulties for surveys. Minimizing this problem requires a greater understanding of the cognitive processes that lead respondents to decide not to answer surveys or particular survey questions. CCQDER plans to conduct cognitive interviewing studies to explore these decision processes further, specifically examining non-responders' rationales for unwillingness or inability to complete surveys. Additionally, data may be collected through experimental questionnaires administered outside of the laboratory that explore the effect of various design decisions on item nonresponse. Contracts may be used for some components of this data collection and analysis.
Respondent perceptions of confidentiality and survey participation: To encourage participation, NCHS surveys such as the NHIS and NHANES depend on advance letters, promising confidentiality and explaining uses of collected data. However, it is not known how well these statements are generally understood and believed by survey respondents. Therefore, CCQDER staff proposes to conduct cognitive interviewing studies of laboratory respondents to examine statement comprehension. Results will determine the necessary modifications to improve communication of key issues (e.g., informed consent), and to explain the need and purpose for survey data.
In 2022-2025 CCQDER plans to continue research on methods evaluation and general questionnaire design research. Much of the work will be conducted collaboratively with survey researchers from universities and other federal agencies to define and examine several research areas, including, but not limited to: 1) differences between face-to-face, telephone, and virtual/video-over internet cognitive interviewing, 2) development and evaluation of questions for complex constructs, such as gender and sexual identity, 3) reactions of both survey respondents and survey interviewers to the use of Computer Assisted Personal Interviewing (CAPI), Audio Computer-Assisted Self-Interview (ACASI), video-over internet, 4) social, cultural and linguistic factors in the question response process, and 5) recruitment and respondent participation at varying levels of incentive. Procedures for each of these studies will be similar to applied studies as described above. These studies will be conducted either by CCQDER staff, DHHS staff, or NCHS contractors trained in cognitive interviewing methodology. Results of these studies will be applied to questionnaire development activities to improve ongoing item and questionnaire testing activities.
Research on human-computer interfaces and usability
The third major purpose of this data collection is to conduct research on computer-user interface designs for computer-assisted instruments, often referred to as “usability testing.” This research examines how survey questions, instructions, and supplemental information are presented on computer instruments (e.g., CAPI, Computer Assisted Self-Interviewing (CASI) instruments, ACASI, or web-based instruments) and investigates how their presentation affects the ability of users to effectively use and interact with these instruments. Authors of computer-assisted instruments make numerous design decisions: how to position the survey question on a computer screen; how to display interviewer instructions that are not to be read to respondents; the maximum amount of information that can be effectively presented on one screen; how supplemental information such as “help screens” should be accessed; whether to use different colors for different types of information presented on the screen; and so on. Research has shown that these decisions can have a significant effect on the time required to administer survey questions, the accuracy of question-reading, the accuracy of data entry, and the full exploitation of resources available to help the user complete tasks. Usability testing has many similarities to questionnaire-based cognitive research (described in Section 2.1), since it focuses on the ability of individuals to understand and process information to accurately complete survey data collection. It is also somewhat different, in that the typical user can be an interviewer (in the case of CAPI instruments) as well as a respondent (in the case of CASI/ACASI instruments). It also focuses more heavily on matters of formatting and presentation of information than traditional cognitive testing does.
D. Studies of the optimal design and presentation of graphical and textual material
The final major purpose is related to the growth of the internet for collecting data (including web-based surveys), and in disseminating health information. NCHS, the federal government’s principal health statistics agency, is responsible for collecting and disseminating many reports and volumes of data annually. Over the past few years, the techniques developed for determining whether respondents understand survey questions have been applied with great utility to studying whether statistical publications and web releases are optimally clear. One project, for example, involved the development and testing of a brochure designed by staff of the National Health and Nutrition Examination Survey (NHANES) (OMB. No. 0920-0950, Exp. Date 04/30/2023) to convert refusals to acceptance. Another recent project involved testing and evaluation of different Health Surveillance Map formats (choropleth versus isopleth) to determine if they affect ability to extract information from the maps for the Division of Adult and Community Health/the National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP), CDC. It is anticipated that similar projects will occur during 2022-2025.
3. Use of Improved Information Technology and Burden Reduction
Usually, cognitive interviews will be conducted in the mode intended for the survey, i.e., face-to-face, telephone, self-administered, Computer Assisted Personal Interviewing (CAPI), Computer Assisted Telephone Interviewing (CATI), Audio Computer-Assisted Self-Interview (ACASI), or web-based. As previously described, CCQDER is in the process of examining the use of virtual interviewing as a way of reducing burden, while also considering its impact on sample quality.
Additionally, CCQDER continues to maintain and use Q-Bank, a publicly accessible, searchable database of question-evaluation studies, to determine if a survey question has been previously studied. The regular use of Q-Bank reduces unnecessary testing and allows CCQDER to build upon existing knowledge learned from past testing projects. Furthermore, each cognitive interview is digitally recorded and stored on an internal, searchable video database. Like Q-Bank, this technology allows CCQDER staff to build upon past projects and, at the same time, it improves the accountability of test findings.
4. Efforts to Identify Duplication and Use of Similar Information
The CCQDER at NCHS is the only government facility that currently conducts testing and development of NCHS or other CDC questionnaires. Similar facilities at the Bureau of the Census and the Bureau of Labor Statistics bear the responsibility for testing survey questionnaires associated with their own agencies. The demand for CCQDER activities exceeds available resources.
In order to identify duplication across federal agencies, CCQDER hosts a publicly accessible online searchable database, Q-Bank, that contains all CCQDER evaluation reports. CCQDER encourages all agencies to submit their evaluation reports so that it is possible to track the work done across agencies as well as to build on existing knowledge.
5. Impact on Small Businesses and Other Small Entities
CCQDER testing does not impact small businesses or small entities. If such requests are made, these businesses will be approached in the same manner as the individuals we normally recruit; we will ask the organization to identify the appropriate staff members with whom to conduct the cognitive interviews.
6. Consequences of Collecting the Information Less Frequently
Individual projects usually involve one-time data collection activities. There are no legal obstacles to reducing the burden.
7. Special Circumstances Relating to Guidelines of 5 CFR 1320.5
This request fully complies with the regulation 5 CFR 1320.5.
8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agencies
A Federal Register notice for this collection was published on September 30, 2022 (Vol. 87, No. 189 p. 59425). The text of the notice is contained in Attachment B. No comments were received related to this notice.
Consultants outside of CDC:
The following individuals have been consulted within the past year on survey methodology and/or on a specific project:
Fred Conrad
Survey Research Center
University of Michigan
P.O. Box 1248
Ann Arbor, MI 48106-1248
(734) 764-6314
[email protected]
Alisú Schoua-Glusberg
Research Support Services
906 Ridge Avenue
Evanston, IL 60602
(847) 864-5677
Jennifer Hunter
Statistical Research Division
U.S. Bureau of the Census
Washington D.C. 20233
Consultants within CDC:
The following individuals have been consulted within the past year on survey methodology and/or on a specific project:
Stephen Blumberg
Division of Health Interview Statistics
National Center for Health Statistics
3311 Toledo Road
Hyattsville, MD 20782
(301) 458-4327
Carol Pierannunzi
Division of Behavioral Surveillance
Centers for Disease Control and Prevention
1600 Clifton Rd NE
Atlanta, GA 30329
(770) 488-4609
9. Explanation of Any Payment or Gift to Respondents
For most testing projects, cognitive interview respondents receive incentives for several reasons:
Typically, respondents are recruited for specific characteristics that are related to the subject matter of the survey (e.g., questions may be relevant only to people with certain health conditions). The more specific the subject matter, the more difficult it is to recruit eligible respondents. Incentives help to attract a greater number of potential respondents.
Cognitive interviews require an unusual level of mental effort, as respondents are asked to explain their mental processes as they hear the question, discuss its meaning and any ambiguities, and describe why they answered the questions the way they did.
They are usually asked to travel to the laboratory testing site, which involves transportation and parking expenses. (Many respondents incur additional expenses due to leaving their jobs during business hours, making arrangements for childcare, etc.).
For a standard cognitive interviewing project in which one-hour interviews are conducted at NCHS and eligibility requirements are of average complexity, respondents will be given $50.00. Higher incentives may be requested on a case-by-case basis for particularly difficult recruitments. For example, in 2016, CCQDER was unable to find physicians willing to be interviewed for less than $100. On rare occasions, a lower incentive is proposed.
It is important to offer incentives sufficient to attract the full range of needed respondent types for cognitive interviewing projects. Inadequate respondent recruitment limits the effectiveness of the questionnaire evaluation. Requests and justification for incentives will be included in each individual collection submission.
As described earlier, CCQDER plans to conduct research on the impact of incentive amounts on sample quality. (For details, please see “Research on incentive amounts” in 2.A of this package.) It is expected that this research will apply to all CCQDER studies unless it is not appropriate or feasible and will be discussed in individual requests for collection.
In the future, CCQDER plans to use e-gift cards as an alternative to its current method of providing incentives. E-gift cards would better accommodate the current remote environment and provide an additional avenue for incentives that are comparable to industry standards.
10. Protection of the Privacy and Confidentiality of Information Provided by Respondents
The NCHS Privacy Act Coordinator has reviewed this request and has determined that the Privacy Act is applicable. The related System of Records Notice is 09-20-0164 Health and Demographic Surveys Conducted in Probability Samples of the U.S. Population.
A Privacy Impact Assessment was submitted on May 16, 2016. The CCQDER continues to collect, on a confidential basis, data needed in order to conduct CCQDER studies. The process of informing respondents of the procedures used to keep information confidential begins with the telephone screener and will carry through to the interviewer and all communications with potential respondents. Materials will include all elements of informed consent, including the purpose of the data collection, the voluntary nature of the study, audio or video recording of the interview, and the effect upon the respondent for terminating the interview at any time.
Confidentiality provided to respondents is assured by adherence to Section 308(d) of the Public Health Service Act (42 U.S.C. 242m) which states:
"No information, if an establishment or person supplying the information or described in it is identifiable, obtained in the course of activities undertaken or supported under section...306 (NCHS legislation),...may be used for any purpose other than the purpose for which it was supplied unless such establishment or person has consented (as determined under regulations of the Secretary) to its use for such other purpose and (1) in the case of information obtained in the course of health statistical or epidemiological activities under section...306, such information may not be published or released in other form if the particular establishment or person supplying the information or described in it is identifiable unless such establishment or person has consented (as determined under regulations of the Secretary) to its publication or release in other form,..."
In addition, legislation covering confidentiality is provided according to section 513 of the Confidential Information Protection and Statistical Efficiency Act or CIPSEA (PL 107-347) which states:
“Whoever, being an officer, employee, or agent of an agency acquiring information for exclusively statistical purposes, having taken and subscribed the oath of office, or having sworn to observe the limitations imposed by section 512, comes into possession of such information by reason of his or her being an officer, employee, or agent and, knowing that the disclosure of the specific information is prohibited under the provisions of this title, willfully discloses the information in any manner to a person or agency not entitled to receive it, shall be guilty of a class E felony and imprisoned for not more than 5 years, or fined not more than $250,000, or both.”
The CIPSEA legislation authorizes the designation of agents (“designated agents” or “agents”) to perform statistical activities on behalf of an agency. These agents function under the supervision of the agency’s employees and are subject to the same provisions of law with regard to confidentiality as an agency’s employees. A Designated Agent Agreement between the agency and the designated agents (e.g. contractors) must be executed before the agents can acquire information for the agency for exclusively statistical purposes under a pledge of confidentiality. This is in accordance with section 308(d) of the Public Health Service Act (42 U.S.C. 242m(d)) and the Confidential Information Protection and Statistical Efficiency Act of 2018 (CIPSEA Pub. L. No. 115-435, 132 Stat. 5529 § 302). In accordance with CIPSEA, every NCHS employee, contractor, and agent has taken an oath and is subject to a jail term of up to five years, a fine of up to $250,000, or both if he or she willfully discloses ANY identifiable information about you. In addition to the above cited laws, NCHS complies with the Federal Cybersecurity Enhancement Act of 2015 (6 U.S.C. §§ 151 and 151 note) which protects Federal information systems from cybersecurity risks by screening their networks.
A Designated Agent Agreement between NCHS and any CCQDER contractor will be executed if any contractors are hired to acquire information for the NCHS for exclusively statistical purposes under a pledge of confidentiality (i.e. complete any of the five types of activities described in this generic clearance request). Additionally, the agents (contractors) will be required to complete NCHS Confidentiality Training (https://www.cdc.gov/nchs/training/confidentiality/training/), submit a certificate of completion, and sign a pledge to maintain confidentiality (Nondisclosure Affidavit; see Attachment C) prior to completing CCQDER work. If the CCQDER contractor hires subcontractors to complete CCQDER work, the subcontractors must adhere to the same confidentiality and security requirements as CCQDER staff and contractors.
Data in identifiable form are collected to link various CCQDER forms (informed consent documentation and respondent demographics) with audio and video recordings. The CCQDER also uses some identifiable data (name, phone number, email address) to contact previous respondents for CCQDER studies. The ability to match respondents to other data (informed consent documents, respondent demographics, and audio/video recordings) greatly expands the usefulness of the data at a very low cost.
As outlined in the informed consent form, access to personal information is restricted to CCQDER staff who can only access the personal information for statistical purposes. Additionally, designated agents such as CCQDER contractors or subcontractors may access the personal information for statistical purposes only after signing a Designated Agent Agreement with NCHS. CCQDER staff, designated agents, and staff from collaborating agencies must complete annual NCHS confidentiality training (https://www.cdc.gov/nchs/training/confidentiality/training/), submit a certificate of completion, and sign the NCHS affidavit of nondisclosure (see Attachment C) prior to being granted access to any personal information.
The collection of information in identifiable form requires strong measures to ensure that private information is not disclosed in a breach of confidentiality. Storage of confidential data is protected through procedures such as an internal QDRL LAN, passwords and restricted access.
Confidentiality of responses and safeguarding of data at NCHS
The CCQDER has a routine set of administrative, technical, and physical measures to safeguard confidentiality, including the following:
Storage of confidential data (informed consent forms, respondent database, video and audio recordings) on the QDRL LAN is protected through procedures such as passwords and carefully restricted access;
The QDRL LAN is not located on the NCHS LAN; it is inaccessible to anyone other than CCQDER staff, inside or outside NCHS;
All CCQDER personnel (including CCQDER contractors/designated agents) who have access to confidential data (informed consent form, respondent database, video and audio recordings) complete NCHS Confidentiality Training (https://www.cdc.gov/nchs/training/confidentiality/training/), submit a certificate of completion, and sign a pledge to maintain confidentiality (Nondisclosure Affidavit; see Attachment C), and are given instruction by the CCQDER Laboratory Manager on the requirement to protect confidentiality. Contracted personnel send hardcopies of the NCHS Confidentiality Training certificates and original signed hardcopies of the Nondisclosure Affidavits to Lauren Creamer, CCQDER Behavioral Scientist/Contracting Officer Representative (COR);
Only such authorized CCQDER personnel are allowed access to confidential data (informed consent form, respondent database, video and audio recordings) and only when their work requires it. CCQDER Personnel holding proper passwords may access the QDRL LAN through their CCQDER Computer Desktop which is hardwired to the QDRL LAN;
Data (informed consent forms, audio recordings) from cognitive interviews and focus groups conducted off-site are stored in a secured travel case to prevent loss in transit. Upon return to NCHS, the data are stored in secure conditions (the CCQDER Control Room or a locked drawer in a locked staff office) until the recordings can be manually ingested and the consent documents scanned into the secure QDRL LAN.
Restricted-access signage (see below) is placed on the external doors of the CCQDER with point-of-contact information and phone numbers for whom to contact during and after business hours.
QDRL Lab Access Protocol
This Lab is a restricted-access secure facility. Access is by CCQDER staff or by CCQDER staff escort only.
Should access to the Lab be needed by anyone other than CCQDER staff members for any reason, including an emergency, please adhere to the following:
During normal working hours, contact one of the following CCQDER staff, who will provide escorted access to the Lab:
Sean Murphy x4391 Mobile: 202-503-0321
Kristen Miller x4625 Mobile: 301-605-5350
Amanda Titus x4579 Mobile: 240-543-9171
During off hours, weekends, and holidays, contact one of the following CCQDER staff:
Sean Murphy Mobile 202-503-0321
Lee Burch Mobile 301-233-0311
Kristen Miller Mobile 301-605-5350
All respondents receive a copy of Attachment E, Informed Consent Form, which describes the procedures by which confidentiality of data identifying individuals is maintained.
Q-Video Access Protocol
Only NCHS onsite CCQDER staff and contractors with PIV cards and proper passwords have access to the digitized video/audio application that captures, stores, and indexes the video and audio of a cognitive interview at the questionnaire level. Q-Video is housed on the QDRL LAN, which is located in the secured QDRL Control Room within CCQDER’s secured space (authorized keycard access only). Q-Video is isolated on the QDRL LAN and has no outside connectivity, including to the NCHS LAN, CDC, or the internet. It is inaccessible to anyone who is not a CCQDER staff member or CCQDER contractor.
NOTE: While CCQDER staff and contractors have access to Q-Video, they rarely access it directly. The most common method of viewing a recording is to log on to Q-Notes Plus, which contains embedded video of the interview for which the CCQDER researcher is entering notes. The video is searchable to the specific question desired, which helps ensure that the notes are accurate and correctly represent the interview, and aids analysis and report writing.
Q-Notes Access Protocol
Q-Notes is the external version of the notes software. It is accessible by CCQDER staff, contractors, and subcontractors (i.e., Designated Agents) for remote work on CCQDER projects, or by anyone wishing to use Q-Notes for their own projects. Access for CCQDER staff and contractors is based upon a Principal Investigator and supporting research staff being assigned to a project by the CCQDER Director. The CCQDER Principal Investigator (or CCQDER contractor for the Principal Investigator) sets themselves up as the “Project Manager” in Q-Notes for the specific project to which they were assigned. They can then add staff to the project and assign each staff member appropriate access to that project. Once assigned to the project, CCQDER staff, contractors, and subcontractors are required to enter passwords to log on to the project. In addition, outside organizations or individuals can register online to get their own workspace, where they can create a project that is accessible only to them. Once they set themselves up as a “Project Manager,” they can assign staff to their project. The set-up process, staff assignment, password access, and limitation of access to the assigned project are the same as for CCQDER. No one has access to a project to which they have not been assigned; users see only their own projects in Q-Notes and are not aware that other projects exist.
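The project-scoped access model described above can be illustrated with a brief sketch. This is a hypothetical illustration only: the class and method names are invented for this example and do not reflect Q-Notes’ actual implementation.

```python
# Hypothetical sketch of project-scoped access as described above.
# Class and method names are illustrative inventions, not Q-Notes internals.
class QNotesWorkspace:
    def __init__(self):
        self._projects = {}  # project name -> set of assigned users

    def create_project(self, name, project_manager):
        # The creator becomes the "Project Manager" and is auto-assigned.
        self._projects[name] = {project_manager}

    def assign(self, name, manager, user):
        # Only a user already assigned to the project can add staff to it.
        if manager in self._projects.get(name, set()):
            self._projects[name].add(user)

    def visible_projects(self, user):
        # Users see only projects to which they are assigned;
        # other projects are not even listed.
        return sorted(p for p, members in self._projects.items()
                      if user in members)

ws = QNotesWorkspace()
ws.create_project("NCHS cognitive test", "pi")
ws.assign("NCHS cognitive test", "pi", "analyst")
ws.assign("NCHS cognitive test", "stranger", "intruder")  # silently refused
```

As in the description above, a user who has not been assigned to a project can neither grant access to it nor see that it exists.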
Q-Notes Plus Access Protocol
Only NCHS onsite CCQDER staff and contractors (i.e., Designated Agents) with PIV cards and proper passwords have access to Q-Notes Plus (the internal interviewer notes application plus video/audio recordings). The process for setting up a project and granting staff access rights to a project is the same as it is for Q-Notes. Q-Notes Plus is housed on the QDRL LAN located in the secured QDRL Control Room within CCQDER’s secured space (authorized keycard access only). Because the QDRL LAN is not located on the NCHS LAN, it is inaccessible to others inside or outside NCHS. Access is further restricted at the cognitive interviewing project level: only CCQDER staff and contractors assigned to a specific project have access to that project’s interviewer notes and video/audio recordings.
Q-Bank Access Protocol
Q-Bank is accessible to the public; there is no restricted access. Q-Bank is a database of evaluated questions from Federal surveys and the scientific reports that evaluated those questions.
Records Retention Schedule for Cognitive Interviews
The cognitive interview’s retention status pertains to 1) the permitted retention time for its recording (e.g., two or five years), 2) the required media format for its storage (i.e., audio and/or video), and 3) the persons permitted access to the recording (e.g., special consent). An interview’s retention status is determined by several project-level and interview-level factors; thus, each interview has its own individual retention status. Only interviews in which the respondent has consented to the interview being used for future research will be retained after the completion of the project. Future research consists only of work directly related to the survey questions discussed in the interview, but not necessarily tied to the current specific project. In most cases, retention is used to verify the accuracy of findings stated in final reports and allows for re-investigation of such findings. For interviews being maintained for future use, the data retention period for storing the interview recording begins after the conclusion of each project.
The factors determining retention status include the type of consent agreed upon by the respondent (e.g., special consent) and whether it is a restricted or unrestricted interview. Interviews become restricted depending on the type of respondent (e.g., < 18 years old) as well as the interview topic (e.g., behaviors clearly defined as illegal by law and punishable if disclosed). Restricted interviews require enhanced protections for data storage and retention. Interviews that are given the restricted designation are stripped of video (upon submission of the final report) and maintained only in audio format. The audience for restricted interviews is limited, and the interviews are reviewed every two years to determine whether the interview continues to have qualitative value for use in federal question evaluation research projects or activities.
Unrestricted interviews contain questions about behaviors that are not illegal but, in some projects, could be deemed embarrassing or disconcerting. Unrestricted interviews can be maintained in audio and/or video format depending on the respondent’s consent. The audience for unrestricted interviews is broader than for restricted interviews and may include sharing the interviews in a classroom setting if the respondent’s consent was provided. Unrestricted interviews are reviewed every five years to determine whether the interview continues to have qualitative value for use in federal question evaluation research projects or activities.
If the restricted or unrestricted interview continues to be of value (defined as ongoing use by research staff, topic relevance, or likely use for federal question evaluation research), reassessment of the recording will occur again in either 2 years (for restricted interviews) or 5 years (for unrestricted interviews).
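The retention rules above amount to a small decision procedure. The following sketch restates them in code purely for illustration; the data structure and names are hypothetical and are not part of any CCQDER system.

```python
# Illustrative restatement of the retention schedule described above.
# The Interview type and its field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Interview:
    future_research_consent: bool  # respondent consented to future use
    restricted: bool               # e.g., minor respondent or illegal-behavior topic

def retention_terms(iv: Interview) -> dict:
    # No future-research consent: not retained past project completion.
    if not iv.future_research_consent:
        return {"retain": False}
    if iv.restricted:
        # Restricted: video stripped, audio only, reviewed every 2 years.
        return {"retain": True, "media": "audio only", "review_years": 2}
    # Unrestricted: audio and/or video per consent, reviewed every 5 years.
    return {"retain": True, "media": "audio and/or video per consent",
            "review_years": 5}
```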
Informed Consent documents
Informed consent documents are stored by project in a separate drawer in a locked filing cabinet in the locked office of the CCQDER Recruiter or CCQDER staff member working on the project until the informed consent documents can be electronically scanned and moved to the secure CCQDER Local Area Network (LAN).
CCQDER Respondent Database: A custom-designed CCQDER Respondent Database contains personally identifiable information and demographic information on respondents who have participated in past CCQDER studies, such as name, phone number, email address, age, marital status, ethnicity, race, education, employment status, and household income. The CCQDER Respondent Database is used to produce periodic tabulations and reports on database characteristics and response rates. It is also used to conduct computerized searches to locate records of past respondents having salient characteristics for use in future studies. The CCQDER Respondent Database is housed on the secure QDRL LAN. Only CCQDER staff and designated agents, such as CCQDER contracted staff, holding proper passwords have access to the CCQDER Respondent Database. The QDRL LAN is located in the secured QDRL Control Room within CCQDER’s secured space (authorized keycard access only). Because the QDRL LAN is not located on the NCHS LAN, it is inaccessible to others inside or outside NCHS. If normal office operations are restricted, such as during the COVID-19 pandemic, CCQDER’s Respondent Database is to be housed on the NCHS CIPSEA server. Access to the server is restricted to four members of CCQDER’s Operations Team who have undergone training in the handling and protection of personally identifiable information.
Audio and video recordings: The CCQDER Recruiter/CCQDER Staff label each audio and video recording by a unique respondent identifier number, date, time, and project title. No other identifying information is labeled on the recording. Audio and video recordings are housed on the secure QDRL LAN. Only CCQDER Staff and designated agents such as CCQDER contracted staff holding proper passwords have access to interview recordings. The QDRL LAN is located in the secured QDRL Control Room within CCQDER’s secured space (authorized keycard access only). Because the QDRL LAN is not located on the NCHS LAN, it is inaccessible to others inside or outside NCHS.
Safeguarding of video recordings viewed at locations other than NCHS: Depending on the project, sponsors and collaborators may be from CDC, and occasionally from other DHHS or outside Federal agencies. The Informed Consent Form is tailored to describe each project and will specify which agencies are collaborating in the research and which staff may be [listening to/viewing] the recording. Any collaborator from outside NCHS [listening to/viewing] the recording onsite in the secured QDRL at NCHS will be required to complete NCHS Confidentiality Training (https://www.cdc.gov/nchs/training/confidentiality/training/), submit a certificate of completion, and sign a pledge to maintain confidentiality (Nondisclosure Affidavit, Attachment C).
Reports and publications: No respondent names or other personal identifying information is included in any reports or publications of cognitive testing results.
Presentations: No respondent names or other non-photographic identifying information is included in any presentations of cognitive testing results. As outlined in the standard informed consent and the special consent for expanded use of video and audio recordings, CCQDER respondents have been informed that voice and face identifiers will remain on the recording and have granted permission for the audio or video recordings to be played either to individuals working closely on the project or at conferences, meetings, or in the classroom.
11. Institutional Review Board (IRB) and Justification for Sensitive Questions
Each CCQDER project will be submitted to the NCHS Research Ethics Review Board individually for review.
Informed Consent and Voluntary Nature
CCQDER respondents/interviews conducted at NCHS
Respondents are recruited through media advertisements, flyers, and word-of-mouth, and either call the CCQDER voice mail system or contact a person coordinating the recruitment. Data collection for this project is authorized under 42 U.S.C. 242k (Section 306 of the Public Health Service Act).
During the telephone screener (Attachment D), potential respondents are informed that answering the telephone screener questions to determine their eligibility for the study is completely voluntary. They are informed that we are required by law to use the information they provided in the telephone screener for statistical research only and to keep it confidential, and that the law prohibits us from giving anyone any information that may identify them without their consent. In addition, respondents who are determined to be eligible for the study are informed during the telephone screener that the information they provide during the cognitive interview is confidential.
Prior to the start of the cognitive interview, CCQDER respondents read and sign Attachment E, Informed Consent Form (written at an 8th grade reading level). There are five templates in the attachment to cover various consent situations. The consent form states that participation is voluntary, that respondents are free to terminate the interview at any time, and that if they do so, they will still receive the incentive. The consent form describes the purpose of the interview and recording, and specifies that the recordings may be played for other staff working closely on that project, that voice and face identifiers will remain on the recording, and that respondents may be recognized by a staff member viewing or listening to the recording. Cognitive interviews deemed to be about illegal behaviors will not be video recorded, only audio recorded. Respondents are given a copy of the consent form, which contains contact information for the CCQDER Laboratory Manager, the NCHS Research Ethics Review Board (ERB), and the NCHS Confidentiality Officer.
At the close of the cognitive interview, a respondent may also be asked by the interviewer to sign Attachment F, the Special Consent for Expanded Use of Video and Audio Recordings Form. The purpose of this form is to allow for the playing of recordings at conferences, meetings, or in the classroom to illustrate findings from cognitive interviewing. Use of this form is at the discretion of the interviewer and is typically warranted if (1) the interview demonstrated a unique question problem or research finding and (2) there is an anticipated need to demonstrate the research finding at a conference, meeting, or instructional session. This form is not used when the topic of the cognitive interview is an illegal behavior (self-report or proxy report) or in the case of interviews with minors (persons under the age of 18); recordings of interviews with minors will never be shown to others not included in the study staff. Respondents are given a copy of the form which contains information about how to contact the CCQDER Laboratory Manager, the NCHS Research Ethics Review Board Chair, and the NCHS Confidentiality Officer. If respondents grant Special Consent, recordings are kept for as long as there is a justifiable, scientific use for the recordings as determined by the NCHS Research Ethics Review Board.
CCQDER respondents/interviews conducted off-site5: Sometimes interviewers must travel to conduct cognitive interviews; in these cases, a mutually agreeable location will be chosen. In all cases, extreme care is taken with audio and video recordings and any materials that contain personal identifiers, such as the Informed Consent Form or the Special Consent for Expanded Use of Video and Audio Recordings. Materials are then transported to the CCQDER, where standard procedures are followed.
CCQDER respondents/interviews conducted virtually: Respondents are recruited through media advertisements, flyers, and word-of-mouth, and either call the CCQDER voice mail system or contact a person coordinating the recruitment.
During the telephone screener (Attachment D, template 1), potential respondents are informed that answering the telephone screener questions to determine their eligibility for the study is completely voluntary. They are informed that we are required by law to use the information they provided in the telephone screener for statistical research only and to keep it confidential, and that the law prohibits us from giving anyone any information that may identify them without their consent. In addition, respondents who are determined to be eligible for the study are informed during the telephone screener that the information they provide during the cognitive interview is confidential.
Prior to the start of the cognitive interview, CCQDER respondents read and sign Attachment E, template 1, Informed Consent Form (written at an 8th grade reading level). The consent form states that participation is voluntary, that respondents are free to terminate the interview at any time, and that if they do so, they will still receive the incentive. The consent form describes the purpose of the interview and recording, and specifies that the recordings may be played for other staff working closely on that project, that voice and face identifiers will remain on the recording, and that respondents may be recognized by a staff member viewing or listening to the recording. After the interview has concluded, respondents will be given the thank-you letter signed by the Director of NCHS (Attachment N), their incentive, and a copy of the informed consent document (Attachment E, template 1), which contains contact information for the CCQDER Laboratory Manager, the NCHS Research Ethics Review Board (ERB), and the NCHS Confidentiality Officer.
At the close of the cognitive interview, a respondent may also be asked by the interviewer to sign Attachment F, the Special Consent for Expanded Use of Video and Audio Recordings Form. The purpose of this form is to allow for the playing of recordings at conferences, meetings, or in the classroom to illustrate findings from cognitive interviewing. Use of this form is at the discretion of the interviewer and is typically warranted if (1) the interview demonstrated a unique question problem or research finding and (2) there is an anticipated need to demonstrate the research finding at a conference, meeting, or instructional session. This form is not used in the case of interviews with minors (persons under the age of 18); recordings of interviews with minors will never be played in public settings and will be viewed only by study staff. Respondents are given a copy of the form, which contains information about how to contact the CCQDER Laboratory Manager, the NCHS Research Ethics Review Board Chair, and the NCHS Confidentiality Officer.
NCHS government issued encrypted laptops will be used to video and audio record the interviews. Due to the size of the video recordings, the internal drive of the encrypted laptop is not sufficient for storage of the recordings. Recordings will be saved to an NCHS government issued encrypted flash drive. The encrypted flash drive is FIPS 140-2 compliant and approved for use by OCISO.
CCQDER staff will also use the NCHS government-issued encrypted laptops to input their interviewer notes into Q-Notes. Within 24 hours, a CCQDER staff member will review project-specific interview notes and delete any direct or indirect personally identifiable information (PII) found.
Extreme care will be taken with all recordings and paperwork from interviews conducted virtually. Recordings and identifying paperwork will be stored in a secured travel case until returned to NCHS, at which point they will be transferred to the usual secured locked storage cabinets. Once the video and audio recordings are transferred to the QDRL LAN, the recordings will be deleted from the encrypted flash drive. Once deleted, the files are no longer available for use.
Focus groups: In focus group settings, participants are together and obviously can hear each other’s comments, statements, and questions. Participants are told in their initial telephone screening interview that they will be participating in a discussion group with other volunteers. Before the group discussion begins, participants sign the Informed Consent Form (Attachment E Template 3) which is tailored to specify that they will be participating in a focus group. The Informed Consent also states that they will be asked to pick a name and put it on a name tag, and that they do not have to use their real name. It is the responsibility of the interviewer (usually referred to as a moderator when conducting a focus group) to instruct the group that the information discussed will be held confidential by NCHS staff and should be treated confidentially by all respondents. Participants are strongly urged to respect the privacy of the other respondents and not to discuss with others what was discussed by the group.
At the close of the focus group, participants may be asked by the moderator to sign Attachment G, the Special Consent for Expanded Use of Video and Audio Recordings for Individual Respondents of Discussion Groups Form. The purpose of this form is to allow for the playing of recordings at conferences, meetings, or in the classroom to illustrate particular findings from a focus group. Use of this form is at the discretion of the moderator and is typically warranted if (1) the focus group demonstrated a unique question problem or research finding and (2) there is an anticipated need to demonstrate the research finding at a conference, meeting, or instructional session. This form is not used when the topic of the cognitive interview is considered to be an illegal behavior (self-report or proxy report) or in the case of focus groups with minors (persons under the age of 18); recordings of focus groups with minors will never be shown to others not included in the study staff. Participants are given a copy of the form which contains information about how to contact the CCQDER Laboratory Manager, the NCHS Research Ethics Review Board Chair, and the NCHS Confidentiality Officer. If participants grant Special Consent, recordings are kept for as long as there is a justifiable, scientific use for the recordings as determined by the NCHS Research Ethics Review Board. If any one respondent from the focus group does not grant special consent, the recording will not be used in this way.
Contractor conducted interviews
On the rare occasion when contractors (designated agents) are used to collect data as part of CCQDER projects, they are contractually bound by NCHS confidentiality provisions and must submit documentation concerning their safeguarding practices to NCHS prior to data collection. The documentation is reviewed by the NCHS Confidentiality Officer and the NCHS Information Systems Security Officer. This is standard NCHS practice and does not reflect a special CCQDER procedure. If recordings are to be shared with the contractor, a contract as well as a Designated Agent Agreement will be developed. The contractor employee will view the NCHS confidentiality training (https://www.cdc.gov/nchs/training/confidentiality/training/), submit a certificate of completion, and sign the NCHS non-disclosure statement (see Attachment C) before starting work on the project.
Field Tests/Pilot Interviews
For field test/pilot interviews of household and telephone respondents, standard operating procedures regarding informed consent and survey administration procedures specific to the survey being tested will be followed.
Most of the questionnaires currently proposed for study do not contain questions that are highly sensitive in nature. There are some exceptions, such as the National Survey of Family Growth (OMB No. 0920-0314, Exp. Date 12/31/2024), National Health Interview Survey (NHIS) (OMB No. 0920-0214, Exp. Date 12/31/2023) questions on income and HIV, and National Health and Nutrition Examination Survey (NHANES) (OMB No. 0920-0950, Exp. Date 04/30/2023) questions on sexual behavior. Again, one purpose of pre-testing such questions is to determine means for fashioning these questions in such a way that sensitivity is minimized and responses are valid.
12. Estimates of Annualized Burden hours and costs:
An average of 71,925 respondents participate in CCQDER activities in a given year, and the average annual respondent burden is estimated to be 21,905 hours. Annualized estimates of respondent burden for each of the questionnaire development studies, over the course of data collection, are provided below. For most questionnaire development studies, it is anticipated that interviews will last one hour. For some questionnaire development studies, questionnaire administration is anticipated to frequently require less than an hour of a respondent’s time (for example, a fifteen-minute interview may be conducted), and in rare cases, the burden may be more than one hour. Because the hours per response in questionnaire development studies are expected to vary, we will select the final sample size for each project in such a way that the total burden hours do not exceed the estimate listed above. For focus groups, the usual amount of time is 90 minutes (1.5 hours), which includes instructions and ancillary paperwork.
For interviews in the laboratory, time required to travel to the lab is not covered, because distances and modes of transportation are unknown. No retrieval of information by respondents is anticipated; although it is possible that validation of data at some point may require respondents to check records, probably those kept at home. In that case, the study will be designed so that the response time includes record retrieval. All estimates are based on NCHS' experience (1988 through 2022).
Estimated Annualized Burden Table
Types of Respondents | Form Name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Burden Hours
Individuals or households | Eligibility Screeners | 4,400 | 1 | 5/60 | 367
Individuals or households | Developmental Questionnaires | 8,750 | 1 | 55/60 | 8,021
Individuals or households | Respondent Data Collection Sheet | 8,750 | 1 | 5/60 | 729
Individuals or households | Focus Group Documents | 225 | 1 | 1.5 | 338
Individuals or households | RANDS (Methodological Survey) | 49,800 | 1 | 15/60 | 12,450
Total | | 71,925 | | | 21,905
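The totals in the burden table above can be reproduced with straightforward arithmetic. The sketch below simply recomputes the row and total figures from the table; no new data is introduced.

```python
# Recomputing the Estimated Annualized Burden Table totals.
# (Number of responses per respondent is 1 for every row.)
rows = [
    ("Eligibility Screeners",            4_400, 5 / 60),   # 5-minute screener
    ("Developmental Questionnaires",     8_750, 55 / 60),  # ~55-minute interview
    ("Respondent Data Collection Sheet", 8_750, 5 / 60),
    ("Focus Group Documents",              225, 1.5),      # 90-minute groups
    ("RANDS (Methodological Survey)",   49_800, 15 / 60),
]
total_respondents = sum(n for _, n, _ in rows)
# Round each row's hours before summing, as the table does.
total_hours = sum(round(n * hrs) for _, n, hrs in rows)
print(total_respondents, total_hours)  # 71925 21905
```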
Estimated Annualized Burden Costs to Respondents.
The average annual response burden cost for the CCQDER is estimated to be $613,559.05 (21,905 burden hours at an average hourly wage of $28.01). The hourly wage estimate is based on the Bureau of Labor Statistics May 2021 National Occupational Employment and Wage Estimates (http://www.bls.gov/oes/current/oes_nat.htm). There is no cost to respondents other than their time to participate.
Type of Respondent | Form Name | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs
Individuals or households | Eligibility Screeners | 367 | $28.01 | $10,279.67
Individuals or households | Developmental Questionnaires | 8,021 | $28.01 | $224,668.21
Individuals or households | Respondent Data Collection Sheet | 729 | $28.01 | $20,419.29
Individuals or households | Focus Group Documents | 338 | $28.01 | $9,467.38
Individuals or households | RANDS (Methodological Survey) | 12,450 | $28.01 | $348,724.50
Total | | 21,905 | | $613,559.05
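Each row’s cost is its burden hours multiplied by the $28.01 hourly wage, and the rows sum to the total; the following sketch re-checks that arithmetic using only figures from the table.

```python
# Re-checking the burden-cost table: hours x $28.01 wage, rounded to cents.
wage = 28.01
hours = [367, 8_021, 729, 338, 12_450]  # row burden hours from the table
costs = [round(h * wage, 2) for h in hours]
total = round(sum(costs), 2)
print(total)  # 613559.05
```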
13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
There are no annual capital or maintenance costs to respondents resulting from this collection of information.
14. Annualized Costs to the Federal Government
The cost to the government consists mainly of the salaries of the CCQDER staff that will (1) assist the questionnaire designers in the design of appropriate laboratory instruments, (2) recruit, schedule, and assist in interviewing volunteer respondents, and (3) assist in the analysis of the results and recommend changes in questionnaire wording.
Total annualized project costs are as follows:
NCHS costs for CCQDER staff to plan, conduct, and analyze the outcomes of the questionnaire development activities:
Staffing, 17.0 FTEs  $1,897,625.00
Incentives for CCQDER respondents, 4,500 @ $50 (pilot household and web panel tests do not include incentives)  $225,000.00
CCQDER Contract Staff (including remuneration)  $985,250.24
Contracts for assistance with methodological research  $30,000.00
Off-site travel (see note below under travel costs)  $10,000.00
Materials for conducting household interviews  $500.00
Flyers  $200.00
Advertisements  $24,750.00
Hardware and software upgrades  $50,000.00
Annual Total  $3,223,325.24
3-Year Total (for generic submission)  $9,669,975.72
Travel costs: Most data will be collected in NCHS office space. However, it will be more efficient in certain instances to hold interviews with individuals at other locations, which will involve some travel costs. Further, household interviews will require limited numbers of in-person interviews in respondent households. Household interviews will be done locally, in order to limit travel costs, unless there is a compelling reason to do otherwise (for example, if respondents critical to the study can be interviewed only at a distant location).
15. Explanation for Program Changes or Adjustments
This is a generic clearance. The currently approved annualized total burden is 9,455 hours. We are requesting 21,905 annualized total burden hours. The increase of 12,450 hours is due to a program change adding rounds of RANDS methodological surveys.
16. Plans for Tabulation and Publication and Project Time Schedule
This clearance request is for questionnaire development activities to be conducted prior to survey production and for developmental work that will guide future questionnaire design. Most laboratory investigations will be analyzed qualitatively. The survey designers and lab staff serve as interviewers and use detailed notes and transcriptions from the in-depth cognitive interviews to conduct analyses. Final reports will be written that document how the questions performed in the interviews, including question problems as well as the phenomena captured by the survey questions. All reports will be placed on Q-Bank for public access. Reports are used to provide the information needed to redesign a question prior to fielding, as well as to assist end users when analyzing the survey data. For field test/pilot interviewing activities, qualitative and quantitative analysis will be performed on samples of observational data from household interviews to determine where additional problems occur. Because NCHS is using state-of-the-art questionnaire development techniques, methodological papers will be written that may include descriptions of response problems, recall strategies used, and quantitative analysis of frequency counts of several classes of problems uncovered through the cognitive interview and observation techniques.
Each individually submitted information collection will include a project time schedule specific to that project.
17. Reason(s) Display of OMB Expiration Date is Inappropriate
The expiration date will be displayed.
18. Exceptions to Certification for Paperwork Reduction Act Submissions
The certifications are included in this submission.
1 Scanlon P. Using targeted embedded probes to quantify cognitive interviewing findings. In Beatty PC, Collins D, Kaye L, Padilla JL, Willis G, Wilmot A, editors. Advances in questionnaire design, development, evaluation, and testing. Hoboken, NJ: John Wiley & Sons, 427–50. 2019. DOI: 10.1002/9781119263685.ch17; Cornelia E. Neuert, Katharina Meitinger, Dorothée Behr. 2021. “Open-ended versus Closed Probes: Assessing Different Formats of Web Probing.” Sociological Methods & Research, https://doi.org/10.1177/00491241211031271.
2 Scanlon P. The effects of embedding closed-ended cognitive probes in a web survey on survey response. Field Methods 31(4):328–43. 2019. DOI: 10.1177/1525822X19871546; Scanlon P. Using targeted embedded probes to quantify cognitive interviewing findings. In Beatty PC, Collins D, Kaye L, Padilla JL, Willis G, Wilmot A, editors. Advances in questionnaire design, development, evaluation, and testing. Hoboken, NJ: John Wiley & Sons, 427–50. 2019. DOI: 10.1002/9781119263685.ch17
3 Irimata KE, He Y, Cai B, Shin H-C, Parsons VL, Parker JD. Comparison of Quarterly and Yearly Calibration Data for Propensity Score Adjusted Web Survey Estimates. Survey Methods: Insights from the Field, Special issue ‘Advancements in Online and Mobile Survey Methods.’ 2020. DOI: 10.13094/SMIF-2020-00018; Parker J, Miller K, He Y, Scanlon P, Cai B, Shin H-C, Parsons V, Irimata K. Overview and Initial Results of the National Center for Health Statistics’ Research and Development Survey. Statistical Journal of the International Association for Official Statistics 36(4):1199–1211. 2020. DOI: 10.3233/SJI-200678; He Y, Cai B, Shin H-C, Beresovsky V, Parsons V, Irimata K, Scanlon P, Parker J. The National Center for Health Statistics’ 2015 and 2016 Research and Development Surveys. National Center for Health Statistics. Vital Health Stat 1(64). 2020. https://www.cdc.gov/nchs/data/series/sr_01/sr01-64-508.pdf
4 Miller K, Willis G. Cognitive models of answering processes. In The SAGE Handbook of Survey Methodology. Thousand Oaks, CA: Sage. 2016; Miller K, Willson S, Chepp V, Padilla JL. Cognitive Interviewing Methodology: An Interpretive Approach for Survey Question Evaluation. New York: Wiley and Sons. 2014.
5 Off-site interviews fall into two categories. First, it is not always feasible for individuals to travel to the CCQDER, or it may be more efficient for interviewers to travel to a particular site. Second, we occasionally conduct establishment studies where a visit to the business location is pertinent to data collection.