Supporting Statement Part A
Request for Generic Clearance:
QUESTIONNAIRE COGNITIVE
INTERVIEWING AND PRETESTING (NCI)
(OMB #0925-0589, Expiry 5/31/2011)
This is an extension of a currently approved submission.
Changes are indicated in yellow highlights.
Contact Information:
Gordon Willis, PhD.
Cognitive Psychologist
Applied Research Program
Division of Cancer Control and Population Sciences
National Cancer Institute
6130 Executive Blvd, MSC 7344
EPN 4005
Bethesda, MD 20892
301-594-6652
301-435-3710 (fax)
January, 2011
Table of Contents
Table of Contents
List of Attachments
A. JUSTIFICATION
A.1. Circumstances Making the Collection of Information Necessary
A.2. Purpose and Use of Information
A.2.1. Development and Testing of Specific Survey Questionnaires
A.2.2. Research on the Cognitive Aspects of Survey Methodology
A.2.3. Research on Human-Computer Interfaces/Usability
A.2.4. Pilot Household Interviewing
A.3. Use of Improved Information Technology and Burden Reduction
A.4. Efforts to Identify Duplication and Use of Similar Information
A.5. Involvement of Small Businesses and Other Small Entities
A.6. Consequences of Collecting the Information Less Frequently
A.7. Special Circumstances Relating to 5 CFR 1320.5
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agencies
A.9. Explanation of Any Payment or Gift to Respondents
A.10. Assurances of Confidentiality Provided to Respondents
A.11. Justification of Sensitive Questions
A.12. Estimates of Annualized Burden Hours and Costs
A.13. Estimates of Total Annual Cost Burden to Respondents and Record keepers
A.14. Annualized Costs to the Federal Government
A.15. Explanation for Program Changes or Adjustments
A.16. Plans for Tabulation and Publication and Project Time Schedule
A.17. Expiration Date Display Exemption
A.18. Exceptions to Certification
LIST OF ATTACHMENTS FOR SUPPORTING STATEMENTS
ATTACHMENT 1: List of Currently Approved and Proposed Sub-studies for This Generic Clearance from 2008-2011
ATTACHMENT 2: Description of cognitive testing: Willis, G.B. (2005). Cognitive Interviewing. In S.J. Best & B. Radcliff, Polling America: An Encyclopedia of Public Opinion, pp. 92-98. Greenwood Press: Westport, CT.
ATTACHMENT 3: Letter from NIH Privacy Act Officer
A. JUSTIFICATION
A.1. Circumstances Making the Collection of Information Necessary
Background and History. In 1983-1984, the Committee on National Statistics conducted a seminar on the Cognitive Aspects of Survey Methodology (CASM) under a grant from the National Science Foundation (NSF). The participants in the CASM I seminar were cognitive psychologists from academic institutions and survey researchers from the National Center for Health Statistics (NCHS) and the Bureau of the Census. The seminar examined a number of cognitive-related methodological proposals that might lead to improvements in the questionnaires and interviewing procedures employed in scientific surveys in general, and in the NCHS National Health Interview Survey (NHIS) as a test case.
Following this seminar, the NSF provided funding to NCHS to investigate how relevant knowledge and techniques in cognitive science could be applied to improve health surveys. The project, begun in 1984, was called Laboratory-Based Studies of the Cognitive Aspects of Survey Methodology, and used cognitive psychological methods to study the survey interviewing process. In its final report, NCHS concluded that it is feasible and efficient for Federal statistical agencies to conduct qualitative research on the cognitive aspects of survey questionnaires. Subsequently, NCHS applied the cognitive research techniques being tested under the grant to develop the 1987 NHIS supplement, a comprehensive set of questions on knowledge, attitudes, and practices regarding cancer risk factors. Cognitive research techniques - now commonly referred to as cognitive interviewing - proved effective for identifying conceptual problems with draft questions. The NCHS project staff concluded from this experience that past questionnaire design procedures were often unable to identify questions that failed to measure what was intended, and that cognitive interviews were effective for identifying these kinds of measurement errors. At that point, a Questionnaire Design Research Laboratory (QDRL) was created at NCHS to provide such testing for surveys on a regular basis, as well as to continue more general research on the survey response process, questionnaire design, and pretesting methodology (OMB No. 0920-0222).
Since the inception of the QDRL, several other Federal agencies, including the Census Bureau and Bureau of Labor Statistics (BLS), have established cognitive laboratories or otherwise developed capacity for the conduct of cognitive interviews. These interviewing activities are currently conducted under Generic OMB Clearance at NCHS (OMB No. 0920-0222), at the Census Bureau (OMB No. 0607-0725), and at the Bureau of Labor Statistics (OMB No. 1220-0141), where cognitive interviewing techniques have been applied almost continuously to the evaluation of numerous survey questionnaires. In fact, one of the major conclusions of a second CASM seminar (CASM II, held in 1997) was that cognitive testing of survey questionnaires has become a standard practice in the Federal government, as well as in private and academic survey research organizations. Generally, testing staff are not the original authors of the survey questionnaires and do not make decisions about the overall content and survey objectives. Rather, they are methodological specialists – either Agency staff members or contracted specialists – who submit questionnaires to intensive evaluations designed to improve these measures. This work has proven to be effective for enhancing the quality of Federal survey data for over twenty years.
Proposed NCI Generic Clearance request. Given this history of use of pretesting techniques under Generic Clearance, DCCPS/NCI in 2008 established its own Generic clearance to facilitate testing of NCI-sponsored surveys (OMB #0925-0589). As stated in the OMB document “Questions and Answers When Designing Surveys for Information Collections: What is a generic clearance for pretesting activities?” (http://www.whitehouse.gov/omb/inforeg/pmc_survey_guidance_2006.pdf):
Agencies that regularly do pretesting and development work for multiple surveys have found it beneficial to obtain a generic clearance specifically for these kinds of studies. Once the overall generic clearance is obtained on the pretesting activities and methods that will be used (e.g., cognitive interviews, focus groups, respondent debriefings, etc.) through the normal clearance process, agencies can submit abbreviated collection requests on the specific questions to be tested and obtain expedited OMB review (often within 10 working days) of the specific study, which can greatly facilitate ongoing and iterative rounds of testing. For example, cognitive laboratories at the Bureau of Labor Statistics, the Bureau of the Census, and the National Center for Health Statistics have these clearances.
In some cases NCI has pre-existing relationships with agencies that already conduct pretesting under such a Generic clearance. In particular, testing of the periodic Cancer Supplement to the National Health Interview Survey has been conducted at the NCHS QDRL (OMB No. 0920-0222); and pretesting of the NCI-sponsored Tobacco Use Supplement to the Current Population Survey (TUS-CPS) has been conducted by the Census Bureau (OMB No. 0607-0725). Increasingly, however, NCI has developed one-time and periodic surveys – such as the Health Information National Trends Survey (HINTS: OMB No. 0925-0538, in 2005) – that do not involve collaboration by other Federal agencies, or that would benefit from NCI-sponsored pretesting. For these surveys, it is advantageous to the government for development to follow a pretesting sequence equivalent to that used at NCHS, the Census Bureau, or BLS.
As such, NCI staff developed the capacity for the conduct or oversight of such pretesting techniques, through hiring of a senior methodologist who previously conducted questionnaire cognitive pretesting at the NCHS QDRL, who is expert in pretesting techniques, and who has authored a book on survey pretesting1. Further, NCI developed a capacity for the conduct of cognitive interviews through training of its staff, and by establishing connections to several contract research organizations that have similar capabilities. Based on these developments, DCCPS/NCI modeled its initial Generic clearance request after that used by NCHS and approved by OMB (No. 0920-0222), but tailored it to NCI activities. The original submission of “Questionnaire Cognitive Interviewing and Pretesting” was approved in May 2008 for three years. Attachment 1 contains a summary of projects that have been conducted under this clearance to this point, or that are planned for the future.
Cognitive interviewing methods. Methods used closely follow those that have been commonly applied to date in the testing of Federal survey questionnaires. Cognitive interviewing techniques – also called intensive interviews – focus on the use of both think-aloud and verbal probing techniques. Generally, a volunteer participant is asked to think aloud as he/she answers the questions, and the specially-trained interviewer probes the participant for additional information. The interviews are generally semi-structured; the interviewer uses draft survey questions as a guide, but probes as needed to determine the participant's interpretation of the questions and the recall and decision processes used to arrive at his/her answers. This method uncovers ambiguities in question wording, participant strategies for dealing with vague questions, or questions that ask for information that is not readily available (see Attachment 2 for more detailed information).
A variant of this approach is retrospective cognitive interviewing (or debriefing), in which the interviewer first administers the entire draft questionnaire, and then reviews the questions and responses with the participant, probing for reactions to the questions. While less information is gained about the recall techniques used by participants, there is also less deviation from the natural flow of an interview. In some cases, interviews are audiotaped or videotaped (assuming the subject provides appropriate consent), so that the interviewer can concentrate on probing the responses and can analyze content of the collected information later.
Occasionally, focus groups, typically of 5-12 individuals, are used to discuss general concepts that survey questions will focus on. Individual interviews are generally preferable to focus groups for evaluating specific questions because respondents usually respond to surveys individually, and the group dynamic associated with a focus group format can have a strong influence on interpretations and responses2. However, focus groups can sometimes assist questionnaire designers in understanding the relevant background circumstances of various groups of people, and this information can be used to craft questions that better match respondent experiences3.
Additional issues arise in computer-assisted and Internet-based survey instruments, involving human-interface design, ease of use, comprehension, privacy, quality of on-line help, and efficiency of screen organization4. For questionnaires that involve Web-administration, we will rely on usability testing techniques that are very similar to those used for cognitive interviews, but that involve a more technologically-intensive environment (e.g., administration via laptop or desktop computer).
Pilot Household Interviewing: Although the cognitive interviewing methods described above are effective for identifying problems that are missed by traditional field pretests, they are limited because they do not administer questions under actual field-based interviewing conditions. Therefore, further pilot tests, conducted within selected households and including up to 200 households depending on the size of the survey to be fielded later, are a vital complement to cognitive interviews. Survey methodologists may conduct small-scale pilot household interviewing at various points in questionnaire development – not for purposes of field data collection and computation of survey estimates – but rather as a vital step in the questionnaire development sequence. Also, as time and resources allow, researchers apply behavior coding to record the behaviors of both interviewers and survey participants in such interviews to allow for systematic analysis5. These activities have been used successfully to develop the questionnaires used in previous Federal surveys, such as the NCHS National Health Interview Survey (NHIS) Supplements, and the NCI-sponsored Tobacco Use Supplement to the Current Population Survey (OMB No. 0925-0368). NCI therefore proposes to make use of similar activities in the development of future cancer-related surveys.
Generally, pilot interviews for face-to-face surveys are conducted in the participant's household; pilot interviews for telephone surveys are conducted over the telephone; and those for mail-based self-administration are sent via the mail. Professional field interviewers (often, contractor staff interviewers who are enlisted for the tested survey) normally conduct the interviews. A subset of these interviews is usually observed by a survey professional (a Federal staff member or member of the contract staff). As the interviewer conducts the pilot household interview, the observer compiles notes regarding respondent misunderstandings or difficulties in answering, or questions that interviewers have difficulty administering, which help to identify potential question revisions. This practice allows testing of types of individuals who do not ordinarily volunteer for cognitive interviews, and who may be more typical of the usual survey participant; it also provides information collected under field conditions, early enough to be useful for questionnaire design decisions.
Finally, we recognize that field pretests are a third type of pretesting, generally conducted after questionnaire development steps (whether cognitive interviews and/or pilot household interviews) are completed. Such pretests are a vital component of preparations for a major survey collection and serve as a “dress rehearsal” that evaluates questionnaire flow, length, logical progression, and so on. Cognitive interviews and pilot household interviews do not replace full-scale field pretests, but complement them with a greater focus on questionnaire design issues. Full-scale field pretests of large surveys may consist of 300 or more households, and are normally approved through the regular OMB clearance request process for a fielded survey (either as part of the fielded survey, or as a separate request), rather than through Generic clearance. NCI anticipates that full field pretests for large surveys such as the NHIS will continue to be approved by OMB separately from those activities described in the current request.
Data collection for this project is authorized under Sections 410 (42 USC 285) and 412 (42 USC 285a-1) of the Public Health Service Act. NCI/DCCPS is requesting a three-year clearance with terms similar to those previously granted under OMB No. 0925-0589. DCCPS staff will submit individual or bundled sub-studies under this generic clearance. Bundled sub-studies (up to three at once) may be submitted with the understanding that bundling allows a faster OMB review than submitting each sub-study consecutively. Approval for participant incentives will be evaluated on a case-by-case basis.
A.2. Purpose and Use of Information
The purpose and use of this information collection fall into four categories—the first three of which involve cognitive/intensive interviews, and the fourth of which relies on pilot testing with behavior coding:
2.1 Development and testing of specific survey questionnaires
2.2 Research on the cognitive aspects of survey methodology
2.3 Research on human-computer interfaces/usability
2.4 Pilot household interviewing
A.2.1. Purpose and Use of: Development and Testing of Specific Survey Questionnaires
This data collection primarily uses cognitive interviewing methodology to identify and correct questionnaire flaws, e.g., questions which are vague or ambiguous, cannot be answered readily or accurately by the participant, or otherwise contribute to the non-sampling errors of the survey. Attachment 2 contains a short description that outlines the contributions of the cognitive interviewing methodology to the questionnaire development process, the methods used at various stages of the process, and the strengths and limitations of this methodology. The methods used will vary depending on the stage of development of the various data collection instruments to be studied. When questions have been used successfully in earlier surveys, testing will evaluate whether the questions function appropriately in the new context. In cases where there is evidence that previously developed questions were not reliable or valid, more extensive evaluation will be conducted. The most extensive questionnaire development activities will be applied to untested draft questions and undeveloped lists of data objectives.
Although we cannot anticipate all of our pretesting activities over the next several years, especially because plans for future surveys depend on budget and establishment of priorities, our planning currently anticipates the following specific survey projects:
a) The Health Information National Trends Survey (HINTS) (OMB No. 0925-0538). NCI’s HINTS survey has been conducted in 2003, 2005, and 2007-8, primarily by telephone. For the planned fourth cycle of HINTS, administration will primarily be by mail. Further, the survey will adopt a continuous administration model for the years 2012-2014, in which a fairly static core will be used for each annual cycle, along with supplementary question sets that are modified for each annual cycle. HINTS is meant to provide dynamic information concerning issues of current interest in the field of health communication. Under this generic clearance, NCI plans to conduct pretesting of HINTS mailed materials and new questions.
b) Tobacco Use Supplement to the Current Population Survey (TUS-CPS) (OMB No. 0925-0368). The National Cancer Institute periodically sponsors the administration of a large-scale population-based tobacco survey within the CPS, which is itself conducted by the Census Bureau for BLS. For the TUS, NCI varies the topics of emphasis as new data collection needs arise (for example, in 2003 NCI developed the TUS Special Cessation Supplement to track tobacco quitting behaviors; the 2006-7 TUS-CPS provided population surveillance data on tobacco use; and the 2010-2011 cycle represented a combination of the 2003 and 2006-7 versions). A future cycle of the TUS-CPS is tentatively planned for the next several years, and we anticipate conducting pretesting under the current clearance request. Because new or modified questions developed for the CPS require careful pretesting, we will also conduct cognitive interviewing of draft forms of new questions that are developed, both to determine the functioning of these items, and to ensure that they function appropriately in the context of existing items.
c) The California Health Interview Survey Discrimination Module (CHIS-DM). In conjunction with the University of California, Los Angeles, which administers the CHIS survey, DCCPS/NCI staff have developed a novel measure of racial and ethnic discrimination, appropriate for telephone-based administration in surveys such as the CHIS. Over the next three years, we plan to further develop these measures through the conduct of cognitive interviewing and pilot testing across multiple language and cultural groups. The goal of this formative research is to produce a brief, easy-to-administer instrument that is clear to survey respondents, and that produces comparable data when translated into Spanish and several Asian languages.
d) Health-Related Quality of Life/Quality of Care Assessment
A major challenge to NCI and to epidemiologists and researchers is the development of self-report items for use by patients, and members of the general public, relating to Health-Related Quality of Life (HR-QOL), and more specifically to NCI, self-assessed quality of cancer care received from medical providers (i.e., Patient-Reported Outcomes, or PROs). Development of such items that are reliable, and that do not place significant burden on respondents, has presented a consistent challenge to practitioners and researchers interested in the development of items and scales that represent these concepts6. Cognitive testing, and pretesting in general, are useful in the development of these items, especially in conjunction with quantitative methods that involve psychometrics and statistical analysis. As such, NCI staff are actively engaged in the development of item banks of questions focused on Quality of Life/Care, to be used across a range of future investigations and surveys. One such effort is the “Patient Reported Outcome Measurement System” (PROMIS) (see http://www.nihpromis.org/default.aspx). NCI researchers and colleagues who are developing PROMIS have an active interest in developing and evaluating their measures using cognitive interviewing techniques, and these sub-studies are anticipated to be supported under the current Generic clearance.
Further, NCI has worked with the Food and Drug Administration to develop PROs through the project “Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE)” (http://outcomes.cancer.gov/tools/pro-ctcae.html). The purpose of the PRO-CTCAE project is to develop an electronic-based system for patient self-reporting of symptom adverse events (AEs) listed in the CTCAE in an effort to improve the accuracy and precision of grading of this class of AEs.
We anticipate conducting focus groups, cognitive testing, and perhaps pilot testing activities, that focus specifically on the qualitative aspects of these items, and that in particular assess whether they are clear and interpreted similarly across individuals, across patient groups, and across racial/ethnic/cultural groups. Beyond informing initiatives such as PROMIS and PRO-CTCAE, the results of these pretesting activities may not be targeted toward a specific survey, but rather toward the establishment of scales that are appropriate for incorporation into future studies. These efforts should greatly facilitate the development of new surveys, as much of the requisite evaluative work will have already been conducted.
e) Other questionnaire testing and development: In addition to the specific questionnaire testing and development activities listed above, we anticipate that NCI staff will perform testing of other questionnaires that require development over a short time-frame. Because the requests may arrive with little advance notice, we cannot presently specify the nature of these questionnaires.
The interviews for questionnaire development activities (a) through (e) above will usually be conducted using procedures described in Attachment 2. Interviews are normally conducted at NCI facilities (e.g., at the NCI Usability Laboratory) or in contractor offices (such as the Westat cognitive laboratory). If we are unable to obtain adequate numbers of individuals from particular population subgroups (e.g., the elderly, or those who have specific health problems), we will attempt to make arrangements with organizations such as centers for the elderly, or service organizations for persons with specific health conditions, in order to interview participants at outside locations. Usually, cognitive interviews will be conducted in the mode intended for the survey (face-to-face, telephone, self-administered, or web-based). For a telephone interview, we will either make arrangements to call the participant at home, or conduct the interview in our laboratory, with the interviewer calling the participant from another room for questionnaire administration, followed by face-to-face debriefing.
A.2.2. Purpose and Use of: Research on the Cognitive Aspects of Survey Methodology
The second major purpose of data collection is to conduct several cognitive research projects:
a) Cross-cultural research. NCI endeavors to conduct basic studies of how best to measure increasingly important factors associated with the cross-cultural aspects of survey response, such as measurement of respondent acculturation. Such questions are key to understanding language and cultural issues that impact access to care, and health in general. NCI staff intend to conduct cognitive testing of these questions, or of newly developed alternative approaches, with Hispanics of varying levels of acculturation to U.S. society, to determine whether the questions (in English, and in Spanish) are both understandable and obtain the types of information intended. Further, we anticipate the development of acculturation questions appropriate to Asian and other respondents as well, and plan to be prepared to develop and evaluate appropriate measures.
b) General methodological research: DCCPS/NCI staff constantly evaluate and refine cognitive interviewing methods, especially in order to respond to changes such as the conversion from telephone-based interviewer administration to paper-based self-administration associated with the widespread adoption of Address-Based Sampling (ABS). Further, NCI staff regularly conduct applied research on questionnaire design issues, such as the optimal wording for measures of complex concepts related to cancer risk factors and related issues (e.g., physical activity; diet; tobacco use; cancer screening). For the next cycle of pretesting under this generic clearance, DCCPS staff plan to continue research on methods evaluation and general questionnaire design research. We envision that over the next three years, NCI staff and contractors will work collaboratively with survey researchers from universities and other federal agencies to define and examine several research areas, including, but not limited to: 1) differences between face-to-face and telephone-based cognitive interviewing, 2) the effectiveness of different approaches to cognitive interviewing, such as concurrent and retrospective probing, and 3) social, cultural, and linguistic factors in the question response process.
Procedures for each of these studies will be similar to those applied in the usual testing of survey questions. For example, different versions of a survey question will be developed, and the variants then administered to separate groups of participants in order to study the cognitive processes that account for the differences in responses obtained across different versions. These studies will be conducted either by NCI personnel or by contractors who are trained in cognitive interviewing techniques. The results of these studies will be applied to our specific questionnaire development activities in order to improve the methods that we use to conduct questionnaire testing, and to guide questionnaire design in general.
A.2.3. Purpose and use of: Research on Human-Computer Interfaces/Usability
The third major purpose of this data collection is to conduct research on computer-user interface designs for computer-assisted and Web-based instruments, which is often referred to as usability testing. This research examines how survey questions, instructions, and supplemental information are presented on computer instruments, especially over the Internet, and investigates how the presentation affects the ability of users to effectively utilize these instruments. Authors of computer-assisted instruments make numerous design decisions: how to position the survey question on a computer screen; how to display interviewer instructions to respondents; the maximum amount of information that can be effectively presented on one screen; how supplemental information such as “help screens” should be accessed; whether to use different colors for different types of information presented on the screen; and so on. Research has shown that these decisions can have a significant effect on the time required to administer survey questions, the accuracy of question-reading, the accuracy of data entry, and the full exploitation of resources available to help the user complete his or her task7.
Usability testing has many obvious similarities to questionnaire-based cognitive research (described in Section A.2.1), as it focuses on the ability of individuals to understand and process information in order to accurately complete survey data collection. It is also somewhat divergent in the sense that dynamic visual information is of greatly increased importance. In particular, it also focuses more heavily on matters of formatting and presentation of information than does traditional cognitive testing.
It is anticipated that this generic clearance will again be actively used by DCCPS/NCI staff who want to test the usability of their web pages. In 2010, three of the five approved sub-studies for this generic clearance involved usability testing that was funded through NIH Set-Aside funds. Though it is unclear whether the NIH Set-Aside Funds will be used similarly in 2011, it is anticipated that there will be continued interest in, and possibly an increase in, the use of usability testing at NCI.
A.2.4. Purpose and Use of: Pilot Household Interviewing
The activities described above – cognitive interviews, focus groups, and usability studies – can together be termed intensive interviewing methods. The fourth major purpose of data collection differs from these, as it instead relies on unobtrusive, field-based questionnaire evaluation techniques, with respect to future surveys conducted either within the household or over the telephone. The use of the Pilot Household Interview, subsequent to cognitive testing, was introduced by NCHS researchers in the 1990s (NCHS Cognitive Methods Staff Working Papers #3 and #15; Hyattsville, MD), and has been supported first under OMB No. 0920-0222, and then under the first cycle of the NCI Generic Pretesting Clearance (OMB No. 0925-0589). The tested questionnaires may be pilot-tested either individually or in combination, depending on developmental status of the instruments, the appropriateness of combining them, and their overall length. It is envisioned that for any single pilot test, four or five professional field interviewers will conduct a total of approximately 100-200 pilot household interviews. There are three components to the proposed form of testing: a) a limited number of interviews on a draft version of the questionnaires, conducted with household participants, by NCI and other staff trained in observational techniques; b) inclusion in the questionnaires of two different versions of particular questions, to gather information relevant to determining which version functions better in the field environment; and c) to make such determinations, the systematic coding of interviewer-respondent interactions. Although no pilot testing of this type was conducted under the first three-year cycle of this clearance, for the next cycle we anticipate conducting such tests as part of the development of the Health Information National Trends Survey (HINTS), described above. Details of this testing will be submitted as separate sub-studies under this request.
Overall, the four major activities outlined above have well-demonstrated practical utility. As a result of pretesting, questionnaires may be clearer, and may therefore produce less response error, than would occur in the absence of this testing. Thus, users of NCI data, in both Federal agencies and in the general health research community, will be less likely to be misled by erroneous statistical results. This assertion is supported by over twenty-five years of experience in using these techniques, and has been supported by findings presented at many statistical and research-related conferences, and published in scientific journals such as Public Opinion Quarterly and Applied Cognitive Psychology. The practical utility of Pilot Household Interviewing has also been supported in findings reported at an annual meeting of the American Statistical Association and the American Association for Public Opinion Research. Further evaluation of the efficacy of these methods will be ongoing.
For later discussion, the term Intensive Pretesting will be used as a general term to refer to cognitive interviews, focus groups, and usability testing; these are distinguished from observational, field-based Pilot Household Interviews.
A.3. Use of Improved Information Technology and Burden Reduction
Pretesting will be conducted using the most recent modes of survey data collection, including CAPI/CASI, touch-tone data entry (TDE), the Internet, or other modes applied to Federal surveys.
A Privacy Impact Assessment (PIA) is not needed because there is no information technology (IT) system associated with this information collection. Should this change, or should an individual generic sub-study involve an IT system, a PIA will be pursued at that time.
A.4. Efforts to Identify Duplication and Use of Similar Information
NCI staff work closely with staffs of other Federal agencies that conduct pretesting activities, including (a) the QDRL at NCHS, (b) the Census Bureau’s Center for Survey Methods Research, and (c) the BLS Collection Procedures Research Laboratory. Further, participation of key NCI staff in the Interagency Response Error Group (IREG), which meets quarterly to discuss questionnaire development and pretesting among Federal statistical and research agencies, ensures that pretesting is conducted in a manner that is coordinated across agencies. NCI staff will avoid the conduct of pretesting activities that are duplicative of those of the other agencies. In most cases this is true simply because the various agencies evaluate different survey questionnaires (generally those they develop and administer). However, where surveys do overlap between Agencies (such as the National Health Interview Survey, where responsibility for Cancer-related Modules is shared between NCI and NCHS), NCI staff will collaborate regularly with the other Agency to produce a pretesting plan that is optimal for purposes of timely and efficient production of results, in a way that minimizes respondent burden. This could involve sharing of pretesting responsibilities, but in a coordinated, non-duplicative manner. In some cases parallel testing of the same questions could be conducted across Agencies, for purposes of comparison of pretest results. In particular, as cognitive techniques have been applied, there has been a paucity of research concerning the reliability of obtained results8; parallel testing between agencies provides an important methodological bridge toward answering this persistent question.
Overall, NCI questionnaire design researchers also maintain very close contact with other experts in the field of questionnaire development in the academic survey community, in the health sciences field, at the Census Bureau, BLS, NCHS, the Government Accountability Office, the National Science Foundation, and the Energy Information Administration. From these contacts, it is clear that no other projects that duplicate the current proposal are now underway.
A.5. Involvement of Small Businesses and Other Small Entities
It is possible that representatives of small businesses will be interviewed as part of testing involving medical offices and other establishments, for provider/physician surveys. For these interviews, the organization/office will be approached in the same manner as the individuals we normally recruit; we will ask the organization to identify the appropriate staff members with whom to conduct the cognitive interview.
A.6. Consequences of Collecting the Information Less Frequently
The project involves one-time data collection activities only. There are no legal obstacles to reducing the burden.
A.7. Special Circumstances Relating to 5 CFR 1320.5
There are no special circumstances.
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agencies
A 60-Day Federal Register notice for this collection was published on December 17, 2010 (75 FR 79009). No public comments were received.
Other agencies and individuals: Some of the topics selected for NCI surveys may be developed in conjunction with other agencies: For example, the Tobacco Use Supplement to the Current Population Survey (TUS-CPS) has in the past been developed in conjunction with the Office on Smoking and Health within the Centers for Disease Control and Prevention. These agencies may be involved in development of draft questionnaires. Further, NCI staff maintain ongoing connections with staffs of NCHS and of the Census Bureau, concerning the development and pretesting of the NHIS, the TUS-CPS, and other joint efforts.
Researchers who have special interest and expertise in the research areas explored will be contacted as necessary.
Consultants outside of NCI:
Kristen Miller, Ph.D.
Office of Research and Methodology
National Center for Health Statistics
Centers for Disease Control and Prevention
Toledo Rd
Hyattsville, MD
(301) 458-4625
Floyd J. Fowler, Jr., Ph.D.
Center for Survey Research, University of Massachusetts
100 Arlington Street
Boston, Massachusetts 02116
(617) 956-1150
Kerry Levin
Westat
1650 Research Blvd.
Rockville, MD 20850
(301) 738-3563
Consultants within NCI:
All of the following may be contacted at:
6130 Executive Blvd, MSC 7344
EPN 4005
Bethesda, MD 20892-7344
(301) 435-4742
Rachel Ballard-Barbash, MD, MPH [email protected]
Nancy Breen, PhD [email protected]
Anne Hartman, MS [email protected]
Sue Krebs-Smith, PhD [email protected]
Richard Moser, PhD [email protected]
Rick Troiano, PhD [email protected]
Consultation with representatives of those from whom data will be collected will take place in the form of interviews with volunteers to determine the feasibility of collecting the needed data, the most promising approach for data collection, and general attitudes among the participants which might influence data collection.
A.9. Explanation of Any Payment or Gift to Respondents
For intensive forms of interviews (that is, cognitive interviews, focus groups, and usability tests), participants generally receive an incentive, for several reasons:
Eligibility criteria for participants are usually specific. Some of these criteria are determined by the subject matter of the survey (e.g., questions may be only relevant to people with certain health conditions). The more specific the subject matter, the more difficult it is to recruit eligible participants; payments help to attract them.
Intensive forms of interviews require an unusual level of mental effort, as participants are asked to explain their mental processes as they hear the question, discuss its meaning and point out any ambiguities, and evaluate the acceptability of response options that are provided.
Participants are usually asked to travel to a cognitive laboratory or other testing location, which involves transportation and parking expenses. Many participants incur additional expenses due to leaving their jobs during business hours, making arrangements for child care, etc.
For a standard interviewing project, in which one-hour intensive interviews are conducted at NCI or contractor offices and eligibility requirements are of average complexity, participants will receive $40.00-$50.00, depending on the burden imposed by travel and the interview itself. The incentive may be reduced to an amount no lower than $30.00 if the interview is of shorter duration, or does not require the participant to travel to NCI. Higher incentives may be requested for particularly difficult recruitments. For example, in a 1995 study, the NCHS QDRL reported being unable to find auto mechanics and truck drivers willing to be interviewed for less than $75.
Focus groups or individual interviews of highly-compensated professionals, such as physicians, are vital for development of provider surveys, but normally require incentive amounts of up to $150 for an hour-long interview. Such incentives were previously used as part of the development of three NCI provider/physician surveys (where n<10): the National Survey of Primary Care Physicians’ Cancer Screening Recommendations and Practices (OMB No. 0925-0562); the National Survey of Energy Balance-Related Care Among Primary Care Physicians (OMB No. 0925-0583); and the Survey of Physician Attitudes Regarding the Care of Cancer Survivors (OMB No. 0925-0595). Additionally, a recently approved sub-study under this generic clearance, “Cognitive Interview for Multidisciplinary Care (MDC) Survey” (OMB No. 0925-0589-07), was approved to offer $150 to recruit physicians to participate in both a cognitive interview and a 30-minute survey.
It is important to offer an incentive sufficient to attract the full range of needed participant types for intensive interviewing projects. Inadequate participant recruitment limits the effectiveness of the questionnaire evaluation. In addition, we face competition from other laboratories (public and private) in a highly saturated research area. Sometimes our advertisements are adjacent to ads offering participants substantially higher incentives for the same commitment. Requests and justification for incentives will be included in each individual collection submission.
For activities that are meant to resemble the usual household interview – in particular, Pilot Household Interviewing -- participants will not receive remuneration, given that the methods are meant to replicate usual field conditions, for which survey respondents are normally not provided remuneration. Further, Pilot interviews are conducted in respondent households, so no travel is required, and also generally take far less than the one hour required for more intensive pretesting activities such as the cognitive interview.
A.10. Assurances of Confidentiality Provided to Respondents
Information collected under this clearance will include personally identifiable information (PII) in the form of names and contact information. Names and contact information will be used only for purpose of subject recruitment for pretest interviews (e.g., focus groups, cognitive interviews, usability tests); will not be associated with substantive data collected during interviews; and will be destroyed immediately after the interview. The NIH Privacy Act Officer has reviewed the work scope of this proposal to determine whether the Privacy Act is applicable to this data collection. Additionally, the NIH Privacy Act Officer will be asked to review the protocols of each sub-study under this generic clearance to ensure that NCI adheres to privacy requirements (see Attachment 3). Individual sub-studies will also submit, as part of their sub-study memo, a consent form and a plan for ensuring that identifiers are not retained as part of the research.
Activities covered under this clearance are generally considered to be Exempt from IRB review at NIH. For each sub-study, in addition to submitting a specific request for Privacy Act review and an assessment of the need for a Privacy Impact Assessment, NCI staff will submit a request for Exemption to the NIH Office of Human Subjects Research (OHSR). If OHSR determines that the data collection involves non-exempt activities and should be reviewed by the NCI Special Studies Institutional Review Board (SSIRB), then NCI staff will develop appropriate materials and will not contact human subjects for that project until the SSIRB has approved the research Protocol. If a contractor is involved in human subjects research activities, that contractor’s IRB will also review that testing project.
For Pilot interviews, whether of household or telephone participants, standard operating procedures regarding informed consent specific to the survey being tested will be slightly modified to reflect participation in the testing of survey questions, rather than participation in the actual survey to be field-administered. Again, no PII will be maintained by the sub-studies carried out under this clearance.
Plans for assuring confidentiality, and for safeguarding of collected information, will be specified by each sub-study submitted under this clearance. In general, the key NCI staff or contractor project director is responsible for safeguarding schedules, consent documents, audiotapes and videotapes, questionnaires, and cash incentives to participants.
Upon completion of a cognitive interview, the interviewer is responsible for the questionnaire, any notes written on other pieces of paper, and if created, the interview recording. The interviewer is instructed to lock all materials in his/her work area until all analysis is completed. Recordings are labeled by participant identifier number, date, time, and project title. No other identifying information is labeled on the recording. Once analysis is completed, interviewers are responsible for returning questionnaires and recordings to the project coordinator, who stores the materials in a locked location. No participant names or other identifying information is included in any reports, publications, or presentations of interview results.
Sometimes interviewers must travel to establishments or individuals’ homes in order to conduct interviews9. It is the interviewer’s responsibility to take necessary steps to ensure privacy, confidentiality, and safeguarding of materials. Generally, interviews will be conducted in private rooms with a closed door. If no private room is available, the participant can select a private area and the interviewer will judge whether the area is sufficient for ensuring privacy. If the interviewer determines that the area is not private and/or soundproof enough, and no alternative area can be provided, the interview is postponed. For those surveys conducted in the participant’s home, the interviewer requests in advance that the participant arrange for privacy. However, interview location within the home is the choice of the participant.
As for other interviewing formats, focus group confidentiality and informed consent procedures will be specified for each sub-study. In focus group settings, participants are interviewed together and can hear each other’s comments, statements, and questions. Participants are told in their initial telephone screening interview that they will be participating in a discussion group with other volunteers. Before the group discussion begins, participants sign a consent form which is tailored to specify that they will be participating in a focus group. Generally, the interviewer (usually referred to as a Moderator when conducting a focus group) will instruct the group that the information discussed will be kept secure by NCI staff to the extent permitted by law. Participants are asked to respect the privacy of the other participants and not to reveal to others what was discussed by the group.
When contractors are employed to collect data as part of NCI projects, they are contractually bound by NCI confidentiality provisions, and must submit documentation concerning their safeguarding practices to NCI prior to data collection. For any data collection activity, the contractor’s Institutional Review Board will review the data collection plan, and will complete the review and approval process before contact with human subjects is made.
A.11. Justification of Sensitive Questions
There will be no personally identifiable information retained for the generic sub-studies under this clearance. Additionally, the questionnaires currently proposed for study generally do not contain questions that are highly sensitive in nature. There are exceptions, however, and item sensitivity cannot always be predicted (note that one purpose of pretesting is to assess level of sensitivity). Therefore, a major purpose of cognitive and other pretesting of such questions is to determine means for fashioning them – and explanations for their administration – in such a way that sensitivity is minimized, and responses are valid.
A.12. Estimates of Annualized Burden Hours and Costs
A. Hour Burden Estimates
The average annual participant burden is estimated to be 1,200 hours, or a total of 3,600 hours over a three-year approval period (Table A.12-1). Estimates are based mainly on the practice of conducting one-hour interviews with participants. The estimates cover the time that each participant will spend communicating with the individual serving as the initial point of contact, answering screening questions and survey questions, and, in some cases, being debriefed following the interview concerning their thoughts about the tested items. In rare cases, the burden may be more than one hour (although not more than 1.5 hours). Because the time per response is expected to vary, we will select the final sample size for each project in such a way that the total burden hours do not exceed the estimate listed above. For focus groups, the usual amount of time required is 90 minutes (1.5 hours), with instructions and ancillary paperwork taking an additional 15-25 minutes.
For all intensive interviewing activities (cognitive, focus group, or usability) conducted at NCI or contractor offices, the time required to travel to the location of the interview is not included in the current burden estimates, because distances and modes of transportation are unknown. No retrieval of information by participants is anticipated, although it is possible that validation of data at some point may require participants to check records, probably those kept at home. In that case, the study will be designed so that the one-hour response time includes record retrieval. All estimates are based on conferring with NCI staff who coordinate or lead the relevant questionnaire development activities, and on previous small-scale pretesting activities (involving samples of fewer than nine) that have been conducted under NCI auspices.
Table A.12-1 Estimates of Burden Hours Over Three-Year Approval Period
Type of Respondents | Projects | Number of Respondents | Frequency of Responses | Average Time per Response (minutes/hour) | Burden Hours
Physicians, Scientists, and similar Respondents | Include: survey questionnaire development, research on cognitive aspects of survey methodology, research on computer-user interface design, and pilot household interviews. | 1,200 | 1 | 75/60 (1.25) | 1,500.0
Experts in their Field | | 600 | 1 | 75/60 (1.25) | 750.0
Administrators/Managers | | 600 | 1 | 75/60 (1.25) | 750.0
General Public | | 1,200 | 1 | 30/60 (0.5) | 600.0
Total | | 3,600 | | | 3,600.0
B. Annualized Costs to Respondents Over the Three-Year Approval Period
Each sub-study will identify the most appropriate respondents for the cognitive interviewing, survey development, usability testing, and Pilot household survey activities. As a result, the respondents in each category may include the general public, physicians, hospital administrators, scientists, and other respondents who have not yet been identified, which makes estimating the respondent cost difficult and imprecise. The table below presents our best estimate of the respondent costs.
Table A.12-2 Respondent Costs Over Three-Year Approval Period
Type of Respondents | Number of Respondents | Burden Hours | Hourly Wage Rate | Respondent Cost
Physicians, Scientists, and similar Respondents | 1,200 | 1,500.0 | $80 | $120,000.00
Experts in their Field | 600 | 750.0 | $100 | $75,000.00
Administrators/Managers | 600 | 750.0 | $40 | $30,000.00
General Public | 1,200 | 600.0 | $20 | $12,000.00
Total | 3,600 | 3,600.0 | | $237,000.00
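As an illustrative check only, the arithmetic behind Tables A.12-1 and A.12-2 can be reproduced with the short Python sketch below. The respondent counts, response times, and wage rates are simply the estimates stated above; the script itself is not part of the clearance materials.

# Illustrative sketch reproducing the arithmetic of Tables A.12-1 and A.12-2.
# All figures are the estimates given in this section; nothing here is new data.
rows = [
    # (type of respondent, number of respondents, minutes per response, hourly wage rate)
    ("Physicians, Scientists, and similar Respondents", 1200, 75, 80),
    ("Experts in their Field", 600, 75, 100),
    ("Administrators/Managers", 600, 75, 40),
    ("General Public", 1200, 30, 20),
]
total_hours = 0.0
total_cost = 0.0
for name, respondents, minutes, wage in rows:
    hours = respondents * minutes / 60   # burden hours = respondents x hours per response
    cost = hours * wage                  # respondent cost = burden hours x hourly wage rate
    total_hours += hours
    total_cost += cost
    print(f"{name}: {hours:,.1f} hours, ${cost:,.2f}")
print(f"Total over three years: {total_hours:,.1f} hours, ${total_cost:,.2f}")
# Prints 3,600.0 hours and $237,000.00 in total, i.e., 1,200 burden hours annually.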
A.13. Estimates of Total Annual Cost Burden to Respondents and Record keepers
There are no direct costs to record keepers or respondents other than their time to participate.
A.14. Annualized Costs to the Federal Government
The cost to the government consists mainly of the salaries of Federal and contract staff who will: (1) recruit, schedule, and interview volunteer participants, and (2) assist in the analysis of the results and recommend changes in questionnaire wording. Total annualized project costs are located in Table A.14-1.
Table A.14-1 Annualized Costs to the Federal Government
Cost Category | Staffing | Annual Cost
Annual costs for NCI staff to plan, conduct, and analyze the outcomes of the questionnaire development activities: | |
  Managerial | 0.50 FTE | $40,000
  Professional | 0.50 FTE | $60,000
  Support | 1.00 FTE | $40,000
Payment, under contract, for assistance with pretesting activities/research | | $100,000
Travel costs (mainly local travel) | | $1,000
Materials for conducting household interviews | | $1,000
Recruitment materials (flyers, newspaper advertisements) | | $2,000
TOTAL | | $244,000
Travel costs: Most data will be collected in NCI or contractor office space. However, it will be more efficient in certain instances to hold interviews with individuals at other locations (homes, health centers, elderly centers), which will involve minor travel costs. Further, household interviews will require limited numbers of in-person interviews in participant households. Household interviews will be done locally, in order to limit travel costs, unless there is a compelling reason to do otherwise (for example, if participants critical to the study can be interviewed only at a distant location).
A.15. Explanation for Program Changes or Adjustments
This is a request for an extension of OMB No. 0925-0589, which is considered an adjustment in the Agency estimate. No changes are anticipated in the next three years, except an increase in burden due to anticipated increased usage of this clearance by NCI for the formative development and pretesting of self-report questionnaires and data-collection instruments. The methodology remains the same as was proposed in the original generic clearance.
A.16. Plans for Tabulation and Publication and Project Time Schedule
This clearance request is for questionnaire development activities to be conducted prior to field administration, and for developmental work that will guide future questionnaire design. The majority of intensive interviewing investigations will be analyzed qualitatively. The survey designers and lab staff serve as interviewers, and use detailed notes and transcriptions from the in-depth cognitive interviews to conduct analyses. The results of these investigations will be used primarily to develop reliable survey instruments and methods. For the Pilot Household Interviewing activities, qualitative and quantitative analysis will be performed on samples of observational data from household interviews, in order to determine where additional problems occur. In particular, Behavior Coding, which involves the systematic tabulation of several categories of interviewer behavior (e.g., erroneous reading of a question) and respondent behavior (e.g., a request for repetition or clarification; providing an uncodeable response), will be used to assess problems with survey questions. Because NCI is using state-of-the-art questionnaire development techniques, NCI staff and collaborating contract staff will produce methodological papers which may include descriptions of response problems, recall strategies used, and quantitative analysis of frequency counts of several classes of problems that are uncovered through the cognitive interview and observation coding techniques.
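As a minimal sketch of how behavior-coding tallies might be produced from pilot-interview observations, the following Python fragment counts coded events per question. The question identifiers and behavior categories shown are hypothetical examples only, not the coding scheme that will actually be used.

# Hypothetical illustration of behavior-coding tabulation for pilot household interviews.
# Question IDs and behavior codes below are invented examples only.
from collections import Counter

coded_events = [
    ("Q1", "exact question reading"),
    ("Q1", "request for clarification"),
    ("Q2", "erroneous question reading"),
    ("Q2", "request for clarification"),
    ("Q2", "uncodeable response"),
]

tallies = Counter(coded_events)            # counts of each (question, behavior code) pair
for (question, code), count in sorted(tallies.items()):
    print(f"{question}: {code} = {count}")
# Questions with high rates of problem codes (clarification requests, misreadings,
# uncodeable responses) are candidates for revision before field administration.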
The time schedule of activities will in most cases be linked to the development cycle of surveys to be supported by the pretesting activities described (e.g., the HINTS or TUS-CPS). All activities will be conducted over the period CY 2011 – 2014.
A.17. Expiration Date Display Exemption
All surveys and interview guides will display the OMB number and expiration date.
A.18. Exceptions to Certification
No exceptions to the Certification for Paperwork Reduction Act submissions are requested.
1 Willis, G. B. (2005). Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage.
2 Fowler, F. J., Jr. (1995). Improving Survey Questions: Design and Evaluation (Applied Social Research Methods Series, Vol. 38). Thousand Oaks, CA: Sage.
3 Krueger, R. A. (1994). Focus Groups: A Practical Guide for Applied Research. Thousand Oaks, CA: Sage.
4 Couper, M. (1999). The application of cognitive science to computer assisted interviewing. In M. Sirken, D. Hermann, S. Schechter, N. Schwarz, J. Tanur, & R. Tourangeau (Eds.), Cognition and Survey Research (pp. 277-300). New York: Wiley.
5 Fowler, F. J., & Cannell, C. F. (1996). Using behavioral coding to identify problems with survey questions. In N. Schwarz & S. Sudman (Eds.), Answering Questions: Methodology for Determining Cognitive and Communicative Processes in Survey Research (pp. 15-36). San Francisco: Jossey-Bass.
6 Lipscomb, J., Gotay, C., & Snyder, C. (2005). Outcomes Assessment in Cancer. Cambridge University Press.
7 Couper, M. (1999). The application of cognitive science to computer assisted interviewing. In M. Sirken, D. Hermann, S. Schechter, N. Schwarz, J. Tanur, & R. Tourangeau (Eds.), Cognition and Survey Research (pp. 277-300). New York: Wiley.
8 Beatty, P., & Willis, G. (2007). The practice of cognitive interviewing. Public Opinion Quarterly.
9 Off-site interviews fall into two categories. First, on rare occasions, participants who are recruited for lab interviews request that the interview be conducted in their home. Second, we occasionally conduct establishment studies where a visit to the business location is pertinent to data collection.