Opioid Plus Card Sort - 10 Day Letter

NCHS Questionnaire Design Research Laboratory

OMB: 0920-0222

DEPARTMENT OF HEALTH & HUMAN SERVICES Public Health Service

Centers for Disease Control and Prevention

National Center for Health Statistics

3311 Toledo Road

Hyattsville, Maryland 20782

May 8, 2018


Margo Schwab, Ph.D.

Office of Management and Budget

725 17th Street, N.W.

Washington, DC 20503


Dear Dr. Schwab:


The NCHS Collaborating Center for Questionnaire Design and Evaluation Research (CCQDER) (OMB No. 0920-0222, Exp. Date 07/31/2018) proposes to conduct a cognitive interviewing study to evaluate questions pertaining to opioid use for population-based household surveys. You have already approved the cognitive interviewing portion of this study (May 3, 2018), but we are proposing a slightly expanded protocol that includes a card sorting activity. Beyond the typical cognitive interviewing method approved previously, a short card sorting activity at the end of the interview will allow CCQDER to better understand how respondents understand and relate opioids and other pain management drugs. Because previously approved generic Information Collection Requests cannot be edited for non-substantive changes, we are re-submitting this project for your consideration. The following 10-day letter differs only slightly from what you approved on May 3, 2018, in that it now includes information on the card sorting exercise. Attachments 1 through 6 remain the same as what was previously approved, and the burden has not changed. Attachment 7 includes the card templates that will be used in the card sorting exercise.


Most of the questions intended for this study currently appear on the Substance Abuse and Mental Health Services Administration’s (SAMHSA’s) National Survey on Drug Use and Health (NSDUH) (OMB No. 0930-0110, Exp. Date 08/31/2020) but have not been fully evaluated, and it is not certain whether, or to what degree, the questions capture the constructs required by the CDC. Recruitment of respondents and interviewing would begin as soon as approval is received. Interviews will be conducted in up to seven diverse regions in the United States.


Cognitive Interviewing Methodology


The methodological design of this proposed study is consistent with the design of most NCHS/CCQDER cognitive interviewing studies: the purpose is to identify the various patterns of interpretation that respondents consider when formulating an answer to a survey question as well as any problems experienced. Findings demonstrate the construct captured by each question, consistency of patterns across respondent groups, and potential sources of response error. Interviews are in-depth and semi-structured; analysis is conducted using qualitative methodologies. Findings from all CCQDER studies are documented in a final report and made publicly accessible on a searchable website at https://wwwn.cdc.gov/QBank.


Card Sorting Methodology


Card sorting, also known as pile sorting, is a structured social science data collection method that allows researchers to explore how individuals and groups perceive and organize a given cultural domain1. In short, card sorting is a proximity method for exploring relationships within a cultural domain; it produces unconstrained clusters of the items being grouped, which represent a cultural taxonomy2. In this case, we are specifically interested in how respondents organize and understand pain management drugs.


Previous cognitive interviewing conducted by CCQDER indicates that the term “opioids” is not a meaningful label for most respondents. Analysis of both the qualitative and quantitative data produced by the card sort exercises will allow CCQDER to develop opioid and pain-management questions that will be better understood by potential NCHS and CDC survey respondents. By exploring how the sample, or sub-groups within the sample, group the cards depicting pain management drugs, the “cultural proximity” of each item within the cultural domain can be calculated. Medicines that are grouped together more frequently are understood to be more similar than medicines that are grouped together less frequently. By examining both this quantitative data and the qualitative information that emerges from the discussion of the groups, a better understanding of how respondents organize and talk about pain relievers is possible. This information can then be used to design or refine survey questions.
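To make the proximity calculation concrete, the sketch below is a minimal, hypothetical illustration of how the share of respondents who place two cards in the same pile could be tallied into pairwise proximities. It is not part of the approved protocol; the drug names and pile assignments are invented for demonstration only.

```python
# Illustrative sketch only, not part of the approved protocol. Drug names and
# pile assignments below are hypothetical examples used to show the tally.
from collections import defaultdict
from itertools import combinations

# Each respondent's sort is a list of piles; each pile is a set of card labels.
sorts = [
    [{"oxycodone", "hydrocodone"}, {"ibuprofen", "aspirin", "acetaminophen"}],
    [{"oxycodone", "hydrocodone", "acetaminophen"}, {"ibuprofen", "aspirin"}],
]

# Count how often each pair of cards lands in the same pile.
pair_counts = defaultdict(int)
for sort in sorts:
    for pile in sort:
        for a, b in combinations(sorted(pile), 2):
            pair_counts[(a, b)] += 1

# Proximity = share of respondents who grouped the pair together.
n_respondents = len(sorts)
proximity = {pair: count / n_respondents for pair, count in pair_counts.items()}

for (a, b), p in sorted(proximity.items(), key=lambda kv: -kv[1]):
    print(f"{a} & {b}: grouped together by {p:.0%} of respondents")
```

Under this tally, pairs grouped by a larger share of respondents would be treated as culturally closer.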


Cognitive Interviewing Study of Opioid-Related Questions


Background: Much of CDC’s information on opioid use comes from disparate survey and commercial data sets of varying quality that produce inconsistent estimates. For example, CDC relies on data from SAMHSA’s NSDUH to understand various aspects of opioid use, misuse, and disorder. However, compared to commercial data sources like IQVIA – which reports collecting 90% of prescriptions – the prevalence of opioid use from NSDUH appears much higher (IQVIA: 19.2% vs. NSDUH: 37.8%, Han et al., 20173). At the same time, NSDUH estimates of misuse and opioid-use disorder appear to be lower than expected. Currently, it is not clear which source is more accurate for opioid use, and there is strong speculation that not all NSDUH opioid items are fully capturing accurate information.


Study Research Questions: The primary goal of this study is to investigate the ways in which existing opioid questions perform in differing socio-cultural contexts within the United States and, more specifically, to better understand barriers to accurate reporting of opioid use, misuse, impairment and addiction on household surveys. It is not the intention of this study to produce a fully valid set of opioid-related questions, although it will contribute to this likely future endeavor. Household surveys may hold specific data quality limitations that other sources of data do not, and it is necessary to fully explore the cognitive and interpretative processes that produce opioid survey data in order to assess those limitations. Furthermore, findings from a previous CCQDER study showed that, when formulating answers to opioid impairment questions, respondents’ interpretations of such questions were directly linked to their personal experience and circumstance,4 suggesting the need to examine the comparability of interpretive patterns across socio-economic and cultural groups. Not doing so could lead to differing levels of quality in terms of understanding the opioid crisis in the various affected communities.


Thus, this study aims to address the following research questions:

  1. When answering questions about opioid use, what kinds of medication do respondents consider? How do they define these medications, and what are the parameters for these considerations? Does this vary according to respondents’ background, experiences with the medical system, and/or their socio-cultural context?

  2. How do respondents understand the concept of misuse? For those whose actions would be defined as misuse by the CDC, how do they make sense of or rationalize their actions? How do these personal explanations impact their response to questions about misuse? How consistent are these patterns across differing groups of respondents?

  3. How do respondents conceptualize opioid impairment? Are there differences across respondent groups? How do these conceptualizations impact responses to impairment questions?

  4. Regarding the opioid addiction questions, are respondents able to reflexively examine their actions and accurately report back as would be intended by the CDC? What are the factors that lead to response error? Do these factors vary by question topic? By respondent group?

  5. In terms of answering questions about usage, are there any cognitive tasks that are over-burdensome to the extent that data quality is compromised? If so, what are the characteristics of those questions? Does this differ across respondents?

  6. Should some types of opioid-related questions be deemed as too sensitive to ask on face-to-face, household surveys? What are the characteristics of those questions? Does this vary across respondents?

  7. How do respondents understand and group the overall set of opioid pain relievers? Do they differentiate opioid and non-opioid pain relievers based on their chemical ingredients, or do they instead differentiate pain relievers based on other concepts such as familiarity and salience?


Collaboration: On March 15, CDC convened a working group of CCQDER staff as well as representatives from the National Center for Injury Prevention and Control (NCIPC) to discuss the proposed project to assess the validity of opioid questions on household surveys. NCHS/CCQDER staff presented 1) an overview of the CCQDER program and cognitive interviewing methodology, 2) findings from a previous study pertaining to opioids, and 3) the purpose and goal of the proposed study (i.e., to study the feasibility of asking opioid questions on household surveys). NCIPC staff presented CDC’s overall strategy for handling the opioid crisis, data sets used to track the opioid crisis, and the data quality concerns outlined in the section above. From this discussion, the group developed a research agenda for the proposed study as well as a proposed questionnaire, much of which is drawn from the NSDUH.



On April 10, the workgroup met with SAMHSA representatives to describe the current project and to initiate a collaboration. As part of the collaboration, NSDUH staff provided necessary materials for the interviews, including past testing reports, and it was agreed that SAMHSA colleagues will have access to cognitive interview recordings, will be kept abreast of the study, and will receive a final report. It was also agreed that, if possible, the two agencies would co-author study-related papers.


Study Protocol: The opioid use, misuse, impairment and use disorder questions to be examined are included as Attachment 1. The proposed questions for cognitive testing have appeared on the 2017 NSDUH survey; however, these questions have not been formally evaluated according to OMB Statistical Policy Directive No. 2: Standards and Guidelines for Cognitive Interviews. Additional questions on general health and access to care are used to frame respondents’ answers to the opioid questions. The testing procedure conforms to the cognitive interviewing techniques described in CCQDER’s generic clearance package (OMB No. 0920-0222, Exp. Date 07/31/2018).


We propose to recruit up to 200 English-speaking adults (aged 18 and over) with a range of experiences with pain and opioid use (in the past 30 days versus the past 12 months). Within these constraints, we plan to recruit participants with some demographic variety (particularly in terms of regional wealth, urban/rural residence, education, age, race/ethnicity, and gender).


Recruitment will be carried out through a combination of newspaper advertisements/flyers, word of mouth, and a database of predetermined volunteers. The newspaper advertisement/flyer used to recruit respondents is shown in Attachment 2. The 5-minute screener used to determine eligibility of individuals responding to the newspaper advertisement/flyer is shown in Attachment 3a. The 5-minute screener used to determine eligibility of individuals from the CCQDER Respondent Database is shown in Attachment 3b. Note that the wording of the template has been approved and is contained within our umbrella package. Only project-specific information has been added to the document. It is anticipated that as many as 360 individuals may need to be screened in order to recruit 200 participants. Interviews will be conducted in as many as seven diverse locations in the United States, possibly including the Northeast, South, Southwest, Pacific Northwest, Midwest, and Mid-Atlantic.


Interviews averaging 60 minutes (including the completion of a Respondent Data Collection Sheet) will be conducted by CCQDER staff members with English-speaking respondents. Interviews will be conducted in the Questionnaire Design and Evaluation Research Laboratory as well as at off-site locations. Interviews conducted in the Questionnaire Design and Evaluation Research Laboratory and those conducted off-site will be video and audio recorded to allow researchers to review the behaviors and body language of the respondents. These recordings will allow researchers to ensure the quality of their interview notes. In the rare case that a study participant initially agrees to a recording during the telephone screening but changes their mind and checks “no” to allowing the interview to be recorded on the informed consent document, the interview will proceed without any recording. In this case, the interviewer will depend on their handwritten notes when conducting analysis. In addition, individuals who select “yes” for allowing the interview recording on the informed consent form, but “no” for retaining the recording for future research (final text before signatures on the informed consent form), will be allowed to participate in the study.


After respondents have been briefed on the purpose of the study and the procedures that CCQDER routinely takes to protect human subjects, respondents will be asked to read and sign an Informed Consent (Attachment 4). Only project specific information has been added to the document. Respondents will also be asked to fill in their demographic characteristics on the Respondent Data Collection Sheet (Attachment 5). This document is contained in our umbrella package. The burden for completion of this form is captured in the interview.


The interviewer will then ask the respondent to confirm that he/she understands the information in the Informed Consent, and then state that we would like to record the interview. The recorder will be turned on once it is clear that the procedures are understood and agreed upon.


The interviewer will then orient the respondent to the cognitive interview with the following introduction:


[fill staff name] may have told you that we will be working on some questions that will eventually be added to national surveys. Before that happens, we like to test them out on a variety of people. The questions we are testing today are about pain and opioid use. We are interested in your answers, but also in how you go about making them. I may also ask you questions about the questions—whether they make sense, what you think about when you hear certain words, and so on.


I will read each question to you, and I’d like you to answer as best you can. Please try to tell me what you are thinking as you figure out how to answer. Also, please tell me if:

there are words you don’t understand,

the question doesn’t make sense to you,

you could interpret it more than one way,

it seems out of order,

or if the answer you are looking for is not provided.


The more you can tell us, the more useful it will be to us as we try to develop better questions. Okay? Do you have any questions before we start? If yes, answer questions. If not, let’s get started.



The interview will conclude with the card sorting activity. Respondents will be invited to look through the set of cards that both show and name pain management drugs (see Attachment 7). Respondents will then be directed to “group the medicines that you think are similar” and “separate the medicines that you think are different” into however many groups they want. Once respondents have grouped the cards, they will be asked to name and describe each group. Interviewers will probe respondents about the content of each group and why they made the sorting decisions they did. Interviewers will record both this qualitative data and the composition of the respondent’s individual groups. Qualitative data will be analyzed in the typical manner, in conjunction with the rest of the information collected during the cognitive interview. The grouping data will be analyzed across both the full set of respondents and sub-groups (such as income- or geographic-based groups), using analyses of similarity matrices (for example, cluster analysis and multidimensional scaling).
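As a hedged illustration of the analysis step described above, the sketch below shows how an aggregated similarity matrix could be converted to distances and examined with hierarchical cluster analysis and multidimensional scaling using standard SciPy and scikit-learn routines. The item list and matrix values are hypothetical placeholders, not study data, and this is a sketch rather than the study’s actual analysis code.

```python
# Illustrative sketch only: hierarchical clustering and multidimensional
# scaling of an aggregated card-sort similarity matrix. The items and the
# matrix values are hypothetical placeholders, not study data.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform
from sklearn.manifold import MDS

items = ["oxycodone", "hydrocodone", "ibuprofen", "aspirin", "acetaminophen"]

# similarity[i, j] = proportion of respondents who placed items i and j
# in the same pile (values invented for illustration).
similarity = np.array([
    [1.0, 0.9, 0.1, 0.1, 0.4],
    [0.9, 1.0, 0.1, 0.1, 0.4],
    [0.1, 0.1, 1.0, 0.8, 0.6],
    [0.1, 0.1, 0.8, 1.0, 0.6],
    [0.4, 0.4, 0.6, 0.6, 1.0],
])
distance = 1.0 - similarity

# Hierarchical (average-linkage) clustering on the condensed distance matrix.
tree = linkage(squareform(distance, checks=False), method="average")
clusters = fcluster(tree, t=2, criterion="maxclust")
for item, cluster in zip(items, clusters):
    print(f"{item}: cluster {cluster}")

# Two-dimensional MDS layout of the same distances.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(distance)
for item, (x, y) in zip(items, coords):
    print(f"{item}: ({x:.2f}, {y:.2f})")
```

The same steps could be repeated within sub-groups of respondents to compare how different groups organize the domain.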

After the interview, respondents will be given the thank-you letter (document contained in umbrella package) signed by Charles J. Rothwell, Director of NCHS (Attachment 6), a copy of the informed consent document, and $40.


Extreme care will be taken with all recordings and paperwork from the interviews conducted off-site. Recordings and identifying paperwork will be stored in a secured travel case until returned to NCHS, at which point they will be transferred to the usual secured locked storage cabinets.


We propose giving each participant a $40 incentive, which is our standard amount. In total, the maximum respondent burden for this project will be 230 hours (30 hours of screening plus 200 hours of interviews). A burden table for this project is shown below:


Form Name                                          Number of      Number of Responses/   Average Hours   Response Burden
                                                   Participants   Participant            per Response    (in hours)

Screener (recruited from newspaper or database)    360            1                      5/60            30

Questionnaire                                      200            1                      1               200



Attachments (7)

cc:

V. Buie

J. Zirger

DHHS RCO


1 Weller, S.C. and A.K. Romney. 1988. Systematic Data Collection. Newbury Park, CA: Sage; Bernard, H.R. and G.W. Ryan. 2010. Analyzing Qualitative Data. Los Angeles, CA: Sage.

2 Trotter R.T. and J.M. Potter. 1993. “Pile sorts, a cognitive anthropological model of drug and AIDS risk for Navajo teenagers: Assessment of a new evaluation tool.” Drugs and Society 7: 23-39. DOI:10.1300/J023v07n03_03.

3 Han B, Compton WM, Blanco C, Crane E, Lee J, Jones CM. Prescription opioid use, misuse, and use disorders in U.S. adults: 2015 National Survey on Drug Use and Health. Ann Intern Med. 2017;167:293-301.


4 Willson, S. (2017). Cognitive Interview Evaluation of Survey Items to Measure Substance Use and Impaired Driving. National Center for Health Statistics. Hyattsville, MD. https://wwwn.cdc.gov/QBank/Report.aspx?1186. Accessed 4/9/2018.
