2019 OMB addendum Field test memo


2019-20 National Survey on Drug Use and Health (NSDUH)


OMB: 0930-0110


National Survey on Drug Use and Health (NSDUH) 2020 Redesign Field Test (FT) Details


Overview

In order to continue producing accurate and high-quality data on substance use and mental health, SAMHSA’s Center for Behavioral Health Statistics and Quality (CBHSQ) must update the NSDUH periodically to reflect changing substance use and mental health issues, as well as changing field conditions, technology, and methodology. CBHSQ is planning to redesign the NSDUH for the 2025 survey year. The redesign will seek to achieve two main goals: 1) revise the questionnaire and survey design to address changing policy and research data needs; and 2) modify the survey materials and methods to improve the quality of estimates and the efficiency of data collection. SAMHSA is requesting approval to conduct the Redesign Field Test (FT) to assess the effectiveness and potential impact of changes prior to implementation in the main study.


FT data collection is currently scheduled for August through November 2020. During that time, approximately 12,774 sample dwelling units (SDUs) will be contacted to yield approximately 8,110 completed screenings and approximately 4,000 completed interviews, administered in English only. The respondent universe for the FT is the civilian, noninstitutionalized population aged 12 or older in the contiguous United States. FT eligibility will be determined based on where residents of SDUs live for most of the time during the months of FT data collection (August, September, October, and November 2020).


A key element of the FT data collection is an assessment of the impact of a new screening incentive and an increased interview incentive. These incentives will be administered as part of a 2x2 experiment, crossing the screening incentive amounts (new $5 vs. current $0) with the interview incentive amounts (new $50 vs. current $30). The purpose of the incentive test is to assess whether adding these incentives decreases nonresponse rates, reduces nonresponse bias, and decreases field interviewer level of effort.


Another key element of the FT is to test a revised NSDUH questionnaire. The revised interview questionnaire has undergone major changes informed by expert review and cognitive testing. The revised questionnaire will include new items on vaping, marijuana mode of administration, synthetic drugs, chronic pain, and sleep disturbance, in addition to modifications to existing questions and modules to update content and terminology, reduce wordiness, and ensure consistency across survey items. The field test data will be compared to estimates from the current NSDUH, and new items will be benchmarked to existing data sources when possible.


Additional elements of the FT assessment include:

  • Revised respondent materials including the Lead Letter, the Study Description, and the Question and Answer Brochure;

  • Restructured screening instrument to reduce respondent and interviewer burden;

  • Replacement of the current paper Quality Control Form with an electronic form.


The FT will also include testing for possible effects on data quality (as measured by outcomes such as unit nonresponse, item nonresponse, and survey response), questionnaire timing, data collection efficiency, and differences in the reporting of substance use or mental health items and their impact on estimates. The FT is essential for providing a thorough examination of these changes prior to their deployment in the main study NSDUH.


Background

Preparations for the FT have been ongoing since the spring of 2017 with a variety of research activities. In April 2017, CBHSQ solicited comments from the public regarding the possibility of a NSDUH redesign via a Federal Register Notice [FR Doc. 2017-08400]. The FRN received comments from Federal agencies, researchers, and other data users and served as a foundation for the redesign. CBHSQ also held substantive topic meetings with internal SAMHSA stakeholders to understand how the NSDUH could better fit the agency’s data needs.


As part of the early redesign work, NSDUH staff met with the National Center for Health Statistics staff who work on the National Health Interview Survey (NHIS) and National Health and Nutrition Examination Survey (NHANES). The purpose of the meetings was to learn more about each survey and its current innovations and to gain insight into their recent survey redesign procedures. Both the NHIS and NHANES staff provided valuable information about their surveys and their redesign processes, and information gained from the NHANES staff was instrumental to the design of the FT incentive test.


Key Element 1: Incentive Experiment

Similar to other national surveys, response rates for NSDUH have been declining (Center for Behavioral Health Statistics and Quality, 2013; 2014; 2015; 2016; 2017), and this is particularly true for screening response rates. Adding a screening incentive and increasing the main interview incentive could increase participation at both the screening and interviewing stages. If lower response rates reflect systematic shifts in who is being screened, then increasing screening response could reduce nonresponse at both stages and, therefore, reduce the potential for nonresponse bias in key NSDUH estimates.


The literature on incentives demonstrates that monetary incentives are effective in increasing survey cooperation rates. Multiple studies have shown that incentives tend to increase participation among sample members who are less interested in or involved with the survey topic (Groves, Singer, & Corning, 2000; Groves, Presser, & Dipko, 2004; Groves et al., 2006). Research has also demonstrated that respondent incentives (either at the screening or interview stage) can increase response rates and, in some situations, can reduce nonresponse bias in survey estimates (Armstrong, 1975; Church, 1993; Groves & Couper, 1998; Groves et al., 2006; Kulka, 1994; Singer, 2002; Singer & Ye, 2013).


Like many other national surveys, NSDUH has experienced steadily declining response rates in recent years (Williams & Brick, 2018). The weighted NSDUH screening response rate (SRR) for the most recent completed year (2018) was 73.3%, which is about 2 percentage points lower than in 2017 and over 4 percentage points lower than in 2016. The weighted NSDUH interview response rate (IRR) for 2018 was 66.6%, about 1 percentage point lower than in 2017 and about 2 percentage points lower than in 2016. Despite increases in the level of effort to complete screenings and interviews (as measured in terms of number of contact attempts), response rates continue to decline, particularly at the screening stage. Increasing screening response could reduce nonresponse bias by bringing in households whose residents are less interested in substance use or mental health issues.


Currently, NSDUH interview respondents are offered a $30 cash incentive only when they complete the main interview. No incentive is offered for completing the household screening or as part of refusal conversion efforts. As part of the FT, SAMHSA would like to evaluate whether adding a screening incentive and increasing the interview incentive increase the likelihood of participation in the household screening and subsequent interview(s). The NSDUH incentive experiment will involve testing both a new screening incentive ($5 vs. $0) and an increased interview incentive ($50 vs. $30). For more information on the screening and interview incentive experimental design and power analysis, see Attachment FT-1.


The higher interview incentive amount is proposed based on a review of other nationally-representative in-person surveys that have recently conducted experimental tests to determine the impact of increasing the interview incentive amount. Examples of other nationally-representative in-person surveys reviewed included:

  • Medical Expenditure Panel Survey (MEPS). The interview incentive was increased from $30 to $50 in 2011. A higher incentive amount of $70 has been tested, but not implemented.

  • Panel Study of Income Dynamics (PSID). Between 2002 and 2015, the PSID interview incentive increased in $5 increments three times, about every 3 to 4 years, from a starting incentive of $55 in 2003 to the current incentive of $70 in 2015.

  • National Survey of Family Growth (NSFG). The interview incentive has been $40 since 2006, with an increased incentive promised for nonresponse follow-up. A higher incentive amount of $60 has been tested, but not implemented.


These examples illustrate the value of testing different interview incentive amounts to determine whether increased and/or additional incentives increase response rates and reduce nonresponse bias in key estimates.


Evidence from many surveys across different data collection modes indicates that prepaid incentives are usually more effective than promised (or conditional) incentives, especially for self-administered surveys (Gelman, Stevens, & Chan, 2002; Singer, 2002). Most of these studies use incentives delivered in advance by mail. For surveys where nearly all sample units are eligible to participate in the survey based on the established criteria for eligible dwelling units (DUs), sending prepaid screening incentives by mail can be cost-effective. Key factors in determining whether prepaid incentives are cost-effective are the eligibility rates of selected DUs and of individuals who live in the eligible DUs.


For NSDUH, it is unlikely that mailed prepaid incentives would be cost-effective. Each quarterly NSDUH data collection results in a significant number of ineligible SDUs, typically about 16 percent of SDUs, or about 32,000 households. It is expected that delivering a prepaid incentive in person, immediately after identifying an eligible screening respondent, will be more cost-effective than sending a prepaid incentive with the study lead letter. In addition, delivering prepaid screening incentives to eligible screening respondents supports the Belmont fairness principle and reflects the importance of stemming the decline in NSDUH screening response rates over the past decade. A prepaid screening incentive could also make the promised interview incentive more credible and hence more effective, again supporting and possibly boosting main interview response rates.
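
A back-of-the-envelope sketch of this cost consideration is shown below. It assumes a hypothetical $5 prepaid incentive mailed with the lead letter to every SDU and uses the roughly 32,000 ineligible households cited above; the figure is illustrative only and is not from the FT budget.

```python
# Back-of-the-envelope sketch (illustrative assumption, not an FT budget figure):
# dollars that a hypothetical $5 mailed prepaid screening incentive would send to
# households that turn out to be ineligible for screening.
ineligible_households = 32_000   # approximate count of ineligible SDUs cited above
prepaid_amount = 5               # hypothetical prepaid screening incentive, in dollars

unrecoverable = ineligible_households * prepaid_amount
print(f"~${unrecoverable:,} mailed to SDUs that cannot be screened")
```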


We propose a 2x2 design to assess the separate impacts of the screening and increased interview incentive on response rates, and therefore, potential reduction in nonresponse bias. Table 1 presents the four possible combinations of screening and interview incentive amounts that would comprise the four experimental conditions.


Table 1. Four Possible Experimental Conditions for the Incentive Experiment

                                Screening Incentive Amounts
Interview Incentive Amounts     $0 screening                      $5 screening
$30 interview                   Condition 1:                      Condition 2:
                                $0 screening + $30 interview      $5 screening + $30 interview
$50 interview                   Condition 3:                      Condition 4:
                                $0 screening + $50 interview      $5 screening + $50 interview

Note: Condition 1 represents current NSDUH practice.
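
The randomization plan for the FT is specified in Attachment FT-1; the sketch below is only a minimal illustration of how whole segments could be assigned to the four conditions in Table 1, balanced within strata. The segment identifiers, stratum labels, and the simple dealing scheme are assumptions for illustration.

```python
# Minimal illustration (not the actual FT randomization plan) of assigning whole
# segments to the four incentive conditions in Table 1, balanced within strata.
import random

CONDITIONS = {
    1: {"screening": 0, "interview": 30},   # Condition 1 = current NSDUH practice
    2: {"screening": 5, "interview": 30},
    3: {"screening": 0, "interview": 50},
    4: {"screening": 5, "interview": 50},
}

def assign_segments(segments_by_stratum, seed=2020):
    """Return {segment_id: condition_number}, balanced within each stratum."""
    rng = random.Random(seed)
    assignment = {}
    for segments in segments_by_stratum.values():
        shuffled = segments[:]
        rng.shuffle(shuffled)
        for i, seg in enumerate(shuffled):
            assignment[seg] = 1 + (i % 4)    # deal segments to conditions 1-4 in turn
    return assignment

# Hypothetical example: four segments in each of two response-rate strata.
example = {"low": ["S001", "S002", "S003", "S004"],
           "high": ["S005", "S006", "S007", "S008"]}
print(assign_segments(example))
```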


The screening and interview incentive experiment will be used to assess:

  1. The impact of offering a screening incentive on screening response rates (SRR).

  2. The impact of a higher interview incentive on screening and interview response rates (IRR).

  3. The impact of offering a screening incentive on nonresponse bias by examining the demographic composition of households screened.


We will use results of the experiment to make recommendations on the use of incentives in the redesigned main NSDUH study (proposed 2025). The screening incentive and increased interview incentive will be considered effective if the NSDUH FT results show meaningful differences between experimental conditions that are statistically significant.


We project that an increase of approximately 5% in either the screening or interview response rate will be meaningful. Our study design will be able to detect differences in screening response rates of at least 4.7% and differences in interview response rates of at least 5.2%. We will interpret differences at or above these thresholds as statistically significant and meaningfully different; as such, these values represent our indicators of incentive effectiveness. In addition, we expect the screening incentive to impact the demographic composition of households screened. We will consider this to have occurred if, for one or more demographic characteristics, (1) the estimate differs significantly between the no-incentive and $5 incentive conditions and (2) the estimate from the $5 incentive condition is closer to American Community Survey (ACS) estimates. For age groups, marginal mean differences ranging from 1.8% to 4.0% would be detectable as statistically significant; for gender, a marginal mean difference of 3.3%; and for race/ethnicity, marginal mean differences ranging from 2.8% to 5.0%. Because nonresponse bias is a property of each survey estimate, differences in response rates could result in no practical impact on nonresponse bias for some estimates and a meaningful impact for others.
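
The detectable differences above come from the power analysis in Attachment FT-1, which accounts for the clustered, stratified design. For orientation only, the sketch below shows the standard two-arm minimum-detectable-difference calculation for a single proportion such as the screening response rate, assuming equal arm sizes and no design effect; because it ignores clustering, it will not reproduce the 4.7% and 5.2% figures, and the inputs shown are assumptions.

```python
# Simplified minimum-detectable-difference (MDD) sketch for two independent
# proportions; ignores the clustering and design effects handled in Attachment FT-1.
from scipy.stats import norm

def mdd_two_proportions(p_base, n_per_arm, alpha=0.05, power=0.80):
    """Normal-approximation MDD for two proportions with equal arm sizes."""
    z_alpha = norm.ppf(1 - alpha / 2)   # two-sided test
    z_beta = norm.ppf(power)
    se_diff = (2 * p_base * (1 - p_base) / n_per_arm) ** 0.5
    return (z_alpha + z_beta) * se_diff

# Assumed inputs: baseline near the 2018 screening response rate (73.3%) and
# roughly half of the ~10,800 expected eligible dwelling units in each arm.
print(round(mdd_two_proportions(p_base=0.733, n_per_arm=5_400), 3))
```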


An important outcome of the incentive experiment will be to make a clear, informed decision about whether the addition of a screening and/or increase in an interview incentive is effective in increasing response rates and potentially limiting nonresponse bias. The experimental design will determine whether the screening incentive has a significant effect on response rates independent of the increased interview incentive, whether the increased interview incentive has the primary impact on screening and interview response rates, or whether a specific combination of the screening and interview incentive amounts has the greatest impact on response rates. The incentive experiment provides a foundation for an evidence-based decision on whether adding a screening incentive, increasing the interview incentive, or a combination of these incentives, may improve response rates and mitigate nonresponse bias in the Redesigned NSDUH.


Key Element 2: Revised Interview Questionnaire

For planned changes to the NSDUH interview questionnaire, development activities included multiple literature reviews, individual consultations between internal and contractor instrumentation experts, expert panel webinars with substantive experts, and several in-person expert panel meetings.


The substantive experts external to CBHSQ in mental health and substance use assessment, listed by topic, include:


Criminal Justice:

  • Jennifer Bronson, Bureau of Justice Statistics

  • Deborah Dawes, RTI International

  • Lauren Glaze, Bureau of Justice Statistics

  • Laura Maruschak, Bureau of Justice Statistics

  • Tracy Snell, Bureau of Justice Statistics


Marijuana:

  • Jane Allen, RTI International

  • Timothy Lefever, RTI International

  • Michael Pemberton, RTI International (Also provided expertise on Vaping, Youth Mental Health)


Substance Use Disorder:

  • Paul C. Beatty, Center for Survey Measurement, U.S. Census Bureau

  • Raul Caetano, Pacific Institute for Research and Evaluation

  • Glorisa Canino, University of Puerto Rico (Also provided expertise on Youth Mental Health)

  • Michael First, Columbia University

  • Prudence Fisher, Columbia University

  • Gary Giovino, SUNY Buffalo (Also provided expertise on vaping)

  • Deborah Hasin, Columbia University

  • Aaron Hogue, The National Center on Addiction and Substance Abuse

  • James Jackson, University of Michigan

  • Silvia Martins, Columbia University

  • Brent Moore, Yale School of Medicine

  • Patrick O’Malley, University of Michigan


Substance Use and Mental Health Treatment:

  • Kathy Batts, RTI International

  • Kevin Conway, RTI International

  • Ben Druss, Emory University

  • Christine Grella, University of California, Los Angeles

  • Rick Harwood, National Association of State Alcohol and Drug Abuse Directors

  • Valerie Hoffman, RTI International

  • Mark Olfson, Columbia University Medical Center (Also provided expertise on Youth Mental Health)

  • David Shern, Mental Health America

  • Leyla Stambaugh, RTI International (Also provided expertise on Youth Mental Health)

  • Terry Zobeck, Office of National Drug Control Policy

  • Sam Zuvekas, Agency for Healthcare Research and Quality (Also provided expertise on Youth Mental Health)


Vaping:

  • Jessica Barrington-Trimis, University of Southern California

  • Blair Coleman, U.S. Food & Drug Administration

  • Jonathan Foulds, Penn State University

  • Gary Giovino, SUNY Buffalo (Also provided expertise on Substance Use Disorder)

  • Rachel Grana Mayne, National Cancer Institute

  • Bonnie Halpern-Travers, Roswell Park

  • Brian King, Centers for Disease Control and Prevention

  • Suchitra Krishnan-Sarin, Yale University

  • Youn Lee, RTI International

  • Richard Miech, University of Michigan

  • James Nonnemaker, RTI International

  • Jennifer Pearson, University of Nevada, Reno

  • Nicholas Peiper, RTI International

  • Michael Pemberton, RTI International (Also provided expertise on Marijuana, Youth Mental Health)

  • Jessica Pepper, RTI International

  • Saul Shiffman, Pinney Associates

  • Erin Sutfin, Wake Forest School of Medicine

  • Andrea Villanti, University of Vermont

  • Jennifer Wagner, U.S. Department of Health and Human Services

  • Jon Zibbell, RTI International


Youth Mental Health:

  • Carla Bann, RTI International

  • Stephen Blumberg, National Center for Health Statistics, Centers for Disease Control and Prevention

  • Glorisa Canino, University of Puerto Rico (Also provided expertise on Substance Use Disorder)

  • William Copeland, Duke University Medical Center

  • Prudence Fisher, Columbia University

  • Robert Goodman, King's College, London

  • Ronald Kessler, Harvard Medical School

  • Natasha Latzman, RTI International

  • Christopher Lucas, Columbia University

  • Kathleen Merikangas, National Institute of Mental Health

  • Mark Olfson, Columbia University Medical Center (Also provided expertise on Substance Use and Mental Health Treatment)

  • Michael Pemberton, RTI International (Also provided expertise on Marijuana, Vaping)

  • Heather Ringeisen, RTI International

  • Leyla Stambaugh, RTI International (Also provided expertise on Substance Use and Mental Health Treatment)

  • Sam Zuvekas, Agency for Healthcare Research and Quality (Also provided expertise on Substance Use and Mental Health Treatment)


The goal of the various expert reviews was to discuss new content, develop and/or revise questions, and ensure the new interview questions will be analytically useful and easily and accurately answered by survey respondents.


Next, revised questions on substance use, mental health, and associated covariates were tested with 189 respondents across three phases of cognitive interviewing (OMB No. 0930-0290). The cognitive testing had four primary goals:

  • To test revisions to substance use questions that will reflect changes in the type and forms of drugs available (e.g., marijuana concentrates, synthetic drugs), changes in the street names used for different substances, changes in the methods of administration of these drugs (e.g., vaping), and changes in the type of substance use treatment methods available (such as medication-assisted treatment)

  • To test modifications designed to improve NSDUH’s current mental health measures, which involved adapting the current NSDUH questions on adult suicide ideation for youth respondents, replacing youth Major Depressive Episode measures with a wider range of mental health indicators, and improving measures of mental health treatment.

  • To test new and revised questions on sleep disturbance, chronic pain, and experiences with the criminal justice system, as these are important covariates of substance use and mental health that improve the utility of NSDUH data.

  • To assess usability issues related to self-administered survey questionnaires, specifically how well respondents select their answers and navigate through the instrument.


As a result of this work, changes were made throughout the NSDUH instrument.


Key additions and revisions to the NSDUH questionnaire for implementation and assessment as part of the FT are summarized below:

  • Revisions to questions about military family members;

  • Revised education questions, which were moved to the interviewer-administered portion of the survey to improve accuracy;

  • Wording changes throughout the Marijuana module, new marijuana mode of administration questions, and updated marijuana market module;

  • New measures of Electronic Nicotine Delivery Systems (ENDS) questions (otherwise known as vaping) and new items on vaping substances other than nicotine or marijuana;

  • New measures of synthetic drugs, including synthetic marijuana, stimulants, and illegally manufactured fentanyl;

  • Revisions to the Substance Use Disorder module to incorporate the criteria based on the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5; American Psychiatric Association, 2013), which have been cognitively tested and will be validated in the 2020 Clinical Validation Study;

  • Revised module on substance use treatment (replacing the prior Drug Treatment module) and a new Mental Health Service Utilization module (replacing the prior individual adolescent and adult modules), plus changes to treatment for depression items and addition of a new item on mental health screening;

  • Addition of measures of adolescent psychological distress and/or impairment to replace the youth Major Depressive Episode module;

  • Expansion of suicide items to be asked of youth;

  • New measures of chronic pain and sleep disturbance;

  • Revisions to needle use items in select substance use modules;

  • Revisions to the perceived risk questions;

  • Revisions to the criminal justice items;

  • Addition of new items on the characteristics of underage drinking;

  • Combining the tranquilizers and sedatives modules; and

  • Other changes to the questionnaire were made to update the street names for drugs, reduce wordiness, and ensure consistency across survey items.


A paper representation of the entire NSDUH FT interview questionnaire is included as Attachment FT-2, with the various changes specific to the FT highlighted.


Additional Elements: Revised Respondent Materials, Restructured Screening Instrument, and Electronic Quality Control Form


Another key component of the redesign process was collecting feedback directly from NSDUH field staff and management. Given their direct interaction with NSDUH respondents, their insights are uniquely valuable in assessing feasibility and identifying areas of potential improvement. Through interactive online feedback sessions, field staff provided insight on existing procedures, with specific feedback on the screening instrument, respondent materials, and ways to improve interactions in controlled-access situations such as senior living and gated communities. Design and content changes to NSDUH respondent materials and screening specifications for the FT were initially developed based on these online feedback sessions with NSDUH field staff and managers.


In addition, substantive experts, with relevant experience on large-scale field studies, provided feedback on the presentation and messaging of the revised Lead Letter, Study Description, and Question and Answer Brochure.


Expert reviewers of the respondent materials included the following persons:

  • Reem Ghandour, Health Resources and Services Administration (HRSA)

  • Jeffery Rhoades, Agency for Healthcare Research and Quality (AHRQ)

  • Amy Conley, RTI International

  • Susan Kinsey, RTI International

  • Sue Pedrazzani, RTI International.


Lead letter

Although the FT Lead Letter is similar to the NSDUH main study version, the following design and content modifications were made for the FT:

  • An improved overall format and professional look;

  • Reorganization of messages to better capture attention of the reader;

  • Additional bullet points to increase readability;

  • Additional text to clarify the sequence of screening and interviewing steps and to clarify when incentives are given; and

  • A revised NSDUH website address.


Study Description

The following design and content modifications were made to the Study Description for the FT:

  • Use of color and bold font to better catch the attention of the reader and to emphasize important messaging;

  • Elimination of the signature line to help distinguish from the Lead Letter;

  • Revisions to clarify the sequence of screening and interviewing steps to match the Lead Letter;

  • Additional text to clarify when incentives are given; and

  • An increased emphasis on privacy and confidentiality.


Question & Answer Brochure

The following design and content modifications were made to the Q&A Brochure for the FT:

  • Addition of the NSDUH logo to front cover;

  • Reformatting of text on the front cover to reduce emphasis on “drug use;”

  • Addition of the tagline “A trusted source of data since 1971” to front cover;

  • Increased emphasis on privacy and confidentiality;

  • Removal of third reference to the full study name on inside panel header;

  • Addition of a Quick Response (QR) code;

  • Revised NSDUH website address;

  • Addition of the toll-free help line and website for treatment referral; and

  • Modifications to the formatting and use of color blocks to emphasize messaging in text boxes and pictures.


Restructured Screening Instrument

Usability testing on the restructured screening instrument is planned for late 2019 and will be conducted as internal testing with field staff to assess how well the revised screening instrument performs and how satisfied users are with it. Changes to the NSDUH screening that will be implemented and assessed in the FT to reduce interviewer and/or respondent burden include:

  • Eliminating the use of the householder designation (“someone who owns or rents this home”) to define the first person rostered and instead designating the screening respondent as the first person rostered;

  • Changing the age question to ask about current age rather than age at last birthday;

  • Revising the roster structure;

  • Modifying the instrument and procedures to incorporate the screening incentive; and

  • Modifying the verification screen.


Electronic Quality Control Form

Currently, NSDUH uses a paper form to collect respondent contact information for fieldwork verification purposes. Following an interview, the field interviewer (FI) mails the form in an envelope sealed by the respondent. The forms are received at the Contractor’s Research Operations Center to be keyed for telephone or mail verification procedures. In order to reduce respondent and FI burden, reduce administrative burden and cost, and streamline the verification process, the Redesign FT will use an electronic Quality Control Form. As part of the 2018 National Mental Health Study-Field Test (NMHS-FT, OMB Control No. 0930-0290), an electronic Quality Control Form was used with great success. An electronic Quality Control Form that uses the NMHS-FT version as a template was developed for use during the NSDUH Redesign FT (see Appendix X). The overall success of the electronic Quality Control Form will be assessed as part of the FT.


Respondent Universe and Sampling Methods

Similar to the respondent universe for the annual NSDUH main study, the respondent universe for the FT is the civilian, noninstitutionalized U.S. population aged 12 or older. To control costs, individuals residing in Alaska and Hawaii will be excluded from the FT. Unlike the main study, only respondents who can complete the screening and interview in English will be included in the FT.


As shown in Table 2, approximately 356 segments and 12,774 SDUs will be needed to yield approximately 8,110 completed screening interviews and approximately 4,000 completed interviews. State sampling regions (SSRs), defined as contiguous groups of census tracts in the main NSDUH study, will be used as primary sampling units (PSUs) in the FT. To achieve representation of the age-eligible, English-speaking population in the contiguous United States, a probability proportional to size (PPS) sample of 89 (of 726) of the NSDUH SSRs will be selected. Four segments will be selected in each SSR. The age allocation for interviews will be the same as the current NSDUH main study: 25 percent aged 12 to 17, 25 percent aged 18 to 25, and 50 percent aged 26 or older.


Prior to selecting the FT sample, SSRs will be stratified by historical response rate experience. Response rate experience from the 2018 NSDUH (or pooled 2017-2018 data) will be used to identify an appropriate number of strata and strata cut points. For example, response rate strata may be defined using response rate quartiles as follows: low (lower quartile), medium (second and third quartiles), and high (upper quartile). Response rate experience was chosen as the stratification variable because screening and interview response rates are key outcomes for the FT. Additional implicit stratification will be achieved by sorting SSRs by mean income, computed using prior year NSDUH data, within strata. Sorting SSRs by income is useful because household income level is likely to be correlated with response to the different screening and interview incentive amounts. A proportional number of SSRs will be selected from the sorted frame in each stratum with probability proportional to size. This design will maximize the efficiency (i.e., increase precision) of the FT estimates by reducing variation in the weights while controlling for factors known to be associated with screening and interview response. Within each selected SSR, SDUs will be selected from retired NSDUH segments.
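
A minimal sketch of the selection logic described above follows: group SSRs by response-rate stratum, sort by mean income within each stratum (the implicit stratification), and draw a probability-proportional-to-size systematic sample from each sorted frame. The field names and the proportional-allocation step are assumptions for illustration, not the Contractor's production sampling specifications.

```python
# Sketch of stratified, implicitly sorted PPS systematic selection of SSRs.
# Field names ("ssr_id", "stratum", "mean_income", "size") are hypothetical.
import random

def pps_systematic(frame, n_select, size_key="size", rng=random):
    """PPS systematic selection from an already-sorted frame of dicts."""
    total = sum(u[size_key] for u in frame)
    interval = total / n_select
    start = rng.uniform(0, interval)
    points = [start + k * interval for k in range(n_select)]
    picks, cum, i = [], 0.0, 0
    for unit in frame:
        cum += unit[size_key]
        while i < n_select and points[i] <= cum:
            picks.append(unit["ssr_id"])
            i += 1
    return picks

def select_ssrs(ssrs, n_total=89):
    """Stratify by historical response rate, sort by mean income within stratum,
    then draw a proportional PPS systematic sample in each stratum."""
    selected = []
    for stratum in sorted({u["stratum"] for u in ssrs}):
        frame = sorted((u for u in ssrs if u["stratum"] == stratum),
                       key=lambda u: u["mean_income"])
        n_h = max(1, round(n_total * len(frame) / len(ssrs)))   # proportional allocation
        selected.extend(pps_systematic(frame, n_h))
    return selected
```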


The stratified, clustered sampling design for the FT requires that the design structure be taken into consideration when computing variances of survey estimates. Key nesting variables will be created to capture explicit stratification and to identify clustering. FT sampling strata will serve as variance strata. Then, each sampled SSR within a variance stratum will be assigned to a replicate (i.e., the PSUs are replicates). SSRs within variance strata are expected to be similar based on historical response rate experience. Further, the FT will have a sufficient number of strata and degrees of freedom (86) to support stable variance estimation and hypothesis testing for the incentive analyses and other FT analyses.
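
To illustrate the nesting just described, the sketch below applies the standard with-replacement, between-PSU variance formula, treating FT sampling strata as variance strata and sampled SSRs as the PSUs; the input structure is hypothetical. The final line shows the degrees-of-freedom arithmetic (PSUs minus strata) behind the 86 degrees of freedom cited above.

```python
# Illustrative with-replacement, between-PSU variance formula for a weighted total;
# strata are the FT sampling strata and PSUs are the sampled SSRs (not production code).
from collections import defaultdict

def between_psu_variance(psu_totals):
    """psu_totals: iterable of (stratum, weighted_psu_total) pairs."""
    by_stratum = defaultdict(list)
    for stratum, total in psu_totals:
        by_stratum[stratum].append(total)
    variance = 0.0
    for totals in by_stratum.values():
        n_h = len(totals)
        if n_h < 2:
            continue   # a single-PSU stratum contributes no estimable variance
        mean_h = sum(totals) / n_h
        variance += n_h / (n_h - 1) * sum((t - mean_h) ** 2 for t in totals)
    return variance

# Degrees of freedom under this nesting: number of PSUs minus number of strata.
print(89 - 3)   # 89 sampled SSRs, 3 assumed response-rate strata -> 86
```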


Table 2. Summary of the 2020 Field Test Sample Design

Statistic                                          Total     Expected Rate
State Sampling Regions (SSRs)                         89
Segments                                             356
Sample Dwelling Units (SDUs)                      12,774
Expected Eligible Dwelling Units                  10,798      0.85
Expected Completed Screening Interviews            8,110      0.75
Expected Selected Persons                          6,206
Expected Eligible Persons (English-Speaking)       5,958      0.96
Expected Completed Interviews                      4,000      0.67
    12-17                                          1,000
    18-25                                          1,000
    26+                                            2,000

NOTE: Outcome rates were computed using 2017 NSDUH data, excluding Alaska and Hawaii. Also, at this time, the assumption is that there will be three strata based on historical response rate experience.
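
The chain of expected yields in Table 2 can be reproduced approximately from the SDU count and the rounded rates shown; the sketch below is arithmetic only, so small differences from the table reflect rounding of the rates.

```python
# Approximate arithmetic behind Table 2, using the rounded rates shown in the table.
sdus = 12_774
eligible_dus = sdus * 0.85                    # Table 2 shows 10,798
screenings = eligible_dus * 0.75              # Table 2 shows 8,110
selected_persons = 6_206                      # taken directly from Table 2
eligible_persons = selected_persons * 0.96    # Table 2 shows 5,958
interviews = eligible_persons * 0.67          # Table 2 shows 4,000
print(round(eligible_dus), round(screenings), round(eligible_persons), round(interviews))
```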


Information Collection Procedures

Unless otherwise specified, the FT procedures will follow the same processes as planned for the 2020 NSDUH main study.


Prior to the FI’s arrival at the SDU, a Lead Letter (see Attachment FT-3) will be mailed to the resident(s) briefly explaining the survey and requesting their cooperation. In the FT, four different versions of the Lead Letter will be utilized, specific to the incentive conditions selected for the SDUs within each respective segment (e.g., SDUs in a segment assigned to the $50 interview incentive and $5 screening incentive conditions will receive letters noting those specific amounts).


Upon arrival at the SDU, the FI will refer the resident to this letter and answer any questions. If the resident has no knowledge of the Lead Letter, the FI will provide another copy, explain that one was previously sent, and then answer any questions. If no one is home during the initial visit to the SDU, the FI may leave a Sorry I Missed You Card (Attachment FT-4) informing the resident(s) that the FI plans to make another callback at a later date/time. Callbacks will be made as soon as feasible following the initial visit. FIs will attempt to make at least four callbacks (in addition to the initial call) to each SDU in order to complete the screening process and complete an interview, if yielded.


If the FI is unable to contact anyone at the SDU after repeated attempts, that FI’s Field Supervisor (FS) may send an Unable-to-Contact (UTC) letter (Attachment FT-5). These UTC letters reiterate information contained in the Lead Letter and present a plea for the resident to participate in the study. If after sending the UTC letter, an FI is still unable to contact anyone at an SDU, a Call-Me letter (Attachment FT-5) may be sent to the SDU requesting that the resident(s) call the FS as soon as possible to set up an appointment for the FI to visit the resident(s).


When in-person contact is made with an adult member of the SDU and introductory procedures are completed, the FI will present a Study Description (Attachment FT-6) and answer any questions that person might have concerning the study. For the FT, two different versions of the Study Description will be utilized, specific to the interview incentive conditions for each SDU ($50 or $30). For SDUs assigned to the $5 screening incentive condition, the FI will provide the $5 screening incentive as part of the introductory procedures.


A Question & Answer Brochure (Attachment FT-7) that provides answers to commonly-asked questions may also be given. For the FT, two different versions of the brochure will be utilized, specific to the interview incentive conditions for each SDU ($50 or $30).


In addition, FIs are supplied with copies of the NSDUH Highlights & Newspaper Articles (Attachment FT-8), which can be left with the respondent, for use in informing sample members of the importance of NSDUH data and eliciting participation.


Also, the FI may utilize the multimedia capability of the touch-screen tablet to display one of two short videos (approximately 60 seconds total run time per video) for members of the SDU to view, which provides a brief explanation of the study and why participation is important. The scripts for these videos are included as Attachment FT-9.


With the screening respondent’s (SR’s) cooperation, the FI will begin screening the SDU by asking either the Housing Unit Screening questions or the Group Quarters Unit Screening questions, as appropriate (Attachment FT-10). The screening questions will be administered using an eight-inch touch-screen Android tablet computer.


Race/ethnicity questions are FI-administered and meet all of the guidelines for the OMB minimum categories. The finer delineation of Guamanian or Chamorro and Samoan, which collapse into the OMB standard Native Hawaiian/Other Pacific Islander category, was added as a requirement of the new HHS Data Collection Standards and will continue to be included in the FT questionnaire.


As mentioned previously, updated versions of survey materials and methods are being included in the FT to improve the quality of estimates and the efficiency of data collection. This includes various updates to the format and flow of the screening questions.


Once all household members aged 12 or older have been rostered, the FI proceeds with the interview selection process where the tablet performs the within-dwelling-unit sampling process, selecting zero, one, or two members to complete the interview. Immediately after the tablet selection process, the FI will inform the SR about whether zero, one, or two members were selected to complete the interview.
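
The actual within-dwelling-unit selection algorithm and its selection probabilities are built into the tablet screening program; the toy sketch below only illustrates the idea of selecting zero, one, or two rostered members, and the probabilities shown are purely hypothetical.

```python
# Toy illustration only: NOT the NSDUH selection algorithm or its actual probabilities.
import random

def select_members(roster, p_zero=0.3, p_one=0.5, rng=random):
    """roster: rostered household members aged 12 or older. Returns 0, 1, or 2 picks."""
    draw = rng.random()
    if draw < p_zero or not roster:
        return []                          # no one selected; FI thanks the SR and ends
    if draw < p_zero + p_one or len(roster) == 1:
        return rng.sample(roster, 1)       # one member selected for the interview
    return rng.sample(roster, 2)           # two members selected for the interview

print(select_members(["member_A", "member_B", "member_C"]))
```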


At that point, the FI will focus on gaining cooperation from any SDU members selected to complete the interview. For cases with no one selected, the FI will thank the respondent and conclude the household contact. For cases where one or two members are selected for the interview, the FI will complete the following steps:

  • If the selected individual is aged 18 or older and is currently available, the FI immediately seeks to obtain informed consent. Once consent is obtained, the FI begins to administer the questionnaire in a private setting within the dwelling unit. As necessary and appropriate, the FI may make use of the Appointment Card (in Attachment FT-4) for scheduled return visits with the respondent.


  • If the selected individual is 12 to 17 years of age, the FI reads the parental introductory script (Attachment FT-11) to the parent or guardian before speaking with the youth about the interview. Subsequently, parental consent is sought from the selected individual’s parent or legal guardian using the parent section of the youth version of the Introduction and Informed Consent Scripts (Attachment FT-12). Once parental consent is granted, the minor is then asked to participate using the youth section of the same document. If assent is received, the FI begins to administer the questionnaire in a private setting within the dwelling unit with at least one parent, guardian or another adult remaining present in the home throughout the interview.


For cases with no one selected for the interview, the FI will collect verification information by asking for a name and phone number for use in verifying the quality of the FI’s work.


If a potential SR refuses to be screened, the FI has been trained to accept the refusal in a positive manner, thereby minimizing the possibility of creating an adversarial relationship that might preclude future opportunities for contact. The FS may then request that a Refusal Letter (Attachment FT-13) be sent to the residence. The letter addresses common concerns expressed by respondents, emphasizes important messages regarding the study, and asks the potential respondent to reconsider participation. Refusal letters are customized and also include the FS’s phone number in case the potential respondent has questions or would like to set up an appointment with the FI. Unless the respondent calls the FS or the Contractor’s office to refuse participation, an in-person conversion is then attempted by specially selected FIs with successful conversion experience.


Like the NSDUH main study, the FI will administer the FT interview in a prescribed and uniform manner with sensitive portions of the interview completed via Audio Computer-Assisted Self-Interviewing (ACASI). In order to facilitate the respondent’s recollection of prescription-type drugs and their proper names, pill images will appear on the laptop screen during the ACASI portions of interviews as appropriate in the FT. Also, respondents will use an electronic reference date calendar, which displays automatically on the computer screens when needed throughout the ACASI parts of the interview. Finally, in the FI-administered portion of the questionnaire, showcards are included in the FT Showcard Booklet (Attachment FT-14) that allow the respondent to refer to information necessary for accurate responses.


After the FT interview is completed and before the verification procedures begin, each respondent is given a $30 or $50 cash incentive, depending upon which experimental condition that case was pre-selected for, and an Interview Incentive Receipt (Attachment FT-15) signed by the FI. As mentioned previously, the interview incentive amounts ($30 or $50) will be randomized across segments so that approximately half of interview respondents will be offered $30 and half will be offered $50.


For verification purposes, the FT will implement a new electronic Quality Control Form. Like the 2020 main study, interview respondents will still be asked to provide their current phone number and address so Contractor staff can potentially call to ensure the FI did his or her job appropriately. However, unlike the main study, in the FT the FIs will record the respondent’s current phone number and confirm the address using the tablet rather than asking the respondent to fill out a paper Quality Control Form. In addition, for a youth interview respondent, the FI will enter the relationship of the parent or guardian who provided consent in the tablet rather than recording the relationship on a paper Quality Control Form. Respondents will be informed that completing the electronic Quality Control Form is voluntary. In previous NSDUHs, which used the paper Quality Control Form, less than one percent of the verification sample refused to fill it out.


FIs may give a Certificate of Participation (Attachment FT-16) to interested respondents after the interview is completed. Respondents may attempt to use these certificates to earn school or community service credit hours. As stated on the certificate, no guarantee of credit is made by SAMHSA or the Contractor. The respondent’s name is not written on the certificate. The FI signs his or her name and dates the certificate, but for confidentiality reasons the section for recording the respondent’s name is left blank. The respondent can fill in his/her name at a later time so the FI will not be made aware of the respondent’s identity. It is the respondent’s choice whether he or she would like to be identified as a NSDUH respondent by using the certificate in an attempt to obtain school or community service credit.


A random sample of those who complete the electronic Quality Control Forms will be contacted via telephone to answer a few questions verifying that the interview took place, that proper procedures were followed, and that the amount of time required to administer the interview was within expected parameters. The CATI Verification Scripts (Attachment FT-17) contain the scripts for these interview verification contacts via telephone, as well as the scripts used when verifying a percentage of certain completed screening cases in which no one was selected for an interview or the SDU was otherwise ineligible (vacant, not primary residence, not a dwelling unit, dwelling unit contains only military personnel, respondents living at the sampled residence for less than half of the quarter).


All interview data are transmitted on a regular basis via secure encrypted data transmission to the Contractor’s offices in a FIPS-Moderate environment, where the data are subsequently processed and prepared for reporting and data file delivery.


Information Technology Use

For the FT, FIs will be using the same equipment used for the 2020 main NSDUH study: the Samsung Galaxy Tab A 8.0 (an eight-inch, 32GB, Android-based tablet) for screenings, interview respondent selection, incentives, and answering FI observation questions, and the Dell Latitude 7490 laptop (with a 14-inch screen and the Windows 10 Professional operating system) for interviews. This equipment is very similar to the models used on NSDUH from 2015 to 2019. Also, these laptops are FIPS-Moderate compliant and secured with 2-factor login, using Microsoft’s integrated TPM-based 2-factor authentication mechanism for Windows 10 and BitLocker. In addition, the tablets are encrypted at rest using the FIPS 140-2 compliant device-level encryption facilities built into the implementation of the Android operating system running on each tablet.


Like the NSDUH main study, the FT data will be collected in a face-to-face interview setting in respondents’ homes using laptop computers. Interviews will be administered using ACASI for sensitive questions, which represent most of the interview. The remainder of the interview will be administered by the FIs using computer-assisted personal interviewing (CAPI). This mode has been used on NSDUH since 1999, with the interviewing program continually enhanced and expanded to take advantage of improvements in technology.


The CAPI/ACASI technology affords a number of advantages in the collection of NSDUH data. First, this methodology permits the questionnaire designer to incorporate routings into the questionnaire that would be overly complex or impossible with a paper-and-pencil questionnaire. The computer can be programmed to implement complex skip patterns and to fill specific words based on the respondent’s previous answers. FI and respondent errors caused by faulty implementation of skip instructions are virtually eliminated. Second, this methodology increases the consistency of the data. The computer can be programmed to identify inconsistent responses and attempt to resolve them through respondent prompts. This approach reduces the need for most manual and machine editing, thus saving both time and money. In addition, it is likely that respondent-resolved inconsistencies will result in data that are more accurate than when inconsistencies are resolved using editing rules. Third, in addition to the time and money saved by minimizing edits needed to resolve discrepancies, the ACASI technology reduces social desirability bias.
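
As a schematic illustration of the routing and consistency checks described above (hypothetical items, not actual NSDUH instrument logic or question wording), the sketch below shows a programmed skip pattern plus a simple inconsistency prompt of the kind the computer can resolve with the respondent in real time.

```python
# Schematic CAI routing and consistency check; hypothetical items, not NSDUH logic.
def route_cigarette_module(ever_smoked, age_first_smoked, current_age):
    if not ever_smoked:
        return "SKIP remaining cigarette items"        # skip pattern handled automatically
    if age_first_smoked is not None and age_first_smoked > current_age:
        # Inconsistent answers trigger an immediate respondent prompt rather than
        # being resolved later through manual or machine editing rules.
        return "PROMPT: age at first use exceeds current age; please re-enter"
    return "CONTINUE to frequency-of-use items"

print(route_cigarette_module(True, 45, 30))
```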


For the FT, questions administered via ACASI in the NSDUH main study interview will continue to be read aloud to respondents using Microsoft’s Speech Platform Text-to-Speech (TTS) software, which features a dynamic implementation mode that uses the TTS engine to read question text in real time and eliminates the use of pre-recorded audio files altogether. Since the integration of the Speech Platform software into all NSDUH questionnaires in 2015, there have been no reported problems with the pronunciation of any words or phrases produced by the TTS voice in English.


Payment to Respondents


Screening Incentive

Eligible SRs (adults aged 18 or older who live in the SDU) from SDUs assigned to one of the screening incentive conditions (approximately 50% of the FT sample) will be given $5 in cash during the screening introductory procedures. Adult SRs from SDUs not assigned to one of the screening incentive conditions (approximately 50% of the FT sample) will receive no cash incentive for the screening.


For SDUs assigned to one of the screening incentive conditions, the screening incentive will be mentioned in the following respondent materials: Lead Letter (Attachment FT-3); Study Description (Attachment FT-6); Screening Questions (Attachment FT-10); screening versions of the Unable-to-Contact, Controlled Access, and Call-Me Letters (Attachment FT-5); Refusal Letters (Attachment FT-13); and Interview Incentive Receipt (Attachment FT-15).


Interview Incentive

Adult interview respondents (aged 18 or older) and youth interview respondents (aged 12 to 17) will be given a cash incentive upon completion of the FT interview. Approximately half of the respondents will receive the current NSDUH interview incentive of $30; the other half will receive $50. On October 18, 2001, the initial use of an interview incentive ($30) was approved by OMB for use in the 2002 NSDUH. The 2002 NSDUH experienced an increase in the weighted overall response rate (screening * interviewing) from 67 percent to 71 percent. Since then, OMB approval was provided for the continued use of the $30 incentive for the 2003-2019 NSDUHs. The 2020 main study NSDUH will continue to use the $30 incentive.


The interview incentive will be mentioned in the following respondent materials: Lead Letter (Attachment FT-3); Question & Answer Brochure (Attachment FT-7); Study Description (Attachment FT-6); Introduction and Informed Consent Scripts (Attachment FT-12); Screening Questions (Attachment FT-10); Unable-to-Contact, Controlled Access, and Call-Me Letters (Attachment FT-5); Refusal Letters (Attachment FT-13); and Interview Incentive Receipt (Attachment FT-15).


Assurance of Confidentiality

Concern for the confidentiality and protection of respondents’ rights has always played a central part in the implementation of NSDUH and will continue to be given the utmost emphasis in the FT, which will provide the same assurance of confidentiality as the main study.


Current NSDUH FIs will be hired to work the FT. These FIs are thoroughly trained in methods for maximizing a respondent’s understanding of the government’s commitment to confidentiality. Furthermore, FIs make every attempt to secure an interview setting in the respondent’s home that is as private as possible, particularly when the respondent is a youth. The Contractor’s Institutional Review Board (IRB) was granted a Federalwide Assurance (Attachment FT-18) by the Office for Human Research Protections (OHRP) and HHS in compliance with the requirements for the protection of human subjects (45 CFR 46). The Contractor’s IRB will approve the protocols and consent forms for the FT prior to any respondent contact. The IRB’s primary concern is protecting respondents’ rights, one of which is maintaining the confidentiality of respondent information. By obtaining IRB approval for NSDUH procedures and materials, CBHSQ is assured that respondent confidentiality will be maintained.


Several FT procedures ensure that respondents’ rights will be protected. First, the FI will introduce himself or herself and the study using the Introduction and Informed Consent Scripts (Attachment FT-12), reading the scripted text aloud to each interview respondent and, if needed, to a parent/guardian of a youth respondent. This statement will appear in the Showcard Booklet (Attachment FT-14). As part of the process for obtaining informed consent, respondents will be given a Study Description (Attachment FT-6), which includes information on the Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA, included as Title V in the E-Government Act of 2002, P.L. 107-347) and the protection that it affords. This statute prohibits disclosure or release, for non-statistical purposes, of information collected under a pledge of confidentiality.


Specifically, the Study Description states that respondents’ answers will be used only by authorized personnel for statistical purposes and cannot be used for any other purpose. If a respondent is aged 12 to 17, when the youth is selected for the NSDUH main study interview, the FI will read the parental introductory script (Attachment FT-11) to the parent or guardian requesting permission to speak with the youth about NSDUH. After that introduction, parental consent for the interview will be obtained from the selected respondent’s parent or guardian, youth assent will be requested and at least one parent, guardian or another adult must remain present in the home throughout the interview.


Under CIPSEA, data may not be released to unauthorized persons. CIPSEA safeguards the confidentiality of individually-identifiable information acquired under a pledge of confidentiality by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually-identifiable information by an officer, employee, or agent of SAMHSA. Willful and knowing disclosure of protected data to unauthorized persons is a felony punishable by up to five years imprisonment and up to a $250,000 fine.


As CIPSEA agents, all Contractor staff complete an annual CIPSEA training and sign a notarized Confidentiality Agreement (Attachment FT-19). FIs also complete CIPSEA and project training on ensuring respondent confidentiality and will have signed a notarized Data Collection Agreement (Attachment FT-19) certifying they will keep all respondent information confidential.


After obtaining informed consent, FIs will make every attempt to secure an interview setting that is as private as possible. In addition, the interview process, by design, will include techniques to afford privacy for the respondent. The ACASI portion of the questionnaire will maximize privacy and confidentiality by giving control of the sensitive questionnaire sections directly to the respondent. The ACASI methodology will allow the respondent to listen to questions through headphones and/or to read the questions on the computer screen, and then key his or her own responses into the computer via the keyboard. At the end of the ACASI portion, the respondent’s answers will be locked so no one, including the FI, can see the responses until after the data are transmitted, processed, and aggregated by the Contractor in a FIPS-Moderate environment.


To further ensure confidentiality, the respondent’s name, address, or other identifying information will never be noted. The one exception is the electronic Quality Control Form, for which the respondent will be asked to voluntarily provide information at the end of the interview. The FI will explain the procedures in advance and, using the tablet, ask the respondent to provide his or her phone number and confirm his or her current address. The FI will then record the respondent’s information in the tablet. At the end of the day, the FI will send the electronic Quality Control Form information to RTI via the data transmission process.


Each day they work, FIs will electronically transmit all completed screening and interview data to the Contractor’s servers via secure encrypted data transmission in a FIPS-Moderate environment. As part of that FIPS-Moderate compliance, the laptops and tablets will be protected with FIPS 140-2 compliant device-level encryption and the laptops will require two-factor authentication to access.


On the data files, respondents will be distinguished only by a unique number assigned to screenings and interviews. Although the unique number is associated with a location number and a dwelling unit number, the Contractor will delete this location information before the delivery of data to CBHSQ. The dwelling unit address information, which is maintained in a separate file for Contractor use in sampling, fielding, and weighting cases, will be purged at the completion of data processing.


After delivery and acceptance of the final survey data files, all electronic Quality Control Form information will be destroyed, thus eliminating records of SDU addresses. The permanent sampling records will show only the general location in which interviews were conducted; there will be no record of specific dwelling units contacted.


This data collection is subject to the Privacy Act of 1974. Furthermore, Privacy Impact Assessment (PIA) documentation for NSDUH is reviewed each year as part of NSDUH’s annual system security assessments. Subsequently, the PIA documentation in the HHS system is updated by SAMHSA personnel as needed. The most recent review cycle was in May of 2018.


Questions of a Sensitive Nature

Many of the FT interview questions address topics likely to be of a sensitive nature to respondents. Several safeguards, including the ACASI mode of questionnaire administration, will improve the privacy of data collected on sensitive issues. As part of the interview introduction, the FI will inform the respondent why the information is necessary, indicate who sponsors the study, request consent to conduct an interview, and explain the procedures that ensure confidentiality. For respondents between the ages of 12 and 17, verbal consent will be obtained from both the parent or guardian and then the youth. (See Attachment FT-12, Introduction and Informed Consent Scripts, for verbal consent text.) Once parental consent is obtained, every attempt will be made to ensure the actual interview is conducted without parental observation or intervention, though at least one parent, guardian, or another adult must remain present elsewhere in the home throughout the interview.


Answers to sensitive questions, including all substance use, mental health, and sexual orientation and attraction questions (adults only), will be obtained by closed interview design. In the ACASI portion of the interview, the respondent will enter his or her answers directly into the computer. The FI will not see these answers.


All FT data collected will be transmitted regularly to the Contractor via secure encrypted data transmission in a FIPS-Moderate environment and distinguished only with a unique number, which is a code associated with the SDU. The questionnaire data will be processed immediately upon receipt at the Contractor’s facilities, and all associations between a questionnaire and the respondent’s address will be destroyed after all data processing activities are completed. The listings of SDU addresses will be kept under secured conditions and destroyed after all data processing activities are completed.


Estimates of Annualized Hour Burden

During FT data collection from August through November 2020, conducted separately from the 2020 NSDUH main study data collection ongoing at that time, screenings will be completed with approximately 8,110 English-speaking respondents in the contiguous United States. (Alaska and Hawaii are excluded from the FT to control study costs.) From those screenings, approximately 4,000 respondents, as representatives of the civilian, noninstitutionalized population aged 12 or older, are expected to complete a FT interview using the revised questionnaire and materials.


The total annual burden estimate for the FT is shown below in Table 3.

Table 3. Annualized Estimated Burden for Redesign Field Test

Instrument                  Number of      Responses per    Total number     Hours per     Total burden
                            respondents    respondent       of responses     response      hours
Household Screening               8,110    1                       8,110         0.083              673
Interview                         4,000    1                       4,000         1.000            4,000
Screening Verification              246    1                         246         0.067               17
Interview Verification              600    1                         600         0.067               40
Total                             8,110                           12,956                          4,730
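
The burden-hour arithmetic behind Table 3 (responses multiplied by hours per response, summed and rounded) is reproduced in the sketch below.

```python
# Arithmetic behind Table 3: burden hours = responses x hours per response.
rows = [
    ("Household Screening",    8_110, 0.083),
    ("Interview",              4_000, 1.000),
    ("Screening Verification",   246, 0.067),
    ("Interview Verification",   600, 0.067),
]
total_responses = sum(n for _, n, _ in rows)
total_hours = round(sum(n * h for _, n, h in rows))
print(total_responses, total_hours)   # 12,956 responses and ~4,730 burden hours
```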


Estimates of Annualized Cost to the Government

Total costs associated with the 48-month contract performance period, including the FT, are estimated to be $3,712,164. Of those total costs, $1,450,982 are contract costs for the FT.


Changes in Burden

SAMHSA is requesting 4,730 burden hours for the FT (of the 82,604 hours total for 2020).



List of Attachments

Attachment FT-1. Power Analysis Report

Attachment FT-2. CAI Questionnaire

Attachment FT-3. Lead Letter

Attachment FT-4. Contact Cards – Sorry I Missed You Card and Appointment Cards

Attachment FT-5. Unable-to-Contact, Controlled Access, and Call-Me Letters

Attachment FT-6. Study Description

Attachment FT-7. Question & Answer Brochure

Attachment FT-8. NSDUH Highlights and Newspaper Articles

Attachment FT-9. Tablet Screening Video Scripts

Attachment FT-10. Screening Questions

Attachment FT-11. Parental Introductory Script

Attachment FT-12. Introduction and Informed Consent Scripts

Attachment FT-13. Refusal Letters

Attachment FT-14. Showcard Booklet

Attachment FT-15. Interview Incentive Receipt

Attachment FT-16. Certificate of Participation

Attachment FT-17. CATI Verification Scripts

Attachment FT-18. Federalwide Assurance

Attachment FT-19. Confidentiality Agreement and Data Collection Agreement.






References


Groves, R. M., Couper, M. P., Presser, S., Singer, E., Tourangeau, R., Acosta, G. P., & Nelson, L. (2006). Experiments in producing nonresponse bias. Public Opinion Quarterly, 70(5), 720-736.


Groves, R. M., Presser, S., & Dipko, S. (2004). The role of topic interest in survey participation decisions. Public Opinion Quarterly, 68(1), 2-31.


Groves, R. M., Singer, E., & Corning, A. (2000). Leverage-saliency theory of survey participation - Description and an illustration. Public Opinion Quarterly, 64(3), 299-308.


Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. The Annals of the American Academy of Political and Social Science, 645, 112–141.


Williams, D., & Brick, M. (2018). Trends in U.S. face-to-face household survey nonresponse and level of effort. Journal of Survey Statistics and Methodology, 6, 186-211.

