2013 National Survey on Drug Use and Health
SUPPORTING STATEMENT
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Respondent Universe and Sampling Methods
The respondent universe for the 2013 NSDUH study is the civilian, noninstitutionalized population aged 12 years or older within the 50 states and the District of Columbia. The NSDUH universe includes residents of noninstitutional group quarters (e.g., shelters, rooming houses, dormitories), residents of Alaska and Hawaii, and civilians residing on military bases. Persons excluded from the universe include those with no fixed household address (e.g., homeless transients not in shelters) and residents of institutional group quarters such as jails and hospitals.
The sample design will consist of a stratified, multi-stage area probability design (see Attachment M for a detailed presentation of the sample design). As with most area household surveys, the NSDUH design will continue to offer the advantage of minimizing interviewing costs by clustering the sample. This type of design also maximizes coverage of the respondent universe since an adequate dwelling unit and/or person-level sample frame is not available. Although the main concern of area surveys is the potential variance-increasing effects due to clustering and unequal weighting, these potential problems will be directly addressed in the NSDUH by selecting a rather large sample of clusters at the early stages of selection and by selecting these clusters with probability proportionate to a composite size measure. This type of selection maximizes precision by allowing one to achieve an approximately self-weighting sample within strata at the latter stages of selection. Furthermore, it is attractive because the design of the composite size measure makes the interviewer workload roughly equal among clusters.
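As a minimal sketch of the probability-proportionate-to-size selection described above (using hypothetical units and size measures; the production NSDUH selection procedure is more elaborate), systematic PPS sampling with a composite size measure can be illustrated as follows:

```python
import random

def pps_systematic_sample(units, sizes, n):
    """Select n units with probability proportionate to size (PPS),
    using systematic sampling with a random start.
    Illustrative sketch only, not the NSDUH production algorithm."""
    total = float(sum(sizes))
    interval = total / n          # sampling interval on the cumulative size scale
    point = random.uniform(0, interval)
    chosen, cum = [], 0.0
    for unit, size in zip(units, sizes):
        cum += size
        # A unit is hit each time a selection point falls in its size span.
        while point <= cum and len(chosen) < n:
            chosen.append(unit)
            point += interval
    return chosen

# Hypothetical tracts with composite size measures (e.g., expected interviews).
tracts = ["tract%02d" % i for i in range(1, 11)]
sizes = [5, 1, 3, 7, 2, 4, 6, 1, 2, 9]
print(pps_systematic_sample(tracts, sizes, 3))
```

Because selection probabilities are proportional to the composite size measure, clusters expected to yield more interviews are more likely to be drawn, which is what permits an approximately self-weighting sample within strata and roughly equal interviewer workloads at later stages.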
A coordinated five-year design was developed for the 2005-2009 NSDUHs, extended to the 2010-2012 NSDUHs, and will continue in 2013. The sample selection procedures began by geographically partitioning each state into roughly equal-size state sampling (SS) regions. Regions were formed so that each area would yield, in expectation, roughly the same number of interviews during each data collection period. This partition divided the United States into 900 SS regions. Within each of these SS regions, a sample of Census tracts was selected. Then, within sampled Census tracts, smaller geographic areas, or segments, were selected. A total of 48 segments per SS region were selected for the 2005-2009 NSDUHs. Only 24 segments per region were used for these surveys; the remaining 24 segments constitute the "reserve" sample and were available for use in 2010-2013. In general, segments consisted of adjacent Census blocks and are equivalent to area segments selected at the second stage of selection in NSDUHs prior to 1999 and at the first stage of selection in the 1999-2004 NSDUHs. The additional stage of selection (i.e., Census tracts) ensures that the majority of sample segments are contained within a single tract's boundaries, thus improving the ability to match to external data. In summary, the first-stage stratification for the 2013 study will be states and SS regions within states, the first-stage sampling units will be Census tracts, and the second-stage sampling units will be small area segments. This design for the 2005-2013 NSDUHs is desirable at the early stages of selection because of (1) the much larger person-level sample required at the latter stages of selection and (2) the increased interest among NSDUH data users and policy-makers in state and other local-level statistics.
The coordinated design provides a 50 percent overlap in second-stage units (area segments) between successive years from 2005 through 2013. The expected precision of estimates of differences between consecutive years (e.g., the year-to-year difference in past month marijuana use among 12-17 year old respondents) will be improved because of the expected positive correlation resulting from the overlapping sample.
Similar to previous NSDUHs, at the latter stages of selection, five age group strata will be sampled at different rates. These five strata will be defined by the following age group classifications: 12‑17, 18‑25, 26-34, 35‑49, and 50 years old and over. We project that adequate precision for race/ethnicity estimates at the national level will be achieved with the larger sample size and the optimal allocation to the age group strata. Consequently, race/ethnicity groups will not be over-sampled. However, consistent with previous NSDUHs, the 2013 NSDUH will be designed to over-sample the younger age groups.
Table 1 in Attachment M shows main study sample sizes and projected number of completed interviews by sample design stage. Table 2 (Attachment M) shows main study sample sizes by state and projected number of person respondents by state and age group. Table 3 (Attachment M) shows the expected precision for national estimates. Table 4 (Attachment M) shows the expected precision for direct state estimates.
2. Information Collection Procedures
Prior to the interviewer’s arrival at the sample dwelling unit (SDU), a letter will be mailed to the resident(s) briefly explaining the survey and requesting their cooperation. This letter will be printed on Department of Health and Human Services letterhead with the signature of the DHHS National Study Director and the Contractor’s National Field Director (see Attachment D).
Upon arrival at the SDU, the interviewer will refer the respondent to this letter and answer any questions. If the respondent has no knowledge of the lead letter, the interviewer will provide another copy, explain that one was previously sent, and then answer any questions. If no one is at home during the initial call at the SDU, the interviewer may leave a Sorry I Missed You card (Attachment E) informing the resident(s) that the interviewer plans to make another callback at a later date/time. Callbacks will be made as soon as possible. Interviewers will attempt to make at least four callbacks (in addition to the initial call) to each SDU in order to complete the screening process and obtain an interview.
If the interviewer is unable to contact anyone at the SDU after repeated attempts, the interviewer’s Field Supervisor may send an Unable to Contact (UTC) letter. The UTC letter re-iterates information contained in the lead letter and presents a plea for the respondent to participate in the study (See Attachment P for all UTC letters). If after sending that letter an interviewer is still unable to contact anyone at an SDU, another informational letter (See Attachment P) may be sent to the SDU requesting that the resident(s) call the Field Supervisor as soon as possible to set up an appointment for the interviewer to visit the resident(s).
As necessary and appropriate, the interviewer may make use of the Appointment Card (Attachment E) for scheduled return visits with the respondent. When an in‑person contact is made with an adult member of the SDU and introductory procedures are completed, the interviewer will present a Study Description (Attachment G) and answer questions if required. Assuming respondent cooperation, a screening of the SDU then will be initiated through administration of the Housing Unit Screening questions for housing units, or the Group Quarters Unit Screening questions for group quarters units. The screening questions are administered via a hand-held, pen-based computer, which also performs the subsequent sample-selection routines. A paper representation of the housing unit and group quarters unit screening process is shown in Attachment H.
If a potential respondent refuses to be screened, the interviewer is trained to accept the refusal in a positive manner, thereby avoiding the possibility of creating an adversarial relationship and precluding future opportunities for conversion. A refusal letter may then be sent by the Field Supervisor. The refusal letter sent is tailored to the specific concerns expressed by the potential respondent and asks him/her to reconsider participation (See Attachment Q for all refusal letters). An in-person conversion is then attempted either by supervisory field staff or specially selected interviewers with successful conversion experience. If the respondent proceeds with the screening process, the interviewer answers any questions that the screening respondent may have concerning the study. A Question & Answer Brochure (Attachment I) that provides answers to commonly asked questions also will be given to the respondent at this time, or just prior to the start of the interview. In addition, interviewers will be supplied with copies of the Example NSDUH Highlights (Attachment J) and the Example NSDUH Newspaper Clippings (Attachment J) which can be left with the respondent. Following this introductory exchange, the screening will continue until completion.
Once the rostering of all dwelling unit members aged 12 or older is complete, and assuming the within-dwelling-unit sampling process selects one or two members to participate in the study by completing the interview, the following procedures are implemented:
If the selected individual is 18 or older and currently available, the interviewer moves immediately to begin administering the questionnaire in a private setting within the dwelling unit after obtaining informed consent. If the selected individual is 12 to 17 years of age, parental consent is obtained from the selected individual’s parent or legal guardian, using the Introduction and Informed Consent for Sample Members Age 12-17 Years Old found in the Showcard Booklet (Attachment K); the minor is then asked to participate. Once consent is obtained from the parent and child, the interviewer begins the interview process.
For all identified/selected eligible potential respondents, the interviewer administers the interview in a prescribed and uniform manner. The sensitive/self‑administered portions of the interview will be completed via ACASI; that is, the respondent will listen privately to the questions through an audio headset and/or read them on the computer screen, and will enter his/her own responses directly into the computer. This method maximizes respondent privacy and confidentiality.
Race/ethnicity questions are interviewer administered and meet all of the guidelines for the OMB minimum categories. The addition of the finer delineation of Guamanian or Chamorro and Samoan, which collapse into the OMB standard Native Hawaiian/Other Pacific Islander category, was a requirement of the new HHS Data Collection Standards (see http://aspe.hhs.gov/datacncl/standards/ACA/4302/index.shtml).
In order to facilitate the respondent's recollection of prescription type drugs and their proper names, a set of color pillcards is provided to the respondent at the appropriate time. These pillcards and other showcards are included in the Showcard Booklet (Attachment K) and allow the respondent to refer to information necessary for accurate responses. The respondent enters his/her own answers directly into the computer during the ACASI interview.
After the interview is completed and before the verification procedures are begun, each respondent is given a $30.00 incentive and a Field Interviewer-signed Interview Incentive Receipt (Attachment N).
For verification purposes, interview respondents are asked to complete a Quality Control Form (Attachment C) that requests their address and telephone number for possible follow‑up to ensure that the interviewer did his/her job appropriately. Respondents are informed that completing the Quality Control Form is voluntary. This form is completed and placed in an envelope by the respondent and mailed to the NSDUH Contractor for processing.
Interviewers will be supplied with Certificates of Participation (Attachment O) to distribute to interested respondents, primarily adolescents, after the interview is completed. Respondents may attempt to use these certificates to earn school or community service credit hours; no guarantee of credit is made by SAMHSA or the Contractor, and the certificates clearly state this. The interviewer signs and dates the certificate, then gives it to the child. The child's name is left blank, and the child is told that he or she may choose whether to use the certificate to attempt to earn community service credit for school or for some other purpose. The study name is included on the certificate, but again, it is the child's choice whether to use the certificate and identify himself or herself as a participant in the survey.
A random sample of those who complete Quality Control Forms receive a telephone call expressing appreciation for their participation in the study. Each respondent also is asked to answer a few questions verifying that the interview took place, that proper procedures were followed, and that the amount of time required to administer the interview was within expected parameters. Quality Control letters are mailed when telephone numbers are unavailable (see Attachment L). In previous NSDUH surveys, less than 1 percent of the verification sample refused to fill out Quality Control Forms. As in the past, the respondents are given the opportunity to decline to complete the form.
All interview data are transmitted to the Contractor’s offices on a regular basis.
Questionnaire
The version of the questionnaire to be fielded in 2013 is a computerized (CAPI/ACASI) instrument that is similar in content and structure to the computerized instrument fielded in 2012.
The NSDUH questionnaire and interview methods are designed to retain respondent interest, ensure confidentiality, and maximize the validity of response. The questionnaire is administered in such a way that interviewers will not know respondents’ answers to the sensitive questions, including those on illicit drug use. These questions are self-administered (ACASI), that is, respondents listen to or read the questions and enter their responses directly into the computer. The respondent listens in private through headphones, so even those who have difficulty seeing or reading are able to complete the self-administered portion.
The questionnaire is divided into sections based on specific substances or other main topics. The same questions are asked for each substance or substance class, ascertaining the respondent’s history in terms of age of first use, most recent use, number of times used in lifetime, and frequency of use in past 30 days and past 12 months. These substance use histories allow estimation of the incidence, prevalence, and patterns of use for licit and illicit substances.
Topics that are administered by the interviewer (i.e., the CAPI section) include Demographics, Health Insurance, and Income. For the Income and Health Insurance sections, respondents will be asked if there is anyone else at home who would be better able to provide accurate answers.
The current questionnaire is founded on the CAI instrument that was first implemented for the 1999 NSDUH. While the mode changed in 1999, the content was based on the 1994 questionnaire, which resulted from a series of methodological studies and discussions with consultants. Additional methodological testing was completed in preparation for the conversion to computer-assisted interviewing. The questionnaire incorporates improvements in question wordings (e.g., clearer definitions, less vague terminology, elimination of hidden questions) and questionnaire structure (e.g., greater use of skip patterns, improved formatting for the benefit of interviewers and respondents). Enhanced instructions regarding the reference periods used (i.e., past 30 days, past 12 months) also were added, including a paper reference date calendar to facilitate the respondent’s accurate recall of events. A key feature of the questionnaire is a core-supplement structure. A set of core questions that are critical for basic trend measurement of substance use incidence and prevalence rates will remain in the survey every year and comprise the main part of the questionnaire. Supplemental questions, or modules, which can be revised, dropped or added from year to year comprise the remainder of the questionnaire.
The core is comprised of the initial demographic questions and the Tobacco through Sedatives modules. Supplemental items include the remaining modules, demographic and health questions. Some of the supplemental portion of the questionnaire is likely to remain in the survey, essentially unchanged, every year (e.g., insurance).
2013 NSDUH CAPI/ACASI Questionnaire Content
The proposed questionnaire content for 2013 is shown in Attachment B. While the actual administration will be electronic, the document shown is a paper representation of the content that is to be programmed. The 2013 questionnaire has been updated to include new questions on relevant topics. A summary of the changes, along with the module in which they appear, is listed below.
Front End Demographics – A couple of new questions have been added to further determine whether respondents who report that they are on active duty in the military are actually reservists. Current questions about military service were also edited for consistency.
Blunts – Two new questions have been added that ask respondents if their marijuana use was prescribed by a doctor.
Health – Questions added to the health module include height, weight, and discussions with a doctor about substance use.
Back End Demographics – Edits were made to the wording and routing of existing questions to more precisely introduce survey topics and to use more natural question wording.
As in previous years, the State program names for Medicaid, CHIP, and TANF will be updated. All other modular components of the questionnaire will remain unchanged.
As in past years, two versions of the instrument will be prepared: an English version and a Spanish translation. Both versions will have the same essential content.
3. Methods to Maximize Response Rates
In 2011, the weighted response rates were 87% for screening and 75% for interviews, with an overall response rate (screening * interview) of 65%. With the continuation of the $30.00 incentive for the 2013 survey year, the Contractor expects the weighted response rates for 2013 to be about the same as the 2011 rates.
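The overall figure cited above is simply the product of the two component rates; a quick check of the arithmetic:

```python
# Overall weighted response rate = screening rate x interview rate
# (2011 figures cited in the text above).
screening_rate = 0.87
interview_rate = 0.75
overall_rate = screening_rate * interview_rate
print(round(overall_rate * 100))  # prints 65
```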
Nonresponse to the NSDUH has historically been concentrated among the 50+ age group. In 2004, focus groups were conducted with NSDUH field interviewers on the topic of nonresponse among the 50+ age group to gather information on the root causes of differential response by age. The study examined the components of nonresponse (refusals, noncontacts, and other incompletes) among the 50+ age group in NSDUH. It also examined respondent, environmental, and interviewer characteristics in order to identify the correlates of nonresponse among the 50+ group, including relationships unique to that group. The results indicated that the high rate of nonresponse among the 50+ group was primarily due to a high rate of refusals, especially among sample members aged 50 to 69, and a high rate of physical and mental incapability among those 70 or older. Taken together with evidence from interviewer focus groups, it appeared that the higher rate of refusal among the 50+ age group may, in part, have been due to fears and misperceptions about the survey and interviewers' intentions. It was suggested that increased public awareness about the study may allay these fears.
In 2005, focus groups were conducted with potential NSDUH respondents to examine the issue of nonresponse among persons 50 and over. Participants in these groups recommended that the NSDUH contact materials should focus more on establishing the legitimacy of the sponsoring and research organizations, clearly conveying the survey objectives, describing the selection process, and emphasizing the importance of the selected individual’s participation.
As a result, a study was conducted to improve the NSDUH Lead Letter and Question & Answer Brochure for the 2015 redesign. CBHSQ and the NSDUH contractor revised the materials through review of contact materials used for other government-sponsored surveys, expert review, and feedback from 17 focus groups conducted in five metropolitan areas (OMB No. 0930-0290). The primary focus of redesigning the contact materials was to improve the materials in ways likely to generate positive reactions from members of sampled households, especially those in the 50+ age group, and therefore, maximize participation. These new materials are currently being tested in the NSDUH Questionnaire Field Test for potential implementation in the 2015 redesigned survey.
Nonresponse Bias Studies
Many studies have been conducted over the years to assess nonresponse bias in the NSDUH. Biemer and Link (2007) provide a general method for nonresponse adjustment that relaxed the ignorable nonresponse assumption. Their method, which extended the ideas of Drew and Fuller (1980), used level of effort (LOE) indicators based on call attempts to model the response propensity. In most surveys, call history data are available for all sample members, including nonrespondents, and since the LOE required to interview a sample member is likely to be highly correlated with response propensity, this method is ideally suited for modeling the nonignorable nonresponse. The approach was first studied in a telephone survey setting and then applied to data from the 2006 NSDUH, where level of effort was measured by contact attempts (or callbacks) made by field interviewers.
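To make the level-of-effort idea concrete, the following toy sketch fits a one-predictor logistic response-propensity model to simulated call-attempt data. This is an illustration only: the data are synthetic, and the model is far simpler than Biemer and Link's actual nonignorable-nonresponse approach.

```python
import math
import random

random.seed(0)

# Simulate call-attempt (level-of-effort) data: sample members with low
# response propensity tend to accumulate more call attempts.
# (Synthetic data; not NSDUH callback records.)
data = []
for _ in range(2000):
    propensity = random.uniform(0.1, 0.9)
    responded = 1 if random.random() < propensity else 0
    attempts = 1 + int((1 - propensity) * 8 * random.random())
    data.append((attempts, responded))

# Fit P(respond) = sigmoid(b0 + b1 * attempts) by gradient ascent
# on the log-likelihood of the logistic model.
b0, b1, lr = 0.0, 0.0, 0.01
for _ in range(500):
    g0 = g1 = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
        g0 += y - p
        g1 += (y - p) * x
    b0 += lr * g0 / len(data)
    b1 += lr * g1 / len(data)

# A negative slope indicates that more call attempts are associated
# with lower response propensity, as the LOE approach assumes.
print("b1 = %.3f" % b1)
```

In a real application the propensity estimates, not the synthetic data, would drive the nonresponse weighting adjustment; the point here is only that call-attempt counts can serve as an observable proxy for propensity.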
The callback modeling approach adopted in this report confirmed what was known from other studies of nonresponse adjustment approaches: there is no uniformly best approach for reducing the effects of nonresponse on survey estimates. Each of the models under consideration performed best at eliminating nonresponse bias in different situations and under different measures. Errors exist in the callback data as reported by field interviewers; for example, while not indisputably confirmed, it is strongly suspected that field interviewers underreport callback attempts. It is also very difficult to apply uniform reporting procedures among a large interviewing staff spread across the country. The likelihood of these types of errors in the data is high, and they have the potential to seriously bias the results of the callback modeling approach to nonresponse adjustment. For these and other reasons, the callback modeling approach was not implemented in the NSDUH nonresponse weighting adjustment process.
In 2005, a study was conducted on a methodology designed to reduce item nonresponse to critical items in the ACASI portion of the NSDUH questionnaire (Caspar et al., 2005). Respondents providing "Don't know" or "Refused" responses to items designated as essential to the study's objectives received tailored follow-up questions designed to simulate interviewer probes. Logistic regression was used to determine what respondent characteristics tend to be associated with triggering follow-up questions.
The analyses showed that item nonresponse to the critical items is quite low, so the authors caution the reader to interpret the data with care. However, the findings suggest the follow-up methodology is a useful strategy for reducing item nonresponse, particularly when the nonresponse is due to "Don't know" responses.
In a preliminary experiment conducted in the 2001 NHSDA, it was concluded that providing incentives increased response rates; therefore, a $30 incentive was used in the subsequent 2002 NHSDA (Wright et al., 2005). Wright and his coauthors explored the effect that the incentive had on nonresponse bias. The sample data were weighted by the likelihood of response for the incentive and nonincentive cases. Next, a logistic regression model was fit using substance use variables and controlling for other demographic variables associated with either response propensity or drug use. The results indicate that for past year marijuana use, the incentive is either encouraging users to respond who otherwise would not respond, or it is encouraging respondents who would have participated without the incentive to report more honestly about drug use. Therefore, it is difficult to determine whether the incentive money is reducing nonresponse bias, response bias, or both. However, reports of past year and lifetime cocaine use did not increase in the incentive group, and past month cocaine use was actually lower in the incentive group than in the control group.
In 2005, Cowell and Mamo conducted a study to evaluate the accuracy of the income measure in NSDUH. They compared the distribution of 1999 personal income data from the 2000 NSDUH with the distributions in the same year from the Current Population Survey (CPS) and the Statistics of Income (SOI) data. Despite some fundamental differences between the SOI and either of the survey datasets (CPS and NSDUH), there were strong similarities among the three income distributions. With the exception of one interval, the frequencies in the three datasets were within 2.5 percentage points of one another across all income intervals.
In 2004, Eyerman et al. conducted the study, summarized in Section 3 above, aimed at providing a better understanding of nonresponse among older sample members in NSDUH in order to tailor methods to improve response rates and reduce the threat of nonresponse error. The study analyzed the components of nonresponse (refusals, noncontacts, and other incompletes) among the 50+ age group; identified respondent, environmental, and interviewer correlates of nonresponse, including relationships unique to that group; and drew on focus group sessions with NSDUH field interviewers to consider the root causes of differential nonresponse by age. As noted above, the high rate of nonresponse among the 50+ group was primarily attributable to refusals among sample members aged 50 to 69 and to physical and mental incapability among those 70 or older, with refusals appearing to stem in part from fears and misperceptions about the survey and interviewers' intentions.
In 2001, CBHSQ produced a report to address the nonresponse patterns obtained in the 1999 NHSDA. The report was motivated by the relatively low response rates in the 1999 NHSDA and by the apparent general trend of declining response rates in field studies. The analyses presented in this report were produced to help provide an explanation for the rates in the 1999 NHSDA and guidance for the management of future projects. The report describes NHSDA data collection patterns from 1994 through 1998. It also describes the data collection process in 1999 with a detailed discussion of design changes, summary figures and statistics, and a series of logistic regressions comparing 1998 with 1999 nonresponse patterns.
The results of this study are consistent with the conventional wisdom of the professional survey research field and the findings in survey research literature. The nonresponse can be attributed to a set of interviewer influences, respondent influences, design features, and environmental characteristics. The nonresponse followed the demographic patterns observed in other studies, with urban and high crime areas having the worst rates. Finally, efforts taken to improve the response rates were effective. Unfortunately, the tight labor market combined with the large increase in sample size caused these efforts to lag behind the data collection calendar. The authors used the results to generate several suggestions for the management of future projects.
The 1990 NHSDA was one of six large Federal or federally sponsored surveys used in the compilation of a dataset that then was matched to the 1990 decennial census for analyzing the correlates of nonresponse (Groves and Couper, 1998). In addition, data from surveys of NHSDA interviewers were combined with those from these other surveys to examine the effects of interviewer characteristics on nonresponse.
One of the main findings was that those with lower socioeconomic status were no less likely to cooperate than those with higher socioeconomic status; there was instead a tendency for those in high-cost housing to refuse survey requests, which was partially accounted for by residence in high-density urban areas. There was also some evidence that interviewers with higher levels of confidence in their ability to gain participation achieved higher cooperation rates.
To assess the impact of nonresponse, a special follow-up study was undertaken on a subset of nonrespondents to the 1990 NHSDA (Caspar et al., 1992). The aim was to understand the reasons people chose not to participate, or were otherwise missed in the survey, and to use this information in assessing the extent of the bias, if any, that nonresponse introduced into the 1990 NHSDA estimates. The study was conducted in the Washington, DC, area, a region with a traditionally high nonresponse rate. The follow-up survey design included a $10 incentive and a shortened version of the instrument. The response rate for the follow-up survey was 38 percent.
The results of the follow-up study did not demonstrate definitively either the presence or absence of a serious nonresponse bias in the 1990 NHSDA. In terms of demographic characteristics, follow-up respondents appeared to be similar to the original NHSDA respondents. Estimates of drug use for follow-up respondents showed patterns that were similar to the regular NHSDA respondents. Another finding was that among those who participated in the follow-up survey, one third were judged by interviewers to have participated in the follow-up because they were unavailable for the main survey request. Finally, 27 percent were judged to have been swayed by the incentive and another 13 percent were judged to have participated in the follow-up due to the shorter instrument.
In addition, response rates for the NSDUH are tracked on a daily basis. Interviewers transmit their work each evening, and a web-based case management system (CMS) is used by project management to monitor their areas of responsibility. The system allows managers to identify problems as early as possible and take corrective action. The CMS also has a data quality component that collects information from the interviewers and data processing staff and produces reports for management. Among the items available in these data quality reports are verification information, time discrepancies, interview length problems, missing data, and form errors.
Twice a year, response rate patterns are analyzed by state. States with significant changes are closely scrutinized to uncover possible reasons for the changes. Action plans are put into place for states with significant declines, and any special techniques, such as a particular greeting or use of certain survey materials, used by states with increases are noted. If the technique is generalizable, it may be given to the interviewers as a helpful tip or worked into a future training. Response and nonresponse patterns are also tracked by various demographics on an annual basis in the NSDUH Data Collection Final report. The report provides detailed information about noncontacts vs. refusals, including reasons for refusals. This information is reviewed annually for changes in trends.
In May 2011, SAMHSA received a three-year renewal of its generic clearance for methodological field tests (OMB No. 0930-0290). These methodological studies will be used to inform decisions regarding sample design, data collection methods, questionnaire format, data processing and estimation.
The next wave of methodological tests will continue to examine ways to increase data quality, lower operating costs, and gain a better understanding of sources and effects of nonsampling error on the NSDUH estimates. One of the goals of these methodological tests will be to assess new methods for gaining cooperation and participation of respondents with the intent of increasing response and decreasing potential bias in the survey estimates. Particular attention will be given to improving response rates among persons residing in controlled access communities (locked apartment buildings, gated communities, college dormitories, etc.) and other hard-to-reach populations. Other activities currently under consideration are targeted at assessing the characteristics of nonrespondents and determining the feasibility of alternative sample designs and modes of data collection.
One of the specific studies proposed under the generic clearance is a potential nonresponse follow-up study in which a subset of sample members who initially did not complete the NSDUH would be recontacted. An incentive would be offered for these individuals to complete the interview and to provide information on their reasons for refusal, as well as on their drug use and other characteristics, to allow for an assessment of bias.
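Under the standard decomposition, the bias in a respondent-only estimate is approximately the nonresponse rate times the difference between respondent and nonrespondent means. The sketch below is a hypothetical illustration of that formula, not a planned NSDUH computation: it treats follow-up completers as a proxy for all nonrespondents, and every number in the example is invented.

```python
def estimated_nonresponse_bias(response_rate, mean_respondents, mean_followup):
    """Approximate bias of the respondent-only estimate:
        bias ~= (1 - response_rate) * (mean_respondents - mean_nonrespondents),
    using follow-up completers as a stand-in for nonrespondents."""
    return (1 - response_rate) * (mean_respondents - mean_followup)

# Invented example: 75% response rate, 10.2% prevalence among respondents,
# 12.6% among recontacted nonrespondents
bias = estimated_nonresponse_bias(0.75, 0.102, 0.126)  # negative: underestimate
```

A caveat worth noting: follow-up completers may themselves differ from nonrespondents who never complete, so such a comparison gives at best a partial picture of the bias.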
With plans to implement a new questionnaire, contact materials, and sample design in the near future, the first step will be to assess the potential impact these changes may have on participation, data quality, and the overall survey estimates. Once the design changes have been in place long enough to quantify and report the potential implications of the redesign, the next step will be to design a plan for evaluating nonresponse bias and other methodological issues in the new survey.
4. Tests of Procedures
Since there are no planned additions to the 2013 data collection protocol, field testing will not occur. Most of the planned modifications to the questionnaire have already been tested under NSDUH Methodological Field Tests generic OMB clearance (OMB No. 0930-0290) which was renewed on May 18, 2011.
5. Statistical Consultants
The basic NSDUH design was reviewed by statistical experts both within and outside SAMHSA. Statistical experts reviewing the 1999-2013 survey designs include William Kalsbeek, Ph.D., University of North Carolina; Robert Groves, Ph.D., Bureau of the Census; and Michael Hidiroglou, Ph.D., Statistics Canada. Monroe Sirken, Ph.D., National Center for Health Statistics (NCHS); James Massey, Ph.D. (deceased), also of NCHS; Douglas Wright, CBHSQ, SAMHSA (retired); and Arthur Hughes, CBHSQ, SAMHSA, were consulted on the 1992 and subsequent survey designs. Michael Jones, CBHSQ, SAMHSA, is the Government Project Officer, (240) 276-1274. Joseph Gfroerer, CBHSQ, SAMHSA, is the primary mathematical statistician responsible for overall project management, (240) 276-1262. RTI senior statisticians contributing to the design are Dr. James Chromy and Dr. Ralph Folsom.
The 2012–2013 National Survey on Drug Use and Health contract was awarded to Research Triangle Institute (RTI) on September 30, 2010. Contractor personnel will implement the sample design, recruit field staff, train interviewers, conduct data collection, conduct data receipt/editing/coding/keying, conduct data analysis, and develop statistical reports. SAMHSA will provide direction and review functions to the Contractor. Data collection will be conducted throughout the 2013 calendar year.
Appendix A
Current NSDUH Consultants
a. Consultants on NSDUH Design
Michael Arthur, Ph.D., Project Director (206) 685-3858
Social Development Research Group
University of Washington
Raul Caetano, M.D., Ph.D., Assistant Dean (214) 648-1080
Dallas Satellite MPH Program
University of Texas at Houston
John Carnevale, Ph.D., President (301) 963-2151
Carnevale Associates
Barbara Delaney (212) 973-3509
Director of Research
Partnership for a Drug-Free America
Bill Kalsbeek, Ph.D., Associate Professor/Director (919) 962-3249
Survey Research Unit, Biostatistics
University of North Carolina at Chapel Hill
Graham Kalton, Ph.D. (301) 251-8253
Senior Vice President
Westat
Philip Leaf, Ph.D., Professor (410) 955-3962
Department of Mental Hygiene, Mental Health and Psychiatry
School of Public Health
Johns Hopkins University
Patrick O’Malley, Ph.D., Senior Research Scientist (734) 763-5043
Survey Research Center, The Institute for Social Research
University of Michigan
Peter Reuter, Ph.D. (301) 405-6367
School of Public Policy
University of Maryland
b. NSDUH Consultant for the Tobacco Module
Gary A. Giovino, Ph.D., Professor (716) 845-8444
Department of Health Behavior
University at Buffalo - SUNY
c. NSDUH Consultants for Mental Health Modules
Jeffrey Buck, Ph.D. (301) 443-0588
Director of Office of Managed Care
Center for Mental Health Services
Michael First, M.D., Professor (212) 543-5531
Department of Psychiatry
Columbia University Medical Center
Marilyn Henderson (retired) (301) 443-2293
Center for Mental Health Services
Kimberly Hoagwood, Ph.D., Professor (212) 543-5311
Department of Child and Adolescent Psychiatry
Columbia University
Jeffrey Johnson, Ph.D., Associate Professor (212) 543-5523
Department of Psychiatry
College of Physicians and Surgeons
Columbia University
Ronald C. Kessler, Ph.D., Professor (617) 423-3587
Department of Health Care Policy
Harvard Medical School
Christopher P. Lucas, M.D. (212) 543-5358
Department of Child Psychiatry
Columbia University
Michael Schoenbaum, Ph.D. (301) 435-8760
Senior Advisor for Mental Health Services,
Epidemiology and Economics
National Institute of Mental Health
Phillip Wang, M.D., Ph.D., Director (301) 443-6233
Division of Services and
Intervention Research
National Institute of Mental Health
ATTACHMENTS
Attachment A - Federal-Wide Assurance
Attachment B - CAI Questionnaire Content
Attachment C - Quality Control Form
Attachment D - Lead Letter to Selected Dwelling Unit
Attachment E - Contact Cards - Sorry I Missed You Card & Appointment Card
Attachment F - Introduction and Informed Consent
Attachment G - Study Description
Attachment H - Housing Unit and Group Quarters Unit Screening Questions
Attachment I - Question and Answer Brochure
Attachment J - Example of NSDUH Highlights & NSDUH Newspaper Clippings
Attachment K - Showcard Booklet
Attachment L - Quality Control Letter
Attachment M - Sample Design
Attachment N - Interview Incentive Receipt
Attachment O - Certificate of Participation
Attachment P - Unable to Contact, Controlled Access, and Call-Me Letters
Attachment Q - Refusal Letters
Attachment R - NSDUH Confidentiality Agreement
Attachment S - CATI Verification Scripts
Attachment T - Response to Federal Register Comment
References
Biemer, P., & Link, M. (2007). Evaluating and modeling early cooperator bias in RDD surveys. In Lepkowski, J. et al. (Eds.), Advances in telephone survey methodology. Hoboken, NJ: John Wiley & Sons.
Caspar, R. A., Penne, M. A., & Dean, E. (2005). Evaluation of follow-up probes to reduce item nonresponse in NSDUH. In J. Kennet & J. Gfroerer (Eds.), Evaluating and improving methods used in the National Survey on Drug Use and Health (DHHS Publication No. SMA 05-4044, Methodology Series M-5, pp. 121-148). Rockville, MD: Substance Abuse and Mental Health Services Administration, Office of Applied Studies.
First, M. B., Spitzer, R. L, Gibbon, M., & Williams, J. B. W. (2002). Structured Clinical Interview for DSM-IV-TR Axis I Disorders, Research Version, Non-patient Edition. (SCID-I/NP) New York: Biometrics Research, New York State Psychiatric Institute.
Gfroerer, J., Wright, D., & Kopstein, A. (1997). Prevalence of youth substance use: The impact of methodological differences between two national surveys. Drug and Alcohol Dependence, 47, 19–30.
Groves, R. M. (1989). Survey errors and survey costs. New York: Wiley.
Groves, R. M., & Couper, M. P. (1998). Nonresponse in household interview surveys. New York: Wiley.
Grucza, R. A., Abbacchi, A. M., Przybeck, T. R., & Gfroerer, J. C. (2007). Discrepancies in estimates of prevalence and correlates of substance use and disorders between two national surveys. Addiction, 102, 623-629.
Hennessy, K., & Ginsberg, C. (Eds.). (2001). Substance use survey data collection methodologies [Special issue]. Journal of Drug Issues, 31(3), 595–727.
Kulka, R. A., Eyerman, J., & McNeeley, M. E. (2005). The use of monetary incentives in federal surveys on substance use and abuse. Journal for Social and Economic Measurement, 30(2-3), 233-249.
Miller, J. W., Gfroerer, J. C., Brewer, R. D., Naimi, T. S., Mokdad, A., & Giles, W. H. (2004). Prevalence of adult binge drinking: A comparison of two national surveys. American Journal of Preventive Medicine, 27, 197-204.
Murphy, J., Eyerman, J., & Kennet, J. (2004). Nonresponse among persons age 50 and older in the National Survey on Drug Use and Health. In S. B. Cohen & J. M. Lepkowski (Eds.), Eighth Conference on Health Survey Research Methods (DHHS Publication No. PHS 04-1013, pp. 73-78). Hyattsville, MD: U.S. Department of Health and Human Services, Public Health Service, Centers for Disease Control and Prevention, National Center for Health Statistics.
Office of Applied Studies. (2001). National Household Survey on Drug Abuse: 1999 nonresponse analysis report. Rockville, MD: Substance Abuse and Mental Health Services Administration.
Office of Applied Studies. (2003). Results from the 2002 National Household Survey on Drug Abuse. Rockville, MD: Substance Abuse and Mental Health Services Administration.
Rehm, J., Üstün, T. B., Saxena, S., et al. (1999). On the development and psychometric testing of the WHO screening instrument to assess disablement in the general population. International Journal of Methods in Psychiatric Research, 8, 110-122.
Wright, D., Bowman, K., Butler, D., & Eyerman, J. (2005). Non-response bias from the National Household Survey on Drug Abuse incentive experiment. Journal for Social and Economic Measurement, 30(2-3), 219-231.
1 Five age groups will actually be used for the 2013 design, so that somewhat lower sampling rates are applied to persons 50+ years old than to those 35-49 years old. Only four age groups are shown in Tables 2 and 3.