
2010 National Survey on Drug Use and Health (NSDUH)

OMB: 0930-0110


2010 National Survey on Drug Use and Health

SUPPORTING STATEMENT



A. JUSTIFICATION


1. Circumstances of Information Collection


NSDUH Main Study

The Substance Abuse and Mental Health Services Administration (SAMHSA), sponsor of the National Survey on Drug Use and Health (NSDUH), submits this revision to the Office of Management and Budget (OMB) for approval to conduct the 2010 NSDUH. Conducting the NSDUH is essential to a critical objective of SAMHSA's mission: maintaining current data on the incidence and prevalence of substance use in the United States. The NSDUH was conducted periodically from 1971 to 1988 and has been conducted annually since 1990. The 2010 survey will be the thirtieth in the series (OMB No. 0930-0110).


The NSDUH is authorized by Section 505 of the Public Health Service Act (42 USC 290aa4 – Data Collection). Section 505 specifically authorizes annual data collection for monitoring the incidence and prevalence of illicit substance use and mental health problems, as well as the abuse of licit substances in the United States population.


The NSDUH provides current data on substance use incidence and prevalence for the U.S. population aged 12 or older, as well as for each State. Eight States are designated as large sample States (California, Florida, Illinois, Michigan, New York, Ohio, Pennsylvania, and Texas), each with a target sample size of 3,600. For the remaining 42 States and the District of Columbia, the target sample size is 900. This approach ensures sufficient sample in every State to support small area estimation (SAE) while maintaining efficiency for national estimates.

Information collected through the NSDUH has multiple applications, including: (1) the study of the epidemiology of substance abuse and mental health; (2) monitoring substance abuse and mental health trends and patterns; (3) identifying licit and illicit substances being abused (including those causing/contributing to medical, psychological, or social problems requiring emergency medical care or rehabilitation); (4) the study of the use of health care resources for treatment of substance abuse and mental health problems; and (5) assisting federal, state and local agencies in the allocation of resources, and the proper design and implementation of substance abuse prevention, treatment, and rehabilitation programs.


The NSDUH instrument is administered by computer-assisted interviewing (CAI) using a laptop computer. The household screening and respondent selection procedures will be administered using a hand-held computer. The length and content of the screening questions and the overall screening process will remain essentially the same in 2010 as in 2009. However, select questions that were present in the 2009 Screening instrument have been deleted for 2010. The 2010 Screener no longer asks respondents to confirm a second time that each of the rostered household members will live in the house for the majority of the data collection quarter, and no longer asks if there are any household members that were missed during the roster process. The removal of these two confirmation questions is not expected to impact respondent burden and no tests of procedures are planned.

The sample design for 2010 will be the same as it was for the 2009 CAI sample in that it will be large enough to facilitate the reporting of drug use incidence and prevalence estimates for each of the 50 States, and the District of Columbia. The expansion of the sample size, which was initiated in 1999, was proposed in the President’s FY 1998 budget request, and funded by the Congress, with stipulations described in the Conference Report 105-390, accompanying H.R. 2264, Appropriations for the Departments of Labor, Health and Human Services and Education for Fiscal Year 1998.

Mental Health Surveillance Study

In December 2006, a meeting of expert consultants was convened by SAMHSA’s Center for Mental Health Services (CMHS) to solicit recommendations for mental health surveillance data collection strategies. A summary of this meeting is included in Attachment X. The panel recommended conducting methodological studies to calibrate NSDUH mental health and impairment screening tools with a ‘gold standard’ clinical psychiatric interview to create a statistically sound measure that may be used to estimate the prevalence of serious mental illness (SMI) among adults (age 18+).


Based on these recommendations, a mental health surveillance study (MHSS) was conducted as an embedded split-sample follow-up study within the 2008 NSDUH. Analysis of data from the first two quarters of 2008 (approximately 750 adults) identified the single impairment scale that, combined with a psychological distress score, best predicted SMI as measured by the clinical interview. This impairment scale, a modified version of the World Health Organization Disability Assessment Scale (WHO-DAS) (Rehm et al., 1999), was administered in the 2009 NSDUH and will also be included in the 2010 NSDUH. A six-month MHSS analysis report was sent to OMB on September 10, 2008, and approval was granted for the entire 2009 NSDUH on September 22, 2008.


The Mental Health Surveillance Study will also be conducted in conjunction with the 2010 instrument. The modified version of the WHO-DAS will continue to be administered to the entire adult sample for the 2010 survey. The Structured Clinical Interview for DSM-IV-TR Axis I Disorders, Non-patient Edition (SCID-I/NP, 2/2007 revision) (First, Spitzer, Gibbon, & Williams, 2002) was tailored for the study and will continue to be used as the follow-up interview. Data from these interviews will be analyzed annually to update the calibration of the screening measure. The procedures for conducting the survey remain the same. Approximately 500 clinical follow-up interviews will be completed with adults (18+) during 2010, with approximately 125 completed per quarter (see Exhibit 1 in Attachment N).


Embedding the MHSS in the regular sample provides an opportunity to recruit respondents for the clinical follow-up without incurring additional screening costs. A sub-sample of respondents will be selected with probabilities based on their K-6 scores. The K-6 score is a measure of psychological distress that ranges from 0 (lowest) to 24 (highest).  A score of 13 or higher indicates serious psychological distress.  The score is derived from the following six items that refer to one month in the past 12 months when the respondent felt the most depressed, anxious, or emotionally stressed: how often felt nervous (DSNERV1), how often felt hopeless (DSHOPE), how often felt restless (DSFIDG), how often couldn't be cheered up (DSNOCHR), how often felt everything was an effort (DSEFFORT), and how often felt down, no good, or worthless (DSDOWN).  The K-6 score will be calculated within the CAI instrument and persons will be sampled using a selection algorithm that ensures an adequate sample size across the range of K-6 scores in order to maximize the power of the analysis.
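The scoring and selection logic can be illustrated with a short sketch. This is an illustration only, written under the standard K-6 coding (each of the six items is scored 0 to 4 and summed to a 0-24 total, with 13 or higher indicating serious psychological distress, as described above); the stratum boundaries and selection probabilities shown are hypothetical placeholders, not the actual MHSS sampling parameters.

    import random

    # Standard K-6 coding: each of the six items is scored 0-4 and summed to 0-24.
    K6_ITEMS = ["DSNERV1", "DSHOPE", "DSFIDG", "DSNOCHR", "DSEFFORT", "DSDOWN"]

    def k6_score(responses):
        """Sum the six K-6 items; responses maps item name -> 0-4."""
        return sum(responses[item] for item in K6_ITEMS)

    # Hypothetical selection probabilities by K-6 stratum, chosen only to show
    # sampling more heavily at higher distress levels; the real MHSS algorithm
    # uses its own strata and rates.
    SELECTION_PROBS = [(range(0, 8), 0.01), (range(8, 13), 0.05), (range(13, 25), 0.15)]

    def selected_for_followup(score, rng=random.random):
        for stratum, prob in SELECTION_PROBS:
            if score in stratum:
                return rng() < prob
        return False

    responses = {"DSNERV1": 3, "DSHOPE": 2, "DSFIDG": 3, "DSNOCHR": 2,
                 "DSEFFORT": 3, "DSDOWN": 1}
    score = k6_score(responses)   # 14, i.e., at or above the serious distress cutoff of 13
    print(score, selected_for_followup(score))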


2. Purpose and Use of Information


The purpose of the survey is to collect current data on substance use incidence and prevalence and mental health statistics for the total U.S. population as well as each State, and to issue reports on the survey results. The sample is sufficient to support small area estimates in each state and the District of Columbia while maintaining efficiency for national estimates.


NSDUH data are used by SAMHSA, the National Institute on Drug Abuse (NIDA), the Centers for Disease Control and Prevention, the Office of National Drug Control Policy (ONDCP), and other Federal agencies interested in the incidence and prevalence of substance use. The data are used to design prevention programs, respond to inquiries on the extent of substance use, estimate treatment need, study the social and economic impact of substance abuse, identify the correlates of substance use, and evaluate the overall impact that Federal and State programs have on drug demand. The NSDUH will provide a useful indicator of individual States’ overall success at reducing youth substance use. In conjunction with other data sources, the NSDUH data will provide a means for assessing and improving outcomes of prevention and treatment services. It will help SAMHSA identify areas where serious substance abuse problems exist and provide assistance to States to help them develop and adopt targeted responses for those problems. Also, many special requests for survey information emanate from the White House, Congress, and various State and local government agencies. The questionnaire asks for the minimum information necessary to meet the needs of Federal policy makers and the substance abuse research, prevention, and treatment communities.


The Department of Health and Human Services (DHHS) continues to affirm the need for annual NSDUH surveys as essential to the President's annual Drug Control Strategy and Federal objectives related to substance use. Since the NSDUH is the nation's only source of reliable national substance use data on the U.S. population, this survey will ensure that SAMHSA and other Federal, State, and local agencies will have timely data available for release by late summer of 2011. The ability to respond effectively and efficiently to the continually changing dynamics of the drug culture is critical to sound prevention and treatment strategies. Data from the NSDUH on past month substance use among youth (12-17) are also used as a Government Performance and Results Act (GPRA) measure for the Centers for the Application of Prevention Technologies and the State Incentive Grant programs of the Center for Substance Abuse Prevention. The NSDUH is also used to address a weakness found in the Program Assessment Rating Tool (PART) review of the Substance Abuse Treatment Block Grant program, resulting in the National Outcome Measures (NOMS) project. NOMS is a SAMHSA performance-based management initiative that involves tracking a set of key outcome measures at the National and State levels annually. Discussions between SAMHSA and the States have identified specific NSDUH variables to be tracked under this system. Among them are:


  1. Cigarettes (past 30 day use)

  2. Other tobacco products (past 30 day use)

  3. Alcohol (past 30 day use)

  4. Alcohol (age 12-20 vs. age 21 or over)

  5. Binge drinking (age 12-20 vs. age 21 or over)

  6. Heavy drinking (age 12-20 vs. age 21 or over)

  7. Marijuana (past 30 day use)

  8. Any illicit drug other than marijuana (past 30 day use)

  9. Cigarettes (average number of days used in the past 30 days)

  10. Alcohol (average number of days used in the past 30 days)

  11. Marijuana (average number of days used in the past 30 days)

  12. Age at first use – cigarettes

  13. Age at first use – other tobacco products

  14. Age at first use – alcohol

  15. Age at first use – marijuana

  16. Age at first use – any illicit drug other than marijuana

  17. Perceived risk of cigarette use

  18. Perceived risk of marijuana use

  19. Perceived risk of heavy drinking

  20. Perceived risk of LSD use

  21. Perceived risk of heroin use

  22. Perceived risk of cocaine use

  23. Perceived disapproval of cigarette use – adolescents

  24. Perceived disapproval of marijuana use – adolescents

  25. Perceived disapproval of heavy drinking – adolescents

  26. Perceived peer disapproval of cigarette use – adolescents

  27. Knowledge/consequences of workplace drug policies – adults

  28. Driving under the influence of alcohol

  29. Exposure to prevention messages – adolescents

  30. Adolescent communication with parent(s) about drug and alcohol use

  31. Parental communication with adolescent about drug and alcohol use


Because mental health issues are correlates of substance abuse, SAMHSA continues to include questions on mental health and utilization of mental health services in the NSDUH. Questions on mental health, in conjunction with questions on substance use, treatment for substance use, and mental health services, greatly enhance the ability to characterize and understand the co-occurrence and treatment of mental illness and substance use problems in the U.S. SAMHSA will use data from the mental health surveillance study described in item A.1 above to estimate the prevalence of SMI among adults.


To look specifically at depression, the 2004 NSDUH introduced two depression modules – one for adults and one for youths. The data collected focus on lifetime and past year prevalence of major depressive episodes, past year treatment for depression, and the severity of depression and its impact on functioning. These data are used to estimate the prevalence of depression and the need for treatment in the U.S., and will allow further research into the interaction between depression and drug use. These modules were included in the 2005-2009 NSDUHs, and will be included in the 2010 NSDUH as well. A detailed discussion of the 2010 questionnaire is presented in section B.2.



3. Use of Information Technology


The NSDUH study has been administered via computer-assisted interviewing (CAI) since 1999. The 2010 interview will be administered using audio computer-assisted self-interviewing (ACASI) for the more sensitive questions, representing most of the interview; the remainder of the interview will be administered using computer-assisted personal interviewing (CAPI).


The CAPI/ACASI technology affords a number of advantages in the collection of survey data. First, this methodology permits the instrument designer to incorporate more complex routings into the questionnaire compared to a paper-and-pencil instrument. The computer can be programmed to implement complex skip patterns and fill specific wordings based on answers previously provided by the respondent. Errors made by interviewers (and respondents) due to faulty implementation of skip instructions are virtually eliminated. A second feature relates to the consistency of data. The computer can be programmed to identify inconsistent responses and attempt to resolve them through respondent prompts. This reduces the need for most manual and machine editing, thus saving both time and money. In addition, it is likely that respondent-resolved inconsistencies will result in data that are more accurate than when inconsistencies are resolved using editing rules. Also, the ACASI technology permits nonreaders to complete the interview in total privacy.
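The kind of skip-pattern and consistency logic described above can be sketched as follows. This is a simplified, hypothetical fragment, not the actual NSDUH CAI program; the question names and the particular consistency rule are invented for illustration.

    # Hypothetical CAI fragment: a skip pattern plus a respondent consistency prompt.
    # Question names (AGE, EVER_USED, AGE_FIRST_USE) are illustrative only.

    def ask(prompt):
        return input(prompt + " ")

    def administer():
        age = int(ask("What is your age?"))
        ever_used = ask("Have you ever used marijuana? (yes/no)").lower()

        # Skip pattern: follow-up questions are asked only of lifetime users.
        if ever_used != "yes":
            return {"AGE": age, "EVER_USED": "no"}

        age_first_use = int(ask("How old were you the first time you used marijuana?"))

        # Consistency check: age at first use cannot exceed current age, so the
        # program prompts the respondent to resolve the discrepancy.
        while age_first_use > age:
            print("Earlier you reported your age as", age,
                  "- please re-enter your age at first use.")
            age_first_use = int(ask("How old were you the first time you used marijuana?"))

        return {"AGE": age, "EVER_USED": "yes", "AGE_FIRST_USE": age_first_use}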


CAPI/ACASI technology also speeds data processing and analysis; for example, a number of back-end processing steps, including editing, coding, and data entry, become part of the data collection process. Data are transmitted via modem rather than by mail. These efficiencies save time due to the speed of data transmission and the receipt of data in a format suitable for analysis. Tasks formerly completed by clerical staff are accomplished by the CAPI/ACASI program. In addition, the cost of printing and mailing paper questionnaires is eliminated.


There is evidence that the ACASI methodology is especially useful for surveys of sensitive topics. Providing the respondent with a methodology that improves privacy and confidentiality makes reporting of potentially embarrassing, stigmatizing, and illegal behaviors (e.g., drug use, mental health issues) less threatening and enhances response validity and response rates.


The 2010 NSDUH will use iPAQ hand-held computers to conduct household screening interviews. The primary advantage of this computer-assisted methodology is improved accuracy in selecting the correct household member for an interview. The computer automatically selects the correct household member based on the demographic variables entered, thus substantially reducing the probability for human error.


The selection of interview respondents for the clinical follow-up interview will be pre-programmed into the CAI instrument and will be based on the respondent's K-6 score. For those selected follow-up interview respondents, follow-up interview recruitment scripts that are programmed within the 2010 NSDUH main study questionnaire will be administered at the end of the initial interview using computer-assisted personal interviewing (CAPI). The field interviewer will not know if the respondent is selected for the follow-up interview until the recruiting scripts appear at the end of the CAI program. Contact information for those who agree to participate will be entered in the laptop. This information will be posted to a secure website for access by the clinician assigned to contact the respondent for the follow-up interview. The follow-up SCID interview will be administered via telephone on a paper and pencil (PAPI) SCID instrument.


4. Efforts to Identify Duplication


The NSDUH is the only survey of substance use in the United States with a sample size capable of producing high quality national and separate state incidence and prevalence estimates, especially by detailed demographic variables. No other survey provides the level of detail on substance use and abuse as provided by the NSDUH. No duplication of effort has been identified.


Several other surveys and data systems collect data on substance use, abuse, and dependence. However, it is important to understand the methodological differences between the different surveys and the impact that these differences could have on estimates of substance use prevalence.


The Monitoring the Future (MTF) study is a national survey, sponsored by the National Institute on Drug Abuse (NIDA), that tracks substance use trends and related attitudes among America's adolescents. It is a school-based survey of 8th, 10th, and 12th graders that includes an ongoing panel study of each graduating class conducted by mail. Since the NSDUH is an annual survey of the civilian, noninstitutionalized population of the United States aged 12 years old or older, the two studies clearly have different populations of interest. In addition, the MTF does not survey dropouts, a group that NSDUH has shown to have higher rates of illicit drug use (Gfroerer et al., 1997).


Research has shown that the mode of a survey can have considerable effects on the results, especially with items that are prone to social desirability bias (Groves, 1989). The MTF conducts self-administered surveys in a school setting and by mail. The NSDUH is conducted in the household using a computer-assisted instrument. When the NSDUH is subset to the same student population covered by the MTF, comparisons between the MTF and NSDUH estimates generally have shown NSDUH substance use prevalence levels to be lower than MTF estimates, with differences tending to be more pronounced for 8th graders. The lower prevalences in the NSDUH may be due to more underreporting in the household setting as compared with the MTF school setting.


The Youth Risk Behavior Survey (YRBS) is a component of the Centers for Disease Control and Prevention's (CDC's) Youth Risk Behavior Surveillance System (YRBSS), which biennially measures the prevalence of six priority health risk behavior categories: (a) behaviors that contribute to unintentional and intentional injuries; (b) tobacco use; (c) alcohol and other drug use; (d) sexual behaviors that contribute to unintended pregnancy and sexually transmitted diseases (STDs); (e) unhealthy dietary behaviors; and (f) physical inactivity. The YRBSS includes national, State, territorial, and local school-based surveys of high school students in grades 9 through 12. The students are given a self-administered questionnaire during a regular class period. Although the YRBS includes measures on tobacco, alcohol, and illicit drugs, it is not a comprehensive substance use survey. It only includes a few basic questions on these topics. Like the MTF, this study is targeted at a different population and collects data in a different setting than the NSDUH. As a result, the prevalence estimates of illicit drug use are generally much higher from the YRBS.


In 2000, a series of papers comparing different aspects of the NHSDA, MTF, and the YRBS was commissioned by the U.S. Department of Health and Human Services (DHHS). Under contract with the Office of the Assistant Secretary for Planning and Evaluation, Westat, Inc., identified and funded several experts in survey methods to prepare these papers. The papers were published in the Journal of Drug Issues (Hennessy & Ginsberg, 2001). The major findings of this study indicate that differences in survey methodology may affect comparisons of prevalence estimates among youths. The study also found that all three surveys were well designed and managed, but they each have different purposes.


The Behavioral Risk Factor Surveillance System (BRFSS) is an annual, State-based telephone survey of the civilian, noninstitutionalized adult population aged 18 or older and is sponsored by the Centers for Disease Control and Prevention (CDC). Since 2002, BRFSS has collected data from all 50 States, the District of Columbia, Puerto Rico, the U.S. Virgin Islands, and Guam using a computer-assisted telephone interviewing (CATI) design. BRFSS collects information on access to health care, health status indicators, health risk behaviors (including cigarette and alcohol use), and the use of clinical preventive services. More than 350,000 adults are interviewed each year. National data are calculated using a median score across States.


NSDUH has shown consistently higher rates of binge drinking than BRFSS. The use of audio computer-assisted self-interviewing (ACASI) in NSDUH, which is considered to be more anonymous and yields higher reporting of sensitive behaviors, was offered as an explanation for the lower rates in BRFSS (Miller et al., 2004).


Sponsored by the National Institute on Alcohol Abuse and Alcoholism (NIAAA), the National Epidemiologic Survey on Alcohol and Alcohol Related Conditions (NESARC) is another study that contains comprehensive assessments of drug use, abuse, and dependence, as well as associated mental disorders. While the NSDUH is an annual survey of the civilian, noninstitutionalized population of the United States aged 12 years old or older, the NESARC was designed to make inferences for persons aged 18 or older and is conducted in waves (2001/2002 and 2004/2005). The NESARC is designed to be a longitudinal survey, whereas the NSDUH provides annual cross-sectional data. Another methodological difference is that sensitive questions in the NSDUH are self-administered while the NESARC is wholly interviewer-administered. Methodological variables, including factors related to privacy and anonymity, and differences in diagnostic instrumentation result in different prevalence estimates. In particular, NSDUH produces substantially higher rates of use of illicit drugs (Grucza et al., 2007).


The Office of Applied Studies, SAMHSA, is in contact with all major federal health survey managers and is aware of no other efforts to calibrate mental health screening and impairment scales to a structured clinical interview to derive national estimates of Serious Mental Illness. Mental health assessment experts convened in December 2006 and recommended a study such as the NSDUH Mental Health Surveillance Study to address a prominent data gap; thus, there is no evidence of duplication of effort.


5. Involvement of Small Entities


This survey does not involve small businesses or other such entities.


6. Consequences If Information Collected Less Frequently


The existence of substance abuse patterns and behaviors is a rapidly evolving and changing phenomenon, which calls for timely measurement and analysis of the data. It is imperative to continue the Survey on an annual basis for three reasons:


  1. the statutory mandate for annual data collection on the national incidence and prevalence of substance abuse,


  2. the continued demand within SAMHSA, ONDCP and other federal agencies for data on the nature and size of the nation’s substance abuse problem, and


  3. the requirement for current data for each of the 50 States and the District of Columbia, to evaluate the effectiveness of programs designed to reduce the use of illicit substances.


7. Consistency with the Guidelines in 5 CFR 1320.5(d)(2)


This information collection fully complies with 5 CFR 1320.5(d)(2).


8. Consultation Outside the Agency


A Federal Register notice soliciting comments on the 2010 NSDUH was published on April 7, 2009 (Vol. 74, page 15729); no comments were received.


It is DHHS policy that all national surveys are reviewed by the Office of the Assistant Secretary for Planning and Evaluation (ASPE). The review for the 2010 survey was conducted in April 2009. The DHHS Data Council has been kept informed about the status and plans for the 2010 NSDUH.


Appendix A of the Supporting Statement contains a listing of current consultants on the main NSDUH questionnaire.


There are no unresolved issues resulting from these consultations.


9. Payment to Respondents


On October 18, 2001, OMB approved the use of a $30.00 incentive for the 2002 NSDUH. The 2002 NSDUH experienced an increase in the weighted overall response rate (screening × interviewing) from 67% to 71%. Prior OMB approval was provided for the continued use of the $30.00 incentive for the 2003-2008 NSDUH surveys. The weighted overall response rates for 2003, 2004, 2005, 2006, 2007, and 2008 were 71%, 70%, 70%, 68%, 67%, and 67%, respectively. The 2010 NSDUH calls for a similar incentive plan, whereby a $30.00 incentive payment will be given to respondents upon completion of the interview. The incentive payment is mentioned in the following respondent materials: Lead Letter (Attachment D), Appointment Card (Attachment F), Study Description (Attachment G), Introduction and Informed Consents (Attachment L), Screening Questions (Attachment H), Question and Answer Brochure (Attachment I), Unable to Contact Letters (Attachment Q), Call-Me Letters (Attachment R), Refusal Letters (Attachment S), and Interview Payment Receipt (Attachment O).


The telephone interview to be completed for the MHSS will constitute an additional burden on respondents, and may make it more difficult to obtain respondent participation. To maintain adequate response rates, SAMHSA believes it is necessary to offer respondents an additional $30 payment for completing the follow-up clinical interview. The clinical interview will take about the same amount of time as the initial interview, so an equitable incentive is necessary. Research studies have shown that providing incentives before the interview increases the likelihood that participants will complete the interview (Groves & Couper, 1998). Therefore, SAMHSA believes it is necessary to provide the additional $30 follow-up incentive at the end of the NSDUH main interview, once the participant agrees to the follow-up interview. Prior OMB approval was provided for the use of the $30.00 incentive in the 2008 and 2009 MHSS. Respondents who agree to complete the follow-up interview will receive a total of $60 at the end of the initial interview. The cash payment for the follow-up interview is mentioned in the following respondent materials: Follow-up Interview Recruitment Scripts (Attachment B, pages 442-443), Follow-up Study Description (Attachment T), and Follow-up Interview Payment Receipt (Attachment W).


10. Assurance of Confidentiality


Concern for the confidentiality and protection of respondents’ rights has always played a central part in the implementation of the National Survey on Drug Use and Health and will continue to be given the utmost emphasis.


Interviewers are thoroughly educated in methods for maximizing a respondent’s understanding of the government’s commitment to confidentiality. Furthermore, interviewers make every attempt to secure an interview setting in the respondent’s home that is as private as possible, particularly when the respondent is a youth. (Attachment A: notice of approval of Federal-Wide Assurance, submitted by RTI to the Office for Human Research Protections (OHRP), DHHS in compliance with the requirements for the protection of human subjects (45 CFR 46)).


The interview incorporates several procedures to ensure that respondents’ rights will be protected. The interviewer introduces himself/herself and the session with a consent statement. This statement will appear in the Showcard Booklet (Attachment L) and will be read out loud to each interview respondent. As part of the process for obtaining informed consent, respondents are given a Study Description (Attachment G), which includes information on the Confidential Information Protection and Statistical Efficiency Act of 2002 (included as Title V in the E-Government Act of 2002, P.L. 107-347) and the protection that it affords. Specifically, the Study Description states that respondents’ answers will only be used by authorized personnel for statistical purposes and cannot be used for any other purpose.


The questionnaire uses techniques to afford privacy for the respondent during the interview process. The audio computer-assisted self-interviewing (ACASI) portion of the instrument will maximize privacy and confidentiality by giving control of the sensitive questionnaire sections directly to the respondent. The ACASI methodology allows the respondent to listen to questions through a headset and/or to read the questions on the computer screen, then key his or her own responses into the computer via the keyboard.


Hard copy materials generated during the course of the interview are marked for identification by the interviewer according to specific instructions. Name, address, or other easily traceable marks are never noted on the hard copy materials, except on the Quality Control Form (Attachment C) at the end of the interview (with the respondent’s permission); even then, the name is not recorded for interview respondents. Furthermore, the respondent places the Quality Control Form in an envelope and seals it after recording the information. The respondent is told of these procedures in advance. The Quality Control Form is mailed directly to the Contractor’s main office in North Carolina.


With the CAI methodology, all sensitive data are entered privately by the respondent, and completed interview data are electronically transmitted to the Contractor’s offices on a regular basis via secure encrypted data transmission. Interviewers are unable to review or to edit questionnaire data as the completed interview files are locked. Also, once the respondent has completed the ACASI portion of the interview, the ACASI section is locked, so that the interviewer is unable to back up into this area and review the respondent’s most sensitive data. On the data file, respondents are identified only by a link number assigned to screening files and questionnaires/interviews. Although the link number is associated with a location number and a dwelling unit number, this location information is deleted by the Contractor before the delivery of data to SAMHSA. The dwelling unit address information, which is maintained in a separate file for Contractor use in sampling, fielding, and weighting cases, is purged at the completion of data processing.


After delivery and acceptance of the final survey data files, all Quality Control Forms are destroyed, thus eliminating any means of identifying addresses of sample dwelling units. The permanent sampling records show only the general location in which interviews were conducted; there is no record of specific dwelling units contacted.


The MHSS will incorporate several procedures to ensure that respondents' rights will be protected, including procedures developed for the main NSDUH. The interviewer will introduce the follow-up interview with recruitment scripts (Attachment B, pages 442-443). These scripts will appear on the computer screen at the end of the initial CAI interview and will be read out loud to each interview respondent selected for the MHSS. As part of the process for obtaining informed consent for the follow-up interview, respondents will be given a Follow-up Study Description (Attachment T), which includes information on the Confidential Information Protection and Statistical Efficiency Act of 2002 (included as Title V in the E-Government Act of 2002, P.L. 107-347) and the protection that it affords. Specifically, the Follow-up Study Description states that respondents' answers will only be used by authorized personnel for statistical purposes and cannot be used for any other purpose. The dwelling unit address information (maintained in a separate file for Contractor use in sampling, fielding, and weighting cases), as well as the respondent's first name and phone number, will be destroyed when all final data files have been delivered to SAMHSA and approval has been received from the SAMHSA Project Officer.


Although the respondent's first name and phone number will be collected within the main interview, they will be used only for re-contact purposes. Once the CAI data are transmitted and arrive at RTI, the respondent's name, phone number, and text regarding the best time to call will be split off into a separate database linked only by the random number ID. The rest of the CAI data will be converted into a SAS data file format and merged onto the master data file.
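The split described above can be sketched as follows. This is an illustrative example only: the field names and file layout are hypothetical, and the actual RTI processing uses SAS rather than the Python shown here.

    import csv

    CONTACT_FIELDS = {"first_name", "phone", "best_time_to_call"}   # illustrative names

    def split_contact_info(records):
        """Separate re-contact information from substantive interview data,
        keeping only the random link ID in common between the two files."""
        contact_rows, interview_rows = [], []
        for rec in records:
            contact_rows.append({"link_id": rec["link_id"],
                                 **{f: rec[f] for f in CONTACT_FIELDS}})
            interview_rows.append({k: v for k, v in rec.items()
                                   if k not in CONTACT_FIELDS})
        return contact_rows, interview_rows

    def write_csv(path, rows):
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
            writer.writeheader()
            writer.writerows(rows)

    records = [{"link_id": "1234567", "first_name": "Pat", "phone": "555-0100",
                "best_time_to_call": "evenings", "K6_SCORE": 14}]
    contacts, interviews = split_contact_info(records)
    write_csv("contact_db.csv", contacts)         # re-contact database (later destroyed)
    write_csv("interview_data.csv", interviews)   # substantive data for the master file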


The follow-up interview will be conducted over the telephone by clinicians trained in the administration of the SCID. All clinical interviewers will receive training on the importance of keeping all information learned from respondents confidential. A confidentiality pledge will be read and signed by all clinical interviewers during the project training process (See Attachment Y).


Follow-up interview materials are marked for identification by the interviewer using a randomly-generated 7-digit number called the QuestID. The respondent’s address or other easily traceable marks will not be included on the SCID paper form.


The permanent sampling records will contain no record of which addresses were selected for the MHSS.


There will be no Privacy Act System of Records established for this effort.


11. Questions of a Sensitive Nature


As mentioned in section A.1 above, SAMHSA is required under Section 505 of the Public Health Service Act to report annually on the incidence and prevalence of substance abuse and mental health problems. Many safeguards, including the mode of questionnaire administration, have been incorporated into the NSDUH study design in order to improve the collection of data on sensitive issues. As part of the interview process and upon introduction, the interviewer informs the respondent why the information is necessary, indicates who sponsors the Survey, requests consent to conduct an interview, and explains the procedures that ensure confidentiality. Verbal parental consent is obtained for respondents between the ages of 12 and 17 years old. (See Attachment L, Showcard Booklet, for verbal consent text.) However, every attempt is made to ensure that the actual interview is conducted without parental observation or intervention.


Answers to sensitive questions, including all substance use questions and mental health questions, are obtained by closed interview design. In the ACASI administration, the respondent enters his/her answers directly into the computer. The interviewer does not see the answers. Data from the electronic interviews are transmitted regularly to the Contractor via secured data transmission. All CAI data are telecommunicated to the Contractor’s office, and are identified with a respondent number, which is a code associated with the sample dwelling unit. There is no system of records which identifies respondents. The questionnaire data are processed immediately upon receipt at the Contractor’s facilities and all links between a questionnaire and the respondent’s address are destroyed after all data processing activities are completed.


No signed consent forms are used; however, verbal consent is obtained as explained above. The listing of selected dwelling unit locations and addresses are kept under locked and secured conditions and destroyed after all data processing activities are completed.


The follow-up interview will be delivered by mental health clinicians trained in administering the SCID and deriving DSM-IV diagnoses from structured clinical interviews. The clinician will administer the SCID over the telephone from a private location in his/her home or office. When calling to conduct the SCID, the clinician will ask the respondent to go to a private location for the duration of the interview. The clinician will explain to the respondent that the only identifying information he/she has is the respondent’s first name and telephone number and that this identifying information will be discarded after data collection ends. The clinician will repeat the confidentiality assurances and will ask for permission to record the interview (Attachment U) for quality control purposes. Permission to record the interview is not a requirement to complete the interview. The clinician will note the respondent’s answers on a paper SCID response sheet and will keep the questionnaire in a secure location until shipping them via Federal Express to RTI for a quality control (QC) review by specially-trained clinical supervisors. After this QC step, the paper SCIDs will be technically edited and keyed. All paper SCIDs and audio recordings will be destroyed approximately six months after the end of data collection.


All clinical interviewers will be issued project-owned laptop computers, preconfigured with the following software.


  • Pointsec whole-disk encryption software

  • Custom software to electronically capture audio recordings of clinical interviews

  • Custom software to automate upload of audio files


All clinical interviews for which the respondent grants permission will be recorded using a laptop computer that is connected between the clinical interviewer’s phone and wall phone jack using a telephone line splitter. The audio recording of clinical interviews will proceed according to the following sequence, in which the clinical interviewer:


  1. Connects their telephone to their project-owned laptop using a line splitter

  2. Tests their laptop-phone connection by placing a test call

  3. Initiates a telephone call to the respondent and obtains informed consent

  4. Uses laptop software to begin recording the clinical interview

  5. Completes the clinical interview and stops recording

  6. Uses custom laptop computer software to securely upload the audio file to RTI


The above protocol provides a high degree of protection for the confidentiality of the audio files. Audio files are encrypted both at rest and in transit. The Pointsec whole-disk encryption software protects the audio files while they reside on the clinical interviewers' laptops. To move interview files from the laptops back to servers at RTI, custom software on the laptop will use secure FTP and/or HTTPS so that the entire file transfer is encrypted while crossing the public Internet. Once files are received at RTI, they will be protected by the complete set of security controls that protect RTI's corporate computer networks.
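A minimal sketch of the kind of encrypted HTTPS upload described above is shown below. It is not the custom RTI software: the endpoint URL, credential, and form fields are hypothetical placeholders, and the sketch shows only that the file is sent over an encrypted HTTPS connection.

    # Illustrative only: upload an audio file over HTTPS so it is encrypted in
    # transit. URL, credential, and field names are hypothetical placeholders.
    import requests

    UPLOAD_URL = "https://example.org/mhss/upload"   # hypothetical endpoint
    AUTH_TOKEN = "replace-with-issued-credential"    # hypothetical credential

    def upload_recording(path, quest_id):
        with open(path, "rb") as audio:
            response = requests.post(
                UPLOAD_URL,
                headers={"Authorization": f"Bearer {AUTH_TOKEN}"},
                data={"quest_id": quest_id},
                files={"audio": audio},
                timeout=60,
            )
        response.raise_for_status()   # fail loudly if the transfer did not succeed
        return response.status_code

    # Example call (assumes the recording file exists locally):
    # upload_recording("interview_1234567.wav", "1234567")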


Subsequent to the transfer of audio files to RTI, authorized project researchers and management staff will use a secure web-based file sharing facility to either download or directly listen to the audio recordings. Again, any transfer (even in-place playback) of the recordings will be strongly encrypted during transport across the public Internet using the HTTPS protocol. The project-owned laptops used by these authorized project staff will be preconfigured with Pointsec disk encryption software to ensure full data protection.


In summary, at any point in time, an audio file may potentially reside in only three places:


  • On the encrypted hard drive of the clinical interviewer who performed the interview

  • On an RTI file server, protected from the public Internet by RTI's corporate IT security controls

  • On the encrypted hard drive of a project-owned laptop issued to a properly authorized person for use solely on this study.


All audio recordings will be erased (from RTI file servers and project-owned laptops) by June 30, 2011.

12. Estimates of Annualized Hour Burden


The total sample size for the 2010 National Survey is approximately 67,500 persons. This sample size is required to ensure reliable state-level estimates for each of the 50 states, as well as estimates on the many sub-populations included in NSDUH specifications, e.g., Blacks, Hispanics, youth, etc. It is necessary to screen approximately 190,800 households to obtain the requisite survey sample size.


The experience with the first quarter of 2009 indicated that the average interview time remained approximately 60 minutes.


Based on the 2010 questionnaire having the same length, it is estimated that the average amount of time required to administer the 2010 CAI questionnaire will also be approximately 60 minutes, including 2 minutes for the Quality Control Form. Administration of the screening questions will take an average of 5 minutes per dwelling unit. Based on the 2010 clinical interview having roughly the same length as the 2008 clinical interview, the follow-up clinical interview is estimated to take on average an additional 60 minutes.


Screening verification and interview verification contacts each take an average of 4 minutes and are administered only to a subsample of cases. An approximately 15 percent random sample of each interviewer's work (i.e., completed interviews) will be verified. In addition to the verification of completed interviews, certain completed screening codes (vacant, not a primary residence, not a dwelling unit, dwelling unit contains only military personnel, respondents living at the residence for less than half of the quarter, and no one selected for interview) will be verified. Previous experience indicates that approximately 60 percent of all screenings will result in one of these six screening codes. An approximately 5 percent random sample of all such screening codes will be selected for verification follow-up.
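As a cross-check against the burden table below, the interview verification figure follows directly from these rates:

    67,500 completed interviews × 0.15 ≈ 10,125 interview verification contacts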


The hourly wage of $14.64 was calculated based on weighted data from the 2008 NSDUH respondents' personal annual income.


The data collection field period for the 2010 NSDUH is 12 months long, spanning the period from January through December 2010. The respondent burden for the 2010 NSDUH is shown in the following table:


Estimated Burden for 2010 NSDUH




Instrument                          No. of        Responses per   Hours per   Total burden   Hourly      Annualized
                                    respondents   respondent      response    hours          wage rate   hourly costs

Household Screening                     190,800   1               0.083             15,836   $14.64        $231,839

Interview                                67,500   1               1.000             67,500   $14.64        $988,200

Clinical Follow-up Certification             24   1               1.000                 24   $14.64            $351

Clinical Follow-up Interview                500   1               1.000                500   $14.64          $7,320

Screening Verification                    5,400   1               0.067                362   $14.64          $5,300

Interview Verification                   10,125   1               0.067                678   $14.64          $9,926

TOTAL                                   190,824                                     84,900               $1,242,936
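Each row of the table follows the standard burden computation: total burden hours = number of respondents × responses per respondent × hours per response, and annualized cost = total burden hours × the $14.64 hourly wage. For example, for household screening:

    190,800 × 1 × 0.083 ≈ 15,836 burden hours
    15,836 hours × $14.64 ≈ $231,839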


13. Estimates of Annualized Cost Burden to Respondents


There are no capital, startup, operational, or maintenance costs to respondents.




14. Estimates of Annualized Cost to the Government


Total costs associated with the 2010 National Survey on Drug Use and Health are estimated to be $53,490,664 over a 40-month contract performance period. Of the total costs, $49,610,164 is for contract costs, e.g., sampling, data collection, processing, and reports, and approximately $3,880,500 represents SAMHSA costs to manage and administer the survey. The annualized cost is approximately $16,047,199. This represents a total increase in contract costs from the 2009 survey to the 2010 survey of approximately $1,764,059.


Total costs associated with the MHSS are estimated to be $996,993 over a 29-month performance period. Of the total costs, $828,993 is for contract costs, e.g., sampling, data collection, processing, and reports, and approximately $168,000 represents SAMHSA costs to manage and administer the survey. The annualized cost is approximately $412,549. This represents a total increase in MHSS costs from the 2009 MHSS of approximately $80,846. With the drop in the MHSS sample size from 1,500 to 500 in 2009, the project was able to downsize the clinical interviewing staff by one-third, retaining the best performing interviewers. Because some attrition may occur in 2010, however, replacement clinical interviewer training may be necessary.


15. Changes in Burden


Currently there are 84,906 hours in the 2009 OMB inventory. The 2010 NSDUH requests 84,906 hours, representing no change from the 2009 NSDUH.


16. Time Schedule, Publication and Analysis Plans


Plans for the 2010 Survey data involve four major types of products: an early report presenting results from the 2010 NSDUH (available at the annual DHHS press release of NSDUH data); two state-specific reports; five analytic reports; and a public use data file. Descriptions of the major publications and their delivery dates follow.


NATIONAL FINDINGS FROM THE 2010 NSDUH (September 2011) - This report will present highlights and detailed findings from the 2010 data collection year. It consists of a series of exhibits, both graphic and tabular, presenting recent trends in substance use by recency of use and numerous demographic characteristics. Essentially, this report examines substance use incidence and prevalence in 2010, trends since 2002, demographic correlates of substance use, substance use patterns, and public perceptions of the harmfulness of illicit substance use as well as opportunities to use drugs. Final weighted and edited data are used to construct the tables.


STATE FINDINGS REPORT (Early 2012) - A state data report (approximately 200 pages) will present substance use incidence and prevalence estimates for each of the 50 states and the District of Columbia. It will also document the methodology in detail.


ANALYTIC REPORTS - Additional data analyses and special analytical papers will be produced and released as part of the SAMHSA Office of Applied Studies (OAS) Analytic Series (the "A" report series). Reports of findings from the MHSS will also be produced. Additional topics and dates of completion for these reports are currently undetermined. Supplemental tables involving population projections for specified licit and illicit substances will also be produced and made available to those requesting such information.



2010 NSDUH PROJECT SCHEDULE


ACTIVITY                                                          TIME FRAME

Design and select area frame sample                               December 2008 to March 2009

Prepare field Segment Kits                                        January 2009 to May 2009

Recruit/train field staff to list Sample Dwelling Units (SDUs)    March 2009 to May 2009

Field listing and subsequent keying of SDUs                       April 2009 to January 2010

Recruit remaining field staff and generate all required
materials/assignments for distribution                            August 2009 to January 2010

Finalize programming of NSDUH interview                           August 2009 to October 2009

Prepare for and conduct field staff training                      May 2009 to January 2010

Conduct NSDUH interviews                                          January 2010 to December 2010

Data processing and file preparation                              January 2011 to March 2012

Trend Tables and Special Tabulations:
  -- Shells                                                       March 2011
  -- Annual Tables                                                June 2011

Raw Data Files                                                    May 2011

Preliminary Weighted Data Files                                   May 2011

Final analytic data file and documentation                        September 2011

Sampling Error Report                                             July 2011

National Findings                                                 September 2011

State Small Area Estimation Analytical Report                     August 2011 to March 2012

Public Use Data File                                              December 2011

Methodological Resource Book                                      March 2012



17. Display of Expiration Date


The OMB expiration date will be displayed on all data collection instruments.


18. Exceptions to Certification Statement


The certifications are included in this submission.


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Respondent Universe and Sampling Methods


The respondent universe for the 2010 NSDUH study is the civilian, noninstitutionalized population aged 12 years old and older within the 50 states and the District of Columbia. Consistent with the NSDUH designs since 1991, the 2010 NSDUH universe includes residents of noninstitutional group quarters (e.g., shelters, rooming houses, dormitories), residents of Alaska and Hawaii, and civilians residing on military bases. Persons excluded from the universe include those with no fixed household address, e.g., homeless transients not in shelters, and residents of institutional group quarters such as jails and hospitals.


The 2010 sample design will consist of a stratified, multi-stage area probability design (see Attachment N for a detailed presentation of the sample design). As with most area household surveys, the 2010 design will offer the advantage of minimizing interviewing costs by clustering the sample. This type of design also maximizes coverage of the respondent universe since an adequate dwelling unit and/or person-level sample frame is not available. Although the main concern with area surveys is the potential variance-increasing effect of clustering and unequal weighting, these potential problems will be directly addressed in the 2010 design by selecting a rather large sample of clusters at the early stages of selection and by selecting these clusters with probability proportionate to a composite size measure. This type of selection maximizes precision by allowing one to achieve an approximately self-weighting sample within strata at the latter stages of selection. Furthermore, it is attractive because the design of the composite size measure makes the interviewer workload roughly equal among clusters.
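The probability-proportionate-to-size selection mentioned above can be illustrated with a short sketch. This is a generic illustration of PPS systematic sampling against a composite size measure, not the actual NSDUH selection program; the cluster names and size measures are invented.

    import random

    # Hypothetical composite size measures for clusters within one stratum.
    clusters = {"tract_A": 120.0, "tract_B": 80.0, "tract_C": 200.0,
                "tract_D": 60.0, "tract_E": 140.0}

    def pps_systematic_sample(size_measures, n):
        """Select n clusters with probability proportional to size, using
        systematic sampling over the cumulative size measure."""
        items = list(size_measures.items())
        total = sum(size for _, size in items)
        interval = total / n
        start = random.uniform(0, interval)
        targets = [start + i * interval for i in range(n)]

        selected, cumulative, t = [], 0.0, 0
        for name, size in items:
            cumulative += size
            # A cluster is selected once for each target falling within its
            # segment of the cumulative size measure.
            while t < n and targets[t] <= cumulative:
                selected.append(name)
                t += 1
        return selected

    print(pps_systematic_sample(clusters, 2))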


A coordinated five-year design was developed for the 2005-2009 NSDUHs which will be extended to the 2010 and 2011 NSDUHs. The sample selection procedures began by geographically partitioning each state into roughly equal size state sampling (SS) regions. Regions were formed so that each area would yield, in expectation, roughly the same number of interviews during each data collection period. This partition divided the United States into 900 SS regions. Within each of these SS regions, a sample of Census tracts was selected. Then, within sampled Census tracts, smaller geographic areas, or segments, were selected. A total of 48 segments per SS region were selected for the 2005-2009 NSDUHs. Only 24 segments per region were used for these surveys; the remaining 24 segments constitute the "reserve" sample and are available for use in 2010 and 2011. In general, segments consisted of adjacent Census blocks and are equivalent to area segments selected at the second stage of selection in NSDUHs prior to 1999 and at the first stage of selection in the 1999-2004 NSDUHs. The additional stage of selection (i.e. Census tracts) ensures that the majority of sample segments are contained within a single tract's boundaries, thus improving the ability to match to external data. In summary, the first stage stratification for the 2010 study will be states and SS regions within states, the first stage sampling units will be Census tracts, and the second stage sampling units will be small area segments. This design for the 2005-2011 NSDUHs at the first stages of selection is desirable because of (1) the much larger person-level sample required at the latter stages of selection in the design and (2) the increased interest among NSDUH data users and policy-makers in state and other local-level statistics.


The coordinated design facilitates 50 percent overlap in second stage units (area segments) between each two successive years from 2005 through 2011. The expected precision of difference estimates generated from consecutive years, (e.g., the year-to-year difference in past month marijuana use among 12-17 year old respondents) will be improved because of the expected positive correlation resulting from the overlapping sample.


Similar to previous NSDUHs, at the latter stages of selection, five age group strata will be sampled at different rates. These five strata will be defined by the following age group classifications: 12‑17, 18‑25, 26-34, 35‑49, and 50 years old and over. We project that adequate precision for race/ethnicity estimates at the national level will be achieved with the larger sample size and the optimal allocation to the age group strata. Consequently, race/ethnicity groups will not be over-sampled. However, consistent with previous NSDUHs, the 2010 NSDUH will be designed to over-sample the younger age groups.


Table 1 in Attachment N shows main study sample sizes and the projected number of completed interviews by sample design stage. Table 2 (Attachment N) shows main study sample sizes by state and the projected number of person respondents by state and age group. Table 3 (Attachment N) shows the expected precision for national estimates. Table 4 (Attachment N) shows the expected precision for direct state estimates.


The MHSS sample will be embedded within the main study. Exhibit 1 in Attachment N shows that approximately 765 main study respondents will need to be selected to yield a total of 500 completed follow-up interviews, assuming an 86% agreement rate and a 76% participation rate for the follow-up interviews.
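As a check on these assumptions: 765 selected respondents × 0.86 agreement rate × 0.76 participation rate ≈ 500 completed clinical follow-up interviews.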

2. Information Collection Procedures


NSDUH Main Study

Prior to the interviewer’s arrival at the sample dwelling unit (SDU), a letter will be mailed to the resident(s) briefly explaining the survey and requesting their cooperation. This letter will be printed on Department of Health and Human Services letterhead with the signature of the DHHS National Study Director and the Contractor’s National Field Director (see Attachment D).


Upon arrival at the SDU, the interviewer will refer the respondent to this letter and answer any questions. If the respondent has no knowledge of the lead letter, the interviewer will provide another copy, explain that one was previously sent, and then answer any questions. If no one is at home during the initial call at the SDU, the interviewer may leave a Sorry I Missed You card (Attachment E) informing the resident(s) that the interviewer plans to make another callback at a later date/time. Callbacks will be made as soon as possible. Interviewers will attempt to make at least four callbacks (in addition to the initial call) to each SDU in order to complete the screening process and obtain an interview.


If the interviewer is unable to contact anyone at the SDU after repeated attempts, the interviewer's Field Supervisor may send an Unable to Contact (UTC) letter. The UTC letter reiterates information contained in the lead letter and appeals to the respondent to participate in the study (See Attachment Q for all UTC letters). If, after sending that letter, an interviewer is still unable to contact anyone at an SDU, another informational letter (See Attachment R) may be sent to the SDU requesting that the resident(s) call the Field Supervisor as soon as possible to set up an appointment for the interviewer to visit the resident(s).


As necessary and appropriate, the interviewer may make use of the Appointment Card (Attachment F) for scheduled return visits with the respondent. When an in‑person contact is made with an adult member of the SDU and introductory procedures are completed, the interviewer will present a Study Description (Attachment G) and answer questions if required. Assuming respondent cooperation, a screening of the SDU then will be initiated through administration of the Housing Unit Screening questions for housing units, or the Group Quarters Unit Screening questions for group quarters units. The screening questions are administered via a hand-held, pen-based computer, which also performs the subsequent sample-selection routines. A paper representation of the housing unit and group quarters unit screening process is shown in Attachment H. Select questions that were present in the 2009 Screening instrument have been deleted for 2010. The 2010 Screener no longer asks respondents to confirm that each of the rostered household members will live in the house for the majority of the data collection quarter, and no longer asks if there are any household members that were missed during the roster process. This change is not expected to impact respondent burden and no tests of procedures are planned.


If a potential respondent refuses to be screened, the interviewer is trained to accept the refusal in a positive manner, avoiding an adversarial relationship that could preclude future opportunities for conversion. A refusal letter may then be sent by the Field Supervisor. The refusal letter is tailored to the specific concerns expressed by the potential respondent and asks him/her to reconsider participation (see Attachment S for all refusal letters). An in-person conversion is then attempted either by supervisory field staff or by specially selected interviewers with successful conversion experience.


If the respondent proceeds with the screening process, the interviewer answers any questions that the screening respondent may have concerning the study. A Question & Answer Brochure (Attachment I) that provides answers to commonly asked questions also will be given to the respondent at this time, or just prior to the start of the interview. In addition, interviewers will be supplied with copies of the Example NSDUH Highlights (Attachment J) and the Example NSDUH Newspaper Clippings (Attachment K), which can be left with the respondent. Following this introductory exchange, the screening will continue until completion.


Once the rostering of all dwelling unit members aged 12 or older is complete, and assuming the within-dwelling-unit sampling process selects one or two members to be interviewed, the following procedures are implemented:


If the selected individual is 18 or older and currently available, the interviewer obtains informed consent and immediately begins administering the questionnaire in a private setting within the dwelling unit. If the selected individual is 12 to 17 years of age, parental consent is obtained from the selected individual’s parent or legal guardian, using the Introduction and Informed Consent for Sample Members Age 12-17 Years Old found in the Showcard Booklet (Attachment L); the minor is then asked to participate. Once consent is obtained from both the parent and the child, the interviewer begins the interview process.


For all selected eligible respondents, the interviewer administers the interview in a prescribed and uniform manner. The sensitive, self‑administered portions of the interview will be completed via audio computer-assisted self-interviewing (ACASI); that is, the respondent will listen privately to the questions through an audio headset and/or read them on the computer screen, and will enter his/her own responses directly into the computer. This method maximizes respondent privacy and confidentiality.


In order to facilitate the respondent's recollection of prescription type drugs and their proper names, a set of color pillcards is provided to the respondent at the appropriate time. These pillcards and other showcards are included in the Showcard Booklet (Attachment L) and allow the respondent to refer to information necessary for accurate responses. The respondent enters his/her own answers directly into the computer during the ACASI interview.


After the interview is completed and before the verification procedures are begun, each respondent is given a $30.00 incentive payment and a Field Interviewer-signed Interview Payment Receipt (Attachment O).


For verification purposes, interview respondents are asked to complete a Quality Control Form (Attachment C) that requests their address and telephone number for possible follow‑up to ensure that the interviewer did his/her job appropriately. Respondents are informed that completing the Quality Control Form is voluntary. This form is completed and placed in an envelope by the respondent and mailed to the NSDUH Contractor for processing.


Interviewers will be supplied with Certificates of Participation (Attachment P) to distribute to interested respondents, primarily adolescents, after the interview is completed. Respondents may attempt to use these certificates to earn school or community service credit hours. No guarantee of credit is made by SAMHSA or the Contractor and the certificates clearly state this lack of guarantee.


A random sample of those who complete Quality Control Forms receives a telephone call expressing appreciation for their participation in the study. Each respondent also is asked to answer a few questions verifying that the interview took place, that proper procedures were followed, and that the amount of time required to administer the interview was within expected parameters. Quality Control letters are mailed when telephone numbers are unavailable (see Attachment M). In previous NSDUH surveys, fewer than 1 percent of the verification sample refused to fill out Quality Control Forms. As in the past, respondents are given the opportunity to decline to complete the form.


All interview data are transmitted to the Contractor’s offices on a regular basis.


Mental Health Surveillance Study

As in 2009, field interviewers will administer the NSDUH questionnaire via CAI using standard protocols (see section 3 - Use of Information Technology). For respondents randomly selected to participate in the follow-up interview, the field interviewer will read CAPI scripts describing the follow-up interview process once the initial NSDUH interview is complete. These recruitment scripts are provided in Attachment B, pages 442-443. Neither the respondent nor the interviewer will know that the respondent will be asked to complete another interview until these scripts are activated at the end of the CAI program. If the respondent refuses to complete the follow-up interview at recruitment or at the time of the follow-up clinical interview, neither the field interviewer nor the clinical interviewer will recontact the household.


The respondent will be given a copy of the Follow-Up Study Description (Attachment T) during this recruitment process, and the interviewer will try to collect the respondent’s first name, phone number, and best days of the week and times of the day for a clinical interviewer to complete the follow-up interview. This contact information will be entered into the password-protected laptop and shared, via a secure website, with the clinical interviewer, who will then schedule and conduct the interview within one to four weeks. Respondents will receive a Reminder Card (Attachment Z) listing the days and times that a Clinical Interviewer may attempt to reach them. All respondents who agree to participate in the follow-up interview will receive an additional $30 cash payment and a signed Follow-up Interview Payment Receipt (Attachment W) from the field interviewer. The selected follow-up respondent will also be read an introductory informed consent script (Attachment U) by the clinician prior to any follow-up interview data collection.


For reference, the clinical interviewers will also receive the date and time of the initial interview, along with the Record of Call history for the case. Given the importance of confidentiality in the NSDUH, no other information about respondents will be provided to clinical interviewers. Clinical interviewers will telephone respondents to conduct the clinical interview based on the availability information provided during the initial interview. They will record identification numbers and responses on a paper-and-pencil version of the SCID and, with the respondent’s permission, audio record the interview. Clinical interviewers will forward the paper SCID to experts in SCID administration for quality control (QC) review and will upload the completed audio recordings to a secure project server for the SCID experts to download for QC purposes. After this QC step, the paper SCID will be forwarded for data entry. This process will be completed in a timely manner (typically weekly). All SCIDs and audio files will be destroyed six months after the end of 2010 data collection.


Clinical interviewers trained in administering the SCID were recruited to conduct the 2009 MHSS. These interviewers were certified before administering the SCID for the study: volunteer respondents were recruited from mental health treatment centers and paid $40 for participating in a telephone SCID interview, which, with the respondents’ consent, was audio recorded. An expert in the SCID listened to the recorded interview and reviewed the paper SCID to determine whether the clinical interviewer had administered the instrument properly. The 2010 clinical interviewing staff will consist mostly of interviewers who participated in the 2009 study. Any new interviewers recruited for 2010 will complete the same certification process. As in 2009, these clinical interviewers will be supervised throughout the data collection year to maintain the integrity and reliability of clinical assessment and to resolve any clinical issues or questions that emerge.


Creating a robust, sustainable mental health surveillance system using the NSDUH as the household survey vehicle requires three essential elements: 1) appropriate screening and gold standard measures; 2) seamless procedures to transfer respondents from the NSDUH to the clinical interview section of the Mental Health Surveillance Study (MHSS); and 3) appropriate analysis of the data.


The clinical interview measure is the SCID-I/NP. Recommended by the expert consultants, this tool has been used for clinical calibration in several other studies, such as the National Comorbidity Survey Replication (NCS-R) (Kessler et al., 2004), the National Survey of American Life (Jackson et al., 2004), and the NSDUH substance use disorders clinical reappraisal study (Jordan et al., 2003). Dr. Michael B. First, primary author of the SCID interview, has been the primary expert consultant on all matters associated with the adaptation and use of the SCID in the MHSS. Adaptations of the SCID included modifying (shortening) the interview to assess mental disorders in the past 12 months (removing items used to assess lifetime disorders) and ensuring that each module is in the appropriate format for telephone administration.


Questionnaire


The version of the questionnaire to be fielded in 2010 is a computerized (CAPI/ACASI) instrument that is identical in content and structure to the computerized instrument fielded in 2009.


The NSDUH questionnaire and interview methods are designed to retain respondent interest, ensure confidentiality, and maximize the validity of response. The questionnaire is administered in such a way that interviewers will not know respondents’ answers to the sensitive questions, including those on illicit drug use. These questions are self-administered (ACASI), that is, respondents listen to or read the questions and enter their responses directly into the computer. The respondent listens in private through headphones, so even those who have difficulty seeing or reading are able to complete the self-administered portion.


The questionnaire is divided into sections based on specific substances or other main topics. The same questions are asked for each substance or substance class, ascertaining the respondent’s history in terms of age of first use, most recent use, number of times used in lifetime, and frequency of use in past 30 days and past 12 months. These substance use histories allow estimation of the incidence, prevalence, and patterns of use for licit and illicit substances.
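As a hypothetical illustration of how such histories support prevalence estimation, the Python sketch below derives unweighted past-month and past-year use indicators from recency-of-use responses. The field names and response codes are invented for this example, and the weighting required for actual NSDUH estimates (see Attachment N) is omitted.

# Hypothetical illustration: deriving simple prevalence indicators from
# recency-of-use responses. Field names and codes are invented for this
# example; actual NSDUH estimates are weighted as described in Attachment N.
records = [
    {"id": 1, "marijuana_recency": "past 30 days"},
    {"id": 2, "marijuana_recency": "past 12 months"},
    {"id": 3, "marijuana_recency": "more than 12 months ago"},
    {"id": 4, "marijuana_recency": "never used"},
]

past_month = sum(r["marijuana_recency"] == "past 30 days" for r in records)
past_year = sum(r["marijuana_recency"] in ("past 30 days", "past 12 months") for r in records)

print(f"Unweighted past-month prevalence: {past_month / len(records):.0%}")  # 25%
print(f"Unweighted past-year prevalence: {past_year / len(records):.0%}")    # 50%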


Topics that are administered by the interviewer (i.e., the CAPI section) include Demographics, Health Insurance, and Income. For the Income and Health Insurance sections, respondents will be asked if there is anyone else at home who would be better able to provide accurate answers.


The questionnaire for 2010 is based on the CAI instrument first implemented for the 1999 NSDUH. While the mode changed in 1999, the content was based on the 1994 questionnaire, which resulted from a series of methodological studies and discussions with consultants. Additional methodological testing was completed in preparation for the conversion to computer-assisted interviewing. The questionnaire incorporates improvements in question wordings (e.g., clearer definitions, less vague terminology, elimination of hidden questions) and questionnaire structure (e.g., greater use of skip patterns, improved formatting for the benefit of interviewers and respondents). Enhanced instructions regarding the reference periods used (i.e., past 30 days, past 12 months) also were added, including a paper reference date calendar to facilitate the respondent’s accurate recall of events. A key feature of the questionnaire is its core-supplement structure. A set of core questions that are critical for basic trend measurement of substance use incidence and prevalence rates will remain in the survey every year and comprise the main part of the questionnaire. Supplemental questions, or modules, which can be revised, dropped, or added from year to year, comprise the remainder of the questionnaire.


The core consists of the initial demographic questions and the Tobacco through Sedatives modules. Supplemental items include the remaining modules and the additional demographic and health questions. Some of the supplemental portion of the questionnaire is likely to remain in the survey, essentially unchanged, every year (e.g., insurance).


The follow-up interview is a slightly modified version of the Structured Clinical Interview for DSM-IV-TR Axis I Disorders Non-patient Edition (SCID-I/NP, 1/2007 revision), which screens for:


1 Major Depressive Episode

2 Manic Episode

3 Dysthymic Episode

4 Substance Use Disorders

5 Psychotic Episode

6 Bipolar Disorder

7 Anxiety Disorder

8 Phobias

9 Obsessive/Compulsive Disorder

10 Posttraumatic Stress Disorder

11 Eating Disorders

12 Impulse Control Disorders

13 Adjustment Disorders


A paper representation of the follow-up SCID interview is found in Attachment V.

2010 NSDUH CAPI/ACASI Questionnaire Content


The proposed questionnaire content for 2010 is shown in Attachment B. While the actual administration will be electronic, the document shown is a paper representation of the content that is to be programmed. No questions were modified for the 2010 NSDUH instrument. As in previous years, the State program names for Medicaid, CHIP, and TANF will be updated. All other modular components of the 2010 questionnaire will remain unchanged from the 2009 version.


As in past years, two versions of the instrument will be prepared: an English version and a Spanish translation. Both versions will have the same essential content.


3. Methods to Maximize Response Rates


In 2008, the weighted response rates were 89% for screening and 75% for interviews, with an overall response rate (screening rate × interview rate) of 67%. With the continuation of the $30.00 incentive for the 2009 survey year, the Contractor expects the response rates for 2009 to be about the same as the 2008 rates.


With a $30.00 incentive for the initial interview and an additional, up-front $30.00 incentive for the follow-up interview, the Contractor expects to obtain a 79% unweighted interview response rate (IRR) for initial interviews and a 76% unweighted IRR for follow-up interviews. An overall response rate (ORR) of 53% is expected for the embedded follow-up study.
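For illustration, these overall rates are simple products of the stage-level rates. The Python sketch below verifies the 2008 figure and shows one composition, treated here as an assumption rather than the official computation, that is consistent with the expected 53% follow-up ORR.

# Illustrative check of how overall response rates combine (not an official
# computation).
screening_2008 = 0.89
interview_2008 = 0.75
print(f"2008 overall response rate: {screening_2008 * interview_2008:.0%}")  # ~67%

# Assumed composition of the expected follow-up ORR: screening rate x
# initial IRR x follow-up IRR (the agreement rate is treated here as part of
# the follow-up IRR).
screening = 0.89
initial_irr = 0.79
followup_irr = 0.76
print(f"Expected follow-up ORR: {screening * initial_irr * followup_irr:.0%}")  # ~53%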


The field interviewers will not be recontacting households to convert follow-up interview refusals, but they will be trained to answer respondent questions at the time of recruitment as appropriate.


To maximize clinical interview response rates, the clinical interviewers will use the best day/best time information obtained by the field interviewer to schedule interviews, but they will also be flexible in scheduling the follow-up interview at a time that is convenient for the respondent. If a respondent is unavailable when the clinician calls to complete the follow-up interview, the clinician will schedule a callback appointment. Clinicians will be trained to explain the study and its purpose thoroughly and to answer respondents’ questions.


4. Tests of Procedures


Since there are no planned modifications to the 2010 instrument, cognitive testing will not occur.

5. Statistical Consultants


The basic NSDUH design was reviewed by statistical experts, both within and outside SAMHSA. Statistical experts reviewing the 1999-2010 survey designs include William Kalsbeek, Ph.D., University of North Carolina; Robert Groves, Ph.D., University of Michigan; and Michael Hidiroglou, Ph.D., Statistics Canada. Monroe Sirken, Ph.D., National Center for Health Statistics (NCHS); James Massey, Ph.D. (deceased), also of NCHS; Douglas Wright, Mathematical Statistician (retired), Division of Population Surveys, OAS, SAMHSA; and Arthur Hughes, Mathematical Statistician, Division of Population Surveys, OAS, SAMHSA, were consulted on the 1992 and subsequent survey designs. Michael Jones, Mathematical Statistician, Division of Population Surveys, OAS, SAMHSA, is the Government Project Officer, (240) 276-1274. Joseph Gfroerer, Director, Division of Population Surveys, OAS, SAMHSA, is the primary mathematical statistician responsible for overall project management, (240) 276-1262. RTI statisticians contributing to the design are Dr. James Chromy, Senior Fellow and Director of Statistical Operations, and Dr. Ralph Folsom, Chief Scientist and Director of Small Area Estimation.


The 2010–2011 National Survey on Drug Use and Health contract was awarded to Research Triangle Institute (RTI) on September 2, 2008. RTI key personnel on this contract are:


Mr. Thomas Virag, Project Director (919) 485-5732

Mr. David Cunningham, Associate Project Director (919) 485-2612

Ms. Ilona Johnson, Director of Field Operations (919) 485-5731

Dr. James Chromy, Director of Sampling Operations & Statistical Reports (919) 541-7019

Dr. Ralph Folsom, Director of Small Area Estimation Study (919) 541-6248

Dr. Martin Meyer, Director of Data Management & Processing (919) 541-7035

Ms. Elizabeth Dean, Director of Instrument Assessment & Development (919) 541-7445

Ms. Allison McKamey, Director of Training Programs & Field Materials (336) 643-8338

Ms. Lisa Packer, Director of Analysis & Table Production (919) 541-6633

Dr. Mary Ellen Marsden, Director of Report Generation (781) 259-0923

Dr. Doug Currivan, Director of Methodological Issues & Special Analysis (919) 316-3334

Dr. Kimberly Ault, Task Manager, Imputation Activities (919) 541-7455

Mr. Patrick Chen, Task Manager, Weighting Activities (919) 541-6309


Contractor personnel will implement the sample design, recruit field staff, train interviewers, conduct data collection, perform data receipt, editing, coding, and keying, conduct data analysis, and develop statistical reports. SAMHSA will provide direction and review functions to the Contractor. Data collection will be conducted throughout the 2010 calendar year.


Appendix A

Current NSDUH Consultants


a. Consultants on NSDUH Design

Michael Arthur, Ph.D., Project Director (206) 685-3858

Social Development Research Group

University of Washington

Raul Caetano, M.D., Ph.D., Assistant Dean (214) 648-1080

Dallas Satellite MPH Program

University of Texas at Houston

John Carnevale, Ph.D., President (301) 963-2151

Carnevale Associates


Barbara Delaney (212) 973-3509

Director of Research

Partnership for a Drug-Free America


Robert Groves, Ph.D., Director (734) 764-8365

Survey Research Center

Institute of Social Research

University of Michigan


Bill Kalsbeek, Ph.D., Associate Professor/Director (919) 962-3249

Survey Research Unit, Biostatistics

University of North Carolina at Chapel Hill

Graham Kalton, Ph.D. (301) 251-8253

Senior Vice President

Westat


Philip Leaf, Ph.D., Professor (410) 955‑3962

Department of Mental Hygiene, Mental Health and Psychiatry

School of Public Health

Johns Hopkins University


Patrick O’Malley, Ph.D., Senior Research Scientist (734) 763-5043

Survey Research Center, The Institute for Social Research

University of Michigan



Peter Reuter, Ph.D. (301) 405-6367

School of Public Policy

University of Maryland




b. NSDUH Consultants for the Tobacco Module


Gary A. Giovino, Ph.D., Professor (716) 845-8444

Department of Health Behavior

University at Buffalo - SUNY


c. NSDUH Consultants for Mental Health Modules


Jeffrey Buck, Ph.D. (301) 443-0588

Director of Office of Managed Care

Center for Mental Health Services

Michael First, M.D., Professor (212) 543-5531

Department of Psychiatry

Columbia University Medical Center


Marilyn Henderson (retired) (301) 443-2293

Center for Mental Health Services


Kimberly Hoagwood, Ph.D., Professor (212) 543-5311

Department of Child and Adolescent Psychiatry

Columbia University


Jeffrey Johnson, Ph.D., Associate Professor (212) 543-5523

Department of Psychiatry

College of Physicians and Surgeons

Columbia University


Ronald C. Kessler, Ph.D., Professor (617) 423-3587

Department of Health Care Policy

Harvard Medical School


Christopher P. Lucas, M.D. (212) 543-5358

Department of Child Psychiatry

Columbia University


Michael Schoenbaum, Ph.D. (301) 435-8760

Senior Advisor for Mental Health Services,

Epidemiology and Economics

National Institute of Mental Health


Phillip Wang, M.D., Ph.D., Director (301) 443-6233

Division of Services and

Intervention Research

National Institute of Mental Health






ATTACHMENTS


Attachment A ‑ Federal-Wide Assurance


Attachment B - CAI Questionnaire Content


Attachment C ‑ Quality Control Form


Attachment D ‑ Lead Letter to Selected Dwelling Unit


Attachment E ‑ Sorry I Missed You Card


Attachment F ‑ Appointment Card


Attachment G ‑ Study Description


Attachment H ‑ Housing Unit and Group Quarters Unit Screening Questions


Attachment I ‑ Question and Answer Brochure


Attachment J ‑ Example of NSDUH Highlights


Attachment K - Example of NSDUH Newspaper Clippings


Attachment L - Showcard Booklet


Attachment M ‑ Quality Control Letter


Attachment N - Sample Design


Attachment O - Interview Payment Receipt


Attachment P - Certificate of Participation


Attachment Q - Unable to Contact Letters


Attachment R - Call Me Letters


Attachment S - Refusal Letters


Attachment T - Follow-up Study Description


Attachment U - Introduction to Clinical Follow-up Interview


Attachment V - Follow-up SCID Interview Content


Attachment W - Follow-up Interview Payment Receipt


Attachment X - December 6, 2006 Mental Health Data Meeting Summary


Attachment Y - NSDUH Confidentiality Pledge


Attachment Z - Follow-up Interview Reminder Card

References


Borges, G., Angst, J., Nock, M. K., Ruscio, A. M., Walters, E. E., & Kessler, R. C. (2006). A risk index for 12-month suicide attempts in the National Comorbidity Survey Replication (NCS-R). Psychological Medicine, 36, 1747-1757.


First, M. B., Spitzer, R. L., Gibbon, M., & Williams, J. B. W. (2002). Structured Clinical Interview for DSM-IV-TR Axis I Disorders, Research Version, Non-patient Edition (SCID-I/NP). New York: Biometrics Research, New York State Psychiatric Institute.


Gfroerer, J., Wright, D., & Kopstein, A. (1997). Prevalence of youth substance use: The impact of methodological differences between two national surveys. Drug and Alcohol Dependence, 47, 19-30.


Groves, R. M. (1989). Survey Errors and Survey Costs. New York: Wiley.


Groves, R. M., & Couper, M. P. (1998). Nonresponse in Household Interview Surveys. New York: Wiley.


Grucza, R. A., Abbacchi, A. M., Przybeck, T. R., & Gfroerer, J. C. (2007). Discrepancies in estimates of prevalence and correlates of substance use and disorders between two national surveys. Addiction, 102, 623-629.


Hennessy, K., & Ginsberg, C. (Eds.). (2001). Substance use survey data collection methodologies [Special issue]. Journal of Drug Issues, 31(3), 595-727.


Jackson, J. S., Torres, M., Caldwell, C. H., et al. (2004). The National Survey of American Life: A study of racial, ethnic, and cultural influences on mental disorders and mental health. International Journal of Methods in Psychiatric Research, 13, 196-207.


Jordan, B. K., Karg, R. S., Batts, K. R., & Wiesen, C. (2003, December). Statistical report on the concordance of the 2001 substance use disorder module of the National Household Survey of Drug Use and Health [NSDUH] and a structured clinical interview assessment for substance use disorders. Report submitted to the Substance Abuse and Mental Health Services Administration (SAMHSA), Office of Applied Studies (OAS), NSDUH Development Team.


Kessler, R. C., & Merikangas, K. R. (2004). The National Comorbidity Survey Replication (NCS-R): Background and aims. International Journal of Methods in Psychiatric Research, 13, 60-68.


Kessler, R. C., Berglund, P., Chiu, W. T., Demler, O., Heeringa, S., Hiripi, E., Jin, R., Pennell, B. E., Walters, E. E., Zaslavsky, A., & Zheng, H. (2004). The US National Comorbidity Survey Replication (NCS-R): Design and field procedures. International Journal of Methods in Psychiatric Research, 13, 69-92.


Miller, J. W., Gfroerer, J. C., Brewer, R. D., Naimi, T. S., Mokdad, A., & Giles, W. H. (2004). Prevalence of adult binge drinking: A comparison of two national surveys. American Journal of Preventive Medicine, 27, 197-204.


Rehm, J., Ustun, T., Saxena, S., et al. (1999). On the development and psychometric testing of the WHO screening instrument to assess disablement in the general population. International Journal of Methods in Psychiatric Research, 8, 110-122.


1 Five age groups actually will be used for the 2010 design so that somewhat lower sampling rates are applied to persons 50+ years old than to those 35-49 years old. Only four age groups are shown in Tables 2 and 3.




