NMHS OMB Supporting Statements A 7.12.17

National Mental Health Study (NMHS) Field Test

OMB: 0930-0380


National Mental Health Study Field Test

SUPPORTING STATEMENT

A. JUSTIFICATION

1. Circumstances of Information Collection

Overview

The Substance Abuse and Mental Health Services Administration (SAMHSA) is requesting OMB approval to conduct a methodological field test for a potential national mental health study, provisionally named the National Mental Health Study (NMHS) Field Test. SAMHSA is collaborating with the National Institute of Mental Health (NIMH) to implement this field test, which will use mental disorder assessments similar to studies last conducted over a decade ago in the National Comorbidity Survey-Replication among adults in 2001-2003 and the National Comorbidity Survey-Adolescent supplement among adolescents in 2001-2002.


The availability of timely and high-quality epidemiological data is key to supporting SAMHSA and NIMH’s strategic initiatives to increase awareness and understanding of mental and substance use disorders. Unfortunately, data on the prevalence of specific mental disorders among adults and adolescents across the United States are outdated; most recent studies collected data in the early 2000s.1,2 Moreover, nationally representative data on the prevalence of some critical and debilitating psychiatric experiences, such as borderline personality disorder or prodromal psychotic experiences, are missing.


The NMHS Field Test is funded under an optional task on the 2014-2017 National Survey on Drug Use and Health (NSDUH) contract (OMB No. 0930-0290), sponsored by SAMHSA’s Center for Behavioral Health Statistics and Quality (CBHSQ), via an interagency agreement between NIMH and CBHSQ. Since the NMHS Field Test is being conducted under the NSDUH contract, it capitalizes on similar methodology and materials to the NSDUH and the 2008-2012 NSDUH Mental Health Surveillance Study (MHSS)3 (under OMB No. 0930-0290). However, there is no operational overlap between NMHS and NSDUH; the studies will be conducted separately.

In order to properly assess the feasibility of a national study on mental health disorders, the NMHS Field Test will consist of three main components. The first component is sample selection using a household screener among a total sample of 3,563 dwelling units. The household screener will be used to determine the eligibility of household members and to select individuals to recruit for participation. The household screener is modeled on the screener used in the 2015 NSDUH and is designed to roster all members of the dwelling unit aged 13 or older. The roster collects data only on those characteristics of individuals needed to implement sample selection (i.e., age, race/ethnicity, and military status).


The second component consists of the in-person survey of the selected adult and adolescent respondents, with approximately 1,200 interviews planned for the NMHS Field Test (900 adults, 300 adolescents). The NMHS procedures vary somewhat between adults (aged 18 or older) and adolescents (aged 13 to 17). For adult respondents, the in-person assessment (using the adult instrument) will be conducted primarily using audio computer-assisted self-interviewing (ACASI), with an emphasis on respondents completing the interview in a single session. For adolescent respondents, the interview will also be conducted in person in the home primarily using ACASI (the adolescent instrument), and approximately 210 parents/legal guardians of those adolescent respondents will complete a supplemental interview via the web or phone (the parent instrument). The parent interview will collect information about the sampled adolescent that a parent or guardian may be better qualified to provide (e.g., the adolescent's health, development, and family background).


The third component consists of a telephone clinical reappraisal study (CRS) of a selected subgroup of adult and adolescent respondents, with ancillary parent/legal guardian reporting for adolescent interviews that include depression, ADHD, or conduct disorder modules. A total of 200 CRS interviews are planned (100 adults, 50 adolescents, 50 parents). The field test will evaluate the procedures for selecting respondents, routing cases, and administering the CRS.


Additional details on the sampling and interviewing procedures are included in Sections B.1 and B.2 of this supporting statement.


2. Purpose and Use of Information

The NMHS Field Test will evaluate the feasibility of conducting a full-scale national household, area probability mental health study designed to achieve three main goals:

  1. estimate the prevalence of a number of specific mental disorders;

  2. investigate correlates of mental health problems and patterns of care; and

  3. provide a platform for follow-up studies on subgroups of interest.


A full-scale study would provide a framework for updating national estimates of specific mental disorders meeting the new DSM-5 criteria and would provide first-time national estimates of some basic underlying functional dimensions of mental health. These data could be vital for calculating the burden of mental illness and for understanding the resources necessary to deliver quality mental health care. The full-scale study would also investigate sociodemographic correlates of mental disorders and co-occurring mental, substance use, and physical health problems. This information would shed light on mental disorder risk factors and lend clues to the etiology and mechanisms by which these disorders occur. Finally, the full-scale NMHS would offer the opportunity for future follow-up studies focused on high-priority subgroups of interest, such as individuals at risk for psychotic disorders or those in treatment for specific mental disorders.


The field test will test the questionnaires and operations designed for a full-scale NMHS. Adult and adolescent questionnaires covering an extensive set of Diagnostic and Statistical Manual of Mental Disorders, fifth edition (DSM-5)4 mental disorder assessments, dimensional psychiatric symptom scales, measures of personality features, mental health service use assessments, and risk and protective factor assessments will be used. The questionnaires have been adapted from the 2015 NSDUH demographic, employment, income, and health insurance questions; the latest version (4.0) of the World Mental Health Composite International Diagnostic Interview (WHO-CIDI); the NIH PhenX Toolkit;5 and dimensional symptom scales recommended by an NIMH Research Domain Criteria (RDoC) subcommittee. The WHO-CIDI 4.0 modules were built from the self-administered modules recently used for the Army STARRS study6 and developed by Dr. Ron Kessler of Harvard and NIMH.


3. Use of Information Technology

NMHS Field Test data will be collected using multiple survey data collection modes. Interview respondents will be selected by a household screening program running on the hand-held tablet computers used for NSDUH screenings since 2015. Interviews will be administered to respondents in their homes using audio computer-assisted self-interviewing (ACASI) and computer-assisted personal interviewing (CAPI) software on the light-weight ultra-book laptop computers used for NSDUH interviews since 2015.


Interviews with parents of adolescents will be completed via a web-based questionnaire, using computer-assisted self-interviewing (CASI), or via computer-assisted telephone interview (CATI), at the respondent's convenience and in the method of their choosing. Finally, CRS interviews will be administered to a selected subset of study respondents via telephone by qualified, trained clinical interviewers (CIs) using paper-and-pencil interviewing (PAPI) questionnaires. CRS interviews will also be audio recorded upon receiving respondents' permission.


Electronic data collection affords a number of advantages in the collection of NMHS data. First, this methodology permits the instrument designer to incorporate questionnaire routings that would be overly complex or impossible with a paper-and-pencil instrument. The laptops and web sites can be programmed to implement complex skip patterns and to fill specific words based on the respondent's previous answers. Interviewer and respondent errors caused by faulty implementation of skip instructions are virtually eliminated. Second, these methodologies increase the consistency of the data. The interview software can be designed to identify inconsistent responses and attempt to resolve them through respondent prompts. This approach reduces the need for most manual and machine editing, thus saving both time and money. In addition, respondent-resolved inconsistencies are likely to result in more accurate data than inconsistencies resolved using editing rules.


Electronic technology also permits greater expediency with respect to data processing and analysis (i.e., a number of back-end processing steps, including coding and data entry). Data are transmitted electronically rather than by mail and will reside in a FIPS-Moderate system at rest. These efficiencies save time due to the speed of data transmission, as well as receipt in a format suitable for analysis. Tasks formerly completed by clerical staff are accomplished by the ACASI, CAPI, CASI, CATI, and web data collection programs. In addition, the cost of printing paper questionnaires and associated mailing is eliminated. Finally, as noted above, computerized self-interview survey technology permits respondents, including nonreaders, to complete sensitive portions of the interview in total privacy. Providing the respondent with a methodology that improves privacy and confidentiality makes reporting of potentially embarrassing, stigmatizing, or illegal behaviors (i.e., mental health issues, drug use) less threatening and enhances response validity and response rates.


For the field test, questions administered via ACASI in the main adult and adolescent interviews will be read aloud to respondents via headphones using Text-to-Speech (TTS) software offered by Microsoft’s Speech Platform, which features a dynamic implementation mode that uses the TTS engine to read question text in real time and eliminates the use of pre-recorded audio files altogether. Prior experience with the usage of TTS technology on other large household surveys, including NSDUH since 2015, has been very positive, with respondents typically reporting no problems understanding words or phrases produced by the TTS voices in English.


Field interviewers (FIs) will also use the tablet to collect GPS coordinates at each dwelling unit (DU) for data quality purposes (i.e., for identifying falsification or ensuring that the FI is at the correct DU) and for assigning DUs to census blocks, which will allow examination of field test operations by geographic region. Although collecting GPS coordinates during the screening interview may not be 100% accurate at the block level, it will significantly reduce the number of respondents requiring manual census block assignment. Based on prior studies conducted by RTI, it is estimated that GPS coordinates will be successfully captured approximately 90% of the time; the remaining 10% will require geocoding or manual assignment.


At the end of interviews, FIs will ask permission of respondents to collect GPS coordinates from the exterior of the dwelling unit using the tablet (Attachment A-1, Adult and Adolescent Questionnaire Specifications). With permission, FIs will activate a GPS receiver on the tablet to collect the specific coordinates for that dwelling unit. The tablet does not require the use of an internet connection for this activity; the GPS receiver reads information broadcast by satellites to collect the coordinates.


As part of the regular transmission process, FIs will send the GPS coordinates collected for dwelling units to the Contractor for comparison against address coordinates published by Google for the dwelling units. An auto-generated report will flag any discrepancies between FI-collected coordinates and those reported by Google for adjudication by project management.
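The coordinate comparison described above can be sketched as a simple distance check. The 200-meter threshold, function names, and data layout below are illustrative assumptions, not project specifications:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_discrepancies(fi_coords, published_coords, threshold_m=200):
    """Return dwelling-unit IDs whose FI-collected coordinates fall farther
    than threshold_m from the published address coordinates.
    (Threshold is a hypothetical value for illustration.)"""
    flagged = []
    for du_id, (lat, lon) in fi_coords.items():
        pub = published_coords.get(du_id)
        if pub is None:
            continue  # no published coordinates to compare against
        if haversine_m(lat, lon, pub[0], pub[1]) > threshold_m:
            flagged.append(du_id)
    return flagged
```

Flagged cases would then feed the auto-generated discrepancy report for adjudication by project management.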


In addition, for the CRS interviews, CIs will record audio from interviews with the respondents’ permissions. The Interactive Voice Response (IVR) system will conduct the recordings and then store the audio files directly onto a network share protected by the Contractor’s security controls. This method can improve data quality by resolving unclear answers on the PAPI questionnaires. A newly hired CI will receive full reviews of their first two recorded interviews or until they demonstrate proficiency in administering all SCID modules. Once a CI is proficient, 10 percent of the remaining interviews for each CI will be randomly selected for a full review of the audio recordings and SCID booklet in their entirety. If a respondent refuses the audio recording, they will not be selected for audio review but the case will still go through technical and clinical editing.
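The 10 percent random selection of a proficient CI's interviews for full review could be implemented as a simple random draw. The sketch below is illustrative only; the function name, data layout, and edge-case handling are assumptions, not the project's actual selection program:

```python
import random

def select_for_review(interviews, rate=0.10, seed=None):
    """Randomly select a share of a CI's completed CRS interviews for full
    review of the audio recording and SCID booklet. Cases where the
    respondent refused audio recording are excluded from audio review
    (they still receive technical and clinical editing)."""
    rng = random.Random(seed)
    eligible = [iv for iv in interviews if iv.get("audio_consent")]
    if not eligible:
        return []
    k = max(1, round(rate * len(eligible)))
    return rng.sample(eligible, k)
```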


4. Efforts to Identify Duplication

CBHSQ is in contact with all major Federal health survey managers and is aware of no other field tests to prepare for a full-scale national household, area probability mental health study designed to (1) estimate the prevalence of a number of specific mental disorders, (2) investigate correlates of mental health problems and patterns of care, and (3) provide a platform for follow-up studies on subgroups of interest. To date, no duplication of effort has been identified. In addition, comparable studies of this kind have not been conducted for more than a decade.


5. Involvement of Small Entities

This survey does not involve small businesses or other such entities.


6. Consequences If Information Is Collected Less Frequently

The prevalence of mental health disorders, as well as the behaviors and symptoms that signal the development of a behavioral disorder, is a rapidly evolving phenomenon that calls for timely measurement and analysis. The NMHS Field Test is integral to implementing a full-scale study. Without testing prior to full-scale implementation, SAMHSA and NIMH would not be able to adequately identify problems or inefficiencies with questionnaires, study design, or operations that, if present in a national study, could call into question the validity of the study or diminish its utility.


7. Consistency with the Guidelines in 5 CFR 1320.5(d)(2)

This information collection fully complies with 5 CFR 1320.5(d)(2).


8. Consultation Outside the Agency

A Federal Register notice was published on May 4, 2017 (82 FR 20900). One public comment was received (see Attachment AH); SAMHSA's response appears in Attachment AI.


The review for the NMHS Field Test is scheduled for May 2017. The HHS Data Council has been kept informed about the status and plans for the NMHS Field Test.


Section B.5 of this Supporting Statement contains a listing of current consultants.


There are no unresolved issues resulting from these consultations.


9. Payments to Respondents

Survey response rates have been declining in recent years. This has raised concerns, in large part because lower response rates bring an increased potential for nonresponse bias. To maximize response rates and reduce the risk of nonresponse bias affecting key estimates, many surveys offer cash incentives to sample members to encourage their participation. Offering incentives to sample members has been shown to be a cost-effective means of lowering nonresponse. Theories suggest that incentives are effective because of their interpretation as a token of appreciation (social exchange theory),7 as compensation for one's time and effort (economic exchange theory),8 or through the subjective weight a sample member puts on various factors when the survey request is made (leverage-salience theory).9,10


Based on careful review of incentive amounts offered on comparable federal government sponsored surveys and NSDUH experience offering interview cash incentives (see Appendix A), NMHS will offer a fixed incentive amount for the in-person adult and adolescent interviews, CRS adult and adolescent interviews, and the parent interviews.

As part of the field test, different amounts of screener incentives will be evaluated. Exhibit 1 shows the incentive amounts planned for each interview type.


Exhibit 1. NMHS Field Test Incentive Amounts

Interview Type                                    Duration   Incentive – Level 1   Incentive – Level 2
Screening Interview                                5 min     $5                    $10
In-Person Interview for Adults and Adolescents    65 min     $40                   n/a
Web/Telephone Interview for Parents               30 min     $30                   n/a
CRS Interview for Adults and Adolescents          60 min     $40                   n/a
CRS Interview for Parents                         30 min     $30                   n/a



Screening Interviews

Based on prior research, it is clear that offering some incentive over no incentive has a beneficial impact on response rates and retention.11,12,13 There is also evidence that interview incentives reduce the cost per survey through their influence on the amount of effort that is required to obtain the completed interview.14,15 The question of what incentive amount to offer to screening respondents is not well-documented in the literature. There is precedent for a $5 screening incentive based on findings from the National Survey of Family Growth (NSFG), which implemented a two-stage data collection design and incorporated a $5 prepaid incentive to screening respondents during Phase 2 (the last three weeks of data collection each quarter).16 Based on results of the experiment, the $5 screening incentive is now offered to all Phase 2 screening respondents in the NSFG.


Similarly, sample members who completed a short screener for the Food and Drug Administration’s Research and Evaluation Survey for the Public Education Campaign on Tobacco among LGBT (RESPECT) were given $10 upon completion of the screener. The $10 value was selected because it was the lowest value deemed to be sufficiently attractive to sample members who were not necessarily predisposed to participate in research. The researchers proposed that the screening incentive would increase participants’ engagement in the study, thereby resulting in higher data validity. They also believed it would increase response to follow-up surveys.17 While the incentive was not implemented experimentally, the project team reports that the $10 incentive facilitated data collection by getting and keeping the attention of the individuals they were trying to screen, even in an environment with many distractions.


A number of other studies have also offered incentives for screening interviews. One example is the National Household Education Surveys (NHES) Program 2011 Field Test, which offered sample members a $2 or $5 advance incentive for a screening survey.18 The response rate for the $5 condition was significantly higher than the $2 condition and it saved some cost associated with nonresponse follow-up mailings. As a result, $5 has been used as a screening incentive in subsequent NHES surveys. Similarly, the National Household Food Acquisition and Purchase Survey (FoodAPS) also offered a $5 prepaid incentive to households contacted for screening.19 Meanwhile, the National Adult Training and Education Survey (ATES) Pilot Study offered a $2 incentive for the screener.20


Based on these results, we believe a $5 incentive would have some positive impact on respondent cooperation, but might not be enough to significantly increase response rates or lower costs. Therefore, we are proposing a $5 versus $10 incentive experiment for respondents who complete the household screening. Respondents will be informed in an advance letter whether they will receive a $5 or $10 incentive upon completion of the screener. To avoid interviewer effects, FIs will be assigned both incentive treatment types, as was done in the 2001 NSDUH incentive experiment. FIs and their Field Supervisors (FSs) will receive training on the screening incentive experimental design and how to avoid confounding the experiment.
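Because each FI works both incentive conditions, treatment must be assigned at the dwelling-unit level rather than the interviewer level. The sketch below illustrates one hypothetical way to randomize dwelling units to the $5 and $10 conditions within each FI's assignment; it is not the study's documented randomization procedure:

```python
import random

def assign_incentive_conditions(du_ids_by_fi, seed=None):
    """Randomly split each FI's dwelling units between the $5 and $10
    screener incentive conditions, so every FI works both treatments
    and interviewer effects are not confounded with the incentive amount."""
    rng = random.Random(seed)
    assignments = {}
    for fi_id, du_ids in du_ids_by_fi.items():
        shuffled = list(du_ids)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        for du in shuffled[:half]:
            assignments[du] = 5
        for du in shuffled[half:]:
            assignments[du] = 10
    return assignments
```

The assigned amount would then drive which version of the advance letter each dwelling unit receives.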


The screening incentive is mentioned in the Lead Letter (Attachment B), Unable-to-Contact Letters and Call-Me Letters (Attachment C), Refusal Letters (Attachment D), as well as the Incentive Receipt (Attachment E).


In-Person Interviews

FIs will give each NMHS adult and adolescent respondent $40 after they complete an in-person interview. Since 2002, OMB has approved a $30 incentive for NSDUH in-person interview respondents. A slightly higher incentive is appropriate for the NMHS in-person interviews to reflect inflation since NSDUH originally began offering an incentive. The increase also seems reasonable because the NMHS consists of multiple surveys (i.e., an adult household member may be selected to complete the adult interview, a parent interview for a selected adolescent, and a CRS interview), making it similar to a longitudinal design. Longitudinal surveys tend to use larger incentives because the increased burden of taking multiple surveys makes participant recruitment and retention more difficult.21 Retention in the later waves of a longitudinal survey is especially critical: if sample sizes become too small, the data may no longer yield sufficient power to detect differences.

The U.S. Census Bureau has experimented with and begun offering incentives for several of its longitudinal panel surveys, including the Survey of Income and Program Participation (SIPP) and the Survey of Program Dynamics (SPD). SIPP has conducted several multi-wave incentive studies, most recently with its 2008 panel, comparing results of $10, $20, and $40 incentive amounts to those of a $0 control group. These studies examined response rate outcomes in subgroups of interest (e.g., the poverty stratum), the use of targeted incentives for non-interview cases, and the impact of base-wave incentives on participation in later waves of data collection. Overall, the results suggest that higher incentives increase response rates and improve the conversion rate for non-interview cases. Similarly, SPD has conducted four incentive studies, testing $20, $40, $50, and $100 amounts in an effort to minimize attrition in subsequent waves; incentives were found to have a positive impact on both response and attrition rates, and the fourth study found that the average interview rate greatly increased with the use of incentives.22 Jäckle and Lynn likewise found that incentives offered at multiple waves significantly reduced attrition at all waves.23


The in-person visit incentive is mentioned in the following respondent materials: Lead Letter (Attachment B), Unable-to-Contact and Call-Me Letters (Attachment C), Refusal Letters (Attachment D), Incentive Receipt (Attachment E), Appointment Card (Attachment F), Study Description (Attachment G), Introduction and Informed Consent Scripts (Attachment H), Screening Questions (Attachment I), and Question & Answer Brochure (Attachment J).


Parent Web/Telephone Interviews

Each parent will receive a $30 prepaid incentive upon agreeing to take part in the NMHS parent interview by web or telephone, before actually completing the interview. The $30 amount was chosen because the web/telephone parent interview will take less time to complete than the in-person interviews; on average, parent interviews are expected to take 30 minutes. Recruitment for the survey will be done in person as the adolescent is starting the ACASI portion of the in-person interview. Research has shown that providing incentives before the interview increases the likelihood participants will complete it.24


The parent web/telephone interview incentive is mentioned in the following respondent materials: Incentive Receipt (Attachment E), Parent Interview Introductory Script (Page 3-1 of Attachment I), Parent Study Description (Attachment K), and Parent Study Informed Consent (Attachment L).


CRS Interviews

The CRS interviews will create additional burden for respondents in a short period of time, which may make obtaining respondent participation more difficult. To maintain adequate response rates, it is necessary to offer adult and adolescent CRS respondents a $40 incentive for agreeing to complete this follow-up interview, in addition to the $40 received for completing the in-person interview. Because the adult and adolescent CRS interviews will take about the same amount of time as the in-person interviews, an identical incentive is equitable. Since the parent CRS interview is shorter than the other CRS interviews, offering $30 is appropriate.


OMB approved a similar incentive plan for the 2008-2012 NSDUH MHSS, in which main study adult interview respondents were given $30 upon completion of the initial NSDUH interview and were then immediately given an additional $30 once selected for and agreeing to participate in the MHSS clinical follow-up interview at a later date.


The incentives for the follow-up CRS interviews are mentioned in the following respondent materials: CRS Recruitment Scripts (Attachment M-1, CRS Data Collection Materials), Adult and Adolescent CRS Follow-up Study Descriptions (Attachment M-1, CRS Data Collection Materials), Parent CRS Follow-up Study Description (Attachment M-2, CRS Data Collection Materials (Parent Only)), Introduction and Informed Consent for the Clinical Interview (Attachment N), and Incentive Receipt (Attachment E).


Multiple Cash Incentives

Adult and adolescent respondents who complete an in-person interview and are selected for and agree to complete a follow-up CRS interview will be given a total of $80 at the end of the in-person interview. Parents who agree to complete a web/telephone parent interview and are selected for and agree to complete a CRS parent interview will be given a total of $60 for each adolescent about whom they report. The maximum number of adolescents selected per household will be two, and that is likely to be a rare event. If an adult completes a screening and receives a $10 screening incentive, is selected for an in-person interview and completes it, is selected for and agrees to complete a CRS interview for himself/herself, has an adolescent who is interviewed and agrees to complete the web/telephone parent interview, and is selected for and agrees to complete a CRS parent interview, that adult could receive a total of $150 in incentives for completing five separate interviews (including the screening interview). Similar mental health studies have offered separate incentives for each study component or interview, with an adult's total potential earnings reaching $150-$200. For the 2002 U.S. National Comorbidity Survey Replication (NCS-R), adults could receive $50 for completing a 90-minute CAPI interview, or $100 (offered only during the last 60 days of the field period) for a truncated CAPI or CATI interview. In addition, they could receive another $50 for completing a parent interview and another $50 if selected for a follow-up clinical reappraisal interview.25
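The maximum-payment scenario above can be verified with simple arithmetic, using the amounts from Exhibit 1 and the text:

```python
# Maximum incentives a single adult could receive across the five
# interviews described above (amounts from Exhibit 1).
incentives = {
    "screening (Level 2 condition)": 10,
    "in-person adult interview": 40,
    "adult CRS interview": 40,
    "parent web/telephone interview": 30,
    "parent CRS interview": 30,
}
total = sum(incentives.values())
print(total)  # 150
```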


10. Assurance of Confidentiality

Concern for the confidentiality and protection of respondents’ rights in the implementation of the NMHS Field Test will be given the utmost emphasis as set forth by the Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA, included as Title V in the E-Government Act of 2002, P.L. 107-347). This statute prohibits disclosure or release, for non-statistical purposes, of information collected under a pledge of confidentiality. As CIPSEA agents, all Contractor staff working on the NMHS Field Test complete an annual CIPSEA training and sign a notarized Confidentiality Agreement (Attachment O). FIs and FSs, who work for a subcontractor to the Contractor, will also complete CIPSEA and project training on ensuring respondent confidentiality and will have signed a notarized Data Collection Agreement (Attachment O) certifying they will keep all respondent information confidential.


Under CIPSEA, data may not be released to unauthorized persons. CIPSEA safeguards the confidentiality of individually identifiable information acquired under a pledge of confidentiality by controlling access to, and uses made of, such information. CIPSEA includes fines and penalties for any knowing and willful disclosure of individually identifiable information by an officer, employee, or agent of SAMHSA. Willful and knowing disclosure of protected data to unauthorized persons is a felony punishable by up to five years’ imprisonment and up to a $250,000 fine.


Screening and In-Person Interviews

FIs will be thoroughly educated in methods for maximizing a respondent’s understanding of the government’s commitment to confidentiality. Furthermore, FIs will make every attempt to secure an interview setting in the respondent’s home that is as private as possible, particularly when the respondent is an adolescent. The Contractor’s Institutional Review Board (IRB) was granted a Federalwide Assurance (Attachment P) by the Office for Human Research Protections (OHRP) and HHS in compliance with the requirements for the protection of human subjects (45 CFR 46). The Contractor’s IRB will approve the protocols and consent forms for the NMHS prior to any respondent contact. The IRB’s primary concern is protecting respondents’ rights, one of which is maintaining the confidentiality of respondent information. By obtaining IRB approval for NMHS procedures and materials, CBHSQ is assured that respondent confidentiality will be maintained.


Several procedures ensure respondents’ rights are protected. First, as a security measure to protect confidentiality, the study will use a tablet and laptop to physically separate the screening and demographic data from sensitive interview data. All addresses will be contained on the tablet FIs will use to capture screening and demographic data. For subjects selected for an interview, a separate laptop will be used to capture the interview data.


The project laptop and tablet will use FIPS 140-2 compliant full-device encryption. In addition to the encryption provided by Check Point, the laptop will require two-factor authentication for Windows login, using a personalized token called a YubiKey. YubiKey tokens will be issued by the Contractor to each active field interviewer. The laptops will also be equipped with the most up-to-date antivirus program, and the Contractor will perform routine software updates through LANDESK on these laptops to mitigate potential operating system vulnerabilities. In addition to the device-level encryption provided by Samsung, users must know the password to turn on the tablets. All personally identifiable information collected on the tablets will be encrypted in the database at rest. The laptop also has an installed utility that runs in the background to monitor for potential malware activity.


For the doorstep screening, the FI will ask to speak with an adult at the address, introduce himself or herself, and establish the legitimacy of the study by displaying an ID badge and providing a copy of the NMHS Study Description (Attachment G) and the NMHS Fact Sheet (Attachment Q). A Question & Answer Brochure (Attachment J) that provides answers to commonly asked questions may also be given. In addition, FIs will be supplied with copies of NIMH Articles and Information Sheets (Attachment R) for use in eliciting participation, which can be left with the respondent. If the respondent questions the legitimacy of the study, the FI will refer them to the SAMHSA Authorization Letter (Attachment S). At each screening, the FI will use the tablet computer to list all dwelling unit residents aged 13 and older. The tablet's programmed selection algorithm will select up to two individuals to be interviewed. The screening data will be encrypted at rest on the tablet.
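The mechanics of within-household selection can be sketched as below. Equal-probability selection is shown purely for illustration; the actual algorithm applies design-based selection probabilities (e.g., by age group and other roster characteristics) that are not specified here:

```python
import random

def select_respondents(roster, max_selections=2, seed=None):
    """Select up to two residents aged 13 or older from a household roster.
    Equal-probability sampling is an illustrative simplification; the real
    tablet algorithm uses design-based selection probabilities."""
    rng = random.Random(seed)
    eligible = [person for person in roster if person["age"] >= 13]
    k = min(max_selections, len(eligible))
    return rng.sample(eligible, k)
```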


After the screening is completed, the FI will introduce himself or herself and the study to each individual selected for an interview using the Introduction and Informed Consent Scripts (Attachment H), reading the scripted text aloud. This statement will also appear in the Showcard Booklet (Attachment T). As part of the process for obtaining informed consent, respondents will be given a Study Description (Attachment G), which includes information on CIPSEA and the protection it affords. Specifically, the Study Description states that respondents’ answers will be used only by authorized personnel for statistical purposes and cannot be used for any other purpose. If the selected respondent is aged 13 to 17, the FI will read the parental introductory script (Attachment U) to the parent or legal guardian, requesting permission to speak with the adolescent about the NMHS; the only exception is the rare instance in which a 17-year-old lives independently from his or her parent or legal guardian, in which case the 17-year-old provides his or her own consent. After that introduction, the FI will obtain parental consent for the interview from the parent or legal guardian and request the adolescent’s assent; at least one parent or legal guardian must remain present in the home throughout the interview.


After obtaining informed consent, FIs will make every attempt to secure an interview setting in the respondent’s home that is as private as possible. In addition, the interview process, by design, will include techniques to afford privacy for the respondent. The ACASI portion of the adult and adolescent questionnaire will maximize privacy and confidentiality by giving control of the sensitive questionnaire sections directly to the respondent. The ACASI methodology will allow the respondent to listen to questions through a headset and/or to read the questions on the computer screen, and then key his or her own responses into the computer via the keyboard. At the end of the ACASI portion, the respondent’s answers will be locked so that no one can see the responses until after the data are transmitted, processed, and aggregated by the Contractor.


Parent Web/Telephone Interviews

The study will attempt to interview the parent of each adolescent interviewed. The FI will introduce the parent study by providing the parent with a Parent Study Description (Attachment K), which also includes information on CIPSEA and the protection it affords. Specifically, the Study Description states that any data provided will be used only by authorized personnel for statistical purposes and cannot be used for any other purpose. If the parent agrees to participate in the parent interview, the parent will receive a Parent Interview Information Card (Attachment V), filled out by the FI, which will include the child’s initials, the web address for completing the interview, a technical support telephone number, and the option of calling to complete the survey with a telephone interviewer (TI). The secure web link for the parent interview will adhere to government https protocols. The FI will also request the child’s initials, the parent’s email address, and the parent’s telephone number, recording them in the tablet for use in e-mail and text prompts to parents who do not promptly complete the interview. If the parent or legal guardian refuses to participate and does not provide key contact information during the recruitment phase with the FI, the FI will not recontact the household. The information will be collected in the tablet and then stored and transmitted in encrypted format to the Contractor, where it will be parsed and saved as a separate file on the Contractor’s secure network in a FIPS-Moderate environment. The parent contact information will be accessed and used only to prompt parents where needed. FIs will encourage parents to complete the interview while the FI is still in the household, but the parent can complete it at a later time by telephone or web. The parent will also go through a consent process at the start of the parent interview, whether by telephone or web.
The Parent Study Informed Consent (Attachment L) also mentions that federal law keeps interview answers private, and that any data provided will only be used by authorized personnel for statistical purposes according to the Confidential Information Protection and Statistical Efficiency Act of 2002.


CRS Interviews

At the end of each in-person adult and adolescent interview, a sample selection algorithm will determine whether the adult or adolescent respondent, and the parent of the adolescent, should be invited to participate in the CRS. For each person selected, the FI will provide the CRS Follow-up Study Description (adult and adolescent in Attachment M-1, parent in Attachment M-2) and attempt to recruit the subject to participate in the CRS. The study description explains how the CRS data are protected. Specifically, it states that the participant’s name and telephone number will not be included on the paper interview forms on which answers are written, or on any interview audio files that might be recorded for quality assurance, and that federal law protects the privacy of answers and requires study staff to keep all answers confidential. For each person who agrees to participate in the CRS, the FI will request and then enter the first name, telephone number, and best times to contact into the laptop. The information will be stored and transmitted in encrypted format to the Contractor, where it will be parsed and saved as a separate file on the Contractor’s secure network in a FIPS-Moderate environment. This information will be used only by the CI to contact the person to arrange an appointment to complete the CRS interview. CRS contact information will be maintained and available to the CIs through a CRS Case Management System (see Section 11). On the telephone, the CIs will follow a structured process, first administering the Introduction and Consent for the Clinical Interview (Attachment N) and answering any questions before administering the CRS interview. The consent also mentions that federal law keeps interview answers private.

Collection and Storage of Identifying Information

To ensure confidentiality, the respondent’s name, address, and other identifying information are never connected to his or her survey responses. There are five instances where the collection of identifying information is necessary:

  • For contacting parents, FIs will collect the child’s initials, parent’s email address, and telephone number in the tablet only for e-mail and text prompts to parents who do not promptly complete the interview.

  • For the CI to use in contacting subjects to schedule and complete CRS interviews the FI will request and enter the first name, telephone number and best times to contact in the laptop.

  • For verification of the FI’s work, at the end of the screening or interview, subjects will be asked to voluntarily provide their address and telephone number so they can be called to confirm that the FI followed study procedures. Also, for verification of non-screening cases, the person who provides the information will be asked to provide his or her first name and telephone number. This information will be entered in the tablet. The Contractor will contact respondents from approximately 15 percent of all completed in-person interviews and 5 percent of all completed screenings to verify that study protocols were properly followed by FIs.

  • For the rare case when screening group quarter units, the first name of each occupant will be collected as occupants are often unrelated and of a similar age.

  • For data verification and block level assignments, the FI will obtain permission to collect GPS coordinates and record the coordinates in the tablet when leaving the household.


The identifying information above is collected and then stored in encrypted format in the tablet or laptop, and will be transmitted in encrypted format to the Contractor. These identifiers will be parsed and saved as separate files from the other survey data.

Each day they work, FIs will electronically transmit all completed screening data, interview data, GPS coordinates, and contact information (for the parent and CRS interviews) to the Contractor’s servers via secure encrypted data transmission. On the screening and interview data files, respondents will be distinguished only by a unique number.


This data collection is subject to the Privacy Act of 1974.26 The most recent NSDUH Privacy Impact Assessment (PIA), updated by SAMHSA on May 13, 2016, would cover the NMHS Field Test.


11. Questions of a Sensitive Nature

Many of the NMHS interview questions concern topics likely to be viewed as sensitive by respondents. Methodological safeguards, including the use of a self-administered mode of questionnaire administration, protect the privacy of data collected on sensitive issues and can improve the reporting of sensitive information.


Adult and Adolescent Interviews

For both the adult and adolescent questionnaires, the interview will begin with the survey introduction, where the FI informs the respondent why the information is necessary, indicates who sponsors the study, requests consent to conduct an interview, and explains the procedures that ensure confidentiality. As noted in section A.10, for respondents between the ages of 13 and 17, verbal consent will be obtained from the parent or legal guardian and then verbal assent will be obtained from the adolescent, except in rare instances where a 17-year-old lives independently without a parent or legal guardian and provides his or her own consent (see Attachment H, Introduction and Informed Consent Scripts, for the verbal consent text). Once parental consent is obtained, great care will be taken to ensure that the interview is conducted without parental observation or intervention, though at least one parent or legal guardian must remain present elsewhere in the home throughout the interview.


Answers to sensitive questions, including all mental health, substance use, sexual behavior, self-harm, and crime and violence questions, will be obtained through the self-administered ACASI design. In the ACASI portion of the interview, the respondent will see the questions on the computer screen, hear them read aloud through headphones, and enter his or her answers directly into the computer. The FI will be trained to sit where he or she cannot see the computer screen and thus will not know which questions are being asked or how the respondent answers them. The ACASI methodology maximizes privacy even for respondents with lower levels of literacy, which is especially important for a study that includes adolescents.


Parent Interviews

The parent interview is designed as a self-administered web survey (a telephone interview is available to respondents who do not have internet access or choose not to complete the interview via web). The FI will explain the purpose of the study to the parent at the time the adolescent interview is conducted and provide details on how to access the web survey or how to call in to participate in the survey. The secure web link for the parent interview will adhere to government https protocols. In either mode, web or telephone, the parent will be told some of the questions are sensitive and that completing the survey in a private setting is recommended.


Transmission and Consent

All NMHS data collected in the field using Computer Assisted Interviewing (CAI) will be transmitted regularly to the Contractor via secure encrypted data transmission and distinguished only with a unique number, which is a code associated with the selected dwelling unit (SDU). The questionnaire data will be processed immediately upon receipt at the Contractor’s facilities, and all associations between a questionnaire and the respondent’s address will be destroyed after all data processing activities are completed. The listings of SDU addresses will be kept under secured conditions and destroyed after all data processing activities are completed.


No signed consent forms will be used; however, verbal consent will be obtained as explained above.


CRS Interviews

The CRS is a clinical interview composed, by definition, of sensitive topics. The CRS interviews will be completed by CIs trained in administering semi-structured psychiatric interviews such as the Structured Clinical Interview for DSM-5 (SCID)27 for adults and the Kiddie Schedule for Affective Disorders and Schizophrenia-Present and Lifetime Version 2013 (K-SADS-PL)28 for adolescents and parents. DSM-5 diagnoses will be derived from these clinical interviews. The CIs will administer the semi-structured interview over the telephone from a private location in a home or office. Home offices must be in a separate, dedicated room, away from the main traffic flow, to prevent interruptions and to ensure the confidentiality of CRS forms. When calling to conduct the clinical follow-up interview, the CI will ask the respondent to go to a private location for the duration of the interview. Adolescents will be required to complete the interview at home, and the parent or legal guardian of the adolescent must provide consent via telephone before the CI can proceed with the clinical interview.


The CI will explain to each respondent that he or she knows only the respondent’s first name and telephone number and that this identifying information will be destroyed by the CI after the interview has been completed. The CI will repeat the confidentiality assurances and will ask for permission to make an audio recording of the interview (see Attachment N). The CI will note the respondent’s answers on a paper interview response sheet and will keep the completed sheets in a secure location until shipping them via United Parcel Service (UPS) to the Contractor for a quality control (QC) review by specially trained clinical supervisors (CSs). After this QC step, the paper response sheets will be technically edited and keyed.


The only identifying information on the CRS questionnaires will be a randomly generated 7-digit number linking the CRS data to the in-person interview data. The two data sets can be linked only by researchers at the data collection Contractor who have signed confidentiality pledges (Attachment O) and who will use the data only for research purposes.


All CIs will use their own computers to access the CRS Case Management System (CMS) website, which will require a 2-factor login as part of a FIPS-Moderate system. The respondent contact information will be encrypted in the CMS database, and no project data will be stored on the CIs’ computers. All interviews will be conducted via an interactive voice response (IVR) system designed by the Contractor that resides on a secured internal network. The procedure for conducting a CRS interview is as follows:

  1. The CI will log on to the CMS website and obtain the respondent’s contact information.

  2. The CI will call into the IVR system that will prompt for the respondent’s telephone number.

  3. The IVR system will initiate the call that connects the CI with the respondent to start the interview.

  4. The CI will initiate and end the recording within the IVR system.


Permission to record the interview is not a requirement to complete the interview.

All CRS interviews for which the respondent grants permission will be recorded and stored directly on a secured network. This protocol provides a high degree of protection for the confidentiality of the audio files.


In summary, CRS data will be protected by a high level of security measures:

  • Respondent contact information will be encrypted in the database at rest.

  • The CMS website will require 2-factor login for users to obtain respondent contact information. To access the CMS, a user must have a valid user name and password in addition to a project-issued device that generates a random string every time the individual attempts access. The device will be personalized to the user and cannot be shared with others.

  • All recorded data will be stored on the Contractor’s file server, protected from the public internet by corporate IT security controls.


All paper and audio recordings will be destroyed approximately 12 months after the end of data collection.


12. Estimates of Annualized Hour Burden

For the NMHS Field Test, the sample has been designed to yield approximately 1,200 completed interviews. It will be necessary to sample approximately 3,563 households and complete approximately 2,331 household screenings to obtain the desired number of interviews for the field test.


The field test screening process will be based on the process developed for and used successfully on the 2015 NSDUH. Based on that experience, administration of the screening questions is expected to take an average of five minutes per SDU.


Initial timing estimates indicate the NMHS adult and adolescent in-person questionnaires will take about 63 minutes to administer, on average. The quality control questions (Attachment A-1, adult questionnaire p. 226, adolescent questionnaire p. 233) at the end of the in-person interviews are estimated to take an additional two minutes to administer. This will bring the total in-person interview burden to about 65 minutes. The parent questionnaire will take about 30 minutes to administer, on average. The adult and adolescent CRS interviews will take about 60 minutes, on average. The parent CRS will take approximately 30 minutes. All timing estimates will be evaluated as part of the field test.


Since the NMHS screening and interview verification contacts will be based on NSDUH processes, each is estimated to take an average of four minutes and will be administered only to a subsample of the cases. An approximate 15 percent random sample of each FI’s completed in-person interviews will be verified via telephone. In addition, a portion of the following completed screening codes that do not result in a respondent being selected for an interview will be verified:

  • vacant;

  • not a primary residence;

  • not a dwelling unit;

  • contain only military personnel;

  • include only residents who will live in the household for less than half of the quarter; and

  • no one was selected for interview.


Previous NSDUH experience indicates that approximately 60 percent of all screenings will result in one of those six screening codes. An approximate five percent random sample of all such screening codes will be selected for verification follow-up.


The NMHS Field Test data collection period will be two months, spanning the period from mid-October to mid-December 2017. The annualized estimated respondent burden for the NMHS is shown in Table 1. The hourly wage of $15.58 was calculated based on weighted data from the 2015 NSDUH and respondents' reported personal annual income.


Table 1. Annualized Estimated Respondent Burden for the NMHS Field Test

| Instrument | No. of Field Test respondents | Responses per respondent | Total number of responses | Hours per response | Total burden hours | Hourly wage rate | Total hour cost |
|---|---|---|---|---|---|---|---|
| Household screening | 2,331 | 1 | 2,331 | 0.083 | 193 | $15.58 | $3,007 |
| In-person interview plus quality form | 1,200 | 1 | 1,200 | 1.083 | 1,300 | $15.58 | $20,254 |
| Parent web/telephone interview | 210 | 1 | 210 | 0.50 | 105 | $15.58 | $1,636 |
| Clinical interview for adults and adolescents | 150 | 1 | 150 | 1.00 | 150 | $15.58 | $2,337 |
| Clinical parent interview | 50 | 1 | 50 | 0.50 | 25 | $15.58 | $390 |
| Screening verification | 142 | 1 | 142 | 0.067 | 10 | $15.58 | $156 |
| Interview verification | 180 | 1 | 180 | 0.067 | 12 | $15.58 | $187 |
| Total | 4,263 | | | | 1,795 | | $27,967 |
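The burden-hour and cost figures in Table 1 follow from multiplying the number of responses by the hours per response and by the $15.58 hourly wage, rounding to whole hours and dollars. A brief arithmetic check of the table (rows listed in order):

```python
# (responses, hours per response) for each Table 1 row, in order:
# screening, in-person interview, parent interview, clinical A/A,
# clinical parent, screening verification, interview verification
rows = [(2331, 0.083), (1200, 1.083), (210, 0.50), (150, 1.00),
        (50, 0.50), (142, 0.067), (180, 0.067)]
WAGE = 15.58  # hourly wage rate derived from the 2015 NSDUH

hours = [round(n * h) for n, h in rows]      # total burden hours per row
costs = [round(hh * WAGE) for hh in hours]   # total hour cost per row

assert hours == [193, 1300, 105, 150, 25, 10, 12]
assert costs == [3007, 20254, 1636, 2337, 390, 156, 187]
assert sum(hours) == 1795 and sum(costs) == 27967  # Table 1 totals
```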



13. Estimates of Annualized Cost Burden to Respondents

There are no capital, startup, operational, or maintenance costs to respondents.


14. Estimates of Annualized Cost to the Government

Total costs associated with the NMHS Field Test are estimated to be $7,392,061 over a 36-month contract performance period. Of the total, $6,257,356 is for contract costs (i.e., sampling, data collection, processing, and reports), and approximately $1,134,705 represents CBHSQ costs to manage and administer the survey. The annualized cost is approximately $2,464,020.
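The component and annualized figures are internally consistent: the two cost components sum to the 36-month total, and dividing that total by the three-year performance period, rounded to the nearest dollar, gives the annualized cost:

```python
contract_cost = 6_257_356   # sampling, data collection, processing, reports
cbhsq_cost = 1_134_705      # CBHSQ survey management and administration
total = contract_cost + cbhsq_cost
assert total == 7_392_061             # 36-month total
assert round(total / 3) == 2_464_020  # annualized over 3 years
```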


15. Changes in Burden

This is a one-time, new data collection.


16. Time Schedule, Publication and Analysis Plans

Plans for the NMHS Field Test data involve two data products: (a) NMHS Field Test Report; and (b) Field Test Data File and Codebook. Field test data collection is anticipated between October and December 2017. Analysis will begin in tandem with data collection. Table 2 includes a schedule for the NMHS Field Test.


(a) NMHS Field Test Final Report (July 2018). This report will present highlights and detailed findings of the NMHS Field Test. It will consist of a series of exhibits, both graphic and tabular, and narrative highlights of the evaluation of the field test, as well as selected mental health results and recommendations for improvement for future data collection efforts.


(b) NMHS Field Test Data File and Codebook (July 2018).

This will be an analytic data file that will serve as the basis for the Field Test Final Report and for future analyses. The codebook will provide standardized documentation using processes developed for NSDUH. This data file is for use by the Contractor, SAMHSA, and NIMH only; it is not planned for public release.


Table 2. Project Schedule for the NMHS Field Test

| Activity | Time Frame |
|---|---|
| Design and select area frame sample | March 2017 to March 2017 |
| Prepare for and conduct field staff training | March 2017 to January 2018 |
| Program the screening and interview instruments | January 2017 to May 2017 |
| Recruit field staff and generate all required materials/assignments for distribution | August 2017 to November 2017 |
| Conduct screenings and interviews | January 2018 to February 2018 |
| Conduct data processing and file preparation | February 2018 to June 2018 |
| Deliver Raw Data Files and Codebooks | July 2018 |
| Deliver NMHS Field Test Final Report | July 2018 |


17. Display of Expiration Date

The OMB expiration date will be displayed on all NMHS Field Test data collection instruments (Attachments A-1 to A-4) and specific materials, including the Study Descriptions (Attachments G, K, M-1, and M-2), Screening Questions (Attachment I), Quality Control Letter (Attachment W), and CRS Cover Sheet and Transmittal Forms (Attachment X).


18. Exceptions to Certification Statement

The certifications are included in this submission and fully comply with 5 CFR 1320.9.

1 Kessler, R. C., & Merikangas, K. R. (2004). The National Comorbidity Survey Replication (NCSR): Background and aims. International Journal of Methods in Psychiatric Research, 13(2), 60-68.

2 Merikangas, K. R., Avenevoli, S., Costello, E. J., Koretz, D., & Kessler, R. C. (2009). National Comorbidity Survey Replication Adolescent Supplement (NCS-A): I. Background and measures. Journal of the American Academy of Child & Adolescent Psychiatry, 48(4), 367-379.

3 Center for Behavioral Health Statistics and Quality. (2014). 2012 National Survey on Drug Use and Health: Methodological Resource Book (Section 16a, Mental Health Surveillance Study Operations Report [2008-2012]). Substance Abuse and Mental Health Services Administration, Rockville, MD.

4 American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). Washington, DC: Author.

5 Hamilton, C. M., Strader, L. C., Pratt, J. G., Maiese, D., Hendershot, T., Kwok, R. K., Hammond, J. A., Haines, J. (January 01, 2011). The PhenX Toolkit: Get the Most From Your Measures. American Journal of Epidemiology, 174, 3, 253-260.

6 Ursano, R. J., Colpe, L. J., Heeringa, S. G., Kessler, R. C., Schoenbaum, M., Stein, M. B., & Army STARRS collaborators. (January 01, 2014). The Army study to assess risk and resilience in service members (Army STARRS). Psychiatry, 77, 2, 107-19.

7 Dillman, D. A. (2000). Mail and Internet Surveys: The Tailored Design Method (2nd ed.). New York: John Wiley & Sons.

8 Biner, P. M., & Kidd, H. J. (1994). The interactive effects of monetary incentive justification and questionnaire length on mail survey response rates. Psychology and Marketing, 11, 483–492.

9 Groves, R. M., Singer, E., & Corning, A. D. (2000). A leverage-saliency theory of survey participation: description and illustration. Public Opinion Quarterly, 64, 299–308.

10 Groves, R. M., Presser, S., & Dipko, S. (2004). The role of topic interest in survey participation decisions. Public Opinion Quarterly, 68(1), 2–31.

11 Kulka, R. A., Eyerman, J., & McNeeley, M. E. (2005). The use of monetary incentives in federal surveys on substance use and abuse. Journal of Economic and Social Measurement, 30(2-3), 233–249.

12 Singer, E., and Ye, C. (2013). The use and effects of incentives in surveys. The Annals of the American Academy of Political and Social Science, 645(1), 112–141.

13 Trussell, N., & Lavrakas, P. J. (2004). The influence of incremental increases in token cash incentives on mail survey response is there an optimal amount? Public Opinion Quarterly, 68(3), 349-367.

14 Beebe, T. J., Davern, M. E., McAlpine, D. D., Call, K. T., & Rockwood, T. H. (2005). Increasing response rates in a survey of Medicaid enrollees: The effect of a prepaid monetary incentive and mixed modes (mail and telephone). Medical Care, 43(4), 411-414.

15 Kennet, J., Gfroerer, J., Bowman, K. R., Martin, P. C., & Cunningham, D. B. (2005). Introduction of an incentive and its effects on response rates and costs in NSDUH. In Kennet, J., & Gfroerer, J. (Eds.), Evaluating and improving methods used in the National Survey on Drug Abuse (DHHS Publication No. SMA 05-4044, Methodology Series M-5). Rockville MD: Substance Abuse and Mental Health Services Administration, Office of Applied Studies.

16 Groves, R. M., Mosher, W. D., Lepkowski, J., Kirgis, N. G. Planning and development of the continuous National Survey of Family Growth. National Center for Health Statistics. Vital Health Stat 1(48). 2009.

17 Food and Drug Administration’s Research and Evaluation Survey for the Public Education Campaign on Tobacco among LGBT (RESPECT): Supporting Statement. (n.d.). http://www.reginfo.gov/public/do/DownloadDocument?objectID=65626801

18 Han, D., Montaquila, J. M, & Brick, J. M. (2012). An evaluation of incentive experiments in a two-phase address-based mail survey. In Proceedings of the Survey Research Methods Section of the American Statistical Association (pp. 3765–3778).

19 Kirlin, J. A., & Denbaly, M. (2013). FoodAPS National Household Food Acquisition and Purchase Survey. US Department of Agriculture, Economic Research Service.

20 Bielick, S., Cronen, S., Stone, C., Montaquila, J., and Roth, S. (2013) The Adult Training and Education Survey (ATES) Pilot Study: Technical Report (NCES 2013-190). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved 28Nov2016 from http://nces.ed.gov/pubsearch

21 Singer, E., and Ye, C. (2013). The use and effects of incentives in surveys. The Annals of the American Academy of Political and Social Science, 645(1), 112–141.

22 Creighton, K. P., King, K. E., & Martin, E. A. (2007). The use of monetary incentives in Census Bureau longitudinal surveys. Survey Methodology, 2.

23 Jäckle, A., & Lynn, P. (2008). Respondent incentives in a multi-mode panel survey: Cumulative effects on nonresponse and bias. Survey Methodology, 34, 105–117.

24 Groves, R. M., & Couper, M. P. (1998). Nonresponse in household interview surveys. New York: Wiley.

25 Kessler, R. C., et al. (2004). The US National Comorbidity Survey Replication (NCS-R): Design and field procedures. International Journal of Methods in Psychiatric Research, 13(2), 69-92.


26 The SAMHSA System of Record Notices covering the NMHS are 09-30-0036 and 09-30-0049. See http://beta.samhsa.gov/privacy/pia for more information.

27 First, M. B., Williams, J. B. W., Karg, R. S., Spitzer, R. L. (2015). Structured Clinical Interview for DSM-5—Research Version (SCID-5 for DSM-5, Research Version; SCID-5-RV). Arlington, VA: American Psychiatric Association.

28 Kaufman, J., Birmaher, B., Axelson, D., Perepletchikova, F., Brent, D., Ryan, N. (2013). Kiddie Schedule for Affective Disorders and Schizophrenia for School Age Children–Present and Lifetime Version (K-SADS-PL), 2013. University of Pittsburgh School of Medicine, Department of Psychiatry.

