
OMB: 0930-0388


Mental and Substance Use Disorders Prevalence Study

SUPPORTING STATEMENT

Part A. Justification

A1. Circumstances Necessitating Data Collection

The Substance Abuse and Mental Health Services Administration (SAMHSA) is requesting approval from the Office of Management and Budget (OMB) to conduct the Mental and Substance Use Disorders Prevalence Study (MDPS) pilot program.

This data collection includes the following instruments:

  1. Household Roster (Attachments A and B) – This instrument includes questions to identify all adults aged 18 to 65 years within selected households.

  2. Household and Jail Mental Health Screening Instrument (Attachments C and D) – This instrument includes socio-demographic questions, questions about COVID-19 impacts, and one of two screening tools to assess mental or substance use disorder symptoms: either (1) select items from the Composite International Diagnostic Interview (CIDI) or (2) the Computerized Adaptive Test – Mental Health (CAT-MH). The purpose of the mental health screening instrument is to help select sample members for the clinical interview based on their likelihood of having various mental or substance use disorders. Because screening for mental health and substance use prior to a clinical interview has not been done in a national household sample, two versions of the screener will be used to identify an optimal method. Each respondent, however, will complete only one of the two mental health screening interviews, either the CIDI or the CAT-MH, and those who complete the CAT-MH will be asked a subset of the full bank of CAT-MH questions, given the adaptive nature of the instrument.

  3. Household and Non-household Clinical Interview (Attachment E) – This interview will include a computerized version of the Structured Clinical Interview for DSM-5, Clinician Version (SCID-5-CV), a semi-structured clinical assessment of mental and substance use disorders, along with additional demographic and background questions regarding treatment, COVID-19 impacts on treatment access, health insurance, cigarette and vaping use, and criminal history. This is the only instrument used with the non-household populations, with the exception of the mental health screening instrument administered to jail participants.

SAMHSA requests approval to use these instruments while conducting the following activities:

  1. Collecting data on select mental disorders from U.S. household residents for the MDPS pilot program.

  2. Recruiting and engaging federal or state prisons, homeless shelters, state psychiatric hospitals, and jails, for the purpose of gathering information to facilitate data collection for non-household residents.

  3. Collecting data on select mental disorders from residents of homeless shelters, state psychiatric hospitals, prisons, and jails, for the MDPS pilot program.

Although this is a new clearance request, the MDPS pilot program has already obtained IRB and OHRP approval and initiated data collection. As required by the terms of the SAMHSA cooperative agreement, MDPS data collection began in October 2020 and is currently ongoing. To date, household data collection has been completed with 15,052 roster respondents, 13,064 screener respondents, and 1,913 clinical interview respondents. A total of 263 facility respondents have completed the clinical interview as part of the MDPS pilot program. Because data collection has occurred during the COVID-19 pandemic, data collection strategies are designed to prioritize remote (not in-person) data collection opportunities.

A1.1 Study Background

Adult mental illness, particularly serious mental illness (SMI), and substance use disorders (SUDs) are significant public health problems with substantial unmet treatment need in the United States. In 2019, an estimated 51.5 million adults (20.6 percent) living in U.S. households had past-year symptoms of mental illness (SAMHSA, 2020). This 2019 estimate was higher than the comparable estimate for each year from 2008 to 2018. Of those, 13.1 million adults (5.2 percent of all U.S. adults; 25.4 percent of those with any past-year mental illness) experienced mental illness with serious functional impairment (i.e., SMI) that interfered with or limited one or more major life activities. More than one third (34.5 percent) of household-dwelling adults with SMI in 2019 did not receive treatment, and 47.7 percent said there was a time in the past year when they thought they needed treatment or counseling for their mental health but did not receive it (SAMHSA, 2019). Despite calls for improved surveillance, critical methodological gaps remain (e.g., Satcher & Druss, 2010). Two gaps are particularly important: (1) the lack of accurate estimates in household populations of the most seriously impairing disorders and (2) the exclusion from national estimates of the incarcerated, homeless, and institutionalized non-household populations, among whom SMI and SUDs are over-represented (Lamb, 1998; Lamb, 2004).

Based on these needs, the Substance Abuse and Mental Health Services Administration (SAMHSA) launched the Mental and Substance Use Disorders Prevalence Study (MDPS). The study is a pilot program to determine the best methods for estimating the number of individuals living with mental and substance use disorders, especially those of the greatest severity. The primary goal of the MDPS is to examine methods to provide unbiased and precise national prevalence estimates of schizophrenia/schizoaffective disorder, bipolar disorder, major depression, posttraumatic stress disorder (PTSD), obsessive-compulsive disorder, generalized anxiety disorder, and alcohol, benzodiazepine, opioid, stimulant, and cannabis use disorders among U.S. adults ages 18 to 65. The SCID instrument used for the MDPS will not differentiate schizoaffective disorder from schizophrenia, because that level of diagnostic specificity would have required a substantially longer SCID interview.

The study is being conducted over 4 years, beginning in October 2019. Year 1 was devoted to planning; years 2 and 3 are devoted to data collection; and year 4 will include finalizing data collection, preparing final reports, and delivering final data sets to SAMHSA.

A2. Purpose and Use of Information

Two of SAMHSA’s five key priority areas are to (1) address serious mental illness and serious emotional disturbance and (2) advance prevention, treatment, and recovery support services for substance use. The MDPS pilot program, and future data collection efforts that will be developed from methods examined within the MDPS, will provide critical information that can help guide actions that SAMHSA takes to address these priority areas.

To better understand the current prevalence of select mental disorders, particularly those with serious functional impairment, SAMHSA plans to conduct the MDPS pilot program and investigate a method for estimating mental disorders across household and non-household populations. After deliberation and input from SAMHSA’s National Advisory Councils (SAMHSA, 2017) and in consultation with external expert reviewers (see Section A8.2, Consultation with Experts Outside of the Study), the decision was made to focus on those with seriously impairing disorders. This decision was based on the public health and personal burden posed by these disorders, the notable gap in psychosis assessment in prior research studies relative to the resources expended on disorders with low impairment, and recent concern over undercounting in estimates of psychosis and SMI in the United States (see Tasman, 2018).

A major gap of existing mental illness surveillance efforts is the focus on household populations, with the exclusion of non-household populations. The MDPS pilot program will begin to address this gap by examining methods to estimate the prevalence of specific mental illnesses, particularly adults with psychotic disorders and serious functional impairment, and treatment in both populations. The information collected is meant to build the body of knowledge about the rates of mental and substance use disorders in the U.S. population. It is not intended to be used as the principal basis for public policy decisions.


Specifically, the MDPS pilot program is designed to answer two core research questions:

  • What is the prevalence of schizophrenia/schizoaffective disorder (lifetime and past year), bipolar I disorder (past year), major depressive disorder (past year), generalized anxiety disorder (past year), posttraumatic stress disorder (past year), obsessive-compulsive disorder (past year), anorexia nervosa (past year), and alcohol, benzodiazepine, opioid, stimulant, and cannabis use disorders (past year) among adults, ages 18-65, in the United States?

  • What proportion of adults in the United States with these disorders received treatment in the past year?

In addition to these research questions, the MDPS pilot program will allow for procedural evaluation to:

  • Determine which set of screening instruments most accurately identifies mental and substance use disorders within the U.S. household population;

  • Understand the best approaches to conducting data collection within non-household settings, to gather information on mental illness and treatment;

  • Design protocols for collecting clinical interviews from proxy respondents; and

  • Establish a protocol that can be used at a larger scale to understand the prevalence and burden of specific mental disorders in both non-household and household populations across the United States.

A2.2 Sample Composition

The MDPS pilot program will focus on adults aged 18 to 65 in the following settings: residents of households; inmates in state/federal prisons and jails; homeless adults seeking services at shelters; and inpatient residents of state psychiatric hospitals. Adults older than 65, those on active duty in the military, and children will be excluded. These exclusion decisions were made based on extant research and in consultation with the expert panelists. First, adults older than 65 were excluded because of the difficulty of differentiating primary mental illness (such as psychosis or depression) from symptoms of dementia (Srikanth, Nagaraja, & Ratnavalli, 2005). Moreover, the age of onset for mental illnesses is generally before age 65, and, with the exception of episodic disorders, chronic disorder incidence rates drop sharply by age 65 (Kessler et al., 2007). For example, nearly 100% of lifetime cases of psychosis are detected by age 65 (Kirkbride et al., 2006). Second, active-duty military personnel were excluded because of practical concerns over deployment and transience and the low probability that an individual with a seriously impairing mental illness would be able to retain active-duty status. Finally, children were excluded because of differences in the types of disorders seen in this population and differences in the settings in which children with mental illness are treated or confined.

A2.3 Design Overview

The MDPS pilot program will conduct clinical interviews with adults across the United States between the ages of 18 and 65 years. The sample will include adults residing in households as well as in three types of non-household settings. Non-household populations will include incarcerated individuals (federal/state prisons and jails), institutionalized individuals (state psychiatric hospitals), and homeless individuals residing in shelters. To investigate methods to accurately estimate mental and substance use disorders, psychiatric epidemiological surveys will be conducted with (1) a national probability household sample; (2) a national probability sample of incarcerated individuals in federal/state prisons; and (3) convenience non-probability samples of individuals residing in homeless shelters, state psychiatric hospitals, and jails either located in the MDPS university partner areas (New York City; the Seattle, Washington area; and the Durham, North Carolina area) or in similar areas selected by the partners.

Approximately 45,000 households will be rostered to identify adults eligible to participate in the MDPS study. Up to 45,000 individuals within households will be screened to identify adults at elevated risk of mental or substance use disorders. Household sampling procedures will oversample screened individuals at elevated risk, particularly for schizophrenia/schizoaffective disorder. Up to two adults per household or household group quarters (i.e., adult care homes, board and care homes) will be selected and invited to participate in the clinical interview. No screening will be conducted in non-household settings, with the exception of jails. Instead, facility administrative rosters will be used to identify individuals in jails, prisons, homeless shelters, and state psychiatric hospitals. Using screening data or facility rosters to identify respondents, trained mental health clinicians will administer the Structured Clinical Interview for DSM-5, Clinician Version (SCID-5-CV) to up to 7,200 individuals from the household and non-household populations. Interviews will be conducted by video, phone, or in person, in English or Spanish. A method will then be developed to combine estimates from the household and non-household samples using statistical techniques, to yield national prevalence estimates of specific mental and substance use disorders.
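To make the combination step concrete, the following is a minimal sketch of a population-size-weighted combination of domain-level prevalence estimates. The population counts and prevalence values are placeholders for illustration only; the actual MDPS combination method will be developed by the study statisticians and will operate on weighted survey estimates.

    # Minimal sketch: combine domain-level prevalence estimates into a single
    # national estimate as a population-size-weighted average. The population
    # counts and prevalence values below are placeholders, not MDPS results.

    domains = {
        # domain: (assumed adult population size, assumed prevalence)
        "household": (190_000_000, 0.004),
        "prison": (1_200_000, 0.035),
        "homeless_shelter": (350_000, 0.060),
        "psychiatric_hospital": (40_000, 0.450),
    }

    total_pop = sum(size for size, _ in domains.values())

    # National prevalence = sum over domains of (domain share x domain prevalence)
    national_prevalence = sum(size * prev for size, prev in domains.values()) / total_pop

    print(f"Combined national prevalence: {national_prevalence:.4%}")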

Four feasibility studies will be conducted as part of the MDPS pilot program. These feasibility studies will examine methodological issues to consider in future MDPS-like surveillance efforts. The first feasibility study focuses on screening instruments for use in the household population. This feasibility study will examine the performance of two household screening approaches—one that uses a set of adaptive testing screening measures and one that uses a set of non-adaptive screeners—to determine which approach best identifies people at high risk for the prioritized mental disorder conditions.
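As an illustration of the kind of comparison this feasibility study involves, the sketch below tabulates the sensitivity and specificity of each screening approach against the SCID-based clinical interview within its own subsample (each respondent completes only one screener). The field names, flags, and toy records are hypothetical and are not the MDPS analysis specification.

    # Illustrative evaluation of each screening approach against the SCID-based
    # clinical interview. Because each respondent completes only one screener,
    # performance is tabulated separately within each screener subsample.
    # Field names and toy records are hypothetical.

    def sensitivity_specificity(records):
        tp = sum(1 for r in records if r["screen_positive"] and r["scid_positive"])
        fn = sum(1 for r in records if not r["screen_positive"] and r["scid_positive"])
        tn = sum(1 for r in records if not r["screen_positive"] and not r["scid_positive"])
        fp = sum(1 for r in records if r["screen_positive"] and not r["scid_positive"])
        return tp / (tp + fn), tn / (tn + fp)

    toy_data = [
        {"screener": "CAT-MH", "screen_positive": True,  "scid_positive": True},
        {"screener": "CAT-MH", "screen_positive": False, "scid_positive": False},
        {"screener": "CIDI",   "screen_positive": True,  "scid_positive": False},
        {"screener": "CIDI",   "screen_positive": True,  "scid_positive": True},
        {"screener": "CIDI",   "screen_positive": False, "scid_positive": False},
    ]

    for name in ("CAT-MH", "CIDI"):
        subsample = [r for r in toy_data if r["screener"] == name]
        sens, spec = sensitivity_specificity(subsample)
        print(f"{name}: sensitivity={sens:.2f}, specificity={spec:.2f}")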

The second feasibility study focuses on conducting clinical interviews with proxy respondents in households and hospitals. Some disorders, such as schizophrenia or schizoaffective disorder, may make it difficult to conduct interviews (e.g., because of disorganized thinking or speech or reluctance to participate because of paranoia). This feasibility study will create a method for interviewing proxy respondents and determine how often proxy respondents may be necessary and for what disorders.

The third feasibility study focuses on jails. This feasibility study is designed to determine whether gathering data directly from a jailed population is necessary to adequately represent that population in any future MDPS design. The average length of stay in jails is short enough that, over the course of a 12-month data collection effort, members of this population would likely be captured in one of the other study populations (household, prison, homeless shelter, or state psychiatric hospital). The study will also examine the operational feasibility of collecting this type of data from jail inmates. Screening will be conducted in one or more jails selected by a co-investigator site. A subset of participants recruited from the jail sample will be invited to participate in the clinical interview after their release.

The fourth feasibility study focuses on the use of administrative health records in the institutionalized hospital sample. Hospital records will be reviewed for participants in state psychiatric hospitals, with the goal of determining concordance between the clinical interview and the hospital record in this population. This comparison will help determine if administrative hospital health records might validly replace direct clinical interviews in future MDPS efforts.
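A hypothetical sketch of the concordance calculation is shown below, using percent agreement and Cohen's kappa on paired diagnoses; the paired values and diagnosis labels are invented for illustration and do not reflect MDPS data.

    # Illustrative concordance check between SCID-based clinical interview
    # diagnoses and diagnoses abstracted from hospital records, using simple
    # percent agreement and Cohen's kappa. The paired values are hypothetical.

    from collections import Counter

    def cohens_kappa(pairs):
        """pairs: list of (interview_dx, record_dx) tuples for the same patients."""
        n = len(pairs)
        observed = sum(1 for a, b in pairs if a == b) / n
        a_counts = Counter(a for a, _ in pairs)
        b_counts = Counter(b for _, b in pairs)
        expected = sum(a_counts[c] * b_counts[c] for c in set(a_counts) | set(b_counts)) / (n * n)
        return (observed - expected) / (1 - expected)

    pairs = [
        ("schizophrenia", "schizophrenia"),
        ("bipolar I", "schizophrenia"),
        ("schizophrenia", "schizophrenia"),
        ("major depression", "major depression"),
    ]

    agreement = sum(1 for a, b in pairs if a == b) / len(pairs)
    print(f"Percent agreement: {agreement:.0%}, kappa: {cohens_kappa(pairs):.2f}")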

This is a one-time data collection effort, and data collection will occur over a 15-month period. More detail on study design, sampling, and data collection procedures is given in Supporting Statement B. Exhibit 1 provides a summary of the MDPS data collection instruments. All instruments and respondent materials will be available in both English and Spanish.



Exhibit 1. Summary of MDPS Instruments

Household Roster
  Respondents: 45,000 adults age 18 or older
  Content: Age and sex of household members
  Purpose: Identify age-eligible household members to complete a mental health screening instrument
  Mode: Web, in-person, mail, telephone
  Duration: 8 minutes

Mental Health Screening Instrument
  Respondents: 45,000 adults ages 18-65; up to two adults per household
  Content: Mental health symptoms, substance use, sociodemographic factors including race/ethnicity, COVID-19 exposure
  Purpose: Identify risk level for mental and substance use disorders, to determine eligibility for the clinical interview
  Mode: Web, in-person, mail, telephone
  Duration: 15 minutes

Clinical Interview
  Respondents: 7,200 adults ages 18-65 residing in households, prisons, homeless shelters, or state psychiatric hospitals
  Content: SCID, treatment, medication use, socio-demographic factors including race/ethnicity, current sex and gender identity and sex and gender identity at birth, COVID-19 exposure
  Purpose: Determine the prevalence of mental and substance use disorders and their correlates
  Mode: Interviewer-administered by video, phone, or in person
  Duration: 83 minutes (household); 68 minutes (prison); 84.8 minutes (homeless shelter); 78.7 minutes (state psychiatric hospital)


A3. Use of Information Technology

MDPS clinical interview data are collected on laptop computers, either virtually (e.g., by Zoom video call or by phone) or face to face in respondents’ homes or in the facilities where non-household respondents reside. Interviews are administered by a clinical interviewer using computer-assisted interviewing (CAI). The CAI technology affords several advantages in the collection of MDPS data. First, this methodology permits the instrument designer to incorporate routings that might be overly complex or not possible using a paper-and-pencil instrument. The computer can be programmed to implement complex skip patterns, and interviewer and respondent errors caused by faulty implementation of skip instructions are virtually eliminated. Two examples of such methodology are the CAT-MH screener and the NetSCID clinical interview instruments. The NetSCID is a computerized version of the SCID interview programmed as a computer application for interviewer administration. Second, this methodology increases the consistency of the data. The computer can be programmed to identify inconsistent responses and attempt to resolve them through prompts to the respondent. This approach reduces the need for most manual and machine editing, thus saving both time and money. In addition, respondent-resolved inconsistencies are likely to yield data that are more accurate than inconsistencies resolved using editing rules.
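The following is a minimal illustration of the kind of routing and consistency logic CAI supports; the question identifiers and rules are hypothetical and are not drawn from the MDPS, CAT-MH, or NetSCID instruments.

    # Minimal illustration of CAI-style skip patterns and consistency checks.
    # Question IDs and rules are hypothetical, not MDPS instrument code.

    def next_question(answers):
        """Return the next question ID based on prior answers (a skip pattern)."""
        if answers.get("ever_depressed_2wk") == "no":
            return "alcohol_use_intro"          # skip the depression module entirely
        if answers.get("depressed_most_days") == "yes":
            return "depression_symptom_count"
        return "depression_duration"

    def consistency_prompt(answers):
        """Return a prompt if two answers conflict, so the interviewer can resolve it."""
        if answers.get("age") is not None and answers.get("age_first_symptom") is not None:
            if answers["age_first_symptom"] > answers["age"]:
                return "Age at first symptom exceeds current age. Please verify both answers."
        return None

    answers = {"ever_depressed_2wk": "yes", "depressed_most_days": "yes",
               "age": 42, "age_first_symptom": 45}
    print(next_question(answers))
    print(consistency_prompt(answers))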

CAI technology permits greater expediency with respect to data processing and analyses (e.g., a number of back-end processing steps, including coding and data entry). Data are transmitted electronically in a Federal Information Processing Standard (FIPS)-Moderate environment, rather than by mail. These technologies save time due to the speed of data transmission, as well as receipt in a format suitable for analysis. Tasks formerly completed by clerical staff are accomplished by the CAI program. In addition, the cost of printing paper questionnaires and associated mailing for the full sample is eliminated.

The household rostering and screening procedures are completed either online or by phone using a web-based instrument. Rosters and screenings completed in person will use a hand-held tablet computer. The primary advantage of this computer-assisted methodology is increased accuracy in selecting the correct household member to receive the screening and clinical interviews. The computer automatically selects the correct household member based on the demographic variables entered (from the household roster) and functional impairment responses (from the screening), thereby reducing the probability of human error. Hand-held tablet computers also provide the benefits of complex case management tools and quick and secure electronic transfer of data, as well as the option for interviewer or self-administration.
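The sketch below illustrates the general idea of automated within-household selection that favors screened members at elevated risk. The eligibility rule mirrors the study's age range, but the selection weights, field names, and household records are placeholders rather than MDPS sampling parameters.

    # Illustrative within-household selection: choose up to two age-eligible adults,
    # giving higher selection weight to those whose screening responses indicate
    # elevated risk. Weights and fields are placeholders, not MDPS parameters.

    import random

    def select_adults(household_members, max_selected=2, seed=1):
        rng = random.Random(seed)  # fixed seed so the toy selection is reproducible
        eligible = [m for m in household_members if 18 <= m["age"] <= 65]
        if not eligible:
            return []
        # Hypothetical weighting: screened members at elevated risk get 3x the chance.
        weights = [3.0 if m.get("elevated_risk") else 1.0 for m in eligible]
        selected = []
        pool, pool_weights = list(eligible), list(weights)
        for _ in range(min(max_selected, len(eligible))):
            pick = rng.choices(pool, weights=pool_weights, k=1)[0]
            idx = pool.index(pick)
            pool.pop(idx)
            pool_weights.pop(idx)
            selected.append(pick)
        return selected

    household = [
        {"name": "A", "age": 34, "elevated_risk": True},
        {"name": "B", "age": 61, "elevated_risk": False},
        {"name": "C", "age": 17, "elevated_risk": False},  # not age-eligible
    ]
    print([m["name"] for m in select_adults(household)])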

A4. Efforts to Identify Duplication

The MDPS pilot program will provide an understanding of methods, measurements, and outcomes that is not duplicated by any current surveillance effort. Historical studies, however, are vital to the development of the MDPS pilot methods. Particularly relevant is the Epidemiologic Catchment Area (ECA) study, one of the earliest epidemiological studies of mental illness, which was conducted between 1980 and 1985 (with additional follow-up waves for some areas) (DHHS, 1994; Regier et al., 1990). This study, funded by the National Institute of Mental Health (NIMH), collected information on the prevalence of specific mental illnesses and treatment use among individuals living in household and non-household settings within five catchment areas across the United States. Catchment areas were based on the participating academic institutions (Yale University, Johns Hopkins University, Washington University, Duke University, and the University of California at Los Angeles). Each site surveyed at least 3,000 household respondents and 500 institutional respondents (DHHS, 1994). The ECA was a groundbreaking study that provided the nation with large-area estimates of numerous mental illnesses and of treatment use, but its data are now quite outdated. The MDPS pilot program will build upon the ECA method by using a national probability sample rather than a catchment area sample, and it will include convenience non-probability samples of individuals residing in homeless shelters, state psychiatric hospitals, and jails.

The direct successor to the ECA was the National Institute of Mental Health-sponsored National Comorbidity Survey (NCS), conducted from 1990 to 1992, a nationally representative study of mental illness in the United States that included one follow-up wave (NCS-2). The NCS used a probability sample of over 8,000 household respondents to generate estimates of mental illness in the United States. Unlike the ECA, the NCS did not include a non-household component (with the exception of college students residing in campus housing). Similarly, the 2001–2003 NCS Replication (NCS-R), part of the Collaborative Psychiatric Epidemiology Surveys, which all followed the same design structure but surveyed different racial/ethnic groups, included only household residents (n = 9,282). Neither study had a sufficient sample size to evaluate psychotic disorders (Kessler et al., 2004; ICPSR, 2019; National Comorbidity Study, 2005). Similar limitations are found for the 2001–2002 National Epidemiologic Survey on Alcohol and Related Conditions (NESARC), a two-wave, nationally representative, longitudinal study sponsored by the National Institute on Alcohol Abuse and Alcoholism (NIAAA) that included an assessment of specific mental illnesses among approximately 43,000 household residents (Hasin & Grant, 2015). Despite a considerably larger sample size than any of its predecessors, the NESARC was still limited to household populations, and psychosis was not assessed, thereby excluding one of the rarer but most impairing mental illnesses from its estimates. In 2012–2013, a new data collection, NESARC-III, was conducted. However, like its predecessors, this study was limited in scope to those living in household settings or some non-institutionalized group settings such as college dormitories (NIAAA, NESARC-III).

In addition to these periodic studies, ongoing surveillance efforts are aimed at measuring mental illness in the United States. The National Survey on Drug Use and Health (NSDUH) is an ongoing, annual data collection effort of over 68,500 household respondents, sponsored by SAMHSA. NSDUH collects household-based estimates of past-year major depressive episode (MDE) and MDE with severe impairment and generates probabilities of any mental illness and serious mental illness in the past year, based on screening scales for mental distress, functional impairment, and suicidality. However, NSDUH does not measure other specific mental illnesses. The MDPS pilot program differs from NSDUH in that it includes a clinical assessment of specific mental illnesses, including psychotic disorders, and it includes non-household and institutionalized populations (a national probability sample of prisons and non-probability samples of homeless shelters, state psychiatric hospitals, and jails).





A4.1 Non-household Studies

Criminal Justice Populations

The “criminalization of mental illness” is widely documented and recognized as a serious problem. This phrase refers to circumstances wherein individuals with mental illness are incarcerated, usually for minor crimes related to their disorder, instead of receiving mental health treatment (Lamb et al., 1998). It is well documented that prisons and jails have higher rates of mental illness than the general population and have even been referred to as America’s new mental health hospitals (Torrey, 1995). For example, a 2009 study of jail inmates found that the prevalence of SMI (defined as psychosis, major depressive disorder or depression not otherwise specified, or bipolar disorder) was about 17.1 percent among male inmates and 34.3 percent among female inmates (Steadman et al., 2009). A 2014 meta-analysis of studies of state prisons in the United States found that estimates of SMI (defined as psychosis, major depressive disorder, or bipolar disorder) ranged from 6 to 14 percent. A more systematic study has not been conducted since the Survey of Inmates in Local Jails and the Survey of Inmates in State and Federal Correctional Facilities in the early 2000s. Moreover, the methods of disorder assessment in these studies are rarely comparable with those used in household populations, thereby precluding combination of these data with household estimates.

Homeless Populations

The Department of Housing and Urban Development (2017) provides annual estimates of the number of homeless individuals. However, the last national assessment of the health of the homeless was the National Survey of Homeless Assistance Providers and Clients in 1996. Estimates of serious mental illness are challenging to obtain in this population. One systematic review of studies of schizophrenia prevalence among the homeless found a pooled prevalence rate of 11 percent, but the range was broad (from 4 to 16 percent, depending on the study) (Folsom & Jeste, 2002).

Hospitalized Populations


Some information on diagnosed mental illnesses is captured in pre-existing data collection efforts covering select non-household populations, but these efforts are too limited in scope. For example, the Nationwide Inpatient Sample provides estimates of the number of individuals with a diagnosis of mental illness who have been admitted to a hospital (Healthcare Cost and Utilization Project, 2018). Medicare data enable assessment of mental illness diagnoses in the elderly, and these data have been used by researchers to study the prevalence of diagnosed schizophrenia and bipolar disorder among Medicare enrollees living in long-term care facilities (Brown University, 2019). However, these data collection and analysis efforts did not use standardized diagnostic assessment methods, thereby precluding their combination with household sample estimates to generate area-wide estimates.

A5. Involvement of Small Entities

This data collection will have no significant impact on small entities.



A6. Consequences If Information Is Collected Less Frequently

The MDPS is a one-time data collection effort.

A7. Consistency with the Guidelines in 5 CFR 1320.5(d)(2)

This information collection fully complies with 5 CFR 1320.5(d)(2).

A8. Consultation Outside the Agency

A8.1 Federal Register Notice and Comments

The notice required in 5 CFR Part 1320.8(d) was published in the Federal Register on January 19, 2022, 87 FR 2885. No comments were received.

A8.2 Consultation with Experts Outside of the Study

In 2018, SAMHSA supported an MDPS design contract in which a panel of experts with diverse expertise provided input and recommendations on all aspects of the study design. The panel included patient advocates, researchers, and policy makers as shown in Exhibit 2 below. These individuals reviewed the data collection instruments to ensure they were written with plain, coherent, and unambiguous text that would be understandable to potential respondents.

Exhibit 2. 2018 MDPS Design Contract Expert Panel Members

Name (Affiliation) | Contact Information
Elizabeth Sinclair (Treatment Advocacy Center)* | Phone: 802-522-9496; Email: [email protected]
E. Fuller Torrey (Treatment Advocacy Center) | Phone: 301-571-0760; Email: [email protected]
John Snook (Treatment Advocacy Center) | Phone: 703-294-6001; Email: [email protected]
Lisa Dixon (Columbia University)* | Phone: 646-774-8420; Email: [email protected]
Mark Olfson (Columbia University)* | Phone: 646-774-6413; Email: [email protected]
Brian Hepburn (National Association of State Mental Health Program Directors) | Phone: 703-739-9333; Email: [email protected]
*Denotes 2018 expert panel members who now work as part of the current MDPS project team.

During project years 1 and 2 of the MDPS cooperative agreement, the project team sought input from 3 expert consultants, listed in Exhibit 3. These consultants provided advice about study design and data collection strategies.

Exhibit 3. MDPS Cooperative Agreement Consultants

Name (Affiliation) | Contact Information
Evelyn Bromet (Stony Brook University) | Phone: 631-638-1920; Email: [email protected]
Ron Manderscheid (National Association of County Behavioral Health and Developmental Disability Directors) | Phone: 202-942-4296; Email: [email protected]
Alaina Boyer (National Health Care for the Homeless Council) | Phone: 615-226-2292; Email: [email protected]

A9. Payments to Participants

Survey response rates have been declining in recent years. This decline has raised concerns, in large part because lower response rates are associated with increased potential for nonresponse bias. To maximize response rates and reduce the risk of nonresponse bias affecting key estimates, many surveys offer cash incentives to encourage participation. Offering incentives to sample members has been shown to be a cost-effective means of reducing nonresponse. Several theories explain why incentives are effective: respondents may interpret them as a token of appreciation (social exchange theory; Dillman, 2000) or as compensation for their time and effort (economic exchange theory; Biner & Kidd, 1994), or incentives may change the subjective weight a person puts on various factors when the survey request is made (leverage-salience theory; Groves et al., 2000, 2004). Experiments in survey nonresponse have demonstrated that those less interested and less involved in the topic of the survey are more likely to be nonrespondents, and incentives disproportionately increase participation among these people, consistent with leverage-salience theory (Groves, Singer, & Corning, 2000; Groves, Presser, & Dipko, 2004; Groves et al., 2006).

Lower response rates at different stages of the study, due to a lack of incentives or incentive amounts that are too small, can also require making the survey request to a larger number of households and individuals. Insufficient motivation at each stage (roster, mental health screening interview, and clinical interview) of this study, which is sized around the number of completed clinical interviews, would also lead to unnecessary respondent burden for those who complete rosters or mental health screening surveys but do not complete the clinical interview.

The onset of the COVID-19 pandemic necessitated substantially greater reliance on modes other than in-person data collection than originally planned. In particular, substantial reliance on web and paper self-administration was required, because in-person data collection was not an option initially. These self-administered modes limit efforts to gain participation from sample members. Incentives have been shown to be instrumental in increasing cooperation and are one of the key components of the survey design for reducing the risk of substantial nonresponse bias. Based on careful review of incentive amounts offered on comparable federal government-sponsored surveys that offer cash incentives for interviews, the MDPS pilot program offers a fixed incentive amount for the household and non-incarcerated institutional clinical interviews. In the household sample, a monetary token of appreciation will be provided to motivate response and decrease nonresponse bias at each stage of household recruitment (rostering and mental health screening) and at the clinical interview stage.

Many state Departments of Corrections (DOCs) do not allow inmates to receive monetary incentives for participating in survey research, and the federal Bureau of Prisons does not allow such payments either. However, these jurisdictions have been willing to offer a non-monetary incentive, such as a food item, a stamped envelope, or a phone card, which can be meaningful to inmates and can aid in increasing participation rates. A primary concern that jurisdictions have regarding any type of inmate incentive is whether the incentive may create increased security issues in the facility. For that reason, incentives that can be used as “currency” within the facility are undesirable. Food items that can be eaten during the interview, or items that can be obtained from the facility’s commissary at the time they are ready to be used, are preferable. The National Inmate Survey, which has conducted over 100,000 interviews in federal and state prisons and local jails, has offered a single-serving bag of cookies to inmates who participate in a 35-minute survey on the sensitive topic of sexual victimization during incarceration. The researchers reported a 10% increase in response rates at facilities where the incentive was offered (Caspar et al., 2012). The MDPS will offer a small non-monetary incentive to inmates, if allowed by the facility where they are housed.

Exhibit 4 shows the incentive amounts offered for each interview type.

Exhibit 4. MDPS Incentive Amounts

Interview Type | Duration | Token of Appreciation
Household Roster Pre-Incentive | 10 min | $2
Household Roster Interview | 8 min | $10
Screening Interview | 15 min | $20
In-Person Interview for Adults in Households | 83 min | $30
In-Person Interview for Adults in Non-Incarcerated, Non-household, Institutionalized Settings | 82 min | $30
In-Person Interview for Adult Inmates | 68 min | Non-Monetary Item


A9.1 Household Roster and Mental Health Screening Interviews

Based on prior research, it is clear that offering some incentive rather than no incentive has a beneficial impact on response and retention rates (Kulka et al., 2005; Singer & Ye, 2003; Trussell & Lavrakas, 2004). There is also evidence that interview incentives reduce the cost per completed survey by reducing the level of effort required to obtain the completed interview (Beebe et al., 2005; Kennett et al., 2005). The question of incentive amounts for screening respondents for eligibility and household enumeration is not well documented in the literature. There is precedent for a $5 screening incentive, based on initial findings from the National Survey of Family Growth (NSFG), which implemented a two-stage data collection design and incorporated a $5 prepaid incentive for screening respondents during Phase 2 (the last three weeks of data collection each quarter) (Groves et al., 2009); the $5 screening incentive is now offered to all Phase 2 screening respondents in the NSFG. The purpose of the NSFG screener is analogous to that of the MDPS roster: household enumeration, eligibility ascertainment, and selection. However, the MDPS roster administration time is longer than the NSFG screening administration time, and the MDPS roster also relies on self-administration. Upon selection, MDPS includes up to two surveys (a mental health screening and a clinical interview) for selected adults, instead of one interview in the NSFG. These three differences, along with other respondent burdens in MDPS, make it more difficult to gain participation in MDPS and support the need for a larger MDPS roster incentive.

Similarly, sample members who completed a short screener for the Food and Drug Administration’s Research and Evaluation Survey for the Public Education Campaign on Tobacco among LGBT (RESPECT) were given $10 upon completion of the screening. The $10 value was selected because it was the lowest value deemed to be sufficiently attractive to the sample population who were not necessarily predisposed to participate in research. The researchers proposed that the screening incentive would increase participants’ engagement in the study, thereby resulting in higher data validity, and they also believed it would increase response to follow-up surveys. Although the incentive was not implemented with a randomized experimental design, the project team reported that the $10 incentive facilitated data collection by getting and keeping the attention of the individuals they were trying to screen, even in a distraction-rich environment.

Several other studies have also offered incentives for screening interviews. One example is the National Household Education Surveys (NHES) Program 2011 Field Test, which offered people a $2 or $5 advance cash incentive for participating in a screening survey (Han et al., 2012). The response rate for the $5 condition was significantly higher than for the $2 condition, and the higher incentive saved on costs associated with nonresponse follow-up mailings. As a result, $5 has been used as a screening incentive in subsequent NHES surveys. Similarly, the National Household Food Acquisition and Purchase Survey (FoodAPS) offered a $5 prepaid incentive to households contacted for screening (Kirlin & Denbaly, 2013), and the National Adult Training and Education Survey (ATES) Pilot Study offered a $2 incentive for the screener (Bielick et al., 2013). The NHES screening time is comparable to the MDPS household roster administration time, while the MDPS screening administration time is longer than the NHES screening administration time.

Based on our review of prior screening study efforts, the respondent burden associated with the MDPS three-stage design, and the reliance on self-administration caused by the onset of COVID-19 for a survey originally designed to collect approximately 80% of the data in person with commensurate expected response rates, we provide a $2 prepaid incentive with the household roster invitation, $10 for completing the household roster, and $20 for completing the screening interview. The incremental payments are provided to address nonresponse in the multi-stage design and to provide a token of appreciation for participation at each stage. Additionally, because the MDPS is a pilot program conducted to provide information regarding the feasibility of conducting an in-depth mental health clinical interview with specific study populations, participation rates at each stage and for each study population will be informative for future studies. The prepaid roster incentive is paid in cash. The roster and screening completion incentives are paid by electronic gift card, by check, or in cash if completed in person, based on the respondent’s preference.

A9.2 Clinical Interviews

The MDPS clinical interview can be completed virtually, by video call with a clinical interviewer or by phone, or in person. To date, all clinical interviews have been completed either by video (77%) or by phone (23%). The clinical interview incentive amount was prescribed by SAMHSA in the original funding opportunity announcement (FOA) as $30. Incentives are used to encourage participation and convey appreciation for respondents’ contributions to the research. Studies designed to assess the prevalence of select mental disorders often include clinical interviews that are 60 minutes or longer. The use of incentives for interview participation can significantly increase participation rates and reduce nonresponse (e.g., Singer, 2002; Singer & Ye, 2013). A common argument against the use of incentives is their cost. Yet incentives can reduce the cost per case, because field staff devote less time to follow-up and activity prompts (Kennet et al., 2005).

The MDPS incentive amount is lower than that used in similar studies. For example, the 2001–2003 National Comorbidity Survey Replication (NCS-R) offered adult respondents $50 for completing a 90-minute CAPI interview (Kessler et al., 2004). The National Survey of Child and Adolescent Well-Being (NSCAW) (OMB: 0970-0202) offers over 8,000 adult respondents $50 to complete a 90-minute in-person interview across two waves of data collection. The MDPS initially provided the $30 clinical interview incentive in the form of an electronic gift card promised once a respondent completed the household screening interview and agreed to complete the clinical interview. More recently, the $30 clinical interview incentive has been provided after completion of the clinical interview, because of the often-substantial delay between completion of the mental health screening interview and scheduling of the clinical interview.

A9.3 Facilities

Non-household facilities, with the exception of prisons, will be offered $250 as a token of appreciation for their participation in data collection activities, which include providing facility rosters and helping to coordinate and schedule times to conduct resident clinical interviews. The tokens of appreciation that will be offered are based on the prior experience of RTI, the MDPS data collection contractor, in coordinating data collection efforts with facilities and institutions. For example, NSCAW offers $200 tokens of appreciation to participating child welfare agencies. These agencies have used this token of appreciation to buy transit cards for their clients or toys and children’s books for their waiting areas, host small social events for agency personnel, or make charitable donations to organizations serving children in their area. The tokens of appreciation offered to non-household facilities are intended to serve as formal acknowledgements of the time and effort that participating agencies contribute to the study. Through this token of appreciation, facilities will receive a symbolic thank you from the project team at the end of their study participation.

Tokens of appreciation are used to encourage participation and convey appreciation for respondents’ and facilities’ contributions to the research. The study will not rely exclusively on tokens of appreciation to minimize nonresponse. Other data collection and analysis methods to minimize nonresponse are described in Supporting Statement B.

A10. Assurance of Confidentiality

SAMHSA has statutory authority to collect data under the Government Performance and Results Act (GPRA; Title 31, Section 1103(a)) and is subject to the Privacy Act for the protection of these data. Only aggregate data will be reported, thus protecting the privacy and confidentiality of survey participants. Information collected will be kept private to the extent permitted by law. The clinical interview consent statement provided to all participants includes the following:

  • The fact that the information collection is sponsored by SAMHSA, an agency of the U.S. Federal Government;

  • The purpose of the information collection and how the information will be used;

  • Assurance that the research team will protect the privacy of respondents to the fullest extent possible under the law; and

  • Assurance that respondents’ participation is voluntary and that they may withdraw their consent at any time without any negative consequence.

Under the revised Common Rule, the IRB waived the requirement for signed informed consent. A summary of key information is provided to the respondent verbally at the time of the clinical interview or in advance, and the respondent has the opportunity to read the full consent and have any questions answered. In addition to project-specific training about study procedures, members of the data collection team will receive training that includes general security and privacy procedures. All members of the data collection team will be knowledgeable about privacy procedures and will be prepared to describe them in detail or to answer any related questions raised by the respondents.

Prior to beginning data collection activities, approval was obtained from the Advarra Institutional Review Board, including its board for research involving prisoners. The Office for Human Research Protections (OHRP) approved the prison protocols and consent forms in advance of outreach to prisons. Additionally, approval is obtained from each facility’s human subjects review board.

A10.1  Data Security

The MDPS team will utilize its corporate administrative and security systems to prevent the unauthorized release of personally identifiable information (PII), including encryption hardware and software that meet federal standards and physical security that includes a keyless, card-controlled access system on all buildings, local desktop security, and Microsoft Windows account lockout.

The MDPS contractor, RTI, will carry out the following activities to enhance data security at all phases of data collection:

  • Field staff laptops will be password-protected and full-disk encrypted. Several levels of password-protected access are required to view the files on the laptops. Failure to provide a password at any of these levels will result in access to the case data being denied.

  • Data will be transmitted and stored in such a way that only authorized members of the project team will have access to any identifying information, within a Federal Information Processing Standards (FIPS)-moderate compliant project share or Microsoft SQL databases. All project team members will be trained on data security procedures and will sign confidentiality agreements that provide for termination of employment, civil suit, and financial and other penalties in case of violation. Field laptops and data transmitted to and from them are encrypted with FIPS 140-2 compliant algorithms.

  • All personnel working on the survey must sign affidavits pledging that the data they collect or work with will not be disclosed. Penalties for disclosure include termination of employment and substantial financial fines.

  • Access to project file shares, systems, and data is strictly controlled by role-based security in the form of Windows Active Directory security groups. An individual’s security group membership is based on the minimum access necessary to perform their job function on the project and on need-to-know.

  • Household and non-household addresses will not be stored with the data and will be destroyed after all data processing activities are complete.

  • The MDPS mental health screening instrument will launch the Computerized Adaptive Test – Mental Health (CAT-MH), hosted by Adaptive Testing Technologies (ATT) in their IT infrastructure. For each case, a unique link ID will be used for the CAT-MH record, and the mapping between the case ID and this link ID will be maintained only by RTI. No PII is collected within the CAT-MH.

  • The MDPS clinical interview instrument launches the NetSCID, a computerized version of the Structured Clinical Interview for DSM-5, hosted by TeleSage in their IT infrastructure. For each case, a unique link ID will be used for the NetSCID record, and the mapping between the case ID and this link ID will be maintained only by RTI. No PII is collected within the NetSCID.

  • The MDPS clinical interview may be recorded using Zoom video or audio calls. These calls are password protected, using a unique password for each call. The video or audio files are downloaded by the clinical interviewer to their full-disk encrypted laptop, uploaded to an intranet website through RTI’s VPN in the FIPS-moderate environment at RTI for review, and then deleted from the laptop. Periodically, these video or audio files are archived to a folder that is not accessible even from the intranet website. This website is not accessible from the internet and sits behind RTI’s firewall. The video and audio files will be destroyed upon completion of the study.

A10.2 Receipt of Roster Files from Non-Household Organizations

Rosters of individuals residing at non-household organizations will contain personally identifiable information (PII). Several data security procedures will be implemented to ensure protection of the PII contained in these files. The data will be uploaded to RTI through a password-protected, Secure Sockets Layer (SSL)-secured website.

Each non-household organization will have a unique login directing them to a web page for submitting their roster file. As the organization uploads a roster file, the file will be cached in memory, and then automatically moved using secure FTP to an internal data folder in the FIPS-moderate compliant network that is not directly accessible by the web browser.

The FIPS-moderate compliant network is isolated from the internet and accessed only via two-factor authentication (PIN plus token). Data files are protected through access restrictions granted on a need-only basis for project shares controlled by Active Directory security groups. These files will be destroyed after the sample weights have been created and verified, as required by any data use agreement negotiated with participating organizations.
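As a rough illustration of the transfer pattern described above (upload cached in memory, then relayed by secure FTP to an internal folder), the following hypothetical sketch uses the paramiko SSH/SFTP library; the host name, account, key path, and destination folder are placeholders, not RTI's actual configuration.

    # Hypothetical sketch of relaying an uploaded roster file from the web tier
    # to an internal folder over SFTP, as described in the text. Host names,
    # paths, and credentials are placeholders; this is not RTI's transfer code.

    import io
    import paramiko  # third-party SSH/SFTP library

    def relay_roster(uploaded_bytes: bytes, facility_id: str) -> None:
        transport = paramiko.Transport(("internal-sftp.example.org", 22))
        try:
            transport.connect(
                username="mdps_transfer",
                pkey=paramiko.RSAKey.from_private_key_file("/etc/keys/mdps_transfer"),
            )
            sftp = paramiko.SFTPClient.from_transport(transport)
            # Stream the in-memory file directly to the internal data folder;
            # nothing is written to disk on the web-facing server.
            sftp.putfo(io.BytesIO(uploaded_bytes), f"/secure/rosters/{facility_id}.csv")
            sftp.close()
        finally:
            transport.close()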

A11. Questions of a Sensitive Nature

The purpose of the MDPS pilot program data collection is to gather sensitive information regarding respondents’ mental health and substance use. Individuals will also be asked about traumatic events, suicidality, and criminal history. As part of the informed consent process, questions of a sensitive nature will be described, and potential respondents will be informed of the potential risks and harms of sharing this sensitive information. The consent process will be conducted at a time and in a location that are private and convenient for the respondent.

Only adults will be approached for data collection in this study; however, some adults without the capacity to consent may give permission for a proxy respondent to complete the interview on their behalf. We expect that this could happen particularly in the hospital setting. In these situations, the selected adult respondent will provide permission for the study team to contact their designated proxy respondent. At that time, the proxy respondent will be contacted to provide consent for participation in the clinical interview.

A12. Estimates of Annualized Hour Burden

The MDPS sample has been designed to yield approximately 7,200 completed clinical interviews in household populations and within non-household facilities. The household and prison samples have been selected nationally, whereas homeless shelters and state psychiatric hospitals will be identified from the MDPS pilot program partner sites. Individuals from non-household facilities will be selected for clinical interviews based on information provided by the facility; no screening will be needed. It will be necessary to roster approximately 45,000 households and complete approximately 45,000 household screenings to select enough respondents to participate in the clinical interview. This sample size is estimated to provide an adequate oversampling of individuals living in household settings at greatest risk of psychotic disorders.

The household roster takes an average of 8 minutes to complete, and the household screener takes an average of 15 minutes. The clinical interview length varies by population type: the household clinical interview averages 83 minutes, the prison clinical interview averages 68 minutes, and the hospital and shelter administrations average 82 minutes. The prison clinical interview is shorter for two primary reasons: prisoners are often limited to 90 minutes of release time to participate in the research, and the instrument has been adapted to remove questions on current alcohol and drug use. The current MDPS pilot program average clinical interview administration times are consistent with our past experience administering a similar number of modules from the Structured Clinical Interview for DSM-5 (SCID-5) within the National Mental Health Study field test and the NSDUH National Mental Health and Substance Use Surveillance Study.

The data collection field period for the MDPS is 15 months. The annualized estimated household and non-household respondent burden for the MDPS is shown in Exhibit 5.

Exhibit 5. Annualized Estimated Respondent Burden for MDPS

Instrument | No. of respondents | Responses per respondent | Total number of responses | Hours per response | Total burden hours | Hourly wage rate | Total hour cost
Household Rostering | 45,000 | 1 | 45,000 | 0.13 | 5,850 | $19.83 | $116,006
Household contact attempts* | 45,000 | 1 | 45,000 | 0.17 | 7,650 | $19.83 | $151,700
Household Screening | 45,000 | 1 | 45,000 | 0.25 | 11,250 | $19.83 | $223,088
Screening contact attempts* | 45,000 | 1 | 45,000 | 0.17 | 7,650 | $19.83 | $151,700
Clinical Interview (household and non-household) | 7,200 | 1 | 7,200 | 1.40 | 10,080 | $19.83 | $199,886
Clinical Interview contact attempts* | 7,200 | 1 | 7,200 | 0.25 | 1,800 | $19.83 | $35,694
Jail Screening Interview | 208 | 1 | 208 | 0.33 | 69 | $19.83 | $1,361
Total Annual Estimates | | | 194,608 | | 44,349 | | $879,435

*Contact attempts include the time spent reviewing all follow-up letters and study materials, including the respondent website, interactions with field and telephone interviewers, the consent process including asking questions regarding rights as a participant and receiving responses, and all other exchanges during the recruitment and interviewing process.

To compute the total estimated annual cost, the total burden hours were multiplied by the average hourly wage for each adult participant, based on the Bureau of Labor Statistics (BLS) table “Median usual weekly earnings of full-time wage and salary workers by educational attainment” (bls.gov). We used the median earnings in the second quarter of 2021 for full-time workers over the age of 25 who are high school graduates with no college experience ($19.83 per hour).
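The arithmetic behind Exhibit 5 can be reproduced directly from the figures shown there (respondents multiplied by hours per response, and burden hours multiplied by $19.83 per hour), as in the short script below; rounding follows the exhibit's row-level rounding.

    # Arithmetic behind Exhibit 5: row burden hours = respondents x hours per
    # response, and row cost = burden hours x the $19.83 hourly wage. Values are
    # taken from the exhibit; Decimal avoids floating-point drift.

    from decimal import Decimal, ROUND_HALF_UP

    WAGE = Decimal("19.83")
    rows = [
        ("Household Rostering",                 45_000, Decimal("0.13")),
        ("Household contact attempts",          45_000, Decimal("0.17")),
        ("Household Screening",                 45_000, Decimal("0.25")),
        ("Screening contact attempts",          45_000, Decimal("0.17")),
        ("Clinical Interview",                   7_200, Decimal("1.40")),
        ("Clinical Interview contact attempts",  7_200, Decimal("0.25")),
        ("Jail Screening Interview",               208, Decimal("0.33")),
    ]

    total_hours = Decimal(0)
    total_cost = Decimal(0)
    for name, n, hours_per_response in rows:
        burden = Decimal(n) * hours_per_response
        burden_rounded = burden.quantize(Decimal("1"), rounding=ROUND_HALF_UP)
        cost = (burden * WAGE).quantize(Decimal("1"), rounding=ROUND_HALF_UP)
        total_hours += burden
        total_cost += cost
        print(f"{name}: {burden_rounded:,} hours, ${cost:,}")

    total_hours_rounded = total_hours.quantize(Decimal("1"), rounding=ROUND_HALF_UP)
    print(f"Totals: {total_hours_rounded:,} hours, ${total_cost:,}")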

The MDPS non-household sample will be drawn from a national sample of approximately 36 prisons and local samples of state psychiatric hospitals (n=4) and homeless shelters (n=18). Each facility will have one point of contact for the MDPS who will be asked to provide the study team with a roster of facility residents. Each facility point of contact will also be asked to help schedule time to administer up to 50 clinical interviews. Facility recruitment and engagement is expected to occur over 12 months. The annualized estimated facility burden for the MDPS is shown in Exhibit 6.

Exhibit 6. Annualized Estimated Facility Burden for MDPS

Information to be Provided | Total Number of Respondents | Number of Responses Per Respondent | Total Number of Responses | Average Burden Hours Per Response | Annual Burden Hours | Average Hourly Wage* | Total Annual Cost
Information package review for facility administrators | 58 | 1 | 58 | 0.75 | 43.5 | $25.09 | $1,091
Initial call with facility staff | 58 | 1 | 58 | 1 | 58 | $25.09 | $1,455
Telephone call with facility staff to explain roster file process | 58 | 1 | 58 | 2 | 116 | $25.09 | $2,910
Facility staff provides roster | 58 | 4 | 232 | 2 | 464 | $25.09 | $11,642
Facility staff coordinates time and location for clinical interview administration | 58 | 4 | 232 | 2 | 464 | $25.09 | $11,642
Total Annual Estimates | | | 638 | | 1,145.5 | | $28,740


* Assumes an average hourly rate of $25.09 for Community and Social Service Managers in the Bureau of Labor Statistics' Occupational Employment Statistics, May 2020.


Final Burden

| | Total number of responses | Total burden hours |
|---|---|---|
| Respondent Burden | 194,608 | 44,349 |
| Facility Burden | 638 | 1,146 |
| Total Annual Estimates: | 195,246 | 45,495 |

A13. Estimates of Annualized Cost Burden to Respondents

There are no capital, startup, operational, or maintenance costs to respondents.

A14. Estimates of Annualized Cost to the Government

Total costs associated with the MDPS are estimated to be $30,046,540 over a 48-month cooperative agreement performance period. Of the total, $30,000,000 is for cooperative agreement costs (e.g., sampling, data collection, data processing, analysis, and reporting), and approximately $46,540 represents SAMHSA costs to manage and administer the survey. The annualized cost is approximately $7,511,635.

A15. Burden Level

The participant burden averages approximately 2.4 hours for household respondents (including the roster, mental health screening, and clinical interview). Participant burden for non-household respondents averages approximately 1.65 hours. The research team has verified these estimates through our early data collection efforts.
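These averages can be reconstructed from the hours-per-response column of Exhibit 5 (a minimal arithmetic check, assuming the household figure includes the contact-attempt time shown in that exhibit):

```python
# Average burden per respondent, reconstructed from the Exhibit 5
# hours-per-response column (contact attempts included).
household = 0.13 + 0.17 + 0.25 + 0.17 + 1.40 + 0.25   # roster, screener, clinical interview + contacts
non_household = 1.40 + 0.25                            # clinical interview + contact attempts
print(round(household, 2), round(non_household, 2))    # 2.37 (~2.4 hours) and 1.65 hours
```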

A16. Time Schedule, Analysis, and Publication Plans

Timeline

The MDPS pilot program began in September 2019 and is scheduled to be completed by September 2023. The first 12 project months were devoted to design development, instrument programming, and data collection materials preparation. Data collection will occur for approximately 24 months. This period includes household rostering, household screening, and clinical interviews conducted within the household and non-household settings. The final year will include data processing, analysis and delivery of data files.

| Activity | Date |
|---|---|
| MDPS Cooperative Agreement Award | 09/2019 |
| Survey Development and Materials Preparation | 10/2019 – 09/2020 |
| Data Collection | 10/2020 – 09/2022 |
| Analysis and Reporting | 10/2022 – 09/2023 |



Analysis Plan

Primary Statistical Analysis

Analyses will focus on the two research topics that shaped the MDPS pilot program study:

  • Past year prevalence estimates of psychotic disorders (primary) and bipolar I disorder, major depressive disorder, posttraumatic stress disorder, alcohol use disorder, marijuana use disorder, stimulant use disorder, and opioid use disorder among U.S. adults; and

  • Prevalence of receipt of past year treatment for mental disorders among adults with each of the assessed mental disorders (psychotic disorder being of primary interest)

The MDPS pilot program is powered to produce national prevalence estimates encompassing four populations (household, incarcerated, homeless, and institutionalized), as well as national estimates for the household population alone and for the incarcerated population alone.

Addressing the research questions at the national level requires the creation of analysis weights that account for the sampling strategy (where appropriate) and mitigate, to the extent possible, potential biases (e.g., nonresponse bias).


Weighting

Independent analysis weights will be constructed for each of the four study populations—household, incarcerated, homeless, and institutionalized—before applying adjustments to weights with the combined data for the national prevalence estimates. Plans for each population are summarized below.

Household population. Samples from the household population are selected through efficient random selection procedures to produce probability-based national estimates. The sample design (i.e., base) weight, or inclusion weight, for the household sample will reflect the stratified, multistage sampling design used for the study (an illustrative base-weight sketch follows the list of stages):

Stage 1: Probability proportional to size (PPS) selection of Primary Sampling Units (PSUs), defined as county or county group

Stage 2: PPS selection of Secondary Sampling Units (SSUs), defined as Census Block Group

Stage 3: Systematic random sample of residential addresses from an address-based sampling frame constructed for each randomly chosen SSU

Stage 4: Random selection of at most 2 adults within a household to complete the screener questionnaire

Stage 5: Random selection of screener respondents for the clinical interview with differential sampling rates based on screener results
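As an illustration only (a minimal sketch, not the MDPS production weighting code; the stage-level selection probabilities shown are hypothetical), the base weight for a completed case is the inverse of the product of its stage-level selection probabilities:

```python
from math import prod

def base_weight(stage_probs: list[float]) -> float:
    """Base (inclusion) weight: the inverse of the product of the selection
    probabilities at each stage (PSU, SSU, address, within-household person,
    clinical-interview subsample)."""
    return 1.0 / prod(stage_probs)

# Hypothetical selection probabilities for the five stages listed above.
print(round(base_weight([0.02, 0.05, 0.01, 0.5, 0.7])))  # ~285,714
```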

In addition, person-level weight adjustments will be applied at both response stages (screener and clinical interview) to address nonresponse and coverage bias, using a generalized exponential model (GEM) technique via the WTADJUST procedure in SUDAAN, which controls extreme weights (Research Triangle Institute, 2012). A simplified illustration of this type of adjustment is sketched below.
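For intuition only, the sketch below shows a basic weighting-class nonresponse adjustment that spreads the base weight of all sampled cases in a cell over that cell's respondents. It is not the GEM/WTADJUST procedure itself, and the inputs (base weights, a boolean response indicator, and weighting cells built from roster or screener covariates) are assumptions for the example rather than the MDPS specification.

```python
import numpy as np

def class_adjusted_weights(base_w: np.ndarray,
                           responded: np.ndarray,   # boolean response indicator
                           cells: np.ndarray) -> np.ndarray:
    """Weighting-class nonresponse adjustment: within each cell, spread the
    weight of all sampled cases over the cell's respondents."""
    adj = base_w.astype(float).copy()
    for c in np.unique(cells):
        in_cell = cells == c
        factor = base_w[in_cell].sum() / base_w[in_cell & responded].sum()
        adj[in_cell & responded] *= factor
    adj[~responded] = 0.0  # nonrespondents carry no weight after adjustment
    return adj
```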

Incarcerated population. The base weight for the incarcerated population reflects two stages of probability-based (i.e., random) selection:

Stage 1: Probability proportional to size (PPS) selection of prisons

Stage 2: Random selection of adults from the sampled prison to complete the clinical interview only

As with the household sample, the base weights will be adjusted for nonresponse and coverage to limit the associated biases in the estimates using GEM.

Homeless and Institutionalized Populations. Unlike the household and incarcerated populations, the homeless and institutionalized populations are reached through convenience samples of facilities; within each participating homeless shelter and state psychiatric hospital, a random sample of persons is identified for the clinical interview only.

Owing to the convenience sampling procedures, propensity scores will be investigated for use as pseudo-inclusion weights (Valliant et al., 2018). These weights will then be adjusted to available population distributions to address selection bias. A rough sketch of the propensity-score approach follows.
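The general idea of propensity-based pseudo-inclusion weights can be sketched as follows (a minimal illustration in the spirit of Valliant et al. (2018), not the MDPS specification; the covariate matrix, the membership indicator, and the use of scikit-learn are all assumptions made for the example):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pseudo_inclusion_weights(X: np.ndarray, in_convenience: np.ndarray) -> np.ndarray:
    """Fit a propensity model that distinguishes convenience-sample cases (1)
    from reference-sample cases (0) using shared covariates X, then invert the
    estimated propensity for the convenience cases to form pseudo-weights."""
    model = LogisticRegression(max_iter=1000).fit(X, in_convenience)
    propensity = model.predict_proba(X)[:, 1]
    weights = 1.0 / propensity[in_convenience == 1]
    # These pseudo-weights would then be calibrated to available population
    # distributions, as described above.
    return weights
```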



Nonresponse Bias Analyses

The goal of weighting is to produce accurate population estimates for policy and other statistical evaluations. Consequently, weight adjustments are intended to minimize bias and maximize precision for the key set of estimates. In addition to an extensive list of quality checks (e.g., outlier detection), the weights will be evaluated, where possible, to determine whether detectable levels of nonresponse bias remain in the estimates.

Statistical tests for nonresponse bias, such as t-tests and R-indicators (Schouten, Cobben, & Bethlehem, 2009; Shlomo, Skinner, & Schouten, 2012), rely on information available for the full sample (i.e., both respondents and nonrespondents), whether from the sampling frame or from other supplied sources. For the household estimates, information from the roster is used to test for bias in the screener responses; the screener responses, in turn, are used to evaluate nonresponse bias in the clinical interview. For the prison and state psychiatric hospital estimates, administrative records are key to this evaluation, and paradata may prove useful for the analysis of the homeless population estimates. Meaningfully high levels of bias in the estimates may suggest a need for further evaluation of the weighting methodology, such as revisiting the covariates used in the GEM models.
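To make the R-indicator concrete (a minimal sketch based on the definition in Schouten, Cobben, & Bethlehem, 2009; the estimated response propensities would come from a model fit on frame, roster, or administrative variables, which is assumed here rather than shown):

```python
import numpy as np

def r_indicator(propensities: np.ndarray, weights: np.ndarray | None = None) -> float:
    """R-indicator R(rho) = 1 - 2*S(rho), where S(rho) is the (weighted) standard
    deviation of estimated response propensities over the full sample; values
    near 1 indicate more representative response."""
    w = np.ones_like(propensities) if weights is None else weights
    mean = np.average(propensities, weights=w)
    variance = np.average((propensities - mean) ** 2, weights=w)
    return 1.0 - 2.0 * float(np.sqrt(variance))

# Hypothetical propensities clustered near their mean yield an R-indicator close to 1.
print(round(r_indicator(np.array([0.62, 0.58, 0.65, 0.60])), 3))  # ~0.948
```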



Publication Plan

Methodological Reports

A final report will be produced by the end of the period of performance (September 2023). This report will describe the final study methods, analytic methods, operational lessons learned, and findings in response to the key research questions. The report will be made available on the SAMHSA website.

Public Use File

A deidentified, public use data file and corresponding codebook will be produced and made publicly available through a to-be-determined archive. The expected availability of the public use file is the last quarter of 2023.

A17. Display of Expiration Date

The expiration date will be displayed.

A18. Exceptions to Certification Statement

No exceptions are required.

References

Beebe, T. J., Davern, M. E., McAlpine, D. D., Call, K. T., & Rockwood, T. H. (2005). Increasing response rates in a survey of Medicaid enrollees: The effect of a prepaid monetary incentive and mixed modes (mail and telephone). Medical Care, 43(4), 411-414.

Bielick, S., Cronen, S., Stone, C., Montaquila, J., & Roth, S. (2013). The Adult Training and Education Survey (ATES) pilot study: Technical report (NCES 2013-190). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubsearch

Biner, P. M., & Kidd, H. J. (1994). The interactive effects of monetary incentive justification and questionnaire length on mail survey response rates. Psychology and Marketing, 11, 483-492.

Brown University. (2019). LTC focus: Create custom reports on long-term care. Retrieved from http://ltcfocus.org/

Caspar, R., Berzofsky, M., & Krebs, C. (2012). The impact of a person level incentive on establishment level response and prevalence rates within correctional facilities. Paper presented at the 4th International Conference on Establishment Surveys, Montreal, Canada.

Department of Health and Human Services. (1994). National Institute of Mental Health. Epidemiologic Catchment Area Study, 1980-1985. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor]. Retrieved from https://doi.org/10.3886/ICPSR06153.v1 https://www.icpsr.umich.edu/icpsrweb/ICPSR/studies/6153

Department of Housing and Urban Development. (2017). The 2017 Annual Homeless Assessment Report (AHAR) to Congress. Retrieved from https://www.hudexchange.info/resources/documents/2017-AHAR-Part-1.pdf

Dillman, D. A. (2000). Mail and internet surveys: The tailored design method (2nd ed.). New York: John Wiley Co.

Folsom, D., & Jeste, D. V. (2002). Schizophrenia in homeless persons: A systematic review of the literature. Acta Psychiatrica Scandinavica, 105(6), 404-413.

Food and Drug Administration. (n.d.). Research and Evaluation Survey for the Public Education Campaign on Tobacco among LGBT (RESPECT): Supporting statement. Retrieved from http://www.reginfo.gov/public/do/DownloadDocument?objectID=65626801

Groves, R. M., Couper, M. P., Presser, S., Singer, E., Tourangeau, R., Acosta, G. P., & Nelson, L. (2006). Experiments in Producing Nonresponse Bias. Public Opinion Quarterly, 70(5), 720-736.

Groves, R. M., Mosher, W. D., Lepkowski, J., & Kirgis, N. G. (2009). Planning and development of the continuous National Survey of Family Growth. National Center for Health Statistics. Vital Health Statistics, 1(48).

Groves, R. M., Presser, S., & Dipko, S. (2004). The role of topic interest in survey participation decisions. Public Opinion Quarterly, 68(1), 2-31.

Groves, R. M., Singer, E., & Corning, A. D. (2000). A leverage-saliency theory of survey participation: Description and illustration. Public Opinion Quarterly, 64, 299–308.

Han, D., Montaquila, J. M., & Brick, J. M. (2012). An evaluation of incentive experiments in a two-phase address-based mail survey. In Proceedings of the Survey Research Methods Section of the American Statistical Association (pp. 3765–3778).

Hasin, D. S., & Grant, B. F. (2015). The National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) waves 1 and 2: Review and summary of findings. Social Psychiatry and Psychiatric Epidemiology, 50, 1609-1640.

ICPSR. (2019). National Comorbidity Survey (NCS) series. Retrieved from https://www.icpsr.umich.edu/icpsrweb/ICPSR/series/527

Kennet, J., Gfroerer, J., Bowman, K. R., Martin, P. C., & Cunningham, D. B. (2005). Introduction of an incentive and its effects on response rates and costs in NSDUH. In Kennet, J., & Gfroerer, J. (Eds.), Evaluating and improving methods used in the National Survey on Drug Abuse (DHHS Publication No. SMA 05-4044, Methodology Series M-5). Rockville MD: Substance Abuse and Mental Health Services Administration, Office of Applied Studies.

Kessler, R. C., Berglund, P., Chiu, W.T., Demler, O., Heeringa, S., Hiripi, E…Zheng, H. (2004). The US National Comorbidity Survey Replication (NCS‐R): Design and field procedures. International Journal of Methods in Psychiatric Research, 13(2), 69-92.

Kessler, R. C., Angermeyer, M., Anthony, J. C., De Graff, R., Demyttenaere, K., Gasquet, I…Bedirhan, E.T. (2007). Lifetime prevalence and age-of-onset distributions of mental disorders in the World Health Organization’s World Mental Health Survey Initiative. World Psychiatry, 6(3), 168-176.

Kirkbride, J. B., Fearon, P., Morgan, C., Dazzan, P., Morgan, K., Tarrant, J…Mallett, R. M. (2006). Heterogeneity in incidence rates of schizophrenia and other psychotic syndromes: Findings from the 3-center AeSOP study. Archives of General Psychiatry, 63(3), 250-258.

Kirlin, J. A., & Denbaly, M. (2013). FoodAPS National Household Food Acquisition and Purchase Survey. US Department of Agriculture, Economic Research Service.

Kulka, R. A., Eyerman, J., & McNeeley, M. E. (2005). The use of monetary incentives in federal surveys on substance use and abuse. Journal of Economic and Social Measurement, 30(2-3), 233-249.

Lamb, H. R., & Weinberger, L. E. (1998). Persons with severe mental illness in jails and prisons: A review. Psychiatric Services, 49(4), 483-492.

Lamb, H. R., Weinberger, L. E., & Gross, B. H. (2004). Mentally ill persons in the criminal justice system: Some perspectives. Psychiatric Quarterly, 75(2), 107-126.

National Comorbidity Study. (2005). About the National Comorbidity Study (NCS) family. Retrieved from https://www.hcp.med.harvard.edu/ncs/

NIAAA. (n.d.). National Epidemiologic Survey on Alcohol and Related Conditions-III (NESARC-III). Retrieved from https://www.niaaa.nih.gov/research/nesarc-iii

Regier, D. A., Farmer, M. E., Rae, D. S., Locke, B. Z., Keith, S. J., Judd, L. L., & Goodwin, F. K. (1990). Comorbidity of mental disorders with alcohol and other drug abuse: Results from the Epidemiologic Catchment Area (ECA) Study. JAMA, 264, 2511-2518.

Research Triangle Institute (2012). SUDAAN Language Manual, Volumes 1 and 2, Release 11. Research Triangle Park, NC: Research Triangle Institute.

Substance Abuse and Mental Health Services Administration. (2020). Key substance use and mental health indicators in the United States: Results from the 2019 National Survey on Drug Use and Health (HHS Publication No. PEP20-07-01-001, NSDUH Series H-55). Rockville, MD: Center for Behavioral Health Statistics and Quality, Substance Abuse and Mental Health Services Administration. Retrieved from https://www.samhsa.gov/data/

Schouten, B., Cobben, F., & Bethlehem, J. (2009). Indicators for the representativeness of survey response. Survey Methodology, 35(1), 101–113.

Shlomo, N., Skinner, C., & Schouten, B. (2012). Estimation of an indicator of the representativeness of survey response. Journal of Statistical Planning and Inference, 142(1), 201–211.

Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In Groves, R. B., Dillman, D. A., Eltinge, J. L, & Little, R. J. A. (Eds), Survey nonresponse. New York: Wiley.

Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. The Annals of the American Academy of Political and Social Science, 645, 112-141.

Srikanth, S., Nagaraja, A.V., & Ratnavalli, E. (2005). Neuropsychiatric symptoms in dementia-frequency, relationship to dementia severity and comparison in Alzheimer’s disease, vascular dementia and frontotemporal dementia. Journal of the Neurological Sciences, 236(1–2), 43-48. Retrieved from https://www.sciencedirect.com/science/article/pii/S0022510X05001784

Steadman, H. J., Osher, F. C., Robbins, P. C., Case, B., & Samuels, S. (2009). Prevalence of serious mental illness among jail inmates. Psychiatric Services, 60, 761-765.

Tasman, A. (2018). Lies, damn lies, and statistics. Psychiatric Times, 32(3). Retrieved from https://www.psychiatrictimes.com/schizophrenia/lies-damn-lies-and-statistics

Torrey, E. F. (1995). Jails and prisons—America’s new mental hospitals. American Journal of Public Health, 85(12), 1611-1613.

Trussell, N., & Lavrakas, P. J. (2004). The influence of incremental increases in token cash incentives on mail survey response: Is there an optimal amount? Public Opinion Quarterly, 68(3), 349-367.

Valliant, R., Dever, J., & Kreuter, F. (2018). Practical tools for designing and weighting survey samples. (2nd ed.) (Statistics for Social and Behavioral Sciences). Springer. https://doi.org/10.1007/978-3-319-93632-1





Attachments



Attachment A. Household Roster

Attachment B. Household Roster: PAPI Instrument

Attachment C. Household Screening Instrument

Attachment D. Household Screening Instrument: PAPI

Attachment E. Clinical Interview

Attachment F. Federal Register Notice (to be attached once received)

Attachment G. Informed Consent Forms

Attachment H. Household Respondent Materials

Attachment I. Non-household Facility Recruitment Materials

Attachment J. Non-household Respondent Materials




