Mental Health Block Grant Ten Percent Set Aside Evaluation

OMB: 0930-0376



THE SUBSTANCE ABUSE AND MENTAL HEALTH SERVICES ADMINISTRATION (SAMHSA) MENTAL HEALTH BLOCK GRANT TEN PERCENT SET ASIDE EVALUATION

SUPPORTING STATEMENT PART A

A. Justification

A.1 Circumstances of Information Collection

The Substance Abuse and Mental Health Services Administration (SAMHSA), Center for Mental Health Services (CMHS), is requesting approval from the Office of Management and Budget (OMB) for new data collection activities for the Mental Health Block Grant Ten Percent Set Aside Evaluation, comprising the following seven activities:


  • Site Survey

  • Agency Director/Administrator Interview

  • Coordinated Specialty Care (CSC) Staff Interview

  • Coordinated Specialty Care (CSC) Participant Interview

  • State Mental Health Authority Interview

  • Fidelity Interview

  • Possible Administrative Data Elements


This data collection is authorized under Section 520A of the Public Health Service Act (42 USC 290bb-32 – Priority Mental Health Needs of Regional and National Significance).


According to experts at the National Institute of Mental Health (NIMH), approximately 100,000 young people experience first episode psychosis (FEP) each year in the U.S.1 When young people are trying to establish autonomy as adults, psychotic disorders can set them on a trajectory of increasing disability, particularly if left untreated. Fortunately, effective treatments exist for addressing FEP and have recently been combined into a Coordinated Specialty Care (CSC) delivery package – an evidence-based, early treatment intervention for individuals with FEP. CSC is a team-based intervention for FEP that combines various well-established evidence-based treatments including assertive case management, individual or group psychotherapy, supported employment and education services, family education and support, and low doses of anti-psychotic medications. These services are also closely coordinated with primary health care (Azrin, Goldstein, & Heinssen, 2015). CSC aims to address the systemic gaps and barriers that young people with FEP face in accessing appropriate services.


In the Consolidated Appropriations Act, 2014 (P.L. 113-76), Congress required that states set aside five percent of their Mental Health Block Grant (MHBG) to support individuals with early serious mental illness. More recently, the Consolidated Appropriations Act, 2016 (P.L. 114-113) provided a $50 million increase for MHBG funding and increased the required set aside for early intervention services to 10 percent. The goal of the MHBG ten percent set aside is to help consolidate partnerships between the federal government and the states to support evidence-based programs (EBPs) that address the needs of individuals with early serious mental illness, including psychotic disorders.


SAMHSA and NIMH are supporting a three-year evaluation of programs implementing CSC using MHBG ten percent set-aside funding. The primary purpose of the evaluation is to study how the CSC delivery package is associated with outcomes for individuals with FEP who are receiving CSC services in these programs. We will design and implement fidelity and outcome assessments of selected CSC programs supported by ten percent set aside funding. The evaluation will include seven main data collection activities: (1) a Site Survey of all sites using MHBG ten percent set aside funding for CSC programming (not just those included in the study), (2) an Agency Director/Administrator Interview, (3) a Coordinated Specialty Care (CSC) Staff Interview, (4) a Coordinated Specialty Care (CSC) Participant Interview, (5) a State Mental Health Authority Interview, (6) a Fidelity Interview, and (7) client-level administrative data collected at baseline and every six months thereafter, up to 18 months total (see Table 1).


Table 1. Timing of data collection activities

Data collection activity | Maximum number of times | When
Site Survey | 1 | Year 1
Agency Director/Administrator Interview | 2 times during site visits to evaluation sites | Years 1, 2
Coordinated Specialty Care (CSC) Staff Interview | 2 times during site visits to evaluation sites | Years 1, 2
Coordinated Specialty Care (CSC) Participant Interview | 2 times during site visits to evaluation sites | Years 1, 2
State Mental Health Authority Interview | 1 | Year 1
Fidelity Interview | 2 | Years 1, 2
Possible Administrative Data Elements | 4 | Evaluation sites submit data at baseline and every 6 months thereafter through the evaluation period




A.2 Purpose and Use of Information

Up to 32 CSC sites across the nation will be recruited to participate in the evaluation. The Mental Health Block Grant Ten Percent Set Aside Evaluation will include the following seven data collection activities:


  • Site Survey: This is a one-time online survey with site directors of all 250 centers using MHBG ten percent set aside funding (not just those included in the evaluation). The survey focuses on how centers across the U.S. are providing services to individuals with First Episode Psychosis (FEP) in their communities.


  • Agency Director/Administrator Interview: This semi-structured interview will be conducted twice with Agency Director/Administrators at each of the 32 CSC sites in the evaluation about the successes and challenges involved in implementing the CSC program.


  • Coordinated Specialty Care (CSC) Staff Interview: This semi-structured interview will be conducted twice with CSC Staff at each of the 32 CSC sites in the evaluation about the successes and challenges involved in implementing the CSC program.



  • Coordinated Specialty Care (CSC) Participant Interview: This semi-structured interview will be conducted twice with participants involved in programs at the 32 CSC sites in the evaluation. The purpose of the interview is to gather participant input on how CSC programs are operating and their thoughts and opinions about successes and challenges while participating in the CSC program.


  • State Mental Health Authority Interview: This is a one-time semi-structured interview with state mental health leadership in the states where the 32 sites in the evaluation are located. The interview focuses on their thoughts and opinions about the context in which CSC programs are implemented within their state and the state’s role in the implementation of the CSC programs.


  • Fidelity Interview: This interview will be conducted twice during the evaluation. The phone interview is designed to be used in conjunction with the First Episode Psychosis Fidelity Scale (FEPS-FS) to examine whether elements of CSC are implemented at the sites. The fidelity interview will be conducted with up to four CSC staff at each site.


  • Possible Administrative Data Elements: Each site will provide the evaluation team with administrative data elements on participant demographics and outcomes. To minimize burden and maximize the number of sites reporting outcome measures, we will seek sites that are already collecting the individual-level outcome measures identified for this study, including quality of life, symptomatology, employment status, educational status, and living situation. These administrative data elements are included in the core collection of measures recommended by the Mental Health Research Panel through the PhenX Toolkit (www.phenxtoolkit.org) for use by all mental health researchers. Thus, we expect that the majority of sites will already be collecting these measures as part of their routine practice. An illustrative sketch of a participant-level record is shown below.
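To illustrate the kind of participant-level record a site might submit, the following is a minimal, hypothetical sketch in Python. The field names, coding, and value ranges are assumptions made for illustration only; they do not represent the evaluation's actual data dictionary or the PhenX variable definitions.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ParticipantRecord:
    """One participant-level administrative data submission (baseline or follow-up)."""
    evaluation_id: str                        # unique evaluation ID only; no direct identifiers
    collection_point_months: int              # 0 = baseline, then 6, 12, or 18
    quality_of_life_score: Optional[float]    # e.g., a standardized scale score (hypothetical)
    symptom_severity_score: Optional[float]   # e.g., a symptomatology rating (hypothetical)
    employment_status: Optional[str]          # e.g., "employed" / "not employed"
    educational_status: Optional[str]         # e.g., "enrolled" / "not enrolled"
    living_situation: Optional[str]           # e.g., "independent", "with family", "other"

Because only the unique evaluation identification number appears in the record, the same structure supports linking a participant's baseline and follow-up submissions without transmitting personally identifiable information.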


Evaluation findings will be useful to SAMHSA, NIMH, Grant Project Officers (GPOs), and sites serving individuals with FEP in providing essential program management, development, and implementation information. SAMHSA can use evaluation findings to address program management priorities including accountability, program and policy planning, and program justification. The findings will clarify the extent to which fidelity to the CSC model is related to client outcomes. Reports of effectiveness may encourage other programs to adopt CSC and bring it to scale within a state and eventually across the U.S. Finally, the evaluation will inform the government’s efforts to provide technical assistance to states using the MHBG to implement CSC and bring it to scale in their communities.


Evaluation findings will be of use to both SAMHSA and states through:


  • Showing whether there are observable differences in participant outcomes that can be plausibly linked to the CSC approach and fidelity to the CSC model

  • Identifying best practices and effective strategies

  • Describing implementation experiences and practices

  • Understanding barriers and facilitators to successful CSC implementation

  • Illustrating the development of CSC as states move toward offering integrated and comprehensive services

  • Describing how participants experience CSC and how they use services and supports.

The practice community can use evaluation findings to:


  • Improve the implementation of CSC

  • Improve the quality and fidelity of the services they provide

  • Learn about the barriers to treatment and essential services that young people with FEP and their families experience and how to address those barriers

  • Learn whether participants experience services as the states intend and identify CSC programs’ strengths and weaknesses

  • Identify gaps and barriers in system development.


A.3 Use of Improved Information Technology

A web-based portal will be developed to collect and manage all administrative data submitted by CSC sites. Web-based data submission decreases respondent burden compared with alternative methods, such as a paper format, by allowing direct transmission of the data. Respondents can enter and submit the data at a time and location that is convenient for them. In addition, the data entry and quality control mechanisms built into the web-based portal reduce errors that might otherwise require follow-up, thus reducing burden compared with hardcopy data collection. Finally, the national survey of all FEP programs funded by ten percent set-aside funds will be conducted online to make data entry and analysis as efficient as possible.
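As an illustration of the kind of data entry and quality control the portal could apply, the following Python sketch flags missing or out-of-range values before a submission is accepted. The field names, value ranges, and rules are hypothetical; the portal's actual validation logic is not specified here.

def validate_submission(record: dict) -> list:
    """Return a list of validation messages; an empty list means the record passes."""
    errors = []
    if not record.get("evaluation_id"):
        errors.append("evaluation_id is required")
    if record.get("collection_point_months") not in (0, 6, 12, 18):
        errors.append("collection_point_months must be 0, 6, 12, or 18")
    score = record.get("quality_of_life_score")
    if score is not None and not (0 <= score <= 100):
        errors.append("quality_of_life_score outside expected range (0-100)")
    return errors

# Example: a submission with no ID and an implausible score is returned to the
# respondent with specific messages, so errors are corrected at the point of
# entry rather than through later follow-up.
print(validate_submission({"collection_point_months": 6, "quality_of_life_score": 250}))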


The evaluation team will ensure that all web-based solutions are fully compliant with Section 508 of the Rehabilitation Act. This includes ensuring that all posted documents are compliant or have a compliant alternative. The project will utilize Adobe products that are capable of producing compliant PDF files per the SAMHSA-recommended process. The evaluation team has a thorough knowledge of Section 508 standards and employs accessibility experts who test with a variety of assistive technologies, including screen readers, screen magnifiers, and voice recognition software.



A.4 Efforts to Avoid Duplication

This evaluation will provide information specific to the MHBG ten percent set aside program for implementing CSC services. It will serve as a primary mechanism for understanding the relationship between fidelity to the CSC model and program outcomes and for improving and sustaining CSC programs. The data are not collected through any other mechanism.



A.5 Involvement of Small Businesses

No small businesses will be involved in the evaluation.



A.6 Consequences if Information Collected Less Frequently

The evaluation was designed to keep the burden of data collection to a minimum by using the fewest rounds of data collection that will accomplish the objectives of the effort and meet evaluation reporting requirements. Some data items need to be collected more than once to assess change over time (e.g., how fidelity to the CSC program model and individual outcomes change over the course of the evaluation period).


A.7 Consistency with the Guidelines in 5 CFR 1320.5(d)(2)

The data collection efforts will be consistent with the guidelines at 5 CFR 1320.5(d)(2).



A.8 Federal Register Notice and Consultations Outside the Agency


A.8.1 Federal Register Notice


As required by 5 CFR 1320.8(d), the 60-day FRN was published in the Federal Register on April 12, 2017 (82 FR 17670). No comments were received.



A.8.2 Consultations Outside the Agency

Internal and external stakeholders were consulted in the development of the evaluation design, data collection methodology, and associated burden. These stakeholders included the Executive Technical Committee, composed of Westat experts Howard Goldman and Lisa Dixon; the Research Methods Group, composed of Westat experts Abram Rosenblatt, Gary Bond, and Robert Drake; outside experts Ted Lutterman, Kristin Neylon, David Shern, Pat Shea, Don Addington, and Tamara Sale; and federal experts including Steven Dettwyler of SAMHSA and Susan Azrin of NIMH.



A.9 Payments/Gifts to Respondents

Working with each site, Westat will identify a convenience sample of two program participants at each site for in-person, in-depth interviews. Program participants who agree to attend the in-person interview will receive a $25 gift card. Respondents will be informed that the interview will last approximately 60 minutes. Agreement to be interviewed will be obtained by telephone and followed up with written consent in person.



A.10 Assurance of Confidentiality

Westat has already obtained IRB approval of all data collection activities and approval to conduct the evaluation. The IRB requires that the project also submit the final set of tools. Further, the project will conform to all requirements of the Privacy Act of 1974 under the System of Records: Alcohol, Drug, and MH Epidemiological and Biometric Research Data, DHHS, #09-30-0036; the most recent publication in the Federal Register occurred on January 19, 1999 (64 FR 2914). The Westat IRB requires that sites submit to their own IRBs for review and approval prior to beginning any data collection for this project.


All members of the evaluation team will receive general awareness training and role-based training commensurate with the responsibilities required to perform the tasks of the project. Prior to performing any project work or accessing any system, and annually thereafter throughout the life of the evaluation, each team member will have completed the SAMHSA Security Awareness Training required by the agency, as well as Records Management and Human Subjects Research Training. The project will maintain a list of all individuals who have completed these trainings and will submit this list to the Project Officer upon request.


The evaluation team will safeguard the names of respondents, all information or opinions collected in the course of interviews and observations, and any information about respondents learned incidentally during the project. Hard copies of evaluation data and notes containing personal identifiers will be kept in locked containers or a locked room when not being used. Reasonable caution will be used in limiting access to data to only those persons who are working on the project and who have been instructed in appropriate Human Subjects requirements for the project. All evaluation data, notes, recordings, etc. will be destroyed no later than 6 months after the end of the contract, and SAMHSA will have documentation of the destruction of these items.


Identifying information such as individual names and addresses will not be part of any machine data record. Electronic files and audio files will be accessible only to project staff and under password protection. Access to network-based data files is controlled through the use of Access Control Lists or directory- and file-access rights based on user account ID and the associated user group designation. Staff are instructed on the proper use of PCs for the storage, transfer, and use of sensitive information and the tools available such as encryption.


Individuals and organizations providing information to the evaluation will be told the purposes for which the information is collected and that any identifiable information about them will not be used or disclosed for any other purpose. Identifiers such as name, email address, and position will be collected to facilitate survey administration and to notify respondents of the survey. Once data collection is complete, personal identifiers will be removed from the data and destroyed.


Site Survey. In Year 1 of data collection, a survey of all ten percent set-aside sites in the U.S. (approximately 250) will be administered to obtain an understanding of how the programs operate and how they use the set-aside funding. Respondents’ identities will be known, so an active informed consent process will be followed. Potential participants will be contacted by mail, email, or telephone to explain the survey. The explanation will cover the voluntary nature of the survey, how responses will be treated, and respondents’ risks, benefits, and rights. Participants will be asked to indicate, by checking a box on the online survey form, that they agree to participate in the evaluation before they complete and submit the survey. The letter and online survey form will provide contact information if a survey participant has questions or wants clarification prior to participation. If an individual does not have internet access, alternative methods will be used, such as (1) a packet sent by mail containing a cover letter, informed consent form, survey, and return envelope, or (2) administering the survey by telephone.


Agency Director/Administrator Interview, Coordinated Specialty Care (CSC) Staff Interview, CSC Participant Interview, and State Mental Health Authority Interview. Through in-person site visits in Years 1 and 2 of data collection, evaluation staff will obtain informed consent and conduct audio-recorded (with permission) interviews with evaluation sites’ program directors, staff members, and participants receiving CSC services. In Year 1 only, staff will also interview state mental health authority representatives. A small number of evaluation staff will have access to the recordings, transcripts, abstracted data, and any other identifying information associated with the interviews. All materials will be stored on a secure project directory on the Westat network. Hard-copy documents will be kept in a locked filing cabinet accessible only to evaluation staff. Any names or other personally identifiable information (PII) mentioned during interviews will be redacted from all transcripts before these data are imported into analysis software.


Fidelity Interview. Evaluation sites’ fidelity to the CSC program model will be assessed through rounds of telephone interviews in Years 1 and 2 of data collection. Respondents’ identities will be known, so an active informed consent process will occur to ensure their rights. Evaluation staff will obtain verbal consent for the telephone interviews using the consent scripts included in the attachments with the corresponding instrument. The telephone interviews will be audio-recorded (with permission), and separate informed consent will be obtained for the recording. Data from the interviews will be handled as described above for the process assessment interviews.


Possible Administrative Data Elements. When evaluation sites’ electronic data on client outcomes are transferred to the evaluation, data files will be encrypted to make the information indecipherable during the electronic transfer. Data will be transferred securely, and all precautions described in Section A.3, Use of Improved Information Technology, will be applied. Access to this information will be password protected, and data encryption will enhance security. No information that could potentially be used to identify a client will be included in these data files other than the individual’s unique evaluation identification number. No member of the evaluation team will ever have access to information that could link the unique identification number to personally identifiable information. Further, the project will operate under an ADP/IT security plan approved by SAMHSA for project data. In reporting the results of the evaluation, only aggregated information will be included. The project will not disseminate small numbers with demographic information that might be used to deduce the identity of individual respondents. One typical method is to suppress data where a small number of respondents would otherwise appear in a table or text.
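As an illustration of small-cell suppression, the following Python sketch replaces counts below a threshold with a suppression flag before a table is reported. The threshold of 11 is an assumption chosen for illustration; the evaluation's actual suppression rule is not specified here.

def suppress_small_cells(counts, threshold=11):
    """Replace any count below the threshold with 'suppressed' before reporting."""
    return {
        group: (count if count >= threshold else "suppressed")
        for group, count in counts.items()
    }

# Example: a hypothetical age group with too few respondents is not reported.
counts_by_age_group = {"18-20": 42, "21-23": 9, "24-25": 17}
print(suppress_small_cells(counts_by_age_group))
# {'18-20': 42, '21-23': 'suppressed', '24-25': 17}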



A.11 Questions of a Sensitive Nature

There are no questions of a particularly sensitive nature included in the evaluation.



A.12 Estimates of Burden Hours

Table 2 shows the estimated annualized burden hours for the respondents’ time to participate in each data collection activity. Across the instruments, the total burden is estimated to be 4,658 hours. The total cost burden is estimated to be $104,498.



Table 2. Estimated burden hours

Data Collection Activity | Number of respondents | Responses per respondent | Total responses | Average burden per response (in hours) | Total burden (in hours) | Hourly Wage Cost a | Total Wage Cost
Site Survey | 250 | 1 | 250 | 0.2 | 50 | 29.83 | $1,492
Agency Director/Administrator Interview | 64 | 1 | 64 | 2.0 | 128 | 29.83 | $3,818
Coordinated Specialty Care (CSC) Staff Interview | 192 | 1 | 192 | 2.0 | 384 | 22.47 | $8,628
Coordinated Specialty Care (CSC) Participant Interview | 128 | 1 | 128 | 1.0 | 128 | 7.25 | $928
State Mental Health Authority Interview | 32 | 1 | 32 | 2.0 | 64 | 29.83 | $1,909
Fidelity Interview | 64 | 4 | 256 | 4.0 | 1,024 | 22.47 | $23,009
Possible Administrative Data Elements | 32 | 18 | 576 | 5.0 | 2,880 | 22.47 | $64,714
Total | 762 | | 1,498 | | 4,658 | | $104,498


a Based on the average hourly wages for Community and Social Service Specialists, All Other (21-1099; $22.47) and Social Workers (21-1020; $29.83) from the May 2015 National Industry-Specific Occupational Employment and Wage Estimates, 621330 – Offices of Mental Health Practitioners; and the federal minimum wage of $7.25/hour.
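For readers who want to verify the totals, the following short Python sketch reproduces the Table 2 arithmetic: total responses = respondents × responses per respondent, total burden = total responses × average burden per response, and wage cost = total burden × hourly wage, rounded to whole dollars. It is a verification aid only; activity labels are abbreviated.

from decimal import Decimal, ROUND_HALF_UP

# (activity, respondents, responses per respondent, hours per response, hourly wage)
rows = [
    ("Site Survey",                               250,  1, Decimal("0.2"), Decimal("29.83")),
    ("Agency Director/Administrator Interview",    64,  1, Decimal("2.0"), Decimal("29.83")),
    ("CSC Staff Interview",                        192,  1, Decimal("2.0"), Decimal("22.47")),
    ("CSC Participant Interview",                  128,  1, Decimal("1.0"), Decimal("7.25")),
    ("State Mental Health Authority Interview",     32,  1, Decimal("2.0"), Decimal("29.83")),
    ("Fidelity Interview",                          64,  4, Decimal("4.0"), Decimal("22.47")),
    ("Possible Administrative Data Elements",       32, 18, Decimal("5.0"), Decimal("22.47")),
]

total_responses = sum(n * k for _, n, k, _h, _w in rows)
total_hours = sum(n * k * h for _, n, k, h, _w in rows)
total_cost = sum((n * k * h * w).quantize(Decimal("1"), rounding=ROUND_HALF_UP)
                 for _, n, k, h, w in rows)
print(total_responses, total_hours, total_cost)   # 1498 4658.0 104498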




A.13 Estimates of Annualized Respondent Capital and Maintenance Costs

There are neither capital nor startup costs, nor are there any operations or maintenance costs.



A.14 Estimates of Annualized Cost to the Federal Government

SAMHSA has planned and allocated resources for the management, processing, and use of the collected information in a manner that will enhance its utility to agencies. The contract award to cover this evaluation is $2,183,812 over a 36-month period. Thus, the annualized contract cost is $727,937. It is estimated that one SAMHSA employee will be involved for 5% of their time, at an estimated annualized cost of $4,407 to the government. The total estimated average cost to the government per year is $732,344.



A.15 Change in Burden


This is a new activity.



A.16 Time Schedule, Publication and Analysis Plan


Pending OMB approval, data collection will begin in October 2017 and continue for approximately 18 months. The last 6 months of the project will involve data analysis and report writing. Table 3 provides an overview of evaluation activities and dates.



Table 3. Schedule of evaluation activities

Activity | Date
Receive OMB clearance for data collection | September 2017
Begin data collection pending OMB clearance | October 2017
Begin providing training and technical assistance to evaluation sites | October 2017
Begin processing and analyzing data | December 2017
Stop data collection | April 2019
Produce annual evaluation reports | August 2017, 2018, and 2019



A.17 Exemption for Display of Expiration Date

No exemption is being requested.



A.18 Exceptions to Certification Statement


This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.

1 Heinssen, R., Goldstein, A., & Azrin, S. (2014). Evidence-based treatment for first episode psychosis: Components of coordinated specialty care. Retrieved on April 18, 2016 from http://www.nimh.nih.gov/health/topics/schizophrenia/raise/nimh-white-paper-csc-for-fep_147096.pdf

