
Screening, Brief Intervention, and Referral to Treatment (SBIRT) Cross-Site Evaluation

OMB No. 0930-0359

Supporting Statement


A. JUSTIFICATION

1. Circumstances of Information Collection

The Substance Abuse and Mental Health Services Administration (SAMHSA) is requesting approval from the Office of Management and Budget (OMB) for the “Performance Site Survey” data collection activity for the cross-site evaluation of the Screening, Brief Intervention, and Referral to Treatment (SBIRT) program as implemented in the sixth cohort of states and tribal organizations. Cohort VI comprises six states that began receiving funding in 2013 and 2014. A web-based survey will be administered to staff in sites where SBIRT services are being delivered (referred to as performance sites). The Performance Site Survey will be distributed to individuals who directly provide SBIRT services and to staff who interact regularly with SBIRT providers and with patients receiving SBIRT services.

SAMHSA’s SBIRT program supports the provision of clinically appropriate services for persons at risk for or diagnosed with a substance use disorder. SBIRT addresses an important service gap for people who use substances at a level that is unhealthy but below the threshold to be classified as a disorder. It is intended to be universal, so the screening component has significant potential for improving public health. Its potential reach is enormous: in 2012, 144 million adults used substances at levels that are unhealthy but below disorder thresholds (authors’ estimates using SAMHSA, 2014). SBIRT also helps address the needs of people with substance use disorders, estimated at 22 million people aged 12 or older in the United States (SAMHSA, 2013), and those with co-occurring mental health or medical conditions where substance use may exacerbate symptoms.

SBIRT has been shown to be effective in a variety of medical settings, including hospitals, primary care clinics, emergency rooms, clinical research settings, and college campuses (e.g., Jonas et al., 2012). Most studies and most findings of efficacy are for alcohol, particularly when SBIRT is implemented in primary care. Recent and ongoing studies continue to expand the evidence base on illicit drugs (e.g., WHO, 2008) and tobacco (Fiore et al., 2000). Beyond clinical efficacy, SBIRT has been shown to be cost-effective and cost-beneficial (e.g., Solberg et al., 2008).

The SBIRT program is authorized under Section 509 (Priority Substance Abuse Treatment Needs of Regional and National Significance) of the Public Health Service Act. SAMHSA has funded SBIRT programs across 21 states and tribal communities and 27 educational institutions since fiscal year 2004, investing over $250 million to provide SBIRT services to nearly 2 million patients.

SAMHSA has also overseen and supported the production of numerous training manuals and procedures for SBIRT (SAMHSA, 2009), helped develop SBIRT reimbursement codes, and partnered with the National Institute on Drug Abuse to support randomized trials of SBIRT for illicit drug use. SAMHSA has also played a critical role in integrating SBIRT into the National Drug Control Strategy.

Previous cross-site evaluations of SBIRT have focused on the implementation of SBIRT in medical settings and provided evidence on staffing models, a model matrix to inform program administrators on important staffing and protocol decisions, cost estimates across settings and models, and the degree to which service provision adheres to an evidence base. Additionally, findings from previous cross-site evaluations and the broader literature document the factors related to successful implementation. This literature to date has largely focused on factors at the patient and practitioner levels.

One gap in understanding SBIRT implementation that the current evaluation will address is the role of implementation factors at the organizational level, including an organization’s readiness to implement SBIRT. More broadly, the concept of organizational readiness is a growing area of research in the implementation science field. Implementation science in the behavioral health field has focused on acceptance and buy-in at the organizational, leadership, and direct service levels; positive expectations about the program; presence of a local champion at the project and organizational levels; and effective leadership support (Amaral, Ronzani, & Souza-Formigoni, 2010; Berends, MacLean, Hunter, Mugavin, & Carswell, 2011; Bernstein et al., 2009; MacLean, Berends, Hunter, Roberts, & Mugavin, 2012; Shaw et al., 2012). These components are significant organizational factors that may be associated with successful program implementation.

Other important factors that can facilitate or hinder implementation of behavioral health interventions and services are supportiveness of organizational policies and procedures, ability to recruit and retain skilled and committed staff members, clinical systems that provide regular services, ease of information sharing across community health services systems/hospitals, and reimbursement and coverage systems to support physician interventions and patient services (Berends et al., 2011; Gassman, 2003; MacLean et al., 2012; Yoast, Wilford, & Hayashi, 2008). Although previous evaluations of the SBIRT program have documented that organizational readiness is related to successful implementation, there is little understanding as to why it is important and the extent to which organizational readiness moderates downstream patient behaviors. This is an important scientific question and critical to providing effective technical assistance for federally funded SBIRT programs.

A related issue that the evaluation will address is how health information technology (HIT) is being used to implement SBIRT. The Health Information Technology for Economic and Clinical Health Act of 2009 implemented a series of incentives, grants, and programs to increase the use of HIT with the overall goals of improving clinical care, reducing health care costs, and supporting population and public health. SAMHSA has created several funding mechanisms to help states and providers implement HIT. Implementing new technologies poses several challenges (Chen & Popovich, 2003), as does engaging at-risk individuals in screening (Katon, 2003; Prince et al., 2007; Robson & Gray, 2007). It is therefore important to understand the strategic plans for HIT within medical settings and how integrated SBIRT programs are affected.

To date, there is little evidence that draws on data from multiple organizations and multiple states on the role of HIT in SBIRT implementation. Given potential efficiencies and improvements in patient care, it is important to use data from several organizations implementing SBIRT to understand the extent to which HIT tools can be integrated successfully with SBIRT services.

2. Purpose and Use of Information

The SBIRT cross-site evaluation for the sixth cohort of SBIRT grantees will build on the established knowledge and evidence base for SBIRT service delivery and implementation. This evaluation will focus on the six states awarded cooperative agreements in 2013 and 2014 (Cohort VI). The specific goals of the current cross-site evaluation include

  • assessing various implementation models and the organizations in which they succeed, including factors that affect organizational readiness and the integration of SBIRT into general medical settings;

  • understanding the impact of HIT implementation on the efficacy and sustainability of SBIRT programs;

  • determining the short- and long-term impacts of SBIRT using robust measures of outcome and quality; and

  • assessing the cost-effectiveness of SBIRT programs being integrated into the health care system.

The cross-site evaluation team has worked with SAMHSA to develop a set of evaluation questions around these four goals. These evaluation questions drive all other aspects of the evaluation, including plans for data collection. The evaluation questions are organized based on the type of evaluation (process, HIT, outcome, or economic) and on the level of data needed to answer them (grantee, performance site, or patient). In addition to these questions, a system-wide question integrates evaluation results to present a comprehensive understanding of SBIRT’s role in the treatment system as a whole. The current evaluation questions are included in Exhibit 1.

The evaluation will allow SAMHSA to determine the extent to which SBIRT has met its objective of implementing a comprehensive system of identification and care that meets the needs of individuals at all points along the substance use continuum. The Performance Site Survey will produce key data necessary to understand the organizational readiness of performance sites and the use of HIT to implement SBIRT. Currently, SAMHSA monitors the performance of these SBIRT programs using data collected through the Government Performance and Results Act (GPRA) (OMB No. 0930-0208). The GPRA data are gathered from patients receiving services and are used for evaluation and monitoring of patient-level variables (e.g., past-30-day substance use). These data are not sufficient for evaluating performance site–level outcomes and practices.

Exhibit 1. Evaluation Questions

Section 1 (Process): SBIRT implementation and service delivery factors with respect to outcomes and future SBIRT policy

1.1 How ready are the performance sites to implement and deliver SBIRT?
1.2 What are the factors faced by grantees and performance sites related to the implementation of SBIRT in the health care system?
1.3 What are the factors faced by grantees and performance sites related to the integration of SBIRT into the health care system?
1.4 What are the factors faced by grantees, performance sites, and patients related to service delivery?
1.5 What are the characteristics of the performance sites and the practitioners delivering SBIRT services?
1.6 How do the workflow processes vary across the performance sites, and what explains these variations?

Section 2 (Health Information Technology [HIT]): The impact of HIT implementation on the efficacy and sustainability of SBIRT programs

2.1 What HIT/Electronic Health Record (EHR) systems (overall and specifically for SBIRT) are used by grantee performance sites and health systems, and how are they integrating with existing HIT and state Health Information Exchanges?
2.2 What are the factors faced by grantees and performance sites related to the selected SBIRT HIT implementation?
2.3 To what extent does the SBIRT HIT implementation improve SBIRT's efficacy/effectiveness?

Section 3 (Outcomes): The short-term and long-term outcomes resulting from SBIRT implementation

3.1 How many patients receive SBIRT services, and what are these patients’ characteristics?
3.2 What is the efficacy of SBIRT for illicit drug use? To what extent did illicit drug use (and other substances) change as a result of SBIRT program efforts?
3.3 What effect does SBIRT have on long-term patient outcomes (e.g., health care utilization)?
3.4 What is the impact of SBIRT on treatment and data systems and the continuum of care?
3.5 What factors influence (i.e., moderate) patient short-term and long-term outcomes (e.g., grantee and performance site characteristics, patient characteristics, implementation quality, HIT/EHR)?
3.6 What factors influence (i.e., moderate) the expansion and sustainability of SBIRT?

Section 4 (Economic): The cost-effectiveness of SBIRT programs being integrated in the health care system

4.1 What are the costs of the SBIRT program to the grantees and to performance sites?
4.2 To what degree is SBIRT likely to be financially viable after grant funding has ended?
4.3 How has HIT changed the programmatic and service delivery costs of SBIRT across the grantees?
4.4 What is the impact of HIT on the cost-effectiveness of various SBIRT models?



The findings from the Performance Site Survey will help SAMHSA and its constituents understand how to meet the needs of patients effectively. The findings will thus inform policy concerning the development and implementation of behavioral health interventions in medical settings. The results of this data collection effort will also provide SAMHSA with substantive, technical, and administrative support to help transfer science to services in public and private sector substance abuse programs.

Specifically, the Performance Site Survey data are critical to a comprehensive evaluation of how organizations support the implementation of SBIRT (see Attachment 1). All staff (i.e., intake staff, managerial staff, medical providers, behavioral health providers, and social workers) working at sites that deliver SBIRT services are eligible to be surveyed. The survey includes the collection of basic demographic information, questions about the organization’s readiness to implement SBIRT, and questions about the use of HIT to deliver SBIRT services. The measures of organizational readiness and HIT use will be assessed individually and combined in scales of readiness and support, respectively. Analyses of organizational readiness and HIT use will examine point-in-time estimates and changes over time.
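As a purely illustrative aside, the sketch below shows one common convention for combining individual items into a scale index of the kind described above; the item names and Likert coding are hypothetical, not the actual survey content.

```python
# Illustrative only: combining hypothetical readiness items (Likert-coded
# 1-5) into a scale index by averaging, a common scoring convention.
readiness_items = {
    "leadership_support": 4,
    "staff_buy_in": 3,
    "local_champion": 5,
    "dedicated_resources": 2,
}

readiness_scale = sum(readiness_items.values()) / len(readiness_items)
print(f"Readiness scale score: {readiness_scale:.2f}")  # 3.50
```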

3. Use of Information Technology

The Performance Site Survey is a self-administered, web-based survey to be completed through Survey Gizmo. Using a web instrument allows for automated data checks, skip procedures, and prepopulated fields based on prior responses to certain questions. This approach will reduce respondent burden and the possibility of data entry error, thereby increasing the efficiency of data entry and improving data quality. The automated data checks will assess the consistency of responses between items (e.g., checking reported totals against corresponding demographic breakdowns) and ensure that responses follow the expected format (e.g., numbers or dates where those are expected). Responses will generate skip patterns for later questions in the instrument.
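The following sketch illustrates, in Python, the kinds of consistency checks and skip logic described above. The field names and routing are hypothetical examples for exposition, not the actual Survey Gizmo configuration.

```python
# Hypothetical illustration of the automated checks described above; the
# field names and routing are assumptions, not the actual survey variables.

def validate_response(resp: dict) -> list:
    """Return a list of consistency problems found in one response."""
    problems = []

    # Format check: the staff count must be a non-negative integer.
    total = resp.get("staff_total")
    if not isinstance(total, int) or total < 0:
        problems.append("staff_total must be a non-negative integer")

    # Cross-item consistency: role counts should sum to the reported total.
    elif sum(resp.get("staff_by_role", {}).values()) != total:
        problems.append("staff_by_role does not sum to staff_total")

    return problems

def next_item(resp: dict, current: str) -> str:
    """Skip pattern: route past EHR detail items for sites without an EHR."""
    if current == "uses_ehr":
        return "ehr_vendor" if resp.get("uses_ehr") else "hit_plans"
    return "end_of_survey"

example = {
    "staff_total": 6,
    "staff_by_role": {"intake": 2, "medical": 3, "behavioral": 1},
    "uses_ehr": False,
}
print(validate_response(example))       # [] -- internally consistent
print(next_item(example, "uses_ehr"))   # 'hit_plans' (EHR items skipped)
```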

All completed surveys will be downloaded from Survey Gizmo to RTI’s secure network. Details on RTI’s network security procedures are presented in Attachment 2.

A paper-and-pencil version of the Performance Site Survey will also be distributed and collected at sites where web access is not available. It is anticipated that more than 90 percent of surveys will be completed electronically and fewer than 10 percent using the paper-and-pencil version. Once completed forms are received, responses will be entered into a secure database using double-key data entry procedures, and hard copy surveys will be stored in locked cabinets or offices.
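As an illustration of the double-key verification step, the following sketch flags discrepancies between two independent keyings of the same forms. The data structures and identifiers are assumptions for the example, not the actual data entry system.

```python
# Sketch of double-key verification: two staff key the same hard-copy
# forms independently, and mismatched fields are flagged for review.
# Identifiers and field names here are assumptions for the example.

def keying_discrepancies(first: dict, second: dict) -> list:
    """Return (survey_id, field, value_1, value_2) wherever keyings differ."""
    diffs = []
    for sid in sorted(set(first) | set(second)):
        a, b = first.get(sid, {}), second.get(sid, {})
        for field in sorted(set(a) | set(b)):
            if a.get(field) != b.get(field):
                diffs.append((sid, field, a.get(field), b.get(field)))
    return diffs

keying_1 = {"S001": {"q1": "3", "q2": "yes"}}
keying_2 = {"S001": {"q1": "8", "q2": "yes"}}    # keystroke error in q1
print(keying_discrepancies(keying_1, keying_2))  # [('S001', 'q1', '3', '8')]
```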

4. Effort to Identify Duplication

The SBIRT cross-site evaluation team conducted an extensive literature review to confirm that the data collected through these sites would not duplicate any ongoing national or state-level data collection efforts. Data collected in this evaluation will be unique because of the scale and breadth of the initiative’s implementation: nationwide, across a spectrum of medical settings, and across a broad cross-section of populations.

5. Involvement of Small Entities

Participation of performance site staff in the SBIRT Cross-Site Evaluation will not be a significant burden on small businesses or small entities or on their workforces.

6. Consequences If Information Collected Less Frequently

A critical piece of the organizational readiness and HIT assessment is to monitor change over time. Organizations are expected to evolve in their ability to support SBIRT implementation and in their use of HIT to implement SBIRT. Changes over time in organizational readiness and HIT should in turn affect service provision and thus patient behaviors. Changes in these measures over time will also likely be correlated with the degree to which services at organizations are sustained. The Performance Site Survey will be administered longitudinally to each performance site over the course of three, approximately annual, waves. Each survey wave will represent an independent cross-section of performance site staff. Less frequent data collection would not allow for needed variation to address the evaluation’s primary objectives.

7. Consistency with the Guidelines in 5 CFR 1320.5(d)(2)

This information collection fully complies with the guidelines in 5 CFR 1320.5(d)(2).

8. Consultation Outside the Agency

The notice required by 5 CFR 1320.8(d) was published in the Federal Register on November 9, 2015 (80 FR 69233).

SAMHSA has made extensive use of experts in the area of substance abuse research to provide guidance on the design and analysis of the cross-site evaluation. An expert panel meeting was held in March 2015 to review the various aspects of the cross-site evaluation, including the evaluation plan, data collection procedures, and data analysis methods. The list of experts and representatives of the sponsoring federal agency is provided in Exhibit 2. The experts provided feedback on all aspects of the evaluation, including the Performance Site Survey, and their comments were incorporated into later drafts of the survey.

Exhibit 2. Expert Panel Members

Expert Panel Member | Affiliation | E-mail Address
Dr. Richard Brown | University of Wisconsin, School of Medicine and Public Health | [email protected]
Sarah Duffy | National Institute on Drug Abuse | [email protected]
Minnjuan Flournoy-Floyd | Substance Abuse and Mental Health Services Administration | [email protected]
Dr. Suzanne Gelber | The AVISA Group | [email protected]
Emily Jones | Assistant Secretary for Planning and Evaluation | [email protected]
Erich Kleinschmidt | Substance Abuse and Mental Health Services Administration | [email protected]
Peggy Murray | National Institute on Alcohol Abuse and Alcoholism | [email protected]
Sarah Ndiangui | Substance Abuse and Mental Health Services Administration | [email protected]
Dr. Janice Pringle | University of Pittsburgh, School of Pharmacy, Program Evaluation and Research Unit | [email protected]
Laura Rosas | Substance Abuse and Mental Health Services Administration | [email protected]
Gerlinda Somerville | Substance Abuse and Mental Health Services Administration | [email protected]
Kate Wetherby | Substance Abuse and Mental Health Services Administration | [email protected]
Dr. Emily Williams | U.S. Department of Veterans Affairs, Health Services Research & Development, and University of Washington Health Services | [email protected]
Dr. Janet Williams | University of Texas Health Science Center | [email protected]


9. Payment to Respondents

No cash incentives or gifts will be given to respondents for completing the Performance Site Survey.

10. Assurance of Confidentiality

Concern for privacy and protection of respondents’ rights will play a central part in the implementation of all study components. RTI International is implementing the Performance Site Survey and collecting the data. The survey data collected will be kept private. In some situations, demographics collected as a part of the survey might permit the respondents to be identified. Therefore, RTI will de-identify the demographic data as needed to assure respondent confidentiality. RTI has extensive experience protecting and maintaining the privacy of respondent data.

The SBIRT cross-site evaluation team will use passwords to safeguard project servers and analysis files containing completed survey data to prevent the inadvertent disclosure of study data. The team also will be trained on handling sensitive data and the importance of privacy. All project staff will sign a privacy pledge (see Attachment 3). In addition, all studies involving human subjects are reviewed by RTI’s Institutional Review Board (IRB) (Federal Wide Assurance Number 3331) and by grantee IRBs as necessary before study implementation. In keeping with 45 CFR 46, Protection of Human Subjects, the SBIRT procedures for data collection, consent, and data maintenance are formulated to protect respondents’ rights and the privacy of information collected. Data from the Performance Site Survey will be kept strictly private in compliance with the Privacy Act of 1974 (5 U.S.C. 552a). The privacy of data records will be explained to all respondents during the consent process (see Attachment 4).

No contact information will be collected directly from respondents; instead, each performance site will provide a roster of staff names and e-mail addresses, which will be used to distribute the link to the web-based survey. Names of respondents will be secured and stored separately from the survey data.

11. Questions of a Sensitive Nature

No sensitive information will be collected from the respondents.

12. Estimates of Annualized Hour Burden

The annualized hour burden of the collection of information from performance site staff is estimated as follows. The cross-site evaluation team expects that the number of eligible respondents will differ by the number of staff at each performance site; for example, one would expect more staff in an emergency department (ED) than in a primary care office. Across the six states, the cross-site evaluation team expects a total of 23 EDs and 100 primary care offices to be surveyed. In ED sites, the team expects on average 5 to 15 intake/front office staff, 1 to 3 performance site administrators, 1 to 3 clinical supervisors, 20 to 40 medical providers, 1 to 3 behavioral health providers, and 1 social worker. In primary care offices, the team expects on average 1 to 5 intake/front office staff, 1 to 3 performance site administrators, 1 to 2 clinical supervisors, 1 to 25 medical providers, 1 to 4 behavioral health providers, and 1 to 2 social workers. These staffing estimates were informed by cross-site evaluation site visits.

The total staff sample size for the SBIRT cross-site data collection effort is estimated to be a maximum of 1,407 respondents. For each type of respondent, cross-site evaluation staff collected rosters of existing performance sites and projected the likely composition of future sites based on site visit data. Exhibit 3 presents estimates of annualized and total burden based on preliminary testing. Sampling procedures are discussed in Section B.1. There will be a maximum of three annual waves of data collection with any one staff member at each organization. This number is a maximum because organizations may join or withdraw from the sample during the study period.



Exhibit 3. Annualized Burden Estimate

Respondent | Number of Respondents (a) | Number of Responses/Respondent | Total Number of Responses | Hours per Response (b) | Annual Burden Hours | Hourly Wage (c) | Annual Cost ($) (d)
Intake/front desk staff | 215 | 1 | 215 | 0.22 | 47.30 | $16.12 | $762.48
Performance site administrators | 191 | 1 | 191 | 0.22 | 42.02 | $49.84 | $2,094.28
Clinical supervisors | 101 | 1 | 101 | 0.22 | 22.22 | $41.20 | $915.46
Medical providers | 571 | 1 | 571 | 0.22 | 125.62 | $45.62 | $5,730.78
Behavioral health providers | 211 | 1 | 211 | 0.22 | 46.42 | $23.09 | $1,071.84
Social workers | 118 | 1 | 118 | 0.22 | 25.96 | $23.63 | $613.43
TOTAL | 1,407 | | 1,407 | | 309.54 | | $11,188.27

(a) The maximum number of annual respondents is based on estimates from cross-site evaluation site visits.

(b) The average burden per response was estimated based on independent review of the instrument by contractor staff.

(c) Mean wage estimates were obtained from salary estimates for related professions from the U.S. Department of Labor’s Bureau of Labor Statistics (http://www.bls.gov/oes/current/oes_nat.htm). The following job codes were used for each respondent category: 43-6013 for intake/front desk staff; 11-9111 for performance site administrators; the average of 11-9111 and 11-9151 for clinical supervisors; 29-1000 for medical providers; 21-1011 for behavioral health providers; and 21-1020 for social workers.

(d) Annual respondent cost is calculated as number of respondents (a) × hours per response (b) × hourly wage (c).
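For transparency, the short script below reproduces the Exhibit 3 arithmetic described in footnotes (b) through (d); all figures are taken directly from the exhibit, and the script is provided purely as an illustration of the calculation.

```python
# Reproducing the Exhibit 3 arithmetic: annual burden hours equal
# respondents x responses x hours per response, and annual cost equals
# annual burden hours x hourly wage (footnote d). Figures are from Exhibit 3.
rows = [
    ("Intake/front desk staff",         215, 1, 0.22, 16.12),
    ("Performance site administrators", 191, 1, 0.22, 49.84),
    ("Clinical supervisors",            101, 1, 0.22, 41.20),
    ("Medical providers",               571, 1, 0.22, 45.62),
    ("Behavioral health providers",     211, 1, 0.22, 23.09),
    ("Social workers",                  118, 1, 0.22, 23.63),
]

total_hours = 0.0
total_cost = 0.0
for name, n, responses, hours, wage in rows:
    burden = n * responses * hours      # annual burden hours
    cost = round(burden * wage, 2)      # annual cost for this category
    total_hours += burden
    total_cost += cost
    print(f"{name}: {burden:.2f} hours, ${cost:,.2f}")

print(f"TOTAL: {total_hours:.2f} hours, ${total_cost:,.2f}")
# TOTAL: 309.54 hours, $11,188.27
```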





There are no direct costs to respondents other than their time to participate in the study. The annual cost of the time respondents spend completing these surveys is $11,188.27, calculated for each respondent category by multiplying the number of annual respondents by the hours per response and the hourly wage, and summing across categories.

13. Estimates of Annualized Cost Burden to Respondents

There are no respondent costs for capital or start-up or for operation or maintenance.

14. Estimates of Annualized Cost to the Government

The estimated cost to the government for 4 years of data collection and analysis is $251,954. This includes approximately $241,384 for a 4-year contract covering data collection, processing, and reporting, and approximately $10,570 in SAMHSA costs to manage and administer the survey (2 percent of one GS-15 employee's time). The annualized cost is approximately $62,988.

15. Changes in Burden

This is a new collection of information.

16. Time Schedule, Publications, and Analysis Plan

Time Schedule: Exhibit 4 outlines the key time points for the study and for the collection of information. The requested period also allows for training and start-up activities associated with the preparation for data collection.

Exhibit 4. Time Schedule for Entire Project

Activity | Time Schedule
Obtaining OMB approval for data collection | Spring 2016
Data collection | Beginning 3 months post OMB approval, continuing for 36 months
Data analysis | Beginning 18 months post OMB approval
Dissemination of findings (interim reports, manuscripts, final report) | Beginning 18 months post OMB approval, through 2019


Publications: The SBIRT cross-site evaluation is designed to produce information about the implementation and impact of Cohort VI SBIRT models. It is therefore important to prepare and disseminate reports, concept papers, documents, and oral presentations that present project results clearly and concisely so that they can be appreciated by both technical and nontechnical audiences. The SBIRT cross-site evaluation team will

  • produce rapid-turnaround analysis papers, briefs, and reports;

  • prepare and submit monthly technical progress reports and a final SBIRT cross-site evaluation team report;

  • prepare a final cross-site findings report, including an executive summary;

  • deliver presentations at professional and federally sponsored conventions and meetings; and

  • disseminate reports and materials to entities inside and outside SAMHSA.

Analysis Plan: The analysis centers on specific evaluation questions presented in Exhibit 1. The analysis of the Performance Site Survey will primarily be descriptive statistics based on the scale indices and longitudinal comparisons across time. These results will also be incorporated into other analyses, including the following:

  • Correlation with performance monitoring of program outputs, such as rates of patient screening and screen positives

  • Triangulation of the results with other qualitative and administrative data on implementation success and economic efficiency

The basic approach will use both a case study design and a pooling of data. Attachment 5 is a table shell in which results of the analysis of organizational readiness and HIT outcomes may be reported.

Analyses of organizational readiness will use both the individual measures and the measures combined into readiness scales. The evaluation team will compute descriptive statistics (means and proportions) and conduct multivariate analyses treating organizational readiness as a dependent variable. The multivariate analyses will assess associations between readiness and other site-level characteristics (e.g., medical setting). Other analyses will use organizational readiness as a predictor of other dependent variables in the main evaluation; for example, organizational readiness may be correlated with the proportion of eligible patients who are subsequently screened. Analytic techniques include regression methods, such as generalized linear mixed models.
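To make the modeling approach concrete, the sketch below fits a linear mixed model (the simplest member of the generalized linear mixed model family) with a random intercept for performance site. The data are simulated and all variable names are hypothetical, so this is a sketch of the technique under stated assumptions rather than the evaluation's actual specification.

```python
# Simulated illustration of the regression approach named above: a linear
# mixed model with a random intercept for performance site. All variable
# names and data are hypothetical; the evaluation's models will differ.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sites, n_waves = 20, 3
site = np.repeat(np.arange(n_sites), n_waves)        # site identifier
wave = np.tile(np.arange(1, n_waves + 1), n_sites)   # survey wave (1-3)
readiness = rng.normal(3.0, 0.5, site.size)          # readiness scale score
site_effect = rng.normal(0.0, 0.05, n_sites)[site]   # site heterogeneity

# Hypothetical outcome: proportion of eligible patients screened.
screen_rate = (0.20 + 0.08 * readiness + 0.03 * wave
               + site_effect + rng.normal(0.0, 0.03, site.size))

df = pd.DataFrame({"site": site, "wave": wave,
                   "readiness": readiness, "screen_rate": screen_rate})

# Readiness as a predictor of screening rates, accounting for repeated
# waves within sites via a random intercept for each site.
model = smf.mixedlm("screen_rate ~ readiness + wave", df, groups=df["site"])
print(model.fit().summary())
```

A binomial or Poisson GLMM would follow the same structure for count or proportion outcomes; the linear case is shown here only because it is the most compact to illustrate.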

Analyses of the HIT data will be qualitative and quantitative. For each performance site, estimates from these data will be combined with other information describing the site (e.g., medical setting) to create a comprehensive characterization or profile. These estimates will be considered qualitatively alongside the other information so that descriptions are internally consistent. Triangulation and other qualitative analysis will be used to ascertain important relationships among the different characteristics, such as how the implementation of new HIT interacted with implementation of a new SBIRT program. Formal qualitative comparative analysis will be used to make logical statements about patterns of relationships when supported by the data. Following these qualitative analyses, statistical tests will determine whether differences in HIT support are explained by factors and strata such as urbanicity and geographic region, medical setting type, and state-level policies around HIT. Similar to the organizational readiness analyses, the HIT measures will be used as a predictor of other dependent variables in the main evaluation.

17. Display of Expiration Date

OMB approval expiration dates will be displayed.

18. Exceptions to Certification for Statement

There are no exceptions to the certification statement. The certifications are included in this submission.
