Screening, Brief Intervention, and Referral to Treatment (SBIRT) Cross-Site Evaluation

OMB: 0930-0321


SUPPORTING STATEMENT FOR

SCREENING, BRIEF INTERVENTION, AND REFERRAL TO TREATMENT (SBIRT)

CROSS-SITE EVALUATION



A. JUSTIFICATION


1. Circumstances of Information Collection


The Substance Abuse and Mental Health Services Administration (SAMHSA) is requesting approval from the Office of Management and Budget (OMB) for the “Practitioner Survey” data collection activity for the Cross-Site Evaluation of the “Screening, Brief Intervention, and Referral to Treatment (SBIRT)” program in Cohort III States and Tribal Organizations. This activity consists of a survey of practitioners at health care provider sites where SBIRT services are being delivered.


Federal programs have tended to emphasize either universal substance abuse prevention strategies aimed at those who have never initiated use (Mrazek and Haggerty, 1994) or specialist treatment for those who are dependent (Gerstein and Harwood, 1990). Little attention has been paid to the large group of individuals who use substances but are not, or not yet, dependent and who could successfully reduce substance use through early intervention (Klitzner et al., 1992; Fleming, 2002).


The specialty treatment system is often not appropriate for persons at risk for a substance use disorder (SUD; i.e., substance abuse or substance dependence), and that system alone cannot address the needs of all persons diagnosed with an SUD. Consequently, new program efforts are needed to provide funding to introduce or expand screening, brief intervention, and brief treatment for persons at risk for, or diagnosed with, an SUD. These new program efforts need to be initiated in general medical and other community settings (e.g., community health centers, nursing homes, schools and student assistance programs, occupational health clinics, hospitals, emergency departments [EDs]). Screening for substance use and misuse among patients in primary care settings offers many potential benefits. It provides an opportunity to educate patients about low-risk consumption levels and the risks of excessive use (U.S. Department of Health and Human Services [DHHS], 1997). Information about the amount and frequency of alcohol or drug consumption may also inform the diagnosis of the patient’s presenting condition, and it may alert clinicians to the need to advise patients regarding adverse effects of medication use and other aspects of their treatment. Screening also offers practitioners the opportunity to take preventive measures proven to be effective in reducing alcohol- and drug-related risks.


A large body of evidence in the literature indicates that SBIRT services reduce primary care patients’ substance use. Reliable self-report substance abuse screening tools are available to health professionals, and brief intervention (BI) strategies appear to reduce substance use amongst nondependent heavy alcohol users and cigarette smokers, and potentially among marijuana users (Babor et al., 2007; Bernstein et al., 2005; Fleming et al., 2007; Gentilello et al., 1999; Kraemer, 2007; Madras et al., 2009; Soderstrom et al., 2007; Stephens et al., 2007). Brief treatments (BT) also appear to be effective in reducing alcohol use and smoking among alcohol and other drug users (Babor et al., 2007; Bernstein et al., 2005; Fleming et al., 2007; Gentilello et al., 1999; Kraemer, 2007; Madras et al., 2009; Soderstrom et al., 2007; Stephens et al., 2007). New implementation strategies, such as the use of Internet applications (Copeland & Martin, 2004), show promise for increasing the level and impact of SBI and potentially improving sustainability. In addition, research has demonstrated the cost-effectiveness of SBIRT services (Babor et al., 2006; Mauch, Kautz, & Smith, 2008).


The SBIRT program is authorized under Section 509 of the Public Health Service Act, as amended. The program also addresses the Healthy People 2020 Substance Abuse focus area, Objective SA HP2020-6 (DHHS). The White House National Drug Control Strategy (NDCS) emphasizes (1) preventing drug use before it starts, (2) intervening with and healing those who already use drugs, and (3) disrupting the market for illicit substances (Office of National Drug Control Policy [ONDCP], 2007). SBIRT focuses on early intervention and treatment, central components of implementing the NDCS.


Recognizing that treatment needs could be better met through a comprehensive approach to identifying and treating substance use problems across a continuum of severity, SAMHSA established the SBIRT program. SAMHSA’s SBIRT program is a cooperative agreement grant program designed to help States, Territories, and Tribes expand the continuum of care available for substance misuse and use disorders. The program includes screening, BI, BT, and referral to treatment for persons at risk for dependence on alcohol or drugs. The SBIRT program represents a major advance in the basic philosophy of addressing substance use issues and the role of the treatment system. As with other practices developed in tightly controlled research settings, it is important to understand how SBIRT works best in various settings and under somewhat different approaches. It is also important to examine which models of SBIRT offer the greatest potential to improve the U.S. service system.


Begun in 2004, the first SAMHSA SBIRT Cross-Site Evaluation assessed the impact of SBIRT in six Cohort I States and one Tribal Organization. The Cohort I evaluation will generate key findings in 2010 on process, outcome, and economic areas of investigation that cannot be addressed by small randomized controlled trials.


As SBIRT continues to evolve and develop, research needs to shift from establishing an evidence base for SBIRT services to examining the administration and implementation of SBIRT programs. Areas of specific interest for further exploration include

  • SBIRT and the use of technology for SBI (Boudreaux et al., 2009; McRee, 2009),

  • efficacy and effectiveness for illicit substances (Zahradnik et al., 2009; Madras et al., 2009; Bernstein et al., 2005),

  • utility of prescreening (Vinson, Galliher, Reidinger, & Kappus, 2003),

  • integration of substance abuse screening into general health/behavioral health screening (Beich, Thorsen, & Rollnick, 2003; Hungerford & Pollock, 2003),

  • training of medical staff within various settings for implementing SBIRT (Bradley et al., 2007),

  • economic implications of different implementation models, and

  • various implementation models and the organizations in which they succeed.

Fortunately, the field is well positioned to produce evidence in several of these areas over the next several years. The second SAMHSA SBIRT Cross-Site Evaluation has been structured to continue investigating effective implementation of SBIRT as well as to provide evidence on these additional areas of interest. Although four States were awarded cooperative agreements in 2006 (Cohort II), this evaluation will focus on the three States and one Tribal Organization awarded cooperative agreements in 2008 (Cohort III). Due to logistical constraints, Cohort II grantees were not included in either the first or the second SBIRT Cross-Site Evaluation.

As part of the second SBIRT Cross-Site Evaluation, the Practitioner Survey will produce the key outcome data necessary for an evaluation complete enough to establish best practices. Currently, SAMHSA monitors the performance of these SBIRT programs using data collected through the Government Performance and Results Act (GPRA) (OMB No. 0930-0208). Although GPRA data are sufficient for program monitoring, they are not sufficient for establishing best practices among competing programs.


The SBIRT Cross-Site Evaluation of a multiprotocol, multipopulation effort in Cohort III States and Tribal Organizations will generate empirically based knowledge about a variety of interventions and how they function within a variety of populations and contexts, thus broadening SAMHSA’s initiatives. As clinical trials originally reported in the literature (Fleming et al., 1999; Ballesteros et al., 2004; Moyer et al., 2002; D’Onofrio and Degutis, 2002; Saitz et al., 2003) and SAMHSA’s original SBIRT I Cross-Site Evaluation (OMB No. 0930-0282) demonstrated the effectiveness of SBIRT in reducing substance use and use risk, interest in SBIRT implementation continues to grow. Additionally, with challenges to SBIRT implementation and integration reported in the literature (Modesto-Lowe and Boornazian, 2000; Roche and Freeman, 2004; Arndt et al., 2002; Church and Babor, 1995), and with the first SBIRT Cross-Site Evaluation data showing differences between clinical research and real-world practice, further investigation is needed. By linking this evidence base in the literature to the models actually being implemented by the four sites, the second SBIRT Cross-Site Evaluation will examine the administration and implementation of SBIRT programs, thereby gauging the success and impact of the broader implementation of SBIRT.


2. Purpose and Use of Information


The purpose of this evaluation is to provide a comprehensive assessment of the effects of SBIRT on patient outcomes, performance site practices (i.e., general medical and other community settings), and treatment systems. This information will allow SAMHSA to determine the extent to which SBIRT has met its objectives of implementing a comprehensive system of identification and care to meet the needs of individuals at all points along the substance use continuum.


To achieve this overarching objective, the evaluation will assess the impact of SBIRT on the existing treatment system, including identification of the barriers, challenges, and facilitators of successful SBIRT implementation. This evaluation will also examine the feasibility, utility, and sustainability of future SBIRT cohorts and make recommendations to SAMHSA of ways to improve future initiatives within the SBIRT portfolio. Consistent with SAMHSA’s focus on accountability and effectiveness, this evaluation will yield information to guide and refine the processing/monitoring system being developed and maintained by SAMHSA. Finally, to the extent possible, the evaluation will be responsive to the needs of key stakeholders, such as patients, performance sites, providers, payers, and policy makers.


The Cross-Site Evaluation Team has worked with SAMHSA to develop a set of key evaluation questions. These evaluation questions drive all other aspects of the evaluation, including plans for data collection. The evaluation questions are organized based on the type of evaluation (process, outcome, or economic) and on the level of data needed to answer them (grantee, performance site, or patient). In addition to process, outcome, and economic questions, there are system-wide questions that integrate evaluation results from all three to present a comprehensive picture of the effects of SBIRT on the treatment system as a whole. The current evaluation questions are included in Attachment 1.


The results of this data collection effort will provide SAMHSA with substantive, technical, and administrative support to transfer science to services concerning public and private sector substance abuse programs. Data collected via the Practitioner Survey will enable the SBIRT program to increase its effectiveness in meeting the needs of its clients with substance use disorders. The data will also inform future policy concerning the development and implementation of SBIRT within non-substance abuse treatment settings.


The Practitioner Survey will provide the data necessary to conduct a complete outcome evaluation (see Attachment 2). All practitioners working at sites that deliver SBIRT services are eligible to be surveyed, including health educators, chemical dependency counselors, physicians, nursing staff, and other clinical or administrative staff. Demographic and educational background characteristics will be collected along with a randomly generated site identification number. The Practitioner Survey includes sets of questions that attempt to gauge the barriers to implementation encountered by practitioners and the training they received. The analysis will be based primarily on descriptive statistics on service delivery unit type and practitioner characteristics and attitudes.


3. Use of Information Technology


A paper and pencil version of the Practitioner Survey will be distributed and collected while the SBIRT Cross-Site Evaluation Team is on-site at each location. This will be the primary method of data collection. Once completed forms are received, responses will be entered into a secure database using double-key data entry procedures. Details on RTI’s network security procedures are presented in Attachment 3.
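For illustration only, the following minimal sketch shows the kind of comparison implied by double-key data entry: the same batch of surveys is keyed twice, and any cells where the two keyings disagree are flagged for adjudication against the paper form. The file names and the use of pandas are assumptions made for the example and do not describe RTI’s actual data entry system (see Attachment 3).

```python
# Sketch of a double-key comparison: flag cells where two independent keyings
# of the same survey batch disagree. File names and layout are illustrative.
import pandas as pd

entry_a = pd.read_csv("practitioner_survey_entry1.csv", index_col="survey_id")
entry_b = pd.read_csv("practitioner_survey_entry2.csv", index_col="survey_id")

# DataFrame.compare() returns only the cells where the two keyings differ
mismatches = entry_a.compare(entry_b)
if mismatches.empty:
    print("No discrepancies; batch accepted.")
else:
    print(f"{len(mismatches)} record(s) need adjudication against the paper forms:")
    print(mismatches)
```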


4. Effort to Identify Duplication


The SBIRT Cross-Site Evaluation Team conducted an extensive literature review to confirm that the data collected through these sites would not be duplicative of any ongoing national or state-level data collection efforts. Data collected in this evaluation will be unique because of the scale and breadth of the initiative’s implementation: nationwide, across a spectrum of provider settings, and across a broad cross-section of populations.


5. Involvement of Small Entities


Participation of practitioners in the SBIRT Cross-Site Evaluation will not be a significant burden on small businesses or small entities or on their workforces.


6. Consequences If Information Collected Less Frequently


The Practitioner Survey will be administered to each respondent one time. Because the objective of the Practitioner Survey is not to monitor trends in variables over time or before and after an intervention, obtaining the data more frequently would be an unnecessary burden. Less frequent data collection would not achieve the SBIRT Cross-Site Evaluation initiative’s primary objectives.


7. Consistency with the Guidelines in 5 CFR 1320.5(d)(2)


This information collection fully complies with the guidelines in 5 CFR 1320.5(d)(2).


8. Consultation Outside the Agency


The notice required by 5 CFR 1320.8(d) was published in the Federal Register on June 2, 2010 (75 FR 30837-30838). No comments were received in response to this notice.


SAMHSA has made extensive use of experts in the area of substance abuse research to provide guidance on the design and analysis of the cross-site evaluation. An expert panel meeting was held in January 2010 to review the various aspects of the cross-site evaluation, including the evaluation plan, data collection procedures, and data analysis methods. The list of experts is provided in Exhibit 1.


Exhibit 1: Expert Panel Members

Expert

Affiliation

Contact Information

Janette Baird, PhD

Assistant Professor of Research

Department of Emergency Medicine

Brown University

Warren Alpert School of Medicine

Rhode Island Hospital
593 Eddy Street
Providence, RI 02903


Phone: (401) 444-2976

Fax: (401) 444-2249

Email: [email protected]


Sharon Estee, PhD

Chief, Program Research and Evaluation Section

Washington Department of Social and Health Services

P.O. Box 45204, 1115 Washington Street

Olympia, WA 98504

Phone: (360) 902-7655

Fax: (360) 902-0705

Email: [email protected]


Dean Fixsen, PhD

Senior Scientist

University of North Carolina at Chapel Hill

Campus Box 8040

Chapel Hill, NC 27599

Phone: (919) 966-3892

Fax: (919) 966-7463

Email: [email protected]


Michael French, PhD

Professor of Health Economics and Director of the Health Economics Research Group

University of Miami

5202 University Drive Merrick Building, Room 121F

Coral Gables, FL 33124

Phone: (305) 284-6039

Fax: (305) 284-5716

Email: [email protected]


Larry Gentilello, MD, FACS

Professor of Surgery

University of Texas

6238 Pemberton Drive

Dallas, TX 75230

Phone: (214) 632-9831

Email: [email protected]


Daniel W. Hungerford, DrPH

Epidemiologist

Centers for Disease Control and Prevention

CDC: NCIPC/DIR MS-F62, 4770 Buford Hwy NE

Atlanta, GA 30341

Phone: (770) 488-4142

Fax: (770) 488-3551

Email: [email protected]


Maristela Monteiro, MD, PhD

Senior Advisor on Alcohol and Substance Abuse

Pan American Health Organization (PAHO)

13405 Oriental Court

Rockville, MD 20853

Phone: (202) 974-3108

Email: [email protected]

Margaret M. Murray, MSW

Senior Advisor to the Director

National Institute on Alcohol Abuse and Alcoholism, NIH

5635 Fishers Lane

Rockville, MD 20852

Phone: (301) 443-2594

Fax: (301) 443-7043

Email: [email protected]


Stephen O’Neil, MA, CDP

Director, Georgia BASICS SBIRT Project

Georgia Dept. of Behavioral Health & Developmental Disabilities, Division of Addictive Diseases

2 Peachtree St.

Atlanta, GA 30303

Phone: (404) 651-6450

Fax: (404) 657-6917

Email: [email protected]


Janice Pringle, PhD

Research Associate Professor & Director of the Program Evaluation and Research Unit

University of Pittsburgh School of Pharmacy

2100 Wharton Street, Suite 720-C

Pittsburgh, PA 15203

Phone: (412) 904-6127

Fax: (412) 904-6125

Email: [email protected]


Jodi Trojan, M.C.J.

Behavioral Health Evaluator

Tanana Chiefs Conference

3754 Mitchell Ave

Fairbanks, AK 99709

Phone: (907) 451-6822

Email: [email protected]



The experts provided feedback on all aspects of the evaluation, including the practitioner survey, and their comments were incorporated into later drafts of the survey.


9. Payment to Respondents


No cash incentives or gifts will be given to respondents for completing the Practitioner Survey.


10. Assurance of Confidentiality


Concern for privacy and protection of respondents’ rights will play a central part in the implementation of all study components. RTI International is implementing the cross-site surveys and collecting and analyzing the data and has extensive experience protecting and maintaining the privacy of respondent data.


The SBIRT Cross-Site Evaluation Team will use passwords to safeguard project directories and analysis files containing completed survey data to ensure that there is no inadvertent disclosure of study data. The team also will be trained on handling sensitive data and the importance of privacy. All project staff will sign a privacy pledge. (See Attachment 4.) In addition, all studies involving human subjects will be reviewed and approved by RTI’s Institutional Review Board (IRB) (Federal Wide Assurance Number 3331) and by grantee IRBs as necessary prior to study implementation. In keeping with 45 CFR 46, Protection of Human Subjects, the SBIRT procedures for data collection, consent, and data maintenance are formulated to protect respondents’ rights and the privacy of information collected. Strict procedures will be followed for protecting the privacy of respondents’ information and for obtaining their informed consent. The IRB-approved model informed consent in Attachment 5 meets all Federal requirements for informed consent documentation. This template will be customized by each grantee to obtain informed consent for participation in the study. Any necessary changes to the survey will be reviewed by the RTI IRB.


Data from the Practitioner Survey will be kept strictly private in compliance with the Privacy Act of 1974 (5 U.S.C. 552a). The privacy of data records will be explained to all respondents during the consent process and in the consent form.


No contact information will be collected from respondents, and no follow-up interviews will be administered. Names of sampled respondents will be secured and stored separately from the survey data. The survey data collected will be anonymous. Demographics and educational background characteristics will be collected along with a randomly generated site identification number. In some situations, these characteristics might permit the practitioner respondents to be identified. Therefore, the protocols and data protections above will be used to ensure the privacy of practitioner respondents.


11. Questions of a Sensitive Nature


No sensitive information will be collected from the respondents. Respondents will be informed about the purpose of the data collection and that responding to all survey questions is voluntary. In addition, specific assurances will be provided to respondents concerning the safety and protection of data collected from them. Respondents’ names or other identifying information will not be collected.


12. Estimates of Annualized Hour Burden


The annualized hour burden of the collection of information from practitioners is estimated as follows. The Cross-Site Evaluation Team expects that the number of eligible respondents will differ by the number of patients seen at a performance site. For example, one would expect more practitioners in an emergency department (high flow) than in a primary care office (low flow). At each of the 5 high-flow sites, approximately 15 SBIRT practitioners and 60 non-SBIRT practitioners are expected to be surveyed. At each of the 20 low-flow sites, approximately 5 SBIRT practitioners and 30 non-SBIRT practitioners are expected to be surveyed. The total practitioner sample size for the SBIRT cross-site data collection effort is therefore estimated at a maximum of 1,075 respondents (5 high-flow sites × 75 respondents per site + 20 low-flow sites × 35 respondents per site). Exhibit 2 presents estimates of annualized burden based on preliminary testing. Sampling procedures are discussed in Section B.1.


There are no direct costs to respondents other than their time to participate in the study. The annual cost of the time respondents spend completing the survey is $10,320 (322.5 practitioner respondent hours × $32, the estimated average hourly wage for individuals working in health-related occupations as published by the Bureau of Labor Statistics, 2009).
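For readers who wish to trace the arithmetic, the following short sketch reproduces the respondent, burden-hour, and cost figures reported above and in Exhibit 2. The constants are taken directly from this section; the variable names are illustrative only.

```python
# Burden and cost arithmetic for the Practitioner Survey (figures from Section A.12).
HIGH_FLOW_SITES, LOW_FLOW_SITES = 5, 20
PER_HIGH_FLOW_SITE = 15 + 60     # SBIRT + non-SBIRT practitioners per high-flow site
PER_LOW_FLOW_SITE = 5 + 30       # SBIRT + non-SBIRT practitioners per low-flow site
HOURS_PER_RESPONSE = 0.30        # ~18 minutes, from the pretest in Section B.4
HOURLY_WAGE = 32                 # BLS (2009) average for health-related occupations

respondents = (HIGH_FLOW_SITES * PER_HIGH_FLOW_SITE
               + LOW_FLOW_SITES * PER_LOW_FLOW_SITE)   # 375 + 700 = 1,075
burden_hours = respondents * HOURS_PER_RESPONSE        # 322.5 hours
respondent_cost = burden_hours * HOURLY_WAGE           # $10,320

print(respondents, burden_hours, respondent_cost)
```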


Exhibit 2. Data Collection Burden for Practitioner Survey

Instrument/Activity    Number of     Responses per   Hours per   Total Burden   Hourly   Total Respondent
                       Respondents   Respondent      Response    Hours          Wage     Cost (a)
Practitioner Survey    1,075         1               0.30        322.5          $32      $10,320

(a) Total respondent cost is calculated as hourly wage × time spent on survey × number of respondents.



13. Estimates of Annualized Cost Burden to Respondents


There are no respondent costs for capital or start-up or for operation or maintenance.


14. Estimates of Annualized Cost to the Government


The estimated cost to the government for the data collection is $629,175. This includes approximately $615,150 for a 5-year contract covering sampling, data collection, processing, and reports, plus approximately $2,805 per year in SAMHSA costs to manage and administer the survey (2 percent of one GS-15 employee’s time). The annualized cost is approximately $125,835.
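The following brief sketch reproduces the arithmetic behind these estimates; the dollar amounts are taken directly from the text, and the variable names are illustrative.

```python
# Government cost arithmetic for the data collection (figures from Section A.14).
contract_cost = 615_150          # 5-year contract: sampling, collection, processing, reports
samhsa_admin_per_year = 2_805    # 2% of one GS-15 employee's time, per year
years = 5

total_cost = contract_cost + samhsa_admin_per_year * years    # $629,175
annualized_cost = total_cost / years                          # $125,835
print(total_cost, annualized_cost)
```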


15. Changes in Burden


This is a new collection of information.


16. Time Schedule, Publications, and Analysis Plan


Time Schedule: Exhibit 3 outlines the key time points for the study and for the collection of information. The requested period also allows for training and start-up activities associated with the preparation for data collection.


Exhibit 3. Time Schedule for Entire Project

Activity                                                                   Time Schedule
Obtaining OMB approval for data collection                                 December 2010
Data collection                                                            3 months post OMB approval, for 15 months
Data analysis                                                              Beginning 18 months post OMB approval
Dissemination of findings (interim reports, manuscripts, final report)     Beginning 18 months post OMB approval, through 2014

Publications: The SBIRT Cross-Site Evaluation is designed to produce knowledge about the implementation and impact of SBIRT models. It is therefore important to prepare and disseminate reports, concept papers, documents, and oral presentations that clearly and concisely present project results so that they can be appreciated by both technical and nontechnical audiences. The SBIRT Cross-Site Evaluation Team will:


  • Produce rapid-turnaround analysis papers, briefs, and reports;

  • Prepare and submit monthly technical progress reports and a final SBIRT Cross-Site Evaluation Team report;

  • Prepare final cross-site findings report, including an executive summary;

  • Deliver presentations at professional and federally sponsored conventions and meetings; and

  • Disseminate reports and materials to entities inside and outside SAMHSA.


Analysis Plan: The analysis centers on specific evaluation questions found in Attachment 2. The analysis of the Practitioner Survey will be based primarily on descriptive statistics on service delivery unit type and practitioner characteristics and attitudes. Additional analyses will:


  • Correlate the results with patient screening and screen positive rates. 

  • Compare the results with other data collected on implementation success and economic efficiency. 

  • Use average practitioner characteristics as moderators in patient outcomes monitoring.


The basic approach will use both a case study design and a pooling of data. Attachment 6 is a table shell in which results of the analysis of practitioner outcomes may be reported.


Our primary analysis technique for patient outcomes based on GPRA data will be the generalized linear mixed model (GLMM), a flexible estimation technique that subsumes a variety of other techniques. We will use a GLMM of the following form to test the hypotheses associated with the evaluation. Boldface indicates vector notation.


Y_ij:l = f(β_0 + β_1 T_j + β_2 X_ij:l + γ M_i:l + δ S_k) + ε_ijk:l        (1)


Y_ij:l is the outcome (e.g., substance use) for person i observed at time j, nested within condition l (i.e., intervention or control); f(·) is a link function; and ε_ijk:l is an i.i.d. error, or residual. Specifying both f(·) and the distribution of ε_ijk:l yields models appropriate for a variety of outcomes. The βs are fixed-effect parameters to be estimated, and the γs and δs are random-effect parameters (i.e., variance components) to be estimated. T_j is a dichotomous variable indicating the jth time point, that is, whether the observation occurred before or after SBIRT. X_ij:l is a vector of demographic and other potential confounders. M_i:l is a vector of indicator variables for each individual. S_k is a vector of indicator variables for each of the k clinical sites. Given this specification of the fixed effects, β_1 captures the differential change in outcome Y_ij:l from baseline to the follow-up time point. We will estimate separate models for each follow-up time point, as well as a model that includes all follow-up time points. The interpretation of the magnitude of the estimated intervention effect depends on the link function used to estimate equation (1), which will be determined by the distribution of the outcome.
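For illustration only, the following is a minimal sketch of how a model of the form in equation (1) might be fit for a continuous outcome with an identity link using Python’s statsmodels package. The file name and column names (substance_use, post_sbirt, age, female, person_id, site_id) are hypothetical stand-ins for Y, T, X, M, and S rather than actual GPRA variables, and the evaluation team’s production specification and software may differ.

```python
# Minimal sketch of equation (1) for a continuous outcome (identity link),
# with a random intercept for site (S_k) and a person-level variance
# component (M_i:l) nested within site. Column names are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("gpra_analysis_file.csv")  # hypothetical analysis file

model = smf.mixedlm(
    "substance_use ~ post_sbirt + age + female",   # T_j and X_ij:l as fixed effects
    data=df,
    groups="site_id",                              # random-intercept grouping: site
    re_formula="1",
    vc_formula={"person": "0 + C(person_id)"},     # person-level variance component
)
result = model.fit()
print(result.summary())  # the post_sbirt coefficient corresponds to beta_1
```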


To assess potential moderating effects of the practitioner characteristics collected, we will also use the GLMM framework. To test the moderating effect of a specific factor on the effect of the intervention on outcomes, we will include interactions between the hypothesized moderator and the design variables in equation (1). Thus, for a moderator W_qk (average practitioner characteristic q in site k), we will estimate the following GLMM:

Y_ij:l = f(β_0 + β_1 T_j + β_2 W_qk + β_3 T_j W_qk + β_4 X_ij:l + γ M_i:l + δ S_k) + ε_ijk:l        (2)


Within the GLMM framework, W can be either continuous or dichotomous. To test the significance of the moderating effect of W_qk on the intervention effect, one simply tests the joint and individual significance of β_3.
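Continuing the hypothetical sketch above with the same illustrative column names, equation (2) can be expressed by adding the site-level moderator (here a stand-in column, practitioner_w, holding the average practitioner characteristic for the respondent’s site) and its interaction with the time indicator:

```python
# Sketch of equation (2): moderator W_qk and its interaction with T_j.
# All column names remain illustrative, not actual GPRA or survey variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("gpra_analysis_file.csv")  # hypothetical analysis file

moderated = smf.mixedlm(
    "substance_use ~ post_sbirt * practitioner_w + age + female",
    data=df,
    groups="site_id",
    re_formula="1",
    vc_formula={"person": "0 + C(person_id)"},
).fit()

# The post_sbirt:practitioner_w coefficient corresponds to beta_3 in equation (2).
print(moderated.summary())
```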


17. Display of Expiration Date


OMB approval expiration dates will be displayed.


18. Exceptions to Certification for Statement


There are no exceptions to the certification statement. The certifications are included in this submission.


B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Respondent Universe and Sampling Methods


In order to evaluate the success of SBIRT implementation at the site level, all practitioners at locations delivering SBIRT services are eligible to be surveyed. The types of SBIRT practitioners surveyed will include physicians, nurses and other medical staff, chemical dependency counselors, health educators, and other administrative staff involved in the delivery of services. Because evaluation team members will be traveling to selected SBIRT providers and coordinating with site administrators, there is an opportunity to conduct a census of all SBIRT practitioners at a given site with a minimal level of burden. In addition to this SBIRT practitioner census, a predetermined random sample of non-SBIRT practitioners will be surveyed during each site visit in order to form valid comparison samples. Non-SBIRT practitioners will be stratified into three broad groups, with random samples drawn from within each stratum: physicians, clinical (non-physician) staff, and administrative staff. Within each stratum of non-SBIRT practitioners, sample sizes will depend on the overall number of practitioners employed at the site. For sites with 20 or fewer non-SBIRT practitioners in a stratum, a full census of that group will be conducted. If there are 21 or more non-SBIRT practitioners in a stratum, a stratified random sample of 20 percent will be surveyed.
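As an illustration of the sampling rule just described, the following sketch applies the census-or-20-percent rule to a hypothetical non-SBIRT roster. The roster structure, stratum labels, and ID formats are assumptions made for the example; actual rosters will be provided by site administrators as described in Section B.2.

```python
# Sketch of the non-SBIRT sampling rule: census a stratum of 20 or fewer
# practitioners; otherwise draw a 20 percent simple random sample.
import math
import random

def select_non_sbirt_sample(roster_by_stratum, rate=0.20, cutoff=20, seed=0):
    """roster_by_stratum maps stratum name -> list of practitioner IDs."""
    rng = random.Random(seed)
    sample = {}
    for stratum, ids in roster_by_stratum.items():
        if len(ids) <= cutoff:
            sample[stratum] = list(ids)              # full census
        else:
            n = math.ceil(rate * len(ids))           # 20% sample, rounded up
            sample[stratum] = rng.sample(ids, n)
    return sample

# Hypothetical site roster
roster = {
    "physicians": [f"MD{i:03d}" for i in range(12)],               # census (<= 20)
    "clinical_nonphysician": [f"RN{i:03d}" for i in range(45)],    # 20% sample -> 9
    "administrative": [f"AD{i:03d}" for i in range(30)],           # 20% sample -> 6
}
print({stratum: len(ids) for stratum, ids in select_non_sbirt_sample(roster).items()})
```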


2. Information Collection Procedures


During the initial site visit preparation, individual site administrators will be contacted to inform them of the survey and to ask for their help informing practitioners of the survey’s intent. These administrators will also provide census rosters of all SBIRT and non-SBIRT practitioners to aid in the sampling and collection procedures. As the site visit approaches, these administrators will again be contacted in order to coordinate interviews with all those eligible to be surveyed. To protect the privacy of responses, the site administrators will not be informed of which practitioners eventually return surveys.


During the actual site visit, the SBIRT Cross-Site Evaluation Team will contact selected practitioners to complete paper-and-pencil versions of the survey, to be returned in sealed envelopes. These pre-paid envelopes will contain no information that uniquely identifies the respondent; the surveys distributed will be identified only by number. The SBIRT Cross-Site Evaluation Team will keep the names of sampled respondents in a separate, secured file. Team members will use the crosswalk between survey numbers and respondent names only to follow up with practitioners and encourage them to complete the survey. Practitioners who do not finish the survey during the visit will be encouraged to mail in their responses afterwards. We will stop contacting practitioners with reminders to return the survey six weeks after the end of the site visit, at which time we will destroy their identifying and contact information.



3. Methods to Maximize Response Rates


The SBIRT Cross-Site Evaluation Team expects an 80 percent or greater response rate on the Practitioner Survey. To maximize initial response rates, the team will follow protocols that have been used successfully on other projects to achieve greater than 80 percent response rates on similar surveys. The focus will be on reducing the burden on practitioners. The protocols include proper timing and location of survey administration to accommodate the practitioners. For most practitioners, survey staff will distribute the survey during a staff meeting or scheduled briefing to increase the overall response rate and decrease individual burden. The survey administration will also take place at the beginning of the site visit to allow ample time to follow up with all respondents.


During the site visits, the evaluation team will observe some SBIRT practitioners delivering services. For SBIRT practitioners selected for observation, the Practitioner Survey will be distributed at the beginning of the observation. This will not only increase the likelihood of response to the Practitioner Survey but also make it possible to link the data from the Practitioner Survey with the data collected as part of the observations, allowing additional analyses that connect practitioner perceptions with the provision of services. These practitioners will be given a uniquely numbered survey to complete at the beginning of the observation, with survey staff noting that number on the observation record. Names of respondents to the Practitioner Survey and practitioners participating in observations will be stored securely and separately to ensure the privacy of respondents.


All practitioners at the selected sites will be informed, in advance, of the motivation for and significance of the survey in order to encourage their participation. Finally, the brevity of the survey and the assurance of privacy will make completion more acceptable to practitioners.


4. Test of Procedures


The SBIRT Cross-Site Evaluation Team tested a paper-and-pencil version of the Practitioner Survey with eight respondents and found that it takes approximately 13 minutes to complete. In addition, reading the informed consent takes 5 minutes, for a total of 18 minutes.


The Practitioner Survey includes questions that attempt to assess barriers to implementation encountered by the practitioners and to gauge the effectiveness of the training they received. These measures were developed and used by Babor et al. (2005) in a comparable study of different implementation strategies for primary care screening and brief intervention programs for hazardous and harmful drinkers. The Practitioner Survey also includes an instrument developed by Panzano and Roth (2006) to measure an organization’s willingness to adopt new, innovative practices.


The Practitioner Survey also includes questions on demographics and training. Because one of the Cohort III grantees represents a Tribal organization, the team plans to collect race separately for Alaska Native and American Indian practitioners (see Attachment 2, Question A4).


5. Statistical Consultants


As noted in Section A.8, the SBIRT Cross-Site Evaluation Team has consulted extensively with an expert panel that has reviewed and approved all data collection and analysis methodologies outlined in this package. They will also continue to provide expert advice throughout the course of the program. In addition, several in-house experts will be consulted throughout the program on various statistical aspects of the design, methodological issues, economic analysis, database management, and data analysis. Exhibit 4 provides details of these advisors.



Exhibit 4. Senior Advisors

Expert

Affiliation

Contact Information

Jeremy W. Bray, PhD
Cross-Site Evaluation Director

Fellow, Health Economics
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Phone: 919-541-7003
Fax: 919-541-6683
E-mail:
[email protected]

Georgiy Bobashev, PhD
Advisor

Senior Research Statistician
Statistical Research Division
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Phone: 919-541-6161
Fax: 919-541-5966
E-mail:
[email protected]

Gary A. Zarkin, PhD
Advisor

Vice President
Behavioral Health and Criminal Justice Research Division
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Phone: 919-541-5858
Fax: 919-541-6683
E-mail:
[email protected]

James Nonnemaker, PhD
Advisor

Research Economist
Public Health Policy Research
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Phone: 919-541-7064
Fax: 919-541-6683
E-mail:
[email protected]

Jason Williams, PhD
Advisor

Research Psychologist
Risk Behavior and Family Research
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709

Phone: 919-541-6734
Fax: 919-485-5555
E-mail:
[email protected]

References


Arndt, S., Schultz, S. K., Turvey, C., & Petersen, A. (2002). Screening for alcoholism in the primary care setting: Are we talking to the right people? Journal of Family Practice, 51(1), 41–46.


Babor, T. F., Higgins-Biddle, J. C., Dauser, D., Burleson, J. A., Zarkin, G. A., & Bray, J. (2006). Brief intervention for at-risk drinking: Patient outcomes and cost-effectiveness in managed care organizations. Alcohol and Alcoholism, 41(6), 624–631.

Babor, T. F., McRee, B. G., Kassebaum, P. A., Grimaldi, P. L., Ahmed, K., & Bray, J. (2007). Screening, brief intervention, and referral to treatment (SBIRT): Toward a public health approach to the management of substance abuse. Substance Abuse, 28, 7–30.

Ballesteros, J., Duffy, J. C., Querejeta, I., Arino, J., & Gonzalez-Pinto, A. (2004). Efficacy of brief interventions for hazardous drinkers in primary care: Systematic review and meta-analyses. Alcoholism: Clinical and Experimental Research, 28(4), 608–618.

Beich, A., Thorsen, T., & Rollnick, S. (2003). Screening in brief intervention trials targeting excessive drinkers in general practice: Systematic review and meta-analysis. British Medical Journal, 327(7414), 536–542.

Bernstein, E., Bernstein, J., Tassiopoulos, K., Heeren, T., Levenson, S., & Hingson, R. (2005). Brief intervention at a clinic visit reduced cocaine and heroin use. Drug Alcohol Dependence, 77, 49–59.

Boudreaux, E. D., Bedek, K. D., Gilles, D., Baumann, B. M., Hollenberg, S., Lord, S. A., & Grisson, G. (2009). The dynamic assessment and referral system for substance abuse (DARSSA): Development, functionality, and end-user satisfaction. Drug and Alcohol Dependence, 99, 37–46.

Bradley, K. A., DeBenedetti, A. F., Volk, R. J., Williams, E. C., Frank, D. & Kivlahan, D. R. (2007). AUDIT-C as a brief screen for alcohol misuse in primary care. Alcoholism: Clinical and Experimental Research, 31(7), 1208–1217.

Church, O. M., & Babor, T. F. (1995). Barriers and breakthroughs: Substance abuse curricula in nursing education. Journal of Nursing Education, 34(6), 278–281.

Copeland, J., & Martin, G. (2004). Web-based interventions for substance use disorders: A qualitative review. Journal of Substance Abuse Treatment, 26, 109–116.

D’Onofrio, G., & Degutis, L. C. (2002). Preventive care in the emergency department: Screening and brief intervention for alcohol problems in the emergency department: A systematic review. Academic Emergency Medicine, 9, 627–638.

Fleming, M. F. (2002). Screening, assessment, and intervention for substance use disorders in settings. In Strategic plan for interdisciplinary faculty development: Arming the Nation’s health professional workforce for a new approach to substance use disorders. Providence, RI: Association for Medical Education and Research in Substance Abuse (AMERSA). www.projectmainstream.net/mainstream/supportdata/part1.pdf

Fleming, M. F., Balousek, S. L., Klessig, C. L., Mundt, M. P., & Brown, D. D. (2007). Substance use disorders in a primary care sample receiving daily opioid therapy. The Journal of Pain, 8(7), 573–582.

Gentilello, L. M., Donovan, D. M., Dunn, C. W., & Rivara, F. P. (1999). Alcohol interventions in a trauma center as a means of reducing the risk of injury recurrence. Annals of Surgery, 230, 1–18.

Hungerford, D. W., & Pollock, D. A. (2003). Emergency department services for patients with alcohol problems: Research directions. Academy of Emergency Medicine, 10(1), 79–84.

Klitzner, M., Fisher, D., Stewart, K., & Gilbert, S. (1992). Early intervention for adolescents. Princeton, NJ: Robert Wood Johnson Foundation.

Kraemer, K. L. (2007). The cost-effectiveness and cost-benefit of screening and brief intervention for unhealthy alcohol use in medical settings. Substance Abuse, 28(3), 67–77.

Madras, B. K., Compton, W. M., Avula, D., Stegbauer, T., Stein, J. B., & Clark, H. W. (2009). Screening, brief interventions, referral to treatment (SBIRT) for illicit drug and alcohol use at multiple healthcare sites: Comparison at intake and 6 months later. Drug and Alcohol Dependence, 99(1–3), 280–295.

Mauch, D., Kautz, C., & Smith, S. A. (2008). Reimbursement of mental health services in primary care settings (HHS Pub No. SMA-08-4324). Rockville, MD: Center for Mental Health Services, Substance Abuse and Mental Health Services Administration.

McRee, B. (2009, November). Implementing effective SBIRT programs: Lessons learned from practice-based initiatives. Presentation delivered at the 2009 SBIRT Grantee Meeting, Bethesda, MD.

Modesto-Lowe, V., & Boornazian, A. (2000). Screening and brief intervention in the management of early problem drinkers: Integration into healthcare settings. Disease Management & Health Outcomes, 8(3), 129–137.

Moyer, A., Finney, J., Swearingen, C., & Vergun, P. (2002). Brief interventions for alcohol problems: A meta-analytic review of controlled investigations in treatment-seeking and non-treatment-seeking populations. Addiction, 97, 279–292.

Mrazek, P. J., & Haggerty, R. J. (Eds.). (1994). Reducing risks for mental disorders: Frontiers for preventive intervention research. Washington, DC: National Academy Press.

Office of National Drug Control Policy (ONDCP). (2007). National drug control strategy, 2007 (NCJ Publication No. 216431). Washington, DC: Office of National Drug Control Policy. http://www.whitehousedrugpolicy.gov/policy/ndcs.html.

Roche, A. M., & Freeman, T. (2004). Brief interventions: Good in theory but weak in practice. Drug Alcohol Review, 23(1), 11–18.

Saitz, R., Horton, N. J., Sullivan, L. M., Moskowitz, M. A., & Samet, J. H. (2003). Addressing alcohol problems in primary care: A cluster randomized, controlled trial of a systems intervention. The screening and intervention in primary care (SIP) study. Annals of Internal Medicine, 138(5), 372–382.

Soderstrom, C. A., DiClemente, C. C., Dischinger, P. C., Hebel, J. R., McDuff, D. R., Auman, K. M., & Kufera, J. A. (2007). A controlled trial of brief intervention versus brief advice for at-risk drinking trauma center patients. Journal of Trauma, 62, 1102–1111, discussion 1111–1112.

Stephens, R. S., Roffman, R. A., Fearer, S. A., Williams, C., & Burke, R. S. (2007). The marijuana check-up: Promoting change in ambivalent marijuana users. Addiction, 102(6), 947–957.

Vinson, D. C., Galliher, J. M., Reidinger, C., & Kappus, J. A. (2003). Comfortably engaging: Which approach to alcohol use should we use? Annals of Family Medicine, 2, 398–404.

U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion. Healthy People 2020: The road ahead. Retrieved March 22, 2010, from http://www.healthypeople.gov/HP2020/objectives/ViewObjective.aspx?Id=595&TopicArea=Substance+Abuse&Objective=SA+HP2020%e2%80%936&TopicAreaId=46

Zahradnik, A., Otto, C., Crackau, B., Lohrmann, I., Bischof, G., John, U., & Rumpf, H. (2009). Randomized controlled trial of brief intervention for problematic prescription drug use in non-treatment-seeking patients. Addiction, 104, 109–117.

ATTACHMENTS



Attachment 1: Evaluation Questions

Attachment 2: Practitioner Survey

Attachment 3: Network Security at RTI International

Attachment 4: Privacy Pledge

Attachment 5: Practitioner Informed Consent

Attachment 6: Table Shell- Descriptive Results, Practitioner Outcomes








