To evaluate the success of SBIRT implementation at the organizational level, a web-based survey will be administered to staff at sites where SBIRT services are delivered, referred to as performance sites. The Performance Site Survey will target individuals who directly provide SBIRT services and staff who regularly interact with SBIRT providers and patients, including intake staff, medical providers, behavioral health providers, social workers, and the managerial and administrative staff who oversee them. Because cross-site evaluation team members will travel to selected SBIRT providers and coordinate with state and site administrators on a yearly basis, there is an opportunity to complete a near-census of all SBIRT-related staff at performance sites with minimal burden.
Individual site administrators will be contacted approximately 8 weeks before the planned distribution to inform them of the survey, to request a roster of SBIRT staff, and to ask for their help in informing staff about the survey's intent. Administrators will be contacted again just before distribution to coordinate and confirm delivery of the survey to all eligible staff. To protect the privacy of responses, site administrators will not be told which staff ultimately participate in the survey.
If the web survey is not feasible, the cross-site evaluation team will administer a paper-and-pencil version to be returned in sealed, pre-paid envelopes. Paper-based distribution will follow a process negotiated with the site administrator and could occur during a site visit or a regular staff meeting. The envelopes will contain no information that uniquely identifies the respondent, and each survey will be identified only by number. The SBIRT cross-site evaluation team will keep respondents' names in a separate, secured file and will use the crosswalk between survey numbers and names only to follow up with staff and encourage them to complete the survey. Reminders to return the survey will stop within 4 weeks of the survey launch.
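The Python sketch below illustrates this numbering protocol: responses are keyed only by a non-identifying survey number, while the number-to-name crosswalk is written to a separate, secured file used solely for nonresponse follow-up. It is a hypothetical illustration under assumed file names and roster format, not the evaluation team's actual system.

```python
# Minimal sketch (hypothetical, not the evaluation team's actual system) of the
# numbered-survey privacy protocol: the distributed surveys carry numbers only,
# and the crosswalk linking numbers to names lives in a separate secured file.
import csv
import secrets

def assign_survey_numbers(roster_names):
    """Assign a random, non-identifying survey number to each staff member."""
    crosswalk = {}
    for name in roster_names:
        number = secrets.token_hex(4)  # e.g., 'a3f29c01'; carries no PII
        crosswalk[number] = name
    return crosswalk

def write_files(crosswalk):
    # The crosswalk stays in a secured, access-restricted location and is used
    # only to remind nonrespondents to complete the survey.
    with open("crosswalk_SECURED.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["survey_number", "respondent_name"])
        for number, name in crosswalk.items():
            writer.writerow([number, name])
    # The file sent for distribution carries survey numbers only.
    with open("survey_numbers.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["survey_number"])
        for number in crosswalk:
            writer.writerow([number])

if __name__ == "__main__":
    write_files(assign_survey_numbers(["A. Staff", "B. Staff"]))
```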
The SBIRT cross-site evaluation team expects a response rate of 80 percent or greater on the Performance Site Survey. To achieve this, the team will follow protocols that have produced comparable response rates on similar surveys in other projects, with a focus on reducing the burden on staff. The web survey will be compatible with any device capable of web browsing, allowing staff to complete it at their convenience without interrupting patient interactions.
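As a rough illustration of how progress toward the 80 percent target and the 4-week reminder window described above could be tracked, the following Python sketch flags nonrespondents for follow-up. The data structures and field names are assumptions for illustration, not the team's actual tooling.

```python
# Minimal sketch (an assumption about implementation, not the team's actual
# tooling) of monitoring the response rate against the 80 percent target and
# flagging nonrespondents while the 4-week reminder window remains open.
from datetime import date, timedelta

TARGET_RATE = 0.80
REMINDER_WINDOW = timedelta(weeks=4)

def survey_status(eligible, completed_numbers, launch, today):
    """Summarize completion against the target and list numbers to remind."""
    rate = len(completed_numbers) / len(eligible)
    reminders_open = today - launch <= REMINDER_WINDOW
    to_remind = sorted(set(eligible) - set(completed_numbers)) if reminders_open else []
    return {"response_rate": rate,
            "target_met": rate >= TARGET_RATE,
            "numbers_to_remind": to_remind}

# Example: 4 of 5 eligible staff have responded 18 days after launch.
print(survey_status(["001", "002", "003", "004", "005"],
                    ["001", "003", "004", "005"],
                    launch=date(2015, 3, 2), today=date(2015, 3, 20)))
```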
For the paper survey, these protocols include scheduling the time and location of survey administration to accommodate practitioners. Survey staff will work with site administrators to distribute the survey at a staff meeting or scheduled briefing, which increases the overall response rate while decreasing individual burden.
As timing and scheduling permit, the survey launch will coincide with a site visit to increase the likelihood of response. Survey staff will work with key staff at these sites at least 3 months before the survey to lay the groundwork for seamless data collection and to obtain buy-in from staff who can act as local champions. All staff at these performance sites will be informed in advance of the purpose and significance of the survey to encourage their participation. Finally, the efficiency of the survey and the assurance of privacy will make completion more acceptable to performance site staff.
The cross-site evaluation team tested a paper-and-pencil version of the Performance Site Survey with 17 respondents and found that it takes approximately 13 minutes to complete.
The 78-question web survey collects basic demographic information and includes questions about the organization's readiness to implement SBIRT and about the use of health information technology (HIT) to deliver SBIRT services. The demographic questions were tailored from a previous cross-site evaluation survey to fit the current set of cross-site grantees. The organizational readiness questions were developed through a review of the implementation science literature (e.g., Chaudoir, Dugan, & Barr, 2013; Damschroder et al., 2009; Garner, 2009; Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004; Weiner, Lewis, & Linnan, 2009; Weiner, Belden, Bergmire, & Johnston, 2011). Based on this review, the Organizational Readiness for Implementing Change measure (Shea, Jacobs, Esserman, Bruce, & Weiner, 2014) and the Implementation Climate Scale (Jacobs, Weiner, & Bunger, 2014) were identified as the two most appropriate instruments. In addition to questions from these two instruments, the survey includes questions assessing satisfaction, capacity, and infrastructure to implement screening, brief intervention, brief treatment, and referral to treatment.
To identify relevant HIT measures, the cross-site evaluation team modified measures from socio-technical frameworks (Kling, 1980), including the DeLone and McLean framework (DeLone & McLean, 2004), the Public Health Informatics Institute framework (PHII, 2005), and the Human, Organization and Technology (HOT)-Fit framework (Yusof et al., 2008). Across these three frameworks, the survey captures measures of system availability, information availability, organizational structure and environment, utilization, and user satisfaction.
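As a simple illustration of how scores on these readiness and HIT domains might be computed, the Python sketch below averages Likert-type item responses within each domain. The item names, domain groupings, and mean-score rule are assumptions for illustration only; actual scoring follows the source instruments cited above.

```python
# Minimal sketch, assuming (hypothetically) that each readiness and HIT domain
# is measured with Likert-type items scored 1-5 and summarized as an item mean.
# The actual ORIC and Implementation Climate Scale scoring rules are defined by
# their source publications.
from statistics import mean

# Hypothetical item-to-domain map; real item names come from the instrument.
DOMAINS = {
    "change_commitment": ["oric_1", "oric_2", "oric_3"],
    "system_availability": ["hit_1", "hit_2"],
    "user_satisfaction": ["hit_3", "hit_4"],
}

def score_respondent(responses):
    """Return a domain -> mean item score dict, skipping unanswered items."""
    scores = {}
    for domain, items in DOMAINS.items():
        answered = [responses[i] for i in items if responses.get(i) is not None]
        scores[domain] = mean(answered) if answered else None
    return scores

# Example respondent with one unanswered HIT item.
print(score_respondent({"oric_1": 4, "oric_2": 5, "oric_3": 4,
                        "hit_1": 3, "hit_2": None, "hit_3": 2, "hit_4": 3}))
```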
As noted in Section A.8, the SBIRT cross-site evaluation team has consulted extensively with an expert panel that has reviewed and approved all data collection and analysis methodologies outlined in this package. Panel members will also continue to provide expert advice throughout the course of the program. In addition, several in-house experts will be consulted throughout the program on statistical aspects of the design, methodological issues, economic analysis, database management, and data analysis (Exhibit 5).
Exhibit 5. Expert Consultants

Expert | Affiliation | Contact Information
Jeremy W. Bray, PhD | Professor and Department Head, Department of Economics, Bryan School of Business and Economics, 462 Bryan Building, PO Box 26170, UNCG, Greensboro, NC 27402-6170 | Phone: 336-334-3910
Bryan Garner, PhD | Senior Implementation Research Scientist | Phone: 919-597-5159
Gary A. Zarkin, PhD | Vice President | Phone: 919-541-5858
Laura Dunlap, PhD | Director | Phone: 919-541-7310
Antonio Morgan-Lopez, PhD | Senior Research Quantitative Psychologist | Phone: 919-316-3436
Sarah Ndiangui, SAMHSA Contracting Officer's Representative (COR) | Public Health Advisor, SAMHSA CSAT, 1 Choke Cherry Rd, Rockville, MD 20857 | Phone: 240-276-2918; E-mail: [email protected]
Guileine Kraft, PhD, LCSW-C, SAMHSA Alternate COR | Public Health Advisor, SAMHSA CSAT, 1 Choke Cherry Rd, Rockville, MD 20857 | Phone: 240-276-2915; E-mail: [email protected]
REFERENCES

Amaral, M. B., Ronzani, T. M., & Souza-Formigoni, M. L. (2010). Process evaluation of the implementation of a screening and brief intervention program for alcohol risk in primary health care: An experience in Brazil. Drug and Alcohol Review, 29(2), 162–168.
Berends, L., MacLean, S., Hunter, B., Mugavin, J., & Carswell, S. (2011). Implementing alcohol and other drug interventions effectively: How does location matter? Australian Journal of Rural Health, 19, 211–217.
Bernstein, E., Edwards, E., Dorfman, D., Heeren, T., Bliss, C., & Bernstein, J. (2009). Screening and brief intervention to reduce marijuana use among youth and young adults in a pediatric emergency department. Academic Emergency Medicine, 16(11), 1174–1185.
Chaudoir, S. R., Dugan, A. G., & Barr, C. H. (2013). Measuring factors affecting implementation of health innovations: A systematic review of structural, organizational, provider, patient, and innovation level measures. Implementation Science, 8(1), 22.
Chen, I. J., & Popovich, K. (2003). Understanding customer relationship management (CRM): People, process and technology. Business Process Management Journal, 9(5), 672–688.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50.
DeLone, W. H., & McLean, E. R. (2004). Measuring e-commerce success: Applying the DeLone & McLean information systems success model. International Journal of Electronic Commerce, 9(1), 31–47.
Fiore, M. C., Bailey, W. C., Cohen, S. J., Dorfman, S. F., Goldstein, M. G., Gritz, E. R., et al. (2000). Treating tobacco use and dependence. Clinical practice guideline. Rockville, MD: U.S. Department of Health and Human Services, Public Health Service.
Garner, B. R. (2009). Research on the diffusion of evidence-based treatments within substance abuse treatment: A systematic review. Journal of Substance Abuse Treatment, 36(4), 376–399.
Gassman, R. A. (2003). Medical specialization, profession, and mediating beliefs that predict stated likelihood of alcohol screening and brief intervention: Targeting educational interventions. Substance Abuse, 24(3), 141–156.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic literature review and recommendations for future research. The Milbank Quarterly, 82(4), 581–629.
Jacobs, S. R., Weiner, B. J., & Bunger, A. C. (2014). Context matters: Measuring implementation climate among individuals and groups. Implementation Science, 9, 46.
Jonas, D. E., Garbutt, J. C., Amick, H. R., Brown, J. M., Brownley, K. A., Council, C. L., et al. (2012). Behavioral counseling after screening for alcohol misuse in primary care: A systematic review and meta-analysis for the U.S. Preventive Services Task Force. Annals of Internal Medicine, 157(9), 645–654.
Katon, W. J. (2003). Clinical and health services relationships between major depression, depressive symptoms, and general medical illness. Biological Psychiatry, 54(3), 216–226.
Kling, R. (1980). Social analyses of computing: Theoretical perspectives in recent empirical research. ACM Computing Surveys, 12(1), 61–110.
MacLean, S., Berends, L., Hunter, B., Roberts, B., & Mugavin, J. (2012). Factors that enable and hinder the implementation of projects in the alcohol and other drug field. Australian and New Zealand Journal of Public Health, 36(1), 61–68.
Public Health Informatics Institute (PHII). (2005). Toward measuring value. Retrieved from https://phii.org/sites/default/files/resource/pdfs/TowardsMeasuringValueBrief.pdf
Prince, M., Patel, V., Saxena, S., Maj, M., Maselko, J., Phillips, M. R., & Rahman, A. (2007). No health without mental health. Lancet, 370(9590), 859–877.
Robson, D., & Gray, R. (2007). Serious mental illness and physical health problems: A discussion paper. International Journal of Nursing Studies, 44(3), 457–466.
Shaw, E. K., Howard, J., West, D. R., Crabtree, B. F., Nease, D. E., Tutt, B., & Nutting, P. A. (2012). The role of the champion in primary care change efforts. Journal of the American Board of Family Medicine, 25(5), 676–685.
Shea, C. M., Jacobs, S. R., Esserman, D. A., Bruce, K., & Weiner, B. J. (2014). Organizational readiness for implementing change: A psychometric assessment of a new measure. Implementation Science, 9(7), 1–15.
Solberg, L. I., Maciosek, M. V., & Edwards, N. M. (2008). Primary care intervention to reduce alcohol misuse: Ranking its health impact and cost effectiveness. American Journal of Preventive Medicine, 34(2), 143–152.e3.
Substance Abuse and Mental Health Services Administration (SAMHSA). (2009). Screening, Brief Intervention, and Referral to Treatment. Retrieved from http://sbirt.samhsa.gov/index.htm
Substance Abuse and Mental Health Services Administration (SAMHSA). (2013). Results from the 2012 National Survey on Drug Use and Health: Summary of national findings. NSDUH Series H-46, HHS Publication No. (SMA) 13-4795. Rockville, MD: SAMHSA.
Substance Abuse and Mental Health Services Administration (SAMHSA). (2014). Substance Abuse and Mental Health Data Archive. Retrieved May 31, 2014, from http://www.icpsr.umich.edu/icpsrweb/SAMHDA
Weiner, B. J., Lewis, M. A., & Linnan, L. A. (2009). Using organization theory to understand the determinants of effective implementation of worksite health promotion programs. Health Education Research, 24(2), 292–305.
Weiner, B. J., Belden, C. M., Bergmire, D. M., & Johnston, M. (2011). The meaning and measurement of implementation climate. Implementation Science, 6(78), 1–12.
World Health Organization (WHO). (2008). The effectiveness of a brief intervention for illicit drugs linked to the Alcohol, Smoking, and Substance Involvement Screening Test (ASSIST) in primary health care settings: A technical report of phase III findings of the WHO ASSIST randomized controlled trial. Retrieved October 7, 2011, from http://www.who.int/substance_abuse/activities/assist_technicalreport_phase3_final.pdf
Yoast, R. A., Wilford, B. B., & Hayashi, S. W. (2008). Encouraging physicians to screen for and intervene in substance use disorders: Obstacles and strategies for change. Journal of Addictive Diseases, 27(3), 77–97.
Yusof, M. M., Kuljis, J., Papazafeiropoulou, A., & Stergioulas, L. K. (2008). An evaluation framework for health information systems: Human, organization and technology-fit factors (HOT-fit). International Journal of Medical Informatics, 77(6), 386–398.
ATTACHMENTS
Attachment 1: Performance site survey
Attachment 2: Network security at RTI International
Attachment 3: Privacy pledge
Attachment 4: Performance site survey consent script
Attachment 5: Table shells for descriptive results