Supporting Statement A
Check off which applies:
☐ New
☐ Revision
☒ Reinstatement with Change
☐ Reinstatement without Change
☐ Extension
☐ Emergency
☐ Existing
Justification
The Substance Abuse and Mental Health Services Administration’s (SAMHSA’s) Division of Service and Systems Improvement of the Center for Mental Health Services (CMHS) is requesting Office of Management and Budget (OMB) approval for the reinstatement of data collection associated with the previously approved cross-site evaluation of the Garrett Lee Smith (GLS) State/Tribal Youth Suicide Prevention and Early Intervention Program (short title: GLS State/Tribal Program). Passed by Congress in 2004, the Garrett Lee Smith Memorial Act (GLSMA) was the first legislation to provide funding for States, Tribes, and institutions of higher education to develop, improve, and evaluate early intervention and suicide prevention programs. The previously approved cross-site evaluation included the two grant programs funded under the GLSMA: the GLS State/Tribal Program, which focuses on youth suicide prevention in States and Tribes, and the GLS Campus Suicide Prevention Program, which focuses only on institutions of higher education. The reinstatement of this data collection is only for the GLS State/Tribal Program. In addition to providing programmatic funding, the GLSMA mandates that the effectiveness of the GLS State/Tribal Program be evaluated through both cross-site and local evaluation and reported to Congress. Per this mandate, the cross-site evaluation of the GLS State/Tribal Program was conceptualized in 2005 and was implemented until 2019. The evaluation was significantly redesigned in 2015 to better meet the needs of program stakeholders, and a new data collection package was approved by OMB on April 1, 2016. SAMHSA is proposing to update the evaluation approach in order to grow the body of information required to demonstrate ongoing reduction in suicide mortality and to continue the evaluation work previously in progress.
Informed by its fourteen-year history of conducting cross-site evaluations of the GLS State/Tribal Program and taking into account the evolution of the program over time, SAMHSA is continuing to refine and enhance the ongoing evaluation of the implementation, outcomes, and impacts of the GLS State/Tribal Program.
This proposed reinstatement of the previously approved evaluation (OMB No. 0930-0286; Expiration, March 31, 2019) builds on previously published GLS evaluation findings on proximal and distal training outcomes and aggregate findings from program activities (e.g., Condron, Godoy-Garraza, Walrath, McKeon, & Heilbron, 2014; Walrath, Godoy-Garraza, Reid, Goldston, & McKeon, 2015; Godoy-Garraza, Walrath, Kuiper, Goldston, & McKeon, 2018; Condron, Godoy-Garraza, Kuiper, Sukumar, Walrath, & McKeon, 2018; Godoy-Garraza, Kuiper, Goldston, McKeon, & Walrath, 2019; Godoy-Garraza, Kuiper, Cross, Hicks, & Walrath, 2020; Goldston & Walrath, 2023). The current design reflects SAMHSA’s desire to assess the implementation, outcomes, and impacts of the GLS State/Tribal Program. As such, the GLS State/Tribal Evaluation is designed to address the field’s need for additional evidence on the impacts of the GLS State/Tribal Program in four areas:
Reduction in suicide morbidity and mortality
Suicide prevention training effectiveness
Youth experiences of the services and supports received
Continuum of care effectiveness in connecting youths to treatment services and supports
Approval is being requested for data collection associated with four revised instruments and five new instruments. These include web-based surveys, inventories, and forms; a telephone simulation using a standardized patient interaction; and abstractions/submissions of existing data. Due to changes in evaluation design and the fulfillment of data collection requirements, approval for removal of three instruments is also requested in order to minimize burden.
Suicide continues to be a major public health problem in the United States, particularly with respect to youths and young adults. Childhood mental health concerns and suicide rates rose steadily from 2010 to 2020, and by 2018, suicide was the second leading cause of death for youths 10–24 years of age (American Academy of Pediatrics [AAP], 2021). The COVID-19 pandemic has intensified this crisis. Across the country, emergency department (ED) visits have increased for all mental health emergencies, including suspected suicide attempts (Yard et al., 2021). Beginning in April 2020, the proportion of children’s mental health–related ED visits among all pediatric ED visits increased and remained elevated through October 2020. Compared with 2019, the proportion of mental health–related visits for children aged 5–11 and 12–17 years increased approximately 24% and 31%, respectively (Leeb et al., 2020). An analysis of private healthcare claims data indicates that use of services among youths related to intentional self-harm, overdoses, substance use disorders, and mental health conditions has also increased since the start of the pandemic, underscoring the toll the pandemic has taken on youth behavioral health (FAIR Health, 2021). The pandemic has also added to challenges already faced by America’s youth and young adults. It disrupted in-person schooling; in-person social opportunities with peers and mentors; access to healthcare, social services, food, and housing; and the health of caregivers (Wachino et al., 2021). One study estimated that more than 140,000 children in this country have lost a primary and/or secondary caregiver, with youths of color disproportionately impacted (Hillis et al., 2021).
As has been widely documented, the prevalence of mental health disorders among America’s children and youth was high even before the pandemic. Recent findings indicate that 20% of all children have an identified mental health condition each year, while 40% of all children will meet criteria for a mental health condition by age 18 (Bitsko et al., 2022). In addition, children living in poverty and children from racial and ethnic minority populations fare worse than their peers with respect to access to care, identifiable risk factors, and prevalence of certain mental health conditions. Despite high rates of mental health conditions, Bitsko et al. (2022) also report low rates of treatment: about 11.4% annually for White, 9.8% for Black, and 8.7% for Latino children. The pandemic also made youth access to mental healthcare more difficult, with providers and the mental health system as a whole operating beyond capacity (Wachino et al., 2021). While risk factors for suicide are multifaceted, including, for instance, substance use, experience of childhood trauma, and stigma against help-seeking, barriers to accessing mental healthcare are among the risks (World Health Organization, n.d.).
Suicidal behaviors among high school students also increased during the decade preceding COVID-19, with 19% seriously considering attempting suicide—a 36% increase from 2009 to 2019—and about 16% having made a suicide plan in the prior year—a 44% increase from 2009 to 2019 (U.S. Surgeon General, 2021). Between 2007 and 2018, suicide rates among U.S. youths aged 10–24 years increased by 57%, and estimates show more than 6,600 suicide deaths among this age group in 2020 (U.S. Surgeon General, 2021). The pandemic’s negative impacts most heavily affected individuals already vulnerable to suicide, such as youths with disabilities; racial and ethnic minorities; lesbian, gay, bisexual, transgender, and questioning (LGBTQ+) youths; low-income youths; youths in rural areas; youths in immigrant households; youths involved with the child welfare or juvenile justice systems; and homeless youths (U.S. Surgeon General, 2021).
GLS State/Tribal Youth Suicide Prevention and Early Intervention Program
Administered by SAMHSA and authorized by the GLSMA, the GLS State/Tribal Program has been devoted to suicide prevention for youths and young adults up to 24 years of age. Suicide prevention activities supported by GLS State/Tribal grantees have included education and training programs (including gatekeeper training), screening activities, enhancement of infrastructure for improved linkages to services, crisis hotlines, and community partnerships. During Fiscal Years (FYs) 2005–2019, SAMHSA awarded 230 state and tribal grants to 50 states, two U.S. territories, and 58 tribes. A total of 295 campus grants were also awarded to 279 colleges and universities in 49 states, two U.S. territories, and the District of Columbia (SAMHSA, 2019). SAMHSA funds multiple grantees at a time in groups that are called cohorts.
Going forward, the purpose of the program is to support states and tribes in the implementation of youth suicide prevention and early intervention strategies in schools, educational institutions, juvenile justice systems, substance use programs, mental health programs, foster care systems, and other child and youth-serving organizations. The overall focus is on increasing the number of youth-serving organizations that can identify and work with youths at risk of suicide; increasing the capacity of clinical service providers to assess, manage, and treat youths at risk of suicide; and improving the continuity of care and follow-up of youths identified to be at risk for suicide, including those who have been discharged from EDs and inpatient psychiatric units. Grantees are encouraged to reach populations most in need and to address racial, ethnic, sexual orientation, and military family/veteran behavioral health disparities with culturally appropriate prevention and intervention strategies. Compared with prior years of the program, new allowable activities include, for example, an upstream focus on the social determinants of health and behavioral health disparities. Newly required activities include a focus on lethal means restriction and integration of those with lived experience (suicide and loss survivors). As in prior cohorts, the goal of the GLS State/Tribal Program is the reduction of suicide and suicide attempts in youths across America.
In partnership with GLS grantees, ICF (a government contractor) provides training and technical assistance (TA) regarding data collection and research design for the evaluation. In addition, ICF directly collects data, receives data from grantee data collection efforts, monitors data quality, and provides feedback to grantees.
Section 520E(g) of the GLSMA mandates a cross-site evaluation to be conducted concerning the effectiveness of the activities carried out under the GLS State/Tribal Program. The GLSMA specifies that a report to Congress must be submitted “…to analyze the effectiveness and efficacy of the activities conducted with grants, collaborations, and consultations under [Section 520E].”
Purpose and Use of Information Collected
The previously approved evaluation (OMB No. 0930-0286; Expiration, March 31, 2019) is the 2015 redesign of the original cross-site evaluation of the GLS State/Tribal Program that was first implemented in 2005. The original cross-site evaluation consisted of four stages of information gathering that targeted funded program activity areas: context stage, product stage, process stage, and impact stage. The aim of the context stage was to gain an understanding of grantees’ program plans, such as target population, target region, service delivery mechanisms, service delivery setting, types of program activities to be funded, evaluation activities, existing data sources, and availability of data elements to support the cross-site evaluation. The product stage described the development and utilization of prevention strategies at each GLS grantee site. The process stage assessed progress on key activities related to implementation of grantee programs, such as the types of training conducted and roles of participants. Finally, the impact stage examined the early impacts that suicide prevention programs have on individuals at risk for suicide.
Building on the earlier findings from the original cross-site evaluation, the current GLS State/Tribal Evaluation continues to focus on priority areas of inquiry important to SAMHSA, Congress, and other suicide prevention stakeholders. The evaluation aligns with SAMHSA’s primary aim to assess the impact of the GLS State/Tribal program activities with respect to reducing suicide attempts and deaths by suicide. The GLS State/Tribal Evaluation will allow SAMHSA to expand the evidence base for suicide prevention; address factors contributing to suicide deaths and attempts; and establish standards for developing, implementing, and evaluating suicide prevention programs.
The purpose of the GLS State/Tribal Evaluation is to build the program’s knowledge base by expanding on information gathered through the prior evaluation related to the process, products, context, and impacts of the State/Tribal program. The GLS State/Tribal Evaluation is designed to gather detailed outcome and impact data to provide SAMHSA with the data and information needed to understand what works, why it works, and under what conditions, relative to program activities.
The GLS State/Tribal Evaluation incorporates three areas of evaluation to provide a robust understanding of the implementation, outcomes, and impacts of the GLS State/Tribal Program. A behavioral health equity and cultural equity lens will be applied to each area of evaluation to ensure a culturally specific understanding of intervention implementation, outcomes, and impacts.
The Implementation Evaluation inventories the array of strategies and services implemented by grantees and answers questions about the extent to which grantees are implementing required and allowed prevention strategies and services, including related settings, populations, and degree of fidelity to their work plan.
The Outcome Evaluation includes three studies related to trainings, youths’ experience of services, and the continuity of care for at-risk youths—i.e., the Training Outcomes Study, Youth Experience, Outcomes, and Resiliency Study (Youth Study), and Continuity of Care Study. These studies will provide a deeper examination of the effectiveness of these strategies as they relate to the long-term gains in trainee skills to identify and manage youths at risk for suicide; youths’ perspectives, including an assessment of how youths experience services and supports, and facets that encourage building resilience, stress tolerance, and self-management skills; and the effectiveness of a continuum of care that connects youths to treatment services, supports, and post-discharge follow-up.
Finally, the Impact Evaluation will combine data from the Implementation and Outcome Evaluations to assess the effectiveness of the GLS State/Tribal Program in decreasing suicide morbidity and mortality. Through implementation of this evaluation design, it will be possible to isolate prevention strategy impacts and explain cross-program impacts on short-term outcomes (e.g., change in self-efficacy to identify at-risk youths, change in the number of youths identified as at risk) and long-term program outcomes such as suicide attempts and deaths by suicide.
The proposed multimethod design approach, including cohort and quasi-experimental case-control studies, also meets the legislative requirements outlined in GLSMA to inform performance and implementation of programs. The design considers allowable and required activities, variation in the partnerships and provider networks/infrastructure, program settings, populations being served, the range of program implementation plans and goals, existing data systems, and grant infrastructures that support implementation and evaluation participation. In addition, the design considers the stage of implementation of currently funded and to-be-funded cohorts of grantees to seamlessly integrate cohorts appropriately into the evaluation studies.
Over 14 years, from 2005 to 2019, the prior evaluation of the GLS State/Tribal Program generated the largest repository of youth suicide prevention data in the United States and has been essential in helping communities and decision makers at all levels of government improve suicide prevention effectiveness. Evaluation activity has provided continuous documentation of the context in which funded suicide prevention activities were implemented, the range of prevention services and activities supported through grant funding, and the impact of the program on youth suicide morbidity and mortality. Through participation in the evaluation, GLS State/Tribal Program grantees have generated data regarding the nature and extent of suicide prevention activities across the United States, including the number of individuals affected by grantee activities—such as youths screened or gatekeepers trained—and proximal outcomes of such efforts, including increased knowledge or awareness of the signs and symptoms of suicide risk and numbers of youths identified as at risk who were referred for services. For example, state grantees reported 11,542 suicide prevention activities, and tribal grantees reported 7,538 activities between October 2006 and June 2019. More than 97% of GLS State/Tribal Program grantees implemented gatekeeper trainings, and 55% of state trainees and 48% of tribal trainees reported that they had identified youths at risk for suicide based on data collected between 2010 and 2019 (SAMHSA, 2019). Moving forward, the GLS State/Tribal Evaluation will continue to identify factors that facilitate the early identification, referral, and follow-up of youths at risk (Heilbron, Goldston, Walrath, Rodi, & McKeon, 2013; Rodi et al., 2012).
In addition, gatekeeper training has been identified as a critical element in suicide prevention efforts (Isaac et al., 2009). Since 2006, 1,482,497 individuals have participated in trainings and educational seminars sponsored by the GLS State/Tribal Program (ICF, 2018). These trainings have been found to increase knowledge of suicide intervention, skills, attitudes, and intention to help someone at risk for suicide among a range of trainees, including school counselors and teachers (King & Smith, 2000; Reis & Cornell, 2008; Wyman, 2008); juvenile justice and child welfare staff members (Keller et al., 2009); those working with veterans (Matthieu, Cross, Batres, Flora, & Knox, 2008); and others (Isaac et al., 2009). Findings from the evaluation indicate that GLS-trained gatekeepers are identifying youths at risk across service settings, and those youths are being referred for services without regard for race, gender, or the settings in which they are identified (Rodi et al., 2012). The evaluation has also provided initial findings that indicate a positive collective impact of GLS Program-sponsored suicide prevention trainings on subsequent identification behavior of trainees (Condron, Godoy-Garraza, Walrath, McKeon, & Heilbron, 2015) and establish the effect of GLS State/Tribal Program trainings on youth suicide attempts and suicide mortality. Findings indicate that counties where GLS trainings were implemented had lower suicide rates in the year following training events compared with similar counties that did not have GLS trainings (Walrath, Godoy-Garraza, Reid, Goldston, & McKeon, 2015). Additionally, two years following implementation of GLS trainings, youth suicide mortality rates remained lower than in comparison counties (Godoy-Garraza, Walrath, Kuiper, Goldston, & McKeon, 2018b).
Further, after identification and referral, best practices call for tracking and monitoring of youths into follow-up services to ensure service receipt and prevent youths from ‘falling through the cracks’ after identification. The prior Continuity of Care Study examined the early identification, referral, and follow-up practices of GLS grantees. Since 2005, nearly 70,000 youths have been identified as being at risk for suicide through GLS-sponsored screenings (n = 32,392) or by a GLS-trained gatekeeper (n = 37,407). Youths identified via a screening tool were most frequently referred to a public mental health provider (55.7%), a private health provider (34.7%), or a school counselor (15.1%). Youths identified by a GLS-trained gatekeeper were usually referred to a public mental health provider (67.3%), a private health provider (18.3%), or a psychiatric hospital (10.1%) (ICF, 2018). The majority of youths identified through either a screening (77%) or a gatekeeper (93%) received a mental health referral. Learning more about this pathway of care, as well as the factors that support follow-up care and treatment adherence, will be important to guide future policies and practices for supporting youths identified through both gatekeeper identifications and screenings.
Collectively, this information has been used to help guide the field of suicide prevention and has contributed to findings on the relationship between training length and both the identification of youths at risk for suicide and overall reductions in suicide deaths during the year following trainings. For example, the evaluation has shown that, for participants who typically interact with youths in school settings, participation in longer gatekeeper trainings was associated with a larger number of identifications 3 months after the activity than participation in shorter trainings (Condron et al., 2015).
Overall, the evaluation has supplied critical information related to process, proximal, and intermediate outcomes. Going forward, the updated evaluation design allows for new methodological possibilities that provide a means of addressing current questions for the next stage of evaluation, building on the evaluation findings to date and focusing on priority areas of inquiry important to SAMHSA, Congress, and other suicide prevention stakeholders.
Approval is being requested for four revised data collection instruments and five new data collection instruments that comprise the GLS State/Tribal Evaluation (refer to Exhibit 3). Changes to instruments are described in Exhibit 13, Revisions to the Evaluation. A full list of attachments is located at the end of Supporting Statement B.
Exhibit 3. GLS State/Tribal Evaluation Instrument List with Acronyms and Status
Attachment | Acronym | Name | Instrument Status
B.1, B.2 | PSI | Prevention Strategies Inventory (B.1 clean; B.2 changes marked) | Revised
C.1, C.2 | TASP | Training Activity Summary Page (C.1 clean; C.2 changes marked) | Revised
D | TSA-P | Training Skills Assessment – Post Training | New
E | TSA-F | Training Skills Assessment – Follow-up | New
F | TSA-PS | Training Skills Assessment – Phone Simulation | New
G.1, G.2 | EIRFT-I | Early Identification, Referral, Follow-up, and Treatment – Individual Form (G.1 clean; G.2 changes marked) | Revised
H.1, H.2 | EIRFT-S | Early Identification, Referral, Follow-up, and Treatment – Screening Form (H.1 clean; H.2 changes marked) | Revised
I | YORS | Youth Outcomes and Resiliency Survey | New
J | YER Journal | Youth Experience Reflective Journal | New
GLS State/Tribal Evaluation Instruments
PSI (Revised): The PSI is a web-based survey that captures all GLS State/Tribal Program prevention strategies and products. Data include strategy types and products distributed, intended audiences or populations of focus, and expenditures across major categories (e.g., outreach and awareness, gatekeeper training, screening programs). Each major strategy includes sub-strategies, enabling grantees to specify and provide details about the sub-strategy, including implementation setting/location, timeframe, and intended audiences or populations of focus. The PSI is completed by grantee staff each quarter. PSI data will inform the Training Outcomes Study and Continuity of Care Study. Compared with the prior version, the revised PSI includes all previous strategies and integrates new or revised questions related to the following areas of interest: (1) grantees’ use of emerging technologies, (2) implementation of evidence-based practices (EBPs), (3) cultural adaptations and health equity practices, and (4) program sustainability. In addition, the PSI has been revised to optimize the assessment of implementation timeframe and location and to align audiences more precisely with the grantee strategies implemented.
TASP (Revised): The TASP is a web-based survey collecting aggregate-level training data from all State/Tribal grantees. Data include information about the type of training delivered, the number and roles of training participants, and the setting of the training, including the ZIP code where the training is held (for use in analysis of GLS program impact). The TASP also assesses intended outcomes, as well as the number of online trainings completed, train-the-trainer events held, and booster trainings that follow the initial training. The TASP also gathers information about the inclusion of behavioral rehearsal or role-play and resources provided at the training, as these elements have been found to improve retention of knowledge and skills post-training. Additionally, the TASP collects information about resources or materials provided to trainees (e.g., mobile or online tools or applications for suicide prevention, fact or resource sheets, and wallet card information) to improve understanding of how skills can be maintained over time with materials provided at trainings (Cross et al., 2011). A TASP is completed by grantee program staff within 2 weeks of each in-person training activity and quarterly for virtual training activities. The revised TASP includes a more refined assessment of training format, distinguishing (1) in person, (2) virtual (facilitated on a specific date), and (3) virtual (self-directed; trainee completes the training at their own pace), as well as revisions to align with updated Government Performance and Results Act (GPRA) indicators.
TSA-P (New): The TSA-P is a web-based survey that assesses trainee confidence in identifying and managing youths at risk for suicide after participation in a training event. At the conclusion of all training events, trainees will be asked to complete the TSA-P. The instrument is designed to assess baseline (immediately post-training) confidence, knowledge of suicide prevention, confidence in identifying and managing suicidal youths, and pretraining behaviors related to identifying and managing youths at risk of suicide. As part of the TSA-P, trainees will be asked to complete a consent-to-contact web form indicating their willingness to be contacted by the GLS State/Tribal Evaluation team to participate in the TSA-F and TSA-PS. If a trainer is unable to administer the survey or consent-to-contact form electronically, or a trainee does not have access to a mobile device or computer, the trainee may complete the survey and consent-to-contact form on paper; the grantee will then submit this information to ICF, through direct data entry into the Suicide Prevention Data Center (SPDC), within 2 weeks of the training event. Once consent to contact has been received, ICF will create a random sample of participants for the phone simulation and the 6- and 12-month follow-up surveys. TSA-P data will inform the Training Outcomes Study.
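The sampling step described above (random selection of consenting TSA-P respondents for the TSA-PS and the 6- and 12-month TSA-F) could look something like the minimal sketch below. The identifiers, sample sizes, seed, and the decision to allow the two samples to overlap are hypothetical assumptions, not the evaluation's actual procedures.

```python
import random

def draw_evaluation_samples(consented_ids, n_follow_up, n_phone_sim, seed=0):
    """Illustrative sketch only: draw a random sample of consenting TSA-P
    respondents for the 6-/12-month TSA-F and a random subsample for the
    TSA-PS phone simulation. Sample sizes, the seed, and whether the two
    samples may overlap are assumptions rather than the evaluation's rules."""
    rng = random.Random(seed)          # fixed seed keeps the draw reproducible
    pool = list(consented_ids)
    follow_up = rng.sample(pool, min(n_follow_up, len(pool)))
    phone_sim = rng.sample(pool, min(n_phone_sim, len(pool)))
    return follow_up, phone_sim

# Hypothetical usage with made-up IDs and sizes
ids = [f"trainee_{i:04d}" for i in range(500)]
tsa_f_sample, tsa_ps_sample = draw_evaluation_samples(ids, n_follow_up=200, n_phone_sim=40)
```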
TSA-F (New): The TSA-F is a follow-up web-based survey that assesses trainees’ sustained confidence and skills in identifying and managing youths at risk for suicide, as well as their experience managing at-risk youths since the training (interventions with youths, additional training, etc.). The survey will be administered to a sample of training participants at 6 and 12 months after the initial TSA-P is completed. TSA-F data will inform the Training Outcomes Study.
TSA-PS (New): The TSA-PS is a follow-up phone simulation using a standardized patient interaction to assess trainee skills in identifying and managing a youth in suicidal crisis. A random subsample of training participants will be contacted by the evaluation team to participate in a simulated conversation with a youth in suicidal crisis portrayed by a trained actor. These simulations will occur between 3 and 6 months following the initial training. The simulated conversation between the training participant and actor will last approximately 10 to 30 minutes (community gatekeeper sessions will likely be shorter than the clinician interactions). In total, the session will be scheduled for 45 minutes to allow for consent, instructions, and a debrief. These sessions will be administered via televideo and recorded for additional post-simulation scoring and analysis. All sessions will be attended by the training participant, an actor, and an evaluation team member (observer), who will be responsible for facilitating the interaction, administering the consent, scoring the interaction (both in real time and based on the recording), and providing a short debrief to the training participant. TSA-PS data will inform the Training Outcomes Study.
EIRFT-I (Revised): The web-based EIRFT-I gathers existing data for each at-risk youth identified as a result of the GLS program (via a GLS-trained gatekeeper, a GLS-sponsored screening, or discharge from an emergency room or inpatient psychiatric treatment). Initial follow-up information (whether or not a service was received after referral) is obtained, along with details on all services received in the 6 months following identification. Ensuring adequate resources and services for referral to care is a best practice for both screenings and gatekeeper trainings. In addition, a response system that ensures timely referrals is part of GLS grant requirements. Data can be extracted from case records or other existing data sources and cover identifications and referrals made by organizational staff, community members, or family members. Respondents include grant program staff and service providers representing all grantees in all funding years. Data collection is ongoing for each youth identified as at risk, screened positive, or discharged from an emergency room or hospital for a suicide attempt and/or suicidal ideation. No personal identifiers are requested on the EIRFT-I. Grantee program staff enter EIRFT-I data on an ongoing basis. EIRFT-I data will inform the Training Outcomes and Continuity of Care Studies. This instrument builds upon the previous EIRF-I, with the addition of data collection on follow-up post-discharge from emergency departments or psychiatric hospitalization and additional information on treatment.
EIRFT-S (Revised): The web-based EIRFT-S gathers aggregate information about all screening activities conducted as part of the GLS program. Data include the number of youths screened for suicide risk through the GLS program and the number screening positive. Grantees will submit EIRFT-S forms on an ongoing basis: once per implementation of a screening tool in a group setting and once per month for clinical and other one-on-one screenings. For each screening event in which multiple youths are screened at a given time, one EIRFT-S should be completed for the event; for one-on-one screenings in a clinical or other setting, one aggregated EIRFT-S is completed per month to reflect screening outcomes of all youths screened during that month. Grantees develop systems locally to gather identification and referral data, including extracting data from existing electronic health records or forms. No personal identifiers are requested on the EIRFT-S. EIRFT-S data will inform the Continuity of Care Study. This instrument continues the previous EIRF-S.
YORS (New): The YORS is a web-based survey assessing the experience and outcomes of youths who are served by the GLS Program. The instrument is designed to assess suicidality, positive youth development, satisfaction with services received, youth engagement experience, and family and school dynamics. Youths aged 14–24 years who receive a positive screening result (as part of GLS program activities) and a referral to a mental health service, or youths who attend skills-based training, will be considered eligible for the study. A sample of eligible youths will be enrolled in the Youth Study. The age of the youth respondent will dictate how consent is obtained for the YORS. Youths under the age of 18 at selected grantee sites will be asked to have a parent complete consent-to-contact forms for participation in the YORS and YER Journal at the time the parent consents to the youth receiving screening from the grantee. Youths over the age of 18 will be asked to complete consent-to-contact forms at the time of initial referral and screening (after gatekeeper identification). The YORS will be administered at 3, 6, and 12 months post-enrollment, with enrollment occurring no later than 1 month following referral to a behavioral health service.
YER Journal (New): The YER Journal is a web-based survey consisting of a weekly journal prompt that youths can respond to with a photo and a corresponding narrative interpretation of the photo. For example, youths may be asked to reflect on a recent experience receiving services; the youth would be asked to submit a photo that represents that experience, followed by a prompt that asks: “What words come to mind? How did it make you feel?” The narrative description of what the photo represents will be analyzed using qualitative methodologies. Up to 25 youths will be recruited to participate in the YER Journal each year. Youths participating in the YORS will be invited to join the YER Journal through the YORS data collection activities; for example, a youth may complete their third quarterly YORS follow-up and be invited to join the YER Journal study at the same time. The evaluation team will leverage innovative data collection technology to engage youths. Weekly prompts will be sent to youths for 6 weeks post-enrollment to discover, for example, which components of the services youths receive are meaningful and helpful, and how youths may be using skills or services following the initial screening, in both the short and long term.
Major Study Components
The previously approved evaluation consisted of three interconnected studies—Training, Continuity of Care, and Suicide Safer Environment. For the current reinstatement, as noted, the GLS State/Tribal Evaluation has been redesigned to include three evaluation components: the Implementation Evaluation, the Outcome Evaluation (comprising three studies—the Training Outcomes Study, Youth Study, and Continuity of Care Study), and the Impact Evaluation. Thus, the focus on training and continuity of care will continue, while the Suicide Safer Environment Study has been removed. A description of each evaluation component, study, and associated data collection activities follows.
Implementation Evaluation
As described, to date, program grantees have generated data regarding the nature and extent of suicide prevention activities across the United States, including the number of individuals affected by grantee activities—such as youths screened or gatekeepers trained—and proximal outcomes of such efforts, including increased knowledge or awareness of the signs and symptoms of suicide risk and numbers of youths identified as at risk who were referred for services (SAMHSA, 2019). The Implementation Evaluation will continue to advance understanding of what works to prevent youth suicide and suicide attempts, why it works, for whom, and under what conditions.
The Implementation Evaluation inventories the array of youth suicide prevention and early intervention strategies implemented by GLS grantees and addresses questions about the types of strategies, services, or products developed and/or implemented, including required and allowable activities and EBPs; implementation settings (e.g., schools, educational institutions, juvenile justice systems, substance use programs, mental health programs, foster care systems, and other child and youth-serving organizations); populations reached; the extent of implementation progress; and whether grantee strategies and activities are delivered in accordance with their work plans. In addition, the evaluation seeks to understand which elements of the National Strategy for Suicide Prevention (NSSP) are being implemented by grantees and whether and how grantees promote behavioral health equity as part of their strategies and programs.
The approach employs methods consistent with the prior GLS evaluations, where relevant, to enable an assessment of implementation context and progress over time, while focusing on current priorities for evaluation. For example, data collected through this evaluation component will contribute to cross-study analyses focused on behavioral health equity such as an assessment of specific subgroups of youths at high risk for suicide, potential social determinants of health, and other community-level factors. Primary data collection will provide a comprehensive assessment of current grantee implementation activity at the program level. Data sources include the PSI, the TASP, the EIRFT-I, and the EIRFT-S.
Training Outcomes Study
As of June 2019, campus, state, and tribal grantees had trained over 1.6 million people and implemented more than 66,000 trainings as part of GLS programs (SAMHSA, 2019). As described, the prior evaluation indicates that counties where GLS trainings were implemented had lower suicide rates in the year following training events compared with similar counties that did not have GLS trainings (SAMHSA, 2019). Historically, such trainings have focused predominantly on training community gatekeepers (e.g., professionals and community members such as parents, teachers, and coaches) to identify youths and young adults at increased risk for suicide and refer them to appropriate services. Over 97% of GLS state and tribal grantees implement community gatekeeper trainings, and 73% of state and 46% of tribal grantees implement training for behavioral health professionals (SAMHSA, 2019). Given grantees’ emphasis on training, the GLS Evaluation will continue to assess training activities and their impact.
While previous studies of gatekeeper training effectiveness have found that trainings affect more immediate outcomes (Isaac et al., 2009), less is known about the effect of gatekeeper trainings on intermediate outcomes (identifications and referrals) and distal outcomes (attempts and deaths). Results from a randomized controlled trial of a gatekeeper training (Question, Persuade and Refer [QPR]) found the evidence of an effect of gatekeeper training on identifications and referrals inconclusive, except among gatekeepers who were already communicating with youths (Wyman et al., 2008). Additionally, although gatekeeper training improves knowledge and awareness, many trainees’ skills decreased over time. Cross et al. (2011) found that role-play practice and feedback during training improved retention of gatekeeper skills, especially the ability to ask an individual directly about suicide and the ability to communicate with someone in distress.
The Training Outcomes Study will examine the impact of gatekeeper trainings and aligns with SAMHSA’s NSSP goals 7, 8, and 9 (SAMHSA, 2017). The study will include trainings for clinicians and staff who work directly with youths in crisis (e.g., mental health professionals, staff and volunteers of prevention hotlines and helplines, juvenile justice staff, law enforcement, and foster care providers) and will focus on two types of training: community gatekeeper trainings that provide essential skills to successfully identify at-risk youths (e.g., QPR, ASIST, safeTALK) and trainings for mental health professionals and clinical staff to identify and manage individuals expressing suicidal thoughts or behaviors (e.g., AMSR and RRSR). To understand the training process and the impact that training has on behaviors, the Training Outcomes Study will examine training participants’ post-training behaviors and utilization of skills. Furthermore, the study will investigate the long-term gains in trainees’ skills to identify and manage youths at increased risk for suicide. The study includes core questions about trainings implemented and their proximal outcomes. Three primary data sources inform the Training Outcomes Study: (1) the TSA-P, (2) the TSA-F at 6 and 12 months (TSA-F6 and TSA-F12), and (3) the TSA-PS. All trainees will complete a baseline TSA-P, and a sample of trainees will be invited to complete a TSA-F at 6 and 12 months after the training. In addition, a subset of TSA-P participants will also be invited to participate in the TSA-PS, which will occur approximately 3 to 6 months after the training.
Youth Experience, Outcomes, and Resiliency Study
GLS State/Tribal grantees are expected to implement programs that will increase the number of youth-serving organizations able to identify and work with youths at risk of suicide; however, the field has often overlooked youths as partners in research and evaluation designed to improve their outcomes. In addition, youths are not always seen as assets in their communities or experts on topics pertaining to them (Sprague-Martinez et al., 2018; Finn & Checkoway, 1998). Yet it is important to understand how individuals served by the grant experience the intervention, services, and supports received, in addition to understanding the effectiveness of the services. In the past evaluation, the Youth Exploratory Services Interview (YESI) recruited students from 12 GLS-supported colleges to report on the identification and referral process, perceptions of service, and continuity of care for suicide prevention and treatment. Findings from the YESI suggested that gatekeepers play a vital role in identifying students at risk for suicide and that campuses need to enhance suicide prevention and treatment services. The Youth Study seeks to build on the lessons learned from the YESI to understand how individuals served by GLS grantees experience the intervention, services, and supports received, as well as the effectiveness of the services and supports in leading to positive outcomes.
Positive youth development (PYD) is “an intentional, prosocial approach that engages youths within their communities, schools, organizations, peer groups, and families in a manner that is productive and constructive; recognizes, utilizes, and enhances young people’s strengths; and promotes positive outcomes for young people” (youth.gov). PYD builds resilience by enhancing and utilizing young people’s positive assets, leadership, and knowledge (youth.gov), and PYD has been found to be associated with psychosocial outcomes, including anxiety, depression, and suicide ideation (Onyeka et al., 2021; Leung et al., 2017). A key tenet of PYD is youth engagement in program development and evaluation. Listening to and learning from youths is critical; youths can provide organizations with key insights that help create innovative and effective programs.
The Youth Study seeks to understand, from the youth’s perspective, which skills are most useful in the self-management of suicidal thoughts, plans, and attempts; whether self-management skills increase because of the grant activities; and whether positive skills, including resilience and stress tolerance, are improved through participation in grant activities. The Youth Study will include a Youth Advisory Board (YAB) to provide oversight and feedback on the evaluation’s study design, data collection, and interpretation of findings. Consistent with the YORS consent process described above, youths under the age of 18 at selected grantee sites will be asked to have a parent complete consent-to-contact forms at the time the parent consents to the youth receiving screening from the grantee, and youths over the age of 18 will be asked to complete consent-to-contact forms at the time of initial referral and screening (after gatekeeper identification).
Specific data collection activities (instruments) include the YORS and the YER Journal. The YORS will be administered at 3, 6, and 12 months post-enrollment, with enrollment occurring no later than 1 month following referral to a behavioral health service. Initial findings from the YORS will be used to develop the YER Journal weekly prompts. Youths participating in the YORS will be invited to join the YER Journal study through the YORS data collection activities. The ICF team may follow up with YER Journal participants to conduct a brief interview that allows youths to provide their own interpretation of their submitted photos. For youths enrolled in the YORS, we will employ a graduated incentive scheme to encourage participation and ensure retention (e.g., compensation for data collection at time points 1–3 and slightly higher compensation at time point 4 over the course of 12 months). For the YER Journal, youths will receive the same compensation each time they participate.
Youths will be enrolled at the point at which they enter the GLS EIRFT process: (1) screened positive, (2) identified as being at risk by a trained gatekeeper, or (3) discharged from an ED or hospital for suicidal ideation or a suicide attempt. Youths in crisis will not be recruited immediately; once stabilized, however, they will be invited to participate. Consent documents will be electronically accessible via link or QR code and will be hosted in the SPDC. Paper consent forms will be available for grantees that prefer them.
Continuity of Care Study
Best practice recommendations state that gatekeeper trainings must include post-identification protocols, community-specific suicide prevention resources, and supports that are available where the trainee works and/or lives, all of which demand that adequate supports and services are, in fact, available to at-risk youths. Recommendations for screening include developing response protocols for youths perceived to be at risk, including those at imminent risk, to ensure the receipt of immediate guidance and referral, which also demands the availability of adequate services when an at-risk youth is identified. Previous evaluation data tracked three aspects of how grantees respond to youths who are identified as being at risk for suicide: (1) how they are identified, (2) what referral they receive at the time of identification, and (3) the first two services they receive. The Continuity of Care Study will continue this data collection to understand referral patterns and identify potential gaps in the identification, referral, and service receipt process. Furthermore, the study will expand to follow identified youths for a longer period (6 months) to track their services, additional referrals, and outcomes of additional assessments. This longer tracking period will facilitate a better understanding of the path at-risk youths follow while receiving care (additional referrals and services), how long care continues, and at what points a youth may drop out of care.
The Continuity of Care Study will assess GLS State/Tribal grantees’ early identification, referral, and follow-up practices to ensure youths are not falling through gaps during the follow-up and service receipt process. The goal of this study is to understand the extent to which youths who are identified through a GLS State/Tribal suicide prevention program are referred to and receive services in the 6-month period following their identification. In addition to youths identified as part of the GLS State/Tribal Program, this study seeks to follow up with and understand the care pathway of youths who were not necessarily identified via the GLS Program but were discharged from an emergency department or inpatient psychiatric unit in the grant service area.
All state and tribal grantees participating in the evaluation will take part in continuity of care data collection activities. To understand what grantees are doing to support continuity of care throughout the identification, referral, and treatment process, ICF will use three instruments to collect administrative, behavioral health, and grantee respondent data.
The Continuity of Care Study comprises primary and secondary data collection activities—surveys, inventories, and existing data abstractions—to document the GLS-sponsored prevention activities that support and contribute to the early identification, referral, and follow-up of students and youth at risk for suicide.
To understand what grantees are doing, the study will rely on data gathered via the PSI. The PSI gathers information on the prevention strategies used for the early identification, referral, and follow-up of at-risk youths and students; how grantees track and monitor at-risk youths identified through screenings; and the follow-up protocols for each screening activity (e.g., the protocols and tracking tools used to ensure that youths referred for services are followed up with and reach adequate mental health or other support referral sources). Data from the PSI will provide insight into what GLS grantees are doing, as well as inform grantee technical assistance needs related to developing protocols to collect EIRFT data.
To assess if GLS prevention activities are effective in developing and supporting continuity of care, grantee staff will submit EIRFT-I data for each youth identified as at risk. The EIRFT-I collects individual (de-identified) data on initial follow-up, referrals, and details on all services in the 6 months post-identification. The EIRFT-S gathers aggregate information about all screening activities completed as a part of the GLS State/Tribal Program (e.g., the number of youth screened for suicide risk, the number screening positive, and the number confirmed to be at risk after initial positive screening). EIRFT-S forms are completed once per implementation of a screening tool in a group setting, once per month for clinical screenings, and once per month for one-on-one screenings. No personal identifiers are requested on either the EIRFT-I or EIRFT-S forms.
Impact Evaluation
The Impact Evaluation consists of three primary lines of inquiry into the impact of the GLS State/Tribal Program on suicide morbidity and mortality: (1) the Impact Analyses, (2) the Rapid Retrospective Information Acquisition, and (3) the Cumulative Impact Synthesis.
Data obtained through primary data collection (see Implementation Evaluation above) and secondary data from sources such as the Healthcare Cost and Utilization Project (HCUP) will be used to quasi-experimentally assess changes in suicide outcomes related to GLS interventions. In addition, GLS State/Tribal grantee activities documented in archival records will be used to fill knowledge gaps from 2019 to 2022. A cumulative impact synthesis will use PSI, TASP, EIRFT-I, and EIRFT-S measures to identify combinations of intervention characteristics (e.g., means restriction training, referral training) that form causal pathways between programming and outcomes.
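As a rough illustration of the quasi-experimental logic described above (comparable in spirit to the prior matched-county comparisons of suicide rates following GLS trainings), the sketch below matches counties with GLS activity to their nearest unexposed counties on pre-period characteristics and averages the difference in post-period youth suicide rates. All column names, matching variables, and the simulated data are hypothetical assumptions, not the evaluation’s actual analytic code.

```python
import numpy as np
import pandas as pd

def matched_difference(df, covariates, outcome="post_rate", exposure="gls_exposed"):
    """Illustrative nearest-neighbor matching on standardized pre-period covariates,
    followed by a simple mean difference in the post-period outcome.
    Column names and the matching rule are hypothetical."""
    z = (df[covariates] - df[covariates].mean()) / df[covariates].std(ddof=0)
    treated = df[df[exposure] == 1]
    control = df[df[exposure] == 0]
    diffs = []
    for i in treated.index:
        # Euclidean distance from this exposed county to every unexposed county
        dist = ((z.loc[control.index] - z.loc[i]) ** 2).sum(axis=1)
        match = dist.idxmin()
        diffs.append(df.loc[i, outcome] - df.loc[match, outcome])
    return float(np.mean(diffs))  # average exposed-minus-matched-comparison difference

# Hypothetical usage with simulated county-level data
rng = np.random.default_rng(42)
n = 200
data = pd.DataFrame({
    "gls_exposed": rng.integers(0, 2, n),
    "pre_rate": rng.normal(10, 2, n),       # pre-period youth suicide rate per 100,000
    "pct_rural": rng.uniform(0, 1, n),
    "median_income": rng.normal(55000, 8000, n),
})
data["post_rate"] = data["pre_rate"] - 0.5 * data["gls_exposed"] + rng.normal(0, 1, n)
effect = matched_difference(data, ["pre_rate", "pct_rural", "median_income"])
print(f"Estimated matched difference in post-period rates: {effect:.2f}")
```

In practice, the Impact Analyses would need to account for additional covariates, temporal trends, and repeated observations; the sketch is intended only to convey the matched-comparison idea.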
Revisions
Exhibit 13 contains a summary of revisions to the previously approved package and the rationale behind each of the changes.
Exhibit 13. Revisions to the Evaluation
Uses of Information Collected
The fourteen years of cross-site evaluation of the GLS State/Tribal Program have resulted in the largest repository of youth suicide prevention data in the United States. Across its history, the evaluation has responded to the National Strategy for Suicide Prevention (National Strategy), first released in 2001 and revised in 2012 in partnership with the National Action Alliance for Suicide Prevention (Action Alliance). The information gathered has been essential to SAMHSA and others in helping communities and decision-makers at all levels of government improve suicide prevention effectiveness. Building on the revised National Strategy (HHS, 2012), the Action Alliance released a first-of-its-kind action plan in 2014, A Prioritized Research Agenda for Suicide Prevention: An Action Plan to Save Lives, aimed at prioritizing suicide prevention research with the greatest likelihood of reducing suicide morbidity and mortality. Consistent with the Action Alliance goal to save 20,000 lives in 5 years, the agenda outlined multiple approaches that collectively could achieve a reduction in suicide attempts and deaths of 20% in 5 years and 40% or greater in 10 years (Action Alliance, 2014). The agenda is organized around six key questions, each tied to one or more of 12 aspirational goals, which serve as an organizing framework for suggested research pathways to reduce the burden of suicide. The public health approach to suicide prevention recommended in these guiding documents has been a hallmark of SAMHSA’s suicide prevention programs. Information collected through the evaluation has contributed to key areas of both efforts and remains a priority for the GLS State/Tribal Evaluation.
Information gathered through the reinstated GLS State/Tribal Program evaluation will continue to be useful to SAMHSA and its partners, other federal agencies and administrators, GLS State/Tribal grantees, legislators, the National Strategy and the broader field of suicide prevention, individual youths and their families, and the communities in which they live. The focus of the evaluation on assessing the implementation, outcomes, and impacts of the GLS State/Tribal Program will contribute immensely to advancing the field of suicide prevention. The proposed updated design and evaluation plan will allow for new methodological possibilities that provide a means of addressing current questions for the next stage of evaluation, building on the evaluation findings to date and focusing on areas of inquiry important to SAMHSA, Congress, and other suicide prevention stakeholders. For example, the data collected will help inform SAMHSA’s other suicide prevention initiatives, such as the new 988 dialing code that will route callers to the National Suicide Prevention Lifeline (NSPL or Lifeline), eventually transforming the crisis care system in this country.
Without this evaluation, Federal and local officials will not be able to determine whether the suicide prevention programs implemented as part of the GLSMA have an impact on the prevention of suicide; their effectiveness on identification, referral, and provision of services to youth and students identified as at risk; and whether GLS grantee programs are meeting the goals of the GLSMA. SAMHSA will use the data collected to provide objective measures of progress toward meeting key performance indicators put forward in its annual performance plans as required by law under GPRA.
In addition, as a contributor to HHS’ evidence-building activity, SAMHSA will lead efforts via this evaluation to understand how HHS programs can reduce suicidal ideation, leading to reductions in mortality and morbidity among youths across the country. Specifically, outcome evaluation findings, to be documented in SAMHSA’s Report to Congress on the GLS State/Tribal Youth Program, will address key priorities outlined in the HHS Evidence-Building Plan for FYs 2023–26, addressing the priority question: How do HHS policies and programs promote healthy lifestyle choices to reduce occurrence and disparities in preventable injury, illness, and death?
Use of Improved Information Technology
Every effort has been made to limit the burden on individual respondents who participate in the GLS State/Tribal Evaluation through the use of technology. Data collection instruments will be administered via a web-based portal referred to as the Suicide Prevention Data Center (SPDC). The SPDC will serve as (1) a data entry tool for grantee program and evaluation staff to enter information and (2) a data collection tool for administering web surveys to respondents. All data obtained through direct entry by grant program staff or through web surveys will be stored in a central repository in the SAMHSA cloud environment to reduce evaluation burden on grantees, to allow ease of access to data for program personnel and evaluation team members, and to allow for the analysis and summary of information within and across surveys.
The web-based SPDC system will support data collection, management, and dissemination activities associated with the GLS State/Tribal Evaluation, including communication between grantees and the evaluation team, secure data transmission and storage, data quality monitoring that triggers corrective action when necessary, and updates around evaluation activities and performance. Through the SPDC, grantees can:
Manage their own user profile and add subordinate users with the same or more limited permissions
Access web-based data collection forms, surveys, and data import tools, designed responsively to support varying screen sizes across a wide spectrum of devices (e.g., desktop, laptop, tablet, phone)
Track progress and completeness at the instrument level and by categories or major sections within each individual data collection tool
Respond to real-time validations during data entry or upload, fixing errors and inconsistencies prior to submission and reducing the effort of responding to post-submission data questions. Front-end validations will prevent invalid or incomplete data from being saved to the database; these validations can be warnings that allow a user to proceed or hard errors that must be corrected before proceeding (a minimal illustrative sketch follows this list). Together with data reports, these validations will allow quick identification of inaccuracies and anomalies in the data and support corrective action.
Access and download grantee-specific reports and datasets
Access general and targeted evaluation-related documents (e.g., data dictionaries, codebooks, user manuals, links to websites).
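The distinction between warning-level and hard-error validations referenced above can be illustrated with a minimal sketch. The rule names, field names, and thresholds below are hypothetical examples for illustration only and are not the actual SPDC validation specification.

    # Minimal sketch of SPDC-style front-end validation logic (hypothetical rules).
    # "Hard" errors block saving; "warning" results allow the user to proceed.
    from dataclasses import dataclass

    @dataclass
    class ValidationResult:
        field: str
        severity: str   # "hard" or "warning"
        message: str

    def validate_training_record(record: dict) -> list[ValidationResult]:
        """Check one hypothetical training record before it is saved."""
        results = []
        # Hard error: a required field is missing or invalid.
        if not record.get("training_type"):
            results.append(ValidationResult("training_type", "hard",
                                            "Training type is required."))
        if record.get("number_trained", 0) <= 0:
            results.append(ValidationResult("number_trained", "hard",
                                            "Number trained must be greater than zero."))
        # Warning: value is unusual but may be legitimate; user may proceed.
        if record.get("number_trained", 0) > 500:
            results.append(ValidationResult("number_trained", "warning",
                                            "More than 500 trainees reported; please confirm."))
        return results

    record = {"training_type": "gatekeeper", "number_trained": 650}
    for r in validate_training_record(record):
        print(f"[{r.severity.upper()}] {r.field}: {r.message}")
    # A record with any "hard" result would be rejected; "warning" results only prompt review.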
To maintain privacy, the secure SPDC offers six levels of password-protected access to site-specific and aggregate data, as described in Exhibit 14. Only users with administrative privileges (evaluation management, evaluation team, and grantee site administrators) will be able to access the raw data. To protect against potential misuse of those data (e.g., inadvertent identification of respondents through their unique demographic information), the following measures will be in place: (1) access to raw datasets will be restricted to designated individuals, and (2) the grantee site administrator will sign a data use agreement. To protect against inadvertent identification, this agreement will stipulate who may analyze or report the raw data, how, and under what circumstances. For example, the GLS State/Tribal Evaluation team will obtain an agreement from each grantee site administrator not to report categories containing fewer than 10 cases and to stipulate who will have access to raw data. Further, the agreement will indicate that no attempt will be made, through complex analysis using outside information, to ascertain the identity of particular individuals from the datasets. A copy of the SPDC data use agreement is located in Attachment K.
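As a hedged illustration of the small-cell reporting rule in the data use agreement (no reporting of categories containing fewer than 10 cases), the sketch below suppresses low counts before a table is released. The column names, categories, and counts are hypothetical placeholders, not actual evaluation data.

    # Minimal sketch of the "fewer than 10 cases" suppression rule (hypothetical data).
    import pandas as pd

    THRESHOLD = 10  # categories with fewer than 10 cases are not reported

    # Hypothetical grantee-level counts of youths identified, by reporting category
    counts = pd.DataFrame({
        "category": ["14-17", "18-24", "AI/AN", "LGBTQ+"],
        "n_identified": [126, 43, 8, 12],
    })

    # Replace any count below the threshold with a suppression flag before reporting
    counts["reported_value"] = counts["n_identified"].where(
        counts["n_identified"] >= THRESHOLD, other="<10 (suppressed)"
    )
    print(counts[["category", "reported_value"]])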
Exhibit 14. SPDC User Security Levels
Security Level | SPDC Privileges
Evaluation Management Administrator |
Evaluation Team Member Administrator |
Grantee Site Administrator |
Grantee Site User |
Grantee Contact User |
SAMHSA, its consultants, & partners |
The newly designed SPDC is currently under development; the System of Records Notice (SORN) and the HHS Privacy Impact Assessment (PIA) have not yet been completed.
Efforts to Identify Duplication
The GLS State/Tribal Evaluation team, in developing the data collection activities and updating the design for the evaluation, conducted a literature review to avoid duplication of data collection activities and the collection of similar information. Specifically, the team reviewed existing research studies and the efforts of other Federal initiatives designed to evaluate suicide or suicide prevention.
Two decades ago, many in the field of suicide prevention agreed that there was insufficient information on the causes of suicide and even less information on how to most effectively prevent it (SPAN USA, Inc., 2001; Institute of Medicine, 2002; U.S. Public Health Service, 2001). Studies of suicide prevention activities provided important information but were, for the most part, conducted with nonrandomized groups. Similarly, the lack of longitudinal and prospective studies had been a barrier to understanding and preventing suicide (Institute of Medicine, 2002). Acknowledging the dearth of information on the effectiveness of suicide prevention programs, the IOM report, Reducing Suicide: A National Imperative (2002), provided recommendations for increasing research on suicide, including Federal funding for suicide prevention interventions and longitudinal studies focused on the medium- to long-term impacts of suicide prevention activities.
Since then, research has identified gatekeeper training as a critical element in suicide prevention efforts (Isaac, Elias, Katz, Belik, Deane, Enns, et al., 2009) and has shown that training increases knowledge, skills, and the intention to help someone at risk for suicide among an array of gatekeepers (King & Smith, 2000; Reis & Cornell, 2008; Wyman, Hendricks Brown, Inman, Cross, Schmeelk-Cone, Guo, et al., 2008; Keller, Schut, Puddy, Williams, Stephens, McKeon, et al., 2009; Matthieu, Cross, Batres, Flora, & Knox, 2008). Findings from a previous cross-site evaluation of the GLS program indicated a positive collective impact of suicide prevention trainings on the subsequent identification behavior of trainees (Condron, Godoy-Garraza, Walrath, McKeon, & Heilbron, 2014) and established the effect of GLS State/Tribal program trainings and activities on youth suicide attempts and suicide mortality. Counties where GLS trainings were implemented had lower suicide rates in the year following training events than similar counties that did not have GLS trainings (Walrath, Godoy-Garraza, Reid, Goldston, & McKeon, 2015; SAMHSA, 2013). Findings such as these helped to guide the direction of further implementation, such as determining the training types and practices that are most effective for identifying and referring youth.
In addition, the evaluation of SAMHSA's National Suicide Prevention Lifeline (NSPL) has developed evidence supporting crisis lines' effectiveness and the value of suicide prevention efforts (King, Nurcombe, Bickman, Hides, & Reid, 2003; Gould, Kalafat, Munfakh, & Kleinman, 2007; Kalafat, Gould, Harris-Munfakh, & Kleinman, 2007; Mishara, Chagnon, Daigle, Balan, Raymond, Marcoux, et al., 2007a, 2007b; Gould & Kalafat, 2009; Gould, Munfakh, Kleinman, & Lake, 2012; Knox, Kemp, McKeon, & Katz, 2012; Gould, Cross, Pisani, Munfakh, & Kleinman, 2013; Gould, Lake, Munfakh, Galfalvy, Kleinman, Williams, Glass, & McKeon, 2016). As a result, the 988 Suicide & Crisis Lifeline has emerged as a key component for a range of suicide prevention programs and has expanded to offer clinical follow-up services to callers and to those who received care in emergency departments and hospitals for suicidality. Building on the success of past suicide hotlines and responding to the current mental health crisis in the United States, the 988 Suicide & Crisis Lifeline was developed by SAMHSA and has been prioritized by the Biden administration (The White House, 2023). 988 is a short, memorable, and easy-to-dial version of the NSPL that allows fast access to mental health and suicide prevention care (SAMHSA, 2023a).
The Centers for Disease Control and Prevention (CDC) has created a national initiative to prevent suicide nationwide. This initiative includes #BeThere, an educational campaign that helps peers and gatekeepers recognize the signs that someone may be suicidal (CDC, 2019). The #BeThere campaign has also launched #BeThe1To, which educates and encourages peers and gatekeepers to seek help when they see someone showing signs of potentially being suicidal (CDC, 2019).
The CDC’s suicide prevention campaign is managed by the CDC National Center for Injury Prevention and Control provides funding and technical assistance to states through its Core Violence and Injury Prevention Program (Core VIPP). The program supports State health departments in strengthening their capacity to collect data and use data for a better understanding of local injury issues, including suicide. The focus of Core VIPP is on supporting funded state partners in their efforts to build a solid violence and injury prevention infrastructure, collect and analyze data, and implement and evaluate injury prevention programs. This CDC program may provide a broader understanding of suicide as a by-product of its efforts to gain a better understanding of local injury issues; however, the focus of the GLS State/Tribal Evaluation is specifically to evaluate the effectiveness of suicide prevention programs.
SAMHSA is sponsoring an ongoing evaluation of the 988 Suicide & Crisis Lifeline. The purpose of the evaluation, like that of the previous NSPL evaluation, is to assess the impact of the national crisis hotline in connecting callers to crisis counselors and to assess participation in the Lifeline's network. Building on this work and to continue addressing gaps in research and the field, the evaluation will focus on clinical follow-up for those who received emergency department or hospital care, emerging technology approaches (i.e., chat interventions), and imminent-risk caller guidelines.
In 2022, SAMHSA awarded cooperative agreements to 20 federally recognized tribes or tribal organizations to implement the Tribal Behavioral Health Program (Native Connections), which addresses high rates of suicide and substance abuse among American Indian/Alaska Native (AI/AN) youth up to age 24. SAMHSA has subsequently awarded funding to two additional cohorts of grantees through a standalone Native Connections program. The Native Connections program has eight cohorts; the first three cohorts have already completed the program (SAMHSA, 2023b). Cohort four has 45 grantees entering their fifth year, cohort five has 26 grantees entering their fourth year, cohort six has 40 grantees entering their third year, cohort seven has 29 grantees entering their second year, and cohort eight has 12 grantees that were awarded in 2022 (SAMHSA, 2023b). The Native Connections program provides support to tribes and tribal organizations to build capacity for the implementation of suicide prevention, substance abuse prevention, surveillance, and mental health promotion activities among young people. For this project, SAMHSA conducted evaluability assessments of Native Connections grantees' readiness to participate in local or cross-program evaluation and supported grantees in their efforts to implement and use existing and new surveillance to understand prevalence and prevention outcomes related to suicide and substance abuse.
In FY2017, SAMHSA also funded five National Strategy Grants to State program grantees to support the implementation of the 2012 National Strategy. The cooperative agreements provide funding for suicide prevention among working-age adults from 25 to 64 years old.
Finally, also in FY2017, SAMHSA announced grant funding through its Zero Suicide in Health Systems Program (Zero Suicide Program) to state and U.S. territory health agencies with mental and/or behavioral health functions, tribes/tribal organizations, community-based primary care or behavioral health care organizations, emergency departments, and/or local public health agencies to implement the Zero Suicide model throughout their health systems. Health systems not providing direct care services can partner with agencies/organizations to implement the Zero Suicide model. Communities without well-developed behavioral health care services can implement the Zero Suicide model in Federally Qualified Health Centers or other primary care settings.
Grantees are charged with implementing suicide prevention and intervention programs, for individuals aged 25 years or older, designed to raise awareness, establish referral processes, and improve care and outcomes for individuals at risk. The grants require recipients to use their funding primarily to support direct services or practices that have a demonstrated evidence base and are appropriate for the population(s) of focus, including but not limited to:
Screening all individuals receiving care for suicidal thoughts and behaviors and conducting a comprehensive risk assessment of individuals identified at risk for suicide
Training the healthcare work force in, as well as implementing, effective, evidence-based treatments to treat suicidal ideation and behaviors
Ensuring that the most appropriate, least restrictive treatment and support is provided
Developing a Suicide Care Management Plan for every individual identified as at-risk of suicide and continuously monitoring the individual’s progress
Working with Veterans Health Administration and community-based outpatient clinics, state departments of veterans affairs, and national SAMHSA and Veterans Administration (VA) suicide prevention resources to engage and intervene with veterans who are at risk for suicide but not currently receiving VA services
Developing and implementing a plan that assures attention to preventing suicide among those receiving treatment for serious mental illness (SMI) and within services designed for those with SMI
Ensuring feedback and leadership of survivors of suicide attempts and suicide loss are involved in all required activities
In total, SAMHSA awarded three grants in FY2017 and 15 grants in FY2018. By addressing all seven elements of the Zero Suicide model, grantees in SAMHSA's Zero Suicide Program are transforming their health systems into systems that are ready to identify, treat, refer, and ensure continuity of care for individuals at risk for suicide and suicidal behaviors. SAMHSA has also initiated a pilot test of the Zero Suicide Evaluation.
Impact on Small Businesses or Other Small Entities
Some data collection activities involve individuals from public agencies that provide a wide array of services such as mental health, juvenile justice, education, and child welfare. While most data will be collected from public agencies, it is possible that small businesses will participate in suicide prevention training and have individuals included in the Training Outcomes Study. We anticipate that any data collection efforts will not have a significant impact on these small entities.
Consequences if Information Collected Less Frequently
The rigor of the GLS State/Tribal Evaluation design and its ability to answer the primary evaluation questions is dependent on the frequency of the data collected. Additionally, because the evaluation is aligned with the key elements of the GLS State/Tribal Programs, the frequency with which data collection activities are administered is critical to SAMHSA’s overall assessment of the GLS State/Tribal Program. Exhibit 15 describes the consequences if data are collected less frequently.
Exhibit 15. Data Collection Activities and Consequences If Information Collected Less Frequently
Activity | Rationale
EIRFT-I | The EIRFT-I requires S/T grantees to share existing data on the youth identified as at risk. Data from the EIRFT-I are integral to understanding the impact of gatekeeper training and screening programs on identifications, referrals, and services received as a result of the GLS State/Tribal Program.
EIRFT-S | S/T grantees are also required to report aggregate screening information for all youth screened as part of their suicide prevention programs. The information collected includes the number of youth screened and the number screening positive. This information is necessary for SAMHSA to understand the types and effectiveness of screenings implemented as a result of the GLS program.
PSI | Grantees will be required to complete the PSI beginning in year 1 of the grant. Thereafter, they will complete the PSI on a quarterly basis over the duration of their grant period. Collecting this information quarterly is necessary to track progress toward meeting suicide prevention goals and to provide information on the development stage of products and services within S/T programs. Collecting the PSI less frequently would result in a loss of information relevant to all studies of the GLS State/Tribal Evaluation as well as the ability to track progress over time.
TASP | Because training is a widely implemented suicide prevention strategy among S/T grantees, aggregate basic information about trainings and trainee types and roles is necessary for SAMHSA to understand how grant funds are being used in support of training. TASP data are also critical to the Impact Evaluation, which assesses whether the overall GLS program reduces youth suicide attempts and deaths.
TSA (P, F, and PS) | Information from the TSA informs the Training Outcomes and Continuity of Care Studies and tracks the effectiveness of trainings on participant knowledge and use of skills. The consequences of not collecting these data at the conclusion of the training experience include a loss of knowledge about the types of trainings and practices implemented, as well as the impact of training on the identification and referral of at-risk youth.
YER Journal | The YER Journal provides an opportunity for youth to convey their feelings and experiences after being identified as at risk. If these data were collected less frequently, we would lose the youth perspective, which is critical to providing effective supports and access to treatment.
YORS | Information from the YORS focuses on the youth's perception of their suicidality, positive youth development, satisfaction with services received, engagement experience, and family and school dynamics at three time points. If these data are collected less frequently, we will be less able to understand youth experiences over time.
Consistency with the Guidelines of 5 CFR 1320.5(d)(2)
The data collection fully complies with the requirements of 5 CFR 1320.5(d)(2).
Consultation Outside the Agency
SAMHSA published a notice in the Federal Register on September 15, 2023 (Volume 88, page 63593), soliciting public comment on this study. No public comments were received.
Consultation on the design, instrumentation, and statistical aspects of the evaluation has occurred with individuals outside of SAMHSA. Most recently, in February-April 2023 we convened feedback sessions with current GLS grantees, expert advisory panel (EAP) members, and two youth advisors that allowed us to discuss and finalize the current data collection plan and instruments.
Since the inception of the cross-site evaluation in 2005, we have solicited guidance from grantees, researchers, and practitioners to ensure the approach was grounded in best practices and the latest suicide prevention research findings. In 2005, an evaluation steering committee was established to provide input and guidance in designing and implementing the original cross-site evaluation. Consultation with the steering committee has continued since 2005. Similarly, an EAP established in 2014 and convened annually from 2015-2018 provided input and guidance on the updated design and implementation of the previous cross-site evaluation. Consultation with this EAP has continued annually and as needed throughout the grant-funding period. Representatives on the EAP included leaders in the field of suicide prevention program implementation, research, and evaluation.
As with previous evaluations, updates to the instruments were informed through direct consultation with current and former grantees, as well as representatives of the SPRC and CDC. These consultations had four purposes: (1) to ensure continued coordination of related activities, especially at the Federal level; (2) to ensure the rigor of the evaluation design, the proper implementation of the design, and the technical soundness of study results; (3) to verify the relevance and accessibility of the data to be collected; and (4) to minimize respondent burden.
Payment to Respondents
As with previous evaluations, the GLS State/Tribal Evaluation will use a research-based approach and will require participation by youth and suicide prevention training participants. Consequently, remuneration is suggested for respondents not directly affiliated with suicide prevention programs at the time of their participation in surveys and interviews, as compensation for the additional burden, potential inconvenience of participation, and any related costs (e.g., mobile phone minutes or data, compensation for time). Remuneration is also a standard practice in longitudinal studies, partly because respondents are typically not directly affiliated with the program being evaluated. Given the use of longitudinal data collection for the GLS State/Tribal Evaluation and the hard-to-reach nature of these populations, compensation will be provided for follow-up activities.
Planned remuneration for training participants is a $20 incentive for participation in each of the TSA-F 6-month and 12-month surveys and $50 for the TSA-PS. For youths enrolled in the YORS, we will employ a graduated incentive scheme to encourage participation and ensure retention. Youths will receive $20 for data collection timepoint 1, $25 for timepoint 2, $25 for timepoint 3, and $30 for timepoint 4, for a total of $100 for 12 months of study participation. For the YER Journal, youth will receive $20 for participation. Respondents to other data collection activities are primarily staff of the suicide prevention programs or close affiliates; therefore, no remuneration is planned for those activities.
Assurances of Confidentiality
Data will be kept private to the extent allowed by law. To ensure the confidentiality of data collected and the protection of human subjects, the data collection protocol and instruments for the GLS State/Tribal Evaluation will be reviewed by the ICF institutional review board (IRB) prior to the collection of covered or protected data. The ICF IRB holds a Federalwide Assurance (FWA00002349; Expiration, October 13, 2025) from the HHS Office for Human Research Protections (OHRP). This review ensures compliance with the spirit and letter of HHS regulations governing such projects. All protected data will be stored on secure servers in the SAMHSA cloud environment. In addition, the web-based data collection and management system, the SPDC, will facilitate data entry and management for the evaluation.
Descriptive information will be collected from respondents during data collection activities. Most data collection will occur via the SPDC portal; however, if any activities require hard-copy forms with identifying information, these forms will be stored in locked cabinets. Contact information will be entered into the SPDC via a page on the site that can be accessed only by the limited number of individuals who require access (selected ICF staff, such as data analysts and administrative staff administering the incentives). These individuals have signed privacy, data access, and data use agreements. Identifying information collected to facilitate the administration of surveys will not be stored with survey responses. Further, datasets will be stripped of any identifying information prior to use by data analysts. Once data collection has concluded and incentives have been distributed, respondent contact information will be deleted from the database and the hard-copy forms will be destroyed.
Data collection activities requiring the collection of identifying information for the GLS State/Tribal Evaluation include the following: TSA-P, YORS, and YER Journal. Specific procedures to protect the privacy of respondents are described below.
TSA-P: As part of each training event, all training participants will be asked to respond to a consent-to-contact request that also gathers information about identification and referral behaviors to help establish a baseline for trainee behaviors. The consent-to-contact form will ask participants to provide the identifying information (i.e., name, work telephone number, cell telephone number, work email, and personal email) necessary to contact them for the phone simulation and follow-up surveys and to administer the incentive. The consent-to-contact form will be distributed and collected by grantee project staff or training facilitators during the training activity via a QR code or URL to the SPDC survey, or by hard-copy form if necessary due to internet or device limitations. Information gathered via hard-copy form will be returned to ICF for entry into the SPDC by evaluation staff. The consent-to-contact form will ask trainees for their consent to be contacted at up to three time points: at 3 months for a phone simulation and at 6 and 12 months following the training for the follow-up surveys.
YORS: Identifying information for respondents to the YORS will be necessary for survey administration. Contact information will be limited to youth names, email addresses, cell phone numbers, and caregiver cell phone number and will be entered into a password-protected database.
YER Journal: Identifying information for respondents to the YER Journal will be necessary for administration. Contact information will be limited to youth names, email addresses, cell phone numbers, and caregiver cell phone numbers and will be entered into a password-protected database.
Questions of a Sensitive Nature
Survey and interview instruments include questions that are potentially sensitive because this project concerns the topic of suicide and suicide prevention. These questions collect information about suicidal ideation, suicide attempts, mental health, substance abuse, family circumstances, and mental health service seeking. These questions are central to the agency's goal of learning about the identification of at-risk youth and understanding the referrals and services they receive and the youth experience. Names and email addresses collected as part of the consent process will be kept separate from responses, as stated above. All data will be managed and stored in the manner described above and therefore will be available only to authorized evaluation staff. Active consent forms explicitly advise potential respondents and participants about the sensitive nature and content of the data collection protocol as well as the voluntary nature of all data collection activities. Unanticipated or negative consequences will be reported immediately to the grantee and ICF institutional review boards (IRBs). If any negative consequences result, the Principal Investigator and Project Director will consult with appropriate clinical professionals, determine the suitable action, and make the appropriate referrals.
Estimates of Annualized Burden Hours and Costs
Clearance is being requested for 3 years of data collection for the GLS State/Tribal Evaluation. Exhibit 16 below describes the burden and costs associated with data collection activities. Burden is calculated for 31 S/T grantees, which represents the number of active grantees anticipated for each year of data collection. This number is derived from the number of current grantees that will still be active during each year of the data collection and the expected number of grantees that will be funded and begin data submission activities during the approval period. All S/T grantees have a 5-year funding cycle, which was taken into account when calculating the number of active grantees in each year of data collection. Costs were calculated using the hourly wage rates for the appropriate wage rate categories, drawn from the Occupational Employment Statistics Survey (BLS, 2022) and the U.S. Department of Labor Federal minimum wage standards. Exhibit 17 shows an annualized summary of burden hours by respondent type.
Exhibit 16. Estimated Annualized Burden Hours and Costs (Across the 3-Year Clearance Period)
Type of Respondent | Instrument | Number of Respondents | Responses per Respondent | Total Number of Responses | Burden per Response (hours) | Annual Burden (hours) | Hourly Wage Rate ($) | Total Cost ($)
Project Evaluator | PSI | 31 | 4 | 124 | 1.25 | 155 | $37.11 | $5,752
Project Evaluator | TASP | 31 | 10 | 310 | 0.25 | 78 | $37.11 | $2,876
Project Evaluator | EIRFT-Individual Form | 31 | 4 | 124 | 2 | 248 | $37.11 | $9,203
Project Evaluator | EIRFT-Screening Form | 31 | 4 | 124 | 0.75 | 93 | $37.11 | $3,451
Provider Trainee | TSA Consent to Contact | 10,000 | 1 | 10,000 | 0.08 | 800 | $27.46 | $21,968
Provider Trainee | TSA-P | 10,000 | 1 | 10,000 | 0.3 | 3,000 | $27.46 | $82,380
Provider Trainee | TSA 6-month | 187 | 1 | 187 | 0.3 | 56 | $27.46 | $1,541
Provider Trainee | TSA 12-month | 140 | 1 | 140 | 0.3 | 42 | $27.46 | $1,153
Provider Trainee | TSA-PS | 101 | 1 | 101 | 0.75 | 76 | $27.46 | $2,080
Youth | YORS baseline | 300 | 1 | 300 | 0.5 | 150 | $7.25 | $1,088
Youth | YORS 3-month | 240 | 1 | 240 | 0.5 | 120 | $7.25 | $870
Youth | YORS 6-month | 192 | 1 | 192 | 0.5 | 96 | $7.25 | $696
Youth | YORS 12-month | 115 | 1 | 115 | 0.5 | 58 | $7.25 | $417
Youth | YER Journal | 25 | 6 | 150 | 0.25 | 38 | $7.25 | $272
Total | | 21,424 | | 22,107 | | 5,008 | | $133,747
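The figures in Exhibit 16 follow a single arithmetic pattern: total responses = respondents x responses per respondent; annual burden = total responses x burden per response; total cost = annual burden x hourly wage rate, with rounding applied for display. The sketch below restates that arithmetic using the values in the exhibit; small differences from the published per-row figures reflect rounding of intermediate values.

    # Sketch restating the Exhibit 16 arithmetic; values are taken from the exhibit above.
    # (respondent type, instrument, respondents, responses per respondent, hours per response, wage)
    rows = [
        ("Project Evaluator", "PSI",                    31,     4, 1.25, 37.11),
        ("Project Evaluator", "TASP",                   31,    10, 0.25, 37.11),
        ("Project Evaluator", "EIRFT-Individual Form",  31,     4, 2.00, 37.11),
        ("Project Evaluator", "EIRFT-Screening Form",   31,     4, 0.75, 37.11),
        ("Provider Trainee",  "TSA Consent to Contact", 10000,  1, 0.08, 27.46),
        ("Provider Trainee",  "TSA-P",                  10000,  1, 0.30, 27.46),
        ("Provider Trainee",  "TSA 6-month",            187,    1, 0.30, 27.46),
        ("Provider Trainee",  "TSA 12-month",           140,    1, 0.30, 27.46),
        ("Provider Trainee",  "TSA-PS",                 101,    1, 0.75, 27.46),
        ("Youth",             "YORS baseline",          300,    1, 0.50,  7.25),
        ("Youth",             "YORS 3-month",           240,    1, 0.50,  7.25),
        ("Youth",             "YORS 6-month",           192,    1, 0.50,  7.25),
        ("Youth",             "YORS 12-month",          115,    1, 0.50,  7.25),
        ("Youth",             "YER Journal",            25,     6, 0.25,  7.25),
    ]

    total_responses = total_hours = total_cost = 0.0
    for _, instrument, n, per, hours_per, wage in rows:
        responses = n * per              # total number of responses
        burden = responses * hours_per   # annual burden hours (unrounded)
        cost = burden * wage             # total cost (unrounded)
        total_responses += responses
        total_hours += burden
        total_cost += cost
        print(f"{instrument:24s} {responses:>7,} responses {burden:>8,.0f} hrs  ${cost:>9,.0f}")

    print(f"Totals: {total_responses:,.0f} responses, {total_hours:,.0f} hours, ${total_cost:,.0f}")
    # Expected to match Exhibit 16: 22,107 responses, 5,008 hours, and roughly $133,747.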
Exhibit 17. Annualized Summary Burden by Respondent Type
Respondents | Number of Respondents | Responses per Respondent | Total Responses | Total Annualized Hour Burden
Project Evaluators | 124 | 5.5 | 682 | 573
Provider Trainee | 20,428 | 1 | 20,428 | 3,974
Youth | 872 | 1.1 | 997 | 461
Total | 21,424 | 37 | 22,107 | 5,008
Estimates of Annualized Cost Burden to Respondents or Record Keepers
Grantees are collecting the majority of the required data elements as part of their normal suicide prevention program operations. Grantees maintain this information for their own program planning, quality improvement, and reporting purposes. Therefore, there are no additional capital or start-up costs associated with the GLS State/Tribal Evaluation. There will be some additional burden on record keepers to provide potential respondent lists for data collection activities. However, these operation costs will be minimal. Each grantee has been funded, as part of the overall cooperative agreement award, to fund an evaluator and related costs to carry out the requirements of the GLS State/Tribal Evaluation. Therefore, no cost burden is imposed on the grantee by this additional effort.
Estimates of Annualized Cost to the Government
CMHS has planned and allocated resources for the management, processing, and use of the collected information in a manner that will enhance its utility to grantees, the government, community agencies, and the public. Including the Federal contribution to local grantee evaluation efforts, the contract with ICF (the national evaluator), and Government staff to oversee the evaluation, the annualized cost to the Government is estimated at $2,354,274. These costs are described below.
Each grantee is expected to fund an evaluator to conduct their self-evaluation and to satisfy the requirements of the GLS State/Tribal Evaluation. It is estimated that participating in the GLS State/Tribal Evaluation will require 0.20 full-time equivalent (FTE) to collect information, enter information into the SPDC web-based data collection and management system, and to conduct analyses at the local level. Assuming an annual evaluator salary of $77,200 based on the BLS May 2022 data for the Survey Researcher category, 20 percent effort for one grantee would be $15,440. With 31 grantees participating in the GLS State/Tribal Evaluation, the total grantee cost would be $478,640.
A contract has been awarded to ICF for the evaluation of the GLS State/Tribal Program. The current evaluation contract with SAMHSA is funded to conduct the GLS State/Tribal Evaluation with 31 grantees over the next 5 years with a value of $9,251,349; the estimated average annual cost of the contract is $1,850,270. This covers expenses related to developing and monitoring the GLS State/Tribal Evaluation, including, but not limited to, developing the evaluation design and instrumentation; developing training and technical assistance resources (e.g., manuals, training materials); conducting training and technical assistance; monitoring grantees; traveling to grantee sites and relevant meetings; and analyzing data and disseminating findings. In addition, these funds will support the maintenance of the web-based data collection and management system and fund staff support for data collection. It is estimated that CMHS will allocate 0.30 of a full-time equivalent each year for Government oversight of the evaluation. Assuming an annual salary of $84,546 for a GS-13, step 1 pay scale, these Government costs will be $25,364 per year.
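The annualized cost to the Government cited above is the sum of three components; the short sketch below simply restates that arithmetic using the figures given in this section.

    # Restating the annualized cost-to-Government arithmetic from this section.
    grantee_cost = 77_200 * 0.20 * 31         # 0.20 FTE evaluator per grantee x 31 grantees
    contract_cost = 9_251_349 / 5             # 5-year evaluation contract, annualized
    oversight_cost = 84_546 * 0.30            # 0.30 FTE GS-13, step 1 Government oversight

    total = grantee_cost + contract_cost + oversight_cost
    print(f"Grantee evaluators:   ${grantee_cost:,.0f}")    # $478,640
    print(f"Evaluation contract:  ${contract_cost:,.0f}")   # $1,850,270
    print(f"Government oversight: ${oversight_cost:,.0f}")  # $25,364
    print(f"Annualized total:     ${total:,.0f}")           # approximately $2,354,274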
Changes in Burden
SAMHSA is requesting 5,008 annual burden hours for this submission, representing an increase of 879 annual burden hours over the most recent OMB package. The GLS State/Tribal Evaluation design and programmatic changes that account for this increase in burden include:
The increase in time for grantees to complete the PSI instrument (from 0.75 to 1.25 hours) is due to the inclusion of questions that capture (1) grantees' activities to reduce behavioral health disparities and promote behavioral health equity as part of their program implementation, (2) grantees' plans to sustain their strategies and programs after the grant ends, and (3) the nature of the community partnerships that grantees have formed. These topics are priority areas of inquiry for SAMHSA.
The increase in responses per respondent for the TASP (from 4 to 10) is based upon the average number of trainings per grantee in the prior evaluation.
The increase in time for grantees to complete the EIRFT-I instrument (from 0.75 to 2 hours) is due to the request for data covering the 6 months following identification of at-risk status. These additional questions will provide a better understanding of the referral process and the receipt of services for youth.
The addition of the TSA instruments will assess trainee confidence in identifying and managing youth at risk for suicide after participation in a training event, which is the most common prevention strategy activity that grantees use.
The addition of the TSA-P will assess whether trainees maintain the skills that they were taught in previous training events.
The addition of the YORS and YER Journal will bring the youth voice to the GLS State/Tribal Evaluation, which has been largely lacking to this point. SAMHSA has specifically stated that GLS grantees and the GLS State/Tribal Evaluation should include perspectives from individuals with lived experience.
Time Schedule, Publication, and Analysis Plans
The time schedule for implementing the GLS State/Tribal Evaluation is summarized in Exhibit 18. A 3-year clearance is requested for this project.
Exhibit 18. Time Schedule
Activity | Timeframe
Begin data collection for 31 grantees across GLS State/Tribal cohorts 14-17 | April 2024 (1 month after OMB clearance, estimated for March 1, 2024)
Data collection completed for cohort 14 grantees | January 2025
Data collection completed for cohort 15 grantees | November 2025
Data collection completed for cohort 16 grantees | March 2026
Data collection completed for cohort 17 grantees | September 2027
Publication Plans
The GLSMA requires ongoing congressional reports summarizing the results of the GLS State/Tribal Evaluation. The evaluation team will analyze collected data and prepare congressional reports summarizing key findings. A final report on the results of the GLS Evaluation is also required by the GLSMA and will be produced by the evaluation team. Because of the importance of the GLS State/Tribal Evaluation to the field of suicide prevention, the results of the evaluation also will be published, in collaboration with SAMHSA and the Government project officer, in relevant professional journals to inform the research community as well as the decision making of policymakers and program administrators.
ICF will develop a minimum of one GLS-focused article for submission to peer-reviewed journals for each year of the contract. An outline and/or draft of each manuscript will be submitted to the Contracting Officer's Representative (COR) for review of and feedback on the structure, the content, and the potential peer-reviewed journals for submission. Potential manuscript topics will include findings related to priority areas such as the impact of the GLS Program on youth morbidity and mortality, the care pathways that youth follow after at-risk identification, and the ability of suicide prevention trainees to retain the skills needed to be effective in their roles. Additional manuscript topics may be related to the research questions and findings that emerge from the Training Outcomes Study, the Youth Study, and the Continuity of Care Study.
All publications will be submitted to the COR in draft form for review and approval prior to submission to the selected journal. Examples of journals that will be considered as vehicles for publication include the following:
American Journal of Public Health
American Psychologist
American Journal of Diseases of Children
Child Development
Crisis
Evaluation Review
Evaluation Quarterly
Journal of the American Academy of Child and Adolescent Psychiatry
Journal of Applied Developmental Psychology
Journal of Child and Family Studies
Journal of Clinical Child and Adolescent Psychology
Journal of Consulting and Clinical Psychology
Journal of Health and Social Behavior
Journal of Mental Health Administration
Psychological Reports
Social Services Review
Suicide and Life-Threatening Behavior
Data Analysis Plan
Data collected through the GLS State/Tribal Evaluation will be analyzed to address key evaluation questions and related sub-questions. Analysis plans for each study are described below. In addition, two special analyses will be conducted to address evaluation questions that cut across the GLS State/Tribal Evaluation studies, and integrate extant data, including data collected from earlier GLS grantee cohorts (included in previous GLS Program evaluations).
Implementation Evaluation Analysis
Retrospective Study
The purpose of the Retrospective Study is to fill the data gaps from 2019 to 2023, a period during which SAMHSA did not conduct national evaluations of the GLS Program. The ICF team will develop a tracking database in Excel to monitor retrospective information acquisition, including, for each grantee, (1) the types of archival records reviewed, (2) progress related to information obtained or confirmed and gaps in information, and (3) questions or details to be verified through discussion with grantee program staff. The initial archival record review will inform the development of key search terms for an analysis in which we will apply natural language processing techniques to archival records (e.g., grantee applications, annual progress reports) to obtain structured information. Through the initial archival record review, we will identify search terms and phrases to frame the natural language processing analysis in the key areas of assessment for the retrospective review: (1) trainings conducted between 2019 and 2022, including training type, number of individuals trained, trainee roles, and geographic location (if available); (2) early intervention strategies implemented; (3) cultural adaptations; and (4) continuity of care and follow-up for youths identified to be at risk for suicide. We will use descriptive statistics (i.e., frequencies, means, standard deviations, and proportions, as appropriate) to summarize the findings. To the extent possible, these data will also be incorporated into the Behavioral Health Equity Cross-Study analyses and into the impact analyses and cumulative impact synthesis conducted as part of the Impact Evaluation.
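A minimal sketch of the keyword-tagging step of this approach is shown below. The search terms, review areas, and text excerpt are hypothetical placeholders; the actual analysis would rely on a fuller natural language processing pipeline and the tracking database described above.

    # Minimal sketch of keyword tagging for archival records (hypothetical terms and text).
    import re
    from collections import Counter

    # Hypothetical search terms keyed to retrospective review areas
    SEARCH_TERMS = {
        "training": ["gatekeeper training", "QPR", "ASIST", "safeTALK"],
        "early_intervention": ["screening", "early identification"],
        "cultural_adaptation": ["cultural adaptation", "culturally tailored"],
        "continuity_of_care": ["follow-up", "referral", "care transition"],
    }

    def tag_record(text: str) -> Counter:
        """Count hits for each review area in one archival record."""
        hits = Counter()
        lowered = text.lower()
        for area, terms in SEARCH_TERMS.items():
            for term in terms:
                hits[area] += len(re.findall(re.escape(term.lower()), lowered))
        return hits

    # Usage with a hypothetical excerpt from an annual progress report
    excerpt = ("In 2021 the program delivered ASIST and gatekeeper training to school staff "
               "and strengthened follow-up protocols after referral to community providers.")
    print(tag_record(excerpt))
    # Counts like these would be reviewed against the tracking database and verified with grantee staff.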
Prospective Study
Through primary data collection and drawing on the PSI, TASP, EIRFT-I, and EIRFT-S, we will provide a comprehensive assessment of current GLS grantee implementation activity, progress, and context with a focus on addressing the evaluation priorities and questions previously noted. Specifically, to assess the extent to which GLS grantees are implementing the various aspects of the GLS State/Tribal Program, we will analyze data from the PSI using descriptive statistics to provide a precise characterization of the related strategies and activities and their outputs. This will include, for example: (1) types of suicide prevention strategies implemented by GLS grantees, (2) populations of focus, (3) types of emerging technologies used (e.g., chat, texting, use of social media), and (4) whether grantees are implementing strategies in accordance with their work plan.
To understand the extent to which GLS grantees are implementing treatment and prevention services for diverse cultural populations that address the specific risk and protective factors of the various populations being served, as part of the Behavioral Health Equity Cross Study, we will analyze PSI data using descriptive statistics. This will include an assessment of: (1) cultural adaptations reported by grantees, (2) characteristics of diverse populations taken into consideration when implementing various aspects of the GLS State/Tribal program, and (3) which social determinants of health grantees are intending to address or consider as part of their strategy implementation. We will also use qualitative data analysis techniques to explore themes and variation in grantee implementation activity through review of data from open-ended fields in the PSI. This analysis will focus on contextual information about implementation processes including grantee efforts to adapt and tailor their strategies to address the needs of specific populations (e.g., under-resourced populations that experience behavioral health disparities) and grantee strategies to promote behavioral health equity.
To assess the extent to which GLS grantees are training individuals on how to effectively identify and refer youths who are at risk for suicide, we will analyze data from the PSI and the TASP relying on descriptive statistics. This will include: (1) the number of trainings grantees are implementing by types of training and types of grantees (state, tribal), (2) the number of trainees participating in these activities and their typical role and other characteristics, and (3) the geographic reach of the trainings. As described, using the PSI, all grantees will document the types of prevention strategies they implement including training-related strategies and expenditures. Using the TASP, grantees will provide detailed information related to each individual training they implement.
To assess the extent to which grantees are implementing a response system that ensures timely referrals incorporating safety planning to appropriate community-based mental health care and treatment programs, we will analyze data from the EIRFT-I and EIRFT-S using descriptive statistics. From the EIRFT-S, this will include (1) the number of youths screened and (2) where screenings take place; from the EIRFT-I, it will include (3) the number of youths referred and (4) whether safety planning is integrated into referral protocols.
To understand the extent to which grantees are developing collaborative partnerships, we will analyze data from the PSI using descriptive statistics. This will include (1) the types of organizations grantees partner with to implement early intervention and assessment services, including screenings; (2) the types of community-based mental health care and treatment programs that grantees partner with to create a timely referral response system; (3) the components of the National Strategy that grantees are implementing; and (4) the evidence-based practices (EBPs) that grantees report implementing.
In addition to the analyses described above, we will develop composite scoring, categorization, and comparisons of grantees based on their implementation of prevention strategies. When applicable, we will merge grantee archival data with PSI and TASP data and use multiple regression techniques to explore and summarize associations among program activities. As with the retrospective data, prospective data will also be incorporated into the Behavioral Health Equity Cross-Study analyses and into the impact analyses and cumulative impact synthesis conducted as part of the Impact Evaluation.
Training Outcomes Study Analysis
Descriptive statistics will be used to provide a precise characterization of trainee demographics and participants' behavior after training. Multivariate regression techniques, particularly ordinal and binary logistic regression, will be used to explore and summarize associations between training activities and their proximal outcomes (e.g., trainee knowledge, confidence, and self-efficacy; identification and referral of at-risk youth), as well as variation in outcomes by subpopulations of interest (e.g., community gatekeeper and clinical trainees; state and tribal grantees). To assess the effectiveness of trainings in building participants' skills to identify and manage youths at risk for suicide, and whether those skills are sustained over time, the evaluation team will use ANOVA across three measurement timepoints (baseline, 6 months, and 12 months).
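A compact sketch of these two analytic steps is given below, assuming the statsmodels package and using simulated data with hypothetical variable names; the production analysis would use the actual TSA variables and the modeling decisions described above.

    # Sketch of the Training Outcomes analyses with simulated data and hypothetical variable names.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(0)
    n = 300

    # Simulated trainee-level data: binary identification/referral outcome and covariates
    df = pd.DataFrame({
        "referred": rng.integers(0, 2, n),                        # 1 = referred an at-risk youth
        "training_type": rng.choice(["gatekeeper", "clinical"], n),
        "grantee_type": rng.choice(["state", "tribal"], n),
        "knowledge_score": rng.normal(3.5, 0.8, n),
    })

    # Binary logistic regression: association between training characteristics and referral behavior
    logit_fit = smf.logit("referred ~ C(training_type) + C(grantee_type) + knowledge_score",
                          data=df).fit(disp=False)
    print(logit_fit.summary())

    # Repeated-measures ANOVA on a skills score at baseline, 6 months, and 12 months
    long = pd.DataFrame({
        "trainee_id": np.repeat(np.arange(100), 3),
        "timepoint": np.tile(["baseline", "6mo", "12mo"], 100),
        "skill_score": rng.normal(3.0, 1.0, 300),
    })
    print(AnovaRM(data=long, depvar="skill_score", subject="trainee_id",
                  within=["timepoint"]).fit())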
For the phone simulation analysis, the evaluation team will rely on a scoring matrix to assess the suicide prevention core competencies displayed during the simulations. For gatekeepers, these include attitudes and reactions; an empathetic stance toward the youth; asking directly about suicidal ideation; and referral to a mental health professional. In addition to the competencies assessed for gatekeepers, clinical staff will also be scored on their assessment of risk and protective factors; current suicidality, including triggers, onset, access to means, and a plan; safety planning; and follow-up. Within each competency, the observer will assign the participant a score: generally, a 0 indicates that the skill was not observed, a 1 that it was partially observed, and a 2 that it was fully observed. Throughout the scoring matrix, additional expectations are indicated, including specific terminology or phrasing that may or may not indicate a successful demonstration of competencies. The evaluation team member will score the simulation during the call but may rely on the phone recording to confirm or assess the demonstration of skills. The total skills score will be analyzed descriptively, and the phone simulation subsample will be divided into two groups (high vs. low skills retention) based on a median split of the subsample. These groups will be included in additional analyses to understand the linkages between variation in skills retention at 3 months and subsequent 6- and 12-month self-reported self-efficacy, core identification and management skills, number of contacts with potentially suicidal youth, and referrals to additional services and supports, using repeated measures ANOVA with two groups and three timepoints as well as multivariate regression analyses (e.g., mixed models) to explore the influence of additional covariates.
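The scoring and grouping logic described in this paragraph can be sketched as follows; the competency names and simulated scores below are hypothetical stand-ins for the actual scoring matrix.

    # Sketch of the phone simulation total score and median split (hypothetical scores).
    import numpy as np
    import pandas as pd

    # Each competency is scored 0 (not observed), 1 (partially observed), or 2 (fully observed)
    competencies = ["attitude", "empathy", "asked_directly", "referral"]

    rng = np.random.default_rng(1)
    sim = pd.DataFrame(rng.integers(0, 3, size=(101, len(competencies))),
                       columns=competencies)
    sim["total_skills"] = sim[competencies].sum(axis=1)

    # Median split of the phone simulation subsample into high vs. low skills retention
    median = sim["total_skills"].median()
    sim["retention_group"] = np.where(sim["total_skills"] >= median, "high", "low")
    print(sim["retention_group"].value_counts())
    # These group labels would then be carried into the repeated measures ANOVA and
    # mixed-model analyses of the 6- and 12-month follow-up outcomes.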
Youth Experience, Outcomes, and Resiliency Study
ICF will use descriptive statistics to provide a precise characterization of the youths (aged 14–24 years) who have participated in GLS grant-sponsored treatment or activities using the data collected on the YORS on a quarterly basis up to 12 months after enrollment. We will use analysis of variance and regression models to analyze the change in outcomes, including respondents’: (1) total score of resilience, stress tolerance, and self-management skills; (2) total score of suicidality, including self-harm, passive, and active suicidality; and (3) youth experiences with services received. We will also analyze variation over time in youth outcomes across implementation settings (schools, juvenile justice, and community coalitions) using mixed-effects regression models or generalized estimating equations. (Data related to the study-relevant Behavioral Health Equity Evaluation questions will be analyzed under the Behavioral Health Equity Cross Study.)
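As a hedged sketch of the repeated-measures modeling described above, the example below fits a random-intercept mixed-effects model to simulated, YORS-like data (assuming statsmodels); the variable names and values are hypothetical placeholders, and the production analysis may instead use generalized estimating equations as noted.

    # Sketch of a mixed-effects model for change in youth outcomes (simulated data).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n_youth, n_waves = 300, 4   # baseline, 3-, 6-, and 12-month timepoints

    long = pd.DataFrame({
        "youth_id": np.repeat(np.arange(n_youth), n_waves),
        "timepoint": np.tile([0, 3, 6, 12], n_youth),
        "setting": np.repeat(rng.choice(["school", "juvenile_justice", "community"], n_youth),
                             n_waves),
    })
    # Hypothetical resilience total score with a youth-specific intercept and a small time trend
    long["resilience"] = (30 + rng.normal(0, 3, n_youth).repeat(n_waves)
                          + 0.2 * long["timepoint"] + rng.normal(0, 2, len(long)))

    # Random-intercept model: change over time, allowing the trajectory to differ by setting
    fit = smf.mixedlm("resilience ~ timepoint * C(setting)", data=long,
                      groups=long["youth_id"]).fit()
    print(fit.summary())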
Using qualitative software (e.g., MAXQDA), we will analyze YER Journal textual data to identify the service components that youths are receiving. Our team will code these qualitative data by developing coding rubrics and codebooks and will develop composite scoring, categorization, and comparison techniques using a traditional inductive coding method with well-established procedures for coding and analyzing these data. The research team will analyze photos shared by youth participants through the YER Journal by examining both visual and narrative data in four stages: a photograph analysis based on the research team's interpretations, a photograph analysis based on the participants' interpretations, a cross-comparison, and theorization (Tsang, 2020). Our team will identify common experiences and feelings that emerge across participant descriptions.
Continuity of Care Study Analysis
To assess if GLS prevention activities are effective in developing and supporting continuity of care, we will analyze EIRFT-I data using multivariate regression techniques, particularly binary logistic regression, to explore and summarize the associations between proximal outcomes and youth, provider, and grantee characteristics. We will use these techniques to build upon our previous analyses to understand who in the grantee service area is identifying youths, making referrals, and being connected to services.
To understand the proportion of youths who received referrals and services, and changes in suicidal thoughts, plans, and attempts, ICF will use descriptive statistics to regularly provide a precise characterization of the program's early identification, referral, follow-up, and treatment activities, as well as youth proximal outcomes. This includes (1) the number and characteristics (e.g., gender, race/ethnicity) of youths identified as at risk of suicide by GLS grantees and the settings in which they were identified; (2) the proportion of those youths identified as at risk who received follow-up support and ongoing treatment; and (3) the risk status of the youths receiving mental health services from a GLS provider up to 6 months after initial contact.
To understand the proportion of youths being discharged from inpatient psychiatric units or emergency departments that receive some type of support or follow-up, the nature of the follow-up, and the duration of that support, we will rely on descriptive statistics that follow a youth over the 6-month period after discharge. This new analysis will explore the path of care a youth receives over a 6-month period and what happens after a youth begins services. If possible, we may be able to identify if some paths are more common in different geographic areas (i.e., urban versus rural), different populations served (i.e., tribal versus non-tribal), or if other factors influence a care pathway. We may rely on generalized linear modeling approaches to compare outcomes among different groups. This technique was used in our previous work to understand the effect of rural and urban locations on service receipt and follow-up after referral of at-risk youths to mental health services either by a trained gatekeeper or by a screening program.
In our previous work we implemented an analysis that used geospatial techniques to explore the relationship between areas where identifications occurred (at the county level) and areas where GLS-funded activities occurred to assess whether individuals in GLS service areas were more likely to seek treatment. We will incorporate these techniques for the current study, and if youths in GLS program areas do receive services at a greater rate, our analysis will incorporate the grantee as a possible source of variation in follow-up patterns, either through random-effects models, as before, or through generalized estimating equations in combination with grantee-level fixed effects. Data from the Continuity of Care Study will also be used in the quasi-experimental comparison study described in the Impact Evaluation section.
Impact Evaluation
Impact Analyses
While we lack the ability to randomize youths to the GLS State/Tribal Program, we have methodological and analytic tools available to develop a rigorous, valid, and defensible counterfactual condition. Employing and extending methods previously reported in the research literature (Godoy-Garraza et al., 2015; Walrath et al., 2015; Godoy-Garraza et al., 2019), we will assess the current GLS State/Tribal Program grantee context and develop a quasi-experimental approach that minimizes baseline differences between GLS State/Tribal Program grantee data and potential comparison data on suicide morbidity, mortality, and relevant population characteristics.
We will conduct the analyses in two steps. First, we will estimate the impact of the GLS State/Tribal Program in each grantee community using a state-of-the-art approach to counterfactual estimation, such as a synthetic control method (SCM) in combination with Bayesian additive regression trees (BART). Second, we will use machine learning to identify different patterns of effect and the contextual characteristics that may predict them. An ensemble of tree models will be used to flexibly model the observed outcome, with Bayesian priors used to discourage large trees and to give small weight to any single tree in the ensemble. Machine learning procedures will be used to identify combinations of characteristics and circumstances from a potentially large set of alternatives, as well as their potential interactions, without relying on the stringent functional form assumptions of traditional methods such as stepwise regression. We will use recursive partitioning algorithms to split the data repeatedly to identify subgroups that are homogeneous with respect to the impact variable. Specifically, the so-called "evolutionary algorithm" has been demonstrated to have comparatively better predictive accuracy than methods based on forward stepwise search (Grubinger, Zeileis, & Pfeiffer, 2014). In addition, by applying the same analysis techniques to related variables that are not expected to be affected by GLS interventions (termed "control" outcomes), we gain additional protection against unmeasured confounding.
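The second step (identifying patterns of effect with tree-based partitioning) is illustrated in the sketch below using simulated community-level effect estimates and hypothetical contextual covariates. The sketch uses a standard CART regression tree as a simple stand-in for the evolutionary partitioning algorithm and the BART-based counterfactual estimation described above, and the variable names and values are placeholders only.

    # Sketch of recursive partitioning of estimated program effects (simulated data).
    import numpy as np
    import pandas as pd
    from sklearn.tree import DecisionTreeRegressor, export_text

    rng = np.random.default_rng(3)
    n = 400  # hypothetical grantee communities

    context = pd.DataFrame({
        "rurality": rng.uniform(0, 1, n),
        "baseline_rate": rng.uniform(5, 25, n),      # hypothetical baseline suicide rate per 100,000
        "trainings_per_10k": rng.uniform(0, 50, n),
    })
    # Simulated estimated effect (as if produced by the first, counterfactual-estimation step):
    # larger reductions where training intensity is high
    effect = -0.5 - 0.05 * context["trainings_per_10k"] + rng.normal(0, 1, n)

    tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=25).fit(context, effect)
    print(export_text(tree, feature_names=list(context.columns)))
    # Terminal nodes describe subgroups of communities with relatively homogeneous estimated effects.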
To fill in the records gap of the last 3 years of GLS State/Tribal grantee activities, we will review grantee archival records, including grantee annual reports, training logs, and meeting agendas, to extract information about the types and locations of activities, especially training events, that grantee cohorts funded in FYs 2019–21 have implemented between 2019 and 2023. Retrieval of this information will inform the implementation of the Impact Analyses.
Cumulative Impact Synthesis
We will also conduct analyses to assess the causal linkage of GLS State/Tribal grantee implementation factors to program outcomes and impact using a coincidence analysis (CNA) to support causal inference. It identifies combinations of implementation conditions minimally necessary or sufficient for achievement of a specific outcome (Baumgartner, 2009)—as well as the possible presence of multiple causal paths to an outcome—and it can be applied to large-n or small-n studies (Baumgartner, 2013; Baumgartner & Epple, 2014). For example, in a set of implementation strategies available to grantees, some strategies (e.g., methods for identification of at-risk youths, processes for referral to services and supports) can yield the desired outcome (e.g., reduction in suicide morbidity/mortality) in combination with certain other strategies (e.g., partnering with specific types of agencies), while other combinations of the same strategies may not be necessary and sufficient to yield the desired outcome. Also, the same outcome might be obtained via different bundles of strategies.
We will use CNA to identify the causal paths that lead from the presence or absence of combinations of implementation conditions (e.g., screening, assessment and referral training, outreach, means restriction, or awareness activities) to the continuum of outcomes from proximal to distal (e.g., self-reported suicidal thoughts, plans, and attempts; inpatient hospitalization and emergency department discharges; and mortality) thereby establishing the necessary and sufficient implementation conditions that lead to positive impact.
We will perform the CNA to establish what works, how it works, in what context, and for whom (Whitaker et al., 2020). CNA is designed to support causal inferences, to answer evaluation questions about combinations of conditions that are minimally necessary or sufficient for an outcome, and to identify the possible presence of multiple causal paths to an outcome (Baumgartner & Thiem, 2015). CNA can be applied to large-n and small-n data sets. It is one of a class of models referred to as configurational comparative models (Baumgartner & Falk, 2021) that frame causation in terms of causal structures with one or both of the following characteristics: (1) conjunctivity, in which causes are organized in complex bundles that become operative only when all their components are properly co-instantiated, while each component considered separately is ineffective or leads to different outcomes; and (2) disjunctivity, in which outcomes can result from alternative causal paths, such that when one path is constrained, the outcome may still be produced by an alternate path. Causation in CNA is a relation that holds between factors taking on specific values. In our cumulative impact synthesis, factors that represent implementation conditions will be coded for each grantee as binary, with 0 representing the absence of the factor and 1 its presence, based on clearly defined criteria. This coding reflects the Boolean foundation of CNA, in which AND- and OR-connections represent conjunctions and disjunctions. Boolean algebra provides a set of tools that take data from binary, multi-value, or continuous factors as input and infer causal structures as defined by the so-called INUS theory (Baumgartner & Falk, 2019). In INUS theory, a cause is an insufficient but nonredundant part of an unnecessary but sufficient condition. Using CNA, we will apply INUS logic to identify combinations of intervention characteristics that constitute minimally necessary and sufficient causal paths to GLS State/Tribal Program outcomes.
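The following toy example is offered only as an illustration of the Boolean logic that CNA formalizes: it checks which combinations of binary implementation conditions are sufficient (always followed by the outcome) and necessary (never absent when the outcome occurs) in a small hypothetical grantee table. The factor names and values are invented, and the actual analysis would rely on dedicated configurational software and on the minimization procedures that distinguish CNA from a simple truth-table scan.

```python
# Toy illustration of the Boolean sufficiency/necessity logic underlying CNA.
# Factor names and values are hypothetical; real analyses would use dedicated
# configurational comparative software rather than this exhaustive scan.
from itertools import combinations
import pandas as pd

# One row per grantee; 1 = condition present / outcome achieved, 0 = absent.
data = pd.DataFrame({
    "training":  [1, 1, 0, 1, 0, 1],
    "screening": [1, 0, 1, 1, 0, 1],
    "outreach":  [0, 1, 1, 1, 1, 0],
    "reduction": [1, 0, 0, 1, 0, 1],   # outcome: reduction in suicide morbidity
})
conditions = ["training", "screening", "outreach"]
outcome = data["reduction"] == 1

for r in range(1, len(conditions) + 1):
    for combo in combinations(conditions, r):
        present = (data[list(combo)] == 1).all(axis=1)   # combo co-instantiated
        sufficient = bool(outcome[present].all()) if present.any() else False
        necessary = bool(present[outcome].all()) if outcome.any() else False
        if sufficient or necessary:
            print(" AND ".join(combo),
                  f"-> sufficient={sufficient}, necessary={necessary}")
```

In this fabricated table, for instance, the conjunction of training AND screening emerges as both sufficient and necessary for the outcome, which is the kind of minimally necessary and sufficient bundle the full CNA is designed to recover from real grantee data.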
Behavioral Health Equity Cross-Study Analysis
The Behavioral Health Equity Cross-Study will address evaluation questions that cut across the GLS studies. We will integrate existing data sources and findings from the evaluation’s implementation and outcome studies to understand how the GLS State/Tribal Program invests in and succeeds at decreasing behavioral health disparities, especially among youths who identify as AI/AN, Black, members of military families/veterans, or LGBTQ+. As an overarching study spanning the implementation, outcomes, and impact evaluations, the Behavioral Health Equity Cross-Study will use the data sources described in the previous sections. Primary data sources will include the following instruments: PSI; TASP; EIRF-I and EIRF-S; TSA-P, TSA-F, and TSA-PS; and the YORS and YER Journal. Secondary data sets, including the Youth Risk Behavior Survey, ESSENCE, and Medicaid data, will be used to assess outcomes among subgroups of youths at high risk for suicide. Archival records (e.g., grantee annual reports) will be used to extract information about the types of activities implemented, including any cultural adaptations made and health equity practices employed.
The analyses for the Behavioral Health Equity Cross-Study will address how grantees apply a behavioral health equity lens within their strategies and programs to address the disproportionate effect of suicide on high-risk communities, such as AI/AN youths, Black youths, youths who are members of military families/veterans, and LGBTQ+ youths, and how program implementation, outcomes, and impact may vary across different communities. We will explore innovations in prevention strategies, with an emphasis on technologies and cultural adaptations, to address health equity and reach youth populations disproportionately affected by suicide. We will use qualitative coding and thematic extraction to understand the specific strategies grantees use to tailor approaches to the needs of the subpopulations they serve and to implement strategies that address behavioral health equity. We will use multivariate regression techniques, particularly binary logistic regression, to explore the association between proximal outcomes and the characteristics of youths, providers, and grantees.
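As a hedged illustration of the planned regression step, the sketch below fits a binary logistic regression of a hypothetical proximal outcome (referral to services) on simulated youth- and provider-level characteristics using statsmodels; the variable names, categories, and data are invented for the example and do not reflect the actual analytic files.

```python
# Illustrative sketch of a binary logistic regression relating a proximal outcome
# (e.g., referral to services after identification) to youth-, provider-, and
# grantee-level characteristics. Data and variable names are simulated/hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1500
df = pd.DataFrame({
    "referred": rng.integers(0, 2, n),                                  # 0/1 outcome
    "race_ethnicity": rng.choice(["AI_AN", "Black", "White", "Other"], n),
    "lgbtq": rng.integers(0, 2, n),
    "military_family": rng.integers(0, 2, n),
    "provider_type": rng.choice(["school", "clinic", "community"], n),
})

model = smf.logit(
    "referred ~ C(race_ethnicity) + lgbtq + military_family + C(provider_type)",
    data=df,
).fit(disp=False)

# Report odds ratios with 95% confidence intervals.
odds = np.exp(model.params)
ci = np.exp(model.conf_int())
print(pd.concat([odds.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```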
To address the question of potential behavioral health disparities in the impact of the GLS State/Tribal Program on specific subgroups at high risk for suicide (e.g., AI/AN, Black, and LGBTQ+ youths), we will continue to refine our spatiotemporal models for impact evaluation, combining small area estimation, conditional autoregressive modeling, and integrated nested Laplace approximation with the quasi-experimental approach described in the Impact Evaluation section, which uses BART as an SCM and refines the solution with machine learning techniques. Assessing the impact of an intervention (such as a project, program, or policy) in small geographic areas or in specific subgroups with small population sizes has been challenging because reliable estimates of the outcome of interest are difficult to obtain. An area or domain of estimation is small precisely when direct estimates are extremely unreliable or entirely unfeasible given the sample size. This situation often arises with relatively rare outcomes (such as suicide), as well as with more frequent outcomes (such as suicide-related hospitalizations), when the interest lies in specific segments of the population. The field of disease mapping is concerned with estimating the risk of a disease or health outcome using case counts within small administrative districts or regions. Building on hierarchical models originally proposed for small area estimation, disease-mapping models further incorporate spatial dependence and take advantage of multiple time periods. The recent introduction of Bayesian spatiotemporal models developed for disease mapping (Bauer et al., 2016) explicitly takes advantage of spatial structure together with temporal dependencies to aid estimation. We will apply this approach to suicide prevention impact evaluation. This application both extends the utility of the approach beyond disease mapping and significantly advances the ability to understand the impact of suicide prevention programming in some of the highest-risk populations (Godoy Garraza, Campos & Walrath, 2021).
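For reference, a generic form of the kind of Bayesian spatiotemporal disease-mapping model described above is sketched below; the notation is illustrative only, and the exact specification used in the evaluation may differ.

```latex
% Illustrative, generic Bayesian spatiotemporal disease-mapping model of the type
% described above (not the evaluation's exact specification).
\begin{align*}
  y_{it} &\sim \mathrm{Poisson}\!\left(E_{it}\,\rho_{it}\right)\\
  \log \rho_{it} &= \alpha + \boldsymbol{\beta}^{\top}\mathbf{x}_{it}
                   + u_i + v_i + \gamma_t + \phi_t + \delta_{it}
\end{align*}
```

Here $y_{it}$ and $E_{it}$ denote observed and expected counts (e.g., suicide deaths in a subgroup) in area $i$ and period $t$; $u_i$ is a spatially structured (conditional autoregressive) effect, $v_i$ an unstructured area effect, $\gamma_t$ and $\phi_t$ structured and unstructured temporal effects, and $\delta_{it}$ a space-time interaction, with posterior estimation carried out via integrated nested Laplace approximation.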
17. Display of Expiration Date
All data collection instruments will display the expiration date of OMB approval.
18. Exceptions to the Certification Statement
This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.
1 BLS OES May 2022 National Industry-Specific Occupation Employment and Wage Estimates average annual salary for Survey Researchers (code 19-3022); https://www.bls.gov/oes/current/naics5_541720.htm.
2 BLS OES May 2022 National Industry-Specific Occupation Employment and Wage Estimates average annual salary for Community and Social Service Occupations (code 21-0000); https://www.bls.gov/oes/current/naics5_210000.htm.