

Cross-Site Evaluation of the Garrett Lee Smith Memorial Suicide Prevention and Early Intervention Program

Supporting Statement


A. Justification



The Prevention Initiatives and Priority Programs Development Branch of the Center for Mental Health Services (CMHS) within the Substance Abuse and Mental Health Services Administration (SAMHSA) is requesting clearance for data collection associated with the cross-site evaluation of the Garrett Lee Smith (GLS) Memorial Youth Suicide Early Intervention and Prevention Program (“State/Tribal Suicide Prevention Program”) and the GLS Campus Suicide Prevention Program (“Campus Suicide Prevention Program”). The Garrett Lee Smith Memorial Act (Sec. 520E-1, § 290bb–36a, Suicide Prevention for Children and Adolescents), passed by Congress in October 2004, was the first legislation to provide funding specifically for State/Tribal and Campus suicide prevention programs. Under this legislation, funding has been set aside for states, tribes, and institutions of higher learning to develop, evaluate, and improve early intervention and suicide prevention programs, and the legislation mandates that the effectiveness of these programs be evaluated and reported.

To date, SAMHSA has awarded GLSMA funds to 79 State/Tribal grantees and 88 Campus grantees. The cross-site evaluation of the GLS Suicide Prevention Program was designed to evaluate the effectiveness of suicide prevention activities across multiple sites and to report those findings to Congress. While the desired long-term outcome of suicide prevention activities is a reduction in suicide attempts and deaths by suicide, there are potential intermediary variables that must be adequately and robustly evaluated before suicidal behavior itself can be evaluated. Complex conceptual models that include intermediary pathways of effect, such as those that underpin suicide prevention programs, must be evaluated using a staged framework that allows for the assessment of process, mediating, and long-term outcomes. For example, many suicide prevention programs currently do not have information on whether youth identified as at risk are able to access treatment, an intermediate variable that requires investigation. Furthermore, the data management infrastructure across states and tribes has not reached the consistency and sophistication that would allow for cross-state/tribe tracking and aggregation of suicide attempts and deaths by suicide. For example, states and tribes differ in how they classify suicide attempts and deaths by suicide, which could make aggregation and interpretation of these statistics misleading. The cross-site evaluation, through components designed to capture process, proximal, and intermediate outcomes, as well as information regarding the current status of existing data systems, will supply critical information to the field that will ultimately lead to rigorous collection and interpretation of the long-term outcomes of suicide prevention efforts.


More specifically, to date there have been few systematic studies of these mediating variables, and without the results of such an evaluation, the interpretation of suicidal behavior outcomes (whether positive or negative) will remain impossible. For example, the causal chain upon which early identification gatekeeper training activities are based includes the early identification of youth at risk, their referral to services, their subsequent connection with those services, their receipt of services, the amelioration of their at-risk circumstances, and hence an ultimate reduction in suicide attempts and related deaths. In this scenario one must first understand the impact of the gatekeeper training on referrals to services and subsequent connection to services, for without positive outcomes in these intermediate areas, ultimate outcomes associated with suicidal behavior are unrealistic.


The cross-site evaluation is the first comprehensive and systematic evaluation of the crucial mediating (proximal) outcomes of suicide prevention efforts, such as awareness, knowledge, referrals, and service access. Currently, data collection for the cross-site evaluation is operating under OMB clearance (OMB No. 0930-0286), valid until May 2010. SAMHSA is requesting approval for revisions to the previously approved cross-site evaluation package.


The cross-site evaluation has four stages of information gathering that target the funded program activity areas: (1) the Context Stage, (2) the Product Stage, (3) the Process Stage, and (4) the Impact Stage. Additionally, the cross-site evaluation has an Enhanced Evaluation component that seeks to deepen the knowledge base about youth served in funded suicide prevention programs, with a focus on longer term outcomes related to suicidal behavior. Because different programmatic approaches are funded in the State/Tribal and Campus sites, data collection activities have been tailored to the programmatic activities funded. In addition to assessing the effectiveness of the GLS Suicide Prevention Program, information collected through the cross-site evaluation will continue to be used to report on SAMHSA’s National Outcome Measures (NOMs) that are relevant to program activities, as well as on the Government Performance and Results Act (GPRA) measures identified for this program.


The table below summarizes the data collection instruments and data abstraction activities included in this clearance request.


Table 1 - Summary of Data Collection Activities


State/Tribal Grantees

Data Collection Instruments:
  1. Prevention Strategies Inventory – State/Tribal (PSI-ST) – Attachment A.1
  2. Training Exit Survey – State/Tribal (TES-ST) – Attachment B.1
  3. Training Utilization and Preservation – Survey (TUP-S) – Attachment C.1
  4. Training Utilization and Preservation – Interview (TUP-I) – Attachment D.1
  5. Referral Network Survey (RNS) – Attachment E

Data Abstraction:
  1. Early Identification Referral and Follow Up Analysis (EIRF) – Attachment F.1
  2. Early Identification Referral and Follow Up Aggregate Screening Form (EIRF-S) – Attachment F.2
  3. Training Exit Survey Cover Page – State/Tribal Version (TES-CP-ST) – Attachment F.3

Campus Grantees

Data Collection Instruments:
  1. Prevention Strategies Inventory Baseline and Follow Up – Campus (PSI-C) – Attachment A.2
  2. Training Exit Survey – Campus (TES-C) – Attachment B.2
  3. Suicide Prevention Exposure, Awareness and Knowledge – Students (SPEAKS-S) – Attachment G.1
  4. Suicide Prevention Exposure, Awareness and Knowledge – Faculty/Staff (SPEAKS-FS) – Attachment G.2
  5. Campus Infrastructure Interview (CIFI) – Attachment H.1

Data Abstraction:
  1. MIS Data Abstraction – Attachment F.5
  2. Training Exit Survey Cover Page – Campus Version (TES-CP-C) – Attachment F.4

Campus Grantees Selected for Enhanced Evaluation

Data Collection Instruments:
  1. Student Focus Group Moderator’s Guide – Attachment I.1
  2. Faculty/Staff Focus Group Moderator’s Guide – Attachment I.2
  3. Case Study Key Informant Interviews (7 versions) – Attachments J.1 to J.7



1. Circumstances of Information Collection



a. Background



Youth suicide is an enormous public health problem: it takes the lives of approximately 4,000 adolescents and young adults every year and causes pain, suffering, guilt, and shame for the friends and family members left in the aftermath (National Adolescent Health Information Center [NAHIC], 2004). Although adolescent males die from suicide more frequently than adolescent females, adolescent females are more likely to attempt suicide (NAHIC, 2004). Of all youth populations, American Indian/Alaska Native males have the highest suicide rates (Anderson & Smith, 2003). Despite these prevalence data, the full scope of the problem is not known, because of the manner in which cause of death is recorded on death certificates and because of the ambiguity of homicides and accidental deaths in which the person attempting suicide intentionally places himself or herself in harm’s way (U.S. Public Health Service, 1999).

Youth suicide can be linked to a number of mental health disorders as well as substance abuse. In 2003, the President’s New Freedom Commission on Mental Health recognized youth suicide prevention as a major priority because of the high rates of youth suicide, including large numbers of individuals who had been diagnosed with mental illness and/or substance abuse disorders (Institute of Medicine, 2002). Adolescence is a time of rapid maturation and increasing responsibility, which can leave many youth feeling hopeless about the future. This applies particularly to college students and young adults between the ages of 20 and 24, the age range in which the highest youth suicide rates are observed (NAHIC, 2004). In a study by the American College Health Association (as cited in the GLSMA, Public Law 108-355), 61 percent of college students reported feeling hopeless, 45 percent reported feeling so depressed they could barely function, and 9 percent reported feeling suicidal.

Despite these high prevalence rates, youth suicide remains a public health problem that has gone largely unaddressed. This is unfortunate because suicide is preventable. Up to 80 percent of teens who attempt suicide display warning signs that, if acted upon, could prevent attempts (National Mental Health Association, 2005). These signs may include indirect or direct suicide threats, an obsession with death, or giving away belongings. At the same time, because of the negative social norms that surround mental health and suicide, youth often do not disclose their underlying emotional state or behavioral intentions. Consequently, it is extremely important to recognize these signs when they are exhibited; failure to do so may represent a missed opportunity for suicide prevention and intervention.

Suicide warning signs are less likely to occur, however, if protective factors are first recognized and taken into consideration. Various studies have shown that the balance and interaction of risk and protective factors contribute to the potential for suicide (Moscicki, 1997). Youth who exhibit risk factors, such as depression, impulsivity, alcohol and substance abuse, and a history of trauma or abuse, are believed to have a greater potential for suicidal behavior (Beautrais, 2000). Examples of protective factors include problem-solving skills, effective clinical care, strong connections to family and community support, and restricted access to lethal means of attempting suicide. Research into this issue has generated goals and strategies for reducing the occurrence and subsequent burden of youth suicide, built on the foundation of reducing risk factors while increasing protective factors (U.S. Public Health Service, 2001).

However, suicide does not occur simply because of an unfavorable mix of these factors, nor will a universal solution emerge from any single combination of specific risk and protective factors. As the reports discussed below emphasize, successfully preventing youth suicide will take involvement from the mental health, substance abuse, juvenile justice, primary care, and education systems, the media, and other youth-serving organizations. Three documents, Reducing Suicide: A National Imperative (Institute of Medicine, 2002), The Surgeon General’s Call to Action to Prevent Suicide (U.S. Department of Health and Human Services [DHHS], Public Health Service, 1999), and National Strategy for Suicide Prevention: Goals and Objectives for Action (U.S. DHHS, Public Health Service, 2001), provide overlapping recommendations for how this problem can be effectively addressed.

The Institute of Medicine’s Reducing Suicide: A National Imperative (2002) highlighted the prevalence of suicide attempts and suicidal behaviors and emphasized the need for research to understand how to prevent suicide, while acknowledging the challenges associated with such research. The Surgeon General’s Call to Action to Prevent Suicide (U.S. Public Health Service, 1999) highlighted the need for increased public awareness of the problem of youth suicide; interventions to enhance treatments, services, and programs; and a methodology to advance the science of suicide prevention, better known as AIM: awareness, intervention, and methodology. AIM is the foundation for the 15 key recommendations highlighted in the Surgeon General’s report. As a result of collaboration among the Federal government, many private and public stakeholders, and family members of persons who died by suicide, the AIM framework became the catalyst for a more thorough and comprehensive strategy: the National Strategy for Suicide Prevention: Goals and Objectives for Action (U.S. Public Health Service, 2001).


On October 21, 2004, Congress passed the Garrett Lee Smith Memorial Act (GLSMA), which was signed into law by President Bush, to mobilize efforts supporting suicide prevention and early intervention. The act authorizes $82 million over 3 years to support States, Tribal communities, and colleges and universities in developing and implementing suicide prevention initiatives. It builds strongly on The Surgeon General’s Call to Action to Prevent Suicide (U.S. Public Health Service, 1999) and the National Strategy for Suicide Prevention (U.S. Public Health Service, 2001) in its directive to use the scientifically proven methodologies identified in those reports to target the youth and young adults who have historically had the highest suicide rates. Products of this effort, which encapsulate recommendations from each of these reports, include the GLS State/Tribal Youth Suicide Prevention and Early Intervention Program and the GLS Campus Suicide Prevention and Early Intervention Program. Objectives of these two programs range from providing early intervention and assessment for youth at risk for mental or emotional disorders, to conducting information and awareness campaigns that inform gatekeepers, family members, peers, and others about the risk factors associated with youth suicide, to training physicians, educators, and providers to identify youth who exhibit behavior that places them at risk for suicide. The legislation not only provides support for implementing these strategies but also directs the funded programs to evaluate the effectiveness of their targeted interventions at the local level, and requires a cross-site evaluation and report to Congress.


On September 20, 2005, the Center for Mental Health Services (CMHS) of the Substance Abuse and Mental Health Services Administration (SAMHSA) announced the award of 14 State/Tribal and 22 Campus cooperative agreements under the GLS Suicide Prevention Program. Congress authorized an additional $27 million in FY 2006 to provide further funding for States, Tribal communities, and colleges across the country. In May 2006, SAMHSA announced the award of 8 additional State/Tribal cooperative agreements. In September 2006, an additional 14 State/Tribal cooperative agreements and 34 Campus cooperative agreements were awarded. In June 2007, 2 additional State/Tribal cooperative agreements were awarded. In September 2008, 30 additional State/Tribal sites (7 of which were cohort 1 continuation awards) and 17 Campus sites (6 of which were continuation awards) were awarded funding. Most recently, SAMHSA awarded 18 more State/Tribal cooperative agreements and 22 more Campus cooperative agreements. In sum, the GLS Suicide Prevention Program has funded 79 State/Tribal grantees and 88 Campus grantees.


b. The Need for Evaluation



Section 520E(g) of the GLSMA mandates a cross-site evaluation of the effectiveness of the activities carried out under the State/Tribal Youth Suicide Early Intervention and Prevention Program. The GLSMA specifies that a report to Congress must be submitted:


“to analyze the effectiveness and efficacy of the activities conducted with grants, collaborations and consultations under [Section 520E].”


In addition, Section 520E-2(f) of the GLSMA mandates a cross-site evaluation of the Campus Suicide Prevention Program. The GLSMA specifies that a report must be submitted to Congress that includes:


“an evaluation of the grant program outcomes, including a summary of activities carried out with the grant and the results achieved through those activities,” including “recommendations on how to improve access to mental and behavioral health services at institutions of higher education, including efforts to reduce the incidence of suicide and substance abuse.”


The cross-site evaluation will serve as a primary mechanism through which the initiative will be understood, improved, and sustained. As described previously, the field has a dire need for a better understanding of the impact of suicide prevention efforts, first and foremost on the intermediate outcomes of those efforts and ultimately on suicidal behavior itself. Because this suicide prevention initiative is the first to be federally funded, the rigor and utility of the evaluation and its findings are particularly critical. The emphasis of the cross-site evaluation is therefore to gather the needed intermediate outcome information and data system infrastructure information across grantees, so that in future years of the GLS initiative, cross-site evaluation efforts can move strategically forward on scientific ground to assess the impact of funded efforts on suicidal behavior. To that end, the GLS cross-site evaluation will collect and analyze comprehensive data that focus on the context within which these programs are implemented; the products and services that are developed and utilized; the process through which programmatic activities are implemented; and the impacts associated with those activities.


A government contractor (referred to as the cross-site evaluator throughout this document) coordinates data collection for the cross-site evaluation and provides support for its local-level implementation. Each grantee is required by the cooperative agreement to both conduct a self-evaluation and to participate in the cross-site evaluation. In this partnership between the cross-site evaluator and the local evaluators, the cross-site evaluator provides training and technical assistance regarding data collection and research design for the cross-site evaluation. In addition, the cross-site evaluator directly collects data, receives data from grantee data collection efforts, monitors data quality, and provides feedback to grantees. The data collection procedures, while systematically applied across funded sites, are specific to the local programmatic activities and infrastructure supporting those activities. The data gathered through the cross-site evaluation will continue to be utilized for both grantee-specific and national assessments of the program.


c. Previously Approved Clearance


Currently, data collection for the cross-site evaluation is operating under OMB clearance (OMB No. 0930-0286) valid until May 2010. What follows is a brief description of the evaluation design included in the previously approved OMB request.


The four-stage cross-site evaluation is designed to answer the following overarching questions:


  • What types of prevention/intervention programs, services and products are used with youth determined to be at risk for suicidal behavior?

  • What is the reach of program services, products, and strategies?

  • To what extent do collaboration and integration influence referral mechanisms and service use?

  • What is the impact of program services, products, and strategies on knowledge, process, and behavior?


The cross-site evaluation stages are described below.


Context Stage. The purpose of the Context stage is to gain an understanding of grantees’ program plans, including each grantee’s target population, target region, service delivery mechanisms, service delivery settings, types of program activities to be funded, evaluation activities, existing data sources, and the availability of data elements to support the cross-site evaluation. Collectively, the information learned through the context stage is used to support other components of the cross-site evaluation.


Product Stage. The purpose of the Product stage is to describe the development and utilization of prevention strategies at each State/Tribal and Campus grantee site. These prevention strategies may include public awareness campaigns; outreach and awareness events; gatekeeper trainings; life skills development activities for youth; policies and protocols for responding to youth at risk; means restriction strategies; screening programs; and enhanced services, including early intervention, family support, and post-suicide intervention services.


Process Stage. The process stage of the cross-site evaluation assesses progress on key activities related to the implementation of each grantee’s suicide prevention plan. Because the State/Tribal and Campus programs take different approaches to suicide prevention, the type of information collected differs by type of grantee. Given that training is a major component of most grantees’ suicide prevention programs, this stage is designed to collect information on the major characteristics of trainings from both State/Tribal and Campus grantees, such as the type of training and the roles and demographics of participants. For State/Tribal grantees, information is collected on participants’ intended use of and satisfaction with the training immediately following the training experience. For a sample of these participants, qualitative interviews are conducted two months after the training to understand how participants have utilized and retained the knowledge, skills, and/or techniques they learned through the suicide prevention program training. For State/Tribal grantees, data collected through the process stage are also used to examine collaboration among the different organizations/agencies involved in youth referral networks and how these networks change over time. For Campus grantees, this component examines the suicide prevention exposure, awareness, and knowledge of faculty/staff and students as well as the suicide prevention infrastructure on campuses.


Impact Stage. The purpose of the impact stage is to assess the impact that the suicide prevention programs have on youth who are at risk for suicide. Existing data sources are used to assess the impact of program activities at the State/Tribal and Campus grantee levels. To assess the impact of State/Tribal program activities, existing information on youth referred for services, and on service receipt, as a result of early identification activities is analyzed. To assess the impact of Campus program activities, existing administrative data on the number of students at risk for suicide, the school retention rate, the number of students who seek services, and the types of services received, including emergency service use, are analyzed to determine the impact on the student and campus populations.


Enhanced Evaluation. The cross-site evaluation design also includes an enhanced evaluation component that seeks to deepen what is learned about youth served in funded suicide prevention programs, with a focus on longer term outcomes related to suicidal behavior. Through an interagency agreement between SAMHSA and the CDC, the enhanced evaluation provides funds for additional evaluation activities in selected grantee sites. The information collected through the enhanced evaluation is used to analyze the direct and measurable impact of program activities on proximal outcomes, such as the knowledge, skills, and attitudes of professionals working with at-risk youth in a variety of settings, and on distal, community-level outcomes, such as the number of children referred for services and long-term changes in skills and attitudes.


d. Clearance Request


SAMHSA is requesting approval for revisions to the previously approved cross-site evaluation package. The fundamental design of the cross-site evaluation remains unchanged. Drawing on three years of data collection experience and on feedback from grantees, we have improved the cross-site evaluation data collection instruments to reduce response burden, maximize the utility of the data for all stakeholders, and deepen our understanding and knowledge of particular areas in the suicide prevention field. Revisions to the cross-site evaluation are summarized in Section A2b.


2. Purposes and Use of the Information Collection



What follows is a description of the major components of the cross-site evaluation and their associated data collection instruments, the revisions from the previously approved package, the uses of the information collected through the cross-site evaluation, and the importance of the cross-site evaluation in addressing National Outcome Measures (NOMs) and GPRA reporting.


a. Cross-Site Evaluation Design and Data Collection Instruments


The various components of the cross-site evaluation are described below. Since there are differences between the State/Tribal and Campus program approaches towards suicide prevention, the type of information collected differs by type of grantee.


Context Stage


The purpose of the context stage is to gain an understanding of grantees’ program plans, including each grantee’s target population, target region, service delivery mechanisms, service delivery settings, types of program activities to be funded, evaluation activities, existing data sources, and the availability of data elements to support the cross-site evaluation. The cross-site evaluation team will use existing grant applications to gather information on grantees’ programs and the contexts in which they are implemented. Because information gathering in this stage uses existing grantee applications and is conducted by the cross-site evaluation team, there is no formal data collection instrument and no associated response burden for grantees. Collectively, the information learned through the context stage is used to inform other components of the cross-site evaluation.


Product Stage


The purpose of the product stage is to describe the development and utilization of prevention strategies at each State/Tribal and Campus grantee site. The Prevention Strategies Inventory (PSI) (see Attachments A.1 and A.2) will be administered to one representative from each of the State/Tribal and Campus grantees. This inventory asks grantees to describe the different types of prevention strategies they have implemented, such as public awareness campaigns; outreach and awareness events; gatekeeper trainings; life skills development activities for youth; policies and protocols for responding to youth at risk; means restriction strategies; screening programs; and enhanced services, including early intervention, family support, and post-suicide intervention services. There are two slightly different versions of the inventory for Campus grantees and State/Tribal grantees. Grantees will first complete the Baseline version. Thereafter, they will complete the Follow Up version on a quarterly basis over the duration of their grant period.


Process Stage



The process stage of the cross-site evaluation assesses progress on key activities related to implementation of each grantee’s suicide prevention plans. Since there are differences between State/Tribal and Campus program approaches towards suicide prevention, the type of information collected differs by type of grantee. This stage includes several data collection instruments and data abstraction processes.


Training to enhance awareness, knowledge, early identification, and referral of youth at risk for suicide is a primary program activity for most State/Tribal and Campus grantees. Both Campus and State/Tribal grantees are required to report aggregate training participant information for all trainings conducted as part of their suicide prevention programs. These data are aggregated from existing sources, such as attendance sheets and management information systems. Grantees are responsible for aggregating these data and submitting them to the cross-site evaluation team in the format of the Training Exit Survey Cover Page (TES-CP) (see Attachments F.3 and F.4). There are two slightly different versions of the Cover Page for Campus grantees and State/Tribal grantees.



To assess the content of the training, participants’ intended use of the skills and knowledge learned, and satisfaction with the training experience, the Training Exit Survey (TES) (see Attachments B.1 and B.2) will be administered to all participants immediately following the conclusion of the training. The TES has two parts: the core section collects information on participant role, demographics, and satisfaction with the training experience, while the modules ask questions about participant knowledge, self-efficacy, and intent to use, tailored to particular training types. There are two slightly different versions of the survey for Campus grantees and State/Tribal grantees.



For State/Tribal grantees, the quantitative Training Utilization and Preservation Survey (TUP-S) (see Attachment C.1) will be administered to a random sample of trainees two months following the training to expand our knowledge of how participants utilize and retain the knowledge, skills, and/or techniques learned through the training. The TUP-S will systematically measure gatekeeper behaviors and will include measures of self-efficacy, awareness and education efforts, and, most importantly, suicide identification behavior. The TUP-S will collect demographic information about individuals identified at risk, information about the subsequent referrals and/or supports provided by the trainee, and any available information about services accessed by the at-risk individual.



The Training Utilization and Preservation Interview (TUP-I) (see Attachment D.1) is a qualitative follow-up interview targeted toward locally developed and understudied standardized training curricula as well as toward particular understudied gatekeeper trainee populations. The TUP-I will be administered to respondents two months following the training experience to assess whether the suicide prevention knowledge, skills, and/or techniques learned through training were utilized and had an impact on youth. The TUP-I will be administered for 10 selected trainings per year. The interviews are semistructured and open-ended.



For Campus grantees, the Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS) (see Attachments G.1 and G.2) will be administered to students, faculty, and staff at funded campuses annually over the grant period. The survey will collect information about respondents’ perceptions of suicide, suicide prevention, and resources for help. There are two versions of this survey: one for students and one for faculty/staff. Both versions examine the exposure of campus populations to suicide prevention initiatives; awareness of appropriate crisis interventions, supports, services, and resources for seeking mental health help; knowledge of myths and facts related to suicide and suicide prevention; and perceived and personal stigma related to depression and seeking mental health help. The student version additionally asks respondents about their sense of connectedness to the campus community and their help-seeking behaviors.



For State/Tribal grantees, the Referral Network Survey (RNS) (see Attachment E) will be administered to representatives of the youth-serving organizations and/or agencies that form referral networks supporting youth identified at risk. The RNS examines how collaboration and integration are used to share and transfer knowledge, resources, and technology among State/Tribal program agencies and organizational stakeholders; how these networks influence referral mechanisms and service availability; policies and protocols regarding follow-up for youth who have attempted suicide or who are at risk for suicide; and access to electronic databases. The RNS will be administered to referral networks on an annual basis over the period of the grant.


For Campus grantees, the Campus Infrastructure Interview (CIFI) (see Attachment H.1) is designed to gather information about campus infrastructure, programs, policies, and planning related to suicide prevention; it involves key informant interviews conducted by the cross-site evaluation team via teleconference with each campus twice during the life of the grant. These semistructured interviews are conducted with up to five site representatives to gather multiple and varied perspectives on campus-based infrastructure development around suicide prevention activities. These representatives include: (1) an administrator, (2) a student leader, (3) counseling center staff, (4) faculty/staff from a human services department, and (5) faculty/staff from a non-human services department. Questions on the CIFI include whether respondents are aware of suicide prevention activities, what the campus culture is related to suicide prevention, and what specific efforts are in place to prevent suicide among the campus population.


Impact Stage


The purpose of the impact stage is to assess the impact that the suicide prevention programs have on youth who are at risk for suicide. Existing data sources are used to assess the impact of program activities at the State/Tribal grantee and the Campus grantee levels.


To assess the impact of State/Tribal program activities, existing information on youth referred for services and on service receipt as a result of early identification activities is analyzed. The Early Identification, Referral and Follow Up Analysis (see Attachment F.1) requires State/Tribal grantees to share existing data with the cross-site evaluation team on the number of youth identified at risk as a result of early identification activities, referred for services, and presenting for services. The types of information that will be shared with the cross-site evaluation include basic demographic information; types of service referrals; and types of services received, including mental health assessments, mental health treatment, emergency services, and nontraditional support services.


State/Tribal grantees are also required to report aggregate screening information for all youth screened as part of their suicide prevention programs. These data are aggregated from existing data sources. Grantees are responsible for aggregating these data and submitting them to the cross-site evaluation team in the format of the Early Identification, Referral and Follow-up Aggregate Screening Form (see Attachment F.2).


To assess the impact of Campus program activities, the cross-site evaluation team will ask campus sites to engage in an MIS data abstraction process (see Attachment F.5) to submit existing administrative data on the number of students who are at risk for suicide, the school retention rate, the number of students who seek services, and the types of services received, including emergency service use. These data are analyzed to determine the impact of Campus program activities on the student and campus populations.


Enhanced Evaluation


The cross-site evaluation design also includes an enhanced evaluation component that seeks to deepen what is learned about youth served in funded suicide prevention programs, with a focus on longer term outcomes related to suicidal behavior. Through an interagency agreement between SAMHSA and the CDC, the enhanced evaluation provides funds for additional evaluation activities in selected grantee sites. To obtain a comprehensive understanding of the integration of community-based behavioral health services with services provided by college or university campuses, SAMHSA will conduct case studies of four exemplary Campus suicide prevention programs. The goal of the Campus Case Studies is to understand how a public health approach can be successfully applied as a model for campus suicide prevention efforts. The case studies will explore, in a systematic manner: the suicide prevention related infrastructures and supports (e.g., clinical and non-clinical) that exist on selected GLS-funded campuses; the various student-level factors that are related to suicide prevention efforts (e.g., protective factors, coping strategies, social norms, and facilitators of and barriers to student access to and receipt of behavioral healthcare); campus interdepartmental collaboration and the relationship between various efforts to promote student mental health and wellness; and the extent to which the campus infrastructures and supports promote and address these factors.


Student Focus Groups will be conducted on the selected campuses (see Attachment I.1). This component will assess student risk and protective factors related to mental health, help-seeking behaviors, and knowledge of prevention activities on campus and their perceived effectiveness. The findings will help researchers more fully understand student-level factors in relation to the population-level factors addressed by the SPEAKS-S. Questions address stressors that different groups of students face while in college, barriers to seeking help, attitudes and stigma related to seeking help, and the accessibility of the campus counseling center. Six focus groups will be conducted on each campus once over the data collection period. The following groups of students may be represented in the focus groups, as decided by the campus: (1) first-year students, (2) athletes, (3) international students, (4) lesbian, gay, bisexual, and transgender (LGBT) students, (5) Greek life students, (6) graduate students, and (7) residential advisors/peer educators. Recruitment will be conducted by campus project staff.


Faculty and staff focus groups will also be conducted to assess the campus’s approach to prevention, attitudes and stigma around student mental health and wellness on campus, campus infrastructure supports for students who need mental health help, and the general campus climate around mental health and wellness (see Attachment I.2). Faculty and staff will also describe their knowledge of prevention activities on campus and the perceived effectiveness of these efforts. Two faculty focus groups and one staff focus group will be conducted on each campus once over the period of data collection.


Case Study Key Informant Interviews (CSIs) will include seven qualitative interview versions: (1) Administrator, (2) Counseling Staff, (3) Coalition Member – Faculty, (4) Prevention Staff, (5) Case Finder, (6) Campus Police, and (7) Student Leader. Local project staff will be responsible for identifying appropriate respondents for each CSI version and scheduling the interviews to occur during site visits by the case study team (see Attachments J.1 to J.7). Seven individuals from each of the campus sites will be selected as key informants to participate in the CSIs in each of the two stages of the GLS Campus Case Studies. Questions on the CSIs include whether respondents are aware of suicide prevention activities, what the campus culture is related to suicide prevention, and what specific efforts are in place to prevent suicide among the campus population. Items are formatted as open-ended, semistructured questions. On the second site visit, the case study team will incorporate preliminary findings from the case studies into the interviews, which may be modified to some extent to collect more comprehensive information and to gather feedback from local key informants on the context of the preliminary findings.


b. Revisions


Below is a summary of revisions to the previously approved cross-site evaluation package and the rationale behind each of the program changes:


  • The original OMB clearance was requested and approved for the first 3 years of the evaluation, through May 2010. Respondent burden for the revised clearance is calculated for the next 3 years of data collection, from May 2010 to May 2013.


  • The number of grantees for which burden is calculated is 86 (48 State/Tribal grantees and 38 Campus grantees), which represents the number of currently active grantees and is close to the 91 grantees used in the previously approved package. It should be noted that we are using this number as an estimate of the number of grantees active per year. Forty-six of the 86 grantees were funded in October 2008 and will reach the end of their grant period in September 2011; at that point, additional grantees may be funded. We therefore estimate that in a given year there will be 86 active grantees.


  • For the Product Stage, the Prevention Strategies Inventory (PSI) (see Attachments A.1 and A.2) now includes improved categories that better describe the prevention strategies utilized by campus and state grantees and the particular strategies adopted by tribal grantees. Questions have been tailored to the different types of prevention strategies, and several questions have been removed. These changes enhance the utility and accuracy of the data collected.


  • In the Process Stage, several improvements have been made to the instruments designed to collect data on gatekeeper training. To enhance our understanding of participant knowledge, self-efficacy, and intent to use for particular types of trainings, modules for particular training types have been added to the Training Exit Survey (TES) (see Attachments B.1 and B.2), while the core satisfaction section of the questionnaire used for all training types has been shortened. Furthermore, we propose to implement this survey for Campus grantees in order to significantly increase our understanding of training activities implemented by Campus sites. Many Campus grantees already implement some form of exit survey and have expressed interest in participating in this data collection effort.


To expand our knowledge base on the utilization and retention of participants’ knowledge, skills, and/or techniques learned through the training, a telephone-administered quantitative survey, the Training Utilization and Preservation Survey (TUP-S) (see Attachment C.1), will be administered to a random sample of trainees two months following the training. The rich qualitative data collected through the qualitative interviews have allowed the cross-site team to identify a full range of gatekeeper behaviors that can now be measured systematically using a quantitative survey. This approach will lead to larger samples and a more comprehensive understanding of gatekeeper training and its impact on gatekeeper behavior.


Given that the qualitative follow-up interviews with trainees two months post-training have provided rich information on how trainees have utilized the QPR, ASIST, and AMSR training curricula, the improved qualitative Training Utilization and Preservation Interview (TUP-I) (see Attachment D.1) will be targeted toward relatively understudied training types, such as locally developed training curricula and understudied standardized curricula (e.g., safeTALK, Yellow Ribbon, Sources of Strength).


  • The Referral Network Survey (RNS) utilized in the Process Stage has undergone several changes. The mode of administration will be changed from web to phone in order to boost response rates. While several questions have been removed, questions have been added on agency protocols for providing follow-up services to youth who attempt suicide and on the availability and accessibility of electronic data systems.


  • The Suicide Prevention Exposure, Awareness and Knowledge Survey for Students (SPEAKS-S), part of the Process Stage, has been expanded to improve our understanding of students’ perceptions around mental health, their self-efficacy in recognizing and responding to individuals at risk, their sense of connectedness, and their help-seeking behaviors. The Suicide Prevention Exposure, Awareness and Knowledge Survey for Faculty/Staff (SPEAKS-FS) has been similarly modified, with the exception of the items regarding connectedness and help-seeking behaviors.


  • The Campus Infrastructure Interview (CIFI), part of the Process Stage, has been modified to collect more specific information about each campus’s public health approach to suicide prevention, the impact of the GLS program, and the sustainability of suicide prevention efforts on campus.


  • For the Enhanced Evaluation component of the Cross-Site Evaluation, we propose to implement case studies of four exemplary Campus suicide prevention programs. These case studies aim to obtain a comprehensive understanding of the integration of community-based behavioral health services with services provided by college or university campuses. Case study methods include key informant interviews with faculty, staff, and students and focus groups with faculty and students.


c. Uses of Information Collected through the Cross-Site Evaluation


Information gathered through the four stages of the cross-site evaluation of the GLS Suicide Prevention and Early Intervention Programs describes, for State/Tribal grantees, (1) the context in which suicide prevention activities are being implemented, (2) the products and services funded through the program, (3) the training experiences of individuals who receive training as part of the suicide prevention programs, (4) the utilization and penetration of the skills, knowledge, and techniques learned through suicide prevention training programs, and (5) the referral networks in place to support youth identified at risk for suicide.


Despite the extensive knowledge that research has provided regarding suicide risk and protective factors, little is known about how these factors integrate and work in concert to evoke suicidal behavior or to prevent it (Institute of Medicine, 2002). Specifically, even though gatekeeper training is a common suicide prevention activity, little information is available about the extent to which gatekeeper training actually supports prevention of and intervention with high-risk youth. Data describing trainee perceptions of their enhanced awareness of suicide risk factors, and of how to recognize and appropriately respond to those risk factors as a result of training activities, are limited. Similarly, data describing how the training trainees received increased referrals for mental health services and/or social support will add to the existing knowledge base about the effectiveness of suicide prevention programs. In addition, little information exists about the referral networks that support youth identified at risk within communities sponsoring suicide prevention programs. Data describing the extent to which referral networks exist and are being utilized will contribute extensively to the existing knowledge base and assist other States and Tribal communities in implementing referral networks.


For Campus grantees, the information gathered through the cross-site evaluation describes (1) the context in which suicide prevention activities are being implemented, (2) the products and services funded through the program, (3) the training experiences of individuals who receive training as part of the suicide prevention programs, (4) the suicide prevention exposure, awareness, and knowledge among campus students and faculty/staff at two points in time, and (5) the campus infrastructure in place to support suicide prevention program activities.


Suicide prevention is an important issue for colleges and universities across the country. Existing research shows that college students face enormous pressures and often have difficulty dealing with these stressors (as cited in the GLSMA, Public Law 108-355); however, little is known about whether suicide prevention activities are reaching the students being targeted. Data describing campus students’ and faculty/staff’s exposure to suicide prevention activities and their awareness and knowledge of suicide risk factors will continue to contribute significantly to the existing knowledge base. These data, for example, will continue to inform policymakers and federal representatives in their decision making around appropriations and funding, as well as youth and their families in their everyday efforts to identify and respond to risk. Collectively, and with the information provided through the cross-site evaluation, the effort to prevent suicide can be approached from multiple perspectives, and the utility of the National Strategy goals and activities, built upon through the GLS Suicide Prevention and Early Intervention Program, can be assessed and documented, while simultaneously advancing the field of suicide prevention.


In totality, the data collected as part of the cross-site evaluation will continue to be useful to SAMHSA and its partners, other Federal agencies, the State/Tribal grantees, the Campus grantees, legislators, federal administrators, the field of suicide prevention, individual youth and their families, and the communities in which they live. Comprehensive information gathered from multiple sites at various levels and stages of their programmatic activity will continue to substantially augment the existing knowledge base.


In addition, and of equal importance, SAMHSA will continue to use the results from the cross-site evaluation to develop policies and provide information to other States, Tribal communities, and campuses regarding the development and implementation of suicide prevention programs, as well as to develop and refine future funding priorities for the GLS Suicide Prevention Program and similar programs. Finally, information from the cross-site evaluation helps other SAMHSA programs, such as the Linking Adolescents at Risk to Mental Health Services grantees, develop and implement suicide prevention activities, design comprehensive data collection efforts to monitor those activities, and report to local and federal stakeholders. If these data are not collected, policymakers and program planners at the Federal and local levels will not have the information necessary to determine the extent to which suicide prevention activities are effective and having an impact on youth at risk for suicide. Without this evaluation, Federal and local officials will not know whether the suicide prevention programs implemented as part of the GLSMA had an impact on suicide prevention and the identification of at-risk youth, or whether GLS grantee programs are meeting the goals of the GLSMA.


The stage-specific utility and contribution of the cross-site data collection to SAMHSA’s mission and decision making are described below:


Context Stage. Specifically, the cross-site evaluation team and SAMHSA will use information collected through the context stage to assess the availability of existing data sources to report on program activities and to support GPRA reporting. Assessing the availability of existing data will also support analyses conducted as part of the impact stage of the cross-site evaluation.


Product Stage. Specifically, SAMHSA will use information gained through the cross-site evaluation to describe the prevention strategies that were developed and/or utilized as part of suicide prevention programs. Information collected as part of the product stage will inform other States and Tribal communities, as well as campuses, across the country as to what products and services support suicide prevention.


Process Stage. As part of the process stage, specific findings related to training activities will inform SAMHSA, States, Tribal communities, and Campuses about the types of training activities being implemented through these funded suicide prevention programs, who is being trained, the intended and actual utilization and impact of those trainings, and overall satisfaction with training experiences. This information will assist grantees in implementing training activities as part of their suicide prevention programs. In addition, information collected through the training exit survey will continue to inform grantees about any necessary training modifications and/or enhancements, and follow-up training information will help indicate the extent to which training activities are having an impact on youth in the community. Also as part of the process stage, specific findings related to referral networks will inform SAMHSA and State/Tribal suicide prevention efforts across the country by describing the organizations involved in referral networks, the types of relationships that exist, the extent to which grant funding enhanced the development of referral networks, and the extent to which these networks are being used to support high-risk youth. For funded State/Tribal grantees, information collected during the first administration of the referral network survey will assist them in further developing their referral networks in years 2 and 3 of grant funding.


As part of the process stage for Campus programs, specific findings related to student and faculty/staff exposure, awareness, and knowledge of suicide prevention activities will continue to assist other campuses across the country in assessing the potential impact of suicide prevention activities on their campuses. For funded campuses, information collected through the awareness and knowledge surveys will assist in local planning and implementation of awareness campaigns and activities in the out years of grant funding. Data collected through the campus infrastructure interviews will inform SAMHSA and other campuses across the country about what is involved in building a campus suicide prevention infrastructure, how campuses respond to crises, and what has been effective. Information collected through the infrastructure interviews will also assist funded campus grantees in identifying necessary modifications and improvements to their existing infrastructures.


Specific examples of ways SAMHSA has utilized the cross-site evaluation data include adding a focus on State/Tribal efforts to respond to youth suicide attempters, based on data from the Prevention Strategies Inventory (PSI) indicating that this was an underserved population. Data from the Training Exit Survey and the Training Utilization and Preservation Interview (TUP-I) have been used to provide additional program guidance to grantees on their training programs. Finally, data from the Early Identification, Referral, and Follow-up Analysis (EIRF) will be used to follow up on youth at risk for suicide who cannot be seen for services within three months.


Overall, data collected through the cross-site evaluation will inform policy decisions, the continued improvement of funded State/Tribal and Campus suicide prevention programs, and suicide prevention efforts for other States, tribal communities and campuses across the country. SAMHSA will also use data collected as part of the cross-site evaluation to provide objective measures of its progress toward meeting targets of key performance indicators put forward in its annual performance plans as required by law under the GPRA.


Enhanced Evaluation. The goal of the Campus Case Studies (CCS) is to understand how a public health approach to youth suicide prevention may be successfully implemented in post-secondary educational settings. The CCS will explore, in a systematic manner: the suicide prevention related infrastructures and supports (e.g., clinical and non-clinical) on up to six selected GLS-funded campuses; the various student-level factors that are related to suicide prevention efforts (e.g., protective factors, coping strategies, social norms, and facilitators of and barriers to student access to and receipt of behavioral healthcare); campus interdepartmental collaboration and the relationship between various efforts to promote student mental health and wellness; and the extent to which the campus infrastructures and supports promote and address these factors.


The case study approach has been chosen because it allows the opportunity to explore in more depth the issues identified above, including but not limited to motivations behind behaviors, the decision-making process, successes and challenges encountered, and relationships that hinder or facilitate suicide prevention efforts. The case study approach also provides an important advantage in that it allows field staff to utilize what they learn as part of the case study process to inform further data collection. The result will be a comprehensive assessment of efforts and issues on the selected campuses that can be explored, discussed, considered, and potentially replicated in other settings.


The data collected through this project will contribute to the knowledge base regarding a successful model for suicide prevention that integrates multiple prevention programs targeting risk and protective behaviors related to a host of negative mental and physical health outcomes correlated with suicide, including violence, stress, depression and mental illness, and academic failure. These factors are all located on a wellness continuum and cannot be successfully targeted in isolation, or without the involvement of the whole campus community.


  1. Addressing National Outcome Measures (NOMs) and GPRA Reporting


The cross-site evaluation was designed in part to support the Substance Abuse and Mental Health Services Administration (SAMHSA) performance measurement and management efforts. In assessing the effectiveness of each State/Tribal and Campus suicide prevention program, the cross-site evaluation will evaluate the GLS Suicide Prevention and Early Intervention Program as a whole. This is a critical step toward assessing the ability of the program to achieve many of the goals implied by GPRA indicators and SAMHSA National Outcome Measures (NOMs). The cross-site evaluation design reflects the intention of SAMHSA to implement performance management and accountability in all programs.


The cross-site evaluation design addresses the three-tiered SAMHSA NOMs and GPRA measurement approach by incorporating relevant client-level, training-related, and infrastructure development outcome measures. The SAMHSA client-level NOM domains to date have been developed to address outcomes related to mental health and substance abuse treatment programs and substance abuse prevention programs. Because the GLS Suicide Prevention Program focuses on suicide and prevention, rather than treatment and/or substance abuse, not all client-level measures included in the existing 10-domain client-level NOM framework are appropriate for suicide prevention. To elaborate on this lack of fit: the majority of funding across both State/Tribal and Campus programs is dedicated to the early identification and referral of youth at risk for suicide and to enhancing awareness related to suicide. Currently no funds are devoted to the provision of treatment. As a result, data collection activities and resources, as well as monitoring of program focus, should be appropriately focused on the activities being funded and their related outcomes. Furthermore, while many of the treatment NOM domains are considered potential distal outcomes for those youth or university/college students who are identified at risk, referred into service, and receive treatment (e.g., decreased mental health symptomatology, abstinence from drug and alcohol use), the reporting of this type of information requires, among other things, (1) the receipt of mental health treatment, which GLS Suicide Prevention funds are not currently supporting; (2) the tracking of individuals to request self-reported information, which the GLS suicide prevention grantees are not resourced to accomplish; and/or (3) access to existing treatment management information systems (MIS), which the GLS suicide prevention grantees typically do not have given their strategic plans and partnership structure.


To that end, client-level measures that are viable for GLS suicide prevention program activities have been abstracted from the existing 10-domain structure, and appropriate training and infrastructure NOMs have been proposed. Jointly reporting on these NOMs will provide a comprehensive performance measurement and management approach that represents the breadth of GLS program activities and their reach. A summary of the client-, training-, and infrastructure-level indicators that will be used to facilitate NOMs/GPRA reporting for the GLS Suicide Prevention Program is described below and in Table 2.


Client-level NOMs: As detailed above, several of the client-level NOM domains are considered inappropriate for the GLS Suicide Prevention and Early Identification Program. Specifically, domains related to decreased symptomatology, increased stability in housing, decreased juvenile justice involvement, retention in substance use treatment, and abstinence from alcohol are considered unviable for the reasons described in the previous section. Several client-level domains, however, are relevant for GLS suicide prevention programs because they specify outcomes related to early identification and referral of youth – specifically, access to mental health services, increased social supports, use of evidence-based programs/practices, and retention in education for university/college students. Early identification activities are a key component of GLS suicide prevention programs and focus on the use of evidence-based practices/approaches [NOM: use of evidence-based practice] to identify youth or university/college students at risk for suicide and on connecting those individuals to appropriate mental health or emergency services [NOM: access to service] and support services [NOM: social supports and connectedness]. In addition, because Campus suicide prevention activities are being implemented with university/college students, the NOM related to education retention will be reported for the Campus program. Data from the cross-site evaluation will be used to facilitate reporting on these client-level NOMs.


Training-related Proposed Domains: Because the GLS Suicide Prevention Program focuses on prevention rather than treatment, a large share of grant funds, particularly in the State/Tribal sites, is dedicated to gatekeeper training and early identification activities. Appropriate training-level measures are therefore critically important for consistent performance measurement and management of the GLS State/Tribal and Campus programs. Specifically, access to training, satisfaction with the training experience, increased knowledge as a result of training, and intended use of the acquired skills are incorporated into the cross-site evaluation design of the State/Tribal program activities.


Infrastructure Proposed Domains: Across the GLS Suicide Prevention Programs (i.e., State/Tribal and Campus programs), the prevention activities are being collectively implemented in an effort to build and strengthen suicide prevention infrastructures (i.e., at the State level and the Campus level). These activities include public information campaigns, education campaigns, gatekeeper trainings, product development, and coalition building. In an effort to facilitate consistent performance measurement and management of infrastructure development and change, the National Strategy for Suicide Prevention objectives have been used as a framework for selecting relevant infrastructure indicators. Specifically, promoting awareness, the provision and implementation of suicide prevention activities across sectors (e.g., justice, education, clergy, child welfare), and improving and expanding suicide attempt and completion surveillance are being used as proposed infrastructure domains.


Table 2 provides a crosswalk of the proposed GPRA indicators for the GLS Suicide Prevention Program and details the cross-site evaluation State/Tribal and Campus data source for each proposed indicator.


Table 2

SAMHSA National Outcome Measure Crosswalk with the Cross-site Evaluation of the GLS Suicide Prevention and Early Intervention Program


CLIENT-LEVEL OUTCOMES

(Each row below lists the NOMs Domain, the NOMs Outcome, and the cross-site evaluation data sources for the State/Tribal and Campus programs.)

NOMs Domain: Access/Capacity
NOMs Outcome: Increased Access to Services (Service Capacity)
State/Tribal Data Source: Information obtained through the Early Identification, Referral and Follow-up (EIRF) analysis will provide a measure of service accessibility and of emergency service use for the State/Tribal suicide prevention programs. The EIRF process will identify the number of youth who are identified at risk for suicide through program activities, the number who are referred for services, and the number who receive services, by type. This will provide a measure of service capacity among State/Tribal suicide prevention programs.
Campus Data Source: In the context stage of the evaluation, the cross-site team will identify existing sources of information that can be obtained from campuses to facilitate the reporting of access to services and service capacity on campuses involved in early identification activities. The cross-site team will identify existing data elements of interest and request that campuses share those data with the cross-site evaluation for analysis. This will include a measure of emergency service use among campus student populations.

NOMs Domain: Social Connectedness
NOMs Outcome: Increased Social Supports/Social Connectedness
State/Tribal Data Source: Information obtained through the Early Identification, Referral and Follow-up (EIRF) analysis. The EIRF process will identify the number of youth who are identified at risk for suicide and who are referred for social supports. In addition, the PSI will collect information on lifeskills development activities and cultural activities that aim to strengthen youths' sense of social connectedness.
Campus Data Source: The SPEAKS-S inquires about students' involvement with and connectedness to the campus, as well as their help-seeking behaviors. In addition, the PSI will collect information on lifeskills and wellness activities that increase students' sense of connectedness to the campus community.

NOMs Domain: Use of Evidence-Based Practice
NOMs Outcome: Use of evidence-based practices
State/Tribal Data Source: The PSI documents on a quarterly basis the programs that have been implemented as part of the GLS suicide prevention program. The extent to which grantees use evidence-based programs can be analyzed by checking whether the programs reported by grantees are part of the SPRC/AFSP Evidence-Based Practices Project or SAMHSA's National Registry of Evidence-Based Programs and Practices.
Campus Data Source: Same as for the State/Tribal program: the PSI documents implemented programs quarterly, and reported programs are checked against the same two registries.


NOMs Domain: Education Retention
NOMs Outcome: Student Retention Rate
State/Tribal Data Source: Not applicable: State/Tribal funds focus on youth statewide in a variety of community and organizational settings.
Campus Data Source: In the context stage of the evaluation, we will identify the source of information for student retention. Campuses will be required to share aggregate student retention rates with the cross-site evaluation team.

TRAINING-RELATED OUTCOMES

(Each row below lists the Proposed Domain, the Proposed Outcome Measure with the relevant National Strategy for Suicide Prevention [NSSP] goal, and the cross-site evaluation data sources for the State/Tribal and Campus programs.)

Proposed Domain: Satisfaction with Training
Proposed Outcome Measure: Satisfaction with training activities
State/Tribal Data Source: The Training Exit Survey will provide measures of satisfaction among gatekeepers and providers trained as part of the State/Tribal suicide prevention programs.
Campus Data Source: The Training Exit Survey will provide measures of satisfaction among gatekeepers and providers trained as part of the Campus suicide prevention programs.

Proposed Domain: Implement Training to Identify At-Risk Behavior
Proposed Outcome Measure: Increase in the number of gatekeepers in GLS-funded States and Campuses who have received training in the identification of and response to suicide risk and behaviors: justice, education, clergy, family members (NSSP Goal 6: Objectives 6.4, 6.5, 6.6, and 6.8)
State/Tribal Data Source: To measure the number of education staff, justice staff, clergy, and family members who have received training as part of GLS-funded programs, the Training Exit Survey will document the number trained and the role of each trainee.
Campus Data Source: Same as for the State/Tribal program; the Training Exit Survey will document the number trained and the role of each trainee.



INFRASTRUCTURE DEVELOPMENT OUTCOMES

(Each row below lists the Proposed Domain, the Proposed Outcome Measure with the relevant NSSP goal, and the cross-site evaluation data sources for the State/Tribal and Campus programs.)

Proposed Domain: Promote Awareness
Proposed Outcome Measure: Increase in the number of GLS-funded States and Campuses with public information campaigns designed to increase public knowledge of suicide prevention (NSSP Goal 1: Objective 1.1)
State/Tribal Data Source: To measure the implementation of public information campaigns in GLS-funded States and Tribes, the PSI will document on a quarterly basis all public information products and services implemented as part of each grantee's suicide prevention program.
Campus Data Source: To measure the implementation of public information campaigns in GLS-funded Campuses, the PSI will document on a quarterly basis all public information products and services implemented as part of each grantee's suicide prevention program.

Proposed Domain: Promote Awareness
Proposed Outcome Measure: Increase in the number of GLS-funded States and Campuses that have disseminated suicide prevention information via the World Wide Web (NSSP Goal 1: Objective 1.4)
State/Tribal Data Source: To measure the extent to which the World Wide Web is used to disseminate information, the PSI will document on a quarterly basis all public information efforts that involve website development or enhancement for the purpose of disseminating suicide prevention information.
Campus Data Source: Same as for the State/Tribal program; the PSI will document these efforts quarterly for Campus grantees.

Proposed Domain: Develop and Implement Prevention Programs
Proposed Outcome Measure: Increase in the number of GLS-funded States with comprehensive suicide prevention plans that satisfy all of the following criteria: (a) coordinate across government agencies; (b) involve the private sector; and (c) support plan development, implementation, and evaluation in their communities (NSSP Goal 4: Objective 4.1)
State/Tribal Data Source: As part of the cross-site evaluation, an annual evaluation progress report will be provided by all grantees to document evaluation progress. Included in this process will be an assessment of whether GLS-funded States have a suicide prevention plan that satisfies all three criteria described in the National Strategy.
Campus Data Source: Not relevant to Campus grantees.



Proposed Domain: Develop and Implement Prevention Programs
Proposed Outcome Measure: Increase in the number of schools (public or private) in GLS-funded States with evidence-based programs designed to prevent suicide (NSSP Goal 4: Objective 4.2)
State/Tribal Data Source: To measure the extent to which evidence-based programs are being implemented in schools, the Training Exit Survey will document the evidence-based programs being implemented as part of GLS-funded State/Tribal programs, and in what capacity.
Campus Data Source: Not relevant to Campus grantees.

Proposed Domain: Develop and Implement Prevention Programs
Proposed Outcome Measure: Increase in the number of GLS-funded colleges and universities with evidence-based programs designed to prevent suicide (NSSP Goal 4: Objective 4.3)
State/Tribal Data Source: The PSI documents on a quarterly basis the programs implemented as part of the GLS suicide prevention program, whether these programs are evidence based, and whether they are implemented in colleges or universities.
Campus Data Source: The PSI documents on a quarterly basis the programs implemented as part of the GLS suicide prevention program. The extent to which grantees use evidence-based programs can be analyzed by checking whether the programs reported by grantees are part of the SPRC/AFSP Evidence-Based Practices Project or SAMHSA's National Registry of Evidence-Based Programs and Practices.


Proposed Domain: Develop and Implement Prevention Programs
Proposed Outcome Measure: Increase in the number of juvenile justice-related agencies and organizations in GLS-funded States with evidence-based suicide prevention programs (NSSP Goal 4: Objective 4.5)
State/Tribal Data Source: To measure the extent to which evidence-based programs are being implemented in juvenile justice-related settings, the Training Exit Survey Cover Page will document the evidence-based programs being implemented as part of GLS-funded programs, and in what capacity. These settings include juvenile probation offices, correctional facilities, detention centers, law enforcement, and similar organizations.
Campus Data Source: Not relevant to Campus grantees.

Proposed Domain: Develop and Implement Prevention Programs
Proposed Outcome Measure: Increase in the number of family, youth, and community service providers and organizations in GLS-funded States and Campuses with evidence-based suicide prevention programs (NSSP Goal 4: Objective 4.7)
State/Tribal Data Source: To measure the extent to which evidence-based programs are being implemented in agencies and organizations serving families and youth, the Training Exit Survey Cover Page will document the evidence-based programs being implemented as part of GLS-funded programs, and in what capacity. These settings include child welfare offices, family service offices, community-based organizations, and similar agencies.
Campus Data Source: The PSI documents on a quarterly basis the programs implemented as part of the GLS suicide prevention program and whether these programs are implemented in family, youth, or community service systems.





Proposed Domain: Improve and Expand Surveillance Systems
Proposed Outcome Measure: Increase in the number of GLS-funded States that produce annual reports on suicide and suicide attempts, integrating data from multiple State data management systems (NSSP Goal 11: Objective 11.5)
State/Tribal Data Source: As part of the cross-site evaluation, an annual evaluation progress report will be provided by all grantees to document evaluation progress. Included in this process will be an assessment of whether GLS-related program data are integrated from multiple data management systems and whether these data are utilized in annual reports.
Campus Data Source: Same as for the State/Tribal program; the annual evaluation progress report will document whether GLS-related program data are integrated from multiple data management systems and utilized in annual reports.


The GLS Suicide Prevention and Early Intervention Program evaluation approach, the process through which it was developed, and the training and technical assistance that will be provided to grantees have each been aligned with utilization-focused federal program accountability requirements (i.e., PART, GPRA, and NOMs). Therefore, a recommendation has been made that SAMHSA submit the cross-site evaluation package to the Office of Management and Budget.


3. Use of Improved Information Technology

Every effort was made to limit the burden on individual respondents who participate in the cross-site evaluation through the use of technology. Data collection instruments will be administered via the web and by telephone. Below is a description of the web-based data collection and management system and the CATI technology that will be used for data collection.


Web-based data collection and management system

A web-based data collection and management system will be used to facilitate data collection by program staff, program participants, key stakeholders, students, and Campus faculty/staff. The web-based data collection and management system will serve two functions: (1) as a data entry tool for program staff and cross-site evaluation staff to enter cross-site evaluation information or data elements, and (2) as a data collection tool for administering web-based surveys to respondents. All cross-site evaluation data obtained either through direct entry by program and/or evaluation staff or through web-based surveys will be stored in the web-based data collection and management system. The system reduces evaluation burden for the grantees and allows ease of access to data for program personnel and cross-site evaluation team members.


The web-based system is a secure system that maintains confidentiality through the provision of five different levels of password-protected access to site-specific and aggregate data. All data collected will be stored in a central data repository that will allow for the analysis and summary of information within and across surveys. The five distinct user security levels, illustrated in the sketch after this list, are as follows:

The Cross-site Administrator will have access to site-specific data from all grantee sites stored in the data collection and management system, and will have access to aggregate reports available on the system using this privilege level.


The Site Administrator will have access to site-specific data from the data collection and management system, and will have access to site-specific and aggregate reports available on the system. They will also be able to view the number of instruments that have been completed and submitted. One individual per community will be designated the Site Administrator.


A Site User has the capability to access information available on the system, but will be restricted from accessing datasets.


The Contact User will have access to aggregate information available on the repository. The Contact User will not have rights to download datasets, nor to access information specific to a grant-funded community.


Data contributors are data collectors and survey respondents who will have the capability to enter data into the web-based system, but will have no other privileges.
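As a purely illustrative sketch of how such tiered access might be enforced, the fragment below models the five roles above as permission sets; the role names follow the list, while all function and permission names are hypothetical rather than the actual system's API.

    from enum import Enum, auto

    class Role(Enum):
        CROSS_SITE_ADMIN = auto()
        SITE_ADMIN = auto()
        SITE_USER = auto()
        CONTACT_USER = auto()
        DATA_CONTRIBUTOR = auto()

    # Permission sets mirroring the five levels described in the list above.
    PERMISSIONS = {
        Role.CROSS_SITE_ADMIN: {"view_all_site_data", "view_aggregate_reports", "download_datasets"},
        Role.SITE_ADMIN: {"view_own_site_data", "view_aggregate_reports", "download_datasets",
                          "view_submission_counts"},
        Role.SITE_USER: {"view_own_site_data", "view_aggregate_reports"},   # no raw datasets
        Role.CONTACT_USER: {"view_aggregate_reports"},                      # no site-specific data
        Role.DATA_CONTRIBUTOR: {"enter_data"},                              # data entry only
    }

    def authorize(role: Role, action: str) -> bool:
        """Allow an action only if the role's permission set includes it."""
        return action in PERMISSIONS.get(role, set())

    # A Site User may view aggregate reports but may not download raw datasets.
    assert authorize(Role.SITE_USER, "view_aggregate_reports")
    assert not authorize(Role.SITE_USER, "download_datasets")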


The cross-site evaluation team will provide training and technical assistance to support grantees in implementing the cross-site evaluation and in using data at the site level. Program personnel will be trained to utilize the data collection and management system and will be provided with a user’s manual.


Only individuals (Cross-Site Administrator and Site Administrator) with security access at the site administrator level are allowed access to raw data. To protect potential misuse of that data, specifically related to the inadvertent identification of respondents as a function of their unique demographic/workforce characteristic profile, the cross-site evaluation team will restrict access to raw datasets to designated individual(s), and the site administrator of the SPDC will be asked to sign a data use agreement. Within the context of protecting from inadvertent identification, this agreement will stipulate who, how, and under what circumstances the raw data can be analyzed/reported. For example, the cross-site evaluation team will obtain an agreement from each site administrator agreeing not to report categories where less than 10 cases exist and to stipulate who will have access to raw data. Further, the agreement will indicate that no attempt, through complex analysis and with outside information, will be made to ascertain from the data sets the identity of particular persons. Attachment K is the agreement that will be utilized.
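To make the small-cell rule concrete, the following minimal sketch shows how a site might suppress reporting categories with fewer than 10 cases before releasing counts; the threshold comes from the agreement described above, while the data and function names are hypothetical.

    from collections import Counter

    SUPPRESSION_THRESHOLD = 10  # per the agreement: never report categories with fewer than 10 cases

    def suppressed_counts(records, category_field):
        """Count records per category, masking any category that falls below the threshold."""
        counts = Counter(record[category_field] for record in records)
        return {category: (count if count >= SUPPRESSION_THRESHOLD else "suppressed (<10)")
                for category, count in counts.items()}

    # Hypothetical example: workforce roles among respondents at one site.
    records = [{"role": "teacher"}] * 42 + [{"role": "clergy"}] * 3
    print(suppressed_counts(records, "role"))
    # -> {'teacher': 42, 'clergy': 'suppressed (<10)'}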


CATI Technology


The Training Utilization and Preservation Survey (TUP-S) and the Referral Network Survey will be administered over the telephone using CATI technology. The evaluation firm ICF Macro operates fully integrated call centers in Springfield, Ohio; Burlington, Vermont; and Plattsburgh, New York. The Springfield facility has 102 interviewing stations, the Plattsburgh facility contains 120 CATI interviewing stations, and Burlington contains 70 stations. These centers are networked with each other: the facilities use the same CATI software, operate on the same platform, and are connected by a high-speed link that allows projects managed at one site to be conducted from another site, or from multiple sites simultaneously. The CfMC questionnaire programming language provides call management and quota controls, inbound calling capabilities, multilingual interviewing capabilities, data back-up and monitoring, and incidence tracking. All of these CATI stations are equipped with predictive dialing capabilities. The use of ICF Macro's CATI system, predictive dialing system, and supervisory staff ensures that these data are methodologically consistent with other study efforts.

4. Efforts to Identify Duplication



In developing the data collection activities for the cross-site evaluation, the cross-site evaluation team conducted a literature review to avoid duplicating existing data collection activities or gathering similar information. Specifically, existing research studies and the efforts of other federal initiatives designed to evaluate suicide or suicide prevention were reviewed.



a. Existing Research


Many in the field of suicide prevention agree that there is a lack of information on the causes of suicide and even less information on how to prevent suicide (SPAN USA, Inc., 2001; Institute of Medicine, 2002; U.S. Public Health Service, 2001). Studies of suicide prevention activities have provided important information, but for the most part have been conducted with specific populations under certain circumstances and are not generalizable to other populations (Institute of Medicine, 2002). Similarly, the lack of longitudinal and prospective studies has been a barrier to understanding and preventing suicide (Institute of Medicine, 2002). Acknowledging the dearth of information on the effectiveness of suicide prevention programs, the Institute of Medicine's report, Reducing Suicide: A National Imperative (2002), provides several recommendations for increasing research on suicide. The report recommends that federal funding be provided for the development, testing, and expansion of suicide prevention interventions, and for longitudinal studies that focus on the medium- to long-term impacts of suicide prevention activities, such as the impact on risk and protective factors and on treatment and prevention. Specifically, the report recommends exploring the impact of suicide prevention programs through large, nationally coordinated efforts.


Although there have been evaluations examining the effectiveness of specific suicide prevention activities, such as gatekeeper trainings, suicide screening programs, and skills trainings, these studies have focused on specific populations, mostly school based, and have not assessed the impact of programs across multiple sites or across time (Eggert et al., 1997; King & Smith, 2000; Eggert, Nicholas & Owen, 1995). For example, an evaluation of the Lifelines School-Based Adolescent Suicide Prevention Program found increases in knowledge and help-seeking behaviors (Kalafat & Elias, 1994), but was specific to youth in schools. The cross-site evaluation will assess suicide prevention approaches across multiple sites targeting diverse youth groups to determine the impact of suicide prevention activities and the extent to which funded activities meet the goals and objectives of the GLSMA. Cross-site evaluation data will also be used to assess performance across time in these diverse settings, in an effort to improve and enhance suicide prevention programs for current and future grantees.


The existing knowledge base focuses on short-term impacts, and little is known about the medium- to long-term impacts of suicide prevention programs across broader and more diverse populations, or about any direct impact on youth being referred for services. No evaluations have been conducted that examine the impact of suicide prevention programs across multiple sites, with diverse populations, involving diverse child-serving agencies (e.g., mental health, juvenile justice, foster care), and that examine the impact on receipt of services. The cross-site evaluation of the GLS Suicide Prevention Program will be the first opportunity to collect information from multiple sites implementing suicide prevention activities in an effort to assess the effectiveness of those activities and their impact on youth at risk for suicide. The information learned from previous research on suicide prevention activities was crucial in designing the cross-site evaluation, but the cross-site evaluation does not include data collection activities that duplicate the information collected by previous studies.




b. Other Federal Efforts


The Centers for Disease Control and Prevention (CDC) is supporting evaluations of evidence-based suicide prevention programs in Maine and Virginia as part of the CDC’s Targeted Injury Prevention Programs. In Maine and Virginia, the CDC is supporting research that documents the efficacy of a community-based cognitive therapy program for preventing suicidal behavior among suicide attempters identified in emergency departments. The focus of the intervention is to help youth develop more adaptive ways of thinking and more functional ways of responding to periods of emotional distress. These CDC evaluations will provide valuable information on the efficacy of interventions for youth displaying suicide risk factors, but the focus of the cross-site evaluation is to evaluate the effectiveness of suicide prevention programs rather than specific interventions.

CDC is also collecting and examining data from hospital emergency departments to assess the prevalence of suicide and suicide attempts. The National Electronic Injury Surveillance System-All Injury Program tracks data on all types and external causes of nonfatal injuries and poisonings treated in U.S. hospital emergency departments. With these data, CDC researchers can generate national estimates of nonfatal injuries, including those related to suicidal behavior. Again, although this effort is significant in providing a broader understanding of suicide, the information gathered through the cross-site evaluation focuses on the effectiveness of suicide prevention programs.


CDC will also sponsor evaluation of projects that use connectedness as a means to reduce suicidal behavior through the Prevention of Suicidal Behavior through the Enhancement of Connectedness program. This evaluation is designed to target one or more modifiable risk factors for suicidal behavior with a primary prevention strategy designed to enhance connectedness, and to rigorously assess the efficacy or effectiveness of that strategy. Findings from the evaluation will address the need for the development and rigorous evaluation of primary prevention strategies for preventing initial occurrences of suicidal behavior. Up to two grants will be awarded for project periods of up to five years.


The National Institute of Mental Health (NIMH) is sponsoring the Suicide Prevention in Emergency Medicine Departments program. One grant site will receive a cooperative research grant to develop and test the effectiveness of practical interventions that can form an evidence base for the improved care of suicidal individuals seen in emergency departments (EDs). Improvements in care will include patient screening, assessment, and interventions that form a "chain of care" to reduce suicide risk. The effort will include (1) the development and testing of a standardized mechanism to screen ED patients for high risk of suicidal behavior, ideally adapted from one or more existing screening tools; and (2) the development and testing of one or more ED-based post-screening interventions to reduce suicidal behavior and associated morbidity and mortality, delivered in the ED or following ED discharge.


The Substance Abuse and Mental Health Services Administration (SAMHSA) is sponsoring an evaluation of the National Suicide Prevention Lifeline, the national crisis hotline. The purpose of the evaluation is to assess the impact of the national crisis hotline in connecting callers to mental health professionals and to assess participation in the Lifeline network. Although the data collection activities planned as part of this effort will provide valuable information on the effectiveness of this important service for at-risk youth, the scope of that evaluation covers all callers (adult and youth) to the national hotline and is specific to one intervention. The cross-site evaluation will add to the information collected as part of this effort by assessing other suicide prevention strategies (e.g., gatekeeper training, suicide screening activities) and by focusing specifically on youth.


5. Impact on Small Businesses or Other Small Entities



Some of the data for this evaluation will be collected from individuals involved with public agencies, such as mental health, juvenile justice, education, and child welfare agencies, and from colleges and universities. While most data will be collected from public agencies or universities, it is possible that organizations involved in the referral networks would qualify as small entities. Also, respondents to the Training Exit Survey and the follow-up training qualitative interview, while most likely employed by public agencies, may also be employed by small businesses or other small entities. However, these data collection activities will not have a significant impact on these agencies or organizations.

6. Consequences of Collecting the Information Less Frequently



Cross-Site Evaluation


Product Stage. Grantees will be required to first complete the baseline version of the Prevention Strategies Inventory in Year 1 of the grant. Thereafter, they will be required to complete the follow-up version of the Prevention Strategies Inventory on a quarterly basis over the duration of their three-year grant period. Collecting this information quarterly is necessary to track progress toward meeting suicide prevention goals and to provide information on the development stage of products and services within State/Tribal and Campus programs. The consequence of collecting these data less frequently would be the potential loss of information related to the process of developing and implementing products and services, and of the ability to track progress over time.


Process Stage. Both Campus and State/Tribal grantees are required to report aggregate training participant information for all trainings conducted as part of their suicide prevention programs in the format of the Training Exit Survey Cover Page (TES-CP) (See Attachments F.3 and F.4). Because gatekeeper training is a widely implemented suicide prevention strategy among State/Tribal and Campus grantees, basic aggregate information about trainings is necessary to understand how grant funds are being utilized in support of training.


The Training Exit Survey (TES) (See Attachments B.1 and B.2), which assesses participants' training experiences immediately following the training, is collected one time at the conclusion of the training. For a random sample of participants, the Training Utilization and Preservation Survey (TUP-S) (See Attachment C.1) will be implemented within 2 months following the training in order to collect information on the utilization of the knowledge, skills, and techniques learned through the training. For selected trainings, the Training Utilization and Preservation Interviews (TUP-I) (See Attachment D.1) will be implemented to learn about the utilization of the knowledge, skills, and techniques learned through locally developed and under-studied trainings. The consequence of not collecting the training experience data at the conclusion of the training would be the absence of cross-site knowledge about the types of trainings being provided with grant funds, the quality of those trainings, and the individuals being trained. The consequence of not conducting the follow-up training utilization and preservation surveys and interviews would be a lack of important information concerning the impact and penetration of the suicide prevention training activities.


The Referral Network Survey (RNS) (See Attachment E) will be administered annually to referral networks identified by State/Tribal grantees over the three-year grant period. Multiple annual administrations of the Referral Network Survey are important for learning whether the suicide prevention programs have an impact on building referral networks for youth identified at risk for suicide. The consequence of less frequent data collection would be a lack of information about how referral networks develop over time.


For the Campus grantees, the Suicide Prevention Exposure, Awareness and Knowledge Surveys (SPEAKS) (See Attachments G.1 and G.2) for students and faculty/staff will be administered annually over the 3-year grant period. Data collected cross-sectionally at multiple points in time are necessary to assess any change in awareness and knowledge as a result of suicide prevention activities. If data were collected at only one point in time, there would be no ability to assess change over time, which is an important element of the suicide prevention program.


The Campus Infrastructure Interviews (CIFI) (See Attachment H.1) will be administered to key informants from each campus twice over the period of the grant – once in Year 1 and once in Year 3. The consequence of not collecting these data would be the absence of understanding of the extent to which the prevention of suicide has permeated the operations and functioning of the campus administration and departments, and the extent to which this permeation supports sustainability of the suicide prevention efforts.


Impact Stage. To assess the impact of State/Tribal program activities, existing information on youth referred for services and service receipt as a result of early identification activities is analyzed. The Early Identification, Referral and Follow Up Analyses (EIRF) (See Attachment F.1) require State/Tribal grantees to share with the cross-site evaluation team existing data on the youth identified at risk as a result of early identification activities supported by their suicide prevention programs, their referral for services, and their service receipt. State/Tribal grantees are also required to report aggregate screening information for all youth screened as part of their suicide prevention programs in the format of the Early Identification, Referral and Follow Up Aggregate Screening Form (EIRF-S) (See Attachment F.2). To assess the impact of Campus program activities, the cross-site evaluation team will request that campus sites engage in an MIS data abstraction process to submit existing administrative data related to the number of students who are at risk for suicide, the school retention rate, the number who seek services, and the types of services received, including emergency service use. Data for these abstraction processes are requested every quarter. The consequence of not collecting this information would be a lack of understanding of the impact of the suicide prevention program on the identification of youth at risk, their referral to services, and their service receipt. Information tracked through these data abstraction activities is needed to report on proposed NOMs related to access to services and use of social supports, as well as for GPRA reporting.
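As an illustration of the kind of aggregate funnel these abstractions support, the sketch below counts youth at each step from identification to service receipt; the field names are hypothetical stand-ins, not the actual EIRF schema.

    # Hypothetical youth-level records abstracted from a grantee data system;
    # the field names are illustrative only.
    youth_records = [
        {"identified_at_risk": True, "referred": True, "received_services": True},
        {"identified_at_risk": True, "referred": True, "received_services": False},
        {"identified_at_risk": True, "referred": False, "received_services": False},
    ]

    def eirf_funnel(records):
        """Aggregate counts at each step: identification, referral, service receipt."""
        return {
            "identified_at_risk": sum(r["identified_at_risk"] for r in records),
            "referred_for_services": sum(r["referred"] for r in records),
            "received_services": sum(r["received_services"] for r in records),
        }

    print(eirf_funnel(youth_records))
    # -> {'identified_at_risk': 3, 'referred_for_services': 2, 'received_services': 1}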


Enhanced Evaluation


The 12-month data collection period for the Campus Case Studies (CCS) is divided into three stages. In stage 1 of the CCS, the case study team will conduct one-time focus groups with staff, faculty, and students to gather information about each of the four research questions that guide the CCS (See Attachments I.1 and I.2). Each of the four Campus sites will identify six student groups, two faculty groups, and one staff group whose ideas and opinions are of particular interest to the campus in terms of its suicide prevention efforts. During this stage, the case study team will also conduct case study key informant interviews with seven key informants (See Attachments J.1 to J.7). This will ensure a breadth of information, as well as the ability to triangulate responses for reliability and accuracy without excessive redundancy.


During stage 2, the case study team will administer the SPEAKS (See Attachments G.1 and G.2) described previously in the Process Stage, to gather population-level data on relevant student risk and protective behaviors.


In stage 3, the case study team will conduct seven additional case study interviews (CSIs) on each campus (See Attachments I.1 to I.7). These interviews will be conducted during the second site visit to clarify questions raised during the first phase of analysis, to incorporate findings from the first two stages of data collection through additional follow-up questions, and to ensure comprehensiveness.


In each stage, data are collected only once. Some of the key informants interviewed during stage 1 may also be interviewed again in stage 3, but both the context and the questions will differ based on findings from stages 1 and 2.


7. Consistency with the Guidelines of 5 CFR 1320.5(d)(2)



The data collection fully complies with the requirements of 5 CFR 1320.5(d)(2).



8. Consultation Outside the Agency



a. Federal Register Notice


SAMHSA published a notice in the Federal Register, volume 75, page 7600 on February 22, 2010 soliciting public comment on this study. SAMHSA received no comments on the planned data collection.


b. Consultation Outside the Agency


Consultation on the design, instrumentation, and statistical aspects of the evaluation has occurred with individuals outside of SAMHSA. An evaluation steering committee was established in 2005 to provide input and guidance in designing and implementing the cross-site evaluation. Consultation with the evaluation steering committee began in 2005 and will continue as needed throughout the grant-funding period. Representatives on the steering committee include leaders in the field of suicide prevention and evaluation. In addition, representatives of the Suicide Prevention Resource Center (SPRC) were consulted with respect to the design of the cross-site evaluation in 2005. The SPRC provides technical assistance to entities implementing suicide prevention programs. Input from representatives of the Centers for Disease Control and Prevention (CDC) was also solicited in 2005. The CDC has conducted research in the field of suicide prevention and was consulted to comment on the cross-site evaluation design, frequency of data collection activities, and instrumentation.


In addition, updates to the cross-site evaluation instruments were informed through direct consultation with current and former grantees, as well as representatives of the SPRC and CDC. These consultations had several purposes: (1) to ensure continued coordination of related activities, especially at the Federal level; (2) to ensure the rigor of the evaluation design, the proper implementation of the design, and the technical soundness of study results; (3) to verify the relevance and accessibility of the data to be collected; and (4) to minimize respondent burden.


Consultation on the concept and design of the Campus Case Studies has occurred with individuals outside of SAMHSA. A meeting of the evaluation steering committee members, SAMHSA, and Macro International occurred on March 27–28, 2007, at which the case studies were discussed and input and guidance on design and implementation were gathered. Representatives on the steering committee include leaders in the field of suicide prevention and evaluation. In addition, representatives from the universities selected to participate in the case studies provided information, feedback, and guidance on research questions, instrumentation, and study design.


9. Payment or Gift to Respondents


Remuneration is a standard practice on university campuses, and has proven to increase response rates for college student surveys (Dillman, 2000). In a study examining response rates in the National Survey of College Graduates, incentives provided to an experimental group resulted in an increase in response rates of nearly 11% versus no incentives (Dillman, 2000).


Remuneration will be used for the Suicide Prevention Exposure, Awareness, and Knowledge Survey for students (SPEAKS-S) (See Attachment G.1), the Training Utilization and Preservation Survey (TUP-S) (See Attachment C.1), and the Training Utilization and Preservation Interviews (TUP-I) (See Attachment D.1), as well as for the student, faculty, and staff focus groups and key informant interviews that are part of the Campus Case Studies. Payment will not be provided to any other respondents as part of the cross-site evaluation. Respondents to other data collection activities are primarily staff of the suicide prevention programs or close affiliates. Therefore, no remuneration is planned for those activities.



Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS)

Based on our experience implementing this survey, including feedback from participants as well as firewall accessibility issues, a mixed-mode approach will be used to increase response rates. This approach will address previous problems with the coverage of solicitations and with overall nonresponse. A pre-survey e-mail will be sent to all faculty/staff and students in the samples explaining the importance of the survey and their campus's involvement in the GLS Suicide Prevention Program. An introductory letter will then be sent to the faculty/staff and students in the samples requesting participation in the survey. This letter will contain directions for logging into the Web site to complete the survey and a password for accessing the survey. The letter will contain a modest financial incentive ($1–$2) for all students in the sample.


Training Utilization and Preservation Survey (TUP-S) and Training Utilization and Preservation Interview (TUP-I)

Remuneration is a standard practice in longitudinal studies as a means of maintaining participation. Recontacting survey respondents for follow-up interviews is difficult given the lapse in time between the original survey and the follow-up interview, and the difficulty is compounded when respondents are not directly affiliated with the programs being evaluated. Therefore, given the hard-to-reach nature of these populations, an incentive will be provided for the two cross-site evaluation data collection activities that involve follow-up interviews. Participants in the TUP-I will receive a $10 money order incentive, and participants in the TUP-S will receive a $20 money order incentive. An incentive for these respondents is deemed particularly appropriate because they are gatekeepers not directly affiliated with the suicide prevention program.


Focus groups for Campus Case Studies

Focus group participants will receive a $20 gift card in appreciation of their time. In addition, the case study team will provide pizza and soda during all focus groups.




10. Assurance of Confidentiality


A web-based data collection and management system was designed to facilitate data entry and management for the cross-site evaluation. Descriptive information will be collected from respondents to cross-site evaluation data collection activities, but no identifying information will be entered into or stored in the web-based data collection and management system. Identifying information will be requested in order to facilitate the Training Utilization and Preservation – Survey (TUP-S), the Training Utilization and Preservation – Interviews (TUP-I), the Referral Network Survey (RNS), the Campus Infrastructure Interviews (CIFI), the SPEAKS-Student and Faculty/Staff Versions, and the Case Study Key Informant Interviews (CSIs) and Focus Groups for the Campus Case Studies. Identifying information will not be stored with survey responses, and the specific procedures used to protect the privacy of respondents are described below for each data collection activity.


Prevention Strategies Inventories. Information to complete the inventories will be directly entered into the web-based system. To access the system, each respondent will be provided a username and password to protect their privacy and no identifying information is requested on the inventories.


Training Exit Survey. Each respondent to the Training Exit Survey will be provided a randomly generated training participant ID, but no identifying information will be requested on the survey. Responses to the survey will be entered into the web-based system, but no identifying information will be entered. A consent-to-contact form will accompany the Training Exit Survey for respondents interested in being recontacted for administration of the Training Utilization and Preservation – Survey (TUP-S) and the Training Utilization and Preservation – Interview (TUP-I) (see Attachments C.2 and D.2). The consent-to-contact form will include the training participant ID and the identifying information necessary for contacting selected respondents for the TUP-S and the TUP-I. However, again, no identifying information will be entered into the web-based data collection and management system, and all consent-to-contact forms will be stored separately from Training Exit Survey responses in order to protect the privacy of respondents. For respondents not selected for the TUP-S and the TUP-I, the consent-to-contact forms will be destroyed upon completion of the study component.


Training Utilization and Preservation – Survey (TUP-S) and TUP Key Informant Interviews (TUP-I): Contact information for the telephone-administered TUP-S and TUP-I will be collected through the Consent to Contact form that will be distributed at trainings along with the Training Exit Survey (see Attachments C.2 and D.2). The Consent-to-Contact form will include a training participant ID (which contains no identifying information) and ask participants to provide the identifying information (name, telephone number and mailing address) necessary for contacting them for the TUP-S and TUP-I and for administering the incentive. The hard copy Consent-to-Contact forms will be stored in locked cabinets and the contact information will be entered into a password-protected database which can only be accessed by the limited number of individuals (selected ICF Macro staff - telephone interviewers, data analysts and administrative staff for administering the incentives) who require access. These individuals have signed confidentiality, data access and use agreements. Datasets used by the data analysts will be stripped of any identifying information. Once the incentives are sent out, respondent contact information will be deleted from the database and the hard copy consent to contact forms will be destroyed. At the start of the telephone interview, verbal consent will be obtained from the respondents (See Attachments C.3 and D.3).
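As a purely illustrative sketch of the separation described above, the fragment below keys survey responses to a randomly generated participant ID, holds contact details in a separate store, and deletes the contact record once the incentive has been sent; all names here are hypothetical and do not describe the evaluation's actual software.

    import secrets

    responses = {}     # survey answers, keyed by participant ID only
    contact_info = {}  # identifying details, held separately and access-restricted

    def new_participant_id() -> str:
        """Generate a random training participant ID that encodes nothing about the person."""
        return secrets.token_hex(8)

    def record_response(pid: str, answers: dict) -> None:
        responses[pid] = answers  # no name, phone, or address is stored with the answers

    def record_consent_to_contact(pid: str, name: str, phone: str, address: str) -> None:
        contact_info[pid] = {"name": name, "phone": phone, "address": address}

    def purge_contact_after_incentive(pid: str) -> None:
        """Once the incentive is mailed, drop the link between the ID and the identity."""
        contact_info.pop(pid, None)

    pid = new_participant_id()
    record_response(pid, {"q1": "agree", "q2": "strongly agree"})
    record_consent_to_contact(pid, "Jane Doe", "555-0100", "1 Main St")
    purge_contact_after_incentive(pid)
    assert pid in responses and pid not in contact_info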


Campus Infrastructure Interviews. Identifying information will also be obtained for participants in the Campus Infrastructure Interviews in order to contact respondents. However, no identifying information will be entered or stored in the data collection and management system, nor will it be linked to responses. Contact data and IDs will be kept in a password-protected Microsoft Access tracking database separate from the survey database. Other procedures for assuring the privacy of respondents will include limiting the number of individuals who have access to identifying information, using locked files to store hardcopy forms that include identifying information, assigning unique code numbers to each participant to ensure anonymity, and implementing guidelines pertaining to data submission and dissemination. Data collectors will be extensively trained and will be responsible for entering data into the web-based data collection system. The Campus Infrastructure Interviews include a verbal consent form (see Attachment H.2).


Referral Network Survey (RNS). Identifying information for respondents to the Referral Network Survey will be necessary in order to administer the RNS by telephone. Contact information will be limited to agency affiliations, names, and telephone numbers. Contact information will be entered into a password-protected database that can be accessed only by a limited number of individuals (selected ICF Macro staff – telephone interviewers and cross-site team members) who require access. These individuals have signed confidentiality, data access, and use agreements. Datasets used by the data analysts will be stripped of identifying names and telephone numbers. However, although an individual's name will not appear in any reports or datasets, the reports and datasets will contain the name of the agency/organization and the information provided about it. Therefore, an individual may be identifiable when results are reported. Respondents are informed of this possibility in the verbal consent statement at the start of the interview (see Attachment E).


SPEAKS-Student and Faculty/Staff Version. Identifying information will be necessary to send out the pre-notification letter by mail and email with login information for the survey. Identifying information will be limited to mailing addresses, email addresses and campus affiliations and will not be stored with survey responses. Contact information will be entered into a password-protected database which can only be accessed by the limited number of individuals (selected ICF Macro staff - telephone interviewers, data analysts and administrative staff for administering the incentives) who require access. These individuals have signed confidentiality, data access and use agreements. Respondents will be assigned a username and password. To ensure privacy, no identifying information will be entered in the data collection and management system. Therefore, no identifying information will be associated with individual responses and no identifying information will be used for analysis or reporting efforts.


Case Study Key Informant Interviews (CSIs) for the Campus Case Studies. Interview respondents will sign a consent form (see Attachment J.8); however, no identifying information will be entered or stored in the data collection and management system, nor will it be linked to responses. The case study team will fill out a cover sheet with respondent information and respondent IDs. This cover sheet will be removed from the hard-copy interview and stored separately. IDs will be kept in a password-protected Microsoft Access tracking database separate from the interview content database. Other procedures for assuring the privacy of respondents will include limiting the number of individuals who have access to identifying information, using locked files to store hardcopy forms that include identifying information, assigning unique code numbers to each participant to ensure anonymity, and implementing guidelines pertaining to data submission and dissemination. Data collectors will be extensively trained and will be responsible for entering data into the web-based data collection system.


Focus Groups for the Campus Case Studies. Students, faculty and staff members affiliated with the universities selected for the case studies will participate in focus groups during the first on-site visit. Focus groups will be audio-recorded and transcripts will be produced. Participants will sign a hardcopy consent form (See attachments I.3 and I.4), but no identifying information will be obtained. In addition, respondents will be asked to use first names only or alternate names during focus groups. Consent forms will be stored in locked cabinets, separate from the qualitative data collected. In addition, the case study team will maintain anonymity and privacy by implementing guidelines pertaining to data submission and dissemination. Data collectors will be extensively trained and will be responsible for entering data into the web-based data collection system.


11. Questions of a Sensitive Nature


Because this project concerns suicide prevention, the survey, interview, and focus group instruments include questions that are potentially sensitive. These questions collect information about mental health, substance abuse, and family circumstances, and they are central to the agency's goal of learning about the protective factors and campus wellness context related to suicide prevention. Names and email addresses collected as part of the consent process will be kept separate from responses, as stated above. All data will be managed and stored in the manner described above and therefore will be unavailable to anyone but authorized project staff. Active consent forms explicitly advise potential respondents and participants about the sensitive nature and content of the data collection protocol as well as the voluntary nature of all data collection activities. Unanticipated or negative consequences will be reported immediately to the campus and Macro International Institutional Review Boards. The Principal Investigator and Project Director will also consult with appropriate clinical professionals, immediately determine whether a participant presents a risk to themselves or others, and make appropriate referrals.


12. Estimates of Annualized Burden Hours and Costs


Data collection for the cross-site evaluation for the 48 State/Tribal grantees and the 38 Campus grantees will cover a 3-year project period. Data collection for the currently active grantees is operating under the previously approved OMB clearance, which will expire in May 2010.


The table below shows the burden associated with cross-site evaluation and enhanced evaluation data collection activities and the associated costs. The number of grantees for which burden is calculated is 86 (48 State/Tribal grantees and 38 Campus grantees), which represents the number of currently active grantees and is close to the 91 grantees used in the previously approved package. It should be noted that this number is used as an estimate of the number of grantees active per year. Forty-six grantees (of the 86) were funded in October 2008 and will reach the end of their grant period in September 2011. At that point, additional grantees may be funded. Therefore, we estimate that in a given year there will be 86 active grantees.

The costs were calculated using hourly wage rates for the appropriate wage categories, drawn from the National Compensation Survey (BLS, 2006) and the American Association of University Professors (AAUP) National Survey of university faculty salaries.
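Each row of Table 3 follows the same arithmetic: annual response burden is the number of respondents times responses per respondent times hours per response, and total cost is burden times the hourly wage. As a worked example using the PSI-ST row:

\[
48 \times 4 \times 0.75 = 144 \text{ hours}, \qquad 144 \times \$33.74 = \$4{,}858.56 \approx \$4{,}859
\]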


Table 3

Annualized Burden Hours and Costs

STATE/TRIBAL CROSS-SITE EVALUATION INSTRUMENTS

Type of Respondent | Measure Name | No. of Respondents | Responses/Respondent | Hours/Response | Response Burden (hours) | Wage ($) | Total Cost ($)
Project Evaluator | Prevention Strategies Inventory - State/Tribal (PSI-ST) | 48 | 4 | 0.75 | 144 | 33.74 | 4,859
Provider (Trainees) | Training Exit Survey State/Tribal (TES-ST) | 94,848 | 1 | 0.17 | 16,125 | 20.13 | 324,597
Provider (Trainees) | Training Utilization and Penetration Survey (TUP-S) | 2,000 | 1 | 0.25 | 500 | 20.13 | 10,065
Provider (Trainees) | Training Utilization and Penetration Interview (TUP-I) | 100 | 1 | 0.67 | 67 | 20.13 | 1,349
Provider (Stakeholder) | Referral Network Survey (RNS) | 1,024 | 1 | 0.67 | 687 | 20.13 | 13,830
Project Evaluator | Early Identification, Referral and Follow Up Analysis (EIRF) | 48 | 4 | 1 | 192 | 33.74 | 6,479
Project Evaluator | Early Identification, Referral and Follow Up Aggregate Screening Form (EIRF-S) | 48 | 4 | 0.33 | 64 | 33.74 | 2,160
Project Evaluator | Training Exit Survey Cover Page State/Tribal (TES-CP-ST) | 48 | 4 | 0.33 | 64 | 33.74 | 2,160

CAMPUS CROSS-SITE EVALUATION INSTRUMENTS

Type of Respondent | Measure Name | No. of Respondents | Responses/Respondent | Hours/Response | Response Burden (hours) | Wage ($) | Total Cost ($)
Project Evaluator | Prevention Strategies Inventory - Campus (PSI-C) | 38 | 4 | 0.75 | 114 | 33.74 | 3,847
Provider (Trainees) | Training Exit Survey Campus (TES-C) | 23,712 | 1 | 0.17 | 4,032 | 23.73 | 95,680
Student | Suicide Prevention Exposure, Awareness and Knowledge Survey - Student Version (SPEAKS-S) | 7,600 | 1 | 0.42 | 3,192 | 7.25 | 23,142
Faculty | Suicide Prevention Exposure, Awareness and Knowledge Survey - Faculty/Staff Version (SPEAKS-FS) | 1,900 | 1 | 0.25 | 475 | 32.94 | 15,647
Student | Campus Infrastructure Interview (CIFI) for Student | 38 | 1 | 0.75 | 29 | 7.25 | 211
Faculty | Campus Infrastructure Interview (CIFI) for Faculty | 76 | 1 | 0.75 | 57 | 32.94 | 1,878
Administrator | Campus Infrastructure Interview (CIFI) for Administrator | 38 | 1 | 0.75 | 29 | 35.77 | 1,038
Counselor | Campus Infrastructure Interview (CIFI) for Counselor | 38 | 1 | 0.75 | 29 | 23.73 | 689
Project Evaluator | Training Exit Survey Cover Page Campus (TES-CP-C) | 38 | 4 | 0.33 | 51 | 33.74 | 1,721
Project Evaluator | MIS Data Abstraction | 38 | 4 | 0.33 | 51 | 33.74 | 1,721

CAMPUS CASE STUDY INSTRUMENTS

Type of Respondent | Measure Name | No. of Respondents | Responses/Respondent | Hours/Response | Response Burden (hours) | Wage ($) | Total Cost ($)
College Student | Focus Group - Student Version | 216 | 1 | 1.5 | 324 | 7.25 | 2,349
College Faculty | Focus Group - Faculty Version | 72 | 1 | 1.5 | 108 | 32.94 | 3,558
College Staff | Focus Group - Staff Version | 36 | 1 | 1.5 | 54 | 23.73 | 1,282
College Student | Interview - Student Leader Version | 8 | 1 | 1 | 8 | 7.25 | 58
College Student | Interview - Case Finder Version | 4 | 1 | 1 | 4 | 7.25 | 29
College Faculty | Interview - Faculty Version | 8 | 1 | 1 | 8 | 32.94 | 264
College Staff | Interview - Campus Police Version | 8 | 1 | 1 | 8 | 18.90 | 152
College Staff | Interview - Counseling Staff Version | 8 | 1 | 1 | 8 | 23.73 | 190
College Staff | Interview - Prevention Staff Version | 12 | 1 | 1 | 12 | 23.73 | 285
College Staff | Interview - Administrator Version | 8 | 1 | 1 | 8 | 35.77 | 287

Total | | 132,060 | | | 26,444 | | 519,527




Wage rate sources:

1. National Compensation Survey: Occupational Wages in the United States, June 2006, U.S. Bureau of Labor Statistics (BLS), U.S. Department of Labor, June 2007. The category Market and Survey Researchers under Life, Physical and Social Science Occupations was used as an approximation for Project Evaluators. Link: http://www.bls.gov/ncs/ocs/sp/ncbl0910.pdf

2. National Compensation Survey: Occupational Wages in the United States, June 2006, U.S. Bureau of Labor Statistics (BLS), U.S. Department of Labor, June 2007. Category: Social Workers under Community and Social Services Occupations. Link: http://www.bls.gov/ncs/ocs/sp/ncbl0910.pdf

3. National Compensation Survey: Occupational Wages in the United States, June 2006, U.S. Bureau of Labor Statistics (BLS), U.S. Department of Labor, June 2007. Category: Counselors under Community and Social Services Occupations. Link: http://www.bls.gov/ncs/ocs/sp/ncbl0910.pdf

4. Federal Minimum Wage. Link: http://www.dol.gov/elaws/faq/esa/flsa/001.htm

5. Based on the 2004-2005 American Association of University Professors (AAUP) Annual Salary Survey, which found that the annual average salary for professors was $68,505. Link: http://www.aaup.org/

6. National Compensation Survey: Occupational Wages in the United States, June 2006, U.S. Bureau of Labor Statistics (BLS), U.S. Department of Labor, June 2007. Category: Protective Services Occupations. Link: http://www.bls.gov/ncs/ocs/sp/ncbl0910.pdf

7. National Compensation Survey, Bureau of Labor Statistics (BLS), U.S. Department of Labor, Administrators - education and related fields, July 2004.


Annualized Summary Table

Respondents | Number of Respondents | Responses/Respondent | Total Responses | Total Annualized Hour Burden

STATE/TRIBAL CROSS-SITE EVALUATION INSTRUMENTS
Project Evaluators | 192 | 16 | 3,072 | 464
Providers | 97,972 | 4 | 391,888 | 17,379

CAMPUS CROSS-SITE EVALUATION INSTRUMENTS
Project Evaluators | 114 | 12 | 1,368 | 216
Students | 7,638 | 2 | 15,276 | 3,221
Campus Staff | 2,052 | 4 | 8,208 | 590
Providers | 23,712 | 1 | 23,712 | 4,032

CAMPUS CASE STUDY INSTRUMENTS
Students | 228 | 3 | 684 | 336
Campus Staff | 152 | 7 | 1,064 | 206

Total | 132,060 | 49 | 445,272 | 26,444


13. Estimates of Annualized Cost Burden to Respondents or Record Keepers



Grantees are collecting the majority of the required data elements as part of their normal suicide prevention program operations. Grantees will maintain this information for their own program planning, quality improvement, and reporting purposes. Therefore, there are no additional capital or start-up costs associated with the cross-site evaluation. There will be some additional burden on record keepers to provide potential respondent lists for data collection activities; however, these operational costs will be minimal.


Other costs related to this effort, such as the cost of shipping completed questionnaires (i.e., training exit surveys) and consent-to-contact forms, are costs to the Federal government, covered as part of the funding received for participation in the cross-site evaluation. Each grantee has been funded, as part of the overall cooperative agreement award, to support an evaluator and related costs to carry out the requirements of the cross-site evaluation. Therefore, no cost burden is imposed on the grantee by this additional effort.



14. Estimates of Annualized Cost to the Government



CMHS has planned and allocated resources for the management, processing and use of the collected information in a manner that shall enhance its utility to agencies and the public. Including the Federal contribution to local grantee evaluation efforts, the contract with the National Evaluator, and government staff to oversee the evaluation, the annualized cost to the government is estimated at $3,583,452. These costs are described below.

Each grantee is expected to fund an evaluator to conduct the self-evaluation and to satisfy the requirements of the cross-site evaluation. It is estimated that participating in the cross-site evaluation will require 0.20 full-time equivalent (FTE) to collect information, enter information into the web-based data collection and management system, and to conduct analyses at the local level. Assuming: 1) an average annual salary of $62,614 (BLS, 2006) for a 0.20 FTE evaluator, 2) 48 State/Tribal and 38 Campus grantees; and 3) that Campus grantees had to cost share on a 1:1 basis, the annual cost for the cross-site evaluation at the grantee level is estimated at $839,028. These monies are included in the cooperative agreement awards.


The cross-site evaluation contract has been awarded to ICF Macro for evaluation of the 86 suicide prevention programs. The current cross-site evaluation contract with SAMHSA provides $10,804,485 for a five-year period. The estimated average annual cost of the contract will be $2,160,897. Included in these costs are the expenses related to developing and monitoring the cross-site evaluation including, but not limited to, developing the evaluation design; developing the cross-site evaluation instrumentation; developing training and technical assistance resources (i.e., manuals, training materials, etc.); conducting in-person or telephone training and technical assistance; monitoring of grantees; traveling to grantee sites and relevant meetings; and data analysis and dissemination activities. In addition, these funds will support the development of the web-based data collection and management system and fund staff support for data collection.


It is estimated that CMHS will allocate 0.30 of a full-time equivalent each year for government oversight of the evaluation. Assuming an annual salary of $80,000, these government costs will be $24,000 per year. In addition, through the interagency agreement between SAMHSA and the CDC, the CDC will allocate 0.50 of a full-time equivalent each year for government oversight, technical assistance, and monitoring of the enhanced evaluation. Assuming an annual salary of $80,000, these government costs will be $40,000 per year.


15. Change in Burden


Currently there are 6,331 total burden hours in the OMB inventory. SAMHSA is requesting 26,444 hours for this submission. This represents an increase in burden of 20,113 hours due to a program change. The number of grantees for which burden is calculated is 86 (48 State/Tribal grantees and 38 Campus grantees), which represents the number of currently active grantees and is close to the 91 grantees (36 State/Tribal grantees and 55 Campus grantees) used in the previously approved package.


Major program changes that account for the change in burden are described below:


• In the previously approved OMB package, the estimated number of respondents for the Training Exit Survey State/Tribal (TES-ST) was 12,000. This number has been revised to 94,848: based on data collected to date, each State/Tribal site trains approximately 1,976 individuals per year. The increase in burden is 14,085 hours (see the worked check following this list).

  • We propose to implement Training Exit Survey Campus (TES-C) for Campus grantees in order to significantly increase our understanding of training activities implemented by Campus sites. In the previously approved package, Campus sites were not required to do this survey. The estimated burden for this effort is 4,032 hours.

• We propose to implement a new data collection instrument, the Training Utilization and Penetration Survey (TUP-S), in order to expand our knowledge base on the utilization and retention of the knowledge, skills, and techniques participants learned through training. This survey will be administered to a random sample of 2,000 participants per year from State/Tribal sites. The estimated burden for this effort is 500 hours.

• Questions have been added to the Suicide Prevention Exposure, Awareness and Knowledge Survey - Student Version (SPEAKS-S) to improve our understanding of students’ perceptions around mental health, their self-efficacy in recognizing and responding to individuals at risk, their sense of connectedness, and their help-seeking behaviors. This has increased the length of the survey from 10 minutes per respondent to 25 minutes per respondent. The increase in burden is 792 hours.

  • For the Enhanced Evaluation component of the Cross-Site Evaluation, we propose to implement case studies of four exemplary Campus suicide prevention programs. The Campus Case Studies represent a new data collection effort. In the previously approved package, clearance for a State/Tribal enhanced evaluation was provided. The estimated burden for the Campus Case Studies is 542 hours.

• Qualitative data collected thus far through follow-up interviews with trainees two months post-training has provided rich information on how trainees have utilized the QPR, ASIST, and AMSR training curricula. The improved qualitative Training Utilization and Penetration Interviews (TUP-I) will be targeted toward relatively under-studied training types, such as locally developed training curricula and under-studied standardized curricula (e.g., safeTALK, Yellow Ribbon, Sources of Strength). Consequently, 10 trainings per year will be targeted, fewer than the 36 trainings (1 training per grantee) provided for in the previously approved package. The decrease in burden is 174 hours.
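As a worked check on the first and last items above, applying the burden formula from Section 12 (for the TUP-I, we assume 10 respondents per training under the previous design, as in the current one):

\[
\text{TES-ST: } 94{,}848 \times 0.17 - 12{,}000 \times 0.17 \approx 16{,}125 - 2{,}040 = 14{,}085 \text{ hours}
\]
\[
\text{TUP-I: } (36 \times 10) \times 0.67 - 100 \times 0.67 \approx 241 - 67 = 174 \text{ hours}
\]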



16. Time Schedule, Publication, Analysis Plans



a. Time Schedule



The time schedule for implementing the cross-site evaluation is summarized in Table 4. A 3-year clearance is requested for this project.


Table 4

Time Schedule

Activity | Date
Begin data collection for 86 grantees (48 State/Tribal grantees & 38 Campus grantees) | June 2010 (1 month after OMB approval, estimated to occur in May 2010)
Begin data collection for 4 Campus sites selected for the Enhanced Evaluation Campus Case Studies | June 2010 (1 month after OMB approval)
Final GLS Campus and State/Tribal Programs Evaluation Report | October 2010
Data collection completed for the grantees funded in FY2008 (a new cohort of grantees may replace this cohort) | September 2011
Final GLS Campus and State/Tribal Programs Evaluation Report | October 2011
Data collection completed for grantees funded in FY2009 (a new cohort of grantees may replace this cohort) | September 2012
Final GLS Campus and State/Tribal Programs Evaluation Report | October 2012
Data collection continues until expiration of OMB approval | May 2012




b. Publication Plans



The GLSMA requires annual reports summarizing the results of the cross-site evaluation. The cross-site evaluation team will analyze data collected and prepare interim annual reports to summarize key findings. A final report on the results of the cross-site evaluation is also required by the GLSMA, and will be produced by the cross-site evaluation team no later than 3 years after the grants were received.


Because of the importance of the cross-site evaluation to the field of suicide prevention, in collaboration with SAMHSA and the government project officer, we will publish the results of the cross-site evaluation in relevant professional journals to inform the research community as well as the decision making of policymakers and program administrators. Up to 5 publications are planned and will most likely be submitted in the final year of the cross-site evaluation. Possible publications include a manuscript providing an overview of the GLS Suicide Prevention Program and its key findings, as well as manuscripts reporting results from the Training Exit Survey, Training Utilization and Penetration Interview, Training Utilization and Penetration Survey, Referral Network Survey, SPEAKS, Campus Infrastructure Interviews, Early Identification, Referral and Follow Up data abstractions, and Campus Case Studies. All publications will be submitted to the Government Project Officer (GPO) in draft form for review and approval prior to submission to the selected journal.


Examples of journals that will be considered as vehicles for publication include the following:


  • American Journal of Public Health

  • American Psychologist

  • American Journal of Diseases of Children

  • Child Development

  • Evaluation Review

  • Evaluation Quarterly

• Journal of the American Academy of Child and Adolescent Psychiatry

• Journal of Applied Developmental Psychology

  • Journal of Child and Family Studies

  • Journal of Clinical Child and Adolescent Psychology

  • Journal of Consulting and Clinical Psychology

  • Journal of Health and Social Behavior

  • Journal of Mental Health Administration

  • Psychological Reports

  • Social Services Review

  • Suicide and Life Threatening Behavior


c. Data Analysis Plan



The cross-site evaluation data collected through the different stages of the evaluation will be analyzed to address key questions. Both quantitative and qualitative analysis techniques will be used to determine and compare sites in terms of the interventions adopted, level of implementation, reach and outcomes of the different efforts, and mediators associated with these results. The following subsections describe the analyses and data sources associated with each core evaluation question in the cross-site evaluation and the campus case studies.



CROSS-SITE EVALUATION


What is the reach of the early intervention and prevention strategies?

The cross-site evaluation team will employ descriptive statistics to determine the reach of the early intervention and prevention strategies, primarily based on the information collected through the Early Identification Referral and Follow Up abstraction. We will provide descriptive summaries concerning youth identified at risk; youth receiving in-school or community early identification programming; youth referred for intervention; location of referral; frequency of service receipt; the reasons recommended services were not provided; and individual characteristics of youth identified as being at risk. Additionally, the proposed Training Utilization and Penetration Survey and Training Utilization and Penetration Interview will enable more consistent determination of prevention strategy reach in sites focusing on general community gatekeeper training. Because this data collection activity will be based on a probabilistic sample, appropriate measures of uncertainty (i.e., standard errors and confidence intervals) will be calculated and reported with corresponding summary statistics.

What types of prevention/intervention programs were used?

To understand the types of prevention/intervention programs used, we will rely largely on the descriptive information about the mix of products and services developed and used by sites, as collected through the Prevention Strategies Inventory. We will produce descriptive information on the number of sites adopting each specific intervention and use robust statistics of central tendency and dispersion to summarize how sites apportion their budgets across intervention types. Clustering techniques are particularly useful for analyzing patterns of budget allocation and will permit identification of groups of sites with a similar focus.
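As an illustrative sketch of this clustering step (not the evaluation’s specified software or algorithm), k-means applied to sites’ budget-share vectors would group sites with similar allocation patterns; the data below are simulated and the dimensions are hypothetical:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical input: one row per site, one column per intervention type,
# holding the share of the site's budget allocated to that intervention.
rng = np.random.default_rng(0)
budget_shares = rng.dirichlet(alpha=np.ones(6), size=86)  # 86 sites, 6 intervention types

# Group sites with similar allocation patterns; k = 4 is for illustration only.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(budget_shares)

# Cluster centers describe the "typical" budget mix of each group of sites.
for label, center in enumerate(kmeans.cluster_centers_):
    print(f"Cluster {label}: budget mix {np.round(center, 2)}")
```

In practice the number of clusters would be chosen from the data, for example by inspecting within-cluster dispersion across several values of k.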

What kinds of services were recommended to youth who were determined to be at risk for suicidal behavior?

Data from the Early Identification Referral and Follow Up abstraction of the State/Tribal evaluation will be used to understand service recommendations for youth determined to be at risk for suicidal behavior. We will use descriptive statistics to determine the types of referrals youth received, such as in-school counseling, community mental health, or emergency services. Chi-square tests and related analysis techniques will be used as appropriate to compare referral patterns across identification settings and sources of referral. The likelihood of service receipt at follow-up as a function of site- and individual-level characteristics will also be examined within a mixed-effect regression framework. In the case of the Campus evaluation, descriptive analysis of the MIS extraction will be used to assess mental health service use and its evolution over time, while the Campus Infrastructure Interviews will enable a qualitative assessment of the service infrastructure existing on campus. A more in-depth analysis of service recommendation patterns will be conducted through the Campus case studies described below.

What sorts of linkages were made as a result of the referral mechanisms used?

To understand the influence of referral mechanisms on subsequent linkages, we propose social network analysis based on the information collected through the Referral Network Survey. Social network analysis will examine the collaborations occurring between organizations within a potentially complex web of referral sources. By examining basic characteristics such as quality and symmetry of relationships, centrality, and density, we can understand the extent to which the major agencies of youth-serving systems are working together to ensure that at-risk youth receive services.
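A minimal sketch of these network measures, assuming the Python networkx library; the agencies and ties below are hypothetical stand-ins for dyads reported on the Referral Network Survey:

```python
import networkx as nx

# Hypothetical directed referral network: an edge A -> B means agency A
# reports referring at-risk youth to agency B.
edges = [
    ("School District", "Community Mental Health"),
    ("School District", "Emergency Services"),
    ("Juvenile Justice", "Community Mental Health"),
    ("Community Mental Health", "School District"),  # a reciprocated tie
]
G = nx.DiGraph(edges)

# Density: proportion of possible referral ties actually present.
print("Density:", nx.density(G))

# Degree centrality: which agencies sit at the hub of the referral web.
print("Centrality:", nx.degree_centrality(G))

# Reciprocity: share of ties that run in both directions (symmetry).
print("Reciprocity:", nx.reciprocity(G))
```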

What were the process measures?

Process measures will include both quantitative and qualitative components that address different aspects of program implementation. In particular, given their key role among the interventions adopted, an extended set of process measures will focus on training implementation. The cross-site evaluation team will generate summary statistics for the types of training implemented using a classification compatible with the National Registry of Evidence-based Programs and Practices (NREPP), which specifies the level of evidence supporting each type of training. Descriptive information on the number of gatekeepers trained, their individual characteristics, and the settings in which they interact with youth will be provided. Additional analyses will include participants’ training satisfaction and anticipated use of the tools learned.



Based on the information collected through the proposed Training Exit Survey’s specific modules, the cross-site evaluation team will develop fidelity indicators for a set of specific training interventions. We will explore the association of these measures with intended training outcomes, both immediate (as captured by the Training Exit Survey) and over time (as captured by the quantitative Training Utilization and Penetration Survey), using bivariate measures of association such as Pearson’s correlation coefficient. Additional assessment of this relationship, as well as the ways in which it is mediated by trainee characteristics, training features, and site-level/context variables, will be performed using linear mixed-effect regression.

What are the mediators associated with changes in outcomes of these programs and services?

The association of outcome indicators with both individual- and site/program-level mediators will be examined using multivariate analysis techniques based on information collected through the Early Identification Referral and Follow Up abstraction and the quantitative Training Utilization and Penetration Survey, in the case of State/Tribal Programs, and the Suicide Prevention Exposure, Awareness and Knowledge Survey, in the case of Campus Programs, combined with site-level information from context and product data collection activities. In particular, we propose to use parametric modeling for outcomes of interest within a mixed-effect regression framework. As with other regression techniques, we will estimate coefficients representing the differential importance of each mediator in predicting the outcome. Unlike simple regressions, however, mixed-effect or multilevel models allow for correct statistical inference by accounting for the clustering of observations within sites. Furthermore, the models provide estimates of the relative importance of each source of variation. Finally, by borrowing strength from sites with a greater number of observations, Bayesian estimation of site random effects can be used to identify over- and under-achieving sites with added precision, which in turn may suggest additional hypotheses regarding mediators.


The specific type of regression to be implemented will vary depending on the outcome under analysis. For instance, the number of children identified at risk or referred to services is more naturally modeled with a Poisson model, while the likelihood of receiving services at follow-up is more adequately modeled with a logistic model. The proposed strategic planning analysis, in the case of the State/Tribal evaluation, and the analysis of the Campus Infrastructure Interviews, in the case of the Campus evaluation, will allow us to perform a qualitative assessment of the association between outcomes and mediators as perceived by the sites’ main stakeholders, particularly in relation to system outcomes.
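As a sketch of this framework, a two-level mixed-effect logistic model for whether youth $i$ in site $j$ receives services at follow-up could take the form (the covariates are illustrative, not the evaluation’s final specification):

\[
\operatorname{logit}\,\Pr(Y_{ij} = 1) = \beta_0 + \mathbf{x}_{ij}'\boldsymbol{\beta} + \mathbf{z}_j'\boldsymbol{\gamma} + u_j, \qquad u_j \sim N(0, \sigma_u^2),
\]

where $\mathbf{x}_{ij}$ collects individual-level mediators, $\mathbf{z}_j$ collects site-level characteristics, and the random effect $u_j$ absorbs the within-site clustering; for count outcomes, a Poisson analogue replaces the logit link with a log link.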






CAMPUS CASE STUDIES


What are the student-level and population-level factors impacted by suicide prevention and mental health efforts on campus?


Analyses will focus on the risk and protective factors associated with suicide prevention and campus wellness. A blended qualitative and quantitative design will be used for data collection and analyses. Qualitative measures will be anchored around issues of protective internal (cognitive) factors such as problem-solving, planning, and positive thought as well as external (behavioral) factors such as seeking help and advice, and avoiding risky situations. Analytical anchors for the risk measures will include cognitive factors such as denial and negative thought, suicidal ideation, and depression as well as behavioral factors including substance abuse and risky behaviors. Multivariate analyses of quantitative measures of help seeking, coping, mental health status, and student demographics will focus on the interrelationships between these factors.


What is the campus infrastructure available to support suicide prevention and student mental health?


The analyses for this question tie key informant interviews and focus groups together to develop a detailed qualitative description of the core policy, finance, and procedural components of each campus’s well-being efforts. Analytical anchors for these qualitative measures include referral protocols, information sharing policies, finance policies, emergency mental health protocols, student monitoring procedures, and mental health service accessibility.


What is the campus approach to suicide prevention?


Analyses corresponding to this question will also be based on the key informant interview and focus group questions with a specific focus on the campus’ programmatic approach to suicide prevention and the resources promulgated by and built around the GLS project funding. Analytical anchors for these qualitative measures include social marketing campaigns, suicide prevention training, and program-specific outreach.


What is the campus climate around mental health and wellness?


This question will be addressed through a blended qualitative and quantitative analytical approach. Qualitative data generated through key informant interviews and focus groups will be anchored by measures of student, faculty, and staff perceptions about high-risk behaviors, mental illness, and mental health services, with a specific focus on coping and help-seeking facilitators and barriers on the campuses. Quantitative measures from the Suicide Prevention Exposure, Awareness and Knowledge Survey will be matched with these qualitative measures at the student level to examine the interrelationships between climate, service utilization, and perceptions about well-being.


17. Display of Expiration Date


All data collection instruments will display the expiration date of OMB approval.


18. Exceptions to the Certification Statement


This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.


B. Statistical Methods



1. Respondent Universe and Sampling Methods



The respondent universe and sampling methods are described for each of the data collection instruments below.


Prevention Strategies Inventory (PSI). Respondents for the Prevention Strategies Inventory will be project evaluators and/or program staff. Each of the 48 State/Tribal grantees and 38 Campus grantees will be required to complete the inventory.


Training Exit Survey (TES). Respondents for the Training Exit Survey will include all individuals who participate in a training activity sponsored by the 48 State/Tribal grantees and 38 Campus grantees. The survey will be administered one time to each training participant for each training activity; therefore, no statistical methods will be used to identify respondents. It is estimated that there will be about 94,848 trainees from State/Tribal sites and 23,712 trainees from Campus sites per year. These numbers are based on data previously collected, which indicate that State/Tribal sites train a mean of 1,976 participants per year and Campus sites train a mean of 624 participants per year. Because the respondents to the survey represent the entire trainee population in each grantee site, there is no need to calculate the precision of point estimates for survey responses. The number of respondents will be sufficient to conduct assessments of the psychometric properties of the scales developed for this study both within and across grantee sites.
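These per-year totals follow directly from the grantee counts and per-site training means:

\[
48 \times 1{,}976 = 94{,}848 \text{ State/Tribal trainees}, \qquad 38 \times 624 = 23{,}712 \text{ Campus trainees}
\]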


Training Utilization and Penetration Survey (TUP-S). The ICF Macro team will design and select a multistage sample of trainees within trainings within grantees; i.e., a three-stage sample of individuals selected to complete the survey. We expect that approximately 2,000 trainees will participate in the survey. The sampling frame will be constructed from grantee-specific lists of trainees that include contact information (telephone number). We will select the sample so that 5 participating trainees complete the survey in each of 8 sample trainings selected per grantee. We show below how these sample sizes will ensure that confidence intervals are all within +/- 5 percentage points for all parameters of interest. The allocation of the two-stage sample of trainings and trainees within each grantee was designed to minimize the variance-inflating effects of clustering, which may be quantified via the intracluster correlation coefficient (ICC). There are in fact two ICC components that impact the variance: the component across trainings within a grantee, and the component across individuals within the same training. To estimate the precision expected for survey estimates, we focus on estimated percentages or proportions. The key survey estimates will indeed take the form of the percentage of trainees who use a certain aspect of the training. Because the variance achieves its maximum value for percentages of 50%, it is sufficient to ensure that precision requirements are met for estimates in this range. The table below presents the (maximum) standard errors expected for estimated percentages under various ICC scenarios. For simplicity, the exhibit is confined to within-grantee ICCs, which contribute the most to the variance, and examines ICCs between 0.01 and 0.05, the expected range for these parameters.

Table 5: Standard Error of Estimated Percentages

Intracluster Correlation | Standard Error | 95% CI Half-Width
0.01 | 1.32% | 2.58%
0.02 | 1.49% | 2.92%
0.05 | 1.92% | 3.76%


The table also shows the half-widths of the 95% confidence intervals associated with these estimates. The exhibit shows that 95% confidence intervals will be within +/- 4 percentage points for all parameters; specifically, even for the largest ICC (0.05), intervals will be within +/- 3.76 percentage points. We stress that these estimates reflect clustering effects on the variance, which is larger than the variance of a simple random sample of the same size. These variance-inflating effects can also be quantified by the design effect (DEFF), defined as the variance under the actual sampling design divided by the variance that would be attained by a simple random sample of the same size.
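For reference, the Table 5 entries can be reproduced with the standard cluster-sampling formulas. The sketch below assumes an average cluster size of 40 completed surveys per grantee (8 trainings of 5 trainees each), which is our reading of the design described above rather than a stated parameter:

```python
import math

N_TOTAL = 2000   # trainees completing the TUP-S per year
M_CLUSTER = 40   # completed surveys per grantee: 8 trainings x 5 trainees
P = 0.5          # worst-case proportion (maximum variance)
Z = 1.96         # 95% confidence multiplier

for icc in (0.01, 0.02, 0.05):
    deff = 1 + (M_CLUSTER - 1) * icc              # design effect for clustering
    se_srs = math.sqrt(P * (1 - P) / N_TOTAL)     # SE under simple random sampling
    se = se_srs * math.sqrt(deff)                 # SE inflated by clustering
    print(f"ICC={icc:.2f}: SE={se:.2%}, 95% CI half-width={Z * se:.2%}")
```

Running this prints 1.32%/2.58%, 1.49%/2.92%, and 1.92%/3.76%, matching Table 5.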


Training Utilization and Penetration Interviews (TUP-I). Many of the State/Tribal programs are planning multiple training activities; therefore, to obtain information from key informants who experienced the same training activity, the cross-site evaluation team, in consultation with local program staff, will select 10 particular training activities per year for which to administer the Training Utilization and Penetration Interviews (TUP-I). Respondents to the Training Exit Survey will be asked to complete a separate contact consent form indicating their willingness to be contacted to participate in the TUP-I and to return the form to local program staff. Key informants for the TUP-I will be randomly selected from those individuals who consent to be contacted by the cross-site evaluation team. Local program staff will forward the contact consent forms to the cross-site evaluation team. Ten respondents from each of the 10 selected trainings will be randomly selected from among the potential respondents, based on contact consent information, for a total of 100 respondents per year. Interviews will be conducted within 2 months of completion of the training activity. We estimate that ten respondents per training will be sufficient to ensure saturation of themes in the content analysis of results from the qualitative interviews.


Referral Network Survey (RNS). Respondents for the Referral Network Survey will be identified by the local program staff and/or project evaluators based on the organizations involved in the referral network(s) associated with each of the 48 State/Tribal grantees. Two representatives from each identified referral network organization will be included as respondents. In the first year of administration, each State/Tribal grantee will identify one network of five agencies/organizations and two respondents from each agency/organization. During the interviews, respondents will be asked to nominate one other agency to be part of the network. Thus, in the second and third years of the administration, the network will comprise 10 agencies/organizations. In the first year of administration, the estimated number of respondents is 480. In the second and third years of administration, the estimated number of respondents per year is 960. No statistical methods will be used to identify respondents for the Referral Network Survey.


Campus Infrastructure Key Informant Interviews (CIFI). Key informants for the Campus Infrastructure Key Informant Interviews will be identified by the local program staff and/or project evaluator to represent five key roles on each campus: (1) Administrator, (2) Student Leader, (3) Counseling Staff, (4) Faculty/Staff from a human services academic department, and (5) Faculty/Staff from a non-human service academic department. One respondent in each category will be interviewed for each of the 38 campus grantees, for a total of 190 respondents in each year of administration. Within respondent categories with more than one appropriate key informant, respondents will be randomly selected. We estimate that one respondent per grantee in each category will be sufficient to ensure saturation of themes in the content analysis of results from the qualitative interviews.


Suicide Prevention Exposure, Awareness and Knowledge Survey - Student Version (SPEAKS-S). Respondents for the student version of the Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS) will represent a sample of the student population. A sampling plan to obtain 200 student respondents in each of the 38 Campus grantees per administration, for a total of 7,600 respondents, will be developed by the cross-site evaluation team. Oversampling will be required: based on data collected thus far, the mean response rate per campus is 12%. Based on our experience implementing this survey, including feedback from participants as well as firewall accessibility issues, a mixed-mode approach will be used to increase response rates (see B2, Procedures for Data Collection). This approach will address previous problems with coverage of solicitations and overall nonresponse. The campus evaluation team will draw a proportionately weighted stratified random sample within each grantee site targeted for SPEAKS administration from the matriculated student register. The matriculated student sample (sampled with replacement) will be stratified by gender, major, matriculation year, and race/ethnicity.
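A minimal sketch of proportionate stratified selection follows; the column names are hypothetical, and grantees’ actual registers and sampling software will differ:

```python
import pandas as pd

def proportionate_stratified_sample(register: pd.DataFrame, strata, total_n=200, seed=1):
    """Allocate the target sample across strata in proportion to stratum size,
    then draw a simple random sample within each stratum."""
    parts = []
    for _, stratum in register.groupby(strata):
        n_h = round(total_n * len(stratum) / len(register))  # proportionate allocation
        if n_h > 0:
            parts.append(stratum.sample(n=min(n_h, len(stratum)), random_state=seed))
    return pd.concat(parts)

# Usage with a hypothetical matriculated-student register:
# sample = proportionate_stratified_sample(
#     students, strata=["gender", "major", "matriculation_year", "race_ethnicity"])
```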


The minimum detectable difference across two waves of administration of SPEAKS for a set of selected variables, with the desired power (80%) and significance level (5%), is estimated at both the Campus and Cross-Site levels for a sample of 200 students per campus per wave and a total of 38 campuses.


For the present power analysis, the null hypothesis is that there is no difference in the mean value of the variable of interest across the 2 waves of administration; the alternative hypothesis is that the two means differ in either direction. At the individual Campus level, the estimates are based on a t-test of the difference in means for two independent samples. At the aggregate cross-site level, however, the correlation among observations from the same campus must be taken into account. The effective sample size is therefore smaller than the total number of students in the sample by a factor that depends on the intraclass correlation (ICC) for each variable.
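Concretely, for a total sample of $n$ respondents drawn in clusters (campuses) of average size $m$, the effective sample size is

\[
n_{\text{eff}} = \frac{n}{1 + (m - 1)\rho},
\]

where $\rho$ is the ICC; the cross-site minimum detectable differences in Table 6 below are computed on this deflated sample size.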


We focus on three variables: Self-rating on Knowledge of Suicide; Perception of Stigma Towards Seeking Mental Health Treatment; and Knowledge of Myths and Facts About Suicide. The three variables are simple mean scores based on six 5-point Likert questions, five 4-point Likert questions, and 28 dichotomous questions, respectively. Estimated mean, standard deviation, and ICC values for the three variables are based on samples of 20,219; 20,224; and 20,140 students, respectively, from 53 different campuses.


The table below presents the minimum detectable difference for each variable both at Campus and Cross-Site levels for the desired power and significance level. In sum, a sample of 200 students per campus in each wave, for a total of 38 campuses, has 80% power to detect a relatively small difference (less than a third of the standard deviation of the variable in all cases) at a 5% significance level.


Table 6: Minimum Detectable Difference with 80% Power at 5% Significance Level

Variable | Mean | Std. Dev. | ICC | MDD, Campus level (n = 200 per wave) | MDD, Cross-site level (N = 7,600 per wave)
Self-rating on Knowledge of Suicide | 3.01 | 0.84 | 0.021 | 0.237 | 0.086
Perception of Stigma Towards Seeking Mental Health Treatment | 2.15 | 0.54 | 0.016 | 0.153 | 0.051
Knowledge of Myths and Facts About Suicide | 0.73 | 0.11 | 0.044 | 0.030 | 0.015
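Under the stated design, these values follow from the standard two-sample formula, $\mathrm{MDD} = (z_{1-\alpha/2} + z_{1-\beta})\,\sigma\,\sqrt{2/n_{\text{eff}}}$. The sketch below computes the Table 6 values; the printed numbers agree with the table to within about 0.002 (small differences reflect the exact critical values and rounding used in the original computation). Substituting n = 50, N = 1,900, and the Table 7 sigmas and ICCs reproduces the faculty/staff case in the same way.

```python
import math

Z_ALPHA, Z_BETA = 1.96, 0.8416  # two-sided 5% significance, 80% power

def mdd(sigma, n_eff):
    """Minimum detectable difference in means across two independent waves."""
    return (Z_ALPHA + Z_BETA) * sigma * math.sqrt(2 / n_eff)

# (sigma, ICC) per SPEAKS-S variable, from Table 6; campus n=200, cross-site N=7,600
for name, sigma, icc in [("Knowledge", 0.84, 0.021),
                         ("Stigma", 0.54, 0.016),
                         ("Myths/Facts", 0.11, 0.044)]:
    n_campus = 200
    n_eff = 7600 / (1 + (n_campus - 1) * icc)  # cross-site effective sample size
    print(f"{name}: campus MDD={mdd(sigma, n_campus):.3f}, "
          f"cross-site MDD={mdd(sigma, n_eff):.3f}")
```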





Suicide Prevention Exposure, Awareness and Knowledge Survey - Faculty/Staff Version (SPEAKS-FS). Respondents for the Faculty/Staff version of the Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS) will represent a sample of the faculty/staff population. A sampling plan to obtain 50 faculty/staff respondents in each of the 38 Campus grantee sites, for a total of 1,900 respondents, will be developed by the cross-site evaluation team. Local program staff and/or project evaluators will be responsible for pulling the sample. Oversampling will be required: based on data collected thus far, the mean response rate per campus is 18%. Based on our experience implementing this survey, including feedback from participants as well as firewall accessibility issues, a mixed-mode approach will be used to increase response rates (see B2, Procedures for Data Collection). The faculty/staff sample (sampled with replacement) will be stratified by gender, race/ethnicity, faculty/staff position, and employment status (e.g., part-time, full-time, permanent, temporary).


The minimum detectable difference across two waves of administration of SPEAKS for a set of selected variables, with the desired power (80%) and significance level (5%), is estimated at both the Campus and Cross-Site levels for a sample of 50 faculty and staff per campus per wave and a total of 38 campuses.


For the present power analysis, the null hypothesis is that there is no difference in the mean value of the variable of interest across the 2 waves of administration; the alternative hypothesis is that the two means differ in either direction. At the individual Campus level, the estimates are based on a t-test of the difference in means for two independent samples. At the aggregate cross-site level, however, the correlation among observations from the same campus must be taken into account. The effective sample size is therefore smaller than the total number of respondents in the sample by a factor that depends on the intraclass correlation (ICC) for each variable.


We focus on three variables: Self-rating on Knowledge of Suicide; Perception of Stigma Towards Seeking Mental Health Treatment; and Knowledge of Myths and Facts About Suicide. The three variables are simple mean scores based on six 5-point Likert questions, five 4-point Likert questions, and 28 dichotomous questions, respectively. Estimated mean, standard deviation, and ICC values for the three variables are based on samples of 6,859; 6,838; and 6,631 faculty and staff, respectively, from 53 different campuses.


The table below presents the minimum detectable difference for each variable at both the Campus and Cross-Site levels for the desired power and significance level. In sum, a sample of 50 faculty and staff per campus in each wave, for a total of 38 campuses, has 80% power to detect medium-sized differences at the individual Campus level (less than two thirds of the standard deviation) and relatively small differences at the Cross-site level (less than a fifth of the standard deviation) at a 5% significance level.



Table 7: Minimum Detectable Difference with 80% Power at 5% Significance Level

Variable | Mean | Std. Dev. | ICC | MDD, Campus level (n = 50 per wave) | MDD, Cross-site level (N = 1,900 per wave)
Self-rating on Knowledge of Suicide | 3.11 | 0.96 | 0.056 | 0.541 | 0.168
Perception of Stigma Towards Seeking Mental Health Treatment | 2.04 | 0.51 | 0.022 | 0.291 | 0.067
Knowledge of Myths and Facts About Suicide | 0.76 | 0.10 | 0.045 | 0.057 | 0.016




Key Informant Interviews for Campus Case Studies. Key informants will be identified by the local program staff or project evaluator three weeks prior to a visit by the case study team. The key informants identified will represent seven key roles on each campus: (1) Administrator, (2) Counseling Staff, (3) Coalition Member – Faculty, (4) Prevention Staff, (5) Case Finder, (6) Campus Police, and (7) Student Leader. No more than three respondents in each category will be interviewed for each of the campus grantees for a total of up to 14 respondents per site. We estimate that this number of respondents will be sufficient to ensure saturation of themes in the content analysis of results from the qualitative interviews.


Focus Groups for Campus Case Studies. Focus groups will be conducted during the first on-site visit. Local program staff and evaluators will be responsible for recruiting focus group participants across respondent types, contacting faculty, staff, and students until 9 participants have been recruited for each group. This recruitment supports the full slate of focus groups described in Section B2, with 9 people in each: groups of this size allow a broad range of opinions to be voiced while remaining small enough that everyone has an opportunity to speak. Student participants will be informed of the financial incentive for participation in the groups.


Data abstraction activities for the Early Identification, Referral and Follow Up Analyses (EIRF), Early Identification, Referral and Follow-up Aggregate Screening Form (EIRF-S), MIS data abstraction and Training Exit Survey Cover Page (TES-CP) are utilizing existing data maintained in grantee reporting systems. Statistical methods are not applied for these data abstraction activities.


2. Procedures for Collection of Information


Prevention Strategies Inventory (PSI). Respondents for the Prevention Strategies Inventory will be project evaluators and/or program staff. Each of the 48 State/Tribal grantees and 38 Campus grantees will be required to complete the inventory. The Baseline version of this web-based inventory will be implemented following the first two quarters in Year 1 of the grantee’s funding period and thereafter quarterly over the duration of the grant period. The cross-site evaluation team will provide a web-based platform for data entry, will train program staff to complete the inventory, and will monitor completion. Each grantee will be provided via email a unique username and password to log in to the web-based inventory. No individual identifying information will be provided when completing the inventory. Logging in and completing the inventory will imply consent for completion.


Training Exit Survey (TES). All individuals involved in training activities at each of the 48 State/Tribal and 38 Campus grantee sites will be asked to complete the Training Exit Survey. Upon completion of a training activity, local program staff and/or the project evaluator will be responsible for providing the Training Exit Survey to participants for self-administration and immediate return. The survey cover page introduces the survey and explains the consent process. The cross-site evaluation team will train local program staff to administer the Training Exit Survey during a 2-day site visit prior to the start of administration. Consent will be implied by completion and submission of the survey to program and/or evaluation staff. A scannable survey option will be made available; as an alternative, the survey can be administered in a paper-and-pencil format. If scannable surveys are used, local program staff will collect completed surveys and forward them to the cross-site evaluation team. If paper-and-pencil surveys are used, local program staff will be responsible for entering survey data into the web-based data collection system. Participation in the Training Exit Survey will be voluntary, but a survey will be offered to all training participants.


Training Utilization and Penetration Survey (TUP-S). The TUP-S will be administered to a random sample of 2,000 participants per year. When completing the Training Exit Survey, respondents will be asked to complete a separate form indicating their willingness to be contacted by the cross-site evaluation team to participate in the TUP-S and then to return the form to local program staff. Local program staff will forward consent forms to the cross-site evaluation team. Because it will be necessary to facilitate administration of the survey, identifying information for each sampled respondent will be forwarded to the cross-site evaluation team. The cross-site evaluation team will contact each sampled respondent via telephone within two months of the training activity to introduce the study, request participation, and schedule an appointment for administration of the survey. ICF Macro telephone interviewers, trained by the cross-site evaluation team, will be responsible for administering the survey. Each respondent will provide verbal consent prior to administration of the TUP-S.


Training Utilization and Penetration Interview (TUP-I). The TUP-I will be administered to a subset of respondents to the Training Exit Survey from a total of 10 trainings per year. When completing the Training Exit Survey, respondents will be asked to complete a separate form indicating their willingness to be contacted by the cross-site evaluation team to participate in the TUP-I and then to return the form to local program staff. Local program staff will forward consent forms to the cross-site evaluation team. Because it will be necessary to facilitate administration of the interview, identifying information for each key informant will be forwarded to the cross-site evaluation team. The cross-site evaluation team will contact each identified key informant via telephone within two months of the training activity to introduce the study, request participation, and schedule an appointment for administration of the interview. The cross-site evaluation team will be responsible for administering the interview. Each respondent will provide verbal consent prior to administration of the TUP-I. Interviews will be audio recorded, but respondents will not be identified by name.


Referral Network Survey (RNS). For the first administration of the Referral Network Survey (see Attachment E), each of the 48 State/Tribal grantees will identify one network comprising five agencies or organizations. Local program staff will contact the director of each identified agency/organization and request that two appropriate respondents knowledgeable of the suicide prevention referral network be identified. Local program staff will collect contact information (i.e., names, email address, and telephone number) from each potential respondent and forward this information to the cross-site evaluation team. ICF Macro telephone interviewers will administer the Web-based Referral Network Survey by telephone. During the interview, respondents will be asked to nominate one other agency for inclusion in the network. The networks for the second and third administration of the Referral Network Survey will therefore expand to 10 agencies or organizations. The same data collection procedures will be used for the second and third administrations.


Suicide Prevention Exposure, Awareness and Knowledge Survey - Student Version (SPEAKS-S). The SPEAKS will be administered to students in each of the 38 campus grantees, during each year of the three-year grant period. Local program staff and/or project evaluators will be responsible for identifying the list of respondents. The cross-site evaluation team will develop the sampling plan and local program staff will be responsible for identifying the sampling frame and pulling the sample. Once the sample has been pulled, local program staff will forward contact information (i.e., email addresses and postal addresses) to the cross-site evaluation team for administration of the SPEAKS.


Based on our experience implementing this survey, including feedback from participants as well as firewall accessibility issues, a mixed-mode approach will be used to increase response rates (see Attachments G.3 to G.7 for all notification materials). This approach will address previous problems with coverage of solicitations and overall nonresponse. A presurvey e-mail will be sent to all faculty/staff and students in the samples explaining the importance of the survey and their campuses’ involvement in the GLS Suicide Prevention Program. An introductory letter will be sent to students in the samples requesting participation in the survey (see Attachment G.5). This letter will contain directions for logging into the Web site to complete the survey and a password for accessing the survey. The letter will contain a modest financial incentive ($1–$2) for all students in the sample. A follow-up e-mail will be sent 1 week later, to serve as both a thank-you note and a reminder note, as per the Dillman (2000) method. This contact will contain both the login information and the password; 1 week after that, a final e-mail reminder will be sent to those who have not completed the Web survey.


To promote and legitimize the Web-based survey hosted by the cross-site evaluation team, Campus project staff will implement a formalized recruitment system to encourage survey participation. Campus project staff will be responsible for testing sample e-mails and mailing information provided to the cross-site team. Unusable e-mail or postal addresses (e.g., bounce-backs, expired accounts, returned letters) must be replaced through random sampling protocols. The cross-site evaluation team will work with Campus project staff to ensure IRB approvals, provide access to samples of faculty/staff and student populations, and address implementation issues (e.g., reconcile firewall issues, reconcile bad e-mail addresses). Campus project staff will be responsible for securing respondent lists, working with campus IT to ensure firewall e-mail accessibility, securing IRB approvals as necessary, and developing a formalized survey recruitment system.


Suicide Prevention Exposure, Awareness and Knowledge Survey - Faculty/Staff Version (SPEAKS-FS). The SPEAKS-faculty/staff version will be administered to faculty or staff in each of the 38 Campus grantees. Local program staff and/or project evaluators will be responsible for identifying the list of respondents. The cross-site evaluation team will develop the sampling plan and local program staff will be responsible for identifying the sampling frame and pulling the sample. Once the sample has been pulled, local program staff will forward contact information (i.e., email addresses) to the cross-site evaluation team for administration of the SPEAKS.


Based on our experience implementing this survey, including feedback from participants as well as firewall accessibility issues, a mixed-mode approach will be used to increase response rates (see Attachments G.8 to G.11 for all notification materials). This approach will address previous problems with coverage of solicitations and overall nonresponse. A presurvey e-mail will be sent to all faculty/staff and students in the samples explaining the importance of the survey and their campuses’ involvement in the GLS Suicide Prevention Program. An introductory letter will be sent to faculty/staff in the samples requesting participation in the survey. This letter will contain directions for logging into the Web site to complete the survey and a password for accessing the survey. A follow-up e-mail will be sent 1 week later, to serve as both a thank-you note and a reminder note, as per the Dillman (2000) method. This contact will contain both the login information and the password; 1 week after that, a final e-mail reminder will be sent to those who have not completed the Web survey.


To promote and legitimize the Web-based survey hosted by the cross-site evaluation team, Campus project staff will implement a formalized recruitment system to encourage survey participation. Campus project staff will be responsible for testing sample e-mails and mailing information provided to the cross-site team. Unusable e-mail or postal addresses (e.g., bounce-backs, expired accounts, returned letters) must be replaced through random sampling protocols. The cross-site evaluation team will work with Campus project staff to ensure IRB approvals, provide access to samples of faculty/staff and student populations, and address implementation issues (e.g., reconcile firewall issues, reconcile bad e-mail addresses). Campus project staff will be responsible for securing respondent lists, working with campus IT to ensure firewall e-mail accessibility, securing IRB approvals as necessary, and developing a formalized survey recruitment system.


Campus Infrastructure Interviews (CIFI). Local evaluators will be responsible for identifying a list of appropriate respondents for each Campus Infrastructure Interview version and forwarding the appropriate contact information to the cross-site evaluation team for administration. The local program staff will be responsible for obtaining the necessary releases of information or consents-to-contact. Because it will be necessary to facilitate administration of the interview, identifying information for each respondent will be forwarded to the cross-site evaluation team. However, no identifying information will be included on the data collection instrument. The cross-site evaluation team will randomly select one respondent from each respondent list and contact the individual via telephone to introduce the study, request participation and to schedule an appointment for administration of the interview. Each respondent prior to administration of the Campus Infrastructure Interviews will provide verbal consent. The cross-site evaluation team will be responsible for administering the interview and will be trained by the cross-site evaluation project director or deputy project director in qualitative interviewing. Interviews will be audio recorded but respondents will not be identified by name.

Enhanced Evaluation.


Focus Groups. There are two focus group guide versions, one for students and one for faculty/staff (see Attachments I.1 and I.2). Six of the following seven student focus groups will be conducted on each campus: (1) first-year students, (2) athletes, (3) international students, (4) Lesbian, Gay, Bisexual, and Transgender (LGBT) students, (5) Greek life students, (6) graduate students, and (7) residential advisors/peer educators. The case study team will also hold two focus groups with faculty and one with staff members. Prior to each focus group, every respondent will provide written consent (see Attachments I.3 and I.4). Local program staff and evaluators will be responsible for identifying up to 9 participants per focus group and for scheduling the focus groups. Two case study team members will facilitate each focus group. Focus groups will be audio recorded, but respondents will not be identified by name.


Key Informant Interviews. There are seven versions of the qualitative Key Informant Interview: (1) Administrator, (2) Counseling Staff, (3) Coalition Member – Faculty, (4) Prevention Staff, (5) Case Finder, (6) Campus Police, and (7) Student Leader (see Attachments J.1 to J.7). Local program staff will be responsible for identifying appropriate respondents for each Key Informant Interview version and scheduling the interviews to occur during a site visit by Campus Case Study evaluation staff. Prior to administration of the Key Informant Interviews, each respondent will provide written consent (see Attachment J.8). The case study team, which has been trained in qualitative interviewing, will be responsible for administering the interviews. Interviews will be audio recorded, but respondents will not be identified by name, and no identifying information will be included on the data collection instrument.


3. Methods to Maximize Response Rates



Participation in the cross-site evaluation is a requirement of the GLS Suicide Prevention Program; completion of the Prevention Strategies Inventory by program staff is therefore required. Nonetheless, the cross-site evaluation team has taken a number of steps to minimize the burden on local programs and to ensure timely completion, including developing a Web-based data collection system and providing training and technical assistance to each grantee.


The cross-site evaluation team will also provide technical assistance and training to all grantee sites to maximize response rates for the other data collection activities. This will be done by providing Webcast trainings, distributing data collection procedures manuals, conducting on-site training visits for the State/Tribal grantees, and providing ongoing one-on-one contact with each grantee through a technical assistance liaison.


Based on our experience implementing this survey, including feedback from participants as well as firewall accessibility issues, a mixed-mode approach will be used to increase response rates. This approach will address previous problems with solicitation coverage and overall nonresponse. A presurvey e-mail will be sent to all faculty/staff and students in the samples explaining the importance of the survey and their campuses' involvement in the GLS Suicide Prevention Program. An introductory letter will then be sent to students in the samples requesting participation in the survey; this letter will contain directions for logging into the Web site to complete the survey, a password for accessing the survey, and a modest financial incentive ($1–$2) for all students in the sample. A follow-up e-mail will be sent 1 week later to serve as both a thank-you and a reminder, per the Dillman (2000) method; this contact will contain both the login information and the password. One week after that, a final e-mail reminder will be sent to those who have not completed the Web survey.


Methods that will be used to maximize response rates for the qualitative interviews (i.e., the Training Utilization and Preservation Interviews and the Campus Infrastructure Interviews) include obtaining buy-in from key program stakeholders, providing flexibility in scheduling, and following up with nonresponders by telephone and e-mail. In addition, local program staff will be used to obtain contact information for respondents, which will yield more accurate information and thus increase response rates. If any identified respondents for the qualitative interviews are nonresponsive, the cross-site evaluation team will request that local program staff identify replacement respondents.


4. Tests of the Procedures




The GLS Suicide Prevention and Early Intervention Program is the first federally funded program to support suicide prevention programs in States, tribal communities, and campuses. Drawing on three years of cross-site evaluation data collection experience and on feedback from grantees, we have made improvements to the administration protocols and content of the cross-site evaluation data collection instruments.


As new measures were developed, standard instrument development procedures were used, including review of the literature, item development, and content review by experts in the field. All instruments underwent cognitive testing, pilot testing, and/or expert review; these procedures were used to enhance question accuracy and determine administration times. In addition, Web-enabled instruments will undergo usability testing prior to fielding. Usability testing refers to pilot testing of the Web-based interface for administering questionnaires to determine the most efficient and understandable presentation; typically it is completed with a prototype, and modifications are made before final fielding.


First, a thorough review of the literature related to suicide prevention training activities and to suicide awareness and knowledge was conducted to inform development of the Training Exit Survey, the Training Utilization and Penetration Key Informant Interviews, and the SPEAKS. In addition, experts in mental health referral networks were consulted in developing the Referral Network Survey, and representatives from universities not involved in GLS Suicide Prevention Programs were consulted in developing the SPEAKS and the Campus Infrastructure Interviews. Second, drafts of the instruments were developed and reviewed by cross-site evaluation team members, representatives from SAMHSA, and content experts in the field of suicide prevention. Third, the revised instruments underwent cognitive and/or pilot testing with no more than 9 respondents of the type appropriate for each instrument, to enhance question accuracy and determine administration time.


ICF Macro will conduct a pilot intercept survey administration with student populations on two campuses during Year 1 of the contract. This approach will assess one potential method for increasing student response rates: student populations are often overwhelmed with e-mail survey solicitations, resulting in low response rates, and intercept surveys will give the cross-site team access to a broad sample of students on the pilot campuses. Lessons learned from the pilot administration will be used to inform the expansion of this approach to all campuses.


The cross-site evaluation team will work with Campus staff on the two pilot campuses to identify intercept areas and times. The intercept locations will be identified by Campus project staff and, where required, selected using cluster sampling procedures. These locations will have high volumes of students at varying times throughout the school week and will serve a variety of campus populations (e.g., undergraduate, graduate, and commuter students). Once Campus staff have identified potential intercept locations and times, cross-site staff will work with Campus project staff to finalize a rigorous, systematic intercept plan to be implemented at each campus.
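One common way to make an intercept plan systematic is to approach every k-th passerby at a location after a random start within the first interval; the sketch below illustrates that logic. The interval, starting point, and names are illustrative assumptions, not the finalized plan, which will be developed jointly with Campus project staff.

    from typing import Iterable, Iterator

    # Illustrative sketch of a systematic intercept rule: approach every
    # k-th passerby after a random start. Parameters are assumptions.

    def systematic_intercepts(passersby: Iterable[str], k: int,
                              start: int) -> Iterator[str]:
        """Yield every k-th person, beginning at index start (0 <= start < k)."""
        for i, person in enumerate(passersby):
            if i >= start and (i - start) % k == 0:
                yield person

    # Example: intercept every 5th student passing a dining-hall entrance.
    students = [f"student_{i}" for i in range(25)]
    print(list(systematic_intercepts(students, k=5, start=2)))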


Cross-site evaluation team members will conduct the intercept survey using compact notebook computers. These Web-enabled computers will allow students to access the SPEAKS directly, minimizing the data entry, data format conversion, and data validation issues associated with hard-copy administration. Token incentives bearing a suicide prevention public health message (e.g., a stress ball with the suicide prevention hotline number) will be provided to participants.

5. Statistical Consultants


The cross-site evaluator has full responsibility for the development of the overall statistical design, and assumes oversight responsibility for data collection and analysis. Training, technical assistance, and monitoring of data collection will be provided by the cross-site evaluator. The individuals responsible for overseeing data collection and analysis are:


Brigitte Manteuffel, Ph.D.

ORC Macro, Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321-3211


Christine M. Walrath-Greene, Ph.D.

ORC Macro, Inc.

116 John Street, Fl. 8

New York, NY 10038

(212) 941-5555


The following individuals will serve as statistical consultants to this project:


Christine M. Walrath-Greene, Ph.D.

ORC Macro, Inc.

116 John Street, Fl. 8

New York, NY 10038

(212) 941-5555


Robert Stephens, Ph.D.

ORC Macro, Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321-3211


Ye Xu, M.S.

ORC Macro, Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321-3211


David Goldston, Ph.D.

Duke University

Duke Child and Family Study Center

718 Rutherford Street DUMC 3527

Durham, NC 27710

(919) 416-2423


The agency staff person responsible for receiving and approving contract deliverables is:


Richard McKeon, Ph.D.

Prevention Initiatives and Priority Programs Development Branch

Center for Mental Health Services

Substance Abuse and Mental Health Services Administration

1 Choke Cherry Road

Room 6-1105

Rockville, MD 20857

Phone: (240) 276-1873



References


Dillman, D. (2000). Mail and Internet Surveys (2nd ed.). New York, NY: John Wiley & Sons, Inc.


Eggert, L. L., Nicholas, L. J., & Owen, L. M. (1995a). Reconnecting Youth: A Peer Group Approach to Building Life Skills. Bloomington, IN: National Educational Service.


Eggert, L. L., Randell, B. R., Thompson, E. A., & Johnson, C. L. (1997). Washington State Youth Suicide Prevention Program: Report of Activities. Seattle, WA: University of Washington.


Kalafat, J., & Elias, M. (1994). An evaluation of a school-based suicide awareness intervention. Suicide and Life-Threatening Behavior, 24(3), 224-233.


King, K. A., & Smith, J. (2000). Project SOAR: A training program to increase school counselors' knowledge and confidence regarding suicide prevention and intervention. Journal of School Health, 70(10), 402-407.



List of Attachments


Attachment 1 – State/Tribal Project Evaluator Instruments

  • Document A.1 Prevention Strategies Inventory State/Tribal Version

  • Document F.1 Data Elements for the Early Identification and Referral Follow-up Analysis

  • Document F.2 Early Identification and Referral Follow-up Aggregate

  • Document F.3 Data Elements for the Training Exit Survey Cover Page


Attachment 2 – State/Tribal Project Evaluator Supporting Documents

  • Document K – Data Use Agreement


Attachment 3 – State/Tribal Provider Instruments

  • Document B.1 – Training Exit Survey

  • Document C.1 – Training Utilization and Preservation Survey

  • Document D.1 - Training Utilization and Preservation Key Informant Interview Guide

  • Document E – Referral Network Survey


Attachment 4 – State/Tribal Provider Supporting Documents

  • Document C.2 - Training Utilization and Penetration Survey Consent to Contact Form

  • Document D.2 - Training Utilization and Penetration Consent to Contact Form

  • Document D.3 - Training Utilization and Preservation – Verbal Consent


Attachment 5 – Campus Project Evaluator Instruments

  • Document A.2 - Prevention Strategies Inventory Campus Version

  • Document F.4 - Data Elements for the Training Exit Survey Cover Page – Campus

  • Document F.5 – MIS Data Abstraction


Attachment 6 – Campus Student Instruments

  • Document G.1 – Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS) – Student Version

  • Document H.1 - Campus Infrastructure Interview


Attachment 7 – Campus Student Supporting Documents

  • Document G.3 - SPEAKS Student Advance Email

  • Document G.4 – SPEAKS Intro Email

  • Document G.5 – SPEAKS Student Reminder

  • Document G.6 – SPEAKS Student Reminder

  • Document G.7 – SPEAKS Student Final Reminder

  • Document H.2 - Campus Infrastructure Interview (CIFI) Phone Script and Verbal Consent


Attachment 8 – Campus – Staff Documents

  • Document G.2 - Suicide Prevention, Exposure, Awareness and Knowledge Survey (SPEAKS) - FS (Faculty/Staff Version)

  • Document H.1 - Campus Infrastructure Interview


Attachment 9 – Campus – Staff Supporting Documents

  • Document G.8 - SPEAKS FS Advance Email

  • Document G.9 - SPEAKS FS Intro Email

  • Document G.10 - SPEAKS FS Reminder Email

  • Document G.11 - SPEAKS FS Final Reminder Email

  • Document H.2 - Campus Infrastructure Interview (CIFI) Phone Script and Verbal Consent


Attachment 10 – Campus Provider Instruments

  • Document B.2 – Training Exit Survey


Attachment 11 – CCS – Student Instruments

  • Document I.1 - Focus Group Moderator’s Guide - (Student Version)

  • Document J.2 - Campus Case Study Interview – (Student Version)

  • Document J.4 - Campus Case Study Interview – (Case Finder Version)


Attachment 12 – CCS – Student Supporting Documents

  • Document I.3 - Campus Case Study Focus Group – Student Consent Form

  • Document J.8 - Campus Case Study Key Informant Interview Consent Form


Attachment 13 – CCS Staff Instruments

  • Document I.2 – Focus Group Moderator’s Guide (Faculty/Staff Version)

  • Document J.1 – Campus Case Study Interview (Faculty Version)

  • Document J.3 - Campus Case Study Interview (Prevention Staff Version)

  • Document J.5 - Campus Case Study Interview (Campus Police Version)

  • Document J.6 - Campus Case Study Interview CC (Counseling Center Version)

  • Document J.7 - Campus Case Study Interview (Administrator Version)


Attachment 14 – CCS – Staff Supporting Documents

  • Document I.4 - Campus Case Study Focus Group – Faculty/Staff Consent Form

  • Document J.8 - Campus Case Study Key Informant Interview Consent Form



