OMB: 0930-0286


Cross-Site Evaluation of the Garrett Lee Smith Memorial Suicide

Prevention and Early Intervention Program

Supporting Statement


A. Justification



The Prevention Initiatives and Priority Programs Development Branch of the Center for Mental Health Services (CMHS) within the Substance Abuse and Mental Health Services Administration (SAMHSA) is requesting clearance for data collection associated with the cross-site evaluation of the Garrett Lee Smith (GLS) Memorial Youth Suicide Early Intervention and Prevention Program (“State/Tribal Suicide Prevention Program”) and the GLS Campus Suicide Prevention Program (“Campus Suicide Prevention Program”). The Garrett Lee Smith Memorial Act (GLSMA), passed by Congress in October 2004, was the first legislation to provide funding specifically for State/Tribal and Campus suicide prevention programs. The legislation sets aside funding for states, tribes, and institutions of higher learning to develop, evaluate, and improve early intervention and suicide prevention programs, and mandates that the effectiveness of those programs be evaluated and reported (see Attachment A).

SAMHSA awarded funds under the GLSMA to 36 State/Tribal Suicide Prevention Programs and 55 Campus Suicide Prevention Programs. The cross-site evaluation of the GLS Suicide Prevention Program was designed to evaluate the effectiveness of suicide prevention activities across multiple sites and to report those findings to Congress. While the desired long-term outcome of suicide prevention activities is a reduction in suicide attempts and deaths by suicide, there are potential intermediary variables that must be adequately and robustly evaluated before suicidal behavior itself can be evaluated. Complex conceptual models that include intermediary pathways of effect, such as those that underpin suicide prevention programs, must be evaluated using a staged framework that allows for the assessment of process, mediating, and long-term outcomes. For example, many suicide prevention programs currently do not have information on whether youth identified as at risk are able to access treatment – an intermediate variable that requires investigation. Furthermore, the data management infrastructure across states and tribes has not reached the consistency and sophistication that would allow for cross-state/tribe tracking and aggregation of suicide attempts and deaths by suicide. For example, states/tribes differ in how they classify suicide attempts and deaths by suicide, which could make aggregation and interpretation of these statistics misleading. The cross-site evaluation, through components designed to capture process, proximal, and intermediate outcomes, as well as information regarding the current status of existing data systems, will supply critical information to the field that will ultimately lead to rigorous collection and interpretation of the long-term outcomes of suicide prevention efforts.


More specifically, to date there have been few systematic studies of these mediating variables, and without the results of such an evaluation, the interpretation of suicidal behavior outcomes (whether positive or negative) will remain impossible. For example, the causal chain upon which early identification gatekeeper training activities are based includes the early identification of youth at risk, their referral to services, their subsequent connection with those services, their receipt of services, the amelioration of their at-risk circumstances, and hence an ultimate reduction in suicide attempts and related deaths. In this scenario one must first understand the impact of the gatekeeper training on referrals to services and subsequent connection to services, for without positive outcomes in these intermediate areas, ultimate outcomes associated with suicidal behavior are unrealistic.


As such, the cross-site evaluation includes four stages of information gathering that will serve as the first comprehensive and systematic evaluation of the crucial mediating (proximal) outcomes of suicide prevention efforts, such as awareness, knowledge, referrals, and service access. These four stages target the funded program activity areas, will be conducted with all grantees (i.e., 36 State/Tribal grantees, 55 Campus grantees, and three enhanced evaluation grantees), and include: (1) Context Stage, (2) Product Stage, (3) Process Stage, and (4) Impact Stage. Because different programmatic approaches are funded in the State/Tribal sites and the Campus sites, data collection activities have been tailored to the programmatic activities funded. In addition to assessing the effectiveness of the GLS Suicide Prevention Program, information collected through the cross-site evaluation will be used to report on SAMHSA’s National Outcome Measures (NOMs) that are relevant to program activities, as well as on the Government Performance and Results Act (GPRA) measures that are identified for this program.


The cross-site evaluation includes 16 data collection instruments, spanning the four evaluation stages and the enhanced evaluation, for which clearance is being requested.


The table below summarizes the data collection instruments included in this clearance request.


Type of Grantee

Data Collection Instrument

State/Tribal Grantees

  1. Existing Database Inventory (EDI) – Appendix A.1

  2. Product and Services Inventory (PSI) – Baseline Version – Appendix B.1

  3. Product and Services Inventory (PSI) – Follow-up Version – Appendix B.2

  4. Training Exit Survey – Appendix C

  5. Training Utilization and Penetration (TUP) Key Informant Interview – Appendix D.1

  6. Referral Network Survey – Appendix E.1

Campus Grantees

  1. Existing Database Inventory (EDI) – Appendix A.2

  2. Product and Services Inventory (PSI) – Baseline Version – Appendix B.3

  3. Product and Services Inventory (PSI) – Follow-up Version – Appendix B.4

  4. Suicide Prevention Exposure, Awareness and Knowledge (SPEAKS) – Student Version – Appendix F.1

  5. Suicide Prevention Exposure, Awareness and Knowledge (SPEAKS) – Faculty/Staff Version – Appendix G

  6. Campus Infrastructure Interview – Administrator Version – Appendix H.1

  7. Campus Infrastructure Interview – Counseling Center Staff Version – Appendix H.2

  8. Campus Infrastructure Interview – Faculty Version – Appendix H.3

  9. Campus Infrastructure Interview – Student Group Leader Version – Appendix H.4


Enhanced Evaluation Grantee

  1. TLC 6-month Follow-up Survey – Appendix I.1



1. Circumstances of Information Collection



a. Background



Youth suicide is an enormous public health problem: it takes the lives of approximately 4,000 adolescents and young adults every year, causes pain and suffering for those left in the aftermath, and can result in feelings of guilt and shame for the friends and family members of those who die (National Adolescent Health Information Center [NAHIC], 2004). Although adolescent males, in comparison with adolescent females, die more frequently from suicide, adolescent females are more likely than adolescent males to attempt suicide (NAHIC, 2004). Of all youth populations, American Indian/Alaska Native males have the highest suicide rates (Anderson & Smith, 2003). Despite these prevalence data, the scope of this problem is not entirely known because of the manner in which cause of death is recorded on death certificates and because of the ambiguity of homicides and accidental deaths in which the person attempting suicide intentionally places himself or herself in harm’s way (U.S. Public Health Service, 1999).

Youth suicide can be linked to a number of mental health disorders as well as substance abuse. In 2003, the President’s New Freedom Commission on Mental Health recognized youth suicide prevention as a major priority. This was due to the high rates of youth suicide, rates that included large numbers of individuals who had been diagnosed with mental illness and/or substance abuse disorders (Institute of Medicine, 2002). Adolescence is a time of rapid maturation and increasing responsibility, which can leave many youth with a feeling of hopelessness about the future. This applies particularly to college students and young adults between the ages of 20 and 24, the ages at which the highest youth suicide rates are observed (NAHIC, 2004). In a study by the American College Health Association (as cited in the GLSMA, Public Law 108-355), 61 percent of college students reported feeling hopeless, 45 percent reported feeling so depressed they could barely function, and 9 percent reported feeling suicidal.

Despite these high prevalence rates, youth suicide remains a public health problem that has gone largely unaddressed. This is unfortunate because suicide is preventable. Up to 80 percent of teens who attempt suicide display warning signs that, if acted upon, could prevent attempts (National Mental Health Association, 2005). These may include indirect or direct suicide threats, an obsession with death, or giving away belongings. At the same time, because of the negative social norms that surround mental health and suicide, youth often do not disclose their underlying emotional state or behavioral intentions. Consequently, it is extremely important to recognize these signs when they are exhibited, because the failure to do so may represent a missed opportunity for suicide prevention and intervention.

Suicide warning signs are less likely to occur, however, if protective factors are first recognized and taken into consideration. Various studies have shown that the proportion and interaction of risk and protective factors contribute to the potential for suicide to occur (Moscicki, 1997). Youth who exhibit risk factors, such as depression, impulsivity, alcohol and substance abuse, and a history of trauma or abuse, are believed to have a greater potential for suicidal behavior (Beautrais, 2000). Examples of protective factors include problem-solving skills, effective clinical care, strong connections to family and community support, and restricted access to lethal methods for attempting suicide. Research into this issue has generated goals and strategies for reducing the occurrence and subsequent burden of youth suicide, which build on the foundation of reducing risk factors while increasing protective factors (U.S. Public Health Service, 2001).

However, suicide does not occur simply because of an inadequate blending of these factors, nor will a universal solution result from a proper combination of specific risk and protective factors. As emphasized in the reports discussed below, it will take involvement from mental health, substance abuse, juvenile justice, primary care, education, the media, and other youth-serving organizations to successfully prevent youth suicide. Three documents – Reducing Suicide: A National Imperative (Institute of Medicine, 2002), The Surgeon General’s Call to Action to Prevent Suicide (U.S. Department of Health and Human Services [DHHS], Public Health Service, 1999), and National Strategy for Suicide Prevention: Goals and Objectives for Action (U.S. DHHS, Public Health Service, 2001) – all provide overlapping recommendations for how this problem can be effectively addressed.

The Institute of Medicine’s Reducing Suicide: A National Imperative (2002) highlighted the prevalence of suicide attempts and suicidal behaviors and emphasized the need for research to understand how to prevent suicide, while noting the challenges associated with such research. The Surgeon General’s Call to Action to Prevent Suicide (U.S. Public Health Service, 1999) highlighted the need for increased public awareness of the problem of youth suicide, interventions to enhance treatments, services, and programs, and a methodology to advance the science of suicide prevention—better known as AIM: awareness, intervention, and methodology. AIM is the foundation for the 15 key recommendations highlighted in the Surgeon General’s report. As a result of collaboration among the Federal government, many private and public stakeholders, and family members of persons who committed suicide, the AIM framework became the catalyst for a more thorough and comprehensive strategy—the National Strategy for Suicide Prevention: Goals and Objectives for Action (U.S. Public Health Service, 2001).


On October 21, 2004, Congress passed the Garrett Lee Smith Memorial Act (GLSMA), which was signed into law by President Bush, to mobilize efforts to support suicide prevention and early intervention. The act authorizes the use of $82 million over 3 years to support States, Tribal communities, and colleges and universities in developing and implementing suicide prevention initiatives. Congress authorized an additional $27 million in FY 2006 to provide further funding for States, Tribal communities, and colleges across the country. The act builds strongly on The Surgeon General’s Call to Action to Prevent Suicide (U.S. Public Health Service, 1999) and the National Strategy for Suicide Prevention (U.S. Public Health Service, 2001) in its directive to use the scientifically proven methodologies identified in each of these reports to target the youth and young adults who have historically had the highest suicide rates. Products of this effort, which encapsulate recommendations from each of these reports, include the GLS State/Tribal Youth Suicide Prevention and Early Intervention Program as well as the GLS Campus Suicide Prevention and Early Intervention Program. Objectives of these two programs range from providing early intervention and assessment for youth at risk for mental or emotional disorders, to conducting information and awareness campaigns to inform gatekeepers, family members, peers, and others about the risk factors associated with youth suicide, to training physicians, educators, and providers to identify youth who exhibit at-risk behavior. This legislation not only provides support for implementing these strategies, but also directs these programs to evaluate the effectiveness of the targeted interventions they provide at the local level, and requires a cross-site evaluation and report to Congress.



b. The Need for Evaluation



Section 520E(g) of the GLSMA mandates a cross-site evaluation of the effectiveness of the activities carried out under the State/Tribal Youth Suicide Early Intervention and Prevention Program. The GLSMA specifies that a report to Congress must be submitted:


“to analyze the effectiveness and efficacy of the activities conducted with grants, collaborations and consultations under [Section 520E].”


In addition, Section 520E-2(f) of the GLSMA mandates a cross-site evaluation of the Campus Suicide Prevention Program. The GLSMA specifies that a report must be submitted to Congress that includes:


“an evaluation of the grant program outcomes, including a summary of activities carried out with the grant and the results achieved through those activities,” including “recommendations on how to improve access to mental and behavioral health services at institutions of higher education, including efforts to reduce the incidence of suicide and substance abuse.”


The cross-site evaluation will serve as a primary mechanism through which the initiative will be understood, improved, and sustained. As described previously, there is a dire need in the field for a better understanding of the impact of suicide prevention efforts – first and foremost on the intermediate outcomes of these efforts and the existing data system infrastructures, and then ultimately on suicidal behavior itself. Because this suicide prevention initiative is the first to be federally funded, the rigor and utility of the evaluation and its findings are particularly critical. The emphasis of the cross-site evaluation is therefore on gathering the needed intermediate outcome and data system infrastructure information across grantees, so that in future years of the GLS initiative cross-site evaluation efforts can move strategically forward on scientific ground to assess the impact of funded efforts on suicidal behavior. As such, the GLS cross-site evaluation will collect and analyze comprehensive data that focus on the context within which these programs are implemented; the products and services that are developed and utilized; the process through which programmatic activities are implemented; and the impacts associated with those activities.


A government contractor (referred to as the cross-site evaluator throughout this document) coordinates data collection for the cross-site evaluation and provides support for its local-level implementation. Each grantee is required by the cooperative agreement to both conduct a self-evaluation and to participate in the cross-site evaluation. In this partnership between the cross-site evaluator and the local evaluators, the cross-site evaluator provides training and technical assistance regarding data collection and research design for the cross-site evaluation. In addition, the cross-site evaluator directly collects data, receives data from grantee data collection efforts, monitors data quality, and provides feedback to grantees. The data collection procedures, while systematically applied across funded sites, are specific to the local programmatic activities and infrastructure supporting those activities. The data gathered through the cross-site evaluation will be utilized for both grantee-specific and national assessments of the program.


c. Clearance Request



This submission requests OMB clearance for 16 data collection instruments, which are part of the four-stage cross-site evaluation and the enhanced evaluation. The cross-site evaluation stages, in their entirety, are designed to answer the following overarching questions:


  • What types of prevention/intervention programs, services and products are used with youth determined to be at risk for suicidal behavior?

  • What is the reach of program services, products, and strategies?

  • To what extent do collaboration and integration influence referral mechanisms and service use?

  • What is the impact of program services, products, and strategies on knowledge, process, and behavior?


The cross-site evaluation stages and related data collection instruments are described below.


Context Stage. The purpose of the context stage is to provide information on grantees’ existing data sources and the availability of data elements to support the cross-site evaluation. This stage will include a contextual review of existing institutional data sources and data elements (e.g., management information system [MIS] data), the availability and accessibility of existing data, and additional data collection instruments/interview protocols to support the product, process, and impact stages of the cross-site evaluation. One representative from each State/Tribal grantee and from each Campus grantee will complete a Web-based Existing Database Inventory (Appendices A.1 and A.2) in years 2 and 3 of the cross-site evaluation. Collectively, and over time, the information learned through the context stage will be used to inform requests for data extractions to support other components of the cross-site evaluation, SAMHSA National Outcome Measures (NOMs), and Government Performance and Results Act (GPRA) reporting.


Product Stage. The purpose of the product stage is to describe the development and utilization of products and services at each State/Tribal and Campus grantee site. These products and services may include awareness campaign products and materials; risk identification training materials and workshops; and enhanced services, including early intervention, family support, and postsuicide intervention services, as well as evidence-based treatments. One representative from each State/Tribal grantee and each Campus grantee will complete the Product and Services Inventory (Appendices B.1, B.2, B.3, and B.4) in the final quarter of the first year of the cross-site evaluation and then quarterly in years 2 and 3. Information related to the products and services developed, in development, or in use will be submitted, along with information on the development stage of those products and services.


Process Stage. The process stage of the cross-site evaluation will assess progress on key activities related to the implementation of each grantee’s suicide prevention plan. Because State/Tribal grantees and Campus grantees were funded to provide different suicide prevention activities, data collection activities for the process stage differ depending on the type of grantee. The majority of the 36 State/Tribal grantees are expected to include training activities as part of their suicide prevention programs. To assess the experiences of individuals who participate in training activities, a Training Exit Survey (Appendix C) will be administered to all training participants immediately following the conclusion of the final training session. Information related to training content, intended use, and satisfaction with the training will be collected. To further assess the utilization and penetration of the knowledge, skills, and/or techniques learned through suicide prevention program training, the Training Utilization and Penetration Key Informant Interview (Appendix D.1) will be conducted with a subset of trainees from each State/Tribal grantee site. To describe the referral networks at each State/Tribal grantee site and to assess whether these networks develop over time, the Referral Network Survey (Appendix E.1) will be conducted in years 2 and 3 of the program for State/Tribal grantees funded in October 2005 and in years 1, 2, and 3 for grantees funded in June and October 2006.


To further assess progress on key suicide prevention activities among State/Tribal grantees, the cross-site evaluation will analyze existing suicide prevention program information, which is tracked locally. The Early Identification, Referral and Follow-up (EIRF) Analysis will analyze existing program information that tracks the number of youth identified as at risk as a result of early identification activities, the youth who are referred for services, and the youth who present for services. The type of information to be tracked includes basic demographic information, types of service referrals, and types of services received, including mental health assessments, mental health treatment, emergency services, and nontraditional support services. This information is tracked locally as part of suicide prevention program activities and will be shared with the cross-site evaluation team for analysis to determine the impact of suicide prevention program activities. This is a key component of the cross-site evaluation because it tracks critical prevention program activities; however, because it utilizes existing data, there is no data collection instrument and no respondent burden. Therefore, we are not requesting OMB clearance for the EIRF but have included it in this statement for background purposes.


To assess progress on key activities related to the Campus suicide prevention programs, the Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS) (Appendices F.1 and G) will be administered to students and faculty/staff affiliated with grantees funded in October 2005 in years 2 and 3 and to students and faculty/staff affiliated with grantees funded in October 2006 in years 1, 2, and 3. Finally, the Campus Infrastructure Interviews (Appendices H.1, H.2, H.3, and H.4) will be conducted with four respondent types to describe the campus suicide prevention infrastructure and to assess progress on developing and/or enhancing that infrastructure. As such, there are four versions of the Campus Infrastructure Interviews.


To further assess progress on key Campus suicide prevention activities across all grantee sites, the cross-site evaluation will analyze Campus training activity and participant information, which is tracked by each Campus as part of its suicide prevention program and will be shared with the cross-site evaluation. The cross-site evaluation will utilize existing program information to analyze the number of participants (e.g., students, faculty, counseling center staff) trained in suicide prevention and the number of participants in educational seminars or workshops on suicide prevention. The information requested will be aggregate and will include the number of participants as well as key demographic characteristic distributions. This information will be analyzed and summarized by the cross-site evaluation team. This is a key component of the Campus cross-site evaluation because it tracks participants in a key Campus prevention program activity; however, because it utilizes existing data, there is no data collection instrument and no respondent burden. Therefore, we are not requesting OMB clearance for the training participant tracking and reporting component but have included it in this statement for background purposes.


Impact Stage. The purpose of the impact stage is to assess the impact that the suicide prevention programs have on youth who are at risk for suicide. As previously mentioned, existing data sources will be used to assess the impact of program activities at the State/Tribal grantee and Campus grantee levels. To assess the impact of State/Tribal program activities, existing information on youth referred for services and service receipt as a result of early identification activities will be analyzed. To assess the impact of Campus program activities, existing administrative data related to the number of students who are at risk for suicide, the school retention rate, the number who seek services, and the type of services received, including emergency service use, will be analyzed to determine the impact of Campus program activities on the student and campus populations. We will learn through the Existing Database Inventory (see above) the availability of student-level outcomes and request those existing data for analyses. For the purposes of GPRA requirements, aggregate student retention rates and emergency service use rates will be reported using existing administrative information submitted to the cross-site evaluation team (see Section 1.d below for more information). Because this information is obtained through existing sources, data collection instruments were not developed as part of the cross-site evaluation and no identifiable respondents exist; therefore, OMB clearance for this evaluation component is not being requested.


Enhanced Evaluation. Through an interagency agreement between SAMHSA and the CDC, the enhanced evaluation provides funds for additional evaluation activities in three of the State/Tribal grantee sites funded in October 2005 to enhance what is learned about youth served in funded suicide prevention programs, with a focus on longer-term outcomes related to suicidal behavior. The three State/Tribal grantee sites selected for enhanced evaluation activities are: (1) Native American Rehabilitation Association (NARA), Northwest, (2) Tennessee Lives Count (TLC), and (3) the Maine Youth Suicide Prevention Program.


Although the enhanced evaluation for NARA, Northwest is funded as part of the cross-site evaluation of the GLS Suicide Prevention Program, clearance from the Office of Management and Budget (OMB) for its data collection activities will be requested separately from the cross-site evaluation. This will allow us to more clearly define the evaluation design and the justification for data collection in the tribal communities involved in the NARA, Northwest enhanced evaluation. The Maine enhanced evaluation will examine the outcomes of referrals and the impact of a targeted community intervention on behavioral risks for suicide, using existing data sources. Primary sources of data include existing State- and county-level surveillance data and program tracking information. Because these data sources already exist, OMB clearance for these activities is not being requested.


Clearance for one data collection instrument related to the enhanced evaluation of Tennessee Lives Count (TLC) is being requested. The purpose of enhanced evaluation activities in Tennessee Lives Count is to further assess the impact of its suicide prevention program. Existing data from a pre- and post-training assessment and from a statewide survey will be utilized, along with the TLC 6-month Follow-up Survey (Appendix I.1), which will be administered to a random sample of training participants 6 months after they receive training. The information collected through the enhanced evaluation will measure the direct and measurable impact of program activities on proximal outcomes, such as the knowledge, skills, and attitudes of professionals working with at-risk youth in a variety of settings, and on distal, community-level outcomes, such as the number of children referred for services and long-term changes in skills and attitudes.

d. Addressing National Outcome Measures (NOMs) and GPRA Reporting


The cross-site evaluation was designed in part to support SAMHSA’s performance measurement and management efforts. In assessing the effectiveness of each State/Tribal and Campus suicide prevention program, the cross-site evaluation will evaluate the GLS Suicide Prevention and Early Intervention Program as a whole. This is a critical step toward assessing the ability of the program to achieve many of the goals implied by GPRA indicators and SAMHSA National Outcome Measures (NOMs). The cross-site evaluation design reflects SAMHSA’s intention to implement performance management and accountability in all programs.


The cross-site evaluation design addresses the three-tiered SAMHSA NOMs and GPRA measurement approach by incorporating relevant client-level, training-related, and infrastructure development outcome measures. To date, the SAMHSA client-level NOM domains have been developed to address outcomes related to mental health and substance abuse treatment programs and substance abuse prevention programs. Because the GLS Suicide Prevention Program focuses on suicide prevention, rather than treatment and/or substance abuse, not all client-level measures included in the existing 10-domain client-level NOM framework are appropriate for suicide prevention. Specifically, the majority of funding across both the State/Tribal and Campus programs is dedicated to the early identification and referral of youth at risk for suicide and to enhancing awareness related to suicide; currently no funds are devoted to the provision of treatment. As a result, data collection activities and resources, as well as monitoring of program focus, should be appropriately focused on the activities being funded and their related outcomes. Furthermore, while many of the treatment NOM domains are considered potential distal outcomes for those youth or university/college students who are identified as at risk, referred to services, and receive treatment (e.g., decreased mental health symptomatology, abstinence from drug and alcohol use), the reporting of this type of information requires, among other things, (1) the receipt of mental health treatment, which GLS Suicide Prevention funds are not currently supporting; (2) the tracking of individuals to request self-reported information, which the GLS suicide prevention grantees are not resourced to accomplish; and/or (3) access to existing treatment MIS systems, which the GLS suicide prevention grantees typically do not have given their strategic plans and partnership structure.


To that end, client-level measures that are viable for GLS suicide prevention program activities have been abstracted from the existing 10-domain structure, and appropriate training and infrastructure NOMs have been proposed. Jointly reporting on these NOMs will provide a comprehensive performance measurement and management approach that represents the breadth of GLS program activities and their reach. A summary of the client-, training-, and infrastructure-level indicators that will be used to facilitate NOMs/GPRA reporting for the GLS Suicide Prevention Program is provided below and in Table 1.


Client-level NOMs: As detailed above, several of the client-level NOM domains are considered inappropriate for the GLS Suicide Prevention and Early Intervention Program. Specifically, domains related to decreased symptomatology, increased stability in housing, decrease in juvenile justice involvement, retention in substance use treatment, and abstinence from alcohol are considered unviable for the reasons described in the previous section. Several client-level domains, however, are relevant for GLS suicide prevention programs because they specify outcomes related to the early identification and referral of youth – specifically, access to mental health services, increased social supports, use of evidence-based programs/practices, and retention in education for university/college students. Early identification activities are a key component of GLS suicide prevention programs and focus on the use of evidence-based practices/approaches [NOM: use of evidence-based practice] to identify youth or university/college students at risk for suicide and to connect those individuals to appropriate mental health or emergency services [NOM: access to service] and support services [NOM: social supports and connectedness]. In addition, because Campus suicide prevention activities are being implemented with university/college students, the NOM related to education retention will be reported for the Campus program. Data from the cross-site evaluation will be used to facilitate reporting on these client-level NOMs.


Training-related Proposed Domains: Because the GLS Suicide Prevention Program focuses on prevention rather than treatment, a large portion of grant funds, particularly in the State/Tribal sites, is being dedicated to gatekeeper training and early identification activities. Appropriate training-level measures are therefore critically important for consistent performance measurement and management of the GLS State/Tribal program. Specifically, access to training, satisfaction with the training experience, increased knowledge as a result of training, and intended use of the acquired skills are incorporated into the cross-site evaluation design for the State/Tribal program activities.


Infrastructure Proposed Domains: Across the GLS Suicide Prevention Programs (i.e., the State/Tribal and Campus programs), prevention activities are being collectively implemented in an effort to build and strengthen suicide prevention infrastructures (i.e., at the State level and the Campus level). These activities include public information campaigns, education campaigns, gatekeeper trainings, product development, and coalition building. In an effort to facilitate consistent performance measurement and management of infrastructure development and change, the National Strategy for Suicide Prevention objectives have been used as a framework for selecting relevant infrastructure indicators. Specifically, promoting awareness, the provision and implementation of suicide prevention activities across sectors (e.g., justice, education, clergy, child welfare), and improving and expanding suicide attempt and completion surveillance are being used as proposed infrastructure domains.


Table 1 provides a crosswalk of the proposed GPRA indicators for the GLS Suicide Prevention Program and details the cross-site evaluation State/Tribal and Campus data source for each proposed indicator.


Table 1

SAMHSA National Outcome Measure Crosswalk with the Cross-site Evaluation of the GLS Suicide Prevention and Early Intervention Program


CLIENT-LEVEL OUTCOMES

NOMs DOMAIN

NOMs OUTCOME

CROSS-SITE EVALUATION

STATE/TRIBAL DATA SOURCE

CROSS-SITE EVALUATION

CAMPUS DATA SOURCE

Access/Capacity

Increased Access to Services (Service Capacity)

Information obtained through the Early Identification, Referral and Follow-up (EIRF) analysis will provide a measure of service accessibility for the State/Tribal suicide prevention programs and a measure of emergency service use. The EIRF process will identify the number of youth who are identified as at risk for suicide through program activities, the number who are referred for services, and the number who receive services and the type of services received. This will provide a measure of service capacity among State/Tribal suicide prevention programs.

The context stage of the evaluation, through the Existing Database Inventory, will be used to identify existing sources of information that can be obtained from campuses to facilitate reporting on access to services and service capacity on campuses involved in early identification activities. The cross-site evaluation will identify existing data elements of interest and request that campuses share those existing data with the cross-site evaluation for analyses. This will include a measure of emergency service use among campus student populations.

Social Connectedness 

Increased Social Supports/Social Connectedness


Information will be obtained through the Early Identification, Referral and Follow-up (EIRF) analysis. The EIRF process will identify the number of youth who are identified as at risk for suicide and who are referred for social supports. In addition, the PSI will catalogue the products and services that were developed and disseminated as part of both the State/Tribal and Campus suicide prevention programs, which may include social support services. The information obtained through the inventories can be used to assess the availability of increased social supports for youth and college students identified as at risk for suicide and for their families.

Given the relatively lower level of program funding provided to the Campus sites, social supports and social connectedness are not the focus of Campus Program activities.

Use of Evidence-based Practice

Use of Evidence-based practices

Use of evidence-based practices to treat youth who are at risk for suicide is an important aspect of the GLS Suicide Prevention Program. To measure the availability and use of evidence-based practices, the Product and Services Inventory catalogues the evidence-based practices that are being utilized in the State/Tribal programs. This will provide a measure of evidence-based practice use among GLS Suicide Prevention programs.

Use of evidence-based practices to treat students who are at risk for suicide is an important aspect of the GLS Suicide Prevention Program. To measure the availability and use of evidence-based practices, the Product and Services Inventory catalogues the evidence-based practices that are being utilized in the Campus programs. This will provide a measure of evidence-based practice use among GLS Suicide Prevention programs.

Education Retention

Student Retention Rate

Not applicable: State/Tribal funds do not specifically target at-risk school-based populations, but rather statewide youth in a variety of community and organizational settings.

The context stage of the evaluation, through the Existing Database Inventory, will be used to identify the source of information for student retention. Using the results from the Existing Database Inventory, Campuses will be required to share aggregate student retention rates with the cross-site evaluation team.

TRAINING RELATED OUTCOMES

Proposed Domain

Proposed Outcome Measure (National Strategy for Suicide Prevention [NSSP] Goal)


CROSS-SITE EVALUATION

STATE/TRIBAL DATA SOURCE

CROSS-SITE EVALUATION

CAMPUS DATA SOURCE

Satisfaction with Training

Satisfaction with training activities

The Training Exit Survey will provide a measure of client satisfaction among gatekeepers and providers trained as part of the State/Tribal suicide prevention programs.


Given the relatively lower level of program funding provided to the Campus sites, trainings are not the primary focus of Campus Program activities.

Implement Training to Identify At Risk Behavior

Increase in the number of gatekeepers in GLS-funded States and Campuses who have received training in identification of and response to suicide risk and behaviors: justice, education, clergy, family members (NSSP Goal 6: Objectives 6.4, 6.5, 6.6, and 6.8)

To measure the number of education staff, justice staff, clergy persons and family members who have received training as part of GLS-funded programs, the Training Exit Survey will document the number trained and the role for each trainee.


To measure the number of gatekeepers in GLS-funded campuses who have received training, campuses will share existing aggregate information on the number of individuals trained in suicide prevention activities and their demographic characteristics with the cross-site evaluation.

INFRASTRUCTURE DEVELOPMENT OUTCOMES

Proposed Domain

Proposed Outcome Measure (National Strategy for Suicide Prevention [NSSP] Goal)

CROSS-SITE EVALUATION

STATE/TRIBAL DATA SOURCE

CROSS-SITE EVALUATION

CAMPUS DATA SOURCE

Promote Awareness

Increase in number of GLS-funded States and Campuses with public information campaigns designed to increase public knowledge of suicide prevention (NSSP Goal 1: Objective 1.1)

To measure the implementation of public information campaigns in GLS-funded States, the Product and Services Inventory will document on a quarterly basis all public information products and services that were implemented as part of each grantee’s suicide prevention program.

To measure the implementation of public information campaigns in GLS-funded Campuses, the Product and Services Inventory will document on a quarterly basis all public information products and services that were implemented as part of each grantee’s suicide prevention program.

Promote Awareness

Increase in the number of GLS-funded States and Campuses that have disseminated suicide prevention information via the World Wide Web (NSSP Goal 1: Objective 1.4).

To measure the extent that the World Wide Web is utilized to disseminate information, the Product and Services Inventory will document on a quarterly basis all public information efforts that involve website development or enhancements for the purposes of disseminating suicide prevention information.

To measure the extent that the World Wide Web is utilized to disseminate information, the Product and Services Inventory will document on a quarterly basis all public information efforts that involve website development or enhancements for the purposes of disseminating suicide prevention information.

Develop and Implement Prevention Programs

Increase in the number of GLS-funded States with comprehensive suicide prevention plans that satisfy all of the following criteria: a) coordinate across government agencies; b) involve the private sector; and c) support plan development, implementation, and evaluation in its communities (NSSP Goal 4: Objective 4.1).

As part of the cross-site evaluation, an annual evaluation progress report will be provided by all grantees to document evaluation progress. Included in this process will be an assessment of whether GLS-funded States have a suicide prevention plan that satisfies all three criteria described in the National Strategy.

Not relevant to Campus grantees

Increase in the number of schools (public or private) in GLS-funded States with evidence-based programs designed to prevent suicide (NSSP Goal 4: Objective 4.2).

To measure the extent that evidence-based programs are being implemented in schools, the Training Exit Survey will document the evidence-based programs that are being implemented as part of GLS-funded State/Tribal programs, and in what capacity.

Not relevant to Campus grantees

Increase in the number of GLS-funded colleges and universities with evidence-based programs designed to prevent suicide (NSSP Goal 4: Objective 4.3).

The Product and Services Inventory documents on a quarterly basis the programs that have been implemented as part of the GLS suicide prevention program, whether these programs are evidence-based, and whether these programs are implemented in colleges or universities.

The Product and Services Inventory documents on a quarterly basis the programs that have been implemented as part of the GLS suicide prevention program, and whether these programs are evidence-based.

Increase in the number of juvenile justice-related agencies and organizations in GLS-funded States with evidence-based suicide prevention programs (NSSP Goal 4: Objective 4.5).

To measure the extent that evidence-based programs are being implemented in juvenile-justice related settings, the Training Exit Survey will document the evidence-based programs that are being implemented as part of GLS-funded programs, and in what capacity. This includes juvenile probation offices, correction facilities, detention centers, law enforcement, etc.

Not relevant to Campus grantees

Increase in the number of family, youth and community service providers and organizations in GLS-funded States and Campuses with evidence-based suicide prevention programs (NSSP Goal 4: Objective 4.7).

To measure the extent that evidence-based programs are being implemented in agencies and organizations serving families and youth, the Training Exit Survey will document the evidence-based programs that are being implemented as part of GLS-funded programs, and in what capacity. This includes child welfare offices, family service offices, community-based organizations, etc.

The Product and Services Inventory documents on a quarterly basis the programs that have been implemented as part of the GLS suicide prevention program, whether these programs are evidence-based, and whether these programs are implemented in family, youth or community service systems.

Improve and Expand Surveillance Systems

Increase in the number of GLS-funded States that produce annual reports on suicide and suicide attempts, integrating data from multiple State data management systems (NSSP Goal 11: Objective 11.5).

As part of the cross-site evaluation, an annual evaluation progress report will be provided by all grantees to document evaluation progress. Included in this process will be an assessment of whether GLS-related program data are integrated from multiple data management systems and whether these data are utilized in annual reports.

As part of the cross-site evaluation, an annual evaluation progress report will be provided by all grantees to document evaluation progress. Included in this process will be an assessment of whether GLS-related program data are integrated from multiple data management systems and whether these data are utilized in annual reports.


The GLS Suicide Prevention and Early Intervention Program evaluation approach, the process through which it was developed, and the training and technical assistance that will be provided to grantees each align fully with utilization-focused federal program accountability requirements (i.e., PART, GPRA, and NOMs). Therefore, a recommendation has been made that SAMHSA submit the cross-site evaluation package to the Office of Management and Budget.


2. Purposes and Use of the Information Collection



Specifically, information gathered through the four stages of the cross-site evaluation of the GLS Suicide Prevention and Early Intervention Programs will describe, for State/Tribal grantees, (1) the context in which suicide prevention activities are being implemented, (2) the products and services funded through the program, (3) the training experiences of individuals who receive training as part of the suicide prevention programs, (4) the utilization and penetration of the skills, knowledge, and techniques learned through suicide prevention training programs, and (5) the referral networks in place to support youth identified as at risk for suicide. In addition, the enhanced evaluation will assess the impact of suicide prevention activities on youth served by NARA, Northwest by collecting pre- and post-program information directly from youth. The enhanced evaluation for Tennessee Lives Count will assess the medium-term impacts of training activities on the utilization of trained skills and on the number of youth who are identified and referred for services.


Despite the extensive knowledge that research has provided regarding suicide risk and protective factors, little is known about how to integrate these factors and understand how they work in concert to evoke suicidal behavior or to prevent it (Institute of Medicine, 2002). Specifically, even though gatekeeper training is a common suicide prevention activity, little information is available about the extent to which gatekeeper training actually supports prevention of and intervention with high-risk youth. Data describing trainees’ perceptions of their enhanced awareness of suicide risk factors, and of how to recognize and appropriately respond to those risk factors as a result of training activities, are limited. Similarly, data describing how the training trainees received increased referrals for mental health services and/or social support will add to the existing knowledge base about the effectiveness of suicide prevention programs. In addition, little information exists about the referral networks that support youth identified as at risk within communities sponsoring suicide prevention programs. Data describing the extent to which referral networks exist and are being utilized will contribute extensively to the existing knowledge base and assist other States and tribal communities in implementing referral networks.


For Campus grantees, the information gathered through the cross-site evaluation will describe (1) the context in which suicide prevention activities are being implemented, (2) the products and services funded through the program, (3) the suicide prevention exposure, awareness, and knowledge among campus students and faculty/staff at two points in time, and (4) the campus infrastructure in place to support suicide prevention program activities.


Suicide prevention is an important issue for colleges and universities across the country. Existing research shows that college students face enormous pressures and often have difficulty dealing with these stressors (as cited in the GLSMA, Public Law 108-355); however, little is known about whether suicide prevention activities are reaching the students being targeted. Data describing campus students’ and faculty/staff’s exposure to suicide prevention activities and their awareness and knowledge of suicide risk factors will significantly contribute to the existing knowledge base. These data, for example, will inform policymakers and federal representatives in their decision making around appropriations and funding, as well as youth and their families in their everyday efforts to identify and respond to risk. Collectively, and with the information provided through the cross-site evaluation, the effort to prevent suicide can be approached from multiple perspectives, and the utility of the National Strategy goals and activities, built upon through the GLS Suicide Prevention and Early Intervention Program, can be assessed and documented – while simultaneously advancing the field of suicide prevention.


In totality, the data collected as part of the cross-site evaluation will be useful to SAMHSA and its partners, other Federal agencies, the State/Tribal grantees, the Campus grantees, legislators, federal administrators, the field of suicide prevention, individual youth and their families, and the communities in which they live. Comprehensive information gathered from multiple sites at various levels and stages of their programmatic activity will tremendously augment the existing knowledge base.


In addition, and of equal importance, SAMHSA will use the results from the cross-site evaluation to develop policies and provide information to other States, Tribal communities, and campuses regarding the development and implementation of suicide prevention programs, as well as to develop and refine future funding priorities for the GLS Suicide Prevention Program or similar programs. Finally, information from the cross-site evaluation may also help other SAMHSA programs, such as the Linking Adolescents at Risk to Mental Health Services grantees, to develop and implement suicide prevention activities, design comprehensive data collection efforts to monitor those activities, and report to local and federal stakeholders. If these data are not collected, policymakers and program planners at the Federal and local levels will not have the information necessary to determine the extent to which suicide prevention activities are effective and having an impact on youth at risk for suicide. Without this evaluation, Federal and local officials will not know whether the suicide prevention programs implemented under the GLSMA had an impact on preventing suicide and identifying at-risk youth, and whether GLS grantee programs are meeting the goals of the GLSMA.


The stage-specific utility and contribution of the cross-site data collection to SAMHSA’s mission and decision making are described below:


Context Stage. Specifically, the cross-site evaluation team and SAMHSA will use information collected through the context stage to assess the availability of existing data sources to report on program activities and to support GPRA reporting. Assessing the availability of existing data will also support analyses conducted as part of the impact stage of the cross-site evaluation.


Product Stage. Specifically, SAMHSA will use information gained through the cross-site evaluation to describe the products and services that were developed and/or utilized as part of suicide prevention programs. Information collected as part of the product stage will inform other States and Tribal communities, as well as campuses, across the country as to what products and services support suicide prevention.


Process Stage. As part of the process stage, specific findings related to training activities will inform SAMHSA and other States and tribal communities about the types of training activities being implemented via these funded suicide prevention programs, who is being trained, the intended and actual utilization and impact of those trainings, and overall satisfaction with training experiences. This information will assist other States and tribal communities in implementing training activities as part of their suicide prevention programs. In addition, for funded State/Tribal grantees, information collected as part of the Training Exit Survey will inform any necessary training modifications and/or enhancements, and follow-up training information will help indicate the extent to which training activities are having an impact on youth in the community. Also as part of the process stage, specific findings related to referral networks will inform SAMHSA and State/Tribal suicide prevention efforts across the country by describing the organizations involved in referral networks, the types of relationships that exist, the extent to which grant funding enhanced the development of referral networks, and the extent to which these networks are being used to support high-risk youth. For funded State/Tribal grantees, information collected during the first administration of the State/Tribal Referral Network Survey will assist grantees in further developing their referral networks in years 2 and 3 of grant funding.


As part of the process stage for Campus programs, specific findings related to student and faculty/staff exposure, awareness, and knowledge of suicide prevention activities will assist other campuses across the country in assessing the potential impact of suicide prevention activities on their campuses. For funded campuses, information collected through the awareness and knowledge surveys will assist in local planning and implementation of awareness campaigns and activities in the out years of grant funding. Data collected through the campus infrastructure interviews will inform SAMHSA and other campuses across the country about what is involved in building a campus suicide prevention infrastructure, responding to crises, and what has been effective. Information collected through the infrastructure interviews will also assist funded campus grantees in identifying necessary modifications and improvements to their existing infrastructures.


Overall, data collected through the cross-site evaluation will inform policy decisions, the continued improvement of funded State/Tribal and Campus suicide prevention programs, and suicide prevention efforts for other States, tribal communities and campuses across the country. SAMHSA will also use data collected as part of the cross-site evaluation to provide objective measures of its progress toward meeting targets of key performance indicators put forward in its annual performance plans as required by law under the GPRA.


Enhanced Evaluation. The objectives of the enhanced evaluation focus on the public health impact of suicide prevention programs. Information related to the public health impact of statewide or regionwide suicide prevention programs has not previously been available on this scale. These data, assessing the impact and effectiveness of suicide prevention activities on youth identified at risk and on community members trained in suicide awareness, will add significantly to the field of suicide prevention and to the evidence base for gatekeeper training.


The enhanced evaluation of the Tennessee Lives Count program will provide critical information on the impact of suicide prevention programs on youth identified at risk for suicide and the impact of training activities on knowledge, skills, and attitudes toward suicide prevention. The enhanced evaluation plan builds on previous evaluation activities in Tennessee by measuring program impact on distal community-level outcomes to assess what linkages exist between planned program activities (primarily QPR gatekeeper training), proximal outcomes, and distal, community-level indicators. Information on distal, community-level indicators has not been available on this scale, and these data will contribute greatly to the evidence base around gatekeeper training models. Specifically, the Tennessee Lives Count enhanced evaluation will provide information on the long-term impact gatekeeper training has had on knowledge and attitudes, identification of youth at risk for suicide, and referral of youth at risk for suicide. This information, along with information collected through the cross-site evaluation, will present a valuable profile of the impact of gatekeeper training on preventing suicide.


3. Use of Improved Information Technology

Every effort was made to use technology to limit the burden on individual respondents participating in the cross-site evaluation. A web-based data collection and management system will be used to facilitate data collection by program staff, program participants, key stakeholders, students, and Campus faculty/staff. The web-based data collection and management system will serve two functions: (1) as a data entry tool for program staff and cross-site evaluation staff to enter cross-site evaluation information or data elements, and (2) as a data collection tool for administering web-based surveys to respondents. All cross-site evaluation data, whether obtained through direct entry by program and/or evaluation staff or through web-based surveys, will be stored in the web-based data collection and management system. The web-based data collection and management system reduces evaluation burden for the grantees and allows ease of access to data for program personnel and cross-site evaluation team members.


The web-based system is a secure system that maintains confidentiality by providing five different levels of password-protected access to site-specific and aggregate data. All data collected will be stored in a central data repository that will allow for the analysis and summary of information within and across surveys. The five distinct user security levels, sketched illustratively after the list below, include:

The Cross-site Administrator will have access to site-specific data from all grantee sites stored in the data collection and management system, and will have access to aggregate reports available on the system using this privilege level.


The Site Administrator will have access to site-specific data from the data collection and management system, and will have access to site-specific and aggregate reports available on the system. They will also be able to view the number of instruments that have been completed and submitted. One individual per community will be designated the Site Administrator.


The Site User will have the capability to access information available on the system, but will be restricted from accessing datasets.


The Contact User will have access to aggregate information available on the repository. The Contact User will not have rights to download datasets, nor to access information specific to a grant-funded community.


The Data Contributor (data collectors and survey respondents) will have the capability to enter data into the web-based system, but will have no other privileges.
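For illustration only, these five levels can be thought of as a role-to-permission mapping enforced at login. The sketch below is hypothetical; the role and permission names are ours and do not describe the actual system's implementation.

```python
# Hypothetical sketch of the five access levels as role-to-permission sets.
PERMISSIONS = {
    "cross_site_administrator": {"view_all_site_data", "view_aggregate_reports"},
    "site_administrator": {"view_own_site_data", "view_site_reports",
                           "view_aggregate_reports", "view_submission_counts"},
    "site_user": {"view_system_information"},        # no dataset access
    "contact_user": {"view_aggregate_information"},  # no downloads, no site-specific data
    "data_contributor": {"enter_data"},              # data entry only
}

def authorize(role: str, action: str) -> bool:
    """Return True only if the role's permission set includes the action."""
    return action in PERMISSIONS.get(role, set())

assert authorize("site_administrator", "view_site_reports")
assert not authorize("data_contributor", "view_aggregate_reports")
```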


The cross-site evaluation team will provide training and technical assistance to support grantees in implementing the cross-site evaluation and in using data at the site level. Program personnel will be trained to utilize the data collection and management system and will be provided with a user’s manual.


Enhanced Evaluation. The Tennessee Lives Count 6-month follow-up survey will be web-based, administered via Survey Monkey. Survey Monkey data are stored on servers at SunGard. Servers are under continuous surveillance and kept in locked cages that require passcards and biometric recognition for entry. The network has multiple independent connections to Tier 1 Internet access providers, fully redundant OC-48 SONET rings, and firewall restrictions. Hardware is protected by redundant internal power supplies and physical controls for temperature, humidity, and smoke/fire detection. Data are backed up hourly and nightly to both centralized and offsite backup systems in the event of catastrophe. Use of Survey Monkey for data collection reduces burden for the local evaluation team and provides a streamlined completion process by incorporating appropriate skip patterns.

4. Efforts to Identify Duplication



In developing the data collection activities for the cross-site evaluation, the cross-site evaluation team conducted a literature review to avoid duplicating data collection activities or gathering similar information. Specifically, existing research studies and the efforts of other federal initiatives designed to evaluate suicide or suicide prevention were reviewed.


a. Existing Research


Many in the field of suicide prevention agree that there is a lack of information on the causes of suicide and even less information on how to prevent suicide (SPAN USA, Inc., 2001; Institute of Medicine, 2002; U.S. Public Health Service, 2001). Studies of suicide prevention activities have provided important information, but for the most part have been conducted with specific populations under particular circumstances and are not generalizable to other populations (Institute of Medicine, 2002). Similarly, the lack of longitudinal and prospective studies has been a barrier to understanding and preventing suicide (Institute of Medicine, 2002). Acknowledging the dearth of information on the effectiveness of suicide prevention programs, the Institute of Medicine's 2002 report, Reducing Suicide: A National Imperative, provides several recommendations for increasing research on suicide. The report recommends that federal funding be provided for the development, testing, and expansion of suicide prevention interventions, and for longitudinal studies that focus on the medium- to long-term impacts of suicide prevention activities, such as the impact on risk and protective factors and on treatment and prevention. Specifically, the report recommends exploring the impact of suicide prevention programs through large, nationally coordinated efforts.


Although there have been evaluations examining the effectiveness of specific suicide prevention activities, such as gatekeeper trainings, suicide screening programs, and skills trainings, these studies have focused on specific populations, mostly school-based, and have not assessed the impact of programs across multiple sites or across time (Eggert et al., 1997; King & Smith, 2000; Eggert, Nicholas & Owen, 1995). For example, an evaluation of the Lifelines School-Based Adolescent Suicide Prevention Program found increases in knowledge and help-seeking behaviors (Kalafat & Elias, 1994), but was specific to youth in schools. The cross-site evaluation will assess suicide prevention approaches across multiple sites targeting diverse youth groups to determine the impact of suicide prevention activities and the extent to which funded activities meet the goals and objectives of the GLSMA. Cross-site evaluation data will also be used to assess performance across time in these diverse settings, in an effort to improve and enhance suicide prevention programs for current and future grantees.


The existing knowledge base focuses on short-term impacts, and little is known about the medium- to long-term impacts of suicide prevention programs across broader and more diverse populations, or about any direct impact on youth being referred for services. No evaluations have been conducted to examine the impact of suicide prevention programs across multiple sites, with diverse populations, involving diverse child-serving agencies (e.g., mental health, juvenile justice, foster care), and to examine the impact on receipt of services. The cross-site evaluation of the GLS Suicide Prevention Program will be the first opportunity to collect information from multiple sites implementing suicide prevention activities in order to assess the effectiveness of those activities and their impact on youth at risk for suicide. The information learned from previous research on suicide prevention activities was crucial in designing the cross-site evaluation, but the cross-site evaluation does not include data collection activities that duplicate information collected in previous studies.

Enhanced Evaluation. The Tennessee Lives Count enhanced evaluation expands the research base related to gatekeeper training. Existing research on gatekeeper trainings focuses on school-based populations and has not linked knowledge and attitudes among trainees to impacts on youth behavior. The Tennessee Lives Count enhanced evaluation will be the first opportunity to assess the longitudinal impact of gatekeeper training on distal outcomes, such as suicide attempts and completions.


b. Other Federal Efforts

The Centers for Disease Control and Prevention (CDC) is supporting evaluations of evidence-based suicide prevention programs in Maine and Virginia as part of the CDC’s Targeted Injury Prevention Programs. In Maine and Virginia, the CDC is supporting research that documents the efficacy of a community-based cognitive therapy program for preventing suicidal behavior among suicide attempters identified in emergency departments. The focus of the intervention is to help youth develop more adaptive ways of thinking and more functional ways of responding to periods of emotional distress. These CDC evaluations will provide valuable information on the efficacy of interventions for youth displaying suicide risk factors, but the focus of the cross-site evaluation is to evaluate the effectiveness of suicide prevention programs rather than specific interventions.

CDC is also collecting and examining data from hospital emergency departments to assess the prevalence of suicide and suicide attempts. The National Electronic Injury Surveillance System-All Injury Program tracks data on all types and external causes of nonfatal injuries and poisonings treated in U.S. hospital emergency departments. With these data, CDC researchers can generate national estimates of nonfatal injuries, including those related to suicidal behavior. Again, although this effort is significant in providing a broader understanding of suicide, the information gathered through the cross-site evaluation focuses on the effectiveness of suicide prevention programs.

The Substance Abuse and Mental Health Services Administration (SAMHSA) is sponsoring an evaluation of the National Suicide Prevention Lifeline, the national crisis hotline. The purpose of the evaluation is to assess the impact of the national crisis hotline in connecting callers to mental health professionals and to assess participation in the Lifeline network. Although the data collection activities planned as part of this effort will provide valuable information on the effectiveness of this important service for at-risk youth, the scope of that evaluation covers all callers (adult and youth) to the national hotline and is specific to one intervention. The cross-site evaluation will add to the information collected as part of this effort by assessing other suicide prevention strategies (e.g., gatekeeper training, suicide screening activities) and by focusing specifically on youth.


5. Impact on Small Businesses or Other Small Entities



Some of the data for this evaluation will be collected from individuals involved with public agencies, such as mental health, juvenile justice, education, and child welfare agencies, and from colleges and universities. While most data will be collected from public agencies or universities, it is possible that some organizations involved in the referral networks would qualify as small entities. Also, respondents to the Training Exit Survey and the follow-up training qualitative interview, while most likely employed by public agencies, may be employed by small businesses or other small entities. However, these data collection activities will not have a significant impact on these agencies or organizations.





6. Consequences of Collecting the Information Less Frequently



Context Stage. Data for the context stage are collected twice: once in year two of the evaluation and once in year three. The information collected in year two is important for assessing the availability of existing data sources and the availability of data to report on program activities and GPRA measures. The information collected in year three is important for updating the availability of existing data based on any system development that occurred since the first administration. If these data were collected less frequently, the evaluation would not capture the extent to which data systems within grantee sites develop over the course of the grant period.


Product Stage. Data for the product stage will be collected once in year one of the cross-site evaluation and updated at the end of each quarter thereafter in years 2 and 3 to document the development and utilization of products and services. Collecting this information quarterly is necessary to track progress toward meeting suicide prevention goals and to provide information on the development stage of products and services within State/Tribal and Campus programs. The consequence of collecting these data less frequently would be the potential loss of information related to the process of developing and implementing products and services, and of the ability to track progress over time.


Process Stage. For the process stage, data related to training experiences are collected once, at the conclusion of the training experience, for State/Tribal grantees. Follow-up data collection will occur within 2 months for a subset of training participants to collect information on the utilization of the knowledge, skills, and techniques learned through the training. The consequence of not collecting the training experience data at the conclusion of the training would be the absence of cross-site knowledge about the types of trainings being provided with grant funds, the quality of those trainings, and the individuals being trained. The consequence of not conducting the follow-up interviews would be a lack of important information concerning the impact and penetration of the suicide prevention training activities.


Also as part of the process stage for the State/Tribal grantees, the referral network survey will be conducted twice for grantees funded in October 2005 (once in year two and again in year three) and three times for grantees funded in June and October 2006 (once in each year of program funding). Multiple administrations of the referral network survey are important for learning whether the suicide prevention programs have an impact on building referral networks for youth identified at risk for suicide. The consequence of less frequent data collection would be a lack of information for assessing the impact of time on the development of referral networks.


For the Campus grantees, the process stage involves two administrations of the suicide prevention exposure, awareness and knowledge survey to campus students and two administrations to faculty/staff for campuses funded in October 2005, and three administrations of each for campuses funded in October 2006. Data collected cross-sectionally at multiple points in time are necessary to assess any change in awareness and knowledge as a result of suicide prevention activities. If data were collected only once, there would be no ability to assess change over time, which is an important element of the suicide prevention program. Finally, data on campus infrastructure around suicide prevention will be collected from a subset of key informants from each campus one time (either at the end of year two or the beginning of year three). The consequence of not collecting these data would be the absence of understanding of the extent to which the prevention of suicide has permeated the operations and functioning of the campus administration and departments, and the extent to which this permeation supports sustainability of the suicide prevention efforts.


Enhanced Evaluation. For the Tennessee Lives Count enhanced evaluation, data will be collected from participants in the suicide prevention training activities 6 months following their participation to assess the medium-term impact of the training on their knowledge and skills regarding suicide prevention. The Tennessee Lives Count local evaluation intends to collect information from training participants prior to the training activity, which will allow an assessment of the impact of the training activity on individual trainee knowledge and skills. Collecting these data less frequently would not allow for assessment of the medium-term impacts of the training activities.



7. Consistency with the Guidelines of 5 CFR 1320.5(d)(2)



The data collection fully complies with the requirements of 5 CFR 1320.5(d)(2).



8. Consultation Outside the Agency



a. Federal Register Notice


SAMHSA published a notice in the Federal Register (volume 71, page 34147) on June 13, 2006, soliciting public comment on this study. SAMHSA received no comments on the planned data collection. A copy of the notice can be found in Attachment A.J.


b. Consultation Outside the Agency


Consultation on the design, instrumentation, and statistical aspects of the evaluation has occurred with individuals outside of SAMHSA. An evaluation steering committee was established in 2005 to provide input and guidance in designing and implementing the cross-site evaluation. Consultation with the evaluation steering committee began in 2005 and will continue at least quarterly throughout the grant-funding period. Representatives on the steering committee include leaders in the field of suicide prevention and evaluation. In addition, representatives of the Suicide Prevention Resource Center (SPRC), which provides technical assistance to entities implementing suicide prevention programs, were consulted with respect to the design of the cross-site evaluation in 2005. Input from representatives of the Centers for Disease Control and Prevention (CDC) was also solicited in 2005. The CDC has conducted research in the field of suicide prevention and was consulted to comment on the cross-site evaluation design, frequency of data collection activities, and instrumentation.


These consultations had several purposes: (1) to ensure continued coordination of related activities, especially at the Federal level; (2) to ensure the rigor of the evaluation design, the proper implementation of the design, and the technical soundness of study results; (3) to verify the relevance and accessibility of the data to be collected; and (4) to minimize respondent burden.


9. Payment or Gift to Respondents






A lottery incentive structure will be utilized with students responding to the web-based Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS). Remuneration is a standard practice on university campuses, and has proven to increase response rates for college student surveys (Dillman, 2000). In a study examining response rates in the National Survey of College Graduates, incentives provided to an experimental group resulted in an increase in response rates of nearly 11% versus no incentives (Dillman, 2000). In a study examining the impact of a lottery incentive, there was a slight but significant increase in response rates for students entered into a lottery versus a control group offered no incentive (Porter & Whitcomb, 2003).


Remuneration is a standard practice in longitudinal studies as a means of maintaining participation. Recontacting survey respondents for follow-up interviews is difficult given the lapse in time between the original survey and the follow-up interview, and the difficulty is compounded when respondents are not directly affiliated with the programs being evaluated. Therefore, given the hard-to-reach nature of these populations, an incentive will be provided for the two cross-site evaluation data collection activities that involve follow-up interviews. Key informants who consent to participate in the Training Utilization and Penetration (TUP) Key Informant Interview will be provided a $20 incentive. An incentive for these respondents is deemed particularly appropriate because they are gatekeepers not directly affiliated with the suicide prevention program. Respondents to the TLC 6-month follow-up survey will be compensated $10 upon completion of the survey. The TUP is estimated to take 60 minutes and the TLC follow-up interview 20 minutes, which explains the larger incentive for the TUP.


Payment will not be provided to any other respondents as part of the cross-site evaluation.

Respondents to other data collection activities are primarily staff of the suicide prevention programs or close affiliates. Therefore, no remuneration is planned.



10. Assurance of Confidentiality



A web-based data collection and management system was designed to facilitate data entry and management for the cross-site evaluation. Descriptive information will be collected from respondents to cross-site evaluation data collection activities, but no identifying information will be entered into or stored in the web-based data collection and management system. Identifying information will be requested in order to facilitate the TUP Key Informant Interviews, the Referral Network Survey, the Campus Infrastructure Interviews, the SPEAKS-Student and Faculty/Staff Versions, and the TLC 6-month Follow-up Survey. Identifying information will not be stored with survey responses, and the specific procedures used to protect the privacy of respondents are described below for each data collection activity.


The Existing Database Inventories and the Product and Services Inventories. Information to complete the inventories will be directly entered into the web-based system. To access the system, each respondent will be provided a username and password to protect their privacy and no identifying information is requested on the inventories.


Training Exit Survey. Each respondent to the Training Exit Survey will be provided a training participant ID, but no identifying information will be requested on the survey. Responses to the survey will be entered into the web-based system, but no identifying information will be entered. A consent-to-contact form will accompany the Training Exit Survey for respondents interested in being recontacted for administration of the TUP Key Informant Interviews. The consent-to-contact form will include the training participant ID and identifying information necessary for contacting selected respondents for the TUP. However, again, no identifying information will be entered into the web-based data collection and management system and all consent-to-contact forms will be stored separately from Training Exit Survey responses in order to protect the privacy of respondents. For respondents not selected for the TUP Key Informant Interviews, the consent-to-contact forms will be destroyed upon completion of the study component.


TUP Key Informant Interviews. Responses to the TUP Key Informant Interviews will be entered into the web-based system, but no identifying information is requested on the interview. As stated above, identifying information will be collected from interested Training Exit Survey respondents in order to contact key informants, but that information will not be entered into the web-based system, nor will it be linked in any way to Training Exit Survey responses or TUP responses. Contact data and IDs will be kept in a password-protected Microsoft Access tracking database separate from the survey database. Other procedures for assuring the privacy of respondents will include limiting the number of individuals who have access to identifying information, using locked files to store hardcopy forms that include identifying information, assigning unique code numbers to each participant to ensure anonymity, and implementing guidelines pertaining to data submission and dissemination. Data collectors will be extensively trained and will be responsible for entering data into the web-based data collection system.


In addition, identifying information will be requested for distribution of the incentive. This information will be collected at the close of the telephone interview and stored separately from the interview database and its contents. There will be no way to link the contact information to the information provided during the interview in order to ensure the privacy of the respondents.


Campus Infrastructure Interviews. Identifying information will also be obtained for participants in the Campus Infrastructure Interviews in order to contact respondents. However, no identifying information will be entered or stored in the data collection and management system, nor will it be linked to responses. Contact data and IDs will be kept in a password-protected Microsoft Access tracking database separate from the survey database. Other procedures for assuring the privacy of respondents will include limiting the number of individuals who have access to identifying information, using locked files to store hardcopy forms that include identifying information, assigning unique code numbers to each participant to ensure anonymity, and implementing guidelines pertaining to data submission and dissemination. Data collectors will be extensively trained and will be responsible for entering data into the web-based data collection system.


Referral Network Survey. Identifying information for respondents to the Referral Network Survey will be necessary in order to facilitate administration. However, identifying information will be limited to email addresses, agency affiliations, names and telephone numbers in order to contact non-responders, but will not be stored with survey responses. To ensure privacy, no identifying information will be entered in the data collection and management system and therefore no identifying information will be associated with individual responses. Respondents will be assigned a username and password, which will be changed by the respondent upon logging in to the system. Only the web survey programmers will have access to identifying information (i.e., email addresses) in order to administer the survey, but again, identifying information will not be connected to individual responses for analysis or reporting efforts.


SPEAKS-Student and Faculty/Staff Version. Identifying information will be necessary in order to facilitate the administration of the SPEAKS. However, identifying information will be limited to email addresses and campus affiliations and will not be stored with survey responses. Respondents will be assigned a username and password, which will be changed by the respondent upon logging in to the system. To ensure privacy, no identifying information will be entered in the data collection and management system and therefore no identifying information will be associated with individual responses. Only the web survey programmers will have access to identifying information (i.e., email addresses) in order to administer the survey, but again, identifying information will not be connected to individual responses for analysis or reporting efforts.


In addition, because respondents to the Student Version will receive an incentive, students wishing to enter the incentive lottery will provide identifying information for distribution of the incentive. This information will be collected through a web-enabled interface and stored separately from the survey database and its contents. There will be no way to link the student contact information to the information provided on the survey.


Enhanced Evaluation. Identifying information will be necessary in order to facilitate administration of the TLC 6-month follow-up interviews. Contact information will be obtained through a consent-to-contact process, but identifying information will not be stored with survey responses. Participant IDs will be assigned to track respondents across data collection waves. Contact data and IDs will be kept in a password-protected Microsoft Access tracking database separate from the survey database. Data collectors will be extensively trained and will be responsible for entering data and maintaining the tracking database. The database will be organized so that data collectors can enter changes to contact information (including the date of the change) and ensure that original contact information is not lost. Other procedures for ensuring the confidentiality of respondents include limiting the number of individuals who have access to identifying information, using locked files to store hardcopy forms that include identifying information, assigning unique identification numbers to each participant, and implementing safeguards pertaining to data submission (especially password protection).


11. Questions of a Sensitive Nature



No questions of a sensitive nature are included in the data collection instruments.


12. Estimates of Annualized Burden Hours and Costs


Data collection for the cross-site evaluation in each of the State/Tribal and Campus grantee sites funded in FY2006 (i.e., 22 State/Tribal grantees and 21 Campus grantees), and in the enhanced evaluation site, will begin in the final quarter of FY2006 and continue through FY2008, covering a 3-year project period. Data collection is expected to begin in August 2006 and continue through September 2008. Data collection for each of the State/Tribal and Campus grantees funded in FY2007 (i.e., 14 State/Tribal grantees and 34 Campus grantees) will commence once funding is received and local regulatory approvals are in place. The start date for this data collection is expected by the end of the first quarter of FY2007, and data collection will continue through FY2009. This covers a 3-year project period for these grantees, the final year of which will be submitted in an OMB renewal package. Table 1 shows the burden associated with cross-site evaluation and enhanced evaluation data collection activities and the associated costs.


All measures included in Table 1 were developed for the cross-site evaluation of the GLS Suicide Prevention Program and the enhanced evaluation. As such, the cross-site evaluation team piloted each measure with fewer than 10 respondents to determine burden estimates. Costs were calculated based on hourly wage rates for the appropriate wage categories using data from the National Compensation Survey (BLS, 2004) and the American Association of University Professors (AAUP) annual survey of faculty salaries.


Table 1

Annualized Burden Hours and Costs

Type of Respondent | Measure Name | No. of Respondents | No. of Responses/Respondent | Hours/Response | Response Burden* | Wage | Total Cost
Project Evaluator 1 | Existing Database Inventory-State version | 36 | 1 | 0.5 | 18 | $29.40 | $529
Project Evaluator | Existing Database Inventory-Campus version | 55 | 1 | 0.5 | 28 | $29.40 | $809
Project Evaluator | Product and Services Inventory-State version-baseline | 36 | 1 | 0.75 | 27 | $29.40 | $794
Project Evaluator | Product and Services Inventory-State version-follow-up | 36 | 2 | 0.75 | 54 | $29.40 | $1,588
Project Evaluator | Product and Services Inventory-Campus version-baseline | 55 | 1 | 0.75 | 41 | $29.40 | $1,213
Project Evaluator | Product and Services Inventory-Campus version-follow-up | 55 | 1 | 0.75 | 41 | $29.40 | $1,213
Provider (Trainees) 2 | Training Exit Survey | 12,000 | 1 | 0.17 | 2,040 | $18.51 | $37,760
Provider (Trainees) | Training Utilization and Penetration (TUP) Key Informant Interview | 360 | 1 | 0.67 | 241 | $18.51 | $4,465
Provider (Stakeholder) 2 | Referral Network Survey | 1,003 | 1 | 0.67 | 672 | $18.51 | $12,439
Student 3 | Suicide Prevention Exposure, Awareness and Knowledge Survey-Student Version (SPEAKS-S) | 9,600 | 1 | 0.25 | 2,400 | $5.15 | $12,360
Faculty 4 | Suicide Prevention Exposure, Awareness and Knowledge Survey-Faculty/Staff (SPEAKS-FS) | 2,400 | 1 | 0.25 | 600 | $32.94 | $19,764
Key Informant-Student 3 | Campus Infrastructure Interview-Student Leader Version | 18 | 1 | 1 | 18 | $5.15 | $93
Key Informant-Faculty 4 | Campus Infrastructure Interview-Faculty/Staff Version | 37 | 1 | 1 | 37 | $32.94 | $1,219
Key Informant-Administrator 5 | Campus Infrastructure Interview-Administrator Version | 18 | 1 | 1 | 18 | $35.77 | $644
Key Informant-Counselor 6 | Campus Infrastructure Interview-Counseling Center Staff Version | 18 | 1 | 1 | 18 | $28.52 | $513
Provider (Trainees) 2 | Tennessee Lives Count 6-month Interview | 466 | 1 | 0.25 | 117 | $18.51 | $2,166
Total | | 26,193 | | | 6,370 | | $97,567


  1. National Compensation Survey, Bureau of Labor Statistics (BLS) US Dept of Labor, Professional-specialty and technical occupations, July 2004.

  2. National Compensation Survey, Bureau of Labor Statistics (BLS) US Dept of Labor, Social Worker, July 2004.

  3. Based on the federal minimum wage of $5.15.

  4. Based on the 2004-2005 American Association of University Professors' (AAUP) Annual Salary Survey, which found that the annual average salary for professors was $68,505, http://www.aaup.org/.

  5. National Compensation Survey, Bureau of Labor Statistics (BLS) US Dept of Labor, Administrators-education and related fields, July 2004.

  6. National Compensation Survey, Bureau of Labor Statistics (BLS) US Dept of Labor, Counselors- educational and vocational, July 2004.
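Each row of Table 1 follows the same arithmetic: annual burden hours are the product of respondents, responses per respondent, and hours per response, and cost is burden hours times the hourly wage. As a reading aid only, a minimal check of the Training Exit Survey row (all figures taken from the table; row totals are rounded to whole hours and dollars):

```python
# Burden arithmetic for the Training Exit Survey row of Table 1.
respondents = 12_000        # annual trainees completing the survey
responses = 1               # responses per respondent
hours_per_response = 0.17   # piloted burden estimate (about 10 minutes)
wage = 18.51                # hourly wage (BLS Social Worker category, July 2004)

burden_hours = respondents * responses * hours_per_response
total_cost = burden_hours * wage
print(burden_hours, round(total_cost))  # 2040.0 37760
```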



13. Estimates of Annualized Cost Burden to Respondents or Record Keepers



Grantees are collecting the majority of the required data elements as part of their normal suicide prevention program operations. Grantees will maintain this information for their own program planning, quality improvement, and reporting purposes. Therefore, there are no additional capital or start-up costs associated with the cross-site evaluation. There will be some additional burden on record keepers to provide potential respondent lists for data collection activities. However, these operation costs will be minimal.


Other costs related to this effort, such as the cost of shipping completed questionnaires (e.g., the Training Exit Survey) and consent-to-contact forms, are costs to the Federal government, covered by the funding received for participation in the cross-site evaluation. Each grantee has been funded, as part of the overall cooperative agreement award, to support an evaluator and related costs to carry out the requirements of the cross-site evaluation. Therefore, no cost burden is imposed on the grantee by this additional effort.



14. Estimates of Annualized Cost to the Government



CMHS has planned and allocated resources for the management, processing and use of the collected information in a manner that shall enhance its utility to agencies and the public. Including the Federal contribution to local grantee evaluation efforts, the contract with the National Evaluator, and government staff to oversee the evaluation, the annualized cost to the government is estimated at $4,273,443. These costs are described below.


Each grantee is expected to fund an evaluator to conduct the self-evaluation and to satisfy the requirements of the cross-site evaluation. It is estimated that participating in the cross-site evaluation will require 0.20 full-time equivalent (FTE) to collect information, enter it into the web-based data collection and management system, and conduct analyses at the local level. Assuming (1) an average annual salary of $61,000 (BLS, 2004) for the evaluator, (2) a 0.20 FTE commitment at each of the 36 State/Tribal and 55 Campus grantees, and (3) that Campus grantees cost share on a 1:1 basis, the annual cost for the cross-site evaluation at the grantee level is estimated at $774,700. These monies are included in the cooperative agreement awards.
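The $774,700 figure follows directly from the assumptions above. A minimal arithmetic check (a sketch only; the variable names are ours):

```python
# Arithmetic check of the grantee-level evaluation cost estimate.
evaluator_salary = 61_000              # average annual salary (BLS, 2004)
per_grantee = 0.20 * evaluator_salary  # 0.20 FTE -> $12,200 per grantee

state_tribal = 36 * per_grantee        # fully federally funded: $439,200
campus = 55 * per_grantee / 2          # 1:1 cost share halves the federal share: $335,500
print(state_tribal + campus)           # 774700.0
```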


The cross-site evaluation contract has been awarded to ORC Macro for evaluation of the 88 suicide prevention programs. The current cross-site evaluation contract with SAMHSA provides $2,624,971 for a three-year period and covers data collection activities with the 36 sites funded in October 2005. The estimated average annual cost of the contract will be $874,990. Included in these costs are the expenses related to developing and monitoring the cross-site evaluation including, but not limited to, developing the evaluation design; developing the cross-site evaluation instrumentation; developing training and technical assistance resources (i.e., manuals, training materials, etc.); conducting in-person or telephone training and technical assistance; monitoring of grantees; traveling to grantee sites and relevant meetings; and data analysis and dissemination activities. In addition, these funds will support the development of the web-based data collection and management system and fund staff support for data collection.


It is estimated that CMHS will allocate 0.30 of a full-time equivalent each year for government oversight of the evaluation. Assuming an annual salary of $80,000, these government costs will be $24,000 per year. In addition, through the interagency agreement between SAMHSA and the CDC, the CDC will allocate 0.50 of a full-time equivalent each year for government oversight, technical assistance, and monitoring of the enhanced evaluation. Assuming an annual salary of $80,000, these government costs will be $40,000 per year.


15. Change in Burden



This is a new project.


16. Time Schedule, Publication, Analysis Plans



a. Time Schedule



The time schedule for implementing the cross-site evaluation is summarized in Table 2.


Table 2

Time Schedule

Activity | Schedule
Begin data collection for 22 State/Tribal Grantees funded in FY2006 (i.e., 14 funded in October and 8 funded in June) | 1 month after OMB approval
Begin data collection for 21 Campus Grantees funded in FY2006 | 1 month after OMB approval
Begin data collection for 14 State/Tribal Grantees funded in FY2007 | 4 months after OMB approval
Begin data collection for 34 Campus Grantees funded in FY2007 | 4 months after OMB approval
Begin data collection for enhanced evaluation with sites funded in FY2006 | 1 month after OMB approval
Data collection completed for 22 State/Tribal Grantees funded in FY2006 (i.e., 14 funded in October and 8 funded in June) | 26 months after OMB approval
Data collection completed for 21 Campus Grantees funded in FY2006 | 26 months after OMB approval
Data collection completed for enhanced evaluation | 26 months after OMB approval
Data collection completed for 14 State/Tribal Grantees funded in FY2007 | 36 months after OMB approval
Data collection completed for 34 Campus Grantees funded in FY2007 | 36 months after OMB approval
Validate data | Ongoing
Analyze data | Ongoing
Produce interim annual reports | 12, 24, and 36 months after OMB approval
Produce final dissemination report | 26 months after OMB approval



b. Publication Plans



The GLSMA requires annual reports summarizing the results of the cross-site evaluation. The cross-site evaluation team will analyze data collected and prepare interim annual reports to summarize key findings. A final report on the results of the cross-site evaluation is also required by the GLSMA, and will be produced by the cross-site evaluation team no later than 3 years after the grants were received.


Because of the importance of the cross-site evaluation to the field of suicide prevention, in collaboration with SAMHSA and the government project officer, we will publish the results of the cross-site evaluation in relevant professional journals to inform the research community as well as the decision making of policymakers and program administrators. Up to 5 publications are planned, and will most likely be submitted in the final year of the cross-site evaluation. Possible publications include a manuscript providing an overview of the GLS Suicide Prevention Program and the key findings, as well as manuscripts reporting results from the Training Exit Survey and the TUP Key Informant Interviews, the Referral Network Survey, the SPEAKS, and the Campus Infrastructure Interviews. All publications will be submitted to the Government Project Officer (GPO) in draft form for review and approval prior to submission to the selected journal.


Examples of journals that will be considered as vehicles for publication include the following:


  • American Journal of Public Health

  • American Psychologist

  • American Journal of Diseases of Children

  • Child Development

  • Evaluation Review

  • Evaluation Quarterly

  • Journal of the American Academy of Child and Adolescent Psychiatry

  • Journal of Applied Developmental Psychology

  • Journal of Child and Family Studies

  • Journal of Clinical Child and Adolescent Psychology

  • Journal of Consulting and Clinical Psychology

  • Journal of Health and Social Behavior

  • Journal of Mental Health Administration

  • Psychological Reports

  • Social Services Review

  • Suicide and Life Threatening Behavior


Enhanced Evaluation. Tennessee Lives Count (TLC) will obtain approval from SAMHSA and CDC staff prior to any presentation or publication associated with this project. Dissemination plans specifically target three audiences: local (i.e., TLC program staff and stakeholders), statewide, and national. Initial dissemination efforts are targeted to provide useful information to program staff and other key stakeholders that can be used to guide implementation of the TLC program across the state. Local formative evaluation feedback will be used to guide program planning and will include summaries of QPR training findings, analysis of regional and county needs from available data, and reports on progress toward achieving grant goals and objectives.


At the State and National level, TLC plans to widely disseminate project information in order to aid and inform similar suicide prevention efforts, particularly those of statewide scope and/or using the QPR gatekeeper training model. Dissemination activities are planned for both State and National forums, for consumer, lay, and professional audiences. Project dissemination activities will be tailored to the specific audience and will involve all project staff, as well as members of Tennessee’s Suicide Prevention Task Force. Project staff plans to present in professional forums such as the American Association of Suicidology Annual Conference, the Foster Care Association Conference, Tennessee Association for Child Care (TACC) Annual Conference, and TCCY Annual Children’s Advocacy Day. Staff also plans to submit articles to publications such as Advancing Suicide Prevention, the Tennessee Suicide Prevention Network Newsletters, and appropriate professional journals (such as Suicide and Life-Threatening Behavior). Project findings and technical reports will also be made available on www.TSPN.org, the Tennessee Suicide Prevention Network’s website (with permission from SAMHSA and CDC). Project brochures will also be developed to inform the public about project activities and outcomes. One of the key deliverables planned for the project is a replication manual to document the global project plan (including evaluation), successful implementation strategies, common barriers encountered, and program outcomes, to establish a template other states may choose to adopt in part or whole as part of their suicide prevention efforts.


c. Data Analysis Plan



Context Stage. The context stage will provide information concerning the availability of existing data. Analysis of these data will consist of tabulation and use of descriptive statistics to summarize the information collected.


Product Stage. The product stage will provide information concerning the development and utilization of products and services as part of the suicide prevention programs. Descriptive statistics will be used to document the types of programs and services used and to examine the reach of program products and services. Bivariate relationships between product and service variables of interest and program activity characteristics will also be examined. Bivariate analytic techniques will include t tests, analysis of variance, chi-square tests, and correlations.
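For illustration, the bivariate techniques named above could be run as follows; the groupings and simulated values are hypothetical stand-ins, not evaluation data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical example: product-reach scores under two program activity types
group_a = rng.normal(4.1, 0.6, 120)
group_b = rng.normal(3.9, 0.6, 95)

t, p_t = stats.ttest_ind(group_a, group_b)   # two-sample t test
f, p_f = stats.f_oneway(group_a, group_b)    # one-way analysis of variance

# Chi-square test of independence for two categorical variables
table = np.array([[80, 40],
                  [50, 45]])
chi2, p_chi, dof, expected = stats.chi2_contingency(table)

# Correlation between two continuous measures on the same units
paired = 0.5 * group_a + rng.normal(0, 0.4, 120)
r, p_r = stats.pearsonr(group_a, paired)
print(round(t, 2), round(chi2, 2), round(r, 2))
```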


Process Stage.

Training Exit Survey. Descriptive statistics will be used to summarize information on staff trained as part of the suicide prevention programs; utilization of training; perceived impact of training activities; referral networks to support suicide prevention; exposure to, and awareness and knowledge of, suicide prevention activities on campus; and campus suicide prevention infrastructures. The relationships between variables obtained through the training exit survey and program characteristics, respondent characteristics, and program activity characteristics will also be explored. Specifically, program characteristics may include type of training (e.g., QPR, ASIST), geographic region, and type of trainee (e.g., gatekeeper, provider). Bivariate relationships will be explored between these characteristics and the information obtained through the exit survey.


For the training exit survey, descriptive statistics will be employed to examine the distributions of individual items. For single-item measures, examination of the distributions is sufficient. For items that will be part of summative scales, descriptive statistics will be calculated for the scale scores as well, based on the original conceptual groupings. Further, the psychometric properties of the scales will be assessed. First, reliability coefficients will be calculated (i.e., Cronbach's alpha for continuous variables and KR-20 for nominal variables) for the scales based on the original item groupings. In addition, analysis will include examination of the scales' factor structures to assess the extent to which initial conceptual groupings of items are supported by statistical item reduction techniques. To accomplish this, we will utilize exploratory factor analysis (EFA) and other scale development techniques to explore the scale properties. Items will be combined into scales based on EFA results, and these scales will then be tested for internal consistency using Cronbach's alpha or KR-20, as appropriate. Item-total correlations will be calculated to examine the extent to which each item contributes to the total scale score. These results will be compared with results from the same calculations based on the original conceptual groupings of items to determine the best item clustering for scale construction.
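As a sketch of the reliability step only, Cronbach's alpha and corrected item-total correlations can be computed directly from a respondent-by-item score matrix; KR-20 is the same computation applied to items scored 0/1. The data below are simulated for illustration, and the EFA step, which would typically rely on a dedicated package, is not shown.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix;
    equals KR-20 when items are scored 0/1."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

def item_total_correlations(items):
    """Corrected item-total correlations: each item vs. the sum of the rest."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total - items[:, j])[0, 1]
                     for j in range(items.shape[1])])

# Simulated 5-item scale for 300 respondents (1-5 Likert-type scores)
rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 1))
scores = np.clip(np.round(latent + rng.normal(0, 1, (300, 5)) + 3), 1, 5)
print("alpha:", round(cronbach_alpha(scores), 2))
print("item-total r:", np.round(item_total_correlations(scores), 2))
```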


Training Utilization and Penetration (TUP) Key Informant Interviews. The follow-up training interview will provide qualitative data, the interpretation of which can provide an understanding of the results produced by these activities on suicide prevention general knowledge and the impact on youth in the community. Qualitative data will be entered into a qualitative database (e.g., using ATLAS.ti software) to allow for thematic analyses within and across grantee sites.


Referral Network Survey. To analyze the influence of referral mechanisms, social network analysis will be used. Social network analysis will explicate the interactions between youth support agencies and organizations, which organizations interact, the nature of their interactions, and network characteristics (e.g., centrality, clustering of the most highly interacting players, gaps in interactions). Specifically, social network analysis will help determine the relationship between individuals and between organizations within a potentially complex web of referral sources. By examining basic characteristics, such as strong ties, multiple relationships, symmetry of relationships, centrality (e.g., degree, betweenness, closeness), density, and number of components, we can understand the cohesiveness of the referral networks and the mechanisms through which knowledge, resources, and technology are shared within this relationship model. Sociograms will be constructed based on the links among organizations. Sociograms will be examined for cutpoints and bridges to identify the critical connections and nodes in the network (i.e., the connections and nodes that if removed would cause the network to break down).
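A minimal sketch of these network measures using the networkx library; the agencies and referral ties below are invented solely for illustration.

```python
import networkx as nx

# Hypothetical referral network: nodes are agencies, edges are referral ties.
G = nx.Graph()
G.add_edges_from([
    ("mental_health", "school"), ("mental_health", "juvenile_justice"),
    ("school", "crisis_line"), ("crisis_line", "hospital"),
    ("juvenile_justice", "foster_care"), ("mental_health", "hospital"),
])

density = nx.density(G)                      # overall cohesiveness
degree = nx.degree_centrality(G)             # most-connected agencies
between = nx.betweenness_centrality(G)       # brokers between clusters
cutpoints = list(nx.articulation_points(G))  # nodes whose removal fragments the network
bridges = list(nx.bridges(G))                # edges whose removal fragments the network
print(density, cutpoints, bridges)
```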


In addition, provided an existing source of information is available, we will identify the initial status of service use for youth and the referral mechanisms employed when their risk status was determined, and we can link these to the subsequent service use of youth over time (e.g., modality, type, intensity, duration) using growth curve modeling techniques. This will allow us to examine the influence that different referral mechanisms have on change in services over time and to explain how individuals' initial status of service use influences their later change. Furthermore, regression analysis and general linear modeling can be used to analyze change over time in levels of collaboration.


Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS)-Student Version and Faculty/Staff Version. Descriptive statistics will be used to summarize information on students surveyed at each wave of data collection. For single-item measures, examination of the distributions is sufficient. As with the training exit survey, items that will be part of summative scales will also be summarized using descriptive statistics for the scale scores, based on the original conceptual item groupings. The psychometric properties of the scales will then be assessed using Cronbach's alpha for continuous variables and KR-20 for nominal variables. Analysis will also include EFA and other scale development techniques to explore the scale properties. As with the exit survey data, these results will be compared with results from the same calculations based on the original conceptual groupings of items to determine the best item clustering for scale construction. In addition, within each grantee site, differences between the responses of the groups of respondents at each wave will be compared using two-sample t tests and chi-square analyses.


Further, selected dichotomous items will be used as indicators in a latent class analysis (LCA) to create sub-groups of respondents based on their pattern of responses to the items. LCA attempts to categorize different patterns of responses into a small number of mutually exclusive classes of respondents, with each class having a distinct probability of endorsing each item. LCA also offers the opportunity to explore the effects of covariates on class membership as well as the relationship between classes and outcomes.
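In practice a dedicated LCA package would likely be used; purely to sketch the underlying model, the expectation-maximization (EM) estimator below fits class weights and per-class item-endorsement probabilities for a binary-indicator latent class model. All data are simulated, and the function is ours, not part of the evaluation toolkit.

```python
import numpy as np

def lca_em(X, n_classes=2, n_iter=200, seed=0):
    """Minimal EM for a latent class model with binary indicators.

    X: (n_respondents, n_items) array of 0/1 responses.
    Returns class weights, per-class endorsement probabilities,
    and posterior class memberships.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)          # class weights
    theta = rng.uniform(0.25, 0.75, (n_classes, m))   # P(item = 1 | class)
    for _ in range(n_iter):
        # E-step: posterior probability of each class for each respondent
        log_lik = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        log_post = np.log(pi) + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update class weights and endorsement probabilities
        pi = post.mean(axis=0)
        theta = (post.T @ X) / post.sum(axis=0)[:, None]
        theta = np.clip(theta, 1e-6, 1 - 1e-6)
    return pi, theta, post

# Toy usage: 200 simulated respondents answering 5 yes/no items
rng = np.random.default_rng(1)
true_class = rng.integers(0, 2, 200)
probs = np.where(true_class[:, None] == 0, 0.2, 0.8)
X = (rng.random((200, 5)) < probs).astype(int)
pi, theta, post = lca_em(X, n_classes=2)
print("class weights:", np.round(pi, 2))
```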


Campus Infrastructure Key Informant Interviews. As part of the process stage for the Campus programs, qualitative data obtained from key informants on campus will provide an understanding of how the designated resources (i.e., funding, infrastructure, efforts, and technical assistance) are spent to conduct the targeted suicide prevention and intervention activities, as well as the results of these activities on referral mechanisms, support, information, and knowledge of suicide risk factors. Qualitative data will be entered into a qualitative database (e.g., using ATLAS.ti software) to allow for thematic analyses within and across grantee sites. In addition to the descriptive analyses described above, the evaluation team can use multivariate analytical techniques to assess exposure, awareness, and knowledge of suicide risk factors, with data from the suicide prevention exposure, awareness, and knowledge survey.


Enhanced Evaluation. The specific hypothesis of the enhanced evaluation proposed is that program activities will have a direct and measurable (correlational) impact on proximal outcomes, such as knowledge, skills, and attitudes of professionals working with at-risk youth in a variety of settings. Changes in proximal outcomes are then hypothesized to be correlated with changes in distal outcomes at the community level, particularly on suicide attempts and completions. Confirming the presence of hypothesized pathways from program activities to distal outcomes will necessitate the use of a longitudinal quasi-experimental design, leveraging the fact that program activities will be implemented at different times across the state. The use of existing community data sources will provide a useful long-term baseline of pre-TLC suicide indicators for comparison in subsequent longitudinal analyses. Several analytic methods may be used as appropriate for the dataset to be analyzed. For example, hierarchical linear models may be useful to explore county-level effects nested within a region (or even states in coordination with the National Evaluation), correcting for differences across counties in implementation time and adjusting for proximal variables such as training "saturation" within county, CPORT results, and related QPR follow-up survey findings. Structural equation modeling (SEM) may also prove useful, as perhaps the principal weakness of this research area is related to abundant measurement error (and low base rates) in distal outcome measures and the development of latent variables from clusters of proximal and distal outcome indicators may prove important in separating the "signal" of program effects from the "noise" of measurement error. SEM will also allow explicit modeling and testing of hypothesized relationships between program activities, proximal, and distal outcomes.
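As one hedged illustration of the multilevel approach described above, a random-intercept model with counties nested within regions could be fit with statsmodels; all variables and values below are simulated, and "saturation" stands in for the training-saturation covariate named in the text.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated county-level data: an outcome indicator as a function of
# training saturation, with counties nested within regions.
rng = np.random.default_rng(0)
rows = []
for region in range(8):
    region_effect = rng.normal(0, 0.5)
    for _ in range(12):
        saturation = rng.uniform(0, 1)
        outcome = 2.0 - 1.2 * saturation + region_effect + rng.normal(0, 0.4)
        rows.append({"region": region, "saturation": saturation, "outcome": outcome})
df = pd.DataFrame(rows)

# Random intercept for region; fixed effect for training saturation.
model = smf.mixedlm("outcome ~ saturation", df, groups=df["region"])
result = model.fit()
print(result.summary())
```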


Table 3 summarizes the evaluation questions and the associated data sources and analytic approaches.


Table 3

Evaluation Questions, Data Sources, and Analysis Techniques



Context Stage

Evaluation Questions:
  • What are the existing data sources and data elements at each State/Tribal and Campus grantee?
Data Sources:
  • Existing Database Inventory
Data Analysis:
  • No data analysis planned

Product Stage

Evaluation Questions:
  • What products and services are being developed, delivered, and utilized?
  • What audiences/populations are being targeted?
  • Are evidence-based practices being utilized?
  • To what extent do actual products and services parallel initial plans?
Data Sources:
  • Product and Services Inventory
Data Analysis:
  • Descriptive and bivariate analysis

Process Stage

Evaluation Questions:
  • What is the penetration of training activities?
  • Who and how many are trained?
  • What was the training experience?
  • To what extent is training knowledge being retained and utilized?
  • To what extent are agency stakeholders involved and interacting?
  • What is the nature and quality of the interaction?
  • How does collaboration influence referral mechanisms and service use?
  • What are facilitating and barrier factors?
  • What is the overall level of suicide prevention awareness and knowledge among campus staff/faculty and students?
  • How does the suicide prevention infrastructure develop and evolve over time?
Data Sources:
  • Training Exit Survey
  • Existing Data
  • Training Utilization and Penetration Interview
  • Referral Network Survey
  • SPEAKS-Student version
  • SPEAKS-Faculty/Staff version
Data Analysis:
  • Descriptive analysis
  • Bivariate analysis
  • Social network analysis
  • Exploratory factor analysis
  • Multivariate analysis

Impact Stage

Evaluation Questions:
  • What is the impact of program activities?
  • Who and how many at-risk youth are identified, screened, and referred?
  • Who and how many youth are following up on referrals?
  • What are the immediate and long-term outcomes for identified at-risk youth referred for service?
Data Sources:
  • Existing Data
Data Analysis:
  • Descriptive analysis
  • Bivariate analysis
  • Multivariate analysis

Enhanced Evaluation

Evaluation Questions:
  • What is the impact of gatekeeper training on proximal outcomes?
  • Do youth referrals differ by gatekeeper knowledge of suicide risk factors, skills, and attitudes around suicide?
  • What gatekeeper factors contribute to referring youth for services?
  • What impact does gatekeeper training have on distal outcomes, such as suicide attempts and completions?
Data Sources:
  • Existing data (CPORT, pre- and post-training assessments)
  • 6-month follow-up survey
Data Analysis:
  • Descriptive analysis
  • Bivariate analysis
  • Multivariate and multi-level analysis



17. Display of Expiration Date



All data collection instruments will display the expiration date of OMB approval.


18. Exceptions to the Certification Statement



This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.



B. Statistical Methods



1. Respondent Universe and Sampling Methods



Context Stage

Respondents for the Existing Database Inventory as part of the context stage of the cross-site evaluation will be program staff and/or project evaluators. Each of the 36 State/Tribal grantees and 55 Campus grantees will be required to complete the inventory.


Product Stage

Respondents for the Product and Services Inventory as part of the product stage of the cross-site evaluation will be program staff and/or project evaluators. Each of the 36 State/Tribal grantees and 55 Campus grantees will be required to complete the inventory.


Process Stage

Training Exit Survey. Respondents for the Training Exit Survey include all individuals who participate in a training activity sponsored by the State/Tribal suicide prevention programs. The survey will be administered once to each training participant for each training activity; therefore, no statistical methods will be used to identify respondents. Based on a review of program activities included in the funded grant applications, it is estimated that up to 36,000 respondents will receive training across the 36 State/Tribal grantee sites and will be administered the Training Exit Survey (i.e., approximately 12,000 per project year). Because the survey covers the entire trainee population at each grantee site, there is no need to calculate the precision of point estimates for survey responses. Sample sizes will also be sufficient to assess the psychometric properties of the scales developed for this study, both within and across grantee sites.


Training Utilization and Penetration (TUP) Key Informant Interviews. Many of the State/Tribal programs are planning multiple training activities; therefore, to obtain information from key informants who experienced the same training activity, the cross-site evaluation team, in consultation with local program staff, will select one training activity per grantee per year for which to administer the Training Utilization and Penetration (TUP) Key Informant Interviews. Respondents to the Training Exit Survey (see above) will be asked to complete a separate contact consent form indicating their willingness to be contacted to participate in the TUP and to return the form to local program staff. Local program staff will forward the contact consent forms to the cross-site evaluation team. Key informants will be randomly selected from those individuals who consent to be contacted: ten respondents from each of the 36 State/Tribal grantees, for a total of 360 respondents per year. Interviews will be conducted within 2 months of completion of the training activity. Although ten respondents will be selected per grantee, we estimate that five completed interviews per grantee will be sufficient to ensure saturation of themes in the content analysis of the qualitative interviews.


Referral Network Survey. Respondents for the Referral Network Survey will be identified by the local program staff and/or project evaluators based on the organizations involved in the referral network(s) associated with each State/Tribal grantee. Each State/Tribal grantee will identify one network of agencies/organizations, all of the agencies and organizations involved in that network, and two respondents from each agency/organization. The 14 State/Tribal grantees funded in October 2005 will be administered the survey in years 2 and 3 (2 administrations), and the 22 State/Tribal grantees funded in June and October 2006 will be administered the survey in years 1, 2, and 3 (3 administrations). Based on a review of State/Tribal program activities included in the funded grant applications, we estimate that each of the 36 referral networks involves approximately 20 agencies/organizations. Assuming 2 appropriate respondents per agency/organization and an 80% response rate, we estimate that 3,008 respondents will complete the Referral Network Survey, or approximately 1,003 annually. No statistical methods will be used to identify respondents for the Referral Network Survey.
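
For transparency, the respondent estimate can be reproduced directly from the stated assumptions. The short Python sketch below is purely a worked check of the arithmetic (20 agencies per network, 2 respondents per agency, an 80% response rate, and 2 or 3 administrations depending on funding cohort), not an additional analysis.

```python
# Reproduce the Referral Network Survey respondent estimate from the
# assumptions stated above: 14 grantees with 2 administrations plus
# 22 grantees with 3 administrations, 20 agencies per network, and
# 2 respondents per agency, at an 80% response rate.
agencies_per_network = 20
respondents_per_agency = 2
response_rate = 0.80

grantee_administrations = 14 * 2 + 22 * 3                 # = 94
invited = grantee_administrations * agencies_per_network * respondents_per_agency
completed = invited * response_rate

print(invited, completed, round(completed / 3))           # 3760 3008.0 1003
```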


Campus Infrastructure Key Informant Interviews. Key informants for the Campus Infrastructure Key Informant Interviews will be identified by the local program staff and/or project evaluator to represent five key roles on each campus: (1) Administrator, (2) Student Leader, (3) Counseling Staff, (4) Faculty/Staff from a human services academic department, and (5) Faculty/Staff from a non-human services academic department. One respondent in each category will be interviewed for each of the 55 Campus grantees, for a total of 275 respondents (i.e., approximately 91 per project year). Within respondent categories with more than one appropriate key informant, respondents will be randomly selected. We estimate that one respondent per grantee in each category will be sufficient to ensure saturation of themes in the content analysis of the qualitative interviews.


Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS)-Student Version. Respondents for the student version of the Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS) will represent a sample of the student population. The 21 Campus grantees funded in FY 2006 will administer the survey in years 2 and 3 (2 administrations), and the 34 Campus grantees funded in FY 2007 will administer the survey in years 1, 2, and 3 (3 administrations). The cross-site evaluation team will develop a sampling plan to obtain 200 student respondents in each of the 55 Campus grantee sites per administration, for a total of 28,800 respondents. Local program staff and/or project evaluators will be responsible for pulling the sample. Response rates of 30-40% per campus are anticipated, given the difficulty of surveying this population; therefore, oversampling will be required. The campus evaluation team will draw a proportionately weighted stratified random sample within each grantee site targeted for SPEAKS administration from the matriculated student register. The student sample will be stratified by gender, matriculation year, and race/ethnicity. At each wave of administration, a sample size of 11,000 will achieve a margin of error of +/- 1.3% with a 95% confidence interval. In addition, within each grantee site, group sample sizes of 200 independent respondents at each wave achieve 80% power to detect a difference of 0.14 between groups with standard deviations of 0.5 in each group at alpha = .05, using a two-sided two-sample t-test.


Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS)-Faculty/Staff Version. Respondents for the faculty/staff version of the Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS) will represent a sample of the faculty/staff population. The 21 Campus grantees funded in FY 2006 will administer the survey in years 2 and 3 (2 administrations), and the 34 Campus grantees funded in FY 2007 will administer the survey in years 1, 2, and 3 (3 administrations). The cross-site evaluation team will develop a sampling plan to obtain 50 faculty/staff respondents in each of the 55 Campus grantee sites per administration, for a total of 7,200 respondents. Local program staff and/or project evaluators will be responsible for pulling the sample. Response rates of 30-40% per campus are anticipated, given the difficulty of surveying this population. As with the student sample, the campus evaluation team will draw a proportionately weighted stratified random sample within each grantee site targeted for SPEAKS administration from campus staff and faculty lists. The faculty/staff sample will be stratified by gender, race/ethnicity, and staff/faculty position. At each wave of administration, a sample size of 2,750 will achieve a margin of error of +/- 2.87% with a 95% confidence interval. In addition, within each grantee site, group sample sizes of 50 independent respondents at each wave achieve 80% power to detect a difference of 0.3 between groups with standard deviations of 0.5 in each group at alpha = .05, using a two-sided two-sample t-test.
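
For illustration, the power statements in the two preceding paragraphs can be verified with a short calculation. The Python sketch below (not part of the evaluation plan) uses statsmodels' two-sample t-test power routine; the margin-of-error helper applies a simple random-sampling approximation with p = 0.5, so it will not exactly reproduce the reported margins if those figures incorporate design effects for the stratified samples.

```python
# Illustrative check of the SPEAKS power statements, using a two-sided
# two-sample t-test. Effect size is the stated mean difference divided
# by the common standard deviation of 0.5.
import math
from statsmodels.stats.power import TTestIndPower

def srs_margin_of_error(n, z=1.96, p=0.5):
    """95% margin of error under simple random sampling (no design effect)."""
    return z * math.sqrt(p * (1 - p) / n)

power = TTestIndPower()
# Student version: 200 per group, difference 0.14, SD 0.5 -> ~80% power
print(power.power(effect_size=0.14 / 0.5, nobs1=200, alpha=0.05))
# Faculty/staff version: 50 per group, difference 0.3, SD 0.5 -> >=80% power
print(power.power(effect_size=0.30 / 0.5, nobs1=50, alpha=0.05))
# SRS margins of error at each wave; stratification/design effects would
# widen these relative to the simple approximation shown here.
print(srs_margin_of_error(11_000), srs_margin_of_error(2_750))
```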


Enhanced Study. The enhanced evaluation proposes a sampling scheme designed to maximize the generalizability of findings. A stratified random sample will be used so that regional effects and effects across participant groups can be examined. A 90% confidence level and a conservative standard deviation of 0.50 will be employed. For purposes of estimating cell sizes, an equal distribution of participants across regions (e.g., East, Middle, West) and across sub-groups (e.g., DCS staff, foster parents, school personnel) within regions is assumed. The target sample for each sub-group is 68 per region (204 across the state), with the exception of two smaller groups: (a) juvenile court judges and referees, and (b) adults who work with gay, lesbian, and bisexual youth. Assuming an 80% survey completion rate, over-sampling of the larger groups will be required to ensure that a 90% confidence interval can be attained at the level of group within region; thus, 85 participants (68 + 17) will be followed in each of the larger sub-groups. Altogether, the target sample for each region is 466 individuals (568 including over-sampling), and the statewide sample is 1,398 individuals (1,704 including over-sampling). This sampling scheme affords reasonable precision for examining sub-groups within region and substantial precision for collapsing groups at the regional (96.2% confidence) and statewide (97.8% confidence) levels.
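
The per-group arithmetic above can be reproduced directly. The following minimal sketch restates the stated assumptions (68 completions per larger sub-group per region, an 80% completion rate, a 90% confidence level, and a standard deviation of 0.50); it is a worked check, not an additional analysis, and the composition of the two smaller sub-groups is not reproduced here because it is not specified in the text.

```python
# Reproduce the enhanced-study sampling arithmetic stated above.
import math

target_per_group = 68          # completed surveys per larger sub-group per region
completion_rate = 0.80         # assumed survey completion rate
followed_per_group = math.ceil(target_per_group / completion_rate)

z90 = 1.645                    # two-sided 90% confidence
sd = 0.50                      # conservative standard deviation
margin = z90 * sd / math.sqrt(target_per_group)

print(followed_per_group)      # 85 (68 + 17, as stated)
print(round(margin, 3))        # ~0.100 precision per sub-group within region
print(3 * target_per_group)    # 204 per sub-group across the three regions
```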


2. Procedures for Collection of Information



Context Stage

Existing Database Inventory. Program staff and/or project evaluators from each of the 91 State/Tribal and Campus grantees will complete the web-based Existing Database Inventory once at the beginning of year 2 and once at the beginning of year 3 (see Attachments A.1 and A.2). The cross-site evaluation team will provide a web-based platform for data entry, will train program staff to complete the inventory, and will monitor completion. Information related to the existing databases and the availability of data elements is included in the inventory. Each grantee will be provided a unique username and password to log in to the web-based inventory. No individual identifying information will be provided when completing the inventory. Logging in and completing the inventory will imply consent.


Product Stage

Product and Services Inventory. Program staff and/or project evaluators from each of the 91 State/Tribal and Campus grantees will complete the web-based Product and Services Inventory once in year 1 of the cross-site evaluation, four times in year 2 (at the end of each quarter), and four times in year 3 (at the end of each quarter) (see Attachments B.1, B.2, B.3, and B.4). The cross-site evaluation team will provide a web-based platform for data entry, will train program staff to complete the inventory, and will monitor completion. Information collected will include the products and services developed, products and services in development, and the utilization of those products and services. Each grantee will be provided a unique username and password to log in to the web-based inventory. No individual identifying information will be provided when completing the inventory. Logging in and completing the inventory will imply consent.


Process Stage

Training Exit Survey. Individuals involved in training activities at each of the 36 State/Tribal grantee sites will be asked to complete the Training Exit Survey (see Attachment C). Upon completion of a training activity, local program staff and/or project evaluators will be responsible for providing the Training Exit Survey to participants for self-administration and immediate return. The survey cover page introduces the survey and explains the consent process. The cross-site evaluation team will train local program staff to administer the survey during a 2-day site visit prior to the start of administration. Consent will be implied by completion and submission of the survey to program and/or evaluation staff. A scannable survey option will be made available; alternatively, the survey can be administered in a paper-and-pencil format. If scannable surveys are used, local program staff will collect completed surveys and forward them to the cross-site evaluation team. If paper-and-pencil surveys are used, local program staff will be responsible for entering survey data into the web-based data collection system. Participation in the Training Exit Survey will be voluntary, but the survey will be offered to all training participants.


Training Utilization and Penetration (TUP) Key Informant Interviews. The Training Utilization and Penetration (TUP) Key Informant Interviews (see Attachment D.1) will be administered to a subset of respondents to the Training Exit Survey for one training activity per grantee per year. Many of the State/Tribal programs are planning multiple training activities; therefore, to obtain information from key informants who experienced the same training activity, the cross-site evaluation team, in consultation with local program staff, will select one training activity per grantee per year for which to administer the TUP Key Informant Interviews. When completing the Training Exit Survey, respondents will be asked to complete a separate form indicating their willingness to be contacted by the cross-site evaluation team to participate in the TUP and to return the form to local program staff (see Attachment D.2). Local program staff will forward consent forms to the cross-site evaluation team. To facilitate administration of the interview, identifying information for each key informant will be forwarded to the cross-site evaluation team. The cross-site evaluation team will contact each identified key informant via telephone within two months of the training activity to introduce the study, request participation, and schedule an appointment for the interview. The cross-site evaluation team will administer the interview and will be trained in qualitative interviewing by the cross-site evaluation project director or deputy project director. Each respondent will provide verbal consent prior to administration of the TUP interview (see Attachment D.3). Interviews will be audio recorded, but respondents will not be identified by name.


Referral Network Survey. For the first administration of the Referral Network Survey (see Attachment E.1), local program staff will identify all of the agencies and organizations involved in the referral network(s) for each of the 36 State/Tribal suicide prevention programs. Local program staff will contact the director of each identified agency/organization and request that two appropriate respondents knowledgeable about the suicide prevention referral network be identified. Local program staff will collect contact information (i.e., names, email addresses, and telephone numbers) from each potential respondent and forward this information to the cross-site evaluation team, which will administer the web-based Referral Network Survey. Implementation of this survey will adhere to accepted methods for Internet surveys (Dillman, 2000). Following recruitment activities and verification of contact information, the cross-site evaluation team will begin contacting potential respondents. A pre-survey email explaining that the recipient will be asked to participate in a survey will be sent to selected respondents in each agency/organization within each State/Tribal grantee site. The initial email will be followed 1 week later by an email containing directions for logging onto a Website to complete the Internet survey. A follow-up reminder will be sent 1 week later and, 1 week after that, a final reminder will be sent to all providers who have not completed the Web survey (see Attachments E.2, E.3, E.4, E.5). Telephone reminder calls will be made to any remaining non-respondents. The log-in page of the Referral Network Survey will provide an introduction, instructions on how to complete the survey, and a description of the consent process, which is included in Attachment E.1. Each respondent will be provided a unique username and password to log in to the web-based survey, and logging in and completing the survey will imply consent. The respondent list for the second and third administrations of the Referral Network Survey will be updated, and any additional agencies or organizations involved in the referral network(s) will be added. The same data collection procedures will be used for each subsequent administration.


Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS)-Student Version. The SPEAKS-student version will be administered to students in each of the 21 FY 2006-funded Campus grantee sites in year 2 and in year 3 (see Attachment F.1) and annually to students in each of the 34 FY 2007-funded Campus grantee sites. The cross-site evaluation team will develop the sampling plan, and local program staff and/or project evaluators will be responsible for identifying the sampling frame and pulling the sample. Once the sample has been pulled, local program staff will forward contact information (i.e., email addresses) to the cross-site evaluation team for administration of the SPEAKS. Implementation of this survey will adhere to accepted methods for Internet surveys (Dillman, 2000). Following recruitment activities and verification of email addresses, the cross-site evaluation team will begin emailing potential respondents. A pre-survey email explaining that the recipient will be asked to participate in a survey will be sent to selected respondents. The initial email will be followed 1 week later by an email containing directions for logging onto a Website to complete the Internet survey. A follow-up reminder will be sent 1 week later and, 1 week after that, another reminder email will be sent to all students who have not completed the Web survey (see Attachments F.2, F.3, F.4, F.5). The log-in page of the SPEAKS-Student Version will provide an introduction, instructions on how to complete the survey, and a description of the consent process, which is included in Attachment F.1. Each respondent will be provided a unique username and password to log in to the web-based survey, and logging in and completing the survey will imply consent. Sample identification for subsequent administrations will follow the same methods.


Suicide Prevention Exposure, Awareness and Knowledge Survey (SPEAKS)-Faculty/Staff Version. The SPEAKS-faculty/staff version will be administered to faculty and staff in each of the 21 FY 2006-funded Campus grantee sites in year 2 and in year 3 (see Attachment G) and annually to faculty and staff in each of the 34 FY 2007-funded Campus grantee sites. The cross-site evaluation team will develop the sampling plan, and local program staff and/or project evaluators will be responsible for identifying the sampling frame and pulling the sample. Once the sample has been pulled, local program staff will forward contact information (i.e., email addresses) to the cross-site evaluation team for administration of the SPEAKS. Implementation of this survey will adhere to accepted methods for Internet surveys (Dillman, 2000). Following recruitment activities and verification of email addresses, the cross-site evaluation team will begin contacting potential respondents. A pre-survey email explaining that the recipient will be asked to participate in a survey will be sent to selected respondents. The initial email will be followed 1 week later by an email containing directions for logging onto a Website to complete the Internet survey. A follow-up reminder will be sent 1 week later and, 1 week after that, another reminder email will be sent to all faculty/staff who have not completed the Web survey (see Attachments F.2, F.3, F.4, F.5). The log-in page of the SPEAKS-Faculty/Staff Version will provide an introduction, instructions on how to complete the survey, and a description of the consent process, which is included in Attachment G. Each respondent will be provided a unique username and password to log in to the web-based survey, and logging in and completing the survey will imply consent. Sample identification for subsequent administrations will follow the same methods.


Campus Infrastructure Interviews. There are four versions of the Campus Infrastructure qualitative interviews: (1) Administrator, (2) Counseling Center Staff, (3) Student Group Leader, and (4) Faculty/Staff (see Attachments H.1, H.2, H.3, H.4). Local evaluators will be responsible for identifying a list of appropriate respondents for each Campus Infrastructure Interview version and forwarding the appropriate contact information to the cross-site evaluation team for administration. Local program staff will be responsible for obtaining the necessary releases of information or consents-to-contact. To facilitate administration of the interview, identifying information for each respondent will be forwarded to the cross-site evaluation team; however, no identifying information will be included on the data collection instrument. The cross-site evaluation team will randomly select one respondent from each respondent list and contact the individual via telephone to introduce the study, request participation, and schedule an appointment for the interview. Each respondent will provide verbal consent prior to administration of the Campus Infrastructure Interview (see Attachments H.5, H.6, H.7, H.8). The cross-site evaluation team will administer the interview and will be trained in qualitative interviewing by the cross-site evaluation project director or deputy project director. Interviews will be audio recorded, but respondents will not be identified by name.

Enhanced Evaluation. The Tennessee Lives Count (TLC) 6-month survey will be used for the enhanced evaluation (see Attachment I.1). Data will be collected primarily via a web-based Internet survey. Individuals will be selected randomly from the list of participants who agreed to be contacted for the 6-month follow-up survey (see Attachment I.2). Each respondent will provide consent prior to administration of the TLC 6-month survey (see Attachment I.3). Upon completion of gatekeeper training, TLC trainers will instruct participants to fill out a consent-to-contact form. To enhance buy-in, trainers will briefly discuss the purpose and importance of the follow-up study, explain when and how the study will be conducted, describe the survey length and compensation, and assure participants that contact information will be used only for its intended purpose. Each training will have a designee (e.g., the contact who sponsored the training) who will collect materials and review each form for accuracy and completeness of contact information. The information requested includes the participant's primary and secondary email addresses, home address and phone number, place of work, work address and phone number, and contact information for two or three individuals who will always know how to reach the participant.


A two-month window (five to seven months after TLC gatekeeper training) will be used to maximize completion rates. An advance email (or postcard) will be sent at the five-month mark to thank participants for agreeing to be contacted, to remind them about the project and the $10 compensation, and to alert them that a follow-up email (or mailing) with directions for accessing and completing the survey will arrive within one week. The follow-up email, sent one week later, will contain instructions for accessing the survey along with a username and password. A third email/postcard will be sent to non-responders at the six-month mark to remind them to complete the survey. After another two weeks, a final reminder will be sent (see Attachments I.4, I.5, I.6, I.7).
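
To make the reminder sequence concrete, the following minimal sketch (hypothetical, and approximating a month as 30 days) computes the contact dates from a training completion date; the offsets follow the schedule described above.

```python
# Illustrative sketch of the TLC follow-up contact schedule: advance
# notice at 5 months, login instructions 1 week later, a reminder at
# 6 months, and a final reminder 2 weeks after that.
from datetime import date, timedelta

def contact_schedule(training_completed: date) -> dict:
    five_months = training_completed + timedelta(days=5 * 30)  # approx. months
    six_months = training_completed + timedelta(days=6 * 30)
    return {
        "advance_email_or_postcard": five_months,
        "login_instructions": five_months + timedelta(weeks=1),
        "reminder": six_months,
        "final_reminder": six_months + timedelta(weeks=2),
    }

print(contact_schedule(date(2007, 1, 15)))  # example training completion date
```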


Respondents who do not provide accurate email addresses on the consent-to-contact form, or whose emails are returned as undeliverable, will be contacted individually by mail and/or by phone. Participants who provide only a mailing address will receive an invitation to complete the survey by mail, including a paper-and-pencil version of the survey, a survey compensation form to collect the payee's name and preferred address, and a pre-addressed, stamped envelope for returning the materials. A toll-free number will also be provided for individuals who prefer to complete the survey by phone.


Table 4 summarizes the respondents, data collection procedures, and periodicity for each measure.


Table 4

Instrumentation, Respondents, and Periodicity



Context Stage

Measure: Existing Database Inventory
Data Source(s): Program staff and/or project evaluator
Method: Web-based inventory
When Collected: Once in year 2 and once in year 3

Product Stage

Measure: Product and Services Inventory
Data Source(s): Program staff and/or project evaluator
Method: Web-based inventory
When Collected: Once in year 1; quarterly in year 2 and in year 3

Process Stage

Measure: Training Exit Survey
Data Source(s): Provider (training participants)
Method: Survey
When Collected: Once, following completion of the training activity

Measure: Training Utilization and Penetration (TUP) Key Informant Interviews
Data Source(s): Provider (training participants)
Method: Interview
When Collected: Once per year, within 2 months of completing the Training Exit Survey

Measure: Referral Network Survey
Data Source(s): Provider (referral network stakeholders)
Method: Web-based survey
When Collected: Once in year 2 and once in year 3 for October 2005 funded grantees; annually for June and October 2006 funded grantees

Measure: Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS)-Student Version
Data Source(s): Students
Method: Web-based survey
When Collected: Once in year 2 and once in year 3 for FY 2006 funded grantees; annually for FY 2007 funded grantees

Measure: Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS)-Faculty/Staff Version
Data Source(s): Faculty/staff
Method: Web-based survey
When Collected: Once in year 2 and once in year 3 for FY 2006 funded grantees; annually for FY 2007 funded grantees

Measure: Campus Infrastructure Interview (4 versions)
Data Source(s): Campus administrators, student group leaders, counseling center staff, and faculty/staff
Method: Interview
When Collected: Once, in either year 2 or year 3

Enhanced Evaluation

Measure: TLC 6-month Follow-up Survey
Data Source(s): Provider (training participants)
Method: Web-based survey
When Collected: 6 months following the training activity



3. Methods to Maximize Response Rates



Participation in the cross-site evaluation is a requirement of the GLS Suicide Prevention Program; therefore, completion of the Existing Database Inventory and the Product and Services Inventory by program staff will be required. The cross-site evaluation team has taken a number of steps to minimize the burden on local programs and to ensure timely completion, including developing a web-based data collection system and providing training and technical assistance to each grantee.


The cross-site evaluation team will also provide technical assistance and training to all grantee sites to maximize response rates for the other data collection activities. This will be done by providing webcast trainings, distributing data collection procedure manuals, conducting on-site training visits for the State/Tribal grantees, and providing ongoing one-on-one contact with each grantee through a technical assistance liaison.


To maximize response rates specifically for the web-based surveys (i.e., the Referral Network Survey, SPEAKS-student version, SPEAKS-faculty/staff version, and the TLC 6-month survey), a 4-stage mailing process will be utilized (Dillman, 2000). All efforts have been made to minimize the burden on individual respondents by limiting the number of items on each questionnaire and building in functions to facilitate ease of data entry. Additionally, an incentive will be provided for students who complete the SPEAKS and for respondents to the TLC 6-month survey. For the Referral Network Survey and the TLC 6-month survey, nonresponders will be contacted via telephone to increase response rates. Respondents to the TLC 6-month survey will have the option of completing the survey using a paper-and-pencil version. For the SPEAKS, no personal contact will be made with nonresponders beyond the 4-stage mailing process described above. Because student populations are difficult to survey, some nonresponse is expected; however, the Dillman method and the incentive lottery will minimize the level of nonresponse.


Methods that will be used to maximize response rates for the qualitative interviews (i.e., the TUP Key Informant Interviews and the Campus Infrastructure Interviews) include obtaining buy-in from key program stakeholders, providing flexibility in scheduling, and conducting follow-up phone calls and emails to nonresponders. In addition, local program staff will be utilized to obtain contact information for respondents, which will result in more accurate information and thus increase response rates. If any identified respondents for the qualitative interviews are nonresponsive, the cross-site evaluation team will request that local program staff identify replacement respondents.



4. Tests of Procedures




The GLS Suicide Prevention and Early Intervention Program is the first federally funded program to support suicide prevention programs in states, tribal communities, and campuses. As such, the instruments to be used in the cross-site evaluation and the enhanced evaluation were customized to meet the needs of the program. As new measures were developed, standard instrument development procedures were used, including review of the literature, item development, and content review by experts in the field (see below). All instruments underwent cognitive testing, pilot testing, and/or expert review to enhance question accuracy and determine administration times. In addition, web-enabled instruments will undergo usability testing prior to fielding. Usability testing refers to pilot testing of the web-based interface for administering questionnaires to determine the most efficient and understandable presentation; typically this is completed with a prototype, and modifications are made before final fielding.


First, a thorough review of the literature related to suicide prevention training activities and suicide awareness and knowledge was conducted to inform development of the Training Exit Survey, the Training Utilization and Penetration Key Informant Interviews, and the SPEAKS. In addition, experts in mental health referral networks were consulted in developing the Referral Network Survey, and representatives from universities not involved in GLS Suicide Prevention Programs were consulted in developing the SPEAKS and the Campus Infrastructure Interviews. Second, drafts of the instruments were developed and reviewed by cross-site evaluation team members, representatives from SAMHSA, and content experts in the field of suicide prevention. Third, the revised instruments underwent cognitive and/or pilot testing with no more than 9 respondents of the type appropriate for each instrument, to enhance question accuracy and determine administration time.



5. Statistical Consultants



The cross-site evaluator has full responsibility for the development of the overall statistical design, and assumes oversight responsibility for data collection and analysis. Training, technical assistance, and monitoring of data collection will be provided by the cross-site evaluator. The individuals responsible for overseeing data collection and analysis are:


Brigitte Manteuffel, Ph.D.

ORC Macro, Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321-3211


Christine M. Walrath-Greene, Ph.D.

ORC Macro, Inc.

116 John Street, Fl. 8

New York, NY 10038

(212) 941-5555


The following individuals will serve as statistical consultants to this project:


Christine M. Walrath-Greene, Ph.D.

ORC Macro, Inc.

116 John Street, Fl. 8

New York, NY 10038

(212) 941-5555


Robert Stephens, Ph.D.

ORC Macro, Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321-3211


Ye Xu, M.S.

ORC Macro, Inc.

3 Corporate Square, Suite 370

Atlanta, GA 30329

(404) 321-3211


David Goldston, PhD

Duke University

Duke Child and Family Study Center

718 Rutherford Street DUMC 3527

Durham, NC 27710

(919) 416-2423


The agency staff person responsible for receiving and approving contract deliverables is:


Richard McKeon, Ph.D.

Prevention Initiatives and Priority Programs Development Branch

Center for Mental Health Services

Substance Abuse and Mental Health Services Administration

1 Choke Cherry Road

Room 6-1105

Rockville, MD 20857

Phone: (240) 276-1873



References


Dillman, D. A. (2000). Mail and Internet Surveys: The Tailored Design Method (2nd ed.). New York, NY: John Wiley & Sons.


Eggert, L. L., Nicholas, L. J., & Owen, L. M. (1995a). Reconnecting Youth: A Peer Group Approach to Building Life Skills. Bloomington, IN: National Educational Service.


Eggert, L. L., Randell, B. R., Thompson, E. A., & Johnson, C. L. (1997). Washington State Youth Suicide Prevention Program: Report of Activities. Seattle, WA: University of Washington.


Kalafat, J., & Elias, M. (1994). An evaluation of a school-based suicide awareness intervention. Suicide and Life-Threatening Behavior, 24(3), 224-233.


King, K. A., & Smith, J. (2000). Project SOAR: A training program to increase school counselors' knowledge and confidence regarding suicide prevention and intervention. Journal of School Health, 70(10), 402-407.



List of Attachments

Attachment A: Garrett Lee Smith Memorial Act - S.2634

Attachment A.1: Existing Database Inventory-State/Tribal Version

Attachment A.2: Existing Database Inventory-Campus Version

Attachment B.1: Product and Services Inventory-State/Tribal Baseline Version

Attachment B.2: Product and Services Inventory-State/Tribal Follow-up Version

Attachment B.3: Product and Services Inventory-Campus Baseline Version

Attachment B.4: Product and Services Inventory-Campus Follow-up Version

Attachment C: Training Exit Survey

Attachment D.1: Training Utilization and Penetration (TUP) Key Informant Interview

Attachment D.2: TUP Consent to Contact Form

Attachment D.3: TUP Introductory Script

Attachment E.1: Referral Network Survey

Attachment E.2: Referral Network Survey-Advance Email

Attachment E.3: Referral Network Survey-Introductory Email

Attachment E.4: Referral Network Survey-Reminder Email

Attachment E.5: Referral Network Survey-Final Reminder Email

Attachment F.1: Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS)-Student Version

Attachment F.2: Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS)-Advance Email

Attachment F.3: Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS)-Introductory Email

Attachment F.4: Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS)-Reminder Email

Attachment F.5: Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS)-Final Reminder Email

Attachment G: Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS)-Faculty/Staff Version

Attachment H.1: Campus Infrastructure Interview-Administrator Version

Attachment H.2: Campus Infrastructure Interview-Counseling Center Staff Version

Attachment H.3: Campus Infrastructure Interview-Faculty Version

Attachment H.4: Campus Infrastructure Interview-Student Leader Version

Attachment H.5: Campus Infrastructure Interview Introductory Script-Administrator Version

Attachment H.6: Campus Infrastructure Interview Introductory Script-Counseling Center Staff Version

Attachment H.7: Campus Infrastructure Interview Introductory Script-Faculty Version

Attachment H.8: Campus Infrastructure Interview Introductory Script-Student Leader Version

Attachment I.1: TLC 6-month Survey

Attachment I.2: TLC 6-month Consent to Contact

Attachment I.3: TLC 6-month Consent

Attachment I.4: TLC 6-month Advance Email

Attachment I.5: TLC 6-month Login Instructions

Attachment I.6: TLC 6-month 1st Reminder

Attachment I.7: TLC 6-month 2nd Reminder

Attachment J: Federal Register Notice


