List of Revisions to OMB Submittal

Cross-Site Evaluation of the Garrett Lee Smith Memorial Suicide Prevention and Early Intervention Program

SAMHSA Response

OMB: 0930-0286


Memorandum


TO: John Kramer

Office of Management and Budget


FROM: Summer King

SAMHSA Clearance Officer


RE: Responses to questions posed on April 25th regarding the GLS Suicide Prevention Cross-site Evaluation



Below are the questions and answers from the conference call held on April 25, 2007, to address issues related to the GLS Suicide Prevention Cross-site Evaluation protocol. The feedback was very helpful and has allowed the Program to explicitly address your five key points of concern: (1) the inclusion of data abstraction/reporting activities in the burden estimate, (2) controlling grantee access to raw data to prevent the inadvertent identification of individuals, (3) the (mis)use of the term confidential, (4) studying the relative effectiveness of lottery incentive structures on college student survey response and completion rates, and (5) consistent formatting of race/ethnicity questions.


Each of these five issues has been addressed, in turn, below. The issue raised is first summarized in italics, followed by our response to the concern. Please let me know if you have any other questions for the Program.


1. Burden for Data Abstraction, Aggregation and Reporting Activities: EIRF and TAR


The Early Identification, Referral, and Follow-up (EIRF) Analysis and the Training Activity Report (TAR) were cited in the original OMB supporting statement as activities that potentially did not require OMB review. OMB provided guidance that the burden associated with data abstraction and aggregation must indeed be included in the overall burden estimates.


The Early Identification, Referral, and Follow-up (EIRF) Analyses require State/Tribal grantees to share existing data with the cross-site evaluation team on the number of youth identified as at risk through early identification activities, referred for services, and presenting for services. The information shared with the cross-site evaluation includes basic demographic information; types of service referrals; and types of services received, including mental health assessments, mental health treatment, emergency services, and nontraditional support services. Because this information is tracked locally as part of program activities and maintained in management information systems, State/Tribal grantees will query their data systems and upload de-identified data to the cross-site evaluation team at least quarterly. Because the analysis uses existing data, the burden associated with sharing these data is the initial time to develop a query of the existing data system plus the time each quarter to upload the queried data. As such, it is estimated that the burden to query the data and upload the dataset is approximately 0.17 hours, four times per year, for all 36 State/Tribal grantees, for a total annual burden of 24 hours. See Attachment 1, which provides the data specifications to assist grantees in querying their data systems.
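For illustration only, the following is a minimal sketch of what such a quarterly extract might look like, assuming a SQLite-based local MIS with a hypothetical youth_contacts table; the selected column names follow the Attachment 1 data specification, but the table name, database layout, and function are assumptions, not the actual grantee systems.

```python
# Minimal sketch of a quarterly EIRF extract. The "youth_contacts" table and
# SQLite back end are hypothetical; column names follow Attachment 1.
import sqlite3

EIRF_QUERY = """
SELECT eirfdate, efpid, efsett, efsource,
       eirf1,          -- youth age
       eirf2,          -- youth gender code
       eirf5, eirf6,   -- mental health / non-mental health referral flags
       eirf9           -- referral follow-up status
FROM youth_contacts
WHERE eirfdate BETWEEN :quarter_start AND :quarter_end
"""

def quarterly_extract(db_path: str, quarter_start: str, quarter_end: str) -> list[dict]:
    """Pull the de-identified records for one quarter for upload to the cross-site team."""
    with sqlite3.connect(db_path) as conn:
        conn.row_factory = sqlite3.Row
        rows = conn.execute(EIRF_QUERY, {"quarter_start": quarter_start,
                                         "quarter_end": quarter_end}).fetchall()
    # No names, addresses, or other direct identifiers are selected; efpid is
    # the local 8-digit participant ID defined in the data specification.
    return [dict(r) for r in rows]
```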


Campus grantees are required to report aggregate training participant information for all individuals trained as part of their suicide prevention programs at least quarterly. These data are aggregated from existing data sources, such as attendance sheets and management information systems. Grantees are responsible for aggregating these data and submitting them to the cross-site evaluation team. As such, it is estimated that the burden to aggregate the data and enter them into the Web-based system is approximately 0.33 hours, four times per year, for all 55 Campus grantees, for a total annual burden of 73 hours. Attachment 2 is the template provided to grantees to assist them in aggregating data for submittal.


| Type of Respondent | Measure Name | No. of Respondents | No. of Responses/Respondent | Hours/Response | Response Burden (hours) | Wage | Total Cost |
|---|---|---|---|---|---|---|---|
| Project Evaluator 1 | Early Identification, Referral and Follow-up Data | 36 | 4 | 0.17 | 24 | $29.40 | $720 |
| Project Evaluator 1 | Training Activity Report | 55 | 4 | 0.33 | 73 | $29.40 | $2,134 |

1 National Compensation Survey, Bureau of Labor Statistics (BLS), U.S. Department of Labor, professional-specialty and technical occupations, July 2004.
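For clarity, the figures in the table follow from simple multiplication; the sketch below (Python, illustrative only) reproduces them, noting that the hour totals are rounded for display while the costs appear to be computed from the unrounded totals.

```python
# Illustrative reproduction of the burden-table arithmetic above.
# Wage of $29.40/hour is from the BLS National Compensation Survey (footnote 1).

def annual_burden(respondents: int, responses_per_year: int,
                  hours_per_response: float, wage: float) -> tuple[float, float]:
    """Return (total annual hours, total annual cost) for one measure."""
    hours = respondents * responses_per_year * hours_per_response
    return hours, hours * wage

eirf_hours, eirf_cost = annual_burden(36, 4, 0.17, 29.40)  # 24.48 h, $719.71 -> 24 h, $720
tar_hours, tar_cost = annual_burden(55, 4, 0.33, 29.40)    # 72.60 h, $2,134.44 -> 73 h, $2,134

print(f"EIRF: {round(eirf_hours)} hours, ${round(eirf_cost):,}")
print(f"TAR:  {round(tar_hours)} hours, ${round(tar_cost):,}")
```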


2. Grantee Access to Raw Data


OMB requires that, if and when de-identified raw data are shared with grantees, precautions be taken (either statistically or through controlled access) to prevent the inadvertent identification of individuals through the unique configuration of variables gathered about them (e.g., race, employment agency, gender).


In an effort to promote data-driven program improvement and to support grantees' program sustainability efforts, the intention was to provide grantees access to site-specific, de-identified raw data. The cross-site evaluation team has developed a data collection and management system (the Suicide Prevention Data Center [SPDC]) that establishes strict security privileges. Only individuals with security access at the site administrator level are allowed access to raw data. To protect against potential misuse of those data, specifically the inadvertent identification of respondents through their unique demographic/workforce characteristic profiles, the cross-site evaluation team will restrict access to raw datasets to designated individuals, and the site administrator of the SPDC will be asked to sign a data use agreement. To protect against inadvertent identification, this agreement will stipulate who may analyze or report the raw data, and how and under what circumstances. For example, the cross-site evaluation team will obtain an agreement from each site administrator not to report categories in which fewer than 10 cases exist and to stipulate who will have access to raw data. Further, the agreement will indicate that no attempt will be made, through complex analysis or with outside information, to ascertain from the data sets the identity of particular persons. Attachment 3 is the agreement that will be used.
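As an illustration of the fewer-than-10 reporting rule, the sketch below shows how a site administrator might mask small cells before reporting; the record structure and field names are hypothetical.

```python
# Sketch of the small-cell suppression rule from the data use agreement:
# categories with fewer than 10 cases are not reported.
from collections import Counter

MIN_CELL_SIZE = 10  # reporting threshold stipulated in the agreement

def reportable_counts(records: list[dict], category_field: str) -> dict:
    """Tabulate records by category, masking any cell below the threshold."""
    counts = Counter(rec[category_field] for rec in records)
    return {category: (n if n >= MIN_CELL_SIZE else "suppressed (<10)")
            for category, n in counts.items()}

# e.g., reportable_counts(site_records, "efsett") would report counts of early
# identification settings only where 10 or more cases exist.
```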


3. Use of the Term Confidential and Confidentiality


The term confidentiality has strict connotations and implications and must be supported by a statute or law. Unless supported by such a statute, this term should not be used; in its place, terms such as anonymous and private should be used as appropriate.


All references to confidential and confidentiality were removed from consent forms, instruments, and recruitment materials. Attachment 4 provides a detailed list of the changes made.


4. Use of Lottery Incentives and Studying Their Relative Impact on Response/Completion Rates


Concern was expressed over the use of “lottery-style” incentives for the Student SPEAKS. It was suggested, however, that OMB would consider a study of the relative impact of different incentive approaches if it were embedded in the Campus cross-site evaluation.


Students from 55 campuses across the country will participate in the Suicide Prevention Exposure, Awareness, and Knowledge Survey (SPEAKS). An incentive plan was deemed appropriate to increase response rates given this hard-to-reach population. The Association for Institutional Research recommends that survey researchers consider the culture of the school when designing an incentive structure (AIR, 2006). As such, each campus participating in the SPEAKS designed its own incentive structure based on its campus culture; some structures include post-survey payment to all who complete the survey, and others include random assignment of post-survey incentives to selected individuals (i.e., lottery-style). When exploring which incentive structure would work best, campuses considered their own experiences. Several relevant examples are provided below:


  • Rensselaer Polytechnic Institute recently conducted the National College Health Assessment and surveyed 3,600 undergraduates, using an incentive of ten chances to win a $100 gift certificate to the RPI Bookstore. The typical RPI student response rate is 20-25%; however, the response rate for this survey was 39%.

  • Northwest Missouri State University conducted focus groups with students to determine what would motivate them to participate in the required general education test. Results indicated that a larger dollar-amount scholarship would be more of a motivator than smaller incentives awarded to all students, such as food and beverages.


While research on traditional survey methods indicates that providing incentives prior to survey administration yields higher response rates than no incentive or post-survey incentives, the evidence for Web-based surveys is more equivocal (Heerwegh, 2006; O’Neil, Penrod, & Bornstein, 2003; Porter & Whitcomb, 2003; Dillman, 2000). Randomly assigned incentives (i.e., “lottery” incentives) have become increasingly common in surveys of college students (Porter & Whitcomb, 2003), and there is promising evidence that this approach increases response rates compared to no incentive and reduces item nonresponse (Heerwegh, 2006; O’Neil, Penrod, & Bornstein, 2003; Porter & Whitcomb, 2003).


Furthermore, while Web surveys and lottery incentives are widely used with college students, the evidence on the effectiveness of lottery-style post-survey incentives for Web-based surveys is equivocal. Thus, the cross-site evaluation team proposes conducting a study of the relative impact of various post-survey incentive strategies on Web-survey response and completion rates. The study will include both naturalistic comparisons of post-survey incentive structures across campuses and experimentally manipulated comparisons within a subset of campuses (i.e., students randomly assigned to either a traditional post-survey incentive group or a lottery-style incentive group). Incentive plans have been identified for each of the 21 Cohort 1 campuses (funded in 2005): two have elected to provide $5.00 post-survey incentives to all respondents, and the remainder have elected some form of “lottery-style” incentive. The lottery-style elections vary in both type and amount and include cashier’s checks, cards to major retail stores, electronic gifts, and educationally relevant gift certificates (e.g., bookstore, tuition) ranging in value from $20 to $500. This variation provides an excellent opportunity for naturalistic comparisons across campuses of various incentive strategies, including:


  • Post-incentives to all respondents vs. lottery-style (regardless of type or amount),

  • Large vs. small lottery-style incentives,

  • Entertainment vs. educational lottery-style incentives, and

  • Cash equivalent vs. product-specific lottery-style incentives.


In addition, four Cohort 1 campuses have expressed a willingness to participate in an experimental manipulation of incentives on their campuses, in which a randomly selected half of respondents will be assigned to a group receiving a traditional post-survey incentive for every respondent and the other half to a group receiving a lottery-style incentive. The invitations for participation will be tailored accordingly.
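A minimal sketch of this within-campus assignment follows, assuming only a roster of sampled student IDs; the function and group labels are illustrative, not the actual assignment procedure.

```python
# Sketch of randomly assigning half of a campus's sampled students to a
# traditional post-survey incentive and half to a lottery-style incentive.
import random

def assign_incentive_groups(student_ids: list[str], seed: int = 2007) -> dict:
    rng = random.Random(seed)   # fixed seed keeps the assignment reproducible/auditable
    shuffled = list(student_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"traditional_post_incentive": shuffled[:half],
            "lottery_style_incentive": shuffled[half:]}

# Survey invitations would then be tailored to the assigned group.
```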


It is anticipated that as Cohort 2 campuses finalize their desired incentive approaches, comparable natural variation will occur and additional campuses will be willing to participate in an experimental manipulation.


Collectively, the natural and experimental comparisons will allow the cross-site evaluation to systematically investigate the differential impact of post-survey incentive structures on college students' Web-survey response and completion rates. These findings will contribute significantly to what is known about incentive structures for Web-based surveys of college students.


5. Consistent Formatting of Race/Ethnicity Questions


Race/ethnicity questions across all instruments were reviewed and modified (where necessary) for consistency. Attachment 4 provides a detailed list of the instruments and items where changes were made. In addition, in response to the guidelines for aggregating race/ethnicity data, item 5 on the Referral Network Survey was modified. Because respondents are asked to provide their perception of the racial/ethnic distribution of the population served, rather than to aggregate racial/ethnic information from existing records, the aggregation guidelines were less applicable. Therefore, rather than asking for specific percentages by racial/ethnic category, it is proposed to allow respondents to select the racial/ethnic groups represented in the population they serve.


Referral Network Survey


5. Based on your perception, check the racial/ethnic groups represented in the population you/your agency serves (select all that apply).

___ American Indian or Alaska Native

___ Asian

___ Black or African American

___ Native Hawaiian or other Pacific Islander

___ White


References


Association for Institutional Research (2006). Essential steps for Web surveys: A guide to designing, administering, and utilizing Web surveys for university decision-making. AIR Professional File, No. 102, Winter 2006.


Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York, NY: John Wiley & Sons.


Heerwegh, D. (2006). An investigation of the effect of lotteries on Web survey response rates. Field Methods, 18(2), 205-220.


O’Neil, K. M., Penrod, S. D., & Bornstein, B. H. (2003). Web-based research: Methodological variables’ effects on dropout and sample characteristics. Behavior Research Methods, Instruments, & Computers, 35(2), 217-226.


Porter, S. R., & Whitcomb, M. E. (2003). The impact of lottery incentives on student survey response rates. Research in Higher Education, 44(4), 389-407.











Attachment 1

Data Elements for the Early Identification and Referral Follow-up Analysis

| Variable Name | Question Number | Question | Formats & Codes |
|---|---|---|---|
| eirfdate | cs1 | Today's Date - Month/Day/Year | (Text) |
| efpid | cs2 | Participant ID | (Numeric); must be 8 digits in length |
| efcase | cs3 | Sources of information used to complete this form: Case record review or existing data system | 0 = Not Endorsed; 1 = Endorsed |
| efprovid | cs3 | Directly from a provider (i.e., case manager, clinician, mental health professional) | 0 = Not Endorsed; 1 = Endorsed |
| efgate | cs3 | Directly from a gatekeeper (i.e., not a mental health professional) | 0 = Not Endorsed; 1 = Endorsed |
| efoth | cs3 | Other | 0 = Not Endorsed; 1 = Endorsed |
| efothd | cs3o | Other, please describe | (Text) |
| efsett | cs4 | Early Identification Activity Setting | 1 = School; 2 = Child Welfare; 3 = Juvenile Justice; 4 = Law Enforcement; 5 = Community-based Organization; 6 = Physical Health; 7 = Mental Health Agency; 8 = Other |
| efsetto | cs4o | Other Early Identification Activity Setting | (Text) |
| efsource | cs5 | Source of Early Identification of Youth | 1 = Screening; 2 = Parent/Foster Parent/Caregiver; 3 = Mental health service provider (e.g., clinician, school counselor); 4 = Teacher or other secondary school staff; 5 = Child welfare staff; 6 = Probation officer or other juvenile justice staff; 7 = Primary care provider (i.e., doctor, nurse); 8 = Police officer or other law enforcement staff; 9 = Other |
| efsour_o | cs6 | Other Source of Early Identification of Youth | (Text) |
| eirf1 | 1 | Youth Age | (Numeric) |
| eirf2 | 2 | Youth Gender | 1 = Boy; 2 = Girl; 3 = Transgender; 4 = Other |
| eirf2o | 2o | Other gender, specified | (Text) |
| eirf3 | 3 | Is the youth of Hispanic or Latino cultural/ethnic background? | 1 = No; 2 = Yes |
| eirf3a_1 | 3a | Mexican, Mexican-American, or Chicano | 0 = Not Endorsed; 1 = Endorsed |
| eirf3a_2 | 3a | Puerto Rican | 0 = Not Endorsed; 1 = Endorsed |
| eirf3a_3 | 3a | Cuban | 0 = Not Endorsed; 1 = Endorsed |
| eirf3a_4 | 3a | Dominican | 0 = Not Endorsed; 1 = Endorsed |
| eirf3a_5 | 3a | Central American | 0 = Not Endorsed; 1 = Endorsed |
| eirf3a_6 | 3a | South American | 0 = Not Endorsed; 1 = Endorsed |
| eirf3a_7 | 3a | Hispanic origin captured in local MIS but not represented in list above | 0 = Not Endorsed; 1 = Endorsed |
| eirf3ao | 3ao | Text explanation for eirf3a_7 | (Text) |
| eirf4_1 | 4 | American Indian or Alaska Native | 0 = Not Endorsed; 1 = Endorsed |
| eirf4_2 | 4 | Asian | 0 = Not Endorsed; 1 = Endorsed |
| eirf4_3 | 4 | Black or African American | 0 = Not Endorsed; 1 = Endorsed |
| eirf4_4 | 4 | Native Hawaiian or Other Pacific Islander | 0 = Not Endorsed; 1 = Endorsed |
| eirf4_5 | 4 | White | 0 = Not Endorsed; 1 = Endorsed |
| eirf4_6 | 4 | Race captured in local MIS but not represented in list above | 0 = Not Endorsed; 1 = Endorsed |
| eirf4o | 4o | Text explanation for eirf4_6 | (Text) |
| eirf5 | 5 | Was the youth referred for mental health related services? | 1 = Yes; 2 = No |
| eirf5a1 | 5a | Referral made to: Mental health assessment/treatment | 0 = Not Endorsed; 1 = Endorsed |
| eirf5a2 | 5a | Substance use assessment/treatment | 0 = Not Endorsed; 1 = Endorsed |
| eirf5a3 | 5a | Psychiatric hospitalization | 0 = Not Endorsed; 1 = Endorsed |
| eirf5a4 | 5a | Emergency room or mobile crisis | 0 = Not Endorsed; 1 = Endorsed |
| eirf5a5 | 5a | Other | 0 = Not Endorsed; 1 = Endorsed |
| eirf5ao | 5ao | Other, please describe | (Text) |
| eirf6 | 6 | Was the youth referred for non-mental health related services? | 1 = Yes; 2 = No |
| eirf6a1 | 6a | Non-mental health referral made to: Informed youth of crisis hotline | 0 = Not Endorsed; 1 = Endorsed |
| eirf6a2 | 6a | Discussed availability of other supports with youth | 0 = Not Endorsed; 1 = Endorsed |
| eirf6a3 | 6a | Tutoring/academic counseling | 0 = Not Endorsed; 1 = Endorsed |
| eirf6a4 | 6a | Physical health referral | 0 = Not Endorsed; 1 = Endorsed |
| eirf6a5 | 6a | Other | 0 = Not Endorsed; 1 = Endorsed |
| eirf6ao | 6ao | Other, please describe | (Text) |
| eirf6b1 | 6b | Why was the youth not referred for any services? No need for additional services | 0 = Not Endorsed; 1 = Endorsed |
| eirf6b2 | 6b | Youth was already receiving mental health services | 0 = Not Endorsed; 1 = Endorsed |
| eirf6b3 | 6b | No capacity at provider agencies to make a mental health referral | 0 = Not Endorsed; 1 = Endorsed |
| eirf6b4 | 6b | Youth already receiving other supports | 0 = Not Endorsed; 1 = Endorsed |
| eirf6b5 | 6b | Other | 0 = Not Endorsed; 1 = Endorsed |
| eirf6bo | 6bo | Other, please describe | (Text) |
| eirf7m | 7 | Month | (Numeric) |
| eirf7y | 7 | Year | (Numeric) |
| eirf8 | 8 | Where was the child referred for mental health related services? | 1 = Mental Health Agency; 2 = Hospital; 3 = Emergency room; 4 = Substance Abuse Treatment Center; 5 = School Counselor; 6 = Private practice; 7 = Other |
| eirf8o | 8o | Other, please describe | (Text) |
| eirf9 | 9 | In the 3 months following the date of referral, which of the following best describes the youth’s situation as it relates to completing the referral? | 1 = No action was taken following the referral; 2 = Made an appointment but youth did not attend the appointment; 3 = Attempted to make an appointment but youth was wait-listed for at least 3 months; 4 = Made an appointment and youth received first service within 3 months; 5 = Youth received emergency services |
| eirf10m | 10 | Date of 1st service: Month | (Numeric) |
| eirf10y | 10 | Date of 1st service: Year | (Numeric) |
| eirf11 | 11 | What service did the youth receive at the 1st appointment? | 1 = Mental Health assessment; 2 = Substance use assessment; 3 = Family therapy; 4 = Individual therapy; 5 = Group therapy; 6 = Substance abuse counseling; 7 = Emergency room services; 8 = Other service |
| eirf11o | 11o | Other, please describe | (Text) |
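To illustrate how a grantee or the cross-site team might check an upload against this specification, a minimal validation sketch follows; only a few representative rules are shown, and the routine itself is an assumption rather than part of the protocol.

```python
# Sketch of validating one uploaded EIRF record against the specification above.
import re

def validate_record(rec: dict) -> list[str]:
    """Return a list of specification violations for a single record."""
    errors = []
    if not re.fullmatch(r"\d{8}", str(rec.get("efpid", ""))):
        errors.append("efpid (cs2) must be 8 digits in length")
    if rec.get("eirf2") not in {1, 2, 3, 4}:
        errors.append("eirf2 (youth gender) must be coded 1-4")
    for flag in ("efcase", "efprovid", "efgate", "efoth"):
        if rec.get(flag) not in {0, 1}:
            errors.append(f"{flag} must be 0 (Not Endorsed) or 1 (Endorsed)")
    return errors
```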

Attachment 2

Campus Training Activity Report (TAR) – Aggregate Template


Name of Training: __________________________


Date of Training: (mm/dd/yyyy): ____________________


Type of Activity: ___ Training  ___ Educational Seminar


Unduplicated count of attendees: _______


  1. Gender (provide counts)

____ Female

____ Male

____ Transgender

____ Other (specify):______________________


  2. Race/Ethnicity (provide counts)

____ American Indian or Alaska Native

____ Asian

____ Black or African American

____ Native Hawaiian or Other Pacific Islander

____ White

____ American Indian or Alaska Native and White

____ Asian and White

____ Black or African American and White

____ American Indian or Alaska Native and Black or African American

____ Individuals reporting multiple races not included above

____ No race available


  3. Role (provide counts)

____ Student

____ Family Member

____ Faculty

____ Staff

____ Clergy

____ Community group member

____ Other (please describe: _______________)

____ Other (please describe: _______________)

____ Other (please describe: _______________)
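For illustration, a grantee might derive the counts above from a local attendance sheet; the sketch below assumes a simple CSV with hypothetical columns (attendee_id, gender, race, role) and is not the required submission process.

```python
# Sketch of aggregating a local attendance sheet into the TAR template counts.
import csv
from collections import Counter

def tar_counts(attendance_csv: str) -> dict:
    with open(attendance_csv, newline="") as f:
        rows = list(csv.DictReader(f))
    # Keep one row per attendee so all counts are unduplicated.
    unique = {}
    for row in rows:
        unique.setdefault(row["attendee_id"], row)
    people = list(unique.values())
    return {"unduplicated_count": len(people),
            "gender": Counter(p["gender"] for p in people),
            "race_ethnicity": Counter(p["race"] for p in people),
            "role": Counter(p["role"] for p in people)}
```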


Attachment 3

GLS Cross-site Evaluation Data Access and Use Agreement


As the Suicide Prevention Data Center (SPDC) Site Administrator, you have the highest level of privileges in the SPDC, including data download privileges. All data sets have been de-identified; however, to ensure the highest level of protection for the respondents represented in these data sets, you are required to review and sign this data access and use agreement. The agreement has two fundamental parts, described below: the first relates explicitly to SPDC access and use, and the second to data use and reporting.


I. SPDC Access: User IDs and Password


The purpose of this section of the agreement is to specify the conditions related to using the Suicide Prevention Data Center as an assigned Site Administrator. The SPDC is the data management system that was developed to support the Cross-site Evaluation of the Garrett Lee Smith Suicide Prevention Program sponsored by the Center for Mental Health Services (CMHS) of the Substance Abuse and Mental Health Services Administration (SAMHSA).


Site Administrators of the SPDC are assigned the highest level of privilege in the system, including the ability to assign other users and to download grantee-specific data. No other level of user access allows data download. As an SPDC Site Administrator, you are expected to adhere fully to the security standards of the SPDC, both in your interactions with other users and in handling suicide prevention information. As a user, you are required to change the system-generated password to a self-generated password.


The undersigned gives the following assurances with respect to using the SPDC:


  • You will not allow any other person to use your user ID and password and will accept responsibility for all logins to the SPDC made using them.

  • You will not provide your user ID and/or password to any third party.

  • You will not leave the SPDC Web site unattended while logged on to the system.

  • If you believe any breach of security has occurred, such as the disclosure, theft, or unauthorized use of your user ID and password, you will contact Macro International Inc. immediately.


II. Data Use


The purpose of this section of the agreement is to specify the conditions related to accessing/using the grantee-specific Cross-site Evaluation data from the Garrett Lee Smith Suicide Prevention Program sponsored by CMHS.


The data sets should be used for the express purposes of local program monitoring, evaluation and sustainability. No identifying information will be included in the grantee-specific data sets.


The undersigned gives the following assurances with respect to use of grantee-specific data:


  • I will be knowledgeable about and adhere to relevant IRB regulations regarding the proposed use of the data. In compliance with Health Insurance Portability and Accountability Act (HIPAA) regulations, distributed data will have all personal and identifying information removed from the dataset.

  • I will not use nor permit others to use cross-site evaluation data in any way except for aggregate statistical reporting.

  • I will require others in the organization (specified below) who use the data to sign this agreement and will keep those signed agreements on file and will submit copies of those signed agreements to SAMHSA upon request.

  • I will not use, release nor permit others to release any information that identifies persons, directly or indirectly.

  • I will not report, nor permit others to report, aggregate information based on sample sizes of fewer than 10.

  • I will not use or release nor permit others to use or release the data sets or any part of them to any person who is not a member of the organization (specified below), except with the approval of SAMHSA and the project officer for the grant programs under analysis.

  • I will not attempt to use nor permit others to use the cross-site evaluation data sets to learn the identity of any person included in any set.

  • I will not contact nor permit others to contact establishments or persons in the data sets to question, verify, or discuss data in the cross-site evaluation dataset.

  • I will make no statement, nor permit others to make statements, indicating or suggesting that interpretations drawn from analyses of these data are those of Macro International Inc. or SAMHSA.

  • I will acknowledge in all reports based on these data that the source of the data is the GLS cross-site evaluation funded by the Center for Mental Health Services, Substance Abuse and Mental Health Services Administration.



____________________________ ________________________ _________

Site Administrator’s signature Organizational Affiliation Date


____________________________ _________

Macro International Inc. Date


_____________________________ _________

Center for Mental Health Services Date




******************************************************************************



The undersigned local users have read and understand all aspects of the data use portion of this agreement.


____________________________ ________________________ _________

Local user 1 Organizational Affiliation Date


____________________________ ________________________ _________

Local user 2 Organizational Affiliation Date


____________________________ ________________________ _________

Local user 3 Organizational Affiliation Date

Attachment 4

Changes to Instruments, Consents and Recruitment Materials


A.1 Existing Database Inventory (State/Tribal Version)

Page 2, Consent form: Replaced the word confidentiality with privacy and confidential with private


A.2 Existing Database Inventory (Campus Version)

Page 2, Consent form: Replaced the word confidentiality with privacy and confidential with private


B.1 Product and Services Inventory (State/Tribal Version - Baseline)

Page 2, Consent form: Replaced the word confidentiality with privacy and confidential with private


B.3 Product and Services Inventory (Campus Version - Baseline)

Page 2, Consent form: Replaced the word confidentiality with privacy and confidential with private


C Training Exit Survey

Page 3, Consent form: Replaced the word confidentiality with privacy and confidential with private

Page 6, item 29: Removed the response option of ‘other’.


D.3 Training Utilization and Penetration Key Informant Interview PHONE SCRIPT AND VERBAL CONSENT FORM

Page 1, Consent form: Replaced the word confidentiality with privacy and confidential with private


E.1 Referral Network Survey

Page 2, Consent form: Replaced the word confidentiality with privacy and confidential with private


E.2 Advance RNS

Replaced the word confidentiality with privacy and confidential with private


F.1 Suicide Prevention Exposure, Awareness and Knowledge Survey – Student Version

Page 2, Consent form: Replaced the word confidentiality with privacy and confidential with private

Page 7, item 50: Removed the response option of ‘other’.


F.2 Advance SPEAKS

Replaced the word confidentiality with privacy and confidential with private


G Suicide Prevention Exposure, Awareness and Knowledge Survey – Faculty/Staff Version

Page 2, Consent form: Replaced the word confidentiality with privacy and confidential with private

Page 7, item 52: Removed the response option of ‘other’.


H.5 Campus Infrastructure Interview-A PHONE SCRIPT AND VERBAL CONSENT

Page 1: Replaced the word confidentiality with privacy


H.6 Campus Infrastructure Interview-C PHONE SCRIPT AND VERBAL CONSENT

Page 1: Replaced the word confidentiality with privacy


H.7 Campus Infrastructure Interview-F PHONE SCRIPT AND VERBAL CONSENT

Page 1: Replaced the word confidentiality with privacy


H.8 Campus Infrastructure Interview-S PHONE SCRIPT AND VERBAL CONSENT

Page 1: Replaced the word confidentiality with privacy


I.1 Tennessee Lives Count Six-Month Follow-up Survey

Page 1: Replaced the word confidential with private

Page 1, item 3: Added (select one) to the end of the question.

Page 1, item 4: Replaced (check all that apply) with (select one or more) and removed the response option ‘other’.


I.3 TLC Six-Month Consent

Page 1: Replaced the word confidentiality with privacy






