Targeted Capacity Expansion Grants for Jail Diversion Programs

OMB: 0930-0277



SUPPORTING STATEMENT


B. Collection of Information Employing Statistical Methods.

1) Respondent Universe and Sampling Methods

The TCE jail diversion Grantees will collect data on all individuals screened for diversion as well as on all jail diversion program enrollees. The starting point for data collection is everyone the programs screen for possible entry into the jail diversion program. Based on data received from the Grantees in prior years, on average approximately 500 individuals per year, per Grantee, are screened at least once for possible jail diversion enrollment. (This number varies across Grantees, however, due to differences in the size and types of programs.) Although this is a large number, the evaluation requires only basic information on these individuals so that SAMHSA/CMHS can obtain an accurate representation of the population from which enrollees are drawn and can credit the programs with the additional planning and work required to enroll individuals into the jail diversion programs.


Based on current JDTR enrollment rates, on average 40 individuals are enrolled in jail diversion by each Grantee per year. (Again, this number varies across Grantees due to differences in the size and types of programs.) Hence, compared to the census of everyone screened, the number of individuals deemed eligible and enrolled in the jail diversion programs is quite small. In addition, program enrollees may decline to participate in the evaluation, and evaluation participants may decline to participate in one or both follow-up interviews. Furthermore, prior to the addition of the grant-mandated evaluation, no formal sample size was calculated. The goal of the program evaluation is to collect data on as many jail diversion program enrollees as possible at each site. For these reasons, a universal sampling frame will be required to obtain the largest possible n.


SAMHSA estimates that Grantees will have a minimum evaluation participation rate of 80 percent, which should ensure a sufficient sample size for the purposes of this evaluation. This estimate is based on prior experience: both FY 2008 and FY 2009 Grantees achieved a participation rate of 82 percent under the prior OMB data collection. We expect this participation rate to remain the same.


SAMHSA will make the following efforts to determine whether non-response is related to (a) self-selection bias between persons who participated in the evaluation and those who did not, or (b) differential attrition between persons who completed all follow-up evaluation interviews and those who did not. We will analyze differences on key measures (e.g., demographic, clinical, length of service) between these groups, as sketched below. For enrollees who elected not to participate in the evaluation, the Person Tracking data collection, which contains limited demographic and enrollment data, can be used for this purpose. For participants who completed a baseline interview but not subsequent follow-up interviews, we will compare characteristics using baseline information.
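
The group comparisons described above could be carried out along the following lines. This is a minimal illustrative sketch, not the specified analysis plan; the file and variable names (person_tracking.csv, baseline_interview.csv, participated, gender, age, completed_12mo, basis24_total) are hypothetical stand-ins for fields in the Person Tracking extract and the baseline interview data.

```python
# Illustrative non-response checks: (a) self-selection, (b) differential attrition.
# All file and column names below are hypothetical placeholders.
import pandas as pd
from scipy import stats

tracking = pd.read_csv("person_tracking.csv")      # all program enrollees
baseline = pd.read_csv("baseline_interview.csv")   # evaluation participants only

# (a) Self-selection: compare enrollees who joined the evaluation with those who
#     did not, using the limited demographics available in Person Tracking.
gender_table = pd.crosstab(tracking["participated"], tracking["gender"])
chi2, p_gender, _, _ = stats.chi2_contingency(gender_table)

age_in = tracking.loc[tracking["participated"] == 1, "age"]
age_out = tracking.loc[tracking["participated"] == 0, "age"]
t_age, p_age = stats.ttest_ind(age_in, age_out, nan_policy="omit")

# (b) Differential attrition: compare baseline characteristics of participants who
#     completed the 12-month interview with those who did not.
completers = baseline[baseline["completed_12mo"] == 1]
dropouts = baseline[baseline["completed_12mo"] == 0]
t_sym, p_sym = stats.ttest_ind(
    completers["basis24_total"], dropouts["basis24_total"], nan_policy="omit"
)

print(f"Gender x participation: p = {p_gender:.3f}")
print(f"Age difference: p = {p_age:.3f}; baseline BASIS-24 difference: p = {p_sym:.3f}")
```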


2) Information Collection Procedures

As discussed earlier, the data collected through this evaluation come from three primary sources:

  1. Interview Data, composed of the following primary measures:

     a. GPRA/NOMS measures

     b. Military Service Questions/Combat Experiences

     c. Lifetime Mental Health/Substance Use Treatment Services

     d. Drug and Alcohol Use

     e. Criminal Justice Questions

     f. Traumatic Events

     g. Posttraumatic Stress Checklist

     h. BASIS-24

     i. REE Recovery Markers

  2. Tracking Data

     a. Events Tracking (demographic and eligibility information)

     b. Person Tracking (demographics, diagnosis, charges, diversion point and condition, target arrest/incident, and enrollment information)

  3. Record Review Data

     a. Service Use (type and number of services received)

     b. Arrest History (type of arrests and jail days)


The starting point for data collection is everyone that the programs screen for possible entry into the jail diversion program. Once a person is deemed eligible for enrollment, and the court accepts the diversion plan (if applicable), then the person is enrolled in the jail diversion program. All of the data gathered about those individuals screened for possible diversion are collected and entered by Grantee staff using information obtained from the potential diversion program enrollees.


All diversion program enrollees are eligible to participate in the evaluation, and unless there are extenuating circumstances (such as difficulties locating the person or hospitalization), all enrollees are approached for consent to participate. Attachment H provides an example of a consent form. All enrollees agreeing to participate in the evaluation are expected to receive a baseline interview within seven days of enrollment, which can be conducted by Grantee staff. Participants are then contacted for 6- and 12-month follow-up interviews, which must be administered within 30 days on either side of the due date and are conducted by Grantee staff only. Attachment I provides an example of a follow-up interview reminder letter. Note that while program staff may administer baseline interviews, only Grantee staff who are not in any way involved in providing services to program participants administer follow-up interviews. This is done to protect the privacy of the participants and to ensure that no adverse effects result from a refusal to participate in the evaluation or from any responses given.
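
The scheduling rules just described (a baseline interview within seven days of enrollment, and 6- and 12-month follow-up interviews administered within 30 days on either side of the due date) amount to simple date arithmetic. The sketch below is illustrative only; the function and variable names are hypothetical, and in practice the Access-based tracking software described below performs this calculation for Grantee staff.

```python
# A minimal sketch of the interview windows: baseline within 7 days of enrollment,
# follow-ups due at 6 and 12 months, each acceptable within +/- 30 days of the due date.
from datetime import date, timedelta
from dateutil.relativedelta import relativedelta  # third-party: python-dateutil

def interview_windows(enrollment_date: date) -> dict:
    """Return the acceptable date range for each evaluation interview."""
    windows = {"baseline": (enrollment_date, enrollment_date + timedelta(days=7))}
    for months in (6, 12):
        due = enrollment_date + relativedelta(months=months)
        windows[f"{months}-month"] = (due - timedelta(days=30), due + timedelta(days=30))
    return windows

# Example: a hypothetical enrollee entering the program on March 1, 2012
for name, (start, end) in interview_windows(date(2012, 3, 1)).items():
    print(f"{name}: {start} through {end}")
```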


Participants completing a baseline interview as well as a 6- and/or 12-month interview will receive service use and arrest history record reviews, which are conducted by Grantee staff. Person Tracking data are collected on all enrollees regardless of participation in the evaluation and contain information obtained either directly or indirectly from the potential diversion program enrollees. For those enrollees participating in the evaluation, the Person Tracking data also include information about interview completion status (when each interview is due and whether/when it was completed).


Most Grantee staff will record participant information using a pencil-and-paper method, while most will collect client interviews through computer-assisted interview programs; this evaluation will not interfere with ongoing program operations. Grantee staff will submit electronic interview forms monthly, electronic tracking data extracts bimonthly, and electronic record review forms at least once annually. All data, except the non-identifiable Events Tracking data, are matched using a unique client identifier created by the Person Tracking program, as illustrated below.
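
The matching step could look something like the following. This is a hedged sketch only, assuming the submissions are exported as flat files keyed on the unique client identifier; the file and column names (interviews.csv, client_id, and so on) are hypothetical and do not describe the actual submission formats.

```python
# Illustrative merge of interview, tracking, and record review submissions on the
# unique client identifier generated by the Person Tracking program.
import pandas as pd

interviews = pd.read_csv("interviews.csv")       # baseline, 6-month, 12-month interviews
tracking = pd.read_csv("person_tracking.csv")    # enrollment and tracking data
records = pd.read_csv("record_review.csv")       # service use and arrest history

# Record review data may contain multiple rows per client (one per service or arrest),
# so summarize to one row per client before merging.
record_counts = records.groupby("client_id").size().rename("record_rows").reset_index()

merged = (
    interviews
    .merge(tracking, on="client_id", how="left", validate="many_to_one")
    .merge(record_counts, on="client_id", how="left", validate="many_to_one")
)
print(merged.shape)  # one row per interview record, with tracking and record data attached
```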


3) Methods to Maximize Response Rates

The expected minimum response rate for enrollment in the evaluation is 80 percent, as described above under 1) Respondent Universe and Sampling Methods. SAMHSA does not require a specific follow-up rate for the evaluation; however, Grantees are expected to maintain a minimum reassessment rate of 80 percent for the CMHS client-level National Outcome Measures (NOMS), and we expect Grantees to obtain a similar follow-up response rate for the evaluation. To support Grantees in achieving this rate, each jail diversion program receives an Access-based tracking database before data collection begins to assist in collecting required demographic and background information, tracking participants, and providing demographic information for the evaluation. The tracking software can provide current addresses for participants and indicate when they are due for a follow-up interview. The Baseline Person Tracking Program Information Form (Attachment E) is part of the tracking software and is intended to help interviewers contact participants for follow-up interviews.


Grantees are trained in best-practice methods for client retention and tracking. These techniques include frequent, varied outreach and contact with participants to help ensure that they complete the 6- and 12-month interviews. The methods vary from site to site and may include mailing a reminder letter; contacting friends, family, case managers, and/or therapists (with permission); dropping in on commonly visited locations (e.g., soup kitchens, shelters, AA meetings); and coordinating follow-up interviews with other scheduled appointments at the program. The contractor will provide technical assistance to Grantees to achieve a follow-up response rate of 80 percent or better.


After data collection is completed, we will evaluate and control for missing information due to attrition, as well as item non-response, in all analyses. We will employ statistical techniques, such as maximum likelihood estimation or multiple imputation, that can improve the consistency and efficiency of parameter estimates when data are missing; one such approach is sketched below. Any assumptions made (e.g., that data are missing at random (MAR)) will be evaluated and stated in any presentations of findings.
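
As one illustration of the techniques named above, a multiple imputation run might look like the following. This is a hedged sketch, assuming a merged analysis file with a handful of numeric variables; the column names are hypothetical, scikit-learn's IterativeImputer is used only as a stand-in for whichever imputation software the evaluators adopt, and a full analysis would pool both point estimates and variances under Rubin's rules.

```python
# Minimal multiple-imputation sketch: draw several completed datasets and pool a
# simple estimate across them. Column names are hypothetical placeholders.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401 (enables the import below)
from sklearn.impute import IterativeImputer

analysis = pd.read_csv("analysis_file.csv")   # hypothetical merged analysis file
numeric = analysis[["age", "basis24_total", "pcl_total", "jail_days"]]

estimates = []
for seed in range(5):  # five imputed datasets, a common default
    imputer = IterativeImputer(sample_posterior=True, random_state=seed)
    completed = pd.DataFrame(imputer.fit_transform(numeric), columns=numeric.columns)
    estimates.append(completed["pcl_total"].mean())

print("Pooled mean PCL score across imputations:", np.mean(estimates))
```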

4) Tests of Procedures

The measures included in the TCE Initiative’s interviews are a combination of mandated NOMS/GPRA items and additional non-GPRA mental health scales. All of the measures have been implemented and successfully administered under the prior OMB approval.


As part of the Evaluation Advisory Committee process, FY 2008 TCE Grantees had the opportunity to review and comment on the revised instruments and agreed to the inclusion of all items approved by OMB. All of the non-GPRA measures contained in the instruments have been pilot tested and/or are well-established data collection tools tested for validity and reliability. The main non-GPRA measures, along with their developmental background, are as follows:

  1. Traumatic Events – These screening questions for trauma are adapted from the Posttraumatic Diagnostic Scale (PDS) developed by Edna Foa and the DC Trauma Screening developed by Community Connections in Washington, DC. Both of these instruments are used by clinicians to determine an individual's trauma history, including the recentness of the trauma. The screen is a descriptive tool only and, as such, has no psychometric properties. For the TCE Initiative, its inclusion is intended to provide basic descriptive information about individual trauma levels.

  2. Posttraumatic Stress Checklist - The PCL is a 17-item self-report measure of Posttraumatic Stress Disorder (PTSD) symptoms based on DSM-IV criteria, with a 5-point Likert scale response format that rates the severity of each symptom over the past month. Continuous scores are used to assess symptom severity, and a cut-point of 3 (moderate severity) for each PTSD symptom is used to derive a PTSD diagnosis (a scoring sketch follows this list). The PCL has good psychometric properties. It has been found to be highly correlated with the Clinician Administered PTSD Scale (r = .929), the “gold standard” measure of PTSD, has good diagnostic efficiency (> .70), and has robust psychometric properties with a variety of trauma populations (1, 2). Among individuals with serious mental illness, high internal consistency of the PCL was reported (coefficient alpha = .94), along with moderate test-retest reliability (.66) and moderate convergent validity with the CAPS (κ = .67) (3). Given the brevity of the scale, along with its validity and reliability, the contractor agreed that this would be a useful measure of PTSD symptoms to include in the TCE Initiative’s evaluation.

  3. Behavior and Symptom Identification Scale (BASIS-24) - This 24-item self-report tool is used to assess change in mental health symptoms and behavioral distress following treatment. The instrument covers six domains (depression/functioning, difficulty in interpersonal relationships, self-harm, emotional lability, psychotic symptoms, and substance abuse), and an overall mental health score is computed. Its predecessor, the BASIS-32, is a widely used and tested behavioral health tool. In addition to being shorter, the BASIS-24 is more comprehensive, cutting across diagnoses by identifying a wide range of symptoms and problems that occur across the diagnostic spectrum. Validated and found reliable in inpatient, residential, and outpatient settings, the BASIS-24 assesses treatment outcomes from the patient perspective4. The instrument, along with other options, was reviewed by the Evaluation Advisory Committee and approved for inclusion in the proposed revised instrument.

  4. Military Service and Combat Experience Questions - These questions were adapted from several sources and reviewed by Evaluation Advisory Committee members with experience and knowledge in this area. The military service questions were developed from the survey items used in the RAND monograph Invisible Wounds of War5 and from the veteran questions used by the Office of Justice Programs, Bureau of Justice Statistics6. The RAND monograph presents the results of a comprehensive study, conducted between April 2007 and January 2008, of the post-deployment health needs associated with PTSD, major depression, and traumatic brain injury among Operation Enduring Freedom/Operation Iraqi Freedom (OEF/OIF) veterans. The BJS report presents data on the military and criminal backgrounds of incarcerated veterans. The combat experience questions were adapted from the Hoge et al. (2004)7 study of military combat duty in Iraq and Afghanistan and the associated mental health problems. These questions have been used in other studies of returning military personnel because of the broad range of experiences they cover, which include not only being hurt or hurting someone but also the aftermath of combat. A version of the combat experience questions was also used in the RAND monograph, and the questions drafted for inclusion were reviewed for relevance and appropriateness by consumer veterans and a representative from the Veterans Administration.

  5. CAGE - The Cut down, Annoyed, Guilty, Eye-Opener (CAGE)8 is a four-item screening tool used to detect alcohol abuse or dependence. It is structured in a “have you ever” format that is applicable to the interviewee’s past or present. Answering yes to two or more questions provides a strong indication of alcohol abuse or dependence (see the scoring sketch following this list). The tool’s validity has been demonstrated with substance-abusing populations, and it is commonly used in criminal justice settings. It is short, easy to remember, and easy to incorporate into an interview (NIAAA, 2002)9.

  6. Lifetime Mental Health and Substance Use Questions - These eight questions were developed to assess clients’ participation in treatment during their lifetime. Four areas are covered for each of mental health and substance abuse services: outpatient services, inpatient services, peer-supported services, and age at first contact. These questions were based on previous instruments, including the CSAT/CMHS Homeless Families Study and the CMHS Support Housing Study. This information is primarily for descriptive purposes.

  7. Lifetime Criminal Justice Questions - Six criminal justice questions were included to assess the client’s previous involvement with the criminal justice system beyond the past year. These include age at first arrest, prior experience on probation, number of times incarcerated, and history of restraining/protection orders. While the purpose of these items is descriptive, their inclusion may help distinguish between successful and unsuccessful program clients. These questions were based on previous SAMHSA cross-site instruments, including the CSAT/CMHS Homeless Families Study and the CMHS Support Housing Study.

  8. Recovery Markers from the Recovery Enhancing Environment Measure (REE) - The REE Recovery Markers scale is a 23-item self-assessment of personal recovery. The instrument examines personal recovery by focusing on the markers of recovery (immediate outcomes) that respondents currently experience, including motivation, goals, social role reclamation, basic needs such as housing and income, symptom self-management, physical health, quality of life, personal strengths, and positive relationships. Two formal pilot tests have been conducted on the REE10, and preliminary analyses indicate that the instrument is psychometrically sound; coefficient alphas for the subscales range from .77 to .98. Additionally, the REE adds an important new element to client outcomes by focusing on clients’ well-being instead of symptoms.
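
The scoring rules stated above for the PCL (item 2) and the CAGE (item 5) can be summarized in a few lines of code. The sketch below is illustrative only: it implements exactly the rules described in this section (PCL items rated 1-5, a rating of 3 or higher counting a symptom as endorsed, and two or more yes answers on the CAGE indicating probable alcohol abuse or dependence). The function names and example responses are hypothetical, and the full DSM-IV cluster rules for a provisional PTSD diagnosis are not reproduced here.

```python
# Minimal scoring sketch for the PCL and CAGE, following the rules described above.
from typing import List

def score_pcl(item_ratings: List[int]) -> dict:
    """Score the 17-item PTSD Checklist (each item rated 1-5)."""
    assert len(item_ratings) == 17, "the PCL has 17 items"
    return {
        "severity_total": sum(item_ratings),                          # continuous severity, 17-85
        "symptoms_endorsed": sum(1 for r in item_ratings if r >= 3),  # cut-point of 3 per symptom
    }

def positive_cage_screen(yes_answers: List[bool]) -> bool:
    """Return True when two or more of the four CAGE items are answered yes."""
    assert len(yes_answers) == 4, "the CAGE has 4 items"
    return sum(yes_answers) >= 2

# Example with hypothetical responses
print(score_pcl([3, 2, 4, 1, 2, 3, 3, 1, 2, 2, 4, 3, 2, 1, 3, 2, 2]))
print(positive_cage_screen([True, False, True, False]))
```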


The non-interview forms (for events tracking, person tracking, service use, and arrest history data collection) collect commonly used descriptive and/or publicly available information. As with the interview forms, the Grantees on the Evaluation Advisory Committee provided feedback about the information in, and format of, the tracking and record review forms. These forms collect the same information as the previously approved forms; however, the formatting has changed. These forms were implemented and successfully completed by all prior cohorts of Grantees.


As described above, the FY 2008 Grantees had the opportunity to review the revised interviews, including the new items described above, and agreed to include all items previously approved by OMB. These revised interviews will be administered by all Grantees awarded in FY 2008, FY 2009, and FY 2010.




5) Statistical Consultants


Contractors/Statistical Consultants:

Kristin Stainbrook, Ph.D., Deputy Director of Research
Advocates for Human Potential, 41 State Street, Suite 500, Albany, NY 12207
Phone: (518) 729-1241; Email: [email protected]

Jenneth Carpenter, Ph.D., Senior Research Associate
Advocates for Human Potential, 41 State Street, Suite 500, Albany, NY 12207
Phone: (508) 202-5571; Email: [email protected]

Nick Huntington, M.A., Senior Analyst
Advocates for Human Potential, 490-B Boston Post Road, Sudbury, MA 01776
Phone: (978) 261-1453; Email: [email protected]

Laura Elwyn, Ph.D., Senior Research Associate
Advocates for Human Potential, 41 State Street, Suite 500, Albany, NY 12207
Phone: (518) 729-1221; Email: [email protected]

Steven Sullivan, Ph.D., Consultant
The Cloudburst Group, 8100 Corporate Drive, Suite 320, Landover, MD 20785-2231
Phone: (301) 918-4400; Email: [email protected]



Federal Project Officers/Statistical Consultants:

David Morrissette, Ph.D., Government Project Officer
Center for Mental Health Services, SAMHSA
Phone: (240) 276-1912; Email: [email protected]



List of Attachments:

  A. Baseline Interview

  B. 6-Month Interview

  C. 12-Month Interview

  D. Event Tracking Screen

  E. Baseline Person Tracking Information Form

  F. Service Use Data Collection Form

  G. Arrest Data Collection Form

  H. Example of a Consent Form

  I. Follow-up Interview Reminder Letter


1 Andrykowski, M.A., Cordova, M.J., Studts, J.L., & Miller, T.W. (1998). Posttraumatic stress disorder after treatment for breast cancer: Prevalence of diagnosis and use of the PTSD Checklist–Civilian Version (PCL–C) as a screening instrument. Journal of Consulting and Clinical Psychology, 6, 586–590.

2 Blanchard, E.B., Jones-Alexander, J., Buckley, T.C., & Forneris, C.A. (1996). Psychometric properties of the PTSD Checklist (PCL). Behaviour Research and Therapy, 34, 669-673.

3 Mueser, K.T., Rosenberg, S.D., Fox, L., Salyers, M.P., Ford, J.D., & Carty, P. (2001). Psychometric evaluation of trauma and posttraumatic stress disorder assessments in persons with severe mental illness. Psychological Assessment, 13(1), 110-117.

4 Eisen, S.V., Normand, G.R., Belanger, et al. (2004). The Revised Behavior and Symptom Identification Scale: Reliability and validity. Medical Care, 42, 1230-1241.

Eisen, S.V., Ranganathan, G., Seal, P., & Spiro, A. (2007). Measuring clinically meaningful change following mental health treatment. Journal of Behavioral Health Services & Research, 43(3), 272-289.

5 Tanielian, T. & Jaycox, L.A., Eds. (2008). Invisible wounds of war: Psychological and cognitive injuries, their consequences, and services to assist recovery. Santa Monica, CA: RAND Center for Military Health Policy Research.

6 Noonan, M.E., & Mumola, C.J. (2007, May). Veterans in state and federal prisons, 2004 (Special Report, NCJ 217199). Washington, DC: U.S. Department of Justice, Bureau of Justice Statistics.

7 Hoge, C.W., Castro, C.A., Messer, S.C., McGurk, D., Cotting, D.I., & Koffman, R.L. (2004). Combat duty in Iraq and Afghanistan, mental health problems, and barriers to care. New England Journal of Medicine, 351, 13-22.

8 Isaacson, J.H., & Schorling, J.B. (1999). Screening for alcohol problems in primary care. Medical Clinics of North America, 83(6), 1547-1563.

9 National Institute on Alcohol Abuse and Alcoholism (NIAAA). Screening for alcohol problems—an update [Internet]. Bethesda, MD: National Institute on Alcohol Abuse and Alcoholism; 2002 Apr [accessed May 20, 2009]. (Alcohol Alert; 56). Available from: http://pubs.niaaa.nih.gov/publications/aa56.htm


10 Ridgeway, P., Press, A., Ratzlaff, S., Davidson, L., & Rapp, C. (2003). Reports on the field testing of the Recovery Enhancing Environment Measure. Lawrence, KS: School of Social Welfare, Office of Mental Health Research and Training.

Ridgeway, P., Press, A., Anderson, D., & Deegan, P.E. (in preparation). Pilot testing the Recovery Enhancing Environment Measure: The Massachusetts experience. Byfield, MA: Pat Deegan and Associates.


