Evaluation of Veterans Health Administration Mental Health Services

OMB: 2900-0713


OMB Supporting Statement


A. Justification


1. Need for Information


This project is being conducted for and funded by the U.S. Department of Veterans Affairs (VA) Office of Policy and Planning as a legislated evaluation of the mental health services provided by the Veterans Health Administration (VHA). It is covered under the requirements of P.L. 103-62, the Government Performance and Results Act (GPRA) of 1993; Title 38, section 527, Evaluation and Data Collection; and 38 CFR section 1.15, Standards for Program Evaluation. The GPRA requires Federal government agencies to evaluate their performance on a regular basis. On August 2, 2006, VA awarded contract no. 101-G67214/101-G67215 to the Altarum Institute to conduct a formal, independent evaluation with the RAND-University of Pittsburgh Health Institute (RUPHI) as a subcontractor. The evaluation will be responsive to the changing emphasis on mental health resulting from the implementation of the VA Mental Health Strategic Plan (MHSP) and the report of the President's New Freedom Commission on Mental Health, which in part catalyzed development of the MHSP. The VHA Office of Research Oversight and the VHA Office of Research and Development have reviewed the project and determined it to be a program evaluation, not a research project.


2. Use of Information


The information collected through this evaluation under GPRA will assist the VA in assessing the extent to which program goals are being met and, when performance falls short, in developing effective policy and program recommendations that policymakers can implement quickly to improve performance. Outcomes developed and used for analysis will relate to the goals expressed in the MHSP and will respond to the Deputy Secretary’s request for information about the breadth of mental health patient care services.


Specific goals of program evaluations are as follows:

  • Assess the extent to which program outcome goals are being met.

  • Assess the needs and requirements of veterans to ensure that the nature and scope of future benefits and services are aligned with their changing needs and expectations.

  • Assess the adequacy of outcomes and outcome measures in determining the extent to which the programs are achieving intended purposes and goals.


To achieve these program evaluation goals, the VA has contracted with Altarum/RUPHI to provide information that will answer the following research questions:


  1. To what extent is VA achieving the program outcomes for veterans with a diagnosis of post-traumatic stress disorder (PTSD), schizophrenia, major depressive disorder (MDD), bipolar disorder, and substance use disorder (SUD)?


  2. When appropriately adjusted, how do the specified outcomes for VA patients (with the diagnoses included in the study) compare to outcomes for comparable veterans treated in non-VA-funded public and private sector care?


  3. How does the available continuum of mental health care services compare across VA medical centers (VAMCs)? How does the care at each VAMC compare to the range of mental health services at the same site in the first half of FY05, in terms of programs offered, staffing level, and mental health workload? Are mental health services across the full continuum of care available to all veterans (with the diagnoses included in this study) who need them? How does the care at each VAMC compare to local non-VA services?


  4. Question dropped from original VA study.


  5. When there is a dual diagnosis (i.e., SUD co-occurring with another diagnosis included in this study), are both conditions being managed, and if so, how?


  6. What factors influence the use of VA specialty mental health services by veterans service-connected for the diagnoses covered in this survey? What are the barriers, if any, to access to care? What non-VA services are used by veterans service-connected for these diagnoses?


  7. In the context of recommendations from the President’s New Freedom Commission report, how widespread is use of the strongest evidence-based models of care for each of the diagnoses? In particular, are psychosocial approaches to care fully implemented, in concert with psychopharmacology approaches as needed, to support a recovery-oriented model that seeks to return veterans to full roles in the community, at work, and with their families? The specific evidence-based care for each diagnosis will need to be identified, but for at least some of the diagnoses will include approaches such as cognitive behavioral therapy, social skills training, supported employment with individual assistance, mental health intensive case management (or other), and family education in VA.


  8. Question dropped from original VA study.


The project team has conceptualized the evaluation in terms of the Donabedian (1980) quality of care model, which includes the components of structure, process, and outcomes of care, and has aligned the VA’s research questions with these three components of this model. The team has also developed an optimally efficient, cost-effective data collection methodology for each component of the model, and each related research question, that utilizes existing VA data to the fullest extent possible and augments the existing data with additional collection only as needed. This approach can be summarized as follows:


  1. Structure of Care: What Services Are Available To Veterans?

Research Questions with Major Connection: 1, 2, 3, 6, 7

Data Sources: Facility Survey & Administrative Data


  2. Process of Care: What Services Do Veterans Receive?

Research Questions with Major Connection: 1, 2, 5, 6

Research Questions with Some Connection: 3, 7

Data Sources: Administrative Data & Medical Record Data


  3. Outcomes of Care: Does It Make A Difference?

Research Questions with Major Connection: 1, 2

Research Questions with Some Connection: 3, 6

Data Sources: Client Survey & Medical Record Data


The data to be collected via the client survey will be used by the VA to answer the critical outcomes question, “Does it make a difference?”, and to gather information on additional indicators that cannot be reliably collected from other project data sources. The survey will yield information to understand the functional status of veterans and their perceived improvement, their decision to use VA mental health services, their perceived need for services, their satisfaction with care and perceived timeliness of care, and their receipt of employment assistance and residential assistance. As guided by the Donabedian model, achieving an understanding of the structure, process, and outcomes of care is necessary to improve quality of care. If, for example, VISNs vary in terms of the timeliness of care and this variation is explained by certain characteristics of the settings or services provided, VHA can use this information to improve timeliness and thus the quality of care.


3. Involvement of Information Technology


The project team will use automated and electronic systems for data collection.


The RAND Survey Research Group (SRG) will conduct interviews using Computer Assisted Telephone Interviewing (CATI). Data quality will be enhanced by programming skip patterns; administering particular questions based on answers to previous questions and/or pre-filled information, such as whether participants are service-connected; and eliminating the need for keying data. We will also build range and consistency checks into the programming to maximize accuracy. We will use electronic systems to schedule calls at various times of the day and days of the week to reach respondents when they are available. The use of CATI reduces respondent burden by removing the need for the interviewer to read and follow skip instructions, making the interview quicker and smoother for the respondent.
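The skip-pattern and range-check logic described above can be sketched as follows. This is a hypothetical illustration in Python, not the production CATI software; the question identifiers, routing rules, and valid ranges are assumptions made for the example.

```python
# Hypothetical sketch of CATI-style routing and range checks.
# Question IDs, skip rules, and ranges are illustrative, not the real instrument.

def next_question(current_id, answer, prefill):
    """Decide which question to administer next, based on the answer
    just given and pre-filled sample information."""
    if current_id == "Q1_used_va_mh_services":
        # Non-users skip the VA-satisfaction block entirely.
        return "Q2_satisfaction" if answer == "yes" else "Q5_barriers"
    if current_id == "Q2_satisfaction":
        # Service-connected status is pre-filled from the sample file,
        # so the interviewer never has to ask it.
        return "Q3_timeliness" if prefill.get("service_connected") else "Q4_other"
    return "END"

def in_range(question_id, answer, ranges):
    """Range check: flag keyed values outside the allowed interval."""
    lo, hi = ranges[question_id]
    return lo <= answer <= hi

RANGES = {"Q3_timeliness_days": (0, 365)}
```

Because routing is computed from the answer and pre-filled data, the interviewer never reads or follows skip instructions manually, which is the burden reduction the paragraph above describes.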



4. Efforts to Identify Duplication


As described in detail in Item 2, the project team is collecting several types of data to meet the VA’s program evaluation needs and to respond to the specific research questions required: facility survey data, administrative data, medical record data, and client survey data. To the extent possible, the team is utilizing existing VA data to meet the required goals of the evaluation. However, the full range of necessary performance indicators cannot be adequately assessed using administrative or medical record data alone. To understand, for example, patient satisfaction with care, timeliness of care, and veterans’ decision to use VA mental health services, it is critical to collect client-level survey data as well. Additionally, although functioning can be documented in the medical record, such documentation is highly dependent on the thoroughness and accuracy of the clinician’s recording. Contradictory results were demonstrated in VA when medical record abstractions were compared to client surveys for people with schizophrenia (Young et al., 1998; Cradock et al., 2001). Without the additional information obtained from the client surveys, differences in medical record documentation would have led to incorrect identification of poor quality care.


There will be minimal duplication based on the nature and scope of this survey. There are only a few other national VA surveys, including the Survey of Health Experiences of Patients (SHEP) (OMB #2900-0227) and the Annual Survey of Enrollees’ Health and Reliance Upon VA (OMB #2900-0609). SHEP is a cross-sectional survey conducted each month by mail with a stratified random sample of veterans seen as outpatients in VA primary and specialty care clinics and veterans discharged from inpatient care. The SHEP pertains to the most recent episodes of inpatient and outpatient care. For inpatient care, the sample size is 12,600/month, and for outpatient care the sample size is 36,600/month. The Annual Survey of Enrollees has been fielded four times since 1999 (42,000 completes in 2005) and is a telephone interview with a stratified random sample of enrolled veterans.


While these existing surveys assess, for example, veterans’ satisfaction with services, they do not include all the domains that have been identified as critical for inclusion in the current survey. For example, decision to use VA mental health services, recovery, and receipt of housing assistance are not addressed in existing national surveys of veterans. Further, our sampling method will focus on veterans who are service connected for one of five mental health diagnoses and will allow comparisons across VISNs and diagnoses. 


5. Impact on Small Business


Not applicable; the client survey will involve individual veterans only.


6. Consequences of Not Collecting the Information


If the client survey data are not collected, the project team will not have adequate information to answer VA research questions 1, 2, 3, and 6 or to properly assess the critical outcomes question: “Does it make a difference?” In addition, no patient-level comparison data will be available to assess differences in the care provided by VHA and non-VHA facilities. Inclusion of all planned data sources to yield information about structure, process, and outcomes is necessary to achieve a complete representation of quality of care. Without these data sources, the VA will not be able to fully assess the extent to which program goals are being met, nor will it be able to take the most effective actions for improvement when performance appears to fall short of intended program goals. Given that the client survey will be administered only once to each participating veteran, there is no feasible way to reduce the respondent burden by collecting data less frequently.


7. Special Circumstances


This project involves none of the special circumstances listed in the documentation.


8. Adherence to 5 CFR 1320.8(d) and Outside Consultations


The notice of proposed information collection activity was published in the Federal Register on August 22, 2007, pages 47127-47128. No comments were received. In addition, the project team has elicited feedback from members of the mental health field regarding the content and psychometric properties of the survey, and from individual consumers regarding the clarity of the instructions.


9. Provision of Payments or Gifts to Respondents


All respondents will be remunerated for their participation in the survey in the amount of $10, in the form of a check mailed after the interview is completed. Research has shown that monetary incentives are more effective in increasing response rates than non-monetary incentives. Because we anticipate that this population may be mobile and difficult to locate, mailing the money in advance would carry an increased risk of its not reaching the right recipient.


Past experience: The RAND SRG has a great deal of experience interviewing the seriously mentally ill (SMI) population, and its excellent response rates in this generally lower-income population are due in part to monetary compensation for participation.


Data quality: One of the challenges in this population will be to encourage people to remain on the phone for 30 minutes to complete the consent process and survey. We believe that the monetary incentive will improve data quality by motivating some respondents to continue answering questions until the interview is completed, thus limiting missing items.


Burden: For this population, remaining on the phone and attentive for 30 minutes will require some respondents to exert considerable effort in focusing their attention on the questions at hand. Offering this modest remuneration will motivate some to remain on the phone with us.


10. Assurance of Confidentiality


The RAND SRG will adhere to all RAND policies for assuring confidentiality. RAND will assure the respondent of confidentiality, in plain language, in the advance letter that will be mailed to each potential respondent about two weeks before the telephone call. The letter will be written at approximately a 6th-grade reading level. In the introduction to the telephone survey, the respondent will be reminded of the voluntary nature of participation and of our assurance of confidentiality. The one exception to this assurance will also be stated on the phone: if the respondent indicates ongoing child or elder abuse, or a current intent to harm himself/herself or others, the interviewer is required to report it. Assurance of confidentiality will again be stated in the phone script before the start of the interview.


11. Justification for Sensitive Questions


As previously stated, the client survey will assess, for example, veterans’ functional status, decision to use VA mental health services, and satisfaction with care received. None of these items will assess sexual behavior or attitudes, or religious beliefs. Veterans will, however, be asked to report how they are feeling (e.g., downhearted) and about emotional problems or symptoms. These topics may be considered sensitive, but this information is necessary to include in the survey to answer the study’s research questions.


12. Estimate of Respondent Burden


We estimated that it would require an average of 25 minutes to complete this survey with the SMI population, plus five minutes for the introduction, responding to respondents’ questions, and verifying confidentiality and consent (see Table 1). We confirmed these estimates by conducting pretests with eight individual veterans in October 2007. Pretest participants included African American, Hispanic, and non-Hispanic veterans of various ages with diagnoses including schizophrenia, PTSD, and depression.


Table 1: Current estimate of respondent burden confirmed with pretests in October 2007.


Type of Respondent      No. of Respondents      Estimated Time Per Respondent      Total Hours
Veterans w/ SMI         8,218                   0.50 hr (30 min.)                  4,109
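The total-hours figure in Table 1 is simply the respondent count multiplied by the per-respondent time; the arithmetic can be checked directly (a trivial sketch using only the figures from the table):

```python
# Burden arithmetic from Table 1: respondents x hours per respondent.
respondents = 8218
hours_per_respondent = 0.50  # 30 minutes
total_hours = respondents * hours_per_respondent
print(total_hours)  # 4109.0
```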






13. Estimate of Total Annual Cost Burden


Respondents will be mailed an advance letter and a reminder card and will receive a telephone call from a RAND interviewer. Therefore, there is no direct cost to the respondent. Respondents will not be asked to maintain records nor will they incur any monetary costs in completing the survey.


14. Estimate of Annualized Costs to the Federal Government


Survey costs: The estimated cost for attempting phone interviews with approximately 15,000 veterans and completing interviews with approximately 8,000 is $1,725,435 over one year of data collection and approximately an additional 9 months of planning, programming and preparation. The estimated costs include labor costs, fringe expenses, administrative expenses, respondent payments, and costs associated with reproduction of letters, postage, and telephone expenses.


15. Changes from OMB Form 83-I


This is the study’s first submission; there is no change in burden hours to report.


16. Plans for Tabulation and Publication


Through the proposed survey, the project team will collect additional data at the patient level in order to respond to the VA research questions not fully covered by other non-patient data collection efforts conducted as part of this evaluation. This additional data collection includes a review of medical records of about 11,448 veterans (to be conducted by a third party subcontractor) and the client survey to be conducted by the RAND SRG. Comparison data from veterans eligible for VHA care but who are not using VHA care will be obtained from the client survey only. The team will also re-administer the facility survey conducted in Phase 1. The results of this data collection and the accompanying analyses will be used to produce a patient outcome database for delivery to VA.


The patient-level data collected from the client survey will be analyzed to produce estimates and basic descriptive statistics, examine the variability of responses to questions, and compute correlations and cross-tabulations of responses. If we find that our sample does not reflect the underlying population, we will derive and apply sampling weights in the analyses.
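One common way to derive the sampling weights mentioned above is post-stratification: each respondent is weighted by the ratio of a stratum’s population share to its share of the completed sample. The sketch below illustrates the idea; the strata and counts are hypothetical, not study figures.

```python
# Hypothetical post-stratification weights: population share / sample share.
# Strata and counts are illustrative only, not study data.

def poststrat_weights(pop_counts, sample_counts):
    """Return a weight per stratum so that the weighted sample matches
    the population distribution across strata."""
    pop_total = sum(pop_counts.values())
    samp_total = sum(sample_counts.values())
    return {
        stratum: (pop_counts[stratum] / pop_total)
                 / (sample_counts[stratum] / samp_total)
        for stratum in pop_counts
    }

# Example: PTSD is under-represented in this hypothetical sample,
# so it is weighted up; schizophrenia is over-represented, weighted down.
pop = {"PTSD": 6000, "schizophrenia": 4000}
samp = {"PTSD": 300, "schizophrenia": 500}
w = poststrat_weights(pop, samp)
# w["PTSD"] = 0.60 / 0.375 = 1.6; w["schizophrenia"] = 0.40 / 0.625 = 0.64
```

Applied to the sample, these weights restore the population’s 60/40 split across the two strata, which is exactly the correction described in the paragraph above.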


The project team will only use the data for the designated tasks within the contract. Data and information obtained under the contract will not be used to create databases or any other product not intended for use specifically for the project. All survey and data collection instruments, draft and final reports, all data files and associated working papers, and all other materials deemed relevant by VA which have been generated by or provided to the project by the VA in the performance of this contract are the exclusive property of the U.S. Government and are to be submitted to the COTR at the completion of the contract.


Project team leaders will conduct consultations with key stakeholder groups on the design, progress, and findings of the program evaluation. Additional management briefings will be delivered to VA top program officials and other VA stakeholders on the major findings of the evaluation and proposed recommendations.

The project team will prepare draft and final reports for VA which cover the full scope of all three phases of the project, including all related appendices, tables, databases, surveys, and software.


The project team also anticipates requesting permission from the VA to publish the methods and results of the evaluation in peer-reviewed journals at the completion of the study. This is permitted by the contract as follows: "The Assistant Secretary for Policy, Planning, and Preparedness or his authorized designee will be the sole authorized official to release (orally, electronically, or in writing) any data, the draft or final deliverables, or any other written or printed materials pertaining to this contract. The project members will release no information. Requests for journal publication are included in this request. Any request for information about this contract presented to the project team will be submitted to the Contracting Officer for response." 


The timeline for tasks related to the proposed data collection, analyses, deliverables to VA, and related publications is as follows:


  • Develop sample file and finalize sampling frame: August 2007-February 2008

  • Update contact information in sampling frame: May 2008-May 2009

  • Select and locate survey participants: February 2008-May 2008

  • Notify potential survey participants about the study and the forthcoming telephone survey: June 2008-June 2009

  • Conduct client survey: July 2008-June 2009

  • Obtain comparison data via client survey: July 2008-June 2009

  • Develop Program Outcome Database

    • Analyze administrative data: August 2007-April 2008

    • Conduct medical record review: May 2008-June 2009

  • Revise program outcome database: October-December 2009

  • Conduct stakeholder consultations: August 2007-June 2010

  • Re-administer facility survey: September-October 2009

  • Analyze facility survey results: October-November 2009

  • Meet human subjects, IT/data security, and other related ethical requirements: Ongoing throughout Phase 3

  • Draft final report to VA: August 2007-March 2010

  • Prepare and deliver final report to VA: February 2010-June 2010

  • Conduct management briefings: May-June 2010

  • Produce related peer-reviewed publications: Post June 2010

17. Expiration Date


The expiration date of the OMB approval will be displayed in the survey notification letter and provided to interviewers, so that they may provide it to respondents who ask.


18. Exceptions to the Certification Statement


There are no exceptions identified in Item 19, “Certification for Paperwork Reduction Act Submissions,” of OMB Form 83-I.




File Type: application/msword
File Title: SUPPORTING STATEMENT
Author: IST
File Modified: 2008-01-10
File Created: 2008-01-10
