Supporting Statement revised 9-21-07


Independent Evaluation of the Substance Abuse Prevention and Treatment Block Grant Program

OMB: 0930-0291


Independent Evaluation of the Substance Abuse Prevention and Treatment Services Block Grant


Supporting Statement



A. JUSTIFICATION


1. Circumstances of Information Collection.


The Substance Abuse and Mental Health Services Administration’s Center for Substance Abuse Treatment (CSAT) and Center for Substance Abuse Prevention (CSAP) are requesting approval from the Office of Management and Budget (OMB) for two interview protocols (one for Federal staff and one for State staff) and two surveys (one for State Technical Reviewers and one for State Prevention Synar System Reviewers). These instruments will collect information from key Federal, State, and community stakeholders involved in the implementation and oversight of the Substance Abuse Prevention and Treatment Block Grant (SAPT BG) Program. The data gathered on SAPT BG Program processes and outcomes will be used to evaluate the extent to which the program is meeting its intended goals and the means by which it is doing so. This data collection effort constitutes the backbone of the first independent evaluation of the SAPT BG Program. The SAPT BG Program is authorized by sections 1921–1954 of the Public Health Service Act.


History and Legislative Requirements


The SAPT BG Program was originally created by the Omnibus Budget Reconciliation Act of 1981 (Public Law 97-35) as one of several block grant programs introduced to provide a formula-based distribution of Federal funds to States, increase State flexibility, and reduce the administrative burden on the Federal Government. This block grant was part of a larger effort to strengthen the Federal response to drug abuse, alcohol abuse, and mental illness by (1) reorganizing the Alcohol, Drug Abuse, and Mental Health Administration (ADAMHA); (2) authorizing or reauthorizing a series of prevention and treatment services programs such as the Alcohol, Drug Abuse, and Mental Health Services Block Grant; and (3) authorizing a medications development research initiative.


Several major legislative actions have taken place since the Alcohol, Drug Abuse, and Mental Health Services Block Grant was initiated in 1981. From 1982 to 1989, various legislative modifications were made to the block grant program. The ADAMHA Reorganization Act of 1992 assigned responsibility for administering the SAPT BG Program to the Substance Abuse and Mental Health Services Administration (SAMHSA), a services agency created when the three research institutes housed in ADAMHA were transferred to the National Institutes of Health. The act also created a separate block grant for mental health services and treatment, the Community Mental Health Services Block Grant. The SAPT BG Program is administered by CSAT’s Division of State and Community Assistance in collaboration with CSAP’s Division of State and Community Systems Development.

The purpose of the SAPT BG Program is to provide funds to States, Territories, and one Native American Tribe to plan, carry out, and evaluate activities to prevent and treat substance abuse, along with other allowable activities. The SAPT BG Program constitutes an average of 40 percent of States’ budgets for substance abuse prevention and treatment services and activities and is the primary source of Federal funding for these services. States have flexibility in determining how funds should be allocated, but specific set-aside and maintenance-of-effort requirements must be met in order to receive funding. These requirements, introduced by the ADAMHA Reorganization Act of 1992 and amended by the Children’s Health Act of 2000, are listed in Table 1 below:

Table 1. SAPT BG Program Set-Aside Provisions [a]

Category | Set-Aside Provision
Prevention and treatment activities regarding alcohol | Not less than 35 percent of SAPT BG Program funding*
Prevention and treatment activities regarding other drugs | Not less than 35 percent of SAPT BG Program funding*
Primary prevention programs | Not less than 20 percent of SAPT BG Program funding
Pregnant women and women with dependent children | Not less than an amount equal to the expenditure in FY 1994
Tuberculosis services | No set amount, but services must be provided to receive SAPT BG Program funds
HIV services [b] | No more than a 5 percent increase over the State allotment for HIV services in FY 1991
Prohibition of sale of tobacco to individuals under age 18 (Synar amendment) | The State must enforce a law against the sale of tobacco to underage individuals to receive SAPT BG Program funds; noncompliance leads to a 10 percent reduction in funds the first applicable fiscal year, 20 percent the second year, 30 percent the third year, and 40 percent the fourth year
Maintenance of effort (MOE) for State expenditures | The State must maintain funding at no less than the average level of expenditures for the 2 years preceding the fiscal year for which the State is applying
Administrative expenses | Limited to 5 percent of SAPT BG Program funding

[a] The set-asides shown in this table were included in the 1992 SAPT BG Program authorizing legislation (42 U.S.C. 300x-21 to 300x-35). The set-asides marked with asterisks (*), however, were removed by the Children’s Health Act of 2000 (Public Law 106-310, Sec. 3303(a)(1)).

[b] Applies to designated States whose rate of AIDS cases is 10 or more per 100,000 individuals, as confirmed by the Centers for Disease Control and Prevention.


In addition to the set-asides, States must address the following 17 goals of the SAPT BG Program in order to receive this Federal funding:


Table 2. Federal Goals for the Substance Abuse Prevention and Treatment Block Grant

GOAL #1: Continuum of substance abuse treatment services

The State shall expend block grant funds to maintain a continuum of substance abuse treatment services that meet the needs for services identified by the State (see 42 U.S.C. 300x-21(b) and 45 C.F.R. 96.122(f)(g)).

GOAL #2: Spending on primary prevention programs

The State agrees to spend not less than 20 percent on primary prevention programs for individuals who do not require treatment for substance abuse, specifying the activities proposed for each of the six strategies (see 42 U.S.C. 300x-22(b)(1) and 45 C.F.R. 96.124(b)(1)).

GOAL #3: Spending on services for pregnant women and women with dependent children

The State agrees to expend not less than an amount equal to the amount expended by the State for FY 1994 to establish new programs or expand the capacity of existing programs to make available treatment services designed for pregnant women and women with dependent children and, directly or through arrangements with other public or nonprofit entities, to make available prenatal care to women receiving such treatment services and, while the women are receiving services, child care (see 42 U.S.C. 300x-22(c)(1) and 45 C.F.R. 96.124(c)(e)).

GOAL #4: Treatment for intravenous drug abusers

The State agrees to provide treatment to intravenous drug abusers that fulfills the 90 percent capacity reporting, 14- to 120-day performance requirement, interim services, outreach activities and monitoring requirements (see 42 U.S.C. 300x-23 and 45 C.F.R. 96.126).

GOAL #5: Tuberculosis services for people in substance abuse treatment

The State agrees, directly or through arrangements with other public or nonprofit private entities, to make tuberculosis services available routinely to each individual receiving treatment for substance abuse and to monitor such service delivery (see 42 U.S.C. 300x-24 and 45 C.F.R. 96.127).

GOAL #6: Early intervention services for HIV for people in substance abuse treatment

Designated States agree to provide treatment for persons with substance abuse problems with an emphasis on making available within existing programs early intervention services for HIV in areas of the State that have the greatest need for such services and to monitor such service delivery (see 42 U.S.C. 300x-24(b) and 45 C.F.R. 96.128).

GOAL #7: Group homes for recovering substance abusers

Designated States agree to provide for and encourage the development of group homes for recovering substance abusers through the operation of a revolving loan fund (see 42 U.S.C. 300x-25 and 45 C.F.R. 96.129).

GOAL #8: State efforts to reduce the availability of tobacco products

The State agrees to continue to have in effect a State law that makes it unlawful for any manufacturer, retailer, or distributor of tobacco products to sell or distribute any such product to any individual under the age of 18 and to enforce such laws in a manner that reasonably can be expected to reduce the extent to which tobacco products are available to individuals under age 18 (see 42 U.S.C. 300x-26 and 45 C.F.R. 96.130).

GOAL #9: Preferential admission of pregnant women to substance abuse treatment

The State agrees to ensure that each pregnant woman be given preference in admission to treatment facilities and, when the facility has insufficient capacity, to ensure that the pregnant woman be referred to the State, which will refer the woman to a facility that does have the capacity to admit the woman or, if no such facility has the capacity to admit the woman, will make available interim services within 48 hours (see 42 U.S.C. 300x-27 and 45 C.F.R. 96.131).

GOAL #10: Improved process for referring individuals to substance abuse treatment

The State agrees to improve the process in the State for referring individuals to the treatment modality that is most appropriate for the individual (see 42 U.S.C. 300x-28 and 45 C.F.R. 96.132(a)).

GOAL #11: Continuing education for employees at substance abuse prevention and/or treatment facilities

The State agrees to provide continuing education for the employees of facilities which provide prevention activities or treatment services (or both) (see 42 U.S.C. 300x-28(b) and 45 C.F.R. 96.132(b)).

GOAL #12: Coordination of services

The State agrees to coordinate prevention activities and treatment services with the provision of other appropriate services (see 42 U.S.C. 300x-28(c) and 45 C.F.R. 96.132(c)).

GOAL #13: Needs assessment by State and locality

The State agrees to submit an assessment of the need for both treatment and prevention in the State for authorized activities, both by locality and by the State in general (see 42 U.S.C. 300x-29 and 45 C.F.R. 96.133).

GOAL #14: Ensuring that needles and syringes are not provided for illegal drug use

The State agrees to ensure that no program funded through the block grant will use funds to provide individuals with hypodermic needles or syringes so that such individuals may use illegal drugs (see 42 U.S.C. 300x-31(a)(1)(F) and 45 C.F.R. 96.135(a)(6)).

GOAL #15: Improving the quality and appropriateness of treatment services

The State agrees to assess and improve, through independent peer review, the quality and appropriateness of treatment services delivered by providers that receive funds from the block grant (see 42 U.S.C. 300x-53(a) and 45 C.F.R. 96.136).

GOAL #16: Protecting patient records from inappropriate disclosure

The State agrees to ensure that the State has in effect a system to protect patient records from inappropriate disclosure (see 42 U.S.C. 300x-53(b), 45 C.F.R. 96.132(e), and 42 C.F.R part 2).

GOAL #17: Compliance with 42 C.F.R. part 54 Charitable Choice Provisions and Regulations

The State agrees to ensure that the State has in effect a system to comply with the Charitable Choice Provisions and Regulations of 42 C.F.R. part 54 (see 42 C.F.R. 54.8(c)(4) and 54.8(b)).

SOURCE: Performance Partnership Grant Branch, Division of State and Community Assistance, Center for Substance Abuse Treatment, Substance Abuse and Mental Health Services Administration, “Uniform Application, FY 2007, Substance Abuse Prevention and Treatment Block Grant (42 U.S.C. 300x-21 to 35 and 300x-51 to 66),” Rockville, MD, 2004.

SAMHSA allocates SAPT BG funds to each State, which can subgrant or subcontract the funds to other subdivisions of government, other management entities, and/or local providers. This variety of funding pathways is consistent with the intent to give States flexibility, but it complicates evaluating the implementation and effects of the SAPT BG Program.


After SAMHSA allocates the SAPT BG funds to the State, the State may require that the funds be appropriated by the State legislature, which may place additional requirements on how the SAPT BG funds are to be expended. The final appropriation, with any legislative direction, is typically made to the agency responsible for administering the SAPT BG Program, usually referred to as the Single State Authority (SSA).


The SSA makes decisions about which services should be funded using the SAPT BG funds and then distributes them accordingly. The SSA can distribute SAPT BG funds to any or all of the following: (1) treatment/prevention providers; (2) intermediating public/private management entities, who in turn provide funds to treatment/prevention providers; (3) intermediating public/private management entities who serve as providers themselves; (4) the SSA for State-operated treatment and prevention services; or (5) other business entities (e.g., information technology firms).


When a State submits its SAPT BG application to SAMHSA, CSAT and CSAP State Project Officers review the application to ensure State compliance with SAPT BG Program regulations and to identify issues requiring technical assistance (TA).


This combination of flexibility and specificity in the SAPT BG Program authorizing legislation presents both challenges and opportunities. The flexibility that enables States to address local needs means that there will be many different approaches to preventing substance use disorders and to treating them when they occur. On the other hand, the SAPT BG Program uniform application provides a baseline for some comparisons among States. These two factors, flexibility and the need to respond to mandated requirements, complicate the evaluation of the success of the SAPT BG Program.


2. Purpose and Use of Information.

The FY 2003 OMB Program Assessment Rating Tool (PART) process for the SAPT BG Program resulted in a rating of “Ineffective.” The SAPT BG Program received high scores in three of the four PART areas: Program Purpose and Design, Strategic Planning, and Program Management. The scores in these areas could have been even higher if data had been available to document that resources were reaching the intended beneficiaries or that the program had ambitious targets and long-term measures. In the fourth area, Program Results/Accountability, the program received a low rating because “no independent evaluation of the program has been completed” to establish that the SAPT BG Program is effective and fulfilling its legislative mandates.

In direct response to this OMB finding, CSAT conducted an Evaluability Assessment (EA) to determine the feasibility of conducting an independent evaluation of the SAPT BG Program and subsequently funded such an evaluation effort. EA is a recognized program evaluation methodology that involves collaboration with multiple stakeholders and the development of a program logic model; it is used to plan formal evaluations of large and/or complex programs such as the SAPT BG Program. The findings of the EA served as the foundation for the development and award of a multi-year contract in FY 2004 to conduct an independent, comprehensive evaluation of the SAPT BG Program.


As noted in the OMB PART results, the legislative intent of the SAPT BG Program is to provide funding to States by formula to plan, carry out, and evaluate activities to prevent and treat substance abuse. Therefore, the evaluation is designed to examine the system-level activities, outputs, and outcomes associated with the program in relation to its goals.

This evaluation uses a multi-method approach to examine Federal and State performance with regard to the SAPT BG Program and its 17 identified goals. The approach emphasizes a qualitative and quantitative examination of both the SAPT BG Program process (e.g., activities and outputs in the logic model) and system-level outcomes. Federal and State stakeholder perspectives on the SAPT BG Program, captured through semi-structured interviews and surveys, will be corroborated and compared with the considerable amount of source documents and data already collected by States, CSAT, and CSAP (e.g., BGAS applications, the Treatment Episode Data Set (TEDS), the National Survey on Drug Use and Health (NSDUH), the Minimum Data Set (MDS), Technical Review Reports, and State Prevention and Synar System Reports).

There are three major goals for the SAPT BG Program evaluation:

  • To evaluate the extent to which the SAPT BG Program is implemented according to Congressional intent and legislative requirements

  • To examine the extent to which the SAPT BG funds leverage States’ ability to make policy changes based on major Federal policy initiatives

  • To document intended and unintended outcomes of the SAPT BG Program.


The evaluation will cover the following domains: the State SAPT BG Program planning process, Federal review of SAPT BG applications and implementation reports, Federal TA, State SAPT BG Program implementation, Federal oversight and management, State SAPT BG Program reporting, and State-level outcomes. The results of this evaluation will not only document the effectiveness of the program in supporting the substance abuse prevention and treatment system but also help guide CSAT, CSAP, and the States in improving the methods by which they implement the SAPT BG Program, including the capacity to collect, analyze, and interpret the National Outcome Measures (NOMs). The NOMs project is a separate, parallel SAMHSA initiative that began after the inception of the SAPT BG Program evaluation contract and was not used in the SAPT BG Program EA or in the development of the evaluation framework and logic model. However, selected NOMs items that relate to the evaluation framework and logic model will be examined in the independent evaluation. These selected NOMs items include:


  • Increase in number of persons reporting a reduction in 30-day drug/alcohol use

  • Increase in number of persons employed or in school

  • Reduction in number of drug or alcohol-related arrests

  • Increase in number of persons in stable housing situations (reduction in homelessness)

  • Increase in access to services measured by unduplicated counts of persons served and numbers served compared to those in need

  • Increase in number of persons receiving evidence-based services.


In addition, the evaluators will attempt to collect information on system-wide client perception of care. Statistical tests for association between outcome measures and a number of independent variables will be conducted. Examples of independent variables include, but are not limited to, level of funding, level of the SSA within State government, degree of SSA partnership with other State agencies and community organizations, and amount of State-funded support available for research and training activities.
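
As a minimal illustration of the kind of association test described above, the sketch below relates one outcome measure to one independent variable (level of funding) using a Pearson correlation. It is illustrative only: the variable names and values are hypothetical, and the evaluation’s actual analyses will be conducted in SAS as described in the analysis plan (Section 16).

    # Illustrative sketch only: hypothetical State-level data, not actual SAPT BG figures.
    from scipy import stats

    # Hypothetical per-State values (invented for illustration).
    funding_per_capita = [4.2, 6.1, 3.8, 7.5, 5.0, 6.8, 4.9, 5.6]   # SAPT BG dollars per capita
    pct_reduced_30day_use = [38, 47, 35, 52, 41, 50, 40, 44]        # % reporting reduced 30-day use

    # Test for association between the outcome measure and the independent variable.
    r, p_value = stats.pearsonr(funding_per_capita, pct_reduced_30day_use)
    print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
    # A small p-value would suggest an association between funding level and the outcome
    # measure; it would not, by itself, establish causation.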


In addition to information about the selected NOMs domains, the evaluation also will examine systemic measures related to infrastructure. Infrastructure refers to the resources, systems, and policies that support the nation’s public substance abuse prevention and treatment system, and is a potential contributor to significant State behavioral health system outcomes. Examples of infrastructure include staff training, policy changes, and service availability.


Because this is the first-ever comprehensive evaluation of the Program, the data collection activities are more extensive (and time intensive) than would be expected of a Program that has been regularly evaluated. These data will serve as a baseline for future evaluations.


The two primary data collection strategies will include open-ended interviews and web-based surveys. Interviews will be conducted with Federal staff involved in the administration of the SAPT BG Program and State staff from all States and Territories involved in their State’s implementation of the SAPT BG Program. Two web-based surveys will be administered to all individuals who formally participate in monitoring the SAPT BG Program as part of the Technical Review or State Prevention and Synar System Review Teams.

The interview protocol for Federal staff includes 80 questions (mostly open-ended) and, on average, should take 90 minutes to complete. The interview protocol for State staff includes 99 questions (mostly open-ended) and should take, on average, 3 hours to complete. Both the Federal staff interviews and the State staff interviews will be conducted in person. While the Federal staff will each be interviewed individually, a single group interview will be conducted in each State that will include all key State staff. State Substance Abuse Authority Directors will be asked to select the State staff who they believe are most knowledgeable about the SAPT BG Program to participate in the interviews. It is anticipated that, at a minimum, the following individuals will participate: the State Director, Treatment Supervisor/Lead, NPN/Prevention Supervisor/Lead, Federal Liaison/person who completes the SAPT BG application, Lead Data Analyst, Program Evaluator/Monitor, and TA/Training Manager/Lead.

The two web-based surveys will be distributed to the two current sets of formal reviewers for the SAPT BG Program: Technical Reviewers (TR) and State Prevention and Synar System Reviewers (SPSSR). The web-based surveys are designed so that each stakeholder group receives survey questions designed to capture their specific knowledge of and experience with the SAPT BG Program. The TR survey contains 47 questions and the SPSSR survey has 27 questions. Each survey should take approximately 1 hour or less to complete. Reviewers will submit their responses to the survey online during a 3-week period.

3. Use of Improved Information Technology.


The Altarum Project Team has developed a web-based survey instrument that will be used to administer the survey portions of the project’s data collection via the Internet. This system is compliant with all SAMHSA ADP-IT requirements and is customizable so that CSAT and CSAP can use it as an ongoing evaluation tool. Screenshots of the web-based survey system are included as part of this clearance package (see Attachment A). The system provides an easy-to-use interface and a secure e-mail function to alert stakeholders when they can access the survey. Altarum staff have thoroughly tested the system to ensure optimal functioning and compliance with SAMHSA DMS-IT and other Federal guidelines.


4. Efforts to Identify Duplication.


The surveys and interviews are specific to this program and the data required are not available anywhere else. The Altarum Project Team developed the interview protocols and surveys after a careful review of SAPT BG applications and implementation reports to ensure that the data collection instruments would not duplicate information that could be gathered from these secondary sources.


5. Involvement of Small Entities.


This data collection does not involve small businesses or small entities.


6. Consequences of Information Not Collected or Collected Less Frequently.


These data are to be collected for this independent evaluation on a one-time basis. Not collecting these data would prevent CSAT and CSAP from gathering the information they need to determine the extent to which the SAPT BG Program is fulfilling its legislative mandate and producing desired outcomes.


7. Consistency with the Guidelines in 5 CFR 1320.5(d)(2).


This data collection is consistent with the general information collection guidelines in 5 CFR 1320.5(d)(2).


8. Consultation Outside the Agency.


a. Federal Register announcement


The public notice soliciting comments on this information collection was published in the Federal Register on March 27, 2007 (volume 72, number 58, pages 14284–14286). Two sets of comments were received from the public. The comments and SAMHSA’s responses are presented below:



First Set of Comments


Comment 1

From Federal Register: Goal #3 title: “Spending on services for pregnant women and children.” Goal #3 description: “…to make available treatment services designed for pregnant women and children with dependent children…”

Comment to SAMHSA: The goal title should reflect the statute: “Spending on services for pregnant women and women with dependent children.” The description should be corrected to read “…services designed for pregnant women and women with dependent children.”

Response: The comment is correct; the suggested language reflects the exact wording of the statute. The wording of the goals in the OMB package will be changed to match the exact wording of the statute.


Comment 2

From Federal Register: National Outcome Measures:

  • Increase in number of persons employed or in school

  • Reduction in number of drug or alcohol-related arrests

  • Increase in access to services measured by unduplicated counts of persons served and numbers served compared to those in need

  • Increase in persons receiving evidence-based services

  • Perception of care

Comment to SAMHSA: The National Outcome Measures are listed differently in this Federal Register (FR) notice from the way they are listed in the Treatment Episode Data Set (TEDS). Using definitions in the evaluation that differ from the TEDS definitions will cause confusion and will impact the accuracy and practical utility of the information collected by the evaluation. We believe that the evaluation questions should be in line with TEDS data to promote accuracy and consistency. We would object if the evaluation were to be used as a means for changing current data requirements rather than collecting data on how the States and the Federal Government are implementing the SAPT BG according to the authorizing legislation. Specifically:

  • TEDS does not require States to collect school status information upon a client’s discharge. Consequently, States cannot measure an increase in the number of persons in school without modifying their data collection instruments.

  • TEDS requires collection of “arrest” data at admission and discharge without specifying that the arrests are AOD-related. States cannot provide AOD-related arrest data without modifying their data collection instruments.

  • Every State measures the numbers served, but they differ in how they determine “needs.” Because SAMHSA does not require a uniform method of determining “need,” such as data from the National Survey on Drug Use and Health, the data collected will be inconsistent between States.

  • SAMHSA told States this NOM is still “under development” for treatment of substance abuse, as is the “total number of evidence-based programs and strategies” for prevention services. For prevention, “persons receiving evidence-based services” drastically understates the impact of many prevention strategies, particularly Universal and Selective interventions (as defined by the Institute of Medicine) such as environment/population-based policies and procedures, where the number of individuals “served” is unknowable.

  • According to SAMHSA, the “perception of care” NOM is still “under development.” This FR notice states that the evaluators will attempt to collect information on system-wide client perception of care. The collection of this information by the evaluators should not precede the development and implementation of this NOM.

Response: Because States were not required to submit NOMs data for the period (FFY 2004) covered by this initial evaluation of the SAPT Block Grant Program, the evaluation will use TEDS data as a proxy measure for NOMs when it is appropriate.

The comment is correct that the TEDS definitions are not the same as the NOMs definitions. As SAMHSA’s plan is for this evaluation to be a recurring activity, the evaluation design has been created in a manner that will support the collection and analysis of NOMs data once all States begin reporting the required measures.

For the initial evaluation, TEDS data will be analyzed from all States and NOMs data will be used whenever they are available. FFY 2001 actual use data (reported in FFY 2004) will be analyzed against FFY 2004 data (reported in FFY 2007). This design provides a three-year time differential to examine potential changes in outcomes.

There is no plan to use the evaluation as a means for changing the current data requirements.


Comment 3

From Federal Register: “Statistical tests for association between outcome measures and a number of independent variables will be conducted.”

Comment to SAMHSA: According to the FR, the purpose of the evaluation is “to determine the extent to which States and the Federal Government are implementing the SAPT BG according to the authorizing legislation.” Based on the information provided in the FR, it is unclear why this review would include statistical tests for association between outcome measures and independent variables, such as funding levels, level of the SSA within State government, and amount of State-funded support available for research and training activities. While this information may be helpful to know, none of these examples are required, or even addressed, in the authorizing legislation or the implementing regulations. Therefore, doing the analysis does not further the purpose of the evaluation, which is to determine the extent to which States and the Federal Government are implementing the SAPT BG according to the authorizing legislation.

Response: According to the FR, an additional purpose of the evaluation is to examine intended and unintended outcomes of the SAPT Block Grant. The tests for association mentioned will be conducted primarily with systems-level, not client-level, outcome data. For example, the extent to which States were able to make improvements to their system of care for pregnant women and women with dependent children will be examined. Tests will be conducted to assess whether differences in State progress in this area are associated with the level of funding for these services in the State or the SSA’s ability to make significant changes to women’s services. Explanations for outcomes are as important as the outcomes themselves if improvements are to be made to the SAPT Block Grant Program and States’ systems of care.


Comment 4

From Federal Register: 99 open-ended questions to be asked of a minimum of 6 State representatives.

Comment to SAMHSA: The estimated time of three hours to complete the interview protocol for State staff (99 questions, mostly open-ended) underestimates the total burden described in Table 3. Six individuals responding to open-ended questions will generate discussion, which is not taken into account. In addition, the amount of preparation time, scheduling, advance review of proposed questions, etc., for the interview is not included in the estimate. These processes are very time consuming and may take up to 60-80 hours to complete. The table reflects only the minimum number of State representatives. Based on the recent CSAT Core Review in California, we estimate that the number of people who may need to be present would be more in the range of 10 to 12, increasing the burden estimate by 12 to 18 hours for the interviews alone. Finally, it is difficult to quantify the time needed because we do not know what the 99 questions are.

Response: The estimated response time is based on an average response from the 60 SAPT Block Grant-funded entities. Some larger States may spend more than the average number of hours, while many agencies (e.g., the Pacific Basin Territories) will spend less time. The evaluation (including the State interview protocol) has been designed based on feedback from CSAT and CSAP staff, the Evaluation Advisory Workgroup, NASADAD, and individual States in a manner that will capture the critical information without posing an excessive burden. States will receive clear guidance on how to prepare for and participate in the interviews that will keep the time burden to a minimum while allowing the collection of quality data.


Comment 5

From Federal Register: These data will serve as a baseline for future evaluations.

Comment to SAMHSA: Since this is the first-ever comprehensive evaluation, we hope that adjustments can and will be made to baselines as a result of additional information or future evaluations.

Response: The baseline data will be analyzed to ensure that they are an accurate representation of the current state of the SAPT Block Grant Program. As this is the first time the data will be collected and analyzed, we are aware that extreme care must be taken to ensure that the evaluation results do not misrepresent the strengths, accomplishments, and areas of improvement for the program.




Second Set of Comments


State Comment: Open the evaluation to consider whether the NOMs measures and related items included in the SAPT Block Grant Uniform Application are the best means to assess the issues in the NOMs domains.

Response: The SAPT BG Evaluation is not intended to evaluate the validity or use of the NOMs measures.


State Comment: Make available the program logic model, upon which the evaluation study is based, for comment and discussion during each State interview, to test the degree to which Federal and State perceptions of the purposes and operation of the block grant program are the same or differ.

Response: The logic model for the SAPT BG Program will be available for review during the interviews. However, the evaluation will not assess the degree to which Federal and State perceptions about the purposes and operation of the Block Grant Program are the same or differ. The Evaluability Assessment conducted prior to the initiation of the SAPT BG Evaluation found general agreement on the purpose and intent of the program.


State Comment: Make available the details of the PART analysis, upon which the National Outcome Measures (NOMs) are based, for comment and discussion during each State interview.

Response: The SAPT BG PART results are available to the public at www.expectmore.gov.


State Comment: Provide an opportunity to consider whether the management uses of the NOMs data at the Federal level, envisioned in public statements from the SAMHSA Administrator, can be supported by the data submitted by the States in response to SAPT Block Grant reporting requirements.

Response: The SAPT BG Evaluation is not intended to evaluate the validity or use of the NOMs measures.


State Comment: Provide for additional contact time with State staff if necessary for the evaluation process to explore these concerns thoroughly.

Response: At the conclusion of all State interviews, State staff will be given an opportunity to discuss issues of concern in more depth with the Evaluation Contractor.


b. Additional people consulted outside the agency


The agency’s contractor, Altarum, contributed to the development of the survey instrument. The contractor’s address, phone number, and the staff persons involved are listed below:


Altarum

1200 18th Street NW Suite 700

Washington DC 20036

(202) 828-5100


  • Eric Gelman, M.B.A., M.A., Behavioral Health Director at Altarum and Project Director

  • Jessica McDuff, M.A., Senior Policy Associate at Altarum and Project Manager

  • Scott L. Green, Ph.D., M.B.A., Senior Associate at Altarum


In addition, the Altarum Project Team consulted frequently with an Evaluation Advisory Workgroup (EAW) comprising a variety of individuals with evaluation skills and an in-depth knowledge of the SAPT BG Program. The EAW members are the following:


  • Teresa Anderson, Ph.D. – Massachusetts Department of Public Health

  • Theodora Binion-Taylor – Illinois Department of Human Services

  • Maria Canfield – Chief, Nevada Bureau of Alcohol and Drug Abuse

  • Barbara Cimaglio – Vermont Department of Health

  • Patrick Fleming, M.P.A., LSAC – Director, Salt Lake County Government Center

  • Robert Johnson – Senior Deputy Director, Addiction Prevention and Recovery

  • Michael Magnusson – Ohio Department of Alcohol and Drug Addiction Services

  • Howard Shapiro, Ph.D. – Executive Director, State Associations of Addiction Services

  • Don Wright – North Dakota Department of Human Services


9. Remuneration of Respondents.


Respondents will not receive any payment.


10. Assurance of Confidentiality.


Personal information will not be collected. Respondents will be fully informed about the purpose of this study, and the names of respondents will not be included in any reports from the study. Completed surveys will be maintained by the contractor in a password-protected database. Information taken from interviews will be aggregated and presented as such in any reports developed for this project. Comments made through the interviews or surveys will not be attributable to specific individuals.


11. Questions of a Sensitive Nature.


This survey does not include questions of a sensitive nature.


12. Estimates of Annualized Hour Burden.


Data will be collected using two primary strategies: 1) on-site interviews with Federal and State staff, and 2) web-based surveys for Technical Reviewers and State Prevention Synar System Reviewers.


Estimate of interview burden. On-site interviews will be conducted with SSA Directors and State staff who are knowledgeable about the SAPT BG Program (as determined by the State Director). The interview is designed to be a group interview with approximately six to eight staff who have extensive knowledge of the State’s SAPT BG Program activities. It is expected that interviews will typically include the State Director, Treatment Supervisor/Lead, NPN/Prevention Supervisor/Lead, Federal Liaison/person who completes the SAPT application, Lead Data Analyst, Program Evaluator/Monitor, and TA/Training Manager/Lead. The group interview should last approximately 3 hours, based on pre-testing conducted by Altarum. Federal staff also will participate in on-site interviews that should last approximately 90 minutes.


The estimated hourly wage of $37.09 is based on the mean hourly wage for Medical and Health Services Managers from the Bureau of Labor Statistics’ Occupational Employment Statistics for May 2005. The total estimated annualized cost to Federal and State respondents across all data collection activities is $43,673.48. This cost estimate was calculated from the total respondent hour burden and the estimated wage rate obtained from the Bureau of Labor Statistics.


Table 3 summarizes the estimate of the total annual time and cost burden to respondents resulting from participating in the interviews and the web-based survey. Estimated burden was identified through pre-testing of the surveys and interview protocols conducted by the agency’s contractor.


Estimate of web-based survey burden. Web-based surveys are an economical strategy for this data collection given that many of the stakeholders (e.g., Technical Reviewers and State Prevention Synar System Reviewers) are geographically dispersed. Stakeholders will be able to access the survey through an Internet connection and use a secure user interface to complete it. Based on system testing conducted by Altarum, it is estimated that completing the survey will take approximately one hour.


Given that State reviewers and program monitors earn approximately the same wages as Medical and Health Services Managers, the estimated hourly wage of $37.09 is applied to the TRs and SPSSRs as well. The total estimated annualized cost to the survey respondents is $1,669.05. This cost estimate was calculated from the total respondent hour burden and the estimated wage rate obtained from the Bureau of Labor Statistics. Table 3 summarizes the estimate of the total annual time and cost burden to respondents resulting from completing the web-based surveys.


Table 3: Estimated Reporting Burden of Interviews and Web-based Surveys

Respondents | Number of Respondents | Average Hours per Interview/Survey | Estimated Total Burden (Hours) | Hourly Mean Wage | Estimated Total Cost

In-person Interviews
State Substance Abuse Prevention and Treatment Agency Commissioner | 60 | 3 | 180 | $37.09 | $6,676.20
State Planners | 60 | 3 | 180 | $37.09 | $6,676.20
State Data Analysts | 60 | 3 | 180 | $37.09 | $6,676.20
State Prevention Lead | 60 | 3 | 180 | $37.09 | $6,676.20
State Treatment Lead | 60 | 3 | 180 | $37.09 | $6,676.20
Additional State Staff | 60 | 3 | 180 | $37.09 | $6,676.20
Federal SAPT Block Grant Staff | 35 | 1.5 | 52.5 | $37.09 | $1,947.23
Subtotal | 395 |  | 1,132.5 |  | $42,004.43

Web-based Surveys
Technical Reviewers | 15 | 1 | 15 | $37.09 | $556.35
State Prevention and Synar System Reviewers | 30 | 1 | 30 | $37.09 | $1,112.70
Subtotal | 45 |  | 45 |  | $1,669.05

Total | 440 |  | 1,177.5 |  | $43,673.48
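
The subtotals and totals in Table 3 follow directly from multiplying the number of respondents by the average hours per interview or survey and then by the $37.09 hourly wage. The short sketch below, provided for illustration only, reproduces that arithmetic; grouping the six State interview roles into a single line is a simplification.

    # Reproduces the burden and cost arithmetic behind Table 3 (illustration only).
    HOURLY_WAGE = 37.09  # BLS mean hourly wage, Medical and Health Services Managers, May 2005

    rows = [
        # (respondent group, number of respondents, average hours each)
        ("State interview participants (6 roles x 60 States/Territories)", 360, 3.0),
        ("Federal SAPT Block Grant staff", 35, 1.5),
        ("Technical Reviewers", 15, 1.0),
        ("State Prevention and Synar System Reviewers", 30, 1.0),
    ]

    total_hours = 0.0
    total_cost = 0.0
    for group, respondents, hours_each in rows:
        burden_hours = respondents * hours_each
        cost = burden_hours * HOURLY_WAGE
        total_hours += burden_hours
        total_cost += cost
        print(f"{group}: {burden_hours:,.1f} hours, ${cost:,.2f}")

    # Approximately 1,177.5 hours and $43,673 in total, matching Table 3 up to rounding.
    print(f"Total: {total_hours:,.1f} hours, ${total_cost:,.2f}")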


13. Estimates of Annualized Cost Burden to Respondents.


There are no capital or start-up costs for this project.


14. Estimates of Annualized Cost to the Government.

The annualized cost to the Federal Government for this project is $2,014,528. This includes an estimated $4,500 for a GS-14 employee’s time (5 percent) allocated to project management and $2,010,028 to the contractors for all other project activities.

15. Changes in Burden.


This is a new data collection.


16. Time Schedule, Publication and Analysis Plans.


a. Reports to be published


The contractor will write a summary report that synthesizes the findings from the study described in this clearance package. This report will include simple graphic displays of the key findings, with detailed findings provided in a technical supplement.


The report will be structured as follows:


  1. Introduction


    1. SAPT BG Program purpose, description, and history

    2. Evaluation purpose, objectives, and main questions

    3. Report organization and chapters


  2. Methods


    1. Involvement of EAW

    2. Development of logic model

    3. Development of evaluation framework and main questions

    4. Development of interview and survey protocols

    5. OMB approval process

    6. Development of data abstraction forms

    7. Review of SAPT BG Program documents

    8. Database development

    9. Interviews with State staff

    10. Interviews with Federal staff

    11. Web-based surveys of Technical Reviewers and State Prevention and Synar System Reviewers

    12. Case studies

    13. Data analysis

    14. Limitations


  3. Results


    a. SAPT BGP Funding Distributions


      1. Federal to State distribution process

      2. State distribution processes

        1. Mechanism for distributing funds to subrecipients; process of distribution, how the funds reach actual service providers

        2. How funds are distributed to meet 17 statutory goals of SAPT BGP

        3. Influence of State laws or leaders on funding distribution

      3. Activities supported by the administrative set-asides

      4. Strengths and areas for improvement in SAPT BGP funding distributions


    b. Application and Review Processes


      1. Development of application template and guidance for States

      2. State application development and submission

        1. State processes

        2. State needs assessment

        3. Intended use plan

        4. Progress report

        5. Annual report

        6. Synar report

      3. Application review and approval

      4. Use of application information

      5. Strengths and areas for improvement in application and review processes


    c. Types of Programs and Services Funded through the SAPT BGP


      1. Service modalities

      2. Program types

      3. Target populations served

      4. Services funded to meet set-aside requirements

      5. Evidence-based practices

      6. Unique uses of SAPT BG funds

      7. Strengths and areas for improvement in the types of programs and services funded through the SAPT BGP


    d. Program Development, Technical Assistance, and Training


      1. Federally funded

      2. State funded

      3. Strengths and areas for improvement in program development


    e. Program Monitoring by Federal Representatives


      1. Technical Reviews

        1. Core elements

        2. State requested

      2. State Prevention and Synar System Reviews

      3. Other program monitoring (e.g., grants management)

      4. Strengths and areas for improvement in program monitoring


    f. Evaluation of SAPT BGP Activities


      1. Federally required data collection, analysis, and reporting

        1. Voluntary performance measures

        2. State capacity and readiness to collect Federally required data (NOMS)

        3. Federal uses for data

      2. State data collection from subrecipients

        1. Data collection processes between States and subrecipients

        2. Uses for data collected by States (non-Federal data)

      3. Strengths and areas for improvement in evaluation


    g. Program Impacts and Contributions to Substance Abuse Treatment Systems of Care


      1. Federal leadership and guidance in improving the substance abuse prevention and treatment system of care

      2. State coordination of substance abuse prevention and treatment services and programs

      3. Number and quality of evidence-based practices available in States and across Nation

      4. Leveraging of BG resources to initiate new programs and services

      5. Leveraging of BG resources to initiate policy changes (State and Federal levels)

      6. State and Federal ability to demonstrate services provided and outcomes

      7. Influence of SAPT BG funding reductions on States’ abilities to prevent and treat substance use disorders


    h. Challenges and Lessons Learned


  4. Recommendations for SAPT BG Program Improvement (split recommendations into appropriate categories)


  5. Conclusion

b. Complex analytical techniques/Plan


The quantitative information derived from the surveys will be entered into a database and analyzed using SAS, a statistical software package. Frequencies will be produced for all variables. Tests of significance may be used to analyze differences, such as those between block grant allocations and the numbers of services in States.
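
As a rough sketch of the two analysis steps named above (frequencies on all variables, followed by a test of significance), the fragment below uses Python with hypothetical variable names and data; it is not the evaluation’s actual analysis code, which will be written in SAS.

    # Rough Python analogue of the analysis steps described above (the evaluation itself
    # will use SAS). Variable names and data are hypothetical.
    import pandas as pd
    from scipy import stats

    # Hypothetical survey extract: one row per State.
    df = pd.DataFrame({
        "allocation_group": ["high", "high", "low", "low", "high", "low", "low", "high"],
        "num_service_types": [9, 11, 6, 7, 10, 5, 8, 12],
        "uses_ebp": ["yes", "yes", "no", "yes", "yes", "no", "yes", "yes"],
    })

    # Step 1: frequencies on the categorical variables (analogous to PROC FREQ).
    for col in ["allocation_group", "uses_ebp"]:
        print(df[col].value_counts(), "\n")

    # Step 2: a simple test of significance, e.g. comparing the number of funded
    # service types between higher- and lower-allocation States.
    high = df.loc[df["allocation_group"] == "high", "num_service_types"]
    low = df.loc[df["allocation_group"] == "low", "num_service_types"]
    t_stat, p_value = stats.ttest_ind(high, low, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")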


Qualitative information will be grouped by application section, and content analyses will identify common themes in the information gathered. The analyses will also examine commonalities and differences in how services are organized and implemented, as well as potential outcomes and impacts of the SAPT BG Program on substance abuse prevention and treatment systems of care and, ultimately, on the consumers who access services.


c. Time schedule


Table 4 reflects the schedule for each task in the design, data collection, and report compilation phases of the SAPT BG Program evaluation.


Table 4: Estimated Time Schedule for Tasks

Task | Estimated Start Date | Estimated Completion Date
Schedule and conduct interviews | Upon OMB approval | 4 months post-OMB approval
Launch web-based surveys | 1 month post-OMB approval | 3 months post-OMB approval
Analyze data | 3 months post-OMB approval | 5 months post-OMB approval
Draft final report | 6 months post-OMB approval | 8 months post-OMB approval


17. Exemption for Display of Expiration Date.


The expiration date for OMB approval of the information collection will be displayed.


18. Exceptions to Certification for PRA Submissions.


No exceptions to the certification statement are requested. The certifications are included in this submission.


B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Respondent Universes and Sampling Procedures


The respondents consist of the universe of the 60 SAPT BG-funded States and Territories. The individuals to be interviewed in each jurisdiction include the following types of staff:


  • SSA Director

  • Treatment Supervisor/Lead

  • NPN/Prevention Supervisor/Lead

  • Federal Liaison/person who completes the SAPT application

  • Lead Data Analyst

  • Program Evaluator/Monitor

  • CSAT Technical Review, CSAP State Prevention Assessment, and CSAP Synar Review Coordinator

  • TA/Training Manager/Lead.


In addition, the respondents include the entire universe of Technical Reviewers and State Prevention Synar System Reviewers. Therefore, no complex sampling methods are required.


2. Procedures for Data Collection and Statistical Estimation


The data collection effort for this study will not employ any complex statistical methods in identifying the respondents. Federal staff involved in the administration of the SAPT BG Program and the SSA Director from each of the 60 SAPT BG-funded States and Territories will be asked, via an introductory letter (see Attachment B), to participate in the interviews along with appropriate staff. Interviews will be conducted on-site and scheduled at the convenience of the SSA staff.


Introductory letters (see Attachment B) will be sent to the remaining stakeholders (i.e., Technical Reviewers and State Prevention Synar System Reviewers) asking for their participation, describing the web-based survey system, and providing instructions for accessing the system (see Attachment C). Access to the web-based surveys (via a URL hyperlink) will be provided by e-mail. For those stakeholders who do not have access to e-mail, hard copies of the survey will be mailed.


3. Maximizing Response Rates and Issues Related to Non-response.


Response rates and the on-site interviews. Altarum staff will work with each Federal staff person and State Director to schedule the interviews at a time most convenient for them within our 4-month timeframe. This flexibility should enable us to have a 100% response rate. Altarum will work with CSAT, CSAP, and the States to emphasize the importance of their participation in the evaluation.


Response rates and the web-based surveys. Maximizing response rates is extremely important. This will be handled in the following ways:

  1. Respondents without access to e-mail will have hard copies of the survey sent to them with a stamped return envelope to facilitate return. Staff also will be available to assist respondents by phone with completing the survey. Follow-up phone calls will be used to encourage respondents to complete and return the surveys.

  2. The web-based survey system has an e-mail function that allows Altarum staff to send reminder e-mails to respondents if they have not accessed or completed the survey.


Given these strategies, and stakeholders’ vested interest in the SAPT BG Program, SAMHSA is anticipating that stakeholders will actively participate in the SAPT BG Program evaluation. SAMHSA anticipates a response rate of 75% for the web-based surveys. To ensure that the response rate is achieved, SAMHSA will contact individuals who have not completed the survey in order to remind them to complete the survey. These reminders will be a combination of emails, letters, and telephone calls.


4. Tests of Procedures.


The data collection instruments have been reviewed by CSAT, CSAP, and their contractor, who clarified terminology and language and rewrote or eliminated questions that were unclear or unnecessary. The Evaluation Advisory Workgroup and three States also provided feedback on the instruments as part of the test of procedures.


5. Individuals Involved.


The data collection will be conducted by the contractor, Altarum. The Project Director of the Independent Evaluation of the SAPT BG Program is Eric P. Gelman, M.B.A., M.A. The contact person for this survey is Jessica McDuff, M.A., Project Manager. Both can be reached at the address and number below.


Altarum

1200 18th Street NW Suite 700

Washington DC 20036

(202) 828-5100
